Sample records for refactoring process improvement

  1. Refactoring and Its Benefits

    NASA Astrophysics Data System (ADS)

    Veerraju, R. P. S. P.; Rao, A. Srinivasa; Murali, G.

    2010-10-01

    Refactoring is a disciplined technique for restructuring an existing body of code, altering its internal structure without changing its external behavior. It improves internal code structure without altering external functionality by transforming functions and rethinking algorithms, and it is an iterative process. Refactoring operations include reducing variable scope, replacing complex instructions with simpler or built-in instructions, and combining multiple statements into one. Code transformed with refactoring techniques becomes faster to change, execute, and download. Refactoring is an excellent practice for programmers wanting to improve their productivity, and it is similar to performance optimization in that both are behavior-preserving transformations. It also helps us find bugs: when we are trying to fix a bug in difficult-to-understand code, cleaning things up makes it easier to expose the bug. Refactoring improves the quality of application design and implementation. In general, there are three stances toward refactoring: iterative refactoring, refactoring only when necessary, and not refactoring at all. Martin Fowler identifies four key reasons to refactor: refactoring improves the design of software, makes software easier to understand, helps us find bugs, and helps the program execute faster. An additional benefit is that refactoring changes the way a developer thinks about an implementation even when not refactoring. There are three types of refactoring: 1) Code refactoring, often referred to simply as refactoring, which is the refactoring of program source code. 2) Database refactoring, a simple change to a database schema that improves its design while retaining both its behavioral and informational semantics. 3) User interface (UI) refactoring, a simple change to the UI that retains its semantics. Finally, we conclude that the benefits of refactoring are: it improves the design of software, makes software easier to understand, cleans up the code, helps us find bugs, and helps us program faster.
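
    The refactoring operations listed in this abstract (replacing manual code with built-in instructions, reducing scope, combining statements) are easiest to see in a small before-and-after example. The sketch below is illustrative Python written for this listing, not code from the paper; the function and its behavior are invented for the illustration.

      # Before: manual loop, wide variable scope, redundant temporaries.
      def average_positive_before(values):
          total = 0
          count = 0
          result = 0
          for i in range(len(values)):
              v = values[i]
              if v > 0:
                  total = total + v
                  count = count + 1
          if count > 0:
              result = total / count
          return result

      # After: built-ins replace the loop, the temporaries disappear, and each
      # name lives in the smallest scope needed. External behavior is
      # unchanged -- both functions return 0 when no value is positive.
      def average_positive_after(values):
          positives = [v for v in values if v > 0]
          return sum(positives) / len(positives) if positives else 0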

  2. Refactoring and Its Benefits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veerraju, R. P. S. P.; Rao, A. Srinivasa; Murali, G.

    2010-10-26

    Refactoring is a disciplined technique for restructuring an existing body of code, altering its internal structure without changing its external behavior. It improves internal code structure without altering external functionality by transforming functions and rethinking algorithms, and it is an iterative process. Refactoring operations include reducing variable scope, replacing complex instructions with simpler or built-in instructions, and combining multiple statements into one. Code transformed with refactoring techniques becomes faster to change, execute, and download. Refactoring is an excellent practice for programmers wanting to improve their productivity, and it is similar to performance optimization in that both are behavior-preserving transformations. It also helps us find bugs: when we are trying to fix a bug in difficult-to-understand code, cleaning things up makes it easier to expose the bug. Refactoring improves the quality of application design and implementation. In general, there are three stances toward refactoring: iterative refactoring, refactoring only when necessary, and not refactoring at all. Martin Fowler identifies four key reasons to refactor: refactoring improves the design of software, makes software easier to understand, helps us find bugs, and helps the program execute faster. An additional benefit is that refactoring changes the way a developer thinks about an implementation even when not refactoring. There are three types of refactoring: 1) Code refactoring, often referred to simply as refactoring, which is the refactoring of program source code. 2) Database refactoring, a simple change to a database schema that improves its design while retaining both its behavioral and informational semantics. 3) User interface (UI) refactoring, a simple change to the UI that retains its semantics. Finally, we conclude that the benefits of refactoring are: it improves the design of software, makes software easier to understand, cleans up the code, helps us find bugs, and helps us program faster.

  3. Refactoring affordances in corporate wikis: a case for the use of mind maps

    NASA Astrophysics Data System (ADS)

    Puente, Gorka; Díaz, Oscar; Azanza, Maider

    2015-11-01

    The organisation of corporate wikis tends to deteriorate as time goes by. Rearranging categories, structuring articles and even moving sections among articles are cumbersome tasks in current wiki engines. This discourages the layman, yet it is the layman who writes the articles, knows the wiki content and detects refactoring opportunities. Our goal is to improve the refactoring affordances of current wiki engines by providing an alternative front-end tuned to refactoring. This is achieved by (1) surfacing the structure of the wiki corpus as a mind map, and (2) conducting refactoring as mind map reshaping. To this end, we introduce WikiWhirl, a domain-specific language for wiki refactoring. WikiWhirl is supported as an extension of FreeMind, a popular mind mapping tool. In this way, refactoring operations are intuitively conducted as actions upon mind map nodes. In a refactoring session, a user imports the wiki structure as a FreeMind map, conducts the refactoring operations on the map, and finally saves the effects to the wiki database. The operational semantics of the WikiWhirl operations follow refactoring good practices (e.g., authorship preservation). Results from a controlled experiment suggest that WikiWhirl outperforms MediaWiki in three main affordance enablers: understandability, productivity and fulfillment of refactoring good practices.

  4. LAURA Users Manual: 5.3-48528

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2010-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  5. LAURA Users Manual: 5.5-64987

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, William L.

    2013-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the Fun3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  6. LAURA Users Manual: 5.4-54166

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2011-01-01

    This users manual provides in-depth information concerning installation and execution of Laura, version 5. Laura is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 Laura code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, Laura now shares gas-physics modules, MPI modules, and other low-level modules with the Fun3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  7. LAURA Users Manual: 5.2-43231

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2009-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multiphysics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  8. Laura Users Manual: 5.1-41601

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2009-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multiphysics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  9. Sparse PCA corrects for cell type heterogeneity in epigenome-wide association studies.

    PubMed

    Rahmani, Elior; Zaitlen, Noah; Baran, Yael; Eng, Celeste; Hu, Donglei; Galanter, Joshua; Oh, Sam; Burchard, Esteban G; Eskin, Eleazar; Zou, James; Halperin, Eran

    2016-05-01

    In epigenome-wide association studies (EWAS), different methylation profiles of distinct cell types may lead to false discoveries. We introduce ReFACTor, a method based on principal component analysis (PCA) and designed for the correction of cell type heterogeneity in EWAS. ReFACTor does not require knowledge of cell counts, and it provides improved estimates of cell type composition, resulting in improved power and control for false positives in EWAS. Corresponding software is available at http://www.cs.tau.ac.il/~heran/cozygene/software/refactor.html.
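
    For readers unfamiliar with the general strategy, reference-free correction of this kind boils down to estimating cell-composition surrogates from the methylation matrix itself and adjusting each CpG site for them. The sketch below is a rough Python approximation of that idea (select highly variable sites, take their top principal components, regress them out); it is not the ReFACTor algorithm, whose site-selection step is more sophisticated.

      import numpy as np

      def pca_adjust(meth, n_sites=500, n_components=5):
          """Crude sketch of reference-free adjustment.

          meth: samples x CpG sites matrix of methylation levels.
          Returns the matrix with the top components of the most
          variable sites regressed out of every site.
          """
          # Keep the most variable sites as a proxy for cell-type signal.
          idx = np.argsort(meth.var(axis=0))[::-1][:n_sites]
          sub = meth[:, idx] - meth[:, idx].mean(axis=0)

          # Principal components of the selected sites via SVD.
          u, s, _ = np.linalg.svd(sub, full_matrices=False)
          pcs = u[:, :n_components] * s[:n_components]

          # Regress the components out of the full matrix.
          design = np.column_stack([np.ones(meth.shape[0]), pcs])
          beta, *_ = np.linalg.lstsq(design, meth, rcond=None)
          return meth - design @ beta + meth.mean(axis=0)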

  10. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decompositions were mainly used to design and implement software solutions. In functional decompositions, functions and data are introduced as two separate entities during the design phase, and are treated as such in the implementation phase. Functional decompositions make use of refactoring through optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all of this is done during the implementation phase. This paper advocates the usage of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionalities. The combined usage of object-oriented methodologies and design patterns to refactor should also reduce overall software life-cycle cost while improving the software.
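
    A minimal sketch of the shift the paper argues for, refactoring a function-plus-data decomposition into objects that own their behavior, is shown below. This is generic illustrative Python, not an example from the paper, and the shape types are invented.

      # Functional decomposition: data (dicts) and behavior (functions) kept separate.
      def area(shape):
          if shape["kind"] == "circle":
              return 3.14159 * shape["r"] ** 2
          if shape["kind"] == "rect":
              return shape["w"] * shape["h"]
          raise ValueError(shape["kind"])

      # Object-oriented refactoring: each shape owns its area computation, so
      # adding a new shape no longer means editing every type-switch.
      class Circle:
          def __init__(self, r):
              self.r = r

          def area(self):
              return 3.14159 * self.r ** 2

      class Rect:
          def __init__(self, w, h):
              self.w = w
              self.h = h

          def area(self):
              return self.w * self.h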

  11. Genome Calligrapher: A Web Tool for Refactoring Bacterial Genome Sequences for de Novo DNA Synthesis.

    PubMed

    Christen, Matthias; Deutsch, Samuel; Christen, Beat

    2015-08-21

    Recent advances in synthetic biology have resulted in an increasing demand for the de novo synthesis of large-scale DNA constructs. Any process improvement that enables fast and cost-effective streamlining of digitized genetic information into fabricable DNA sequences holds great promise to study, mine, and engineer genomes. Here, we present Genome Calligrapher, a computer-aided design web tool intended for whole genome refactoring of bacterial chromosomes for de novo DNA synthesis. By applying a neutral recoding algorithm, Genome Calligrapher optimizes GC content and removes obstructive DNA features known to interfere with the synthesis of double-stranded DNA and the higher order assembly into large DNA constructs. Subsequent bioinformatics analysis revealed that synthesis constraints are prevalent among bacterial genomes. However, a low level of codon replacement is sufficient for refactoring bacterial genomes into easy-to-synthesize DNA sequences. To test the algorithm, 168 kb of synthetic DNA comprising approximately 20 percent of the synthetic essential genome of the cell-cycle bacterium Caulobacter crescentus was streamlined and then ordered from a commercial supplier of low-cost de novo DNA synthesis. The successful assembly into eight 20 kb segments indicates that the Genome Calligrapher algorithm can be efficiently used to refactor difficult-to-synthesize DNA. Genome Calligrapher is broadly applicable to recode biosynthetic pathways, DNA sequences, and whole bacterial genomes, thus offering new opportunities to use synthetic biology tools to explore the functionality of microbial diversity. The Genome Calligrapher web tool can be accessed at https://christenlab.ethz.ch/GenomeCalligrapher.
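
    A neutral recoding pass of the kind described here can be pictured as synonymous codon substitution that nudges GC content toward a target while leaving the encoded protein unchanged. The following Python snippet is a toy illustration only (a tiny codon table, no screening for repeats or restriction sites); it is not the Genome Calligrapher algorithm.

      # Toy synonymous-recoding sketch: raise or lower GC content without
      # changing the encoded amino acids. Real tools also remove repeats,
      # homopolymers, and other synthesis-hostile motifs.
      SYNONYMS = {  # small subset of the standard genetic code
          "GCT": ["GCT", "GCC", "GCA", "GCG"],                  # Ala
          "GCA": ["GCT", "GCC", "GCA", "GCG"],
          "GGT": ["GGT", "GGC", "GGA", "GGG"],                  # Gly
          "GGA": ["GGT", "GGC", "GGA", "GGG"],
          "TTA": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],    # Leu
          "CTA": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
      }

      def gc_count(codon):
          return sum(base in "GC" for base in codon)

      def recode(seq, target_gc=0.55):
          out = []
          for i in range(0, len(seq) - len(seq) % 3, 3):
              codon = seq[i:i + 3]
              built = "".join(out)
              current = (sum(b in "GC" for b in built) / len(built)) if built else 0.0
              # Pick the synonym that moves the running GC fraction toward the target.
              pick = max if current < target_gc else min
              out.append(pick(SYNONYMS.get(codon, [codon]), key=gc_count))
          return "".join(out)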

  12. Refactoring DIRT

    NASA Astrophysics Data System (ADS)

    Amarnath, N. S.; Pound, M. W.; Wolfire, M. G.

    The Dust InfraRed ToolBox (DIRT - a part of the Web Infrared ToolShed, or WITS, located at http://dustem.astro.umd.edu) is a Java applet for modeling astrophysical processes in circumstellar shells around young and evolved stars. DIRT has been used by the astrophysics community for about 4 years. DIRT uses results from a number of numerical models of astrophysical processes, and has an AWT based user interface. DIRT has been refactored to decouple data representation from plotting and curve fitting. This makes it easier to add new kinds of astrophysical models, use the plotter in other applications, migrate the user interface to Swing components, and modify the user interface to add functionality (for example, SIRTF tools). DIRT is now an extension of two generic libraries, one of which manages data representation and caching, and the second of which manages plotting and curve fitting. This project is an example of refactoring with no impact on user interface, so the existing user community was not affected.

  13. Decaffeination and measurement of caffeine content by addicted Escherichia coli with a refactored N-demethylation operon from Pseudomonas putida CBB5.

    PubMed

    Quandt, Erik M; Hammerling, Michael J; Summers, Ryan M; Otoupal, Peter B; Slater, Ben; Alnahhas, Razan N; Dasgupta, Aurko; Bachman, James L; Subramanian, Mani V; Barrick, Jeffrey E

    2013-06-21

    The widespread use of caffeine (1,3,7-trimethylxanthine) and other methylxanthines in beverages and pharmaceuticals has led to significant environmental pollution. We have developed a portable caffeine degradation operon by refactoring the alkylxanthine degradation (Alx) gene cluster from Pseudomonas putida CBB5 to function in Escherichia coli. In the process, we discovered that adding a glutathione S-transferase from Janthinobacterium sp. Marseille was necessary to achieve N7-demethylation activity. E. coli cells with the synthetic operon degrade caffeine to the guanine precursor, xanthine. Cells deficient in de novo guanine biosynthesis that contain the refactored operon are "addicted" to caffeine: their growth density is limited by the availability of caffeine or other xanthines. We show that the addicted strain can be used as a biosensor to measure the caffeine content of common beverages. The synthetic N-demethylation operon could be useful for reclaiming nutrient-rich byproducts of coffee bean processing and for the cost-effective bioproduction of methylxanthine drugs.

  14. Application of Design Patterns in Refactoring Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

    Refactoring software design is a method of changing software design while explicitly preserving its unique design functionalities. The presented approach is to utilize design patterns as the basis for refactoring software design. A comparison of design solutions is made through C++ programming language examples to illustrate this approach. The development of reusable components is also discussed, and the paper shows that the construction of such components can diminish the added burden of both refactoring and the use of design patterns.

  15. Apply Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

    Refactoring software design is a method of changing software design while explicitly preserving its unique design functionalities. The presented approach is to utilize design patterns as the basis for refactoring software design. A comparison of design solutions is made through C++ programming language examples to illustrate this approach. The development of reusable components is also discussed, and the paper shows that the construction of such components can diminish the added burden of both refactoring and the use of design patterns.

  16. Refactored M13 Bacteriophage as a Platform for Tumor Cell Imaging and Drug Delivery

    PubMed Central

    Moser, Felix; Endy, Drew; Belcher, Angela M.

    2014-01-01

    M13 bacteriophage is a well-characterized platform for peptide display. The utility of the M13 display platform is derived from the ability to encode phage protein fusions with display peptides at the genomic level. However, the genome of the phage is complicated by overlaps of key genetic elements. These overlaps directly couple the coding sequence of one gene to the coding or regulatory sequence of another, making it difficult to alter one gene without disrupting the other. Specifically, overlap of the end of gene VII and the beginning of gene IX has prevented the functional genomic modification of the N-terminus of p9. By redesigning the M13 genome to physically separate these overlapping genetic elements, a process known as “refactoring,” we enabled independent manipulation of gene VII and gene IX and the construction of the first N-terminal genomic modification of p9 for peptide display. We demonstrate the utility of this refactored genome by developing an M13 bacteriophage-based platform for targeted imaging of and drug delivery to prostate cancer cells in vitro. This successful use of refactoring principles to reengineer a natural biological system strengthens the suggestion that natural genomes can be rationally designed for a number of applications. PMID:23656279

  17. Refactored M13 bacteriophage as a platform for tumor cell imaging and drug delivery.

    PubMed

    Ghosh, Debadyuti; Kohli, Aditya G; Moser, Felix; Endy, Drew; Belcher, Angela M

    2012-12-21

    M13 bacteriophage is a well-characterized platform for peptide display. The utility of the M13 display platform is derived from the ability to encode phage protein fusions with display peptides at the genomic level. However, the genome of the phage is complicated by overlaps of key genetic elements. These overlaps directly couple the coding sequence of one gene to the coding or regulatory sequence of another, making it difficult to alter one gene without disrupting the other. Specifically, overlap of the end of gene VII and the beginning of gene IX has prevented the functional genomic modification of the N-terminus of p9. By redesigning the M13 genome to physically separate these overlapping genetic elements, a process known as "refactoring," we enabled independent manipulation of gene VII and gene IX and the construction of the first N-terminal genomic modification of p9 for peptide display. We demonstrate the utility of this refactored genome by developing an M13 bacteriophage-based platform for targeted imaging of and drug delivery to prostate cancer cells in vitro. This successful use of refactoring principles to re-engineer a natural biological system strengthens the suggestion that natural genomes can be rationally designed for a number of applications.

  18. Toward a Formal Evaluation of Refactorings

    NASA Technical Reports Server (NTRS)

    Paul, John; Kuzmina, Nadya; Gamboa, Ruben; Caldwell, James

    2008-01-01

    Refactoring is a software development strategy that characteristically alters the syntactic structure of a program without changing its external behavior [2]. In this talk we present a methodology for extracting formal models from programs in order to evaluate how incremental refactorings affect the verifiability of their structural specifications. We envision that this same technique may be applicable to other types of properties such as those that concern the design and maintenance of safety-critical systems.

  19. A plug-and-play pathway refactoring workflow for natural product research in Escherichia coli and Saccharomyces cerevisiae.

    PubMed

    Ren, Hengqian; Hu, Pingfan; Zhao, Huimin

    2017-08-01

    Pathway refactoring serves as an invaluable synthetic biology tool for natural product discovery, characterization, and engineering. However, complicated and laborious molecular biology techniques largely hinder its application in natural product research, especially in a high-throughput manner. Here we report a plug-and-play pathway refactoring workflow for high-throughput, flexible pathway construction and expression in both Escherichia coli and Saccharomyces cerevisiae. Biosynthetic genes were first cloned into pre-assembled helper plasmids with promoters and terminators, resulting in a series of expression cassettes. These expression cassettes were further assembled using a Golden Gate reaction to generate fully refactored pathways. The inclusion of spacer plasmids in this system not only increases the flexibility for refactoring pathways with different numbers of genes, but also facilitates gene deletion and replacement. As proof of concept, a total of 96 pathways for combinatorial carotenoid biosynthesis were built successfully. This workflow should be generally applicable to different classes of natural products produced by various organisms. Biotechnol. Bioeng. 2017;114: 1847-1854. © 2017 Wiley Periodicals, Inc.

  20. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.

  1. Refactoring a CS0 Course for Engineering Students to Use Active Learning

    ERIC Educational Resources Information Center

    Lokkila, Erno; Kaila, Erkki; Lindén, Rolf; Laakso, Mikko-Jussi; Sutinen, Erkki

    2017-01-01

    Purpose: The purpose of this paper was to determine whether applying e-learning material to a course leads to consistently improved student performance. Design/methodology/approach: This paper analyzes grade data from seven instances of the course. The first three instances were performed traditionally. After an intervention, in the form of…

  2. Handbook for Implementing Agile in Department of Defense Information Technology Acquisition

    DTIC Science & Technology

    2010-12-15

    Wire-frame Mockup of iTunes Cover Flow Feature (source: http://www.balsamiq.com/products/mockups/examples#mytunez...programming. The JOPES customer was included early in the development process in order to understand requirements management (story cards), observe...transition by teaching the new members Agile processes, such as story card development, refactoring, and pair programming. Additionally, the team worked to

  3. Rational synthetic pathway refactoring of natural products biosynthesis in actinobacteria.

    PubMed

    Tan, Gao-Yi; Liu, Tiangang

    2017-01-01

    Natural products (NPs) and their derivatives are widely used as frontline treatments for many diseases. Actinobacteria spp. are used to produce most NP antibiotics and have also been intensively investigated for NP production, derivatization, and discovery. However, due to the complicated transcriptional and metabolic regulation of NP biosynthesis in Actinobacteria, especially in the cases of genome mining and heterologous expression, it is often difficult to rationally and systematically engineer synthetic pathways to maximize biosynthetic efficiency. With the emergence of new tools and methods in metabolic engineering, the synthetic pathways of many chemicals, such as fatty acids and biofuels, in model organisms (e.g. Escherichia coli), have been refactored to realize precise and flexible control of production. These studies also offer a promising approach for synthetic pathway refactoring in Actinobacteria. In this review, the great potential of Actinobacteria as a microbial cell factory for biosynthesis of NPs is discussed. To this end, recent progress in metabolic engineering of NP synthetic pathways in Actinobacteria is summarized, and strategies and perspectives to rationally and systematically refactor synthetic pathways in Actinobacteria are highlighted. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Image compression using singular value decomposition

    NASA Astrophysics Data System (ADS)

    Swathi, H. R.; Sohini, Shah; Surbhi; Gopichand, G.

    2017-11-01

    We often need to transmit and store images in many applications. The smaller the image, the lower the cost associated with transmission and storage, so data compression techniques are often applied to reduce the storage space an image consumes. One approach is to apply singular value decomposition (SVD) to the image matrix. In this method, the digital image is factored by SVD into three matrices. Only the largest singular values are used to reconstruct the image, so at the end of this process the image is represented with a smaller set of values, reducing the storage space it requires. The goal here is to achieve compression while preserving the important features that describe the original image. SVD can be applied to any arbitrary m × n matrix, square or rectangular, invertible or not. Compression ratio and mean square error are used as performance metrics.
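
    A minimal NumPy sketch of the rank-k truncation described above is given below. The synthetic image, the choice of k, and the helper name are arbitrary; the compression ratio compares the m·n original values against the k·(m + n + 1) values needed for the truncated factors.

      import numpy as np

      def svd_compress(image, k):
          """Keep only the k largest singular values of a 2-D grayscale image."""
          u, s, vt = np.linalg.svd(image.astype(float), full_matrices=False)
          approx = u[:, :k] @ np.diag(s[:k]) @ vt[:k, :]
          mse = np.mean((image - approx) ** 2)          # reconstruction error
          ratio = image.size / (k * (image.shape[0] + image.shape[1] + 1))
          return approx, ratio, mse

      # Synthetic example; for real photographs, k around 20-50 usually keeps
      # most of the visible structure at a fraction of the storage.
      img = np.outer(np.linspace(0, 255, 256), np.ones(256)) + 20 * np.random.rand(256, 256)
      approx, ratio, mse = svd_compress(img, k=10)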

  5. Yield Improvement of the Anti-MRSA Antibiotics WAP-8294A by CRISPR/dCas9 Combined with Refactoring Self-Protection Genes in Lysobacter enzymogenes OH11.

    PubMed

    Yu, Lingjun; Su, Wei; Fey, Paul D; Liu, Fengquan; Du, Liangcheng

    2018-01-19

    The cyclic lipodepsipeptides WAP-8294A are antibiotics with potent activity against methicillin-resistant Staphylococcus aureus (MRSA). One member of this family, WAP-8294A2 (Lotilibcin), was in clinical trials due to its high activity and distinct chemistry. However, WAP-8294A compounds are produced in a very low yield by Lysobacter and only under very stringent conditions. Improving the WAP-8294A yield has therefore become critical for research on and application of these anti-MRSA compounds. Here, we report a strategy to increase WAP-8294A production. We first used the CRISPR/dCas9 system to increase the expression of five cotranscribed genes (orf1-5) in the WAP gene cluster, by fusing the omega subunit of RNA polymerase with dCas9 targeting the operon's promoter region. This increased transcription of the genes 5- to 48-fold in strain dCas9-ω3. We then refactored four putative self-protection genes (orf6, orf7, orf9 and orf10) by reorganizing them into an operon under the control of a strong Lysobacter promoter, P_HSAF. The refactored operon was introduced into strain dCas9-ω3, and transcription of the self-protection genes increased 20- to 60-fold in the resulting engineered strains. The yields of the three main WAP-8294A compounds, WAP-8294A1, WAP-8294A2, and WAP-8294A4, increased 6-, 4-, and 9-fold, respectively, in the engineered strains. The data also showed that the yield increase of WAP-8294A compounds was mainly due to the increase of the extracellular distribution. WAP-8294A2 exhibited potent (MIC 0.2-0.8 μg/mL) and specific activity against S. aureus among a battery of clinically relevant Gram-positive pathogens (54 isolates).

  6. PyGirl: Generating Whole-System VMs from High-Level Prototypes Using PyPy

    NASA Astrophysics Data System (ADS)

    Bruni, Camillo; Verwaest, Toon

    Virtual machines (VMs) emulating hardware devices are generally implemented in low-level languages for performance reasons. This results in unmaintainable systems that are difficult to understand. In this paper we report on our experience using the PyPy toolchain to improve the portability and reduce the complexity of whole-system VM implementations. As a case study we implement a VM prototype for a Nintendo Game Boy, called PyGirl, in which the high-level model is separated from low-level VM implementation issues. We shed light on the process of refactoring from a low-level VM implementation in Java to a high-level model in RPython. We show that our whole-system VM written with PyPy is significantly less complex than standard implementations, without substantial loss in performance.

  7. Towards refactoring the Molecular Function Ontology with a UML profile for function modeling.

    PubMed

    Burek, Patryk; Loebe, Frank; Herre, Heinrich

    2017-10-04

    Gene Ontology (GO) is the largest resource for cataloging gene products. This resource grows steadily and, naturally, this growth raises issues regarding the structure of the ontology. Moreover, modeling and refactoring large ontologies such as GO is generally far from being simple, as a whole as well as when focusing on certain aspects or fragments. It seems that human-friendly graphical modeling languages such as the Unified Modeling Language (UML) could be helpful in connection with these tasks. We investigate the use of UML for making the structural organization of the Molecular Function Ontology (MFO), a sub-ontology of GO, more explicit. More precisely, we present a UML dialect, called the Function Modeling Language (FueL), which is suited for capturing functions in an ontologically founded way. FueL is equipped, among other features, with language elements that arise from studying patterns of subsumption between functions. We show how to use this UML dialect for capturing the structure of molecular functions. Furthermore, we propose and discuss some refactoring options concerning fragments of MFO. FueL enables the systematic, graphical representation of functions and their interrelations, including making information explicit that is currently either implicit in MFO or is mainly captured in textual descriptions. Moreover, the considered subsumption patterns lend themselves to the methodical analysis of refactoring options with respect to MFO. On this basis we argue that the approach can increase the comprehensibility of the structure of MFO for humans and can support communication, for example, during revision and further development.

  8. RankProd 2.0: a refactored bioconductor package for detecting differentially expressed features in molecular profiling datasets.

    PubMed

    Del Carratore, Francesco; Jankevics, Andris; Eisinga, Rob; Heskes, Tom; Hong, Fangxin; Breitling, Rainer

    2017-09-01

    The Rank Product (RP) is a statistical technique widely used to detect differentially expressed features in molecular profiling experiments such as transcriptomics, metabolomics and proteomics studies. An implementation of the RP and the closely related Rank Sum (RS) statistics has been available in the RankProd Bioconductor package for several years. However, several recent advances in the understanding of the statistical foundations of the method have made a complete refactoring of the existing package desirable. We implemented a completely refactored version of the RankProd package, which provides a more principled implementation of the statistics for unpaired datasets. Moreover, the permutation-based P-value estimation methods have been replaced by exact methods, providing faster and more accurate results. RankProd 2.0 is available at Bioconductor ( https://www.bioconductor.org/packages/devel/bioc/html/RankProd.html ) and as part of the mzMatch pipeline ( http://www.mzmatch.sourceforge.net ). rainer.breitling@manchester.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
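
    For readers unfamiliar with the statistic itself, the rank product of a feature is the geometric mean of its ranks across replicate comparisons; small values flag consistently up- (or down-) regulated features. The sketch below computes the plain statistic in Python with NumPy and SciPy; it is not the RankProd 2.0 implementation and omits the exact P-value calculation the package provides.

      import numpy as np
      from scipy import stats

      def rank_product(fold_changes):
          """fold_changes: features x replicates matrix of log fold changes.

          Rank each replicate separately (rank 1 = most up-regulated), then
          take the geometric mean of the ranks for every feature.
          """
          # rankdata ranks ascending, so negate to give the largest values rank 1.
          ranks = stats.rankdata(-fold_changes, axis=0)
          return np.exp(np.mean(np.log(ranks), axis=1))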

  9. Designing Collaborative Developmental Standards by Refactoring of the Earth Science Models, Libraries, Workflows and Frameworks.

    NASA Astrophysics Data System (ADS)

    Mirvis, E.; Iredell, M.

    2015-12-01

    The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities, which depend heavily on a variety of additional components. Namely, the suite utilizes a unique collection of more than 20 in-house developed shared libraries (NCEPLIBS), specific versions of third-party libraries (netcdf, HDF, ESMF, jasper, xml, etc.), and an HPC workflow tool, all within a dedicated (sometimes vendor-customized) homogeneous HPC system environment. This domain and site specificity, combined with NCEP's product-driven, large-scale real-time data operations, complicates NCEP collaborative development tremendously by reducing the chances of replicating the OPS environment anywhere else. The mission of NOAA/NCEP's Environmental Modeling Center (EMC) is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Recognizing these difficulties, EMC has recently taken an innovative approach to improve the flexibility of the HPC environment by building the elements of, and a foundation for, an NCEP OPS functionally equivalent environment (FEE), which can also be used to ease external interface constructs. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC developed and deployed several project subset standards that have already paved the road to NCEP OPS implementation standards. In this topic we will discuss the EMC FEE for O2R requirements and approaches in collaborative standardization, including NCEPLIBS FEE and model code version control paired with the models' derived customized HPC modules and FEE footprints. We will share NCEP/EMC experience and potential in the refactoring of EMC development processes and legacy codes, and in securing model source code quality standards by using a combination of the Eclipse IDE integrated with reverse engineering tools/APIs. We will also report on collaborative efforts in the restructuring of the NOAA Environmental Modeling System (NEMS), the multi-model coupling framework, and on transitioning the FEE verification methodology.

  10. [Study on Differential Optical Absorption Spectroscopy Data Processing Based on Chirp-Z Transformation].

    PubMed

    Zheng, Hai-ming; Li, Guang-jie; Wu, Hao

    2015-06-01

    Differential optical absorption spectroscopy (DOAS) is a commonly used atmospheric pollution monitoring method. Denoising the monitored spectral data improves the inversion accuracy. Fourier-transform filtering is effective at removing noise from the spectral data, but the algorithm itself can introduce errors. In this paper, a chirp-z transform method is put forward. By locally refining the Fourier transform spectrum, it retains the denoising effect of the Fourier transform while compensating for the error of the algorithm, which further improves the inversion accuracy. The paper studies the retrieval of SO2 and NO2 concentrations. The results show that simple division causes larger errors and is not very stable, and that the chirp-z transform is more accurate than the Fourier transform. Frequency-spectrum analysis shows that the Fourier transform cannot resolve the distortion and weakening of the characteristic absorption spectrum, whereas the chirp-z transform is able to finely reconstruct the spectrum over a specific frequency band.
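
    The appeal of the chirp-z transform here is that it evaluates the spectrum on an arbitrarily fine grid inside one band of interest, so a narrow spectral region can be inspected at much higher resolution than a plain FFT of the same record. A hedged illustration, assuming SciPy 1.8 or newer (scipy.signal.zoom_fft, a chirp-z based zoom transform); the signal parameters are made up for the demo and have nothing to do with the paper's DOAS data.

      import numpy as np
      from scipy.signal import zoom_fft

      fs = 1000.0                      # sample rate (Hz), arbitrary for the demo
      t = np.arange(2048) / fs
      x = np.sin(2 * np.pi * 101.3 * t) + 0.2 * np.random.randn(t.size)

      # Plain FFT: frequency bins are spaced fs/N ~ 0.49 Hz apart.
      coarse = np.fft.rfft(x)

      # Chirp-z based zoom: evaluate 2048 points between 95 and 110 Hz only,
      # giving ~0.007 Hz spacing inside the band of interest.
      fine = zoom_fft(x, [95, 110], m=2048, fs=fs)
      freqs = np.linspace(95, 110, 2048, endpoint=False)
      peak = freqs[np.argmax(np.abs(fine))]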

  11. The GBS code for tokamak scrape-off layer simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halpern, F.D., E-mail: federico.halpern@epfl.ch; Ricci, P.; Jolliet, S.

    2016-06-15

    We describe a new version of GBS, a 3D global, flux-driven plasma turbulence code to simulate the turbulent dynamics in the tokamak scrape-off layer (SOL), superseding the code presented by Ricci et al. (2012) [14]. The present work is driven by the objective of studying SOL turbulent dynamics in medium size tokamaks and beyond with a high-fidelity physics model. We emphasize an intertwining framework of improved physics models and the computational improvements that allow them. The model extensions include neutral atom physics, finite ion temperature, the addition of a closed field line region, and a non-Boussinesq treatment of the polarization drift. GBS has been completely refactored with the introduction of a 3-D Cartesian communicator and a scalable parallel multigrid solver. We report dramatically enhanced parallel scalability, with the possibility of treating electromagnetic fluctuations very efficiently. The method of manufactured solutions as a verification process has been carried out for this new code version, demonstrating the correct implementation of the physical model.

  12. Discovery of a Phosphonoacetic Acid Derived Natural Product by Pathway Refactoring.

    PubMed

    Freestone, Todd S; Ju, Kou-San; Wang, Bin; Zhao, Huimin

    2017-02-17

    The activation of silent natural product gene clusters is a synthetic biology problem of great interest. As the rate at which gene clusters are identified outpaces the discovery rate of new molecules, this unknown chemical space is rapidly growing, as too are the rewards for developing technologies to exploit it. One class of natural products that has been underrepresented is phosphonic acids, which have important medical and agricultural uses. Hundreds of phosphonic acid biosynthetic gene clusters have been identified encoding for unknown molecules. Although methods exist to elicit secondary metabolite gene clusters in native hosts, they require the strain to be amenable to genetic manipulation. One method to circumvent this is pathway refactoring, which we implemented in an effort to discover new phosphonic acids from a gene cluster from Streptomyces sp. strain NRRL F-525. By reengineering this cluster for expression in the production host Streptomyces lividans, utility of refactoring is demonstrated with the isolation of a novel phosphonic acid, O-phosphonoacetic acid serine, and the characterization of its biosynthesis. In addition, a new biosynthetic branch point is identified with a phosphonoacetaldehyde dehydrogenase, which was used to identify additional phosphonic acid gene clusters that share phosphonoacetic acid as an intermediate.

  13. Abstract Interpreters for Free

    NASA Astrophysics Data System (ADS)

    Might, Matthew

    In small-step abstract interpretations, the concrete and abstract semantics bear an uncanny resemblance. In this work, we present an analysis-design methodology that both explains and exploits that resemblance. Specifically, we present a two-step method to convert a small-step concrete semantics into a family of sound, computable abstract interpretations. The first step re-factors the concrete state-space to eliminate recursive structure; this refactoring of the state-space simultaneously determines a store-passing-style transformation on the underlying concrete semantics. The second step uses inference rules to generate an abstract state-space and a Galois connection simultaneously. The Galois connection allows the calculation of the "optimal" abstract interpretation. The two-step process is unambiguous, but nondeterministic: at each step, analysis designers face choices. Some of these choices ultimately influence properties such as flow-, field- and context-sensitivity. Thus, under the method, we can give the emergence of these properties a graph-theoretic characterization. To illustrate the method, we systematically abstract the continuation-passing style lambda calculus to arrive at two distinct families of analyses. The first is the well-known k-CFA family of analyses. The second consists of novel "environment-centric" abstract interpretations, none of which appear in the literature on static analysis of higher-order programs.

  14. Implementation of collisions on GPU architecture in the Vorpal code

    NASA Astrophysics Data System (ADS)

    Leddy, Jarrod; Averkin, Sergey; Cowan, Ben; Sides, Scott; Werner, Greg; Cary, John

    2017-10-01

    The Vorpal code contains a variety of collision operators allowing for the simulation of plasmas containing multiple charge species interacting with neutrals, background gas, and EM fields. These existing algorithms have been improved and reimplemented to take advantage of the massive parallelization allowed by GPU architecture. The use of GPUs is most effective when algorithms are single-instruction multiple-data, so particle collisions are an ideal candidate for this parallelization technique due to their nature as a series of independent processes with the same underlying operation. This refactoring required data memory reorganization and careful consideration of device/host data allocation to minimize memory access and data communication per operation. Successful implementation has resulted in an order of magnitude increase in simulation speed for a test-case involving multiple binary collisions using the null collision method. Work supported by DARPA under contract W31P4Q-16-C-0009.
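
    As context for the test case mentioned above, the null collision method pads the true, energy-dependent collision frequency up to a constant maximum so that candidate collision events can be sampled uniformly across all particles, and each candidate is then accepted as a real collision with probability nu(E)/nu_max. A minimal serial NumPy sketch of that idea follows; it is not the Vorpal GPU implementation, and the collision-frequency function and parameters are invented.

      import numpy as np

      rng = np.random.default_rng(0)

      def nu(energy):
          """Hypothetical collision frequency versus particle energy (1/s)."""
          return 1.0e7 * energy / (1.0 + energy)

      NU_MAX = 1.0e7          # upper bound on nu(E) over the energy range
      DT = 1.0e-8             # time step (s)
      energies = rng.uniform(0.1, 10.0, size=100_000)

      # Every particle suffers a "candidate" collision with the padded
      # probability; only a fraction nu(E)/NU_MAX of those are real,
      # the rest are null events that leave the particle untouched.
      p_candidate = 1.0 - np.exp(-NU_MAX * DT)
      candidates = rng.random(energies.size) < p_candidate
      real = candidates & (rng.random(energies.size) < nu(energies) / NU_MAX)

      # `real` now indexes the particles to scatter this step.
      n_collisions = int(real.sum())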

  15. Calculation of the transverse parton distribution functions at next-to-next-to-leading order

    NASA Astrophysics Data System (ADS)

    Gehrmann, Thomas; Lübbert, Thomas; Yang, Li Lin

    2014-06-01

    We describe the perturbative calculation of the transverse parton distribution functions in all partonic channels up to next-to-next-to-leading order based on a gauge invariant operator definition. We demonstrate the cancellation of light-cone divergences and show that universal process-independent transverse parton distribution functions can be obtained through a refactorization. Our results serve as the first explicit higher-order calculation of these functions starting from first principles, and can be used to perform next-to-next-to-next-to-leading logarithmic q_T resummation for a large class of processes at hadron colliders.

  16. Development of an object-oriented ORIGEN for advanced nuclear fuel modeling applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, S.; Havloej, F.; Lago, D.

    2013-07-01

    The ORIGEN package serves as the core depletion and decay calculation module within the SCALE code system. A recent major refactoring of the ORIGEN code architecture, as part of an overall modernization of the SCALE code system, has both greatly enhanced its maintainability and afforded several new capabilities useful for incorporating depletion analysis into other code frameworks. This paper will present an overview of the improved ORIGEN code architecture (including the methods and data structures introduced) as well as current and potential future applications utilizing the new ORIGEN framework. (authors)

  17. Linked Data: Forming Partnerships at the Data Layer

    NASA Astrophysics Data System (ADS)

    Shepherd, A.; Chandler, C. L.; Arko, R. A.; Jones, M. B.; Hitzler, P.; Janowicz, K.; Krisnadhi, A.; Schildhauer, M.; Fils, D.; Narock, T.; Groman, R. C.; O'Brien, M.; Patton, E. W.; Kinkade, D.; Rauch, S.

    2015-12-01

    The challenges presented by big data are straining data management software architectures of the past. For smaller existing data facilities, the technical refactoring of software layers become costly to scale across the big data landscape. In response to these challenges, data facilities will need partnerships with external entities for improved solutions to perform tasks such as data cataloging, discovery and reuse, and data integration and processing with provenance. At its surface, the concept of linked open data suggests an uncalculated altruism. Yet, in his concept of five star open data, Tim Berners-Lee explains the strategic costs and benefits of deploying linked open data from the perspective of its consumer and producer - a data partnership. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) addresses some of the emerging needs of its research community by partnering with groups doing complementary work and linking their respective data layers using linked open data principles. Examples will show how these links, explicit manifestations of partnerships, reduce technical debt and provide a swift flexibility for future considerations.

  18. MOOSE IPL Extensions (Control Logic)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Permann, Cody

    In FY-2015, the development of MOOSE was driven by the needs of the NEAMS MOOSE-based applications, BISON, MARMOT, and RELAP-7. An emphasis was placed on the continued upkeep and improvement of MOOSE in support of the product line integration goals. New unified documentation tools have been developed, several improvements to regression testing have been put in place, and overall better software quality practices have been implemented. In addition, the Multiapps and Transfers systems have seen significant refactoring and robustness improvements, as has the "Restart and Recover" system in support of Multiapp simulations. Finally, a completely new "Control Logic" system has been engineered to replace the prototype system currently in use in the RELAP-7 code. The development of this system continues and is expected to handle existing needs as well as support future enhancements.

  19. Scintillation-Hardened GPS Receiver

    NASA Technical Reports Server (NTRS)

    Stephens, Donald R.

    2015-01-01

    CommLargo, Inc., has developed a scintillation-hardened Global Positioning System (GPS) receiver that improves reliability for low-orbit missions and complies with NASA's Space Telecommunications Radio System (STRS) architecture standards. A software-defined radio (SDR) implementation allows a single hardware element to function as either a conventional radio or as a GPS receiver, providing backup and redundancy for platforms such as the International Space Station (ISS) and high-value remote sensing platforms. The innovation's flexible SDR implementation reduces cost, weight, and power requirements. Scintillation hardening improves mission reliability and variability. In Phase I, CommLargo refactored an open-source GPS software package with Kalman filter-based tracking loops to improve performance during scintillation and also demonstrated improved navigation during a geomagnetic storm. In Phase II, the company generated a new field-programmable gate array (FPGA)-based GPS waveform to demonstrate on NASA's Space Communication and Navigation (SCaN) test bed.
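
    As background on what a Kalman filter-based tracking loop replaces (the fixed-bandwidth loop filter of a conventional PLL), a minimal two-state phase/frequency Kalman tracker can be sketched as below. This is a generic textbook-style sketch in Python, not the CommLargo receiver code; the noise levels and the discriminator input are invented for the illustration.

      import numpy as np

      def kalman_phase_tracker(discriminator_out, dt=1e-3, q=1e-2, r=1e-1):
          """Track carrier phase and frequency from noisy phase-error samples.

          State x = [phase, frequency]; the measurement is the phase error
          reported by the discriminator each integration period dt.
          """
          F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-frequency model
          H = np.array([[1.0, 0.0]])                   # we observe phase only
          Q = q * np.array([[dt**3 / 3, dt**2 / 2],    # process noise (clock, dynamics)
                            [dt**2 / 2, dt]])
          R = np.array([[r]])                          # discriminator noise variance
          x = np.zeros((2, 1))
          P = np.eye(2)
          estimates = []
          for z in discriminator_out:
              # Predict.
              x = F @ x
              P = F @ P @ F.T + Q
              # Update with the new phase measurement.
              S = H @ P @ H.T + R
              K = P @ H.T @ np.linalg.inv(S)
              x = x + K @ (np.array([[z]]) - H @ x)
              P = (np.eye(2) - K @ H) @ P
              estimates.append(x.ravel().copy())
          return np.array(estimates)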

  20. Microservices for systematic profiling and monitoring of the refactoring process at the LHCb experiment

    NASA Astrophysics Data System (ADS)

    Mazurov, Alexander; Couturier, Ben; Popov, Dmitry; Farley, Nathanael

    2017-10-01

    Any time you modify an implementation within a program, or change the compiler version or operating system, you should also do regression testing. You can do regression testing by rerunning existing tests against the changes to determine whether this breaks anything that worked prior to the change, and by writing new tests where necessary. At LHCb we have a huge codebase which is maintained by many people and can be run within different setups. This makes it crucial to guide refactoring with a central profiling system that helps run tests and assess the impact of changes. In our work we present a software architecture and tools for running a profiling system. This system is responsible for systematically running regression tests, then collecting and comparing their results so that changes between different setups can be observed and reported. The main feature of our solution is that it is based on a microservices architecture. Microservices break a large project into loosely coupled modules, which communicate with each other through simple APIs. This modular architectural style helps us avoid the common pitfalls of monolithic architectures, such as a codebase that is hard to understand and maintain, and ineffective scalability. Our solution also avoids much of the complexity of the microservices deployment process by using software containers and service-management tools. Containers and service managers let us quickly deploy linked modules in development, production or any other environment. Most of the developed modules are generic, which means that the proposed architecture and tools can be used not only at LHCb but also adopted by other experiments and companies.

  1. GeoSciML v3.0 - a significant upgrade of the CGI-IUGS geoscience data model

    NASA Astrophysics Data System (ADS)

    Raymond, O.; Duclaux, G.; Boisvert, E.; Cipolloni, C.; Cox, S.; Laxton, J.; Letourneau, F.; Richard, S.; Ritchie, A.; Sen, M.; Serrano, J.-J.; Simons, B.; Vuollo, J.

    2012-04-01

    GeoSciML version 3.0 (http://www.geosciml.org), released in late 2011, is the latest version of the CGI-IUGS* Interoperability Working Group geoscience data interchange standard. The new version is a significant upgrade and refactoring of GeoSciML v2 which was released in 2008. GeoSciML v3 has already been adopted by several major international interoperability initiatives, including OneGeology, the EU INSPIRE program, and the US Geoscience Information Network, as their standard data exchange format for geoscience data. GeoSciML v3 makes use of recently upgraded versions of several Open Geospatial Consortium (OGC) and ISO data transfer standards, including GML v3.2, SWE Common v2.0, and Observations and Measurements v2 (ISO 19156). The GeoSciML v3 data model has been refactored from a single large application schema with many packages, into a number of smaller, but related, application schema modules with individual namespaces. This refactoring allows the use and future development of modules of GeoSciML (eg; GeologicUnit, GeologicStructure, GeologicAge, Borehole) in smaller, more manageable units. As a result of this refactoring and the integration with new OGC and ISO standards, GeoSciML v3 is not backwardly compatible with previous GeoSciML versions. The scope of GeoSciML has been extended in version 3.0 to include new models for geomorphological data (a Geomorphology application schema), and for geological specimens, geochronological interpretations, and metadata for geochemical and geochronological analyses (a LaboratoryAnalysis-Specimen application schema). In addition, there is better support for borehole data, and the PhysicalProperties model now supports a wider range of petrophysical measurements. The previously used CGI_Value data type has been superseded in favour of externally governed data types provided by OGC's SWE Common v2 and GML v3.2 data standards. The GeoSciML v3 release includes worked examples of best practice in delivering geochemical analytical data using the Observations and Measurements (ISO19156) and SWE Common v2 models. The GeoSciML v3 data model does not include vocabularies to support the data model. However, it does provide a standard pattern to reference controlled vocabulary concepts using HTTP-URIs. The international GeoSciML community has developed distributed RDF-based geoscience vocabularies that can be accessed by GeoSciML web services using the standard pattern recommended in GeoSciML v3. GeoSciML v3 is the first version of GeoSciML that will be accompanied by web service validation tools using Schematron rules. For example, these validation tools may check for compliance of a web service to a particular profile of GeoSciML, or for logical consistency of data content that cannot be enforced by the application schemas. This validation process will support accreditation of GeoSciML services and a higher degree of semantic interoperability. * International Union of Geological Sciences Commission for Management and Application of Geoscience Information (CGI-IUGS)

  2. OpenSeesPy: Python library for the OpenSees finite element framework

    NASA Astrophysics Data System (ADS)

    Zhu, Minjie; McKenna, Frank; Scott, Michael H.

    2018-01-01

    OpenSees, an open source finite element software framework, has been used broadly in the earthquake engineering community for simulating the seismic response of structural and geotechnical systems. The framework allows users to perform finite element analysis with a scripting language and for developers to create both serial and parallel finite element computer applications as interpreters. For the last 15 years, Tcl has been the primary scripting language to which the model building and analysis modules of OpenSees are linked. To provide users with different scripting language options, particularly Python, the OpenSees interpreter interface was refactored to provide multi-interpreter capabilities. This refactoring, resulting in the creation of OpenSeesPy as a Python module, is accomplished through an abstract interface for interpreter calls with concrete implementations for different scripting languages. Through this approach, users are able to develop applications that utilize the unique features of several scripting languages while taking advantage of advanced finite element analysis models and algorithms.
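
    A hedged example of the Python-facing interface this refactoring enables (a linear two-node truss under a point load) is given below. The calls follow the standard OpenSeesPy pattern, but treat this as a sketch with arbitrary numbers rather than an example taken from the paper.

      import openseespy.opensees as ops

      ops.wipe()
      ops.model('basic', '-ndm', 2, '-ndf', 2)      # 2-D model, 2 DOF per node

      # Two nodes 100 length-units apart; node 1 fully fixed, node 2 held vertically.
      ops.node(1, 0.0, 0.0)
      ops.node(2, 100.0, 0.0)
      ops.fix(1, 1, 1)
      ops.fix(2, 0, 1)

      # Elastic material and a single truss element connecting the nodes.
      ops.uniaxialMaterial('Elastic', 1, 29000.0)
      ops.element('Truss', 1, 1, 2, 10.0, 1)

      # Static point load at the free node.
      ops.timeSeries('Linear', 1)
      ops.pattern('Plain', 1, 1)
      ops.load(2, 50.0, 0.0)

      # Standard static analysis objects, then solve one load step.
      ops.system('BandSPD')
      ops.numberer('RCM')
      ops.constraints('Plain')
      ops.integrator('LoadControl', 1.0)
      ops.algorithm('Linear')
      ops.analysis('Static')
      ops.analyze(1)

      print(ops.nodeDisp(2, 1))   # horizontal displacement at node 2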

  3. Horizon: The Portable, Scalable, and Reusable Framework for Developing Automated Data Management and Product Generation Systems

    NASA Astrophysics Data System (ADS)

    Huang, T.; Alarcon, C.; Quach, N. T.

    2014-12-01

    Capture, curation, and analysis are the typical activities performed at any given Earth Science data center. Modern data management systems must be adaptable to heterogeneous science data formats, scalable to meet the mission's quality-of-service requirements, and able to manage the life-cycle of any given science data product. Designing a scalable data management system doesn't happen overnight. It takes countless hours of refining, refactoring, retesting, and re-architecting. The Horizon data management and workflow framework, developed at the Jet Propulsion Laboratory, is a portable, scalable, and reusable framework for developing high-performance data management and product generation workflow systems that automate data capture, data curation, and data analysis activities. The Data Management and Archive System (DMAS) of NASA's Physical Oceanography Distributed Active Archive Center (PO.DAAC) is its core data infrastructure, handling the capture and distribution of hundreds of thousands of satellite observations each day, around the clock. DMAS is an application of the Horizon framework. The NASA Global Imagery Browse Services (GIBS) is the Earth Observing System Data and Information System (EOSDIS) solution for making high-resolution global imagery available to the science communities. The Imagery Exchange (TIE), another application of the Horizon framework, is a core subsystem of GIBS responsible for automating data capture and imagery generation to support EOSDIS' 12 distributed active archive centers and 17 Science Investigator-led Processing Systems (SIPS). This presentation discusses our ongoing effort in refining, refactoring, retesting, and re-architecting the Horizon framework to enable data-intensive science and its applications.

  4. GCS component development cycle

    NASA Astrophysics Data System (ADS)

    Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti

    2012-09-01

    The GTC is an optical-infrared 10-meter segmented-mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13/07/2007 and since then it has been in the operations phase. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA, and it is responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysing (use cases) and designing (UML) any of the GCS subsystems, an initial component description of its interface is obtained, and from that information a component specification is written. To improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, the component is generated, compiled and deployed in a single step, ready to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve of new developers and the development error rate, allows a systematic use of design patterns and software reuse, speeds up delivery of the software product, improves design consistency and design quality, and eliminates future refactoring of the generated code.
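
    The code-generation step described above can be pictured with a small, purely illustrative Python sketch that turns a component specification into the skeleton of a component class; the dictionary layout, the DeviceComponent base class name, and the generated methods are hypothetical stand-ins, not the actual GCS tooling.

        # Hypothetical sketch of spec-to-skeleton generation (not the real GCS tool chain).
        SPEC = {
            "name": "M1Mirror",
            "attributes": [("position", "float"), ("temperature", "float")],
            "commands": ["park", "track"],
        }

        def generate_skeleton(spec):
            """Emit a class skeleton from a component specification dictionary."""
            lines = [f"class {spec['name']}(DeviceComponent):  # base class assumed from a framework"]
            for attr, typ in spec["attributes"]:
                lines.append(f"    {attr}: {typ} = None")
            for cmd in spec["commands"]:
                lines.append(f"    def {cmd}(self):")
                lines.append(f"        raise NotImplementedError('{cmd} is filled in by the developer')")
            return "\n".join(lines)

        print(generate_skeleton(SPEC))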

  5. RSAT 2018: regulatory sequence analysis tools 20th anniversary.

    PubMed

    Nguyen, Nga Thi Thuy; Contreras-Moreira, Bruno; Castro-Mondragon, Jaime A; Santana-Garcia, Walter; Ossio, Raul; Robles-Espinoza, Carla Daniela; Bahin, Mathieu; Collombet, Samuel; Vincens, Pierre; Thieffry, Denis; van Helden, Jacques; Medina-Rivera, Alejandra; Thomas-Chollier, Morgane

    2018-05-02

    RSAT (Regulatory Sequence Analysis Tools) is a suite of modular tools for the detection and the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, including from genome-wide datasets like ChIP-seq/ATAC-seq, (ii) motif scanning, (iii) motif analysis (quality assessment, comparisons and clustering), (iv) analysis of regulatory variations, (v) comparative genomics. Six public servers jointly support 10 000 genomes from all kingdoms. Six novel or refactored programs have been added since the 2015 NAR Web Software Issue, including updated programs to analyse regulatory variants (retrieve-variation-seq, variation-scan, convert-variations), along with tools to extract sequences from a list of coordinates (retrieve-seq-bed), to select motifs from motif collections (retrieve-matrix), and to extract orthologs based on Ensembl Compara (get-orthologs-compara). Three use cases illustrate the integration of new and refactored tools to the suite. This Anniversary update gives a 20-year perspective on the software suite. RSAT is well-documented and available through Web sites, SOAP/WSDL (Simple Object Access Protocol/Web Services Description Language) web services, virtual machines and stand-alone programs at http://www.rsat.eu/.

  6. A posteriori operation detection in evolving software models

    PubMed Central

    Langer, Philip; Wimmer, Manuel; Brosch, Petra; Herrmannsdörfer, Markus; Seidl, Martina; Wieland, Konrad; Kappel, Gerti

    2013-01-01

    Like every software artifact, software models are subject to continuous evolution. The operations applied between two successive versions of a model are crucial for understanding its evolution. Generic approaches for detecting operations a posteriori identify atomic operations but neglect composite operations, such as refactorings, which leads to cluttered difference reports. To tackle this limitation, we present an orthogonal extension of existing atomic operation detection approaches that also detects composite operations. Our approach searches for occurrences of composite operations within a set of detected atomic operations in a post-processing manner. One major benefit is that the specifications available for executing composite operations are reused for detecting applications of them. We evaluate the accuracy of the approach in a real-world case study and investigate the scalability of our implementation in an experiment. PMID:23471366
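
    A hypothetical sketch of this post-processing idea: given the atomic operations detected between two model versions, scan them for a pattern that characterizes a composite operation (here a simplistic "rename" built from a delete plus an add with identical features). The data shapes and the matching rule are invented for illustration and are not the authors' specification language.

        # Illustrative only: detect a composite "rename" from atomic delete/add pairs.
        atomic_ops = [
            {"kind": "delete", "type": "Class", "id": "c1", "features": {"attrs": ("name", "age")}},
            {"kind": "add",    "type": "Class", "id": "c2", "features": {"attrs": ("name", "age")}},
            {"kind": "update", "type": "Attribute", "id": "a7", "features": {}},
        ]

        def detect_renames(ops):
            """Pair deletes and adds of the same element type with identical features."""
            deletes = [o for o in ops if o["kind"] == "delete"]
            adds = [o for o in ops if o["kind"] == "add"]
            composites = []
            for d in deletes:
                for a in adds:
                    if d["type"] == a["type"] and d["features"] == a["features"]:
                        composites.append({"kind": "rename", "from": d["id"], "to": a["id"]})
            return composites

        print(detect_renames(atomic_ops))  # -> [{'kind': 'rename', 'from': 'c1', 'to': 'c2'}]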

  7. EMAN2: an extensible image processing suite for electron microscopy.

    PubMed

    Tang, Guang; Peng, Liwei; Baldwin, Philip R; Mann, Deepinder S; Jiang, Wen; Rees, Ian; Ludtke, Steven J

    2007-01-01

    EMAN is a scientific image processing package with a particular focus on single particle reconstruction from transmission electron microscopy (TEM) images. It was first released in 1999, and new versions have been released typically 2-3 times each year since that time. EMAN2 has been under development for the last two years, with a completely refactored image processing library, and a wide range of features to make it much more flexible and extensible than EMAN1. The user-level programs are better documented, more straightforward to use, and written in the Python scripting language, so advanced users can modify the programs' behavior without any recompilation. A completely rewritten 3D transformation class simplifies translation between Euler angle standards and symmetry conventions. The core C++ library has over 500 functions for image processing and associated tasks, and it is modular with introspection capabilities, so programmers can add new algorithms with minimal effort and programs can incorporate new capabilities automatically. Finally, a flexible new parallelism system has been designed to address the shortcomings in the rigid system in EMAN1.
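
    Because the refactored library is scriptable from Python, user-level processing can be driven directly from short scripts; the sketch below is a minimal example assuming EMAN2 is installed and an input file named particle.hdf exists, and the choice of the "normalize" processor is only one plausible option.

        # Minimal sketch of scripting the EMAN2 library from Python.
        # Assumes EMAN2 is installed and 'particle.hdf' exists; processor choice is illustrative.
        from EMAN2 import EMData

        img = EMData("particle.hdf")           # read a single-particle image
        img.process_inplace("normalize")       # apply a normalization processor in place
        print(img.get_attr("sigma"))           # inspect a standard image attribute
        img.write_image("particle_norm.hdf")   # write the processed image back out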

  8. Introducing Object-Oriented Concepts into GSI

    NASA Technical Reports Server (NTRS)

    Guo, Jing; Todling, Ricardo

    2017-01-01

    Enhancements are now being made to the Gridpoint Statistical Interpolation (GSI) data assimilation system to expand its capabilities. This effort opens the way for broadening the scope of GSI's applications by using some standard object-oriented features in Fortran, and represents a starting point for the so-called GSI refactoring, as a part of the Joint Effort for Data assimilation Integration (JEDI) project of JCSDA.

  9. Refactorizing NRQCD short-distance coefficients in exclusive quarkonium production

    NASA Astrophysics Data System (ADS)

    Jia, Yu; Yang, Deshan

    2009-06-01

    In a typical exclusive quarkonium production process, when the center-of-mass energy, √s, is much greater than the heavy quark mass m, large kinematic logarithms of s/m will unavoidably arise at each order of the perturbative expansion in the short-distance coefficients of the nonrelativistic QCD (NRQCD) factorization formalism, which may potentially harm the perturbative expansion. This symptom reflects that the hard regime in NRQCD factorization is too coarse and should be further factorized. We suggest that this regime can be further separated into "hard" and "collinear" degrees of freedom, so that the familiar light-cone approach can be employed to reproduce the NRQCD matching coefficients at the zeroth order of m/s and order by order in α. Taking two simple processes, exclusive η_c+γ production in e+e− annihilation and Higgs boson radiative decay into ϒ, as examples, we illustrate how the leading logarithms of s/m in the NRQCD matching coefficients are identified and summed to all orders in α with the aid of the Brodsky-Lepage evolution equation.
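
    The resummation invoked above rests on the Brodsky-Lepage (ERBL) evolution of the light-cone distribution amplitude; written schematically from the standard literature rather than from the abstract itself, the evolution equation reads

        \mu^2 \frac{\partial}{\partial \mu^2}\, \phi(x,\mu)
            = \int_0^1 dy\; V\!\left(x, y; \alpha_s(\mu)\right)\, \phi(y,\mu),

    and its leading-logarithmic solution, expanded in Gegenbauer moments, is what sums the large logarithms of s/m appearing in the NRQCD matching coefficients.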

  10. A plug-in to Eclipse for VHDL source codes: functionalities

    NASA Astrophysics Data System (ADS)

    Niton, B.; Poźniak, K. T.; Romaniuk, R. S.

    The paper presents an original application, written by the authors, which supports writing and editing of source code in the VHDL language. It is a step towards fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGA and/or DSP processors. The implementation described is based on VEditor, a freely licensed program; the work presented in this paper thus supplements and extends that free tool. The introduction briefly characterizes the tools available on the market for aiding the design of electronic systems in VHDL, with particular attention to plug-ins for the Eclipse environment and the Emacs program. Detailed properties of the plug-in are then presented, such as the programming extension concept and the results of the formatter, re-factorizer, code hider, and other new additions to the VEditor program.

  11. AWARE: Adaptive Software Monitoring and Dynamic Reconfiguration for Critical Infrastructure Protection

    DTIC Science & Technology

    2015-04-29

    in which we applied these adaptation patterns to an adaptive news web server intended to tolerate extremely heavy, unexpected loads. To address...collection of existing models used as benchmarks for OO-based refactoring and an existing web-based repository called REMODD to provide users with model...invariant properties. Specifically, we developed Avida-MDE (based on the Avida digital evolution platform) to support the automatic generation of software

  12. SpecTAD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zamora, Richard; Voter, Arthur; Uberuaga, Blas

    2017-10-23

    The SpecTAD software represents a refactoring of the Temperature Accelerated Dynamics (TAD2) code authored by Arthur F. Voter and Blas P. Uberuaga (LA-CC-02-05). SpecTAD extends the capabilities of TAD2, by providing algorithms for both temporal and spatial parallelism. The novel algorithms for temporal parallelism include both speculation and replication based techniques. SpecTAD also offers the optional capability to dynamically link to the open-source LAMMPS package.

  13. Rationally reduced libraries for combinatorial pathway optimization minimizing experimental effort.

    PubMed

    Jeschek, Markus; Gerngross, Daniel; Panke, Sven

    2016-03-31

    Rational flux design in metabolic engineering approaches remains difficult since important pathway information is frequently not available. Therefore empirical methods are applied that randomly change absolute and relative pathway enzyme levels and subsequently screen for variants with improved performance. However, screening is often limited on the analytical side, generating a strong incentive to construct small but smart libraries. Here we introduce RedLibs (Reduced Libraries), an algorithm that allows for the rational design of smart combinatorial libraries for pathway optimization thereby minimizing the use of experimental resources. We demonstrate the utility of RedLibs for the design of ribosome-binding site libraries by in silico and in vivo screening with fluorescent proteins and perform a simple two-step optimization of the product selectivity in the branched multistep pathway for violacein biosynthesis, indicating a general applicability for the algorithm and the proposed heuristics. We expect that RedLibs will substantially simplify the refactoring of synthetic metabolic pathways.
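
    The selection idea behind such reduced libraries can be sketched in a few lines of Python: from a large pool of predicted variant strengths, pick a small subset whose values cover the design range as evenly as possible. The greedy picker and the random "predicted strength" values below are illustrative assumptions, not the published RedLibs algorithm.

        # Illustrative greedy sketch: choose k variants whose predicted strengths are
        # spread roughly uniformly over the observed range (not the RedLibs code).
        import numpy as np

        rng = np.random.default_rng(0)
        predicted_strength = rng.uniform(0.0, 1.0, size=200)   # e.g. predicted RBS strengths

        def reduced_library(strengths, k):
            targets = np.linspace(strengths.min(), strengths.max(), k)  # ideal uniform grid
            chosen = {int(np.argmin(np.abs(strengths - t))) for t in targets}
            return sorted(chosen)

        print(reduced_library(predicted_strength, 12))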

  14. Metabolic engineering of an industrial polyoxin producer for the targeted overproduction of designer nucleoside antibiotics.

    PubMed

    Qi, Jianzhao; Liu, Jin; Wan, Dan; Cai, You-Sheng; Wang, Yinghu; Li, Shunying; Wu, Pan; Feng, Xuan; Qiu, Guofu; Yang, Sheng-Ping; Chen, Wenqing; Deng, Zixin

    2015-09-01

    Polyoxin and nikkomycin are naturally occurring peptidyl nucleoside antibiotics with potent antifungal bioactivity. Both exhibit similar structural features, having a nucleoside skeleton and one or two peptidyl moieties. Combining the refactoring of the polyoxin producer Streptomyces aureochromogenes with import of the hydroxypyridylhomothreonine pathway of nikkomycin allows the targeted production of three designer nucleoside antibiotics designated as nikkoxin E, F, and G. These structures were determined by NMR and/or high resolution mass spectrometry. Remarkably, the introduction of an extra copy of the nikS gene encoding an ATP-dependent ligase significantly enhanced the production of the designer antibiotics. Moreover, all three nikkoxins displayed improved bioactivity against several pathogenic fungi as compared with the naturally-occurring antibiotics. These data provide a feasible model for high efficiency generation of nucleoside antibiotics related to polyoxins and nikkomycins in a polyoxin cell factory via synthetic biology strategy. © 2015 Wiley Periodicals, Inc.

  15. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark A.

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify the execution of such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly.
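
    One concrete flavour of the subsetting challenge mentioned above is that services must read pieces of a granule out of object storage rather than seeking within a local file; a minimal sketch using boto3 byte-range reads is shown below, where the bucket name, object key, and byte offsets are placeholders and AWS credentials are assumed to be configured.

        # Minimal sketch: read only a byte range of a granule from web object storage.
        # Bucket, key, and offsets are placeholders, not real NASA holdings.
        import boto3

        s3 = boto3.client("s3")
        resp = s3.get_object(
            Bucket="example-earthdata-bucket",
            Key="granules/example-sensor/2017/001/granule.hdf",
            Range="bytes=0-65535",   # fetch only the header region
        )
        header_bytes = resp["Body"].read()
        print(len(header_bytes), "bytes read without downloading the full granule")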

  16. FLOWER IPv4/IPv6 Network Flow Summarization software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nickless, Bill; Curtis, Darren; Christy, Jason

    FLOWER was written as a refactoring/reimplementation of the existing Flo software used by the Cooperative Protection Program (CPP) to provide network flow summaries for analysis by the Operational Analysis Center (OAC) and other US Department of Energy cyber security elements. FLOWER is designed and tested to operate at 10 gigabits/second, nearly 10 times faster than competing solutions. FLOWER output is optimized for importation into SQL databases for categorization and analysis. FLOWER is written in C++ using current best software engineering practices.

  17. Governance in Open Source Software Development Projects: Towards a Model for Network-Centric Edge Organizations

    DTIC Science & Technology

    2008-06-01

    project is not an isolated OSSD project. Instead, the NetBeans IDE which is the focus of development activities in the NetBeans.org project community...facilitate or constrain the intended usage of the NetBeans IDE. Figure 1 provides a rendering of some of the more visible OSSD projects that...as BioBeans and RefactorIT communities build tools on top of or extending the NetBeans platform or IDE. How do these organizations interact with

  18. ESA's Planetary Science Archive: Preserve and present reliable scientific data sets

    NASA Astrophysics Data System (ADS)

    Besse, S.; Vallat, C.; Barthelemy, M.; Coia, D.; Costa, M.; De Marchi, G.; Fraga, D.; Grotheer, E.; Heather, D.; Lim, T.; Martinez, S.; Arviset, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A.; Rios, C.; Saiz, J.; Vallejo, F.

    2018-01-01

    The European Space Agency (ESA) Planetary Science Archive (PSA) is undergoing a significant refactoring of all its components to improve the services provided to the scientific community and the public. The PSA supports ESA's missions exploring the Solar System by archiving scientific peer-reviewed observations as well as engineering data sets. This includes the Giotto, SMART-1, Huygens, Venus Express, Mars Express, Rosetta, Exomars 2016, Exomars RSP, BepiColombo, and JUICE missions. The PSA offers a newly designed graphical user interface meant both to maximise interaction with the scientific observations and to minimise the effort needed to download them. The PSA still offers the same services as before (i.e., FTP, documentation, helpdesk, etc.). In addition, it will support the two formats of the Planetary Data System (i.e., PDS3 and PDS4), as well as provide new ways of searching the data products by specific metadata and geometrical parameters. Beyond these enhanced services, the PSA will also provide new services to improve the visualisation of data products and scientific content (e.g., spectra). Together with improved access to the spacecraft engineering data sets, the PSA will provide easier access to scientific data products, helping to maximise the science return of ESA's space missions.

  19. Modular design of metabolic network for robust production of n-butanol from galactose-glucose mixtures.

    PubMed

    Lim, Hyun Gyu; Lim, Jae Hyung; Jung, Gyoo Yeol

    2015-01-01

    Refactoring microorganisms for efficient production of advanced biofuels such as n-butanol from a mixture of sugars in cheap feedstock is a prerequisite for achieving economic feasibility in biorefinery. However, production of biofuel from inedible and cheap feedstock is highly challenging due to the slower utilization of biomass-derived sugars arising from their complex assimilation pathways, difficulties in amplifying biosynthetic pathways for heterologous metabolites, and redox imbalance caused by consuming intracellular reducing power to produce highly reduced biofuels. Even with these problems, the microorganisms should show robust production of biofuel to be industrially feasible. Thus, refactoring microorganisms for efficient conversion is highly desirable in biofuel production. In this study, we engineered robust Escherichia coli to accomplish high production of n-butanol from galactose-glucose mixtures via the design of modular pathways, an efficient and systematic way to reconstruct the entire metabolic pathway with many target genes. Three modular pathways designed using predictable genetic elements were assembled for efficient galactose utilization, n-butanol production, and redox re-balancing to robustly produce n-butanol from a sugar mixture of galactose and glucose. Specifically, the engineered strain showed dramatically increased n-butanol production (3.3-fold increased to 6.2 g/L after 48-h fermentation) compared to the parental strain (1.9 g/L) in galactose-supplemented medium. Moreover, fermentation with mixtures of galactose and glucose at various ratios from 2:1 to 1:2 confirmed that our engineered strain was able to robustly produce n-butanol regardless of sugar composition, with simultaneous utilization of galactose and glucose. Collectively, modular pathway engineering of the metabolic network can be an effective approach in strain development for optimal biofuel production with cost-effective fermentable sugars. To the best of our knowledge, this study demonstrates the first and highest n-butanol production from galactose in E. coli. Moreover, robust production of n-butanol from sugar mixtures of variable composition should facilitate the economic feasibility of microbial processes using mixtures of sugars from cheap biomass in the near future.

  20. An experiment in big data: storage, querying and visualisation of data taken from the Liverpool Telescope's wide field cameras

    NASA Astrophysics Data System (ADS)

    Barnsley, R. M.; Steele, Iain A.; Smith, R. J.; Mawson, Neil R.

    2014-07-01

    The Small Telescopes Installed at the Liverpool Telescope (STILT) project has been in operation since March 2009, collecting data with three wide field unfiltered cameras: SkycamA, SkycamT and SkycamZ. To process the data, a pipeline was developed to automate source extraction, catalogue cross-matching, photometric calibration and database storage. In this paper, modifications and further developments to this pipeline will be discussed, including a complete refactor of the pipeline's codebase into Python, migration of the back-end database technology from MySQL to PostgreSQL, and changing the catalogue used for source cross-matching from USNO-B1 to APASS. In addition to this, details will be given relating to the development of a preliminary front-end to the source extracted database which will allow a user to perform common queries such as cone searches and light curve comparisons of catalogue and non-catalogue matched objects. Some next steps and future ideas for the project will also be presented.
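
    A cone search of the kind mentioned above can be expressed directly against a PostgreSQL back end; the sketch below uses a simple small-angle bounding box and invented table/column names, so it illustrates the pattern rather than the actual STILT schema (production systems typically use an indexed extension such as q3c or pgSphere).

        # Illustrative cone (bounding-box) search against PostgreSQL; names are invented.
        import psycopg2

        ra0, dec0, radius_deg = 150.1, 2.2, 0.05

        query = """
            SELECT source_id, ra, dec, mag
            FROM skycam_sources
            WHERE dec BETWEEN %(dec0)s - %(r)s AND %(dec0)s + %(r)s
              AND ra  BETWEEN %(ra0)s - %(r)s / cos(radians(%(dec0)s))
                          AND %(ra0)s + %(r)s / cos(radians(%(dec0)s))
        """

        with psycopg2.connect("dbname=skycam user=reader") as conn:
            with conn.cursor() as cur:
                cur.execute(query, {"ra0": ra0, "dec0": dec0, "r": radius_deg})
                for row in cur.fetchall():
                    print(row)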

  1. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify the execution of such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly.

  2. Establishing the Common Community Physics Package by Transitioning the GFS Physics to a Collaborative Software Framework

    NASA Astrophysics Data System (ADS)

    Xue, L.; Firl, G.; Zhang, M.; Jimenez, P. A.; Gill, D.; Carson, L.; Bernardet, L.; Brown, T.; Dudhia, J.; Nance, L. B.; Stark, D. R.

    2017-12-01

    The Global Model Test Bed (GMTB) has been established to support the evolution of atmospheric physical parameterizations in NCEP global modeling applications. To accelerate the transition to the Next Generation Global Prediction System (NGGPS), a collaborative model development framework known as the Common Community Physics Package (CCPP) is created within the GMTB to facilitate engagement from the broad community on physics experimentation and development. A key component to this Research to Operation (R2O) software framework is the Interoperable Physics Driver (IPD) that hooks the physics parameterizations from one end to the dynamical cores on the other end with minimum implementation effort. To initiate the CCPP, scientists and engineers from the GMTB separated and refactored the GFS physics. This exercise demonstrated the process of creating IPD-compliant code and can serve as an example for other physics schemes to do the same and be considered for inclusion into the CCPP. Further benefits to this process include run-time physics suite configuration and considerably reduced effort for testing modifications to physics suites through GMTB's physics test harness. The implementation will be described and the preliminary results will be presented at the conference.

  3. Production of Fatty Acid-Derived Valuable Chemicals in Synthetic Microbes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Ai-Qun; Pratomo Juwono, Nina Kurniasih; Synthetic Biology Research Program, National University of Singapore, Singapore

    Fatty acid derivatives, such as hydroxy fatty acids, fatty alcohols, fatty acid methyl/ethyl esters, and fatty alka(e)nes, have a wide range of industrial applications including plastics, lubricants, and fuels. Currently, these chemicals are obtained mainly through chemical synthesis, which is complex and costly, and their availability from natural biological sources is extremely limited. Metabolic engineering of microorganisms has provided a platform for effective production of these valuable biochemicals. Notably, synthetic biology-based metabolic engineering strategies have been extensively applied to refactor microorganisms for improved biochemical production. Here, we reviewed: (i) the current status of metabolic engineering of microbes that produce fatty acid-derived valuable chemicals, and (ii) the recent progress of synthetic biology approaches that assist metabolic engineering, such as mRNA secondary structure engineering, sensor-regulator system, regulatable expression system, ultrasensitive input/output control system, and computer science-based design of complex gene circuits. Furthermore, key challenges and strategies were discussed. Finally, we concluded that synthetic biology provides useful metabolic engineering strategies for economically viable production of fatty acid-derived valuable chemicals in engineered microbes.

  4. Production of Fatty Acid-Derived Valuable Chemicals in Synthetic Microbes

    PubMed Central

    Yu, Ai-Qun; Pratomo Juwono, Nina Kurniasih; Leong, Susanna Su Jan; Chang, Matthew Wook

    2014-01-01

    Fatty acid derivatives, such as hydroxy fatty acids, fatty alcohols, fatty acid methyl/ethyl esters, and fatty alka(e)nes, have a wide range of industrial applications including plastics, lubricants, and fuels. Currently, these chemicals are obtained mainly through chemical synthesis, which is complex and costly, and their availability from natural biological sources is extremely limited. Metabolic engineering of microorganisms has provided a platform for effective production of these valuable biochemicals. Notably, synthetic biology-based metabolic engineering strategies have been extensively applied to refactor microorganisms for improved biochemical production. Here, we reviewed: (i) the current status of metabolic engineering of microbes that produce fatty acid-derived valuable chemicals, and (ii) the recent progress of synthetic biology approaches that assist metabolic engineering, such as mRNA secondary structure engineering, sensor-regulator system, regulatable expression system, ultrasensitive input/output control system, and computer science-based design of complex gene circuits. Furthermore, key challenges and strategies were discussed. Finally, we concluded that synthetic biology provides useful metabolic engineering strategies for economically viable production of fatty acid-derived valuable chemicals in engineered microbes. PMID:25566540

  5. Umbra (core)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Jon David; Oppel III, Fred J.; Hart, Brian E.

    Umbra is a flexible simulation framework for complex systems that can be used by itself for modeling, simulation, and analysis, or to create specific applications. It has been applied to many operations, primarily dealing with robotics and system of system simulations. This version, from 4.8 to 4.8.3b, incorporates bug fixes, refactored code, and new managed C++ wrapper code that can be used to bridge new applications written in C# to the C++ libraries. The new managed C++ wrapper code includes (project/directories) BasicSimulation, CSharpUmbraInterpreter, LogFileView, UmbraAboutBox, UmbraControls, UmbraMonitor and UmbraWrapper.

  6. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing. The PRIMA Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D.; Wolf, Felix G.

    2014-01-31

    The growing number of cores provided by today’s high-end computing systems presents substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to accomplish these objectives: (1) refactor TAU and Scalasca performance system components for core code sharing and (2) integrate TAU and Scalasca functionality through data interfaces, formats, and utilities. As presented in this report, the project has completed these goals. In addition to shared technical advances, the groups have worked to engage with users through application performance engineering and tools training. In this regard, the project benefits from the close interactions the teams have with national laboratories in the United States and Germany. We have also sought to enhance our interactions through joint tutorials and outreach. UO has become a member of the Virtual Institute of High-Productivity Supercomputing (VI-HPS) established by the Helmholtz Association of German Research Centres as a center of excellence, focusing on HPC tools for diagnosing programming errors and optimizing performance. UO and FZJ have conducted several VI-HPS training activities together within the past three years.

  7. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing: the PRIMA Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D.; Wolf, Felix G.

    2014-01-31

    The growing number of cores provided by today’s high-end computing systems presents substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to accomplish these objectives: (1) refactor TAU and Scalasca performance system components for core code sharing and (2) integrate TAU and Scalasca functionality through data interfaces, formats, and utilities. As presented in this report, the project has completed these goals. In addition to shared technical advances, the groups have worked to engage with users through application performance engineering and tools training. In this regard, the project benefits from the close interactions the teams have with national laboratories in the United States and Germany. We have also sought to enhance our interactions through joint tutorials and outreach. UO has become a member of the Virtual Institute of High-Productivity Supercomputing (VI-HPS) established by the Helmholtz Association of German Research Centres as a center of excellence, focusing on HPC tools for diagnosing programming errors and optimizing performance. UO and FZJ have conducted several VI-HPS training activities together within the past three years.

  8. Transverse vetoes with rapidity cutoff in SCET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornig, Andrew; Kang, Daekyoung; Makris, Yiannis

    We consider di-jet production in hadron collisions where a transverse veto is imposed on radiation for (pseudo-)rapidities in the central region only, where this central region is defined with rapidity cutoff. For the case where the transverse measurement (e.g., transverse energy or min p_T for jet veto) is parametrically larger relative to the typical transverse momentum beyond the cutoff, the cross section is insensitive to the cutoff parameter and is factorized in terms of collinear and soft degrees of freedom. The virtuality for these degrees of freedom is set by the transverse measurement, as in typical transverse-momentum dependent observables such as Drell-Yan, Higgs production, and the event shape broadening. This paper focuses on the other region, where the typical transverse momentum below and beyond the cutoff is of similar size. In this region the rapidity cutoff further resolves soft radiation into (u)soft and soft-collinear radiation with different rapidities but identical virtuality. This gives rise to rapidity logarithms of the rapidity cutoff parameter which we resum using renormalization group methods. We factorize the cross section in this region in terms of soft and collinear functions in the framework of soft-collinear effective theory, then further refactorize the soft function as a convolution of the (u)soft and soft-collinear functions. All these functions are calculated at one-loop order. As an example, we calculate a differential cross section for a specific partonic channel, qq' → qq', for the jet shape angularities and show that the refactorization allows us to resum the rapidity logarithms and significantly reduce theoretical uncertainties in the jet shape spectrum.
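
    Schematically, the refactorization described above writes the measured soft function as a convolution of a wide-angle (u)soft piece with soft-collinear pieces tied to the rapidity cutoff; suppressing colour structure and regulator details, a purely schematic rendering is

        S(k_T, \Delta y; \mu, \nu)
            \simeq S_{us}(k_T; \mu, \nu)
               \otimes S_{sc}^{(1)}(k_T, \Delta y; \mu, \nu)
               \otimes S_{sc}^{(2)}(k_T, \Delta y; \mu, \nu),

    where renormalization-group evolution in the scales separating these modes is what resums the logarithms of the cutoff parameter.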

  9. Transverse vetoes with rapidity cutoff in SCET

    DOE PAGES

    Hornig, Andrew; Kang, Daekyoung; Makris, Yiannis; ...

    2017-12-11

    We consider di-jet production in hadron collisions where a transverse veto is imposed on radiation for (pseudo-)rapidities in the central region only, where this central region is defined with rapidity cutoff. For the case where the transverse measurement (e.g., transverse energy or min p_T for jet veto) is parametrically larger relative to the typical transverse momentum beyond the cutoff, the cross section is insensitive to the cutoff parameter and is factorized in terms of collinear and soft degrees of freedom. The virtuality for these degrees of freedom is set by the transverse measurement, as in typical transverse-momentum dependent observables such as Drell-Yan, Higgs production, and the event shape broadening. This paper focuses on the other region, where the typical transverse momentum below and beyond the cutoff is of similar size. In this region the rapidity cutoff further resolves soft radiation into (u)soft and soft-collinear radiation with different rapidities but identical virtuality. This gives rise to rapidity logarithms of the rapidity cutoff parameter which we resum using renormalization group methods. We factorize the cross section in this region in terms of soft and collinear functions in the framework of soft-collinear effective theory, then further refactorize the soft function as a convolution of the (u)soft and soft-collinear functions. All these functions are calculated at one-loop order. As an example, we calculate a differential cross section for a specific partonic channel, qq' → qq', for the jet shape angularities and show that the refactorization allows us to resum the rapidity logarithms and significantly reduce theoretical uncertainties in the jet shape spectrum.

  10. High-resolution RCMs as pioneers for future GCMs

    NASA Astrophysics Data System (ADS)

    Schar, C.; Ban, N.; Arteaga, A.; Charpilloz, C.; Di Girolamo, S.; Fuhrer, O.; Hoefler, T.; Leutwyler, D.; Lüthi, D.; Piaget, N.; Ruedisuehli, S.; Schlemmer, L.; Schulthess, T. C.; Wernli, H.

    2017-12-01

    Currently large efforts are underway to refine the horizontal resolution of global and regional climate models to O(1 km), with the intent to represent convective clouds explicitly rather than using semi-empirical parameterizations. This refinement will move the governing equations closer to first principles and is expected to reduce the uncertainties of climate models. High resolution is particularly attractive in order to better represent critical cloud feedback processes (e.g. related to global climate sensitivity and extratropical summer convection) and extreme events (such as heavy precipitation events, floods, and hurricanes). The presentation will be illustrated using decade-long simulations at 2 km horizontal grid spacing, some of these covering the European continent on a computational mesh with 1536x1536x60 grid points. To accomplish such simulations, use is made of emerging heterogeneous supercomputing architectures, using a version of the COSMO limited-area weather and climate model that is able to run entirely on GPUs. Results show that kilometer-scale resolution dramatically improves the simulation of precipitation in terms of the diurnal cycle and short-term extremes. The modeling framework is used to address changes of precipitation scaling with climate change. It is argued that already today, modern supercomputers would in principle enable global atmospheric convection-resolving climate simulations, provided appropriately refactored codes were available, and provided solutions were found to cope with the rapidly growing output volume. A discussion will be provided of key challenges affecting the design of future high-resolution climate models. It is suggested that km-scale RCMs should be exploited to pioneer this terrain, at a time when GCMs are not yet available at such resolutions. Areas of interest include the development of new parameterization schemes adequate for km-scale resolution, the exploration of new validation methodologies and data sets, the assessment of regional-scale climate feedback processes, and the development of alternative output analysis methodologies.

  11. EMAGE mouse embryo spatial gene expression database: 2010 update

    PubMed Central

    Richardson, Lorna; Venkataraman, Shanmugasundaram; Stevenson, Peter; Yang, Yiya; Burton, Nicholas; Rao, Jianguo; Fisher, Malcolm; Baldock, Richard A.; Davidson, Duncan R.; Christiansen, Jeffrey H.

    2010-01-01

    EMAGE (http://www.emouseatlas.org/emage) is a freely available online database of in situ gene expression patterns in the developing mouse embryo. Gene expression domains from raw images are extracted and integrated spatially into a set of standard 3D virtual mouse embryos at different stages of development, which allows data interrogation by spatial methods. An anatomy ontology is also used to describe sites of expression, which allows data to be queried using text-based methods. Here, we describe recent enhancements to EMAGE including: the release of a completely re-designed website, which offers integration of many different search functions in HTML web pages, improved user feedback and the ability to find similar expression patterns at the click of a button; back-end refactoring from an object oriented to relational architecture, allowing associated SQL access; and the provision of further access by standard formatted URLs and a Java API. We have also increased data coverage by sourcing from a greater selection of journals and developed automated methods for spatial data annotation that are being applied to spatially incorporate the genome-wide (∼19 000 gene) ‘EURExpress’ dataset into EMAGE. PMID:19767607
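
    With the relational back end and "associated SQL access" noted above, spatial and textual annotations can be queried with ordinary SQL; the sketch below invents a toy table layout (the real EMAGE schema is not described in the abstract) and uses SQLite only so the example is self-contained.

        # Hypothetical SQL against a relational gene-expression schema (invented names).
        import sqlite3  # stand-in engine so the sketch is self-contained

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE expression (gene TEXT, stage TEXT, structure TEXT, strength TEXT);
            INSERT INTO expression VALUES
              ('Shh',  'TS20', 'floor plate', 'strong'),
              ('Pax6', 'TS20', 'forebrain',   'moderate'),
              ('Shh',  'TS21', 'notochord',   'strong');
        """)

        rows = conn.execute(
            "SELECT stage, structure, strength FROM expression WHERE gene = ?", ("Shh",)
        ).fetchall()
        print(rows)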

  12. DMI's Baltic Sea Coastal operational forecasting system

    NASA Astrophysics Data System (ADS)

    Murawski, Jens; Berg, Per; Weismann Poulsen, Jacob

    2017-04-01

    Operational forecasting is challenged with bridging the gap between the large scales of the driving weather systems and the local, human scales of the model applications. The limit of what can be represented by a local model has been continuously shifted to higher and higher spatial resolution, with the aim of better resolving the local dynamics and making it possible to describe processes that could only be parameterised in older versions, with the ultimate goal of improving the quality of the forecast. Current hardware trends demand a stronger focus on the development of efficient, highly parallelised software and require a refactoring of the code with a solid focus on portable performance. The gained performance can be used for running high-resolution models with larger coverage. Together with the development of efficient two-way nesting routines, this has made it possible to approach the near-coastal zone with model applications that can run in a time-effective way. Denmark's Meteorological Institute uses the HBM(1) ocean circulation model for applications that cover the entire Baltic Sea and North Sea with an integrated model set-up that spans the range of horizontal resolution from 1nm for the entire Baltic Sea to approx. 200m resolution in local fjords (Limfjord). For the next model generation, the high resolution set-ups are going to be extended and new high resolution domains in coastal zones are either implemented or tested for operational use. For the first time it will be possible to cover large stretches of the Baltic coastal zone with sufficiently high resolution to model the local hydrodynamics adequately. (1) HBM stands for HIROMB-BOOS-Model, whereas HIROMB stands for "High Resolution Model for the Baltic Sea" and BOOS stands for "Baltic Operational Oceanography System".

  13. A comparison of native GPU computing versus OpenACC for implementing flow-routing algorithms in hydrological applications

    NASA Astrophysics Data System (ADS)

    Rueda, Antonio J.; Noguera, José M.; Luque, Adrián

    2016-02-01

    In recent years GPU computing has gained wide acceptance as a simple low-cost solution for speeding up computationally expensive processing in many scientific and engineering applications. However, in most cases accelerating a traditional CPU implementation for a GPU is a non-trivial task that requires a thorough refactorization of the code and specific optimizations that depend on the architecture of the device. OpenACC is a promising technology that aims at reducing the effort required to accelerate C/C++/Fortran code on an attached multicore device. With this technology, the CPU code virtually only has to be augmented with a few compiler directives that identify the areas to be accelerated and the way in which data have to be moved between the CPU and GPU. Its potential benefits are multiple: better code readability, less development time, lower risk of errors and less dependency on the underlying architecture and future evolution of the GPU technology. Our aim with this work is to evaluate the pros and cons of using OpenACC against native GPU implementations in computationally expensive hydrological applications, using the classic D8 algorithm of O'Callaghan and Mark for river network extraction as a case study. We implemented the flow accumulation step of this algorithm on the CPU, with OpenACC, and in two different CUDA versions, comparing the length and complexity of the code and its performance with different datasets. We conclude that although OpenACC cannot match the performance of a CUDA-optimized implementation (×3.5 slower on average), it provides a significant performance improvement over a CPU implementation (×2-6), with far simpler code and less implementation effort.
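
    For reference, the case-study kernel itself is easy to state: each cell drains to its steepest downslope neighbour among the eight surrounding cells, and flow accumulation counts how many cells drain, directly or indirectly, through each cell. The compact serial Python sketch below is written only to make that kernel concrete; it is not the CPU, OpenACC, or CUDA code compared in the study.

        # Serial sketch of D8 flow accumulation on a tiny DEM (illustrative only).
        import numpy as np

        dem = np.array([[9.0, 8.0, 7.0],
                        [8.0, 6.0, 5.0],
                        [7.0, 5.0, 1.0]])

        OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

        def d8_direction(dem, r, c):
            """Return the steepest downslope neighbour of (r, c), or None for a pit/outlet."""
            best, best_drop = None, 0.0
            for dr, dc in OFFSETS:
                rr, cc = r + dr, c + dc
                if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                    drop = (dem[r, c] - dem[rr, cc]) / (dr * dr + dc * dc) ** 0.5
                    if drop > best_drop:
                        best, best_drop = (rr, cc), drop
            return best

        def flow_accumulation(dem):
            acc = np.ones_like(dem)  # each cell contributes itself
            cells = np.dstack(np.unravel_index(np.argsort(-dem, axis=None), dem.shape))[0]
            for r, c in cells:       # visit cells from high to low elevation
                target = d8_direction(dem, int(r), int(c))
                if target is not None:
                    acc[target] += acc[int(r), int(c)]
            return acc

        print(flow_accumulation(dem))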

  14. Multivariate and Naive Bayes Text Classification Approach to Cost Growth Risk in Department of Defense Acquisition Programs

    DTIC Science & Technology

    2013-03-01

    (Search-result snippet: the matched text is an excerpt from the report's per-term probability tables for the Naive Bayes text classifier, in which terms such as "alerts", "amplifier", "aperture", "java", "refactoring", and "strike" are listed with their estimated class-conditional probabilities.)

  15. Jet shapes in dijet events at the LHC in SCET

    NASA Astrophysics Data System (ADS)

    Hornig, Andrew; Makris, Yiannis; Mehen, Thomas

    2016-04-01

    We consider the class of jet shapes known as angularities in dijet production at hadron colliders. These angularities are modified from the original definitions in e+e− collisions to be boost invariant along the beam axis. These shapes apply to the constituents of jets defined with respect to either k_T-type (anti-k_T, C/A, and k_T) algorithms or cone-type algorithms. We present an SCET factorization formula and calculate the ingredients needed to achieve next-to-leading-log (NLL) accuracy in kinematic regions where non-global logarithms are not large. The factorization formula involves previously unstudied "unmeasured beam functions," which are present for finite rapidity cuts around the beams. We derive relations between the jet functions and the shape-dependent part of the soft function that appear in the factorized cross section and those previously calculated for e+e− collisions, and present the calculation of the non-trivial, color-connected part of the soft function to O(α_s). This latter part of the soft function is universal in the sense that it applies to any experimental setup with an out-of-jet p_T veto and rapidity cuts together with two identified jets, and it is independent of the choice of jet (sub-)structure measurement. In addition, we implement the recently introduced soft-collinear refactorization to resum logarithms of the jet size, valid in the region of non-enhanced non-global logarithm effects. While our results are valid for all 2 → 2 channels, we compute explicitly for the qq' → qq' channel the color-flow matrices and plot the NLL resummed differential dijet cross section as an explicit example, which shows that the normalization and scale uncertainty is reduced when the soft function is refactorized. For this channel, we also plot the jet size R dependence, the p_T^cut dependence, and the dependence on the angularity parameter a.

  16. Jet shapes in dijet events at the LHC in SCET

    DOE PAGES

    Hornig, Andrew; Makris, Yiannis; Mehen, Thomas

    2016-04-15

    Here, we consider the class of jet shapes known as angularities in dijet production at hadron colliders. These angularities are modified from the original definitions in e+e− collisions to be boost invariant along the beam axis. These shapes apply to the constituents of jets defined with respect to either k_T-type (anti-k_T, C/A, and k_T) algorithms or cone-type algorithms. We present an SCET factorization formula and calculate the ingredients needed to achieve next-to-leading-log (NLL) accuracy in kinematic regions where non-global logarithms are not large. The factorization formula involves previously unstudied “unmeasured beam functions,” which are present for finite rapidity cuts around the beams. We derive relations between the jet functions and the shape-dependent part of the soft function that appear in the factorized cross section and those previously calculated for e+e− collisions, and present the calculation of the non-trivial, color-connected part of the soft function to O(α_s). This latter part of the soft function is universal in the sense that it applies to any experimental setup with an out-of-jet p_T veto and rapidity cuts together with two identified jets, and it is independent of the choice of jet (sub-)structure measurement. In addition, we implement the recently introduced soft-collinear refactorization to resum logarithms of the jet size, valid in the region of non-enhanced non-global logarithm effects. While our results are valid for all 2 → 2 channels, we compute explicitly for the qq' → qq' channel the color-flow matrices and plot the NLL resummed differential dijet cross section as an explicit example, which shows that the normalization and scale uncertainty is reduced when the soft function is refactorized. For this channel, we also plot the jet size R dependence, the p_T^cut dependence, and the dependence on the angularity parameter a.

  17. Flexible Early Warning Systems with Workflows and Decision Tables

    NASA Astrophysics Data System (ADS)

    Riedel, F.; Chaves, F.; Zeiner, H.

    2012-04-01

    An essential part of early warning systems and systems for crisis management are decision support systems that facilitate communication and collaboration. Often official policies specify how different organizations collaborate and what information is communicated to whom. For early warning systems it is crucial that information is exchanged dynamically in a timely manner and all participants get exactly the information they need to fulfil their role in the crisis management process. Information technology obviously lends itself to automate parts of the process. We have experienced however that in current operational systems the information logistics processes are hard-coded, even though they are subject to change. In addition, systems are tailored to the policies and requirements of a certain organization and changes can require major software refactoring. We seek to develop a system that can be deployed and adapted to multiple organizations with different dynamic runtime policies. A major requirement for such a system is that changes can be applied locally without affecting larger parts of the system. In addition to the flexibility regarding changes in policies and processes, the system needs to be able to evolve; when new information sources become available, it should be possible to integrate and use these in the decision process. In general, this kind of flexibility comes with a significant increase in complexity. This implies that only IT professionals can maintain a system that can be reconfigured and adapted; end-users are unable to utilise the provided flexibility. In the business world similar problems arise and previous work suggested using business process management systems (BPMS) or workflow management systems (WfMS) to guide and automate early warning processes or crisis management plans. However, the usability and flexibility of current WfMS are limited, because current notations and user interfaces are still not suitable for end-users, and workflows are usually only suited for rigid processes. We show how improvements can be achieved by using decision tables and rule-based adaptive workflows. Decision tables have been shown to be an intuitive tool that can be used by domain experts to express rule sets that can be interpreted automatically at runtime. Adaptive workflows use a rule-based approach to increase the flexibility of workflows by providing mechanisms to adapt workflows based on context changes, human intervention and availability of services. The combination of workflows, decision tables and rule-based adaption creates a framework that opens up new possibilities for flexible and adaptable workflows, especially, for use in early warning and crisis management systems.
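
    To make the decision-table idea concrete, the short sketch below evaluates a small rule set at runtime; the condition columns (water level and trend) and the resulting actions are invented for illustration and are not taken from any operational policy.

        # Illustrative decision table: condition columns mapped to an action (invented rules).
        DECISION_TABLE = [
            {"min_level": 7.0, "rising": True,  "action": "issue evacuation warning"},
            {"min_level": 7.0, "rising": False, "action": "notify crisis staff"},
            {"min_level": 5.0, "rising": True,  "action": "alert on-call hydrologist"},
            {"min_level": 0.0, "rising": None,  "action": "routine monitoring"},  # wildcard trend
        ]

        def decide(level, rising):
            """Return the action of the first matching row (rows ordered by specificity)."""
            for row in DECISION_TABLE:
                if level >= row["min_level"] and row["rising"] in (None, rising):
                    return row["action"]
            return "no action"

        print(decide(7.4, True))    # -> issue evacuation warning
        print(decide(5.6, False))   # -> routine monitoring

    Because the table is data rather than code, a domain expert could edit the rows without touching the evaluation logic, which is the property argued for above.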

  18. GlobiPack v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe

    2010-03-31

    GlobiPack contains a small collection of optimization globalization algorithms. These algorithms are used by optimization and various nonlinear equation solver algorithms, serving as the line-search procedure for Newton and quasi-Newton optimization and nonlinear equation solver methods. They are standard, published 1-D line search algorithms such as those described in Nocedal and Wright, Numerical Optimization, 2nd edition, 2006. One set of algorithms was copied and refactored from the existing open-source Trilinos package MOOCHO, where the line search code is used to globalize SQP methods. This software is generic to any mathematical optimization problem where smooth derivatives exist. There is no specific connection or mention whatsoever of any specific application, period. You cannot find more general mathematical software.
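
    As a reminder of what such a globalization procedure does, here is a minimal backtracking line search enforcing the Armijo sufficient-decrease condition, a textbook scheme of the kind cited above rather than GlobiPack's own code:

        # Textbook backtracking line search with the Armijo condition (illustrative).
        import numpy as np

        def backtracking_line_search(f, grad, x, p, alpha0=1.0, c=1e-4, rho=0.5, max_iter=50):
            """Shrink alpha until f(x + alpha*p) <= f(x) + c*alpha*grad(x).p."""
            fx = f(x)
            slope = np.dot(grad(x), p)   # directional derivative; p should be a descent direction
            alpha = alpha0
            for _ in range(max_iter):
                if f(x + alpha * p) <= fx + c * alpha * slope:
                    break
                alpha *= rho
            return alpha

        # Example: a steepest-descent step on the Rosenbrock function.
        f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
        grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                                   200 * (x[1] - x[0]**2)])
        x0 = np.array([-1.2, 1.0])
        print(backtracking_line_search(f, grad, x0, -grad(x0)))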

  19. Using Coarrays to Parallelize Legacy Fortran Applications: Strategy and Case Study

    DOE PAGES

    Radhakrishnan, Hari; Rouson, Damian W. I.; Morris, Karla; ...

    2015-01-01

    This paper summarizes a strategy for parallelizing a legacy Fortran 77 program using the object-oriented (OO) and coarray features that entered Fortran in the 2003 and 2008 standards, respectively. OO programming (OOP) facilitates the construction of an extensible suite of model-verification and performance tests that drive the development. Coarray parallel programming facilitates a rapid evolution from a serial application to a parallel application capable of running on multicore processors and many-core accelerators in shared and distributed memory. We delineate 17 code modernization steps used to refactor and parallelize the program and study the resulting performance. Our initial studies were done using the Intel Fortran compiler on a 32-core shared memory server. Scaling behavior was very poor, and profile analysis using TAU showed that the bottleneck in the performance was due to our implementation of a collective, sequential summation procedure. We were able to improve the scalability and achieve nearly linear speedup by replacing the sequential summation with a parallel, binary tree algorithm. We also tested the Cray compiler, which provides its own collective summation procedure. Intel provides no collective reductions. With Cray, the program shows linear speedup even in distributed-memory execution. We anticipate similar results with other compilers once they support the new collective procedures proposed for Fortran 2015.
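    To illustrate the kind of change described (not the authors' Fortran code), the following Python sketch contrasts a sequential reduction, whose chain of dependent additions is O(n) deep, with a binary-tree reduction whose dependency depth grows only as O(log n), which is what allows the partial sums to be formed concurrently.

      def sequential_sum(values):
          """O(n)-deep chain of additions: each step depends on the previous one."""
          total = 0.0
          for v in values:
              total += v
          return total

      def tree_sum(values):
          """Pairwise (binary-tree) reduction: only O(log n) dependent levels,
          so the pairs at each level could be summed in parallel."""
          vals = list(values)
          while len(vals) > 1:
              if len(vals) % 2:          # carry the odd element to the next level
                  vals.append(0.0)
              vals = [vals[i] + vals[i + 1] for i in range(0, len(vals), 2)]
          return vals[0]

      data = [float(i) for i in range(1, 1001)]
      assert abs(sequential_sum(data) - tree_sum(data)) < 1e-9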

  20. Offline detection of broken rotor bars in AC induction motors

    NASA Astrophysics Data System (ADS)

    Powers, Craig Stephen

    The detection of the broken rotor bar defect in medium- and large-sized AC induction machines is currently one of the most difficult tasks for the motor condition and monitoring industry. If a broken rotor bar defect goes undetected, it can cause a catastrophic failure of an expensive machine. If a broken rotor bar defect is falsely diagnosed, it wastes time and money to physically tear down and inspect the machine only to find an incorrect diagnosis. Previous work in 2009 at Baker/SKF-USA, in collaboration with Korea University, developed a prototype instrument that has been highly successful in correctly detecting the broken rotor bar defect in ACIMs where other methods have failed. Dr. Sang Bin and his students at Korea University have been using this prototype instrument to help the industry save money through successful detection of the BRB defect. A review of the current state of motor condition and monitoring technology for detecting the broken rotor bar defect in ACIMs shows that improved detection of this fault is still relevant. An analysis of previous work in the creation of this prototype instrument leads into the refactoring of the software and hardware into something more deployable, cost-effective and commercially viable.

  1. PRIMA-X Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorenz, Daniel; Wolf, Felix

    2016-02-17

    The PRIMA-X (Performance Retargeting of Instrumentation, Measurement, and Analysis Technologies for Exascale Computing) project is the successor of the DOE PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing) project, which addressed the challenge of creating a core measurement infrastructure that would serve as a common platform for both integrating leading parallel performance systems (notably TAU and Scalasca) and developing next-generation scalable performance tools. The PRIMA-X project shifts the focus away from refactorization of robust performance tools towards a re-targeting of the parallel performance measurement and analysis architecture for extreme scales. The massive concurrency, asynchronous execution dynamics, hardware heterogeneity, and multi-objective prerequisites (performance, power, resilience) that identify exascale systems introduce fundamental constraints on the ability to carry forward existing performance methodologies. In particular, there must be a deemphasis of per-thread observation techniques to significantly reduce the otherwise unsustainable flood of redundant performance data. Instead, it will be necessary to assimilate multi-level resource observations into macroscopic performance views, from which resilient performance metrics can be attributed to the computational features of the application. This requires a scalable framework for node-level and system-wide monitoring and runtime analyses of dynamic performance information. Also, the interest in optimizing parallelism parameters with respect to performance and energy drives the integration of tool capabilities in the exascale environment further. Initially, PRIMA-X was a collaborative project between the University of Oregon (lead institution) and the German Research School for Simulation Sciences (GRS). Because Prof. Wolf, the PI at GRS, accepted a position as full professor at Technische Universität Darmstadt (TU Darmstadt) starting February 1st, 2015, the project ended at GRS on January 31st, 2015. This report reflects the work accomplished at GRS until then. The work of GRS is expected to be continued at TU Darmstadt. The first main accomplishment of GRS is the design of different thread-level aggregation techniques. We created a prototype capable of aggregating the thread-level information in performance profiles using these techniques. The next step will be the integration of the most promising techniques into the Score-P measurement system and their evaluation. The second main accomplishment is a substantial increase of Score-P’s scalability, achieved by improving the design of the system-tree representation in Score-P’s profile format. We developed a new representation and a distributed algorithm to create the scalable system tree representation. Finally, we developed a lightweight approach to MPI wait-state profiling. Former algorithms either needed piggy-backing, which can cause significant runtime overhead, or tracing, which comes with its own set of scaling challenges. Our approach works with local data only and, thus, is scalable and has very little overhead.
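    As a toy illustration of thread-level aggregation of the sort discussed (not the Score-P implementation), per-thread metric values can be collapsed into a few summary statistics so that the stored data volume no longer grows with the thread count; the metric and region names here are hypothetical.

      # Collapse a per-thread profile metric into min/max/mean summaries so that the
      # stored volume is independent of the number of threads (illustrative only).

      def aggregate_threads(per_thread_values):
          n = len(per_thread_values)
          return {
              "min": min(per_thread_values),
              "max": max(per_thread_values),
              "mean": sum(per_thread_values) / n,
              "threads": n,
          }

      time_in_region = [0.91, 0.95, 1.32, 0.97]   # seconds spent in one code region, per thread
      print(aggregate_threads(time_in_region))
      # {'min': 0.91, 'max': 1.32, 'mean': 1.0375, 'threads': 4}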

  2. MDSplus quality improvement project

    DOE PAGES

    Fredian, Thomas W.; Stillerman, Joshua; Manduchi, Gabriele; ...

    2016-05-31

    MDSplus is a data acquisition and analysis system used worldwide predominantly in the fusion research community. Development began 29 years ago on the OpenVMS operating system. Since that time there have been many new features added and the code has been ported to many different operating systems. There have been contributions to the MDSplus development from the fusion community in the way of feature suggestions, feature implementations, documentation and porting to different operating systems. The bulk of the development and support of MDSplus, however, has been provided by a relatively small core developer group of three or four members. Given the size of the development team and the large number of users, much more effort was focused on providing new features for the community than on keeping the underlying code and documentation up to date with the evolving software development standards. To ensure that MDSplus will continue to meet the needs of the community in the future, the MDSplus development team along with other members of the MDSplus user community has embarked on a major quality improvement project. The planned improvements include changes to software build scripts to better use GNU Autoconf and Automake tools, refactoring many of the source code modules using new language features available in modern compilers, using GNU MinGW-w64 to create MS Windows distributions, migrating to a more modern source code management system, improvement of source documentation as well as improvements to the www.mdsplus.org web site documentation and layout, and the addition of more comprehensive test suites to apply to MDSplus code builds prior to releasing installation kits to the community. This effort should lead to a much more robust product and establish a framework to maintain stability as more enhancements and features are added. Finally, this paper describes these efforts that are either in progress or planned for the near future.

  3. Smashing the Stovepipe: Leveraging the GMSEC Open Architecture and Advanced IT Automation to Rapidly Prototype, Develop and Deploy Next-Generation Multi-Mission Ground Systems

    NASA Technical Reports Server (NTRS)

    Swenson, Paul

    2017-01-01

    Satellite/payload ground systems are typically highly customized to a specific mission's use cases and utilize hundreds (or thousands) of specialized point-to-point interfaces for data flows and file transfers. Documentation and tracking of these complex interfaces requires extensive time and extremely high staffing costs; implementation and testing of the interfaces are even more cost-prohibitive, and documentation often lags behind implementation, resulting in inconsistencies down the road. With expanding threat vectors, IT security, information assurance and operational security have become key ground system architecture drivers. New federal security-related directives are generated on a daily basis, imposing new requirements on current and existing ground systems; these mandated activities and data calls typically carry little or no additional funding for implementation. As a result, ground system sustaining engineering groups and information technology staff continually struggle to keep up with the rolling tide of security. Advancing security concerns and shrinking budgets are pushing these large stove-piped ground systems to begin sharing resources, i.e. operational/sysadmin staff, IT security baselines, architecture decisions or even networks and hosting infrastructure. Refactoring these existing ground systems into multi-mission assets proves extremely challenging due to what is typically very tight coupling between legacy components; as a result, many "multi-mission" operations environments end up simply sharing compute resources and networks due to the difficulty of refactoring into true multi-mission systems. Utilizing continuous integration and rapid system deployment technologies in conjunction with an open-architecture messaging approach allows system engineers and architects to worry less about the low-level details of interfaces between components and configuration of systems. GMSEC messaging is inherently designed to support multi-mission requirements and allows components to aggregate data across multiple homogeneous or heterogeneous satellites or payloads; the highly successful Goddard Science and Planetary Operations Control Center (SPOCC) utilizes GMSEC as the hub for its automation and situational awareness capability. This shifts the focus towards getting the ground system to a final configuration-managed baseline, as well as towards multi-mission, big-picture capabilities that help increase situational awareness, promote cross-mission sharing and establish enhanced fleet management capabilities across all levels of the enterprise.
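    The contrast between point-to-point interfaces and an open messaging architecture can be sketched with a minimal publish/subscribe bus. This is a generic Python illustration, not the GMSEC API; the topic names are hypothetical.

      # Minimal publish/subscribe bus: components depend on message topics rather
      # than on each other's point-to-point interfaces (generic sketch, not GMSEC).

      from collections import defaultdict

      class MessageBus:
          def __init__(self):
              self._subscribers = defaultdict(list)

          def subscribe(self, topic, callback):
              self._subscribers[topic].append(callback)

          def publish(self, topic, message):
              for callback in self._subscribers[topic]:
                  callback(message)

      bus = MessageBus()
      bus.subscribe("telemetry.housekeeping", lambda msg: print("archiver got", msg))
      bus.subscribe("telemetry.housekeeping", lambda msg: print("display got", msg))
      bus.publish("telemetry.housekeeping", {"battery_v": 28.1})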

  4. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    NASA Astrophysics Data System (ADS)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.
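    As a small illustration of one of the listed processing steps, removal of means and linear trends from a waveform, here is a generic NumPy sketch; it is not SeismicCanvas's C++ implementation, and the synthetic trace is invented for the example.

      import numpy as np

      def remove_mean_and_trend(trace):
          """Subtract the mean and a best-fit straight line from a 1-D waveform."""
          trace = np.asarray(trace, dtype=float)
          t = np.arange(trace.size)
          detrended = trace - np.polyval(np.polyfit(t, trace, 1), t)  # remove linear trend
          return detrended - detrended.mean()                         # then remove the mean

      # Synthetic trace: a sine wave riding on an offset plus a slow drift.
      t = np.linspace(0, 10, 1000)
      trace = 2.0 + 0.3 * t + np.sin(2 * np.pi * 1.5 * t)
      clean = remove_mean_and_trend(trace)
      print(abs(clean.mean()) < 1e-9)  # True: mean and trend removed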

  5. Bumper 3 Update for IADC Protection Manual

    NASA Technical Reports Server (NTRS)

    Christiansen, Eric L.; Nagy, Kornel; Hyde, Jim

    2016-01-01

    The Bumper code has been the standard in use by NASA and contractors to perform meteoroid/debris risk assessments since 1990. It has undergone extensive revisions and updates [NASA JSC HITF website; Christiansen et al., 1992, 1997]. NASA Johnson Space Center (JSC) has applied Bumper to risk assessments for Space Station, Shuttle, Mir, Extravehicular Mobility Unit (EMU) space suits, and other spacecraft (e.g., LDEF, Iridium, TDRS, and Hubble Space Telescope). Bumper continues to be updated with changes in the ballistic limit equations describing failure thresholds of various spacecraft components, as well as changes in the meteoroid and debris environment models. Significant efforts are expended to validate Bumper and benchmark it against other meteoroid/debris risk assessment codes. Bumper 3 is a refactored version of Bumper II. The structure of the code was extensively modified to improve maintenance, performance and flexibility. The architecture was changed to separate the frequently updated ballistic limit equations from the relatively stable common core functions of the program. These updates allow NASA to produce specific editions of Bumper 3 that are tailored for specific customer requirements. The core consists of common code necessary to process the Micrometeoroid and Orbital Debris (MMOD) environment models, assess shadowing and calculate MMOD risk. The library of target response subroutines includes a broad range of different types of MMOD shield ballistic limit equations as well as equations describing damage to various spacecraft subsystems or hardware (thermal protection materials, windows, radiators, solar arrays, cables, etc.). The core and library of ballistic response subroutines are maintained under configuration control. A change in the core will affect all editions of the code, whereas a change in one or more of the response subroutines will affect only the editions of the code that contain the particular response subroutines which are modified. Note that the Bumper II program is no longer maintained or distributed by NASA.
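    The core-plus-response-library structure described above can be caricatured with a small registry pattern. This is purely illustrative Python; the function names and the trivial damage criterion are hypothetical and are not Bumper 3's actual equations.

      # Separating a stable core from a swappable library of response functions,
      # in the spirit of the architecture described above (hypothetical example).

      BALLISTIC_LIMIT_EQUATIONS = {}

      def register(name):
          def wrapper(func):
              BALLISTIC_LIMIT_EQUATIONS[name] = func
              return func
          return wrapper

      @register("simple_plate")
      def simple_plate(diameter_cm, velocity_km_s):
          # Toy failure criterion: NOT a real ballistic limit equation.
          return diameter_cm * velocity_km_s > 1.0

      def assess(shield_name, impacts):
          """Core code: loops over impacts and defers the failure test to the library."""
          limit = BALLISTIC_LIMIT_EQUATIONS[shield_name]
          return sum(1 for d, v in impacts if limit(d, v))

      print(assess("simple_plate", [(0.05, 7.0), (0.3, 10.0)]))  # -> 1 predicted failure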

  6. Multiphysics Application Coupling Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Michael T.

    2013-12-02

    This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open source software, in the hope of improving their own and others' access to infrastructure which can be readily utilized in developing coupled or composite software systems, with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation, the Application Component Toolkit (ACT), and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which is (will be) the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: 1. The Component Object Manager (COM): The COM package provides encapsulation of user applications and their data. COM also provides the inter-component function call mechanism. 2. The System Integration Manager (SIM): The SIM package provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.

  7. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felberg, Lisa E.; Brookes, David H.; Yap, Eng-Hui

    2016-11-02

    We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized Poisson-Boltzmann equation. The PB-AM software package includes the generation of output files appropriate for visualization using VMD, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and offers two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators and students who are more familiar with the APBS framework.
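    For reference, the linearized Poisson-Boltzmann equation solved analytically by such packages can be written, in one common convention (Gaussian units; sign and scaling conventions vary between codes), as:

      % phi: electrostatic potential, epsilon: position-dependent dielectric,
      % kappa-bar^2: modified Debye-Hueckel screening parameter (zero inside the solute),
      % rho: fixed (solute) charge density.
      \nabla \cdot \left[ \epsilon(\mathbf{r}) \, \nabla \phi(\mathbf{r}) \right]
        - \bar{\kappa}^{2}(\mathbf{r}) \, \phi(\mathbf{r})
        = -4\pi \rho(\mathbf{r})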

  8. A fast image registration approach of neural activities in light-sheet fluorescence microscopy images

    NASA Astrophysics Data System (ADS)

    Meng, Hui; Hui, Hui; Hu, Chaoen; Yang, Xin; Tian, Jie

    2017-03-01

    The ability to perform fast, single-neuron-resolution imaging of neural activities makes light-sheet fluorescence microscopy (LSFM) a powerful imaging technique for functional neural connection applications. State-of-the-art LSFM imaging systems can record the neuronal activities of the entire brain of small animals, such as zebrafish or C. elegans, at single-neuron resolution. However, stimulated and spontaneous movements of the animal brain result in inconsistent neuron positions during the recording process, and registering the acquired large-scale images with conventional methods is time consuming. In this work, we address the problem of fast registration of neural positions in stacks of LSFM images. This is necessary to register brain structures and activities. To achieve fast registration of neural activities, we present a rigid registration architecture implemented on a Graphics Processing Unit (GPU). In this approach, the image stacks were preprocessed on the GPU by mean stretching to reduce the computational effort. The present image was registered to the previous image stack, which was considered as the reference. A fast Fourier transform (FFT) algorithm was used to calculate the shift of the image stack. The calculations for image registration were performed in different threads, while the preparation functionality was refactored and called only once by the master thread. We implemented our registration algorithm on an NVIDIA Quadro K4200 GPU under the Compute Unified Device Architecture (CUDA) programming environment. The experimental results showed that the registration computation can be accelerated to 550 ms for a full high-resolution brain image. Our approach also has potential to be used for other dynamic image registrations in biomedical applications.
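    A minimal CPU-side sketch of the FFT-based shift-estimation idea (phase correlation with NumPy) is shown below; the GPU/CUDA implementation and the parameters used in the paper are not reproduced here, and the test images are synthetic.

      import numpy as np

      def fft_shift_estimate(reference, moving):
          """Estimate the integer translation that maps reference onto moving."""
          F_ref = np.fft.fft2(reference)
          F_mov = np.fft.fft2(moving)
          cross_power = F_mov * np.conj(F_ref)
          cross_power /= np.abs(cross_power) + 1e-12        # keep phase information only
          correlation = np.fft.ifft2(cross_power).real
          peak = np.unravel_index(np.argmax(correlation), correlation.shape)
          # Map peak indices to signed shifts.
          return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, correlation.shape))

      ref = np.zeros((64, 64)); ref[20:30, 20:30] = 1.0
      mov = np.roll(ref, shift=(5, -3), axis=(0, 1))        # known displacement
      print(fft_shift_estimate(ref, mov))                   # -> (5, -3)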

  9. Refactoring the Genetic Code for Increased Evolvability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pines, Gur; Winkler, James D.; Pines, Assaf

    ABSTRACT The standard genetic code is robust to mutations during transcription and translation. Point mutations are likely to be synonymous or to preserve the chemical properties of the original amino acid. Saturation mutagenesis experiments suggest that in some cases the best-performing mutant requires replacement of more than a single nucleotide within a codon. These replacements are essentially inaccessible to common error-based laboratory engineering techniques that alter a single nucleotide per mutation event, due to the extreme rarity of adjacent mutations. In this theoretical study, we suggest a radical reordering of the genetic code that maximizes the mutagenic potential of single nucleotide replacements. We explore several possible genetic codes that allow a greater degree of accessibility to the mutational landscape and may result in a hyperevolvable organism that could serve as an ideal platform for directed evolution experiments. We then conclude by evaluating the challenges of constructing such recoded organisms and their potential applications within the field of synthetic biology. IMPORTANCE The conservative nature of the genetic code prevents bioengineers from efficiently accessing the full mutational landscape of a gene via common error-prone methods. Here, we present two computational approaches to generate alternative genetic codes with increased accessibility. These new codes allow mutational transitions to a larger pool of amino acids and with a greater extent of chemical differences, based on a single nucleotide replacement within the codon, thus increasing evolvability both at the single-gene and at the genome levels. Given the widespread use of these techniques for strain and protein improvement, along with more fundamental evolutionary biology questions, the use of recoded organisms that maximize evolvability should significantly improve the efficiency of directed evolution, library generation, and fitness maximization.
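    The notion of mutational accessibility under single-nucleotide replacements can be made concrete with a short sketch that enumerates, for a given codon, the amino acids reachable by changing exactly one base. It assumes Biopython's standard codon table is available and is an illustration only, not the authors' optimization code.

      # Enumerate the amino acids reachable from a codon by exactly one nucleotide
      # change, using the standard genetic code (requires Biopython).

      from Bio.Data import CodonTable

      standard = CodonTable.unambiguous_dna_by_name["Standard"]

      def single_substitution_neighbors(codon):
          reachable = set()
          for position in range(3):
              for base in "ACGT":
                  if base == codon[position]:
                      continue
                  mutant = codon[:position] + base + codon[position + 1:]
                  if mutant in standard.stop_codons:
                      reachable.add("*")                      # stop codon
                  else:
                      reachable.add(standard.forward_table[mutant])
          return reachable

      print(single_substitution_neighbors("TGG"))  # tryptophan codon; 9 single-base mutants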

  10. Refactoring the Genetic Code for Increased Evolvability

    DOE PAGES

    Pines, Gur; Winkler, James D.; Pines, Assaf; ...

    2017-11-14

    ABSTRACT The standard genetic code is robust to mutations during transcription and translation. Point mutations are likely to be synonymous or to preserve the chemical properties of the original amino acid. Saturation mutagenesis experiments suggest that in some cases the best-performing mutant requires replacement of more than a single nucleotide within a codon. These replacements are essentially inaccessible to common error-based laboratory engineering techniques that alter a single nucleotide per mutation event, due to the extreme rarity of adjacent mutations. In this theoretical study, we suggest a radical reordering of the genetic code that maximizes the mutagenic potential of single nucleotide replacements. We explore several possible genetic codes that allow a greater degree of accessibility to the mutational landscape and may result in a hyperevolvable organism that could serve as an ideal platform for directed evolution experiments. We then conclude by evaluating the challenges of constructing such recoded organisms and their potential applications within the field of synthetic biology. IMPORTANCE The conservative nature of the genetic code prevents bioengineers from efficiently accessing the full mutational landscape of a gene via common error-prone methods. Here, we present two computational approaches to generate alternative genetic codes with increased accessibility. These new codes allow mutational transitions to a larger pool of amino acids and with a greater extent of chemical differences, based on a single nucleotide replacement within the codon, thus increasing evolvability both at the single-gene and at the genome levels. Given the widespread use of these techniques for strain and protein improvement, along with more fundamental evolutionary biology questions, the use of recoded organisms that maximize evolvability should significantly improve the efficiency of directed evolution, library generation, and fitness maximization.

  11. Retinal Connectomics: Towards Complete, Accurate Networks

    PubMed Central

    Marc, Robert E.; Jones, Bryan W.; Watt, Carl B.; Anderson, James R.; Sigulinsky, Crystal; Lauritzen, Scott

    2013-01-01

    Connectomics is a strategy for mapping complex neural networks based on high-speed automated electron optical imaging, computational assembly of neural data volumes, web-based navigational tools to explore 10^12–10^15 byte (terabyte to petabyte) image volumes, and annotation and markup tools to convert images into rich networks with cellular metadata. These collections of network data and associated metadata, analyzed using tools from graph theory and classification theory, can be merged with classical systems theory, giving a more completely parameterized view of how biologic information processing systems are implemented in retina and brain. Networks have two separable features: topology and connection attributes. The first findings from connectomics strongly validate the idea that the topologies of complete retinal networks are far more complex than the simple schematics that emerged from classical anatomy. In particular, connectomics has permitted an aggressive refactoring of the retinal inner plexiform layer, demonstrating that network function cannot be simply inferred from stratification; exposing the complex geometric rules for inserting different cells into a shared network; revealing unexpected bidirectional signaling pathways between mammalian rod and cone systems; documenting selective feedforward systems, novel candidate signaling architectures, new coupling motifs, and the highly complex architecture of the mammalian AII amacrine cell. This is but the beginning, as the underlying principles of connectomics are readily transferrable to non-neural cell complexes and provide new contexts for assessing intercellular communication. PMID:24016532

  12. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver.

    PubMed

    Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa

    2017-06-05

    We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation, for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using visual molecular dynamics, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and offers two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators, and students that are more familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.

  13. Toward Millions of File System IOPS on Low-Cost, Commodity Hardware

    PubMed Central

    Zheng, Da; Burns, Randal; Szalay, Alexander S.

    2013-01-01

    We describe a storage system that removes I/O bottlenecks to achieve more than one million IOPS based on a user-space file abstraction for arrays of commodity SSDs. The file abstraction refactors I/O scheduling and placement for extreme parallelism and non-uniform memory and I/O. The system includes a set-associative, parallel page cache in the user space. We redesign page caching to eliminate CPU overhead and lock-contention in non-uniform memory architecture machines. We evaluate our design on a 32 core NUMA machine with four, eight-core processors. Experiments show that our design delivers 1.23 million 512-byte read IOPS. The page cache realizes the scalable IOPS of Linux asynchronous I/O (AIO) and increases user-perceived I/O performance linearly with cache hit rates. The parallel, set-associative cache matches the cache hit rates of the global Linux page cache under real workloads. PMID:24402052
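    Below is a toy model of a set-associative cache with per-set LRU replacement, to illustrate the data structure referred to above; it is a generic Python sketch, not the paper's lock-free user-space implementation, and the parameters are arbitrary.

      from collections import OrderedDict

      class SetAssociativeCache:
          """Pages hash to one of num_sets small sets; each set holds up to `ways`
          entries and evicts its least recently used page (illustrative only)."""

          def __init__(self, num_sets=4, ways=2):
              self.ways = ways
              self.sets = [OrderedDict() for _ in range(num_sets)]

          def access(self, page_id):
              cache_set = self.sets[hash(page_id) % len(self.sets)]
              if page_id in cache_set:                 # hit: mark as most recently used
                  cache_set.move_to_end(page_id)
                  return True
              if len(cache_set) >= self.ways:          # miss: evict LRU entry in this set
                  cache_set.popitem(last=False)
              cache_set[page_id] = None                # load the page
              return False

      cache = SetAssociativeCache()
      hits = sum(cache.access(p) for p in [1, 2, 1, 3, 1, 2])
      print(hits)  # number of cache hits in the access stream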

  14. Toward Millions of File System IOPS on Low-Cost, Commodity Hardware.

    PubMed

    Zheng, Da; Burns, Randal; Szalay, Alexander S

    2013-01-01

    We describe a storage system that removes I/O bottlenecks to achieve more than one million IOPS based on a user-space file abstraction for arrays of commodity SSDs. The file abstraction refactors I/O scheduling and placement for extreme parallelism and non-uniform memory and I/O. The system includes a set-associative, parallel page cache in the user space. We redesign page caching to eliminate CPU overhead and lock-contention in non-uniform memory architecture machines. We evaluate our design on a 32 core NUMA machine with four, eight-core processors. Experiments show that our design delivers 1.23 million 512-byte read IOPS. The page cache realizes the scalable IOPS of Linux asynchronous I/O (AIO) and increases user-perceived I/O performance linearly with cache hit rates. The parallel, set-associative cache matches the cache hit rates of the global Linux page cache under real workloads.

  15. Development of a Cloud Resolving Model for Heterogeneous Supercomputers

    NASA Astrophysics Data System (ADS)

    Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.

    2017-12-01

    A cloud resolving climate model is needed to reduce major systematic errors in climate simulations due to structural uncertainty in numerical treatments of convection - such as convective storm systems. This research describes the effort to port the SAM (System for Atmospheric Modeling) cloud resolving model to heterogeneous supercomputers using GPUs (Graphical Processing Units). We have isolated a standalone configuration of SAM that is targeted to be integrated into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization, including loop fusion/fission, coalesced data access and loop refactoring to a higher abstraction level. We will present early performance results, lessons learned as well as optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes wherein each node has 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, the ACME-MMF component of the U.S. Department of Energy (DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with cloud resolving "superparameterization" within each grid cell of the global climate model. Superparameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME, and to explore its full potential to scientifically and computationally advance climate simulation and prediction.
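    Loop fusion, one of the optimization strategies mentioned, can be illustrated with a small Python sketch: two separate sweeps over the same array are combined into one, reducing memory traffic and, on a GPU, the number of kernel launches. This is purely illustrative; the actual SAM kernels are Fortran with OpenACC directives.

      # Before fusion: two passes over the data (two kernels, intermediate array stored).
      def unfused(values):
          scaled = [v * 0.5 for v in values]          # pass 1
          return [s + 1.0 for s in scaled]            # pass 2

      # After fusion: a single pass computes the same result.
      def fused(values):
          return [v * 0.5 + 1.0 for v in values]      # one pass, no intermediate array

      data = [1.0, 2.0, 3.0]
      assert unfused(data) == fused(data)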

  16. The Virtual Environment for Reactor Applications (VERA): Design and architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, John A., E-mail: turnerja@ornl.gov; Clarno, Kevin; Sieger, Matt

    VERA, the Virtual Environment for Reactor Applications, is the system of physics capabilities being developed and deployed by the Consortium for Advanced Simulation of Light Water Reactors (CASL). CASL was established for the modeling and simulation of commercial nuclear reactors. VERA consists of integrating and interfacing software together with a suite of physics components adapted and/or refactored to simulate relevant physical phenomena in a coupled manner. VERA also includes the software development environment and computational infrastructure needed for these components to be effectively used. We describe the architecture of VERA from both software and numerical perspectives, along with the goals and constraints that drove major design decisions, and their implications. We explain why VERA is an environment rather than a framework or toolkit, why these distinctions are relevant (particularly for coupled physics applications), and provide an overview of results that demonstrate the use of VERA tools for a variety of challenging applications within the nuclear industry.

  17. Elucidating steroid alkaloid biosynthesis in Veratrum californicum: production of verazine in Sf9 cells

    PubMed Central

    Augustin, Megan M.; Ruzicka, Dan R.; Shukla, Ashutosh K.; Augustin, Jörg M.; Starks, Courtney M.; O’Neil-Johnson, Mark; McKain, Michael R.; Evans, Bradley S.; Barrett, Matt D.; Smithson, Ann; Wong, Gane Ka-Shu; Deyholos, Michael K.; Edger, Patrick P.; Pires, J. Chris; Leebens-Mack, James H.; Mann, David A.; Kutchan, Toni M.

    2015-01-01

    Summary Steroid alkaloids have been shown to elicit a wide range of pharmacological effects that include anticancer and antifungal activities. Understanding the biosynthesis of these molecules is essential to bioengineering for sustainable production. Herein, we investigate the biosynthetic pathway to cyclopamine, a steroid alkaloid that shows promising antineoplastic activities. Supply of cyclopamine is limited, as the current source is solely derived from wild collection of the plant Veratrum californicum. To elucidate the early stages of the pathway to cyclopamine, we interrogated a V. californicum RNA-seq dataset using the cyclopamine accumulation profile as a predefined model for gene expression with the pattern-matching algorithm Haystack. Refactoring candidate genes in Sf9 insect cells led to discovery of four enzymes that catalyze the first six steps in steroid alkaloid biosynthesis to produce verazine, a predicted precursor to cyclopamine. Three of the enzymes are cytochromes P450 while the fourth is a γ-aminobutyrate transaminase; together they produce verazine from cholesterol. PMID:25939370

  18. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  19. GPU accelerated particle visualization with Splotch

    NASA Astrophysics Data System (ADS)

    Rivi, M.; Gheller, C.; Dykes, T.; Krokos, M.; Dolag, K.

    2014-07-01

    Splotch is a rendering algorithm for exploration and visual discovery in particle-based datasets coming from astronomical observations or numerical simulations. The strengths of the approach are production of high quality imagery and support for very large-scale datasets through an effective mix of the OpenMP and MPI parallel programming paradigms. This article reports our experiences in re-designing Splotch to exploit emerging HPC architectures, which are increasingly populated with GPUs. A performance model is introduced to guide our re-factoring of Splotch. A number of parallelization issues are discussed, in particular relating to race conditions and workload balancing, towards achieving optimal performance. Our implementation was accomplished by using the CUDA programming paradigm. Our strategy is founded on novel schemes achieving optimized data organization and classification of particles. We deploy a reference cosmological simulation to present performance results on acceleration gains and scalability. We finally outline our vision for future work developments including possibilities for further optimizations and exploitation of hybrid systems and emerging accelerators.

  20. Indico — the Road to 2.0

    NASA Astrophysics Data System (ADS)

    Ferreira, P.; Avilés, A.; Dafflon, J.; Mönnich, A.; Trichopoulos, I.

    2015-12-01

    Indico has come a long way since it was first used to organize CHEP 2004. More than ten years of development have brought new features and projects, widening the application's feature set and enabling event organizers to work even more efficiently. While that has boosted the tool's usage and facilitated its adoption for a remarkable 300,000 events (at CERN alone), it has also generated a whole new range of challenges, which have been the target of the team's attention for the last 2 years. One of them is the scalability and maintainability of the current database solution (ZODB). After careful consideration, the decision was taken to move away from ZODB to PostgreSQL, a relational and widely-adopted solution that will permit the development of a more ambitious feature set as well as improved performance and scalability. A change of this type is by no means trivial in nature and requires the refactoring of most backend code as well as the full rewrite of significant portions of it. We are taking this opportunity to modernize Indico by employing standard web modules, technologies and concepts that not only make development and maintenance easier but also constitute an upgrade to Indico's stack. The first results have been visible since August 2014, with the full migration of the Room Booking module to the new paradigm. In this paper we explain what has been done so far in the context of this ambitious migration, the main findings and challenges, as well as the main technologies and concepts that will constitute the foundation of the resulting Indico 2.0.

  1. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.

  2. HMMER web server: 2018 update.

    PubMed

    Potter, Simon C; Luciani, Aurélien; Eddy, Sean R; Park, Youngmi; Lopez, Rodrigo; Finn, Robert D

    2018-06-14

    The HMMER webserver [http://www.ebi.ac.uk/Tools/hmmer] is a free-to-use service which provides fast searches against widely used sequence databases and profile hidden Markov model (HMM) libraries using the HMMER software suite (http://hmmer.org). The results of a sequence search may be summarized in a number of ways, allowing users to view and filter the significant hits by domain architecture or taxonomy. For large scale usage, we provide an application programming interface (API) which has been expanded in scope, such that all result presentations are available via both HTML and API. Furthermore, we have refactored our JavaScript visualization library to provide standalone components for different result representations. These consume the aforementioned API and can be integrated into third-party websites. The range of databases that can be searched against has been expanded, adding four sequence datasets (12 in total) and one profile HMM library (6 in total). To help users explore the biological context of their results, and to discover new data resources, search results are now supplemented with cross references to other EMBL-EBI databases.

  3. The Virtual Environment for Reactor Applications (VERA): Design and architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, John A.; Clarno, Kevin; Sieger, Matt

    VERA, the Virtual Environment for Reactor Applications, is the system of physics capabilities being developed and deployed by the Consortium for Advanced Simulation of Light Water Reactors (CASL), the first DOE Hub, which was established in July 2010 for the modeling and simulation of commercial nuclear reactors. VERA consists of integrating and interfacing software together with a suite of physics components adapted and/or refactored to simulate relevant physical phenomena in a coupled manner. VERA also includes the software development environment and computational infrastructure needed for these components to be effectively used. We describe the architecture of VERA from both a software and a numerical perspective, along with the goals and constraints that drove the major design decisions and their implications. As a result, we explain why VERA is an environment rather than a framework or toolkit, why these distinctions are relevant (particularly for coupled physics applications), and provide an overview of results that demonstrate the application of VERA tools for a variety of challenging problems within the nuclear industry.

  4. The Virtual Environment for Reactor Applications (VERA): Design and architecture

    DOE PAGES

    Turner, John A.; Clarno, Kevin; Sieger, Matt; ...

    2016-09-08

    VERA, the Virtual Environment for Reactor Applications, is the system of physics capabilities being developed and deployed by the Consortium for Advanced Simulation of Light Water Reactors (CASL), the first DOE Hub, which was established in July 2010 for the modeling and simulation of commercial nuclear reactors. VERA consists of integrating and interfacing software together with a suite of physics components adapted and/or refactored to simulate relevant physical phenomena in a coupled manner. VERA also includes the software development environment and computational infrastructure needed for these components to be effectively used. We describe the architecture of VERA from both a software and a numerical perspective, along with the goals and constraints that drove the major design decisions and their implications. As a result, we explain why VERA is an environment rather than a framework or toolkit, why these distinctions are relevant (particularly for coupled physics applications), and provide an overview of results that demonstrate the application of VERA tools for a variety of challenging problems within the nuclear industry.

  5. Synthetic Biology for Cell-Free Biosynthesis: Fundamentals of Designing Novel In Vitro Multi-Enzyme Reaction Networks.

    PubMed

    Morgado, Gaspar; Gerngross, Daniel; Roberts, Tania M; Panke, Sven

    Cell-free biosynthesis in the form of in vitro multi-enzyme reaction networks or enzyme cascade reactions emerges as a promising tool to carry out complex catalysis in one-step, one-vessel settings. It combines the advantages of well-established in vitro biocatalysis with the power of multi-step in vivo pathways. Such cascades have been successfully applied to the synthesis of fine and bulk chemicals, monomers and complex polymers of chemical importance, and energy molecules from renewable resources as well as electricity. The scale of these initial attempts remains small, suggesting that more robust control of such systems and more efficient optimization are currently major bottlenecks. To this end, the very nature of enzyme cascade reactions as multi-membered systems requires novel approaches for implementation and optimization, some of which can be obtained from in vivo disciplines (such as pathway refactoring and DNA assembly), and some of which can be built on the unique, cell-free properties of cascade reactions (such as easy analytical access to all system intermediates to facilitate modeling).

  6. Traversing the fungal terpenome

    PubMed Central

    Quin, Maureen B.; Flynn, Christopher M.; Schmidt-Dannert, Claudia

    2014-01-01

    Fungi (Ascomycota and Basidiomycota) are prolific producers of structurally diverse terpenoid compounds. Classes of terpenoids identified in fungi include the sesqui-, di- and triterpenoids. Biosynthetic pathways and enzymes to terpenoids from each of these classes have been described. These typically involve the scaffold-generating terpene synthases and cyclases, and scaffold-tailoring enzymes such as cytochrome P450 monooxygenases, NAD(P)+- and flavin-dependent oxidoreductases, and various group transferases that generate the final bioactive structures. The biosynthesis of several sesquiterpenoid mycotoxins and bioactive diterpenoids has been well-studied in Ascomycota (e.g. filamentous fungi). Little is known about the terpenoid biosynthetic pathways in Basidiomycota (e.g. mushroom forming fungi), although they produce a huge diversity of terpenoid natural products. Specifically, many trans-humulyl cation derived sesquiterpenoid natural products with potent bioactivities have been isolated. Biosynthetic gene clusters responsible for the production of trans-humulyl cation derived protoilludanes, and other sesquiterpenoids, can be rapidly identified by genome sequencing and bioinformatic methods. Genome mining combined with heterologous biosynthetic pathway refactoring has the potential to facilitate discovery and production of pharmaceutically relevant fungal terpenoids. PMID:25171145

  7. BiDiBlast: comparative genomics pipeline for the PC.

    PubMed

    de Almeida, João M G C F

    2010-06-01

    Bi-directional BLAST is a simple approach to detect, annotate, and analyze candidate orthologous or paralogous sequences in a single go. This procedure is usually confined to the realm of customized Perl scripts tuned for UNIX-like environments. Porting those scripts to other operating systems involves refactoring them, as well as installing the Perl programming environment with the required libraries. To overcome these limitations, a data pipeline was implemented in Java. This application submits two batches of sequences to local versions of the NCBI BLAST tool, manages result lists, and refines both bi-directional and simple hits. GO Slim terms are attached to hits, several statistics are derived, and molecular evolution rates are estimated through PAML. The results are written to a set of delimited text tables intended for further analysis. The provided graphical user interface allows friendly interaction with this application, which is documented and available to download at http://moodle.fct.unl.pt/course/view.php?id=2079 or https://sourceforge.net/projects/bidiblast/ under the GNU GPL license. Copyright 2010 Beijing Genomics Institute. Published by Elsevier Ltd. All rights reserved.
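    The core idea of bi-directional BLAST, keeping only reciprocal best hits as candidate orthologs, can be sketched independently of any BLAST wrapper. This is a generic Python illustration, not BiDiBlast's Java implementation, and the hit tables are hypothetical.

      # Reciprocal best hits: a pair (a, b) is kept only if b is a's best hit in the
      # other set AND a is b's best hit in return (generic sketch, hypothetical data).

      def best_hits(hit_table):
          """hit_table: list of (query, subject, score) -> {query: best subject}."""
          best = {}
          for query, subject, score in hit_table:
              if query not in best or score > best[query][1]:
                  best[query] = (subject, score)
          return {q: s for q, (s, _) in best.items()}

      def reciprocal_best_hits(a_vs_b, b_vs_a):
          forward, backward = best_hits(a_vs_b), best_hits(b_vs_a)
          return [(a, b) for a, b in forward.items() if backward.get(b) == a]

      a_vs_b = [("geneA1", "geneB7", 250.0), ("geneA1", "geneB2", 90.0), ("geneA2", "geneB2", 180.0)]
      b_vs_a = [("geneB7", "geneA1", 245.0), ("geneB2", "geneA9", 120.0)]
      print(reciprocal_best_hits(a_vs_b, b_vs_a))  # [('geneA1', 'geneB7')]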

  8. The High Field Path to Practical Fusion Energy

    NASA Astrophysics Data System (ADS)

    Mumgaard, Robert; Whyte, D.; Greenwald, M.; Hartwig, Z.; Brunner, D.; Sorbom, B.; Marmar, E.; Minervini, J.; Bonoli, P.; Irby, J.; Labombard, B.; Terry, J.; Vieira, R.; Wukitch, S.

    2017-10-01

    We propose a faster, lower cost development path for fusion energy enabled by high temperature superconductors, devices at high magnetic field, innovative technologies and modern approaches to technology development. Timeliness, scale, and economic-viability are the drivers for fusion energy to combat climate change and aid economic development. The opportunities provided by high-temperature superconductors, innovative engineering and physics, and new organizational structures identified over the last few years open new possibilities for realizing practical fusion energy that could meet mid-century de-carbonization needs. We discuss re-factoring the fusion energy development path with an emphasis on concrete risk retirement strategies utilizing a modular approach based on the high-field tokamak that leverages the broader tokamak physics understanding of confinement, stability, and operational limits. Elements of this plan include development of high-temperature superconductor magnets, simplified immersion blankets, advanced long-leg divertors, a compact divertor test tokamak, efficient current drive, modular construction, and demountable magnet joints. An R&D plan culminating in the construction of an integrated pilot plant and test facility modeled on the ARC concept is presented.

  9. nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.

    PubMed

    Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia

    2017-12-01

    Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with increased attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have also been limited by time-consuming modeling workflows and the highly skilled expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely-used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that took a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create supporting objects necessary to create models, e.g. surfaces, anatomical landmarks, reference systems; and a second step includes the creation of OpenSim objects, e.g. bodies, joints, muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application. We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this will help promote personalized applications in musculoskeletal biomechanics, including larger sample size studies, and might also represent a basis for future developments for specific applications. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. PanDA for ATLAS distributed computing in the next decade

    NASA Astrophysics Data System (ADS)

    Barreiro Megino, F. H.; De, K.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Padolski, S.; Panitkin, S.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The Production and Distributed Analysis (PanDA) system has been developed to meet ATLAS production and analysis requirements for a data-driven workload management system capable of operating at the Large Hadron Collider (LHC) data processing scale. Heterogeneous resources used by the ATLAS experiment are distributed worldwide at hundreds of sites, thousands of physicists analyse the data remotely, the volume of processed data is beyond the exabyte scale, dozens of scientific applications are supported, while data processing requires more than a few billion hours of computing usage per year. PanDA performed very well over the last decade, including the LHC Run 1 data taking period. However, it was decided to upgrade the whole system concurrently with the LHC’s first long shutdown in order to cope with rapidly changing computing infrastructure. After two years of reengineering efforts, PanDA has embedded capabilities for fully dynamic and flexible workload management. The static batch job paradigm was discarded in favor of a more automated and scalable model. Workloads are dynamically tailored for optimal usage of resources, with the brokerage taking network traffic and forecasts into account. Computing resources are partitioned based on dynamic knowledge of their status and characteristics. The pilot has been re-factored around a plugin structure for easier development and deployment. Bookkeeping is handled with both coarse and fine granularities for efficient utilization of pledged or opportunistic resources. An in-house security mechanism authenticates the pilot and data management services in off-grid environments such as volunteer computing and private local clusters. The PanDA monitor has been extensively optimized for performance and extended with analytics to provide aggregated summaries of the system as well as drill-down to operational details. Many other features are planned or have recently been implemented, and the system has been adopted by non-LHC experiments, such as bioinformatics groups successfully running the Paleomix (microbial genome and metagenome) payload on supercomputers. In this paper we will focus on the new and planned features that are most important to the next decade of distributed computing workload management.

  11. SuperB Simulation Production System

    NASA Astrophysics Data System (ADS)

    Tomassetti, L.; Bianchi, F.; Ciaschini, V.; Corvo, M.; Del Prete, D.; Di Simone, A.; Donvito, G.; Fella, A.; Franchini, P.; Giacomini, F.; Gianoli, A.; Longo, S.; Luitz, S.; Luppi, E.; Manzali, M.; Pardi, S.; Paolini, A.; Perez, A.; Rama, M.; Russo, G.; Santeramo, B.; Stroili, R.

    2012-12-01

    The SuperB asymmetric e+e- collider and detector to be built at the newly founded Nicola Cabibbo Lab will provide a uniquely sensitive probe of New Physics in the flavor sector of the Standard Model. Studying minute effects in the heavy quark and heavy lepton sectors requires a data sample of 75 ab^-1 and a peak luminosity of 10^36 cm^-2 s^-1. The SuperB Computing group is working on developing a simulation production framework capable of satisfying the experiment needs. It provides access to distributed resources in order to support both the detector design definition and its performance evaluation studies. During the last year the framework has evolved in terms of job workflow, Grid service interfaces and technology adoption. A complete code refactoring and porting of sub-components to other languages now permit the framework to sustain distributed production involving resources from two continents and multiple Grid flavors. In this paper we will report a complete description of the current state of the production system, its evolution and its integration with Grid services; in particular, we will focus on the utilization of new Grid component features, such as those in LB and WMS version 3. Results from the last official SuperB production cycle will be reported.

  12. Dynamical and Mechanistic Reconstructive Approaches of T Lymphocyte Dynamics: Using Visual Modeling Languages to Bridge the Gap between Immunologists, Theoreticians, and Programmers

    PubMed Central

    Thomas-Vaslin, Véronique; Six, Adrien; Ganascia, Jean-Gabriel; Bersini, Hugues

    2013-01-01

    Dynamic modeling of lymphocyte behavior has primarily been based on population-based differential equations or on cellular agents moving in space and interacting with each other. The final steps of this modeling effort are expressed in code written in a programming language. Owing to the complete lack of standardization of the different steps of this process, communication and sharing between experimentalists, theoreticians and programmers remain poor. The adoption of a diagrammatic visual computer language should, however, greatly help immunologists to communicate better, to identify model similarities more easily, and to facilitate the reuse and extension of existing software models. Since immunologists often conceptualize the dynamical evolution of immune systems in terms of “state-transitions” of biological objects, we promote the use of unified modeling language (UML) state-transition diagrams. To demonstrate the feasibility of this approach, we present a UML refactoring of two published models of thymocyte differentiation. Originally built with different modeling strategies, a mathematical ordinary differential equation-based model and a cellular automata model, the two models are now expressed in the same visual formalism and can be compared. PMID:24101919
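
    As a schematic of how a state-transition view maps onto an ODE population model, the sketch below integrates a hypothetical three-compartment thymocyte differentiation chain; the compartments and rate constants are illustrative only and are not taken from either of the published models.

        # Hypothetical three-state thymocyte chain DN -> DP -> SP with death terms;
        # each arrow of a state-transition diagram becomes one rate term in the ODEs.
        import numpy as np
        from scipy.integrate import solve_ivp

        k_dn_dp, k_dp_sp = 0.8, 0.3        # transition rates (1/day), illustrative only
        d_dn, d_dp, d_sp = 0.1, 0.6, 0.05  # death rates (1/day), illustrative only
        influx = 100.0                     # precursor influx into the DN pool (cells/day)

        def rhs(t, y):
            dn, dp, sp = y
            d_dn_dt = influx - (k_dn_dp + d_dn) * dn
            d_dp_dt = k_dn_dp * dn - (k_dp_sp + d_dp) * dp
            d_sp_dt = k_dp_sp * dp - d_sp * sp
            return [d_dn_dt, d_dp_dt, d_sp_dt]

        sol = solve_ivp(rhs, t_span=(0, 60), y0=[0.0, 0.0, 0.0],
                        t_eval=np.linspace(0, 60, 121))
        print(sol.y[:, -1])   # approximate pool sizes after 60 days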

  13. Phage Therapy in the Era of Synthetic Biology.

    PubMed

    Barbu, E Magda; Cady, Kyle C; Hubby, Bolyn

    2016-10-03

    For more than a century, bacteriophage (or phage) research has enabled some of the most important discoveries in biological sciences and has equipped scientists with many of the molecular biology tools that have advanced our understanding of replication, maintenance, and expression of genetic material. Phages have also been recognized and exploited as natural antimicrobial agents and nanovectors for gene therapy, but their potential as therapeutics has not been fully exploited in Western medicine because of challenges such as narrow host range, bacterial resistance, and unique pharmacokinetics. However, increasing concern related to the emergence of bacteria resistant to multiple antibiotics has heightened interest in phage therapy and the development of strategies to overcome hurdles associated with bacteriophage therapeutics. Recent progress in sequencing technologies, DNA manipulation, and synthetic biology allowed scientists to refactor the entire bacterial genome of Mycoplasma mycoides, thereby creating the first synthetic cell. These new strategies for engineering genomes may have the potential to accelerate the construction of designer phage genomes with superior therapeutic potential. Here, we discuss the use of phage as therapeutics, as well as how synthetic biology can create bacteriophage with desirable attributes. Copyright © 2016 Cold Spring Harbor Laboratory Press; all rights reserved.

  14. Next Generation Sequencing of Actinobacteria for the Discovery of Novel Natural Products

    PubMed Central

    Gomez-Escribano, Juan Pablo; Alt, Silke; Bibb, Mervyn J.

    2016-01-01

    Like many fields of the biosciences, actinomycete natural products research has been revolutionised by next-generation DNA sequencing (NGS). Hundreds of new genome sequences from actinobacteria are made public every year, many of them as a result of projects aimed at identifying new natural products and their biosynthetic pathways through genome mining. Advances in these technologies in the last five years have meant not only a reduction in the cost of whole genome sequencing, but also a substantial increase in the quality of the data, having moved from obtaining a draft genome sequence comprised of several hundred short contigs, sometimes of doubtful reliability, to the possibility of obtaining an almost complete and accurate chromosome sequence in a single contig, allowing a detailed study of gene clusters and the design of strategies for refactoring and full gene cluster synthesis. The impact that these technologies are having in the discovery and study of natural products from actinobacteria, including those from the marine environment, is only starting to be realised. In this review we provide a historical perspective of the field, analyse the strengths and limitations of the most relevant technologies, and share the insights acquired during our genome mining projects. PMID:27089350

  15. The Confluence of GIS, Cloud and Open Source, Enabling Big Raster Data Applications

    NASA Astrophysics Data System (ADS)

    Plesea, L.; Emmart, C. B.; Boller, R. A.; Becker, P.; Baynes, K.

    2016-12-01

    The rapid evolution of available cloud services is profoundly changing the way applications are being developed and used. Massive object stores, service scalability, and continuous integration are some of the most important cloud technology advances that directly influence science applications and GIS. At the same time, more and more scientists are using GIS platforms in their day-to-day research. Yet with new opportunities there are always some challenges. Given the large amount of data commonly required in science applications, usually large raster datasets, connectivity is one of the biggest problems. Connectivity has two aspects, one being the limited bandwidth and latency of the communication link due to the geographical location of the resources, the other one being the interoperability and intrinsic efficiency of the interface protocol used to connect. NASA and Esri are actively helping each other and collaborating on a few open source projects, aiming to provide some of the core technology components to directly address the GIS enabled data connectivity problems. Last year Esri contributed LERC, a very fast and efficient compression algorithm, to the GDAL/MRF format, which itself is a NASA/Esri collaboration project. The MRF raster format has some cloud aware features that make it possible to build high performance web services on cloud platforms, as some of the Esri projects demonstrate. Currently, another NASA open source project, the high performance OnEarth WMTS server, is being refactored and enhanced to better integrate with MRF, GDAL and Esri software. Taken together, GDAL, MRF and OnEarth form the core of an open source CloudGIS toolkit that is already showing results. Since it is well integrated with GDAL, which is the most common interoperability component of GIS applications, this approach should improve the connectivity and performance of many science and GIS applications in the cloud.
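
    As a small illustration of the toolchain described above, the sketch below converts a GeoTIFF to the MRF format with LERC compression through the GDAL Python bindings; it assumes a GDAL build that includes the MRF driver and LERC support, and the file names are placeholders.

        # Convert a raster to MRF with LERC compression (assumes GDAL was built with
        # the MRF driver and LERC support; file names are placeholders).
        from osgeo import gdal

        gdal.UseExceptions()  # raise Python exceptions instead of returning None

        src = "input_scene.tif"
        dst = "output_scene.mrf"

        ds = gdal.Translate(dst, src, format="MRF",
                            creationOptions=["COMPRESS=LERC"])
        ds = None  # flush and close the dataset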

  16. Practices to enable the geophysical research spectrum: from fundamentals to applications

    NASA Astrophysics Data System (ADS)

    Kang, S.; Cockett, R.; Heagy, L. J.; Oldenburg, D.

    2016-12-01

    In a geophysical survey, a source injects energy into the earth and a response is measured. These physical systems are governed by partial differential equations and their numerical solutions are obtained by discretizing the earth. Geophysical simulations and inversions are tools for understanding physical responses and constructing models of the subsurface given a finite amount of data. SimPEG (http://simpeg.xyz) is our effort to synthesize geophysical forward and inverse methodologies into a consistent framework. The primary focus of our initial development has been on the electromagnetics (EM) package, with recent extensions to magnetotelluric, direct current (DC), and induced polarization. Across these methods, and applied geophysics in general, we require tools to explore and build an understanding of the physics (behaviour of fields, fluxes), and work with data to produce models through reproducible inversions. If we consider DC or EM experiments, with the aim of understanding responses from subsurface conductors, we require resources that provide multiple "entry points" into the geophysical problem. To understand the physical responses and measured data, we must simulate the physical system and visualize electric fields, currents, and charges. Performing an inversion requires that many moving pieces be brought together: simulation, physics, linear algebra, data processing, optimization, etc. Each component must be trusted, accessible to interrogation and manipulation, and readily combined in order to enable investigation into inversion methodologies. To support such research, we not only require "entry points" into the software, but also extensibility to new situations. In our development of SimPEG, we have sought to use leading practices in software development with the aim of supporting and promoting collaborations across a spectrum of geophysical research: from fundamentals to applications. Designing software to enable this spectrum puts unique constraints on both the architecture of the codebase as well as the development practices that are employed. In this presentation, we will share some lessons learned and, in particular, how our prioritization of testing, documentation, and refactoring has impacted our own research and fostered collaborations.
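
    One of the practices highlighted above, testing of numerical components, can be illustrated with a simple order-of-convergence check; this is a generic sketch (not SimPEG's own test utilities) for a one-dimensional centred finite-difference derivative.

        # Generic order-of-convergence test: halving h should reduce the error of a
        # centred finite-difference derivative by roughly a factor of four (2nd order).
        import numpy as np

        def centred_derivative(f, x, h):
            return (f(x + h) - f(x - h)) / (2.0 * h)

        def test_second_order_convergence():
            f, dfdx, x0 = np.sin, np.cos, 0.7
            errors = []
            for h in [1e-1, 5e-2, 2.5e-2, 1.25e-2]:
                errors.append(abs(centred_derivative(f, x0, h) - dfdx(x0)))
            rates = [np.log2(errors[i] / errors[i + 1]) for i in range(len(errors) - 1)]
            # observed rates should be close to 2 for a second-order scheme
            assert all(rate > 1.9 for rate in rates), rates

        if __name__ == "__main__":
            test_second_order_convergence()
            print("convergence test passed")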

  17. Performance Evaluation of NWChem Ab-Initio Molecular Dynamics (AIMD) Simulations on the Intel® Xeon Phi™ Processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bylaska, Eric J.; Jacquelin, Mathias; De Jong, Wibe A.

    2017-10-20

    Ab-initio Molecular Dynamics (AIMD) methods are an important class of algorithms, as they enable scientists to understand the chemistry and dynamics of molecular and condensed phase systems while retaining a first-principles-based description of their interactions. Many-core architectures such as the Intel® Xeon Phi™ processor are an interesting and promising target for these algorithms, as they can provide the computational power that is needed to solve interesting problems in chemistry. In this paper, we describe the efforts of refactoring the existing AIMD plane-wave method of NWChem from an MPI-only implementation to a scalable, hybrid code that employs MPI and OpenMP to exploit the capabilities of current and future many-core architectures. We describe the optimizations required to get close to optimal performance for the multiplication of the tall-and-skinny matrices that form the core of the computational algorithm. We present strong scaling results on the complete AIMD simulation for a test case that simulates 256 water molecules and that strong-scales well on a cluster of 1024 nodes of Intel Xeon Phi processors. We compare the performance obtained with a cluster of dual-socket Intel® Xeon® E5–2698v3 processors.
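
    The tall-and-skinny matrix products at the core of the plane-wave algorithm can be sketched schematically as a row-block distribution followed by a global reduction; the Python/mpi4py sketch below illustrates the general idea and is not the actual NWChem Fortran/MPI/OpenMP implementation.

        # Schematic distributed product C = A^T B for tall-and-skinny A and B: each
        # rank owns a block of rows, forms its local contribution, and the small
        # k-by-k results are summed across ranks with Allreduce.
        # Run with e.g.: mpirun -n 4 python tall_skinny.py
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rows_per_rank, k = 100_000, 32          # local rows, number of orbitals (illustrative)

        rng = np.random.default_rng(comm.Get_rank())
        A_local = rng.standard_normal((rows_per_rank, k))
        B_local = rng.standard_normal((rows_per_rank, k))

        C_local = A_local.T @ B_local           # local k x k contribution
        C = np.empty_like(C_local)
        comm.Allreduce(C_local, C, op=MPI.SUM)  # global k x k result on every rank

        if comm.Get_rank() == 0:
            print("C shape:", C.shape)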

  18. IDCDACS: IDC's Distributed Application Control System

    NASA Astrophysics Data System (ADS)

    Ertl, Martin; Boresch, Alexander; Kianička, Ján; Sudakov, Alexander; Tomuta, Elena

    2015-04-01

    The Preparatory Commission for the CTBTO is an international organization based in Vienna, Austria. Its mission is to establish a global verification regime to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans all nuclear explosions. For this purpose time series data from a global network of seismic, hydro-acoustic and infrasound (SHI) sensors are transmitted to the International Data Centre (IDC) in Vienna in near-real-time, where they are processed to locate events that may be nuclear explosions. We have redesigned the distributed application control system that glues together the various components of the automatic waveform data processing system at the IDC (IDCDACS). Our highly-scalable solution preserves the existing architecture of the IDC processing system that proved successful over many years of operational use, but replaces proprietary components with open-source solutions and custom developed software. Existing code was refactored and extended to obtain a reusable software framework that is flexibly adaptable to different types of processing workflows. Automatic data processing is organized in series of self-contained processing steps, each series being referred to as a processing pipeline. Pipelines process data by time intervals, i.e. the time-series data received from monitoring stations is organized in segments based on the time when the data was recorded. So-called data monitor applications queue the data for processing in each pipeline based on specific conditions, e.g. data availability, elapsed time or completion states of preceding processing pipelines. IDCDACS consists of a configurable number of distributed monitoring and controlling processes, a message broker and a relational database. All processes communicate through message queues hosted on the message broker. Persistent state information is stored in the database. A configurable processing controller instantiates and monitors all data processing applications. Due to decoupling by message queues, the system is highly versatile and failure tolerant. The implementation utilizes the RabbitMQ open-source messaging platform that is based upon the Advanced Message Queuing Protocol (AMQP), an on-the-wire protocol (like HTTP) and an open industry standard. IDCDACS uses high availability capabilities provided by RabbitMQ and is equipped with failure recovery features to survive network and server outages. It is implemented in C and Python and is operated in a Linux environment at the IDC. Although IDCDACS was specifically designed for the existing IDC processing system, its architecture is generic and reusable for different automatic processing workflows, e.g. similar to those described in (Olivieri et al. 2012, Kværna et al. 2012). Major advantages are its independence of the specific data processing applications used and the possibility to reconfigure IDCDACS for different types of processing, data and trigger logic. A possible future development would be to use the IDCDACS framework for different scientific domains, e.g. for processing of Earth observation satellite data, extending the one-dimensional time-series intervals to spatio-temporal data cubes. REFERENCES Olivieri M., J. Clinton (2012) An almost fair comparison between Earthworm and SeisComp3, Seismological Research Letters, 83(4), 720-727. Kværna, T., S. J. Gibbons, D. B. Harris, D. A. Dodge (2012) Adapting pipeline architectures to track developing aftershock sequences and recurrent explosions, Proceedings of the 2012 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, 776-785.
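
    To give a flavour of the messaging pattern described above, the following is a minimal sketch of a pipeline-trigger producer and consumer using the pika client for RabbitMQ; the queue name and message fields are hypothetical and the sketch is not IDCDACS code.

        # Minimal RabbitMQ producer/consumer sketch with the pika client. The queue
        # name and payload are invented; IDCDACS's internal message format is not
        # reproduced here.
        import json
        import pika

        PARAMS = pika.ConnectionParameters("localhost")
        QUEUE = "pipeline.trigger"

        def publish_interval(start, end):
            """A data monitor would publish an interval once its conditions are met."""
            connection = pika.BlockingConnection(PARAMS)
            channel = connection.channel()
            channel.queue_declare(queue=QUEUE, durable=True)
            channel.basic_publish(
                exchange="",
                routing_key=QUEUE,
                body=json.dumps({"interval_start": start, "interval_end": end}),
                properties=pika.BasicProperties(delivery_mode=2),  # persist the message
            )
            connection.close()

        def consume_forever():
            """A processing controller would launch a pipeline for each interval."""
            connection = pika.BlockingConnection(PARAMS)
            channel = connection.channel()
            channel.queue_declare(queue=QUEUE, durable=True)

            def on_message(ch, method, properties, body):
                interval = json.loads(body)
                print("launching pipeline for", interval)
                ch.basic_ack(delivery_tag=method.delivery_tag)  # ack after handling

            channel.basic_consume(queue=QUEUE, on_message_callback=on_message)
            channel.start_consuming()

        if __name__ == "__main__":
            publish_interval("2015-04-01T00:00:00Z", "2015-04-01T00:10:00Z")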

  19. Steps towards the synthetic biology of polyketide biosynthesis.

    PubMed

    Cummings, Matthew; Breitling, Rainer; Takano, Eriko

    2014-02-01

    Nature is providing a bountiful pool of valuable secondary metabolites, many of which possess therapeutic properties. However, the discovery of new bioactive secondary metabolites is slowing down, at a time when the rise of multidrug-resistant pathogens and the realization of acute and long-term side effects of widely used drugs lead to an urgent need for new therapeutic agents. Approaches such as synthetic biology are promising to deliver a much-needed boost to secondary metabolite drug development through plug-and-play optimized hosts and refactoring novel or cryptic bacterial gene clusters. Here, we discuss this prospect focusing on one comprehensively studied class of clinically relevant bioactive molecules, the polyketides. Extensive efforts towards optimization and derivatization of compounds via combinatorial biosynthesis and classical engineering have elucidated the modularity, flexibility and promiscuity of polyketide biosynthetic enzymes. Hence, a synthetic biology approach can build upon a solid basis of guidelines and principles, while providing a new perspective towards the discovery and generation of novel and new-to-nature compounds. We discuss the lessons learned from the classical engineering of polyketide synthases and indicate their importance when attempting to engineer biosynthetic pathways using synthetic biology approaches for the introduction of novelty and overexpression of products in a controllable manner. © 2013 The Authors FEMS Microbiology Letters published by John Wiley & Sons Ltd on behalf of Federation of European Microbiological Societies.

  20. Continued Development of Python-Based Thomson Data Analysis and Associated Visualization Tool for NSTX-U

    NASA Astrophysics Data System (ADS)

    Wallace, William; Miller, Jared; Diallo, Ahmed

    2015-11-01

    MultiPoint Thomson Scattering (MPTS) is an established, accurate method of finding the temperature, density, and pressure of a magnetically confined plasma. Two Nd:YAG (1064 nm) lasers are fired into the plasma with an effective frequency of 60 Hz, and the light is Doppler shifted by Thomson scattering. Polychromators on the NSTX-U midplane collect the scattered photons at various radii/scattering angles, and the avalanche photodiode voltages are saved to an MDSplus tree for later analysis. IDL code is then used to determine plasma temperature, pressure, and density from the captured polychromator measurements via the Selden formulas [1]. Previous work [2] converted the single-processor IDL code into Python code, and prepared a new architecture for multiprocessing MPTS in parallel. However, that work did not reach the point of generating output data and curve fits that match the previous IDL results. This project refactored the Python code into an object-oriented architecture, and created a software test suite for the new architecture, which allowed identification of the code that generated the differences in output. Another effort currently underway is to display the Thomson data in an intuitive, interactive format. This work was supported in part by the U.S. Department of Energy, Office of Science, Office of Workforce Development for Teachers and Scientists (WDTS) under the Community College Internship (CCI) program.
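
    The kind of test suite described above can be sketched as a regression test that compares the refactored Python output against reference values exported from the legacy IDL code; the module, class, file names, and shot number below are hypothetical.

        # Hypothetical regression test: compare the refactored analysis output against
        # reference profiles exported from the legacy IDL code for the same shot.
        import numpy as np
        import pytest

        from mpts_analysis import ThomsonAnalyzer   # hypothetical refactored module

        @pytest.mark.parametrize("shot", [204118])  # illustrative shot number
        def test_matches_idl_reference(shot):
            reference = np.load(f"idl_reference_{shot}.npz")    # exported from IDL
            result = ThomsonAnalyzer(shot).fit_profiles()       # hypothetical API
            np.testing.assert_allclose(result.temperature, reference["te"], rtol=1e-3)
            np.testing.assert_allclose(result.density, reference["ne"], rtol=1e-3)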

  1. Re-Factoring Glycolytic Genes for Targeted Engineering of Catabolism in Gram-Negative Bacteria.

    PubMed

    Sánchez-Pascuala, Alberto; Nikel, Pablo I; de Lorenzo, Víctor

    2018-01-01

    The Embden-Meyerhof-Parnas (EMP) pathway is widely accepted to be the biochemical standard of glucose catabolism. The well-characterized glycolytic route of Escherichia coli, based on the EMP catabolism, is an example of an intricate pathway in terms of genomic organization of the genes involved and patterns of gene expression and regulation. This intrinsic genetic and metabolic complexity renders it difficult to engineer glycolytic activities and transfer them onto other microbial cell factories, thus limiting the biotechnological potential of bacterial hosts that lack the route. Taking into account the potential applications of such a portable tool for targeted pathway engineering, in the present protocol we describe how the genes encoding all the enzymes of the linear EMP route have been individually recruited from the genome of E. coli K-12, edited in silico to remove their endogenous regulatory signals, and synthesized de novo following a standard (i.e., GlucoBrick) that facilitates their grouping in the form of functional modules that can be combined at the user's will. This novel genetic tool allows for the à la carte implementation or boosting of EMP pathway activities into different Gram-negative bacteria. The potential of the GlucoBrick platform is further illustrated by engineering novel glycolytic activities in the most representative members of the Pseudomonas genus (Pseudomonas putida and Pseudomonas aeruginosa).

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yeung, Yu-Hong; Pothen, Alex; Halappanavar, Mahantesh

    We present an augmented matrix approach to update the solution to a linear system of equations when the coefficient matrix is modified by a few elements within a principal submatrix. This problem arises in the dynamic security analysis of a power grid, where operators need to perform N-x contingency analysis, i.e., determine the state of the system when up to x links from N fail. Our algorithms augment the coefficient matrix to account for the changes in it, and then compute the solution to the augmented system without refactoring the modified matrix. We provide two algorithms, a direct method, and a hybrid direct-iterative method for solving the augmented system. We also exploit the sparsity of the matrices and vectors to accelerate the overall computation. Our algorithms are compared on three power grids with PARDISO, a parallel direct solver, and CHOLMOD, a direct solver with the ability to modify the Cholesky factors of the coefficient matrix. We show that our augmented algorithms outperform PARDISO (by two orders of magnitude), and CHOLMOD (by a factor of up to 5). Further, our algorithms scale better than CHOLMOD as the number of elements updated increases. The solutions are computed with high accuracy. Our algorithms are capable of computing N-x contingency analysis on a 778K bus grid, updating a solution with x = 20 elements in 1.6 × 10^-2 seconds on an Intel Xeon processor.
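
    To make the idea concrete, one common way to pose such an update as an augmented system is sketched below in LaTeX; this is a generic low-rank formulation and not necessarily the exact construction used by the authors.

        % Write the change to the coefficient matrix as a low-rank update
        % \Delta A = U V^{T}; the updated solution x of (A + U V^{T}) x = b can then
        % be obtained from the augmented system without refactoring A:
        \[
        \begin{pmatrix} A & U \\ V^{T} & -I \end{pmatrix}
        \begin{pmatrix} x \\ y \end{pmatrix}
        =
        \begin{pmatrix} b \\ 0 \end{pmatrix},
        \qquad
        y = V^{T} x \;\Rightarrow\; (A + U V^{T})\, x = b .
        \]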

  3. An Exponential Regulator for Rapidity Divergences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Ye; Neill, Duff; Zhu, Hua Xing

    2016-04-01

    Finding an efficient and compelling regularization of soft and collinear degrees of freedom at the same invariant mass scale, but separated in rapidity, is a persistent problem in high-energy factorization. In the course of a calculation, one encounters divergences unregulated by dimensional regularization, often called rapidity divergences. Once regulated, a general framework exists for their renormalization, the rapidity renormalization group (RRG), leading to fully resummed calculations of transverse momentum (to the jet axis) sensitive quantities. We examine how this regularization can be implemented via a multi-differential factorization of the soft-collinear phase-space, leading to an (in principle) alternative non-perturbative regularization of rapidity divergences. As an example, we examine the fully-differential factorization of a color singlet's momentum spectrum in a hadron-hadron collision at threshold. We show how this factorization acts as a mother theory to both traditional threshold and transverse momentum resummation, recovering the classical results for both resummations. Examining the refactorization of the transverse momentum beam functions in the threshold region, we show that one can directly calculate the rapidity renormalized function, while shedding light on the structure of joint resummation. Finally, we show how using modern bootstrap techniques, the transverse momentum spectrum is determined by an expansion about the threshold factorization, leading to a viable higher loop scheme for calculating the relevant anomalous dimensions for the transverse momentum spectrum.

  4. Enabling Automated Graph-based Search for the Identification and Characterization of Mesoscale Convective Complexes in Satellite Datasets through Integration with the Apache Open Climate Workbench

    NASA Astrophysics Data System (ADS)

    McGibbney, L. J.; Whitehall, K. D.; Mattmann, C. A.; Goodale, C. E.; Joyce, M.; Ramirez, P.; Zimdars, P.

    2014-12-01

    We detail how Apache Open Climate Workbench (OCW) (recently open sourced by NASA JPL) was adapted to facilitate an ongoing study of Mesoscale Convective Complexes (MCCs) in West Africa and their contributions within the weather-climate continuum as it relates to climate variability. More than 400 MCCs occur annually over various locations on the globe. In West Africa, approximately one-fifth of that total occur during the summer months (June-November) alone and are estimated to contribute more than 50% of the seasonal rainfall amounts. Furthermore, in general the non-discriminatory socio-economic geospatial distribution of these features correlates with current and projected densely populated locations. As such, the convective nature of MCCs raises questions regarding their seasonal variability and frequency in current and future climates, amongst others. However, in spite of formal observation criteria being established for these features in 1980, these questions have remained comprehensively unanswered because of the untimely and subjective methods for identifying and characterizing MCCs due to data-handling limitations. The main outcome of this work therefore documents how a graph-based search algorithm was implemented on top of the OCW stack with the ultimate goal of improving fully automated end-to-end identification and characterization of MCCs in high resolution observational datasets. Apache OCW has been demonstrated as an open source project since its inception, and we show how it was again utilized to advance understanding and knowledge within the above domain. The project was born out of refactored code donated by NASA JPL from the Earth science community's Regional Climate Model Evaluation System (RCMES), a joint project between the Joint Institute for Regional Earth System Science and Engineering (JIFRESSE), and a scientific collaboration between the University of California at Los Angeles (UCLA) and NASA JPL. The Apache OCW project was then integrated back into the donor code with the aim of more efficiently powering that project. Notwithstanding, the object-oriented approach to creating a core set of libraries has scaled the usability of Apache OCW beyond climate model evaluation, as displayed in the MCC use case detailed herewith.
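
    The graph-based search idea can be sketched generically: cloud elements detected in consecutive satellite images become nodes, overlap between time steps becomes directed edges, and connected subgraphs are candidate MCC tracks. The sketch below uses networkx with made-up detections and a simplistic overlap test; it is not the OCW implementation.

        # Generic sketch of graph-based feature tracking: nodes are cloud elements
        # detected per time step, edges connect elements that overlap between steps,
        # and weakly connected components are candidate convective-system tracks.
        import networkx as nx

        # (time_step, element_id): set of (row, col) pixels -- made-up detections
        detections = {
            (0, "a"): {(10, 10), (10, 11), (11, 10)},
            (1, "b"): {(10, 11), (11, 11), (11, 12)},
            (2, "c"): {(11, 12), (12, 12)},
            (1, "d"): {(40, 40)},                      # unrelated element
        }

        graph = nx.DiGraph()
        graph.add_nodes_from(detections)

        for (t1, i1), px1 in detections.items():
            for (t2, i2), px2 in detections.items():
                if t2 == t1 + 1 and px1 & px2:         # simplistic overlap criterion
                    graph.add_edge((t1, i1), (t2, i2))

        tracks = list(nx.weakly_connected_components(graph))
        print(tracks)   # e.g. [{(0, 'a'), (1, 'b'), (2, 'c')}, {(1, 'd')}]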

  5. Phenotype Instance Verification and Evaluation Tool (PIVET): A Scaled Phenotype Evidence Generation Framework Using Web-Based Medical Literature.

    PubMed

    Henderson, Jette; Ke, Junyuan; Ho, Joyce C; Ghosh, Joydeep; Wallace, Byron C

    2018-05-04

    Researchers are developing methods to automatically extract clinically relevant and useful patient characteristics from raw healthcare datasets. These characteristics, often capturing essential properties of patients with common medical conditions, are called computational phenotypes. Being generated by automated or semiautomated, data-driven methods, such potential phenotypes need to be validated as clinically meaningful (or not) before they are acceptable for use in decision making. The objective of this study was to present Phenotype Instance Verification and Evaluation Tool (PIVET), a framework that uses co-occurrence analysis on an online corpus of publically available medical journal articles to build clinical relevance evidence sets for user-supplied phenotypes. PIVET adopts a conceptual framework similar to the pioneering prototype tool PheKnow-Cloud that was developed for the phenotype validation task. PIVET completely refactors each part of the PheKnow-Cloud pipeline to deliver vast improvements in speed without sacrificing the quality of the insights PheKnow-Cloud achieved. PIVET leverages indexing in NoSQL databases to efficiently generate evidence sets. Specifically, PIVET uses a succinct representation of the phenotypes that corresponds to the index on the corpus database and an optimized co-occurrence algorithm inspired by the Aho-Corasick algorithm. We compare PIVET's phenotype representation with PheKnow-Cloud's by using PheKnow-Cloud's experimental setup. In PIVET's framework, we also introduce a statistical model trained on domain expert-verified phenotypes to automatically classify phenotypes as clinically relevant or not. Additionally, we show how the classification model can be used to examine user-supplied phenotypes in an online, rather than batch, manner. PIVET maintains the discriminative power of PheKnow-Cloud in terms of identifying clinically relevant phenotypes for the same corpus with which PheKnow-Cloud was originally developed, but PIVET's analysis is an order of magnitude faster than that of PheKnow-Cloud. Not only is PIVET much faster, it can be scaled to a larger corpus and still retain speed. We evaluated multiple classification models on top of the PIVET framework and found ridge regression to perform best, realizing an average F1 score of 0.91 when predicting clinically relevant phenotypes. Our study shows that PIVET improves on the most notable existing computational tool for phenotype validation in terms of speed and automation and is comparable in terms of accuracy. ©Jette Henderson, Junyuan Ke, Joyce C Ho, Joydeep Ghosh, Byron C Wallace. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 04.05.2018.
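
    The co-occurrence analysis at the heart of this kind of evidence generation can be illustrated with a deliberately simple sketch (plain dictionary matching over toy abstracts); the real pipeline described above relies on a NoSQL index and an Aho-Corasick-style multi-pattern matcher, which are not reproduced here.

        # Toy co-occurrence count: how often do pairs of phenotype terms appear in the
        # same article? A real system would use a database index and an Aho-Corasick
        # style matcher instead of naive substring search over every abstract.
        from itertools import combinations
        from collections import Counter

        phenotype_terms = ["heart failure", "diuretic", "hypertension"]  # illustrative
        abstracts = [
            "patients with heart failure were treated with a loop diuretic",
            "hypertension is a risk factor for heart failure",
            "a study of hypertension treatment adherence",
        ]

        pair_counts = Counter()
        for text in abstracts:
            present = [t for t in phenotype_terms if t in text]
            for pair in combinations(sorted(present), 2):
                pair_counts[pair] += 1

        print(pair_counts.most_common())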

  6. Phenotype Instance Verification and Evaluation Tool (PIVET): A Scaled Phenotype Evidence Generation Framework Using Web-Based Medical Literature

    PubMed Central

    Henderson, Jette; Ke, Junyuan; Ho, Joyce C; Ghosh, Joydeep; Wallace, Byron C

    2018-01-01

    Background Researchers are developing methods to automatically extract clinically relevant and useful patient characteristics from raw healthcare datasets. These characteristics, often capturing essential properties of patients with common medical conditions, are called computational phenotypes. Being generated by automated or semiautomated, data-driven methods, such potential phenotypes need to be validated as clinically meaningful (or not) before they are acceptable for use in decision making. Objective The objective of this study was to present Phenotype Instance Verification and Evaluation Tool (PIVET), a framework that uses co-occurrence analysis on an online corpus of publically available medical journal articles to build clinical relevance evidence sets for user-supplied phenotypes. PIVET adopts a conceptual framework similar to the pioneering prototype tool PheKnow-Cloud that was developed for the phenotype validation task. PIVET completely refactors each part of the PheKnow-Cloud pipeline to deliver vast improvements in speed without sacrificing the quality of the insights PheKnow-Cloud achieved. Methods PIVET leverages indexing in NoSQL databases to efficiently generate evidence sets. Specifically, PIVET uses a succinct representation of the phenotypes that corresponds to the index on the corpus database and an optimized co-occurrence algorithm inspired by the Aho-Corasick algorithm. We compare PIVET’s phenotype representation with PheKnow-Cloud’s by using PheKnow-Cloud’s experimental setup. In PIVET’s framework, we also introduce a statistical model trained on domain expert–verified phenotypes to automatically classify phenotypes as clinically relevant or not. Additionally, we show how the classification model can be used to examine user-supplied phenotypes in an online, rather than batch, manner. Results PIVET maintains the discriminative power of PheKnow-Cloud in terms of identifying clinically relevant phenotypes for the same corpus with which PheKnow-Cloud was originally developed, but PIVET’s analysis is an order of magnitude faster than that of PheKnow-Cloud. Not only is PIVET much faster, it can be scaled to a larger corpus and still retain speed. We evaluated multiple classification models on top of the PIVET framework and found ridge regression to perform best, realizing an average F1 score of 0.91 when predicting clinically relevant phenotypes. Conclusions Our study shows that PIVET improves on the most notable existing computational tool for phenotype validation in terms of speed and automation and is comparable in terms of accuracy. PMID:29728351

  7. List-mode PET image reconstruction for motion correction using the Intel XEON PHI co-processor

    NASA Astrophysics Data System (ADS)

    Ryder, W. J.; Angelis, G. I.; Bashar, R.; Gillam, J. E.; Fulton, R.; Meikle, S.

    2014-03-01

    List-mode image reconstruction with motion correction is computationally expensive, as it requires projection of hundreds of millions of rays through a 3D array. To decrease reconstruction time it is possible to use symmetric multiprocessing computers or graphics processing units. The former can have high financial costs, while the latter can require refactoring of algorithms. The Xeon Phi is a new co-processor card with a Many Integrated Core architecture that can run 4 multiple-instruction, multiple data threads per core with each thread having a 512-bit single instruction, multiple data vector register. Thus, it is possible to run in the region of 220 threads simultaneously. The aim of this study was to investigate whether the Xeon Phi co-processor card is a viable alternative to an x86 Linux server for accelerating list-mode PET image reconstruction for motion correction. An existing list-mode image reconstruction algorithm with motion correction was ported to run on the Xeon Phi coprocessor with the multi-threading implemented using pthreads. There were no differences between images reconstructed using the Phi co-processor card and images reconstructed using the same algorithm run on a Linux server. However, it was found that the reconstruction runtimes were 3 times greater for the Phi than the server. A new version of the image reconstruction algorithm was developed in C++ using OpenMP for multi-threading, and the Phi runtimes decreased to 1.67 times that of the host Linux server. Data transfer from the host to the co-processor card was found to be a rate-limiting step; this needs to be carefully considered in order to maximize runtime speeds. When considering the purchase price of a Linux workstation with a Xeon Phi co-processor card and a top-of-the-range Linux server, the former is a cost-effective computation resource for list-mode image reconstruction. A multi-Phi workstation could be a viable alternative to cluster computers at a lower cost for medical imaging applications.

  8. Toward modular biological models: defining analog modules based on referent physiological mechanisms

    PubMed Central

    2014-01-01

    Background Currently, most biomedical models exist in isolation. It is often difficult to reuse or integrate models or their components, in part because they are not modular. Modular components allow the modeler to think more deeply about the role of the model and to more completely address a modeling project’s requirements. In particular, modularity facilitates component reuse and model integration for models with different use cases, including the ability to exchange modules during or between simulations. The heterogeneous nature of biology and vast range of wet-lab experimental platforms call for modular models designed to satisfy a variety of use cases. We argue that software analogs of biological mechanisms are reasonable candidates for modularization. Biomimetic software mechanisms comprised of physiomimetic mechanism modules offer benefits that are unique or especially important to multi-scale, biomedical modeling and simulation. Results We present a general, scientific method of modularizing mechanisms into reusable software components that we call physiomimetic mechanism modules (PMMs). PMMs utilize parametric containers that partition and expose state information into physiologically meaningful groupings. To demonstrate, we modularize four pharmacodynamic response mechanisms adapted from an in silico liver (ISL). We verified the modularization process by showing that drug clearance results from in silico experiments are identical before and after modularization. The modularized ISL achieves validation targets drawn from propranolol outflow profile data. In addition, an in silico hepatocyte culture (ISHC) is created. The ISHC uses the same PMMs and required no refactoring. The ISHC achieves validation targets drawn from propranolol intrinsic clearance data exhibiting considerable between-lab variability. The data used as validation targets for PMMs originate from both in vitro and in vivo experiments exhibiting large fold differences in time scale. Conclusions This report demonstrates the feasibility of PMMs and their usefulness across multiple model use cases. The pharmacodynamic response module developed here is robust to changes in model context and flexible in its ability to achieve validation targets in the face of considerable experimental uncertainty. Adopting the modularization methods presented here is expected to facilitate model reuse and integration, thereby accelerating the pace of biomedical research. PMID:25123169

  9. Toward modular biological models: defining analog modules based on referent physiological mechanisms.

    PubMed

    Petersen, Brenden K; Ropella, Glen E P; Hunt, C Anthony

    2014-08-16

    Currently, most biomedical models exist in isolation. It is often difficult to reuse or integrate models or their components, in part because they are not modular. Modular components allow the modeler to think more deeply about the role of the model and to more completely address a modeling project's requirements. In particular, modularity facilitates component reuse and model integration for models with different use cases, including the ability to exchange modules during or between simulations. The heterogeneous nature of biology and vast range of wet-lab experimental platforms call for modular models designed to satisfy a variety of use cases. We argue that software analogs of biological mechanisms are reasonable candidates for modularization. Biomimetic software mechanisms comprised of physiomimetic mechanism modules offer benefits that are unique or especially important to multi-scale, biomedical modeling and simulation. We present a general, scientific method of modularizing mechanisms into reusable software components that we call physiomimetic mechanism modules (PMMs). PMMs utilize parametric containers that partition and expose state information into physiologically meaningful groupings. To demonstrate, we modularize four pharmacodynamic response mechanisms adapted from an in silico liver (ISL). We verified the modularization process by showing that drug clearance results from in silico experiments are identical before and after modularization. The modularized ISL achieves validation targets drawn from propranolol outflow profile data. In addition, an in silico hepatocyte culture (ISHC) is created. The ISHC uses the same PMMs and required no refactoring. The ISHC achieves validation targets drawn from propranolol intrinsic clearance data exhibiting considerable between-lab variability. The data used as validation targets for PMMs originate from both in vitro and in vivo experiments exhibiting large fold differences in time scale. This report demonstrates the feasibility of PMMs and their usefulness across multiple model use cases. The pharmacodynamic response module developed here is robust to changes in model context and flexible in its ability to achieve validation targets in the face of considerable experimental uncertainty. Adopting the modularization methods presented here is expected to facilitate model reuse and integration, thereby accelerating the pace of biomedical research.

  10. Interface cloning and sharing: Interaction designs for conserving labor and maintaining state across 24X7 sensor operations teams

    NASA Astrophysics Data System (ADS)

    Ganter, John H.; Reeves, Paul C.

    2017-05-01

    Processing remote sensing data is the epitome of computation, yet real-time collection systems remain human-labor intensive. Operator labor is consumed by both overhead tasks (cost) and value-added production (benefit). In effect, labor is taxed and then lost. When an operator comes on-shift, they typically duplicate setup work that their teammates have already performed many times. "Pass down" of state information can be difficult if security restrictions require total logouts and blank screens - hours or even days of valuable history and context are lost. As work proceeds, duplicative effort is common because it is typically easier for operators to "do it over" rather than share what others have already done. As we begin a major new system version, we are refactoring the user interface to reduce time and motion losses. Working with users, we are developing "click budgets" to streamline interface use. One basic function is shared clipboards to reduce the use of sticky notes and verbal communication of data strings. We illustrate two additional designs to share work: window copying and window sharing. Copying (technically, shallow or deep object cloning) allows any system user to duplicate a window and configuration for themselves or another to use. Sharing allows a window to have multiple users: shareholders with read-write functionality and viewers with read-only. These solutions would allow windows to persist across multiple shifts, with a rotating cast of shareholders and viewers. Windows thus become durable objects of shared effort and persistent state. While these are low-tech functions, the cumulative labor savings in a 24X7 crew position (525,000 minutes/year spread over multiple individuals) would be significant. New design and implementation is never free and these investments typically do not appeal to government acquisition officers with short-term acquisition-cost concerns rather than a long-term O and M (operations and maintenance) perspective. We share some successes in educating some officers, in collaboration with system users, about the human capital involved in operating the systems they are acquiring.
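
    The distinction between shallow and deep object cloning mentioned above is illustrated by the sketch below with a hypothetical window-configuration dictionary; it is not code from the operational system.

        # Shallow vs. deep cloning of a hypothetical window configuration: a shallow
        # copy shares nested objects (here, the filter list), so edits by one operator
        # would leak into the other's window; a deep copy is fully independent.
        import copy

        window = {"title": "Pass 42 monitor", "filters": ["band=IR", "elev>10deg"]}

        shallow = copy.copy(window)
        deep = copy.deepcopy(window)

        window["filters"].append("quality=good")

        print(shallow["filters"])  # ['band=IR', 'elev>10deg', 'quality=good'] -- shared list
        print(deep["filters"])     # ['band=IR', 'elev>10deg'] -- independent copy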

  11. Using Workflows to Explore and Optimise Named Entity Recognition for Chemistry

    PubMed Central

    Kolluru, BalaKrishna; Hawizy, Lezan; Murray-Rust, Peter; Tsujii, Junichi; Ananiadou, Sophia

    2011-01-01

    Chemistry text mining tools should be interoperable and adaptable regardless of system-level implementation, installation or even programming issues. We aim to abstract the functionality of these tools from the underlying implementation via reconfigurable workflows for automatically identifying chemical names. To achieve this, we refactored an established named entity recogniser (in the chemistry domain), OSCAR, and studied the impact of each component on the net performance. We developed two reconfigurable workflows from OSCAR using an interoperable text mining framework, U-Compare. These workflows can be altered using the drag-&-drop mechanism of the graphical user interface of U-Compare. These workflows also provide a platform to study the relationship between text mining components such as tokenisation and named entity recognition (using maximum entropy Markov model (MEMM) and pattern recognition based classifiers). Results indicate that, for chemistry in particular, tokenisation techniques that eliminate noise lead to slightly better performance than others, in terms of named entity recognition (NER) accuracy. Poor tokenisation translates into poorer input to the classifier components, which in turn leads to an increase in Type I or Type II errors, thus lowering the overall performance. On the Sciborg corpus, the workflow based system, which uses a new tokeniser whilst retaining the same MEMM component, increases the F-score from 82.35% to 84.44%. On the PubMed corpus, it recorded an F-score of 84.84% as against 84.23% by OSCAR. PMID:21633495
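
    The impact of tokenisation on downstream recognition can be seen in a deliberately small sketch: a punctuation-splitting tokeniser fragments a chemical name so that a dictionary matcher can no longer find it. The tokenisers and dictionary below are toy examples, not OSCAR components.

        # Toy illustration: a punctuation-splitting tokeniser fragments chemical names,
        # so a simple dictionary matcher misses them; a whitespace tokeniser keeps the
        # name intact. (Toy code only -- not OSCAR's tokenisers or MEMM classifier.)
        import re

        dictionary = {"2,4-dinitrophenol", "ethanol"}
        sentence = "The sample contained 2,4-dinitrophenol dissolved in ethanol ."

        def whitespace_tokenise(text):
            return text.split()

        def punctuation_tokenise(text):
            return [t for t in re.split(r"[\s,\-]+", text) if t]

        for tokenise in (whitespace_tokenise, punctuation_tokenise):
            tokens = tokenise(sentence)
            hits = [t for t in tokens if t in dictionary]
            print(tokenise.__name__, "->", hits)
        # whitespace_tokenise -> ['2,4-dinitrophenol', 'ethanol']
        # punctuation_tokenise -> ['ethanol']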

  12. State-transition diagrams for biologists.

    PubMed

    Bersini, Hugues; Klatzmann, David; Six, Adrien; Thomas-Vaslin, Véronique

    2012-01-01

    It is clearly in the tradition of biologists to conceptualize the dynamical evolution of biological systems in terms of state-transitions of biological objects. This paper is mainly concerned with (but obviously not limited to) the immunological branch of biology and shows how the adoption of UML (Unified Modeling Language) state-transition diagrams can ease the modeling, the understanding, the coding, the manipulation or the documentation of population-based immune software models generally defined as a set of ordinary differential equations (ODE), describing the evolution in time of populations of various biological objects. Moreover, that same UML adoption naturally entails a far from negligible representational economy since one graphical item of the diagram might have to be repeated in various places of the mathematical model. First, the main graphical elements of the UML state-transition diagram and how they can be mapped onto a corresponding ODE mathematical model are presented. Then, two already published immune models of thymocyte behavior and time evolution in the thymus, the first one originally conceived as an ODE population-based model whereas the second one as an agent-based one, are refactored and expressed in a state-transition form so as to make them much easier to understand and their respective code easier to access, to modify and run. As an illustrative proof, for any immunologist, it should be possible to understand faithfully enough what the two software models are supposed to reproduce and how they execute with no need to plunge into the Java or Fortran lines.

  13. EASEE: an open architecture approach for modeling battlespace signal and sensor phenomenology

    NASA Astrophysics Data System (ADS)

    Waldrop, Lauren E.; Wilson, D. Keith; Ekegren, Michael T.; Borden, Christian T.

    2017-04-01

    Open architecture in the context of defense applications encourages collaboration across government agencies and academia. This paper describes a success story in the implementation of an open architecture framework that fosters transparency and modularity in the context of Environmental Awareness for Sensor and Emitter Employment (EASEE), a complex physics-based software package for modeling the effects of terrain and atmospheric conditions on signal propagation and sensor performance. Among the highlighted features in this paper are: (1) a code refactorization to separate sensitive parts of EASEE, thus allowing collaborators the opportunity to view and interact with non-sensitive parts of the EASEE framework with the end goal of supporting collaborative innovation, (2) a data exchange and validation effort to enable the dynamic addition of signatures within EASEE, thus supporting a modular notion that components can be easily added to or removed from the software without requiring recompilation by developers, and (3) a flexible and extensible XML interface, which aids in decoupling graphical user interfaces from EASEE's calculation engine, and thus encourages adaptability to many different defense applications. In addition to the outlined points above, this paper also addresses EASEE's ability to interface with proprietary systems such as ArcGIS. A specific use case regarding the implementation of an ArcGIS toolbar that leverages EASEE's XML interface and enables users to set up an EASEE-compliant configuration for probability of detection or optimal sensor placement calculations in various modalities is discussed as well.
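
    As a schematic of how an XML interface can decouple a GUI from a calculation engine, the sketch below builds and parses a small, entirely hypothetical configuration document with Python's standard library; the element and attribute names are invented and do not reflect EASEE's actual schema.

        # Hypothetical configuration exchanged between a GUI and a calculation engine;
        # the element and attribute names are invented, not EASEE's schema.
        import xml.etree.ElementTree as ET

        run = ET.Element("calculationRequest", {"type": "probabilityOfDetection"})
        sensor = ET.SubElement(run, "sensor", {"modality": "acoustic"})
        ET.SubElement(sensor, "location", {"lat": "43.72", "lon": "-72.27"})
        ET.SubElement(run, "terrain", {"source": "dem_tile_042.tif"})

        xml_text = ET.tostring(run, encoding="unicode")
        print(xml_text)

        # The engine side only needs the XML, not the GUI that produced it:
        parsed = ET.fromstring(xml_text)
        print(parsed.get("type"), parsed.find("sensor").get("modality"))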

  14. State-Transition Diagrams for Biologists

    PubMed Central

    Bersini, Hugues; Klatzmann, David; Six, Adrien; Thomas-Vaslin, Véronique

    2012-01-01

    It is clearly in the tradition of biologists to conceptualize the dynamical evolution of biological systems in terms of state-transitions of biological objects. This paper is mainly concerned with (but obviously not limited to) the immunological branch of biology and shows how the adoption of UML (Unified Modeling Language) state-transition diagrams can ease the modeling, the understanding, the coding, the manipulation or the documentation of population-based immune software models generally defined as a set of ordinary differential equations (ODE), describing the evolution in time of populations of various biological objects. Moreover, that same UML adoption naturally entails a far from negligible representational economy since one graphical item of the diagram might have to be repeated in various places of the mathematical model. First, the main graphical elements of the UML state-transition diagram and how they can be mapped onto a corresponding ODE mathematical model are presented. Then, two already published immune models of thymocyte behavior and time evolution in the thymus, the first one originally conceived as an ODE population-based model whereas the second one as an agent-based one, are refactored and expressed in a state-transition form so as to make them much easier to understand and their respective code easier to access, to modify and run. As an illustrative proof, for any immunologist, it should be possible to understand faithfully enough what the two software models are supposed to reproduce and how they execute with no need to plunge into the Java or Fortran lines. PMID:22844438

  15. muBLASTP: database-indexed protein sequence search on multicore CPUs.

    PubMed

    Zhang, Jing; Misra, Sanchit; Wang, Hao; Feng, Wu-Chun

    2016-11-04

    The Basic Local Alignment Search Tool (BLAST) is a fundamental program in the life sciences that searches databases for sequences that are most similar to a query sequence. Currently, the BLAST algorithm utilizes a query-indexed approach. Although many approaches suggest that sequence search with a database index can achieve much higher throughput (e.g., BLAT, SSAHA, and CAFE), they cannot deliver the same level of sensitivity as the query-indexed BLAST, i.e., NCBI BLAST, or they can only support nucleotide sequence search, e.g., MegaBLAST. Due to the different challenges and characteristics of query indexing and database indexing, the existing techniques for query-indexed search cannot be applied to database-indexed search. muBLASTP, a novel database-indexed BLAST for protein sequence search, delivers hits identical to those returned by NCBI BLAST. On Intel Haswell multicore CPUs, for a single query, the single-threaded muBLASTP achieves up to a 4.41-fold speedup for the alignment stages, and up to a 1.75-fold end-to-end speedup over single-threaded NCBI BLAST. For a batch of queries, the multithreaded muBLASTP achieves up to a 5.7-fold speedup for the alignment stages, and up to a 4.56-fold end-to-end speedup over multithreaded NCBI BLAST. With a newly designed index structure for the protein database and associated optimizations to the BLASTP algorithm, we re-factored BLASTP for modern multicore processors, achieving much higher throughput with an acceptable memory footprint for the database index.
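
    Database indexing of the kind described can be sketched with a toy k-mer index built in plain Python; the real muBLASTP index structure and its seeding and extension stages are far more elaborate and are not reproduced here.

        # Toy database index: map every k-mer in the protein database to the positions
        # where it occurs, so query seeds can be looked up directly instead of
        # scanning each subject sequence.
        from collections import defaultdict

        K = 3
        database = {"seq1": "MKTAYIAKQR", "seq2": "MKTWWAYIAK"}   # toy sequences

        index = defaultdict(list)
        for seq_id, seq in database.items():
            for pos in range(len(seq) - K + 1):
                index[seq[pos:pos + K]].append((seq_id, pos))

        query = "AYIAK"
        for pos in range(len(query) - K + 1):
            kmer = query[pos:pos + K]
            print(kmer, "->", index.get(kmer, []))   # seed hits to extend into alignments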

  16. Using workflows to explore and optimise named entity recognition for chemistry.

    PubMed

    Kolluru, Balakrishna; Hawizy, Lezan; Murray-Rust, Peter; Tsujii, Junichi; Ananiadou, Sophia

    2011-01-01

    Chemistry text mining tools should be interoperable and adaptable regardless of system-level implementation, installation or even programming issues. We aim to abstract the functionality of these tools from the underlying implementation via reconfigurable workflows for automatically identifying chemical names. To achieve this, we refactored an established named entity recogniser (in the chemistry domain), OSCAR, and studied the impact of each component on the net performance. We developed two reconfigurable workflows from OSCAR using an interoperable text mining framework, U-Compare. These workflows can be altered using the drag-&-drop mechanism of the graphical user interface of U-Compare. These workflows also provide a platform to study the relationship between text mining components such as tokenisation and named entity recognition (using maximum entropy Markov model (MEMM) and pattern recognition based classifiers). Results indicate that, for chemistry in particular, tokenisation techniques that eliminate noise lead to slightly better performance than others, in terms of named entity recognition (NER) accuracy. Poor tokenisation translates into poorer input to the classifier components, which in turn leads to an increase in Type I or Type II errors, thus lowering the overall performance. On the Sciborg corpus, the workflow based system, which uses a new tokeniser whilst retaining the same MEMM component, increases the F-score from 82.35% to 84.44%. On the PubMed corpus, it recorded an F-score of 84.84% as against 84.23% by OSCAR.

  17. Production of mono- and sesquiterpenes in Camelina sativa oilseed.

    PubMed

    Augustin, Jörg M; Higashi, Yasuhiro; Feng, Xiaohong; Kutchan, Toni M

    2015-09-01

    Camelina was bioengineered to accumulate (4S)-limonene and (+)-δ-cadinene in seed. Plastidic localization of the recombinant enzymes resulted in higher yields than cytosolic localization. Overexpressing 1-deoxy-D-xylulose-5-phosphate synthase (DXS) further increased terpene accumulation. Many plant-derived compounds of high value for industrial or pharmaceutical applications originate from plant species that are not amenable to cultivation. Biotechnological production in low-input organisms is an attractive alternative. Several microbes are well established as biotechnological production platforms; however, their growth requires fermentation units, energy input, and nutrients. Plant-based production systems potentially allow the generation of high-value compounds on arable land with minimal input. Here we explore whether Camelina sativa (camelina), an emerging low-input non-foodstuff Brassicaceae oilseed crop grown on marginal lands or as a rotation crop on fallow land, can successfully be refactored to produce and store novel compounds in seed. As proof-of-concept, we use the cyclic monoterpene hydrocarbon (4S)-limonene and the bicyclic sesquiterpene hydrocarbon (+)-δ-cadinene, which have potential biofuel and industrial solvent applications. Post-translational translocation of the recombinant enzymes to the plastid with concurrent overexpression of 1-deoxy-D-xylulose-5-phosphate synthase (DXS) resulted in the accumulation of (4S)-limonene and (+)-δ-cadinene up to 7 mg g^-1 seed and 5 mg g^-1 seed, respectively. This study presents the framework for rapid engineering of camelina oilseed production platforms for terpene-based high-value compounds.

  18. Assessing data assimilation and model boundary error strategies for high resolution ocean model downscaling in the Northern North Sea

    NASA Astrophysics Data System (ADS)

    Sandvig Mariegaard, Jesper; Huiban, Méven Robin; Tornfeldt Sørensen, Jacob; Andersson, Henrik

    2017-04-01

    Determining the optimal domain size and associated position of open boundaries in local high-resolution downscaling ocean models is often difficult. As an important input data set for downscaling ocean modelling, the European Copernicus Marine Environment Monitoring Service (CMEMS) provides baroclinic initial and boundary conditions for local ocean models. Tidal dynamics is often neglected in CMEMS services at large scale, but tides are generally crucial for coastal ocean dynamics. To address this need, tides can be superposed via Flather (1976) boundary conditions and the combined flow downscaled using an unstructured mesh. The surge component is also only partially represented in selected CMEMS products and must either be modelled inside the domain or modelled independently and superposed if the domain becomes too small to model the effect in the downscaling model. The tide and surge components can generally be improved by assimilating water level from tide gauge and altimetry data. An intrinsic part of the problem is to find the limitations of local scale data assimilation and the requirement for consistency between the larger scale ocean models and the local scale assimilation methodologies. This contribution investigates the impact of domain size and associated positions of open boundaries with and without data assimilation of water level. We have used the baroclinic ocean model, MIKE 3 FM, and its newly re-factored built-in data assimilation package. We consider boundary conditions of salinity, temperature, water level and depth varying currents from the Global CMEMS 1/4 degree resolution model from 2011, where in situ ADCP velocity data is available for validation. We apply data assimilation of in-situ tide gauge water levels and along track altimetry surface elevation data from selected satellites. The MIKE 3 FM data assimilation model, which uses the Ensemble Kalman filter, has recently been parallelized with MPI, allowing for much larger applications running on HPC. The success of the downscaling is to a large degree determined by the ability to realistically describe and dynamically model the errors on the open boundaries. Three different sizes of downscaling model domains in the Northern North Sea have been examined and two different strategies for modelling the uncertainties on the open Flather boundaries are investigated. The combined downscaling and local data assimilation skill is assessed and the impact on recommended domain size is compared to pure downscaling.
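
    The Ensemble Kalman filter mentioned above can be illustrated with a generic, textbook-style analysis step using perturbed observations; the numpy sketch below uses made-up ensemble sizes and a simple observation operator, and is not the MIKE 3 FM implementation.

        # Generic (stochastic) EnKF analysis step: update an ensemble of model states
        # with perturbed observations, X_a = X_f + K (D - H X_f),
        # where K = P H^T (H P H^T + R)^-1 and P is estimated from the ensemble.
        import numpy as np

        rng = np.random.default_rng(0)
        n_state, n_obs, n_ens = 50, 5, 20            # illustrative sizes

        X = rng.standard_normal((n_state, n_ens))    # forecast ensemble (columns = members)
        H = np.zeros((n_obs, n_state))
        H[np.arange(n_obs), np.arange(n_obs) * 10] = 1.0   # observe every 10th state variable
        R = 0.1 * np.eye(n_obs)                      # observation error covariance
        y = rng.standard_normal(n_obs)               # observations (e.g. tide gauges, altimetry)

        Xm = X.mean(axis=1, keepdims=True)
        A = X - Xm                                   # ensemble anomalies
        P_HT = A @ (H @ A).T / (n_ens - 1)           # P H^T estimated from the ensemble
        S = H @ P_HT + R                             # innovation covariance H P H^T + R
        K = P_HT @ np.linalg.inv(S)                  # Kalman gain

        D = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T  # perturbed obs
        X_analysis = X + K @ (D - H @ X)
        print(X_analysis.shape)                      # (50, 20) updated ensemble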

  19. The PhEDEx next-gen website

    NASA Astrophysics Data System (ADS)

    Egeland, R.; Huang, C.-H.; Rossman, P.; Sundarrajan, P.; Wildish, T.

    2012-12-01

    PhEDEx is the data-transfer management solution written by CMS. It consists of agents running at each site, a website for presentation of information, and a web-based data-service for scripted access to information. The website allows users to monitor the progress of data-transfers, the status of site agents and links between sites, and the overall status and behaviour of everything about PhEDEx. It also allows users to make and approve requests for data-transfers and for deletion of data. It is the main point-of-entry for all users wishing to interact with PhEDEx. For several years, the website has consisted of a single perl program with about 10K SLOC. This program has limited capabilities for exploring the data, with only coarse filtering capabilities and no context-sensitive awareness. Graphical information is presented as static images, generated on the server, with no interactivity. It is also not well connected to the rest of the PhEDEx codebase, since much of it was written before the data-service was developed. All this makes it hard to maintain and extend. We are re-implementing the website to address these issues. The UI is being rewritten in Javascript, replacing most of the server-side code. We are using the YUI toolkit to provide advanced features and context-sensitive interaction, and will adopt a Javascript charting library for generating graphical representations client-side. This relieves the server of much of its load, and automatically improves server-side security. The Javascript components can be re-used in many ways, allowing custom pages to be developed for specific uses. In particular, standalone test-cases using small numbers of components make it easier to debug the Javascript than it is to debug a large server program. Information about PhEDEx is accessed through the PhEDEx data-service, since direct SQL is not available from the clients’ browser. This provides consistent semantics with other, externally written monitoring tools, which already use the data-service. It also reduces redundancy in the code, yielding a simpler, consolidated codebase. In this talk we describe our experience of re-factoring this monolithic server-side program into a lighter client-side framework. We describe some of the techniques that worked well for us, and some of the mistakes we made along the way. We present the current state of the project, and its future direction.

  20. Software framework for automatic learning of telescope operation

    NASA Astrophysics Data System (ADS)

    Rodríguez, Jose A.; Molgó, Jordi; Guerra, Dailos

    2016-07-01

    The "Gran Telescopio de Canarias" (GTC) is an optical-infrared 10-meter segmented mirror telescope at the ORM observatory in Canary Islands (Spain). The GTC Control System (GCS) is a distributed object and component oriented system based on RT-CORBA and it is responsible for the operation of the telescope, including its instrumentation. The current development state of GCS is mature and fully operational. On the one hand telescope users as PI's implement the sequences of observing modes of future scientific instruments that will be installed in the telescope and operators, in turn, design their own sequences for maintenance. On the other hand engineers develop new components that provide new functionality required by the system. This great work effort is possible to minimize so that costs are reduced, especially if one considers that software maintenance is the most expensive phase of the software life cycle. Could we design a system that allows the progressive assimilation of sequences of operation and maintenance of the telescope, through an automatic self-programming system, so that it can evolve from one Component oriented organization to a Service oriented organization? One possible way to achieve this is to use mechanisms of learning and knowledge consolidation to reduce to the minimum expression the effort to transform the specifications of the different telescope users to the operational deployments. This article proposes a framework for solving this problem based on the combination of the following tools: data mining, self-Adaptive software, code generation, refactoring based on metrics, Hierarchical Agglomerative Clustering and Service Oriented Architectures.

  1. Generic, Extensible, Configurable Push-Pull Framework for Large-Scale Science Missions

    NASA Technical Reports Server (NTRS)

    Foster, Brian M.; Chang, Albert Y.; Freeborn, Dana J.; Crichton, Daniel J.; Woollard, David M.; Mattmann, Chris A.

    2011-01-01

    The push-pull framework was developed in hopes that an infrastructure would be created that could connect to virtually any given remote site, and (given a set of restrictions) download files from that remote site based on those restrictions. The Cataloging and Archiving Service (CAS) has recently been re-architected and re-factored in its canonical services, including file management, workflow management, and resource management. Additionally, a generic CAS Crawling Framework was built based on motivation from Apache's open-source search engine project called Nutch. Nutch is an Apache effort to provide search engine services (akin to Google), including crawling, parsing, content analysis, and indexing. It has produced several stable software releases, and is currently used in production services at companies such as Yahoo, and at NASA's Planetary Data System. The CAS Crawling Framework supports many of the Nutch Crawler's generic services, including metadata extraction, crawling, and ingestion. However, one service that was not ported over from Nutch is a generic protocol layer service that allows the Nutch crawler to obtain content using protocol plug-ins that download content using implementations of remote protocols, such as HTTP, FTP, WinNT file system, HTTPS, etc. Such a generic protocol layer would greatly aid the CAS Crawling Framework, as the layer would allow the framework to generically obtain content (i.e., data products) from remote sites using protocols such as FTP and others. Augmented with this capability, the Orbiting Carbon Observatory (OCO) and NPP (NPOESS Preparatory Project) Sounder PEATE (Product Evaluation and Analysis Tools Elements) would be provided with an infrastructure to support generic FTP-based pull access to remote data products, obviating the need for any specialized software outside of the context of their existing process control systems. This extensible, configurable framework was created in Java, and allows the use of different underlying communication middleware (at present, both XMLRPC and RMI). In addition, the framework is entirely suitable in a multi-mission environment and is supporting both NPP Sounder PEATE and the OCO Mission. Both systems involve tasks such as high-throughput job processing, terabyte-scale data management, and science computing facilities. NPP Sounder PEATE is already using the push-pull framework to accept hundreds of gigabytes of IASI (infrared atmospheric sounding interferometer) data, and is in preparation to accept CRIMS (Cross-track Infrared Microwave Sounding Suite) data. OCO will leverage the framework to download MODIS, CloudSat, and other ancillary data products for use in the high-performance Level 2 Science Algorithm. The National Cancer Institute is also evaluating the framework for use in sharing and disseminating cancer research data through its Early Detection Research Network (EDRN).
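
    A minimal sketch of the kind of pluggable protocol layer described above, written in Python with hypothetical interface names (the actual framework is in Java, and this is not its API). An FTP plug-in is shown using the standard ftplib module; a crawler would select the plug-in by URL scheme and call it without knowing the underlying protocol.

        from abc import ABC, abstractmethod
        from ftplib import FTP

        class RemoteProtocol(ABC):
            # Hypothetical plug-in interface for fetching remote data products.

            @abstractmethod
            def list(self, path: str) -> list[str]: ...

            @abstractmethod
            def download(self, path: str, dest: str) -> None: ...

        class FtpProtocol(RemoteProtocol):
            def __init__(self, host: str, user: str = "anonymous", passwd: str = ""):
                self.ftp = FTP(host)
                self.ftp.login(user, passwd)

            def list(self, path: str) -> list[str]:
                return self.ftp.nlst(path)

            def download(self, path: str, dest: str) -> None:
                with open(dest, "wb") as fh:
                    self.ftp.retrbinary(f"RETR {path}", fh.write)

        # A push-pull crawler would map URL schemes (ftp://, http://, ...) to plug-ins
        # and call list()/download() generically for each remote site.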

  2. Cooperative Work and Sustainable Scientific Software Practices in R

    NASA Astrophysics Data System (ADS)

    Weber, N.

    2013-12-01

    Most scientific software projects are dependent on the work of many diverse people, institutions and organizations. Incentivizing these actors to cooperatively develop software that is both reliable and sustainable is complicated by the fact that the reward structures of these various actors greatly differ: research scientists want results from a software or model run in order to publish papers, produce new data, or test a hypothesis; software engineers and research centers want compilable, well-documented code that is refactorable, reusable and reproducible in future research scenarios. While much research has been done on incentives and motivations for participating in open source software projects or cyberinfrastructure development, little work has been done on what motivates or incentivizes developers to maintain scientific software projects beyond their original application. This poster will present early results of research into the incentives and motivation for cooperative scientific software development. In particular, this work focuses on motivations for the maintenance and repair of libraries on the software platform R. Our work here uses a sample of R packages that were created by research centers, or are specific to earth, environmental and climate science applications. We first mined 'check' logs from the Comprehensive R Archive Network (CRAN) to determine the amount of time a package has existed, the number of versions it has gone through over this time, the number of releases, and finally the contact information for each official package 'maintainer'. We then sent a survey to each official maintainer, asking them questions about what role they played in developing the original package, and what their motivations were for sustaining the project over time. We will present early results from this mining and our survey of R maintainers.

  3. Pattern matching through Chaos Game Representation: bridging numerical and discrete data structures for biological sequence analysis

    PubMed Central

    2012-01-01

    Background Chaos Game Representation (CGR) is an iterated function that bijectively maps discrete sequences into a continuous domain. As a result, discrete sequences can be the object of statistical and topological analyses otherwise reserved to numerical systems. Characteristically, CGR coordinates of substrings sharing an L-long suffix will be located within 2^(-L) distance of each other. In the two decades since its original proposal, CGR has been generalized beyond its original focus on genomic sequences and has been successfully applied to a wide range of problems in bioinformatics. This report explores the possibility that it can be further extended to approach algorithms that rely on discrete, graph-based representations. Results The exploratory analysis described here consisted of selecting foundational string problems and refactoring them using CGR-based algorithms. We found that CGR can take the role of suffix trees and emulate sophisticated string algorithms, efficiently solving exact and approximate string matching problems such as finding all palindromes and tandem repeats, and matching with mismatches. The common feature of these problems is that they use longest common extension (LCE) queries as subtasks of their procedures, which we show to have a constant time solution with CGR. Additionally, we show that CGR can be used as a rolling hash function within the Rabin-Karp algorithm. Conclusions The analysis of biological sequences relies on algorithmic foundations facing mounting challenges, both logistic (performance) and analytical (lack of unifying mathematical framework). CGR is found to provide the latter and to promise the former: graph-based data structures for sequence analysis operations are entailed by numerical-based data structures produced by CGR maps, providing a unifying analytical framework for a diversity of pattern matching problems. PMID:22551152
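
    For readers unfamiliar with the map itself, a minimal Python sketch of the standard CGR iteration for DNA sequences (one common corner convention; illustrative only, not the authors' code). It shows why substrings sharing an L-long suffix end up within 2^(-L) of each other: each symbol halves the distance to its corner.

        # Chaos Game Representation of a DNA sequence: each successive symbol moves the
        # current point halfway toward that symbol's corner of the unit square, so the
        # final point encodes the suffix of the sequence.
        CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

        def cgr(seq, start=(0.5, 0.5)):
            x, y = start
            for base in seq:
                cx, cy = CORNERS[base]
                x, y = (x + cx) / 2.0, (y + cy) / 2.0
            return x, y

        # Two strings ending in the same 4-long suffix "ACGT" map to points within
        # 2**-4 = 0.0625 of each other in each coordinate.
        p = cgr("TTTTACGT")
        q = cgr("GGGGACGT")
        print(p, q)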

  4. QCD evolution of (un)polarized gluon TMDPDFs and the Higgs q_T-distribution

    NASA Astrophysics Data System (ADS)

    Echevarria, Miguel G.; Kasemets, Tomas; Mulders, Piet J.; Pisano, Cristian

    2015-07-01

    We provide the proper definition of all the leading-twist (un)polarized gluon transverse momentum dependent parton distribution functions (TMDPDFs), by considering the Higgs boson transverse momentum distribution in hadron-hadron collisions and deriving the factorization theorem in terms of them. We show that the evolution of all the (un)polarized gluon TMDPDFs is driven by a universal evolution kernel, which can be resummed up to next-to-next-to-leading-logarithmic accuracy. Considering the proper definition of gluon TMDPDFs, we perform an explicit next-to-leading-order calculation of the unpolarized (f_1^g), linearly polarized (h_1^{⊥g}) and helicity (g_{1L}^g) gluon TMDPDFs, and show that, as expected, they are free from rapidity divergences. As a byproduct, we obtain the Wilson coefficients of the refactorization of these TMDPDFs at large transverse momentum. In particular, the coefficient of g_{1L}^g, which has never been calculated before, constitutes a new and necessary ingredient for a reliable phenomenological extraction of this quantity, for instance at RHIC or the future AFTER@LHC or Electron-Ion Collider. The coefficients of f_1^g and h_1^{⊥g} have never been calculated in the present formalism, although they could be obtained by carefully collecting and recasting previous results in the new TMD formalism. We apply these results to analyze the contribution of linearly polarized gluons at different scales, relevant, for instance, for the inclusive production of the Higgs boson and the C-even pseudoscalar bottomonium state η_b. Applying our resummation scheme we finally provide predictions for the Higgs boson q_T-distribution at the LHC.

  5. The PhEDEx next-gen website

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egeland, R.; Huang, C. H.; Rossman, P.

    PhEDEx is the data-transfer management solution written by CMS. It consists of agents running at each site, a website for presentation of information, and a web-based data-service for scripted access to information. The website allows users to monitor the progress of data-transfers, the status of site agents and links between sites, and the overall status and behaviour of everything about PhEDEx. It also allows users to make and approve requests for data-transfers and for deletion of data. It is the main point-of-entry for all users wishing to interact with PhEDEx. For several years, the website has consisted of a single perl program with about 10K SLOC. This program has limited capabilities for exploring the data, with only coarse filtering capabilities and no context-sensitive awareness. Graphical information is presented as static images, generated on the server, with no interactivity. It is also not well connected to the rest of the PhEDEx codebase, since much of it was written before the data-service was developed. All this makes it hard to maintain and extend. We are re-implementing the website to address these issues. The UI is being rewritten in Javascript, replacing most of the server-side code. We are using the YUI toolkit to provide advanced features and context-sensitive interaction, and will adopt a Javascript charting library for generating graphical representations client-side. This relieves the server of much of its load, and automatically improves server-side security. The Javascript components can be re-used in many ways, allowing custom pages to be developed for specific uses. In particular, standalone test-cases using small numbers of components make it easier to debug the Javascript than it is to debug a large server program. Information about PhEDEx is accessed through the PhEDEx data-service, since direct SQL is not available from the clients' browser. This provides consistent semantics with other, externally written monitoring tools, which already use the data-service. It also reduces redundancy in the code, yielding a simpler, consolidated codebase. In this talk we describe our experience of re-factoring this monolithic server-side program into a lighter client-side framework. We describe some of the techniques that worked well for us, and some of the mistakes we made along the way. We present the current state of the project, and its future direction.

  6. Trident: scalable compute archives: workflows, visualization, and analysis

    NASA Astrophysics Data System (ADS)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Kotulla, Ralf; Henschel, Robert; Harbeck, Daniel

    2016-08-01

    The Astronomy scientific community has embraced Big Data processing challenges, e.g. associated with time-domain astronomy, and come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era requires new and equally efficient tools: modern user interfaces for searching, identifying and viewing data online without direct access to the data; tracking of data provenance; searching, plotting and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal user overhead or expertise even to novice computing users. The Trident project at Indiana University offers a comprehensive web and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic about handling/visualizing their domain's data products and about executing their pipelines and application work flows. Trident's microservices architecture is made up of light-weight services connected by a REST API and/or a message bus; web interface elements are built using NodeJS, AngularJS, and HighCharts JavaScript libraries among others, while backend services are written in NodeJS, PHP/Zend, and Python. The software suite currently consists of (1) a simple work flow execution framework to integrate, deploy, and execute pipelines and applications, (2) a progress service to monitor work flows and sub-work flows, (3) ImageX, an interactive image visualization service, (4) an authentication and authorization service, (5) a data service that handles archival, staging and serving of data products, and (6) a notification service that serves statistical collation and reporting needs of various projects. Several other additional components are under development. Trident is an umbrella project that evolved from the One Degree Imager, Portal, Pipeline, and Archive (ODI-PPA) project, which we had initially refactored toward (1) a powerful analysis/visualization portal for Globular Cluster System (GCS) survey data collected by IU researchers, (2) a data search and download portal for the IU Electron Microscopy Center's data (EMC-SCA), and (3) a prototype archive for the Ludwig Maximilian University's Wide Field Imager. The new Trident software has been used to deploy (1) a metadata quality control and analytics portal (RADY-SCA) for DICOM formatted medical imaging data produced by the IU Radiology Center, (2) several prototype work flows for different domains, (3) a snapshot tool within IU's Karst Desktop environment, and (4) a limited component-set to serve GIS data within the IU GIS web portal. Trident SCA systems leverage supercomputing and storage resources at Indiana University but can be configured to make use of any cloud/grid resource, from local workstations/servers to (inter)national supercomputing facilities such as XSEDE.

  7. 76 FR 41246 - Pesticide Program Dialogue Committee, Pesticide Registration Improvement Act Process Improvement...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Committee, Pesticide Registration Improvement Act Process Improvement Workgroup; Notice of Public Meeting...) Process Improvement Work Group. EPA plans to meet its ESA consultation obligations through the pesticide... a pesticide during the registration review process. This meeting of the PRIA Process Improvement...

  8. Making process improvement 'stick'.

    PubMed

    Studer, Quint

    2014-06-01

    To sustain gains from a process improvement initiative, healthcare organizations should: Explain to staff why a process improvement initiative is needed. Encourage leaders within the organization to champion the process improvement, and tie their evaluations to its outcomes. Ensure that both leaders and employees have the skills to help sustain the sought-after process improvements.

  9. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  10. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
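
    As a minimal illustration of the underlying idea (hypothetical data, not the authors' model), pairwise correlations between process-element ratings collected across several assessments can be computed directly; strongly correlated elements are candidates to be treated together in an improvement plan.

        import pandas as pd

        # Hypothetical assessment results: rows are assessed projects, columns are
        # ratings (0-100) of CMMI practices / process elements.
        ratings = pd.DataFrame(
            {
                "requirements_mgmt": [60, 72, 55, 80, 65],
                "project_planning":  [58, 75, 50, 82, 63],
                "verification":      [40, 40, 70, 45, 75],
            }
        )

        # Pearson correlation between process elements across assessments.
        corr = ratings.corr()
        print(corr.round(2))

        # Pairs with high correlation (e.g. > 0.8) would be flagged so that an
        # improvement plan addresses them jointly rather than in isolation.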

  11. Improving the Simplified Acquisition of Base Engineering Requirements (SABER) Delivery Order Award Process: Results of a Process Improvement Plan

    DTIC Science & Technology

    1991-09-01

    putting all tasks directed towards achieving an outcome in sequence. The tasks can be viewed as steps in the process (39:2.3). Using this...improvement opportunity is investigated. A plan is developed, root causes are identified, and solutions are tested and implemented. The process is... solutions, check for actual improvement, and integrate the successful improvements into the process. ?UP 7. Check Improvement Performance. Finally, the

  12. Process safety improvement--quality and target zero.

    PubMed

    Van Scyoc, Karl

    2008-11-15

    Process safety practitioners have adopted quality management principles in the design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods, and explores how methods intended for product quality can be additionally applied to continual improvement of process safety. Methods such as Kaizen, Poka-yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on the methods, are given.

  13. SPARC: Demonstrate burst-buffer-based checkpoint/restart on ATS-1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oldfield, Ron A.; Ulmer, Craig D.; Widener, Patrick

    Recent high-performance computing (HPC) platforms such as the Trinity Advanced Technology System (ATS-1) feature burst buffer resources that can have a dramatic impact on an application’s I/O performance. While these non-volatile memory (NVM) resources provide a new tier in the storage hierarchy, developers must find the right way to incorporate the technology into their applications in order to reap the benefits. Similar to other laboratories, Sandia is actively investigating ways in which these resources can be incorporated into our existing libraries and workflows without burdening our application developers with excessive, platform-specific details. This FY18Q1 milestone summarizes our progress in adapting the Sandia Parallel Aerodynamics and Reentry Code (SPARC) in Sandia’s ATDM program to leverage Trinity’s burst buffers for checkpoint/restart operations. We investigated four different approaches with varying tradeoffs in this work: (1) simply updating the job script to use stage-in/stage-out burst buffer directives, (2) modifying SPARC to use LANL’s hierarchical I/O (HIO) library to store/retrieve checkpoints, (3) updating Sandia’s IOSS library to incorporate the burst buffer in all meshing I/O operations, and (4) modifying SPARC to use our Kelpie distributed memory library to store/retrieve checkpoints. Team members were successful in generating initial implementations for all four approaches, but were unable to obtain performance numbers in time for this report (reasons: initial problem sizes were not large enough to stress I/O, and the SPARC refactor will require changes to our code). When we presented our work to the SPARC team, they expressed the most interest in the second and third approaches. The HIO work was favored because it is lightweight, unobtrusive, and should be portable to ATS-2. The IOSS work is seen as a long-term solution, and is favored because all I/O work (including checkpoints) can be deferred to a single library.

  14. [Sustainable process improvement with application of 'lean philosophy'].

    PubMed

    Rouppe van der Voort, Marc B V; van Merode, G G Frits; Veraart, Henricus G N

    2013-01-01

    Process improvement is increasingly being implemented, particularly with the aid of 'lean philosophy'. This management philosophy aims to improve quality by reducing 'wastage'. Local improvements can produce negative effects elsewhere due to interdependence of processes. An 'integrated system approach' is required to prevent this. Some hospitals claim that this has been successful. Research into process improvement with the application of lean philosophy has reported many positive effects, defined as improved safety, quality and efficiency. Due to methodological shortcomings and lack of rigorous evaluations it is, however, not yet possible to determine the impact of this approach. It is, however, obvious that the investigated applications are fragmentary, with a dominant focus on the instrumental aspect of the philosophy and a lack of integration in a total system, and with insufficient attention to human aspects. Process improvement is required to achieve better and more goal-oriented healthcare. To achieve this, hospitals must develop integrated system approaches that combine methods for process design with continuous improvement of processes and with personnel management. It is crucial that doctors take the initiative to guide and improve processes in an integral manner.

  15. Improvement in Patient Transfer Process From the Operating Room to the PICU Using a Lean and Six Sigma-Based Quality Improvement Project.

    PubMed

    Gleich, Stephen J; Nemergut, Michael E; Stans, Anthony A; Haile, Dawit T; Feigal, Scott A; Heinrich, Angela L; Bosley, Christopher L; Tripathi, Sandeep

    2016-08-01

    Ineffective and inefficient patient transfer processes can increase the chance of medical errors. Improvements in such processes are high-priority local institutional and national patient safety goals. At our institution, nonintubated postoperative pediatric patients are first admitted to the postanesthesia care unit before transfer to the PICU. This quality improvement project was designed to improve the patient transfer process from the operating room (OR) to the PICU. After direct observation of the baseline process, we introduced a structured, direct OR-PICU transfer process for orthopedic spinal fusion patients. We performed value stream mapping of the process to determine error-prone and inefficient areas. We evaluated primary outcome measures of handoff error reduction and the overall efficiency of patient transfer process time. Staff satisfaction was evaluated as a counterbalance measure. With the introduction of the new direct OR-PICU patient transfer process, the handoff communication error rate improved from 1.9 to 0.3 errors per patient handoff (P = .002). Inefficiency (patient wait time and non-value-creating activity) was reduced from 90 to 32 minutes. Handoff content was improved with fewer information omissions (P < .001). Staff satisfaction significantly improved among nearly all PICU providers. By using quality improvement methodology to design and implement a new direct OR-PICU transfer process with a structured multidisciplinary verbal handoff, we achieved sustained improvements in patient safety and efficiency. Handoff communication was enhanced, with fewer errors and content omissions. The new process improved efficiency, with high staff satisfaction. Copyright © 2016 by the American Academy of Pediatrics.

  16. The theory, practice, and future of process improvement in general thoracic surgery.

    PubMed

    Freeman, Richard K

    2014-01-01

    Process improvement, in its broadest sense, is the analysis of a given set of actions with the aim of elevating quality and reducing costs. The tenets of process improvement have been applied to medicine in increasing frequency for at least the last quarter century including thoracic surgery. This review outlines the theory underlying process improvement, the currently available data sources for process improvement and possible future directions of research. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Process for improving metal production in steelmaking processes

    DOEpatents

    Pal, Uday B.; Gazula, Gopala K. M.; Hasham, Ali

    1996-01-01

    A process and apparatus for improving metal production in ironmaking and steelmaking processes is disclosed. The use of an inert metallic conductor in the slag-containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements.

  18. Case Studies in Continuous Process Improvement

    NASA Technical Reports Server (NTRS)

    Mehta, A.

    1997-01-01

    This study focuses on improving the SMT assembly process in a low-volume, high-reliability environment with emphasis on fine pitch and BGA packages. Before a process improvement is carried out, it is important to evaluate where the process stands in terms of process capability.

  19. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  20. 76 FR 19976 - Proposed Information Collection; Comment Request; Survey of EDA Grant Process Improvement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-11

    ...; Comment Request; Survey of EDA Grant Process Improvement AGENCY: Economic Development Administration.... In 2010, EDA made improvements in its grant application process. The proposed short survey of five to... improvements to the grant application process and to make any necessary adjustments. EDA would like to conduct...

  1. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding and systematic appraisal, and identify areas of improvement of a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier for efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase, parallel processing of data and correctly positioned process controls - should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  2. Process for improving metal production in steelmaking processes

    DOEpatents

    Pal, U.B.; Gazula, G.K.M.; Hasham, A.

    1996-06-18

    A process and apparatus for improving metal production in ironmaking and steelmaking processes is disclosed. The use of an inert metallic conductor in the slag-containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements. 6 figs.

  3. Applications of process improvement techniques to improve workflow in abdominal imaging.

    PubMed

    Tamm, Eric Peter

    2016-03-01

    Major changes in the management and funding of healthcare are underway that will markedly change the way radiology studies will be reimbursed. The result will be the need to deliver radiology services in a highly efficient manner while maintaining quality. The science of process improvement provides a practical approach to improve the processes utilized in radiology. This article will address in a step-by-step manner how to implement process improvement techniques to improve workflow in abdominal imaging.

  4. Process capability improvement through DMAIC for aluminum alloy wheel machining

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa; Babu, B. Surendra

    2017-07-01

    This paper first outlines the generic problems of alloy wheel machining and subsequently details the process improvement of the identified critical-to-quality machining characteristic of the A356 aluminum alloy wheel machining process. The causal factors are traced using the Ishikawa diagram and prioritization of corrective actions is done through process failure modes and effects analysis. Process monitoring charts are employed for improving the process capability index of the process, at the industrial benchmark four-sigma level, which corresponds to a value of 1.33. The procedure adopted for improving the process capability levels is the define-measure-analyze-improve-control (DMAIC) approach. By following the DMAIC approach, the Cp, Cpk and Cpm indices improved from initial values of 0.66, -0.24 and 0.27 to final values of 4.19, 3.24 and 1.41, respectively.
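
    For reference, the capability indices named above have standard textbook definitions; a minimal Python sketch with illustrative data and assumed specification limits (not the paper's measurements):

        import numpy as np

        def capability_indices(x, lsl, usl, target):
            # Standard short-term capability indices for a measured characteristic.
            mu, sigma = np.mean(x), np.std(x, ddof=1)
            cp  = (usl - lsl) / (6 * sigma)                       # potential capability
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)           # accounts for centering
            cpm = (usl - lsl) / (6 * np.sqrt(sigma**2 + (mu - target)**2))  # Taguchi index
            return cp, cpk, cpm

        # Illustrative wheel-machining measurements (mm) against assumed spec limits.
        rng = np.random.default_rng(1)
        x = rng.normal(loc=50.02, scale=0.01, size=200)
        print(capability_indices(x, lsl=49.95, usl=50.05, target=50.00))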

  5. Teaching the NIATx Model of Process Improvement as an Evidence-Based Process

    ERIC Educational Resources Information Center

    Evans, Alyson C.; Rieckmann, Traci; Fitzgerald, Maureen M.; Gustafson, David H.

    2007-01-01

    Process Improvement (PI) is an approach for helping organizations to identify and resolve inefficient and ineffective processes through problem solving and pilot testing change. Use of PI in improving client access, retention and outcomes in addiction treatment is on the rise through the teaching of the Network for the Improvement of Addiction…

  6. The use of process mapping in healthcare quality improvement projects.

    PubMed

    Antonacci, Grazia; Reed, Julie E; Lennox, Laura; Barlow, James

    2018-05-01

    Introduction Process mapping provides insight into systems and processes in which improvement interventions are introduced and is seen as useful in healthcare quality improvement projects. There is little empirical evidence on the use of process mapping in healthcare practice. This study advances understanding of the benefits and success factors of process mapping within quality improvement projects. Methods Eight quality improvement projects were purposively selected from different healthcare settings within the UK's National Health Service. Data were gathered from multiple data-sources, including interviews exploring participants' experience of using process mapping in their projects and perceptions of benefits and challenges related to its use. These were analysed using inductive analysis. Results Eight key benefits related to process mapping use were reported by participants (gathering a shared understanding of the reality; identifying improvement opportunities; engaging stakeholders in the project; defining project's objectives; monitoring project progress; learning; increased empathy; simplicity of the method) and five factors related to successful process mapping exercises (simple and appropriate visual representation, information gathered from multiple stakeholders, facilitator's experience and soft skills, basic training, iterative use of process mapping throughout the project). Conclusions Findings highlight benefits and versatility of process mapping and provide practical suggestions to improve its use in practice.

  7. Spatially-Distributed Stream Flow and Nutrient Dynamics Simulations Using the Component-Based AgroEcoSystem-Watershed (AgES-W) Model

    NASA Astrophysics Data System (ADS)

    Ascough, J. C.; David, O.; Heathman, G. C.; Smith, D. R.; Green, T. R.; Krause, P.; Kipka, H.; Fink, M.

    2010-12-01

    The Object Modeling System 3 (OMS3), currently being developed by the USDA-ARS Agricultural Systems Research Unit and Colorado State University (Fort Collins, CO), provides a component-based environmental modeling framework which allows the implementation of single- or multi-process modules that can be developed and applied as custom-tailored model configurations. OMS3 as a “lightweight” modeling framework contains four primary foundations: modeling resources (e.g., components) annotated with modeling metadata; domain specific knowledge bases and ontologies; tools for calibration, sensitivity analysis, and model optimization; and methods for model integration and performance scalability. The core is able to manage modeling resources and development tools for model and simulation creation, execution, evaluation, and documentation. OMS3 is based on the Java platform but is highly interoperable with C, C++, and FORTRAN on all major operating systems and architectures. The ARS Conservation Effects Assessment Project (CEAP) Watershed Assessment Study (WAS) Project Plan provides detailed descriptions of ongoing research studies at 14 benchmark watersheds in the United States. In order to satisfy the requirements of CEAP WAS Objective 5 (“develop and verify regional watershed models that quantify environmental outcomes of conservation practices in major agricultural regions”), a new watershed model development approach was initiated to take advantage of OMS3 modeling framework capabilities. Specific objectives of this study were to: 1) disaggregate and refactor various agroecosystem models (e.g., J2K-S, SWAT, WEPP) and implement hydrological, N dynamics, and crop growth science components under OMS3, 2) assemble a new modular watershed scale model for fully-distributed transfer of water and N loading between land units and stream channels, and 3) evaluate the accuracy and applicability of the modular watershed model for estimating stream flow and N dynamics. The Cedar Creek watershed (CCW) in northeastern Indiana, USA was selected for application of the OMS3-based AgroEcoSystem-Watershed (AgES-W) model. AgES-W performance for stream flow and N loading was assessed using Nash-Sutcliffe model efficiency (ENS) and percent bias (PBIAS) model evaluation statistics. Comparisons of daily and average monthly simulated and observed stream flow and N loads for the 1997-2005 simulation period resulted in PBIAS and ENS values that were similar or better than those reported in the literature for SWAT stream flow and N loading predictions at a similar scale. The results show that the AgES-W model was able to reproduce the hydrological and N dynamics of the CCW with sufficient quality, and should serve as a foundation upon which to better quantify additional water quality indicators (e.g., sediment transport and P dynamics) at the watershed scale.
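
    The two skill scores used above have standard definitions; a minimal Python sketch follows (one common sign convention for PBIAS is assumed, and the flow values are illustrative, not CCW data):

        import numpy as np

        def nash_sutcliffe(obs, sim):
            # ENS = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 is a perfect fit.
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def pbias(obs, sim):
            # Percent bias; with this convention, positive values indicate model underestimation.
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 100.0 * np.sum(obs - sim) / np.sum(obs)

        obs = [1.2, 3.4, 2.8, 5.1, 4.0]   # observed daily stream flow (m^3/s)
        sim = [1.0, 3.0, 3.1, 4.8, 4.4]   # simulated stream flow
        print(nash_sutcliffe(obs, sim), pbias(obs, sim))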

  8. GeoSciML version 3: A GML application for geologic information

    NASA Astrophysics Data System (ADS)

    International Union of Geological Sciences., I. C.; Richard, S. M.

    2011-12-01

    After 2 years of testing and development, the XML schemas for GeoSciML version 3 are now ready for application deployment. GeoSciML draws from many geoscience data modelling efforts to establish a common suite of feature types to represent information associated with geologic maps (materials, structures, and geologic units) and observations including structure data, samples, and chemical analyses. After extensive testing and use case analysis, in December 2008 the CGI Interoperability Working Group (IWG) released GeoSciML 2.0 as an application schema for basic geological information. GeoSciML 2.0 is in use to deliver geologic data by the OneGeology Europe portal, the Geological Survey of Canada Groundwater Information Network (wet GIN), and the Auscope Mineral Resources portal. GeoSciML version 3.0 is updated to OGC Geography Markup Language v3.2, with re-engineered patterns for association of element values with controlled vocabulary concepts, incorporation of ISO 19156 Observation and Measurement constructs for representing numeric and categorical values and for representing analytical data, incorporation of EarthResourceML to represent mineral occurrences and mines, incorporation of the GeoTime model to represent GSSPs and the stratigraphic time scale, and refactoring of the GeoSciML namespace to follow emerging ISO practices for decoupling of dependencies between standardized namespaces. These changes will make it easier for data providers to link to standard vocabulary and registry services. The depth and breadth of GeoSciML remains largely unchanged, covering the representation of geologic units, earth materials and geologic structures. ISO 19156 elements and patterns are used to represent sampling features such as boreholes and rock samples, as well as geochemical and geochronologic measurements. Geologic structures include shear displacement structures (brittle faults and ductile shears), contacts, folds, foliations, lineations and structures with no preferred orientation (e.g. 'miarolitic cavities'). The Earth material package allows for the description of both individual components, such as minerals, and compound materials, such as rocks or unconsolidated materials. Provision is made for alteration, weathering, metamorphism, particle geometry, fabric, and petrophysical descriptions. Mapped features describe the shape of the geological features using standard GML geometries, such as polygons, lines, points or 3D volumes. Geological events provide the age, process and environment of formation of geological features. The Earth Resource section includes features to represent mineral occurrences and mines and associated human activities independently. This addition allows description of resources and reserves that can comply with national and internationally accepted reporting codes. GeoSciML v3 is under consideration as the data model for INSPIRE annex 2 geologic reporting in Europe.

  9. PROCESS IMPROVEMENT STUDIES ON THE BATTELLE HYDROTHERMAL COAL PROCESS

    EPA Science Inventory

    The report gives results of a study to improve the economic viability of the Battelle Hydrothermal (HT) Coal Process by reducing the costs associated with liquid/solid separation and leachant regeneration. Laboratory experiments were conducted to evaluate process improvements for...

  10. Speaking the right language: the scientific method as a framework for a continuous quality improvement program within academic medical research compliance units.

    PubMed

    Nolte, Kurt B; Stewart, Douglas M; O'Hair, Kevin C; Gannon, William L; Briggs, Michael S; Barron, A Marie; Pointer, Judy; Larson, Richard S

    2008-10-01

    The authors developed a novel continuous quality improvement (CQI) process for academic biomedical research compliance administration. A challenge in developing a quality improvement program in a nonbusiness environment is that the terminology and processes are often foreign. Rather than training staff in an existing quality improvement process, the authors opted to develop a novel process based on the scientific method--a paradigm familiar to all team members. The CQI process included our research compliance units. Unit leaders identified problems in compliance administration where a resolution would have a positive impact and which could be resolved or improved with current resources. They then generated testable hypotheses about a change to standard practice expected to improve the problem, and they developed methods and metrics to assess the impact of the change. The CQI process was managed in a "peer review" environment. The program included processes to reduce the incidence of infections in animal colonies, decrease research protocol-approval times, improve compliance and protection of animal and human research subjects, and improve research protocol quality. This novel CQI approach is well suited to the needs and the unique processes of research compliance administration. Using the scientific method as the improvement paradigm fostered acceptance of the project by unit leaders and facilitated the development of specific improvement projects. These quality initiatives will allow us to improve support for investigators while ensuring that compliance standards continue to be met. We believe that our CQI process can readily be used in other academically based offices of research.

  11. Improving Service Delivery in a County Health Department WIC Clinic: An Application of Statistical Process Control Techniques

    PubMed Central

    Boe, Debra Thingstad; Parsons, Helen

    2009-01-01

    Local public health agencies are challenged to continually improve service delivery, yet they frequently operate with constrained resources. Quality improvement methods and techniques such as statistical process control are commonly used in other industries, and they have recently been proposed as a means of improving service delivery and performance in public health settings. We analyzed a quality improvement project undertaken at a local Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) clinic to reduce waiting times and improve client satisfaction with a walk-in nutrition education service. We used statistical process control techniques to evaluate initial process performance, implement an intervention, and assess process improvements. We found that implementation of these techniques significantly reduced waiting time and improved clients' satisfaction with the WIC service. PMID:19608964
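
    A minimal Python sketch of the kind of statistical process control calculation involved: an individuals (X) chart with limits derived from the average moving range. The waiting-time data are illustrative, not the clinic's.

        import numpy as np

        def individuals_chart_limits(x):
            # Control limits for an individuals (X) chart using the average moving range.
            x = np.asarray(x, float)
            mr = np.abs(np.diff(x))               # moving ranges between consecutive points
            sigma_hat = mr.mean() / 1.128         # d2 constant for subgroups of size 2
            center = x.mean()
            return center - 3 * sigma_hat, center, center + 3 * sigma_hat

        # Illustrative client waiting times (minutes) for the walk-in service.
        waits = [22, 35, 28, 41, 30, 26, 38, 45, 29, 33]
        lcl, center, ucl = individuals_chart_limits(waits)
        print(lcl, center, ucl)
        # Points outside the limits would signal special-cause variation rather than
        # ordinary week-to-week fluctuation, before or after an intervention.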

  12. Applying Process Improvement Methods to Clinical and Translational Research: Conceptual Framework and Case Examples

    PubMed Central

    Selker, Harry P.; Leslie, Laurel K.

    2015-01-01

    There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. PMID:26332869

  13. Strategy for 90% autoverification of clinical chemistry and immunoassay test results using six sigma process improvement.

    PubMed

    Randell, Edward W; Short, Garry; Lee, Natasha; Beresford, Allison; Spencer, Margaret; Kennell, Marina; Moores, Zoë; Parry, David

    2018-06-01

    Six Sigma involves a structured process improvement strategy that places processes on a pathway to continued improvement. The data presented here summarize a project that took three clinical laboratories from autoverification processes in which roughly 40% to 60% of tests were auto-verified to processes in which more than 90% of tests and samples are auto-verified. The project schedule, metrics and targets, a description of the previous system, and detailed information on the changes made to achieve greater than 90% auto-verification are presented for this Six Sigma DMAIC (Define, Measure, Analyze, Improve, Control) process improvement project.
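
    As an illustration of what autoverification logic typically looks like, a minimal Python sketch with hypothetical rules and limits (not the laboratories' actual middleware configuration): a result is released automatically only if it passes every check, otherwise it is held for manual review.

        def autoverify(result, previous=None):
            # Hypothetical autoverification rules for a chemistry result.
            # result: dict with keys 'analyte', 'value', 'flags'
            # previous: prior value for the same patient/analyte, if available (delta check)
            limits = {"sodium": {"range": (110, 180), "critical": (120, 160), "delta": 10}}
            rule = limits.get(result["analyte"])
            if rule is None or result["flags"]:                 # unknown test or instrument flag
                return False
            lo, hi = rule["range"]
            if not (lo <= result["value"] <= hi):               # outside measuring range
                return False
            clo, chi = rule["critical"]
            if not (clo <= result["value"] <= chi):             # critical value: review and call
                return False
            if previous is not None and abs(result["value"] - previous) > rule["delta"]:
                return False                                    # failed delta check
            return True                                         # release without review

        print(autoverify({"analyte": "sodium", "value": 141, "flags": []}, previous=138))  # True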

  14. Improving preanalytic processes using the principles of lean production (Toyota Production System).

    PubMed

    Persoon, Thomas J; Zaleski, Sue; Frerichs, Janice

    2006-01-01

    The basic technologies used in preanalytic processes for chemistry tests have been mature for a long time, and improvements in preanalytic processes have lagged behind improvements in analytic and postanalytic processes. We describe our successful efforts to improve chemistry test turnaround time from a central laboratory by improving preanalytic processes, using existing resources and the principles of lean production. Our goal is to report 80% of chemistry tests in less than 1 hour and to no longer recognize a distinction between expedited and routine testing. We used principles of lean production (the Toyota Production System) to redesign preanalytic processes. The redesigned preanalytic process has fewer steps and uses 1-piece flow to move blood samples through the accessioning, centrifugation, and aliquoting processes. Median preanalytic processing time was reduced from 29 to 19 minutes, and the laboratory met the goal of reporting 80% of chemistry results in less than 1 hour for 11 consecutive months.

  15. Improving Earth/Prediction Models to Improve Network Processing

    NASA Astrophysics Data System (ADS)

    Wagner, G. S.

    2017-12-01

    The United States Atomic Energy Detection System (USAEDS) primary seismic network consists of a relatively small number of arrays and three-component stations. The relatively small number of stations in the USAEDS primary network makes it both necessary and feasible to optimize both station and network processing. Station processing improvements include detector tuning efforts that use Receiver Operator Characteristic (ROC) curves to help judiciously set acceptable Type 1 (false) vs. Type 2 (miss) error rates. Other station processing improvements include the use of empirical/historical observations and continuous background noise measurements to compute time-varying, maximum likelihood probability of detection thresholds. The USAEDS network processing software makes extensive use of the azimuth and slowness information provided by frequency-wavenumber analysis at array sites, and polarization analysis at three-component sites. Most of the improvements in USAEDS network processing are due to improvements in the models used to predict azimuth, slowness, and probability of detection. Kriged travel-time, azimuth and slowness corrections, and their associated uncertainties, are computed using a ground truth database. Improvements in station processing and the use of improved models for azimuth, slowness, and probability of detection have led to significant improvements in USAEDS network processing.
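
    A minimal Python sketch of the ROC-based threshold tuning described above (synthetic detection statistics, not USAEDS data): sweep a detector threshold, trace the false-alarm rate against the miss rate, and pick the threshold that meets a target Type 1 error rate.

        import numpy as np

        rng = np.random.default_rng(0)
        noise  = rng.normal(0.0, 1.0, 5000)    # detection statistic on noise windows
        signal = rng.normal(2.5, 1.0, 500)     # detection statistic on real events

        thresholds = np.linspace(-3, 6, 200)
        false_alarm = [(noise  >= t).mean() for t in thresholds]   # Type 1 error rate
        miss        = [(signal <  t).mean() for t in thresholds]   # Type 2 error rate

        # Choose the lowest threshold whose false-alarm rate meets a target, e.g. 1%.
        target_far = 0.01
        idx = next(i for i, far in enumerate(false_alarm) if far <= target_far)
        print(thresholds[idx], false_alarm[idx], miss[idx])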

  16. Capability Maturity Model (CMM) for Software Process Improvements

    NASA Technical Reports Server (NTRS)

    Ling, Robert Y.

    2000-01-01

    This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.

  17. Optimization and Improvement of Test Processes on a Production Line

    NASA Astrophysics Data System (ADS)

    Sujová, Erika; Čierna, Helena

    2018-06-01

    The paper deals with increasing process efficiency on a production line for engine cylinder heads in a company operating in the automotive industry. The goal is to achieve improvement and optimization of the test processes on the production line. It analyzes options for improving the capacity, availability and productivity of the output-test processes by using modern technology available on the market. We have focused on analysis of operation times before and after optimization of the test processes at specific production sections. By analyzing the measured results we have determined the differences in time before and after improvement of the process. We have determined the overall equipment effectiveness (OEE) coefficient and, by comparing outputs, we have confirmed a real improvement of the cylinder-head output-test process.
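
    OEE combines three ratios; a minimal Python sketch under the common three-factor definition (OEE = availability x performance x quality), with illustrative numbers that are not the paper's measurements:

        def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
            # Overall Equipment Effectiveness under the common three-factor definition.
            availability = run_time / planned_time                       # share of planned time actually running
            performance  = (ideal_cycle_time * total_count) / run_time   # speed vs. ideal cycle time
            quality      = good_count / total_count                      # share of good parts
            return availability * performance * quality

        # Illustrative shift on the cylinder-head test station.
        print(oee(planned_time=480, run_time=420, ideal_cycle_time=1.5,
                  total_count=250, good_count=242))   # roughly 0.76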

  18. Why Process Improvement Training Fails

    ERIC Educational Resources Information Center

    Lu, Dawei; Betts, Alan

    2011-01-01

    Purpose: The purpose of this paper is to explore the underlying reasons why providing process improvement training, by itself, may not be sufficient to achieve the desired outcome of improved processes; and to attempt a conceptual framework of management training for more effective improvement. Design/methodology/approach: Two similar units within…

  19. [Improving the continuous care process in primary care during weekends and holidays: redesigning and FMEA].

    PubMed

    Cañada Dorado, A; Cárdenas Valladolid, J; Espejo Matorrales, F; García Ferradal, I; Sastre Páez, S; Vicente Martín, I

    2010-01-01

    To describe a project carried out in order to improve the process of Continuous Health Care (CHC) on Saturdays and bank holidays in Primary Care, area number 4, Madrid. The aim of this project was to guarantee a safe and error-free service to patients receiving home health care on weekends. The urgent need for improving CHC process was identified by the Risk Management Functional Unit (RMFU) of the area. In addition, some complaints had been received from the nurses involved in the process as well as from their patients. A SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis performed in 2009 highlighted a number of problems with the process. As a result, a project for improvement was drawn up, to be implemented in the following stages: 1. Redesigning and improving the existing process. 2. Application of failure mode and effect analysis (FMEA) to the new process. 3. Follow up, managing and leading the project. 4. Nurse training. 5. Implementing the process in the whole area. 6. CHC nurse satisfaction surveys. After carrying out this project, the efficiency and level of automation improved considerably. Since implementation of the process enhancement measures, no complaints have been received from patients and surveys show that CHC nurse satisfaction has improved. By using FMEA, errors were given priority and enhancement steps were taken in order to: Inform professionals, back-up personnel and patients about the process. Improve the specialist follow-up report. Provide training in ulcer patient care. The process enhancement, and especially its automation, has resulted in a significant step forward toward achieving greater patient safety. FMEA was a useful tool, which helped in taking some important actions. Finally, CHC nurse satisfaction has clearly improved. Copyright © 2009 SECA. Published by Elsevier Espana. All rights reserved.

  20. Six sigma: process of understanding the control and capability of ranitidine hydrochloride tablet.

    PubMed

    Chabukswar, Ar; Jagdale, Sc; Kuchekar, Bs; Joshi, Vd; Deshmukh, Gr; Kothawade, Hs; Kuckekar, Ab; Lokhande, Pd

    2011-01-01

    The process of understanding the control and capability (PUCC) is an iterative closed loop process for continuous improvement. It covers the DMAIC toolkit in its three phases. PUCC is an iterative approach that rotates between the three pillars of the process of understanding, process control, and process capability, with each iteration resulting in a more capable and robust process. It is rightly said that being at the top is a marathon and not a sprint. The objective of the six sigma study of Ranitidine hydrochloride tablets is to achieve perfection in tablet manufacturing by reviewing the present robust manufacturing process, to find out ways to improve and modify the process, which will yield tablets that are defect-free and will give more customer satisfaction. The application of six sigma led to an improved process capability, due to the improved sigma level of the process from 1.5 to 4, a higher yield, due to reduced variation and reduction of thick tablets, reduction in packing line stoppages, reduction in re-work by 50%, a more standardized process, with smooth flow and change in coating suspension reconstitution level (8%w/w), a huge cost reduction of approximately Rs.90 to 95 lakhs per annum, an improved overall efficiency by 30% approximately, and improved overall quality of the product.

  1. Six Sigma: Process of Understanding the Control and Capability of Ranitidine Hydrochloride Tablet

    PubMed Central

    Chabukswar, AR; Jagdale, SC; Kuchekar, BS; Joshi, VD; Deshmukh, GR; Kothawade, HS; Kuckekar, AB; Lokhande, PD

    2011-01-01

    The process of understanding the control and capability (PUCC) is an iterative closed loop process for continuous improvement. It covers the DMAIC toolkit in its three phases. PUCC is an iterative approach that rotates between the three pillars of the process of understanding, process control, and process capability, with each iteration resulting in a more capable and robust process. It is rightly said that being at the top is a marathon and not a sprint. The objective of the six sigma study of Ranitidine hydrochloride tablets is to achieve perfection in tablet manufacturing by reviewing the present robust manufacturing process, to find out ways to improve and modify the process, which will yield tablets that are defect-free and will give more customer satisfaction. The application of six sigma led to an improved process capability, due to the improved sigma level of the process from 1.5 to 4, a higher yield, due to reduced variation and reduction of thick tablets, reduction in packing line stoppages, reduction in re-work by 50%, a more standardized process, with smooth flow and change in coating suspension reconstitution level (8%w/w), a huge cost reduction of approximately Rs.90 to 95 lakhs per annum, an improved overall efficiency by 30% approximately, and improved overall quality of the product. PMID:21607050
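
    The sigma-level improvement reported above (from 1.5 to 4) can be illustrated with a short conversion from defect rate to sigma level. The Python sketch below uses the common 1.5-sigma shift convention; the defect counts are chosen only so that the two cases land near 1.5 and 4 sigma and are not the study's actual data.

    ```python
    from scipy.stats import norm

    def sigma_level(defects: int, opportunities: int, shift: float = 1.5) -> float:
        """Convert a defect rate into a process sigma level (with the usual 1.5-sigma shift)."""
        dpmo = defects / opportunities * 1_000_000   # defects per million opportunities
        return norm.ppf(1 - dpmo / 1_000_000) + shift

    # Illustrative defect counts per million tablets, before and after improvement.
    print(f"before: {sigma_level(500_000, 1_000_000):.1f} sigma")   # ~1.5 sigma
    print(f"after:  {sigma_level(6_210, 1_000_000):.1f} sigma")     # ~4.0 sigma
    ```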

  2. Applying Process Improvement Methods to Clinical and Translational Research: Conceptual Framework and Case Examples.

    PubMed

    Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K

    2015-12-01

    There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.

  3. The Evolution of School Improvement from the Classroom Teacher's Perspective.

    ERIC Educational Resources Information Center

    Thompson, Marci; Mitchell, Deborah

    2002-01-01

    Highlights changes that have occurred since 1992 at Elm Dale Elementary School (Greenfield, Wisconsin) through the school improvement process. Describes how teachers have become involved in and developed ownership of the improvement process, and how they have learned to analyze data. Asserts that the school improvement process has changed the…

  4. An overview of the Software Engineering Laboratory

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report describes the background and structure of the SEL organization, the SEL process improvement approach, and its experimentation and data collection process. Results of some sample SEL studies are included. It includes a discussion of the overall implication of trends observed over 17 years of process improvement efforts and looks at the return on investment based on a comparison of total investment in process improvement with the measurable improvements seen in the organization's software product.

  5. Performance improvements of binary diffractive structures via optimization of the photolithography and dry etch processes

    NASA Astrophysics Data System (ADS)

    Welch, Kevin; Leonard, Jerry; Jones, Richard D.

    2010-08-01

    Increasingly stringent requirements on the performance of diffractive optical elements (DOEs) used in wafer scanner illumination systems are driving continuous improvements in their associated manufacturing processes. Specifically, these processes are designed to improve the output pattern uniformity of off-axis illumination systems to minimize degradation in the ultimate imaging performance of a lithographic tool. In this paper, we discuss performance improvements in both photolithographic patterning and RIE etching of fused silica diffractive optical structures. In summary, optimized photolithographic processes were developed to increase critical dimension uniformity and featuresize linearity across the substrate. The photoresist film thickness was also optimized for integration with an improved etch process. This etch process was itself optimized for pattern transfer fidelity, sidewall profile (wall angle, trench bottom flatness), and across-wafer etch depth uniformity. Improvements observed with these processes on idealized test structures (for ease of analysis) led to their implementation in product flows, with comparable increases in performance and yield on customer designs.

  6. Accelerating quality improvement within your organization: Applying the Model for Improvement.

    PubMed

    Crowl, Ashley; Sharma, Anita; Sorge, Lindsay; Sorensen, Todd

    2015-01-01

    The aim is to discuss the fundamentals of the Model for Improvement and how the model can be applied to quality improvement activities associated with medication use, including understanding the three essential questions that guide quality improvement, applying a process for actively testing change within an organization, and measuring the success of these changes on care delivery. Sources were identified by searching PubMed from 1990 through April 2014 using the search terms quality improvement, process improvement, hospitals, and primary care. At the authors' discretion, studies were selected based on their relevance in demonstrating the quality improvement process and tests of change within an organization. Organizations are continuously seeking to enhance quality in patient care services, and much of this work focuses on improving care delivery processes. Yet change in these systems is often slow, which can lead to frustration or apathy among frontline practitioners. Adopting and applying the Model for Improvement as a core strategy for quality improvement efforts can accelerate the process. While the model is frequently well known in hospitals and primary care settings, it is not always familiar to pharmacists. In addition, while some organizations may be familiar with "plan, do, study, act" (PDSA) cycles, one element of the Model for Improvement, many do not apply them effectively. The goal of the model is to combine a continuous process of small tests of change (PDSA cycles), within an overarching aim, with a longitudinal measurement process. This differs from other forms of improvement work that plan and implement large-scale change over an extended period, followed by months of data collection; in that scenario it may take months or years to determine whether an intervention will have a positive impact. By following the Model for Improvement, frontline practitioners and their organizational leaders can quickly identify strategies that make a positive difference and result in a greater degree of success.

  7. Evaluating Fidelity to a Modified NIATx Process Improvement Strategy for Improving HIV Services in Correctional Facilities.

    PubMed

    Pankow, Jennifer; Willett, Jennifer; Yang, Yang; Swan, Holly; Dembo, Richard; Burdon, William M; Patterson, Yvonne; Pearson, Frank S; Belenko, Steven; Frisman, Linda K

    2018-04-01

    In a study aimed at improving the quality of HIV services for inmates, an organizational process improvement strategy using change teams was tested in 14 correctional facilities in 8 US states and Puerto Rico. Data to examine fidelity to the process improvement strategy consisted of quantitative ratings of the structural and process components of the strategy and qualitative notes that explicate challenges in maintaining fidelity to the strategy. Fidelity challenges included (1) lack of communication and leadership within change teams, (2) instability in team membership, and (3) issues with data utilization in decision-making to implement improvements to services delivery.

  8. Mapping modern software process engineering techniques onto an HEP development environment

    NASA Astrophysics Data System (ADS)

    Wellisch, J. P.

    2003-04-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement and continual improvement in the efficiency of the working environment, with the goal of facilitating doing great new physics. To achieve this, we followed a process improvement program based on ISO 15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create de facto software process standards within the CMS off-line community as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned and mistakes made, and demonstrate the benefits gained and the current status of the software processes established in CMS off-line software.

  9. Improved process for forming a three-dimensional undersurface on a printed cantilever

    NASA Astrophysics Data System (ADS)

    Kanazawa, Shusuke; Kusaka, Yasuyuki; Yamamoto, Noritaka; Ushijima, Hirobumi

    2018-05-01

    An improvement in the lift-on offset printing process is reported as a means of enabling the structural customization of hollow structures used as moving parts of sensors and actuators. The improved process can add structures to the underside of a hollow structure by modifying the preparation of the pre-structure. As a demonstration, the mechanical displacement of a cantilever in a gravitational acceleration sensor was enhanced by the addition of a proof mass. The improved process can be expected to further produce functionalized hollow structures by an efficient manufacturing process.

  10. Building an outpatient imaging center: A case study at genesis healthcare system, part 2.

    PubMed

    Yanci, Jim

    2006-01-01

    In the second of 2 parts, this article will focus on process improvement projects utilizing a case study at Genesis HealthCare System located in Zanesville, OH. Operational efficiency is a key step in developing a freestanding diagnostic imaging center. The process improvement projects began with an Expert Improvement Session (EIS) on the scheduling process. An EIS session is a facilitated meeting that can last anywhere from 3 hours to 2 days. Its intention is to take a group of people involved with the problem or operational process and work to understand current failures or breakdowns in the process. Recommendations are jointly developed to overcome any current deficiencies, and a work plan is structured to create ownership over the changes. A total of 11 EIS sessions occurred over the course of this project, covering 5 sections: Scheduling/telephone call process, Pre-registration, Verification/pre-certification, MRI throughput, CT throughput. Following is a single example of a project focused on the process improvement efforts. All of the process improvement projects utilized a quasi methodology of "DMAIC" (Define, Measure, Analyze, Improve, and Control).

  11. Advanced Hydrogen Liquefaction Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Joseph; Kromer, Brian; Neu, Ben

    2011-09-28

    The project identified and quantified ways to reduce the cost of hydrogen liquefaction and the cost of hydrogen distribution. The goal was to reduce the power consumption by 20% and then to reduce the capital cost. Optimizing the process, improving process equipment, and improving ortho-para conversion significantly reduced the power consumption of liquefaction, but by less than 20%. Because the efficiency improvement was less than the target, the program was stopped before the capital cost was addressed. These efficiency improvements could benefit the public by improving the design of future hydrogen liquefiers. The project increased the understanding of hydrogen liquefaction by modeling different processes and thoroughly examining ortho-para separation and conversion. The process modeling provided a benefit to the public because the project incorporated para hydrogen into the process modeling software, so liquefaction processes can be modeled more accurately than by using only normal hydrogen. Adding catalyst to the first heat exchanger, a simple method to reduce liquefaction power, was identified, analyzed, and quantified. The demonstrated performance of ortho-para separation is sufficient for at least one identified process concept to show reduced power cost when compared to hydrogen liquefaction processes using conventional ortho-para conversion. The impact of improved ortho-para conversion can be significant because ortho-para conversion uses about 20-25% of the total liquefaction power, but performance improvement is necessary to realize a substantial benefit. Most of the energy used in liquefaction is for gas compression. Improvements in hydrogen compression will have a significant impact on overall liquefier efficiency. Improvements to turbines, heat exchangers, and other process equipment will have less impact.

  12. Improving a Dental School's Clinic Operations Using Lean Process Improvement.

    PubMed

    Robinson, Fonda G; Cunningham, Larry L; Turner, Sharon P; Lindroth, John; Ray, Deborah; Khan, Talib; Yates, Audrey

    2016-10-01

    The term "lean production," also known as "Lean," describes a process of operations management pioneered at the Toyota Motor Company that contributed significantly to the success of the company. Although developed by Toyota, the Lean process has been implemented at many other organizations, including those in health care, and should be considered by dental schools in evaluating their clinical operations. Lean combines engineering principles with operations management and improvement tools to optimize business and operating processes. One of the core concepts is relentless elimination of waste (non-value-added components of a process). Another key concept is utilization of individuals closest to the actual work to analyze and improve the process. When the medical center of the University of Kentucky adopted the Lean process for improving clinical operations, members of the College of Dentistry trained in the process applied the techniques to improve inefficient operations at the Walk-In Dental Clinic. The purpose of this project was to reduce patients' average in-the-door-to-out-the-door time from over four hours to three hours within 90 days. Achievement of this goal was realized by streamlining patient flow and strategically relocating key phases of the process. This initiative resulted in patient benefits such as shortening average in-the-door-to-out-the-door time by over an hour, improving satisfaction by 21%, and reducing negative comments by 24%, as well as providing opportunity to implement the electronic health record, improving teamwork, and enhancing educational experiences for students. These benefits were achieved while maintaining high-quality patient care with zero adverse outcomes during and two years following the process improvement project.

  13. Developing Process Maps as a Tool for a Surgical Infection Prevention Quality Improvement Initiative in Resource-Constrained Settings.

    PubMed

    Forrester, Jared A; Koritsanszky, Luca A; Amenu, Demisew; Haynes, Alex B; Berry, William R; Alemu, Seifu; Jiru, Fekadu; Weiser, Thomas G

    2018-06-01

    Surgical infections cause substantial morbidity and mortality in low-and middle-income countries (LMICs). To improve adherence to critical perioperative infection prevention standards, we developed Clean Cut, a checklist-based quality improvement program to improve compliance with best practices. We hypothesized that process mapping infection prevention activities can help clinicians identify strategies for improving surgical safety. We introduced Clean Cut at a tertiary hospital in Ethiopia. Infection prevention standards included skin antisepsis, ensuring a sterile field, instrument decontamination/sterilization, prophylactic antibiotic administration, routine swab/gauze counting, and use of a surgical safety checklist. Processes were mapped by a visiting surgical fellow and local operating theater staff to facilitate the development of contextually relevant solutions; processes were reassessed for improvements. Process mapping helped identify barriers to using alcohol-based hand solution due to skin irritation, inconsistent administration of prophylactic antibiotics due to variable delivery outside of the operating theater, inefficiencies in assuring sterility of surgical instruments through lack of confirmatory measures, and occurrences of retained surgical items through inappropriate guidelines, staffing, and training in proper routine gauze counting. Compliance with most processes improved significantly following organizational changes to align tasks with specific process goals. Enumerating the steps involved in surgical infection prevention using a process mapping technique helped identify opportunities for improving adherence and plotting contextually relevant solutions, resulting in superior compliance with antiseptic standards. Simplifying these process maps into an adaptable tool could be a powerful strategy for improving safe surgery delivery in LMICs. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  14. Launching a Laboratory Testing Process Quality Improvement Toolkit: From the Shared Networks of Colorado Ambulatory Practices and Partners (SNOCAP).

    PubMed

    Fernald, Douglas; Hamer, Mika; James, Kathy; Tutt, Brandon; West, David

    2015-01-01

    Family medicine and internal medicine physicians order diagnostic laboratory tests for nearly one-third of patient encounters in an average week, yet among medical errors in primary care, an estimated 15% to 54% are attributed to laboratory testing processes. From a practice improvement perspective, we (1) describe the need for laboratory testing process quality improvements from the perspective of primary care practices, and (2) describe the approaches and resources needed to implement laboratory testing process quality improvements in practice. We applied practice observations, process mapping, and interviews with primary care practices in the Shared Networks of Colorado Ambulatory Practices and Partners (SNOCAP)-affiliated practice-based research networks that field-tested in 2013 a laboratory testing process improvement toolkit. From the data collected in each of the 22 participating practices, common testing quality issues included, but were not limited to, 3 main testing process steps: laboratory test preparation, test tracking, and patient notification. Three overarching qualitative themes emerged: practices readily acknowledge multiple laboratory testing process problems; practices know that they need help addressing the issues; and practices face challenges with finding patient-centered solutions compatible with practice priorities and available resources. While practices were able to get started with guidance and a toolkit to improve laboratory testing processes, most did not seem able to achieve their quality improvement aims unassisted. Providing specific guidance tools with practice facilitation or other rapid-cycle quality improvement support may be an effective approach to improve common laboratory testing issues in primary care. © Copyright 2015 by the American Board of Family Medicine.

  15. Deployment of lean six sigma in care coordination: an improved discharge process.

    PubMed

    Breslin, Susan Ellen; Hamilton, Karen Marie; Paynter, Jacquelyn

    2014-01-01

    This article presents a quality improvement project to reduce readmissions in the Medicare population related to heart failure, acute myocardial infarction, and pneumonia. The article describes a systematic approach to the discharge process aimed at improving transitions of care from hospital to post-acute care, utilizing Lean Six Sigma methodology. The setting was an inpatient acute care hospital. A coordinated discharge process, which includes postdischarge follow-up, can reduce avoidable readmissions. Implications for case management practice: the quality improvement project demonstrated the significant role case management plays in preventing costly readmissions and improving outcomes for patients through better transitions of care from the hospital to the community. By utilizing Lean Six Sigma methodology, hospitals can focus on eliminating waste in their current processes and build more sustainable improvements to deliver a safe, quality discharge process for their patients. Case managers are leading this effort to improve care transitions and assure a smoother transition into the community postdischarge.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, C.

    The Rocky Flats Environmental Technology Site (RFETS) has initiated a major work process improvement campaign using the tools of formalized benchmarking and streamlining. This paper provides insights into some of the process improvement activities performed at Rocky Flats from November 1995 through December 1996. It reviews the background, motivation, methodology, results, and lessons learned from this ongoing effort. The paper also presents important gains realized through process analysis and improvement including significant cost savings, productivity improvements, and an enhanced understanding of site work processes.

  17. Creating State Accountability Systems That Help Schools Improve

    ERIC Educational Resources Information Center

    Elgart, Mark A.

    2016-01-01

    Organizational leaders from nearly every sector have been using continuous improvement models and improvement science for years to improve products, services, and processes. Though continuous improvement processes are not new in education, they are relatively new in the state policy arena. In a continuous improvement system, educators use data,…

  18. Positive affect improves working memory: implications for controlled cognitive processing.

    PubMed

    Yang, Hwajin; Yang, Sujin; Isen, Alice M

    2013-01-01

    This study examined the effects of positive affect on working memory (WM) and short-term memory (STM). Given that WM involves both storage and controlled processing and that STM primarily involves storage processing, we hypothesised that if positive affect facilitates controlled processing, it should improve WM more than STM. The results demonstrated that positive affect, compared with neutral affect, significantly enhanced WM, as measured by the operation span task. The influence of positive affect on STM, however, was weaker. These results suggest that positive affect enhances WM, a task that involves controlled processing, not just storage processing. Additional analyses of recall and processing times and accuracy further suggest that improved WM under positive affect is not attributable to motivational differences, but results instead from improved controlled cognitive processing.

  19. Software Engineering Program: Software Process Improvement Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  20. Improved silicon carbide for advanced heat engines. I - Process development for injection molding

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas J.; Trela, Walter

    1989-01-01

    Alternate processing methods have been investigated as a means of improving the mechanical properties of injection-molded SiC. Various mixing processes (dry, high-shear, and fluid) were evaluated along with the morphology and particle size of the starting beta-SiC powder. Statistically designed experiments were used to determine significant effects and interactions of variables in the mixing, injection molding, and binder removal process steps. Improvements in mechanical strength can be correlated with the reduction in flaw size observed in the injection-molded green bodies obtained with improved processing methods.

  1. Challenges and potential improvements in the admission process of patients with spinal cord injury in a specialized rehabilitation clinic - an interview based qualitative study of an interdisciplinary team.

    PubMed

    Röthlisberger, Fabian; Boes, Stefan; Rubinelli, Sara; Schmitt, Klaus; Scheel-Sailer, Anke

    2017-06-26

    The admission process of patients to a hospital is the starting point for inpatient services. In order to optimize the quality of the health services provision, one needs a good understanding of the patient admission workflow in a clinic. The aim of this study was to identify challenges and potential improvements in the admission process of spinal cord injury patients at a specialized rehabilitation clinic from the perspective of an interdisciplinary team of health professionals. Semi-structured interviews with eight health professionals (medical doctors, physical therapists, occupational therapists, nurses) at the Swiss Paraplegic Centre (acute and rehabilitation clinic) were conducted based on a maximum variety purposive sampling strategy. The interviews were analyzed using a thematic analysis approach. The interviewees described the challenges and potential improvements in this admission process, focusing on five themes. First, the characteristics of the patient with his/her health condition and personality and his/her family influence different areas in the admission process. Improvements in the exchange of information between the hospital and the patient could speed up and simplify the admission process. In addition, challenges and potential improvements were found concerning the rehabilitation planning, the organization of the admission process and the interdisciplinary work. This study identified five themes of challenges and potential improvements in the admission process of spinal cord injury patients at a specialized rehabilitation clinic. When planning adaptations of process steps in one of the areas, awareness of effects in other fields is necessary. Improved pre-admission information would be a first important step to optimize the admission process. A common IT-system providing an interdisciplinary overview and possibilities for interdisciplinary exchange would support the management of the admission process. Managers of other hospitals can supplement the results of this study with their own process analyses, to improve their own patient admission processes.

  2. Reducing RN Vacancy Rate: A Nursing Recruitment Office Process Improvement Project.

    PubMed

    Hisgen, Stephanie A; Page, Nancy E; Thornlow, Deirdre K; Merwin, Elizabeth I

    2018-06-01

    The aim of this study was to reduce the RN vacancy rate at an academic medical center by improving the hiring process in the Nursing Recruitment Office. Inability to fill RN positions can lead to higher vacancy rates and negatively impact staff and patient satisfaction, quality outcomes, and the organization's bottom line. The Model for Improvement was used to design and implement a process improvement project to improve the hiring process from time of interview through the position being filled. Number of days to interview and check references decreased significantly, but no change in overall time to hire and time to fill positions was noted. RN vacancy rate also decreased significantly. Nurse manager satisfaction with the hiring process increased significantly. Redesigning the recruitment process supported operational efficiencies of the organization related to RN recruitment.

  3. Lean principles optimize on-time vascular surgery operating room starts and decrease resident work hours.

    PubMed

    Warner, Courtney J; Walsh, Daniel B; Horvath, Alexander J; Walsh, Teri R; Herrick, Daniel P; Prentiss, Steven J; Powell, Richard J

    2013-11-01

    Lean process improvement techniques are used in industry to improve efficiency and quality while controlling costs. These techniques are less commonly applied in health care. This study assessed the effectiveness of Lean principles on first case on-time operating room starts and quantified effects on resident work hours. Standard process improvement techniques (DMAIC methodology: define, measure, analyze, improve, control) were used to identify causes of delayed vascular surgery first case starts. Value stream maps and process flow diagrams were created. Process data were analyzed with Pareto and control charts. High-yield changes were identified and simulated in computer and live settings prior to implementation. The primary outcome measure was the proportion of on-time first case starts; secondary outcomes included hospital costs, resident rounding time, and work hours. Data were compared with existing benchmarks. Prior to implementation, 39% of first cases started on time. Process mapping identified late resident arrival in preoperative holding as a cause of delayed first case starts. Resident rounding process inefficiencies were identified and changed through the use of checklists, standardization, and elimination of nonvalue-added activity. Following implementation of process improvements, first case on-time starts improved to 71% at 6 weeks (P = .002). Improvement was sustained with an 86% on-time rate at 1 year (P < .001). Resident rounding time was reduced by 33% (from 70 to 47 minutes). At 9 weeks following implementation, these changes generated an opportunity cost potential of $12,582. Use of Lean principles allowed rapid identification and implementation of perioperative process changes that improved efficiency and resulted in significant cost savings. This improvement was sustained at 1 year. Downstream effects included improved resident efficiency with decreased work hours. Copyright © 2013 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
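
    Control charts like those mentioned above are straightforward to compute. The Python sketch below builds the p-chart centre line and 3-sigma limits for weekly first-case on-time start proportions; the weekly counts are invented for illustration rather than taken from the study.

    ```python
    import numpy as np

    # Hypothetical weekly counts of first cases and on-time starts after the change.
    cases_per_week = np.array([20, 22, 19, 21, 20, 18, 22, 20])
    on_time = np.array([14, 16, 13, 15, 15, 12, 17, 15])

    p = on_time / cases_per_week
    p_bar = on_time.sum() / cases_per_week.sum()      # centre line of the p-chart

    # 3-sigma control limits vary with each week's sample size.
    sigma = np.sqrt(p_bar * (1 - p_bar) / cases_per_week)
    ucl = np.clip(p_bar + 3 * sigma, 0, 1)
    lcl = np.clip(p_bar - 3 * sigma, 0, 1)

    for week, (prop, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
        flag = "" if lo <= prop <= hi else "  <-- out of control"
        print(f"week {week}: p={prop:.2f}  limits=({lo:.2f}, {hi:.2f}){flag}")
    ```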

  4. [Process management in the hospital pharmacy for the improvement of the patient safety].

    PubMed

    Govindarajan, R; Perelló-Juncá, A; Parès-Marimòn, R M; Serrais-Benavente, J; Ferrandez-Martí, D; Sala-Robinat, R; Camacho-Calvente, A; Campabanal-Prats, C; Solà-Anderiu, I; Sanchez-Caparrós, S; Gonzalez-Estrada, J; Martinez-Olalla, P; Colomer-Palomo, J; Perez-Mañosas, R; Rodríguez-Gallego, D

    2013-01-01

    To define a process management model for a hospital pharmacy in order to measure, analyse and make continuous improvements in patient safety and healthcare quality. In order to implement process management, Igualada Hospital was divided into different processes, one of which was the Hospital Pharmacy. A multidisciplinary management team was given responsibility for each process. For each sub-process one person was identified to be responsible, and a working group was formed under his/her leadership. With the help of each working group, a risk analysis using failure modes and effects analysis (FMEA) was performed, and the corresponding improvement actions were implemented. Sub-process indicators were also identified, and different process management mechanisms were introduced. The first risk analysis with FMEA produced more than thirty preventive actions to improve patient safety. Later, the weekly analysis of errors, as well as the monthly analysis of key process indicators, permitted us to monitor process results and, as each sub-process manager participated in these meetings, also to assume accountability and responsibility, thus consolidating the culture of excellence. The introduction of different process management mechanisms, with the participation of people responsible for each sub-process, introduces a participative management tool for the continuous improvement of patient safety and healthcare quality. Copyright © 2012 SECA. Published by Elsevier Espana. All rights reserved.

  5. Business Process Reengineering in the Inventory Management to Improve Aircraft Maintenance Operations in the Indonesian Air Force

    DTIC Science & Technology

    2006-06-01

    The research will cover an overview of business process reengineering (BPR) and operations management. The focus will be on the basic process of BPR, inventory ... management and improvement of the process of business operations management to appropriately provide a basic model for the Indonesian Air Force in ... discuss the operations management aspects of inventory management and process improvement, including Economic Order Quantity, Material Requirement

  6. Innovating the Standard Procurement System Through Electronic Commerce Technologies

    DTIC Science & Technology

    1999-12-01

    ... commerce are emerging almost daily as businesses continue to realize the overwhelming ability of agent applications to reduce costs and improve ... processed using the SPS. The result may reduce cycle time, assist contracting professionals, improve the acquisition process, save money and aid ... of innovation processes, and it offers enormous potential for helping organizations achieve major improvements in terms of process cost, time

  7. Process Improvement for Next Generation Space Flight Vehicles: MSFC Lessons Learned

    NASA Technical Reports Server (NTRS)

    Housch, Helen

    2008-01-01

    This viewgraph presentation reviews the lessons learned from process improvement for Next Generation Space Flight Vehicles. The contents include: 1) Organizational profile; 2) Process Improvement History; 3) Appraisal Preparation; 4) The Appraisal Experience; 5) Useful Tools; and 6) Is CMMI working?

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, L.; Britt, J.; Birkmire, R.

    ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which in turn has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.

  9. A DMAIC approach for process capability improvement an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

    The define-measure-analyze-improve-control (DMAIC) approach is a five-stratum, scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variation of the stub-end-hole boring operation in crankshaft manufacture. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define stratum. The next stratum constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement strata, where quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002. The process potential capability index (Cp) improved from 1.29 to 2.02, and the process performance capability index (Cpk) improved from 0.32 to 1.45.
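
    As a rough illustration of the capability indices quoted above, the Python sketch below computes Cp = (USL - LSL) / 6σ and Cpk = min(USL - μ, μ - LSL) / 3σ for a sample of bore measurements. The specification limits and measurement data are hypothetical, not the crankshaft study's actual values.

    ```python
    import numpy as np

    def capability(data, lsl, usl):
        """Return the process capability indices (Cp, Cpk) for a sample."""
        mu, sigma = data.mean(), data.std(ddof=1)
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        return cp, cpk

    # Hypothetical stub-end-hole bore diameters (mm) and specification limits.
    rng = np.random.default_rng(1)
    bores = rng.normal(loc=42.006, scale=0.002, size=50)
    lsl, usl = 42.000, 42.020

    cp, cpk = capability(bores, lsl, usl)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
    ```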

  10. Investigation of optical current transformer signal processing method based on an improved Kalman algorithm

    NASA Astrophysics Data System (ADS)

    Shen, Yan; Ge, Jin-ming; Zhang, Guo-qing; Yu, Wen-bin; Liu, Rui-tong; Fan, Wei; Yang, Ying-xuan

    2018-01-01

    This paper explores the problem of signal processing in optical current transformers (OCTs). Based on the noise characteristics of OCTs, such as overlapping signals, noise frequency bands, low signal-to-noise ratios, and difficulties in acquiring statistical features of noise power, an improved standard Kalman filtering algorithm was proposed for direct current (DC) signal processing. The state-space model of the OCT DC measurement system is first established, and then mixed noise can be processed by adding mixed noise into measurement and state parameters. According to the minimum mean squared error criterion, state predictions and update equations of the improved Kalman algorithm could be deduced based on the established model. An improved central difference Kalman filter was proposed for alternating current (AC) signal processing, which improved the sampling strategy and noise processing of colored noise. Real-time estimation and correction of noise were achieved by designing AC and DC noise recursive filters. Experimental results show that the improved signal processing algorithms had a good filtering effect on the AC and DC signals with mixed noise of OCT. Furthermore, the proposed algorithm was able to achieve real-time correction of noise during the OCT filtering process.
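
    For readers unfamiliar with the baseline that the authors improve upon, the sketch below implements a plain scalar Kalman filter that estimates a constant DC level from noisy samples. It is the textbook filter under invented noise settings, not the paper's improved algorithm for OCT signals.

    ```python
    import numpy as np

    # Minimal scalar Kalman filter estimating a constant DC level from noisy samples.
    # All noise settings are invented; this is not the paper's improved algorithm.
    rng = np.random.default_rng(2)
    true_dc = 1.0
    z = true_dc + rng.normal(0.0, 0.2, 500)      # noisy measurements

    q, r = 1e-6, 0.04                            # assumed process / measurement noise variances
    x, p = 0.0, 1.0                              # initial state estimate and covariance

    for zk in z:
        # Predict (the state is constant, so only the covariance grows).
        p += q
        # Update with the new measurement.
        k = p / (p + r)                          # Kalman gain
        x += k * (zk - x)
        p *= (1 - k)

    print(f"estimated DC level: {x:.3f} (true: {true_dc})")
    ```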

  11. Using Six Sigma to improve once daily gentamicin dosing and therapeutic drug monitoring performance.

    PubMed

    Egan, Sean; Murphy, Philip G; Fennell, Jerome P; Kelly, Sinead; Hickey, Mary; McLean, Carolyn; Pate, Muriel; Kirke, Ciara; Whiriskey, Annette; Wall, Niall; McCullagh, Eddie; Murphy, Joan; Delaney, Tim

    2012-12-01

    Safe, effective therapy with the antimicrobial gentamicin requires good practice in dose selection and monitoring of serum levels. Suboptimal therapy occurs with breakdown in the process of drug dosing, serum blood sampling, laboratory processing and level interpretation. Unintentional underdosing may result. This improvement effort aimed to optimise this process in an academic teaching hospital using Six Sigma process improvement methodology. A multidisciplinary project team was formed. Process measures considered critical to quality were defined, and baseline practice was examined through process mapping and audit. Root cause analysis informed improvement measures. These included a new dosing and monitoring schedule, and standardised assay sampling and drug administration timing which maximised local capabilities. Three iterations of the improvement cycle were conducted over a 24-month period. The attainment of serum level sampling in the required time window improved by 85% (p≤0.0001). A 66% improvement in accuracy of dosing was observed (p≤0.0001). Unnecessary dose omission while awaiting level results and inadvertent disruption to therapy due to dosing and monitoring process breakdown were eliminated. Average daily dose administered increased from 3.39 mg/kg to 4.78 mg/kg/day. Using Six Sigma methodology enhanced gentamicin usage process performance. Local process related factors may adversely affect adherence to practice guidelines for gentamicin, a drug which is complex to use. It is vital to adapt dosing guidance and monitoring requirements so that they are capable of being implemented in the clinical environment as a matter of routine. Improvement may be achieved through a structured localised approach with multidisciplinary stakeholder involvement.

  12. An Aspect-Oriented Framework for Business Process Improvement

    NASA Astrophysics Data System (ADS)

    Pourshahid, Alireza; Mussbacher, Gunter; Amyot, Daniel; Weiss, Michael

    Recently, many organizations invested in Business Process Management Systems (BPMSs) in order to automate and monitor their processes. Business Activity Monitoring is one of the essential modules of a BPMS as it provides the core monitoring capabilities. Although the natural step after process monitoring is process improvement, most of the existing systems do not provide the means to help users with the improvement step. In this paper, we address this issue by proposing an aspect-oriented framework that allows the impact of changes to business processes to be explored with what-if scenarios based on the most appropriate process redesign patterns among several possibilities. As the four cornerstones of a BPMS are process, goal, performance and validation views, these views need to be aligned automatically by any approach that intends to support automated improvement of business processes. Our framework therefore provides means to reflect process changes also in the other views of the business process. A health care case study presented as a proof of concept suggests that this novel approach is feasible.

  13. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
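
    A Monte Carlo forecast of a satisfaction index, of the general kind mentioned above, can be sketched in a few lines. In the Python example below, the driver names, weights, and score distributions are illustrative assumptions and do not reproduce the ACSI model or the dissertation's simulation.

    ```python
    import numpy as np

    # Monte Carlo sketch of a customer-satisfaction-index forecast.
    # Driver weights, score distributions, and the index formula are illustrative.
    rng = np.random.default_rng(3)
    n = 100_000

    drivers = {                       # mean, std of each satisfaction driver (0-100 scale)
        "expectations":      (78, 6),
        "perceived_quality": (82, 5),
        "perceived_value":   (74, 8),
    }
    weights = {"expectations": 0.2, "perceived_quality": 0.5, "perceived_value": 0.3}

    samples = sum(
        weights[name] * rng.normal(mu, sd, n)
        for name, (mu, sd) in drivers.items()
    )
    samples = np.clip(samples, 0, 100)

    print(f"predicted index: mean={samples.mean():.1f}, "
          f"5th-95th percentile=({np.percentile(samples, 5):.1f}, "
          f"{np.percentile(samples, 95):.1f})")
    ```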

  14. Improving surgeon utilization in an orthopedic department using simulation modeling

    PubMed Central

    Simwita, Yusta W; Helgheim, Berit I

    2016-01-01

    Purpose: Worldwide, more than two billion people lack appropriate access to surgical services due to the mismatch between existing human resources and patient demand. Improving utilization of the existing workforce capacity can reduce the gap between surgical demand and available workforce capacity. In this paper, the authors use discrete event simulation to explore the care process at an orthopedic department. Our main focus is improving utilization of surgeons while minimizing patient wait time. Methods: The authors collaborated with orthopedic department personnel to map the current operations of the orthopedic care process in order to identify factors that contribute to poor surgeon utilization and high patient waiting time. The authors used an observational approach to collect data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. The authors developed a proposal scenario to show how to improve surgeon utilization. Results: The simulation results showed that if ancillary services could be performed before the start of clinic examination services, the orthopedic care process could be greatly improved, that is, surgeon utilization improved and patient waiting time reduced. Simulation results demonstrate that with improved surgeon utilization, up to a 55% increase in future demand can be accommodated without patients exceeding the current waiting time at this clinic, thus improving patient access to health care services. Conclusion: This study shows how simulation modeling can be used to improve health care processes. This study was limited to a single care process; however, the findings can be applied to improve other orthopedic care processes with similar operational characteristics. PMID:29355193
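
    A minimal discrete event simulation of the scenario described above (ancillary services completed before the surgeon examination) can be written with the SimPy library, as sketched below. Resource capacities, service times, and arrival rates are invented for illustration and do not represent the studied clinic.

    ```python
    import random
    import simpy

    # Minimal discrete-event sketch of an outpatient orthopedic visit in which
    # ancillary services (e.g., imaging) are completed before the surgeon examination.
    SIM_MINUTES = 8 * 60
    wait_times = []

    def patient(env, ancillary, surgeon):
        arrival = env.now
        with ancillary.request() as req:              # imaging / lab before the exam
            yield req
            yield env.timeout(random.expovariate(1 / 15))
        with surgeon.request() as req:                # surgeon examination
            yield req
            wait_times.append(env.now - arrival)      # time from arrival to exam start
            yield env.timeout(random.expovariate(1 / 20))

    def arrivals(env, ancillary, surgeon):
        while True:
            yield env.timeout(random.expovariate(1 / 12))   # a new patient every ~12 min
            env.process(patient(env, ancillary, surgeon))

    random.seed(42)
    env = simpy.Environment()
    ancillary = simpy.Resource(env, capacity=2)
    surgeon = simpy.Resource(env, capacity=1)
    env.process(arrivals(env, ancillary, surgeon))
    env.run(until=SIM_MINUTES)

    print(f"patients examined: {len(wait_times)}, "
          f"mean wait before exam: {sum(wait_times) / len(wait_times):.1f} min")
    ```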

  15. System and method for networking electrochemical devices

    DOEpatents

    Williams, Mark C.; Wimer, John G.; Archer, David H.

    1995-01-01

    An improved electrochemically active system and method including a plurality of electrochemical devices, such as fuel cells and fluid separation devices, in which the anode and cathode process-fluid flow chambers are connected in fluid-flow arrangements so that the operating parameters of each of said plurality of electrochemical devices which are dependent upon process-fluid parameters may be individually controlled to provide improved operating efficiency. The improvements in operation include improved power efficiency and improved fuel utilization in fuel cell power generating systems and reduced power consumption in fluid separation devices and the like through interstage process fluid parameter control for series networked electrochemical devices. The improved networking method includes recycling of various process flows to enhance the overall control scheme.

  16. Lean-driven improvements slash wait times, drive up patient satisfaction scores.

    PubMed

    2012-07-01

    Administrators at LifePoint Hospitals, based in Brentwood, TN, used lean manufacturing techniques to slash wait times by as much as 30 minutes and achieve double-digit increases in patient satisfaction scores in the EDs at three hospitals. In each case, front-line workers took the lead on identifying opportunities for improvement and redesigning the patient-flow process. As a result of the new efficiencies, patient volume is up by about 25% at all three hospitals. At each hospital, the improvement process began with Kaizen, a lean process that involves bringing personnel together to flow-chart the current system, identify problem areas, and redesign the process. Improvement teams found big opportunities for improvement at the front end of the flow process. Key to the approach was having a plan up front to deal with non-compliance. To sustain improvements, administrators gather and disseminate key metrics on a daily basis.

  17. ISO 9001 in a neonatal intensive care unit (NICU).

    PubMed

    Vitner, Gad; Nadir, Erez; Feldman, Michael; Yurman, Shmuel

    2011-01-01

    The aim of this paper is to present the process for approving and certifying a neonatal intensive care unit to ISO 9001 standards. The process started with the department head's decision to improve service quality before deciding to achieve ISO 9001 certification. Department processes were mapped and quality management mechanisms were developed. Process control and performance measurements were defined and implemented to monitor the daily work. A service satisfaction review was conducted to get feedback from families. In total, 28 processes and related work instructions were defined. Process yields showed service improvements. Family satisfaction improved. The paper is based on preparing only one neonatal intensive care unit to the ISO 9001 standard. The case study should act as an incentive for hospital managers aiming to improve service quality based on the ISO 9001 standard. ISO 9001 is becoming a recommended tool to improve clinical service quality.

  18. The Role of Lean Process Improvement in Implementation of Evidence-Based Practices in Behavioral Health Care.

    PubMed

    Steinfeld, Bradley; Scott, Jennifer; Vilander, Gavin; Marx, Larry; Quirk, Michael; Lindberg, Julie; Koerner, Kelly

    2015-10-01

    To effectively implement evidence-based practices (EBP) in behavioral health care, an organization needs to have operating structures and processes that can address core EBP implementation factors and stages. Lean, a widely used quality improvement process, can potentially address the factors crucial to successful implementation of EBP. This article provides an overview of Lean and the relationship between Lean process improvement steps, and EBP implementation models. Examples of how Lean process improvement methodologies can be used to help plan and carry out implementation of EBP in mental health delivery systems are presented along with limitations and recommendations for future research and clinical application.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, L.

    ITN Energy Systems, Inc., and Global Solar Energy, Inc., with the assistance of NREL's PV Manufacturing R&D program, have continued the advancement of CIGS production technology through the development of trajectory-oriented predictive/control models, fault-tolerance control, control-platform development, in-situ sensors, and process improvements. Modeling activities to date include the development of physics-based and empirical models for CIGS and sputter-deposition processing, implementation of model-based control, and application of predictive models to the construction of new evaporation sources and for control. Model-based control is enabled through implementation of reduced or empirical models into a control platform. Reliability improvement activities include implementation of preventive maintenance schedules; detection of failed sensors/equipment and reconfiguration to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which, in turn, has been enabled by control and reliability improvements due to this PV Manufacturing R&D program. This has resulted in substantial improvements of flexible CIGS PV module performance and efficiency.

  20. Lightning Arrestor Connectors Production Readiness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marten, Steve; Linder, Kim; Emmons, Jim

    2008-10-20

    The Lightning Arrestor Connector (LAC), part “M”, presented opportunities to improve the processes used to fabricate LACs. The A## LACs were the first production LACs produced at the KCP, after the product was transferred from Pinellas. The new LAC relied on the lessons learned from the A## LACs; however, additional improvements were needed to meet budget, yield, and schedule requirements. Improvement projects completed since 2001 include Hermetic Connector Sealing Improvement, Contact Assembly molding Improvement, development of a second vendor for LAC shells, general process improvement, tooling improvement, reduction of the LAC production cycle time, and documentation of the LAC granule fabrication process. This report summarizes the accomplishments achieved in improving the LAC Production Readiness.

  1. Optimizing The DSSC Fabrication Process Using Lean Six Sigma

    NASA Astrophysics Data System (ADS)

    Fauss, Brian

    Alternative energy technologies must become more cost-effective to achieve grid parity with fossil fuels. Dye-sensitized solar cells (DSSCs) are an innovative third-generation photovoltaic technology, which is demonstrating tremendous potential to become a revolutionary technology due to recent breakthroughs in cost of fabrication. The study here focused on quality improvement measures undertaken to improve fabrication of DSSCs and enhance process efficiency and effectiveness. Several quality improvement methods were implemented to optimize the seven individual steps of the DSSC fabrication process. Lean Manufacturing's 5S method successfully increased efficiency in all of the processes. Six Sigma's DMAIC methodology was used to identify and eliminate each of the root causes of defects in the critical titanium dioxide deposition process. These optimizations resulted in the following significant improvements in the production process: 1. fabrication time of the DSSCs was reduced by 54%; 2. fabrication procedures were improved to the extent that all critical defects in the process were eliminated; 3. the quantity of functioning DSSCs fabricated was increased from 17% to 90%.

  2. Improvement of Organizational Performance and Instructional Design: An Analogy Based on General Principles of Natural Information Processing Systems

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Kalyuga, Slava

    2012-01-01

    The process of improving organizational performance through designing systemic interventions has remarkable similarities to designing instruction for improving learners' performance. Both processes deal with subjects (learners and organizations correspondingly) with certain capabilities that are exposed to novel information designed for producing…

  3. Customers First: Using Process Improvement To Improve Service Quality and Efficiency.

    ERIC Educational Resources Information Center

    Larson, Catherine A.

    1998-01-01

    Describes steps in a process-improvement project for reserve book services at the University of Arizona Library: (1) plan--identify process boundaries and customer requirements, gather/analyze data, prioritize problems; (2) do--encourage divergent thinking, reach convergent thinking, find solutions; (3) check--pilot solutions, compare costs; and…

  4. Infrared pre-drying and dry-dehulling of walnuts for improved processing efficiency and product quality

    USDA-ARS?s Scientific Manuscript database

    The walnut industry is faced with an urgent need to improve post-harvest processing efficiency, particularly drying and dehulling operations. This research investigated the feasibility of dry-dehulling and infrared (IR) pre-drying of walnuts for improved processing efficiency and dried product quali...

  5. Environmental Data Flow Six Sigma Process Improvement Savings Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paige, Karen S

    An overview of the Environmental Data Flow Six Sigma improvement project covers LANL’s environmental data processing following receipt from the analytical laboratories. The Six Sigma project identified thirty-three process improvements, many of which focused on cutting costs or reducing the time it took to deliver data to clients.

  6. EUV process improvement with novel litho track hardware

    NASA Astrophysics Data System (ADS)

    Stokes, Harold; Harumoto, Masahiko; Tanaka, Yuji; Kaneyama, Koji; Pieczulewski, Charles; Asai, Masaya

    2017-03-01

    Currently, there are many developments in the field of EUV lithography that are helping to move it towards increased HVM feasibility. Targeted improvements in hardware design for advanced lithography are of interest to our group specifically for metrics such as CD uniformity, LWR, and defect density. Of course, our work is focused on EUV process steps that are specifically affected by litho track performance, and consequently, can be improved by litho track design improvement and optimization. In this study we are building on our experience to provide continual improvement for LWR, CDU, and defects as applied to a standard EUV process by employing novel hardware solutions on our SOKUDO DUO coat develop track system. Although it is preferable to achieve such improvements in the post-etch process, we feel, as many do, that improvements after patterning are a precursor to improvements after etching. We hereby present our work utilizing the SOKUDO DUO coat develop track system with an ASML NXE:3300 in the IMEC (Leuven, Belgium) cleanroom environment to improve aggressive dense L/S patterns.

  7. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    PubMed

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design combined process control and quality improvement methods. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that has led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
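
    As a concrete illustration of the charting approach described above, the sketch below computes three-sigma limits for a p-chart, a common Statistical Process Control chart for proportions such as the percentage of patients with an epidural in severe pain. The monthly counts are hypothetical placeholders, not data from the study.

    ```python
    import math

    def p_chart_limits(event_counts, sample_sizes):
        """Three-sigma limits for a p-chart (proportion of audited patients in severe pain)."""
        p_bar = sum(event_counts) / sum(sample_sizes)       # centre line: pooled proportion
        limits = []
        for n in sample_sizes:
            sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)    # binomial standard error for this subgroup
            limits.append((max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)))
        return p_bar, limits

    # Hypothetical monthly audits: patients in severe pain / patients with an epidural.
    counts = [6, 5, 8, 3, 2]
    sizes = [30, 28, 35, 31, 29]
    centre, month_limits = p_chart_limits(counts, sizes)
    for count, n, (lcl, ucl) in zip(counts, sizes, month_limits):
        status = "in control" if lcl <= count / n <= ucl else "special cause"
        print(f"p={count / n:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}  {status}")
    ```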

  8. SEL's Software Process-Improvement Program

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose

    1995-01-01

    The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. The results of studying over 125 FDD projects have guided the standards, management practices, technologies, and training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified. The goals are now stated as: (1) Understand baseline processes and product characteristics, (2) Assess improvements that have been incorporated into the development projects, (3) Package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement and feedback to projects within the FDD environment. The SEL supports the understanding of the process by studying several process characteristics, including effort distribution and error detection rates. The SEL assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools and training.

  9. Process of discharging charge-build up in slag steelmaking processes

    DOEpatents

    Pal, Uday B.; Gazula, Gopala K. M.; Hasham, Ali

    1994-01-01

    A process and apparatus for improving metal production in ironmaking and steelmaking processes is disclosed. The use of an inert metallic conductor in the slag-containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements.

  10. Study protocol: improving the transition of care from a non-network hospital back to the patient's medical home.

    PubMed

    Ayele, Roman A; Lawrence, Emily; McCreight, Marina; Fehling, Kelty; Peterson, Jamie; Glasgow, Russell E; Rabin, Borsika A; Burke, Robert; Battaglia, Catherine

    2017-02-10

    The process of transitioning Veterans to primary care following a non-Veterans Affairs (VA) hospitalization can be challenging. Poor transitions result in medical complications and increased hospital readmissions. The goal of this transition of care quality improvement (QI) project is to identify gaps in the current transition process and implement an intervention that bridges the gap and improves the current transition of care process within the Eastern Colorado Health Care System (ECHCS). We will employ qualitative methods to understand the current transition of care process back to VA primary care for Veterans who received care in a non-VA hospital in ECHCS. We will conduct in-depth semi-structured interviews with Veterans hospitalized in 2015 in non-VA hospitals as well as both VA and non-VA providers, staff, and administrators involved in the current care transition process. Participants will be recruited using convenience and snowball sampling. Qualitative data analysis will be guided by conventional content analysis and Lean Six Sigma process improvement tools. We will use VA claim data to identify the top ten non-VA hospitals serving rural and urban Veterans by volume and Veterans that received inpatient services at non-VA hospitals. Informed by both qualitative and quantitative data, we will then develop a transition care coordinator-led intervention to improve the transition process. We will test the transition of care coordinator intervention using repeated improvement cycles incorporating salient factors in value stream mapping that are important for an efficient and effective transition process. Furthermore, we will complete a value stream map of the transition process at two other VA Medical Centers and test whether an implementation strategy of audit and feedback (the value stream map of the current transition process with the Transition of Care Dashboard) versus audit and feedback with Transition Nurse facilitation of the process using the Resource Guide and Transition of Care Dashboard improves the transition process, continuity of care, patient satisfaction and clinical outcomes. Our current transition of care process has shortcomings. An intervention utilizing a transition care coordinator has the potential to improve this process. Transitioning Veterans to primary care following a non-VA hospitalization is a crucial step for improving care coordination for Veterans.

  11. Significant improvement in the thermal annealing process of optical resonators

    NASA Astrophysics Data System (ADS)

    Salzenstein, Patrice; Zarubin, Mikhail

    2017-05-01

    Thermal annealing performed during processing improves the surface roughness of optical resonators by reducing stresses at the periphery of their surfaces, thus allowing higher Q-factors. After a preliminary realization, the design of the oven and the electronic method were significantly improved thanks to nichrome resistance-alloy wires and chopped basalt fibers used for thermal insulation during the annealing process. Q-factors can then be improved.

  12. Using Lean Six Sigma Methodology to Improve a Mass Immunizations Process at the United States Naval Academy.

    PubMed

    Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R

    2016-06-01

    Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of LSS application to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunizations process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of giving a different vaccination at each station. After project implementation, the average immunizations lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control with a capability index of 1.18 and performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
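
    The capability and performance indices quoted above follow standard Six Sigma definitions; the sketch below shows one common way such indices and the implied defect rate are computed, assuming a normally distributed metric. The specification limits and process statistics are hypothetical, not figures from the project.

    ```python
    from scipy.stats import norm

    def capability(mean, std, lsl, usl):
        """Process capability (Cp), the centring-aware index (Cpk), and the expected defect rate."""
        cp = (usl - lsl) / (6.0 * std)
        cpk = min((usl - mean) / (3.0 * std), (mean - lsl) / (3.0 * std))
        defect_rate = norm.cdf(lsl, mean, std) + norm.sf(usl, mean, std)  # tails outside the spec limits
        return cp, cpk, defect_rate

    # Hypothetical specification for per-patient immunization lead time, in minutes.
    cp, cpk, dr = capability(mean=12.0, std=1.5, lsl=5.0, usl=17.0)
    print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  expected defect rate={dr:.4%}")
    ```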

  13. How high-performance work systems drive health care value: an examination of leading process improvement strategies.

    PubMed

    Robbins, Julie; Garman, Andrew N; Song, Paula H; McAlearney, Ann Scheck

    2012-01-01

    As hospitals focus on increasing health care value, process improvement strategies have proliferated, seemingly faster than the evidence base supporting them. Yet, most process improvement strategies are associated with work practices for which solid evidence does exist. Evaluating improvement strategies in the context of evidence-based work practices can provide guidance about which strategies would work best for a given health care organization. We combined a literature review with analysis of key informant interview data collected from 5 case studies of high-performance work practices (HPWPs) in health care organizations. We explored the link between an evidence-based framework for HPWP use and 3 process improvement strategies: Hardwiring Excellence, Lean/Six Sigma, and Baldrige. We found that each of these process improvement strategies has not only strengths but also important gaps with respect to incorporating HPWPs involving engaging staff, aligning leaders, acquiring and developing talent, and empowering the front line. Given differences among these strategies, our analyses suggest that some may work better than others for individual health care organizations, depending on the organizations' current management systems. In practice, most organizations implementing improvement strategies would benefit from including evidence-based HPWPs to maximize the potential for process improvement strategies to increase value in health care.

  14. Measuring the value of process improvement initiatives in a preoperative assessment center using time-driven activity-based costing.

    PubMed

    French, Katy E; Albright, Heidi W; Frenzel, John C; Incalcaterra, James R; Rubio, Augustin C; Jones, Jessica F; Feeley, Thomas W

    2013-12-01

    The value and impact of process improvement initiatives are difficult to quantify. We describe the use of time-driven activity-based costing (TDABC) in a clinical setting to quantify the value of process improvements in terms of cost, time and personnel resources. The problem was the difficulty of identifying and measuring the cost savings of process improvement initiatives in a Preoperative Assessment Center (PAC). The goal was to use TDABC to measure the value of process improvement initiatives that reduce the costs of performing a preoperative assessment while maintaining the quality of the assessment. We applied the principles of TDABC in a PAC to measure the value, from baseline, of two phases of performance improvement initiatives and to determine the impact of each implementation in terms of cost, time and efficiency. Through two rounds of performance improvements, we quantified an overall reduction in time spent by patient and personnel of 33% that resulted in a 46% reduction in the costs of providing care in the center. The performance improvements resulted in a 17% decrease in the total number of full-time equivalents (FTEs) needed to staff the center and a 19% increase in the number of patients assessed in the center. Quality of care, as assessed by the rate of cancellations on the day of surgery, was not adversely impacted by the process improvements. © 2013 Published by Elsevier Inc.
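
    For readers unfamiliar with TDABC, the minimal sketch below shows its two core calculations: a capacity cost rate (cost per minute of supplying capacity) and an activity cost (minutes consumed times that rate). All monetary and time figures are hypothetical placeholders, not values from the PAC study.

    ```python
    def capacity_cost_rate(total_cost, practical_capacity_minutes):
        """TDABC step 1: cost per minute of supplying clinical capacity."""
        return total_cost / practical_capacity_minutes

    def activity_cost(minutes_per_activity, cost_rate):
        """TDABC step 2: cost of one activity = time consumed x cost per minute."""
        return minutes_per_activity * cost_rate

    # Hypothetical monthly figures for a preoperative assessment centre (illustrative only).
    rate = capacity_cost_rate(total_cost=250_000.0, practical_capacity_minutes=40_000.0)
    baseline = activity_cost(minutes_per_activity=90, cost_rate=rate)   # before improvement
    improved = activity_cost(minutes_per_activity=60, cost_rate=rate)   # after cutting process time by a third
    print(f"cost per assessment: {baseline:.2f} -> {improved:.2f} (rate {rate:.2f}/min)")
    ```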

  15. The experience factory: Can it make you a 5? or what is its relationship to other quality and improvement concepts?

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1992-01-01

    The concepts of quality improvement have permeated many businesses. It is clear that the nineties will be the quality era for software and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human-based. There is a lack of models that allow us to reason about the process and the product. All software is not the same; process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. There have been a variety of organizational frameworks proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvements through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long-term success through customer satisfaction based on the participation of all members of an organization; the SEI capability maturity model, a staged process improvement based upon assessment against a set of key process areas until level 5 is reached, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of production on 'value added' activities and the elimination or reduction of 'not value added' activities.

  16. Application of process improvement principles to increase the frequency of complete airway management documentation.

    PubMed

    McCarty, L Kelsey; Saddawi-Konefka, Daniel; Gargan, Lauren M; Driscoll, William D; Walsh, John L; Peterfreund, Robert A

    2014-12-01

    Process improvement in healthcare delivery settings can be difficult, even when there is consensus among clinicians about a clinical practice or desired outcome. Airway management is a medical intervention fundamental to the delivery of anesthesia care. Like other medical interventions, a detailed description of the management methods should be documented. Despite this expectation, airway documentation is often insufficient. The authors hypothesized that formal adoption of process improvement methods could be used to increase the rate of "complete" airway management documentation. The authors defined a set of criteria as a local practice standard of "complete" airway management documentation. The authors then employed selected process improvement methodologies over 13 months in three iterative and escalating phases to increase the percentage of records with complete documentation. The criteria were applied retrospectively to determine the baseline frequency of complete records, and prospectively to measure the impact of process improvements efforts over the three phases of implementation. Immediately before the initial intervention, a retrospective review of 23,011 general anesthesia cases over 6 months showed that 13.2% of patient records included complete documentation. At the conclusion of the 13-month improvement effort, documentation improved to a completion rate of 91.6% (P<0.0001). During the subsequent 21 months, the completion rate was sustained at an average of 90.7% (SD, 0.9%) across 82,571 general anesthetic records. Systematic application of process improvement methodologies can improve airway documentation and may be similarly effective in improving other areas of anesthesia clinical practice.

  17. A Mixed-Methods Research Framework for Healthcare Process Improvement.

    PubMed

    Bastian, Nathaniel D; Munoz, David; Ventura, Marta

    2016-01-01

    The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.

  18. Coal liquefaction process with enhanced process solvent

    DOEpatents

    Givens, Edwin N.; Kang, Dohee

    1984-01-01

    In an improved coal liquefaction process, including a critical solvent deashing stage, high value product recovery is improved and enhanced process-derived solvent is provided by recycling second separator underflow in the critical solvent deashing stage to the coal slurry mix, for inclusion in the process solvent pool.

  19. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
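
    A minimal discrete-event simulation sketch of a planning pathway is shown below. It uses the open-source simpy library rather than the Arena software named in the abstract, and all resource counts and task durations are illustrative assumptions, not parameters from the BC Cancer Agency model.

    ```python
    import random
    import simpy  # assumed DES library; install with `pip install simpy`

    MEAN_CONTOUR_H, MEAN_PLAN_H, MEAN_REVIEW_H = 4.0, 6.0, 2.0   # illustrative task durations (hours)

    def plan_patient(env, dosimetrists, oncologists, turnaround_log):
        """One plan's pathway: contouring -> dose planning -> oncologist review."""
        start = env.now
        with oncologists.request() as req:                        # contouring by the oncologist
            yield req
            yield env.timeout(random.expovariate(1 / MEAN_CONTOUR_H))
        with dosimetrists.request() as req:                       # dose planning by a dosimetrist
            yield req
            yield env.timeout(random.expovariate(1 / MEAN_PLAN_H))
        with oncologists.request() as req:                        # final plan review and approval
            yield req
            yield env.timeout(random.expovariate(1 / MEAN_REVIEW_H))
        turnaround_log.append(env.now - start)

    def run(n_patients=200, mean_arrival_gap_h=3.0, seed=42):
        random.seed(seed)
        env = simpy.Environment()
        dosimetrists = simpy.Resource(env, capacity=3)
        oncologists = simpy.Resource(env, capacity=2)
        log = []

        def arrivals(env):
            for _ in range(n_patients):
                env.process(plan_patient(env, dosimetrists, oncologists, log))
                yield env.timeout(random.expovariate(1 / mean_arrival_gap_h))

        env.process(arrivals(env))
        env.run()                                                 # run until all planning work is finished
        return sum(log) / len(log)

    print(f"mean planning turnaround: {run():.1f} h")
    ```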

  20. Quality Improvement Process in a Large Intensive Care Unit: Structure and Outcomes.

    PubMed

    Reddy, Anita J; Guzman, Jorge A

    2016-11-01

    Quality improvement in the health care setting is a complex process, and even more so in the critical care environment. The development of intensive care unit process measures and quality improvement strategies are associated with improved outcomes, but should be individualized to each medical center as structure and culture can differ from institution to institution. The purpose of this report is to describe the structure of quality improvement processes within a large medical intensive care unit while using examples of the study institution's successes and challenges in the areas of stat antibiotic administration, reduction in blood product waste, central line-associated bloodstream infections, and medication errors. © The Author(s) 2015.

  1. Health-care process improvement decisions: a systems perspective.

    PubMed

    Walley, Paul; Silvester, Kate; Mountford, Shaun

    2006-01-01

    The paper seeks to investigate decision-making processes within hospital improvement activity, to understand how performance measurement systems influence decisions and potentially lead to unsuccessful or unsustainable process changes. A longitudinal study over a 33-month period investigates key events, decisions and outcomes at one medium-sized hospital in the UK. Process improvement events are monitored using process control methods and by direct observation. The authors took a systems perspective of the health-care processes, ensuring that the impacts of decisions across the health-care supply chain were appropriately interpreted. The research uncovers the ways in which measurement systems disguise failed decisions and encourage managers to take a low-risk approach of "symptomatic relief" when trying to improve performance metrics. This prevents many managers from trying higher-risk, sustainable process improvement changes. The behaviour of the health-care system is not understood by many managers and this leads to poor analysis of problem situations. Measurement using time-series methodologies, such as statistical process control, is vital for a better understanding of the systems impact of changes. Senior managers must also be aware of the behavioural influence of similar performance measurement systems that discourage sustainable improvement. There is a risk that such experiences will tarnish the reputation of performance management as a discipline. The paper recommends process control measures as a way of creating an organizational memory of how decisions affect performance, something that is currently lacking.

  2. Possible Overlapping Time Frames of Acquisition and Consolidation Phases in Object Memory Processes: A Pharmacological Approach

    ERIC Educational Resources Information Center

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when…

  3. Agent-Based Computing in Distributed Adversarial Planning

    DTIC Science & Technology

    2010-08-09

    An agent is expected to agree to deviate from its optimal uncoordinated plan only if it improves its position. Regarding process models for opponent modeling, we have analyzed the suitability of business process models for creating...

  4. Advances in Measuring Soil Moisture using Global Navigation Satellite Systems Interferometric Reflectometry (GNSS-IR)

    NASA Astrophysics Data System (ADS)

    Moore, A. W.; Small, E. E.; Owen, S. E.; Hardman, S. H.; Wong, C.; Freeborn, D. J.; Larson, K. M.

    2016-12-01

    GNSS Interferometric Reflectometry (GNSS-IR) uses GNSS signals reflected off the land to infer changes in the near-antenna environment and monitor fluctuations in soil moisture, as well as other related hydrologic variables: snow depth/snow water equivalent (SWE), vegetation water content, and water level [Larson and Small, 2013; McCreight et al., 2014; Larson et al., 2013]. GNSS instruments installed by geoscientists and surveyors to measure land motions can measure soil moisture fluctuations with accuracy (RMSE <0.04 cm3/cm3 [Small et al., 2016]) and latency sufficient for many applications (e.g., weather forecasting, climate studies, satellite validation). The soil moisture products have a unique and complementary footprint intermediate in scale between satellite and standard in situ sensors. Variations in vegetation conditions introduce considerable errors, but algorithms have been developed to address this issue [Small et al., 2016]. A pilot project (PBO H2O) using 100+ GPS sites in the western U.S. (Figure 1) from a single network (the Plate Boundary Observatory) has been operated by the University of Colorado (CU) at http://xenon.colorado.edu/portal since October 2012. JPL and CU are funded by NASA ESTO to refactor the PBO H2O software within an Apache OODT framework for robust operational analysis of soil moisture data and auto-configuration when new stations are added. We will report progress on the new GNSS H2O analysis portal, and plans to expand to global networks and from GPS to other GNSS signals. References: Larson, K. M., & Small, E. E. (2013). Eos, 94(52), 505-512. McCreight, J. L., Small, E. E., & Larson, K. M. (2014). Water Resour. Res., 50(8), 6892-6909. Larson, K. M., Ray, R. D., Nievinski, F. G., & Freymueller, J. T. (2013). IEEE Geosci Remote S, 10(5), 1200-1204. Small, E. E., Larson, K. M., Chew, C. C., Dong, J., & Ochsner, T. E. (2016). IEEE J Sel. Top. Appl. PP(39). Figure 1: (R) Western U.S. GPS-IR soil moisture sites. (L): Products derived from GNSS reflection data for (clockwise from upper left) vegetation water content, SWE, sea level, and volumetric soil moisture.
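
    The sketch below illustrates the core GNSS-IR retrieval idea described in the Larson and Small references: the detrended SNR oscillation is modelled as A*cos(4*pi*h/lambda * sin(e) + phi), and a Lomb-Scargle periodogram over candidate reflector heights h finds the dominant frequency, whose phase and amplitude variations are then related to soil moisture. This is a generic illustration, not code from the PBO H2O or GNSS H2O systems.

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    GPS_L1_WAVELENGTH_M = 0.1903   # GPS L1 carrier wavelength, metres

    def estimate_reflector_height(elev_deg, snr_detrended, h_min=0.5, h_max=6.0, n_heights=500):
        """Find the reflector height whose oscillation frequency best fits detrended SNR data.

        The multipath modulation is A*cos(4*pi*h/lambda * sin(e) + phi), so each candidate
        height h maps to an angular frequency 4*pi*h/lambda in the sin(elevation) domain.
        """
        x = np.sin(np.radians(elev_deg))
        heights = np.linspace(h_min, h_max, n_heights)
        omegas = 4.0 * np.pi * heights / GPS_L1_WAVELENGTH_M
        power = lombscargle(x, snr_detrended - snr_detrended.mean(), omegas)
        return heights[np.argmax(power)]

    # Synthetic check: a 2.0 m reflector height should be recovered from noiseless data.
    e = np.linspace(5.0, 25.0, 400)
    snr = np.cos(4.0 * np.pi * 2.0 / GPS_L1_WAVELENGTH_M * np.sin(np.radians(e)))
    print(estimate_reflector_height(e, snr))   # ~2.0
    ```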

  5. Ninety to Nothing: a PDSA quality improvement project.

    PubMed

    Prybutok, Gayle Linda

    2018-05-14

    Purpose: The purpose of this paper is to present a case study of a successful quality improvement project in an acute care hospital focused on reducing the time of the total patient visit in the emergency department. Design/methodology/approach: A multidisciplinary quality improvement team, using the PDSA (Plan, Do, Study, Act) Cycle, analyzed the emergency department care delivery process and sequentially made process improvements that contributed to project success. Findings: The average turnaround time goal of 90 minutes or less per visit was achieved in four months, and the organization enjoyed significant collateral benefits both internal to the organization and for its customers. Practical implications: This successful PDSA process can be duplicated by healthcare organizations of all sizes seeking to improve a process related to timely, high-quality patient care delivery. Originality/value: Extended wait time in hospital emergency departments is a universal problem in the USA that reduces the quality of the customer experience and that delays necessary patient care. This case study demonstrates that a structured quality improvement process implemented by a multidisciplinary team with the authority to make necessary process changes can successfully redefine the norm.

  6. Modelling energy and environmental impacts of traditional and improved shea butter production in West Africa for food security.

    PubMed

    Naughton, Colleen C; Zhang, Qiong; Mihelcic, James R

    2017-01-15

    This study improves the global application of methods and analyses, especially Life Cycle Assessment (LCA), that properly incorporate environmental impacts of firewood and a social sustainability indicator (human energy) as tools for sustainable human development. Specifically, shea butter production processes, common throughout sub-Saharan Africa and crucial to food security, environmental sustainability, and women's empowerment, are analyzed. Many economic activities in the world rely on firewood for energy and labor that aren't included in traditional LCAs. Human energy (entirely from women) contributed 25-100% of shea butter production processes (2000-6100 kJ/kg of shea butter) and mechanized production processes had reduced human energy without considerably greater total energy. Firewood accounted for 94-100% of total embodied energy (103 and 172 MJ/kg of shea butter for improved and traditional shea butter production processes respectively) and global warming potential and 18-100% of human toxicity of the production processes. Implementation of improved cookstoves modeled in this study could reduce: (1) global warming potential by 78% (from 18 to 4.1 kg CO2-eq/kg and 11 to 2.4 kg CO2-eq/kg of shea butter for the traditional and improved processes respectively), (2) the embodied energy of using firewood by 52% (from 170 to 82 MJ/kg and 103 to 49 MJ/kg for the traditional and improved processes respectively), and (3) human toxicity by 83% for the non-mechanized traditional and improved processes (from 0.041 to 0.0071 1,4-DB eq/kg and 0.025 to 0.0042 1,4-DB eq/kg respectively). In addition, this is the first study to compare Economic Input-Output Life Cycle Assessment (EIO-LCA) and process-based LCA in a developing country and evaluate five traditional and improved shea butter production processes over different impact categories. Overall, this study developed a framework to evaluate and improve processes for achievement of the United Nations' Sustainable Development Goals for 2030, particularly to obtain food security. Copyright © 2016 Elsevier B.V. All rights reserved.
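
    The aggregation step common to process-based LCA results like those above can be summarized as impact = sum(flow quantity x characterization factor) over the inventory. The sketch below is a generic illustration with made-up inventory quantities and factors; it does not reproduce the study's data or the EIO-LCA comparison.

    ```python
    # Hypothetical inventory for producing 1 kg of shea butter; all numbers are placeholders.
    inventory_per_kg_butter = {
        "firewood_burned_kg": 5.0,       # firewood consumed per kg of shea butter
        "kerosene_lamp_MJ": 0.2,         # ancillary energy use
    }
    gwp_factors = {                      # kg CO2-eq per unit of each flow (illustrative)
        "firewood_burned_kg": 1.8,
        "kerosene_lamp_MJ": 0.07,
    }

    def impact(inventory, factors):
        """Aggregate one impact category over all inventory flows."""
        return sum(quantity * factors.get(flow, 0.0) for flow, quantity in inventory.items())

    print(f"GWP: {impact(inventory_per_kg_butter, gwp_factors):.2f} kg CO2-eq per kg shea butter")
    ```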

  7. Impact of Radio Frequency Identification (RFID) on the Marine Corps’ Supply Process

    DTIC Science & Technology

    2006-09-01

    Hypothetical Improvement Using a Real-Time Order Processing System Vice a Batch Order Processing System ................56 3. As-Is: The Current... Processing System Vice a Batch Order Processing System ................58 V. RESULTS ................................................69 A. SIMULATION...Time: Hypothetical Improvement Using a Real-Time Order Processing System Vice a Batch Order Processing System ................71 3. As-Is: The

  8. Proceedings of Synthetic Biology: Engineering, Evolution and Design (SEED) Conference 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silver, Pamela; Flach, Evan

    Synthetic Biology is an emerging discipline that seeks to accelerate the process of engineering biology. As such, the tools are broadly applicable to application areas, including chemicals and biofuels, materials, medicine and agriculture. A characteristic of the field is to look holistically at cellular design, from sensing and genetic circuitry to the manipulation of cellular processes and actuators, to controlling metabolism, to programming multicellular behaviors. Further, the types of cells that are manipulated are broad, from in vitro systems to microbes and fungi to mammalian and plant cells and living animals. Many of the projects in synthetic biology seek to move biochemical functions across organisms. The field is highly interdisciplinary with faculty and students spread across departments that focus on engineering (biological, chemical, electrical, mechanical, civil, computer science) and basic science (biology and systems biology, chemistry, physics). While there have been many one-off workshops and meetings on synthetic biology, the 2014 Synthetic Biology: Engineering, Evolution and Design (SEED) was the first of an annual conference series that serves as a reliable place to pull together the involved disciplines in order to organize and exchange advances in the science and technology in the field. Further, the SEED conferences have a strong focus on industry, with many companies represented and actively participating. A number of these companies have started major efforts in synthetic biology, including large companies (e.g., Pfizer, Novartis, Dow, Dupont, BP, Total), smaller companies that have recently gone public (e.g., Amyris, Gevo, Intrexon), and many start-ups (e.g., Teslagen, Refactored Materials, Pivot, Genomatica). There are a number of loosely affiliated Synthetic Biology Centers, including ones at MIT, Boston University, UCSD, UCSF, UC-Berkeley, Imperial College, Oxford, and ETH. SEED 2015 will serve as the primary meeting at which international synthetic biology centers and related infrastructure (synthesis/software/foundries) meet to discuss technology, standards, and education. SEED 2015 will be the second in an annual series of meetings held to bring together researchers from industry and academia in the area of Synthetic Biology. The first SEED conference was highly successful, attracting 285 attendees with varying backgrounds from academia, industry and government. The SEED series provides leadership in the development of the field of synthetic biology and serves to broaden the participants in the field by appealing to broad sectors in industry and providing a means for young investigators and those outside of the field to participate. Further, the series closely integrates with groups such as the SBCC to provide a means by which the synthetic biology community can communicate with policy makers. Further, we will pursue making the meeting the center for the exchange of educational materials as centers for synthetic biology emerge globally. Proceedings will be published each year in the journal ACS Synthetic Biology. After each SEED meeting, surveys are distributed to assess the success of the conference and to help guide changes year-to-year. The diverse application areas further extend the expertise needed from people in areas such as plant biology, agriculture and soil science, environmental science, medicine, and the chemical industry. These areas could have a widespread impact on society in a number of ways.
For example, the CRISPR/Cas9 system that serves to immunize bacteria from phage has provided the fundamental chemistry that is used to edit the genomes of diverse organisms, including human stem cells, crop plants, and livestock animals.

  9. The Improvement Cycle: Analyzing Our Experience

    NASA Technical Reports Server (NTRS)

    Pajerski, Rose; Waligora, Sharon

    1996-01-01

    NASA's Software Engineering Laboratory (SEL), one of the earliest pioneers in the areas of software process improvement and measurement, has had a significant impact on the software business at NASA Goddard. At the heart of the SEL's improvement program is a belief that software products can be improved by optimizing the software engineering process used to develop them and a long-term improvement strategy that facilitates small incremental improvements that accumulate into significant gains. As a result of its efforts, the SEL has incrementally reduced development costs by 60%, decreased error rates by 85%, and reduced cycle time by 25%. In this paper, we analyze the SEL's experiences on three major improvement initiatives to better understand the cyclic nature of the improvement process and to understand why some improvements take much longer than others.

  10. Performance in Physiology Evaluation: Possible Improvement by Active Learning Strategies

    ERIC Educational Resources Information Center

    Montrezor, Luís H.

    2016-01-01

    The evaluation process is complex and extremely important in the teaching/learning process. Evaluations are constantly employed in the classroom to assist students in the learning process and to help teachers improve the teaching process. The use of active methodologies encourages students to participate in the learning process, encourages…

  11. Process Evaluation for Improving K12 Program Effectiveness: Case Study of a National Institutes of Health Building Interdisciplinary Research Careers in Women's Health Research Career Development Program.

    PubMed

    Raymond, Nancy C; Wyman, Jean F; Dighe, Satlaj; Harwood, Eileen M; Hang, Mikow

    2018-06-01

    Process evaluation is an important tool in quality improvement efforts. This article illustrates how a systematic and continuous evaluation process can be used to improve the quality of faculty career development programs by using the University of Minnesota's Building Interdisciplinary Research Careers in Women's Health (BIRCWH) K12 program as an exemplar. Data from a rigorous process evaluation incorporating quantitative and qualitative measurements were analyzed and reviewed by the BIRCWH program leadership on a regular basis. Examples are provided of how this evaluation model and processes were used to improve many aspects of the program, thereby improving scholar, mentor, and advisory committee members' satisfaction and scholar outcomes. A rigorous evaluation plan can increase the effectiveness and impact of a research career development plan.

  12. Quality initiatives: planning, setting up, and carrying out radiology process improvement projects.

    PubMed

    Tamm, Eric P; Szklaruk, Janio; Puthooran, Leejo; Stone, Danna; Stevens, Brian L; Modaro, Cathy

    2012-01-01

    In the coming decades, those who provide radiologic imaging services will be increasingly challenged by the economic, demographic, and political forces affecting healthcare to improve their efficiency, enhance the value of their services, and achieve greater customer satisfaction. It is essential that radiologists master and consistently apply basic process improvement skills that have allowed professionals in many other fields to thrive in a competitive environment. The authors provide a step-by-step overview of process improvement from the perspective of a radiologic imaging practice by describing their experience in conducting a process improvement project: to increase the daily volume of body magnetic resonance imaging examinations performed at their institution. The first step in any process improvement project is to identify and prioritize opportunities for improvement in the work process. Next, an effective project team must be formed that includes representatives of all participants in the process. An achievable aim must be formulated, appropriate measures selected, and baseline data collected to determine the effects of subsequent efforts to achieve the aim. Each aspect of the process in question is then analyzed by using appropriate tools (eg, flowcharts, fishbone diagrams, Pareto diagrams) to identify opportunities for beneficial change. Plans for change are then established and implemented with regular measurements and review followed by necessary adjustments in course. These so-called PDSA (planning, doing, studying, and acting) cycles are repeated until the aim is achieved or modified and the project closed.
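
    One of the tools named above, the Pareto diagram, ranks causes by frequency and tracks their cumulative share so a team can focus on the vital few. The sketch below shows that calculation with hypothetical delay categories; the counts are illustrative, not data from the authors' MRI project.

    ```python
    from collections import Counter

    # Hypothetical tally of reasons MRI time slots went unused (categories are illustrative).
    causes = Counter({
        "protocol not finalized": 42,
        "late patient arrival": 18,
        "scanner downtime": 9,
        "staffing gap": 6,
        "other": 5,
    })

    total = sum(causes.values())
    cumulative = 0
    for cause, count in causes.most_common():       # descending order, as in a Pareto chart
        cumulative += count
        print(f"{cause:24s} {count:3d}  {100 * cumulative / total:5.1f}% cumulative")
    ```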

  13. Information Technology Process Improvement Decision-Making: An Exploratory Study from the Perspective of Process Owners and Process Managers

    ERIC Educational Resources Information Center

    Lamp, Sandra A.

    2012-01-01

    There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perception, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…

  14. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process contribute to that efficiency. This paper hence proposes a software process maintenance framework that consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  15. Teacher Research as Continuous Process Improvement

    ERIC Educational Resources Information Center

    Ellis, Charles; Castle, Kathryn

    2010-01-01

    Purpose: Teacher research (inquiry) has been characterized as practice improvement, professional development and action research, among numerous names and descriptions. The purpose of this paper is to support the case that teacher research is also a form of quality improvement known as continuous process improvement (CPI).…

  16. Roofline Analysis in the Intel® Advisor to Deliver Optimized Performance for applications on Intel® Xeon Phi™ Processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koskela, Tuomas S.; Lobet, Mathieu; Deslippe, Jack

    In this session we show, in two case studies, how the roofline feature of Intel Advisor has been utilized to optimize the performance of kernels of the XGC1 and PICSAR codes in preparation for the Intel Knights Landing architecture. The impact of the implemented optimizations and the benefits of using the automatic roofline feature of Intel Advisor to study performance of large applications will be presented. This demonstrates an effective optimization strategy that has enabled these science applications to achieve up to 4.6 times speed-up and prepare for future exascale architectures. # Goal/Relevance of Session The roofline model [1,2] is a powerful tool for analyzing the performance of applications with respect to the theoretical peak achievable on a given computer architecture. It allows one to graphically represent the performance of an application in terms of operational intensity, i.e., the ratio of flops performed to bytes moved from memory, in order to guide optimization efforts. Given the scale and complexity of modern science applications, it can often be a tedious task for the user to perform the analysis on the level of functions or loops to identify where performance gains can be made. With new Intel tools, it is now possible to automate this task, as well as base the estimates of peak performance on measurements rather than vendor specifications. The goal of this session is to demonstrate how the roofline feature of Intel Advisor can be used to balance memory- vs. computation-related optimization efforts and effectively identify performance bottlenecks. A series of typical optimization techniques (cache blocking, structure refactoring, data alignment, and vectorization), illustrated by the kernel cases, will be addressed. # Description of the codes ## XGC1 The XGC1 code [3] is a magnetic fusion Particle-In-Cell code that uses an unstructured mesh for its Poisson solver that allows it to accurately resolve the edge plasma of a magnetic fusion device. After recent optimizations to its collision kernel [4], most of the computing time is spent in the electron push (pushe) kernel, where these optimization efforts have been focused. The kernel code scaled well with MPI+OpenMP but had almost no automatic compiler vectorization, in part due to indirect memory addresses and in part due to low trip counts of low-level loops that would be candidates for vectorization. Particle blocking and sorting have been implemented to increase trip counts of low-level loops and improve memory locality, and OpenMP directives have been added to vectorize compute-intensive loops that were identified by Advisor. The optimizations have improved the performance of the pushe kernel 2x on Haswell processors and 1.7x on KNL. The KNL node-for-node performance has been brought to within 30% of a NERSC Cori phase I Haswell node and we expect to bridge this gap by reducing the memory footprint of compute-intensive routines to improve cache reuse. ## PICSAR is a Fortran/Python high-performance Particle-In-Cell library targeting MIC architectures, first designed to be coupled with the PIC code WARP for the simulation of laser-matter interaction and particle accelerators. PICSAR also contains a FORTRAN stand-alone kernel for performance studies and benchmarks. An MPI domain decomposition is used between NUMA domains and a tile decomposition (cache-blocking) handled by OpenMP has been added for shared-memory parallelism and better cache management.
The so-called current deposition and field gathering steps that compose the PIC time loop constitute major hotspots that have been rewritten to enable more efficient vectorization. Particle communications between tiles and MPI domains have been merged and parallelized. Taken together, these improvements provide speedups of 3.1 for order 1 and 4.6 for order 3 interpolation shape factors on KNL configured in SNC4 quadrant flat mode. Performance is similar between a node of Cori phase 1 and KNL at order 1, and better on KNL by a factor of 1.6 at order 3 with the considered test case (homogeneous thermal plasma).
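
    The roofline bound referred to in this session can be expressed in a few lines: attainable performance is the minimum of the machine's peak FLOP rate and memory bandwidth times arithmetic intensity (FLOPs per byte moved). The peak values in the sketch below are illustrative placeholders, not measured Haswell or KNL numbers.

    ```python
    PEAK_GFLOPS = 2600.0   # hypothetical double-precision peak, GFLOP/s
    PEAK_BW_GBS = 450.0    # hypothetical high-bandwidth-memory rate, GB/s

    def arithmetic_intensity(flops, bytes_moved):
        """Operational intensity of a kernel: FLOPs performed per byte moved from memory."""
        return flops / bytes_moved

    def roofline_bound(ai):
        """Upper bound on performance (GFLOP/s) for a kernel with arithmetic intensity `ai`."""
        return min(PEAK_GFLOPS, PEAK_BW_GBS * ai)

    # Example: 2e9 FLOPs against 4e9 bytes of traffic gives AI = 0.5, a bandwidth-bound kernel.
    ai = arithmetic_intensity(2e9, 4e9)
    print(f"AI = {ai:.2f} FLOP/byte -> bound = {roofline_bound(ai):.0f} GFLOP/s")
    ```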

  17. Identification of High Performance, Low Environmental Impact Materials and Processes Using Systematic Substitution (SyS)

    NASA Technical Reports Server (NTRS)

    Dhooge, P. M.; Nimitz, J. S.

    2001-01-01

    Process analysis can identify opportunities for efficiency improvement including cost reduction, increased safety, improved quality, and decreased environmental impact. A thorough, systematic approach to materials and process selection is valuable in any analysis. New operations and facilities design offer the best opportunities for proactive cost reduction and environmental improvement, but existing operations and facilities can also benefit greatly. Materials and processes that have been used for many years may be sources of excessive resource use, waste generation, pollution, and cost burden that should be replaced. Operational and purchasing personnel may not recognize some materials and processes as problems. Reasons for materials or process replacement may include quality and efficiency improvements, excessive resource use and waste generation, materials and operational costs, safety (flammability or toxicity), pollution prevention, compatibility with new processes or materials, and new or anticipated regulations.

  18. Self-Study Guide for Florida VPK Provider Improvement Plan Development

    ERIC Educational Resources Information Center

    Phillips, Beth M.; Mazzeo, Debbie; Smith, Kevin

    2016-01-01

    This Self-Study Guide has been developed to support Florida Voluntary Prekindergarten Providers (VPK) who are required to complete an improvement plan process (i.e., low-performing providers). The guide has sections that can be used during both the process of selecting target areas for an improvement plan and the process of implementing new or…

  19. Linking the Teacher Appraisal Process to the School Improvement Plan

    ERIC Educational Resources Information Center

    Reddekopp, Therese

    2007-01-01

    If a school improvement plan includes input from all stakeholders and focuses on data-driven processes that are linked to teacher appraisal, it can be powerful in leading the school toward the common mission of achieving student success. Linking the school improvement plan to the teacher appraisal process creates a system whereby all individuals…

  20. 12 CFR 541.16 - Improved residential real estate.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... improved residential real estate means residential real estate containing offsite or other improvements sufficient to make the property ready for primarily residential construction, and real estate in the process of being improved by a building or buildings to be constructed or in the process of construction for...

  1. Improvements in the efficiency of turboexpanders in cryogenic applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agahi, R.R.; Lin, M.C.; Ershaghi, B.

    1996-12-31

    Process designers have utilized turboexpanders in cryogenic processes because of their higher thermal efficiencies when compared with conventional refrigeration cycles. Process design and equipment performance have improved substantially through the utilization of modern technologies. Turboexpander manufacturers have also adopted Computational Fluid Dynamic Software, Computer Numerical Control Technology and Holography Techniques to further improve an already impressive turboexpander efficiency performance. In this paper, the authors explain the design process of the turboexpander utilizing modern technology. Two cases of turboexpanders processing helium (4.35 K) and hydrogen (56 K) will be presented.
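
    One common way expander performance of the kind discussed above is quantified is the isentropic efficiency, the actual enthalpy drop divided by the ideal isentropic drop. The sketch below uses placeholder enthalpy values; the definition is standard, but nothing here is taken from the paper.

    ```python
    def isentropic_efficiency(h_in, h_out_actual, h_out_isentropic):
        """Expander isentropic efficiency: actual enthalpy drop over the ideal (isentropic) drop."""
        return (h_in - h_out_actual) / (h_in - h_out_isentropic)

    # Hypothetical enthalpies in kJ/kg for one expansion stage; values are placeholders only.
    eta = isentropic_efficiency(h_in=1500.0, h_out_actual=1230.0, h_out_isentropic=1200.0)
    print(f"expander isentropic efficiency: {eta:.0%}")   # 90% with these illustrative numbers
    ```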

  2. Results of a Regional Effort to Improve Warfarin Management.

    PubMed

    Rose, Adam J; Park, Angela; Gillespie, Christopher; Van Deusen Lukas, Carol; Ozonoff, Al; Petrakis, Beth Ann; Reisman, Joel I; Borzecki, Ann M; Benedict, Ashley J; Lukesh, William N; Schmoke, Timothy J; Jones, Ellen A; Morreale, Anthony P; Ourth, Heather L; Schlosser, James E; Mayo-Smith, Michael F; Allen, Arthur L; Witt, Daniel M; Helfrich, Christian D; McCullough, Megan B

    2017-05-01

    Improved anticoagulation control with warfarin reduces adverse events and represents a target for quality improvement. No previous study has described an effort to improve anticoagulation control across a health system. Our objective was to describe the results of an effort to improve anticoagulation control in the New England region of the Veterans Health Administration (VA). Our intervention encompassed 8 VA sites managing warfarin for more than 5000 patients in New England (Veterans Integrated Service Network 1 [VISN 1]). We provided sites with a system to measure processes of care, along with targeted audit and feedback. We focused on processes of care associated with site-level anticoagulation control, including prompt follow-up after out-of-range international normalized ratio (INR) values, minimizing loss to follow-up, and use of guideline-concordant INR target ranges. We used a difference-in-differences (DID) model to examine changes in anticoagulation control, measured as percentage time in therapeutic range (TTR), as well as process measures and compared VISN 1 sites with 116 VA sites located outside VISN 1. VISN 1 sites improved on TTR, our main indicator of quality, from 66.4% to 69.2%, whereas sites outside VISN 1 improved from 65.9% to 66.4% (DID 2.3%, P < 0.001). Improvement in TTR correlated strongly with the extent of improvement on process-of-care measures, which varied widely across VISN 1 sites. A regional quality improvement initiative, using performance measurement with audit and feedback, improved TTR by 2.3% more than control sites, which is a clinically important difference. Improving relevant processes of care can improve outcomes for patients receiving warfarin.
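
    To make the difference-in-differences arithmetic behind the reported 2.3% concrete, here is a minimal sketch using only the percentages quoted in the abstract (the variable names are ours, not the authors'):

    ```python
    # Difference-in-differences (DID) estimate for time in therapeutic range (TTR),
    # using the percentages reported in the abstract above.
    visn1_before, visn1_after = 66.4, 69.2        # intervention sites (VISN 1)
    control_before, control_after = 65.9, 66.4    # 116 comparison VA sites

    did = (visn1_after - visn1_before) - (control_after - control_before)
    print(f"DID estimate: {did:.1f} percentage points")   # -> 2.3
    ```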

  3. Creating a standardized process to offer the standard of care: continuous process improvement methodology is associated with increased rates of sperm cryopreservation among adolescent and young adult males with cancer.

    PubMed

    Shnorhavorian, Margarett; Kroon, Leah; Jeffries, Howard; Johnson, Rebecca

    2012-11-01

    There is limited literature on strategies to overcome the barriers to sperm banking among adolescent and young adult (AYA) males with cancer. By standardizing our process for offering sperm banking to AYA males before cancer treatment, we aimed to improve rates of sperm banking at our institution. Continuous process improvement is a technique that has recently been applied to improve health care delivery. We used continuous process improvement methodologies to create a standard process for fertility preservation for AYA males with cancer at our institution. We compared rates of sperm banking before and after standardization. In the 12-month period after implementation of a standardized process, 90% of patients were offered sperm banking. We demonstrated an 8-fold increase in the proportion of AYA males' sperm banking, and a 5-fold increase in the rate of sperm banking at our institution. Implementation of a standardized process for sperm banking for AYA males with cancer was associated with increased rates of sperm banking at our institution. This study supports the role of standardized health care in decreasing barriers to sperm banking.

  4. Lean methodology for performance improvement in the trauma discharge process.

    PubMed

    O'Mara, Michael Shaymus; Ramaniuk, Aliaksandr; Graymire, Vickie; Rozzell, Monica; Martin, Stacey

    2014-07-01

    High-volume, complex services such as trauma and acute care surgery are at risk for inefficiency. Lean process improvement can reduce health care waste. Lean allows a structured look at processes not easily amenable to analysis. We applied lean methodology to the current state of communication and discharge planning on an urban trauma service, citing areas for improvement. A lean process mapping event was held. The process map was used to identify areas for immediate analysis and intervention, defining metrics for the stakeholders. After intervention, new performance was assessed by direct data evaluation. The process was completed with an analysis of effect and plans made for addressing future focus areas. The primary area of concern identified was interservice communication. Changes centering on a standardized morning report structure reduced the number of consult questions unanswered from 67% to 34% (p = 0.0021). Physical therapy rework was reduced from 35% to 19% (p = 0.016). Patients admitted to units not designated to the trauma service had 1.6 times longer stays (p < 0.0001). The lean process lasted 8 months, and three areas for new improvement were identified: (1) the off-unit patients; (2) patients with length of stay more than 15 days contribute disproportionately to length of stay; and (3) miscommunication exists around patient education at discharge. Lean process improvement is a viable means of health care analysis. When applied to a trauma service with 4,000 admissions annually, lean identifies areas ripe for improvement. Our inefficiencies surrounded communication and patient localization. Strategies arising from the input of all stakeholders led to real solutions for communication through a face-to-face morning report and identified areas for ongoing improvement. This focuses resource use and identifies areas for improvement of throughput in care delivery.

  5. Monitoring outcomes with relational databases: does it improve quality of care?

    PubMed

    Clemmer, Terry P

    2004-12-01

    There are 3 key ingredients in improving quality of medical care: 1) using a scientific process of improvement, 2) executing the process at the lowest possible level in the organization, and 3) measuring the results of any change reliably. Relational databases, when used within these guidelines, are of great value in these efforts if they contain reliable information that is pertinent to the project and are used in a scientific process of quality improvement by a front-line team. Unfortunately, the data are frequently unreliable and/or not pertinent to the local process and are used by persons at very high levels in the organization without a scientific process and without reliable measurement of the outcome. Under these circumstances the effectiveness of relational databases in improving care is marginal at best, frequently wasteful, and potentially harmful. This article explores examples of these concepts.
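
    As a rough illustration of the kind of pertinent, front-line measurement the author argues for, the following minimal sketch computes a unit-level outcome from a relational database; the schema, table names, and numbers are hypothetical and are not taken from the article:

    ```python
    import sqlite3

    # Hypothetical schema: one row per ICU admission, with ventilator days and a
    # flag for ventilator-associated events. All names and values are invented.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE admissions (
                      unit TEXT, month TEXT, vent_days INTEGER, vae INTEGER)""")
    conn.executemany(
        "INSERT INTO admissions VALUES (?, ?, ?, ?)",
        [("ICU-A", "2004-01", 5, 0), ("ICU-A", "2004-01", 12, 1),
         ("ICU-A", "2004-02", 7, 0), ("ICU-A", "2004-02", 9, 1)],
    )

    # Monthly event rate per 1000 ventilator days: a reliable, locally owned
    # measure that a front-line team can track over time.
    query = """SELECT unit, month, 1000.0 * SUM(vae) / SUM(vent_days)
               FROM admissions GROUP BY unit, month ORDER BY month"""
    for unit, month, rate in conn.execute(query):
        print(unit, month, round(rate, 1))
    ```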

  6. MILITARY PAY: Processes for Retaining Injured Army National Guard and Reserve Soldiers on Active Duty Have Been Improved, but Some Challenges Remain

    DTIC Science & Technology

    2007-05-01

    Active Duty Have Been Improved, but Some Challenges Remain The Army’s MRP program has largely resolved the widespread delays in order processing that...interviewed confirmed that they did not experience gaps in pay and associated benefits because of order processing delays. However, some of the...and injured reserve component soldiers we interviewed, these improvements have virtually eliminated the widespread delays in order processing that

  7. Introducing the CERT (Trademark) Resiliency Engineering Framework: Improving the Security and Sustainability Processes

    DTIC Science & Technology

    2007-05-01

    Organizational Structure 40 6.1.3 Funding Model 40 6.1.4 Role of Information Technology 40 6.2 Considering Process Improvement 41 6.2.1 Dimensions of...to the process definition for resiliency engineering. 6.1.3 Funding Model Just as organizational structures tend to align across security and...responsibility. Adopting an enterprise view of operational resiliency and a process improvement approach requires that the funding model evolve to one

  8. Exploratory Development on a New Process to Produce Improved RDX crystals: Supercritical Fluid Anti-Solvent Recrystallization

    DTIC Science & Technology

    1988-05-02

    Contract Report BRL-CR-606: Exploratory Development on a New Process to Produce Improved RDX Crystals: Supercritical Fluid Anti-Solvent Recrystallization (only OCR fragments of the report cover and reference list are available; no abstract text is recoverable).

  9. Improving the medical records department processes by lean management.

    PubMed

    Ajami, Sima; Ketabi, Saeedeh; Sadeghian, Akram; Saghaeinnejad-Isfahani, Sakine

    2015-01-01

    Lean management is a process improvement technique for identifying wasteful actions and processes and eliminating them. The benefits of Lean for healthcare organizations are, first, that the quality of outcomes improves in terms of mistakes and errors and, second, that the time taken by the whole process improves significantly. The purpose of this paper is to improve the Medical Records Department (MRD) processes at Ayatolah-Kashani Hospital in Isfahan, Iran by utilizing Lean management. This research was an applied, interventional study. The data were collected by brainstorming, observation, interview, and workflow review. The study population included MRD staff and other expert staff within the hospital who were stakeholders and users of the MRD. The MRD staff were initially taught the concepts of Lean management and then formed into the MRD Lean team. The team identified and reviewed the current processes; subsequently, they identified wastes and values and proposed solutions. The findings showed that across the MRD units (Archive, Coding, Statistics, and Admission), 17 current processes, 28 wastes, and 11 values were identified. In addition, the team offered 27 comments for eliminating the wastes. The MRD is a critical department for the hospital information system and, therefore, continuous improvement of its services and processes, through scientific methods such as Lean management, is essential. The study represents one of the few attempts to eliminate wastes in the MRD.

  10. How to conduct a clinical audit and quality improvement project.

    PubMed

    Limb, Christopher; Fowler, Alex; Gundogan, Buket; Koshy, Kiron; Agha, Riaz

    2017-07-01

    Audits and quality improvement projects are vital aspects of clinical governance and continual service improvement in medicine. In this article we describe the process of conducting a clinical audit and a quality improvement project. Guidance is also provided on how to design an effective audit and bypass barriers encountered during the process.

  11. Using Workflow Modeling to Identify Areas to Improve Genetic Test Processes in the University of Maryland Translational Pharmacogenomics Project.

    PubMed

    Cutting, Elizabeth M; Overby, Casey L; Banchero, Meghan; Pollin, Toni; Kelemen, Mark; Shuldiner, Alan R; Beitelshees, Amber L

    Delivering genetic test results to clinicians is a complex process. It involves many actors and multiple steps, requiring all of these to work together in order to create an optimal course of treatment for the patient. We used information gained from focus groups in order to illustrate the current process of delivering genetic test results to clinicians. We propose a business process model and notation (BPMN) representation of this process for a Translational Pharmacogenomics Project being implemented at the University of Maryland Medical Center, so that personalized medicine program implementers can identify areas to improve genetic testing processes. We found that the current process could be improved to reduce input errors, better inform and notify clinicians about the implications of certain genetic tests, and make results more easily understood. We demonstrate our use of BPMN to improve this important clinical process for CYP2C19 genetic testing in patients undergoing invasive treatment of coronary heart disease.

  12. Using Workflow Modeling to Identify Areas to Improve Genetic Test Processes in the University of Maryland Translational Pharmacogenomics Project

    PubMed Central

    Cutting, Elizabeth M.; Overby, Casey L.; Banchero, Meghan; Pollin, Toni; Kelemen, Mark; Shuldiner, Alan R.; Beitelshees, Amber L.

    2015-01-01

    Delivering genetic test results to clinicians is a complex process. It involves many actors and multiple steps, requiring all of these to work together in order to create an optimal course of treatment for the patient. We used information gained from focus groups in order to illustrate the current process of delivering genetic test results to clinicians. We propose a business process model and notation (BPMN) representation of this process for a Translational Pharmacogenomics Project being implemented at the University of Maryland Medical Center, so that personalized medicine program implementers can identify areas to improve genetic testing processes. We found that the current process could be improved to reduce input errors, better inform and notify clinicians about the implications of certain genetic tests, and make results more easily understood. We demonstrate our use of BPMN to improve this important clinical process for CYP2C19 genetic testing in patients undergoing invasive treatment of coronary heart disease. PMID:26958179
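
    BPMN models are normally drawn graphically, but the underlying idea of the two records above (make every step and hand-off in result delivery explicit so weak points can be found) can be sketched in code; the step names below are illustrative placeholders, not the authors' published model:

    ```python
    # Minimal sketch of a test-result delivery workflow as a directed graph.
    # Steps and hand-offs are invented placeholders for illustration only.
    workflow = {
        "order CYP2C19 test": ["lab processes sample"],
        "lab processes sample": ["result entered in EHR"],
        "result entered in EHR": ["clinician notified", "pharmacist reviews"],
        "clinician notified": ["therapy adjusted"],
        "pharmacist reviews": ["therapy adjusted"],
        "therapy adjusted": [],
    }

    # Hand-offs between actors are common failure points; steps that fan out to
    # several downstream actors are candidates for closer review.
    for step, next_steps in workflow.items():
        if len(next_steps) > 1:
            print(f"Review hand-offs at '{step}' ({len(next_steps)} downstream steps)")
    ```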

  13. Emotional processing during experiential treatment of depression.

    PubMed

    Pos, Alberta E; Greenberg, Leslie S; Goldman, Rhonda N; Korman, Lorne M

    2003-12-01

    This study explored the importance of early and late emotional processing to change in depressive and general symptomology, self-esteem, and interpersonal problems for 34 clients who received 16-20 sessions of experiential treatment for depression. The independent contribution to outcome of the early working alliance was also explored. Early and late emotional processing predicted reductions in reported symptoms and gains in self-esteem. More important, emotional-processing skill significantly improved during treatment. Hierarchical regression models demonstrated that late emotional processing both mediated the relationship between clients' early emotional processing capacity and outcome and was the sole emotional-processing variable that independently predicted improvement. After controlling for emotional processing, the working alliance added an independent contribution to explaining improvement in reported symptomology only. (c) 2003 APA

  14. Eco-Efficient Process Improvement at the Early Development Stage: Identifying Environmental and Economic Process Hotspots for Synergetic Improvement Potential.

    PubMed

    Piccinno, Fabiano; Hischier, Roland; Seeger, Stefan; Som, Claudia

    2018-05-15

    We present here a new eco-efficiency process-improvement method to highlight combined environmental and cost hotspots of the production process of a new material at a very early development stage. Production-specific and scaled-up results for life cycle assessment (LCA) and production costs are combined in a new analysis to identify synergetic improvement potentials and trade-offs, setting goals for the eco-design of new processes. The identified hotspots and bottlenecks will help users to focus on the relevant steps for improvements from an eco-efficiency perspective and potentially reduce their associated environmental impacts and production costs. Our method is illustrated with a case study of nanocellulose. The results indicate that the production route should start with carrot pomace, use heat and solvent recovery, and deactivate the enzymes with bleach instead of heat. To further improve the process, the results show that focus should be placed on the carrier polymer, sodium alginate, and the production of the GripX coating. Overall, the method shows that the underlying LCA scale-up framework is valuable for purposes beyond conventional LCA studies and is applicable at a very early stage to provide researchers with a better understanding of their production process.
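
    A minimal sketch of the hotspot idea described above, combining per-step environmental impact and production cost to rank candidate steps; the step names and numbers are invented for illustration and are not the paper's nanocellulose data:

    ```python
    # Rank process steps by their combined share of environmental impact and cost.
    # Impacts (kg CO2-eq) and costs (EUR) per functional unit are illustrative only.
    steps = {
        "enzymatic pretreatment":  {"impact": 1.8, "cost": 0.40},
        "mechanical fibrillation": {"impact": 4.2, "cost": 0.95},
        "solvent recovery":        {"impact": 0.6, "cost": 0.20},
        "drying":                  {"impact": 2.9, "cost": 0.55},
    }
    total_impact = sum(s["impact"] for s in steps.values())
    total_cost = sum(s["cost"] for s in steps.values())

    def combined_share(item):
        _, s = item
        return s["impact"] / total_impact + s["cost"] / total_cost

    # Steps that dominate both dimensions are synergetic eco-efficiency hotspots.
    for name, s in sorted(steps.items(), key=combined_share, reverse=True):
        print(f"{name:24s} impact share {s['impact'] / total_impact:5.1%}"
              f"  cost share {s['cost'] / total_cost:5.1%}")
    ```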

  15. Improving Immunization Rates Using Lean Six Sigma Processes: Alliance of Independent Academic Medical Centers National Initiative III Project.

    PubMed

    Hina-Syeda, Hussaini; Kimbrough, Christina; Murdoch, William; Markova, Tsveti

    2013-01-01

    Quality improvement education and work in interdisciplinary teams is a healthcare priority. Healthcare systems are trying to meet core measures and provide excellent patient care, thus improving their Hospital Consumer Assessment of Healthcare Providers & Systems scores. Crittenton Hospital Medical Center in Rochester Hills, MI, aligned educational and clinical objectives, focusing on improving immunization rates against pneumonia and influenza prior to the rates being implemented as core measures. Improving immunization rates prevents infections, minimizes hospitalizations, and results in overall improved patient care. Teaching hospitals offer an effective way to work on clinical projects by bringing together the skill sets of residents, faculty, and hospital staff to achieve superior results. We designed and implemented a structured curriculum in which interdisciplinary teams acquired knowledge on quality improvement and teamwork, while focusing on a specific clinical project: improving global immunization rates. We used the Lean Six Sigma process tools to quantify the initial process capability to immunize against pneumococcus and influenza. The hospital's process to vaccinate against pneumonia overall was operating at a Z score of 3.13, and the influenza vaccination Z score was 2.53. However, the process to vaccinate high-risk patients against pneumonia operated at a Z score of 1.96. Improvement in immunization rates of high-risk patients became the focus of the project. After the implementation of solutions, the process to vaccinate high-risk patients against pneumonia operated at a Z score of 3.9 with a defects/million opportunities rate of 9,346 and a yield of 93.5%. Revisions to the adult assessment form fixed 80% of the problems identified. This process improvement project was not only beneficial in terms of improved quality of patient care but was also a positive learning experience for the interdisciplinary team, particularly for the residents. The hospital has completed quality improvement projects in the past; however, this project was the first in which residents were actively involved. The didactic components and experiential learning were powerfully synergistic. This and similar projects can have far-reaching implications in terms of promoting patient health and improving the quality of care delivered by the healthcare systems and teaching hospitals.
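
    The abstract reports performance as Z scores and a defects-per-million-opportunities (DPMO) rate; a minimal sketch of the usual conversion between them, assuming the conventional 1.5-sigma shift (the authors' exact convention is not stated):

    ```python
    from statistics import NormalDist

    def sigma_level(dpmo, shift=1.5):
        """Short-term sigma (Z) level for a given DPMO, assuming the
        conventional 1.5-sigma shift between short- and long-term performance."""
        long_term_z = NormalDist().inv_cdf(1.0 - dpmo / 1_000_000)
        return long_term_z + shift

    # The post-improvement defect rate reported above:
    print(round(sigma_level(9_346), 2))   # ~3.85, consistent with the reported Z of 3.9
    ```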

  16. Template for success: using a resident-designed sign-out template in the handover of patient care.

    PubMed

    Clark, Clancy J; Sindell, Sarah L; Koehler, Richard P

    2011-01-01

    We report our implementation of a standardized handover process in a general surgery residency program. The standardized handover process, sign-out template, method of implementation, and continuous quality improvement process were designed by general surgery residents with support of faculty and senior hospital administration using standard work principles and business models of the Virginia Mason Production System and the Toyota Production System. The setting was a nonprofit, tertiary referral teaching hospital; participants included general surgery residents, residency faculty, patient care providers, and hospital administration. After instruction in quality improvement initiatives, a team of general surgery residents designed a sign-out process using an electronic template and standard procedures. The initial implementation phase resulted in 73% compliance. Using resident-driven continuous quality improvement processes, real-time feedback enabled residents to modify and improve this process, eventually attaining 100% compliance and acceptance by residents. The creation of a standardized template and protocol for patient handovers might eliminate communication failures. Encouraging residents to participate in this process can establish the groundwork for successful implementation of a standardized handover process. Integrating a continuous quality-improvement process into such an initiative can promote active participation of busy general surgery residents and lead to successful implementation of standard procedures. Copyright © 2011 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  17. Event (error and near-miss) reporting and learning system for process improvement in radiation oncology.

    PubMed

    Mutic, Sasa; Brame, R Scott; Oddiraju, Swetha; Parikh, Parag; Westfall, Melisa A; Hopkins, Merilee L; Medina, Angel D; Danieley, Jonathan C; Michalski, Jeff M; El Naqa, Issam M; Low, Daniel A; Wu, Bin

    2010-09-01

    The value of near-miss and error reporting processes in many industries is well appreciated and typically can be supported with data that have been collected over time. While it is generally accepted that such processes are important in the radiation therapy (RT) setting, studies analyzing the effects of organized reporting and process improvement systems on operation and patient safety in individual clinics remain scarce. The purpose of this work is to report on the design and long-term use of an electronic reporting system in a RT department and compare it to the paper-based reporting system it replaced. A specifically designed web-based system was designed for reporting of individual events in RT and clinically implemented in 2007. An event was defined as any occurrence that could have, or had, resulted in a deviation in the delivery of patient care. The aim of the system was to support process improvement in patient care and safety. The reporting tool was designed so individual events could be quickly and easily reported without disrupting clinical work. This was very important because the system use was voluntary. The spectrum of reported deviations extended from minor workflow issues (e.g., scheduling) to errors in treatment delivery. Reports were categorized based on functional area, type, and severity of an event. The events were processed and analyzed by a formal process improvement group that used the data and the statistics collected through the web-based tool for guidance in reengineering clinical processes. The reporting trends for the first 24 months with the electronic system were compared to the events that were reported in the same clinic with a paper-based system over a seven-year period. The reporting system and the process improvement structure resulted in increased event reporting, improved event communication, and improved identification of clinical areas which needed process and safety improvements. The reported data were also useful for the evaluation of corrective measures and recognition of ineffective measures and efforts. The electronic system was relatively well accepted by personnel and resulted in minimal disruption of clinical work. Event reporting in the quarters with the fewest number of reported events, though voluntary, was almost four times greater than the most events reported in any one quarter with the paper-based system and remained consistent from the inception of the process through the date of this report. However, the acceptance was not universal, validating the need for improved education regarding reporting processes and systematic approaches to reporting culture development. Specially designed electronic event reporting systems in a radiotherapy setting can provide valuable data for process and patient safety improvement and are more effective reporting mechanisms than paper-based systems. Additional work is needed to develop methods that can more effectively utilize reported data for process improvement, including the development of standardized event taxonomy and a classification system for RT.

  18. Process improvement as an investment: Measuring its worth

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Jeletic, Kellyann

    1993-01-01

    This paper discusses return on investment (ROI) generated from software process improvement programs. It details the steps needed to compute ROI and compares these steps from the perspective of two process improvement approaches: the widely known Software Engineering Institute's capability maturity model and the approach employed by NASA's Software Engineering Laboratory (SEL). The paper then describes the specific investments made in the SEL over the past 18 years and discusses the improvements gained from this investment by the production organization in the SEL.
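
    The abstract does not give the SEL's actual formula or figures, but the basic return-on-investment arithmetic it refers to can be sketched as follows, with purely hypothetical numbers:

    ```python
    def process_improvement_roi(investment, annual_savings, years):
        """Simple (undiscounted) ROI for a process improvement program.
        The formula and inputs are illustrative, not the SEL's actual data."""
        total_benefit = annual_savings * years
        return (total_benefit - investment) / investment

    # Hypothetical example: $2M invested, $0.6M saved per year over 5 years.
    print(f"ROI = {process_improvement_roi(2.0e6, 0.6e6, 5):.0%}")   # -> 50%
    ```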

  19. Use of Process Improvement Tools in Radiology.

    PubMed

    Rawson, James V; Kannan, Amogha; Furman, Melissa

    2016-01-01

    Process improvement techniques are common in manufacturing and industry. Over the past few decades these principles have been slowly introduced in select health care settings. This article reviews the Plan, Do, Study, and Act cycle, Six Sigma, the System of Profound Knowledge, Lean, and the theory of constraints. Specific process improvement tools in health care and radiology are presented in the order the radiologist is likely to encounter them in an improvement project. Copyright © 2015 Mosby, Inc. All rights reserved.

  20. Parameter prediction based on Improved Process neural network and ARMA error compensation in Evaporation Process

    NASA Astrophysics Data System (ADS)

    Qian, Xiaoshan

    2018-01-01

    Traditional models of the evaporation process parameters suffer from larger prediction errors because the parameters have continuity and cumulative characteristics. On this basis, an adaptive particle swarm optimized process neural network forecasting method is proposed for the parameters, with an autoregressive moving average (ARMA) error-correction procedure that compensates the neural network predictions to improve prediction accuracy. Validation against production data from an alumina plant evaporation process shows that, compared with the traditional model, the prediction accuracy of the new model is greatly improved, and it can be used to predict the dynamic composition of sodium aluminate solution during evaporation.
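
    A minimal sketch of the general idea (a neural-network forecast whose residuals are compensated by an ARMA model), using synthetic data and a plain multilayer perceptron in place of the paper's adaptive particle swarm process neural network:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from statsmodels.tsa.arima.model import ARIMA

    # Synthetic stand-in for an evaporation process parameter (trend + cycle + noise).
    rng = np.random.default_rng(0)
    t = np.arange(400, dtype=float)
    y = 0.01 * t + np.sin(t / 12.0) + 0.2 * rng.standard_normal(400)

    # One-step-ahead prediction from a sliding window of past values.
    window, split = 10, 300
    X = np.array([y[i - window:i] for i in range(window, len(y))])
    target = y[window:]

    nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    nn.fit(X[:split], target[:split])
    nn_pred = nn.predict(X[split:])

    # Fit an ARMA(2,1) model to the training residuals, forecast them ahead, and
    # add the forecast back as an error-compensation term.
    residuals = target[:split] - nn.predict(X[:split])
    arma = ARIMA(residuals, order=(2, 0, 1)).fit()
    corrected = nn_pred + arma.forecast(steps=len(nn_pred))

    print("NN RMSE:      ", np.sqrt(np.mean((target[split:] - nn_pred) ** 2)))
    print("NN+ARMA RMSE: ", np.sqrt(np.mean((target[split:] - corrected) ** 2)))
    ```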

  1. 41 CFR 102-193.25 - What type of records management business process improvements should my agency strive to achieve?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false What type of records management business process improvements should my agency strive to achieve? 102-193.25 Section 102-193.25...-193.25 What type of records management business process improvements should my agency strive to...

  2. 41 CFR 102-193.25 - What type of records management business process improvements should my agency strive to achieve?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What type of records management business process improvements should my agency strive to achieve? 102-193.25 Section 102-193.25...-193.25 What type of records management business process improvements should my agency strive to...

  3. 41 CFR 102-193.25 - What type of records management business process improvements should my agency strive to achieve?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What type of records management business process improvements should my agency strive to achieve? 102-193.25 Section 102-193.25...-193.25 What type of records management business process improvements should my agency strive to...

  4. 41 CFR 102-193.25 - What type of records management business process improvements should my agency strive to achieve?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What type of records management business process improvements should my agency strive to achieve? 102-193.25 Section 102-193.25...-193.25 What type of records management business process improvements should my agency strive to...

  5. Employee empowerment through team building and use of process control methods.

    PubMed

    Willems, S

    1998-02-01

    The article examines the use of statistical process control and performance improvement techniques in employee empowerment. The focus is how these techniques provide employees with information to improve their productivity and become involved in the decision-making process. Findings suggest that at one Mississippi hospital, employee empowerment has had a positive effect on employee productivity, morale, and quality of work.
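
    As an illustration of the statistical process control element, a minimal sketch of individuals-chart (XmR) control limits that a front-line team might compute; the data are hypothetical:

    ```python
    import statistics

    def individuals_chart_limits(measurements):
        """Control limits for an individuals (X) chart from the average moving
        range, using the standard XmR constant d2 = 1.128 for n = 2."""
        moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
        sigma_hat = statistics.mean(moving_ranges) / 1.128
        center = statistics.mean(measurements)
        return center - 3 * sigma_hat, center, center + 3 * sigma_hat

    # Hypothetical daily turnaround times (hours) tracked by a hospital team.
    data = [4.2, 3.9, 4.5, 4.1, 3.8, 4.6, 4.0, 4.3, 3.7, 4.4]
    lcl, cl, ucl = individuals_chart_limits(data)
    print(f"LCL = {lcl:.2f}, CL = {cl:.2f}, UCL = {ucl:.2f}")
    ```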

  6. Improving the two-step remediation process for CCA-treated wood. Part I, Evaluating oxalic acid extraction

    Treesearch

    Carol Clausen

    2004-01-01

    In this study, three possible improvements to a remediation process for chromated-copper-arsenate (CCA) treated wood were evaluated. The process involves two steps: oxalic acid extraction of wood fiber followed by bacterial culture with Bacillus licheniformis CC01. The three potential improvements to the oxalic acid extraction step were (1) reusing oxalic acid for...

  7. Formation process of graphite film on Ni substrate with improved thickness uniformity through precipitation control

    NASA Astrophysics Data System (ADS)

    Kim, Seul-Gi; Hu, Qicheng; Nam, Ki-Bong; Kim, Mun Ja; Yoo, Ji-Beom

    2018-04-01

    Large-scale graphitic thin film with high thickness uniformity needs to be developed for industrial applications. Graphitic films with thicknesses ranging from 3 to 20 nm have rarely been reported, and achieving the thickness uniformity in that range is a challenging task. In this study, a process for growing 20 nm-thick graphite films on Ni with improved thickness uniformity is demonstrated and compared with the conventional growth process. In the film grown by the process, the surface roughness and coverage were improved and no wrinkles were observed. Observations of the film structure reveal the reasons for the improvements and growth mechanisms.

  8. Applying Lean Six Sigma to improve medication management.

    PubMed

    Nayar, Preethy; Ojha, Diptee; Fetrick, Ann; Nguyen, Anh T

    2016-01-01

    A significant proportion of veterans use dual care or health care services within and outside the Veterans Health Administration (VHA). In this study conducted at a VHA medical center in the USA, the authors used Lean Six Sigma principles to develop recommendations to eliminate wasteful processes and implement a more efficient and effective process to manage medications for dual care veteran patients. The purpose of this study is to: assess compliance with the VHA's dual care policy; collect data and describe the current process for co-management of dual care veterans' medications; and draft recommendations to improve the current process for dual care medications co-management. Input was obtained from the VHA patient care team members to draw a process map to describe the current process for filling a non-VHA prescription at a VHA facility. Data were collected through surveys and direct observation to measure the current process and to develop recommendations to redesign and improve the process. A key bottleneck in the process that was identified was the receipt of the non-VHA medical record which resulted in delays in filling prescriptions. The recommendations of this project focus on the four domains of: documentation of dual care; veteran education; process redesign; and outreach to community providers. This case study describes the application of Lean Six Sigma principles in one urban Veterans Affairs Medical Center (VAMC) in the Mid-Western USA to solve a specific organizational quality problem. Therefore, the findings may not be generalizable to other organizations. The Lean Six Sigma general principles applied in this project to develop recommendations to improve medication management for dual care veterans are applicable to any process improvement or redesign project and has valuable lessons for other VAMCs seeking to improve care for their dual care veteran patients. The findings of this project will be of value to VA providers and policy makers and health care managers who plan to apply Lean Six Sigma techniques in their organizations to improve the quality of care for their patients.

  9. Improving the effectiveness of partnering : final report.

    DOT National Transportation Integrated Search

    2002-11-01

    The objectives of the research were to: (1) assess the current state of the Oregon Department of Transportation's (ODOT) partnering program; (2) examine ways for improving current processes; and (3) recommend process improvements and possible new m...

  10. Catalytic cracking process

    DOEpatents

    Lokhandwala, Kaaeid A.; Baker, Richard W.

    2001-01-01

    Processes and apparatus for providing improved catalytic cracking, specifically improved recovery of olefins, LPG or hydrogen from catalytic crackers. The improvement is achieved by passing part of the wet gas stream across membranes selective in favor of light hydrocarbons over hydrogen.

  11. Discontinuing Medications: A Novel Approach for Revising the Prescribing Stage of the Medication-Use Process

    PubMed Central

    Bain, Kevin T.; Holmes, Holly M.; Beers, Mark H.; Maio, Vittorio; Handler, Steven M.; Pauker, Stephen G.

    2009-01-01

    Thousands of Americans are injured or die each year from adverse drug reactions, many of which are preventable. The burden of harm conveyed by the use of medications is a significant public health problem and, therefore, improving the medication-use process is a priority. Recent and ongoing efforts to improve the medication-use process focus primarily on improving medication prescribing, and not much emphasis has been put on improving medication discontinuation. A formalized approach for rationally discontinuing medications is a necessary antecedent to improving medication safety and improving the nation’s quality of care. This paper proposes a conceptual framework for revising the prescribing stage of the medication-use process to include discontinuing medications. This framework has substantial practice and research implications, especially for the clinical care of older persons, who are particularly susceptible to the adverse effects of medications. PMID:18771457

  12. Curbing variations in packaging process through Six Sigma way in a large-scale food-processing industry

    NASA Astrophysics Data System (ADS)

    Desai, Darshak A.; Kotadiya, Parth; Makwana, Nikheel; Patel, Sonalinkumar

    2015-03-01

    Indian industries need overall operational excellence for sustainable profitability and growth in the present age of global competitiveness. Among different quality and productivity improvement techniques, Six Sigma has emerged as one of the most effective breakthrough improvement strategies. Though Indian industries are exploring this improvement methodology to their advantage and reaping the benefits, not much has been presented and published regarding the experience of Six Sigma in the food-processing industries. This paper is an effort to exemplify the application of a Six Sigma quality improvement drive to one of the large-scale food-processing sectors in India. The paper discusses the phase-wise implementation of define, measure, analyze, improve, and control (DMAIC) on one of the chronic problems, variation in the weight of milk powder pouches. The paper wraps up with the improvements achieved and the projected bottom-line gain to the unit from applying the Six Sigma methodology.
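
    For the chronic problem named above, variation in pouch weight, the usual Six Sigma capability indices can be sketched as follows; the specification limits and sample weights are invented for illustration and are not the unit's data:

    ```python
    import statistics

    def cp_cpk(samples, lsl, usl):
        """Process capability indices for a two-sided specification."""
        mu = statistics.mean(samples)
        sigma = statistics.stdev(samples)
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        return cp, cpk

    # Hypothetical pouch weights (g) against a 500 g +/- 5 g specification.
    weights = [501.2, 499.4, 500.8, 498.9, 502.1, 500.3, 499.7, 501.5, 500.1, 499.0]
    print("Cp = %.2f, Cpk = %.2f" % cp_cpk(weights, lsl=495.0, usl=505.0))
    ```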

  13. Process Improvement for Interinstitutional Research Contracting

    PubMed Central

    Logan, Jennifer; Bjorklund, Todd; Whitfield, Jesse; Reed, Peggy; Lesher, Laurie; Sikalis, Amy; Brown, Brent; Drollinger, Sandy; Larrabee, Kristine; Thompson, Kristie; Clark, Erin; Workman, Michael; Boi, Luca

    2015-01-01

    Introduction: Sponsored research increasingly requires multiinstitutional collaboration. However, research contracting procedures have become more complicated and time consuming. The perinatal research units of two colocated healthcare systems sought to improve their research contracting processes. Methods: The Lean Process, a management practice that iteratively involves team members in root cause analyses and process improvement, was applied to the research contracting process, initially using Process Mapping and then developing Problem Solving Reports. Results: Root cause analyses revealed that the longest delays were the individual contract legal negotiations. In addition, the “business entity” was the research support personnel of both healthcare systems whose “customers” were investigators attempting to conduct interinstitutional research. Development of mutually acceptable research contract templates and language, chain of custody templates, and process development and refinement formats decreased the Notice of Grant Award to Purchase Order time from a mean of 103.5 days in the year prior to Lean Process implementation to 45.8 days in the year after implementation (p = 0.004). Conclusions: The Lean Process can be applied to interinstitutional research contracting with significant improvement in contract implementation. PMID:26083433

  14. Process Improvement for Interinstitutional Research Contracting.

    PubMed

    Varner, Michael; Logan, Jennifer; Bjorklund, Todd; Whitfield, Jesse; Reed, Peggy; Lesher, Laurie; Sikalis, Amy; Brown, Brent; Drollinger, Sandy; Larrabee, Kristine; Thompson, Kristie; Clark, Erin; Workman, Michael; Boi, Luca

    2015-08-01

    Sponsored research increasingly requires multiinstitutional collaboration. However, research contracting procedures have become more complicated and time consuming. The perinatal research units of two colocated healthcare systems sought to improve their research contracting processes. The Lean Process, a management practice that iteratively involves team members in root cause analyses and process improvement, was applied to the research contracting process, initially using Process Mapping and then developing Problem Solving Reports. Root cause analyses revealed that the longest delays were the individual contract legal negotiations. In addition, the "business entity" was the research support personnel of both healthcare systems whose "customers" were investigators attempting to conduct interinstitutional research. Development of mutually acceptable research contract templates and language, chain of custody templates, and process development and refinement formats decreased the Notice of Grant Award to Purchase Order time from a mean of 103.5 days in the year prior to Lean Process implementation to 45.8 days in the year after implementation (p = 0.004). The Lean Process can be applied to interinstitutional research contracting with significant improvement in contract implementation. © 2015 Wiley Periodicals, Inc.

  15. Using task analysis to improve the requirements elicitation in health information system.

    PubMed

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2007-01-01

    This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.

  16. Advanced Practice Nursing Committee on Process Improvement in Trauma: An Innovative Application of the Strong Model.

    PubMed

    West, Sarah Katherine

    2016-01-01

    This article aims to summarize the successes and future implications for a nurse practitioner-driven committee on process improvement in trauma. The trauma nurse practitioner is uniquely positioned to recognize the need for clinical process improvement and enact change within the clinical setting. Application of the Strong Model of Advanced Practice proves to actively engage the trauma nurse practitioner in process improvement initiatives. Through enhancing nurse practitioner professional engagement, the committee aims to improve health care delivery to the traumatically injured patient. A retrospective review of the committee's first year reveals trauma nurse practitioner success in the domains of direct comprehensive care, support of systems, education, and leadership. The need for increased trauma nurse practitioner involvement has been identified for the domains of research and publication.

  17. The maturing of the quality improvement paradigm in the SEL

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1993-01-01

    The Software Engineering Laboratory uses a paradigm for improving the software process and product, called the quality improvement paradigm. This paradigm has evolved over the past 18 years, along with our software development processes and product. Since 1976, when we first began the SEL, we have learned a great deal about improving the software process and product, making a great many mistakes along the way. Quality improvement paradigm, as it is currently defined, can be broken up into six steps: characterize the current project and its environment with respect to the appropriate models and metrics; set the quantifiable goals for successful project performance and improvement; choose the appropriate process model and supporting methods and tools for this project; execute the processes, construct the products, and collect, validate, and analyze the data to provide real-time feedback for corrective action; analyze the data to evaluate the current practices, determine problems, record findings, and make recommendations for future project improvements; and package the experience gained in the form of updated and refined models and other forms of structured knowledge gained from this and prior projects and save it in an experience base to be reused on future projects.

  18. Improving operational anodising process performance using simulation approach

    NASA Astrophysics Data System (ADS)

    Liong, Choong-Yeun; Ghazali, Syarah Syahidah

    2015-10-01

    The use of aluminium is very widespread, especially in the transportation, electrical and electronics, architectural, automotive, and engineering sectors. The anodizing process is therefore an important process for aluminium, making it durable, attractive, and weather resistant. This research focuses on the anodizing process operations in the manufacturing and supply of aluminium extrusions. The data required for developing the model were collected from the observations and interviews conducted in the study. To study the current system, the processes involved in anodizing are modeled using Arena 14.5 simulation software. They consist of five main processes, namely degreasing, etching, desmutting, anodizing, and sealing, together with 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that could be implemented on the original model. Based on the comparisons made between the improvement methods, productivity could be increased by reallocating the workers and reducing loading time.
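
    Arena is a commercial package; as a rough open-source sketch of the same modelling idea (a serial line of tanks whose queues reveal the bottleneck), the example below assumes the SimPy library and made-up processing times rather than the study's observed data:

    ```python
    import random
    import simpy

    # Illustrative mean processing times (minutes) for the main tanks; real values
    # would come from the observations and interviews described in the abstract.
    STAGES = [("degrease", 8), ("etch", 12), ("desmut", 5), ("anodize", 35), ("seal", 20)]

    def load(env, tanks, waits):
        """One load of extrusions passing through the tanks in sequence."""
        for (stage, mean_minutes), tank in zip(STAGES, tanks):
            arrived = env.now
            with tank.request() as req:
                yield req
                waits[stage] += env.now - arrived          # time spent queueing
                yield env.timeout(random.expovariate(1.0 / mean_minutes))

    def arrivals(env, tanks, waits, n_loads, mean_interarrival=15):
        for _ in range(n_loads):
            env.process(load(env, tanks, waits))
            yield env.timeout(random.expovariate(1.0 / mean_interarrival))

    random.seed(1)
    env = simpy.Environment()
    tanks = [simpy.Resource(env, capacity=1) for _ in STAGES]
    waits = {stage: 0.0 for stage, _ in STAGES}
    n_loads = 200
    env.process(arrivals(env, tanks, waits, n_loads))
    env.run()

    for stage, _ in STAGES:
        print(f"{stage:9s} mean wait {waits[stage] / n_loads:7.1f} min")  # anodize dominates
    ```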

  19. Redesigning the care of fragility fracture patients to improve osteoporosis management: a health care improvement project.

    PubMed

    Harrington, J Timothy; Barash, Harvey L; Day, Sherry; Lease, Joellen

    2005-04-15

    To develop new processes that assure more reliable, population-based care of fragility fracture patients. A 4-year clinical improvement project was performed in a multispecialty, community practice health system using evidence-based guidelines and rapid cycle process improvement methods (plan-do-study-act cycles). Prior to this project, appropriate osteoporosis care was provided to only 5% of our 1999 hip fracture patients. In 2001, primary physicians were provided prompts about appropriate care (cycle 1), which resulted in improved care for only 20% of patients. A process improvement pilot in 2002 (cycle 2) and full program implementation in 2003 (cycle 3) have assured osteoporosis care for all willing and able patients with any fragility fracture. Altogether, 58% of 2003 fragility fracture patients, including 46% of those with hip fracture, have had a bone measurement, have been assigned to osteoporosis care with their primary physician or a consultant, and are being monitored regularly. Only 19% refused osteoporosis care. Key process improvements have included using orthopedic billings to identify patients, referring patients directly from orthopedics to an osteoporosis care program, organizing care with a nurse manager and process management computer software, assigning patients to primary or consultative physician care based on disease severity, and monitoring adherence to therapy by telephone. Reliable osteoporosis care is achievable by redesigning clinical processes. Performance data motivate physicians to reconsider traditional approaches. Improving the care of osteoporosis and other chronic diseases requires coordinated care across specialty boundaries and health system support.

  20. Design modification and optimisation of the perfusion system of a tri-axial bioreactor for tissue engineering.

    PubMed

    Hussein, Husnah; Williams, David J; Liu, Yang

    2015-07-01

    A systematic design of experiments (DOE) approach was used to optimize the perfusion process of a tri-axial bioreactor designed for translational tissue engineering exploiting mechanical stimuli and mechanotransduction. Four controllable design parameters affecting the perfusion process were identified in a cause-effect diagram as potential improvement opportunities. A screening process was used to separate out the factors that have the largest impact from the insignificant ones. DOE was employed to find the settings of the platen design, return tubing configuration and the elevation difference that minimise the load on the pump and variation in the perfusion process and improve the controllability of the perfusion pressures within the prescribed limits. DOE was very effective for gaining increased knowledge of the perfusion process and optimizing the process for improved functionality. It is hypothesized that the optimized perfusion system will result in improved biological performance and consistency.
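
    The abstract does not list the factor levels or responses, so the screening step can only be sketched generically; below is a full 2^3 factorial with coded levels and invented pump-load responses, estimating main effects:

    ```python
    from itertools import product

    # Coded levels (-1/+1) for three of the perfusion design parameters named above.
    # The response values (pump load, arbitrary units) are invented for illustration.
    factors = ["platen design", "return tubing", "elevation difference"]
    runs = list(product((-1, 1), repeat=3))
    pump_load = [4.1, 3.6, 5.0, 4.4, 3.2, 2.9, 4.0, 3.5]

    # Main effect = mean response at the high level minus mean response at the low level.
    for i, factor in enumerate(factors):
        high = [y for run, y in zip(runs, pump_load) if run[i] == +1]
        low = [y for run, y in zip(runs, pump_load) if run[i] == -1]
        effect = sum(high) / len(high) - sum(low) / len(low)
        print(f"{factor:20s} main effect {effect:+.2f}")
    ```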

  1. A novel double loop control model design for chemical unstable processes.

    PubMed

    Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He

    2014-03-01

    In this manuscript, based on the Smith predictor control scheme for unstable industrial processes, an improved double-loop control model is proposed for unstable chemical processes. The inner loop stabilizes the integrating unstable process and transforms the original process into a stable first-order plus pure dead-time process. The outer loop enhances the set-point response performance, and a disturbance controller is designed to enhance the disturbance-response performance. The improved control system is simple and has a clear physical interpretation, and its characteristic equation is easy to stabilize. Three controllers are designed separately in the improved scheme; each controller is easy to design and gives good control performance for its respective closed-loop transfer function. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method can give better system performance than existing design methods. © 2013 ISA. Published by ISA. All rights reserved.
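
    The paper's double-loop design is not reproduced here, but the stable first-order plus dead-time behaviour that the inner loop is said to produce can be simulated directly; a minimal sketch with an ordinary PI controller and invented parameter values:

    ```python
    import numpy as np

    # FOPDT process: tau * dy/dt = -y + K * u(t - theta). All parameters are illustrative.
    K, tau, theta, dt = 1.0, 5.0, 2.0, 0.05
    kp, ki = 1.2, 0.25                      # hand-picked PI tuning for the sketch
    n, delay = 4000, int(theta / dt)

    y = np.zeros(n)
    u = np.zeros(n)
    setpoint, integral = 1.0, 0.0

    for k in range(1, n):
        error = setpoint - y[k - 1]
        integral += error * dt
        u[k] = kp * error + ki * integral                         # PI control law
        u_delayed = u[k - delay] if k >= delay else 0.0           # transport delay
        y[k] = y[k - 1] + dt * (-y[k - 1] + K * u_delayed) / tau  # Euler step

    print("final output:", round(float(y[-1]), 3))   # settles near the setpoint of 1.0
    ```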

  2. Method to Prepare Processable Polyimides with Non-Reactive Endgroups Using 1,3-bis(3-Aminophenoxy) Benzene

    NASA Technical Reports Server (NTRS)

    Jensen, Brian J. (Inventor)

    2000-01-01

    Polyimide copolymers were obtained containing 1,3-bis(3-aminophenoxy)benzene (APB) and other diamines and dianhydrides and terminating with the appropriate amount of a non-reactive endcapper, such as phthalic anhydride. Homopolymers containing only other diamines and dianhydrides which are not processable under conditions described previously can be made processable by incorporating various amounts of APB, depending on the chemical structures of the diamines and dianhydrides used. Polyimides that are more rigid in nature require more APB to impart processability than polyimides that are less rigid in nature. The copolymers that result from using APB to enhance processability have a unique combination of properties including excellent thin film properties, low pressure processing (200 psi and below), improved toughness, improved solvent resistance, improved adhesive properties, improved composite mechanical properties, long term melt stability (several hours at 390 C), and lower melt viscosities.

  3. Method To Prepare Processable Polyimides With Reactive Endgroups Using 1,3-bis(3-aminophenoxy)benzene

    NASA Technical Reports Server (NTRS)

    Jensen, Brian J. (Inventor)

    2001-01-01

    Polyimide copolymers were obtained containing 1,3-bis(3-aminophenoxy)benzene (APB) and other diamines and dianhydrides and terminating with the appropriate amount of a non-reactive endcapper, such as phthalic anhydride. Homopolymers containing only other diamines and dianhydrides which are not processable under conditions described previously can be made processable by incorporating various amounts of APB, depending on the chemical structures of the diamines and dianhydrides used. Polyimides that are more rigid in nature require more APB to impart processability than polyimides that are less rigid in nature. The copolymers that result from using APB to enhance processability have a unique combination of properties including excellent thin film properties, low pressure processing (200 psi and below), improved toughness, improved solvent resistance, improved adhesive properties, improved composite mechanical properties, long term melt stability (several hours at 390 C), and lower melt viscosities.

  4. Management of local economic and ecological system of coal processing company

    NASA Astrophysics Data System (ADS)

    Kiseleva, T. V.; Mikhailov, V. G.; Karasev, V. A.

    2016-10-01

    The management of the local ecological and economic system of a coal processing company (a coal processing plant) is considered in this article. The objectives of the research are to identify and analyze the performance of this local ecological and economic system and to propose improvements to the mechanism that supports management decisions aimed at improving its environmental safety. Data on the structure of run-of-mine coal processing products are shown, and the main ecological and economic indicators of coal processing enterprises, which characterize the state of their environmental safety, are analyzed. The main result of the study is a set of proposals to improve the efficiency of managing the enterprise's local ecological and economic system, including technical, technological, and business measures. The results can be recommended to industrial enterprises seeking to improve their ecological and economic efficiency.

  5. Performance excellence: using Lean Six Sigma tools to improve the US Army behavioral health surveillance process, boost team morale, and maximize value to customers and stakeholders.

    PubMed

    Watkins, Eren Youmans; Kemeter, Dave M; Spiess, Anita; Corrigan, Elizabeth; Kateley, Keri; Wills, John V; Mancha, Brent Edward; Nichols, Jerrica; Bell, Amy Millikan

    2014-01-01

    Lean Six Sigma (LSS) is a process improvement, problem-solving methodology used in business and manufacturing to improve the speed, quality, and cost of products. LSS can also be used to improve knowledge-based products integral to public health surveillance. An LSS project by the Behavioral Social Health Outcomes Program of the Army Institute of Public Health reduced the number of labor hours spent producing the routine surveillance of suicidal behavior publication. At baseline, the total number of labor hours was 448; after project completion, total labor hours were 199. Based on customer feedback, publication production was reduced from quarterly to annually. Process improvements enhanced group morale and established best practices in the form of standard operating procedures and business rules to ensure solutions are sustained. LSS project participation also fostered a change in the conceptualization of tasks and projects. These results demonstrate that LSS can be used to inform the public health process and should be considered a viable method of improving knowledge-based products and processes.

  6. Improving Science Pedagogic Quality in Elementary School Using Process Skill Approach Can Motivate Student to Be Active in Learning

    ERIC Educational Resources Information Center

    Sukiniarti

    2016-01-01

    In today's global era, professional teachers should be improving their pedagogic competency, including the quality of their science pedagogy. This study aims to identify: (1) the process skill approach that has been used by elementary school teachers in science learning; (2) teachers' opinions on whether the process skill approach can motivate the student to…

  7. 25 CFR 256.14 - What are the steps that must be taken to process my application for the Housing Improvement Program?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... to process my application for the Housing Improvement Program? (a) The servicing housing office must... 25 Indians 1 2010-04-01 2010-04-01 false What are the steps that must be taken to process my application for the Housing Improvement Program? 256.14 Section 256.14 Indians BUREAU OF INDIAN AFFAIRS...

  8. Quality Improvement Initiatives in Colorectal Surgery: Value of Physician Feedback.

    PubMed

    Waters, Joshua A; Francone, Todd; Marcello, Peter W; Roberts, Patricia L; Schoetz, David J; Read, Thomas E; Stafford, Caitlin; Ricciardi, Rocco

    2017-02-01

    The impact of process improvement through surgeon feedback on outcomes is unclear. We sought to evaluate the effect of biannual surgeon-specific feedback on adherence to departmental and Surgical Care Improvement Project process measures and on colorectal surgery outcomes. This was a retrospective analysis of prospectively collected 100% capture surgical quality improvement data. This study was conducted at the department of colorectal surgery at a tertiary care teaching hospital from January 2008 through December 2013. Each surgeon was provided with biannual feedback on process adherence and surgeon-specific outcomes of urinary tract infection, deep vein thrombosis, surgical site infection, anastomotic leak, 30-day readmission, reoperation, and mortality. We recorded adherence to Surgical Care Improvement Project process measures and departmentally implemented measures (i.e., anastomotic leak testing) as well as surgeon-specific outcomes. We abstracted 7975 operations. There was no difference in demographics, laparoscopy, or blood loss. Adherence to catheter removal increased from 73% to 100% (p < 0.0001), whereas urinary tract infection decreased 52% (p < 0.01). Adherence to thromboprophylaxis administration remained unchanged as did the deep vein thrombosis rate (p = not significant). Adherence to preoperative antibiotic administration increased from 72% to 100% (p < 0.0001), whereas surgical site infection did not change (7.6%-6.6%; p = 0.3). There were 2589 operative encounters with anastomoses. For right-sided anastomoses, the proportion of handsewn anastomoses declined from 19% to 1.5% (p < 0.001). For left-sided anastomoses, without diversion, anastomotic leak testing adherence increased from 88% to 95% (p < 0.01). Overall leak rate decreased from 5.2% to 2.9% (p < 0.05). Concurrent process changes make isolation of the impact from individual process improvement changes challenging. Nearly complete adherence to process measures for deep vein thrombosis and surgical site infection did not lead to measurable outcomes improvement. Process measure adherence was associated with decreased rate of anastomotic leak and urinary tract infection. Biannual surgeon-specific feedback of outcomes was associated with improved process measure adherence and improvement in surgical quality.

  9. An Application of Six Sigma to Reduce Supplier Quality Cost

    NASA Astrophysics Data System (ADS)

    Gaikwad, Lokpriya Mohanrao; Teli, Shivagond Nagappa; Majali, Vijay Shashikant; Bhushi, Umesh Mahadevappa

    2016-01-01

    This article presents an application of Six Sigma to reduce supplier quality cost in the manufacturing industry. Although there is wider acceptance of Six Sigma in many organizations today, there is still a lack of in-depth case studies of Six Sigma. For the present research the case study methodology was used. The company decided to reduce quality cost and improve selected processes using Six Sigma methodologies. Given the lack of case studies dealing with Six Sigma, especially in individual manufacturing organizations, this article could also be of great importance for practitioners. This paper discusses the quality and productivity improvement in a supplier enterprise through a case study. The paper deals with an application of the Six Sigma define-measure-analyze-improve-control methodology in an industry, which provides a framework to identify, quantify and eliminate sources of variation in the operational process in question, to optimize the operation variables, and to improve and sustain performance, viz. process yield, with well-executed control plans. Six Sigma improves the process performance (process yield) of the critical operational process, leading to better utilization of resources, decreased variation, and consistent quality of the process output.

  10. An Information System Development Method Connecting Business Process Modeling and its Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao

    Business process modeling (BPM) is gaining attention as a means of analyzing and improving the business process. BPM analyzes the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value, as a TO-BE model. However, research on techniques that seamlessly connect the business process improvements obtained through BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML, and the implementation can be carried out with UML techniques, we can expect an improvement in the efficiency of information system implementation. In this paper, we describe a system development method that converts the process model obtained by BPM into UML, and the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, a comparison is performed with the case where the system is implemented with the conventional UML technique without going via BPM.

  11. Process mapping as a framework for performance improvement in emergency general surgery.

    PubMed

    DeGirolamo, Kristin; D'Souza, Karan; Hall, William; Joos, Emilie; Garraway, Naisan; Sing, Chad Kim; McLaughlin, Patrick; Hameed, Morad

    2017-12-01

    Emergency general surgery conditions are often thought of as being too acute for the development of standardized approaches to quality improvement. However, process mapping, a concept that has been applied extensively in manufacturing quality improvement, is now being used in health care. The objective of this study was to create process maps for small bowel obstruction in an effort to identify potential areas for quality improvement. We used the American College of Surgeons Emergency General Surgery Quality Improvement Program pilot database to identify patients who received nonoperative or operative management of small bowel obstruction between March 2015 and March 2016. This database, patient charts and electronic health records were used to create process maps from the time of presentation to discharge. Eighty-eight patients with small bowel obstruction (33 operative; 55 nonoperative) were identified. Patients who received surgery had a complication rate of 32%. The processes of care from the time of presentation to the time of follow-up were highly elaborate and variable in terms of duration; however, the sequences of care were found to be consistent. We used data visualization strategies to identify bottlenecks in care, and they showed substantial variability in terms of operating room access. Variability in the operative care of small bowel obstruction is high and represents an important improvement opportunity in general surgery. Process mapping can identify common themes, even in acute care, and suggest specific performance improvement measures.
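
    A basic version of the bottleneck analysis described in this record can be reproduced directly from care-event timestamps. The sketch below is a minimal, hypothetical illustration (the steps and timestamps are invented, not taken from the ACS EGS-QIP pilot database): it computes how long each patient spends between consecutive mapped steps and summarizes the variability, which is how high-variability intervals such as operating room access would surface.

      # Minimal sketch: per-step durations from event timestamps for a mapped
      # care process, to highlight high-variability (bottleneck) intervals.
      # All patients, steps, and timestamps below are illustrative assumptions.
      from datetime import datetime
      from statistics import mean, pstdev

      events = {  # patient -> ordered (step, timestamp) pairs
          "pt1": [("ED arrival", "2016-03-01 08:00"), ("CT scan", "2016-03-01 10:30"),
                  ("OR start", "2016-03-02 14:00"), ("discharge", "2016-03-08 11:00")],
          "pt2": [("ED arrival", "2016-03-03 22:00"), ("CT scan", "2016-03-04 01:15"),
                  ("OR start", "2016-03-04 09:00"), ("discharge", "2016-03-10 16:00")],
      }

      fmt = "%Y-%m-%d %H:%M"
      durations = {}
      for pt, steps in events.items():
          for (s1, t1), (s2, t2) in zip(steps, steps[1:]):
              hours = (datetime.strptime(t2, fmt) - datetime.strptime(t1, fmt)).total_seconds() / 3600
              durations.setdefault(f"{s1} -> {s2}", []).append(hours)

      for interval, hrs in durations.items():
          print(f"{interval:24s} mean {mean(hrs):5.1f} h, sd {pstdev(hrs):4.1f} h")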

  13. Performance in physiology evaluation: possible improvement by active learning strategies.

    PubMed

    Montrezor, Luís H

    2016-12-01

    The evaluation process is complex and extremely important in the teaching/learning process. Evaluations are constantly employed in the classroom to assist students in the learning process and to help teachers improve the teaching process. The use of active methodologies encourages students to participate in the learning process, encourages interaction with their peers, and stimulates thinking about physiological mechanisms. This study examined the performance of medical students on physiology over four semesters with and without active engagement methodologies. Four activities were used: a puzzle, a board game, a debate, and a video. The results show that engaging in activities with active methodologies before a physiology cognitive monitoring test significantly improved student performance compared with not performing the activities. We integrate the use of these methodologies with classic lectures, and this integration appears to improve the teaching/learning process in the discipline of physiology and improves the integration of physiology with cardiology and neurology. In addition, students enjoy the activities and perform better on their evaluations when they use them. Copyright © 2016 The American Physiological Society.

  14. The Use of Lean Six Sigma Methodology in Increasing Capacity of a Chemical Production Facility at DSM.

    PubMed

    Meeuwse, Marco

    2018-03-30

    Lean Six Sigma is an improvement method, combining Lean, which focuses on removing 'waste' from a process, with Six Sigma, which is a data-driven approach, making use of statistical tools. Traditionally it is used to improve the quality of products (reducing defects) or processes (reducing variability). However, it can also be used as a tool to increase the productivity or capacity of a production plant. The Lean Six Sigma methodology is therefore an important pillar of continuous improvement within DSM. In the example shown here, a multistep batch process is improved by analyzing the duration of the relevant process steps and optimizing the procedures. Process steps were performed in parallel instead of sequentially, and some steps were made shorter. The variability was reduced, giving the opportunity to plan more tightly and thereby reduce waiting times. Without any investment in new equipment or technical modifications, the productivity of the plant was improved by more than 20%, only by changing procedures and the programming of the process control system.
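
    The gain described in this record, running batch steps in parallel rather than sequentially, is essentially a critical-path calculation on the batch recipe. The short sketch below is a hypothetical illustration only (the step names and durations are assumed, not DSM's data): it compares the cycle time of a purely sequential recipe with one in which two auxiliary steps are overlapped with the main reaction step.

      # Minimal sketch: batch cycle time when independent steps are run in
      # parallel instead of sequentially. Step names and durations (hours)
      # are illustrative assumptions.
      steps = {"charge": 2.0, "heat-up": 3.0, "reaction": 8.0,
               "utility prep": 2.5, "QC sampling": 1.5, "discharge": 2.0}

      sequential = sum(steps.values())

      # Assume utility prep and QC sampling can overlap the reaction step.
      overlapped = ("utility prep", "QC sampling")
      parallel = sum(d for name, d in steps.items() if name not in overlapped)
      # If the longest overlapped step outlasts the reaction, the excess still adds time.
      parallel += max(0.0, max(steps[name] for name in overlapped) - steps["reaction"])

      gain = (sequential - parallel) / sequential
      print(f"sequential: {sequential:.1f} h, parallel: {parallel:.1f} h, "
            f"cycle-time reduction ≈ {gain:.0%}")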

  15. Using Six Sigma to reduce medication errors in a home-delivery pharmacy service.

    PubMed

    Castle, Lon; Franzblau-Isaac, Ellen; Paulsen, Jim

    2005-06-01

    Medco Health Solutions, Inc. conducted a project to reduce medication errors in its home-delivery service, which is composed of eight prescription-processing pharmacies, three dispensing pharmacies, and six call-center pharmacies. Medco uses the Six Sigma methodology to reduce process variation, establish procedures to monitor the effectiveness of medication safety programs, and determine when these efforts do not achieve performance goals. A team reviewed the processes in home-delivery pharmacy and suggested strategies to improve the data-collection and medication-dispensing practices. A variety of improvement activities were implemented, including a procedure for developing, reviewing, and enhancing sound-alike/look-alike (SALA) alerts and system enhancements to improve processing consistency across the pharmacies. "External nonconformances" were reduced for several categories of medication errors, including wrong-drug selection (33%), wrong directions (49%), and SALA errors (69%). Control charts demonstrated evidence of sustained process improvement and actual reduction in specific medication error elements. Establishing a continuous quality improvement process to ensure that medication errors are minimized is critical to any health care organization providing medication services.
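
    The control charts mentioned in this record are typically attribute (p) charts. The sketch below shows, for assumed monthly error counts and prescription volumes (not Medco's data), how the centre line and 3-sigma control limits would be computed to judge whether a shift in the error rate is sustained.

      # Minimal p-chart sketch for a medication-error rate with 3-sigma limits.
      # Monthly error counts and prescription volumes are assumptions.
      from math import sqrt

      errors  = [52, 48, 55, 35, 30, 28, 27, 25]      # errors per month
      volumes = [100_000] * len(errors)               # prescriptions per month

      p_bar = sum(errors) / sum(volumes)              # overall error proportion
      for month, (d, n) in enumerate(zip(errors, volumes), start=1):
          p = d / n
          sigma = sqrt(p_bar * (1 - p_bar) / n)
          ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
          flag = "out of control" if (p > ucl or p < lcl) else "in control"
          print(f"month {month}: p={p:.5f} (LCL={lcl:.5f}, UCL={ucl:.5f}) {flag}")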

  16. Improving the medical records department processes by lean management

    PubMed Central

    Ajami, Sima; Ketabi, Saeedeh; Sadeghian, Akram; Saghaeinnejad-Isfahani, Sakine

    2015-01-01

    Background: Lean management is a process improvement technique for identifying waste actions and processes in order to eliminate them. The benefits of Lean for healthcare organizations are, first, that the quality of the outcomes in terms of mistakes and errors improves, and second, that the amount of time taken through the whole process significantly improves. Aims: The purpose of this paper is to improve the Medical Records Department (MRD) processes at Ayatolah-Kashani Hospital in Isfahan, Iran by utilizing Lean management. Materials and Methods: This research was an applied, interventional study. The data were collected by brainstorming, observation, interview, and workflow review. The study population included MRD staff and other expert staff within the hospital who were stakeholders and users of the MRD. Statistical Analysis Used: The MRD staff were initially taught the concepts of Lean management and then formed into the MRD Lean team. The team then identified and reviewed the current processes; subsequently, they identified wastes and values, and proposed solutions. Results: The findings showed that across the MRD units (Archive, Coding, Statistics, and Admission), 17 current processes, 28 wastes, and 11 values were identified. In addition, the team offered 27 comments for eliminating the wastes. Conclusion: The MRD is a critical department for the hospital information system and, therefore, the continuous improvement of its services and processes, through scientific methods such as Lean management, is essential. Originality/Value: The study represents one of the few attempts to eliminate wastes in the MRD. PMID:26097862

  17. GOCI Level-2 Processing Improvements and Cloud Motion Analysis

    NASA Technical Reports Server (NTRS)

    Robinson, Wayne D.

    2015-01-01

    The Ocean Biology Processing Group has been working with the Korean Institute of Ocean Science and Technology (KIOST) to process geosynchronous ocean color data from the GOCI (Geostationary Ocean Color Instrument) aboard the COMS (Communications, Ocean and Meteorological Satellite). The level-2 processing program, l2gen has GOCI processing as an option. Improvements made to that processing are discussed here as well as a discussion about cloud motion effects.

  18. Improvement of hospital processes through business process management in Qaem Teaching Hospital: A work in progress.

    PubMed

    Yarmohammadian, Mohammad H; Ebrahimipour, Hossein; Doosty, Farzaneh

    2014-01-01

    In a world of continuously changing business environments, organizations have no option but to deal with such a large degree of transformation in order to adjust to the consequent demands. Therefore, many companies need to continually improve and review their processes to maintain their competitive advantages in an uncertain environment. Meeting these challenges requires implementing the most efficient possible business processes, geared to the needs of the industry and market segments that the organization serves globally. In the last 10 years, total quality management, business process reengineering, and business process management (BPM) have been some of the management tools applied by organizations to increase business competitiveness. This paper is an original article that presents implementation of the "BPM" approach in the healthcare domain, which allows an organization to improve and review its critical business processes. This project was performed in "Qaem Teaching Hospital" in Mashhad city, Iran and consists of four distinct steps: (1) identify business processes, (2) document the process, (3) analyze and measure the process, and (4) improve the process. Implementing BPM in Qaem Teaching Hospital changed the nature of management by allowing the organization to avoid the complexity of disparate, siloed systems. BPM instead enabled the organization to focus on business processes at a higher level.

  19. Women-focused treatment agencies and process improvement: Strategies to increase client engagement

    PubMed Central

    Wisdom, Jennifer P.; Hoffman, Kim; Rechberger, Elke; Seim, Kay; Owens, Betta

    2009-01-01

    Behavioral health treatment agencies often struggle to keep clients engaged in treatment. Women clients often have additional factors such as family responsibilities, financial difficulties, or abuse histories that provide extra challenges to remaining in care. As part of a national initiative, four women-focused drug treatment agencies used process improvement to address treatment engagement. Interviews and focus groups with staff assessed the nature and extent of interventions. Women-focused drug treatment agencies selected relational-based interventions to engage clients in treatment and improved four-week treatment retention from 66% to 76%. Process improvement interventions in women-focused treatment may be useful to improve engagement. PMID:20046914

  20. Kaizen: a process improvement model for the business of health care and perioperative nursing professionals.

    PubMed

    Tetteh, Hassan A

    2012-01-01

    Kaizen is a proven management technique that has a practical application for health care in the context of health care reform and the 2010 Institute of Medicine landmark report on the future of nursing. Compounded productivity is the unique benefit of kaizen, and its principles are change, efficiency, performance of key essential steps, and the elimination of waste through small and continuous process improvements. The kaizen model offers specific instruction for perioperative nurses to achieve process improvement in a five-step framework that includes teamwork, personal discipline, improved morale, quality circles, and suggestions for improvement. Published by Elsevier Inc.

  1. Using continuous process improvement methodology to standardize nursing handoff communication.

    PubMed

    Klee, Kristi; Latta, Linda; Davis-Kirsch, Sallie; Pecchia, Maria

    2012-04-01

    The purpose of this article was to describe the use of the continuous performance improvement (CPI) methodology to standardize nurse shift-to-shift handoff communication. The goals of the process were to standardize the content and process of shift handoff, improve patient safety, increase patient and family involvement in the handoff process, and decrease end-of-shift overtime. This article will describe process changes made over a 4-year period as a result of applying the plan-do-check-act procedure, which is an integral part of the CPI methodology, and discuss further work needed to continue to refine this critical nursing care process. Copyright © 2012 Elsevier Inc. All rights reserved.

  2. Course Development Cycle Time: A Framework for Continuous Process Improvement.

    ERIC Educational Resources Information Center

    Lake, Erinn

    2003-01-01

    Details Edinboro University's efforts to reduce the extended cycle time required to develop new courses and programs. Describes a collaborative process improvement framework, illustrated data findings, the team's recommendations for improvement, and the outcomes of those recommendations. (EV)

  3. Process improvement methods increase the efficiency, accuracy, and utility of a neurocritical care research repository.

    PubMed

    O'Connor, Sydney; Ayres, Alison; Cortellini, Lynelle; Rosand, Jonathan; Rosenthal, Eric; Kimberly, W Taylor

    2012-08-01

    Reliable and efficient data repositories are essential for the advancement of research in Neurocritical care. Various factors, such as the large volume of patients treated within the neuro ICU, their differing length and complexity of hospital stay, and the substantial amount of desired information can complicate the process of data collection. We adapted the tools of process improvement to the data collection and database design of a research repository for a Neuroscience intensive care unit. By the Shewhart-Deming method, we implemented an iterative approach to improve the process of data collection for each element. After an initial design phase, we re-evaluated all data fields that were challenging or time-consuming to collect. We then applied root-cause analysis to optimize the accuracy and ease of collection, and to determine the most efficient manner of collecting the maximal amount of data. During a 6-month period, we iteratively analyzed the process of data collection for various data elements. For example, the pre-admission medications were found to contain numerous inaccuracies after comparison with a gold standard (sensitivity 71% and specificity 94%). Also, our first method of tracking patient admissions and discharges contained higher than expected errors (sensitivity 94% and specificity 93%). In addition to increasing accuracy, we focused on improving efficiency. Through repeated incremental improvements, we reduced the number of subject records that required daily monitoring from 40 to 6 per day, and decreased daily effort from 4.5 to 1.5 h/day. By applying process improvement methods to the design of a Neuroscience ICU data repository, we achieved a threefold improvement in efficiency and increased accuracy. Although individual barriers to data collection will vary from institution to institution, a focus on process improvement is critical to overcoming these barriers.
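
    The accuracy figures quoted for individual repository fields are standard sensitivity/specificity calculations against a gold-standard review. The sketch below is a minimal illustration; the counts are hypothetical and merely chosen so that the output lands near the reported 71%/94%.

      # Minimal sketch: sensitivity and specificity of a data-collection field
      # audited against a gold-standard chart review. Counts are illustrative only.
      def sensitivity_specificity(tp, fp, fn, tn):
          sensitivity = tp / (tp + fn)   # fraction of true positives captured
          specificity = tn / (tn + fp)   # fraction of true negatives captured
          return sensitivity, specificity

      # e.g., the pre-admission medication field compared with chart review
      sens, spec = sensitivity_specificity(tp=85, fp=7, fn=35, tn=110)
      print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")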

  4. SU-E-T-760: Tolerance Design for Site-Specific Range in Proton Patient QA Process Using the Six Sigma Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lah, J; Shin, D; Kim, G

    Purpose: To show how tolerance design and tolerancing approaches can be used to predict and improve the site-specific range in the patient QA process when implementing Six Sigma. Methods: In this study, patient QA plans were selected according to 6 site-treatment groups: head & neck (94 cases), spine (76 cases), lung (89 cases), liver (53 cases), pancreas (55 cases), and prostate (121 cases), treated between 2007 and 2013. We evaluated a Six Sigma model that determines allowable deviations in design parameters and process variables in patient-specific QA; where possible, tolerance may be loosened and then customized if necessary to meet the functional requirements. The Six Sigma problem-solving methodology is known as the DMAIC phases, which stand for: Define a problem or improvement opportunity, Measure process performance, Analyze the process to determine the root causes of poor performance, Improve the process by fixing root causes, and Control the improved process to hold the gains. Results: The process capability for patient-specific range QA is 0.65 with only ±1 mm of tolerance criteria. Our results suggested a tolerance level of ±2–3 mm for prostate and liver cases and ±5 mm for lung cases. We found that customized tolerances between calculated and measured range reduced patient QA plan failures, and almost all sites had failure rates less than 1%. The average QA time also improved from 2 hr to less than 1 hr for all sites, including the planning and converting process, depth-dose measurement, and evaluation. Conclusion: The objective of tolerance design is to achieve optimization beyond that obtained through QA process improvement and statistical analysis function detailing to implement a Six Sigma capable design.
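
    The capability and tolerance figures in this record can be explored with a short calculation. The sketch below is a hedged illustration, not the authors' code: it assumes normally distributed range errors and back-calculates a standard deviation (≈0.51 mm) so that Cp at ±1 mm reproduces the reported 0.65, then shows how Cp and the expected failure rate change as the tolerance is widened.

      # Minimal sketch: process capability (Cp) and expected out-of-tolerance
      # rate for a range-QA process, assuming normally distributed, centered errors.
      # The sigma value below is an assumption chosen to match the reported Cp.
      from scipy.stats import norm

      def capability(tolerance_mm, sigma_mm):
          """Cp = (USL - LSL) / (6 * sigma) for a centered process."""
          usl, lsl = +tolerance_mm, -tolerance_mm
          cp = (usl - lsl) / (6.0 * sigma_mm)
          # Two-sided probability of falling outside the tolerance band.
          fail_rate = 2.0 * norm.sf(tolerance_mm / sigma_mm)
          return cp, fail_rate

      sigma_mm = 0.51  # assumed spread of (measured - calculated) range, mm
      for tol in (1.0, 2.0, 3.0, 5.0):
          cp, fail = capability(tol, sigma_mm)
          print(f"tolerance ±{tol:.0f} mm: Cp = {cp:.2f}, expected failures = {fail:.2%}")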

  5. Coal-oil coprocessing at HTI - development and improvement of the technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stalzer, R.H.; Lee, L.K.; Hu, J.

    1995-12-31

    Co-processing refers to the combined processing of coal and petroleum-derived heavy oil feedstocks. The coal feedstocks used are those typically utilized in direct coal liquefaction: bituminous, subbituminous, and lignites. The petroleum-derived oil is typically a petroleum residuum containing at least 70 wt% material boiling above 525 °C. The combined coal and oil feedstocks are processed simultaneously with the dual objective of liquefying the coal and upgrading the petroleum-derived residuum to lower boiling (<525 °C) premium products. HTI's investigation of the co-processing technology has included work performed in laboratory, bench and PDU scale operations. The concept of co-processing technology is quite simple and a natural outgrowth of the work done with direct coal liquefaction. A 36-month program to evaluate new process concepts in coal-oil coprocessing at the bench scale was begun in September 1994 and runs until September 1997. Included in this continuous bench-scale program are provisions to examine new improvements in areas such as: interstage product separation, feedstock concentrations (coal/oil), improved supported/dispersed catalysts, optimization of reactor temperature sequencing, and in-line hydrotreating. This does not preclude other ideas from DOE contracts and other sources that can lead to improved product quality and economics. This research work has led to important findings which significantly increased liquid yields, improved product quality, and improved process economics.

  6. Co-optimization of lithographic and patterning processes for improved EPE performance

    NASA Astrophysics Data System (ADS)

    Maslow, Mark J.; Timoshkov, Vadim; Kiers, Ton; Jee, Tae Kwon; de Loijer, Peter; Morikita, Shinya; Demand, Marc; Metz, Andrew W.; Okada, Soichiro; Kumar, Kaushik A.; Biesemans, Serge; Yaegashi, Hidetami; Di Lorenzo, Paolo; Bekaert, Joost P.; Mao, Ming; Beral, Christophe; Larivière, Stephane

    2017-03-01

    Complementary lithography is already being used for advanced logic patterns. The tight pitches for 1D Metal layers are expected to be created using spacer-based multiple patterning ArF-i exposures, and the more complex cut/block patterns are made using EUV exposures. At the same time, control requirements of CDU, pattern shift and pitch-walk are approaching sub-nanometer levels to meet edge placement error (EPE) requirements. Local variability components, such as Line Edge Roughness (LER), Local CDU, and Local Placement Error (LPE), are dominant factors in the total Edge Placement error budget. In the lithography process, improving the imaging contrast when printing the core pattern has been shown to improve the local variability. In the etch process, it has been shown that the fusion of atomic level etching and deposition can also improve these local variations. Co-optimization of lithography and etch processing is expected to further improve the performance over individual optimizations alone. To meet the scaling requirements and keep process complexity to a minimum, EUV is increasingly seen as the platform for delivering the exposures for both the grating and the cut/block patterns beyond N7. In this work, we evaluated the overlay and pattern fidelity of an EUV block printed in a negative tone resist on an ArF-i SAQP grating. High-order overlay modeling and corrections during the exposure can reduce overlay error after development, a significant component of the total EPE. During etch, additional degrees of freedom are available to improve the pattern placement error in single layer processes. Process control of advanced pitch nanoscale multi-patterning techniques as described above is exceedingly complicated in a high volume manufacturing environment. Incorporating potential patterning optimizations into both design and HVM controls for the lithography process is expected to bring a combined benefit over individual optimizations. In this work we will show the EPE performance improvement for a 32nm pitch SAQP + block patterned Metal 2 layer by co-optimizing the lithography and etch processes. Recommendations for further improvements and alternative processes will be given.

  7. Measuring, managing and maximizing refinery performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bascur, O.A.; Kennedy, J.P.

    1996-01-01

    Implementing continuous quality improvement is a confluence of total quality management, people empowerment, performance indicators and information engineering. Supporting information technologies allow a refiner to narrow the gap between management objectives and the process control level. Dynamic performance monitoring benefits come from production cost savings, improved communications and enhanced decision making. A refinery workgroup information flow model helps automate continuous improvement of processes, performance and the organization. The paper discusses the rethinking of refinery operations, dynamic performance monitoring, continuous process improvement, the knowledge coordinator and repository manager, an integrated plant operations workflow, and successful implementation.

  8. Improvement of Selected Logistics Processes Using Quality Engineering Tools

    NASA Astrophysics Data System (ADS)

    Zasadzień, Michał; Žarnovský, Jozef

    2018-03-01

    An increase in the number of orders, increasing quality requirements and the speed of order preparation require the implementation of new solutions and the improvement of logistics processes. Any disruption that occurs during execution of an order often leads to customer dissatisfaction, as well as a loss of the customer's confidence. The article presents a case study of the use of quality engineering methods and tools to improve the e-commerce logistics process. This made it possible to identify and prioritize key issues, identify their causes, and formulate improvement and prevention measures.

  9. Process improvement for the safe delivery of multidisciplinary-executed treatments-A case in Y-90 microspheres therapy.

    PubMed

    Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E

    To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment. A generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events with the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments, three misses that went undetected until treatment initiation were recorded over a period of 21 months, and root-cause-analysis was performed to determine causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new check points to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, after two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting which requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  10. Microphysics, Radiation and Surface Processes in the Goddard Cumulus Ensemble (GCE) Model

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2002-01-01

    In this talk, five specific major GCE improvements: (1) ice microphysics, (2) longwave and shortwave radiative transfer processes, (3) land surface processes, (4) ocean surface fluxes and (5) ocean mixed layer processes are presented. The performance of these new GCE improvements will be examined. Observations are used for model validation.

  11. Characterization of stainless steel surface processed using electrolytic oxidation and titanium complex ion solution

    NASA Astrophysics Data System (ADS)

    Kang, Yubin; Choi, Jaeyoung; Park, Jinju; Kim, Woo-Byoung; Lee, Kun-Jae

    2017-09-01

    This study attempts to improve the physical and chemical adhesion between metals and ceramics by using electrolytic oxidation and a titanium organic/inorganic complex ion solution on the SS-304 plate. Surface analysis confirmed the existence of the Ti–O–Mx bonds formed by the bonding between the metal ions and the Ti oxide at the surface of the pre-processed SS plate, and improved chemical adhesion during ceramic coating was expected by confirming the presence of the carboxylic group. The adhesion was evaluated by using the ceramic coating solution in order to assess the improved adhesion of the SS plate under conditions. The results showed that both the adhesion and durability were largely improved in the sample processed with all the pre-processing steps, thus confirming that the physical and chemical adhesion between metals and ceramics can be improved by enhancing the physical roughness via electrolytic oxidation and pre-processing using a Ti complex ion solution.

  12. Staff Training for Business Process Improvement: The Benefit of Role-Plays in the Case of KreditSim

    ERIC Educational Resources Information Center

    Borner, Rene; Moormann, Jurgen; Wang, Minhong

    2012-01-01

    Purpose: The paper aims to explore staff's experience with role-plays using the example of training bank employees in Six Sigma as a major methodology for business process improvement. Design/methodology/approach: The research is based on a case study. A role-play, KreditSim, is used to simulate a loan approval process that has to be improved by…

  13. Plant breeding and genetics

    USDA-ARS?s Scientific Manuscript database

    The ultimate goal of plant breeding is to develop improved crops. Improvements can be made in crop productivity, crop processing and marketing, and/or consumer quality. The process of developing an improved cultivar begins with intercrossing lines with high performance for the traits of interest, th...

  14. The role of CQI in the strategic planning process.

    PubMed

    Sahney, V K; Warden, G L

    1993-01-01

    This article describes the strategic planning process used to define the health care needs of a region and to prepare Henry Ford Health System (HFHS) to meet the needs of the 21st century. It presents key applications of continuous quality improvement in the development and implementation of the strategic plans for HFHS; explains how HFHS adapted the Deming/Shewhart cycle of continuous improvement for the purpose of improving its planning process; and delineates how the strategic planning, financial planning, and quality planning processes have been integrated.

  15. Improved process for generating ClF₃ from ClF and F₂

    DOEpatents

    Reiner, R.H.; Pashley, J.H.; Barber, E.J.

    The invention is an improvement in the process for producing gaseous ClF₃ by reacting ClF and F₂ at elevated temperature. The improved process comprises conducting the reaction in the presence of NiF₂, which preferably is in the form of particles or in the form of a film or layer on a particulate substrate. The nickel fluoride acts as a reaction catalyst, significantly increasing the reaction rate and thus permitting valuable reductions in process temperature, pressure, and/or reactor volume.

  16. A primer on the cost of quality for improvement of laboratory and pathology specimen processes.

    PubMed

    Carlson, Richard O; Amirahmadi, Fazlollaah; Hernandez, James S

    2012-09-01

    In today's environment, many laboratories and pathology practices are challenged to maintain or increase their quality while simultaneously lowering their overall costs. The cost of improving specimen processes is related to quality, and we demonstrate that actual costs can be reduced by designing "quality at the source" into the processes. Various costs are hidden along the total testing process, and we suggest ways to identify opportunities to reduce cost by improving quality in laboratories and pathology practices through the use of Lean, Six Sigma, and industrial engineering.

  17. Improving Immunization Rates Using Lean Six Sigma Processes: Alliance of Independent Academic Medical Centers National Initiative III Project

    PubMed Central

    Hina-Syeda, Hussaini; Kimbrough, Christina; Murdoch, William; Markova, Tsveti

    2013-01-01

    Background Quality improvement education and work in interdisciplinary teams is a healthcare priority. Healthcare systems are trying to meet core measures and provide excellent patient care, thus improving their Hospital Consumer Assessment of Healthcare Providers & Systems scores. Crittenton Hospital Medical Center in Rochester Hills, MI, aligned educational and clinical objectives, focusing on improving immunization rates against pneumonia and influenza prior to the rates being implemented as core measures. Improving immunization rates prevents infections, minimizes hospitalizations, and results in overall improved patient care. Teaching hospitals offer an effective way to work on clinical projects by bringing together the skill sets of residents, faculty, and hospital staff to achieve superior results. Methods We designed and implemented a structured curriculum in which interdisciplinary teams acquired knowledge on quality improvement and teamwork, while focusing on a specific clinical project: improving global immunization rates. We used the Lean Six Sigma process tools to quantify the initial process capability to immunize against pneumococcus and influenza. Results The hospital's process to vaccinate against pneumonia overall was operating at a Z score of 3.13, and the influenza vaccination Z score was 2.53. However, the process to vaccinate high-risk patients against pneumonia operated at a Z score of 1.96. Improvement in immunization rates of high-risk patients became the focus of the project. After the implementation of solutions, the process to vaccinate high-risk patients against pneumonia operated at a Z score of 3.9 with a defects/million opportunities rate of 9,346 and a yield of 93.5%. Revisions to the adult assessment form fixed 80% of the problems identified. Conclusions This process improvement project was not only beneficial in terms of improved quality of patient care but was also a positive learning experience for the interdisciplinary team, particularly for the residents. The hospital has completed quality improvement projects in the past; however, this project was the first in which residents were actively involved. The didactic components and experiential learning were powerfully synergistic. This and similar projects can have far-reaching implications in terms of promoting patient health and improving the quality of care delivered by the healthcare systems and teaching hospitals. PMID:24052758
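
    The Z scores, DPMO, and yield quoted in this record are related through the standard normal distribution. The sketch below converts between DPMO and a process Z score using the conventional 1.5-sigma shift; the shift is the usual Six Sigma convention assumed here for illustration, not a detail stated in the study.

      # Minimal sketch: converting between DPMO and a Six Sigma "Z score"
      # (process sigma) using the conventional 1.5-sigma shift (an assumption).
      from scipy.stats import norm

      def z_from_dpmo(dpmo, shift=1.5):
          return norm.ppf(1.0 - dpmo / 1_000_000) + shift

      def dpmo_from_z(z, shift=1.5):
          return norm.sf(z - shift) * 1_000_000

      print(f"Z for 9,346 DPMO : {z_from_dpmo(9_346):.2f}")   # close to the reported 3.9
      print(f"DPMO at Z = 1.96 : {dpmo_from_z(1.96):,.0f}")   # pre-improvement high-risk process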

  18. Defective Reduction in Frozen Pie Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Nooted, Oranuch; Tangjitsitcharoen, Somkiat

    2017-06-01

    Frozen pie production has a lot of defects, resulting in high production costs. The failure mode and effect analysis (FMEA) technique has been applied to improve the frozen pie process. A Pareto chart is also used to determine the major defects of frozen pie. There are 3 main processes that cause the defects: the 1st freezing to glazing process, the forming process, and the folding process. The Risk Priority Number (RPN) obtained from FMEA is analyzed to reduce the defects. If the RPN of a cause exceeds 45, the process is considered for improvement and selected for corrective and preventive actions. The results showed that RPN values decreased after the correction. Therefore, the implementation of the FMEA technique can help to improve the performance of the frozen pie process and reduce the defects by approximately 51.9%.
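
    The RPN screening step described in this record is a simple product of three ratings. The sketch below is a hypothetical illustration (the causes and ratings are invented, not the study's data): it computes RPN = severity × occurrence × detection for each candidate cause and flags those above the 45 threshold for corrective action.

      # Minimal FMEA sketch: RPN = severity * occurrence * detection,
      # flagging causes above the threshold used in the study (45).
      # The failure causes and ratings below are illustrative assumptions.
      THRESHOLD = 45

      causes = [
          # (process step, cause, severity, occurrence, detection)
          ("1st freezing to glazing", "glaze temperature drift",  6, 4, 3),
          ("forming",                 "dough weight variation",   5, 3, 2),
          ("folding",                 "misaligned folding guide", 7, 3, 4),
      ]

      for step, cause, sev, occ, det in causes:
          rpn = sev * occ * det
          action = "improve" if rpn > THRESHOLD else "monitor"
          print(f"{step:25s} {cause:25s} RPN={rpn:3d} -> {action}")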

  19. Activating clinical trials: a process improvement approach.

    PubMed

    Martinez, Diego A; Tsalatsanis, Athanasios; Yalcin, Ali; Zayas-Castro, José L; Djulbegovic, Benjamin

    2016-02-24

    The administrative process associated with clinical trial activation has been criticized as costly, complex, and time-consuming. Prior research has concentrated on identifying administrative barriers and proposing various solutions to reduce activation time, and consequently associated costs. Here, we expand on previous research by incorporating social network analysis and discrete-event simulation to support process improvement decision-making. We searched for all operational data associated with the administrative process of activating industry-sponsored clinical trials at the Office of Clinical Research of the University of South Florida in Tampa, Florida. We limited the search to those trials initiated and activated between July 2011 and June 2012. We described the process using value stream mapping, studied the interactions of the various process participants using social network analysis, and modeled potential process modifications using discrete-event simulation. The administrative process comprised 5 sub-processes, 30 activities, 11 decision points, 5 loops, and 8 participants. The mean activation time was 76.6 days. Rate-limiting sub-processes were those of contract and budget development. Key participants during contract and budget development were the Office of Clinical Research, sponsors, and the principal investigator. Simulation results indicate that slight increments in the number of trials arriving at the Office of Clinical Research would increase activation time by 11%. Also, increasing the efficiency of contract and budget development would reduce the activation time by 28%. Finally, better synchronization between contract and budget development would reduce time spent on batching documentation; however, no improvements would be attained in total activation time. The presented process improvement analytic framework not only identifies administrative barriers, but also helps to devise and evaluate potential improvement scenarios. The strength of our framework lies in its system analysis approach that recognizes the stochastic duration of the activation process and the interdependence between process activities and entities.
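
    The discrete-event simulation component of such a framework can be prototyped in a few lines. The sketch below is a simplified, hypothetical model rather than the authors' simulation: trials arrive at random, queue for a limited pool of Office of Clinical Research analysts, and pass through contract and budget development in sequence. The arrival rate, staffing level, and step durations are all assumptions; the SimPy library is used for the event loop.

      # Simplified discrete-event sketch of clinical-trial activation.
      # Durations, staffing, and arrival rate are illustrative assumptions.
      import random
      import simpy

      SIM_DAYS = 3 * 365   # simulate three years of arrivals
      STAFF = 10           # assumed number of OCR contract/budget analysts

      def trial(env, staff, activation_times):
          """One trial passing through contract and budget development."""
          start = env.now
          for step, mean_days in (("contract development", 30.0), ("budget development", 25.0)):
              with staff.request() as req:
                  yield req                                        # wait for an analyst
                  yield env.timeout(random.expovariate(1.0 / mean_days))
          activation_times.append(env.now - start)

      def arrivals(env, staff, activation_times, mean_interarrival_days=7.0):
          while True:
              yield env.timeout(random.expovariate(1.0 / mean_interarrival_days))
              env.process(trial(env, staff, activation_times))

      random.seed(42)
      env = simpy.Environment()
      staff = simpy.Resource(env, capacity=STAFF)
      activation_times = []
      env.process(arrivals(env, staff, activation_times))
      env.run(until=SIM_DAYS)
      print(f"trials activated: {len(activation_times)}, "
            f"mean activation time: {sum(activation_times) / len(activation_times):.1f} days")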

  20. Development of the Upgraded DC Brush Gear Motor for Spacebus Platforms

    NASA Technical Reports Server (NTRS)

    Berning, Robert H.; Viout, Olivier

    2010-01-01

    The obsolescence of materials and processes used in the manufacture of traditional DC brush gear motors has necessitated the development of an upgraded DC brush gear motor (UBGM). The current traditional DC brush gear motor (BGM) design was evaluated using the Six Sigma process to identify potential design and production process improvements. The development effort resulted in a qualified UBGM design, which improved manufacturability and reduced production costs. Using Six Sigma processes and incorporating lessons learned during the development process also improved motor performance for the UBGM, making it a more viable option for future use as a deployment mechanism in space flight applications.

  1. Software process improvement in the NASA software engineering laboratory

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  2. Hybrid 3D printing by bridging micro/nano processes

    NASA Astrophysics Data System (ADS)

    Yoon, Hae-Sung; Jang, Ki-Hwan; Kim, Eunseob; Lee, Hyun-Taek; Ahn, Sung-Hoon

    2017-06-01

    A hybrid 3D printing process was developed for multiple-material/freeform nano-scale manufacturing. The process consisted of aerodynamically focused nanoparticle (AFN) printing, micro-machining, focused ion beam milling, and spin-coating. Theoretical and experimental investigations were carried out to improve the compatibility of each of the processes, enabling bridging of various different techniques. The resulting hybrid process could address the limitations of individual processes, enabling improved process scaling and dimensional degrees of freedom, without losing the advantages of the existing processes. The minimum structure width can be reduced to 50 nm using undercut structures. In addition, AFN printing employs particle impact for adhesion, and various inorganic materials are suitable for printing, including metals and functional ceramics. Using the developed system, we fabricated bi-material cantilevers for applications as a thermal actuator. The mechanical and thermal properties of the structure were investigated using an in situ measurement system, and irregular thermal phenomena due to the fabrication process were analyzed. We expect that this work will lead to improvements in the area of customized nano-scale manufacturing, as well as further improvements in manufacturing technology by combining different fabrication techniques.

  3. Some Improvements in H-PDLCs

    NASA Technical Reports Server (NTRS)

    Crawford, Gregory P.; Li, Liuliu

    2005-01-01

    Some improvements have been made in the formulation of holographically formed polymer-dispersed liquid crystals (H-PDLCs) and in the fabrication of devices made from these materials, with resulting improvements in performance. H-PDLCs are essentially volume Bragg gratings. Devices made from H-PDLCs function as electrically switchable reflective filters. Heretofore, it has been necessary to apply undesirably high drive voltages in order to switch H-PDLC devices. Many scientific papers on H-PDLCs and on the potential utility of H-PDLC devices for display and telecommunication applications have been published. However, until now, little has been published about improving quality control in synthesis of H-PDLCs and fabrication of H-PDLC devices to minimize (1) spatial nonuniformities within individual devices, (2) nonuniformities among nominally identical devices, and (3) variations in performance among nominally identical devices. The improvements reported here are results of a research effort directed partly toward solving these quality-control problems and partly toward reducing switching voltages. The quality-control improvements include incorporation of a number of process controls to create a relatively robust process, such that the H-PDLC devices fabricated in this process are more nearly uniform than were those fabricated in a prior laboratory-type process. The improved process includes ultrasonic mixing, ultrasonic cleaning, the use of a micro dispensing technique, and the use of a bubble press.

  4. IMPROVING THE ENVIRONMENTAL PERFORMANCE OF CHEMICAL PROCESSES THROUGH THE USE OF INFORMATION TECHNOLOGY

    EPA Science Inventory

    Efforts are currently underway at the USEPA to develop information technology applications to improve the environmental performance of the chemical process industry. These efforts include the use of genetic algorithms to optimize different process options for minimal environmenta...

  5. Producing Quantum Dots by Spray Pyrolysis

    NASA Technical Reports Server (NTRS)

    Banger, Kulbinder; Jin, Michael H.; Hepp, Aloysius

    2006-01-01

    An improved process for making nanocrystallites, commonly denoted quantum dots (QDs), is based on spray pyrolysis. Unlike the process used heretofore, the improved process is amenable to mass production of either passivated or non-passivated QDs, with computer control to ensure near uniformity of size.

  6. Process Security in Chemical Engineering Education

    ERIC Educational Resources Information Center

    Piluso, Cristina; Uygun, Korkut; Huang, Yinlun; Lou, Helen H.

    2005-01-01

    The threats of terrorism have greatly alerted the chemical process industries to assure plant security at all levels: infrastructure-improvement-focused physical security, information-protection-focused cyber security, and design-and-operation-improvement-focused process security. While developing effective plant security methods and technologies…

  7. NCCDS configuration management process improvement

    NASA Technical Reports Server (NTRS)

    Shay, Kathy

    1993-01-01

    By concentrating on defining and improving specific Configuration Management (CM) functions, processes, procedures, personnel selection/development, and tools, internal and external customers received improved CM services. Job performance within the section increased in both satisfaction and output. Participation in achieving major improvements has led to the delivery of consistent quality CM products as well as significant decreases in every measured CM metrics category.

  8. Application of Six Sigma towards improving surgical outcomes.

    PubMed

    Shukla, P J; Barreto, S G; Nadkarni, M S

    2008-01-01

    Six Sigma is a 'process excellence' tool targeting continuous improvement achieved by providing a methodology for improving key steps of a process. It is ripe for application in health care, since almost all health care processes require a near-zero tolerance for mistakes. The aim of this study is to apply the Six Sigma methodology to a clinical surgical process and to assess the improvement (if any) in the outcomes and patient care. The guiding principles of Six Sigma, namely DMAIC (Define, Measure, Analyze, Improve, Control), were used to analyze the impact of the double stapling technique (DST) on improving sphincter preservation rates for rectal cancer. The analysis using the Six Sigma methodology revealed a Sigma score of 2.10 in relation to successful sphincter preservation. This score demonstrates an improvement over the previous technique (73% versus the previous 54%). This study represents one of the first clinical applications of Six Sigma in the surgical field. By understanding, accepting, and applying the principles of Six Sigma, we have an opportunity to transfer a very successful management philosophy to facilitate the identification of key steps that can improve outcomes and ultimately patient safety and the quality of surgical care provided.

  9. Improving operational anodising process performance using simulation approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liong, Choong-Yeun, E-mail: lg@ukm.edu.my; Ghazali, Syarah Syahidah, E-mail: syarah@gapps.kptm.edu.my

    The use of aluminium is very widespread, especially in the transportation, electrical and electronics, architectural, automotive and engineering application sectors. Therefore, the anodizing process is an important process for aluminium in order to make the aluminium durable, attractive and weather resistant. This research is focused on the anodizing process operations in the manufacturing and supply of aluminium extrusions. The data required for the development of the model were collected from the observations and interviews conducted in the study. To study the current system, the processes involved in the anodizing process are modeled by using Arena 14.5 simulation software. Those processes consist of five main processes, namely the degreasing process, the etching process, the desmut process, the anodizing process, the sealing process and 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on the comparisons that have been done between the improvement methods, the productivity could be increased by reallocating the workers and reducing loading time.

  10. The Goddard Cumulus Ensemble Model (GCE): Improvements and Applications for Studying Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Lang, Stephen E.; Zeng, Xiping; Li, Xiaowen; Matsui, Toshi; Mohr, Karen; Posselt, Derek; Chern, Jiundar; Peters-Lidard, Christa; Norris, Peter M.

    2014-01-01

    Convection is the primary transport process in the Earth's atmosphere. About two-thirds of the Earth's rainfall and severe floods derive from convection. In addition, two-thirds of the global rain falls in the tropics, while the associated latent heat release accounts for three-fourths of the total heat energy for the Earth's atmosphere. Cloud-resolving models (CRMs) have been used to improve our understanding of cloud and precipitation processes and phenomena from micro-scale to cloud-scale and mesoscale as well as their interactions with radiation and surface processes. CRMs use sophisticated and realistic representations of cloud microphysical processes and can reasonably well resolve the time evolution, structure, and life cycles of clouds and cloud systems. CRMs also allow for explicit interaction between clouds, outgoing longwave (cooling) and incoming solar (heating) radiation, and ocean and land surface processes. Observations are required to initialize CRMs and to validate their results. The Goddard Cumulus Ensemble model (GCE) has been developed and improved at NASA/Goddard Space Flight Center over the past three decades. It is a multi-dimensional non-hydrostatic CRM that can simulate clouds and cloud systems in different environments. Early improvements and testing were presented in Tao and Simpson (1993) and Tao et al. (2003a). A review on the application of the GCE to the understanding of precipitation processes can be found in Simpson and Tao (1993) and Tao (2003). In this paper, recent model improvements (microphysics, radiation and land surface processes) are described along with their impact and performance on cloud and precipitation events in different geographic locations via comparisons with observations. In addition, recent advanced applications of the GCE are presented that include understanding the physical processes responsible for diurnal variation, examining the impact of aerosols (cloud condensation nuclei or CCN and ice nuclei or IN) on precipitation processes, utilizing a satellite simulator to improve the microphysics, providing better simulations for satellite-derived latent heating retrieval, and coupling with a general circulation model to improve the representation of precipitation processes.

  11. An intelligent processing environment for real-time simulation

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Wells, Buren Earl, Jr.

    1988-01-01

    The development of a highly efficient and thus truly intelligent processing environment for real-time general purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA type hardware configurations that support improved simulation are investigated. Three direct execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.

  12. “Using Statistical Comparisons between SPartICus Cirrus Microphysical Measurements, Detailed Cloud Models, and GCM Cloud Parameterizations to Understand Physical Processes Controlling Cirrus Properties and to Improve the Cloud Parameterizations”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Sarah

    2015-12-01

    The dual objectives of this project were improving our basic understanding of processes that control cirrus microphysical properties and improvement of the representation of these processes in the parameterizations. A major effort in the proposed research was to integrate, calibrate, and better understand the uncertainties in all of these measurements.

  13. Proceedings of Interservice/Industry Training Systems and Education Conference (16th) Held on 28 November -1 December 1994.

    DTIC Science & Technology

    1994-12-01

    INTRODUCTION: In simplified terms, acquisition... be familiar: best value source selection, processes and metrics, continuous improvement of a training... proposed processes and metrics are placed in the contract in a... continuous improvement, MIL-STD-1379D, the systems approach to training, concurrent... identification and correction of errors are critical to software product correctness and quality. Correcting...

  14. Case Study: Accelerating Process Improvement by Integrating the TSP and CMMI

    DTIC Science & Technology

    2005-12-01

    improve their work? Watts S. Humphrey, a founder of the process improvement initiative at the SEI, decided to apply SW-CMM principles to the... authorized PSP instructor. At Schwalb's urging, Watts Humphrey briefed the SLT on the PSP and TSP, and after the briefing, the team understood... hefley.html. [Humphrey 96] Humphrey, Watts S. Introduction to the Personal Software Process. Boston, MA: Addison-Wesley Publishing Company, Inc., 1996

  15. Defense Acquisitions: How and Where DOD Spends Its Contracting Dollars

    DTIC Science & Technology

    2015-04-30

    ...process. GSA is undertaking a multi-year effort to improve the reliability and usefulness of the information contained in FPDS and other federal ... Improve FPDS: According to GSA, a number of data systems, including FPDS, are undergoing a significant overhaul. This overhaul is a multi-year process ... "data accuracy and completeness, then initiating a process to ensure that these standards are met, would improve data accuracy and completeness." U.S...

  16. Design of launch systems using continuous improvement process

    NASA Technical Reports Server (NTRS)

    Brown, Richard W.

    1995-01-01

    The purpose of this paper is to identify a systematic process for improving ground operations for future launch systems. This approach is based on the Total Quality Management (TQM) continuous improvement process. While the continuous improvement process is normally identified with making incremental changes to an existing system, it can be used on new systems if they use past experience as a knowledge base. In the case of the Reusable Launch Vehicle (RLV), the Space Shuttle operations provide many lessons. The TQM methodology used for this paper will be borrowed from the United States Air Force 'Quality Air Force' Program. There is a general overview of the continuous improvement process, with concentration on the formulation phase. During this phase, critical analyses are conducted to determine the strategy and goals for the remaining development process. These analyses include analyzing the mission from the customer's point of view, developing an operations concept for the future, assessing current capabilities and determining the gap to be closed between current capabilities and future needs and requirements. A brief analysis of the RLV, relative to the Space Shuttle, will be used to illustrate the concept. Using the continuous improvement design concept has many advantages. These include a customer-oriented process which will develop a more marketable product and a better integration of operations and systems during the design phase. But the use of TQM techniques will require changes, including more discipline in the design process and more emphasis on data gathering for operational systems. The benefits will far outweigh the additional effort.

  17. Patterned wafer geometry grouping for improved overlay control

    NASA Astrophysics Data System (ADS)

    Lee, Honggoo; Han, Sangjun; Woo, Jaeson; Park, Junbeom; Song, Changrock; Anis, Fatima; Vukkadala, Pradeep; Jeon, Sanghuck; Choi, DongSub; Huang, Kevin; Heo, Hoyoung; Smith, Mark D.; Robinson, John C.

    2017-03-01

    Process-induced overlay errors originating outside the litho cell, including those from non-uniform wafer stress, have become a significant contributor to the overlay error budget. Previous studies have shown the correlation between process-induced stress and overlay and the opportunity for improvement in process control, including the use of patterned wafer geometry (PWG) metrology to reduce stress-induced overlay signatures. Key challenges of volume semiconductor manufacturing are how to improve not only the magnitude of these signatures but also the wafer-to-wafer variability. This work involves a novel technique of using PWG metrology to provide improved litho control through wafer-level grouping based on incoming process-induced overlay, relevant for both 3D NAND and DRAM. Examples shown in this study are from 19 nm DRAM manufacturing.
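
    The wafer-level grouping step described above can be illustrated with a minimal sketch: wafers are clustered by a process-induced geometry signature so that each group can receive its own litho overlay correction. The k-means routine and the per-wafer metrics below are illustrative assumptions, not the study's method or data.

      import numpy as np

      def kmeans(features, k, iters=50, seed=0):
          """Plain NumPy k-means over an (n_wafers, n_metrics) feature array."""
          rng = np.random.default_rng(seed)
          centers = features[rng.choice(len(features), size=k, replace=False)].astype(float)
          for _ in range(iters):
              # Assign each wafer to the nearest group center.
              dists = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
              labels = dists.argmin(axis=1)
              # Recompute centers; keep the old center if a group becomes empty.
              for j in range(k):
                  if np.any(labels == j):
                      centers[j] = features[labels == j].mean(axis=0)
          return labels, centers

      # Hypothetical per-wafer geometry metrics, e.g. a shape-residual amplitude and an
      # in-plane distortion proxy derived from PWG measurements (arbitrary units).
      wafer_metrics = np.array([[1.2, 0.40], [1.1, 0.50], [3.0, 1.80], [2.9, 1.70], [1.3, 0.45]])
      groups, _ = kmeans(wafer_metrics, k=2)
      print(groups)  # wafers sharing a stress signature land in the same group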

  18. System design and improvement of an emergency department using Simulation-Based Multi-Objective Optimization

    NASA Astrophysics Data System (ADS)

    Goienetxea Uriarte, A.; Ruiz Zúñiga, E.; Urenda Moris, M.; Ng, A. H. C.

    2015-05-01

    Discrete Event Simulation (DES) is nowadays widely used to support decision makers in system analysis and improvement. However, the use of simulation for improving stochastic logistic processes is not common among healthcare providers. Improving healthcare systems requires dealing with trade-offs among optimal solutions that take multiple variables and objectives into consideration. Complementing DES with Multi-Objective Optimization (together, Simulation-Based Multi-Objective Optimization, or SMO) creates a superior base for finding these solutions and, in consequence, facilitates the decision-making process. This paper presents how SMO has been applied for system improvement analysis in a Swedish Emergency Department (ED). A significant number of input variables, constraints and objectives were considered when defining the optimization problem. As a result of the project, the decision makers were provided with a range of optimal solutions that considerably reduce the length of stay and waiting times for the ED patients. SMO has proved to be an appropriate technique to support healthcare system design and improvement processes. A key factor for the success of this project has been the involvement and engagement of the stakeholders during the whole process.
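
    The trade-off step at the heart of such a study can be sketched as a Pareto filter over simulated configurations: only non-dominated solutions are presented to the decision makers. The objective names and values below are hypothetical stand-ins for outputs of the DES model, not results from the Swedish ED project.

      from typing import List, Tuple

      def pareto_front(points: List[Tuple[float, ...]]) -> List[Tuple[float, ...]]:
          """Keep points not dominated by any other point (all objectives minimized)."""
          front = []
          for p in points:
              dominated = any(all(q[i] <= p[i] for i in range(len(p))) and q != p
                              for q in points)
              if not dominated:
                  front.append(p)
          return front

      # Hypothetical (length_of_stay_min, waiting_time_min, extra_staff) per DES scenario.
      scenarios = [(210.0, 65.0, 2), (190.0, 70.0, 3), (250.0, 60.0, 1), (260.0, 90.0, 1)]
      print(pareto_front(scenarios))  # only the non-dominated configurations remain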

  19. Health care managers' views on and approaches to implementing models for improving care processes.

    PubMed

    Andreasson, Jörgen; Eriksson, Andrea; Dellve, Lotta

    2016-03-01

    To develop a deeper understanding of health-care managers' views on and approaches to the implementation of models for improving care processes. In health care, there are difficulties in implementing models for improving care processes that have been decided on by upper management. Leadership approaches to this implementation can affect the outcome. In-depth interviews with first- and second-line managers in Swedish hospitals were conducted and analysed using grounded theory. 'Coaching for participation' emerged as a central theme for managers in handling top-down initiated process development. The vertical approach in this coaching addresses how managers attempt to sustain unit integrity through adapting and translating orders from top management. The horizontal approach in the coaching refers to managers' strategies for motivating and engaging their employees in implementation work. Implementation models for improving care processes require a coaching leadership built on close manager-employee interaction, mindfulness regarding the pace of change at the unit level, managers with the competence to share responsibility with their teams and engaged employees with the competence to share responsibility for improving the care processes, and organisational structures that support process-oriented work. Implications for nursing management are the importance of giving nurse managers knowledge of change management. © 2015 John Wiley & Sons Ltd.

  20. Improving ED specimen TAT using Lean Six Sigma.

    PubMed

    Sanders, Janet H; Karr, Tedd

    2015-01-01

    Lean and Six Sigma are continuous improvement methodologies that have garnered international fame for improving manufacturing and service processes. Increasingly these methodologies are demonstrating their power to also improve healthcare processes. The purpose of this paper is to discuss a case study for the application of Lean and Six Sigma tools in the reduction of turnaround time (TAT) for Emergency Department (ED) specimens. This application of the scientific methodologies uncovered opportunities to improve the entire ED to lab system for the specimens. This case study provides details on the completion of a Lean Six Sigma project in a 1,000-bed tertiary care teaching hospital. Six Sigma's Define, Measure, Analyze, Improve, and Control methodology is very similar to good medical practice: first, relevant information is obtained and assembled; second, a careful and thorough diagnosis is completed; third, a treatment is proposed and implemented; and fourth, checks are made to determine if the treatment was effective. Lean's primary goal is to do more with less work and waste. The Lean methodology was used to identify and eliminate waste through rapid implementation of change. The initial focus of this project was the reduction of turnaround times for ED specimens. However, the results led to better processes for both the internal and external customers of this and other processes. The project results included: a 50 percent decrease in vials used for testing, a 50 percent decrease in unused or extra specimens, a 90 percent decrease in ED specimens without orders, a 30 percent decrease in complete blood count analysis (CBCA) Median TAT, a 50 percent decrease in CBCA TAT Variation, a 10 percent decrease in Troponin TAT Variation, an 18.2 percent decrease in URPN TAT Variation, and a 2-5 minute decrease in ED registered nurses' rainbow draw time. This case study demonstrated how the quantitative power of Six Sigma and the speed of Lean worked in harmony to improve the blood draw process for a 1,000-bed tertiary care teaching hospital. The blood draw process is a standard process used in hospitals to collect blood chemistry and hematology information for clinicians. The methods used in this case study demonstrated valuable and practical applications of process improvement methodologies that can be used for any hospital process and/or service environment. While this is not the first case study that has demonstrated the use of continuous process improvement methodologies to improve a hospital process, it is unique in the way in which it utilizes the strength of the project-focused approach, which adheres more to the structure and rigor of Six Sigma and relies less on the speed of Lean. Additionally, the application of these methodologies in healthcare is emerging research.

  1. Improved hybridization of Fuzzy Analytic Hierarchy Process (FAHP) algorithm with Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW)

    NASA Astrophysics Data System (ADS)

    Zaiwani, B. E.; Zarlis, M.; Efendi, S.

    2018-03-01

    This research improves on an earlier hybridization of the Fuzzy Analytic Hierarchy Process (FAHP) algorithm with the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS) for selecting the best bank chief inspector based on several qualitative and quantitative criteria with various priorities. To improve on that work, hybridization of the FAHP algorithm with the Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW) algorithm was adopted, applying FAHP to the weighting process and SAW to the ranking process to determine employee promotion at a government institution. The improved average Efficiency Rate (ER) is 85.24%, which means this research succeeded in improving on the previous research's value of 77.82%. Keywords: Ranking and Selection, Fuzzy AHP, Fuzzy TOPSIS, FMADM-SAW.
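
    A minimal sketch of the Simple Additive Weighting (SAW) ranking step used in FMADM-SAW is shown below; in the study the criterion weights would come from FAHP, while the weights, criteria, and candidate scores given here are hypothetical.

      import numpy as np

      def saw_rank(scores, weights, benefit):
          """scores: (n_alternatives, n_criteria); benefit[j] is True if larger is better."""
          scores = np.asarray(scores, dtype=float)
          norm = np.empty_like(scores)
          for j in range(scores.shape[1]):
              col = scores[:, j]
              # Benefit criteria are normalized by the column maximum,
              # cost criteria by dividing the column minimum by each value.
              norm[:, j] = col / col.max() if benefit[j] else col.min() / col
          totals = norm @ np.asarray(weights, dtype=float)
          return totals, np.argsort(-totals)

      # Hypothetical candidates scored on exam result (benefit), training cost (cost),
      # and appraisal (benefit); weights are assumed to come from the FAHP stage.
      scores = [[80, 300, 7], [75, 250, 9], [90, 400, 6]]
      weights = [0.5, 0.2, 0.3]
      totals, ranking = saw_rank(scores, weights, benefit=[True, False, True])
      print(totals, ranking)  # the highest total identifies the recommended candidate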

  2. Marshaling and Acquiring Resources for the Process Improvement Process

    DTIC Science & Technology

    1993-06-01

    stakeholders. ( Geber , 1990) D. IDENTIFYING SUPPLIERS Suppliers are just as crucial to setting requirements for processes as are customers. Although...output ( Geber , 1990, p. 32). Before gathering resources for process improvement, the functional manager must ensure that the relationship of internal...him patent information and clerical people process his applications. ( Geber , 1990, pp. 29-34) To get the full benefit of a white-collar worker as a

  3. IRB Process Improvements: A Machine Learning Analysis.

    PubMed

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
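
    The kind of analysis described can be sketched with a small supervised model that relates protocol characteristics to review time; the feature names, encodings, and data below are hypothetical, and scikit-learn is assumed to be available.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      # Columns: review_type (0=exempt, 1=expedited, 2=full board), va_purview (0/1),
      # assigned_staff_id (label-encoded for brevity). All values are made up.
      X = np.array([[0, 0, 1], [1, 0, 2], [2, 1, 3], [2, 0, 1], [1, 1, 2], [0, 1, 3]])
      y = np.array([12, 30, 75, 60, 40, 15])  # processing time in days (made up)

      model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
      # Feature importances hint at which protocol/IRB characteristics drive delays.
      print(dict(zip(["review_type", "va_purview", "assigned_staff_id"],
                     model.feature_importances_.round(3))))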

  4. Improved molding process ensures plastic parts of higher tensile strength

    NASA Technical Reports Server (NTRS)

    Heier, W. C.

    1968-01-01

    A single molding process ensures that plastic parts (of a given mechanical design) produced from a conventional thermosetting molding compound will have maximum tensile strength. The process can also be used for other thermosetting compounds to produce parts with improved physical properties.

  5. Using IT to improve quality at NewYork-Presbyterian Hospital: a requirements-driven strategic planning process.

    PubMed

    Kuperman, Gilad J; Boyer, Aurelia; Cole, Curt; Forman, Bruce; Stetson, Peter D; Cooper, Mary

    2006-01-01

    At NewYork-Presbyterian Hospital, we are committed to the delivery of high quality care. We have implemented a strategic planning process to determine the information technology initiatives that will best help us improve quality. The process began with the creation of a Clinical Quality and IT Committee. The Committee identified 2 high priority goals that would enable demonstrably high quality care: 1) excellence at data warehousing, and 2) optimal use of automated clinical documentation to capture encounter-related quality and safety data. For each high priority goal, a working group was created to develop specific recommendations. The Data Warehousing subgroup has recommended the implementation of an architecture management process and an improved ability for users to get access to aggregate data. The Structured Documentation subgroup is establishing recommendations for a documentation template creation process. The strategic planning process at times is slow, but assures that the organization is focusing on the information technology activities most likely to lead to improved quality.

  6. Ensemble analyses improve signatures of tumour hypoxia and reveal inter-platform differences

    PubMed Central

    2014-01-01

    Background: The reproducibility of transcriptomic biomarkers across datasets remains poor, limiting clinical application. We and others have suggested that this is in part caused by differential error structure between datasets and its incomplete removal by pre-processing algorithms. Methods: To test this hypothesis, we systematically assessed the effects of pre-processing on biomarker classification using 24 different pre-processing methods and 15 distinct signatures of tumour hypoxia in 10 datasets (2,143 patients). Results: We confirm strong pre-processing effects for all datasets and signatures, and find that these differ between microarray versions. Importantly, exploiting different pre-processing techniques in an ensemble technique improved classification for a majority of signatures. Conclusions: Assessing biomarkers using an ensemble of pre-processing techniques shows clear value across multiple diseases, datasets and biomarkers. Importantly, ensemble classification improves biomarkers with initially good results but does not result in spuriously improved performance for poor biomarkers. While further research is required, this approach has the potential to become a standard for transcriptomic biomarkers. PMID:24902696
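
    The ensemble idea can be sketched as applying the same signature to several differently pre-processed versions of the data and taking a majority vote per patient; the toy classifier, placeholder pre-processing variants, and random data below are illustrative assumptions, not the study's pipeline.

      import numpy as np

      def classify(expr_matrix, signature_genes):
          """Toy hypoxia call: signature score above the cohort median -> 1."""
          score = expr_matrix[:, signature_genes].mean(axis=1)
          return (score > np.median(score)).astype(int)

      def ensemble_classify(raw_data, preprocessors, signature_genes):
          votes = np.array([classify(pp(raw_data), signature_genes) for pp in preprocessors])
          return (votes.mean(axis=0) >= 0.5).astype(int)  # majority vote per patient

      # Placeholder pre-processing variants (stand-ins for methods such as RMA or MAS5).
      preprocessors = [
          lambda x: x,
          lambda x: np.log2(x + 1.0),
          lambda x: (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-9),
      ]
      rng = np.random.default_rng(0)
      raw = rng.gamma(shape=2.0, scale=50.0, size=(8, 20))  # 8 patients x 20 genes
      print(ensemble_classify(raw, preprocessors, signature_genes=[0, 3, 5]))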

  7. Integration of Value Stream Map and Healthcare Failure Mode and Effect Analysis into Six Sigma Methodology to Improve Process of Surgical Specimen Handling.

    PubMed

    Hung, Sheng-Hui; Wang, Pa-Chun; Lin, Hung-Chun; Chen, Hung-Ying; Su, Chao-Ton

    2015-01-01

    Specimen handling is a critical patient safety issue. Problematic handling processes, such as misidentification (of patients, surgical sites, or specimen counts), specimen loss, or improper specimen preparation, can lead to serious patient harm and lawsuits. Value stream map (VSM) is a tool used to find out non-value-added work, enhance the quality, and reduce the cost of the studied process. On the other hand, healthcare failure mode and effect analysis (HFMEA) is now frequently employed to avoid possible medication errors in healthcare processes. Both of them have a goal similar to Six Sigma methodology for process improvement. This study proposes a model that integrates VSM and HFMEA into the framework, which mainly consists of define, measure, analyze, improve, and control (DMAIC), of Six Sigma. A Six Sigma project for improving the process of surgical specimen handling in a hospital was conducted to demonstrate the effectiveness of the proposed model.

  8. Using IT to Improve Quality at NewYork-Presbyterian Hospital: A Requirements-Driven Strategic Planning Process

    PubMed Central

    Kuperman, Gilad J.; Boyer, Aurelia; Cole, Curt; Forman, Bruce; Stetson, Peter D.; Cooper, Mary

    2006-01-01

    At NewYork-Presbyterian Hospital, we are committed to the delivery of high quality care. We have implemented a strategic planning process to determine the information technology initiatives that will best help us improve quality. The process began with the creation of a Clinical Quality and IT Committee. The Committee identified 2 high priority goals that would enable demonstrably high quality care: 1) excellence at data warehousing, and 2) optimal use of automated clinical documentation to capture encounter-related quality and safety data. For each high priority goal, a working group was created to develop specific recommendations. The Data Warehousing subgroup has recommended the implementation of an architecture management process and an improved ability for users to get access to aggregate data. The Structured Documentation subgroup is establishing recommendations for a documentation template creation process. The strategic planning process at times is slow, but assures that the organization is focusing on the information technology activities most likely to lead to improved quality. PMID:17238381

  9. Visit, revamp, and revitalize your business plan: Part 2.

    PubMed

    Waldron, David

    2011-01-01

    The diagnostic imaging department strives for the highest quality outcomes in imaging quality, in diagnostic reporting, and in providing a caring patient experience while also satisfying the needs of referring physicians. Understand how tools such as process mapping and concepts such as Six Sigma and Lean Six Sigma can be used to facilitate quality improvements and team building, resulting in staff-led process improvement initiatives. Discover how to integrate a continuous staff management cycle to implement process improvements, capture the promised performance improvements, and achieve a culture change away from the "way it has always been done".

  10. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
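
    A minimal sketch of the individuals/moving-range (XmR) chart the authors recommend is given below; the daily door-to-provider times are hypothetical, while 2.66 and 3.267 are the standard XmR chart constants.

      import numpy as np

      def xmr_limits(values):
          """Center lines and control limits for an individuals/moving-range chart."""
          values = np.asarray(values, dtype=float)
          moving_range = np.abs(np.diff(values))
          mr_bar = moving_range.mean()
          center = values.mean()
          return {
              "individuals": (center, center - 2.66 * mr_bar, center + 2.66 * mr_bar),
              "moving_range": (mr_bar, 0.0, 3.267 * mr_bar),
          }

      # Hypothetical daily median door-to-provider times in minutes.
      daily_median_minutes = [42, 38, 45, 40, 39, 44, 41, 58, 43, 40]
      print(xmr_limits(daily_median_minutes))
      # Points falling outside the individuals limits signal special-cause variation
      # (an unstable process requiring action); points inside reflect common-cause noise.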

  11. Research on material removal accuracy analysis and correction of removal function during ion beam figuring

    NASA Astrophysics Data System (ADS)

    Wu, Weibin; Dai, Yifan; Zhou, Lin; Xu, Mingjin

    2016-09-01

    Material removal accuracy has a direct impact on the machining precision and efficiency of ion beam figuring. By analyzing the factors suppressing the improvement of material removal accuracy, we conclude that correcting the removal function deviation and reducing the amount of material removed during each iterative process could help to improve material removal accuracy. The removal function correction principle can effectively compensate for the removal function deviation between the actual figuring and simulated processes, while experiments indicate that material removal accuracy decreases with a long machining time, so removing only a small amount of material in each iterative process is suggested. However, more clamping and measuring steps will be introduced in this way, which will also generate machining errors and suppress the improvement of material removal accuracy. On this account, a free-measurement iterative process method is put forward to improve material removal accuracy and figuring efficiency by using fewer measuring and clamping steps. Finally, an experiment on a φ 100-mm Zerodur planar is performed, which shows that, in similar figuring time, three free-measurement iterative processes could improve the material removal accuracy and the surface error convergence rate by 62.5% and 17.6%, respectively, compared with a single iterative process.
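
    The basic figuring relation behind such experiments, material removed equals the removal function convolved with the dwell-time map, can be sketched in one dimension as follows; the Gaussian beam, surface error, and naive dwell-time choice are illustrative assumptions, not the paper's algorithm.

      import numpy as np

      def gaussian_removal_function(x, peak_rate=1.0, sigma=2.0):
          """Beam removal-rate profile (depth per unit dwell time), assumed Gaussian."""
          return peak_rate * np.exp(-x**2 / (2.0 * sigma**2))

      def rms(e):
          return float(np.sqrt(np.mean(e**2)))

      x = np.arange(-10, 11)                                            # mm grid
      beam = gaussian_removal_function(x)
      surface_error = 5.0 * np.exp(-(np.arange(41) - 20.0)**2 / 50.0)   # nm, one bump

      # Naive dwell-time choice: dwell proportional to local error. Real ion beam
      # figuring solves a constrained deconvolution problem; this is only a sketch.
      dwell = surface_error / beam.sum()
      removed = np.convolve(dwell, beam, mode="same")
      residual = surface_error - removed

      convergence = 1.0 - rms(residual) / rms(surface_error)
      print(f"RMS error {rms(surface_error):.2f} -> {rms(residual):.2f} nm "
            f"(convergence rate {convergence:.1%})")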

  12. Flow chemistry using milli- and microstructured reactors-from conventional to novel process windows.

    PubMed

    Illg, Tobias; Löb, Patrick; Hessel, Volker

    2010-06-01

    The term Novel Process Windows unites different methods for improving existing processes by applying unconventional and harsh process conditions, such as process routes at much elevated pressure or temperature, or processing in a thermal runaway regime, to achieve a significant impact on process performance. This paper reviews parts of IMM's work, in particular the applicability of the above-mentioned Novel Process Windows to selected chemical reactions. First, general characteristics of microreactors are discussed, such as excellent mass and heat transfer and improved mixing quality. Different types of reactions are presented in which the use of microstructured devices led to increased process performance by applying Novel Process Windows. These examples were chosen to demonstrate how chemical reactions can benefit from the use of milli- and microstructured devices and how existing protocols can be changed toward process conditions hitherto not applicable in standard laboratory equipment. Milli- and microstructured reactors can also offer advantages in other areas, for example, high-throughput screening of catalysts and better control of size distribution in particle synthesis processes through improved mixing. The chemical industry is under continuous improvement, so a lot of research is being done to synthesize high-value chemicals, to optimize existing processes in view of process safety and energy consumption, and to search for new routes to produce such chemicals. Leitmotifs of such undertakings are often sustainable development(1) and Green Chemistry(2).

  13. Improving Video Game Development: Facilitating Heterogeneous Team Collaboration through Flexible Software Processes

    NASA Astrophysics Data System (ADS)

    Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan

    Based on our observations of Austrian video game software development (VGSD) practices, we identified a lack of systematic process/method support and inefficient collaboration between the various involved disciplines, e.g., engineers and artists. VGSD includes heterogeneous disciplines, e.g. creative arts, game/content design, and software. Nevertheless, improving team collaboration and process support is an ongoing challenge to enable a comprehensive view of game development projects. Lessons learned from software engineering practices can help game developers to improve game development processes within a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper (a) presents first results with a focus on process/method support and (b) suggests a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results showed (a) a trend toward highly flexible software processes involving various disciplines and (b) identified the suggested flexible process approach as feasible and useful for project application.

  14. [Establishment of industry promotion technology system in Chinese medicine secondary exploitation based on "component structure theory"].

    PubMed

    Cheng, Xu-Dong; Feng, Liang; Zhang, Ming-Hua; Gu, Jun-Fei; Jia, Xiao-Bin

    2014-10-01

    The purpose of the secondary exploitation of Chinese medicine is to improve the quality of Chinese medicine products, enhance their core competitiveness, allow better use in clinical practice, and more effectively relieve patient suffering. Herbal materials, extraction, separation, refining, preparation, and quality control are all involved in the industrial promotion of the secondary exploitation of Chinese medicine. Quality improvement and industry promotion of Chinese medicine could be realized through whole-process optimization, quality control, and overall process improvement. Based on the "component structure theory", the "multi-dimensional structure & process dynamic quality control system", and the systematic and holistic character of Chinese medicine, impacts on the whole process were discussed. A technology system for industry promotion of Chinese medicine was built to provide a theoretical basis for improving the quality and efficacy of secondarily developed traditional Chinese medicine products.

  15. Leveraging electronic health record documentation for Failure Mode and Effects Analysis team identification

    PubMed Central

    Carson, Matthew B; Lee, Young Ji; Benacka, Corrine; Mutharasan, R. Kannan; Ahmad, Faraz S; Kansal, Preeti; Yancy, Clyde W; Anderson, Allen S; Soulakis, Nicholas D

    2017-01-01

    Objective: Using Failure Mode and Effects Analysis (FMEA) as an example quality improvement approach, our objective was to evaluate whether secondary use of orders, forms, and notes recorded by the electronic health record (EHR) during daily practice can enhance the accuracy of process maps used to guide improvement. We examined discrepancies between expected and observed activities and individuals involved in a high-risk process and devised diagnostic measures for understanding discrepancies that may be used to inform quality improvement planning. Methods: Inpatient cardiology unit staff developed a process map of discharge from the unit. We matched activities and providers identified on the process map to EHR data. Using four diagnostic measures, we analyzed discrepancies between expectation and observation. Results: EHR data showed that 35% of activities were completed by unexpected providers, including providers from 12 categories not identified as part of the discharge workflow. The EHR also revealed sub-components of process activities not identified on the process map. Additional information from the EHR was used to revise the process map and show differences between expectation and observation. Conclusion: Findings suggest EHR data may reveal gaps in process maps used for quality improvement and identify characteristics about workflow activities that can identify perspectives for inclusion in an FMEA. Organizations with access to EHR data may be able to leverage clinical documentation to enhance process maps used for quality improvement. While focused on FMEA protocols, findings from this study may be applicable to other quality activities that require process maps. PMID:27589944
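
    One of the diagnostic measures described, the share of documented activities completed by provider categories the process map did not anticipate, can be sketched as follows with hypothetical data.

      # Provider categories the process map expects for each discharge activity.
      expected_providers = {
          "discharge_order": {"attending", "fellow"},
          "med_reconciliation": {"pharmacist"},
          "discharge_education": {"nurse"},
      }

      # (activity, provider_category) pairs as they might be extracted from EHR
      # orders, forms, and notes during the discharge window.
      observed = [
          ("discharge_order", "attending"),
          ("discharge_order", "nurse_practitioner"),
          ("med_reconciliation", "pharmacist"),
          ("discharge_education", "nurse"),
          ("discharge_education", "case_manager"),
      ]

      unexpected = [(act, prov) for act, prov in observed
                    if prov not in expected_providers.get(act, set())]
      rate = len(unexpected) / len(observed)
      print(f"{rate:.0%} of activities done by unexpected providers: {unexpected}")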

  16. Improving the Quality of Service and Security of Military Networks with a Network Tasking Order Process

    DTIC Science & Technology

    2010-09-01

    IMPROVING THE QUALITY OF SERVICE AND SECURITY OF MILITARY NETWORKS WITH A NETWORK TASKING ORDER PROCESS. AFIT/DCS/ENG/10-09, USAF, September 2010. Approved for public release; distribution unlimited.

  17. Development and Processing Improvement of Aerospace Aluminum Alloys

    NASA Technical Reports Server (NTRS)

    Lisagor, W. Barry; Bales, Thomas T.

    2007-01-01

    This final report, in multiple presentation format, describes a comprehensive multi-tasked contract study to improve the overall property response of selected aerospace alloys, explore further a newly-developed and registered alloy, and correlate the processing, metallurgical structure, and subsequent properties achieved with particular emphasis on the crystallographic orientation texture developed. Modifications to plate processing, specifically hot rolling practices, were evaluated for Al-Li alloys 2195 and 2297, for the recently registered Al-Cu-Ag alloy, 2139, and for the Al-Zn-Mg-Cu alloy, 7050. For all of the alloys evaluated, the processing modifications resulted in significant improvements in mechanical properties. Analyses also resulted in an enhanced understanding of the correlation of processing, crystallographic texture, and mechanical properties.

  18. Improvement of radiology services based on the process management approach.

    PubMed

    Amaral, Creusa Sayuri Tahara; Rozenfeld, Henrique; Costa, Janaina Mascarenhas Hornos; Magon, Maria de Fátima de Andrade; Mascarenhas, Yvone Maria

    2011-06-01

    The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al. (2008) [4]). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. Lean Thinking in Libraries: A Case Study on Improving Shelving Turnaround

    ERIC Educational Resources Information Center

    Kress, Nancy J.

    2007-01-01

    The University of Chicago's Joseph Regenstein Library Bookstacks Department has used process mapping and continuous improvement to successfully improve its overall operations. The most recent efforts focus on Lean manufacturing, an initiative centered on eliminating waste in manufacturing processes. The conversion of the Bookstacks Department from…

  20. Creating a Cycle of Continuous Improvement through Instructional Rounds

    ERIC Educational Resources Information Center

    Meyer-Looze, Catherine L.

    2015-01-01

    Instructional Rounds is a continuous improvement strategy that focuses on the technical core of educational systems as well as educators collaborating side-by-side. Concentrating on collective learning, this process only makes sense within an overall strategy of improvement. This case study examined the Instructional Rounds process in a northern…

  1. How Does Knowledge Promote Memory? The Distinctiveness Theory of Skilled Memory

    ERIC Educational Resources Information Center

    Rawson, Katherine A.; Van Overschelde, James P.

    2008-01-01

    The robust effects of knowledge on memory for domain-relevant information reported in previous research have largely been attributed to improved organizational processing. The present research proposes the distinctiveness theory of skilled memory, which states that knowledge improves memory not only through improved organizational processing but…

  2. Technological Improvements for Digital Fire Control Systems

    DTIC Science & Technology

    2017-09-30

    Final Technical Status Report for DOTC-12-01-INIT061, Technological Improvements for Digital Fire Control Systems. Reporting period: 30 Sep... Initiative information: develop and fabricate next-generation designs using advanced materials and processes. This will include but is not limited to... 4.2 Develop manufacturing processes (100%); 4.3 Develop manufacturing processes (100%); 4.4 Develop manufacturing processes (100%); 5 Design Tooling

  3. Using the Results of Teaching Evaluations to Improve Teaching: A Case Study of a New Systematic Process

    ERIC Educational Resources Information Center

    Malouff, John M.; Reid, Jackie; Wilkes, Janelle; Emmerton, Ashley J.

    2015-01-01

    This article describes a new 14-step process for using student evaluations of teaching to improve teaching. The new process includes examination of student evaluations in the context of instructor goals, student evaluations of the same course completed in prior terms, and evaluations of similar courses taught by other instructors. The process has…

  4. Application Process Improvement Yields Results.

    ERIC Educational Resources Information Center

    Holesovsky, Jan Paul

    1995-01-01

    After a continuing effort to improve its grant application process, the department of medical microbiology and immunology at the University of Wisconsin-Madison is submitting many more applications and realizing increased funding. The methods and strategy used to make the process more efficient and effective are outlined. (Author/MSE)

  5. Electronic Timekeeping: North Dakota State University Improves Payroll Processing.

    ERIC Educational Resources Information Center

    Vetter, Ronald J.; And Others

    1993-01-01

    North Dakota State University has adopted automated timekeeping to improve the efficiency and effectiveness of payroll processing. The microcomputer-based system accurately records and computes employee time, tracks labor distribution, accommodates complex labor policies and company pay practices, provides automatic data processing and reporting,…

  6. WISE: Automated support for software project management and measurement. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Sudhakar

    1995-01-01

    One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web-based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.

  7. An Investigation of Sintering Parameters on Titanium Powder for Electron Beam Melting Processing Optimization.

    PubMed

    Drescher, Philipp; Sarhan, Mohamed; Seitz, Hermann

    2016-12-01

    Selective electron beam melting (SEBM) is a relatively new additive manufacturing technology for metallic materials. Specific to this technology is the sintering of the metal powder prior to the melting process. The sintering process has disadvantages for post-processing. The post-processing of parts produced by SEBM typically involves the removal of semi-sintered powder through the use of a powder blasting system. Furthermore, the sintering of large areas before melting decreases productivity. Current investigations are aimed at improving the sintering process in order to achieve better productivity, geometric accuracy, and resolution. In this study, the focus lies on the modification of the sintering process. In order to investigate and improve the sintering process, highly porous titanium test specimens were built with various scan speeds. The aim of this study was to decrease build time while maintaining comparable mechanical properties of the components and to remove the residual powder more easily after a build. By only sintering the area in which the melt pool for the components is created, an average productivity improvement of approx. 20% was achieved. Tensile tests were carried out, and the measured mechanical properties show comparable or slightly improved values compared with the reference.

  8. The impact of a lean rounding process in a pediatric intensive care unit.

    PubMed

    Vats, Atul; Goin, Kristin H; Villarreal, Monica C; Yilmaz, Tuba; Fortenberry, James D; Keskinocak, Pinar

    2012-02-01

    Poor workflow associated with physician rounding can produce inefficiencies that decrease time for essential activities, delay clinical decisions, and reduce staff and patient satisfaction. Workflow and provider resources were not optimized when a pediatric intensive care unit increased by 22,000 square feet (to 33,000) and by nine beds (to 30). Lean methods (focusing on essential processes) and scenario analysis were used to develop and implement a patient-centric standardized rounding process, which we hypothesize would lead to improved rounding efficiency, decrease required physician resources, improve satisfaction, and enhance throughput. Human factors techniques and statistical tools were used to collect and analyze observational data for 11 rounding events before and 12 rounding events after process redesign. Actions included: 1) recording rounding events, times, and patient interactions and classifying them as essential, nonessential, or nonvalue added; 2) comparing rounding duration and time per patient to determine the impact on efficiency; 3) analyzing discharge orders for timeliness; 4) conducting staff surveys to assess improvements in communication and care coordination; and 5) analyzing customer satisfaction data to evaluate impact on patient experience. Thirty-bed pediatric intensive care unit in a children's hospital with academic affiliation. Eight attending pediatric intensivists and their physician rounding teams. Eight attending physician-led teams were observed for 11 rounding events before and 12 rounding events after implementation of a standardized lean rounding process focusing on essential processes. Total rounding time decreased significantly (157 ± 35 mins before vs. 121 ± 20 mins after), through a reduction in time spent on nonessential (53 ± 30 vs. 9 ± 6 mins) activities. The previous process required three attending physicians for an average of 157 mins (7.55 attending physician man-hours), while the new process required two attending physicians for an average of 121 mins (4.03 attending physician man-hours). Cumulative distribution of completed patient rounds by hour of day showed an improvement from 40% to 80% of patients rounded by 9:30 AM. Discharge data showed pediatric intensive care unit patients were discharged an average of 58.05 mins sooner (p < .05). Staff surveys showed a significant increase in satisfaction with the new process (including increased efficiency, improved physician identification, and clearer understanding of process). Customer satisfaction scores showed improvement after implementing the new process. Implementation of a lean-focused, patient-centric rounding structure stressing essential processes was associated with increased timeliness and efficiency of rounds, improved staff and customer satisfaction, improved throughput, and reduced attending physician man-hours.

  9. Improved image processing of road pavement defect by infrared thermography

    NASA Astrophysics Data System (ADS)

    Sim, Jun-Gi

    2018-03-01

    This paper aims to achieve improved image processing for the clear identification of defects in damaged road pavement structures using infrared thermography non-destructive testing (NDT). To that end, four types of pavement specimens containing internal defects were fabricated, and results were obtained by heating the specimens with natural light. The results showed that defects located down to a depth of 3 cm could be detected by infrared thermography NDT using the improved image processing method.
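
    A simple version of such defect-highlighting image processing can be sketched as a robust threshold on temperature deviation; the synthetic frame and threshold below are illustrative assumptions, not the paper's method.

      import numpy as np

      rng = np.random.default_rng(1)
      frame = 30.0 + 0.3 * rng.standard_normal((64, 64))   # background around 30 degC
      frame[20:28, 35:45] += 1.5                           # warmer patch over a void

      def defect_mask(thermal_frame, k=3.0):
          """Robust z-score threshold using the median and MAD of the frame."""
          med = np.median(thermal_frame)
          mad = np.median(np.abs(thermal_frame - med)) + 1e-9
          z = 0.6745 * (thermal_frame - med) / mad
          return z > k

      mask = defect_mask(frame)
      print("flagged pixels:", int(mask.sum()))            # roughly the 8 x 10 warm patch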

  10. Collaborative Project: Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.

  11. Final Report Collaborative Project: Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.

  12. Life cycle assessment as a tool for the environmental improvement of the tannery industry in developing countries.

    PubMed

    Rivela, B; Moreira, M T; Bornhardt, C; Méndez, R; Feijoo, G

    2004-03-15

    A representative leather tannery industry in a Latin American developing country has been studied from an environmental point of view, including both technical and economic analysis. Life Cycle Analysis (LCA) methodology has been used for the quantification and evaluation of the impacts of the chromium tanning process as a basis to propose further improvement actions. Four main subsystems were considered: beamhouse, tanyard, retanning, and wood furnace. Damages to human health, ecosystem quality, and resources are mainly produced by the tanyard subsystem. The control and reduction of chromium and ammonia emissions are the critical points to be considered to improve the environmental performance of the process. Technologies available for improved management of chromium tanning were studied in depth, and improvement actions related to optimized operational conditions and a high-exhaustion chrome-tanning process were selected. These actions related to the implementation of internal procedures affected the economy of the process with savings ranging from US dollars 8.63 to US dollars 22.5 for the processing of 1 ton of wet salt hides, while the global environmental impact was reduced to 44-50%. Moreover, the treatment of wastewaters was considered in two scenarios. Primary treatment presented the largest reduction of the environmental impact of the tanning process, while no significant improvement for the evaluated impact categories was achieved when combining primary and secondary treatments.

  13. Quality Improvement on the Acute Inpatient Psychiatry Unit Using the Model for Improvement

    PubMed Central

    Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean

    2013-01-01

    Background A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. Methods We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients—those starting or continuing on standing neuroleptics—with the Abnormal Involuntary Movement Scale (AIMS). Results After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Conclusion Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team. PMID:24052768

  14. Quality improvement on the acute inpatient psychiatry unit using the model for improvement.

    PubMed

    Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean

    2013-01-01

    A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients-those starting or continuing on standing neuroleptics-with the Abnormal Involuntary Movement Scale (AIMS). After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team.

  15. Recent Improvements in the FDNS CFD Code and its Associated Process

    NASA Technical Reports Server (NTRS)

    West, Jeff S.; Dorney, Suzanne M.; Turner, Jim (Technical Monitor)

    2002-01-01

    This viewgraph presentation gives an overview of recent improvements in the Finite Difference Navier Stokes (FDNS) computational fluid dynamics (CFD) code and its associated process. The development of a utility, PreViewer, has essentially eliminated the creep of simple human error into the FDNS solution process. Extension of PreViewer to encapsulate the Domain Decomposition process has made practical the routine use of parallel processing. The combination of CVS source control and ATS consistency validation significantly increases the efficiency of the CFD process.

  16. Operations research methods improve chemotherapy patient appointment scheduling.

    PubMed

    Santibáñez, Pablo; Aristizabal, Ruben; Puterman, Martin L; Chow, Vincent S; Huang, Wenhai; Kollmannsberger, Christian; Nordin, Travis; Runzer, Nancy; Tyldesley, Scott

    2012-12-01

    Clinical complexity, scheduling restrictions, and outdated manual booking processes resulted in frequent clerical rework, long waitlists for treatment, and late appointment notification for patients at a chemotherapy clinic in a large cancer center in British Columbia, Canada. A 17-month study was conducted to address booking, scheduling and workload issues and to develop, implement, and evaluate solutions. A review of scheduling practices included process observation and mapping, analysis of historical appointment data, creation of a new performance metric (final appointment notification lead time), and a baseline patient satisfaction survey. Process improvement involved discrete event simulation to evaluate alternative booking practice scenarios, development of an optimization-based scheduling tool to improve scheduling efficiency, and change management for implementation of process changes. Results were evaluated through analysis of appointment data, a follow-up patient survey, and staff surveys. Process review revealed a two-stage scheduling process. Long waitlists and late notification resulted from an inflexible first-stage process. The second-stage process was time consuming and tedious. After a revised, more flexible first-stage process and an automated second-stage process were implemented, the median percentage of appointments exceeding the final appointment notification lead time target of one week was reduced by 57% and median waitlist size decreased by 83%. Patient surveys confirmed increased satisfaction while staff feedback reported reduced stress levels. Significant operational improvements can be achieved through process redesign combined with operations research methods.

  17. Got (the Right) Milk? How a Blended Quality Improvement Approach Catalyzed Change.

    PubMed

    Luton, Alexandra; Bondurant, Patricia G; Campbell, Amy; Conkin, Claudia; Hernandez, Jae; Hurst, Nancy

    2015-10-01

    The expression, storage, preparation, fortification, and feeding of breast milk are common ongoing activities in many neonatal intensive care units (NICUs) today. Errors in breast milk administration are a serious issue that should be prevented to preserve the health and well-being of NICU babies and their families. This paper describes how a program to improve processes surrounding infant feeding was developed, implemented, and evaluated. The project team used a blended quality improvement approach that included the Model for Improvement, Lean and Six Sigma methodologies, and principles of High Reliability Organizations to identify and drive short-term, medium-term, and long-term improvement strategies. Through its blended quality improvement approach, the team strengthened the entire dispensation system for both human milk and formula and outlined a clear vision and plan for further improvements as well. The NICU reduced feeding errors by 83%. Be systematic in the quality improvement approach, and apply proven methods to improving processes surrounding infant feeding. Involve expert project managers with nonclinical perspective to guide work in a systematic way and provide unbiased feedback. Create multidisciplinary, cross-departmental teams that include a vast array of stakeholders in NICU feeding processes to ensure comprehensive examination of current state, identification of potential risks, and "outside the box" potential solutions. As in the realm of pharmacy, the processes involved in preparing feedings for critically ill infants should be carried out via predictable, reliable means including robust automated verification that integrates seamlessly into existing processes. The use of systems employed in pharmacy for medication preparation should be considered in the human milk and formula preparation setting.

  18. Improvement in interfacial characteristics of low-voltage carbon nanotube thin-film transistors with solution-processed boron nitride thin films

    NASA Astrophysics Data System (ADS)

    Jeon, Jun-Young; Ha, Tae-Jun

    2017-08-01

    In this article, we demonstrate the potential of solution-processed boron nitride (BN) thin films for high performance single-walled carbon nanotube thin-film transistors (SWCNT-TFTs) with low-voltage operation. The use of BN thin films between solution-processed high-k dielectric layers improved the interfacial characteristics of metal-insulator-metal devices, thereby reducing the current density by three orders of magnitude. We also investigated the origin of improved device performance in SWCNT-TFTs by employing solution-processed BN thin films as an encapsulation layer. The BN encapsulation layer improves the electrical characteristics of SWCNT-TFTs, which includes the device key metrics of linear field-effect mobility, sub-threshold swing, and threshold voltage as well as the long-term stability against the aging effect in air. Such improvements can be achieved by reduced interaction of interfacial localized states with charge carriers. We believe that this work can open up a promising route to demonstrate the potential of solution-processed BN thin films on nanoelectronics.

  19. A Journey to Improved Inpatient Glycemic Control by Redesigning Meal Delivery and Insulin Administration.

    PubMed

    Engle, Martha; Ferguson, Allison; Fields, Willa

    2016-01-01

    The purpose of this quality improvement project was to redesign a hospital meal delivery process in order to shorten the time between blood glucose monitoring and corresponding insulin administration and improve glycemic control. This process change redesigned the workflow of the dietary and nursing departments. Modifications included nursing, rather than dietary, delivering meal trays to patients receiving insulin. Dietary marked the appropriate meal trays and phoned each unit prior to arrival on the unit. The process change was trialed on 2 acute care units prior to hospital-wide implementation. Elapsed time between blood glucose monitoring and insulin administration was analyzed before and after the process change, along with glucometrics: percentage of patients with blood glucose between 70 and 180 mg/dL (percent perfect), blood glucose greater than 300 mg/dL (extreme hyperglycemia), and blood glucose less than 70 mg/dL (hypoglycemia). Percent perfect glucose results improved from 45% to 53%, and extreme hyperglycemia (blood glucose >300 mg/dL) fell from 11.7% to 5%. Hypoglycemia showed a downward trend, demonstrating that hypoglycemia rates did not increase as glycemic control improved. The percentage of patients receiving meal insulin within 30 minutes of the blood glucose check increased from 35% to 73%. In the hospital, numerous obstacles were present that interfered with on-time meal insulin delivery. Establishing a meal delivery process with the nurse performing the premeal blood glucose check, delivering the meal, and administering the insulin improves overall blood glucose control. Nurse-led process improvement of blood glucose monitoring, meal tray delivery, and insulin administration does lead to improved glycemic control for the inpatient population.

  20. Improving student-perceived benefit of academic advising within education of occupational and physical therapy in the United States: a quality improvement initiative.

    PubMed

    Barnes, Lisa J; Parish, Robin

    2017-01-01

    Academic advising is a key role for faculty in the educational process of health professionals; however, the best practice of effective academic advising for occupational and physical therapy students has not been identified in the current literature. The purpose of this quality improvement initiative was to assess and improve the faculty/student advisor/advisee process within occupational and physical therapy programs within a school of allied health professions in the United States in 2015. A quality improvement initiative utilizing quantitative and qualitative information was gathered via survey focused on the assessment and improvement of an advisor/advisee process. The overall initiative utilized an adaptive iterative design incorporating the plan-do-study-act model which included a three-step process over a one year time frame utilizing 2 cohorts, the first with 80 students and the second with 88 students. Baseline data were gathered prior to initiating the new process. A pilot was conducted and assessed during the first semester of the occupational and physical therapy programs. Final information was gathered after one full academic year with final comparisons made to baseline. Defining an effective advisory program with an established framework led to improved awareness and participation by students and faculty. Early initiation of the process combined with increased frequency of interaction led to improved student satisfaction. Based on student perceptions, programmatic policies were initiated to promote advisory meetings early and often to establish a positive relationship. The policies focus on academic advising as one of proactivity in which the advisor serves as a portal which the student may access leading to a more successful academic experience.

  1. Autoverification process improvement by Six Sigma approach: Clinical chemistry & immunoassay.

    PubMed

    Randell, Edward W; Short, Garry; Lee, Natasha; Beresford, Allison; Spencer, Margaret; Kennell, Marina; Moores, Zoë; Parry, David

    2018-05-01

    This study examines the effectiveness of a project to enhance an autoverification (AV) system through application of Six Sigma (DMAIC) process improvement strategies. Similar AV systems set up at three sites underwent examination and modification to produce improved systems while monitoring the proportions of samples autoverified, the time required for manual review and verification, sample processing time, and the characteristics of tests not autoverified. This information was used to identify areas for improvement and monitor the impact of changes. Use of reference range-based criteria had the greatest impact on the proportion of tests autoverified. To improve the AV process, reference range-based criteria were replaced with extreme value limits based on a 99.5% test result interval, delta check criteria were broadened, and new specimen consistency rules were implemented. Decision guidance tools were also developed to assist staff using the AV system. The mean proportion of tests and samples autoverified improved from <62% for samples and <80% for tests, to >90% for samples and >95% for tests across all three sites. The new AV system significantly decreased turn-around time and total sample review time (to about a third); however, time spent on manual review of held samples almost tripled. There was no evidence of compromise to the quality of the testing process, and <1% of samples held for exceeding delta check or extreme limits required corrective action. The Six Sigma (DMAIC) process improvement methodology was successfully applied to AV systems, resulting in an increase in overall test and sample AV to >90%, improved turn-around time, reduced time for manual verification, and with no obvious compromise to quality or error detection. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
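
    The rule structure described above (extreme value limits plus delta checks) can be illustrated with a minimal sketch; the analytes, limits, and delta thresholds below are hypothetical placeholders, not the criteria used in the study.

    ```python
    from typing import Optional

    # Minimal autoverification sketch: a result is released automatically only if it
    # passes an extreme-value limit check and a delta check against the prior result.
    # All limits below are hypothetical placeholders, not the study's criteria.
    EXTREME_LIMITS = {                 # analyte -> (low, high), e.g. a 99.5% result interval
        "sodium_mmol_L": (115.0, 160.0),
        "potassium_mmol_L": (2.5, 6.5),
    }
    DELTA_LIMITS = {                   # analyte -> maximum allowed change from the prior result
        "sodium_mmol_L": 10.0,
        "potassium_mmol_L": 1.5,
    }

    def autoverify(analyte: str, value: float, prior: Optional[float]) -> bool:
        """Return True if the result can be released without manual review."""
        low, high = EXTREME_LIMITS[analyte]
        if not low <= value <= high:
            return False               # hold: outside extreme value limits
        if prior is not None and abs(value - prior) > DELTA_LIMITS[analyte]:
            return False               # hold: delta check failed
        return True

    print(autoverify("potassium_mmol_L", 5.1, prior=4.7))   # True -> released automatically
    print(autoverify("potassium_mmol_L", 6.9, prior=4.7))   # False -> held for manual review
    ```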

  2. Focused process improvement events: sustainability of impact on process and performance in an academic radiology department.

    PubMed

    Rosenkrantz, Andrew B; Lawson, Kirk; Ally, Rosina; Chen, David; Donno, Frank; Rittberg, Steven; Rodriguez, Joan; Recht, Michael P

    2015-01-01

    To evaluate sustainability of impact of rapid, focused process improvement (PI) events on process and performance within an academic radiology department. Our department conducted PI during 2011 and 2012 in CT, MRI, ultrasound, breast imaging, and research billing. PI entailed participation by all stakeholders, facilitation by the department chair, collection of baseline data, meetings during several weeks, definition of performance metrics, creation of an improvement plan, and prompt implementation. We explore common themes among PI events regarding initial impact and durability of changes. We also assess performance in each area pre-PI, immediately post-PI, and at the time of the current study. All PI events achieved an immediate improvement in performance metrics, often entailing both examination volumes and on-time performance. IT-based solutions, process standardization, and redefinition of staff responsibilities were often central in these changes, and participants consistently expressed improved internal leadership and problem-solving ability. Major environmental changes commonly occurred after PI, including a natural disaster with equipment loss, a change in location or services offered, and a new enterprise-wide electronic medical record system incorporating new billing and radiology informatics systems, requiring flexibility in the PI implementation plan. Only one PI team conducted regular post-PI follow-up meetings. Sustained improvement was frequently, but not universally, observed: in the long term following initial PI, measures of examination volume showed continued progressive improvements, whereas measures of operational efficiency remained stable or occasionally declined. Focused PI is generally effective in achieving performance improvement, although a changing environment influences the sustainability of impact. Thus, continued process evaluation and ongoing workflow modifications are warranted. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  3. Process development of human multipotent stromal cell microcarrier culture using an automated high-throughput microbioreactor.

    PubMed

    Rafiq, Qasim A; Hanga, Mariana P; Heathman, Thomas R J; Coopman, Karen; Nienow, Alvin W; Williams, David J; Hewitt, Christopher J

    2017-10-01

    Microbioreactors play a critical role in process development as they reduce reagent requirements and can facilitate high-throughput screening of process parameters and culture conditions. Here, we have demonstrated and explained in detail, for the first time, the amenability of the automated ambr15 cell culture microbioreactor system for the development of scalable adherent human mesenchymal multipotent stromal/stem cell (hMSC) microcarrier culture processes. This was achieved by first improving suspension and mixing of the microcarriers and then improving cell attachment thereby reducing the initial growth lag phase. The latter was achieved by using only 50% of the final working volume of medium for the first 24 h and using an intermittent agitation strategy. These changes resulted in >150% increase in viable cell density after 24 h compared to the original process (no agitation for 24 h and 100% working volume). Using the same methodology as in the ambr15, similar improvements were obtained with larger scale spinner flask studies. Finally, this improved bioprocess methodology based on a serum-based medium was applied to a serum-free process in the ambr15, resulting in >250% increase in yield compared to the serum-based process. At both scales, the agitation used during culture was the minimum required for microcarrier suspension, NJS. The use of the ambr15, with its improved control compared to the spinner flask, reduced the coefficient of variation on viable cell density in the serum containing medium from 7.65% to 4.08%, and the switch to serum free further reduced these to 1.06-0.54%, respectively. The combination of both serum-free and automated processing improved the reproducibility more than 10-fold compared to the serum-based, manual spinner flask process. The findings of this study demonstrate that the ambr15 microbioreactor is an effective tool for bioprocess development of hMSC microcarrier cultures and that a combination of serum-free medium, control, and automation improves both process yield and consistency. Biotechnol. Bioeng. 2017;114: 2253-2266. © 2017 Wiley Periodicals, Inc.
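
    The consistency metric quoted above, the coefficient of variation (CV), is simply the standard deviation of replicate viable cell densities divided by their mean; a minimal sketch with made-up replicate values (not data from the study):

    ```python
    import statistics

    # Hypothetical viable cell densities (cells/mL) from replicate microcarrier runs.
    replicates = [2.1e5, 2.3e5, 2.2e5, 2.0e5]

    mean = statistics.mean(replicates)
    sd = statistics.stdev(replicates)        # sample standard deviation
    cv_percent = 100 * sd / mean
    print(f"CV = {cv_percent:.2f}%")         # a lower CV indicates a more reproducible process
    ```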

  4. Process development of human multipotent stromal cell microcarrier culture using an automated high‐throughput microbioreactor

    PubMed Central

    Hanga, Mariana P.; Heathman, Thomas R. J.; Coopman, Karen; Nienow, Alvin W.; Williams, David J.; Hewitt, Christopher J.

    2017-01-01

    ABSTRACT Microbioreactors play a critical role in process development as they reduce reagent requirements and can facilitate high‐throughput screening of process parameters and culture conditions. Here, we have demonstrated and explained in detail, for the first time, the amenability of the automated ambr15 cell culture microbioreactor system for the development of scalable adherent human mesenchymal multipotent stromal/stem cell (hMSC) microcarrier culture processes. This was achieved by first improving suspension and mixing of the microcarriers and then improving cell attachment thereby reducing the initial growth lag phase. The latter was achieved by using only 50% of the final working volume of medium for the first 24 h and using an intermittent agitation strategy. These changes resulted in >150% increase in viable cell density after 24 h compared to the original process (no agitation for 24 h and 100% working volume). Using the same methodology as in the ambr15, similar improvements were obtained with larger scale spinner flask studies. Finally, this improved bioprocess methodology based on a serum‐based medium was applied to a serum‐free process in the ambr15, resulting in >250% increase in yield compared to the serum‐based process. At both scales, the agitation used during culture was the minimum required for microcarrier suspension, NJS. The use of the ambr15, with its improved control compared to the spinner flask, reduced the coefficient of variation on viable cell density in the serum containing medium from 7.65% to 4.08%, and the switch to serum free further reduced these to 1.06–0.54%, respectively. The combination of both serum‐free and automated processing improved the reproducibility more than 10‐fold compared to the serum‐based, manual spinner flask process. The findings of this study demonstrate that the ambr15 microbioreactor is an effective tool for bioprocess development of hMSC microcarrier cultures and that a combination of serum‐free medium, control, and automation improves both process yield and consistency. Biotechnol. Bioeng. 2017;114: 2253–2266. © 2017 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc. PMID:28627713

  5. Quality control process improvement of flexible printed circuit board by FMEA

    NASA Astrophysics Data System (ADS)

    Krasaephol, Siwaporn; Chutima, Parames

    2018-02-01

    This research focuses on improving the quality control process for Flexible Printed Circuit Boards (FPCB), centred on model 7-Flex, by using the Failure Mode and Effect Analysis (FMEA) method to decrease the proportion of defective finished goods found at the final inspection process. Because a number of defective units are found only at the final inspection process, defects may still escape to customers. The problem stems from a quality control process that is not efficient enough to filter defective products in-process, because there is no In-Process Quality Control (IPQC) or sampling inspection in the process. Therefore, the quality control process has to be improved by setting inspection gates and IPQCs at critical processes in order to filter out defective products. The critical processes are analysed by the FMEA method. IPQC is used for detecting defective products and reducing the chance of defective finished goods escaping to customers. Reducing the proportion of defective finished goods also decreases scrap cost, because finished goods incur higher scrap cost than work in-process. Moreover, defective products found in-process can reveal abnormal processes, so engineers and operators can solve the problems in a timely manner. The improved quality control was implemented on 7-Flex production lines from July 2017 to September 2017. The results show decreases in the average proportion of defective finished goods and in the average Customer Manufacturers Lot Reject Rate (%LRR of CMs) of 4.5% and 4.1%, respectively. Furthermore, the cost saving from this quality control process equals 100K Baht.
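
    FMEA prioritizes failure modes by a risk priority number (RPN), the product of severity, occurrence, and detection ratings; the sketch below uses hypothetical process names and ratings to show how critical processes might be ranked before placing IPQC gates.

    ```python
    # Hypothetical FMEA worksheet rows: each failure mode is rated 1-10 for severity (S),
    # occurrence (O), and detection (D). RPN = S * O * D; the highest-RPN processes are
    # the candidates for inspection gates and in-process quality control (IPQC).
    failure_modes = [
        {"process": "lamination", "S": 8, "O": 5, "D": 6},
        {"process": "etching", "S": 7, "O": 4, "D": 3},
        {"process": "final soldering", "S": 9, "O": 3, "D": 7},
    ]

    for fm in failure_modes:
        fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

    for fm in sorted(failure_modes, key=lambda row: row["RPN"], reverse=True):
        print(f"{fm['process']:>15}: RPN = {fm['RPN']}")
    ```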

  6. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  7. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  8. Top-Down Computerized Cognitive Remediation in Schizophrenia: A Case Study of an Individual with Impairment in Verbal Fluency

    PubMed Central

    Masson, Marjolaine; Wykes, Til; Maziade, Michel; Reeder, Clare; Gariépy, Marie-Anne; Roy, Marc-André; Ivers, Hans; Cellard, Caroline

    2015-01-01

    The objective of this case study was to assess the specific effect of cognitive remediation for schizophrenia on the pattern of cognitive impairments. Case A is a 33-year-old man with a schizophrenia diagnosis and impairments in visual memory, inhibition, problem solving, and verbal fluency. He was provided with a therapist delivered cognitive remediation program involving practice and strategy which was designed to train attention, memory, executive functioning, visual-perceptual processing, and metacognitive skills. Neuropsychological and clinical assessments were administered at baseline and after three months of treatment. At posttest assessment, Case A had improved significantly on targeted (visual memory and problem solving) and nontargeted (verbal fluency) cognitive processes. The results of the current case study suggest that (1) it is possible to improve specific cognitive processes with targeted exercises, as seen by the improvement in visual memory due to training exercises targeting this cognitive domain; (2) cognitive remediation can produce improvements in cognitive processes not targeted during remediation since verbal fluency was improved while there was no training exercise on this specific cognitive process; and (3) including learning strategies in cognitive remediation increases the value of the approach and enhances participant improvement, possibly because strategies using verbalization can lead to improvement in verbal fluency even if it was not practiced. PMID:25949840

  9. [Applying healthcare failure mode and effect analysis to improve the surgical specimen transportation process and rejection rate].

    PubMed

    Hu, Pao-Hsueh; Hu, Hsiao-Chen; Huang, Hui-Ju; Chao, Hui-Lin; Lei, Ei-Fang

    2014-04-01

    Because surgical pathology specimens are crucial to the diagnosis and treatment of disease, it is critical that they be collected and transported safely and securely. Due to recent near-miss events in our department, we used healthcare failure mode and effect analysis to identify 14 potential perils in the specimen collection and transportation process. Improvement and prevention strategies were developed accordingly to improve quality of care. Using health care failure mode and effect analysis (HFMEA) may improve the surgical specimen transportation process and reduce the rate of surgical specimen rejection. Rectify standard operating procedures for surgical pathology specimen collection and transportation. Create educational videos and posters. Rectify methods of specimen verification. Organize and create an online and instantaneous management system for specimen tracking and specimen rejection. Implementation of the new surgical specimen transportation process effectively eliminated the 14 identified potential perils. In addition, the specimen rejection rate fell from 0.86% to 0.03%. This project was applied to improve the specimen transportation process, enhance interdisciplinary cooperation, and improve the patient-centered healthcare system. The creation and implementation of an online information system significantly facilitates specimen tracking, hospital cost reductions, and patient safety improvements. The success in our department is currently being replicated across all departments in our hospital that transport specimens. Our experience and strategy may be applied to inter-hospital specimen transportation in the future.

  10. A Study on Improving Information Processing Abilities Based on PBL

    ERIC Educational Resources Information Center

    Kim, Du Gyu; Lee, JaeMu

    2014-01-01

    This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…

  11. 78 FR 53436 - Improving Performance of Federal Permitting and Review of Infrastructure Projects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-29

    ... an efficient decision-making process within each agency; to the extent possible, unifying and... IIP Process, the developer is encouraged to inform DOE in writing as soon as possible of its decision... to improve the performance of Federal siting, permitting, and review processes for infrastructure...

  12. Using Data-Based Inquiry and Decision Making To Improve Instruction.

    ERIC Educational Resources Information Center

    Feldman, Jay; Tung, Rosann

    2001-01-01

    Discusses a study of six schools using data-based inquiry and decision-making process to improve instruction. Findings identified two conditions to support successful implementation of the process: administrative support, especially in providing teachers learning time, and teacher leadership to encourage and support colleagues to own the process.…

  13. Army Needs to Identify Government Purchase Card High-Risk Transactions

    DTIC Science & Technology

    2012-01-20

    Purchase Card Program Data Mining Process Needs Improvement. The 17 transactions that were noncompliant occurred because cardholders ignored the GPC business rules so the...

  14. Improved Warm-Working Process For An Iron-Base Alloy

    NASA Technical Reports Server (NTRS)

    Cone, Fred P.; Cryns, Brendan J.; Miller, John A.; Zanoni, Robert

    1992-01-01

    Warm-working process produces predominantly unrecrystallized grain structure in forgings of iron-base alloy A286 (PWA 1052 composition). Yield strength and ultimate strength increased, and elongation and reduction of area at break decreased. Improved process used on forgings up to 10 in. thick and weighing up to 900 lb.

  15. Improving bed turnover time with a bed management system.

    PubMed

    Tortorella, Frank; Ukanowicz, Donna; Douglas-Ntagha, Pamela; Ray, Robert; Triller, Maureen

    2013-01-01

    Efficient patient throughput requires a high degree of coordination and communication. Opportunities abound to improve the patient experience by eliminating waste from the process and improving communication among the multiple disciplines involved in facilitating patient flow. In this article, we demonstrate how an interdisciplinary team at a large tertiary cancer center implemented an electronic bed management system to improve the bed turnover component of the patient throughput process.

  16. Extension of ERIM multispectral data processing capabilities through improved data handling techniques

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.

    1973-01-01

    The improvement and extension of the capabilities of the Environmental Research Institute of Michigan processing facility in handling multispectral data are discussed. Improvements consisted of implementing hardware modifications which permitted more rapid access to the recorded data through improved numbering and indexing of such data. In addition, techniques are discussed for handling data from sources other than the ERIM M-5 and M-7 scanner systems.

  17. Enhancing performing characteristics of organic semiconducting films by improved solution processing

    DOEpatents

    Bazan, Guillermo C; Moses, Daniel; Peet, Jeffrey; Heeger, Alan J

    2014-05-13

    Improved processing methods for enhanced properties of conjugated polymer films are disclosed, as well as the enhanced conjugated polymer films produced thereby. Addition of low molecular weight alkyl-containing molecules to solutions used to form conjugated polymer films leads to improved photoconductivity and improvements in other electronic properties. The enhanced conjugated polymer films can be used in a variety of electronic devices, such as solar cells and photodiodes.

  18. Characteristics of Volunteer Coaches in a Clinical Process Improvement Program.

    PubMed

    Morley, Katharine E; Barysauskas, Constance M; Carballo, Victoria; Kalibatas, Orinta; Rao, Sandhya K; Jacobson, Joseph O; Cummings, Brian M

    The Partners Clinical Process Improvement Leadership Program provides quality improvement training for clinicians and administrators, utilizing graduates as volunteer peer coaches for mentorship. We sought to understand the factors associated with volunteer coach participation and gain insight into how to improve and sustain this program. Review of coach characteristics from course database and survey of frequent coaches. Out of 516 Partners Clinical Process Improvement Leadership Program graduates from March 2010 to June 2015, 117 (23%) individuals volunteered as coaches. Sixty-one (52%) individuals coached once, 31 (27%) coached twice, and 25 (21%) coached 3 or more times. There were statistically significant associations between coaching and occupation (P = .005), Partners Clinical Process Improvement Leadership Program course taken (P = .001), and course location (P = .007). Administrators were more likely to coach than physicians (odds ratio: 1.75, P = .04). Reasons for volunteering as a coach included further development of skills, desire to stay involved with program, and enjoying mentoring. Reasons for repeated coaching included maintaining quality improvement skills, expanding skills to a wider variety of projects, and networking. A peer graduate volunteer coach model is a viable strategy for interprofessional quality improvement mentorship. Strategies that support repeat coaching and engage clinicians should be promoted to ensure an experienced and diversified group of coaches.

  19. Quality initiatives: improving patient flow for a bone densitometry practice: results from a Mayo Clinic radiology quality initiative.

    PubMed

    Aakre, Kenneth T; Valley, Timothy B; O'Connor, Michael K

    2010-03-01

    Lean Six Sigma process improvement methodologies have been used in manufacturing for some time. However, these methodologies are also applicable to radiology as a way to identify opportunities for improvement in patient care delivery settings. A multidisciplinary team of physicians and staff conducted a 100-day quality improvement project with the guidance of a quality advisor. Using the framework of DMAIC (define, measure, analyze, improve, and control), time studies were performed for all aspects of patient and technologist involvement. From these studies, value stream maps for the current state and for the future were developed, and tests of change were implemented. Comprehensive value stream maps showed that, before implementation of process changes, an average time of 20.95 minutes was required for completion of a bone densitometry study. Two process changes (ie, tests of change) were undertaken. First, the location for completion of a patient assessment form was moved from inside the imaging room to the waiting area, enabling patients to complete the form while waiting for the technologist. Second, the patient was instructed to sit in a waiting area immediately outside the imaging rooms, rather than in the main reception area, which is far removed from the imaging area. Realignment of these process steps, with reduced technologist travel distances, resulted in a 3-minute average decrease in the patient cycle time. This represented a 15% reduction in the initial patient cycle time with no change in staff or costs. Radiology process improvement projects can yield positive results even with small incremental changes.
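
    As a quick arithmetic restatement of the figures above (no new data), the reported decrease of roughly 3 minutes against the 20.95-minute baseline corresponds to approximately the stated 15% reduction once rounding is taken into account:

    ```python
    baseline_min = 20.95      # average pre-change cycle time reported above (minutes)
    decrease_min = 3.0        # reported average decrease, as rounded in the abstract
    print(f"{100 * decrease_min / baseline_min:.1f}% reduction")   # ~14.3%, i.e. roughly the reported 15%
    ```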

  20. MORS Workshop on Improving Defense Analysis through Better Data Practices, held in Alexandria, Virginia on March 25, 26 and 27, 2003

    DTIC Science & Technology

    2004-12-03

    other process improvements could also enhance DoD data practices. These include the incorporation of library science techniques as well as processes to...coalition communities as well as adapting the approaches and lessons of the library science community. Second, there is a need to generate a plan of...Best Practices (2 of 2) - Processes - Incorporate library science techniques in repository design - Improve visibility and accessibility of DoD data

  1. Improving program documentation quality through the application of continuous improvement processes.

    PubMed

    Lovlien, Cheryl A; Johansen, Martha; Timm, Sandra; Eversman, Shari; Gusa, Dorothy; Twedell, Diane

    2007-01-01

    Maintaining the integrity of record keeping and retrievable information related to the provision of continuing education credit creates challenges for a large organization. Accurate educational program documentation is vital to support the knowledge and professional development of nursing staff. Quality review and accurate documentation of programs for nursing staff development occurred at one institution through the use of continuous improvement principles. Integration of the new process into the current system maintains the process of providing quality record keeping.

  2. Process Improvements in Training Device Acceptance Testing: A Study in Total Quality Management

    DTIC Science & Technology

    1990-12-12

    Quality Management , a small group of Government and industry specialists examined the existing training device acceptance test process for potential improvements. The agreed-to mission of the Air Force/Industry partnership was to continuously identify and promote implementable approaches to minimize the cost and time required for acceptance testing while ensuring that validated performance supports the user training requirements. Application of a Total Quality process improvement model focused on the customers and their requirements, analyzed how work was accomplished, and

  3. The Armed Services and Model Employer Status for Child Support Enforcement: A Proposal to Improve Service of Process

    DTIC Science & Technology

    1996-04-01

    THE ARMED SERVICES AND MODEL EMPLOYER STATUS FOR CHILD SUPPORT ENFORCEMENT: A PROPOSAL TO IMPROVE SERVICE OF PROCESS. A thesis presented to The Judge Advocate General's School, United States Army, by Major Alan L. Cook. ABSTRACT: On February 27, 1995, President Clinton issued Executive Order 12953, "Actions Required of all Executive Agencies to Facilitate Payment of Child

  4. A Total Quality Leadership Process Improvement Model

    DTIC Science & Technology

    1993-12-01

    A Total Quality Leadership Process Improvement Model, by Archester Houston, Ph.D., and Steven L. Dockstader, Ph.D. Total Quality Leadership Office; final report, 12/93.

  5. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that track data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using as an example a study evaluating a computerized decision support system.
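
    As an illustration of the interrupted time series approach mentioned above, a segmented regression fits a level and a slope before and after the intervention; the sketch below uses synthetic monthly data and ordinary least squares, and makes no claim about the models used in the cited study.

    ```python
    import numpy as np

    # Synthetic monthly outcome series: 12 months pre-intervention, 12 months post.
    rng = np.random.default_rng(0)
    t = np.arange(24, dtype=float)                 # time index (months)
    post = (t >= 12).astype(float)                 # 1 from the intervention month onward
    time_after = np.where(post == 1.0, t - 12, 0.0)
    y = 50 + 0.2 * t - 5 * post - 0.5 * time_after + rng.normal(0, 1, size=24)

    # Segmented regression design: intercept, baseline trend, level change at the
    # intervention, and slope change after it.
    X = np.column_stack([np.ones_like(t), t, post, time_after])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    intercept, baseline_slope, level_change, slope_change = coef
    print(f"level change: {level_change:.2f}, slope change: {slope_change:.2f}")
    ```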

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew R. Kumjian; Giangrande, Scott E.; Mishra, Subashree

    Polarimetric radar observations are increasingly used to understand cloud microphysical processes, which is critical for improving their representation in cloud and climate models. In particular, there has been recent focus on improving representations of ice collection processes (e.g., aggregation, riming), as these influence precipitation rate, heating profiles, and ultimately cloud life cycles. However, distinguishing these processes using conventional polarimetric radar observations is difficult, as they produce similar fingerprints. This necessitates improved analysis techniques and integration of complementary data sources. The Midlatitude Continental Convective Clouds Experiment (MC3E) provided such an opportunity.

  7. E-learning process maturity level: a conceptual framework

    NASA Astrophysics Data System (ADS)

    Rahmah, A.; Santoso, H. B.; Hasibuan, Z. A.

    2018-03-01

    ICT continues to advance, and its impact reaches many domains, including learning in both formal and informal settings. This leads to a new mindset: we should not only use available ICT to support the learning process, but also improve that process gradually, which involves many factors. This phenomenon is called e-learning process evolution. Accordingly, this study explores the maturity level concept to provide gradual improvement direction and progression monitoring for the individual e-learning process. An extensive literature review, observation, and construct formation were conducted to develop a conceptual framework for e-learning process maturity level. The conceptual framework consists of the learner, the e-learning process, continuous improvement, the evolution of the e-learning process, technology, and learning objectives, with the evolution of the e-learning process depicted as the current versus expected conditions of e-learning process maturity level. The study concludes that the e-learning process maturity level conceptual framework may guide the evolution roadmap for the e-learning process, accelerate that evolution, and decrease the negative impact of ICT. The conceptual framework will be verified and tested in a future study.

  8. Healthcare quality measurement in orthopaedic surgery: current state of the art.

    PubMed

    Auerbach, Andrew

    2009-10-01

    Improving quality of care in arthroplasty is of increasing importance to payors, hospitals, surgeons, and patients. Efforts to compel improvement have traditionally focused on measurement and reporting of data describing structural factors, care processes (or 'quality measures'), and clinical outcomes. Reporting structural measures (eg, surgical case volume) has been used with varying degrees of success. Care process measures, exemplified by initiatives such as the Surgical Care Improvement Project measures, are chosen based on the strength of randomized trial evidence linking the process to improved outcomes. However, evidence linking improved performance on Surgical Care Improvement Project measures with improved outcomes is limited. Outcome measures in surgery are of increasing importance as an approach to compel care improvement, with prominent examples represented by the National Surgical Quality Improvement Project. Although outcomes-focused approaches are often costly, when linked to active benchmarking and collaborative activities, they may improve care broadly. Moreover, implementation of computerized data systems collecting information formerly collected only on paper will facilitate benchmarking. In the end, care will only be improved if these data are used to define methods for innovating care systems that deliver better outcomes at lower or equivalent costs.

  9. Selective aqueous extraction of organics coupled with trapping by membrane separation

    DOEpatents

    van Eikeren, Paul; Brose, Daniel J.; Ray, Roderick J.

    1991-01-01

    An improvement to processes for the selective extraction of organic solutes from organic solvents by water-based extractants is disclosed, the improvement comprising coupling various membrane separation processes with the organic extraction process, the membrane separation process being utilized to continuously recycle the water-based extractant and at the same time selectively remove or concentrate organic solute from the water-based extractant.

  10. The Performance Improvement of the Lagrangian Particle Dispersion Model (LPDM) Using Graphics Processing Unit (GPU) Computing

    DTIC Science & Technology

    2017-08-01

    access to the GPU for general purpose processing. CUDA is designed to work easily with multiple programming languages, including Fortran. The Performance Improvement of the Lagrangian Particle Dispersion Model (LPDM) Using Graphics Processing Unit (GPU) Computing, by Leelinda P Dawson. Approved for public release; distribution unlimited.

  11. Statistical process control methods allow the analysis and improvement of anesthesia care.

    PubMed

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
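
    The p-chart analysis described above plots each period's adverse-event proportion against control limits of roughly pbar +/- 3*sqrt(pbar*(1-pbar)/n); a minimal sketch with made-up monthly counts (not the study's data):

    ```python
    import numpy as np

    # Hypothetical monthly counts: anesthetics performed and anesthetics with an
    # adverse event. A p-chart flags months whose proportion falls outside 3-sigma limits.
    n = np.array([420, 395, 460, 440, 410, 430])        # anesthetics per month
    events = np.array([74, 70, 88, 79, 72, 120])        # adverse events per month

    p = events / n
    p_bar = events.sum() / n.sum()                      # center line (overall proportion)
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)            # per-month standard error
    ucl = p_bar + 3 * sigma
    lcl = np.maximum(p_bar - 3 * sigma, 0)

    for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
        flag = "special cause" if (pi < lo or pi > hi) else "common cause"
        print(f"month {month}: p = {pi:.3f} ({flag})")
    ```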

  12. Training for Template Creation: A Performance Improvement Method

    ERIC Educational Resources Information Center

    Lyons, Paul

    2008-01-01

    Purpose: There are three purposes to this article: first, to offer a training approach to employee learning and performance improvement that makes use of a step-by-step process of skill/knowledge creation. The process offers follow-up opportunities for skill maintenance and improvement; second, to explain the conceptual bases of the approach; and…

  13. Winning performance improvement strategies--linking documentation and accounts receivable.

    PubMed

    Braden, J H; Swadley, D

    1996-01-01

    When the HIM department at The University of Texas Medical Branch set out to improve documentation and accounts receivable management, it established a plan that encompassed a broad spectrum of data management process changes. The department examined and acknowledged the deficiencies in data management processes and used performance improvement tools to achieve successful results.

  14. Feedback Providing Improvement Strategies and Reflection on Feedback Use: Effects on Students' Writing Motivation, Process, and Performance

    ERIC Educational Resources Information Center

    Duijnhouwer, Hendrien; Prins, Frans J.; Stokking, Karel M.

    2012-01-01

    This study investigated the effects of feedback providing improvement strategies and a reflection assignment on students' writing motivation, process, and performance. Students in the experimental feedback condition (n = 41) received feedback including improvement strategies, whereas students in the control feedback condition (n = 41) received…

  15. The Data-to-Action Framework: A Rapid Program Improvement Process

    ERIC Educational Resources Information Center

    Zakocs, Ronda; Hill, Jessica A.; Brown, Pamela; Wheaton, Jocelyn; Freire, Kimberley E.

    2015-01-01

    Although health education programs may benefit from quality improvement methods, scant resources exist to help practitioners apply these methods for program improvement. The purpose of this article is to describe the Data-to-Action framework, a process that guides practitioners through rapid-feedback cycles in order to generate actionable data to…

  16. 78 FR 13100 - Models for Plant-Specific Adoption of Technical Specifications Task Force Traveler TSTF-535...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-26

    ... Definition To Address Advanced Fuel Designs,'' Using the Consolidated Line Item Improvement Process AGENCY...-specific adoption using the Consolidated Line Item Improvement Process (CLIIP). Additionally, the NRC staff..., which may be more reactive at shutdown temperatures above 68 °F. This STS improvement is part...

  17. Creating Sustainable Education Projects in Roatán, Honduras through Continuous Process Improvement

    ERIC Educational Resources Information Center

    Raven, Arjan; Randolph, Adriane B.; Heil, Shelli

    2010-01-01

    The investigators worked together with permanent residents of Roatán, Honduras on sustainable initiatives to help improve the island's troubled educational programs. Our initiatives focused on increasing the number of students eligible and likely to attend a university. Using a methodology based in continuous process improvement, we developed…

  18. An Action Plan for Improving Mediocre or Stagnant Student Achievement

    ERIC Educational Resources Information Center

    Redmond, Kimberley B.

    2013-01-01

    Although all of the schools in the target school system adhere to a school improvement process, achievement scores remain mediocre or stagnant within the overseas school in Italy that serves children of United States armed service members. To address this problem, this study explored the target school's improvement process to discover how…

  19. Improving Student Retention in Higher Education: Improving Teaching and Learning

    ERIC Educational Resources Information Center

    Crosling, Glenda; Heagney, Margaret; Thomas, Liz

    2009-01-01

    As a key performance indicator in university quality assurance processes, the retention of students in their studies is an issue of concern world-wide. Implicit in the process of quality assurance is quality improvement. In this article, we examine student retention from a teaching and learning perspective, in terms of teaching and learning…

  20. The road to business process improvement--can you get there from here?

    PubMed

    Gilberto, P A

    1995-11-01

    Historically, "improvements" within the organization have been frequently attained through automation by building and installing computer systems. Material requirements planning (MRP), manufacturing resource planning II (MRP II), just-in-time (JIT), computer aided design (CAD), computer aided manufacturing (CAM), electronic data interchange (EDI), and various other TLAs (three-letter acronyms) have been used as the methods to attain business objectives. But most companies have found that installing computer software, cleaning up their data, and providing every employee with training on how to best use the systems have not resulted in the level of business improvements needed. The software systems have simply made management around the problems easier but did little to solve the basic problems. The missing element in the efforts to improve the performance of the organization has been a shift in focus from individual department improvements to cross-organizational business process improvements. This article describes how the Electric Boat Division of General Dynamics Corporation, in conjunction with the Data Systems Division, moved its focus from one of vertical organizational processes to horizontal business processes. In other words, how we got rid of the dinosaurs.

  1. Improving the work function of the niobium surface of SRF cavities by plasma processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tyagi, P. V.; Doleans, M.; Hannah, B.

    2016-01-01

    An in situ plasma processing technique using chemically reactive oxygen plasma to remove hydrocarbons from superconducting radio frequency cavity surfaces at room temperature was developed at the Spallation Neutron Source at Oak Ridge National Laboratory. To better understand the interaction between the plasma and the niobium surface, surface studies on small samples were performed. In this article, we report the results from those surface studies. The results show that plasma processing removes hydrocarbons from the top surface and improves the surface work function by 0.5-1.0 eV. Improving the work function of the RF surface of cavities can help to improve their operational performance.

  2. The process of managerial control in quality improvement initiatives.

    PubMed

    Slovensky, D J; Fottler, M D

    1994-11-01

    The fundamental intent of strategic management is to position an organization within its market to exploit organizational competencies and strengths to gain competitive advantage. Competitive advantage may be achieved through such strategies as low cost, high quality, or unique services or products. For health care organizations accredited by the Joint Commission on Accreditation of Healthcare Organizations, continually improving both processes and outcomes of organizational performance--quality improvement--in all operational areas of the organization is a mandated strategy. Defining and measuring quality and controlling the quality improvement strategy remain problematic. The article discusses the nature and processes of managerial control, some potential measures of quality, and related information needs.

  3. Burnishing of rotatory parts to improve surface quality

    NASA Astrophysics Data System (ADS)

    Celaya, A.; López de Lacalle, L. N.; Albizuri, J.; Alberdi, R.

    2009-11-01

    In this paper, the use of the rolling burnishing process to improve the final quality of railway and automotive workpieces is studied. The results are focused on the improvement of the manufacturing processes of rotary workpieces used in the railway and automotive industries, with the generic target of achieving 'maximum surface quality with minimal process time'. Burnishing is a finishing operation in which plastic deformation of surface irregularities occurs by applying pressure through a very hard element, a roller or a ceramic ball. This process gives the workpiece additional advantages such as good surface roughness, increased hardness, and high compressive residual stresses. The effect of the initial turning conditions on the final burnishing operation has also been studied. The results show that the feeds used in the initial rough turning have little influence on the surface finish of the burnished workpieces. Thus, the process times of the combined turning and burnishing processes can be reduced, optimizing the shaft's machining process.

  4. Student evaluations of the portfolio process.

    PubMed

    Murphy, John E; Airey, Tatum C; Bisso, Andrea M; Slack, Marion K

    2011-09-10

    To evaluate pharmacy students' perceived benefits of the portfolio process and to gather suggestions for improving the process. A questionnaire was designed and administered to 250 first-, second-, and third-year pharmacy students at the University of Arizona College of Pharmacy. Although the objectives of the portfolio process were for students to understand the expected outcomes, understand the impact of extracurricular activities on attaining competencies, identify what should be learned, identify their strengths and weaknesses, and modify their approach to learning, overall students perceived the portfolio process as having less than moderate benefit. First-year students wanted more examples of portfolios while second- and third-year students suggested that more time with their advisor would be beneficial. The portfolio process will continue to be refined and efforts made to improve students' perceptions of the process as it is intended to develop the self-assessments skills they will need to improve their knowledge and professional skills throughout their pharmacy careers.

  5. Using Lean methodologies to streamline processing of requests for durable medical equipment and supplies for children with complex conditions.

    PubMed

    Fields, Elise; Neogi, Smriti; Schoettker, Pamela J; Lail, Jennifer

    2017-12-12

    An improvement team from the Complex Care Center at our large pediatric medical center participated in a 60-day initiative to use Lean methodologies to standardize their processes, eliminate waste and improve the timely and reliable provision of durable medical equipment and supplies. The team used value stream mapping to identify processes needing improvement. Improvement activities addressed the initial processing of a request, provider signature on the form, returning the form to the sender, and uploading the completed documents to the electronic medical record. Data on lead time (time between receiving a request and sending the completed request to the Health Information Management department) and process time (amount of time the staff worked on the request) were collected via manual pre- and post-time studies. Following implementation of interventions, the median lead time for processing durable medical equipment and supply requests decreased from 50 days to 3 days (p < 0.0001). Median processing time decreased from 14 min to 9 min (p < 0.0001). The decrease in processing time realized annual cost savings of approximately $11,000. Collaborative leadership and multidisciplinary training in Lean methods allowed the CCC staff to incorporate common sense, standardize practices, and adapt their work environment to improve the timely and reliable provision of equipment and supplies that are essential for their patients. The application of Lean methodologies to processing requests for DME and supplies could also result in a natural spread to other paperwork and requests, thus avoiding delays and potential risk for clinical instability or deterioration. Copyright © 2017 Elsevier Inc. All rights reserved.
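
    To make the two metrics above concrete, lead time runs from receipt of a request until the completed request is sent onward, while process time counts only the minutes staff actually work on it; a minimal sketch with hypothetical timestamps (not data from the study):

    ```python
    from datetime import datetime, timedelta

    # Hypothetical DME request: received, worked on in two short sittings, then sent
    # onward to Health Information Management. All values are made up for illustration.
    received = datetime(2024, 1, 2, 9, 0)
    sent = datetime(2024, 1, 5, 14, 30)
    work_intervals = [
        (datetime(2024, 1, 2, 9, 10), datetime(2024, 1, 2, 9, 16)),    # initial processing
        (datetime(2024, 1, 5, 14, 20), datetime(2024, 1, 5, 14, 27)),  # signature and upload
    ]

    lead_time = sent - received                                        # elapsed calendar time
    process_time = sum((end - start for start, end in work_intervals), timedelta())
    print(f"lead time: {lead_time}, process time: {process_time}")
    ```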

  6. 76 FR 22678 - Trademark Trial and Appeal Board Participation in Settlement Discussions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-22

    ... of suggestions for process improvements, including suggestions related to fostering settlement... whether to pursue suggestions for process improvements. If the Office decides to pursue implementation of...

  7. An Analysis of the Defense Logistics Agency Medical Supplies Requisition Process

    DTIC Science & Technology

    1991-09-01

    Figures include a flowchart of the improvement process, a generic order processing flow, and the total order cycle from a customer's perspective. ...concentrated in the area of order processing and how it can be improved, especially in the medical supplies arena. This chapter is divided into four major...1989b:1). This time period may also be referred to as lead time, or the replenishment cycle. Figure 2 illustrates a generic order processing flow, which

  8. Improving Treatment Response for Paediatric Anxiety Disorders: An Information-Processing Perspective.

    PubMed

    Ege, Sarah; Reinholdt-Dunne, Marie Louise

    2016-12-01

    Cognitive behavioural therapy (CBT) is considered the treatment of choice for paediatric anxiety disorders, yet there remains substantial room for improvement in treatment outcomes. This paper examines whether theory and research into the role of information-processing in the underlying psychopathology of paediatric anxiety disorders indicate possibilities for improving treatment response. Using a critical review of recent theoretical, empirical and academic literature, the paper examines the role of information-processing biases in paediatric anxiety disorders, the extent to which CBT targets information-processing biases, and possibilities for improving treatment response. The literature reviewed indicates a role for attentional and interpretational biases in anxious psychopathology. While there is theoretical grounding and limited empirical evidence to indicate that CBT ameliorates interpretational biases, evidence regarding the effects of CBT on attentional biases is mixed. Novel treatment methods including attention bias modification training, attention feedback awareness and control training, and mindfulness-based therapy may hold potential in targeting attentional biases, and thereby in improving treatment response. The integration of novel interventions into an existing evidence-based protocol is a complex issue and faces important challenges with regard to determining the optimal treatment package. Novel interventions targeting information-processing biases may hold potential in improving response to CBT for paediatric anxiety disorders. Many important questions remain to be answered.

  9. NASA Johnson Space Center: Total quality partnership

    NASA Technical Reports Server (NTRS)

    Harlan, Charlie; Boyd, Alfred A.

    1992-01-01

    The development of and benefits realized from a joint NASA, support contractor continuous improvement process at the Johnson Space Center (JSC) is traced. The joint effort described is the Safety, Reliability, and Quality Assurance Directorate relationship with its three support contractors which began in early 1990. The Continuous Improvement effort started in early 1990 with an initiative to document and simplify numerous engineering change evaluation processes. This effort quickly grew in scope and intensity to include process improvement teams, improvement methodologies, awareness, and training. By early 1991, the support contractor had teams in place and functioning, program goals established and a cultural change effort underway. In mid-1991 it became apparent that a major redirection was needed to counter a growing sense of frustration and dissatisfaction from teams and managers. Sources of frustration were isolated to insufficient joint participation on teams, and to a poorly defined vision. Over the next year, the effort was transformed to a truly joint process. The presentation covers the steps taken to define vision, values, goals, and priorities and to form a joint Steering Committee and joint process improvement teams. The most recent assessment against the President's award criteria is presented as a summary of progress. Small, but important improvement results have already demonstrated the value of the joint effort.

  10. MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes

    USGS Publications Warehouse

    Williams, B.K.

    1988-01-01

    Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
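
    The value-improvement technique mentioned above can be sketched for a small finite MDP; the transition probabilities and rewards below are arbitrary placeholders, not the mallard harvest model discussed in the paper.

    ```python
    import numpy as np

    # Tiny finite MDP: 3 states, 2 actions. P[a, s, s'] is a transition probability and
    # R[a, s] an expected immediate reward; all values are arbitrary placeholders.
    P = np.array([
        [[0.7, 0.3, 0.0], [0.1, 0.8, 0.1], [0.0, 0.4, 0.6]],   # action 0
        [[0.9, 0.1, 0.0], [0.2, 0.6, 0.2], [0.1, 0.1, 0.8]],   # action 1
    ])
    R = np.array([
        [1.0, 0.5, 0.0],    # action 0
        [0.8, 0.9, 2.0],    # action 1
    ])
    gamma = 0.95            # discount factor (< 1 for the discounted case)

    V = np.zeros(3)
    for _ in range(1000):                        # value-improvement (value iteration) sweeps
        Q = R + gamma * (P @ V)                  # Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] * V[s']
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < 1e-8:     # stop once successive improvements are negligible
            V = V_new
            break
        V = V_new

    policy = Q.argmax(axis=0)                    # greedy policy with respect to the converged values
    print("values:", np.round(V, 3), "policy:", policy)
    ```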

  11. Improving Informed Consent with Minority Participants: Results from Researcher and Community Surveys

    PubMed Central

    Quinn, Sandra Crouse; Garza, Mary A.; Butler, James; Fryer, Craig S.; Casper, Erica T.; Thomas, Stephen B.; Barnard, David; Kim, Kevin H.

    2013-01-01

    Strengthening the informed consent process is one avenue for improving recruitment of minorities into research. This study examines that process from two different perspectives, that of researchers and that of African American and Latino community members. Through the use of two separate surveys, we compared strategies used by researchers with the preferences and attitudes of community members during the informed consent process. Our data suggest that researchers can improve the informed consent process by incorporating methods preferred by the community members along with methods shown in the literature for increasing comprehension. With this approach, the informed consent process may increase both participants’ comprehension of the material and overall satisfaction, fostering greater trust in research and openness to future research opportunities. PMID:23324203

  12. Evaluation of process excellence tools in improving donor flow management in a tertiary care hospital in South India

    PubMed Central

    Venugopal, Divya; Rafi, Aboobacker Mohamed; Innah, Susheela Jacob; Puthayath, Bibin T.

    2017-01-01

    BACKGROUND: Process Excellence is a value-based approach that focuses on standardizing work processes by eliminating non-value-added processes, identifying process-improvement methodologies, and maximizing the capacity and expertise of the staff. AIM AND OBJECTIVES: To evaluate the utility of Process Excellence tools in improving donor flow management in a tertiary care hospital by studying the current state of donor movement within the blood bank and providing recommendations for eliminating wait times and improving the process and workflow. MATERIALS AND METHODS: The work was done in two phases. The first phase comprised on-site observations with the help of an expert trained in Process Excellence methodology, who observed and documented various aspects of donor flow, donor turnaround time, total staff details, and operator process flow. The second phase comprised the constitution of a team to analyse the data collected. The analysed data, along with the recommendations, were presented before an expert hospital committee and the management. RESULTS: Our analysis put forward our strengths and identified potential problems. Donor wait time was reduced by 50% after the lean intervention, due to better donor management and reorganization of the infrastructure of the donor area. Receptionist tracking showed that the staff spent 62% of total time walking and 22% in other non-value-added activities. Defining duties for each staff member reduced the time spent in non-value-added activities. Implementation of a token system, generation of a unique identification code for donors, and bar-code labelling of tubes and bags are among the other recommendations. CONCLUSION: Process Excellence is not a programme; it is a culture that transforms an organization and improves its quality and efficiency through new attitudes, elimination of waste, and reduction in costs. PMID:28970681

  13. Evaluation of process excellence tools in improving donor flow management in a tertiary care hospital in South India.

    PubMed

    Venugopal, Divya; Rafi, Aboobacker Mohamed; Innah, Susheela Jacob; Puthayath, Bibin T

    2017-01-01

    Process Excellence is a value-based approach that focuses on standardizing work processes by eliminating non-value-added processes, identifying process-improvement methodologies, and maximizing the capacity and expertise of the staff. The aim was to evaluate the utility of Process Excellence tools in improving donor flow management in a tertiary care hospital by studying the current state of donor movement within the blood bank and providing recommendations for eliminating wait times and improving the process and workflow. The work was done in two phases. The first phase comprised on-site observations with the help of an expert trained in Process Excellence methodology, who observed and documented various aspects of donor flow, donor turnaround time, total staff details, and operator process flow. The second phase comprised the constitution of a team to analyse the data collected. The analysed data, along with the recommendations, were presented before an expert hospital committee and the management. Our analysis put forward our strengths and identified potential problems. Donor wait time was reduced by 50% after the lean intervention, due to better donor management and reorganization of the infrastructure of the donor area. Receptionist tracking showed that the staff spent 62% of total time walking and 22% in other non-value-added activities. Defining duties for each staff member reduced the time spent in non-value-added activities. Implementation of a token system, generation of a unique identification code for donors, and bar-code labelling of tubes and bags are among the other recommendations. Process Excellence is not a programme; it is a culture that transforms an organization and improves its quality and efficiency through new attitudes, elimination of waste, and reduction in costs.

  14. Use of Failure Mode and Effects Analysis to Improve Emergency Department Handoff Processes.

    PubMed

    Sorrentino, Patricia

    2016-01-01

    The purpose of this article is to describe a quality improvement process using failure mode and effects analysis (FMEA) to evaluate systems handoff communication processes, improve emergency department (ED) throughput and reduce crowding through development of a standardized handoff, and, ultimately, improve patient safety. Risk of patient harm through ineffective communication during handoff transitions is a major reason for breakdown of systems. Complexities of ED processes put patient safety at risk. An increased incidence of submitted patient safety event reports for handoff communication failures between the ED and inpatient units solidified a decision to implement the use of FMEA to identify handoff failures and mitigate patient harm through redesign. The clinical nurse specialist implemented an FMEA. Handoff failure themes were created from deidentified retrospective reviews. Weekly meetings were held over a 3-month period to identify failure modes and determine cause and effect on the process. A functional block diagram process map tool was used to illustrate handoff processes. An FMEA grid was used to list failure modes and assign a risk priority number to quantify results. Multiple areas with actionable failures were identified. A majority of causes for high-priority failure modes were specific to communications. Findings demonstrate the complexity of transition and handoff processes. The FMEA served to identify and evaluate the risk of handoff failures and provide a framework for process improvement. A focus on mentoring nurses in quality handoff processes so that they become habitual practice is crucial to safe patient transitions. Standardizing content and hardwiring it within the system are best practice. The clinical nurse specialist is prepared to provide strong leadership to drive and implement system-wide quality projects.
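
    The risk priority numbers mentioned above are conventionally computed as severity x occurrence x detection, each rated on a 1-10 scale. The sketch below assumes that standard scoring; the failure modes and ratings are hypothetical, not the study's data.

    ```python
    # Sketch of risk priority number (RPN) scoring for an FMEA grid.
    # RPN = severity x occurrence x detection (each conventionally rated 1-10).
    # The failure modes and ratings below are hypothetical, not the study's.

    failure_modes = [
        # (description,                              severity, occurrence, detection)
        ("incomplete handoff report",                       8,          6,         5),
        ("wrong patient identifier communicated",           9,          3,         4),
        ("delay in inpatient unit acknowledgement",         6,          7,         6),
    ]

    scored = [(desc, s * o * d) for desc, s, o, d in failure_modes]
    for desc, rpn in sorted(scored, key=lambda item: item[1], reverse=True):
        print(f"RPN {rpn:3d}  {desc}")   # highest-RPN failure modes are addressed first
    ```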

  15. Processing speed and working memory training in multiple sclerosis: a double-blind randomized controlled pilot study.

    PubMed

    Hancock, Laura M; Bruce, Jared M; Bruce, Amanda S; Lynch, Sharon G

    2015-01-01

    Between 40% and 65% of multiple sclerosis patients experience cognitive deficits, with processing speed and working memory most commonly affected. This pilot study investigated the effect of computerized cognitive training focused on improving processing speed and working memory. Participants were randomized into either an active or a sham training group and engaged in six weeks of training. The active training group improved on a measure of processing speed and attention following cognitive training, and data trended toward significance on measures of other domains. Results provide preliminary evidence that cognitive training with multiple sclerosis patients may produce moderate improvement in select areas of cognitive functioning.

  16. Clinical audit of diabetes management can improve the quality of care in a resource-limited primary care setting.

    PubMed

    Govender, Indira; Ehrlich, Rodney; Van Vuuren, Unita; De Vries, Elma; Namane, Mosedi; De Sa, Angela; Murie, Katy; Schlemmer, Arina; Govender, Strini; Isaacs, Abdul; Martell, Rob

    2012-12-01

    To determine whether clinical audit improved the performance of diabetic clinical processes in the health district in which it was implemented. Patient folders were systematically sampled annually for review. Primary health-care facilities in the Metro health district of the Western Cape Province in South Africa. Health-care workers involved in diabetes management. Clinical audit and feedback. The Skillings-Mack test was applied to median values of pooled audit results for nine diabetic clinical processes to measure whether there were statistically significant differences between annual audits performed in 2005, 2007, 2008 and 2009. Descriptive statistics were used to illustrate the order of values per process. A total of 40 community health centres participated in the baseline audit of 2005 that decreased to 30 in 2009. Except for two routine processes, baseline medians for six out of nine processes were below 50%. Pooled audit results showed statistically significant improvements in seven out of nine clinical processes. The findings indicate an association between the application of clinical audit and quality improvement in resource-limited settings. Co-interventions introduced after the baseline audit are likely to have contributed to improved outcomes. In addition, support from the relevant government health programmes and commitment of managers and frontline staff contributed to the audit's success.

  17. Remote Sensing Image Quality Assessment Experiment with Post-Processing

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.

    2018-04-01

    This paper briefly describes a post-processing influence assessment experiment comprising three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are tested, and the digital images serving as image-processing input are produced by this imaging system with the same imaging system parameters. The gathered optical sampled images with the tested imaging parameters are processed by three digital image processes: calibration pre-processing, lossy compression with different compression ratios, and image post-processing with different cores. The image quality assessment method used is just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of different imaging parameters and of post-processing on image quality can be found. The six JND subjective assessment data sets can be cross-validated against each other. Main conclusions include: image post-processing can improve image quality; image post-processing can improve image quality even with lossy compression, although image quality improves less at a higher compression ratio than at a lower one; and with our image post-processing method, image quality is better when the camera MTF is within a small range.

  18. To Plan or Not to Plan, That Is the Question

    ERIC Educational Resources Information Center

    Dolph, David A.

    2016-01-01

    Strategic planning is a process utilized by numerous organizations, including K-12 school boards, intent on improvement and reform. A thoughtful strategic planning process can help develop a board's desired future driven by goals and strategies aimed at progress. However, improvement processes such as strategic planning are challenging. In fact,…

  19. Peat Processing

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Humics, Inc. already had patented their process for separating wet peat into components and processing it when they consulted NERAC regarding possible applications. The NERAC search revealed numerous uses for humic acid extracted from peat. The product improves seed germination, stimulates root development, and improves crop yields. There are also potential applications in sewage disposal and horticultural peat, etc.

  20. Advanced Information Processing. Volume II. Instructor's Materials. Curriculum Improvement Project. Region II.

    ERIC Educational Resources Information Center

    Stanford, Linda

    This course curriculum is intended for use by community college instructors and administrators in implementing an advanced information processing course. It builds on the skills developed in the previous information processing course but goes one step further by requiring students to perform in a simulated office environment and improve their…

  1. Computer-Based Enhancements for the Improvement of Learning.

    ERIC Educational Resources Information Center

    Tennyson, Robert D.

    The third of four symposium papers argues that, if instructional methods are to improve learning, they must have two aspects: a direct trace to a specific learning process, and empirical support that demonstrates their significance. Focusing on the tracing process, the paper presents an information processing model of learning that can be used by…

  2. Turning Schools Around: The National Board Certification Process as a School Improvement Strategy

    ERIC Educational Resources Information Center

    Jaquith, Ann; Snyder, Jon

    2016-01-01

    Can the National Board certification process support school improvement where large proportions of students score below grade level on standardized tests? This SCOPE study examines a project that sought to seize and capitalize upon the learning opportunities embedded in the National Board certification process, particularly opportunities to learn…

  3. Turning Schools Around: The National Board Certification Process as a School Improvement Strategy. Research Brief

    ERIC Educational Resources Information Center

    Jaquith, Ann; Snyder, Jon

    2016-01-01

    Can the National Board certification process support school improvement where large proportions of students score below grade level on standardized tests? This SCOPE study examines a project that sought to seize and capitalize upon the learning opportunities embedded in the National Board certification process, particularly opportunities to learn…

  4. Executing Quality: A Grounded Theory of Child Care Quality Improvement Engagement Process in Pennsylvania

    ERIC Educational Resources Information Center

    Critchosin, Heather

    2014-01-01

    Executing Quality describes the perceived process experienced by participants while engaging in Keystone Standards, Training, Assistance, Resources, and Support (Keystone STARS) quality rating improvement system (QRIS). The purpose of this qualitative inquiry was to understand the process of Keystone STARS engagement in order to generate a…

  5. Improving Students’ Science Process Skills through Simple Computer Simulations on Linear Motion Conceptions

    NASA Astrophysics Data System (ADS)

    Siahaan, P.; Suryani, A.; Kaniawati, I.; Suhendi, E.; Samsudin, A.

    2017-02-01

    The purpose of this research is to identify the development of students' science process skills (SPS) on the linear motion concept by utilizing simple computer simulations. In order to simplify the learning process, the concept is divided into three sub-concepts: 1) the definition of motion, 2) uniform linear motion and 3) uniformly accelerated motion. This research was administered via a pre-experimental method with a one-group pretest-posttest design. The respondents involved in this research were 23 seventh-grade students in one of the junior high schools in Bandung City. The improvement in students' science process skills is examined based on normalized gain analysis of pretest and posttest scores for all sub-concepts. The results of this research show that students' science process skills improved by 47% (moderate) on observation skill, 43% (moderate) on summarizing skill, 70% (high) on prediction skill, 44% (moderate) on communication skill and 49% (moderate) on classification skill. These results indicate that utilizing simple computer simulations in physics learning can improve overall science process skills at a moderate level.
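
    The gains above are reported as normalized gains. Assuming the standard Hake definition, g = (post - pre) / (max - pre), averaged over students, the calculation looks like the sketch below; the scores are hypothetical, not the study's data.

    ```python
    # Sketch of the normalized-gain calculation assumed to underlie the
    # reported percentages: Hake's g = (post - pre) / (max - pre).
    # The scores below are hypothetical, not the study's data.

    def normalized_gain(pre, post, max_score=100.0):
        return (post - pre) / (max_score - pre)

    pre_scores  = [40.0, 55.0, 30.0]
    post_scores = [70.0, 75.0, 65.0]
    gains = [normalized_gain(pre, post) for pre, post in zip(pre_scores, post_scores)]
    mean_gain = sum(gains) / len(gains)
    # Conventional interpretation: g < 0.3 low, 0.3-0.7 moderate, > 0.7 high.
    print(f"mean normalized gain = {mean_gain:.2f}")
    ```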

  6. Applying the Principles of Lean Production to Gastrointestinal Biopsy Handling: From the Factory Floor to the Anatomic Pathology Laboratory.

    PubMed

    Sugianto, Jessica Z; Stewart, Brian; Ambruzs, Josephine M; Arista, Amanda; Park, Jason Y; Cope-Yokoyama, Sandy; Luu, Hung S

    2015-01-01

    To implement Lean principles to accommodate expanding volumes of gastrointestinal biopsies and to improve laboratory processes overall. Our continuous improvement (kaizen) project analyzed the current state for gastrointestinal biopsy handling using value-stream mapping for specimens obtained at a 487-bed tertiary care pediatric hospital in Dallas, Texas. We identified non-value-added time within the workflow process, from receipt of the specimen in the histology laboratory to the delivery of slides and paperwork to the pathologist. To eliminate non-value-added steps, we implemented the changes depicted in a revised-state value-stream map. Current-state value-stream mapping identified a total specimen processing time of 507 minutes, of which 358 minutes were non-value-added. This translated to a process cycle efficiency of 29%. Implementation of a revised-state value stream resulted in a total process time reduction to 238 minutes, of which 89 minutes were non-value-added, and an improved process cycle efficiency of 63%. Lean production principles of continuous improvement and waste elimination can be successfully implemented within the clinical laboratory.
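
    The efficiency figures quoted above follow from the standard Lean definition of process cycle efficiency, PCE = value-added time / total lead time. The sketch below simply reproduces the reported percentages from the record's own numbers.

    ```python
    # Process cycle efficiency (PCE) = value-added time / total lead time.
    # The minute values are those quoted in the record.

    def process_cycle_efficiency(total_min, non_value_added_min):
        value_added = total_min - non_value_added_min
        return value_added / total_min

    current = process_cycle_efficiency(507, 358)   # current state, ~29%
    revised = process_cycle_efficiency(238, 89)    # revised state, ~63%
    print(f"current state: {current:.0%}, revised state: {revised:.0%}")
    ```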

  7. Technological Innovations of Carbon Dioxide Injection in EAF-LF Steelmaking

    NASA Astrophysics Data System (ADS)

    Wei, Guangsheng; Zhu, Rong; Wu, Xuetao; Dong, Kai; Yang, Lingzhi; Liu, Runzao

    2018-06-01

    In this study, the recent innovations and improvements in carbon dioxide (CO2) injection technologies for electric arc furnace (EAF)-ladle furnace (LF) steelmaking processes have been reviewed. The utilization of CO2 in the EAF-LF steelmaking process resulted in improved efficiency, purity and environmental impact. For example, coherent jets with CO2 and O2 mixed injection can reduce the amount of iron loss and dust generation, and submerged O2 and powder injection with CO2 in an EAF can increase the production efficiency and improve the dephosphorization and denitrification characteristics. Additionally, bottom-blowing CO2 in an EAF can strengthen molten bath stirring and improve nitrogen removal, while bottom-blowing CO2 in a LF can increase the rate of desulfurization and improve the removal of inclusions. Based on these innovations, a prospective process for the cyclic utilization of CO2 in the EAF-LF steelmaking process is introduced that is effective in mitigating greenhouse gas emissions from the steelmaking shop.

  8. Technological Innovations of Carbon Dioxide Injection in EAF-LF Steelmaking

    NASA Astrophysics Data System (ADS)

    Wei, Guangsheng; Zhu, Rong; Wu, Xuetao; Dong, Kai; Yang, Lingzhi; Liu, Runzao

    2018-03-01

    In this study, the recent innovations and improvements in carbon dioxide (CO2) injection technologies for electric arc furnace (EAF)-ladle furnace (LF) steelmaking processes have been reviewed. The utilization of CO2 in the EAF-LF steelmaking process resulted in improved efficiency, purity and environmental impact. For example, coherent jets with CO2 and O2 mixed injection can reduce the amount of iron loss and dust generation, and submerged O2 and powder injection with CO2 in an EAF can increase the production efficiency and improve the dephosphorization and denitrification characteristics. Additionally, bottom-blowing CO2 in an EAF can strengthen molten bath stirring and improve nitrogen removal, while bottom-blowing CO2 in a LF can increase the rate of desulfurization and improve the removal of inclusions. Based on these innovations, a prospective process for the cyclic utilization of CO2 in the EAF-LF steelmaking process is introduced that is effective in mitigating greenhouse gas emissions from the steelmaking shop.

  9. IRQN award paper: Operational rounds: a practical administrative process to improve safety and clinical services in radiology.

    PubMed

    Donnelly, Lane F; Dickerson, Julie M; Lehkamp, Todd W; Gessner, Kevin E; Moskovitz, Jay; Hutchinson, Sally

    2008-11-01

    As part of a patient safety program in the authors' department of radiology, operational rounds have been instituted. This process consists of radiology leaders' visiting imaging divisions at the site of imaging and discussing frontline employees' concerns about patient safety, the quality of care, and patient and family satisfaction. Operational rounds are executed at a time to optimize the number of attendees. Minutes that describe the issues identified, persons responsible for improvement, and updated improvement plan status are available to employees online. Via this process, multiple patient safety and other issues have been identified and remedied. The authors believe that the process has improved patient safety, the quality of care, and the efficiency of operations. Since the inception of the safety program, the mean number of days between serious safety events involving radiology has doubled. The authors review the background around such walk rounds, describe their particular program, and give multiple illustrative examples of issues identified and improvement plans put in place.

  10. Hospital cost structure in the USA: what's behind the costs? A business case.

    PubMed

    Chandra, Charu; Kumar, Sameer; Ghildayal, Neha S

    2011-01-01

    Hospital costs in the USA are a large part of the national GDP. Medical billing and supplies processes are significant and growing contributors to hospital operations costs in the USA. This article aims to identify cost drivers associated with these processes and to suggest improvements to reduce hospital costs. A Monte Carlo simulation model that uses @Risk software facilitates cost analysis and captures variability associated with the medical billing process (administrative) and medical supplies process (variable). The model produces estimated savings for implementing new processes. Significant waste exists across the entire medical supply process that needs to be eliminated. Annual savings, by implementing the improved process, have the potential to save several billion dollars annually in US hospitals. The other analysis in this study is related to hospital billing processes. Increased spending on hospital billing processes is not entirely due to hospital inefficiency. The study lacks concrete data for accurately measuring cost savings, but there is obviously room for improvement in the two US healthcare processes. This article only looks at two specific costs associated with medical supply and medical billing processes, respectively. This study facilitates awareness of escalating US hospital expenditures. Cost categories, namely, fixed, variable and administrative, are presented to identify the greatest areas for improvement. The study will be valuable to US Congress policy makers and US healthcare industry decision makers. Medical billing process, part of a hospital's administrative costs, and hospital supplies management processes are part of variable costs. These are the two major cost drivers of US hospitals' expenditures that were examined and analyzed.
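
    The record's model was built with the @Risk add-in; the sketch below shows the same general idea, a Monte Carlo simulation of annual billing and supply costs, written in Python instead. All distributions, parameters, and volumes are invented for illustration and are not the study's data.

    ```python
    import numpy as np

    # Minimal Monte Carlo sketch of the kind of hospital cost model the record
    # describes. Distributions and parameters are invented for illustration.

    rng = np.random.default_rng(0)
    n = 100_000

    billing_cost  = rng.triangular(left=8, mode=12, right=25, size=n)    # $ per claim
    supply_waste  = rng.normal(loc=0.07, scale=0.02, size=n).clip(0, 1)  # wasted fraction of supply spend
    claims_per_yr = 250_000          # hypothetical annual claim volume
    supply_spend  = 40_000_000       # hypothetical annual supply spend, $

    annual_cost = billing_cost * claims_per_yr + supply_waste * supply_spend
    print(f"mean annual cost: ${annual_cost.mean():,.0f}")
    print(f"5th-95th percentile: ${np.percentile(annual_cost, 5):,.0f} - "
          f"${np.percentile(annual_cost, 95):,.0f}")
    ```

    Comparing the simulated cost distribution before and after a proposed process change yields the kind of estimated-savings figure the article reports.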

  11. Efficiency improvement of technological preparation of power equipment manufacturing

    NASA Astrophysics Data System (ADS)

    Milukov, I. A.; Rogalev, A. N.; Sokolov, V. P.; Shevchenko, I. V.

    2017-11-01

    Competitiveness of power equipment primarily depends on speeding up the development and mastering of new equipment samples and technologies and on enhancing the organisation and management of design, manufacturing and operation. Current political, technological and economic conditions create an acute need to change the strategy and tactics of process planning, while the issues of equipment maintenance, with simultaneous improvement of its efficiency and compatibility with domestically produced components, are also considered. In order to solve these problems, using computer-aided process planning systems for process design at all stages of the power equipment life cycle is economically viable. Computer-aided process planning is developed for the purpose of improving process planning by using mathematical methods and optimisation of design and management processes on the basis of CALS technologies, which allows for simultaneous process design, process planning organisation and management based on mathematical and physical modelling of interrelated design objects and the production system. An integration of computer-aided systems providing the interaction of informational and material processes at all stages of the product life cycle is proposed as an effective solution to the challenges in new equipment design and process planning.

  12. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, Harry C.; Fang, Ho T.

    1991-01-01

    The results of a four-year program to improve the strength and reliability of injection-molded silicon nitride are summarized. Statistically designed processing experiments were performed to identify and optimize critical processing parameters and compositions. Process improvements were monitored by strength testing at room and elevated temperatures and by microstructural characterization using optical microscopy, scanning electron microscopy, and scanning transmission electron microscopy. Processing modifications resulted in a 20 percent strength improvement and a 72 percent Weibull slope improvement over the baseline material. Additional sintering aid screening and optimization experiments succeeded in developing a new composition (GN-10) capable of 581.2 MPa at 1399 C. A SiC whisker-toughened composite using this material as a matrix achieved a room temperature toughness of 6.9 MPa m(exp .5) by the chevron-notched bar technique. Exploratory experiments were conducted on injection molding of turbocharger rotors.

  13. Feasibility study of using statistical process control to customized quality assurance in proton therapy.

    PubMed

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of meeting a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
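
    The process capability indices referred to above are conventionally Cp and Cpk computed against the specification limits, here the customized ±2% tolerance on D/MU. A minimal sketch assuming those standard definitions, with hypothetical measurements:

    ```python
    import numpy as np

    # Sketch of the process-capability calculation the record refers to, using
    # the customized +/-2% tolerance on D/MU as the specification limits.
    # The measured deviations (%) are hypothetical.

    measurements = np.array([0.3, -0.5, 0.8, 0.1, -0.2, 0.4, -0.7, 0.6])
    usl, lsl = 2.0, -2.0

    mu, sigma = measurements.mean(), measurements.std(ddof=1)
    cp  = (usl - lsl) / (6 * sigma)              # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability allowing for an off-centre mean
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")     # Cpk >= 1.33 is a common benchmark
    ```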

  14. Using Pilots to Assess the Value and Approach of CMMI Implementation

    NASA Technical Reports Server (NTRS)

    Godfrey, Sara; Andary, James; Rosenberg, Linda

    2002-01-01

    At Goddard Space Flight Center (GSFC), we have chosen to use Capability Maturity Model Integrated (CMMI) to guide our process improvement program. Projects at GSFC consist of complex systems of software and hardware that control satellites, operate ground systems, run instruments, manage databases and data and support scientific research. It is a challenge to launch a process improvement program that encompasses our diverse systems, yet is manageable in terms of cost effectiveness. In order to establish the best approach for improvement, our process improvement effort was divided into three phases: 1) Pilot projects; 2) Staged implementation; and 3) Sustainment and continual improvement. During Phase 1 the focus of the activities was on a baselining process, using pre-appraisals in order to get a baseline for making a better cost and effort estimate for the improvement effort. Pilot pre-appraisals were conducted from different perspectives so different approaches for process implementation could be evaluated. Phase 1 also concentrated on establishing an improvement infrastructure and training of the improvement teams. At the time of this paper, three pilot appraisals have been completed. Our initial appraisal was performed in a flight software area, considering the flight software organization as the organization. The second appraisal was done from a project perspective, focusing on systems engineering and acquisition, and using the organization as GSFC. The final appraisal was in a ground support software area, again using GSFC as the organization. This paper will present our initial approach, lessons learned from all three pilots and the changes in our approach based on the lessons learned.

  15. Energy Efficiency of the Outotec® Ausmelt Process for Primary Copper Smelting

    NASA Astrophysics Data System (ADS)

    Wood, Jacob; Hoang, Joey; Hughes, Stephen

    2017-03-01

    The global, non-ferrous smelting industry has witnessed the continual development and evolution of processing technologies in a bid to reduce operating costs and improve the safety and environmental performance of processing plants. This is particularly true in the copper industry, which has seen a number of bath smelting technologies developed and implemented during the past 30 years. The Outotec® Ausmelt Top Submerged Lance Process is one such example, which has been widely adopted in the modernisation of copper processing facilities in China and Russia. Despite improvements in the energy efficiency of modern copper smelting and converting technologies, additional innovation and development is required to further reduce energy consumption, whilst still complying with stringent environmental regulations. In response to this challenge, the Ausmelt Process has undergone significant change and improvement over the course of its history, in an effort to improve its overall competitiveness, particularly with respect to energy efficiency and operating costs. This paper covers a number of recent advances to the technology and highlights the impacts of these developments in reducing energy consumptions for a range of different copper flowsheets. It also compares the energy efficiency of the Ausmelt Process against the Bottom Blown Smelting process, which has become widely adopted in China over the past 5-10 years.

  16. Increasing patient safety and efficiency in transfusion therapy using formal process definitions.

    PubMed

    Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L

    2007-01-01

    The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration as well as in automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (ie, at the point of care) has essentially remained unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete. For example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative to these informal descriptions is to use formal process definitions, which can serve as the basis for a variety of analyses because these formal definitions offer precision in the representation of all possible ways that a process can be carried out in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes. The use of such formal definitions to prospectively identify potential error and improve the transfusion process has not previously been reported. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.

  17. A tale of two audits: statistical process control for improving diabetes care in primary care settings.

    PubMed

    Al-Hussein, Fahad Abdullah

    2008-01-01

    Diabetes constitutes a major burden of disease globally. Both primary and secondary prevention need to improve in order to face this challenge. Improving the management of diabetes in primary care is therefore of fundamental importance. The objective of this series of audits was to find means of improving diabetes management in chronic disease mini-clinics in primary health care. In the process, we were able to study the effect and practical usefulness of different audit designs - those measuring clinical outcomes, process of care, or both. King Saud City Family and Community Medicine Centre, Saudi National Guard Health Affairs in Riyadh city, Saudi Arabia. Simple random samples of 30 files were selected every two weeks from a sampling frame of file numbers for all diabetes clients seen over the period. Information was transferred to a form, entered on the computer, and an automated response was generated regarding the appropriateness of management, a criterion mutually agreed upon by care providers. The results were plotted on statistical process control charts (p charts) displayed for all employees. Data extraction, archiving, entry, analysis, plotting, and the design and preparation of the p charts were managed by nursing staff specially trained for the purpose by physicians with relevant previous experience. The audit series with mixed outcome and process measures failed to detect any changes in the proportion of non-conforming cases over a period of one year. The process measures series, on the other hand, showed improvement in care corresponding to a reduction in the proportion non-conforming by 10% within a period of 3 months. Non-conformities dropped from a mean of 5.0 to 1.4 over the year (P < 0.001). It is possible to improve providers' behaviour regarding implementation of given guidelines through periodic process audits and feedback. Frequent process audits in the context of statistical process control should be supplemented with concurrent outcome audits, once or twice a year.
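
    The p charts described above plot the proportion of non-conforming records per audit against 3-sigma control limits. A minimal sketch, assuming the standard p-chart limits and the record's subgroup size of n = 30 files, with hypothetical counts:

    ```python
    import math

    # Minimal p-chart sketch: centre line and 3-sigma limits for the proportion
    # of non-conforming records, with n = 30 files per audit as in the record.
    # The counts of non-conforming files are hypothetical.

    n = 30
    nonconforming = [6, 5, 7, 4, 5, 3, 4, 2]      # per fortnightly audit (hypothetical)
    p_bar = sum(nonconforming) / (n * len(nonconforming))

    sigma_p = math.sqrt(p_bar * (1 - p_bar) / n)
    ucl = p_bar + 3 * sigma_p
    lcl = max(0.0, p_bar - 3 * sigma_p)

    for i, x in enumerate(nonconforming, 1):
        p = x / n
        flag = "out of control" if (p > ucl or p < lcl) else "in control"
        print(f"audit {i}: p = {p:.2f} ({flag}); limits = [{lcl:.2f}, {ucl:.2f}]")
    ```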

  18. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  19. Paths of Adoption: Routes to Continuous Process Improvement

    DTIC Science & Technology

    2014-07-01

    This paper covers the different types of teams the authors have encountered as NAVAIR Internal Process Coaches and how they obtain a process the team will use: the Team Lead has worked on teams with good processes and wants their new team to start out on the right foot… eventually going to have to eat the entire process-improvement elephant. So, how do you get the "Never-Adopters" to undertake the effort? The key is to…

  20. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  1. Designing Process Improvement of Finished Good On Time Release and Performance Indicator Tool in Milk Industry Using Business Process Reengineering Method

    NASA Astrophysics Data System (ADS)

    Dachyar, M.; Christy, E.

    2014-04-01

    To maintain its position as a major milk producer, the Indonesian milk industry should pursue business development with the purpose of increasing the customer service level. One strategy is to create on-time release conditions for finished goods that will be distributed to customers and distributors. To achieve this condition, the management information system for finished-goods on-time release needs to be improved. The focus of this research is to conduct business process improvement using Business Process Reengineering (BPR). The key deliverable of this study is a comprehensive business strategy that addresses the root problems. To achieve the goal, evaluation, reengineering, and improvement of the ERP system are conducted. To visualize the predicted implementation, a simulation model is built with Oracle BPM. The output of this simulation showed that the proposed solution could effectively reduce the process lead time and increase the number of quality releases.

  2. Using Rapid Improvement Events for Disaster After-Action Reviews: Experience in a Hospital Information Technology Outage and Response.

    PubMed

    Little, Charles M; McStay, Christopher; Oeth, Justin; Koehler, April; Bookman, Kelly

    2018-02-01

    The use of after-action reviews (AARs) following major emergency events, such as a disaster, is common and mandated for hospitals and similar organizations. There is a recurrent challenge of identified problems not being resolved and repeated in subsequent events. A process improvement technique called a rapid improvement event (RIE) was used to conduct an AAR following a complete information technology (IT) outage at a large urban hospital. Using RIE methodology to conduct the AAR allowed for the rapid development and implementation of major process improvements to prepare for future IT downtime events. Thus, process improvement methodology, particularly the RIE, is suited for conducting AARs following disasters and holds promise for improving outcomes in emergency management. Little CM , McStay C , Oeth J , Koehler A , Bookman K . Using rapid improvement events for disaster after-action reviews: experience in a hospital information technology outage and response. Prehosp Disaster Med. 2018;33(1):98-100.

  3. Improving Readability of an Evaluation Tool for Low-Income Clients Using Visual Information Processing Theories

    ERIC Educational Resources Information Center

    Townsend, Marilyn S.; Sylva, Kathryn; Martin, Anna; Metz, Diane; Wooten-Swanson, Patti

    2008-01-01

    Literacy is an issue for many low-income audiences. Using visual information processing theories, the goal was improving readability of a food behavior checklist and ultimately improving its ability to accurately capture existing changes in dietary behaviors. Using group interviews, low-income clients (n = 18) evaluated 4 visual styles. The text…

  4. Align the Design: A Blueprint for School Improvement

    ERIC Educational Resources Information Center

    Mooney, Nancy J.; Mausbach, Ann T.

    2008-01-01

    Regardless of where you are in your school improvement process, here's a book that helps you make sure you have all the right elements working for you in the right way. The authors take you through the core processes that are essential for all school improvement efforts--from establishing your mission to differentiating your supervision based on…

  5. External Technical Support for School Improvement: Critical Issues from the Chilean Experience

    ERIC Educational Resources Information Center

    Osses, Alejandra; Bellei, Cristián; Valenzuela, Juan Pablo

    2015-01-01

    To what extent school improvement processes can be initiated and sustained from the outside has been a relevant question for policy-makers seeking to increase quality in education. Since 2008, the Chilean Government is strongly promoting the use of external technical support (ETS) services to support school improvement processes, as part of the…

  6. Superintendents' Perceptions of the School Improvement Planning Process in the Southeastern USA

    ERIC Educational Resources Information Center

    Dunaway, David M.; Bird, James J.; Wang, Chuang; Hancock, Dawson

    2014-01-01

    The purpose of this study of school improvement planning in the southeastern USA was to establish the current view of the process through the eyes of the district superintendents. The answers to the questions were consistently mixed. Generally, the presence of school improvement planning is prevalent in the large majority of districts. However,…

  7. Continuous Improvement in Action: Educators' Evidence Use for School Improvement

    ERIC Educational Resources Information Center

    Cannata, Marisa; Redding, Christopher; Rubin, Mollie

    2016-01-01

    The focus of the article is the process educators use to interpret data to turn it into usable knowledge (Honig & Coburn, 2008) while engaging in a continuous improvement process. The authors examine the types of evidence educators draw upon, its perceived relevance, and the social context in which the evidence is examined. Evidence includes…

  8. Perceptions of the Purpose and Value of the School Improvement Plan Process

    ERIC Educational Resources Information Center

    Dunaway, David M.; Kim, Do-Hong; Szad, Elizabeth R.

    2012-01-01

    The purpose of this research was to determine how teachers and administrators in a successful North Carolina district perceived the purpose and value of a school improvement plan (SIP) and the planning process. The SIP is the accepted best practice for school-wide improvement, and the perceptions of the purpose and value of the process…

  9. Improved Creative Thinkers in a Class: A Model of Activity Based Tasks for Improving University Students' Creative Thinking Abilities

    ERIC Educational Resources Information Center

    Oncu, Elif Celebi

    2016-01-01

    The main objective of this study was improving university students' from different faculties creativity thinking through a creativity education process. The education process took twelve weeks' time. As pretest, Torrance test of creative thinking (TTCT) figural form was used. Participants were 24 university students from different faculties who…

  10. Biotechnology in Food Production and Processing

    NASA Astrophysics Data System (ADS)

    Knorr, Dietrich; Sinskey, Anthony J.

    1985-09-01

    The food processing industry is the oldest and largest industry using biotechnological processes. Further development of food products and processes based on biotechnology depends upon the improvement of existing processes, such as fermentation, immobilized biocatalyst technology, and production of additives and processing aids, as well as the development of new opportunities for food biotechnology. Improvements are needed in the characterization, safety, and quality control of food materials, in processing methods, in waste conversion and utilization processes, and in currently used food microorganism and tissue culture systems. Also needed are fundamental studies of the structure-function relationship of food materials and of the cell physiology and biochemistry of raw materials.

  11. Leadership, safety climate, and continuous quality improvement: impact on process quality and patient safety.

    PubMed

    McFadden, Kathleen L; Stock, Gregory N; Gowen, Charles R

    2014-10-01

    Successful amelioration of medical errors represents a significant problem in the health care industry. There is a need for greater understanding of the factors that lead to improved process quality and patient safety outcomes in hospitals. We present a research model that shows how transformational leadership, safety climate, and continuous quality improvement (CQI) initiatives are related to objective quality and patient safety outcome measures. The proposed framework is tested using structural equation modeling, based on data collected for 204 hospitals, and supplemented with objective outcome data from the Centers for Medicare and Medicaid Services. The results provide empirical evidence that a safety climate, which is connected to the chief executive officer's transformational leadership style, is related to CQI initiatives, which are linked to improved process quality. A unique finding of this study is that, although CQI initiatives are positively associated with improved process quality, they are also associated with higher hospital-acquired condition rates, a measure of patient safety. Likewise, safety climate is directly related to improved patient safety outcomes. The notion that patient safety climate and CQI initiatives are not interchangeable or universally beneficial is an important contribution to the literature. The results confirm the importance of using CQI to effectively enhance process quality in hospitals, and patient safety climate to improve patient safety outcomes. The overall pattern of findings suggests that simultaneous implementation of CQI initiatives and patient safety climate produces greater combined benefits.

  12. Leadership, safety climate, and continuous quality improvement: impact on process quality and patient safety.

    PubMed

    McFadden, Kathleen L; Stock, Gregory N; Gowen, Charles R

    2015-01-01

    Successful amelioration of medical errors represents a significant problem in the health care industry. There is a need for greater understanding of the factors that lead to improved process quality and patient safety outcomes in hospitals. We present a research model that shows how transformational leadership, safety climate, and continuous quality improvement (CQI) initiatives are related to objective quality and patient safety outcome measures. The proposed framework is tested using structural equation modeling, based on data collected for 204 hospitals, and supplemented with objective outcome data from the Centers for Medicare and Medicaid Services. The results provide empirical evidence that a safety climate, which is connected to the chief executive officer's transformational leadership style, is related to CQI initiatives, which are linked to improved process quality. A unique finding of this study is that, although CQI initiatives are positively associated with improved process quality, they are also associated with higher hospital-acquired condition rates, a measure of patient safety. Likewise, safety climate is directly related to improved patient safety outcomes. The notion that patient safety climate and CQI initiatives are not interchangeable or universally beneficial is an important contribution to the literature. The results confirm the importance of using CQI to effectively enhance process quality in hospitals, and patient safety climate to improve patient safety outcomes. The overall pattern of findings suggests that simultaneous implementation of CQI initiatives and patient safety climate produces greater combined benefits.

  13. Improving performances of the knee replacement surgery process by applying DMAIC principles.

    PubMed

    Improta, Giovanni; Balato, Giovanni; Romano, Maria; Ponsiglione, Alfonso Maria; Raiola, Eliana; Russo, Mario Alessandro; Cuccaro, Patrizia; Santillo, Liberatina Carmela; Cesarelli, Mario

    2017-12-01

    The work is part of a project on applying Lean Six Sigma to improve health care processes. A previously published work regarding hip replacement surgery has shown promising results. Here, we propose an application of the DMAIC (Define, Measure, Analyse, Improve, and Control) cycle to improve quality and reduce costs related to prosthetic knee replacement surgery by decreasing patients' length of hospital stay (LOS). METHODS: The DMAIC cycle was adopted to decrease the patients' LOS. The University Hospital "Federico II" of Naples, one of the most important university hospitals in Southern Italy, participated in this study. Data on 148 patients who underwent prosthetic knee replacement between 2010 and 2013 were used. Process mapping, statistical measures, brainstorming activities, and comparative analysis were performed to identify factors influencing LOS and improvement strategies. The study allowed the identification of variables influencing the prolongation of the LOS and the implementation of corrective actions to improve the process of care. The adopted actions reduced the LOS by 42%, from a mean value of 14.2 to 8.3 days (the standard deviation also decreased from 5.2 to 2.3 days). The DMAIC approach has proven to be a helpful strategy that ensures a significant decrease in LOS. Furthermore, through its implementation, a significant reduction in the average costs of hospital stay can be achieved. Such a versatile approach could be applied to improve a wide range of health care processes. © 2017 John Wiley & Sons, Ltd.

  14. A green desulfurization technique: utilization of flue gas SO2 to produce H2 via a photoelectrochemical process based on Mo-doped BiVO4

    NASA Astrophysics Data System (ADS)

    Han, Jin; Li, Kejian; Cheng, Hanyun; Zhang, Liwu

    2017-12-01

    A green photoelectrochemical (PEC) process with simultaneous SO2 removal and H2 production has attracted increasing attention. The proposed process uses flue gas SO2 to improve H2 production. The efficiency of this process must be improved before it can become industrially viable. Herein, we report a Mo-modified BiVO4 photocatalyst for simultaneous SO2 removal and H2 production, whose PEC performance could be significantly improved with doping and flue gas removal. The evolution rate of H2 and the removal of SO2 could be enhanced by almost 3 times after Mo doping compared with pristine BiVO4. The enhanced H2 production and SO2 removal are attributed to the improved bulk charge carrier transport after Mo doping and the greatly enhanced oxidation reaction kinetics on the photoanode due to the formation of SO32- after SO2 absorption by the electrolyte. Because SO2 is utilized to improve the production of H2, the proposed PEC process may become a profitable desulfurization technique.

  15. Prevention and management of "do not return" notices: a quality improvement process for supplemental staffing nursing agencies.

    PubMed

    Ade-Oshifogun, Jochebed Bosede; Dufelmeier, Thaddeus

    2012-01-01

    This article describes a quality improvement process for "do not return" (DNR) notices for healthcare supplemental staffing agencies and healthcare facilities that use them. It is imperative that supplemental staffing agencies partner with healthcare facilities in assuring the quality of supplemental staff. Although supplemental staffing agencies attempt to ensure quality staffing, supplemental staff are sometimes subjectively evaluated by healthcare facilities as "DNR." The objective of this article is to describe a quality improvement process to prevent and manage "DNR" within healthcare organizations. We developed a curriculum and accompanying evaluation tool by adapting Rampersad's problem-solving discipline approach: (a) definition of area(s) for improvement; (b) identification of all possible causes; (c) development of an action plan; (d) implementation of the action plan; (e) evaluation for program improvement; and (f) standardization of the process. Face and content validity of the evaluation tool was ascertained by input from a panel of experienced supplemental staff and nursing faculty. This curriculum and its evaluation tool will have practical implications for supplemental staffing agencies and healthcare facilities in reducing "DNR" rates and in meeting certification/accreditation requirements. Further work is needed to translate this process into future research. © 2012 Wiley Periodicals, Inc.

  16. A Green Desulfurization Technique: Utilization of Flue Gas SO2 to Produce H2 via a Photoelectrochemical Process Based on Mo-Doped BiVO4

    PubMed Central

    Han, Jin; Li, Kejian; Cheng, Hanyun; Zhang, Liwu

    2017-01-01

    A green photoelectrochemical (PEC) process with simultaneous SO2 removal and H2 production has attracted increasing attention. The proposed process uses flue gas SO2 to improve H2 production. The efficiency of this process must be improved before it can become industrially viable. Herein, we report a Mo-modified BiVO4 photocatalyst for simultaneous SO2 removal and H2 production, whose PEC performance could be significantly improved with doping and flue gas removal. The evolution rate of H2 and the removal of SO2 could be enhanced by almost three times after Mo doping compared with pristine BiVO4. The enhanced H2 production and SO2 removal are attributed to the improved bulk charge carrier transport after Mo doping and the greatly enhanced oxidation reaction kinetics on the photoanode due to the formation of SO32− after SO2 absorption by the electrolyte. Because SO2 is utilized to improve the production of H2, the proposed PEC process may become a profitable desulfurization technique. PMID:29312924

  17. A Green Desulfurization Technique: Utilization of Flue Gas SO2 to Produce H2 via a Photoelectrochemical Process Based on Mo-Doped BiVO4.

    PubMed

    Han, Jin; Li, Kejian; Cheng, Hanyun; Zhang, Liwu

    2017-01-01

    A green photoelectrochemical (PEC) process with simultaneous SO2 removal and H2 production has attracted increasing attention. The proposed process uses flue gas SO2 to improve H2 production. The efficiency of this process must be improved before it can become industrially viable. Herein, we report a Mo-modified BiVO4 photocatalyst for simultaneous SO2 removal and H2 production, whose PEC performance could be significantly improved with doping and flue gas removal. The evolution rate of H2 and the removal of SO2 could be enhanced by almost three times after Mo doping compared with pristine BiVO4. The enhanced H2 production and SO2 removal are attributed to the improved bulk charge carrier transport after Mo doping and the greatly enhanced oxidation reaction kinetics on the photoanode due to the formation of SO32- after SO2 absorption by the electrolyte. Because SO2 is utilized to improve the production of H2, the proposed PEC process may become a profitable desulfurization technique.

  18. Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories

    NASA Astrophysics Data System (ADS)

    Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly

    The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In so doing, successful project health checks and monitoring for quality on software processes requires strong project management skills, well-built onshore-offshore coordination, and often needs regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories to facilitate CMMI implementation and reduce the cost of such implementation for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.

  19. Design of production process main shaft process with lean manufacturing to improve productivity

    NASA Astrophysics Data System (ADS)

    Siregar, I.; Nasution, A. A.; Andayani, U.; Anizar; Syahputri, K.

    2018-02-01

    The object of this research is a manufacturing company that produces oil palm machinery parts. In the production process there are delays in the completion of main shaft orders. Delays in the completion of orders indicate the company's low productivity in terms of resource utilization. This study aimed to obtain a proposed improvement of the production process that can increase productivity by identifying and eliminating activities that do not add value (non-value-added activity). One approach that can be used to reduce and eliminate non-value-added activity is Lean Manufacturing. This study focuses on the identification of non-value-added activity with value stream mapping as the analysis tool, while the elimination of non-value-added activity is done with the 5 Whys tool and implementation of a pull demand system. Based on the research, non-value-added activity in the production process of the main shaft amounts to 9,509.51 minutes of a total lead time of 10,804.59 minutes. This shows that the level of efficiency (Process Cycle Efficiency) in the production process of the main shaft is still very low, at 11.89%. Estimation of the improvement results showed a decrease in total lead time to 4,355.08 minutes and a greater process cycle efficiency of 29.73%, which indicates that the process was nearing the concept of lean production.

  20. CM Process Improvement and the International Space Station Program (ISSP)

    NASA Technical Reports Server (NTRS)

    Stephenson, Ginny

    2007-01-01

    This viewgraph presentation reviews the Configuration Management (CM) process improvements planned and undertaken for the International Space Station Program (ISSP). It reviews the 2004 findings and recommendations and the progress towards their implementation.
