Sample records for highly efficient tool

  1. Development of High Efficiency (14%) Solar Cell Array Module

    NASA Technical Reports Server (NTRS)

    Iles, P. A.; Khemthong, S.; Olah, S.; Sampson, W. J.; Ling, K. S.

    1979-01-01

    High-efficiency solar cells required for low-cost modules were developed. Production tooling for the manufacture of the cells and modules was designed, consisting of: (1) a back-contact soldering machine; (2) a vacuum pickup; (3) antireflective coating tooling; and (4) a test fixture.

  2. High accurate interpolation of NURBS tool path for CNC machine tools

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Liu, Huan; Yuan, Songmei

    2016-09-01

    Feedrate fluctuation caused by the approximation errors of interpolation methods strongly affects machining quality in NURBS interpolation, but at present few methods can eliminate or reduce it to a satisfactory level without sacrificing computing efficiency. To solve this problem, a highly accurate interpolation method for NURBS tool paths is proposed. The method efficiently reduces feedrate fluctuation by forming a quartic equation in the curve-parameter increment, which can be solved analytically in real time. Theoretically, the proposed method completely eliminates feedrate fluctuation for any 2nd-degree NURBS curve and interpolates 3rd-degree NURBS curves with minimal feedrate fluctuation. Moreover, a smooth feedrate-planning algorithm is proposed to generate smooth tool motion while accounting for multiple constraints and scheduling errors through an efficient planning strategy. Experiments verify the feasibility and applicability of the proposed method. This research presents a novel NURBS interpolation method that offers both high accuracy and satisfactory computing efficiency.
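    The quartic step can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the authors' exact formulation: a second-order Taylor expansion of the curve gives |C'(u)Δu + ½C''(u)Δu²|² = L², a quartic in Δu whose smallest positive real root matches the chord of the step to the commanded feed step L. The quadratic Bézier below (a 2nd-degree NURBS with unit weights) and all numeric values are invented for the demonstration.

```python
import numpy as np

# Hedged sketch: second-order Taylor interpolation step for a parametric
# tool path. Choosing du so that |C'(u)*du + 0.5*C''(u)*du^2| equals the
# feed step L gives a quartic in du (an illustration of the idea, not the
# paper's exact equations).

def quartic_step(c1, c2, L):
    """Smallest positive real du with |c1*du + 0.5*c2*du**2| == L."""
    coeffs = [0.25 * np.dot(c2, c2),   # du^4
              np.dot(c1, c2),          # du^3
              np.dot(c1, c1),          # du^2
              0.0,                     # du^1
              -L**2]                   # constant
    roots = np.roots(coeffs)
    real = [r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0]
    return min(real)

# Toy path: a quadratic Bezier, i.e. a 2nd-degree NURBS with unit weights.
P = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 0.0]])

def derivatives(u):
    c1 = 2 * (1 - u) * (P[1] - P[0]) + 2 * u * (P[2] - P[1])
    c2 = 2 * (P[2] - 2 * P[1] + P[0])
    return c1, c2

u, L = 0.2, 0.05                       # current parameter, feed step length
c1, c2 = derivatives(u)
du = quartic_step(c1, c2, L)
chord = np.linalg.norm(c1 * du + 0.5 * c2 * du**2)   # matches L closely
```

    Because du solves the quartic exactly, the chord length of the step equals L to numerical precision, which is what eliminates the feedrate fluctuation for 2nd-degree curves.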

  3. Fundamental Aeronautics Program: Overview of Project Work in Supersonic Cruise Efficiency

    NASA Technical Reports Server (NTRS)

    Castner, Raymond

    2011-01-01

    The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2011) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.

  4. Fundamental Aeronautics Program: Overview of Propulsion Work in the Supersonic Cruise Efficiency Technical Challenge

    NASA Technical Reports Server (NTRS)

    Castner, Ray

    2012-01-01

    The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2012) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.

  5. Improving smoothing efficiency of rigid conformal polishing tool using time-dependent smoothing evaluation model

    NASA Astrophysics Data System (ADS)

    Song, Chi; Zhang, Xuejun; Zhang, Xin; Hu, Haifei; Zeng, Xuefeng

    2017-06-01

    A rigid conformal (RC) lap can smooth mid-spatial-frequency (MSF) errors, which are naturally smaller than the tool size, while still removing large-scale errors in a short time. However, the RC-lap smoothing efficiency is poorer than expected, and existing smoothing models cannot explicitly specify how to improve it. We present an explicit time-dependent smoothing evaluation model containing specific smoothing parameters derived directly from the parametric smoothing model and the Preston equation. Based on the time-dependent model, we propose a strategy to improve RC-lap smoothing efficiency that incorporates the theoretical model, tool optimization, and determination of the efficiency limit. Two sets of smoothing experiments were performed to demonstrate the smoothing efficiency achieved using the time-dependent smoothing model. A high, theory-like tool influence function and a limiting tool speed of 300 rpm were obtained.
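    The Preston equation underlying such smoothing models is simple enough to state directly. The sketch below uses the textbook form dz/dt = k_p · P · V; the coefficient, pressure, speed, and dwell values are assumed order-of-magnitude numbers for glass polishing, not the paper's measured parameters.

```python
# Hedged sketch of the Preston equation, dz/dt = k_p * P * V, the removal
# law the time-dependent smoothing model builds on. All numbers are assumed
# order-of-magnitude values for glass polishing, not the paper's parameters.

def preston_removal_depth(k_p, pressure_pa, rel_speed_mps, dwell_s):
    """Material removed (m) at a point under constant pressure and speed."""
    return k_p * pressure_pa * rel_speed_mps * dwell_s

k_p = 1.0e-13    # m^2/N, Preston coefficient (assumed)
P = 1.0e4        # Pa, contact pressure (assumed)
V = 0.5          # m/s, relative tool-part speed (assumed)
depth = preston_removal_depth(k_p, P, V, 60.0)   # removal over a 60 s dwell
```

    With these values the removal over a 60 s dwell is on the order of tens of nanometres, which is why tool speed enters the smoothing-efficiency limit directly.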

  6. A Fixed-Wing Aircraft Simulation Tool for Improving the efficiency of DoD Acquisition

    DTIC Science & Technology

    2015-10-05

    Kestrel is a multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. It complements CREATE(TM)-AV Helios [12-14], a high-fidelity rotary-wing vehicle simulation tool, and CREATE(TM)-AV DaVinci [15-16], a conceptual through ... design tool. (Report period Oct 2008-Sep 2015; Scott A. Morton and David R. ...)

  7. High efficiency, long life terrestrial solar panel

    NASA Technical Reports Server (NTRS)

    Chao, T.; Khemthong, S.; Ling, R.; Olah, S.

    1977-01-01

    The design of a high-efficiency, long-life terrestrial module was completed. It utilized 256 rectangular, high-efficiency solar cells to achieve high packing density and electrical output. Tooling for the fabrication of solar cells was in house, and evaluation of cell performance began. Based on the power output analysis, the goal of a 13%-efficiency module was achievable.

  8. How to Compute a Slot Marker - Calculation of Controller Managed Spacing Tools for Efficient Descents with Precision Scheduling

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas

    2012-01-01

    This paper describes the underlying principles and algorithms for computing the primary controller managed spacing (CMS) tools developed at NASA for precisely spacing aircraft along efficient descent paths. The trajectory-based CMS tools include slot markers, delay indications and speed advisories. These tools are one of three core NASA technologies integrated in NASA's ATM Technology Demonstration-1 (ATD-1) that will operationally demonstrate the feasibility of fuel-efficient, high throughput arrival operations using Automatic Dependent Surveillance Broadcast (ADS-B) and ground-based and airborne NASA technologies for precision scheduling and spacing.

  9. The ChIP-Seq tools and web server: a resource for analyzing ChIP-seq and other types of genomic data.

    PubMed

    Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp

    2016-11-18

    ChIP-seq and related high-throughput chromatin profiling assays generate ever-increasing volumes of highly valuable biological data. To make sense of these data, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting from a read alignment file. The tools are optimized for memory efficiency and speed, allowing large data volumes to be processed on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design, in that the output from one application can serve as input to another; complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/ .

  10. Quality and Efficiency Improvement Tools for Every Radiologist.

    PubMed

    Kudla, Alexei U; Brook, Olga R

    2018-06-01

    In an era of value-based medicine, data-driven quality improvement is more important than ever to ensure safe and efficient imaging services. Familiarity with high-value tools enables all radiologists to successfully engage in quality and efficiency improvement. In this article, we review the model for improvement, strategies for measurement, and common practical tools with real-life examples that include Run chart, Control chart (Shewhart chart), Fishbone (Cause-and-Effect or Ishikawa) diagram, Pareto chart, 5 Whys, and Root Cause Analysis. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
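    As a concrete illustration of one of the tools named above, the sketch below computes Shewhart (individuals) control-chart limits at the mean ± 3 standard deviations and flags out-of-control points; the turnaround-time data are invented for the example.

```python
import statistics

# Sketch of a Shewhart individuals control chart: center line at the mean,
# control limits at mean +/- 3 standard deviations, points outside flagged.
# The turnaround-time data are invented for the example.

def control_limits(data):
    mean = statistics.fmean(data)
    sd = statistics.pstdev(data)
    return mean - 3 * sd, mean, mean + 3 * sd

def out_of_control(data):
    lcl, _, ucl = control_limits(data)
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]

# e.g. daily report turnaround times in minutes (invented)
turnaround_min = [32, 35, 30, 33, 31, 34, 36, 29, 33, 58, 32, 31]
flags = out_of_control(turnaround_min)   # flags the 58-minute excursion
```

    In practice the limits would be estimated from a baseline period rather than from data containing the excursion, but the flagging logic is the same.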

  11. Research on the tool holder mode in high speed machining

    NASA Astrophysics Data System (ADS)

    Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao

    2018-03-01

    High-speed machining technology can improve processing efficiency and precision while reducing processing cost, so the technology is highly regarded in industry. With the extensive application of high-speed machining, high-speed tool systems place ever higher requirements on the tool chuck. At present, several kinds of chucks are used in high-speed precision machining, including the heat-shrinkage tool-holder, the high-precision spring chuck, the hydraulic tool-holder, and the three-rib deformation chuck. Among them, the heat-shrinkage tool-holder offers high precision, high clamping force, high bending rigidity, and good dynamic balance, and is therefore widely used. It is thus of great significance to study the new requirements placed on the machining tool system. To meet the demands of high-speed precision machining, this paper reviews the common tool-holder technologies for high-precision machining and proposes how to correctly select a tool clamping system in practice. The characteristics and existing problems of tool clamping systems are also analyzed.

  12. ATD-1 ATM Technology Demonstration-1 and Integrated Scheduling

    NASA Technical Reports Server (NTRS)

    Quon, Leighton

    2014-01-01

    This presentation covers enabling efficient arrivals for the NextGen Air Traffic Management System and developing a set of integrated decision support tools that reduce high cognitive workload, so that controllers can simultaneously achieve safe, efficient, and expedient operations at high traffic demand levels.

  13. Influence of Process Parameters on the Process Efficiency in Laser Metal Deposition Welding

    NASA Astrophysics Data System (ADS)

    Güpner, Michael; Patschger, Andreas; Bliedtner, Jens

    Conventionally manufactured tools are often constructed entirely of an expensive, high-alloy tool steel. An alternative way to manufacture tools is to combine a cost-efficient mild steel with a functional coating in the interaction zone of the tool. Thermal processing methods, like laser metal deposition, are always characterized by thermal distortion, and the resistance against thermal distortion decreases as the material thickness is reduced. As a consequence, special process management is necessary for laser-based coating of thin parts or tools. The experimental approach in the present paper is to keep the energy and the mass per unit length constant while varying the laser power, the feed rate and the powder mass flow. The typical seam parameters are measured in order to characterize the cladding process, define process limits and evaluate process efficiency. Ways to optimize dilution, angular distortion and clad height are presented.

  14. Exploration of depth modeling mode one lossless wedgelets storage strategies for 3D-high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Sanchez, Gustavo; Marcon, César; Agostini, Luciano Volcan

    2018-01-01

    3D high efficiency video coding (3D-HEVC) has introduced tools to obtain higher efficiency in 3-D video coding, most of them related to depth-map coding. Among these tools, depth modeling mode-1 (DMM-1) focuses on better encoding the edge regions of depth maps. The large memory required for storing all wedgelet patterns is one of the bottlenecks in DMM-1 hardware design for both encoder and decoder, since many patterns must be stored. Three algorithms to reduce the DMM-1 memory requirements, and a hardware design targeting the most efficient of these algorithms, are presented. Experimental results demonstrate that the proposed solutions reduce the wedgelet memory by up to 78.8% without degrading encoding efficiency. Synthesis results demonstrate that the proposed algorithm reduces power dissipation by almost 75% compared to the standard approach.

  15. Simulation Evaluation of Controller-Managed Spacing Tools under Realistic Operational Conditions

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Hunt, Sarah M.; Prevot, Thomas

    2014-01-01

    Controller-Managed Spacing (CMS) tools have been developed to aid air traffic controllers in managing high volumes of arriving aircraft according to a schedule while enabling them to fly efficient descent profiles. The CMS tools are undergoing refinement in preparation for field demonstration as part of NASA's Air Traffic Management (ATM) Technology Demonstration-1 (ATD-1). System-level ATD-1 simulations have been conducted to quantify expected efficiency and capacity gains under realistic operational conditions. This paper presents simulation results with a focus on CMS-tool human factors. The results suggest experienced controllers new to the tools find them acceptable and can use them effectively in ATD-1 operations.

  16. Tool path strategy and cutting process monitoring in intelligent machining

    NASA Astrophysics Data System (ADS)

    Chen, Ming; Wang, Chengdong; An, Qinglong; Ming, Weiwei

    2018-06-01

    Intelligent machining is a current focus in advanced manufacturing technology, and is characterized by high accuracy and efficiency. A central technology of intelligent machining—the cutting process online monitoring and optimization—is urgently needed for mass production. In this research, the cutting process online monitoring and optimization in jet engine impeller machining, cranio-maxillofacial surgery, and hydraulic servo valve deburring are introduced as examples of intelligent machining. Results show that intelligent tool path optimization and cutting process online monitoring are efficient techniques for improving the efficiency, quality, and reliability of machining.

  17. High-order computational fluid dynamics tools for aircraft design

    PubMed Central

    Wang, Z. J.

    2014-01-01

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  18. Induction Consolidation of Thermoplastic Composites Using Smart Susceptors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsen, Marc R

    2012-06-14

    This project has focused on energy-efficient consolidation and molding of fiber-reinforced thermoplastic composite components as an alternative to conventional processing methods such as autoclave processing. The expanding application of composite materials in wind energy, automotive, and aerospace provides an attractive energy-efficiency target for process development. The intent is to have this efficient processing, along with recyclable thermoplastic materials, ready for large-scale application before high production volumes are reached, so the process can be implemented in a timely manner to realize the maximum economic, energy, and environmental efficiencies. Under this project an increased understanding of induction heating with smart susceptors applied to consolidation of thermoplastics has been achieved. This was done by establishing processing equipment and tooling and subsequently demonstrating the fabrication technology by consolidating/molding entry-level components for each of the participating industrial segments: wind energy, aerospace, and automotive. This understanding adds to the nation's capability to affordably manufacture high-quality, lightweight, high-performance components from advanced recyclable composite materials in a lean and energy-efficient manner. Induction heating with smart susceptors is a precisely controlled, low-energy method for the consolidation and molding of thermoplastic composites. The smart susceptor provides intrinsic thermal control based on its interaction with the magnetic field from the induction coil, producing highly repeatable processing. Low energy usage is enabled by the fact that only the smart-susceptor surface of the tool is heated, not the entire tool; much less mass is heated, so significantly less energy is required to consolidate/mold the desired composite components. This energy efficiency results in potential energy savings of ~75% compared to autoclave processing in aerospace, ~63% compared to compression molding in automotive, and ~42% compared to convectively heated tools in wind energy. The ability to make parts in a rapid and controlled manner provides significant economic advantages for each of the industrial segments. These attributes were demonstrated during the processing of the demonstration components on this project.

  19. Development of optimal grinding and polishing tools for aspheric surfaces

    NASA Astrophysics Data System (ADS)

    Burge, James H.; Anderson, Bill; Benjamin, Scott; Cho, Myung K.; Smith, Koby Z.; Valente, Martin J.

    2001-12-01

    The ability to grind and polish steep aspheric surfaces to high quality is limited by the tools used for working the surface. The optician prefers to use large, stiff tools to get good natural smoothing, avoiding small scale surface errors. This is difficult for steep aspheres because the tools must have sufficient compliance to fit the aspheric surface, yet we wish the tools to be stiff so they wear down high regions on the surface. This paper presents a toolkit for designing optimal tools that provide large scale compliance to fit the aspheric surface, yet maintain small scale stiffness for efficient polishing.

  20. Simulation of laser radar tooling ball measurements: focus dependence

    NASA Astrophysics Data System (ADS)

    Smith, Daniel G.; Slotwinski, Anthony; Hedges, Thomas

    2015-10-01

    The Nikon Metrology Laser Radar system focuses a beam from a fiber to a target object and receives the light scattered from the target through the same fiber. The system can, among other things, make highly accurate measurements of the position of a tooling ball by locating the angular position of peak signal quality, which is related to the fiber coupling efficiency. This article explores the relationship between fiber coupling efficiency and focus condition.

  1. iTunes song-gifting is a low-cost, efficient recruitment tool to engage high-risk MSM in internet research.

    PubMed

    Holland, Christine M; Ritchie, Natalie D; Du Bois, Steve N

    2015-10-01

    This brief report describes methodology and results of a novel, efficient, and low-cost recruitment tool to engage high-risk MSM in online research. We developed an incentivization protocol using iTunes song-gifting to encourage participation of high-risk MSM in an Internet-based survey of HIV status, childhood sexual abuse, and adult behavior and functioning. Our recruitment methodology yielded 489 participants in 4.5 months at a total incentive cost of $1.43USD per participant. The sample comprised a critically high-risk group of MSM, including 71.0 % who reported recent condomless anal intercourse. We offer a "how-to" guide to aid future investigators in using iTunes song-gifting incentives.

  2. Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs

    NASA Astrophysics Data System (ADS)

    Pianese, C.; Sorrentino, M.

    2009-08-01

    Among existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications due to their high energy conversion efficiencies, modularity, high fuel flexibility, and low emissions and noise. Moreover, their high working temperatures enable efficient cogeneration applications. SOFCs are entering a pre-industrial era, and interest in design tools has grown strongly in recent years. Optimal system configuration, component sizing, and control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort, and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both the single-cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs intended for a wide range of applications, from automotive to marine and airplane APUs.

  3. Reflective Writing for a Better Understanding of Scientific Concepts in High School

    ERIC Educational Resources Information Center

    El-Helou, Joseph; Kalman, Calvin S.

    2018-01-01

    Science teachers can always benefit from efficient tools that help students to engage with the subject and understand it better without significantly adding to the teacher's workload nor requiring too much of class time to manage. Reflective writing is such a low-impact, high-return tool. What follows is an introduction to reflective writing, and…

  4. High-Speed TCP Testing

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Gassman, Holly; Beering, Dave R.; Welch, Arun; Hoder, Douglas J.; Ivancic, William D.

    1999-01-01

    Transmission Control Protocol (TCP) is the underlying protocol used within the Internet for reliable information transfer. As such, there is great interest in having all implementations of TCP interoperate efficiently. This is particularly important for links exhibiting large bandwidth-delay products. The tools exist to perform TCP analysis at low rates and low delays. However, for extremely high-rate, long-delay links such as 622 Mbps over geosynchronous satellites, new tools and testing techniques are required. This paper describes the tools and techniques used to analyze and debug various TCP implementations over high-speed, long-delay links.
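    The core difficulty with such links is easy to quantify: the bandwidth-delay product (BDP) is the volume of data that must be in flight to keep the pipe full, and at 622 Mbps over a geosynchronous satellite it dwarfs TCP's classic 64 KB window. The ~550 ms round-trip time below is an assumed typical GEO value, not a figure from the paper.

```python
# Sketch: bandwidth-delay product for 622 Mbps over a geosynchronous
# satellite link. The ~550 ms round-trip time is an assumed typical GEO
# value (not taken from the paper).

def bdp_bytes(bandwidth_bps, rtt_s):
    """Data that must be in flight to fill the pipe, in bytes."""
    return bandwidth_bps * rtt_s / 8

bdp = bdp_bytes(622e6, 0.550)     # ~43 MB in flight to fill the link
windows = bdp / 65535             # hundreds of times the classic 64 KB window
```

    A sender limited to the unscaled 64 KB window would leave the link almost entirely idle, which is why window scaling and specialized test tools matter at these rates.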

  5. MODEST: a web-based design tool for oligonucleotide-mediated genome engineering and recombineering

    PubMed Central

    Bonde, Mads T.; Klausen, Michael S.; Anderson, Mads V.; Wallin, Annika I.N.; Wang, Harris H.; Sommer, Morten O.A.

    2014-01-01

    Recombineering and multiplex automated genome engineering (MAGE) offer the possibility to rapidly modify multiple genomic or plasmid sites at high efficiencies. This enables efficient creation of genetic variants, including both single mutants with specifically targeted modifications and combinatorial cell libraries. Manual design of oligonucleotides for these approaches can be tedious and time-consuming, and may not be practical for larger projects targeting many genomic sites. At present, the change from a desired phenotype (e.g. altered expression of a specific protein) to a designed MAGE oligo, which confers the corresponding genetic change, is performed manually. To address these challenges, we have developed the MAGE Oligo Design Tool (MODEST). This web-based tool supports the design of MAGE oligos for (i) tuning translation rates by modifying the ribosomal binding site, (ii) generating translational gene knockouts and (iii) introducing other coding or non-coding mutations, including amino acid substitutions, insertions, deletions and point mutations. The tool automatically designs oligos based on desired genotypic or phenotypic changes defined by the user, which can be used for high efficiency recombineering and MAGE. MODEST is available for free and is open to all users at http://modest.biosustain.dtu.dk. PMID:24838561
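    The core design step such a tool automates can be sketched in a few lines. The example below is hypothetical and greatly simplified (MODEST itself also handles RBS tuning, knockouts, and indels): it centers a desired point mutation in a 90-mer and optionally takes the reverse complement to target the lagging strand; the toy genome, coordinates, and helper names are invented.

```python
# Hypothetical, greatly simplified sketch of the core oligo-design step a
# tool like MODEST automates: center a desired point mutation in a 90-mer
# and optionally reverse-complement it to target the lagging strand. The
# toy genome, coordinates, and helper names are invented.

COMP = str.maketrans("ACGT", "TGCA")

def design_mage_oligo(genome, pos, new_base, length=90, lagging=True):
    """Return an oligo carrying genome[pos] -> new_base, mutation centered."""
    half = length // 2
    region = genome[pos - half:pos + half]
    mutated = region[:half] + new_base + region[half + 1:]
    return mutated.translate(COMP)[::-1] if lagging else mutated

genome = "ATGC" * 60                     # toy 240-bp "genome"
oligo = design_mage_oligo(genome, 120, "T", lagging=False)
# len(oligo) == 90 and the mutated base sits at index 45
```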

  6. EOSCUBE: A Constraint Database System for High-Level Specification and Efficient Generation of EOSDIS Products. Phase 1; Proof-of-Concept

    NASA Technical Reports Server (NTRS)

    Brodsky, Alexander; Segal, Victor E.

    1999-01-01

    The EOSCUBE constraint database system is designed to be a software productivity tool for high-level specification and efficient generation of EOSDIS and other scientific products. These products are typically derived from large volumes of multidimensional data which are collected via a range of scientific instruments.

  7. Semi-autonomous parking for enhanced safety and efficiency.

    DOT National Transportation Integrated Search

    2017-06-01

    This project focuses on the use of tools from a combination of computer vision and localization based navigation schemes to aid the process of efficient and safe parking of vehicles in high density parking spaces. The principles of collision avoidanc...

  8. Advancing Exposure Characterization for Chemical Evaluation and Risk Assessment

    EPA Science Inventory

    A new generation of scientific tools has emerged to rapidly measure signals from cells, tissues, and organisms following exposure to chemicals. High-visibility efforts to apply these tools for efficient toxicity testing raise important research questions in exposure science. As v...

  9. Necessity of creating digital tools to ensure efficiency of technical means

    NASA Astrophysics Data System (ADS)

    Rakov, V. I.; Zakharova, O. V.

    2018-05-01

    The authors assess problems in the functioning of technical objects. The article notes that the increasing complexity of automation systems can increase the redundant resource in proportion to the number of components and relationships in the system, and can require constant change of that redundant resource, which may make traditional redundancy structures (standby systems, fault tolerance, high availability) unnecessarily costly. The article proposes the idea of creating digital tools to ensure the efficiency of technical facilities.

  10. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.

  11. Study on Ultra-deep Azimuthal Electromagnetic Resistivity LWD Tool by Influence Quantification on Azimuthal Depth of Investigation and Real Signal

    NASA Astrophysics Data System (ADS)

    Li, Kesai; Gao, Jie; Ju, Xiaodong; Zhu, Jun; Xiong, Yanchun; Liu, Shuai

    2018-05-01

    This paper proposes a new design of an ultra-deep azimuthal electromagnetic (EM) resistivity logging-while-drilling (LWD) tool for deeper geosteering and formation evaluation, which can benefit hydrocarbon exploration and development. First, a forward numerical simulation of azimuthal EM resistivity LWD is created based on the fast Hankel transform (FHT) method, and its accuracy is confirmed under classic formation conditions. Then, a reasonable range of tool parameters is designed by analyzing the logging response. However, current technological limitations make it challenging to select tool parameters that achieve ultra-deep azimuthal detection while keeping the signal detectable. Therefore, this paper uses grey relational analysis (GRA) to quantify the influence of tool parameters on voltage and azimuthal investigation depth. After analyzing thousands of simulation results under different environmental conditions, a random forest is used to fit the data and identify an optimal combination of tool parameters, owing to its high efficiency and accuracy. Finally, the structure of the ultra-deep azimuthal EM resistivity LWD tool is designed, with a theoretical azimuthal investigation depth of 27.42-29.89 m in classic isotropic and anisotropic formations. This design serves as a reliable theoretical foundation for efficient geosteering and formation evaluation in high-angle and horizontal (HA/HZ) wells in the future.
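    The GRA step can be illustrated compactly. The sketch below follows the generic grey relational analysis recipe (min-max normalization, global Δmin/Δmax, distinguishing coefficient ρ = 0.5, the customary value) on invented sequences; it is not the paper's data or exact setup.

```python
import numpy as np

# Hedged sketch of grey relational analysis (GRA): rank how strongly each
# candidate series tracks a reference. Min-max normalization, global delta
# extrema, distinguishing coefficient rho = 0.5 (customary value). All
# sequences are invented; this is the generic recipe, not the paper's.

def normalize(seq):
    seq = np.asarray(seq, dtype=float)
    return (seq - seq.min()) / (seq.max() - seq.min())

def grey_relational_grades(reference, comparisons, rho=0.5):
    x0 = normalize(reference)
    deltas = np.array([np.abs(x0 - normalize(c)) for c in comparisons])
    dmin, dmax = deltas.min(), deltas.max()      # global extrema
    coeffs = (dmin + rho * dmax) / (deltas + rho * dmax)
    return coeffs.mean(axis=1)                   # one grade per series

voltage = [1.0, 2.1, 3.2, 4.1, 5.0]   # reference response (invented)
spacing = [2.0, 4.0, 6.1, 8.0, 10.2]  # tracks the reference closely
freq    = [5.0, 1.0, 4.0, 2.0, 3.0]   # weakly related
grades = grey_relational_grades(voltage, [spacing, freq])
# higher grade => stronger influence; spacing should outrank freq
```

    The grade ranking is what lets the analysis order tool parameters by influence before handing the data to a fitting model.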

  12. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    PubMed

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  13. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  14. HAlign-II: efficient ultra-large multiple sequence alignment and phylogenetic tree reconstruction with distributed and parallel computing.

    PubMed

    Wan, Shixiang; Zou, Quan

    2017-01-01

Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The extreme growth of next-generation sequencing output has created a shortage of efficient alignment approaches for ultra-large biological sequence sets of different sequence types. Distributed and parallel computing is a crucial technique for accelerating ultra-large (e.g., files larger than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implemented HAlign-II, a highly cost-efficient and time-efficient tool for ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on DNA and protein data sets larger than 1 GB showed that HAlign-II saves both time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences, shows extremely high memory efficiency, and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure; its open-source code and datasets are available at http://lab.malab.cn/soft/halign.

  15. Evaluating the Zebrafish Embryo Toxicity Test for Pesticide Hazard Screening

    EPA Science Inventory

    Given the numerous chemicals used in society, it is critical to develop tools for accurate and efficient evaluation of potential risks to human and ecological receptors. Fish embryo acute toxicity tests are 1 tool that has been shown to be highly predictive of standard, more reso...

  16. NEUROSCIENCE. Natural light-gated anion channels: A family of microbial rhodopsins for advanced optogenetics.

    PubMed

    Govorunova, Elena G; Sineshchekov, Oleg A; Janz, Roger; Liu, Xiaoqin; Spudich, John L

    2015-08-07

    Light-gated rhodopsin cation channels from chlorophyte algae have transformed neuroscience research through their use as membrane-depolarizing optogenetic tools for targeted photoactivation of neuron firing. Photosuppression of neuronal action potentials has been limited by the lack of equally efficient tools for membrane hyperpolarization. We describe anion channel rhodopsins (ACRs), a family of light-gated anion channels from cryptophyte algae that provide highly sensitive and efficient membrane hyperpolarization and neuronal silencing through light-gated chloride conduction. ACRs strictly conducted anions, completely excluding protons and larger cations, and hyperpolarized the membrane of cultured animal cells with much faster kinetics at less than one-thousandth of the light intensity required by the most efficient currently available optogenetic proteins. Natural ACRs provide optogenetic inhibition tools with unprecedented light sensitivity and temporal precision. Copyright © 2015, American Association for the Advancement of Science.

  17. CRISPR/Cas9 cleavage efficiency regression through boosting algorithms and Markov sequence profiling.

    PubMed

    Peng, Hui; Zheng, Yi; Blumenstein, Michael; Tao, Dacheng; Li, Jinyan

    2018-04-16

The CRISPR/Cas9 system is a widely used genome editing tool. A prediction problem of great interest for this system is how to select optimal single guide RNAs (sgRNAs) such that cleavage efficiency is high while the off-target effect is low. This work proposes a two-step averaging method (TSAM) for the regression of cleavage efficiencies of a set of sgRNAs, which averages the efficiency scores predicted by a boosting algorithm with those predicted by a support vector machine (SVM). We also propose profiled Markov properties as novel features to capture the global characteristics of sgRNAs. These new features are combined with the top features ranked by the boosting algorithm for training the SVM regressor. TSAM improves the mean Spearman correlation coefficients compared with the state-of-the-art performance on benchmark datasets containing thousands of human, mouse and zebrafish sgRNAs. Our method can also be converted to make binary distinctions between efficient and inefficient sgRNAs, with performance superior to existing methods. The analysis reveals that highly efficient sgRNAs have a lower melting temperature at the middle of the spacer, cut at parts of the genome closer to the 5'-end, and contain more 'A' but less 'G' than inefficient ones. Comprehensive further analysis also demonstrates that our tool predicts an sgRNA's cutting efficiency with consistently good performance whether it is expressed from a U6 promoter in cells or from a T7 promoter in vitro. An online tool is available at http://www.aai-bioinfo.com/CRISPR/. Python and Matlab source codes are freely available at https://github.com/penn-hui/TSAM. Jinyan.Li@uts.edu.au. Supplementary data are available at Bioinformatics online.
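The core two-step averaging idea can be sketched in a few lines: the final score for each sgRNA is the mean of two regressors' predictions, evaluated by Spearman correlation against measured efficiencies. The toy scores below merely stand in for the boosting and SVM predictions; none of the numbers come from the paper, and the Spearman implementation skips tie handling for brevity.

```python
# Sketch of the two-step averaging method (TSAM): the final efficiency
# score is the mean of two regressors' predictions. The two score lists
# are made-up stand-ins for a boosting regressor and an SVM.

def spearman(xs, ys):
    """Spearman rank correlation (no tie handling, for illustration)."""
    def ranks(v):
        order = sorted(range(len(v)), key=v.__getitem__)
        r = [0] * len(v)
        for rank, idx in enumerate(order):
            r[idx] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical predicted efficiencies for five sgRNAs.
boost_scores = [0.9, 0.4, 0.7, 0.2, 0.6]
svm_scores   = [0.8, 0.5, 0.9, 0.1, 0.5]
tsam_scores  = [(b + s) / 2 for b, s in zip(boost_scores, svm_scores)]

# Hypothetical measured efficiencies, for evaluating the averaged scores.
measured = [0.8, 0.5, 0.9, 0.1, 0.4]
rho = spearman(tsam_scores, measured)
```

Averaging two differently biased regressors often smooths out their individual errors, which is the intuition behind the two-step scheme.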

  18. msBiodat analysis tool, big data analysis for high-throughput experiments.

    PubMed

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce large amounts of data, typically presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step in extracting biologically relevant information, and the filtering can be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring big data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The tool lets researchers, regardless of programming experience, benefit from efficient database querying, and its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.
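The kind of annotation-based filtering the tool automates can be mimicked in a few lines. The protein identifiers and GO terms below are invented for illustration; the real tool performs such queries against public databases behind a web interface.

```python
# Toy version of annotation-based protein filtering: keep only the
# proteins that carry a given Gene Ontology term. All identifiers and
# GO terms here are hypothetical.
proteins = {
    "P001": {"go": {"GO:0005739", "GO:0006096"}},  # two annotations
    "P002": {"go": {"GO:0005634"}},                # one annotation
    "P003": {"go": {"GO:0005739"}},
}

wanted = "GO:0005739"
hits = sorted(pid for pid, rec in proteins.items() if wanted in rec["go"])
```

In the real application the same idea is expressed as a database query, so the filter can combine several conditions at once.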

  19. A front-end automation tool supporting design, verification and reuse of SOC.

    PubMed

    Yan, Xiao-lang; Yu, Long-li; Wang, Jie-bing

    2004-09-01

This paper describes an in-house developed language tool called VPerl, used in developing a 250 MHz 32-bit high-performance low-power embedded CPU core. The authors show that use of this tool can compress Verilog code by more than a factor of 5, increase the efficiency of the front-end design, and reduce the bug rate significantly. The tool can also be used to enhance the reusability of an intellectual property model and to facilitate porting a design to different platforms.

  20. Tribology in secondary wood machining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, P.L.; Hawthorne, H.M.; Andiappan, J.

Secondary wood manufacturing covers a wide range of products from furniture, cabinets, doors and windows, to musical instruments. Many of these are now mass produced in sophisticated, high speed numerical controlled machines. The performance and the reliability of the tools are key to an efficient and economical manufacturing process as well as to the quality of the finished products. A program concerned with three aspects of tribology of wood machining, namely, tool wear, tool-wood friction characteristics and wood surface quality characterization, was set up in the Integrated Manufacturing Technologies Institute (IMTI) of the National Research Council of Canada. The studies include friction and wear mechanism identification and modeling, wear performance of surface-engineered tool materials, friction-induced vibration and cutting efficiency, and the influence of wear and friction on finished products. This research program underlines the importance of tribology in secondary wood manufacturing and at the same time adds new challenges to tribology research since wood is a complex, heterogeneous, material and its behavior during machining is highly sensitive to the surrounding environments and to the moisture content in the work piece.

  1. Reflective Writing for a Better Understanding of Scientific Concepts in High School

    NASA Astrophysics Data System (ADS)

    El-Helou, Joseph; Kalman, Calvin S.

    2018-02-01

Science teachers can always benefit from efficient tools that help students engage with the subject and understand it better, without significantly adding to the teacher's workload or requiring too much class time to manage. Reflective writing is such a low-impact, high-return tool. What follows is an introduction to reflective writing; more on its usefulness for teachers is given in the last part of this article.

  2. Performance profiling for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Choi, Wonqook; Cho, Kihyeon; Yeo, Insung

    2018-05-01

In many physics applications, a significant amount of software (e.g. R, ROOT and Geant4) is developed on novel computing architectures, and much effort is expended to ensure the software is efficient in terms of central processing unit (CPU) time and memory usage. Profiling tools are used to evaluate this efficiency; however, few such tools are able to accommodate the low-energy physics regime. To address this limitation, we developed a low-energy physics profiling system in Geant4 to profile the CPU time and memory usage of software in brachytherapy applications. This paper describes and evaluates specific models applied to brachytherapy applications in Geant4, such as QGSP_BIC_LIV, QGSP_BIC_EMZ, and QGSP_BIC_EMY. The physics range of this tool allows it to generate low-energy profiles for brachytherapy applications; this was a limitation of previous studies, which led us to develop a new profiling tool that supports profiling in the MeV range, in contrast to the TeV range supported by existing high-energy profiling tools. To allow easy comparison of profiling results between low-energy and high-energy modes, we employed the same software architecture as the SimpliCarlo tool developed at the Fermi National Accelerator Laboratory (FNAL) for the Large Hadron Collider (LHC). The results show that the newly developed profiling system for low-energy physics (less than MeV) complements the current profiling system used for high-energy physics (greater than TeV) applications.

  3. A novel bioinformatics method for efficient knowledge discovery by BLSOM from big genomic sequence data.

    PubMed

    Bai, Yu; Iwasaki, Yuki; Kanaya, Shigehiko; Zhao, Yue; Ikemura, Toshimichi

    2014-01-01

With the remarkable increase of genomic sequence data across a wide range of species, novel tools are needed for comprehensive analyses of these big sequence data. The Self-Organizing Map (SOM) is an effective tool for clustering and visualizing high-dimensional data, such as oligonucleotide composition, on one map. By modifying the conventional SOM, we previously developed the Batch-Learning SOM (BLSOM), which allows classification of sequence fragments according to species based solely on oligonucleotide composition. In the present study, we introduce the oligonucleotide BLSOM used for characterization of vertebrate genome sequences. We first analyzed pentanucleotide compositions in 100 kb sequences derived from a wide range of vertebrate genomes, and then the compositions in the human and mouse genomes, in order to investigate an efficient method for detecting differences between closely related genomes. BLSOM can recognize the species-specific key combination of oligonucleotide frequencies in each genome, called a "genome signature," as well as regions specifically enriched in transcription-factor-binding sequences. Because its classification and visualization power is very high, BLSOM is an efficient and powerful tool for extracting a wide range of information from massive amounts of genomic sequence (i.e., big sequence data).
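The feature-extraction step that feeds a BLSOM, turning each genome fragment into an oligonucleotide-composition vector, might be sketched as follows. Trinucleotides and a toy sequence are used for brevity; the study itself uses pentanucleotide compositions of 100 kb fragments.

```python
# Sketch: convert a (toy) genome fragment into a normalized
# oligonucleotide-frequency vector, the kind of input a SOM/BLSOM
# clusters. k=3 here for brevity; the paper uses k=5 on 100 kb windows.
from itertools import product

def kmer_vector(seq, k=3):
    # All 4**k oligonucleotides in a fixed order define the vector axes.
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = {m: 0 for m in kmers}
    for i in range(len(seq) - k + 1):
        m = seq[i:i + k]
        if m in counts:          # skip windows containing N, gaps, etc.
            counts[m] += 1
    total = sum(counts.values()) or 1
    return [counts[m] / total for m in kmers]

vec = kmer_vector("ACGTACGTACGTAC")
```

Each fragment thus becomes a point in a 4^k-dimensional space, and the map learns to place fragments with similar composition (similar "genome signatures") close together.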

  4. Traffic control for high occupancy vehicle facilities in Virginia.

    DOT National Transportation Integrated Search

    1998-04-01

    High occupancy vehicle (HOV) facilities are an important tool in relieving the congestion that continues to build on many urban roadways. By moving more people in fewer vehicles, the existing infrastructure can be used more efficiently. Operating HOV...

  5. Efficient 10 kW diode-pumped Nd:YAG rod laser

    NASA Astrophysics Data System (ADS)

    Akiyama, Yasuhiro; Takada, Hiroyuki; Sasaki, Mitsuo; Yuasa, Hiroshi; Nishida, Naoto

    2003-03-01

As a tool for high-speed, high-precision material processing such as cutting and welding, we developed a rod-type all-solid-state laser with an average power of more than 10 kW, an electrical-to-optical efficiency of more than 20%, and a laser head volume of less than 0.05 m3. We developed a highly efficient diode-pumped module and obtained electrical-to-optical efficiencies of 22% in CW operation and 26% in QCW operation at multi-kW output powers. We also succeeded in reducing the laser head volume, obtaining an output power of 12 kW with an efficiency of 23% and a laser head volume of 0.045 m3. We transferred the technology to SHIBAURA mechatronics corp., which began providing the LD-pumped Nd:YAG laser system with output power up to 4.5 kW. We are now continuing development toward further high-power laser equipment.

  6. CscoreTool: fast Hi-C compartment analysis at high resolution.

    PubMed

    Zheng, Xiaobin; Zheng, Yixian

    2018-05-01

Genome-wide chromosome conformation capture (Hi-C) has revealed that the eukaryotic genome can be partitioned into A and B compartments with distinctive chromatin and transcription features. The current Principal Component Analysis (PCA)-based method for A/B compartment prediction from Hi-C data requires substantial CPU time and memory. We report the development of a method, CscoreTool, which enables fast and memory-efficient determination of A/B compartments at high resolution, even in datasets with low sequencing depth. https://github.com/scoutzxb/CscoreTool. xzheng@carnegiescience.edu. Supplementary data are available at Bioinformatics online.

  7. VirusDetect: An automated pipeline for efficient virus discovery using deep sequencing of small RNAs

    USDA-ARS?s Scientific Manuscript database

    Accurate detection of viruses in plants and animals is critical for agriculture production and human health. Deep sequencing and assembly of virus-derived siRNAs has proven to be a highly efficient approach for virus discovery. However, to date no computational tools specifically designed for both k...

  8. Recent developments in software tools for high-throughput in vitro ADME support with high-resolution MS.

    PubMed

    Paiva, Anthony; Shou, Wilson Z

    2016-08-01

The last several years have seen the rapid adoption of high-resolution MS (HRMS) for bioanalytical support of high-throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft-spot identification workflows using HRMS have greatly enhanced the quality and efficiency of structure elucidation for high-throughput metabolite identification in early in vitro ADME profiling. Finally, novel approaches to data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the large data file sizes typical of HRMS analyses.

  9. Improve load balancing and coding efficiency of tiles in high efficiency video coding by adaptive tile boundary

    NASA Astrophysics Data System (ADS)

    Chan, Chia-Hsin; Tu, Chun-Chuan; Tsai, Wen-Jiin

    2017-01-01

    High efficiency video coding (HEVC) not only improves the coding efficiency drastically compared to the well-known H.264/AVC but also introduces coding tools for parallel processing, one of which is tiles. Tile partitioning is allowed to be arbitrary in HEVC, but how to decide tile boundaries remains an open issue. An adaptive tile boundary (ATB) method is proposed to select a better tile partitioning to improve load balancing (ATB-LoadB) and coding efficiency (ATB-Gain) with a unified scheme. Experimental results show that, compared to ordinary uniform-space partitioning, the proposed ATB can save up to 17.65% of encoding times in parallel encoding scenarios and can reduce up to 0.8% of total bit rates for coding efficiency.

  10. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    PubMed

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

Cell disruption is a key unit operation for making valuable intracellular target products accessible to further downstream unit operations. Independent of the applied cell disruption method, each disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, such as measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line has not been available to date. In the current study we applied a methodology, originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization, and investigated this unit operation in more detail with a multivariate approach. The combination of HPLC and automated data analysis constitutes a valuable, novel tool for monitoring and evaluating cell disruption processes in both upstream (USP) and downstream processing (DSP): it can be implemented at-line, gives results within minutes of sampling, and requires no manual intervention.

  11. Development of the electric vehicle analyzer

    NASA Astrophysics Data System (ADS)

    Dickey, Michael R.; Klucz, Raymond S.; Ennix, Kimberly A.; Matuszak, Leo M.

    1990-06-01

    The increasing technological maturity of high power (greater than 20 kW) electric propulsion devices has led to renewed interest in their use as a means of efficiently transferring payloads between earth orbits. Several systems and architecture studies have identified the potential cost benefits of high performance Electric Orbital Transfer Vehicles (EOTVs). These studies led to the initiation of the Electric Insertion Transfer Experiment (ELITE) in 1988. Managed by the Astronautics Laboratory, ELITE is a flight experiment designed to sufficiently demonstrate key technologies and options to pave the way for the full-scale development of an operational EOTV. An important consideration in the development of the ELITE program is the capability of available analytical tools to simulate the orbital mechanics of a low thrust, electric propulsion transfer vehicle. These tools are necessary not only for ELITE mission planning exercises but also for continued, efficient, accurate evaluation of DoD space transportation architectures which include EOTVs. This paper presents such a tool: the Electric Vehicle Analyzer (EVA).

  12. Kernel based machine learning algorithm for the efficient prediction of type III polyketide synthase family of proteins.

    PubMed

    Mallika, V; Sivakumar, K C; Jaichand, S; Soniya, E V

    2010-07-13

Type III polyketide synthases (PKSs) are a family of proteins considered to have significant roles in the biosynthesis of various polyketides in plants, fungi and bacteria. As these proteins have positive effects on human health, research on them continues to grow. A tool that identifies the probability of a sequence being a type III polyketide synthase will minimize time consumption and manpower efforts. In this approach, we designed and implemented PKSIIIpred, a high-performance prediction server for type III PKSs in which the classifier is a Support Vector Machine (SVM). Based on the limited training dataset, the tool efficiently predicts the type III PKS superfamily of proteins with high sensitivity and specificity. PKSIIIpred is available at http://type3pks.in/prediction/. We expect this tool to serve as a useful resource for type III PKS researchers. Work is currently in progress to further improve prediction accuracy by including more sequence features in the training dataset.

  13. An episomal vector-based CRISPR/Cas9 system for highly efficient gene knockout in human pluripotent stem cells.

    PubMed

    Xie, Yifang; Wang, Daqi; Lan, Feng; Wei, Gang; Ni, Ting; Chai, Renjie; Liu, Dong; Hu, Shijun; Li, Mingqing; Li, Dajin; Wang, Hongyan; Wang, Yongming

    2017-05-24

Human pluripotent stem cells (hPSCs) represent a unique opportunity for understanding the molecular mechanisms underlying complex traits and diseases. CRISPR/Cas9 is a powerful tool for introducing genetic mutations into hPSCs for loss-of-function studies. Here, we developed an episomal vector-based CRISPR/Cas9 system, which we call epiCRISPR, for highly efficient gene knockout in hPSCs. The epiCRISPR system enables generation of up to 100% insertion/deletion (indel) rates, as well as efficient double-gene knockout and genomic deletion. To minimize off-target cleavage, we combined the episomal vector technology with a double-nicking strategy and the recently developed high-fidelity Cas9. The epiCRISPR system thus offers a highly efficient platform for genetic analysis in hPSCs.

  14. Study on electroplating technology of diamond tools for machining hard and brittle materials

    NASA Astrophysics Data System (ADS)

    Cui, Ying; Chen, Jian Hua; Sun, Li Peng; Wang, Yue

    2016-10-01

With the development of high-speed cutting, ultra-precision machining and ultrasonic vibration techniques for processing hard and brittle materials, the requirements on cutting tools are becoming ever higher. As electroplated diamond tools have distinct advantages, such as high adaptability, high durability, long service life and good dimensional stability, they are effectively and extensively used in grinding hard and brittle materials. In this paper, the coating structure of an electroplated diamond tool is described. The electroplating process flow is presented, and the influence of pretreatment on machining quality is analyzed. Through experimental research, a suitable electrolyte formulation, electroplating process parameters and sanding method were determined. Meanwhile, drilling experiments on glass-ceramic show that the electroplating process can effectively improve the cutting performance of diamond tools. This work lays a good foundation for further improving the quality and efficiency of machining hard and brittle materials.

  15. Development of an Efficient Genome Editing Tool in Bacillus licheniformis Using CRISPR-Cas9 Nickase.

    PubMed

    Li, Kaifeng; Cai, Dongbo; Wang, Zhangqian; He, Zhili; Chen, Shouwen

    2018-03-15

Bacillus strains are important industrial bacteria that can produce various biochemical products. However, low transformation efficiencies and a lack of effective genome editing tools have hindered their widespread application. Recently, clustered regularly interspaced short palindromic repeat (CRISPR)-Cas9 techniques have been utilized in many organisms as genome editing tools because of their high efficiency and easy manipulation. In this study, an efficient genome editing method was developed for Bacillus licheniformis using a CRISPR-Cas9 nickase integrated into the genome of B. licheniformis DW2, with overexpression driven by the P43 promoter. The yvmC gene was deleted using the CRISPR-Cas9n technique with homology arms of 1.0 kb as a representative example, and an efficiency of 100% was achieved. In addition, two genes were simultaneously disrupted with an efficiency of 11.6%, and the large DNA fragment bacABC (42.7 kb) was deleted with an efficiency of 79.0%. Furthermore, the heterologous reporter gene aprN, which codes for nattokinase in Bacillus subtilis, was inserted into the chromosome of B. licheniformis with an efficiency of 76.5%. The activity of nattokinase in the DWc9nΔ7/pP43SNT-SsacC strain reached 59.7 fibrinolytic units (FU)/ml, which was 25.7% higher than that of DWc9n/pP43SNT-SsacC. Finally, the engineered strain DWc9nΔ7 (Δepr ΔwprA Δmpr ΔaprE Δvpr ΔbprA ΔbacABC), with multiple disrupted genes, was constructed using the CRISPR-Cas9n technique. Taken together, we have developed an efficient genome editing tool based on CRISPR-Cas9n in B. licheniformis. This tool could be applied to strain improvement in future research. IMPORTANCE As important industrial bacteria, Bacillus strains have attracted significant attention due to their production of biological products. However, genetic manipulation of these bacteria is difficult. The CRISPR-Cas9 system has been applied to genome editing in some bacteria, and CRISPR-Cas9n was proven to be an efficient and precise tool in previous reports. The significance of our research is the development of an efficient, more precise, and systematic genome editing method for single-gene deletion, multiple-gene disruption, large DNA fragment deletion, and single-gene integration in Bacillus licheniformis via Cas9 nickase. We also applied this method to the genetic engineering of the host strain for protein expression. Copyright © 2018 American Society for Microbiology.

  16. Low-power, high-speed 1-bit inexact Full Adder cell designs applicable to low-energy image processing

    NASA Astrophysics Data System (ADS)

    Zareei, Zahra; Navi, Keivan; Keshavarziyan, Peiman

    2018-03-01

In this paper, three novel low-power, high-speed 1-bit inexact Full Adder cell designs are presented, based on current-mode logic in 32 nm carbon nanotube field-effect transistor technology for the first time. Circuit-level figures of merit, i.e. power, delay and power-delay product, as well as the application-level metric of error distance, are considered to assess the efficiency of the proposed cells against their counterparts. The effects of voltage scaling and temperature variation on the proposed cells are studied using the HSPICE tool. Moreover, using MATLAB, the peak signal-to-noise ratio of the proposed cells is evaluated in an image-processing application, namely a motion detector. Simulation results confirm the efficiency of the proposed cells.
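The error-distance metric mentioned above can be made concrete with a small sketch: for each input combination, the 2-bit output (carry, sum) of an inexact adder is compared with the exact adder's output, and the absolute differences are averaged. The single-row error injected below is a hypothetical approximation for illustration, not one of the paper's designs.

```python
# Sketch of the "error distance" metric for a 1-bit inexact full adder.
# The inexact design below (wrong sum bit for input 1,1,1) is invented
# purely to demonstrate the metric.

def exact_fa(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout

def inexact_fa(a, b, cin):
    s, cout = exact_fa(a, b, cin)
    if (a, b, cin) == (1, 1, 1):   # hypothetical single-row error:
        s = 0                      # exact sum bit would be 1 here
    return s, cout

# Error distance per input = |approx - exact| of (cout, s) read as a
# 2-bit number; the mean is taken over all 8 input combinations.
eds = []
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s0, c0 = exact_fa(a, b, cin)
            s1, c1 = inexact_fa(a, b, cin)
            eds.append(abs((2 * c1 + s1) - (2 * c0 + s0)))
mean_ed = sum(eds) / len(eds)
```

A mean error distance close to zero indicates the approximation rarely (or only slightly) deviates from the exact adder, which is what makes such cells usable in error-tolerant image processing.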

  17. Genome Editing Tools in Plants

    PubMed Central

    Mohanta, Tapan Kumar; Bashir, Tufail; Hashem, Abeer; Bae, Hanhong

    2017-01-01

Genome editing tools have the potential to change the genomic architecture of a genome at precise locations, with desired accuracy. These tools have been efficiently used for trait discovery and for the generation of plants with high crop yields and resistance to biotic and abiotic stresses. Due to complex genomic architectures, it is challenging to edit all genes/genomes using any one particular genome editing tool; to overcome this, several genome editing tools have been developed to facilitate efficient genome editing. Some of the major genome editing tools used to edit plant genomes are: homologous recombination (HR), zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), pentatricopeptide repeat proteins (PPRs), the CRISPR/Cas9 system, RNA interference (RNAi), cisgenesis, and intragenesis. In addition, site-directed sequence editing and oligonucleotide-directed mutagenesis have the potential to edit the genome at the single-nucleotide level. Recently, adenine base editors (ABEs) have been developed to mutate A-T base pairs to G-C base pairs; ABEs use a deoxyadenosine deaminase (TadA) together with a catalytically impaired Cas9 nickase to effect this change. PMID:29257124

  18. Effective gene delivery to Trypanosoma cruzi epimastigotes through nucleofection.

    PubMed

    Pacheco-Lugo, Lisandro; Díaz-Olmos, Yirys; Sáenz-García, José; Probst, Christian Macagnan; DaRocha, Wanderson Duarte

    2017-06-01

    New opportunities to study gene function in Trypanosoma cruzi have arisen since its genome was sequenced in 2005. Functional genomic approaches in T. cruzi are challenging due to the limited tools available for genetic manipulation, as well as the low efficiency of transient transfection with conventional methods. The Amaxa Nucleofector device was systematically tested in the present study in order to improve electroporation conditions for the epimastigote forms of T. cruzi. Transfection efficiency was quantified using the green fluorescent protein (GFP) as a reporter gene, followed by cell survival assessment. The nucleofection parameters used here increased survival rates (>90%) and transfection efficiency by approximately 35%. The small amounts of epimastigotes and DNA required for nucleofection make the method adopted here an attractive tool for high-throughput screening (HTS) applications, and for gene editing in parasites where genetic manipulation tools remain relatively scarce. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Correction of mid-spatial-frequency errors by smoothing in spin motion for CCOS

    NASA Astrophysics Data System (ADS)

    Zhang, Yizhong; Wei, Chaoyang; Shao, Jianda; Xu, Xueke; Liu, Shijie; Hu, Chen; Zhang, Haichao; Gu, Haojin

    2015-08-01

    Smoothing is a convenient and efficient way to correct mid-spatial-frequency errors, and quantifying the smoothing effect allows improvements in efficiency when finishing precision optics. A series of experiments in spin motion is performed to study smoothing effects in correcting mid-spatial-frequency errors: some use the same pitch tool at different spinning speeds, and others the same spinning speed with different tools. Shu's model is introduced and improved to describe and compare the smoothing efficiency at different spinning speeds and with different tools. In the experimental results, the mid-spatial-frequency errors on the initial surface were nearly smoothed out after processing in spin motion, and the number of smoothing passes can be estimated by the model before processing. This method was also applied to smooth an aspherical component with obvious mid-spatial-frequency error after Magnetorheological Finishing. As a result, a high-precision aspheric optical component was obtained with PV = 0.1λ and RMS = 0.01λ.
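The idea of estimating the number of smoothing passes before processing can be sketched with a generic parametric decay model: mid-spatial-frequency ripple amplitude decays per pass toward an asymptotic residual the tool cannot smooth below. The exponential form and constants below are illustrative assumptions, not the exact parametrisation of Shu's model from the paper.

```python
import math

# Hedged sketch: generic exponential decay of mid-spatial-frequency (MSF)
# ripple amplitude with the number of smoothing passes.

def ripple_after_passes(a0, a_limit, k, n):
    """MSF ripple amplitude after n passes.

    a0: initial ripple amplitude; a_limit: asymptotic residual amplitude
    the tool cannot smooth below; k: per-pass smoothing rate (which would
    depend on tool properties and spinning speed).
    """
    return a_limit + (a0 - a_limit) * math.exp(-k * n)

def passes_needed(a0, a_limit, k, target):
    """Smallest integer n such that the ripple amplitude reaches target."""
    if target <= a_limit:
        raise ValueError("target below the asymptotic smoothing limit")
    n = math.log((a0 - a_limit) / (target - a_limit)) / k
    return math.ceil(n)
```

For example, with an initial ripple of 100 (arbitrary units), a residual limit of 5, and a per-pass rate of 0.5, reaching a target of 10 takes 6 passes under this model.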

  20. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    NASA Technical Reports Server (NTRS)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

    A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated over a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the sources of rotorcraft noise and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines, including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high-fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of blade-vortex interaction noise in low-speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated against a variety of experimental data sets, including UH-60A, DNW, and HART II test data.

  1. Design of a high altitude long endurance flying-wing solar-powered unmanned air vehicle

    NASA Astrophysics Data System (ADS)

    Alsahlani, A. A.; Johnston, L. J.; Atcliffe, P. A.

    2017-06-01

    The low-Reynolds-number environment of high-altitude flight places severe demands on the aerodynamic design and stability and control of a high-altitude, long-endurance (HALE) unmanned air vehicle (UAV). The aerodynamic efficiency of a flying-wing configuration makes it an attractive design option for such an application and is investigated in the present work. The proposed configuration has a high-aspect-ratio, swept-wing planform, the wing sweep being necessary to provide an adequate moment arm for outboard longitudinal and lateral control surfaces. A design optimization framework is developed in a MATLAB environment, combining aerodynamic, structural, and stability analysis. Low-order analysis tools are employed to keep computations efficient, which is important when there are multiple optimization loops for the various engineering analyses. In particular, a vortex-lattice method is used to compute the wing planform aerodynamics, coupled to a two-dimensional (2D) panel method to derive aerofoil sectional characteristics. Integral boundary-layer methods are coupled to the panel method in order to predict flow separation boundaries during the design iterations. A quasi-analytical method is adapted to flying-wing configurations to predict the wing weight, and a linear finite-beam element approach is used for structural analysis of the wing-box. Stability is a particular concern for flying-wing aircraft in the low-density environment of high-altitude flight, so provision of adequate directional stability and control power forms part of the optimization process. At present, a modified genetic algorithm is used in all of the optimization loops. Each of the low-order engineering analysis tools is validated against higher-order methods to provide confidence in the use of these computationally efficient tools in the present design-optimization framework. This paper includes the results of employing these optimization tools in the design of a HALE flying-wing UAV, indicating that this is a viable design configuration option.
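The genetic-algorithm optimization loop described above can be sketched minimally as follows. The real framework couples aerodynamic, structural, and stability analyses inside the fitness evaluation; here the fitness is a toy function, and the selection/crossover/mutation operators are generic illustrative choices, not the paper's modified algorithm.

```python
import random

# Hedged sketch: a minimal genetic algorithm minimising a fitness function
# over box-bounded design variables.

def genetic_minimise(fitness, bounds, pop_size=30, generations=60,
                     mutation=0.1, seed=0):
    rng = random.Random(seed)

    def random_ind():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    pop = [random_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]                     # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation:
                    child[i] = rng.uniform(lo, hi)       # uniform mutation
            children.append(child)
        pop = elite + children                           # elites survive
    return min(pop, key=fitness)

# Toy usage: minimise a sphere function over [-5, 5]^2.
best = genetic_minimise(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])
```

In the design framework, the fitness would instead invoke the vortex-lattice, structural, and stability analyses and penalise constraint violations.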

  2. Creation of High Efficient Firefly Luciferase

    NASA Astrophysics Data System (ADS)

    Nakatsu, Toru

    Fireflies emit visible yellow-green light. The bioluminescence reaction is carried out by the enzyme luciferase. Luciferase bioluminescence is widely used as an excellent tool for monitoring gene expression, measuring the amount of ATP, and in vivo imaging. Recently, studies of cancer metastasis have been carried out with in vivo luminescence imaging, because luminescence imaging is less toxic and more useful for long-term assays than fluorescence imaging with GFP. However, luminescence is much dimmer than fluorescence. Bioluminescence imaging in living organisms therefore demands a highly efficient luciferase that emits near-infrared light or has enhanced emission intensity. Here I introduce an idea for creating such a highly efficient luciferase based on the crystal structure.

  3. Re-tooling critical care to become a better intensivist: something old and something new.

    PubMed

    Marini, John J

    2015-01-01

    Developments in recent years have placed powerful new tools of diagnosis, therapy, and communication at the disposal of medicine in general, and of critical care in particular. The art of healing requires not only technical proficiency, but also personal connection, multidisciplinary teamwork, and commitment to the venerable traditions of our profession. The latter often seem to be under assault by today's high-pressure, high-efficiency, and increasingly business-driven hospital environments. Re-tooling critical care for the future generations of caregivers requires something old--empathetic connection--as well as the exciting newer technologies of our science and practice.

  5. Agrobacterium rhizogenes-mediated transformation of Superroot-derived Lotus corniculatus plants: a valuable tool for functional genomics.

    PubMed

    Jian, Bo; Hou, Wensheng; Wu, Cunxiang; Liu, Bin; Liu, Wei; Song, Shikui; Bi, Yurong; Han, Tianfu

    2009-06-25

    Transgenic approaches provide a powerful tool for gene function investigations in plants. However, some legumes are still recalcitrant to current transformation technologies, limiting the extent to which functional genomic studies can be performed. Superroot of Lotus corniculatus is a continuous root cloning system allowing direct somatic embryogenesis and mass regeneration of plants. Recently, a technique to obtain transgenic L. corniculatus plants from Superroot-derived leaves through A. tumefaciens-mediated transformation was described. However, transformation efficiency was low and it took about six months from gene transfer to PCR identification. In the present study, we developed an A. rhizogenes-mediated transformation of Superroot-derived L. corniculatus for gene function investigation, combining efficient A. rhizogenes-mediated transformation with the rapid regeneration system of Superroot. The transformation system, using A. rhizogenes K599 harbouring pGFPGUSPlus, was improved by validating several parameters that may influence the transformation frequency. Using stem sections with one node as explants, a 2-day pre-culture of explants, infection with K599 at OD(600) = 0.6, and co-cultivation on medium (pH 5.4) at 22 °C for 2 days enhanced the transformation frequency significantly. As proof of concept, Superroot-derived L. corniculatus was transformed with a wheat gene encoding an Na+/H+ antiporter (TaNHX2) using the described system. Transgenic Superroot plants were obtained and showed increased salt tolerance, as expected from the expression of TaNHX2. A rapid and efficient tool for gene function investigation in L. corniculatus was thus developed, combining the simplicity and high efficiency of the Superroot regeneration system with the availability of A. rhizogenes-mediated transformation. The system was improved by validating parameters influencing the transformation frequency, which could reach 92% based on GUS detection.
The combination of the highly efficient transformation and the regeneration system of Superroot provides a valuable tool for functional genomics studies in L. corniculatus.

  6. CoCoNUT: an efficient system for the comparison and analysis of genomes

    PubMed Central

    2008-01-01

    Background Comparative genomics is the analysis and comparison of genomes from different species. This area of research is driven by the large number of sequenced genomes and relies heavily on efficient algorithms and software to perform pairwise and multiple genome comparisons. Results Most of the software tools available are tailored for one specific task. In contrast, we have developed a novel system, CoCoNUT (Computational Comparative geNomics Utility Toolkit), that allows solving several different tasks in a unified framework: (1) finding regions of high similarity among multiple genomic sequences and aligning them, (2) comparing two draft or multi-chromosomal genomes, (3) locating large segmental duplications in large genomic sequences, and (4) mapping cDNA/EST to genomic sequences. Conclusion CoCoNUT is competitive with other software tools with respect to the quality of the results. The use of state-of-the-art algorithms and data structures allows CoCoNUT to solve comparative genomics tasks more efficiently than previous tools. With its improved user interface (including an interactive visualization component), CoCoNUT provides a unified, versatile, and easy-to-use software tool for large-scale studies in comparative genomics. PMID:19014477

  7. Potential high-frequency off-target mutagenesis induced by CRISPR/Cas9 in Arabidopsis and its prevention.

    PubMed

    Zhang, Qiang; Xing, Hui-Li; Wang, Zhi-Ping; Zhang, Hai-Yan; Yang, Fang; Wang, Xue-Chen; Chen, Qi-Jun

    2018-03-01

    We present novel observations of high-specificity SpCas9 variants, sgRNA expression strategies based on mutant sgRNA scaffold and tRNA processing system, and CRISPR/Cas9-mediated T-DNA integrations. Specificity of CRISPR/Cas9 tools has been a major concern along with the reports of their successful applications. We report unexpected observations of high frequency off-target mutagenesis induced by CRISPR/Cas9 in T1 Arabidopsis mutants although the sgRNA was predicted to have a high specificity score. We also present evidence that the off-target effects were further exacerbated in the T2 progeny. To prevent the off-target effects, we tested and optimized two strategies in Arabidopsis, including introduction of a mCherry cassette for a simple and reliable isolation of Cas9-free mutants and the use of highly specific mutant SpCas9 variants. Optimization of the mCherry vectors and subsequent validation found that fusion of tRNA with the mutant rather than the original sgRNA scaffold significantly improves editing efficiency. We then examined the editing efficiency of eight high-specificity SpCas9 variants in combination with the improved tRNA-sgRNA fusion strategy. Our results suggest that highly specific SpCas9 variants require a higher level of expression than their wild-type counterpart to maintain high editing efficiency. Additionally, we demonstrate that T-DNA can be inserted into the cleavage sites of CRISPR/Cas9 targets with high frequency. Altogether, our results suggest that in plants, continuous attention should be paid to off-target effects induced by CRISPR/Cas9 in current and subsequent generations, and that the tools optimized in this report will be useful in improving genome editing efficiency and specificity in plants and other organisms.

  8. Method for Evaluating Energy Use of Dishwashers, Clothes Washers, and Clothes Dryers: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eastment, M.; Hendron, R.

    Building America teams are researching opportunities to improve energy efficiency for some of the more challenging end-uses, such as lighting (both fixed and occupant-provided), appliances (clothes washer, dishwasher, clothes dryer, refrigerator, and range), and miscellaneous electric loads, which are all heavily dependent on occupant behavior and product choices. These end-uses have grown to be a much more significant fraction of total household energy use (as much as 50% for very efficient homes) as energy-efficient homes have become more commonplace through programs such as ENERGY STAR and Building America. As modern appliances become more sophisticated, the residential energy analyst is faced with a daunting task in trying to calculate the energy savings of high-efficiency appliances. Unfortunately, most whole-building simulation tools do not allow the input of detailed appliance specifications. Using DOE test procedures, the method outlined in this paper presents a reasonable way to generate inputs for whole-building energy-simulation tools. The information necessary to generate these inputs is available on EnergyGuide labels, the ENERGY STAR website, the California Energy Commission's appliance website, and manufacturers' literature. Building America has developed a standard method for analyzing the effect of high-efficiency appliances on whole-building energy consumption compared to the Building America Research Benchmark building.
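The label-to-simulation-input conversion described above can be sketched as follows. This is an illustrative calculation in the spirit of the method, not the paper's exact procedure; the default annual cycle count is a hypothetical parameter that should be replaced with the figure from the applicable DOE test procedure.

```python
# Hedged sketch: turning a clothes washer's EnergyGuide annual energy figure
# into schedule inputs for a whole-building energy-simulation tool.

def washer_per_cycle_energy(label_kwh_per_year, cycles_per_year=392):
    """Per-cycle machine energy (kWh) from the label's annual figure.

    cycles_per_year is the annual usage assumption behind the label;
    the default of 392 loads/year is an illustrative placeholder, not a
    quoted DOE value.
    """
    return label_kwh_per_year / cycles_per_year

def daily_schedule_kwh(label_kwh_per_year, cycles_per_week):
    """Average daily energy for a household running a given weekly cycle
    count, suitable as a constant daily schedule input."""
    per_cycle = washer_per_cycle_energy(label_kwh_per_year)
    return per_cycle * cycles_per_week / 7.0
```

The same pattern applies to dishwashers and dryers, with the annual cycle assumption and any hot-water or gas components adjusted per the relevant test procedure.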

  9. Leveraging business intelligence to make better decisions: Part I.

    PubMed

    Reimers, Mona

    2014-01-01

    Data is the new currency. Business intelligence tools will provide better performing practices with a competitive intelligence advantage that will separate the high performers from the rest of the pack. Given the investments of time and money into our data systems, practice leaders must work to take every advantage and look at the datasets as a potential goldmine of business intelligence decision tools. A fresh look at decision tools created from practice data will create efficiencies and improve effectiveness for end-users and managers.

  10. Open Source GIS based integrated watershed management

    NASA Astrophysics Data System (ADS)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address current and future resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution, process-based, and real-time capable, to assess changing resource issues critical to short-, medium-, and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will greatly increase efficiency in land and water resource security, management, and planning. Following an open-source philosophy, the tools will be computer-platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within Whitebox Geospatial Analysis Tools. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex-terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool for addressing challenging resource management issues in industry, government, and non-governmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capacity with routines for GIS-based watershed management, including water in agriculture and food production. We are adding urban water management routines to GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source, geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.

  11. Open source cardiology electronic health record development for DIGICARDIAC implementation

    NASA Astrophysics Data System (ADS)

    Dugarte, Nelson; Medina, Rubén; Huiracocha, Lourdes; Rojas, Rubén

    2015-12-01

    This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools, and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support HRECG signal analysis, searching for patterns indicative of cardiovascular pathologies. Incorporation of the telecardiology tools allows the system to communicate with other health care centers, decreasing access time to patient information. The CEHR system was developed entirely with open-source software. Preliminary results of process validation showed the system's efficiency.

  12. Center for Efficient Exascale Discretizations Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolev, Tzanio; Dobrev, Veselin; Tomov, Vladimir

    The CEED software suite is a collection of generally applicable software tools focusing on the following computational motifs: PDE discretizations on unstructured meshes, high-order finite element and spectral element methods, and unstructured adaptive mesh refinement. All of this software is being developed as part of CEED, a co-design Center for Efficient Exascale Discretizations, within DOE's Exascale Computing Project (ECP).

  13. Protein structural similarity search by Ramachandran codes

    PubMed Central

    Lo, Wei-Cheng; Huang, Po-Jung; Chang, Chih-Hung; Lyu, Ping-Chiang

    2007-01-01

    Background Protein structural data has increased exponentially, such that fast and accurate tools are necessary for structure similarity searching. To improve search speed, several methods have been designed to reduce three-dimensional protein structures to one-dimensional text strings that are then analyzed by traditional sequence alignment methods; however, accuracy is usually sacrificed and the speed still cannot match that of sequence similarity search tools. Here, we aimed to improve the linear encoding methodology and develop efficient search tools that can rapidly retrieve structural homologs from large protein databases. Results We propose a new linear encoding method, SARST (Structural similarity search Aided by Ramachandran Sequential Transformation). SARST transforms protein structures into text strings through a Ramachandran map organized by nearest-neighbor clustering and uses a regenerative approach to produce substitution matrices. Classical sequence similarity search methods can then be applied to structural similarity searching. Its accuracy is similar to Combinatorial Extension (CE), but it works over 243,000 times faster, searching 34,000 proteins in 0.34 s on a 3.2-GHz CPU. SARST provides statistically meaningful expectation values to assess the retrieved information. It has been implemented as a web service and as a stand-alone Java program able to run on many different platforms. Conclusion As a database search method, SARST can rapidly distinguish high from low similarities and efficiently retrieve homologous structures. It demonstrates that the easily accessible linear encoding methodology has the potential to serve as a foundation for efficient protein structural similarity search tools. These search tools should be applicable to automated, high-throughput functional annotation or prediction for the ever-increasing number of published protein structures in this post-genomic era. PMID:17716377
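The linear-encoding idea behind SARST can be sketched as follows: map each residue's (phi, psi) backbone dihedrals to the nearest of a set of Ramachandran-map cluster centres and emit one letter per residue, yielding a string that ordinary sequence alignment can process. The cluster centres below are illustrative placeholders, not SARST's actual codebook.

```python
import math

# Hedged sketch: Ramachandran-code encoding of a backbone dihedral trace.
# Centres are rough, hypothetical region centres, not SARST's codebook.
CENTRES = {
    "H": (-60.0, -45.0),   # roughly alpha-helical region
    "E": (-120.0, 130.0),  # roughly beta-sheet region
    "L": (60.0, 45.0),     # left-handed helix region
    "P": (-75.0, 150.0),   # polyproline-like region
}

def _angular_dist(a, b):
    """Distance between two angles in degrees, wrapping at +/-180."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def encode(dihedrals):
    """Encode a list of (phi, psi) pairs as a Ramachandran-code string."""
    out = []
    for phi, psi in dihedrals:
        code = min(
            CENTRES,
            key=lambda c: math.hypot(
                _angular_dist(phi, CENTRES[c][0]),
                _angular_dist(psi, CENTRES[c][1]),
            ),
        )
        out.append(code)
    return "".join(out)
```

The resulting strings can then be fed to standard sequence-alignment tools, with a substitution matrix derived from the clustering rather than from amino-acid exchange statistics.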

  14. High throughput SNP discovery and genotyping in hexaploid wheat.

    PubMed

    Rimbert, Hélène; Darrier, Benoît; Navarro, Julien; Kitt, Jonathan; Choulet, Frédéric; Leveugle, Magalie; Duarte, Jorge; Rivière, Nathalie; Eversole, Kellye; Le Gouis, Jacques; Davassi, Alessandro; Balfourier, François; Le Paslier, Marie-Christine; Berard, Aurélie; Brunel, Dominique; Feuillet, Catherine; Poncet, Charles; Sourdille, Pierre; Paux, Etienne

    2018-01-01

    Because of their abundance and their amenability to high-throughput genotyping techniques, Single Nucleotide Polymorphisms (SNPs) are powerful tools for efficient genetics and genomics studies, including characterization of genetic resources, genome-wide association studies, and genomic selection. In wheat, most previous SNP discovery initiatives targeted the coding fraction, leaving almost 98% of the wheat genome largely unexploited. Here we report on the use of whole-genome resequencing data from eight wheat lines to mine for SNPs in the genic, repetitive, and non-repetitive intergenic fractions of the wheat genome. In total, we identified 3.3 million SNPs, 49% located on the B-genome, 41% on the A-genome, and 10% on the D-genome. We also describe the development of the TaBW280K high-throughput genotyping array containing 280,226 SNPs. Performance of this chip was examined by genotyping a set of 96 wheat accessions representing worldwide diversity. Sixty-nine percent of the SNPs can be efficiently scored, half of them showing a diploid-like clustering. The TaBW280K array proved to be a very efficient tool for diversity analyses, as well as for breeding, as it can discriminate between closely related elite varieties. Finally, the TaBW280K array was used to genotype a population derived from a cross between Chinese Spring and Renan, leading to the construction of a dense genetic map comprising 83,721 markers. The results described here will provide the wheat community with powerful tools for both basic and applied research.

  15. A GIS-based tool for estimating tree canopy cover on fixed-radius plots using high-resolution aerial imagery

    Treesearch

    Sara A. Goeking; Greg C. Liknes; Erik Lindblom; John Chase; Dennis M. Jacobs; Robert. Benton

    2012-01-01

    Recent changes to the Forest Inventory and Analysis (FIA) Program's definition of forest land precipitated the development of a geographic information system (GIS)-based tool for efficiently estimating tree canopy cover for all FIA plots. The FIA definition of forest land has shifted from a density-related criterion based on stocking to a 10 percent tree canopy...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lherbier, Louis, W.; Novotnak, David, J.; Herling, Darrell, R.

    Hot forming processes such as forging, die casting, and glass forming require tooling that is subjected to high temperatures during the manufacture of components. Current tooling is adversely affected by prolonged exposure at high temperatures. Initial studies were conducted to determine the root cause of tool failures in a number of applications. Results show that tool failures vary and depend on the operating environment in which the tools are used. Major root-cause failures include (1) thermal softening, (2) fatigue, and (3) tool erosion, all of which are affected by process boundary conditions such as lubrication, cooling, and process speed. While thermal management is key to addressing tooling failures, it was clear that new tooling materials with superior high-temperature strength could provide improved manufacturing efficiencies. These efficiencies are based on the use of functionally graded materials (FGM), a new subset of hybrid tools with customizable properties that can be fabricated using advanced powder metallurgy manufacturing technologies. Modeling studies of the various hot forming processes helped identify the effects of key variables such as stress, temperature, and cooling rate, and aided in the selection of tooling materials for specific applications. To address the problem of high-temperature strength, several advanced powder metallurgy nickel- and cobalt-based alloys were selected for evaluation. These materials were manufactured into tooling using two relatively new consolidation processes: laser powder deposition (LPD) and solid-state dynamic powder consolidation (SSDPC). These processes made possible functionally graded materials that resulted in shaped tooling that was monolithic, bi-metallic, or substrate-coated. Manufacturing of tooling with these processes was determined to be robust and consistent for a variety of materials. Prototype and production testing of FGM tooling showed the benefits of the nickel- and cobalt-based powder metallurgy alloys in a number of the applications evaluated. Improvements in tool life ranged from three to twenty or more times that of currently used tooling. Improvements were most dramatic where tool softening and deformation were the major causes of tool failure in hot/warm forging applications. Significant improvement was also noted in erosion of aluminum die casting tooling. Cost and energy savings can be realized as a result of increased tooling life, increased productivity, and a reduction in scrap because of improved dimensional controls. Although LPD and SSDPC tooling usually have higher acquisition costs, net tooling cost per component produced drops dramatically with superior tool performance. Less energy is used to manufacture the tooling because fewer tools are required and less recycling of used tools is needed for the hot forming process. Energy is saved during the component manufacturing cycle because more parts can be produced in shorter periods of time. Energy is also saved by minimizing heating-furnace idling time because of less downtime for tooling changes.

  17. Inductively coupled plasma mass spectrometry (ICP MS): a versatile tool.

    PubMed

    Ammann, Adrian A

    2007-04-01

    Inductively coupled plasma (ICP) mass spectrometry (MS) is routinely used in many diverse research fields such as earth, environmental, life and forensic sciences and in food, material, chemical, semiconductor and nuclear industries. The high ion density and the high temperature in a plasma provide an ideal atomizer and element ionizer for all types of samples and matrices introduced by a variety of specialized devices. Outstanding properties such as high sensitivity (ppt-ppq), relative salt tolerance, compound-independent element response and highest quantitation accuracy lead to the unchallenged performance of ICP MS in efficiently detecting, identifying and reliably quantifying trace elements. The increasing availability of relevant reference compounds and high separation selectivity extend the molecular identification capability of ICP MS hyphenated to species-specific separation techniques. While molecular ion source MS is specialized in determining the structure of unknown molecules, ICP MS is an efficient and highly sensitive tool for target-element orientated discoveries of relevant and unknown compounds. This special-feature, tutorial article presents the principle and advantages of ICP MS, highlighting these using examples from recently published investigations. Copyright 2007 John Wiley & Sons, Ltd.

  18. Fundamental Studies and Development of III-N Visible LEDs for High-Power Solid-State Lighting Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dupuis, Russell

The goal of this program is to understand in a fundamental way the impact of strain, defects, polarization, and Stokes loss, in relation to unique device structures, upon the internal quantum efficiency (IQE) and efficiency droop (ED) of III-nitride (III-N) light-emitting diodes (LEDs), and to employ this understanding in the design and growth of high-efficiency LEDs capable of highly-reliable, high-current, high-power operation. This knowledge will be the basis for our advanced device epitaxial designs that lead to improved device performance. The primary approach we will employ is to exploit new scientific and engineering knowledge generated through the application of a set of unique advanced growth and characterization tools to develop new concepts in strain-, polarization-, and carrier dynamics-engineered and low-defect materials and device designs having reduced dislocations and improved carrier collection followed by efficient photon generation. We studied the effects of crystalline defects, polarization, hole transport, electron spillover, the electron blocking layer, and the underlying layer below the multiple-quantum-well active region, and developed high-efficiency, efficiency-droop-mitigated blue LEDs with new LED epitaxial structures. We believe the new LEDs developed in this program will enable a breakthrough in the development of high-efficiency, high-power visible III-N LEDs covering the violet to green spectral region.

  19. Toward Genome-Based Metabolic Engineering in Bacteria.

    PubMed

    Oesterle, Sabine; Wuethrich, Irene; Panke, Sven

    2017-01-01

    Prokaryotes modified stably on the genome are of great importance for production of fine and commodity chemicals. Traditional methods for genome engineering have long suffered from imprecision and low efficiencies, making construction of suitable high-producer strains laborious. Here, we review the recent advances in discovery and refinement of molecular precision engineering tools for genome-based metabolic engineering in bacteria for chemical production, with focus on the λ-Red recombineering and the clustered regularly interspaced short palindromic repeats/Cas9 nuclease systems. In conjunction, they enable the integration of in vitro-synthesized DNA segments into specified locations on the chromosome and allow for enrichment of rare mutants by elimination of unmodified wild-type cells. Combination with concurrently developing improvements in important accessory technologies such as DNA synthesis, high-throughput screening methods, regulatory element design, and metabolic pathway optimization tools has resulted in novel efficient microbial producer strains and given access to new metabolic products. These new tools have made and will likely continue to make a big impact on the bioengineering strategies that transform the chemical industry. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitanidis, Peter

As large-scale, commercial storage projects become operational, the problem of utilizing information from diverse sources becomes more critically important. In this project, we developed, tested, and applied an advanced joint data inversion system for CO2 storage modeling with large data sets for use in site characterization and real-time monitoring. Emphasis was on the development of advanced and efficient computational algorithms for joint inversion of hydro-geophysical data, coupled with state-of-the-art forward process simulations. The developed system consists of (1) inversion tools using characterization data, such as 3D seismic survey (amplitude images), borehole log and core data, as well as hydraulic, tracer and thermal tests before CO2 injection, (2) joint inversion tools for updating the geologic model with the distribution of rock properties, thus reducing uncertainty, using hydro-geophysical monitoring data, and (3) highly efficient algorithms for directly solving the dense or sparse linear algebra systems derived from the joint inversion. The system combines methods from stochastic analysis, fast linear algebra, and high performance computing. The developed joint inversion tools have been tested through synthetic CO2 storage examples.
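The core numerical step named in item (3) above, solving the linear algebra systems arising in inversion, can be sketched minimally. The snippet below is an illustrative Tikhonov-regularized least-squares inversion in NumPy, not the project's actual algorithm; the operator `G`, data `d`, and regularization weight are made-up stand-ins.

```python
import numpy as np

# Illustrative sketch only: a minimal regularized least-squares inversion,
# the core linear-algebra step in many joint-inversion workflows.
# G, d, and alpha below are hypothetical stand-ins, not project data.

def tikhonov_invert(G, d, alpha=0.1):
    """Solve min ||G m - d||^2 + alpha^2 ||m||^2 via the normal equations."""
    n = G.shape[1]
    A = G.T @ G + (alpha ** 2) * np.eye(n)
    return np.linalg.solve(A, G.T @ d)

# Tiny synthetic example: recover a 3-parameter model from noisy data.
rng = np.random.default_rng(0)
G = rng.normal(size=(20, 3))
m_true = np.array([1.0, -2.0, 0.5])
d = G @ m_true + 0.01 * rng.normal(size=20)
m_est = tikhonov_invert(G, d, alpha=0.05)
print(np.round(m_est, 2))
```

Real systems of this kind are far larger and typically solved with matrix-free iterative or hierarchical methods rather than dense factorization.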

  1. IDEAL: Images Across Domains, Experiments, Algorithms and Learning

    NASA Astrophysics Data System (ADS)

    Ushizima, Daniela M.; Bale, Hrishikesh A.; Bethel, E. Wes; Ercius, Peter; Helms, Brett A.; Krishnan, Harinarayan; Grinberg, Lea T.; Haranczyk, Maciej; Macdowell, Alastair A.; Odziomek, Katarzyna; Parkinson, Dilworth Y.; Perciano, Talita; Ritchie, Robert O.; Yang, Chao

    2016-11-01

    Research across science domains is increasingly reliant on image-centric data. Software tools are in high demand to uncover relevant, but hidden, information in digital images, such as those coming from faster next generation high-throughput imaging platforms. The challenge is to analyze the data torrent generated by the advanced instruments efficiently, and provide insights such as measurements for decision-making. In this paper, we overview work performed by an interdisciplinary team of computational and materials scientists, aimed at designing software applications and coordinating research efforts connecting (1) emerging algorithms for dealing with large and complex datasets; (2) data analysis methods with emphasis in pattern recognition and machine learning; and (3) advances in evolving computer architectures. Engineering tools around these efforts accelerate the analyses of image-based recordings, improve reusability and reproducibility, scale scientific procedures by reducing time between experiments, increase efficiency, and open opportunities for more users of the imaging facilities. This paper describes our algorithms and software tools, showing results across image scales, demonstrating how our framework plays a role in improving image understanding for quality control of existent materials and discovery of new compounds.

  2. Live minimal path for interactive segmentation of medical images

    NASA Astrophysics Data System (ADS)

    Chartrand, Gabriel; Tang, An; Chav, Ramnada; Cresson, Thierry; Chantrel, Steeve; De Guise, Jacques A.

    2015-03-01

Medical image segmentation is nowadays required for medical device development and in a growing number of clinical and research applications. Since dedicated automatic segmentation methods are not always available, generic and efficient interactive tools can alleviate the burden of manual segmentation. In this paper we propose an interactive segmentation tool based on image warping and minimal path segmentation that is efficient for a wide variety of segmentation tasks. While the user roughly delineates the desired organ's boundary, a narrow band along the cursor's path is straightened, providing an ideal subspace for feature-aligned filtering and the minimal path algorithm. Once the segmentation is performed on the narrow band, the path is warped back onto the original image, precisely delineating the desired structure. This tool was found to have a highly intuitive dynamic behavior. It is especially robust against misleading edges and requires only coarse interaction from the user to achieve good precision. The proposed segmentation method was tested on 10 difficult liver segmentations on CT and MRI images, and the resulting 2D overlap Dice coefficient was 99% on average.
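The minimal path step described in this abstract is, at its core, a shortest-path search over a cost image derived from edge features. A minimal sketch, using Dijkstra's algorithm on a hypothetical 4-connected cost grid (not the authors' implementation):

```python
import heapq

# Illustrative sketch only: minimal-path extraction on a small cost grid
# via Dijkstra's algorithm, the kind of step run inside the straightened
# narrow band along the user's stroke. The grid values are hypothetical.

def minimal_path(cost, start, goal):
    """Return the lowest-cost 4-connected path from start to goal."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk back from goal to start to recover the path.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# A low-cost "valley" along the top row models a strong image edge.
cost = [[1, 1, 1, 1],
        [9, 9, 9, 1],
        [9, 9, 9, 1]]
print(minimal_path(cost, (0, 0), (2, 3)))
# → [(0, 0), (0, 1), (0, 2), (0, 3), (1, 3), (2, 3)]
```

In practice the cost at each pixel would be derived from image gradients so that the path snaps to organ boundaries.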

  3. The impact of a novel resident leadership training curriculum.

    PubMed

    Awad, Samir S; Hayley, Barbara; Fagan, Shawn P; Berger, David H; Brunicardi, F Charles

    2004-11-01

Today's complex health care environment coupled with the 80-hour workweek mandate has required that surgical resident team interactions evolve from a military command-and-control style to a collaborative leadership style. A novel educational curriculum was implemented with the objective of training residents to create and manage powerful teams through alignment, communication, and integrity, tools integral to practicing a collaborative leadership style while working 80 hours per week. Specific strategies were as follows: (1) to focus on quality of patient care and service while receiving a high education-to-service ratio, and (2) to maximize efficiency through time management. This article shows that leadership training as part of a resident curriculum can significantly increase a resident's view of leadership in the areas of alignment, communication, and integrity, tools previously shown in business models to be vital for effective and efficient teams. This curriculum, over the course of the surgical residency, can provide residents with the necessary tools to deliver efficient, quality care while working within the 80-hour workweek mandate in a more collaborative environment.

  4. Validation and Diagnostic Efficiency of the Mini-SPIN in Spanish-Speaking Adolescents

    PubMed Central

    Garcia-Lopez, LuisJoaquín; Moore, Harry T. A.

    2015-01-01

Objectives: Social Anxiety Disorder (SAD) is one of the most common mental disorders in adolescence. Many validated psychometric tools are available to diagnose individuals with SAD efficaciously. However, there is a demand for shortened self-report instruments that identify adolescents at risk of developing SAD. We validate the Mini-SPIN and its diagnostic efficiency in overcoming this problem in Spanish-speaking adolescents in Spain. Methods: The psychometric properties of the 3-item Mini-SPIN scale for adolescents were assessed in a community sample (study 1) and a clinical sample (study 2). Results: Study 1 consisted of 573 adolescents, and found the Mini-SPIN to have appropriate internal consistency and high construct validity. Study 2 consisted of 354 adolescents (147 participants diagnosed with SAD and 207 healthy controls). Data revealed that the Mini-SPIN has good internal consistency, high construct validity and adequate diagnostic efficiency. Conclusions: Our findings suggest that the Mini-SPIN has good psychometric properties in clinical adolescents, healthy controls and the general population, indicating that it can be used as a screening tool in Spanish-speaking adolescents. Cut-off scores are provided. PMID:26317695

  5. The future challenge for aeropropulsion

    NASA Technical Reports Server (NTRS)

    Rosen, Robert; Bowditch, David N.

    1992-01-01

    NASA's research in aeropropulsion is focused on improving the efficiency, capability, and environmental compatibility for all classes of future aircraft. The development of innovative concepts, and theoretical, experimental, and computational tools provide the knowledge base for continued propulsion system advances. Key enabling technologies include advances in internal fluid mechanics, structures, light-weight high-strength composite materials, and advanced sensors and controls. Recent emphasis has been on the development of advanced computational tools in internal fluid mechanics, structural mechanics, reacting flows, and computational chemistry. For subsonic transport applications, very high bypass ratio turbofans with increased engine pressure ratio are being investigated to increase fuel efficiency and reduce airport noise levels. In a joint supersonic cruise propulsion program with industry, the critical environmental concerns of emissions and community noise are being addressed. NASA is also providing key technologies for the National Aerospaceplane, and is studying propulsion systems that provide the capability for aircraft to accelerate to and cruise in the Mach 4-6 speed range. The combination of fundamental, component, and focused technology development underway at NASA will make possible dramatic advances in aeropropulsion efficiency and environmental compatibility for future aeronautical vehicles.

  6. Validation and Diagnostic Efficiency of the Mini-SPIN in Spanish-Speaking Adolescents.

    PubMed

    Garcia-Lopez, LuisJoaquín; Moore, Harry T A

    2015-01-01

Social Anxiety Disorder (SAD) is one of the most common mental disorders in adolescence. Many validated psychometric tools are available to diagnose individuals with SAD efficaciously. However, there is a demand for shortened self-report instruments that identify adolescents at risk of developing SAD. We validate the Mini-SPIN and its diagnostic efficiency in overcoming this problem in Spanish-speaking adolescents in Spain. The psychometric properties of the 3-item Mini-SPIN scale for adolescents were assessed in a community sample (study 1) and a clinical sample (study 2). Study 1 consisted of 573 adolescents, and found the Mini-SPIN to have appropriate internal consistency and high construct validity. Study 2 consisted of 354 adolescents (147 participants diagnosed with SAD and 207 healthy controls). Data revealed that the Mini-SPIN has good internal consistency, high construct validity and adequate diagnostic efficiency. Our findings suggest that the Mini-SPIN has good psychometric properties in clinical adolescents, healthy controls and the general population, indicating that it can be used as a screening tool in Spanish-speaking adolescents. Cut-off scores are provided.

  7. Advances in algal-prokaryotic wastewater treatment: A review of nitrogen transformations, reactor configurations and molecular tools.

    PubMed

    Wang, Meng; Keeley, Ryan; Zalivina, Nadezhda; Halfhide, Trina; Scott, Kathleen; Zhang, Qiong; van der Steen, Peter; Ergas, Sarina J

    2018-07-01

    The synergistic activity of algae and prokaryotic microorganisms can be used to improve the efficiency of biological wastewater treatment, particularly with regards to nitrogen removal. For example, algae can provide oxygen through photosynthesis needed for aerobic degradation of organic carbon and nitrification and harvested algal-prokaryotic biomass can be used to produce high value chemicals or biogas. Algal-prokaryotic consortia have been used to treat wastewater in different types of reactors, including waste stabilization ponds, high rate algal ponds and closed photobioreactors. This review addresses the current literature and identifies research gaps related to the following topics: 1) the complex interactions between algae and prokaryotes in wastewater treatment; 2) advances in bioreactor technologies that can achieve high nitrogen removal efficiencies in small reactor volumes, such as algal-prokaryotic biofilm reactors and enhanced algal-prokaryotic treatment systems (EAPS); 3) molecular tools that have expanded our understanding of the activities of algal and prokaryotic communities in wastewater treatment processes. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Energy efficient engine high-pressure turbine single crystal vane and blade fabrication technology report

    NASA Technical Reports Server (NTRS)

    Giamei, A. F.; Salkeld, R. W.; Hayes, C. W.

    1981-01-01

    The objective of the High-Pressure Turbine Fabrication Program was to demonstrate the application and feasibility of Pratt & Whitney Aircraft-developed two-piece, single crystal casting and bonding technology on the turbine blade and vane configurations required for the high-pressure turbine in the Energy Efficient Engine. During the first phase of the program, casting feasibility was demonstrated. Several blade and vane halves were made for the bonding trials, plus solid blades and vanes were successfully cast for materials evaluation tests. Specimens exhibited the required microstructure and chemical composition. Bonding feasibility was demonstrated in the second phase of the effort. Bonding yields of 75 percent for the vane and 30 percent for the blade were achieved, and methods for improving these yield percentages were identified. A bond process was established for PWA 1480 single crystal material which incorporated a transient liquid phase interlayer. Bond properties were substantiated and sensitivities determined. Tooling die materials were identified, and an advanced differential thermal expansion tooling concept was incorporated into the bond process.

  9. A RNA nanotechnology platform for a simultaneous two-in-one siRNA delivery and its application in synergistic RNAi therapy

    PubMed Central

    Jang, Mihue; Han, Hee Dong; Ahn, Hyung Jun

    2016-01-01

Incorporating multiple copies of two RNAi molecules into a single nanostructure in a precisely controlled manner can provide an efficient delivery tool to regulate multiple gene pathways that are mutually dependent. Here, we show an RNA nanotechnology platform for a two-in-one RNAi delivery system that contains two polymeric RNAi molecules within the same RNA nanoparticles, without the aid of polyelectrolyte condensation reagents. As our RNA nanoparticles lead to the simultaneous silencing of two targeted mRNAs whose biological functions are highly interdependent, combination therapy for multi-drug-resistant cancer cells, studied as a specific application of our two-in-one RNAi delivery system, demonstrates efficient synergistic effects for cancer therapy. Therefore, this RNA nanoparticle approach provides an efficient tool for simultaneous co-delivery of RNAi molecules in RNAi-based biomedical applications, and our current studies present an efficient strategy to overcome multi-drug resistance caused by malfunction of genes in chemotherapy. PMID:27562435

  10. Highly efficient gene inactivation by adenoviral CRISPR/Cas9 in human primary cells

    PubMed Central

    Tielen, Frans; Elstak, Edo; Benschop, Julian; Grimbergen, Max; Stallen, Jan; Janssen, Richard; van Marle, Andre; Essrich, Christian

    2017-01-01

    Phenotypic assays using human primary cells are highly valuable tools for target discovery and validation in drug discovery. Expression knockdown (KD) of such targets in these assays allows the investigation of their role in models of disease processes. Therefore, efficient and fast modes of protein KD in phenotypic assays are required. The CRISPR/Cas9 system has been shown to be a versatile and efficient means of gene inactivation in immortalized cell lines. Here we describe the use of adenoviral (AdV) CRISPR/Cas9 vectors for efficient gene inactivation in two human primary cell types, normal human lung fibroblasts and human bronchial epithelial cells. The effects of gene inactivation were studied in the TGF-β-induced fibroblast to myofibroblast transition assay (FMT) and the epithelial to mesenchymal transition assay (EMT), which are SMAD3 dependent and reflect pathogenic mechanisms observed in fibrosis. Co-transduction (co-TD) of AdV Cas9 with SMAD3-targeting guide RNAs (gRNAs) resulted in fast and efficient genome editing judged by insertion/deletion (indel) formation, as well as significant reduction of SMAD3 protein expression and nuclear translocation. This led to phenotypic changes downstream of SMAD3 inhibition, including substantially decreased alpha smooth muscle actin and fibronectin 1 expression, which are markers for FMT and EMT, respectively. A direct comparison between co-TD of separate Cas9 and gRNA AdV, versus TD with a single “all-in-one” Cas9/gRNA AdV, revealed that both methods achieve similar levels of indel formation. These data demonstrate that AdV CRISPR/Cas9 is a useful and efficient tool for protein KD in human primary cell phenotypic assays. The use of AdV CRISPR/Cas9 may offer significant advantages over the current existing tools and should enhance target discovery and validation opportunities. PMID:28800587

  11. A microwave applicator for uniform irradiation by circularly polarized waves in an anechoic chamber

    NASA Astrophysics Data System (ADS)

    Chiang, W. Y.; Wu, M. H.; Wu, K. L.; Lin, M. H.; Teng, H. H.; Tsai, Y. F.; Ko, C. C.; Yang, E. C.; Jiang, J. A.; Barnett, L. R.; Chu, K. R.

    2014-08-01

    Microwave applicators are widely employed for materials heating in scientific research and industrial applications, such as food processing, wood drying, ceramic sintering, chemical synthesis, waste treatment, and insect control. For the majority of microwave applicators, materials are heated in the standing waves of a resonant cavity, which can be highly efficient in energy consumption, but often lacks the field uniformity and controllability required for a scientific study. Here, we report a microwave applicator for rapid heating of small samples by highly uniform irradiation. It features an anechoic chamber, a 24-GHz microwave source, and a linear-to-circular polarization converter. With a rather low energy efficiency, such an applicator functions mainly as a research tool. This paper discusses the significance of its special features and describes the structure, in situ diagnostic tools, calculated and measured field patterns, and a preliminary heating test of the overall system.

  12. A microwave applicator for uniform irradiation by circularly polarized waves in an anechoic chamber.

    PubMed

    Chiang, W Y; Wu, M H; Wu, K L; Lin, M H; Teng, H H; Tsai, Y F; Ko, C C; Yang, E C; Jiang, J A; Barnett, L R; Chu, K R

    2014-08-01

    Microwave applicators are widely employed for materials heating in scientific research and industrial applications, such as food processing, wood drying, ceramic sintering, chemical synthesis, waste treatment, and insect control. For the majority of microwave applicators, materials are heated in the standing waves of a resonant cavity, which can be highly efficient in energy consumption, but often lacks the field uniformity and controllability required for a scientific study. Here, we report a microwave applicator for rapid heating of small samples by highly uniform irradiation. It features an anechoic chamber, a 24-GHz microwave source, and a linear-to-circular polarization converter. With a rather low energy efficiency, such an applicator functions mainly as a research tool. This paper discusses the significance of its special features and describes the structure, in situ diagnostic tools, calculated and measured field patterns, and a preliminary heating test of the overall system.

  13. Assessing motivation for work environment improvements: internal consistency, reliability and factorial structure.

    PubMed

    Hedlund, Ann; Ateg, Mattias; Andersson, Ing-Marie; Rosén, Gunnar

    2010-04-01

    Workers' motivation to actively take part in improvements to the work environment is assumed to be important for the efficiency of investments for that purpose. That gives rise to the need for a tool to measure this motivation. A questionnaire to measure motivation for improvements to the work environment has been designed. Internal consistency and test-retest reliability of the domains of the questionnaire have been measured, and the factorial structure has been explored, from the answers of 113 employees. The internal consistency is high (0.94), as well as the correlation for the total score (0.84). Three factors are identified accounting for 61.6% of the total variance. The questionnaire can be a useful tool in improving intervention methods. The expectation is that the tool can be useful, particularly with the aim of improving efficiency of companies' investments for work environment improvements. Copyright 2010 Elsevier Ltd. All rights reserved.
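Internal consistency figures like the 0.94 reported above are conventionally computed as Cronbach's alpha. A minimal sketch with a made-up response matrix (the study's data are not reproduced here):

```python
import numpy as np

# Illustrative sketch only: computing Cronbach's alpha, the internal
# consistency statistic of the kind reported for the questionnaire.
# The response matrix below is invented for demonstration.

def cronbach_alpha(items):
    """items: respondents x items matrix of questionnaire scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Five respondents answering three highly correlated items.
scores = [[4, 5, 4],
          [2, 2, 3],
          [5, 5, 5],
          [3, 3, 3],
          [1, 2, 1]]
print(round(cronbach_alpha(scores), 2))
# → 0.97
```

Alpha rises toward 1 when items vary together across respondents, which is why it is read as a measure of internal consistency.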

  14. Precision cut lung slices as an efficient tool for in vitro lung physio-pharmacotoxicology studies.

    PubMed

    Morin, Jean-Paul; Baste, Jean-Marc; Gay, Arnaud; Crochemore, Clément; Corbière, Cécile; Monteil, Christelle

    2013-01-01

1. We review the specific approaches for lung tissue slice preparation and incubation systems, and the research application fields in which lung slices have proved to be a very efficient alternative to animal experimentation for biomechanical, physiological, pharmacological and toxicological approaches. 2. Focus is placed on air-liquid interface dynamic organ culture systems that allow direct tissue exposure to complex aerosols and that best mimic in vivo lung tissue physiology. 3. A compilation of research applications in the fields of vascular and airway reactivity, mucociliary transport, polyamine transport, xenobiotic biotransformation, chemical toxicology and complex aerosols supports the concept that precision cut lung slices are a very efficient tool, maintaining highly differentiated functions similar to the in vivo lung when kept under dynamic organ culture. They have also been used successfully to assess lung gene transfer efficiency and lung viral infection efficiency, and to study tissue preservation media and tissue post-conditioning to optimize lung tissue viability before grafting. 4. Taken together, the reviewed studies point to a great interest in precision cut lung slices as an efficient and valuable alternative to in vivo lung experimentation.

  15. Electroforming of optical tooling in high-strength Ni-Co alloy

    NASA Astrophysics Data System (ADS)

    Stein, Berl

    2003-05-01

Plastic optics are often mass produced by injection, compression or injection-compression molding. Optical quality molds can be directly machined in appropriate materials (tool steels, electroless nickel, aluminum, etc.), but much greater cost efficiency can be achieved with electroformed mold inserts. Traditionally, electroforming of optical quality mold inserts has been carried out in nickel, a material much softer than tool steels, which, when hardened to 45-50 HRc, usually exhibit high wear resistance and long service life (hundreds of thousands of impressions per mold). Because of their low hardness (< 20 HRc), nickel molds can produce only tens of thousands of parts before they are scrapped due to wear or accidental damage. This drawback has prevented their wider usage in general plastic and optical mold making. Recently, NiCoForm has developed a proprietary Ni-Co electroforming bath combining the high strength and wear resistance of the alloy with the low stress and high replication fidelity typical of pure nickel electroforming. This paper will outline the approach to electroforming of optical quality tooling in low stress, high strength Ni-Co alloy and present several examples of electroformed NiColoy mold inserts.

  16. Efficiency and productivity change in the English National Health Service: can data envelopment analysis provide a robust and useful measure?

    PubMed

    Hollingsworth, Bruce; Parkin, David

    2003-10-01

    Several tools are available to health care organisations in England to measure efficiency, but these are widely reported to be unpopular and unusable. Moreover, they do not have a sound conceptual basis. This paper describes the development and evaluation of a user-friendly tool that organisations can use to measure their efficiency, based on the technique of data envelopment analysis (DEA), which has a firm basis in economic theory. Routine data from 57 providers and 14 purchasing organisations in one region of the English National Health Service (NHS) for 1994-1996 were used to create information on efficiency based on DEA. This was presented to them using guides that explained the information and how it was to be used. They were surveyed to elicit their views on current measures of efficiency and on the potential use of the DEA-based information. The DEA measure demonstrated considerable scope for improvements in health service efficiency. There was a very small improvement over time with larger changes in some hospitals than others. Overall, 80% of those surveyed gave high scores for the potential usefulness of the DEA-based measures compared with 9-45% for existing methods. The quality of presentation of the information was also consistently high. There is dissatisfaction with efficiency information currently available to the NHS. DEA produces potentially useful information, which is easy to use and can be easily explained to and understood by potential users. The next step would be the implementation, on a developmental basis, of a routine DEA-based information system.
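In the special case of one input and one output, the CCR efficiency score that DEA computes reduces to each unit's output/input ratio divided by the best observed ratio, which makes the idea easy to see without a solver. A minimal sketch with hypothetical provider figures (the general multi-input, multi-output case used in the study requires solving a linear program per unit and is not shown):

```python
# Illustrative sketch only: CCR data envelopment analysis in the
# single-input, single-output special case. Here each unit's efficiency
# is its output/input ratio scaled by the best ratio observed.
# The provider figures below are invented, not the NHS data.

def dea_single(inputs, outputs):
    """Return CCR efficiency scores for one-input, one-output units."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Four hypothetical providers: (staff hours, treated cases).
inputs = [100, 80, 120, 90]
outputs = [50, 48, 54, 36]
print([round(e, 2) for e in dea_single(inputs, outputs)])
# → [0.83, 1.0, 0.75, 0.67]
```

A score of 1.0 marks a unit on the efficiency frontier; scores below 1.0 indicate how much the unit's input could shrink while keeping its output, which is the kind of improvement scope the paper reports.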

  17. Learning Technology: Enhancing Learning in New Designs for the Comprehensive High School.

    ERIC Educational Resources Information Center

    Damyanovich, Mike; And Others

    Technology, directed to each of the parts that collectively give shape and direction to the school, should provide the critical mass necessary to realize the specifications for the New Designs for the Comprehensive High School project. Learners should have access to personal productivity tools that increase effectiveness and efficiency in the…

  18. From Ambiguities to Insights: Query-based Comparisons of High-Dimensional Data

    NASA Astrophysics Data System (ADS)

    Kowalski, Jeanne; Talbot, Conover; Tsai, Hua L.; Prasad, Nijaguna; Umbricht, Christopher; Zeiger, Martha A.

    2007-11-01

Genomic technologies will revolutionize drug discovery and development; that much is universally agreed upon. The high dimension of data from such technologies has challenged available data analytic methods; that much is apparent. To date, large-scale data repositories have not been utilized in ways that permit their wealth of information to be efficiently processed for knowledge, presumably due in large part to inadequate analytical tools for numerous comparisons of high-dimensional data. In candidate gene discovery, expression comparisons are often made between two features (e.g., cancerous versus normal), such that the enumeration of outcomes is manageable. With multiple features, the setting becomes more complex, involving comparisons of expression levels of tens of thousands of transcripts across hundreds of features. In this case, the number of outcomes, while enumerable, becomes rapidly large and unmanageable, and scientific inquiries become more abstract, such as "which one of these (compounds, stimuli, etc.) is not like the others?" We develop analytical tools that promote more extensive, efficient, and rigorous utilization of the public data resources generated by the massive support of genomic studies. Our work innovates by enabling access to such metadata with logically formulated scientific inquiries that define, compare and integrate query-comparison pair relations for analysis. We demonstrate our computational tool's potential to address an outstanding biomedical informatics issue: identifying reliable molecular markers in thyroid cancer. Our proposed query-based comparison (QBC) facilitates access to and efficient utilization of metadata through logically formed inquiries, organizing and comparing results from biotechnologies to address applications in biomedicine.

  19. High throughput SNP discovery and genotyping in hexaploid wheat

    PubMed Central

    Navarro, Julien; Kitt, Jonathan; Choulet, Frédéric; Leveugle, Magalie; Duarte, Jorge; Rivière, Nathalie; Eversole, Kellye; Le Gouis, Jacques; Davassi, Alessandro; Balfourier, François; Le Paslier, Marie-Christine; Berard, Aurélie; Brunel, Dominique; Feuillet, Catherine; Poncet, Charles; Sourdille, Pierre

    2018-01-01

Because of their abundance and their amenability to high-throughput genotyping techniques, Single Nucleotide Polymorphisms (SNPs) are powerful tools for efficient genetics and genomics studies, including characterization of genetic resources, genome-wide association studies and genomic selection. In wheat, most of the previous SNP discovery initiatives targeted the coding fraction, leaving almost 98% of the wheat genome largely unexploited. Here we report on the use of whole-genome resequencing data from eight wheat lines to mine for SNPs in the genic, the repetitive and non-repetitive intergenic fractions of the wheat genome. Eventually, we identified 3.3 million SNPs, 49% being located on the B-genome, 41% on the A-genome and 10% on the D-genome. We also describe the development of the TaBW280K high-throughput genotyping array containing 280,226 SNPs. Performance of this chip was examined by genotyping a set of 96 wheat accessions representing the worldwide diversity. Sixty-nine percent of the SNPs can be efficiently scored, half of them showing a diploid-like clustering. The TaBW280K was proven to be a very efficient tool for diversity analyses, as well as for breeding, as it can discriminate between closely related elite varieties. Finally, the TaBW280K array was used to genotype a population derived from a cross between Chinese Spring and Renan, leading to the construction of a dense genetic map comprising 83,721 markers. The results described here will provide the wheat community with powerful tools for both basic and applied research. PMID:29293495

  20. Chapter 22: Compressed Air Evaluation Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurnik, Charles W; Benton, Nathanael; Burns, Patrick

    Compressed-air systems are used widely throughout industry for many operations, including pneumatic tools, packaging and automation equipment, conveyors, and other industrial process operations. Compressed-air systems are defined as a group of subsystems composed of air compressors, air treatment equipment, controls, piping, pneumatic tools, pneumatically powered machinery, and process applications using compressed air. A compressed-air system has three primary functional subsystems: supply, distribution, and demand. Air compressors are the primary energy consumers in a compressed-air system and are the primary focus of this protocol. The two compressed-air energy efficiency measures specifically addressed in this protocol are: high-efficiency/variable speed drive (VSD) compressor replacing modulating, load/unload, or constant-speed compressor; and compressed-air leak survey and repairs. This protocol provides direction on how to reliably verify savings from these two measures using a consistent approach for each.
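
    To make the leak-repair measure concrete, the basic savings arithmetic (not the protocol's prescribed verification procedure; all figures are illustrative assumptions) multiplies the repaired leak airflow by the compressor's specific power and the annual operating hours:

```python
# Hypothetical sketch of leak-survey savings arithmetic. The 18 kW per
# 100 cfm specific power and the operating hours are invented numbers,
# not values from the protocol.

def annual_leak_savings_kwh(leak_cfm, specific_power_kw_per_100cfm,
                            annual_hours):
    """kWh/yr saved by repairing leaks totaling `leak_cfm` of airflow."""
    kw = leak_cfm / 100.0 * specific_power_kw_per_100cfm
    return kw * annual_hours

# e.g. 50 cfm of repaired leaks, 18 kW per 100 cfm, 6000 h/yr
savings = annual_leak_savings_kwh(50, 18.0, 6000)
```

    A real evaluation would also account for part-load compressor behavior, which is why the protocol distinguishes modulating, load/unload, and VSD control.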

  1. The manipulator tool state classification based on inertia forces analysis

    NASA Astrophysics Data System (ADS)

    Gierlak, Piotr

    2018-07-01

    In this article, we discuss the detection of damage to the cutting tool used in robotised light mechanical processing. Continuous monitoring of the state of the tool mounted in the tool holder of the robot is required to save time. The tool is a brush with ceramic fibres used for surface grinding. A typical example of damage to the brush is the breaking of fibres, resulting in a tool imbalance and vibrations at high rotational speed, e.g. during grinding. This also limits the operating surface of the tool and decreases the efficiency of processing. While an imbalanced tool is spinning, inertia forces arise that carry information about the balance of the tool. These forces can be measured using a force sensor located in the end-effector of the robot, allowing the damage to the brush to be assessed automatically, without operator involvement.
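
    One simple way to exploit the measured forces, sketched below, is to evaluate the force-signal amplitude at the spindle's rotation frequency: an imbalance produces a strong component there. This is only an illustrative detector on synthetic data, not the classifier used in the paper; the threshold is an assumption.

```python
# Hypothetical imbalance flag: single-bin DFT amplitude of the force
# signal at the rotation frequency f0. Signal, f0 and threshold are
# synthetic/illustrative, not from the paper.
import math

def amplitude_at_freq(signal, fs, f0):
    """Amplitude of `signal` (sampled at fs Hz) at frequency f0 Hz."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * f0 * k / fs)
             for k, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * f0 * k / fs)
             for k, s in enumerate(signal))
    return 2.0 * math.hypot(re, im) / n

fs, f0 = 1000.0, 50.0                 # sample rate, spindle frequency (Hz)
t = [k / fs for k in range(1000)]
balanced   = [0.1 * math.sin(2 * math.pi * 5 * x) for x in t]   # no 50 Hz part
imbalanced = [2.0 * math.sin(2 * math.pi * f0 * x) for x in t]  # strong 50 Hz

THRESHOLD = 1.0                       # hypothetical force amplitude limit
is_damaged = amplitude_at_freq(imbalanced, fs, f0) > THRESHOLD
```

    In practice the rotation frequency is known from the spindle speed, so a single-bin evaluation like this is cheaper than a full FFT.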

  2. The development of a new breast feeding assessment tool and the relationship with breast feeding self-efficacy

    PubMed Central

    Ingram, Jenny; Johnson, Debbie; Copeland, Marion; Churchill, Cathy; Taylor, Hazel

    2015-01-01

    Objective to develop a breast feeding assessment tool to facilitate improved targeting of optimum positioning and attachment advice and to describe the changes seen following the release of a tongue-tie. Design development and validation of the Bristol Breastfeeding Assessment Tool (BBAT) and correlation with breast feeding self-efficacy. Setting maternity hospital in South West England. Participants 218 breast feeds (160 mother–infant dyads); seven midwife assessors. Findings the tool has more explanation than other tools to remind those supporting breast-feeding women about the components of an efficient breast feed. There was good internal reliability for the final 4-item BBAT (Cronbach's alpha=0.668) and the midwives who used it showed a high correlation in the consistency of its use (ICC=0.782). Midwives were able to score a breast feed consistently using the BBAT and felt that it helped them with advice to mothers about improving positioning and attachment to make breast feeding less painful, particularly with a tongue-tied infant. The tool showed strong correlation with breast feeding self-efficacy, indicating that more efficient breast feeding technique is associated with increased confidence in breast feeding an infant. Conclusions the BBAT is a concise breast feeding assessment tool facilitating accurate, rapid breast feeding appraisal, and targeting breast feeding advice to mothers acquiring early breast feeding skills or for those experiencing problems with an older infant. Accurate assessment is essential to ensure enhanced breast feeding efficiency and increased maternal self-confidence. Implications for practice the BBAT could be used both clinically and in research to target advice to improve breast feeding efficacy. Further research is needed to establish its wider usefulness. PMID:25061006
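
    The internal-reliability figure quoted above (Cronbach's alpha) has a short standard computation. The sketch below shows that computation on invented item scores; it does not reproduce the study's data, and the 0-2 scoring of the four items is an assumption for illustration.

```python
# Cronbach's alpha for a k-item scale: alpha = k/(k-1) * (1 - sum of
# item variances / variance of total scores). Item scores are made up.

def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: one list of scores per item, aligned across respondents."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]   # total score per respondent
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# four hypothetical BBAT items scored 0-2 across five observed feeds
items = [
    [2, 1, 2, 0, 1],
    [2, 1, 2, 1, 1],
    [1, 1, 2, 0, 2],
    [2, 2, 2, 0, 1],
]
alpha = cronbach_alpha(items)
```
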

  3. The development of a new breast feeding assessment tool and the relationship with breast feeding self-efficacy.

    PubMed

    Ingram, Jenny; Johnson, Debbie; Copeland, Marion; Churchill, Cathy; Taylor, Hazel

    2015-01-01

    to develop a breast feeding assessment tool to facilitate improved targeting of optimum positioning and attachment advice and to describe the changes seen following the release of a tongue-tie. development and validation of the Bristol Breastfeeding Assessment Tool (BBAT) and correlation with breast feeding self-efficacy. maternity hospital in South West England. 218 breast feeds (160 mother-infant dyads); seven midwife assessors. the tool has more explanation than other tools to remind those supporting breast-feeding women about the components of an efficient breast feed. There was good internal reliability for the final 4-item BBAT (Cronbach's alpha=0.668) and the midwives who used it showed a high correlation in the consistency of its use (ICC=0.782). Midwives were able to score a breast feed consistently using the BBAT and felt that it helped them with advice to mothers about improving positioning and attachment to make breast feeding less painful, particularly with a tongue-tied infant. The tool showed strong correlation with breast feeding self-efficacy, indicating that more efficient breast feeding technique is associated with increased confidence in breast feeding an infant. the BBAT is a concise breast feeding assessment tool facilitating accurate, rapid breast feeding appraisal, and targeting breast feeding advice to mothers acquiring early breast feeding skills or for those experiencing problems with an older infant. Accurate assessment is essential to ensure enhanced breast feeding efficiency and increased maternal self-confidence. the BBAT could be used both clinically and in research to target advice to improve breast feeding efficacy. Further research is needed to establish its wider usefulness. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  4. 2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation

    NASA Technical Reports Server (NTRS)

    Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.

    2009-01-01

    A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. A collaboration in this project includes work by NASA research engineers, whereas CFD validation and flow physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL type aircraft is focusing on geometries that depend on advanced flow control technologies that include Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent/ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.

  5. Performance evaluation of inpatient service in Beijing: a horizontal comparison with risk adjustment based on Diagnosis Related Groups.

    PubMed

    Jian, Weiyan; Huang, Yinmin; Hu, Mu; Zhang, Xiumei

    2009-04-30

    Medical performance evaluation, which provides a basis for rational decision-making, is an important part of medical service research. Current progress with health services reform in China is far from satisfactory, and regulation is insufficient. To achieve better progress, an effective tool for evaluating medical performance needs to be established; this study attempted to develop such a tool for the Chinese context. Data were collected from the front pages of medical records (FPMR) of all large general public hospitals (21 hospitals) in the third and fourth quarters of 2007. Locally developed Diagnosis Related Groups (DRGs) were introduced as a tool for risk adjustment, and performance evaluation indicators were established: Charge Efficiency Index (CEI), Time Efficiency Index (TEI) and inpatient mortality of low-risk group cases (IMLRG), reflecting work efficiency and medical service quality respectively. Using these indicators, inpatient service performance was compared horizontally among hospitals. The Case-mix Index (CMI) was used to adjust the efficiency indices, producing adjusted CEI (aCEI) and adjusted TEI (aTEI). Poisson distribution analysis was used to test the statistical significance of the IMLRG differences between hospitals. Based on the aCEI, aTEI and IMLRG scores for the 21 hospitals, Hospitals A and C had relatively good overall performance because their medical charges were lower, length of stay (LOS) shorter and IMLRG smaller. The performance of Hospitals P and Q was the worst due to their relatively high charge level, long LOS and high IMLRG. Various performance problems also existed in the other hospitals. It is possible to develop an accurate and easy-to-run performance evaluation system using Case-Mix as the tool for risk adjustment, choosing indicators close to consumers and managers, and utilizing routine report forms as the basic information source. To keep such a system running effectively, it is necessary to improve the reliability of clinical information and the risk-adjustment ability of Case-Mix.
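
    The abstract does not give the exact adjustment formula; a common convention, assumed here purely for illustration, is to divide a raw efficiency index by the hospital's Case-mix Index so that hospitals treating more complex cases are not penalized. All index values below are invented.

```python
# Hypothetical sketch of case-mix adjustment: adjusted index = raw
# index / CMI (assumed formula, not stated in the paper). Lower is
# better for a charge efficiency index. Data are invented.

def adjusted_index(raw_index, cmi):
    """Adjust a raw efficiency index by the Case-mix Index (CMI)."""
    return raw_index / cmi

hospitals = {
    "A": {"cei": 0.90, "tei": 0.95, "cmi": 1.10},  # complex cases, low charges
    "P": {"cei": 1.30, "tei": 1.25, "cmi": 0.95},  # simpler cases, high charges
}
# rank hospitals by adjusted CEI, best (lowest) first
ranked = sorted(hospitals,
                key=lambda h: adjusted_index(hospitals[h]["cei"],
                                             hospitals[h]["cmi"]))
```
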

  6. UPIC + GO: Zeroing in on informative markers

    USDA-ARS?s Scientific Manuscript database

    Microsatellites/SSRs (simple sequence repeats) have become a powerful tool in genomic biology because of their broad range of applications and availability. An efficient method recently developed to generate microsatellite-enriched libraries used in combination with high throughput DNA pyrosequencin...

  7. LIVING SHORES GALLERY MX964015

    EPA Science Inventory

    An interactive computer kiosk will allow the Texas State Aquarium to deliver a considerable amount of information in an efficient and highly effective manner. Touch screen interactives have proven to be excellent teaching tools in the Aquarium's Jellies: Floating Phantoms galler...

  8. Biomimetics: using nature as an inspiring model for human innovation

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph

    2006-01-01

    The evolution of nature over 3.8 billion years has produced highly effective and power-efficient biological mechanisms. Imitating these mechanisms offers enormous potential for improving our lives and the tools we use.

  9. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in a dramatic increase in sequencing data, which, in turn, requires efficient management of computational resources such as computing time and memory, as well as rapid prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions, thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
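
    The "mathematical operations between sets of genomic regions" are operations like intersection, union, and subtraction over (chromosome, start, end) intervals. GenomicTools itself is a C++ toolkit; the sketch below is only a plain-Python illustration of one such operation with invented regions, using half-open interval semantics.

```python
# Illustrative region intersection over (chrom, start, end) triples
# with half-open [start, end) coordinates. Not the GenomicTools API.

def intersect(regions_a, regions_b):
    """Return the overlapping parts of regions in a against regions in b."""
    out = []
    for ca, sa, ea in regions_a:
        for cb, sb, eb in regions_b:
            # same chromosome and the half-open intervals overlap
            if ca == cb and sa < eb and sb < ea:
                out.append((ca, max(sa, sb), min(ea, eb)))
    return out

peaks = [("chr1", 100, 200), ("chr2", 50, 80)]
genes = [("chr1", 150, 300)]
overlaps = intersect(peaks, genes)
```

    Production tools avoid this quadratic scan by sorting regions and sweeping, which is part of how such platforms keep memory and time usage low on large datasets.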

  10. High sensitivity spectroscopic and thermal characterization of cooling efficiency for optical refrigeration materials

    NASA Astrophysics Data System (ADS)

    Melgaard, Seth D.; Seletskiy, Denis V.; Di Lieto, Alberto; Tonelli, Mauro; Sheik-Bahae, Mansoor

    2012-03-01

    Since the recent demonstration of cryogenic optical refrigeration, reliable tools for characterizing the cooling performance of candidate materials have been in high demand. We present an experimental apparatus that allows temperature- and wavelength-dependent characterization of a material's cooling efficiency, based on a highly sensitive spectral differencing technique, or two-band differential spectral metrology (2B-DSM). A first characterization of a 5% wt. ytterbium-doped YLF crystal showed quantitative agreement with the current laser cooling model and yielded a minimum achievable temperature (MAT) of 110 K. Other materials and ion concentrations are also investigated and reported here.

  11. Spectral and Concentration Sensitivity of Multijunction Solar Cells at High Temperature: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, Daniel J.; Steiner, Myles A.; Perl, Emmett E.

    2017-06-14

    We model the performance of two-junction solar cells at very high temperatures of ~400 degrees C and beyond for applications such as hybrid PV/solar-thermal power production, and identify areas in which the design and performance characteristics behave significantly differently than at more conventional near-room-temperature operating conditions. We show that high-temperature operation reduces the sensitivity of the cell efficiency to spectral content, but increases the sensitivity to concentration, both of which have implications for energy yield in terrestrial PV applications. For other high-temperature applications such as near-sun space missions, our findings indicate that concentration may be a useful tool to enhance cell efficiency.

  12. BEST Winery Guidebook: Benchmarking and Energy and Water SavingsTool for the Wine Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galitsky, Christina; Worrell, Ernst; Radspieler, Anthony

    2005-10-15

    Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. This lack of knowledge of energy efficiency opportunities is an important barrier to improving efficiency. Benchmarking has been demonstrated to help energy users understand their energy use and the potential for energy efficiency improvement, reducing the information barrier. In California, the wine making industry is not only one of the pillars of the economy; it is also a large energy consumer, with considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed an integrated benchmarking and self-assessment tool for the California wine industry called BEST (Benchmarking and Energy and water Savings Tool) Winery. BEST Winery enables a winery to compare its energy efficiency to a best practice winery, accounting for differences in product mix and other characteristics of the winery. The tool enables the user to evaluate the impact of implementing energy and water efficiency measures. The tool facilitates strategic planning of efficiency measures, based on the estimated impact of the measures, their costs and savings. BEST Winery is available as a software tool in an Excel environment. This report serves as background material, documenting assumptions and information on the included energy and water efficiency measures. It also serves as a user guide for the software package.
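
    The core benchmarking idea is a ratio of actual energy intensity to a best-practice intensity for a comparable product mix. The sketch below illustrates that ratio only; the real tool is an Excel model, and the intensity figures here are invented, not BEST Winery values.

```python
# Illustrative benchmarking ratio in the spirit of a best-practice
# comparison. All numbers are hypothetical, not from BEST Winery.

def energy_intensity(kwh, cases_produced):
    """Energy use per unit of production (kWh per case)."""
    return kwh / cases_produced

def benchmark_ratio(actual_kwh, cases, best_practice_kwh_per_case):
    """Ratio > 1 means more energy per case than the best-practice winery."""
    return energy_intensity(actual_kwh, cases) / best_practice_kwh_per_case

# e.g. 120,000 kWh for 10,000 cases vs an assumed 9 kWh/case benchmark
ratio = benchmark_ratio(120000, 10000, 9.0)
```

    A ratio above 1 flags headroom for efficiency measures, which the tool then helps prioritize by cost and savings.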

  13. LoRTE: Detecting transposon-induced genomic variants using low coverage PacBio long read sequences.

    PubMed

    Disdero, Eric; Filée, Jonathan

    2017-01-01

    Population genomic analysis of transposable elements has greatly benefited from recent advances in sequencing technologies. However, the short size of the reads and the propensity of transposable elements to nest in highly repeated regions of genomes limit the efficiency of bioinformatic tools when Illumina or 454 technologies are used. Fortunately, long read sequencing technologies generating read lengths that may span the entire length of full transposons are now available. However, existing TE population genomics software was not designed to handle long reads, and the development of new dedicated tools is needed. LoRTE is the first tool able to use PacBio long read sequences to identify transposon deletions and insertions between a reference genome and genomes of different strains or populations. Tested against simulated and genuine Drosophila melanogaster PacBio datasets, LoRTE appears to be a reliable and broadly applicable tool to study the dynamics and evolutionary impact of transposable elements using low coverage, long read sequences. LoRTE is an efficient and accurate tool to identify structural genomic variants caused by TE insertion or deletion. LoRTE is available for download at http://www.egce.cnrs-gif.fr/?p=6422.

  14. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    The coming deluge of genome data presents significant challenges: storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and sharing and retrieving data efficiently. Variability in data volume results in variable computing and storage requirements, so biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600

  15. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    The coming deluge of genome data presents significant challenges: storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and sharing and retrieving data efficiently. Variability in data volume results in variable computing and storage requirements, so biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Precision injection molding of freeform optics

    NASA Astrophysics Data System (ADS)

    Fang, Fengzhou; Zhang, Nan; Zhang, Xiaodong

    2016-08-01

    Precision injection molding is the most efficient mass-production technology for manufacturing plastic optics. Applications of plastic optics in the fields of imaging, illumination, and concentration demonstrate a variety of complex surface forms, developing from conventional plano and spherical surfaces to aspheric and freeform surfaces. These require high optical quality with high form accuracy and low residual stresses, which challenges both the machining of optical tool inserts and the precision injection molding process. The present paper reviews recent progress in mold tool machining and precision injection molding, with more emphasis on precision injection molding. The challenges and future development trends are also discussed.

  17. Application of dynamic milling in stainless steel processing

    NASA Astrophysics Data System (ADS)

    Shan, Wenju

    2017-09-01

    This paper introduces the setting of parameters for NC programming of stainless steel parts using dynamic milling. Stainless steel exhibits high plasticity and toughness, severe work hardening, large cutting forces, high temperatures in the cutting zone, and rapid tool wear; it is a difficult material to machine. Dynamic motion technology is the newest NC programming technology in the Mastercam software and represents an advanced machining approach. The tool path generated by dynamic motion technology is smoother, more efficient and more stable in the machining process. Dynamic motion technology is well suited to cutting hard-to-machine materials.

  18. From the Paper to the Tablet: On the Design of an AR-Based Tool for the Inspection of Pre-Fab Buildings. Preliminary Results of the SIRAE Project.

    PubMed

    Portalés, Cristina; Casas, Sergio; Gimeno, Jesús; Fernández, Marcos; Poza, Montse

    2018-04-19

    Energy-efficient Buildings (EeB) are demanded in today's construction, fulfilling the requirements for green cities. Pre-fab buildings, which are fully built as modules in factories, are a good example of this. Although this kind of building is quite new, in situ inspection is documented using traditional tools, mainly based on paper annotations; thus, the inspection process does not take advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project, which aims to provide an Augmented Reality (AR) tool that can seamlessly aid the regular processes of pre-fab building inspection to detect and eliminate possible quality and energy efficiency deviations. In this regard, we describe the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data-gathering, allowing inspectors to quickly manage it without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for massive inspection processes.

  19. From the Paper to the Tablet: On the Design of an AR-Based Tool for the Inspection of Pre-Fab Buildings. Preliminary Results of the SIRAE Project

    PubMed Central

    Fernández, Marcos; Poza, Montse

    2018-01-01

    Energy-efficient Buildings (EeB) are demanded in today's construction, fulfilling the requirements for green cities. Pre-fab buildings, which are fully built as modules in factories, are a good example of this. Although this kind of building is quite new, in situ inspection is documented using traditional tools, mainly based on paper annotations; thus, the inspection process does not take advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project, which aims to provide an Augmented Reality (AR) tool that can seamlessly aid the regular processes of pre-fab building inspection to detect and eliminate possible quality and energy efficiency deviations. In this regard, we describe the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data-gathering, allowing inspectors to quickly manage it without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for massive inspection processes. PMID:29671799

  20. Method for automation of tool preproduction

    NASA Astrophysics Data System (ADS)

    Rychkov, D. A.; Yanyushkin, A. S.; Lobanov, D. V.; Arkhipov, P. V.

    2018-03-01

    The primary objective of tool production is the creation or selection of a tool design that secures high process efficiency, tool availability and the quality of machined surfaces with minimal means and resources. Selecting the appropriate tool from the set of variants takes the personnel engaged in tool preparation considerable time. Software has been developed to solve this problem, which helps to create, systematize and carry out a comparative analysis of tool designs to identify the rational variant under given production conditions. Systematization and selection of the rational tool design are carried out in accordance with the developed modeling technology and comparative design analysis. Applying the software makes it possible to reduce the design period by 80-85% and obtain significant annual savings.

  1. System technology for laser-assisted milling with tool integrated optics

    NASA Astrophysics Data System (ADS)

    Hermani, Jan-Patrick; Emonts, Michael; Brecher, Christian

    2013-02-01

    High-strength metal alloys and ceramics offer huge potential for increased efficiency (e.g. in engine components for aerospace or components for gas turbines). However, mass application is still hampered by cost- and time-consuming end-machining due to long processing times and high tool wear. Laser-induced heating immediately before machining can reduce the material strength and significantly improve machinability. The Fraunhofer IPT has developed and successfully realized a new approach for laser-assisted milling with spindle- and tool-integrated, co-rotating optics. The novel optical system inside the tool consists of one deflection prism to position the laser spot in front of the cutting insert and one focusing lens. Using a fiber laser with high beam quality, the laser spot diameter can be precisely adjusted to the chip size. A highly dynamic adaptation of the laser power signal according to the engagement condition of the cutting tool was realized so that already-machined workpiece material is not irradiated. During tool engagement the laser power is controlled in proportion to the current material removal rate, which has to be calculated continuously. The required geometric values are generated by a CAD/CAM program and converted into a laser power signal by a real-time controller. The developed milling tool with integrated optics and the algorithm for laser power control enable multi-axis laser-assisted machining of complex parts.
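
    The proportional control idea described above (laser power tracking the instantaneous material removal rate, with zero power outside engagement) can be sketched as follows. The gain and power limit are invented for illustration and are not values from the paper.

```python
# Hypothetical sketch: laser power command proportional to the current
# material removal rate (MRR), clipped to the laser's power range.
# Gain and maximum power are illustrative assumptions.

def laser_power(mrr_mm3_s, gain_w_per_mm3_s=8.0, p_max_w=500.0):
    """Laser power (W) proportional to MRR, clipped to [0, p_max_w]."""
    return max(0.0, min(p_max_w, gain_w_per_mm3_s * mrr_mm3_s))

# outside engagement MRR is zero, so already-machined material is not
# irradiated; at high MRR the command saturates at the laser limit
powers = [laser_power(m) for m in (0.0, 20.0, 100.0)]
```

    In the real system the MRR comes from CAD/CAM geometry evaluated by a real-time controller, so the same clipping and proportionality would run at the controller's sample rate.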

  2. High-power Broadband Organic THz Generator

    PubMed Central

    Jeong, Jae-Hyeok; Kang, Bong-Joo; Kim, Ji-Soo; Jazbinsek, Mojca; Lee, Seung-Heon; Lee, Seung-Chul; Baek, In-Hyung; Yun, Hoseop; Kim, Jongtaek; Lee, Yoon Sup; Lee, Jae-Hyeok; Kim, Jae-Ho; Rotermund, Fabian; Kwon, O-Pil

    2013-01-01

    The high-power broadband terahertz (THz) generator is an essential tool for a wide range of THz applications. Here, we present a novel highly efficient electro-optic quinolinium single crystal for THz wave generation. For obtaining intense and broadband THz waves by optical-to-THz frequency conversion, a quinolinium crystal was developed to fulfill all the requirements, which are in general extremely difficult to maintain simultaneously in a single medium, such as a large macroscopic electro-optic response and excellent crystal characteristics including a large crystal size with desired facets, good environmental stability, high optical quality, wide transparency range, and controllable crystal thickness. Compared to the benchmark inorganic and organic crystals, the new quinolinium crystal possesses excellent crystal properties and THz generation characteristics with broader THz spectral coverage and higher THz conversion efficiency at the technologically important pump wavelength of 800 nm. Therefore, the quinolinium crystal offers great potential for efficient and gap-free broadband THz wave generation. PMID:24220234

  3. High-power broadband organic THz generator.

    PubMed

    Jeong, Jae-Hyeok; Kang, Bong-Joo; Kim, Ji-Soo; Jazbinsek, Mojca; Lee, Seung-Heon; Lee, Seung-Chul; Baek, In-Hyung; Yun, Hoseop; Kim, Jongtaek; Lee, Yoon Sup; Lee, Jae-Hyeok; Kim, Jae-Ho; Rotermund, Fabian; Kwon, O-Pil

    2013-11-13

    The high-power broadband terahertz (THz) generator is an essential tool for a wide range of THz applications. Here, we present a novel highly efficient electro-optic quinolinium single crystal for THz wave generation. For obtaining intense and broadband THz waves by optical-to-THz frequency conversion, a quinolinium crystal was developed to fulfill all the requirements, which are in general extremely difficult to maintain simultaneously in a single medium, such as a large macroscopic electro-optic response and excellent crystal characteristics including a large crystal size with desired facets, good environmental stability, high optical quality, wide transparency range, and controllable crystal thickness. Compared to the benchmark inorganic and organic crystals, the new quinolinium crystal possesses excellent crystal properties and THz generation characteristics with broader THz spectral coverage and higher THz conversion efficiency at the technologically important pump wavelength of 800 nm. Therefore, the quinolinium crystal offers great potential for efficient and gap-free broadband THz wave generation.

  4. High-efficiency machining methods for aviation materials

    NASA Astrophysics Data System (ADS)

    Kononov, V. K.

    1991-07-01

    The papers contained in this volume present results of theoretical and experimental studies aimed at increasing the efficiency of cutting tools during the machining of high-temperature materials and titanium alloys. Specific topics discussed include a study of the performance of disk cutters during the machining of flexible parts of a high-temperature alloy, VZhL14N; a study of the wear resistance of cutters of hard alloys of various types; effect of a deformed electric field on the precision of the electrochemical machining of gas turbine engine components; and efficient machining of parts of composite materials. The discussion also covers the effect of the technological process structure on the residual stress distribution in the blades of gas turbine engines; modeling of the multiparameter assembly of engineering products for a specified priority of geometrical output parameters; and a study of the quality of the surface and surface layer of specimens machined by a high-temperature pulsed plasma.

  5. Improved injection needles facilitate germline transformation of the buckeye butterfly Junonia coenia.

    PubMed

    Beaudette, Kahlia; Hughes, Tia M; Marcus, Jeffrey M

    2014-01-01

Germline transformation with transposon vectors is an important tool for insect genetics, but progress in developing transformation protocols for butterflies has been limited by high post-injection ova mortality. Here we present an improved glass injection needle design for injecting butterfly ova that increases survival in three Nymphalid butterfly species. When the needles were used to genetically transform the common buckeye butterfly Junonia coenia, the hatch rate for injected ova was 21.7%, the transformation rate was 3%, and the overall experimental efficiency was 0.327%, a substantial improvement over previous results in other butterfly species. The improved needle design and higher transformation efficiency should permit the deployment of transposon-based genetic tools in a broad range of less fecund lepidopteran species.

  6. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.
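The kind of adaptivity described above, concentrating samples where a stochastic response loses smoothness, can be illustrated with a minimal one-dimensional sketch (an illustrative stand-in, not the project's actual algorithms): any interval whose endpoint values jump by more than a tolerance is bisected, so sample points cluster around discontinuities while smooth regions stay coarse.

```python
def adaptive_sample(f, a, b, tol=0.1, max_pts=200):
    # Bisect any interval whose endpoint values jump by more than
    # tol, so points concentrate near discontinuities of f.
    xs = [a, (a + b) / 2, b]
    ys = [f(x) for x in xs]
    changed = True
    while changed and len(xs) < max_pts:
        changed = False
        i = 0
        while i < len(xs) - 1 and len(xs) < max_pts:
            wide = xs[i + 1] - xs[i] > 1e-6
            if wide and abs(ys[i + 1] - ys[i]) > tol:
                xm = (xs[i] + xs[i + 1]) / 2
                xs.insert(i + 1, xm)
                ys.insert(i + 1, f(xm))
                changed = True
                i += 2  # skip past the newly inserted point
            else:
                i += 1
    return xs, ys

# Step response with a discontinuity at x = 0.3
xs, ys = adaptive_sample(lambda x: 0.0 if x < 0.3 else 1.0, 0.0, 1.0)
```

For this step function, nearly all refinement lands in the ever-narrower interval bracketing x = 0.3, while the smooth regions keep their coarse spacing.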

  7. A study with ESI PAM-STAMP® on the influence of tool deformation on final part quality during a forming process

    NASA Astrophysics Data System (ADS)

    Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David

    2018-05-01

As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. Due to the high pressing forces, deformations occur in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially impacting the final springback behavior. As a result, the tool deformations can affect the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs of up to 30% of the entire product development cost. To investigate sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of tool deformations on the final part quality is generally neglected, and simulations are carried out with rigid tools to avoid drastically increased calculation times. If the tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped on the die structure and a static analysis is performed to check the deflections of the tool. This method, however, does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently so that tool deformations can be included in real time in the drawing simulation without a large increase in simulation time compared to simulations with rigid tools. In this paper a study is presented that demonstrates the effect of tool deformations on the final part quality.
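The coupling idea can be reduced to a schematic fixed-point iteration (hypothetical stiffness and force numbers, not ESI's solver): a forming step updates the contact force for the current tool deflection, a structural step updates the deflection for that force, and the loop repeats until the two models agree.

```python
def solve_coupled(f0, k_tool, c, tol=1e-10, max_iter=100):
    # Alternate between a 'forming' update (contact force relaxes
    # with tool deflection: F = f0 - c * d) and a 'structural'
    # update (deflection d = F / k_tool) until they agree.
    d = 0.0
    for _ in range(max_iter):
        f = f0 - c * d        # forming solver: force at current deflection
        d_new = f / k_tool    # structural solver: tool deflection
        if abs(d_new - d) < tol:
            break
        d = d_new
    return d

# Converges to the closed-form coupled solution d = f0 / (k_tool + c).
d = solve_coupled(1e5, 5e7, 1e7)
```

The iteration converges geometrically here because the deflection sensitivity c is small relative to the tool stiffness k_tool; real coupled forming simulations face the same trade-off between coupling strength and iteration cost.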

  8. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house meshing generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  9. CMOST: an open-source framework for the microsimulation of colorectal cancer screening strategies.

    PubMed

    Prakash, Meher K; Lang, Brian; Heinrich, Henriette; Valli, Piero V; Bauerfeind, Peter; Sonnenberg, Amnon; Beerenwinkel, Niko; Misselwitz, Benjamin

    2017-06-05

Colorectal cancer (CRC) is a leading cause of cancer-related mortality. CRC incidence and mortality can be reduced by several screening strategies, including colonoscopy, but randomized CRC prevention trials face significant obstacles such as the need for large study populations with long follow-up. Therefore, CRC screening strategies will likely be designed and optimized based on computer simulations. Several computational microsimulation tools have been reported for estimating efficiency and cost-effectiveness of CRC prevention. However, none of these tools is publicly available. There is a need for an open source framework to answer practical questions including testing of new screening interventions and adapting findings to local conditions. We developed and implemented a new microsimulation model, Colon Modeling Open Source Tool (CMOST), for modeling the natural history of CRC, simulating the effects of CRC screening interventions, and calculating the resulting costs. CMOST facilitates automated parameter calibration against epidemiological adenoma prevalence and CRC incidence data. Predictions of CMOST were highly similar to the outcomes of a large endoscopic CRC prevention study as well as to the predictions of existing microsimulation models. We applied CMOST to calculate the optimal timing of a screening colonoscopy. CRC incidence and mortality are reduced most efficiently by a colonoscopy between the ages of 56 and 59, while discounted life years gained (LYG) is maximal at 49-50 years. With a dwell time of 13 years, the most cost-effective screening is at 59 years, at $17,211 discounted USD per LYG. While cost-efficiency varied according to dwell time, it did not influence the optimal time point of screening interventions within the tested range. Predictions of CMOST are highly similar to those of a randomized CRC prevention trial as well as those of other microsimulation tools.
This open source tool will enable health-economics analyses for various countries, health-care scenarios and CRC prevention strategies. CMOST is freely available under the GNU General Public License at https://gitlab.com/misselwb/CMOST.
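The logic of such a microsimulation can be sketched in a few lines (a deliberately toy model with made-up transition probabilities, not CMOST's calibrated natural history): each simulated person moves yearly between healthy, adenoma, and CRC states, and a screening colonoscopy at a fixed age removes a prevalent adenoma.

```python
import random

# Hypothetical annual transition probabilities (illustrative only,
# not CMOST's calibrated values).
P_ADENOMA = 0.02    # healthy -> adenoma
P_PROGRESS = 0.05   # adenoma -> CRC
SCREEN_AGE = 59     # age of the single screening colonoscopy

def simulate_person(screen):
    state = "healthy"
    for age in range(20, 80):
        if screen and age == SCREEN_AGE and state == "adenoma":
            state = "healthy"        # polypectomy removes the adenoma
        if state == "healthy" and random.random() < P_ADENOMA:
            state = "adenoma"
        elif state == "adenoma" and random.random() < P_PROGRESS:
            state = "crc"
    return state

def crc_incidence(screen, n=20000, seed=1):
    random.seed(seed)
    return sum(simulate_person(screen) == "crc" for _ in range(n)) / n
```

Comparing lifetime incidence with and without the screening intervention reproduces, in miniature, the kind of comparison CMOST makes with calibrated parameters, costs, and discounted life years at scale.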

  10. DOVIS 2.0: an efficient and easy to use parallel virtual screening tool based on AutoDock 4.0.

    PubMed

    Jiang, Xiaohui; Kumar, Kamal; Hu, Xin; Wallqvist, Anders; Reifman, Jaques

    2008-09-08

    Small-molecule docking is an important tool in studying receptor-ligand interactions and in identifying potential drug candidates. Previously, we developed a software tool (DOVIS) to perform large-scale virtual screening of small molecules in parallel on Linux clusters, using AutoDock 3.05 as the docking engine. DOVIS enables the seamless screening of millions of compounds on high-performance computing platforms. In this paper, we report significant advances in the software implementation of DOVIS 2.0, including enhanced screening capability, improved file system efficiency, and extended usability. To keep DOVIS up-to-date, we upgraded the software's docking engine to the more accurate AutoDock 4.0 code. We developed a new parallelization scheme to improve runtime efficiency and modified the AutoDock code to reduce excessive file operations during large-scale virtual screening jobs. We also implemented an algorithm to output docked ligands in an industry standard format, sd-file format, which can be easily interfaced with other modeling programs. Finally, we constructed a wrapper-script interface to enable automatic rescoring of docked ligands by arbitrarily selected third-party scoring programs. The significance of the new DOVIS 2.0 software compared with the previous version lies in its improved performance and usability. The new version makes the computation highly efficient by automating load balancing, significantly reducing excessive file operations by more than 95%, providing outputs that conform to industry standard sd-file format, and providing a general wrapper-script interface for rescoring of docked ligands. The new DOVIS 2.0 package is freely available to the public under the GNU General Public License.
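The load-balancing idea, feeding each idle worker the next ligand rather than pre-assigning fixed blocks, can be sketched as follows (a schematic with a stub scoring function; DOVIS itself dispatches AutoDock 4.0 jobs across a Linux cluster):

```python
from concurrent.futures import ThreadPoolExecutor

def dock(ligand):
    # Stand-in for a docking-engine call (DOVIS invokes AutoDock
    # here); returns (ligand_id, docking_score), lower = better.
    score = -(sum(ord(c) for c in ligand) % 17)
    return ligand, score

def screen(ligands, workers=4):
    # Dynamic load balancing: the executor hands each idle worker
    # the next ligand, so one slow docking cannot stall a whole
    # pre-assigned block of compounds.
    with ThreadPoolExecutor(max_workers=workers) as ex:
        results = list(ex.map(dock, ligands))
    return sorted(results, key=lambda r: r[1])   # best scores first

hits = screen(["lig%03d" % i for i in range(10)])
```

Dynamic dispatch is what "automating load balancing" buys: docking times vary widely per ligand, so pulling work item by item keeps all processes busy until the library is exhausted.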

  11. Micro-optical fabrication by ultraprecision diamond machining and precision molding

    NASA Astrophysics Data System (ADS)

    Li, Hui; Li, Likai; Naples, Neil J.; Roblee, Jeffrey W.; Yi, Allen Y.

    2017-06-01

Ultraprecision diamond machining combined with high-volume molding of affordable, high-precision, high-performance optical elements is becoming a viable process in the optical industry for low-cost, high-quality micro-optical component manufacturing. In this process, high-precision micro-optical molds are first fabricated using ultraprecision single-point diamond machining, followed by high-volume production methods such as compression or injection molding. In the last two decades, there have been steady improvements in ultraprecision machine design and performance, particularly with the introduction of both slow tool and fast tool servo. Today optical molds, including freeform surfaces and microlens arrays, are routinely diamond machined to final finish without post-machining polishing. For consumers, compression molding or injection molding provides efficient, high-quality optics at extremely low cost. In this paper, ultraprecision machine design and machining processes such as slow tool and fast tool servo are first described, and then both compression molding and injection molding of polymer optics are discussed. To implement precision optical manufacturing by molding, numerical modeling can be included in the future as a critical part of the manufacturing process to ensure high product quality.

  12. Efficient Use of Distributed Systems for Scientific Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Valerie; Chen, Jian; Canfield, Thomas; Richard, Jacques

    2000-01-01

Distributed computing has been regarded as the future of high performance computing. Nationwide high speed networks such as vBNS are becoming widely available to interconnect high-speed computers, virtual environments, scientific instruments and large data sets. One of the major issues to be addressed with distributed systems is the development of computational tools that facilitate the efficient execution of parallel applications on such systems. These tools must exploit the heterogeneous resources (networks and compute nodes) in distributed systems. This paper presents a tool, called PART, which addresses this issue for mesh partitioning. PART takes advantage of the following heterogeneous system features: (1) processor speed; (2) number of processors; (3) local network performance; and (4) wide area network performance. Further, different finite element applications under consideration may have different computational complexities, different communication patterns, and different element types, which also must be taken into consideration when partitioning. PART uses parallel simulated annealing to partition the domain, taking into consideration network and processor heterogeneity. The results of using PART for an explicit finite element application executing on two IBM SPs (located at Argonne National Laboratory and the San Diego Supercomputer Center) indicate an increase in efficiency of up to 36% as compared to METIS, a widely used mesh partitioning tool. The input to METIS was modified to take into consideration heterogeneous processor performance; METIS does not take into consideration heterogeneous networks. The execution times for these applications were reduced by up to 30% as compared to METIS. These results are given in Figure 1 for four irregular meshes with element counts ranging from 11,451 for the Barth4 mesh to 30,269 for the Barth5 mesh.
Future work with PART entails using the tool with an integrated application requiring distributed systems. In particular, this application, illustrated in the document, entails an integration of finite element and fluid dynamic simulations to address the cooling of turbine blades in a gas turbine engine design. It is not uncommon to encounter high-temperature, film-cooled turbine airfoils with millions of degrees of freedom. This results from the complexity of the various components of the airfoils, which requires fine-grain meshing for accuracy. Additional information is contained in the original.
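The heterogeneity-aware partitioning objective can be illustrated with a small simulated-annealing sketch (hypothetical cost weights; PART's actual cost model also accounts for local and wide-area network performance and per-application communication patterns): the cost is the slowest processor's compute time, load divided by speed, plus a penalty for cut edges.

```python
import math
import random

def partition_cost(assign, edges, speeds):
    # Estimated runtime: the slowest processor's compute time
    # (load / speed) plus a penalty for cut edges (communication).
    load = [0.0] * len(speeds)
    for p in assign:
        load[p] += 1.0
    compute = max(l / s for l, s in zip(load, speeds))
    cut = sum(1 for u, v in edges if assign[u] != assign[v])
    return compute + 0.1 * cut

def anneal(n_verts, edges, speeds, steps=5000, t0=1.0, seed=0):
    rng = random.Random(seed)
    assign = [rng.randrange(len(speeds)) for _ in range(n_verts)]
    cost = partition_cost(assign, edges, speeds)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9   # linear cooling schedule
        v = rng.randrange(n_verts)
        old = assign[v]
        assign[v] = rng.randrange(len(speeds))
        new = partition_cost(assign, edges, speeds)
        if new <= cost or rng.random() < math.exp((cost - new) / t):
            cost = new                        # accept the move
        else:
            assign[v] = old                   # reject and revert

    return assign, cost

# Ring 'mesh' of 12 elements, two processors, one twice as fast:
# the faster processor should end up with roughly twice the load.
edges = [(i, (i + 1) % 12) for i in range(12)]
assign, cost = anneal(12, edges, speeds=[2.0, 1.0])
```

With equal-speed processors this objective reduces to classical balanced partitioning; the speed terms are what let the annealer deliberately load the faster machine more heavily, as PART does at much larger scale.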

  13. Quality engineering tools focused on high power LED driver design using boost power stages in switch mode

    NASA Astrophysics Data System (ADS)

    Ileana, Ioan; Risteiu, Mircea; Marc, Gheorghe

    2016-12-01

This paper is part of our research dedicated to the design of high-power LED lamps. The selected boost topology is intended to meet driver manufacturers' requirements with respect to efficiency and disturbance constraints. In our work we used modeling and simulation tools to implement scenarios of driver operation while key control functions are executed (output voltage/current versus input voltage at a fixed switching frequency, input and output electric power transfer versus switching frequency, transient inductor voltage analysis, and transient output capacitor analysis). Some electrical and thermal stress conditions are also analyzed. Based on these aspects, a highly reliable power LED driver has been designed.
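The steady-state behavior that such simulations explore follows from two textbook continuous-conduction-mode boost relations, V_out = V_in / (1 - D) and ΔI_L = V_in · D / (L · f_sw). A small sketch with hypothetical component values (not the paper's actual design):

```python
def boost_design(v_in, v_out, f_sw, L):
    # Ideal CCM boost relations: V_out = V_in / (1 - D), so the
    # duty cycle is D = 1 - V_in / V_out; the inductor current
    # ripple is dI = V_in * D / (L * f_sw).
    d = 1.0 - v_in / v_out
    ripple = v_in * d / (L * f_sw)
    return d, ripple

# Hypothetical example: 12 V input boosted to 36 V for a LED string,
# 100 kHz switching frequency, 220 uH inductor.
d, ripple = boost_design(12.0, 36.0, 100e3, 220e-6)
```

Duty cycle and inductor ripple are the usual starting point before the transient and thermal stress analyses described above.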

  14. Efficient production of a gene mutant cell line through integrating TALENs and high-throughput cell cloning.

    PubMed

    Sun, Changhong; Fan, Yu; Li, Juan; Wang, Gancheng; Zhang, Hanshuo; Xi, Jianzhong Jeff

    2015-02-01

    Transcription activator-like effectors (TALEs) are becoming powerful DNA-targeting tools in a variety of mammalian cells and model organisms. However, generating a stable cell line with specific gene mutations in a simple and rapid manner remains a challenging task. Here, we report a new method to efficiently produce monoclonal cells using integrated TALE nuclease technology and a series of high-throughput cell cloning approaches. Following this method, we obtained three mTOR mutant 293T cell lines within 2 months, which included one homozygous mutant line. © 2014 Society for Laboratory Automation and Screening.

  15. Innovative and Advanced Coupled Neutron Transport and Thermal Hydraulic Method (Tool) for the Design, Analysis and Optimization of VHTR/NGNP Prismatic Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahnema, Farzad; Garimeela, Srinivas; Ougouag, Abderrafi

    2013-11-29

This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady- state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code’s versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport, with capabilities to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project’s primary objective is to advance the state of the art for reactor analysis.

  16. Optimized RNP transfection for highly efficient CRISPR/Cas9-mediated gene knockout in primary T cells.

    PubMed

    Seki, Akiko; Rutz, Sascha

    2018-03-05

CRISPR (clustered, regularly interspaced, short palindromic repeats)/Cas9 (CRISPR-associated protein 9) has become the tool of choice for generating gene knockouts across a variety of species. The ability for efficient gene editing in primary T cells not only represents a valuable research tool to study gene function but also holds great promise for T cell-based immunotherapies, such as next-generation chimeric antigen receptor (CAR) T cells. Previous attempts to apply CRISPR/Cas9 for gene editing in primary T cells have resulted in highly variable knockout efficiency and required T cell receptor (TCR) stimulation, thus largely precluding the study of genes involved in T cell activation or differentiation. Here, we describe an optimized approach for Cas9/RNP transfection of primary mouse and human T cells without TCR stimulation that results in near complete loss of target gene expression at the population level, mitigating the need for selection. We believe that this method will greatly extend the feasibility of target gene discovery and validation in primary T cells and simplify the gene editing process for next-generation immunotherapies. © 2018 Genentech.

  17. High-Tech Roof Management.

    ERIC Educational Resources Information Center

    Benzie, Tim

    1997-01-01

    Describes the use of a computerized roof management system (CRMS) for school districts to foster multiple roof maintenance efficiency and cost effectiveness. Highlights CRMS software manufacturer choices, as well as the types of nondestructive testing equipment tools that can be used to evaluate roof conditions. (GR)

  18. Growth and yield models for central hardwoods

    Treesearch

    Martin E. Dale; Donald E. Hilt

    1989-01-01

    Over the last 20 years computers have become an efficient tool to estimate growth and yield. Computerized yield estimates vary from simple approximation or interpolation of traditional normal yield tables to highly sophisticated programs that simulate the growth and yield of each individual tree.

  19. Science and technology in the stockpile stewardship program, S & TR reprints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Storm, E

This document reports on these topics: Computer Simulations in Support of National Security; Enhanced Surveillance of Aging Weapons; A New Precision Cutting Tool: The Femtosecond Laser; Superlasers as a Tool of Stockpile Stewardship; Nova Laser Experiments and Stockpile Stewardship; Transforming Explosive Art into Science; Better Flash Radiography Using the FXR; Preserving Nuclear Weapons Information; Site 300's New Contained Firing Facility; The Linear Electric Motor: Instability at 1,000 g's; A Powerful New Tool to Detect Clandestine Nuclear Tests; High Explosives in Stockpile Surveillance Indicate Constancy; Addressing a Cold War Legacy with a New Way to Produce TATB; Jumpin' Jupiter! Metallic Hydrogen; Keeping the Nuclear Stockpile Safe, Secure, and Reliable; The Multibeam Fabry-Perot Velocimeter: Efficient Measurements of High Velocities; Theory and Modeling in Material Science; The Diamond Anvil Cell; Gamma-Ray Imaging Spectrometry; X-Ray Lasers and High-Density Plasma

  20. Baculoviral delivery of CRISPR/Cas9 facilitates efficient genome editing in human cells

    PubMed Central

    Hindriksen, Sanne; Bramer, Arne J.; Truong, My Anh; Vromans, Martijn J. M.; Post, Jasmin B.; Verlaan-Klink, Ingrid; Snippert, Hugo J.; Lens, Susanne M. A.

    2017-01-01

The CRISPR/Cas9 system is a highly effective tool for genome editing. Key to robust genome editing is the efficient delivery of the CRISPR/Cas9 machinery. Viral delivery systems are efficient vehicles for the transduction of foreign genes, but commonly used viral vectors suffer from a limited capacity in the genetic information they can carry. Baculovirus, however, is capable of carrying large exogenous DNA fragments. Here we investigate the use of baculoviral vectors as a delivery vehicle for CRISPR/Cas9 based genome-editing tools. We demonstrate transduction of a panel of cell lines with Cas9 and an sgRNA sequence, which results in efficient knockout of all four targeted subunits of the chromosomal passenger complex (CPC). We further show that introduction of a homology directed repair template into the same CRISPR/Cas9 baculovirus facilitates introduction of specific point mutations and endogenous gene tags. Tagging of the CPC recruitment factor Haspin with the fluorescent reporter YFP allowed us to study its native localization as well as recruitment to the cohesin subunit Pds5B. PMID:28640891

  1. BLESS 2: accurate, memory-efficient and fast error correction method.

    PubMed

    Heo, Yun; Ramachandran, Anand; Hwu, Wen-Mei; Ma, Jian; Chen, Deming

    2016-08-01

The most important features of error correction tools for sequencing data are accuracy, memory efficiency and fast runtime. The previous version of BLESS was highly memory-efficient and accurate, but it was too slow to handle reads from large genomes. We have developed a new version of BLESS to improve runtime and accuracy while maintaining a small memory usage. The new version, called BLESS 2, has an error correction algorithm that is more accurate than BLESS, and the algorithm has been parallelized using hybrid MPI and OpenMP programming. BLESS 2 was compared with five top-performing tools, and it was found to be the fastest when it was executed on two computing nodes using MPI, with each node containing twelve cores. Also, BLESS 2 showed at least 11% higher gain while retaining the memory efficiency of the previous version for large genomes. Freely available at https://sourceforge.net/projects/bless-ec. Contact: dchen@illinois.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
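The core idea behind k-mer-spectrum error correctors like BLESS can be sketched compactly (an illustrative toy: BLESS classifies solid k-mers with a Bloom filter for memory efficiency, whereas this sketch uses a plain Python set): k-mers seen often enough across the reads are "solid", and an erroneous base is repaired by finding the single substitution that makes every k-mer in the read solid.

```python
from collections import Counter

def kmers(seq, k):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def correct_read(read, solid, k):
    # Try single-base substitutions, position by position, until
    # every k-mer in the read belongs to the solid set.
    read = list(read)
    for i in range(len(read)):
        if all(km in solid for km in kmers("".join(read), k)):
            break                      # read is already fully solid
        for base in "ACGT":
            trial = read[:]
            trial[i] = base
            if all(km in solid for km in kmers("".join(trial), k)):
                return "".join(trial)  # this substitution fixes it
    return "".join(read)

# Toy data: 20 error-free copies of the true sequence plus one read
# with a single substitution error at position 6 (G -> T).
truth = "ACGTACGGAC"
reads = [truth] * 20 + ["ACGTACTGAC"]
k = 4
counts = Counter(km for r in reads for km in kmers(r, k))
solid = {km for km, c in counts.items() if c >= 3}   # coverage cutoff
fixed = correct_read("ACGTACTGAC", solid, k)
```

Real correctors add heuristics for indels, read ends, and ambiguous fixes; the memory battle BLESS fights is exactly the size of that solid-k-mer membership structure for mammalian-scale genomes.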

  2. An Efficient, Rapid, and Recyclable System for CRISPR-Mediated Genome Editing in Candida albicans.

    PubMed

    Nguyen, Namkha; Quail, Morgan M F; Hernday, Aaron D

    2017-01-01

Candida albicans is the most common fungal pathogen of humans. Historically, molecular genetic analysis of this important pathogen has been hampered by the lack of stable plasmids or meiotic cell division, limited selectable markers, and inefficient methods for generating gene knockouts. The recent development of clustered regularly interspaced short palindromic repeat(s) (CRISPR)-based tools for use with C. albicans has opened the door to more efficient genome editing; however, previously reported systems have specific limitations. We report the development of an optimized CRISPR-based genome editing system for use with C. albicans. Our system is highly efficient, does not require molecular cloning, does not leave permanent markers in the genome, and supports rapid, precise genome editing in C. albicans. We also demonstrate the utility of our system for generating two independent homozygous gene knockouts in a single transformation and present a method for generating homozygous wild-type gene addbacks at the native locus. Furthermore, each step of our protocol is compatible with high-throughput strain engineering approaches, thus opening the door to the generation of a complete C. albicans gene knockout library. IMPORTANCE Candida albicans is the major fungal pathogen of humans and is the subject of intense biomedical and discovery research. Until recently, the pace of research in this field has been hampered by the lack of efficient methods for genome editing. We report the development of a highly efficient and flexible genome editing system for use with C. albicans. This system improves upon previously published C. albicans CRISPR systems and enables rapid, precise genome editing without the use of permanent markers. This new tool kit promises to expedite the pace of research on this important fungal pathogen.

  3. Ultrawidefield microscope for high-speed fluorescence imaging and targeted optogenetic stimulation.

    PubMed

    Werley, Christopher A; Chien, Miao-Ping; Cohen, Adam E

    2017-12-01

    The rapid increase in the number and quality of fluorescent reporters and optogenetic actuators has yielded a powerful set of tools for recording and controlling cellular state and function. To achieve the full benefit of these tools requires improved optical systems with high light collection efficiency, high spatial and temporal resolution, and patterned optical stimulation, in a wide field of view (FOV). Here we describe our 'Firefly' microscope, which achieves these goals in a Ø6 mm FOV. The Firefly optical system is optimized for simultaneous photostimulation and fluorescence imaging in cultured cells. All but one of the optical elements are commercially available, yet the microscope achieves 10-fold higher light collection efficiency at its design magnification than the comparable commercially available microscope using the same objective. The Firefly microscope enables all-optical electrophysiology ('Optopatch') in cultured neurons with a throughput and information content unmatched by other neuronal phenotyping systems. This capability opens possibilities in disease modeling and phenotypic drug screening. We also demonstrate applications of the system to voltage and calcium recordings in human induced pluripotent stem cell derived cardiomyocytes.

  4. Ultrawidefield microscope for high-speed fluorescence imaging and targeted optogenetic stimulation

    PubMed Central

    Werley, Christopher A.; Chien, Miao-Ping; Cohen, Adam E.

    2017-01-01

    The rapid increase in the number and quality of fluorescent reporters and optogenetic actuators has yielded a powerful set of tools for recording and controlling cellular state and function. To achieve the full benefit of these tools requires improved optical systems with high light collection efficiency, high spatial and temporal resolution, and patterned optical stimulation, in a wide field of view (FOV). Here we describe our ‘Firefly’ microscope, which achieves these goals in a Ø6 mm FOV. The Firefly optical system is optimized for simultaneous photostimulation and fluorescence imaging in cultured cells. All but one of the optical elements are commercially available, yet the microscope achieves 10-fold higher light collection efficiency at its design magnification than the comparable commercially available microscope using the same objective. The Firefly microscope enables all-optical electrophysiology (‘Optopatch’) in cultured neurons with a throughput and information content unmatched by other neuronal phenotyping systems. This capability opens possibilities in disease modeling and phenotypic drug screening. We also demonstrate applications of the system to voltage and calcium recordings in human induced pluripotent stem cell derived cardiomyocytes. PMID:29296505
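The light-collection argument behind such designs is geometric: an objective of numerical aperture NA collects the solid-angle fraction (1 − cos θ)/2 of isotropic emission, where sin θ = NA/n. A short sketch with hypothetical objective specs (the abstract does not give the Firefly objective's actual NA):

```python
import math

def collection_fraction(na, n_medium=1.0):
    # Fraction of isotropic emission collected by an objective:
    # the solid-angle fraction (1 - cos(theta)) / 2, where the
    # collection half-angle satisfies sin(theta) = NA / n.
    theta = math.asin(min(na / n_medium, 1.0))
    return (1.0 - math.cos(theta)) / 2.0

# Hypothetical comparison at fixed low magnification: a standard
# 4x/0.16 objective versus a specialty 4x/0.5 design.
low_na = collection_fraction(0.16)
high_na = collection_fraction(0.5)
ratio = high_na / low_na
```

A jump from NA 0.16 to NA 0.5 at the same magnification yields roughly a tenfold gain in collected photons, which is the order of the improvement reported for the Firefly system.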

  5. Investigation of the effects of process and geometrical parameters on formability in tube hydroforming using a modular hydroforming tool

    NASA Astrophysics Data System (ADS)

    Joghan, Hamed Dardaei; Staupendahl, Daniel; Hassan, Hamad ul; Henke, Andreas; Keesser, Thorsten; Legat, Francois; Tekkaya, A. Erman

    2018-05-01

Tube hydroforming is one of the most important manufacturing processes for the production of exhaust systems. It allows generating parts with highly complex geometries at the forming accuracies needed in the automotive sector, which is possible due to the form-closed nature of the production process. One of the main cost drivers is tool manufacturing, which is expensive and time consuming, especially when forming large parts. To cope with the design trend toward individuality, which is gaining more and more importance and leads to a high number of product variants, a new flexible tool design was developed. The designed tool offers high flexibility in manufacturing different tube shapes and geometries with only local alterations and relocation of tool segments. The tolerancing problems of state-of-the-art segmented tools are overcome by an innovative and flexible die holder design. The break-even point of this initially more expensive tool design is already reached when more than four different tube shapes are formed. Together with an additionally designed rotary hydraulic tube feeding system, a highly adaptable forming setup is obtained. To investigate the performance of the developed tool setup, a study on geometrical and process parameters during forming of a spherical dome was conducted. An austenitic stainless steel (grade 1.4301) tube with a diameter of 40 mm and a wall thickness of 1.5 mm was used for the investigations. The experimental analyses were supported by finite element simulations and statistical analyses. The results show that the flexible tool setup can efficiently be used to analyze the interaction of the inner pressure, friction, and the location of the spherical dome, and demonstrate the strong influence of the feeding rate on the formed part.

  6. Methods and tools to simulate the effect of economic instruments in complex water resources systems. Application to the Jucar river basin.

    NASA Astrophysics Data System (ADS)

    Lopez-Nicolas, Antonio; Pulido-Velazquez, Manuel

    2014-05-01

The main challenge of the BLUEPRINT to safeguard Europe's water resources (EC, 2012) is to guarantee that enough good quality water is available for people's needs, the economy and the environment. In this sense, economic policy instruments such as water pricing policies and water markets can be applied to enhance efficient use of water. This paper presents a method based on hydro-economic tools to assess the effect of economic instruments on water resource systems. Hydro-economic models allow integrated analysis of water supply, demand and infrastructure operation at the river basin scale, by simultaneously combining engineering, hydrologic and economic aspects of water resources management. The method makes use of the hydro-economic simulation and optimization tools SIMGAMS and OPTIGAMS. The simulation tool SIMGAMS allocates water resources among the users according to priorities and operating rules, and evaluates the economic scarcity costs of the system by using economic demand functions. The model's objective function is designed so that the system aims to meet the operational targets (ranked according to priorities) each month while following the system operating rules. The optimization tool OPTIGAMS allocates water resources based on an economic efficiency criterion: maximize net benefits or, alternatively, minimize the total water scarcity and operating costs of water use. SIMGAMS allows simulating incentive water pricing policies based on marginal resource opportunity costs (MROC; Pulido-Velazquez et al., 2013). Storage-dependent step pricing functions are derived from the time series of MROC values at a certain reservoir in the system. These water pricing policies are defined based on water availability in the system (scarcity pricing), so that when water storage is high, the MROC is low, while low storage (drought periods) is associated with high MROC and therefore high prices.
We also illustrate the use of OPTIGAMS to simulate the effect of ideal water markets by economic optimization, without considering the potential effect of transaction costs. These methods and tools have been applied to the Jucar River basin (Spain). The results show the potential of economic instruments in setting incentives for a more efficient management of water resources systems. Acknowledgments: The study has been partially supported by the European Community 7th Framework Project (GENESIS project, n. 226536), SAWARES (Plan Nacional I+D+i 2008-2011, CGL2009-13238-C02-01 and C02-02), SCARCE (Consolider-Ingenio 2010 CSD2009-00065) of the Spanish Ministry of Economy and Competitiveness; and EC 7th Framework Project ENHANCE (n. 308438) Reference: Pulido-Velazquez, M., Alvarez-Mendiola, E., and Andreu, J., 2013. Design of Efficient Water Pricing Policies Integrating Basinwide Resource Opportunity Costs. J. Water Resour. Plann. Manage., 139(5): 583-592.
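The storage-dependent step pricing idea described in the abstract can be sketched in a few lines. This is an illustrative toy, not the SIMGAMS implementation; the storage thresholds and prices below are hypothetical values, not results from the Jucar basin study.

```python
# Illustrative sketch of MROC-style scarcity pricing: low reservoir storage
# maps to a high marginal price, high storage to a low one. All thresholds
# and prices are hypothetical.

def scarcity_price(storage_fraction: float) -> float:
    """Return a water price (currency/m^3) from storage as a fraction of capacity."""
    steps = [  # (minimum storage fraction, price), highest storage first
        (0.75, 0.02),  # ample storage -> low marginal opportunity cost
        (0.50, 0.10),
        (0.25, 0.30),
        (0.00, 0.80),  # drought conditions -> high scarcity price
    ]
    for threshold, price in steps:
        if storage_fraction >= threshold:
            return price
    return steps[-1][1]
```

A drought month with 10% storage would thus be priced far above a wet month with 90% storage, which is the incentive structure the paper describes.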

  7. PageMan: an interactive ontology tool to generate, display, and annotate overview graphs for profiling experiments.

    PubMed

    Usadel, Björn; Nagel, Axel; Steinhauser, Dirk; Gibon, Yves; Bläsing, Oliver E; Redestig, Henning; Sreenivasulu, Nese; Krall, Leonard; Hannah, Matthew A; Poree, Fabien; Fernie, Alisdair R; Stitt, Mark

    2006-12-18

Microarray technology has become a widely accepted and standardized tool in biology. The first microarray data analysis programs were developed to support pair-wise comparisons. However, as microarray experiments have become more routine, large-scale experiments have become more common, investigating multiple time points or sets of mutants or transgenics. To extract biological information from such high-throughput expression data, it is necessary to develop efficient analytical platforms that combine manually curated gene ontologies with efficient visualization and navigation tools. Currently, most tools focus on a few limited biological aspects rather than offering a holistic, integrated analysis. Here we introduce PageMan, a multiplatform, user-friendly, stand-alone software tool that annotates, investigates, and condenses high-throughput microarray data in the context of functional ontologies. It includes a GUI tool to transform different ontologies into a suitable format, enabling the user to compare and choose between different ontologies. It is equipped with several statistical modules for data analysis, including over-representation analysis and Wilcoxon testing. Results are exported in a graphical format for direct use, or for further editing in graphics programs. PageMan provides a fast overview of single treatments and allows genome-level responses to be compared across several microarray experiments covering, for example, stress responses at multiple time points. This aids in searching for trait-specific changes in pathways using mutants or transgenics, analyzing developmental time-courses, and comparing between species.
In a case study, we analyze the results of publicly available microarrays of multiple cold stress experiments using PageMan and compare the results to a previously published meta-analysis. PageMan offers a complete user's guide, a web-based over-representation analysis, and a tutorial, and is freely available at http://mapman.mpimp-golm.mpg.de/pageman/. PageMan allows multiple microarray experiments to be efficiently condensed into a single-page graphical display. The flexible interface allows data to be quickly and easily visualized, facilitating comparisons within experiments and to published experiments, thus enabling researchers to gain a rapid overview of the biological responses in the experiments.
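The over-representation analysis mentioned in the abstract is conventionally a hypergeometric tail test: does a functional category appear in a gene list more often than chance predicts? A minimal stdlib sketch of that test follows; PageMan's own implementation and corrections (e.g. for multiple testing) may differ.

```python
# Hypergeometric over-representation test: probability of observing at least
# `hits` genes from a category of size `category_size` when drawing
# `list_size` genes from `total_genes` without replacement.
from math import comb

def overrepresentation_p(total_genes, category_size, list_size, hits):
    """Return P(X >= hits) for X ~ Hypergeometric(total_genes, category_size, list_size)."""
    denom = comb(total_genes, list_size)
    return sum(
        comb(category_size, k) * comb(total_genes - category_size, list_size - k)
        for k in range(hits, min(category_size, list_size) + 1)
    ) / denom
```

A small p-value indicates the category is enriched in the list beyond what random sampling would produce.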

  8. Got Graphs? An Assessment of Data Visualization Tools

    NASA Technical Reports Server (NTRS)

    Schaefer, C. M.; Foy, M.

    2015-01-01

    Graphs are powerful tools for simplifying complex data. They are useful for quickly assessing patterns and relationships among one or more variables from a dataset. As the amount of data increases, it becomes more difficult to visualize potential associations. Lifetime Surveillance of Astronaut Health (LSAH) was charged with assessing its current visualization tools along with others on the market to determine whether new tools would be useful for supporting NASA's occupational surveillance effort. It was concluded by members of LSAH that the current tools hindered their ability to provide quick results to researchers working with the department. Due to the high volume of data requests and the many iterations of visualizations requested by researchers, software with a better ability to replicate graphs and edit quickly could improve LSAH's efficiency and lead to faster research results.

  9. Vibration in car repair work.

    PubMed

    Hansson, J E; Eklund, L; Kihlberg, S; Ostergren, C E

    1987-03-01

    The main objective of the study was to find efficient hand tools which caused only minor vibration loading. Vibration measurements were carried out under standardised working conditions. The time during which car body repairers in seven companies were exposed to vibration was determined. Chisel hammers, impact wrenches, sanders and saws were the types of tools which generated the highest vibration accelerations. The average daily exposure at the different garages ranged from 22 to 70 min. The risk of vibration injury is currently rated as high. The difference between the highest and lowest levels of vibration was considerable in most tool categories. Therefore the choice of tool has a major impact on the magnitude of vibration exposure. The importance of choosing the right tools and working methods is discussed and a counselling service on vibration is proposed.

  10. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., which facilitated rapid finite element analysis, sizing studies and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications to the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  11. Statistical properties of mean stand biomass estimators in a LIDAR-based double sampling forest survey design.

    Treesearch

    H.E. Anderson; J. Breidenbach

    2007-01-01

    Airborne laser scanning (LIDAR) can be a valuable tool in double-sampling forest survey designs. LIDAR-derived forest structure metrics are often highly correlated with important forest inventory variables, such as mean stand biomass, and LIDAR-based synthetic regression estimators have the potential to be highly efficient compared to single-stage estimators, which...
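The synthetic regression estimator referred to in this (truncated) abstract can be illustrated with a generic double-sampling sketch: an auxiliary LIDAR metric is observed on a large first-phase sample, field biomass only on a small second-phase subsample, and a regression fit on the subsample is applied to the first-phase mean. Data and variable names below are illustrative, not from the study.

```python
# Double-sampling (two-phase) regression estimator of a mean:
# y (biomass) is regressed on x (a LIDAR metric) in the small field sample,
# then adjusted using the mean of x over the large LIDAR-only sample.

def regression_estimate(x_large, x_small, y_small):
    """Estimate mean y using the large-sample mean of the auxiliary variable x."""
    n = len(x_small)
    xbar, ybar = sum(x_small) / n, sum(y_small) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(x_small, y_small)) / \
        sum((x - xbar) ** 2 for x in x_small)
    first_phase_mean = sum(x_large) / len(x_large)  # LIDAR-only sample mean
    return ybar + slope * (first_phase_mean - xbar)
```

When x and y are strongly correlated, as the abstract notes for LIDAR metrics and stand biomass, this estimator can be far more efficient than the field-sample mean alone.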

  12. Quantum tomography of near-unitary processes in high-dimensional quantum systems

    NASA Astrophysics Data System (ADS)

    Lysne, Nathan; Sosa Martinez, Hector; Jessen, Poul; Baldwin, Charles; Kalev, Amir; Deutsch, Ivan

    2016-05-01

    Quantum Tomography (QT) is often considered the ideal tool for experimental debugging of quantum devices, capable of delivering complete information about quantum states (QST) or processes (QPT). In practice, the protocols used for QT are resource intensive and scale poorly with system size. In this situation, a well behaved model system with access to large state spaces (qudits) can serve as a useful platform for examining the tradeoffs between resource cost and accuracy inherent in QT. In past years we have developed one such experimental testbed, consisting of the electron-nuclear spins in the electronic ground state of individual Cs atoms. Our available toolkit includes high fidelity state preparation, complete unitary control, arbitrary orthogonal measurements, and accurate and efficient QST in Hilbert space dimensions up to d = 16. Using these tools, we have recently completed a comprehensive study of QPT in 4, 7 and 16 dimensions. Our results show that QPT of near-unitary processes is quite feasible if one chooses optimal input states and efficient QST on the outputs. We further show that for unitary processes in high dimensional spaces, one can use informationally incomplete QPT to achieve high-fidelity process reconstruction (90% in d = 16) with greatly reduced resource requirements.

  13. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to the model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, efficiency of memory usage, and the capability of each tool to handle modern Fortran 90/95 elements such as structures and pointers, which group variables together or provide aliases to memory addresses, respectively. We show that, while operator-overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, source transformation tools appear to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled. However, they can only be applied to numerical models written in earlier generations of programming languages.
Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be addressed to allow the continued use of AD tools for solving geophysical problems on modern computer architectures.
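The operator-overloading approach the study compares can be illustrated with forward-mode dual numbers: each value carries its derivative, and overloaded arithmetic propagates both together. Source-transformation tools such as TAF instead emit new derivative source code. This toy class is a sketch of the concept only, not any of the surveyed tools' APIs (and the surveyed tools target Fortran, not Python).

```python
# Forward-mode AD via operator overloading: a Dual carries (value, derivative)
# and arithmetic operators apply the sum and product rules.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)  # product rule

    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate df/dx at x by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).dot
```

The convenience of this style, and its runtime overhead relative to generated code, is exactly the trade-off the study quantifies.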

  14. Using leaf optical properties to detect ozone effects on foliar biochemistry

    USDA-ARS?s Scientific Manuscript database

    Efficient methods for accurate and meaningful high-throughput plant phenotyping are limiting the development and breeding of stress-tolerant crops. A number of emerging techniques, specifically remote sensing methods, have been identified as promising tools for plant phenotyping. These remote-sensin...

  15. AOP-informed assessment of endocrine disruption in freshwater crustaceans

    EPA Science Inventory

    To date, most research focused on developing more efficient and cost effective methods to predict toxicity have focused on human biology. However, there is also a need for effective high throughput tools to predict toxicity to other species that perform critical ecosystem functio...

  16. Evaluation of the efficiency and fault density of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1993-01-01

Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while others also allow checking of individual modules and combined sets of modules. Considering NASA's requirement for reliability, a comparison with in-house, manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification of these claims is warranted.

  17. Laser processes and system technology for the production of high-efficiency crystalline solar cells

    NASA Astrophysics Data System (ADS)

    Mayerhofer, R.; Hendel, R.; Zhu, Wenjie; Geiger, S.

    2012-10-01

    The laser as an industrial tool is an essential part of today's solar cell production. Due to the solar industry's ongoing efforts to increase cell efficiency, more and more laser-based processes that have been discussed and tested at lab scale for many years are now being implemented in mass production lines. In order to cope with throughput requirements, standard laser concepts have to be improved continuously with respect to available average power levels, repetition rates and beam profile. Some of the laser concepts that showed high potential in the past couple of years will be substituted by other, more economical laser types. Furthermore, requirements for processing with smaller heat-affected zones fuel the development of industry-ready ultrashort-pulse lasers with pulse widths even below the picosecond range. In 2011, the German Ministry of Education and Research (BMBF) launched the program "PV-Innovation Alliance" with the aim of supporting the rapid transfer of high-efficiency processes out of development departments and research institutes into solar cell production lines. Here, lasers play an important role as production tools, allowing the fast implementation of high-performance solar cell concepts. We report on the results achieved within the joint project FUTUREFAB, where efficiency optimization, throughput enhancement and cost reduction are the main goals. The presentation focuses on laser processes such as selective emitter doping and ablation of dielectric layers. An indispensable part of the efforts towards cost reduction in solar cell production is the improvement of the wafer handling and throughput capabilities of the laser processing system. Therefore, the presentation also covers new developments in the design of complete production machines.

  18. Efficient utilization of graphics technology for space animation

    NASA Technical Reports Server (NTRS)

    Panos, Gregory Peter

    1989-01-01

    Efficient utilization of computer graphics technology has become a major part of the work of aerospace engineers and mission designers. These new tools are having a significant impact on the development and analysis of complex tasks and procedures which must be prepared prior to actual space flight. Design and implementation of useful methods for applying these tools has evolved into a complex interaction of hardware, software, network, video and various user interfaces. Because few people can understand every aspect of this broad mix of technology, many specialists are required to build, train, maintain and adapt these tools to changing user needs. Researchers have set out to create systems where an engineering designer can easily work to achieve goals with a minimum of technological distraction. This was accomplished with high-performance flight simulation visual systems and supercomputer computational horsepower. Control throughout the creative process is judiciously applied while maintaining generality and ease of use to accommodate a wide variety of engineering needs.

  19. Fluorescent tagged episomals for stoichiometric induced pluripotent stem cell reprogramming.

    PubMed

    Schmitt, Christopher E; Morales, Blanca M; Schmitz, Ellen M H; Hawkins, John S; Lizama, Carlos O; Zape, Joan P; Hsiao, Edward C; Zovein, Ann C

    2017-06-05

    Non-integrating episomal vectors have become an important tool for induced pluripotent stem cell reprogramming. The episomal vectors carrying the "Yamanaka reprogramming factors" (Oct4, Klf, Sox2, and L-Myc + Lin28) are critical tools for non-integrating reprogramming of cells to a pluripotent state. However, the reprogramming process remains highly stochastic, and is hampered by an inability to easily identify clones that carry the episomal vectors. We modified the original set of vectors to express spectrally separable fluorescent proteins to allow for enrichment of transfected cells. The vectors were then tested against the standard original vectors for reprogramming efficiency and for the ability to enrich for stoichiometric ratios of factors. The reengineered vectors allow for cell sorting based on reprogramming factor expression. We show that these vectors can assist in tracking episomal expression in individual cells and can select the reprogramming factor dosage. Together, these modified vectors are a useful tool for understanding the reprogramming process and improving induced pluripotent stem cell isolation efficiency.

  20. Tools for Atmospheric Radiative Transfer: Streamer and FluxNet. Revised

    NASA Technical Reports Server (NTRS)

    Key, Jeffrey R.; Schweiger, Axel J.

    1998-01-01

    Two tools for the solution of radiative transfer problems are presented. Streamer is a highly flexible medium spectral resolution radiative transfer model based on the plane-parallel theory of radiative transfer. Capable of computing either fluxes or radiances, it is suitable for studying radiative processes at the surface or within the atmosphere and for the development of remote-sensing algorithms. FluxNet is a fast neural network-based implementation of Streamer for computing surface fluxes. It allows for a sophisticated treatment of radiative processes in the analysis of large data sets and potential integration into geophysical models where computational efficiency is an issue. Documentation and tools for the development of alternative versions of FluxNet are available. Collectively, Streamer and FluxNet solve a wide variety of problems related to radiative transfer: Streamer provides the detail and sophistication needed to perform basic research on most aspects of complex radiative processes while the efficiency and simplicity of FluxNet make it ideal for operational use.

  1. Process in manufacturing high efficiency AlGaAs/GaAs solar cells by MO-CVD

    NASA Technical Reports Server (NTRS)

    Yeh, Y. C. M.; Chang, K. I.; Tandon, J.

    1984-01-01

    Manufacturing technology for mass-producing high efficiency GaAs solar cells is discussed. Progress in using a high-throughput MO-CVD reactor to produce high efficiency GaAs solar cells is described. The thickness and doping concentration uniformity of metal-organic chemical vapor deposition (MO-CVD) GaAs and AlGaAs layer growth are discussed. In addition, new tooling designs are given which increase the throughput of solar cell processing. To date, 2 cm x 2 cm AlGaAs/GaAs solar cells with efficiencies up to 16.5% have been produced. In order to meet throughput goals for mass-producing GaAs solar cells, a large MO-CVD system (Cambridge Instrument Model MR-200) with a susceptor initially capable of processing 20 wafers (up to 75 mm diameter) in a single growth run was installed. In the MR-200, the sequencing of the gases and the heating power are controlled by a microprocessor-based programmable control console. Hence, operator errors can be reduced, leading to a more reproducible production sequence.

  2. Gemi: PCR Primers Prediction from Multiple Alignments

    PubMed Central

    Sobhy, Haitham; Colson, Philippe

    2012-01-01

    Designing primers and probes for polymerase chain reaction (PCR) is a preliminary and critical step that requires the identification of highly conserved regions in a given set of sequences. This task can be challenging if the targeted sequences display a high level of diversity, as frequently encountered in microbiologic studies. We developed Gemi, an automated, fast, and easy-to-use bioinformatics tool with a user-friendly interface to design primers and probes based on multiple aligned sequences. This tool can be used for both real-time and conventional PCR and can deal efficiently with large sets of long sequences. PMID:23316117
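The core step a tool like Gemi automates, finding highly conserved regions in a multiple alignment, can be sketched as a window scan. This is an illustrative sketch only; the window length and the all-identical criterion are simplifying assumptions, not Gemi's actual parameters or scoring.

```python
# Scan a multiple sequence alignment for windows that are identical across
# all sequences (candidate primer sites). Gap-containing windows are skipped.

def conserved_windows(alignment, window=18):
    """Yield (start, subsequence) for windows identical in every aligned sequence."""
    length = min(len(seq) for seq in alignment)
    for start in range(length - window + 1):
        variants = {seq[start:start + window] for seq in alignment}
        if len(variants) == 1 and "-" not in next(iter(variants)):
            yield start, next(iter(variants))
```

Real primer design adds further constraints (melting temperature, GC content, self-complementarity) on top of conservation.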

  3. Topology and boundary shape optimization as an integrated design tool

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin Philip; Rodrigues, Helder Carrico

    1990-01-01

    The optimal topology of a two dimensional linear elastic body can be computed by regarding the body as a domain of the plane with a high density of material. Such an optimal topology can then be used as the basis for a shape optimization method that computes the optimal form of the boundary curves of the body. This results in an efficient and reliable design tool, which can be implemented via common FEM mesh generator and CAD type input-output facilities.

  4. A multi-fidelity framework for physics based rotor blade simulation and optimization

    NASA Astrophysics Data System (ADS)

    Collins, Kyle Brian

    New helicopter rotor designs are desired that offer increased efficiency, reduced vibration, and reduced noise. Rotor designers in industry need methods that allow them to use the most accurate simulation tools available to search for these optimal designs. Computer-based rotor analysis and optimization have been advanced by the development of industry-standard codes known as "comprehensive" rotorcraft analysis tools. These tools typically use table look-up aerodynamics and simplified inflow models and perform aeroelastic analysis using Computational Structural Dynamics (CSD). Due to the simplified aerodynamics, most design studies are performed by varying structure-related design variables such as sectional mass and stiffness. The optimization of shape-related variables in forward flight using these tools is complicated, and results are viewed with skepticism because rotor blade loads are not accurately predicted. The most accurate methods of rotor simulation utilize Computational Fluid Dynamics (CFD) but have historically been considered too computationally intensive to be used in computer-based optimization, where numerous simulations are required. An approach is needed in which high fidelity CFD rotor analysis can be utilized in a shape-variable optimization problem with multiple objectives. Any such approach should be capable of working in forward flight in addition to hover. An alternative is proposed, founded on the idea that efficient hybrid CFD methods of rotor analysis are ready to be used in preliminary design. In addition, the proposed approach recognizes the usefulness of lower fidelity physics-based analysis and surrogate modeling. Together, they are used with high fidelity analysis in an intelligent process of surrogate model building of parameters in the high fidelity domain. Closing the loop between high and low fidelity analysis is a key aspect of the proposed approach.
This is done by using information from higher fidelity analysis to improve predictions made with lower fidelity models. This thesis documents the development of automated low and high fidelity physics based rotor simulation frameworks. The low fidelity framework uses a comprehensive code with simplified aerodynamics. The high fidelity model uses a parallel processor capable CFD/CSD methodology. Both low and high fidelity frameworks include an aeroacoustic simulation for prediction of noise. A synergistic process is developed that uses both the low and high fidelity frameworks together to build approximate models of important high fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high fidelity (CFD/CSD) experiments and 400+ low fidelity experiments. Optimization using the approximation models found the Pareto Frontier anchor points, or the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high fidelity database. This provides evidence that the process proposed has merit. Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis have made several novel contributions. First, an automated high fidelity CFD based forward flight simulation framework has been built for use in preliminary design optimization. 
The framework was built around an integrated, parallel processor capable CFD/CSD/AA process. Second, a novel method of building approximate models of high fidelity parameters has been developed. The method uses a combination of low and high fidelity results and combines Design of Experiments, statistical effects analysis, and aspects of approximation model management. And third, the determination of rotor blade shape variables through optimization using CFD based analysis in forward flight has been performed. This was done using the high fidelity CFD/CSD/AA framework and method mentioned above. While the low and high fidelity predictions methods used in the work still have inaccuracies that can affect the absolute levels of the results, a framework has been successfully developed and demonstrated that allows for an efficient process to improve rotor blade designs in terms of a selected choice of objective function(s). Using engineering judgment, this methodology could be applied today to investigate opportunities to improve existing designs. With improvements in the low and high fidelity prediction components that will certainly occur, this framework could become a powerful tool for future rotorcraft design work. (Abstract shortened by UMI.)
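The "closing the loop" idea, using information from high fidelity analysis to improve predictions made with lower fidelity models, can be sketched with a simple additive multi-fidelity correction. This is a generic illustration of the concept, not the thesis's actual surrogate-building method, and all names below are hypothetical.

```python
# Additive multi-fidelity correction: fit a discrepancy term between a cheap
# low-fidelity model and a few expensive high-fidelity samples, then return a
# corrected surrogate. Here the discrepancy is just the mean bias.

def fit_additive_correction(lo_model, hi_samples):
    """hi_samples: list of (x, high_fidelity_value). Returns lo_model(x) + mean bias."""
    bias = sum(hi - lo_model(x) for x, hi in hi_samples) / len(hi_samples)
    return lambda x: lo_model(x) + bias
```

Richer variants fit the discrepancy as a function of the design variables (e.g. with a regression or kriging model), which is closer in spirit to the approximation-model management the thesis describes.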

  5. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analyses rapidly. Many of today's early-career engineers are very proficient in the use of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient at building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of common encounters are presented. To counter these trends, guidelines and suggestions for analysts, senior engineers, and educators are offered.

  6. [Retention management by means of applied human resource development: lessons from cardiovascular anaesthesiology].

    PubMed

    Padosch, Stephan A; Schmidt, Christian E; Spöhr, Fabian A M

    2011-05-01

    At present, besides well-known financial problems, German hospitals are facing a serious lack of qualified medical staff. Given these facts, it is of great importance, especially in work load burdened disciplines, such as cardiovascular anaesthesiology, to retain highly qualified medical staff. Here, human resource development measures offer valuable tools for efficient retention management. Moreover, most of these are applicable to almost any clinical specialty. Surprisingly, financial aspects play a minor role in such concepts, in contrast to human resource development tools, such as mentoring, interviews, training and motivational activities. Especially, with regard to "Generation Y", an efficient retention management will play a key role to keep these physicians as hospital employees of long duration in the future. © Georg Thieme Verlag Stuttgart · New York.

  7. NREL's Building-Integrated Supercomputer Provides Heating and Efficient Computing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2014-09-01

    NREL's Energy Systems Integration Facility (ESIF) is meant to investigate new ways to integrate energy sources so they work together efficiently, and one of the key tools in that investigation, a new supercomputer, is itself a prime example of energy systems integration. NREL teamed with Hewlett-Packard (HP) and Intel to develop the innovative warm-water, liquid-cooled Peregrine supercomputer, which not only operates efficiently but also serves as the primary source of building heat for ESIF offices and laboratories. This innovative high-performance computer (HPC) can perform more than a quadrillion calculations per second as part of the world's most energy-efficient HPC data center.

  8. Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs

    NASA Astrophysics Data System (ADS)

    Ringenburg, Michael F.

    Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. 
Our prototyping, profiling, and auto-tuning tools allow us to experiment with approximation strategies and identify new ones; our online tools succeed in providing new insights into the effects of approximation on output quality; and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
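The low-cost online quality monitoring described above can be illustrated with a minimal sketch. The functions below are hypothetical stand-ins (the thesis tools target OCaml programs, not this code): the idea is simply to re-run the precise computation on a small random fraction of calls and track the observed relative error of the approximate version.

```python
import random

def precise_sum(xs):
    """Reference (exact) computation."""
    return sum(xs)

def approx_sum(xs):
    """Hypothetical approximation: subsample every other element and rescale."""
    sampled = xs[::2]
    return sum(sampled) * (len(xs) / max(len(sampled), 1))

class QualityMonitor:
    """Low-cost online monitor: re-check a small fraction of calls against
    the precise version and track the mean relative error observed."""
    def __init__(self, check_rate=0.1, seed=0):
        self.check_rate = check_rate
        self.errors = []
        self.rng = random.Random(seed)

    def run(self, xs):
        result = approx_sum(xs)
        if self.rng.random() < self.check_rate:
            exact = precise_sum(xs)
            if exact != 0:
                self.errors.append(abs(result - exact) / abs(exact))
        return result

    def mean_error(self):
        return sum(self.errors) / len(self.errors) if self.errors else 0.0
```

A deployment could raise `check_rate` (spending more energy on verification) whenever `mean_error()` drifts above a quality goal, which is the kind of real-time adjustment the abstract describes.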

  9. High Efficiency, Low Cost Solar Cells Manufactured Using 'Silicon Ink' on Thin Crystalline Silicon Wafers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antoniadis, H.

    Reported are the development and demonstration of a 17% efficient 25 mm x 25 mm crystalline Silicon solar cell and a 16% efficient 125 mm x 125 mm crystalline Silicon solar cell, both produced by Ink-jet printing Silicon Ink on a thin crystalline Silicon wafer. To achieve these objectives, processing approaches were developed to print the Silicon Ink in a predetermined pattern to form a high efficiency selective emitter, remove the solvents in the Silicon Ink and fuse the deposited particle Silicon films. Additionally, standard solar cell manufacturing equipment with slightly modified processes was used to complete the fabrication of the Silicon Ink high efficiency solar cells. Also reported are the development and demonstration of an 18.5% efficient 125 mm x 125 mm monocrystalline Silicon cell and a 17% efficient 125 mm x 125 mm multicrystalline Silicon cell, achieved by utilizing high throughput Ink-jet and screen printing technologies. To achieve these objectives, Innovalight developed new high throughput processing tools to print and fuse both p- and n-type particle Silicon Inks in a predetermined pattern applied either on the front or the back of the cell. Additionally, customized Ink-jet and screen printing systems, coupled with a customized substrate handling solution, customized printing algorithms, and a customized ink drying process, in combination with a purchased turn-key line, were used to complete the high efficiency solar cells. This development work delivered a process capable of producing 18.5% efficient crystalline Silicon solar cells in high volume and enabled Innovalight to commercialize its technology by the summer of 2010.

  10. Development of efficient and cost-effective distributed hydrological modeling tool MWEasyDHM based on open-source MapWindow GIS

    NASA Astrophysics Data System (ADS)

    Lei, Xiaohui; Wang, Yuhui; Liao, Weihong; Jiang, Yunzhong; Tian, Yu; Wang, Hao

    2011-09-01

    Many regions of China are still threatened by frequent floods and water resource shortages. Consequently, reproducing and predicting the hydrological processes in watersheds is a difficult but unavoidable task for reducing the risks of damage and loss. It is therefore necessary to develop an efficient and cost-effective hydrological tool for China, because many areas need to be modeled. Mature hydrological tools such as MIKE SHE and ArcSWAT (the soil and water assessment tool based on ArcGIS) have shown significant power in improving the precision of hydrological modeling in China by considering spatial variability both in land cover and in soil type. However, adopting such commercial tools in a large developing country comes at a high cost. Commercial modeling tools usually contain large numbers of formulas, complicated data formats, and many preprocessing or postprocessing steps that may make it difficult for the user to carry out a simulation, thus lowering the efficiency of the modeling process. Besides, commercial hydrological models usually cannot be modified or improved to suit some of the special hydrological conditions in China. Some other hydrological models are open source but are integrated into commercial GIS systems. Therefore, by integrating the hydrological simulation code EasyDHM, a hydrological simulation tool named MWEasyDHM was developed based on the open-source MapWindow GIS. Its purpose is to establish the first open-source GIS-based distributed hydrological model tool in China by integrating modules for preprocessing, model computation, parameter estimation, result display, and analysis. MWEasyDHM provides users with a user-friendly MapWindow GIS interface, selectable multifunctional hydrological processing modules, and, more importantly, an efficient and cost-effective hydrological simulation tool.
The general construction of MWEasyDHM consists of four major parts: (1) a general GIS module for hydrological analysis, (2) a preprocessing module for modeling inputs, (3) a model calibration module, and (4) a postprocessing module. The general GIS module for hydrological analysis is developed on the basis of the fully open-source GIS software MapWindow, which contains basic GIS functions. The preprocessing module is made up of three submodules: a DEM-based submodule for hydrological analysis, a submodule for default parameter calculation, and a submodule for the spatial interpolation of meteorological data. The calibration module supports parallel computation, real-time computation, and visualization. The postprocessing module supports visualization of model results in tabular form and on spatial grids. MWEasyDHM makes possible the efficient modeling and calibration of EasyDHM, and promises further development of cost-effective applications in various watersheds.

  11. The in-situ 3D measurement system combined with CNC machine tools

    NASA Astrophysics Data System (ADS)

    Zhao, Huijie; Jiang, Hongzhi; Li, Xudong; Sui, Shaochun; Tang, Limin; Liang, Xiaoyue; Diao, Xiaochun; Dai, Jiliang

    2013-06-01

    With the development of the manufacturing industry, in-situ 3D measurement of machining workpieces in CNC machine tools is regarded as a new trend in efficient measurement. We introduce a 3D measurement system based on stereovision and the phase-shifting method, combined with CNC machine tools, which can measure the 3D profile of machining workpieces between key machining processes. The measurement system utilizes high dynamic range fringe acquisition to solve the problem of saturation induced by specular light reflected from shiny surfaces such as aluminum alloy or titanium alloy workpieces. We measured two aluminum alloy workpieces on the CNC machine tools to demonstrate the effectiveness of the developed measurement system.
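The abstract does not state which phase-retrieval formula the system uses; a common choice in fringe-projection systems of this kind is the standard four-step phase-shifting algorithm, sketched below on synthetic intensities (the helper `intensities` is illustrative, not part of the described system).

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Standard four-step phase-shifting: recover the wrapped phase from
    four fringe intensities captured at shifts of 0, pi/2, pi, 3*pi/2.
    With I_k = A + B*cos(phi + k*pi/2): i4 - i2 = 2B*sin(phi) and
    i1 - i3 = 2B*cos(phi), so atan2 returns phi."""
    return math.atan2(i4 - i2, i1 - i3)

def intensities(phi, a=128.0, b=100.0):
    """Generate the four synthetic fringe intensities for a known phase."""
    return [a + b * math.cos(phi + k * math.pi / 2) for k in range(4)]
```

Because the background term `A` and modulation `B` cancel in the differences, the recovered phase is insensitive to uniform illumination changes, which is one reason the four-step variant is popular on reflective metal surfaces.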

  12. Progress in development of coated indexable cemented carbide inserts for machining of iron based work piece materials

    NASA Astrophysics Data System (ADS)

    Czettl, C.; Pohler, M.

    2016-03-01

    Increasing demands on the material properties of iron-based workpiece materials, e.g. for the turbine industry, complicate the machining process and reduce the lifetime of cutting tools. Therefore, improved tool solutions, adapted to the requirements of the desired application, have to be developed. In particular, the interplay of macro- and micro-geometry, substrate material, coating and post-treatment processes is crucial for the durability of modern high-performance tool solutions. Improved and novel analytical methods allow a detailed understanding of the material properties responsible for the wear behaviour of the tools. These support the knowledge-based development of tailored cutting materials for selected applications. One important factor for such a solution is the proper choice of coating material, which can be synthesized by physical or chemical vapor deposition techniques. Within this work an overview of state-of-the-art coated carbide grades is presented, and application examples are shown to demonstrate their high efficiency. Machining processes for materials ranging from cast iron and low carbon steels to high-alloyed steels are covered.

  13. FusionAnalyser: a new graphical, event-driven tool for fusion rearrangements discovery

    PubMed Central

    Piazza, Rocco; Pirola, Alessandra; Spinelli, Roberta; Valletta, Simona; Redaelli, Sara; Magistroni, Vera; Gambacorti-Passerini, Carlo

    2012-01-01

    Gene fusions are common driver events in leukaemias and solid tumours; here we present FusionAnalyser, a tool dedicated to the identification of driver fusion rearrangements in human cancer through the analysis of paired-end high-throughput transcriptome sequencing data. We initially tested FusionAnalyser by using a set of in silico randomly generated sequencing data from 20 known human translocations occurring in cancer, and subsequently using transcriptome data from three chronic and three acute myeloid leukaemia samples. In all cases our tool was able to detect the presence of the correct driver fusion event(s) with high specificity. In one of the acute myeloid leukaemia samples, FusionAnalyser identified a novel, cryptic, in-frame ETS2–ERG fusion. A fully event-driven graphical interface and a flexible filtering system allow complex analyses to be run without any a priori programming or scripting knowledge. Therefore, we propose FusionAnalyser as an efficient and robust graphical tool for the identification of functional rearrangements in the context of high-throughput transcriptome sequencing data. PMID:22570408
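The core read-pair principle behind tools of this kind can be sketched in a few lines (this is an illustration of discordant-pair analysis in general, not FusionAnalyser's actual pipeline): a paired-end read whose two mates map to different genes is evidence for a fusion transcript, and gene pairs supported by enough such pairs become candidates.

```python
from collections import Counter

def candidate_fusions(read_pairs, min_support=2):
    """Minimal discordant read-pair sketch. `read_pairs` is a list of
    (gene_of_mate1, gene_of_mate2) tuples, a stand-in for aligned
    paired-end reads. Pairs spanning two different genes are counted,
    and gene pairs with at least `min_support` supporting pairs are
    reported as fusion candidates."""
    support = Counter()
    for g1, g2 in read_pairs:
        if g1 != g2:
            # Sort so (A, B) and (B, A) count toward the same candidate.
            support[tuple(sorted((g1, g2)))] += 1
    return {pair: n for pair, n in support.items() if n >= min_support}
```

Real tools add breakpoint refinement from split reads and filtering against read-through transcripts and paralogs, which is where most of the specificity comes from.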

  14. FusionAnalyser: a new graphical, event-driven tool for fusion rearrangements discovery.

    PubMed

    Piazza, Rocco; Pirola, Alessandra; Spinelli, Roberta; Valletta, Simona; Redaelli, Sara; Magistroni, Vera; Gambacorti-Passerini, Carlo

    2012-09-01

    Gene fusions are common driver events in leukaemias and solid tumours; here we present FusionAnalyser, a tool dedicated to the identification of driver fusion rearrangements in human cancer through the analysis of paired-end high-throughput transcriptome sequencing data. We initially tested FusionAnalyser by using a set of in silico randomly generated sequencing data from 20 known human translocations occurring in cancer, and subsequently using transcriptome data from three chronic and three acute myeloid leukaemia samples. In all cases our tool was able to detect the presence of the correct driver fusion event(s) with high specificity. In one of the acute myeloid leukaemia samples, FusionAnalyser identified a novel, cryptic, in-frame ETS2-ERG fusion. A fully event-driven graphical interface and a flexible filtering system allow complex analyses to be run without any a priori programming or scripting knowledge. Therefore, we propose FusionAnalyser as an efficient and robust graphical tool for the identification of functional rearrangements in the context of high-throughput transcriptome sequencing data.

  15. Efficiency of Different Sampling Tools for Aquatic Macroinvertebrate Collections in Malaysian Streams

    PubMed Central

    Ghani, Wan Mohd Hafezul Wan Abdul; Rawi, Che Salmah Md; Hamid, Suhaila Abd; Al-Shami, Salman Abdo

    2016-01-01

    This study analyses the sampling performance of three benthic sampling tools commonly used to collect freshwater macroinvertebrates. The efficiencies of qualitative D-frame and square aquatic nets were compared to a quantitative Surber sampler in tropical Malaysian streams. The abundance and diversity of macroinvertebrates collected using each tool were evaluated along with their relative variations (RVs). Each tool was used to sample macroinvertebrates from three streams draining different areas: a vegetable farm, a tea plantation and a forest reserve. High macroinvertebrate diversities were recorded using the square net and Surber sampler at the forested stream site; however, very low species abundance was recorded by the Surber sampler. Relatively large variations in the Surber sampler collections (RVs of 36% and 28%) were observed for the vegetable farm and tea plantation streams, respectively. Of the three sampling methods, the square net was the most efficient, collecting a greater diversity of macroinvertebrate taxa and a greater number of specimens (i.e., abundance) overall, particularly from the vegetable farm and the tea plantation streams (RV<25%). Fewer square net sample passes (<8 samples) were sufficient to perform a biological assessment of water quality, but each sample required a slightly longer processing time (±20 min) compared with those gathered via the other samplers. In conclusion, all three apparatuses were suitable for macroinvertebrate collection in Malaysian streams and gathered assemblages that resulted in the determination of similar biological water quality classes using the Family Biotic Index (FBI) and the Biological Monitoring Working Party (BMWP). However, despite a slightly longer processing time, the square net was more efficient (lowest RV) at collecting samples and more suitable for the collection of macroinvertebrates from deep, fast flowing, wadeable streams with coarse substrates. PMID:27019685
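The abstract does not define RV explicitly; a common reading is the coefficient of variation expressed as a percentage (lower RV = more consistent catches across replicate samples). Under that assumption, the statistic is:

```python
import statistics

def relative_variation(counts):
    """Relative variation, assumed here to be the coefficient of variation
    in percent: 100 * sample standard deviation / mean of replicate
    sample counts. A lower value indicates a more consistent sampler."""
    return 100.0 * statistics.stdev(counts) / statistics.mean(counts)
```

For example, replicate catches of 8, 10 and 12 specimens give an RV of 20%, which would fall under the <25% threshold the study treats as acceptably consistent.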

  16. Sustainable cooling method for machining titanium alloy

    NASA Astrophysics Data System (ADS)

    Boswell, B.; Islam, M. N.

    2016-02-01

    Hard-to-machine materials such as titanium alloy Ti-6Al-4V (Grade 5) are notorious for generating high temperatures and adverse reactions between the workpiece and tool-tip materials. These conditions all contribute to an increase in wear mechanisms, reducing tool life. Titanium alloy, for example, always requires coolant during machining. However, traditional flood cooling needs to be replaced due to environmental issues, and an alternative cooling method must be found that has minimal impact on the environment. For truly sustainable cooling of the tool it is necessary to account for all energy used in the cooling process, including the energy involved in producing the coolant. Previous research has established that efficient cooling of the tool interface improves tool life and cutting action. The objective of this research is to determine the most appropriate sustainable cooling method that can also reduce the rate of wear at the tool interface.

  17. Quantifying Oldowan Stone Tool Production at Olduvai Gorge, Tanzania

    PubMed Central

    Reti, Jay S.

    2016-01-01

    Recent research suggests that variation exists among and between Oldowan stone tool assemblages. Oldowan variation might represent differential constraints on the raw materials used to produce these stone implements. Alternatively, variation among Oldowan assemblages could represent different methods that Oldowan-producing hominins utilized to produce these lithic implements. Identifying differential patterns of stone tool production within the Oldowan has implications for assessing how stone tool technology evolved, how traditions of lithic production might have been culturally transmitted, and for defining the timing and scope of these evolutionary events. At present, there is no null model to predict what morphological variation in the Oldowan should look like. Without such a model, it is not possible to quantify whether Oldowan assemblages vary due to raw material constraints or due to differences in production technique. This research establishes a null model for Oldowan lithic artifact morphological variation. To establish these expectations, this research (1) models the expected range of variation through large-scale reduction experiments, (2) develops an algorithm to categorize archaeological flakes based on how they are produced, and (3) statistically assesses, via the experimental model, the methods of production used by Oldowan-producing hominins at the site of DK from Olduvai Gorge, Tanzania. Results indicate that a subset of quartzite flakes deviates from the null expectations in a manner that demonstrates efficiency in flake manufacture, while some basalt flakes deviate from null expectations in a manner that demonstrates inefficiency in flake manufacture.
The simultaneous presence of efficiency in stone tool production for one raw material (quartzite) and inefficiency for another (basalt) suggests that Oldowan-producing hominins at DK were able to mediate the economic costs associated with stone tool procurement by utilizing high-cost materials more efficiently than expected and low-cost materials inefficiently. PMID:26808429

  18. Improving the selection efficiency of the counter-selection marker pheS* for the genetic engineering of Bacillus amyloliquefaciens.

    PubMed

    Kharchenko, Maria S; Teslya, Petr N; Babaeva, Maria N; Zakataeva, Natalia P

    2018-05-01

    Bacillus subtilis pheS was genetically modified to obtain a counter-selection marker with high selection efficiency in Bacillus amyloliquefaciens. The application of the new replication-thermosensitive integrative vector pNZTM1, containing this marker, pheS BsT255S/A309G , with a two-step replacement recombination procedure provides an effective tool for the genetic engineering of industrially important Bacillus species. Copyright © 2018. Published by Elsevier B.V.

  19. Sugar microanalysis by HPLC with benzoylation: improvement via introduction of a C-8 cartridge and a high efficiency ODS column.

    PubMed

    Miyagi, Michiko; Yokoyama, Hirokazu; Hibi, Toshifumi

    2007-07-01

    An HPLC protocol for sugar microanalysis based on the formation of ultraviolet-absorbing benzoyl chloride derivatives was improved. Here, samples were prepared with a C-8 cartridge and analyzed with a high efficiency ODS column packed with porous spherical silica particles 3 microns in diameter. These devices allowed us to simultaneously quantify multiple sugars and sugar alcohols down to 10 ng/ml and to achieve satisfactory separation of closely eluting pairs such as fructose and myo-inositol, and sorbitol and mannitol. This protocol, which does not require special apparatus, should become a powerful tool in sugar research.

  20. Performance evaluation of inpatient service in Beijing: a horizontal comparison with risk adjustment based on Diagnosis Related Groups

    PubMed Central

    Jian, Weiyan; Huang, Yinmin; Hu, Mu; Zhang, Xiumei

    2009-01-01

    Background: Medical performance evaluation, which provides a basis for rational decision-making, is an important part of medical service research. Current progress with health services reform in China is far from satisfactory, without sufficient regulation. To achieve better progress, an effective tool for evaluating medical performance needs to be established. In view of this, this study attempted to develop such a tool appropriate for the Chinese context. Methods: Data were collected from the front pages of medical records (FPMR) of all large general public hospitals (21 hospitals) in the third and fourth quarters of 2007. Locally developed Diagnosis Related Groups (DRGs) were introduced as a tool for risk adjustment, and performance evaluation indicators were established: the Charge Efficiency Index (CEI), the Time Efficiency Index (TEI) and inpatient mortality of low-risk group cases (IMLRG), reflecting work efficiency and medical service quality respectively. Using these indicators, inpatient service performance was horizontally compared among hospitals. The Case-mix Index (CMI) was used to adjust the efficiency indices, producing adjusted CEI (aCEI) and adjusted TEI (aTEI). Poisson distribution analysis was used to test the statistical significance of IMLRG differences between hospitals. Results: Using the aCEI, aTEI and IMLRG scores for the 21 hospitals, Hospitals A and C had relatively good overall performance because their medical charges were lower, lengths of stay (LOS) shorter and IMLRG smaller. Hospitals P and Q performed worst due to their relatively high charge levels, long LOS and high IMLRG. Various performance problems also existed in the other hospitals. Conclusion: It is possible to develop an accurate and easy-to-run performance evaluation system using Case-Mix as the tool for risk adjustment, choosing indicators close to consumers and managers, and utilizing routine report forms as the basic information source.
To keep such a system running effectively, it is necessary to improve the reliability of clinical information and the risk-adjustment ability of Case-Mix. PMID:19402913
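The abstract names the indices but not their formulas. A plausible construction (an assumption for illustration, not the paper's published definitions) is a ratio of a hospital's average value to a benchmark average, divided by the Case-mix Index so that hospitals treating more complex cases are not penalized:

```python
def charge_efficiency_index(hospital_avg_charge, reference_avg_charge):
    """Raw CEI: ratio of a hospital's average inpatient charge to the
    benchmark average; values above 1 suggest higher-than-benchmark
    charges for comparable cases. (Hypothetical formula.)"""
    return hospital_avg_charge / reference_avg_charge

def adjusted_index(raw_index, case_mix_index):
    """Hypothetical case-mix adjustment: divide the raw efficiency index
    by the CMI, crediting hospitals whose caseload is more complex."""
    return raw_index / case_mix_index
```

Under this sketch, a hospital charging 20% above the benchmark (raw CEI 1.2) but with a CMI of 1.5 would get an aCEI of 0.8, i.e. it looks efficient once case complexity is accounted for.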

  1. Commercial Building Energy Asset Rating Program -- Market Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCabe, Molly J.; Wang, Na

    2012-04-19

    Under contract to Pacific Northwest National Laboratory, HaydenTanner, LLC conducted an in-depth analysis of the potential market value of a commercial building energy asset rating program for the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy. The market research objectives were to: (1) Evaluate market interest and need for a program and tool to offer asset rating and rapidly identify potential energy efficiency measures for the commercial building sector. (2) Identify key input variables and asset rating outputs that would facilitate increased investment in energy efficiency. (3) Assess best practices and lessons learned from existing national and international energy rating programs. (4) Identify core messaging to motivate owners, investors, financiers, and others in the real estate sector to adopt a voluntary asset rating program and, as a consequence, deploy high-performance strategies and technologies across new and existing buildings. (5) Identify leverage factors and incentives that facilitate increased investment in these buildings. To meet these objectives, work consisted of a review of the relevant literature, examination of existing and emergent asset and operational rating systems, interviews with industry stakeholders, and an evaluation of the value implication of an asset label on asset valuation. This report documents the analysis methodology and findings, conclusion, and recommendations. Its intent is to support and inform the DOE Office of Energy Efficiency and Renewable Energy on the market need and potential value impacts of an asset labeling and diagnostic tool to encourage high-performance new buildings and building efficiency retrofit projects.

  2. OLTARIS: An Efficient Web-Based Tool for Analyzing Materials Exposed to Space Radiation

    NASA Technical Reports Server (NTRS)

    Slaba, Tony; McMullen, Amelia M.; Thibeault, Sheila A.; Sandridge, Chris A.; Clowdsley, Martha S.; Blattnig, Steve R.

    2011-01-01

    The near-Earth space radiation environment includes energetic galactic cosmic rays (GCR), high intensity proton and electron belts, and the potential for solar particle events (SPE). These sources may penetrate shielding materials and deposit significant energy in sensitive electronic devices on board spacecraft and satellites. Material and design optimization methods may be used to reduce the exposure and extend the operational lifetime of individual components and systems. Since laboratory experiments are expensive and may not cover the range of particles and energies relevant for space applications, such optimization may be done computationally with efficient algorithms that include the various constraints placed on the component, system, or mission. In the present work, the web-based tool OLTARIS (On-Line Tool for the Assessment of Radiation in Space) is presented, and the applicability of the tool for rapidly analyzing exposure levels within either complicated shielding geometries or user-defined material slabs exposed to space radiation is demonstrated. An example approach for material optimization is also presented. Slabs of various advanced multifunctional materials are defined and exposed to several space radiation environments. The materials and thicknesses defining each layer in the slab are then systematically adjusted to arrive at an optimal slab configuration.
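The slab-optimization approach described above can be sketched as a toy search. Everything here is illustrative: the attenuation coefficients and the exponential dose model are hypothetical stand-ins for OLTARIS transport results, chosen only to show the "systematically adjust thicknesses under a budget" loop.

```python
import math
from itertools import product

# Hypothetical attenuation effectiveness per g/cm^2 of areal density for
# two shielding materials -- illustrative numbers, not OLTARIS data.
MATERIALS = {"polyethylene": 0.040, "aluminum": 0.015}

def dose(layers, incident=100.0):
    """Toy exponential-attenuation dose model (a stand-in for a real
    radiation transport calculation)."""
    total = sum(MATERIALS[m] * t for m, t in layers)
    return incident * math.exp(-total)

def best_slab(budget_g_cm2, step=5.0):
    """Exhaustively search layer-thickness pairs under an areal-density
    budget; return (dose, polyethylene_thickness, aluminum_thickness)."""
    best = None
    steps = int(budget_g_cm2 / step) + 1
    for i, j in product(range(steps), repeat=2):
        tp, ta = i * step, j * step
        if tp + ta > budget_g_cm2:
            continue
        d = dose([("polyethylene", tp), ("aluminum", ta)])
        if best is None or d < best[0]:
            best = (d, tp, ta)
    return best
```

With these toy coefficients the search naturally allocates the whole budget to the more hydrogen-rich material, mirroring the qualitative outcome such optimizations often produce; a real study would replace `dose` with transport-code output per environment.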

  3. MITHRA 1.0: A full-wave simulation tool for free electron lasers

    NASA Astrophysics Data System (ADS)

    Fallahi, Arya; Yahaghi, Alireza; Kärtner, Franz X.

    2018-07-01

    Free Electron Lasers (FELs) are a solution for providing intense, coherent and bright radiation in the hard X-ray regime. Due to the low wall-plug efficiency of FEL facilities, it is crucial, and additionally very useful, to develop complete and accurate simulation tools for better optimization of the FEL interaction. The highly sophisticated dynamics involved in the FEL process has been the main obstacle hindering the development of general simulation tools for this problem. We present a numerical algorithm based on the finite-difference time-domain/particle-in-cell (FDTD/PIC) method in a Lorentz-boosted coordinate system, which is able to perform a full-wave simulation of an FEL process. The developed software offers a suitable tool for the analysis of FEL interactions without any of the usual approximations. A coordinate transformation to the bunch rest frame brings the very different length scales of the bunch size, optical wavelength and undulator period to values of the same order. Consequently, FDTD/PIC simulations in conjunction with efficient parallelization techniques make full-wave simulation feasible using available computational resources. Several examples of free electron lasers are analyzed using the developed software; the results are benchmarked against standard FEL codes and discussed in detail.

  4. An efficient framework for Java data processing systems in HPC environments

    NASA Astrophysics Data System (ADS)

    Fries, Aidan; Castañeda, Javier; Isasi, Yago; Taboada, Guillermo L.; Portell de Mora, Jordi; Sirvent, Raül

    2011-11-01

    Java is a commonly used programming language, although its use in High Performance Computing (HPC) remains relatively low. One of the reasons is a lack of libraries offering specific HPC functions to Java applications. In this paper we present a Java-based framework, called DpcbTools, designed to provide a set of functions that fill this gap. It includes a set of efficient data communication functions based on message-passing, thus providing, when a low latency network such as Myrinet is available, higher throughputs and lower latencies than standard solutions used by Java. DpcbTools also includes routines for the launching, monitoring and management of Java applications on several computing nodes by making use of JMX to communicate with remote Java VMs. The Gaia Data Processing and Analysis Consortium (DPAC) is a real case where scientific data from the ESA Gaia astrometric satellite will be entirely processed using Java. In this paper we describe the main elements of DPAC and its usage of the DpcbTools framework. We also assess the usefulness and performance of DpcbTools through its performance evaluation and the analysis of its impact on some DPAC systems deployed in the MareNostrum supercomputer (Barcelona Supercomputing Center).

  5. Fundamental CRISPR-Cas9 tools and current applications in microbial systems.

    PubMed

    Tian, Pingfang; Wang, Jia; Shen, Xiaolin; Rey, Justin Forrest; Yuan, Qipeng; Yan, Yajun

    2017-09-01

    Derived from the bacterial adaptive immune system, CRISPR technology has revolutionized conventional genetic engineering methods and unprecedentedly facilitated strain engineering. In this review, we outline the fundamental CRISPR tools that have been employed for strain optimization. These tools include CRISPR editing, CRISPR interference, CRISPR activation and protein imaging. To further characterize the CRISPR technology, we present current applications of these tools in microbial systems, including model and non-model industrial microorganisms. Specifically, we point out the major challenges of the CRISPR tools when utilized for multiplex genome editing and sophisticated expression regulation. To address these challenges, we propose strategies that emphasize improving DNA repair efficiency through CRISPR-Cas9-assisted recombineering. Lastly, multiple promising research directions are proposed, mainly focusing on CRISPR-based construction of microbial ecosystems toward high production of desired chemicals.

  6. Microgrid Analysis Tools Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jimenez, Antonio; Haase, Scott G; Mathur, Shivani

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the total imported fuel used to secure all energy services in Alaska's remote microgrids by at least 50%, without increasing system life cycle costs, while also improving overall system reliability, security, and resilience. One goal of the Alaska Microgrid Partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life cycle costs and improving reliability and resiliency. This presentation provides an overview of four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is drawn from the respective tool websites, tool developers, and author experience.

  7. Bridging the gap between fluxomics and industrial biotechnology.

    PubMed

    Feng, Xueyang; Page, Lawrence; Rubens, Jacob; Chircus, Lauren; Colletti, Peter; Pakrasi, Himadri B; Tang, Yinjie J

    2010-01-01

    Metabolic flux analysis is a vital tool used to determine the ultimate output of cellular metabolism and thus detect biotechnologically relevant bottlenecks in productivity. ¹³C-based metabolic flux analysis (¹³C-MFA) and flux balance analysis (FBA) have many potential applications in biotechnology. However, noteworthy hurdles in fluxomics study are still present. First, several technical difficulties in both ¹³C-MFA and FBA severely limit the scope of fluxomics findings and the applicability of the obtained metabolic information. Second, the complexity of metabolic regulation poses a great challenge for precise prediction and analysis of metabolic networks, as there are gaps between fluxomics results and other omics studies. Third, despite identified metabolic bottlenecks or sources of host stress from product synthesis, it remains difficult to overcome inherent metabolic robustness or to efficiently import and express nonnative pathways. Fourth, product yields often decrease as the number of enzymatic steps increases. Such a decrease in yield may not be caused by rate-limiting enzymes, but rather accumulates through each enzymatic reaction. Fifth, a high-throughput fluxomics tool has not been developed for characterizing nonmodel microorganisms and maximizing their application in industrial biotechnology. Refining fluxomics tools and understanding these obstacles will improve our ability to engineer highly efficient metabolic pathways in microbial hosts.
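FBA, mentioned above, is a linear program: maximize a product flux subject to steady-state mass balance and capacity bounds on each reaction. The toy network below (hypothetical yields and bounds, not from any real organism) is small enough that the optimum can be found greedily instead of with an LP solver, which keeps the sketch self-contained.

```python
def max_product_flux(uptake_max, cap_high_yield, cap_low_yield,
                     yield_high=1.0, yield_low=0.5):
    """Toy flux balance example: substrate uptake splits between two
    branches with different product yields. Maximize total product flux
    subject to the uptake bound and per-branch capacity bounds. Because
    the objective is linear and the high-yield branch dominates, filling
    it first is optimal (a hand-solved 2-variable LP)."""
    v_high = min(uptake_max, cap_high_yield)          # fill best branch
    v_low = min(uptake_max - v_high, cap_low_yield)   # spill to second
    return yield_high * v_high + yield_low * v_low
```

With an uptake bound of 10, a high-yield branch capped at 4 and a low-yield branch capped at 10, the model routes 4 units through the efficient branch and 6 through the inefficient one, immediately exposing the capacity of the high-yield branch as the bottleneck that metabolic engineering would target. Genome-scale FBA solves the same kind of problem with thousands of reactions via an LP solver.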

  8. Sex chromosomal abnormalities associated with equine infertility: validation of a simple molecular screening tool in the Purebred Spanish Horse.

    PubMed

    Anaya, G; Molina, A; Valera, M; Moreno-Millán, M; Azor, P; Peral-García, P; Demyda-Peyrás, S

    2017-08-01

    Chromosomal abnormalities in the sex chromosome pair (ECAX and ECAY) are widely associated with reproductive problems in horses. However, a large proportion of these abnormalities remains undiagnosed due to the lack of an affordable diagnostic tool that avoids the need for karyotyping tests. Here, we developed an STR (short tandem repeat)-based molecular method to determine the presence of the main sex chromosomal abnormalities in horses in a fast, cheap and reliable way. The frequency of five ECAX-linked (LEX026, LEX003, TKY38, TKY270 and UCDEQ502) and two ECAY-linked (EcaYH12 and SRY) markers was characterized in 261 Purebred Spanish Horses to determine the efficiency of the developed methodology for use as a chromosomal diagnostic tool. All the microsatellites analyzed were highly polymorphic, with a sizeable number of alleles (polymorphic information content > 0.5). Based on this variability, the methodology showed 100% sensitivity and 99.82% specificity for detecting the most important sex chromosomal abnormalities reported in horses (chimerism, Turner's syndrome and sex reversal syndromes). The method was also validated with 100% efficiency in 10 individuals previously diagnosed as chromosomally aberrant. This STR screening panel is an efficient and reliable molecular-cytogenetic tool for the early detection of sex chromosomal abnormalities in equines that could be included in breeding programs to save money, effort and time for veterinary practitioners and breeders. © 2017 Stichting International Foundation for Animal Genetics.

  9. Force feedback requirements for efficient laparoscopic grasp control.

    PubMed

    Westebring-van der Putten, Eleonora P; van den Dobbelsteen, John J; Goossens, Richard H M; Jakimowicz, Jack J; Dankelman, Jenny

    2009-09-01

    During laparoscopic grasping, tissue damage may occur due to the use of excessive grasp forces and tissue slippage, whereas in barehanded grasping, humans control their grasp to prevent slippage and the use of excessive force (a safe grasp). This study investigates the differences in grasp control between barehanded and laparoscopic lifts. Ten novices performed lifts in order to compare pinch forces under four conditions: barehanded; using tweezers; using a low-efficiency grasper; and using a high-efficiency grasper. Results showed that participants increased their pinch force significantly later during a barehanded lift (at a pull-force level of 2.63 N) than when lifting laparoscopically (from pull-force levels of 0.77 to 1.08 N). In barehanded lifts all participants could accomplish a safe grasp, whereas in laparoscopic lifts excessive force (up to 7.9 N) and slippage (up to 38% of the trials) occurred frequently. For novices, it can be concluded that force feedback (in addition to that from the hand-tool interface), as in skin-tissue contact, is a prerequisite for maintaining a safe grasp. Much is known about grasp control during barehanded object manipulation, especially the adaptation of pinch forces to changing loads, whereas little is known about force perception and grasp control during tool usage. This knowledge is a prerequisite for the ergonomic design of tools that are used to manipulate objects.

  10. Unsteady Loss in the Stator Due to the Incoming Rotor Wake in a Highly-Loaded Transonic Compressor

    NASA Technical Reports Server (NTRS)

    Hah, Chunill

    2015-01-01

    The present paper reports an investigation of unsteady loss generation in the stator due to the incoming rotor wake in an advanced GE transonic compressor design, using a high-fidelity numerical method. This advanced compressor, with high reaction and high stage loading, has been investigated both experimentally and analytically in the past. The measured efficiency of this advanced compressor is significantly lower than the design goal. The general understanding is that the current generation of compressor design and analysis tools misses some important flow physics in this modern compressor design. To pinpoint the source of the efficiency shortfall, an advanced test with a detailed flow traverse was performed for the front one and a half stages at the NASA Glenn Research Center.

  11. ESH assessment of advanced lithography materials and processes

    NASA Astrophysics Data System (ADS)

    Worth, Walter F.; Mallela, Ram

    2004-05-01

    The ESH Technology group at International SEMATECH is conducting environment, safety, and health (ESH) assessments in collaboration with the lithography technologists who are evaluating the performance of an increasing number of new materials and technologies being considered for advanced lithography, such as 157 nm photoresist and extreme ultraviolet (EUV). By performing data searches for 75 critical data types, emissions characterizations, and industrial hygiene (IH) monitoring during the use of the resist candidates, it has been shown that the best-performing resist formulations, so far, appear to be free of potential ESH concerns. The ESH assessment of the EUV lithography tool being developed for SEMATECH has identified several features of the tool that are of ESH concern: high energy consumption, poor energy conversion efficiency, tool complexity, potential ergonomic and safety-interlock issues, use of high-powered laser(s), generation of ionizing radiation (soft X-rays), the need for adequate shielding, and characterization of the debris formed by the extreme temperature of the plasma. By bringing these ESH challenges to the attention of the technologists and tool designers, it is hoped that the processes and tools can be made more ESH friendly.

  12. 20180312 - Evaluating the applicability of read-across tools and high throughput screening data for food relevant chemicals (SOT)

    EPA Science Inventory

    Alternative toxicity assessment methods to characterize the hazards of chemical substances have been proposed to reduce animal testing and screen thousands of chemicals in an efficient manner. Resources to accomplish these goals include utilizing large in vitro chemical screening...

  13. CRISPR/Cas9 mediated high efficiency knockout of the eye color gene vermillion in Helicoverpa zea (Boddie)

    USDA-ARS?s Scientific Manuscript database

    Among the various genome editing tools available for functional genomic studies, reagents based on clustered regularly interspaced short palindromic repeats (CRISPR) have gained popularity due to their ease and versatility. CRISPR reagents consist of ribonucleoprotein (RNP) complexes formed by combining guide RNA...

  14. Working in a Vacuum

    ERIC Educational Resources Information Center

    Rathey, Allen

    2005-01-01

    In this article, the author discusses several myths about vacuum cleaners and offers tips on evaluating and purchasing this essential maintenance tool. These myths are: (1) Amps mean performance; (2) Everyone needs high-efficiency particulate air (HEPA); (3) Picking up a "bowling ball" shows cleaning power; (4) All vacuum bags are the same; (5)…

  15. The dental handpiece: technology continues to impact everyday practice.

    PubMed

    Lowe, Robert A

    2015-04-01

    One of the most fundamental devices used in dentistry, the handpiece can enhance the efficiency of everyday dental tasks. Through the years, handpieces have gradually been redesigned and upgraded to become the highly accurate and sophisticated tools they are today. Technological advances continue to improve these indispensable instruments.

  16. Multi-locus mixed model analysis of stem rust resistance in a worldwide collection of winter wheat

    USDA-ARS?s Scientific Manuscript database

    Genome-wide association mapping is a powerful tool for dissecting the relationship between phenotypes and genetic variants in diverse populations. With improved cost efficiency of high-throughput genotyping platforms, association mapping is a desirable method to mine populations for favorable allele...

  17. Methodology of project management at implementation of projects of high-rise construction

    NASA Astrophysics Data System (ADS)

    Papelniuk, Oksana

    2018-03-01

    High-rise construction is a promising direction in urban development. The opportunity to place a large amount of residential and commercial space on a relatively small land plot makes high-rise construction very attractive for developers. However, investment projects for the construction of high-rise buildings are very expensive and complex, which makes effective management of such projects a key task for the development company. The best tool in this area today is the methodology of project management, which becomes a key factor of efficiency.

  18. An online network tool for quality information to answer questions about occupational safety and health: usability and applicability.

    PubMed

    Rhebergen, Martijn D F; Hulshof, Carel T J; Lenderink, Annet F; van Dijk, Frank J H

    2010-10-22

    Common information facilities do not always provide the quality information needed to answer questions on health or health-related issues, such as Occupational Safety and Health (OSH) matters. Barriers may be the accessibility, quantity and readability of information. Online Question & Answer (Q&A) network tools, which link questioners directly to experts can overcome some of these barriers. When designing and testing online tools, assessing the usability and applicability is essential. Therefore, the purpose of this study is to assess the usability and applicability of a new online Q&A network tool for answers on OSH questions. We applied a cross-sectional usability test design. Eight occupational health experts and twelve potential questioners from the working population (workers) were purposively selected to include a variety of computer- and internet-experiences. During the test, participants were first observed while executing eight tasks that entailed important features of the tool. In addition, they were interviewed. Through task observations and interviews we assessed applicability, usability (effectiveness, efficiency and satisfaction) and facilitators and barriers in use. Most features were usable, though several could be improved. Most tasks were executed effectively. Some tasks, for example searching stored questions in categories, were not executed efficiently and participants were less satisfied with the corresponding features. Participants' recommendations led to improvements. The tool was found mostly applicable for additional information, to observe new OSH trends and to improve contact between OSH experts and workers. Hosting and support by a trustworthy professional organization, effective implementation campaigns, timely answering and anonymity were seen as important use requirements. This network tool is a promising new strategy for offering company workers high quality information to answer OSH questions. 
Q&A network tools can be an addition to existing information facilities in the field of OSH, but also to other healthcare fields struggling with how to answer questions from people in practice with high quality information. In the near future, we will focus on the use of the tool and its effects on information and knowledge dissemination.

  19. Coupling of metal-organic frameworks-containing monolithic capillary-based selective enrichment with matrix-assisted laser desorption ionization-time-of-flight mass spectrometry for efficient analysis of protein phosphorylation.

    PubMed

    Li, Daojin; Yin, Danyang; Chen, Yang; Liu, Zhen

    2017-05-19

    Protein phosphorylation is a major post-translational modification, which plays a vital role in the cellular signaling of numerous biological processes. Mass spectrometry (MS) has been an essential tool for the analysis of protein phosphorylation, for which the selective enrichment of phosphopeptides from complex biological samples is a key step. In this study, a metal-organic frameworks (MOFs)-based monolithic capillary has been successfully prepared as an effective sorbent for the selective enrichment of phosphopeptides and has been off-line coupled with matrix-assisted laser desorption ionization-time-of-flight mass spectrometry (MALDI-TOF MS) for efficient analysis of phosphopeptides. Using β-casein as a representative phosphoprotein, efficient phosphorylation analysis by this off-line platform was verified. Phosphorylation analysis of a nonfat milk sample was also demonstrated. By introducing the large surface areas and highly ordered pores of MOFs into a monolithic column, the MOFs-based monolithic capillary exhibited several significant advantages, such as excellent selectivity toward phosphopeptides, superb tolerance to interference and a simple operation procedure. Because of these highly desirable properties, the MOFs-based monolithic capillary could be a useful tool for protein phosphorylation analysis. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Energy-Saving Melting and Revert Reduction Technology (E-SMARRT): Use of Laser Engineered Net Shaping for Rapid Manufacturing of Dies with Protective Coatings and Improved Thermal Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brevick, Jerald R.

    2014-06-13

    In the high pressure die casting process, molten metal is introduced into a die cavity at high pressure and velocity, enabling castings of thin wall section and complex geometry to be obtained. Traditional die materials have been hot work die steels, commonly H13. Manufacture of the dies involves machining the desired geometry from monolithic blocks of annealed tool steel, heat treating to desired hardness and toughness, and final machining, grinding and polishing. The die is fabricated with internal water cooling passages created by drilling. These materials and fabrication methods have been used for many years; however, there are limitations. Tool steels have relatively low thermal conductivity, and as a result, it takes time to remove the heat from the tool steel via the drilled internal water cooling passages. Furthermore, the low thermal conductivity generates large thermal gradients at the die cavity surfaces, which ultimately leads to thermal fatigue cracking on the surfaces of the die steel. The high die surface temperatures also promote the metallurgical bonding of the aluminum casting alloy to the surface of the die steel (soldering). In terms of process efficiency, these tooling limitations reduce the number of die castings that can be made per unit time by increasing the cycle time required for cooling, and by increasing the downtime and cost to replace tooling which has failed either by soldering or by thermal fatigue cracking (heat checking). The objective of this research was to evaluate the feasibility of designing, fabricating, and testing high pressure die casting tooling having properties equivalent to H13 on the surface in contact with molten casting alloy – for high temperature and high velocity molten metal erosion resistance – but with the ability to conduct heat rapidly to interior water cooling passages. A layered bimetallic tool design was selected, and the design evaluated for thermal and mechanical performance via finite element analysis. 
    H13 was retained as the exterior layer of the tooling, while commercially pure copper was chosen for the interior structure of the tooling. The tooling was fabricated by traditional machining of the copper substrate, and H13 powder was deposited on the copper via the Laser Engineered Net Shaping (LENS™) process. The H13 deposition layer was then final machined by traditional methods. Two tooling components were designed and fabricated: a thermal fatigue test specimen, and a core for a commercial aluminum high pressure die casting tool. The bimetallic thermal fatigue specimen demonstrated promising performance during testing, and the test results were used to improve the design and LENS™ deposition methods for the subsequent manufacture of the commercial core. Results of the thermal finite element analysis for the thermal fatigue test specimen indicate that it can lose heat to the internal water cooling passages, and to external spray cooling, significantly faster than a monolithic H13 thermal fatigue sample. The commercial core is currently in the final stages of fabrication, and will be evaluated in an actual production environment at Shiloh Die Casting. In this research, the feasibility of designing and fabricating copper/H13 bimetallic die casting tooling via LENS™ processing, for the purpose of improving die casting process efficiency, is demonstrated.
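The motivation for the copper core can be illustrated with the characteristic conduction time t ≈ L²/α, where α = k/(ρ·c_p) is thermal diffusivity. The property values below are approximate handbook figures chosen for illustration, not numbers from the report:

```python
# Rough, illustrative comparison of transient heat conduction in H13 tool
# steel vs. commercially pure copper. Property values are approximate
# handbook figures (assumptions, not from the report).
def diffusion_time(length_m, k, rho, cp):
    """Characteristic conduction time t ~ L^2 / alpha, with alpha = k/(rho*cp)."""
    alpha = k / (rho * cp)          # thermal diffusivity, m^2/s
    return length_m ** 2 / alpha

L = 0.01  # ~1 cm from die cavity surface to a cooling passage
t_h13 = diffusion_time(L, k=24.0,  rho=7800.0, cp=460.0)   # H13, ~24 W/m-K
t_cu  = diffusion_time(L, k=390.0, rho=8960.0, cp=385.0)   # Cu, ~390 W/m-K
# copper conducts heat over the same distance roughly an order of
# magnitude faster, which is why a Cu interior shortens cycle time
```

With these figures the steel's conduction time is on the order of 15 s versus under 1 s for copper, consistent with the report's point that tool-steel conductivity limits cooling rate.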

  1. The Outdoor MEDIA DOT: The development and inter-rater reliability of a tool designed to measure food and beverage outlets and outdoor advertising.

    PubMed

    Poulos, Natalie S; Pasch, Keryn E

    2015-07-01

    Few studies of the food environment have collected primary data, and even fewer have reported reliability of the tool used. This study focused on the development of an innovative electronic data collection tool used to document outdoor food and beverage (FB) advertising and establishments near 43 middle and high schools in the Outdoor MEDIA Study. Tool development used GIS based mapping, an electronic data collection form on handheld devices, and an easily adaptable interface to efficiently collect primary data within the food environment. For the reliability study, two teams of data collectors documented all FB advertising and establishments within one half-mile of six middle schools. Inter-rater reliability was calculated overall and by advertisement or establishment category using percent agreement. A total of 824 advertisements (n=233), establishment advertisements (n=499), and establishments (n=92) were documented (range=8-229 per school). Overall inter-rater reliability of the developed tool ranged from 69-89% for advertisements and establishments. Results suggest that the developed tool is highly reliable and effective for documenting the outdoor FB environment. Copyright © 2015 Elsevier Ltd. All rights reserved.
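Percent agreement, the reliability measure used here, is simple to compute; a minimal sketch with hypothetical category codes (made-up data, not the study's):

```python
def percent_agreement(codes_a, codes_b):
    """Share of items on which two raters recorded the same category, as a %."""
    assert len(codes_a) == len(codes_b)
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100.0 * matches / len(codes_a)

# hypothetical category codes from two teams for 8 jointly observed items
team1 = ["fast_food", "soda", "grocery", "soda", "candy", "grocery", "fast_food", "soda"]
team2 = ["fast_food", "soda", "grocery", "soda", "candy", "grocery", "fast_food", "water"]
agreement = percent_agreement(team1, team2)   # 7 of 8 items match -> 87.5
```

Percent agreement is easy to interpret but, unlike Cohen's kappa, does not correct for chance agreement; the study reports the former.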

  2. Machine tools error characterization and compensation by on-line measurement of artifact

    NASA Astrophysics Data System (ADS)

    Wahid Khan, Abdul; Chen, Wuyi; Wu, Lili

    2009-11-01

    Most manufacturing machine tools are utilized for mass production or batch production with high accuracy under a deterministic manufacturing principle. Volumetric accuracy of machine tools depends on the positional accuracy of the cutting tool, probe or end effector relative to the workpiece in the workspace volume. In this research paper, a methodology is presented for volumetric calibration of machine tools by on-line measurement of an artifact or an object of a similar type. The machine tool geometric error characterization was carried out through a standard or an artifact having similar geometry to the mass production or batch production product. The artifact was measured at an arbitrary position in the volumetric workspace with a calibrated Renishaw touch trigger probe system. Positional errors were stored in a computer for compensation purposes, so that the manufacturing batch could then be run with compensated codes. This methodology was found quite effective for manufacturing high-precision components with greater dimensional accuracy and reliability. Calibration by on-line measurement makes it possible to improve the manufacturing process through the deterministic manufacturing principle; it was found efficient and economical, but is limited to the workspace or envelope surface of the measured artifact's geometry or profile.

  3. The Outdoor MEDIA DOT: The Development and Inter-Rater Reliability of a Tool Designed to Measure Food and Beverage Outlets and Outdoor Advertising

    PubMed Central

    Poulos, Natalie S.; Pasch, Keryn E.

    2015-01-01

    Few studies of the food environment have collected primary data, and even fewer have reported reliability of the tool used. This study focused on the development of an innovative electronic data collection tool used to document outdoor food and beverage (FB) advertising and establishments near 43 middle and high schools in the Outdoor MEDIA Study. Tool development used GIS based mapping, an electronic data collection form on handheld devices, and an easily adaptable interface to efficiently collect primary data within the food environment. For the reliability study, two teams of data collectors documented all FB advertising and establishments within one half-mile of six middle schools. Inter-rater reliability was calculated overall and by advertisement or establishment category using percent agreement. A total of 824 advertisements (n=233), establishment advertisements (n=499), and establishments (n=92) were documented (range=8–229 per school). Overall inter-rater reliability of the developed tool ranged from 69–89% for advertisements and establishments. Results suggest that the developed tool is highly reliable and effective for documenting the outdoor FB environment. PMID:26022774

  4. A data envelope analysis to assess factors affecting technical and economic efficiency of individual broiler breeder hens.

    PubMed

    Romero, L F; Zuidhof, M J; Jeffrey, S R; Naeima, A; Renema, R A; Robinson, F E

    2010-08-01

    This study evaluated the effect of feed allocation and energetic efficiency on the technical and economic efficiency of broiler breeder hens using the data envelope analysis methodology and quantified the effect of variables affecting technical efficiency. A total of 288 Ross 708 pullets were placed in individual cages at 16 wk of age and assigned to 1 of 4 feed allocation groups. Three of them had feed allocated on a group basis with divergent BW targets: standard, high (standard x 1.1), and low (standard x 0.9). The fourth group had feed allocated on an individual bird basis following the standard BW target. Birds were classified into 3 energetic efficiency categories: low, average, and high, based on estimated maintenance requirements. Technical efficiency considered saleable chicks as output and cumulative ME intake and time as inputs. Economic efficiency of the feed allocation treatments was analyzed under different cost scenarios. Birds with low feed allocation exhibited a lower technical efficiency (69.4%) than standard (72.1%), which reflected a reduced egg production rate. Feed allocation of the high treatment could have been reduced by 10% with the same chick production as the standard treatment. The low treatment exhibited reduced economic efficiency at greater capital costs, whereas the high treatment had reduced economic efficiency at greater feed costs. Hens of average energetic efficiency had a lower technical efficiency under the low feed allocation than under the standard. A 1% increment in estimated maintenance requirement changed technical efficiency by -0.23%, whereas a 1% increment in ME intake had a -0.47% effect. The negative relationship between technical efficiency and ME intake was counterbalanced by a positive correlation between ME intake and egg production. The negative relationship between technical efficiency and maintenance requirements was reinforced by a negative correlation between hen maintenance and egg production. 
Economic efficiency methodologies are effective tools to assess the economic effect of selection and flock management programs because biological, allocative, and economic factors can be independently analyzed.
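Data envelope analysis scores each decision-making unit (here, a hen) against a best-practice frontier by solving one linear program per unit. A minimal input-oriented CCR sketch with illustrative toy data (not the study's), using scipy:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency score of decision-making unit o.
    X: (n, m) array of inputs; Y: (n, s) array of outputs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimize theta
    # inputs:  X.T @ lam <= theta * x_o   ->  -x_o*theta + X.T @ lam <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # outputs: Y.T @ lam >= y_o           ->  -Y.T @ lam <= -y_o
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.hstack([np.zeros(m), -Y[o]])
    bounds = [(0.0, None)] * (n + 1)             # theta >= 0, lambda >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

# toy data: 3 hens, one input (feed energy) and one output (saleable chicks)
X = np.array([[2.0], [4.0], [8.0]])
Y = np.array([[2.0], [2.0], [4.0]])
scores = [dea_ccr_input(X, Y, o) for o in range(3)]   # hen 0 defines the frontier
```

A score of 1.0 marks an efficient unit; a score of 0.5 means the same output could, in principle, be produced with half the input, which is how the study derives statements like "feed allocation could have been reduced by 10%".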

  5. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools.

    PubMed

    Deshmukh, Rupesh K; Sonah, Humira; Bélanger, Richard R

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. 
Finally, a detailed catalog of computational tools and analytical pipelines is offered as a resource for AQP research.

  6. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools

    PubMed Central

    Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. 
Finally, a detailed catalog of computational tools and analytical pipelines is offered as a resource for AQP research. PMID:28066459

  7. High-Fidelity Multidisciplinary Design Optimization of Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    Martins, Joaquim R. R. A.; Kenway, Gaetan K. W.; Burdette, David; Jonsson, Eirikur; Kennedy, Graeme J.

    2017-01-01

    To evaluate new airframe technologies we need design tools based on high-fidelity models that consider multidisciplinary interactions early in the design process. The overarching goal of this NRA is to develop tools that enable high-fidelity multidisciplinary design optimization of aircraft configurations, and to apply these tools to the design of high aspect ratio flexible wings. We developed a geometry engine that is capable of quickly generating conventional and unconventional aircraft configurations, including the internal structure. This geometry engine features adjoint derivative computation for efficient gradient-based optimization. We also added overset capability to a computational fluid dynamics solver, complete with an adjoint implementation and semiautomatic mesh generation. We also developed an approach to constraining buffet and started the development of an approach for constraining flutter. On the applications side, we developed a new common high-fidelity model for aeroelastic studies of high aspect ratio wings. We performed optimal design trade-offs between fuel burn and aircraft weight for metal, conventional composite, and carbon nanotube composite wings. We also assessed a continuous morphing trailing edge technology applied to high aspect ratio wings. This research resulted in the publication of 26 manuscripts so far, and the developed methodologies were used in two other NRAs.

  8. Health economics, equity, and efficiency: are we almost there?

    PubMed

    Ferraz, Marcos Bosi

    2015-01-01

    Health care is a highly complex, dynamic, and creative sector of the economy. While health economics has to continue its efforts to improve its methods and tools to better inform decisions, the application needs to be aligned with the insights and models of other social sciences disciplines. Decisions may be guided by four concept models based on ethical and distributive justice: libertarian, communitarian, egalitarian, and utilitarian. The societal agreement on one model or a defined mix of models is critical to avoid inequity and unfair decisions in a public and/or private insurance-based health care system. The excess use of methods and tools without fully defining the basic goals and philosophical principles of the health care system and without evaluating the fitness of these measures to reaching these goals may not contribute to an efficient improvement of population health.

  9. Health economics, equity, and efficiency: are we almost there?

    PubMed Central

    Ferraz, Marcos Bosi

    2015-01-01

    Health care is a highly complex, dynamic, and creative sector of the economy. While health economics has to continue its efforts to improve its methods and tools to better inform decisions, the application needs to be aligned with the insights and models of other social sciences disciplines. Decisions may be guided by four concept models based on ethical and distributive justice: libertarian, communitarian, egalitarian, and utilitarian. The societal agreement on one model or a defined mix of models is critical to avoid inequity and unfair decisions in a public and/or private insurance-based health care system. The excess use of methods and tools without fully defining the basic goals and philosophical principles of the health care system and without evaluating the fitness of these measures to reaching these goals may not contribute to an efficient improvement of population health. PMID:25709481

  10. Bringing the light to high throughput screening: use of optogenetic tools for the development of recombinant cellular assays

    NASA Astrophysics Data System (ADS)

    Agus, Viviana; Di Silvio, Alberto; Rolland, Jean Francois; Mondini, Anna; Tremolada, Sara; Montag, Katharina; Scarabottolo, Lia; Redaelli, Loredana; Lohmer, Stefan

    2015-03-01

    The use of light-activated proteins represents a powerful tool to control biological processes with high spatial and temporal precision. These so-called "optogenetic" technologies have been successfully validated in many recombinant systems, and have been widely applied to the study of cellular mechanisms in intact tissues or behaving animals; to do that, complex, high-intensity, often home-made instrumentation has been developed to achieve the optimal power and precision of light stimulation. In our study we sought to determine whether this optical modulation can also be obtained in a miniaturized format, such as a 384-well plate, using the instrumentation normally dedicated to fluorescence analysis in High Throughput Screening (HTS) activities, such as the FLIPR (Fluorometric Imaging Plate Reader) instrument. We successfully generated optogenetic assays for the study of different ion channel targets: the CaV1.3 calcium channel was modulated by the light-activated Channelrhodopsin-2, the HCN2 cyclic nucleotide gated (CNG) channel was modulated by the light-activated bPAC adenylyl cyclase, and finally the genetically encoded voltage indicator ArcLight was efficiently used to measure potassium, sodium or chloride channel activity. Our results showed that stable, robust and miniaturized cellular assays can be developed using different optogenetic tools, and efficiently modulated by the FLIPR instrument LEDs in a 384-well format. The spatial and temporal resolution delivered by this technology might greatly benefit the early stages of drug discovery, leading to the identification of more physiological and effective drug molecules.

  11. Effect of multiple forming tools on geometrical and mechanical properties in incremental sheet forming

    NASA Astrophysics Data System (ADS)

    Wernicke, S.; Dang, T.; Gies, S.; Tekkaya, A. E.

    2018-05-01

    The trend toward a higher variety of products requires economical manufacturing processes suitable for the production of prototypes and small batches. In the case of complex hollow-shaped parts, single point incremental forming (SPIF) represents a highly flexible process, but this flexibility comes at the cost of very long process times. To decrease the process time, a new incremental forming approach with multiple forming tools is investigated. The influence of two incremental forming tools on the resulting mechanical and geometrical component properties, compared to SPIF, is presented. Sheets made of EN AW-1050A were formed into frustums of a pyramid using different tool-path strategies, and several variations of the tool-path strategy are analyzed. A time saving between 40% and 60% was observed, depending on the tool path and the radii of the forming tools, while the mechanical properties remained unchanged. This knowledge can increase the cost efficiency of incremental forming processes.

  12. Identification of high-efficiency 3'GG gRNA motifs in indexed FASTA files with ngg2.

    PubMed

    Roberson, Elisha D O

    CRISPR/Cas9 is emerging as one of the most-used methods of genome modification in organisms ranging from bacteria to human cells. However, the efficiency of editing varies tremendously from site to site. A recent report identified a novel motif, called the 3'GG motif, which substantially increases the efficiency of editing at all sites tested in C. elegans, and highlighted that previously published gRNAs with high editing efficiency also had this motif. I designed a Python command-line tool, ngg2, to identify 3'GG gRNA sites from indexed FASTA files. As a proof of concept, I screened for these motifs in six model genomes: Saccharomyces cerevisiae, Caenorhabditis elegans, Drosophila melanogaster, Danio rerio, Mus musculus, and Homo sapiens. I also scanned the genomes of pig (Sus scrofa) and African elephant (Loxodonta africana) to demonstrate the utility in non-model organisms. I identified more than 60 million single-match 3'GG motifs in these genomes. Greater than 61% of all protein-coding genes in the reference genomes had at least one unique 3'GG gRNA site overlapping an exon. In particular, more than 96% of mouse and 93% of human protein-coding genes have at least one unique, overlapping 3'GG gRNA. These identified sites can be used as a starting point in gRNA selection, and the ngg2 tool provides an important ability to identify 3'GG editing sites in any species with an available genome sequence.
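
    The motif search itself is straightforward to prototype: a 3'GG site is a 20-nt protospacer whose final two bases are GG, immediately followed by an NGG PAM, i.e. a 23-nt genomic pattern ending GG-N-GG. The sketch below is a conceptual illustration only, not the ngg2 implementation; the function name and test sequence are made up for the example. It scans the forward strand with an overlap-tolerant regular expression:

```python
import re

# 3'GG motif: 20-nt protospacer ending in GG, followed by an NGG PAM,
# i.e. a 23-nt site matching N{18} GG N GG on this strand.
MOTIF = re.compile(r"(?=([ACGT]{18}GG[ACGT]GG))")

def find_3gg_sites(seq):
    """Return (0-based start, 23-nt site) pairs on the forward strand.
    The lookahead lets overlapping sites all be reported."""
    seq = seq.upper()
    return [(m.start(), m.group(1)) for m in MOTIF.finditer(seq)]

sites = find_3gg_sites("ttacgtacgtacgtacgtacGGtGGcc")  # one site at offset 2
```

    A complete tool would also scan the reverse complement and, as ngg2 does, operate on an indexed FASTA file rather than an in-memory string.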

  13. Imaging Carbon Nanotubes in High Performance Polymer Composites via Magnetic Force Microscope

    NASA Technical Reports Server (NTRS)

    Lillehei, Peter T.; Park, Cheol; Rouse, Jason H.; Siochi, Emilie J.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Application of carbon nanotubes as reinforcement in structural composites is dependent on the efficient dispersion of the nanotubes in a high performance polymer matrix. The characterization of such dispersion is limited by the lack of available tools to visualize the quality of the matrix/carbon nanotube interaction. The work reported herein demonstrates the use of magnetic force microscopy (MFM) as a promising technique for characterizing the dispersion of nanotubes in a high performance polymer matrix.

  14. Compliant energy and momentum conservation in NEGF simulation of electron-phonon scattering in semiconductor nano-wire transistors

    NASA Astrophysics Data System (ADS)

    Barker, J. R.; Martinez, A.; Aldegunde, M.

    2012-05-01

    The modelling of spatially inhomogeneous silicon nanowire field-effect transistors has benefited from powerful simulation tools built around the Keldysh formulation of non-equilibrium Green function (NEGF) theory. The methodology is highly efficient for situations where the self-energies are diagonal (local) in space coordinates, and it has thus been common practice to adopt diagonality (locality) approximations. We demonstrate here that the scattering kernel that controls the self-energies for electron-phonon interactions is generally non-local on the scale of at least a few lattice spacings (and thus within the spatial scale of features in extreme nano-transistors), and for polar optical phonon-electron interactions may be very much longer ranged. It is shown that the diagonality approximation strongly underestimates the scattering rates for scattering on polar optical phonons. This is an unexpected problem in silicon devices, but it occurs due to strong polar SO phonon-electron interactions extending into a narrow silicon channel surrounded by a high-k dielectric in wrap-round gate devices. Since dissipative inelastic scattering is already a serious problem for highly confined devices, it is concluded that new algorithms need to be forthcoming to provide appropriate and efficient NEGF tools.

  15. Collaborative Aviation Weather Statement - An Impact-based Decision Support Tool

    NASA Astrophysics Data System (ADS)

    Blondin, Debra

    2016-04-01

    Historically, convection has caused the highest number of air traffic constraints on the United States National Airspace System (NAS). Increased NAS predictability allows traffic flow managers to more effectively initiate, amend or terminate planned or active traffic management initiatives, resulting in more efficient use of available airspace. A Collaborative Aviation Weather Statement (CAWS) is an impact-based decision support tool used for the timely delivery of high-confidence, high-relevance aviation convective weather forecasts to air traffic managers. The CAWS is a graphical and textual forecast produced by a collaborative team of meteorologists from the Aviation Weather Center (AWC), Center Weather Service Units, and airlines to bring attention to high-impact areas of thunderstorms. The CAWS addresses thunderstorm initiation or movement into the airports having the highest volume of traffic or into traffic-sensitive jet routes. These statements are assessed by planners at the Federal Aviation Administration's (FAA) Air Route Traffic Control Centers and are used for planning traffic management initiatives to balance air traffic flow across the United States. The FAA and the airline industry use the CAWS to plan, manage, and execute operations in the NAS, thereby improving system efficiency and safety and saving money for industry and the traveling public.

  16. Addressing Curse of Dimensionality in Sensitivity Analysis: How Can We Handle High-Dimensional Problems?

    NASA Astrophysics Data System (ADS)

    Safaei, S.; Haghnegahdar, A.; Razavi, S.

    2016-12-01

    Complex environmental models are now the primary tool for informing decision makers about the current or future management of environmental resources under climate and environmental change. These complex models often contain a large number of parameters that must be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding of model behavior, but also helps reduce the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding when generating robust and stable sensitivity metrics over the entire model response surface. Recently, a novel global sensitivity analysis method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in saving computational burden when applied to systems with very large numbers of input factors (on the order of 100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method to reduce problem dimensionality by identifying important vs. unimportant input factors.
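
    The intuition behind variogram-based sensitivity can be sketched in a few lines: for each input factor i, estimate the directional variogram γ_i(h) = ½·E[(f(x + h·e_i) − f(x))²]; factors whose perturbation changes the response more have larger variograms. The NumPy snippet below is a minimal illustration of this concept only, not the published VARS algorithm (which integrates variograms across perturbation scales and samples the space far more carefully); all names and the test function are invented for the example:

```python
import numpy as np

def directional_variogram(f, x, dim, h):
    """Crude estimate of the directional variogram
    gamma_dim(h) = 0.5 * E[(f(x + h*e_dim) - f(x))**2]."""
    step = np.zeros(x.shape[1])
    step[dim] = h
    return 0.5 * np.mean((f(x + step) - f(x)) ** 2)

rng = np.random.default_rng(0)
f = lambda x: 10.0 * x[:, 0] + 0.1 * x[:, 1]   # factor 0 dominates
x = rng.uniform(0.0, 1.0, size=(500, 2))
g0 = directional_variogram(f, x, 0, 0.1)   # large: sensitive factor
g1 = directional_variogram(f, x, 1, 0.1)   # tiny: unimportant factor
```

    For this linear test function the dominant factor's variogram is exactly 0.5·(10h)², so ranking the factors by γ immediately separates important from unimportant inputs.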

  17. Benchmarking and Self-Assessment in the Wine Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galitsky, Christina; Radspieler, Anthony; Worrell, Ernst

    2005-12-01

    Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. The lack of knowledge of energy efficiency opportunities provides an important barrier to improving efficiency. Benchmarking programs in the U.S. and abroad have been shown to improve knowledge of the energy performance of industrial facilities and buildings and to fuel energy management practices. Benchmarking provides a fair way to compare the energy intensity of plants, while accounting for structural differences (e.g., the mix of products produced, climate conditions) between different facilities. In California, the winemaking industry is not only one of the economic pillars of the economy; it is also a large energy consumer, with a considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed the first benchmarking tool for the California wine industry, called "BEST (Benchmarking and Energy and water Savings Tool) Winery". BEST Winery enables a winery to compare its energy efficiency to a best-practice reference winery. Besides overall performance, the tool enables the user to evaluate the impact of implementing efficiency measures. The tool facilitates strategic planning of efficiency measures, based on the estimated impact of the measures, their costs and savings. The tool will raise awareness of current energy intensities and offer an efficient way to evaluate the impact of future efficiency measures.

  18. Determination of high-strength materials diamond grinding rational modes

    NASA Astrophysics Data System (ADS)

    Arkhipov, P. V.; Lobanov, D. V.; Rychkov, D. A.; Yanyushkin, A. S.

    2018-03-01

    Methods for the abrasive processing of high-strength materials are analyzed, making it possible to determine the necessary directions and prospects for the development of combined shaping methods. The need to use metal-bonded diamond abrasive tools in combination with another kind of energy is noted as a way to improve processing efficiency and reduce the complexity of operations. A set of experiments was performed to reveal the relative importance of the mechanical and electrical components of the cutting regimes for the cutting ability of diamond tools, and to reduce the specific consumption of the abrasive wheel, one of the important economic indicators of the process. It is established that combined diamond grinding with simultaneous continuous dressing of the abrasive wheel increases the cutting ability of metal-bonded diamond abrasive tools when processing high-strength materials by an average of 30% compared to conventional diamond grinding. Particular recommendations on the choice of technological factors are developed for specific production problems.

  19. Will we exceed 50% efficiency in photovoltaics?

    NASA Astrophysics Data System (ADS)

    Luque, Antonio

    2011-08-01

    Solar energy is the most abundant and reliable source of energy we have to meet the multi-terawatt challenge we are facing. Although huge, this resource is relatively dispersed, so high conversion efficiency is probably necessary for cost effectiveness. Solar cell efficiencies above 40% have been achieved with multijunction (MJ) solar cells; these achievements are described here. Possible paths for improvement are hinted at, including third-generation photovoltaics concepts. It is concluded that the target of 50% will very likely be achieved eventually. This high efficiency requires operating under concentrated sunlight, partly because concentration helps increase the efficiency but mainly because the cost of the sophisticated cells needed can only be paid by extracting as much electric power from each cell as possible. The optical challenges associated with the concentrator optics, and the tools for overcoming them, in particular non-imaging optics, are briefly discussed and the results and trends are described. Optical efficiency over 90% will probably be possible in the future, which would lead to a module efficiency of 45%. The manufacturing of a concentrator has to be addressed at three levels of integration: module, array, and photovoltaic (PV) subfield. The PV plant as a whole is very similar to a flat-module PV plant with two-axis tracking. At the module level, the development of tools for easy manufacturing and quality control is an important topic; furthermore, modules can accommodate, in different positions, cells with different spectral sensitivities, complementing the effort in manufacturing MJ cells. At the array level, a proper definition of the nameplate watts is under discussion, since the diffuse light is not used. The cost of installing arrays in the field can be greatly reduced by self-aligning tracking control strategies. At the subfield level, self-shadowing of the arrays causes CPV subfields to be sparsely packed, leading to a ground efficiency in the range of 10% that in some cases will be below that of fixed modules with much lower cell efficiency. All this taken into account, High Concentration PV (HCPV) has the opportunity to become the cheapest of the PV technologies and beat the prevalent electricity generation technologies. Of course, the way will be paved with challenges, and success is not guaranteed.

  20. CRISPR/Cas9 nuclease-mediated gene knock-in in bovine-induced pluripotent cells.

    PubMed

    Heo, Young Tae; Quan, Xiaoyuan; Xu, Yong Nan; Baek, Soonbong; Choi, Hwan; Kim, Nam-Hyung; Kim, Jongpil

    2015-02-01

    Efficient and precise genetic engineering in livestock such as cattle holds great promise in agriculture and biomedicine. However, techniques that generate pluripotent stem cells, as well as reliable tools for gene targeting in livestock, are still inefficient and thus not routinely used. Here, we report highly efficient gene targeting in the bovine genome using bovine pluripotent cells and clustered regularly interspaced short palindromic repeat (CRISPR)/Cas9 nuclease. First, we generated induced pluripotent stem cells (iPSCs) from bovine somatic fibroblasts by ectopic expression of the Yamanaka factors combined with GSK3β and MEK inhibitor (2i) treatment. We observed that these bovine iPSCs are highly similar to naïve pluripotent stem cells with regard to gene expression and developmental potential in teratomas. Moreover, a CRISPR/Cas9 nuclease specific for the bovine NANOG locus showed highly efficient editing of the bovine genome in bovine iPSCs and embryos. To conclude, CRISPR/Cas9 nuclease-mediated homologous recombination targeting in bovine pluripotent cells is an efficient gene editing method that can be used to generate transgenic livestock in the future.

  1. Investigation of surface finishing of carbon based coated tools for dry deep drawing of aluminium alloys

    NASA Astrophysics Data System (ADS)

    Steiner, J.; Andreas, K.; Merklein, M.

    2016-11-01

    Global trends like growing environmental awareness and demand for resource efficiency motivate the abandonment of lubricants in metal forming. However, dry forming evokes increased friction and wear. In particular, dry deep drawing of aluminum alloys leads to intensive interaction between tool and workpiece due to the high adhesion tendency of aluminum. One approach to improving the tribological behavior is the application of carbon-based coatings, which are characterized by high wear resistance. In order to investigate the potential of carbon-based coatings for dry deep drawing, the friction and wear behavior of different coating compositions is evaluated in strip drawing tests. This setup is used to model the tribological conditions in the flange area of deep drawing operations. The tribological behavior of tetrahedral amorphous (ta-C) and hydrogenated amorphous carbon coatings with and without tungsten modification (a-C:H:W, a-C:H) is investigated, and the influence of tool topography is analyzed by applying different surface finishes. The results show reduced friction with decreased roughness for coated tools. Besides tool topography, the coating type determines the tribological conditions. Smooth tools with ta-C and a-C:H coatings reveal low friction and prevent adhesive wear; in contrast, smooth a-C:H:W coated tools lead only to slight improvement compared to rough, uncoated specimens.

  2. Developing an eLearning tool formalizing in YAWL the guidelines used in a transfusion medicine service.

    PubMed

    Russo, Paola; Piazza, Miriam; Leonardi, Giorgio; Roncoroni, Layla; Russo, Carlo; Spadaro, Salvatore; Quaglini, Silvana

    2012-01-01

    Blood transfusion is a complex activity subject to a high risk of potentially fatal errors. The development and application of computer-based systems could help reduce the error rate, playing a fundamental role in improving the quality of care. This poster presents an eLearning tool, currently under development, that formalizes the guidelines of the transfusion process. The system, implemented in YAWL (Yet Another Workflow Language), will be used to train personnel in order to improve the efficiency of care and to reduce errors.

  3. Chemiluminescence analyzer of NOx as a high-throughput screening tool in selective catalytic reduction of NO

    PubMed Central

    Oh, Kwang Seok; Woo, Seong Ihl

    2011-01-01

    A chemiluminescence-based analyzer of NOx gas species has been applied for high-throughput screening of a library of catalytic materials. The applicability of the commercial NOx analyzer as a rapid screening tool was evaluated using selective catalytic reduction of NO gas. A library of 60 binary alloys composed of Pt and Co, Zr, La, Ce, Fe or W on an Al2O3 substrate was tested for the efficiency of NOx removal using a home-built 64-channel parallel and sequential tubular reactor. The NOx concentrations measured by the NOx analyzer agreed well with the results obtained using micro gas chromatography for a reference catalyst consisting of 1 wt% Pt on γ-Al2O3. Most alloys showed high efficiency at 275 °C, which is typical of Pt-based catalysts for selective catalytic reduction of NO. Screening with the NOx analyzer allowed Pt–Ce(X) (X=1–3) and Pt–Fe(2) to be selected as the optimal catalysts for NOx removal: 73% NOx conversion was achieved with the Pt–Fe(2) alloy, which was much better than the results for the reference catalyst and the other library alloys. This study demonstrates a sequential high-throughput method for the practical evaluation of catalysts for the selective reduction of NO. PMID:27877438

  4. A General-purpose Framework for Parallel Processing of Large-scale LiDAR Data

    NASA Astrophysics Data System (ADS)

    Li, Z.; Hodgson, M.; Li, W.

    2016-12-01

    Light detection and ranging (LiDAR) technologies have proven efficient for quickly obtaining very detailed Earth surface data over large spatial extents. Such data are important for Earth and ecological sciences and for natural-disaster and environmental applications. However, handling LiDAR data poses grand geoprocessing challenges due to its data and computational intensity. Previous studies achieved notable success in parallel processing of LiDAR data to address these challenges, but they either relied on high-performance computers and specialized hardware (GPUs) or focused mostly on finding customized solutions for specific algorithms. We developed a general-purpose scalable framework coupled with a sophisticated data decomposition and parallelization strategy to efficiently handle big LiDAR data. Specifically, 1) a tile-based spatial index is proposed to manage big LiDAR data in the scalable and fault-tolerant Hadoop distributed file system, 2) two spatial decomposition techniques are developed to enable efficient parallelization of different types of LiDAR processing tasks, and 3) by coupling existing LiDAR processing tools with Hadoop, the framework is able to conduct a variety of LiDAR data processing tasks in parallel in a highly scalable distributed computing environment. The performance and scalability of the framework were evaluated with a series of experiments conducted on a real LiDAR dataset using a proof-of-concept prototype system. The results show that the proposed framework 1) is able to handle massive LiDAR data more efficiently than standalone tools, and 2) provides almost linear scalability in terms of either increased workload (data volume) or increased computing nodes with both spatial decomposition strategies. We believe the proposed framework provides a valuable reference for developing a collaborative cyberinfrastructure for processing big Earth science data in a highly scalable environment.
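
    The core idea of a tile-based spatial index can be illustrated in a few lines of plain Python (a conceptual sketch only, not the authors' Hadoop implementation; the function name and sample points are hypothetical): each point is assigned to a square tile by integer division of its coordinates, so each tile becomes an independent unit of parallel work.

```python
import math
from collections import defaultdict

def tile_points(points, tile_size):
    """Group (x, y, z) points into square tiles keyed by grid indices;
    each tile can then be processed by an independent worker."""
    tiles = defaultdict(list)
    for x, y, z in points:
        key = (math.floor(x / tile_size), math.floor(y / tile_size))
        tiles[key].append((x, y, z))
    return dict(tiles)

pts = [(0.5, 0.5, 10.0), (1.5, 0.2, 11.0), (0.7, 0.9, 9.5)]
tiles = tile_points(pts, 1.0)   # two tiles; tile (0, 0) holds two points
```

    In a distributed setting the tile key would double as the partition key, so that all points in one tile land on the same node; neighborhood operations then only need to exchange points along tile borders.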

  5. Bright high-repetition-rate source of narrowband extreme-ultraviolet harmonics beyond 22 eV

    PubMed Central

    Wang, He; Xu, Yiming; Ulonska, Stefan; Robinson, Joseph S.; Ranitovic, Predrag; Kaindl, Robert A.

    2015-01-01

    Novel table-top sources of extreme-ultraviolet light based on high-harmonic generation yield unique insight into the fundamental properties of molecules, nanomaterials or correlated solids, and enable advanced applications in imaging or metrology. Extending high-harmonic generation to high repetition rates portends great experimental benefits, yet efficient extreme-ultraviolet conversion of correspondingly weak driving pulses is challenging. Here, we demonstrate a highly-efficient source of femtosecond extreme-ultraviolet pulses at 50-kHz repetition rate, utilizing the ultraviolet second-harmonic focused tightly into Kr gas. In this cascaded scheme, a photon flux beyond ≈3 × 1013 s−1 is generated at 22.3 eV, with 5 × 10−5 conversion efficiency that surpasses similar harmonics directly driven by the fundamental by two orders-of-magnitude. The enhancement arises from both wavelength scaling of the atomic dipole and improved spatio-temporal phase matching, confirmed by simulations. Spectral isolation of a single 72-meV-wide harmonic renders this bright, 50-kHz extreme-ultraviolet source a powerful tool for ultrafast photoemission, nanoscale imaging and other applications. PMID:26067922

  6. Learnings From the Pilot Implementation of Mobile Medical Milestones Application.

    PubMed

    Page, Cristen P; Reid, Alfred; Coe, Catherine L; Carlough, Martha; Rosenbaum, Daryl; Beste, Janalynn; Fagan, Blake; Steinbacher, Erika; Jones, Geoffrey; Newton, Warren P

    2016-10-01

    Implementation of the educational milestones benefits from mobile technology that facilitates ready assessments in the clinical environment. We developed a point-of-care resident evaluation tool, the Mobile Medical Milestones Application (M3App), and piloted it in 8 North Carolina family medicine residency programs. We sought to examine variations we found in the use of the tool across programs and explored the experiences of program directors, faculty, and residents to better understand the perceived benefits and challenges of implementing the new tool. Residents and faculty completed presurveys and postsurveys about the tool and the evaluation process in their program. Program directors were interviewed individually. Interviews and open-ended survey responses were analyzed and coded using the constant comparative method, and responses were tabulated under themes. Common perceptions included increased data collection, enhanced efficiency, and increased perceived quality of the information gathered with the M3App. Residents appreciated the timely, high-quality feedback they received. Faculty reported becoming more comfortable with the tool over time, and a more favorable evaluation of the tool was associated with higher utilization. Program directors reported improvements in faculty knowledge of the milestones and resident satisfaction with feedback. Faculty and residents credited the M3App with improving the quality and efficiency of resident feedback. Residents appreciated the frequency, proximity, and specificity of feedback, and faculty reported the app improved their familiarity with the milestones. Implementation challenges included lack of a physician champion and competing demands on faculty time.

  7. Systems-Level Synthetic Biology for Advanced Biofuel Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruffing, Anne; Jensen, Travis J.; Strickland, Lucas Marshall

    2015-03-01

    Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large scale production. New genetic tools and high throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolisms for enhanced biofuel production. Towards this goal, this project advances the development of a multiple promoter replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple promoter replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.

  8. pKAMA-ITACHI Vectors for Highly Efficient CRISPR/Cas9-Mediated Gene Knockout in Arabidopsis thaliana

    PubMed Central

    2017-01-01

    The CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats/CRISPR-associated 9) system is widely used as a tool for genome engineering in various organisms. A complex consisting of Cas9 and single guide RNA (sgRNA) induces a DNA double-strand break in a sequence-specific manner, resulting in knockout. Several binary vectors for CRISPR/Cas9 in plants have been reported, but low efficiency remains a problem. Here, we present a newly developed, highly efficient CRISPR/Cas9 vector for Arabidopsis thaliana, pKAMA-ITACHI Red (pKIR), harboring the RIBOSOMAL PROTEIN S5 A (RPS5A) promoter to drive Cas9. The RPS5A promoter maintains high constitutive expression at all developmental stages starting from the egg cell and including meristematic cells. Even in the T1 generation, pKIR induced null phenotypes for some genes: PHYTOENE DESATURASE 3 (PDS3), AGAMOUS (AG) and DUO POLLEN 1 (DUO1). Mutations induced by pKIR were transmitted through the germ line of the T1 generation. Surprisingly, in some lines, 100% of the T2 plants had the adh1 (ALCOHOL DEHYDROGENASE 1) null phenotype, indicating that pKIR strongly induced heritable mutations. Cas9-free T2 mutant plants were obtained by removing T2 seeds expressing a fluorescent marker carried on pKIR. Our results suggest that the pKIR system is a powerful molecular tool for genome engineering in Arabidopsis. PMID:27856772

  9. Breast cancer screening (BCS) chart: a basic and preliminary model for making screening mammography more productive and efficient.

    PubMed

    Poorolajal, Jalal; Akbari, Mohammad Esmaeil; Ziaee, Fatane; Karami, Manoochehr; Ghoncheh, Mahshid

    2017-05-15

    The breast cancer screening (BCS) chart is suggested as a basic and preliminary tool to improve the efficiency of screening mammography. We conducted this case-control study in 2016 and enrolled 1422 women aged 30-75 years, including 506 women with breast cancer (cases) and 916 women without breast cancer (controls). We developed the BCS chart using a multiple logistic regression analysis, combining the risk factors for breast cancer to predict an individual's risk of the disease. We then stratified and colored the predicted risk probabilities as follows: <5% (green), 5-9% (yellow), 10-14% (orange), 15-19% (red), 20-24% (brown) and ≥25% (black). The BCS chart provides the risk probability of breast cancer based on age, body mass index, late menopause, having a benign breast disease, and a positive family history of breast cancer among first-degree or second/third-degree relatives. According to this chart, an individual can be classified into a low-risk (green), medium-risk (yellow and orange), high-risk (red and brown) or very-high-risk (black) category for breast cancer. This chart is a flexible and easy-to-use tool that can detect high-risk subjects and make the screening program more efficient and productive.
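
    The color-banding step of such a chart is simple to express in code. The sketch below is illustrative only: the function name is hypothetical, and the real chart derives the probability from a multiple logistic regression over the listed risk factors; here we only map an already-predicted risk percentage onto the bands given in the abstract.

```python
def bcs_color(risk_pct):
    """Map a predicted breast-cancer risk (in percent) onto the BCS
    chart bands: <5 green, 5-9 yellow, 10-14 orange, 15-19 red,
    20-24 brown, >=25 black."""
    bands = [(5, "green"), (10, "yellow"), (15, "orange"),
             (20, "red"), (25, "brown")]
    for upper, color in bands:
        if risk_pct < upper:
            return color
    return "black"
```

    The green band then marks low risk, yellow/orange medium risk, red/brown high risk, and black very high risk, matching the classification described above.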

  10. Technology Combination Analysis Tool (TCAT) for Active Debris Removal

    NASA Astrophysics Data System (ADS)

    Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.

    2013-08-01

    This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. In order to find the most performant Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft and to plan ADR campaigns for removing large debris. Two types of architecture are considered efficient: the Chaser (single-debris spacecraft) and the Mothership/Kits (multiple-debris spacecraft); both are able to perform controlled re-entry. The tool includes modules to optimise the launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power and structure subsystems are sized by the tool using high-level parametric models, whilst the others are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.

  11. Micro electrical discharge milling using deionized water as a dielectric fluid

    NASA Astrophysics Data System (ADS)

    Chung, Do Kwan; Kim, Bo Hyun; Chu, Chong Nam

    2007-05-01

    In electrical discharge machining, the dielectric fluid is an important factor affecting machining characteristics. Generally, kerosene and deionized water have been used as dielectric fluids. In micro electrical discharge milling, which uses a micro electrode as a tool, wear of the tool electrode decreases the machining accuracy; however, the use of deionized water instead of kerosene can reduce tool wear and increase the machining speed. This paper investigates micro electrical discharge milling using deionized water. Deionized water with high resistivity was used to minimize the machining gap. Machining characteristics such as tool wear, machining gap and machining rate were investigated as functions of the resistivity of the deionized water. As the resistivity of the deionized water decreased, tool wear was reduced, but the machining gap increased due to electrochemical dissolution. Micro hemispheres were machined to compare machining efficiency between the two dielectric fluids, kerosene and deionized water.

  12. LittleQuickWarp: an ultrafast image warping tool.

    PubMed

    Qu, Lei; Peng, Hanchuan

    2015-02-01

    Warping images into a standard coordinate space is critical for many image computing related tasks. However, for multi-dimensional and high-resolution images, an accurate warping operation itself is often very expensive in terms of computer memory and computational time. For high-throughput image analysis studies such as brain mapping projects, it is desirable to have high performance image warping tools that are compatible with common image analysis pipelines. In this article, we present LittleQuickWarp, a swift and memory efficient tool that boosts 3D image warping performance dramatically and at the same time has high warping quality similar to the widely used thin plate spline (TPS) warping. Compared to the TPS, LittleQuickWarp can improve the warping speed 2-5 times and reduce the memory consumption 6-20 times. We have implemented LittleQuickWarp as an Open Source plug-in program on top of the Vaa3D system (http://vaa3d.org). The source code and a brief tutorial can be found in the Vaa3D plugin source code repository. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis

    NASA Astrophysics Data System (ADS)

    Nass, Andrea; van Gasselt, Stephan

    2015-04-01

    Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications, dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such basic tools can be combined into complex analysis tools, e.g. in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g. ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data; geologic mapping data can also be stored and accessed more efficiently using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness estimates are two common concepts of particular interest, and for both a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on extracting landing-site characteristics using established criteria. We provide working examples and particularly focus on the concepts of terrain roughness as interpreted in geomorphology and engineering studies.
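    As one concrete (and deliberately simple) instance of the roughness concepts discussed above, detrended RMS roughness over a moving window can be sketched as follows. The window size and the plane-fit definition are illustrative choices, not the toolset's fixed ones.

```python
import numpy as np

def plane_detrended_roughness(dem, win=3):
    """RMS deviation from a least-squares plane in each win x win window.

    One common geomorphological roughness definition; engineering studies
    often prefer other measures (e.g. slope variability).
    """
    r = np.full(dem.shape, np.nan)
    h = win // 2
    ny, nx = dem.shape
    yy, xx = np.mgrid[0:win, 0:win]
    # Design matrix for a plane z = a + b*x + c*y over the window.
    G = np.column_stack([np.ones(win * win), xx.ravel(), yy.ravel()])
    for i in range(h, ny - h):
        for j in range(h, nx - h):
            z = dem[i - h:i + h + 1, j - h:j + h + 1].ravel()
            coef, *_ = np.linalg.lstsq(G, z, rcond=None)
            r[i, j] = np.sqrt(np.mean((z - G @ coef) ** 2))
    return r
```

    A perfectly planar DEM yields (numerically) zero roughness everywhere, which is the sanity check distinguishing roughness from slope.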

  14. MicroRNAs associated with the efficacy of photodynamic therapy in biliary tract cancer cell lines.

    PubMed

    Wagner, Andrej; Mayr, Christian; Bach, Doris; Illig, Romana; Plaetzer, Kristjan; Berr, Frieder; Pichler, Martin; Neureiter, Daniel; Kiesslich, Tobias

    2014-11-05

    Photodynamic therapy (PDT) is a palliative treatment option for unresectable hilar biliary tract cancer (BTC) showing a considerable benefit for survival and quality of life with few side effects. Currently, factors determining the cellular response of BTC cells towards PDT are unknown. Due to their multifaceted nature, microRNAs (miRs) are a promising analyte to investigate the cellular mechanisms following PDT. For two photosensitizers, Photofrin® and Foscan®, the phototoxicity was investigated in eight BTC cell lines. Each cell line (untreated) was profiled for expression of n=754 miRs using TaqMan® Array Human MicroRNA Cards. Statistical analysis and bioinformatic tools were used to identify miRs associated with PDT efficiency and their putative targets, respectively. Twenty miRs correlated significantly with either high or low PDT efficiency. PDT was particularly effective in cells with high levels of clustered miRs 25-93*-106b and (in the case of miR-106b) a phenotype characterized by high expression of the mesenchymal marker vimentin and high proliferation (cyclinD1 and Ki67 expression). Insensitivity towards PDT was associated with high miR-200 family expression and (for miR-cluster 200a/b-429) expression of differentiation markers Ck19 and Ck8/18. Predicted and validated downstream targets indicate plausible involvement of miRs 20a*, 25, 93*, 130a, 141, 200a, 200c and 203 in response mechanisms to PDT, suggesting that targeting these miRs could improve susceptibility to PDT in insensitive cell lines. Taken together, the miRNome pattern may provide a novel tool for predicting the efficiency of PDT and, following appropriate functional verification, may subsequently allow for optimization of the PDT protocol.

  15. MicroRNAs Associated with the Efficacy of Photodynamic Therapy in Biliary Tract Cancer Cell Lines

    PubMed Central

    Wagner, Andrej; Mayr, Christian; Bach, Doris; Illig, Romana; Plaetzer, Kristjan; Berr, Frieder; Pichler, Martin; Neureiter, Daniel; Kiesslich, Tobias

    2014-01-01

    Photodynamic therapy (PDT) is a palliative treatment option for unresectable hilar biliary tract cancer (BTC) showing a considerable benefit for survival and quality of life with few side effects. Currently, factors determining the cellular response of BTC cells towards PDT are unknown. Due to their multifaceted nature, microRNAs (miRs) are a promising analyte to investigate the cellular mechanisms following PDT. For two photosensitizers, Photofrin® and Foscan®, the phototoxicity was investigated in eight BTC cell lines. Each cell line (untreated) was profiled for expression of n = 754 miRs using TaqMan® Array Human MicroRNA Cards. Statistical analysis and bioinformatic tools were used to identify miRs associated with PDT efficiency and their putative targets, respectively. Twenty miRs correlated significantly with either high or low PDT efficiency. PDT was particularly effective in cells with high levels of clustered miRs 25-93*-106b and (in the case of miR-106b) a phenotype characterized by high expression of the mesenchymal marker vimentin and high proliferation (cyclinD1 and Ki67 expression). Insensitivity towards PDT was associated with high miR-200 family expression and (for miR-cluster 200a/b-429) expression of differentiation markers Ck19 and Ck8/18. Predicted and validated downstream targets indicate plausible involvement of miRs 20a*, 25, 93*, 130a, 141, 200a, 200c and 203 in response mechanisms to PDT, suggesting that targeting these miRs could improve susceptibility to PDT in insensitive cell lines. Taken together, the miRNome pattern may provide a novel tool for predicting the efficiency of PDT and, following appropriate functional verification, may subsequently allow for optimization of the PDT protocol. PMID:25380521

  16. Large-scale fabrication of micro-lens array by novel end-fly-cutting-servo diamond machining.

    PubMed

    Zhu, Zhiwei; To, Suet; Zhang, Shaojian

    2015-08-10

    Fast/slow tool servo (FTS/STS) diamond turning is a very promising technique for the generation of micro-lens arrays (MLAs). However, it remains a challenge to generate MLAs at large scale due to certain inherent limitations of this technique. In the present study, a novel ultra-precision diamond cutting method, termed the end-fly-cutting-servo (EFCS) system, is adopted and investigated for large-scale generation of MLAs. After a detailed discussion of its characteristic advantages for processing MLAs, an optimal toolpath generation strategy for the EFCS is developed with consideration of the geometry and installation pose of the diamond tool. A typical aspheric MLA over a large area is experimentally fabricated, and the resulting form accuracy, surface micro-topography, and machining efficiency are critically investigated. The results indicate that an MLA with homogeneous quality over the whole area is obtained. In addition, high machining efficiency, an extremely small number of toolpath control points, and optimal use of the machine tool's dynamics throughout cutting are achieved simultaneously.
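    For reference, the target geometry in such experiments is conventionally described by the standard aspheric sag equation. The sketch below (hypothetical parameters, Python) builds a lenslet-array height map of the kind an EFCS toolpath must reproduce; it is an illustration of the surface description, not the paper's toolpath algorithm.

```python
import numpy as np

def aspheric_sag(r, c, k):
    """Standard aspheric sag (conic term only): c = 1/R curvature, k = conic constant."""
    return c * r**2 / (1 + np.sqrt(1 - (1 + k) * c**2 * r**2))

def mla_height_map(nx, ny, pitch, c, k, samples=64):
    """Height map of an nx x ny lens array, built by tiling one lenslet cell."""
    # Sample one square cell of width `pitch`, centered on the lenslet vertex.
    u = (np.arange(samples) + 0.5) / samples * pitch - pitch / 2
    X, Y = np.meshgrid(u, u)
    cell = aspheric_sag(np.hypot(X, Y), c, k)
    return np.tile(cell, (ny, nx))
```

    For k = 0 the formula reduces to the spherical sag R - sqrt(R^2 - r^2), which is a convenient correctness check.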

  17. Demonstration Of Ultra HI-FI (UHF) Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.

    2004-01-01

    Computational aero-acoustics (CAA) requires efficient, high-resolution simulation tools. Most current techniques utilize finite-difference approaches because high-order accuracy is considered too difficult or expensive to achieve with finite volume or finite element methods. However, a novel finite volume approach (Ultra HI-FI, or UHF) is presented which utilizes Hermite fluxes and can achieve arbitrary accuracy and fidelity in both space and time. The technique can be applied to unstructured grids with some loss of fidelity, or to multi-block structured grids for maximum efficiency and resolution. In either paradigm it is possible to resolve ultra-short waves (fewer than two points per wavelength, PPW). This is demonstrated here by solving Category 1 Problem 1 of the 4th CAA workshop.

  18. A narrow open tubular column for high efficiency liquid chromatographic separation.

    PubMed

    Chen, Huang; Yang, Yu; Qiao, Zhenzhen; Xiang, Piliang; Ren, Jiangtao; Meng, Yunzhu; Zhang, Kaiqi; Juan Lu, Joann; Liu, Shaorong

    2018-04-30

    We report a striking feature of open tubular liquid chromatography run with an extremely narrow (e.g., 2 μm inner diameter) open tubular column: more than 10 million plates per meter can be achieved in less than 10 min under an elution pressure of ca. 20 bar. The column is coated with octadecylsilane, and both isocratic and gradient separations are performed. We reveal a focusing effect that may explain the efficiency enhancement. We also demonstrate the feasibility of using this technique to separate complex peptide samples. This high-resolution, fast separation technique is promising and can lead to a powerful tool for trace sample analysis.
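    To put the reported efficiency in perspective, plate height H = L/N links column length L to plate count N, so the figures above imply a sub-micrometer plate height. A back-of-envelope check (the packed-column figure is an assumed typical value, not from this paper):

```python
# Plate height H = L / N links column length L to plate count N.
plates_per_meter = 10_000_000        # >10 million plates/m, as reported above
H_open = 1.0 / plates_per_meter      # plate height of the open tubular column, m
H_packed = 1.0 / 100_000             # assumed ~100,000 plates/m for a typical packed column

print(round(H_open * 1e9, 3))        # plate height in nanometers (100 nm)
print(round(H_packed / H_open, 3))   # efficiency ratio vs. the assumed packed column
```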

  19. Identification and Validation of a Brief Test Anxiety Screening Tool

    ERIC Educational Resources Information Center

    von der Embse, Nathaniel P.; Kilgus, Stephen P.; Segool, Natasha; Putwain, Dave

    2013-01-01

    The implementation of test-based accountability policies around the world has increased the pressure placed on students to perform well on state achievement tests. Educational researchers have begun taking a closer look at the reciprocal effects of test anxiety and high-stakes testing. However, existing test anxiety assessments lack efficiency and…

  20. Visiting Scholars Program

    DTIC Science & Technology

    2016-09-01

    ...other associated grants. SUBJECT TERMS: SUNY Poly, STEM, Artificial Intelligence, Command and Control. ...neuromorphic system has the potential to be widely used in a high-efficiency artificial intelligence system. Simulation results have indicated that the ...novel multiresolution fusion and advanced fusion performance evaluation tool for an Artificial Intelligence based natural language annotation engine...

  1. An intercomparison study of TSM, SEBS, and SEBAL using high-resolution imagery and lysimetric data

    USDA-ARS?s Scientific Manuscript database

    Over the past three decades, numerous remote sensing based ET mapping algorithms were developed. These algorithms provided a robust, economical, and efficient tool for ET estimations at field and regional scales. The Two Source Model (TSM), Surface Energy Balance System (SEBS), and Surface Energy Ba...

  2. High-efficiency propagation of mature 'Washington Navel' orange and juvenile "Carrizo" citrange using axillary shoot proliferation

    USDA-ARS?s Scientific Manuscript database

    Citrus propagation by conventional means is restricted to a particular season and the availability of plant material. It guarantees neither trueness of cultivars nor mass production of certified Citrus plants throughout the year. Plant tissue culture has emerged as a powerful tool for propagation and improv...

  3. Guide to Operating and Maintaining EnergySmart Schools

    ERIC Educational Resources Information Center

    US Department of Energy, 2010

    2010-01-01

    The guide allows users to adapt and implement suggested O&M (operating and maintaining) strategies to address specific energy efficiency goals. It recognizes and expands on existing tools and resources that are widely used throughout the high-performance school industry. The guide is organized into the following sections: (1) Chapter 1:…

  4. CRISPR-Cas9, a tool to efficiently increase the development of recombinant African swine fever viruses

    USDA-ARS?s Scientific Manuscript database

    African swine fever is a contagious and often lethal disease for domestic pigs with a significant economic impact on the swine industry. The etiological agent, African swine fever virus (ASFV), is a highly structurally complex double stranded DNA virus. No effective vaccines or antiviral treatment ...

  5. Exposomics research using suspect screening and non-targeted analysis methods and tools at the U.S. Environmental Protection Agency (ASMS Presentation)

    EPA Science Inventory

    High-resolution mass spectrometry (HRMS) is used for suspect screening (SSA) and non-targeted analysis (NTA) in an attempt to characterize xenobiotic chemicals in various samples broadly and efficiently. These important techniques aid characterization of the exposome, the totalit...

  6. Non-iterative Voltage Stability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Vyakaranam, Bharat; Hou, Zhangshuan

    2014-09-30

    This report demonstrates promising capabilities and performance characteristics of the proposed method using several power system models. The new method will help to develop a new generation of highly efficient tools suitable for real-time parallel implementation. The ultimate benefit will be early detection of system instability and prevention of system blackouts in real time.

  7. Information Communication Technology Policy and Public Primary Schools' Efficiency in Rwanda

    ERIC Educational Resources Information Center

    Munyengabe, Sylvestre; Haiyan, He; Yiyi, Zhao

    2018-01-01

    Teaching and learning processes have been developed through different methods and materials; nowadays the introduction of computers and other ICT tools in different forms and levels of education have been found to be highly influential in education system of different countries. The main objective of this study was to correlate Information…

  8. Research on AutoCAD secondary development and function expansion based on VBA technology

    NASA Astrophysics Data System (ADS)

    Zhang, Runmei; Gu, Yehuan

    2017-06-01

    AutoCAD is the most widely used drawing tool among comparable design-drafting products. Producing different types of design drawings of the same product involves a large amount of repetitive, monotonous work, and drawing graphics manually in AutoCAD suffers from low efficiency, a high error rate, and high input cost. To address these problems, a parametric drawing system for hot-rolled I-beam (steel beam) cross-sections was designed using the VBA secondary development tool together with the Access database for large-capacity data storage, and this paper analyzes the functional extension of plane drawing and parametric drawing design. Through this secondary development of AutoCAD, drawing work is simplified and efficiency is greatly improved. Introducing parametric design into the AutoCAD drawing system promotes mass production of standard hot-rolled I-beam products and economic growth in related industries.

  9. Data-driven Applications for the Sun-Earth System

    NASA Astrophysics Data System (ADS)

    Kondrashov, D. A.

    2016-12-01

    Advances in observational and data mining techniques allow extracting information from the large volume of Sun-Earth observational data that can be assimilated into first-principles physical models. However, equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate fusion of data with models. This presentation will provide an overview of existing as well as newly developed data-driven techniques adopted from atmospheric and oceanic sciences that have proved useful for space physics applications, such as a computationally efficient implementation of the Kalman filter in radiation belt modeling, solar wind gap-filling by Singular Spectrum Analysis, and a low-rank procedure for assimilating low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will also be demonstrated.
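    Of the techniques mentioned, the Kalman filter is the most compact to illustrate. The following is a hedged, minimal 1-D random-walk filter in Python that also handles observation gaps; the actual radiation-belt and gap-filling implementations are high-dimensional and far more elaborate.

```python
import numpy as np

def kalman_fill(y, q=1e-2, r=1e-1):
    """1-D random-walk Kalman filter; NaNs in y are gaps (prediction only).

    q: process-noise variance, r: observation-noise variance (illustrative values).
    """
    x, p = 0.0, 1.0                   # initial state estimate and variance
    out = np.empty(len(y))
    for t, obs in enumerate(y):
        p += q                        # predict: x_t = x_{t-1} + process noise
        if not np.isnan(obs):         # update only where an observation exists
            gain = p / (p + r)        # Kalman gain
            x += gain * (obs - x)
            p *= (1 - gain)
        out[t] = x
    return out
```

    During a gap the estimate simply carries the last filtered state forward (with growing variance), which is the basic mechanism behind model-based gap-filling.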

  10. The replication of a mouse adapted SARS-CoV in a mouse cell line stably expressing the murine SARS-CoV receptor mACE2 efficiently induces the expression of proinflammatory cytokines

    PubMed Central

    Regla-Nava, Jose A.; Jimenez-Guardeño, Jose M.; Nieto-Torres, Jose L.; Gallagher, Thomas M.; Enjuanes, Luis; DeDiego, Marta L.

    2013-01-01

    Infection of conventional mice with a mouse-adapted (MA15) severe acute respiratory syndrome (SARS) coronavirus (CoV) reproduces many aspects of human SARS such as pathological changes in lung, viremia, neutrophilia, and lethality. However, established mouse cell lines highly susceptible to mouse-adapted SARS-CoV infection are not available. In this work, efficiently transfectable mouse cell lines stably expressing the murine SARS-CoV receptor angiotensin-converting enzyme 2 (ACE2) have been generated. These cells yielded high SARS-CoV-MA15 titers and also served as excellent tools for plaque assays. In addition, in these cell lines, SARS-CoV-MA15 induced the expression of proinflammatory cytokines and IFN-β, mimicking what has been observed in experimental animal models infected with SARS-CoV and in SARS patients. These cell lines are valuable tools for performing in vitro studies in a mouse cell system that reflects the species used for in vivo studies of SARS-CoV-MA15 pathogenesis. PMID:23911968

  11. Seismic tomography as a tool for measuring stress in mines

    USGS Publications Warehouse

    Scott, Douglas F.; Williams, T.J.; Denton, D.K.; Friedel, M.J.

    1999-01-01

    Spokane Research Center personnel have been investigating the use of seismic tomography to monitor the behavior of a rock mass, detect hazardous ground conditions and assess the mechanical integrity of a rock mass affected by mining. Seismic tomography can be a valuable tool for determining relative stress in deep, >1,220-m (>4,000-ft), underground pillars. If high-stress areas are detected, they can be destressed prior to development or they can be avoided. High-stress areas can be monitored with successive seismic surveys to determine if stress decreases to a level where development can be initiated safely. There are several benefits to using seismic tomography to identify high stress in deep underground pillars. The technique is reliable, cost-effective, efficient and noninvasive. Also, investigators can monitor large rock masses, as well as monitor pillars during the mining cycle. By identifying areas of high stress, engineers will be able to assure that miners are working in a safer environment.

  12. Measure Guideline: Heat Pump Water Heaters in New and Existing Homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shapiro, C.; Puttagunta, S.; Owens, D.

    2012-02-01

    This Building America Measure Guideline is intended for builders, contractors, homeowners, and policy-makers. It explores the issues surrounding heat pump water heaters (HPWHs) to ensure that homeowners and contractors have the tools needed to select and install HPWHs appropriately and efficiently. HPWHs promise to significantly reduce energy consumption for domestic hot water (DHW) relative to standard electric resistance water heaters (ERWHs). While ERWHs perform with energy factors (EFs) around 0.9, new HPWHs boast EFs upwards of 2.0. The high energy factors of HPWHs are achieved by combining a vapor-compression system, which extracts heat from the surrounding air at high efficiency, with electric resistance element(s), which are better suited to meeting large hot-water demands. Swapping ERWHs for HPWHs could yield a roughly 50% reduction in water-heating energy consumption for 35.6% of all U.S. households. Proper selection, installation, and maintenance of HPWHs are required to ensure high operating efficiency and reliability. Section 1 of this guideline provides a brief description of HPWHs and their operation. Section 2 highlights the cost and energy savings of HPWHs as well as the variables that affect HPWH performance, reliability, and efficiency. Section 3 gives guidelines for proper installation and maintenance of HPWHs, selection criteria for locating HPWHs, and highlights of important differences between ERWH and HPWH installations.
    Throughout this document, CARB includes results from the evaluation of 14 heat pump water heaters (including three recently released HPWH products) installed in existing homes in the northeastern United States.
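    The savings figure quoted above follows directly from the energy factors, since site energy for a fixed hot-water load scales as 1/EF:

```python
# Site energy for a fixed hot-water load scales as 1/EF.
ef_erwh, ef_hpwh = 0.9, 2.0             # energy factors quoted in the guideline
savings = 1.0 - ef_erwh / ef_hpwh       # fractional reduction in DHW energy use
print(round(savings, 2))                # 0.55 -- the "roughly 50%" cited above
```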

  13. Patient Populations, Clinical Associations, and System Efficiency in Healthcare Delivery System

    NASA Astrophysics Data System (ADS)

    Liu, Yazhuo

    Efforts to improve health care delivery usually involve studies and analysis of patient populations and healthcare systems. In this dissertation, I present research conducted in the following areas: identifying patient groups and improving treatments for specific conditions using statistical and data mining techniques, and developing new operations research models to increase system efficiency from the health institution's perspective. The results provide a better understanding of high-risk patient groups, more accuracy in detecting disease correlations, and practical scheduling tools that account for uncertain operation durations and real-life constraints.

  14. Gulf Petro Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fathi Boukadi

    2011-02-05

    In this report, technologies for enhancing petroleum production and exploration in deepwater and mature fields are developed through basic and applied research by: (1) Designing new fluids to efficiently drill deepwater wells that cannot be cost-effectively drilled with current technologies. The new fluids will be heavy liquid foams that have low density at shallow depth, to avoid formation breakdown, and high density at drilling depth, to control formation pressure. The goal of this project is to provide industry with formulations of new fluids for reducing casing programs and thus well construction cost in deepwater development. (2) Studying the effects of flue gas/CO2 huff-n-puff on incremental oil recovery in Louisiana oilfields bearing light oil. An artificial neural network (ANN) model will be developed and used to map recovery efficiencies for candidate reservoirs in Louisiana. (3) Arriving at a quantitative understanding of the three-dimensional controlled-source electromagnetic (CSEM) geophysical response of typical Gulf of Mexico hydrocarbon reservoirs. We will seek to make available tools for the qualitative, rapid interpretation of marine CSEM signatures, and tools for efficient three-dimensional subsurface conductivity modeling.

  15. Using NERSC High-Performance Computing (HPC) systems for high-energy nuclear physics applications with ALICE

    NASA Astrophysics Data System (ADS)

    Fasel, Markus

    2016-10-01

    High-Performance Computing systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole-node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems Carver, Hopper, and Edison, and is being configured to provide access to the next-generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion includes our initial benchmarks of Cori against other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.

  16. Meristem culture and subsequent micropropagation of Chilean strawberry (Fragaria chiloensis (L.) Duch.).

    PubMed

    Quiroz, Karla A; Berríos, Miguel; Carrasco, Basilio; Retamales, Jorge B; Caligari, Peter D S; García-Gonzáles, Rolando

    2017-06-02

    Vegetative propagation of Fragaria sp. is traditionally carried out using stolons. This propagation system, in addition to being slow, can spread plant diseases, viral ones being particularly serious. In vitro culture of meristems and the establishment of micropropagation protocols are important tools for solving these problems. In recent years, considerable effort has been made to develop in vitro propagation of the commercial strawberry in order to produce virus-free plants of high quality. These previous results can serve as the basis for developing in vitro propagation technologies in the less-studied species Fragaria chiloensis. In this context, we studied meristem culture and established a micropropagation protocol for F. chiloensis. The addition of polyvinylpyrrolidone (PVP) improved the meristem regeneration efficiency of F. chiloensis accessions. Similarly, the use of 6-benzylaminopurine (BAP) in the culture media increased the average multiplication rate to 3-6 shoots per plant, and with BAP explant losses due to oxidation were near zero. However, plant height as well as the number of leaves and roots were higher in media without growth regulators, with average values of 0.5 cm, 9 leaves, and 4 roots per plant. For the first time in the Chilean strawberry, meristem culture was demonstrated to be an efficient tool for eliminating viruses from infected plants, opening the possibility of producing disease-free propagation material. Also, the addition of PVP to the basal MS medium improved the efficiency of plant recovery from isolated meristems. Farmers can now access high-quality plant material produced by biotech tools, which will improve their technological practices.

  17. How to use MPI communication in highly parallel climate simulations more easily and more efficiently.

    NASA Astrophysics Data System (ADS)

    Behrens, Jörg; Hanke, Moritz; Jahns, Thomas

    2014-05-01

    In this talk we present a way to facilitate efficient use of MPI communication for developers of climate models. Exploiting the performance potential of today's highly parallel supercomputers with real-world simulations is a complex task. This is partly caused by the low-level nature of the MPI communication library, which is the dominant communication tool at least for inter-node communication. In order to manage the complexity of the task, climate simulations with non-trivial communication patterns often use an internal abstraction layer above MPI without exploiting the benefits of communication aggregation or MPI datatypes. The solution we propose for this complexity and performance problem is the communication library YAXT. This library is built on top of MPI and takes high-level descriptions of arbitrary domain decompositions and automatically derives an efficient collective data exchange. Several exchanges can be aggregated in order to reduce latency costs. Examples are given which demonstrate the simplicity and the performance gains for selected climate applications.
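    The kind of high-level description such a library consumes can be illustrated without MPI at all: given a source and a target decomposition of a global index space, the complete exchange pattern is derivable automatically. A minimal Python sketch of that derivation follows (illustrative only, not YAXT's actual API; the real library additionally builds MPI datatypes and aggregates messages).

```python
from collections import defaultdict

def exchange_map(src_decomp, dst_decomp):
    """Derive a send map {(src_rank, dst_rank): [global indices]} from two
    decompositions, each given as {rank: iterable_of_global_indices}."""
    owner = {g: r for r, idxs in src_decomp.items() for g in idxs}
    sends = defaultdict(list)
    for dst_rank, idxs in dst_decomp.items():
        for g in sorted(idxs):
            sends[(owner[g], dst_rank)].append(g)
    return dict(sends)

# Example: redistribute 8 global indices from a block to a cyclic layout.
src = {0: {0, 1, 2, 3}, 1: {4, 5, 6, 7}}
dst = {0: {0, 2, 4, 6}, 1: {1, 3, 5, 7}}
print(exchange_map(src, dst))
```

    Each (source rank, destination rank) entry would become one aggregated message in an actual MPI exchange, which is where the latency savings come from.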

  18. SCALING AN URBAN EMERGENCY EVACUATION FRAMEWORK: CHALLENGES AND PRACTICES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karthik, Rajasekar; Lu, Wei

    2014-01-01

    Critical infrastructure disruption, caused by severe weather events, natural disasters, terrorist attacks, etc., has significant impacts on urban transportation systems. We built a computational framework to simulate urban transportation systems under critical infrastructure disruption in order to aid real-time emergency evacuation. This framework uses large-scale datasets to provide a scalable tool for emergency planning and management. Our framework, World-Wide Emergency Evacuation (WWEE), integrates population distribution and urban infrastructure networks to model travel demand in emergency situations at the global level. A computational model of agent-based traffic simulation is used to provide an optimal evacuation plan for traffic operation purposes [1]. In addition, our framework provides a web-based, high-resolution visualization tool for emergency evacuation modelers and practitioners. We have successfully tested our framework with scenarios in both the United States (Alexandria, VA) and Europe (Berlin, Germany) [2]. However, there are still some major drawbacks to scaling this framework for big data workloads in real time. On the back end, lack of proper infrastructure limits our ability to process large amounts of data, run the simulation efficiently and quickly, and provide fast retrieval and serving of data. On the front end, visualization of microscopic evacuation results is still not efficient enough due to high-volume data communication between server and client. We are addressing these drawbacks by using cloud computing and next-generation web technologies, namely Node.js, NoSQL, WebGL, OpenLayers 3, and HTML5. We briefly describe each of these technologies and how we leverage them to provide an efficient tool for emergency management organizations.
    Our early experimentation demonstrates that using the above technologies is a promising approach to building a scalable, high-performance urban emergency evacuation framework that can improve traffic mobility and safety under critical infrastructure disruption in today's socially connected world.

  19. Investigation of fatigue strength of tool steels in sheet-bulk metal forming

    NASA Astrophysics Data System (ADS)

    Pilz, F.; Gröbel, D.; Merklein, M.

    2018-05-01

    To meet trends toward the efficient production of complex functional components in forming technology, the process class of sheet-bulk metal forming (SBMF) can be applied. SBMF is characterized by the application of bulk forming operations on sheet metal, often in combination with sheet forming operations [1]. The combination of these conventional process classes leads to locally varying load conditions, which cause high tool loads, and thus reduced tool life, as well as an uncontrolled material flow. Several studies have shown that locally modified tool surfaces, so-called tailored surfaces, have the potential to control the material flow and thus to increase the die filling of functional elements [2]. The combination of these modified tool surfaces with the high tool loads in SBMF is, however, critical for tool life and leads to fatigue. Tool fatigue is hardly predictable and, due to a lack of data [3], a challenge in tool design. It is therefore necessary to provide such data for the tool steels used in SBMF. The aim of this study is to investigate the influence of tailored surfaces on the fatigue strength of the powder-metallurgical tool steel ASP2023 (1.3344, AISI M3:2), which is typically used in cold forging applications, at a hardness of 60 ± 1 HRC. The rotating bending test is chosen for this investigation. As tailored surfaces, a DLC coating and a surface manufactured by a high-feed milling process are chosen; a polished surface, typical for cold forging tools, serves as the reference. Before the rotating bending test, the surface integrity is characterized by measuring topography and residual stresses. After testing, the measured surface-integrity values are correlated with the number of load cycles to fracture to derive functional relations. Based on these results, the investigated tailored surfaces are evaluated regarding their suitability for modifying tool surfaces within SBMF.

  20. Benefits of Efficient Windows | Efficient Windows Collaborative

    Science.gov Websites


  1. MetaBAT, an efficient tool for accurately reconstructing single genomes from complex microbial communities

    DOE PAGES

    Kang, Dongwan D.; Froula, Jeff; Egan, Rob; ...

    2015-01-01

    Grouping large genomic fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Because of the complex nature of these communities, existing metagenome binning methods often miss a large number of microbial species. In addition, most of the tools are not scalable to large datasets. Here we introduce automated software called MetaBAT that integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency for accurate metagenome binning. MetaBAT outperforms alternative methods in accuracy and computational efficiency on both synthetic and real metagenome datasets. Lastly, it automatically forms hundreds of high-quality genome bins from a very large assembly consisting of millions of contigs in a matter of hours on a single node. MetaBAT is open source software and available at https://bitbucket.org/berkeleylab/metabat.
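    The tetranucleotide-frequency signal MetaBAT combines with abundance can be illustrated with a minimal sketch. This is a simplification for intuition only: the real tool additionally canonicalizes reverse complements and uses empirical probabilistic distances rather than plain Euclidean distance.

```python
from collections import Counter
from itertools import product

def tnf(seq):
    """256-dimensional tetranucleotide frequency vector of a contig."""
    kmers = ["".join(p) for p in product("ACGT", repeat=4)]
    counts = Counter(seq[i:i + 4] for i in range(len(seq) - 3))
    total = max(sum(counts[k] for k in kmers), 1)  # ignore ambiguous bases
    return [counts[k] / total for k in kmers]

def euclidean(u, v):
    """Distance between two contigs' TNF profiles; a small distance
    suggests the contigs may belong to the same genome bin."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
```

In this simplified view, binning groups contigs whose TNF profiles (and coverage profiles across samples) are mutually close.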

  2. A study of Ganoderma lucidum spores by FTIR microspectroscopy

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Chen, Xianliang; Qi, Zeming; Liu, Xingcun; Li, Weizu; Wang, Shengyi

    2012-06-01

    In order to obtain unique information about Ganoderma lucidum spores, FTIR microspectroscopy was used to study G. lucidum spores from Anhui Province (A), Liaoning Province (B) and Shandong Province (C) of China. IR micro-spectra were acquired with high resolution and good reproducibility. The IR spectra of G. lucidum spores from different areas were similar and mainly composed of the absorption bands of polysaccharides, sterols, proteins, fatty acids, etc. The results of curve fitting indicated that the protein secondary structures differed among the above G. lucidum spores. To identify G. lucidum spores from different areas, the H1078/H1640 value might be a potentially useful factor; furthermore, FTIR microspectroscopy can perform this identification efficiently with the help of hierarchical cluster analysis. The results indicate that FTIR microspectroscopy is an efficient tool for identifying G. lucidum spores from different areas and suggest that it is a potentially useful tool for the study of TCM.
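    The hierarchical cluster analysis used for this kind of identification can be sketched with a minimal single-linkage implementation. The feature vectors below are invented for illustration (e.g., a band-height ratio and a band-shift value per sample); the study clustered full IR spectra.

```python
def single_linkage(points, n_clusters):
    """Minimal agglomerative (single-linkage) clustering: repeatedly
    merge the two clusters whose closest members are nearest."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the closest pair
    return clusters

# Hypothetical 2-D features per spore sample (ratio, shift); two groups.
spectra = [(0.80, 0.10), (0.82, 0.12), (1.40, 0.50), (1.38, 0.52)]
groups = single_linkage(spectra, 2)
```

Samples landing in the same cluster would be interpreted as coming from the same provenance; production analyses typically use library routines such as `scipy.cluster.hierarchy` instead of hand-rolled loops.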

  3. Tool Efficiency Analysis model research in SEMI industry

    NASA Astrophysics Data System (ADS)

    Lei, Ma; Nana, Zhang; Zhongqiu, Zhang

    2018-06-01

    One of the key goals in the SEMI industry is to improve equipment throughput and ensure that production efficiency is maximized. Based on SEMI standards for semiconductor equipment control, this paper defines the transition rules between different tool states and presents a TEA (Tool Efficiency Analysis) system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was successfully verified; it yielded the parameter values used to measure equipment performance as well as recommendations for improvement.
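    A finite-state-machine model of tool states like the one described can be sketched as follows. The state names and transition table below are illustrative assumptions loosely patterned on SEMI E10-style equipment states, not the paper's actual definitions.

```python
# Assumed state set and allowed transitions (illustrative only).
ALLOWED = {
    "PRODUCTIVE": {"STANDBY", "UNSCHEDULED_DOWN"},
    "STANDBY": {"PRODUCTIVE", "SCHEDULED_DOWN"},
    "SCHEDULED_DOWN": {"STANDBY"},
    "UNSCHEDULED_DOWN": {"STANDBY"},
}

class ToolStateMachine:
    """Accumulates time per state; rejects transitions not in the table."""
    def __init__(self, state="STANDBY"):
        self.state = state
        self.time_in = {s: 0.0 for s in ALLOWED}

    def transition(self, new_state, elapsed_hours):
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"{self.state} -> {new_state} not allowed")
        self.time_in[self.state] += elapsed_hours
        self.state = new_state

    def utilization(self):
        """Fraction of tracked time spent in the PRODUCTIVE state."""
        total = sum(self.time_in.values())
        return self.time_in["PRODUCTIVE"] / total if total else 0.0

fsm = ToolStateMachine()
fsm.transition("PRODUCTIVE", 2.0)  # 2 h idle in STANDBY, then producing
fsm.transition("STANDBY", 6.0)     # 6 h in PRODUCTIVE, then idle again
```

Replaying a tool's event log through such a machine both validates the event stream (illegal transitions are flagged) and yields the per-state time totals from which efficiency metrics are computed.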

  4. AnyWave: a cross-platform and modular software for visualizing and processing electrophysiological signals.

    PubMed

    Colombet, B; Woodman, M; Badier, J M; Bénar, C G

    2015-03-15

    The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need to bridge the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool, which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and employs high-fidelity visualization techniques. It also allows external tools to be used as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representations (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave as an efficient neurophysiological data visualizer able to integrate state-of-the-art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Design and Analysis of Turbines for Space Applications

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.; Huber, Frank W.

    2003-01-01

    In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis of the turbomachinery is necessary. This analysis is used for component development, design parametrics, performance prediction, and environment definition. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. The turbine chosen on which to demonstrate the procedure was a supersonic design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain an increased efficiency. The redesign of the turbine was conducted with a consideration of system requirements, realizing that a highly efficient turbine that, for example, significantly increases engine weight, is of limited benefit. Both preliminary and detailed designs were considered. To generate an improved design, one-dimensional (1D) design and analysis tools, computational fluid dynamics (CFD), response surface methodology (RSM), and neural nets (NN) were used.

  6. Improving productivity and profitability of a bioanalytical business through sales and operation planning.

    PubMed

    Islam, Rafiqul

    2013-07-01

    Today's bioanalytical CROs face increasing global competition, highly variable demand, high fixed costs, pricing pressure, and increasing demands for quality and speed. Most bioanalytical laboratories have responded to these challenges by implementing automation and process improvement methodologies (e.g., Six Sigma). These solutions have not resulted in significant improvements in productivity and profitability, since none of them can predict upturns or downturns in demand. High demand volatility causes long lead times and high costs during peak demand and poor productivity during trough demand. Most bioanalytical laboratories lack the tools to align supply efficiently with changing demand. In this paper, sales and operation planning (S&OP) has been investigated as a tool to balance supply and demand. The S&OP process, when executed effectively, can be the single greatest determinant of profitability for a bioanalytical business.

  7. Highly Efficient Electroporation-mediated Transformation into Edible Mushroom Flammulina velutipes

    PubMed Central

    Kim, Jong Kun; Park, Young Jin; Kong, Won Sik

    2010-01-01

    In this study, we developed an efficient electroporation-mediated transformation system for Flammulina velutipes. The flammutoxin (ftx) gene of F. velutipes was isolated by reverse transcription-PCR. The pFTXHg plasmid was constructed using the partial ftx gene (410 bp) along with the hygromycin B phosphotransferase gene (hygB) downstream of the glyceraldehyde-3-phosphate dehydrogenase (gpd) promoter. The plasmid was transformed into protoplasts of the monokaryotic strain 4019-20 of F. velutipes by electroporation. A high transformation efficiency of 177 transformants/µg of DNA was obtained with an electric pulse of 1.25 kV/cm using 1 × 10⁷ protoplasts. PCR and Southern blot hybridization indicated that a single copy of the plasmid DNA was inserted at different locations in the F. velutipes genome by non-homologous recombination. Therefore, this transformation system could serve as a useful tool for gene function analysis of F. velutipes. PMID:23956676

  8. Highly Efficient Electroporation-mediated Transformation into Edible Mushroom Flammulina velutipes.

    PubMed

    Kim, Jong Kun; Park, Young Jin; Kong, Won Sik; Kang, Hee Wan

    2010-12-01

    In this study, we developed an efficient electroporation-mediated transformation system for Flammulina velutipes. The flammutoxin (ftx) gene of F. velutipes was isolated by reverse transcription-PCR. The pFTXHg plasmid was constructed using the partial ftx gene (410 bp) along with the hygromycin B phosphotransferase gene (hygB) downstream of the glyceraldehyde-3-phosphate dehydrogenase (gpd) promoter. The plasmid was transformed into protoplasts of the monokaryotic strain 4019-20 of F. velutipes by electroporation. A high transformation efficiency of 177 transformants/µg of DNA was obtained with an electric pulse of 1.25 kV/cm using 1 × 10⁷ protoplasts. PCR and Southern blot hybridization indicated that a single copy of the plasmid DNA was inserted at different locations in the F. velutipes genome by non-homologous recombination. Therefore, this transformation system could serve as a useful tool for gene function analysis of F. velutipes.

  9. Novel tool wear monitoring method in milling difficult-to-machine materials using cutting chip formation

    NASA Astrophysics Data System (ADS)

    Zhang, P. P.; Guo, Y.; Wang, B.

    2017-05-01

    The main problems in milling difficult-to-machine materials are high cutting temperature and rapid tool wear. However, tool wear is difficult to observe directly during machining. Tool wear and cutting chip formation are two of the most important indicators of machining efficiency and quality. The purpose of this paper is to develop a model of tool wear as a function of cutting chip formation (chip width and chip radian) for difficult-to-machine materials, so that tool wear can be monitored through chip formation. A milling experiment on a machining centre with three sets of cutting parameters was performed to obtain chip formation and tool wear. The experimental results show that tool wear increases gradually over the cutting process, while chip width and chip radian decrease. The model is developed by fitting the experimental data and applying formula transformations. Most of the tool wear values monitored via chip formation have errors of less than 10%; the smallest error is 0.2%. Overall, errors based on the chip radian are smaller than those based on the chip width. This is a new way to monitor and detect tool wear via cutting chip formation when milling difficult-to-machine materials.
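    The kind of fitted wear-versus-chip-formation relation described above can be sketched with ordinary least squares. The data points and the linear form below are fabricated for illustration; the paper fits its own experimental measurements and model form.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical measurements: chip width (mm) vs flank wear (mm).
# Chip width decreases as wear grows, as the experiment observed.
width = [1.00, 0.95, 0.90, 0.85]
wear = [0.05, 0.10, 0.15, 0.20]
a, b = fit_linear(width, wear)

def monitor_error(w_true, w_pred):
    """Relative monitoring error; the study reports most errors < 10%."""
    return abs(w_pred - w_true) / w_true
```

Once fitted, the model predicts wear from an in-process chip-width measurement (`a * measured_width + b`), which is what makes indirect wear monitoring possible.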

  10. MsViz: A Graphical Software Tool for In-Depth Manual Validation and Quantitation of Post-translational Modifications.

    PubMed

    Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo

    2017-08-04

    Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.

  11. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  12. Continuous processing and the applications of online tools in pharmaceutical product manufacture: developments and examples.

    PubMed

    Ooi, Shing Ming; Sarkar, Srimanta; van Varenbergh, Griet; Schoeters, Kris; Heng, Paul Wan Sia

    2013-04-01

    Continuous processing and production in pharmaceutical manufacturing has received increased attention in recent years mainly due to the industries' pressing needs for more efficient, cost-effective processes and production, as well as regulatory facilitation. To achieve optimum product quality, the traditional trial-and-error method for the optimization of different process and formulation parameters is expensive and time consuming. Real-time evaluation and the control of product quality using an online process analyzer in continuous processing can provide high-quality production with very high-throughput at low unit cost. This review focuses on continuous processing and the application of different real-time monitoring tools used in the pharmaceutical industry for continuous processing from powder to tablets.

  13. Engineering a mobile health tool for resource-poor settings to assess and manage cardiovascular disease risk: SMARThealth study.

    PubMed

    Raghu, Arvind; Praveen, Devarsetty; Peiris, David; Tarassenko, Lionel; Clifford, Gari

    2015-04-29

    The incidence of chronic diseases in low- and middle-income countries is rapidly increasing in both urban and rural regions. A major challenge for health systems globally is to develop innovative solutions for the prevention and control of these diseases. This paper discusses the development and pilot testing of SMARTHealth, a mobile-based, point-of-care Clinical Decision Support (CDS) tool to assess and manage cardiovascular disease (CVD) risk in resource-constrained settings. Through pilot testing, the preliminary acceptability, utility, and efficiency of the CDS tool were assessed. The CDS tool was part of an mHealth system comprising a mobile application, which implemented an evidence-based risk prediction and management algorithm, and a server-side electronic medical record system. An agile development process and user-centred design approach yielded the key features of the mobile application that fitted the requirements of the end users and environment. A comprehensive analytics framework facilitated a data-driven investigation of four areas: system efficiency, end-user variability, manual data entry errors, and the usefulness of point-of-care management recommendations to the healthcare worker. A four-point Likert scale was used at the end of every risk assessment to gauge ease of use of the system. The system was field-tested with eleven village healthcare workers and three Primary Health Centre doctors, who screened a total of 292 adults aged 40 years and above. 34% of the participants screened by health workers were identified by the CDS tool as being at high CVD risk and were referred to a doctor. In-depth analysis of user interactions found the CDS tool feasible for use and easily integrable into the workflow of healthcare workers. Following completion of the pilot, further technical enhancements were implemented to improve uptake of the mHealth platform.
It will then be evaluated for effectiveness and cost-effectiveness in a cluster-randomized controlled trial involving 54 southern Indian villages and over 16,000 individuals at high CVD risk. An evidence-based CVD risk prediction and management tool was used to develop an mHealth platform in rural India for CVD screening and management, with proper engagement of health care providers and local communities. With over a third of screened participants identified as high risk, there is a need to demonstrate the clinical impact of the mHealth platform so that it can contribute to improved CVD detection in high-risk, low-resource settings.

  14. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

    The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in computational power, from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores, have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we review a current NSF-funded project called CI-WATER that is intended to overcome many of the roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open-source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and database storage of model simulations for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.

  15. Rapid Freeform Sheet Metal Forming: Technology Development and System Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiridena, Vijitha; Verma, Ravi; Gutowski, Timothy

    The objective of this project is to develop a transformational RApid Freeform sheet metal Forming Technology (RAFFT) in an industrial environment, which has the potential to increase manufacturing energy efficiency up to ten times, at a fraction of the cost of conventional technologies. The RAFFT technology is a flexible and energy-efficient process that eliminates the need for geometry-specific forming dies. The innovation lies in the idea of applying the energy resource at the local deformation area, which provides greater formability, process control, and process flexibility relative to traditional methods. Double-Sided Incremental Forming (DSIF), the core technology in RAFFT, is a new concept for sheet metal forming. A blank sheet is clamped around its periphery and gradually deformed into a complex 3D freeform part by two strategically aligned stylus-type tools that follow a prescribed toolpath. The two tools, one on each side of the blank, can form a part with sharp features for both concave and convex shapes. Since deformation happens locally, the forming force at any instant is significantly decreased compared to traditional methods. The key advantages of DSIF are its high process flexibility, high energy efficiency, low capital investment, and the elimination of the need for massive amounts of die casting and machining. Additionally, the enhanced formability and process flexibility of DSIF can open up design spaces and result in greater weight savings.

  16. Resolving Off-Nominal Situations in Schedule-Based Terminal Area Operations: Results from a Human-in-the-Loop Simulation

    NASA Technical Reports Server (NTRS)

    Mercer, Joey; Callantine, Todd; Martin, Lynne

    2012-01-01

    A recent human-in-the-loop simulation in the Airspace Operations Laboratory (AOL) at NASA's Ames Research Center investigated the robustness of Controller-Managed Spacing (CMS) operations. CMS refers to AOL-developed controller tools and procedures for enabling arrivals to conduct efficient Optimized Profile Descents with sustained high throughput. The simulation provided a rich data set for examining how a traffic management supervisor and terminal-area controller participants used the CMS tools and coordinated to respond to off-nominal events. This paper proposes quantitative measures for characterizing the participants responses. Case studies of go-around events, replicated during the simulation, provide insights into the strategies employed and the role the CMS tools played in supporting them.

  17. The Integrated Waste Tracking System - A Flexible Waste Management Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Robert Stephen

    2001-02-01

    The US Department of Energy (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) has fully embraced a flexible, computer-based tool to help increase waste management efficiency and integrate multiple operational functions, from waste generation through waste disposition, while reducing cost. The Integrated Waste Tracking System (IWTS) provides comprehensive information management for containerized waste during generation, storage, treatment, transport, and disposal. The IWTS provides all information necessary for facilities to properly manage waste and demonstrate regulatory compliance. As a platform-independent, client-server and Web-based inventory and compliance system, the IWTS has proven to be a successful tracking, characterization, compliance, and reporting tool that meets the needs of both operations and management while providing a high level of management flexibility.

  18. Development of an Irrigation Scheduling Tool for the High Plains Region

    NASA Astrophysics Data System (ADS)

    Shulski, M.; Hubbard, K. G.; You, J.

    2009-12-01

    The High Plains Regional Climate Center (HPRCC) at the University of Nebraska is one of NOAA’s six regional climate centers in the U.S. Primary objectives of the HPRCC are to conduct applied climate research, engage in climate education and outreach, and increase the use and availability of climate information by developing value-added products. Scientists at the center are engaged in utilizing regional weather data to develop tools that can be used directly by area stakeholders, particularly for agricultural sectors. A new study is proposed that will combine NOAA products (short-term forecasts and seasonal outlooks of temperature and precipitation) with existing capabilities to construct an irrigation scheduling tool that can be used by producers in the region. This tool will make use of weather observations from the regional mesonet (specifically the AWDN, Automated Weather Data Network) and the nation-wide relational database and web portal (ACIS, Applied Climate Information System). The primary benefit to stakeholders will be a more efficient use of water and energy resources owing to the reduction of uncertainty in the timing of irrigation.

  19. Clinical guideline representation in a CDS: a human information processing method.

    PubMed

    Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique

    2012-01-01

    The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared not to suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as a requirements elicitation method. An information processing model was developed through an analysis of think-aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the mental strategies clinicians employ in deciding on survivor screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns, which supported the design of a highly usable CDS system.

  20. iScreen: Image-Based High-Content RNAi Screening Analysis Tools.

    PubMed

    Zhong, Rui; Dong, Xiaonan; Levine, Beth; Xie, Yang; Xiao, Guanghua

    2015-09-01

    High-throughput RNA interference (RNAi) screening has opened up a path to investigating functional genomics in a genome-wide pattern. However, such studies are often restricted to assays that have a single readout format. Recently, advanced image technologies have been coupled with high-throughput RNAi screening to develop high-content screening, in which one or more cell image(s), instead of a single readout, were generated from each well. This image-based high-content screening technology has led to genome-wide functional annotation in a wider spectrum of biological research studies, as well as in drug and target discovery, so that complex cellular phenotypes can be measured in a multiparametric format. Despite these advances, data analysis and visualization tools are still largely lacking for these types of experiments. Therefore, we developed iScreen (image-Based High-content RNAi Screening Analysis Tool), an R package for the statistical modeling and visualization of image-based high-content RNAi screening. Two case studies were used to demonstrate the capability and efficiency of the iScreen package. iScreen is available for download on CRAN (http://cran.cnr.berkeley.edu/web/packages/iScreen/index.html). The user manual is also available as a supplementary document. © 2014 Society for Laboratory Automation and Screening.

  1. Rapid evolution of regulatory element libraries for tunable transcriptional and translational control of gene expression.

    PubMed

    Jin, Erqing; Wong, Lynn; Jiao, Yun; Engel, Jake; Holdridge, Benjamin; Xu, Peng

    2017-12-01

    Engineering cell factories for producing biofuels and pharmaceuticals has spurred great interest in developing rapid and efficient synthetic biology tools customized for modular pathway engineering. Along the way, combinatorial gene expression control through modification of regulatory elements has offered tremendous opportunities for fine-tuning gene expression and generating digital-like genetic circuits. In this report, we present an efficient evolutionary approach to building a range of regulatory control elements. The reported method allows for rapid construction of promoter, 5'UTR, terminator and trans-activating RNA libraries. Synthetic overlapping oligos with a high portion of degenerate nucleotides flanking the regulatory element can be efficiently assembled into a vector expressing a fluorescence reporter. This approach combines the high mutation rate of the synthetic DNA with the high assembly efficiency of Gibson Mix. Our constructed libraries demonstrate a broad range of transcriptional and translational gene expression dynamics. Specifically, both the promoter library and the 5'UTR library exhibit gene expression dynamics spanning three orders of magnitude, while the terminator library and trans-activating RNA library display relatively narrow gene expression patterns. The reported study provides a versatile toolbox for rapidly constructing a large family of prokaryotic regulatory elements. These libraries also facilitate the implementation of combinatorial pathway engineering principles and the engineering of more efficient microbial cell factories for various biomanufacturing applications.
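    The degenerate-oligo idea behind such libraries can be sketched in silico: degenerate positions (here only 'N'; real IUPAC codes also include R, Y, S, W, etc.) are sampled to yield a library of concrete sequence variants. The template below is a made-up example, not a sequence from the paper.

```python
import random

# Minimal IUPAC ambiguity table (subset, for illustration).
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "N": "ACGT"}

def sample_oligo(template, rng):
    """Draw one concrete sequence from a degenerate template."""
    return "".join(rng.choice(IUPAC[base]) for base in template)

# Degenerate promoter-like template: fixed flanks, six random positions.
template = "TTGACANNNNNNTATAAT"
rng = random.Random(0)  # seeded for reproducibility
library = {sample_oligo(template, rng) for _ in range(200)}
```

Each member keeps the fixed flanking sequence (which drives assembly into the reporter vector) while the degenerate core varies, which is what produces the spread of expression strengths the libraries exhibit.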

  2. Seamless Insert-Plasmid Assembly at High Efficiency and Low Cost

    PubMed Central

    Benoit, Roger M.; Ostermeier, Christian; Geiser, Martin; Li, Julia Su Zhou; Widmer, Hans; Auer, Manfred

    2016-01-01

    Seamless cloning methods, such as co-transformation cloning, sequence- and ligation-independent cloning (SLIC) or the Gibson assembly, are essential tools for the precise construction of plasmids. The efficiency of co-transformation cloning is, however, low, and the Gibson assembly reagents are expensive. With the aim of improving the robustness of seamless cloning experiments while keeping costs low, we examined the importance of complementary single-stranded DNA ends for co-transformation cloning and the influence of single-stranded gaps in circular plasmids on SLIC cloning efficiency. Most importantly, our data show that single-stranded gaps in double-stranded plasmids, which occur in typical SLIC protocols, can drastically decrease the efficiency at which the DNA transforms competent E. coli bacteria. Accordingly, filling in single-stranded gaps using DNA polymerase resulted in increased transformation efficiency. Ligation of the remaining nicks did not lead to a further increase in transformation efficiency. These findings demonstrate that highly efficient insert-plasmid assembly can be achieved by using only T5 exonuclease and Phusion DNA polymerase, without the Taq DNA ligase from the original Gibson protocol, which significantly reduces the cost of the reactions. We successfully used this modified Gibson assembly protocol with two short insert-plasmid overlap regions of only 15 nucleotides each. PMID:27073895
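    As a rough illustration of the short homology regions discussed above, the following Python sketch (a hypothetical helper, not code from the paper) checks how many terminal nucleotides an insert and a linearized vector share:

```python
def overlap(a, b, min_len=10):
    """Length of the longest suffix of `a` that equals a prefix of `b`,
    searched from the longest possible overlap down to `min_len` (>= 1).
    Returns 0 if no overlap of at least `min_len` exists."""
    for k in range(min(len(a), len(b)), min_len - 1, -1):
        if a[-k:] == b[:k]:
            return k
    return 0
```

    With the 15-nucleotide overlaps used in the modified protocol, one would call `overlap(vector_seq, insert_seq, min_len=15)` on each junction before setting up the assembly reaction.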

  3. Polycistronic tRNA and CRISPR guide-RNA enables highly efficient multiplexed genome engineering in human cells

    PubMed Central

    Dong, Fengping; Xie, Kabin; Chen, Yueying; Yang, Yinong; Mao, Yingwei

    2016-01-01

    CRISPR/Cas9 has been widely used for genome editing in many organisms. Many human diseases are caused by multiple mutations, and the CRISPR/Cas9 system provides a potential tool to introduce multiple mutations into a genome. To mimic complicated genomic variants in human diseases, such as multiple gene deletions or mutations, two or more small guide RNAs (sgRNAs) need to be introduced together. This can be achieved with separate Pol III promoters in a single construct; however, limited enzyme sites and the increased insert size lower the efficiency of making such a construct. Here, we report a strategy to quickly assemble multiple sgRNAs in one construct using a polycistronic-tRNA-gRNA (PTG) approach. Taking advantage of the endogenous tRNA processing system in mammalian cells, we efficiently express multiple sgRNAs driven by only one Pol III promoter. Using an all-in-one construct carrying a PTG, we disrupt the deacetylase domain of multiple histone deacetylases (HDACs) in human cells simultaneously. We demonstrate that multiple HDAC deletions significantly affect the activation of the Wnt-signaling pathway. Thus, this method enables efficient targeting of multiple genes and provides a useful tool for establishing mutated cells that mimic human diseases. PMID:27890617
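    The PTG layout described above (tRNA, then spacer, then sgRNA scaffold, repeated for each target) can be sketched in Python. The sequences below are placeholder stand-ins, not the actual tRNA or scaffold sequences from the paper:

```python
# Placeholder stand-ins for the real tRNA and sgRNA scaffold sequences.
TRNA_SEQ = "AACAAAGCACCAGTGG"      # hypothetical tRNA stand-in
SCAFFOLD_SEQ = "GTTTTAGAGCTAGAAA"  # hypothetical (truncated) scaffold stand-in

def build_ptg(spacers):
    """Assemble a polycistronic tRNA-gRNA (PTG) insert: each spacer is
    preceded by a tRNA and followed by the sgRNA scaffold, so endogenous
    tRNA processing can release the individual sgRNAs in the cell."""
    parts = []
    for spacer in spacers:
        parts.append(TRNA_SEQ)
        parts.append(spacer.upper())
        parts.append(SCAFFOLD_SEQ)
    return "".join(parts)
```

    The whole concatenated insert is then driven by a single Pol III promoter, which is what keeps the construct small regardless of how many sgRNAs it carries.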

  4. Polycistronic tRNA and CRISPR guide-RNA enables highly efficient multiplexed genome engineering in human cells.

    PubMed

    Dong, Fengping; Xie, Kabin; Chen, Yueying; Yang, Yinong; Mao, Yingwei

    2017-01-22

    CRISPR/Cas9 has been widely used for genome editing in many organisms. Many human diseases are caused by multiple mutations, and the CRISPR/Cas9 system provides a potential tool to introduce multiple mutations into a genome. To mimic complicated genomic variants in human diseases, such as multiple gene deletions or mutations, two or more small guide RNAs (sgRNAs) need to be introduced together. This can be achieved with separate Pol III promoters in a single construct; however, limited enzyme sites and the increased insert size lower the efficiency of making such a construct. Here, we report a strategy to quickly assemble multiple sgRNAs in one construct using a polycistronic-tRNA-gRNA (PTG) approach. Taking advantage of the endogenous tRNA processing system in mammalian cells, we efficiently express multiple sgRNAs driven by only one Pol III promoter. Using an all-in-one construct carrying a PTG, we disrupt the deacetylase domain of multiple histone deacetylases (HDACs) in human cells simultaneously. We demonstrate that multiple HDAC deletions significantly affect the activation of the Wnt-signaling pathway. Thus, this method enables efficient targeting of multiple genes and provides a useful tool for establishing mutated cells that mimic human diseases. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Homology-integrated CRISPR-Cas (HI-CRISPR) system for one-step multigene disruption in Saccharomyces cerevisiae.

    PubMed

    Bao, Zehua; Xiao, Han; Liang, Jing; Zhang, Lu; Xiong, Xiong; Sun, Ning; Si, Tong; Zhao, Huimin

    2015-05-15

    One-step multiple gene disruption in the model organism Saccharomyces cerevisiae is a highly useful tool for both basic and applied research, but it remains a challenge. Here, we report a rapid, efficient, and potentially scalable strategy based on the type II Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-CRISPR associated proteins (Cas) system to generate multiple gene disruptions simultaneously in S. cerevisiae. A 100 bp dsDNA mutagenizing homologous recombination donor is inserted between two direct repeats for each target gene in a CRISPR array consisting of multiple donor and guide sequence pairs. An ultrahigh copy number plasmid carrying iCas9, a variant of wild-type Cas9, trans-encoded RNA (tracrRNA), and a homology-integrated crRNA cassette is designed to greatly increase the gene disruption efficiency. As proof of concept, three genes, CAN1, ADE2, and LYP1, were simultaneously disrupted in 4 days with an efficiency ranging from 27 to 87%. Another three genes involved in an artificial hydrocortisone biosynthetic pathway, ATF2, GCY1, and YPR1, were simultaneously disrupted in 6 days with 100% efficiency. This homology-integrated CRISPR (HI-CRISPR) strategy represents a powerful tool for creating yeast strains with multiple gene knockouts.
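    The cassette layout described above, where each 100 bp donor and its guide sequence sit between direct repeats, can be sketched as follows. The direct-repeat sequence and function name here are hypothetical placeholders, not taken from the paper:

```python
DIRECT_REPEAT = "GTTTTAGAGCTATGCT"  # hypothetical stand-in for the DR sequence

def build_crispr_array(pairs):
    """Lay out a homology-integrated crRNA cassette: for each target gene a
    (donor, guide) pair is placed between direct repeats, giving
    DR-donor-guide-DR-...-DR for n pairs (n + 1 repeats in total)."""
    parts = [DIRECT_REPEAT]
    for donor, guide in pairs:
        parts.extend([donor, guide, DIRECT_REPEAT])
    return "".join(parts)
```

    Adding another target gene is then just one more (donor, guide) pair in the list, which is what makes the strategy potentially scalable.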

  6. Design Aids for Real-Time Systems (DARTS)

    NASA Technical Reports Server (NTRS)

    Szulewski, P. A.

    1982-01-01

    Design-Aids for Real-Time Systems (DARTS) is a tool that assists in defining embedded computer systems through tree structured graphics, military standard documentation support, and various analyses including automated Software Science parameter counting and metrics calculation. These analyses provide both static and dynamic design quality feedback which can potentially aid in producing efficient, high quality software systems.

  7. The Complete Toolkit for Building High-Performance Work Teams.

    ERIC Educational Resources Information Center

    Golden, Nancy; Gall, Joyce P.

    This workbook is designed for leaders and members of work teams in educational and social-service systems. It presents in a systematic fashion a set of tested facilitation tools that will allow teams to work more efficiently and harmoniously, enabling them to achieve their goals, to deal directly with both personal and work-related issues that…

  8. Somatic cell nuclear transfer followed by CRISPR/Cas9 microinjection results in highly efficient genome editing in cloned pigs

    USDA-ARS?s Scientific Manuscript database

    The domestic pig is an ideal “dual purpose” animal model for agricultural and biomedical research. With the availability of genome editing tools [e.g. clustered regularly interspersed short palindromic repeat (CRISPR) and associated nuclease Cas9 (CRISPR/Cas9)] it is now possible to perform site-sp...

  9. Complementing in vitro screening assays with in silico molecular chemistry tools to examine potential in vivo metabolite-mediated effects

    EPA Science Inventory

    High-throughput in vitro assays offer a rapid, cost-efficient means to screen thousands of chemicals across hundreds of pathway-based toxicity endpoints. However, one main concern involved with the use of in vitro assays is the erroneous omission of chemicals that are inactive un...

  10. Cognitive Readiness Assessment and Reporting: An Open Source Mobile Framework for Operational Decision Support and Performance Improvement

    ERIC Educational Resources Information Center

    Heric, Matthew; Carter, Jenn

    2011-01-01

    Cognitive readiness (CR) and performance for operational time-critical environments are continuing points of focus for military and academic communities. In response to this need, we designed an open source interactive CR assessment application as a highly adaptive and efficient open source testing administration and analysis tool. It is capable…

  11. The Plant Protoplast: A Useful Tool for Plant Research and Student Instruction

    ERIC Educational Resources Information Center

    Wagner, George J.; And Others

    1978-01-01

    A plant protoplast is basically a plant cell that lacks a cell wall. This article outlines some of the ways in which protoplasts may be used to advance understanding of plant cell biology in research and student instruction. Topics include high efficiency experimental virus infection, organelle isolation, and osmotic effects. (Author/MA)

  12. Improving irrigation efficiency : the need for a relevant sequence of the management tools

    NASA Astrophysics Data System (ADS)

    Fayolle, Y.

    2009-04-01

    With 70% of worldwide withdrawals, irrigation efficiency is a key issue in the overall problem of water resources. Management of water dedicated to agriculture should be improved to secure food production and save water to meet increasing domestic and industrial demands. This paper is based on the results of a collaborative research project conducted in India with a local NGO (the Aga Khan Rural Support Programme, AKRSP(I)) during which GIS were tested. It is aimed at analyzing the efficiency of water usage in a water development programme conducted by the partner NGO in the semi-arid margins of Gujarat state. The analysis raises the question of the articulation of legal, institutional, economic, and technical tools to improve water efficiency. The NGO supervises the construction of surface-water harvesting structures for irrigation purposes. Following a participatory approach, it creates and trains user groups to which the management of dams is then devolved. User group membership depends on financial contribution to the building costs. A legal vacuum regarding surface water management, combined with unequal investment capacities, favors the concentration of water resources in the hands of a limited number of farmers. This causes low water use efficiency, with irrigation choices mostly oriented toward highly water-consumptive crops and recipient farmers showing no interest in investing in water-saving techniques. Our observations favor equality of access and paying more attention to the sequence in which management tools are articulated. On a national scale, as a prerequisite, water user rights as well as the legal framework for NGO intervention should be clarified. On a project scale, before construction, information systems could help identify all potential beneficiaries and optimize equality of access. This aims at reducing the volume of water per farmer so as to encourage farmers to irrigate less water-consumptive crops and invest in water-saving techniques. Depending on individual investment capacities, financial support could be proposed to favor investments in micro-irrigation devices. Finally, we suggest delaying the use of economic tools, giving up financial participation in the building costs (to limit its discriminating effect on user-group access), and limiting their application to watering charges that cover maintenance expenses.

  13. Red laser based on intra-cavity Nd:YAG/CH4 frequency doubled Raman lasers

    NASA Astrophysics Data System (ADS)

    Wang, Yanchao; Wang, Pengyuan; Liu, Jinbo; Liu, Wanfa; Guo, Jingwei

    2017-01-01

    Stimulated Raman scattering (SRS) is a powerful tool for extending the spectral range of lasers. To obtain efficient Raman conversion in SRS, many researchers have studied different types of Raman laser configurations. Among these, the intra-cavity type is particularly attractive: intra-cavity SRS has the advantages of high intra-cavity laser intensity, a low SRS threshold, and high Raman conversion efficiency. In this paper, a Q-switched intra-cavity Nd:YAG/CH4 frequency-doubled Raman laser is reported. A negative-branch confocal resonator with M = 1.25 is used for the frequency doubling of the Nd:YAG laser. The resulting 532 nm light is confined in an intra-cavity SRS traveling-wave resonator, in which the focus of one cavity mirror overlaps with the center of the other mirror. We found this design especially effective in reducing the SRS threshold and increasing the conversion efficiency. The threshold is measured to be 0.62 MW, and at a pump energy of 16.1 mJ, the conversion efficiency is 34%. With a smaller magnification M, the threshold could be decreased and the conversion efficiency improved further. This is a successful attempt to extend the spectral range of a laser to shorter wavelengths by SRS, and this design may play an important role in the realization of high-power red lasers.

  14. Economics of Agroforestry

    Treesearch

    D. Evan Mercer; Frederick W. Cubbage; Gregory E. Frey

    2014-01-01

    This chapter provides principles, literature and a case study about the economics of agroforestry. We examine necessary conditions for achieving efficiency in agroforestry system design and economic analysis tools for assessing efficiency and adoptability of agroforestry. The tools presented here (capital budgeting, linear programming, production frontier analysis...

  15. Prediction of miRNA targets.

    PubMed

    Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis

    2015-01-01

    Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.
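    One of the simplest biologically defined rules these tools build on is complementarity between the miRNA seed (nucleotides 2-8) and the 3'UTR. A minimal Python sketch of such a seed-match scan follows; it is illustrative only, since real predictors also weigh conservation, site context, and binding free energy:

```python
def revcomp(seq):
    """Reverse complement of a DNA/RNA sequence (returned as DNA)."""
    comp = {"A": "T", "C": "G", "G": "C", "T": "A", "U": "A"}
    return "".join(comp[b] for b in reversed(seq.upper()))

def seed_matches(mirna, utr):
    """Find 7mer matches to the miRNA seed (positions 2-8) in a 3'UTR.
    Returns the 0-based positions of the match sites on the UTR."""
    seed = mirna.upper().replace("U", "T")[1:8]  # nucleotides 2-8
    site = revcomp(seed)                          # target site on the UTR
    hits, i = [], utr.upper().find(site)
    while i != -1:
        hits.append(i)
        i = utr.upper().find(site, i + 1)
    return hits
```

    Scanning the let-7 seed against a UTR fragment containing its complement, for instance, returns the position of each candidate site; tools then score and rank such raw hits.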

  16. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    PubMed Central

    2014-01-01

    Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extend the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312
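    The parallel sample processing that eRNA relies on can be sketched generically in Python. This is not eRNA's actual code; it only illustrates dispatching per-sample pipeline stages across workers so the hardware stays busy:

```python
from concurrent.futures import ThreadPoolExecutor

def process_sample(sample):
    """Stand-in for one per-sample pipeline stage (trimming, alignment,
    read counting); here it simply tags the sample as processed."""
    return (sample, "processed")

def run_batch(samples, workers=4):
    """Process many samples concurrently instead of one file at a time,
    returning a sample -> result mapping."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(process_sample, samples))
```

    In a real pipeline each worker would shell out to an aligner or counter, but the dispatch pattern is the same.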

  17. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    PubMed

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extend the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  18. deepTools: a flexible platform for exploring deep-sequencing data.

    PubMed

    Ramírez, Fidel; Dündar, Friederike; Diehl, Sarah; Grüning, Björn A; Manke, Thomas

    2014-07-01

    We present a Galaxy based web server for processing and visualizing deeply sequenced data. The web server's core functionality consists of a suite of newly developed tools, called deepTools, that enable users with little bioinformatic background to explore the results of their sequencing experiments in a standardized setting. Users can upload pre-processed files with continuous data in standard formats and generate heatmaps and summary plots in a straightforward, yet highly customizable manner. In addition, we offer several tools for the analysis of files containing aligned reads and enable efficient and reproducible generation of normalized coverage files. As a modular and open-source platform, deepTools can easily be expanded and customized to future demands and developments. The deepTools webserver is freely available at http://deeptools.ie-freiburg.mpg.de and is accompanied by extensive documentation and tutorials aimed at conveying the principles of deep-sequencing data analysis. The web server can be used without registration. deepTools can be installed locally either stand-alone or as part of Galaxy. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
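    The normalized coverage files mentioned above are typically produced by scaling per-bin read counts. A minimal Python sketch of counts-per-million scaling follows; this is illustrative only, not deepTools' implementation, which offers CPM, RPKM, and other normalization schemes:

```python
def cpm_normalize(bin_counts):
    """Scale per-bin read counts to counts-per-million so coverage tracks
    from libraries of different sequencing depths become comparable."""
    total = sum(bin_counts)
    if total == 0:
        return [0.0 for _ in bin_counts]  # empty track: avoid division by zero
    return [c * 1_000_000 / total for c in bin_counts]
```

    After this scaling, a bin holding a quarter of the library's reads maps to 250,000 CPM regardless of how deeply the library was sequenced.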

  19. Open | SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open | SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open | SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open | SpeedShop are accessible through multiple fully equivalent interfaces, including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.

  20. Polarization control of high order harmonics in the EUV photon energy range.

    PubMed

    Vodungbo, Boris; Barszczak Sardinha, Anna; Gautier, Julien; Lambert, Guillaume; Valentin, Constance; Lozano, Magali; Iaquaniello, Grégory; Delmotte, Franck; Sebban, Stéphane; Lüning, Jan; Zeitoun, Philippe

    2011-02-28

    We report the generation of circularly polarized high order harmonics in the extreme ultraviolet range (18-27 nm) from a linearly polarized infrared laser (40 fs, 0.25 TW) focused into a neon-filled gas cell. To circularly polarize the initially linearly polarized harmonics we have implemented a four-reflector phase-shifter. Fully circularly polarized radiation has been obtained with an efficiency of a few percent, which is significantly more efficient than the currently demonstrated direct generation of elliptically polarized harmonics. This demonstration opens up new experimental capabilities based on high order harmonics, for example, in biology and materials science. The inherent femtosecond time resolution of high order harmonic generating table-top laser sources renders them an ideal tool for the investigation of ultrafast magnetization dynamics, now that the magnetic circular dichroism at the absorption M-edges of transition metals can be exploited.

  1. CRISPR-Cpf1: A New Tool for Plant Genome Editing.

    PubMed

    Zaidi, Syed Shan-E-Ali; Mahfouz, Magdy M; Mansoor, Shahid

    2017-07-01

    Clustered regularly interspaced palindromic repeats (CRISPR)-CRISPR-associated proteins (CRISPR-Cas), a groundbreaking genome-engineering tool, has facilitated targeted trait improvement in plants. Recently, CRISPR-CRISPR from Prevotella and Francisella 1 (Cpf1) has emerged as a new tool for efficient genome editing, including DNA-free editing in plants, with higher efficiency, specificity, and potentially wider applications than CRISPR-Cas9. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Topological signature in the NEXT high pressure xenon TPC

    NASA Astrophysics Data System (ADS)

    Ferrario, Paola; NEXT Collaboration

    2017-09-01

    The NEXT experiment aims to observe the neutrinoless double beta decay of 136Xe in a high-pressure xenon gas TPC using electroluminescence to amplify the signal from ionization. One of the main advantages of this technology is the possibility to use the topology of events with energies close to Qββ as an extra tool to reject background. In these proceedings we show with data from prototypes that an extra background rejection factor of 24.3 ± 1.4 (stat.)% can be achieved, while maintaining an efficiency of 66.7 ± 1.% for signal events. The performance expected in NEW, the next stage of the experiment, is to improve to 12.9% ± 0.6% background acceptance for 66.9% ± 0.6% signal efficiency.

  3. Clustering methods applied in the detection of Ki67 hot-spots in whole tumor slide images: an efficient way to characterize heterogeneous tissue-based biomarkers.

    PubMed

    Lopez, Xavier Moles; Debeir, Olivier; Maris, Calliope; Rorive, Sandrine; Roland, Isabelle; Saerens, Marco; Salmon, Isabelle; Decaestecker, Christine

    2012-09-01

    Whole-slide scanners allow the digitization of an entire histological slide at very high resolution. This new acquisition technique opens a wide range of possibilities for addressing challenging image analysis problems, including the identification of tissue-based biomarkers. In this study, we use whole-slide scanner technology for imaging the proliferating activity patterns in tumor slides based on Ki67 immunohistochemistry. Faced with very large images, pathologists require tools that can help them identify tumor regions exhibiting high proliferating activity, called "hot-spots" (HSs), and quantitatively characterize these HS patterns. To respond to this clinical need, the present study investigates various clustering methods with the aim of identifying Ki67 HSs in whole tumor slide images. This task requires a method capable of identifying an unknown number of clusters that may be highly variable in shape, size, and density. We developed a hybrid clustering method, referred to as Seedlink. Compared to manual HS selections by three pathologists, we show that Seedlink provides an efficient way of detecting Ki67 HSs and improves the agreement among pathologists when identifying HSs. Copyright © 2012 International Society for Advancement of Cytometry.
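    As a deliberately simple stand-in for the clustering task described above (not the Seedlink algorithm itself), a grid-density approach in Python can flag candidate hot-spot regions from Ki67-positive nucleus coordinates:

```python
from collections import Counter

def hotspot_cells(points, cell_size, min_count):
    """Bin (x, y) coordinates of Ki67-positive nuclei into a square grid and
    flag grid cells whose count reaches a density threshold. Returns the set
    of flagged (col, row) cells; real methods must also handle clusters of
    variable shape, size, and density, which this sketch ignores."""
    grid = Counter((int(x // cell_size), int(y // cell_size)) for x, y in points)
    return {cell for cell, n in grid.items() if n >= min_count}
```

    Flagged cells could then be merged into connected regions and presented to the pathologist as candidate HSs for review.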

  4. Tools & Resources | Efficient Windows Collaborative

    Science.gov Websites

    Use the Window Selection Tool Mobile App for selecting new windows. LBNL's RESFEN is used for calculating the heating and...

  5. Using a Novel Spatial Tool to Inform Invasive Species Early Detection and Rapid Response Efforts

    NASA Astrophysics Data System (ADS)

    Davidson, Alisha D.; Fusaro, Abigail J.; Kashian, Donna R.

    2015-07-01

    Management of invasive species has increasingly emphasized the importance of early detection and rapid response (EDRR) programs in limiting introductions, establishment, and impacts. These programs require an understanding of vector and species spatial dynamics to prioritize monitoring sites and efficiently allocate resources. Yet managers often lack the empirical data necessary to make these decisions. We developed an empirical mapping tool that can facilitate development of EDRR programs through identifying high-risk locations, particularly within the recreational boating vector. We demonstrated the utility of this tool in the Great Lakes watershed. We surveyed boaters to identify trips among water bodies and to quantify behaviors associated with high likelihood of species transfer (e.g., not removing organic materials from boat trailers) during that trip. We mapped water bodies with high-risk inbound and outbound boater movements using ArcGIS. We also tested for differences in high-risk behaviors based on demographic variables to understand risk differences among boater groups. Incorporation of boater behavior led to identification of additional high-risk water bodies compared to using the number of trips alone. Therefore, the number of trips itself may not fully reflect the likelihood of invasion. This tool can be broadly applied in other geographic contexts and with different taxa, and can be adjusted according to varying levels of information concerning the vector or species of interest. The methodology is straightforward and can be followed after a basic introduction to ArcGIS software. The visual nature of the mapping tool will facilitate site prioritization by managers and stakeholders from diverse backgrounds.
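    The core idea of weighting trips by boater behavior, rather than counting trips alone, can be sketched in Python. The data model here is hypothetical; the study itself used boater surveys and ArcGIS:

```python
def site_risk(trips, behavior_weights, default_weight=1.0):
    """Aggregate inbound-trip risk per water body: each (site, boater) trip
    contributes that boater's high-risk-behavior weight (e.g. not removing
    organic material from the trailer) instead of a flat count of 1."""
    risk = {}
    for site, boater in trips:
        risk[site] = risk.get(site, 0.0) + behavior_weights.get(boater, default_weight)
    return risk
```

    Ranking sites by this weighted score can surface high-risk water bodies that a raw trip count would miss, which is the effect the study reports.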

  6. Inter-view prediction of intra mode decision for high-efficiency video coding-based multiview video coding

    NASA Astrophysics Data System (ADS)

    da Silva, Thaísa Leal; Agostini, Luciano Volcan; da Silva Cruz, Luis A.

    2014-05-01

    Intra prediction is a very important tool in current video coding standards. High-efficiency video coding (HEVC) intra prediction delivers relevant gains in encoding efficiency compared to previous standards, but with a very significant increase in computational complexity, since 33 directional angular modes must be evaluated. Motivated by this high complexity, this article presents a complexity reduction algorithm developed to reduce the HEVC intra mode decision complexity, targeting multiview videos. The proposed algorithm provides an efficient fast intra prediction compliant with single-view and multiview video encoding. This fast solution defines a reduced subset of intra directions according to the video texture and exploits the relationship between prediction units (PUs) at neighboring depth levels of the coding tree. This fast intra coding procedure is used to develop an inter-view prediction method, which exploits the relationship between the intra mode directions of adjacent views to further accelerate the intra prediction process in multiview video encoding applications. When compared to HEVC simulcast, our method achieves a complexity reduction of up to 47.77%, at the cost of an average BD-PSNR loss of 0.08 dB.
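    The reduced-subset idea can be sketched in Python. This is an illustrative rule (planar and DC plus a window of angular modes around a neighboring PU's best mode), not the exact subset definition used in the paper:

```python
def candidate_modes(neighbor_best, window=2):
    """Reduced HEVC intra-mode subset: always test planar (0) and DC (1),
    plus the angular modes (2-34) within +/-window of the best angular mode
    found for a neighboring or parent-depth PU, instead of all 33 angles."""
    angular = range(max(2, neighbor_best - window),
                    min(34, neighbor_best + window) + 1)
    return sorted({0, 1, *angular})
```

    Testing 7 modes instead of 35 is where the complexity reduction comes from; the inter-view extension applies the same idea using the best mode of the co-located PU in the adjacent view.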

  7. A robust TALENs system for highly efficient mammalian genome editing.

    PubMed

    Feng, Yuanxi; Zhang, Siliang; Huang, Xin

    2014-01-10

    Recently, transcription activator-like effector nucleases (TALENs) have emerged as a highly effective tool for genome editing. A pair of TALENs binds two DNA recognition sites separated by a spacer sequence, and the dimerized FokI nucleases at the C terminus then cleave the DNA within the spacer. Because of its modular design and capacity to precisely target almost any desired genomic locus, TALEN technology can revolutionize the entire biomedical research field. Currently, for genome editing in cultured cells, two plasmids encoding a pair of TALENs are co-transfected, followed by limiting dilution to isolate cell colonies with the intended genomic manipulation. However, uncertain transfection efficiency becomes a bottleneck, especially in hard-to-transfect cells, reducing the overall efficiency of genome editing. We have developed a robust TALENs system in which each TALEN plasmid also encodes a fluorescent protein. Thus, cells transfected with both TALEN plasmids, a prerequisite for genomic editing, can be isolated by fluorescence-activated cell sorting. Our improved TALENs system can be applied to all cultured cells to achieve highly efficient genome editing. Furthermore, an optimized procedure for genome editing using TALENs is also presented. We expect our system to be widely adopted by the scientific community.

  8. Energy efficiency façade design in high-rise apartment buildings using the calculation of solar heat transfer through windows with shading devices

    NASA Astrophysics Data System (ADS)

    Ha, P. T. H.

    2018-04-01

    The architectural design orientation chosen at the first design stage plays a key role in, and has a great impact on, the energy consumption of a building throughout its life cycle. To provide designers with a simple, useful tool for quantitatively determining and optimizing the energy efficiency of a building at the very first stage of conceptual design, a building envelope energy efficiency factor (Khqnl) is investigated and proposed. Heat transfer through windows and other glazed areas of mezzanine floors accounts for 86% of the overall thermal transfer through the building envelope, so the Khqnl factor of high-rise buildings largely depends on shading solutions. The author has established tables and charts of Khqnl values for certain high-rise apartment buildings in Hanoi, calculated with a software program under various inputs: types and sizes of shading devices, building orientations, and different points in time. Architects can refer to these tables and charts in façade design to achieve a higher level of energy efficiency.

  9. Highly selective rhodium catalyzed domino C-H activation/cyclizations.

    PubMed

    Tran, Duc N; Cramer, Nicolai

    2011-01-01

    The direct functionalization of carbon-hydrogen bonds is an emerging tool for establishing more sustainable and efficient synthetic methods. We present its implementation in a cascade reaction that provides rapid assembly of functionalized indanylamines from simple and readily available starting materials. Careful choice of the ancillary ligand (an electron-rich bidentate phosphine) enables highly diastereoselective rhodium(I)-catalyzed intramolecular allylations of unsubstituted ketimines induced by a directed C-H bond activation and allene carbometalation sequence.

  10. Forthon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grote, D. P.

    Forthon generates links between Fortran and Python. Python is a high level, object oriented, interactive and scripting language that allows a flexible and versatile interface to computational tools. The Forthon package generates the necessary wrapping code which allows access to the Fortran database and to the Fortran subroutines and functions. This provides a development package where the computationally intensive parts of a code can be written in efficient Fortran, and the high level controlling code can be written in the much more versatile Python language.

  11. Study on the separation effect of high-speed ultrasonic vibration cutting.

    PubMed

    Zhang, Xiangyu; Sui, He; Zhang, Deyuan; Jiang, Xinggang

    2018-07-01

    High-speed ultrasonic vibration cutting (HUVC) has proven significantly effective when turning Ti-6Al-4V alloy in recent studies. Besides breaking through the cutting speed restriction of conventional ultrasonic vibration cutting (UVC), HUVC also reduces cutting force and improves surface quality and cutting efficiency in the high-speed machining regime. These benefits all result from the separation effect that occurs during the HUVC process. Although the influences of vibration and cutting parameters have been discussed in previous research, the separation behavior of HUVC should be analyzed in detail for real cutting situations, and tool geometry parameters should also be considered. In this paper, three situations are investigated in detail: (1) cutting without a negative transient clearance angle and without tool wear, (2) cutting with a negative transient clearance angle and without tool wear, and (3) cutting with tool wear. The complete separation state, partial separation state, and continuous cutting state are then deduced from real cutting processes. The analysis of these situations demonstrates that tool-workpiece separation takes place only if appropriate cutting parameters, vibration parameters, and tool geometry parameters are set. The best separation effect was obtained with a low feedrate and a phase shift approaching 180 degrees. Moreover, flank face interference resulting from the negative transient clearance angle and tool wear contributes to an improved separation effect that makes the workpiece and tool separate even at zero phase shift. Finally, axial and radial transient cutting forces are obtained for the first time to verify the separation effect of HUVC, and the cutting chips are collected to assess the influence of flank face interference. Copyright © 2018 Elsevier B.V. All rights reserved.
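    The qualitative separation conditions above can be caricatured in a few lines of code. The overlap rule used here is an illustrative assumption, not the paper's kinematic model:

```python
import math

def separation_state(feed_per_rev_um, amplitude_um, phase_shift_deg):
    """Crude classification of the cutting state in feed-direction vibration cutting.

    Assumed rule of thumb: the overlap between successive vibration paths is
    2*A*|sin(phase/2)|; if the feed per revolution is smaller than this overlap
    the tool fully separates every cycle, if comparable it separates partially,
    otherwise cutting is continuous.
    """
    overlap = 2.0 * amplitude_um * abs(math.sin(math.radians(phase_shift_deg) / 2.0))
    if feed_per_rev_um < overlap:
        return "complete separation"
    elif feed_per_rev_um < overlap + amplitude_um:
        return "partial separation"
    return "continuous cutting"

# Low feed and a phase shift near 180 degrees give the best separation,
# matching the qualitative conclusion of the study.
state = separation_state(feed_per_rev_um=5.0, amplitude_um=8.0, phase_shift_deg=180.0)
```

    Under this toy rule, raising the feedrate or driving the phase shift toward zero pushes the process back toward continuous cutting.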

  12. pKAMA-ITACHI Vectors for Highly Efficient CRISPR/Cas9-Mediated Gene Knockout in Arabidopsis thaliana.

    PubMed

    Tsutsui, Hiroki; Higashiyama, Tetsuya

    2017-01-01

    The CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats/CRISPR-associated 9) system is widely used as a tool for genome engineering in various organisms. A complex consisting of Cas9 and a single guide RNA (sgRNA) induces a DNA double-strand break in a sequence-specific manner, resulting in gene knockout. Several binary vectors for CRISPR/Cas9 in plants have been reported, but they suffer from low efficiency. Here, we present a newly developed, highly efficient CRISPR/Cas9 vector for Arabidopsis thaliana, pKAMA-ITACHI Red (pKIR), harboring the RIBOSOMAL PROTEIN S5 A (RPS5A) promoter to drive Cas9. The RPS5A promoter maintains high constitutive expression at all developmental stages, starting from the egg cell and including meristematic cells. Even in the T1 generation, pKIR induced null phenotypes for several genes: PHYTOENE DESATURASE 3 (PDS3), AGAMOUS (AG), and DUO POLLEN 1 (DUO1). Mutations induced by pKIR were carried in the germ cell line of the T1 generation. Surprisingly, in some lines, 100% of the T2 plants had the adh1 (ALCOHOL DEHYDROGENASE 1) null phenotype, indicating that pKIR strongly induces heritable mutations. Cas9-free T2 mutant plants were obtained by removing T2 seeds expressing a fluorescent marker in pKIR. Our results suggest that the pKIR system is a powerful molecular tool for genome engineering in Arabidopsis. © The Author 2016. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists.

  13. Development status of EUV sources for use in beta-tools and high-volume chip manufacturing tools

    NASA Astrophysics Data System (ADS)

    Stamm, U.; Kleinschmidt, J.; Bolshukhin, D.; Brudermann, J.; Hergenhan, G.; Korobotchko, V.; Nikolaus, B.; Schürmann, M. C.; Schriever, G.; Ziener, C.; Borisov, V. M.

    2006-03-01

    In this paper we give an update on the development status of gas-discharge-produced plasma (GDPP) EUV sources at XTREME technologies. In 2003, the first commercial prototypes of xenon GDPP sources of the type XTS 13-35, based on the Z-pinch with 35 W power in 2π sr, were delivered and integrated into micro-exposure tools from Exitech, UK. The micro-exposure tools with these sources were installed in industry in 2004. The first tool has run more than 100 million pulses without visible degradation of the source collector optics. For the next generation of full-field exposure tools (which we call beta-tools) we are developing GDPP sources with a power of > 10 W in the intermediate focus. These sources also use xenon as fuel, which has the advantage of not introducing additional contamination. Here we describe the basic performance of these sources as well as aspects of collector integration, debris mitigation, and optics lifetime. To achieve the source performance required for high-volume chip manufacturing, we consider tin as the source fuel because of its higher conversion efficiency compared to xenon. While we had earlier reported an output power of 400 W in 2π sr from a tin source, we have meanwhile reached 800 W in 2π sr in burst operation. Provided a high-power collector is available with a realistic collector module efficiency of between 9% and 15%, these data would support 70-120 W power in the intermediate focus. However, we do not expect that the required duty cycle and electrode lifetimes can be met with this stationary-electrode Z-pinch design. To overcome the lifetime and duty cycle limitations we have investigated GDPP sources with tin fuel and rotating disk electrodes. Currently we can generate more than 200 W in 2π sr with these sources at 4 kHz repetition rate.
    To achieve 180 W power in the intermediate focus, which is the recent requirement of some exposure tool manufacturers, this type of source would need to operate at 21-28 kHz repetition rate, which may not be possible for various reasons. To make operation at reasonable repetition rates with sufficient power possible, we have investigated various new excitation concepts for the rotating disk electrode configuration. With one of these concepts, pulse energies above 170 mJ in 2π sr have been demonstrated. This approach promises to support 180 W intermediate-focus power at repetition rates in the range of 7-10 kHz. It will be developed to the next power level in the following phase of XTREME technologies' high-volume manufacturing source development program.
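    The quoted power figures can be cross-checked with simple arithmetic; the 13% collector efficiency and 8 kHz repetition rate below are assumed mid-range values, not numbers from the abstract:

```python
# Intermediate-focus (IF) power = in-band power in 2*pi sr x collector module
# efficiency, or, for pulsed figures, pulse energy x efficiency x repetition rate.

def if_power_from_cw(power_2pi_w, collector_eff):
    return power_2pi_w * collector_eff

def if_power_from_pulses(pulse_energy_j, collector_eff, rep_rate_hz):
    return pulse_energy_j * collector_eff * rep_rate_hz

low = if_power_from_cw(800.0, 0.09)    # 800 W in 2*pi sr at 9% collector -> ~72 W IF
high = if_power_from_cw(800.0, 0.15)   # at 15% collector -> 120 W IF
# 170 mJ pulses with an assumed ~13% collector at ~8 kHz land near the 180 W
# target, consistent with the quoted 7-10 kHz operating range.
p = if_power_from_pulses(0.170, 0.13, 8000)
```
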

  14. Quantum Monte Carlo Endstation for Petascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lubos Mitas

    2011-01-26

    The NCSU research group has focused on accomplishing the key goals of this initiative: establishing a new generation of quantum Monte Carlo (QMC) computational tools as part of the Endstation petaflop initiative, for use at the DOE ORNL computational facilities and by the computational electronic structure community at large; carrying out high-accuracy quantum Monte Carlo demonstration projects applying these tools to forefront electronic structure problems in molecular and solid systems; and expanding, explaining, and enhancing the impact of these advanced computational approaches. In particular, we have developed a quantum Monte Carlo code (QWalk, www.qwalk.org) which was significantly expanded and optimized using funds from this support and has become an actively used tool in the petascale regime by ORNL researchers and beyond. These developments build upon efforts undertaken by the PI's group and collaborators over the last decade. The code was optimized and tested extensively on a number of parallel architectures, including the petaflop ORNL Jaguar machine. We have developed and redesigned a number of code modules, such as the evaluation of wave functions and orbitals, calculation of pfaffians, and introduction of backflow coordinates, together with the overall organization of the code and random-walker distribution over multicore architectures. We have addressed several bottlenecks such as load balancing, and verified the efficiency and accuracy of the calculations with the other groups of the Endstation team. The QWalk package contains about 50,000 lines of high-quality object-oriented C++ and also includes interfaces to data files from conventional electronic structure codes such as Gamess, Gaussian, Crystal, and others.
    This grant supported the PI for one month during summers, a full-time postdoc, and, partially, three graduate students over the grant duration; it has resulted in 13 published papers and 15 invited talks and lectures nationally and internationally. My former graduate student and postdoc Dr. Michal Bajdich, who was supported by this grant, is currently a postdoc at ORNL in the group of Dr. F. Reboredo and Dr. P. Kent and is using the developed tools in a number of DOE projects. The QWalk package has become a truly important research tool used by the electronic structure community and has attracted several new developers in other research groups. Our tools support several types of correlated wavefunction approaches (variational, diffusion, and reptation methods) and large-scale optimization of wavefunctions, and enable calculation of energy differences such as cohesion and electronic gaps, as well as densities and other properties; using multiple runs one can obtain equations of state for given structures and beyond. Our codes use efficient numerical and Monte Carlo strategies (high-accuracy numerical orbitals, multi-reference wave functions, highly accurate correlation factors, pairing orbitals, force-biased and correlated-sampling Monte Carlo), are robustly parallelized, and run very efficiently on tens of thousands of cores. Our demonstration applications focused on challenging research problems in several fields of materials science, such as transition metal solids. We note that our study of FeO solid was the first QMC calculation of transition metal oxides at high pressures.

  15. Atropos: specific, sensitive, and speedy trimming of sequencing reads.

    PubMed

    Didion, John P; Martin, Marcel; Collins, Francis S

    2017-01-01

    A key step in the transformation of raw sequencing reads into biological insights is the trimming of adapter sequences and low-quality bases. Read trimming has been shown to increase the quality and reliability of downstream analyses while decreasing their computational requirements. Many read trimming software tools are available; however, no tool simultaneously provides the accuracy, computational efficiency, and feature set required to handle the types and volumes of data generated in modern sequencing-based experiments. Here we introduce Atropos and show that it trims reads with high sensitivity and specificity while maintaining leading-edge speed. Compared to other state-of-the-art read trimming tools, Atropos achieves significant increases in trimming accuracy while remaining competitive in execution times. Furthermore, Atropos maintains high accuracy even when trimming data with elevated rates of sequencing errors. The accuracy, high performance, and broad feature set offered by Atropos make it an appropriate choice for the pre-processing of Illumina, ABI SOLiD, and other current-generation short-read sequencing datasets. Atropos is open source and free software written in Python (3.3+) and available at https://github.com/jdidion/atropos.
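    As a toy illustration of what a read trimmer does (Atropos itself performs error-tolerant alignment and offers far more features), the following sketch removes a 3' adapter occurrence while allowing a fixed number of mismatches; the sequences are invented for the example:

```python
def trim_adapter(read, adapter, max_mismatches=1, min_overlap=3):
    """Return the read with a 3' adapter occurrence removed, if one is found.

    Scans each candidate start position and accepts the first overlap with the
    adapter that has at most `max_mismatches` mismatches and spans at least
    `min_overlap` bases.
    """
    for start in range(len(read) - min_overlap + 1):
        overlap = read[start:start + len(adapter)]
        mismatches = sum(1 for a, b in zip(overlap, adapter) if a != b)
        if mismatches <= max_mismatches:
            return read[:start]
    return read

# A read whose 3' end contains the start of a (hypothetical) adapter sequence:
trimmed = trim_adapter("ACGTACGTAGATCGGAAG", "AGATCGGAAGAGC")
```

    Real trimmers also weigh overlap length against mismatch count and handle quality scores, which this sketch deliberately omits.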

  17. A novel FPGA-programmable switch matrix interconnection element in quantum-dot cellular automata

    NASA Astrophysics Data System (ADS)

    Hashemi, Sara; Rahimi Azghadi, Mostafa; Zakerolhosseini, Ali; Navi, Keivan

    2015-04-01

    Quantum-dot cellular automata (QCA) is a novel nanotechnology promising ultra-low-power, extremely dense, and very high-speed structures for the construction of logic circuits at the nanoscale. In this paper, previous work on routing elements for QCA-based FPGAs is first reviewed, and then an efficient, symmetric and reliable QCA programmable switch matrix (PSM) interconnection element is introduced. This element has a simple structure and offers complete routing capability. It is implemented using a bottom-up design approach that starts from a dense and high-speed 2:1 multiplexer and utilises it to build the target PSM interconnection element. In this study, simulations of the proposed circuits are carried out using QCADesigner, a layout and simulation tool for QCA circuits. The results demonstrate the high efficiency of the proposed designs in QCA-based FPGA routing.
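    QCA logic is commonly synthesized from three-input majority gates and inverters, and a 2:1 multiplexer, the building block mentioned above, can be expressed with three majority gates. The sketch below is the generic textbook construction, not the authors' specific layout:

```python
def maj(a, b, c):
    """Three-input majority vote, the native QCA logic primitive."""
    return int(a + b + c >= 2)

def mux2to1(a, b, sel):
    """out = a when sel == 0, b when sel == 1, built from majority gates."""
    and_a = maj(a, 1 - sel, 0)   # AND(a, NOT sel) = M(a, NOT sel, 0)
    and_b = maj(b, sel, 0)       # AND(b, sel)     = M(b, sel, 0)
    return maj(and_a, and_b, 1)  # OR of the two   = M(x, y, 1)
```

    Fixing one majority input to 0 yields AND and fixing it to 1 yields OR, which is why three majority cells suffice for the multiplexer.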

  18. Transmutation prospect of long-lived nuclear waste induced by high-charge electron beam from laser plasma accelerator

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Xu, Z. Y.; Luo, W.; Lu, H. Y.; Zhu, Z. C.; Yan, X. Q.

    2017-09-01

    Photo-transmutation of long-lived nuclear waste induced by a high-charge relativistic electron beam (e-beam) from a laser plasma accelerator is demonstrated. A collimated relativistic e-beam with a high charge of approximately 100 nC is produced from high-intensity laser interaction with near-critical-density (NCD) plasma. This e-beam impinges on a high-Z convertor and then radiates energetic bremsstrahlung photons with a flux approaching 10^11 per laser shot. Taking the long-lived radionuclide 126Sn as an example, the resulting transmutation reaction yield is of the order of 10^9 per laser shot, which is two orders of magnitude higher than obtained in previous studies. It is found that at lower densities, a tightly focused laser irradiating relatively longer NCD plasmas can effectively enhance the transmutation efficiency. Furthermore, the photo-transmutation scheme is generalized by considering mixed-nuclide waste samples, which suggests that the laser-accelerated high-charge e-beam could be an efficient tool for transmuting long-lived nuclear waste.

  19. Laboratory automation of high-quality and efficient ligand-binding assays for biotherapeutic drug development.

    PubMed

    Wang, Jin; Patel, Vimal; Burns, Daniel; Laycock, John; Pandya, Kinnari; Tsoi, Jennifer; DeSilva, Binodh; Ma, Mark; Lee, Jean

    2013-07-01

    Regulated bioanalytical laboratories that run ligand-binding assays in support of biotherapeutics development face ever-increasing demand to support more projects with increased efficiency. Laboratory automation is a tool that has the potential to improve both quality and efficiency in a bioanalytical laboratory. The success of laboratory automation requires thoughtful evaluation of program needs and fit-for-purpose strategies, followed by pragmatic implementation plans and continuous user support. In this article, we present the development of fit-for-purpose automation in both total walk-away and flexible modular modes. We share our sustained experience of vendor collaboration and teamwork to educate, promote and track the use of automation. The implementation of laboratory automation improves assay performance, data quality, process efficiency and method transfer to CROs in a regulated bioanalytical laboratory environment.

  20. Ultra-low loss fully-etched grating couplers for perfectly vertical coupling compatible with DUV lithography tools

    NASA Astrophysics Data System (ADS)

    Dabos, G.; Pleros, N.; Tsiokos, D.

    2016-03-01

    Hybrid integration of VCSELs onto silicon-on-insulator (SOI) substrates has emerged as an attractive approach for bridging the gap between cost-effective, energy-efficient directly modulated laser sources and silicon-based PICs by leveraging flip-chip (FC) bonding techniques and silicon grating couplers (GCs). In this context, silicon GCs should comply with the process requirements imposed by complementary metal-oxide-semiconductor (CMOS) manufacturing tools while addressing the challenges originating from perfectly vertical incidence. Firstly, fully etched GCs compatible with deep-ultraviolet lithography tools and offering high coupling efficiency are imperative for maintaining low fabrication cost. Secondly, the GC's tolerance to VCSEL bonding misalignment errors is a prerequisite for practical deployment. Finally, a major challenge originating from the perfectly vertical coupling scheme is the minimization of the direct back-reflection to the VCSEL's outgoing facet, which may destabilize its operation. Motivated by these challenges, we used numerical simulation tools to design an ultra-low-loss, bidirectional VCSEL-to-SOI optical coupling scheme for either TE or TM polarization, based on low-cost fully etched GCs with a 340 nm Si-layer, without employing bottom reflectors or optimizing the buried-oxide layer. Comprehensive 2D finite-difference time-domain simulations have been performed. The reported GC layout remains fully compatible with the back-end-of-line (BEOL) stack associated with 3D integration technology, exploiting all the inter-metal-dielectric (IMD) layers of the CMOS fab. Simulation results predicted, for the first time in fully etched structures, a coupling efficiency of -0.87 dB at 1548 nm and -1.47 dB at 1560 nm with a minimum direct back-reflection of -27.4 dB and -14.2 dB for TE and TM polarization, respectively.
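    For intuition, the quoted dB figures convert to linear fractions as follows (a simple unit conversion, not part of the study's simulations):

```python
# Power ratios in dB convert to linear fractions via 10**(dB/10).

def db_to_linear(db):
    return 10.0 ** (db / 10.0)

te_eff = db_to_linear(-0.87)             # ~0.82, i.e. ~82% coupling at 1548 nm (TE)
tm_eff = db_to_linear(-1.47)             # ~0.71 at 1560 nm (TM)
te_backreflection = db_to_linear(-27.4)  # well under 1% reflected back to the VCSEL
```
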

  1. A Focus Group Exploration of Automated Case-Finders to Identify High-Risk Heart Failure Patients Within an Urban Safety Net Hospital.

    PubMed

    Patterson, Mark E; Miranda, Derick; Schuman, Greg; Eaton, Christopher; Smith, Andrew; Silver, Brad

    2016-01-01

    Leveraging "big data" as a means of informing cost-effective care holds potential for triaging high-risk heart failure (HF) patients for interventions within hospitals seeking to reduce 30-day readmissions. The objective was to explore providers' beliefs and perceptions about using an electronic health record (EHR)-based tool that uses unstructured clinical notes to risk-stratify high-risk heart failure patients. Six providers from an inpatient HF clinic within an urban safety net hospital were recruited to participate in a semistructured focus group. A facilitator led a discussion on the feasibility and value of using an EHR tool driven by unstructured clinical notes to help identify high-risk patients. Data collected from transcripts were analyzed using a thematic analysis that facilitated drawing conclusions clustered around categories and themes. From six categories emerged two themes: (1) the challenges of finding valid and accurate results, and (2) strategies used to overcome these challenges. Although a tool that uses unstructured electronic medical record (EMR) text as the benchmark by which to identify high-risk patients is efficient, choosing appropriate benchmark groups could be challenging given the multiple causes of readmission. Strategies to mitigate these challenges include establishing clear selection criteria to guide benchmark group composition, and quality outcome goals for the hospital. Prior to implementing into practice an innovative EMR-based case-finder driven by unstructured clinical notes, providers are advised to: (1) define patient quality outcome goals, (2) establish criteria to guide benchmark selection, and (3) verify the tool's validity and reliability. Achieving consensus on these issues will be necessary for this innovative EHR-based tool to improve clinical decision-making and, in turn, decrease readmissions for high-risk patients.

  2. Web Audio/Video Streaming Tool

    NASA Technical Reports Server (NTRS)

    Guruvadoo, Eranna K.

    2003-01-01

    To promote the NASA-wide educational outreach program intended to educate and inform the public about space exploration, NASA at Kennedy Space Center is seeking efficient ways to add more content to the web by streaming audio/video files. This project proposes a high-level overview of a framework for the creation, management, and scheduling of audio/video assets over the web. To support short-term goals, a prototype web-based tool was designed and demonstrated to automate the process of streaming audio/video files. The tool provides web-based user interfaces to manage video assets, create publishable schedules of video assets for streaming, and schedule the streaming events. These operations are performed on user-defined and system-derived metadata of audio/video assets stored in a relational database, while the assets themselves reside in a separate repository. The prototype tool was designed using ColdFusion 5.0.

  3. The development of Zirconia and Copper toughened Alumina ceramic insert

    NASA Astrophysics Data System (ADS)

    Amalina Sabuan, Nur; Zolkafli, Nurfatini; Mebrahitom, A.; Azhari, Azmir; Mamat, Othman

    2018-04-01

    Ceramic cutting tools have been used in industry for over a century for their productivity and efficiency as machine- and cutting-tool materials. However, their brittleness has limited their application. To manufacture high-strength ceramic cutting tools, suitable reinforcement is needed to improve their toughness. In this work, copper (Cu) and zirconia (ZrO2) powders were added to investigate the hardness and physical properties of the developed composite insert. Uniaxial pre-forming of the mixed powder was performed prior to densification by sintering at 1000 and 1300 °C. The effect of the reinforcement composition on the hardness, density, shrinkage, and microstructure of the inserts was investigated. An optimum density of 3.26 g/cm³ and a hardness of 1385 HV were obtained for the composite with 10 wt% zirconia and 10 wt% copper sintered at 1000 °C.

  4. A Computationally Efficient Method for Polyphonic Pitch Estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Ruohua; Reiss, Joshua D.; Mattavelli, Marco; Zoia, Giorgio

    2009-12-01

    This paper presents a computationally efficient method for polyphonic pitch estimation. The method employs the Fast Resonator Time-Frequency Image (RTFI) as the basic time-frequency analysis tool. The approach is composed of two main stages. First, a preliminary pitch estimate is obtained by means of a simple peak-picking procedure in the pitch energy spectrum. This spectrum is calculated from the original RTFI energy spectrum according to harmonic grouping principles. Then, incorrect estimates are removed according to spectral irregularity and knowledge of the harmonic structures of notes played on commonly used musical instruments. The new approach is compared with a variety of other frame-based polyphonic pitch estimation methods, and the results demonstrate the high performance and computational efficiency of the approach.
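    The harmonic-grouping step can be sketched as follows; the spectrum, tolerance, and candidate list are invented for illustration, and the RTFI front end is omitted entirely:

```python
def pitch_salience(spectrum_hz, f0, n_harmonics=5, tol=3.0):
    """Sum the spectral energy found within +/-tol Hz of each harmonic of f0.

    `spectrum_hz` maps frequency (Hz) to energy; in the paper this role is
    played by the RTFI energy spectrum.
    """
    total = 0.0
    for k in range(1, n_harmonics + 1):
        target = k * f0
        total += sum(e for f, e in spectrum_hz.items() if abs(f - target) <= tol)
    return total

# Hypothetical spectrum of a 220 Hz note: energy sits at its harmonics.
spec = {220.0: 1.0, 440.0: 0.6, 660.0: 0.4, 880.0: 0.2, 1100.0: 0.1}
candidates = [110.0, 220.0, 330.0]
# Peak picking over candidate pitches: the true pitch collects the most
# harmonic energy, beating the sub-octave 110 Hz which only matches every
# other harmonic.
best = max(candidates, key=lambda f0: pitch_salience(spec, f0))
```
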

  5. Evaluation of the efficiency and reliability of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  6. CCTop: An Intuitive, Flexible and Reliable CRISPR/Cas9 Target Prediction Tool

    PubMed Central

    del Sol Keyer, Maria; Wittbrodt, Joachim; Mateo, Juan L.

    2015-01-01

    Engineering of the CRISPR/Cas9 system has opened a plethora of new opportunities for site-directed mutagenesis and targeted genome modification. Fundamental to this is a stretch of twenty nucleotides at the 5’ end of a guide RNA that provides specificity to the bound Cas9 endonuclease. Since a sequence of twenty nucleotides can occur multiple times in a given genome and some mismatches seem to be accepted by the CRISPR/Cas9 complex, efficient and reliable in silico selection and evaluation of the target site is a key prerequisite for experimental success. Here we present the CRISPR/Cas9 target online predictor (CCTop, http://crispr.cos.uni-heidelberg.de) to overcome limitations of already available tools. CCTop provides an intuitive user interface with reasonable default parameters that can easily be tuned by the user. From a given query sequence, CCTop identifies and ranks all candidate sgRNA target sites according to their off-target quality and displays full documentation. CCTop was experimentally validated for gene inactivation, non-homologous end-joining as well as homology-directed repair. Thus, CCTop provides the bench biologist with a tool for the rapid and efficient identification of high-quality target sites. PMID:25909470
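    The first step of any Cas9 target-site finder, locating 20-nt protospacers followed by an NGG PAM, can be sketched as follows (forward strand only; CCTop additionally scans the reverse strand and scores genome-wide off-targets):

```python
import re

def find_sgRNA_sites(seq):
    """Return (position, protospacer, PAM) for every 20N-NGG match.

    A lookahead is used so that overlapping candidate sites are all reported.
    """
    sites = []
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
        sites.append((m.start(), m.group(1), m.group(2)))
    return sites

# Toy sequence: a 20-nt protospacer followed by a TGG PAM.
demo = "A" * 20 + "TGG" + "CCCC"
hits = find_sgRNA_sites(demo)
```
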

  7. The fourth annual BRDS on genome editing and silencing for precision medicines

    PubMed Central

    Chaudhary, Amit Kumar; Bhattarai, Rajan Sharma; Mahato, Ram I.

    2018-01-01

    Precision medicine is promising for treating human diseases, as it focuses on tailoring drugs to a patient’s genes, environment, and lifestyle. The need for personalized medicines has opened the doors to turning nucleic acids into therapeutics. Although gene therapy has the potential to treat and cure genetic and acquired diseases, it needs to overcome certain obstacles before yielding mainstream prescription drugs. Recent advances in the life sciences have led to a better understanding of the effective manipulation and delivery of genome-engineering tools. The use of sequence-specific nucleases allows genetic changes in human cells to be made easily, with higher efficiency and precision than before. Nanotechnology has made rapid advances in the field of drug delivery, but the delivery of nucleic acids presents unique challenges. Efficient, rapid genome-editing tools with negligible off-target effects are also in high demand for precision medicine. The fourth annual Biopharmaceutical Research and Development Symposium (BRDS), held at the University of Nebraska Medical Center (UNMC) on September 7-8, 2017, covered different facets of developing tools for precision medicine for the therapy and diagnosis of genetic disorders. PMID:29209906

  8. Simple Tools to Facilitate Project Management of a Nursing Research Project.

    PubMed

    Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret

    2016-07-01

    Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.

  9. Benchmarking CRISPR on-target sgRNA design.

    PubMed

    Yan, Jifang; Chuai, Guohui; Zhou, Chi; Zhu, Chenyu; Yang, Jing; Zhang, Chao; Gu, Feng; Xu, Han; Wei, Jia; Liu, Qi

    2017-02-15

    CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats)-based gene editing has been widely implemented in various cell types and organisms. A major challenge in the effective application of the CRISPR system is the need to design highly efficient single-guide RNA (sgRNA) with minimal off-target cleavage. Several tools are available for sgRNA design, but few have been compared; in our opinion, benchmarking the performance of the available tools and indicating their applicable scenarios are important issues. Moreover, whether the reported sgRNA design rules are reproducible across different sgRNA libraries, cell types and organisms remains unclear. In our study, a systematic and unbiased benchmark of sgRNA efficacy prediction was performed on nine representative on-target design tools, based on six benchmark data sets covering five different cell types. The benchmark study presented here provides novel quantitative insights into the available CRISPR tools. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
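
    Benchmarks of on-target design tools conventionally score each tool by the rank correlation between its predicted sgRNA efficacies and the measured ones. A minimal stdlib-only sketch of that metric (Spearman's rho computed as the Pearson correlation of rank vectors); the example scores are purely illustrative:

```python
def ranks(xs):
    """Average ranks (1-based); tied values receive the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Illustrative only: predicted vs. measured cleavage efficacy for 5 sgRNAs.
predicted = [0.91, 0.40, 0.77, 0.12, 0.55]
measured  = [0.85, 0.30, 0.90, 0.05, 0.50]
print(round(spearman(predicted, measured), 3))  # 0.9
```

    A tool that ranks sgRNAs in the same order as the measured efficacies scores rho = 1; uncorrelated predictions score near 0.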

  10. Efficient induction of dopaminergic neuron differentiation from induced pluripotent stem cells reveals impaired mitophagy in PARK2 neurons.

    PubMed

    Suzuki, Sadafumi; Akamatsu, Wado; Kisa, Fumihiko; Sone, Takefumi; Ishikawa, Kei-Ichi; Kuzumaki, Naoko; Katayama, Hiroyuki; Miyawaki, Atsushi; Hattori, Nobutaka; Okano, Hideyuki

    2017-01-29

    Patient-specific induced pluripotent stem cells (iPSCs) show promise for use as tools for in vitro modeling of Parkinson's disease. We sought to improve the efficiency of dopaminergic (DA) neuron induction from iPSCs by using surface markers expressed in DA progenitors, to increase the significance of the phenotypic analysis. By sorting for a CD184(high)/CD44(-) fraction during neural differentiation, we obtained a population of cells that was enriched in DA neuron precursor cells and achieved higher differentiation efficiencies than those obtained through the same protocol without sorting. This high-efficiency method of DA neuronal induction enabled reliable detection of reactive oxygen species (ROS) accumulation and vulnerable phenotypes in PARK2 iPSC-derived DA neurons. We additionally established a quantitative system using the mt-mKeima reporter to monitor mitophagy, in which mitochondria fuse with lysosomes, and, by combining this system with the method of DA neuronal induction described above, determined that mitophagy is impaired in PARK2 neurons. These findings suggest that the efficiency of DA neuron induction is important for the precise detection of cellular phenotypes in modeling Parkinson's disease. Copyright © 2016. Published by Elsevier Inc.

  11. Efficient temporal and interlayer parameter prediction for weighted prediction in scalable high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Tsang, Sik-Ho; Chan, Yui-Lam; Siu, Wan-Chi

    2017-01-01

    Weighted prediction (WP) is an efficient video coding tool, introduced in the H.264/AVC video coding standard, for compensating temporal illumination changes in motion estimation and compensation. The WP parameters, a multiplicative weight and an additive offset for each reference frame, must be estimated and transmitted to the decoder in the slice header, which costs extra bits in the coded video bitstream. High efficiency video coding (HEVC) provides WP parameter prediction to reduce this overhead, so WP parameter prediction is crucial to research and applications related to WP. Prior art has further improved WP parameter prediction through implicit prediction from image characteristics and derivation of the parameters. By exploiting both temporal and interlayer redundancies, we propose three WP parameter prediction algorithms, enhanced implicit WP parameter prediction, enhanced direct WP parameter derivation, and interlayer WP parameter prediction, to further improve the coding efficiency of HEVC. Results show that our proposed algorithms achieve up to 5.83% and 5.23% bitrate reduction in the base layer, compared with conventional scalable HEVC, for SNR scalability and 2× spatial scalability, respectively.
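
    The weight/offset pair acts as a per-reference linear brightness model applied to each predicted sample. A minimal Python sketch of how a decoder applies explicit WP in fixed-point arithmetic; the sample values and the weight denominator are illustrative, and the exact rounding/clipping details vary by standard and profile:

```python
def apply_weighted_prediction(pred_block, w, o, log_wd=6, bit_depth=8):
    """Apply explicit weighted prediction sample-by-sample:
    p' = clip(((p * w + rounding) >> log_wd) + o),
    where w is the multiplicative weight (denominator 2**log_wd)
    and o is the additive offset signalled per reference frame."""
    rounding = 1 << (log_wd - 1)
    max_val = (1 << bit_depth) - 1
    return [[min(max_val, max(0, ((p * w + rounding) >> log_wd) + o))
             for p in row]
            for row in pred_block]

# A reference block darker than the current frame: brighten it with
# weight 80/64 (= 1.25x) and offset +3.
block = [[100, 102], [98, 101]]
print(apply_weighted_prediction(block, w=80, o=3))  # [[128, 131], [126, 129]]
```

    With w equal to the denominator (here 64) and o = 0 the prediction passes through unchanged, which is why signalling good parameters, rather than defaults, is what buys the illumination compensation.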

  12. A high-performance spatial database based approach for pathology imaging algorithm evaluation

    PubMed Central

    Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A.D.; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J.; Saltz, Joel H.

    2013-01-01

    Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) develop a set of queries to support data sampling and result comparisons; (4) achieve high-performance computation via a parallel data management infrastructure with parallel data loading and spatial indexing optimizations. Materials and Methods: We considered two scenarios for algorithm evaluation: (1) algorithm comparison, where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation, where algorithm results are compared with human annotations. We developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The validated data were formatted based on the PAIS data model and loaded into a spatial database. To support efficient data loading, we implemented a parallel data loading tool that takes advantage of multi-core CPUs to accelerate data injection. The spatial database manages both geometric shapes and image features or classifications, and enables spatial sampling, result comparison, and result aggregation through expressive structured query language (SQL) queries with spatial extensions. To provide scalable and efficient query support, we employed a shared-nothing parallel database architecture, which distributes data homogeneously across multiple database partitions to exploit parallel computation power, and implements spatial indexing to achieve high I/O throughput. Results: Our work proposes a high-performance, parallel spatial database platform for algorithm validation and comparison. This platform was evaluated by storing, managing, and comparing analysis results from a set of brain tumor whole slide images. The tools we developed are open source and available for download. Conclusions: Pathology image algorithm validation and comparison are essential to iterative algorithm development and refinement. One critical component is support for queries involving spatial predicates and comparisons. In our work, we developed an efficient data model and parallel database approach to model, normalize, manage and query large volumes of analytical image result data. Our experiments demonstrate that the data partitioning strategy and the grid-based indexing result in good data distribution across database nodes and reduce I/O overhead in spatial join queries through parallel retrieval of relevant data and quick subsetting of datasets. The set of tools in the framework provides a full pipeline to normalize, load, manage and query analytical results for algorithm evaluation. PMID:23599905
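
    The core of the validation scenario is measuring spatial overlap between an algorithm's segmented regions and the human annotations (in the platform above this is expressed as SQL spatial joins). A stdlib-only Python sketch of one such overlap metric, the Jaccard index (intersection over union) on rasterized region masks; the tiny masks are purely illustrative:

```python
def jaccard(mask_a, mask_b):
    """Jaccard index (intersection-over-union) of two equal-size
    binary masks; 1.0 means identical regions, 0.0 means disjoint."""
    inter = union = 0
    for row_a, row_b in zip(mask_a, mask_b):
        for a, b in zip(row_a, row_b):
            inter += a & b
            union += a | b
    return inter / union if union else 1.0

# Algorithm result vs. human annotation for one (tiny) nucleus region.
algo  = [[0, 1, 1, 0],
         [0, 1, 1, 1],
         [0, 0, 1, 1]]
human = [[0, 1, 1, 1],
         [0, 1, 1, 1],
         [0, 0, 0, 1]]
print(round(jaccard(algo, human), 3))  # 0.75
```

    In the real platform the same computation runs on polygon boundaries inside the database, so intersection and union are evaluated by the spatial engine rather than per pixel; the raster version shown here is just the simplest faithful illustration of the metric.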

  13. Distribution and Validation of CERES Irradiance Global Data Products Via Web Based Tools

    NASA Technical Reports Server (NTRS)

    Rutan, David; Mitrescu, Cristian; Doelling, David; Kato, Seiji

    2016-01-01

    The CERES SYN1deg product provides climate-quality, 3-hourly, globally gridded and temporally complete maps of top-of-atmosphere, in-atmosphere, and surface fluxes. This product requires efficient release to the public and validation to maintain quality assurance. The CERES team developed web tools for distributing both the global gridded products and the grid boxes containing long-term validation sites that maintain high-quality flux observations at the Earth's surface. These are found at: http://ceres.larc.nasa.gov/order_data.php. In this poster we explore the various tools available for users to subset and download the SYN1deg and Surface-EBAF products and to validate them against surface observations. We also analyze differences found in long-term records from well-maintained land surface sites, such as the ARM central facility, and from high-quality buoy radiometers, which due to their isolated nature cannot be maintained in the same manner as their land-based counterparts.

  14. The Creation of a CPU Timer for High Fidelity Programs

    NASA Technical Reports Server (NTRS)

    Dick, Aidan A.

    2011-01-01

    Using the C and C++ programming languages, a tool was developed that measures the efficiency of a program by recording the amount of CPU time that various functions consume. By inserting the tool between lines of code in the program, one receives a detailed report of the absolute and relative time consumption of each section. After adapting the generic tool for MAVERIC, a high-fidelity launch vehicle simulation program, the components of a frequently used function called "derivatives()" were measured. Of the 34 sub-functions in "derivatives()", the top 8 accounted for 83.1% of the total time spent. To decrease the overall run time of MAVERIC, a change was implemented in the sub-function "Event_Controller()". Reformatting "Event_Controller()" led to a 36.9% decrease in the CPU time spent by that sub-function and a 3.2% decrease in the CPU time spent by the overarching function "derivatives()".
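
    The original tool was written in C/C++; an analogous stdlib-Python sketch of the same idea, accumulating per-section CPU (not wall-clock) time and reporting absolute and relative shares. The section names are hypothetical stand-ins for sub-functions like those profiled above:

```python
import time
from collections import defaultdict
from contextlib import contextmanager

cpu_totals = defaultdict(float)

@contextmanager
def cpu_section(name):
    """Accumulate CPU time spent inside a named section of code."""
    start = time.process_time()
    try:
        yield
    finally:
        cpu_totals[name] += time.process_time() - start

def report():
    """Per-section (name, seconds, percent-of-total), largest first."""
    total = sum(cpu_totals.values()) or 1.0
    return [(name, t, 100.0 * t / total)
            for name, t in sorted(cpu_totals.items(), key=lambda kv: -kv[1])]

# Hypothetical sub-functions of a derivatives()-style routine.
with cpu_section("aero_forces"):
    sum(i * i for i in range(200_000))
with cpu_section("event_controller"):
    sum(i for i in range(50_000))

for name, seconds, percent in report():
    print(f"{name}: {seconds:.4f} s ({percent:.1f}%)")
```

    Wrapping sections with a context manager plays the role of "inserting the tool between lines of code": each instrumented span contributes to its named bucket, and the report gives the absolute and relative breakdown described in the abstract.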

  15. From field notes to data portal - An operational QA/QC framework for tower networks

    NASA Astrophysics Data System (ADS)

    Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.

    2016-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is in the extreme. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the Ameriflux community and provide tools to aid continued network advancements.

  16. Strategies in Interventional Radiology: Formation of an Interdisciplinary Center of Vascular Anomalies - Chances and Challenges for Effective and Efficient Patient Management.

    PubMed

    Sadick, Maliha; Dally, Franz Josef; Schönberg, Stefan O; Stroszczynski, Christian; Wohlgemuth, Walter A

    2017-10-01

    Background  Radiology is an interdisciplinary field dedicated to the diagnosis and treatment of numerous diseases and is involved in the development of multimodal treatment concepts. Method  Interdisciplinary case management, a broad spectrum of diagnostic imaging facilities and dedicated endovascular radiological treatment options are valuable tools that allow radiology to set up an interdisciplinary center for vascular anomalies. Results  Image-based diagnosis combined with endovascular treatment options is an essential tool for the treatment of patients with highly complex vascular diseases. These vascular anomalies can affect numerous parts of the body, so a multidisciplinary treatment approach is required for optimal patient care. Conclusion  This paper discusses the possibilities and challenges regarding effective and efficient patient management in connection with the formation of an interdisciplinary center for vascular anomalies, with strengthening of the clinical role of radiologists. Key points  · Vascular anomalies, which include vascular tumors and malformations, are complex to diagnose and treat. · There are far more patients with vascular anomalies requiring therapy than there are interdisciplinary centers to treat them; Germany currently has a shortage of dedicated interdisciplinary centers that can provide dedicated care for affected patients. · Radiology offers a broad spectrum of diagnostic and minimally invasive therapeutic tools, which allow the formation of an interdisciplinary center for vascular anomalies for effective, efficient and comprehensive patient management. Citation Format · Sadick M, Dally FJ, Schönberg SO et al. Strategies in Interventional Radiology: Formation of an Interdisciplinary Center of Vascular Anomalies - Chances and Challenges for Effective and Efficient Patient Management. Fortschr Röntgenstr 2017; 189: 957 - 966. © Georg Thieme Verlag KG Stuttgart · New York.

  17. Moving Large Data Sets Over High-Performance Long Distance Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodson, Stephen W; Poole, Stephen W; Ruwart, Thomas

    2011-04-01

    In this project we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing large data sets to a destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes. We describe the device information required to achieve high levels of I/O performance and discuss how this data is applicable in use cases beyond data movement performance.

  18. Nucleic acid tool enzymes-aided signal amplification strategy for biochemical analysis: status and challenges.

    PubMed

    Qing, Taiping; He, Dinggeng; He, Xiaoxiao; Wang, Kemin; Xu, Fengzhou; Wen, Li; Shangguan, Jingfang; Mao, Zhengui; Lei, Yanli

    2016-04-01

    Owing to their highly efficient catalytic effects and substrate specificity, the nucleic acid tool enzymes are applied as 'nano-tools' for manipulating different nucleic acid substrates both in the test-tube and in living organisms. In addition to the function as molecular scissors and molecular glue in genetic engineering, the application of nucleic acid tool enzymes in biochemical analysis has also been extensively developed in the past few decades. Used as amplifying labels for biorecognition events, the nucleic acid tool enzymes are mainly applied in nucleic acids amplification sensing, as well as the amplification sensing of biorelated variations of nucleic acids. With the introduction of aptamers, which can bind different target molecules, the nucleic acid tool enzymes-aided signal amplification strategies can also be used to sense non-nucleic targets (e.g., ions, small molecules, proteins, and cells). This review describes and discusses the amplification strategies of nucleic acid tool enzymes-aided biosensors for biochemical analysis applications. Various analytes, including nucleic acids, ions, small molecules, proteins, and cells, are reviewed briefly. This work also addresses the future trends and outlooks for signal amplification in nucleic acid tool enzymes-aided biosensors.

  19. Glimpse: Sparsity based weak lensing mass-mapping tool

    NASA Astrophysics Data System (ADS)

    Lanusse, F.; Starck, J.-L.; Leonard, A.; Pires, S.

    2018-02-01

    Glimpse, also known as Glimpse2D, is a weak lensing mass-mapping tool that relies on a robust sparsity-based regularization scheme to recover high resolution convergence from either gravitational shear alone or from a combination of shear and flexion. Including flexion allows the supplementation of the shear on small scales in order to increase the sensitivity to substructures and the overall resolution of the convergence map. To preserve all available small scale information, Glimpse avoids any binning of the irregularly sampled input shear and flexion fields and treats the mass-mapping problem as a general ill-posed inverse problem, regularized using a multi-scale wavelet sparsity prior. The resulting algorithm incorporates redshift, reduced shear, and reduced flexion measurements for individual galaxies and is made highly efficient by the use of fast Fourier estimators.

  20. Recent advances in metabolic engineering of Saccharomyces cerevisiae: New tools and their applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lian, Jiazhang; Mishra, Shekhar; Zhao, Huimin

    Metabolic engineering aims to develop efficient cell factories by rewiring cellular metabolism. As one of the most commonly used cell factories, Saccharomyces cerevisiae has been extensively engineered to produce a wide variety of products at high levels from various feedstocks. In this paper, we summarize the recent development of metabolic engineering approaches to modulate yeast metabolism with representative examples. Particularly, we highlight new tools for biosynthetic pathway optimization (i.e. combinatorial transcriptional engineering and dynamic metabolic flux control) and genome engineering (i.e. clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR associated (Cas) system based genome engineering and RNA interference assisted genome evolution) to advance metabolic engineering in yeast. Lastly, we also discuss the challenges and perspectives for high throughput metabolic engineering.

  1. Recent advances in metabolic engineering of Saccharomyces cerevisiae: New tools and their applications

    DOE PAGES

    Lian, Jiazhang; Mishra, Shekhar; Zhao, Huimin

    2018-04-25

    Metabolic engineering aims to develop efficient cell factories by rewiring cellular metabolism. As one of the most commonly used cell factories, Saccharomyces cerevisiae has been extensively engineered to produce a wide variety of products at high levels from various feedstocks. In this paper, we summarize the recent development of metabolic engineering approaches to modulate yeast metabolism with representative examples. Particularly, we highlight new tools for biosynthetic pathway optimization (i.e. combinatorial transcriptional engineering and dynamic metabolic flux control) and genome engineering (i.e. clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR associated (Cas) system based genome engineering and RNA interference assisted genome evolution) to advance metabolic engineering in yeast. Lastly, we also discuss the challenges and perspectives for high throughput metabolic engineering.

  2. Use of a Local Immunotherapy as an Adjunctive Tool for the Generation of Human Monoclonal Antibodies from Regional Lymph Nodes of Colonic Cancer Patients

    PubMed Central

    Yagyu, Toshio; Monden, Takushi; Tamaki, Yasuhiro; Morimoto, Hideki; Takeda, Tsutomu; Kobayashi, Tetsuro; Shimano, Takashi; Murakami, Hiroki; Mori, Takesada

    1992-01-01

    Human hybridomas were generated through the fusion of the human B‐lymphoblastoid cell line HO‐323 with the regional lymph node lymphocytes of colonic cancer patients who had received a local immunotherapy. A total of 353 hybridomas were obtained from 4 patients and 116 of these were found to secrete ≧ 100 ng/ml human immunoglobulin. The efficiency was remarkably high as compared with that from patients without the local immunotherapy. Further immunohistological examination showed that 5 hybridomas secreted IgM which selectively reacted with colonic cancers. The results indicate that local immunotherapy could be an adjunctive tool for the generation of highly potent human hybridomas through augmenting the host's immunity. PMID:1544869

  3. Inlet Flow Control and Prediction Technologies for Embedded Propulsion Systems

    NASA Technical Reports Server (NTRS)

    McMillan, Michelle L.; Mackie, Scott A.; Gissen, Abe; Vukasinovic, Bojan; Lakebrink, Matthew T.; Glezer, Ari; Mani, Mori; Mace, James L.

    2011-01-01

    Fail-safe, hybrid flow control (HFC) is a promising technology for meeting high-speed cruise efficiency, low-noise signature, and reduced fuel-burn goals for future Hybrid-Wing-Body (HWB) aircraft with embedded engines. This report details the development of HFC technology that enables improved inlet performance in HWB vehicles with highly integrated inlets and embedded engines without adversely affecting vehicle performance. In addition, new test techniques for evaluating Boundary-Layer-Ingesting (BLI)-inlet flow-control technologies developed and demonstrated through this program are documented, including the ability to generate a BLI-like inlet-entrance flow in a direct-connect wind-tunnel facility, as well as the use of D-optimal, statistically designed experiments to optimize test efficiency and enable interpretation of results. Validated improvements in numerical analysis tools and methods accomplished through this program are also documented, including Reynolds-Averaged Navier-Stokes CFD simulations of steady-state flow physics for baseline BLI-inlet diffuser flow, as well as that created by flow-control devices. Finally, numerical methods were employed in a ground-breaking attempt to directly simulate dynamic distortion. The advances in inlet technologies and prediction tools will help to meet and exceed "N+2" project goals for future HWB aircraft.

  4. The replication of a mouse adapted SARS-CoV in a mouse cell line stably expressing the murine SARS-CoV receptor mACE2 efficiently induces the expression of proinflammatory cytokines.

    PubMed

    Regla-Nava, Jose A; Jimenez-Guardeño, Jose M; Nieto-Torres, Jose L; Gallagher, Thomas M; Enjuanes, Luis; DeDiego, Marta L

    2013-11-01

    Infection of conventional mice with a mouse adapted (MA15) severe acute respiratory syndrome (SARS) coronavirus (CoV) reproduces many aspects of human SARS such as pathological changes in lung, viremia, neutrophilia, and lethality. However, established mouse cell lines highly susceptible to mouse-adapted SARS-CoV infection are not available. In this work, efficiently transfectable mouse cell lines stably expressing the murine SARS-CoV receptor angiotensin converting enzyme 2 (ACE2) have been generated. These cells yielded high SARS-CoV-MA15 titers and also served as excellent tools for plaque assays. In addition, in these cell lines, SARS-CoV-MA15 induced the expression of proinflammatory cytokines and IFN-β, mimicking what has been observed in experimental animal models infected with SARS-CoV and SARS patients. These cell lines are valuable tools to perform in vitro studies in a mouse cell system that reflects the species used for in vivo studies of SARS-CoV-MA15 pathogenesis. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Combustor design tool for a gas fired thermophotovoltaic energy converter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindler, K.W.; Harper, M.J.

    1995-12-31

    Recently, there has been a renewed interest in thermophotovoltaic (TPV) energy conversion. A TPV device converts radiant energy from a high temperature incandescent emitter directly into electricity by photovoltaic cells. The current Department of Energy sponsored research involves the design, construction and demonstration of a prototype TPV converter that uses a hydrocarbon fuel (such as natural gas) as the energy source. As the photovoltaic cells are designed to efficiently convert radiant energy at a prescribed wavelength, it is important that the temperature of the emitter be nearly constant over its entire surface. The U.S. Naval Academy has been tasked with the development of a small emitter (with a high emissivity) that can be maintained at 1756 K (2700 F). This paper describes the computer spreadsheet model that was developed as a tool to be used for the design of the high temperature emitter.

  6. Development of Genome Engineering Tools from Plant-Specific PPR Proteins Using Animal Cultured Cells.

    PubMed

    Kobayashi, Takehito; Yagi, Yusuke; Nakamura, Takahiro

    2016-01-01

    The pentatricopeptide repeat (PPR) motif is a sequence-specific RNA/DNA-binding module. Elucidation of the RNA/DNA recognition mechanism has enabled engineering of PPR motifs as new RNA/DNA manipulation tools in living cells, including for genome editing. However, the biochemical characteristics of PPR proteins remain unknown, mostly due to the instability and/or unfolding propensities of PPR proteins in heterologous expression systems such as bacteria and yeast. To overcome this issue, we constructed reporter systems using cultured animal cells. The cell-based system has highly attractive features for PPR engineering: robust eukaryotic gene expression; availability of various vectors, reagents, and antibodies; a highly efficient DNA delivery ratio (>80%); and rapid, high-throughput data production. In this chapter, we introduce an example of such a reporter system: a PPR-based sequence-specific translational activation system. The cell-based reporter system can be applied to the characterization of plant genes of interest and to PPR engineering.

  7. ClusCo: clustering and comparison of protein models.

    PubMed

    Jamroz, Michal; Kolinski, Andrzej

    2013-02-22

    The development, optimization and validation of protein modeling methods require efficient tools for structural comparison. Frequently, a large number of models need to be compared with the target native structure. The main motivation for developing the Clusco software was to create a high-throughput tool for all-versus-all comparison, because calculating the similarity matrix is one of the bottlenecks in the protein modeling pipeline. Clusco is fast and easy-to-use software for high-throughput comparison of protein models with different similarity measures (cRMSD, dRMSD, GDT_TS, TM-Score, MaxSub, Contact Map Overlap) and for clustering the comparison results with standard methods: K-means clustering or hierarchical agglomerative clustering. The application is highly optimized and written in C/C++, including code for parallel execution on CPU and GPU, which results in a significant speedup over similar clustering and scoring programs.
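
    Of the listed measures, dRMSD is the simplest to sketch because it needs no superposition: it compares the intramolecular pairwise-distance matrices of the two models. A minimal stdlib-Python sketch (the toy coordinates are illustrative, not real protein data):

```python
from itertools import combinations
from math import dist, sqrt

def drmsd(coords_a, coords_b):
    """Distance RMSD: root-mean-square difference between all
    intramolecular pairwise distances of two equal-length coordinate
    sets. Unlike cRMSD, no optimal superposition is required."""
    pairs = list(combinations(range(len(coords_a)), 2))
    sq = sum((dist(coords_a[i], coords_a[j]) -
              dist(coords_b[i], coords_b[j])) ** 2
             for i, j in pairs)
    return sqrt(sq / len(pairs))

# Two toy 4-atom "models"; the second is the first rigidly shifted,
# so the distance matrices match and dRMSD is 0.
model = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (1.5, 1.5, 0.0), (0.0, 1.5, 1.0)]
shifted = [(x + 2.0, y - 1.0, z) for x, y, z in model]
print(drmsd(model, shifted))
```

    An all-versus-all comparison of N models calls such a measure N(N-1)/2 times, which is exactly the similarity-matrix bottleneck the abstract describes and why Clusco parallelizes it.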

  8. Shallow aquifer storage and recovery (SASR): Initial findings from the Willamette Basin, Oregon

    NASA Astrophysics Data System (ADS)

    Neumann, P.; Haggerty, R.

    2012-12-01

    A novel mode of shallow aquifer management could increase the volumetric potential and distribution of groundwater storage. We refer to this mode as shallow aquifer storage and recovery (SASR) and gauge its potential as a freshwater storage tool. By this mode, water is stored in hydraulically connected aquifers with minimal impact to surface water resources. Basin-scale numerical modeling provides a linkage between storage efficiency and hydrogeological parameters, which in turn guides rulemaking for how and where water can be stored. Increased understanding of regional groundwater-surface water interactions is vital to effective SASR implementation. In this study we (1) use a calibrated model of the central Willamette Basin (CWB), Oregon to quantify SASR storage efficiency at 30 locations; (2) estimate SASR volumetric storage potential throughout the CWB based on these results and pertinent hydrogeological parameters; and (3) introduce a methodology for management of SASR by such parameters. Of 3 shallow, sedimentary aquifers in the CWB, we find the moderately conductive, semi-confined, middle sedimentary unit (MSU) to be most efficient for SASR. We estimate that users overlying 80% of the area in this aquifer could store injected water with greater than 80% efficiency, and find efficiencies of up to 95%. As a function of local production well yields, we estimate a maximum annual volumetric storage potential of 30 million m3 using SASR in the MSU. This volume constitutes roughly 9% of the current estimated summer pumpage in the Willamette basin at large. The dimensionless quantity lag #—calculated using modeled specific capacity, distance to nearest in-layer stream boundary, and injection duration—exhibits relatively high correlation to SASR storage efficiency at potential locations in the CWB. This correlation suggests that basic field measurements could guide SASR as an efficient shallow aquifer storage tool.

  9. Comparison in partition efficiency of protein separation between four different tubing modifications in spiral high-speed countercurrent chromatography

    PubMed Central

    Ito, Yoichiro; Clary, Robert

    2016-01-01

    High-speed countercurrent chromatography with a spiral tube assembly can retain a satisfactory amount of the stationary phase of polymer phase systems used for protein separation. In order to improve the partition efficiency, a simple tool for modifying the tubing shape was fabricated, and four different tubing modifications were made: intermittently pressed at 10 mm width, flat, flat-wave, and flat-twist. The partition efficiencies of separation columns made from the modified tubing were examined in protein separation with an aqueous-aqueous polymer phase system at flow rates of 1–2 ml/min under 800 rpm. The results indicated that all of the modified columns improved the partition efficiency at a flow rate of 1 ml/min, but at a higher flow rate of 2 ml/min the columns made of flattened tubing showed lowered partition efficiency, apparently due to loss of the retained stationary phase. Among all the modified columns, the column with intermittently pressed tubing gave the best peak resolution. It may be concluded that the intermittently pressed and flat-twist modifications improve the partition efficiency in semi-preparative separations, while the flat and flat-wave configurations may be used for analytical separations at a low flow rate. PMID:27790621

  10. Comparison in partition efficiency of protein separation between four different tubing modifications in spiral high-speed countercurrent chromatography.

    PubMed

    Ito, Yoichiro; Clary, Robert

    2016-12-01

    High-speed countercurrent chromatography with a spiral tube assembly can retain a satisfactory amount of the stationary phase of polymer phase systems used for protein separation. In order to improve the partition efficiency, a simple tool for modifying the tubing shape was fabricated, and four different tubing modifications were made: intermittently pressed at 10 mm width, flat, flat-wave, and flat-twist. The partition efficiencies of separation columns made from the modified tubing were examined in protein separation with an aqueous-aqueous polymer phase system at flow rates of 1-2 ml/min under 800 rpm. The results indicated that all of the modified columns improved the partition efficiency at a flow rate of 1 ml/min, but at a higher flow rate of 2 ml/min the columns made of flattened tubing showed lowered partition efficiency, apparently due to loss of the retained stationary phase. Among all the modified columns, the column with intermittently pressed tubing gave the best peak resolution. It may be concluded that the intermittently pressed and flat-twist modifications improve the partition efficiency in semi-preparative separations, while the flat and flat-wave configurations may be used for analytical separations at a low flow rate.

  11. Optimizing the ASC WAN: evaluating network performance tools for comparing transport protocols.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lydick, Christopher L.

    2007-07-01

    The Advanced Simulation & Computing Wide Area Network (ASC WAN), which is a high delay-bandwidth network connection between US Department of Energy National Laboratories, is constantly being examined and evaluated for efficiency. One of the transport-layer protocols currently in use, TCP, was developed for traffic demands different from those on the ASC WAN. The Stream Control Transmission Protocol (SCTP), on the other hand, has shown characteristics that make it more appealing for networks such as these. Most importantly, before considering a replacement for TCP on any network, a testing tool that performs well against certain criteria needs to be found. To find such a tool, two popular networking tools (Netperf v.2.4.3 & v.2.4.6 (OpenSS7 STREAMS), and Iperf v.2.0.6) were tested. These tools implement both TCP and SCTP and were evaluated using four metrics: (1) How effectively can the tool reach a throughput near the bandwidth? (2) How much of the CPU does the tool utilize during operation? (3) Is the tool freely and widely available? And (4) Is the tool actively developed? Following the analysis of these tools, this paper presents recommendations and ideas for future work.

  12. An efficient 3-D eddy-current solver using an independent impedance method for transcranial magnetic stimulation.

    PubMed

    De Geeter, Nele; Crevecoeur, Guillaume; Dupre, Luc

    2011-02-01

    In many important bioelectromagnetic problem settings, eddy-current simulations are required. Examples are the reduction of eddy-current artifacts in magnetic resonance imaging and techniques whereby the eddy currents interact with the biological system, like the alteration of neurophysiology due to transcranial magnetic stimulation (TMS). TMS has become an important tool for the diagnosis and treatment of neurological diseases and psychiatric disorders. A widely applied method for simulating the eddy currents is the impedance method (IM). However, this method has to contend with an ill-conditioned problem and, consequently, a long convergence time. When dealing with optimal design problems and sensitivity control, the convergence rate becomes even more crucial, since the eddy-current solver needs to be evaluated in an iterative loop. Therefore, we introduce an independent IM (IIM), which improves the conditioning and speeds up the numerical convergence. This paper shows how IIM is derived from IM and describes its advantages. Moreover, the method is applied to the efficient simulation of TMS. The proposed IIM achieves superior convergence properties with high time efficiency compared to the traditional IM and is therefore a useful tool for accurate and fast TMS simulations.

  13. An Investigation into III-V Compounds to Reach 20% Efficiency with Minimum Cell Thickness in Ultrathin-Film Solar Cells

    NASA Astrophysics Data System (ADS)

    Haque, K. A. S. M. Ehteshamul; Galib, Md. Mehedi Hassan

    2013-10-01

    III-V single-junction solar cells have already achieved very high efficiency levels. However, their use in terrestrial applications is limited by the high fabrication cost. High-efficiency, ultrathin-film solar cells can effectively solve this problem, as their material requirement is minimal. This work presents a comparison among several III-V compounds that have high optical absorption capability as well as an optimum bandgap (around 1.4 eV) for use as solar cell absorbers. The aim is to observe and compare the ability of these materials to reach a target efficiency level of 20% with the minimum possible cell thickness. The solar cell considered has an n-type ZnSe window layer, an n-type Al0.1Ga0.9As emitter layer, and a p-type Ga0.5In0.5P back surface field (BSF) layer. Ge is used as the substrate. In the initial design, a p-type InP base was sandwiched between the emitter and the BSF layer, and the design parameters for the device were optimized by analyzing the simulation outcomes with ADEPT/F, a one-dimensional (1D) simulation tool. Then, the minimum cell thickness that achieves 20% efficiency was determined by observing the efficiency variation with cell thickness. Afterwards, the base material was changed to several other selected III-V compounds, and for each case the minimum cell thickness was determined in a similar manner. Finally, these cell thickness values were compared and analyzed to identify more effective base layer materials for III-V single-junction solar cells.

  14. Motor origins of tool use.

    PubMed

    Kahrs, Björn A; Jung, Wendy P; Lockman, Jeffrey J

    2013-01-01

    The current study examines the developmental trajectory of banging movements and its implications for tool use development. Twenty (6- to 15-month-old) infants wore reflective markers while banging a handled cube; movements were recorded at 240 Hz. Results indicated that through the second half-year, banging movements undergo developmental changes making them ideally suited for instrumental hammering and pounding. Younger infants were inefficient and variable when banging the object: Their hands followed circuitous paths of great lengths at high velocities. By 1 year, infants showed consistent and efficient straight up-down hand trajectories of smaller magnitude and velocity, allowing for precise aiming and delivering dependable levels of force. The findings suggest that tool use develops gradually from infants' existing manual behaviors. © 2012 The Authors. Child Development © 2012 Society for Research in Child Development, Inc.

  15. Commercial Building Energy Asset Score

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software (Asset Scoring Tool) is designed to help building owners and managers gain insight into the as-built efficiency of their buildings. It is a web tool where users can enter their building information and obtain an asset score report. The asset score report consists of modeled building energy use (by end use and by fuel type), evaluations of building systems (envelope, lighting, heating, cooling, service hot water), and recommended energy efficiency measures. The intended users are building owners and operators who have limited knowledge of building energy efficiency. The scoring tool collects minimal building data (~20 data entries) from users and builds a full-scale energy model using the inference functionalities from the Facility Energy Decision System (FEDS). The scoring tool runs real-time building energy simulation using EnergyPlus and performs life-cycle cost analysis using FEDS. An API is also under development to allow third-party applications to exchange data with the web service of the scoring tool.

  16. Solid state nuclear magnetic resonance studies of prion peptides and proteins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heller, Jonathan

    1997-08-01

    High-resolution structural studies using x-ray diffraction and solution nuclear magnetic resonance (NMR) are not feasible for proteins of low solubility and high tendency to aggregate. Solid state NMR (SSNMR) is in principle capable of providing structural information in such systems; however, to do this efficiently and accurately, further SSNMR tools must be developed. This dissertation describes the development of three new methods and their application to a biological system of interest, the prion protein (PrP).

  17. Science & Technology Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-07-01

    This review is published ten times a year to communicate, to a broad audience, Lawrence Livermore National Laboratory's scientific and technological accomplishments, particularly in the Laboratory's core mission areas - global security, energy and the environment, and bioscience and biotechnology. This issue for the month of July 1996 discusses: Frontiers of Research in Advanced Computations; The Multibeam Fabry-Perot Velocimeter: Efficient Measurement of High Velocities; High-Tech Tools for the American Textile Industry; and Rock Mechanics: Can the Tuff Take the Stress?

  18. Micro-Spec: An Ultracompact, High-sensitivity Spectrometer for Far-Infrared and Submillimeter Astronomy

    NASA Technical Reports Server (NTRS)

    Cataldo, Giuseppe; Hsieh, Wen-Ting; Huang, Wei-Chung; Moseley, S. Harvey; Stevenson, Thomas R.; Wollack, Edward J.

    2014-01-01

    High-performance, integrated spectrometers operating in the far-infrared and submillimeter ranges promise to be powerful tools for the exploration of the epochs of reionization and initial galaxy formation. These devices, using high-efficiency superconducting transmission lines, can achieve the performance of a meter-scale grating spectrometer in an instrument implemented on a 4 inch silicon wafer. Such a device, when combined with a cryogenic telescope in space, provides an enabling capability for studies of the early universe. Here, the optical design process for Micro-Spec (μ-Spec) is presented, with particular attention given to its two-dimensional diffractive region, where the light of different wavelengths is focused on the different detectors. The method is based on the stigmatization and minimization of the light path function in this bounded region, which results in an optimized geometrical configuration. A point design with an efficiency of approximately 90% has been developed for initial demonstration and can serve as the basis for future instruments. Design variations on this implementation are also discussed, which can lead to lower efficiencies due to diffractive losses in the multimode region.

  19. Micro-Spec: An Ultra-Compact, High-Sensitivity Spectrometer for Far-Infrared and Sub-Millimeter Astronomy

    NASA Technical Reports Server (NTRS)

    Cataldo, Giuseppe; Hsieh, Wen-Ting; Huang, Wei-Chung; Moseley, S. Harvey; Stevenson, Thomas R.; Wollack, Edward J.

    2013-01-01

    High-performance, integrated spectrometers operating in the far-infrared and sub-millimeter ranges promise to be powerful tools for the exploration of the epochs of reionization and initial galaxy formation. These devices, using high-efficiency superconducting transmission lines, can achieve the performance of a meter-scale grating spectrometer in an instrument implemented on a four-inch silicon wafer. Such a device, when combined with a cryogenic telescope in space, provides an enabling capability for studies of the early universe. Here, the optical design process for Micro-Spec (μ-Spec) is presented, with particular attention given to its two-dimensional diffractive region, where the light of different wavelengths is focused on the different detectors. The method is based on the stigmatization and minimization of the light path function in this bounded region, which results in an optimized geometrical configuration. A point design with an efficiency of approx. 90% has been developed for initial demonstration, and can serve as the basis for future instruments. Design variations on this implementation are also discussed, which can lead to lower efficiencies due to diffractive losses in the multimode region.

  20. Refraction effects in soft x-ray multilayer blazed gratings.

    PubMed

    Voronov, D L; Salmassi, F; Meyer-Ilse, J; Gullikson, E M; Warwick, T; Padmore, H A

    2016-05-30

    A 2500 lines/mm Multilayer Blazed Grating (MBG) optimized for the soft x-ray wavelength range was fabricated and tested. The grating, coated with a W/B4C multilayer, demonstrated a record diffraction efficiency in the 2nd blazed diffraction order in the energy range from 500 to 1200 eV. Detailed investigation of the diffraction properties of the grating demonstrated that the diffraction efficiency of high groove density MBGs is not limited by the shadowing effects that limit grazing-incidence x-ray grating performance. Refraction effects inherent in asymmetrical Bragg diffraction were experimentally confirmed for MBGs. Refraction affects the blazing properties of the MBGs and results in a shift of the resonance wavelength of the gratings and broadening or narrowing of the grating bandwidth depending on diffraction geometry. The true blaze angle of the MBGs is defined by both the real structure of the multilayer stack and by asymmetrical refraction effects. Refraction effects can be used as a powerful tool for highly efficient suppression of high order harmonics.

  1. 5S rRNA Promoter for Guide RNA Expression Enabled Highly Efficient CRISPR/Cas9 Genome Editing in Aspergillus niger.

    PubMed

    Zheng, Xiaomei; Zheng, Ping; Zhang, Kun; Cairns, Timothy C; Meyer, Vera; Sun, Jibin; Ma, Yanhe

    2018-04-30

    The CRISPR/Cas9 system is a revolutionary genome editing tool. However, in eukaryotes, the search for and optimization of a suitable promoter for guide RNA expression is a significant technical challenge. Here we used the industrially important fungus, Aspergillus niger, to demonstrate that the 5S rRNA gene, which is both highly conserved and efficiently expressed in eukaryotes, can be used as a guide RNA promoter. The gene editing system was established with 100% rates of precise gene modification among dozens of transformants using short (40-bp) homologous donor DNA. This system was also applicable for the generation of designer chromosomes, as evidenced by deletion of a 48 kb gene cluster required for biosynthesis of the mycotoxin fumonisin B1. Moreover, this system also facilitated simultaneous mutagenesis of multiple genes in A. niger. We anticipate that the use of the 5S rRNA gene as a guide RNA promoter can broadly be applied for engineering highly efficient eukaryotic CRISPR/Cas9 toolkits. Additionally, the system reported here will enable development of designer chromosomes in model and industrially important fungi.

  2. Micro-Spec: an ultracompact, high-sensitivity spectrometer for far-infrared and submillimeter astronomy.

    PubMed

    Cataldo, Giuseppe; Hsieh, Wen-Ting; Huang, Wei-Chung; Moseley, S Harvey; Stevenson, Thomas R; Wollack, Edward J

    2014-02-20

    High-performance, integrated spectrometers operating in the far-infrared and submillimeter ranges promise to be powerful tools for the exploration of the epochs of reionization and initial galaxy formation. These devices, using high-efficiency superconducting transmission lines, can achieve the performance of a meter-scale grating spectrometer in an instrument implemented on a 4 inch silicon wafer. Such a device, when combined with a cryogenic telescope in space, provides an enabling capability for studies of the early universe. Here, the optical design process for Micro-Spec (μ-Spec) is presented, with particular attention given to its two-dimensional diffractive region, where the light of different wavelengths is focused on the different detectors. The method is based on the stigmatization and minimization of the light path function in this bounded region, which results in an optimized geometrical configuration. A point design with an efficiency of ~90% has been developed for initial demonstration and can serve as the basis for future instruments. Design variations on this implementation are also discussed, which can lead to lower efficiencies due to diffractive losses in the multimode region.

  3. Compressor and Turbine Multidisciplinary Design for Highly Efficient Micro-gas Turbine

    NASA Astrophysics Data System (ADS)

    Barsi, Dario; Perrone, Andrea; Qu, Yonglei; Ratto, Luca; Ricci, Gianluca; Sergeev, Vitaliy; Zunino, Pietro

    2018-06-01

    Multidisciplinary design optimization (MDO) is widely employed to enhance the efficiency of turbomachinery components. The aim of this work is to describe a complete tool for the aero-mechanical design of a radial inflow turbine and a centrifugal compressor. The high rotational speed of such machines and the high exhaust gas temperature (for the turbine only) expose blades to very high stresses, and therefore the aerodynamic design has to be coupled with the mechanical design through an integrated procedure. The described approach employs a fully 3D Reynolds-Averaged Navier-Stokes (RANS) solver for the aerodynamics and an open source Finite Element Analysis (FEA) solver for the mechanical integrity assessment. Due to the high computational cost of these two solvers, a meta model, such as an artificial neural network (ANN), is used to speed up the optimization process. The interaction between the two codes, the mesh generation, and the post-processing of results are achieved via in-house developed scripting modules. The obtained results are presented and discussed in detail.
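    The surrogate-assisted loop described above (expensive solver evaluations accelerated by a cheap meta model) can be sketched in miniature. In this hedged illustration, an exact quadratic fit stands in for the ANN surrogate and a one-variable analytic function stands in for the coupled RANS/FEA objective; all names, the convergence guard, and the replace-the-worst strategy are assumptions for illustration, not the authors' implementation:

```python
# Minimal surrogate-assisted optimization sketch (one design variable).
def expensive_objective(x):
    """Hypothetical stand-in for a coupled RANS + FEA evaluation."""
    return (x - 0.3) ** 2 + 1.0  # pretend each call costs hours of CPU

def quadratic_vertex(pts):
    """Fit the exact quadratic through three (x, y) samples and return
    the x of its vertex (the surrogate's predicted optimum)."""
    (x0, y0), (x1, y1), (x2, y2) = pts
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2**2 * (y0 - y1) + x1**2 * (y2 - y0) + x0**2 * (y1 - y2)) / denom
    return -b / (2 * a)

def optimize(samples, iterations=5):
    pts = [(x, expensive_objective(x)) for x in samples]
    for _ in range(iterations):
        x_new = quadratic_vertex(pts)       # optimize the cheap surrogate
        if any(abs(x_new - x) < 1e-9 for x, _ in pts):
            break                           # surrogate optimum already sampled
        y_new = expensive_objective(x_new)  # verify with the "true" solver
        pts.sort(key=lambda p: p[1])
        pts[-1] = (x_new, y_new)            # replace the worst sample
    return min(pts, key=lambda p: p[1])[0]

print(optimize([0.0, 1.0, 2.0]))  # converges to the true optimum near 0.3
```

    In the paper's workflow a trained ANN plays the role of the quadratic fit, which is what makes many optimization iterations affordable.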

  4. Optimization of Process Parameters for High Efficiency Laser Forming of Advanced High Strength Steels within Metallurgical Constraints

    NASA Astrophysics Data System (ADS)

    Sheikholeslami, Ghazal; Griffiths, Jonathan; Dearden, Geoff; Edwardson, Stuart P.

    Laser forming (LF) has been shown to be a viable alternative for forming automotive-grade advanced high strength steels (AHSS). Due to their high strength and heat sensitivity, these materials have low conventional formability, exhibiting early fracture, larger springback, batch-to-batch inconsistency and high tool wear. In this paper, optimisation of the LF process parameters has been conducted to further understand the impact of a surface heat treatment on DP1000. An FE numerical simulation has been developed to analyse the dynamic thermo-mechanical effects; this has been verified against empirical data. The goal of the optimisation has been to develop a usable process window for the LF of AHSS within strict metallurgical constraints. Results indicate that it is possible to laser form this material; however, a complex relationship has been found between the generation and maintenance of hardness values in the heated zone. A laser surface hardening effect has been observed that could be beneficial to the efficiency of the process.

  5. MPGD for breast cancer prevention: a high resolution and low dose radiation medical imaging

    NASA Astrophysics Data System (ADS)

    Gutierrez, R. M.; Cerquera, E. A.; Mañana, G.

    2012-07-01

    Early detection of small calcifications in mammograms is considered the best preventive tool against breast cancer. However, existing digital mammography with relatively low radiation skin exposure has limited accessibility and insufficient spatial resolution for small calcification detection. Micro Pattern Gaseous Detectors (MPGD) and associated technologies increasingly provide new information useful for generating images of microscopic structures, and make cutting-edge technology more accessible for medical imaging and many other applications. In this work we develop an application for the new information provided by an MPGD camera in the form of highly controlled images with high dynamic resolution. We present a new Super Detail Image (SDI) method that efficiently exploits this new information to obtain very high spatial resolution images. The method presented in this work shows that the MPGD camera with SDI can produce mammograms with the spatial resolution necessary to detect microcalcifications. It would substantially increase the efficiency and accessibility of screening mammography, thereby improving breast cancer prevention.

  6. An Efficient Method for the Isolation of Highly Purified RNA from Seeds for Use in Quantitative Transcriptome Analysis.

    PubMed

    Kanai, Masatake; Mano, Shoji; Nishimura, Mikio

    2017-01-11

    Plant seeds accumulate large amounts of storage reserves comprising biodegradable organic matter. Humans rely on seed storage reserves for food and as industrial materials. Gene expression profiles are powerful tools for investigating metabolic regulation in plant cells. Therefore, detailed, accurate gene expression profiles during seed development are required for crop breeding. Acquiring highly purified RNA is essential for producing these profiles. Efficient methods are needed to isolate highly purified RNA from seeds. Here, we describe a method for isolating RNA from seeds containing large amounts of oils, proteins, and polyphenols, which have inhibitory effects on high-purity RNA isolation. Our method enables highly purified RNA to be obtained from seeds without the use of phenol, chloroform, or additional processes for RNA purification. This method is applicable to Arabidopsis, rapeseed, and soybean seeds. Our method will be useful for monitoring the expression patterns of low level transcripts in developing and mature seeds.

  7. An end user evaluation of query formulation and results review tools in three medical meta-search engines.

    PubMed

    Leroy, Gondy; Xu, Jennifer; Chung, Wingyan; Eggers, Shauna; Chen, Hsinchun

    2007-01-01

    Retrieving sufficient relevant information online is difficult for many people because they use too few keywords to search and search engines do not provide many support tools. To further complicate the search, users often ignore support tools when available. Our goal is to evaluate in a realistic setting when users use support tools and how they perceive these tools. We compared three medical search engines with support tools that require more or less effort from users to form a query and evaluate results. We carried out an end user study with 23 users who were asked to find information, i.e., subtopics and supporting abstracts, for a given theme. We used a balanced within-subjects design and report on the effectiveness, efficiency and usability of the support tools from the end user perspective. We found significant differences in efficiency but did not find significant differences in effectiveness between the three search engines. Dynamic user support tools requiring less effort led to higher efficiency. Fewer searches were needed and more documents were found per search when both query reformulation and result review tools dynamically adjust to the user query. The query reformulation tool that provided a long list of keywords, dynamically adjusted to the user query, was used most often and led to more subtopics. As hypothesized, the dynamic result review tools were used more often and led to more subtopics than static ones. These results were corroborated by the usability questionnaires, which showed that support tools that dynamically optimize output were preferred.

  8. Micro Slot Generation by μ-ED Milling

    NASA Astrophysics Data System (ADS)

    Dave, H. K.; Mayanak, M. K.; Rajpurohit, S. R.; Mathai, V. J.

    2016-08-01

    Micro electro discharge machining is one of the most widely used advanced micro machining techniques owing to its capability to fabricate micro features on any electrically conductive material irrespective of its material properties. Despite its wide acceptance, the process is adversely affected by issues such as wear of the tool electrode, which results in the generation of inaccurate features. Micro ED milling, a process variant in which the tool electrode is simultaneously rotated and scanned during machining, is reported to have high process efficiency for the generation of complicated 3D shapes and features with relatively low electrode wear. In the present study an attempt has been made to study the effect of two process parameters, viz. capacitance and scanning speed of the tool electrode, on the end wear of the tool electrode and the overcut of micro slots generated by micro ED milling. The experiment has been conducted on Al 1100 alloy with a tungsten electrode of 300 μm diameter. Results suggest that wear of the tool electrode and overcut of the generated micro features are highly influenced by the level of capacitance employed during machining. For the parameter ranges employed in the present study, however, no significant effect of scanning speed was observed on either response.

  9. Development of a CRISPR/Cas9 genome editing toolbox for Corynebacterium glutamicum.

    PubMed

    Liu, Jiao; Wang, Yu; Lu, Yujiao; Zheng, Ping; Sun, Jibin; Ma, Yanhe

    2017-11-16

    Corynebacterium glutamicum is an important industrial workhorse, and advanced genetic engineering tools are urgently needed. Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) and their CRISPR-associated proteins (Cas) have revolutionized the field of genome engineering. The CRISPR/Cas9 system, which utilizes NGG as the protospacer adjacent motif (PAM) and has good targeting specificity, can be developed into a powerful tool for efficient and precise genome editing of C. glutamicum. Herein, we developed a versatile CRISPR/Cas9 genome editing toolbox for C. glutamicum. Cas9 and gRNA expression cassettes were reconstituted to combat Cas9 toxicity and facilitate effective termination of gRNA transcription. Co-transformation of Cas9 and gRNA expression plasmids was exploited to overcome high-frequency mutation of cas9, allowing not only highly efficient gene deletion and insertion with plasmid-borne editing templates (efficiencies up to 60.0 and 62.5%, respectively) but also simple and time-saving operation. Furthermore, CRISPR/Cas9-mediated ssDNA recombineering was developed to precisely introduce small modifications and single-nucleotide changes into the genome of C. glutamicum with efficiencies over 80.0%. Notably, double-locus editing was also achieved in C. glutamicum. This toolbox works well in several C. glutamicum strains including the widely-used strains ATCC 13032 and ATCC 13869. In this study, we developed a CRISPR/Cas9 toolbox that could facilitate markerless gene deletion, gene insertion, precise base editing, and double-locus editing in C. glutamicum. The CRISPR/Cas9 toolbox holds promise for accelerating the engineering of C. glutamicum and advancing its application in the production of biochemicals and biofuels.

  10. Developing and testing a street audit tool using Google Street View to measure environmental supportiveness for physical activity.

    PubMed

    Griew, Pippa; Hillsdon, Melvyn; Foster, Charlie; Coombes, Emma; Jones, Andy; Wilkinson, Paul

    2013-08-23

    Walking for physical activity is associated with substantial health benefits for adults. Increasingly, research has focused on associations between walking behaviours and neighbourhood environments, including street characteristics such as pavement availability and aesthetics. Nevertheless, objective assessment of street-level data is challenging. This research investigates the reliability of a new street characteristic audit tool designed for use with Google Street View, and assesses levels of agreement between computer-based and on-site auditing. The Forty Area STudy street VIEW (FASTVIEW) tool, a Google Street View based audit tool, was developed incorporating nine categories of street characteristics. Using the tool, desk-based audits were conducted by trained researchers across one large UK town during 2011. Both inter- and intra-rater reliability were assessed. On-site street audits were also completed to test the criterion validity of the method. All reliability scores were assessed by percentage agreement and the kappa statistic. Within-rater agreement was high for each category of street characteristic (range: 66.7%-90.0%) and good to high between raters (range: 51.3%-89.1%). A high level of agreement was found between the Google Street View audits and those conducted in person across the nine categories examined (range: 75.0%-96.7%). The audit tool was found to provide a reliable and valid measure of street characteristics. The use of Google Street View to capture street characteristic data is recommended as an efficient method that could substantially increase the potential for large-scale objective data collection.
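    The agreement statistics used above, percentage agreement and the kappa statistic, are straightforward to compute. A minimal sketch for two raters' categorical street-audit codes (the rating labels below are hypothetical):

```python
from collections import Counter

def percent_agreement(rater1, rater2):
    """Fraction of items on which both raters assigned the same code."""
    return sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: observed agreement corrected for the chance agreement
    expected from each rater's marginal code frequencies."""
    n = len(rater1)
    po = percent_agreement(rater1, rater2)
    c1, c2 = Counter(rater1), Counter(rater2)
    pe = sum(c1[cat] * c2[cat] for cat in set(rater1) | set(rater2)) / (n * n)
    return (po - pe) / (1 - pe)

desk = ["present", "present", "absent", "present"]   # Street View audit
onsite = ["present", "absent", "absent", "present"]  # in-person audit
print(percent_agreement(desk, onsite))  # 0.75
print(cohens_kappa(desk, onsite))       # 0.5
```

    Kappa is the more conservative of the two because raw percentage agreement can be inflated by chance when one code dominates.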

  11. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

    PubMed

    Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent this relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel, computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms, and ECGs at the body surface with high fidelity and offers vast computational savings of greater than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.
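    The eikonal half of an R-E model replaces an expensive diffusion solve with a cheap activation-time computation. As a hedged one-dimensional illustration only (the actual model is three-dimensional, couples to reaction kinetics, and includes the His-Purkinje system), activation times along a fiber can be computed with Dijkstra-style front propagation:

```python
import heapq

def activation_times(velocities, dx, stimulus_idx):
    """Solve a 1-D eikonal problem |dt/dx| = 1/v on a uniform grid: earliest
    activation time at each node, given one conduction velocity per interval
    between adjacent nodes and a single stimulus node at time zero."""
    n = len(velocities) + 1
    t = [float("inf")] * n
    t[stimulus_idx] = 0.0
    heap = [(0.0, stimulus_idx)]
    while heap:
        ti, i = heapq.heappop(heap)
        if ti > t[i]:
            continue  # stale heap entry
        # (neighbor node, index of the interval crossed to reach it)
        for j, seg in ((i - 1, i - 1), (i + 1, i)):
            if 0 <= j < n:
                cand = ti + dx / velocities[seg]
                if cand < t[j]:
                    t[j] = cand
                    heapq.heappush(heap, (cand, j))
    return t

# Uniform velocity 0.5 mm/ms on a 4 mm fiber (dx = 1 mm), stimulus at node 0:
print(activation_times([0.5] * 4, 1.0, 0))  # [0.0, 2.0, 4.0, 6.0, 8.0]
```

    In the full R-E model, activation times computed this way trigger local reaction kinetics, which is what makes electrograms recoverable without a diffusion solve.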

  12. Systemic Synthesis Questions [SSynQs] as Tools to Help Students to Build Their Cognitive Structures in a Systemic Manner

    NASA Astrophysics Data System (ADS)

    Hrin, Tamara N.; Fahmy, Ameen F. M.; Segedinac, Mirjana D.; Milenković, Dušica D.

    2016-08-01

    Many studies dedicated to the teaching and learning of organic chemistry courses have emphasized that high school students have shown significant difficulties in mastering the concepts of this discipline. Therefore, the aim of our study was to help students to overcome these difficulties by applying systemic synthesis questions, [SSynQs], as the instructional method in our intervention. This work shows that students from the group exposed to the new teaching method achieved higher scores on final testing than students from the control group, who were taught by the traditional method, when students' achievements in conventional, linear questions [LQs] and in [SSynQs] were studied. These results were accompanied by lower levels of mental effort invested by students from the intervention group, and higher levels of mental effort in the control group, during solving of both types of questions. This relation between achievement and mental effort resulted in high instructional efficiency for the applied method in the intervention group, [SSynQs], and low instructional efficiency for the traditional teaching and learning method applied in the control group. A systemic triangular relation between achievement, mental effort, and instructional efficiency, established for each group and gender, indicated that the application of [SSynQs] was more suited to female students than to male students, because of the characteristics of [SSynQs] as teaching and learning tools and because of learning style and ability differences between genders.

  13. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model

    NASA Astrophysics Data System (ADS)

    Neic, Aurel; Campos, Fernando O.; Prassl, Anton J.; Niederer, Steven A.; Bishop, Martin J.; Vigmond, Edward J.; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent their relationship mechanistically, based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold-standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms as well as ECGs at the body surface with high fidelity and offers computational savings of more than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.

  14. P-Hint-Hunt: a deep parallelized whole genome DNA methylation detection tool.

    PubMed

    Peng, Shaoliang; Yang, Shunyun; Gao, Ming; Liao, Xiangke; Liu, Jie; Yang, Canqun; Wu, Chengkun; Yu, Wenqiang

    2017-03-14

    Whole-genome DNA methylation detection has become one of the most important parts of epigenetics research, and a growing number of studies use it to uncover significant relationships between DNA methylation and several typical diseases, such as cancers and diabetes. In many of those studies, mapping bisulfite-treated sequences to the whole genome has been the main method for studying DNA cytosine methylation. However, most existing tools suffer from inaccuracy and long running times. In our study, we designed a new DNA methylation prediction tool ("Hint-Hunt") to address this problem. By combining an optimized alignment computation with Smith-Waterman matrix dynamic programming, Hint-Hunt can analyze and predict DNA methylation status. However, when Hint-Hunt was applied to large-scale datasets, slow speed and low temporal-spatial efficiency remained problems. To address the cost of Smith-Waterman dynamic programming and the low temporal-spatial efficiency, we further designed a deeply parallelized whole-genome DNA methylation detection tool ("P-Hint-Hunt") for the Tianhe-2 (TH-2) supercomputer. To the best of our knowledge, P-Hint-Hunt is the first parallel DNA methylation detection tool with a high speed-up for processing large-scale datasets, and it can run on both CPUs and Intel Xeon Phi coprocessors. Moreover, we deployed and evaluated Hint-Hunt and P-Hint-Hunt on the TH-2 supercomputer at different scales. The experimental results show that our tools eliminate the deviation caused by bisulfite treatment in the mapping procedure and that the multi-level parallel program yields a 48-fold speed-up with 64 threads. P-Hint-Hunt achieves a deep acceleration on the heterogeneous CPU/Intel Xeon Phi platform, fully exploiting the advantages of multi-core CPUs and many-core Phi coprocessors.
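    Hint-Hunt's alignment core is Smith-Waterman dynamic programming, in which negative partial scores are clamped to zero so that a local alignment may start anywhere in either sequence. A minimal textbook sketch (the scoring parameters are illustrative, not the tool's):

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Local alignment score via Smith-Waterman dynamic programming.

    H[i][j] holds the best score of any local alignment ending at
    a[i-1] and b[j-1]; scores below zero are clamped to zero so an
    alignment can restart at any position.
    """
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACACACTA", "AGCACACA"))
```

The quadratic time and space of this matrix fill is exactly what motivates the paper's multi-level parallelization across threads and Xeon Phi cores.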

  15. Development of Low Global Warming Potential Refrigerant Solutions for Commercial Refrigeration Systems using a Life Cycle Climate Performance Design Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdelaziz, Omar; Fricke, Brian A; Vineyard, Edward Allan

    Commercial refrigeration systems are known to be prone to high leak rates and to consume large amounts of electricity. As such, direct emissions related to refrigerant leakage and indirect emissions resulting from primary energy consumption contribute greatly to their Life Cycle Climate Performance (LCCP). In this paper, an LCCP design tool is used to evaluate the performance of a typical commercial refrigeration system with alternative refrigerants and minor system modifications to provide lower Global Warming Potential (GWP) refrigerant solutions with improved LCCP compared to baseline systems. The LCCP design tool accounts for system performance, ambient temperature, and system load; system performance is evaluated using a validated vapor compression system simulation tool while ambient temperature and system load are derived from a widely used building energy modeling tool (EnergyPlus). The LCCP design tool also accounts for the change in hourly electricity emission rate to yield an accurate prediction of indirect emissions. The analysis shows that conventional commercial refrigeration system life cycle emissions are largely due to direct emissions associated with refrigerant leaks and that system efficiency plays a smaller role in the LCCP. However, as a transition occurs to low GWP refrigerants, the indirect emissions become more relevant. Low GWP refrigerants may not be suitable for drop-in replacements in conventional commercial refrigeration systems; however some mixtures may be introduced as transitional drop-in replacements. These transitional refrigerants have a significantly lower GWP than baseline refrigerants and as such, improved LCCP. The paper concludes with a brief discussion on the tradeoffs between refrigerant GWP, efficiency and capacity.
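    The direct/indirect split described above is the heart of any LCCP estimate: direct emissions are refrigerant losses weighted by GWP, indirect emissions are lifetime electricity use weighted by the grid emission factor. A simplified sketch of that structure, with entirely hypothetical numbers (the actual design tool also models hourly emission rates, system performance, and building loads):

```python
def lccp_kg_co2e(charge_kg, annual_leak_frac, gwp, lifetime_yr,
                 annual_energy_kwh, grid_kg_co2_per_kwh,
                 eol_loss_frac=0.15):
    """Simplified Life Cycle Climate Performance in kg CO2-equivalent.

    direct   = refrigerant released over the lifetime (annual leaks
               plus an end-of-life loss) weighted by its GWP
    indirect = electricity consumed over the lifetime weighted by the
               grid emission factor
    """
    direct = (annual_leak_frac * lifetime_yr + eol_loss_frac) * charge_kg * gwp
    indirect = annual_energy_kwh * lifetime_yr * grid_kg_co2_per_kwh
    return direct + indirect

# Hypothetical supermarket rack: R-404A (GWP ~3922) vs. a low-GWP blend
# that is assumed to use 5% more energy.
base = lccp_kg_co2e(200, 0.15, 3922, 15, 400_000, 0.4)
alt  = lccp_kg_co2e(200, 0.15, 150, 15, 420_000, 0.4)
print(base, alt)
```

Even with a modest energy penalty, the low-GWP option wins here because direct emissions dominate the baseline, which mirrors the paper's observation that indirect emissions only become decisive after the transition to low-GWP refrigerants.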

  16. SAGE: The Self-Adaptive Grid Code. 3

    NASA Technical Reports Server (NTRS)

    Davies, Carol B.; Venkatapathy, Ethiraj

    1999-01-01

    The multi-dimensional self-adaptive grid code, SAGE, is an important tool in the field of computational fluid dynamics (CFD). It provides an efficient method to improve the accuracy of flow solutions while simultaneously reducing computer processing time. Briefly, SAGE enhances an initial computational grid by redistributing the mesh points into more appropriate locations. The movement of these points is driven by an equal-error-distribution algorithm that utilizes the relationship between high flow gradients and excessive solution errors. The method also provides a balance between clustering points in the high gradient regions and maintaining the smoothness and continuity of the adapted grid. The latest version, Version 3, includes the ability to change the boundaries of a given grid to more efficiently enclose flow structures and provides alternative redistribution algorithms.
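    The equal-error-distribution idea driving SAGE's point movement can be illustrated in one dimension: relocate mesh points so that each cell carries an equal share of a weight (error) integral, which clusters points where the weight, e.g. the flow gradient, is large. A sketch of the principle only, not SAGE's actual algorithm:

```python
def equidistribute(x, w, n_new=None):
    """Redistribute 1-D mesh points so each new cell carries an equal
    share of the weight integral (equal-error distribution).

    x : sorted node coordinates; w : positive weight at each node,
    e.g. 1 + alpha * |solution gradient|.
    """
    n_new = n_new or len(x)
    # Cumulative weight integral via the trapezoidal rule.
    cum = [0.0]
    for i in range(1, len(x)):
        cum.append(cum[-1] + 0.5 * (w[i] + w[i-1]) * (x[i] - x[i-1]))
    total = cum[-1]
    # Invert the cumulative distribution at equally spaced levels.
    new_x, j = [x[0]], 1
    for k in range(1, n_new - 1):
        target = total * k / (n_new - 1)
        while cum[j] < target:
            j += 1
        frac = (target - cum[j-1]) / (cum[j] - cum[j-1])
        new_x.append(x[j-1] + frac * (x[j] - x[j-1]))
    new_x.append(x[-1])
    return new_x

# Uniform mesh with a weight peaked near x = 0.5: the redistributed
# points cluster around the peak.
xs = [i / 20 for i in range(21)]
ws = [1 + 20 * max(0.0, 0.1 - abs(xx - 0.5)) for xx in xs]
print(equidistribute(xs, ws))
```

With a uniform weight the routine returns a uniform mesh, which is the sanity check that the redistribution preserves smoothness when no clustering is needed.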

  17. Changing computing paradigms towards power efficiency

    PubMed Central

    Klavík, Pavel; Malossi, A. Cristiano I.; Bekas, Costas; Curioni, Alessandro

    2014-01-01

    Power awareness is fast becoming immensely important in computing, ranging from the traditional high-performance computing applications to the new generation of data centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations that finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grain level. In addition, we verify here previous work on post-FLOPS/W metrics and show that these can shed much more light in the power/energy profile of important applications. PMID:24842033
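    A classic instance of combining low- and high-precision arithmetic for linear systems is iterative refinement: do the expensive solve in float32 (fast and power-efficient) and accumulate residual corrections in float64 to recover full accuracy. The sketch below shows the general technique, not the authors' implementation; a production version would also factorize the float32 matrix once and reuse the factors:

```python
import numpy as np

def mixed_precision_solve(A, b, iters=5):
    """Solve Ax = b by mixed-precision iterative refinement.

    The inner solves run in float32; residuals are computed and
    accumulated in float64, so the final answer reaches roughly
    float64 accuracy for well-conditioned A.
    """
    A32 = A.astype(np.float32)
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                              # float64 residual
        dx = np.linalg.solve(A32, r.astype(np.float32))
        x += dx.astype(np.float64)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50)) + 50 * np.eye(50)   # well conditioned
x_true = rng.standard_normal(50)
b = A @ x_true
x = mixed_precision_solve(A, b)
print(np.max(np.abs(x - x_true)))
```

The refinement loop is cheap relative to the factorization, which is why shifting the factorization to low precision can cut energy substantially while leaving accuracy essentially unchanged.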

  18. Enabling the High Level Synthesis of Data Analytics Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minutoli, Marco; Castellana, Vito G.; Tumeo, Antonino

    Conventional High Level Synthesis (HLS) tools mainly target compute-intensive kernels typical of digital signal processing applications. We are developing techniques and architectural templates to enable HLS of data analytics applications. These applications are memory intensive, present fine-grained, unpredictable data accesses, and exhibit irregular, dynamic task parallelism. We discuss an architectural template based around a distributed controller to efficiently exploit thread-level parallelism. We present a memory interface that supports parallel memory subsystems and enables implementing atomic memory operations. We introduce a dynamic task scheduling approach to efficiently execute heavily unbalanced workloads. The templates are validated by synthesizing queries from the Lehigh University Benchmark (LUBM), a well-known SPARQL benchmark.

  19. A high efficiency gene disruption strategy using a positive-negative split selection marker and electroporation for Fusarium oxysporum.

    PubMed

    Liang, Liqin; Li, Jianqiang; Cheng, Lin; Ling, Jian; Luo, Zhongqin; Bai, Miao; Xie, Bingyan

    2014-11-01

    The Fusarium oxysporum species complex consists of fungal pathogens that cause serious vascular wilt disease in more than 100 cultivated species throughout the world. Gene function analysis is rapidly becoming more and more important as the whole-genome sequences of various F. oxysporum strains are being completed. Gene-disruption techniques are a common molecular tool for studying gene function, yet are often a limiting step in gene function identification. In this study we have developed an F. oxysporum high-efficiency gene-disruption strategy based on split-marker homologous recombination cassettes with dual selection and electroporation transformation. The method was used efficiently to delete three RNA-dependent RNA polymerase (RdRP) genes. The gene-disruption cassettes of three genes can be constructed simultaneously within a short time using this technique. The optimal condition for electroporation is 10 μF capacitance, 300 Ω resistance, and 4 kV/cm field strength, with 1 μg of DNA (gene-disruption cassettes). Under these optimal conditions, we were able to obtain 95 transformants per μg DNA. After positive-negative selection, the transformants were efficiently screened by PCR; screening efficiency averaged 85%: 90% (RdRP1), 85% (RdRP2) and 77% (RdRP3). This gene-disruption strategy should pave the way for high-throughput genetic analysis in F. oxysporum. Copyright © 2014 Elsevier GmbH. All rights reserved.

  20. SciDB versus Spark: A Preliminary Comparison Based on an Earth Science Use Case

    NASA Astrophysics Data System (ADS)

    Clune, T.; Kuo, K. S.; Doan, K.; Oloso, A.

    2015-12-01

    We compare two Big Data technologies, SciDB and Spark, for performance, usability, and extensibility, when applied to a representative Earth science use case. SciDB is a new-generation parallel distributed database management system (DBMS) based on the array data model that is capable of handling multidimensional arrays efficiently but requires lengthy data ingest prior to analysis, whereas Spark is a fast and general engine for large scale data processing that can immediately process raw data files and thereby avoid the ingest process. Once data have been ingested, SciDB is very efficient in database operations such as subsetting. Spark, on the other hand, provides greater flexibility by supporting a wide variety of high-level tools including DBMS's. For the performance aspect of this preliminary comparison, we configure Spark to operate directly on text or binary data files and thereby limit the need for additional tools. Arguably, a more appropriate comparison would involve exploring other configurations of Spark which exploit supported high-level tools, but that is beyond our current resources. To make the comparison as "fair" as possible, we export the arrays produced by SciDB into text files (or convert them to binary files) for intake by Spark and thereby avoid any additional file processing penalties. The Earth science use case selected for this comparison is the identification and tracking of snowstorms in the NASA Modern Era Retrospective-analysis for Research and Applications (MERRA) reanalysis data. The identification portion of the use case is to flag all grid cells of the MERRA high-resolution hourly data that satisfy our criteria for a snowstorm, whereas the tracking portion connects flagged cells adjacent in time and space to form a snowstorm episode. We will report the results of our comparisons at this presentation.
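    The tracking portion described above, connecting flagged cells adjacent in time and space into episodes, is a connected-components labeling over a (time, y, x) grid. A small illustrative sketch, independent of both SciDB and Spark:

```python
from collections import deque

def label_episodes(flags):
    """Group flagged grid cells into episodes: cells adjacent in time
    or space (6-connectivity over t, y, x) receive the same label.

    flags: flags[t][y][x] is True where the snowstorm criteria hold.
    Returns a dict mapping episode id -> list of (t, y, x) cells.
    """
    nt, ny, nx = len(flags), len(flags[0]), len(flags[0][0])
    seen, episodes = set(), {}
    for t in range(nt):
        for y in range(ny):
            for x in range(nx):
                if not flags[t][y][x] or (t, y, x) in seen:
                    continue
                eid = len(episodes)      # next unused episode id
                episodes[eid] = []
                queue = deque([(t, y, x)])
                seen.add((t, y, x))
                while queue:             # breadth-first flood fill
                    ct, cy, cx = queue.popleft()
                    episodes[eid].append((ct, cy, cx))
                    for dt, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                       (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                        n = (ct + dt, cy + dy, cx + dx)
                        if (0 <= n[0] < nt and 0 <= n[1] < ny
                                and 0 <= n[2] < nx
                                and flags[n[0]][n[1]][n[2]]
                                and n not in seen):
                            seen.add(n)
                            queue.append(n)
    return episodes

# Two storms: one cell persisting over hours 0-1, plus one isolated cell.
flags = [[[True, False], [False, False]],
         [[True, False], [False, True]]]
print(len(label_episodes(flags)))  # 2
```

On MERRA-scale data this flood fill would of course be expressed as array or dataframe operations in SciDB or Spark; the point here is only the adjacency logic.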

  1. Fermilab computing at the Intensity Frontier

    DOE PAGES

    Group, Craig; Fuess, S.; Gutsche, O.; ...

    2015-12-23

    The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. The experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

  2. S3D: An interactive surface grid generation tool

    NASA Technical Reports Server (NTRS)

    Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David

    1992-01-01

    S3D, an interactive software tool for surface grid generation, is described. S3D provides the means with which a geometry definition based either on a discretized curve set or a rectangular set can be quickly processed towards the generation of a surface grid for computational fluid dynamics (CFD) applications. This is made possible as a result of implementing commonly encountered surface gridding tasks in an environment with a highly efficient and user friendly graphical interface. Some of the more advanced features of S3D include surface-surface intersections, optimized surface domain decomposition and recomposition, and automated propagation of edge distributions to surrounding grids.

  3. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  4. Probabilistic power flow using improved Monte Carlo simulation method with correlated wind sources

    NASA Astrophysics Data System (ADS)

    Bie, Pei; Zhang, Buhan; Li, Hang; Deng, Weisi; Wu, Jiasi

    2017-01-01

    Probabilistic Power Flow (PPF) is a very useful tool for power system steady-state analysis. However, the correlation among different random power injections (like wind power) makes PPF difficult to calculate. Monte Carlo simulation (MCS) and analytical methods are the two commonly used approaches to solve PPF. MCS has high accuracy but is very time consuming. Analytical methods like the cumulants method (CM) have high computing efficiency, but calculating the cumulants is not convenient when wind power output does not obey any typical distribution, especially when correlated wind sources are considered. In this paper, an Improved Monte Carlo simulation method (IMCS) is proposed. The joint empirical distribution is applied to model different wind power outputs. This method combines the advantages of both MCS and analytical methods. It not only has high computing efficiency, but also provides solutions with sufficient accuracy, which makes it very suitable for on-line analysis.
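    Sampling from a joint empirical distribution can be as simple as resampling whole historical records, which preserves the dependence between wind farms without fitting any parametric model. A toy sketch with hypothetical data (the paper's IMCS additionally runs a power flow per sample, which is omitted here):

```python
import random

def sample_joint_empirical(history, n, seed=0):
    """Draw correlated samples from a joint empirical distribution by
    resampling whole historical records (rows), preserving the
    dependence structure between wind farms."""
    rng = random.Random(seed)
    return [rng.choice(history) for _ in range(n)]

# Hypothetical hourly output (MW) of two nearby, correlated wind farms.
history = [(10, 12), (35, 30), (60, 55), (80, 75), (5, 8), (50, 52)]
samples = sample_joint_empirical(history, 1000)

# Rough Monte Carlo check: sampled means approach the historical means.
m1 = sum(s[0] for s in samples) / len(samples)
m2 = sum(s[1] for s in samples) / len(samples)
print(round(m1), round(m2))
```

Because each draw is an observed record, impossible combinations (e.g. one farm at full output while its close neighbour is becalmed) never appear, which is exactly the correlation-preserving property the abstract relies on.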

  5. Expanding the Toolbox of Photoswitches for DNA Nanotechnology Using Arylazopyrazoles.

    PubMed

    Adam, Volker; Prusty, Deepak K; Centola, Mathias; Škugor, Marko; Hannam, Jeffrey S; Valero, Julián; Klöckner, Bernhard; Famulok, Michael

    2018-01-24

    Photoregulation is among the most promising tools for development of dynamic DNA nanosystems, due to its high spatiotemporal precision, biocompatibility, and ease of use. So far, azobenzene and its derivatives have shown high potential in photocontrolling DNA duplex hybridization by light-dependent photoisomerization. Despite many recent advances, obtaining sufficiently high photoswitching efficiency under conditions more suitable for work with DNA nanostructures remains challenging. Here we introduce a pair of arylazopyrazoles as new photoswitches for efficient and reversible control of DNA hybridization, achieved even at room temperature with a low number of required modifications. Their photophysical properties in the native state and in DNA strands result in near-quantitative isomerization rates upon irradiation with UV and orange light. To demonstrate the applicability of these photoswitches, we have successfully applied one of them to open and close a DNA hairpin by light at room temperature. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Fundamentals of bipolar high-frequency surgery.

    PubMed

    Reidenbach, H D

    1993-04-01

    In endoscopic surgery a very precise surgical dissection technique and an efficient hemostasis are of decisive importance. The bipolar technique may be regarded as a method which satisfies both requirements, especially regarding a high safety standard in application. In this context the biophysical and technical fundamentals of this method, which have been known in principle for a long time, are described with regard to the special demands of a newly developed field of modern surgery. After classification of this method into a general and a quasi-bipolar mode, various technological solutions of specific bipolar probes, in a strict and in a generalized sense, are characterized in terms of indication. Experimental results obtained with different bipolar instruments and probes are given. The application of modern microprocessor-controlled high-frequency surgery equipment and, wherever necessary, the integration of additional ancillary technology into the specialized bipolar instruments can yield highly useful and efficient tools for a key technology in endoscopic surgery.

  7. Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.

    PubMed

    Buske, Christine; Gerlai, Robert

    2014-08-30

    Vertebrate model organisms have been utilized in high throughput screening, but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and the collection of a substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data, the main bottleneck that slows throughput is the time consuming analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions, but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.

  8. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. 
Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618

  9. Benchmarking: A Study of School and School District Effect and Efficiency.

    ERIC Educational Resources Information Center

    Swanson, Austin D.; Engert, Frank

    The "New York State School Report Card" provides a vehicle for benchmarking with respect to student achievement. In this study, additional tools were developed for making external comparisons with respect to achievement, and tools were added for assessing fiscal policy and efficiency. Data from school years 1993-94 through 1995-96 were…

  10. Protoplast isolation, transient transformation of leaf mesophyll protoplasts and improved Agrobacterium-mediated leaf disc infiltration of Phaseolus vulgaris: tools for rapid gene expression analysis.

    PubMed

    Nanjareddy, Kalpana; Arthikala, Manoj-Kumar; Blanco, Lourdes; Arellano, Elizabeth S; Lara, Miguel

    2016-06-24

    Phaseolus vulgaris is one of the most extensively studied model legumes in the world. The P. vulgaris genome sequence is available; therefore, the need for an efficient and rapid transformation system is more imperative than ever. The functional characterization of P. vulgaris genes is impeded chiefly due to the non-amenable nature of Phaseolus sp. to stable genetic transformation. Transient transformation systems are convenient and versatile alternatives for rapid gene functional characterization studies. Hence, the present work focuses on standardizing methodologies for protoplast isolation from multiple tissues and transient transformation protocols for rapid gene expression analysis in the recalcitrant grain legume P. vulgaris. Herein, we provide methodologies for the high-throughput isolation of leaf mesophyll-, flower petal-, hypocotyl-, root- and nodule-derived protoplasts from P. vulgaris. The highly efficient polyethylene glycol-mannitol magnesium (PEG-MMG)-mediated transformation of leaf mesophyll protoplasts was optimized using a GUS reporter gene. We used the P. vulgaris SNF1-related protein kinase 1 (PvSnRK1) gene as proof of concept to demonstrate rapid gene functional analysis. An RT-qPCR analysis of protoplasts that had been transformed with PvSnRK1-RNAi and PvSnRK1-OE vectors showed the significant downregulation and ectopic constitutive expression (overexpression), respectively, of the PvSnRK1 transcript. We also demonstrated an improved transient transformation approach, sonication-assisted Agrobacterium-mediated transformation (SAAT), for the leaf disc infiltration of P. vulgaris. Interestingly, this method resulted in a 90% transformation efficiency and transformed 60-85% of the cells in a given area of the leaf surface. The constitutive expression of YFP further confirmed the amenability of the system to gene functional characterization studies. We present simple and efficient methodologies for protoplast isolation from multiple P. 
vulgaris tissues. We also provide a high-efficiency and amenable method for leaf mesophyll transformation for rapid gene functional characterization studies. Furthermore, a modified SAAT leaf disc infiltration approach aids in validating genes and their functions. Together, these methods help to rapidly unravel novel gene functions and are promising tools for P. vulgaris research.

  11. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
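    Once a Markov reliability model has been constructed, automatically or by hand, evaluating it is straightforward. A minimal two-state (up/down) discrete-time example, purely illustrative of what such tools compute rather than any particular tool's model:

```python
def reliability(p_fail, p_repair, steps):
    """Two-state discrete-time Markov reliability model.

    The state probabilities [P(up), P(down)] evolve by one transition
    per time step: an up system fails with probability p_fail, a down
    system is repaired with probability p_repair.
    """
    up, down = 1.0, 0.0        # system starts operational
    for _ in range(steps):
        up, down = (up * (1 - p_fail) + down * p_repair,
                    up * p_fail + down * (1 - p_repair))
    return up

# Availability after many steps approaches p_repair / (p_fail + p_repair).
print(reliability(0.01, 0.2, 1000))
```

Real fault-occurrence models have many more states (combinations of component failures and fault-handling stages), which is exactly why automating their construction from a top-down design description is the proposed next step.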

  12. Overview of Fundamental High-Lift Research for Transport Aircraft at NASA

    NASA Technical Reports Server (NTRS)

    Leavitt, L. D.; Washburn, A. E.; Wahls, R. A.

    2007-01-01

    NASA has had a long history in fundamental and applied high lift research. Current programs provide a focus on the validation of technologies and tools that will enable extremely short take-off and landing coupled with efficient cruise performance, simple flaps with flow control for improved effectiveness, circulation control wing concepts, some exploration into new aircraft concepts, and partnership with the Air Force Research Lab in mobility. Transport high-lift development testing will shift more toward mid and high Rn facilities, at least until the question "How much Rn is required?" is answered. This viewgraph presentation provides an overview of high-lift research at NASA.

  13. ROOT.NET: Using ROOT from .NET languages like C# and F#

    NASA Astrophysics Data System (ADS)

    Watts, G.

    2012-12-01

    ROOT.NET provides an interface between Microsoft's Common Language Runtime (CLR) and .NET technology and the ubiquitous particle physics analysis tool, ROOT. ROOT.NET automatically generates a series of efficient wrappers around the ROOT API. Unlike pyROOT, these wrappers are statically typed and so are highly efficient as compared to the Python wrappers. The connection to .NET means that one gains access to the full series of languages developed for the CLR including functional languages like F# (based on OCaml). Many features that make ROOT objects work well in the .NET world are added (properties, IEnumerable interface, LINQ compatibility, etc.). Dynamic languages based on the CLR can be used as well, of course (Python, for example). Additionally it is now possible to access ROOT objects that are unknown to the translation tool. This poster will describe the techniques used to effect this translation, along with performance comparisons, and examples. All described source code is posted on the open source site CodePlex.

  14. Artificial intelligence-based computer modeling tools for controlling slag foaming in electric arc furnaces

    NASA Astrophysics Data System (ADS)

    Wilson, Eric Lee

    Due to increased competition in a world economy, steel companies are currently interested in developing techniques that will improve the steelmaking process, either by increasing output efficiency or by improving the quality of their product, or both. Slag foaming is one practice that has been shown to contribute to both these goals. However, slag foaming is highly dynamic and difficult to model or control. This dissertation describes an effort to use artificial intelligence-based tools (genetic algorithms, fuzzy logic, and neural networks) to both model and control the slag foaming process. Specifically, a neural network is trained and tested on slag foaming data provided by a steel plant. This neural network model is then controlled by a fuzzy logic controller, which in turn is optimized by a genetic algorithm. The tuned controller was then installed at a steel plant and shown to be a more efficient slag foaming controller than the one previously used by the plant.
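The genetic-algorithm tuning loop described above can be sketched in miniature. The plant model, fitness function, and parameter range below are toy stand-ins (the dissertation's foam model and fuzzy controller are far richer); only the evolutionary loop itself, i.e. rank, select, cross over, mutate, is the point.

```python
import random

def fitness(gain, target=1.0, steps=50):
    """Toy stand-in for the slag-foam plant: a first-order response
    driven by a proportional controller; lower tracking error is better."""
    height, err_sum = 0.0, 0.0
    for _ in range(steps):
        error = target - height
        height += 0.1 * gain * error          # controller action on the plant
        err_sum += abs(error)
    return err_sum

def genetic_tune(pop_size=20, generations=40, seed=42):
    """Evolve a controller gain in [0, 5] that minimizes tracking error."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 5.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                  # rank by tracking error
        survivors = pop[:pop_size // 2]        # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = 0.5 * (a + b)              # arithmetic crossover
            child += rng.gauss(0.0, 0.2)       # Gaussian mutation
            children.append(min(max(child, 0.0), 5.0))
        pop = survivors + children
    return min(pop, key=fitness)

best = genetic_tune()
```

In the dissertation the individuals being evolved are fuzzy-controller parameters and the fitness comes from the neural-network plant model; the loop structure is the same.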

  15. A Computational Framework for Efficient Low Temperature Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

    Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, and metamaterials. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested with numerical results assessing the accuracy and efficiency of benchmarks for problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

  16. [Advances in CRISPR-Cas-mediated genome editing system in plants].

    PubMed

    Wang, Chun; Wang, Kejian

    2017-10-25

    Targeted genome editing technology is an important tool to study the function of genes and to modify organisms at the genetic level. Recently, the CRISPR-Cas (clustered regularly interspaced short palindromic repeats and CRISPR-associated proteins) system has emerged as an efficient tool for specific genome editing in animals and plants. The CRISPR-Cas system uses a CRISPR-associated endonuclease and a guide RNA to generate double-strand breaks at the target DNA site, subsequently leading to genetic modifications. The CRISPR-Cas system has received widespread attention because it manipulates genomes simply, easily, and with high specificity. This review summarizes recent advances in the diverse applications of the CRISPR-Cas toolkit in plant research and crop breeding, including expanding the range of genome editing, precise editing of a target base, and efficient DNA-free genome editing technology. This review also discusses potential challenges and future application prospects, and provides a useful reference for researchers who are interested in this field.
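As a concrete illustration of how a Cas9 target site is specified, the sketch below scans a DNA string for the SpCas9 "NGG" PAM motif and reports each 20-nt protospacer immediately upstream of it. This is the generic textbook targeting rule, not code from any tool the review covers.

```python
def find_spcas9_targets(seq, guide_len=20):
    """Return (guide, pam, position) tuples for every NGG PAM on the
    forward strand that has a full-length protospacer upstream of it."""
    seq = seq.upper()
    hits = []
    for i in range(guide_len, len(seq) - 2):
        if seq[i + 1:i + 3] == "GG":          # N-G-G PAM at positions i..i+2
            guide = seq[i - guide_len:i]      # 20-nt protospacer upstream
            hits.append((guide, seq[i:i + 3], i - guide_len))
    return hits
```

A full guide-design tool would also scan the reverse strand and score off-targets; this shows only the PAM-anchored site definition.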

  17. Car Assembly Line Efficiency Improvement by Lean Principle

    NASA Astrophysics Data System (ADS)

    Sawassalung, Suwalee; Chutima, Parames

    2017-06-01

    This research aimed to increase the efficiency of the ratio of actual working time to design standard time (DSTR), analyzed through the Lean system, for an assembly line at a car manufacturer in Thailand. Currently, the case-study factory and its group of factories, which have branches all over the world, compete with each other on quality, delivery time, and production cost. The production cost that can be reduced without affecting quality, and in a way acceptable to clients, is the manpower cost; the index of this competition is DSTR. The factory currently has a DSTR of 6.13, and the DSTR of the assembly department is 4.24, which is very high compared with other departments; a low DSTR indicates good performance. The problem was addressed by applying the following tools: the Lean principle, Value Stream Mapping (VSM), waste analysis, and ECRS. After implementing these tools, the results showed that DSTR decreased from 4.24 to 4.06, a reduction of 4.25%.
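The quoted improvement is a simple relative reduction; a one-liner reproduces the figure:

```python
def pct_reduction(before, after):
    """Relative reduction, as a percentage of the starting value."""
    return (before - after) / before * 100.0

print(round(pct_reduction(4.24, 4.06), 2))  # prints 4.25
```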

  18. Increasing energy efficiency level of building production based on applying modern mechanization facilities

    NASA Astrophysics Data System (ADS)

    Prokhorov, Sergey

    2017-10-01

    The building industry is currently going through hard times. The cost of operating machines and mechanisms in construction and installation work accounts for a substantial share of total building construction expenses. There is a need for a highly efficient method that not only increases production but also reduces the direct costs of operating the machine fleet and increases its energy efficiency. To achieve this goal, we plan to use modern work-production methods, high-tech and energy-saving machine tools and technologies, and optimal sets of mechanization equipment, with operating prime cost and set efficiency as the optimization criteria. In solving this task, we concluded that analyzing mechanized work and energy audits alongside production output, prime costs, and energy-resource costs makes it possible to assemble a complete machine fleet, improve ecological performance, and increase the quality of construction and installation work.

  19. Maximizing Efficiency and Reducing Robotic Surgery Costs Using the NASA Task Load Index.

    PubMed

    Walters, Carrie; Webb, Paula J

    2017-10-01

    Perioperative leaders at our facility were struggling to meet efficiency targets for robotic surgery procedures while also maintaining the satisfaction of the surgical team. We developed a human resources time and motion study tool and used it in conjunction with the NASA Task Load Index to observe and analyze the required workload of personnel assigned to 25 robotic surgery procedures. The time and motion study identified opportunities to enlist the help of nonlicensed support personnel to ensure safe patient care and improve OR efficiency. Using the NASA Task Load Index demonstrated that high temporal, effort, and physical demands existed for personnel assisting with and performing robotic surgery. We believe that this process could be used to develop cost-effective staffing models, resulting in safe and efficient care for all surgical patients. Copyright © 2017 AORN, Inc. Published by Elsevier Inc. All rights reserved.
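For readers unfamiliar with the index, the standard NASA-TLX overall score is a weighted average of six subscale ratings (each 0-100), with weights taken from 15 pairwise comparisons between the subscales. The sketch below assumes that standard scoring rule; the subscale values and tallies are invented for illustration, not data from this study.

```python
def nasa_tlx(ratings, weights):
    """Overall NASA-TLX workload: ratings are 0-100 per subscale,
    weights are pairwise-comparison tallies that must sum to 15."""
    assert set(ratings) == set(weights) and sum(weights.values()) == 15
    return sum(ratings[k] * weights[k] for k in ratings) / 15.0

# Hypothetical ratings for one observed robotic case
scores = {"mental": 60, "physical": 70, "temporal": 85,
          "performance": 40, "effort": 80, "frustration": 55}
# Hypothetical pairwise-comparison tallies (must total 15)
tallies = {"mental": 3, "physical": 2, "temporal": 4,
           "performance": 1, "effort": 4, "frustration": 1}

overall = nasa_tlx(scores, tallies)
```

High tallies on the temporal and effort subscales, as the abstract reports, pull the overall workload score up even when other ratings are moderate.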

  20. Generation of an ICF syndrome model by efficient genome editing of human induced pluripotent stem cells using the CRISPR system.

    PubMed

    Horii, Takuro; Tamura, Daiki; Morita, Sumiyo; Kimura, Mika; Hatada, Izuho

    2013-09-30

    Genome manipulation of human induced pluripotent stem (iPS) cells is essential to achieve their full potential as tools for regenerative medicine. To date, however, gene targeting in human pluripotent stem cells (hPSCs) has proven to be extremely difficult. Recently, an efficient genome manipulation technology using the RNA-guided DNA endonuclease Cas9, the clustered regularly interspaced short palindromic repeats (CRISPR) system, has been developed. Here we report the efficient generation of an iPS cell model for immunodeficiency, centromeric instability and facial anomalies (ICF) syndrome using the CRISPR system. We obtained iPS cells with mutations in both alleles of DNA methyltransferase 3B (DNMT3B) in 63% of transfected clones. Our data suggest that the CRISPR system is highly efficient and useful for genome engineering of human iPS cells.

  1. Estimating Tool-Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool.

    PubMed

    Zhao, Baoliang; Nelson, Carl A

    2016-10-01

    Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch of the surgical site, surgeons may be prone to exert larger forces and cause tissue damage. To quantify tool-tissue interaction forces, researchers have tried to attach different kinds of sensors on the surgical tools. This sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder the normal function of the tools; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method by estimating tool-tissue interaction forces using driving motors' current, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool-tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may allow a haptic surgical robot which is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool-tissue interaction forces in real time, thereby increasing surgical efficiency and safety.
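The current-based estimation rests on the proportionality between motor torque and current: torque is the torque constant times current, friction is subtracted, and the remainder is reflected through the gearbox onto the tool tip. A minimal sketch with an illustrative torque constant, gear ratio, pulley radius, and friction dead-band (none of these are the paper's identified values):

```python
def estimate_grip_force(current_a, kt=0.0234, gear=20.0, radius=0.006,
                        friction_nm=0.002):
    """Estimate tool-tip force (N) from motor current (A).

    motor torque = kt * I; a fixed friction torque is subtracted,
    then the net torque is reflected through the gear ratio and the
    cable pulley radius. All parameters here are hypothetical."""
    torque = kt * current_a - friction_nm
    if torque <= 0.0:
        return 0.0                      # below the friction dead-band
    return torque * gear / radius

force_n = estimate_grip_force(0.5)      # ~32 N for a 0.5 A draw
```

In practice the constants would be identified per joint from calibration data, and dynamic terms (inertia, velocity-dependent friction) would be compensated as well.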

  2. Cooperative problem solving with personal mobile information tools in hospitals.

    PubMed

    Buchauer, A; Werner, R; Haux, R

    1998-01-01

    Health-care professionals have a broad range of needs for information and cooperation while working at different points of care (e.g., outpatient departments, wards, and functional units such as operating theaters). Patient-related data and medical knowledge have to be widely available to support high-quality patient care. Furthermore, due to the increased specialization of health-care professionals, efficient collaboration is required. Personal mobile information tools have a considerable potential to realize almost ubiquitous information and collaborative support. They make it possible to unite the functionality of conventional tools, such as paper forms, dictating machines, and pagers, into a single device. Moreover, they can extend the support already provided by clinical workstations. An approach is described for the integration of mobile information tools with heterogeneous hospital information systems. This approach includes identification of functions which should be provided on mobile tools. Major functions are the presentation of medical records and reports, electronic mailing to support interpersonal communication, and the provision of editors for structured clinical documentation. To realize those functions on mobile tools, we propose a document-based client-server architecture that enables mobile information tools to interoperate with existing computer-based application systems. Open application systems and powerful, partially wireless, hospital-wide networks are the prerequisites for the introduction of mobile information tools.

  3. New Automotive Air Conditioning System Simulation Tool Developed in MATLAB/Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiss, T.; Chaney, L.; Meyer, J.

    Further improvements in vehicle fuel efficiency require accurate evaluation of the vehicle's transient total power requirement. When operated, the air conditioning (A/C) system is the largest auxiliary load on a vehicle; therefore, accurate evaluation of the load it places on the vehicle's engine and/or energy storage system is especially important. Vehicle simulation software, such as 'Autonomie,' has been used by OEMs to evaluate vehicles' energy performance. A transient A/C simulation tool incorporated into vehicle simulation models would also provide a tool for developing more efficient A/C systems through a thorough consideration of transient A/C system performance. The dynamic system simulation software MATLAB/Simulink was used to develop new and more efficient vehicle energy system controls. The various modeling methods used for the new simulation tool are described in detail. Comparison with measured data is provided to demonstrate the validity of the model.
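The kind of transient A/C load such a tool computes can be illustrated with a lumped-capacitance cabin model: a thermostat-controlled cooling term pulls a heat-soaked cabin down against ambient heat gain, and the compressor's electrical energy is tallied from an assumed COP. All coefficients below are illustrative placeholders, not values from the NREL model.

```python
def cabin_pulldown(t_amb=35.0, t_start=50.0, t_set=22.0,
                   ua=80.0, cap=1.0e5, q_ac=4000.0, cop=2.5,
                   dt=1.0, duration=1200):
    """Lumped-capacitance cabin: C*dT/dt = UA*(T_amb - T) - Q_ac.

    Returns (temps, energy_wh) for a soak-heated cabin pulled down
    by a fixed-capacity A/C with bang-bang thermostat control.
    All coefficients are hypothetical placeholders."""
    temp, energy_j, temps = t_start, 0.0, []
    for _ in range(int(duration / dt)):
        cooling = q_ac if temp > t_set else 0.0    # thermostat on/off
        temp += dt * (ua * (t_amb - temp) - cooling) / cap
        energy_j += (cooling / cop) * dt           # compressor electric input
        temps.append(temp)
    return temps, energy_j / 3600.0

temps, energy_wh = cabin_pulldown()
```

A co-simulated vehicle model would draw `cooling / cop` as an electrical or mechanical accessory load at each time step, which is exactly the transient coupling the abstract argues matters for fuel-economy evaluation.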

  4. Subnanometer and nanometer catalysts, method for preparing size-selected catalysts

    DOEpatents

    Vajda, Stefan; Pellin, Michael J.; Elam, Jeffrey W. [Elmhurst, IL]; Marshall, Christopher L. [Naperville, IL]; Winans, Randall A. [Downers Grove, IL]; Meiwes-Broer, Karl-Heinz [Roggentin, GR]

    2012-04-03

    Highly uniform cluster-based nanocatalysts supported on technologically relevant supports were synthesized for reactions of top industrial relevance. The Pt-cluster-based catalysts outperformed the very best reported ODHP catalyst in both activity (by up to two orders of magnitude higher turn-over frequencies) and selectivity. The results clearly demonstrate that highly dispersed, ultra-small Pt clusters precisely localized on high-surface-area supports can lead to affordable new catalysts for highly efficient and economic propene production, including considerably simplified separation of the final product. The combined GISAXS-mass spectrometry provides an excellent tool to monitor the evolution of the size and shape of nanocatalysts in action under realistic conditions. Also provided are sub-nanometer gold and sub-nanometer to few-nm size-selected silver catalysts, which possess size-dependent, tunable catalytic properties in the epoxidation of alkenes. The invented size-selected cluster deposition provides a unique tool to tune material properties in an atom-by-atom fashion, and the deposited clusters can be stabilized by protective overcoats.

  5. Subnanometer and nanometer catalysts, method for preparing size-selected catalysts

    DOEpatents

    Vajda, Stefan [Lisle, IL]; Pellin, Michael J. [Naperville, IL]; Elam, Jeffrey W. [Elmhurst, IL]; Marshall, Christopher L. [Naperville, IL]; Winans, Randall A. [Downers Grove, IL]; Meiwes-Broer, Karl-Heinz [Roggentin, GR]

    2012-03-27

    Highly uniform cluster-based nanocatalysts supported on technologically relevant supports were synthesized for reactions of top industrial relevance. The Pt-cluster-based catalysts outperformed the very best reported ODHP catalyst in both activity (by up to two orders of magnitude higher turn-over frequencies) and selectivity. The results clearly demonstrate that highly dispersed, ultra-small Pt clusters precisely localized on high-surface-area supports can lead to affordable new catalysts for highly efficient and economic propene production, including considerably simplified separation of the final product. The combined GISAXS-mass spectrometry provides an excellent tool to monitor the evolution of the size and shape of nanocatalysts in action under realistic conditions. Also provided are sub-nanometer gold and sub-nanometer to few-nm size-selected silver catalysts, which possess size-dependent, tunable catalytic properties in the epoxidation of alkenes. The invented size-selected cluster deposition provides a unique tool to tune material properties in an atom-by-atom fashion, and the deposited clusters can be stabilized by protective overcoats.

  6. P-TRAP: a Panicle TRAit Phenotyping tool.

    PubMed

    A L-Tam, Faroq; Adam, Helene; Anjos, António dos; Lorieux, Mathias; Larmande, Pierre; Ghesquière, Alain; Jouannic, Stefan; Shahbazkia, Hamid Reza

    2013-08-29

    In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used with different platforms (the user-friendly Graphical User Interface (GUI) uses Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful, practical and collects much more data than human operators. P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. 
In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared with field-operator counts, expert verification, and well-known academic methods.
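Grain counting of the kind P-TRAP performs reduces, once the photograph has been thresholded, to counting connected blobs in a binary image. A minimal pure-Python flood-fill version (P-TRAP itself is written in Java and does much more, such as panicle-structure and seed-shape analysis):

```python
def count_grains(mask):
    """Count 4-connected blobs of 1s in a binary image (list of lists),
    the same basic idea as detecting and counting grains after
    thresholding a panicle photograph."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]               # flood-fill one blob
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count
```

A real pipeline would add noise filtering and split touching grains (e.g. by watershed), which is where most of the practical difficulty lies.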

  7. P-TRAP: a Panicle Trait Phenotyping tool

    PubMed Central

    2013-01-01

    Background In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. Results This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used with different platforms (the user-friendly Graphical User Interface (GUI) uses Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful, practical and collects much more data than human operators. Conclusions P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. 
In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared with field-operator counts, expert verification, and well-known academic methods. PMID:23987653

  8. FLaapLUC: A pipeline for the generation of prompt alerts on transient Fermi-LAT γ-ray sources

    NASA Astrophysics Data System (ADS)

    Lenain, J.-P.

    2018-01-01

    The large majority of high energy sources detected with Fermi-LAT are blazars, which are known to be very variable sources. Because high-cadence, long-term monitoring at several wavelengths simultaneously is prohibitive, the study of their transient activity can help shed light on our understanding of these objects. Early detection of such potentially fast transient events is key to triggering follow-up observations at other wavelengths. A Python tool, FLaapLUC, built on top of the Science Tools provided by the Fermi Science Support Center and the Fermi-LAT collaboration, has been developed using a simple aperture photometry approach. This tool can effectively detect relative flux variations in a set of predefined sources and alert potential users. Such alerts can then be used to trigger target-of-opportunity observations with other facilities. It is shown that FLaapLUC is an efficient tool for revealing transient events in Fermi-LAT data, providing quick results which can be used to promptly organise follow-up observations. Results from this simple aperture photometry method are also compared to full likelihood analyses. The FLaapLUC package is made available on GitHub and is open to contributions by the community.
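The core of such an alert generator is a threshold test on a binned light curve. A toy version is sketched below, using a leave-one-out baseline so that a bright flare does not inflate its own reference level; the real pipeline additionally handles exposure correction and likelihood follow-up, and its exact trigger criterion may differ.

```python
from statistics import mean, stdev

def flag_flares(fluxes, n_sigma=3.0):
    """Flag time bins whose flux exceeds the mean of the *other* bins
    by n_sigma of their scatter. A toy stand-in for an
    aperture-photometry flare trigger on a binned light curve."""
    flagged = []
    for i, f in enumerate(fluxes):
        rest = fluxes[:i] + fluxes[i + 1:]
        if f > mean(rest) + n_sigma * stdev(rest):
            flagged.append(i)
    return flagged
```

Each flagged bin would then be passed to a full likelihood analysis before an alert is issued, mirroring the comparison the abstract describes.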

  9. Wavelet-Based Peak Detection and a New Charge Inference Procedure for MS/MS Implemented in ProteoWizard’s msConvert

    PubMed Central

    2015-01-01

    We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1–100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets. PMID:25411686

  10. Wavelet-based peak detection and a new charge inference procedure for MS/MS implemented in ProteoWizard's msConvert.

    PubMed

    French, William R; Zimmerman, Lisa J; Schilling, Birgit; Gibson, Bradford W; Miller, Christine A; Townsend, R Reid; Sherrod, Stacy D; Goodwin, Cody R; McLean, John A; Tabb, David L

    2015-02-06

    We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1-100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets.
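The essence of a wavelet peak-picker is to correlate the spectrum with a Ricker ("Mexican hat") kernel, whose zero-mean shape suppresses a flat baseline while rewarding peak-like bumps, and then to keep local maxima of the response. A single-scale toy version is shown below; CantWaiT itself works across multiple scales with ridge-line tracking, so this is only the underlying idea.

```python
import math

def ricker(t, width):
    """Ricker (Mexican hat) wavelet sample at offset t."""
    a = t / width
    return (1.0 - a * a) * math.exp(-0.5 * a * a)

def wavelet_peaks(signal, width=2.0, support=8, threshold=1.0):
    """Correlate the signal with a Ricker kernel and return indices
    that are local maxima of the response above a threshold."""
    offsets = range(-support, support + 1)
    kernel = [ricker(t, width) for t in offsets]
    n = len(signal)
    resp = []
    for i in range(n):
        acc = 0.0
        for k, w in zip(offsets, kernel):
            if 0 <= i + k < n:
                acc += w * signal[i + k]    # cross-correlation term
        resp.append(acc)
    return [i for i in range(1, n - 1)
            if resp[i] > threshold
            and resp[i] >= resp[i - 1] and resp[i] >= resp[i + 1]]
```

On a synthetic spectrum with Gaussian peaks at indices 10 and 30, the response peaks at exactly those indices while the flat regions stay below threshold.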

  11. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
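Of the three models offered by the tool, simple linear regression is the easiest to show in full: fit a least-squares trend to historic monthly volumes and extrapolate. A self-contained sketch of that idea (the tool's own implementation is not detailed in this abstract):

```python
def linear_forecast(volumes, horizon):
    """Ordinary least-squares trend through historic test volumes,
    extrapolated `horizon` periods ahead. The simplest of the three
    models described; Holt-Winters additionally handles seasonality."""
    n = len(volumes)
    x_mean = (n - 1) / 2.0                       # mean of 0..n-1
    y_mean = sum(volumes) / n
    sxy = sum((x - x_mean) * (y - y_mean)
              for x, y in enumerate(volumes))
    sxx = sum((x - x_mean) ** 2 for x in range(n))
    slope = sxy / sxx
    intercept = y_mean - slope * x_mean
    return [intercept + slope * (n + h) for h in range(horizon)]
```

Comparing such forecasts against realized volumes is exactly the utilization-management evaluation the abstract describes.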

  12. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996

  13. Advances in biotechnology and genomics of switchgrass

    PubMed Central

    2013-01-01

    Switchgrass (Panicum virgatum L.) is a C4 perennial warm season grass indigenous to the North American tallgrass prairie. A number of its natural and agronomic traits, including adaptation to a wide geographical distribution, low nutrient requirements and production costs, high water use efficiency, high biomass potential, ease of harvesting, and potential for carbon storage, make it an attractive dedicated biomass crop for biofuel production. We believe that genetic improvements using biotechnology will be important to realize the potential of the biomass and biofuel-related uses of switchgrass. Tissue culture techniques aimed at rapid propagation of switchgrass and genetic transformation protocols have been developed. Rapid progress in genome sequencing and bioinformatics has provided efficient strategies to identify, tag, clone and manipulate many economically-important genes, including those related to higher biomass, saccharification efficiency, and lignin biosynthesis. Application of the best genetic tools should render improved switchgrass that will be more economically and environmentally sustainable as a lignocellulosic bioenergy feedstock. PMID:23663491

  14. CRISPR-Cas9, a tool to efficiently increase the development of recombinant African swine fever viruses.

    PubMed

    Borca, Manuel V; Holinka, Lauren G; Berggren, Keith A; Gladue, Douglas P

    2018-02-16

    African swine fever virus (ASFV) causes a highly contagious disease called African swine fever. This disease is often lethal for domestic pigs, causing extensive losses for the swine industry. ASFV is a large and complex double stranded DNA virus. Currently there is no commercially available treatment or vaccine to prevent this devastating disease. Development of recombinant ASFV for producing live-attenuated vaccines or studying the involvement of specific genes in virus virulence has relied on the relatively rare event of homologous recombination in primary swine macrophages, making it difficult to purify the recombinant virus from the wild-type parental ASFV. Here we present the use of the CRISPR-Cas9 gene editing system as a more robust and efficient way to produce recombinant ASFVs. Using CRISPR-Cas9, a recombinant virus was efficiently developed by deleting the non-essential gene 8-DR from the genome of the highly virulent field strain Georgia07 using swine macrophages as cell substrate.

  15. Epidemics in Complex Networks: The Diversity of Hubs

    NASA Astrophysics Data System (ADS)

    Kitsak, Maksim; Gallos, Lazaros K.; Havlin, Shlomo; Stanley, H. Eugene; Makse, Hernan A.

    2009-03-01

    Many complex systems are believed to be vulnerable to the spread of viruses and information owing to their high level of interconnectivity. Even viruses of low contagiousness proliferate easily across the Internet. Rumors, fads, and innovative ideas are prone to efficient spreading in various social systems. Another commonly accepted standpoint is the importance of the most connected elements (hubs) in spreading processes. We address the following questions. Do all hubs conduct epidemics in the same manner? How does epidemic spread depend on the structure of the network? What is the most efficient way to spread information over the system? We analyze several large-scale systems in the framework of the susceptible/infective/removed (SIR) disease spread model, which can also be mapped to the problem of rumor or fad spreading. We show that hubs are often ineffective in the transmission of viruses or information owing to the highly heterogeneous topology of most networks. We also propose a new tool to evaluate the efficiency of nodes in spreading viruses or information.
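The SIR dynamics on a network can be sketched directly. The toy below runs a discrete-time SIR process on an adjacency list and reports outbreak size; on a star graph it illustrates the hub's role the authors examine. This is a generic SIR simulation, not the authors' analysis code.

```python
import random

def sir_outbreak(adj, patient_zero, beta=0.3, seed=1):
    """Discrete-time SIR on an adjacency list: each infected node
    infects each susceptible neighbour with probability beta per step,
    then recovers. Returns the total number of nodes ever infected."""
    rng = random.Random(seed)
    infected = {patient_zero}
    recovered = set()
    while infected:
        new = set()
        for node in infected:
            for nbr in adj[node]:
                if (nbr not in infected and nbr not in recovered
                        and nbr not in new and rng.random() < beta):
                    new.add(nbr)
        recovered |= infected       # infected nodes recover after one step
        infected = new
    return len(recovered)
```

With deterministic transmission (beta=1.0) on a 10-node star, seeding the hub or any leaf infects everyone, while beta=0.0 confines the outbreak to the seed; averaging over many stochastic runs at intermediate beta is how node spreading efficiency would be compared.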

  16. Programming Light-Harvesting Efficiency Using DNA Origami

    PubMed Central

    2016-01-01

    The remarkable performance and quantum efficiency of biological light-harvesting complexes has prompted a multidisciplinary interest in engineering biologically inspired antenna systems as a possible route to novel solar cell technologies. Key to the effectiveness of biological “nanomachines” in light capture and energy transport is their highly ordered nanoscale architecture of photoactive molecules. Recently, DNA origami has emerged as a powerful tool for organizing multiple chromophores with base-pair accuracy and full geometric freedom. Here, we present a programmable antenna array on a DNA origami platform that enables the implementation of rationally designed antenna structures. We systematically analyze the light-harvesting efficiency with respect to the number of donors and the interdye distances of a ring-like antenna using ensemble and single-molecule fluorescence spectroscopy and detailed Förster modeling. This comprehensive study demonstrates exquisite and reliable structural control over multichromophoric geometries and points to DNA origami as a highly versatile platform for testing design concepts in artificial light-harvesting networks. PMID:26906456
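The Förster modeling mentioned above rests on the standard distance dependence of energy-transfer efficiency, E = 1 / (1 + (r/R0)^6), where R0 is the distance of half-maximal transfer. A two-line helper makes the steep dependence on interdye distance concrete (R0 = 6 nm here is a typical order of magnitude for organic dye pairs, not a value from this study):

```python
def fret_efficiency(r_nm, r0_nm=6.0):
    """Foerster energy-transfer efficiency for donor-acceptor
    distance r; r0 is the distance of half-maximal transfer."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)
```

At r = R0 the efficiency is exactly 0.5; halving the distance pushes it to about 0.98, and doubling it drops it to about 0.015, which is why base-pair-accurate dye placement matters so much for antenna design.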

  17. Coupled Neutron Transport for HZETRN

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Blattnig, Steve R.

    2009-01-01

    Exposure estimates inside space vehicles, surface habitats, and high-altitude aircraft exposed to space radiation are highly influenced by secondary neutron production. The deterministic transport code HZETRN has been identified as a reliable and efficient tool for such studies, but improvements to the underlying transport models and numerical methods are still necessary. In this paper, the forward-backward (FB) and directionally coupled forward-backward (DC) neutron transport models are derived, numerical methods for the FB model are reviewed, and a computationally efficient numerical solution is presented for the DC model. Both models are compared to the Monte Carlo codes HETC-HEDS, FLUKA, and MCNPX, and the DC model is shown to agree closely with the Monte Carlo results. Finally, it is found in the development of either model that the decoupling of low energy neutrons from the light particle transport procedure adversely affects low energy light ion fluence spectra and exposure quantities. A first order correction is presented to resolve the problem, and it is shown to be both accurate and efficient.

  18. The physiologic state of Escherichia coli O157:H7 does not affect its detection in two commercial real-time PCR-based tests

    USDA-ARS?s Scientific Manuscript database

    Multiplex real-time PCR detection of Escherichia coli O157:H7 is an efficient molecular tool with high sensitivity and specificity for meat safety and quality assurance in the beef industry. The Biocontrol GDS and the DuPont Qualicon BAX®-RT rapid detection systems are two commercial tests based on...

  19. Scaling Support Vector Machines On Modern HPC Platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    You, Yang; Fu, Haohuan; Song, Shuaiwen

    2015-02-01

    We designed and implemented MIC-SVM, a highly efficient parallel SVM for x86-based multi-core and many-core architectures, such as Intel Ivy Bridge CPUs and the Intel Xeon Phi co-processor (MIC). We propose various novel analysis methods and optimization techniques to fully utilize the multilevel parallelism provided by these architectures; these techniques can also serve as general optimization methods for other machine learning tools.
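    The training objective that such SVM implementations parallelize can be sketched serially in a few lines. This is a generic Pegasos-style sub-gradient descent on the hinge loss, not the MIC-SVM algorithm itself; the learning rate, regularization weight, and toy data are assumptions for illustration:

    ```python
    import numpy as np

    def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1, seed=0):
        """Train a linear SVM by stochastic sub-gradient descent on the
        hinge loss (Pegasos-style sketch; hyper-parameters are assumed,
        not those of MIC-SVM). Labels y must be in {-1, +1}."""
        rng = np.random.default_rng(seed)
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            for i in rng.permutation(len(X)):
                margin = y[i] * (X[i] @ w + b)
                if margin < 1:                  # point inside the margin: step
                    w = (1 - lr * lam) * w + lr * y[i] * X[i]
                    b += lr * y[i]
                else:                           # only the regularization shrink
                    w = (1 - lr * lam) * w
        return w, b

    # Linearly separable toy data
    X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
    y = np.array([1, 1, -1, -1])
    w, b = train_linear_svm(X, y)
    preds = np.sign(X @ w + b)   # should reproduce the training labels
    ```

    The inner loop over samples is exactly the part that many-core implementations restructure for cache locality and vectorization.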

  20. Demonstrating Biological Principles Efficiently and Effectively: The Overhead Is More than Just a Lighted Chalkboard

    ERIC Educational Resources Information Center

    Barden-Gabbei, Laura M.

    2006-01-01

    The overhead projector is an excellent tool for teachers at both the high school and college level. Teachers often use it to display class notes as they monitor students' actions and reactions to the concepts being presented and discussed, to display diagrams and figures too complex to draw on the chalkboard, and more recently to display computer…

  1. The Chromosome Microdissection and Microcloning Technique.

    PubMed

    Zhang, Ying-Xin; Deng, Chuan-Liang; Hu, Zan-Min

    2016-01-01

    Chromosome microdissection followed by microcloning is an efficient tool combining cytogenetics and molecular genetics that can be used for the construction of high-density molecular marker linkage maps and fine physical maps, the generation of probes for chromosome painting, and the localization and cloning of important genes. Here, we describe a modified technique to microdissect a single chromosome, paint individual chromosomes, and construct single-chromosome DNA libraries.

  2. Production, allocation, and stemwood growth efficiency of Pinus taeda L. stands in response to 6 years of intensive management

    Treesearch

    Lisa J. Samuelson; Kurt Johnsen; Tom Stokes

    2004-01-01

    Loblolly pine (Pinus taeda L.) is a highly plastic species with respect to growth responses to forest management. Loblolly pine is the most planted species across the southern United States, a region with the most expansive and intensively managed forest plantations in the world. Management intensity, using tools such as site preparation and...

  3. Inductive plasmas for plasma processing

    NASA Astrophysics Data System (ADS)

    Keller, John H.

    1996-05-01

    With the need for high plasma density and low pressure in single wafer etching tools, a number of inductive etching systems have been and are being developed for commercial sale. This paper reviews some of the history of low-pressure inductive plasmas, describes their features, limitations, and corrections, and presents uses for plasma processing. The theory of the skin depth, RF coil impedance, and efficiency is also discussed.

  4. The great importance of normalization of LC-MS data for highly-accurate non-targeted metabolomics.

    PubMed

    Mizuno, Hajime; Ueda, Kazuki; Kobayashi, Yuta; Tsuyama, Naohiro; Todoroki, Kenichiro; Min, Jun Zhe; Toyo'oka, Toshimasa

    2017-01-01

    The non-targeted metabolomics analysis of biological samples is very important for understanding biological functions and diseases. LC combined with electrospray ionization-based MS has been a powerful tool widely used for metabolomic analyses. However, the ionization efficiency of electrospray ionization fluctuates for various unexpected reasons, such as matrix effects and intraday variations in instrument performance. To remove these fluctuations, normalization methods have been developed. Such techniques include increasing the sensitivity, separating co-eluting components, and normalizing the ionization efficiencies. Normalization techniques allow the simultaneous correction of the ionization efficiencies of the detected metabolite peaks and thereby achieve quantitative non-targeted metabolomics. In this review paper, we focus on these normalization methods for non-targeted metabolomics by LC-MS.

  5. Multiple quay cranes scheduling for double cycling in container terminals

    PubMed Central

    Chu, Yanling; Zhang, Xiaoju; Yang, Zhongzhen

    2017-01-01

    Double cycling is an efficient tool to increase the efficiency of quay crane (QC) in container terminals. In this paper, an optimization model for double cycling is developed to optimize the operation sequence of multiple QCs. The objective is to minimize the makespan of the ship handling operation considering the ship balance constraint. To solve the model, an algorithm based on Lagrangian relaxation is designed. Finally, we compare the efficiency of the Lagrangian relaxation based heuristic with the branch-and-bound method and a genetic algorithm using instances of different sizes. The results of numerical experiments indicate that the proposed model can effectively reduce the unloading and loading times of QCs. The effects of the ship balance constraint are more notable when the number of QCs is high. PMID:28692699
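    The Lagrangian relaxation used to solve the scheduling model can be illustrated on a toy 0/1 problem. The sketch below dualizes a single coupling constraint and updates the multiplier by subgradient steps; it is a generic stand-in, since the paper's full quay crane model (variables, balance constraints) is not reproduced here:

    ```python
    def lagrangian_bound(c, a, b, steps=50, t0=1.0):
        """Subgradient sketch of Lagrangian relaxation for the 0/1 problem
            max c.x  s.t.  a.x <= b,  x binary.
        The coupling constraint is dualized with a multiplier lam >= 0, so
        the relaxed problem separates per variable and is solved by
        inspection. Returns the best (lowest) dual upper bound found."""
        lam, best = 0.0, float("inf")
        for k in range(1, steps + 1):
            # Relaxed problem: pick x_i = 1 iff its reduced profit is positive
            x = [1 if ci - lam * ai > 0 else 0 for ci, ai in zip(c, a)]
            bound = sum((ci - lam * ai) * xi
                        for ci, ai, xi in zip(c, a, x)) + lam * b
            best = min(best, bound)
            g = b - sum(ai * xi for ai, xi in zip(a, x))   # subgradient
            lam = max(0.0, lam - (t0 / k) * g)             # diminishing step
        return best

    # max 5x1 + 4x2 + 3x3  s.t.  2x1 + 3x2 + x3 <= 3  (integer optimum is 8)
    dual_bound = lagrangian_bound([5, 4, 3], [2, 3, 1], 3)
    print(dual_bound)  # -> 8.0
    ```

    As in the paper, the value of the relaxation is that each dual evaluation is cheap, and the bound it produces can be compared against heuristic or branch-and-bound solutions.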

  7. Enhanced and selective optical trapping in a slot-graphite photonic crystal.

    PubMed

    Krishnan, Aravind; Huang, Ningfeng; Wu, Shao-Hua; Martínez, Luis Javier; Povinelli, Michelle L

    2016-10-03

    Applicability of optical trapping tools for nanomanipulation is limited by the available laser power and trap efficiency. We utilized the strong confinement of light in a slot-graphite photonic crystal to develop high-efficiency parallel trapping over a large area. The stiffness is 35 times higher than our previously demonstrated on-chip, near field traps. We demonstrate the ability to trap both dielectric and metallic particles of sub-micron size. We find that the growth kinetics of nanoparticle arrays on the slot-graphite template depends on particle size. This difference is exploited to selectively trap one type of particle out of a binary colloidal mixture, creating an efficient optical sieve. This technique has rich potential for analysis, diagnostics, and enrichment and sorting of microscopic entities.

  8. Microencapsulation Technology: A Powerful Tool for Integrating Expansion and Cryopreservation of Human Embryonic Stem Cells

    PubMed Central

    Malpique, Rita; Brito, Catarina; Jensen, Janne; Bjorquist, Petter; Carrondo, Manuel J. T.; Alves, Paula M.

    2011-01-01

    The successful implementation of human embryonic stem cells (hESCs)-based technologies requires the production of relevant numbers of well-characterized cells and their efficient long-term storage. In this study, cells were microencapsulated in alginate to develop an integrated bioprocess for expansion and cryopreservation of pluripotent hESCs. Different three-dimensional (3D) culture strategies were evaluated and compared, specifically, microencapsulation of hESCs as: i) single cells, ii) aggregates and iii) immobilized on microcarriers. In order to establish a scalable bioprocess, hESC-microcapsules were cultured in stirred tank bioreactors. The combination of microencapsulation and microcarrier technology resulted in a highly efficient protocol for the production and storage of pluripotent hESCs. This strategy ensured high expansion ratios (an approximately twenty-fold increase in cell concentration) and high cell recovery yields (>70%) after cryopreservation. When compared with non-encapsulated cells, cell survival post-thawing demonstrated a three-fold improvement without compromising hESC characteristics. Microencapsulation also improved the culture of hESC aggregates by protecting cells from hydrodynamic shear stress, controlling aggregate size and maintaining cell pluripotency for two weeks. This work establishes that microencapsulation technology may prove a powerful tool for integrating the expansion and cryopreservation of pluripotent hESCs. The 3D culture strategy developed herein represents a significant breakthrough towards the implementation of hESCs in clinical and industrial applications. PMID:21850261

  9. Towards a generalized energy prediction model for machine tools

    PubMed Central

    Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H.; Dornfeld, David A.; Helu, Moneer; Rachuri, Sudarsan

    2017-01-01

    Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process. PMID:28652687
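    The GP Regression at the core of the model can be sketched compactly. The kernel hyper-parameters and the 1-D toy "power draw" data below are assumptions for illustration, not the values fitted to the Mori Seiki NVD1500 data in the paper:

    ```python
    import numpy as np

    def gp_predict(X, y, Xs, length=1.0, sigma_f=1.0, sigma_n=0.1):
        """Gaussian Process regression with an RBF kernel: posterior mean
        and variance at test inputs Xs (hyper-parameters are assumed)."""
        def k(A, B):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return sigma_f**2 * np.exp(-0.5 * d2 / length**2)
        K = k(X, X) + sigma_n**2 * np.eye(len(X))      # noisy train covariance
        Ks, Kss = k(X, Xs), k(Xs, Xs)
        alpha = np.linalg.solve(K, y)
        mean = Ks.T @ alpha
        var = np.diag(Kss - Ks.T @ np.linalg.solve(K, Ks)) + sigma_n**2
        return mean, var

    # Toy 1-D demo: noisy samples of a smooth "energy vs. feed rate" curve
    X = np.linspace(0, 5, 20)[:, None]
    y = np.sin(X[:, 0]) + 0.05 * np.random.default_rng(0).standard_normal(20)
    mean, var = gp_predict(X, y, np.array([[2.5]]))
    ```

    The posterior variance is what supplies the uncertainty intervals that the paper attaches to its energy predictions.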

  11. Cyclostationarity approach for monitoring chatter and tool wear in high speed milling

    NASA Astrophysics Data System (ADS)

    Lamraoui, M.; Thomas, M.; El Badaoui, M.

    2014-02-01

    Detection of chatter and tool wear is crucial in the machining process, and their monitoring is a key issue for (1) ensuring better surface quality, (2) increasing productivity, and (3) protecting both the machine and the workpiece. This paper presents an investigation of chatter and tool wear using the cyclostationary method to process vibration signals acquired from high speed milling. Experimental cutting tests were performed on slot milling of an aluminum alloy. The experimental set-up is designed for the acquisition of accelerometer signals and encoding information picked up from an encoder. The encoder signal is used for resampling the accelerometer signals in the angular domain using a specific algorithm developed in the LASPI laboratory. Cyclostationary analysis of the accelerometer signals is applied to monitoring chatter and tool wear in high speed milling. Cyclostationarity appears in the average properties of the signals (first order) and in their energetic properties (second order), and it generates spectral lines at cyclic frequencies in the spectral correlation. Angular power and kurtosis are used to analyze the chatter phenomena. The formation of chatter is characterized by unstable, chaotic motion of the tool and strong anomalous fluctuations of the cutting forces. Results show that stable machining generates only very few second-order cyclostationary components, while chatter is strongly correlated with second-order cyclostationary components. When machining in the unstable region, chatter results in flat angular kurtosis and flat angular power, as in a pseudo-random (white) signal with a flat spectrum. Results reveal that the spectral correlation and the Wigner-Ville spectrum (or integrated Wigner-Ville) derived from second-order cyclostationarity are efficient indicators for the early diagnosis of faults in high speed machining, such as chatter, tool wear, and bearing faults, compared to traditional stationary methods. The Wigner-Ville representation of the residual signal shows that the energy corresponding to tooth passing decreases when chatter occurs. The effects of tool wear and of the number of broken teeth on the excitation of structural resonances also appear in the Wigner-Ville representation.

  12. WE-H-BRC-04: Implement Lean Methodology to Make Our Current Process of CT Simulation to Treatment More Efficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boddu, S; Morrow, A; Krishnamurthy, N

    Purpose: Our goal is to implement lean methodology to make our current process of CT simulation to treatment more efficient. Methods: In this study, we implemented lean methodology and tools and employed flowchart in excel for process-mapping. We formed a group of physicians, physicists, dosimetrists, therapists and a clinical physics assistant and huddled bi-weekly to map current value streams. We performed GEMBA walks and observed current processes from scheduling patient CT Simulations to treatment plan approval. From this, the entire workflow was categorized into processes, sub-processes, and tasks. For each process we gathered data on touch time, first time quality, undesirable effects (UDEs), and wait-times from relevant members of each task. UDEs were binned per frequency of their occurrence. We huddled to map future state and to find solutions to high frequency UDEs. We implemented visual controls, hard stops, and documented issues found during chart checks prior to treatment plan approval. Results: We have identified approximately 64 UDEs in our current workflow that could cause delays, re-work, compromise the quality and safety of patient treatments, or cause wait times between 1 – 6 days. While some UDEs are unavoidable, such as re-planning due to patient weight loss, eliminating avoidable UDEs is our goal. In 2015, we found 399 issues with patient treatment plans, of which 261, 95 and 43 were low, medium and high severity, respectively. We also mapped patient-specific QA processes for IMRT/Rapid Arc and SRS/SBRT, involving 10 and 18 steps, respectively. From these, 13 UDEs were found and 5 were addressed that solved 20% of issues. Conclusion: We have successfully implemented lean methodology and tools. We are further mapping treatment site specific workflows to identify bottlenecks, potential breakdowns and personnel allocation and employ tools like failure mode effects analysis to mitigate risk factors to make this process efficient.

  13. Efficient Genome Editing in Induced Pluripotent Stem Cells with Engineered Nucleases In Vitro.

    PubMed

    Termglinchan, Vittavat; Seeger, Timon; Chen, Caressa; Wu, Joseph C; Karakikes, Ioannis

    2017-01-01

    Precision genome engineering is rapidly advancing the application of the induced pluripotent stem cells (iPSCs) technology for in vitro disease modeling of cardiovascular diseases. Targeted genome editing using engineered nucleases is a powerful tool that allows for reverse genetics, genome engineering, and targeted transgene integration experiments to be performed in a precise and predictable manner. However, nuclease-mediated homologous recombination is an inefficient process. Herein, we describe the development of an optimized method combining site-specific nucleases and the piggyBac transposon system for "seamless" genome editing in pluripotent stem cells with high efficiency and fidelity in vitro.

  14. Statistical Capability Study of a Helical Grinding Machine Producing Screw Rotors

    NASA Astrophysics Data System (ADS)

    Holmes, C. S.; Headley, M.; Hart, P. W.

    2017-08-01

    Screw compressors depend for their efficiency and reliability on the accuracy of the rotors, and therefore on the machinery used in their production. The machinery has evolved over more than half a century in response to customer demands for production accuracy, efficiency, and flexibility, and is now at a high level on all three criteria. Production equipment and processes must be capable of maintaining accuracy over a production run, and this must be assessed statistically under strictly controlled conditions. This paper gives numerical data from such a study of an innovative machine tool and shows that it is possible to meet the demanding statistical capability requirements.

  15. Modified GMDH-NN algorithm and its application for global sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Song, Shufang; Wang, Lu

    2017-11-01

    Global sensitivity analysis (GSA) is a very useful tool to evaluate the influence of input variables over their whole distribution range. The Sobol' method is the most commonly used of the variance-based methods, which are efficient and popular GSA techniques. High dimensional model representation (HDMR) is a popular way to compute Sobol' indices; however, its drawbacks cannot be ignored. We show that a modified GMDH-NN algorithm can efficiently calculate the coefficients of the metamodel, so this paper combines it with HDMR and proposes the GMDH-HDMR method. The new method shows higher precision and a faster convergence rate. Several numerical and engineering examples are used to confirm its advantages.
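    The Sobol' indices that the metamodel is used to compute can also be estimated directly by Monte Carlo. A minimal pick-freeze sketch (Saltelli-style estimator; the sample size and the additive test function are assumptions, not taken from the paper):

    ```python
    import numpy as np

    def sobol_first_order(f, d, n=100_000, seed=0):
        """Monte Carlo estimate of first-order Sobol' indices of f on
        [0,1]^d using a pick-freeze (Saltelli-style) scheme."""
        rng = np.random.default_rng(seed)
        A, B = rng.random((n, d)), rng.random((n, d))
        fA, fB = f(A), f(B)
        var = fA.var()
        S = np.empty(d)
        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]          # swap only coordinate i
            S[i] = np.mean(fB * (f(ABi) - fA)) / var
        return S

    # Additive test function y = x1 + 2*x2: analytic indices are 0.2 and 0.8
    S = sobol_first_order(lambda X: X[:, 0] + 2 * X[:, 1], d=2)
    print(S)  # approx [0.2, 0.8]
    ```

    Metamodel-based approaches such as GMDH-HDMR aim to reach the same indices with far fewer model evaluations than this brute-force estimator needs.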

  16. CCD charge collection efficiency and the photon transfer technique

    NASA Technical Reports Server (NTRS)

    Janesick, J.; Klaasen, K.; Elliott, T.

    1985-01-01

    The charge-coupled device (CCD) has shown unprecedented performance as a photon detector in the areas of spectral response, charge transfer, and readout noise. Recent experience indicates, however, that the full potential of the CCD's charge collection efficiency (CCE) lies well beyond that which is realized in currently available devices. A definition of CCE performance is presented and a standard test tool (the photon transfer technique) for measuring and optimizing this important CCD parameter is introduced. CCE characteristics for different types of CCDs are compared; the primary limitations in achieving high CCE performance are discussed, and the prospects for future improvement are outlined.
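    The core of the photon transfer technique is that, for a shot-noise-limited sensor, the ratio of mean signal to signal variance gives the camera gain in electrons per digital number (DN). A minimal sketch on simulated flat-field data (the 0.5 DN/e- digitization and neglected read noise are assumptions for illustration):

    ```python
    import numpy as np

    def photon_transfer_gain(frames_dn, offset_dn=0.0):
        """Estimate camera gain K (electrons per DN) from flat-field data
        via the photon-transfer relation K = mean / variance, valid in
        the shot-noise-limited regime (read noise neglected here)."""
        signal = frames_dn.mean() - offset_dn
        variance = frames_dn.var()
        return signal / variance

    # Simulate a shot-noise-limited sensor: 1000 e- mean illumination,
    # digitized at 0.5 DN per electron, i.e. a true gain of 2 e-/DN.
    rng = np.random.default_rng(1)
    electrons = rng.poisson(1000, size=200_000)
    frames = electrons * 0.5
    gain = photon_transfer_gain(frames)
    print(gain)  # approx 2.0
    ```

    In practice the technique plots variance against mean over a range of exposures, and deviations from the shot-noise line are exactly what expose CCE losses.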

  17. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. 
With a growing user base and maturing scientific user community come new requests for algorithms and processing capabilities. To address this demand, OT is developing an extensible service based architecture for integrating community-developed software. This "pluggable" approach to Web service deployment will enable new processing and analysis tools to run collocated with OT hosted data.

  18. Technology Tools for the Tough Tasks: Plug in for Great Outcomes

    ERIC Educational Resources Information Center

    Simon, Fran

    2012-01-01

    There are a lot of easy-to-use online tools that can help teachers and administrators with the tough tasks involved in running efficient, responsive, and intentional programs. The efficiencies offered through these systems allow busy educators to spend less time managing information and more time doing the work that matters the most--working with…

  19. Data Mining Tools Make Flights Safer, More Efficient

    NASA Technical Reports Server (NTRS)

    2014-01-01

    A small data mining team at Ames Research Center developed a set of algorithms ideal for combing through flight data to find anomalies. Dallas-based Southwest Airlines Co. signed a Space Act Agreement with Ames in 2011 to access the tools, helping the company refine its safety practices, improve its safety reviews, and increase flight efficiencies.

  20. Adaptive Management Tools for Nitrogen: Nitrogen Index, Nitrogen Trading Tool and Nitrogen Losses Environmental Assessment Package (NLEAP-GIS)

    USDA-ARS?s Scientific Manuscript database

    Average nitrogen (N) use efficiencies are approximately fifty percent and can be even lower for shallower rooted systems grown on irrigated sandy soils. These low N use efficiencies need to be increased if reactive N losses to the environment are to be reduced. Recently, USDA-NRCS identified Adapt...

  1. Monitoring Java Programs with Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

    We present recent work on the development of Java PathExplorer (JPAX), a tool for monitoring the execution of Java programs. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems. The tool facilitates automated instrumentation of a program's bytecode, which will then emit events to an observer during its execution. The observer checks the events against user-provided high level requirement specifications, for example temporal logic formulae, and against lower level error detection procedures, for example concurrency-related algorithms such as deadlock and data race detection. High level requirement specifications together with their underlying logics are defined in the Maude rewriting logic, and can then either be directly checked using the Maude rewriting engine, or be first translated to efficient data structures and then checked in Java.
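    The observer pattern described above can be illustrated with a tiny monitor for one temporal property, "every request is eventually followed by a response", checked over a stream of emitted events. This is a generic sketch in the spirit of such tools; the event vocabulary is purely illustrative and the Maude-defined logics of JPAX are not reproduced:

    ```python
    def monitor(trace, request="request", response="response"):
        """Runtime monitor for the temporal property 'every request is
        eventually followed by a response' over a finite event trace.
        Returns True iff no request is left unanswered at trace end."""
        pending = 0
        for event in trace:
            if event == request:
                pending += 1
            elif event == response and pending:
                pending -= 1
        return pending == 0

    print(monitor(["request", "work", "response"]))            # True
    print(monitor(["request", "response", "request", "end"]))  # False
    ```

    An instrumented program would feed such a monitor incrementally, flagging a violation as soon as the trace ends (or a deadline passes) with requests still pending.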

  2. Hypersonic Vehicle Propulsion System Control Model Development Roadmap and Activities

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Le, Dzu K.; Vrnak, Daniel R.

    2009-01-01

    The NASA Fundamental Aeronautics Program Hypersonic project is directed towards fundamental research for two classes of hypersonic vehicles: highly reliable reusable launch systems (HRRLS) and high-mass Mars entry systems (HMMES). The objective of the hypersonic guidance, navigation, and control (GN&C) discipline team is to develop advanced guidance and control algorithms to enable efficient and effective operation of these challenging vehicles. The ongoing work at the NASA Glenn Research Center supports the hypersonic GN&C effort in developing tools to aid the design of advanced control algorithms that specifically address the propulsion system of the HRRLS-class vehicles. These tools are being developed in conjunction with complementary research and development activities in hypersonic propulsion at Glenn and elsewhere. This report is focused on obtaining control-relevant dynamic models of an HRRLS-type hypersonic vehicle propulsion system.

  3. Optimizing laser beam profiles using micro-lens arrays for efficient material processing: applications to solar cells

    NASA Astrophysics Data System (ADS)

    Hauschild, Dirk; Homburg, Oliver; Mitra, Thomas; Ivanenko, Mikhail; Jarczynski, Manfred; Meinschien, Jens; Bayer, Andreas; Lissotschenko, Vitalij

    2009-02-01

    High power laser sources are used in various production tools for microelectronic products and solar cells, including the applications annealing, lithography, edge isolation, as well as dicing and patterning. Besides the right choice of laser source, suitable high-performance optics for generating the appropriate beam profile and intensity distribution are of great importance for the right processing speed, quality, and yield. Equally important for industrial applications is an adequate understanding of the physics of the light-matter interaction behind the process. Advance simulations of tool performance can minimize technical and financial risk as well as lead times for prototyping and introduction into series production. LIMO has developed its own software, founded on the Maxwell equations, taking into account all important physical aspects of the laser based process: the light source, the beam shaping optical system, and the light-matter interaction. Based on this knowledge, together with a unique free-form micro-lens array production technology and patented micro-optics beam shaping designs, a number of novel solar cell production tool sub-systems have been built. The basic functionalities, design principles, and performance results are presented with a special emphasis on resilience, cost reduction, and process reliability.

  4. High-throughput two-dimensional root system phenotyping platform facilitates genetic analysis of root growth and development.

    PubMed

    Clark, Randy T; Famoso, Adam N; Zhao, Keyan; Shaff, Jon E; Craft, Eric J; Bustamante, Carlos D; McCouch, Susan R; Aneshansley, Daniel J; Kochian, Leon V

    2013-02-01

    High-throughput phenotyping of root systems requires a combination of specialized techniques and adaptable plant growth, root imaging and software tools. A custom phenotyping platform was designed to capture images of whole root systems, and novel software tools were developed to process and analyse these images. The platform and its components are adaptable to a wide range of root phenotyping studies using diverse growth systems (hydroponics, paper pouches, gel and soil) involving several plant species, including, but not limited to, rice, maize, sorghum, tomato and Arabidopsis. The RootReader2D software tool is free and publicly available and was designed with both user-guided and automated features that increase flexibility and enhance efficiency when measuring root growth traits from specific roots or entire root systems during large-scale phenotyping studies. To demonstrate the unique capabilities and high-throughput capacity of this phenotyping platform for studying root systems, genome-wide association studies on rice (Oryza sativa) and maize (Zea mays) root growth were performed and root traits related to aluminium (Al) tolerance were analysed on the parents of the maize nested association mapping (NAM) population. © 2012 Blackwell Publishing Ltd.

  5. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.
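At its core, clone sequence verification reduces to comparing an assembled contig against the expected reference and reporting identity. A minimal sketch of that comparison step (not the WebPrInSeS implementation, which performs full de novo assembly from HTS reads first):

```python
# Toy clone-verification step: percent identity between an assembled
# contig and the expected ORF reference. Sequences are made up.

def percent_identity(ref, contig):
    """Identity over the shared prefix length (no gapped alignment)."""
    n = min(len(ref), len(contig))
    matches = sum(1 for a, b in zip(ref[:n], contig[:n]) if a == b)
    return 100.0 * matches / n

ref    = "ATGGCTAAAGGCTAA"
contig = "ATGGCTAAAGGATAA"  # one mismatch over 15 bases
print(round(percent_identity(ref, contig), 1))
```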

  6. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  7. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruebel, Oliver

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the growing number of data dimensions and data objects presents tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework, MATLAB, and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analyses with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams, enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.
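The kind of selection such an index/query system accelerates can be expressed as a simple threshold query over particle attributes. The sketch below is a plain-Python stand-in with made-up data, not the FastBit API.

```python
# Illustrative stand-in for an index/query selection (not FastBit itself):
# pull "beam" particles from LWFA-like simulation output via a momentum cut.

particles = [
    {"id": 0, "x": 1.2, "px": 0.5},
    {"id": 1, "x": 3.4, "px": 9.8},
    {"id": 2, "x": 2.1, "px": 11.3},
    {"id": 3, "x": 0.7, "px": 0.2},
]

def select(records, key, threshold):
    """Equivalent of the query 'key > threshold' over a particle dataset."""
    return [r["id"] for r in records if r[key] > threshold]

beam = select(particles, "px", 9.0)
print(beam)  # ids of the high-momentum particles: [1, 2]
```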

  8. Impact and Penetration Simulations for Composite Wing-like Structures

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project was to develop methodologies for the analysis of wing-like structures subjected to impact loadings. Low-speed impact causing no damage or only minimal damage, and high-speed impact causing severe laminate damage and possible penetration of the structure, were both to be considered during this research effort. To address this goal, an assessment of current analytical tools for impact analysis was performed. The analytical tools were assessed with regard to accuracy, modeling requirements, and damage modeling, as well as robustness, efficiency, and ease of use in a wing design environment. Following the qualitative assessment, selected quantitative evaluations were to be performed using the leading simulation tools. Based on this assessment, future research thrusts for impact and penetration simulation of composite wing-like structures were identified.

  9. Research-IQ: Development and Evaluation of an Ontology-anchored Integrative Query Tool

    PubMed Central

    Borlawsky, Tara B.; Lele, Omkar; Payne, Philip R. O.

    2011-01-01

    Investigators in the translational research and systems medicine domains require highly usable, efficient and integrative tools and methods that allow for the navigation of and reasoning over emerging large-scale data sets. Such resources must cover a spectrum of granularity from bio-molecules to population phenotypes. Given such information needs, we report upon the initial design and evaluation of an ontology-anchored integrative query tool, Research-IQ, which employs a combination of conceptual knowledge engineering and information retrieval techniques to enable the intuitive and rapid construction of queries, in terms of semi-structured textual propositions, that can subsequently be applied to integrative data sets. Our initial results, based upon both quantitative and qualitative evaluations of the efficacy and usability of Research-IQ, demonstrate its potential to increase clinical and translational research throughput. PMID:21821150

  10. Successes and Challenges of Incompressible Flow Simulation

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2003-01-01

    During the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the CFD discipline. Even though incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become indispensable in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during the same period, and discusses some of the current challenges.

  11. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and to leverage existing commercial as well as in-house codes, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. Once the structural analysis discipline is finalized and completely integrated into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities, formulated as optimization problems, have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  12. Scenario Tools For Efficient Eutrophication Management

    NASA Astrophysics Data System (ADS)

    Arheimer, B.; Vastra SP3 Team

    Several possible measures are available to reduce diffuse (non-point source) nutrient load to surface water and thereby reduce eutrophication. Such measures include changed arable practices and constructions of wetlands and buffer zones in the landscape, as well as managing lake ecosystems. In some cases, such as for wetlands, there is an intense debate regarding the efficiency of their nutrient reducing capability. In addition, the combined effect of several measures in a catchment is not necessarily equal to their sum. It is therefore important to apply a holistic and integrated catchment approach when applying and evaluating different management strategies. To facilitate such catchment analyses, the Swedish water management research programme (VASTRA) develops modelling tools addressing both phosphorus (P) and nitrogen (N) dynamics in catchments. During the last three years decision support tools for N management in rivers and lakes have been developed (e.g., HBV-N, BIOLA) and applied in scenarios to demonstrate the effect of various reducing measures. At present, similar tools for P are under development. This presentation will demonstrate the VASTRA tool-box and its applications for efficient eutrophication management.

  13. High sample throughput genotyping for estimating C-lineage introgression in the dark honeybee: an accurate and cost-effective SNP-based tool.

    PubMed

    Henriques, Dora; Browne, Keith A; Barnett, Mark W; Parejo, Melanie; Kryger, Per; Freeman, Tom C; Muñoz, Irene; Garnery, Lionel; Highet, Fiona; Jonhston, J Spencer; McCormack, Grace P; Pinto, M Alice

    2018-06-04

    The natural distribution of the honeybee (Apis mellifera L.) has been changed by humans in recent decades to such an extent that the formerly widest-spread European subspecies, Apis mellifera mellifera, is threatened by extinction through introgression from highly divergent commercial strains in large tracts of its range. Conservation efforts for A. m. mellifera are underway in multiple European countries requiring reliable and cost-efficient molecular tools to identify purebred colonies. Here, we developed four ancestry-informative SNP assays for high sample throughput genotyping using the iPLEX Mass Array system. Our customized assays were tested on DNA from individual and pooled, haploid and diploid honeybee samples extracted from different tissues using a diverse range of protocols. The assays had a high genotyping success rate and yielded accurate genotypes. Performance assessed against whole-genome data showed that individual assays behaved well, although the most accurate introgression estimates were obtained for the four assays combined (117 SNPs). The best compromise between accuracy and genotyping costs was achieved when combining two assays (62 SNPs). We provide a ready-to-use cost-effective tool for accurate molecular identification and estimation of introgression levels to more effectively monitor and manage A. m. mellifera conservatories.
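A standard allele-frequency-based hybrid index gives the flavor of such introgression estimation: for each ancestry-informative SNP, compare the observed allele frequency with the frequencies in the two pure lineages and average over SNPs. This is a generic sketch with made-up frequencies, not the authors' iPLEX pipeline or panel values.

```python
# Simple hybrid-index sketch: estimate C-lineage introgression from
# ancestry-informative SNPs. For SNP i, with reference-allele frequencies
# pM (pure A. m. mellifera) and pC (C-lineage), the per-SNP admixture
# estimate is (p_obs - pM) / (pC - pM), averaged over SNPs.
# All frequencies below are illustrative.

def introgression(p_obs, p_mellifera, p_clineage):
    estimates = [
        (po - pm) / (pc - pm)
        for po, pm, pc in zip(p_obs, p_mellifera, p_clineage)
    ]
    return sum(estimates) / len(estimates)

q = introgression(p_obs=[0.30, 0.25],
                  p_mellifera=[0.10, 0.05],
                  p_clineage=[0.90, 0.85])
print(round(q, 3))  # estimated introgressed proportion
```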

  14. Phaedra, a protocol-driven system for analysis and validation of high-content imaging and flow cytometry.

    PubMed

    Cornelissen, Frans; Cik, Miroslav; Gustin, Emmanuel

    2012-04-01

    High-content screening has brought new dimensions to cellular assays by generating rich data sets that characterize cell populations in great detail and detect subtle phenotypes. To derive relevant, reliable conclusions from these complex data, it is crucial to have informatics tools supporting quality control, data reduction, and data mining. These tools must reconcile the complexity of advanced analysis methods with the user-friendliness demanded by the user community. After review of existing applications, we realized the possibility of adding innovative new analysis options. Phaedra was developed to support workflows for drug screening and target discovery, interact with several laboratory information management systems, and process data generated by a range of techniques including high-content imaging, multicolor flow cytometry, and traditional high-throughput screening assays. The application is modular and flexible, with an interface that can be tuned to specific user roles. It offers user-friendly data visualization and reduction tools for HCS but also integrates Matlab for custom image analysis and the Konstanz Information Miner (KNIME) framework for data mining. Phaedra features efficient JPEG2000 compression and full drill-down functionality from dose-response curves down to individual cells, with exclusion and annotation options, cell classification, statistical quality controls, and reporting.
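Two staples of the data reduction and statistical quality control such a system performs are control-based normalization and the Z' plate-quality statistic. A generic sketch of both (not Phaedra's implementation):

```python
import statistics

# Generic HCS plate-data reduction: normalize raw well values to percent
# effect using on-plate controls, and compute the Z' quality statistic
# commonly used to accept or reject a screening plate.

def percent_effect(raw, neg_ctrl_mean, pos_ctrl_mean):
    """0% at the negative control, 100% at the positive control."""
    return 100.0 * (raw - neg_ctrl_mean) / (pos_ctrl_mean - neg_ctrl_mean)

def z_prime(neg_vals, pos_vals):
    """Z' = 1 - 3*(sd_neg + sd_pos) / |mean_pos - mean_neg|."""
    mu_n, sd_n = statistics.mean(neg_vals), statistics.stdev(neg_vals)
    mu_p, sd_p = statistics.mean(pos_vals), statistics.stdev(pos_vals)
    return 1.0 - 3.0 * (sd_n + sd_p) / abs(mu_p - mu_n)

print(percent_effect(55.0, 10.0, 100.0))                   # halfway: 50.0
print(round(z_prime([9, 10, 11], [99, 100, 101]), 3))      # well-separated plate
```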

  15. Online tools for individuals with depression and neurologic conditions: A scoping review.

    PubMed

    Lukmanji, Sara; Pham, Tram; Blaikie, Laura; Clark, Callie; Jetté, Nathalie; Wiebe, Samuel; Bulloch, Andrew; Holroyd-Leduc, Jayna; Macrodimitris, Sophia; Mackie, Aaron; Patten, Scott B

    2017-08-01

    Patients with neurologic conditions commonly have depression. Online tools have the potential to improve outcomes in these patients in an efficient and accessible manner. We aimed to identify evidence-informed online tools for patients with comorbid neurologic conditions and depression. A scoping review of online tools (free, publicly available, and not requiring a facilitator) for patients with depression and epilepsy, Parkinson disease (PD), multiple sclerosis (MS), traumatic brain injury (TBI), or migraine was conducted. MEDLINE, EMBASE, PsycINFO, Cochrane Database of Systematic Reviews, and Cochrane CENTRAL Register of Controlled Trials were searched from database inception to January 2017 for all 5 neurologic conditions. Gray literature was searched using Google and Google Scholar, as well as app stores for both Android and Apple devices. Self-management or self-efficacy online tools were not included unless they were specifically targeted at depression and one of the neurologic conditions and met the other eligibility criteria. Only 4 online tools were identified. Of these 4 tools, 2 were web-based self-management programs for patients with migraine or MS and depression. The other 2 were mobile apps for patients with PD or TBI and depression. No online tools were found for epilepsy. There are limited depression tools for people with neurologic conditions that are evidence-informed, publicly available, and free. Future research should focus on the development of high-quality, evidence-based online tools targeted at neurologic patients.

  16. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    PubMed

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.

  17. Study on boring hardened materials dryly by ultrasonic vibration cutter

    NASA Astrophysics Data System (ADS)

    Zhang, Jiangzhong; Zhang, Heng; Zhang, Yue

    2011-05-01

    Machining high-precision holes in hardened materials has long been difficult. An ultrasonic vibration boring acoustic system, in which ultrasonic energy is applied to the tool on the lathe, is introduced to create a pulsed force in the cutting process; this pulsed force achieves separation-type vibration cutting. Comparative tests of boring accuracy and surface quality were carried out, and the surface quality obtained with this method is comparable to that achieved by grinding. The process is a form of green (dry) cutting, and the boring process system is stable. When the cutting speed is at most one third of the tool vibration speed, the cutting force becomes a pulsed force and the cutting energy is highly concentrated in time, space and direction; each pulse acts on the cutting unit for less than one ten-thousandth of a second. The irregular elastic compression of traditional cutting is eliminated, the cutting force is greatly reduced, the cutting temperature remains near room temperature, tool life is greatly increased, and shape precision and surface quality are greatly improved. The regularities governing ultrasonic vibration dry boring of hardened materials are also summarized. The test results show that ultrasonic vibration boring offers a very favorable cutting mechanism and is an efficient, high-precision method for deep-hole machining of hardened materials.
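The separation condition stated in the abstract (cutting speed at most one third of the tool vibration speed) yields a quick design rule: the peak vibration speed of a sinusoidally vibrating tool is 2πfa for frequency f and amplitude a. The frequency and amplitude below are typical ultrasonic figures chosen for illustration, not the paper's test parameters.

```python
import math

# Design-rule sketch for separation-type vibration cutting: the cutting
# speed must satisfy v_c <= v_tool / 3, where the peak tool vibration
# speed is v_tool = 2*pi*f*a (frequency f, amplitude a).

def max_cutting_speed_m_min(freq_hz, amplitude_um):
    v_tool = 2.0 * math.pi * freq_hz * amplitude_um * 1e-6  # peak speed, m/s
    return (v_tool / 3.0) * 60.0                            # limit, m/min

# e.g. 20 kHz excitation with 15 um amplitude
print(round(max_cutting_speed_m_min(20e3, 15.0), 1))  # ~37.7 m/min
```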

  18. Stop the Bleeding: the Development of a Tool to Streamline NASA Earth Science Metadata Curation Efforts

    NASA Astrophysics Data System (ADS)

    le Roux, J.; Baker, A.; Caltagirone, S.; Bugbee, K.

    2017-12-01

    The Common Metadata Repository (CMR) is a high-performance, high-quality repository for Earth science metadata records, and serves as the primary way to search NASA's growing 17.5 petabytes of Earth science data holdings. Released in 2015, CMR has the capability to support several different metadata standards already being utilized by NASA's combined network of Earth science data providers, or Distributed Active Archive Centers (DAACs). The Analysis and Review of CMR (ARC) Team located at Marshall Space Flight Center is working to improve the quality of records already in CMR with the goal of making records optimal for search and discovery. This effort entails a combination of automated and manual review, where each NASA record in CMR is checked for completeness, accuracy, and consistency. This effort is highly collaborative in nature, requiring communication and transparency of findings amongst NASA personnel, DAACs, the CMR team and other metadata curation teams. Through the evolution of this project it has become apparent that there is a need to document and report findings, as well as track metadata improvements in a more efficient manner. The ARC team has collaborated with Element 84 in order to develop a metadata curation tool to meet these needs. In this presentation, we will provide an overview of this metadata curation tool and its current capabilities. Challenges and future plans for the tool will also be discussed.
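A toy version of the automated completeness check described above can be written as a required-field scan over a metadata record. The field names here are illustrative, not the actual CMR/UMM schema.

```python
# Hypothetical metadata completeness check (field names are made up,
# not the CMR schema): report required fields that are missing or empty
# so a curator knows what needs attention.

REQUIRED = ("title", "abstract", "temporal_extent", "spatial_extent", "doi")

def completeness_issues(record):
    return [f for f in REQUIRED if not record.get(f)]

record = {
    "title": "Example Dataset",
    "abstract": "Daily gridded example measurements.",
    "temporal_extent": "2000-01-01/2010-12-31",
    "spatial_extent": "",   # present but empty -> flagged
    # "doi" missing entirely -> flagged
}
print(completeness_issues(record))  # ['spatial_extent', 'doi']
```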

  19. IT infrastructure in the era of imaging 3.0.

    PubMed

    McGinty, Geraldine B; Allen, Bibb; Geis, J Raymond; Wald, Christoph

    2014-12-01

    Imaging 3.0 is a blueprint for the future of radiology modeled after the description of Web 3.0 as "more connected, more open, and more intelligent." Imaging 3.0 involves radiologists' using their expertise to manage all aspects of imaging care to improve patient safety and outcomes and to deliver high-value care. IT tools are critical elements and drivers of success as radiologists embrace the concepts of Imaging 3.0. Organized radiology, specifically the ACR, is the natural convener and resource for the development of this Imaging 3.0 toolkit. The ACR's new Imaging 3.0 Informatics Committee is actively working to develop the informatics tools radiologists need to improve efficiency, deliver more value, and provide quantitative ways to demonstrate their value in new health care delivery and payment systems. This article takes each step of the process of delivering high-value Imaging 3.0 care and outlines the tools available as well as additional resources available to support practicing radiologists. From the moment when imaging is considered through the delivery of a meaningful and actionable report that is communicated to the referring clinician and, when appropriate, to the patient, Imaging 3.0 IT tools will enable radiologists to position themselves as vital constituents in cost-effective, high-value health care. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  20. Sequence features associated with the cleavage efficiency of CRISPR/Cas9 system.

    PubMed

    Liu, Xiaoxi; Homma, Ayaka; Sayadi, Jamasb; Yang, Shu; Ohashi, Jun; Takumi, Toru

    2016-01-27

    The CRISPR-Cas9 system has recently emerged as a versatile tool for biological and medical research. In this system, a single guide RNA (sgRNA) directs the endonuclease Cas9 to a targeted DNA sequence for site-specific manipulation. In addition to this targeting function, the sgRNA has also been shown to play a role in activating the endonuclease activity of Cas9. This dual function of the sgRNA likely underlies observations that different sgRNAs have varying on-target activities. Currently, our understanding of the relationship between sequence features of sgRNAs and their on-target cleavage efficiencies remains limited, largely due to difficulties in assessing the cleavage capacity of a large number of sgRNAs. In this study, we evaluated the cleavage activities of 218 sgRNAs using in vitro Surveyor assays. We found that nucleotides at both PAM-distal and PAM-proximal regions of the sgRNA are significantly correlated with on-target efficiency. Furthermore, we also demonstrated that the genomic context of the targeted DNA, the GC percentage, and the secondary structure of sgRNA are critical factors contributing to cleavage efficiency. In summary, our study reveals important parameters for the design of sgRNAs with high on-target efficiencies, especially in the context of high throughput applications.
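Two of the sequence features the study examines, GC percentage and the identity of PAM-proximal bases, are straightforward to extract. A minimal feature-extraction sketch (the paper's predictive analysis is statistical; the guide sequence below is hypothetical):

```python
# Minimal sgRNA feature extraction along the lines examined in such
# studies: GC percentage and the PAM-proximal nucleotides of a 20-nt
# protospacer. The guide sequence is made up for illustration.

def gc_percent(seq):
    seq = seq.upper()
    return 100.0 * sum(1 for b in seq if b in "GC") / len(seq)

def pam_proximal(seq, n=4):
    """Last n bases of the protospacer, adjacent to the NGG PAM."""
    return seq.upper()[-n:]

guide = "GACGTTAGCCTAGGATCGAT"  # hypothetical 20-nt guide
print(gc_percent(guide), pam_proximal(guide))  # 50.0 CGAT
```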

  1. An improved yeast transformation method for the generation of very large human antibody libraries.

    PubMed

    Benatuil, Lorenzo; Perez, Jennifer M; Belk, Jonathan; Hsieh, Chung-Ming

    2010-04-01

    Antibody library selection by yeast display technology is an efficient and highly sensitive method to identify binders to target antigens. This powerful selection tool, however, is often hampered by the typically modest size of yeast libraries (approximately 10^7) due to the limited yeast transformation efficiency, and the full potential of the yeast display technology for antibody discovery and engineering can only be realized if it can be coupled with a means to generate very large yeast libraries. We describe here a yeast transformation method by electroporation that allows for the efficient generation of large antibody libraries up to 10^10 in size. Multiple components and conditions including CaCl2, MgCl2, sucrose, sorbitol, lithium acetate, dithiothreitol, electroporation voltage, DNA input and cell volume have been tested to identify the best combination. By applying the developed protocol, we have constructed a 1.4 x 10^10 human spleen antibody library essentially in 1 day with a transformation efficiency of 1-1.5 x 10^8 transformants/µg vector DNA. Taken together, we have developed a highly efficient yeast transformation method that enables the generation of very large and productive human antibody libraries for antibody discovery, and we now routinely make 10^9-member libraries in a day for antibody engineering purposes.
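The arithmetic behind such library-size figures is simple to check: total transformants scale as efficiency × DNA input per reaction × number of electroporations. The reaction count and DNA amount below are assumptions for illustration, not the published protocol values.

```python
# Back-of-envelope library-size check:
#   transformants = efficiency (per ug DNA) * ug DNA per reaction * reactions
# Reaction count and DNA input are illustrative assumptions.

def library_size(eff_per_ug, ug_dna_per_rxn, n_reactions):
    return eff_per_ug * ug_dna_per_rxn * n_reactions

# e.g. 1.4e8 transformants/ug, 10 ug per cuvette, 10 cuvettes
print(library_size(1.4e8, 10, 10))  # 1.4e10 transformants
```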

  2. Demonstration of Efficient Nonreciprocity in a Microwave Optomechanical Circuit*

    NASA Astrophysics Data System (ADS)

    Peterson, G. A.; Lecocq, F.; Cicak, K.; Simmonds, R. W.; Aumentado, J.; Teufel, J. D.

    2017-07-01

    The ability to engineer nonreciprocal interactions is an essential tool in modern communication technology as well as a powerful resource for building quantum networks. Aside from large reverse isolation, a nonreciprocal device suitable for applications must also have high efficiency (low insertion loss) and low output noise. Recent theoretical and experimental studies have shown that nonreciprocal behavior can be achieved in optomechanical systems, but performance in these last two attributes has been limited. Here, we demonstrate an efficient, frequency-converting microwave isolator based on the optomechanical interactions between electromagnetic fields and a mechanically compliant vacuum-gap capacitor. We achieve simultaneous reverse isolation of more than 20 dB and insertion loss less than 1.5 dB. We characterize the nonreciprocal noise performance of the device, observing that the residual thermal noise from the mechanical environments is routed solely to the input of the isolator. Our measurements show quantitative agreement with a general coupled-mode theory. Unlike conventional isolators and circulators, these compact nonreciprocal devices do not require a static magnetic field, and they allow for dynamic control of the direction of isolation. With these advantages, similar devices could enable programmable, high-efficiency connections between disparate nodes of quantum networks, even efficiently bridging the microwave and optical domains.
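The reported figures translate directly into linear power ratios via ratio = 10^(−dB/10): under 1.5 dB insertion loss means roughly 70% of the forward power is transmitted, while over 20 dB reverse isolation means under 1% leaks backward. A quick conversion helper:

```python
# Convert decibel figures into linear power fractions:
#   fraction = 10 ** (-dB / 10)

def db_to_fraction(db):
    return 10.0 ** (-db / 10.0)

forward = db_to_fraction(1.5)   # <1.5 dB insertion loss -> ~71% transmitted
reverse = db_to_fraction(20.0)  # >20 dB isolation -> 1% backward leakage
print(round(forward, 3), reverse)
```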

  3. High Sensitivity Combined with Extended Structural Coverage of Labile Compounds via Nanoelectrospray Ionization at Subambient Pressures

    DOE PAGES

    Cox, Jonathan T.; Kronewitter, Scott R.; Shukla, Anil K.; ...

    2014-09-15

    Subambient pressure ionization with nanoelectrospray (SPIN) has proven to be effective in producing ions with high efficiency and transmitting them to low pressures for high sensitivity mass spectrometry (MS) analysis. Here we present evidence that not only does the SPIN source improve MS sensitivity but also allows for gentler ionization conditions. The gentleness of a conventional heated capillary electrospray ionization (ESI) source and the SPIN source was compared by the liquid chromatography mass spectrometry (LC-MS) analysis of colominic acid. Colominic acid is a mixture of sialic acid polymers of different lengths containing labile glycosidic linkages between monomer units, necessitating a gentle ion source. By coupling the SPIN source with high resolution mass spectrometry and using advanced data processing tools, we demonstrate much extended coverage of sialic acid polymer chains as compared to using the conventional ESI source. Additionally we show that SPIN-LC-MS is effective in elucidating polymer features with high efficiency and high sensitivity previously unattainable by the conventional ESI-LC-MS methods.

  4. Edge control in a computer controlled optical surfacing process using a heterocercal tool influence function.

    PubMed

    Hu, Haixiang; Zhang, Xin; Ford, Virginia; Luo, Xiao; Qi, Erhui; Zeng, Xuefeng; Zhang, Xuejun

    2016-11-14

    Edge effect is regarded as one of the most difficult technical issues in a computer controlled optical surfacing (CCOS) process. Traditional opticians have to balance the consequences of the two following cases. Operating CCOS in a large overhang condition affects the accuracy of material removal, while in a small overhang condition, it achieves a more accurate performance, but leaves a narrow rolled-up edge, which takes time and effort to remove. In order to control the edge residuals in the latter case, we present a new concept of the 'heterocercal' tool influence function (TIF). Generated from compound motion equipment, this type of TIF can 'transfer' the material removal from the inner region to the edge, meanwhile maintaining the high accuracy and efficiency of CCOS. We call it the 'heterocercal' TIF because of the inspiration from the heterocercal tails of sharks, whose upper lobe provides most of the propulsive power. The heterocercal TIF was theoretically analyzed, and physically realized in CCOS facilities. Experimental and simulation results showed good agreement. It enables significant control of the edge effect and convergence of entire surface errors in large tool-to-mirror size-ratio conditions. This improvement will largely help manufacturing efficiency in some extremely large optical system projects, like the tertiary mirror of the Thirty Meter Telescope.
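Underlying any CCOS dwell-time scheme, including edge-control strategies, is the relation that predicted material removal is the convolution of the tool influence function with the dwell-time map. A 1-D discrete sketch of that relation (generic, not the authors' heterocercal TIF model):

```python
# Core CCOS relation in 1-D: removal = TIF (*) dwell time, a full
# discrete convolution. Values are arbitrary illustrative units.

def removal(tif, dwell):
    """Full discrete convolution of a 1-D TIF with a dwell-time profile."""
    out = [0.0] * (len(tif) + len(dwell) - 1)
    for i, t in enumerate(tif):
        for j, d in enumerate(dwell):
            out[i + j] += t * d
    return out

tif = [0.1, 0.6, 0.1]    # removal-rate footprint of the tool
dwell = [1.0, 2.0, 1.0]  # time spent at each position
print([round(r, 6) for r in removal(tif, dwell)])  # [0.1, 0.8, 1.4, 0.8, 0.1]
```

Note how the removal profile extends beyond the dwell positions by the footprint half-width on each side, which is exactly where overhang and edge effects originate.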

  5. EggLib: processing, analysis and simulation tools for population genetics and genomics

    PubMed Central

    2012-01-01

    Background: With the considerable growth of available nucleotide sequence data over the last decade, integrated and flexible analytical tools have become a necessity. In particular, in the field of population genetics, there is a strong need for automated and reliable procedures to conduct repeatable and rapid polymorphism analyses, coalescent simulations, data manipulation and estimation of demographic parameters under a variety of scenarios. Results: In this context, we present EggLib (Evolutionary Genetics and Genomics Library), a flexible and powerful C++/Python software package providing efficient and easy to use computational tools for sequence data management and extensive population genetic analyses on nucleotide sequence data. EggLib is a multifaceted project involving several integrated modules: an underlying computationally efficient C++ library (which can be used independently in pure C++ applications); two C++ programs; a Python package providing, among other features, a high level Python interface to the C++ library; and the egglib script which provides direct access to pre-programmed Python applications. Conclusions: EggLib has been designed aiming to be both efficient and easy to use. A wide array of methods are implemented, including file format conversion, sequence alignment editing, coalescent simulations, neutrality tests and estimation of demographic parameters by Approximate Bayesian Computation (ABC). Classes implementing different demographic scenarios for ABC analyses can easily be developed by the user and included in the package. EggLib source code is distributed freely under the GNU General Public License (GPL) from its website http://egglib.sourceforge.net/ where a full documentation and a manual can also be found and downloaded. PMID:22494792

  6. EggLib: processing, analysis and simulation tools for population genetics and genomics.

    PubMed

    De Mita, Stéphane; Siol, Mathieu

    2012-04-11

    With the considerable growth of available nucleotide sequence data over the last decade, integrated and flexible analytical tools have become a necessity. In particular, in the field of population genetics, there is a strong need for automated and reliable procedures to conduct repeatable and rapid polymorphism analyses, coalescent simulations, data manipulation and estimation of demographic parameters under a variety of scenarios. In this context, we present EggLib (Evolutionary Genetics and Genomics Library), a flexible and powerful C++/Python software package providing efficient and easy-to-use computational tools for sequence data management and extensive population genetic analyses on nucleotide sequence data. EggLib is a multifaceted project involving several integrated modules: an underlying computationally efficient C++ library (which can be used independently in pure C++ applications); two C++ programs; a Python package providing, among other features, a high-level Python interface to the C++ library; and the egglib script, which provides direct access to pre-programmed Python applications. EggLib has been designed with the aim of being both efficient and easy to use. A wide array of methods is implemented, including file format conversion, sequence alignment editing, coalescent simulations, neutrality tests and estimation of demographic parameters by Approximate Bayesian Computation (ABC). Classes implementing different demographic scenarios for ABC analyses can easily be developed by the user and included in the package. EggLib source code is distributed freely under the GNU General Public License (GPL) from its website http://egglib.sourceforge.net/ where a full documentation and a manual can also be found and downloaded.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horiike, S.; Okazaki, Y.

    This paper describes a performance estimation tool developed for modeling and simulation of open distributed energy management systems to support their design. The approach of discrete event simulation with detailed models is adopted for efficient performance estimation. The tool includes basic models constituting a platform, e.g., Ethernet, communication protocols, operating systems, etc. Application software is modeled by specifying CPU time, disk access size, communication data size, etc. Different types of system configurations for various system activities can be easily studied. Simulation examples show how the tool is utilized for the efficient design of open distributed energy management systems.
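
    A minimal discrete-event sketch of the idea (the job model and parameters here are illustrative, not those of the actual tool): each application consumes CPU time, then its message must serialise on a shared network link, and completion times fall out of the event ordering:

```python
import heapq

def simulate(jobs, cpu_speed, net_bw):
    """Each job is (cpu_work, data_size). Jobs run one after another on a
    single CPU, then their transfers queue on one shared network link."""
    t = 0.0
    events = []
    for cpu_work, data in jobs:
        t += cpu_work / cpu_speed          # compute phase finishes here
        heapq.heappush(events, (t, data))  # transfer request becomes ready
    link_free = 0.0
    finish = []
    while events:
        ready, data = heapq.heappop(events)
        start = max(ready, link_free)      # wait for the shared link
        link_free = start + data / net_bw
        finish.append(link_free)
    return finish
```

    Swapping in different platform models (protocol overheads, disk access, multiple links) is a matter of adding event types to the queue.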

  8. The effect of ergonomic laparoscopic tool handle design on performance and efficiency.

    PubMed

    Tung, Kryztopher D; Shorti, Rami M; Downey, Earl C; Bloswick, Donald S; Merryweather, Andrew S

    2015-09-01

    Many factors can affect a surgeon's performance in the operating room; these may include surgeon comfort, ergonomics of tool handle design, and fatigue. A laparoscopic tool handle designed with ergonomic considerations (pistol grip) was tested against a current market tool with a traditional pinch grip handle. The goal of this study is to quantify the impact that ergonomic design considerations have on surgeon performance. We hypothesized that there would be measurable differences between the two tool handle designs when performing FLS surgical trainer tasks in three categories: time to completion, technical skill, and subjective user ratings. The pistol grip incorporates an ergonomic interface intended to reduce contact stress points on the hand and fingers, promote a more neutral operating wrist posture, and reduce hand tremor and fatigue. The traditional pinch grip is a laparoscopic tool developed by Stryker Inc. that is widely used during minimally invasive surgery. Twenty-three participants (13 M, 10 F) with no existing upper extremity musculoskeletal disorders and no experience performing laparoscopic procedures were selected to participate in this study. During a training session prior to testing, participants performed practice trials in a SAGES FLS trainer with both tools. During data collection, participants performed three evaluation tasks using both handle designs (order was randomized, and each trial was completed three times). The tasks consisted of FLS peg transfer, cutting, and suturing. Feedback from test participants indicated that they significantly preferred the ergonomic pistol grip in every category (p < 0.05); most notably, participants experienced greater discomfort in their hands after using the pinch grip tool. Furthermore, participants completed the cutting and peg transfer tasks in a shorter time (p < 0.05) with the pistol grip than with the pinch grip design; there was no significant difference between completion times for the suturing task. Finally, there was no significant interaction between tool type and errors made during trials. There was a significant preference for, as well as lower pain experienced during use of, the pistol grip tool, as seen from the survey feedback. Both evaluation tasks (cutting and peg transfer) were also completed significantly faster with the pistol grip tool. Finally, due to the high degree of variability in the error data, it was not possible to draw any meaningful conclusions about the effect of tool design on the number or degree of errors made.

  9. Technology Prioritization: Transforming the U.S. Building Stock to Embrace Energy Efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdelaziz, Omar; Farese, Philip; Abramson, Alexis

    2013-01-01

    The U.S. buildings sector is responsible for about 40% of the national energy expenditures. This is due in part to wasteful use of resources and limited consideration of energy efficiency during the design and retrofit phases. Recent studies have indicated the potential for up to 30-50% energy savings in the U.S. buildings sector using currently available technologies. This paper discusses efforts to accelerate the transformation in the U.S. building energy efficiency sector using a new technology prioritization framework. The underlying analysis examines building energy use micro segments using the Energy Information Administration Annual Energy Outlook and other publicly available information. The tool includes a stock-and-flow model to track stock vintage and efficiency levels with time. The tool can be used to investigate energy efficiency measures under a variety of scenarios and has a built-in energy accounting framework to prevent double counting of energy savings within any given portfolio. This tool is developed to inform decision making and estimate long term potential energy savings for different market adoption scenarios.
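
    A toy stock-and-flow sketch (hypothetical retirement rate and efficiency levels, not the tool's actual model) showing how stock vintages and the average efficiency of the remaining stock can be tracked year by year:

```python
def stock_turnover(stock, retire_rate, new_eff, years):
    """stock maps an efficiency level to installed units. Each year a fixed
    fraction of every vintage retires and is replaced at the new efficiency
    level; returns the stock-weighted average efficiency afterwards."""
    for _ in range(years):
        retired = {k: v * retire_rate for k, v in stock.items()}
        for k, v in retired.items():
            stock[k] -= v                       # outflow from each vintage
        # Inflow: all replacements enter at the current best efficiency.
        stock[new_eff] = stock.get(new_eff, 0.0) + sum(retired.values())
    total = sum(stock.values())
    return sum(k * v for k, v in stock.items()) / total
```

    Because every inflow and outflow is booked explicitly, savings from a measure cannot be counted twice across a portfolio — the accounting constraint the abstract describes.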

  10. Diagnosing schistosomiasis: where are we?

    PubMed

    Gomes, Luciana Inácia; Enk, Martin Johannes; Rabello, Ana

    2014-01-01

    In light of the World Health Organization's initiative to extend schistosomiasis morbidity and mortality control programs by including a disease elimination strategy in low endemic settings, this paper reviews diagnostic tools described during the last decades and provides an overview of ongoing efforts to make an efficient diagnostic tool available worldwide. A literature search on PubMed using the search criteria schistosomiasis and diagnosis within the period from 1978 to 2013 was carried out. Articles with an abstract in English that used laboratory techniques specifically developed for the detection of schistosomiasis in humans were included. Publications were categorized according to the methodology applied (parasitological, immunological, or molecular) and stage of development (in-house development, limited field testing, or large scale field testing). The initial search generated 4,535 publications, of which only 643 met the inclusion criteria. The vast majority (537) of the publications focused on immunological techniques; 81 focused on parasitological diagnosis, and 25 focused on molecular diagnostic methods. Regarding the stage of development, 307 papers referred to in-house development, 202 referred to limited field tests, and 134 referred to large scale field testing. The data obtained show that promising new diagnostic tools, especially for Schistosoma antigen and deoxyribonucleic acid (DNA) detection, which are characterized by high sensitivity and specificity, are being developed. In combination with international funding initiatives, these tools may result in a significant step toward successful disease elimination and surveillance, namely making efficient tests accessible and their large-scale use self-sustainable for control programs in endemic countries.

  11. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.

  12. Autism screening and diagnosis in low resource settings: Challenges and opportunities to enhance research and services worldwide

    PubMed Central

    Elsabbagh, Mayada; Barbaro, Josephine; Gladstone, Melissa; Happe, Francesca; Hoekstra, Rosa A.; Lee, Li‐Ching; Rattazzi, Alexia; Stapel‐Wax, Jennifer; Stone, Wendy L.; Tager‐Flusberg, Helen; Thurm, Audrey; Tomlinson, Mark; Shih, Andy

    2015-01-01

    Most research into the epidemiology, etiology, clinical manifestations, diagnosis and treatment of autism is based on studies in high income countries. Moreover, within high income countries, individuals of high socioeconomic status are disproportionately represented among participants in autism research. Corresponding disparities in access to autism screening, diagnosis, and treatment exist globally. One of the barriers perpetuating this imbalance is the high cost of proprietary tools for diagnosing autism and for delivering evidence‐based therapies. Another barrier is the high cost of training of professionals and para‐professionals to use the tools. Open‐source and open access models provide a way to facilitate global collaboration and training. Using these models and technologies, the autism scientific community and clinicians worldwide should be able to work more effectively and efficiently than they have to date to address the global imbalance in autism knowledge and at the same time advance our understanding of autism and our ability to deliver cost‐effective services to everyone in need. Autism Res 2015, 8: 473–476. © 2015 International Society for Autism Research, Wiley Periodicals, Inc. PMID:26437907

  13. Design of new face-centered cubic high entropy alloys by thermodynamic calculation

    NASA Astrophysics Data System (ADS)

    Choi, Won-Mi; Jung, Seungmun; Jo, Yong Hee; Lee, Sunghak; Lee, Byeong-Joo

    2017-09-01

    A new face-centered cubic (fcc) high entropy alloy system with non-equiatomic compositions has been designed by utilizing a CALculation of PHAse Diagram (CALPHAD) type thermodynamic calculation technique. The new alloy system is based on the representative fcc high entropy alloy, the Cantor alloy (an equiatomic Co-Cr-Fe-Mn-Ni five-component alloy), but fully or partly replaces the cobalt with vanadium at non-equiatomic compositions. Alloy compositions expected to have an fcc single-phase structure between 700 °C and the melting temperature are proposed. All the proposed alloys are experimentally confirmed, through X-ray diffraction analysis, to retain the fcc single phase during materials processing (> 800 °C). It is shown that there are more chances to find fcc single-phase high entropy alloys if attention is paid to non-equiatomic composition regions, and that CALPHAD thermodynamic calculation can be an efficient tool for this purpose. An alloy design technique based on thermodynamic calculation is demonstrated, and the applicability and limitations of the approach as a design tool for high entropy alloys are discussed.
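
    The "high entropy" in such alloys refers to the ideal configurational entropy of mixing, ΔS_mix = -R Σ x_i ln x_i, which is maximal at the equiatomic composition. A small worked sketch (the CALPHAD calculation itself evaluates full Gibbs-energy databases, which is far beyond this term):

```python
from math import log

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(fractions):
    """Ideal configurational entropy of mixing for given mole fractions."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    return -R * sum(x * log(x) for x in fractions if x > 0)
```

    For an equiatomic five-component alloy this gives R ln 5 ≈ 13.4 J/(mol K); any non-equiatomic composition gives less, which is why phase stability, not entropy alone, drives the design in the paper.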

  14. Programmable DNA-Guided Artificial Restriction Enzymes.

    PubMed

    Enghiad, Behnam; Zhao, Huimin

    2017-05-19

    Restriction enzymes are essential tools for recombinant DNA technology that have revolutionized modern biological research. However, they have limited sequence specificity and availability. Here we report a Pyrococcus furiosus Argonaute (PfAgo) based platform for generating artificial restriction enzymes (AREs) capable of recognizing and cleaving DNA sequences at virtually any arbitrary site and generating defined sticky ends of varying length. Short DNA guides are used to direct PfAgo to target sites for cleavage at high temperatures (>87 °C), followed by reannealing of the cleaved single-stranded DNAs. We used this platform to generate over 18 AREs for DNA fingerprinting and molecular cloning of PCR-amplified or genomic DNAs. These AREs work as efficiently as their naturally occurring counterparts, and some have no naturally occurring counterpart at all, demonstrating the easy programmability, generality, versatility, and high efficiency of this new technology.
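
    A string-level sketch of why offset cuts on the two strands produce defined sticky ends (purely illustrative; it models no PfAgo chemistry, only the geometry of a staggered cut):

```python
def cleave(top, cut_top, cut_bottom):
    """Cut a double-stranded sequence (given as the top strand, 5'->3')
    at different positions on the two strands; the offset between the two
    cut sites is what yields a single-stranded overhang (sticky end)."""
    comp = str.maketrans("ACGT", "TGCA")
    bottom = top.translate(comp)  # complementary strand aligned under top
    left = (top[:cut_top], bottom[:cut_bottom])
    right = (top[cut_top:], bottom[cut_bottom:])
    overhang = top[min(cut_top, cut_bottom):max(cut_top, cut_bottom)]
    return left, right, overhang
```

    With cut positions 1 and 5 on "GAATTC" this reproduces the familiar EcoRI-style "AATT" overhang; a programmable guide simply lets the two cut positions be chosen freely.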

  15. On modelling three-dimensional piezoelectric smart structures with boundary spectral element method

    NASA Astrophysics Data System (ADS)

    Zou, Fangxin; Aliabadi, M. H.

    2017-05-01

    The computational efficiency of the boundary element method in elastodynamic analysis can be significantly improved by employing high-order spectral elements for boundary discretisation. In this work, for the first time, the so-called boundary spectral element method is utilised to model the piezoelectric smart structures that are widely used in structural health monitoring (SHM) applications. The resultant boundary spectral element formulation has been validated against the finite element method (FEM) and physical experiments. The new formulation demonstrates a lower demand on computational resources and a higher numerical stability than commercial FEM packages. Compared to the conventional boundary element formulation, a significant reduction in computational expense has been achieved. In summary, the boundary spectral element formulation presented in this paper provides a highly efficient and stable mathematical tool for the development of SHM applications.

  16. Carbon nanotube-mediated siRNA delivery for gene silencing in cancer cells

    NASA Astrophysics Data System (ADS)

    Hong, Tu; Guo, Honglian; Xu, Yaqiong

    2011-10-01

    Small interfering RNA (siRNA) is potentially a promising tool for influencing gene expression with a high degree of target specificity. However, its poor intracellular uptake, instability in vivo, and non-specific immune stimulation have impeded its clinical application. In this study, carbon nanotubes (CNTs) functionalized with two types of phospholipid-polyethylene glycol (PEG) have been shown to stabilize siRNA in cell culture medium during transfection and to efficiently deliver siRNA into neuroblastoma and breast cancer cells. Moreover, the intrinsic optical properties of CNTs have been investigated through absorption and fluorescence measurements. We have found that the directly functionalized groups play an important role in the fluorescence imaging of functionalized CNTs. The unique fluorescence imaging and high delivery efficiency make CNTs a promising material to deliver drugs and evaluate treatment effects simultaneously.

  17. Changing computing paradigms towards power efficiency.

    PubMed

    Klavík, Pavel; Malossi, A Cristiano I; Bekas, Costas; Curioni, Alessandro

    2014-06-28

    Power awareness is fast becoming immensely important in computing, ranging from the traditional high-performance computing applications to the new generation of data centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations that finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grain level. In addition, we verify here previous work on post-FLOPS/W metrics and show that these can shed much more light in the power/energy profile of important applications. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
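
    The low/high-precision combination for linear systems can be sketched with classical iterative refinement: solve cheaply in (emulated) float32, then correct using residuals computed in double precision. This is a minimal 2x2 illustration of the general idea, not the authors' kernel:

```python
import struct

def f32(x):
    """Round a Python float to IEEE 754 float32 (low-precision emulation)."""
    return struct.unpack("f", struct.pack("f", x))[0]

def solve2_low(a, b, c, d, r1, r2):
    """Solve the 2x2 system [[a, b], [c, d]] x = r with every intermediate
    rounded to float32 (Cramer's rule)."""
    det = f32(f32(a * d) - f32(b * c))
    return (f32(f32(d * r1) - f32(b * r2)) / det,
            f32(f32(a * r2) - f32(c * r1)) / det)

def refine(a, b, c, d, r1, r2, iters=3):
    """Low-precision solve plus double-precision residual correction."""
    x1, x2 = solve2_low(a, b, c, d, r1, r2)
    for _ in range(iters):
        # Residual computed in full double precision -- the cheap part
        # that restores the accuracy lost in the float32 solves.
        e1 = r1 - (a * x1 + b * x2)
        e2 = r2 - (c * x1 + d * x2)
        d1, d2 = solve2_low(a, b, c, d, e1, e2)
        x1 += d1
        x2 += d2
    return x1, x2
```

    The power argument is that the bulk of the arithmetic runs at the cheap precision while the few double-precision residual evaluations recover full accuracy.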

  18. Java Tool Framework for Automation of Hardware Commissioning and Maintenance Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, J C; Fisher, J M; Gordon, J B

    2007-10-02

    The National Ignition Facility (NIF) is a 192-beam laser system designed to study high energy density physics. Each beam line contains a variety of line replaceable units (LRUs) that contain optics, stepping motors, sensors and other devices to control and diagnose the laser. During commissioning and subsequent maintenance of the laser, LRUs undergo a qualification process using the Integrated Computer Control System (ICCS) to verify and calibrate the equipment. The commissioning processes are both repetitive and tedious when we use remote manual computer controls, making them ideal candidates for software automation. Maintenance and Commissioning Tool (MCT) software was developed to improve the efficiency of the qualification process. The tools are implemented in Java, leveraging ICCS services and CORBA to communicate with the control devices. The framework provides easy-to-use mechanisms for handling configuration data, task execution, task progress reporting, and generation of commissioning test reports. The tool framework design and application examples will be discussed.

  19. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. ARX - A Comprehensive Tool for Anonymizing Biomedical Data

    PubMed Central

    Prasser, Fabian; Kohlmayer, Florian; Lautenschläger, Ronald; Kuhn, Klaus A.

    2014-01-01

    Collaboration and data sharing have become core elements of biomedical research. Especially when sensitive data from distributed sources are linked, privacy threats have to be considered. Statistical disclosure control allows the protection of sensitive data by introducing fuzziness. Reduction of data quality, however, needs to be balanced against gains in protection. Therefore, tools are needed which provide a good overview of the anonymization process to those responsible for data sharing. These tools require graphical interfaces and the use of intuitive and replicable methods. In addition, extensive testing, documentation and openness to reviews by the community are important. Existing publicly available software is limited in functionality, and often active support is lacking. We present ARX, an anonymization tool that i) implements a wide variety of privacy methods in a highly efficient manner, ii) provides an intuitive cross-platform graphical interface, iii) offers a programming interface for integration into other software systems, and iv) is well documented and actively supported. PMID:25954407
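
    As an illustration of the privacy model underlying such tools (this is not ARX's API — a from-scratch sketch): a dataset is k-anonymous when every combination of quasi-identifier values is shared by at least k records, and generalization trades data quality for protection:

```python
from collections import Counter

def k_anonymity(records, quasi_ids):
    """Smallest equivalence-class size over the quasi-identifier columns;
    the dataset is k-anonymous iff this value is >= k."""
    groups = Counter(tuple(r[c] for c in quasi_ids) for r in records)
    return min(groups.values())

def generalize_age(records, width=10):
    """One generalization step: coarsen 'age' into buckets of given width,
    introducing the 'fuzziness' that raises k at the cost of precision."""
    return [dict(r, age=(r["age"] // width) * width) for r in records]
```

    An anonymization tool essentially searches a lattice of such generalization steps for the one that reaches the required k with the least loss of data quality.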

  1. Old and new techniques mixed up into optical photomask measurement method

    NASA Astrophysics Data System (ADS)

    Fukui, Jumpei; Tachibana, Yusaku; Osanai, Makoto

    2017-07-01

    Cost-efficient, easy-to-operate, fully automated CD measurement for line widths from about 500 nm up to 5 μm on photomasks is still in high demand, because such photomasks are frequently used in manufacturing MEMS sensors for IoT and devices made in BCD (Bipolar CMOS DMOS) processes. In response to this demand from the photomask manufacturing field, we incorporate recently developed low-noise digital camera technology and an i-line LED light source into a new measuring tool, in order to achieve 1 nm (3σ) repeatability for line width measurement between 300 nm and 10 μm. In addition, for fully automated operation, it is very important to locate the initial target line in a dense pattern. To achieve such automatic line detection precisely, we have improved the accuracy of the high-precision stage (20 nm, 3σ) and the alignment algorithm of the MEMS stepper combined with this tool. As for the user interface, Windows-based software supports not only operation but also recipe creation and editing in Excel. In the MEMS manufacturing process there are various photomasks that need to be checked and measured frequently, so recipe files also have to be created and edited frequently. To meet this requirement in photomask management, we mix old and new techniques together into one system, resulting in a fully automated and cost-efficient tool with 1 nm repeatability in CD measurement.
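
    The quoted 1 nm (3σ) repeatability is a statistic over repeated measurements of the same feature; a minimal sketch of its computation with the sample standard deviation (the measurement values below are illustrative, not from the paper):

```python
from math import sqrt

def repeatability_3sigma(measurements):
    """3-sigma repeatability of repeated CD measurements, using the
    sample (n-1) standard deviation."""
    n = len(measurements)
    mean = sum(measurements) / n
    var = sum((m - mean) ** 2 for m in measurements) / (n - 1)
    return 3.0 * sqrt(var)
```

    A tool meeting the spec would keep this value at or below 1 nm across its stated 300 nm to 10 μm range.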

  2. Development of an Efficient Approach to Perform Neutronics Simulations for Plutonium-238 Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chandler, David; Ellis, Ronald James

    Conversion of 238Pu decay heat into usable electricity is imperative to power National Aeronautics and Space Administration (NASA) deep space exploration missions; however, the current stockpile of 238Pu is diminishing and the quality is less than ideal. In response, the US Department of Energy and NASA have undertaken a program to reestablish a domestic 238Pu production program, and a technology demonstration sub-project has been initiated. Neutronics simulations for 238Pu production play a vital role in this project because the results guide reactor safety-basis, target design and optimization, and post-irradiation examination activities. A new, efficient neutronics simulation tool written in Python was developed to evaluate, with the highest fidelity possible with approved tools, the time-dependent nuclide evolution and heat deposition rates in 238Pu production targets irradiated in the High Flux Isotope Reactor (HFIR). The Python Activation and Heat Deposition Script (PAHDS) was developed specifically for experiment analysis in HFIR and couples the MCNP5 and SCALE 6.1.3 software quality assured tools to take advantage of an existing high-fidelity MCNP HFIR model, the most up-to-date ORIGEN code, and the most up-to-date nuclear data. Three cycle simulations were performed with PAHDS implementing the ENDF/B-VII.0, ENDF/B-VII.1, and Hybrid Library GPD-Rev0 cross-section libraries. The 238Pu production results were benchmarked against VESTA-obtained results, and the impact of the various cross-section libraries on the calculated metrics was assessed.

  3. H.264/AVC Video Compression on Smartphones

    NASA Astrophysics Data System (ADS)

    Sharabayko, M. P.; Markov, N. G.

    2017-01-01

    In this paper, we studied the usage of H.264/AVC video compression tools by flagship smartphones. The results show that only a subset of the tools is used, meaning that there is still potential to achieve higher compression efficiency within the H.264/AVC standard, although the most advanced smartphones are already approaching the compression efficiency limit of H.264/AVC.

  4. Dynamic Load-Balancing for Distributed Heterogeneous Computing of Parallel CFD Problems

    NASA Technical Reports Server (NTRS)

    Ecer, A.; Chien, Y. P.; Boenisch, T.; Akay, H. U.

    2000-01-01

    The developed methodology is aimed at improving the efficiency of executing block-structured algorithms on parallel, distributed, heterogeneous computers. The basic approach of these algorithms is to divide the flow domain into many sub-domains called blocks and solve the governing equations over these blocks. The dynamic load balancing problem is defined as the efficient distribution of the blocks among the available processors over a period of several hours of computation. In environments with computers of different architectures, operating systems, CPU speeds, memory sizes, loads, and network speeds, balancing the loads and managing the communication between processors becomes crucial. Load balancing software tools for mutually dependent parallel processes have been created to efficiently utilize an advanced computation environment and algorithms. These tools are dynamic in nature because of the changes in the computer environment during execution time. More recently, these tools were extended to a second operating system: NT. In this paper, the problems associated with this application will be discussed. Also, the developed algorithms were combined with the load sharing capability of LSF to efficiently utilize workstation clusters for parallel computing. Finally, results will be presented on running a NASA-based code, ADPAC, to demonstrate the developed tools for dynamic load balancing.
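
    A static baseline for the block-distribution problem described above (the paper's tools are dynamic and cost-aware; this greedy longest-processing-time heuristic is only an illustrative starting point, with made-up block costs):

```python
import heapq

def assign_blocks(block_costs, n_procs):
    """Assign each block to the currently least-loaded processor,
    heaviest blocks first (LPT greedy). Returns the assignment and the
    makespan (load of the busiest processor)."""
    heap = [(0.0, p) for p in range(n_procs)]  # (current load, processor)
    heapq.heapify(heap)
    assignment = {p: [] for p in range(n_procs)}
    for bid, cost in sorted(enumerate(block_costs), key=lambda x: -x[1]):
        load, p = heapq.heappop(heap)
        assignment[p].append(bid)
        heapq.heappush(heap, (load + cost, p))
    makespan = max(load for load, _ in heap)
    return assignment, makespan
```

    A dynamic balancer repeats a decision like this as measured block costs and machine loads drift during the run, migrating blocks when the predicted makespan improvement outweighs the migration cost.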

  5. PhytoCRISP-Ex: a web-based and stand-alone application to find specific target sequences for CRISPR/CAS editing.

    PubMed

    Rastogi, Achal; Murik, Omer; Bowler, Chris; Tirichine, Leila

    2016-07-01

    With the emerging interest in phytoplankton research, the need to establish genetic tools for the functional characterization of genes is indispensable. The CRISPR/Cas9 system is now well recognized as an efficient and accurate reverse genetic tool for genome editing. Several computational tools have been published that allow researchers to find candidate target sequences for the engineering of CRISPR vectors while searching for possible off-targets of the predicted candidates. These tools provide built-in genome databases of common model organisms that are used for CRISPR target prediction. Although their predictions are highly sensitive, their design is inadequate for non-model genomes, most notably protists. This motivated us to design a new CRISPR target finding tool, PhytoCRISP-Ex. Our software offers CRISPR target predictions using an extended list of phytoplankton genomes and also delivers a user-friendly standalone application that can be used for any genome. The software attempts to integrate, for the first time, most available phytoplankton genome information and provides a web-based platform for Cas9 target prediction within them with high sensitivity. By offering a standalone version, PhytoCRISP-Ex remains independent of the pre-indexed databases, can be used with any organism, and widens its applicability in high-throughput pipelines. PhytoCRISP-Ex outperforms all existing tools by computing the availability of restriction sites over the most probable Cas9 cleavage sites, which can be ideal for mutant screens. PhytoCRISP-Ex is a simple, fast and accurate web interface with 13 pre-indexed and regularly updated phytoplankton genomes. The software was also designed as a UNIX-based standalone application that allows the user to search for target sequences in the genomes of a variety of other species.
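
    The core of any such tool is locating candidate protospacers next to a PAM motif. A minimal forward-strand sketch for the SpCas9-style 20-nt protospacer + NGG PAM (illustrative only; PhytoCRISP-Ex also scans the reverse strand and checks off-targets and restriction-site availability):

```python
import re

def find_cas9_targets(seq):
    """Return (position, protospacer) pairs: every 20-nt window that is
    immediately followed by an NGG PAM on the forward strand. The
    lookahead makes overlapping candidates visible too."""
    targets = []
    for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", seq):
        targets.append((m.start(1), m.group(1)))
    return targets
```

    Off-target search then amounts to re-running a mismatch-tolerant version of this scan over the whole genome for each candidate.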

  6. Targeted Delivery of CRISPR/Cas9-Mediated Cancer Gene Therapy via Liposome-Templated Hydrogel Nanoparticles.

    PubMed

    Chen, Zeming; Liu, Fuyao; Chen, Yanke; Liu, Jun; Wang, Xiaoying; Chen, Ann T; Deng, Gang; Zhang, Hongyi; Liu, Jie; Hong, Zhangyong; Zhou, Jiangbing

    2017-12-08

    Due to its simplicity, versatility, and high efficiency, the clustered regularly interspaced short palindromic repeat (CRISPR)/Cas9 technology has emerged as one of the most promising approaches for the treatment of a variety of genetic diseases, including human cancers. However, further translation of CRISPR/Cas9 to cancer gene therapy requires the development of safe approaches for efficient, highly specific delivery of both Cas9 and single guide RNA to tumors. Here, a novel core-shell nanostructure, liposome-templated hydrogel nanoparticles (LHNPs), optimized for efficient codelivery of Cas9 protein and nucleic acids, is reported. It is demonstrated that, when coupled with the minicircle DNA technology, LHNPs deliver CRISPR/Cas9 with greater efficiency than the commercial agent Lipofectamine 2000 in cell culture and can be engineered for targeted inhibition of genes in tumors, including tumors in the brain. When CRISPR/Cas9 targeting a model therapeutic gene, polo-like kinase 1 (PLK1), is delivered, LHNPs effectively inhibit tumor growth and improve the survival of tumor-bearing mice. The results suggest LHNPs as a versatile CRISPR/Cas9-delivery tool that can be adapted for experimentally studying the biology of cancer as well as for clinically translating cancer gene therapy.

  7. A Novel Approach To Improve the Efficiency of Block Freeze Concentration Using Ice Nucleation Proteins with Altered Ice Morphology.

    PubMed

    Jin, Jue; Yurkow, Edward J; Adler, Derek; Lee, Tung-Ching

    2017-03-22

    Freeze concentration is a separation process highly successful at preserving product quality. The remaining challenge is to achieve high efficiency at low cost. This study aims to evaluate the potential of using ice nucleation proteins (INPs) as an effective method to improve the efficiency of block freeze concentration while also exploring the related mechanism of ice morphology. Our results show that INPs are able to significantly improve the efficiency of block freeze concentration in a desalination model. Using this experimental system, we estimate that approximately 50% of the energy cost can be saved by the inclusion of INPs in desalination cycles while still meeting the EPA standard for drinking water (<500 ppm). Our investigative tools for ice morphology include optical microscopy and X-ray computed tomography imaging analysis. Their use indicates that INPs promote the development of a lamellar structured ice matrix with larger hydraulic diameters, which facilitates brine drainage and traps less brine than control samples. These results suggest great potential for applying INPs to develop an energy-saving freeze concentration method via the alteration of ice morphology.

  8. Nanoparticles for cultural heritage conservation: calcium and barium hydroxide nanoparticles for wall painting consolidation.

    PubMed

    Giorgi, Rodorico; Ambrosi, Moira; Toccafondi, Nicola; Baglioni, Piero

    2010-08-16

    Nanotechnology provides new concepts and materials for the consolidation and protection of wall paintings. In particular, humble calcium and barium hydroxide nanoparticles offer a versatile and highly efficient tool to combat the main degradation processes altering wall paintings. A clear example of the efficacy and potential of nanotechnology is the in situ conservation of Maya wall paintings in the archaeological area of Calakmul (Mexico).

  9. Performance index: An expeditious tool to screen for improved drought resistance in the Lathyrus genus.

    PubMed

    Silvestre, Susana; Araújo, Susana de Sousa; Vaz Patto, Maria Carlota; Marques da Silva, Jorge

    2014-07-01

    Some species of the Lathyrus genus are among the most promising crops for marginal lands, with high resilience to drought, flood, and fungal diseases, combined with high yields and seed nutritional value. However, lack of knowledge of the mechanisms underlying its outstanding performance, and of methodologies to identify elite genotypes, has hampered its proper use in breeding. Chlorophyll a fast fluorescence transient analysis (the JIP test) was used to evaluate water deficit (WD) resistance in the Lathyrus genus. Our results reveal unaltered photochemical values for all studied genotypes, indicating resistance to mild WD. Under severe WD, two Lathyrus sativus genotypes showed remarkable resilience, maintaining photochemical efficiency, unlike the other genotypes studied. The performance index (PIABS) is the best parameter to screen genotypes with improved performance and grain production under WD. Moreover, we found that JIP indices are good indicators of genotypic grain production under WD. The quantum yield of electron transport (ϕEo) and the efficiency with which trapped excitons can move electrons beyond QA (ψ0) emerged as important traits related to improved photosynthetic performance and should be exploited in future Lathyrus germplasm improvements. The JIP test described herein proved to be an expeditious tool to screen for and identify elite genotypes with improved drought resistance.

  10. A spatial DB model to simulate the road network efficiency in hydrogeological emergency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michele, Mangiameli, E-mail: michele.mangiameli@dica.unict.it; Giuseppe, Mussumeci

    We deal with the simulation of risk analysis using a technological approach based on the integration of exclusively free and open-source tools: PostgreSQL as the Database Management System (DBMS) and Quantum GIS-GRASS as the Geographic Information System (GIS) platform. The case study is a seismic area in Sicily characterized by steep slopes and frequent instability phenomena. This area includes a city of about 30,000 inhabitants (Enna) that lies on the top of a mountain at about 990 m a.s.l. Access to the city is provided by a few very winding roads that are also highly vulnerable to seismic and hydrogeological hazards. When exceptional rainfall events occur, the loss of efficiency of these roads can compromise the timeliness and effectiveness of rescue operations. The data of the sample area have been structured in the adopted DBMS, and the connection to the GIS functionalities allows simulation of these exceptional events. We analyzed the hazard, vulnerability, and exposure related to these events and calculated the final risk, defining three classes for each scenario: low (L), medium (M), and high (H). This study can be a valuable tool to prioritize risk levels and set intervention priorities for the main road networks.

  11. Ultramicroscopy as a novel tool to unravel the tropism of AAV gene therapy vectors in the brain.

    PubMed

    Alves, Sandro; Bode, Julia; Bemelmans, Alexis-Pierre; von Kalle, Christof; Cartier, Nathalie; Tews, Björn

    2016-06-20

    Recombinant adeno-associated viral (AAV) vectors have advanced to the vanguard of gene therapy. Numerous naturally occurring serotypes have been used to target cells in various tissues. There is a strong need for fast and dynamic methods that efficiently unravel viral tropism in whole organs. Ultramicroscopy (UM) is a novel fluorescence microscopy technique that images optically cleared, undissected specimens, achieving good resolution at high penetration depths while being non-destructive. UM was applied to obtain high-resolution 3D analysis of AAV transduction in adult mouse brains, especially in the hippocampus, a region of interest for Alzheimer's disease therapy. We separately or simultaneously compared transduction efficacies for commonly used serotypes (AAV9 and AAVrh10) using fluorescent reporter expression. We provide a detailed comparative and quantitative analysis of the transduction profiles. UM allowed rapid analysis of marker fluorescence expression in neurons with intact projections deep inside the brain, in defined anatomical structures. Major hippocampal neuronal transduction was observed with both vectors, with slightly better efficacy for AAV9 in UM. Glial response and synaptic marker expression did not change post transduction. We propose UM as a novel, valuable complementary tool to efficiently and simultaneously unravel the tropism of different viruses in a single non-dissected adult rodent brain.

  12. A spatial DB model to simulate the road network efficiency in hydrogeological emergency

    NASA Astrophysics Data System (ADS)

    Michele, Mangiameli; Giuseppe, Mussumeci

    2015-12-01

    We deal with the simulation of risk analysis using a technological approach based on the integration of exclusively free and open-source tools: PostgreSQL as the Database Management System (DBMS) and Quantum GIS-GRASS as the Geographic Information System (GIS) platform. The case study is a seismic area in Sicily characterized by steep slopes and frequent instability phenomena. This area includes a city of about 30,000 inhabitants (Enna) that lies on the top of a mountain at about 990 m a.s.l. Access to the city is provided by a few very winding roads that are also highly vulnerable to seismic and hydrogeological hazards. When exceptional rainfall events occur, the loss of efficiency of these roads can compromise the timeliness and effectiveness of rescue operations. The data of the sample area have been structured in the adopted DBMS, and the connection to the GIS functionalities allows simulation of these exceptional events. We analyzed the hazard, vulnerability, and exposure related to these events and calculated the final risk, defining three classes for each scenario: low (L), medium (M), and high (H). This study can be a valuable tool to prioritize risk levels and set intervention priorities for the main road networks.

  13. Targeted disruption of sp7 and myostatin with CRISPR-Cas9 results in severe bone defects and more muscular cells in common carp

    PubMed Central

    Zhong, Zhaomin; Niu, Pengfei; Wang, Mingyong; Huang, Guodong; Xu, Shuhao; Sun, Yi; Xu, Xiaona; Hou, Yi; Sun, Xiaowen; Yan, Yilin; Wang, Han

    2016-01-01

    The common carp (Cyprinus carpio), one of the most important aquaculture fishes, produces over 3 million metric tons annually, approximately 10% of the annual production of all farmed freshwater fish worldwide. However, the tetraploid genome and long generation time of the common carp have made its breeding and genetic studies extremely difficult. Here, TALEN and CRISPR-Cas9, two versatile genome-editing tools, are employed to target the common carp bone-related genes sp7, runx2, bmp2a, spp1, and opg, and the muscle suppressor gene mstn. TALENs were shown to induce mutations in the target coding sites of sp7, runx2, spp1 and mstn. With CRISPR-Cas9, the two common carp sp7 genes, sp7a and sp7b, were mutated individually, in both cases resulting in severe bone defects, while mstnba-mutated fish grew significantly more muscle cells. We also employed CRISPR-Cas9 to generate sp7a;mstnba double mutant fish with high efficiency in a single step. These results demonstrate that both TALEN and CRISPR-Cas9 are highly efficient tools for modifying the common carp genome, and open avenues for facilitating common carp genetic studies and breeding. PMID:26976234

  14. Predicting the future: opportunities and challenges for the chemical industry to apply 21st-century toxicity testing.

    PubMed

    Settivari, Raja S; Ball, Nicholas; Murphy, Lynea; Rasoulpour, Reza; Boverhof, Darrell R; Carney, Edward W

    2015-03-01

    Interest in applying 21st-century toxicity testing tools for safety assessment of industrial chemicals is growing. Whereas conventional toxicology uses mainly animal-based, descriptive methods, a paradigm shift is emerging in which computational approaches, systems biology, high-throughput in vitro toxicity assays, and high-throughput exposure assessments are beginning to be applied to mechanism-based risk assessments in a time- and resource-efficient fashion. Here we describe recent advances in predictive safety assessment, with a focus on their strategic application to meet the changing demands of the chemical industry and its stakeholders. The opportunities to apply these new approaches are extensive and include screening of new chemicals, informing the design of safer and more sustainable chemical alternatives, filling information gaps on data-poor chemicals already in commerce, strengthening read-across methodology for categories of chemicals sharing similar modes of action, and optimizing the design of reduced-risk product formulations. Finally, we discuss how these predictive approaches dovetail with in vivo integrated testing strategies within repeated-dose regulatory toxicity studies, in line with the 3Rs principles to refine, reduce, and replace animal testing. Strategic application of these tools is the foundation for informed and efficient safety assessment testing strategies that can be applied at all stages of the product-development process.

  15. Efficient Data Generation and Publication as a Test Tool

    NASA Technical Reports Server (NTRS)

    Einstein, Craig Jakob

    2017-01-01

    A tool to facilitate the generation and publication of test data was created to test the individual components of a command and control system designed to launch spacecraft. Specifically, this tool was built to ensure messages are properly passed between system components. The tool can also be used to test whether the appropriate groups have access (read/write privileges) to the correct messages. The messages passed between system components take the form of unique identifiers with associated values. These identifiers are alphanumeric strings that identify the type of message and the additional parameters that are contained within the message. The values that are passed with the message depend on the identifier. The data generation tool allows for the efficient creation and publication of these messages. A configuration file can be used to set the parameters of the tool and also specify which messages to pass.
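    The identifier-plus-parameters message format described above can be sketched as follows. This is a minimal illustrative sketch, not the actual NASA tool: the schema entries, identifier strings, and JSON config layout are all hypothetical, invented to show how a config file could drive message generation and publication.

    ```python
    import json

    # Hypothetical message schema: identifier -> required parameter names.
    # These identifiers are invented for illustration only.
    SCHEMA = {
        "GSE.TEMP.SENSOR": ["value_k", "timestamp"],
        "FCS.VALVE.STATE": ["open", "timestamp"],
    }

    def generate_message(identifier, **params):
        """Build a message dict, validating parameters against the schema."""
        expected = SCHEMA.get(identifier)
        if expected is None:
            raise KeyError(f"unknown identifier: {identifier}")
        missing = set(expected) - set(params)
        if missing:
            raise ValueError(f"missing parameters: {sorted(missing)}")
        return {"id": identifier, "params": params}

    def publish_from_config(config_text, publish):
        """Read a JSON config listing which messages to emit, then publish each."""
        config = json.loads(config_text)
        for entry in config["messages"]:
            publish(generate_message(entry["id"], **entry["params"]))

    # Example: collect published messages in a list instead of a real message bus.
    out = []
    publish_from_config(
        '{"messages": [{"id": "GSE.TEMP.SENSOR",'
        ' "params": {"value_k": 293.1, "timestamp": 12}}]}',
        out.append,
    )
    ```

    Swapping `out.append` for a real publisher callback would let the same config drive an access-control test: publish each message as a given group and check which reads/writes are accepted.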

  16. Ecotoxicity tests using the green algae Chlorella vulgaris--a useful tool in hazardous effluents management.

    PubMed

    Silva, Aurora; Figueiredo, Sónia A; Sales, M Goreti; Delerue-Matos, Cristina

    2009-08-15

    The treatment efficiency of laboratory wastewaters was evaluated, and ecotoxicity tests with Chlorella vulgaris were performed on them to assess the safety of their environmental discharge. For chemical oxygen demand wastewaters, chromium (VI), mercury (II) and silver were efficiently removed by chemical treatments. A reduction of ecotoxicity was achieved; nevertheless, an EC50 (the effective concentration that causes 50% inhibition of algal growth) of 1.5% (v/v) still indicated a high level of ecotoxicity. For chloride determination wastewaters, an efficient reduction of chromium and silver was achieved after treatment. Ecotoxicity was also reduced, with the EC50 increasing from 0.059% to 0.5%; even so, only a concentration of 0.02% in the aquatic environment would guarantee no effects. Wastewaters containing the phenanthroline/iron (II) complex were treated by chemical oxidation. Treatment was satisfactory with respect to chemical parameters, although an increase in ecotoxicity was observed (the EC50 decreased from 0.31% to 0.21%). The wastes from the kinetic study of the persulphate and iodide reaction were treated with sodium bisulphite until colour was removed. They did not reveal significant ecotoxicity: only concentrations above 1% of the untreated waste produced observable effects on the algae. Therefore, ecotoxicity tests can be considered a useful tool not only in laboratory effluent treatment, as shown, but also in hazardous wastewater management.
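    The EC50 values quoted above are estimated from dose-response data. A common minimal approach, sketched below under the assumption of monotonically increasing inhibition, is log-linear interpolation between the two tested concentrations that bracket 50% inhibition; the data points are illustrative, not taken from the paper.

    ```python
    import math

    def ec50(concentrations, inhibition):
        """Estimate EC50 by interpolating on log10(concentration) between the
        two measured points that bracket 50% inhibition.
        Assumes inhibition (in %) increases monotonically with concentration."""
        pairs = list(zip(concentrations, inhibition))
        for (c1, i1), (c2, i2) in zip(pairs, pairs[1:]):
            if i1 <= 50.0 <= i2:
                f = (50.0 - i1) / (i2 - i1)  # fractional position of 50% between i1 and i2
                log_c = math.log10(c1) + f * (math.log10(c2) - math.log10(c1))
                return 10 ** log_c
        raise ValueError("50% inhibition not bracketed by the data")

    # Illustrative effluent dilutions (% v/v) vs. algal growth inhibition (%).
    ec = ec50([0.01, 0.1, 1.0, 10.0], [5.0, 30.0, 70.0, 95.0])
    ```

    Here 50% falls midway between the 30% and 70% inhibition points, so the estimate lands at the log-midpoint of 0.1% and 1.0%, about 0.32% (v/v). Regulatory work typically fits a full sigmoid model instead, but the interpolation conveys the idea.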

  17. GAPIT: genome association and prediction integrated tool.

    PubMed

    Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu

    2012-09-15

    Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.

  18. High-performance web services for querying gene and variant annotation.

    PubMed

    Xin, Jiwen; Mark, Adam; Afrasiabi, Cyrus; Tsueng, Ginger; Juchler, Moritz; Gopal, Nikhil; Stupp, Gregory S; Putman, Timothy E; Ainscough, Benjamin J; Griffith, Obi L; Torkamani, Ali; Whetzel, Patricia L; Mungall, Christopher J; Mooney, Sean D; Su, Andrew I; Wu, Chunlei

    2016-05-06

    Efficient tools for data management and integration are essential for many aspects of high-throughput biology. In particular, annotations of genes and human genetic variants are commonly used but highly fragmented across many resources. Here, we describe MyGene.info and MyVariant.info, high-performance web services for querying gene and variant annotation information. These web services are currently accessed more than three million times per month. They also demonstrate a generalizable cloud-based model for organizing and querying biological annotation information. MyGene.info and MyVariant.info are provided as high-performance web services, accessible at http://mygene.info and http://myvariant.info. Both are offered free of charge to the research community.
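    Services like these are typically queried over a simple REST interface. The sketch below builds a query URL in the style of the MyGene.info API; no network request is made, and while the `q`, `fields`, and `species` parameter names follow the public API, the specific gene and field choices here are illustrative assumptions.

    ```python
    from urllib.parse import urlencode

    BASE = "http://mygene.info/v3"  # public MyGene.info REST endpoint

    def query_url(q, fields=None, species=None):
        """Build a MyGene.info /query request URL (no network call is made)."""
        params = {"q": q}
        if fields:
            params["fields"] = ",".join(fields)  # API accepts a comma-separated list
        if species:
            params["species"] = species
        return f"{BASE}/query?{urlencode(params)}"

    # Example: look up the human CDK2 gene, requesting two annotation fields.
    url = query_url("symbol:CDK2",
                    fields=["entrezgene", "ensembl.gene"],
                    species="human")
    ```

    Fetching `url` with any HTTP client would return JSON hits; restricting `fields` keeps responses small, which is part of how such services stay fast at millions of requests per month.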

  19. Intelligent control system based on ARM for lithography tool

    NASA Astrophysics Data System (ADS)

    Chen, Changlong; Tang, Xiaoping; Hu, Song; Wang, Nan

    2014-08-01

    The control system of a traditional lithography tool is based on a PC and an MCU. The PC handles complex algorithms and human-computer interaction and communicates with the MCU via a serial port; the MCU controls motors, electromagnetic valves, etc. This mode has shortcomings such as large volume, high power consumption, and wasted PC resources. In this paper, an ARM-based embedded intelligent control system for a lithography tool is presented. The control system uses an S5PV210 as its processor, taking over the functions of the PC in a traditional lithography tool, and provides good human-computer interaction through an LCD and a capacitive touch screen. Running Android 4.0.3 as its operating system, the equipment provides a clean, easy-to-use UI that makes control more user-friendly, and implements remote control and debugging, pushing product video information over the network. As a result, it is convenient for the equipment vendor to provide technical support to users. Finally, compared with a traditional lithography tool, this design eliminates the PC, using hardware resources efficiently and reducing cost and volume. Introducing an embedded OS and Internet-of-Things concepts into lithography tool design is a promising development trend.

  20. SWATH2stats: An R/Bioconductor Package to Process and Convert Quantitative SWATH-MS Proteomics Data for Downstream Analysis Tools.

    PubMed

    Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi

    2016-01-01

    SWATH-MS is an acquisition and analysis technique of targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is popular open-source software for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis there exist different tools such as MSstats, mapDIA and aLFQ. However, the transfer of data from OpenSWATH to the downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotation, analyzing the variation and the reproducibility of the measurements, FDR estimation, and advanced filtering before submitting the processed data to downstream tools. These functionalities are important to quickly analyze the quality of the SWATH-MS data. Hence, SWATH2stats is a new open-source tool that summarizes several practical functionalities for analyzing, processing, and converting SWATH-MS data and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.

Top