Sample records for simple efficient tool

  1. Simple tool for prediction of parotid gland sparing in intensity-modulated radiation therapy.

    PubMed

    Gensheimer, Michael F; Hummel-Kramer, Sharon M; Cain, David; Quang, Tony S

    2015-01-01

    Sparing one or both parotid glands is a key goal when planning head and neck cancer radiation treatment. If the planning target volume (PTV) overlaps one or both parotid glands substantially, it may not be possible to achieve adequate gland sparing. This finding results in physicians revising their PTV contours after an intensity-modulated radiation therapy (IMRT) plan has been run and reduces workflow efficiency. We devised a simple formula for predicting mean parotid gland dose from the overlap of the parotid gland and isotropically expanded PTV contours. We tested the tool using 44 patients from 2 institutions and found agreement between predicted and actual parotid gland doses (mean absolute error = 5.3 Gy). This simple method could increase treatment planning efficiency by improving the chance that the first plan presented to the physician will have optimal parotid gland sparing. Published by Elsevier Inc.
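
    The record describes the model's input (gland overlap with an isotropically expanded PTV) but not the fitted formula itself. Below is a minimal Python sketch of a predictor of this general shape, with placeholder coefficients that would in practice come from fitting planned cases; all names and values are illustrative, not taken from the paper.

      # Hypothetical linear predictor; intercept and slope are placeholders,
      # not the fitted values from Gensheimer et al.
      def predict_mean_parotid_dose(overlap_fraction, prescription_dose_gy,
                                    intercept_gy=5.0, slope=0.9):
          """overlap_fraction: fraction of parotid volume inside the
          isotropically expanded PTV (0.0 to 1.0)."""
          return intercept_gy + slope * overlap_fraction * prescription_dose_gy

      # e.g. 30% overlap with a 70 Gy prescription -> 23.9 Gy predicted mean dose
      print(predict_mean_parotid_dose(0.30, 70.0))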

  2. Simple tool for prediction of parotid gland sparing in intensity-modulated radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gensheimer, Michael F.; Hummel-Kramer, Sharon M., E-mail: sharonhummel@comcast.net; Cain, David

    Sparing one or both parotid glands is a key goal when planning head and neck cancer radiation treatment. If the planning target volume (PTV) overlaps one or both parotid glands substantially, it may not be possible to achieve adequate gland sparing. This finding results in physicians revising their PTV contours after an intensity-modulated radiation therapy (IMRT) plan has been run and reduces workflow efficiency. We devised a simple formula for predicting mean parotid gland dose from the overlap of the parotid gland and isotropically expanded PTV contours. We tested the tool using 44 patients from 2 institutions and found agreement between predicted and actual parotid gland doses (mean absolute error = 5.3 Gy). This simple method could increase treatment planning efficiency by improving the chance that the first plan presented to the physician will have optimal parotid gland sparing.

  3. Building Energy Asset Score for Utilities and Energy Efficiency Program Administrators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Building Technologies Office

    2015-01-01

    The Building Energy Asset Score is a national standardized tool for evaluating the physical and structural energy efficiency of commercial and multifamily residential buildings. The Asset Score generates a simple energy efficiency rating that enables comparison among buildings, and identifies opportunities for users to invest in energy efficiency upgrades. It is web-based and free to use. This fact sheet discusses the value of the score for utilities and energy efficiency program administrators.

  4. Building Energy Asset Score for Architects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Building Technologies Office

    2015-01-01

    The Building Energy Asset Score is a national standardized tool for evaluating the physical and structural energy efficiency of commercial and multifamily residential buildings. The Asset Score generates a simple energy efficiency rating that enables comparison among buildings, and identifies opportunities for users to invest in energy efficiency upgrades. It is web-based and free to use. This fact sheet discusses the value of the score for architects.

  5. Building Energy Asset Score for Energy Services Companies, Engineers and Green Building Consultants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Building Technologies Office

    The Building Energy Asset Score is a national standardized tool for evaluating the physical and structural energy efficiency of commercial and multifamily residential buildings. The Asset Score generates a simple energy efficiency rating that enables comparison among buildings, and identifies opportunities for users to invest in energy efficiency upgrades. It is web-based and free to use. This fact sheet discusses the value of the score for energy services companies, engineers and green building consultants.

  6. Building Energy Asset Score for Building Owners

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Building Technologies Office

    2015-01-01

    The Building Energy Asset Score is a national standardized tool for evaluating the physical and structural energy efficiency of commercial and multifamily residential buildings. The Asset Score generates a simple energy efficiency rating that enables comparison among buildings, and identifies opportunities for users to invest in energy efficiency upgrades. It is web-based and free to use. This fact sheet discusses the value of the score for building owners.

  7. Building Energy Asset Score for Real Estate Managers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Building Technologies Office

    2015-01-01

    The Building Energy Asset Score is a national standardized tool for evaluating the physical and structural energy efficiency of commercial and multifamily residential buildings. The Asset Score generates a simple energy efficiency rating that enables comparison among buildings, and identifies opportunities for users to invest in energy efficiency upgrades. It is web-based and free to use. This fact sheet discusses the value of the score for real estate managers.

  8. Current trends for customized biomedical software tools.

    PubMed

    Khan, Haseeb Ahmad

    2017-01-01

    In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists independently or with the support of computer experts and often made freely available for the benefit of scientific community. The current trends for customized biomedical software tools are highlighted in this short review.

  9. EZ and GOSSIP, two new VO compliant tools for spectral analysis

    NASA Astrophysics Data System (ADS)

    Franzetti, P.; Garilli, B.; Fumana, M.; Paioro, L.; Scodeggio, M.; Paltani, S.; Scaramella, R.

    2008-10-01

    We present EZ and GOSSIP, two new VO compliant tools dedicated to spectral analysis. EZ is a tool to perform automatic redshift measurement; GOSSIP is a tool created to perform the SED fitting procedure in a simple, user-friendly and efficient way. These two tools have been developed by the PANDORA Group at INAF-IASF (Milano); EZ has been developed in collaboration with Osservatorio Monte Porzio (Roma) and the Integral Science Data Center (Geneva). EZ is released to the astronomical community; GOSSIP is currently in beta-testing.

  10. Two-photon excited fluorescence from a pseudoisocyanine-attached gold-coated tip via a thin tapered fiber under a weak continuous wave excitation.

    PubMed

    Ren, Fang; Takashima, Hideaki; Tanaka, Yoshito; Fujiwara, Hideki; Sasaki, Keiji

    2013-11-18

    A simple tapered-fiber-based photonic-plasmonic hybrid nanostructure composed of a thin tapered fiber and a pseudoisocyanine (PIC)-attached Au-coated tip was demonstrated. Using this simple hybrid nanostructure, we succeeded in observing two-photon excited fluorescence from the PIC dye molecules under a weak continuous wave excitation condition. From the results of the tip-fiber distance dependence and excitation polarization dependence, we found that using a thin tapered fiber and an Au-coated tip realized efficient coupling of the incident light (~95%) and localized surface plasmon (LSP) excitation at the Au-coated tip, suggesting the possibility of efficiently inducing two-photon excited fluorescence from the PIC dye molecules attached to the Au-coated tip. This simple photonic-plasmonic hybrid system is a promising tool for single-photon sources, highly efficient plasmonic sensors, and integrated nonlinear plasmonic devices.

  11. Building Energy Asset Score for State and Local Governments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Building Technologies Office

    2015-01-01

    The Building Energy Asset Score is a national standardized tool for evaluating the physical and structural energy efficiency of commercial and multifamily residential buildings. The Asset Score generates a simple energy efficiency rating that enables comparison among buildings, and identifies opportunities for users to invest in energy efficiency upgrades. It is web-based and free to use. This fact sheet discusses the value of the score for state and local governments.

  12. UPIC + GO: Zeroing in on informative markers

    USDA-ARS's Scientific Manuscript database

    Microsatellites/SSRs (simple sequence repeats) have become a powerful tool in genomic biology because of their broad range of applications and availability. An efficient method recently developed to generate microsatellite-enriched libraries used in combination with high throughput DNA pyrosequencin...

  13. Design of efficient and simple interface testing equipment for opto-electric tracking system

    NASA Astrophysics Data System (ADS)

    Liu, Qiong; Deng, Chao; Tian, Jing; Mao, Yao

    2016-10-01

    Interface testing of an opto-electric tracking system is an important step in assuring system performance: it verifies, at several levels, whether each electronic interface matches its communication protocol as designed. Modern opto-electric tracking systems are complex and composed of many functional units. Interface testing is usually executed between fully manufactured units, so it depends heavily on unit design and manufacturing progress, as well as on the people involved, and often takes days or weeks. To solve this problem, this paper presents efficient and simple interface testing equipment for opto-electric tracking systems, consisting of optional interface circuit cards, a processor, and a test program. The hardware cards provide the matching hardware interface(s) and are easily supplied by hardware engineers. Automatic code generation provides adaptation to new communication protocols: test items are acquired automatically, the code architecture is constructed automatically, and the code itself is generated automatically, so an adapted test program can be formed quickly. After a few simple steps, customized interface testing equipment with a matching test program and interface(s) is ready for the system under test within minutes. The equipment has been used to test all or part of the interfaces of many opto-electric tracking systems, reducing test time from days to hours and greatly improving test efficiency, with high software quality and stability and without manual coding. Used as a common tool, it has changed the traditional interface testing method and created much higher efficiency.

  14. GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis

    NASA Astrophysics Data System (ADS)

    Nass, Andrea; van Gasselt, Stephan

    2015-04-01

    Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications, dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such basic tools can be used to build complex analysis tools, e.g., in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g., ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data, but geologic mapping data can also be efficiently stored and accessed using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness estimates are two common concepts of particular interest, and for both a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on extraction of landing site characteristics using established criteria. We provide working examples and particularly focus on the concept of terrain roughness as it is interpreted in geomorphology and engineering studies.
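
    Several definitions of terrain roughness co-exist, as the abstract notes; one common geomorphological choice is the local standard deviation of elevation in a moving window. A minimal numpy/scipy sketch of that particular definition follows; the window size is an assumption to be tuned per dataset.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def roughness_std(dem, window=5):
          """Local standard deviation of a DEM over a square moving window."""
          mean = uniform_filter(dem, size=window)
          mean_sq = uniform_filter(dem * dem, size=window)
          var = np.maximum(mean_sq - mean * mean, 0.0)  # clip tiny negatives
          return np.sqrt(var)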

  15. Growth and yield models for central hardwoods

    Treesearch

    Martin E. Dale; Donald E. Hilt

    1989-01-01

    Over the last 20 years computers have become an efficient tool to estimate growth and yield. Computerized yield estimates vary from simple approximation or interpolation of traditional normal yield tables to highly sophisticated programs that simulate the growth and yield of each individual tree.

  16. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, efficiency of memory usage, and the capability of each AD tool to handle modern Fortran 90-95 elements such as structures and pointers, which either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator-overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, source transformation tools appear to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled, although they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continued use of AD tools for solving geophysical problems on modern computer architectures.
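
    None of the four tools' generated code is reproduced here. As a toy illustration of the operator-overloading approach the study discusses (forward mode for compactness, although the adjoint codes in the study are reverse mode), a dual-number class in Python:

      class Dual:
          """Toy forward-mode AD value: carries f and df/dx together."""
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.der + o.der)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val,
                          self.der * o.val + self.val * o.der)  # product rule
          __rmul__ = __mul__

      x = Dual(3.0, 1.0)       # seed dx/dx = 1
      y = x * x + 2.0 * x      # y = x^2 + 2x
      print(y.val, y.der)      # 15.0 8.0, since dy/dx = 2x + 2 = 8 at x = 3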

  17. The Krylov accelerated SIMPLE(R) method for flow problems in industrial furnaces

    NASA Astrophysics Data System (ADS)

    Vuik, C.; Saghir, A.; Boerstoel, G. P.

    2000-08-01

    Numerical modeling of the melting and combustion process is an important tool in gaining understanding of the physical and chemical phenomena that occur in a gas- or oil-fired glass-melting furnace. The incompressible Navier-Stokes equations are used to model the gas flow in the furnace. The discrete Navier-Stokes equations are solved by the SIMPLE(R) pressure-correction method. In these applications, many SIMPLE(R) iterations are necessary to obtain an accurate solution. In this paper, Krylov accelerated versions are proposed: GCR-SIMPLE(R). The properties of these methods are investigated for a simple two-dimensional flow. Thereafter, the efficiencies of the methods are compared for three-dimensional flows in industrial glass-melting furnaces.
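
    The principle of Krylov acceleration can be sketched on a linear model problem: rather than iterating a stationary update on its own, the same splitting is applied as a preconditioner inside a Krylov solver. In the sketch below, GMRES stands in for the paper's GCR and a Jacobi sweep stands in for a SIMPLE pressure-correction sweep; both substitutions are illustrative assumptions.

      import numpy as np
      from scipy.sparse.linalg import LinearOperator, gmres

      n = 100
      A = (np.diag(4.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1)
           + np.diag(-np.ones(n - 1), -1))
      b = np.ones(n)
      M = LinearOperator((n, n), matvec=lambda r: r / 4.0)  # toy "SIMPLE sweep"

      x = np.zeros(n)                      # stationary iteration: slow
      for _ in range(50):
          x = x + M.matvec(b - A @ x)

      x_krylov, info = gmres(A, b, M=M)    # same sweep inside GMRES: fast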

  18. The Simple Video Coder: A free tool for efficiently coding social video data.

    PubMed

    Barto, Daniel; Bird, Clark W; Hamilton, Derek A; Fink, Brandi C

    2017-08-01

    Videotaping of experimental sessions is a common practice across many disciplines of psychology, ranging from clinical therapy, to developmental science, to animal research. Audio-visual data are a rich source of information that can be easily recorded; however, analysis of the recordings presents a major obstacle to project completion. Coding behavior is time-consuming and often requires ad-hoc training of a student coder. In addition, existing software is either prohibitively expensive or cumbersome, which leaves researchers with inadequate tools to quickly process video data. We offer the Simple Video Coder: free, open-source software for behavior coding that is flexible in accommodating different experimental designs, is intuitive for students to use, and produces outcome measures of event timing, frequency, and duration. Finally, the software also offers extraction tools to splice video into coded segments suitable for training future human coders or for use as input for pattern classification algorithms.

  19. Implementation and Use of the Reference Analytics Module of LibAnswers

    ERIC Educational Resources Information Center

    Flatley, Robert; Jensen, Robert Bruce

    2012-01-01

    Academic libraries have traditionally collected reference statistics using hash marks on paper. Although efficient and simple, this method is not an effective way to capture the complexity of reference transactions. Several electronic tools are now available to assist libraries with collecting often elusive reference data--among them homegrown…

  20. Memory management in genome-wide association studies

    PubMed Central

    2009-01-01

    Genome-wide association is a powerful tool for the identification of genes that underlie common diseases. Genome-wide association studies generate billions of genotypes and pose significant computational challenges for most users including limited computer memory. We applied a recently developed memory management tool to two analyses of North American Rheumatoid Arthritis Consortium studies and measured the performance in terms of central processing unit and memory usage. We conclude that our memory management approach is simple, efficient, and effective for genome-wide association studies. PMID:20018047

  1. Your Personal Learning Network: Professional Development on Demand

    ERIC Educational Resources Information Center

    Bauer, William I.

    2010-01-01

    Web 2.0 tools and resources can enhance our efficiency and effectiveness as music educators, supporting personal learning networks for ongoing professional growth and development. This article includes (a) an explanation of Really Simple Syndication (RSS) and the use of an RSS reader/aggregator; (b) a discussion of blogs, podcasts, wikis,…

  2. A simple tool for tubing modification to improve spiral high-speed counter-current chromatography for protein purification

    PubMed Central

    Ito, Yoichiro; Ma, Xiaofeng; Clary, Robert

    2016-01-01

    A simple tool is introduced which can modify the shape of tubing to enhance the partition efficiency in high-speed countercurrent chromatography. It consists of a pair of interlocking identical gears, each coaxially holding a pressing wheel to intermittently compress plastic tubing in 0 – 10 mm length at every 1 cm interval. The performance of the processed tubing is examined in protein separation with 1.6 mm ID PTFE tubing intermittently pressed in 3 mm and 10 mm width both at 10 mm intervals at various flow rates and revolution speeds. A series of experiments was performed with a polymer phase system composed of polyethylene glycol and dibasic potassium phosphate each at 12.5% (w/w) in deionized water using three protein samples. Overall results clearly demonstrate that the compressed tubing can yield substantially higher peak resolution than the non-processed tubing. The simple tubing modifier is very useful for separation of proteins with high-speed countercurrent chromatography. PMID:27818942

  3. A simple tool for tubing modification to improve spiral high-speed counter-current chromatography for protein purification.

    PubMed

    Ito, Yoichiro; Ma, Xiaofeng; Clary, Robert

    2016-01-01

    A simple tool is introduced which can modify the shape of tubing to enhance the partition efficiency in high-speed countercurrent chromatography. It consists of a pair of interlocking identical gears, each coaxially holding a pressing wheel to intermittently compress plastic tubing in 0 - 10 mm length at every 1 cm interval. The performance of the processed tubing is examined in protein separation with 1.6 mm ID PTFE tubing intermittently pressed in 3 mm and 10 mm width both at 10 mm intervals at various flow rates and revolution speeds. A series of experiments was performed with a polymer phase system composed of polyethylene glycol and dibasic potassium phosphate each at 12.5% (w/w) in deionized water using three protein samples. Overall results clearly demonstrate that the compressed tubing can yield substantially higher peak resolution than the non-processed tubing. The simple tubing modifier is very useful for separation of proteins with high-speed countercurrent chromatography.

  4. NONMEMory: a run management tool for NONMEM.

    PubMed

    Wilkins, Justin J

    2005-06-01

    NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.

  5. Automating the application of smart materials for protein crystallization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khurshid, Sahir; Govada, Lata; EL-Sharif, Hazim F.

    2015-03-01

    The first semi-liquid, non-protein nucleating agent for automated protein crystallization trials is described. This 'smart material' is demonstrated to induce crystal growth and will provide a simple, cost-effective tool for scientists in academia and industry. The fabrication and validation of the first semi-liquid nonprotein nucleating agent to be administered automatically to crystallization trials is reported. This research builds upon prior demonstration of the suitability of molecularly imprinted polymers (MIPs; known as 'smart materials') for inducing protein crystal growth. Modified MIPs of altered texture suitable for high-throughput trials are demonstrated to improve crystal quality and to increase the probability of success when screening for suitable crystallization conditions. The application of these materials is simple, time-efficient and will provide a potent tool for structural biologists embarking on crystallization trials.

  6. Combined distribution functions: A powerful tool to identify cation coordination geometries in liquid systems

    NASA Astrophysics Data System (ADS)

    Sessa, Francesco; D'Angelo, Paola; Migliorati, Valentina

    2018-01-01

    In this work we have developed an analytical procedure to identify metal ion coordination geometries in liquid media based on the calculation of Combined Distribution Functions (CDFs) starting from Molecular Dynamics (MD) simulations. CDFs provide a fingerprint which can be easily and unambiguously assigned to a reference polyhedron. The CDF analysis has been tested on five systems and has proven to reliably identify the correct geometries of several ion coordination complexes. This tool is simple and general and can be efficiently applied to different MD simulations of liquid systems.
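
    A combined distribution function of this kind is, in essence, a two-dimensional histogram over pairs of structural coordinates sampled along the trajectory. A minimal numpy sketch, assuming the ion-ligand distances and a first-shell angle have already been extracted from the MD frames; bin counts and ranges are illustrative:

      import numpy as np

      def combined_distribution(r, theta, r_max=5.0):
          """2D histogram of ion-ligand distance (Angstrom) vs. a shell
          angle (degrees) over all MD frames; the resulting map is the
          fingerprint compared against reference polyhedra."""
          H, r_edges, a_edges = np.histogram2d(
              r, theta, bins=(100, 90),
              range=((0.0, r_max), (0.0, 180.0)), density=True)
          return H, r_edges, a_edges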

  7. Internal Aspects of the Skill Transfer of Manual Assembly Work

    ERIC Educational Resources Information Center

    Doyo, Daisuke

    2009-01-01

    In manual assembly work, parts are often assembled by applying force with a simple tool or by hand. A worker thus needs to control the force he or she applies while working, as an appropriate level of force is requisite for minimizing work failures and improving efficiency. The object of this study is to clarify the relationship between the level of…

  8. A novel rapid and reproducible flow cytometric method for optimization of transfection efficiency in cells

    PubMed Central

    Homann, Stefanie; Hofmann, Christian; Gorin, Aleksandr M.; Nguyen, Huy Cong Xuan; Huynh, Diana; Hamid, Phillip; Maithel, Neil; Yacoubian, Vahe; Mu, Wenli; Kossyvakis, Athanasios; Sen Roy, Shubhendu; Yang, Otto Orlean

    2017-01-01

    Transfection is one of the most frequently used techniques in molecular biology that is also applicable to gene therapy studies in humans. One of the biggest challenges in investigating protein function and interaction in gene therapy studies is to have reliable monospecific detection reagents, particularly antibodies, for all human gene products. Thus, a reliable method that can optimize transfection efficiency based not only on expression of the target protein of interest but also on the uptake of the nucleic acid plasmid can be an important tool in molecular biology. Here, we present a simple, rapid and robust flow cytometric method that can be used as a tool to optimize transfection efficiency at the single cell level while overcoming limitations of previously established methods for quantifying transfection efficiency. By using optimized ratios of transfection reagent and a nucleic acid (DNA or RNA) vector directly labeled with a fluorochrome, this method can be used to simultaneously quantify the cellular toxicity of different transfection reagents, the amount of nucleic acid plasmid that cells have taken up during transfection, and the amount of the encoded expressed protein. Finally, we demonstrate that this method is reproducible, can be standardized, and can reliably and rapidly quantify transfection efficiency, reducing assay costs while increasing throughput and data robustness.

  9. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

    The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HT Condor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HT Condor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archiving, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.

  10. FLaapLUC: A pipeline for the generation of prompt alerts on transient Fermi-LAT γ-ray sources

    NASA Astrophysics Data System (ADS)

    Lenain, J.-P.

    2018-01-01

    The large majority of high energy sources detected with Fermi-LAT are blazars, which are known to be very variable sources. Because high-cadence, long-term monitoring simultaneously at different wavelengths is prohibitive, the study of their transient activities can help shed light on our understanding of these objects. The early detection of such potentially fast transient events is the key for triggering follow-up observations at other wavelengths. A Python tool, FLaapLUC, built on top of the Science Tools provided by the Fermi Science Support Center and the Fermi-LAT collaboration, has been developed using a simple aperture photometry approach. This tool can effectively detect relative flux variations in a set of predefined sources and alert potential users. Such alerts can then be used to trigger target of opportunity observations with other facilities. It is shown that FLaapLUC is an efficient tool to reveal transient events in Fermi-LAT data, providing quick results which can be used to promptly organise follow-up observations. Results from this simple aperture photometry method are also compared to full likelihood analyses. The FLaapLUC package is made available on GitHub and is open to contributions by the community.
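
    The record does not spell out FLaapLUC's exact trigger criterion; a hedged sketch of a generic alert rule on an aperture-photometry light curve (flag bins lying more than n_sigma above the weighted mean flux) follows:

      import numpy as np

      def flare_alert(flux, flux_err, n_sigma=3.0):
          """Indices of light-curve bins significantly above the mean flux."""
          mean = np.average(flux, weights=1.0 / flux_err**2)
          significance = (flux - mean) / flux_err
          return np.where(significance > n_sigma)[0]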

  11. Simple Biological Systems for Assessing the Activity of Superoxide Dismutase Mimics

    PubMed Central

    Tovmasyan, Artak; Reboucas, Julio S.

    2014-01-01

    Significance: Half a century of research has provided unambiguous proof that superoxide and the species derived from it, reactive oxygen species (ROS), play a central role in many diseases and degenerative processes. This has stimulated the search for pharmaceutical agents that are capable of preventing oxidative damage, and for methods of assessing their therapeutic potential. Recent Advances: The limitations of superoxide dismutase (SOD) as a therapeutic tool directed attention to small molecules, SOD mimics, that are capable of catalytically scavenging superoxide. Several groups of compounds, based on either metal complexes, including metalloporphyrins, metallocorroles, Mn(II) cyclic polyamines, and Mn(III) salen derivatives, or non-metal based compounds, such as fullerenes, nitrones, and nitroxides, have been developed and studied in vitro and in vivo. Very few have entered clinical trials. Critical Issues and Future Directions: Development of SOD mimics requires in-depth understanding of their mechanisms of biological action. Elucidation of both the molecular features essential for efficient ROS-scavenging in vivo and the factors limiting potential side effects requires biologically relevant and, at the same time, relatively simple testing systems. This review discusses the advantages and limitations of genetically engineered SOD-deficient unicellular organisms, Escherichia coli and Saccharomyces cerevisiae, as tools for investigating the efficacy and mechanisms of biological action of SOD mimics. These simple systems allow the scrutiny of the minimal requirements for a functional SOD mimic: the association of a high catalytic activity for superoxide dismutation, low toxicity, and efficient cellular uptake/biodistribution. Antioxid. Redox Signal. 20, 2416-2436. PMID:23964890

  12. LADES: a software for constructing and analyzing longitudinal designs in biomedical research.

    PubMed

    Vázquez-Alcocer, Alan; Garzón-Cortes, Daniel Ladislao; Sánchez-Casas, Rosa María

    2014-01-01

    One of the most important steps in biomedical longitudinal studies is choosing a good experimental design that can provide high accuracy in the analysis of results with a minimum sample size. Several methods for constructing efficient longitudinal designs have been developed based on power analysis and the statistical model used for analyzing the final results. However, development of this technology is not available to practitioners through user-friendly software. In this paper we introduce LADES (Longitudinal Analysis and Design of Experiments Software) as an alternative and easy-to-use tool for conducting longitudinal analysis and constructing efficient longitudinal designs. LADES incorporates methods for creating cost-efficient longitudinal designs, unequal longitudinal designs, and simple longitudinal designs. In addition, LADES includes different methods for analyzing longitudinal data such as linear mixed models, generalized estimating equations, among others. A study of European eels is reanalyzed in order to show LADES capabilities. Three treatments contained in three aquariums with five eels each were analyzed. Data were collected from 0 up to the 12th week post treatment for all the eels (complete design). The response under evaluation is sperm volume. A linear mixed model was fitted to the results using LADES. The complete design had a power of 88.7% using 15 eels. With LADES we propose the use of an unequal design with only 14 eels and 89.5% efficiency. LADES was developed as a powerful and simple tool to promote the use of statistical methods for analyzing and creating longitudinal experiments in biomedical research.

  13. Virus-Clip: a fast and memory-efficient viral integration site detection tool at single-base resolution with annotation capability.

    PubMed

    Ho, Daniel W H; Sze, Karen M F; Ng, Irene O L

    2015-08-28

    Viral integration into the human genome upon infection is an important risk factor for various human malignancies. We developed a viral integration site detection tool called Virus-Clip, which makes use of information extracted from soft-clipped sequencing reads to identify the exact positions of the human and virus breakpoints of integration events. With initial read alignment to the virus reference genome and streamlined procedures, Virus-Clip delivers a simple, fast and memory-efficient solution to viral integration site detection. Moreover, it can automatically annotate integration events with the corresponding affected human genes. Virus-Clip has been verified using whole-transcriptome sequencing data, and its detection was validated to have satisfactory sensitivity and specificity, with marked performance improvements over existing tools. It is applicable to versatile types of data including whole-genome sequencing, whole-transcriptome sequencing, and targeted sequencing. Virus-Clip is available at http://web.hku.hk/~dwhho/Virus-Clip.zip.
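
    The key signal here is a soft-clipped alignment: a read that matches the reference up to a breakpoint, with the remainder clipped. A sketch of locating the clip boundary from a SAM record's CIGAR string (plain parsing for illustration; Virus-Clip's own implementation details are not reproduced in this record):

      import re

      CIGAR_OP = re.compile(r"(\d+)([MIDNSHP=X])")

      def clip_breakpoint(pos, cigar):
          """Return (ref_breakpoint, clipped_side) for a soft-clipped
          alignment; pos is the 1-based leftmost mapping position."""
          ops = [(int(n), op) for n, op in CIGAR_OP.findall(cigar)]
          if ops and ops[0][1] == "S":      # clip on the left: breakpoint at pos
              return pos, "left"
          if ops and ops[-1][1] == "S":     # clip on the right: walk ref-consuming ops
              ref_len = sum(n for n, op in ops if op in "MDN=X")
              return pos + ref_len - 1, "right"
          return None, None

      print(clip_breakpoint(1000, "60M40S"))  # (1059, 'right')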

  14. Chitosan-microreactor: a versatile approach for heterogeneous organic synthesis in microfluidics.

    PubMed

    Basavaraju, K C; Sharma, Siddharth; Singh, Ajay K; Im, Do Jin; Kim, Dong-Pyo

    2014-07-01

    Microreactors have been proven to be efficient tools for a variety of homogeneous organic transformations due to their mixing efficiency, which results in very fast reactions, better heat and mass transfer, and simple scale-up. However, in heterogeneous catalytic reactions each catalyst needs an individual substrate as support. Herein, a versatile approach to immobilize metal catalysts on chitosan as a common substrate is presented. Chitosan, accommodating many metal catalysts, is grafted onto the microchannel surface as a nanobrush. The versatility, catalytic efficiency, and stability/durability of the microreactor are demonstrated for a number of organic transformations involving various metal compounds as catalysts. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Efficient production of a gene mutant cell line through integrating TALENs and high-throughput cell cloning.

    PubMed

    Sun, Changhong; Fan, Yu; Li, Juan; Wang, Gancheng; Zhang, Hanshuo; Xi, Jianzhong Jeff

    2015-02-01

    Transcription activator-like effectors (TALEs) are becoming powerful DNA-targeting tools in a variety of mammalian cells and model organisms. However, generating a stable cell line with specific gene mutations in a simple and rapid manner remains a challenging task. Here, we report a new method to efficiently produce monoclonal cells using integrated TALE nuclease technology and a series of high-throughput cell cloning approaches. Following this method, we obtained three mTOR mutant 293T cell lines within 2 months, which included one homozygous mutant line. © 2014 Society for Laboratory Automation and Screening.

  16. Image based method for aberration measurement of lithographic tools

    NASA Astrophysics Data System (ADS)

    Xu, Shuang; Tao, Bo; Guo, Yongxing; Li, Gongfa

    2018-01-01

    Information about the lens aberrations of lithographic tools is important, as they directly affect the intensity distribution in the image plane. Zernike polynomials are commonly used for a mathematical description of lens aberrations. Due to their lower cost and easier implementation, image-based measurement techniques have been widely used. Lithographic tools are typically partially coherent systems that can be described by a bilinear model, which entails time-consuming calculations and does not lend a simple and intuitive relationship between lens aberrations and the resulting images. Previous methods for retrieving lens aberrations in such partially coherent systems involve through-focus image measurements and time-consuming iterative algorithms. In this work, we propose a method for aberration measurement in lithographic tools which only requires measuring two images of the intensity distribution. Two linear formulations are derived in matrix form that directly relate the measured images to the unknown Zernike coefficients. Consequently, an efficient non-iterative solution is obtained.
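
    The record states that the method reduces to linear systems relating measured intensities to Zernike coefficients, without reproducing the derivation. As an illustration of that final solve step only, a least-squares inversion with numpy, where the sensitivity matrix and measurement vector are placeholders rather than the paper's actual formulations:

      import numpy as np

      # A: (n_pixels x n_zernike) sensitivity matrix from a linear model;
      # b: measured intensity data flattened to a vector. Both synthetic here.
      rng = np.random.default_rng(0)
      A = rng.normal(size=(500, 9))                  # placeholder sensitivities
      c_true = rng.normal(size=9)                    # "true" Zernike coefficients
      b = A @ c_true + 1e-3 * rng.normal(size=500)   # noisy measurement

      c_est, *_ = np.linalg.lstsq(A, b, rcond=None)  # non-iterative retrieval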

  17. Development of nonlinear acoustic propagation analysis tool toward realization of loud noise environment prediction in aeronautics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanamori, Masashi, E-mail: kanamori.masashi@jaxa.jp; Takahashi, Takashi, E-mail: takahashi.takashi@jaxa.jp; Aoyama, Takashi, E-mail: aoyama.takashi@jaxa.jp

    2015-10-28

    This paper introduces a prediction tool for the propagation of loud noise, with application to aeronautics in mind. The tool, named SPnoise, is based on the HOWARD approach, which can express almost exact multidimensionality of the diffraction effect at the cost of back scattering. In particular, the paper discusses the prediction of the effect of atmospheric turbulence on sonic boom as one of the important issues in aeronautics. Thanks to a simple and efficient model of atmospheric turbulence, SPnoise successfully re-creates the feature of this effect, which often emerges in the region just behind the front and rear shock waves in the sonic boom signature.

  18. Metrics for comparing neuronal tree shapes based on persistent homology.

    PubMed

    Li, Yanjie; Wang, Dingkang; Ascoli, Giorgio A; Mitra, Partha; Wang, Yusu

    2017-01-01

    As more and more neuroanatomical data are made available through efforts such as NeuroMorpho.Org and FlyCircuit.org, the need to develop computational tools to facilitate automatic knowledge discovery from such large datasets becomes more urgent. One fundamental question is how best to compare neuron structures, for instance to organize and classify large collections of neurons. We aim to develop a flexible yet powerful framework to support comparison and classification of large collections of neuron structures efficiently. Specifically we propose to use a topological persistence-based feature vectorization framework. Existing methods to vectorize a neuron (i.e., convert a neuron to a feature vector so as to support efficient comparison and/or searching) typically rely on statistics or summaries of morphometric information, such as the average or maximum local torque angle or partition asymmetry. These simple summaries have limited power in encoding global tree structures. Based on the concept of topological persistence recently developed in the field of computational topology, we vectorize each neuron structure into a simple yet informative summary. In particular, each type of information of interest can be represented as a descriptor function defined on the neuron tree, which is then mapped to a simple persistence-signature. Our framework can encode both local and global tree structure, as well as other information of interest (electrophysiological or dynamical measures), by considering multiple descriptor functions on the neuron. The resulting persistence-based signature is potentially more informative than simple statistical summaries (such as average/mean/max) of morphometric quantities. Indeed, we show that using a certain descriptor function will give a persistence-based signature containing strictly more information than the classical Sholl analysis. At the same time, our framework retains the efficiency associated with treating neurons as points in a simple Euclidean feature space, which would be important for constructing efficient searching or indexing structures over them. We present preliminary experimental results to demonstrate the effectiveness of our persistence-based neuronal feature vectorization framework.
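
    As a toy instance of the descriptor-function-to-signature step, the following computes 0-dimensional sublevel-set persistence of a descriptor sampled along a path graph (a single branch rather than a full neuron tree, an intentional simplification), using union-find and the elder rule; zero-persistence pairs (b, b) can be discarded:

      def sublevel_persistence(f):
          """(birth, death) pairs of connected components of sublevel sets
          of f on a path graph; the global minimum pairs with infinity."""
          parent, birth, pairs = {}, {}, []

          def find(i):
              while parent[i] != i:
                  parent[i] = parent[parent[i]]  # path compression
                  i = parent[i]
              return i

          for i in sorted(range(len(f)), key=lambda k: f[k]):
              parent[i], birth[i] = i, f[i]
              for j in (i - 1, i + 1):           # path-graph neighbours
                  if j not in parent:
                      continue
                  ri, rj = find(i), find(j)
                  if ri == rj:
                      continue
                  # elder rule: the component with the higher minimum dies
                  young, old = (ri, rj) if birth[ri] > birth[rj] else (rj, ri)
                  pairs.append((birth[young], f[i]))
                  parent[young] = old
          pairs.append((min(f), float("inf")))   # essential class
          return pairs

      print(sublevel_persistence([0.0, 2.0, 1.0, 3.0]))
      # [(2.0, 2.0), (1.0, 2.0), (3.0, 3.0), (0.0, inf)]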

  19. Metrics for comparing neuronal tree shapes based on persistent homology

    PubMed Central

    Li, Yanjie; Wang, Dingkang; Ascoli, Giorgio A.; Mitra, Partha

    2017-01-01

    As more and more neuroanatomical data are made available through efforts such as NeuroMorpho.Org and FlyCircuit.org, the need to develop computational tools to facilitate automatic knowledge discovery from such large datasets becomes more urgent. One fundamental question is how best to compare neuron structures, for instance to organize and classify large collections of neurons. We aim to develop a flexible yet powerful framework to support comparison and classification of large collections of neuron structures efficiently. Specifically we propose to use a topological persistence-based feature vectorization framework. Existing methods to vectorize a neuron (i.e., convert a neuron to a feature vector so as to support efficient comparison and/or searching) typically rely on statistics or summaries of morphometric information, such as the average or maximum local torque angle or partition asymmetry. These simple summaries have limited power in encoding global tree structures. Based on the concept of topological persistence recently developed in the field of computational topology, we vectorize each neuron structure into a simple yet informative summary. In particular, each type of information of interest can be represented as a descriptor function defined on the neuron tree, which is then mapped to a simple persistence-signature. Our framework can encode both local and global tree structure, as well as other information of interest (electrophysiological or dynamical measures), by considering multiple descriptor functions on the neuron. The resulting persistence-based signature is potentially more informative than simple statistical summaries (such as average/mean/max) of morphometric quantities. Indeed, we show that using a certain descriptor function will give a persistence-based signature containing strictly more information than the classical Sholl analysis. At the same time, our framework retains the efficiency associated with treating neurons as points in a simple Euclidean feature space, which would be important for constructing efficient searching or indexing structures over them. We present preliminary experimental results to demonstrate the effectiveness of our persistence-based neuronal feature vectorization framework. PMID:28809960

  20. Field Assessment of Energy Audit Tools for Retrofit Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, J.; Bohac, D.; Nelson, C.

    2013-07-01

    This project focused on the use of home energy ratings as a tool to promote energy retrofits in existing homes. A home energy rating provides a quantitative appraisal of a home's asset performance, usually compared to a benchmark such as the average energy use of similar homes in the same region. Home rating systems can help motivate homeowners in several ways. Ratings can clearly communicate a home's achievable energy efficiency potential, provide a quantitative assessment of energy savings after retrofits are completed, and show homeowners how they rate compared to their neighbors, thus creating an incentive to conform to a social standard. An important consideration is how rating tools for the retrofit market will integrate with existing home energy service programs. For residential programs that target energy savings only, home visits should be focused on key efficiency measures for that home. In order to gain wide adoption, a rating tool must be easily integrated into the field process, demonstrate consistency and reasonable accuracy to earn the trust of home energy technicians, and have a low monetary cost and time hurdle for homeowners. Along with the Home Energy Score, this project also evaluated the energy modeling performance of SIMPLE and REM/Rate.

  1. A Computationally Efficient Method for Polyphonic Pitch Estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Ruohua; Reiss, Joshua D.; Mattavelli, Marco; Zoia, Giorgio

    2009-12-01

    This paper presents a computationally efficient method for polyphonic pitch estimation. The method employs the Fast Resonator Time-Frequency Image (RTFI) as the basic time-frequency analysis tool. The approach is composed of two main stages. First, a preliminary pitch estimation is obtained by means of a simple peak-picking procedure in the pitch energy spectrum. This spectrum is calculated from the original RTFI energy spectrum according to harmonic grouping principles. Incorrect estimates are then removed using spectral irregularity and knowledge of the harmonic structures of the music notes played on commonly used music instruments. The new approach is compared with a variety of other frame-based polyphonic pitch estimation methods, and results demonstrate the high performance and computational efficiency of the approach.
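
    The RTFI front end is not reproduced in this record; below is a sketch of just the harmonic-grouping and peak-picking idea, starting from an ordinary magnitude spectrum rather than the RTFI, with all parameter choices illustrative:

      import numpy as np

      def pitch_energy_spectrum(mag, sr, n_fft, f0_grid, n_harm=5):
          """Sum spectral energy at the first n_harm harmonics of each
          candidate fundamental frequency (harmonic grouping)."""
          pes = np.zeros(len(f0_grid))
          for k, f0 in enumerate(f0_grid):
              for h in range(1, n_harm + 1):
                  b = int(round(h * f0 * n_fft / sr))
                  if b < len(mag):
                      pes[k] += mag[b] ** 2
          return pes

      # Preliminary polyphonic estimates are then the peaks of this
      # spectrum, e.g. via scipy.signal.find_peaks with a height threshold.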

  2. Is your ribozyme design really correct?: A proposal of simple single turnover competition assay to evaluate ribozymes.

    PubMed

    Tanaka, T; Inui, O; Dohi, N; Okada, N; Okada, H; Kikuchi, Y

    2001-07-01

    Today, many nucleic acid enzymes are used in gene therapy and gene regulation. However, no simple assay method for evaluating the enzymatic activities on which enzyme designs are judged has been reported. Here, we propose a new, simple competition assay for nucleic acid enzymes of different types to evaluate their efficiency in cleaving a target RNA molecule whose recognition sites differ but overlap. Two nucleic acid enzymes are added to one tube so that they compete for a single substrate. The assay was applied to two ribozymes, a hammerhead ribozyme and a hairpin ribozyme, and a DNA enzyme. We found that this assay method is applicable to such enzymes as a powerful tool for selecting and designing RNA-cleaving enzymes.

  3. Two Simple and Efficient Algorithms to Compute the SP-Score Objective Function of a Multiple Sequence Alignment.

    PubMed

    Ranwez, Vincent

    2016-01-01

    Multiple sequence alignment (MSA) is a crucial step in many molecular analyses and many MSA tools have been developed. Most of them use a greedy approach to construct a first alignment that is then refined by optimizing the sum-of-pairs score (SP-score). SP-score estimation is thus a bottleneck for most MSA tools, since it is repeatedly required and is time consuming. Given an alignment of n sequences and L sites, I introduce here optimized solutions that reach O(nL) time complexity for affine gap costs, instead of O(n²L), and are easy to implement.
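
    The affine-gap algorithms themselves are not given in this record, but the counting idea behind the O(nL) bound can be shown for substitution scores alone: per column, residue counts let all n(n-1)/2 pairs be scored in O(sigma^2) for an alphabet of size sigma, rather than O(n^2). A sketch under that simplification:

      from collections import Counter
      from itertools import combinations_with_replacement

      def sp_score(alignment, score):
          """SP-score of an MSA (list of equal-length strings). `score` maps
          frozenset residue pairs to values; include '-' entries to treat
          gaps with a simple linear cost (affine costs need Ranwez's full
          algorithm)."""
          total = 0
          for column in zip(*alignment):            # the L columns
              counts = Counter(column)
              for (a, na), (b, nb) in combinations_with_replacement(
                      counts.items(), 2):
                  n_pairs = na * (na - 1) // 2 if a == b else na * nb
                  total += n_pairs * score[frozenset((a, b))]
          return total

      score = {frozenset("AA"): 2, frozenset("CC"): 2, frozenset("AC"): -1}
      print(sp_score(["AAC", "ACC", "AAC"], score))  # 12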

  4. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    PubMed

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  5. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  6. Oligomerization triggered by foldon: a simple method to enhance the catalytic efficiency of lichenase and xylanase.

    PubMed

    Wang, Xinzhe; Ge, Huihua; Zhang, Dandan; Wu, Shuyu; Zhang, Guangya

    2017-07-03

    Effective and simple methods that lead to higher enzymatic efficiencies are highly sought. Here we propose a foldon-triggered trimerization of target enzymes with significantly improved catalytic performance, achieved by fusing a foldon domain at the C-terminus of the enzymes via elastin-like polypeptides (ELPs). The foldon domain comprises 27 residues and forms trimers with high stability. Lichenase and xylanase can hydrolyze lichenan and xylan to produce value-added products and biofuels, and they have great potential as biotechnological tools in various industrial applications. We took them as examples and compared the kinetic parameters of the engineered trimeric enzymes to those of the monomeric and wild-type ones. Compared with the monomeric enzymes, the catalytic efficiency (kcat/Km) of the trimeric lichenase and xylanase increased 4.2- and 3.9-fold, respectively. The catalytic constant (kcat) of the trimeric lichenase and xylanase increased 1.8- and 5.0-fold over their wild-type counterparts. The specific activities of the trimeric lichenase and xylanase also increased by 149% and 94% relative to the monomeric ones. In addition, the recovery of lichenase and xylanase activities increased by 12.4% and 6.1% during purification using ELPs as the non-chromatographic tag, possibly because the foldon domain reduces the transition temperature of the ELPs. The trimeric lichenase and xylanase induced by foldon thus have advantages in catalytic performance and are easier to purify, with higher purification folds and smaller activity losses than their monomeric counterparts. Trimerization of target enzymes triggered by the foldon domain can improve their activities and facilitate their purification, representing a simple and effective enzyme-engineering tool with exciting potential at both industrial and laboratory scales.

  7. Future Automotive Systems Technology Simulator (FASTSim)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    An advanced vehicle powertrain systems analysis tool, the Future Automotive Systems Technology Simulator (FASTSim) provides a simple way to compare powertrains and estimate the impact of technology improvements on light-, medium- and heavy-duty vehicle efficiency, performance, cost, and battery life. Created by the National Renewable Energy Laboratory, FASTSim accommodates a range of vehicle types - including conventional vehicles, electric-drive vehicles, and fuel cell vehicles - and is available for free download in Microsoft Excel and Python formats.

  8. REopt Improves the Operations of Alcatraz's Solar PV-Battery-Diesel Hybrid System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olis, Daniel R; Walker, H. A; Van Geet, Otto D

    This poster identifies operations improvement strategies for a photovoltaic (PV)-battery-diesel hybrid system at the National Park Service's Alcatraz Island using NREL's REopt analysis tool. The current 'cycle charging' strategy results in significant curtailment of energy production from the PV array and requires excessive diesel use, while also incurring high battery wear without the benefit of improved efficiency. A simple 'load following' strategy yields near-optimal operating cost reduction.
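
    A minimal sketch of the difference between the two dispatch rules, under simplifying assumptions (single generator, PV serves the load first); the names and logic are illustrative, not REopt's implementation:

      def diesel_output_kw(net_load_kw, gen_rated_kw, strategy="load_following"):
          """net_load_kw: load minus PV output (negative when PV exceeds
          load). Returns diesel generator output in kW."""
          if strategy == "cycle_charging":
              # run at rated power whenever needed; surplus charges batteries
              return gen_rated_kw if net_load_kw > 0 else 0.0
          # load following: track the net load, never charge from diesel
          return min(max(net_load_kw, 0.0), gen_rated_kw)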

  9. Optimization of spine surgery planning with 3D image templating tools

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Huddleston, Paul M.; Holmes, David R., III; Shridharani, Shyam M.; Robb, Richard A.

    2008-03-01

    The current standard of care for patients with spinal disorders involves a thorough clinical history, physical exam, and imaging studies. Simple radiographs provide a valuable assessment but prove inadequate for surgery planning because of the complex 3-dimensional anatomy of the spinal column and the close proximity of the neural elements, large blood vessels, and viscera. Currently, clinicians still use primitive techniques such as paper cutouts, pencils, and markers in an attempt to analyze and plan surgical procedures. 3D imaging studies are routinely ordered prior to spine surgeries but are currently limited to generating simple, linear and angular measurements from 2D views orthogonal to the central axis of the patient. Complex spinal corrections require more accurate and precise calculation of 3D parameters such as oblique lengths, angles, levers, and pivot points within individual vertebra. We have developed a clinician friendly spine surgery planning tool which incorporates rapid oblique reformatting of each individual vertebra, followed by interactive templating for 3D placement of implants. The template placement is guided by the simultaneous representation of multiple 2D section views from reformatted orthogonal views and a 3D rendering of individual or multiple vertebrae enabling superimposition of virtual implants. These tools run efficiently on desktop PCs typically found in clinician offices or workrooms. A preliminary study conducted with Mayo Clinic spine surgeons using several actual cases suggests significantly improved accuracy of pre-operative measurements and implant localization, which is expected to increase spinal procedure efficiency and safety, and reduce time and cost of the operation.

  10. Bio-jETI: a service integration, design, and provisioning platform for orchestrated bioinformatics processes.

    PubMed

    Margaria, Tiziana; Kubczak, Christian; Steffen, Bernhard

    2008-04-25

    With Bio-jETI, we introduce a service platform for interdisciplinary work on biological application domains and illustrate its use in a concrete application concerning statistical data processing in R and xcms for an LC/MS analysis of FAAH gene knockout. Bio-jETI uses the jABC environment for service-oriented modeling and design as a graphical process modeling tool and the jETI service integration technology for remote tool execution. As a service definition and provisioning platform, Bio-jETI has the potential to become a core technology in interdisciplinary service orchestration and technology transfer. Domain experts, like biologists not trained in computer science, directly define complex service orchestrations as process models and use efficient and complex bioinformatics tools in a simple and intuitive way.

  11. Optimized theory for simple and molecular fluids.

    PubMed

    Marucho, M; Montgomery Pettitt, B

    2007-03-28

    An optimized closure approximation for both simple and molecular fluids is presented. A smooth interpolation between Percus-Yevick and hypernetted chain closures is optimized by minimizing the free energy self-consistently with respect to the interpolation parameter(s). The molecular version is derived from a refinement of the method for simple fluids. In doing so, a method is proposed which appropriately couples an optimized closure with the variant of the diagrammatically proper integral equation recently introduced by this laboratory [K. M. Dyer et al., J. Chem. Phys. 123, 204512 (2005)]. The simplicity of the expressions involved in this proposed theory has allowed the authors to obtain an analytic expression for the approximate excess chemical potential. This is shown to be an efficient tool to estimate, from first principles, the numerical value of the interpolation parameters defining the aforementioned closure. As a preliminary test, representative models for simple fluids and homonuclear diatomic Lennard-Jones fluids were analyzed, obtaining site-site correlation functions in excellent agreement with simulation data.
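
    For orientation, the two limiting closures being interpolated can be written in a standard form, with h(r) the total and c(r) the direct correlation function; the schematic one-parameter interpolation below follows the common pattern, though the paper's actual mixing function and self-consistency condition may differ:

      g_{\mathrm{PY}}(r) = e^{-\beta u(r)} \left[ 1 + h(r) - c(r) \right]
      g_{\mathrm{HNC}}(r) = \exp\!\left[ -\beta u(r) + h(r) - c(r) \right]
      g_{\alpha}(r) = (1-\alpha)\, g_{\mathrm{PY}}(r) + \alpha\, g_{\mathrm{HNC}}(r), \qquad 0 \le \alpha \le 1

    with the interpolation parameter fixed by minimizing the free energy self-consistently rather than being set by hand.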

  12. A human beta cell line with drug inducible excision of immortalizing transgenes

    PubMed Central

    Benazra, Marion; Lecomte, Marie-José; Colace, Claire; Müller, Andreas; Machado, Cécile; Pechberty, Severine; Bricout-Neveu, Emilie; Grenier-Godard, Maud; Solimena, Michele; Scharfmann, Raphaël; Czernichow, Paul; Ravassard, Philippe

    2015-01-01

    Objectives Access to immortalized human pancreatic beta cell lines that are phenotypically close to genuine adult beta cells represents a major tool for better understanding human beta cell physiology and developing new therapeutics for diabetes. Here we derived a new conditionally immortalized human beta cell line, EndoC-βH3, in which the immortalizing transgenes can be efficiently removed by simple addition of tamoxifen. Methods We used lentiviral-mediated gene transfer to stably integrate a tamoxifen-inducible form of CRE (CRE-ERT2) into the recently developed conditionally immortalized EndoC-βH2 line. The resulting EndoC-βH3 line was characterized before and after tamoxifen treatment for cell proliferation, insulin content and insulin secretion. Results We showed that EndoC-βH3 cells expressing CRE-ERT2 can be massively amplified in culture. We established an optimized tamoxifen treatment to efficiently excise the immortalizing transgenes, resulting in proliferation arrest. In addition, insulin expression rose 12-fold and insulin content increased 23-fold, reaching 2 μg of insulin per million cells. This massive increase was accompanied by enhanced insulin secretion upon glucose stimulation. We further observed that tamoxifen-treated cells maintained stable function for 5 weeks in culture. Conclusions The EndoC-βH3 cell line represents a powerful tool that allows, using a simple and efficient procedure, the massive production of functional non-proliferative human beta cells. Such cells are close to genuine human beta cells and maintain a stable phenotype for 5 weeks in culture. PMID:26909308

  13. EasyModeller: A graphical interface to MODELLER

    PubMed Central

    2010-01-01

    Background MODELLER is a program for automated protein homology modeling. It is one of the most widely used tools for homology or comparative modeling of protein three-dimensional structures, but most users find it difficult to start with MODELLER as it is command-line based and requires knowledge of basic Python scripting to use efficiently. Findings The study was designed with the aim of developing "EasyModeller" as a frontend graphical interface to MODELLER using Perl/Tk, which can be used as a standalone tool on the Windows platform with MODELLER and Python preinstalled. It helps inexperienced users to perform modeling, assessment, visualization, and optimization of protein models in a simple and straightforward way. Conclusion EasyModeller provides a straightforward graphical interface and functions as a stand-alone tool which can be used on a standard personal computer with Microsoft Windows as the operating system. PMID:20712861
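
    The command-line workflow that EasyModeller wraps is typically a short Python script run against MODELLER; a minimal sketch of classic MODELLER usage is shown below (the alignment file, template code, and sequence name are placeholders, and class names may differ across MODELLER releases):

      # Minimal comparative-modeling script of the kind EasyModeller generates.
      from modeller import environ
      from modeller.automodel import automodel

      env = environ()                                 # MODELLER environment
      a = automodel(env,
                    alnfile='target_template.ali',    # alignment file (placeholder)
                    knowns='template_pdb',            # template structure code
                    sequence='target_seq')            # target sequence name
      a.starting_model = 1
      a.ending_model = 5                              # build five candidate models
      a.make()                                        # run; rank models by molpdf/DOPE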

  14. The development and evaluation of a new coding system for medical records.

    PubMed

    Papazissis, Elias

    2014-01-01

    The present study aims to develop a simple, reliable and easy-to-use tool enabling clinicians to codify the major part of individualized medical details (patient history and findings of physical examination) quickly and easily in routine medical practice, by entering data into a purpose-built software application using structured data elements and detailed medical illustrations. We studied the medical records of 9,320 patients and extracted individualized medical details. We recorded the majority of symptoms and the majority of findings of physical examination into the system, which was named IMPACT® (Intelligent Medical Patient Record and Coding Tool). Subsequently the system was evaluated by clinicians, based on the examination of 1,206 patients. The evaluation results showed that IMPACT® is an efficient tool, easy to use even under time-pressing conditions. IMPACT® seems to be a promising tool for illustration-guided, structured data entry of medical narrative in electronic patient records.

  15. Microwave: An Important and Efficient Tool for the Synthesis of Biological Potent Organic Compounds.

    PubMed

    Kumari, Kamlesh; Vishvakarma, Vijay K; Singh, Prashant; Patel, Rajan; Chandra, Ramesh

    2017-01-01

    Green chemistry is an interdisciplinary science, or it can also be described as a branch of chemistry. It is generally described as chemistry that aims to synthesize chemical compounds while trimming down the utilization of harmful chemicals, as proposed by the Environmental Protection Agency (EPA). Recently, the goal of academics, researchers and industrialists has been to generate greener and more efficient methodologies for carrying out various organic syntheses. In the present scenario, green chemistry uses raw materials economically, minimizes waste and avoids harmful or hazardous chemicals to make organic reactions simple and efficient. The microwave technique is a new, simple and efficient technology which opens new prospects for chemists to carry out various organic and inorganic reactions that are difficult via conventional methodology. It is used to shorten the time needed for various organic transformations while giving maximum yield, minimum by-products, minimum energy utilization, less manpower, etc. Various well-known organic reactions have been carried out by various research groups using microwaves, such as the Aldol condensation, Knoevenagel condensation, Beckmann rearrangement, Vilsmeier reaction, Perkin reaction, benzil-benzilic acid rearrangement, Fischer cyclization, Mannich reaction and Claisen-Schmidt condensation. Further, reduction, oxidation, coupling and condensation reactions have also been performed using microwave technology. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  16. Test of the efficiency of three storm water quality models with a rich set of data.

    PubMed

    Ahyerre, M; Henry, F O; Gogien, F; Chabanel, M; Zug, M; Renaudet, D

    2005-01-01

    The objective of this article is to test the efficiency of three different storm water quality models (SWQMs) on the same data set (34 rain events, SS measurements) sampled on a 42 ha watershed in the center of Paris. The models were calibrated at the scale of the rain event. Considering the mass of pollution calculated per event, the results of the models are satisfactory, but they are of the same order of magnitude as a simple hydraulic approach combined with a constant concentration. Next, the mass of pollutant at the outlet of the catchment was calculated at the global scale of the 34 events. This approach shows that the simple hydraulic calculation gives better results than the SWQMs. Finally, the pollutographs are analysed, showing that storm water quality models are interesting tools for representing the shape of the pollutographs and the dynamics of the phenomenon, which can be useful in some projects for managers.

  17. An Efficient Method for Generation of Knockout Human Embryonic Stem Cells Using CRISPR/Cas9 System.

    PubMed

    Bohaciakova, Dasa; Renzova, Tereza; Fedorova, Veronika; Barak, Martin; Kunova Bosakova, Michaela; Hampl, Ales; Cajanek, Lukas

    2017-11-01

    Human embryonic stem cells (hESCs) represent a promising tool to study the functions of genes during development, to model diseases, and even to develop therapies when combined with gene editing techniques such as the CRISPR/CRISPR-associated protein-9 nuclease (Cas9) system. However, the process of disrupting gene expression by generating null alleles is often inefficient and tedious. To circumvent these limitations, we developed a simple and efficient protocol to permanently downregulate expression of a gene of interest in hESCs using CRISPR/Cas9. We selected p53 for our proof-of-concept experiments. The methodology is based on a series of hESC transfections, which leads to efficient downregulation of p53 expression even in a polyclonal population (p53 Low cells), proven here by a loss of regulation of the expression of the p53 target gene, microRNA miR-34a. We demonstrate that our approach achieves over 80% efficiency in generating hESC clonal sublines that do not express p53 protein. Importantly, we document by a set of functional experiments that such genetically modified hESCs do retain typical stem cell characteristics. In summary, we provide a simple and robust protocol to efficiently target expression of a gene of interest in hESCs that can be useful for laboratories aiming to employ gene editing in their hESC applications/protocols.

  18. Big Data Tools as Applied to ATLAS Event Data

    NASA Astrophysics Data System (ADS)

    Vukotic, I.; Gardner, R. W.; Bryant, L. A.

    2017-10-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and the associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and tools like Spark, Jupyter, R, SciPy, Caffe, TensorFlow, etc. Machine learning challenges such as the Higgs Boson Machine Learning Challenge, the Tracking challenge, event viewers (VP1, ATLANTIS, ATLASrift), and still-to-be-developed educational and outreach tools would be able to access the data through a simple REST API. In this preliminary investigation we focus on derived xAOD data sets. These are much smaller than the primary xAODs, containing only the containers, variables, and events of interest to a particular analysis. Encouraged by the performance of Elasticsearch for the ADC analytics platform, we developed an algorithm for indexing derived xAOD event data. We have made an appropriate document mapping and have imported a full set of Standard Model W/Z datasets. We compare the disk space efficiency of this approach to that of standard ROOT files and its performance in simple cut-flow data analysis, and will present preliminary results on its scaling characteristics with different numbers of clients, query complexity, and size of the data retrieved.
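
    Indexing event-level data in this way amounts to mapping each event onto a JSON document; a toy sketch with the standard elasticsearch-py client is given below (the index name, field layout, and values are invented and are not the actual ATLAS mapping):

      # Toy example: store one event as a document, then run a cut-flow style query.
      from elasticsearch import Elasticsearch

      es = Elasticsearch("http://localhost:9200")     # assumed local cluster
      event = {
          "run_number": 284500,                       # invented values
          "event_number": 1234567,
          "leptons": [{"pt": 42.1, "eta": -0.7, "phi": 1.9}],
          "met": 35.2,
      }
      es.index(index="xaod-demo", document=event)
      es.indices.refresh(index="xaod-demo")           # make the document searchable
      hits = es.search(index="xaod-demo",
                       query={"range": {"met": {"gte": 30.0}}})
      print(hits["hits"]["total"])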

  19. Sex chromosomal abnormalities associated with equine infertility: validation of a simple molecular screening tool in the Purebred Spanish Horse.

    PubMed

    Anaya, G; Molina, A; Valera, M; Moreno-Millán, M; Azor, P; Peral-García, P; Demyda-Peyrás, S

    2017-08-01

    Chromosomal abnormalities in the sex chromosome pair (ECAX and ECAY) are widely associated with reproductive problems in horses. However, a large proportion of these abnormalities remains undiagnosed due to the lack of an affordable diagnostic tool that avoids karyotyping tests. Here, we developed an STR (short tandem repeat)-based molecular method to determine the presence of the main sex chromosomal abnormalities in horses in a fast, cheap and reliable way. The frequency of five ECAX-linked (LEX026, LEX003, TKY38, TKY270 and UCDEQ502) and two ECAY-linked (EcaYH12 and SRY) markers was characterized in 261 Purebred Spanish Horses to determine the efficiency of the methodology as a chromosomal diagnostic tool. All the microsatellites analyzed were highly polymorphic, with a sizeable number of alleles (polymorphic information content > 0.5). Based on this variability, the methodology showed 100% sensitivity and 99.82% specificity in detecting the most important sex chromosomal abnormalities reported in horses (chimerism, Turner's syndrome and sex reversal syndromes). The method was also validated with 100% efficiency in 10 individuals previously diagnosed as chromosomally aberrant. This STR screening panel is an efficient and reliable molecular-cytogenetic tool for the early detection of sex chromosomal abnormalities in equines that could be included in breeding programs to save money, effort and time for veterinary practitioners and breeders. © 2017 Stichting International Foundation for Animal Genetics.
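
    The informativeness threshold quoted above refers to the polymorphic information content (PIC), which for a locus with n alleles of frequencies p_i has the standard definition (Botstein et al.):

      \mathrm{PIC} = 1 - \sum_{i=1}^{n} p_i^{2} \;-\; \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} 2\, p_i^{2}\, p_j^{2}

    so PIC > 0.5 marks a marker conventionally regarded as highly informative.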

  20. Highly selective rhodium catalyzed domino C-H activation/cyclizations.

    PubMed

    Trans, Duc N; Cramer, Nicolai

    2011-01-01

    The direct functionalization of carbon-hydrogen bonds is an emerging tool for establishing more sustainable and efficient synthetic methods. We present its implementation in a cascade reaction that provides a rapid assembly of functionalized indanylamines from simple and readily available starting materials. Careful choice of the ancillary ligand, an electron-rich bidentate phosphine, enables highly diastereoselective rhodium(I)-catalyzed intramolecular allylations of unsubstituted ketimines induced by a directed C-H bond activation and allene carbometalation sequence.

  1. Printing method for organic light emitting device lighting

    NASA Astrophysics Data System (ADS)

    Ki, Hyun Chul; Kim, Seon Hoon; Kim, Doo-Gun; Kim, Tae-Un; Kim, Snag-Gi; Hong, Kyung-Jin; So, Soon-Yeol

    2013-03-01

    An organic light-emitting device (OLED) converts electrical energy into light when an electric field is applied to the organic material. OLEDs are currently employed as light sources for lighting because research has extensively progressed in improving luminance, efficiency, and lifetime. OLEDs are widely used in flat-panel displays because of their simple manufacturing process and high emission efficiency. However, most OLED lighting projects have used vacuum (thermal) evaporation of low-molecular-weight materials. Although the printing method yields lower OLED efficiency and lifetime than vacuum evaporation, printed-OLED projects are actively being pursued because printing can be combined with flexible substrates. Printing technologies include ink-jet, screen printing and slot coating. These printing methods allow low-cost, mass-production techniques and large substrates. In this research, we have adopted inkjet printing for organic light-emitting devices as the method of thick-film deposition because of its low cost and simple processing. The fabrication of a passive-matrix OLED is achieved by inkjet printing, using a polymer phosphorescent ink. We measured the optical and electrical characteristics of the OLED.

  2. Simple Tools to Facilitate Project Management of a Nursing Research Project.

    PubMed

    Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret

    2016-07-01

    Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.

  3. SURE reliability analysis: Program and mathematics

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; White, Allan L.

    1988-01-01

    The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The computational methods on which the program is based provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  4. Design and implementation of an efficient single layer five input majority voter gate in quantum-dot cellular automata.

    PubMed

    Bahar, Ali Newaz; Waheed, Sajjad

    2016-01-01

    The fundamental logical element of a quantum-dot cellular automata (QCA) circuit is the majority voter gate (MV). The efficiency of a QCA circuit depends on the efficiency of the MV. This paper presents an efficient single-layer five-input majority voter gate (MV5). The structure of the proposed MV5 is very simple and easy to implement in any logical circuit. The proposed MV5 reduces the number of cells and uses conventional QCA cells. Using MV5, a multilayer 1-bit full-adder (FA) is designed. The functional accuracy of the proposed MV5 and FA is confirmed by QCADesigner, a well-known QCA layout design and verification tool. Furthermore, the power dissipation of the proposed circuits is estimated, which shows that these circuits dissipate an extremely small amount of energy and are suitable for reversible computing. The simulation outcomes demonstrate the superiority of the proposed circuit.
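
    Behaviorally, a five-input majority gate outputs 1 when at least three inputs are 1, and a full adder can be assembled from majority gates plus inverters; a quick logic-level reference in Python (this models function only, not the QCA cell layout, and the MV5-based sum expression shown is one common construction):

      def maj5(a, b, c, d, e):
          """Five-input majority: 1 if three or more inputs are 1."""
          return int(a + b + c + d + e >= 3)

      def maj3(a, b, c):
          return int(a + b + c >= 2)

      def full_adder(a, b, cin):
          """1-bit full adder from majority gates and NOT."""
          carry = maj3(a, b, cin)
          s = maj5(a, b, cin, 1 - carry, 1 - carry)   # sum via one MV5
          return s, carry

      for bits in range(8):                           # exhaustive sanity check
          a, b, cin = (bits >> 2) & 1, (bits >> 1) & 1, bits & 1
          s, c = full_adder(a, b, cin)
          assert 2 * c + s == a + b + cin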

  5. Plant genome and transcriptome annotations: from misconceptions to simple solutions

    PubMed Central

    Bolger, Marie E; Arsova, Borjana; Usadel, Björn

    2018-01-01

    Abstract Next-generation sequencing has triggered an explosion of available genomic and transcriptomic resources in the plant sciences. Although genome and transcriptome sequencing has become orders of magnitude cheaper and more efficient, the functional annotation process often lags behind. This may be hampered by the lack of a comprehensive enumeration of simple-to-use tools available to the plant researcher. In this comprehensive review, we present (i) typical ontologies to be used in the plant sciences, (ii) useful databases and resources used for functional annotation, (iii) what to expect from an annotated plant genome, (iv) an automated annotation pipeline and (v) a recipe and reference chart outlining typical steps used to annotate plant genomes/transcriptomes using publicly available resources. PMID:28062412

  6. Technical Note: Harmonizing met-ocean model data via standard web services within small research groups

    USGS Publications Warehouse

    Signell, Richard; Camossi, E.

    2016-01-01

    Work over the last decade has resulted in standardised web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using free technologies that are easy to deploy and configure; and (3) providing standardised, service-based tools that work in existing research environments. We present a simple, local brokering approach that lets modellers continue to use their existing files and tools, while serving virtual data sets that can be used with standardised tools. The goal of this paper is to convince modellers that a standardised framework is not only useful but can be implemented with modest effort using free software components. We use NetCDF Markup language for data aggregation and standardisation, the THREDDS Data Server for data delivery, pycsw for data search, NCTOOLBOX (MATLAB®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.
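
    On the consumer side, the payoff is that a standard client can open the aggregated virtual dataset directly from its service URL; a minimal sketch using the Iris client named in the paper (the OPeNDAP endpoint and the CF standard name are placeholders):

      # Open a THREDDS-served aggregation via OPeNDAP and pull one variable.
      import iris

      url = "http://server.example.org/thredds/dodsC/model/agg.ncml"  # placeholder
      cubes = iris.load(url)          # lazily loads CF metadata, not the whole dataset
      print(cubes)                    # list the available variables
      sst = iris.load_cube(url, "sea_surface_temperature")  # assumed standard name
      print(sst.shape, sst.units)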

  7. Technical note: Harmonising metocean model data via standard web services within small research groups

    NASA Astrophysics Data System (ADS)

    Signell, Richard P.; Camossi, Elena

    2016-05-01

    Work over the last decade has resulted in standardised web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using free technologies that are easy to deploy and configure; and (3) providing standardised, service-based tools that work in existing research environments. We present a simple, local brokering approach that lets modellers continue to use their existing files and tools, while serving virtual data sets that can be used with standardised tools. The goal of this paper is to convince modellers that a standardised framework is not only useful but can be implemented with modest effort using free software components. We use NetCDF Markup language for data aggregation and standardisation, the THREDDS Data Server for data delivery, pycsw for data search, NCTOOLBOX (MATLAB®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.

  8. Bio-jETI: a service integration, design, and provisioning platform for orchestrated bioinformatics processes

    PubMed Central

    Margaria, Tiziana; Kubczak, Christian; Steffen, Bernhard

    2008-01-01

    Background With Bio-jETI, we introduce a service platform for interdisciplinary work on biological application domains and illustrate its use in a concrete application concerning statistical data processing in R and xcms for an LC/MS analysis of FAAH gene knockout. Methods Bio-jETI uses the jABC environment for service-oriented modeling and design as a graphical process modeling tool and the jETI service integration technology for remote tool execution. Conclusions As a service definition and provisioning platform, Bio-jETI has the potential to become a core technology in interdisciplinary service orchestration and technology transfer. Domain experts, like biologists not trained in computer science, directly define complex service orchestrations as process models and use efficient and complex bioinformatics tools in a simple and intuitive way. PMID:18460173

  9. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities, formulated as an optimization problem, have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  10. [Advances in CRISPR-Cas-mediated genome editing system in plants].

    PubMed

    Wang, Chun; Wang, Kejian

    2017-10-25

    Targeted genome editing technology is an important tool for studying the function of genes and for modifying organisms at the genetic level. Recently, the CRISPR-Cas (clustered regularly interspaced short palindromic repeats and CRISPR-associated proteins) system has emerged as an efficient tool for specific genome editing in animals and plants. The CRISPR-Cas system uses a CRISPR-associated endonuclease and a guide RNA to generate double-strand breaks at the target DNA site, subsequently leading to genetic modifications. The CRISPR-Cas system has received widespread attention because it manipulates genomes simply, easily and with high specificity. This review summarizes recent advances in the diverse applications of the CRISPR-Cas toolkit in plant research and crop breeding, including expanding the range of genome editing, precise editing of a target base, and efficient DNA-free genome editing technology. This review also discusses potential challenges and application prospects for the future, and provides a useful reference for researchers who are interested in this field.

  11. Parametric Study of Biconic Re-Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Steele, Bryan; Banks, Daniel W.; Whitmore, Stephen A.

    2007-01-01

    An optimization based on hypersonic aerodynamic performance and volumetric efficiency was accomplished for a range of biconic configurations. Both axisymmetric and quasi-axisymmetric geometries (bent and flattened) were analyzed. The aerodynamic optimization was based on hypersonic simple incidence-angle analysis tools. The range of configurations included those suitable for a lunar return trajectory with a lifting aerocapture at Earth and an overall volume that could support a nominal crew. The results yielded five configurations that had acceptable aerodynamic performance and met overall geometry and size limitations.
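
    Simple incidence-angle (Newtonian impact) analysis estimates the surface pressure from the local angle between the freestream and each surface panel; in its classical form

      C_p = 2 \sin^{2}\theta \quad \text{(windward panels)}, \qquad C_p = 0 \quad \text{(shadowed panels)}

    where theta is the local surface inclination to the freestream; summing panel forces gives lift and drag, and hence the hypersonic L/D used in the optimization.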

  12. [Tools to assess the impact on health of public health programmes and community interventions from an equity perspective].

    PubMed

    Suárez Álvarez, Óscar; Fernández-Feito, Ana; Vallina Crespo, Henar; Aldasoro Unamuno, Elena; Cofiño, Rafael

    2018-05-11

    It is essential to develop a comprehensive approach to institutionally promoted interventions to assess their impact on health from the perspective of the social determinants of health and equity. Simple, adapted tools must be developed to carry out these assessments. The aim of this paper is to present two tools to assess the impact of programmes and community-based interventions on the social determinants of health. The first tool is intended to assess health programmes through interviews and analysis of information provided by the assessment team. The second tool, by means of online assessments of community-based interventions, also enables a report on inequality issues that includes recommendations for improvement. In addition to reducing health-related social inequities, the implementation of these tools can also help to improve the efficiency of public health interventions. Copyright © 2018 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.

  13. Adaptive reduction of constitutive model-form error using a posteriori error estimation techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bishop, Joseph E.; Brown, Judith Alice

    In engineering practice, models are typically kept as simple as possible for ease of setup and use, computational efficiency, maintenance, and overall reduced complexity to achieve robustness. In solid mechanics, a simple and efficient constitutive model may be favored over one that is more predictive, but is difficult to parameterize, is computationally expensive, or is simply not available within a simulation tool. In order to quantify the modeling error due to the choice of a relatively simple and less predictive constitutive model, we adopt the use of a posteriori model-form error-estimation techniques. Based on local error indicators in the energy norm, an algorithm is developed for reducing the modeling error by spatially adapting the material parameters in the simpler constitutive model. The resulting material parameters are not material properties per se, but depend on the given boundary-value problem. As a first step to the more general nonlinear case, we focus here on linear elasticity in which the “complex” constitutive model is general anisotropic elasticity and the chosen simpler model is isotropic elasticity. As a result, the algorithm for adaptive error reduction is demonstrated using two examples: (1) A transversely-isotropic plate with hole subjected to tension, and (2) a transversely-isotropic tube with two side holes subjected to torsion.
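
    The energy norm referred to above is the standard one for linear elasticity; schematically, for the difference e between the displacement fields of the complex (anisotropic) and simplified (isotropic) models, with strain epsilon and elasticity tensor C (common usage, not necessarily the paper's exact notation):

      \|e\|_{E}^{2} = \int_{\Omega} \varepsilon(e) : \mathbb{C} : \varepsilon(e) \, d\Omega

    Local contributions to this integral serve as the error indicators that drive where the isotropic parameters are adapted.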

  14. Adaptive reduction of constitutive model-form error using a posteriori error estimation techniques

    DOE PAGES

    Bishop, Joseph E.; Brown, Judith Alice

    2018-06-15

    In engineering practice, models are typically kept as simple as possible for ease of setup and use, computational efficiency, maintenance, and overall reduced complexity to achieve robustness. In solid mechanics, a simple and efficient constitutive model may be favored over one that is more predictive, but is difficult to parameterize, is computationally expensive, or is simply not available within a simulation tool. In order to quantify the modeling error due to the choice of a relatively simple and less predictive constitutive model, we adopt the use of a posteriori model-form error-estimation techniques. Based on local error indicators in the energy norm, an algorithm is developed for reducing the modeling error by spatially adapting the material parameters in the simpler constitutive model. The resulting material parameters are not material properties per se, but depend on the given boundary-value problem. As a first step to the more general nonlinear case, we focus here on linear elasticity in which the “complex” constitutive model is general anisotropic elasticity and the chosen simpler model is isotropic elasticity. As a result, the algorithm for adaptive error reduction is demonstrated using two examples: (1) A transversely-isotropic plate with hole subjected to tension, and (2) a transversely-isotropic tube with two side holes subjected to torsion.

  15. Interactive graphic editing tools in bioluminescent imaging simulation

    NASA Astrophysics Data System (ADS)

    Li, Hui; Tian, Jie; Luo, Jie; Wang, Ge; Cong, Wenxiang

    2005-04-01

    It is a challenging task to accurately describe complicated biological tissues and bioluminescent sources in bioluminescent imaging simulation. Several graphic editing tools have been developed to efficiently model each part of the bioluminescent simulation environment and to interactively correct or improve the initial models of anatomical structures or bioluminescent sources. There are two major types of graphic editing tools: non-interactive tools and interactive tools. Geometric building blocks (i.e. regular geometric graphics and superquadrics) are applied as non-interactive tools. To a certain extent, complicated anatomical structures and bioluminescent sources can be approximately modeled by combining a sufficient large number of geometric building blocks with Boolean operators. However, those models are too simple to describe the local features and fine changes in 2D/3D irregular contours. Therefore, interactive graphic editing tools have been developed to facilitate the local modifications of any initial surface model. With initial models composed of geometric building blocks, interactive spline mode is applied to conveniently perform dragging and compressing operations on 2D/3D local surface of biological tissues and bioluminescent sources inside the region/volume of interest. Several applications of the interactive graphic editing tools will be presented in this article.

  16. The Exoplanet Simple Orbit Fitting Toolbox (ExoSOFT): An Open-source Tool for Efficient Fitting of Astrometric and Radial Velocity Data

    NASA Astrophysics Data System (ADS)

    Mede, Kyle; Brandt, Timothy D.

    2017-03-01

    We present the Exoplanet Simple Orbit Fitting Toolbox (ExoSOFT), a new, open-source suite to fit the orbital elements of planetary or stellar-mass companions to any combination of radial velocity and astrometric data. To explore the parameter space of Keplerian models, ExoSOFT may be operated with its own multistage sampling approach or interfaced with third-party tools such as emcee. In addition, ExoSOFT is packaged with a collection of post-processing tools to analyze and summarize the results. Although only a few systems have been observed with both radial velocity and direct imaging techniques, this number will increase, thanks to upcoming spacecraft and ground-based surveys. Providing both forms of data enables simultaneous fitting that can help break degeneracies in the orbital elements that arise when only one data type is available. The dynamical mass estimates this approach can produce are important when investigating the formation mechanisms and subsequent evolution of substellar companions. ExoSOFT was verified through fitting to artificial data and was implemented using the Python and Cython programming languages; it is available for public download at https://github.com/kylemede/ExoSOFT under GNU General Public License v3.
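
    When ExoSOFT is interfaced with a third-party sampler such as emcee, the fitting problem reduces to supplying a log-posterior over the Keplerian elements; the skeleton below shows the emcee side with a toy two-parameter stand-in for the orbit model (the Gaussian log-posterior and its numbers are invented, not ExoSOFT's likelihood):

      import numpy as np
      import emcee

      def log_posterior(theta):
          """theta = (period, eccentricity); toy Gaussian stand-in."""
          period, ecc = theta
          if not (0.0 < ecc < 1.0) or period <= 0.0:
              return -np.inf                    # hard prior bounds
          return (-0.5 * ((period - 11.86) / 0.5) ** 2
                  - 0.5 * ((ecc - 0.05) / 0.02) ** 2)

      ndim, nwalkers = 2, 32
      p0 = np.array([11.86, 0.05]) + 1e-3 * np.random.randn(nwalkers, ndim)
      sampler = emcee.EnsembleSampler(nwalkers, ndim, log_posterior)
      sampler.run_mcmc(p0, 2000)
      samples = sampler.get_chain(discard=500, flat=True)   # posterior draws
      print(samples.mean(axis=0))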

  17. De-quantisation

    NASA Astrophysics Data System (ADS)

    Gruska, Jozef

    2012-06-01

    One of the most basic tasks in quantum information processing, communication and security (QIPCC) research, theoretically deep and practically important, is to find bounds on how important inherently quantum resources really are for speeding up computations. This area of research is producing a variety of results which imply, often in a very unexpected and counter-intuitive way, that: (a) surprisingly large classes of quantum circuits and algorithms can be efficiently simulated on classical computers; (b) the border line between quantum processes that can and cannot be efficiently simulated on classical computers is often surprisingly thin; (c) the addition of a seemingly very simple resource or tool often enormously increases the power of available quantum tools. These discoveries have also put a new light on our understanding of quantum phenomena and quantum physics and on the potential of its inherently quantum and often mysterious-looking phenomena. The paper motivates and surveys research and its outcomes in the area of de-quantisation, and especially presents various approaches and their outcomes concerning efficient classical simulations of various families of quantum circuits and algorithms. To motivate this area of research, some outcomes in the area of de-randomization of classical randomized computations are also presented.

  18. Pointo - a Low Cost Solution to Point Cloud Processing

    NASA Astrophysics Data System (ADS)

    Houshiar, H.; Winkler, S.

    2017-11-01

    With advances in technology, access to data, especially 3D point cloud data, becomes more and more an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners, or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes in very large packages containing a variety of methods and tools. This results in software that is expensive to acquire and difficult to use. The difficulty of use is caused by the complicated user interfaces required to accommodate a large list of features. The aim of these complex packages is to provide a powerful tool for a specific group of specialists. However, they are not necessarily required by the majority of upcoming average users of point clouds. In addition to their complexity and high cost, these packages generally rely on expensive, modern hardware and are only compatible with one specific operating system. Many point cloud customers are not point cloud processing experts or willing to pay the high acquisition costs of such software and hardware. In this paper we introduce a solution for low-cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce the cost and complexity of the software, our approach focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at the same time. Our simple and user-oriented design improves the user experience and empowers us to optimize our methods to create efficient software. In this paper we introduce the Pointo family as a series of connected programs providing easy-to-use tools with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to the point clouds.

  19. CRISPR/Cas9 Immune System as a Tool for Genome Engineering.

    PubMed

    Hryhorowicz, Magdalena; Lipiński, Daniel; Zeyland, Joanna; Słomski, Ryszard

    2017-06-01

    CRISPR/Cas (clustered regularly interspaced short palindromic repeats/CRISPR-associated) adaptive immune systems constitute a bacterial defence against invading nucleic acids derived from bacteriophages or plasmids. This prokaryotic system was adapted in molecular biology and became one of the most powerful and versatile platforms for genome engineering. CRISPR/Cas9 is a simple and rapid tool which enables the efficient modification of endogenous genes in various species and cell types. Moreover, a modified version of the CRISPR/Cas9 system with transcriptional repressors or activators allows robust transcription repression or activation of target genes. The simplicity of CRISPR/Cas9 has resulted in the widespread use of this technology in many fields, including basic research, biotechnology and biomedicine.

  20. A parallel algorithm for multi-level logic synthesis using the transduction method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Lim, Chieng-Fai

    1991-01-01

    The Transduction Method has been shown to be a powerful tool in the optimization of multilevel networks. Many tools such as the SYLON synthesis system (X90), (CM89), (LM90) have been developed based on this method. A parallel implementation of SYLON-XTRANS (XM89) on an eight-processor Encore Multimax shared-memory multiprocessor is presented. It minimizes multilevel networks consisting of simple gates through parallel pruning, gate substitution, gate merging, generalized gate substitution, and gate input reduction. This implementation, called Parallel TRANSduction (PTRANS), also uses partitioning to break large circuits up and performs inter- and intra-partition dynamic load balancing. With this, good speedups and high processor efficiencies are achievable without sacrificing the resulting circuit quality.

  1. The SURE reliability analysis program

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1986-01-01

    The SURE program is a new reliability tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  2. The SURE Reliability Analysis Program

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1986-01-01

    The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  3. A Novel Approach to Monitoring the Curing of Epoxy in Closed Tools by Use of Ultrasonic Spectroscopy

    PubMed Central

    2017-01-01

    The increasing use of composite materials has led to a greater demand for efficient curing cycles to reduce costs and speed up production cycles in manufacturing. One method to achieve this goal is in-line cure monitoring to determine the exact curing time. This article proposes a novel method through which to monitor the curing process inside closed tools by employing ultrasonic spectroscopy. A simple experiment is used to demonstrate the change in the ultrasonic spectrum during the cure cycle of an epoxy. The results clearly reveal a direct correlation between the amplitude and state of cure. The glass transition point is indicated by a global minimum of the reflected amplitude. PMID:29301222
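
    The underlying physics is the acoustic reflection coefficient at the tool-epoxy interface, which shifts as the resin stiffens; for normal incidence (standard acoustics, independent of the paper's specific setup)

      R = \frac{Z_2 - Z_1}{Z_2 + Z_1}, \qquad Z_i = \rho_i c_i

    where Z_1 is the acoustic impedance of the tool wall and Z_2 that of the curing epoxy; as curing raises the epoxy's sound speed, Z_2 and hence the reflected amplitude change, which is the correlation the experiment exploits.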

  4. SINEs of progress: Mobile element applications to molecular ecology.

    PubMed

    Ray, David A

    2007-01-01

    Mobile elements represent a unique and under-utilized set of tools for molecular ecologists. They are essentially homoplasy-free characters with the ability to be genotyped in a simple and efficient manner. Interpretation of the data generated using mobile elements can be simple compared to other genetic markers. They exist in a wide variety of taxa and are useful over a wide selection of temporal ranges within those taxa. Furthermore, their mode of evolution instills them with another advantage over other types of multilocus genotype data: the ability to determine loci applicable to a range of time spans in the history of a taxon. In this review, I discuss the application of mobile element markers, especially short interspersed elements (SINEs), to phylogenetic and population data, with an emphasis on potential applications to molecular ecology.

  5. A PDMS Device Coupled with Culture Dish for In Vitro Cell Migration Assay.

    PubMed

    Lv, Xiaoqing; Geng, Zhaoxin; Fan, Zhiyuan; Wang, Shicai; Pei, WeiHua; Chen, Hongda

    2018-04-30

    Cell migration and invasion are important factors during tumor progression and metastasis. The wound-healing assay and the Boyden chamber assay are efficient tools for investigating tumor development because both can be applied to measure the cell migration rate. Therefore, a simple and integrated polydimethylsiloxane (PDMS) device was developed for cell migration assays, which can perform quantitative evaluation of cell migration behaviors, especially for the wound-healing assay. The integrated device is composed of three units: a cell culture dish, a PDMS chamber, and a wound generation mold. The PDMS chamber is integrated with the cell culture chamber and can perform six experiments under different stimulus conditions simultaneously. To verify the function of this device, it was utilized to explore tumor cell migration behaviors under different concentrations of fetal bovine serum (FBS) and transforming growth factor (TGF-β) at different time points. This device has the unique capability to create the "wound" area in parallel during the cell migration assay and provides a simple and efficient platform for investigating cell migration in biomedical applications.

  6. GMOseek: a user friendly tool for optimized GMO testing.

    PubMed

    Morisset, Dany; Novak, Petra Kralj; Zupanič, Darko; Gruden, Kristina; Lavrač, Nada; Žel, Jana

    2014-08-01

    With the increasing pace of new Genetically Modified Organisms (GMOs) authorized or in the pipeline for commercialization worldwide, the task of the laboratories in charge of testing the compliance of food, feed or seed samples with the relevant regulations has become difficult and costly. Many of them have already adopted the so-called "matrix approach" to rationalize resources and effort and to increase their efficiency within a limited budget. Most of the time, the "matrix approach" is implemented using limited information and some proprietary (if any) computational tool to efficiently use the available data. The developed GMOseek software is designed to support decision making in all the phases of routine GMO laboratory testing, including the interpretation of wet-lab results. The tool makes use of a tabulated matrix of GM events and their genetic elements, of the laboratory analysis history and of the available information about the sample at hand. The tool uses an optimization approach to suggest the screening assays best suited to the given sample. The practical GMOseek user interface allows the user to customize the search for a cost-efficient combination of screening assays to be employed on a given sample. It further guides the user in selecting appropriate analyses to determine the presence of individual GM events in the analyzed sample, and it helps in taking a final decision regarding the GMO composition of the sample. GMOseek can also be used to evaluate new, previously unused GMO screening targets and to estimate the profitability of developing new GMO screening methods. The presented freely available software tool offers GMO testing laboratories the possibility to select combinations of assays (e.g. quantitative real-time PCR tests) needed for their task, by allowing the expert to express his or her preferences in terms of multiplexing and cost. The utility of GMOseek is exemplified by analyzing selected food, feed and seed samples from a national reference laboratory for GMO testing and by comparing its performance to existing tools which use the matrix approach. GMOseek proves superior when tested on real samples in terms of GMO coverage and the cost efficiency of its screening strategies, including its capacity for simple interpretation of the testing results.
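
    At its core, choosing screening assays against an event/element matrix is a covering problem; the toy greedy sketch below conveys the idea (the assay matrix and cost figures are invented, and GMOseek's actual optimization is more sophisticated than a greedy pass):

      # Greedy cover: pick cost-effective assays until all target GM events are detectable.
      assays = {                       # assay -> (cost, GM events it detects)
          "P35S":   (1.0, {"GM1", "GM2", "GM3"}),
          "Tnos":   (1.0, {"GM2", "GM4"}),
          "event4": (2.5, {"GM4"}),
          "event5": (2.5, {"GM5"}),
      }
      targets = {"GM1", "GM2", "GM3", "GM4", "GM5"}

      chosen, uncovered = [], set(targets)
      while uncovered:
          # pick the assay covering the most still-uncovered events per unit cost
          name, (cost, hits) = max(assays.items(),
                                   key=lambda kv: len(kv[1][1] & uncovered) / kv[1][0])
          chosen.append(name)
          uncovered -= hits
          del assays[name]
      print(chosen)                    # e.g. ['P35S', 'Tnos', 'event5']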

  7. Intuitive Tools for the Design and Analysis of Communication Payloads for Satellites

    NASA Technical Reports Server (NTRS)

    Culver, Michael R.; Soong, Christine; Warner, Joseph D.

    2014-01-01

    In an effort to make future communications satellite payload design more efficient and accessible, two tools were created with intuitive graphical user interfaces (GUIs). The first tool allows payload designers to graphically design their payload by using simple drag and drop of payload components onto a design area within the program. Information about each picked component is pulled from a database of common space-qualified communication components sold by commerical companies. Once a design is completed, various reports can be generated, such as the Master Equipment List. The second tool is a link budget calculator designed specifically for ease of use. Other features of this tool include being able to access a database of NASA ground based apertures for near Earth and Deep Space communication, the Tracking and Data Relay Satellite System (TDRSS) base apertures, and information about the solar system relevant to link budget calculations. The link budget tool allows for over 50 different combinations of user inputs, eliminating the need for multiple spreadsheets and the user errors associated with using them. Both of the aforementioned tools increase the productivity of space communication systems designers, and have the colloquial latitude to allow non-communication experts to design preliminary communication payloads.
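
    A link budget of the kind the second tool automates is, at heart, a sum of gains and losses in decibels; a compact sketch with placeholder numbers (these are the standard free-space path loss and C/N0 relations, not the tool's internals):

      import math

      def fspl_db(distance_m, freq_hz):
          """Free-space path loss in dB: 20*log10(4*pi*d/lambda)."""
          lam = 3.0e8 / freq_hz
          return 20.0 * math.log10(4.0 * math.pi * distance_m / lam)

      eirp_dbw = 50.0                  # transmit EIRP (placeholder)
      gt_dbk = 30.0                    # ground station G/T (placeholder)
      k_dbw = -228.6                   # Boltzmann's constant, dBW/(K*Hz)
      loss_db = fspl_db(3.8e7, 8.4e9)  # placeholder range and X-band frequency

      cn0_dbhz = eirp_dbw + gt_dbk - loss_db - k_dbw
      print(f"Path loss {loss_db:.1f} dB, C/N0 {cn0_dbhz:.1f} dB-Hz")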

  8. A subthreshold aVLSI implementation of the Izhikevich simple neuron model.

    PubMed

    Rangan, Venkat; Ghosh, Abhishek; Aparin, Vladimir; Cauwenberghs, Gert

    2010-01-01

    We present a circuit architecture for compact analog VLSI implementation of the Izhikevich neuron model, which efficiently describes a wide variety of neuron spiking and bursting dynamics using two state variables and four adjustable parameters. Log-domain circuit design utilizing MOS transistors in subthreshold results in high energy efficiency, with less than 1pJ of energy consumed per spike. We also discuss the effects of parameter variations on the dynamics of the equations, and present simulation results that replicate several types of neural dynamics. The low power operation and compact analog VLSI realization make the architecture suitable for human-machine interface applications in neural prostheses and implantable bioelectronics, as well as large-scale neural emulation tools for computational neuroscience.
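
    The model the chip realizes is Izhikevich's two-variable system, which is easy to state in software for comparison (these are the published equations; the parameter set below is the usual 'regular spiking' choice):

      # Izhikevich simple model: v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u);
      # on spike (v >= 30 mV): v <- c, u <- u + d.
      def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5, steps=2000):
          v, u, spike_times = -65.0, -65.0 * 0.2, []
          for step in range(steps):
              v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
              u += dt * a * (b * v - u)
              if v >= 30.0:                      # spike: record and reset
                  spike_times.append(step * dt)  # ms
                  v, u = c, u + d
          return spike_times

      print(len(izhikevich()), "spikes in 1 s of simulated time")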

  9. Phase structuring in metal alloys: Ultrasound-assisted top-down approach to engineering of nanostructured catalytic materials.

    PubMed

    Cherepanov, Pavel V; Andreeva, Daria V

    2017-03-01

    High intensity ultrasound (HIUS) is a novel and efficient tool for top-down nanostructuring of multi-phase metal systems. Ultrasound-assisted structuring of phases in metal alloys relies on two main mechanisms, interfacial red/ox reactions and temperature-driven solid-state phase transformations, which affect the surface composition and morphology of metals. The physical and chemical properties of the sonication medium strongly affect the structuring pathways as well as the morphology and composition of the catalysts. HIUS can serve as a simple, fast, and effective approach for tuning the structure and surface properties of metal particles, opening new perspectives in the design of robust and efficient catalysts. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. An att site-based recombination reporter system for genome engineering and synthetic DNA assembly.

    PubMed

    Bland, Michael J; Ducos-Galand, Magaly; Val, Marie-Eve; Mazel, Didier

    2017-07-14

    Direct manipulation of the genome is a widespread technique for genetic studies and synthetic biology applications. The tyrosine and serine site-specific recombination systems of bacteriophages HK022 and ΦC31 are widely used for stable directional exchange and relocation of DNA sequences, making them valuable tools in these contexts. We have developed site-specific recombination tools that allow the direct selection of recombination events by embedding the attB site from each system within the β-lactamase resistance coding sequence (bla). The HK and ΦC31 tools were developed by placing the attB sites from each system into the signal peptide cleavage site coding sequence of bla. All possible open reading frames (ORFs) were inserted and tested for recombination efficiency and bla activity. Efficient recombination was observed for all tested ORFs (3 for HK, 6 for ΦC31) as shown through a cointegrate formation assay. The bla gene with the embedded attB site was functional for eight of the nine constructs tested. The HK/ΦC31 att-bla system offers a simple way to directly select recombination events, thus enhancing the use of site-specific recombination systems for carrying out precise, large-scale DNA manipulation, and adding useful tools to the genetics toolbox. We further show the power and flexibility of bla to be used as a reporter for recombination.

  11. A simple spatial working memory and attention test on paired symbols shows developmental deficits in schizophrenia patients.

    PubMed

    Song, Wei; Zhang, Kai; Sun, Jinhua; Ma, Lina; Jesse, Forrest Fabian; Teng, Xiaochun; Zhou, Ying; Bao, Hechen; Chen, Shiqing; Wang, Shuai; Yang, Beimeng; Chu, Xixia; Ding, Wenhua; Du, Yasong; Cheng, Zaohuo; Wu, Bin; Chen, Shanguang; He, Guang; He, Lin; Chen, Xiaoping; Li, Weidong

    2013-01-01

    People with neuropsychiatric disorders such as schizophrenia often display deficits in spatial working memory and attention. Evaluating working memory and attention in schizophrenia patients is usually based on traditional tasks and the interviewer's judgment. We developed a simple Spatial Working Memory and Attention Test on Paired Symbols (SWAPS). It takes only several minutes to complete, comprising 101 trials for each subject. In this study, we tested 72 schizophrenia patients and 188 healthy volunteers in China. In a healthy control group with ages ranging from 12 to 60, the efficiency score (accuracy divided by reaction time) reached a peak in the 20-27 age range and then declined with increasing age. Importantly, schizophrenia patients failed to display this developmental trend in the same age range and adults had significant deficits compared to the control group. Our data suggests that this simple Spatial Working Memory and Attention Test on Paired Symbols can be a useful tool for studies of spatial working memory and attention in neuropsychiatric disorders.

  12. A ligand predication tool based on modeling and reasoning with imprecise probabilistic knowledge.

    PubMed

    Liu, Weiru; Yue, Anbu; Timson, David J

    2010-04-01

    Ligand prediction has been driven by a fundamental desire to understand more about how biomolecules recognize their ligands and by the commercial imperative to develop new drugs. Most currently available software systems are complex and time-consuming to use. Therefore, developing simple and efficient tools to perform initial screening of interesting compounds is an appealing idea. In this paper, we introduce our tool for very rapid screening for likely ligands (either substrates or inhibitors) based on reasoning with imprecise probabilistic knowledge elicited from past experiments. Probabilistic knowledge is input to the system via a user-friendly interface showing a base compound structure. A prediction of whether a particular compound is a substrate is queried against the acquired probabilistic knowledge base and a probability is returned as an indication of the prediction. This tool will be particularly useful in situations where a number of similar compounds have been screened experimentally, but information is not available for all possible members of that group of compounds. We use two case studies to demonstrate how to use the tool. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.

  13. En route Spacing Tool: Efficient Conflict-free Spacing to Flow-Restricted Airspace

    NASA Technical Reports Server (NTRS)

    Green, S.

    1999-01-01

    This paper describes the Air Traffic Management (ATM) problem within the U.S. of flow-restricted en route airspace, an assessment of its impact on airspace users, and a set of near-term tools and procedures to resolve the problem. The FAA is committed, over the next few years, to deploy the first generation of modern ATM decision support tool (DST) technology under the Free-Flight Phase-1 (FFP1) program. The associated en route tools include the User Request Evaluation Tool (URET) and the Traffic Management Advisor (TMA). URET is an initial conflict probe (ICP) capability that assists controllers with the detection and resolution of conflicts in en route airspace. TMA orchestrates arrivals transitioning into high-density terminal airspace by providing controllers with scheduled times of arrival (STA) and delay feedback advisories to assist with STA conformance. However, these FFP1 capabilities do not mitigate the en route Miles-In-Trail (MIT) restrictions that are dynamically applied to reduce airspace congestion. National statistics indicate that en route facilities (Centers) apply MIT restrictions for approximately 5000 hours per month. Based on results from this study, an estimated 45,000 flights are impacted by these restrictions each month. Current-day practices for implementing these restrictions result in additional controller workload and an economic impact of which the fuel penalty alone may approach several hundred dollars per flight. To mitigate much of the impact of these restrictions on users and controller workload, a DST and procedures are presented. The DST is based on a simple derivative of FFP1 technology that is designed to introduce a set of simple tools for flow-rate (spacing) conformance and integrate them with conflict-probe capabilities. The tool and associated algorithms are described based on a concept prototype implemented within the CTAS baseline in 1995. A traffic scenario is used to illustrate the controller's use of the tool, and potential display options are presented for future controller evaluation.

  14. FASTSim: A Model to Estimate Vehicle Efficiency, Cost and Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooker, A.; Gonder, J.; Wang, L.

    2015-05-04

    The Future Automotive Systems Technology Simulator (FASTSim) is a high-level advanced vehicle powertrain systems analysis tool supported by the U.S. Department of Energy's Vehicle Technologies Office. FASTSim provides a quick and simple approach to compare powertrains and estimate the impact of technology improvements on light- and heavy-duty vehicle efficiency, performance, cost, and battery life over real-world drive cycles. FASTSim's calculation framework and balance among detail, accuracy, and speed enable it to simulate thousands of driven miles in minutes. The key components and vehicle outputs have been validated by comparing the model outputs to test data for many different vehicles to provide confidence in the results. A graphical user interface makes FASTSim easy and efficient to use. FASTSim is freely available for download from the National Renewable Energy Laboratory's website (see www.nrel.gov/fastsim).
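
    FASTSim's validated component models are far more detailed than can be shown here, but the general idea of integrating a road-load power demand over a drive cycle can be sketched in a few lines of Python; every coefficient below is a hypothetical placeholder, not a FASTSim input.

      import numpy as np

      # Hypothetical road-load parameters for a mid-size vehicle.
      rho, cd, area = 1.2, 0.30, 2.2       # air density (kg/m^3), drag coefficient, frontal area (m^2)
      mass, crr, g = 1500.0, 0.009, 9.81   # mass (kg), rolling resistance, gravity (m/s^2)

      t = np.arange(0.0, 61.0, 1.0)        # 60 s of a drive cycle sampled at 1 Hz
      v = np.linspace(0.0, 20.0, t.size)   # simple 0-20 m/s speed ramp
      a = np.gradient(v, t)

      # Tractive power at the wheels (W); negative values correspond to braking.
      p = (0.5 * rho * cd * area * v**2 + crr * mass * g) * v + mass * a * v
      p_traction = np.maximum(p, 0.0)

      # Trapezoidal integration of the positive tractive power, converted to kWh.
      energy_kwh = np.sum(0.5 * (p_traction[1:] + p_traction[:-1]) * np.diff(t)) / 3.6e6
      print(f"Tractive energy over the cycle: {energy_kwh:.3f} kWh")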

  15. Control and prediction of the course of brewery fermentations by gravimetric analysis.

    PubMed

    Kosín, P; Savel, J; Broz, A; Sigler, K

    2008-01-01

    A simple, fast and cheap test suitable for predicting the course of brewery fermentations based on mass analysis is described and its efficiency is evaluated. Compared to commonly used yeast vitality tests, this analysis takes into account wort composition and other factors that influence fermentation performance. It can be used to predict the shape of the fermentation curve in brewery fermentations and in research and development projects concerning yeast vitality, fermentation conditions and wort composition. It can also be a useful tool for homebrewers to control their fermentations.
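
    The abstract does not give the exact model behind the test; as one hedged sketch of how a gravimetric prediction could work, the Python example below fits a logistic curve to early mass-loss readings (CO2 escape lowers vessel mass) and extrapolates the fermentation curve. The functional form and all data are assumptions for illustration, and SciPy is assumed to be available.

      import numpy as np
      from scipy.optimize import curve_fit

      # Logistic model of cumulative mass loss during fermentation.
      def logistic(t, total, rate, t_mid):
          return total / (1.0 + np.exp(-rate * (t - t_mid)))

      hours = np.array([0.0, 12.0, 24.0, 36.0, 48.0, 60.0])
      mass_loss_g = np.array([0.0, 1.1, 4.0, 9.5, 14.0, 16.0])  # hypothetical readings

      params, _ = curve_fit(logistic, hours, mass_loss_g, p0=[18.0, 0.1, 36.0])
      print(f"Predicted final mass loss: {params[0]:.1f} g")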

  16. Advancements in RNASeqGUI towards a Reproducible Analysis of RNA-Seq Experiments

    PubMed Central

    Russo, Francesco; Righelli, Dario

    2016-01-01

    We present the advancements and novelties recently introduced in RNASeqGUI, a graphical user interface that helps biologists handle and analyse the large datasets collected in RNA-Seq experiments. This work focuses on the concept of reproducible research and shows how it has been incorporated in RNASeqGUI to provide reproducible (computational) results. The novel version of RNASeqGUI combines graphical interfaces with tools for reproducible research, such as literate statistical programming, human-readable reports, parallel executions, caching, and interactive and web-explorable tables of results. These features allow the user to analyse big datasets in a fast, efficient, and reproducible way. Moreover, this paper represents a proof of concept, showing a simple way to develop computational tools for Life Science in the spirit of reproducible research. PMID:26977414

  17. A set of tetra-nucleotide core motif SSR markers for efficient identification of potato (Solanum tuberosum) cultivars.

    PubMed

    Kishine, Masahiro; Tsutsumi, Katsuji; Kitta, Kazumi

    2017-12-01

    Simple sequence repeats (SSRs) are a popular tool for individual fingerprinting. Long-core-motif (e.g. tetra-, penta-, and hexa-nucleotide) SSRs are preferred because they make it easier to separate and distinguish neighboring alleles. In the present study, a new set of 8 tetra-nucleotide SSRs in potato (Solanum tuberosum) is reported. Using these 8 markers, 72 out of 76 cultivars obtained from Japan and the United States were clearly discriminated, while two pairs, both of which arose from natural variation, showed identical profiles. The combined probability of identity between two random cultivars for the set of 8 SSR markers was estimated to be 1.10 × 10^-8, confirming the usefulness of the proposed SSR markers for fingerprinting analyses of potato.
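
    The combined probability of identity is, under locus independence, the product of the per-locus probabilities. The Python sketch below illustrates the calculation with hypothetical per-locus values (not the study's data), chosen only to land near the reported order of magnitude.

      import math

      # Hypothetical per-locus probabilities of identity for 8 SSR markers.
      per_locus_pi = [0.12, 0.09, 0.15, 0.10, 0.08, 0.11, 0.13, 0.10]

      # Combined PI across independent loci is the product of per-locus values.
      combined_pi = math.prod(per_locus_pi)
      print(f"Combined probability of identity: {combined_pi:.2e}")  # ~1.9e-08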

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knittel, Christopher; Wolfram, Catherine; Gandhi, Raina

    A wide range of climate plans rely on energy efficiency to generate energy and carbon emissions reductions, but conventional wisdom holds that consumers have historically underinvested in energy efficiency upgrades. This underinvestment may occur for a variety of reasons, one of which is that consumers are not adequately informed about the benefits of energy efficiency. To address this, the U.S. Department of Energy created a tool called the Home Energy Score (HEScore) to act as a simple, low-cost means to provide clear information about a home's energy efficiency and motivate homeowners and homebuyers to invest in energy efficiency. The Department of Energy is in the process of conducting four evaluations assessing the impact of the Home Energy Score on residential energy efficiency investments and program participation. This paper describes one of these evaluations: a randomized controlled trial conducted in New Jersey in partnership with New Jersey Natural Gas. The evaluation randomly provides the Home Energy Score to homeowners who received an audit between May 2014 and October 2015, either because they had recently replaced their furnace, boiler, and/or gas water heater with a high-efficiency model and participated in a free audit to access an incentive, or because they requested an independent audit.

  19. The Role of Wakes in Modelling Tidal Current Turbines

    NASA Astrophysics Data System (ADS)

    Conley, Daniel; Roc, Thomas; Greaves, Deborah

    2010-05-01

    The eventual proper development of arrays of Tidal Current Turbines (TCT) will require a balance which maximizes power extraction while minimizing environmental impacts. Idealized analytical analogues and simple 2-D models are useful tools for investigating questions of a general nature but do not represent a practical tool for application to realistic cases. Some form of 3-D numerical simulation will be required for such applications, and the current project is designed to develop a numerical decision-making tool for use in planning large-scale TCT projects. The project is predicated on the use of an existing regional ocean modelling framework (the Regional Ocean Modelling System - ROMS), which is modified to enable the user to account for the effects of TCTs. In such a framework, where mixing processes are highly parametrized, the fidelity of the quantitative results is critically dependent on the parameter values utilized. In light of the early stage of TCT development and the lack of field-scale measurements, the calibration of such a model is problematic. In the absence of explicit calibration data sets, the device wake structure has been identified as an efficient feature for model calibration. This presentation discusses efforts to design an appropriate calibration scheme focused on wake decay; the motivation for this approach, the techniques applied, validation results from simple test cases, and current limitations are presented.

  20. Coupling of metal-organic frameworks-containing monolithic capillary-based selective enrichment with matrix-assisted laser desorption ionization-time-of-flight mass spectrometry for efficient analysis of protein phosphorylation.

    PubMed

    Li, Daojin; Yin, Danyang; Chen, Yang; Liu, Zhen

    2017-05-19

    Protein phosphorylation is a major post-translational modification, which plays a vital role in the cellular signaling of numerous biological processes. Mass spectrometry (MS) has been an essential tool for the analysis of protein phosphorylation, for which a key step is to selectively enrich phosphopeptides from complex biological samples. In this study, a metal-organic frameworks (MOFs)-based monolithic capillary has been successfully prepared as an effective sorbent for the selective enrichment of phosphopeptides and has been off-line coupled with matrix-assisted laser desorption ionization-time-of-flight mass spectrometry (MALDI-TOF MS) for efficient analysis of phosphopeptides. Using β-casein as a representative phosphoprotein, efficient phosphorylation analysis by this off-line platform was verified. Phosphorylation analysis of a nonfat milk sample was also demonstrated. By introducing the large surface areas and highly ordered pores of MOFs into a monolithic column, the MOFs-based monolithic capillary exhibited several significant advantages, such as excellent selectivity toward phosphopeptides, superb tolerance to interference, and a simple operation procedure. Because of these highly desirable properties, the MOFs-based monolithic capillary could be a useful tool for protein phosphorylation analysis. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Quantification of Sesquiterpene Lactones in Asteraceae Plant Extracts: Evaluation of their Allergenic Potential

    PubMed Central

    Salapovic, Helena; Geier, Johannes; Reznicek, Gottfried

    2013-01-01

    Sesquiterpene lactones (SLs), mainly those with an activated exocyclic methylene group, are important allergens in Asteraceae (Compositae) plants. As a screening tool, the Compositae mix, consisting of five Asteraceae plant extracts with allergenic potential (feverfew, tansy, arnica, yarrow, and German chamomile) is part of several national patch test baseline series. However, the SL content of the Compositae mix may vary due to the source material. Therefore, a simple spectrophotometric method for the quantitative measurement of SLs with the α-methylene-γ-butyrolactone moiety was developed, giving the percentage of allergenic compounds in plant extracts. The method has been validated and five Asteraceae extracts, namely feverfew (Tanacetum parthenium L.), tansy (Tanacetum vulgare L.), arnica (Arnica montana L.), yarrow (Achillea millefolium L.), and German chamomile (Chamomilla recutita L. Rauschert) that have been used in routine patch test screening were evaluated. A good correlation could be found between the results obtained using the proposed spectrophotometric method and the corresponding clinical results. Thus, the introduced method is a valuable tool for evaluating the allergenic potential and for the simple and efficient quality control of plant extracts with allergenic potential. PMID:24106675

  2. Assessment of a brain-tumour-specific Patient Concerns Inventory in the neuro-oncology clinic.

    PubMed

    Rooney, Alasdair G; Netten, Anouk; McNamara, Shanne; Erridge, Sara; Peoples, Sharon; Whittle, Ian; Hacking, Belinda; Grant, Robin

    2014-04-01

    Brain tumour patients may struggle to express their concerns in the outpatient clinic, creating a physician-focused rather than a shared agenda. We created a simple, practical brain-tumour-specific holistic needs assessment (HNA) tool for use in the neuro-oncology outpatient clinic. We posted the brain tumour Patient Concerns Inventory (PCI) to a consecutive sample of adult brain tumour attendees at a neuro-oncology outpatient clinic. Participants brought the completed PCI to their clinic consultation. Patients and staff provided feedback. Seventy-seven patients were eligible and 53 participated (response rate = 68%). The PCI captured many problems absent from general cancer checklists. The five most frequent concerns were fatigue, fear of the tumour coming back, memory, concentration, and low mood. Respondents used the PCI to formulate 105 specific questions, usually about the meaning of physical or psychological symptoms. Patients and staff found the PCI to be useful, and satisfaction with the instrument was high. This study demonstrates the clinical utility of the brain tumour PCI in a neuro-oncology clinic. The combination of a brain-tumour-specific concerns checklist and an intervention to focus the patient agenda creates a simple and efficient HNA tool.

  3. Nut-cracking behaviour in wild-born, rehabilitated bonobos (Pan paniscus): a comprehensive study of hand-preference, hand grips and efficiency.

    PubMed

    Neufuss, Johanna; Humle, Tatyana; Cremaschi, Andrea; Kivell, Tracy L

    2017-02-01

    There has been an enduring interest in primate tool-use and manipulative abilities, most often with the goal of providing insight into the evolution of human manual dexterity, right-hand preference, and what behaviours make humans unique. Chimpanzees (Pan troglodytes) are arguably the most well-studied tool-users amongst non-human primates, and are particularly well-known for their complex nut-cracking behaviour, which has been documented in several West African populations. However, their sister taxon, the bonobo (Pan paniscus), rarely engages in even simple tool-use and is not known to nut-crack in the wild. Only a few studies have reported tool-use in captive bonobos, including their ability to crack nuts, but details of this complex tool-use behaviour have not been documented before. Here, we fill this gap with the first comprehensive analysis of bonobo nut-cracking in a natural environment at the Lola ya Bonobo sanctuary, Democratic Republic of the Congo. Eighteen bonobos were studied as they cracked oil palm nuts using stone hammers. Individual bonobos showed exclusive laterality for using the hammerstone, and there was a significant group-level right-hand bias. The study revealed 15 hand grips for holding differently sized and weighted hammerstones, 10 of which had not been previously described in the literature. Our findings also demonstrated that bonobos select the most effective hammerstones when nut-cracking. Bonobos are efficient nut-crackers and not that different from the renowned nut-cracking chimpanzees of Bossou, Guinea, which also crack oil palm nuts using stones. © 2016 Wiley Periodicals, Inc.

  4. New Tools for Managing Agricultural P

    NASA Astrophysics Data System (ADS)

    Nieber, J. L.; Baker, L. A.; Peterson, H. M.; Ulrich, J.

    2014-12-01

    Best management practices (BMPs) generally focus on retaining nutrients (especially P) after they enter the watershed. This approach is expensive, unsustainable, and has not led to reductions of P pollution at large scales (e.g., the Mississippi River). Although source reduction, which reduces inputs of nutrients to a watershed, has long been cited as a preferred approach, we have not had tools to guide source reduction efforts at the watershed level. To augment conventional TMDL tools, we developed an "actionable" watershed P balance approach, based largely on watershed-specific information, yet simple enough to be utilized as a practical tool. Interviews with farmers were used to obtain detailed farm management data, data from livestock permits were adjusted based on site visits, stream P fluxes were calculated from 3 years of monitoring data, and expert knowledge was used to model P fluxes through animal operations. The overall P use efficiency, Puse, was calculated as the sum of deliberate exports (P in animals, milk, eggs, and crops) divided by deliberate inputs (P in fertilizer, feed, and nursery animals) × 100. The crop P use efficiency was 1.7, meaning that more P was exported as products than was deliberately imported; we estimate that this mining would have resulted in a loss of 6 mg P/kg across the watershed. Despite the negative P balance, the equivalent of 5% of watershed input was lost via stream export. Tile drainage, the presence of buffer strips, and relatively flat topography result in dominance of P loads by ortho-P (66%) and low particulate P. This, together with geochemical analysis (ongoing), suggests that biological processes may be at least as important as sediment transport in controlling P loads. We have developed a P balance calculator tool to enable watershed management organizations to develop watershed P balances and identify opportunities for improving the efficiency of P utilization.
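
    The P use efficiency defined above is a simple ratio of summed fluxes. The sketch below restates it in Python with hypothetical fluxes (not the study's data); the values are chosen so the ratio exceeds 100%, the soil-mining situation the abstract describes.

      # Watershed P balance: deliberate exports over deliberate inputs, x 100.
      exports_kg = {"animals": 12000, "milk": 8000, "eggs": 1000, "crops": 30000}
      inputs_kg = {"fertilizer": 20000, "feed": 9000, "nursery_animals": 1000}

      p_use = sum(exports_kg.values()) / sum(inputs_kg.values()) * 100
      print(f"Overall P use efficiency: {p_use:.0f}%")  # >100% indicates soil P mining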

  5. Sparse RNA folding revisited: space-efficient minimum free energy structure prediction.

    PubMed

    Will, Sebastian; Jabbari, Hosna

    2016-01-01

    RNA secondary structure prediction by energy minimization is the central computational tool for the analysis of structural non-coding RNAs and their interactions. Sparsification has been successfully applied to improve the time efficiency of various structure prediction algorithms while guaranteeing the same result; however, for many such folding problems, space efficiency is of even greater concern, particularly for long RNA sequences. So far, space-efficient sparsified RNA folding with fold reconstruction was solved only for simple base-pair-based pseudo-energy models. Here, we revisit the problem of space-efficient free energy minimization. Whereas the space-efficient minimization of the free energy has been sketched before, the reconstruction of the optimum structure has not even been discussed. We show that this reconstruction is not possible in a trivial extension of the method for simple energy models. Then, we present the time- and space-efficient sparsified free energy minimization algorithm SparseMFEFold that guarantees MFE structure prediction. In particular, this novel algorithm provides efficient fold reconstruction based on dynamically garbage-collected trace arrows. The complexity of our algorithm depends on two parameters, the number of candidates Z and the number of trace arrows T; both are bounded by [Formula: see text], but are typically much smaller. The time complexity of RNA folding is reduced from [Formula: see text] to [Formula: see text]; the space complexity, from [Formula: see text] to [Formula: see text]. Our empirical results show more than 80% space savings over RNAfold [Vienna RNA package] on the long RNAs from the RNA STRAND database (≥2500 bases). The presented technique is intentionally generalizable to complex prediction algorithms; due to their high space demands, algorithms like pseudoknot prediction and RNA-RNA-interaction prediction are expected to profit even more strongly than "standard" MFE folding. SparseMFEFold is free software, available at http://www.bioinf.uni-leipzig.de/~will/Software/SparseMFEFold.

  6. MAP: an iterative experimental design methodology for the optimization of catalytic search space structure modeling.

    PubMed

    Baumes, Laurent A

    2006-01-01

    One of the main problems in high-throughput research for materials is still the design of experiments. At early stages of discovery programs, purely exploratory methodologies coupled with fast screening tools should be employed. This should lead to opportunities to find unexpected catalytic results and identify the "groups" of catalyst outputs, providing well-defined boundaries for future optimizations. However, very few recent papers deal with strategies that guide exploratory studies; mostly, traditional designs, homogeneous covering, or simple random samplings are exploited. Typical catalytic output distributions exhibit unbalanced datasets, on which efficient learning is hard to carry out and interesting but rare classes usually go unrecognized. Here, a new iterative algorithm is suggested for characterizing the search space structure, working independently of the learning process. It enhances recognition rates by transferring catalysts to be screened from "performance-stable" space zones to "unsteady" ones, which require more experiments to be well-modeled. Evaluating new algorithms through benchmarks is essential given the lack of prior evidence of their efficiency. The method is detailed and thoroughly tested with mathematical functions exhibiting different levels of complexity. The strategy is not only evaluated empirically; the effect of the sampling on subsequent machine learning performance is also quantified. The minimum sample size required by the algorithm to be statistically discriminated from simple random sampling is investigated.

  7. Helping coaches apply the principles of representative learning design: validation of a tennis specific practice assessment tool.

    PubMed

    Krause, Lyndon; Farrow, Damian; Reid, Machar; Buszard, Tim; Pinder, Ross

    2018-06-01

    Representative Learning Design (RLD) is a framework for assessing the degree to which experimental or practice tasks simulate key aspects of specific performance environments (i.e. competition). The key premise is that when practice replicates the performance environment, skills are more likely to transfer. In applied situations, however, there is currently no simple or quick method for coaches to assess the key concepts of RLD (e.g. during on-court tasks). The aim of this study was to develop a tool for coaches to efficiently assess practice task design in tennis. A consensus-based tool was developed using a 4-round Delphi process with 10 academic and 13 tennis-coaching experts. Expert consensus was reached for the inclusion of seven items, each consisting of two sub-questions related to (i) the task goal and (ii) the relevance of the task to competition performance. The Representative Practice Assessment Tool (RPAT) is proposed for use in assessing and enhancing practice task designs in tennis to increase the functional coupling between information and movement, and to maximise the potential for skill transfer to competition contexts.

  8. Rocker: Open source, easy-to-use tool for AUC and enrichment calculations and ROC visualization.

    PubMed

    Lätti, Sakari; Niinivehmas, Sanna; Pentikäinen, Olli T

    2016-01-01

    The receiver operating characteristic (ROC) curve, together with the area under the curve (AUC), is a useful tool for evaluating the performance of methods on biomedical and chemoinformatics data. For example, in virtual drug screening, ROC curves are very often used to visualize the efficiency of the application used to separate active ligands from inactive molecules. Unfortunately, most of the available tools for ROC analysis are implemented in commercially available software packages, or are plugins in statistical software, which are not always the easiest to use. Here, we present Rocker, a simple ROC curve visualization tool that can be used for the generation of publication-quality images. Rocker also includes automatic calculation of the AUC for the ROC curve and of the Boltzmann-enhanced discrimination of ROC (BEDROC). Furthermore, in virtual screening campaigns it is often important to understand the early enrichment of active ligand identification; for this, Rocker offers an automated calculation routine. To enable further development of Rocker, it is freely available (MIT-GPL license) for use and modification from our website (http://www.jyu.fi/rocker).
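
    Rocker's own implementation is not reproduced here, but the AUC it reports is the standard quantity computable from ranked scores. A minimal Python sketch using the rank-sum (Mann-Whitney) identity, with made-up docking scores:

      # AUC as the probability that a random active outscores a random inactive.
      def auc(active_scores, inactive_scores):
          wins, pairs = 0.0, 0
          for a in active_scores:
              for b in inactive_scores:
                  pairs += 1
                  if a > b:
                      wins += 1.0
                  elif a == b:
                      wins += 0.5
          return wins / pairs

      print(auc([0.9, 0.8, 0.7], [0.75, 0.6, 0.2]))  # 0.888...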

  9. Simulation of green roof runoff under different substrate depths and vegetation covers by coupling a simple conceptual and a physically based hydrological model.

    PubMed

    Soulis, Konstantinos X; Valiantzas, John D; Ntoulas, Nikolaos; Kargas, George; Nektarios, Panayiotis A

    2017-09-15

    In spite of the well-known green roof benefits, their widespread adoption in the management practices of urban drainage systems requires the use of adequate analytical and modelling tools. In the current study, green roof runoff modeling was accomplished by developing, testing, and jointly using a simple conceptual model and a physically based numerical simulation model utilizing the HYDRUS-1D software. The use of such an approach combines the advantages of the conceptual model, namely simplicity, low computational requirements, and the ability to be easily integrated in decision support tools, with the capacity of the physically based simulation model to be easily transferred to conditions and locations other than those used for calibrating and validating it. The proposed approach was evaluated with an experimental dataset that included various green roof covers (either succulent plants - Sedum sediforme, or xerophytic plants - Origanum onites, or bare substrate without any vegetation) and two substrate depths (either 8 cm or 16 cm). Both the physically based and the conceptual models matched the observed hydrographs very closely. In general, the conceptual model performed better than the physically based simulation model, but the overall performance of both models was sufficient in most cases, as revealed by the Nash-Sutcliffe Efficiency index, which was generally greater than 0.70. Finally, it was showcased how a physically based and a simple conceptual model can be jointly used to allow the use of the simple conceptual model for a wider set of conditions than the available experimental data and in order to support green roof design. Copyright © 2017 Elsevier Ltd. All rights reserved.
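
    The Nash-Sutcliffe Efficiency index cited above has a standard definition: one minus the ratio of the residual variance to the variance of the observations. A short Python sketch with hypothetical runoff series (not the study's data):

      import numpy as np

      def nse(observed, simulated):
          observed, simulated = np.asarray(observed), np.asarray(simulated)
          residual = np.sum((observed - simulated) ** 2)
          variance = np.sum((observed - observed.mean()) ** 2)
          return 1.0 - residual / variance

      obs = [0.0, 1.2, 3.4, 2.1, 0.8, 0.2]   # observed runoff (mm)
      sim = [0.1, 1.0, 3.0, 2.4, 0.9, 0.3]   # simulated runoff (mm)
      print(f"NSE = {nse(obs, sim):.2f}")    # values > 0.70 indicate a good fit here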

  10. Using computer-aided drug design and medicinal chemistry strategies in the fight against diabetes.

    PubMed

    Semighini, Evandro P; Resende, Jonathan A; de Andrade, Peterson; Morais, Pedro A B; Carvalho, Ivone; Taft, Carlton A; Silva, Carlos H T P

    2011-04-01

    The aim of this work is to present a simple, practical, and efficient protocol for drug design, here applied to diabetes, which includes selection of the disease, a good choice of target and bioactive ligand, and the use of various computer-aided drug design and medicinal chemistry tools to design novel potential drug candidates. We have selected the validated target dipeptidyl peptidase IV (DPP-IV), whose inhibition contributes to reduced glucose levels in type 2 diabetes patients. The most active inhibitor with a reported complex X-ray structure was initially extracted from the BindingDB database. By using molecular modification strategies widely used in medicinal chemistry, besides current state-of-the-art tools in drug design (including flexible docking, virtual screening, molecular interaction fields, molecular dynamics, ADME and toxicity predictions), we have proposed 4 novel potential DPP-IV inhibitors with drug properties for diabetes control, which have been supported and validated by all the computational tools used herewith.

  11. Development of an e-VLBI Data Transport Software Suite with VDIF

    NASA Technical Reports Server (NTRS)

    Sekido, Mamoru; Takefuji, Kazuhiro; Kimura, Moritaka; Hobiger, Thomas; Kokado, Kensuke; Nozawa, Kentarou; Kurihara, Shinobu; Shinno, Takuya; Takahashi, Fujinobu

    2010-01-01

    We have developed a software library (KVTP-lib) for VLBI data transmission over the network with the VDIF (VLBI Data Interchange Format), which is the newly proposed standard VLBI data format designed for electronic data transfer over the network. The software package keeps the application layer (VDIF frame) and the transmission layer separate, so that each layer can be developed efficiently. The real-time VLBI data transmission tool sudp-send is an application tool based on the KVTP-lib library. sudp-send captures the VLBI data stream from the VSI-H interface with the K5/VSI PC-board and writes the data to file in standard Linux file format or transmits it to the network using the simple-UDP (SUDP) protocol. Another tool, sudp-recv, receives the data stream from the network and writes the data to file in a specific VLBI format (K5/VSSP, VDIF, or Mark 5B). This software system has been implemented on the Wettzell Tsukuba baseline; evaluation before operational employment is under way.
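
    The SUDP wire format itself is not described in the abstract; purely as a hypothetical illustration of the sudp-send idea of pushing fixed-size data frames over UDP, here is a minimal Python sketch (frame size, file name, and address are placeholders, not the actual VDIF/SUDP formats):

      import socket

      FRAME_BYTES = 8192                    # placeholder frame size, not the VDIF spec
      DEST = ("192.0.2.10", 52100)          # placeholder receiver address

      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      with open("vlbi_stream.bin", "rb") as stream:   # placeholder capture file
          while frame := stream.read(FRAME_BYTES):
              sock.sendto(frame, DEST)      # one datagram per frame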

  12. CRISPR system for genome engineering: the application for autophagy study.

    PubMed

    Cui, Jianzhou; Chew, Shirley Jia Li; Shi, Yin; Gong, Zhiyuan; Shen, Han-Ming

    2017-05-01

    CRISPR/Cas9 is the latest tool introduced in the field of genome engineering and is so far the best genome-editing tool compared with its predecessors, such as meganucleases, zinc finger nucleases (ZFNs) and transcription activator-like effector nucleases (TALENs). The simple design and assembly of the CRISPR/Cas9 system makes genome editing easy to perform, as it uses small guide RNAs that correspond to their DNA targets for high-efficiency editing. This has helped open the doors for multiplexible genome targeting in many species that were intractable using older genetic perturbation techniques. Currently, the CRISPR system is revolutionizing the way biological research is conducted and paves the way for a bright future not only in research but also in medicine and biotechnology. In this review, we evaluate the history, types, structure, and mechanism of action of the CRISPR/Cas system. In particular, we focus on the application of this powerful tool in autophagy research. [BMB Reports 2017; 50(5): 247-256].

  13. Molecular Identification of Date Palm Cultivars Using Random Amplified Polymorphic DNA (RAPD) Markers.

    PubMed

    Al-Khalifah, Nasser S; Shanavaskhan, A E

    2017-01-01

    Ambiguity in the total number of date palm cultivars across the world points to the need for an enumerative study using standard morphological and molecular markers. Among molecular markers, DNA markers are the most suitable and ubiquitous for most applications. They are highly polymorphic in nature, frequently occurring in genomes, easy to access, and highly reproducible. Various molecular markers such as restriction fragment length polymorphism (RFLP), amplified fragment length polymorphism (AFLP), simple sequence repeats (SSR), inter-simple sequence repeats (ISSR), and random amplified polymorphic DNA (RAPD) markers have been successfully used as efficient tools for the analysis of genetic variation in date palm. This chapter explains a stepwise protocol for extracting total genomic DNA from date palm leaves. A user-friendly protocol for RAPD analysis and a table showing the primers used in different molecular techniques that produce polymorphisms in date palm are also provided.

  14. Forensic collection of trace chemicals from diverse surfaces with strippable coatings.

    PubMed

    Jakubowski, Michael J; Beltis, Kevin J; Drennan, Paul M; Pindzola, Bradford A

    2013-11-07

    Surface sampling for chemical analysis plays a vital role in environmental monitoring, industrial hygiene, homeland security and forensics. The standard surface sampling tool, a simple cotton gauze pad, is failing to meet the needs of the community as analytical techniques become more sensitive and the variety of analytes increases. In previous work, we demonstrated the efficacy of non-destructive, conformal, spray-on strippable coatings for chemical collection from simple glass surfaces. Here we expand that work by presenting chemical collection at a low spiking level (0.1 g m^-2) from a diverse array of common surfaces - painted metal, engineering plastics, painted wallboard and concrete - using strippable coatings. The collection efficiency of the strippable coatings is compared with that of gauze pads and far exceeds it. Collection from concrete, a particular challenge for wipes like gauze, averaged 73% over eight chemically diverse compounds for the strippable coatings, whereas gauze averaged 10%.

  15. Efficient micromagnetic modelling of spin-transfer torque and spin-orbit torque

    NASA Astrophysics Data System (ADS)

    Abert, Claas; Bruckner, Florian; Vogler, Christoph; Suess, Dieter

    2018-05-01

    While the spin-diffusion model is considered one of the most complete and accurate tools for the description of spin transport and spin torque, its solution in the context of dynamical micromagnetic simulations is numerically expensive. We propose a procedure to retrieve the free parameters of a simple macro-spin-like spin-torque model through the spin-diffusion model. In the case of spin-transfer torque, the simplified model reduces to the model of Slonczewski. A similar model can be established for the description of spin-orbit torque. In both cases, the spin-diffusion model enables the retrieval of free model parameters from the geometry and the material parameters of the system. Since these parameters usually have to be determined phenomenologically through experiments, the proposed method combines the strength of the diffusion model in resolving material parameters and geometry with the high performance of simple torque models.

  16. A review on simple assembly line balancing type-e problem

    NASA Astrophysics Data System (ADS)

    Jusop, M.; Rashid, M. F. F. Ab

    2015-12-01

    Simple assembly line balancing (SALB) is the problem of assigning tasks to the workstations along a line so that precedence relations are satisfied and some performance measure is optimised. Advanced algorithms are necessary to solve large-scale problems, as SALB is NP-hard. Only a few studies focus on the simple assembly line balancing problem of Type E (SALB-E), since it is a general and complex problem. SALB-E considers the number of workstations and the cycle time simultaneously with the aim of maximising line efficiency. This paper reviews previous work on optimising the SALB-E problem, including the Genetic Algorithm approaches that have been used. The review found that none of the existing works consider resource constraints in the SALB-E problem, especially machine and tool constraints. Research on SALB-E will contribute to the improvement of productivity in real industrial applications.

  17. Low Order Modeling Tools for Preliminary Pressure Gain Combustion Benefits Analyses

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.

    2012-01-01

    Pressure gain combustion (PGC) offers the promise of higher thermodynamic cycle efficiency and greater specific power in propulsion and power systems. This presentation describes a model, developed under a cooperative agreement between NASA and AFRL, for preliminarily assessing the performance enhancement and preliminary size requirements of PGC components either as stand-alone thrust producers or coupled with surrounding turbomachinery. The model is implemented in the Numerical Propulsion Simulation System (NPSS) environment allowing various configurations to be examined at numerous operating points. The validated model is simple, yet physics-based. It executes quickly in NPSS, yet produces realistic results.

  18. Automating the application of smart materials for protein crystallization.

    PubMed

    Khurshid, Sahir; Govada, Lata; El-Sharif, Hazim F; Reddy, Subrayal M; Chayen, Naomi E

    2015-03-01

    The fabrication and validation of the first semi-liquid nonprotein nucleating agent to be administered automatically to crystallization trials are reported. This research builds upon a prior demonstration of the suitability of molecularly imprinted polymers (MIPs; known as 'smart materials') for inducing protein crystal growth. Modified MIPs of altered texture suitable for high-throughput trials are demonstrated to improve crystal quality and to increase the probability of success when screening for suitable crystallization conditions. The application of these materials is simple, time-efficient and will provide a potent tool for structural biologists embarking on crystallization trials.

  19. Morphogenesis of early stage melanoma

    NASA Astrophysics Data System (ADS)

    Chatelain, Clément; Amar, Martine Ben

    2015-08-01

    Melanoma early detection is possible by simple skin examination and can ensure a high survival probability when successful. However, it requires efficient methods for distinguishing malignant lesions from common moles. This paper provides an overview, first, of the biological and physical mechanisms controlling early melanoma evolution, and then of the clinical tools available today for detecting melanoma in vivo at an early stage. It highlights the lack of diagnostic methods rationally linking macroscopic observables to the microscopic properties of the tissue, which define the malignancy of the tumor. The possible inputs of multiscale models for improving these methods are briefly discussed.

  20. Recent advances and versatility of MAGE towards industrial applications.

    PubMed

    Singh, Vijai; Braddick, Darren

    2015-12-01

    The genome engineering toolkit has expanded significantly in recent years, allowing us to study the functions of genes in cellular networks and assist in the over-production of proteins, drugs, chemicals and biofuels. Multiplex automated genome engineering (MAGE) was recently developed and has gained increasing scientific interest for strain engineering. MAGE is a simple, rapid and efficient tool for manipulating genes simultaneously at multiple loci, reassigning genetic codes and incorporating non-natural amino acids. MAGE can be further expanded towards the engineering of fast, robust and over-producing strains for chemicals, drugs and biofuels at industrial scales.

  1. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of in vivo studies in drug discovery. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goals of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface, while at the same time achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite than in an existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss the proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
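
    As a toy illustration of the Fisher-Information-based design evaluation mentioned above (not PopED lite's algorithm), the Python sketch below compares two sampling-time designs for a simple exponential model by the D-optimality criterion; the model, parameter values, and designs are all hypothetical.

      import numpy as np

      A, k = 100.0, 0.3   # hypothetical model y = A * exp(-k * t)

      def fisher_information(times):
          t = np.asarray(times, dtype=float)
          # Sensitivities of the model output w.r.t. the parameters (A, k).
          jac = np.column_stack([np.exp(-k * t), -A * t * np.exp(-k * t)])
          return jac.T @ jac   # unit residual variance assumed

      for design in ([1.0, 2.0, 4.0, 8.0], [0.5, 1.0, 1.5, 2.0]):
          logdet = np.linalg.slogdet(fisher_information(design))[1]
          print(design, "log det FIM =", round(logdet, 2))  # larger is better (D-optimality)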

  2. minimega

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Fritz, John Floren

    2013-08-27

    Minimega is a simple emulytics platform for creating testbeds of networked devices. The platform consists of easily deployable tools to facilitate bringing up large networks of virtual machines including Windows, Linux, and Android. Minimega attempts to allow experiments to be brought up quickly with nearly no configuration. Minimega also includes tools for simple cluster management, as well as tools for creating Linux based virtual machine images.

  3. TLD efficiency calculations for heavy ions: an analytical approach

    DOE PAGES

    Boscolo, Daria; Scifoni, Emanuele; Carlino, Antonio; ...

    2015-12-18

    The use of thermoluminescent dosimeters (TLDs) in heavy charged particle dosimetry is limited by their non-linear dose response curve and by their response dependence on the radiation quality. Thus, in order to use TLDs with particle beams, a model that can reproduce the behavior of these detectors under different conditions is needed. Here, a new, simple and completely analytical algorithm for the calculation of the relative TL efficiency depending on the ion charge Z and energy E is presented. In addition, the detector response is evaluated starting from the single-ion case, where the computed effectiveness values have been compared with experimental data as well as with predictions from a different method. The main advantage of this approach is that, being fully analytical, it is computationally fast and can be efficiently integrated into treatment planning verification tools. The calculated efficiency values were then implemented in the treatment planning code TRiP98, and dose calculations on a macroscopic target irradiated with an extended carbon ion field were performed and verified against experimental data.

  4. A novel FPGA-programmable switch matrix interconnection element in quantum-dot cellular automata

    NASA Astrophysics Data System (ADS)

    Hashemi, Sara; Rahimi Azghadi, Mostafa; Zakerolhosseini, Ali; Navi, Keivan

    2015-04-01

    Quantum-dot cellular automata (QCA) is a novel nanotechnology promising extra-low-power, extremely dense and very high-speed structures for the construction of logical circuits at the nanoscale. In this paper, previous work on the routing elements of QCA-based FPGAs is first investigated, and then an efficient, symmetric and reliable QCA programmable switch matrix (PSM) interconnection element is introduced. This element has a simple structure and offers complete routing capability. It is implemented using a bottom-up design approach that starts from a dense and high-speed 2:1 multiplexer and utilises it to build the target PSM interconnection element. In this study, simulations of the proposed circuits are carried out using QCADesigner, a layout and simulation tool for QCA circuits. The results demonstrate the high efficiency of the proposed designs in QCA-based FPGA routing.

  5. CRISPR/Cas9 mediates efficient conditional mutagenesis in Drosophila.

    PubMed

    Xue, Zhaoyu; Wu, Menghua; Wen, Kejia; Ren, Menda; Long, Li; Zhang, Xuedi; Gao, Guanjun

    2014-09-05

    Existing transgenic RNA interference (RNAi) methods greatly facilitate functional genome studies via controlled silencing of targeted mRNA in Drosophila. Although the RNAi approach is extremely powerful, concerns still linger about its low efficiency. Here, we developed a CRISPR/Cas9-mediated conditional mutagenesis system by combining tissue-specific expression of Cas9 driven by the Gal4/upstream activating site system with various ubiquitously expressed guide RNA transgenes to effectively inactivate gene expression in a temporally and spatially controlled manner. Furthermore, by including multiple guide RNAs in a transgenic vector to target a single gene, we achieved a high degree of gene mutagenesis in specific tissues. The CRISPR/Cas9-mediated conditional mutagenesis system provides a simple and effective tool for gene function analysis, and complements the existing RNAi approach. Copyright © 2014 Xue et al.

  6. Vehicle trajectory linearisation to enable efficient optimisation of the constant speed racing line

    NASA Astrophysics Data System (ADS)

    Timings, Julian P.; Cole, David J.

    2012-06-01

    A driver model is presented capable of optimising the trajectory of a simple dynamic nonlinear vehicle, at constant forward speed, so that progression along a predefined track is maximised as a function of time. In doing so, the model is able to continually operate a vehicle at its lateral-handling limit, maximising vehicle performance. The technique used forms part of the solution to the motor racing objective of minimising lap time. A new approach to formulating the minimum lap time problem is motivated by the need for a more computationally efficient and robust tool-set for understanding on-the-limit driving behaviour. This has been achieved through set-point-dependent linearisation of the vehicle model and coupling of the vehicle-track system using an intrinsic coordinate description. Through this, the geometric vehicle trajectory has been linearised relative to the track reference, leading to a new path optimisation algorithm that can be posed as a computationally efficient convex quadratic programming problem.

  7. DataPflex: a MATLAB-based tool for the manipulation and visualization of multidimensional datasets.

    PubMed

    Hendriks, Bart S; Espelin, Christopher W

    2010-02-01

    DataPflex is a MATLAB-based application that facilitates the manipulation and visualization of multidimensional datasets. The strength of DataPflex lies in its intuitive graphical user interface for the efficient incorporation, manipulation and visualization of high-dimensional data that can be generated by multiplexed protein measurement platforms including, but not limited to, Luminex or Meso-Scale Discovery. Such data can generally be represented in the form of multidimensional datasets [for example (time × stimulation × inhibitor × inhibitor concentration × cell type × measurement)]. For cases where measurements are made in a combinatorial fashion across multiple dimensions, there is a need for a tool to efficiently manipulate and reorganize such data for visualization. DataPflex accepts data consisting of up to five arbitrary dimensions in addition to a measurement dimension. Data are imported from a simple .xls format and can be exported to MATLAB or .xls. Data dimensions can be reordered, subdivided, merged, normalized and visualized in the form of collections of line graphs, bar graphs, surface plots, heatmaps, IC50s and other custom plots. Open-source implementation in MATLAB enables easy extension for custom plotting routines and integration with more sophisticated analysis tools. DataPflex is distributed under the GPL license (http://www.gnu.org/licenses/) together with documentation, source code and sample data files at: http://code.google.com/p/datapflex. Supplementary data are available at Bioinformatics online.
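
    DataPflex itself is MATLAB-based; purely to illustrate the kind of data-cube manipulation it automates, here is a generic numpy sketch with a random placeholder array (the dimension sizes are arbitrary):

      import numpy as np

      # Dimensions: time x stimulation x inhibitor x concentration x cell type.
      data = np.random.default_rng(1).random((6, 3, 4, 5, 2))

      mean_over_time = data.mean(axis=0)    # collapse the time dimension -> (3, 4, 5, 2)
      cell_type_0 = mean_over_time[..., 0]  # slice one cell type -> (3, 4, 5)
      print(cell_type_0.shape)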

  8. The Heterogeneous Investment Horizon and Dynamic Strategies for Asset Allocation

    NASA Astrophysics Data System (ADS)

    Xiong, Heping; Xu, Yiheng; Xiao, Yi

    This paper discusses the influence of the portfolio rebalancing strategy on the efficiency of long-term investment portfolios under the assumption of independently and stationarily distributed returns. By comparing the efficient sets of the stochastic rebalancing strategy, the simple rebalancing strategy and the buy-and-hold strategy with specific data examples, we find that the stochastic rebalancing strategy is optimal, while the simple rebalancing strategy is the least efficient. In addition, the simple rebalancing strategy lowers the efficiency of the portfolio instead of improving it.

  9. Validation of green-solvent extraction combined with chromatographic chemical fingerprint to evaluate quality of Stevia rebaudiana Bertoni.

    PubMed

    Teo, Chin Chye; Tan, Swee Ngin; Yong, Jean Wan Hong; Hew, Choy Sin; Ong, Eng Shi

    2009-02-01

    An approach combining green-solvent extraction methods with chromatographic chemical fingerprints and pattern recognition tools such as principal component analysis (PCA) was used to evaluate the quality of medicinal plants. Pressurized hot water extraction (PHWE) and microwave-assisted extraction (MAE) were used, and their efficiencies in extracting two bioactive compounds, namely stevioside (SV) and rebaudioside A (RA), from Stevia rebaudiana Bertoni (SB) grown under different cultivation conditions were compared. The proposed methods showed that SV and RA could be extracted from SB using pure water under optimized conditions. The extraction efficiency of the methods was observed to be higher than or comparable to heating under reflux with water. The method precision (RSD, n = 6) was found to vary from 1.91 to 2.86% for the two different methods on different days. Compared to PHWE, MAE had higher extraction efficiency with a shorter extraction time. MAE was also found to extract more chemical constituents and to provide distinctive chemical fingerprints for quality control purposes. Thus, a combination of MAE with chromatographic chemical fingerprints and PCA provided a simple and rapid approach for the comparison and classification of medicinal plants from different growth conditions. Hence, the current work highlights the importance of the extraction method in chemical fingerprinting for the classification of medicinal plants from different cultivation conditions with the aid of pattern recognition tools.
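
    As a sketch of the pattern-recognition step (not the authors' pipeline), PCA can be applied to a matrix of chromatographic fingerprints with rows as extracts and columns as peak areas; the data below are random placeholders, and scikit-learn is assumed to be available.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      fingerprints = rng.random((12, 40))   # 12 extracts x 40 chromatographic peaks

      # Project onto the first two principal components for grouping/classification.
      scores = PCA(n_components=2).fit_transform(fingerprints)
      print(scores.shape)                   # (12, 2)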

  10. FocusStack and StimServer: a new open source MATLAB toolchain for visual stimulation and analysis of two-photon calcium neuronal imaging data.

    PubMed

    Muir, Dylan R; Kampa, Björn M

    2014-01-01

    Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories.

  11. FocusStack and StimServer: a new open source MATLAB toolchain for visual stimulation and analysis of two-photon calcium neuronal imaging data

    PubMed Central

    Muir, Dylan R.; Kampa, Björn M.

    2015-01-01

    Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories. PMID:25653614

  12. Technical note: Harmonizing met-ocean model data via standard web services within small research groups

    NASA Astrophysics Data System (ADS)

    Signell, R. P.; Camossi, E.

    2015-11-01

    Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by: (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free, and that is easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data, but virtually aggregates and standardizes the data using the NetCDF Markup Language. The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®) and Iris (Python) for data access, and the Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS. (Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.)

  13. Ultrathin nondoped emissive layers for efficient and simple monochrome and white organic light-emitting diodes.

    PubMed

    Zhao, Yongbiao; Chen, Jiangshan; Ma, Dongge

    2013-02-01

    In this paper, highly efficient and simple monochrome blue, green, orange, and red organic light-emitting diodes (OLEDs) based on ultrathin nondoped emissive layers (EMLs) are reported. The ultrathin nondoped EML was constructed by introducing a 0.1 nm thin layer of pure phosphorescent dye between a hole transporting layer and an electron transporting layer. The maximum external quantum efficiencies (EQEs) reached 17.1%, 20.9%, 17.3%, and 19.2% for blue, green, orange, and red monochrome OLEDs, respectively, indicating the universality of the ultrathin nondoped EML for most phosphorescent dyes. On this basis, simple white OLED structures are also demonstrated. The demonstrated complementary blue/orange, three-primary blue/green/red, and four-color blue/green/orange/red white OLEDs show high efficiency and good white emission, indicating the advantage of ultrathin nondoped EMLs in constructing simple and efficient white OLEDs.

  14. Strehl ratio: a tool for optimizing optical nulls and singularities.

    PubMed

    Hénault, François

    2015-07-01

    In this paper a set of radial and azimuthal phase functions are reviewed that have a null Strehl ratio, which is equivalent to generating a central extinction in the image plane of an optical system. The study is conducted in the framework of Fraunhofer scalar diffraction, and is oriented toward practical cases where optical nulls or singularities are produced by deformable mirrors or phase plates. The identified solutions reveal unexpected links with the zeros of type-J Bessel functions of integer order. They include linear azimuthal phase ramps giving birth to an optical vortex, azimuthally modulated phase functions, and circular phase gratings (CPGs). It is found in particular that the CPG radiometric efficiency could be significantly improved by the null Strehl ratio condition. Simple design rules for rescaling and combining the different phase functions are also defined. Finally, the described analytical solutions could also serve as starting points for an automated searching software tool.

  15. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    PubMed

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. To date, multiple challenges in protein MS data analysis remain: management of large-scale, complex data sets; MS peak identification and indexing; and high-dimensional differential peak analysis with concurrent statistical testing under false discovery rate (FDR) control. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution that provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports online uploading and analysis of large-scale MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
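
    The statistical core of such a portal, flagging peaks whose between-category differences survive FDR control, can be illustrated with the standard Benjamini-Hochberg procedure (a common FDR method; the abstract does not specify the portal's exact test). A minimal numpy sketch:

      import numpy as np

      def benjamini_hochberg(pvals, alpha=0.05):
          """Boolean mask of p-values significant at FDR level alpha."""
          p = np.asarray(pvals)
          m = len(p)
          order = np.argsort(p)
          # step-up rule: largest k with p_(k) <= (k / m) * alpha
          passed = p[order] <= (np.arange(1, m + 1) / m) * alpha
          k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
          mask = np.zeros(m, dtype=bool)
          mask[order[:k]] = True
          return mask

      # e.g. p-values from per-peak tests between sample categories
      print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.74]))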

  16. Single Spore Isolation as a Simple and Efficient Technique to obtain fungal pure culture

    NASA Astrophysics Data System (ADS)

    Noman, E.; Al-Gheethi, AA; Rahman, N. K.; Talip, B.; Mohamed, R.; H, N.; Kadir, O. A.

    2018-04-01

    Successful identification of fungi by phenotypic methods or molecular techniques depends mainly on the use of an advanced technique for purifying the isolates. The most efficient is the single-spore technique, owing to its simple requirements and its efficiency in preventing contamination by yeast, mites or bacteria. The method described in the present work depends on the use of a light microscope to transfer one spore onto a new culture medium. The present work describes a simple and efficient single-spore isolation procedure for purifying fungi recovered from clinical wastes.

  17. Comparison in partition efficiency of protein separation between four different tubing modifications in spiral high-speed countercurrent chromatography

    PubMed Central

    Ito, Yoichiro; Clary, Robert

    2016-01-01

    High-speed countercurrent chromatography with a spiral tube assembly can retain a satisfactory amount of stationary phase of the polymer phase systems used for protein separation. In order to improve the partition efficiency, a simple tool for modifying the tubing shape was fabricated, and the following four different tubing modifications were made: intermittently pressed at 10 mm width, flat, flat-wave, and flat-twist. Partition efficiencies of separation columns made from these modified tubings were examined in protein separation with an aqueous-aqueous polymer phase system at flow rates of 1–2 ml/min under 800 rpm. The results indicated that all the modified-tubing columns improved the partition efficiency at a flow rate of 1 ml/min, but at a higher flow rate of 2 ml/min the columns made of flattened tubing showed lowered partition efficiency, apparently due to loss of the retained stationary phase. Among all the modified columns, the column with intermittently pressed tubing gave the best peak resolution. It may be concluded that the intermittently pressed and flat-twist tubings improve the partition efficiency in semi-preparative separations, while the flat and flat-wave configurations may be used for analytical separations at a low flow rate. PMID:27790621

  18. Comparison in partition efficiency of protein separation between four different tubing modifications in spiral high-speed countercurrent chromatography.

    PubMed

    Ito, Yoichiro; Clary, Robert

    2016-12-01

    High-speed countercurrent chromatography with a spiral tube assembly can retain a satisfactory amount of stationary phase of the polymer phase systems used for protein separation. In order to improve the partition efficiency, a simple tool for modifying the tubing shape was fabricated, and the following four different tubing modifications were made: intermittently pressed at 10 mm width, flat, flat-wave, and flat-twist. Partition efficiencies of separation columns made from these modified tubings were examined in protein separation with an aqueous-aqueous polymer phase system at flow rates of 1-2 ml/min under 800 rpm. The results indicated that all the modified-tubing columns improved the partition efficiency at a flow rate of 1 ml/min, but at a higher flow rate of 2 ml/min the columns made of flattened tubing showed lowered partition efficiency, apparently due to loss of the retained stationary phase. Among all the modified columns, the column with intermittently pressed tubing gave the best peak resolution. It may be concluded that the intermittently pressed and flat-twist tubings improve the partition efficiency in semi-preparative separations, while the flat and flat-wave configurations may be used for analytical separations at a low flow rate.

  19. Wavelet-Based Peak Detection and a New Charge Inference Procedure for MS/MS Implemented in ProteoWizard’s msConvert

    PubMed Central

    2015-01-01

    We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1–100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets. PMID:25411686

  20. Wavelet-based peak detection and a new charge inference procedure for MS/MS implemented in ProteoWizard's msConvert.

    PubMed

    French, William R; Zimmerman, Lisa J; Schilling, Birgit; Gibson, Bradford W; Miller, Christine A; Townsend, R Reid; Sherrod, Stacy D; Goodwin, Cody R; McLean, John A; Tabb, David L

    2015-02-06

    We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1-100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets.

  1. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.

  2. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996
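
    The Holt-Winters models the tool ranks are also available in open-source libraries; a minimal Python sketch of the additive variant on synthetic monthly volumes, not the authors' implementation, assuming statsmodels is installed:

      import numpy as np
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      # two years of synthetic monthly test volumes with trend and seasonality
      months = np.arange(24)
      volumes = 1000 + 15 * months + 80 * np.sin(2 * np.pi * months / 12)

      fit = ExponentialSmoothing(
          volumes, trend="add", seasonal="add", seasonal_periods=12
      ).fit()
      print(fit.forecast(6))  # predicted demand for the next six months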

  3. A simple and robust vector-based shRNA expression system used for RNA interference.

    PubMed

    Wang, Xue-jun; Li, Ying; Huang, Hai; Zhang, Xiu-juan; Xie, Pei-wen; Hu, Wei; Li, Dan-dan; Wang, Sheng-qi

    2013-01-01

    RNA interference (RNAi) mediated by small interfering RNAs (siRNAs) or short hairpin RNAs (shRNAs) has become a powerful genetic tool for conducting functional studies. Previously, vector-based shRNA-expression strategies capable of inducing RNAi in viable cells have been developed; however, these vector systems have disadvantages, being either error-prone or cost-prohibitive. In this report we describe the development of a simple, robust shRNA expression system utilizing 1 long oligonucleotide or 2 short oligonucleotides, at half the cost of conventional shRNA construction methods and with a >95% cloning success rate. The shRNA loop sequence and stem structure were also compared and carefully selected for better RNAi efficiency. Furthermore, an easier strategy was developed based on isocaudomers, which permits rapid combination of the most efficient promoter-shRNA cassettes. Finally, using this method, conservative target sites for hepatitis B virus (HBV) knockdown were systematically screened, and HBV antigen expression was shown to be successfully suppressed in the presence of connected multiple shRNAs both in vitro and in vivo. This novel design describes an inexpensive and effective way to clone and express single or multiple shRNAs from the same vector, with the capacity for potent and effective silencing of target genes.
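
    The hairpin logic behind such inserts (a sense sequence, a loop, the antisense reverse complement, then a pol III terminator) is easy to express in code; a hypothetical Python sketch, in which the loop is one commonly used sequence and the target site is invented, neither taken from the paper:

      def revcomp(seq):
          return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

      def shrna_sense_oligo(target, loop="TTCAAGAGA"):
          """Sense-strand hairpin insert: sense - loop - antisense - terminator.

          target is a 19-21 nt sense sequence from the gene to silence; the
          loop shown is one commonly used sequence, purely illustrative here.
          """
          return target + loop + revcomp(target) + "TTTTTT"

      print(shrna_sense_oligo("GCAAGCTGACCCTGAAGTT"))  # invented target site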

  4. EMMA: An Extensible Mammalian Modular Assembly Toolkit for the Rapid Design and Production of Diverse Expression Vectors.

    PubMed

    Martella, Andrea; Matjusaitis, Mantas; Auxillos, Jamie; Pollard, Steven M; Cai, Yizhi

    2017-07-21

    Mammalian plasmid expression vectors are critical reagents underpinning many facets of research across biology, biomedical research, and the biotechnology industry. Traditional cloning methods often require laborious manual design and assembly of plasmids using tailored sequential cloning steps. This process can be protracted, complicated, expensive, and error-prone. New tools and strategies that facilitate the efficient design and production of bespoke vectors would help relieve a current bottleneck for researchers. To address this, we have developed an extensible mammalian modular assembly kit (EMMA). This enables rapid and efficient modular assembly of mammalian expression vectors in a one-tube, one-step golden-gate cloning reaction, using a standardized library of compatible genetic parts. The high modularity, flexibility, and extensibility of EMMA provide a simple method for the production of functionally diverse mammalian expression vectors. We demonstrate the value of this toolkit by constructing and validating a range of representative vectors, such as transient and stable expression vectors (transposon based vectors), targeting vectors, inducible systems, polycistronic expression cassettes, fusion proteins, and fluorescent reporters. The method also supports simple assembly of combinatorial libraries and hierarchical assembly for the production of larger multigenetic cargos. In summary, EMMA is compatible with automated production, and novel genetic parts can be easily incorporated, providing new opportunities for mammalian synthetic biology.

  5. The Hematopoietic Expression Viewer: expanding mobile apps as a scientific tool.

    PubMed

    James, Regis A; Rao, Mitchell M; Chen, Edward S; Goodell, Margaret A; Shaw, Chad A

    2012-07-15

    Many important data sets in current biological science comprise hundreds, thousands, or more individual results. These massive data require computational tools to navigate results and effectively interact with the content. Mobile device apps are an increasingly important tool in the everyday lives of scientists and non-scientists alike. Such software presents individuals with compact and efficient tools to interact with complex data at meetings or other locations remote from their main computing environment. We believe that apps will be important tools for biologists, geneticists, and physicians to review content while participating in biomedical research or practicing medicine. We have developed a prototype app for displaying gene expression data using the iOS platform. To present the software engineering requirements, we review the model-view-controller schema for Apple's iOS. We apply this schema to a simple app for querying locally developed microarray gene expression data. The challenge of this application is to balance storing content locally within the app against obtaining it dynamically via a network connection. The Hematopoietic Expression Viewer is available at http://www.shawlab.org/he_viewer. The source code for this project and any future information on how to obtain the app can be accessed at http://www.shawlab.org/he_viewer.

  6. Antisense oligonucleotide technologies in drug discovery.

    PubMed

    Aboul-Fadl, Tarek

    2006-09-01

    The principle of antisense oligonucleotide (AS-OD) technologies is based on the specific inhibition of unwanted gene expression by blocking mRNA activity. It has long appeared to be an ideal strategy to leverage new genomic knowledge for drug discovery and development. In recent years, AS-OD technologies have been widely used as potent and promising tools for this purpose, and the number of antisense molecules progressing through clinical trials is increasing rapidly. AS-OD technologies provide a simple and efficient approach for drug discovery and development, and antisense drugs are expected to become a reality in the near future. This editorial describes the established and emerging AS-OD technologies in drug discovery.

  7. Automated Reporting of DXA Studies Using a Custom-Built Computer Program.

    PubMed

    England, Joseph R; Colletti, Patrick M

    2018-06-01

    Dual-energy x-ray absorptiometry (DXA) scans are a critical population health tool and relatively simple to interpret but can be time consuming to report, often requiring manual transfer of bone mineral density and associated statistics into commercially available dictation systems. We describe here a custom-built computer program for automated reporting of DXA scans using Pydicom, an open-source package built in the Python computer language, and regular expressions to mine DICOM tags for patient information and bone mineral density statistics. This program, easy to emulate by any novice computer programmer, has doubled our efficiency at reporting DXA scans and has eliminated dictation errors.
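
    A minimal Python sketch of the approach described, reading DICOM tags with pydicom and mining embedded text with a regular expression; the file path, the tag carrying the report text, and the report format are assumptions for illustration, not the authors' program:

      import re
      import pydicom

      ds = pydicom.dcmread("dxa_study.dcm")             # path is hypothetical
      patient = f"{ds.PatientName}, ID {ds.PatientID}"  # standard DICOM tags

      # Some units embed results as text; mining it with a regular expression
      # is one approach (the tag and the report format here are assumptions).
      report_text = str(ds.get("ImageComments", ""))
      match = re.search(r"BMD[:=]\s*([\d.]+)\s*g/cm2", report_text)
      bmd = float(match.group(1)) if match else None
      print(f"{patient}: BMD = {bmd} g/cm2" if bmd else "BMD not found")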

  8. Voluntary environmental agreements: Good or bad news for environmental protection?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segerson, K.; Miceli, T.J.

    1998-09-01

    There has been growing interest in the use of voluntary agreements (VAs) as an environmental policy tool. This article uses a simple model to determine whether VAs are likely to lead to efficient environmental protection. The authors consider cases where polluters are induced to participate either by a background threat of mandatory controls (the stick approach) or by cost-sharing subsidies (the carrot approach). The results suggest that the overall impact on environmental quality could be positive or negative, depending on a number of factors, including the allocation of bargaining power, the magnitude of the background threat, and the social cost of funds.

  9. Overview of Fundamental High-Lift Research for Transport Aircraft at NASA

    NASA Technical Reports Server (NTRS)

    Leavitt, L. D.; Washburn, A. E.; Wahls, R. A.

    2007-01-01

    NASA has had a long history of fundamental and applied high-lift research. Current programs focus on the validation of technologies and tools that will enable extremely short takeoff and landing coupled with efficient cruise performance, simple flaps with flow control for improved effectiveness, circulation control wing concepts, some exploration into new aircraft concepts, and partnership with the Air Force Research Lab in mobility. Transport high-lift development testing will shift more toward mid and high Reynolds number (Rn) facilities, at least until the question "How much Rn is required?" is answered. This viewgraph presentation provides an overview of high-lift research at NASA.

  10. Generation of genome-modified Drosophila cell lines using SwAP.

    PubMed

    Franz, Alexandra; Brunner, Erich; Basler, Konrad

    2017-10-02

    The ease of generating genetically modified animals and cell lines has been markedly increased by the recent development of the versatile CRISPR/Cas9 tool. However, while the isolation of isogenic cell populations is usually straightforward for mammalian cell lines, the generation of clonal Drosophila cell lines has remained a longstanding challenge, hampered by the difficulty of getting Drosophila cells to grow at low densities. Here, we describe a highly efficient workflow to generate clonal Cas9-engineered Drosophila cell lines using a combination of cell pools, limiting dilution in conditioned medium and PCR with allele-specific primers, enabling the efficient selection of a clonal cell line with a suitable mutation profile. We validate the protocol by documenting the isolation, selection and verification of eight independently Cas9-edited armadillo mutant Drosophila cell lines. Our method provides a powerful and simple workflow that improves the utility of Drosophila cells for genetic studies with CRISPR/Cas9.

  11. Self-running and self-floating two-dimensional actuator using near-field acoustic levitation

    NASA Astrophysics Data System (ADS)

    Chen, Keyu; Gao, Shiming; Pan, Yayue; Guo, Ping

    2016-09-01

    Non-contact actuators are promising technologies in metrology, machine tools, and hovercars, but have suffered from low energy efficiency, complex designs, and low controllability. Here we report a new design of a self-running and self-floating actuator capable of two-dimensional motion with an unlimited travel range. The proposed design exploits near-field acoustic levitation for heavy object lifting, and coupled resonant vibration for generation of acoustic streaming for non-contact motion in designated directions. The device utilizes resonant vibration of the structure for high energy efficiency, and adopts a single piezo element to achieve both levitation and non-contact motion for a compact and simple design. Experiments demonstrate that the proposed actuator can reach moving speeds of 1.65 cm/s or faster and can transport a total weight of 80 g under 1.2 W power consumption.

  12. Adeno-associated Virus as a Mammalian DNA Vector

    PubMed Central

    SALGANIK, MAX; HIRSCH, MATTHEW L.; SAMULSKI, RICHARD JUDE

    2015-01-01

    In the nearly five decades since its accidental discovery, adeno-associated virus (AAV) has emerged as a highly versatile vector system for both research and clinical applications. A broad range of natural serotypes, as well as an increasing number of capsid variants, has combined to produce a repertoire of vectors with different tissue tropisms, immunogenic profiles and transduction efficiencies. The story of AAV is one of continued progress and surprising discoveries in a viral system that, at first glance, is deceptively simple. This apparent simplicity has enabled the advancement of AAV into the clinic, where despite some challenges it has provided hope for patients and a promising new tool for physicians. Although a great deal of work remains to be done, both in studying the basic biology of AAV and in optimizing its clinical application, AAV vectors are currently the safest and most efficient platform for gene transfer in mammalian cells. PMID:26350320

  13. Cell-Free Optogenetic Gene Expression System.

    PubMed

    Jayaraman, Premkumar; Yeoh, Jing Wui; Jayaraman, Sudhaghar; Teh, Ai Ying; Zhang, Jingyun; Poh, Chueh Loo

    2018-04-20

    Optogenetic tools provide a new and efficient way to dynamically program gene expression with unmatched spatiotemporal precision. To date, their vast potential remains untapped in the field of cell-free synthetic biology, largely due to the lack of simple and efficient light-switchable systems. Here, to bridge the gap between cell-free systems and optogenetics, we studied our previously engineered one component-based blue light-inducible Escherichia coli promoter in a cell-free environment through experimental characterization and mathematical modeling. We achieved >10-fold dynamic expression and demonstrated rapid and reversible activation of the target gene to generate oscillatory response. The deterministic model developed was able to recapitulate the system behavior and helped to provide quantitative insights to optimize dynamic response. This in vitro optogenetic approach could be a powerful new high-throughput screening technology for rapid prototyping of complex biological networks in both space and time without the need for chemical induction.
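
    The deterministic model is not reproduced in the abstract; as a hedged sketch of the general form such models take, here is a two-state (mRNA/protein) ODE with a light-dependent transcription term, integrated with scipy (all parameters and the pulse pattern are illustrative, not the paper's fitted model):

      import numpy as np
      from scipy.integrate import odeint

      def light(t):                       # square-wave blue-light input
          return 1.0 if (t % 200) < 100 else 0.0

      def model(y, t, k_tx, k_tl, d_m, d_p):
          m, p = y                        # mRNA and protein levels
          dm = k_tx * light(t) - d_m * m  # light-dependent transcription
          dp = k_tl * m - d_p * p         # translation and decay
          return [dm, dp]

      t = np.linspace(0, 600, 601)
      m, p = odeint(model, [0.0, 0.0], t, args=(1.0, 2.0, 0.1, 0.05)).T
      print(p.max())                      # peak reporter level under pulses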

  14. Tongue motor training support system.

    PubMed

    Sasaki, Makoto; Onishi, Kohei; Nakayama, Atsushi; Kamata, Katsuhiro; Stefanov, Dimitar; Yamaguchi, Masaki

    2014-01-01

    In this paper, we introduce a new tongue-training system that can be used to improve the tongue's range of motion and muscle strength after dysphagia. The training process is organized in a game-like manner. Initially, we analyzed surface electromyography (EMG) signals of the suprahyoid muscles of five subjects during tongue-training motions. This test revealed that four types of tongue-training motions and a swallowing motion could be classified with 93.5% accuracy. The recognized EMG signals were then used to control a mouse cursor via intentional tongue motions. Results demonstrated that simple PC games could be played using tongue motions, making tongue training efficient and enjoyable. Using the proposed method, dysphagia patients can choose games that suit their preferences and/or state of mind. It is expected that the proposed system will be an efficient tool for long-term tongue motor training and for maintaining patients' motivation.

  15. Cryopreservation for preservation of potato genetic resources

    PubMed Central

    Niino, Takao; Arizaga, Miriam Valle

    2015-01-01

    Cryopreservation is becoming a very important tool for the long-term storage of plant genetic resources, and efficient cryopreservation protocols have been developed for a large number of plant species. Practical procedures developed using in vitro tissue culture can be a simple and reliable preservation option for potato genetic resources, which, owing to their allogamous nature, are otherwise maintained by vegetative propagation in genebanks. Cryopreserved materials ensure a long-term backup of field collections against loss of plant germplasm. Genetic variation arising in tissue-culture cells during prolonged subcultures can be avoided with suitable cryopreservation protocols that provide high regrowth, facilitating systematic and strategic cryo-banking of plant genetic resources. The cryopreservation protocols for potato reviewed here can efficiently complement field and in vitro conservation, providing for the preservation of genotypes difficult to preserve by other methods, wild types, and other species designated as priority collections. PMID:25931979

  16. Efficient transformation and artificial miRNA gene silencing in Lemna minor

    PubMed Central

    Cantó-Pastor, Alex; Mollá-Morales, Almudena; Ernst, Evan; Dahl, William; Zhai, Jixian; Yan, Yiheng; Meyers, Blake; Shanklin, John; Martienssen, Robert

    2015-01-01

    Lack of genetic tools in the Lemnaceae (duckweed) has impeded full implementation of this organism as a model for biological research, despite its rapid doubling time, simple architecture and unusual metabolic characteristics. Here we present technologies to facilitate high-throughput genetic studies in duckweed. We developed a fast and efficient method for producing stable transgenic fronds of Lemna minor via Agrobacterium-mediated transformation and regeneration from tissue culture. Additionally, we engineered an artificial microRNA (amiRNA) gene silencing system. We identified a Lemna gibba endogenous miR166 precursor and used it as a backbone to produce amiRNAs. As a proof of concept we induced the silencing of CH42, a Magnesium Chelatase subunit, using our amiRNA platform. Expression of CH42 in transgenic Lemna minor fronds was significantly reduced, which resulted in reduction of chlorophyll pigmentation. The techniques presented here will enable tackling future challenges in the biology and biotechnology of Lemnaceae. PMID:24989135

  17. Efficient computation paths for the systematic analysis of sensitivities

    NASA Astrophysics Data System (ADS)

    Greppi, Paolo; Arato, Elisabetta

    2013-01-01

    A systematic sensitivity analysis requires computing the model on all points of a multi-dimensional grid covering the domain of interest, defined by the ranges of variability of the inputs. The issues to efficiently perform such analyses on algebraic models are handling solution failures within and close to the feasible region and minimizing the total iteration count. Scanning the domain in the obvious order is sub-optimal in terms of total iterations and is likely to cause many solution failures. The problem of choosing a better order can be translated geometrically into finding Hamiltonian paths on certain grid graphs. This work proposes two paths, one based on a mixed-radix Gray code and the other, a quasi-spiral path, produced by a novel heuristic algorithm. Some simple, easy-to-visualize examples are presented, followed by performance results for the quasi-spiral algorithm and the practical application of the different paths in a process simulation tool.
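
    One simple Hamiltonian path on a mixed-radix grid is the boustrophedon (snake) order, in which consecutive points differ in exactly one coordinate by one step; it is in the spirit of the paper's mixed-radix Gray code, though not its quasi-spiral heuristic. A Python generator:

      def snake_order(radices):
          """Visit every point of a mixed-radix grid so that consecutive
          points differ in exactly one coordinate by +/-1 (a boustrophedon
          Hamiltonian path, one simple instance of a mixed-radix Gray code)."""
          if not radices:
              yield ()
              return
          inner = list(snake_order(radices[1:]))
          for i in range(radices[0]):
              for tail in (inner if i % 2 == 0 else reversed(inner)):
                  yield (i,) + tail

      # a 3 x 4 grid of input combinations, scanned without large jumps
      for point in snake_order([3, 4]):
          print(point)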

  18. Visual display aid for orbital maneuvering - Design considerations

    NASA Technical Reports Server (NTRS)

    Grunwald, Arthur J.; Ellis, Stephen R.

    1993-01-01

    This paper describes the development of an interactive proximity operations planning system that allows on-site planning of fuel-efficient multiburn maneuvers in a potential multispacecraft environment. Although this display system most directly assists planning by providing visual feedback to aid visualization of the trajectories and constraints, its most significant features include: (1) the use of an 'inverse dynamics' algorithm that removes control nonlinearities facing the operator, and (2) a trajectory planning technique that separates, through a 'geometric spreadsheet', the normally coupled complex problems of planning orbital maneuvers and allows solution by an iterative sequence of simple independent actions. The visual feedback of trajectory shapes and operational constraints, provided by user-transparent and continuously active background computations, allows the operator to make fast, iterative design changes that rapidly converge to fuel-efficient solutions. The planning tool provides an example of operator-assisted optimization of nonlinear cost functions.

  19. Empty tracks optimization based on Z-Map model

    NASA Astrophysics Data System (ADS)

    Liu, Le; Yan, Guangrong; Wang, Zaijun; Zang, Genao

    2017-12-01

    For parts with many features, there are many empty tracks during machining. If these tracks are not optimized, machining efficiency is seriously affected. In this paper, the characteristics of empty tracks are studied in detail. Combining this with an existing optimization algorithm, a new track optimization method based on the Z-Map model is proposed. In this method, the tool tracks are divided into unit processing segments, and Z-Map model simulation is used to analyze the order constraints between the unit segments. The empty-stroke optimization problem is transformed into a TSP with sequential constraints, which is then solved with a genetic algorithm. This optimization method can handle both simple and complex structural parts, effectively planning the empty tracks and greatly improving machining efficiency.
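
    As a toy stand-in for the genetic-algorithm solution of this precedence-constrained TSP (not the paper's algorithm), a greedy nearest-neighbour ordering that respects the order constraints can be sketched in Python:

      import math

      def greedy_track_order(points, prereq):
          """Order unit segments by nearest-neighbour choice among segments
          whose prerequisites are already machined (a greedy toy, not a GA).

          points : {seg_id: (x, y)} entry point of each unit segment
          prereq : {seg_id: set of seg_ids that must come first}
          """
          done, order, pos = set(), [], (0.0, 0.0)
          while len(done) < len(points):
              ready = [s for s in points
                       if s not in done and prereq.get(s, set()) <= done]
              nxt = min(ready, key=lambda s: math.dist(pos, points[s]))
              order.append(nxt)
              done.add(nxt)
              pos = points[nxt]
          return order

      print(greedy_track_order({1: (0, 1), 2: (5, 5), 3: (1, 1)}, {3: {1}}))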

  20. GOSSIP, a New VO Compliant Tool for SED Fitting

    NASA Astrophysics Data System (ADS)

    Franzetti, P.; Scodeggio, M.; Garilli, B.; Fumana, M.; Paioro, L.

    2008-08-01

    We present GOSSIP (Galaxy Observed-Simulated SED Interactive Program), a new tool developed to perform SED fitting in a simple, user-friendly and efficient way. GOSSIP automatically builds up the observed SED of an object (or a large sample of objects) by combining magnitudes in different bands and, possibly, a spectrum; it then performs a χ^2 minimization fitting procedure against a set of synthetic models. The fitting results are used to estimate a number of physical parameters, such as the star formation history, absolute magnitudes and stellar mass, together with their probability distribution functions. User-defined models can be used, but GOSSIP is also able to load models produced by the most commonly used population synthesis codes. GOSSIP can be used interactively with other visualization tools using the PLASTIC protocol for communications. Moreover, since it has been developed with large data sets in mind, it will be extended to operate within the Virtual Observatory framework. GOSSIP is distributed to the astronomical community from the PANDORA group web site (http://cosmos.iasf-milano.inaf.it/pandora/gossip.html).
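
    The core χ² step of SED fitting is compact enough to sketch; a hedged numpy version with a free flux scale per template (not GOSSIP's implementation):

      import numpy as np

      def best_fit(obs, err, models):
          """Chi-square template fitting with a free flux scale per model.

          obs, err : observed fluxes and 1-sigma errors (N bands)
          models   : (M, N) array of synthetic SED fluxes
          """
          w = 1.0 / err**2
          scale = (models * obs * w).sum(axis=1) / (models**2 * w).sum(axis=1)
          chi2 = (w * (obs - scale[:, None] * models) ** 2).sum(axis=1)
          return int(np.argmin(chi2)), chi2

      obs = np.array([1.0, 1.9, 3.1])
      err = np.array([0.1, 0.2, 0.3])
      models = np.array([[1.0, 2.0, 3.0], [3.0, 2.0, 1.0]])
      print(best_fit(obs, err, models))   # first template should win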

  1. ColorTree: a batch customization tool for phylogenic trees

    PubMed Central

    Chen, Wei-Hua; Lercher, Martin J

    2009-01-01

    Background: Genome sequencing projects and comparative genomics studies typically aim to trace the evolutionary history of large gene sets, often requiring human inspection of hundreds of phylogenetic trees. If trees are checked for compatibility with an explicit null hypothesis (e.g., the monophyly of certain groups), this daunting task is greatly facilitated by an appropriate coloring scheme. Findings: In this note, we introduce ColorTree, a simple yet powerful batch customization tool for phylogenic trees. Based on pattern matching rules, ColorTree applies a set of customizations to an input tree file, e.g., coloring labels or branches. The customized trees are saved to an output file, which can then be viewed and further edited by Dendroscope (a freely available tree viewer). ColorTree runs on any Perl installation as a stand-alone command line tool, and its application can thus be easily automated. This way, hundreds of phylogenic trees can be customized for easy visual inspection in a matter of minutes. Conclusion: ColorTree allows efficient and flexible visual customization of large tree sets through the application of a user-supplied configuration file to multiple tree files. PMID:19646243

  2. ColorTree: a batch customization tool for phylogenic trees.

    PubMed

    Chen, Wei-Hua; Lercher, Martin J

    2009-07-31

    Genome sequencing projects and comparative genomics studies typically aim to trace the evolutionary history of large gene sets, often requiring human inspection of hundreds of phylogenetic trees. If trees are checked for compatibility with an explicit null hypothesis (e.g., the monophyly of certain groups), this daunting task is greatly facilitated by an appropriate coloring scheme. In this note, we introduce ColorTree, a simple yet powerful batch customization tool for phylogenic trees. Based on pattern matching rules, ColorTree applies a set of customizations to an input tree file, e.g., coloring labels or branches. The customized trees are saved to an output file, which can then be viewed and further edited by Dendroscope (a freely available tree viewer). ColorTree runs on any Perl installation as a stand-alone command line tool, and its application can thus be easily automated. This way, hundreds of phylogenic trees can be customized for easy visual inspection in a matter of minutes. ColorTree allows efficient and flexible visual customization of large tree sets through the application of a user-supplied configuration file to multiple tree files.

  3. Carbohydrates as efficient catalysts for the hydration of α-amino nitriles.

    PubMed

    Chitale, Sampada; Derasp, Joshua S; Hussain, Bashir; Tanveer, Kashif; Beauchemin, André M

    2016-11-01

    Directed hydration of α-amino nitriles was achieved under mild conditions using simple carbohydrates as catalysts, exploiting temporary intramolecularity. A broadly applicable procedure using both formaldehyde and NaOH as catalysts efficiently hydrated a variety of primary and secondary substrates, and allowed the hydration of enantiopure substrates to proceed without racemization. This work also provides a rare comparison of the catalytic activity of carbohydrates, and shows that the simple aldehydes at the basis of chemical evolution are efficient organocatalysts mimicking the function of hydratase enzymes. Optimal catalytic efficiency was observed with destabilized aldehydes, and with difficult substrates only simple carbohydrates such as formaldehyde and glycolaldehyde proved reliable.

  4. Mathematical prediction of core body temperature from environment, activity, and clothing: The heat strain decision aid (HSDA).

    PubMed

    Potter, Adam W; Blanchard, Laurie A; Friedl, Karl E; Cadarette, Bruce S; Hoyt, Reed W

    2017-02-01

    Physiological models provide useful summaries of complex interrelated regulatory functions. These can often be reduced to simple input requirements and simple predictions for pragmatic applications. This paper demonstrates this modeling efficiency by tracing the development of one such simple model, the Heat Strain Decision Aid (HSDA), originally developed to address Army needs. The HSDA, which derives from the Givoni-Goldman equilibrium body core temperature prediction model, uses 16 inputs from four elements: individual characteristics, physical activity, clothing biophysics, and environmental conditions. These inputs are used to mathematically predict core temperature (Tc) rise over time and can estimate water turnover from sweat loss. Based on a history of military applications such as derivation of training and mission planning tools, we conclude that the HSDA model is a robust integration of physiological rules that can guide a variety of useful predictions. The HSDA model is limited to generalized predictions of thermal strain and does not provide individualized predictions that could be obtained from physiological sensor data-driven predictive models. This fully transparent physiological model should be improved and extended with new findings and new challenging scenarios. Published by Elsevier Ltd.
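
    The abstract does not reproduce the model's equations. Schematically, Givoni-Goldman-type models predict an equilibrium core temperature from the inputs and drive the predicted time course toward it exponentially, along the lines of

      T_c(t) = T_{c,\mathrm{eq}} - \left( T_{c,\mathrm{eq}} - T_{c,0} \right) e^{-t/\tau}

    where T_{c,0} is the initial core temperature, and the equilibrium value T_{c,\mathrm{eq}} and time constant \tau are functions of the 16 HSDA inputs; the actual coefficients are not given in the abstract and are not reproduced here.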

  5. A simple rhodamine hydrazide-based turn-on fluorescent probe for HOCl detection.

    PubMed

    Zhang, Zhen; Zou, Yuan; Deng, Chengquan; Meng, Liesu

    2016-06-01

    Hypochlorous acid (HOCl) plays a crucial role in daily life and mediates a variety of physiological processes, however, abnormal levels of HOCl have been associated with numerous human diseases. It is therefore of significant interest to establish a simple, selective, rapid and sensitive fluorogenic method for the detection of HOCl in environmental and biological samples. A hydrazide-containing fluorescent probe based on a rhodamine scaffold was readily developed that could selectively detect HOCl over other biologically relevant reactive oxygen species, reactive nitrogen species and most common metal ions in vitro. Via an irreversible oxidation-hydrolysis mechanism, and upon HOCl-triggered opening of the intramolecular spirocyclic ring during detection, the rhodamine hydrazide-based probe exhibited large fluorescence enhancement in the emission spectra with a fast response, low detection limit and comparatively wide pH detection range in aqueous media. The probe was further successfully applied to monitoring trace HOCl in tap water and imaging both exogenous and endogenous HOCl within living cells. It is anticipated that this simple and useful probe might be an efficient tool with which to facilitate more HOCl-related chemical and biological research. Copyright © 2015 John Wiley & Sons, Ltd.

  6. Energy efficiency façade design in high-rise apartment buildings using the calculation of solar heat transfer through windows with shading devices

    NASA Astrophysics Data System (ADS)

    Ha, P. T. H.

    2018-04-01

    The architectural design orientation at the first design stage plays a key role and has a great impact on the energy consumption of a building throughout its life-cycle. To provide designers with a simple and useful tool for quantitatively determining and optimizing the energy efficiency of a building at the very first stage of conceptual design, a factor, namely the building envelope energy efficiency (Khqnl), is investigated and proposed. Heat transfer through windows and other glazed areas of mezzanine floors accounts for 86% of the overall thermal transfer through the building envelope, so the Khqnl factor of high-rise buildings largely depends on shading solutions. The author has established tables and charts giving the values of the Khqnl factor for certain high-rise apartment buildings in Hanoi, calculated with a software program under various inputs including types and sizes of shading devices, building orientations, and different points of time. Architects can readily refer to these tables and charts in façade design to achieve a higher level of energy efficiency.

  7. Micropropagation of African violet (Saintpaulia ionantha Wendl.).

    PubMed

    Shukla, Mukund; Sullivan, J Alan; Jain, Shri Mohan; Murch, Susan J; Saxena, Praveen K

    2013-01-01

    Micropropagation is an important tool for rapid multiplication and the creation of genetic variability in African violets (Saintpaulia ionantha Wendl.). Successful in vitro propagation depends on the specific requirements and precise manipulation of various factors such as the type of explants used, physiological state of the mother plant, plant growth regulators in the culture medium, and growth conditions. Development of cost-effective protocols with a high rate of multiplication is a crucial requirement for commercial application of micropropagation. The current chapter describes an optimized protocol for micropropagation of African violets using leaf explants obtained from in vitro grown plants. In this process, plant regeneration occurs via both somatic embryogenesis and shoot organogenesis simultaneously in the explants induced with the growth regulator thidiazuron (TDZ; N-phenyl-N'-1,2,3-thidiazol-5-ylurea). The protocol is simple, rapid, and efficient for large-scale propagation of African violet and the dual routes of regeneration allow for multiple applications of the technology from simple clonal propagation to induction or selection of variants to the production of synthetic seeds.

  8. A Novel Shape Parameterization Approach

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper presents a novel parameterization approach for complex shapes suitable for a multidisciplinary design optimization application. The approach consists of two basic concepts: (1) parameterizing the shape perturbations rather than the geometry itself and (2) performing the shape deformation by means of the soft objects animation algorithms used in computer graphics. Because the formulation presented in this paper is independent of grid topology, we can treat computational fluid dynamics and finite element grids in a similar manner. The proposed approach is simple, compact, and efficient. Also, the analytical sensitivity derivatives are easily computed for use in a gradient-based optimization. This algorithm is suitable for low-fidelity (e.g., linear aerodynamics and equivalent laminated plate structures) and high-fidelity analysis tools (e.g., nonlinear computational fluid dynamics and detailed finite element modeling). This paper contains the implementation details of parameterizing for planform, twist, dihedral, thickness, and camber. The results are presented for a multidisciplinary design optimization application consisting of nonlinear computational fluid dynamics, detailed computational structural mechanics, performance, and a simple propulsion module.
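
    The two concepts are easy to sketch: parameterize perturbations added to a baseline shape rather than the shape itself. A minimal numpy illustration using generic sine bumps as the perturbation modes (a stand-in for the paper's soft-object animation algorithms):

      import numpy as np

      def perturbed_shape(baseline, weights, x):
          """Shape = baseline + weighted sum of smooth perturbation modes.

          Generic sine bumps stand in for the paper's soft-object animation
          algorithm; only the perturbations are parameterized, keeping the
          design variables few and independent of grid topology.
          """
          modes = np.array([np.sin((k + 1) * np.pi * x)
                            for k in range(len(weights))])
          return baseline + weights @ modes

      x = np.linspace(0.0, 1.0, 101)      # chord-wise stations
      camber = 0.05 * x * (1.0 - x)       # baseline camber line
      new_camber = perturbed_shape(camber, np.array([0.01, -0.004]), x)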

  9. Multidisciplinary Aerodynamic-Structural Shape Optimization Using Deformation (MASSOUD)

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2000-01-01

    This paper presents a multidisciplinary shape parameterization approach. The approach consists of two basic concepts: (1) parameterizing the shape perturbations rather than the geometry itself and (2) performing the shape deformation by means of the soft object animation algorithms used in computer graphics. Because the formulation presented in this paper is independent of grid topology, we can treat computational fluid dynamics and finite element grids in the same manner. The proposed approach is simple, compact, and efficient. Also, the analytical sensitivity derivatives are easily computed for use in a gradient-based optimization. This algorithm is suitable for low-fidelity (e.g., linear aerodynamics and equivalent laminate plate structures) and high-fidelity (e.g., nonlinear computational fluid dynamics and detailed finite element modeling) analysis tools. This paper contains the implementation details of parameterizing for planform, twist, dihedral, thickness, camber, and free-form surface. Results are presented for a multidisciplinary application consisting of nonlinear computational fluid dynamics, detailed computational structural mechanics, and a simple performance module.

  10. Multidisciplinary Aerodynamic-Structural Shape Optimization Using Deformation (MASSOUD)

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2000-01-01

    This paper presents a multidisciplinary shape parameterization approach. The approach consists of two basic concepts: (1) parameterizing the shape perturbations rather than the geometry itself and (2) performing the shape deformation by means of the soft object animation algorithms used in computer graphics. Because the formulation presented in this paper is independent of grid topology, we can treat computational fluid dynamics and finite element grids in a similar manner. The proposed approach is simple, compact, and efficient. Also, the analytical sensitivity derivatives are easily computed for use in a gradient-based optimization. This algorithm is suitable for low-fidelity (e.g., linear aerodynamics and equivalent laminated plate structures) and high-fidelity (e.g., nonlinear computational fluid dynamics and detailed finite element modeling) analysis tools. This paper contains the implementation details of parameterizing for planform, twist, dihedral, thickness, camber, and free-form surface. Results are presented for a multidisciplinary design optimization application consisting of nonlinear computational fluid dynamics, detailed computational structural mechanics, and a simple performance module.

  11. UUI: Reusable Spatial Data Services in Unified User Interface at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Petrenko, Maksym; Hegde, Mahabaleshwa; Bryant, Keith; Pham, Long B.

    2016-01-01

    Unified User Interface (UUI) is a next-generation operational data access tool that has been developed at the Goddard Earth Sciences Data and Information Services Center (GES DISC) to provide a simple, unified, and intuitive one-stop-shop experience for the key data services available at GES DISC, including subsetting (Simple Subset Wizard, SSW), granule file search (Mirador), plotting (Giovanni), and other legacy spatial data services. UUI has been built on a flexible infrastructure of reusable, self-contained web-service building blocks that can easily be plugged into spatial applications, including third-party clients or services, to enable new functionality as new datasets and services become available. In this presentation, we will discuss our experience in designing UUI services based on open industry standards. We will also explain how the resulting framework can be used for rapid development, deployment, and integration of spatial data services, facilitating efficient access and dissemination of spatial data sets.

  12. Analyzing Discourse Processing Using a Simple Natural Language Processing Tool

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Allen, Laura K.; Kyle, Kristopher; McNamara, Danielle S.

    2014-01-01

    Natural language processing (NLP) provides a powerful approach for discourse processing researchers. However, there remains a notable degree of hesitation by some researchers to consider using NLP, at least on their own. The purpose of this article is to introduce and make available a "simple" NLP (SiNLP) tool. The overarching goal of…

  13. Water Conservation Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ian Metzger, Jesse Dean

    2010-12-31

    This software requires inputs of simple water fixture inventory information and calculates the water/energy and cost benefits of various retrofit opportunities. The tool includes water conservation measures for Low-flow Toilets, Low-flow Urinals, Low-flow Faucets, and Low-flow Showerheads. It calculates water savings, energy savings, demand reduction, cost savings, and building life-cycle costs, including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool displays the environmental benefits of a project.
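
    The life-cycle metrics listed are standard; minimal Python versions, assuming a constant annual saving (the tool's actual formulas may differ):

      def simple_payback(cost, annual_savings):
          return cost / annual_savings                  # years

      def npv(cost, annual_savings, years, rate):
          pv = sum(annual_savings / (1 + rate) ** t for t in range(1, years + 1))
          return pv - cost

      def sir(cost, annual_savings, years, rate):
          """Savings-to-investment ratio: PV of savings over first cost."""
          return (npv(cost, annual_savings, years, rate) + cost) / cost

      # e.g. a fixture retrofit: $400 installed, $95/yr saved, 10 yr, 3% rate
      print(simple_payback(400, 95), npv(400, 95, 10, 0.03), sir(400, 95, 10, 0.03))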

  14. Simple-MSSM: a simple and efficient method for simultaneous multi-site saturation mutagenesis.

    PubMed

    Cheng, Feng; Xu, Jian-Miao; Xiang, Chao; Liu, Zhi-Qiang; Zhao, Li-Qing; Zheng, Yu-Guo

    2017-04-01

    To develop a practically simple and robust multi-site saturation mutagenesis (MSSM) method that enables simultaneous recombination of amino acid positions for focused mutant library generation, we devised a general restriction enzyme-free and ligase-free MSSM method (Simple-MSSM) based on prolonged overlap extension PCR (POE-PCR) and Simple Cloning techniques. As a proof of principle of Simple-MSSM, the gene of eGFP (enhanced green fluorescent protein) was used as a template for simultaneous mutagenesis of five codons. Forty-eight randomly selected clones were sequenced. Sequencing revealed that all 48 clones showed at least one mutant codon (mutation efficiency = 100%), and 46 of the 48 clones had mutations at all five codons. The diversities obtained at these five codons are 27, 24, 26, 26 and 22, respectively, which correspond to 84, 75, 81, 81, and 69% of the theoretical diversity offered by NNK degeneration (32 codons; NNK, K = T or G). The enzyme-free Simple-MSSM method can simultaneously and efficiently saturate five codons within one day, and therefore avoids missing interactions between residues in interacting amino acid networks.
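
    The theoretical diversity quoted (32 codons per NNK-degenerate site) and the reported coverage fractions are easy to verify:

      from itertools import product

      NNK = ["".join(c) for c in product("ACGT", "ACGT", "GT")]
      print(len(NNK))                       # 32 codons of theoretical diversity

      observed = 27                         # codons recovered at one site
      print(f"{observed / len(NNK):.0%}")   # -> 84%, matching the reported value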

  15. Evaluation of Oral and Maxillofacial Surgery Residents' Operative Skills: Feasibility and Engagement Study Using SIMPL Software for a Mobile Phone.

    PubMed

    Kaban, Leonard B; Cappetta, Alyssa; George, Brian C; Lahey, Edward T; Bohnen, Jordan D; Troulis, Maria J

    2017-10-01

    There are no universally accepted tools to evaluate operative skills of surgical residents in a timely fashion. The purpose of this study was to determine the feasibility of using a smartphone application, SIMPL (System for Improving and Measuring Procedural Learning), developed by a multi-institutional research collaborative, to achieve a high rate of timely operative evaluations and resident communication and to collect performance data. The authors hypothesized that these goals would be achieved because the process is convenient and efficient. This was a prospective feasibility and engagement study using SIMPL to evaluate residents' operative skills. SIMPL requires the attending surgeon to answer 3 multiple-choice questions: 1) What level of help (Zwisch Scale) was required by the trainee? 2) What was the level of performance? 3) How complex was the case? The evaluator also can dictate a narrative. The sample was composed of 3 faculty members and 3 volunteer senior residents. Predictor variables were the surgeons, trainees, and procedures performed. Outcome variables included number and percentage of procedures performed by faculty-and-resident pairs assessed, time required to complete assessments, time lapsed to submission, percentage of assessments with narratives, and residents' response rates. From March through June 2016, 151 procedures were performed in the operating room by the faculty-and-resident teams. There were 107 assessments submitted (71%). Resident response (self-assessment) to faculty evaluations was 81%. Recorded time to complete assessments (n = 75 of 107) was shorter than 2 minutes. The time lapsed to submission was shorter than 72 hours (100%). Dictations were submitted for 35 evaluations (33%). Data for the type of help, performance, and complexity of cases were collected for each resident. SIMPL facilitates timely intraoperative evaluations of surgical skills, engagement by faculty and residents, and collection of detailed procedural data. Additional prospective trials to assess this tool further are planned. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  16. Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool

    NASA Astrophysics Data System (ADS)

    Chakraborty, Monisha; Ghosh, Dipak

    2017-12-01

    Accurate prognostic tools to identify the severity of Arrhythmia have yet to be developed, owing to the complexity of the ECG signal. In this paper, we show that quantitative assessment of Arrhythmia is possible using a non-linear technique based on "Hurst Rescaled Range Analysis". Although the concept of applying non-linearity to the study of various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficiency of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, Arrhythmia ECG time series are collected from the MIT-BIH database. Normal ECG time series are acquired using the POLYPARA system. Both types of time series are analyzed in the light of the non-linear approach following the method "Rescaled Range Analysis". The quantitative parameter, "Fractal Dimension" (D), is obtained from both types of time series. The major finding is that Arrhythmia ECG time series show lower values of D than normal ones. Further, this information can be used to assess the severity of Arrhythmia quantitatively, which is a new direction for prognosis, and suitable software may be developed for use in medical practice.

  17. Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool

    NASA Astrophysics Data System (ADS)

    Chakraborty, Monisha; Ghosh, Dipak

    2018-04-01

    An accurate prognostic tool to identify the severity of Arrhythmia is yet to be established, owing to the complexity of the ECG signal. In this paper, we have shown that quantitative assessment of Arrhythmia is possible using a non-linear technique based on "Hurst Rescaled Range Analysis". Although the concept of applying "non-linearity" for studying various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess their efficacy. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, Arrhythmia ECG time series are collected from the MIT-BIH database. Normal ECG time series are acquired using the POLYPARA system. Both time series are analyzed in the light of the non-linear approach following the method of "Rescaled Range Analysis". The quantitative parameter, "Fractal Dimension" (D), is obtained from both types of time series. The major finding is that Arrhythmia ECG exhibits lower values of D compared to normal ECG. Further, this information can be used to assess the severity of Arrhythmia quantitatively, which is a new direction for prognosis; adequate software may also be developed for use in medical practice.

  18. CRISPR/Cas9-mediated efficient genome editing via blastospore-based transformation in entomopathogenic fungus Beauveria bassiana.

    PubMed

    Chen, Jingjing; Lai, Yiling; Wang, Lili; Zhai, Suzhen; Zou, Gen; Zhou, Zhihua; Cui, Chunlai; Wang, Sibao

    2017-04-03

    Beauveria bassiana is an environmentally friendly alternative to chemical insecticides against various agricultural insect pests and vectors of human diseases. However, its application has been limited by its slow killing speed and sensitivity to abiotic stresses. Understanding the molecular pathogenesis and physiological characteristics would facilitate improvement of fungal performance. Loss-of-function mutagenesis is the most powerful tool to characterize gene functions, but it is hampered by the low rate of homologous recombination and the limited availability of selectable markers. Here, by combining the use of uridine auxotrophy as recipient and donor DNAs harboring the auxotrophic complementation gene ura5 as a selectable marker with the blastospore-based transformation system, we established a highly efficient and cost-effective CRISPR/Cas9-mediated gene editing system with a low false-positive background in B. bassiana. This system has been demonstrated to be a simple and powerful tool for targeted single-gene knock-out and/or knock-in in B. bassiana. We further demonstrated that our system allows simultaneous disruption of multiple genes via homology-directed repair in a single transformation. This technology will allow us to study functionally redundant genes and holds significant potential to greatly accelerate functional genomics studies of B. bassiana.

  19. CRISPR/Cas9-mediated efficient genome editing via blastospore-based transformation in entomopathogenic fungus Beauveria bassiana

    PubMed Central

    Chen, Jingjing; Lai, Yiling; Wang, Lili; Zhai, Suzhen; Zou, Gen; Zhou, Zhihua; Cui, Chunlai; Wang, Sibao

    2017-01-01

    Beauveria bassiana is an environmentally friendly alternative to chemical insecticides against various agricultural insect pests and vectors of human diseases. However, its application has been limited by its slow killing speed and sensitivity to abiotic stresses. Understanding the molecular pathogenesis and physiological characteristics would facilitate improvement of fungal performance. Loss-of-function mutagenesis is the most powerful tool to characterize gene functions, but it is hampered by the low rate of homologous recombination and the limited availability of selectable markers. Here, by combining the use of uridine auxotrophy as recipient and donor DNAs harboring the auxotrophic complementation gene ura5 as a selectable marker with the blastospore-based transformation system, we established a highly efficient and cost-effective CRISPR/Cas9-mediated gene editing system with a low false-positive background in B. bassiana. This system has been demonstrated to be a simple and powerful tool for targeted single-gene knock-out and/or knock-in in B. bassiana. We further demonstrated that our system allows simultaneous disruption of multiple genes via homology-directed repair in a single transformation. This technology will allow us to study functionally redundant genes and holds significant potential to greatly accelerate functional genomics studies of B. bassiana. PMID:28368054

  20. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach

    PubMed Central

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-01-01

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance from different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools of various sizes but similar quality. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared the fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design. PMID:28773800
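
    The abstract's observation that welding temperature tracks the frictional heat generation rate density has a compact closed form in the standard sliding-friction model for a flat circular tool; this is a textbook sketch, not necessarily the paper's thermal-mechanical model. For a tool of radius $R$ rotating at angular speed $\omega$ under contact pressure $P$ with friction coefficient $\mu$,

    $$Q=\int_0^R \mu P\,\omega r\,(2\pi r)\,dr=\frac{2}{3}\pi\mu P\omega R^{3},\qquad q=\frac{Q}{\pi R^{2}}=\frac{2}{3}\mu P\omega R,$$

    so the heat generation rate density $q$ grows linearly with tool radius, consistent with the larger tools producing higher welding temperatures under identical processing conditions.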

  1. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach.

    PubMed

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-08-09

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance from different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools of various sizes but similar quality. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared the fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design.

  2. In Vivo Functional Genomic Studies of Sterol Carrier Protein-2 Gene in the Yellow Fever Mosquito

    PubMed Central

    Peng, Rong; Maklokova, Vilena I.; Chandrashekhar, Jayadevi H.; Lan, Que

    2011-01-01

    A simple and efficient DNA delivery method to introduce extrachromosomal DNA into mosquito embryos would significantly aid functional genomic studies. The conventional method for delivery of DNA into insects is to inject the DNA directly into the embryos. Taking advantage of the unique aspects of mosquito reproductive physiology during vitellogenesis and an in vivo transfection reagent that mediates DNA uptake in cells via endocytosis, we have developed a new method to introduce DNA into mosquito embryos vertically via microinjection of DNA vectors into vitellogenic females without directly manipulating the embryos. Our method was able to introduce inducible gene expression vectors transiently into F0 mosquitoes to perform functional studies in vivo without transgenic lines. The high efficiency of expression knockdown was reproducible, with more than 70% of the F0 individuals showing sufficient gene expression suppression (<30% of the controls' levels). At the cohort level, AeSCP-2 expression knockdown in early instar larvae resulted in detectable phenotypes of the expression deficiency such as high mortality, lowered fertility, and a distorted sex ratio after induction of AeSCP-2 siRNA expression in vivo. The results further confirmed the important role of AeSCP-2 in the development and reproduction of A. aegypti. In this study, we proved that extrachromosomal transient expression of an inducible gene from a DNA vector delivered vertically via vitellogenic females can be used to manipulate gene expression in the F0 generation. This new method will be a simple and efficient tool for in vivo functional genomic studies in mosquitoes. PMID:21437205

  3. Using Alice 2.0 to Design Games for People with Stroke.

    PubMed

    Proffitt, Rachel; Kelleher, Caitlin; Baum, M Carolyn; Engsberg, Jack

    2012-08-01

    Computer and video games are gaining in popularity as rehabilitation tools. Unfortunately, most systems still require extensive programming/engineering knowledge to create, something that therapists, as novice programmers, do not possess. There is software designed to allow novice programmers to create storyboards and games through simple drag-and-drop formats; however, its applications for therapeutic game development have not been studied. The purpose of this study was to have an occupational therapy (OT) student with no prior computer programming experience learn how to create computer games for persons with stroke using Alice 2.0, a drag-and-drop editor designed by Carnegie Mellon University (Pittsburgh, PA). The OT student learned how to use Alice 2.0 through a textbook, tutorials, and assistance from computer science students. She kept a journal of her process, detailing her successes and challenges. The OT student created three games for people with stroke using Alice 2.0. She found that although there were many supports in Alice for creating stories, it lacked critical pieces necessary for game design. Her recommendations for a future programming environment for therapists were that it (1) be efficient, (2) include basic game design pieces so therapists do not have to create them, (3) provide technical support, and (4) be simple. With the incorporation of these recommendations, a future programming environment for therapists will be an effective tool for therapeutic game development.

  4. Simple and specific colorimetric detection of Staphylococcus using its volatile 2-[3-acetoxy-4,4,14-trimethylandrost-8-en-17-yl] propanoic acid in the liquid phase and head space of cultures.

    PubMed

    Saranya, Raju; Aarthi, Raju; Sankaran, Krishnan

    2015-05-01

    The spread of drug-resistant Staphylococcus spp. into communities poses a danger, demanding effective non-invasive and non-destructive tools for its early detection and surveillance. Characteristic volatile organic compounds (VOCs) produced by bacteria offer new diagnostic targets and novel approaches not exploited so far in infectious disease diagnostics. Our search for such a characteristic VOC for Staphylococcus spp. led to the detection of 2-[3-acetoxy-4,4,14-trimethylandrost-8-en-17-yl] propanoic acid (ATMAP), a moderately volatile compound detected both in the culture and in the headspace when the organism was grown in tryptone soya broth (TSB) medium. A simple and inexpensive colorimetric method (colour change from yellow to orange) using methyl red as the pH indicator provided an absolutely specific way of identifying Staphylococcus spp. The assay, performed in liquid cultures (7-h growth in TSB) as well as in the headspace of plate cultures (grown for 10 h on TSA), was optimised in 96-well plate and 12-well plate formats, respectively, employing a set of positive and negative strains. Only Staphylococcus spp. showed the distinct colour change from yellow to orange due to the production of the above VOC, while in the case of other organisms the reagent remained yellow. The method, validated using known clinical and environmental strains (56 including Staphylococcus, Proteus, Pseudomonas, Klebsiella, Bacillus, Shigella and Escherichia coli), was found to be highly efficient, showing 100% specificity and sensitivity. Such simple methods of bacterial pathogen identification are expected to form the next generation of tools for the control of infectious diseases through early detection and surveillance of causative agents.

  5. Development of a CRISPR/Cas9 genome editing toolbox for Corynebacterium glutamicum.

    PubMed

    Liu, Jiao; Wang, Yu; Lu, Yujiao; Zheng, Ping; Sun, Jibin; Ma, Yanhe

    2017-11-16

    Corynebacterium glutamicum is an important industrial workhorse, and advanced genetic engineering tools for it are in urgent demand. Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) and their CRISPR-associated proteins (Cas) have revolutionized the field of genome engineering. The CRISPR/Cas9 system, which utilizes NGG as the protospacer adjacent motif (PAM) and has good targeting specificity, can be developed into a powerful tool for efficient and precise genome editing of C. glutamicum. Herein, we developed a versatile CRISPR/Cas9 genome editing toolbox for C. glutamicum. Cas9 and gRNA expression cassettes were reconstituted to combat Cas9 toxicity and facilitate effective termination of gRNA transcription. Co-transformation of Cas9 and gRNA expression plasmids was exploited to overcome high-frequency mutation of cas9, allowing not only highly efficient gene deletion and insertion with plasmid-borne editing templates (efficiencies up to 60.0 and 62.5%, respectively) but also simple and time-saving operation. Furthermore, CRISPR/Cas9-mediated ssDNA recombineering was developed to precisely introduce small modifications and single-nucleotide changes into the genome of C. glutamicum with efficiencies over 80.0%. Notably, double-locus editing was also achieved in C. glutamicum. This toolbox works well in several C. glutamicum strains including the widely-used strains ATCC 13032 and ATCC 13869. In this study, we developed a CRISPR/Cas9 toolbox that could facilitate markerless gene deletion, gene insertion, precise base editing, and double-locus editing in C. glutamicum. The CRISPR/Cas9 toolbox holds promise for accelerating the engineering of C. glutamicum and advancing its application in the production of biochemicals and biofuels.

  6. Potential high-frequency off-target mutagenesis induced by CRISPR/Cas9 in Arabidopsis and its prevention.

    PubMed

    Zhang, Qiang; Xing, Hui-Li; Wang, Zhi-Ping; Zhang, Hai-Yan; Yang, Fang; Wang, Xue-Chen; Chen, Qi-Jun

    2018-03-01

    We present novel observations of high-specificity SpCas9 variants, sgRNA expression strategies based on mutant sgRNA scaffold and tRNA processing system, and CRISPR/Cas9-mediated T-DNA integrations. Specificity of CRISPR/Cas9 tools has been a major concern along with the reports of their successful applications. We report unexpected observations of high frequency off-target mutagenesis induced by CRISPR/Cas9 in T1 Arabidopsis mutants although the sgRNA was predicted to have a high specificity score. We also present evidence that the off-target effects were further exacerbated in the T2 progeny. To prevent the off-target effects, we tested and optimized two strategies in Arabidopsis, including introduction of a mCherry cassette for a simple and reliable isolation of Cas9-free mutants and the use of highly specific mutant SpCas9 variants. Optimization of the mCherry vectors and subsequent validation found that fusion of tRNA with the mutant rather than the original sgRNA scaffold significantly improves editing efficiency. We then examined the editing efficiency of eight high-specificity SpCas9 variants in combination with the improved tRNA-sgRNA fusion strategy. Our results suggest that highly specific SpCas9 variants require a higher level of expression than their wild-type counterpart to maintain high editing efficiency. Additionally, we demonstrate that T-DNA can be inserted into the cleavage sites of CRISPR/Cas9 targets with high frequency. Altogether, our results suggest that in plants, continuous attention should be paid to off-target effects induced by CRISPR/Cas9 in current and subsequent generations, and that the tools optimized in this report will be useful in improving genome editing efficiency and specificity in plants and other organisms.

  7. An efficient and scalable graph modeling approach for capturing information at different levels in next generation sequencing reads

    PubMed Central

    2013-01-01

    Background Next generation sequencing technologies have greatly advanced many research areas of the biomedical sciences through their capability to generate massive amounts of genetic information at unprecedented rates. The advent of next generation sequencing has led to the development of numerous computational tools to analyze and assemble the millions to billions of short sequencing reads produced by these technologies. While these tools filled an important gap, current approaches for storing, processing, and analyzing short read datasets generally have remained simple and lack the complexity needed to efficiently model the produced reads and assemble them correctly. Results Previously, we presented an overlap graph coarsening scheme for modeling read overlap relationships on multiple levels. Most current read assembly and analysis approaches use a single graph or set of clusters to represent the relationships among a read dataset. Instead, we use a series of graphs to represent the reads and their overlap relationships across a spectrum of information granularity. At each information level our algorithm is capable of generating clusters of reads from the reduced graph, forming an integrated graph modeling and clustering approach for read analysis and assembly. Previously we applied our algorithm to simulated and real 454 datasets to assess its ability to efficiently model and cluster next generation sequencing data. In this paper we extend our algorithm to large simulated and real Illumina datasets to demonstrate that our algorithm is practical for both sequencing technologies. Conclusions Our overlap graph theoretic algorithm is able to model next generation sequencing reads at various levels of granularity through the process of graph coarsening. Additionally, our model allows for efficient representation of the read overlap relationships, is scalable for large datasets, and is practical for both Illumina and 454 sequencing technologies. PMID:24564333
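
    The core idea of the paper, an overlap graph reduced through successive coarsening levels, can be illustrated compactly. The sketch below builds a weighted suffix-prefix overlap graph and performs one greedy coarsening step by merging each read with its heaviest-overlap neighbour; all names are illustrative, and real assemblers replace the brute-force overlap test with k-mer indexing.

    ```python
    def build_overlap_graph(reads, min_overlap=20):
        """Edge (i, j) with weight w if the last w bases of reads[i]
        equal the first w bases of reads[j] (w >= min_overlap)."""
        def overlap(a, b):
            for w in range(min(len(a), len(b)), min_overlap - 1, -1):
                if a[-w:] == b[:w]:
                    return w
            return 0

        graph = {i: {} for i in range(len(reads))}
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    w = overlap(a, b)
                    if w:
                        graph[i][j] = w
        return graph

    def coarsen(graph):
        """One coarsening level: greedily pair each node with its
        heaviest-overlap neighbour, yielding read clusters."""
        merged, clusters = set(), []
        for u in sorted(graph, key=lambda n: -max(graph[n].values(), default=0)):
            if u in merged or not graph[u]:
                continue
            v = max(graph[u], key=graph[u].get)
            if v not in merged:
                clusters.append({u, v})
                merged |= {u, v}
        clusters += [{u} for u in graph if u not in merged]
        return clusters
    ```

    Applying the coarsening step repeatedly to the reduced graph gives the spectrum of information granularity the authors describe, from individual reads up to coarse read clusters.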

  8. Pulling My Gut out--Simple Tools for Engaging Students in Gross Anatomy Lectures

    ERIC Educational Resources Information Center

    Chan, Lap Ki

    2010-01-01

    A lecture is not necessarily a monologue, promoting only passive learning. If appropriate techniques are used, a lecture can stimulate active learning too. One such method is demonstration, which can engage learners' attention and increase the interaction between the lecturer and the learners. This article describes two simple and useful tools for…

  9. Manufacture and use of home made ophthalmoscopes: a 150th anniversary tribute to Helmholtz

    PubMed Central

    Armour, Roger H

    2000-01-01

    Objective To produce a simple, effective, and inexpensive training ophthalmoscope. Design Case study. Setting A coffee table in a sitting room and an eye clinic. Participants 10 friends and relatives, several patients, and a cooperative Persian cat. Interventions Direct ophthalmoscopy with an instrument made with easily available materials and tools from art and office equipment shops. Main outcome measures Efficiency, clarity of view, and price of the ophthalmoscope. Results The instrument was readily made; of the 50 manufactured, two thirds gave a good view; and each cost less than £1 to make. Conclusion The ophthalmoscope is fun to make, works well, and anyone can make one. PMID:11124172

  10. Computational fluid dynamics uses in fluid dynamics/aerodynamics education

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    1994-01-01

    The field of computational fluid dynamics (CFD) has advanced to the point where it can now be used for the purpose of fluid dynamics physics education. Because of the tremendous wealth of information available from numerical simulation, certain fundamental concepts can be efficiently communicated using an interactive graphical interrogation of the appropriate numerical simulation data base. In other situations, a large amount of aerodynamic information can be communicated to the student by interactive use of simple CFD tools on a workstation or even in a personal computer environment. The emphasis in this presentation is to discuss ideas for how this process might be implemented. Specific examples, taken from previous publications, will be used to highlight the presentation.

  11. On making things the best - Aeronautical uses of optimization /Wright Bros. lecture/

    NASA Technical Reports Server (NTRS)

    Ashley, H.

    1981-01-01

    The paper's purpose is to summarize and evaluate the results of an investigation into the degree to which formal optimization methods have contributed practically to the design and operation of atmospheric flight vehicles. The nature of this technology is reviewed and illustrated with simple structural examples. A series of published successful applications is described, from the fields of aerodynamics, structures, guidance and control, optimal trajectories and vehicle configuration optimization. The corresponding improvements over conventional analysis are assessed. Speculations are offered as to why these tools have made such little headway toward acceptance by designers. The growing need for their use in the future is explained; they hold out an unparalleled opportunity for improved efficiencies.

  12. Chemical Posttranslational Modification with Designed Rhodium(II) Catalysts.

    PubMed

    Martin, S C; Minus, M B; Ball, Z T

    2016-01-01

    Natural enzymes use molecular recognition to perform exquisitely selective transformations on nucleic acids, proteins, and natural products. Rhodium(II) catalysts mimic this selectivity, using molecular recognition to allow selective modification of proteins with a variety of functionalized diazo reagents. The rhodium catalysts and the diazo reactivity have been successfully applied to a variety of protein folds, the chemistry succeeds in complex environments such as cell lysate, and a simple protein blot method accurately assesses modification efficiency. The studies with rhodium catalysts provide a new tool to study and probe protein-binding events, as well as a new synthetic approach to protein conjugates for medical, biochemical, or materials applications. © 2016 Elsevier Inc. All rights reserved.

  13. Petunia (Petunia hybrida).

    PubMed

    Lutke, W Kevin

    2006-01-01

    Petunia hybrida genetic transformation continues to be a valuable tool for genetic research into biochemical pathways and gene expression, as well as for generating commercial products with varying floral colors. In this chapter, we describe a simple and reproducible genetic transformation protocol for generating transgenic petunia plants harboring a gene of interest and a selectable marker. The system utilizes Agrobacterium tumefaciens for transgene integration, with plant recovery via shoot organogenesis from leaf explant material. Selection for transgenic plants is achieved using the bar gene conferring resistance to glufosinate or the nptII gene for resistance to kanamycin. Transformation efficiencies of around 10% are achievable, with shoots being recovered about 8 wk after transgene insertion and rooted plants transferred to the greenhouse about 12 wk after inoculation.

  14. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    NASA Astrophysics Data System (ADS)

    Nomaguchi, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot over design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels enables automatic and efficient capture and management of iterative hypothesis and verification processes through design operations over design tools. In DRIFT, such a linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. Further, it concludes with a discussion of some future issues.
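
    The three-layer linkage is essentially a data-modeling problem. Below is a hypothetical sketch of the structure the paper describes: actions feed model-state snapshots, and argumentation nodes (gIBIS-style issues) point at the snapshots that support or refute an alternative. All class and field names are invented for illustration.

    ```python
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Action:                     # action level: one operation in a design tool
        tool: str
        operation: str

    @dataclass
    class ModelState:                 # model-operation level: snapshot of the design
        snapshot_id: int
        caused_by: List[Action] = field(default_factory=list)

    @dataclass
    class Issue:                      # argumentation level: gIBIS-style issue node
        question: str                 # the problem being set
        alternatives: List[str] = field(default_factory=list)
        chosen: Optional[str] = None  # current hypothesis (retractable, TMS-style)
        evidence: List[ModelState] = field(default_factory=list)
    ```

    A truth-maintenance mechanism then only needs to follow the evidence links when a hypothesis is retracted, invalidating dependent snapshots rather than the entire design history.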

  15. Novel calibration tools and validation concepts for microarray-based platforms used in molecular diagnostics and food safety control.

    PubMed

    Brunner, C; Hoffmann, K; Thiele, T; Schedler, U; Jehle, H; Resch-Genger, U

    2015-04-01

    Commercial platforms consisting of ready-to-use microarrays printed with target-specific DNA probes, a microarray scanner, and software for data analysis are available for different applications in medical diagnostics and food analysis, detecting, e.g., viral and bacteriological DNA sequences. The transfer of these tools from basic research to routine analysis, their broad acceptance in regulated areas, and their use in medical practice require suitable calibration tools for regular control of instrument performance in addition to internal assay controls. Here, we present the development of a novel assay-adapted calibration slide for a commercialized DNA-based assay platform, consisting of precisely arranged fluorescent areas of various intensities obtained by incorporating different concentrations of a "green" dye and a "red" dye in a polymer matrix. These dyes are "Cy3" and "Cy5" analogues with improved photostability, chosen because their spectroscopic properties closely match those of common labels for the green and red channels of microarray scanners. This simple tool allows the performance of the microarray scanner provided with the biochip platform to be assessed and controlled efficiently and regularly, and different scanners to be compared. It will eventually be used as a fluorescence intensity scale for referencing assay results and to enhance the overall comparability of diagnostic tests.

  16. AIRNOISE: A Tool for Preliminary Noise-Abatement Terminal Approach Route Design

    NASA Technical Reports Server (NTRS)

    Li, Jinhua; Sridhar, Banavar; Xue, Min; Ng, Hok

    2016-01-01

    Noise from aircraft in the airport vicinity is one of the leading aviation-induced environmental issues. The FAA developed the Integrated Noise Model (INM) and its replacement, the Aviation Environmental Design Tool (AEDT), to assess the noise impact resulting from all aviation activities. However, a software tool is needed that is simple to use for terminal route modification, quick and reasonably accurate for preliminary noise impact evaluation, and flexible enough to be used for iterative design of optimal noise-abatement terminal routes. In this paper, we extend our previous work on developing a noise-abatement terminal approach route design tool, named AIRNOISE, to satisfy these criteria. First, software efficiency has been increased more than tenfold by using the C programming language instead of MATLAB. Moreover, a state-of-the-art, high-performance GPU-accelerated computing module was implemented and tested to be hundreds of times faster than the C implementation. Secondly, a Graphical User Interface (GUI) was developed allowing users to import current terminal approach routes and modify them interactively to design new terminal approach routes. The corresponding noise impacts are then calculated and displayed in the GUI in seconds. Finally, AIRNOISE was applied to a Baltimore-Washington International Airport terminal approach route to demonstrate its usage.

  17. Graph-based optimization of epitope coverage for vaccine antigen design

    DOE PAGES

    Theiler, James Patrick; Korber, Bette Tina Marie

    2017-01-29

    Epigraph is a recently developed algorithm that enables the computationally efficient design of single- or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, including a comparison of different heuristics that can be used when graphs are not acyclic, and we describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we also show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.
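
    The graph formulation can be made concrete in a few lines. In the sketch below, potential epitopes are all 9-mers in the population, nodes carry their population frequency, edges join 9-mers that overlap by 8 residues, and a candidate antigen is spelled out by a heaviest-first greedy walk. This is a simplification for illustration (the published algorithm optimizes coverage globally and applies heuristics to cyclic graphs), and all names are illustrative.

    ```python
    from collections import Counter, defaultdict

    K = 9  # T-cell-epitope-like length

    def kmers(seq, k=K):
        return [seq[i:i + k] for i in range(len(seq) - k + 1)]

    def greedy_antigen(sequences, k=K):
        """Greedy heavy path through the potential-epitope overlap graph."""
        freq = Counter(km for s in sequences for km in kmers(s, k))
        succ = defaultdict(set)
        for s in sequences:
            ks = kmers(s, k)
            for a, b in zip(ks, ks[1:]):
                succ[a].add(b)                 # k-1 residue overlap by construction
        node = max(freq, key=freq.get)         # seed at the most frequent epitope
        path, seen = [node], {node}
        while succ[node] - seen:
            node = max(succ[node] - seen, key=freq.get)
            path.append(node)
            seen.add(node)
        return path[0] + "".join(p[-1] for p in path[1:])
    ```

    Coverage of the resulting antigen can then be scored as the summed population frequency of its distinct k-mers, the kind of per-epitope frequency quantity the tool suite's summaries report.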

  18. Graph-based optimization of epitope coverage for vaccine antigen design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, James Patrick; Korber, Bette Tina Marie

    Epigraph is a recently developed algorithm that enables the computationally efficient design of single- or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, including a comparison of different heuristics that can be used when graphs are not acyclic, and we describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we also show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.

  19. Creating Simple Admin Tools Using Info*Engine and Java

    NASA Technical Reports Server (NTRS)

    Jones, Corey; Kapatos, Dennis; Skradski, Cory; Felkins, J. D.

    2012-01-01

    PTC has provided a simple way to dynamically interact with Windchill using Info*Engine. This presentation will describe how to create simple Info*Engine tasks capable of saving Windchill 10.0 administrators from tedious work.

  20. AppEEARS: A Simple Tool that Eases Complex Data Integration and Visualization Challenges for Users

    NASA Astrophysics Data System (ADS)

    Maiersperger, T.

    2017-12-01

    The Application for Extracting and Exploring Analysis-Ready Samples (AppEEARS) offers a simple and efficient way to perform discovery, processing, visualization, and acquisition across large quantities and varieties of Earth science data. AppEEARS brings significant value to a very broad array of user communities by 1) significantly reducing data volumes at the archive, based on user-defined space-time-variable subsets, 2) promoting interoperability across a wide variety of datasets via format and coordinate reference system harmonization, 3) increasing the velocity of both data analysis and insight by providing analysis-ready data packages and by allowing interactive visual exploration of those packages, and 4) ensuring veracity by making data quality measures more apparent and usable and by providing standards-based metadata and processing provenance. Development and operation of AppEEARS is led by the National Aeronautics and Space Administration (NASA) Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC also partners with several other archives to extend the capability across a larger federation of geospatial data providers. Over one hundred datasets are currently available, covering a diversity of variables including land cover, population, elevation, vegetation indices, and land surface temperature. Many hundreds of users have already used this new web-based capability to make the complex tasks of data integration and visualization much simpler and more efficient.

  1. Efficient hybrid white organic light-emitting diodes for application of triplet harvesting with simple structure

    NASA Astrophysics Data System (ADS)

    Hwang, Kyo Min; Lee, Song Eun; Lee, Sungkyu; Yoo, Han Kyu; Baek, Hyun Jung; Kim, Young Kwan; Kim, Jwajin; Yoon, Seung Soo

    2016-08-01

    In this study, we fabricated hybrid white organic light-emitting diodes (WOLEDs) based on triplet harvesting with a simple structure. Both the hole-transporting material and the host in the emitting layer (EML) of the devices were the same material, N,N'-di-1-naphthalenyl-N,N'-diphenyl [1,1':4',1″:4″,1‴-quaterphenyl]-4,4‴-diamine (4P-NPD), a known blue fluorescent material. Simple hybrid WOLEDs were fabricated with blue fluorescent, green and red phosphorescent materials. We investigated the effect of triplet harvesting (TH) by an exciton generation zone on the simple hybrid WOLEDs. The simple hybrid WOLEDs were characterized by dominant hole mobility, so the exciton generation zone was expected to lie in the EML. Additionally, simple hybrid WOLEDs were fabricated with optimal thicknesses of the hole transporting layer and the electron transporting layer. The simple hybrid WOLED exhibits a maximum luminous efficiency of 29.3 cd/A and a maximum external quantum efficiency of 11.2%. The Commission Internationale de l'Éclairage (International Commission on Illumination) coordinates were (0.45, 0.43) at about 10,000 cd/m2.
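
    The reported 11.2% external quantum efficiency can be read against the standard textbook decomposition of OLED efficiency (a generic relation, not the authors' analysis):

    $$\eta_{\mathrm{EQE}}=\gamma\,\eta_{S/T}\,\Phi_{\mathrm{PL}}\,\eta_{\mathrm{out}},$$

    where $\gamma$ is the charge balance factor, $\eta_{S/T}$ the exciton utilization (limited to 0.25 for purely fluorescent emission but approaching 1 when triplets are harvested by phosphorescent dopants), $\Phi_{\mathrm{PL}}$ the photoluminescence quantum yield, and $\eta_{\mathrm{out}}\approx 0.2$ a typical outcoupling efficiency for planar devices. With $\eta_{\mathrm{out}}\approx 0.2$, an EQE of 11.2% implies an internal quantum efficiency well above the 25% singlet limit, consistent with the triplet-harvesting design.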

  2. Mining Peripheral Arterial Disease Cases from Narrative Clinical Notes Using Natural Language Processing

    PubMed Central

    Afzal, Naveed; Sohn, Sunghwan; Abram, Sara; Scott, Christopher G.; Chaudhry, Rajeev; Liu, Hongfang; Kullo, Iftikhar J.; Arruda-Olson, Adelaide M.

    2016-01-01

    Objective Lower extremity peripheral arterial disease (PAD) is highly prevalent and affects millions of individuals worldwide. We developed a natural language processing (NLP) system for automated ascertainment of PAD cases from clinical narrative notes and compared the performance of the NLP algorithm to billing code algorithms, using ankle-brachial index (ABI) test results as the gold standard. Methods We compared the performance of the NLP algorithm to 1) results of gold standard ABI; 2) previously validated algorithms based on relevant ICD-9 diagnostic codes (simple model) and 3) a combination of ICD-9 codes with procedural codes (full model). A dataset of 1,569 PAD patients and controls was randomly divided into training (n= 935) and testing (n= 634) subsets. Results We iteratively refined the NLP algorithm in the training set including narrative note sections, note types and service types, to maximize its accuracy. In the testing dataset, when compared with both simple and full models, the NLP algorithm had better accuracy (NLP: 91.8%, full model: 81.8%, simple model: 83%, P<.001), PPV (NLP: 92.9%, full model: 74.3%, simple model: 79.9%, P<.001), and specificity (NLP: 92.5%, full model: 64.2%, simple model: 75.9%, P<.001). Conclusions A knowledge-driven NLP algorithm for automatic ascertainment of PAD cases from clinical notes had greater accuracy than billing code algorithms. Our findings highlight the potential of NLP tools for rapid and efficient ascertainment of PAD cases from electronic health records to facilitate clinical investigation and eventually improve care by clinical decision support. PMID:28189359
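
    For reference, the accuracy, PPV, and specificity figures compared above are all simple functions of confusion-matrix counts; a minimal helper (the counts shown are illustrative, not the study's data):

    ```python
    def screening_metrics(tp, fp, tn, fn):
        """Accuracy, positive predictive value, and specificity."""
        accuracy    = (tp + tn) / (tp + fp + tn + fn)
        ppv         = tp / (tp + fp)      # precision
        specificity = tn / (tn + fp)
        return accuracy, ppv, specificity

    # Illustrative values only:
    print(screening_metrics(tp=250, fp=19, tn=330, fn=35))
    ```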

  3. Integration of Linear Dynamic Emission and Climate Models with Air Traffic Simulations

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Ng, Hok K.; Chen, Neil Y.

    2012-01-01

    Future air traffic management systems are required to balance the conflicting objectives of maximizing the safety and efficiency of traffic flows while minimizing the climate impact of aviation emissions and contrails. Integrating emission and climate models with air traffic simulations improves the understanding of the complex interaction between the physical climate system, carbon and other greenhouse gas emissions, and aviation activity. This paper integrates a national-level air traffic simulation and optimization capability with simple climate models, carbon cycle models, and climate metrics to assess the impact of aviation on climate. The capability can be used to make trade-offs between extra fuel cost and the reduction in global surface temperature change. The parameters in the simulation can be used to evaluate the effect of various uncertainties in emission models and contrails and the impact of different decision horizons. Alternatively, the optimization results from the simulation can be used as inputs to other tools that monetize global climate impacts, like the FAA's Aviation Environmental Portfolio Management Tool for Impacts.
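
    A "simple climate model" in this setting is typically a linear impulse-response model: emissions accumulate into a radiative forcing, and temperature relaxes toward the forced equilibrium with a fixed time constant. The sketch below is a generic single-box example with illustrative parameter values; it is not the set of models used in the paper.

    ```python
    import numpy as np

    def temperature_response(emissions_gtco2, dt=1.0,
                             sensitivity=0.8,        # K per (W/m^2), illustrative
                             tau=30.0,               # response time constant, years
                             forcing_per_gt=0.0017): # W/m^2 per cumulative GtCO2, illustrative
        """Single-box linear climate response to a CO2 emission stream."""
        forcing = forcing_per_gt * np.cumsum(emissions_gtco2) * dt
        temp = np.zeros_like(forcing)
        for i in range(1, len(forcing)):
            equilibrium = sensitivity * forcing[i]
            temp[i] = temp[i - 1] + dt * (equilibrium - temp[i - 1]) / tau
        return temp

    # e.g. a constant 1 GtCO2/yr emission stream for 50 years:
    print(f"dT after 50 yr: {temperature_response(np.ones(50))[-1]:.3f} K")
    ```

    Feeding the fuel burn from each optimized trajectory set into such a model is what enables the extra-fuel-versus-temperature-change trade-off described above.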

  4. Framework for SEM contour analysis

    NASA Astrophysics Data System (ADS)

    Schneider, L.; Farys, V.; Serret, E.; Fenouillet-Beranger, C.

    2017-03-01

    SEM images provide valuable information about patterning capability. Geometrical properties such as Critical Dimension (CD) can be extracted from them and are used to calibrate OPC models, thus making OPC more robust and reliable. However, there is currently a shortage of appropriate metrology tools to inspect complex two-dimensional patterns in the same way as one would work with simple one-dimensional patterns. In this article we present a full framework for the analysis of SEM images. It has been proven to be fast, reliable, and robust for every type of structure, and particularly for two-dimensional structures. To achieve this result, several innovative solutions have been developed and will be presented in the following pages. Firstly, we will present a new filter used to reduce noise in SEM images, followed by an efficient topography identifier, and finally we will describe the use of a topological skeleton as a measurement tool that can extend CD measurements to all kinds of patterns.
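
    The skeleton-based measurement idea can be prototyped quickly with scikit-image. In the sketch below, Gaussian smoothing and Otsu thresholding stand in for the paper's dedicated noise filter and topography identifier, and the local CD is read off as twice the distance-to-background along the skeleton; this illustrates the approach, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.ndimage import distance_transform_edt
    from skimage.filters import gaussian, threshold_otsu
    from skimage.morphology import skeletonize

    def skeleton_cd(sem_image):
        """Mean and spread of the local CD along a pattern's skeleton."""
        smooth = gaussian(sem_image, sigma=2)      # denoise (stand-in filter)
        mask = smooth > threshold_otsu(smooth)     # foreground pattern
        skel = skeletonize(mask)                   # 1-pixel-wide midline
        dist = distance_transform_edt(mask)        # distance to background
        widths = 2.0 * dist[skel]                  # local CD at skeleton pixels
        return widths.mean(), widths.std()
    ```

    Because the skeleton follows the pattern's midline whatever its shape, the same measurement applies unchanged to bends, junctions, and other two-dimensional structures where edge-to-edge CD gauges are ill-defined.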

  5. Field Assessment of Energy Audit Tools for Retrofit Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, J.; Bohac, D.; Nelson, C.

    2013-07-01

    This project focused on the use of home energy ratings as a tool to promote energy retrofits in existing homes. A home energy rating provides a quantitative appraisal of a home’s energy performance, usually compared to a benchmark such as the average energy use of similar homes in the same region. Rating systems based on energy performance models, the focus of this report, can establish a home’s achievable energy efficiency potential and provide a quantitative assessment of energy savings after retrofits are completed, although their accuracy needs to be verified by actual measurement or billing data. Ratings can also show homeowners where they stand compared to their neighbors, thus creating social pressure to conform to or surpass others. This project field-tested three different building performance models of varying complexity, in order to assess their value as rating systems in the context of a residential retrofit program: Home Energy Score, SIMPLE, and REM/Rate.

  6. Ancestry Informative Marker Sets for Determining Continental Origin and Admixture Proportions in Common Populations in America

    PubMed Central

    Kosoy, Roman; Nassir, Rami; Tian, Chao; White, Phoebe A; Butler, Lesley M.; Silva, Gabriel; Kittles, Rick; Alarcon-Riquelme, Marta E.; Gregersen, Peter K.; Belmont, John W.; De La Vega, Francisco M.; Seldin, Michael F.

    2011-01-01

    To provide a resource for assessing continental ancestry in a wide variety of genetic studies, we identified, validated and characterized a set of 128 ancestry informative markers (AIMs). The markers were chosen for informativeness, genome-wide distribution, and genotype reproducibility on two platforms (TaqMan® assays and Illumina arrays). We analyzed genotyping data from 825 subjects with diverse ancestry, including European, East Asian, Amerindian, African, South Asian, Mexican, and Puerto Rican. A comprehensive set of 128 AIMs and subsets as small as 24 AIMs are shown to be useful tools for ascertaining the origin of subjects from particular continents, and for correcting for population stratification in admixed population sample sets. Our findings provide general guidelines for the application of specific AIM subsets as a resource for wide application. We conclude that investigators can use TaqMan assays for the selected AIMs as a simple and cost-efficient tool to control for differences in continental ancestry when conducting association studies in ethnically diverse populations. PMID:18683858

  7. How to Recognize Success and Failure: Practical Assessment of an Evolving, First-Semester Laboratory Program Using Simple, Outcome-Based Tools

    ERIC Educational Resources Information Center

    Gron, Liz U.; Bradley, Shelly B.; McKenzie, Jennifer R.; Shinn, Sara E.; Teague, M. Warfield

    2013-01-01

    This paper presents the use of simple, outcome-based assessment tools to design and evaluate the first semester of a new introductory laboratory program created to teach green analytical chemistry using environmental samples. This general chemistry laboratory program, like many introductory courses, has a wide array of stakeholders within and…

  8. Visualization of small lesions in rat cartilage by means of laboratory-based x-ray phase contrast imaging

    NASA Astrophysics Data System (ADS)

    Marenzana, Massimo; Hagen, Charlotte K.; Das Neves Borges, Patricia; Endrizzi, Marco; Szafraniec, Magdalena B.; Ignatyev, Konstantin; Olivo, Alessandro

    2012-12-01

    Being able to quantitatively assess articular cartilage in three-dimensions (3D) in small rodent animal models, with a simple laboratory set-up, would prove extremely important for the development of pre-clinical research focusing on cartilage pathologies such as osteoarthritis (OA). These models are becoming essential tools for the development of new drugs for OA, a disease affecting up to 1/3 of the population older than 50 years for which there is no cure except prosthetic surgery. However, due to limitations in imaging technology, high-throughput 3D structural imaging has not been achievable in small rodent models, thereby limiting their translational potential and their efficiency as research tools. We show that a simple laboratory system based on coded-aperture x-ray phase contrast imaging (CAXPCi) can correctly visualize the cartilage layer in slices of an excised rat tibia imaged both in air and in saline solution. Moreover, we show that small, surgically induced lesions are also correctly detected by the CAXPCi system, and we support this finding with histopathology examination. Following these successful proof-of-concept results in rat cartilage, we expect that an upgrade of the system to higher resolutions (currently underway) will enable extending the method to the imaging of mouse cartilage as well. From a technological standpoint, by showing the capability of the system to detect cartilage also in water, we demonstrate phase sensitivity comparable to other lab-based phase methods (e.g. grating interferometry). In conclusion, CAXPCi holds a strong potential for being adopted as a routine laboratory tool for non-destructive, high throughput assessment of 3D structural changes in murine articular cartilage, with a possible impact in the field similar to the revolution that conventional microCT brought into bone research.

  9. A simple quality assurance test tool for the visual verification of light and radiation field congruent using electronic portal images device and computed radiography

    PubMed Central

    2012-01-01

    Background The radiation field on most megavoltage radiation therapy units is indicated by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment. Hence it is imperative that the light field be congruent with the radiation field. Method A simple quality assurance tool has been designed for a rapid and simple test of light and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Results Both the single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. Light and radiation field congruence could be detected to within 1 mm. This satisfies the American Association of Physicists in Medicine Task Group Report No. 142 recommendation of a 2-mm tolerance. Conclusion The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence. PMID:22452821

  10. The Persistence of Mode 1 Technology in the Korean Late Paleolithic

    PubMed Central

    Lee, Hyeong Woo

    2013-01-01

    Ssangjungri (SJ), an open-air site with several Paleolithic horizons, was recently discovered in South Korea. Most of the identified artifacts are simple core and flake tools that indicate an expedient knapping strategy. Bifacially worked core tools, which might be considered non-classic bifaces, have also been found. The prolific horizons at the site were dated by accelerator mass spectrometry (AMS) to about 30 kya. Another newly discovered Paleolithic open-air site, Jeungsan (JS), shows a homogeneous lithic pattern during this period. The dominant artifact types and raw material usage are similar in character to those from SJ, although JS yielded a larger number of simple core and flake tools with non-classic bifaces. Chronometric analyses by AMS and optically stimulated luminescence (OSL) indicate that the prime stratigraphic levels at JS also date to approximately 30 kya, and the numerous conjoining pieces indicate that the layers were not seriously affected by post-depositional processes. Thus, it can be confirmed that simple core and flake tools were produced at temporally and culturally independent sites until after 30 kya, supporting the hypothesis of a wide and persistent use of simple technology into the Late Pleistocene. PMID:23724113

  11. Analysis of pre-service physics teacher skills designing simple physics experiments based technology

    NASA Astrophysics Data System (ADS)

    Susilawati; Huda, C.; Kurniawan, W.; Masturi; Khoiri, N.

    2018-03-01

    Pre-service physics teachers' skill in designing simple experimental sets is very important for building students' conceptual understanding and practicing scientific skills in the laboratory. This study describes the skills of physics students in designing simple technology-based experiments. The experimental design stages include simple tool design and sensor modification. The research method used is descriptive, with a sample of 25 students and 5 variations of simple physics experimental designs. Based on the results of interviews and observations, the pre-service physics teachers' skill in designing simple technology-based physics experiments is good. Based on the observation results, their skill in designing simple experiments is good, while sensor modification and application are still lacking. This suggests that pre-service physics teachers still need considerable practice in conducting physics experiments that use sensor modifications. Based on the interview results, it was found that students are highly motivated to take part in laboratory activities actively and have a strong curiosity to become skilled at making simple practicum tools for physics experiments.

  12. Comprehensive, powerful, efficient, intuitive: a new software framework for clinical imaging applications

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Hanson, Dennis P.; Robb, Richard A.

    2006-03-01

    One of the greatest challenges for a software engineer is to create a complex application that is comprehensive enough to be useful to a diverse set of users, yet focused enough for individual tasks to be carried out efficiently with minimal training. This "powerful yet simple" paradox is particularly prevalent in advanced medical imaging applications. Recent research in the Biomedical Imaging Resource (BIR) at Mayo Clinic has been directed toward development of an imaging application framework that provides powerful image visualization/analysis tools in an intuitive, easy-to-use interface. It is based on two concepts very familiar to physicians - Cases and Workflows. Each case is associated with a unique patient and a specific set of routine clinical tasks, or a workflow. Each workflow is comprised of an ordered set of general-purpose modules which can be re-used for each unique workflow. Clinicians help describe and design the workflows, and then are provided with an intuitive interface to both patient data and analysis tools. Since most of the individual steps are common to many different workflows, the use of general-purpose modules reduces development time and results in applications that are consistent, stable, and robust. While the development of individual modules may reflect years of research by imaging scientists, new customized workflows based on the new modules can be developed extremely fast. If a powerful, comprehensive application is difficult to learn and complicated to use, it will be unacceptable to most clinicians. Clinical image analysis tools must be intuitive and effective or they simply will not be used.
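
    The Cases-and-Workflows organization maps naturally onto a small object model in which workflows are ordered lists of reusable modules. The sketch below is a hypothetical rendering of that architecture; the class and field names are invented for illustration.

    ```python
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class Module:
        """A general-purpose, reusable analysis step (e.g. segmentation)."""
        name: str
        run: Callable[[Dict], Dict]           # transforms the case's data

    @dataclass
    class Workflow:
        """An ordered set of modules for one routine clinical task."""
        name: str
        steps: List[Module] = field(default_factory=list)

    @dataclass
    class Case:
        """One patient plus the workflow to be applied to that patient."""
        patient_id: str
        workflow: Workflow
        data: Dict = field(default_factory=dict)

        def execute(self):
            for step in self.workflow.steps:
                self.data = step.run(self.data)   # modules chain case data
            return self.data
    ```

    Because each module is self-contained, a new clinical workflow is assembled from existing steps rather than programmed from scratch, which is the source of the rapid customization the authors report.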

  13. Identification of apple cultivars on the basis of simple sequence repeat markers.

    PubMed

    Liu, G S; Zhang, Y G; Tao, R; Fang, J G; Dai, H Y

    2014-09-12

    DNA markers are useful tools that play an important role in plant cultivar identification. They are usually based on the polymerase chain reaction (PCR) and include simple sequence repeats (SSRs), inter-simple sequence repeats, and random amplified polymorphic DNA. However, DNA markers have not been used effectively for the complete identification of plant cultivars because of the lack of known DNA fingerprints. Recently, a novel approach called the cultivar identification diagram (CID) strategy was developed to facilitate the use of DNA markers to separate plant individuals. The CID is designed so that a polymorphic marker generated from each PCR directly allows cultivar samples to be separated at each step. Therefore, it can be used to identify cultivars and varieties easily with fewer primers. In this study, 60 apple cultivars, including a few main cultivars grown in the field and varieties from descendants (Fuji x Telamon), were examined. Of the 20 pairs of SSR primers screened, 8 pairs gave reproducible, polymorphic DNA amplification patterns. The banding patterns obtained from these 8 primers were used to construct a CID map. Each cultivar or variety in this study was distinguished from the others completely, indicating that this method can be used for efficient cultivar identification. The results contribute to studies on germplasm resources and the seedling industry in fruit trees.

  14. Simplified aeroelastic modeling of horizontal axis wind turbines

    NASA Technical Reports Server (NTRS)

    Wendell, J. H.

    1982-01-01

    Certain aspects of the aeroelastic modeling and behavior of the horizontal axis wind turbine (HAWT) are examined. Two simple three degree of freedom models are described in this report, and tools are developed which allow other simple models to be derived. The first simple model developed is an equivalent hinge model to study the flap-lag-torsion aeroelastic stability of an isolated rotor blade. The model includes nonlinear effects, preconing, and noncoincident elastic axis, center of gravity, and aerodynamic center. A stability study is presented which examines the influence of key parameters on aeroelastic stability. Next, two general tools are developed to study the aeroelastic stability and response of a teetering rotor coupled to a flexible tower. The first of these tools is an aeroelastic model of a two-bladed rotor on a general flexible support. The second general tool is a harmonic balance solution method for the resulting second order system with periodic coefficients. The second simple model developed is a rotor-tower model which serves to demonstrate the general tools. This model includes nacelle yawing, nacelle pitching, and rotor teetering. Transient response time histories are calculated and compared to a similar model in the literature. Agreement between the two is very good, especially considering how few harmonics are used. Finally, a stability study is presented which examines the effects of support stiffness and damping, inflow angle, and preconing.

  15. CRISPR recognition tool (CRT): a tool for automatic detection of clustered regularly interspaced palindromic repeats.

    PubMed

    Bland, Charles; Ramsey, Teresa L; Sabree, Fareedah; Lowe, Micheal; Brown, Kyndall; Kyrpides, Nikos C; Hugenholtz, Philip

    2007-06-18

    Clustered Regularly Interspaced Palindromic Repeats (CRISPRs) are a novel type of direct repeat found in a wide range of bacteria and archaea. CRISPRs are beginning to attract attention because of their proposed mechanism; that is, defending their hosts against invading extrachromosomal elements such as viruses. Existing repeat detection tools do a poor job of identifying CRISPRs due to the presence of unique spacer sequences separating the repeats. In this study, a new tool, CRT, is introduced that rapidly and accurately identifies CRISPRs in large DNA strings, such as genomes and metagenomes. CRT was compared to CRISPR detection tools, Patscan and Pilercr. In terms of correctness, CRT was shown to be very reliable, demonstrating significant improvements over Patscan for measures precision, recall and quality. When compared to Pilercr, CRT showed improved performance for recall and quality. In terms of speed, CRT proved to be a huge improvement over Patscan. Both CRT and Pilercr were comparable in speed, however CRT was faster for genomes containing large numbers of repeats. In this paper a new tool was introduced for the automatic detection of CRISPR elements. This tool, CRT, showed some important improvements over current techniques for CRISPR identification. CRT's approach to detecting repetitive sequences is straightforward. It uses a simple sequential scan of a DNA sequence and detects repeats directly without any major conversion or preprocessing of the input. This leads to a program that is easy to describe and understand; yet it is very accurate, fast and memory efficient, being O(n) in space and O(nm/l) in time.
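
    CRT's sequential-scan strategy can be conveyed in a few lines: take a short exact seed, look for its reoccurrence within a spacer-sized window, and keep extending the chain until the pattern stops repeating. The sketch below is a much-simplified illustration of that idea (the parameter names and defaults are invented), not CRT's actual code.

    ```python
    def find_crispr_like(dna, seed=8, min_spacer=20, max_spacer=60, min_repeats=3):
        """Sequential scan for arrays of a short exact repeat separated
        by spacer-sized gaps, in the spirit of CRT's direct detection."""
        hits, i = [], 0
        while i < len(dna) - seed:
            pattern, positions, j = dna[i:i + seed], [i], i
            while True:
                window = dna[j + seed + min_spacer : j + seed + max_spacer + seed]
                k = window.find(pattern)
                if k < 0:
                    break
                j = j + seed + min_spacer + k          # next repeat start
                positions.append(j)
            if len(positions) >= min_repeats:
                hits.append(positions)                 # candidate CRISPR array
                i = positions[-1] + seed               # resume after the array
            else:
                i += 1
        return hits
    ```

    Like CRT, the scan works directly on the raw sequence without any conversion or preprocessing of the input, which is what keeps the memory footprint linear in the sequence length.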

  16. CRISPR Recognition Tool (CRT): a tool for automatic detection of clustered regularly interspaced palindromic repeats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bland, Charles; Ramsey, Teresa L.; Sabree, Fareedah

    Clustered Regularly Interspaced Palindromic Repeats (CRISPRs) are a novel type of direct repeat found in a wide range of bacteria and archaea. CRISPRs are beginning to attract attention because of their proposed mechanism: defending their hosts against invading extrachromosomal elements such as viruses. Existing repeat detection tools do a poor job of identifying CRISPRs because of the unique spacer sequences separating the repeats. In this study, a new tool, CRT, is introduced that rapidly and accurately identifies CRISPRs in large DNA strings, such as genomes and metagenomes. CRT was compared to two other CRISPR detection tools, Patscan and Pilercr. In terms of correctness, CRT was shown to be very reliable, demonstrating significant improvements over Patscan for the measures of precision, recall, and quality. When compared to Pilercr, CRT showed improved performance for recall and quality. In terms of speed, CRT also demonstrated superior performance, especially for genomes containing large numbers of repeats. In this paper a new tool was introduced for the automatic detection of CRISPR elements. This tool, CRT, was shown to be a significant improvement over current techniques for CRISPR identification. CRT's approach to detecting repetitive sequences is straightforward: it uses a simple sequential scan of a DNA sequence and detects repeats directly, without any major conversion or preprocessing of the input. This leads to a program that is easy to describe and understand, yet very accurate, fast, and memory efficient, being O(n) in space and O(nm/l) in time.

  17. Visualization and Interaction in Research, Teaching, and Scientific Communication

    NASA Astrophysics Data System (ADS)

    Ammon, C. J.

    2017-12-01

    Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices give those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches. I illustrate how even a little programming ability can free scientists from the constraints of existing tools and can facilitate the development of a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available on desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but they advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.

  18. A novel cell culture model as a tool for forensic biology experiments and validations.

    PubMed

    Feine, Ilan; Shpitzen, Moshe; Roth, Jonathan; Gafny, Ron

    2016-09-01

    To improve and advance DNA forensic casework investigation outcomes, extensive field and laboratory experiments are carried out in a broad range of relevant branches, such as touch and trace DNA, secondary DNA transfer and contamination confinement. Moreover, the development of new forensic tools, for example new sampling appliances, by commercial companies requires ongoing validation and assessment by forensic scientists. A frequent challenge in these kinds of experiments and validations is the lack of a stable, reproducible and flexible biological reference material. As a possible solution, we present here a cell culture model based on skin-derived human dermal fibroblasts. Cultured cells were harvested, quantified and dried on glass slides. These slides were used in adhesive tape-lifting experiments and tests of DNA crossover confinement by UV irradiation. The use of this model enabled a simple and concise comparison between four adhesive tapes, as well as a straightforward demonstration of the effect of UV irradiation intensities on DNA quantity and degradation. In conclusion, we believe this model has great potential to serve as an efficient research tool in forensic biology. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
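
    The core SaSAT workflow (space-filling sampling of parameter space followed by sensitivity measures) is easy to reproduce in Python. A minimal sketch, assuming SciPy's qmc module for Latin hypercube sampling and a one-line stand-in for the model; SaSAT itself adds many more measures, such as partial rank correlation, regression, and factor prioritisation.

    ```python
    import numpy as np
    from scipy.stats import qmc, spearmanr

    # Latin hypercube sample of a 3-parameter space (bounds are invented),
    # then rank-correlate each parameter with the model output.
    sampler = qmc.LatinHypercube(d=3, seed=1)
    X = qmc.scale(sampler.random(n=200),
                  l_bounds=[0.1, 0.0, 1.0], u_bounds=[0.5, 2.0, 10.0])

    def model(p):                 # stand-in for, e.g., an epidemic model
        beta, gamma, n0 = p
        return beta * n0 / (1.0 + gamma)

    y = np.apply_along_axis(model, 1, X)
    for name, col in zip(["beta", "gamma", "n0"], X.T):
        rho, _ = spearmanr(col, y)
        print(f"{name}: rank correlation {rho:+.2f}")
    ```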

  20. Modern data-driven decision support systems: the role of computing with words and computational linguistics

    NASA Astrophysics Data System (ADS)

    Kacprzyk, Janusz; Zadrożny, Sławomir

    2010-05-01

    We present how the conceptually and numerically simple notion of a fuzzy linguistic database summary can be a very powerful tool for gaining insight into the very essence of data. The use of linguistic summaries provides tools for the verbalisation of data analysis (mining) results which, in addition to the more commonly used visualisation, e.g. via a graphical user interface, can contribute to increased human consistency and ease of use, notably for supporting decision makers via the data-driven decision support system paradigm. Two new relevant aspects of the analysis, both first proposed by the authors, are also outlined. First, following Kacprzyk and Zadrożny, it is further considered how linguistic data summarisation is closely related to some types of solutions used in natural language generation (NLG). This makes it possible to reuse the increasingly effective and efficient tools and techniques developed in NLG. Second, similar remarks are given on relations to systemic functional linguistics. Moreover, following Kacprzyk and Zadrożny, comments are given on an extremely relevant aspect of the scalability of linguistic summarisation of data, using the new concept of conceptual scalability.
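
    The degree of truth of a linguistic summary in the classical Yager sense is straightforward to compute: average each record's membership in the summarizer, then pass that proportion through the fuzzy quantifier. A minimal sketch with toy piecewise-linear membership functions; the protoforms and quantifiers used in the paper are richer.

    ```python
    # Truth degree of "most records have a high value", after Yager.
    def mu_high(x):          # membership of x (on a 0-100 scale) in "high"
        return min(1.0, max(0.0, (x - 50.0) / 30.0))

    def mu_most(p):          # fuzzy quantifier "most" applied to proportion p
        return min(1.0, max(0.0, (p - 0.3) / 0.5))

    data = [82, 91, 40, 77, 65, 55, 88]
    proportion = sum(mu_high(x) for x in data) / len(data)
    print(f"T('most records have a high value') = {mu_most(proportion):.2f}")
    ```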

  1. NEFI: Network Extraction From Images

    PubMed Central

    Dirnberger, M.; Kehl, T.; Neumann, A.

    2015-01-01

    Networks are amongst the central building blocks of many systems. Given a graph of a network, methods from graph theory enable a precise investigation of its properties. Software for the analysis of graphs is widely available and has been applied to study various types of networks. In some applications, graph acquisition is relatively simple. However, for many networks data collection relies on images, where graph extraction requires domain-specific solutions. Here we introduce NEFI, a tool that extracts graphs from images of networks originating in various domains. In previous work on graph extraction, theoretical results are fully accessible only to an expert audience, and ready-to-use implementations for non-experts are rarely available or insufficiently documented. NEFI provides a novel platform allowing practitioners to easily extract graphs from images by combining basic tools from image processing, computer vision, and graph theory. Thus, NEFI constitutes an alternative to tedious manual graph extraction and special-purpose tools. We anticipate NEFI to enable time-efficient collection of large datasets. The analysis of these novel datasets may open up the possibility of gaining new insights into the structure and function of various networks. NEFI is open source and available at http://nefi.mpi-inf.mpg.de. PMID:26521675
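
    The extraction step that NEFI builds on can be sketched with scikit-image and NetworkX: skeletonize a binary image, then connect neighbouring skeleton pixels into a graph. This omits NEFI's segmentation, pruning, and node-merging stages and is only a minimal illustration of the pipeline's core.

    ```python
    import numpy as np
    import networkx as nx
    from skimage.morphology import skeletonize

    def image_to_graph(binary_image):
        """Skeleton pixels become nodes; 8-connected neighbours get edges."""
        skel = skeletonize(binary_image.astype(bool))
        ys, xs = np.nonzero(skel)
        pixels = set(zip(ys.tolist(), xs.tolist()))
        G = nx.Graph()
        for y, x in pixels:
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if (dy, dx) != (0, 0) and (y + dy, x + dx) in pixels:
                        G.add_edge((y, x), (y + dy, x + dx))
        return G

    img = np.zeros((10, 10), dtype=bool)
    img[5, 1:9] = True                    # a single horizontal filament
    print(image_to_graph(img).number_of_edges())
    ```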

  2. Orchids (Cymbidium spp., Oncidium, and Phalaenopsis).

    PubMed

    Chan, Ming-Tsair; Chan, Yuan-Li; Sanjaya

    2006-01-01

    Recent advances in genetic engineering have made the transformation and regeneration of plants a powerful tool for orchid improvement. This chapter presents a simple and reproducible Agrobacterium tumefaciens-mediated transformation protocol and molecular screening techniques for transgenics of two orchid genera, Oncidium and Phalaenopsis. The target tissues for gene transfer were protocorm-like bodies (PLBs) derived from protocorms, into which the constructed foreign genes were successfully introduced. To establish stable transformants, two stages of selection were applied to the PLBs co-cultivated with A. tumefaciens. A transformation efficiency of about 10% was achieved in Oncidium, with 108 independent antibiotic-resistant PLBs proliferating from 1000 infected PLBs. In Phalaenopsis, a transformation efficiency of about 11 to 12% was achieved using the present protocol. The molecular methods and GUS staining used to screen putative transgenic plants and confirm the integration of foreign DNA into the orchid genome are also described in detail. The methods described should also be useful for transforming desired genes into other orchid species.

  3. Selecting soybean resistant to the cyst nematode Heterodera glycines using simple sequence repeat (microssatellite) markers.

    PubMed

    Espindola, S M C G; Hamawaki, O T; Oliveira, A P; Hamawaki, C D L; Hamawaki, R L; Takahashi, L M

    2016-03-11

    The soybean cyst nematode (SCN) is a major cause of soybean yield reduction. The objective of this study was to evaluate the efficiency of marker-assisted selection in identifying genotypes resistant to SCN race 3 infection, using the Sat_168 and Sat-141 resistance quantitative trait loci. The experiment was carried out under greenhouse conditions, using soybean populations originating from crosses between susceptible and resistant parent stock: CD-201 (susceptible) and Foster IAC (resistant), Conquista (susceptible) and S83-30 (resistant), La-Suprema (susceptible) and S57-11 (resistant), and Parecis (susceptible) and S65-50 (resistant). Plants were inoculated with SCN and evaluated according to the female index (FI); those with FI < 10% were classified as resistant to nematode infection. Plants were genotyped for SCN resistance using the microsatellite markers Sat-141 and Sat_168. Marker selection efficiency was analyzed by a contingency table, taking into account genotypic versus phenotypic evaluations for each line. These markers were shown to be a useful tool for selecting resistance to SCN race 3.

  4. Single Cell Proteolytic Assays to Investigate Cancer Clonal Heterogeneity and Cell Dynamics Using an Efficient Cell Loading Scheme

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Chih; Cheng, Yu-Heng; Ingram, Patrick; Yoon, Euisik

    2016-06-01

    Proteolytic degradation of the extracellular matrix (ECM) is critical in cancer invasion, and recent work suggests that heterogeneous cancer populations cooperate in this process. Despite the importance of cell heterogeneity, conventional proteolytic assays measure average activity, requiring thousands of cells and providing limited information about heterogeneity and dynamics. Here, we developed a microfluidic platform that provides high-efficiency cell loading and simple valveless isolation, so the proteolytic activity of a small sample (10-100 cells) can be easily characterized. Combined with a single cell derived (clonal) sphere formation platform, we have successfully demonstrated the importance of microenvironmental cues for proteolytic activity and also investigated the difference between clones. Furthermore, the platform allows monitoring single cells at multiple time points, unveiling different cancer cell line dynamics in proteolytic activity. The presented tool facilitates single cell proteolytic analysis using small samples, and our findings illuminate the heterogeneous and dynamic nature of proteolytic activity.

  5. Efficient transformation and artificial miRNA gene silencing in Lemna minor.

    PubMed

    Cantó-Pastor, A; Mollá-Morales, A; Ernst, E; Dahl, W; Zhai, J; Yan, Y; Meyers, B C; Shanklin, J; Martienssen, R

    2015-01-01

    Despite rapid doubling time, simple architecture and ease of metabolic labelling, a lack of genetic tools in the Lemnaceae (duckweed) has impeded the full implementation of this organism as a model for biological research. Here, we present technologies to facilitate high-throughput genetic studies in duckweed. We developed a fast and efficient method for producing Lemna minor stable transgenic fronds via Agrobacterium-mediated transformation and regeneration from tissue culture. Additionally, we engineered an artificial microRNA (amiRNA) gene silencing system. We identified a Lemna gibba endogenous miR166 precursor and used it as a backbone to produce amiRNAs. As a proof of concept we induced the silencing of CH42, a magnesium chelatase subunit, using our amiRNA platform. Expression of CH42 in transgenic L. minor fronds was significantly reduced, which resulted in reduction of chlorophyll pigmentation. The techniques presented here will enable tackling future challenges in the biology and biotechnology of Lemnaceae. © 2014 German Botanical Society and The Royal Botanical Society of the Netherlands.

  6. Recent experience in simultaneous control-structure optimization

    NASA Technical Reports Server (NTRS)

    Salama, M.; Ramaker, R.; Milman, M.

    1989-01-01

    To show the feasibility of simultaneous optimization as a design procedure, low-order problems were used in conjunction with simple control formulations. The numerical results indicate that simultaneous optimization is not only feasible, but also advantageous. Such advantages come at the expense of introducing complexities beyond those encountered in structure optimization alone, or control optimization alone: the design parameter space is larger, the optimization may combine continuous and combinatoric variables, and the combined objective function may be nonconvex. Future extensions to large-order problems, more complex objective functions and constraints, and more sophisticated control formulations will require further research to ensure that the additional complexities do not outweigh the advantages of simultaneous optimization. Areas requiring more efficient tools than are currently available include multiobjective criteria and nonconvex optimization. Efficient techniques also need to be developed to deal with optimization over combinatoric and continuous variables, and with truncation issues for structure and control parameters in both the model space and the design space.

  7. A versatile model for soft patchy particles with various patch arrangements.

    PubMed

    Li, Zhan-Wei; Zhu, You-Liang; Lu, Zhong-Yuan; Sun, Zhao-Yan

    2016-01-21

    We propose a simple and general mesoscale soft patchy particle model, which can felicitously describe the deformable and surface-anisotropic characteristics of soft patchy particles. This model can be used in dynamics simulations to investigate the aggregation behavior and mechanism of various types of soft patchy particles with tunable number, size, direction, and geometrical arrangement of the patches. To improve the computational efficiency of this mesoscale model in dynamics simulations, we give the simulation algorithm that fits the compute unified device architecture (CUDA) framework of NVIDIA graphics processing units (GPUs). The validation of the model and the performance of the simulations using GPUs are demonstrated by simulating several benchmark systems of soft patchy particles with 1 to 4 patches in a regular geometrical arrangement. Because of its simplicity and computational efficiency, the soft patchy particle model will provide a powerful tool to investigate the aggregation behavior of soft patchy particles, such as patchy micelles, patchy microgels, and patchy dendrimers, over larger spatial and temporal scales.

  8. A unified REC market and composite RPO scheme for promotion of renewable energy in India

    NASA Astrophysics Data System (ADS)

    Shereef, R. M.; Khaparde, S. A.

    2017-07-01

    In India, a uniform price was assigned to renewable energy certificates (RECs) irrespective of renewable energy (RE) type, technology, and location. Moreover, the REC price bands are higher than the existing preferential tariffs. Distinct renewable purchase obligations (RPOs) are specified for the various RE types, yet efficient tools to check RPO compliance are lacking. For these reasons, stabilisation of the REC market is being delayed. This paper proposes a method using a plant performance multiplier to convert non-solar and solar RECs into a single equivalent REC with competitive REC pricing, which can be traded on a unified REC market. The method combines solar and non-solar RPOs into a single composite RPO, making RPO compliance and its checking simple and efficient. A sample illustration of the proposed method is given, the benefits offered by the proposed method in REC pricing, REC trading, and RPO compliance are discussed, and a comparative economic analysis of the present and proposed methods is reported.
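
    The conversion itself is simple arithmetic once the plant performance multipliers are fixed. A toy sketch with invented multiplier values (the paper derives them from plant performance; these numbers are placeholders only):

    ```python
    # Convert solar and non-solar RECs into equivalent RECs and check a
    # composite RPO; multipliers and holdings are invented for illustration.
    holdings = {"solar": 120, "non_solar": 300}       # certificates held
    multiplier = {"solar": 1.8, "non_solar": 1.0}     # performance multipliers

    equivalent = sum(n * multiplier[k] for k, n in holdings.items())
    composite_rpo = 500                               # obligation in eq. RECs
    shortfall = max(0.0, composite_rpo - equivalent)
    print(f"equivalent RECs: {equivalent:.0f}, shortfall: {shortfall:.0f}")
    ```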

  9. pySeismicFMM: Python based Travel Time Calculation in Regular 2D and 3D Grids in Cartesian and Geographic Coordinates using Fast Marching Method

    NASA Astrophysics Data System (ADS)

    Wilde-Piorko, M.; Polkowski, M.

    2016-12-01

    Seismic wave travel time calculation is the most common numerical operation in seismology. The most efficient case is travel time calculation in a 1D velocity model: for given source and receiver depths and angular distance, the time is calculated within a fraction of a second. Unfortunately, in most cases 1D is not enough to capture differentiating local and regional structures, and whenever possible travel times through a 3D velocity model have to be calculated. This can be achieved using ray tracing or time propagation in space. While a single ray path calculation is quick, it is complicated to find the ray path that connects the source with the receiver. Time propagation in space using the Fast Marching Method is more efficient in most cases, especially when there are multiple receivers. In this presentation the final release of the Python module pySeismicFMM is presented: a simple and very efficient tool for calculating travel times from sources to receivers. The calculation requires a regular 2D or 3D velocity grid, in either Cartesian or geographic coordinates. On a desktop-class computer the calculation speed is 200k grid cells per second. The calculation has to be performed once for every source location and provides travel times to all receivers. pySeismicFMM is free and open source. Development of this tool is part of the author's PhD thesis. The source code of pySeismicFMM will be published before the Fall Meeting. The National Science Centre Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
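
    The abstract does not show pySeismicFMM itself, but the underlying computation can be illustrated with the open-source scikit-fmm package, which exposes the same Fast Marching travel-time operation on a regular grid. A minimal 2D sketch with an assumed two-layer velocity model:

    ```python
    import numpy as np
    import skfmm

    nz, nx = 201, 401
    dx = 0.5                                  # grid spacing, km
    speed = np.full((nz, nx), 4.0)            # upper layer velocity, km/s
    speed[100:, :] = 6.0                      # faster lower layer

    phi = np.ones((nz, nx))
    phi[0, 200] = -1.0                        # source at the surface, mid-grid

    t = skfmm.travel_time(phi, speed, dx=dx)  # first arrivals to every cell
    print(f"travel time to bottom-right corner: {t[-1, -1]:.2f} s")
    ```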

  10. Dose calculation of dynamic trajectory radiotherapy using Monte Carlo.

    PubMed

    Manser, P; Frauchiger, D; Frei, D; Volken, W; Terribilini, D; Fix, M K

    2018-04-06

    With the volumetric modulated arc therapy (VMAT) delivery technique, the gantry position, multi-leaf collimator (MLC), and dose rate change dynamically during the application. Additional components, such as the collimator or the couch, can also be altered dynamically throughout the dose delivery. The degrees of freedom thus increase, allowing almost arbitrary dynamic beam trajectories. While the delivery of such dynamic trajectories on linear accelerators is technically possible, no dose calculation and validation tool has been available for them. The aim of this work is therefore to develop a dose calculation and verification tool for dynamic trajectories using Monte Carlo (MC) methods. The dose calculation for dynamic trajectories is implemented in the previously developed Swiss Monte Carlo Plan (SMCP). SMCP interfaces the treatment planning system Eclipse with a MC dose calculation algorithm and is already able to handle dynamic MLC and gantry rotations. Hence, the additional dynamic components, namely the collimator and the couch, are described similarly to the dynamic MLC, by defining data pairs of positions of the dynamic component and the corresponding MU fractions. For validation purposes, measurements were performed with the Delta4 phantom and with film, using the developer mode on a TrueBeam linear accelerator. These measured dose distributions were then compared with the corresponding calculations using SMCP. First, simple academic cases applying one-dimensional movements were investigated; second, more complex dynamic trajectories with several simultaneously moving components were compared, considering academic cases as well as a clinically motivated prostate case. The dose calculation for dynamic trajectories was successfully implemented into SMCP. The comparisons between the measured and calculated dose distributions for the simple as well as the more complex situations show agreement generally within 3% of the maximum dose or 3 mm. The required computation time for the dose calculation remains the same when the additional dynamically moving components are included. The results obtained for the dose comparisons in simple and complex situations suggest that the extended SMCP is an accurate dose calculation and efficient verification tool for dynamic trajectory radiotherapy. This work was supported by Varian Medical Systems. Copyright © 2018. Published by Elsevier GmbH.
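
    The abstract describes each extra dynamic component as data pairs of component position and MU fraction, in analogy to the dynamic MLC. A minimal sketch of how such a description yields the component state at any point of the delivery via interpolation; the pairs below are invented, and SMCP's actual sampling scheme may differ.

    ```python
    import numpy as np

    # (MU-fraction, position) control points for two dynamic components.
    mu_fraction    = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    couch_deg      = np.array([0.0, 10.0, 20.0, 20.0, 35.0])
    collimator_deg = np.array([30.0, 30.0, 45.0, 60.0, 60.0])

    delivered = 0.6    # query point: 60% of the monitor units delivered
    print("couch angle:     ", np.interp(delivered, mu_fraction, couch_deg))
    print("collimator angle:", np.interp(delivered, mu_fraction, collimator_deg))
    ```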

  11. SU-G-201-15: Nomogram as an Efficient Dosimetric Verification Tool in HDR Prostate Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, J; Todor, D

    Purpose: A nomogram as a simple QA tool for HDR prostate brachytherapy treatment planning has been developed and validated clinically. Reproducibility, including patient-to-patient and physician-to-physician variability, was assessed. Methods: The study was performed on HDR prostate implants from physician A (n=34) and B (n=15) using different implant techniques and planning methodologies. A nomogram was implemented as an independent QA of computer-based treatment planning before plan execution. Normalized implant strength (total air kerma strength Sk·t in cGy·cm² divided by prescribed dose in cGy) was plotted as a function of PTV volume and total V100. A quadratic equation was used to fit the data, with R² denoting the model predictive power. Results: All plans showed good target coverage while OARs met the dose constraint guidelines. Vastly different implant and planning styles were reflected in the conformity index (entire dose matrix V100/PTV volume; physician A implants: 1.27±0.14, physician B: 1.47±0.17) and the PTV V150/PTV volume ratio (physician A: 0.34±0.09, physician B: 0.24±0.07). The quadratic model provided a better fit for the curved relationship between normalized implant strength and total V100 (or PTV volume) than a simple linear function. Unlike the normalized implant strength versus PTV volume nomogram, which differed between physicians, a single quadratic nomogram (Sk·t)/D = −0.0008V² + 0.0542V + 1.1185 (R² = 0.9977) described the dependence of normalized implant strength on total V100 over all patients from both physicians, despite the two different implant and planning philosophies. The normalized implant strength versus total V100 model also produced fewer outlying points and a significantly higher correlation. Conclusion: A simple and universal, Excel-based nomogram was created as an independent calculation tool for HDR prostate brachytherapy. Unlike similar attempts, our nomogram is insensitive to implant style and does not rely on reproducing dose calculations using the TG-43 formalism, thus making it a truly independent check.
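
    The quadratic nomogram can be refit to institutional plan data in a few lines. A sketch using NumPy's polynomial fit on synthetic points scattered around the reported curve; real use would substitute one's own (V100, normalized implant strength) pairs.

    ```python
    import numpy as np

    v100 = np.array([20.0, 35.0, 50.0, 65.0, 80.0])            # cm^3
    strength = -0.0008 * v100**2 + 0.0542 * v100 + 1.1185      # toy data
    strength += np.random.default_rng(0).normal(0.0, 0.01, v100.size)

    c2, c1, c0 = np.polyfit(v100, strength, deg=2)             # quadratic fit
    pred = np.polyval([c2, c1, c0], v100)
    r2 = 1.0 - np.sum((strength - pred)**2) / np.sum((strength - strength.mean())**2)
    print(f"(Sk*t)/D = {c2:.4f}*V^2 + {c1:.4f}*V + {c0:.4f}, R^2 = {r2:.4f}")
    ```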

  12. Exposure assessment in health assessments for hand-arm vibration syndrome.

    PubMed

    Mason, H J; Poole, K; Young, C

    2011-08-01

    Assessing past cumulative vibration exposure is part of assessing the risk of hand-arm vibration syndrome (HAVS) in workers exposed to hand-arm vibration and invariably forms part of a medical assessment of such workers. The aim was to investigate the strength of the relationships between the presence and severity of HAVS and different cumulative exposure metrics obtained from a self-reporting questionnaire. Cumulative exposure metrics were constructed from a tool-based questionnaire applied in a group of HAVS referrals and workplace field studies. These metrics included simple years of vibration exposure, cumulative total hours of all tool use, and differing combinations of acceleration magnitudes for specific tools and their daily use, including the frequency-weighting method contained in ISO 5349-1:2001. Simple years of exposure is a weak predictor of HAVS or its increasing severity. Cumulative hours across all vibrating tools used is a more powerful predictor. More complex calculations involving likely acceleration data for specific classes of tools, whether frequency weighted or not, did not offer a clear further advantage in this dataset. This may be due to the uncertainty in workers' recall of their past tool usage or the variability between tools in the magnitude of their vibration emission. Assessing years of exposure or 'latency' in a worker should be replaced by cumulative hours of tool use, which can be readily obtained using a tool-pictogram-based self-reporting questionnaire and a simple spreadsheet calculation.
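
    The two families of exposure metrics compared in the study can be computed directly from questionnaire-style tool-use records. A sketch with invented records and an assumed 220 working days per year; the study's actual weighted combinations follow ISO 5349-1:2001.

    ```python
    # Each record: tool name, years used, hours of use per day, and an
    # assumed vibration magnitude in m/s^2 (illustrative values only).
    tools = [
        {"name": "grinder", "years": 5, "hours_per_day": 2.0, "a_hv": 4.0},
        {"name": "impact wrench", "years": 3, "hours_per_day": 0.5, "a_hv": 9.0},
    ]
    DAYS_PER_YEAR = 220                      # assumed working days per year

    # Metric 1: cumulative hours of use across all vibrating tools.
    hours = sum(t["years"] * DAYS_PER_YEAR * t["hours_per_day"] for t in tools)

    # Metric 2: an acceleration-weighted dose, sum of a^2 * t, one of the
    # "more complex" combinations evaluated in the study.
    dose = sum(t["a_hv"]**2 * t["years"] * DAYS_PER_YEAR * t["hours_per_day"]
               for t in tools)

    print(f"cumulative hours: {hours:.0f} h, weighted dose: {dose:.0f}")
    ```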

  13. Efficient simple sealed-off CO laser at room temperature

    NASA Astrophysics Data System (ADS)

    Peters, P. J. M.; Witteman, W. J.; Zuidema, R. J.

    1980-07-01

    The paper reports a simple sealed-off CW CO laser with gold electrodes. A constant long-life output power of more than 29 W/m and a maximum efficiency of 15% at room temperature are reported. No auxiliary features, such as a palladium hydrogen extraction tube, are necessary.

  14. An efficient enzyme-powered micromotor device fabricated by cyclic alternate hybridization assembly for DNA detection.

    PubMed

    Fu, Shizhe; Zhang, Xueqing; Xie, Yuzhe; Wu, Jie; Ju, Huangxian

    2017-07-06

    An efficient enzyme-powered micromotor device was fabricated by assembling multiple layers of catalase on the inner surface of a poly(3,4-ethylenedioxythiophene)/poly(sodium 4-styrenesulfonate)/Au microtube (PEDOT-PSS/Au). The catalase assembly was achieved by programmed DNA hybridization, which was performed by immobilizing a designed sandwich DNA structure as the sensing unit on the PEDOT-PSS/Au and then alternately hybridizing it with two assisting DNAs to bind the enzyme for efficient motor motion. The micromotor device showed unique features of good reproducibility, stability, and motion performance. Under optimal conditions, it showed a speed of 420 μm s⁻¹ in 2% H₂O₂ and even 51 μm s⁻¹ in 0.25% H₂O₂. In the presence of target DNA, the sensing unit hybridized with the target DNA to release the multi-layer DNA along with the multiple catalase layers, resulting in a decrease of the motion speed. Using the speed as a signal, the micromotor device could detect DNA from 10 nM to 1 μM. The proposed micromotor device, along with the cyclic alternate DNA hybridization assembly technique, provides a new path to fabricating efficient and versatile micromotors, which would be an exceptional tool for rapid and simple detection of biomolecules.

  15. EPMOSt: An Energy-Efficient Passive Monitoring System for Wireless Sensor Networks

    PubMed Central

    Garcia, Fernando P.; Andrade, Rossana M. C.; Oliveira, Carina T.; de Souza, José Neuman

    2014-01-01

    Monitoring systems are important for debugging and analyzing Wireless Sensor Networks (WSN). In passive monitoring, a monitoring network needs to be deployed in addition to the network to be monitored, named the target network. The monitoring network captures and analyzes packets transmitted by the target network. An energy-efficient passive monitoring system is necessary when we need to monitor a WSN in a real scenario because the lifetime of the monitoring network is extended and, consequently, the target network benefits from the monitoring for a longer time. In this work, we have identified, analyzed and compared the main passive monitoring systems proposed for WSN. During our research, we did not identify any passive monitoring system for WSN that aims to reduce the energy consumption of the monitoring network. Therefore, we propose an Energy-efficient Passive MOnitoring SysTem for WSN named EPMOSt that provides monitoring information using a Simple Network Management Protocol (SNMP) agent. Thus, any management tool that supports the SNMP protocol can be integrated with this monitoring system. Experiments with real sensors were performed in several scenarios. The results obtained show the energy efficiency of the proposed monitoring system and the viability of using it to monitor WSN in real scenarios. PMID:24949639

  16. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous-energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems: simple_ace.pl and simple_ace_mg.pl.

  17. Engineered nanomaterials: toward effective safety management in research laboratories.

    PubMed

    Groso, Amela; Petri-Fink, Alke; Rothen-Rutishauser, Barbara; Hofmann, Heinrich; Meyer, Thierry

    2016-03-15

    It is still unknown which types of nanomaterials, and at which doses, represent an actual danger to humans and the environment. Meanwhile, there is consensus on applying the precautionary principle to these novel materials until more information is available. To deal with the rapid evolution of research, including the fast turnover of collaborators, a user-friendly and easy-to-apply risk assessment tool offering adequate preventive and protective measures has to be provided. Based on new information concerning the hazards of engineered nanomaterials, we improved a previously developed risk assessment tool, following a simple scheme to gain efficiency. In the first step, using a logical decision tree, one of three hazard levels, H1 to H3, is assigned to the nanomaterial. Using a combination of decision trees and matrices, the second step links the hazard with the emission and exposure potential to assign one of three nanorisk levels (Nano 3: highest risk; Nano 1: lowest risk) to the activity. These operations are repeated at each process step, leading to the laboratory classification. The third step provides detailed preventive and protective measures for the determined level of nanorisk. We developed a simple and intuitive method adapted to nanomaterial risk management in research laboratories. It classifies nanoactivities into three levels and additionally proposes concrete preventive and protective measures and associated actions. This method is a valuable tool for all participants in nanomaterial safety. Users gain an essential learning opportunity and increase their safety awareness. Laboratory managers obtain a reliable overview of the operations involving nanomaterials in their laboratories; this is essential, as they are responsible for employee safety but are sometimes unaware of the work performed. Institute and school managers can obtain the information needed to implement an adequate safety management system. Bringing this risk to a three-band scale (like other types of risk, such as biological, radiation, and chemical) facilitates management for occupational health and safety specialists, and having an easy-to-use tool enables a dialogue between all these partners, whose semantics and priorities in terms of safety often differ.
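
    The three-step logic (a hazard level, then a hazard-by-exposure matrix giving the nanorisk level, repeated over all process steps) lends itself to a small lookup-table sketch. The matrix below is a plausible monotone mapping invented for illustration; the published decision trees and matrices must be consulted for real classification.

    ```python
    # Hypothetical hazard x exposure matrix yielding nanorisk levels 1-3.
    RISK_MATRIX = {
        ("H1", "low"): 1, ("H1", "medium"): 1, ("H1", "high"): 2,
        ("H2", "low"): 1, ("H2", "medium"): 2, ("H2", "high"): 3,
        ("H3", "low"): 2, ("H3", "medium"): 3, ("H3", "high"): 3,
    }

    def nanorisk(hazard: str, exposure: str) -> str:
        return f"Nano {RISK_MATRIX[(hazard, exposure)]}"

    # The laboratory classification is the maximum over all process steps.
    steps = [("H2", "low"), ("H2", "high"), ("H1", "medium")]
    lab = max(RISK_MATRIX[s] for s in steps)
    print(nanorisk("H2", "high"), "| laboratory classification:", f"Nano {lab}")
    ```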

  18. Evaluating Lexical Coverage in Simple English Wikipedia Articles: A Corpus-Driven Study

    ERIC Educational Resources Information Center

    Hendry, Clinton; Sheepy, Emily

    2017-01-01

    Simple English Wikipedia is a user-contributed online encyclopedia intended for young readers and readers whose first language is not English. We compiled a corpus of the entirety of Simple English Wikipedia as of June 20th, 2017. We used lexical frequency profiling tools to investigate the vocabulary size needed to comprehend Simple English…
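
    The quantity at the heart of lexical frequency profiling, the fraction of corpus tokens covered by the n most frequent word types, takes only a few lines to compute. A toy sketch; real profiling works with word families and large reference frequency lists.

    ```python
    from collections import Counter

    # Toy lexical-coverage computation over a tiny "corpus".
    corpus = ("the cat sat on the mat and the cat saw the dog "
              "and the dog sat on the cat").split()
    counts = Counter(corpus)
    total = sum(counts.values())
    for n in (1, 3, 5):
        coverage = sum(c for _, c in counts.most_common(n)) / total
        print(f"top {n} word types cover {coverage:.0%} of tokens")
    ```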

  19. Designing optimal cell factories: integer programming couples elementary mode analysis with regulation

    PubMed Central

    2012-01-01

    Background: Elementary mode (EM) analysis is ideally suited for metabolic engineering, as it allows for an unbiased decomposition of metabolic networks into biologically meaningful pathways. Recently, constrained minimal cut sets (cMCS) have been introduced to derive optimal design strategies for strain improvement using the full potential of EM analysis. However, this approach does not allow for the inclusion of regulatory information. Results: Here we present an alternative, novel, and simple method for the prediction of cMCS which accounts for Boolean transcriptional regulation. We use binary linear programming and show that the design of a regulated, optimal metabolic network of minimal functionality can be formulated as a standard optimization problem, where EMs and regulation show up as constraints. We validated our tool by optimizing ethanol production in E. coli. Our study showed that up to 70% of the predicted cMCS contained non-enzymatic, non-annotated reactions, which are difficult to engineer. These cMCS are automatically excluded by our approach using simple weight functions. Finally, due to efficient preprocessing, the binary program remains computationally feasible. Conclusions: We used integer programming to predict efficient deletion strategies for metabolically engineering a production organism. Our formulation utilizes the full potential of cMCS but adds flexibility to the design process. In particular, our method allows regulatory information to be integrated into the metabolic design process and explicitly favors experimentally feasible deletions. The method remains manageable even if millions or potentially billions of EMs enter the analysis. We demonstrated that our approach correctly predicts the most efficient designs for ethanol production in E. coli. PMID:22898474
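
    The binary-program formulation can be sketched with an off-the-shelf MILP modeller: one binary variable per reaction, a covering constraint per undesired elementary mode, and weights that steer the solver away from hard-to-engineer deletions. The EM sets and weights below are invented, and the sketch assumes the PuLP package with its bundled CBC solver.

    ```python
    import pulp

    # x_r = 1 means reaction r is deleted; every undesired EM must lose
    # at least one of its reactions. Data are invented for illustration.
    reactions = ["r1", "r2", "r3", "r4", "r5"]
    undesired_ems = [{"r1", "r2"}, {"r2", "r4"}, {"r3", "r5"}]
    weight = {"r1": 1, "r2": 1, "r3": 1, "r4": 100, "r5": 1}  # r4: hard to engineer

    prob = pulp.LpProblem("cMCS", pulp.LpMinimize)
    x = pulp.LpVariable.dicts("del", reactions, cat="Binary")
    prob += pulp.lpSum(weight[r] * x[r] for r in reactions)   # prefer cheap knockouts
    for em in undesired_ems:
        prob += pulp.lpSum(x[r] for r in em) >= 1             # cut this mode

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print("knockouts:", [r for r in reactions if x[r].value() == 1])
    ```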

  20. Low-Dose Irradiation Enhances Gene Targeting in Human Pluripotent Stem Cells.

    PubMed

    Hatada, Seigo; Subramanian, Aparna; Mandefro, Berhan; Ren, Songyang; Kim, Ho Won; Tang, Jie; Funari, Vincent; Baloh, Robert H; Sareen, Dhruv; Arumugaswami, Vaithilingaraja; Svendsen, Clive N

    2015-09-01

    Human pluripotent stem cells (hPSCs) are now being used for both disease modeling and cell therapy; however, efficient homologous recombination (HR) is often crucial to develop isogenic control or reporter lines. We showed that limited low-dose irradiation (LDI) using either γ-ray or x-ray exposure (0.4 Gy) significantly enhanced HR frequency, possibly through induction of DNA repair/recombination machinery including ataxia-telangiectasia mutated, histone H2A.X and RAD51 proteins. LDI could also increase HR efficiency by more than 30-fold when combined with the targeting tools zinc finger nucleases, transcription activator-like effector nucleases, and clustered regularly interspaced short palindromic repeats. Whole-exome sequencing confirmed that the LDI administered to hPSCs did not induce gross genomic alterations or affect cellular viability. Irradiated and targeted lines were karyotypically normal and made all differentiated lineages that continued to express green fluorescent protein targeted at the AAVS1 locus. This simple method allows higher throughput of new, targeted hPSC lines that are crucial to expand the use of disease modeling and to develop novel avenues of cell therapy. The simple and relevant technique described in this report uses a low level of radiation to increase desired gene modifications in human pluripotent stem cells by an order of magnitude. This higher efficiency permits greater throughput with reduced time and cost. The low level of radiation also greatly increased the recombination frequency when combined with developed engineered nucleases. Critically, the radiation did not lead to increases in DNA mutations or to reductions in overall cellular viability. This novel technique enables not only the rapid production of disease models using human stem cells but also the possibility of treating genetically based diseases by correcting patient-derived cells. ©AlphaMed Press.

  1. Underworld: What we set out to do, How far did we get, What did we Learn ? (Invited)

    NASA Astrophysics Data System (ADS)

    Moresi, L. N.

    2013-12-01

    Underworld was conceived as a tool for modelling 3D lithospheric deformation coupled with the underlying and surrounding mantle flow. The challenges involved were to find a method capable of representing the complicated, non-linear, history-dependent rheology of the near surface as well as being able to model mantle convection, and, simultaneously, to be able to solve the numerical system efficiently. Underworld is a hybrid particle/mesh code reminiscent of the particle-in-cell techniques of the early 1960s. The Underworld team (*) was not the first to use this approach, nor the last, but the team does have considerable experience and much has been learned along the way. The use of a finite element method as the underlying "cell" in which the Lagrangian particles are embedded considerably reduces the errors associated with mapping material properties to the cells. The particles are treated as moving quadrature points in computing the stiffness matrix integrals. The decoupling of deformation markers from computation points allows the use of structured meshes, efficient parallel decompositions, and simple-to-code geometric multigrid solution methods. For a 3D code such efficiencies are very important. The elegance of the method is that it can be completely described in a couple of sentences. However, there are some limitations: it is not obvious how to retain this elegance for unstructured or adaptive meshes, arbitrary element types are not sufficiently well integrated by the simple quadrature approach, and swarms of particles representing volumes are usually an inefficient representation of surfaces. These limitations will be discussed. (*) Although not formally constituted, my co-conspirators in this exercise are listed as the Underworld team and I will reveal their true identities on the day.

  2. Improving smoothing efficiency of rigid conformal polishing tool using time-dependent smoothing evaluation model

    NASA Astrophysics Data System (ADS)

    Song, Chi; Zhang, Xuejun; Zhang, Xin; Hu, Haifei; Zeng, Xuefeng

    2017-06-01

    A rigid conformal (RC) lap can smooth mid-spatial-frequency (MSF) errors, which are naturally smaller than the tool size, while still removing large-scale errors in a short time. However, the RC-lap smoothing efficiency is poorer than expected, and existing smoothing models cannot explicitly specify how to improve it. We present an explicit time-dependent smoothing evaluation model containing specific smoothing parameters derived directly from the parametric smoothing model and the Preston equation. Based on the time-dependent model, we propose a strategy to improve the RC-lap smoothing efficiency, which incorporates the theoretical model, tool optimization, and efficiency limit determination. Two sets of smoothing experiments were performed to demonstrate the smoothing efficiency achieved using the time-dependent smoothing model. A high, theory-like tool influence function and a limiting tool speed of 300 RPM were obtained.

  3. 5-phosphonato-3,4-dihydropyrimidin-2(1H)-ones: Zinc triflate-catalyzed one-pot multi-component synthesis, X-ray crystal structure and anti-inflammatory activity

    NASA Astrophysics Data System (ADS)

    Essid, Idris; Lahbib, Karima; Kaminsky, Werner; Ben Nasr, Cherif; Touil, Soufiane

    2017-08-01

    Herein we report a simple and efficient one-pot three-component synthesis of 5-phosphonato-3,4-dihydropyrimidin-2(1H)-ones through the zinc triflate-catalyzed Biginelli-type reaction of β-ketophosphonates, aldehydes, and urea. The compounds obtained were characterized by various spectroscopic tools, including IR and NMR (¹H, ³¹P, ¹³C) spectroscopy, mass spectrometry, and single-crystal X-ray diffraction. All the synthesized compounds were screened, for the first time, for anti-inflammatory activity by the carrageenan-induced hind paw edema method using female Wistar rats, and they showed significant anti-inflammatory activity, in some cases higher than that of the standard indomethacin.

  4. Comparison of simulated and measured spectra from an X-ray tube for the energies between 20 and 35 keV

    NASA Astrophysics Data System (ADS)

    Yücel, M.; Emirhan, E.; Bayrak, A.; Ozben, C. S.; Yücel, E. Barlas

    2015-11-01

    The design and production of a simple and low-cost X-ray imaging system for light industrial applications were targeted in the Nuclear Physics Laboratory of Istanbul Technical University. In this study, the production, transmission, and detection of X-rays were simulated for the proposed imaging device. An OX/70-P dental tube was used, and the X-ray spectra simulated with Geant4 were validated by comparison with X-ray spectra measured between 20 and 35 keV. The relative detection efficiency of the detector was also determined to confirm the physics processes used in the simulations. Various time optimization techniques were applied to reduce the simulation time.

  5. A quantitative witness for Greenberger-Horne-Zeilinger entanglement.

    PubMed

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.

  6. Direct determination of trace phthalate esters in alcoholic spirits by spray-inlet microwave plasma torch ionization tandem mass spectrometry.

    PubMed

    Miao, Meng; Zhao, Gaosheng; Xu, Li; Dong, Junguo; Cheng, Ping

    2018-03-01

    A direct analytical method based on spray-inlet microwave plasma torch tandem mass spectrometry was applied to simultaneously determine four phthalate esters (PAEs), namely benzyl butyl phthalate, diethyl phthalate, dipentyl phthalate, and dodecyl phthalate, with extremely high sensitivity in spirits without sample treatment. Among the four brands of spirit products, three PAE compounds were directly determined at very low concentrations, from 1.30 to 114 ng·g⁻¹. Compared with other online and off-line methods, the spray-inlet microwave plasma torch tandem mass spectrometry technique is extremely simple, rapid, sensitive, and highly efficient, providing an ideal screening tool for PAEs in spirits. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Hydrolyzable tannins with the hexahydroxydiphenoyl unit and the m-depsidic link: HPLC-DAD-MS identification and model synthesis.

    PubMed

    Arapitsas, Panagiotis; Menichetti, Stefano; Vincieri, Franco F; Romani, Annalisa

    2007-01-10

    This study was designed to develop efficient analytical tools for the difficult HPLC-DAD-MS identification of hydrolyzable tannins in natural tissue extracts. Throughout the study of the spectroscopic characteristics of properly synthesized stereodefined standards, it was observed that the UV-vis spectra of compounds with the m-depsidic link showed a characteristic shoulder at 300 nm, consistent with the simple glucogalloyl esters, whereas compounds with the hexahydroxydiphenoyl (HHDP) unit gave a diagnostic fragmentation pattern, caused by a spontaneous lactonization in the mass spectrometer. These observations were confirmed by HPLC-DAD-MS analyses of tannic acid and raspberry extracts, which are rich in hydrolyzable tannins with the m-depsidic link and the HHDP unit, respectively.

  8. Building a computer-aided design capability using a standard time share operating system

    NASA Technical Reports Server (NTRS)

    Sobieszczanski, J.

    1975-01-01

    The paper describes how an integrated system of engineering computer programs can be built using a standard commercially available operating system. The discussion opens with an outline of the auxiliary functions that an operating system can perform for a team of engineers involved in a large and complex task. An example of a specific integrated system is provided to explain how the standard operating system features can be used to organize the programs into a simple and inexpensive but effective system. Applications to an aircraft structural design study are discussed to illustrate the use of an integrated system as a flexible and efficient engineering tool. The discussion concludes with an engineer's assessment of an operating system's capabilities and desirable improvements.

  9. Design and Testing of Flight Control Laws on the RASCAL Research Helicopter

    NASA Technical Reports Server (NTRS)

    Frost, Chad R.; Hindson, William S.; Moralez, Ernesto, III; Tucker, George E.; Dryfoos, James B.

    2001-01-01

    Two unique sets of flight control laws were designed, tested and flown on the Army/NASA Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) JUH-60A Black Hawk helicopter. The first set of control laws used a simple rate feedback scheme, intended to facilitate the first flight and subsequent flight qualification of the RASCAL research flight control system. The second set of control laws comprised a more sophisticated model-following architecture. Both sets of flight control laws were developed and tested extensively using desktop-to-flight modeling, analysis, and simulation tools. Flight test data matched the model predicted responses well, providing both evidence and confidence that future flight control development for RASCAL will be efficient and accurate.

  10. Targeted Gene Knock Out Using Nuclease-Assisted Vector Integration: Hemi- and Homozygous Deletion of JAG1.

    PubMed

    Gapinske, Michael; Tague, Nathan; Winter, Jackson; Underhill, Gregory H; Perez-Pinera, Pablo

    2018-01-01

    Gene editing technologies are revolutionizing fields such as biomedicine and biotechnology by providing a simple means to manipulate the genetic makeup of essentially any organism. Gene editing tools function by introducing double-stranded breaks at targeted sites within the genome, which the host cells repair preferentially by Non-Homologous End Joining. While the technologies to introduce double-stranded breaks have been extensively optimized, this progress has not been matched by the development of methods to integrate heterologous DNA at the target sites or techniques to detect and isolate cells that harbor the desired modification. We present here a technique for rapid introduction of vectors at target sites in the genome that enables efficient isolation of successfully edited cells.

  11. Picometer-resolution dual-comb spectroscopy with a free-running fiber laser.

    PubMed

    Zhao, Xin; Hu, Guoqing; Zhao, Bofeng; Li, Cui; Pan, Yingling; Liu, Ya; Yasui, Takeshi; Zheng, Zheng

    2016-09-19

    Dual-comb spectroscopy holds promise as a real-time, high-resolution spectroscopy tool. However, in its conventional schemes, the stringent requirement on the coherence between the two lasers calls for sophisticated control systems. By replacing control electronics with an all-optical dual-comb lasing scheme, a simplified dual-comb spectroscopy scheme is demonstrated using one dual-wavelength, passively mode-locked fiber laser. Pulses with an intracavity-dispersion-determined repetition-frequency difference are shown to have good mutual coherence and stability. The capability to resolve the comb teeth and a picometer-wide optical spectral resolution are demonstrated using a simple data acquisition system. Energy-efficient, free-running fiber lasers with a small comb-tooth spacing could enable low-cost dual-comb systems.

  12. Applications of capillary electrophoresis in characterizing recombinant protein therapeutics.

    PubMed

    Zhao, Shuai Sherry; Chen, David D Y

    2014-01-01

    The use of recombinant protein for therapeutic applications has increased significantly in the last three decades. The heterogeneity of these proteins, often caused by the complex biosynthesis pathways and the subsequent PTMs, poses a challenge for drug characterization to ensure its safety, quality, integrity, and efficacy. CE, with its simple instrumentation, superior separation efficiency, small sample consumption, and short analysis time, is a well-suited analytical tool for therapeutic protein characterization. Different separation modes, including CIEF, SDS-CGE, CZE, and CE-MS, provide complementary information of the proteins. The CE applications for recombinant therapeutic proteins from the year 2000 to June 2013 are reviewed and technical concerns are discussed in this article. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. An evaluation of the ERTS data collection system as a potential operational tool. [automatic hydrologic data collection and processing system for geological surveys

    NASA Technical Reports Server (NTRS)

    Paulson, R. W.

    1974-01-01

    The Earth Resources Technology Satellite Data Collection System has been shown to be, from the users vantage point, a reliable and simple system for collecting data from U.S. Geological Survey operational field instrumentation. It is technically feasible to expand the ERTS system into an operational polar-orbiting data collection system to gather data from the Geological Survey's Hydrologic Data Network. This could permit more efficient internal management of the Network, and could enable the Geological Survey to make data available to cooperating agencies in near-real time. The Geological Survey is conducting an analysis of the costs and benefits of satellite data-relay systems.

  14. Information integration for a sky survey by data warehousing

    NASA Astrophysics Data System (ADS)

    Luo, A.; Zhang, Y.; Zhao, Y.

    The virtualization service of the data system for the LAMOST sky survey is very important for astronomers. The service needs to integrate information from data collections, catalogs, and references, and to support simple federation of a set of distributed files and associated metadata. Data warehousing has been in existence for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Relational database systems such as Oracle now support the warehouse capability, including extensions to the SQL language to support OLAP operations, and a number of metadata management tools have been created. The information integration of LAMOST by applying data warehousing aims to effectively provide data and knowledge on-line.

  15. A quantitative witness for Greenberger-Horne-Zeilinger entanglement

    PubMed Central

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger–type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties. PMID:23267431

  16. Efficiency of endoscopy units can be improved with use of discrete event simulation modeling.

    PubMed

    Sauer, Bryan G; Singh, Kanwar P; Wagner, Barry L; Vanden Hoek, Matthew S; Twilley, Katherine; Cohn, Steven M; Shami, Vanessa M; Wang, Andrew Y

    2016-11-01

    Background and study aims: The projected increased demand for health services obligates healthcare organizations to operate efficiently. Discrete event simulation (DES) is a modeling method that allows for optimization of systems through virtual testing of different configurations before implementation. The objective of this study was to identify strategies to improve the daily efficiencies of an endoscopy center with the use of DES. Methods: We built a DES model of a five procedure room endoscopy unit at a tertiary-care university medical center. After validating the baseline model, we tested alternate configurations to run the endoscopy suite and evaluated outcomes associated with each change. The main outcome measures included adequate number of preparation and recovery rooms, blocked inflow, delay times, blocked outflows, and patient cycle time. Results: Based on a sensitivity analysis, the adequate number of preparation rooms is eight and recovery rooms is nine for a five procedure room unit (total 3.4 preparation and recovery rooms per procedure room). Simple changes to procedure scheduling and patient arrival times led to a modest improvement in efficiency. Increasing the preparation/recovery rooms based on the sensitivity analysis led to significant improvements in efficiency. Conclusions: By applying tools such as DES, we can model changes in an environment with complex interactions and find ways to improve the medical care we provide. DES is applicable to any endoscopy unit and would be particularly valuable to those who are trying to improve on the efficiency of care and patient experience.
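
    A discrete event simulation of the patient path in the study (preparation, then procedure, then recovery) is straightforward with the SimPy package. The room counts below follow the sensitivity-analysis result quoted above; all arrival and service times are illustrative, not the paper's calibrated values.

    ```python
    import random
    import simpy

    random.seed(1)

    def patient(env, prep, proc, rec, cycle_times):
        arrive = env.now
        with prep.request() as r:
            yield r
            yield env.timeout(random.uniform(20, 40))   # preparation, min
        with proc.request() as r:
            yield r
            yield env.timeout(random.uniform(25, 45))   # procedure
        with rec.request() as r:
            yield r
            yield env.timeout(random.uniform(30, 60))   # recovery
        cycle_times.append(env.now - arrive)

    def arrivals(env, prep, proc, rec, cycle_times):
        for _ in range(40):                              # 40 patients per day
            env.process(patient(env, prep, proc, rec, cycle_times))
            yield env.timeout(12)                        # one arrival / 12 min

    env = simpy.Environment()
    prep = simpy.Resource(env, capacity=8)               # preparation rooms
    proc = simpy.Resource(env, capacity=5)               # procedure rooms
    rec = simpy.Resource(env, capacity=9)                # recovery rooms
    cycle_times = []
    env.process(arrivals(env, prep, proc, rec, cycle_times))
    env.run()
    print(f"mean patient cycle time: {sum(cycle_times)/len(cycle_times):.1f} min")
    ```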

  17. Efficiency of endoscopy units can be improved with use of discrete event simulation modeling

    PubMed Central

    Sauer, Bryan G.; Singh, Kanwar P.; Wagner, Barry L.; Vanden Hoek, Matthew S.; Twilley, Katherine; Cohn, Steven M.; Shami, Vanessa M.; Wang, Andrew Y.

    2016-01-01

    Background and study aims: The projected increased demand for health services obligates healthcare organizations to operate efficiently. Discrete event simulation (DES) is a modeling method that allows for optimization of systems through virtual testing of different configurations before implementation. The objective of this study was to identify strategies to improve the daily efficiencies of an endoscopy center with the use of DES. Methods: We built a DES model of a five procedure room endoscopy unit at a tertiary-care university medical center. After validating the baseline model, we tested alternate configurations to run the endoscopy suite and evaluated outcomes associated with each change. The main outcome measures included adequate number of preparation and recovery rooms, blocked inflow, delay times, blocked outflows, and patient cycle time. Results: Based on a sensitivity analysis, the adequate number of preparation rooms is eight and recovery rooms is nine for a five procedure room unit (total 3.4 preparation and recovery rooms per procedure room). Simple changes to procedure scheduling and patient arrival times led to a modest improvement in efficiency. Increasing the preparation/recovery rooms based on the sensitivity analysis led to significant improvements in efficiency. Conclusions: By applying tools such as DES, we can model changes in an environment with complex interactions and find ways to improve the medical care we provide. DES is applicable to any endoscopy unit and would be particularly valuable to those who are trying to improve on the efficiency of care and patient experience. PMID:27853739

  18. Dynamic Characteristics of a Simple Brayton Cryocycle

    NASA Astrophysics Data System (ADS)

    Kutzschbach, A.; Kauschke, M.; Haberstroh, Ch.; Quack, H.

    2006-04-01

    The goal of the overall program is to develop a dynamic numerical model of helium refrigerators and the associated cooling systems based on commercial simulation software. The aim is to give system designers a tool to search for optimum control strategies during the construction phase of the refrigerator with the help of a plant "simulator". In a first step, a simple Brayton refrigerator has been investigated, which consists of a compressor, an after-cooler, a counter-current heat exchanger, a turboexpander and a heat source. Operating modes are "refrigeration" and "liquefaction". Whereas for the steady-state design only component efficiencies are needed and mass and energy balances have to be calculated, for the dynamic calculation one also needs the thermal masses and the helium inventory. Transient mass and energy balances have to be formulated for many small elements and then solved simultaneously for all elements. The starting point of the simulation of the Brayton cycle is steady-state operation at design conditions. The response of the system to step and cyclic changes of the refrigeration or liquefaction rate is calculated and characterized.
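
    For a single lumped element, the transient energy balance described here reduces to m c dT/dt = Q_in - Q_out. A forward-Euler sketch of one such element, with purely hypothetical numbers (the real model solves many coupled elements simultaneously):

```python
# One lumped element of the dynamic model: m*c*dT/dt = q_in - q_out,
# stepped explicitly. All values are hypothetical placeholders.
m_c = 5.0e3                 # thermal mass of the element, J/K
T = 300.0                   # initial temperature, K
q_in, q_out = 2.0e3, 1.5e3  # heat flows into and out of the element, W
dt = 1.0                    # time step, s

for _ in range(5):
    T += dt * (q_in - q_out) / m_c
print(f"temperature after 5 s: {T:.3f} K")
```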

  19. Study of central light concentration in nearby galaxies

    NASA Astrophysics Data System (ADS)

    Aswathy, S.; Ravikumar, C. D.

    2018-06-01

    We propose a novel technique to estimate the masses of supermassive black holes (SMBHs) residing at the centres of massive galaxies in the nearby Universe using simple photometry. Aperture photometry using SEXTRACTOR is employed to determine the central intensity ratio (CIR) at the optical centre of the galaxy image for a sample of 49 nearby galaxies with SMBH mass estimations. We find that the CIR of ellipticals and classical bulges is strongly correlated with SMBH masses whereas pseudo-bulges and ongoing mergers show significant scatter. Also, the CIR of low-luminosity AGNs in the sample shows significant connection with the 5 GHz nuclear radio emission suggesting a stronger link between the former and the SMBH evolution in these galaxies. In addition, it is seen that various structural and dynamical properties of the SMBH host galaxies are correlated with the CIR making the latter an important parameter in galaxy evolution studies. Finally, we propose the CIR to be an efficient and simple tool not only to distinguish classical bulges from pseudo-bulges but also to estimate the mass of the central SMBH.

  20. Gradient optimization of finite projected entangled pair states

    NASA Astrophysics Data System (ADS)

    Liu, Wen-Yuan; Dong, Shao-Jun; Han, Yong-Jian; Guo, Guang-Can; He, Lixin

    2017-05-01

    Projected entangled pair states (PEPS) methods have been proven to be powerful tools to solve strongly correlated quantum many-body problems in two dimensions. However, due to the high computational scaling with the virtual bond dimension D , in a practical application, PEPS are often limited to rather small bond dimensions, which may not be large enough for some highly entangled systems, for instance, frustrated systems. Optimization of the ground state using the imaginary time evolution method with a simple update scheme may go to a larger bond dimension. However, the accuracy of the rough approximation to the environment of the local tensors is questionable. Here, we demonstrate that by combining the imaginary time evolution method with a simple update, Monte Carlo sampling techniques and gradient optimization will offer an efficient method to calculate the PEPS ground state. By taking advantage of massive parallel computing, we can study quantum systems with larger bond dimensions up to D =10 without resorting to any symmetry. Benchmark tests of the method on the J1-J2 model give impressive accuracy compared with exact results.

  1. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data have been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high-dimensional data sets. With single-run data sets increasing into tens, hundreds and even thousands of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs, and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather on reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command-line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. ParCAT is implemented in C to provide efficient file IO. The file IO operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel IO capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation, and in a few minutes for relatively small jobs completed on modern HPC systems such as ORNL's Jaguar.
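
    The reductions ParCAT provides (spatio-temporal means, variances, differences between runs) are simple to state; the toolkit's contribution is performing them in parallel over very large files. A serial Python analogue with hypothetical file and variable names:

```python
# Serial sketch of the reductions ParCAT parallelizes. File and variable
# names are hypothetical; ParCAT itself is C code built on parallel-netcdf.
import numpy as np
from netCDF4 import Dataset

with Dataset("run_a.nc") as a, Dataset("run_b.nc") as b:
    tas_a = a.variables["tas"][:]   # (time, lat, lon) surface air temperature
    tas_b = b.variables["tas"][:]

temporal_mean = tas_a.mean(axis=0)              # per-grid-cell mean over time
temporal_var = tas_a.var(axis=0)                # per-grid-cell variance
run_diff = temporal_mean - tas_b.mean(axis=0)   # difference between two runs

# Like ParCAT, emit a small summary in a common format rather than a plot.
np.savetxt("run_diff.csv", run_diff, delimiter=",")
```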

  2. Convexity of Energy-Like Functions: Theoretical Results and Applications to Power System Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dvijotham, Krishnamurthy; Low, Steven; Chertkov, Michael

    2015-01-12

    Power systems are undergoing unprecedented transformations with increased adoption of renewables and distributed generation, as well as the adoption of demand response programs. All of these changes, while making the grid more responsive and potentially more efficient, pose significant challenges for power systems operators. Conventional operational paradigms are no longer sufficient as the power system may no longer have big dispatchable generators with sufficient positive and negative reserves. This increases the need for tools and algorithms that can efficiently predict safe regions of operation of the power system. In this paper, we study energy functions as a tool to design algorithms for various operational problems in power systems. These have a long history in power systems and have been primarily applied to transient stability problems. In this paper, we take a new look at power systems, focusing on an aspect that has previously received little attention: convexity. We characterize the domain of voltage magnitudes and phases within which the energy function is convex in these variables. We show that this corresponds naturally with standard operational constraints imposed in power systems. We show that the power flow equations can be solved using this approach, as long as the solution lies within the convexity domain. We outline various desirable properties of solutions in the convexity domain and present simple numerical illustrations supporting our results.

  3. Investigating Geosparql Requirements for Participatory Urban Planning

    NASA Astrophysics Data System (ADS)

    Mohammadi, E.; Hunter, A. J. S.

    2015-06-01

    We propose that participatory GIS (PGIS) activities, including participatory urban planning, can be made more efficient and effective if spatial reasoning rules are integrated with PGIS tools to simplify engagement for public contributors. Spatial reasoning is used to describe relationships between spatial entities. These relationships can be evaluated quantitatively or qualitatively using geometrical algorithms, ontological relations, and topological methods. Semantic web services utilize tools and methods that can facilitate spatial reasoning. GeoSPARQL, introduced by OGC, is a spatial reasoning standard used to make declarations about entities (graphical contributions) that take the form of a subject-predicate-object triple or statement. GeoSPARQL uses three basic methods to infer topological relationships between spatial entities: OGC's simple feature topology, RCC8, and the DE-9IM model. While these methods are comprehensive in their ability to define topological relationships between spatial entities, they are often inadequate for defining complex relationships that exist in the spatial realm, particularly relationships between urban entities, such as those between a bus route, the collection of associated bus stops, and their overall surroundings as an urban planning pattern. In this paper we investigate common qualitative spatial reasoning methods as a preliminary step toward enhancing the capabilities of GeoSPARQL in an online participatory GIS framework in which reasoning is used to validate plans against standard patterns that can be found in an efficient/effective urban environment.
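
    A minimal illustration of the triple form such declarations take, using the rdflib Python package; the entity URIs are hypothetical, and geo:sfContains is one of the OGC simple-feature topology predicates mentioned above:

```python
# Assert "planning district 12 contains bus stop 42" as a GeoSPARQL-style
# triple, then query it. Entity URIs are made up for illustration.
from rdflib import Graph, Namespace

GEO = Namespace("http://www.opengis.net/ont/geosparql#")
EX = Namespace("http://example.org/plan/")

g = Graph()
g.add((EX.district12, GEO.sfContains, EX.busStop42))

query = """
PREFIX geo: <http://www.opengis.net/ont/geosparql#>
ASK { ?area geo:sfContains <http://example.org/plan/busStop42> }
"""
print(bool(g.query(query).askAnswer))   # True: some area contains the stop
```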

  4. Generation of recombinant rotaviruses expressing fluorescent proteins using an optimized reverse genetics system.

    PubMed

    Komoto, Satoshi; Fukuda, Saori; Ide, Tomihiko; Ito, Naoto; Sugiyama, Makoto; Yoshikawa, Tetsushi; Murata, Takayuki; Taniguchi, Koki

    2018-04-18

    An entirely plasmid-based reverse genetics system for rotaviruses was established very recently. We improved this reverse genetics system so as to generate recombinant rotavirus by transfecting only the 11 cDNA plasmids for its 11 gene segments, under conditions in which the ratio of the cDNA plasmids for the NSP2 and NSP5 genes is increased. Utilizing this highly efficient system, we then engineered infectious recombinant rotaviruses expressing bioluminescent (NanoLuc luciferase) and fluorescent (EGFP and mCherry) reporters. These recombinant rotaviruses expressing reporters remained genetically stable during serial passages. Our reverse genetics approach and recombinant rotaviruses carrying reporter genes will be great additions to the tool kit for studying the molecular virology of rotavirus, and for developing future next-generation vaccines and expression vectors. IMPORTANCE Rotavirus is one of the most important pathogens causing severe gastroenteritis in young children worldwide. In this paper, we describe a robust and simple reverse genetics system based only on rotavirus cDNAs, and its application for engineering infectious recombinant rotaviruses harboring bioluminescent (NanoLuc) and fluorescent (EGFP and mCherry) protein genes. This highly efficient reverse genetics system and recombinant RVAs expressing reporters could be powerful tools for the study of different aspects of rotavirus replication. Furthermore, they may be useful for next-generation vaccine production for this medically important virus. Copyright © 2018 American Society for Microbiology.

  5. The curation paradigm and application tool used for manual curation of the scientific literature at the Comparative Toxicogenomics Database

    PubMed Central

    Davis, Allan Peter; Wiegers, Thomas C.; Murphy, Cynthia G.; Mattingly, Carolyn J.

    2011-01-01

    The Comparative Toxicogenomics Database (CTD) is a public resource that promotes understanding about the effects of environmental chemicals on human health. CTD biocurators read the scientific literature and convert free-text information into a structured format using official nomenclature, integrating third-party controlled vocabularies for chemicals, genes, diseases and organisms, and a novel controlled vocabulary for molecular interactions. Manual curation produces a robust, richly annotated dataset of highly accurate and detailed information. Currently, CTD describes over 349 000 molecular interactions between 6800 chemicals, 20 900 genes (for 330 organisms) and 4300 diseases that have been manually curated from over 25 400 peer-reviewed articles. These manually curated data are further integrated with other third-party data (e.g. Gene Ontology, KEGG and Reactome annotations) to generate a wealth of toxicogenomic relationships. Here, we describe our approach to manual curation that uses a powerful and efficient paradigm involving mnemonic codes. This strategy allows biocurators to quickly capture detailed information from articles by generating simple statements using codes to represent the relationships between data types. The paradigm is versatile, expandable, and able to accommodate new data challenges that arise. We have incorporated this strategy into a web-based curation tool to further increase efficiency and productivity, implement quality control in real-time and accommodate biocurators working remotely. Database URL: http://ctd.mdibl.org PMID:21933848

  6. Quality by design: optimization of a liquid filled pH-responsive macroparticles using Draper-Lin composite design.

    PubMed

    Rafati, Hasan; Talebpour, Zahra; Adlnasab, Laleh; Ebrahimi, Samad Nejad

    2009-07-01

    In this study, pH-responsive macroparticles incorporating peppermint oil (PO) were prepared using a simple emulsification/polymer precipitation technique. The formulations were examined for their properties and the desired quality was then achieved using a quality by design (QBD) approach. For this purpose, a Draper-Lin small composite design study was employed in order to investigate the effect of four independent variables, including the PO to water ratio, the concentration of pH-sensitive polymer (hydroxypropyl methylcellulose phthalate), and the acid and plasticizer concentrations, on the encapsulation efficiency and PO loading. The analysis of variance showed that the polymer concentration was the most important variable for encapsulation efficiency (p < 0.05). The multiple regression analysis of the results led to equations that adequately described the influence of the independent variables on the selected responses. Furthermore, the desirability function was employed as an effective tool for transforming each response separately and encompassing all of these responses in an overall desirability function for global optimization of the encapsulation process. The optimized macroparticles were predicted to yield 93.4% encapsulation efficiency and 72.8% PO loading, which were remarkably close to the experimental values of 89.2% and 69.5%, respectively.
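
    As a sketch of the desirability-function step, assuming simple larger-is-better ramps; the acceptable ranges below are hypothetical, not the study's:

```python
import numpy as np

def desirability_max(y, low, high):
    """Larger-is-better desirability: 0 below low, 1 above high, linear ramp between."""
    return float(np.clip((y - low) / (high - low), 0.0, 1.0))

# Hypothetical acceptable ranges for the two responses in the study.
d_encap = desirability_max(89.2, low=60.0, high=95.0)  # encapsulation efficiency, %
d_load = desirability_max(69.5, low=40.0, high=80.0)   # peppermint oil loading, %

# The overall desirability is the geometric mean of the individual ones;
# global optimization searches the factor space for settings maximizing D.
D = (d_encap * d_load) ** 0.5
print(f"overall desirability D = {D:.3f}")
```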

  7. On the study of control effectiveness and computational efficiency of reduced Saint-Venant model in model predictive control of open channel flow

    NASA Astrophysics Data System (ADS)

    Xu, M.; van Overloop, P. J.; van de Giesen, N. C.

    2011-02-01

    Model predictive control (MPC) of open channel flow is becoming an important tool in water management. The complexity of the prediction model has a large influence on the MPC application in terms of control effectiveness and computational efficiency. The Saint-Venant equations, called SV model in this paper, and the Integrator Delay (ID) model are either accurate but computationally costly, or simple but restricted to allowed flow changes. In this paper, a reduced Saint-Venant (RSV) model is developed through a model reduction technique, Proper Orthogonal Decomposition (POD), on the SV equations. The RSV model keeps the main flow dynamics and functions over a large flow range but is easier to implement in MPC. In the test case of a modeled canal reach, the number of states and disturbances in the RSV model is about 45 and 16 times less than the SV model, respectively. The computational time of MPC with the RSV model is significantly reduced, while the controller remains effective. Thus, the RSV model is a promising means to balance the control effectiveness and computational efficiency.
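
    POD itself is compact to state: collect snapshots of the full SV-model state, take an SVD, and keep the leading left singular vectors as the reduced basis the RSV model evolves. A sketch with random stand-in data:

```python
import numpy as np

# Snapshot matrix: each column is the full SV-model state (levels and flows
# at every grid point) at one instant. Random numbers stand in for real data.
rng = np.random.default_rng(0)
n_states, n_snapshots = 450, 200
X = rng.standard_normal((n_states, n_snapshots))

U, s, _ = np.linalg.svd(X, full_matrices=False)   # POD modes = left singular vectors

# Keep enough modes to capture 99% of the snapshot energy.
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1
Phi = U[:, :r]                                    # reduced basis, n_states x r

x = rng.standard_normal(n_states)                 # some full state
a = Phi.T @ x                                     # its r reduced coordinates
x_approx = Phi @ a                                # reconstruction from the basis
print(r, np.linalg.norm(x - x_approx) / np.linalg.norm(x))
```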

  8. Excitation-Energy Transfer Paths from Tryptophans to Coordinated Copper Ions in Engineered Azurins: a Source of Observables for Monitoring Protein Structural Changes

    NASA Astrophysics Data System (ADS)

    Di Rocco, Giulia; Bernini, Fabrizio; Borsari, Marco; Martinelli, Ilaria; Bortolotti, Carlo Augusto; Battistuzzi, Gianantonio; Ranieri, Antonio; Caselli, Monica; Sola, Marco; Ponterini, Glauco

    2016-09-01

    The intrinsic fluorescence of recombinant proteins offers a powerful tool to detect and characterize structural changes induced by chemical or biological stimuli. We show that metal-ion binding to a hexahistidine tail can significantly broaden the range of such structurally sensitive fluorescence observables. Bipositive metal ions such as Cu2+, Ni2+ and Zn2+ bind 6xHis-tag azurin and its 6xHis-tagged R129W and W48A-R129W mutants with good efficiency and, thereby, quench their intrinsic fluorescence. Due to a much more favourable spectral overlap, the 6xHis-tag/Cu2+ complex(es) are the most efficient quenchers of both W48 and W129 emissions. Based on the simple Förster-type dependence of energy-transfer efficiency on donor/acceptor distance, we can trace several excitation-energy transfer paths across the protein structure. Unexpected lifetime components in the emission decays of the azurin 6xHis-tag/Cu2+ complex reveal underlying complexity in the conformational landscape of these systems. The new tryptophan emission quenching paths provide additional signals for detecting and identifying protein structural changes.
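
    The "simple Förster-type dependence" invoked here is the standard transfer-efficiency law E = 1/(1 + (r/R0)^6); inverting a measured efficiency gives the donor-acceptor distance used to trace the paths. A sketch with a hypothetical Förster radius:

```python
def fret_efficiency(r_nm, r0_nm):
    """Förster-type transfer efficiency for donor-acceptor distance r and Förster radius R0."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

R0 = 1.2  # nm; hypothetical, the actual value depends on the measured spectral overlap
for r in (0.8, 1.2, 2.0):
    print(f"r = {r:.1f} nm -> E = {fret_efficiency(r, R0):.2f}")
```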

  9. A simple and efficient quasi 3-dimensional viscoelastic model and software for simulation of tapping-mode atomic force microscopy

    DOE PAGES

    Solares, Santiago D.

    2015-11-26

    This study introduces a quasi-3-dimensional (Q3D) viscoelastic model and software tool for use in atomic force microscopy (AFM) simulations. The model is based on a 2-dimensional array of standard linear solid (SLS) model elements. The well-known 1-dimensional SLS model is a textbook example in viscoelastic theory but is relatively new in AFM simulation. It is the simplest model that offers a qualitatively correct description of the most fundamental viscoelastic behaviors, namely stress relaxation and creep. However, this simple model does not reflect the correct curvature in the repulsive portion of the force curve, so its application in the quantitative interpretation of AFM experiments is relatively limited. In the proposed Q3D model the use of an array of SLS elements leads to force curves that have the typical upward curvature in the repulsive region, while still offering a very low computational cost. Furthermore, the use of a multidimensional model allows for the study of AFM tips having non-ideal geometries, which can be extremely useful in practice. Examples of typical force curves are provided for single- and multifrequency tapping-mode imaging, for both of which the force curves exhibit the expected features. Lastly, a software tool to simulate amplitude and phase spectroscopy curves is provided, which can be easily modified to implement other control schemes in order to aid in the interpretation of AFM experiments.
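
    The building block of the Q3D array, the standard linear solid, has the textbook relaxation modulus G(t) = G_inf + G1 exp(-t/tau): after a step strain, stress decays from the instantaneous modulus G_inf + G1 to the equilibrium modulus G_inf. A sketch with hypothetical parameters:

```python
import numpy as np

def sls_relaxation_modulus(t, g_inf, g1, tau):
    """Stress-relaxation response of one standard linear solid (Zener) element."""
    return g_inf + g1 * np.exp(-t / tau)

# Hypothetical parameters for a single element of the Q3D array.
t = np.linspace(0.0, 5.0, 6)   # time after the step strain, in units of tau
print(sls_relaxation_modulus(t, g_inf=1.0, g1=4.0, tau=1.0))
```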

  10. A simple and efficient quasi 3-dimensional viscoelastic model and software for simulation of tapping-mode atomic force microscopy.

    PubMed

    Solares, Santiago D

    2015-01-01

    This paper introduces a quasi-3-dimensional (Q3D) viscoelastic model and software tool for use in atomic force microscopy (AFM) simulations. The model is based on a 2-dimensional array of standard linear solid (SLS) model elements. The well-known 1-dimensional SLS model is a textbook example in viscoelastic theory but is relatively new in AFM simulation. It is the simplest model that offers a qualitatively correct description of the most fundamental viscoelastic behaviors, namely stress relaxation and creep. However, this simple model does not reflect the correct curvature in the repulsive portion of the force curve, so its application in the quantitative interpretation of AFM experiments is relatively limited. In the proposed Q3D model the use of an array of SLS elements leads to force curves that have the typical upward curvature in the repulsive region, while still offering a very low computational cost. Furthermore, the use of a multidimensional model allows for the study of AFM tips having non-ideal geometries, which can be extremely useful in practice. Examples of typical force curves are provided for single- and multifrequency tapping-mode imaging, for both of which the force curves exhibit the expected features. Finally, a software tool to simulate amplitude and phase spectroscopy curves is provided, which can be easily modified to implement other control schemes in order to aid in the interpretation of AFM experiments.

  11. Utility of the heteroduplex assay (HDA) as a simple and cost-effective tool for the identification of HIV type 1 dual infections in resource-limited settings.

    PubMed

    Powell, Rebecca L R; Urbanski, Mateusz M; Burda, Sherri; Nanfack, Aubin; Kinge, Thompson; Nyambi, Phillipe N

    2008-01-01

    The predominance of unique recombinant forms (URFs) of HIV-1 in Cameroon suggests that dual infection, the concomitant or sequential infection with genetically distinct HIV-1 strains, occurs frequently in this region; yet, identifying dual infection among large HIV cohorts in local, resource-limited settings is uncommon, since this generally relies on labor-intensive and costly sequencing methods. Consequently, there is a need to develop an effective, cost-efficient method appropriate to the developing world to identify these infections. In the present study, the heteroduplex assay (HDA) was used to verify dual or single infection status, as shown by traditional sequence analysis, for 15 longitudinally sampled study subjects from Cameroon. Heteroduplex formation, indicative of a dual infection, was identified for all five study subjects shown by sequence analysis to be dually infected. Conversely, heteroduplex formation was not detectable for all 10 HDA reactions of the singly infected study subjects. These results suggest that the HDA is a simple yet powerful and inexpensive tool for the detection of both intersubtype and intrasubtype dual infections, and that the HDA harbors significant potential for reliable, high-throughput screening for dual infection. As these infections and the recombinants they generate facilitate leaps in HIV-1 evolution, and may present major challenges for treatment and vaccine design, this assay will be critical for monitoring the continuing pandemic in regions of the world where HIV-1 viral diversity is broad.

  12. Simple and efficient method for region of interest value extraction from picture archiving and communication system viewer with optical character recognition software and macro program.

    PubMed

    Lee, Young Han; Park, Eun Hae; Suh, Jin-Suck

    2015-01-01

    The objectives are: 1) to introduce a simple and efficient method for extracting region of interest (ROI) values from a Picture Archiving and Communication System (PACS) viewer using optical character recognition (OCR) software and a macro program, and 2) to evaluate the accuracy of this method with a PACS workstation. This module was designed to extract the ROI values on the images of the PACS, and was created as a development tool using open-source OCR software and an open-source macro program. The principal processes are as follows: (1) capture a region of the ROI values as a graphic file for OCR, (2) recognize the text from the captured image by OCR software, (3) perform error-correction, (4) extract the values, including area, average, standard deviation, max, and min values, from the text, (5) reformat the values into temporary strings with tabs, and (6) paste the temporary strings into the spreadsheet. This process was repeated for each ROI. The accuracy of this module was evaluated on 1040 recognitions from 280 randomly selected ROIs of magnetic resonance images. The input times of ROIs were compared between the conventional manual method and the extraction-module-assisted input method. The module for extracting ROI values operated successfully using the OCR and macro programs. The values of the area, average, standard deviation, maximum, and minimum could be recognized and error-corrected with the AutoHotkey-coded module. The average input times using the conventional method and the proposed module-assisted method were 34.97 seconds and 7.87 seconds, respectively. A simple and efficient method for ROI value extraction was developed with open-source OCR and a macro program. Accurate inputs of various numbers from ROIs can be extracted with this module. The proposed module could be applied to the next generation of PACS or to existing PACS that have not yet been upgraded. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
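
    Steps (2) through (5) are straightforward to mock up. The published module used open-source OCR with an AutoHotkey macro; the Python analogue below uses the pytesseract OCR wrapper and assumes a hypothetical "label: value" read-out format:

```python
# OCR a captured screenshot of the ROI read-out, parse the statistics, and
# emit a tab-separated row ready to paste into a spreadsheet (steps 2-6).
import re
from PIL import Image
import pytesseract

text = pytesseract.image_to_string(Image.open("roi_capture.png"))

labels = ("Area", "Average", "SD", "Max", "Min")   # hypothetical read-out labels
values = {}
for label in labels:
    match = re.search(rf"{label}\s*[:=]?\s*(-?\d+(?:\.\d+)?)", text, re.IGNORECASE)
    if match:
        values[label] = float(match.group(1))      # crude error-correction: keep numbers only

row = "\t".join(str(values.get(k, "")) for k in labels)
print(row)
```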

  13. Simple Example of Backtest Overfitting (SEBO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    In the field of mathematical finance, a "backtest" is the usage of historical market data to assess the performance of a proposed trading strategy. It is a relatively simple matter for a present-day computer system to explore thousands, millions or even billions of variations of a proposed strategy, and pick the best performing variant as the "optimal" strategy "in sample" (i.e., on the input dataset). Unfortunately, such an "optimal" strategy often performs very poorly "out of sample" (i.e., on another dataset), because the parameters of the investment strategy have been overfit to the in-sample data, a situation known as "backtest overfitting". While the mathematics of backtest overfitting has been examined in several recent theoretical studies, here we pursue a more tangible analysis of this problem, in the form of an online simulator tool. Given an input random walk time series, the tool develops an "optimal" variant of a simple strategy by exhaustively exploring all integer parameter values among a handful of parameters. That "optimal" strategy is overfit, since by definition a random walk is unpredictable. Then the tool tests the resulting "optimal" strategy on a second random walk time series. In most runs using our online tool, the "optimal" strategy derived from the first time series performs poorly on the second time series, demonstrating how hard it is not to overfit a backtest. We offer this online tool, "Simple Example of Backtest Overfitting (SEBO)", to facilitate further research in this area.
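
    The simulator's logic is easy to reproduce in miniature: exhaustively "optimize" a toy strategy on one random walk, then test the winning parameters on a second. The moving-average crossover and parameter grid below are illustrative stand-ins, not the tool's actual rules:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_walk(n=2000):
    return np.cumsum(rng.standard_normal(n))

def strategy_pnl(prices, fast, slow):
    """Long when the fast moving average is above the slow one, flat otherwise."""
    f = np.convolve(prices, np.ones(fast) / fast, mode="valid")
    s = np.convolve(prices, np.ones(slow) / slow, mode="valid")
    k = min(len(f), len(s))                        # align both averages at the end
    position = (f[-k:] > s[-k:]).astype(float)[:-1]
    return float(np.sum(position * np.diff(prices[-k:])))

train, test = random_walk(), random_walk()
grid = [(fast, slow) for fast in range(2, 30) for slow in range(fast + 1, 60)]
best = max(grid, key=lambda p: strategy_pnl(train, *p))   # in-sample "optimum"

print("in-sample PnL :", strategy_pnl(train, *best))
print("out-of-sample :", strategy_pnl(test, *best))       # typically far worse
```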

  14. Moving toward the automation of the systematic review process: a summary of discussions at the second meeting of International Collaboration for the Automation of Systematic Reviews (ICASR).

    PubMed

    O'Connor, Annette M; Tsafnat, Guy; Gilbert, Stephen B; Thayer, Kristina A; Wolfe, Mary S

    2018-01-09

    The second meeting of the International Collaboration for Automation of Systematic Reviews (ICASR) was held 3-4 October 2016 in Philadelphia, Pennsylvania, USA. ICASR is an interdisciplinary group whose aim is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. Having automated tools for systematic review should enable more transparent and timely review, maximizing the potential for identifying and translating research findings to practical application. The meeting brought together multiple stakeholder groups including users of summarized research, methodologists who explore production processes and systematic review quality, and technologists such as software developers, statisticians, and vendors. This diversity of participants was intended to ensure effective communication with numerous stakeholders about progress toward automation of systematic reviews and stimulate discussion about potential solutions to identified challenges. The meeting highlighted challenges, both simple and complex, and raised awareness among participants about ongoing efforts by various stakeholders. An outcome of this forum was to identify several short-term projects that participants felt would advance the automation of tasks in the systematic review workflow including (1) fostering better understanding about available tools, (2) developing validated datasets for testing new tools, (3) determining a standard method to facilitate interoperability of tools such as through an application programming interface or API, and (4) establishing criteria to evaluate the quality of tools' output. ICASR 2016 provided a beneficial forum to foster focused discussion about tool development and resources and reconfirm ICASR members' commitment toward systematic reviews' automation.

  15. In Silico PCR Tools for a Fast Primer, Probe, and Advanced Searching.

    PubMed

    Kalendar, Ruslan; Muterko, Alexandr; Shamekova, Malika; Zhambakin, Kabyl

    2017-01-01

    The polymerase chain reaction (PCR) is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. The principle of this technique has been further used and applied in plenty of other simple or complex nucleic acid amplification technologies (NAAT). In parallel to laboratory "wet bench" experiments for nucleic acid amplification technologies, in silico or virtual (bioinformatics) approaches have been developed, among which is in silico PCR analysis. In silico NAAT analysis is a useful and efficient complementary method to ensure the specificity of primers or probes for an extensive range of PCR applications, including homology gene discovery, molecular diagnosis, DNA fingerprinting, and repeat searching. Predicting the sensitivity and specificity of primers and probes requires a search to determine whether they match a database with an optimal number of mismatches, similarity, and stability. In the development of in silico bioinformatics tools for nucleic acid amplification technologies, the prospects for the development of new NAAT or similar approaches should be taken into account, including forward-looking and comprehensive analysis that is not limited to only one PCR technique variant. The software FastPCR and the online Java web tool are integrated tools for in silico PCR of linear and circular DNA, multiple primer or probe searches in large or small databases, and advanced searches. These tools are suitable for processing of batch files that are essential for automation when working with large amounts of data. The FastPCR software is available for download at http://primerdigital.com/fastpcr.html and the online Java version at http://primerdigital.com/tools/pcr.html .
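
    At its core, an in silico primer search slides each primer along the template and reports binding sites within a mismatch budget; real tools such as FastPCR additionally score duplex stability and 3'-end effects. A toy version:

```python
def primer_sites(template, primer, max_mismatches=1):
    """Return (position, mismatches) for every window matching within the budget."""
    sites = []
    for i in range(len(template) - len(primer) + 1):
        window = template[i:i + len(primer)]
        mismatches = sum(a != b for a, b in zip(primer, window))
        if mismatches <= max_mismatches:
            sites.append((i, mismatches))
    return sites

template = "ATGCGTACGTTAGCATGCGTACGATCGATGCGAACGTT"
print(primer_sites(template, "ATGCGTACG"))   # exact and near-exact hits
```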

  16. Mining peripheral arterial disease cases from narrative clinical notes using natural language processing.

    PubMed

    Afzal, Naveed; Sohn, Sunghwan; Abram, Sara; Scott, Christopher G; Chaudhry, Rajeev; Liu, Hongfang; Kullo, Iftikhar J; Arruda-Olson, Adelaide M

    2017-06-01

    Lower extremity peripheral arterial disease (PAD) is highly prevalent and affects millions of individuals worldwide. We developed a natural language processing (NLP) system for automated ascertainment of PAD cases from clinical narrative notes and compared the performance of the NLP algorithm with billing code algorithms, using ankle-brachial index test results as the gold standard. We compared the performance of the NLP algorithm to (1) results of gold standard ankle-brachial index; (2) previously validated algorithms based on relevant International Classification of Diseases, Ninth Revision diagnostic codes (simple model); and (3) a combination of International Classification of Diseases, Ninth Revision codes with procedural codes (full model). A dataset of 1569 patients with PAD and controls was randomly divided into training (n = 935) and testing (n = 634) subsets. We iteratively refined the NLP algorithm in the training set including narrative note sections, note types, and service types, to maximize its accuracy. In the testing dataset, when compared with both simple and full models, the NLP algorithm had better accuracy (NLP, 91.8%; full model, 81.8%; simple model, 83%; P < .001), positive predictive value (NLP, 92.9%; full model, 74.3%; simple model, 79.9%; P < .001), and specificity (NLP, 92.5%; full model, 64.2%; simple model, 75.9%; P < .001). A knowledge-driven NLP algorithm for automatic ascertainment of PAD cases from clinical notes had greater accuracy than billing code algorithms. Our findings highlight the potential of NLP tools for rapid and efficient ascertainment of PAD cases from electronic health records to facilitate clinical investigation and eventually improve care by clinical decision support. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. A novel tool to standardize rheology testing of molten polymers for pharmaceutical applications.

    PubMed

    Treffer, Daniel; Troiss, Alexander; Khinast, Johannes

    2015-11-10

    Melt rheology provides information about material properties that are of great importance for equipment design and simulations, especially for novel pharmaceutical manufacturing operations, including extrusion, injection molding or 3d printing. To that end, homogeneous samples must be prepared, most commonly via compression or injection molding, both of which require costly equipment and might not be applicable for shear- and heat-sensitive pharmaceutical materials. Our study introduces a novel vacuum compression molding (VCM) tool for simple preparation of thermoplastic specimens using standard laboratory equipment: a hot plate and a vacuum source. Sticking is eliminated by applying polytetrafluoroethylene (PTFE) coated separation foils. The evacuation of the tool leads to compression of the sample chamber, which is cost-efficient compared to conventional methods, such as compression molding or injection molding that require special equipment. In addition, this compact design reduces the preparation time and the heat load. The VCM tool was used to prepare samples for a rheological study of three pharmaceutical polymers (Soluplus(®), Eudragit(®)E, EVA Rowalit(®) 300-1/28). The prepared samples were without any air inclusions or voids, and the measurements had a high reproducibility. All relative standard deviations were below 3%. The obtained data were fitted to the Carreau-Yasuda model and time-temperature superposition was applied. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. The Multisensory Attentional Consequences of Tool Use: A Functional Magnetic Resonance Imaging Study

    PubMed Central

    Holmes, Nicholas P.; Spence, Charles; Hansen, Peter C.; Mackay, Clare E.; Calvert, Gemma A.

    2008-01-01

    Background Tool use in humans requires that multisensory information is integrated across different locations, from objects seen to be distant from the hand, but felt indirectly at the hand via the tool. We tested the hypothesis that using a simple tool to perceive vibrotactile stimuli results in the enhanced processing of visual stimuli presented at the distal, functional part of the tool. Such a finding would be consistent with a shift of spatial attention to the location where the tool is used. Methodology/Principal Findings We tested this hypothesis by scanning healthy human participants' brains using functional magnetic resonance imaging, while they used a simple tool to discriminate between target vibrations, accompanied by congruent or incongruent visual distractors, on the same or opposite side to the tool. The attentional hypothesis was supported: BOLD response in occipital cortex, particularly in the right hemisphere lingual gyrus, varied significantly as a function of tool position, increasing contralaterally, and decreasing ipsilaterally to the tool. Furthermore, these modulations occurred despite the fact that participants were repeatedly instructed to ignore the visual stimuli, to respond only to the vibrotactile stimuli, and to maintain visual fixation centrally. In addition, the magnitude of multisensory (visual-vibrotactile) interactions in participants' behavioural responses significantly predicted the BOLD response in occipital cortical areas that were also modulated as a function of both visual stimulus position and tool position. Conclusions/Significance These results show that using a simple tool to locate and to perceive vibrotactile stimuli is accompanied by a shift of spatial attention to the location where the functional part of the tool is used, resulting in enhanced processing of visual stimuli at that location, and decreased processing at other locations. This was most clearly observed in the right hemisphere lingual gyrus. Such modulations of visual processing may reflect the functional importance of visuospatial information during human tool use. PMID:18958150

  19. The productivity and cost-efficiency of models for involving nurse practitioners in primary care: a perspective from queueing analysis.

    PubMed

    Liu, Nan; D'Aunno, Thomas

    2012-04-01

    To develop simple stylized models for evaluating the productivity and cost-efficiencies of different practice models for involving nurse practitioners (NPs) in primary care, and in particular to generate insights on what affects the performance of these models and how. The productivity of a practice model is defined as the maximum number of patients that can be accounted for by the model under a given timeliness-to-care requirement; cost-efficiency is measured by the corresponding annual cost per patient in that model. Appropriate queueing analysis is conducted to generate formulas and values for these two performance measures. Model parameters for the analysis are extracted from the previous literature and survey reports. Sensitivity analysis is conducted to investigate the model performance under different scenarios and to verify the robustness of findings. Employing an NP, whose salary is usually lower than that of a primary care physician, may not be cost-efficient, in particular when the NP's capacity is underutilized. Besides provider service rates, workload allocation among providers is one of the most important determinants of the cost-efficiency of a practice model involving NPs. Capacity pooling among providers could be a helpful strategy to improve efficiency in care delivery. The productivity and cost-efficiency of a practice model depend heavily on how providers organize their work and on a variety of other factors related to the practice environment. Queueing theory provides useful tools to take these factors into account in making strategic decisions on staffing and panel size selection for a practice model. © Health Research and Educational Trust.
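
    A back-of-envelope version of the queueing logic treats one provider as an M/M/1 queue, where the mean queueing wait is Wq = rho / (mu - lambda), and finds the largest demand rate compatible with a timeliness target. All rates below are hypothetical:

```python
def mm1_mean_wait(arrival_rate, service_rate):
    """Mean time in queue for an M/M/1 system: Wq = rho / (mu - lambda)."""
    if arrival_rate >= service_rate:
        return float("inf")   # unstable: demand exceeds capacity
    rho = arrival_rate / service_rate
    return rho / (service_rate - arrival_rate)

service_rate = 4.0   # visits per hour one provider completes; hypothetical
target_wait = 0.25   # timeliness requirement: mean wait of at most 15 minutes

lam = 0.0
while mm1_mean_wait(lam + 0.01, service_rate) <= target_wait:
    lam += 0.01
print(f"max sustainable demand for one provider: about {lam:.2f} visits/hour")
```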

  20. Influence of the Numerical Scheme on the Solution Quality of the SWE for Tsunami Numerical Codes: The Tohoku-Oki, 2011Example.

    NASA Astrophysics Data System (ADS)

    Reis, C.; Clain, S.; Figueiredo, J.; Baptista, M. A.; Miranda, J. M. A.

    2015-12-01

    Numerical tools turn out to be very important for scenario evaluations of hazardous phenomena such as tsunamis. Nevertheless, the predictions depend highly on the quality of the numerical tool, and the design of efficient numerical schemes still receives considerable attention in order to provide robust and accurate solutions. In this study we propose a comparative study of the efficiency of two finite volume numerical codes with second-order discretization, implemented with different methods to solve the non-conservative shallow water equations: the MUSCL (Monotonic Upstream-Centered Scheme for Conservation Laws) and the MOOD (Multi-dimensional Optimal Order Detection) methods, which optimize the accuracy of the approximation as a function of the local smoothness of the solution. The MUSCL scheme is based on a priori criteria, where the limiting procedure is performed before updating the solution to the next time step, leading to unnecessary accuracy reduction. On the contrary, the new MOOD technique uses a posteriori detectors to prevent the solution from oscillating in the vicinity of discontinuities. Indeed, a candidate solution is computed, and corrections are performed only for the cells where non-physical oscillations are detected. Using a simple one-dimensional analytical benchmark, 'Single wave on a sloping beach', we show that the classical 1D shallow-water system can be accurately solved with the finite volume method equipped with the MOOD technique, which provides a better approximation with sharper shocks and less numerical diffusion. For the code validation, we also use the Tohoku-Oki 2011 tsunami and reproduce two DART records, demonstrating that the quality of the solution may deeply interfere with the scenario one can assess. This work is funded by the Portugal-France research agreement, through the research project GEONUM FCT-ANR/MAT-NAN/0122/2012.
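
    The contrast between the two philosophies shows up already in one slope-reconstruction step: MUSCL limits every slope a priori (here with minmod), while a MOOD-style step accepts an unlimited candidate and flags only the cells where an a posteriori detector fires. A one-dimensional sketch with made-up cell averages and a simplified discrete-maximum-principle detector:

```python
import numpy as np

def minmod(a, b):
    """Zero across sign changes, otherwise the argument smaller in magnitude."""
    return np.where(a * b <= 0, 0.0, np.where(np.abs(a) < np.abs(b), a, b))

u = np.array([0.0, 0.0, 1.0, 1.0, 1.0, 0.3, 0.0])   # made-up cell averages
left = u[1:-1] - u[:-2]
right = u[2:] - u[1:-1]

muscl_slope = minmod(left, right)        # a priori: limited everywhere
candidate_slope = 0.5 * (left + right)   # unlimited candidate a MOOD step would try

# Simplified a posteriori detector: flag cells whose candidate face values
# leave the range spanned by their neighbours; only those get recomputed.
hi = np.maximum(u[:-2], u[2:])
lo = np.minimum(u[:-2], u[2:])
flagged = (u[1:-1] + 0.5 * candidate_slope > hi) | (u[1:-1] - 0.5 * candidate_slope < lo)

print("MUSCL slopes:", muscl_slope)
print("MOOD flags  :", flagged)
```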

  1. Hybridization chain reaction: a versatile molecular tool for biosensing, bioimaging, and biomedicine.

    PubMed

    Bi, Sai; Yue, Shuzhen; Zhang, Shusheng

    2017-07-17

    Developing powerful, simple and low-cost DNA amplification techniques is of great significance to bioanalysis and biomedical research. Thus far, many signal amplification strategies have been developed, such as polymerase chain reaction (PCR), rolling circle amplification (RCA), and DNA strand displacement amplification (SDA). In particular, hybridization chain reaction (HCR), a type of toehold-mediated strand displacement (TMSD) reaction, has attracted great interest because of its enzyme-free nature, isothermal conditions, simple protocols, and excellent amplification efficiency. In a typical HCR, an analyte initiates the cross-opening of two DNA hairpins, yielding nicked double helices that are analogous to alternating copolymers. As an efficient amplification platform, HCR has been utilized for the sensitive detection of a wide variety of analytes, including nucleic acids, proteins, small molecules, and cells. In recent years, more complicated sets of monomers have been designed to develop nonlinear HCR, such as branched HCR and even dendritic systems, achieving quadratic and exponential growth mechanisms. In addition, HCR has attracted enormous attention in the fields of bioimaging and biomedicine, including applications in fluorescence in situ hybridization (FISH) imaging, live cell imaging, and targeted drug delivery. In this review, we introduce the fundamentals of HCR and examine the visualization and analysis techniques for HCR products in detail. The most recent HCR developments in biosensing, bioimaging, and biomedicine are subsequently discussed with selected examples. Finally, the review provides insight into the challenges and future perspectives of HCR.

  2. Collaboratively Conceived, Designed and Implemented: Matching Visualization Tools with Geoscience Data Collections and Geoscience Data Collections with Visualization Tools via the ToolMatch Service.

    NASA Astrophysics Data System (ADS)

    Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.

    2014-12-01

    Two problems common to many geoscience domains are the difficulties in finding tools to work with a given dataset collection, and conversely, the difficulties in finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) has gotten together to design and create a web service, called ToolMatch, to address these problems. The team began their efforts by defining an initial, relatively simple conceptual model that addressed the two use cases briefly described above. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and utilizing standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System) and DCAT (Data Catalog Vocabulary). The ToolMatch service will be taking advantage of various Semantic Web and Web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language) and SPARQL (SPARQL Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test and use the service and incorporate it into their own web pages, tools and/or services, including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.

  3. Burden Calculator: a simple and open analytical tool for estimating the population burden of injuries.

    PubMed

    Bhalla, Kavi; Harrison, James E

    2016-04-01

    Burden of disease and injury methods can be used to summarise and compare the effects of conditions in terms of disability-adjusted life years (DALYs). Burden estimation methods are not inherently complex. However, as commonly implemented, the methods include complex modelling and estimation. To provide a simple and open-source software tool that allows estimation of incidence-DALYs due to injury, given data on incidence of deaths and non-fatal injuries. The tool includes a default set of estimation parameters, which can be replaced by users. The tool was written in Microsoft Excel. All calculations and values can be seen and altered by users. The parameter sets currently used in the tool are based on published sources. The tool is available without charge online at http://calculator.globalburdenofinjuries.org. To use the tool with the supplied parameter sets, users need to only paste a table of population and injury case data organised by age, sex and external cause of injury into a specified location in the tool. Estimated DALYs can be read or copied from tables and figures in another part of the tool. In some contexts, a simple and user-modifiable burden calculator may be preferable to undertaking a more complex study to estimate the burden of disease. The tool and the parameter sets required for its use can be improved by user innovation, by studies comparing DALYs estimates calculated in this way and in other ways, and by shared experience of its use. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
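
    The incidence-DALY arithmetic the calculator automates is, for each age-sex-cause cell, DALY = YLL + YLD, with YLL = deaths x remaining life expectancy and YLD = incident cases x disability weight x average duration. A sketch with hypothetical inputs (the tool's default parameter sets supply the weights and durations):

```python
# One age-sex-cause cell; every number below is a hypothetical placeholder.
deaths = 120
life_expectancy_at_death = 45.0   # remaining years, from a standard life table

nonfatal_cases = 3000
disability_weight = 0.07          # severity on the 0 (healthy) to 1 (death) scale
avg_duration_years = 0.5

yll = deaths * life_expectancy_at_death                        # years of life lost
yld = nonfatal_cases * disability_weight * avg_duration_years  # years lived with disability
print(f"YLL = {yll:.0f}, YLD = {yld:.0f}, DALYs = {yll + yld:.0f}")
```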

  4. Electronic Identities: The Strategic Use of Email for Impression Management.

    ERIC Educational Resources Information Center

    Kersten, Larry; Phillips, Stephen R.

    Traditionally, e-mail (electronic mail) has been seen as an efficient communications medium for the transmission of simple, routine, unambiguous messages. More recent research has argued that the simple, efficient view of e-mail is incomplete. Future research should be extended into the strategic and symbolic functions of email, such as the use of…

  5. Effect of steam addition on cycle performance of simple and recuperated gas turbines

    NASA Technical Reports Server (NTRS)

    Boyle, R. J.

    1979-01-01

    Results are presented for the cycle efficiency and specific power of simple and recuperated gas turbine cycles in which steam is generated and used to increase turbine flow. Calculations showed significant improvements in cycle efficiency and specific power by adding steam. The calculations were made using component efficiencies and loss assumptions typical of stationary powerplants. These results are presented for a range of operating temperatures and pressures. Relative heat exchanger size and the water use rate are also examined.

  6. Rocks in Our Pockets

    ERIC Educational Resources Information Center

    Plummer, Donna; Kuhlman, Wilma

    2005-01-01

    To introduce students to rocks and their characteristics, teachers can begin rock units with the activities described in this article. Students need the ability to make simple observations using their senses and simple tools.

  7. On the use of satellite data to implement a parsimonious ecohydrological model in the upper Ewaso Ngiro river basin

    NASA Astrophysics Data System (ADS)

    Ruiz-Pérez, G.

    2015-12-01

    Drylands are extensive, covering 30% of the Earth's land surface and 50% of Africa. Projections of the IPCC (Intergovernmental Panel on Climate Change, 2007) indicate that the extent of these regions is likely to increase, with a considerable additional impact on water resources, which should be taken into account by water management plans. In these water-controlled areas, vegetation plays a key role in the water cycle. Ecohydrological models provide a tool to investigate the relationships between vegetation and water resources. However, studies in Africa often face the problem that many ecohydrological models have quite extensive parametrical requirements, while available data are scarce. Therefore, there is a need for assessments using models whose requirements match the data availability. In that context, parsimonious models, together with available remote sensing information, can be valuable tools for ecohydrological studies. For this reason, we have focused on the use of a parsimonious model based on the amount of photosynthetically active radiation absorbed by green vegetation (APAR) and the Light Use Efficiency index (the efficiency with which that radiation is converted to plant biomass increment) in order to compute the gross primary production (GPP). This model has been calibrated using only remote sensing data (particularly, NDVI data from MODIS products) in order to explore the potential of satellite information in implementing a simple distributed model. The model has subsequently been validated against streamflow data, with the aim of defining a tool able to account for land-use characteristics in describing the water budget. Results are promising for studies aimed at describing the consequences of ongoing land-use changes on water resources.
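
    The parsimonious production model is essentially one line, GPP = epsilon x APAR, with APAR derived from satellite NDVI. The linear fAPAR-NDVI relation and all coefficients below are common approximations, not the study's calibrated values:

```python
def gpp(ndvi, par, epsilon=1.8):
    """Gross primary production (g C/m2/day) from NDVI and incident PAR (MJ/m2/day)."""
    fapar = max(0.0, min(1.0, 1.24 * ndvi - 0.168))  # fraction of PAR absorbed; approximate
    apar = fapar * par                               # absorbed photosynthetically active radiation
    return epsilon * apar                            # light use efficiency epsilon, g C/MJ

print(gpp(ndvi=0.62, par=10.0))   # one pixel, one day
```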

  8. CRISPR therapeutic tools for complex genetic disorders and cancer (Review)

    PubMed Central

    Baliou, Stella; Adamaki, Maria; Kyriakopoulos, Anthony M.; Spandidos, Demetrios A.; Panayiotidis, Mihalis; Christodoulou, Ioannis; Zoumpourlis, Vassilis

    2018-01-01

    One of the fundamental discoveries in the field of biology is the ability to modulate the genome and to monitor the functional outputs derived from genomic alterations. In order to unravel new therapeutic options, scientists had initially focused on inducing genetic alterations in primary cells, in established cancer cell lines and mouse models using either RNA interference or cDNA overexpression or various programmable nucleases [zinc finger nucleases (ZFN), transcription activator-like effector nucleases (TALEN)]. Even though a huge volume of data was produced, its use was neither cheap nor accurate. Therefore, the clustered regularly interspaced short palindromic repeats (CRISPR) system was evidenced to be the next step in genome engineering tools. CRISPR-associated protein 9 (Cas9)-mediated genetic perturbation is simple, precise and highly efficient, empowering researchers to apply this method to immortalized cancerous cell lines, primary cells derived from mouse and human origins, xenografts, induced pluripotent stem cells, organoid cultures, as well as the generation of genetically engineered animal models. In this review, we assess the development of the CRISPR system and its therapeutic applications to a wide range of complex diseases (particularly distinct tumors), aiming at personalized therapy. Special emphasis is given to organoids and CRISPR screens in the design of innovative therapeutic approaches. Overall, the CRISPR system is regarded as an eminent genome engineering tool in therapeutics. We envision a new era in cancer biology during which the CRISPR-based genome engineering toolbox will serve as the fundamental conduit between the bench and the bedside; nonetheless, certain obstacles need to be addressed, such as the eradication of side-effects, maximization of efficiency, the assurance of delivery and the elimination of immunogenicity. PMID:29901119

  9. Highly efficient single-layer dendrimer light-emitting diodes with balanced charge transport

    NASA Astrophysics Data System (ADS)

    Anthopoulos, Thomas D.; Markham, Jonathan P. J.; Namdas, Ebinazar B.; Samuel, Ifor D. W.; Lo, Shih-Chun; Burn, Paul L.

    2003-06-01

    High-efficiency, single-layer, solution-processed green light-emitting diodes based on a phosphorescent dendrimer are demonstrated. A peak external quantum efficiency of 10.4% (35 cd/A) was measured for a first-generation fac-tris(2-phenylpyridine) iridium cored dendrimer when blended with 4,4'-bis(N-carbazolyl)biphenyl and electron-transporting 1,3,5-tris(2-N-phenylbenzimidazolyl)benzene at 8.1 V. A maximum power efficiency of 12.8 lm/W was also measured at 8.1 V and 550 cd/m2. These results indicate that, by simple blending of bipolar and electron-transporting molecules, highly efficient light-emitting diodes can be made with a very simple device structure.
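
    As a rough consistency check on the reported figures: for an ideal Lambertian emitter the power efficiency follows from the current efficiency and the drive voltage, so

      \eta_P = \frac{\pi\,\eta_L}{V} = \frac{\pi \times 35\ \mathrm{cd/A}}{8.1\ \mathrm{V}} \approx 13.6\ \mathrm{lm/W},

    slightly above the measured 12.8 lm/W, as expected for a real (not perfectly Lambertian) emission profile.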

  10. Nutrition screening tools: an analysis of the evidence.

    PubMed

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and 4 tools (the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST)) received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  11. BEST Winery Guidebook: Benchmarking and Energy and Water SavingsTool for the Wine Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galitsky, Christina; Worrell, Ernst; Radspieler, Anthony

    2005-10-15

    Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. This lack of knowledge of energy efficiency opportunities is an important barrier to improving efficiency. Benchmarking has been demonstrated to help energy users understand energy use and the potential for energy efficiency improvement, reducing the information barrier. In California, the wine industry is not only one of the pillars of the economy; it is also a large energy consumer, with considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed an integrated benchmarking and self-assessment tool for the California wine industry called BEST (Benchmarking and Energy and water Savings Tool) Winery. BEST Winery enables a winery to compare its energy efficiency to a best-practice winery, accounting for differences in product mix and other characteristics of the winery. The tool enables the user to evaluate the impact of implementing energy and water efficiency measures, and facilitates strategic planning of efficiency measures based on their estimated impact, costs and savings. BEST Winery is available as a software tool in an Excel environment. This report serves as background material, documenting assumptions and information on the included energy and water efficiency measures. It also serves as a user guide for the software package.

  12. Simple SPION Incubation as an Efficient Intracellular Labeling Method for Tracking Neural Progenitor Cells Using MRI

    PubMed Central

    Jayaseema, D. M.; Lai, Jiann-Shiun; Hueng, Dueng-Yuan; Chang, Chen

    2013-01-01

    Cellular magnetic resonance imaging (MRI) has been well-established for tracking neural progenitor cells (NPC). Superparamagnetic iron oxide nanoparticles (SPIONs) approved for clinical application are the most common agents used for labeling. Conventionally, transfection agents (TAs) were added with SPIONs to facilitate cell labeling because SPIONs in the native unmodified form were deemed inefficient for intracellular labeling. However, compelling evidence also shows that simple SPION incubation is not invariably ineffective. The labeling efficiency can be improved by prolonged incubation and elevated iron doses. The goal of the present study was to establish simple SPION incubation as an efficient intracellular labeling method. To this end, NPCs derived from the neonatal subventricular zone were incubated with SPIONs (Feridex®) and then evaluated in vitro with regard to the labeling efficiency and biological functions. The results showed that, following 48 hours of incubation at 75 µg/ml, nearly all NPCs exhibited visible SPION intake. Evidence from light microscopy, electron microscopy, chemical analysis, and magnetic resonance imaging confirmed the effectiveness of the labeling. Additionally, biological assays showed that the labeled NPCs exhibited unaffected viability, oxidative stress, apoptosis and differentiation. In the demonstrated in vivo cellular MRI experiment, the hypointensities representing the SPION labeled NPCs remained observable throughout the entire tracking period. The findings indicate that simple SPION incubation without the addition of TAs is an efficient intracellular magnetic labeling method. This simple approach may be considered as an alternative approach to the mainstream labeling method that involves the use of TAs. PMID:23468856

  13. A Simple and Robust Method for Culturing Human-Induced Pluripotent Stem Cells in an Undifferentiated State Using Botulinum Hemagglutinin.

    PubMed

    Kim, Mee-Hae; Matsubara, Yoshifumi; Fujinaga, Yukako; Kino-Oka, Masahiro

    2018-02-01

    Clinical and industrial applications of human-induced pluripotent stem cells (hiPSCs) are hindered by the lack of robust culture strategies capable of sustaining a culture in an undifferentiated state. Here, a simple and robust hiPSC culture-propagation strategy is developed that incorporates botulinum hemagglutinin (HA)-mediated selective removal of cells deviating from the undifferentiated state. After HA treatment, cell-cell adhesion is disrupted and deviating cells detach from the central region of the colony, which subsequently forms tight monolayer colonies upon prolonged incubation. The authors find that the temporal and dose-dependent activity of HA regulates the removal of deviating cells and the recoverability of hiPSC colonies after disruption of cell-cell adhesion. The effects of HA are confirmed under all culture conditions examined, regardless of hiPSC line and of feeder-dependent or feeder-free conditions. After routine application of this HA-treatment paradigm over serial passages, hiPSCs maintain expression of pluripotency markers and readily form embryoid bodies expressing markers of all three germ layers. This method enables highly efficient culturing of hiPSCs and the use of entire undifferentiated portions without having to pick deviating cells manually. This simple and readily reproducible culture strategy is a potentially useful tool for improving the robust and scalable maintenance of undifferentiated hiPSC cultures. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A simple but highly efficient multi-formyl phenol-amine system for fluorescence detection of peroxide explosive vapour.

    PubMed

    Xu, Wei; Fu, Yanyan; Gao, Yixun; Yao, Junjun; Fan, Tianchi; Zhu, Defeng; He, Qingguo; Cao, Huimin; Cheng, Jiangong

    2015-07-11

    A simple, highly stable, sensitive and selective fluorescent system for detecting peroxide explosives was developed via an aromatic aldehyde oxidation reaction. The high efficiency arises from the system's higher HOMO level and multiple H-bonding. Detection limits of 0.1 ppt for H2O2 and 0.2 ppb for TATP were obtained.

  15. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
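
    A minimal sketch of the estimator combination named above (DerSimonian-Laird between-study variance with the Knapp-Hartung adjustment for the confidence interval); this illustrates the standard formulas and is not Meta-Essentials' own code:

      import numpy as np
      from scipy import stats

      def dl_knapp_hartung(y, v, alpha=0.05):
          """y: per-study effect sizes; v: their within-study variances."""
          y, v = np.asarray(y, float), np.asarray(v, float)
          k = len(y)
          w = 1.0 / v                                # fixed-effect weights
          mu_fe = np.sum(w * y) / np.sum(w)
          Q = np.sum(w * (y - mu_fe) ** 2)           # Cochran's Q
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (Q - (k - 1)) / c)         # DL between-study variance
          w_re = 1.0 / (v + tau2)
          mu = np.sum(w_re * y) / np.sum(w_re)       # random-effects estimate
          # Knapp-Hartung: rescaled variance and a t(k-1) reference distribution
          se = np.sqrt(np.sum(w_re * (y - mu) ** 2) / ((k - 1) * np.sum(w_re)))
          t = stats.t.ppf(1 - alpha / 2, k - 1)
          return mu, (mu - t * se, mu + t * se)

      print(dl_knapp_hartung([0.2, 0.5, 0.35], [0.04, 0.06, 0.05]))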

  16. The use of video capture virtual reality in burn rehabilitation: the possibilities.

    PubMed

    Haik, Josef; Tessone, Ariel; Nota, Ayala; Mendes, David; Raz, Liat; Goldan, Oren; Regev, Elli; Winkler, Eyal; Mor, Elisheva; Orenstein, Arie; Hollombe, Ilana

    2006-01-01

    We independently explored the use of the Sony PlayStation II EyeToy (Sony Corporation, Foster City, CA) as a tool for the rehabilitation of patients with severe burns. Intensive occupational and physical therapy is crucial in minimizing and preventing long-term disability for the burn patient; however, the therapist faces a difficult challenge combating the agonizing pain experienced by the patient during therapy. The Sony PlayStation II EyeToy is a projected, video-capture system that, although initially developed as a gaming environment for children, may be useful in a rehabilitative context. Compared with other virtual reality systems, the EyeToy is an efficient rehabilitation tool that is sold commercially at relatively low cost. This report presents the potential advantages of using the EyeToy as an innovative rehabilitative tool with mitigating effects on pain in burn rehabilitation. This new technology represents a challenging and motivating way for patients to immerse themselves in an alternate reality while undergoing treatment, thereby reducing the pain and discomfort they experience. This simple, affordable technique may prove to heighten the level of patient cooperation and therefore speed the process of rehabilitation and the return of functional ability.

  17. Evaluation of Arthrobacter aurescens Strain TC1 as Bioaugmentation Bacterium in Soils Contaminated with the Herbicidal Substance Terbuthylazine

    PubMed Central

    Silva, Vera P.; Moreira-Santos, Matilde; Mateus, Carla; Teixeira, Tânia; Ribeiro, Rui; Viegas, Cristina A.

    2015-01-01

    In recent years, the chloro-s-triazine active substance terbuthylazine has been used increasingly as an herbicide and may leave residues of concern in the environment. The present study aimed at developing a bioaugmentation tool based on the soil bacterium Arthrobacter aurescens strain TC1 for the remediation of terbuthylazine-contaminated soils and at examining its efficacy for both the soil and aquatic compartments. First, it was shown that inocula of the bioaugmentation bacterium can be grown on simple sole nitrogen sources (ammonium and nitrate) instead of atrazine while maintaining their efficiency in biodegrading terbuthylazine. Subsequently, quick (3-day) and successful bioremediation by ammonium-grown A. aurescens TC1 cells was demonstrated in a natural soil freshly spiked, or aged for four months, with commercial terbuthylazine at a dose 10× higher than that recommended in corn cultivation, to mimic spill situations. Ecotoxicity assessment of the soil eluates towards a freshwater microalga supported the effectiveness of the bioaugmentation tool. The results highlight the potential to decontaminate soil while minimizing the amount of terbuthylazine reaching aquatic compartments via the soil-water pathway. The usefulness of this bioaugmentation tool in providing rapid environmental decontamination is particularly relevant in the event of accidental high herbicide contamination. Its limitations and advantages are discussed. PMID:26662024

  18. Mutational Signatures in Cancer (MuSiCa): a web application to implement mutational signatures analysis in cancer samples.

    PubMed

    Díaz-Gay, Marcos; Vila-Casadesús, Maria; Franch-Expósito, Sebastià; Hernández-Illán, Eva; Lozano, Juan José; Castellví-Bel, Sergi

    2018-06-14

    Mutational signatures have proven to be a valuable pattern in somatic genomics, particularly in cancer, with potential application as a biomarker in clinical practice. Several bioinformatic packages addressing this topic have been developed in different languages/platforms. MutationalPatterns has emerged as the most efficient tool for comparison with the signatures currently reported in the Catalogue of Somatic Mutations in Cancer (COSMIC) database. However, the analysis of mutational signatures is currently restricted to a small community of bioinformatics experts. In this work we present Mutational Signatures in Cancer (MuSiCa), a new web tool based on MutationalPatterns and built using the Shiny framework in the R language. By means of a simple interface suited to non-specialized researchers, it provides a comprehensive analysis of the somatic mutational status of the supplied cancer samples. It permits characterizing the profile and burden of mutations, as well as quantifying COSMIC-reported mutational signatures. It also allows classifying samples according to the above signature contributions. MuSiCa is a helpful web application for characterizing mutational signatures in cancer samples. It is accessible online at http://bioinfo.ciberehd.org/GPtoCRC/en/tools.html and the source code is freely available at https://github.com/marcos-diazg/musica.
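
    One common way to quantify COSMIC signature contributions in this family of tools is non-negative least squares refitting of a sample's 96-channel mutation catalogue against the signature matrix; a hedged sketch of that general approach (array shapes and names are assumptions, not MuSiCa's API):

      import numpy as np
      from scipy.optimize import nnls

      def signature_contributions(catalogue, signatures):
          """catalogue: (96,) mutation counts for one sample;
          signatures: (96, n_sigs) COSMIC signature probabilities.
          Returns relative per-signature exposures."""
          exposures, _residual = nnls(signatures, catalogue)
          return exposures / exposures.sum()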

  19. Simulation of laser generated ultrasound with application to defect detection

    NASA Astrophysics Data System (ADS)

    Pantano, A.; Cerniglia, D.

    2008-06-01

    Laser-generated ultrasound holds substantial promise as a tool for defect detection in remote inspection thanks to its ability to produce frequencies in the MHz range, enabling fine spatial resolution of defects. Despite the potential impact of laser-generated ultrasound in many areas of science and industry, robust tools for studying the phenomenon are lacking, which limits the design and optimization of non-destructive testing and evaluation techniques. The propagation of laser-generated ultrasound in complex structures is an intricate phenomenon and extremely hard to analyze; only simple geometries can be studied analytically. Numerical techniques found in the literature have proved to be limited in their applicability by the MHz-range frequencies and very short wavelengths involved. The objective of this research is to show that by using an explicit integration rule together with diagonal element mass matrices, instead of the almost universally adopted implicit integration rule, to integrate the equations of motion in a dynamic analysis, it is possible to solve ultrasound wave propagation problems with MHz-range frequencies travelling in relatively large bodies both efficiently and accurately. Results presented for NDE testing of rails demonstrate that the proposed FE technique can provide a valuable tool for studying the propagation of laser-generated ultrasound.
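
    A toy 1D illustration of the advocated scheme (explicit central-difference time stepping, where a diagonal mass matrix means no linear solve per step); the material values, grid and source are placeholders, not the rail model from the paper:

      import numpy as np

      E, rho = 210e9, 7850.0              # steel-like placeholders (Pa, kg/m3)
      c = np.sqrt(E / rho)                # wave speed
      n, L = 2000, 1.0
      dx = L / (n - 1)
      dt = 0.9 * dx / c                   # CFL-limited stable time step
      u = np.zeros(n)
      u_prev = np.zeros(n)
      u[0] = 1e-9                         # impulsive (laser-like) surface source
      for _ in range(1500):
          lap = np.zeros(n)
          lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
          u, u_prev = 2.0 * u - u_prev + (c * dt) ** 2 * lap, u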

  20. Simple debugging techniques for embedded subsystems

    NASA Astrophysics Data System (ADS)

    MacPherson, Matthew S.; Martin, Kevin S.

    1990-08-01

    This paper describes some of the tools and methods used for developing and debugging embedded subsystems at Fermilab. Specifically, these tools have been used for the Flying Wire project and are currently being employed for the New TECAR upgrade. The Flying Wire is a subsystem that swings a wire through the beam in order to measure luminosity and beam density distribution, and TECAR (Tevatron excitation controller and regulator) controls the power-supply ramp generation for the superconducting Tevatron accelerator at Fermilab. In both instances the subsystem hardware consists of a VME crate with one or more processors, shared memory and a network connection to the accelerator control system. Two real-time operating systems are currently being used: VRTX for the Flying Wire system, and MTOS for New TECAR. The code that runs in these subsystems is a combination of C and assembler and is developed using the Microtec cross-development tools on a VAX 8650 running VMS. This paper explains how multiple debuggers are used to give the greatest possible flexibility from assembly-level to high-level debugging. Also discussed is how network debugging and network downloading can provide a very effective and efficient means of finding bugs in the subsystem environment. The debuggers used are PROBE1, TRACER and the MTOS debugger.

  1. A new method for studying population genetics of cyst nematodes based on Pool-Seq and genomewide allele frequency analysis.

    PubMed

    Mimee, Benjamin; Duceppe, Marc-Olivier; Véronneau, Pierre-Yves; Lafond-Lapalme, Joël; Jean, Martine; Belzile, François; Bélair, Guy

    2015-11-01

    Cyst nematodes are important agricultural pests responsible for billions of dollars of losses each year. Plant resistance is the most effective management tool, but it requires a close monitoring of population genetics. Current technologies for pathotyping and genotyping cyst nematodes are time-consuming, expensive and imprecise. In this study, we capitalized on the reproduction mode of cyst nematodes to develop a simple population genetic analysis pipeline based on genotyping-by-sequencing and Pool-Seq. This method yielded thousands of SNPs and allowed us to study the relationships between populations of different origins or pathotypes. Validation of the method on well-characterized populations also demonstrated that it was a powerful and accurate tool for population genetics. The genomewide allele frequencies of 23 populations of golden nematode, from nine countries and representing the five known pathotypes, were compared. A clear separation of the pathotypes and fine genetic relationships between and among global populations were obtained using this method. In addition to being powerful, this tool has proven to be very time- and cost-efficient and could be applied to other cyst nematode species. © 2015 Her Majesty the Queen in Right of Canada Molecular Ecology Resources © 2015 John Wiley & Sons Ltd Reproduced with the permission of the Minister of Agriculture and Agri-food.

  2. Color management with a hammer: the B-spline fitter

    NASA Astrophysics Data System (ADS)

    Bell, Ian E.; Liu, Bonny H. P.

    2003-01-01

    To paraphrase Abraham Maslow: if the only tool you have is a hammer, every problem looks like a nail. We have a B-spline fitter customized for 3D color data, and many problems in color management can be solved with this tool. Whereas color devices were once modeled with extensive measurement, look-up tables and trilinear interpolation, recent improvements in hardware have made B-spline models an affordable alternative. Such device characterizations require fewer color measurements than piecewise linear models, and have uses beyond simple interpolation. A B-spline fitter, for example, can act as a filter to remove noise from measurements, leaving a model with guaranteed smoothness. Inversion of the device model can then be carried out consistently and efficiently, as the spline model is well behaved and its derivatives are easily computed. Spline-based algorithms also exist for gamut mapping, the composition of maps, and the extrapolation of a gamut. Trilinear interpolation, a degree-one spline, can still be used after nonlinear spline smoothing for high-speed evaluation with robust convergence. Using data from several color devices, this paper examines the use of B-splines as a generic tool for modeling devices and mapping one gamut to another, and concludes with applications to high-dimensional and spectral data.
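
    A 1D sketch of the smoothing idea (the paper's fitter works on 3D color data): a smoothing spline fitted to noisy device measurements yields a well-behaved model whose derivatives are cheap to evaluate. The tone curve and noise level are invented for illustration:

      import numpy as np
      from scipy.interpolate import splrep, BSpline

      x = np.linspace(0.0, 1.0, 50)                  # device drive levels
      y = x ** 2.2 + np.random.normal(0, 0.01, 50)   # noisy response (placeholder)
      tck = splrep(x, y, s=50 * 0.01 ** 2)           # smoothing spline fit
      model = BSpline(*tck)
      print(model(0.5), model.derivative()(0.5))     # smooth value and derivative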

  3. Benchmarking of density functionals for a soft but accurate prediction and assignment of 1H and 13C NMR chemical shifts in organic and biological molecules.

    PubMed

    Benassi, Enrico

    2017-01-15

    A number of programs and tools that simulate 1H and 13C nuclear magnetic resonance (NMR) chemical shifts using empirical approaches are available. These tools are user-friendly, but they provide a very rough (and sometimes misleading) estimation of the NMR properties, especially for complex systems. Rigorous and reliable ways to predict and interpret the NMR properties of simple and complex systems are available in many popular computational program packages. Nevertheless, experimentalists keep relying on the empirical tools in their daily work because the rigorous quantum mechanical methods need high levels of theory to reach sufficiently high accuracy. An alternative, efficient, semi-empirical approach has been proposed by Bally, Rablen, Tantillo, and coworkers. The idea consists of creating linear calibration models based on the application of different combinations of functionals and basis sets. Following this approach, the predictive capability of a wider range of popular functionals was systematically investigated and tested. The NMR chemical shifts were computed in the solvated phase at the density functional theory level, using 30 different functionals coupled with three different triple-ζ basis sets. © 2016 Wiley Periodicals, Inc.
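
    The linear calibration idea credited to Bally, Rablen, Tantillo, and coworkers amounts to regressing experimental shifts against computed isotropic shieldings for a training set, then applying the fitted line to new computations; a minimal sketch with placeholder values:

      import numpy as np

      # Computed isotropic shieldings (ppm) vs. experimental shifts (ppm)
      # for a calibration set; all numbers are illustrative placeholders.
      sigma_calc = np.array([31.9, 29.5, 25.1, 18.4])
      delta_exp = np.array([0.9, 3.1, 7.3, 13.8])

      slope, intercept = np.polyfit(sigma_calc, delta_exp, 1)

      def predict_shift(sigma):
          """Scaled chemical shift from a computed shielding."""
          return slope * sigma + intercept

      print(predict_shift(27.0))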

  4. Green remediation. Tool for safe and sustainable environment: a review

    NASA Astrophysics Data System (ADS)

    Singh, Mamta; Pant, Gaurav; Hossain, Kaizar; Bhatia, A. K.

    2017-10-01

    Nowadays, the bioremediation of toxic pollutants is a subject of interest in terms of health issues and environmental cleaning. In the present review, an eco-friendly, cost-effective approach to the detoxification of environmental pollutants by means of a natural purifier, i.e., blue-green algae, is discussed as an alternative to conventional methods. Toxic pollutants in industrial wastes cannot be eliminated completely by the existing conventional techniques; in fact, these methods can only change the pollutants' form rather than degrade them entirely. These pollutants have an adverse effect on aquatic life, both fauna and flora, and ultimately harm human life directly or indirectly. The cyanobacterial approach to the removal of these contaminants is an efficient tool for sustainable development and pollution control. Cyanobacteria are primary producers at the base of the food chain: they absorb complex toxic compounds from the environment and convert them to simple nontoxic compounds, which protects higher consumers in the food chain and reduces the risk of pollution. In addition, these organisms are able to address secondary pollution, as they can remediate radioactive compounds and petroleum waste and degrade toxins from pesticides.

  5. A programming language for composable DNA circuits

    PubMed Central

    Phillips, Andrew; Cardelli, Luca

    2009-01-01

    Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing. PMID:19535415
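
    A deliberately toy sketch of the core mechanism the language models (toehold-mediated strand displacement): an input strand matching a gate's exposed toehold and body displaces and releases the gate's output. Strands are reduced here to tuples of domain names, which is far simpler than the actual language semantics:

      def displace(input_strand, gate):
          """gate: dict with 'toehold', 'body' and 'output' domains.
          Returns the released output strand on a match, else None."""
          if input_strand == (gate["toehold"], gate["body"]):
              return gate["output"]
          return None

      gate = {"toehold": "t", "body": "x", "output": ("x", "y")}
      print(displace(("t", "x"), gate))  # ('x', 'y') is released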

  6. Ancestry informative marker sets for determining continental origin and admixture proportions in common populations in America.

    PubMed

    Kosoy, Roman; Nassir, Rami; Tian, Chao; White, Phoebe A; Butler, Lesley M; Silva, Gabriel; Kittles, Rick; Alarcon-Riquelme, Marta E; Gregersen, Peter K; Belmont, John W; De La Vega, Francisco M; Seldin, Michael F

    2009-01-01

    To provide a resource for assessing continental ancestry in a wide variety of genetic studies, we identified, validated, and characterized a set of 128 ancestry informative markers (AIMs). The markers were chosen for informativeness, genome-wide distribution, and genotype reproducibility on two platforms (TaqMan assays and Illumina arrays). We analyzed genotyping data from 825 subjects with diverse ancestry, including European, East Asian, Amerindian, African, South Asian, Mexican, and Puerto Rican. A comprehensive set of 128 AIMs and subsets as small as 24 AIMs are shown to be useful tools for ascertaining the origin of subjects from particular continents, and to correct for population stratification in admixed population sample sets. Our findings provide general guidelines for the application of specific AIM subsets as a resource for wide application. We conclude that investigators can use TaqMan assays for the selected AIMs as a simple and cost efficient tool to control for differences in continental ancestry when conducting association studies in ethnically diverse populations. Copyright 2008 Wiley-Liss, Inc.

  7. ChromBiSim: Interactive chromatin biclustering using a simple approach.

    PubMed

    Noureen, Nighat; Zohaib, Hafiz Muhammad; Qadir, Muhammad Abdul; Fazal, Sahar

    2017-10-01

    Combinatorial patterns of histone modifications sketch the epigenomic landscape, and specific positions of these modifications in the genome are marked by the presence of such signals. Various methods highlight such patterns on a global scale and hence miss the local patterns that constitute the actual hidden combinatorics. We present ChromBiSim, an interactive tool for mining subsets of modifications from epigenomic profiles. ChromBiSim efficiently extracts biclusters together with their genomic locations. It is the first tool that offers a user interface and handles multiple cell types for decoding the interplay of histone modification combinations along their genomic locations. It displays the results as charts and heat maps and saves them to files that can be used for post-analysis. Tested on multiple cell types, ChromBiSim produced 803 combinatorial patterns in total. It could be used to highlight variation between diseased and normal cell types of any species. ChromBiSim is available at http://sourceforge.net/projects/chrombisim and is implemented in C# and Python. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Zeta-potential Analyses using Micro Electrical Field Flow Fractionation with Fluorescent Nanoparticles

    PubMed Central

    Chang, Moon-Hwan; Dosev, Dosi; Kennedy, Ian M.

    2007-01-01

    The growing application of nanoparticles in biotechnology requires fast and accessible tools for their manipulation and for the characterization of their colloidal properties. In this work we determine the zeta potentials of polystyrene nanoparticles using micro electrical field flow fractionation (μ-EFFF), an efficient method for sorting particles by size. The data obtained by μ-EFFF were compared to zeta potentials determined by standard capillary electrophoresis. As a proof of concept, we used polystyrene nanoparticles of two different sizes, impregnated with two different fluorescent dyes. Fluorescence emission spectra were used to evaluate the particle separation in both systems. Using the theory of electrophoresis, we estimated the zeta potentials as a function of size, dielectric permittivity, viscosity and electrophoretic mobility. The results obtained by the μ-EFFF technique were confirmed by conventional capillary electrophoresis measurements. These results demonstrate the applicability of the μ-EFFF method not only for particle size separation but also as a simple and inexpensive tool for measuring nanoparticle zeta potentials. PMID:18542710
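
    In the thin-double-layer (Smoluchowski) limit, the zeta potential follows directly from the measured electrophoretic mobility; a sketch with room-temperature water parameters (the regime of validity is an assumption about the measurement conditions):

      EPS0 = 8.854e-12     # vacuum permittivity (F/m)
      EPS_R = 78.5         # relative permittivity of water near 25 C
      ETA = 0.89e-3        # viscosity of water (Pa s)

      def zeta_from_mobility(mu):
          """Smoluchowski relation zeta = mu * eta / epsilon;
          mu in m^2 V^-1 s^-1, result in volts."""
          return mu * ETA / (EPS_R * EPS0)

      print(zeta_from_mobility(-3e-8) * 1e3, "mV")  # about -38 mV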

  9. A programming language for composable DNA circuits.

    PubMed

    Phillips, Andrew; Cardelli, Luca

    2009-08-06

    Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing.

  10. Opportunities and limitations in using Google Glass to assist drug dispensing.

    PubMed

    Ehrler, Frederic; Diener, Raphael; Lovis, Christian

    2015-01-01

    The administration of intravenous drugs is a significant source of medical errors. Protocol-based care has been demonstrated to be an efficient way to promote best practices and to avoid simple errors, such as those related to expiration dates and hygiene regulations, among others. The recent availability of Google Glass, a hands-free wearable device, offers new opportunities to access care protocols at the patient's bedside. In this article, we present a prototype application for displaying care protocols, developed through a user-centered design process. The software enables navigation through the different steps of a care protocol and their validation using barcodes. Three interaction paradigms (tactile, vocal and eye-blink) are proposed in order to provide hands-free manipulation when necessary. The realization of a concrete project revealed some limitations that should be taken into account to ensure the proper behavior of the tool. Although no formal evaluation has been performed, initial feedback is very positive and encourages us to go forward and test the tool in real care situations.

  11. Reducing Transaction Costs for Energy Efficiency Investments and Analysis of Economic Risk Associated With Building Performance Uncertainties: Small Buildings and Small Portfolios Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langner, R.; Hendron, B.; Bonnema, E.

    2014-08-01

    The small buildings and small portfolios (SBSP) sector faces a number of barriers that inhibit SBSP owners from adopting energy efficiency solutions. This pilot project focused on overcoming two of the largest barriers to financing energy efficiency in small buildings: disproportionately high transaction costs and unknown or unacceptable risk. Solutions to these barriers can often be at odds, because inexpensive turnkey solutions are often not sufficiently tailored to the unique circumstances of each building, reducing confidence that the expected energy savings will be achieved. To address these barriers, NREL worked with two innovative, forward-thinking lead partners, Michigan Saves and Energi, to develop technical solutions that provide a quick and easy process to encourage energy efficiency investments while managing risk. The pilot project was broken into two stages: the first stage focused on reducing transaction costs, and the second stage focused on reducing performance risk. In the first stage, NREL worked with the non-profit organization Michigan Saves to analyze the effects of 8 energy efficiency measures (EEMs) on 81 different baseline small office building models in Holland, Michigan (climate zone 5A). The results of this analysis (totaling over 30,000 cases) are summarized in a simple spreadsheet tool that enables users to easily sort through the results and find appropriate small office EEM packages that meet a particular energy savings threshold and are likely to be cost-effective.

  12. Utilizing Visual Effects Software for Efficient and Flexible Isostatic Adjustment Modelling

    NASA Astrophysics Data System (ADS)

    Meldgaard, A.; Nielsen, L.; Iaffaldano, G.

    2017-12-01

    The isostatic adjustment signal generated by transient ice sheet loading is an important indicator of past ice sheet extent and the rheological constitution of the interior of the Earth. Finite element modelling has proved to be a very useful tool in these studies. We present a simple numerical model for 3D viscoelastic Earth deformation and a new approach to the design of such models utilizing visual effects software designed for the film and game industry. The software package Houdini offers an assortment of optimized tools and libraries which greatly facilitate the creation of efficient numerical algorithms. In particular, we make use of Houdini's procedural workflow, the SIMD programming language VEX, Houdini's sparse matrix creation and inversion libraries, an inbuilt tetrahedralizer for grid creation, and the user interface, which facilitates effortless manipulation of 3D geometry. We mitigate many of the time-consuming steps associated with authoring efficient algorithms from scratch while still keeping the flexibility that may be lost with the use of commercial dedicated finite element programs. We test the efficiency of the algorithm by comparing simulation times with off-the-shelf solutions from the Abaqus software package. The algorithm is tailored for the study of local isostatic adjustment patterns in close vicinity to present ice sheet margins. In particular, we wish to examine possible causes for the considerable spatial differences in uplift magnitude that are apparent from field observations in these areas. Such features, with spatial scales of tens of kilometres, are not resolvable with current global isostatic adjustment models and may require the inclusion of local topographic features. We use the presented algorithm to study a near-field area where field observations are abundant, namely Disko Bay in West Greenland, with the intention of constraining Earth parameters and ice thickness. In addition, we assess how local topographic features may influence the differential isostatic uplift in the area.

  13. Optimal Force Control of Vibro-Impact Systems for Autonomous Drilling Applications

    NASA Technical Reports Server (NTRS)

    Aldrich, Jack B.; Okon, Avi B.

    2012-01-01

    The need to maintain optimal energy efficiency is critical during the drilling operations performed on future and current planetary rover missions. Specifically, this innovation seeks to solve the following problem: given a spring-loaded percussive drill driven by a voice-coil motor, determine the optimal input voltage waveform (a periodic function) and the optimal hammering period that minimize the dissipated energy, while ensuring that the hammer-to-rock impacts are made with sufficient (user-defined) impact velocity (or impact energy). To solve this problem, it was first observed that when voice-coil-actuated percussive drills are driven at high power, it is of paramount importance to ensure that the electrical current of the device remains in phase with the velocity of the hammer. Otherwise, negative work is performed and the drill experiences a loss of performance (i.e., reduced impact energy) and an increase in Joule heating (i.e., a reduction in energy efficiency). This observation has motivated many drilling products to incorporate the standard bang-bang control approach for driving their percussive drills. However, the bang-bang control approach is significantly less efficient than the optimal energy-efficient control approach solved here. To obtain this solution, the standard tools of classical optimal control theory were applied. These tools inherently require the solution of a two-point boundary value problem (TPBVP), i.e., a system of differential equations in which half the equations have unknown boundary conditions. Typically, the TPBVP is impossible to solve analytically for high-dimensional dynamic systems. However, for the case of the spring-loaded vibro-impactor, this approach yields the exact optimal control solution as the sum of four analytic functions whose coefficients are determined using a simple, easy-to-implement algorithm. Once the optimal control waveform is determined, it can be used in both open-loop and closed-loop control modes (using standard real-time control hardware).
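
    For readers unfamiliar with the TPBVP structure mentioned above, here is a generic numerical sketch for a toy double-integrator energy-minimization problem (not the drill's dynamics, whose solution the paper obtains analytically):

      import numpy as np
      from scipy.integrate import solve_bvp

      def odes(t, z):
          x, v, lam1, lam2 = z          # state (x, v) and costates
          u = -lam2                     # minimizing control for J = 0.5*u^2
          return np.vstack([v, u, np.zeros_like(lam1), -lam1])

      def bc(z0, z1):
          # x(0) = 0, v(0) = 0, x(T) = 1; free v(T) implies lam2(T) = 0
          return np.array([z0[0], z0[1], z1[0] - 1.0, z1[3]])

      t = np.linspace(0.0, 1.0, 50)
      sol = solve_bvp(odes, bc, t, np.zeros((4, t.size)))
      print(sol.status)                 # 0 indicates convergence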

  14. Biallelic insertion of a transcriptional terminator via the CRISPR/Cas9 system efficiently silences expression of protein-coding and non-coding RNA genes.

    PubMed

    Liu, Yangyang; Han, Xiao; Yuan, Junting; Geng, Tuoyu; Chen, Shihao; Hu, Xuming; Cui, Isabelle H; Cui, Hengmi

    2017-04-07

    The type II bacterial CRISPR/Cas9 system is a simple, convenient, and powerful tool for targeted gene editing. Here, we describe a CRISPR/Cas9-based approach for inserting a poly(A) transcriptional terminator into both alleles of a targeted gene to silence protein-coding and non-protein-coding genes, which often play key roles in gene regulation but are difficult to silence via insertion or deletion of short DNA fragments. The integration of 225 bp of bovine growth hormone poly(A) signals into either the first intron or the first exon, or behind the promoter, of target genes caused efficient termination of expression of PPP1R12C and NSUN2 (protein-coding genes) and MALAT1 (a non-protein-coding gene). Both NeoR and PuroR were used as markers in the selection of clonal cell lines with biallelic integration of a poly(A) signal. Genotyping analysis indicated that the cell lines displayed the desired biallelic silencing after a brief selection period. These combined results indicate that this CRISPR/Cas9-based approach offers an easy, convenient, and efficient novel technique for gene silencing in cell lines, especially those in which gene integration is difficult because of a low efficiency of homology-directed repair. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  15. Surface Modification of Naturally Available Biomass for Enhancement of Heavy Metal Removal Efficiency, Upscaling Prospects, and Management Aspects of Spent Biosorbents: A Review.

    PubMed

    Ramrakhiani, Lata; Ghosh, Sourja; Majumdar, Swachchha

    2016-09-01

    Heavy metal pollution in water is a severe socio-environmental problem originating primarily from the discharge of industrial wastewater. In view of the toxic, non-biodegradable, and persistent nature of most heavy metal ions, remediation of such components is an absolute necessity. Biosorption is an emerging bioremediation tool that has gained momentum by employing low-cost biological materials with effective metal-binding capacities. Even though biological materials possess excellent metal adsorption abilities, they show poor mechanical strength and low rigidity. Other disadvantages include solid-liquid separation problems, possible biomass swelling, low efficiency of regeneration or reuse, and the frequent development of a high pressure drop in column mode, which limits their application under real conditions. To improve biosorption efficiency, biomasses need to be modified with simple techniques for selective or multi-metal adsorption. This review covers biomass modification for enhanced biosorption efficiency, mechanism studies using various instrumental/analytical techniques, and future directions for research and development, including the fate of spent biosorbents. Most previously published research has not addressed the difficulty of scaling up the process. The current article outlines the application potential of biosorbents in the development of hybrid technology integrated with membrane processes for water and wastewater treatment at industrial scale.

  16. A simple and inexpensive external fixator.

    PubMed

    Noor, M A

    1988-11-01

    A simple and inexpensive external fixator has been designed. It is constructed of galvanized iron pipe and mild steel bolts and nuts. It can easily be manufactured in a hospital workshop with a minimum of tools.

  17. Efficient Multidisciplinary Analysis Approach for Conceptual Design of Aircraft with Large Shape Change

    NASA Technical Reports Server (NTRS)

    Chwalowski, Pawel; Samareh, Jamshid A.; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.

    2009-01-01

    The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use the medium- to high-fidelity tools early in the design stages when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general-purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules as well as examples for a telescoping wing, a folding wing, and a bat-like wing. The paper also includes the verification of a medium-fidelity aerodynamic tool used for the aerodynamic database generation with a steady and unsteady high-fidelity CFD analysis tool for a folding wing example.

  18. What's Inside?

    ERIC Educational Resources Information Center

    Sigford, Ann; Nelson, Nancy

    1998-01-01

    Presents a program for elementary teachers to learn how to use hand tools and household appliances to teach the principles of physics. The lesson helps teachers become familiar with simple hand tools, combat the apprehension of mechanical devices, and develop an interest in tools and technology. Session involves disassembling appliances to…

  19. Development and Validation of the Texas Best Management Practice Evaluation Tool (TBET)

    USDA-ARS?s Scientific Manuscript database

    Conservation planners need simple yet accurate tools to predict sediment and nutrient losses from agricultural fields to guide conservation practice implementation and increase cost-effectiveness. The Texas Best management practice Evaluation Tool (TBET), which serves as an input/output interpreter...

  20. Processing MALDI mass spectra to improve mass spectral direct tissue analysis

    NASA Astrophysics Data System (ADS)

    Norris, Jeremy L.; Cornett, Dale S.; Mobley, James A.; Andersson, Malin; Seeley, Erin H.; Chaurand, Pierre; Caprioli, Richard M.

    2007-02-01

    Profiling and imaging biological specimens using MALDI mass spectrometry has significant potential to contribute to our understanding and diagnosis of disease. The technique is efficient and high-throughput, providing a wealth of data about the biological state of the sample from a very simple and direct experiment. However, for these techniques to be put to use for clinical purposes, the approaches used to process and analyze the data must improve. This study examines some of the existing tools for baseline subtraction, normalization, alignment, and spectral noise removal for MALDI data, comparing the advantages of each. A preferred workflow is presented that can be easily implemented for data in ASCII format. The advantages of using such an approach are discussed for both molecular profiling and imaging mass spectrometry.
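
    A minimal sketch of two steps from such a workflow (baseline subtraction and total-ion-current normalization) for a spectrum stored as ASCII columns; the rolling-minimum baseline below is a simple stand-in, not one of the specific algorithms compared in the study:

      import numpy as np

      def baseline_subtract(intensity, window=101):
          """Rolling-minimum baseline estimate, subtracted and clipped at zero."""
          pad = window // 2
          padded = np.pad(intensity, pad, mode="edge")
          baseline = np.array([padded[i:i + window].min()
                               for i in range(len(intensity))])
          return np.clip(intensity - baseline, 0.0, None)

      def tic_normalize(intensity):
          """Scale so the total ion current sums to one."""
          return intensity / intensity.sum()

      # Assumed file layout: two ASCII columns, m/z and intensity.
      mz, inten = np.loadtxt("spectrum.txt", unpack=True)
      inten = tic_normalize(baseline_subtract(inten))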

  1. SPR based immunosensor for detection of Legionella pneumophila in water samples

    NASA Astrophysics Data System (ADS)

    Enrico, De Lorenzis; Manera, Maria G.; Montagna, Giovanni; Cimaglia, Fabio; Chiesa, Maurizio; Poltronieri, Palmiro; Santino, Angelo; Rella, Roberto

    2013-05-01

    Detection of legionellae by water sampling is an important factor in epidemiological investigations of Legionnaires' disease and its prevention. To avoid the labor-intensive problems of conventional methods, an alternative, highly sensitive and simple method for detecting L. pneumophila in aqueous samples is proposed. A compact Surface Plasmon Resonance (SPR) instrumentation prototype, equipped with proper microfluidic tools, was built. The developed immunosensor is capable of dynamically following the binding between antigens and the corresponding antibody molecules immobilized on the SPR sensor surface. The immobilization strategy used in this work includes an efficient step aimed at orienting the antibodies on the sensor surface. The feasibility of integrating SPR-based biosensing setups with microfluidic technologies, resulting in a low-cost, portable biosensor, is demonstrated.

  2. Optical integration of Pancharatnam-Berry phase lens and dynamical phase lens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ke, Yougang; Liu, Yachao; Zhou, Junxiao

    In an optical system, most elements, such as lenses, prisms, and optical fibers, are made of silica glass. Therefore, integrating Pancharatnam-Berry phase elements into silica glass has potential applications in optical systems. In this paper, we take a lens as an example, integrating a Pancharatnam-Berry phase lens into a conventional plano-convex lens. The spin states and positions of the focal points can be modulated by controlling the polarization state of the incident beam. The proposed lens has a high transmission efficiency and thereby acts as a simple and powerful tool for manipulating spin photons. Furthermore, the method can be conveniently extended to optical fibers and laser cavities, and may provide a route to the design of spin-photonic devices.

  3. A Computational Framework for Automation of Point Defect Calculations

    NASA Astrophysics Data System (ADS)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan; National Renewable Energy Laboratory, Golden, Colorado 80401 Collaboration

    A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation and the creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band-filling correction for shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package's capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials-by-design community to assess the impact of point defects on materials performance.
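
    For context, the listed corrections enter the standard charged-defect formation energy expression used throughout this literature (the general form, not quoted from the paper):

      E_f[X^q] = E_{tot}[X^q] - E_{tot}[\mathrm{bulk}] - \sum_i n_i \mu_i + q\,(E_F + E_{\mathrm{VBM}} + \Delta V) + E_{corr}

    where \Delta V is the potential-alignment term and E_{corr} collects the image-charge and band-filling corrections.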

  4. Rockets: Physical science teacher's guide with activities

    NASA Astrophysics Data System (ADS)

    Vogt, Gregory L.; Rosenberg, Carla R.

    1993-07-01

    This guide begins with background information sections on the history of rocketry, scientific principles, and practical rocketry. The sections on scientific principles and practical rocketry are based on Isaac Newton's three laws of motion. These laws explain why rockets work and how to make them more efficient. The background sections are followed with a series of physical science activities that demonstrate the basic science of rocketry. Each activity is designed to be simple and take advantage of inexpensive materials. Construction diagrams, materials and tools lists, and instructions are included. A brief discussion elaborates on the concepts covered in the activities and is followed with teaching notes and discussion questions. The guide concludes with a glossary of terms, suggested reading list, NASA educational resources, and an evaluation questionnaire with a mailer.

  5. Rockets: Physical science teacher's guide with activities

    NASA Technical Reports Server (NTRS)

    Vogt, Gregory L.; Rosenberg, Carla R. (Editor)

    1993-01-01

    This guide begins with background information sections on the history of rocketry, scientific principles, and practical rocketry. The sections on scientific principles and practical rocketry are based on Isaac Newton's three laws of motion. These laws explain why rockets work and how to make them more efficient. The background sections are followed with a series of physical science activities that demonstrate the basic science of rocketry. Each activity is designed to be simple and take advantage of inexpensive materials. Construction diagrams, materials and tools lists, and instructions are included. A brief discussion elaborates on the concepts covered in the activities and is followed with teaching notes and discussion questions. The guide concludes with a glossary of terms, suggested reading list, NASA educational resources, and an evaluation questionnaire with a mailer.

  6. D-GENIES: dot plot large genomes in an interactive, efficient and simple way.

    PubMed

    Cabanettes, Floréal; Klopp, Christophe

    2018-01-01

    Dot plots are widely used to quickly compare sequence sets. They provide a synthetic similarity overview, highlighting repetitions, breaks and inversions. Different tools have been developed to easily generated genomic alignment dot plots, but they are often limited in the input sequence size. D-GENIES is a standalone and web application performing large genome alignments using minimap2 software package and generating interactive dot plots. It enables users to sort query sequences along the reference, zoom in the plot and download several image, alignment or sequence files. D-GENIES is an easy-to-install, open-source software package (GPL) developed in Python and JavaScript. The source code is available at https://github.com/genotoul-bioinfo/dgenies and it can be tested at http://dgenies.toulouse.inra.fr/.
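
    A single-panel sketch of what such a dot plot encodes: each alignment line in minimap2's PAF output becomes a segment in query-versus-target coordinates (D-GENIES itself renders this interactively; the column indices follow the PAF specification):

      import matplotlib.pyplot as plt

      def plot_paf(paf_path):
          """Draw a simple dot plot from a PAF alignment file."""
          fig, ax = plt.subplots()
          with open(paf_path) as fh:
              for line in fh:
                  f = line.rstrip("\n").split("\t")
                  qs, qe = int(f[2]), int(f[3])      # query start/end
                  ts, te = int(f[7]), int(f[8])      # target start/end
                  if f[4] == "-":                    # reverse strand: descending
                      ts, te = te, ts
                  ax.plot([ts, te], [qs, qe], lw=0.5)
          ax.set_xlabel("target position")
          ax.set_ylabel("query position")
          return fig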

  7. High-frequency CAD-based scattering model: SERMAT

    NASA Astrophysics Data System (ADS)

    Goupil, D.; Boutillier, M.

    1991-09-01

    Specifications for an industrial radar cross section (RCS) calculation code are given: it must be able to exchange data with many computer-aided design (CAD) systems, it must be fast, and it must have powerful graphic tools. Classical physical optics (PO) and equivalent currents (EC) techniques have long proven their efficiency on simple objects. Difficult geometric problems occur, however, when objects with very complex shapes have to be computed; only a specific geometric code can solve these problems. We have established that, once these problems have been solved: (1) PO and EC give good results on complex objects that are large compared to the wavelength; and (2) the implementation of these techniques in a software package (SERMAT) allows RCS calculations fast and precise enough to meet industry requirements in the domain of stealth.

  8. Photochemical Approaches to Complex Chemotypes: Applications in Natural Product Synthesis.

    PubMed

    Kärkäs, Markus D; Porco, John A; Stephenson, Corey R J

    2016-09-14

    The use of photochemical transformations is a powerful strategy that allows for the formation of a high degree of molecular complexity from relatively simple building blocks in a single step. A central feature of all light-promoted transformations is the involvement of electronically excited states, generated upon absorption of photons. This produces transient reactive intermediates and significantly alters the reactivity of a chemical compound. The input of energy provided by light thus offers a means to produce strained and unique target compounds that cannot be assembled using thermal protocols. This review aims at highlighting photochemical transformations as a tool for rapidly accessing structurally and stereochemically diverse scaffolds. Synthetic designs based on photochemical transformations have the potential to afford complex polycyclic carbon skeletons with impressive efficiency, which are of high value in total synthesis.

  9. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations.

    PubMed

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-15

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra is based on some assumptions which will be discussed in this work. We especially emphasize the importance of the variation of the material absorptivity that should be considered to access accurate thermodynamical properties of the carriers, especially by varying the excitation power. The proposed method enables us to obtain more accurate results of thermodynamical properties by taking into account a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.
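
    The analysis in question rests on the generalized Planck law, in which the absorptivity A(E) multiplies the emitted photon flux; fitting the high-energy tail of the spectrum yields the carrier temperature T and chemical potential \Delta\mu (standard form, up to a constant prefactor, not quoted from the paper):

      I_{\mathrm{PL}}(E) \;\propto\; A(E)\, E^{2} \left[\exp\!\left(\frac{E - \Delta\mu}{k_B T}\right) - 1\right]^{-1}

    The paper's point is that A(E) itself varies, for example with excitation power, so treating it as constant biases the extracted T and \Delta\mu.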

  10. Analytical thermal model for end-pumped solid-state lasers

    NASA Astrophysics Data System (ADS)

    Cini, L.; Mackenzie, J. I.

    2017-12-01

    Fundamentally power-limited by thermal effects, the design challenge for end-pumped "bulk" solid-state lasers depends upon knowledge of the temperature gradients within the gain medium. We have developed analytical expressions that can be used to model the temperature distribution and thermal-lens power in end-pumped solid-state lasers. Because they include a temperature-dependent thermal conductivity, the expressions are applicable from cryogenic to elevated temperatures; typical pumping distributions are explored and the results compared with accepted models. Key insights are gained through these analytical expressions, such as the dependence of the peak temperature rise on the boundary thermal conductance to the heat sink. Our generalized expressions provide simple and time-efficient tools for parametric optimization of the heat distribution in the gain medium based upon the material and pumping constraints.
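    A common route to closed-form results with a temperature-dependent conductivity k(T) is the Kirchhoff transform, sketched here under the usual assumptions (steady state, k depending on T only); the paper's exact derivation may differ:

    $$ \theta(T) \;=\; \frac{1}{k_{0}}\int_{T_{0}}^{T} k(T')\,\mathrm{d}T' \quad\Longrightarrow\quad \nabla\!\cdot\!\big(k(T)\,\nabla T\big) = -q \;\;\Leftrightarrow\;\; k_{0}\,\nabla^{2}\theta = -q, $$

    so the linear problem for θ with the pump-deposition source q can be solved analytically and mapped back to the physical temperature T.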

  11. Benchmarking of Decision-Support Tools Used for Tiered Sustainable Remediation Appraisal.

    PubMed

    Smith, Jonathan W N; Kerrison, Gavin

    2013-01-01

    Sustainable remediation comprises soil and groundwater risk-management actions that are selected, designed, and operated to maximize net environmental, social, and economic benefit (while assuring protection of human health and safety). This paper describes a benchmarking exercise to comparatively assess potential differences in environmental management decision making resulting from application of different sustainability appraisal tools ranging from simple (qualitative) to more quantitative (multi-criteria and fully monetized cost-benefit analysis), as outlined in the SuRF-UK framework. The appraisal tools were used to rank remedial options for risk management of a subsurface petroleum release that occurred at a petrol filling station in central England. The remediation options were benchmarked using a consistent set of soil and groundwater data for each tier of sustainability appraisal. The ranking of remedial options was very similar in all three tiers, and an environmental management decision to select the most sustainable options at tier 1 would have been the same decision at tiers 2 and 3. The exercise showed that, for relatively simple remediation projects, a simple sustainability appraisal led to the same remediation option selection as more complex appraisal, and can be used to reliably inform environmental management decisions on other relatively simple land contamination projects.

  12. Twelve essential tools for living the life of whole person health care.

    PubMed

    Schlitz, Marilyn; Valentina, Elizabeth

    2013-01-01

    The integration of body, mind, and spirit has become a key dimension of health education and disease prevention and treatment; however, our health care system remains primarily disease centered. Finding simple steps to help each of us find our own balance can improve our lives, our work, and our relationships. On the basis of interviews with health care experts at the leading edge of the new model of medicine, this article identifies simple tools to improve the health of patients and caregivers.

  13. Granular materials interacting with thin flexible rods

    NASA Astrophysics Data System (ADS)

    Neto, Alfredo Gay; Campello, Eduardo M. B.

    2017-04-01

    In this work, we develop a computational model for the simulation of problems wherein granular materials interact with thin flexible rods. We treat granular materials as a collection of spherical particles following a discrete element method (DEM) approach, while flexible rods are described by a large-deformation finite element (FEM) rod formulation. Grain-to-grain, grain-to-rod, and rod-to-rod contacts are fully permitted and resolved. A simple and efficient strategy is proposed for coupling the motion of the two types (discrete and continuum) of materials within an iterative time-stepping solution scheme. Implementation details are shown and discussed. Validity and applicability of the model are assessed by means of a few numerical examples. We believe that robust, efficiently coupled DEM-FEM schemes can be a useful tool for simulating problems wherein granular materials interact with thin flexible rods, such as (but not limited to) bombardment of grains on beam structures, flow of granular materials over surfaces covered by threads of hair in many biological processes, and flow of grains through filters and strainers in various industrial segregation processes.
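    For readers unfamiliar with the DEM side of such a coupling, a minimal sketch of the grain-to-grain part is given below (linear spring-dashpot normal contact with explicit time stepping); all parameter values and names are illustrative, and the paper's rod FEM and coupling loop are not reproduced:

    ```python
    import numpy as np

    # Minimal 2D DEM sketch: spheres with linear spring-dashpot normal contact.
    n, radius, k_n, c_n, mass, dt = 50, 0.01, 1e4, 0.5, 1e-3, 1e-5
    pos = np.random.rand(n, 2) * 0.5
    vel = np.zeros((n, 2))

    def contact_forces(pos, vel):
        forces = np.zeros_like(pos)
        for i in range(n):
            for j in range(i + 1, n):
                d = pos[j] - pos[i]
                dist = np.linalg.norm(d)
                overlap = 2 * radius - dist
                if overlap > 0:                    # particles in contact
                    normal = d / dist
                    v_rel = np.dot(vel[j] - vel[i], normal)
                    f = (k_n * overlap - c_n * v_rel) * normal  # spring + dashpot
                    forces[i] -= f
                    forces[j] += f
        return forces

    for step in range(1000):                       # explicit time stepping
        acc = contact_forces(pos, vel) / mass
        acc[:, 1] -= 9.81                          # gravity
        vel += acc * dt
        pos += vel * dt
    ```

    In a coupled scheme of the kind described, contact forces between grains and rod elements would be exchanged with the FEM solver inside each iteration of the time step.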

  14. Efficient assembly of full-length infectious clone of Brazilian IBDV isolate by homologous recombination in yeast

    PubMed Central

    Silva, J.V.J.; Arenhart, S.; Santos, H.F.; Almeida-Queiroz, S.R.; Silva, A.N.M.R.; Trevisol, I.M.; Bertani, G.R.; Gil, L.H.V.G.

    2014-01-01

    The Infectious Bursal Disease Virus (IBDV) causes immunosuppression in young chickens. Advances in molecular virology and vaccines for IBDV have been achieved by viral reverse genetics (VRG). VRG for IBDV has undergone changes over time; however, all strategies used to generate IBDV particles involve multiple rounds of amplification and require in vitro ligation and restriction sites. The aim of this research was to build the world's first VRG system for IBDV using yeast-based homologous recombination, a more efficient, robust and simple process than cloning by in vitro ligation. The wild-type IBDV (Wt-IBDV-Br) was isolated in Brazil and its genome was cloned into the pJG-CMV-HDR vector by yeast-based homologous recombination. The clones were transfected into chicken embryo fibroblasts, and the recovered virus (IC-IBDV-Br) showed genetic stability and a phenotype similar to Wt-IBDV-Br, as assessed by nucleotide sequencing, focus size/morphology and replication kinetics. Thus, IBDV reverse genetics by yeast-based homologous recombination provides tools for understanding IBDV and for developing vaccines and viral vectors. PMID:25763067

  15. Engineering M13 for phage display.

    PubMed

    Sidhu, S S

    2001-09-01

    Phage display is achieved by fusing polypeptide libraries to phage coat proteins. The resulting phage particles display the polypeptides on their surfaces and they also contain the encoding DNA. Library members with particular functions can be isolated with simple selections and polypeptide sequences can be decoded from the encapsulated DNA. The technology's success depends on the efficiency with which polypeptides can be displayed on the phage surface, and significant progress has been made in engineering M13 bacteriophage coat proteins as improved phage display platforms. Functional display has been achieved with all five M13 coat proteins, with both N- and C-terminal fusions. Also, coat protein mutants have been designed and selected to improve the efficiency of heterologous protein display, and in the extreme case, completely artificial coat proteins have been evolved specifically as display platforms. These studies demonstrate that the M13 phage coat is extremely malleable, and this property can be used to engineer the phage particle specifically for phage display. These improvements expand the utility of phage display as a powerful tool in modern biotechnology.

  16. Colloid Transport in Saturated Porous Media: Elimination of Attachment Efficiency in a New Colloid Transport Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landkamer, Lee L.; Harvey, Ronald W.; Scheibe, Timothy D.

    A new colloid transport model is introduced that is conceptually simple but captures the essential features of complicated attachment and detachment behavior of colloids when conditions of secondary minimum attachment exist. This model eliminates the empirical concept of collision efficiency; the attachment rate is computed directly from colloid filtration theory. Also, a new paradigm for colloid detachment based on colloid population heterogeneity is introduced. Assuming the dispersion coefficient can be estimated from tracer behavior, this model has only two fitting parameters: (1) the fraction of colloids that attach irreversibly and (2) the rate at which reversibly attached colloids leave the surface. These two parameters were correlated to physical parameters that control colloid transport such as the depth of the secondary minimum and pore water velocity. Given this correlation, the model serves as a heuristic tool for exploring the influence of physical parameters such as surface potential and fluid velocity on colloid transport. This model can be extended to heterogeneous systems characterized by both primary and secondary minimum deposition by simply increasing the fraction of colloids that attach irreversibly.
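    To make the two-parameter structure concrete, a batch-kinetics sketch is given below; the names f_irr (irreversibly attaching fraction) and k_det (detachment rate) are illustrative stand-ins for the paper's two fitting parameters, with the attachment rate k_att supplied by filtration theory rather than fitted:

    ```python
    import numpy as np

    # Illustrative kinetic sketch of the two-parameter model (not the published code).
    # k_att comes from colloid filtration theory; f_irr and k_det are the fitted values.
    k_att, f_irr, k_det, dt = 0.5, 0.3, 0.05, 0.01   # 1/h, -, 1/h, h

    c, s_rev, s_irr = 1.0, 0.0, 0.0   # aqueous, reversibly and irreversibly attached
    for _ in range(int(10 / dt)):     # 10 h of simulated time
        attach = k_att * c * dt
        detach = k_det * s_rev * dt
        c     += detach - attach
        s_irr += f_irr * attach                    # fraction that never detaches
        s_rev += (1 - f_irr) * attach - detach

    print(f"aqueous {c:.3f}, reversible {s_rev:.3f}, irreversible {s_irr:.3f}")
    ```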

  17. Efficient quantification of water content in edible oils by headspace gas chromatography with vapour phase calibration.

    PubMed

    Xie, Wei-Qi; Gong, Yi-Xian; Yu, Kong-Xian

    2018-06-01

    An automated and accurate headspace gas chromatographic (HS-GC) technique was investigated for rapidly quantifying water content in edible oils. In this method, multiple headspace extraction (MHE) procedures were used to analyse the integrated water content from the edible oil sample. A simple vapour-phase calibration technique with an external vapour standard was used to calibrate both the water content in the gas phase and the total weight of water in the edible oil sample, from which the water content of the oil was then quantified. The data showed that the relative standard deviation of the present HS-GC method in the precision test was less than 1.13%, and the relative differences between the new method and a reference method (i.e. the oven-drying method) were no more than 1.62%. The present HS-GC method is automated, accurate and efficient, and can be a reliable tool for quantifying water content in edible-oil-related products and research. © 2017 Society of Chemical Industry.
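    The MHE step relies on the fact that successive headspace extractions deplete the analyte geometrically, so the total amount can be obtained from a few injections; in the standard MHE formulation (given here for context, not quoted from the paper)

    $$ A_{i} = A_{1}\,e^{-q\,(i-1)} \quad\Longrightarrow\quad \sum_{i=1}^{\infty} A_{i} = \frac{A_{1}}{1-e^{-q}}, $$

    where A_i is the peak area of the i-th extraction and q is obtained from the slope of ln A_i versus i. The external vapour standard then converts the total area into a mass of water.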

  18. Steric engineering of metal-halide perovskites with tunable optical band gaps

    NASA Astrophysics Data System (ADS)

    Filip, Marina R.; Eperon, Giles E.; Snaith, Henry J.; Giustino, Feliciano

    2014-12-01

    Owing to their high energy-conversion efficiency and inexpensive fabrication routes, solar cells based on metal-organic halide perovskites have rapidly gained prominence as a disruptive technology. An attractive feature of perovskite absorbers is the possibility of tailoring their properties by changing the elemental composition through the chemical precursors. In this context, rational in silico design represents a powerful tool for mapping the vast materials landscape and accelerating discovery. Here we show that the optical band gap of metal-halide perovskites, a key design parameter for solar cells, strongly correlates with a simple structural feature, the largest metal-halide-metal bond angle. Using this descriptor we suggest continuous tunability of the optical gap from the mid-infrared to the visible. Precise band gap engineering is achieved by controlling the bond angles through the steric size of the molecular cation. On the basis of these design principles we predict novel low-gap perovskites for optimum photovoltaic efficiency, and we demonstrate the concept of band gap modulation by synthesising and characterising novel mixed-cation perovskites.

  19. A novel sgRNA selection system for CRISPR-Cas9 in mammalian cells.

    PubMed

    Zhang, Haiwei; Zhang, Xixi; Fan, Cunxian; Xie, Qun; Xu, Chengxian; Zhao, Qun; Liu, Yongbo; Wu, Xiaoxia; Zhang, Haibing

    2016-03-18

    The CRISPR-Cas9-mediated genome-editing system has been developed as a powerful tool for elucidating the function of genes through genetic engineering in multiple cells and organisms. This system takes advantage of a single guide RNA (sgRNA) to direct the Cas9 endonuclease to a specific DNA site to generate mutant alleles. Since the targeting efficiency of sgRNAs at distinct DNA loci can vary widely, there remains a need for a rapid, simple and efficient sgRNA selection method to overcome this limitation of the CRISPR-Cas9 system. Here we report a novel system for selecting sgRNAs with high efficacy for DNA sequence modification using a luciferase assay. Using this sgRNA selection system, we further demonstrate successful examples in which a single sgRNA was used to generate gene-knockout cell lines in which the targeted genes were shown to be functionally defective. This system can potentially be applied to optimize sgRNAs in different species and to build a powerful CRISPR-Cas9 genome-wide screening system with a minimal number of sgRNAs. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Repair of full-thickness tendon injury using connective tissue progenitors efficiently derived from human embryonic stem cells and fetal tissues.

    PubMed

    Cohen, Shahar; Leshansky, Lucy; Zussman, Eyal; Burman, Michael; Srouji, Samer; Livne, Erella; Abramov, Natalie; Itskovitz-Eldor, Joseph

    2010-10-01

    The use of stem cells for tissue engineering (TE) encourages scientists to design new platforms in the field of regenerative and reconstructive medicine. Human embryonic stem cells (hESC) have been proposed as an important cell source for cell-based TE applications as well as an exciting tool for investigating the fundamentals of human development. Here, we describe the efficient derivation of connective tissue progenitors (CTPs) from hESC lines and fetal tissues. The CTPs were significantly expanded and induced to generate tendon tissues in vitro, with ultrastructural characteristics and biomechanical properties typical of mature tendons. We describe a simple method for engineering tendon grafts that can successfully repair injured Achilles tendons and restore ankle-joint extension movement in mice. We also show the CTPs' ability to differentiate into bone, cartilage, and fat both in vitro and in vivo. This study offers evidence for the possibility of using stem cell-derived engineered grafts to replace missing tissues, and sets a basic platform for future cell-based TE applications in the fields of orthopedics and reconstructive surgery.

  1. SIGNUM: A Matlab, TIN-based landscape evolution model

    NASA Astrophysics Data System (ADS)

    Refice, A.; Giachetta, E.; Capolongo, D.

    2012-08-01

    Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open-source codes. Most are written in efficient programming languages such as Fortran or C, but often require additional coding effort to plug in to more user-friendly data analysis and/or visualization tools to ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (an acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift (which can be used to simulate changes in base level, thrusting and faulting) and the effects of climate changes. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure that makes it easy to modify and upgrade the simulated physical processes to suit virtually any user's needs. The code is conceived as an open-source project and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity in the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for the development of new modules and algorithms are proposed.
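    The hillslope-diffusion-plus-uplift core that such models share is compact enough to sketch; the toy below is grid-based Python rather than SIGNUM's TIN-based Matlab, with illustrative parameter values, and shows only the linear-diffusion term dz/dt = κ∇²z + U:

    ```python
    import numpy as np

    # Minimal grid-based sketch of a LEM core (SIGNUM itself is Matlab, TIN-based).
    nx, ny, dx = 100, 100, 10.0            # grid size [cells], spacing [m]
    kappa, uplift, dt = 0.01, 1e-4, 50.0   # diffusivity [m^2/yr], uplift [m/yr], step [yr]
    z = np.zeros((ny, nx))                 # initial flat topography [m]

    for step in range(2000):
        lap = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
               np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4 * z) / dx**2
        z += dt * (kappa * lap + uplift)   # dz/dt = kappa * laplacian(z) + U
        z[0, :] = z[-1, :] = z[:, 0] = z[:, -1] = 0.0  # fixed base level at edges
    ```

    The chosen dt satisfies the explicit stability bound (κ·dt/dx² « 0.25); fluvial incision and nonlinear diffusion would add further terms to the update.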

  2. An Open-Source Label Atlas Correction Tool and Preliminary Results on Huntington's Disease Whole-Brain MRI Atlases

    PubMed Central

    Forbes, Jessica L.; Kim, Regina E. Y.; Paulsen, Jane S.; Johnson, Hans J.

    2016-01-01

    The creation of high-quality medical imaging reference atlas datasets with consistent dense anatomical region labels is a challenging task. Reference atlases have many uses in medical image applications and are essential components of atlas-based segmentation tools commonly used for producing personalized anatomical measurements for individual subjects. The process of manual identification of anatomical regions by experts is regarded as a so-called gold standard; however, it is usually impractical because it is labor-intensive and costly. Further, as the number of regions of interest increases, these manually created atlases often contain many small, inconsistently labeled or disconnected regions that need to be identified and corrected. This project proposes an efficient process to drastically reduce the time necessary for manual revision in order to improve atlas label quality. We introduce LabelAtlasEditor, a SimpleITK-based open-source label atlas correction tool distributed within the image visualization software 3D Slicer. LabelAtlasEditor incorporates several 3D Slicer widgets into one consistent interface and provides label-specific correction tools, allowing for rapid identification, navigation, and modification of the small, disconnected erroneous labels within an atlas. The technical details of the implementation and performance of LabelAtlasEditor are demonstrated using an application of improving a set of 20 Huntington's Disease-specific multi-modal brain atlases. Additionally, we present the advantages and limitations of automatic atlas correction. After the correction of atlas inconsistencies and small, disconnected regions, the number of unidentified voxels for each dataset was reduced on average by 68.48%. PMID:27536233

  3. An Open-Source Label Atlas Correction Tool and Preliminary Results on Huntington's Disease Whole-Brain MRI Atlases.

    PubMed

    Forbes, Jessica L; Kim, Regina E Y; Paulsen, Jane S; Johnson, Hans J

    2016-01-01

    The creation of high-quality medical imaging reference atlas datasets with consistent dense anatomical region labels is a challenging task. Reference atlases have many uses in medical image applications and are essential components of atlas-based segmentation tools commonly used for producing personalized anatomical measurements for individual subjects. The process of manual identification of anatomical regions by experts is regarded as a so-called gold standard; however, it is usually impractical because it is labor-intensive and costly. Further, as the number of regions of interest increases, these manually created atlases often contain many small, inconsistently labeled or disconnected regions that need to be identified and corrected. This project proposes an efficient process to drastically reduce the time necessary for manual revision in order to improve atlas label quality. We introduce LabelAtlasEditor, a SimpleITK-based open-source label atlas correction tool distributed within the image visualization software 3D Slicer. LabelAtlasEditor incorporates several 3D Slicer widgets into one consistent interface and provides label-specific correction tools, allowing for rapid identification, navigation, and modification of the small, disconnected erroneous labels within an atlas. The technical details of the implementation and performance of LabelAtlasEditor are demonstrated using an application of improving a set of 20 Huntington's Disease-specific multi-modal brain atlases. Additionally, we present the advantages and limitations of automatic atlas correction. After the correction of atlas inconsistencies and small, disconnected regions, the number of unidentified voxels for each dataset was reduced on average by 68.48%.

  4. PhytoCRISP-Ex: a web-based and stand-alone application to find specific target sequences for CRISPR/CAS editing.

    PubMed

    Rastogi, Achal; Murik, Omer; Bowler, Chris; Tirichine, Leila

    2016-07-01

    With the emerging interest in phytoplankton research, establishing genetic tools for the functional characterization of genes is indispensable. The CRISPR/Cas9 system is now well recognized as an efficient and accurate reverse-genetic tool for genome editing. Several computational tools have been published that allow researchers to find candidate target sequences for the engineering of CRISPR vectors while searching for possible off-targets of the predicted candidates. These tools provide built-in genome databases of common model organisms that are used for CRISPR target prediction. Although their predictions are highly sensitive, their design makes them inadequate for non-model genomes, most notably those of protists. This motivated us to design a new CRISPR target-finding tool, PhytoCRISP-Ex. Our software offers CRISPR target predictions using an extended list of phytoplankton genomes and also delivers a user-friendly standalone application that can be used for any genome. The software attempts to integrate, for the first time, most available phytoplankton genome information and provides a web-based platform for Cas9 target prediction within these genomes with high sensitivity. By offering a standalone version, PhytoCRISP-Ex remains independent of any particular organism, which widens its applicability in high-throughput pipelines. PhytoCRISP-Ex surpasses existing tools by computing the availability of restriction sites over the most probable Cas9 cleavage sites, which can be ideal for mutant screens. PhytoCRISP-Ex is a simple, fast and accurate web interface with 13 pre-indexed and continuously updated phytoplankton genomes. The software was also designed as a UNIX-based standalone application that allows the user to search for target sequences in the genomes of a variety of other species.
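    The core of any such target finder is a scan for 20-nt protospacers followed by an NGG PAM. A minimal forward-strand sketch is shown below (illustrative only; PhytoCRISP-Ex additionally handles the reverse strand, off-target checks and restriction-site availability):

    ```python
    import re

    # Minimal Cas9 target-site scan: 20-nt protospacer followed by an NGG PAM.
    # Forward strand only; the lookahead allows overlapping sites to be reported.
    def find_cas9_targets(seq):
        seq = seq.upper()
        targets = []
        for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
            targets.append((m.start(), m.group(1), m.group(2)))  # pos, protospacer, PAM
        return targets

    for pos, proto, pam in find_cas9_targets("ATGCGT" + "ACGT" * 10 + "TGGAC"):
        print(pos, proto, pam)
    ```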

  5. Comparison Of Human Modelling Tools For Efficiency Of Prediction Of EVA Tasks

    NASA Technical Reports Server (NTRS)

    Dischinger, H. Charles, Jr.; Loughead, Tomas E.

    1998-01-01

    Construction of the International Space Station (ISS) will require extensive extravehicular activity (EVA, spacewalks), and estimates of the actual time needed continue to rise. As recently as September 1996, the amount of time to be spent in EVA was believed to be about 400 hours, excluding spacewalks on the Russian segment. This estimate has recently risen to over 1100 hours, and it could go higher before assembly begins in the summer of 1998. These activities are extremely expensive and hazardous, so any design tools that help assure mission success and improve the efficiency of the astronaut in task completion can pay off in reduced design and EVA costs and increased astronaut safety. The tasks which astronauts can accomplish in EVA are limited by spacesuit mobility. They are therefore relatively simple from an ergonomic standpoint, requiring gross movements rather than fine motor skills. The actual tasks include driving bolts, mating and demating electrical and fluid connectors, and actuating levers; the important characteristics to be considered in design improvement include the ability of the astronaut to see and reach the item to be manipulated and the clearance required to accomplish the manipulation. This makes the tasks amenable to simulation in a computer-aided design (CAD) environment. For EVA, the spacesuited astronaut must have his or her feet attached to a work platform called a foot restraint to obtain a purchase against which work forces may be exerted. An important component of the design is therefore the proper placement of foot restraints.

  6. In Search of a Time Efficient Approach to Crack and Delamination Growth Predictions in Composites

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Carvalho, Nelson

    2016-01-01

    Analysis benchmarking was used to assess the accuracy and time efficiency of algorithms suitable for automated delamination growth analysis. First, the Floating Node Method (FNM) was introduced and its combination with a simple power-law growth relation (Paris law) and the Virtual Crack Closure Technique (VCCT) was discussed. Implementation of the method into a user element (UEL) in Abaqus/Standard(Registered TradeMark) was also presented. For the assessment of growth-prediction capabilities, an existing benchmark case based on the Double Cantilever Beam (DCB) specimen was briefly summarized. Additionally, the development of new benchmark cases based on the Mixed-Mode Bending (MMB) specimen to assess the growth-prediction capabilities under mixed-mode I/II conditions was discussed in detail. A comparison was presented in which the benchmark cases were used to assess the existing low-cycle fatigue analysis tool in Abaqus/Standard(Registered TradeMark) against the FNM-VCCT fatigue growth analysis implementation. The low-cycle fatigue analysis tool in Abaqus/Standard(Registered TradeMark) was able to yield results that were in good agreement with the DCB benchmark example. Results for the MMB benchmark cases, however, only captured the trend correctly. The user element (FNM-VCCT) always yielded results that were in excellent agreement with all benchmark cases, at a fraction of the analysis time. The ability to assess the implementation of two methods in one finite element code illustrated the value of establishing benchmark solutions.
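    For context, the growth law referred to relates the crack-growth increment per cycle to the cyclic fracture parameter through a power law; in the energy-release-rate form commonly paired with VCCT (the exact parameterization used in the paper may differ) it reads

    $$ \frac{\mathrm{d}a}{\mathrm{d}N} \;=\; C\,\big(\Delta G\big)^{m}, $$

    where C and m are material constants fitted to coupon data and ΔG is the cyclic variation of the energy release rate, with C and m generally depending on mode mixity for cases such as the MMB specimen.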

  7. Simplified, inverse, ejector design tool

    NASA Technical Reports Server (NTRS)

    Dechant, Lawrence J.

    1993-01-01

    A simple lumped-parameter-based inverse design tool has been developed which provides flow-path geometry and entrainment estimates subject to operational, acoustic, and design constraints. These constraints are manifested through specification of primary mass flow rate or ejector thrust, fully-mixed exit velocity, and static pressure matching. Fundamentally, integral forms of the conservation equations coupled with the specified design constraints are combined to yield an easily invertible linear system in terms of the flow-path cross-sectional areas. Entrainment is computed by back-substitution. Initial comparisons with experimental data and analogous one-dimensional methods show good agreement. Thus, this simple inverse design code provides an analytically based preliminary design tool with direct application to High Speed Civil Transport (HSCT) design studies.
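    To see why the system is linear in the areas, consider integral continuity and momentum over the mixer with static pressure matched at the mixing plane; in a simplified form consistent with the abstract (a sketch, not the paper's exact equations),

    $$ \rho_{p}V_{p}A_{p} + \rho_{s}V_{s}A_{s} = \rho_{e}V_{e}A_{e}, \qquad \rho_{p}V_{p}^{2}A_{p} + \rho_{s}V_{s}^{2}A_{s} = \rho_{e}V_{e}^{2}A_{e}, $$

    where subscripts p, s and e denote primary, secondary and fully-mixed exit streams. With the velocities and densities fixed by the design constraints, both equations are linear in the unknown areas, and the entrainment ratio follows from the recovered mass flows by back-substitution.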

  8. The Productivity and Cost-Efficiency of Models for Involving Nurse Practitioners in Primary Care: A Perspective from Queueing Analysis

    PubMed Central

    Liu, Nan; D'Aunno, Thomas

    2012-01-01

    Objective: To develop simple stylized models for evaluating the productivity and cost-efficiency of different practice models for involving nurse practitioners (NPs) in primary care, and in particular to generate insights on what affects the performance of these models and how. Data Sources and Study Design: The productivity of a practice model is defined as the maximum number of patients that can be accounted for by the model under a given timeliness-to-care requirement; cost-efficiency is measured by the corresponding annual cost per patient in that model. Appropriate queueing analysis is conducted to generate formulas and values for these two performance measures. Model parameters for the analysis are extracted from the previous literature and survey reports. Sensitivity analysis is conducted to investigate model performance under different scenarios and to verify the robustness of findings. Principal Findings: Employing an NP, whose salary is usually lower than that of a primary care physician, may not be cost-efficient, in particular when the NP's capacity is underutilized. Besides provider service rates, workload allocation among providers is one of the most important determinants of the cost-efficiency of a practice model involving NPs. Capacity pooling among providers could be a helpful strategy to improve efficiency in care delivery. Conclusions: The productivity and cost-efficiency of a practice model depend heavily on how providers organize their work and on a variety of other factors related to the practice environment. Queueing theory provides useful tools for taking these factors into account when making strategic decisions on staffing and panel-size selection for a practice model. PMID:22092009
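    The notion of "productivity under a timeliness constraint" can be illustrated with the simplest queueing model; the M/M/1 sketch below (the paper's actual model and parameter values are not reproduced, and mu and wq_max here are assumed numbers) finds the largest demand rate a single provider can support while keeping the mean queueing delay below a target:

    ```python
    # Illustrative M/M/1 sketch of productivity under a timeliness constraint.
    def max_arrival_rate(mu, wq_max):
        """Largest Poisson arrival rate (patients/day) such that the mean
        M/M/1 queueing delay Wq = lam / (mu * (mu - lam)) stays below wq_max."""
        lo, hi = 0.0, mu - 1e-9
        for _ in range(60):               # bisection on the monotone Wq(lam)
            lam = 0.5 * (lo + hi)
            wq = lam / (mu * (mu - lam))
            lo, hi = (lam, hi) if wq <= wq_max else (lo, lam)
        return lo

    mu = 20.0   # assumed service rate: patients/day one provider can see
    print(max_arrival_rate(mu, wq_max=0.5))   # demand supportable with <= 0.5-day wait
    ```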

  9. Remote Control and Data Acquisition: A Case Study

    NASA Technical Reports Server (NTRS)

    DeGennaro, Alfred J.; Wilkinson, R. Allen

    2000-01-01

    This paper details software tools developed to remotely command experimental apparatus, and to acquire and visualize the associated data in soft real time. The work was undertaken because commercial products failed to meet the needs. This work has identified six key factors intrinsic to the development of quality research laboratory software. Capabilities include access to all new instrument functions without any programming or dependence on others to write drivers or virtual instruments, a simple full-screen text-based experiment configuration and control user interface, months of continuous experiment run-times, on the order of 1% CPU load for the condensed matter physics experiment described here, very little imposition of software tool choices on remote users, and total remote control from anywhere in the world over the Internet or from home on a 56 kb/s modem as if the user were sitting in the laboratory. This work yielded a set of simple, robust tools that are highly reliable, resource-conserving, extensible, and versatile, with a uniform simple interface.

  10. Protoplast isolation, transient transformation of leaf mesophyll protoplasts and improved Agrobacterium-mediated leaf disc infiltration of Phaseolus vulgaris: tools for rapid gene expression analysis.

    PubMed

    Nanjareddy, Kalpana; Arthikala, Manoj-Kumar; Blanco, Lourdes; Arellano, Elizabeth S; Lara, Miguel

    2016-06-24

    Phaseolus vulgaris is one of the most extensively studied model legumes in the world. The P. vulgaris genome sequence is available; therefore, the need for an efficient and rapid transformation system is more imperative than ever. The functional characterization of P. vulgaris genes is impeded chiefly because Phaseolus sp. are not amenable to stable genetic transformation. Transient transformation systems are convenient and versatile alternatives for rapid gene functional characterization studies. Hence, the present work focuses on standardizing methodologies for protoplast isolation from multiple tissues and transient transformation protocols for rapid gene expression analysis in the recalcitrant grain legume P. vulgaris. Herein, we provide methodologies for the high-throughput isolation of leaf mesophyll-, flower petal-, hypocotyl-, root- and nodule-derived protoplasts from P. vulgaris. The highly efficient polyethylene glycol-mannitol magnesium (PEG-MMG)-mediated transformation of leaf mesophyll protoplasts was optimized using a GUS reporter gene. We used the P. vulgaris SNF1-related protein kinase 1 (PvSnRK1) gene as a proof of concept to demonstrate rapid gene functional analysis. An RT-qPCR analysis of protoplasts that had been transformed with PvSnRK1-RNAi and PvSnRK1-OE vectors showed significant downregulation and ectopic constitutive expression (overexpression), respectively, of the PvSnRK1 transcript. We also demonstrate an improved transient transformation approach, sonication-assisted Agrobacterium-mediated transformation (SAAT), for leaf disc infiltration of P. vulgaris. Interestingly, this method resulted in a 90% transformation efficiency and transformed 60-85% of the cells in a given area of the leaf surface. The constitutive expression of YFP further confirmed the amenability of the system to gene functional characterization studies. We present simple and efficient methodologies for protoplast isolation from multiple P. vulgaris tissues. We also provide a high-efficiency and amenable method for leaf mesophyll transformation for rapid gene functional characterization studies. Furthermore, a modified SAAT leaf disc infiltration approach aids in validating genes and their functions. Together, these methods help to rapidly unravel novel gene functions and are promising tools for P. vulgaris research.

  11. Modeling the efficiency of a magnetic needle for collecting magnetic cells

    NASA Astrophysics Data System (ADS)

    Butler, Kimberly S.; Adolphi, Natalie L.; Bryant, H. C.; Lovato, Debbie M.; Larson, Richard S.; Flynn, Edward R.

    2014-07-01

    As new magnetic nanoparticle-based technologies are developed and new target cells are identified, there is a critical need to understand the features important for magnetic isolation of specific cells in fluids, an increasingly important tool in disease research and diagnosis. To investigate magnetic cell collection, cell-sized spherical microparticles, coated with superparamagnetic nanoparticles, were suspended in (1) glycerine-water solutions, chosen to approximate the range of viscosities of bone marrow, and (2) water in which 3, 5, 10 and 100% of the total suspended microspheres were coated with magnetic nanoparticles, to model collection of rare magnetic nanoparticle-coated cells from a mixture of cells in a fluid. The magnetic microspheres were collected on a magnetic needle, and we demonstrate that the collection efficiency versus time can be modeled using a simple, heuristically derived function with three physically significant parameters. The function enables experimentally obtained collection efficiencies to be scaled to extract the effective drag of the suspending medium. The results of this analysis demonstrate that the effective drag scales linearly with fluid viscosity, as expected. Surprisingly, increasing the number of non-magnetic microspheres in the suspending fluid increases the collection of magnetic microspheres, corresponding to a decrease in the effective drag of the medium.
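    The abstract does not give the three-parameter function itself, so the saturating form below is a hypothetical stand-in, used only to illustrate the fitting workflow by which such a curve yields an effective-drag parameter:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical three-parameter saturating collection curve; the paper's actual
    # function is not given in the abstract and may differ.
    def collection(t, eta_max, tau, t0):
        return eta_max * (1.0 - np.exp(-(t - t0) / tau))

    t = np.linspace(1, 30, 15)  # collection times [min], synthetic example
    eta_obs = collection(t, 0.8, 6.0, 0.5) + np.random.normal(0, 0.01, t.size)

    popt, _ = curve_fit(collection, t, eta_obs, p0=[1.0, 5.0, 0.0])
    print("eta_max, tau, t0 =", popt)   # a timescale like tau would track effective drag
    ```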

  12. Thermal Convection in High-Pressure Ice Layers Beneath a Buried Ocean within Titan and Ganymede

    NASA Astrophysics Data System (ADS)

    Tobie, G.; Choblet, G.; Dumont, M.

    2014-12-01

    Deep interiors of large icy satellites such as Titan and Ganymede probably harbor a buried ocean sandwiched between low-pressure and high-pressure (HP) ice layers. The nature and location of the lower interface of the ocean involve equilibration of heat and melt transfer in the HP ices and are ultimately controlled by the amount of heat transferred through the surface ice Ih layer. Here, we perform 3D simulations of thermal convection, using the OEDIPUS numerical tool (Choblet et al. GJI 2007), to determine the efficiency of heat and mass transfer through these HP ice mantles. In a first series of simulations with no melting, we show that a significant fraction of the HP layer reaches the melting point. Using a simple description of water production and transport, our simulations demonstrate that melt generation in the outermost part of the HP ice layer and its extraction to the overlying ocean increase the efficiency of heat transfer and strongly reduce the internal temperature. Scaling relationships are proposed to describe the cooling effect of melt production/extraction and are used to investigate the consequences of internal melting on the thermal history of the interiors of Titan and Ganymede.

  13. Modeling the Efficiency of a Magnetic Needle for Collecting Magnetic Cells

    PubMed Central

    Butler, Kimberly S; Adolphi, Natalie L.; Bryant, H C; Lovato, Debbie M; Larson, Richard S; Flynn, Edward R

    2014-01-01

    As new magnetic nanoparticle-based technologies are developed and new target cells are identified, there is a critical need to understand the features important for magnetic isolation of specific cells in fluids, an increasingly important tool in disease research and diagnosis. To investigate magnetic cell collection, cell-sized spherical microparticles, coated with superparamagnetic nanoparticles, were suspended in (1) glycerine-water solutions, chosen to approximate the range of viscosities of bone marrow, and (2) water in which 3, 5, 10 and 100% of the total suspended microspheres were coated with magnetic nanoparticles, to model collection of rare magnetic nanoparticle-coated cells from a mixture of cells in a fluid. The magnetic microspheres were collected on a magnetic needle, and we demonstrate that the collection efficiency versus time can be modeled using a simple, heuristically derived function with three physically significant parameters. The function enables experimentally obtained collection efficiencies to be scaled to extract the effective drag of the suspending medium. The results of this analysis demonstrate that the effective drag scales linearly with fluid viscosity, as expected. Surprisingly, increasing the number of non-magnetic microspheres in the suspending fluid increases the collection of magnetic microspheres, corresponding to a decrease in the effective drag of the medium. PMID:24874577

  14. Analytical Tools Interface for Landscape Assessments

    EPA Science Inventory

    Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...

  15. Development of a simplified urban water balance model (WABILA).

    PubMed

    Henrichs, M; Langner, J; Uhl, M

    2016-01-01

    During the last decade, water-sensitive urban design (WSUD) has become more and more accepted. However, no simple tool is available to evaluate the influence of WSUD measures on the local water balance. To counteract the impact of new settlements, planners focus on mitigating increases in runoff through the installation of infiltration systems. This leads to increased non-natural groundwater recharge and decreased evapotranspiration. Simple software tools that evaluate or simulate the effect of WSUD on the local water balance are still needed. The authors developed a tool named WABILA (Wasserbilanz) to support planners in optimizing WSUD. WABILA is an easy-to-use planning tool based on simplified regression functions for established measures and land covers. Results show that WSUD has to be site-specific, based on climate conditions and the natural water balance.

  16. Recent Advances in Genome Editing Using CRISPR/Cas9.

    PubMed

    Ding, Yuduan; Li, Hong; Chen, Ling-Ling; Xie, Kabin

    2016-01-01

    The CRISPR (clustered regularly interspaced short palindromic repeat)-Cas9 (CRISPR-associated nuclease 9) system is a versatile tool for genome engineering that uses a guide RNA (gRNA) to target Cas9 to a specific sequence. This simple RNA-guided genome-editing technology has become a revolutionary tool in biology and has many innovative applications in different fields. In this review, we briefly introduce the Cas9-mediated genome-editing method, summarize the recent advances in CRISPR/Cas9 technology, and discuss their implications for plant research. To date, targeted gene knockout using the Cas9/gRNA system has been established in many plant species, and the targeting efficiency and capacity of Cas9 has been improved by optimizing its expression and that of its gRNA. The CRISPR/Cas9 system can also be used for sequence-specific mutagenesis/integration and transcriptional control of target genes. We also discuss off-target effects and the constraint that the protospacer-adjacent motif (PAM) puts on CRISPR/Cas9 genome engineering. To address these problems, a number of bioinformatic tools are available to help design specific gRNAs, and new Cas9 variants and orthologs with high fidelity and alternative PAM specificities have been engineered. Owing to these recent efforts, the CRISPR/Cas9 system is becoming a revolutionary and flexible tool for genome engineering. Adoption of the CRISPR/Cas9 technology in plant research would enable the investigation of plant biology at an unprecedented depth and create innovative applications in precise crop breeding.

  17. Modeling RNA interference in mammalian cells

    PubMed Central

    2011-01-01

    Background: RNA interference (RNAi) is a regulatory cellular process that controls post-transcriptional gene silencing. During RNAi, double-stranded RNA (dsRNA) induces sequence-specific degradation of homologous mRNA via the generation of smaller dsRNA oligomers 21-23 nt in length (siRNAs). siRNAs are then loaded onto the RNA-Induced Silencing multiprotein Complex (RISC), which uses the siRNA antisense strand to specifically recognize mRNA species that exhibit a complementary sequence. Once the siRNA-loaded RISC binds the target mRNA, the mRNA is cleaved and degraded, and the siRNA-loaded RISC can degrade additional mRNA molecules. Despite the widespread use of siRNAs for gene silencing, and the importance of dosage for its efficiency and for avoiding off-target effects, none of the numerous mathematical models proposed in the literature has been validated to quantitatively capture the effects of RNAi on target mRNA degradation for different concentrations of siRNAs. Here, we address this pressing open problem by performing in vitro experiments of RNAi in mammalian cells and by testing and comparing different mathematical models fitted to the experimental and in silico generated data. We performed in vitro experiments in human and hamster cell lines constitutively expressing EGFP protein or tTA protein, respectively, measuring both mRNA levels, by quantitative real-time PCR, and protein levels, by FACS analysis, for a large range of concentrations of siRNA oligomers. Results: We tested and validated four different mathematical models of RNA interference by quantitatively fitting the models' parameters to best capture the in vitro experimental data. We show that a simple Hill kinetic model is the most efficient way to model RNA interference. Our experimental and modeling findings clearly show that the RNAi-mediated degradation of mRNA is subject to saturation effects. Conclusions: Our model has a simple mathematical form, amenable to analytical investigation, and a small set of parameters with an intuitive physical meaning, which makes it a unique and reliable mathematical tool. The findings presented here will be a useful instrument for better understanding RNAi biology and as a modelling tool in Systems and Synthetic Biology. PMID:21272352
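    A generic Hill-type degradation term of the kind favored by the comparison can be written as follows (an illustrative form; the exact parameterization used in the paper may differ):

    $$ \frac{\mathrm{d}[\mathrm{mRNA}]}{\mathrm{d}t} \;=\; k_{\mathrm{syn}} \;-\; k_{\mathrm{deg}}\,[\mathrm{mRNA}] \;-\; v_{\max}\,\frac{[\mathrm{siRNA}]^{h}}{K^{h}+[\mathrm{siRNA}]^{h}}\,[\mathrm{mRNA}], $$

    where k_syn and k_deg are basal synthesis and degradation rates, and the Hill term saturates at high siRNA dose, consistent with the saturation of silencing reported in the abstract.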

  18. A simplified lumped model for the optimization of post-buckled beam architecture wideband generator

    NASA Astrophysics Data System (ADS)

    Liu, Weiqun; Formosa, Fabien; Badel, Adrien; Hu, Guangdi

    2017-11-01

    Buckled-beam structures are a classical type of bistable energy harvester that has attracted increasing interest because of its capability to scavenge energy over a large frequency band compared with linear generators. The usual modeling approach uses Galerkin mode discretization, which is relatively complex, while the single-mode simplification lacks accuracy. Design ultimately rests on optimizing the features of the energy potential to define the physical and geometrical parameters. Therefore, in this paper, a simple lumped model is proposed, with an explicit relationship between the potential shape and the parameters, to allow efficient design of bistable-beam-based generators. The accuracy of the approximate model is studied and the effectiveness of its application analyzed. Moreover, we find that, at low buckling levels and for a given cross-sectional area, the bending stiffness has little influence on the potential shape. This feature extends the applicable range of the model to designs with a high moment of inertia. Numerical investigations demonstrate that the proposed model is a simple and reliable design tool. An optimization example using the proposed model is presented, with satisfactory performance.
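    A generic lumped bistable oscillator of the kind discussed has a quartic double-well potential; in a standard form (illustrative, not the paper's exact expressions)

    $$ U(x) = -\tfrac{1}{2}k\,x^{2} + \tfrac{1}{4}\alpha\,x^{4}, \qquad x_{\pm} = \pm\sqrt{k/\alpha}, \qquad \Delta U = \frac{k^{2}}{4\alpha}, $$

    where x is the lumped transverse coordinate, x± are the two stable wells and ΔU is the barrier height. The well separation and barrier height are precisely the "potential shape" features that such a model relates explicitly to the buckling level and geometry.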

  19. T-cell libraries allow simple parallel generation of multiple peptide-specific human T-cell clones.

    PubMed

    Theaker, Sarah M; Rius, Cristina; Greenshields-Watson, Alexander; Lloyd, Angharad; Trimby, Andrew; Fuller, Anna; Miles, John J; Cole, David K; Peakman, Mark; Sewell, Andrew K; Dolton, Garry

    2016-03-01

    Isolation of peptide-specific T-cell clones is highly desirable for determining the role of T-cells in human disease, as well as for the development of therapies and diagnostics. However, generation of monoclonal T-cells with the required specificity is challenging and time-consuming. Here we describe a library-based strategy for the simple parallel detection and isolation of multiple peptide-specific human T-cell clones from CD8(+) or CD4(+) polyclonal T-cell populations. T-cells were first amplified by CD3/CD28 microbeads in a 96U-well library format, prior to screening for desired peptide recognition. T-cells from peptide-reactive wells were then subjected to cytokine-mediated enrichment followed by single-cell cloning, with the entire process from sample to validated clone taking as little as 6 weeks. Overall, T-cell libraries represent an efficient and relatively rapid tool for the generation of peptide-specific T-cell clones, with applications shown here in infectious disease (Epstein-Barr virus, influenza A, and Ebola virus), autoimmunity (type 1 diabetes) and cancer. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  20. A univariate model of river water nitrate time series

    NASA Astrophysics Data System (ADS)

    Worrall, F.; Burt, T. P.

    1999-01-01

    Four time series were taken from three catchments in the North and South of England. The sites chosen included two in predominantly agricultural catchments, one at the tidal limit and one downstream of a sewage treatment works. A time series model was constructed for each of these series as a means of decomposing the elements controlling river water nitrate concentrations and of assessing whether this approach could provide a simple management tool for protecting water abstractions. Autoregressive (AR) modelling of the detrended and deseasonalized time series showed a "memory effect". This memory effect expressed itself as an increase in the winter-summer difference in nitrate levels that was dependent upon the nitrate concentration 12 or 6 months previously. Autoregressive moving average (ARMA) modelling showed that one of the series contained seasonal, non-stationary elements that appeared as an increasing trend in the winter-summer difference. The ARMA model was used to predict nitrate levels, and predictions were tested against data held back from the model construction process; predictions gave average percentage errors of less than 10%. Empirical modelling can therefore provide a simple, efficient method for constructing management models for downstream water abstraction.
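    A workflow of this kind (fit a seasonal autoregressive model, hold back data, forecast) is easy to reproduce; the sketch below uses synthetic monthly data and assumed model orders, not the paper's series or parameters:

    ```python
    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # Illustrative seasonal AR fit for a monthly nitrate-like series (synthetic data).
    rng = np.random.default_rng(0)
    months = np.arange(120)
    nitrate = 5 + 2 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.5, 120)

    # AR(1) with a seasonal AR(1) term at lag 12, mimicking the 12-month "memory".
    model = SARIMAX(nitrate, order=(1, 0, 0), seasonal_order=(1, 0, 0, 12))
    result = model.fit(disp=False)
    forecast = result.forecast(steps=12)   # hold-back-style check: predict a year ahead
    print(forecast)
    ```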

  1. Engineering modular ‘ON’ RNA switches using biological components

    PubMed Central

    Ceres, Pablo; Trausch, Jeremiah J.; Batey, Robert T.

    2013-01-01

    Riboswitches are cis-acting regulatory elements broadly distributed in bacterial mRNAs that control a wide range of critical metabolic activities. Expression is governed by two distinct domains within the mRNA leader: a sensory ‘aptamer domain’ and a regulatory ‘expression platform’. Riboswitches have also received considerable attention as important tools in synthetic biology because of their conceptually simple structure and the ability to obtain aptamers that bind almost any conceivable small molecule using in vitro selection (referred to as SELEX). In the design of artificial riboswitches, a significant hurdle has been to couple the two domains enabling their efficient communication. We previously demonstrated that biological transcriptional ‘OFF’ expression platforms are easily coupled to diverse aptamers, both biological and SELEX-derived, using simple design rules. Here, we present two modular transcriptional ‘ON’ riboswitch expression platforms that are also capable of hosting foreign aptamers. We demonstrate that these biological parts can be used to facilely generate artificial chimeric riboswitches capable of robustly regulating transcription both in vitro and in vivo. We expect that these modular expression platforms will be of great utility for various synthetic biological applications that use RNA-based biosensors. PMID:23999097

  2. Generation of a natural glycan microarray using 9-fluorenylmethyl chloroformate (FmocCl) as a cleavable fluorescent tag.

    PubMed

    Song, Xuezheng; Lasanajak, Yi; Rivera-Marrero, Carlos; Luyai, Anthony; Willard, Margaret; Smith, David F; Cummings, Richard D

    2009-12-15

    Glycan microarray technology has become a successful tool for studying protein-carbohydrate interactions, but a limitation has been the laborious synthesis of glycan structures by enzymatic and chemical methods. Here we describe a new method to generate quantifiable glycan libraries from natural sources by combining widely used protease digestion of glycoproteins and Fmoc chemistry. Glycoproteins including chicken ovalbumin, bovine fetuin, and horseradish peroxidase (HRP) were digested by Pronase, protected by FmocCl, and efficiently separated by 2D-HPLC. We show that glycans from HRP glycopeptides separated by HPLC and fluorescence monitoring retained their natural reducing end structures, mostly core alpha1,3-fucose and core alpha1,2-xylose. After simple Fmoc deprotection, the glycans were printed on NHS-activated glass slides. The glycans were interrogated using plant lectins and antibodies in sera from mice infected with Schistosoma mansoni, which revealed the presence of both IgM and IgG antibody responses to HRP glycopeptides. This simple approach to glycopeptide purification and conjugation allows for the development of natural glycopeptide microarrays without the need to remove and derivatize glycans and potentially compromise their reducing end determinants.

  3. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows information about a high-dimensional dynamical system to be extracted through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow nontrivial properties of coupled chaotic maps to be extracted and quantified, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
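    One widely used series-to-network mapping in this family is the natural visibility graph, where each time point is a node and two points are linked if the straight line between them clears all intermediate values; a naive O(n²) sketch for a single univariate layer follows (the multiplex construction would build one such layer per signal):

    ```python
    import numpy as np

    # Naive natural-visibility-graph construction for one univariate series.
    def visibility_edges(y):
        n, edges = len(y), []
        for i in range(n):
            for j in range(i + 1, n):
                # i "sees" j if every intermediate point lies below the line i--j
                if all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                       for k in range(i + 1, j)):
                    edges.append((i, j))
        return edges

    series = np.random.default_rng(1).random(50)
    print(len(visibility_edges(series)), "edges")
    ```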

  4. The JASMIN Analysis Platform - bridging the gap between traditional climate data practices and data-centric analysis paradigms

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Iwi, Alan; kershaw, philip; Stephens, Ag; Lawrence, Bryan

    2014-05-01

    The advent of large-scale data and the consequent analysis problems have led to two new challenges for the research community: how to share such data to get the maximum value, and how to carry out efficient analysis. Solving both challenges requires a form of parallelisation: the first is social parallelisation (involving trust and information sharing), the second data parallelisation (involving new algorithms and tools). The JASMIN infrastructure supports both kinds of parallelism by providing a multi-tenant environment with petabyte-scale storage, VM provisioning and batch cluster facilities. The JASMIN Analysis Platform (JAP) is an analysis software layer for JASMIN which emphasises ease of transition from a researcher's local environment to JASMIN. JAP brings together tools traditionally used by multiple communities and configures them to work together, enabling users to move analysis from their local environment to JASMIN without rewriting code. JAP also provides facilities to exploit JASMIN's parallel capabilities whilst maintaining a familiar analysis environment wherever possible. Modern open-source analysis tools typically have multiple dependent packages, increasing the installation burden on system administrators. When a suite of tools is considered, often with both common and conflicting dependencies, analysis pipelines can become locked to a particular installation simply because of the effort required to reconstruct the dependency tree. JAP addresses this problem by providing a consistent suite of RPMs compatible with RedHat Enterprise Linux and CentOS 6.4. Researchers can install JAP locally, either as RPMs or through a pre-built VM image, giving them confidence that moving analysis to JASMIN will not disrupt their environment. Analysis parallelisation is in its infancy in the climate sciences, with few tools capable of exploiting any parallel environment beyond manual scripting across multiple processors. JAP begins to bridge this gap through a variety of higher-level tools for parallelisation and job scheduling, such as IPython-parallel and MPI support for interactive analysis languages. We find that enabling even simple parallelisation of workflows, together with the state-of-the-art I/O performance of JASMIN storage, provides many users with the large increases in efficiency they need to scale their analyses to contemporary data volumes and tackle new, previously inaccessible, problems.
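    The kind of "simple parallelisation" mentioned can be illustrated with IPython-parallel; the sketch below assumes a running ipyparallel cluster, and the per-task function is a placeholder, not a JAP-supplied API:

    ```python
    # Minimal ipyparallel sketch of fanning an analysis out over cluster engines.
    import ipyparallel as ipp

    rc = ipp.Client()                  # connect to a running IPython cluster
    view = rc.load_balanced_view()     # schedule tasks across available engines

    def process_year(year):
        # placeholder for a per-year climate analysis (open files, compute stats, ...)
        return year, year % 7

    results = view.map_sync(process_year, range(1979, 2014))
    print(results[:3])
    ```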

  5. A student-centered approach for developing active learning: the construction of physical models as a teaching tool in medical physiology.

    PubMed

    Rezende-Filho, Flávio Moura; da Fonseca, Lucas José Sá; Nunes-Souza, Valéria; Guedes, Glaucevane da Silva; Rabelo, Luiza Antas

    2014-09-15

    Teaching physiology, a complex and constantly evolving subject, is not a simple task. A considerable body of knowledge about cognitive processes and teaching and learning methods has accumulated over the years, helping teachers to determine the most efficient way to teach and highlighting students' active participation as a means to improve learning outcomes. In this context, this paper describes and qualitatively analyzes an experience of a student-centered teaching-learning methodology based on the construction of physiological-physical models, focusing on their possible application in the practice of teaching physiology. After attending physiology classes and reviewing the literature, students, divided into small groups, built physiological-physical models, predominantly using low-cost materials, for studying different topics in physiology. Groups were followed by monitors and guided by teachers during the whole process, finally presenting the results in a Symposium on Integrative Physiology. Over the course of the proposed activities, students efficiently created physiological-physical models (118 in total) highly representative of different physiological processes. The implementation of the proposal indicated that students successfully achieved active and meaningful learning in physiology while addressing multiple learning styles. The proposed method has proved to be an attractive, accessible and relatively simple approach to facilitating the physiology teaching-learning process, while facing the difficulties imposed by recent requirements, especially those relating to the use of experimental animals and professional training guidelines. Finally, students' active participation in the production of knowledge may result in a holistic education and, possibly, better professional practices.

  6. Vapor-Liquid Sol-Gel Approach to Fabricating Highly Durable and Robust Superhydrophobic Polydimethylsiloxane@Silica Surface on Polyester Textile for Oil-Water Separation.

    PubMed

    Su, Xiaojing; Li, Hongqiang; Lai, Xuejun; Zhang, Lin; Wang, Jing; Liao, Xiaofeng; Zeng, Xingrong

    2017-08-23

    Large-scale fabrication of superhydrophobic surfaces with excellent durability by simple techniques has been of considerable interest in recent years for its urgent practical application in oil-water separation. Herein, we proposed a facile vapor-liquid sol-gel approach to fabricating highly durable and robust superhydrophobic polydimethylsiloxane@silica surfaces on cross-structure polyester textiles. Scanning electron microscopy and Fourier transform infrared spectroscopy demonstrated that the silica generated from the hydrolysis-condensation of tetraethyl orthosilicate (TEOS) gradually aggregated at the microscale, driven by the extremely nonpolar dihydroxyl-terminated polydimethylsiloxane (PDMS(OH)). This led to the construction of hierarchical roughness and micro-nano structures on the superhydrophobic textile surface. The as-fabricated superhydrophobic textile possessed outstanding durability in deionized water, various solvents, strong acid/base solutions, and boiling/ice water. Remarkably, the polyester textile retained great water repellency even after ultrasonic treatment for 18 h, 96 laundering cycles, and 600 abrasion cycles, exhibiting excellent mechanical robustness. Importantly, the superhydrophobic polyester textile was further applied to oil-water separation as an absorption material and/or filter pipe, presenting high separation efficiency and great reusability. Our method to construct superhydrophobic textiles is simple but highly efficient; no special equipment, chemicals, or atmosphere is required. Additionally, no fluorinated silanes or organic solvents are involved, which is very beneficial for environmental safety and protection. Our findings conceivably stand out as a new tool to fabricate organic-inorganic superhydrophobic surfaces with strong durability and robustness for practical applications in oil spill accidents and industrial sewage emission.

  7. Volatile-Compound Fingerprinting by Headspace-Gas-Chromatography Ion-Mobility Spectrometry (HS-GC-IMS) as a Benchtop Alternative to 1H NMR Profiling for Assessment of the Authenticity of Honey.

    PubMed

    Gerhardt, Natalie; Birkenmeier, Markus; Schwolow, Sebastian; Rohn, Sascha; Weller, Philipp

    2018-02-06

    This work describes a simple approach for the untargeted profiling of volatile compounds for the authentication of the botanical origins of honey based on resolution-optimized HS-GC-IMS combined with optimized chemometric techniques, namely PCA, LDA, and kNN. A direct comparison of the PCA-LDA models between the HS-GC-IMS and 1H NMR data demonstrated that HS-GC-IMS profiling could be used as a complementary tool to NMR-based profiling of honey samples. Whereas NMR profiling still requires comparatively precise sample preparation, pH adjustment in particular, HS-GC-IMS fingerprinting may be considered an alternative approach for a truly fully automatable, cost-efficient, and in particular highly sensitive method. It was demonstrated that all tested honey samples could be distinguished on the basis of their botanical origins. Loading plots revealed the volatile compounds responsible for the differences among the monofloral honeys. The HS-GC-IMS-based PCA-LDA model was composed of two linear functions of discrimination and 10 selected PCs that discriminated canola, acacia, and honeydew honeys with a predictive accuracy of 98.6%. Application of the LDA model to an external test set of 10 authentic honeys clearly proved the high predictive ability of the model by correctly classifying them into three variety groups with 100% correct classifications. The constructed model presents a simple and efficient method of analysis and may serve as a basis for the authentication of other food types.

  8. A Fixed-Wing Aircraft Simulation Tool for Improving the Efficiency of DoD Acquisition

    DTIC Science & Technology

    2015-10-05

    Kestrel is a multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. It complements CREATE(TM)-AV Helios [12-14], a high-fidelity rotary-wing vehicle simulation tool, and CREATE(TM)-AV DaVinci [15-16], a conceptual-through-... design tool. (Scott A. Morton and David R..., Oct 2008-Sep 2015.)

  9. Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs

    NASA Astrophysics Data System (ADS)

    Ringenburg, Michael F.

    Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online debugging tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online monitoring tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real-time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and auto-tuning tools allow us to experiment with approximation strategies and identify new strategies; our online tools succeed in providing new insights into the effects of approximation on output quality; and our monitors succeed in controlling output quality while still maintaining significant energy-efficiency gains.
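
    The low-cost online monitoring idea can be illustrated with a simple sampling wrapper: on a small random fraction of calls, the precise version is also run and the error recorded. This is a generic sketch of the concept, not one of the thesis's actual tools:

        import random

        def monitored(approx_fn, precise_fn, sample_rate=0.01):
            """Wrap an approximate function with a sampling quality monitor.
            Only sample_rate of calls pay for the precise computation, so
            the monitoring overhead stays far below the approximation savings."""
            errors = []
            def wrapper(x):
                y = approx_fn(x)
                if random.random() < sample_rate:
                    errors.append(abs(y - precise_fn(x)))  # observed error
                return y
            wrapper.errors = errors  # inspect to adjust approximation level
            return wrapper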

  10. Tracking of Indels by DEcomposition is a Simple and Effective Method to Assess Efficiency of Guide RNAs in Zebrafish.

    PubMed

    Etard, Christelle; Joshi, Swarnima; Stegmaier, Johannes; Mikut, Ralf; Strähle, Uwe

    2017-12-01

    A bottleneck in CRISPR/Cas9 genome editing is variable efficiencies of in silico-designed gRNAs. We evaluated the sensitivity of the TIDE method (Tracking of Indels by DEcomposition) introduced by Brinkman et al. in 2014 for assessing the cutting efficiencies of gRNAs in zebrafish. We show that this simple method, which involves bulk polymerase chain reaction amplification and Sanger sequencing, is highly effective in tracking well-performing gRNAs in pools of genomic DNA derived from injected embryos. The method is equally effective for tracing INDELs in heterozygotes.
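
    At its core, TIDE-style decomposition is a non-negative linear regression: the edited pool's Sanger signal downstream of the cut site is modeled as a mixture of indel-shifted copies of the control signal. A heavily simplified sketch of that idea (construction of the per-position signals from chromatograms, boundary handling, and quality trimming are all omitted):

        import numpy as np
        from scipy.optimize import nnls

        def indel_spectrum(control_trace, edited_trace, max_indel=10):
            """Estimate indel-size fractions by non-negative least squares.
            Inputs are 1-D arrays of per-position peak heights downstream
            of the expected cut site."""
            n = min(len(control_trace), len(edited_trace))
            shifts = list(range(-max_indel, max_indel + 1))
            # Column j is the control signal shifted by shifts[j] positions
            # (np.roll wraps around; a simplification for this sketch).
            A = np.column_stack([np.roll(control_trace[:n], s) for s in shifts])
            x, _residual = nnls(A, edited_trace[:n])
            x /= x.sum()  # fractions of the pool attributed to each indel size
            return dict(zip(shifts, x))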

  11. Development of High Efficiency (14%) Solar Cell Array Module

    NASA Technical Reports Server (NTRS)

    Iles, P. A.; Khemthong, S.; Olah, S.; Sampson, W. J.; Ling, K. S.

    1979-01-01

    The high efficiency solar cells required for the low cost modules were developed, and the production tooling for the manufacture of the cells and modules was designed. The tooling consisted of: (1) a back contact soldering machine; (2) a vacuum pickup; (3) antireflective coating tooling; and (4) a test fixture.

  12. How to Compute a Slot Marker - Calculation of Controller Managed Spacing Tools for Efficient Descents with Precision Scheduling

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas

    2012-01-01

    This paper describes the underlying principles and algorithms for computing the primary controller managed spacing (CMS) tools developed at NASA for precisely spacing aircraft along efficient descent paths. The trajectory-based CMS tools include slot markers, delay indications and speed advisories. These tools are one of three core NASA technologies integrated in NASA's ATM Technology Demonstration-1 (ATD-1), which will operationally demonstrate the feasibility of fuel-efficient, high-throughput arrival operations using Automatic Dependent Surveillance Broadcast (ADS-B) and ground-based and airborne NASA technologies for precision scheduling and spacing.

  13. Power and Efficiency.

    ERIC Educational Resources Information Center

    Boyd, James N.

    1991-01-01

    Presents a mathematical problem that, when examined and generalized, develops the relationships between power and efficiency in energy transfer. Offers four examples of simple electrical and mechanical systems to illustrate the principle that maximum power occurs at 50 percent efficiency. (MDH)
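
    The 50-percent principle mentioned here is presumably the classical maximum power transfer theorem. For a source of EMF $\varepsilon$ with internal resistance $r$ driving a load $R$:

        \[
          P(R) = \frac{\varepsilon^2 R}{(R + r)^2}, \qquad
          \eta(R) = \frac{R}{R + r},
        \]

    so $P$ is maximized at $R = r$, where the efficiency is $\eta = \tfrac{1}{2}$.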

  14. External validation of a simple clinical tool used to predict falls in people with Parkinson disease

    PubMed Central

    Duncan, Ryan P.; Cavanaugh, James T.; Earhart, Gammon M.; Ellis, Terry D.; Ford, Matthew P.; Foreman, K. Bo; Leddy, Abigail L.; Paul, Serene S.; Canning, Colleen G.; Thackeray, Anne; Dibble, Leland E.

    2015-01-01

    Background: Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time-consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. Methods: We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. Results: The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76-0.89), comparable to the developmental study. Conclusion: The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. PMID:26003412

  15. External validation of a simple clinical tool used to predict falls in people with Parkinson disease.

    PubMed

    Duncan, Ryan P; Cavanaugh, James T; Earhart, Gammon M; Ellis, Terry D; Ford, Matthew P; Foreman, K Bo; Leddy, Abigail L; Paul, Serene S; Canning, Colleen G; Thackeray, Anne; Dibble, Leland E

    2015-08-01

    Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76-0.89), comparable to the developmental study. The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. Copyright © 2015 Elsevier Ltd. All rights reserved.
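
    The three-predictor tool lends itself to a very small screening function. A sketch that simply counts positive predictors (the published tool derives a fall probability from regression coefficients not reproduced here; the two-predictor cutoff below is a hypothetical illustration):

        def positive_fall_predictors(fell_last_year, fog_last_month, gait_velocity_m_s):
            """Count the tool's three predictors: a fall in the previous year,
            freezing of gait in the past month, and gait velocity < 1.1 m/s."""
            return sum([
                bool(fell_last_year),
                bool(fog_last_month),
                gait_velocity_m_s < 1.1,
            ])

        # Hypothetical screening rule: flag patients with two or more predictors.
        flag = positive_fall_predictors(True, False, 1.0) >= 2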

  16. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
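
    The pooled-effect computation the abstract describes, a DerSimonian-Laird estimate of between-study variance with the Knapp-Hartung adjustment, fits in a few lines. A minimal sketch of those formulas, not Meta-Essentials itself:

        import numpy as np
        from scipy import stats

        def random_effects_meta(y, v, alpha=0.05):
            """Pooled effect with DerSimonian-Laird tau^2 and a
            Knapp-Hartung t-based confidence interval.
            y: effect sizes; v: their within-study variances."""
            y, v = np.asarray(y, float), np.asarray(v, float)
            k = len(y)
            w = 1.0 / v                                  # fixed-effect weights
            q_stat = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q_stat - (k - 1)) / c)      # DL between-study variance
            ws = 1.0 / (v + tau2)                        # random-effects weights
            mu = np.sum(ws * y) / np.sum(ws)             # pooled effect
            # Knapp-Hartung variance with a t(k-1) critical value
            se = np.sqrt(np.sum(ws * (y - mu) ** 2) / ((k - 1) * np.sum(ws)))
            t_crit = stats.t.ppf(1 - alpha / 2, k - 1)
            return mu, (mu - t_crit * se, mu + t_crit * se)

        # Example: three studies' effects and variances (made-up numbers)
        print(random_effects_meta([0.2, 0.5, 0.35], [0.04, 0.09, 0.05]))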

  17. Benefits and Pitfalls: Simple Guidelines for the Use of Social Networking Tools in K-12 Education

    ERIC Educational Resources Information Center

    Huffman, Stephanie

    2013-01-01

    The article will outline a framework for the use of social networking tools in K-12 education framed around four thought-provoking questions: 1) what are the benefits and pitfalls of using social networking tools in P-12 education, 2) how do we plan effectively for the use of social networking tools, 3) what role does professional development play…

  18. The Simple Concurrent Online Processing System (SCOPS) - An open-source interface for remotely sensed data processing

    NASA Astrophysics Data System (ADS)

    Warren, M. A.; Goult, S.; Clewley, D.

    2018-06-01

    Advances in technology allow remotely sensed data to be acquired with increasingly higher spatial and spectral resolutions. These data may then be used to influence government decision making and solve a number of research- and application-driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. Often the software required to process data is varied and can be highly technical and too advanced for the novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain that allows users accessing the tool over a web interface to submit jobs and process data remotely. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free- and open-source tools to take radiometrically corrected data from sensor geometry into geocorrected form and to generate simple or complex band ratio products. The final processed data products are acquired via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network using a simple-to-use web interface.

  19. A numerical tool for reproducing driver behaviour: experiments and predictive simulations.

    PubMed

    Casucci, M; Marchitto, M; Cacciabue, P C

    2010-03-01

    This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions, contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a Virtual Reality full-scale simulator for validating the simulation. Then the predictive potentiality of the tool is shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions.

  20. Simple and Efficient Trap for Bark and Ambrosia Beetles (Coleoptera: Curculionidae) to Facilitate Invasive Species Monitoring and Citizen Involvement.

    PubMed

    Steininger, M S; Hulcr, J; Šigut, M; Lucky, A

    2015-06-01

    Bark and ambrosia beetles (Coleoptera: Curculionidae: Scolytinae & Platypodinae) are among the most damaging forest pests worldwide, and monitoring is essential to damage prevention. Unfortunately, traps and attractants that are currently used are costly, and agencies rely on limited field personnel for deployment. The situation can be greatly aided by 1) the development of cost-effective trapping techniques, and 2) distribution of the effort through the Citizen Science approach. The goal of this study was to test a simple, effective trap that can be made and deployed by anyone interested in collecting bark and ambrosia beetles. Three trap types made from 2-liter soda bottles and, separately, four attractants were compared. Simple one-window traps captured species comparably to painted traps and traps with multiple windows. A comparison of attractants in two-window traps found that 95% ethanol attracted the highest number of species but that Purell hand sanitizer (70% ethanol) and then Germ-X hand sanitizer (63% ethanol) were also effective. A perforated zip-top plastic bag containing Purell hanging over a trap filled with automobile antifreeze attracted the fewest species and individual specimens. Overall, >4,500 bark and ambrosia beetles representing 30 species were captured, a third of the regional species diversity. More than three quarters of the specimens were nonnative, representing nearly half of the known regional exotic species. These results suggest that simple one-window soda bottle traps baited with ethanol-based hand sanitizer will be effective and inexpensive tools for large-scale monitoring of bark and ambrosia beetles. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Diagnosis of Late-Stage, Early-Onset, Small-Fiber Polyneuropathy

    DTIC Science & Technology

    2016-10-01

    ...develop biotechnology tools for simple diagnosis (sweat testing and pupillometry); 3) identify gene polymorphisms to detect risk for SFPN... Specific Aim 2: to develop and evaluate simple biotechnology devices for diagnosing and monitoring longstanding eoSFPN...

  2. Simple Retrofit High-Efficiency Natural Gas Water Heater Field Test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoenbauer, Ben

    High-performance water heaters are typically more time-consuming and costly to install in retrofit applications, making them difficult to justify economically. However, recent advancements in high-performance water heaters have targeted the retrofit market, simplifying installations and reducing costs. Four high-efficiency natural gas water heaters designed specifically for retrofit applications were installed in single-family homes along with detailed monitoring systems to characterize their savings potential, their installed efficiencies, and their ability to meet household demands. The water heaters tested for this project were designed to improve the cost-effectiveness and increase market penetration of high-efficiency water heaters in the residential retrofit market. The retrofit high-efficiency water heaters achieved their goal of reducing costs, maintaining the savings potential and installed efficiency of other high-efficiency water heaters, and meeting the necessary capacity. However, the improvements were not sufficient to achieve simple paybacks of less than ten years for the incremental cost compared to a minimum-efficiency heater. Significant changes would be necessary to reduce the simple payback to six years or less. Annual energy savings in the range of $200 would also reduce paybacks to less than six years. These energy savings would require either significantly higher fuel costs (greater than $1.50 per therm) or very high usage (around 120 gallons per day). For current incremental costs, the water heater efficiency would need to be similar to that of a heat pump water heater to deliver a six-year payback.
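
    The economics here reduce to the simple-payback formula: payback = incremental cost / annual savings. A worked illustration (the $1,200 incremental cost is a hypothetical figure chosen to match the abstract's $200-savings, six-year example):

        # Illustrative figures only; the abstract cites ~$200/yr savings and
        # a six-year target, implying roughly a $1,200 incremental cost.
        incremental_cost = 1200.0             # $ premium over a minimum-efficiency unit
        annual_savings = 200.0                # $ per year
        simple_payback = incremental_cost / annual_savings
        print(f"{simple_payback:.1f} years")  # -> 6.0 years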

  3. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based deduction has recently emerged as a vital component in AI's toolbox for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  4. Images as embedding maps and minimal surfaces: Movies, color, and volumetric medical images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimmel, R.; Malladi, R.; Sochen, N.

    A general geometrical framework for image processing is presented. The authors consider intensity images as surfaces in the (x,I) space. The image is thereby a two dimensional surface in three dimensional space for gray level images. The new formulation unifies many classical schemes, algorithms, and measures via choices of parameters in a "master" geometrical measure. More important, it is a simple and efficient tool for the design of natural schemes for image enhancement, segmentation, and scale space. Here the authors give the basic motivation and apply the scheme to enhance images. They present the concept of an image as a surface in dimensions higher than the three dimensional intuitive space. This will help them handle movies, color, and volumetric medical images.

  5. FISH-in-CHIPS: A Microfluidic Platform for Molecular Typing of Cancer Cells.

    PubMed

    Perez-Toralla, Karla; Mottet, Guillaume; Tulukcuoglu-Guneri, Ezgi; Champ, Jérôme; Bidard, François-Clément; Pierga, Jean-Yves; Klijanienko, Jerzy; Draskovic, Irena; Malaquin, Laurent; Viovy, Jean-Louis; Descroix, Stéphanie

    2017-01-01

    Microfluidics offer powerful tools for the control, manipulation, and analysis of cells, in particular for the assessment of cell malignancy or the study of cell subpopulations. However, implementing complex biological protocols on chip remains a challenge. Sample preparation is often performed off chip using multiple manually performed steps, and protocols usually include different dehydration and drying steps that are not always compatible with a microfluidic format. Here, we report the implementation of a Fluorescence in situ Hybridization (FISH) protocol for the molecular typing of cancer cells in a simple and low-cost device. The geometry of the chip allows integrating the sample preparation steps to efficiently assess the genomic content of individual cells using a minute amount of sample. The FISH protocol can be fully automated, thus enabling its use in routine clinical practice.

  6. Mining and Development of Novel SSR Markers Using Next Generation Sequencing (NGS) Data in Plants.

    PubMed

    Taheri, Sima; Lee Abdullah, Thohirah; Yusop, Mohd Rafii; Hanafi, Mohamed Musa; Sahebi, Mahbod; Azizi, Parisa; Shamshiri, Redmond Ramin

    2018-02-13

    Microsatellites, or simple sequence repeats (SSRs), are one of the most informative and multi-purpose genetic markers exploited in plant functional genomics. However, the discovery of SSRs and development using traditional methods are laborious, time-consuming, and costly. Recently, the availability of high-throughput sequencing technologies has enabled researchers to identify a substantial number of microsatellites at less cost and effort than traditional approaches. Illumina is a noteworthy transcriptome sequencing technology that is currently used in SSR marker development. Although 454 pyrosequencing datasets can be used for SSR development, this type of sequencing is no longer supported. This review aims to present an overview of the next generation sequencing, with a focus on the efficient use of de novo transcriptome sequencing (RNA-Seq) and related tools for mining and development of microsatellites in plants.

  7. Conversion of the agent-oriented domain-specific language ALAS into JavaScript

    NASA Astrophysics Data System (ADS)

    Sredojević, Dejan; Vidaković, Milan; Okanović, Dušan; Mitrović, Dejan; Ivanović, Mirjana

    2016-06-01

    This paper shows the generation of JavaScript code from code written in the agent-oriented domain-specific language ALAS. ALAS is a language for writing software agents that are executed within the XJAF middleware. Since the agents can be executed on various platforms, they must be converted into a language of the target platform. We also utilize existing tools and technologies to make the whole conversion process as simple, fast, and efficient as possible. We use the Xtext framework, which is compatible with Java, to implement the ALAS infrastructure - the editor and the code generator. Since Xtext supports Java, generation of Java code from ALAS code is straightforward. To generate JavaScript code that will be executed within the target JavaScript XJAF implementation, the Google Web Toolkit (GWT) is used.

  8. Retinal artery-vein classification via topology estimation

    PubMed Central

    Estrada, Rolando; Allingham, Michael J.; Mettu, Priyatham S.; Cousins, Scott W.; Tomasi, Carlo; Farsiu, Sina

    2015-01-01

    We propose a novel, graph-theoretic framework for distinguishing arteries from veins in a fundus image. We make use of the underlying vessel topology to better classify small and midsized vessels. We extend our previously proposed tree topology estimation framework by incorporating expert, domain-specific features to construct a simple, yet powerful global likelihood model. We efficiently maximize this model by iteratively exploring the space of possible solutions consistent with the projected vessels. We tested our method on four retinal datasets and achieved classification accuracies of 91.0%, 93.5%, 91.7%, and 90.9%, outperforming existing methods. Our results show the effectiveness of our approach, which is capable of analyzing the entire vasculature, including peripheral vessels, in wide field-of-view fundus photographs. This topology-based method is a potentially important tool for diagnosing diseases with retinal vascular manifestation. PMID:26068204

  9. Real-Time XRD Studies of Li-O2 Electrochemical Reaction in Nonaqueous Lithium-Oxygen Battery.

    PubMed

    Lim, Hyunseob; Yilmaz, Eda; Byon, Hye Ryung

    2012-11-01

    Understanding of the electrochemical processes in rechargeable Li-O2 batteries has suffered from the lack of proper analytical tools, especially for identifying the chemical species and the number of electrons involved in the discharge/recharge process. Here we present a simple and straightforward analytical method for simultaneously attaining chemical and quantified information on Li2O2 (the discharge product) and byproducts using in situ XRD measurement. By real-time monitoring of the solid-state Li2O2 peak area, the accurate efficiency of Li2O2 formation and the number of electrons can be evaluated during full discharge. Furthermore, by observing the sequential area change of the Li2O2 peak during recharge, we found nonlinearity of the Li2O2 decomposition rate for the first time in an ether-based electrolyte.

  10. Assessing and benchmarking multiphoton microscopes for biologists

    PubMed Central

    Corbin, Kaitlin; Pinkard, Henry; Peck, Sebastian; Beemiller, Peter; Krummel, Matthew F.

    2017-01-01

    Multiphoton microscopy has become a staple tool for tracking cells within tissues and organs due to its superior depth of penetration, low excitation volumes, and reduced phototoxicity. Many factors, ranging from laser pulse width to relay optics to detectors and electronics, contribute to the overall ability of these microscopes to excite and detect fluorescence deep within tissues. However, we have found few standard ways described in the literature to distinguish between microscopes or to benchmark existing microscopes to measure the overall quality and efficiency of these instruments. Here, we discuss some simple parameters and methods that can be used either within a multiphoton facility or by a prospective purchaser to benchmark performance. This can both assist in identifying decay in microscope performance and in choosing features of a scope that are suited to experimental needs. PMID:24974026

  11. Photochemical Approaches to Complex Chemotypes: Applications in Natural Product Synthesis

    PubMed Central

    2016-01-01

    The use of photochemical transformations is a powerful strategy that allows for the formation of a high degree of molecular complexity from relatively simple building blocks in a single step. A central feature of all light-promoted transformations is the involvement of electronically excited states, generated upon absorption of photons. This produces transient reactive intermediates and significantly alters the reactivity of a chemical compound. The input of energy provided by light thus offers a means to produce strained and unique target compounds that cannot be assembled using thermal protocols. This review aims at highlighting photochemical transformations as a tool for rapidly accessing structurally and stereochemically diverse scaffolds. Synthetic designs based on photochemical transformations have the potential to afford complex polycyclic carbon skeletons with impressive efficiency, which are of high value in total synthesis. PMID:27120289

  12. Path generation algorithm for UML graphic modeling of aerospace test software

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software testing engineers rely on their own experience and on communication with software developers to describe the software under test and to write test cases by hand, a process that is time-consuming, inefficient, and prone to gaps. Using the high-reliability model-based testing (MBT) tool developed by our company, a single modeling pass can automatically generate test case documents, efficiently and accurately. Describing a process accurately with a UML model depends on generating the paths through it, but existing path generation algorithms are either too simple, unable to combine branch paths with loops into a single path, or too cumbersome, generating arrangements of paths that are meaningless and superfluous for aerospace software testing. Drawing on our accumulated aerospace experience, we developed a tailored path generation algorithm for UML graphical models of aerospace test software.

  13. Simple Nutrition Screening Tool for Pediatric Inpatients.

    PubMed

    White, Melinda; Lawson, Karen; Ramsey, Rebecca; Dennis, Nicole; Hutchinson, Zoe; Soh, Xin Ying; Matsuyama, Misa; Doolan, Annabel; Todd, Alwyn; Elliott, Aoife; Bell, Kristie; Littlewood, Robyn

    2016-03-01

    Pediatric nutrition risk screening tools are not routinely implemented throughout many hospitals, despite prevalence studies demonstrating malnutrition is common in hospitalized children. Existing tools lack the simplicity of those used to assess nutrition risk in the adult population. This study reports the accuracy of a new, quick, and simple pediatric nutrition screening tool (PNST) designed to be used for pediatric inpatients. The pediatric Subjective Global Nutrition Assessment (SGNA) and anthropometric measures were used to develop and assess the validity of 4 simple nutrition screening questions comprising the PNST. Participants were pediatric inpatients in 2 tertiary pediatric hospitals and 1 regional hospital. Two affirmative answers to the PNST questions were found to maximize the specificity and sensitivity to the pediatric SGNA and body mass index (BMI) z scores for malnutrition in 295 patients. The PNST identified 37.6% of patients as being at nutrition risk, whereas the pediatric SGNA identified 34.2%. The sensitivity and specificity of the PNST compared with the pediatric SGNA were 77.8% and 82.1%, respectively. The sensitivity of the PNST at detecting patients with a BMI z score of less than -2 was 89.3%, and the specificity was 66.2%. Both the PNST and pediatric SGNA were relatively poor at detecting patients who were stunted or overweight, with the sensitivity and specificity being less than 69%. The PNST provides a sensitive, valid, and simpler alternative to existing pediatric nutrition screening tools such as Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP), Screening Tool Risk on Nutritional status and Growth (STRONGkids), and Paediatric Yorkhill Malnutrition Score (PYMS) to ensure the early detection of hospitalized children at nutrition risk. © 2014 American Society for Parenteral and Enteral Nutrition.
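
    The decision rule reported for the PNST, two or more affirmative answers out of four, is trivially expressible in code; the question wording itself is given in the paper and not reproduced here:

        def pnst_at_risk(answers):
            """answers: four booleans, one per PNST screening question.
            Two or more affirmative answers flag nutrition risk."""
            assert len(answers) == 4
            return sum(bool(a) for a in answers) >= 2

        print(pnst_at_risk([True, True, False, False]))  # -> True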

  14. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA's 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.

  15. Equivalent circuit models for interpreting impedance perturbation spectroscopy data

    NASA Astrophysics Data System (ADS)

    Smith, R. Lowell

    2004-07-01

    As in-situ structural integrity monitoring disciplines mature, there is a growing need to process sensor/actuator data efficiently in real time. Although smaller, faster embedded processors will contribute to this, it is also important to develop straightforward, robust methods to reduce the overall computational burden for practical applications of interest. This paper addresses the use of equivalent circuit modeling techniques for inferring structure attributes monitored using impedance perturbation spectroscopy. In pioneering work about ten years ago significant progress was associated with the development of simple impedance models derived from the piezoelectric equations. Using mathematical modeling tools currently available from research in ultrasonics and impedance spectroscopy is expected to provide additional synergistic benefits. For purposes of structural health monitoring the objective is to use impedance spectroscopy data to infer the physical condition of structures to which small piezoelectric actuators are bonded. Features of interest include stiffness changes, mass loading, and damping or mechanical losses. Equivalent circuit models are typically simple enough to facilitate the development of practical analytical models of the actuator-structure interaction. This type of parametric structure model allows raw impedance/admittance data to be interpreted optimally using standard multiple, nonlinear regression analysis. One potential long-term outcome is the possibility of cataloging measured viscoelastic properties of the mechanical subsystems of interest as simple lists of attributes and their statistical uncertainties, whose evolution can be followed in time. Equivalent circuit models are well suited for addressing calibration and self-consistency issues such as temperature corrections, Poisson mode coupling, and distributed relaxation processes.

  16. Maximum efficiency of state-space models of nanoscale energy conversion devices

    NASA Astrophysics Data System (ADS)

    Einax, Mario; Nitzan, Abraham

    2016-07-01

    The performance of nano-scale energy conversion devices is studied in the framework of state-space models, where a device is described by a graph comprising states and transitions between them, represented by nodes and links, respectively. Particular segments of this network represent input (driving) and output processes whose properly chosen flux ratio provides the energy conversion efficiency. Simple cyclical graphs yield Carnot efficiency for the maximum conversion yield. We give a general proof that opening a link that separates the two driving segments always leads to reduced efficiency. We illustrate this general result with simple models of a thermoelectric nanodevice and an organic photovoltaic cell. In the latter, an intersecting link of the above type corresponds to non-radiative carrier recombination, and the reduced maximum efficiency is manifested as a smaller open-circuit voltage.

  17. Maximum efficiency of state-space models of nanoscale energy conversion devices.

    PubMed

    Einax, Mario; Nitzan, Abraham

    2016-07-07

    The performance of nano-scale energy conversion devices is studied in the framework of state-space models, where a device is described by a graph comprising states and transitions between them, represented by nodes and links, respectively. Particular segments of this network represent input (driving) and output processes whose properly chosen flux ratio provides the energy conversion efficiency. Simple cyclical graphs yield Carnot efficiency for the maximum conversion yield. We give a general proof that opening a link that separates the two driving segments always leads to reduced efficiency. We illustrate this general result with simple models of a thermoelectric nanodevice and an organic photovoltaic cell. In the latter, an intersecting link of the above type corresponds to non-radiative carrier recombination, and the reduced maximum efficiency is manifested as a smaller open-circuit voltage.
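
    A compact statement of the bound referenced in these two records (notation assumed; the papers define efficiency as a properly chosen ratio of output to input fluxes): for a device driven between reservoirs at temperatures $T_h > T_c$,

        \[ \eta \;\le\; \eta_{\mathrm{Carnot}} = 1 - \frac{T_c}{T_h}, \]

    with the maximum attained by the simple cyclical graphs; an intersecting link between the two driving segments pushes the maximum efficiency strictly below this bound.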

  18. Red phosphorescent organic light-emitting diodes based on the simple structure.

    PubMed

    Seo, Ji Hyun; Lee, Seok Jae; Kim, Bo Young; Choi, Eun Young; Han, Wone Keun; Lee, Kum Hee; Yoon, Seung Soo; Kim, Young Kwan

    2012-05-01

    We demonstrated that simple-layered red phosphorescent organic light-emitting diodes (OLEDs) can achieve high efficiency, low driving voltage, stable efficiency roll-off, and pure emission color without hole injection and transport layers. We fabricated the OLEDs with a structure of ITO/CBP doped with Ir(pq)2(acac)/BPhen/Liq/Al, where the doping concentration of the red dopant, Ir(pq)2(acac), was varied from 4% to 20%. As a result, quantum efficiencies of 13.4, 11.2, 16.7, 10.8 and 9.8% were observed in devices with doping concentrations of 4, 8, 12, 16 and 20%, respectively. Despite the absence of hole injection and transport layers, these efficiencies are superior to those of devices with a hole transport layer, owing to direct hole injection from the anode to the dopant in the emission layer.

  19. An Accessible User Interface for Geoscience and Programming

    NASA Astrophysics Data System (ADS)

    Sevre, E. O.; Lee, S.

    2012-12-01

    The goal of this research is to develop an interface that will simplify user interaction with software for scientists. The motivating factor of the research is to develop tools that assist scientists with limited motor skills in the efficient generation and use of software tools. Reliance on computers and programming is increasing in the world of geology, and it is increasingly important for geologists and geophysicists to have the computational resources to use advanced software and edit programs for their research. I have developed a prototype of a program to help geophysicists write programs using a simple interface that requires only single mouse clicks to input code. My goal is to minimize the amount of typing necessary to create simple programs and scripts, to increase accessibility for people with disabilities limiting fine motor skills. This interface can be adapted for various programming and scripting languages. Using this interface will simplify development of code for C/C++, Java, and GMT, and it can be expanded to support any other text-based programming language. The interface is designed around the concept of maximizing the amount of code that can be written using a minimum of clicks and typing. The screen is split into two sections: a list of click-commands is on the left hand side, and a text area is on the right hand side. When the user clicks on a command on the left hand side, the applicable code is automatically inserted at the insertion point in the text area. Currently, in the C/C++ interface, there are commands for common code segments that are often used, such as for loops, comments, print statements, and structured code creation. The primary goal is to provide an interface that will work across many devices for developing code. A simple prototype has been developed for the iPad. Due to the limited number of devices that an iOS application can be used with, the code has been rewritten in Java to run on a wider range of devices. Currently, the software works in prototype mode, and it is our goal to develop it further into software that can benefit a wide range of people working in the geosciences, making code development practical and accessible for a wider audience of scientists. Using an interface like this also reduces the potential for errors by reusing known working code.
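
    The described layout, a clickable command list on the left that inserts code at the caret of a text area on the right, can be prototyped in very few lines. A sketch in Python/tkinter (the snippet table is illustrative; the author's tool targets C/C++, Java, and GMT):

        import tkinter as tk

        # Hypothetical snippet table mapping command names to code templates.
        SNIPPETS = {
            "for loop": "for (int i = 0; i < n; i++) {\n    \n}\n",
            "comment": "/*  */\n",
            "print": "printf(\"\\n\");\n",
        }

        root = tk.Tk()
        commands = tk.Listbox(root)            # click-command list, left side
        editor = tk.Text(root)                 # text area, right side
        commands.pack(side="left", fill="y")
        editor.pack(side="right", fill="both", expand=True)
        for name in SNIPPETS:
            commands.insert(tk.END, name)

        def on_select(event):
            selection = commands.curselection()
            if selection:
                # Insert the chosen code template at the caret position.
                editor.insert(tk.INSERT, SNIPPETS[commands.get(selection[0])])

        commands.bind("<<ListboxSelect>>", on_select)
        root.mainloop()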

  20. Use of simple models to determine wake vortex categories for new aircraft.

    DOT National Transportation Integrated Search

    2015-06-22

    The paper describes how to use simple models and, if needed, sensitivity analyses to determine the wake vortex categories for new aircraft. The methodology provides a tool for the regulators to assess the relative risk of introducing new aircraft int...

  1. Equine behavioral enrichment toys as tools for non-invasive recovery of viral and host DNA.

    PubMed

    Seeber, Peter A; Soilemetzidou, Sanatana E; East, Marion L; Walzer, Chris; Greenwood, Alex D

    2017-09-01

    Direct collection of samples from wildlife can be difficult and sometimes impossible. Non-invasive remote sampling for the purpose of DNA extraction is a potential tool for monitoring the presence of wildlife at the individual level, and for identifying the pathogens shed by wildlife. Equine herpesviruses (EHV) are common pathogens of equids that can be fatal if transmitted to other mammals. Transmission usually occurs by nasal aerosol discharge from virus-shedding individuals. The aim of this study was to validate a simple, non-invasive method to track EHV shedding in zebras and to establish an efficient protocol for genotyping individual zebras from environmental DNA (eDNA). A commercially available horse enrichment toy was deployed in captive Grévy's, mountain, and plains zebra enclosures and swabbed after 4-24 hr. Using eDNA extracted from these swabs four EHV strains (EHV-1, EHV-7, wild ass herpesvirus and zebra herpesvirus) were detected by PCR and confirmed by sequencing, and 12 of 16 zebras present in the enclosures were identified as having interacted with the enrichment toy by mitochondrial DNA amplification and sequencing. We conclude that, when direct sampling is difficult or prohibited, non-invasive sampling of eDNA can be a useful tool to determine the genetics of individuals or populations and for detecting pathogen shedding in captive wildlife. © 2017 Wiley Periodicals, Inc.

  2. EUV focus sensor: design and modeling

    NASA Astrophysics Data System (ADS)

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    2005-05-01

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3-NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80 nm, with aperture widths of 12-40 nm (1-3 wavelengths), and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  3. EUV Focus Sensor: Design and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3-NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80 nm, with aperture widths of 12-40 nm (1-3 wavelengths), and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  4. Technology: Presentations in the Cloud with a Twist

    ERIC Educational Resources Information Center

    Siegle, Del

    2011-01-01

    Technology tools have come a long way from early word processing applications and opportunities for students to engage in simple programming. Many tools now exist for students to develop and share products in a variety of formats and for a wide range of audiences. PowerPoint is probably the most ubiquitously used tool for student projects. In…

  5. Scratch as a Computational Modelling Tool for Teaching Physics

    ERIC Educational Resources Information Center

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  6. Jet engine performance enhancement through use of a wave-rotor topping cycle

    NASA Technical Reports Server (NTRS)

    Wilson, Jack; Paxson, Daniel E.

    1993-01-01

    A simple model is used to calculate the thermal efficiency and specific power of simple jet engines and jet engines with a wave-rotor topping cycle. The performance of the wave rotor is based on measurements from a previous experiment. Applied to the case of an aircraft flying at Mach 0.8, the calculations show that an engine with a wave rotor topping cycle may have gains in thermal efficiency of approximately 1 to 2 percent and gains in specific power of approximately 10 to 16 percent over a simple jet engine with the same overall compression ratio. Even greater gains are possible if the wave rotor's performance can be improved.
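
    For context, the ideal simple-cycle (Brayton) baseline that the wave-rotor topping augments has a thermal efficiency set by the overall pressure ratio alone; this is a textbook relation, not the authors' model:

        \[ \eta_{\mathrm{th}} = 1 - r_p^{-(\gamma - 1)/\gamma} \]

    For example, $r_p = 20$ and $\gamma = 1.4$ give $\eta_{\mathrm{th}} \approx 0.575$. The wave rotor raises the effective peak cycle pressure and temperature at the same overall compression ratio, which is consistent with the quoted 1 to 2 percent efficiency gain.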

  7. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background: Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results: We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions: Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618

  8. Toward an International Classification of Functioning, Disability and Health clinical data collection tool: the Italian experience of developing simple, intuitive descriptions of the Rehabilitation Set categories.

    PubMed

    Selb, Melissa; Gimigliano, Francesca; Prodinger, Birgit; Stucki, Gerold; Pestelli, Germano; Iocco, Maurizio; Boldrini, Paolo

    2017-04-01

    As part of international efforts to develop and implement national models including the specification of ICF-based clinical data collection tools, the Italian rehabilitation community initiated a project to develop simple, intuitive descriptions of the ICF Rehabilitation Set, highlighting the core concept of each category in user-friendly language. This paper outlines the Italian experience in developing simple, intuitive descriptions of the ICF Rehabilitation Set as an ICF-based clinical data collection tool for Italy. Consensus process. Expert conference. Multidisciplinary group of rehabilitation professionals. The first of a two-stage consensus process involved developing an initial proposal for simple, intuitive descriptions of each ICF Rehabilitation Set category, based on descriptions generated in a similar process in China. Stage two involved a consensus conference. Divided into three working groups, participants discussed and voted (vote A) on whether the initially proposed description of each ICF Rehabilitation Set category was simple and intuitive enough for use in daily practice. Afterwards, the categories with descriptions considered ambiguous, i.e., not simple and intuitive enough, were divided among the working groups, which were asked to propose new descriptions for their allocated categories. These proposals were then voted on (vote B) in a plenary session. In the last step of the consensus conference, each working group developed a new proposal for each of the categories whose descriptions were still considered ambiguous, and participants voted (final vote) for which of the three proposed descriptions they preferred. Nineteen clinicians from diverse rehabilitation disciplines from various regions of Italy participated in the consensus process. Three ICF categories achieved consensus in vote A, 20 ICF categories were accepted in vote B, and the remaining 7 categories were decided in the final vote. The findings are discussed in light of current efforts toward developing strategies for ICF implementation, specifically for the application of an ICF-based clinical data collection tool, not only in Italy but also in the rest of Europe. The resulting descriptions are promising as minimal standards for monitoring the impact of interventions and for standardized reporting of functioning as a relevant outcome in rehabilitation.

  9. High accurate interpolation of NURBS tool path for CNC machine tools

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Liu, Huan; Yuan, Songmei

    2016-09-01

    Feedrate fluctuation caused by approximation errors of interpolation methods has great effects on machining quality in NURBS interpolation, but at present few methods can efficiently eliminate or reduce it to a satisfying level without sacrificing computing efficiency. In order to solve this problem, a high-accuracy interpolation method for NURBS tool paths is proposed. The proposed method efficiently reduces feedrate fluctuation by forming a quartic equation with respect to the curve parameter increment, which can be solved efficiently by analytic methods in real time. Theoretically, the proposed method can totally eliminate feedrate fluctuation for any 2nd-degree NURBS curve and can interpolate 3rd-degree NURBS curves with minimal feedrate fluctuation. Moreover, a smooth feedrate planning algorithm is also proposed to generate smooth tool motion while considering multiple constraints and scheduling errors through an efficient planning strategy. Experiments are conducted to verify the feasibility and applicability of the proposed method. This research presents a novel NURBS interpolation method with not only high accuracy but also satisfying computing efficiency.
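
    The quartic the authors form can be sketched from a second-order Taylor expansion of the curve: matching the squared chord length travelled in one interpolation period to (F T)^2 gives a quartic in the parameter increment. A minimal illustration (this is the standard Taylor chord model, assumed here; not necessarily the paper's exact formulation):

        import numpy as np

        def parameter_increment(c1, c2, feedrate, period):
            """Solve the quartic chord-length equation for the parameter
            increment du, using a second-order Taylor model of the curve:
            |C' du + 0.5 C'' du^2|^2 = (F T)^2.
            c1, c2: first and second derivative vectors at the current parameter."""
            a4 = 0.25 * np.dot(c2, c2)
            a3 = np.dot(c1, c2)
            a2 = np.dot(c1, c1)
            rhs = (feedrate * period) ** 2
            # Quartic coefficients in descending order: a4 du^4 + a3 du^3 + a2 du^2 = rhs
            roots = np.roots([a4, a3, a2, 0.0, -rhs])
            real_pos = [r.real for r in roots if abs(r.imag) < 1e-12 and r.real > 0]
            return min(real_pos)  # smallest positive root advances the parameter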

  10. Risk assessment tools to identify women with increased risk of osteoporotic fracture: complexity or simplicity? A systematic review.

    PubMed

    Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille; Abrahamsen, Bo; Brixen, Kim

    2013-08-01

    A huge number of risk assessment tools have been developed, but far from all have been validated in external studies, many lack methodological and transparent evidence, and few are integrated into national guidelines. We therefore performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for the prediction of osteoporotic fractures. Additionally, we aimed to determine whether the performance of each tool was sufficient for practical use and, finally, to examine whether the complexity of the tools influenced their discriminative power. We searched the PubMed, Embase, and Cochrane databases for papers and evaluated these with respect to methodological quality using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS) checklist. A total of 48 tools were identified; 20 had been externally validated, but only six tools had been tested more than once in a population-based setting with acceptable methodological quality. None of the tools performed consistently better than the others, and simple tools (i.e., the Osteoporosis Self-assessment Tool [OST], Osteoporosis Risk Assessment Instrument [ORAI], and Garvan Fracture Risk Calculator [Garvan]) often did as well as or better than more complex tools (i.e., Simple Calculated Osteoporosis Risk Estimation [SCORE], WHO Fracture Risk Assessment Tool [FRAX], and QFracture). No studies determined the effectiveness of tools in selecting patients for therapy and thus improving fracture outcomes. High-quality studies with randomized designs and population-based cohorts with different case mixes are needed. Copyright © 2013 American Society for Bone and Mineral Research.
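
    For context, the simplest of the tools named above reduces to a one-line computation. A minimal sketch of the OST index follows; the 0.2 x (weight - age) formula is the published index, while the cutoff in the comment is only illustrative, since decision thresholds vary by population and guideline.

      def ost_index(weight_kg: float, age_years: float) -> int:
          """Osteoporosis Self-assessment Tool (OST) index:
          0.2 * (weight in kg - age in years), truncated toward zero.
          Lower scores indicate higher risk; the decision cutoff varies
          by population and guideline, so the one below is illustrative."""
          return int(0.2 * (weight_kg - age_years))

      print(ost_index(58, 72))  # -> -2; many cutoffs would flag this for BMD testing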

  11. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data.

    PubMed

    Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J; Intarapanich, Apichart; Tongsima, Sissades; Piriyapongsa, Jittima

    2017-01-01

    Biochemical methods are available for enriching the 5' ends of RNAs in prokaryotes; these are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5' ends from these data by statistical analysis of the enrichment. Although statistics-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data: the more efficient enrichment in Cappable-seq compared with dRNA-seq could affect the data distribution and thus algorithm performance. We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase the power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5' ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5' ends than TSSAR. In general, the transcript 5' ends detected by ToNER but not TSSAR occur in regions that cannot be locally modeled by TSSAR. ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5' ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied to other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at the ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER).
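
    A toy Python rendition of the statistical idea described above (not the published ToNER implementation): per-nucleotide enrichment ratios are Box-Cox transformed toward normality, and upper-tail positions are called enriched. The pseudocount and significance threshold are assumptions.

      import numpy as np
      from scipy import stats

      def enriched_sites(enriched, control, alpha=0.05, pseudo=1.0):
          """Box-Cox transform per-position enrichment ratios toward
          normality, then call positions in the upper tail enriched."""
          enriched = np.asarray(enriched, float)
          control = np.asarray(control, float)
          ratio = (enriched + pseudo) / (control + pseudo)
          transformed, _lmbda = stats.boxcox(ratio)   # lambda fit by max. likelihood
          z = (transformed - transformed.mean()) / transformed.std(ddof=1)
          pvals = stats.norm.sf(z)                    # one-sided upper-tail p-values
          return np.where(pvals < alpha)[0]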

  12. Validation of a modified FRAX® tool for improving outpatient efficiency--part of the "Catch Before a Fall" initiative.

    PubMed

    Parker, Simon; Ciaccio, Maria; Cook, Erica; Davenport, Graham; Cooper, Alun; Grange, Simon; Smitham, Peter

    2015-01-01

    We have validated our touch-screen-modified FRAX® tool against the traditional healthcare professional-led questionnaire, demonstrating strong concordance between doctor- and patient-derived results. We will use this in outpatient clinics and general practice to increase our capture rate of at-risk patients, making valuable use of otherwise wasted patient waiting times. Outpatient clinics offer an opportunity to collect valuable health information from a captive population. We have previously developed a modified fracture risk assessment (FRAX®) tool, enabling patients to self-assess their osteoporotic fracture risk in a touch-screen computer format and demonstrated its acceptability with patients. We aim to validate the accuracy of our tool against the traditional questionnaire. Fifty patients over 50 years of age within the fracture clinic independently completed a paper equivalent of our touch-screen-modified FRAX® questionnaire. Responses were analysed against the traditional healthcare professional (HCP)-led questionnaire which was carried out afterwards. Correlation was assessed by sensitivity, specificity, Cohen's kappa statistic and Fisher's exact test for each potential FRAX® outcome of "treat", "measure BMD" and "lifestyle advice". Age range was 51-98 years. The FRAX® tool was completed by 88 % of patients; six patients lacked confidence in estimating either their height or weight. Following question adjustment according to patient response and feedback, our tool achieved >95 % sensitivity and specificity for the "treat" and "lifestyle advice" groups, and 79 % sensitivity and 100 % specificity in the "measure BMD" group. Cohen's kappa value ranged from 0.823 to 0.995 across all groups, demonstrating "very good" agreement for all. Fisher's exact test demonstrated significant concordance between doctor and patient decisions. Our modified tool provides a simple, accurate and reliable method for patients to self-report their own FRAX® score outside the clinical contact period, thus releasing the HCP from the time required to complete the questionnaire and potentially increasing our capture rate of at-risk patients.
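
    The agreement statistics reported above are straightforward to reproduce for any paired set of decisions. A minimal sketch follows; the paired outcomes are invented, not the study's data.

      from sklearn.metrics import cohen_kappa_score, confusion_matrix

      # Hypothetical paired "treat" decisions: HCP-led questionnaire vs.
      # patient-completed touch-screen tool (1 = treat, 0 = do not treat).
      hcp     = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
      patient = [1, 1, 0, 0, 1, 0, 1, 1, 0, 1]

      tn, fp, fn, tp = confusion_matrix(hcp, patient).ravel()
      print("sensitivity:", tp / (tp + fn))
      print("specificity:", tn / (tn + fp))
      print("kappa:", cohen_kappa_score(hcp, patient))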

  13. Benchmarking and Self-Assessment in the Wine Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galitsky, Christina; Radspieler, Anthony; Worrell, Ernst

    2005-12-01

    Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. The lack of knowledge of energy efficiency opportunities is an important barrier to improving efficiency. Benchmarking programs in the U.S. and abroad have been shown to improve knowledge of the energy performance of industrial facilities and buildings and to fuel energy management practices. Benchmarking provides a fair way to compare the energy intensity of plants while accounting for structural differences (e.g., the mix of products produced, climate conditions) between facilities. In California, the winemaking industry is not only one of the economic pillars of the economy; it is also a large energy consumer, with considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed the first benchmarking tool for the California wine industry, called BEST (Benchmarking and Energy and water Savings Tool) Winery. BEST Winery enables a winery to compare its energy efficiency to a best-practice reference winery. Besides overall performance, the tool enables the user to evaluate the impact of implementing efficiency measures. The tool facilitates strategic planning of efficiency measures, based on the estimated impact of the measures, their costs and savings. The tool will raise awareness of current energy intensities and offer an efficient way to evaluate the impact of future efficiency measures.
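
    The core benchmarking computation can be pictured in a few lines. The sketch below is the general idea only, not BEST Winery itself; the normalization by cases produced and the reference value are illustrative assumptions.

      # Compare a winery's energy intensity with a best-practice reference,
      # normalizing for production volume; scores near 1.0 approach best practice.
      def energy_intensity_score(energy_kwh, cases_produced, reference_kwh_per_case):
          intensity = energy_kwh / cases_produced          # kWh per case
          return reference_kwh_per_case / intensity

      print(energy_intensity_score(1_200_000, 250_000, 3.5))  # -> ~0.73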

  14. Building America Case Study: Simple Retrofit High-Efficiency Natural Gas Water Heater Field Test, Minneapolis, Minnesota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    High performance water heaters are typically more time consuming and costly to install in retrofit applications, making high performance water heaters difficult to justify economically. However, recent advancements in high performance water heaters have targeted the retrofit market, simplifying installations and reducing costs. Four high efficiency natural gas water heaters designed specifically for retrofit applications were installed in single-family homes along with detailed monitoring systems to characterize their savings potential, their installed efficiencies, and their ability to meet household demands. The water heaters tested for this project were designed to improve the cost-effectiveness and increase market penetration of high efficiency water heaters in the residential retrofit market. The retrofit high efficiency water heaters achieved their goal of reducing costs, maintaining the savings potential and installed efficiency of other high efficiency water heaters, and meeting the necessary capacity in order to improve cost-effectiveness. However, the improvements were not sufficient to achieve simple paybacks of less than ten years for the incremental cost compared to a minimum efficiency heater. Significant changes would be necessary to reduce the simple payback to six years or less. Annual energy savings in the range of $200 would also reduce paybacks to less than six years. These energy savings would require either significantly higher fuel costs (greater than $1.50 per therm) or very high usage (around 120 gallons per day). For current incremental costs, the water heater efficiency would need to be similar to that of a heat pump water heater to deliver a six-year payback.
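
    The economics above rest on the standard simple-payback formula. A minimal sketch with illustrative numbers follows; the $1200 incremental cost is hypothetical, chosen only to be consistent with the text's $200-per-year, six-year framing.

      def simple_payback(incremental_cost_usd, annual_savings_usd):
          """Years to recover the incremental installed cost from energy-cost savings."""
          return incremental_cost_usd / annual_savings_usd

      # $200/yr of savings yields a six-year payback only if the incremental
      # cost is near $1200 (hypothetical figure, consistent with the text).
      print(simple_payback(1200, 200))  # -> 6.0 years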

  15. Simple Retrofit High-Efficiency Natural Gas Water Heater Field Test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoenbauer, Ben

    High performance water heaters are typically more time consuming and costly to install in retrofit applications, making high performance water heaters difficult to justify economically. However, recent advancements in high performance water heaters have targeted the retrofit market, simplifying installations and reducing costs. Four high efficiency natural gas water heaters designed specifically for retrofit applications were installed in single-family homes along with detailed monitoring systems to characterize their savings potential, their installed efficiencies, and their ability to meet household demands. The water heaters tested for this project were designed to improve the cost-effectiveness and increase market penetration of high efficiency water heaters in the residential retrofit market. The retrofit high efficiency water heaters achieved their goal of reducing costs, maintaining the savings potential and installed efficiency of other high efficiency water heaters, and meeting the necessary capacity in order to improve cost-effectiveness. However, the improvements were not sufficient to achieve simple paybacks of less than ten years for the incremental cost compared to a minimum efficiency heater. Significant changes would be necessary to reduce the simple payback to six years or less. Annual energy savings in the range of $200 would also reduce paybacks to less than six years. These energy savings would require either significantly higher fuel costs (greater than $1.50 per therm) or very high usage (around 120 gallons per day). For current incremental costs, the water heater efficiency would need to be similar to that of a heat pump water heater to deliver a six-year payback.

  16. Value-added Data Services at the Goddard Earth Sciences Data and Information Services Center

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory G.; Alcott, Gary T.; Kempler, Steven J.; Lynnes, Christopher S.; Vollmer, Bruce E.

    2004-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC), in addition to serving the Earth Science community as one of the major Distributed Active Archive Centers (DAACs), provides much more than just data. Among the value-added services available to general users are subsetting data spatially and/or by parameter, online analysis (to avoid unnecessarily downloading all the data), and assistance in obtaining data from other centers. Services available to data producers and high-volume users include consulting on building new products with standard formats and metadata and construction of data management systems. A particularly useful service is data processing at the DISC (i.e., close to the input data) with the user's algorithm. This can take a number of different forms: as a configuration-managed algorithm within the main processing stream; as a stand-alone program next to the on-line data storage; as build-it-yourself code within the Near-Archive Data Mining (NADM) system; or as an on-the-fly analysis with simple algorithms embedded into the web-based tools. Partnerships between the GES DISC and scientists, both producers and users, allow the scientists to concentrate on science, while the GES DISC handles the data management, e.g., formats, integration, and data processing. The existing data management infrastructure at the GES DISC supports a wide spectrum of options: from simple data support to sophisticated on-line analysis tools, producing economies of scale and rapid time-to-deploy. At the same time, such partnerships allow the GES DISC to serve the user community more efficiently and to better prioritize on-line holdings. Several examples of successful partnerships are described in the presentation.

  17. Biocompatible water soluble quantum dots as new biophotonic tools for hematologic cells: applications for flow cell cytometry

    NASA Astrophysics Data System (ADS)

    Lira, Rafael B.; de Sales Neto, Antonio T.; Carvalho, Kilmara K. H. G.; Leite, Elisa S.; Brasil, Aluizio G., Jr.; Azevedo, Denise P. L.; Cabral Filho, Paulo E.; Cavalcanti, Mariana B.; Amaral, Ademir J.; Farias, Patricía M. A.; Santos, Beate S.; Fontes, Adriana

    2010-02-01

    Quantum dots (QDs) are a promising class of fluorescent probes that can be conjugated to a variety of specific cell antibodies. For this reason, simple, cheap and reproducible routes of QD synthesis are the main goal of much research in this field. The main objective of this work was to demonstrate the ability of QDs as biolabels for flow cell cytometry analysis. We synthesized biocompatible water-soluble CdS/Cd(OH)2 and CdTe/CdS QDs and applied them as fluorescent labels for hematologic cells. The CdTe/CdS QDs were prepared using a simple aqueous route with mercaptoacetic acid and mercaptopropionic acid as stabilizing agents. The resulting CdTe/CdS QDs can target biological membrane proteins and can also be internalized by cells. We applied the CdTe/CdS QDs as biolabels of human lymphocytes and compared the results obtained for lymphocytes treated and not treated with agents that permeabilize cell membranes. Permeabilized cells presented a higher fluorescence pattern than non-permeabilized ones. We associated antibody A with the CdS/Cd(OH)2 QDs to label type A red blood cells (RBCs). In this case, O erythrocytes were used as the negative control. The results demonstrate that the QDs were successfully functionalized with antibody A: there was specific binding of QD-antibody A conjugates to the RBC membrane antigen only for A RBCs. We also monitored the QD-hematologic cell interaction using fluorescence microscopy. Our results show that QDs can be conjugated to a variety of specific cell antibodies and can become a potential, highly efficient and low-cost diagnostic tool for flow cell cytometry, well matched to the lasers and filters used in this kind of equipment.

  18. Instillation and Fixation Methods Useful in Mouse Lung Cancer Research.

    PubMed

    Limjunyawong, Nathachit; Mock, Jason; Mitzner, Wayne

    2015-08-31

    The ability to instill live agents, cells, or chemicals directly into the lung without injuring or killing the mice is an important tool in lung cancer research. Although a number of methods have been published showing how to intubate mice for pulmonary function measurements, none are without potential problems for rapid tracheal instillation in large cohorts of mice. In the present paper, a simple and quick method is described that enables an investigator to carry out such instillations in an efficient manner. The method does not require any special tools or lighting and can be learned with very little practice. It involves anesthetizing a mouse, making a small incision in the neck to visualize the trachea, and then inserting an intravenous catheter directly. The small incision is quickly closed with tissue adhesive, and the mice are allowed to recover. A skilled student or technician can do instillations at an average rate of 2 min/mouse. Once the cancer is established, there is frequently a need for quantitative histologic analysis of the lungs. Traditionally, pathologists do not standardize lung inflation during fixation, and analyses are often based on a scoring system that can be quite subjective. While this may sometimes be adequate for gross estimates of the size of a lung tumor, any proper stereological quantification of lung structure or cells requires a reproducible fixation procedure and subsequent lung volume measurement. Here we describe simple, reliable procedures for both fixing the lungs under pressure and then accurately measuring the fixed lung volume. The only requirement is a laboratory balance that is accurate over a range of 1 mg-300 g. The procedures presented here thus could greatly improve the ability to create, treat, and analyze lung cancers in mice.

  19. Several steps/day indicators predict changes in anthropometric outcomes: HUB city steps

    USDA-ARS?s Scientific Manuscript database

    Walking for exercise remains the most frequently reported leisure-time activity, likely because it is simple, inexpensive, and easily incorporated into most people’s lifestyle. Pedometers are simple, convenient, and economical tools that can be used to quantify step-determined physical activity. F...

  20. Predicting Fish Densities in Lotic Systems: a Simple Modeling Approach

    EPA Science Inventory

    Fish density models are essential tools for fish ecologists and fisheries managers. However, applying these models can be difficult because of high levels of model complexity and the large number of parameters that must be estimated. We designed a simple fish density model and te...

  1. A Progression of Static Equilibrium Laboratory Exercises

    ERIC Educational Resources Information Center

    Kutzner, Mickey; Kutzner, Andrew

    2013-01-01

    Although simple architectural structures like bridges, catwalks, cantilevers, and Stonehenge have been integral in human societies for millennia, as have levers and other simple tools, modern students of introductory physics continue to grapple with Newton's conditions for static equilibrium. As formulated in typical introductory physics…

  2. AMPLISAS: a web server for multilocus genotyping using next-generation amplicon sequencing data.

    PubMed

    Sebastian, Alvaro; Herdegen, Magdalena; Migalska, Magdalena; Radwan, Jacek

    2016-03-01

    Next-generation sequencing (NGS) technologies are revolutionizing the fields of biology and medicine as powerful tools for amplicon sequencing (AS). Using combinations of primers and barcodes, it is possible to sequence targeted genomic regions with deep coverage for hundreds, even thousands, of individuals in a single experiment. This is extremely valuable for the genotyping of gene families in which locus-specific primers are often difficult to design, such as the major histocompatibility complex (MHC). The utility of AS is, however, limited by the high intrinsic sequencing error rates of NGS technologies and other sources of error such as polymerase amplification or chimera formation. Correcting these errors requires extensive bioinformatic post-processing of NGS data. Amplicon Sequence Assignment (AMPLISAS) is a tool that performs analysis of AS results in a simple and efficient way, while offering customization options for advanced users. AMPLISAS is designed as a three-step pipeline consisting of (i) read demultiplexing, (ii) unique sequence clustering and (iii) erroneous sequence filtering. Allele sequences and frequencies are retrieved in Excel spreadsheet format, making them easy to interpret. AMPLISAS performance has been successfully benchmarked against previously published MHC genotyping data sets obtained with various NGS technologies. © 2015 John Wiley & Sons Ltd.
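
    A toy Python sketch of the three-step pipeline described above. AMPLISAS itself clusters by sequence similarity and models per-technology error rates; the exact-match demultiplexing and plain frequency cutoff here are simplifying assumptions.

      from collections import Counter

      def demultiplex(reads, barcodes):
          """Step (i): assign reads to samples by exact barcode prefix."""
          out = {sample: [] for sample in barcodes}
          for read in reads:
              for sample, bc in barcodes.items():
                  if read.startswith(bc):
                      out[sample].append(read[len(bc):])
          return out

      def cluster_and_filter(reads, min_freq=0.01):
          """Steps (ii) + (iii): collapse identical sequences and drop rare
          variants as putative errors, returning allele frequencies."""
          counts = Counter(reads)
          total = sum(counts.values())
          return {seq: n / total for seq, n in counts.items()
                  if n / total >= min_freq}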

  3. Using Value Stream Mapping to improve quality of care in low-resource facility settings.

    PubMed

    Ramaswamy, Rohit; Rothschild, Claire; Alabi, Funmi; Wachira, Eric; Muigai, Faith; Pearson, Nick

    2017-11-01

    Jacaranda Health (JH) is a Kenya-based organization that aims to provide affordable, high-quality maternal and newborn healthcare through a chain of private health facilities in Nairobi. JH needed to adopt quality improvement as an organization-wide strategy to optimize effectiveness and efficiency. Value Stream Mapping, a Lean Management tool, was used to engage staff in prioritizing opportunities to improve clinical outcomes and patient-centered quality of care. Implementation was accomplished through a five-step process: (i) leadership engagement and commitment; (ii) staff training; (iii) team formation; (iv) process walkthrough; and (v) construction and validation. The Value Stream Map allowed the organization to come together and develop an end-to-end view of the process of care at JH and to select improvement opportunities for the entire system. The Value Stream Map is a simple visual tool that allows organizations to engage staff at all levels to gain commitment around quality improvement efforts. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  4. Combinatorial invariants and covariants as tools for conical intersections.

    PubMed

    Ryb, Itai; Baer, Roi

    2004-12-01

    The combinatorial invariant and covariant are introduced as practical tools for analysis of conical intersections in molecules. The combinatorial invariant is a quantity depending on adiabatic electronic states taken at discrete nuclear configuration points. It is invariant to the phase choice (gauge) of these states. In the limit that the points trace a loop in nuclear configuration space, the value of the invariant approaches the corresponding Berry phase factor. The Berry phase indicates the presence of an odd or even number of conical intersections on surfaces bounded by these loops. Based on the combinatorial invariant, we develop a computationally simple and efficient method for locating conical intersections. The method is robust due to its gauge-invariant nature. It does not rely on the landscape of the intersecting potential energy surfaces, nor does it require the computation of nonadiabatic couplings. We generalize the concept to open paths and combinatorial covariants in higher dimensions, obtaining a technique for the construction of the gauge-covariant adiabatic-to-diabatic transformation matrix. This, too, makes no use of nonadiabatic couplings. The importance of using gauge-covariant expressions is underlined throughout. These techniques can be readily implemented by standard quantum chemistry codes. (c) 2004 American Institute of Physics.
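
    In LaTeX notation, a plausible rendering of the loop quantity behind the combinatorial invariant, assuming real-valued adiabatic states \psi sampled at discrete points x_1, ..., x_N around a closed loop with x_{N+1} \equiv x_1 (the paper's precise definition may differ):

      I \;=\; \operatorname{sgn}\prod_{i=1}^{N}
              \langle \psi(x_i)\,|\,\psi(x_{i+1})\rangle ,
      \qquad
      I = -1 \;\iff\; \text{an odd number of conical intersections is enclosed.}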

  5. Single Nucleotide Polymorphism Markers for Genetic Mapping in Drosophila melanogaster

    PubMed Central

    Hoskins, Roger A.; Phan, Alexander C.; Naeemuddin, Mohammed; Mapa, Felipa A.; Ruddy, David A.; Ryan, Jessica J.; Young, Lynn M.; Wells, Trent; Kopczynski, Casey; Ellis, Michael C.

    2001-01-01

    For nearly a century, genetic analysis in Drosophila melanogaster has been a powerful tool for analyzing gene function, yet Drosophila lacks the molecular genetic mapping tools that recently have revolutionized human, mouse, and plant genetics. Here, we describe the systematic characterization of a dense set of molecular markers in Drosophila by using a sequence tagged site-based physical map of the genome. We identify 474 biallelic markers in standard laboratory strains of Drosophila that span the genome. Most of these markers are single nucleotide polymorphisms and sequences for these variants are provided in an accessible format. The average density of the new markers is one per 225 kb on the autosomes and one per megabase on the X chromosome. We include in this survey a set of P-element strains that provide additional use for high-resolution mapping. We show one application of the new markers in a simple set of crosses to map a mutation in the hedgehog gene to an interval of <1 Mb. This new map resource significantly increases the efficiency and resolution of recombination mapping and will be of immediate value to the Drosophila research community. PMID:11381036

  6. Single nucleotide polymorphism markers for genetic mapping in Drosophila melanogaster

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoskins, Roger A.; Phan, Alexander C.; Naeemuddin, Mohammed

    2001-04-16

    For nearly a century, genetic analysis in Drosophila melanogaster has been a powerful tool for analyzing gene function, yet Drosophila lacks the molecular genetic mapping tools that have recently revolutionized human, mouse and plant genetics. Here, we describe the systematic characterization of a dense set of molecular markers in Drosophila using an STS-based physical map of the genome. We identify 474 biallelic markers in standard laboratory strains of Drosophila that span the genome. The majority of these markers are single nucleotide polymorphisms (SNPs), and sequences for these variants are provided in an accessible format. The average density of the new markers is 1 marker per 225 kb on the autosomes and 1 marker per 1 Mb on the X chromosome. We include in this survey a set of P-element strains that provide additional utility for high-resolution mapping. We demonstrate one application of the new markers in a simple set of crosses to map a mutation in the hedgehog gene to an interval of <1 Mb. This new map resource significantly increases the efficiency and resolution of recombination mapping and will be of immediate value to the Drosophila research community.

  7. Active flow control insight gained from a modified integral boundary layer equation

    NASA Astrophysics Data System (ADS)

    Seifert, Avraham

    2016-11-01

    Active Flow Control (AFC) can alter the development of boundary layers, with applications such as reducing drag by delaying separation, or separating the boundary layers and enhancing vortex shedding to increase drag. Historically, significant effects of steady AFC methods were observed. Unsteady actuation is significantly more efficient than steady actuation. Full-scale AFC tests have been conducted with varying levels of success. While clearly relevant to industry, AFC implementation relies on expert knowledge with proven intuition and/or costly and lengthy computational efforts. This situation hinders the use of AFC as long as a simple, quick and reliable design method is absent. An updated form of the unsteady integral boundary layer (UIBL) equations that includes AFC terms (unsteady wall transpiration and body forces) can be used to assist in AFC analysis and design. With these equations, and given a family of suitable velocity profiles, the momentum thickness can be calculated and matched with an outer potential-flow solution in a 2D or 3D manner to create an AFC design tool, parallel to proven tools for airfoil design. Limiting cases of the UIBL equation can be used to analyze candidate AFC concepts in terms of their capability to modify boundary layer development and system performance.
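
    For orientation, the steady momentum-integral limit with a wall-transpiration term takes the textbook form below; the UIBL equations referred to in the abstract add unsteady and body-force terms on top of this, and sign conventions for the transpiration term vary:

      \frac{d\theta}{dx} \;+\; (H + 2)\,\frac{\theta}{U_e}\,\frac{dU_e}{dx}
        \;=\; \frac{C_f}{2} \;+\; \frac{v_w}{U_e}

    where \theta is the momentum thickness, H the shape factor, U_e the edge velocity, C_f the skin-friction coefficient, and v_w the wall-normal transpiration velocity (positive for blowing).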

  8. Measuring fish and their physical habitats: Versatile 2D and 3D video techniques with user-friendly software

    USGS Publications Warehouse

    Neuswanger, Jason R.; Wipfli, Mark S.; Rosenberger, Amanda E.; Hughes, Nicholas F.

    2017-01-01

    Applications of video in fisheries research range from simple biodiversity surveys to three-dimensional (3D) measurement of complex swimming, schooling, feeding, and territorial behaviors. However, researchers lack a transparently developed, easy-to-use, general purpose tool for 3D video measurement and event logging. Thus, we developed a new measurement system, with freely available, user-friendly software, easily obtained hardware, and flexible underlying mathematical methods capable of high precision and accuracy. The software, VidSync, allows users to efficiently record, organize, and navigate complex 2D or 3D measurements of fish and their physical habitats. Laboratory tests showed submillimetre accuracy in length measurements of 50.8 mm targets at close range, with increasing errors (mostly <1%) at longer range and for longer targets. A field test on juvenile Chinook salmon (Oncorhynchus tshawytscha) feeding behavior in Alaska streams found that individuals within aggregations avoided the immediate proximity of their competitors, out to a distance of 1.0 to 2.9 body lengths. This system makes 3D video measurement a practical tool for laboratory and field studies of aquatic or terrestrial animal behavior and ecology.

  9. Project SOS: The Science of Sustainability

    NASA Astrophysics Data System (ADS)

    Berven, Christine; Dawes, Kathy; Kern, Anne; Ryan, Kathleen; McNamara, Patricia

    2014-03-01

    Project SOS: Making Connections Using The Science Of Sustainability is an Informal Science Education Pathways Project designed to teach the science of sustainability to middle-school aged youth in rural communities of northern ID and eastern WA. The educational focus is the physics of convection, conduction and radiation and how these exist in nature and specifically in the home of the youth. Our goal is to explore the implementation of a cooperative-learning model in which youth become experts in their area of heat transfer using portable exhibits, teach their fellow team-members about those mechanisms, and apply this knowledge as a team to improve the energy efficiency of a model house. We provide simple tools and instructions so that they may apply their new knowledge to their own homes. We analyze audio and video of the interactions of our facilitators with the youth and among the youth, and use pre- and post-surveys to document the increase in understanding of energy transfer mechanisms in their homes and the environment. The tools and techniques developed to accomplish our goals and our current findings regarding the effectiveness of this approach will be discussed. Work supported by National Science Foundation Award DRL-1223290.

  10. Simple tool for planting acorns

    Treesearch

    William R. Beaufait

    1957-01-01

    A handy, inexpensive tool for planting acorns has been developed at the Delta Research Center of the Southern Forest Experiment Station and used successfully in experimental plantings. One of its merits is that it ensures a planting hole of exactly the desired depth.

  11. Extremely Low Roll-Off and High Efficiency Achieved by Strategic Exciton Management in Organic Light-Emitting Diodes with Simple Ultrathin Emitting Layer Structure.

    PubMed

    Zhang, Tianmu; Shi, Changsheng; Zhao, Chenyang; Wu, Zhongbin; Chen, Jiangshan; Xie, Zhiyuan; Ma, Dongge

    2018-03-07

    Phosphorescent organic light-emitting diodes (OLEDs) possess high efficiency but suffer serious efficiency roll-off at high luminance. Herein, we manufactured high-efficiency phosphorescent OLEDs with extremely low roll-off by effectively locating the ultrathin emitting layer (UEML) away from the high-concentration exciton formation region. The strategic exciton management in this simple UEML architecture greatly suppressed exciton annihilation owing to the expansion of the exciton diffusion region; thus, the efficiency roll-off at high luminance was significantly improved. The resulting green phosphorescent OLEDs exhibited a maximum external quantum efficiency of 25.5%, current efficiency of 98.0 cd A^-1, and power efficiency of 85.4 lm W^-1; they still showed 25.1%, 94.9 cd A^-1, and 55.5 lm W^-1 at 5000 cd m^-2 luminance, and retained 24.3%, 92.7 cd A^-1, and 49.3 lm W^-1 at 10,000 cd m^-2 luminance, respectively. Compared with the usual structures, the improvement demonstrated in this work displays potential value in applications.
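
    The reported current and power efficiencies are mutually consistent under the usual Lambertian-emission assumption, via the standard conversion (a cross-check inferred from the stated numbers, not a figure from the paper):

      \eta_P \,[\mathrm{lm\,W^{-1}}] \;=\; \frac{\pi\,\eta_L\,[\mathrm{cd\,A^{-1}}]}{V}

    so 98.0 cd A^-1 alongside 85.4 lm W^-1 implies a drive voltage near pi x 98.0 / 85.4, about 3.6 V.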

  12. Making Temporal Logic Calculational: A Tool for Unification and Discovery

    NASA Astrophysics Data System (ADS)

    Boute, Raymond

    In temporal logic, calculational proofs beyond simple cases are often seen as challenging. The situation is reversed by making temporal logic calculational, yielding shorter and clearer proofs than traditional ones, and serving as a (mental) tool for unification and discovery. A side effect of unifying theories is easier access for practitioners. The starting point is a simple generic (software tool independent) Functional Temporal Calculus (FTC). Specific temporal logics are then captured via endosemantic functions. This concept reflects tacit conventions throughout mathematics and, once identified, is general and useful. FTC also yields a reasoning style that helps discovering theorems by calculation rather than just proving given facts. This is illustrated by deriving various theorems, most related to liveness issues in TLA+, and finding strengthenings of known results. Educational issues are addressed in passing.

  13. A Simple Framework for Evaluating Authorial Contributions for Scientific Publications.

    PubMed

    Warrender, Jeffrey M

    2016-10-01

    A simple tool is provided to assist researchers in assessing contributions to a scientific publication, for ease in evaluating which contributors qualify for authorship, and in what order the authors should be listed. The tool identifies four phases of activity leading to a publication: Conception and Design, Data Acquisition, Analysis and Interpretation, and Manuscript Preparation. By comparing a project participant's contribution in a given phase to several specified thresholds, a score of up to five points can be assigned; the contributor's scores in all four phases are summed to yield a total "contribution score", which is compared to a threshold to determine which contributors merit authorship. This tool may be useful in a variety of contexts in which a systematic approach to authorial credit is desired.
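
    A minimal Python sketch of the scoring scheme as described; the per-phase cap of five points follows the abstract, while the authorship cutoff of 6 is a placeholder, since the paper specifies its own threshold.

      PHASES = ("conception_design", "data_acquisition",
                "analysis_interpretation", "manuscript_preparation")

      def contribution_score(points_by_phase, authorship_cutoff=6):
          """Sum the four per-phase scores (each clamped to 0-5 points) and
          compare against an authorship threshold (placeholder value here)."""
          total = sum(min(5, max(0, points_by_phase.get(p, 0))) for p in PHASES)
          return total, total >= authorship_cutoff

      print(contribution_score({"conception_design": 4, "data_acquisition": 1,
                                "analysis_interpretation": 3,
                                "manuscript_preparation": 2}))  # -> (10, True)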

  14. Monte Carlo isotopic inventory analysis for complex nuclear systems

    NASA Astrophysics Data System (ADS)

    Phruksarojanakun, Phiphat

    Monte Carlo Inventory Simulation Engine (MCise) is a newly developed method for calculating the isotopic inventory of materials. It offers the promise of modeling materials with complex processes and irradiation histories, which pose challenges for current, deterministic tools, and has strong analogies to Monte Carlo (MC) neutral particle transport. The analog method, including considerations for simple, complex and loop flows, is fully developed. In addition, six variance reduction tools give MCise unique capabilities for improving the statistical precision of MC simulations. Forced Reaction forces an atom to undergo a desired number of reactions in a given irradiation environment. Biased Reaction Branching primarily focuses on improving statistical results for isotopes that are produced from rare reaction pathways. Biased Source Sampling aims at increasing the frequency of sampling rare initial isotopes as the starting particles. Reaction Path Splitting increases the population by splitting the atom at each reaction point, creating one new atom for each decay or transmutation product. Delta Tracking is recommended for high-frequency pulsing to reduce the computing time. Lastly, Weight Window is introduced as a strategy to decrease large deviations of weight due to the use of variance reduction techniques. A figure of merit is necessary to compare the efficiency of different variance reduction techniques. A number of possibilities for the figure of merit were explored, two of which proved robust and were subsequently used: one based on the relative error of a known target isotope, 1/(R^2 T), and the other on the overall detection limit corrected by the relative error, 1/(D_k R^2 T). An automated Adaptive Variance-reduction Adjustment (AVA) tool is developed to iteratively define parameters for some variance reduction techniques in a problem with a target isotope. Sample problems demonstrate that AVA improves both precision and accuracy of a target result in an efficient manner. Potential applications of MCise include molten salt fueled reactors and liquid breeders in fusion blankets. As an example, the inventory analysis of a liquid actinide fuel in the In-Zinerator, a sub-critical power reactor driven by a fusion source, is examined. The results confirm MCise as a reliable tool for inventory analysis of complex nuclear systems.

  15. Eureka-DMA: an easy-to-operate graphical user interface for fast comprehensive investigation and analysis of DNA microarray data.

    PubMed

    Abelson, Sagi

    2014-02-24

    In the past decade, the field of molecular biology has become increasingly quantitative; the rapid development of new technologies enables researchers to investigate and address, quickly and efficiently, fundamental issues that were once impossible to approach. Among these technologies, DNA microarrays provide methodology for many applications such as gene discovery, disease diagnosis, drug development and toxicological research, and they have been used increasingly since they first emerged. Multiple tools have been developed to interpret the high-throughput data produced by microarrays. However, less consideration has often been given to the fact that an extensive and effective interpretation requires close interplay between the bioinformaticians who analyze the data and the biologists who generate it. To bridge this gap and to simplify the usability of such tools, we developed Eureka-DMA - an easy-to-operate graphical user interface that allows bioinformaticians and bench biologists alike to initiate analyses and to investigate the data produced by DNA microarrays. In this paper, we describe Eureka-DMA, a user-friendly software package that comprises a set of methods for the interpretation of gene expression arrays. Eureka-DMA includes methods for the identification of genes with differential expression between conditions; it searches for enriched pathways and gene ontology terms and combines them with other relevant features. It thus enables a full understanding of the data for subsequent testing as well as for generating new hypotheses. Here we show two analyses, demonstrating examples of how Eureka-DMA can be used and its capability to produce relevant and reliable results. We have integrated several elementary expression analysis tools to provide a unified interface for their implementation. Eureka-DMA's simple graphical user interface provides an effective and efficient framework in which the investigator has a full set of tools for the visualization and interpretation of the data, with the option of exporting the analysis results for later use in other platforms. Eureka-DMA is freely available for academic users and can be downloaded at http://blue-meduza.org/Eureka-DMA.

  16. Clinical utility of the AlphaFIM® instrument in stroke rehabilitation.

    PubMed

    Lo, Alexander; Tahair, Nicola; Sharp, Shelley; Bayley, Mark T

    2012-02-01

    The AlphaFIM instrument is an assessment tool designed to facilitate discharge planning of stroke patients from acute care by extrapolating overall functional status from performance on six key Functional Independence Measure (FIM) instrument items. The aim was to determine whether the acute care AlphaFIM rating is correlated with stroke rehabilitation outcomes. In this prospective observational study, data were analyzed from 891 patients referred for inpatient stroke rehabilitation through an Internet-based referral system. Simple linear and stepwise regression models determined correlations between the rehabilitation-ready AlphaFIM rating and rehabilitation outcomes (admission and discharge FIM ratings, FIM gain, FIM efficiency, and length of stay). Covariates including demographic data, stroke characteristics, medical history, cognitive deficits, and activity tolerance were included in the stepwise regressions. The AlphaFIM instrument was significant in predicting admission and discharge FIM ratings at rehabilitation (adjusted R² 0.40 and 0.28, respectively; P < 0.0001) and was weakly correlated with FIM gain and length of stay (adjusted R² 0.04 and 0.09, respectively; P < 0.0001), but not FIM efficiency. The AlphaFIM rating was inversely related to FIM gain. Age, bowel incontinence, left hemiparesis, and previous infarcts were negative predictors of discharge FIM rating on stepwise regression. Intact executive function and physical activity tolerance of 30 to 60 mins were predictors of FIM gain. The AlphaFIM instrument is a valuable tool for triaging stroke patients from acute care to rehabilitation and predicts functional status at discharge from rehabilitation. Patients with low AlphaFIM ratings have the potential to make significant functional gains and should not be denied admission to inpatient rehabilitation programs.
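
    For reference, the outcome measures named above follow the conventional definitions used in the rehabilitation literature. A minimal sketch with invented numbers:

      def fim_outcomes(admission_fim, discharge_fim, length_of_stay_days):
          """FIM gain = discharge - admission; FIM efficiency = gain per day."""
          gain = discharge_fim - admission_fim
          return gain, gain / length_of_stay_days

      print(fim_outcomes(62, 96, 28))  # -> (34, ~1.21 FIM points per day)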

  17. Simple structured hybrid WOLEDs based on incomplete energy transfer mechanism: from blue exciplex to orange dopant.

    PubMed

    Zhang, Tianyou; Zhao, Bo; Chu, Bei; Li, Wenlian; Su, Zisheng; Yan, Xingwu; Liu, Chengyuan; Wu, Hairuo; Gao, Yuan; Jin, Fangming; Hou, Fuhua

    2015-05-15

    Exciplexes are well known as charge transfer states formed between electron-donating and electron-accepting molecules. However, exciplex-based organic light emitting diodes (OLEDs) often exhibit low efficiencies relative to pure phosphorescent OLEDs and could hardly be used to construct white OLEDs (WOLEDs). In this work, a new mechanism is developed to realize efficient WOLEDs with an extremely simple structure by redistributing the energy of the triplet exciplex to both the singlet exciplex and the orange dopant. The microscopic process of energy transfer could be directly examined by detailed photoluminescence decay measurements and time-resolved photoluminescence analysis. This strategy overcomes the low reverse intersystem crossing efficiency of the blue exciplex and the complicated device structure of traditional WOLEDs, enabling us to achieve efficient hybrid WOLEDs. Based on this mechanism, we have successfully constructed both exciplex-fluorescence and exciplex-phosphorescence hybrid WOLEDs with remarkable efficiencies.

  18. Simple structured hybrid WOLEDs based on incomplete energy transfer mechanism: from blue exciplex to orange dopant

    NASA Astrophysics Data System (ADS)

    Zhang, Tianyou; Zhao, Bo; Chu, Bei; Li, Wenlian; Su, Zisheng; Yan, Xingwu; Liu, Chengyuan; Wu, Hairuo; Gao, Yuan; Jin, Fangming; Hou, Fuhua

    2015-05-01

    Exciplexes are well known as charge transfer states formed between electron-donating and electron-accepting molecules. However, exciplex-based organic light emitting diodes (OLEDs) often exhibit low efficiencies relative to pure phosphorescent OLEDs and could hardly be used to construct white OLEDs (WOLEDs). In this work, a new mechanism is developed to realize efficient WOLEDs with an extremely simple structure by redistributing the energy of the triplet exciplex to both the singlet exciplex and the orange dopant. The microscopic process of energy transfer could be directly examined by detailed photoluminescence decay measurements and time-resolved photoluminescence analysis. This strategy overcomes the low reverse intersystem crossing efficiency of the blue exciplex and the complicated device structure of traditional WOLEDs, enabling us to achieve efficient hybrid WOLEDs. Based on this mechanism, we have successfully constructed both exciplex-fluorescence and exciplex-phosphorescence hybrid WOLEDs with remarkable efficiencies.

  19. Simple structured hybrid WOLEDs based on incomplete energy transfer mechanism: from blue exciplex to orange dopant

    PubMed Central

    Zhang, Tianyou; Zhao, Bo; Chu, Bei; Li, Wenlian; Su, Zisheng; Yan, Xingwu; Liu, Chengyuan; Wu, Hairuo; Gao, Yuan; Jin, Fangming; Hou, Fuhua

    2015-01-01

    Exciplexes are well known as charge transfer states formed between electron-donating and electron-accepting molecules. However, exciplex-based organic light emitting diodes (OLEDs) often exhibit low efficiencies relative to pure phosphorescent OLEDs and could hardly be used to construct white OLEDs (WOLEDs). In this work, a new mechanism is developed to realize efficient WOLEDs with an extremely simple structure by redistributing the energy of the triplet exciplex to both the singlet exciplex and the orange dopant. The microscopic process of energy transfer could be directly examined by detailed photoluminescence decay measurements and time-resolved photoluminescence analysis. This strategy overcomes the low reverse intersystem crossing efficiency of the blue exciplex and the complicated device structure of traditional WOLEDs, enabling us to achieve efficient hybrid WOLEDs. Based on this mechanism, we have successfully constructed both exciplex-fluorescence and exciplex-phosphorescence hybrid WOLEDs with remarkable efficiencies. PMID:25975371

  20. Simple single-emitting layer hybrid white organic light emitting with high color stability

    NASA Astrophysics Data System (ADS)

    Nguyen, C.; Lu, Z. H.

    2017-10-01

    Simultaneously achieving high efficiency and color quality at the luminance levels required for solid-state lighting has been difficult for white organic light emitting diodes (OLEDs). Single-emitting-layer (SEL) white OLEDs, in particular, exhibit a significant tradeoff between efficiency and color stability. Furthermore, despite the simplicity of SEL white OLEDs being their main advantage, the reported device structures are often complicated by the use of multiple blocking layers. In this paper, we report a highly simplified three-layered white OLED that achieves a low turn-on voltage of 2.7 V, an external quantum efficiency of 18.9% and a power efficiency of 30 lm/W at 1000 cd/m2. This simple white OLED also shows good color quality, with a color rendering index of 75, CIE coordinates (0.42, 0.46), and little color shifting at high luminance. The device consists of a SEL sandwiched between a hole transport layer and an electron transport layer. The SEL comprises a thermally activated delayed fluorescence molecule having dual functions as a blue emitter and as a host for other lower-energy emitters. The improved color stability and efficiency in such a simple device structure are explained by the elimination of significant energy barriers at the various organic-organic interfaces present in traditional devices with multiple blocking layers.

  1. A simple and efficient method to visualize and quantify the efficiency of chromosomal mutations from genome editing

    PubMed Central

    Fu, Liezhen; Wen, Luan; Luu, Nga; Shi, Yun-Bo

    2016-01-01

    Genome editing with designer nucleases such as TALEN and CRISPR/Cas enzymes has broad applications. Delivery of these designer nucleases into organisms induces various genetic mutations including deletions, insertions and nucleotide substitutions. Characterizing those mutations is critical for evaluating the efficacy and specificity of targeted genome editing. While a number of methods have been developed to identify the mutations, none other than sequencing allows the identification of the most desired mutations, i.e., out-of-frame insertions/deletions that disrupt genes. Here we report a simple and efficient method to visualize and quantify the efficiency of genomic mutations induced by genome-editing. Our approach is based on the expression of a two-color fusion protein in a vector that allows the insertion of the edited region in the genome in between the two color moieties. We show that our approach not only easily identifies developing animals with desired mutations but also efficiently quantifies the mutation rate in vivo. Furthermore, by using LacZα and GFP as the color moieties, our approach can even eliminate the need for a fluorescent microscope, allowing the analysis with simple bright field visualization. Such an approach will greatly simplify the screen for effective genome-editing enzymes and identify the desired mutant cells/animals. PMID:27748423

  2. Seed: a user-friendly tool for exploring and visualizing microbial community data.

    PubMed

    Beck, Daniel; Dennis, Christopher; Foster, James A

    2015-02-15

    In this article we present Simple Exploration of Ecological Data (Seed), a data exploration tool for microbial communities. Seed is written in R using the Shiny library. This provides access to powerful R-based functions and libraries through a simple user interface. Seed allows users to explore ecological datasets using principal coordinate analyses, scatter plots, bar plots, hierarchical clustering and heatmaps. Seed is open source and available at https://github.com/danlbek/Seed. danlbek@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  3. Comparison of four modeling tools for the prediction of potential distribution for non-indigenous weeds in the United States

    USGS Publications Warehouse

    Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony

    2018-01-01

    This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, a simple climate matching tool CLIMEX Match Climates, the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity was compared using a generalized linear mixed model. The choice of modeling tool itself had low statistical significance, while weed species alone accounted for 69.1% and 48.5% of the variance for prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones when predicting the potential distribution of a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and the testing of both new and experienced users under blind test conditions that approximate operational conditions.

  4. Eutectic salt catalyzed environmentally benign and highly efficient Biginelli reaction.

    PubMed

    Azizi, Najmadin; Dezfuli, Sahar; Hahsemi, Mohmmad Mahmoodi

    2012-01-01

    A simple deep eutectic solvent based on tin (II) chloride was used as a dual catalyst and environmentally benign reaction medium for an efficient synthesis of 3,4-dihydropyrimidin-2(1H)-one derivatives from aromatic and aliphatic aldehydes, 1,3-dicarbonyl compounds, and urea in good-to-excellent yields and with short reaction times. This simple ammonium deep eutectic solvent, easily synthesized from choline chloride and tin chloride, is relatively inexpensive and recyclable, making it attractive for industrial applications.

  5. Eutectic Salt Catalyzed Environmentally Benign and Highly Efficient Biginelli Reaction

    PubMed Central

    Azizi, Najmadin; Dezfuli, Sahar; Hahsemi, Mohmmad Mahmoodi

    2012-01-01

    A simple deep eutectic solvent based on tin (II) chloride was used as a dual catalyst and environmentally benign reaction medium for an efficient synthesis of 3,4-dihydropyrimidin-2(1H)-one derivatives from aromatic and aliphatic aldehydes, 1,3-dicarbonyl compounds, and urea in good-to-excellent yields and with short reaction times. This simple ammonium deep eutectic solvent, easily synthesized from choline chloride and tin chloride, is relatively inexpensive and recyclable, making it attractive for industrial applications. PMID:22649326

  6. Creation of a simple natural language processing tool to support an imaging utilization quality dashboard.

    PubMed

    Swartz, Jordan; Koziatek, Christian; Theobald, Jason; Smith, Silas; Iturrate, Eduardo

    2017-05-01

    Testing for venous thromboembolism (VTE) is associated with cost and risk to patients (e.g. radiation). To assess the appropriateness of imaging utilization at the provider level, it is important to know that provider's diagnostic yield (percentage of tests positive for the diagnostic entity of interest). However, determining diagnostic yield typically requires either time-consuming, manual review of radiology reports or the use of complex and/or proprietary natural language processing software. The objectives of this study were twofold: 1) to develop and implement a simple, user-configurable, and open-source natural language processing tool to classify radiology reports with high accuracy and 2) to use the results of the tool to design a provider-specific VTE imaging dashboard, consisting of both utilization rate and diagnostic yield. Two physicians reviewed a training set of 400 lower extremity ultrasound (UTZ) and computed tomography pulmonary angiogram (CTPA) reports to understand the language used in VTE-positive and VTE-negative reports. The insights from this review informed the arguments to the five modifiable parameters of the NLP tool. A validation set of 2,000 studies was then independently classified by the reviewers and by the tool; the classifications were compared and the performance of the tool was calculated. The tool was highly accurate in classifying the presence and absence of VTE for both the UTZ (sensitivity 95.7%; 95% CI 91.5-99.8, specificity 100%; 95% CI 100-100) and CTPA reports (sensitivity 97.1%; 95% CI 94.3-99.9, specificity 98.6%; 95% CI 97.8-99.4). The diagnostic yield was then calculated at the individual provider level and the imaging dashboard was created. We have created a novel NLP tool designed for users without a background in computer programming, which has been used to classify venous thromboembolism reports with a high degree of accuracy. The tool is open-source and available for download at http://iturrate.com/simpleNLP. Results obtained using this tool can be applied to enhance quality by presenting information about utilization and yield to providers via an imaging dashboard. Copyright © 2017 Elsevier B.V. All rights reserved.
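
    A toy re-creation of the report-classification approach described above. The actual tool exposes five user-configurable parameters; the phrase lists and the fixed negation window below are illustrative assumptions, not the published configuration.

      import re

      POSITIVE = [r"acute (deep venous |pulmonary )?thromb", r"pulmonary embol"]
      NEGATED  = [r"no evidence of", r"negative for", r"without"]

      def classify_report(text):
          """Return True if a radiology report reads as VTE-positive."""
          t = text.lower()
          for pattern in POSITIVE:
              for m in re.finditer(pattern, t):
                  window = t[max(0, m.start() - 40):m.start()]  # negation look-behind
                  if not any(re.search(neg, window) for neg in NEGATED):
                      return True
          return False

      print(classify_report("No evidence of pulmonary embolism."))           # False
      print(classify_report("Acute pulmonary embolism, right lower lobe."))  # True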

  7. Multicentre study for validation of the French addictovigilance network reports assessment tool

    PubMed Central

    Hardouin, Jean Benoit; Rousselet, Morgane; Gerardin, Marie; Guerlais, Marylène; Guillou, Morgane; Bronnec, Marie; Sébille, Véronique; Jolliet, Pascale

    2016-01-01

    Aims The French health authority (ANSM) is responsible for monitoring medicinal and other drug dependencies. To support these activities, the ANSM manages a network of 13 drug dependence evaluation and information centres (Centres d'Evaluation et d'Information sur la Pharmacodépendance - Addictovigilance - CEIP-A) throughout France. In 2006, the Nantes CEIP-A created a new tool called the EGAP (Echelle de GrAvité de la Pharmacodépendance - drug dependence severity scale) based on DSM-IV criteria. This tool allows the creation of a substance use profile that enables drug dependence severity to be quantified homogeneously by assigning a score to each substance indicated in the reports from health professionals. This article describes the validation and psychometric properties of the drug dependence severity score obtained from the scale (ClinicalTrials.gov NCT01052675). Method The construct validity of the EGAP, the concurrent validity and discriminative ability of the EGAP score, the consistency of answers to EGAP items, and the internal consistency and inter-rater reliability of the EGAP score were assessed using statistical methods generally used for psychometric tests. Results The total EGAP score was a reliable and precise measure for evaluating drug dependence (Cronbach alpha = 0.84; ASI correlation = 0.70; global ICC = 0.92). In addition to its good psychometric properties, the EGAP is a simple and efficient tool that can be easily specified on the official ANSM notification form. Conclusion The good psychometric properties of the total EGAP score justify its use for evaluating the severity of drug dependence. PMID:27302554
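
    Cronbach's alpha, the internal-consistency statistic reported above, is straightforward to compute from a subjects-by-items score matrix. A minimal sketch (the scores below are made up, not EGAP data):

      import numpy as np

      def cronbach_alpha(items):
          """items: (n_subjects, n_items) array of item scores."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_variances = items.var(axis=0, ddof=1).sum()
          total_variance = items.sum(axis=1).var(ddof=1)
          # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
          return (k / (k - 1)) * (1 - item_variances / total_variance)

      scores = np.array([[3, 2, 3, 4], [1, 1, 2, 1], [4, 3, 4, 4], [2, 2, 1, 2]])
      print(round(cronbach_alpha(scores), 2))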

  8. Recent Advances in Genome Editing Using CRISPR/Cas9

    PubMed Central

    Ding, Yuduan; Li, Hong; Chen, Ling-Ling; Xie, Kabin

    2016-01-01

    The CRISPR (clustered regularly interspaced short palindromic repeat)-Cas9 (CRISPR-associated nuclease 9) system is a versatile tool for genome engineering that uses a guide RNA (gRNA) to target Cas9 to a specific sequence. This simple RNA-guided genome-editing technology has become a revolutionary tool in biology and has many innovative applications in different fields. In this review, we briefly introduce the Cas9-mediated genome-editing method, summarize the recent advances in CRISPR/Cas9 technology, and discuss their implications for plant research. To date, targeted gene knockout using the Cas9/gRNA system has been established in many plant species, and the targeting efficiency and capacity of Cas9 have been improved by optimizing its expression and that of its gRNA. The CRISPR/Cas9 system can also be used for sequence-specific mutagenesis/integration and transcriptional control of target genes. We also discuss off-target effects and the constraint that the protospacer-adjacent motif (PAM) puts on CRISPR/Cas9 genome engineering. To address these problems, a number of bioinformatic tools are available to help design specific gRNAs, and new Cas9 variants and orthologs with high fidelity and alternative PAM specificities have been engineered. Owing to these recent efforts, the CRISPR/Cas9 system is becoming a revolutionary and flexible tool for genome engineering. Adoption of the CRISPR/Cas9 technology in plant research would enable the investigation of plant biology at an unprecedented depth and create innovative applications in precise crop breeding. PMID:27252719
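
    As a small illustration of the PAM constraint discussed above: the SpCas9 PAM is NGG, so candidate protospacers are the 20 nt immediately 5' of an NGG. The sketch below scans one strand only; real gRNA design tools also search the reverse strand and score specificity and off-targets.

      import re

      def find_targets(seq):
          """Return (position, protospacer) pairs for 20-mers followed by NGG."""
          targets = []
          # Overlapping matches via a lookahead with a capture group.
          for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", seq):
              targets.append((m.start(1), m.group(1)))
          return targets

      dna = "ATGCTGACCTTGGAGCTGTCGATCGGAGGCTAGCTAGGATCGATCGGGTACG"
      for pos, proto in find_targets(dna):
          print(pos, proto)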

  9. Wireless remote control of clinical image workflow: using a PDA for off-site distribution and disaster recovery.

    PubMed

    Documet, Jorge; Liu, Brent J; Documet, Luis; Huang, H K

    2006-07-01

    This paper describes a picture archiving and communication system (PACS) tool based on Web technology that remotely manages medical images between a PACS archive and remote destinations. Successfully implemented in a clinical environment and also demonstrated for the past 3 years at the conferences of various organizations, including the Radiological Society of North America, this tool provides a very practical and simple way to manage a PACS, including off-site image distribution and disaster recovery. The application is robust and flexible and can be used on a standard PC workstation or a Tablet PC, but more important, it can be used with a personal digital assistant (PDA). With a PDA, the Web application becomes a powerful wireless and mobile image management tool. The application's quick and easy-to-use features allow users to perform Digital Imaging and Communications in Medicine (DICOM) queries and retrievals with a single interface, without having to worry about the underlying configuration of DICOM nodes. In addition, this frees up dedicated PACS workstations to perform their specialized roles within the PACS workflow. This tool has been used at Saint John's Health Center in Santa Monica, California, for 2 years. The average number of queries per month is 2,021, with 816 C-MOVE retrieve requests. Clinical staff members can use PDAs to manage image workflow and PACS examination distribution conveniently for off-site consultations by referring physicians and radiologists and for disaster recovery. This solution also improves radiologists' effectiveness and efficiency in health care delivery both within radiology departments and for off-site clinical coverage.
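
    For readers unfamiliar with the underlying operations, a DICOM C-MOVE request asks the archive to push a study to a named destination. A minimal sketch using the open-source pynetdicom library (not the authors' software; the host, port, study UID, and AE titles are hypothetical):

      from pydicom.dataset import Dataset
      from pynetdicom import AE
      from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelMove

      ae = AE(ae_title="PDA_CLIENT")
      ae.add_requested_context(StudyRootQueryRetrieveInformationModelMove)

      query = Dataset()
      query.QueryRetrieveLevel = "STUDY"
      query.StudyInstanceUID = "1.2.840.113619.2.55.3"  # hypothetical UID

      assoc = ae.associate("pacs.example.org", 104)
      if assoc.is_established:
          # Ask the archive to push the study to the AE named OFFSITE_WS.
          for status, _identifier in assoc.send_c_move(
                  query, "OFFSITE_WS", StudyRootQueryRetrieveInformationModelMove):
              if status:
                  print("C-MOVE status: 0x{0:04X}".format(status.Status))
          assoc.release()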

  10. A Simple Mechanical Model for the Isotropic Harmonic Oscillator

    ERIC Educational Resources Information Center

    Nita, Gelu M.

    2010-01-01

    A constrained elastic pendulum is proposed as a simple mechanical model for the isotropic harmonic oscillator. The conceptual and mathematical simplicity of this model recommends it as an effective pedagogical tool in teaching basic physics concepts at advanced high school and introductory undergraduate course levels. (Contains 2 figures.)

  11. The Simple Theory of Public Library Services.

    ERIC Educational Resources Information Center

    Newhouse, Joseph P.

    A simple normative theory applicable to public library services was developed as a tool to aid libraries in answering the question: which books should be bought by the library? Although developed for normative purposes, the theory generates testable predictions. It is relevant to measuring benefits from services which are provided publicly because…

  12. Safety in the Chemical Laboratory: Laboratory Air Quality: Part I. A Concentration Model.

    ERIC Educational Resources Information Center

    Butcher, Samuel S.; And Others

    1985-01-01

    Offers a simple model for estimating vapor concentrations in instructional laboratories. Three methods are described for measuring ventilation rates, and the results of measurements in six laboratories are presented. The model should provide a simple screening tool for evaluating worst-case personal exposures. (JN)
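
    The article's exact formulation is not reproduced here, but worst-case screening estimates of this kind are often based on the standard one-box (well-mixed room) model, in which the concentration approaches the steady state G/Q exponentially. A sketch with made-up numbers:

      import math

      def concentration(G_mg_min, Q_m3_min, V_m3, t_min):
          """Well-mixed concentration (mg/m^3) after t minutes."""
          c_ss = G_mg_min / Q_m3_min                          # steady state G/Q
          return c_ss * (1 - math.exp(-Q_m3_min * t_min / V_m3))

      # Example: 50 mg/min evaporating in a 150 m^3 lab ventilated at 10 m^3/min.
      print(round(concentration(50, 10, 150, 60), 2))  # approaches 5 mg/m^3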

  13. Always look on both sides: Phylogenetic information conveyed by simple sequence repeat allele sequences

    USDA-ARS?s Scientific Manuscript database

    Simple sequence repeat (SSR) markers are widely used tools for inferences about genetic diversity, phylogeography and spatial genetic structure. Their applications assume that variation among alleles is essentially caused by an expansion or contraction of the number of repeats and that, accessorily,...

  14. Evaluating the Auto-MODS Assay, a Novel Tool for Tuberculosis Diagnosis for Use in Resource-Limited Settings

    PubMed Central

    Wang, Linwei; Mohammad, Sohaib H.; Li, Qiaozhi; Rienthong, Somsak; Rienthong, Dhanida; Nedsuwan, Supalert; Mahasirimongkol, Surakameth; Yasui, Yutaka

    2014-01-01

    There is an urgent need for simple, rapid, and affordable diagnostic tests for tuberculosis (TB) to combat the great burden of the disease in developing countries. The microscopic observation drug susceptibility assay (MODS) is a promising tool to fill this need, but it is not widely used due to concerns regarding its biosafety and efficiency. This study evaluated the automated MODS (Auto-MODS), which operates on principles similar to those of MODS but with several key modifications, making it an appealing alternative to MODS in resource-limited settings. In the operational setting of Chiang Rai, Thailand, we compared the performance of Auto-MODS with the gold standard liquid culture method in Thailand, mycobacterial growth indicator tube (MGIT) 960 plus the SD Bioline TB Ag MPT64 test, in terms of accuracy and efficiency in differentiating TB and non-TB samples as well as distinguishing TB and multidrug-resistant (MDR) TB samples. Sputum samples from clinically diagnosed TB and non-TB subjects across 17 hospitals in Chiang Rai were consecutively collected from May 2011 to September 2012. A total of 360 samples were available for evaluation, of which 221 (61.4%) were positive and 139 (38.6%) were negative for mycobacterial cultures according to MGIT 960. Of the 221 true-positive samples, Auto-MODS identified 212 as positive and 9 as negative (sensitivity, 95.9%; 95% confidence interval [CI], 92.4% to 98.1%). Of the 139 true-negative samples, Auto-MODS identified 135 as negative and 4 as positive (specificity, 97.1%; 95% CI, 92.8% to 99.2%). The median time to culture positivity was 10 days, with an interquartile range of 8 to 13 days for Auto-MODS. Auto-MODS is an effective and cost-sensitive alternative diagnostic tool for TB diagnosis in resource-limited settings. PMID:25378569
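
    The sensitivity and specificity quoted above follow directly from the reported counts, as a quick arithmetic check shows:

      tp, fn = 212, 9   # culture-positive samples called positive / negative
      tn, fp = 135, 4   # culture-negative samples called negative / positive

      sensitivity = tp / (tp + fn)   # 212/221
      specificity = tn / (tn + fp)   # 135/139
      print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
      # sensitivity = 95.9%, specificity = 97.1%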

  15. Automatic detection and analysis of cell motility in phase-contrast time-lapse images using a combination of maximally stable extremal regions and Kalman filter approaches.

    PubMed

    Kaakinen, M; Huttunen, S; Paavolainen, L; Marjomäki, V; Heikkilä, J; Eklund, L

    2014-01-01

    Phase-contrast illumination is a simple and the most commonly used microscopy method for observing nonstained living cells. Automatic cell segmentation and motion analysis provide tools to analyze single-cell motility in large cell populations. However, the challenge is to find a sophisticated method that is sufficiently accurate to generate reliable results, robust enough to function under the wide range of illumination conditions encountered in phase-contrast microscopy, and also computationally light enough for efficient analysis of large numbers of cells and image frames. To develop better automatic tools for the analysis of low-magnification phase-contrast images in time-lapse cell migration movies, we investigated the performance of a cell segmentation method that is based on the intrinsic properties of maximally stable extremal regions (MSER). MSER was found to be reliable and effective in a wide range of experimental conditions. When compared to the commonly used segmentation approaches, MSER required negligible preoptimization steps, thus dramatically reducing the computation time. To analyze cell migration characteristics in time-lapse movies, the MSER-based automatic cell detection was accompanied by a Kalman filter multiobject tracker that efficiently tracked individual cells even in confluent cell populations. This allowed quantitative cell motion analysis resulting in accurate measurements of the migration magnitude and direction of individual cells, as well as characteristics of collective migration of cell groups. Our results demonstrate that MSER accompanied by temporal data association is a powerful tool for accurate and reliable analysis of the dynamic behaviour of cells in phase-contrast image sequences. These techniques tolerate varying and nonoptimal imaging conditions, and due to their relatively light computational requirements they should help to resolve problems in computationally demanding and often time-consuming large-scale dynamical analysis of cultured cells. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
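
    MSER detectors are available off the shelf; a minimal sketch with OpenCV is below. The file name and parameter values are illustrative, not those used in the study.

      import cv2

      img = cv2.imread("phase_contrast_frame.png", cv2.IMREAD_GRAYSCALE)
      # Positional arguments: delta, min_area, max_area (pixel counts).
      mser = cv2.MSER_create(5, 60, 14400)
      regions, bboxes = mser.detectRegions(img)
      print(f"{len(regions)} stable regions detected")
      for x, y, w, h in bboxes:
          cv2.rectangle(img, (x, y), (x + w, y + h), 255, 1)
      cv2.imwrite("detected_cells.png", img)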

  16. Occupational self-coding and automatic recording (OSCAR): a novel web-based tool to collect and code lifetime job histories in large population-based studies.

    PubMed

    De Matteis, Sara; Jarvis, Deborah; Young, Heather; Young, Alan; Allen, Naomi; Potts, James; Darnton, Andrew; Rushton, Lesley; Cullinan, Paul

    2017-03-01

    Objectives The standard approach to the assessment of occupational exposures is through the manual collection and coding of job histories. This method is time-consuming and costly and makes it potentially unfeasible to perform high quality analyses on occupational exposures in large population-based studies. Our aim was to develop a novel, efficient web-based tool to collect and code lifetime job histories in the UK Biobank, a population-based cohort of over 500 000 participants. Methods We developed OSCAR (occupations self-coding automatic recording) based on the hierarchical structure of the UK Standard Occupational Classification (SOC) 2000, which allows individuals to collect and automatically code their lifetime job histories via a simple decision-tree model. Participants were asked to find each of their jobs by selecting appropriate job categories until they identified their job title, which was linked to a hidden 4-digit SOC code. For each occupation a job title in free text was also collected to estimate Cohen's kappa (κ) inter-rater agreement between SOC codes assigned by OSCAR and an expert manual coder. Results OSCAR was administered to 324 653 UK Biobank participants with an existing email address between June and September 2015. Complete 4-digit SOC-coded lifetime job histories were collected for 108 784 participants (response rate: 34%). Agreement between the 4-digit SOC codes assigned by OSCAR and the manual coder for a random sample of 400 job titles was moderately good [κ=0.45, 95% confidence interval (95% CI) 0.42-0.49], and improved when broader job categories were considered (κ=0.64, 95% CI 0.61-0.69 at a 1-digit SOC-code level). Conclusions OSCAR is a novel, efficient, and reasonably reliable web-based tool for collecting and automatically coding lifetime job histories in large population-based studies. Further application in other research projects for external validation purposes is warranted.
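
    The agreement statistic used above, Cohen's kappa, is a one-liner with scikit-learn; the toy labels below merely stand in for 4-digit SOC codes:

      from sklearn.metrics import cohen_kappa_score

      oscar_codes  = ["2314", "5223", "2314", "9233", "3537", "5223"]
      manual_codes = ["2314", "5223", "2315", "9233", "3537", "2314"]
      print(round(cohen_kappa_score(oscar_codes, manual_codes), 2))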

  17. Kangaroo – A pattern-matching program for biological sequences

    PubMed Central

    2002-01-01

    Background Biologists are often interested in performing a simple database search to identify proteins or genes that contain a well-defined sequence pattern. Many databases do not provide straightforward or readily available query tools to perform simple searches, such as identifying transcription binding sites, protein motifs, or repetitive DNA sequences. However, in many cases simple pattern-matching searches can reveal a wealth of information. We present in this paper a regular expression pattern-matching tool that was used to identify short repetitive DNA sequences in human coding regions for the purpose of identifying potential mutation sites in mismatch repair deficient cells. Results Kangaroo is a web-based regular expression pattern-matching program that can search for patterns in DNA, protein, or coding region sequences in ten different organisms. The program is implemented to facilitate a wide range of queries with no restriction on the length or complexity of the query expression. The program is accessible on the web at http://bioinfo.mshri.on.ca/kangaroo/ and the source code is freely distributed at http://sourceforge.net/projects/slritools/. Conclusion A low-level simple pattern-matching application can prove to be a useful tool in many research settings. For example, Kangaroo was used to identify potential genetic targets in a human colorectal cancer variant that is characterized by a high frequency of mutations in coding regions containing mononucleotide repeats. PMID:12150718
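
    A search of the kind described, locating mononucleotide repeats in a coding sequence, takes only a few lines with standard regular expressions (the sequence and the repeat threshold of 8 below are illustrative):

      import re

      seq = "ATGGAAAAAAAACGTACCCCCCCCCGTTGCTTTTTTTTAA"
      for m in re.finditer(r"(A{8,}|C{8,}|G{8,}|T{8,})", seq):
          print(f"repeat {m.group(0)} at position {m.start()}")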

  18. Rating a Teacher Observation Tool: Five Ways to Ensure Classroom Observations are Focused and Rigorous

    ERIC Educational Resources Information Center

    New Teacher Project, 2011

    2011-01-01

    This "Rating a Teacher Observation Tool" identifies five simple questions and provides an easy-to-use scorecard to help policymakers decide whether an observation framework is likely to produce fair and accurate results. The five questions are: (1) Do the criteria and tools cover the classroom performance areas most connected to student outcomes?…

  19. Wire harness twisting aid

    NASA Technical Reports Server (NTRS)

    Casey, E. J.; Commadore, C. C.; Ingles, M. E.

    1980-01-01

    Long wire bundles twist into uniform spiral harnesses with help of simple apparatus. Wires pass through spacers and through hand-held tool with hole for each wire. Ends are attached to low speed bench motor. As motor turns, operator moves hand tool away forming smooth twists in wires between motor and tool. Technique produces harnesses that generate less radio-frequency interference than do irregularly twisted cables.

  20. Simple Ontology Format (SOFT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorokine, Alexandre

    2011-10-01

    Simple Ontology Format (SOFT) library and file format specification provides a set of simple tools for developing and maintaining ontologies. The library, implemented as a Perl module, supports parsing and verification of files in SOFT format, operations with ontologies (adding, removing, or filtering entities), and conversion of ontologies into other formats. SOFT allows users to quickly create ontologies using only a basic text editor, verify them, and portray them in a graph layout system using customized styles.

  1. Evaluation of a simple method for the automatic assignment of MeSH descriptors to health resources in a French online catalogue.

    PubMed

    Névéol, Aurélie; Pereira, Suzanne; Kerdelhué, Gaetan; Dahamna, Badisse; Joubert, Michel; Darmoni, Stéfan J

    2007-01-01

    The growing number of resources to be indexed in the catalogue of online health resources in French (CISMeF) calls for curating strategies involving automatic indexing tools while maintaining the catalogue's high indexing quality standards. To develop a simple automatic tool that retrieves MeSH descriptors from document titles. In parallel to research on advanced indexing methods, a bag-of-words tool was developed for timely inclusion in CISMeF's maintenance system. An evaluation was carried out on a corpus of 99 documents. The indexing sets retrieved by the automatic tool were compared to manual indexing based on the title and on the full text of resources. 58% of the major main headings were retrieved by the bag-of-words algorithm, and the precision on main heading retrieval was 69%. Bag-of-words indexing has effectively been used on selected resources to be included in CISMeF since August 2006. Meanwhile, ongoing work aims at improving the current version of the tool.
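
    A bag-of-words title indexer of this kind can be sketched in a few lines: tokenize the title and look each token up in a term-to-descriptor table. The two-entry table below is a hypothetical stand-in for the French MeSH lexicon used by CISMeF.

      import re

      TERM_TO_MESH = {
          "asthme": "Asthma",
          "enfant": "Child",
      }

      def index_title(title):
          tokens = re.findall(r"\w+", title.lower())
          return sorted({TERM_TO_MESH[t] for t in tokens if t in TERM_TO_MESH})

      print(index_title("Prise en charge de l'asthme chez l'enfant"))
      # ['Asthma', 'Child']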

  2. An adaptive proper orthogonal decomposition method for model order reduction of multi-disc rotor system

    NASA Astrophysics Data System (ADS)

    Jin, Yulin; Lu, Kuan; Hou, Lei; Chen, Yushu

    2017-12-01

    The proper orthogonal decomposition (POD) method is a widely used and efficient tool for order reduction of high-dimensional complex systems in many research fields. However, the robustness problem of this method remains unsolved, although several modified POD methods have been proposed to address it. In this paper, a new adaptive POD method called the interpolation Grassmann manifold (IGM) method is proposed to address the local-validity weakness of the interpolation tangent-space of Grassmann manifold (ITGM) method over a wider parametric region. This method is demonstrated here on a nonlinear rotor system with 33 degrees of freedom (DOFs), a pair of liquid-film bearings, and a pedestal looseness fault. The motion region of the rotor system is divided into two parts: a simple motion region and a complex motion region. The adaptive POD method is compared with the ITGM method for large and small parameter spans in the two parametric regions to present the advantages of this method and the disadvantages of the ITGM method. Comparisons of the responses verify the accuracy and robustness of the adaptive POD method, and the computational efficiency is also analyzed. As a result, the new adaptive POD method has strong robustness and high computational efficiency and accuracy over a wide parameter range.
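
    The POD step itself reduces to a singular value decomposition of a snapshot matrix; a minimal sketch is below. The adaptive Grassmann-interpolation machinery of the paper is not shown, and the random snapshots merely stand in for rotor trajectories.

      import numpy as np

      def pod_basis(snapshots, r):
          """snapshots: (n_dof, n_snapshots) matrix; returns the first r POD modes."""
          U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
          energy = np.cumsum(s**2) / np.sum(s**2)
          print(f"r = {r} modes capture {energy[r - 1]:.1%} of the energy")
          return U[:, :r]

      X = np.random.rand(66, 200)   # stand-in state snapshots of a 33-DOF rotor
      Phi = pod_basis(X, 5)         # reduced basis
      X_red = Phi.T @ X             # project full trajectories onto the basis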

  3. Heart-Rate Variability During Deep Sleep in World-Class Alpine Skiers: A Time-Efficient Alternative to Morning Supine Measurements.

    PubMed

    Herzig, David; Testorelli, Moreno; Olstad, Daniela Schäfer; Erlacher, Daniel; Achermann, Peter; Eser, Prisca; Wilhelm, Matthias

    2017-05-01

    It is increasingly popular to use heart-rate variability (HRV) to tailor training for athletes. A time-efficient method is HRV assessment during deep sleep. To validate the selection of deep-sleep segments identified by RR intervals with simultaneous electroencephalography (EEG) recordings and to compare HRV parameters of these segments with those of standard morning supine measurements. In 11 world-class alpine skiers, RR intervals were monitored during 10 nights, and simultaneous EEGs were recorded during 2-4 nights. Deep sleep was determined from the HRV signal and verified by delta power from the EEG recordings. Four further segments were chosen for HRV determination, namely, a 4-h segment from midnight to 4 AM and three 5-min segments: 1 just before awakening, 1 after waking in supine position, and 1 in standing after orthostatic challenge. Training load was recorded every day. A total of 80 night and 68 morning measurements of 9 athletes were analyzed. Good correspondence between the phases selected by RR intervals vs those selected by EEG was found. Concerning root-mean-squared difference of successive RR intervals (RMSSD), a marker for parasympathetic activity, the best relationship with the morning supine measurement was found in deep sleep. HRV is a simple tool for approximating deep-sleep phases, and HRV measurement during deep sleep could provide a time-efficient alternative to HRV in supine position.
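
    RMSSD, the parasympathetic marker referred to above, is computed directly from successive RR-interval differences; a minimal sketch with illustrative values in milliseconds:

      import numpy as np

      def rmssd(rr_ms):
          """Root mean square of successive RR-interval differences."""
          diffs = np.diff(np.asarray(rr_ms, dtype=float))
          return np.sqrt(np.mean(diffs**2))

      rr = [812, 845, 831, 858, 840, 866, 849]
      print(f"RMSSD = {rmssd(rr):.1f} ms")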

  4. Building America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brad Oberg

    2010-12-31

    Builders generally use a 'spec and purchase' business management system (BMS) when implementing energy efficiency. A BMS is the overall operational and organizational systems and strategies that a builder uses to set up and run its company. This type of BMS treats building performance as a simple technology swap (e.g. a tank water heater to a tankless water heater) and typically compartmentalizes energy efficiency within one or two groups in the organization (e.g. purchasing and construction). While certain tools, such as details, checklists, and scopes of work, can assist builders in managing the quality of the construction of higher performance homes, they do nothing to address the underlying operational strategies and issues related to change management that builders face when they make high performance homes a core part of their mission. To achieve the systems integration necessary for attaining 40%+ levels of energy efficiency, while capturing the cost tradeoffs, builders must use a 'systems approach' BMS, rather than a 'spec and purchase' BMS. The following attributes are inherent in a systems approach BMS; they are also generally seen in quality management systems (QMS), such as the National Housing Quality Certification program: cultural and corporate alignment, clear intent for quality and performance, increased collaboration across internal and external teams, better communication practices and systems, a disciplined approach to quality control, measurement and verification of performance, continuous feedback and improvement, and whole house integrated design and specification.

  5. Parameter estimation by Differential Search Algorithm from horizontal loop electromagnetic (HLEM) data

    NASA Astrophysics Data System (ADS)

    Alkan, Hilal; Balkaya, Çağlayan

    2018-02-01

    We present an efficient inversion tool for parameter estimation from horizontal loop electromagnetic (HLEM) data using the Differential Search Algorithm (DSA), a recently proposed swarm-intelligence-based metaheuristic. The depth, dip, and origin of a thin subsurface conductor causing the anomaly are the parameters estimated by the HLEM method, commonly known as Slingram. The applicability of the developed scheme was first tested on two synthetically generated anomalies, with and without noise content. Two control parameters affecting the convergence characteristics of the algorithm were tuned for these anomalies, which include one and two conductive bodies, respectively. The tuned control parameters yielded more successful statistical results than the parameter couples widely used in DSA applications. Two field anomalies measured over a dipping graphitic shale in Northern Australia were then considered, and the algorithm provided depth estimates in good agreement with those of previous studies and with drilling information. Furthermore, the efficiency and reliability of the results obtained were investigated via probability density functions. Considering the results obtained, we conclude that DSA, characterized by a simple algorithmic structure, is an efficient and promising metaheuristic for other relatively low-dimensional geophysical inverse problems. Finally, the developed scheme is easy to use and flexible, so researchers familiar with its content can readily modify and extend it for their own optimization problems.

  6. Measuring DNA hybridization using fluorescent DNA-stabilized silver clusters to investigate mismatch effects on therapeutic oligonucleotides.

    PubMed

    de Bruin, Donny; Bossert, Nelli; Aartsma-Rus, Annemieke; Bouwmeester, Dirk

    2018-04-06

    Short nucleic acid oligomers have found a wide range of applications in experimental physics, biology and medicine, and show potential for the treatment of acquired and genetic diseases. These applications rely heavily on the predictability of hybridization through Watson-Crick base pairing, which allows positioning on a nanometer scale and binding to target transcripts, but they must also contend with off-target binding to transcripts with partial homology. These effects are of particular importance in the development of therapeutic oligonucleotides, where off-target effects caused by the binding of mismatched sequences need to be avoided. We employ a novel method of probing DNA hybridization using optically active DNA-stabilized silver clusters (Ag-DNA) to measure binding efficiencies through a change in fluorescence intensity. In this way we can determine the oligomers' location-specific sensitivity to individual mismatches in the sequence. The results reveal a strong dependence of the hybridization on the location of the mismatch, whereby mismatches close to the edges and center show a relatively minor impact. In parallel, we propose a simple model for calculating the annealing ratios of mismatched DNA sequences, which supports our experimental results. The primary result of this work is a demonstration of a novel technique to measure DNA hybridization using fluorescent Ag-DNA. With this technique, we investigated the effect of mismatches on the hybridization efficiency and found a significant dependence on the location of individual mismatches. These effects are strongly influenced by the length of the oligonucleotides used. The novel probe method based on fluorescent Ag-DNA functions as a reliable tool for measuring this behavior. As a secondary result, we formulated a simple model that is consistent with the experimental data.

  7. MnO2 nanosheet mediated "DD-A" FRET binary probes for sensitive detection of intracellular mRNA.

    PubMed

    Ou, Min; Huang, Jin; Yang, Xiaohai; Quan, Ke; Yang, Yanjing; Xie, Nuli; Wang, Kemin

    2017-01-01

    The donor donor-acceptor (DD-A) FRET model has proven to have a higher FRET efficiency than the donor-acceptor acceptor (D-AA), donor-acceptor (D-A), and donor donor-acceptor acceptor (DD-AA) FRET models. The in-tube and in-cell experiments clearly demonstrate that the "DD-A" FRET binary probes can indeed increase the FRET efficiency and provide higher imaging contrast, about one order of magnitude higher than the ordinary "D-A" model. Furthermore, MnO2 nanosheets were employed to deliver these probes into living cells for intracellular TK1 mRNA detection because they can adsorb ssDNA probes, penetrate the cell membrane, and be reduced to Mn2+ ions by intracellular GSH. The results indicated that the MnO2 nanosheet mediated "DD-A" FRET binary probes are capable of sensitive and selective sensing of gene expression and of chemical-stimulus-induced changes in gene expression levels in cancer cells. We believe that the MnO2 nanosheet mediated "DD-A" FRET binary probes have potential as a simple but powerful tool for basic research and clinical diagnosis.

  8. Exact solution of a modified El Farol's bar problem: Efficiency and the role of market impact

    NASA Astrophysics Data System (ADS)

    Marsili, Matteo; Challet, Damien; Zecchina, Riccardo

    2000-06-01

    We discuss a model of heterogeneous, inductive rational agents inspired by the El Farol Bar problem and the Minority Game. As in markets, agents interact through a collective aggregate variable - which plays a role similar to price - whose value is fixed by all of them. Agents follow a simple reinforcement-learning dynamics where the reinforcement, for each of their available strategies, is related to the payoff delivered by that strategy. We derive the exact solution of the model in the “thermodynamic” limit of infinitely many agents using tools of statistical physics of disordered systems. Our results show that the impact of agents on the market price plays a key role: even though price has a weak dependence on the behavior of each individual agent, the collective behavior crucially depends on whether agents account for such dependence or not. Remarkably, if the adaptive behavior of agents accounts even “infinitesimally” for this dependence they can, in a whole range of parameters, reduce global fluctuations by a finite amount. Both global efficiency and individual utility improve with respect to a “price taker” behavior if agents account for their market impact.
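
    A toy simulation conveys the flavor of these dynamics (this is the standard Minority Game with score-based strategy selection, not the exact model solved analytically in the paper):

      import numpy as np

      rng = np.random.default_rng(0)
      N, S, M, T = 101, 2, 3, 2000      # agents, strategies each, memory bits, rounds
      P = 2 ** M                        # number of distinct histories
      strategies = rng.choice([-1, 1], size=(N, S, P))
      scores = np.zeros((N, S))
      history = 0
      attendance = []

      for _ in range(T):
          best = scores.argmax(axis=1)                  # each agent plays its best strategy
          actions = strategies[np.arange(N), best, history]
          A = int(actions.sum())                        # aggregate, price-like variable
          attendance.append(A)
          # Reinforce strategies that would have chosen the minority side.
          scores -= strategies[:, :, history] * np.sign(A)
          history = ((history << 1) | (A < 0)) % P      # append winning side to history
      print("global fluctuations sigma^2/N =", np.var(attendance) / N)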

  9. Numerical and experimental characterization of a novel modular passive micromixer.

    PubMed

    Pennella, Francesco; Rossi, Massimiliano; Ripandelli, Simone; Rasponi, Marco; Mastrangelo, Francesco; Deriu, Marco A; Ridolfi, Luca; Kähler, Christian J; Morbiducci, Umberto

    2012-10-01

    This paper reports a new low-cost passive microfluidic mixer design, based on a replication of identical mixing units composed of microchannels with variable-curvature (clothoid) geometry. The micromixer presents a compact and modular architecture that can be easily fabricated using a simple and reliable fabrication process. The particular clothoid-based geometry enhances the mixing by inducing transversal secondary flows and recirculation effects. The role of the relevant fluid mechanics mechanisms promoting the mixing in this geometry was analysed using computational fluid dynamics (CFD) for Reynolds numbers ranging from 1 to 110. Mixing potency was quantitatively evaluated by calculating mixing efficiency, while particle dispersion was assessed through the lacunarity index. The results show that the secondary flow arrangement and recirculation effects are able to provide a mixing efficiency equal to 80% at Reynolds numbers above 70. In addition, the analysis of particle distribution establishes lacunarity as a powerful tool to quantify the dispersion of fluid particles and, in turn, the overall mixing. The microscopic laser-induced fluorescence (μLIF) technique was applied to fabricated micromixer prototypes to characterize mixing. The experimental results confirmed the mixing potency of the microdevice.

  10. Computational Relativistic Astrophysics Using the Flowfield-Dependent Variation Theory

    NASA Technical Reports Server (NTRS)

    Richardson, G. A.; Chung, T. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    Theoretical models, observations and measurements have preoccupied astrophysicists for many centuries. Only in recent years has the theory of relativity as applied to astrophysical flows met the challenges of how the governing equations can be solved numerically with accuracy and efficiency. Even without the effects of relativity, the physics of magnetohydrodynamic flow instability, turbulence, radiation, and enhanced transport in accretion disks has not been completely resolved. Relativistic effects become pronounced in such cases as jet formation from black hole magnetized accretion disks and also in the study of gamma-ray bursts (GRB). Thus, our concern in this paper is to reexamine existing numerical simulation tools as to the accuracy and efficiency of computations and to introduce a new approach known as the flowfield-dependent variation (FDV) method. The main feature of the FDV method consists of accommodating discontinuities of shock waves and high gradients of flow variables such as occur in turbulence and unstable motions. In this paper, the physics involved in the solution of relativistic hydrodynamics and solution strategies of the FDV theory are elaborated. The general relativistic astrophysical flow and shock solver (GRAFSS) is introduced, and some simple example problems for Computational Relativistic Astrophysics (CRA) are demonstrated.

  11. ENCoRE: an efficient software for CRISPR screens identifies new players in extrinsic apoptosis.

    PubMed

    Trümbach, Dietrich; Pfeiffer, Susanne; Poppe, Manuel; Scherb, Hagen; Doll, Sebastian; Wurst, Wolfgang; Schick, Joel A

    2017-11-25

    As CRISPR/Cas9-mediated screens with pooled guide libraries in somatic cells become increasingly established, an unmet need for rapid and accurate companion informatics tools has emerged. We have developed lightweight and efficient software to easily manipulate large raw next-generation sequencing datasets derived from such screens into informative relational context with graphical support. The advantages of the software, entitled ENCoRE (Easy NGS-to-Gene CRISPR REsults), include a simple graphical workflow, platform independence, local and fast multithreaded processing, data pre-processing, and gene mapping with custom library import. We demonstrate the capabilities of ENCoRE to interrogate results from a pooled CRISPR cellular viability screen following Tumor Necrosis Factor-alpha challenge. The results not only identified stereotypical players in extrinsic apoptotic signaling but also two as yet uncharacterized members of the extrinsic apoptotic cascade, Smg7 and Ces2a. We further validated and characterized cell lines containing mutations in these genes against a panel of cell death stimuli and for involvement in p53 signaling. In summary, this software enables bench scientists with sensitive data or without access to informatics cores to rapidly interpret results from large-scale experiments resulting from pooled CRISPR/Cas9 library screens.

  12. Multicategory Composite Least Squares Classifiers

    PubMed Central

    Park, Seo Young; Liu, Yufeng; Liu, Dacheng; Scholl, Paul

    2010-01-01

    Classification is a very useful statistical tool for information extraction. In particular, multicategory classification is commonly seen in various applications. Although binary classification problems are heavily studied, extensions to the multicategory case are much less so. In view of the increased complexity and volume of modern statistical problems, it is desirable to have multicategory classifiers that are able to handle problems with high dimensions and with a large number of classes. Moreover, it is necessary to have sound theoretical properties for the multicategory classifiers. In the literature, there exist several different versions of simultaneous multicategory Support Vector Machines (SVMs). However, the computation of the SVM can be difficult for large-scale problems, especially for problems with a large number of classes. Furthermore, the SVM cannot produce class probability estimates directly. In this article, we propose a novel, efficient multicategory composite least squares classifier (CLS classifier), which utilizes a new composite squared loss function. The proposed CLS classifier has several important merits: efficient computation for problems with a large number of classes, asymptotic consistency, the ability to handle high-dimensional data, and simple conditional class probability estimation. Our simulated and real examples demonstrate competitive performance of the proposed approach. PMID:21218128
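
    To illustrate the least-squares route to multicategory problems, the sketch below fits a plain one-vs-rest regularized least-squares classifier on synthetic blobs. It is a simpler cousin of the proposed CLS classifier, not the authors' composite loss.

      import numpy as np

      def add_bias(X):
          return np.hstack([X, np.ones((X.shape[0], 1))])

      def fit_ls_ovr(X, y, lam=1e-2):
          """One-vs-rest ridge regression on +/-1 class targets."""
          Xb = add_bias(X)
          classes = np.unique(y)
          Y = np.where(y[:, None] == classes[None, :], 1.0, -1.0)
          W = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ Y)
          return classes, W

      def predict(X, classes, W):
          return classes[np.argmax(add_bias(X) @ W, axis=1)]

      rng = np.random.default_rng(1)
      centers = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
      X = np.vstack([rng.normal(c, 0.5, size=(30, 2)) for c in centers])
      y = np.repeat([0, 1, 2], 30)
      classes, W = fit_ls_ovr(X, y)
      print("training accuracy:", (predict(X, classes, W) == y).mean())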

  13. Consumer Behavior Under Conflicting Information Provided by Interested Parties: Implications for Equilibrium in the Market for Credence Goods.

    PubMed

    Russo, Carlo; Tufi, Eleonora

    2016-01-01

    Incomplete information in food consumption is a relevant topic in agricultural economics. This paper proposes a theoretical model describing consumer behavior, market equilibrium and public intervention in an industry where consumers must rely on the information of interested parties such as producers or associations. We provide simple game theory model showing the link between price competition and the strategic use of information. If information are unverifiable (as in the case of credence attributes) firms may have no incentive to advertise true claims and consumer decisions may be biased. Our model incorporates the opportunistic behavior of self-interested information providers. The result is a model of competition in prices and information finding a potential for market failure and public intervention. In the paper we discuss the efficiency of three possible regulations: banning false claims, subsidizing advertising campaigns, and public statement if favor of true claims. In that context, some recent patents related to both the regulatory compliance in communication and to the reduction of asymmetric information between producers and consumers have been considered. Finally, we found that the efficiency of these policy tools is affected by the reputation of trustworthiness of the firms.

  14. Development and single-laboratory validation of a UHPLC-MS/MS method for quantitation of microcystins and nodularin in natural water, cyanobacteria, shellfish and algal supplement tablet powders.

    PubMed

    Turner, Andrew D; Waack, Julia; Lewis, Adam; Edwards, Christine; Lawton, Linda

    2018-02-01

    A simple, rapid UHPLC-MS/MS method has been developed and optimised for the quantitation of microcystins and nodularin in a wide variety of sample matrices. The microcystin analogues targeted were MC-LR, MC-RR, MC-LA, MC-LY, MC-LF, MC-LW, MC-YR, MC-WR, [Asp3] MC-LR, [Dha7] MC-LR, MC-HilR and MC-HtyR. Optimisation studies were conducted to develop a simple, quick and efficient extraction protocol without the need for complex pre-analysis concentration procedures, together with a rapid, sub-5-min chromatographic separation of toxins in shellfish and algal supplement tablet powders, as well as water and cyanobacterial bloom samples. Validation studies were undertaken on each matrix-analyte combination against the full method performance characteristics following international guidelines. The method was found to be specific and linear over the full calibration range. Method sensitivity in terms of limits of detection, quantitation and reporting was found to be significantly improved in comparison to LC-UV methods and applicable to the analysis of each of the four matrices. Overall, acceptable recoveries were determined for each of the matrices studied, with associated precision and within-laboratory reproducibility well within expected guidance limits. Results from the formalised ruggedness analysis of all available cyanotoxins showed that the method was robust for all parameters investigated. The results presented here show that the optimised LC-MS/MS method for cyanotoxins is fit for the purpose of detection and quantitation of a range of microcystins and nodularin in shellfish, algal supplement tablet powder, water and cyanobacteria. The method provides a valuable early-warning tool for the rapid, routine extraction and analysis of natural waters, cyanobacterial blooms, algal powders, food supplements and shellfish tissues, enabling monitoring labs to supplement traditional microscopy techniques and report toxicity results within a short timeframe of sample receipt. The new method, now accredited to the ISO 17025 standard, is simple, quick, applicable to multiple matrices and highly suitable for use as a routine, high-throughput, fast-turnaround regulatory monitoring tool. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. A comparison of methods for teaching receptive labeling to children with autism spectrum disorders: a systematic replication.

    PubMed

    Grow, Laura L; Kodak, Tiffany; Carr, James E

    2014-01-01

    Previous research has demonstrated that the conditional-only method (starting with a multiple-stimulus array) is more efficient than the simple-conditional method (progressive incorporation of more stimuli into the array) for teaching receptive labeling to children with autism spectrum disorders (Grow, Carr, Kodak, Jostad, & Kisamore, 2011). The current study systematically replicated the earlier study by comparing the two approaches using progressive prompting with two boys with autism. The results showed that the conditional-only method was a more efficient and reliable teaching procedure than the simple-conditional method. The results further call into question the practice of teaching simple discriminations to facilitate acquisition of conditional discriminations. © Society for the Experimental Analysis of Behavior.

  16. Histological and Thermometric Examination of Soft Tissue De-Epithelialization Using Digitally Controlled Er:YAG Laser Handpiece: An Ex Vivo Study.

    PubMed

    Grzech-Leśniak, Kinga; Matys, Jacek; Jurczyszyn, Kamil; Ziółkowski, Piotr; Dominiak, Marzena; Brugnera Junior, Aldo; Romeo, Umberto

    2018-06-01

    The purpose of this study was the histological and thermometric examination of soft tissue de-epithelialization using a digitally controlled laser handpiece (DCLH), the X-Runner. Commonly used techniques for de-epithelialization include the scalpel, abrasion with a diamond bur, or a combination of the two. Despite being simple, inexpensive and effective, these techniques are invasive and may produce unwanted side effects, so it is important to look for alternative techniques using novel tools that are minimally invasive and effective. 114 porcine samples sized 6 × 6 mm were collected from the attached gingiva (AG) of the alveolar process of the mandible using a 15C scalpel blade. The samples were irradiated by means of an Er:YAG laser (LightWalker, Fotona, Slovenia), using the X-Runner and HO2 handpieces at different parameters (80, 100, and 140 mJ at 20 Hz) for 6 or 16 s, respectively. The temperature was measured with a K-type thermocouple. For the histopathological analysis of the efficiency of epithelium removal and thermal injury, 3 random samples were de-epithelialized with the HO2 handpiece, and 9 random samples with the X-Runner handpiece at different parameters. For the samples irradiated with the DCLH, we used three different settings, which resulted in removing 1 to 3 layers of the soft tissue. The efficiency of epithelium removal and the rise in temperature were analyzed. The DCLH induced a significantly lower temperature increase than HO2 at each energy-to-frequency ratio. The histological examination revealed total epithelium removal when the HO2 handpiece was used at 100 and 140 mJ/20 Hz and when the DCLH was used for two- and threefold lasing at 80, 100, and 140 mJ/20 Hz. The Er:YAG laser with the DCLH handpiece may be an efficient tool for epithelium removal without excessive thermal damage.

  17. Inversion Of Jacobian Matrix For Robot Manipulators

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Bejczy, Antal K.

    1989-01-01

    Report discusses inversion of Jacobian matrix for class of six-degree-of-freedom arms with spherical wrist, i.e., with last three joints intersecting. Shows that, by taking advantage of simple geometry of such arms, closed-form solution of Q = J⁻¹X, which represents linear transformation from task space to joint space, is obtained efficiently. Presents solutions for PUMA arm, JPL/Stanford arm, and six-revolute-joint coplanar arm, along with all singular points. Main contribution of paper shows simple geometry of this type of arm exploited in performing inverse transformation without any need to compute Jacobian or its inverse explicitly. Implication of this computational efficiency: advanced task-space control schemes for spherical-wrist arms implemented more efficiently.

  18. Self-Assembly of Measles Virus Nucleocapsid-like Particles: Kinetics and RNA Sequence Dependence.

    PubMed

    Milles, Sigrid; Jensen, Malene Ringkjøbing; Communie, Guillaume; Maurin, Damien; Schoehn, Guy; Ruigrok, Rob W H; Blackledge, Martin

    2016-08-01

    Measles virus RNA genomes are packaged into helical nucleocapsids (NCs) comprising thousands of nucleoproteins (N) that bind the entire genome. N-RNA provides the template for replication and transcription by the viral polymerase and is a promising target for viral inhibition. Elucidation of the mechanisms regulating this process has been severely hampered by the inability to controllably assemble NCs. Here, we demonstrate self-organization of N into NC-like particles in vitro upon addition of RNA, providing a simple and versatile tool for investigating assembly. Real-time NMR and fluorescence spectroscopy reveal biphasic assembly kinetics. Remarkably, assembly depends strongly on the RNA sequence: the genomic 5' end and poly-adenine sequences assemble efficiently, while sequences such as poly-uracil are incompetent for NC formation. This observation has important consequences for understanding the assembly process. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  19. Comparison of double-locus sequence typing (DLST) and multilocus sequence typing (MLST) for the investigation of Pseudomonas aeruginosa populations.

    PubMed

    Cholley, Pascal; Stojanov, Milos; Hocquet, Didier; Thouverez, Michelle; Bertrand, Xavier; Blanc, Dominique S

    2015-08-01

    Reliable molecular typing methods are necessary to investigate the epidemiology of bacterial pathogens. Reference methods such as multilocus sequence typing (MLST) and pulsed-field gel electrophoresis (PFGE) are costly and time consuming. Here, we compared our newly developed double-locus sequence typing (DLST) method for Pseudomonas aeruginosa to MLST and PFGE on a collection of 281 isolates. DLST was as discriminatory as MLST and was able to recognize "high-risk" epidemic clones. Both methods were highly congruent. Not surprisingly, a higher discriminatory power was observed with PFGE. In conclusion, being a simple method (single-strand sequencing of only 2 loci), DLST is valuable as a first-line typing tool for epidemiological investigations of P. aeruginosa. Coupled to a more discriminant method like PFGE or whole genome sequencing, it might represent an efficient typing strategy to investigate or prevent outbreaks. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Application of cloud database in the management of clinical data of patients with skin diseases.

    PubMed

    Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning

    2015-04-01

    To evaluate the needs and applications of a cloud database in the daily practice of a dermatology department. A cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores on self-rating scales. The results were input into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients was included and analyzed using the cloud database. The disease status, quality of life, and prognosis were obtained by statistical calculations. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision-making, and scientific research.

  1. Charge Transfer Directed Radical Substitution Enables para-Selective C–H Functionalization

    PubMed Central

    Boursalian, Gregory B.; Ham, Won Seok; Mazzotti, Anthony R.; Ritter, Tobias

    2016-01-01

    Efficient C–H functionalization requires selectivity for specific C–H bonds. Progress has been made for directed aromatic substitution reactions to achieve ortho- and meta-selectivity, but a general strategy for para-selective C–H functionalization has remained elusive. Herein, we introduce a previously unappreciated concept which enables nearly complete para selectivity. We propose that radicals with high electron affinity elicit arene-to-radical charge transfer in the transition state of radical addition, which is the factor primarily responsible for high positional selectivity. We demonstrate that the selectivity is predictable by a simple theoretical tool and show the utility of the concept through a direct synthesis of aryl piperazines. Our results contradict the notion, widely held by organic chemists, that radical aromatic substitution reactions are inherently unselective. The concept of charge transfer directed radical substitution could serve as the basis for the development of new, highly selective C–H functionalization reactions. PMID:27442288

  2. An efficient current-based logic cell model for crosstalk delay analysis

    NASA Astrophysics Data System (ADS)

    Nazarian, Shahin; Das, Debasish

    2013-04-01

    Logic cell modelling is an important component in the analysis and design of CMOS integrated circuits, mostly due to nonlinear behaviour of CMOS cells with respect to the voltage signal at their input and output pins. A current-based model for CMOS logic cells is presented, which can be used for effective crosstalk noise and delta delay analysis in CMOS VLSI circuits. Existing current source models are expensive and need a new set of Spice-based characterisation, which is not compatible with typical EDA tools. In this article we present Imodel, a simple nonlinear logic cell model that can be derived from the typical cell libraries such as NLDM, with accuracy much higher than NLDM-based cell delay models. In fact, our experiments show an average error of 3% compared to Spice. This level of accuracy comes with a maximum runtime penalty of 19% compared to NLDM-based cell delay models on medium-sized industrial designs.
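
    For context, NLDM characterizes a cell's delay as a two-dimensional lookup table over input slew and output load, and tools interpolate between grid points. A minimal bilinear lookup is sketched below; the table values are illustrative, not from any real library.

      import numpy as np

      slews = np.array([0.01, 0.05, 0.20])      # input transition times (ns)
      loads = np.array([0.001, 0.010, 0.050])   # output capacitances (pF)
      delay = np.array([[0.02, 0.05, 0.15],     # delay[i][j]: slew i, load j (ns)
                        [0.03, 0.06, 0.17],
                        [0.06, 0.10, 0.22]])

      def nldm_delay(slew, load):
          """Bilinear interpolation of an NLDM-style delay table."""
          i = np.clip(np.searchsorted(slews, slew) - 1, 0, len(slews) - 2)
          j = np.clip(np.searchsorted(loads, load) - 1, 0, len(loads) - 2)
          ts = (slew - slews[i]) / (slews[i + 1] - slews[i])
          tl = (load - loads[j]) / (loads[j + 1] - loads[j])
          return ((1 - ts) * (1 - tl) * delay[i, j]
                  + ts * (1 - tl) * delay[i + 1, j]
                  + (1 - ts) * tl * delay[i, j + 1]
                  + ts * tl * delay[i + 1, j + 1])

      print(f"interpolated delay = {nldm_delay(0.08, 0.02):.4f} ns")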

  3. Crowdsourced data for flood hydrology: Feedback from recent citizen science projects in Argentina, France and New Zealand

    NASA Astrophysics Data System (ADS)

    Le Coz, Jérôme; Patalano, Antoine; Collins, Daniel; Guillén, Nicolás Federico; García, Carlos Marcelo; Smart, Graeme M.; Bind, Jochen; Chiaverini, Antoine; Le Boursicaud, Raphaël; Dramais, Guillaume; Braud, Isabelle

    2016-10-01

    New communication and digital image technologies have enabled the public to produce large quantities of flood observations and share them through social media. In addition to flood incident reports, valuable hydraulic data such as the extent and depths of inundated areas and flow rate estimates can be computed using messages, photos and videos produced by citizens. Such crowdsourced data help improve the understanding and modelling of flood hazard. Since little feedback on similar initiatives is available, we introduce three recent citizen science projects which have been launched independently by research organisations to quantitatively document flood flows in catchments and urban areas of Argentina, France, and New Zealand. Key drivers for success appear to be: a clear and simple procedure, suitable tools for data collecting and processing, an efficient communication plan, the support of local stakeholders, and the public awareness of natural hazards.

  4. DNA assembler, an in vivo genetic method for rapid construction of biochemical pathways

    PubMed Central

    Shao, Zengyi; Zhao, Hua; Zhao, Huimin

    2009-01-01

    The assembly of large recombinant DNA encoding a whole biochemical pathway or genome represents a significant challenge. Here, we report a new method, DNA assembler, which allows the assembly of an entire biochemical pathway in a single step via in vivo homologous recombination in Saccharomyces cerevisiae. We show that DNA assembler can rapidly assemble a functional d-xylose utilization pathway (∼9 kb DNA consisting of three genes), a functional zeaxanthin biosynthesis pathway (∼11 kb DNA consisting of five genes) and a functional combined d-xylose utilization and zeaxanthin biosynthesis pathway (∼19 kb consisting of eight genes) with high efficiencies (70–100%) either on a plasmid or on a yeast chromosome. As this new method only requires simple DNA preparation and one-step yeast transformation, it represents a powerful tool in the construction of biochemical pathways for synthetic biology, metabolic engineering and functional genomics studies. PMID:19074487

  5. Computational Relativistic Astrophysics Using the Flow Field-Dependent Variation Theory

    NASA Technical Reports Server (NTRS)

    Richardson, G. A.; Chung, T. J.

    2002-01-01

    We present our method for solving general relativistic nonideal hydrodynamics. Relativistic effects become pronounced in such cases as jet formation from black hole magnetized accretion disks which may lead to the study of gamma-ray bursts. Nonideal flows are present where radiation, magnetic forces, viscosities, and turbulence play an important role. Our concern in this paper is to reexamine existing numerical simulation tools as to the accuracy and efficiency of computations and introduce a new approach known as the flow field-dependent variation (FDV) method. The main feature of the FDV method consists of accommodating discontinuities of shock waves and high gradients of flow variables such as occur in turbulence and unstable motions. In this paper, the physics involved in the solution of relativistic hydrodynamics and solution strategies of the FDV theory are elaborated. The general relativistic astrophysical flow and shock solver (GRAFSS) is introduced, and some simple example problems for computational relativistic astrophysics (CRA) are demonstrated.

  6. Dynamically variable negative stiffness structures.

    PubMed

    Churchill, Christopher B; Shahan, David W; Smith, Sloan P; Keefe, Andrew C; McKnight, Geoffrey P

    2016-02-01

    Variable stiffness structures that enable a wide range of efficient load-bearing and dexterous activity are ubiquitous in mammalian musculoskeletal systems but are rare in engineered systems because of their complexity, power, and cost. We present a new negative stiffness-based load-bearing structure with dynamically tunable stiffness. Negative stiffness, traditionally used to achieve novel response from passive structures, is a powerful tool to achieve dynamic stiffness changes when configured with an active component. Using relatively simple hardware and low-power, low-frequency actuation, we show an assembly capable of fast (<10 ms) and useful (>100×) dynamic stiffness control. This approach mitigates limitations of conventional tunable stiffness structures that exhibit either small (<30%) stiffness change, high friction, poor load/torque transmission at low stiffness, or high power active control at the frequencies of interest. We experimentally demonstrate actively tunable vibration isolation and stiffness tuning independent of supported loads, enhancing applications such as humanoid robotic limbs and lightweight adaptive vibration isolators.

  7. Precise and Scalable Static Program Analysis of NASA Flight Software

    NASA Technical Reports Server (NTRS)

    Brat, G.; Venet, A.

    2005-01-01

    Recent NASA mission failures (e.g., Mars Polar Lander and Mars Orbiter) illustrate the importance of having an efficient verification and validation process for such systems. One software error, as simple as it may be, can cause the loss of an expensive mission, or lead to budget overruns and crunched schedules. Unfortunately, traditional verification methods cannot guarantee the absence of errors in software systems. Therefore, we have developed the CGS static program analysis tool, which can exhaustively analyze large C programs. CGS analyzes the source code and identifies statements in which arrays are accessed out of bounds, or, pointers are used outside the memory region they should address. This paper gives a high-level description of CGS and its theoretical foundations. It also reports on the use of CGS on real NASA software systems used in Mars missions (from Mars PathFinder to Mars Exploration Rover) and on the International Space Station.

  8. Digital detection of endonuclease mediated gene disruption in the HIV provirus

    PubMed Central

    Sedlak, Ruth Hall; Liang, Shu; Niyonzima, Nixon; De Silva Feelixge, Harshana S.; Roychoudhury, Pavitra; Greninger, Alexander L.; Weber, Nicholas D.; Boissel, Sandrine; Scharenberg, Andrew M.; Cheng, Anqi; Magaret, Amalia; Bumgarner, Roger; Stone, Daniel; Jerome, Keith R.

    2016-01-01

    Genome editing by designer nucleases is a rapidly evolving technology utilized in a highly diverse set of research fields. Across these fields, the T7 endonuclease mismatch cleavage assay, or Surveyor assay, is the most commonly used tool to assess genomic editing by designer nucleases. This assay, while relatively easy to perform, provides only a semi-quantitative measure of mutation efficiency that lacks sensitivity and accuracy. We demonstrate a simple droplet digital PCR assay that quickly quantitates a range of indel mutations, with detection as low as 0.02% mutant in a wild-type background, precision (≤6% CV), and accuracy superior to either the mismatch cleavage assay or clonal sequencing when benchmarked against next-generation sequencing. The precision and simplicity of this assay will facilitate the comparison and optimization of gene editing approaches, accelerating progress in this rapidly moving field. PMID:26829887
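
    The quantitation itself rests on standard droplet digital PCR Poisson statistics; the following Python sketch shows the usual calculation (a generic illustration under textbook assumptions, not the authors' pipeline, with invented example counts):

        import math

        def copies_per_droplet(positive, total):
            # Poisson correction: a droplet is negative with probability
            # exp(-lambda), so lambda = -ln(fraction of negative droplets).
            return -math.log((total - positive) / total)

        def mutant_fraction(pos_mutant, pos_wildtype, total):
            # Fraction of mutant template among all templates, assuming
            # mutant- and wild-type-specific assays over the same droplets.
            lam_mut = copies_per_droplet(pos_mutant, total)
            lam_wt = copies_per_droplet(pos_wildtype, total)
            return lam_mut / (lam_mut + lam_wt)

        # e.g. 15 mutant-positive and 14,000 wild-type-positive droplets of 20,000:
        print(f"{mutant_fraction(15, 14000, 20000):.4%}")  # ~0.06%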

  9. Safety assessment of ultra-wideband antennas for microwave breast imaging.

    PubMed

    De Santis, Valerio; Sill, Jeff M; Bourqui, Jeremie; Fear, Elise C

    2012-04-01

    This article deals with the safety assessment of several ultra-wideband (UWB) antenna designs for use in prototype microwave breast imaging systems. First, the performance of the antennas is validated by comparing measured and simulated data for a simple test case. An efficient approach to estimating the specific energy absorption (SA) is introduced and validated. Next, the SA produced by the UWB antennas inside more realistic breast models is computed. In particular, the power levels and pulse repetition periods adopted for the SA evaluation follow the measurement protocol employed by a tissue sensing adaptive radar (TSAR) prototype system. Results indicate that the SA for the antennas examined is below the limits prescribed in standards for exposure of the general population; however, the difficulties inherent in applying such standards to UWB exposures are discussed. The results also suggest that effective tools for the rapid evaluation of new sensors have been developed. © 2011 Wiley Periodicals, Inc.
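
    For reference, the quantity being estimated follows the standard dosimetric definition (general background, not a formula specific to this article): the specific energy absorption is the specific absorption rate integrated over the exposure,

        SA = \int_0^{T} \frac{\sigma\,|E(t)|^2}{\rho}\,dt,

    where σ is the tissue conductivity, ρ its mass density, and E(t) the internal electric field produced by the UWB pulse train.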

  10. Highly integrated system solutions for air conditioning.

    PubMed

    Bartz, Horst

    2002-08-01

    Starting with the air handling unit, new features concerning energy-efficient air treatment, combined with optimisation of the required space, were presented. Strategic concepts for supplying one or more operating suites with a modular air handling system were discussed. The operating theatre ceiling itself, as a major part of the whole integrated system, is no longer a simple air outlet: additional functions have been added in so-called media-bridges, so that it has evolved into a medical apparatus serving as a daily tool for the physicians and the operating staff. Last but not least, the servicing of the whole system has become an integral part of facility management, with remote access to the main functions and controls. The results are understood to be the basis for a discussion with specialists from the medical and hygiene disciplines as well as with technically oriented people representing the hospital and building engineering.

  11. ZnO supported CoFe2O4 nanophotocatalysts for the mineralization of Direct Blue 71 in aqueous environments.

    PubMed

    Sathishkumar, Panneerselvam; Pugazhenthiran, Nalenthiran; Mangalaraja, Ramalinga Viswanathan; Asiri, Abdullah M; Anandan, Sambandam

    2013-05-15

    In this study, an attempt was made to combine magnetic and photocatalytic properties in a semiconductor material to enhance the degradation efficiency and recyclability of magnetic nanophotocatalysts. CoFe2O4 and CoFe2O4-loaded ZnO nanoparticles were prepared by a simple co-precipitation method and characterized using various analytical tools; in addition, their visible-light-assisted photocatalytic activity was examined. The CoFe2O4/ZnO nanocatalyst coupled with the electron acceptor peroxomonosulphate (PMS) showed a 1.69-fold enhancement in the mineralization of Direct Blue 71 (a triazo dye; DB71) within 5 h. The enhanced decolorization was due to the production of a greater number of non-selective, active free radicals at the catalyst surface. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Time Domain Astrochemistry in Protoplanetary Disks

    NASA Astrophysics Data System (ADS)

    Cleeves, Lauren Ilsedore

    2018-01-01

    The chemistry of protoplanetary disks sets the initial composition of newly formed planets and may regulate the efficiency by which planets form. Disk chemical abundances typically evolve over timescales spanning thousands if not millions of years. Consequently, it was a surprise when ALMA observations taken over the course of a single year showed significantly variable emission in H13CO+ relative to the otherwise constant thermal dust emission in the IM Lup protoplanetary disk. HCO+ is a known X-ray sensitive molecule, and by using simple time-evolving chemical models including stellar activity, we demonstrate that stellar X-ray flares are a viable explanation for the observed H13CO+ variability. If this link between chemistry and stellar activity is confirmed, simultaneous observations can provide a new tool to measure (and potentially map) fundamental disk parameters, such as electron density, as the light from X-ray flares propagates across the disk.

  13. shinyCircos: an R/Shiny application for interactive creation of Circos plot.

    PubMed

    Yu, Yiming; Ouyang, Yidan; Yao, Wen

    2018-04-01

    Creation of Circos plots is one of the most efficient approaches to visualize genomic data. However, the installation and use of existing tools to make Circos plots are challenging for users lacking coding experience. To address this issue, we developed shinyCircos, an R/Shiny application providing a graphical user interface for the interactive creation of Circos plots. shinyCircos can be easily installed either on computers for personal use or on local or public servers to provide online use to the community. Furthermore, various types of Circos plots can be easily generated and decorated with simple mouse clicks. shinyCircos and its manual are freely available at https://github.com/venyao/shinyCircos. shinyCircos is deployed at https://yimingyu.shinyapps.io/shinycircos/ and http://shinycircos.ncpgr.cn/ for online use. diana1983941@mail.hzau.edu.cn or yaowen@henau.edu.cn.

  14. A Novel and Simple Spike Sorting Implementation.

    PubMed

    Petrantonakis, Panagiotis C; Poirazi, Panayiota

    2017-04-01

    Monitoring the activity of multiple individual neurons that fire spikes in the vicinity of an electrode, namely performing a Spike Sorting (SS) procedure, is one of the most important tools contemporary neuroscience has for reverse-engineering the brain. As recording electrode technology rapidly evolves to integrate thousands of electrodes in a confined spatial setting, the algorithms used to monitor individual neurons from recorded signals must become even more reliable and computationally efficient. In this work, we propose a novel framework for the SS approach in which a single-step processing of the raw (unfiltered) extracellular signal is sufficient for both the detection and sorting of the activity of individual neurons. Despite its simplicity, the proposed approach exhibits performance comparable with state-of-the-art approaches, especially for spike detection in noisy signals, and paves the way for a new family of SS algorithms with the potential for multi-recording, fast, on-chip implementations.
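
    As a point of comparison, the conventional first stage that such single-step methods aim to simplify is amplitude-threshold detection; a minimal generic Python sketch (not the authors' algorithm) is:

        import numpy as np

        def detect_spikes(signal, fs, thresh_mult=5.0, refractory_ms=1.0):
            # Classic threshold detection: threshold at a multiple of the
            # robust noise estimate sigma = median(|x|) / 0.6745, with a
            # refractory lockout so one spike is not counted twice.
            sigma = np.median(np.abs(signal)) / 0.6745
            lockout = int(refractory_ms * 1e-3 * fs)
            spikes, last = [], -lockout
            for i, x in enumerate(signal):
                if abs(x) > thresh_mult * sigma and i - last >= lockout:
                    spikes.append(i)
                    last = i
            return np.array(spikes)

        # Synthetic check: unit-variance noise with three injected spikes.
        rng = np.random.default_rng(0)
        x = rng.normal(0.0, 1.0, 30000)
        x[[5000, 12000, 25000]] += 12.0
        print(detect_spikes(x, fs=30000))  # ~[ 5000 12000 25000]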

  15. Fault diagnosis of motor bearing with speed fluctuation via angular resampling of transient sound signals

    NASA Astrophysics Data System (ADS)

    Lu, Siliang; Wang, Xiaoxian; He, Qingbo; Liu, Fang; Liu, Yongbin

    2016-12-01

    Transient signal analysis (TSA) has been proven an effective tool for motor bearing fault diagnosis, but has yet to be applied in processing bearing fault signals with variable rotating speed. In this study, a new TSA-based angular resampling (TSAAR) method is proposed for fault diagnosis under speed fluctuation condition via sound signal analysis. By applying the TSAAR method, the frequency smearing phenomenon is eliminated and the fault characteristic frequency is exposed in the envelope spectrum for bearing fault recognition. The TSAAR method can accurately estimate the phase information of the fault-induced impulses using neither complicated time-frequency analysis techniques nor external speed sensors, and hence it provides a simple, flexible, and data-driven approach that realizes variable-speed motor bearing fault diagnosis. The effectiveness and efficiency of the proposed TSAAR method are verified through a series of simulated and experimental case studies.
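
    The angle-domain step can be illustrated with generic computed order tracking (a sketch of the standard technique, not of TSAAR's phase estimator; the function names are ours):

        import numpy as np
        from scipy.signal import hilbert

        def angular_resample(x, t, phase, samples_per_rev=64):
            # Re-sample the time signal x(t) at uniform shaft-angle increments,
            # given the instantaneous shaft phase (radians, increasing) at t.
            revs = phase / (2.0 * np.pi)
            uniform_revs = np.arange(0.0, np.floor(revs[-1]), 1.0 / samples_per_rev)
            t_uniform = np.interp(uniform_revs, revs, t)  # angle -> time
            return np.interp(t_uniform, t, x)

        def envelope_order_spectrum(x_ang, samples_per_rev=64):
            # Envelope via the Hilbert transform, then an FFT whose axis is in
            # shaft orders (events per revolution), where fault characteristic
            # orders stay put regardless of speed fluctuation.
            env = np.abs(hilbert(x_ang))
            spec = np.abs(np.fft.rfft(env - env.mean()))
            orders = np.fft.rfftfreq(len(env), d=1.0 / samples_per_rev)
            return orders, spec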

  16. Anharmonic Vibrational Spectroscopy on Transition Metal Complexes

    NASA Astrophysics Data System (ADS)

    Latouche, Camille; Bloino, Julien; Barone, Vincenzo

    2014-06-01

    Advances in hardware performance and the availability of efficient and reliable computational models have made possible the application of computational spectroscopy to ever larger molecular systems. The systematic interpretation of experimental data and the full characterization of complex molecules can then be facilitated. Focusing on vibrational spectroscopy, several approaches have been proposed to simulate spectra beyond the double harmonic approximation, so that more details become available. However, routine use of such tools requires the preliminary definition of a valid protocol with the most appropriate combination of electronic structure and nuclear calculation models. Several benchmarks of anharmonic frequency calculations have been carried out on organic molecules. Nevertheless, benchmarks of organometallic or inorganic metal complexes at this level are sorely lacking, despite the interest in these systems arising from their strong emission and vibrational properties. Herein we report a benchmark study of anharmonic calculations on simple metal complexes, along with some pilot applications to systems of direct technological or biological interest.
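
    For context, such anharmonic treatments are usually built on second-order vibrational perturbation theory (VPT2), whose standard term-value expression (general background, not a result of this benchmark) is

        E(\mathbf{n}) = \sum_i \omega_i \left(n_i + \tfrac{1}{2}\right) + \sum_{i \le j} \chi_{ij} \left(n_i + \tfrac{1}{2}\right)\left(n_j + \tfrac{1}{2}\right),

    where the ω_i are the harmonic frequencies and the anharmonicity constants χ_ij are computed from third and semi-diagonal fourth derivatives of the potential.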

  17. Exporting coal through technology and countertrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borissoff, E.

    1985-08-01

    Straightforward coal exporting on a simple price-and-delivery basis is becoming increasingly difficult for US suppliers. Technology and countertrade are two tools which could help coal suppliers' exports and, at the same time, satisfy the needs of their overseas customers. Neither would complicate the established process of coal exporting, but both would offer the prospect of increased sales and higher profits. Technical selling involves demonstrating to a customer that US steam coal is more competitive when burned in a boiler designed specifically to burn that coal efficiently. To do this, the exporter must know the chemical characteristics of his coal and establish a working relationship with his customers' purchasing agents and boiler chiefs. Technical selling to new users offers even more opportunities. Countertrade occurs when the customer pays for coal or a coal/boiler package with something other than US dollars.

  18. Rapid Inspection of Aerospace Structures - Is It Autonomous Yet?

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Backes, Paul; Joffe, Benjamin

    1996-01-01

    The trend toward increased use of aging aircraft has added a great deal of urgency to the ongoing need for low-cost, rapid, simple-to-operate, reliable, and efficient NDE methods for the detection and characterization of flaws in aircraft structures. In many cases, inspection is complicated by the limitations of current technology and the need to disassemble aircraft structures and test them under lab conditions. To overcome these limitations, reliable field inspection tools are being developed for rapid NDE of large and complex-shaped structures that can operate in harsh, hostile, and remote conditions with minimal human intervention. In recent years, to address the need for rapid inspection in field conditions, numerous portable scanners have been developed using NDE methods including ultrasonics, shearography, and thermography. This paper is written with emphasis on ultrasonic NDE scanners, their evolution, and the expected direction of growth.

  19. Computer simulation of turbulent jet structure radiography

    NASA Astrophysics Data System (ADS)

    Kodimer, Kory A.; Parnell, Lynn A.; Nelson, Robert S.; Papin, Patrick J.

    1992-12-01

    Liquid metal combustion chambers are under consideration as power sources for propulsion devices used in undersea vehicles. Characteristics of the reactive jet are studied to gain information about the internal combustion phenomena, including temporal and spatial variation of the jet flame, and the effects of phase changes on both the combustion and imaging processes. A ray tracing program which employs simplified Monte Carlo methods has been developed for use as a predictive tool for radiographic imaging of closed liquid metal combustors. A complex focal spot is characterized by either a monochromatic or polychromatic emission spectrum. For the simplest case, the x-ray detection system is modeled by an integrating planar detector having 100% efficiency. Several simple geometrical shapes are used to simulate jet structures contained within the combustor, such as cylinders, paraboloids, and ellipsoids. The results of the simulation and real time radiographic images are presented and discussed.
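
    The attenuation model underlying such a simulation can be sketched in a few lines of Python (an illustrative toy assuming a parallel monochromatic beam and an ideal integrating detector, not the authors' code):

        import numpy as np

        def cylinder_path_length(x_det, radius):
            # Chord length through a cylinder of the given radius for a
            # parallel ray reaching lateral detector position x_det
            # (cylinder axis perpendicular to the rays, centered at x = 0).
            return 2.0 * np.sqrt(np.maximum(radius**2 - x_det**2, 0.0))

        def radiograph(radius=1.0, mu=0.8, n_pixels=200, half_width=2.0):
            # Ideal planar integrating detector (100% efficiency) and a
            # monochromatic source: intensity follows Beer-Lambert attenuation.
            x = np.linspace(-half_width, half_width, n_pixels)
            return x, np.exp(-mu * cylinder_path_length(x, radius))

        x, intensity = radiograph()
        print(intensity.min())  # strongest attenuation along the central chord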

  20. Biofilm inhibitors that target amyloid proteins.

    PubMed

    Romero, Diego; Sanabria-Valentín, Edgardo; Vlamakis, Hera; Kolter, Roberto

    2013-01-24

    Bacteria establish stable communities, known as biofilms, that are resistant to antimicrobials. Biofilm robustness is due to the presence of an extracellular matrix, which for several species, among them Bacillus subtilis, includes amyloid-like protein fibers. In this work, we show that B. subtilis biofilms can be a simple and reliable tool for screening molecules with antiamyloid activity. We identified two molecules, AA-861 and parthenolide, which efficiently inhibited biofilms by preventing the formation of amyloid-like fibers. Parthenolide also disrupted pre-established biofilms. These molecules also impeded biofilm formation in other bacterial species that secrete amyloid proteins, such as Bacillus cereus and Escherichia coli. Furthermore, the identified molecules decreased the conversion of the yeast protein New1 to the prion state in a heterologous host, indicating the broad range of activity of the molecules. Copyright © 2013 Elsevier Ltd. All rights reserved.
