Sample records for user friendly automated

  1. CANEapp: a user-friendly application for automated next generation transcriptomic data analysis.

    PubMed

    Velmeshev, Dmitry; Lally, Patrick; Magistri, Marco; Faghihi, Mohammad Ali

    2016-01-13

    Next generation sequencing (NGS) technologies are indispensable for molecular biology research, but data analysis represents the bottleneck in their application. Users need to be familiar with computer terminal commands, the Linux environment, and various software tools and scripts. Analysis workflows have to be optimized and experimentally validated to extract biologically meaningful data. Moreover, as larger datasets are being generated, their analysis requires use of high-performance servers. To address these needs, we developed CANEapp (application for Comprehensive automated Analysis of Next-generation sequencing Experiments), a unique suite that combines a Graphical User Interface (GUI) and an automated server-side analysis pipeline that is platform-independent, making it suitable for any server architecture. The GUI runs on a PC or Mac and seamlessly connects to the server to provide full GUI control of RNA-sequencing (RNA-seq) project analysis. The server-side analysis pipeline contains a framework that is implemented on a Linux server through completely automated installation of software components and reference files. Analysis with CANEapp is also fully automated and performs differential gene expression analysis and novel noncoding RNA discovery through alternative workflows (Cuffdiff and R packages edgeR and DESeq2). We compared CANEapp to other similar tools, and it significantly improves on previous developments. We experimentally validated CANEapp's performance by applying it to data derived from different experimental paradigms and confirming the results with quantitative real-time PCR (qRT-PCR). CANEapp adapts to any server architecture by effectively using available resources and thus handles large amounts of data efficiently. CANEapp performance has been experimentally validated on various biological datasets. CANEapp is available free of charge at http://psychiatry.med.miami.edu/research/laboratory-of-translational-rna-genomics/CANE-app . We
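    The record above describes a fully automated, alternative-workflow RNA-seq pipeline. As a rough, hypothetical illustration of how a server-side dispatcher for such alternative workflows might be organized (the stage script names below are placeholders, not CANEapp's actual components), here is a minimal Python sketch:

    ```python
    import subprocess

    # Hypothetical stage scripts for an automated RNA-seq pipeline; the names
    # are placeholders and do not correspond to CANEapp's real components.
    WORKFLOWS = {
        "cuffdiff": ["align_reads.sh", "assemble_transcripts.sh", "run_cuffdiff.sh"],
        "edgeR_DESeq2": ["align_reads.sh", "count_features.sh", "run_edger_deseq2.sh"],
    }

    def run_pipeline(workflow: str, project_dir: str) -> None:
        """Run each stage of the chosen workflow in order, stopping on failure."""
        for step in WORKFLOWS[workflow]:
            print(f"[{workflow}] running {step} on {project_dir}")
            subprocess.run(["bash", step, project_dir], check=True)

    if __name__ == "__main__":
        run_pipeline("edgeR_DESeq2", "/data/projects/example_rnaseq")
    ```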

  2. Altering users' acceptance of automation through prior automation exposure.

    PubMed

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints imbedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed after exposure to a task with automation assistance, user acceptance of high(er) levels of automation ('tipping point') decreased; suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. 'tipping point') is constant or can be manipulated. The results revealed after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased; suggesting it is possible to alter automation acceptance.

  3. User-Friendly Tools for Random Matrices: An Introduction

    DTIC Science & Technology

    2012-12-03

    [Fragmentary excerpt from the slides of Joel A. Tropp's tutorial "User-Friendly Tools for Random Matrices", NIPS, 3 December 2012. The recoverable fragments cite matrix concentration results (Tropp 2011; Oliveira 2010; Mackey et al. 2012), outline the randomized range finder (form the matrix product Y = AΩ, then construct an orthonormal basis Q for the range of Y; Halko, Martinsson and Tropp, SIAM Review 2011), and list further reading, including "Matrix concentration inequalities..." with L. Mackey et al. (submitted 2012) and the 2012 report "User-Friendly Tools for Random Matrices: An Introduction".]
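    The randomized range-finder steps quoted in the fragments above (form the product Y = AΩ, then construct an orthonormal basis Q for the range of Y) can be sketched in a few lines of NumPy. This is a generic illustration of the Halko-Martinsson-Tropp procedure, not code from the cited tutorial; the test matrix and sketch size are arbitrary.

    ```python
    import numpy as np

    def randomized_range_finder(A: np.ndarray, k: int, seed: int = 0) -> np.ndarray:
        """Return an orthonormal basis Q that approximately spans the range of A."""
        rng = np.random.default_rng(seed)
        omega = rng.standard_normal((A.shape[1], k))   # random test matrix
        Y = A @ omega                                  # sample the range of A
        Q, _ = np.linalg.qr(Y)                         # orthonormal basis for range(Y)
        return Q

    # Quick check on a low-rank matrix: Q should capture almost all of A.
    A = np.outer(np.arange(1.0, 101.0), np.arange(1.0, 51.0))  # rank-1 example
    Q = randomized_range_finder(A, k=5)
    print(np.linalg.norm(A - Q @ (Q.T @ A)))  # close to zero here
    ```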

  4. A new user-friendly visual environment for breast MRI data analysis.

    PubMed

    Danelakis, Antonios; Verganelakis, Dimitrios A.; Theoharis, Theoharis

    2013-06-01

    In this paper a novel, user-friendly visual environment for Breast MRI Data Analysis (BreDAn) is presented. Given planar MRI images before and after IV contrast medium injection, BreDAn generates kinematic graphs and color maps of signal increase and decrease, and finally detects high-risk breast areas. The advantage of BreDAn, which has been validated and tested successfully, is that it automates the radiodiagnostic process in an accurate and reliable manner. It can potentially reduce radiologists' workload. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  5. CARE 3 user-friendly interface user's guide

    NASA Technical Reports Server (NTRS)

    Martensen, A. L.

    1987-01-01

    CARE 3 predicts the unreliability of highly reliable reconfigurable fault-tolerant systems that include redundant computers or computer systems. CARE3MENU is a user-friendly interface used to create an input for the CARE 3 program. The CARE3MENU interface has been designed to minimize user input errors. Although a CARE3MENU session may be successfully completed and all parameters may be within specified limits or ranges, the CARE 3 program is not guaranteed to produce meaningful results if the user incorrectly interprets the CARE 3 stochastic model. The CARE3MENU User Guide provides complete information on how to create a CARE 3 model with the interface. The CARE3MENU interface runs under the VAX/VMS operating system.

  6. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of the hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
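    The modular structure described above (swappable options for each stage of the hydrological cycle, assembled into one model) can be illustrated with a short, generic Python sketch. The stage functions and their formulas below are invented placeholders, not components of the package in the record.

    ```python
    from typing import Callable, Dict, List

    State = Dict[str, float]
    Stage = Callable[[State], State]

    def simple_pet(state: State) -> State:
        # Placeholder potential-evapotranspiration option (illustrative formula).
        state["pet"] = max(0.0, 0.1 * state["temperature_c"])
        return state

    def simple_soil_moisture(state: State) -> State:
        # Placeholder soil-moisture accounting option.
        state["runoff"] = max(0.0, state["precip_mm"] - state["pet"])
        return state

    def build_model(stages: List[Stage]) -> Stage:
        """Compose the selected stage options into one callable model."""
        def model(state: State) -> State:
            for stage in stages:
                state = stage(state)
            return state
        return model

    model = build_model([simple_pet, simple_soil_moisture])
    print(model({"temperature_c": 20.0, "precip_mm": 5.0}))
    ```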

  7. A user-friendly software package to ease the use of VIC hydrologic model for practitioners

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P.; Brown, C.

    2016-12-01

    The VIC (Variable Infiltration Capacity) hydrologic and river routing model simulates the water and energy fluxes that occur near the land surface and provides users with useful information regarding the quantity and timing of available water at points of interest within the basin. However, despite its popularity (demonstrated by numerous applications in the literature), its wider adoption is hampered by the considerable effort required to prepare model inputs; e.g., input files storing spatial information related to watershed topography, soil properties, and land cover. This study presents a user-friendly software package (named VIC Setup Toolkit) developed within the MATLAB (matrix laboratory) framework and accessible through an intuitive graphical user interface. The VIC Setup Toolkit enables users to navigate the model building process confidently through prompts and automation, with the intention of promoting the use of the model for both practical and academic purposes. The automated processes include watershed delineation, climate and geographical input set-up, model parameter calibration, graph generation and output evaluation. We demonstrate the package's usefulness in various case studies with the American River, Oklahoma River, Feather River and Zambezi River basins.

  8. Friend suggestion in social network based on user log

    NASA Astrophysics Data System (ADS)

    Kaviya, R.; Vanitha, M.; Sumaiya Thaseen, I.; Mangaiyarkarasi, R.

    2017-11-01

    Simple friend recommendation algorithms based on similarity, popularity, and social aspects are the basic building blocks to be explored in order to methodically construct high-performance social friend recommendation. In the proposed system, we use an algorithm for network correlation-based social friend recommendation (NC-based SFR) that incorporates user activities such as where one lives and works. This new friend recommendation method is based on network correlation and considers the effect of different social roles. To model the correlation between different networks, we develop a method that aligns these networks through important feature selection. By preserving the network structure, the method produces better recommendations and significantly improves friend-recommendation accuracy.
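    As a much simplified illustration of similarity-based friend suggestion (a common-neighbour heuristic, not the NC-based SFR algorithm described above), consider this small Python sketch:

    ```python
    # Toy friendship graph: user -> set of existing friends.
    friends = {
        "alice": {"bob", "carol"},
        "bob": {"alice", "dave"},
        "carol": {"alice", "dave"},
        "dave": {"bob", "carol"},
    }

    def jaccard(a: set, b: set) -> float:
        """Overlap of two friend circles, 0.0 (disjoint) to 1.0 (identical)."""
        return len(a & b) / len(a | b) if (a | b) else 0.0

    def suggest(user: str, top_n: int = 3):
        """Rank non-friends by how much their friend circles overlap with the user's."""
        candidates = [
            (other, jaccard(friends[user], friends[other]))
            for other in friends
            if other != user and other not in friends[user]
        ]
        return sorted(candidates, key=lambda pair: -pair[1])[:top_n]

    print(suggest("alice"))  # dave shares both of alice's friends
    ```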

  9. Extracting DNA from FFPE Tissue Biospecimens Using User-Friendly Automated Technology: Is There an Impact on Yield or Quality?

    PubMed

    Mathieson, William; Guljar, Nafia; Sanchez, Ignacio; Sroya, Manveer; Thomas, Gerry A

    2018-05-03

    DNA extracted from formalin-fixed, paraffin-embedded (FFPE) tissue blocks is amenable to analytical techniques, including sequencing. DNA extraction protocols are typically long and complex, often involving an overnight proteinase K digest. Automated platforms that shorten and simplify the process are therefore an attractive proposition for users wanting a faster turn-around or to process large numbers of biospecimens. It is, however, unclear whether automated extraction systems return poorer DNA yields or quality than manual extractions performed by experienced technicians. We extracted DNA from 42 FFPE clinical tissue biospecimens using the QiaCube (Qiagen) and ExScale (ExScale Biospecimen Solutions) automated platforms, comparing DNA yields and integrities with those from manual extractions. The QIAamp DNA FFPE Spin Column Kit was used for manual and QiaCube DNA extractions and the ExScale extractions were performed using two of the manufacturer's magnetic bead kits: one extracting DNA only and the other simultaneously extracting DNA and RNA. In all automated extraction methods, DNA yields and integrities (assayed using DNA Integrity Numbers from a 4200 TapeStation and the qPCR-based Illumina FFPE QC Assay) were poorer than in the manual method, with the QiaCube system performing better than the ExScale system. However, ExScale was fastest, offered the highest reproducibility when extracting DNA only, and required the least intervention or technician experience. Thus, the extraction methods have different strengths and weaknesses, would appeal to different users with different requirements, and therefore, we cannot recommend one method over another.

  10. Hazardous Waste Generator Regulations: A User-Friendly Reference Document

    EPA Pesticide Factsheets

    User-friendly reference to assist EPA and state staff, industrial facilities generating and managing hazardous wastes, and the general public in locating and understanding RCRA hazardous waste generator regulations.

  11. Alkahest NuclearBLAST : a user-friendly BLAST management and analysis system

    PubMed Central

    Diener, Stephen E; Houfek, Thomas D; Kalat, Sam E; Windham, DE; Burke, Mark; Opperman, Charles; Dean, Ralph A

    2005-01-01

    Background - Sequencing of EST and BAC end datasets is no longer limited to large research groups. Drops in per-base pricing have made high throughput sequencing accessible to individual investigators. However, there are few options available that provide a free and user-friendly solution to the BLAST result storage and data mining needs of biologists. Results - Here we describe NuclearBLAST, a batch BLAST analysis, storage and management system designed for the biologist. It is a wrapper for NCBI BLAST that provides a user-friendly web interface, including a request wizard and the ability to view and mine the results. All BLAST results are stored in a MySQL database that allows for more advanced data-mining through supplied command-line utilities or direct database access. NuclearBLAST can be installed on a single machine or clustered amongst a number of machines to improve analysis throughput. NuclearBLAST provides a platform that eases data-mining of multiple BLAST results. With the supplied scripts, the program can export data into a spreadsheet-friendly format, automatically assign Gene Ontology terms to sequences and provide bi-directional best hits between two datasets. Users with SQL experience can use the database to ask even more complex questions and extract any subset of data they require. Conclusion - This tool provides a user-friendly interface for requesting, viewing and mining of BLAST results, which makes the management and data-mining of large sets of BLAST analyses tractable to biologists. PMID:15958161
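    As a minimal sketch of the batch-BLAST-then-store-and-mine pattern described above (not NuclearBLAST's actual code), the snippet below runs NCBI BLAST+ with tabular output and loads the hits into a local SQLite database standing in for the MySQL back end. The file and database names are placeholders, and the blastn flags should be checked against the installed BLAST+ version.

    ```python
    import sqlite3
    import subprocess

    # Assumes NCBI BLAST+ is installed; -outfmt 6 requests tab-separated output.
    subprocess.run(
        ["blastn", "-query", "queries.fasta", "-db", "refseq_db",
         "-outfmt", "6", "-out", "hits.tsv"],
        check=True,
    )

    # Store the tabular hits so they can be mined with SQL later.
    con = sqlite3.connect("blast_results.db")
    con.execute(
        "CREATE TABLE IF NOT EXISTS hits ("
        "query TEXT, subject TEXT, pct_identity REAL, evalue REAL, bitscore REAL)"
    )
    with open("hits.tsv") as fh:
        for line in fh:
            f = line.rstrip("\n").split("\t")
            # outfmt 6 columns: qseqid sseqid pident length mismatch gapopen
            # qstart qend sstart send evalue bitscore
            con.execute("INSERT INTO hits VALUES (?, ?, ?, ?, ?)",
                        (f[0], f[1], float(f[2]), float(f[10]), float(f[11])))
    con.commit()

    # Example "data mining" query: best-scoring hit per query sequence.
    for row in con.execute(
            "SELECT query, subject, MAX(bitscore) FROM hits GROUP BY query"):
        print(row)
    ```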

  12. Investigation and Implementation of a Tree Transformation System for User Friendly Programming.

    DTIC Science & Technology

    1984-12-01

    [Extraction-damaged record of a Master's thesis: "Investigation and Implementation of a Tree Transformation System for User Friendly Programming", Naval Postgraduate School, Monterey, CA, December 1984. The recoverable abstract fragment notes that transformation systems have become an important area of research because of their direct impact on all areas of computer science, such as software engineering.]

  13. Automated Tumor Registry for Oncology. A VA-DHCP MUMPS application.

    PubMed

    Richie, S

    1992-01-01

    The VA Automated Tumor Registry for Oncology, Version 2, is a multifaceted, completely automated, user-friendly cancer database. Easy-to-use modules include: Automatic Casefinding; Suspense Files; Abstracting and Printing; Follow-up; Annual Reports; Statistical Reports; Utility Functions.

  14. The use of Graphic User Interface for development of a user-friendly CRS-Stack software

    NASA Astrophysics Data System (ADS)

    Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah

    2017-04-01

    The development of a user-friendly Common Reflection Surface (CRS) Stack software package built around a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the Unix/Linux environment and is not user-friendly: the user must write commands and parameters manually in a script file. Because of this limitation, the CRS-Stack has remained an unpopular method, even though applying it is a promising way to obtain better seismic sections with improved reflector continuity and S/N ratio. After obtaining successful results on several seismic datasets belonging to oil companies in Indonesia, we decided to develop a user-friendly software package in our own laboratory. A Graphical User Interface (GUI) is a type of user interface that allows people to interact with computer programs more easily: rather than typing commands and module parameters, users work with graphical icons and visual indicators, and the use of complicated Seismic Unix shell scripts can be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI, and every shell script that represents a seismic processing step is invoked from the Java environment. Besides providing interactive CRS-Stack processing, the CRS-Stack GUI is designed to help geophysicists manage projects with complex seismic processing procedures. The software is organized around an input directory, operators, and an output directory, which together define a seismic data processing workflow. The CRS-Stack processing workflow involves four steps, i.e. automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack Supergather. These operations are visualized in an informative flowchart with a self-explanatory system to guide the user inputting the

  15. Creating the User-Friendly Library by Evaluating Patron Perception of Signage.

    ERIC Educational Resources Information Center

    Bosman, Ellen; Rusinek, Carol

    1997-01-01

    Librarians at Indiana University Northwest Library surveyed patrons on how to make the library's collection and services more accessible by improving signage. Examines the effectiveness of signage to instruct users, reduce difficulties and fears, ameliorate negative experiences, and contribute to a user-friendly environment. (AEF)

  16. Automated Tumor Registry for Oncology. A VA-DHCP MUMPS application.

    PubMed Central

    Richie, S.

    1992-01-01

    The VA Automated Tumor Registry for Oncology, Version 2, is a multifaceted, completely automated, user-friendly cancer database. Easy-to-use modules include: Automatic Casefinding; Suspense Files; Abstracting and Printing; Follow-up; Annual Reports; Statistical Reports; Utility Functions. PMID:1482866

  17. BAGEL4: a user-friendly web server to thoroughly mine RiPPs and bacteriocins.

    PubMed

    van Heel, Auke J; de Jong, Anne; Song, Chunxu; Viel, Jakob H; Kok, Jan; Kuipers, Oscar P

    2018-05-21

    Interest in secondary metabolites such as RiPPs (ribosomally synthesized and posttranslationally modified peptides) is increasing worldwide. To facilitate the research in this field we have updated our mining web server. BAGEL4 is faster than its predecessor and is now fully independent from ORF-calling. Gene clusters of interest are discovered using the core-peptide database and/or through HMM motifs that are present in associated context genes. The databases used for mining have been updated and extended with literature references and links to UniProt and NCBI. Additionally, we have included automated promoter and terminator prediction and the option to upload RNA expression data, which can be displayed along with the identified clusters. Further improvements include the annotation of the context genes, which is now based on a fast blast against the prokaryote part of the UniRef90 database, and the improved web-BLAST feature that dynamically loads structural data such as internal cross-linking from UniProt. Overall BAGEL4 provides the user with more information through a user-friendly web-interface which simplifies data evaluation. BAGEL4 is freely accessible at http://bagel4.molgenrug.nl.

  18. A LabVIEW based template for user created experiment automation.

    PubMed

    Kim, D J; Fisk, Z

    2012-12-01

    We have developed an expandable software template to automate user-created experiments. The LabVIEW-based template is easily modifiable to combine user-created measurements, controls, and data logging with virtually any type of laboratory equipment. We use re-entrant sequential selection to implement sequence scripting, making it possible to wrap a long series of user-created experiments and execute them in sequence. Details of the software structure, along with application examples for a scanning probe microscope and for automated transport experiments using custom-built laboratory electronics and a cryostat, are described.

  19. FORESEE™ User-Centric Energy Automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FORESEE™ is a home energy management system (HEMS) that provides a user-centric energy automation solution for residential building occupants. Built upon advanced control and machine learning algorithms, FORESEE intelligently manages the home appliances and distributed energy resources (DERs) such as photovoltaics and battery storage in a home. Unlike existing HEMS in the market, FORESEE provides a tailored home automation solution for individual occupants by learning and adapting to their preferences on cost, comfort, convenience and carbon. FORESEE improves not only the energy efficiency of the home but also its capability to provide grid services such as demand response. Highly reliable demand response services are likely to be incentivized by utility companies, making FORESEE economically viable for most homes.

  20. MareyMap Online: A User-Friendly Web Application and Database Service for Estimating Recombination Rates Using Physical and Genetic Maps.

    PubMed

    Siberchicot, Aurélie; Bessy, Adrien; Guéguen, Laurent; Marais, Gabriel A B

    2017-10-01

    Given the importance of meiotic recombination in biology, there is a need to develop robust methods to estimate meiotic recombination rates. A popular approach, called the Marey map approach, relies on comparing genetic and physical maps of a chromosome to estimate local recombination rates. In the past, we have implemented this approach in an R package called MareyMap, which includes many functionalities useful to get reliable recombination rate estimates in a semi-automated way. MareyMap has been used repeatedly in studies looking at the effect of recombination on genome evolution. Here, we propose a simpler user-friendly web service version of MareyMap, called MareyMap Online, which allows a user to get recombination rates from her/his own data or from a publicly available database that we offer in a few clicks. When the analysis is done, the user is asked whether her/his curated data can be placed in the database and shared with other users, which we hope will make meta-analysis on recombination rates including many species easy in the future. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
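    The Marey map idea described above (compare the genetic map to the physical map and read the local slope as the recombination rate) can be sketched with NumPy. The marker positions below are invented, and the cubic fit is only one of several smoothing choices a tool like MareyMap offers.

    ```python
    import numpy as np

    # Toy Marey map: physical positions (Mb) and genetic positions (cM) of markers
    # along one chromosome. Values are made up for illustration.
    physical_mb = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
    genetic_cm = np.array([0.0, 2.0, 6.0, 14.0, 20.0, 23.0, 24.0])

    # Fit a smooth curve and take its derivative: the local slope of the
    # genetic-vs-physical map is the recombination rate in cM/Mb.
    coeffs = np.polyfit(physical_mb, genetic_cm, deg=3)
    rate_poly = np.polyder(np.poly1d(coeffs))

    for pos in (5.0, 15.0, 25.0):
        print(f"~{rate_poly(pos):.2f} cM/Mb at {pos} Mb")
    ```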

  1. User-friendly program for multitask analysis

    NASA Astrophysics Data System (ADS)

    Caporali, Sergio A.; Akladios, Magdy; Becker, Paul E.

    2000-10-01

    Research on lifting activities has led to the design of several useful tools for evaluating tasks that involve lifting and material handling. The National Institute for Occupational Safety and Health (NIOSH) has developed a single-task lifting equation. This formula has been frequently used as a guide in the field of ergonomics and material handling. While much more complicated, the multi-task formula provides a more realistic analysis for the evaluation of lifting and material handling jobs. A user-friendly tool has been developed to assist professionals in the field of ergonomics in analyzing multi-task material handling jobs. The program allows for up to 10 different tasks to be evaluated. The program requires a basic understanding of the NIOSH lifting guidelines and the six multipliers that are involved in the analysis of each single task. These multipliers are: Horizontal Distance Multiplier (HM), Vertical Distance Multiplier (VM), Vertical Displacement Multiplier (DM), Frequency of lifting Multiplier (FM), Coupling Multiplier (CM), and the Asymmetry Multiplier (AM). Once a given job is analyzed, a researched list of recommendations is provided to the user in an attempt to reduce the potential risk factors that are associated with each task.
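    The revised NIOSH lifting equation multiplies a load constant by the six multipliers listed above to obtain a Recommended Weight Limit (RWL). The sketch below uses the commonly published metric forms of those multipliers and illustrative task values; the frequency and coupling multipliers come from lookup tables and are passed in directly, and the full NIOSH procedure adds clamping rules not reproduced here.

    ```python
    def niosh_rwl(h_cm, v_cm, d_cm, a_deg, fm, cm, lc_kg=23.0):
        """Single-task Recommended Weight Limit (simplified metric form).

        FM and CM come from the published lookup tables; the other multipliers
        use the commonly published metric formulas."""
        hm = min(1.0, 25.0 / h_cm)              # Horizontal Distance Multiplier
        vm = 1.0 - 0.003 * abs(v_cm - 75.0)     # Vertical Distance Multiplier
        dm = 0.82 + 4.5 / d_cm                  # Vertical Displacement Multiplier
        am = 1.0 - 0.0032 * a_deg               # Asymmetry Multiplier
        return lc_kg * hm * vm * dm * am * fm * cm

    # Example task: load held 30 cm out, lifted from 50 cm to 100 cm, no twisting,
    # with table-derived FM and CM values of 0.88 and 0.95 (illustrative only).
    rwl = niosh_rwl(h_cm=30, v_cm=50, d_cm=50, a_deg=0, fm=0.88, cm=0.95)
    print(f"RWL = {rwl:.1f} kg; Lifting Index = actual load / RWL")
    ```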

  2. Automation of the CAS Document Delivery Service.

    ERIC Educational Resources Information Center

    Steensland, M. C.; Soukup, K. M.

    1986-01-01

    The automation of online order retrieval for Chemical Abstracts Service Document Delivery Service was accomplished by shifting to an order retrieval/dispatch process linked to a Unix network. The Unix-based environment, its terminal emulation, page-break, and user-friendly interface software, and later enhancements are reviewed. Resultant increase…

  3. Living Lab as an Agile Approach in Developing User-Friendly Welfare Technology.

    PubMed

    Holappa, Niina; Sirkka, Andrew

    2017-01-01

    This paper discusses living lab as a method of developing user-friendly welfare technology, and presents qualitative evaluation research on how living-lab-tested technologies affected the lives of healthcare customers and professionals over the test periods.

  4. Automating Relational Database Design for Microcomputer Users.

    ERIC Educational Resources Information Center

    Pu, Hao-Che

    1991-01-01

    Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…

  5. Retention payoff-based cost per day open regression equations: Application in a user-friendly decision support tool for investment analysis of automated estrus detection technologies.

    PubMed

    Dolecheck, K A; Heersche, G; Bewley, J M

    2016-12-01

    Assessing the economic implications of investing in automated estrus detection (AED) technologies can be overwhelming for dairy producers. The objectives of this study were to develop new regression equations for estimating the cost per day open (DO) and to apply the results to create a user-friendly, partial budget, decision support tool for investment analysis of AED technologies. In the resulting decision support tool, the end user can adjust herd-specific inputs regarding general management, current reproductive management strategies, and the proposed AED system. Outputs include expected DO, reproductive cull rate, net present value, and payback period for the proposed AED system. Utility of the decision support tool was demonstrated with an example dairy herd created using data from DairyMetrics (Dairy Records Management Systems, Raleigh, NC), Food and Agricultural Policy Research Institute (Columbia, MO), and published literature. Resulting herd size, rolling herd average milk production, milk price, and feed cost were 323 cows, 10,758kg, $0.41/kg, and $0.20/kg of dry matter, respectively. Automated estrus detection technologies with 2 levels of initial system cost (low: $5,000 vs. high: $10,000), tag price (low: $50 vs. high: $100), and estrus detection rate (low: 60% vs. high: 80%) were compared over a 7-yr investment period. Four scenarios were considered in a demonstration of the investment analysis tool: (1) a herd using 100% visual observation for estrus detection before adopting 100% AED, (2) a herd using 100% visual observation before adopting 75% AED and 25% visual observation, (3) a herd using 100% timed artificial insemination (TAI) before adopting 100% AED, and (4) a herd using 100% TAI before adopting 75% AED and 25% TAI. Net present value in scenarios 1 and 2 was always positive, indicating a positive investment situation. Net present value in scenarios 3 and 4 was always positive in combinations using a $50 tag price, and in scenario 4, the $5
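    The net present value and payback period outputs mentioned above are standard partial-budget calculations. The sketch below shows the generic arithmetic with made-up cash flows; it is not the decision support tool's actual model and does not use the study's numbers.

    ```python
    def npv(rate, cash_flows):
        """Net present value of yearly cash flows (year 0 first)."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    def payback_period(cash_flows):
        """First year in which cumulative cash flow turns non-negative (None if never)."""
        total = 0.0
        for t, cf in enumerate(cash_flows):
            total += cf
            if total >= 0:
                return t
        return None

    # Illustrative 7-year investment in an estrus-detection system: an up-front
    # system and tag cost, then yearly net benefits from fewer days open.
    cash_flows = [-15000] + [4000] * 7
    print(f"NPV at 8%: ${npv(0.08, cash_flows):,.0f}")
    print(f"Payback period: {payback_period(cash_flows)} years")
    ```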

  6. A Full-Featured User-Friendly CO2-EOR and Sequestration Planning Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savage, Bill

    A Full-Featured, User-Friendly CO2-EOR and Sequestration Planning Software. This project addressed the development of an integrated software solution that includes a graphical user interface, numerical simulation, visualization tools and optimization processes for reservoir simulation modeling of CO2-EOR. The objective was to assist the industry in the development of domestic energy resources by expanding the application of CO2-EOR technologies, and ultimately to maximize the CO2 sequestration capacity of the U.S. The software resulted in a field-ready application for the industry to address the current CO2-EOR technologies. The software has been made available to the public without restrictions and with user-friendly operating documentation and tutorials. The software (executable only) can be downloaded from NITEC's website at www.nitecllc.com. This integrated solution enables the design, optimization and operation of CO2-EOR processes for small and mid-sized operators, who currently cannot afford the expensive, time-intensive solutions that the major oil companies enjoy. Based on one estimate, small oil fields comprise 30% of the total economic resource potential for the application of CO2-EOR processes in the U.S. This corresponds to 21.7 billion barrels of incremental, technically recoverable oil using the current "best practices", and 31.9 billion barrels using "next-generation" CO2-EOR techniques. The project included a Case Study of a prospective CO2-EOR candidate field in Wyoming by a small independent, Linc Energy Petroleum Wyoming, Inc. NITEC LLC has an established track record of developing innovative and user-friendly software. The Principal Investigator is an experienced manager and engineer with expertise in software development, numerical techniques, and GUI applications. Unique, presently-proprietary NITEC technologies have been integrated into this application to further its ease of use and technical functionality.

  7. Flexible software architecture for user-interface and machine control in laboratory automation.

    PubMed

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  8. User-friendly design approach for analog layout design

    NASA Astrophysics Data System (ADS)

    Li, Yongfu; Lee, Zhao Chuan; Tripathi, Vikas; Perez, Valerio; Ong, Yoong Seang; Hui, Chiu Wing

    2017-03-01

    Analog circuits are sensitive to changes in the layout environment, manufacturing processes, and process variations. This paper presents an analog verification flow with five types of analog-focused layout constraint checks to assist engineers in identifying potential device mismatch and layout drawing mistakes. Compared to several other solutions, our approach requires only the layout design, which is sufficient to recognize all the matched devices. Our approach simplifies the data preparation and allows seamless integration into the layout environment with minimum disruption to the custom layout flow. This user-friendly analog verification flow gives engineers more confidence in the quality of their layouts.

  9. MSIX - A general and user-friendly platform for RAM analysis

    NASA Astrophysics Data System (ADS)

    Pan, Z. J.; Blemel, Peter

    The authors present a CAD (computer-aided design) platform supporting RAM (reliability, availability, and maintainability) analysis with efficient system description and alternative evaluation. The design concepts, implementation techniques, and application results are described. This platform is user-friendly because of its graphic environment, drawing facilities, object orientation, self-tutoring, and access to the operating system. The programs' independence and portability make them generally applicable to various analysis tasks.

  10. Sensor Network-Based and User-Friendly User Location Discovery for Future Smart Homes.

    PubMed

    Ahvar, Ehsan; Lee, Gyu Myoung; Han, Son N; Crespi, Noel; Khan, Imran

    2016-06-27

    User location is crucial context information for future smart homes where many location based services will be proposed. This location necessarily means that User Location Discovery (ULD) will play an important role in future smart homes. Concerns about privacy and the need to carry a mobile or a tag device within a smart home currently make conventional ULD systems uncomfortable for users. Future smart homes will need a ULD system to consider these challenges. This paper addresses the design of such a ULD system for context-aware services in future smart homes stressing the following challenges: (i) users' privacy; (ii) device-/tag-free; and (iii) fault tolerance and accuracy. On the other hand, emerging new technologies, such as the Internet of Things, embedded systems, intelligent devices and machine-to-machine communication, are penetrating into our daily life with more and more sensors available for use in our homes. Considering this opportunity, we propose a ULD system that is capitalizing on the prevalence of sensors for the home while satisfying the aforementioned challenges. The proposed sensor network-based and user-friendly ULD system relies on different types of inexpensive sensors, as well as a context broker with a fuzzy-based decision-maker. The context broker receives context information from different types of sensors and evaluates that data using the fuzzy set theory. We demonstrate the performance of the proposed system by illustrating a use case, utilizing both an analytical model and simulation.
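    As a toy illustration of the fuzzy evaluation step described above (membership functions combined with a simple rule), the sketch below estimates room occupancy from two inexpensive sensors. The triangular membership ranges and the single min rule are invented for illustration and are not the paper's actual rule base.

    ```python
    def triangular(x, a, b, c):
        """Triangular fuzzy membership: rises from a to a peak at b, falls to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def occupancy_confidence(motion_events, power_watts):
        """Fuzzy estimate that a room is occupied, from two cheap sensors."""
        motion_high = triangular(motion_events, 0, 10, 20)
        power_high = triangular(power_watts, 20, 150, 300)
        # Rule: room occupied IF motion is high AND appliance power draw is high.
        return min(motion_high, power_high)

    print(occupancy_confidence(motion_events=8, power_watts=120))
    ```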

  11. Advanced Cell Classifier: User-Friendly Machine-Learning-Based Software for Discovering Phenotypes in High-Content Imaging Data.

    PubMed

    Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter

    2017-06-28

    High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.
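    The core pattern in such tools, training a classifier on annotated per-cell feature vectors and then predicting phenotypes for the remaining cells, can be sketched with scikit-learn. The random features and the label rule below are synthetic stand-ins; this is not ACC's implementation.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Toy stand-in for annotated high-content data: rows are cells, columns are
    # extracted image features (intensity, shape, texture, ...), labels are
    # annotated phenotype classes.
    rng = np.random.default_rng(0)
    features = rng.normal(size=(200, 16))
    labels = (features[:, 0] + features[:, 3] > 0).astype(int)  # fake phenotype rule

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(features[:150], labels[:150])                 # learn from annotated cells
    print("held-out accuracy:", clf.score(features[150:], labels[150:]))
    ```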

  12. TIGER: A user-friendly interactive grid generation system for complicated turbomachinery and axis-symmetric configurations

    NASA Technical Reports Server (NTRS)

    Shih, Ming H.; Soni, Bharat K.

    1993-01-01

    The issue of time efficiency in grid generation is addressed by developing a user-friendly graphical interface for interactive/automatic construction of structured grids around complex turbomachinery/axis-symmetric configurations. Accurate and faithful geometry modeling is accomplished by adopting the nonuniform rational B-spline (NURBS) representation. A customized interactive grid generation code, TIGER, has been developed to facilitate the grid generation process for complicated internal, external, and internal-external turbomachinery field simulations. The FORMS Library is utilized to build the user-friendly graphical interface. The algorithm allows a user to redistribute grid points interactively on curves/surfaces using the NURBS formulation with accurate geometric definition. TIGER's features include multiblock, multiduct/shroud, multiblade row, uneven blade count, and patched/overlapping block interfaces. It has been applied to generate grids for various complicated turbomachinery geometries, as well as rocket and missile configurations.

  13. DockingApp: a user friendly interface for facilitated docking simulations with AutoDock Vina.

    PubMed

    Di Muzio, Elena; Toti, Daniele; Polticelli, Fabio

    2017-02-01

    Molecular docking is a powerful technique that helps uncover the structural and energetic bases of the interaction between macromolecules and substrates, endogenous and exogenous ligands, and inhibitors. Moreover, this technique plays a pivotal role in accelerating the screening of large libraries of compounds for drug development purposes. The need to promote community-driven drug development efforts, especially as far as neglected diseases are concerned, calls for user-friendly tools to allow non-expert users to exploit the full potential of molecular docking. Along this path, here is described the implementation of DockingApp, a freely available, extremely user-friendly, platform-independent application for performing docking simulations and virtual screening tasks using AutoDock Vina. DockingApp sports an intuitive graphical user interface which greatly facilitates both the input phase and the analysis of the results, which can be visualized in graphical form using the embedded JMol applet. The application comes with the DrugBank set of more than 1400 ready-to-dock, FDA-approved drugs, to facilitate virtual screening and drug repurposing initiatives. Furthermore, other databases of compounds such as ZINC, available also in AutoDock format, can be readily and easily plugged in.
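    Under the hood, tools like DockingApp drive AutoDock Vina's command line. A minimal sketch of invoking Vina from Python is shown below; the receptor, ligand, and search-box values are placeholders, and the flag names follow Vina's usual options but should be verified against the installed version.

    ```python
    import subprocess

    # Run a single AutoDock Vina docking job. File paths and the search box are
    # placeholders for a real receptor/ligand pair prepared in PDBQT format.
    subprocess.run(
        [
            "vina",
            "--receptor", "receptor.pdbqt",
            "--ligand", "ligand.pdbqt",
            "--center_x", "10.0", "--center_y", "12.5", "--center_z", "-3.0",
            "--size_x", "20", "--size_y", "20", "--size_z", "20",
            "--out", "docked_poses.pdbqt",
        ],
        check=True,
    )
    ```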

  14. DockingApp: a user friendly interface for facilitated docking simulations with AutoDock Vina

    NASA Astrophysics Data System (ADS)

    Di Muzio, Elena; Toti, Daniele; Polticelli, Fabio

    2017-02-01

    Molecular docking is a powerful technique that helps uncover the structural and energetic bases of the interaction between macromolecules and substrates, endogenous and exogenous ligands, and inhibitors. Moreover, this technique plays a pivotal role in accelerating the screening of large libraries of compounds for drug development purposes. The need to promote community-driven drug development efforts, especially as far as neglected diseases are concerned, calls for user-friendly tools to allow non-expert users to exploit the full potential of molecular docking. Along this path, here is described the implementation of DockingApp, a freely available, extremely user-friendly, platform-independent application for performing docking simulations and virtual screening tasks using AutoDock Vina. DockingApp sports an intuitive graphical user interface which greatly facilitates both the input phase and the analysis of the results, which can be visualized in graphical form using the embedded JMol applet. The application comes with the DrugBank set of more than 1400 ready-to-dock, FDA-approved drugs, to facilitate virtual screening and drug repurposing initiatives. Furthermore, other databases of compounds such as ZINC, available also in AutoDock format, can be readily and easily plugged in.

  15. UTOPIA-User-Friendly Tools for Operating Informatics Applications.

    PubMed

    Pettifer, S R; Sinnott, J R; Attwood, T K

    2004-01-01

    Bioinformaticians routinely analyse vast amounts of information held both in large remote databases and in flat data files hosted on local machines. The contemporary toolkit available for this purpose consists of an ad hoc collection of data manipulation tools, scripting languages and visualization systems; these must often be combined in complex and bespoke ways, the result frequently being an unwieldy artefact capable of one specific task, which cannot easily be exploited or extended by other practitioners. Owing to the sizes of current databases and the scale of the analyses necessary, routine bioinformatics tasks are often automated, but many still require the unique experience and intuition of human researchers: this requires tools that support real-time interaction with complex datasets. Many existing tools have poor user interfaces and limited real-time performance when applied to realistically large datasets; much of the user's cognitive capacity is therefore focused on controlling the tool rather than on performing the research. The UTOPIA project is addressing some of these issues by building reusable software components that can be combined to make useful applications in the field of bioinformatics. Expertise in the fields of human computer interaction, high-performance rendering, and distributed systems is being guided by bioinformaticians and end-user biologists to create a toolkit that is both architecturally sound from a computing point of view, and directly addresses end-user and application-developer requirements.

  16. User-friendly cognitive training for the elderly: a technical report.

    PubMed

    Boquete, Luciano; Rodríguez-Ascariz, José Manuel; Amo-Usanos, Carlos; Martínez-Arribas, Alejandro; Amo-Usanos, Javier; Otón, Salvador

    2011-01-01

    This article presents a system that implements a cognitive training program in users' homes. The system comprises various applications designed to create a daily brain-fitness regime. The proposed mental training system uses television and a remote control specially designed for the elderly. This system integrates Java applications to promote brain-fitness training in three areas: arithmetic, memory, and idea association. The system comprises the following: Standard television set, simplified wireless remote control, black box (system's core hardware and software), brain-fitness games (language Java), and Wi-Fi-enabled Internet-connected router. All data from the user training sessions are monitored through a control center. This control center analyzes the evolution of the user and the proper performance of the system during the test. The implemented system has been tested by six healthy volunteers. The results for this user group demonstrated the accessibility and usability of the system in a controlled real environment. The impressions of the users were very favorable, and they reported high adaptability to the system. The mean score for usability and accessibility assigned by the users was 3.56 out of 5 points. The operation stress test (over 200 h) was successful. The proposed system was used to implement a cognitive training program in users' homes, which was developed to be a low-cost tool with a high degree of user interactivity. The results of this preliminary study indicate that this user-friendly system could be adopted as a form of cognitive training for the elderly.

  17. Sensor Network-Based and User-Friendly User Location Discovery for Future Smart Homes

    PubMed Central

    Ahvar, Ehsan; Lee, Gyu Myoung; Han, Son N.; Crespi, Noel; Khan, Imran

    2016-01-01

    User location is crucial context information for future smart homes where many location based services will be proposed. This location necessarily means that User Location Discovery (ULD) will play an important role in future smart homes. Concerns about privacy and the need to carry a mobile or a tag device within a smart home currently make conventional ULD systems uncomfortable for users. Future smart homes will need a ULD system to consider these challenges. This paper addresses the design of such a ULD system for context-aware services in future smart homes stressing the following challenges: (i) users’ privacy; (ii) device-/tag-free; and (iii) fault tolerance and accuracy. On the other hand, emerging new technologies, such as the Internet of Things, embedded systems, intelligent devices and machine-to-machine communication, are penetrating into our daily life with more and more sensors available for use in our homes. Considering this opportunity, we propose a ULD system that is capitalizing on the prevalence of sensors for the home while satisfying the aforementioned challenges. The proposed sensor network-based and user-friendly ULD system relies on different types of inexpensive sensors, as well as a context broker with a fuzzy-based decision-maker. The context broker receives context information from different types of sensors and evaluates that data using the fuzzy set theory. We demonstrate the performance of the proposed system by illustrating a use case, utilizing both an analytical model and simulation. PMID:27355951

  18. BATS: a Bayesian user-friendly software for analyzing time series microarray experiments.

    PubMed

    Angelini, Claudia; Cutillo, Luisa; De Canditiis, Daniela; Mutarelli, Margherita; Pensky, Marianna

    2008-10-06

    Gene expression levels in a given cell can be influenced by different factors, namely pharmacological or medical treatments. The response to a given stimulus is usually different for different genes and may depend on time. One of the goals of modern molecular biology is the high-throughput identification of genes associated with a particular treatment or a biological process of interest. From a methodological and computational point of view, analyzing high-dimensional time course microarray data requires a very specific set of tools which are usually not included in standard software packages. Recently, the authors of this paper developed a fully Bayesian approach which allows one to identify differentially expressed genes in a 'one-sample' time-course microarray experiment, to rank them and to estimate their expression profiles. The method is based on explicit expressions for calculations and is, hence, very computationally efficient. The software package BATS (Bayesian Analysis of Time Series) presented here implements the methodology described above. It allows a user to automatically identify and rank differentially expressed genes and to estimate their expression profiles when at least 5-6 time points are available. The package has a user-friendly interface. BATS successfully manages various technical difficulties which arise in time-course microarray experiments, such as a small number of observations, non-uniform sampling intervals and replicated or missing data. BATS is a free user-friendly software for the analysis of both simulated and real microarray time course experiments. The software, the user manual and a brief illustrative example are freely available online at the BATS website: http://www.na.iac.cnr.it/bats.

  19. 3D-Printed Microfluidic Automation

    PubMed Central

    Au, Anthony K.; Bhattacharjee, Nirveek; Horowitz, Lisa F.; Chang, Tim C.; Folch, Albert

    2015-01-01

    Microfluidic automation – the automated routing, dispensing, mixing, and/or separation of fluids through microchannels – generally remains a slowly-spreading technology because device fabrication requires sophisticated facilities and the technology’s use demands expert operators. Integrating microfluidic automation in devices has involved specialized multi-layering and bonding approaches. Stereolithography is an assembly-free, 3D-printing technique that is emerging as an efficient alternative for rapid prototyping of biomedical devices. Here we describe fluidic valves and pumps that can be stereolithographically printed in optically-clear, biocompatible plastic and integrated within microfluidic devices at low cost. User-friendly fluid automation devices can be printed and used by non-engineers as replacement for costly robotic pipettors or tedious manual pipetting. Engineers can manipulate the designs as digital modules into new devices of expanded functionality. Printing these devices only requires the digital file and electronic access to a printer. PMID:25738695

  20. 3D-printed microfluidic automation.

    PubMed

    Au, Anthony K; Bhattacharjee, Nirveek; Horowitz, Lisa F; Chang, Tim C; Folch, Albert

    2015-04-21

    Microfluidic automation - the automated routing, dispensing, mixing, and/or separation of fluids through microchannels - generally remains a slowly-spreading technology because device fabrication requires sophisticated facilities and the technology's use demands expert operators. Integrating microfluidic automation in devices has involved specialized multi-layering and bonding approaches. Stereolithography is an assembly-free, 3D-printing technique that is emerging as an efficient alternative for rapid prototyping of biomedical devices. Here we describe fluidic valves and pumps that can be stereolithographically printed in optically-clear, biocompatible plastic and integrated within microfluidic devices at low cost. User-friendly fluid automation devices can be printed and used by non-engineers as replacement for costly robotic pipettors or tedious manual pipetting. Engineers can manipulate the designs as digital modules into new devices of expanded functionality. Printing these devices only requires the digital file and electronic access to a printer.

  1. Automated Instructional Management Systems (AIMS) Version III, Users Manual.

    ERIC Educational Resources Information Center

    New York Inst. of Tech., Old Westbury.

    This document sets forth the procedures necessary to utilize and understand the operating characteristics of the Automated Instructional Management System - Version III, a computer-based system for management of educational processes. Directions for initialization, including internal and user files; system and operational input requirements;…

  2. Technology demonstration of space intravehicular automation and robotics

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Barker, L. Keith

    1994-01-01

    Automation and robotic technologies are being developed and capabilities demonstrated which would increase the productivity of microgravity science and materials processing in the space station laboratory module, especially when the crew is not present. The Automation Technology Branch at NASA Langley has been working in the area of intravehicular automation and robotics (IVAR) to provide a user-friendly development facility, to determine customer requirements for automated laboratory systems, and to improve the quality and efficiency of commercial production and scientific experimentation in space. This paper will describe the IVAR facility and present the results of a demonstration using a simulated protein crystal growth experiment inside a full-scale mockup of the space station laboratory module using a unique seven-degree-of-freedom robot.

  3. A user-friendly tool for medical-related patent retrieval.

    PubMed

    Pasche, Emilie; Gobeill, Julien; Teodoro, Douglas; Gaudinat, Arnaud; Vishnyakova, Dina; Lovis, Christian; Ruch, Patrick

    2012-01-01

    Health-related information retrieval is complicated by the variety of nomenclatures available to name entities, since different communities of users name the same entity in different ways. In this report we present the development and evaluation of a user-friendly interactive Web application aimed at facilitating health-related patent search. Our tool, called TWINC, relies on a search engine tuned during several patent retrieval competitions, enhanced with intelligent interaction modules such as chemical query, normalization and expansion. While the related-article search functionality showed promising performance, the ad hoc search produced fairly mixed results. Nonetheless, TWINC performed well during the PatOlympics competition and was appreciated by intellectual property experts, although this result should be weighed against the limited evaluation sample. We also expect that it can be customized for corporate search environments to process domain- and company-specific vocabularies, including non-English literature and patent reports.

  4. Pseudomonas Genome Database: facilitating user-friendly, comprehensive comparisons of microbial genomes.

    PubMed

    Winsor, Geoffrey L; Van Rossum, Thea; Lo, Raymond; Khaira, Bhavjinder; Whiteside, Matthew D; Hancock, Robert E W; Brinkman, Fiona S L

    2009-01-01

    Pseudomonas aeruginosa is a well-studied opportunistic pathogen that is particularly known for its intrinsic antimicrobial resistance, diverse metabolic capacity, and its ability to cause life threatening infections in cystic fibrosis patients. The Pseudomonas Genome Database (http://www.pseudomonas.com) was originally developed as a resource for peer-reviewed, continually updated annotation for the Pseudomonas aeruginosa PAO1 reference strain genome. In order to facilitate cross-strain and cross-species genome comparisons with other Pseudomonas species of importance, we have now expanded the database capabilities to include all Pseudomonas species, and have developed or incorporated methods to facilitate high quality comparative genomics. The database contains robust assessment of orthologs, a novel ortholog clustering method, and incorporates five views of the data at the sequence and annotation levels (Gbrowse, Mauve and custom views) to facilitate genome comparisons. A choice of simple and more flexible user-friendly Boolean search features allows researchers to search and compare annotations or sequences within or between genomes. Other features include more accurate protein subcellular localization predictions and a user-friendly, Boolean searchable log file of updates for the reference strain PAO1. This database aims to continue to provide a high quality, annotated genome resource for the research community and is available under an open source license.

  5. ROCOPT: A user friendly interactive code to optimize rocket structural components

    NASA Technical Reports Server (NTRS)

    Rule, William K.

    1989-01-01

    ROCOPT is a user-friendly, graphically-interfaced, microcomputer-based computer program (IBM compatible) that optimizes rocket components by minimizing the structural weight. The rocket components considered are ring stiffened truncated cones and cylinders. The applied loading is static, and can consist of any combination of internal or external pressure, axial force, bending moment, and torque. Stress margins are calculated by means of simple closed form strength of material type equations. Stability margins are determined by approximate, orthotropic-shell, closed-form equations. A modified form of Powell's method, in conjunction with a modified form of the external penalty method, is used to determine the minimum weight of the structure subject to stress and stability margin constraints, as well as user input constraints on the structural dimensions. The graphical interface guides the user through the required data prompts, explains program options and graphically displays results for easy interpretation.
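    To illustrate the optimization scheme named above (Powell's method combined with an external penalty on violated margins), here is a deliberately simplified sketch using SciPy: a toy shell-weight objective, a toy hoop-stress margin, and a quadratic penalty. The formulas and numbers are illustrative only and are not ROCOPT's equations.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    RADIUS, LENGTH, PRESSURE, ALLOWABLE = 10.0, 100.0, 2.0, 200.0  # toy values

    def weight(t):
        """Toy shell weight: proportional to wall thickness for a fixed geometry."""
        return 2 * np.pi * RADIUS * LENGTH * t

    def stress_margin(t):
        """Toy hoop-stress margin: positive while p*r/t stays below the allowable."""
        return ALLOWABLE - PRESSURE * RADIUS / t

    def penalized(x, k=1e6):
        """External penalty method: add a quadratic cost whenever the margin is violated."""
        t = x[0]
        if t <= 1e-6:                      # keep the search in the physical region
            return 1e12
        violation = max(0.0, -stress_margin(t))
        return weight(t) + k * violation ** 2

    result = minimize(penalized, x0=np.array([1.0]), method="Powell")
    print("optimal thickness ~", result.x[0],
          "(analytic optimum is p*r/allowable = 0.1)")
    ```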

  6. SimArray: a user-friendly and user-configurable microarray design tool

    PubMed Central

    Auburn, Richard P; Russell, Roslin R; Fischer, Bettina; Meadows, Lisa A; Sevillano Matilla, Santiago; Russell, Steven

    2006-01-01

    Background Microarrays were first developed to assess gene expression but are now also used to map protein-binding sites and to assess allelic variation between individuals. Regardless of the intended application, efficient production and appropriate array design are key determinants of experimental success. Inefficient production can make larger-scale studies prohibitively expensive, whereas poor array design makes normalisation and data analysis problematic. Results We have developed a user-friendly tool, SimArray, which generates a randomised spot layout, computes a maximum meta-grid area, and estimates the print time, in response to user-specified design decisions. Selected parameters include: the number of probes to be printed; the microtitre plate format; the printing pin configuration, and the achievable spot density. SimArray is compatible with all current robotic spotters that employ 96-, 384- or 1536-well microtitre plates, and can be configured to reflect most production environments. Print time and maximum meta-grid area estimates facilitate evaluation of each array design for its suitability. Randomisation of the spot layout facilitates correction of systematic biases by normalisation. Conclusion SimArray is intended to help both established researchers and those new to the microarray field to develop microarray designs with randomised spot layouts that are compatible with their specific production environment. SimArray is an open-source program and is available from . PMID:16509966
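    A randomised spot layout, the key design feature mentioned above, can be generated with a simple shuffle. This sketch is illustrative only and ignores the pin-configuration and plate-format constraints that SimArray actually handles.

    ```python
    import random

    def randomized_layout(probe_ids, rows, cols, seed=0):
        """Assign probes to grid positions in random order so that systematic
        spatial biases can later be corrected by normalisation."""
        positions = [(r, c) for r in range(rows) for c in range(cols)]
        if len(probe_ids) > len(positions):
            raise ValueError("grid too small for the number of probes")
        rng = random.Random(seed)
        rng.shuffle(positions)
        return dict(zip(probe_ids, positions))

    layout = randomized_layout([f"probe_{i}" for i in range(12)], rows=4, cols=4)
    print(layout)
    ```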

  7. A user friendly system for ultrasound carotid intima-media thickness image interpretation

    NASA Astrophysics Data System (ADS)

    Zhu, Xiangjun; Kendall, Christopher B.; Hurst, R. Todd; Liang, Jianming

    2011-03-01

    Assessment of Carotid Intima-Media Thickness (CIMT) by B-mode ultrasound is a technically mature and reproducible technology. Given the high morbidity, mortality and large societal burden associated with cardiovascular (CV) disease, CIMT, as a safe yet inexpensive tool, is increasingly utilized for CV risk stratification. However, CIMT requires a precise measurement of the thickness of the intima and media layers of the carotid artery, which can be tedious and time-consuming and demands specialized expertise and experience. To this end, we have developed a highly user-friendly system for semiautomatic CIMT image interpretation. Our contribution is the application of active contour models (snake models) with hard constraints, leading to an accurate, adaptive and user-friendly border detection algorithm. A comparison study with the CIMT measurement software in Siemens Syngo® Arterial Health Package shows that our system gives a small bias in the mean (0.049 ± 0.051 mm) and maximum (0.010 ± 0.083 mm) CIMT measures and offers higher reproducibility (average correlation coefficients of 0.948 and 0.844 for mean and maximum CIMT, respectively; P < 0.001). This superior performance is attributed to our novel interface design for hard constraints in the snake models.
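
    A minimal greedy active-contour sketch conveys the general idea of snake-based border detection with hard constraints, i.e. user-anchored points that are never moved. It is a simplification written under assumed parameters, not the algorithm or interface of the system described above.

```python
# Minimal greedy-snake sketch (in the spirit of Williams & Shah) with "hard
# constraints": user-specified anchor points are never moved. Purely
# illustrative; not the algorithm or parameters of the CIMT system above.
import numpy as np

def greedy_snake(points, edge_map, fixed, alpha=1.0, beta=1.0, gamma=2.0,
                 iterations=50):
    """points: (N, 2) array of (row, col) control points.
    edge_map: 2-D array with large values on strong edges (e.g. gradient magnitude).
    fixed: indices treated as hard constraints and left untouched."""
    pts = np.asarray(points, float).copy()
    offsets = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
    for _ in range(iterations):
        mean_d = np.mean(np.linalg.norm(np.diff(pts, axis=0), axis=1))
        for i in range(len(pts)):
            if i in fixed:
                continue                                  # hard constraint
            prev_p, next_p = pts[i - 1], pts[(i + 1) % len(pts)]
            best, best_e = pts[i], np.inf
            for dr, dc in offsets:
                cand = pts[i] + (dr, dc)
                r, c = int(cand[0]), int(cand[1])
                if not (0 <= r < edge_map.shape[0] and 0 <= c < edge_map.shape[1]):
                    continue
                cont = (np.linalg.norm(cand - prev_p) - mean_d) ** 2   # continuity
                curv = np.linalg.norm(prev_p - 2 * cand + next_p) ** 2  # curvature
                e = alpha * cont + beta * curv - gamma * edge_map[r, c]
                if e < best_e:
                    best, best_e = cand, e
            pts[i] = best
    return pts

# Example: fixed={0, len(points) - 1} pins user-placed endpoints of the border.
```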

  8. Library Automation Alternatives in 1996 and User Satisfaction Ratings of Library Users by Operating System.

    ERIC Educational Resources Information Center

    Cibbarelli, Pamela

    1996-01-01

    Examines library automation product introductions and conversions to new operating systems. Compares user satisfaction ratings of library software packages grouped by operating system: DOS/Windows, UNIX, Macintosh, and DEC VAX/VMS. Software is rated according to documentation, service/support, training, product reliability, product capabilities, ease of use,…

  9. Iterative User Interface Design for Automated Sequential Organ Failure Assessment Score Calculator in Sepsis Detection

    PubMed Central

    Herasevich, Vitaly

    2017-01-01

    Background The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation, and the clerical burden of information retrieval makes this score ideal for automated calculation. Objective The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with a subsequent usability evaluation study. Methods First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. Results The overall mean (SD) time to complete a manual SOFA score calculation was 61.6 (33) s. Among the usability survey respondents (24%, 12/50), our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Conclusions Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians’ needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as “apps.” A user-centered design process and usability evaluation should be considered during creation of these tools. PMID:28526675
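
    For orientation, an automated SOFA-style calculator essentially maps EMR values onto organ-system subscores and sums them. The sketch below covers only three organ systems with abbreviated thresholds and is purely illustrative, not a clinically validated implementation of the study's calculator.

```python
# Illustrative sketch of automated SOFA-style scoring from EMR values. The
# thresholds below are abbreviated for demonstration only and are NOT a
# clinically validated encoding of the full SOFA criteria.
def band(value, cutoffs):
    """Return the subscore for the first band the value falls into.
    cutoffs: list of (predicate, score) checked from most to least severe."""
    for predicate, score in cutoffs:
        if predicate(value):
            return score
    return 0

def sofa_subscores(platelets_k, bilirubin_mg_dl, gcs):
    coagulation = band(platelets_k, [(lambda v: v < 20, 4), (lambda v: v < 50, 3),
                                     (lambda v: v < 100, 2), (lambda v: v < 150, 1)])
    liver = band(bilirubin_mg_dl, [(lambda v: v >= 12.0, 4), (lambda v: v >= 6.0, 3),
                                   (lambda v: v >= 2.0, 2), (lambda v: v >= 1.2, 1)])
    cns = band(gcs, [(lambda v: v < 6, 4), (lambda v: v <= 9, 3),
                     (lambda v: v <= 12, 2), (lambda v: v <= 14, 1)])
    return {"coagulation": coagulation, "liver": liver, "cns": cns}

def partial_sofa(platelets_k, bilirubin_mg_dl, gcs):
    return sum(sofa_subscores(platelets_k, bilirubin_mg_dl, gcs).values())

print(partial_sofa(platelets_k=90, bilirubin_mg_dl=2.5, gcs=13))  # -> 2 + 2 + 1 = 5
```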

  10. SLIMS--a user-friendly sample operations and inventory management system for genotyping labs.

    PubMed

    Van Rossum, Thea; Tripp, Ben; Daley, Denise

    2010-07-15

    We present the Sample-based Laboratory Information Management System (SLIMS), a powerful and user-friendly open source web application that provides all members of a laboratory with an interface to view, edit and create sample information. SLIMS aims to simplify common laboratory tasks with tools such as a user-friendly shopping cart for subjects, samples and containers that easily generates reports, shareable lists and plate designs for genotyping. Further key features include customizable data views, database change-logging and dynamically filled pre-formatted reports. Along with being feature-rich, SLIMS' power comes from being able to handle longitudinal data from multiple time-points and biological sources. This type of data is increasingly common from studies searching for susceptibility genes for common complex diseases that collect thousands of samples generating millions of genotypes and overwhelming amounts of data. LIMSs provide an efficient way to deal with this data while increasing accessibility and reducing laboratory errors; however, professional LIMS are often too costly to be practical. SLIMS gives labs a feasible alternative that is easily accessible, user-centrically designed and feature-rich. To facilitate system customization, and utilization for other groups, manuals have been written for users and developers. Documentation, source code and manuals are available at http://genapha.icapture.ubc.ca/SLIMS/index.jsp. SLIMS was developed using Java 1.6.0, JSPs, Hibernate 3.3.1.GA, DB2 and mySQL, Apache Tomcat 6.0.18, NetBeans IDE 6.5, Jasper Reports 3.5.1 and JasperSoft's iReport 3.5.1.

  11. Preliminary evaluation of user terminals for an automated pilot briefing system

    DOT National Transportation Integrated Search

    1976-08-01

    This report describes a preliminary evaluation of various user terminal concepts for an automated aviation pilot weather briefing and flight plan filing system. Terminals embodying differing operational concepts were used by volunteer general aviatio...

  12. Iterative User Interface Design for Automated Sequential Organ Failure Assessment Score Calculator in Sepsis Detection.

    PubMed

    Aakre, Christopher Ansel; Kitson, Jaben E; Li, Man; Herasevich, Vitaly

    2017-05-18

    The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation, and the clerical burden of information retrieval makes this score ideal for automated calculation. The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with a subsequent usability evaluation study. First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. The overall mean (SD) time to complete a manual SOFA score calculation was 61.6 (33) s. Among the usability survey respondents (24%, 12/50), our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians' needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as "apps." A user-centered design process and usability evaluation should be considered during creation of these tools. ©Christopher Ansel Aakre, Jaben E Kitson, Man Li, Vitaly Herasevich. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 18.05.2017.

  13. Development of an automated scanning monochromator for sensitivity calibration of the MUSTANG instrument

    NASA Astrophysics Data System (ADS)

    Rivers, Thane D.

    1992-06-01

    An automated scanning monochromator was developed using an Acton Research Corporation (ARC) monochromator, an Ealing photomultiplier tube, and a Macintosh PC in conjunction with LabVIEW software. The LabVIEW Virtual Instrument written to operate the ARC monochromator is a mouse-driven, user-friendly program developed for automated spectral data measurements. Resolution and sensitivity of the automated scanning monochromator system were determined experimentally. The automated monochromator was then used for spectral measurements of a platinum lamp. Additionally, the reflectivity curve for a BaSO4-coated screen has been measured. Reflectivity measurements indicate a large discrepancy with expected results. Further analysis of the reflectivity experiment is required for conclusive results.

  14. Electric-powered indoor/outdoor wheelchairs (EPIOCs): users' views of influence on family, friends and carers.

    PubMed

    Frank, Andrew; Neophytou, Claudius; Frank, Julia; de Souza, Lorraine

    2010-01-01

    This study explored the effects of electric-powered indoor/outdoor wheelchair (EPIOC) provision to users on their family and carers. EPIOC users receiving their chairs between February and November 2002 (N = 74) were invited to participate in a telephone questionnaire/interview, and 64 agreed. This study examined the responses to the question 'Has the use of your EPIOC affected your family or friends in any way?' and related comments. Interviews were analysed using a qualitative framework approach to identify emergent themes. In addition, the proportion of individuals raising issues related to each theme was determined. Participants were interviewed 10-19 (mean = 14.3) months after chair delivery. The following themes emerged: reduced physical burden on family/friends and increased independence and freedom. However, an EPIOC does not eliminate other practical problems particularly during transportation and negotiating kerbs and slopes. Users also reported anxiety/worry in relationship to EPIOC use, e.g., weather conditions, personal safety (muggings), use of ramps and kerbs. There are considerable benefits to families and carers associated with powered wheelchair use. A reduction in the physical demand for pushing and increased freedom were identified. These benefits appear to outweigh the residual practical difficulties and worries.

  15. Developing a Ruggedized User-Friendly UAS for Monitoring Volcanic Emissions

    NASA Astrophysics Data System (ADS)

    Wardell, L. J.; Elston, J. S.; Stachura, M.

    2017-12-01

    Using lessons learned from a history of airborne volcano measurements and a range of UAS R&D, a reliable and ruggedized UAS is being developed specifically for volcano monitoring and response. A key feature is the user interface (UI) that allows for a menu of automated flight plans that will account for terrain and sensor requirements. Due to variation in the response times of miniaturized airborne sensors, flight plan options are extended to account for sensor lag when needed. By automating such complicating variables into the UI, the amount of background and training needed for operation is further minimized. Payload options include simultaneous in situ gas and particle sensors combined with downward-looking imagers to provide a wide range of data products. The system is currently under development by Black Swift Technologies, and the latest updates and test results will be presented. Specifications of the Superswift airframe include a 6,000 m flight ceiling, 2.4 kg payload capacity, and 2 hr endurance.

  16. Software-supported USER cloning strategies for site-directed mutagenesis and DNA assembly.

    PubMed

    Genee, Hans Jasper; Bonde, Mads Tvillinggaard; Bagger, Frederik Otzen; Jespersen, Jakob Berg; Sommer, Morten O A; Wernersson, Rasmus; Olsen, Lars Rønn

    2015-03-20

    USER cloning is a fast and versatile method for engineering of plasmid DNA. We have developed a user-friendly Web server tool that automates the design of optimal PCR primers for several distinct USER cloning-based applications. Our Web server, named AMUSER (Automated DNA Modifications with USER cloning), facilitates DNA assembly and introduction of virtually any type of site-directed mutagenesis by designing optimal PCR primers for the desired genetic changes. To demonstrate the utility, we designed primers for a simultaneous two-position site-directed mutagenesis of green fluorescent protein (GFP) to yellow fluorescent protein (YFP), which in a single-step reaction resulted in a 94% cloning efficiency. AMUSER also supports degenerate nucleotide primers, single insert combinatorial assembly, and flexible parameters for PCR amplification. AMUSER is freely available online at http://www.cbs.dtu.dk/services/AMUSER/.

  17. A user-friendly phytoremediation database: creating the searchable database, the users, and the broader implications.

    PubMed

    Famulari, Stevie; Witz, Kyla

    2015-01-01

    Designers, students, teachers, gardeners, farmers, landscape architects, architects, engineers, homeowners, and others have uses for the practice of phytoremediation. This research looks at the creation of a phytoremediation database which is designed for ease of use for a non-scientific user, as well as for students in an educational setting ( http://www.steviefamulari.net/phytoremediation ). During 2012, Environmental Artist & Professor of Landscape Architecture Stevie Famulari, with assistance from Kyla Witz, a landscape architecture student, created an online searchable database designed for high public accessibility. The database is a record of research of plant species that aid in the uptake of contaminants, including metals, organic materials, biodiesels & oils, and radionuclides. The database consists of multiple interconnected indexes categorized into common and scientific plant name, contaminant name, and contaminant type. It includes photographs, hardiness zones, specific plant qualities, full citations to the original research, and other relevant information intended to aid those designing with phytoremediation in searching for potential plants that may be used to address their site's needs. The objective of the terminology section is to remove uncertainty for more inexperienced users, and to clarify terms for a more user-friendly experience. Implications of the work, including education and ease of browsing, as well as use of the database in teaching, are discussed.

  18. Chipster: user-friendly analysis software for microarray and other high-throughput data.

    PubMed

    Kallio, M Aleksi; Tuimala, Jarno T; Hupponen, Taavi; Klemelä, Petri; Gentile, Massimiliano; Scheinin, Ilari; Koski, Mikko; Käki, Janne; Korpelainen, Eija I

    2011-10-14

    The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Chipster is a user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available.

  19. A user-friendly tool for incremental haemodialysis prescription.

    PubMed

    Casino, Francesco Gaetano; Basile, Carlo

    2018-01-05

    There is a recently heightened interest in incremental haemodialysis (IHD), the main advantage of which could likely be a better preservation of the residual kidney function of the patients. The implementation of IHD, however, is hindered by many factors, among them, the mathematical complexity of its prescription. The aim of our study was to design a user-friendly tool for IHD prescription, consisting of only a few rows of a common spreadsheet. The keystone of our spreadsheet was the following fundamental concept: the dialysis dose to be prescribed in IHD depends only on the normalized urea clearance provided by the native kidneys (KRUn) of the patient for each frequency of treatment, according to the variable target model recently proposed by Casino and Basile (The variable target model: a paradigm shift in the incremental haemodialysis prescription. Nephrol Dial Transplant 2017; 32: 182-190). The first step was to put in sequence a series of equations in order to calculate, firstly, KRUn and, then, the key parameters to be prescribed for an adequate IHD; the second step was to compare KRUn values obtained with our spreadsheet with KRUn values obtainable with the gold standard Solute-solver (Daugirdas JT et al., Solute-solver: a web-based tool for modeling urea kinetics for a broad range of hemodialysis schedules in multiple patients. Am J Kidney Dis 2009; 54: 798-809) in a sample of 40 incident haemodialysis patients. Our spreadsheet provided excellent results. The differences with Solute-solver were clinically negligible. This was confirmed by the Bland-Altman plot built to analyse the agreement between KRUn values obtained with the two methods: the difference was 0.07 ± 0.05 mL/min/35 L. Our spreadsheet is a user-friendly tool able to provide clinically acceptable results in IHD prescription. Two immediate consequences could derive: (i) a larger dissemination of IHD might occur; and (ii) our spreadsheet could represent a useful tool for an ineludibly
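
    The Bland-Altman agreement statistics mentioned above (mean bias and 95% limits of agreement) reduce to a short calculation. The sketch below uses made-up KRUn values as placeholders for the spreadsheet and Solute-solver outputs.

```python
# Bland-Altman agreement sketch: mean bias and 95% limits of agreement between
# two methods measuring the same quantity. The arrays below are made-up
# placeholders standing in for KRUn values from the spreadsheet and Solute-solver.
import numpy as np

def bland_altman(method_a, method_b):
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diffs = a - b
    bias = diffs.mean()
    sd = diffs.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)   # bias and limits of agreement

spreadsheet = [2.10, 1.85, 2.40, 1.95]       # hypothetical KRUn, mL/min/35 L
solute_solver = [2.02, 1.80, 2.35, 1.90]
bias, (lo, hi) = bland_altman(spreadsheet, solute_solver)
print(f"bias = {bias:.3f}, 95% limits of agreement = ({lo:.3f}, {hi:.3f})")
```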

  20. ARES: automated response function code. Users manual. [HPGAM and LSQVM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maung, T.; Reynolds, G.M.

    This ARES user's manual provides detailed instructions for a general understanding of the Automated Response Function Code and gives step by step instructions for using the complete code package on a HP-1000 system. This code is designed to calculate response functions of NaI gamma-ray detectors, with cylindrical or rectangular geometries.

  1. MASH Suite: a user-friendly and versatile software interface for high-resolution mass spectrometry data interpretation and visualization.

    PubMed

    Guner, Huseyin; Close, Patrick L; Cai, Wenxuan; Zhang, Han; Peng, Ying; Gregorich, Zachery R; Ge, Ying

    2014-03-01

    The rapid advancements in mass spectrometry (MS) instrumentation, particularly in Fourier transform (FT) MS, have made the acquisition of high-resolution and high-accuracy mass measurements routine. However, the software tools for the interpretation of high-resolution MS data are underdeveloped. Although several algorithms for the automatic processing of high-resolution MS data are available, there is still an urgent need for a user-friendly interface with functions that allow users to visualize and validate the computational output. Therefore, we have developed MASH Suite, a user-friendly and versatile software interface for processing high-resolution MS data. MASH Suite contains a wide range of features that allow users to easily navigate through data analysis, visualize complex high-resolution MS data, and manually validate automatically processed results. Furthermore, it provides easy, fast, and reliable interpretation of top-down, middle-down, and bottom-up MS data. MASH Suite is convenient, easily operated, and freely available. It can greatly facilitate the comprehensive interpretation and validation of high-resolution MS data with high accuracy and reliability.

  2. EpHLA: an innovative and user-friendly software automating the HLAMatchmaker algorithm for antibody analysis.

    PubMed

    Sousa, Luiz Cláudio Demes da Mata; Filho, Herton Luiz Alves Sales; Von Glehn, Cristina de Queiroz Carrascosa; da Silva, Adalberto Socorro; Neto, Pedro de Alcântara dos Santos; de Castro, José Adail Fonseca; do Monte, Semíramis Jamil Hadad

    2011-12-01

    The global challenge for solid organ transplantation programs is to distribute organs to the highly sensitized recipients. The purpose of this work is to describe and test the functionality of the EpHLA software, a program that automates the analysis of acceptable and unacceptable HLA epitopes on the basis of the HLAMatchmaker algorithm. HLAMatchmaker considers small configurations of polymorphic residues referred to as eplets as essential components of HLA epitopes. Currently, the analyses require the creation of temporary files and the manual cut and paste of laboratory test results between electronic spreadsheets, which is time-consuming and prone to administrative errors. The EpHLA software was developed in the Object Pascal programming language and uses the HLAMatchmaker algorithm to generate histocompatibility reports. The automated generation of reports requires the integration of files containing the results of laboratory tests (HLA typing, anti-HLA antibody signature) and public data banks (NMDP, IMGT). The integration and the access to these data were accomplished by means of the framework called eDAFramework. The eDAFramework was developed in Object Pascal and PHP and it provides data access functionalities for software developed in these languages. The tool functionality was successfully tested in comparison to actual, manually derived reports of patients from a renal transplantation program with related donors. We successfully developed software that enables the automated definition of the epitope specificities of HLA antibodies. This new tool will benefit the management of recipient/donor pair selection for highly sensitized patients. Copyright © 2011 Elsevier B.V. All rights reserved.
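
    At its core, eplet-level antibody analysis of the kind automated here compares the set of eplets carried by the donor's HLA alleles against the eplets the recipient's antibodies react with. The toy sketch below shows that set comparison only; the allele-to-eplet table and eplet names are hypothetical placeholders, not HLAMatchmaker data.

```python
# Toy sketch of an eplet-level compatibility check in the spirit of
# HLAMatchmaker-based analysis: donor HLA alleles are expanded to eplets and
# flagged if the recipient has antibody reactivity against them. The eplet
# names and the allele-to-eplet table are hypothetical placeholders.
ALLELE_EPLETS = {                      # hypothetical allele -> eplet mapping
    "A*02:01": {"62GE", "144TKH", "107W"},
    "B*07:02": {"65QIA", "69AA", "163EW"},
}

def classify_eplets(donor_alleles, reactive_eplets):
    """Split donor eplets into acceptable vs unacceptable mismatches."""
    donor_eplets = set().union(*(ALLELE_EPLETS[a] for a in donor_alleles))
    unacceptable = donor_eplets & reactive_eplets
    acceptable = donor_eplets - reactive_eplets
    return acceptable, unacceptable

ok, bad = classify_eplets(["A*02:01", "B*07:02"], reactive_eplets={"144TKH", "163EW"})
print("acceptable:", sorted(ok))
print("unacceptable (antibody-reactive):", sorted(bad))
```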

  3. User's operating procedures. Volume 1: Scout project information programs

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review of the user's operating procedures for the Scout Project Automatic Data System, called SPADS, is given. SPADS is the result of the past seven years of software development on a Prime minicomputer located at the Scout Project Office. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. The instructions to operate the Scout Project Information programs for data retrieval and file maintenance via the user-friendly menu drivers are presented.

  4. User's operating procedures. Volume 3: Projects directorate information programs

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review of the user's operating procedures for the Scout Project Automatic Data System, called SPADS, is presented. SPADS is the result of the past seven years of software development on a Prime mini-computer. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, three of three, provides the instructions to operate the Projects Directorate Information programs for data retrieval and file maintenance via the user-friendly menu drivers.

  5. PuffinPlot: A versatile, user-friendly program for paleomagnetic analysis

    NASA Astrophysics Data System (ADS)

    Lurcock, P. C.; Wilson, G. S.

    2012-06-01

    PuffinPlot is a user-friendly desktop application for analysis of paleomagnetic data, offering a unique combination of features. It runs on several operating systems, including Windows, Mac OS X, and Linux; supports both discrete and long core data; and facilitates analysis of very weakly magnetic samples. As well as interactive graphical operation, PuffinPlot offers batch analysis for large volumes of data, and a Python scripting interface for programmatic control of its features. Available data displays include demagnetization/intensity, Zijderveld, equal-area (for sample, site, and suite level demagnetization data, and for magnetic susceptibility anisotropy data), a demagnetization data table, and a natural remanent magnetization intensity histogram. Analysis types include principal component analysis, Fisherian statistics, and great-circle path intersections. The results of calculations can be exported as CSV (comma-separated value) files; graphs can be printed, and can also be saved as publication-quality vector files in SVG or PDF format. PuffinPlot is free, and the program, user manual, and fully documented source code may be downloaded from http://code.google.com/p/puffinplot/.

  6. Chipster: user-friendly analysis software for microarray and other high-throughput data

    PubMed Central

    2011-01-01

    Background The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Results Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Conclusions Chipster is a user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available. PMID:21999641

  7. VCFtoTree: a user-friendly tool to construct locus-specific alignments and phylogenies from thousands of anthropologically relevant genome sequences.

    PubMed

    Xu, Duo; Jaber, Yousef; Pavlidis, Pavlos; Gokcumen, Omer

    2017-09-26

    Constructing alignments and phylogenies for a given locus from large genome sequencing studies with relevant outgroups allows novel evolutionary and anthropological insights. However, no user-friendly tool has been developed to integrate thousands of recently available and anthropologically relevant genome sequences to construct complete sequence alignments and phylogenies. Here, we provide VCFtoTree, a user-friendly tool with a graphical user interface that directly accesses online databases to download, parse and analyze genome variation data for regions of interest. Our pipeline combines popular sequence datasets and tree building algorithms with custom data parsing to generate accurate alignments and phylogenies using all the individuals from the 1000 Genomes Project, Neanderthal and Denisovan genomes, as well as reference genomes of Chimpanzee and Rhesus Macaque. It can also be applied to other phased human genomes, as well as genomes from other species. The output of our pipeline includes an alignment in FASTA format and a tree file in newick format. VCFtoTree fulfills the increasing demand for constructing alignments and phylogenies for a given locus from thousands of available genomes. Our software provides a user-friendly interface for a wider audience without prerequisite knowledge in programming. VCFtoTree can be accessed from https://github.com/duoduoo/VCFtoTree_3.0.0 .

  8. PDB Editor: a user-friendly Java-based Protein Data Bank file editor with a GUI.

    PubMed

    Lee, Jonas; Kim, Sung Hou

    2009-04-01

    The Protein Data Bank file format is the format most widely used by protein crystallographers and biologists to disseminate and manipulate protein structures. Despite this, there are few user-friendly software packages available to efficiently edit and extract raw information from PDB files. This limitation often leads to many protein crystallographers wasting significant time manually editing PDB files. PDB Editor, written in Java Swing GUI, allows the user to selectively search, select, extract and edit information in parallel. Furthermore, the program is a stand-alone application written in Java which frees users from the hassles associated with platform/operating system-dependent installation and usage. PDB Editor can be downloaded from http://sourceforge.net/projects/pdbeditorjl/.

  9. User interface design principles for the SSM/PMAD automated power system

    NASA Technical Reports Server (NTRS)

    Jakstas, Laura M.; Myers, Chris J.

    1991-01-01

    Martin Marietta has developed a user interface for the space station module power management and distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined. An engineer's interactions with the system are also described.

  10. WIFIP: a web-based user interface for automated synchrotron beamlines.

    PubMed

    Sallaz-Damaz, Yoann; Ferrer, Jean Luc

    2017-09-01

    The beamline control software, through the associated graphical user interface (GUI), is the user access point to the experiment, interacting with synchrotron beamline components and providing automated routines. FIP, the French beamline for the Investigation of Proteins, is a highly automated macromolecular crystallography (MX) beamline at the European Synchrotron Radiation Facility. On such a beamline, a significant number of users choose to control their experiment remotely. This is often performed with a limited bandwidth and from a large choice of computers and operating systems. Furthermore, this has to be possible in a rapidly evolving experimental environment, where new developments have to be easily integrated. To face these challenges, a light, platform-independent control software and associated GUI are required. Here, WIFIP, a web-based user interface developed at FIP, is described. Beyond being the present FIP control interface, WIFIP is also a proof of concept for future MX control software.

  11. Automated workflows for modelling chemical fate, kinetics and toxicity.

    PubMed

    Sala Benito, J V; Paini, Alicia; Richarz, Andrea-Nicole; Meinl, Thorsten; Berthold, Michael R; Cronin, Mark T D; Worth, Andrew P

    2017-12-01

    Automation is universal in today's society, from operating equipment such as machinery, in factory processes, to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically-based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and VCBA, which simulate the fate of chemicals in vivo within the body and in in vitro test systems, respectively. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
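
    As a much-reduced stand-in for the PBK models mentioned above, the sketch below evaluates the classic one-compartment oral-dose solution; the parameter values are invented for illustration and are unrelated to the workflows' fitted models.

```python
# Minimal one-compartment kinetic sketch (a stand-in for the far richer PBK /
# VCBA models mentioned above): concentration after an oral dose with first-
# order absorption and elimination. Parameter values are invented for
# illustration only.
import math

def concentration(t_h, dose_mg=100.0, vd_l=40.0, ka=1.0, ke=0.1, f=0.9):
    """Classic one-compartment oral-dose solution (Bateman equation)."""
    return (f * dose_mg * ka) / (vd_l * (ka - ke)) * (math.exp(-ke * t_h)
                                                      - math.exp(-ka * t_h))

for t in (0.5, 1, 2, 4, 8, 24):
    print(f"t = {t:>4} h  C = {concentration(t):6.3f} mg/L")
```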

  12. Back to BaySICS: a user-friendly program for Bayesian Statistical Inference from Coalescent Simulations.

    PubMed

    Sandoval-Castellanos, Edson; Palkopoulou, Eleftheria; Dalén, Love

    2014-01-01

    Inference of population demographic history has vastly improved in recent years due to a number of technological and theoretical advances including the use of ancient DNA. Approximate Bayesian computation (ABC) stands among the most promising methods due to its simple theoretical fundament and exceptional flexibility. However, limited availability of user-friendly programs that perform ABC analysis renders it difficult to implement, and hence programming skills are frequently required. In addition, there is limited availability of programs able to deal with heterochronous data. Here we present the software BaySICS: Bayesian Statistical Inference of Coalescent Simulations. BaySICS provides an integrated and user-friendly platform that performs ABC analyses by means of coalescent simulations from DNA sequence data. It estimates historical demographic population parameters and performs hypothesis testing by means of Bayes factors obtained from model comparisons. Although providing specific features that improve inference from datasets with heterochronous data, BaySICS also has several capabilities making it a suitable tool for analysing contemporary genetic datasets. Those capabilities include joint analysis of independent tables, a graphical interface and the implementation of Markov-chain Monte Carlo without likelihoods.
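
    The ABC idea the program builds on can be summarized by a rejection sampler: draw a parameter from the prior, simulate data under the model, and keep the draw when a summary statistic lands close to the observed value. In the sketch below the coalescent simulator is replaced by a trivial Gaussian stand-in purely for illustration.

```python
# Minimal ABC rejection-sampler sketch (not BaySICS itself): draw a parameter
# from the prior, simulate data under a toy model, and accept the draw when a
# summary statistic is close enough to the observed one. The "coalescent"
# simulator here is replaced by a trivial stand-in for illustration.
import random
import statistics

def simulate(theta, n=50):
    """Toy stand-in for a coalescent simulator: data whose mean tracks theta."""
    return [random.gauss(theta, 1.0) for _ in range(n)]

def abc_rejection(observed_summary, n_draws=10000, epsilon=0.1):
    accepted = []
    for _ in range(n_draws):
        theta = random.uniform(0.0, 10.0)                  # prior draw
        summary = statistics.mean(simulate(theta))         # summary statistic
        if abs(summary - observed_summary) < epsilon:      # distance check
            accepted.append(theta)
    return accepted

posterior_sample = abc_rejection(observed_summary=4.2)
print(len(posterior_sample), "accepted draws; posterior mean ≈",
      statistics.mean(posterior_sample))
```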

  13. ASPECTS: an automation-assisted SPE method development system.

    PubMed

    Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu

    2013-07-01

    A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combination of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting it to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly saves the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.

  14. SeqFIRE: a web application for automated extraction of indel regions and conserved blocks from protein multiple sequence alignments.

    PubMed

    Ajawatanawong, Pravech; Atkinson, Gemma C; Watson-Haigh, Nathan S; Mackenzie, Bryony; Baldauf, Sandra L

    2012-07-01

    Analyses of multiple sequence alignments generally focus on well-defined conserved sequence blocks, while the rest of the alignment is largely ignored or discarded. This is especially true in phylogenomics, where large multigene datasets are produced through automated pipelines. However, some of the most powerful phylogenetic markers have been found in the variable length regions of multiple alignments, particularly insertions/deletions (indels) in protein sequences. We have developed Sequence Feature and Indel Region Extractor (SeqFIRE) to enable the automated identification and extraction of indels from protein sequence alignments. The program can also extract conserved blocks and identify fast evolving sites using a combination of conservation and entropy. All major variables can be adjusted by the user, allowing them to identify the sets of variables most suited to a particular analysis or dataset. Thus, all major tasks in preparing an alignment for further analysis are combined in a single flexible and user-friendly program. The output includes a numbered list of indels, alignments in NEXUS format with indels annotated or removed and indel-only matrices. SeqFIRE is a user-friendly web application, freely available online at www.seqfire.org/.
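
    The combination of conservation and entropy described above can be illustrated with a per-column screen of an alignment. The thresholds and the three-way labelling below are arbitrary demonstration choices, not SeqFIRE's defaults.

```python
# Illustrative per-column screen of a protein alignment: Shannon entropy and
# gap fraction are used to label columns as "conserved", "indel" or "variable".
# Thresholds are arbitrary demonstration values, not SeqFIRE's defaults.
import math

def column_stats(column):
    residues = [c for c in column if c != "-"]
    gap_fraction = 1.0 - len(residues) / len(column)
    counts = {}
    for c in residues:
        counts[c] = counts.get(c, 0) + 1
    entropy = -sum((n / len(residues)) * math.log2(n / len(residues))
                   for n in counts.values()) if residues else 0.0
    return gap_fraction, entropy

def classify_columns(alignment, gap_cut=0.5, entropy_cut=1.0):
    labels = []
    for col in zip(*alignment):                 # iterate over alignment columns
        gap_fraction, entropy = column_stats(col)
        if gap_fraction >= gap_cut:
            labels.append("indel")
        elif entropy <= entropy_cut:
            labels.append("conserved")
        else:
            labels.append("variable")
    return labels

alignment = ["MKV--LLA", "MKI--LLA", "MKVTTLLS"]     # toy alignment rows
print(classify_columns(alignment))
```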

  15. The global diabetes model: user friendly version 3.0.

    PubMed

    Brown, J B; Russell, A; Chan, W; Pedula, K; Aickin, M

    2000-11-01

    The attributes of Release 3.0 of the user friendly version (UFV) of the global diabetes model (GDM) are described and documented in detail. The GDM is a continuous, stochastic microsimulation model of type 2 diabetes. Suitable for predicting the medical futures of both individuals with diabetes and representative diabetic populations, the GDM predicts medical events (complications of diabetes), survival, utilities, and medical care costs. Incidence rate functions for microvascular and macrovascular complications are based on a combination of published studies and analyses of data describing diabetic members of Kaiser Permanente Northwest Region, a non-profit group-model health maintenance organization. Active risk factors include average blood glucose (HbA1c), systolic blood pressure (SBP), low-density lipoprotein cholesterol (LDL), high-density lipoprotein cholesterol (HDL), triglycerides, smoking status, and use of prophylactic aspirin. Events predicted include diabetic eye disease, diabetic nephropathy, peripheral neuropathy, amputation, myocardial infarction, stroke, peripheral artery disease, congestive heart failure, coronary artery surgery, coronary angioplasty, and death.
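
    A stochastic microsimulation of this kind advances each simulated patient year by year, drawing complication events from risk-factor-dependent probabilities. The sketch below is a toy version with an invented hazard function and coefficients, not the GDM's fitted incidence functions.

```python
# Toy microsimulation sketch in the spirit of the description above: each
# simulated year, a complication event is drawn from an annual probability that
# rises with risk factors. The hazard function and coefficients are made up for
# illustration; they are not the GDM's fitted incidence functions.
import math
import random

def annual_event_probability(hba1c, sbp, base=0.01, b_a1c=0.25, b_sbp=0.02):
    """Hypothetical logistic-style annual risk increasing with HbA1c and SBP."""
    log_odds = math.log(base / (1 - base)) + b_a1c * (hba1c - 7.0) + b_sbp * (sbp - 130.0)
    return 1.0 / (1.0 + math.exp(-log_odds))

def simulate_patient(years=20, hba1c=8.5, sbp=145.0, seed=1):
    rng = random.Random(seed)
    events = []
    for year in range(1, years + 1):
        if rng.random() < annual_event_probability(hba1c, sbp):
            events.append(year)
    return events

print("years with a simulated complication event:", simulate_patient())
```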

  16. UCTM2: An updated User friendly Configurable Trigger, scaler and delay Module for nuclear and particle physics

    NASA Astrophysics Data System (ADS)

    Bourrion, O.; Boyer, B.; Derome, L.; Pignol, G.

    2016-06-01

    We developed a highly integrated and versatile electronic module to equip small nuclear physics experiments and lab teaching classes: the User friendly Configurable Trigger, scaler and delay Module for nuclear and particle physics (UCTM). It is configurable through a Graphical User Interface (GUI) and provides a large number of possible trigger conditions without requiring any knowledge of a Hardware Description Language (HDL). This new version significantly enhances the previous capabilities by providing two additional features: signal digitization and time measurements. The design, performance and a typical application are presented.

  17. Assessing user acceptance towards automated and conventional sink use for hand decontamination using the technology acceptance model.

    PubMed

    Dawson, Carolyn H; Mackrill, Jamie B; Cain, Rebecca

    2017-12-01

    Hand hygiene (HH) prevents harmful contaminants spreading in settings including domestic, health care and food handling. Strategies to improve HH range from behavioural techniques through to automated sinks that ensure hand surface cleaning. This study aimed to assess user experience and acceptance towards a new automated sink, compared to a normal sink. An adapted version of the technology acceptance model (TAM) assessed each mode of handwashing. A within-subjects design enabled N = 46 participants to evaluate both sinks. Perceived Ease of Use and Satisfaction of Use were significantly lower for the automated sink, compared to the conventional sink (p < 0.005). Across the remaining TAM factors, there was no significant difference. Participants suggested design features including jet strength, water temperature and device affordance may improve HH technology. We provide recommendations for future HH technology development to contribute a positive user experience, relevant to technology developers, ergonomists and those involved in HH across all sectors. Practitioner Summary: The need to facilitate timely, effective hand hygiene to prevent illness has led to a rise in automated handwashing systems across different contexts. User acceptance is a key factor in system uptake. This paper applies the technology acceptance model as a means to explore and optimise the design of such systems.

  18. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    PubMed

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.

  19. Automated platform for designing multiple robot work cells

    NASA Astrophysics Data System (ADS)

    Osman, N. S.; Rahman, M. A. A.; Rahman, A. A. Abdul; Kamsani, S. H.; Bali Mohamad, B. M.; Mohamad, E.; Zaini, Z. A.; Rahman, M. F. Ab; Mohamad Hatta, M. N. H.

    2017-06-01

    Designing multiple robot work cells is a very knowledge-intensive, intricate, and time-consuming process. This paper elaborates on the development process of a computer-aided design program for generating multiple robot work cells through a user-friendly interface. The primary purpose of this work is to provide a fast and easy platform that reduces cost and human involvement and minimizes trial-and-error adjustments. The automated platform is constructed based on the variant-shaped configuration concept with its mathematical model. A robot work cell layout, system components, and the construction procedure of the automated platform are discussed in this paper, where integration of these items will be able to automatically provide the optimum robot work cell design according to the information set by the user. This system is implemented on top of CATIA V5 software and utilises its Part Design, Assembly Design, and Macro tools. The current outcomes of this work provide a basis for future investigation in developing a flexible configuration system for multiple robot work cells.

  20. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    PubMed

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  1. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data*

    PubMed Central

    Rigbolt, Kristoffer T. G.; Vanselow, Jens T.; Blagoev, Blagoy

    2011-01-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)1. The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net. PMID:21602510

  2. Real-time spectral analysis of HRV signals: an interactive and user-friendly PC system.

    PubMed

    Basano, L; Canepa, F; Ottonello, P

    1998-01-01

    We present a real-time system, built around a PC and a low-cost data acquisition board, for the spectral analysis of the heart rate variability signal. The Windows-like operating environment on which it is based makes the computer program very user-friendly even for non-specialized personnel. The Power Spectral Density is computed through the use of a hybrid method, in which a classical FFT analysis follows an autoregressive finite-extension of data; the stationarity of the sequence is continuously checked. The use of this algorithm gives a high degree of robustness of the spectral estimation. Moreover, always in real time, the FFT of every data block is computed and displayed in order to corroborate the results as well as to allow the user to interactively choose a proper AR model order.
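
    The hybrid estimator described above (an autoregressive finite extension of the data followed by FFT analysis) can be sketched as follows; the model order, extension length, sampling rate and synthetic signal are illustrative assumptions, not the system's settings.

```python
# Hedged sketch of the hybrid PSD idea described above: fit an autoregressive
# (AR) model to the RR-interval series, extend the data with AR predictions,
# then take an FFT-based periodogram of the extended sequence. Order, lengths
# and the synthetic signal are illustrative choices, not the system's settings.
import numpy as np

def yule_walker(x, order):
    """Yule-Walker style AR coefficient estimate from the biased autocorrelation."""
    x = np.asarray(x, float) - np.mean(x)
    acf = np.correlate(x, x, "full")[len(x) - 1:] / len(x)
    R = np.array([[acf[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, acf[1:order + 1])          # AR coefficients a_1..a_p

def ar_extend(x, coeffs, n_extra):
    out = list(np.asarray(x, float) - np.mean(x))
    for _ in range(n_extra):
        out.append(np.dot(coeffs, out[-1:-len(coeffs) - 1:-1]))   # one-step prediction
    return np.array(out)

def hybrid_psd(x, order=8, n_extra=256, fs=4.0):
    coeffs = yule_walker(x, order)
    ext = ar_extend(x, coeffs, n_extra)
    spectrum = np.abs(np.fft.rfft(ext)) ** 2 / (len(ext) * fs)
    freqs = np.fft.rfftfreq(len(ext), d=1.0 / fs)
    return freqs, spectrum

# Synthetic RR-variability-like signal with 0.1 Hz (LF) and 0.25 Hz (HF) components
t = np.arange(0, 64, 1 / 4.0)
rr = (0.05 * np.sin(2 * np.pi * 0.1 * t) + 0.03 * np.sin(2 * np.pi * 0.25 * t)
      + 0.005 * np.random.default_rng(0).standard_normal(t.size))
freqs, psd = hybrid_psd(rr)
print("peak frequency (Hz):", freqs[np.argmax(psd[1:]) + 1])
```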

  3. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Muery, Kim; Foshee, Mark; Marsh, Angela

    2006-01-01

    International Space Station (ISS) payload developers submit their payload science requirements for the development of on-board execution timelines. The ISS systems required to execute the payload science operations must be represented as constraints for the execution timeline. Payload developers use a software application, User Requirements Collection (URC), to submit their requirements by selecting a simplified representation of ISS system constraints. To fully represent the complex ISS systems, the constraints require a level of detail that is beyond the insight of the payload developer. To provide the complex representation of the ISS system constraints, HOSC operations personnel, specifically the Payload Activity Requirements Coordinators (PARC), manually translate the payload developers' simplified constraints into detailed ISS system constraints used for scheduling the payload activities in the Consolidated Planning System (CPS). This paper describes the implementation of a software application, User Requirements Integration (URI), developed to automate the manual ISS constraint translation process.

  4. An automation-assisted generic approach for biological sample preparation and LC-MS/MS method validation.

    PubMed

    Zhang, Jie; Wei, Shimin; Ayres, David W; Smith, Harold T; Tse, Francis L S

    2011-09-01

    Although it is well known that automation can provide significant improvement in the efficiency of biological sample preparation in quantitative LC-MS/MS analysis, it has not been widely implemented in bioanalytical laboratories throughout the industry. This can be attributed to the lack of a sound strategy and practical procedures in working with robotic liquid-handling systems. Several comprehensive automation assisted procedures for biological sample preparation and method validation were developed and qualified using two types of Hamilton Microlab liquid-handling robots. The procedures developed were generic, user-friendly and covered the majority of steps involved in routine sample preparation and method validation. Generic automation procedures were established as a practical approach to widely implement automation into the routine bioanalysis of samples in support of drug-development programs.

  5. The HADDOCK2.2 Web Server: User-Friendly Integrative Modeling of Biomolecular Complexes.

    PubMed

    van Zundert, G C P; Rodrigues, J P G L M; Trellet, M; Schmitz, C; Kastritis, P L; Karaca, E; Melquiond, A S J; van Dijk, M; de Vries, S J; Bonvin, A M J J

    2016-02-22

    The prediction of the quaternary structure of biomolecular macromolecules is of paramount importance for fundamental understanding of cellular processes and drug design. In the era of integrative structural biology, one way of increasing the accuracy of modeling methods used to predict the structure of biomolecular complexes is to include as much experimental or predictive information as possible in the process. This has been at the core of our information-driven docking approach HADDOCK. We present here the updated version 2.2 of the HADDOCK portal, which offers new features such as support for mixed molecule types, additional experimental restraints and improved protocols, all of this in a user-friendly interface. With well over 6000 registered users and 108,000 jobs served, an increasing fraction of which on grid resources, we hope that this timely upgrade will help the community to solve important biological questions and further advance the field. The HADDOCK2.2 Web server is freely accessible to non-profit users at http://haddock.science.uu.nl/services/HADDOCK2.2. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Colorado Late Cenozoic Fault and Fold Database and Internet Map Server: User-friendly technology for complex information

    USGS Publications Warehouse

    Morgan, K.S.; Pattyn, G.J.; Morgan, M.L.

    2005-01-01

    Internet mapping applications for geologic data allow simultaneous data delivery and collection, enabling quick data modification while efficiently supplying the end user with information. Utilizing Web-based technologies, the Colorado Geological Survey's Colorado Late Cenozoic Fault and Fold Database was transformed from a monothematic, nonspatial Microsoft Access database into a complex information set incorporating multiple data sources. The resulting user-friendly format supports easy analysis and browsing. The core of the application is the Microsoft Access database, which contains information compiled from available literature about faults and folds that are known or suspected to have moved during the late Cenozoic. The database contains nonspatial fields such as structure type, age, and rate of movement. Geographic locations of the fault and fold traces were compiled from previous studies at 1:250,000 scale to form a spatial database containing information such as length and strike. Integration of the two databases allowed both spatial and nonspatial information to be presented on the Internet as a single dataset (http://geosurvey.state.co.us/pubs/ceno/). The user-friendly interface enables users to view and query the data in an integrated manner, thus providing multiple ways to locate desired information. Retaining the digital data format also allows continuous data updating and quick delivery of newly acquired information. This dataset is a valuable resource to anyone interested in earthquake hazards and the activity of faults and folds in Colorado. Additional geologic hazard layers and imagery may aid in decision support and hazard evaluation. The up-to-date and customizable maps are invaluable tools for researchers or the public.

  7. User's operating procedures. Volume 2: Scout project financial analysis program

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data System, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program for data retrieval and file maintenance via the user-friendly menu drivers.

  8. Development of a versatile user-friendly IBA experimental chamber

    NASA Astrophysics Data System (ADS)

    Kakuee, Omidreza; Fathollahi, Vahid; Lamehi-Rachti, Mohammad

    2016-03-01

    Reliable performance of the Ion Beam Analysis (IBA) techniques is based on the accurate geometry of the experimental setup, employment of reliable nuclear data and implementation of dedicated analysis software for each of the IBA techniques. It has already been shown that geometrical imperfections lead to significant uncertainties in the quantification of IBA measurements. To minimize these uncertainties, a user-friendly experimental chamber with a heuristic sample positioning system for IBA analysis was recently developed in the Van de Graaff laboratory in Tehran. This system enhances IBA capabilities, in particular the Nuclear Reaction Analysis (NRA) and Elastic Recoil Detection Analysis (ERDA) techniques. The newly developed sample manipulator provides the possibility of both controlling the tilt angle of the sample and analyzing samples with different thicknesses. Moreover, a reasonable number of samples can be loaded in the sample wheel. A comparison of the measured cross section data of the 16O(d,p1)17O reaction with the data reported in the literature confirms the performance and capability of the newly developed experimental chamber.

  9. ALPHACAL: A new user-friendly tool for the calibration of alpha-particle sources.

    PubMed

    Timón, A Fernández; Vargas, M Jurado; Gallardo, P Álvarez; Sánchez-Oro, J; Peralta, L

    2018-05-01

    In this work, we present and describe the program ALPHACAL, specifically developed for the calibration of alpha-particle sources. It is therefore more user-friendly and less time-consuming than multipurpose codes developed for a wide range of applications. The program is based on the recently developed code AlfaMC, which simulates specifically the transport of alpha particles. Both cylindrical and point sources mounted on the surface of polished backings can be simulated, as is the convention in experimental measurements of alpha-particle sources. In addition to the efficiency calculation and determination of the backscattering coefficient, some additional tools are available to the user, such as visualization of the energy spectrum and the use of energy cut-offs or low-energy tail corrections. ALPHACAL has been implemented in C++ using the Qt library, so it is available for Windows, macOS and Linux platforms. It is free and can be provided upon request to the authors. Copyright © 2018 Elsevier Ltd. All rights reserved.
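
    As a rough illustration of what goes into a source-efficiency calculation (this is not ALPHACAL's algorithm, which performs full alpha-particle transport via AlfaMC and also handles backscattering), the sketch below estimates only the geometric detection efficiency of a point source facing a coaxial circular detector by Monte Carlo sampling; detector radius and source-detector distance are assumed example values.

```python
import numpy as np

def geometric_efficiency(n_particles=1_000_000, det_radius=10.0,
                         det_distance=20.0, seed=0):
    """Monte Carlo estimate of the geometric efficiency of a point source
    emitting isotropically into the upper hemisphere, facing a coaxial
    circular detector of radius det_radius at distance det_distance
    (arbitrary but consistent length units)."""
    rng = np.random.default_rng(seed)
    # Isotropic emission over the upper hemisphere: cos(theta) uniform in (0, 1].
    cos_theta = 1.0 - rng.random(n_particles)
    phi = rng.uniform(0.0, 2.0 * np.pi, n_particles)
    sin_theta = np.sqrt(1.0 - cos_theta**2)
    # Straight-line propagation to the detector plane.
    path = det_distance / cos_theta
    x = path * sin_theta * np.cos(phi)
    y = path * sin_theta * np.sin(phi)
    hits = (x**2 + y**2) <= det_radius**2
    return hits.mean()

# Sanity check against the analytical on-axis result
# eff = 1 - d / sqrt(d**2 + R**2)  (about 0.106 for R = 10, d = 20).
print(geometric_efficiency())
```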

  10. A user-friendly modified pore-solid fractal model

    PubMed Central

    Ding, Dian-yuan; Zhao, Ying; Feng, Hao; Si, Bing-cheng; Hill, Robert Lee

    2016-01-01

    The primary objective of this study was to evaluate a range of calculation points on water retention curves (WRC), instead of the singularity point at air-entry suction, in the pore-solid fractal (PSF) model, which additionally considers the hysteresis effect based on PSF theory. The modified pore-solid fractal (M-PSF) model was tested using 26 soil samples from Yangling on the Loess Plateau in China and 54 soil samples from the Unsaturated Soil Hydraulic Database. The derivation results showed that the M-PSF model is user-friendly and flexible for a wide range of calculation point options. This model theoretically describes the primary differences between the soil moisture desorption and adsorption processes through the fractal dimensions. The M-PSF model demonstrated good performance, particularly at the calculation points corresponding to suctions from 100 cm to 1000 cm. Furthermore, the M-PSF model, using the fractal dimension of the particle size distribution, exhibited acceptable performance in predicting WRCs for differently textured soils when the suction values were ≥100 cm. To fully understand the function of hysteresis in the PSF theory, the role of allowable and accessible pores must be examined. PMID:27996013
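
    For readers unfamiliar with fractal retention models, the relation below is the generic power-law water-retention form that PSF-type models build on; it is shown only as background and is not the M-PSF equation set, in which the single air-entry calculation point is replaced by a range of points on the WRC and the desorption and adsorption branches are distinguished by their fractal dimensions.

```latex
% Generic fractal water-retention relation (background only, not the M-PSF model)
S_e(\psi) \;=\; \frac{\theta(\psi)-\theta_r}{\theta_s-\theta_r} \;=\;
\begin{cases}
  \left(\dfrac{\psi_a}{\psi}\right)^{3-D}, & \psi > \psi_a,\\[4pt]
  1, & \psi \le \psi_a,
\end{cases}
```

    Here ψ is suction, ψ_a the air-entry suction, D the fractal dimension, and θ_s, θ_r the saturated and residual water contents.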

  11. For Professors, "Friending" Can Be Fraught

    ERIC Educational Resources Information Center

    Lipka, Sara

    2007-01-01

    People connect on Facebook by asking to "friend" one another. A typical user lists at least 100 such connections, while newbies are informed, "You don't have any friends yet." A humbling statement. It might make one want to find some. But friending students can be even dicier than befriending them. In the real world, casual professors may ask…

  12. ARES (Automated Residential Energy Standard) 1.2: User's guide, in support of proposed interim energy conservation voluntary performance standards for new non-federal residential buildings: Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The ARES (Automated Residential Energy Standard) User's Guide is designed to help the user successfully operate the ARES computer program. This guide assumes that the user is familiar with basic PC skills such as using a keyboard and loading a disk drive. The ARES computer program was designed to assist building code officials in creating a residential energy standard based on local climate and costs.

  13. SpectraPLOT, Visualization Package with a User-Friendly Graphical Interface

    NASA Astrophysics Data System (ADS)

    Sebald, James; Macfarlane, Joseph; Golovkin, Igor

    2017-10-01

    SPECT3D is a collisional-radiative spectral analysis package designed to compute detailed emission, absorption, or x-ray scattering spectra, filtered images, XRD signals, and other synthetic diagnostics. The spectra and images are computed for virtual detectors by post-processing the results of hydrodynamics simulations in 1D, 2D, and 3D geometries. SPECT3D can account for a variety of instrumental response effects so that direct comparisons between simulations and experimental measurements can be made. SpectraPLOT is a user-friendly graphical interface for viewing a wide variety of results from SPECT3D simulations and applying various instrumental effects to the simulated images and spectra. We will present SpectraPLOT's ability to display a variety of data, including spectra, images, light curves, streaked spectra, space-resolved spectra, and drilldown plasma property plots, using an argon-doped capsule implosion experiment as an example. Future SpectraPLOT features and enhancements will also be discussed.

  14. BRepertoire: a user-friendly web server for analysing antibody repertoire data.

    PubMed

    Margreitter, Christian; Lu, Hui-Chun; Townsend, Catherine; Stewart, Alexander; Dunn-Walters, Deborah K; Fraternali, Franca

    2018-04-14

    Antibody repertoire analysis by high throughput sequencing is now widely used, but a persisting challenge is enabling immunologists to explore their data to discover discriminating repertoire features for their own particular investigations. Computational methods are necessary for large-scale evaluation of antibody properties. We have developed BRepertoire, a suite of user-friendly web-based software tools for large-scale statistical analyses of repertoire data. The software is able to use data preprocessed by IMGT, and performs statistical and comparative analyses with versatile plotting options. BRepertoire has been designed to operate in various modes, for example analysing sequence-specific V(D)J gene usage, discerning physico-chemical properties of the CDR regions and clustering of clonotypes. Those analyses are performed on the fly by a number of R packages and are deployed via a Shiny web platform. The user can download the analysed data in different table formats and save the generated plots as image files ready for publication. We believe BRepertoire to be a versatile analytical tool that complements experimental studies of immune repertoires. To illustrate the server's functionality, we show use cases including differential gene usage in a vaccination dataset and analysis of CDR3H properties in old and young individuals. The server is accessible under http://mabra.biomed.kcl.ac.uk/BRepertoire.

  15. A User-Friendly, Keyword-Searchable Database of Geoscientific References Through 2007 for Afghanistan

    USGS Publications Warehouse

    Eppinger, Robert G.; Sipeki, Julianna; Scofield, M.L.

    2008-01-01

    This report includes a document and accompanying Microsoft Access 2003 database of geoscientific references for the country of Afghanistan. The reference compilation is part of a larger joint study of Afghanistan's energy, mineral, and water resources, and geologic hazards currently underway by the U.S. Geological Survey, the British Geological Survey, and the Afghanistan Geological Survey. The database includes both published (n = 2,489) and unpublished (n = 176) references compiled through calendar year 2007. The references comprise two separate tables in the Access database. The reference database includes a user-friendly, keyword-searchable interface, and only minimal knowledge of Microsoft Access is required.

  16. SigmoID: a user-friendly tool for improving bacterial genome annotation through analysis of transcription control signals

    PubMed Central

    Damienikan, Aliaksandr U.

    2016-01-01

    The majority of bacterial genome annotations are currently automated and based on a ‘gene by gene’ approach. Regulatory signals and operon structures are rarely taken into account which often results in incomplete and even incorrect gene function assignments. Here we present SigmoID, a cross-platform (OS X, Linux and Windows) open-source application aiming at simplifying the identification of transcription regulatory sites (promoters, transcription factor binding sites and terminators) in bacterial genomes and providing assistance in correcting annotations in accordance with regulatory information. SigmoID combines a user-friendly graphical interface to well known command line tools with a genome browser for visualising regulatory elements in genomic context. Integrated access to online databases with regulatory information (RegPrecise and RegulonDB) and web-based search engines speeds up genome analysis and simplifies correction of genome annotation. We demonstrate some features of SigmoID by constructing a series of regulatory protein binding site profiles for two groups of bacteria: Soft Rot Enterobacteriaceae (Pectobacterium and Dickeya spp.) and Pseudomonas spp. Furthermore, we inferred over 900 transcription factor binding sites and alternative sigma factor promoters in the annotated genome of Pectobacterium atrosepticum. These regulatory signals control putative transcription units covering about 40% of the P. atrosepticum chromosome. Reviewing the annotation in cases where it didn’t fit with regulatory information allowed us to correct product and gene names for over 300 loci. PMID:27257541

  17. Specdata: Automated Analysis Software for Broadband Spectra

    NASA Astrophysics Data System (ADS)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines both automated and manual components that free users from computation, while giving them considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features, and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, the control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines have been removed, is among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
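
    As a toy illustration of the kind of catalog query at the heart of such automated assignment (not SPECdata's implementation; the function name, tolerance and example frequencies below are illustrative), each detected peak can be matched to the nearest catalog line within a frequency tolerance:

```python
import numpy as np

def assign_peaks(peak_freqs_mhz, catalog, tolerance_mhz=0.1):
    """Assign experimental peak frequencies to catalog entries.

    peak_freqs_mhz : detected line frequencies (MHz)
    catalog        : list of (species_name, frequency_mhz) tuples, e.g.
                     parsed from Pickett-style .cat files
    tolerance_mhz  : maximum |offset| accepted as a match
    Returns matched peaks and unassigned peaks (candidate new lines)."""
    cat_freqs = np.array([f for _, f in catalog])
    cat_names = [name for name, _ in catalog]
    assigned, unassigned = [], []
    for peak in peak_freqs_mhz:
        offsets = np.abs(cat_freqs - peak)
        best = int(np.argmin(offsets))
        if offsets[best] <= tolerance_mhz:
            assigned.append((peak, cat_names[best], cat_freqs[best],
                             peak - cat_freqs[best]))
        else:
            unassigned.append(peak)
    return assigned, unassigned

# Example: two known rotational lines plus one unidentified feature.
catalog = [("OCS", 12162.979), ("HC3N", 9098.332)]
peaks = [9098.335, 12162.981, 10500.120]
known, new_lines = assign_peaks(peaks, catalog, tolerance_mhz=0.05)
```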

  18. Developing Automated Feedback Materials for a Training Simulator: An Interaction between Users and Researchers.

    ERIC Educational Resources Information Center

    Shlechter, Theodore M.; And Others

    This paper focuses upon the research and development (R&D) process associated with developing automated feedback materials for the SIMulation NETworking (SIMNET) training system. This R&D process involved a partnership among instructional developers, practitioners, and researchers. Users' input has been utilized to help: (1) design the…

  19. A user-friendly SSVEP-based brain-computer interface using a time-domain classifier.

    PubMed

    Luo, An; Sullivan, Thomas J

    2010-04-01

    We introduce a user-friendly steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) system. Single-channel EEG is recorded using a low-noise dry electrode. Compared to traditional gel-based multi-sensor EEG systems, a dry sensor proves to be more convenient, comfortable and cost effective. A hardware system was built that displays four LED light panels flashing at different frequencies and synchronizes with EEG acquisition. The visual stimuli have been carefully designed such that potential risk to photosensitive people is minimized. We describe a novel stimulus-locked inter-trace correlation (SLIC) method for SSVEP classification using EEG time-locked to stimulus onsets. We studied how the performance of the algorithm is affected by different selections of parameters. Using the SLIC method, the average light detection rate is 75.8% with very low error rates (an 8.4% false positive rate and a 1.3% misclassification rate). Compared to a traditional frequency-domain-based method, the SLIC method is more robust (resulting in less annoyance to the users) and is also suitable for irregular stimulus patterns.
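
    A minimal sketch of the idea behind a stimulus-locked inter-trace correlation measure is given below, assuming single-channel EEG and known onset samples for each flashing panel; the epoch length, threshold and function names are illustrative assumptions rather than the published SLIC parameters.

```python
import numpy as np
from itertools import combinations

def inter_trace_correlation(eeg, onsets, fs, epoch_ms=100):
    """Mean pairwise correlation of EEG traces time-locked to stimulus onsets.

    eeg    : 1-D single-channel EEG array
    onsets : sample indices of stimulus onsets for one flicker frequency
    fs     : sampling rate (Hz)"""
    n = int(fs * epoch_ms / 1000)
    epochs = [eeg[o:o + n] for o in onsets if o + n <= len(eeg)]
    corrs = [np.corrcoef(epochs[i], epochs[j])[0, 1]
             for i, j in combinations(range(len(epochs)), 2)]
    return float(np.mean(corrs))

def classify_attended(eeg, onsets_per_stimulus, fs, threshold=0.15):
    """Pick the stimulus whose onset-locked traces are most self-similar;
    return None when no stimulus exceeds the detection threshold."""
    scores = {stim: inter_trace_correlation(eeg, onsets, fs)
              for stim, onsets in onsets_per_stimulus.items()}
    best = max(scores, key=scores.get)
    return (best if scores[best] >= threshold else None), scores
```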

  20. Users manual for the Automated Performance Test System (APTS)

    NASA Technical Reports Server (NTRS)

    Lane, N. E.; Kennedy, R. S.

    1990-01-01

    The characteristics of and the user information for the Essex Automated Performance Test System (APTS) computer-based portable performance assessment battery are given. The battery was developed to provide a menu of performance tests tapping the widest possible variety of human cognitive and motor functions, implemented on a portable computer system suitable for use in both laboratory and field settings for studying the effects of toxic agents and other stressors. The manual gives guidance in selecting, administering and scoring tests from the battery, and reviews the data and studies underlying the development of the battery. Its main emphasis is on the users of the battery - the scientists, researchers and technicians who wish to examine changes in human performance across time or as a function of changes in the conditions under which test data are obtained. First, the how-to information needed to make decisions about where and how to use the battery is given, followed by the research background supporting the battery's development. Further, the development history of the battery focuses largely on the logical framework within which tests were evaluated.

  1. Simple and Inexpensive 3D Printed Filter Fluorometer Designs: User-Friendly Instrument Models for Laboratory Learning and Outreach Activities

    ERIC Educational Resources Information Center

    Porter, Lon A., Jr.; Chapman, Cole A.; Alaniz, Jacob A.

    2017-01-01

    In this work, a versatile and user-friendly selection of stereolithography (STL) files and computer-aided design (CAD) models are shared to assist educators and students in the production of simple and inexpensive 3D printed filter fluorometer instruments. These devices are effective resources for supporting active learners in the exploration of…

  2. Thermal depth profiling of vascular lesions: automated regularization of reconstruction algorithms

    NASA Astrophysics Data System (ADS)

    Verkruysse, Wim; Choi, Bernard; Zhang, Jenny R.; Kim, Jeehyun; Nelson, J. Stuart

    2008-03-01

    Pulsed photo-thermal radiometry (PPTR) is a non-invasive, non-contact diagnostic technique used to locate cutaneous chromophores such as melanin (epidermis) and hemoglobin (vascular structures). Clinical utility of PPTR is limited because it typically requires trained user intervention to regularize the inversion solution. Herein, the feasibility of automated regularization was studied. A second objective of this study was to depart from modeling port wine stains (PWS), a vascular skin lesion frequently studied with PPTR, as strictly layered structures, since this may influence conclusions regarding PPTR reconstruction quality. Average blood vessel depths, diameters and densities derived from histology of 30 PWS patients were used to generate 15 randomized lesion geometries for which we simulated PPTR signals. Reconstruction accuracy for subjective regularization was compared with that for automated regularization methods. The objective regularization approach performed better. However, the average difference was much smaller than the variation between the 15 simulated profiles. Reconstruction quality depended more on the actual profile to be reconstructed than on the reconstruction algorithm or regularization method. Similar, or better, accuracy reconstructions can be achieved with an automated regularization procedure, which enhances prospects for user-friendly implementation of PPTR to optimize laser therapy on an individual patient basis.
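
    One common way to automate the regularization of a linear inversion, shown here only as an illustrative alternative to manual tuning (the study's own criterion may differ), is to choose the Tikhonov parameter by generalized cross-validation:

```python
import numpy as np

def tikhonov_gcv(A, b, lambdas=None):
    """Tikhonov-regularized solution of A x ~ b with the parameter chosen
    automatically by generalized cross-validation (GCV).  Assumes A has no
    zero singular values; the penalty term is lam**2 * ||x||**2."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    beta = U.T @ b
    n = len(b)
    if lambdas is None:
        lambdas = np.logspace(np.log10(s.min()) - 3, np.log10(s.max()), 60)
    best_gcv, best_lam, best_x = np.inf, None, None
    for lam in lambdas:
        f = s**2 / (s**2 + lam**2)          # Tikhonov filter factors
        x = Vt.T @ ((f / s) * beta)         # regularized solution
        r = A @ x - b
        gcv = n * (r @ r) / (n - f.sum())**2
        if gcv < best_gcv:
            best_gcv, best_lam, best_x = gcv, lam, x
    return best_x, best_lam
```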

  3. A user-friendly one-dimensional model for wet volcanic plumes

    USGS Publications Warehouse

    Mastin, Larry G.

    2007-01-01

    This paper presents a user-friendly graphically based numerical model of one-dimensional steady state homogeneous volcanic plumes that calculates and plots profiles of upward velocity, plume density, radius, temperature, and other parameters as a function of height. The model considers effects of water condensation and ice formation on plume dynamics as well as the effect of water added to the plume at the vent. Atmospheric conditions may be specified through input parameters of constant lapse rates and relative humidity, or by loading profiles of actual atmospheric soundings. To illustrate the utility of the model, we compare calculations with field-based estimates of plume height (∼9 km) and eruption rate (>∼4 × 10^5 kg/s) during a brief tephra eruption at Mount St. Helens on 8 March 2005. Results show that the atmospheric conditions on that day boosted plume height by 1–3 km over that in a standard dry atmosphere. Although the eruption temperature was unknown, model calculations most closely match the observations for a temperature that is below magmatic but above 100°C.
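
    For orientation, the classical integral plume equations below (top-hat profiles, Boussinesq limit, after Morton, Taylor and Turner, 1956) show the structure of a one-dimensional steady plume model; the wet volcanic-plume model described above additionally accounts for water added at the vent, condensation and ice formation, so its governing equations are more elaborate.

```latex
% Classical integral (top-hat, Boussinesq) plume equations, shown only to
% illustrate the structure of a 1-D steady plume model.
\frac{d}{dz}\left(b^{2}w\right)    = 2\alpha b w          \quad\text{(volume/mass)}
\frac{d}{dz}\left(b^{2}w^{2}\right) = b^{2}g'             \quad\text{(momentum)}
\frac{d}{dz}\left(b^{2}w g'\right)  = -\,b^{2}w N^{2}     \quad\text{(buoyancy)}
```

    Here b is the plume radius, w the vertical velocity, α the entrainment coefficient, g' the reduced gravity, and N the ambient buoyancy frequency.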

  4. Trust and reliance on an automated combat identification system.

    PubMed

    Wang, Lu; Jamieson, Greg A; Hollands, Justin G

    2009-06-01

    We examined the effects of aid reliability and reliability disclosure on human trust in and reliance on a combat identification (CID) aid. We tested whether trust acts as a mediating factor between belief in and reliance on a CID aid. Individual CID systems have been developed to reduce friendly fire incidents. However, these systems cannot positively identify a target that does not have a working transponder. Therefore, when the feedback is "unknown", the target could be hostile, neutral, or friendly. Soldiers have difficulty relying on this type of imperfect automation appropriately. In manual and aided conditions, 24 participants completed a simulated CID task. The reliability of the aid varied within participants, half of whom were told the aid reliability level. We used the difference in response bias values across conditions to measure automation reliance. Response bias varied more appropriately with the aid reliability level when it was disclosed than when not. Trust in aid feedback correlated with belief in aid reliability and reliance on aid feedback; however, belief was not correlated with reliance. To engender appropriate reliance on CID systems, users should be made aware of system reliability. The findings can be applied to the design of information displays for individual CID systems and soldier training.
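
    Reliance is quantified here through response bias in the signal-detection-theory sense; a minimal sketch of computing sensitivity and bias from an outcome table is given below (the study's exact computation may differ, and the log-linear correction is an added assumption for robustness).

```python
from scipy.stats import norm

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' and response bias c from a 2x2 outcome table.
    The log-linear correction keeps z-scores finite for perfect rates."""
    h = (hits + 0.5) / (hits + misses + 1.0)
    fa = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    d_prime = norm.ppf(h) - norm.ppf(fa)
    c = -(norm.ppf(h) + norm.ppf(fa)) / 2.0
    return d_prime, c

# A shift in c between disclosed and undisclosed reliability conditions is
# one way to express a change in reliance on the aid.
print(sdt_measures(hits=40, misses=10, false_alarms=5, correct_rejections=45))
```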

  5. Development and Design of a User Interface for a Computer Automated Heating, Ventilation, and Air Conditioning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, B.; /Fermilab

    1999-10-08

    A user interface is created to monitor and operate the heating, ventilation, and air conditioning system. The interface is networked to the system's programmable logic controller. The controller maintains automated control of the system. The user through the interface is able to see the status of the system and override or adjust the automatic control features. The interface is programmed to show digital readouts of system equipment as well as visual cues of system operational statuses. It also provides information for system design and component interaction. The interface is made easier to read by simple designs, color coordination, and graphics. Fermi National Accelerator Laboratory (Fermilab) conducts high energy particle physics research. Part of this research involves collision experiments with protons and anti-protons. These interactions are contained within one of two massive detectors along Fermilab's largest particle accelerator, the Tevatron. The D-Zero Assembly Building houses one of these detectors. At this time detector systems are being upgraded for a second experiment run, titled Run II. Unlike the previous run, systems at D-Zero must be computer automated so operators do not have to continually monitor and adjust these systems during the run. Human intervention should only be necessary for system start up and shut down, and equipment failure. Part of this upgrade includes the heating, ventilation, and air conditioning system (HVAC system). The HVAC system is responsible for controlling two subsystems, the air temperatures of the D-Zero Assembly Building and associated collision hall, as well as six separate water systems used in the heating and cooling of the air and detector components. The HVAC system is automated by a programmable logic controller. In order to provide system monitoring and operator control a user interface is required. This paper will address methods and strategies used to design and implement an effective user

  6. Workflow efficiency of two 1.5 T MR scanners with and without an automated user interface for head examinations.

    PubMed

    Moenninghoff, Christoph; Umutlu, Lale; Kloeters, Christian; Ringelstein, Adrian; Ladd, Mark E; Sombetzki, Antje; Lauenstein, Thomas C; Forsting, Michael; Schlamann, Marc

    2013-06-01

    Workflow efficiency and workload of radiological technologists (RTs) were compared in head examinations performed with two 1.5 T magnetic resonance (MR) scanners equipped with or without an automated user interface called the "day optimizing throughput" (Dot) workflow engine. Thirty-four patients with known intracranial pathology were examined with a 1.5 T MR scanner with the Dot workflow engine (Siemens MAGNETOM Aera) and with a 1.5 T MR scanner with a conventional user interface (Siemens MAGNETOM Avanto) using four standardized examination protocols. The elapsed time for all necessary work steps, which were performed by 11 RTs within the total examination time, was compared for each examination at both MR scanners. The RTs evaluated the user-friendliness of both scanners by questionnaire. Normality of distribution was checked for all continuous variables by use of the Shapiro-Wilk test. Normally distributed variables were analyzed by Student's paired t-test; otherwise the Wilcoxon signed-rank test was used to compare means. Total examination time of MR examinations performed with the Dot engine was reduced from 24:53 to 20:01 minutes (P < .001) and the necessary RT intervention decreased by 61% (P < .001). The Dot engine's automated choice of MR protocols was rated significantly better by the RTs than the conventional user interface (P = .001). According to this preliminary study, the Dot workflow engine is a time-saving user assistance software, which decreases the RTs' effort significantly and may help to automate neuroradiological examinations for higher workflow efficiency. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.
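
    A minimal sketch of the statistical comparison described above, assuming paired per-patient examination times for the two scanners (function and variable names are illustrative):

```python
import numpy as np
from scipy.stats import shapiro, ttest_rel, wilcoxon

def compare_paired(times_dot, times_conventional, alpha=0.05):
    """Paired comparison of examination times on the two scanners:
    Shapiro-Wilk normality check on the paired differences, then a paired
    t-test or a Wilcoxon signed-rank test, mirroring the analysis above."""
    a = np.asarray(times_dot, dtype=float)
    b = np.asarray(times_conventional, dtype=float)
    diffs = a - b
    if shapiro(diffs).pvalue > alpha:
        stat, p = ttest_rel(a, b)
        test = "paired t-test"
    else:
        stat, p = wilcoxon(a, b)
        test = "Wilcoxon signed-rank"
    return test, stat, p
```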

  7. A User-Friendly DNA Modeling Software for the Interpretation of Cryo-Electron Microscopy Data.

    PubMed

    Larivière, Damien; Galindo-Murillo, Rodrigo; Fourmentin, Eric; Hornus, Samuel; Lévy, Bruno; Papillon, Julie; Ménétret, Jean-François; Lamour, Valérie

    2017-01-01

    The structural modeling of a macromolecular machine is like a "Lego" approach that is challenged when blocks, like proteins imported from the Protein Data Bank, are to be assembled with an element adopting a serpentine shape, such as DNA templates. DNA must then be built ex nihilo, but existing modeling approaches are either not user-friendly or long and tedious. In this method chapter we show how to use GraphiteLifeExplorer, a software tool with a simple graphical user interface that enables the sketching of free forms of DNA, of any length, at the atomic scale, as fast as drawing a line on a sheet of paper. We took as an example the nucleoprotein complex of DNA gyrase, a bacterial topoisomerase whose structure has been determined using cryo-electron microscopy (Cryo-EM). Using GraphiteLifeExplorer, we could model in one go a 155 bp long and twisted DNA duplex that wraps around DNA gyrase in the cryo-EM map, improving the quality and interpretation of the final model compared to the initially published data.

  8. Evaluation and comparison of classical interatomic potentials through a user-friendly interactive web-interface

    NASA Astrophysics Data System (ADS)

    Choudhary, Kamal; Congo, Faical Yannick P.; Liang, Tao; Becker, Chandler; Hennig, Richard G.; Tavazza, Francesca

    2017-01-01

    Classical empirical potentials/force-fields (FF) provide atomistic insights into material phenomena through molecular dynamics and Monte Carlo simulations. Despite their wide applicability, a systematic evaluation of materials properties using such potentials and, especially, an easy-to-use user-interface for their comparison is still lacking. To address this deficiency, we computed energetics and elastic properties of a variety of materials such as metals and ceramics using a wide range of empirical potentials and compared them to density functional theory (DFT) as well as to experimental data, where available. The database currently consists of 3248 entries including energetics and elastic property calculations, and it is still increasing. We also include computational tools for convex-hull plots for DFT and FF calculations. The data covers 1471 materials and 116 force-fields. In addition, both the complete database and the software coding used in the process have been released for public use online (presently at http://www.ctcms.nist.gov/~knc6/periodic.html) in a user-friendly way designed to enable further material design and discovery.

  9. Evaluation and comparison of classical interatomic potentials through a user-friendly interactive web-interface

    PubMed Central

    Choudhary, Kamal; Congo, Faical Yannick P.; Liang, Tao; Becker, Chandler; Hennig, Richard G.; Tavazza, Francesca

    2017-01-01

    Classical empirical potentials/force-fields (FF) provide atomistic insights into material phenomena through molecular dynamics and Monte Carlo simulations. Despite their wide applicability, a systematic evaluation of materials properties using such potentials and, especially, an easy-to-use user-interface for their comparison is still lacking. To address this deficiency, we computed energetics and elastic properties of a variety of materials such as metals and ceramics using a wide range of empirical potentials and compared them to density functional theory (DFT) as well as to experimental data, where available. The database currently consists of 3248 entries including energetics and elastic property calculations, and it is still increasing. We also include computational tools for convex-hull plots for DFT and FF calculations. The data covers 1471 materials and 116 force-fields. In addition, both the complete database and the software coding used in the process have been released for public use online (presently at http://www.ctcms.nist.gov/~knc6/periodic.html) in a user-friendly way designed to enable further material design and discovery. PMID:28140407

  10. Towards a user-friendly brain-computer interface: initial tests in ALS and PLS patients.

    PubMed

    Bai, Ou; Lin, Peter; Huang, Dandan; Fei, Ding-Yu; Floeter, Mary Kay

    2010-08-01

    Patients usually require long-term training for effective EEG-based brain-computer interface (BCI) control due to fatigue caused by the demands for focused attention during prolonged BCI operation. We intended to develop a user-friendly BCI requiring minimal training and less mental load. BCI performance was tested in three patients with amyotrophic lateral sclerosis (ALS) and three patients with primary lateral sclerosis (PLS), none of whom had previous BCI experience. All patients performed binary control of cursor movement. One ALS patient and one PLS patient performed four-directional cursor control in a two-dimensional domain under a BCI paradigm associated with human natural motor behavior using motor execution and motor imagery. Subjects practiced for 5-10 min and then participated in a multi-session study of either binary control or four-directional control, including an online BCI game, over 1.5-2 h in a single visit. Event-related desynchronization and event-related synchronization in the beta band were observed in all patients during the production of voluntary movement, either by motor execution or motor imagery. Online binary control of cursor movement was achieved with an average accuracy of about 82.1 ± 8.2% with motor execution and about 80% with motor imagery, whereas offline accuracy after optimization was 91.4 ± 3.4% with motor execution and 83.3 ± 8.9% with motor imagery. In addition, four-directional cursor control was achieved with an accuracy of 50-60% with motor execution and motor imagery. Patients with ALS or PLS may achieve BCI control without extended training, and fatigue might be reduced during operation of a BCI associated with human natural motor behavior. The development of a user-friendly BCI will promote practical BCI applications in paralyzed patients. Copyright 2010 International Federation of Clinical Neurophysiology. All rights reserved.

  11. A user-friendly LabVIEW software platform for grating based X-ray phase-contrast imaging.

    PubMed

    Wang, Shenghao; Han, Huajie; Gao, Kun; Wang, Zhili; Zhang, Can; Yang, Meng; Wu, Zhao; Wu, Ziyu

    2015-01-01

    X-ray phase-contrast imaging can provide greatly improved contrast over conventional absorption-based imaging for weakly absorbing samples, such as biological soft tissues and fibre composites. In this study, we introduced an easy and fast way to develop a user-friendly software platform dedicated to the new grating-based X-ray phase-contrast imaging setup at the National Synchrotron Radiation Laboratory of the University of Science and Technology of China. The control of 21 motorized stages, of a piezoelectric stage and of an X-ray tube is achieved with this software; it also covers image acquisition with a flat panel detector for automatic phase-stepping scans. Moreover, a data post-processing module for signal retrieval and other custom features is in principle available. With a seamless integration of all the necessary functions in one software package, this platform greatly facilitates users' activities during experimental runs with this grating-based X-ray phase-contrast imaging setup.
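
    For context, signal retrieval from a phase-stepping scan is commonly done pixel-wise by Fourier analysis of the stepping curve; the sketch below illustrates that standard approach and is not the platform's own post-processing code.

```python
import numpy as np

def phase_stepping_retrieval(stack):
    """Pixel-wise signal retrieval from a phase-stepping scan.

    stack : array of shape (n_steps, ny, nx), one flat-field-corrected image
            per grating position over one period.
    Fits I_k ~ a0 + a1*cos(2*pi*k/n_steps + phi) per pixel via the FFT.
    Comparing a0, phi and the visibility a1/a0 between sample and reference
    scans yields absorption, differential-phase and dark-field images."""
    n = stack.shape[0]
    spectrum = np.fft.fft(stack, axis=0)
    a0 = np.abs(spectrum[0]) / n          # mean of the stepping curve
    a1 = 2.0 * np.abs(spectrum[1]) / n    # first-harmonic amplitude
    phi = np.angle(spectrum[1])           # first-harmonic phase
    visibility = a1 / a0
    return a0, phi, visibility
```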

  12. A user-friendly technical set-up for infrared photography of forensic findings.

    PubMed

    Rost, Thomas; Kalberer, Nicole; Scheurer, Eva

    2017-09-01

    Infrared photography is interesting for use in forensic science and forensic medicine since it reveals findings that normally are almost invisible to the human eye. Originally, infrared photography was made possible by placing an infrared light transmission filter in front of the camera's objective lens. However, this set-up is associated with many drawbacks, such as the loss of the autofocus function, the need for an external infrared source, and long exposure times that make the use of a tripod necessary. These limitations have so far prevented the routine application of infrared photography in forensics. In this study, the use of a professional modification inside the digital camera body was evaluated with regard to camera handling and image quality. This permanent modification consisted of replacing the built-in infrared blocking filter with an infrared transmission filter of 700 nm and 830 nm, respectively. The application of this camera set-up to the photo-documentation of forensically relevant post-mortem findings was investigated in examples of trace evidence such as gunshot residues on the skin, in external findings, e.g. hematomas, as well as in an exemplary internal finding, i.e., Wischnewski spots in a putrefied stomach. The application of scattered light created by indirect flashlight yielded a more uniform illumination of the object, and the use of the 700 nm filter resulted in better pictures than the 830 nm filter. Compared to pictures taken under visible light, infrared photographs generally yielded better contrast. This allowed for discerning more details and revealed findings which were not visible otherwise, such as imprints on a fabric and tattoos in mummified skin. The permanent modification of a digital camera by building in a 700 nm infrared transmission filter resulted in a user-friendly and efficient set-up which qualified for use in daily forensic routine. Main advantages were a clear picture in the viewfinder, an auto

  13. BUMPER: the Bayesian User-friendly Model for Palaeo-Environmental Reconstruction

    NASA Astrophysics Data System (ADS)

    Holden, Phil; Birks, John; Brooks, Steve; Bush, Mark; Hwang, Grace; Matthews-Bird, Frazer; Valencia, Bryan; van Woesik, Robert

    2017-04-01

    We describe the Bayesian User-friendly Model for Palaeo-Environmental Reconstruction (BUMPER), a Bayesian transfer function for inferring past climate and other environmental variables from microfossil assemblages. The principal motivation for a Bayesian approach is that the palaeoenvironment is treated probabilistically, and can be updated as additional data become available. Bayesian approaches therefore provide a reconstruction-specific quantification of the uncertainty in the data and in the model parameters. BUMPER is fully self-calibrating, straightforward to apply, and computationally fast, requiring 2 seconds to build a 100-taxon model from a 100-site training-set on a standard personal computer. We apply the model's probabilistic framework to generate thousands of artificial training-sets under ideal assumptions. We then use these to demonstrate both the general applicability of the model and the sensitivity of reconstructions to the characteristics of the training-set, considering assemblage richness, taxon tolerances, and the number of training sites. We demonstrate general applicability to real data, considering three different organism types (chironomids, diatoms, pollen) and different reconstructed variables. In all of these applications an identically configured model is used, the only change being the input files that provide the training-set environment and taxon-count data.
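
    A toy sketch of the Bayesian idea follows: assuming each taxon's probability of occurrence is a Gaussian response to the environmental variable, the posterior over a grid of candidate values is proportional to the product of the per-taxon likelihoods. BUMPER's actual likelihood, self-calibration and uncertainty handling are described in the paper; everything below is illustrative.

```python
import numpy as np

def reconstruct_environment(assemblage, taxa_params, env_grid, prior=None):
    """Toy Bayesian transfer-function reconstruction (illustrative only).

    assemblage  : dict {taxon: 1 if present, 0 if absent} for one fossil sample
    taxa_params : dict {taxon: (optimum, tolerance, max_prob)} describing a
                  Gaussian occurrence-probability response to the variable
    env_grid    : 1-D array of candidate environmental values
    Returns the normalized posterior distribution over env_grid."""
    if prior is None:
        log_post = np.zeros_like(env_grid, dtype=float)
    else:
        log_post = np.log(np.asarray(prior, dtype=float))
    for taxon, present in assemblage.items():
        opt, tol, pmax = taxa_params[taxon]
        p_occ = pmax * np.exp(-0.5 * ((env_grid - opt) / tol) ** 2)
        p_occ = np.clip(p_occ, 1e-6, 1 - 1e-6)
        log_post += np.log(p_occ) if present else np.log(1.0 - p_occ)
    post = np.exp(log_post - log_post.max())
    return post / post.sum()
```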

  14. MADANALYSIS 5, a user-friendly framework for collider phenomenology

    NASA Astrophysics Data System (ADS)

    Conte, Eric; Fuks, Benjamin; Serret, Guillaume

    2013-01-01

    We present MADANALYSIS 5, a new framework for phenomenological investigations at particle colliders. Based on a C++ kernel, this program allows us to efficiently perform, in a straightforward and user-friendly fashion, sophisticated physics analyses of event files such as those generated by a large class of Monte Carlo event generators. MADANALYSIS 5 comes with two modes of running. The first one, easier to handle, uses the strengths of a powerful PYTHON interface in order to implement physics analyses by means of a set of intuitive commands. The second one requires one to implement the analyses in the C++ programming language, directly within the core of the analysis framework. This opens unlimited possibilities concerning the level of complexity which can be reached, being only limited by the programming skills and the originality of the user. Program summary - Program title: MadAnalysis 5. Catalogue identifier: AENO_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENO_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Permission to use, copy, modify and distribute this program is granted under the terms of the GNU General Public License. No. of lines in distributed program, including test data, etc.: 31087. No. of bytes in distributed program, including test data, etc.: 399105. Distribution format: tar.gz. Programming language: PYTHON, C++. Computer: All platforms on which Python version 2.7, Root version 5.27 and the g++ compiler are available; compatibility with newer versions of these programs is also ensured, however the Python version must be below version 3.0. Operating system: Unix, Linux and Mac OS operating systems on which the above-mentioned versions of Python and Root, as well as g++, are available. Classification: 11.1. External routines: ROOT (http://root.cern.ch/drupal/). Nature of problem: Implementing sophisticated phenomenological analyses in high-energy physics through a

  15. Stability of user-friendly blood typing kits stored under typical military field conditions.

    PubMed

    Bienek, Diane R; Chang, Cheow K; Charlton, David G

    2009-10-01

    To help preserve in-theater strength within deployed military units, commercially available, rapid, user-friendly ABO-Rh blood typing kits were evaluated to determine their stability in storage conditions commonly encountered by the warfighter. Methods for environmental exposure testing were based on MIL-STD-810F. When Eldon Home Kits 2511 were exposed to various temperature/relative humidity conditions, the results were comparable to those obtained with the control group and those obtained with industry-standard methods. For the ABO-Rh Combination Blood Typing Experiment Kits, 2 of the exposure treatments rendered them unusable. In addition, a third set of exposure treatments adversely affected the kits, resulting in approximately 30% blood type misclassifications. Collectively, this evaluation of commercial blood typing kits revealed that diagnostic performance can vary between products, lots, and environmental storage conditions.

  16. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  17. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
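
    Of the three techniques, the frequency ratio is the simplest; the sketch below shows the standard formula applied to one conditioning-factor raster (illustrative code, not the BSM tool itself).

```python
import numpy as np

def frequency_ratio(factor_classes, hazard_mask):
    """Frequency-ratio values for one conditioning factor.

    factor_classes : 2-D integer raster of class labels (e.g. reclassified
                     slope or lithology)
    hazard_mask    : 2-D boolean raster, True at hazard-event pixels
    FR(class) = (hazard pixels in class / all hazard pixels)
              / (pixels in class / all pixels);
    FR > 1 indicates a positive association between the class and the hazard."""
    fr = {}
    total = factor_classes.size
    total_hazard = hazard_mask.sum()
    for cls in np.unique(factor_classes):
        in_class = factor_classes == cls
        pct_hazard = (hazard_mask & in_class).sum() / total_hazard
        pct_class = in_class.sum() / total
        fr[int(cls)] = float(pct_hazard / pct_class)
    return fr
```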

  18. CloudDOE: a user-friendly tool for deploying Hadoop clouds and analyzing high-throughput sequencing data with MapReduce.

    PubMed

    Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D T; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung

    2014-01-01

    Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. Interested users may collaborate to improve the

  19. CloudDOE: A User-Friendly Tool for Deploying Hadoop Clouds and Analyzing High-Throughput Sequencing Data with MapReduce

    PubMed Central

    Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D. T.; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung

    2014-01-01

    Background Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. Results We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. Conclusions CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. Interested users

  20. Design Guidelines for User Transactions with Battlefield Automated Systems: Prototype for a Handbook

    DTIC Science & Technology

    1984-05-01

    Codd, E. F., Seven Steps to Rendezvous with the Casual User (Technical Report RJ-1333). San Jose, California: IBM Research Laboratory. [...] The work here demonstrates that the concept can be productively extended to the behavioral domain.

  1. Towards User-Friendly Spelling with an Auditory Brain-Computer Interface: The CharStreamer Paradigm

    PubMed Central

    Höhne, Johannes; Tangermann, Michael

    2014-01-01

    By decoding brain signals into control commands, brain-computer interfaces (BCI) aim to establish an alternative communication pathway for locked-in patients. In contrast to most visual BCI approaches, which use event-related potentials (ERP) of the electroencephalogram, auditory BCI systems are challenged with ERP responses that are less class-discriminant between attended and unattended stimuli. Furthermore, these auditory approaches have more complex interfaces, which impose a substantial workload on their users. Aiming for a maximally user-friendly spelling interface, this study introduces a novel auditory paradigm: "CharStreamer". The speller can be used with an instruction as simple as "please attend to what you want to spell". The stimuli of CharStreamer comprise 30 spoken sounds of letters and actions. As each of them is represented by the sound of itself and not by an artificial substitute, it can be selected in a one-step procedure. The mental mapping effort (sound stimuli to actions) is thus minimized. Usability is further accounted for by an alphabetical stimulus presentation: contrary to random presentation orders, the user can foresee the presentation time of the target letter sound. Healthy, normal-hearing users (n = 10) of the CharStreamer paradigm displayed ERP responses that systematically differed between target and non-target sounds. Class-discriminant features, however, varied individually from the typical N1-P2 complex and P3 ERP components found in control conditions with random sequences. To fully exploit the sequential presentation structure of CharStreamer, novel data analysis approaches and classification methods were introduced. The results of online spelling tests showed that a competitive spelling speed can be achieved with CharStreamer. With respect to user rating, it clearly outperforms a control setup with random presentation sequences. PMID:24886978

  2. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  3. A user friendly database for use in ALARA job dose assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zodiates, A.M.; Willcock, A.

    1995-03-01

    The pressurized water reactor (PWR) design chosen for adoption by Nuclear Electric plc was based on the Westinghouse Standard Nuclear Unit Power Plant (SNUPPS). This design was developed to meet the United Kingdom requirements and these improvements are embodied in the Sizewell B plant which will start commercial operation in 1994. A user-friendly database was developed to assist the station in the dose and ALARP assessments of the work expected to be carried out during station operation and outage. The database stores the information in an easily accessible form and enables updating, editing, retrieval, and searches of the information. The database contains job-related information such as job locations, number of workers required, job times, and the expected plant doserates. It also contains the means to flag job requirements such as requirements for temporary shielding, flushing, scaffolding, etc. Typical uses of the database are envisaged to be in the prediction of occupational doses, the identification of high collective and individual dose jobs, use in ALARP assessments, setting of dose targets, monitoring of dose control performance, and others.

  4. Biblio-MetReS for user-friendly mining of genes and biological processes in scientific documents.

    PubMed

    Usie, Anabel; Karathia, Hiren; Teixidó, Ivan; Alves, Rui; Solsona, Francesc

    2014-01-01

    One way to initiate the reconstruction of molecular circuits is by using automated text-mining techniques. Developing more efficient methods for such reconstruction is a topic of active research, and those methods are typically included by bioinformaticians in pipelines used to mine and curate large literature datasets. Nevertheless, experimental biologists have a limited number of available user-friendly tools that use text-mining for network reconstruction and require no programming skills to use. One of these tools is Biblio-MetReS. Originally, this tool permitted an on-the-fly analysis of documents contained in a number of web-based literature databases to identify co-occurrence of proteins/genes. This approach ensured results that were always up-to-date with the latest live version of the databases. However, this 'up-to-dateness' came at the cost of large execution times. Here we report an evolution of the application Biblio-MetReS that permits constructing co-occurrence networks for genes, GO processes, pathways, or any combination of the three types of entities, and graphically representing those entities. We show that the performance of Biblio-MetReS in identifying gene co-occurrence is at least as good as that of other comparable applications (STRING and iHOP). In addition, we also show that the identification of GO processes is on par with that reported in the latest BioCreAtIvE challenge. Finally, we also report the implementation of a new strategy that combines on-the-fly analysis of new documents with preprocessed information from documents that were encountered in previous analyses. This combination simultaneously decreases program run time and maintains 'up-to-dateness' of the results. http://metres.udl.cat/index.php/downloads, metres.cmb@gmail.com.
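
    The counting step behind such co-occurrence networks can be illustrated in a few lines of code (a toy dictionary-matching version; Biblio-MetReS's own entity recognition and scoring are more elaborate).

```python
from itertools import combinations
from collections import Counter

def cooccurrence_network(documents, entities):
    """Count pairwise co-occurrence of entities (genes, GO processes,
    pathways) across a set of documents.

    documents : iterable of lower-cased document texts
    entities  : dict {entity_name: list of lower-cased synonyms}
    Returns (per-entity occurrence counts, pair co-occurrence counts)."""
    occurrences = Counter()
    pairs = Counter()
    for text in documents:
        found = {name for name, synonyms in entities.items()
                 if any(s in text for s in synonyms)}
        occurrences.update(found)
        pairs.update(frozenset(p) for p in combinations(sorted(found), 2))
    return occurrences, pairs

docs = ["lexa represses the sos response together with reca ...",
        "expression of reca is induced upon dna damage ..."]
genes = {"recA": ["reca"], "lexA": ["lexa"]}
counts, edges = cooccurrence_network(docs, genes)
# edges[frozenset({"recA", "lexA"})] == 1 ; counts["recA"] == 2
```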

  5. Feasibility study of a take-home array-based functional electrical stimulation system with automated setup for current functional electrical stimulation users with foot-drop.

    PubMed

    Prenton, Sarah; Kenney, Laurence P; Stapleton, Claire; Cooper, Glen; Reeves, Mark L; Heller, Ben W; Sobuh, Mohammad; Barker, Anthony T; Healey, Jamie; Good, Timothy R; Thies, Sibylle B; Howard, David; Williamson, Tracey

    2014-10-01

    To investigate the feasibility of unsupervised community use of an array-based automated setup functional electrical stimulator for current foot-drop functional electrical stimulation (FES) users. Feasibility study. Gait laboratory and community use. Participants (N=7) with diagnosis of unilateral foot-drop of central neurologic origin (>6mo) who were regular users of a foot-drop FES system (>3mo). Array-based automated setup FES system for foot-drop (ShefStim). Logged usage, logged automated setup times for the array-based automated setup FES system and diary recording of problems experienced, all collected in the community environment. Walking speed, ankle angles at initial contact, foot clearance during swing, and the Quebec User Evaluation of Satisfaction with Assistive Technology version 2.0 (QUEST version 2.0) questionnaire, all collected in the gait laboratory. All participants were able to use the array-based automated setup FES system. Total setup time took longer than participants' own FES systems, and automated setup time was longer than in a previous study of a similar system. Some problems were experienced, but overall, participants were as satisfied with this system as their own FES system. The increase in walking speed (N=7) relative to no stimulation was comparable between both systems, and appropriate ankle angles at initial contact (N=7) and foot clearance during swing (n=5) were greater with the array-based automated setup FES system. This study demonstrates that an array-based automated setup FES system for foot-drop can be successfully used unsupervised. Despite setup's taking longer and some problems, users are satisfied with the system and it would appear as effective, if not better, at addressing the foot-drop impairment. Further product development of this unique system, followed by a larger-scale and longer-term study, is required before firm conclusions about its efficacy can be reached. Copyright © 2014 American Congress of

  6. Accuracy of user-friendly blood typing kits tested under simulated military field conditions.

    PubMed

    Bienek, Diane R; Charlton, David G

    2011-04-01

    Rapid user-friendly ABO-Rh blood typing kits (Eldon Home Kit 2511, ABO-Rh Combination Blood Typing Experiment Kit) were evaluated to determine their accuracy when used under simulated military field conditions and after long-term storage at various temperatures and humidities. Rates of positive tests between control groups, experimental groups, and industry standards were measured and analyzed using Fisher's exact chi-square method to identify significant differences (p ≤ 0.05). When Eldon Home Kits 2511 were used in various operational conditions, the results were comparable to those obtained with the control group and with the industry standard. The performance of the ABO-Rh Combination Blood Typing Experiment Kit was adversely affected by prolonged storage at temperatures above 37 °C. The diagnostic performance of commercial blood typing kits varies according to product and environmental storage conditions.
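
    The comparison of positive-test rates described above can be reproduced in outline with Fisher's exact test; the 2x2 counts below are hypothetical and do not come from the study.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: correct vs. incorrect typings, control kits vs. heat-stored kits
table = [[48, 2],
         [41, 9]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
if p_value <= 0.05:
    print("difference between storage conditions is statistically significant")
```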

  7. A user-friendly, dynamic web environment for remote data browsing and analysis of multiparametric geophysical data within the MULTIMO project

    NASA Astrophysics Data System (ADS)

    Carniel, Roberto; Di Cecca, Mauro; Jaquet, Olivier

    2006-05-01

    In the framework of the EU-funded project "Multi-disciplinary monitoring, modelling and forecasting of volcanic hazard" (MULTIMO), multiparametric data have been recorded at the MULTIMO station in Montserrat. Moreover, several other long time series, recorded at Montserrat and at other volcanoes, have been acquired in order to test stochastic and deterministic methodologies under development. Creating a general framework to handle data efficiently is a considerable task even for homogeneous data. In the case of heterogeneous data, this becomes a major issue. A need for a consistent way of browsing such a heterogeneous dataset in a user-friendly way therefore arose. Additionally, a framework for applying the calculation of the developed dynamical parameters on the data series was also needed in order to easily keep these parameters under control, e.g. for monitoring, research or forecasting purposes. The solution which we present is completely based on Open Source software, including Linux operating system, MySql database management system, Apache web server, Zope application server, Scilab math engine, Plone content management framework, Unified Modelling Language. From the user point of view the main advantage is the possibility of browsing through datasets recorded on different volcanoes, with different instruments, with different sampling frequencies, stored in different formats, all via a consistent, user- friendly interface that transparently runs queries to the database, gets the data from the main storage units, generates the graphs and produces dynamically generated web pages to interact with the user. The involvement of third parties for continuing the development in the Open Source philosophy and/or extending the application fields is now sought.

  8. WHAM!: a web-based visualization suite for user-defined analysis of metagenomic shotgun sequencing data.

    PubMed

    Devlin, Joseph C; Battaglia, Thomas; Blaser, Martin J; Ruggles, Kelly V

    2018-06-25

    Exploration of large data sets, such as shotgun metagenomic sequence or expression data, by biomedical experts and medical professionals remains a major bottleneck in the scientific discovery process. Although tools for this purpose exist for 16S ribosomal RNA sequencing analysis, there is a growing but still insufficient number of user-friendly interactive visualization workflows for easy data exploration and figure generation. The development of such platforms is necessary to accelerate and streamline microbiome laboratory research. We developed the Workflow Hub for Automated Metagenomic Exploration (WHAM!) as a web-based interactive tool capable of user-directed data visualization and statistical analysis of annotated shotgun metagenomic and metatranscriptomic data sets. WHAM! includes exploratory and hypothesis-based gene and taxa search modules for visualizing differences in microbial taxa and gene family expression across experimental groups, and for creating publication-quality figures without the need for a command-line interface or in-house bioinformatics. WHAM! is an interactive and customizable tool for downstream metagenomic and metatranscriptomic analysis, providing a user-friendly interface that allows easy data exploration by microbiome and ecological experts to facilitate discovery in multi-dimensional and large-scale data sets.

  9. User-Friendly Predictive Modeling of Greenhouse Gas (GHG) Fluxes and Carbon Storage in Tidal Wetlands

    NASA Astrophysics Data System (ADS)

    Ishtiaq, K. S.; Abdul-Aziz, O. I.

    2015-12-01

    We developed user-friendly empirical models to predict instantaneous fluxes of CO2 and CH4 from coastal wetlands based on a small set of dominant hydro-climatic and environmental drivers (e.g., photosynthetically active radiation, soil temperature, water depth, and soil salinity). The dominant predictor variables were systematically identified by applying a robust data-analytics framework to a wide range of possible environmental variables driving wetland greenhouse gas (GHG) fluxes. The method comprised a multi-layered data-analytics framework, including Pearson correlation analysis, explanatory principal component and factor analyses, and partial least squares regression modeling. The identified dominant predictors were finally utilized to develop power-law based non-linear regression models to predict CO2 and CH4 fluxes under different climatic, land use (nitrogen gradient), tidal hydrology and salinity conditions. Four different tidal wetlands of Waquoit Bay, MA were considered as the case study sites to identify the dominant drivers and evaluate model performance. The study sites were dominated by native Spartina alterniflora and characterized by frequent flooding and highly saline conditions. The model estimated the potential net ecosystem carbon balance (NECB) both in gC/m2 and metric ton C/hectare by up-scaling the instantaneous predicted fluxes to the growing season and accounting for the lateral C flux exchanges between the wetlands and the estuary. The entire model was presented in a single Excel spreadsheet as a user-friendly ecological engineering tool. The model can aid the development of appropriate GHG offset protocols for setting monitoring plans for tidal wetland restoration and maintenance projects. The model can also be used to estimate wetland GHG fluxes and potential carbon storage under various IPCC climate change and sea level rise scenarios; facilitating an appropriate management of carbon stocks in tidal wetlands and their incorporation into a
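
    A power-law regression of the general form used here can be sketched with scipy's curve_fit; the functional form, driver names and synthetic data below are purely illustrative and are not the Waquoit Bay measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(X, a, b, c):
    """Illustrative form: flux ~ a * PAR^b * Tsoil^c."""
    par, tsoil = X
    return a * np.power(par, b) * np.power(tsoil, c)

rng = np.random.default_rng(0)
par = rng.uniform(100, 2000, 80)      # photosynthetically active radiation
tsoil = rng.uniform(10, 30, 80)       # soil temperature (deg C)
flux = 0.02 * par**0.8 * tsoil**1.1 * (1 + 0.05 * rng.standard_normal(80))

params, _ = curve_fit(power_law, (par, tsoil), flux, p0=[0.01, 1.0, 1.0])
print("fitted coefficients a, b, c:", params)
```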

  10. User-Friendly Data Servers for Climate Studies at the Asia-Pacific Data-Research Center (APDRC)

    NASA Astrophysics Data System (ADS)

    Yuan, G.; Shen, Y.; Zhang, Y.; Merrill, R.; Waseda, T.; Mitsudera, H.; Hacker, P.

    2002-12-01

    The APDRC was recently established within the International Pacific Research Center (IPRC) at the University of Hawaii. The APDRC mission is to increase understanding of climate variability in the Asia-Pacific region by developing the computational, data-management, and networking infrastructure necessary to make data resources readily accessible and usable by researchers, and by undertaking data-intensive research activities that will both advance knowledge and lead to improvements in data preparation and data products. A focus of recent activity is the implementation of user-friendly data servers. The APDRC is currently running a Live Access Server (LAS) developed at NOAA/PMEL to provide access to and visualization of gridded climate products via the web. The LAS also allows users to download the selected data subsets in various formats (such as binary, netCDF and ASCII). Most of the datasets served by the LAS are also served through our OPeNDAP server (formerly DODS), which allows users to directly access the data using their desktop client tools (e.g. GrADS, Matlab and Ferret). In addition, the APDRC is running an OPeNDAP Catalog/Aggregation Server (CAS) developed by Unidata at UCAR to serve climate data and products such as model output and satellite-derived products. These products are often large (> 2 GB) and are therefore stored as multiple files (stored separately in time or in parameters). The CAS remedies the inconvenience of multiple files and allows access to the whole dataset (or any subset that cuts across the multiple files) via a single request command from any DODS enabled client software. Once the aggregation of files is configured at the server (CAS), the process of aggregation is transparent to the user. The user only needs to know a single URL for the entire dataset, which is, in fact, stored as multiple files. CAS even allows aggregation of files on different systems and at different locations. Currently, the APDRC is serving NCEP, ECMWF
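
    Client-side access of the kind these servers enable can be sketched with the netCDF4 Python library, which can open OPeNDAP URLs directly so that only the requested slab is transferred; the endpoint and variable name below are hypothetical.

```python
from netCDF4 import Dataset

# Hypothetical OPeNDAP endpoint for an aggregated gridded product
url = "http://apdrc.example.edu/dods/sample_sst_dataset"
ds = Dataset(url)                 # opens the remote dataset without downloading it
sst = ds.variables["sst"]         # one variable spanning many underlying files
first_field = sst[0, :, :]        # only this subset travels over the network
print(first_field.shape)
ds.close()
```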

  11. Franklin: User Experiences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    National Energy Research Supercomputing Center; He, Yun; Kramer, William T.C.

    2008-05-07

    The newest workhorse of the National Energy Research Scientific Computing Center is a Cray XT4 with 9,736 dual-core nodes. This paper summarizes Franklin user experiences from the friendly early-user period to the production period. Selected successful user stories, along with the top issues affecting user experiences, are presented.

  12. SUGAR: graphical user interface-based data refiner for high-throughput DNA sequencing.

    PubMed

    Sato, Yukuto; Kojima, Kaname; Nariai, Naoki; Yamaguchi-Kabata, Yumi; Kawai, Yosuke; Takahashi, Mamoru; Mimori, Takahiro; Nagasaki, Masao

    2014-08-08

    Next-generation sequencers (NGSs) have become one of the main tools of current biology. To obtain useful insights from NGS data, it is essential to control low-quality portions of the data affected by technical errors such as air bubbles in the sequencing fluidics. We developed SUGAR (subtile-based GUI-assisted refiner), software that can handle ultra-high-throughput data with a user-friendly graphical user interface (GUI) and interactive analysis capability. SUGAR generates high-resolution quality heatmaps of the flowcell, enabling users to find possible signals of technical errors during the sequencing. The sequencing data generated from the error-affected regions of a flowcell can be selectively removed by automated analysis or GUI-assisted operations implemented in SUGAR. The automated data-cleaning function based on sequence read quality (Phred) scores was applied to public whole human genome sequencing data, and we showed that the overall mapping quality was improved. The detailed data evaluation and cleaning enabled by SUGAR would reduce technical problems in sequence read mapping, improving subsequent variant analyses that require high-quality sequence data and mapping results. Therefore, the software will be especially useful for controlling the quality of variant calls for low-population cells, e.g., cancers, in a sample affected by technical errors of the sequencing procedures.
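
    Phred-score-based cleaning of the kind SUGAR automates can be sketched in plain Python: decode the FASTQ quality string and drop reads whose mean quality falls below a cutoff. The file names and threshold below are hypothetical.

```python
def mean_phred(quality_line, offset=33):
    """Mean Phred score of a FASTQ quality string (Sanger/Illumina 1.8+ encoding)."""
    scores = [ord(ch) - offset for ch in quality_line]
    return sum(scores) / len(scores)

def filter_fastq(in_path, out_path, min_mean_q=20):
    """Copy only reads whose mean Phred score is at least `min_mean_q`."""
    with open(in_path) as fin, open(out_path, "w") as fout:
        while True:
            record = [fin.readline() for _ in range(4)]  # header, sequence, '+', qualities
            if not record[0]:
                break
            if mean_phred(record[3].rstrip("\n")) >= min_mean_q:
                fout.writelines(record)

filter_fastq("reads.fastq", "reads.clean.fastq", min_mean_q=20)
```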

  13. Altools: a user friendly NGS data analyser.

    PubMed

    Camiolo, Salvatore; Sablok, Gaurav; Porceddu, Andrea

    2016-02-17

    Genotyping by re-sequencing has become a standard approach to estimate single nucleotide polymorphism (SNP) diversity, haplotype structure and biodiversity, and is recognized as an efficient approach to address the geographical population genomics of several model species. To access core SNPs and insertion/deletion polymorphisms (indels), and to infer the phyletic patterns of speciation, most such approaches map short reads to the reference genome. Variant calling is important to establish patterns for genome-wide association studies (GWAS) of quantitative trait loci (QTLs), and to determine the population and haplotype structure based on SNPs, thus allowing content-dependent trait and evolutionary analysis. Several tools have been developed to investigate such polymorphisms as well as more complex genomic rearrangements such as copy number variations, presence/absence variations and large deletions. The programs available for this purpose have different strengths (e.g. accuracy, sensitivity and specificity) and weaknesses (e.g. low computation speed, complex installation procedures and the absence of a user-friendly interface). Here we introduce Altools, a software package that is easy to install and use, which allows the precise detection of polymorphisms and structural variations. Altools uses the BWA/SAMtools/VarScan pipeline to call SNPs and indels, and the dnaCopy algorithm to achieve genome segmentation according to local coverage differences in order to identify copy number variations. It also uses insert size information from the alignment of paired-end reads to detect potential large deletions. A double mapping approach (BWA/BLASTn) identifies precise breakpoints while ensuring rapid processing. Finally, Altools implements several processes that yield deeper insight into the genes affected by the detected polymorphisms. Altools was used to analyse both simulated and real next-generation sequencing (NGS) data and performed satisfactorily in terms of
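
    A bare-bones rendition of the BWA/SAMtools/VarScan SNP-calling steps that Altools wraps might look like the sketch below; the file names are placeholders, the VarScan jar location and options are assumptions, and this is not the Altools code itself.

```python
import subprocess

def run(cmd):
    """Run one external command and stop the pipeline on failure."""
    print("+", cmd)
    subprocess.run(cmd, shell=True, check=True)

ref, r1, r2 = "ref.fa", "reads_1.fq", "reads_2.fq"   # placeholder inputs

run(f"bwa index {ref}")
run(f"bwa mem -t 4 {ref} {r1} {r2} > aln.sam")
run("samtools sort -o aln.sorted.bam aln.sam")
run("samtools index aln.sorted.bam")
run(f"samtools mpileup -f {ref} aln.sorted.bam > aln.pileup")
# VarScan jar path and options are assumptions; check the local installation
run("java -jar VarScan.jar mpileup2snp aln.pileup --output-vcf 1 > snps.vcf")
```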

  14. Design and analysis on sorting blade for automated size-based sorting device

    NASA Astrophysics Data System (ADS)

    Razali, Zol Bahri; Kader, Mohamed Mydin M. Abdul; Samsudin, Yasser Suhaimi; Daud, Mohd Hisam

    2017-09-01

    Nowadays, rubbish separation and recycling is a national problem: people dump their rubbish at dumpsites without considering whether it could be recycled and reused. The authors therefore proposed an automated segregating device, intended to encourage people to separate their rubbish and to value what can be reused. The automated size-based mechanical segregating device provides significant improvements in the efficiency and consistency of the segregating process. The device is designed to make recycling easier and more user friendly, in the hope that more people will take responsibility if it costs less time and effort. This paper discusses the redesign of a blade for the sorting device, as part of developing an efficient automated mechanical sorting device for similar material of different sizes. The machine is able to identify the size of waste and relies on a coil inside the container to separate it out. The detailed design and methodology are described in this paper.

  15. National Space Science Data Center data archive and distribution service (NDADS) automated retrieval mail system user's guide

    NASA Technical Reports Server (NTRS)

    Perry, Charleen M.; Vansteenberg, Michael E.

    1992-01-01

    The National Space Science Data Center (NSSDC) has developed an automated data retrieval request service utilizing our Data Archive and Distribution Service (NDADS) computer system. NDADS currently has selected project data written to optical disk platters with the disks residing in a robotic 'jukebox' near-line environment. This allows for rapid and automated access to the data with no staff intervention required. There are also automated help information and user services available that can be accessed. The request system permits an average-size data request to be completed within minutes of the request being sent to NSSDC. A mail message, in the format described in this document, retrieves the data and can send it to a remote site. Also listed in this document are the data currently available.

  16. An automated Genomes-to-Natural Products platform (GNP) for the discovery of modular natural products.

    PubMed

    Johnston, Chad W; Skinnider, Michael A; Wyatt, Morgan A; Li, Xiang; Ranieri, Michael R M; Yang, Lian; Zechel, David L; Ma, Bin; Magarvey, Nathan A

    2015-09-28

    Bacterial natural products are a diverse and valuable group of small molecules, and genome sequencing indicates that the vast majority remain undiscovered. The prediction of natural product structures from biosynthetic assembly lines can facilitate their discovery, but highly automated, accurate, and integrated systems are required to mine the broad spectrum of sequenced bacterial genomes. Here we present a genome-guided natural products discovery tool to automatically predict, combinatorialize and identify polyketides and nonribosomal peptides from biosynthetic assembly lines using LC-MS/MS data of crude extracts in a high-throughput manner. We detail the directed identification and isolation of six genetically predicted polyketides and nonribosomal peptides using our Genome-to-Natural Products platform. This highly automated, user-friendly programme provides a means of realizing the potential of genetically encoded natural products.

  17. iMet-Q: A User-Friendly Tool for Label-Free Metabolomics Quantitation Using Dynamic Peak-Width Determination

    PubMed Central

    Chang, Hui-Yin; Chen, Ching-Tai; Lih, T. Mamie; Lynn, Ke-Shiuan; Juo, Chiun-Gung; Hsu, Wen-Lian; Sung, Ting-Yi

    2016-01-01

    Efficient and accurate quantitation of metabolites from LC-MS data has become an important topic. Here we present an automated tool, called iMet-Q (intelligent Metabolomic Quantitation), for label-free metabolomics quantitation from high-throughput MS1 data. By performing peak detection and peak alignment, iMet-Q provides a summary of quantitation results and reports ion abundance at both replicate level and sample level. Furthermore, it gives the charge states and isotope ratios of detected metabolite peaks to facilitate metabolite identification. An in-house standard mixture and a public Arabidopsis metabolome data set were analyzed by iMet-Q. Three public quantitation tools, including XCMS, MetAlign, and MZmine 2, were used for performance comparison. From the mixture data set, seven standard metabolites were detected by the four quantitation tools, for which iMet-Q had a smaller quantitation error of 12% in both profile and centroid data sets. Our tool also correctly determined the charge states of seven standard metabolites. By searching the mass values for those standard metabolites against Human Metabolome Database, we obtained a total of 183 metabolite candidates. With the isotope ratios calculated by iMet-Q, 49% (89 out of 183) metabolite candidates were filtered out. From the public Arabidopsis data set reported with two internal standards and 167 elucidated metabolites, iMet-Q detected all of the peaks corresponding to the internal standards and 167 metabolites. Meanwhile, our tool had small abundance variation (≤0.19) when quantifying the two internal standards and had higher abundance correlation (≥0.92) when quantifying the 167 metabolites. iMet-Q provides user-friendly interfaces and is publicly available for download at http://ms.iis.sinica.edu.tw/comics/Software_iMet-Q.html. PMID:26784691

  18. ClimatePipes: User-Friendly Data Access, Manipulation, Analysis & Visualization of Community Climate Models

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; DeMarle, D.; Burnett, B.; Harris, C.; Silva, W.; Osmari, D.; Geveci, B.; Silva, C.; Doutriaux, C.; Williams, D. N.

    2013-12-01

    The impact of climate change will resonate through a broad range of fields including public health, infrastructure, water resources, and many others. Long-term coordinated planning, funding, and action are required for climate change adaptation and mitigation. Unfortunately, widespread use of climate data (simulated and observed) in non-climate science communities is impeded by factors such as large data size, lack of adequate metadata, poor documentation, and lack of sufficient computational and visualization resources. We present ClimatePipes to address many of these challenges by creating an open-source platform that provides state-of-the-art, user-friendly data access, analysis, and visualization for climate and other relevant geospatial datasets, making the climate data available to non-researchers, decision-makers, and other stakeholders. The overarching goals of ClimatePipes are: - Enable users to explore real-world questions related to climate change. - Provide tools for data access, analysis, and visualization. - Facilitate collaboration by enabling users to share datasets, workflows, and visualizations. ClimatePipes uses a web-based application platform for its widespread support on mainstream operating systems, ease-of-use, and inherent collaboration support. The front-end of ClimatePipes uses HTML5 (WebGL, Canvas2D, CSS3) to deliver state-of-the-art visualization and to provide a best-in-class user experience. The back-end of ClimatePipes is built around Python using the Visualization Toolkit (VTK, http://vtk.org), Climate Data Analysis Tools (CDAT, http://uv-cdat.llnl.gov), and other climate and geospatial data processing tools such as GDAL and PROJ4. ClimatePipes provides a web interface to query and access data from remote sources (such as ESGF); for example, a climate data layer from ESGF can be shown on top of a map data layer from OpenStreetMap. The ClimatePipes workflow editor provides flexibility and fine-grained control, and uses VisTrails (http

  19. Nutrient Tracking Tool - A user-friendly tool for evaluating the water and air quality and quantity as affected by various agricultural management practices

    NASA Astrophysics Data System (ADS)

    Saleh, A.; Niraula, R.; Gallego, O.; Osei, E.; Kannan, N.

    2017-12-01

    The Nutrient Tracking Tool (NTT) is a user-friendly web-based computer program that estimates nutrient (nitrogen and phosphorus) and sediment losses from fields managed under a variety of cropping patterns and management practices. The NTT includes a user-friendly web-based interface and is linked to the Agricultural Policy Environmental eXtender (APEX) model. It also accesses USDA-NRCS's Web Soil Survey to obtain field, weather, and soil information. NTT provides producers, government officials, and other users with a fast and efficient method of estimating nutrient, sediment, and atmospheric gas (N2O, CO2, and NH4) losses and crop production under different conservation practice regimes at the farm level. The information obtained from NTT can help producers determine the most cost-effective conservation practice(s) to reduce nutrient and sediment losses while optimizing crop production. Also, the recent version of NTT (NTTg3) has been developed for countries without access to national databases, such as soils and weather. NTTg3 has also been designed as an easy-to-use APEX interface. NTT is currently being evaluated for trading and other programs in the Chesapeake Bay region and numerous states in the US. During this presentation, the new capabilities of NTTg3 will be described and demonstrated.

  20. The Clinical Ethnographic Interview: A user-friendly guide to the cultural formulation of distress and help seeking

    PubMed Central

    Arnault, Denise Saint; Shimabukuro, Shizuka

    2013-01-01

    Transcultural nursing, psychiatry, and medical anthropology have theorized that practitioners and researchers need more flexible instruments to gather culturally relevant illness experience, meaning, and help seeking. The state of the science is sufficiently developed to allow standardized yet ethnographically sound protocols for assessment. However, vigorous calls for culturally adapted assessment models have yielded little real change in routine practice. This paper describes the conversion of the Diagnostic and Statistical Manual IV, Appendix I Outline for Cultural Formulation into a user-friendly Clinical Ethnographic Interview (CEI), and provides clinical examples of its use in a sample of highly distressed Japanese women. PMID:22194348

  1. MIEC-SVM: automated pipeline for protein peptide/ligand interaction prediction.

    PubMed

    Li, Nan; Ainsworth, Richard I; Wu, Meixin; Ding, Bo; Wang, Wei

    2016-03-15

    MIEC-SVM is a structure-based method for predicting protein recognition specificity. Here, we present an automated MIEC-SVM pipeline providing an integrated and user-friendly workflow for construction and application of MIEC-SVM models. This pipeline can handle standard amino acids and those with post-translational modifications (PTMs), or small molecules. Moreover, multi-threading and support for Sun Grid Engine (SGE) are implemented to significantly boost the computational efficiency. The program is available at http://wanglab.ucsd.edu/MIEC-SVM. Contact: wei-wang@ucsd.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  2. New User-Friendly Approach to Obtain an Eisenberg Plot and Its Use as a Practical Tool in Protein Sequence Analysis

    PubMed Central

    Keller, Rob C.A.

    2011-01-01

    The Eisenberg plot or hydrophobic moment plot methodology is one of the most frequently used methods of bioinformatics. Bioinformatics is more and more recognized as a helpful tool in Life Sciences in general, and recent developments in approaches recognizing lipid binding regions in proteins are promising in this respect. In this study a bioinformatics approach specialized in identifying lipid binding helical regions in proteins was used to obtain an Eisenberg plot. The validity of the Heliquest generated hydrophobic moment plot was checked and exemplified. This study indicates that the Eisenberg plot methodology can be transferred to another hydrophobicity scale and renders a user-friendly approach which can be utilized in routine checks in protein–lipid interaction and in protein and peptide lipid binding characterization studies. A combined approach seems to be advantageous and results in a powerful tool in the search of helical lipid-binding regions in proteins and peptides. The strength and limitations of the Eisenberg plot approach itself are discussed as well. The presented approach not only leads to a better understanding of the nature of the protein–lipid interactions but also provides a user-friendly tool for the search of lipid-binding regions in proteins and peptides. PMID:22016610

  3. New user-friendly approach to obtain an Eisenberg plot and its use as a practical tool in protein sequence analysis.

    PubMed

    Keller, Rob C A

    2011-01-01

    The Eisenberg plot or hydrophobic moment plot methodology is one of the most frequently used methods of bioinformatics. Bioinformatics is more and more recognized as a helpful tool in Life Sciences in general, and recent developments in approaches recognizing lipid binding regions in proteins are promising in this respect. In this study a bioinformatics approach specialized in identifying lipid binding helical regions in proteins was used to obtain an Eisenberg plot. The validity of the Heliquest generated hydrophobic moment plot was checked and exemplified. This study indicates that the Eisenberg plot methodology can be transferred to another hydrophobicity scale and renders a user-friendly approach which can be utilized in routine checks in protein-lipid interaction and in protein and peptide lipid binding characterization studies. A combined approach seems to be advantageous and results in a powerful tool in the search of helical lipid-binding regions in proteins and peptides. The strength and limitations of the Eisenberg plot approach itself are discussed as well. The presented approach not only leads to a better understanding of the nature of the protein-lipid interactions but also provides a user-friendly tool for the search of lipid-binding regions in proteins and peptides.
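
    The quantity behind the Eisenberg plot, the mean hydrophobic moment plotted against mean hydrophobicity, can be computed directly. The sketch below uses the standard 100-degree per-residue rotation for an alpha-helix and approximate Eisenberg consensus hydrophobicity values; verify the scale against a reference before relying on it.

```python
import math

# Approximate Eisenberg consensus hydrophobicity values (check against a reference)
EISENBERG = {"A": 0.62, "R": -2.53, "N": -0.78, "D": -0.90, "C": 0.29,
             "Q": -0.85, "E": -0.74, "G": 0.48, "H": -0.40, "I": 1.38,
             "L": 1.06, "K": -1.50, "M": 0.64, "F": 1.19, "P": 0.12,
             "S": -0.18, "T": -0.05, "W": 0.81, "Y": 0.26, "V": 1.08}

def mean_hydrophobic_moment(seq, delta_deg=100.0, scale=EISENBERG):
    """Mean hydrophobic moment <muH> of a window, 100 deg/residue for an alpha-helix."""
    delta = math.radians(delta_deg)
    s = sum(scale[aa] * math.sin(i * delta) for i, aa in enumerate(seq))
    c = sum(scale[aa] * math.cos(i * delta) for i, aa in enumerate(seq))
    return math.hypot(s, c) / len(seq)

def mean_hydrophobicity(seq, scale=EISENBERG):
    return sum(scale[aa] for aa in seq) / len(seq)

window = "LKKLLKLLKKLLKLL"   # hypothetical amphipathic stretch
print(mean_hydrophobic_moment(window), mean_hydrophobicity(window))
```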

  4. Ferret: a user-friendly Java tool to extract data from the 1000 Genomes Project.

    PubMed

    Limou, Sophie; Taverner, Andrew M; Winkler, Cheryl A

    2016-07-15

    The 1000 Genomes (1KG) Project provides a near-comprehensive resource on human genetic variation in worldwide reference populations. 1KG variants can be accessed through a browser and through the raw and annotated data that are regularly released on an ftp server. We developed Ferret, a user-friendly Java tool, to easily extract genetic variation information from these large and complex data files. From a locus, gene(s) or SNP(s) of interest, Ferret retrieves genotype data for 1KG SNPs and indels, and computes allelic frequencies for 1KG populations and, optionally, for the Exome Sequencing Project populations. By converting the 1KG data into files that can be imported into popular pre-existing tools (e.g. PLINK and HaploView), Ferret offers a straightforward way, even for non-bioinformatics specialists, to manipulate, explore and merge 1KG data with the user's dataset, as well as to visualize linkage disequilibrium patterns, infer haplotypes and design tagSNPs. The Ferret tool and source code are publicly available at http://limousophie35.github.io/Ferret/. Contact: ferret@nih.gov. Supplementary data are available at Bioinformatics online. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
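
    The allele-frequency computation that Ferret performs on 1KG genotypes can be sketched on a single VCF data line; the record below is made up for illustration.

```python
def alt_allele_frequency(vcf_line):
    """Alternate-allele frequency from the GT fields of one biallelic VCF data line."""
    fields = vcf_line.rstrip("\n").split("\t")
    genotypes = fields[9:]                       # sample columns follow the FORMAT column
    alleles = []
    for gt in genotypes:
        gt = gt.split(":")[0]                    # keep only the GT sub-field
        alleles += gt.replace("|", "/").split("/")
    called = [a for a in alleles if a != "."]    # drop missing calls
    return sum(a != "0" for a in called) / len(called)

# Made-up biallelic SNP genotyped in three diploid samples
line = "1\t12345\trs0000\tA\tG\t.\tPASS\t.\tGT\t0|1\t1|1\t0|0"
print(alt_allele_frequency(line))   # -> 0.5
```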

  5. Developing user-friendly habitat suitability tools from regional stream fish survey data

    USGS Publications Warehouse

    Zorn, T.G.; Seelbach, P.; Wiley, M.J.

    2011-01-01

    We developed user-friendly fish habitat suitability tools (plots) for fishery managers in Michigan; these tools are based on driving habitat variables and fish population estimates for several hundred stream sites throughout the state. We generated contour plots to show patterns in fish biomass for over 60 common species (and for 120 species grouped at the family level) in relation to axes of catchment area and low-flow yield (90% exceedance flow divided by catchment area) and also in relation to axes of mean and weekly range of July temperatures. The plots showed distinct patterns in fish habitat suitability at each level of biological organization studied and were useful for quantitatively comparing river sites. We demonstrate how these plots can be used to support stream management, and we provide examples pertaining to resource assessment, trout stocking, angling regulations, chemical reclamation of marginal trout streams, indicator species, instream flow protection, and habitat restoration. These straightforward and effective tools are electronically available so that managers can easily access and incorporate them into decision protocols and presentations.
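
    Plots of this type can be produced by interpolating site-level biomass onto a grid spanned by the two habitat axes; the synthetic site data below are illustrative, not the Michigan survey data.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
catchment_area = rng.uniform(1, 4, 200)          # log10 catchment area, synthetic sites
low_flow_yield = rng.uniform(0.001, 0.01, 200)   # 90% exceedance flow / catchment area
biomass = np.exp(-(catchment_area - 2.5) ** 2) * low_flow_yield * 1e3  # fake response

xi, yi = np.linspace(1, 4, 100), np.linspace(0.001, 0.01, 100)
Xi, Yi = np.meshgrid(xi, yi)
Zi = griddata((catchment_area, low_flow_yield), biomass, (Xi, Yi), method="linear")

plt.contourf(Xi, Yi, Zi, levels=15)
plt.xlabel("log10 catchment area")
plt.ylabel("low-flow yield")
plt.colorbar(label="interpolated biomass (arbitrary units)")
plt.savefig("suitability_plot.png")
```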

  6. Applicability of the "Emotiv EEG Neuroheadset" as a user-friendly input interface.

    PubMed

    Boutani, Hidenori; Ohsuga, Mieko

    2013-01-01

    We aimed to develop an input interface by using the P3 component of visual event-related potentials (ERPs). When using electroencephalography (EEG) in daily applications, coping with ocular-motor artifacts and ensuring that the equipment is user-friendly are both important. To address the first issue, we applied a previously proposed method that applies an unmixing matrix to acquire independent components (ICs) obtained from another dataset. For the second issue, we introduced a 14-channel EEG commercial headset called the "Emotiv EEG Neuroheadset". An advantage of the Emotiv headset is that users can put it on by themselves within 1 min without any specific skills. However, only a few studies have investigated whether EEG and ERP signals are accurately measured by Emotiv. Additionally, no electrodes of the Emotiv headset are located over the centroparietal area of the head where P3 components are reported to show large amplitudes. Therefore, we first demonstrated that the P3 components obtained by the headset and by commercial plate electrodes and a multipurpose bioelectric amplifier during an oddball task were comparable. Next, we confirmed that eye-blink and ocular movement components could be decomposed by independent component analysis (ICA) using the 14-channel signals measured by the headset. We also demonstrated that artifacts could be removed with an unmixing matrix, as long as the matrix was obtained from the same person, even if they were measured on different days. Finally, we confirmed that the fluctuation of the sampling frequency of the Emotiv headset was not a major problem.
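
    Artifact removal through an unmixing matrix, as described above, can be sketched with FastICA: decompose the 14-channel recording into independent components, zero the components judged to be ocular, and reconstruct the channels. The data are synthetic and the choice of artifact components is an assumption; in practice they are identified by inspection or by correlation with EOG-like signals.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 14))       # placeholder for samples x 14 EEG channels

ica = FastICA(n_components=14, random_state=0)
sources = ica.fit_transform(X)            # independent components (samples x 14)

artifact_components = [0, 3]              # assumed blink/eye-movement components
sources[:, artifact_components] = 0.0     # suppress them

X_clean = ica.inverse_transform(sources)  # back to channel space, artifacts removed
print(X_clean.shape)
```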

  7. I trust it, but I don't know why: effects of implicit attitudes toward automation on trust in an automated system.

    PubMed

    Merritt, Stephanie M; Heimbaugh, Heather; LaChapell, Jennifer; Lee, Deborah

    2013-06-01

    This study is the first to examine the influence of implicit attitudes toward automation on users' trust in automation. Past empirical work has examined explicit (conscious) influences on user level of trust in automation but has not yet measured implicit influences. We examine concurrent effects of explicit propensity to trust machines and implicit attitudes toward automation on trust in an automated system. We examine differential impacts of each under varying automation performance conditions (clearly good, ambiguous, clearly poor). Participants completed both a self-report measure of propensity to trust and an Implicit Association Test measuring implicit attitude toward automation, then performed an X-ray screening task. Automation performance was manipulated within-subjects by varying the number and obviousness of errors. Explicit propensity to trust and implicit attitude toward automation did not significantly correlate. When the automation's performance was ambiguous, implicit attitude significantly affected automation trust, and its relationship with propensity to trust was additive: Increments in either were related to increases in trust. When errors were obvious, a significant interaction between the implicit and explicit measures was found, with those high in both having higher trust. Implicit attitudes have important implications for automation trust. Users may not be able to accurately report why they experience a given level of trust. To understand why users trust or fail to trust automation, measurements of implicit and explicit predictors may be necessary. Furthermore, implicit attitude toward automation might be used as a lever to effectively calibrate trust.

  8. The SubCons webserver: A user friendly web interface for state-of-the-art subcellular localization prediction.

    PubMed

    Salvatore, M; Shu, N; Elofsson, A

    2018-01-01

    SubCons is a recently developed method that predicts the subcellular localization of a protein. It combines predictions from four predictors using a Random Forest classifier. Here, we present the user-friendly web-interface implementation of SubCons. Starting from a protein sequence, the server rapidly predicts the subcellular localizations of an individual protein. In addition, the server accepts the submission of sets of proteins, either by uploading files or programmatically by using command-line WSDL API scripts. This makes SubCons ideal for proteome-wide analyses, allowing the user to scan a whole proteome in a few days. From the web page, it is also possible to download precalculated predictions for several eukaryotic organisms. To evaluate the performance of SubCons, we present a benchmark of LocTree3 and SubCons using two recent mass-spectrometry-based datasets of mouse and drosophila proteins. The server is available at http://subcons.bioinfo.se/. © 2017 The Protein Society.
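
    The combination step can be sketched with scikit-learn: each protein is represented by the scores of the four upstream predictors and a Random Forest outputs the consensus localization. The feature values and labels below are synthetic, not the SubCons training data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 4))            # rows: proteins; columns: scores from four predictors
y = rng.integers(0, 3, 500)         # synthetic labels standing in for compartments

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X[:400], y[:400])
print("held-out accuracy:", clf.score(X[400:], y[400:]))
print("predicted compartment for one protein:", clf.predict(X[400:401])[0])
```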

  9. User-Friendly Interface Developed for a Web-Based Service for SpaceCAL Emulations

    NASA Technical Reports Server (NTRS)

    Liszka, Kathy J.; Holtz, Allen P.

    2004-01-01

    A team at the NASA Glenn Research Center is developing a Space Communications Architecture Laboratory (SpaceCAL) for protocol development activities for coordinated satellite missions. SpaceCAL will provide a multiuser, distributed system to emulate space-based Internet architectures, backbone networks, formation clusters, and constellations. As part of a new effort in 2003, building blocks are being defined for an open distributed system to make the satellite emulation test bed accessible through an Internet connection. The first step in creating a Web-based service to control the emulation remotely is providing a user-friendly interface for encoding the data into a well-formed and complete Extensible Markup Language (XML) document. XML provides coding that allows data to be transferred between dissimilar systems. Scenario specifications include control parameters, network routes, interface bandwidths, delay, and bit error rate. Specifications for all satellites, instruments, and ground stations in a given scenario are also included in the XML document. For the SpaceCAL emulation, the XML document can be created using XForms, a Web-based forms language for data collection. Contrary to older forms technology, the interactive user interface makes the science prevalent, not the data representation. Required versus optional input fields, default values, automatic calculations, data validation, and reuse will help researchers quickly and accurately define missions. XForms can apply any XML schema defined for the test mission to validate data before forwarding it to the emulation facility. New instrument definitions, facilities, and mission types can be added to the existing schema. The first prototype user interface incorporates components for interactive input and form processing. Internet address, data rate, and the location of the facility are implemented with basic form controls with default values provided for convenience and efficiency using basic XForms operations

  10. A user-friendly, open-source tool to project impact and cost of diagnostic tests for tuberculosis

    PubMed Central

    Dowdy, David W; Andrews, Jason R; Dodd, Peter J; Gilman, Robert H

    2014-01-01

    Most models of infectious diseases, including tuberculosis (TB), do not provide results customized to local conditions. We created a dynamic transmission model to project TB incidence, TB mortality, multidrug-resistant (MDR) TB prevalence, and incremental costs over 5 years after scale-up of nine alternative diagnostic strategies. A corresponding web-based interface allows users to specify local costs and epidemiology. In settings with little capacity for up-front investment, same-day microscopy had the greatest impact on TB incidence and became cost-saving within 5 years if delivered at $10/test. With greater initial investment, population-level scale-up of Xpert MTB/RIF or microcolony-based culture often averted 10 times more TB cases than narrowly targeted strategies, at minimal incremental long-term cost. Xpert for smear-positive TB had a reasonable impact on MDR-TB incidence, but at a substantial price and with little impact on overall TB incidence and mortality. This user-friendly modeling framework improves decision-makers' ability to evaluate the local impact of TB diagnostic strategies. DOI: http://dx.doi.org/10.7554/eLife.02565.001 PMID:24898755
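
    A dynamic transmission model of this general kind can be illustrated with a toy susceptible-latent-active compartment system integrated with scipy; the structure and parameter values below are generic placeholders and are not the published model.

```python
import numpy as np
from scipy.integrate import odeint

def tb_model(y, t, beta, progression, recovery, mortality):
    """Toy susceptible (S), latent (L), active (A) TB compartment model."""
    S, L, A = y
    N = S + L + A
    infections = beta * S * A / N
    dS = -infections
    dL = infections - progression * L
    dA = progression * L - (recovery + mortality) * A
    return [dS, dL, dA]

y0 = [990_000.0, 9_000.0, 1_000.0]        # placeholder initial population split
t = np.linspace(0.0, 5 * 365.0, 1000)     # five years, in days
params = (0.02, 0.001, 0.002, 0.0003)     # beta, progression, recovery, mortality (per day)
S, L, A = odeint(tb_model, y0, t, args=params).T
print("active TB cases after 5 years:", int(A[-1]))
```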

  11. The closer the relationship, the more the interaction on facebook? Investigating the case of Taiwan users.

    PubMed

    Hsu, Chiung-Wen Julia; Wang, Ching-Chan; Tai, Yi-Ting

    2011-01-01

    This study argues for the necessity of applying offline contexts to social networking site research and the importance of distinguishing the relationship types of users' counterparts when studying Facebook users' behaviors. In an attempt to examine the relationship among users' behaviors, their counterparts' relationship types, and the users' perceived acquaintanceships after using Facebook, this study first investigated users' frequently used tools when interacting with different types of friends. Users tended to use less time- and effort-consuming and less privacy-concerned tools with newly acquired friends. This study further examined users' behaviors in terms of their closeness and intimacy and their perceived acquaintanceships toward four different types of friends. The study found that users gained more perceived acquaintanceships from less close friends with whom users have more frequent interaction but less intimate behaviors. As for closer friends, users tended to use more intimate activities to interact with them. However, these activities did not necessarily occur more frequently than the activities they employed with their less close friends. It was found that perceived acquaintanceships with closer friends were significantly lower than those with less close friends. This implies that Facebook is a mechanism for new friends, rather than close friends, to become more acquainted.

  12. AVR Microcontroller-based automated technique for analysis of DC motors

    NASA Astrophysics Data System (ADS)

    Kaur, P.; Chatterji, S.

    2014-01-01

    This paper provides essential information on the development of a 'dc motor test and analysis control card' using an AVR series ATMega32 microcontroller. The card can be interfaced to a PC and calculates parameters such as motor losses and efficiency, and plots characteristics for dc motors. At present, different tests and methods are available to evaluate motor parameters, but this paper discusses a single, universal, user-friendly automated set-up. It has been accomplished by designing data acquisition and SCR bridge firing hardware based on the AVR ATMega32 microcontroller. This hardware has the capability to drive phase-controlled rectifiers and acquire real-time values of motor current, voltage, temperature and speed. The various analyses feasible with the designed hardware are of immense importance for dc motor manufacturers and quality-sensitive users. Through this paper, the authors aim to provide details of this AVR-based hardware, which can be used for dc motor parameter analysis and also for motor control applications.
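
    The loss and efficiency figures the card computes follow from the basic relations P_in = V·I and P_out = T·ω; the measured values in the sketch below are hypothetical.

```python
import math

def motor_analysis(voltage, current, torque, speed_rpm):
    """Input power, output power, total losses and efficiency of a dc motor."""
    p_in = voltage * current                     # electrical input power (W)
    omega = 2.0 * math.pi * speed_rpm / 60.0     # shaft speed (rad/s)
    p_out = torque * omega                       # mechanical output power (W)
    losses = p_in - p_out                        # copper, iron and friction losses combined
    return p_in, p_out, losses, p_out / p_in

# Hypothetical acquired values: 220 V, 8.5 A, 11.2 N*m at 1450 rpm
p_in, p_out, losses, efficiency = motor_analysis(220.0, 8.5, 11.2, 1450.0)
print(f"losses = {losses:.0f} W, efficiency = {efficiency:.1%}")
```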

  13. Time-Motion Analysis of Four Automated Systems for the Detection of Chlamydia trachomatis and Neisseria gonorrhoeae by Nucleic Acid Amplification Testing.

    PubMed

    Williams, James A; Eddleman, Laura; Pantone, Amy; Martinez, Regina; Young, Stephen; Van Der Pol, Barbara

    2014-08-01

    Next-generation diagnostics for Chlamydia trachomatis and Neisseria gonorrhoeae are available on semi- or fully-automated platforms. These systems require less hands-on time than older platforms and are user friendly. Four automated systems, the ABBOTT m2000 system, Becton Dickinson Viper System with XTR Technology, Gen-Probe Tigris DTS system, and Roche cobas 4800 system, were evaluated for total run time, hands-on time, and walk-away time. All of the systems evaluated in this time-motion study were able to complete a diagnostic test run within an 8-h work shift, instrument setup and operation were straightforward and uncomplicated, and walk-away time ranged from approximately 90 to 270 min in a head-to-head comparison of each system. All of the automated systems provide technical staff with increased time to perform other tasks during the run, offer easy expansion of the diagnostic test menu, and have the ability to increase specimen throughput. © 2013 Society for Laboratory Automation and Screening.

  14. Automated Help System For A Supercomputer

    NASA Technical Reports Server (NTRS)

    Callas, George P.; Schulbach, Catherine H.; Younkin, Michael

    1994-01-01

    Expert-system software developed to provide automated system of user-helping displays in supercomputer system at Ames Research Center Advanced Computer Facility. Users located at remote computer terminals connected to supercomputer and each other via gateway computers, local-area networks, telephone lines, and satellite links. Automated help system answers routine user inquiries about how to use services of computer system. Available 24 hours per day and reduces burden on human experts, freeing them to concentrate on helping users with complicated problems.

  15. CellShape: A user-friendly image analysis tool for quantitative visualization of bacterial cell factories inside.

    PubMed

    Goñi-Moreno, Ángel; Kim, Juhyun; de Lorenzo, Víctor

    2017-02-01

    Visualization of the intracellular constituents of individual bacteria while they perform as live biocatalysts is in principle doable through more or less sophisticated fluorescence microscopy. Unfortunately, rigorous quantitation of the wealth of data embodied in the resulting images requires bioinformatic tools that are not widespread within the community, and which are often subject to licensing that impedes software reuse. In this context we have developed CellShape, a user-friendly platform for image analysis with subpixel precision and a double-threshold segmentation system for quantification of fluorescent signals stemming from single cells. CellShape is entirely coded in Python, a free, open-source programming language with widespread community support. For a developer, CellShape enhances extensibility (ease of software improvements) by acting as an interface to access and use existing Python modules; for an end user, CellShape presents standalone executable files ready to open without installation. We have adopted this platform to analyse in unprecedented detail the tridimensional distribution of the constituents of the gene expression flow (DNA, RNA polymerase, mRNA and ribosomal proteins) in individual cells of the industrial platform strain Pseudomonas putida KT2440. While the first release of CellShape (v0.8) is readily operational, users and/or developers can expand the platform further. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
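
    One plausible reading of the double-threshold segmentation mentioned above is hysteresis thresholding: pixels above a high threshold seed objects, which then grow into connected pixels above a lower threshold. The sketch below uses scikit-image on a synthetic image; the thresholds are arbitrary.

```python
import numpy as np
from skimage.filters import apply_hysteresis_threshold

# Synthetic fluorescence image: dim background plus one bright "cell"
rng = np.random.default_rng(0)
image = rng.normal(100, 5, (128, 128))
image[40:70, 50:90] += 80                 # bright region standing in for a cell

low, high = 130, 160                      # hypothetical intensity thresholds
mask = apply_hysteresis_threshold(image, low, high)
print("segmented pixels:", int(mask.sum()))
```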

  16. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    PubMed

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  17. GLINT: a user-friendly toolset for the analysis of high-throughput DNA-methylation array data.

    PubMed

    Rahmani, Elior; Yedidim, Reut; Shenhav, Liat; Schweiger, Regev; Weissbrod, Omer; Zaitlen, Noah; Halperin, Eran

    2017-06-15

    GLINT is a user-friendly command-line toolset for fast analysis of genome-wide DNA methylation data generated using the Illumina human methylation arrays. GLINT, which does not require any programming proficiency, allows an easy execution of Epigenome-Wide Association Study analysis pipeline under different models while accounting for known confounders in methylation data. GLINT is a command-line software, freely available at https://github.com/cozygene/glint/releases . It requires Python 2.7 and several freely available Python packages. Further information and documentation as well as a quick start tutorial are available at http://glint-epigenetics.readthedocs.io . elior.rahmani@gmail.com or ehalperin@cs.ucla.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  18. TIME Impact - a new user-friendly tuberculosis (TB) model to inform TB policy decisions.

    PubMed

    Houben, R M G J; Lalli, M; Sumner, T; Hamilton, M; Pedrazzoli, D; Bonsu, F; Hippner, P; Pillay, Y; Kimerling, M; Ahmedov, S; Pretorius, C; White, R G

    2016-03-24

    Tuberculosis (TB) is the leading cause of death from infectious disease worldwide, predominantly affecting low- and middle-income countries (LMICs), where resources are limited. As such, countries need to be able to choose the most efficient interventions for their respective setting. Mathematical models can be valuable tools to inform rational policy decisions and improve resource allocation, but are often unavailable or inaccessible for LMICs, particularly in TB. We developed TIME Impact, a user-friendly TB model that enables local capacity building and strengthens country-specific policy discussions to inform and support funding applications at the (sub-)national level (e.g. Ministry of Finance) or to international donors (e.g. the Global Fund to Fight AIDS, Tuberculosis and Malaria). TIME Impact is an epidemiological transmission model nested in TIME, a set of TB modelling tools available for free download within the widely used Spectrum software. The TIME Impact model reflects key aspects of the natural history of TB, with additional structure for HIV/ART, drug resistance, treatment history and age. TIME Impact enables national TB programmes (NTPs) and other TB policymakers to better understand their own TB epidemic, plan their response, apply for funding and evaluate the implementation of the response. The explicit aim of TIME Impact's user-friendly interface is to enable training of local and international TB experts towards independent use. During application of TIME Impact, close involvement of the NTPs and other local partners also builds critical understanding of the modelling methods, assumptions and limitations inherent to modelling. This is essential to generate broad country-level ownership of the modelling data inputs and results. In turn, it stimulates discussions and a review of the current evidence and assumptions, strengthening the decision-making process in general. TIME Impact has been effectively applied in a variety of settings. In South Africa, it

  19. Bias Correction of Satellite Precipitation Products (SPPs) using a User-friendly Tool: A Step in Enhancing Technical Capacity

    NASA Astrophysics Data System (ADS)

    Rushi, B. R.; Ellenburg, W. L.; Adams, E. C.; Flores, A.; Limaye, A. S.; Valdés-Pineda, R.; Roy, T.; Valdés, J. B.; Mithieu, F.; Omondi, S.

    2017-12-01

    SERVIR, a joint NASA-USAID initiative, works to build capacity in Earth observation technologies in developing countries for improved environmental decision making in the arenas of weather and climate, water and disasters, food security, and land use/land cover. SERVIR partners with leading regional organizations in Eastern and Southern Africa, the Hindu Kush-Himalaya, the Mekong region, and West Africa to achieve its objectives. SERVIR develops hydrological applications to address specific needs articulated by key stakeholders, and daily rainfall estimates are a vital input for these applications. Satellite-derived rainfall is subject to systematic biases which need to be corrected before it can be used for any hydrologic application such as real-time or seasonal forecasting. SERVIR and the SWAAT team at the University of Arizona have co-developed an open-source and user-friendly tool implementing rainfall bias-correction approaches for SPPs. The bias correction tools were developed based on Linear Scaling and Quantile Mapping techniques. A set of SPPs, such as PERSIANN-CCS, TMPA-RT, and CMORPH, are bias corrected using Climate Hazards Group InfraRed Precipitation with Station (CHIRPS) data, which incorporates ground-based precipitation observations. The tool also contains a component to improve the monthly mean of CHIRPS using precipitation products of the Global Surface Summary of the Day (GSOD) database developed by the National Climatic Data Center (NCDC). The tool takes input from the command line, which makes it user-friendly and applicable on any operating platform without prior programming skills. This presentation will focus on this bias-correction tool for SPPs, including application scenarios.
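
    The two correction techniques named above can be sketched with numpy: linear scaling rescales satellite rainfall by the ratio of observed to satellite means, while quantile mapping replaces each satellite value with the observed value at the same empirical quantile. The arrays below are synthetic, not CHIRPS or SPP data.

```python
import numpy as np

def linear_scaling(sat, obs):
    """Scale satellite rainfall so that its mean matches the reference mean."""
    return sat * (obs.mean() / sat.mean())

def quantile_mapping(sat, obs):
    """Map each satellite value onto the reference distribution at the same quantile."""
    sat_sorted, obs_sorted = np.sort(sat), np.sort(obs)
    quantiles = np.searchsorted(sat_sorted, sat) / (len(sat) - 1)
    return np.interp(quantiles, np.linspace(0, 1, len(obs)), obs_sorted)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, 1000)    # synthetic gauge-merged reference rainfall
sat = rng.gamma(2.0, 7.0, 1000)    # synthetic biased satellite estimate
print(obs.mean(), linear_scaling(sat, obs).mean(), quantile_mapping(sat, obs).mean())
```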

  20. Human Resources Planning for Office Automation.

    ERIC Educational Resources Information Center

    Munoz, Judith T.

    1985-01-01

    Examination of the transition to automated offices revealed that the prospect can be anxiety-producing for potential users, most users did not receive adequate training, lack of training may cause equipment underutilization, administrators are often not prepared to manage the change, and most experienced job satisfaction after automation. (MSE)

  1. Humans: still vital after all these years of automation.

    PubMed

    Parasuraman, Raja; Wickens, Christopher D

    2008-06-01

    The authors discuss empirical studies of human-automation interaction and their implications for automation design. Automation is prevalent in safety-critical systems and increasingly in everyday life. Many studies of human performance in automated systems have been conducted over the past 30 years. Developments in three areas are examined: levels and stages of automation, reliance on and compliance with automation, and adaptive automation. Automation applied to information analysis or decision-making functions leads to differential system performance benefits and costs that must be considered in choosing appropriate levels and stages of automation. Human user dependence on automated alerts and advisories reflects two components of operator trust, reliance and compliance, which are in turn determined by the threshold designers use to balance automation misses and false alarms. Finally, adaptive automation can provide additional benefits in balancing workload and maintaining the user's situation awareness, although more research is required to identify when adaptation should be user controlled or system driven. The past three decades of empirical research on humans and automation has provided a strong science base that can be used to guide the design of automated systems. This research can be applied to most current and future automated systems.

  2. A Method for Automated Detection of Usability Problems from Client User Interface Events

    PubMed Central

    Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.

    2005-01-01

    Think-aloud usability analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121

  3. FRIEND: a brain-monitoring agent for adaptive and assistive systems.

    PubMed

    Morris, Alexis; Ulieru, Mihaela

    2012-01-01

    This paper presents an architectural design for adaptive-systems agents (FRIEND) that use brain state information to make more effective decisions on behalf of a user, measuring brain context against situational demands. These systems could be useful for alerting users to cognitive workload levels or fatigue, and could attempt to compensate for higher cognitive activity by filtering noise information. In some cases such systems could also share control of devices, for example pulling over in an automated vehicle. They aim to help people in everyday systems perform tasks better and be more aware of their internal states. Achieving a functioning system of this sort is a challenge, involving a unification of the brain-computer interface, human-computer interaction, soft computing, and deliberative multi-agent systems disciplines. Until recently, these could not be combined into a usable platform, due largely to technological limitations (e.g., size, cost, and processing speed), insufficient research on extracting behavioral states from EEG signals, and the lack of low-cost wireless sensing headsets. We aim to surpass these limitations and develop control architectures for making sense of brain state in applications by realizing an agent architecture for adaptive (human-aware) technology. In this paper we present an early, high-level design towards implementing a multi-purpose brain-monitoring agent system to improve user quality of life through the assistive applications of psycho-physiological monitoring, noise filtering, and shared system control.

  4. User Friendly Real Time Display

    NASA Astrophysics Data System (ADS)

    McCarthy, Denise M.; McCracken, Bill

    1989-02-01

    Real-time viewing of high resolution infrared line scan reconnaissance imagery is greatly facilitated using Honeywell's Real Time Display in conjunction with a D-500 Infrared Reconnaissance System. The Real-Time Display (RTD) provides the capability of on-board review of high resolution infrared imagery using the wide infrared dynamic range of the D-500 infrared receiver to maximum advantage. The scan converter accepts, processes, and displays imagery from four channels of the IR Receiver after formatting by a multiplexer. The scan converter interfaces with a standard RS-170 video monitor. Detailed review and on-board analysis of infrared reconnaissance imagery stored on a videotape is easily accomplished using the many user-friendly features of the RTD. Using a convenient joystick controller, on-screen mode menus, and a moveable cursor, the operator can examine scenes of interest at four different display magnifications using a four step bidirectional zoom. Imagery areas of interest are first noted using the scrolling wide field display mode at 8x reduced display resolution. On noting an area of interest, the imagery can be marked on the tape record for future recovery and a freeze frame mode can be initiated. The operator can then move the cursor to the area of interest and zoom to higher display magnification for 4x, 2x, and 1x display resolutions so that the full 4096 x 4096 pixel infrared frame can be matched to the 512 x 512 pixel display frame. At 8x wide field display magnification the full line scanner field of view is displayed at 8x reduced resolution. There are two selectable modes of obtaining this reduced resolution. The operator can use the default method, which averages the signal from an 8 x 8 pixel group, or it is also possible to select the peak signal of the 8 x 8 pixel block to represent the entire block on the display. In this alternate peak-signal display the wide field can be effectively scanned for hot objects which are more likely to be

  5. PHYSICO2: an UNIX based standalone procedure for computation of physicochemical, window-dependent and substitution based evolutionary properties of protein sequences along with automated block preparation tool, version 2.

    PubMed

    Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K

    2015-01-01

    Automated genome sequencing procedures are enriching the sequence database very fast. To achieve a balance between the entry of sequences into the database and their analysis, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public-domain tools, is most efficient in that it i] extracts physicochemical, window-dependent and homologous-position-based substitution (PWS) properties, including positional and BLOCK-specific diversity and conservation, ii] provides users with optional flexibility in setting relevant input parameters, iii] helps users to prepare the BLOCK-FASTA file by use of the Automated Block Preparation Tool of the program, iv] performs fast, accurate and user-friendly analyses and v] redirects itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of methods. Overall the program acts as an efficient PWS-analyzer and finds application in sequence bioinformatics. PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users.

  6. PHYSICO2: an UNIX based standalone procedure for computation of physicochemical, window-dependent and substitution based evolutionary properties of protein sequences along with automated block preparation tool, version 2

    PubMed Central

    Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K

    2015-01-01

    Automated genome sequencing procedures are enriching the sequence database very fast. To achieve a balance between the entry of sequences into the database and their analysis, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public-domain tools, is most efficient in that it i] extracts physicochemical, window-dependent and homologous-position-based substitution (PWS) properties, including positional and BLOCK-specific diversity and conservation, ii] provides users with optional flexibility in setting relevant input parameters, iii] helps users to prepare the BLOCK-FASTA file by use of the Automated Block Preparation Tool of the program, iv] performs fast, accurate and user-friendly analyses and v] redirects itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of methods. Overall the program acts as an efficient PWS-analyzer and finds application in sequence bioinformatics. Availability: PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users. PMID:26339154

  7. Development of a user-friendly delivery method for the fungus Metarhizium anisopliae to control the ectoparasitic mite Varroa destructor in honey bee, Apis mellifera, colonies

    USDA-ARS?s Scientific Manuscript database

    A user-friendly method to deliver Metarhizium spores to honey bee colonies for control of Varroa mites was developed and tested. Patty blend formulations protected the fungal spores at brood nest temperatures and served as an improved delivery system of the fungus to bee hives. Field trials conducte...

  8. Proteomics to go: Proteomatic enables the user-friendly creation of versatile MS/MS data evaluation workflows.

    PubMed

    Specht, Michael; Kuhlgert, Sebastian; Fufezan, Christian; Hippler, Michael

    2011-04-15

    We present Proteomatic, an operating system independent and user-friendly platform that enables the construction and execution of MS/MS data evaluation pipelines using free and commercial software. Required external programs, such as those for peptide identification, are downloaded automatically in the case of free software. Due to a strict separation of functionality and presentation, and support for multiple scripting languages, new processing steps can be added easily. Proteomatic is implemented in C++/Qt; scripts are implemented in Ruby, Python and PHP. All source code is released under the LGPL. Source code and installers for Windows, Mac OS X, and Linux are freely available at http://www.proteomatic.org. Contact: michael.specht@uni-muenster.de. Supplementary data are available at Bioinformatics online.

  9. EDENetworks: a user-friendly software to build and analyse networks in biogeography, ecology and population genetics.

    PubMed

    Kivelä, Mikko; Arnaud-Haond, Sophie; Saramäki, Jari

    2015-01-01

    The recent application of graph-based network theory analysis to biogeography, community ecology and population genetics has created a need for user-friendly software, which would allow a wider accessibility to and adaptation of these methods. EDENetworks aims to fill this void by providing an easy-to-use interface for the whole analysis pipeline of ecological and evolutionary networks starting from matrices of species distributions, genotypes, bacterial OTUs or populations characterized genetically. The user can choose between several different ecological distance metrics, such as Bray-Curtis or Sorensen distance, or population genetic metrics such as FST or Goldstein distances, to turn the raw data into a distance/dissimilarity matrix. This matrix is then transformed into a network by manual or automatic thresholding based on percolation theory or by building the minimum spanning tree. The networks can be visualized along with auxiliary data and analysed with various metrics such as degree, clustering coefficient, assortativity and betweenness centrality. The statistical significance of the results can be estimated either by resampling the original biological data or by null models based on permutations of the data. © 2014 John Wiley & Sons Ltd.
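
    As a concrete illustration of the thresholding step described above, the short sketch below builds a network from a pairwise distance matrix and computes a few of the listed metrics with the networkx package. This is a generic illustration, not EDENetworks code; the threshold here is chosen manually rather than by percolation analysis, and the distance matrix is synthetic.

    ```python
    # Build a network by keeping edges whose distance is at or below a threshold,
    # then compute degree, clustering and betweenness (generic sketch, not EDENetworks).
    import numpy as np
    import networkx as nx

    def network_from_distances(dist, threshold):
        g = nx.Graph()
        g.add_nodes_from(range(dist.shape[0]))
        for i in range(dist.shape[0]):
            for j in range(i + 1, dist.shape[0]):
                if dist[i, j] <= threshold:
                    g.add_edge(i, j, weight=dist[i, j])
        return g

    rng = np.random.default_rng(0)
    dist = rng.random((20, 20))
    dist = (dist + dist.T) / 2          # symmetrize the toy distance matrix
    np.fill_diagonal(dist, 0.0)
    g = network_from_distances(dist, threshold=0.3)
    print(dict(g.degree()), nx.average_clustering(g), nx.betweenness_centrality(g))
    ```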

  10. A user-friendly, low-cost turbidostat with versatile growth rate estimation based on an extended Kalman filter.

    PubMed

    Hoffmann, Stefan A; Wohltat, Christian; Müller, Kristian M; Arndt, Katja M

    2017-01-01

    For various experimental applications, microbial cultures at defined, constant densities are highly advantageous over simple batch cultures. Due to high costs, however, devices for continuous culture at freely defined densities still experience limited use. We have developed a small-scale turbidostat for research purposes, which is manufactured from inexpensive components and 3D printed parts. A high degree of spatial system integration and a graphical user interface provide user-friendly operability. The optical density feedback control employed allows for constant continuous culture over a wide range of densities and lets culture volume and dilution rate be varied without additional parametrization. Further, a recursive algorithm for on-line growth rate estimation has been implemented. The Kalman filtering approach, based on a very general state model, retains the flexibility of the chosen control scheme and can easily be adapted to other bioreactor designs. Within several minutes it can converge to robust, accurate growth rate estimates. This is particularly useful for directed evolution experiments or studies on metabolic challenges, as it allows direct monitoring of the population fitness.
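
    The growth rate estimator is described above only in general terms; the sketch below shows what an extended Kalman filter for this task can look like under a simple exponential-growth state model (state = optical density and specific growth rate, measurement = optical density). The noise covariances and time step are assumed values for illustration, not the published tuning.

    ```python
    # Generic EKF sketch for on-line growth rate estimation from optical density (OD)
    # readings; state model: OD_{k+1} = OD_k * exp(mu_k * dt), mu_{k+1} = mu_k.
    import numpy as np

    class GrowthRateEKF:
        def __init__(self, od0, mu0=0.0, dt=10.0):
            self.x = np.array([od0, mu0])      # state: [OD, growth rate (1/s)]
            self.P = np.diag([1e-2, 1e-4])     # state covariance (assumed)
            self.Q = np.diag([1e-6, 1e-8])     # process noise (assumed)
            self.R = 1e-4                      # measurement noise variance (assumed)
            self.dt = dt

        def step(self, od_measured):
            od, mu = self.x
            growth = np.exp(mu * self.dt)
            x_pred = np.array([od * growth, mu])             # predicted state
            F = np.array([[growth, od * self.dt * growth],   # Jacobian of the model
                          [0.0, 1.0]])
            P_pred = F @ self.P @ F.T + self.Q
            H = np.array([1.0, 0.0])                         # only OD is measured
            S = H @ P_pred @ H + self.R
            K = P_pred @ H / S                               # Kalman gain
            self.x = x_pred + K * (od_measured - x_pred[0])
            self.P = (np.eye(2) - np.outer(K, H)) @ P_pred
            return self.x[1]                                 # current growth rate estimate
    ```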

  11. STORMSeq: An Open-Source, User-Friendly Pipeline for Processing Personal Genomics Data in the Cloud

    PubMed Central

    Karczewski, Konrad J.; Fernald, Guy Haskin; Martin, Alicia R.; Snyder, Michael; Tatonetti, Nicholas P.; Dudley, Joel T.

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5–10 hours to process a full exome sequence and $30 and 3–8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2. PMID:24454756

  12. DXRaySMCS: a user-friendly interface developed for prediction of diagnostic radiology X-ray spectra produced by Monte Carlo (MCNP-4C) simulation.

    PubMed

    Bahreyni Toossi, M T; Moradi, H; Zare, H

    2008-01-01

    In this work, the general purpose Monte Carlo N-particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78. Generally, there is a good agreement between the two sets of spectra. No statistically significant differences have been observed between IPEM78 reported spectra and the simulated spectra generated in this study.

  13. Field performance of a low-cost and fully-automated blood counting system operated by trained and untrained users (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Xie, Dengling; Xie, Yanjun; Liu, Peng; Tong, Lieshu; Chu, Kaiqin; Smith, Zachary J.

    2017-02-01

    Current flow-based blood counting devices require expensive and centralized medical infrastructure and are not appropriate for field use. In this paper we report a method to count red blood cells, white blood cells as well as platelets through a low-cost and fully-automated blood counting system. The approach consists of using a compact, custom-built microscope with large field-of-view to record bright-field and fluorescence images of samples that are diluted with a single, stable reagent mixture and counted using automatic algorithms. Sample collection is performed manually using a spring loaded lancet, and volume-metering capillary tubes. The capillaries are then dropped into a tube of pre-measured reagents and gently shaken for 10-30 seconds. The sample is loaded into a measurement chamber and placed on a custom 3D printed platform. Sample translation and focusing is fully automated, and a user has only to press a button for the measurement and analysis to commence. Cost of the system is minimized through the use of custom-designed motorized components. We performed a series of comparative experiments by trained and untrained users on blood from adults and children. We compare the performance of our system, as operated by trained and untrained users, to the clinical gold standard using a Bland-Altman analysis, demonstrating good agreement of our system to the clinical standard. The system's low cost, complete automation, and good field performance indicate that it can be successfully translated for use in low-resource settings where central hematology laboratories are not accessible.

  14. Development of interactive graphic user interfaces for modeling reaction-based biogeochemical processes in batch systems with BIOGEOCHEM

    NASA Astrophysics Data System (ADS)

    Chang, C.; Li, M.; Yeh, G.

    2010-12-01

    The BIOGEOCHEM numerical model (Yeh and Fang, 2002; Fang et al., 2003) was developed with FORTRAN for simulating reaction-based geochemical and biochemical processes with mixed equilibrium and kinetic reactions in batch systems. A complete suite of reactions including aqueous complexation, adsorption/desorption, ion-exchange, redox, precipitation/dissolution, acid-base reactions, and microbial mediated reactions were embodied in this unique modeling tool. Any reaction can be treated as a fast/equilibrium or slow/kinetic reaction. An equilibrium reaction is modeled with an implicit finite rate governed by a mass action equilibrium equation or by a user-specified algebraic equation. A kinetic reaction is modeled with an explicit finite rate with an elementary rate, microbial mediated enzymatic kinetics, or a user-specified rate equation. None of the existing models has encompassed this wide array of scopes. To ease the input/output learning curve in using the unique features of BIOGEOCHEM, an interactive graphic user interface was developed with the Microsoft Visual Studio and .Net tools. Several robust, user-friendly features, such as pop-up help windows, typo warning messages, and on-screen input hints, were implemented. All input data can be viewed in real time and are automatically formatted to conform to the BIOGEOCHEM input file format. A post-processor for graphic visualization of simulated results was also embedded for immediate demonstrations. By following the data input windows step by step, error-free BIOGEOCHEM input files can be created even by users with little prior experience in FORTRAN. With this user-friendly interface, the time and effort needed to conduct simulations with BIOGEOCHEM can be greatly reduced.

  15. User-friendly freehand ultrasound calibration using Lego bricks and automatic registration.

    PubMed

    Xiao, Yiming; Yan, Charles Xiao Bo; Drouin, Simon; De Nigris, Dante; Kochanowska, Anna; Collins, D Louis

    2016-09-01

    As an inexpensive, noninvasive, and portable clinical imaging modality, ultrasound (US) has been widely employed in many interventional procedures for monitoring potential tissue deformation, surgical tool placement, and locating surgical targets. The application requires the spatial mapping between 2D US images and 3D coordinates of the patient. Although positions of the devices (i.e., ultrasound transducer) and the patient can be easily recorded by a motion tracking system, the spatial relationship between the US image and the tracker attached to the US transducer needs to be estimated through an US calibration procedure. Previously, various calibration techniques have been proposed, where a spatial transformation is computed to match the coordinates of corresponding features in a physical phantom and those seen in the US scans. However, most of these methods are difficult to use for novice users. We proposed an ultrasound calibration method by constructing a phantom from simple Lego bricks and applying an automated multi-slice 2D-3D registration scheme without volumetric reconstruction. The method was validated for its calibration accuracy and reproducibility. Our method yields a calibration accuracy of [Formula: see text] mm and a calibration reproducibility of 1.29 mm. We have proposed a robust, inexpensive, and easy-to-use ultrasound calibration method.

  16. SP-Designer: a user-friendly program for designing species-specific primer pairs from DNA sequence alignments.

    PubMed

    Villard, Pierre; Malausa, Thibaut

    2013-07-01

    SP-Designer is an open-source program providing a user-friendly tool for the design of specific PCR primer pairs from a DNA sequence alignment containing sequences from various taxa. SP-Designer selects PCR primer pairs for the amplification of DNA from a target species on the basis of several criteria: (i) primer specificity, as assessed by interspecific sequence polymorphism in the annealing regions, (ii) the biochemical characteristics of the primers and (iii) the intended PCR conditions. SP-Designer generates tables, detailing the primer pair and PCR characteristics, and a FASTA file locating the primer sequences in the original sequence alignment. SP-Designer is Windows-compatible and freely available from http://www2.sophia.inra.fr/urih/sophia_mart/sp_designer/info_sp_designer.php. © 2013 John Wiley & Sons Ltd.
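
    The "biochemical characteristics" criterion typically includes quantities such as GC content and an estimated melting temperature. The toy sketch below computes both for a candidate primer using the Wallace rule, a rule of thumb valid only for short oligonucleotides; SP-Designer's actual scoring may use different formulas.

    ```python
    # Toy primer checks: GC content and Wallace-rule melting temperature estimate
    # (illustrative only; not necessarily the formulas used by SP-Designer).
    def gc_content(primer):
        primer = primer.upper()
        return (primer.count("G") + primer.count("C")) / len(primer)

    def wallace_tm(primer):
        primer = primer.upper()
        at = primer.count("A") + primer.count("T")
        gc = primer.count("G") + primer.count("C")
        return 2 * at + 4 * gc  # degrees Celsius, rule of thumb for short primers

    primer = "ATGCGTACGTTAGCAGGCAT"
    print(f"GC = {gc_content(primer):.0%}, Tm ~ {wallace_tm(primer)} C")
    ```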

  17. Automating spectral measurements

    NASA Astrophysics Data System (ADS)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.
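
    As a rough illustration of the client side of such an architecture, the sketch below drives a Windows COM/ActiveX automation server from Python using pywin32. The ProgID, properties and methods shown are hypothetical placeholders, since the actual interface exposed by the FTG application (and its spectro-BASIC language) is not documented here.

    ```python
    # Hypothetical ActiveX automation client via pywin32; the ProgID "FTG.SpectroDAQ"
    # and the Scan/Visible members are placeholders, not a documented API.
    import win32com.client

    daq = win32com.client.Dispatch("FTG.SpectroDAQ")  # hypothetical ProgID
    daq.Visible = False                               # hypothetical property
    spectrum = daq.Scan(400, 700, 1)                  # hypothetical method: start, stop, step (nm)
    for wavelength, value in spectrum:
        print(wavelength, value)
    ```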

  18. A user-friendly application for the extraction of kubios hrv output to an optimal format for statistical analysis - biomed 2011.

    PubMed

    Johnsen Lind, Andreas; Helge Johnsen, Bjorn; Hill, Labarron K; Sollers Iii, John J; Thayer, Julian F

    2011-01-01

    The aim of the present manuscript is to present a user-friendly and flexible platform for transforming Kubios HRV output files to the .xls file format used by MS Excel. The program utilizes either native or bundled Java and is platform-independent and mobile. This means that it can run without being installed on a computer. It also has an option for continuous transfer of data, meaning that it can run in the background while Kubios produces output files. The program checks for changes in the file structure and automatically updates the .xls output file.
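
    The tool described above is implemented in Java; purely as an illustration of the same idea, the sketch below gathers text result files from a folder and writes them to a single Excel workbook with pandas. The column layout of the input files is an assumption (a whitespace-delimited table with a header row), so real Kubios output would likely need format-specific parsing.

    ```python
    # Illustrative conversion of delimited text result files to one .xlsx workbook
    # (assumed input layout; not a parser for the actual Kubios HRV format).
    from pathlib import Path
    import pandas as pd

    def convert(folder, out_xlsx="hrv_results.xlsx"):
        frames = []
        for txt in sorted(Path(folder).glob("*.txt")):
            table = pd.read_csv(txt, sep=r"\s+")     # assumption: whitespace-delimited table
            frames.append(table.assign(source=txt.name))
        pd.concat(frames, ignore_index=True).to_excel(out_xlsx, index=False)

    convert("kubios_output")
    ```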

  19. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The automated engineering design (AED) is reviewed, consisting of a high level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem and user-oriented languages. Software production phases are diagramed, and factors which inhibit effective documentation are evaluated.

  20. ACME, a GIS tool for Automated Cirque Metric Extraction

    NASA Astrophysics Data System (ADS)

    Spagnolo, Matteo; Pellitero, Ramon; Barr, Iestyn D.; Ely, Jeremy C.; Pellicer, Xavier M.; Rea, Brice R.

    2017-02-01

    Regional scale studies of glacial cirque metrics provide key insights on the (palaeo) environment related to the formation of these erosional landforms. The growing availability of high resolution terrain models means that more glacial cirques can be identified and mapped in the future. However, the extraction of their metrics still largely relies on time consuming manual techniques or the combination of more or less obsolete GIS tools. In this paper, a newly coded toolbox is provided for the automated, and comparatively quick, extraction of 16 key glacial cirque metrics, including length, width, circularity, planar and 3D area, elevation, slope, aspect, plan closure and hypsometry. The set of tools, named ACME (Automated Cirque Metric Extraction), is coded in Python, runs in one of the most commonly used GIS packages (ArcGIS) and has a user-friendly interface. A polygon layer of mapped cirques is required for all metrics, while a Digital Terrain Model and a point layer of cirque threshold midpoints are needed to run some of the tools. Results from ACME are comparable to those from other techniques and can be obtained rapidly, allowing large cirque datasets to be analysed and potentially important regional trends highlighted.
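
    For a flavour of the simpler metrics listed above, the sketch below computes planar area, perimeter and circularity for a mapped cirque polygon with shapely. Circularity is taken here as 4*pi*A/P^2, one common convention; ACME documents its own exact definitions, and the polygon coordinates are invented for illustration.

    ```python
    # Planar area, perimeter and circularity of a cirque outline (illustrative only;
    # ACME is an ArcGIS/Python toolbox with its own documented metric definitions).
    import math
    from shapely.geometry import Polygon

    cirque = Polygon([(0, 0), (400, 0), (500, 300), (250, 450), (-50, 300)])  # toy outline (m)

    area = cirque.area           # planar area, map units squared
    perimeter = cirque.length    # polygon perimeter
    circularity = 4 * math.pi * area / perimeter ** 2
    print(area, perimeter, round(circularity, 3))
    ```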

  1. Towards automation of user interface design

    NASA Technical Reports Server (NTRS)

    Gastner, Rainer; Kraetzschmar, Gerhard K.; Lutz, Ernst

    1992-01-01

    This paper suggests an approach to automatic software design in the domain of graphical user interfaces. There are still some drawbacks in existing user interface management systems (UIMS's) which basically offer only quantitative layout specifications via direct manipulation. Our approach suggests a convenient way to get a default graphical user interface which may be customized and redesigned easily in further prototyping cycles.

  2. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  3. A user-friendly workflow for analysis of Illumina gene expression bead array data available at the arrayanalysis.org portal.

    PubMed

    Eijssen, Lars M T; Goelela, Varshna S; Kelder, Thomas; Adriaens, Michiel E; Evelo, Chris T; Radonjic, Marijana

    2015-06-30

    Illumina whole-genome expression bead arrays are a widely used platform for transcriptomics. Most of the tools available for the analysis of the resulting data are not easily applicable by less experienced users. ArrayAnalysis.org provides researchers with an easy-to-use and comprehensive interface to the functionality of R and Bioconductor packages for microarray data analysis. As a modular open source project, it allows developers to contribute modules that provide support for additional types of data or extend workflows. To enable data analysis of Illumina bead arrays for a broad user community, we have developed a module for ArrayAnalysis.org that provides a free and user-friendly web interface for quality control and pre-processing for these arrays. This module can be used together with existing modules for statistical and pathway analysis to provide a full workflow for Illumina gene expression data analysis. The module accepts data exported from Illumina's GenomeStudio, and provides the user with quality control plots and normalized data. The outputs are directly linked to the existing statistics module of ArrayAnalysis.org, but can also be downloaded for further downstream analysis in third-party tools. The Illumina bead arrays analysis module is available at http://www.arrayanalysis.org . A user guide, a tutorial demonstrating the analysis of an example dataset, and R scripts are available. The module can be used as a starting point for statistical evaluation and pathway analysis provided on the website or to generate processed input data for a broad range of applications in life sciences research.

  4. Best Friends: Alliances, Friend Ranking, and the MySpace Social Network.

    PubMed

    DeScioli, Peter; Kurzban, Robert; Koch, Elizabeth N; Liben-Nowell, David

    2011-01-01

    Like many topics of psychological research, the explanation for friendship is at once intuitive and difficult to address empirically. These difficulties worsen when one seeks, as we do, to go beyond "obvious" explanations ("humans are social creatures") to ask deeper questions, such as "What is the evolved function of human friendship?" In recent years, however, a new window into human behavior has opened as a growing fraction of people's social activity has moved online, leaving a wealth of digital traces behind. One example is a feature of the MySpace social network that allows millions of users to rank their "Top Friends." In this study, we collected over 10 million people's friendship decisions from MySpace to test predictions made by hypotheses about human friendship. We found particular support for the alliance hypothesis, which holds that human friendship is caused by cognitive systems that function to create alliances for potential disputes. Because an ally's support can be undermined by a stronger outside relationship, the alliance model predicts that people will prefer partners who rank them above other friends. Consistent with the alliance model, we found that an individual's choice of best friend in MySpace is strongly predicted by how partners rank that individual. © The Author(s) 2011.

  5. Logistics Automation Master Plan (LAMP). Better Logistics Support through Automation.

    DTIC Science & Technology

    1983-06-01

    office micro-computers, positioned throughout the command chain, by providing real time links between LCA and all users: 2. Goals: Assist HQDA staff in...field i.e., Airland Battle 2000. IV-27 Section V: CONCEPT OF EXECUTION Supply (Retail) A. System Description. I. The Division Logistics Property Book...7. Divisional Direct Support Unit Automated Supply System (DDASS)/Direct Support Level Supply Automation (DLSA). DDASS and DLSA are system development

  6. A User-Friendly Model for Spray Drying to Aid Pharmaceutical Product Development

    PubMed Central

    Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L. J.; Frijlink, Henderik W.

    2013-01-01

    The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach. PMID:24040240
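
    The kind of spreadsheet model described above ultimately rests on mass and energy balances over the drying air. The sketch below shows a deliberately simplified version of such a balance (outlet temperature from the heat needed to evaporate the water in the feed); it neglects heat losses and sensible heating of the feed and is not the published model, so the parameter values and result are illustrative only.

    ```python
    # Simplified spray dryer energy balance (illustration, not the published model):
    # the drying air cools by the amount of heat required to evaporate the feed water.
    def outlet_temperature(t_in_c, air_flow_kg_h, feed_flow_g_min, solids_fraction,
                           cp_air=1.0e3, dh_vap=2.26e6):
        water_kg_h = feed_flow_g_min * 60 / 1000 * (1 - solids_fraction)
        evaporation_w = water_kg_h / 3600 * dh_vap        # heat flow used for evaporation (W)
        air_heat_w_per_k = air_flow_kg_h / 3600 * cp_air  # heat flow per kelvin of air cooling
        return t_in_c - evaporation_w / air_heat_w_per_k

    # assumed, roughly bench-scale settings for illustration
    print(outlet_temperature(t_in_c=120, air_flow_kg_h=35, feed_flow_g_min=5,
                             solids_fraction=0.10))
    ```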

  7. A user-friendly model for spray drying to aid pharmaceutical product development.

    PubMed

    Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L J; Frijlink, Henderik W

    2013-01-01

    The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach.

  8. Automated diagnostic kiosk for diagnosing diseases

    DOEpatents

    Regan, John Frederick; Birch, James Michael

    2014-02-11

    An automated and autonomous diagnostic apparatus that is capable of dispensing collection vials and collection kits to users interested in collecting a biological sample and submitting their collected sample contained within a collection vial into the apparatus for automated diagnostic services. The user communicates with the apparatus through a touch-screen monitor. A user is able to enter personal information into the apparatus, including medical history, insurance information, and co-payment, and to answer a series of questions regarding their illness, which is used to determine the assay most likely to yield a positive result. Remotely-located physicians can communicate with users of the apparatus using video telemedicine and request specific assays to be performed. The apparatus archives submitted samples for additional testing. Users may receive their assay results electronically. Users may allow the uploading of their diagnoses into a central databank for disease surveillance purposes.

  9. Heuristic automation for decluttering tactical displays.

    PubMed

    St John, Mark; Smallman, Harvey S; Manes, Daniel I; Feher, Bela A; Morrison, Jeffrey G

    2005-01-01

    Tactical displays can quickly become cluttered with large numbers of symbols that can compromise effective monitoring. Here, we studied how heuristic automation can aid users by intelligently "decluttering" the display. In a realistic simulated naval air defense task, 27 experienced U.S. Navy users monitored a cluttered airspace and executed defensive responses against significant threats. An algorithm continuously evaluated aircraft for their levels of threat and decluttered the less threatening ones by dimming their symbols. Users appropriately distrusted and spot-checked the automation's assessments, and decluttering had very little effect on which aircraft were judged as significantly threatening. Nonetheless, decluttering improved the timeliness of responses to threatening aircraft by 25% as compared with a baseline display with no decluttering; it was especially beneficial for threats in more peripheral locations, and 25 of 27 participants preferred decluttering. Heuristic automation, when properly designed to guide users' attention by decluttering less important objects, may prove valuable in many cluttered monitoring situations, including air traffic management, crisis team management, and tactical situation awareness in general.

  10. StructRNAfinder: an automated pipeline and web server for RNA families prediction.

    PubMed

    Arias-Carrasco, Raúl; Vásquez-Morán, Yessenia; Nakaya, Helder I; Maracaja-Coutinho, Vinicius

    2018-02-17

    The function of many noncoding RNAs (ncRNAs) depends upon their secondary structures. Over the last few decades, several methodologies have been developed to predict such structures or to use them to functionally annotate RNAs into RNA families. However, to fully perform this analysis, researchers have to utilize multiple tools, which requires the constant parsing and processing of several intermediate files. This makes the large-scale prediction and annotation of RNAs a daunting task even for researchers with good computational or bioinformatics skills. We present an automated pipeline named StructRNAfinder that predicts and annotates RNA families in transcript or genome sequences. This single tool not only displays the sequence/structural consensus alignments for each RNA family, according to the Rfam database, but also provides a taxonomic overview for each assigned functional RNA. Moreover, we implemented a user-friendly web service that allows researchers to upload their own nucleotide sequences in order to perform the whole analysis. Finally, we provide a stand-alone version of StructRNAfinder to be used in large-scale projects. The tool was developed under the GNU General Public License (GPLv3) and is freely available at http://structrnafinder.integrativebioinformatics.me . The main advantage of StructRNAfinder lies in its large-scale processing and integration of the data obtained by each tool and database employed along the workflow; the several files generated are displayed in user-friendly reports, useful for downstream analyses and data exploration.

  11. A user-friendly mathematical modelling web interface to assist local decision making in the fight against drug-resistant tuberculosis.

    PubMed

    Ragonnet, Romain; Trauer, James M; Denholm, Justin T; Marais, Ben J; McBryde, Emma S

    2017-05-30

    Multidrug-resistant and rifampicin-resistant tuberculosis (MDR/RR-TB) represent an important challenge for global tuberculosis (TB) control. The high rates of MDR/RR-TB observed among re-treatment cases can arise from diverse pathways: de novo amplification during initial treatment, inappropriate treatment of undiagnosed MDR/RR-TB, relapse despite appropriate treatment, or reinfection with MDR/RR-TB. Mathematical modelling allows quantification of the contribution made by these pathways in different settings. This information provides valuable insights for TB policy-makers, allowing better contextualised solutions. However, mathematical modelling outputs need to consider local data and be easily accessible to decision makers in order to improve their usefulness. We present a user-friendly web-based modelling interface, which can be used by people without technical knowledge. Users can input their own parameter values and produce estimates for their specific setting. This innovative tool provides easy access to mathematical modelling outputs that are highly relevant to national TB control programs. In future, the same approach could be applied to a variety of modelling applications, enhancing local decision making.

  12. Real-Time fMRI Pattern Decoding and Neurofeedback Using FRIEND: An FSL-Integrated BCI Toolbox

    PubMed Central

    Sato, João R.; Basilio, Rodrigo; Paiva, Fernando F.; Garrido, Griselda J.; Bramati, Ivanei E.; Bado, Patricia; Tovar-Moll, Fernanda; Zahn, Roland; Moll, Jorge

    2013-01-01

    The demonstration that humans can learn to modulate their own brain activity based on feedback of neurophysiological signals opened up exciting opportunities for fundamental and applied neuroscience. Although EEG-based neurofeedback has been long employed both in experimental and clinical investigation, functional MRI (fMRI)-based neurofeedback emerged as a promising method, given its superior spatial resolution and ability to gauge deep cortical and subcortical brain regions. In combination with improved computational approaches, such as pattern recognition analysis (e.g., Support Vector Machines, SVM), fMRI neurofeedback and brain decoding represent key innovations in the field of neuromodulation and functional plasticity. Expansion in this field and its applications critically depend on the existence of freely available, integrated and user-friendly tools for the neuroimaging research community. Here, we introduce FRIEND, a graphic-oriented user-friendly interface package for fMRI neurofeedback and real-time multivoxel pattern decoding. The package integrates routines for image preprocessing in real-time, ROI-based feedback (single-ROI BOLD level and functional connectivity) and brain decoding-based feedback using SVM. FRIEND delivers an intuitive graphic interface with flexible processing pipelines involving optimized procedures embedding widely validated packages, such as FSL and libSVM. In addition, a user-defined visual neurofeedback module allows users to easily design and run fMRI neurofeedback experiments using ROI-based or multivariate classification approaches. FRIEND is open-source and free for non-commercial use. Processing tutorials and extensive documentation are available. PMID:24312569

  13. A user friendly interface for microwave tomography enhanced GPR surveys

    NASA Astrophysics Data System (ADS)

    Catapano, Ilaria; Affinito, Antonio; Soldovieri, Francesco

    2013-04-01

    Ground Penetrating Radar (GPR) systems are nowadays widely used in civil applications, among which structural monitoring is one of the most critical issues owing to its importance in terms of risk prevention and cost-effective management of the structure itself. Although GPR systems are well-established devices, there is continuous interest in their optimization, which involves both hardware and software aspects, with the common final goal of achieving accurate and highly informative images while keeping the difficulty and time involved in field surveys as low as possible. As far as data processing is concerned, one of the key aims is the development of imaging approaches capable of providing images easily interpretable by non-expert users while keeping the requirements in terms of computational resources feasible. To satisfy this request, or at least improve the reconstruction capabilities of the data processing tools currently available in commercial GPR systems, microwave tomographic approaches based on the Born approximation have been developed and tested in several practical conditions, such as civil and archaeological investigations, sub-service monitoring, security surveys and so on [1-3]. However, the adoption of these approaches requires the involvement of expert operators, who have to be capable of properly managing the gathered data and their processing, which involves the solution of a linear inverse scattering problem. In order to overcome this drawback, the aim of this contribution is to present an end-user-friendly software interface that makes possible a simple management of the microwave tomographic approaches. In particular, the proposed interface allows users to upload both synthetic and experimental data sets saved in .txt, .dt and .dt1 formats, to perform all the steps needed to obtain tomographic images and to display raw radargrams, intermediate and final results. By means of the interface, the users can apply time gating, background removal or both to
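
    Two of the pre-processing steps mentioned at the end of the abstract, time gating and background removal, are commonly implemented as shown in the sketch below (a generic illustration; the interface described above wraps its own implementations). The array layout, samples by traces, is an assumption.

    ```python
    # Generic GPR pre-processing sketch: mean-trace background removal and a simple
    # time gate that zeroes samples outside a two-way travel time window of interest.
    import numpy as np

    def background_removal(radargram):
        """Subtract the mean trace from every trace of a (n_samples, n_traces) array."""
        return radargram - radargram.mean(axis=1, keepdims=True)

    def time_gate(radargram, dt_ns, t_start_ns, t_end_ns):
        """Zero all samples outside the [t_start_ns, t_end_ns] time window."""
        t = np.arange(radargram.shape[0]) * dt_ns
        gated = radargram.copy()
        gated[(t < t_start_ns) | (t > t_end_ns), :] = 0.0
        return gated

    data = np.random.randn(512, 200)   # synthetic radargram, samples x traces
    clean = time_gate(background_removal(data), dt_ns=0.1, t_start_ns=5.0, t_end_ns=40.0)
    ```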

  14. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift

    PubMed Central

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-01-01

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable efforts in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives. PMID:26703606

  15. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift.

    PubMed

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-12-19

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable efforts in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives.

  16. Automation and decision support in interactive consumer products.

    PubMed

    Sauer, J; Rüttinger, B

    2007-06-01

    This article presents two empirical studies (n = 30, n = 48) that are concerned with different forms of automation in interactive consumer products. The goal of the studies was to evaluate the effectiveness of two types of automation: perceptual augmentation (i.e. supporting users' information acquisition and analysis); and control integration (i.e. supporting users' action selection and implementation). Furthermore, the effectiveness of on-product information (i.e. labels attached to product) in supporting automation design was evaluated. The findings suggested greater benefits for automation in control integration than in perceptual augmentation alone, which may be partly due to the specific requirements of consumer product usage. If employed appropriately, on-product information can be a helpful means of information conveyance. The article discusses the implications of automation design in interactive consumer products while drawing on automation models from the work environment.

  17. PoPoolation DB: a user-friendly web-based database for the retrieval of natural polymorphisms in Drosophila.

    PubMed

    Pandey, Ram Vinay; Kofler, Robert; Orozco-terWengel, Pablo; Nolte, Viola; Schlötterer, Christian

    2011-03-02

    The enormous potential of natural variation for the functional characterization of genes has been neglected for a long time. Only recently have functional geneticists started to account for natural variation in their analyses. With the new sequencing technologies it has become feasible to collect sequence information for multiple individuals on a genomic scale. In particular, sequencing pooled DNA samples has been shown to provide a cost-effective approach for characterizing variation in natural populations. While a range of software tools have been developed for mapping these reads onto a reference genome and extracting SNPs, linking this information to population genetic estimators and functional information still poses a major challenge to many researchers. We developed PoPoolation DB, a user-friendly integrated database. PoPoolation DB links variation in natural populations with functional information, allowing a wide range of researchers to take advantage of population genetic data. PoPoolation DB provides the user with population genetic parameters (Watterson's θ or Tajima's π), Tajima's D, SNPs, allele frequencies and indels in regions of interest. The database can be queried by gene name, chromosomal position, or a user-provided query sequence or GTF file. We anticipate that PoPoolation DB will be a highly versatile tool for functional geneticists as well as evolutionary biologists. PoPoolation DB, available at http://www.popoolation.at/pgt, provides an integrated platform for researchers to investigate natural polymorphism and associated functional annotations from UCSC and Flybase genome browsers, population genetic estimators and RNA-seq information.
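
    For readers unfamiliar with the population genetic estimators named above, the sketch below shows textbook versions of Watterson's theta and nucleotide diversity (pi) for a sample of n sequences over a region. PoPoolation applies additional corrections for pooled sequencing data, so this is only an illustration of the underlying quantities, with made-up example counts.

    ```python
    # Textbook estimators (per region) for a sample of n sequences; PoPoolation's
    # pooled-sequencing corrections are not included in this illustration.
    def wattersons_theta(num_segregating_sites, n):
        a_n = sum(1.0 / i for i in range(1, n))   # harmonic number over 1..n-1
        return num_segregating_sites / a_n

    def nucleotide_diversity(minor_allele_counts, n):
        """minor_allele_counts: minor allele count at each segregating site (out of n)."""
        pi = 0.0
        for k in minor_allele_counts:
            p = k / n
            pi += 2.0 * p * (1.0 - p) * n / (n - 1)   # unbiased per-site heterozygosity
        return pi

    print(wattersons_theta(num_segregating_sites=12, n=20))
    print(nucleotide_diversity([3, 5, 1, 10, 7], n=20))
    ```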

  18. BMPOS: a Flexible and User-Friendly Tool Sets for Microbiome Studies.

    PubMed

    Pylro, Victor S; Morais, Daniel K; de Oliveira, Francislon S; Dos Santos, Fausto G; Lemos, Leandro N; Oliveira, Guilherme; Roesch, Luiz F W

    2016-08-01

    Recent advances in science and technology are leading to a revision and re-orientation of methodologies, addressing old and current issues under a new perspective. Advances in next generation sequencing (NGS) are allowing comparative analysis of the abundance and diversity of whole microbial communities, generating a large amount of data and findings at a systems level. The current limitation for biologists has been the increasing demand for computational power and training required for processing of NGS data. Here, we describe the deployment of the Brazilian Microbiome Project Operating System (BMPOS), a flexible and user-friendly Linux distribution dedicated to microbiome studies. The Brazilian Microbiome Project (BMP) has developed data analysis pipelines for metagenomic studies (phylogenetic marker genes), conducted using the two main high-throughput sequencing platforms (Ion Torrent and Illumina MiSeq). The BMPOS is freely available and includes all the bioinformatics packages and databases required to perform the pipelines suggested by the BMP team. The BMPOS may be used as a bootable live USB stick or installed on any computer with at least a 1 GHz CPU and 512 MB RAM, independent of the operating system previously installed. The BMPOS has proved to be effective for sequence processing, sequence clustering, alignment, taxonomic annotation, statistical analysis, and plotting of metagenomic data. The BMPOS has been used during several metagenomic analysis courses, proving valuable as a training tool and an excellent starting point for anyone interested in performing metagenomic studies. The BMPOS and its documentation are available at http://www.brmicrobiome.org .

  19. Measuring fish and their physical habitats: Versatile 2D and 3D video techniques with user-friendly software

    USGS Publications Warehouse

    Neuswanger, Jason R.; Wipfli, Mark S.; Rosenberger, Amanda E.; Hughes, Nicholas F.

    2017-01-01

    Applications of video in fisheries research range from simple biodiversity surveys to three-dimensional (3D) measurement of complex swimming, schooling, feeding, and territorial behaviors. However, researchers lack a transparently developed, easy-to-use, general purpose tool for 3D video measurement and event logging. Thus, we developed a new measurement system, with freely available, user-friendly software, easily obtained hardware, and flexible underlying mathematical methods capable of high precision and accuracy. The software, VidSync, allows users to efficiently record, organize, and navigate complex 2D or 3D measurements of fish and their physical habitats. Laboratory tests showed submillimetre accuracy in length measurements of 50.8 mm targets at close range, with increasing errors (mostly <1%) at longer range and for longer targets. A field test on juvenile Chinook salmon (Oncorhynchus tshawytscha) feeding behavior in Alaska streams found that individuals within aggregations avoided the immediate proximity of their competitors, out to a distance of 1.0 to 2.9 body lengths. This system makes 3D video measurement a practical tool for laboratory and field studies of aquatic or terrestrial animal behavior and ecology.

  20. TWIST User Presentation

    EPA Pesticide Factsheets

    The Wastewater Information System Tool (TWIST) is a downloadable, user-friendly management tool that will allow state and local health departments to effectively inventory and manage small wastewater treatment systems in their jurisdictions.

  1. MySpace and Facebook: applying the uses and gratifications theory to exploring friend-networking sites.

    PubMed

    Raacke, John; Bonds-Raacke, Jennifer

    2008-04-01

    The increased use of the Internet as a new tool in communication has changed the way people interact. This fact is even more evident in the recent development and use of friend-networking sites. However, no research has evaluated these sites and their impact on college students. Therefore, the present study was conducted to evaluate: (a) why people use these friend-networking sites, (b) what the characteristics are of the typical college user, and (c) what uses and gratifications are met by using these sites. Results indicated that the vast majority of college students are using these friend-networking sites for a significant portion of their day for reasons such as making new friends and locating old friends. Additionally, both men and women of traditional college age are equally engaging in this form of online communication with this result holding true for nearly all ethnic groups. Finally, results showed that many uses and gratifications are met by users (e.g., "keeping in touch with friends"). Results are discussed in light of the impact that friend-networking sites have on communication and social needs of college students.

  2. SimHap GUI: An intuitive graphical user interface for genetic association analysis

    PubMed Central

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-01-01

    Background: Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. Results: We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. Conclusion: SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877

  3. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    PubMed

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  4. Students' Facebook "Friends": Public and Private Spheres

    ERIC Educational Resources Information Center

    West, Anne; Lewis, Jane; Currie, Peter

    2009-01-01

    Friendship is highly significant during the university years. Facebook, widely used by students, is designed to facilitate communication with different groups of "friends". This exploratory study involved interviewing a sample of student users of Facebook: it focuses on the extent to which older adults, especially parents, are accepted as Facebook…

  5. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond.

    PubMed

    Bible, Paul W; Kanno, Yuka; Wei, Lai; Brooks, Stephen R; O'Shea, John J; Morasso, Maria I; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards co-localization analysis based exploratory research. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST's functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST's general purpose features then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research with a case study in enhancer analysis. To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post peak-calling Ch

  6. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond

    PubMed Central

    Bible, Paul W.; Kanno, Yuka; Wei, Lai; Brooks, Stephen R.; O’Shea, John J.; Morasso, Maria I.; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards co-localization analysis based exploratory research. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST’s functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST’s general purpose features then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research with a case study in enhancer analysis. To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post peak-calling Ch
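
    The core operation PAPST automates is comparing sets of genomic coordinate intervals. The toy Python sketch below counts how many peaks in one set overlap at least one peak in another; the interval format and example coordinates are hypothetical and far simpler than PAPST's actual implementation.

      # Toy peak co-localization: count peaks in A that overlap any peak in B.
      def overlaps(a, b):
          # Intervals are (chrom, start, end); overlap test on the same chromosome.
          return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

      def count_colocalized(peaks_a, peaks_b):
          return sum(any(overlaps(a, b) for b in peaks_b) for a in peaks_a)

      tf_peaks = [("chr1", 100, 200), ("chr1", 500, 650), ("chr2", 40, 90)]   # hypothetical TF peaks
      mark_peaks = [("chr1", 150, 300), ("chr2", 400, 500)]                   # hypothetical histone-mark peaks
      print(count_colocalized(tf_peaks, mark_peaks))   # -> 1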

  7. A user-friendly means to scale from the biochemistry of photosynthesis to whole crop canopies and production in time and space - development of Java WIMOVAC.

    PubMed

    Song, Qingfeng; Chen, Dairui; Long, Stephen P; Zhu, Xin-Guang

    2017-01-01

    Windows Intuitive Model of Vegetation response to Atmosphere and Climate Change (WIMOVAC) has been used widely as a generic modular mechanistically rich model of plant production. It can predict the responses of leaf and canopy carbon balance, as well as production in different environmental conditions, in particular those relevant to global change. Here, we introduce an open source Java user-friendly version of WIMOVAC. This software is platform independent and can be easily downloaded to a laptop and used without any prior programming skills. In this article, we describe the structure, equations and user guide and illustrate some potential applications of WIMOVAC. © 2016 The Authors Plant, Cell & Environment Published by John Wiley & Sons Ltd.
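
    To give a flavour of what scaling from leaf biochemistry to a whole canopy involves, the Python sketch below attenuates light through canopy layers with Beer's law and integrates a simple leaf light-response curve over leaf area index. The equations and parameter values are generic textbook simplifications, not WIMOVAC's actual model.

      # Toy leaf-to-canopy scaling: Beer's-law light attenuation plus a simple
      # leaf light-response curve, summed over LAI layers (illustrative only).
      import math

      def leaf_assimilation(par, a_max=25.0, quantum_yield=0.05):
          # Rectangular-hyperbola light response, umol CO2 m-2 s-1.
          return a_max * quantum_yield * par / (a_max + quantum_yield * par)

      def canopy_assimilation(par_top=1500.0, lai=4.0, k=0.5, layers=50):
          d_lai = lai / layers
          total = 0.0
          for i in range(layers):
              cum_lai = (i + 0.5) * d_lai
              par = par_top * math.exp(-k * cum_lai)   # light remaining at this canopy depth
              total += leaf_assimilation(par) * d_lai
          return total                                  # per unit ground area

      print(f"{canopy_assimilation():.1f} umol CO2 m-2 s-1")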

  8. Automated 3D structure composition for large RNAs

    PubMed Central

    Popenda, Mariusz; Szachniuk, Marta; Antczak, Maciej; Purzycka, Katarzyna J.; Lukasiak, Piotr; Bartol, Natalia; Blazewicz, Jacek; Adamiak, Ryszard W.

    2012-01-01

    Understanding the numerous functions that RNAs play in living cells depends critically on knowledge of their three-dimensional structure. Due to the difficulties in experimentally assessing structures of large RNAs, there is currently great demand for new high-resolution structure prediction methods. We present a novel method for the fully automated prediction of RNA 3D structures from a user-defined secondary structure. The concept is founded on a machine translation system. The translation engine operates on the RNA FRABASE database tailored to the dictionary relating RNA secondary structure and tertiary structure elements. The translation algorithm is very fast; an initial 3D structure is composed within seconds on a single processor. The method assures the prediction of large RNA 3D structures of high quality. Our approach needs neither the structural templates nor the RNA sequence alignments required for comparative methods. This enables the building of unresolved yet native and artificial RNA structures. The method is implemented in a publicly available, user-friendly server, RNAComposer. It works in an interactive mode and a batch mode. The batch mode is designed for large-scale modelling and accepts atomic distance restraints. Presently, the server is set to build RNA structures of up to 500 residues. PMID:22539264

  9. New-generation bar adsorptive microextraction (BAμE) devices for a better eco-user-friendly analytical approach-Application for the determination of antidepressant pharmaceuticals in biological fluids.

    PubMed

    Ide, A H; Nogueira, J M F

    2018-05-10

    The present contribution aims to design new-generation bar adsorptive microextraction (BAμE) devices that promote an innovative and much more user-friendly analytical approach. The novel BAμE devices were lab-made with smaller dimensions, using flexible nylon-based supports (7.5 × 1.0 mm) coated with convenient sorbents (≈ 0.5 mg). This advance allows effective microextraction and back-extraction ('only a single liquid desorption step') stages as well as enhanced interfacing with the instrumental systems dedicated to routine analysis. To evaluate the achievements of these improvements, four antidepressant agents (bupropion, citalopram, amitriptyline and trazodone) were used as model compounds in aqueous media combined with liquid chromatography (LC) systems. By using an N-vinylpyrrolidone-based polymer phase, good selectivity and efficiency were obtained. Assays performed on 25 mL spiked aqueous samples yielded average recoveries between 67.8 ± 12.4% (bupropion) and 88.3 ± 12.1% (citalopram) under optimized experimental conditions. The analytical performance also showed convenient precision (RSD < 12%) and detection limits (50 ng L⁻¹), as well as linear dynamic ranges (160-2000 ng L⁻¹) with suitable determination coefficients (r² > 0.9820). The application of the proposed analytical approach to biological fluids showed negligible matrix effects when the standard addition methodology was used. From the data obtained, the new-generation BAμE devices presented herein provide an innovative and robust analytical cycle, are simple to prepare, cost-effective, user-friendly and compatible with current LC autosampler systems. Furthermore, the novel devices were designed to be disposable and used with negligible amounts of organic solvents (100 μL) during back-extraction, in compliance with green analytical chemistry principles. In short, the new-generation BAμE devices showed to be
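
    The figures of merit quoted above (recoveries, detection limits, linearity) come from routine calibration arithmetic. The Python sketch below computes a mean recovery and a linear calibration with its determination coefficient from hypothetical spiked/measured values; it is not drawn from the paper's data.

      # Percent recovery and linear calibration statistics (hypothetical data).
      import numpy as np

      spiked = np.array([500.0, 1000.0, 1500.0, 2000.0])     # ng/L added
      measured = np.array([430.0, 880.0, 1310.0, 1750.0])    # ng/L found

      recovery = 100.0 * measured / spiked
      print("mean recovery: %.1f +/- %.1f %%" % (recovery.mean(), recovery.std(ddof=1)))

      slope, intercept = np.polyfit(spiked, measured, 1)      # calibration line
      r2 = np.corrcoef(spiked, measured)[0, 1] ** 2
      print("slope = %.3f, intercept = %.1f, r2 = %.4f" % (slope, intercept, r2))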

  10. Conceptual design for a user-friendly adaptive optics system at Lick Observatory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bissinger, H.D.; Olivier, S.; Max, C.

    1996-03-08

    In this paper, we present a conceptual design for a general-purpose adaptive optics system, usable with all Cassegrain facility instruments on the 3 meter Shane telescope at the University of California's Lick Observatory located on Mt. Hamilton near San Jose, California. The overall design goal for this system is to take the sodium-layer laser guide star adaptive optics technology out of the demonstration stage and to build a user-friendly astronomical tool. The emphasis will be on ease of calibration, improved stability and operational simplicity in order to allow the system to be run routinely by observatory staff. A prototype adaptive optics system and a 20 watt sodium-layer laser guide star system have already been built at Lawrence Livermore National Laboratory for use at Lick Observatory. The design presented in this paper is for a next-generation adaptive optics system that extends the capabilities of the prototype system into the visible with more degrees of freedom. When coupled with a laser guide star system that is upgraded to a power matching the new adaptive optics system, the combined system will produce diffraction-limited images for near-IR cameras. Atmospheric correction at wavelengths of 0.6-1 μm will significantly increase the throughput of the most heavily used facility instrument at Lick, the Kast Spectrograph, and will allow it to operate with smaller slit widths and deeper limiting magnitudes. 8 refs., 2 figs.

  11. Cognitive consequences of clumsy automation on high workload, high consequence human performance

    NASA Technical Reports Server (NTRS)

    Cook, Richard I.; Woods, David D.; Mccolligan, Elizabeth; Howie, Michael B.

    1991-01-01

    The growth of computational power has fueled attempts to automate more of the human role in complex problem solving domains, especially those where system faults have high consequences and where periods of high workload may saturate the performance capacity of human operators. Examples of these domains include flightdecks, space stations, air traffic control, nuclear power operation, ground satellite control rooms, and surgical operating rooms. Automation efforts may have unanticipated effects on human performance, particularly if they increase the workload at peak workload times or change the practitioners' strategies for coping with workload. Smooth and effective change in automation requires detailed understanding of the cognitive tasks confronting the user; this has been called user-centered automation. The introduction of a new computerized technology in a group of hospital operating rooms used for heart surgery was observed. The study revealed how automation, especially 'clumsy automation', affects practitioner work patterns and suggests that clumsy automation constrains users in specific and significant ways. Users tailor both the new system and their tasks in order to accommodate the needs of process and production. The study of this tailoring may prove a powerful tool for exposing previously hidden patterns of user data processing, integration, and decision making which may, in turn, be useful in the design of more effective human-machine systems.

  12. Automated Test Case Generation for an Autopilot Requirement Prototype

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Rungta, Neha; Feary, Michael

    2011-01-01

    Designing safety-critical automation with robust human interaction is a difficult task that is susceptible to a number of known Human-Automation Interaction (HAI) vulnerabilities. It is therefore essential to develop automated tools that provide support both in the design and rapid evaluation of such automation. The Automation Design and Evaluation Prototyping Toolset (ADEPT) enables the rapid development of an executable specification for automation behavior and user interaction. ADEPT supports a number of analysis capabilities, thus enabling the detection of HAI vulnerabilities early in the design process, when modifications are less costly. In this paper, we advocate the introduction of a new capability to model-based prototyping tools such as ADEPT. The new capability is based on symbolic execution that allows us to automatically generate quality test suites based on the system design. Symbolic execution is used to generate both user input and test oracles: user input drives the testing of the system implementation, and test oracles ensure that the system behaves as designed. We present early results in the context of a component in the Autopilot system modeled in ADEPT, and discuss the challenges of test case generation in the HAI domain.

  13. Analysis of Environmental Friendly Library Based on the Satisfaction and Service Quality: study at Library “X”

    NASA Astrophysics Data System (ADS)

    Herdiansyah, Herdis; Satriya Utama, Andre; Safruddin; Hidayat, Heri; Gema Zuliana Irawan, Angga; Immanuel Tjandra Muliawan, R.; Mutia Pratiwi, Diana

    2017-10-01

    One of the factors that influences the development of science is the existence of libraries, in this case college libraries. A library located in a college environment aims to supply collections of literature to support research as well as educational activities for the students of the college. Conceptually, every library now starts to practice environmental principles. For example, the “X” library, as a central library, claims to be an environmentally friendly library because it practices environmentally friendly management, but it has not yet assessed satisfaction and service quality from the users' perspective, including whether the environmentally friendly practice is actually perceived by library users. Satisfaction can be seen from the comparison between the expectations and the experienced reality of library users. This paper analyzes the level of library user satisfaction with library services in the campus area and the gap between the expectations and the reality felt by the library users. The results show a disparity between the aspiration of sustainable, environmentally friendly library management and the reality of how the library is managed, so that it has not yet satisfied its users. The largest satisfaction gap concerns the library collection, with a value of 1.57, while the smallest gap concerns equal service to all students, with a value of 0.67.

  14. Automated Testing Experience of the Linear Aerospike SR-71 Experiment (LASRE) Controller

    NASA Technical Reports Server (NTRS)

    Larson, Richard R.

    1999-01-01

    System controllers must be fail-safe, low cost, flexible to software changes, able to output health and status words, and permit rapid retest qualification. The system controller designed and tested for the aerospike engine program was an attempt to meet these requirements. This paper describes (1) the aerospike controller design, (2) the automated simulation testing techniques, and (3) the real time monitoring data visualization structure. Controller cost was minimized by design of a single-string system that used an off-the-shelf 486 central processing unit (CPU). A linked-list architecture, with states (nodes) defined in a user-friendly state table, accomplished software changes to the controller. Proven to be fail-safe, this system reported the abort cause and automatically reverted to a safe condition for any first failure. A real time simulation and test system automated the software checkout and retest requirements. A program requirement to decode all abort causes in real time during all ground and flight tests assured the safety of flight decisions and the proper execution of mission rules. The design also included health and status words, and provided a real time analysis interpretation for all health and status data.
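
    A hedged sketch of the control pattern described above: states and transitions held in a table, with any first failure reverting the controller to a safe state and reporting the abort cause. The states, events and Python implementation are invented for illustration; the actual LASRE controller was a single-string 486-based system.

      # Table-driven controller that fails safe on the first detected fault.
      STATE_TABLE = {
          "IDLE":      {"start": "PURGE"},
          "PURGE":     {"purge_done": "IGNITION"},
          "IGNITION":  {"ignition_ok": "MAINSTAGE"},
          "MAINSTAGE": {"shutdown": "IDLE"},
      }
      SAFE_STATE = "IDLE"

      class Controller:
          def __init__(self):
              self.state = "IDLE"
              self.abort_cause = None

          def handle(self, event, healthy=True):
              if not healthy:                       # first failure -> revert to safe state
                  self.abort_cause = f"{event} failed in {self.state}"
                  self.state = SAFE_STATE
                  return
              self.state = STATE_TABLE[self.state].get(event, self.state)

      c = Controller()
      c.handle("start")
      c.handle("purge_done")
      c.handle("ignition_ok", healthy=False)
      print(c.state, "|", c.abort_cause)            # IDLE | ignition_ok failed in IGNITION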

  15. Automated imaging system for single molecules

    DOEpatents

    Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel

    2012-09-18

    There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.
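
    One concrete piece of the pipeline described above is correcting for uneven illumination. The Python sketch below applies a standard flat-field correction to a synthetic frame; the array sizes, the Gaussian illumination pattern and the function name are assumptions, not the patented implementation.

      # Flat-field correction of a raw frame using a normalized illumination image.
      import numpy as np

      def flat_field_correct(raw, flat, dark=None):
          raw = raw.astype(float)
          flat = flat.astype(float)
          if dark is not None:
              raw = raw - dark
              flat = flat - dark
          gain = flat / flat.mean()                 # normalized illumination pattern
          return raw / np.clip(gain, 1e-6, None)

      rng = np.random.default_rng(0)
      flat = 1000.0 * np.exp(-((np.indices((64, 64)) - 32) ** 2).sum(0) / 2000.0)   # synthetic vignetting
      raw = flat * (0.5 + rng.random((64, 64)))     # synthetic signal modulated by illumination
      corrected = flat_field_correct(raw, flat)
      print(corrected.shape, round(float(corrected.mean()), 1))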

  16. The Human Response to Library Automation.

    ERIC Educational Resources Information Center

    Kirkland, Janice, Ed.

    1989-01-01

    Eleven articles discuss the response of library users and personnel to automation. Topics covered include computerized reference services, online public access catalogs, paraprofessional staff perceptions of technology, organizational and managerial impacts of automation, workshops on new technology, gender differences in motivation to manage and…

  17. Automated campaign system

    NASA Astrophysics Data System (ADS)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    Running a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, acquiring content and images, composing the materials, and meeting the sponsoring enterprise's brand standards, to driving through production and fulfillment and evaluating results, is currently performed by experienced, highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user could easily run a successful campaign from their desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted will be how the complexity of running a targeted campaign is hidden from the user through technologies, all while providing the benefits of a professionally managed campaign.

  18. Genetic algorithm and graph theory based matrix factorization method for online friend recommendation.

    PubMed

    Li, Qu; Yao, Min; Yang, Jianhua; Xu, Ning

    2014-01-01

    Online friend recommendation is a fast-developing topic in web mining. In this paper, we used SVD matrix factorization to model user and item feature vectors and used stochastic gradient descent to update the parameters and improve accuracy. To tackle the cold-start problem and data sparsity, we used a KNN model to influence the user feature vectors. At the same time, we used graph theory to partition communities with fairly low time and space complexity. What is more, matrix factorization can combine online and offline recommendation. Experiments showed that the hybrid recommendation algorithm is able to recommend online friends with good accuracy.
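
    A minimal sketch of the matrix-factorization core described above: user and item feature vectors fitted by stochastic gradient descent. The ratings, dimensions and learning-rate values are hypothetical, and the paper's KNN and community-partitioning components are omitted.

      # Latent-factor model trained with SGD on a few hypothetical user-item scores.
      import numpy as np

      ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (2, 1, 1.0), (2, 2, 5.0)]
      n_users, n_items, k = 3, 3, 2
      rng = np.random.default_rng(42)
      P = 0.1 * rng.standard_normal((n_users, k))   # user feature vectors
      Q = 0.1 * rng.standard_normal((n_items, k))   # item feature vectors

      lr, reg = 0.05, 0.02
      for epoch in range(200):
          for u, i, r in ratings:
              err = r - P[u] @ Q[i]
              P[u] += lr * (err * Q[i] - reg * P[u])    # gradient step on the user vector
              Q[i] += lr * (err * P[u] - reg * Q[i])    # gradient step on the item vector

      print(round(float(P[0] @ Q[0]), 2))           # reconstructed score for user 0, item 0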

  19. BUMPER v1.0: a Bayesian user-friendly model for palaeo-environmental reconstruction

    NASA Astrophysics Data System (ADS)

    Holden, Philip B.; Birks, H. John B.; Brooks, Stephen J.; Bush, Mark B.; Hwang, Grace M.; Matthews-Bird, Frazer; Valencia, Bryan G.; van Woesik, Robert

    2017-02-01

    We describe the Bayesian user-friendly model for palaeo-environmental reconstruction (BUMPER), a Bayesian transfer function for inferring past climate and other environmental variables from microfossil assemblages. BUMPER is fully self-calibrating, straightforward to apply, and computationally fast, requiring ˜ 2 s to build a 100-taxon model from a 100-site training set on a standard personal computer. We apply the model's probabilistic framework to generate thousands of artificial training sets under ideal assumptions. We then use these to demonstrate the sensitivity of reconstructions to the characteristics of the training set, considering assemblage richness, taxon tolerances, and the number of training sites. We find that a useful guideline for the size of a training set is to provide, on average, at least 10 samples of each taxon. We demonstrate general applicability to real data, considering three different organism types (chironomids, diatoms, pollen) and different reconstructed variables. An identically configured model is used in each application, the only change being the input files that provide the training-set environment and taxon-count data. The performance of BUMPER is shown to be comparable with weighted average partial least squares (WAPLS) in each case. Additional artificial datasets are constructed with similar characteristics to the real data, and these are used to explore the reasons for the differing performances of the different training sets.

  20. Effect of descriptive information and experience on automation reliance.

    PubMed

    Yuviler-Gavish, Nirit; Gopher, Daniel

    2011-06-01

    The present research addresses the issue of reliance on decision support systems for the long-term (DSSLT), which help users develop decision-making strategies and long-term planning. It is argued that providing information about a system's future performance in an experiential manner, as compared with a descriptive manner, encourages users to increase their reliance level. Establishing appropriate reliance on DSSLT is contingent on the system developer's ability to provide users with information about the system's future performance. A sequence of three studies contrasts the effect on automation reliance of providing descriptive information versus experience for DSSLT with two different positive expected values of recommendations. Study 1 demonstrated that when automation reliance was determined solely on the basis of description, it was relatively low, but it increased significantly when a decision was made after experience with 50 training simulations. Participants were able to learn to increase their automation reliance levels when they encountered the same type of recommendation again. Study 2 showed that the absence of preliminary descriptive information did not affect the automation reliance levels obtained after experience. Study 3 demonstrated that participants were able to generalize their learning about increasing reliance levels to new recommendations. Using experience rather than description to give users information about future performance in DSSLT can help increase automation reliance levels. Implications for designing DSSLT and decision support systems in general are discussed.

  1. SVAw - a web-based application tool for automated surrogate variable analysis of gene expression studies

    PubMed Central

    2013-01-01

    Background Surrogate variable analysis (SVA) is a powerful method to identify, estimate, and utilize the components of gene expression heterogeneity due to unknown and/or unmeasured technical, genetic, environmental, or demographic factors. These sources of heterogeneity are common in gene expression studies, and failing to incorporate them into the analysis can obscure results. Using SVA increases the biological accuracy and reproducibility of gene expression studies by identifying these sources of heterogeneity and correctly accounting for them in the analysis. Results Here we have developed a web application called SVAw (Surrogate Variable Analysis Web app) that provides a user-friendly interface for SVA analyses of genome-wide expression studies. The software has been developed based on the open-source Bioconductor SVA package. In our software, we have extended the SVA program functionality in three aspects: (i) SVAw performs a fully automated and user-friendly analysis workflow; (ii) it calculates probe/gene statistics for both pre- and post-SVA analysis and provides a table of results for the regression of gene expression on the primary variable of interest before and after correcting for surrogate variables; and (iii) it generates a comprehensive report file, including a graphical comparison of the outcome for the user. Conclusions SVAw is a freely accessible web-server solution for the surrogate variable analysis of high-throughput datasets and facilitates removing unwanted and unknown sources of variation. It is freely available for use at http://psychiatry.igm.jhmi.edu/sva. The executable packages for both the web and standalone application and the instructions for installation can be downloaded from our web site. PMID:23497726
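
    The "before and after correcting for surrogate variables" comparison mentioned above amounts, for a single gene, to regressing expression on the primary variable with and without the surrogate covariates. The Python sketch below simulates a hidden batch effect and shows how the estimated group effect changes once it is included; the surrogate here is simulated directly rather than estimated by the SVA algorithm.

      # Group effect on expression before and after adjusting for a hidden factor.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 60
      group = np.repeat([0.0, 1.0], n // 2)                   # primary variable of interest
      batch = 0.8 * group + 0.5 * rng.standard_normal(n)      # hidden factor standing in for a surrogate variable
      expression = 0.5 * group + 1.5 * batch + 0.3 * rng.standard_normal(n)

      def group_effect(covariates):
          X = np.column_stack([np.ones(n)] + covariates)
          beta, *_ = np.linalg.lstsq(X, expression, rcond=None)
          return beta[1]                                      # coefficient on the group variable

      print("unadjusted group effect:", round(group_effect([group]), 2))
      print("adjusted for surrogate: ", round(group_effect([group, batch]), 2))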

  2. Automating Document Delivery: A Conference Report.

    ERIC Educational Resources Information Center

    Ensor, Pat

    1992-01-01

    Describes presentations made at a forum on automation, interlibrary loan (ILL), and document delivery sponsored by the Houston Area Library Consortium. Highlights include access versus ownership; software for ILL; fee-based services; automated management systems for ILL; and electronic mail and online systems for end-user-generated ILL requests.…

  3. Continuous Calibration of Trust in Automated Systems

    DTIC Science & Technology

    2014-01-01

    ...(1997). Misuse and disuse can have fatal consequences; for example, inappropriate automation reliance has been implicated in the recent crash of Asiana Airlines Flight 214 in San Francisco. Therefore, understanding how users form, lose, and recover trust in imperfect automation is of critical...

  4. Comparing Text-based and Graphic User Interfaces for Novice and Expert Users

    PubMed Central

    Chen, Jung-Wei; Zhang, Jiajie

    2007-01-01

    Graphic User Interface (GUI) is commonly considered to be superior to Text-based User Interface (TUI). This study compares GUI and TUI in an electronic dental record system. Several usability analysis techniques were used to compare the relative effectiveness of a GUI and a TUI. Expert users and novice users were evaluated in the time required and the steps needed to complete the task. A within-subject design was used to evaluate whether experience with either interface affects task performance. The results show that the GUI was not better than the TUI for expert users, but the GUI was better for novice users. For novice users there was a learning transfer effect from TUI to GUI. This means that whether a user interface is user-friendly depends on the mapping between the user interface and tasks. GUI by itself may or may not be better than TUI. PMID:18693811

  5. Comparing Text-based and Graphic User Interfaces for novice and expert users.

    PubMed

    Chen, Jung-Wei; Zhang, Jiajie

    2007-10-11

    Graphic User Interface (GUI) is commonly considered to be superior to Text-based User Interface (TUI). This study compares GUI and TUI in an electronic dental record system. Several usability analysis techniques were used to compare the relative effectiveness of a GUI and a TUI. Expert users and novice users were evaluated in the time required and the steps needed to complete the task. A within-subject design was used to evaluate whether experience with either interface affects task performance. The results show that the GUI was not better than the TUI for expert users, but the GUI was better for novice users. For novice users there was a learning transfer effect from TUI to GUI. This means that whether a user interface is user-friendly depends on the mapping between the user interface and tasks. GUI by itself may or may not be better than TUI.

  6. Automating CPM-GOMS

    NASA Technical Reports Server (NTRS)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. Of the methods available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail dependent on the predictions required. Although GOMS has proven useful in HCI, tools to support the

  7. An automated metrics system to measure and improve the success of laboratory automation implementation.

    PubMed

    Benn, Neil; Turlais, Fabrice; Clark, Victoria; Jones, Mike; Clulow, Stephen

    2007-03-01

    The authors describe a system for collecting usage metrics from widely distributed automation systems. An application that records and stores usage data centrally, calculates run times, and charts the data was developed. Data were collected over 20 months from at least 28 workstations. The application was used to plot bar charts of date versus run time for individual workstations, the automation in a specific laboratory, or automation of a specified type. The authors show that revised user training, redeployment of equipment, and running complementary processes on one workstation can increase the average number of runs by up to 20-fold and run times by up to 450%. Active monitoring of usage leads to more effective use of automation. Usage data could be used to determine whether purchasing particular automation was a good investment.
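
    The bookkeeping behind such usage metrics is straightforward: derive run times from start/stop events and aggregate them per workstation. The Python sketch below does this for a few hypothetical event records; the data layout is an assumption, not the authors' schema.

      # Aggregate run counts and run hours per workstation from start/stop events.
      from collections import defaultdict
      from datetime import datetime

      events = [
          ("ws-01", "2024-03-01 09:00", "2024-03-01 10:30"),
          ("ws-01", "2024-03-01 13:00", "2024-03-01 13:45"),
          ("ws-02", "2024-03-01 09:15", "2024-03-01 12:15"),
      ]

      usage = defaultdict(lambda: {"runs": 0, "hours": 0.0})
      for station, start, stop in events:
          t0 = datetime.strptime(start, "%Y-%m-%d %H:%M")
          t1 = datetime.strptime(stop, "%Y-%m-%d %H:%M")
          usage[station]["runs"] += 1
          usage[station]["hours"] += (t1 - t0).total_seconds() / 3600.0

      for station, stats in sorted(usage.items()):
          print(station, stats)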

  8. Ion channel pharmacology under flow: automation via well-plate microfluidics.

    PubMed

    Spencer, C Ian; Li, Nianzhen; Chen, Qin; Johnson, Juliette; Nevill, Tanner; Kammonen, Juha; Ionescu-Zanetti, Cristian

    2012-08-01

    Automated patch clamping addresses the need for high-throughput screening of chemical entities that alter ion channel function. As a result, there is considerable utility in the pharmaceutical screening arena for novel platforms that can produce relevant data both rapidly and consistently. Here we present results that were obtained with an innovative microfluidic automated patch clamp system utilizing a well-plate that eliminates the necessity of internal robotic liquid handling. Continuous recording from cell ensembles, rapid solution switching, and a bench-top footprint enable a number of assay formats previously inaccessible to automated systems. An electro-pneumatic interface was employed to drive the laminar flow of solutions in a microfluidic network that delivered cells in suspension to ensemble recording sites. Whole-cell voltage clamp was applied to linear arrays of 20 cells in parallel utilizing a 64-channel voltage clamp amplifier. A number of unique assays requiring sequential compound applications separated by a second or less, such as rapid determination of the agonist EC(50) for a ligand-gated ion channel or the kinetics of desensitization recovery, are enabled by the system. In addition, the system was validated via electrophysiological characterizations of both voltage-gated and ligand-gated ion channel targets: hK(V)2.1 and human Ether-à-go-go-related gene potassium channels, hNa(V)1.7 and 1.8 sodium channels, and (α1) hGABA(A) receptors and (α1) human nicotinic acetylcholine receptors. Our results show that the voltage dependence, kinetics, and interactions of these channels with pharmacological agents were matched to reference data. The results from these IonFlux™ experiments demonstrate that the system provides high-throughput automated electrophysiology with enhanced reliability and consistency, in a user-friendly format.
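
    The rapid agonist EC(50) determination mentioned above reduces to fitting a Hill equation to concentration-response data. The Python sketch below does this with SciPy on synthetic responses; the concentrations, responses and starting guesses are hypothetical, not IonFlux recordings.

      # Fit a Hill equation to normalized concentration-response data.
      import numpy as np
      from scipy.optimize import curve_fit

      def hill(conc, ec50, n_hill):
          return conc ** n_hill / (ec50 ** n_hill + conc ** n_hill)

      conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])          # uM (hypothetical)
      response = np.array([0.02, 0.08, 0.25, 0.55, 0.82, 0.95, 0.99])  # normalized current

      (ec50, n_hill), _ = curve_fit(hill, conc, response, p0=[0.3, 1.0])
      print(f"EC50 ~ {ec50:.2f} uM, Hill coefficient ~ {n_hill:.2f}")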

  9. PIA: An Intuitive Protein Inference Engine with a Web-Based User Interface.

    PubMed

    Uszkoreit, Julian; Maerkens, Alexandra; Perez-Riverol, Yasset; Meyer, Helmut E; Marcus, Katrin; Stephan, Christian; Kohlbacher, Oliver; Eisenacher, Martin

    2015-07-02

    Protein inference connects the peptide spectrum matches (PSMs) obtained from database search engines back to proteins, which are typically at the heart of most proteomics studies. Different search engines yield different PSMs and thus different protein lists. Analysis of results from one or multiple search engines is often hampered by different data exchange formats and lack of convenient and intuitive user interfaces. We present PIA, a flexible software suite for combining PSMs from different search engine runs and turning these into consistent results. PIA can be integrated into proteomics data analysis workflows in several ways. A user-friendly graphical user interface can be run either locally or (e.g., for larger core facilities) from a central server. For automated data processing, stand-alone tools are available. PIA implements several established protein inference algorithms and can combine results from different search engines seamlessly. On several benchmark data sets, we show that PIA can identify a larger number of proteins at the same protein FDR when compared to that using inference based on a single search engine. PIA supports the majority of established search engines and data in the mzIdentML standard format. It is implemented in Java and freely available at https://github.com/mpc-bioinformatics/pia.

  10. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-07-01

    The System for Automated Geoscientific Analyses (SAGA) is an open source geographic information system (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, a user friendly graphical user interface with many visualization options, a command line interpreter, and interfaces to interpreted languages like R and Python. The current version 2.1.4 offers more than 600 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Furthermore, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies, with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  11. Improving medical stores management through automation and effective communication.

    PubMed

    Kumar, Ashok; Cariappa, M P; Marwaha, Vishal; Sharma, Mukti; Arora, Manu

    2016-01-01

    Medical stores management in hospitals is a tedious and time-consuming chore with limited resources tasked for the purpose and poor penetration of Information Technology. The process of automation is slow paced due to various inherent factors and is being challenged by the increasing inventory loads and escalating budgets for procurement of drugs. We carried out an in-depth case study at the Medical Stores of a tertiary care health care facility. An iterative six-step Quality Improvement (QI) process was implemented based on the Plan-Do-Study-Act (PDSA) cycle. The QI process was modified as per requirement to fit the medical stores management model. The results were evaluated after six months. After the implementation of the QI process, 55 drugs of the medical store inventory which had expired since 2009 onwards were replaced with fresh stock by the suppliers as a result of effective communication through upgraded database management. Various pending audit objections were dropped due to the streamlined documentation and processes. Inventory management improved drastically due to automation, with disposal orders being initiated four months prior to the expiry of drugs and correct demands being generated two months prior to depletion of stocks. The monthly expense summary of drugs was now being done within ten days of the closing month. Improving communication systems within the hospital with vendor database management and reaching out to clinicians is important. Automation of inventory management needs to be simple and user-friendly, utilizing existing hardware. Physical stores monitoring is indispensable, especially due to the scattered nature of stores. Staff training and standardized documentation protocols are the other keystones for optimal medical store management.

  12. Improving medical stores management through automation and effective communication

    PubMed Central

    Kumar, Ashok; Cariappa, M.P.; Marwaha, Vishal; Sharma, Mukti; Arora, Manu

    2016-01-01

    Background Medical stores management in hospitals is a tedious and time-consuming chore with limited resources tasked for the purpose and poor penetration of Information Technology. The process of automation is slow paced due to various inherent factors and is being challenged by the increasing inventory loads and escalating budgets for procurement of drugs. Methods We carried out an in-depth case study at the Medical Stores of a tertiary care health care facility. An iterative six-step Quality Improvement (QI) process was implemented based on the Plan–Do–Study–Act (PDSA) cycle. The QI process was modified as per requirement to fit the medical stores management model. The results were evaluated after six months. Results After the implementation of the QI process, 55 drugs of the medical store inventory which had expired since 2009 onwards were replaced with fresh stock by the suppliers as a result of effective communication through upgraded database management. Various pending audit objections were dropped due to the streamlined documentation and processes. Inventory management improved drastically due to automation, with disposal orders being initiated four months prior to the expiry of drugs and correct demands being generated two months prior to depletion of stocks. The monthly expense summary of drugs was now being done within ten days of the closing month. Conclusion Improving communication systems within the hospital with vendor database management and reaching out to clinicians is important. Automation of inventory management needs to be simple and user-friendly, utilizing existing hardware. Physical stores monitoring is indispensable, especially due to the scattered nature of stores. Staff training and standardized documentation protocols are the other keystones for optimal medical store management. PMID:26900225

  13. MR efficiency using automated MRI-desktop eProtocol

    NASA Astrophysics Data System (ADS)

    Gao, Fei; Xu, Yanzhe; Panda, Anshuman; Zhang, Min; Hanson, James; Su, Congzhe; Wu, Teresa; Pavlicek, William; James, Judy R.

    2017-03-01

    MRI protocols are instruction sheets that radiology technologists use in routine clinical practice for guidance (e.g., slice position, acquisition parameters etc.). In Mayo Clinic Arizona (MCA), there are over 900 MR protocols (ranging across neuro, body, cardiac, breast etc.), which makes maintaining and updating the protocol instructions a labor-intensive effort. The task is even more challenging given different vendors (Siemens, GE etc.). This is a universal problem faced by all hospitals and/or medical research institutions. To increase the efficiency of the MR practice, we designed and implemented a web-based platform (eProtocol) to automate the management of MRI protocols. It is built upon a database that automatically extracts protocol information from DICOM compliant images and provides a user-friendly interface to the technologists to create, edit and update the protocols. Advanced operations such as protocol migrations from scanner to scanner and the capability to upload multimedia content were also implemented. To the best of our knowledge, eProtocol is the first automated MR protocol management tool used clinically. It is expected that this platform will significantly improve the radiology operations efficiency including better image quality and exam consistency, fewer repeat examinations and fewer acquisition errors. These protocol instructions will be readily available to the technologists during scans. In addition, this web-based platform can be extended to other imaging modalities such as CT, Mammography, and Interventional Radiology and different vendors for imaging protocol management.
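
    A hedged sketch of the kind of header extraction eProtocol builds on: reading protocol-relevant acquisition parameters from a DICOM file with pydicom. The file path and the exact list of tags are placeholders; the actual system's schema is not described in this abstract.

      # Pull a few protocol-relevant fields from a DICOM header (placeholder path).
      import pydicom

      KEYWORDS = ["ProtocolName", "RepetitionTime", "EchoTime",
                  "SliceThickness", "MagneticFieldStrength"]

      def extract_protocol_fields(path):
          ds = pydicom.dcmread(path, stop_before_pixels=True)   # skip pixel data for speed
          return {kw: ds.get(kw, None) for kw in KEYWORDS}

      if __name__ == "__main__":
          print(extract_protocol_fields("example_mr_image.dcm"))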

  14. FabricS: A user-friendly, complete and robust software for particle shape-fabric analysis

    NASA Astrophysics Data System (ADS)

    Moreno Chávez, G.; Castillo Rivera, F.; Sarocchi, D.; Borselli, L.; Rodríguez-Sedano, L. A.

    2018-06-01

    Shape-fabric is a textural parameter related to the spatial arrangement of elongated particles in geological samples. Its usefulness spans a range from sedimentary petrology to igneous and metamorphic petrology. Independently of the process being studied, when a material flows, the elongated particles are oriented with the major axis in the direction of flow. In sedimentary petrology this information has been used for studies of paleo-flow direction of turbidites, the origin of quartz sediments, and locating ignimbrite vents, among others. In addition to flow direction and its polarity, the method enables flow rheology to be inferred. The use of shape-fabric has been limited due to the difficulties of automatically measuring particles and analyzing them with reliable circular statistics programs. This has dampened interest in the method for a long time. Shape-fabric measurement has increased in popularity since the 1980s thanks to the development of new image analysis techniques and circular statistics software. However, the programs currently available are unreliable, old and are incompatible with newer operating systems, or require programming skills. The goal of our work is to develop a user-friendly program, in the MATLAB environment, with a graphical user interface, that can process images and includes editing functions, and thresholds (elongation and size) for selecting a particle population and analyzing it with reliable circular statistics algorithms. Moreover, the method also has to produce rose diagrams, orientation vectors, and a complete series of statistical parameters. All these requirements are met by our new software. In this paper, we briefly explain the methodology from collection of oriented samples in the field to the minimum number of particles needed to obtain reliable fabric data. We obtained the data using specific statistical tests and taking into account the degree of iso-orientation of the samples and the required degree of reliability
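
    The circular-statistics step at the heart of shape-fabric analysis treats particle long-axis orientations as axial data: angles are doubled before computing the mean direction and resultant length. The Python sketch below shows the calculation on hypothetical orientations; it is not FabricS code, which the authors implemented in MATLAB.

      # Circular mean and resultant length for axial (0-180 deg) orientation data.
      import numpy as np

      orientations_deg = np.array([10, 15, 20, 175, 12, 18, 25, 8])   # hypothetical long-axis azimuths

      doubled = np.deg2rad(2.0 * orientations_deg)                    # axial data: double the angles
      C, S = np.cos(doubled).mean(), np.sin(doubled).mean()
      mean_direction = (np.rad2deg(np.arctan2(S, C)) / 2.0) % 180.0
      resultant_length = np.hypot(C, S)                               # 1 = perfectly aligned, 0 = uniform

      print(f"mean orientation ~ {mean_direction:.1f} deg, R = {resultant_length:.2f}")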

  15. Not all trust is created equal: dispositional and history-based trust in human-automation interactions.

    PubMed

    Merritt, Stephanie M; Ilgen, Daniel R

    2008-04-01

    We provide an empirical demonstration of the importance of attending to human user individual differences in examinations of trust and automation use. Past research has generally supported the notions that machine reliability predicts trust in automation, and trust in turn predicts automation use. However, links between user personality and perceptions of the machine with trust in automation have not been empirically established. On our X-ray screening task, 255 students rated trust and made automation use decisions while visually searching for weapons in X-ray images of luggage. We demonstrate that individual differences affect perceptions of machine characteristics when actual machine characteristics are constant, that perceptions account for 52% of trust variance above the effects of actual characteristics, and that perceptions mediate the effects of actual characteristics on trust. Importantly, we also demonstrate that when administered at different times, the same six trust items reflect two types of trust (dispositional trust and history-based trust) and that these two trust constructs are differentially related to other variables. Interactions were found among user characteristics, machine characteristics, and automation use. Our results suggest that increased specificity in the conceptualization and measurement of trust is required, future researchers should assess user perceptions of machine characteristics in addition to actual machine characteristics, and incorporation of user extraversion and propensity to trust machines can increase prediction of automation use decisions. Potential applications include the design of flexible automation training programs tailored to individuals who differ in systematic ways.

  16. Individual-Based Modeling of Tuberculosis in a User-Friendly Interface: Understanding the Epidemiological Role of Population Heterogeneity in a City

    PubMed Central

    Prats, Clara; Montañola-Sales, Cristina; Gilabert-Navarro, Joan F.; Valls, Joaquim; Casanovas-Garcia, Josep; Vilaplana, Cristina; Cardona, Pere-Joan; López, Daniel

    2016-01-01

    For millennia tuberculosis (TB) has shown a successful strategy to survive, making it one of the world’s deadliest infectious diseases. This resilient behavior is based not only on remaining hidden in most of the infected population, but also by showing slow evolution in most sick people. The course of the disease within a population is highly related to its heterogeneity. Thus, classic epidemiological approaches with a top-down perspective have not succeeded in understanding its dynamics. In the past decade a few individual-based models were built, but most of them preserved a top-down view that makes it difficult to study a heterogeneous population. We propose an individual-based model developed with a bottom-up approach to studying the dynamics of pulmonary TB in a certain population, considered constant. Individuals may belong to the following classes: healthy, infected, sick, under treatment, and treated with a probability of relapse. Several variables and parameters account for their age, origin (native or immigrant), immunodeficiency, diabetes, and other risk factors (smoking and alcoholism). The time within each infection state is controlled, and sick individuals may show a cavitated disease or not that conditions infectiousness. It was implemented in NetLogo because it allows non-modelers to perform virtual experiments with a user-friendly interface. The simulation was conducted with data from Ciutat Vella, a district of Barcelona with an incidence of 67 TB cases per 100,000 inhabitants in 2013. Several virtual experiments were performed to relate the disease dynamics with the structure of the infected subpopulation (e.g., the distribution of infected times). Moreover, the short-term effect of health control policies on modifying that structure was studied. Results show that the characteristics of the population are crucial for the local epidemiology of TB. The developed user-friendly tool is ready to test control strategies of disease in any city in the

  17. Individual-Based Modeling of Tuberculosis in a User-Friendly Interface: Understanding the Epidemiological Role of Population Heterogeneity in a City.

    PubMed

    Prats, Clara; Montañola-Sales, Cristina; Gilabert-Navarro, Joan F; Valls, Joaquim; Casanovas-Garcia, Josep; Vilaplana, Cristina; Cardona, Pere-Joan; López, Daniel

    2015-01-01

    For millennia tuberculosis (TB) has shown a successful strategy to survive, making it one of the world's deadliest infectious diseases. This resilient behavior is based not only on remaining hidden in most of the infected population, but also by showing slow evolution in most sick people. The course of the disease within a population is highly related to its heterogeneity. Thus, classic epidemiological approaches with a top-down perspective have not succeeded in understanding its dynamics. In the past decade a few individual-based models were built, but most of them preserved a top-down view that makes it difficult to study a heterogeneous population. We propose an individual-based model developed with a bottom-up approach to studying the dynamics of pulmonary TB in a certain population, considered constant. Individuals may belong to the following classes: healthy, infected, sick, under treatment, and treated with a probability of relapse. Several variables and parameters account for their age, origin (native or immigrant), immunodeficiency, diabetes, and other risk factors (smoking and alcoholism). The time within each infection state is controlled, and sick individuals may show a cavitated disease or not that conditions infectiousness. It was implemented in NetLogo because it allows non-modelers to perform virtual experiments with a user-friendly interface. The simulation was conducted with data from Ciutat Vella, a district of Barcelona with an incidence of 67 TB cases per 100,000 inhabitants in 2013. Several virtual experiments were performed to relate the disease dynamics with the structure of the infected subpopulation (e.g., the distribution of infected times). Moreover, the short-term effect of health control policies on modifying that structure was studied. Results show that the characteristics of the population are crucial for the local epidemiology of TB. The developed user-friendly tool is ready to test control strategies of disease in any city in the
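
    To make the state structure concrete, the toy Python sketch below steps a population of individuals through the classes listed above (healthy, infected, sick, under treatment, treated with a possible relapse). The weekly transition probabilities are invented for illustration and are not calibrated to Ciutat Vella; the authors' model itself is implemented in NetLogo.

      # Toy individual-based state-transition model (weekly steps, illustrative rates).
      import random
      from collections import Counter

      P = {"infection": 0.002, "progression": 0.001, "start_treatment": 0.05,
           "cure": 0.02, "relapse": 0.0005}

      def step(state, rng):
          r = rng.random()
          if state == "healthy" and r < P["infection"]:
              return "infected"
          if state == "infected" and r < P["progression"]:
              return "sick"
          if state == "sick" and r < P["start_treatment"]:
              return "under_treatment"
          if state == "under_treatment" and r < P["cure"]:
              return "treated"
          if state == "treated" and r < P["relapse"]:
              return "sick"
          return state

      rng = random.Random(0)
      population = ["healthy"] * 10000
      for week in range(520):                      # ten years of weekly steps
          population = [step(s, rng) for s in population]
      print(Counter(population))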

  18. Stage Evolution of Office Automation Technological Change and Organizational Learning.

    ERIC Educational Resources Information Center

    Sumner, Mary

    1985-01-01

    A study was conducted to identify stage characteristics in terms of technology, applications, the role and responsibilities of the office automation organization, and planning and control strategies; and to describe the respective roles of data processing professionals, office automation analysts, and users in office automation systems development…

  19. Enhancing reproducibility of ultrasonic measurements by new users

    NASA Astrophysics Data System (ADS)

    Pramanik, Manojit; Gupta, Madhumita; Krishnan, Kajoli Banerjee

    2013-03-01

    The operator's perception influences ultrasound image acquisition and processing. Lower costs are attracting new users to medical ultrasound. Anticipating an increase in this trend, we conducted a study to quantify the variability in ultrasonic measurements made by novice users and identify methods to reduce it. We designed a protocol with four presets and trained four new users to scan and manually measure the head circumference of a fetal phantom with an ultrasound scanner. In the first phase, the users followed this protocol in seven distinct sessions. They then received feedback on the quality of the scans from an expert. In the second phase, two of the users repeated the entire protocol aided by visual cues provided to them during scanning. We performed off-line measurements on all the images using a fully automated algorithm capable of measuring the head circumference from fetal phantom images. The ground truth (198.1±1.6 mm) was based on sixteen scans and measurements made by an expert. Our analysis shows that: (1) the inter-observer variability of manual measurements was 5.5 mm, whereas the inter-observer variability of automated measurements was only 0.6 mm in the first phase; (2) consistency of image appearance improved and mean manual measurements were 4-5 mm closer to the ground truth in the second phase; and (3) automated measurements were more precise, accurate and less sensitive to different presets compared to manual measurements in both phases. Our results show that visual aids and automation can bring more reproducibility to ultrasonic measurements made by new users.

  20. HWNOISE program users' guide

    DOT National Transportation Integrated Search

    1992-02-01

    HWNOISE is a VNTSC-developed user friendly program written in Microsoft Fortran version 4.01 for the IBM PC/AT and compatibles to analyze acoustic data. This program is an integral part of the Federal Highway Administration's Mobile Noise Dat...

  1. HWINPUT program users' guide

    DOT National Transportation Integrated Search

    1992-02-01

    HWINPUT is a VNTSC-developed user friendly program written in Microsoft Fortran version 4.01 for the IBM PC/AT. This program is an integral part of the Federal Highway Administration's Mobile Noise Data Gathering and Analysis Laboratory and is ...

  2. Manned spacecraft automation and robotics

    NASA Technical Reports Server (NTRS)

    Erickson, Jon D.

    1987-01-01

    The Space Station holds promise of being a showcase user and driver of advanced automation and robotics technology. The author addresses the advances in automation and robotics from the Space Shuttle - with its high-reliability redundancy management and fault tolerance design and its remote manipulator system - to the projected knowledge-based systems for monitoring, control, fault diagnosis, planning, and scheduling, and the telerobotic systems of the future Space Station.

  3. MyGeneFriends: A Social Network Linking Genes, Genetic Diseases, and Researchers.

    PubMed

    Allot, Alexis; Chennen, Kirsley; Nevers, Yannis; Poidevin, Laetitia; Kress, Arnaud; Ripp, Raymond; Thompson, Julie Dawn; Poch, Olivier; Lecompte, Odile

    2017-06-16

    The constant and massive increase of biological data offers unprecedented opportunities to decipher the function and evolution of genes and their roles in human diseases. However, the multiplicity of sources and flow of data mean that efficient access to useful information and knowledge production has become a major challenge. This challenge can be addressed by taking inspiration from Web 2.0 and particularly social networks, which are at the forefront of big data exploration and human-data interaction. MyGeneFriends is a Web platform inspired by social networks, devoted to genetic disease analysis, and organized around three types of proactive agents: genes, humans, and genetic diseases. The aim of this study was to improve exploration and exploitation of biological, postgenomic era big data. MyGeneFriends leverages conventions popularized by top social networks (Facebook, LinkedIn, etc), such as networks of friends, profile pages, friendship recommendations, affinity scores, news feeds, content recommendation, and data visualization. MyGeneFriends provides simple and intuitive interactions with data through evaluation and visualization of connections (friendships) between genes, humans, and diseases. The platform suggests new friends and publications and allows agents to follow the activity of their friends. It dynamically personalizes information depending on the user's specific interests and provides an efficient way to share information with collaborators. Furthermore, the user's behavior itself generates new information that constitutes an added value integrated in the network, which can be used to discover new connections between biological agents. We have developed MyGeneFriends, a Web platform leveraging conventions from popular social networks to redefine the relationship between humans and biological big data and improve human processing of biomedical data. MyGeneFriends is available at lbgi.fr/mygenefriends. ©Alexis Allot, Kirsley Chennen, Yannis

  4. Automated Tracking of Cell Migration with Rapid Data Analysis.

    PubMed

    DuChez, Brian J

    2017-09-01

    Cell migration is essential for many biological processes including development, wound healing, and metastasis. However, studying cell migration often requires the time-consuming and labor-intensive task of manually tracking cells. To accelerate the task of obtaining coordinate positions of migrating cells, we have developed a graphical user interface (GUI) capable of automating the tracking of fluorescently labeled nuclei. This GUI provides an intuitive user interface that makes automated tracking accessible to researchers with no image-processing experience or familiarity with particle-tracking approaches. Using this GUI, users can interactively determine a minimum of four parameters to identify fluorescently labeled cells and automate acquisition of cell trajectories. Additional features allow for batch processing of numerous time-lapse images, curation of unwanted tracks, and subsequent statistical analysis of tracked cells. Statistical outputs allow users to evaluate migratory phenotypes, including cell speed, distance, displacement, and persistence, as well as measures of directional movement, such as forward migration index (FMI) and angular displacement. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.
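
    The migratory statistics named in the abstract (speed, distance, displacement, persistence, and forward migration index) have widely used definitions; the sketch below applies those common definitions to a single hypothetical (x, y) track. It is a conceptual illustration, not the GUI's actual implementation, and the function name and data are invented for the example.

```python
import numpy as np

def migration_metrics(xy, dt, forward_axis=0):
    """Common trajectory statistics for one cell track.

    xy : (n, 2) array of positions (same length unit throughout)
    dt : time between frames
    forward_axis : axis index used for the forward migration index (FMI)
    """
    steps = np.diff(xy, axis=0)                    # per-frame displacement vectors
    path_length = np.linalg.norm(steps, axis=1).sum()
    displacement = np.linalg.norm(xy[-1] - xy[0])  # straight-line start-to-end distance
    total_time = dt * (len(xy) - 1)
    return {
        "speed": path_length / total_time,
        "distance": path_length,
        "displacement": displacement,
        "persistence": displacement / path_length if path_length else 0.0,
        # FMI: net movement along a chosen direction, normalized by path length
        "fmi": (xy[-1, forward_axis] - xy[0, forward_axis]) / path_length if path_length else 0.0,
    }

# Toy track: 5 frames, 2-minute interval between frames
track = np.array([[0, 0], [2, 1], [4, 1], [5, 3], [7, 4]], dtype=float)
print(migration_metrics(track, dt=2.0))
```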

  5. Automating Phase Change Lines and Their Labels Using Microsoft Excel(R).

    PubMed

    Deochand, Neil

    2017-09-01

    Many researchers have rallied against drawn-in graphical elements and offered ways to avoid them, especially regarding the insertion of phase change lines (Deochand, Costello, & Fuqua, 2015; Dubuque, 2015; Vanselow & Bourret, 2012). However, few have offered a solution to automating the phase labels, which are often utilized in behavior analytic graphical displays (Deochand et al., 2015). Despite the fact that Microsoft Excel® is extensively utilized by behavior analysts, solutions to resolve issues in our graphing practices are not always apparent or user-friendly. Considering that the insertion of phase change lines and their labels constitutes a repetitious and laborious endeavor, any minimization in the steps to accomplish these graphical elements could offer substantial time-savings to the field. The purpose of this report is to provide an updated way (and templates in the supplemental materials) to add phase change lines with their respective labels, which stay embedded in the graph when they are moved or updated.
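
    The report itself concerns Microsoft Excel templates; as a language-neutral illustration of the same graphical elements, the sketch below draws phase change lines and centered phase labels with matplotlib from a hypothetical list of phase boundaries. It is a conceptual stand-in, not the Excel-based approach described in the article.

```python
import matplotlib.pyplot as plt

# Hypothetical single-case data and phase boundaries (session numbers).
sessions = list(range(1, 16))
responses = [3, 4, 2, 5, 4, 9, 11, 10, 12, 13, 6, 5, 4, 5, 3]
phases = {"Baseline": (1, 5), "Intervention": (6, 10), "Return to baseline": (11, 15)}

fig, ax = plt.subplots()
ax.plot(sessions, responses, marker="o", color="black")

for name, (start, end) in phases.items():
    if end < sessions[-1]:
        ax.axvline(end + 0.5, color="black", linestyle="--")               # phase change line
    ax.text((start + end) / 2, max(responses) + 1, name, ha="center")      # phase label

ax.set_xlabel("Session")
ax.set_ylabel("Responses")
ax.set_ylim(0, max(responses) + 3)
plt.show()
```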

  6. The Facebook paths to happiness: effects of the number of Facebook friends and self-presentation on subjective well-being.

    PubMed

    Kim, Junghyun; Lee, Jong-Eun Roselyn

    2011-06-01

    The current study investigates whether and how Facebook increases college-age users' subjective well-being by focusing on the number of Facebook friends and self-presentation strategies (positive vs. honest). A structural equation modeling analysis of cross-sectional survey data of college student Facebook users (N=391) revealed that the number of Facebook friends had a positive association with subjective well-being, but this association was not mediated by perceived social support. Additionally, we found that there was a negative curvilinear (inverted U-shape curve) relationship between Facebook friends and perceived social support. As for self-presentation strategies, whereas positive self-presentation had a direct effect on subjective well-being, honest self-presentation had a significant indirect effect on subjective well-being through perceived social support. Our study suggests that the number of Facebook friends and positive self-presentation may enhance users' subjective well-being, but this portion of happiness may not be grounded in perceived social support. On the other hand, honest self-presentation may enhance happiness rooted in social support provided by Facebook friends. Implications of our findings are discussed in light of affirmation of self-worth, time and effort required for building and maintaining friendships, and the important role played by self-disclosure in signaling one's need for social support.

  7. Affective processes in human-automation interactions.

    PubMed

    Merritt, Stephanie M

    2011-08-01

    This study contributes to the literature on automation reliance by illuminating the influences of user moods and emotions on reliance on automated systems. Past work has focused predominantly on cognitive and attitudinal variables, such as perceived machine reliability and trust. However, recent work on human decision making suggests that affective variables (i.e., moods and emotions) are also important. Drawing from the affect infusion model, significant effects of affect are hypothesized. Furthermore, a new affectively laden attitude termed liking is introduced. Participants watched video clips selected to induce positive or negative moods, then interacted with a fictitious automated system on an X-ray screening task. At five time points, important variables were assessed, including trust, liking, perceived machine accuracy, user self-perceived accuracy, and reliance. These variables, along with propensity to trust machines and state affect, were integrated in a structural equation model. Happiness significantly increased trust and liking for the system throughout the task. Liking was the only variable that significantly predicted reliance early in the task. Trust predicted reliance later in the task, whereas perceived machine accuracy and user self-perceived accuracy had no significant direct effects on reliance at any time. Affective influences on automation reliance are demonstrated, suggesting that this decision-making process may be less rational and more emotional than previously acknowledged. Liking for a new system may be key to appropriate reliance, particularly early in the task. Positive affect can be easily induced and may be a lever for increasing liking.

  8. Investing in the Future: Automation Marketplace 2009

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    In a year where the general economy presented enormous challenges, libraries continued to make investments in automation, especially in products that help improve what and how they deliver to their end users. Access to electronic content remains a key driver. In response to anticipated needs for new approaches to library automation, many companies…

  9. Automation in biological crystallization.

    PubMed

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  10. Automation in biological crystallization

    PubMed Central

    Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen

    2014-01-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074

  11. Web-Based Family Life Education: Spotlight on User Experience

    ERIC Educational Resources Information Center

    Doty, Jennifer; Doty, Matthew; Dworkin, Jodi

    2011-01-01

    Family Life Education (FLE) websites can benefit from the field of user experience, which makes technology easy to use. A heuristic evaluation of five FLE sites was performed using Nielsen's heuristics, guidelines for making sites user friendly. Greater site complexity resulted in more potential user problems. Sites most frequently had problems…

  12. A novel tool for user-friendly estimation of natural, diagnostic and professional radiation risk: Radio-Risk software.

    PubMed

    Carpeggiani, Clara; Paterni, Marco; Caramella, Davide; Vano, Eliseo; Semelka, Richard C; Picano, Eugenio

    2012-11-01

    Awareness of radiological risk is low among doctors and patients. An educational/decision tool that considers each patient's cumulative lifetime radiation exposure would facilitate provider-patient communication. The purpose of this work was to develop user-friendly software for simple estimation and communication of radiological risk to patients and doctors as a part of the SUIT-Heart (Stop Useless Imaging Testing in Heart disease) Project of the Tuscany Region. We developed a novel software program (PC platform, Windows OS, fully downloadable at http://suit-heart.ifc.cnr.it) considering reference dose estimates from American Heart Association Radiological Imaging 2009 guidelines and UK Royal College of Radiology 2007 guidelines. Age- and gender-weighted cancer risks were derived from the Biological Effects of Ionising Radiation VII Committee, 2006. With simple input functions (demographics, age, gender) the user selects from a predetermined menu variables relating to natural (e.g., airplane flights and geo-tracked background exposure), professional (e.g., cath lab workers) and medical (e.g., CT, cardiac scintigraphy, coronary stenting) sources. The program provides a simple numeric (cumulative effective dose in millisievert, mSv, and equivalent number of chest X-rays) and graphic (cumulative temporal trends of exposure, cancer cases out of 100 exposed persons) display. A simple software program allows straightforward estimation of cumulative dose (in multiples of chest X-rays) and risk (in extra % lifetime cancer risk), with simple numbers quantifying lifetime extra cancer risk. Pictorial display of radiation risk may be valuable for increasing radiological awareness in cardiologists. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
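
    A rough sketch of the kind of arithmetic such a tool performs is shown below. The per-chest-X-ray dose and the linear risk coefficient are assumed nominal values chosen for illustration; the actual software applies age- and gender-specific BEIR VII coefficients and published reference doses that are not reproduced here.

```python
# Minimal sketch of cumulative-dose bookkeeping with assumed nominal values.
CHEST_XRAY_MSV = 0.02      # assumed nominal effective dose of one chest X-ray (mSv)
RISK_PER_MSV = 0.00005     # assumed nominal excess lifetime cancer risk per mSv (illustrative only)

exposures_msv = {           # hypothetical exposure history
    "chest CT": 7.0,
    "cardiac scintigraphy": 9.0,
    "coronary stenting": 15.0,
    "background (1 year)": 2.4,
}

cumulative_msv = sum(exposures_msv.values())
print(f"cumulative effective dose: {cumulative_msv:.1f} mSv")
print(f"equivalent chest X-rays:   {cumulative_msv / CHEST_XRAY_MSV:.0f}")
print(f"illustrative extra lifetime cancer risk: {cumulative_msv * RISK_PER_MSV:.2%}")
```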

  13. Quora.com: Another Place for Users to Ask Questions

    ERIC Educational Resources Information Center

    Ovadia, Steven

    2011-01-01

    Quora (www.quora.com) is a contemporary, web-based take on reference. Users post questions within Quora and other users answer the questions. Users can vote for and against answers (or not vote at all). It is users asking questions of friends and strangers and then sorting through the results. If the model sounds familiar, it's because it is.…

  14. The oral health of people with learning disabilities - a user-friendly questionnaire survey.

    PubMed

    Owens, J; Jones, K; Marshman, Z

    2017-03-01

    To conduct a user-friendly questionnaire survey of the oral health and service needs of adults with learning disabilities. Researchers collaborated with local self-advocacy services to develop a questionnaire adapted from one used in a regional postal survey. The questionnaire, which covered dental status, oral health and dental services use, was sent to a random sample of people from the learning disability case register. Of 2,000 questionnaires mailed, 117 were returned undelivered and 625 were completed (response rate 31.3%). The self-reported dental status of people with learning disabilities appeared similar to that of the 2008 postal survey of the general population in Sheffield. The major difference in dental status was 11.5% of people with learning disabilities wore upper dentures and 7.2% wore lower dentures, compared to 21.2% and 12.1% of the general population in Sheffield. Using the case register as a recruitment instrument may have excluded people with learning disabilities not registered. Time and finances only permitted one mailing. Analysis on the basis of deprivation could not be conducted. Contrary to current practice, it is possible to include people with learning disabilities in oral health surveys. A multidisciplinary team was essential for enabling the progression and implementation of inclusive research and for people with learning disabilities and their supporters to engage meaningfully. This level of collaboration appears necessary if we are committed to ensuring that people with learning disabilities and their supporters are made visible to policy and decision-makers. Copyright© 2017 Dennis Barber Ltd

  15. Automated management for pavement inspection system (AMPIS)

    NASA Astrophysics Data System (ADS)

    Chung, Hung Chi; Girardello, Roberto; Soeller, Tony; Shinozuka, Masanobu

    2003-08-01

    An automated in-situ road surface distress surveying and management system, AMPIS, has been developed on the basis of video images within the framework of GIS software. Video image processing techniques are introduced to acquire, process and analyze the road surface images obtained from a moving vehicle. ArcGIS platform is used to integrate the routines of image processing and spatial analysis in handling the full-scale metropolitan highway surface distress detection and data fusion/management. This makes it possible to present user-friendly interfaces in GIS and to provide efficient visualizations of surveyed results not only for the use of transportation engineers to manage road surveying documentation, data acquisition, analysis and management, but also for financial officials to plan maintenance and repair programs and further evaluate the socio-economic impacts of highway degradation and deterioration. A review performed in this study on the fundamental principles of the Pavement Management System (PMS) and its implementation indicates that the proposed approach of using the GIS concept and its tools for PMS application will reshape PMS into a new information technology-based system providing convenient and efficient pavement inspection and management.

  16. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.

  17. A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Human-Robot Interaction

    DTIC Science & Technology

    2014-07-01

    Submoderating factors were examined and reported for human-related (i.e., age, cognitive factors, emotive factors) and automation-related (i.e., features and...capabilities) effects. Analyses were also conducted for type of automated aid: cognitive, control, and perceptual automation aids. Automated cognitive...operator, user) action. Perceptual aids are used to assist the operator or user by providing warnings or to assist with pattern recognition. All

  18. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; hide

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  19. User's guide to SSARRMENU

    USGS Publications Warehouse

    Mastin, M.C.; Le, Thanh

    2001-01-01

    The U.S. Geological Survey, in cooperation with Pierce County Department of Public Works, Washington, has developed an operational tool called the Puyallup Flood-Alert System to alert users of impending floods in the Puyallup River Basin. The system acquires and incorporates meteorological and hydrological data into the Streamflow Synthesis and Reservoir Regulation (SSARR) hydrologic flow-routing model to simulate floods in the Puyallup River Basin. SSARRMENU is the user-interactive graphical interface between the user, the input and output data, and the SSARR model. In a companion cooperative project with Pierce County, the SSARR model for the Puyallup River Basin was calibrated and validated. The calibrated model is accessed through SSARRMENU, which has been specifically programmed for the Puyallup River and the needs of Pierce County. SSARRMENU automates the retrieval of data from ADAPS (Automated Data Processing System, the U.S. Geological Survey's real-time hydrologic database), formats the data for use with SSARR, initiates SSARR model runs, displays alerts for impending floods, and provides utilities to display the simulated and observed data. An on-screen map of the basin and a series of menu items provide the user wi

  20. Correction of spin diffusion during iterative automated NOE assignment

    NASA Astrophysics Data System (ADS)

    Linge, Jens P.; Habeck, Michael; Rieping, Wolfgang; Nilges, Michael

    2004-04-01

    Indirect magnetization transfer increases the observed nuclear Overhauser enhancement (NOE) between two protons in many cases, leading to an underestimation of target distances. Wider distance bounds are necessary to account for this error. However, this leads to a loss of information and may reduce the quality of the structures generated from the inter-proton distances. Although several methods for spin diffusion correction have been published, they are often not employed to derive distance restraints. This prompted us to write a user-friendly and CPU-efficient method to correct for spin diffusion that is fully integrated in our program Ambiguous Restraints for Iterative Assignment (ARIA). ARIA thus allows automated iterative NOE assignment and structure calculation with spin-diffusion-corrected distances. The method relies on numerical integration, by matrix squaring and sparse matrix techniques, of the coupled differential equations that govern relaxation. We derive a correction factor for the distance restraints from calculated NOE volumes and inter-proton distances. To evaluate the impact of our spin diffusion correction, we tested the new calibration process extensively with data from the Pleckstrin homology (PH) domain of Mus musculus β-spectrin. By comparing structures refined with and without spin diffusion correction, we show that spin diffusion corrected distance restraints give rise to structures of higher quality (notably fewer NOE violations and a more regular Ramachandran map). Furthermore, spin diffusion correction permits the use of tighter error bounds, which improves the distinction between signal and noise in an automated NOE assignment scheme.
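
    The correction described above rests on the full relaxation-matrix treatment of NOE build-up. The equations below sketch that general approach in generic notation; the exact form of ARIA's internal correction factor is not given in the abstract and may differ.

```latex
\begin{align*}
  \frac{d\mathbf{M}(t)}{dt} &= -\mathbf{R}\,\mathbf{M}(t)
    &&\Rightarrow\quad \mathbf{A}(\tau_m) = \exp(-\mathbf{R}\,\tau_m)
    \quad\text{(NOE volumes at mixing time $\tau_m$)}\\[4pt]
  V_{ij}^{\mathrm{iso}}(\tau_m) &\propto r_{ij}^{-6}
    && \text{(isolated two-spin approximation)} \\[4pt]
  c_{ij} &= \frac{V_{ij}^{\mathrm{full}}(\tau_m)}{V_{ij}^{\mathrm{iso}}(\tau_m)}
    && \text{(generic per-restraint spin-diffusion correction from back-calculated volumes)}
\end{align*}
```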

  1. Optical tracking of embryonic vertebrates behavioural responses using automated time-resolved video-microscopy system

    NASA Astrophysics Data System (ADS)

    Walpitagama, Milanga; Kaslin, Jan; Nugegoda, Dayanthi; Wlodkowic, Donald

    2016-12-01

    The fish embryo toxicity (FET) biotest performed on embryos of zebrafish (Danio rerio) has gained significant popularity as a rapid and inexpensive alternative approach in chemical hazard and risk assessment. The FET was designed to evaluate acute toxicity on embryonic stages of fish exposed to the test chemical. The current standard, similar to most traditional methods for evaluating aquatic toxicity provides, however, little understanding of effects of environmentally relevant concentrations of chemical stressors. We postulate that significant environmental effects such as altered motor functions, physiological alterations reflected in heart rate, effects on development and reproduction can occur at sub-lethal concentrations well below than LC10. Behavioral studies can, therefore, provide a valuable integrative link between physiological and ecological effects. Despite the advantages of behavioral analysis development of behavioral toxicity, biotests is greatly hampered by the lack of dedicated laboratory automation, in particular, user-friendly and automated video microscopy systems. In this work we present a proof-of-concept development of an optical system capable of tracking embryonic vertebrates behavioral responses using automated and vastly miniaturized time-resolved video-microscopy. We have employed miniaturized CMOS cameras to perform high definition video recording and analysis of earliest vertebrate behavioral responses. The main objective was to develop a biocompatible embryo positioning structures that were suitable for high-throughput imaging as well as video capture and video analysis algorithms. This system should support the development of sub-lethal and behavioral markers for accelerated environmental monitoring.

  2. Automation of Periodic Reports

    DOT National Transportation Integrated Search

    1975-06-01

    The manual is a user's guide to the automation of the 'Summary of National Transportation Statistics.' The System is stored on the in-house PDP-10 computer to provide ready access and retrieval of the data. The information stored in the system includ...

  3. webPOISONCONTROL: can poison control be automated?

    PubMed

    Litovitz, Toby; Benson, Blaine E; Smolinske, Susan

    2016-08-01

    A free webPOISONCONTROL app allows the public to determine the appropriate triage of poison ingestions without calling poison control. If accepted and safe, this alternative expands access to reliable poison control services to those who prefer the Internet over the telephone. This study assesses feasibility, safety, and user-acceptance of automated online triage of asymptomatic, nonsuicidal poison ingestion cases. The user provides substance name, amount, age, and weight in an automated online tool or downloadable app, and is given a specific triage recommendation to stay home, go to the emergency department, or call poison control for further guidance. Safety was determined by assessing outcomes of consecutive home-triaged cases with follow-up and by confirming the correct application of algorithms. Case completion times and user perceptions of speed and ease of use were measures of user-acceptance. Of 9256 cases, 73.3% were triaged to home, 2.1% to an emergency department, and 24.5% directed to call poison control. Children younger than 6 years were involved in 75.2% of cases. Automated follow-up was done in 31.2% of home-triaged cases; 82.3% of these had no effect. No major or fatal outcomes were reported. More than 91% of survey respondents found the tool quick and easy to use. Median case completion time was 4.1 minutes. webPOISONCONTROL augments traditional poison control services by providing automated, accurate online access to case-specific triage and first aid guidance for poison ingestions. It is safe, quick, and easy to use. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Development of design principles for automated systems in transport control.

    PubMed

    Balfe, Nora; Wilson, John R; Sharples, Sarah; Clarke, Theresa

    2012-01-01

    This article reports the results of a qualitative study investigating attitudes towards and opinions of an advanced automation system currently used in UK rail signalling. In-depth interviews were held with 10 users, key issues associated with automation were identified and the automation's impact on the signalling task investigated. The interview data highlighted the importance of the signallers' understanding of the automation and their (in)ability to predict its outputs. The interviews also covered the methods used by signallers to interact with and control the automation, and the perceived effects on their workload. The results indicate that despite a generally low level of understanding and ability to predict the actions of the automation system, signallers have developed largely successful coping mechanisms that enable them to use the technology effectively. These findings, along with parallel work identifying desirable attributes of automation from the literature in the area, were used to develop 12 principles of automation which can be used to help design new systems which better facilitate cooperative working. The work reported in this article was completed with the active involvement of operational rail staff who regularly use automated systems in rail signalling. The outcomes are currently being used to inform decisions on the extent and type of automation and user interfaces in future generations of rail control systems.

  5. Automated information retrieval using CLIPS

    NASA Technical Reports Server (NTRS)

    Raines, Rodney Doyle, III; Beug, James Lewis

    1991-01-01

    Expert systems have considerable potential to assist computer users in managing the large volume of information available to them. One possible use of an expert system is to model the information retrieval interests of a human user and then make recommendations to the user as to articles of interest. At Cal Poly, a prototype expert system written in the C Language Integrated Production System (CLIPS) serves as an Automated Information Retrieval System (AIRS). AIRS monitors a user's reading preferences, develops a profile of the user, and then evaluates items returned from the information base. When prompted by the user, AIRS returns a list of items of interest to the user. In order to minimize the impact on system resources, AIRS is designed to run in the background during periods of light system use.
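
    The actual system is a CLIPS rule base, but the underlying idea of profiling a reader and scoring new items can be illustrated in a few lines of Python. The keyword-frequency profile and scoring rule below are hypothetical simplifications, not AIRS's knowledge representation.

```python
from collections import Counter

def build_profile(read_titles):
    """Very rough user profile: keyword frequencies from titles the user chose to read."""
    words = Counter()
    for title in read_titles:
        words.update(w.lower() for w in title.split() if len(w) > 3)
    return words

def score(profile, title):
    """Score a candidate item by keyword overlap with the profile."""
    return sum(profile[w.lower()] for w in title.split())

profile = build_profile([
    "Expert systems for information retrieval",
    "Information filtering with production rules",
])
candidates = [
    "Neural networks for image compression",
    "Rule-based information retrieval agents",
]
ranked = sorted(candidates, key=lambda t: score(profile, t), reverse=True)
print(ranked)
```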

  6. Recommendations from Friends Anytime and Anywhere: Toward a Model of Contextual Offer and Consumption Values

    PubMed Central

    Shen, Xiao-Liang; Wang, Nan

    2013-01-01

    The ubiquity and portability of mobile devices provide additional opportunities for information retrieval. People can easily access mobile applications anytime and anywhere when they need to acquire specific context-aware recommendations (contextual offer) from their friends. This study, thus, represents an initial attempt to understand users' acceptance of a mobile-based social reviews platform, where recommendations from friends can be obtained with mobile devices. Based on the consumption value theory, a theoretical model is proposed and empirically examined using survey data from 218 mobile users. The findings demonstrate that contextual offers based on users' profiles, access time, and geographic positions significantly predict their value perceptions (utilitarian, hedonic, and social), which, in turn, affect their intention to use a mobile social reviews platform. This study is also believed to provide some useful insights to both research and practice. PMID:23530548

  7. Intelligent Automation Approach for Improving Pilot Situational Awareness

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly

    2004-01-01

    Automation in the aviation domain has been increasing for the past two decades. Pilot reaction to automation varies from highly favorable to highly critical depending on both the pilot's background and how effectively the automation is implemented. We describe a user-centered approach for automation that considers the pilot's tasks and his needs related to accomplishing those tasks. Further, we augment rather than replace how the pilot currently fulfills his goals, relying on redundant displays that offer the pilot an opportunity to build trust in the automation. Our prototype system automates the interpretation of hydraulic system faults of the UH-60 helicopter. We describe the problem with the current system and our methodology for resolving it.

  8. Automation of Ocean Product Metrics

    DTIC Science & Technology

    2008-09-30

    Presented in: Ocean Sciences 2008 Conf., 5 Mar 2008. Shriver, J., J. D. Dykes, and J. Fabre: Automation of Operational Ocean Product Metrics. Presented in 2008 EGU General Assembly, 14 April 2008. ...processing (multiple data cuts per day) and multiple-nested models. Routines for generating automated evaluations of model forecast statistics will be...developed and pre-existing tools will be collected to create a generalized tool set, which will include user-interface tools to the metrics data

  9. FRIEND Engine Framework: a real time neurofeedback client-server system for neuroimaging studies

    PubMed Central

    Basilio, Rodrigo; Garrido, Griselda J.; Sato, João R.; Hoefle, Sebastian; Melo, Bruno R. P.; Pamplona, Fabricio A.; Zahn, Roland; Moll, Jorge

    2015-01-01

    In this methods article, we present a new implementation of a recently reported FSL-integrated neurofeedback tool, the standalone version of “Functional Real-time Interactive Endogenous Neuromodulation and Decoding” (FRIEND). We will refer to this new implementation as the FRIEND Engine Framework. The framework comprises a client-server cross-platform solution for real time fMRI and fMRI/EEG neurofeedback studies, enabling flexible customization or integration of graphical interfaces, devices, and data processing. This implementation allows a fast setup of novel plug-ins and frontends, which can be shared with the user community at large. The FRIEND Engine Framework is freely distributed for non-commercial, research purposes. PMID:25688193

  10. National Helpline for Problem Gambling: A Profile of Its Users' Characteristics

    PubMed Central

    Bastiani, Luca; Fea, Maurizio; Potente, Roberta; Luppi, Claudia; Lucchini, Fabio; Molinaro, Sabrina

    2015-01-01

    Gambling has seen a significant increase in Italy in the last 10 years and has rapidly become a public health issue, and for these reasons the first National Helpline for Problem Gambling (GR-Helpline) has been established. The aims of this study are to describe the GR-Helpline users' characteristics and to compare the prevalence rates of the users with those of moderate-risk/problematic gamblers obtained from the national survey (IPSAD 2010-2011). Statistical analysis was performed on data obtained from the counselling sessions (phone/e-mail/chat) carried out on 5,805 users (57.5% gamblers; 42.5% families/friends). This confirms that the problems related to gambling concern not only the gamblers but also their families and friends. Significant differences were found between gamblers and families/friends involving gender (74% of gamblers were male; 76.9% of families/friends were female), as well as age-classes and geographical area. Female gamblers had a higher mean age (47.3 versus 40.2 years) and preferred nonstrategy-based games. Prevalence rates of GR-Helpline users and of moderate risk/problematic gamblers were correlated (Rho = 0.58; p = 0.0113). The results highlight the fact that remote access to counselling can be an effective means of promoting treatment for problem gamblers who do not otherwise appeal directly for services. PMID:26064772

  11. Utilizing Facebook and Automated Telephone Calls to Increase Adoption of a Local Smoke Alarm Installation Program.

    PubMed

    Frattaroli, Shannon; Schulman, Eric; McDonald, Eileen M; Omaki, Elise C; Shields, Wendy C; Jones, Vanya; Brewer, William

    2018-05-17

    Innovative strategies are needed to improve the prevalence of working smoke alarms in homes. To our knowledge, this is the first study to report on the effectiveness of Facebook advertising and automated telephone calls as population-level strategies to encourage an injury prevention behavior. We examine the effectiveness of Facebook advertising and automated telephone calls as strategies to enroll individuals in Baltimore City's Fire Department's free smoke alarm installation program. We directed our advertising efforts toward Facebook users eligible for the Baltimore City Fire Department's free smoke alarm installation program and all homes with a residential phone line included in Baltimore City's automated call system. The Facebook campaign targeted Baltimore City residents 18 years of age and older. In total, an estimated 300 000 Facebook users met the eligibility criteria. Facebook advertisements were delivered to users' desktop and mobile device newsfeeds. A prerecorded message was sent to all residential landlines listed in the city's automated call system. By the end of the campaign, the 3 advertisements generated 456 666 impressions reaching 130 264 Facebook users. Of the users reached, 4367 individuals (1.3%) clicked the advertisement. The automated call system included approximately 90 000 residential phone numbers. Participants attributed 25 smoke alarm installation requests to Facebook and 458 to the automated call. Facebook advertisements are a novel approach to promoting smoke alarms and appear to be effective in exposing individuals to injury prevention messages. However, converting Facebook message recipients to users of a smoke alarm installation program occurred infrequently in this study. Residents who participated in the smoke alarm installation program were more likely to cite the automated call as the impetus for their participation. Additional research is needed to understand the circumstances and strategies to effectively use the social

  12. SlugIn 1.0: A Free Tool for Automated Slug Test Analysis.

    PubMed

    Martos-Rosillo, Sergio; Guardiola-Albert, Carolina; Padilla Benítez, Alberto; Delgado Pastor, Joaquín; Azcón González, Antonio; Durán Valsero, Juan José

    2018-05-01

    The correct characterization of aquifer parameters is essential for water-supply and water-quality investigations. Slug tests are widely used for these purposes. While free software is available to interpret slug tests, some codes are not user-friendly, or do not include a wide range of methods to interpret the results, or do not include automatic, inverse solutions to the test data. The private sector has also generated several good programs to interpret slug test data, but they are not free of charge. The computer program SlugIn 1.0 is available online for free download, and is demonstrated to aid in the analysis of slug tests to estimate hydraulic parameters. The program provides an easy-to-use Graphical User Interface. SlugIn 1.0 incorporates automated parameter estimation and facilitates the visualization of several interpretations of the same test. It incorporates solutions for confined and unconfined aquifers, partially penetrating wells, skin effects, shape factor, anisotropy, high hydraulic conductivity formations and the Mace test for large-diameter wells. It is available in English and Spanish and can be downloaded from the web site of the Geological Survey of Spain. Two field examples are presented to illustrate how the software operates. © 2018, National Ground Water Association.
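
    The abstract does not state which analytical solutions SlugIn applies; as one example of automated parameter estimation from slug-test data, the sketch below fits the classical Hvorslev model to a hypothetical head-recovery record. The well geometry and head values are invented for the example.

```python
import numpy as np

# Hypothetical slug-test record: time (s) and normalized head H(t)/H0.
t = np.array([0, 10, 20, 40, 60, 90, 120, 180], dtype=float)
h_norm = np.array([1.00, 0.78, 0.61, 0.37, 0.23, 0.11, 0.055, 0.013])

# Hvorslev assumes ln(H/H0) decays linearly with time: ln(H/H0) = -t / T0.
slope, _ = np.polyfit(t, np.log(h_norm), 1)
T0 = -1.0 / slope            # basic time lag: time for the head to fall to 37% of H0

# Hvorslev solution (partially penetrating screen, L / r_w > 8):
#   K = r_c^2 * ln(L / r_w) / (2 * L * T0)
r_c, r_w, L = 0.05, 0.05, 3.0   # casing radius, well radius, screen length (m); hypothetical
K = r_c**2 * np.log(L / r_w) / (2 * L * T0)
print(f"basic time lag T0 = {T0:.1f} s, estimated K = {K:.2e} m/s")
```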

  13. Department of Defense Travel Reengineering Pilot Report to Congress

    DTIC Science & Technology

    1997-06-01

    Electronic Commerce/Electronic Data Interchange (EC/EDI) capabilities to integrate functions, automate edit checks for internal controls, and create user-friendly management tools at all levels of the process.

  14. CONSUME: users guide.

    Treesearch

    R.D. Ottmar; M.F. Burns; J.N. Hall; A.D. Hanson

    1993-01-01

    CONSUME is a user-friendly computer program designed for resource managers with some working knowledge of IBM-PC applications. The software predicts the amount of fuel consumption on logged units based on weather data, the amount and fuel moisture of fuels, and a number of other factors. Using these predictions, the resource manager can accurately determine when and...

  15. A user-friendly tool to transform large scale administrative data into wide table format using a MapReduce program with a Pig Latin based script.

    PubMed

    Horiguchi, Hiromasa; Yasunaga, Hideo; Hashimoto, Hideki; Ohe, Kazuhiko

    2012-12-22

    Secondary use of large scale administrative data is increasingly popular in health services and clinical research, where a user-friendly tool for data management is in great demand. MapReduce technology such as Hadoop is a promising tool for this purpose, though its use has been limited by the lack of user-friendly functions for transforming large scale data into wide table format, where each subject is represented by one row, for use in health services and clinical research. Since the original specification of Pig provides very few functions for column field management, we have developed a novel system called GroupFilterFormat to handle the definition of field and data content based on a Pig Latin script. We have also developed, as an open-source project, several user-defined functions to transform the table format using GroupFilterFormat and to deal with processing that considers date conditions. Having prepared dummy discharge summary data for 2.3 million inpatients and medical activity log data for 950 million events, we used the Elastic Compute Cloud environment provided by Amazon Inc. to execute processing speed and scaling benchmarks. In the speed benchmark test, the response time was significantly reduced and a linear relationship was observed between the quantity of data and processing time in both a small and a very large dataset. The scaling benchmark test showed clear scalability. In our system, doubling the number of nodes resulted in a 47% decrease in processing time. Our newly developed system is widely accessible as an open resource. This system is very simple and easy to use for researchers who are accustomed to using declarative command syntax for commercial statistical software and Structured Query Language. Although our system needs further sophistication to allow more flexibility in scripts and to improve efficiency in data processing, it shows promise in facilitating the application of MapReduce technology to efficient data processing with
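
    The tool itself is built on Pig Latin and Hadoop; to keep a single illustration language in this document, the long-to-wide ("one row per subject") transformation it performs can be sketched conceptually in Python with pandas. The column names and values below are hypothetical.

```python
import pandas as pd

# Hypothetical "long" activity log: one row per event.
long_log = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 3],
    "item":       ["drug_A", "drug_B", "lab_X", "drug_A", "lab_X", "drug_B"],
    "value":      [1, 1, 5.2, 1, 4.8, 1],
})

# Wide table: one row per subject, one column per item (missing entries -> 0).
wide = (long_log
        .pivot_table(index="patient_id", columns="item", values="value",
                     aggfunc="max", fill_value=0)
        .reset_index())
print(wide)
```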

  16. FRAME (Force Review Automation Environment): MATLAB-based AFM data processor.

    PubMed

    Partola, Kostyantyn R; Lykotrafitis, George

    2016-05-03

    Data processing of force-displacement curves generated by atomic force microscopes (AFMs) for elastic moduli and unbinding event measurements is very time consuming and susceptible to user error or bias. There is an evident need for consistent, dependable, and easy-to-use AFM data processing software. We have developed an open-source software application, the force review automation environment (or FRAME), that provides users with an intuitive graphical user interface, automating data processing, and tools for expediting manual processing. We did not observe a significant difference between manually processed and automatically processed results from the same data sets. Copyright © 2016 Elsevier Ltd. All rights reserved.
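
    The abstract does not specify the contact model FRAME fits; a common choice for extracting an elastic modulus from AFM force-indentation data is the Hertz model for a spherical indenter, sketched below with synthetic data. The tip radius, Poisson's ratio, and data are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

R = 2.5e-6      # assumed spherical tip radius (m)
nu = 0.5        # assumed Poisson's ratio of the sample

def hertz_sphere(delta, E):
    """Hertzian force (N) vs indentation delta (m) for a spherical indenter."""
    return (4.0 / 3.0) * (E / (1.0 - nu**2)) * np.sqrt(R) * delta**1.5

# Hypothetical indentation data generated from E = 5 kPa with a little noise.
delta = np.linspace(0, 1e-6, 50)
rng = np.random.default_rng(0)
force = hertz_sphere(delta, 5e3) + rng.normal(0, 2e-11, delta.size)

(E_fit,), _ = curve_fit(hertz_sphere, delta, force, p0=[1e3])
print(f"fitted Young's modulus: {E_fit / 1e3:.2f} kPa")
```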

  17. User Preferences and Design Recommendations for an mHealth App to Promote Cystic Fibrosis Self-Management

    PubMed Central

    Hilliard, Marisa E; Hahn, Amy; Ridge, Alana K; Eakin, Michelle N

    2014-01-01

    Background: mHealth apps hold potential to provide automated, tailored support for treatment adherence among individuals with chronic medical conditions. Yet relatively little empirical research has guided app development, and end users are infrequently involved in designing the app features or functions that would best suit their needs. Self-management apps may be particularly useful for people with chronic conditions like cystic fibrosis (CF) that have complex, demanding regimens. Objective: The aim of this mixed-methods study was to involve individuals with CF in guiding the development of engaging, effective, user-friendly adherence promotion apps that meet their preferences and self-management needs. Methods: Adults with CF (n=16, aged 21-48 years, 50% male) provided quantitative data via a secure Web survey and qualitative data via semi-structured telephone interviews regarding previous experiences using apps in general and for health, and preferred and unwanted features of potential future apps to support CF self-management. Results: Participants were smartphone users who reported sending or receiving text messages (93%, 14/15) or emails (80%, 12/15) on their smartphone or device every day, and 87% (13/15) said it would be somewhat or very hard to give up their smartphone. Approximately one-half (53%, 8/15) reported having health apps, all diet/weight-related, yet many reported that existing nutrition apps were not well-suited for CF management. Participants wanted apps to support CF self-management with characteristics such as having multiple rather than single functions (eg, simple alarms), being specific to CF, and minimizing user burden. Common themes for desired CF app features were having information at one's fingertips, automation of disease management activities such as pharmacy refills, integration with smartphones' technological capabilities, enhancing communication with health care team, and facilitating socialization within the CF community

  18. Microbioreactors with microfluidic control and a user-friendly connection to the actuator hardware

    NASA Astrophysics Data System (ADS)

    Buchenauer, A.; Funke, M.; Büchs, J.; Mokwa, W.; Schnakenberg, U.

    2009-07-01

    In this study, an array of microbioreactors based on the format of 48-well microtiter plates (MTPs) is presented. The process parameters pH and biomass are monitored online using commercially available optical sensor technology. A microfluidic device dispenses acid or base individually into each well for controlling the pH of fermentations. Fluid volumes from 72 nL to 940 nL can be supplied with valve opening times between 10 ms and 200 ms. One microfluidic device is capable of supplying four wells from two reservoirs. Up to four microfluidic devices can be integrated on the area of a prototype MTP. The devices are fabricated in polydimethylsiloxane (PDMS) using soft lithographic techniques and utilize pneumatically actuated microvalves. During fermentations, the microbioreactor is clamped to an orbital shaker and a temporary pneumatic connection guides the externally controlled pressurized air to the microfluidic device. Finally, fermentations of Escherichia coli in the presence and absence of pH control are carried out in the microbioreactor system over 18 h. During the fermentation the pH of the cultures is continuously monitored by means of optodes. An ammonia solution or phosphoric acid is dispensed to adjust the pH if it differs from the set point of 7.2. In a controlled culture, the pH can be sustained within 7.0 to 7.3 while the pH in an uncontrolled culture ranges between 6.5 and 9.0. This microbioreactor demonstrates the possibility of pH-controlled fermentations in micro-scale. The process control and the user friendly connection to the actuation hardware provide an easy handling comparable to standard MTPs.
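
    The abstract reports the set point (pH 7.2), the observed control band (7.0 to 7.3), and the dispensable volume range (72-940 nL), but not the control law itself. The sketch below shows a simple dead-band (on/off) dosing rule as one plausible illustration; the threshold width and dose size are assumptions.

```python
SETPOINT, DEADBAND = 7.2, 0.1      # target pH from the abstract; dead-band width assumed
DOSE_NL = (72, 940)                # dispensable volume range reported in the abstract (nL)

def dosing_decision(ph):
    """Simple dead-band (on/off) dosing rule; the actual controller is not described."""
    if ph < SETPOINT - DEADBAND:
        return ("base", DOSE_NL[0])     # small shot of ammonia solution
    if ph > SETPOINT + DEADBAND:
        return ("acid", DOSE_NL[0])     # small shot of phosphoric acid
    return (None, 0)

for ph in (6.8, 7.15, 7.45):
    agent, volume = dosing_decision(ph)
    print(f"pH {ph}: dispense {volume} nL of {agent}" if agent else f"pH {ph}: no action")
```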

  19. Measuring Individual Differences in the Perfect Automation Schema.

    PubMed

    Merritt, Stephanie M; Unnerstall, Jennifer L; Lee, Deborah; Huber, Kelli

    2015-08-01

    A self-report measure of the perfect automation schema (PAS) is developed and tested. Researchers have hypothesized that the extent to which users possess a PAS is associated with greater decreases in trust after users encounter automation errors. However, no measure of the PAS currently exists. We developed a self-report measure assessing two proposed PAS factors: high expectations and all-or-none thinking about automation performance. In two studies, participants responded to our PAS measure, interacted with imperfect automated aids, and reported trust. Each of the two PAS measure factors demonstrated fit to the hypothesized factor structure and convergent and discriminant validity when compared with propensity to trust machines and trust in a specific aid. However, the high expectations and all-or-none thinking scales showed low intercorrelations and differential relationships with outcomes, suggesting that they might best be considered two separate constructs rather than two subfactors of the PAS. All-or-none thinking had significant associations with decreases in trust following aid errors, whereas high expectations did not. Results therefore suggest that the all-or-none thinking scale may best represent the PAS construct. Our PAS measure (specifically, the all-or-none thinking scale) significantly predicted the severe trust decreases thought to be associated with high PAS. Further, it demonstrated acceptable psychometric properties across two samples. This measure may be used in future work to assess levels of PAS in users of automated systems in either research or applied settings. © 2015, Human Factors and Ergonomics Society.

  20. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    NASA Technical Reports Server (NTRS)

    Saito, Jim

    1987-01-01

    The user guide of verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run to use the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools since they were not updated and are not currently active. Additionally, this guide contains the current descriptions of the AED V&V tools and provides information to augment NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100 and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.

  1. The effect of simulated field storage conditions on the accuracy of rapid user-friendly blood pathogen detection kits.

    PubMed

    Bienek, Diane R; Charlton, David G

    2012-05-01

    Being able to test for the presence of blood pathogens at forward locations could reduce morbidity and mortality in the field. Rapid, user-friendly blood typing kits for detecting Human Immunodeficiency Virus (HIV), Hepatitis C Virus (HCV), and Hepatitis B Virus (HBV) were evaluated to determine their accuracy after storage at various temperatures/humidities. Rates of positive tests of control groups, experimental groups, and industry standards were compared (Fisher's exact χ², p ≤ 0.05). Compared to the control group, 2 of 10 HIV detection devices were adversely affected by exposure to high temperature/high humidity or high temperature/low humidity. With one exception, none of the environmentally exposed HCV or HBV detection devices exhibited significant differences compared to those stored under control conditions. For HIV, HCV, and HBV devices, there were differences compared to the industry standard. Collectively, this evaluation of pathogen detection kits revealed that diagnostic performance varies among products and storage conditions, and that the tested products cannot be considered to be approved for use to screen blood, plasma, cell, or tissue donors.
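
    The comparison reported above uses Fisher's exact test at p ≤ 0.05; the sketch below shows such a comparison on a hypothetical 2×2 table with scipy. The counts are invented for illustration and are not the study's data.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = storage condition (control, high temp/humidity),
# columns = test result on known-positive samples (positive, negative).
table = [[48, 2],
         [39, 11]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
significant = p_value <= 0.05
print("difference from control is", "significant" if significant else "not significant")
```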

  2. Human Papillomavirus Genotyping Using an Automated Film-Based Chip Array

    PubMed Central

    Erali, Maria; Pattison, David C.; Wittwer, Carl T.; Petti, Cathy A.

    2009-01-01

    The INFINITI HPV-QUAD assay is a commercially available genotyping platform for human papillomavirus (HPV) that uses multiplex PCR, followed by automated processing for primer extension, hybridization, and detection. The analytical performance of the HPV-QUAD assay was evaluated using liquid cervical cytology specimens, and the results were compared with those results obtained using the digene High-Risk HPV hc2 Test (HC2). The specimen types included Surepath and PreservCyt transport media, as well as residual SurePath and HC2 transport media from the HC2 assay. The overall concordance of positive and negative results following the resolution of indeterminate and intermediate results was 83% among the 197 specimens tested. HC2 positive (+) and HPV-QUAD negative (−) results were noted in 24 specimens that were shown by real-time PCR and sequence analysis to contain no HPV, HPV types that were cross-reactive in the HC2 assay, or low virus levels. Conversely, HC2 (−) and HPV-QUAD (+) results were noted in four specimens and were subsequently attributed to cross-contamination. The most common HPV types to be identified in this study were HPV16, HPV18, HPV52/58, and HPV39/56. We show that the HPV-QUAD assay is a user friendly, automated system for the identification of distinct HPV genotypes. Based on its analytical performance, future studies with this platform are warranted to assess its clinical utility for HPV detection and genotyping. PMID:19644025

  3. Human papillomavirus genotyping using an automated film-based chip array.

    PubMed

    Erali, Maria; Pattison, David C; Wittwer, Carl T; Petti, Cathy A

    2009-09-01

    The INFINITI HPV-QUAD assay is a commercially available genotyping platform for human papillomavirus (HPV) that uses multiplex PCR, followed by automated processing for primer extension, hybridization, and detection. The analytical performance of the HPV-QUAD assay was evaluated using liquid cervical cytology specimens, and the results were compared with those results obtained using the digene High-Risk HPV hc2 Test (HC2). The specimen types included Surepath and PreservCyt transport media, as well as residual SurePath and HC2 transport media from the HC2 assay. The overall concordance of positive and negative results following the resolution of indeterminate and intermediate results was 83% among the 197 specimens tested. HC2 positive (+) and HPV-QUAD negative (-) results were noted in 24 specimens that were shown by real-time PCR and sequence analysis to contain no HPV, HPV types that were cross-reactive in the HC2 assay, or low virus levels. Conversely, HC2 (-) and HPV-QUAD (+) results were noted in four specimens and were subsequently attributed to cross-contamination. The most common HPV types to be identified in this study were HPV16, HPV18, HPV52/58, and HPV39/56. We show that the HPV-QUAD assay is a user friendly, automated system for the identification of distinct HPV genotypes. Based on its analytical performance, future studies with this platform are warranted to assess its clinical utility for HPV detection and genotyping.

  4. ARIA: Delivering state-of-the-art InSAR products to end users

    NASA Astrophysics Data System (ADS)

    Agram, P. S.; Owen, S. E.; Hua, H.; Manipon, G.; Sacco, G. F.; Bue, B. D.; Fielding, E. J.; Yun, S. H.; Simons, M.; Webb, F.; Rosen, P. A.; Lundgren, P.; Liu, Z.

    2016-12-01

    The Advanced Rapid Imaging and Analysis (ARIA) Center for Natural Hazards aims to bring state-of-the-art geodetic imaging capabilities to an operational level in support of local, national, and international hazard response communities. The ARIA project's first foray into operational generation of InSAR products was with the Calimap Project, in collaboration with ASI-CIDOT, using X-band data from the COSMO-SkyMed constellation. Over the last year, ARIA's processing infrastructure has been significantly upgraded to exploit the free stream of high-quality C-band SAR data from ESA's Sentinel-1 mission and related algorithmic improvements to the ISCE software. ARIA's data system can now operationally generate geocoded unwrapped phase and coherence products in GIS-friendly formats from Sentinel-1 TOPS mode data in an automated fashion, and this capability is currently being exercised at various study sites across the United States and beyond, including Hawaii, Central California, Iceland, and South America. The ARIA team, building on the experience gained from handling X-band and C-band data, has also built an automated machine learning-based classifier to label the auto-generated interferograms based on phase unwrapping quality. These high-quality "time-series ready" InSAR products generated using state-of-the-art processing algorithms can be accessed by end users through two different mechanisms: 1) a faceted-search interface that includes browse imagery for quick visualization, and 2) an ElasticSearch-based API to enable bulk automated download, post-processing, and time-series analysis. In this talk, we will present InSAR results from various global events that the ARIA system has responded to. We will also discuss the set of geospatial big data tools, including GIS libraries and API tools, that end users will need to familiarize themselves with in order to maximize the utilization of the continuous stream of InSAR products from the Sentinel-1 and NISAR missions that the ARIA project will generate.
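
    As an editorial illustration of the second access mechanism described above, the short sketch below pages through an ElasticSearch-style search API and collects product download URLs. The endpoint, field names, and response layout are assumptions for the sake of the example, not the actual ARIA API.

        # Hypothetical sketch of bulk metadata retrieval from an ElasticSearch-style
        # search API; the endpoint, query fields, and response schema below are
        # assumptions, not the actual ARIA interface.
        import requests

        SEARCH_URL = "https://example.org/aria/search"  # placeholder endpoint

        def fetch_product_urls(start="2016-01-01", end="2016-12-31", page_size=100):
            """Page through search hits and collect product download URLs."""
            urls, offset = [], 0
            while True:
                query = {
                    "from": offset,
                    "size": page_size,
                    "query": {"range": {"acquisition_date": {"gte": start, "lte": end}}},
                }
                hits = requests.post(SEARCH_URL, json=query, timeout=30).json()["hits"]["hits"]
                if not hits:
                    break
                urls.extend(h["_source"]["download_url"] for h in hits)
                offset += page_size
            return urls

        if __name__ == "__main__":
            print(len(fetch_product_urls()))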

  5. [Automation in surgery: a systematical approach].

    PubMed

    Strauss, G; Meixensberger, J; Dietz, A; Manzey, D

    2007-04-01

    Surgical assistance systems allow the surgeon's work to shift from a purely manual to an assisted activity (automation). Automation describes a system that partly or fully performs a function that was previously carried out, in whole or in part, by the user. Organizing surgical assistance systems by application (planning, simulation, intraoperative navigation and visualization) or by the technical configuration of the system (manipulator, robot) is not suitable for describing the interaction between the user (the surgeon) and the system. The present work aims to provide a classification of the degree of automation of surgical interventions and to describe it by examples. The classification presented is oriented toward prior work in the human factors sciences. A surgical intervention is considered automated when the system takes over a task that was previously assigned solely to the surgeon. For both agents (human and machine), a passive or an active role comes into consideration. Systems can further be classified according to which functions are taken over by the human and/or the surgical assistance system under a selected division of functions. Three functional areas are differentiated: "information acquisition and analysis", "decision making and action planning", and "execution of the surgical action". This yields a classification of pre- and intraoperative surgical assistance systems into six categories representing different degrees of automation. The classification scheme is described and illustrated using surgical examples.
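
    As a reading aid only, the division of labor described above (three functional areas, each of which may be taken over by the surgeon or by the assistance system) can be written down as a small lookup. The sketch below is minimal; the coarse 0-3 degree it computes is illustrative and is not the authors' published six-category scheme.

        # Illustrative encoding of the classification idea: who (surgeon vs. system)
        # covers each of the three functional areas. The degree computed below is a
        # hypothetical simplification, not the authors' six-category scheme.
        FUNCTIONAL_AREAS = (
            "information acquisition and analysis",
            "decision making and action planning",
            "execution of the surgical action",
        )

        def automation_degree(assignment):
            """Count how many functional areas the assistance system takes over.

            assignment: dict mapping each functional area to 'surgeon' or 'system'.
            Returns a coarse degree from 0 (fully manual) to 3 (fully automated).
            """
            return sum(1 for area in FUNCTIONAL_AREAS if assignment[area] == "system")

        example = {
            "information acquisition and analysis": "system",   # e.g. a navigation display
            "decision making and action planning": "surgeon",
            "execution of the surgical action": "surgeon",
        }
        print(automation_degree(example))  # -> 1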

  6. Wet Lab Accelerator: A Web-Based Application Democratizing Laboratory Automation for Synthetic Biology.

    PubMed

    Bates, Maxwell; Berliner, Aaron J; Lachoff, Joe; Jaschke, Paul R; Groban, Eli S

    2017-01-20

    Wet Lab Accelerator (WLA) is a cloud-based tool that allows a scientist to conduct biology via robotic control without the need for any programming knowledge. A drag and drop interface provides a convenient and user-friendly method of generating biological protocols. Graphically developed protocols are turned into programmatic instruction lists required to conduct experiments at the cloud laboratory Transcriptic. Prior to the development of WLA, biologists were required to write in a programming language called "Autoprotocol" in order to work with Transcriptic. WLA relies on a new abstraction layer we call "Omniprotocol" to convert the graphical experimental description into lower level Autoprotocol language, which then directs robots at Transcriptic. While WLA has only been tested at Transcriptic, the conversion of graphically laid out experimental steps into Autoprotocol is generic, allowing extension of WLA into other cloud laboratories in the future. WLA hopes to democratize biology by bringing automation to general biologists.
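
    A minimal sketch of the idea of turning graphically assembled protocol steps into a programmatic instruction list is shown below. The step vocabulary and the emitted JSON fields are illustrative assumptions that only mimic the general shape of an Autoprotocol-style instruction list; they are not the real Autoprotocol or Omniprotocol schema.

        # Toy converter from high-level protocol steps to a JSON instruction list.
        # The step vocabulary and the output fields are illustrative assumptions,
        # not the actual Omniprotocol/Autoprotocol specification.
        import json

        def to_instruction_list(steps):
            instructions = []
            for step in steps:
                if step["action"] == "transfer":
                    instructions.append({
                        "op": "pipette",
                        "from": step["source"],
                        "to": step["destination"],
                        "volume": step["volume"],
                    })
                elif step["action"] == "incubate":
                    instructions.append({
                        "op": "incubate",
                        "object": step["plate"],
                        "duration": step["duration"],
                        "temperature": step["temperature"],
                    })
                else:
                    raise ValueError(f"unsupported step: {step['action']}")
            return instructions

        protocol = [
            {"action": "transfer", "source": "reagents/A1", "destination": "plate/B2",
             "volume": "50:microliter"},
            {"action": "incubate", "plate": "plate", "duration": "30:minute",
             "temperature": "37:celsius"},
        ]
        print(json.dumps({"instructions": to_instruction_list(protocol)}, indent=2))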

  7. Automated CPX support system preliminary design phase

    NASA Technical Reports Server (NTRS)

    Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.

    1984-01-01

    The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercise was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.

  8. Developing a user-friendly photometric software for exoplanets to increase participation in Citizen Science

    NASA Astrophysics Data System (ADS)

    Kokori, A.; Tsiaras, A.

    2017-09-01

    Previous research on Citizen Science projects agrees that Citizen Science (CS) can serve as a way of both increasing levels of public understanding of science and public participation in scientific research. Historically, the concept of CS is not new; it dates back to the 20th century, when citizens were making skilled observations, particularly in archaeology, ecology, and astronomy. Recently, the idea of CS has gained new momentum due to technological progress and the arrival of the Internet. The phrase "astronomy from the chair" used in the literature highlights the convenience of analysing observational data remotely. Citizen science benefits a variety of communities, such as scientific researchers, volunteers and STEM educators. Participating in CS projects not only engages volunteers with the research goals of a science team, but also helps them learn more about specialised scientific topics. In the case of astronomy, typical examples of CS projects are gathering observational data or/and analysing them. The Holomon Photometric Software (HOPS) is user-friendly photometric software for exoplanets, in which graphical representations, statistics, models, and options are brought together in a single package. It was originally developed to analyse observations of transiting exoplanets obtained from the Holomon Astronomical Station of the Aristotle University of Thessaloniki. Here, we make the case that this software can be used as part of a CS project for analysing transiting exoplanets and producing light curves. HOPS could contribute to the scientific data analysis, but it could also be used as an educational tool for learning and visualizing photometric analyses of transiting exoplanets. Such a tool could prove very effective in the context of public participation in research. In recent successful representative examples such as Galaxy Zoo, professional astronomers cooperating with citizen scientists discovered a group of rare galaxies by using

  9. User Friendly Processing of Sediment CT Data: Software and Application in High Resolution Non-Destructive Sediment Core Data Sets

    NASA Astrophysics Data System (ADS)

    Reilly, B. T.; Stoner, J. S.; Wiest, J.; Abbott, M. B.; Francus, P.; Lapointe, F.

    2015-12-01

    Computed Tomography (CT) of sediment cores allows for high resolution images, three dimensional volumes, and down core profiles, generated through the attenuation of X-rays as a function of density and atomic number. When using a medical CT-Scanner, these quantitative data are stored in pixels using the Hounsfield scale, which is relative to the attenuation of X-rays in water and air at standard temperature and pressure. Here we present MATLAB-based software specifically designed for sedimentary applications with a user-friendly graphical interface to process DICOM files and stitch overlapping CT scans. For visualization, the software allows easy generation of core slice images with grayscale and false color relative to a user-defined Hounsfield number range. For comparison to other high resolution non-destructive methods, down core Hounsfield number profiles are extracted using a method robust to coring imperfections, such as deformation, bowing, gaps, and gas expansion. We demonstrate the usefulness of this technique with lacustrine sediment cores from the Western United States and Canadian High Arctic, including Fish Lake, Oregon, and Sawtooth Lake, Ellesmere Island. These sites represent two different depositional environments and provide examples of a variety of common coring defects and lithologies. The Hounsfield profiles and images can be used in combination with other high resolution data sets, including sediment magnetic parameters, XRF core scans and many other types of data, to provide unique insights into how lithology influences paleoenvironmental and paleomagnetic records and their interpretations.
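
    The profile-extraction step lends itself to a short sketch: convert stored pixel values to Hounsfield units with the standard DICOM rescale slope and intercept, then take a robust (median) statistic over a central window of each slice so that local defects contribute less. This assumes slices readable with pydicom and placeholder file paths; it is not the authors' MATLAB code.

        # Sketch: down-core Hounsfield-number profile from a stack of DICOM slices.
        # Assumes axial slices ordered down core; file paths are placeholders.
        import numpy as np
        import pydicom

        def hounsfield(slice_path):
            ds = pydicom.dcmread(slice_path)
            # Standard DICOM rescaling from stored values to Hounsfield units.
            return ds.pixel_array * float(ds.RescaleSlope) + float(ds.RescaleIntercept)

        def downcore_profile(slice_paths, half_width=20):
            """Median HU in a central window of each slice (robust to local defects)."""
            profile = []
            for path in slice_paths:
                hu = hounsfield(path)
                cy, cx = np.array(hu.shape) // 2
                window = hu[cy - half_width:cy + half_width, cx - half_width:cx + half_width]
                profile.append(float(np.median(window)))
            return np.array(profile)

        # Example (placeholder paths):
        # profile = downcore_profile(sorted(glob.glob("core_scan/*.dcm")))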

  10. Semi-Automated Identification of Rocks in Images

    NASA Technical Reports Server (NTRS)

    Bornstein, Benjamin; Castano, Andres; Anderson, Robert

    2006-01-01

    Rock Identification Toolkit Suite is a computer program that assists users in identifying and characterizing rocks shown in images returned by the Mars Exploration Rover mission. Included in the program are components for automated finding of rocks, interactive adjustment of rock outlines, active contouring of rocks, and automated analysis of shapes in two dimensions. The program assists users in evaluating the surface properties of rocks and soil and reports basic properties of rocks. The program requires either the Mac OS X operating system running on a G4 (or more capable) processor or a Linux operating system running on a Pentium (or more capable) processor, plus at least 128 MB of random-access memory.

  11. Poster - 09: A MATLAB-based Program for Automated Quality Assurance of a Prostate Brachytherapy Ultrasound System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poon, Justin; Sabondjian, Eric; Sankreacha, Raxa

    Purpose: A robust Quality Assurance (QA) program is essential for prostate brachytherapy ultrasound systems due to the importance of imaging accuracy during treatment and planning. Task Group 128 of the American Association of Physicists in Medicine has recommended a set of QA tests covering grayscale visibility, depth of penetration, axial and lateral resolution, distance measurement, area measurement, volume measurement, and template/electronic grid alignment. Making manual measurements on the ultrasound system can be slow and inaccurate, so a MATLAB program was developed for automation of the described tests. Methods: Test images were acquired using a BK Medical Flex Focus 400 ultrasound scanner and 8848 transducer with the CIRS Brachytherapy QA Phantom – Model 045A. For each test, the program automatically segments the inputted image(s), makes the appropriate measurements, and indicates if the test passed or failed. The program was tested by analyzing two sets of images, where the measurements from the first set were used as baseline values. Results: The program successfully analyzed the images for each test and determined if any action limits were exceeded. All tests passed – the measurements made by the program were consistent and met the requirements outlined by Task Group 128. Conclusions: The MATLAB program we have developed can be used for automated QA of an ultrasound system for prostate brachytherapy. The GUI provides a user-friendly way to analyze images without the need for any manual measurement, potentially removing intra- and inter-user variability for more consistent results.
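
    One of the simpler checks in such a program, a distance measurement compared against a baseline with an action limit, can be sketched as follows. The tolerance and the hard-coded pin positions are placeholders for illustration, not values or methods from the poster.

        # Sketch of a pass/fail distance-measurement QA check. Measured pin
        # positions (mm) would normally come from automated image segmentation;
        # here they are passed in directly. The tolerance is illustrative.
        import math

        def distance_mm(p1, p2):
            return math.dist(p1, p2)  # Euclidean distance, Python 3.8+

        def distance_test(measured_points, baseline_mm, tolerance_mm=1.0):
            """Compare a measured pin-to-pin distance with the baseline value."""
            measured = distance_mm(*measured_points)
            passed = abs(measured - baseline_mm) <= tolerance_mm
            return measured, passed

        measured, ok = distance_test([(10.2, 5.1), (40.1, 5.3)], baseline_mm=30.0)
        print(f"measured {measured:.2f} mm -> {'PASS' if ok else 'FAIL'}")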

  12. A new, fast and semi-automated size determination method (SASDM) for studying multicellular tumor spheroids

    PubMed Central

    Monazzam, Azita; Razifar, Pasha; Lindhe, Örjan; Josephsson, Raymond; Långström, Bengt; Bergström, Mats

    2005-01-01

    Background Considering the widespread use and importance of Multicellular Tumor Spheroids (MTS) in oncology research, determining the size of MTSs with an accurate and fast method is essential. In the present study an effective, fast, and semi-automated method, SASDM, was developed to determine the size of MTSs. The method was applied and tested in MTSs of three different cell lines. Frozen-section autoradiography and Hematoxylin and Eosin (H&E) staining were used for further confirmation. Results SASDM was shown to be effective, user-friendly, and time efficient, to be more precise than traditional methods, and to be applicable to MTSs of different cell lines. Furthermore, the results of image analysis corresponded closely to the results of autoradiography and staining. Conclusion The combination of assessment of metabolic condition and image analysis in MTSs provides a good model to evaluate the effect of various anti-cancer treatments. PMID:16283948
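
    The core of a semi-automated size determination can be sketched as thresholding a grayscale image of a spheroid and converting the projected area to an equivalent circular diameter. The threshold and pixel size below are illustrative assumptions, not the SASDM algorithm itself.

        # Sketch: spheroid size from a thresholded image (equivalent circular diameter).
        import numpy as np

        def equivalent_diameter(image, threshold, pixel_size_um=1.0):
            """Area of pixels above threshold -> diameter of a circle of equal area."""
            mask = image > threshold
            area_um2 = mask.sum() * pixel_size_um ** 2
            return 2.0 * np.sqrt(area_um2 / np.pi)

        # Synthetic example: a bright disk of radius 50 px on a dark background.
        yy, xx = np.mgrid[:200, :200]
        img = ((yy - 100) ** 2 + (xx - 100) ** 2 <= 50 ** 2).astype(float)
        print(equivalent_diameter(img, threshold=0.5, pixel_size_um=2.0))  # ~200 um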

  13. Interactive Office user's manual

    NASA Technical Reports Server (NTRS)

    Montgomery, Edward E.; Lowers, Benjamin; Nabors, Terri L.

    1990-01-01

    Given here is a user's manual for Interactive Office (IO), an executive office tool for organization and planning, written specifically for Macintosh. IO is a paperless management tool to automate a related group of individuals into one productive system.

  14. QuickRNASeq lifts large-scale RNA-seq data analyses to the next level of automation and interactive visualization.

    PubMed

    Zhao, Shanrong; Xi, Li; Quan, Jie; Xi, Hualin; Zhang, Ying; von Schack, David; Vincent, Michael; Zhang, Baohong

    2016-01-08

    RNA sequencing (RNA-seq), a next-generation sequencing technique for transcriptome profiling, is being increasingly used, in part driven by the decreasing cost of sequencing. Nevertheless, the analysis of the massive amounts of data generated by large-scale RNA-seq remains a challenge. Multiple algorithms pertinent to basic analyses have been developed, and there is an increasing need to automate the use of these tools so as to obtain results in an efficient and user-friendly manner. Increased automation and improved visualization of the results will help make the results and findings of the analyses readily available to experimental scientists. By combining the best open source tools developed for RNA-seq data analyses and the most advanced web 2.0 technologies, we have implemented QuickRNASeq, a pipeline for large-scale RNA-seq data analyses and visualization. The QuickRNASeq workflow consists of three main steps. In Step #1, each individual sample is processed, including mapping RNA-seq reads to a reference genome, counting the numbers of mapped reads, quality control of the aligned reads, and SNP (single nucleotide polymorphism) calling. Step #1 is computationally intensive, and can be processed in parallel. In Step #2, the results from individual samples are merged, and an integrated and interactive project report is generated. All analysis results in the report are accessible via a single HTML entry webpage. Step #3 is the data interpretation and presentation step. The rich visualization features implemented here allow end users to interactively explore the results of RNA-seq data analyses, and to gain more insights into RNA-seq datasets. In addition, we used a real-world dataset to demonstrate the simplicity and efficiency of QuickRNASeq in RNA-seq data analyses and interactive visualizations. The seamless integration of automated capabilities with interactive visualizations in QuickRNASeq is not available in other published RNA-seq pipelines. The high degree
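
    Step #1 is described as per-sample and parallelizable, with results merged in Step #2. A bare-bones sketch of that driver pattern is shown below; the per-sample command and the summary fields are placeholders, not the actual QuickRNASeq tools or file layout.

        # Skeleton of a parallel per-sample stage followed by a merge stage.
        # The command run per sample is a placeholder, not the real pipeline.
        from concurrent.futures import ProcessPoolExecutor
        import subprocess

        SAMPLES = ["sample_A", "sample_B", "sample_C"]

        def process_sample(sample):
            """Stage 1: align, count, QC for one sample (placeholder command)."""
            subprocess.run(["echo", f"processing {sample}"], check=True)
            return sample, {"mapped_reads": 0}  # placeholder per-sample summary

        def merge(results):
            """Stage 2: combine per-sample summaries into one project-level table."""
            return {sample: stats for sample, stats in results}

        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:
                merged = merge(pool.map(process_sample, SAMPLES))
            print(merged)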

  15. User-friendly chemistry takes center stage at ACS meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pool, R.

    1992-09-11

    These days it seems that what chemistry needs more than anything else is a good p.r. agent. If you ask John or Joan Q. Public about the accomplishments of the chemical industry, chances are they'll mention Love Canal, CFCs destroying the ozone layer, or carcinogens in food. However, if the national meeting of the American Chemical Society in Washington, D.C., 2 weeks ago is any indication, chemists are working hard to fix the image problem. Nearly all of the two dozen press conferences held during the meeting focused on food, health topics, environment-friendly technology, or some other subject close to consumers' hearts. And the scientific talks themselves reflected the same interests, with sessions such as 'Environmental Successes in the Chemical Industry', 'Food Phytochemicals for Cancer Prevention', 'Chemistry of Electrophilic Metal Complexes', 'New Advances in Polyolefin Polymers', and 'Zapping acid rain with microwaves'.

  16. Normative Data for a User-friendly Paradigm for Pattern Electroretinogram Recording

    PubMed Central

    Porciatti, Vittorio; Ventura, Lori M.

    2009-01-01

    Purpose To provide normative data for a user-friendly paradigm for the pattern electroretinogram (PERG) optimized for glaucoma screening (PERGLA). Design Prospective nonrandomized case series. Participants Ninety-three normal subjects ranging in age between 22 and 85 years. Methods A circular black–white grating of 25° visual angle, reversing 16.28 times per second, was presented on a television monitor placed inside a Ganzfeld bowl. The PERG was recorded simultaneously from both eyes with undilated pupils by means of skin cup electrodes taped over the lower eyelids. Reference electrodes were taped on the ipsilateral temples. Electrophysiologic signals were conventionally amplified, filtered, and digitized. Six hundred artifact-free repetitions were averaged. The response component at the reversal frequency was isolated automatically by digital Fourier transforms and was expressed as a deviation from the age-corrected average. The procedure took approximately 4 minutes. Main Outcome Measures Pattern electroretinogram amplitude (μV) and phase (π rad); response variability (coefficient of variation [CV] = standard deviation [SD] / mean × 100) of amplitude and phase of 2 partial averages that build up the PERG waveform; amplitude (μV) of background noise waveform, obtained by multiplying alternate sweeps by +1 and −1; and interocular asymmetry (CV of amplitude and phase of the PERG of the 2 eyes). Results On average, the PERG has a signal-to-noise ratio of more than 13:1. The CVs of intrasession and intersession variabilities in amplitude and phase are lower than 10% and 2%, respectively, and do not depend on the operator. The CV of interocular asymmetries in amplitude and phase are 9.8±8.8% and 1.5±1.4%, respectively. The PERG amplitude and phase decrease with age. Residuals of linear regression lines have normal distribution, with an SD of 0.1 log units for amplitude and 0.019 log units for phase. Age-corrected confidence limits (P<0.05) are defined as
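
    The analysis described here, isolating the response component at the reversal frequency by Fourier transform and expressing variability as CV = SD / mean × 100, can be sketched with synthetic data as follows; the sampling rate and noise level are arbitrary choices for illustration only.

        # Sketch: amplitude and phase of the component at the reversal frequency,
        # plus the coefficient of variation of two partial averages. Synthetic data.
        import numpy as np

        FS = 1000.0          # sampling rate (Hz), illustrative
        F_REV = 16.28        # pattern reversal frequency from the abstract (Hz)

        def component(signal, fs, freq):
            """Amplitude (signal units) and phase (rad) at a single frequency."""
            t = np.arange(signal.size) / fs
            z = np.sum(signal * np.exp(-2j * np.pi * freq * t)) * 2.0 / signal.size
            return np.abs(z), np.angle(z)

        rng = np.random.default_rng(0)
        t = np.arange(0, 2.0, 1.0 / FS)
        sweep = 1.0 * np.sin(2 * np.pi * F_REV * t) + 0.3 * rng.standard_normal(t.size)

        amp, phase = component(sweep, FS, F_REV)
        half1 = component(sweep[: t.size // 2], FS, F_REV)[0]
        half2 = component(sweep[t.size // 2:], FS, F_REV)[0]
        cv = np.std([half1, half2]) / np.mean([half1, half2]) * 100.0
        print(f"amplitude {amp:.2f}, phase {phase:.2f} rad, CV {cv:.1f}%")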

  17. A User-Friendly Software Package for HIFU Simulation

    NASA Astrophysics Data System (ADS)

    Soneson, Joshua E.

    2009-04-01

    A freely distributed, MATLAB (The Mathworks, Inc., Natick, MA)-based software package for simulating axisymmetric high-intensity focused ultrasound (HIFU) beams and their heating effects is discussed. The package (HIFU_Simulator) consists of a propagation module which solves the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation and a heating module which solves Pennes' bioheat transfer (BHT) equation. The pressure, intensity, heating rate, temperature, and thermal dose fields are computed and plotted, and the output is released to the MATLAB workspace for further user analysis or postprocessing.
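
    The heating module solves Pennes' bioheat transfer equation, rho*c*dT/dt = k*d2T/dx2 - wb*cb*(T - Ta) + Q. A one-dimensional explicit finite-difference sketch of that equation is given below; the tissue constants, heating term, and boundary conditions are generic illustrative choices, not the HIFU_Simulator implementation.

        # 1-D explicit finite-difference sketch of Pennes' bioheat equation:
        #   rho*c*dT/dt = k*d2T/dx2 - wb*cb*(T - Ta) + Q
        # Parameter values are generic soft-tissue-like numbers, for illustration only.
        import numpy as np

        rho, c, k = 1000.0, 3600.0, 0.5        # kg/m^3, J/(kg K), W/(m K)
        wb, cb, Ta = 0.5, 3600.0, 37.0         # perfusion kg/(m^3 s), J/(kg K), deg C
        nx, dx, dt, steps = 101, 1e-3, 0.05, 1200   # dt well below the stability limit

        x = np.arange(nx) * dx
        Q = 5e5 * np.exp(-((x - 0.05) / 0.005) ** 2)   # focal heating (W/m^3), illustrative
        T = np.full(nx, 37.0)

        for _ in range(steps):
            lap = np.zeros_like(T)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
            T = T + dt / (rho * c) * (k * lap - wb * cb * (T - Ta) + Q)
            T[0] = T[-1] = 37.0                # fixed-temperature boundaries

        print(f"peak temperature after {steps * dt:.0f} s: {T.max():.1f} deg C")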

  18. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, that is, with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be succinct. The report discusses the underlying concepts and the formal methods for this approach. Two examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  19. Re-Imagining Archival Display: Creating User-Friendly Finding Aids

    ERIC Educational Resources Information Center

    Daines, J. Gordon, III; Nimer, Cory L.

    2011-01-01

    This article examines how finding aids are structured and delivered, considering alternative approaches. It suggests that single-level displays, those that present a single component of a multilevel description to users at a time, have the potential to transform the delivery and display of collection information while improving the user…

  20. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined, and automation functions are partitioned between the power subsystem, the central spacecraft computer, and ground flight-support personnel. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporation of all these classes of tasks, except for anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high-performance flexibility, the need to accommodate different electrical user-load equipment, on-orbit assembly/maintenance/servicing, and the potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential to meeting the 1987 technology readiness date for the space station.

  1. 18 CFR 37.8 - Obligations of OASIS users.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT OPEN ACCESS SAME-TIME INFORMATION SYSTEMS § 37.8... initiating a significant amount of automated queries. The OASIS user must also notify the Responsible Party one month in advance of expected significant increases in the volume of automated queries. [Order 605...

  2. User-friendly technology to help family carers cope.

    PubMed

    Chambers, Mary; Connor, Samantha L

    2002-12-01

    Increases in the older adult population are occurring simultaneously with a growth in new technology. Modern technology presents an opportunity to enhance the quality of life and independence of older people and their family carers through communication and access to health care information. The aim was to evaluate the usability of a multimedia software application designed to provide family carers of the elderly or disabled with information, advice and psychological support to increase their coping capacity. The interactive application consisted of an information-based package that provided carers with advice on the promotion of psychological health, including relaxation and other coping strategies. The software application also included a carer self-assessment instrument, designed to provide both family and professional carers with information to assess how family carers were coping with their care-giving role. Usability evaluation was carried out in two stages. In the first stage (verification), user trials and an evaluation questionnaire were used to refine and develop the content and usability of the multimedia software application. In the second stage (demonstration), evaluation questionnaires were used to appraise the usability of the modified software application. The findings showed that the majority of users found the software to be usable and informative. Some areas were highlighted for improvement in the navigation of the software. The authors conclude that with further refinement, the software application has the potential to offer information and support to those who are caring for the elderly and disabled at home.

  3. You Don't De-Friend the Dead: An Analysis of Grief Communication by College Students through Facebook Profiles

    ERIC Educational Resources Information Center

    Pennington, Natalie

    2013-01-01

    This research examined how various members of a social network interact with the Facebook (FB) profile page of a friend who has died. From 43 in-depth qualitative interviews, FB friends of deceased FB users maintained their FB connection with the deceased. Most participants who visited the profile found it helpful to look at pictures; a few wrote…

  4. CARE3MENU- A CARE III USER FRIENDLY INTERFACE

    NASA Technical Reports Server (NTRS)

    Pierce, J. L.

    1994-01-01

    CARE3MENU generates an input file for the CARE III program. CARE III is used for reliability prediction of complex, redundant, fault-tolerant systems including digital computers, aircraft, nuclear and chemical control systems. The CARE III input file often becomes complicated and is not easily formatted with a text editor. CARE3MENU provides an easy, interactive method of creating an input file by automatically formatting a set of user-supplied inputs for the CARE III system. CARE3MENU provides detailed on-line help for most of its screen formats. The reliability model input process is divided into sections using menu-driven screen displays. Each stage, or set of identical modules comprising the model, must be identified and described in terms of number of modules, minimum number of modules for stage operation, and critical fault threshold. The fault handling and fault occurrence models are detailed in several screens by parameters such as transition rates, propagation and detection densities, Weibull or exponential characteristics, and model accuracy. The system fault tree and critical pairs fault tree screens are used to define the governing logic and to identify modules affected by component failures. Additional CARE3MENU screens prompt the user for output options and run time control values such as mission time and truncation values. There are fourteen major screens, many with default values and HELP options. The documentation includes: (1) a user's guide with several examples of CARE III models, the dialog required to input them to CARE3MENU, and the output files created; and (2) a maintenance manual for assistance in changing the HELP files and modifying any of the menu formats or contents. CARE3MENU is written in FORTRAN 77 for interactive execution and has been implemented on a DEC VAX series computer operating under VMS. This program was developed in 1985.

  5. Point Analysis in Java applied to histological images of the perforant pathway: a user's account.

    PubMed

    Scorcioni, Ruggero; Wright, Susan N; Patrick Card, J; Ascoli, Giorgio A; Barrionuevo, Germán

    2008-01-01

    The freeware Java tool Point Analysis in Java (PAJ), created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (x2 objective) comprised the entire perforant pathway, while the high magnification set (x100 objective) allowed the identification of individual fibers. A preliminary stereological study revealed a striking linear relationship between the fiber count at high magnification and the optical density at low magnification. PAJ enabled fast analysis for down-sampled data sets and a friendly interface with automated plot drawings. Noted strengths included the multi-platform support as well as the free availability of the source code, conducive to a broad user base and maximum flexibility for ad hoc requirements. PAJ has great potential to extend its usability by (a) improving its graphical user interface, (b) increasing its input size limit, (c) improving response time for large data sets, and (d) potentially being integrated with other Java graphical tools such as ImageJ.

  6. AlaScan: A Graphical User Interface for Alanine Scanning Free-Energy Calculations.

    PubMed

    Ramadoss, Vijayaraj; Dehez, François; Chipot, Christophe

    2016-06-27

    Computation of the free-energy changes that underlie molecular recognition and association has gained significant importance due to its considerable potential in drug discovery. The massive increase of computational power in recent years substantiates the application of more accurate theoretical methods for the calculation of binding free energies. The impact of such advances is the application of parent approaches, like computational alanine scanning, to investigate in silico the effect of amino-acid replacement in protein-ligand and protein-protein complexes, or probe the thermostability of individual proteins. Because human effort represents a significant cost that precludes the routine use of this form of free-energy calculations, minimizing manual intervention constitutes a stringent prerequisite for any such systematic computation. With this objective in mind, we propose a new plug-in, referred to as AlaScan, developed within the popular visualization program VMD to automate the major steps in alanine-scanning calculations, employing free-energy perturbation as implemented in the widely used molecular dynamics code NAMD. The AlaScan plug-in can be utilized upstream, to prepare input files for selected alanine mutations. It can also be utilized downstream to perform the analysis of different alanine-scanning calculations and to report the free-energy estimates in a user-friendly graphical user interface, allowing favorable mutations to be identified at a glance. The plug-in also assists the end-user in assessing the reliability of the calculation through rapid visual inspection.

  7. Flowing Hot or Cold: User-Friendly Computational Models of Terrestrial and Planetary Lava Channels and Lakes

    NASA Astrophysics Data System (ADS)

    Sakimoto, S. E. H.

    2016-12-01

    Planetary volcanism has redefined what is considered volcanism. "Magma" now may be considered to be anything from the molten rock familiar at terrestrial volcanoes to cryovolcanic ammonia-water mixes erupted on an outer solar system moon. However, even with unfamiliar compositions and source mechanisms, we find familiar landforms such as volcanic channels, lakes, flows, and domes, and thus a multitude of possibilities for modeling. As on Earth, these landforms lend themselves to analysis for estimating storage, eruption and/or flow rates. This has potential pitfalls, as extension of the simplified analytic models we often use for terrestrial features into unfamiliar parameter space might yield misleading results. Our most commonly used tools for estimating flow and cooling have tended to lag significantly behind the state of the art; the easiest methods to use are neither realistic nor accurate, but the more realistic and accurate computational methods are not simple to use. Since the latter computational tools tend to be expensive and to require a significant learning curve, there is a need for a user-friendly approach that still takes advantage of their accuracy. One method is to use the computational package to generate a server-based tool that allows less computationally inclined users to get accurate results over their range of input parameters for a given problem geometry. A second method is to use the computational package to generate a polynomial empirical solution for each class of flow geometry that can be evaluated fairly easily by anyone with a spreadsheet. In this study, we demonstrate both approaches for several channel flow and lava lake geometries with terrestrial and extraterrestrial examples and compare their results. Specifically, we model cooling rectangular channel flow with a yield strength material, with applications to Mauna Loa, Kilauea, Venus, and Mars. This approach also shows promise with model applications to lava lakes, magma
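
    The second approach described above, a polynomial empirical fit that stands in for the full computational solution, can be illustrated with a short sketch. The functional form and coefficients below are hypothetical placeholders, not fitted values from this study.

        # Sketch: evaluate a polynomial surrogate for a channel-flow quantity
        # (e.g., centerline velocity) as a function of channel width and slope.
        # Coefficients are placeholders; a real surrogate would be fitted to full
        # computational results for a given flow geometry and rheology.
        import numpy as np

        COEFFS = np.array([0.10, 0.50, 2.00, -0.03])  # hypothetical fit coefficients

        def surrogate_velocity(width_m, slope_deg):
            """Polynomial surrogate: v = c0 + c1*w + c2*s + c3*w*s (illustrative form)."""
            c0, c1, c2, c3 = COEFFS
            return c0 + c1 * width_m + c2 * slope_deg + c3 * width_m * slope_deg

        for w in (5.0, 10.0, 20.0):
            print(w, surrogate_velocity(w, slope_deg=3.0))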

  8. Designing for flexible interaction between humans and automation: delegation interfaces for supervisory control.

    PubMed

    Miller, Christopher A; Parasuraman, Raja

    2007-02-01

    To develop a method enabling human-like, flexible supervisory control via delegation to automation. Real-time supervisory relationships with automation are rarely as flexible as human task delegation to other humans. Flexibility in human-adaptable automation can provide important benefits, including improved situation awareness, more accurate automation usage, more balanced mental workload, increased user acceptance, and improved overall performance. We review problems with static and adaptive (as opposed to "adaptable") automation; contrast these approaches with human-human task delegation, which can mitigate many of the problems; and revise the concept of a "level of automation" as a pattern of task-based roles and authorizations. We argue that delegation requires a shared hierarchical task model between supervisor and subordinates, used to delegate tasks at various levels, and offer instruction on performing them. A prototype implementation called Playbook is described. On the basis of these analyses, we propose methods for supporting human-machine delegation interactions that parallel human-human delegation in important respects. We develop an architecture for machine-based delegation systems based on the metaphor of a sports team's "playbook." Finally, we describe a prototype implementation of this architecture, with an accompanying user interface and usage scenario, for mission planning for uninhabited air vehicles. Delegation offers a viable method for flexible, multilevel human-automation interaction to enhance system performance while maintaining user workload at a manageable level. Most applications of adaptive automation (aviation, air traffic control, robotics, process control, etc.) are potential avenues for the adaptable, delegation approach we advocate. We present an extended example for uninhabited air vehicle mission planning.

  9. FRED user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shilling, J.

    1984-02-01

    FRED, the friendly editor, is a screen-based structured editor. This manual is intended to serve the needs of a wide range of users of the FRED text editor. Most users will find it sufficient to read the introductory material in section 2, supplemented with the full command set description in section 3. Advanced users may wish to change the keystroke sequences which invoke editor commands. Section 4 describes how to change key bindings and how to define command macros. Some users may need to modify a language description or create an entirely new language description for use with FRED. Section 5 describes the format of the language descriptions used by the editor, and describes how to construct a language grammar. Section 6 describes known portability problems of the FRED editor and should concern only system installation personnel. The editor points out syntax errors in the file being edited and does automatic pretty printing.

  10. Considerations on automation of coating machines

    NASA Astrophysics Data System (ADS)

    Tilsch, Markus K.; O'Donnell, Michael S.

    2015-04-01

    Most deposition chambers sold into the optical coating market today are outfitted with an automated control system. We surveyed several of the larger equipment providers, and nine of them responded with information about their hardware architecture, data logging, level of automation, error handling, user interface, and interfacing options. In this paper, we present a summary of the results of the survey and describe commonalities and differences together with some considerations of tradeoffs, such as between capability for high customization and simplicity of operation.

  11. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Shafto, Michael; Meyer, George; Clancy, Daniel (Technical Monitor)

    2001-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, that is, with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be as succinct as possible. The report discusses the underlying concepts and the formal methods for this approach. Several examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  12. Intermediate peer contexts and educational outcomes: Do the friends of students' friends matter?

    PubMed

    Carbonaro, William; Workman, Joseph

    2016-07-01

    Sociologists of education have long been interested in the effects of peer relations on educational outcomes. Recent theory and research on adolescence suggest that peers on the boundaries of students' friendship networks may play an important role in shaping behaviors and educational outcomes. In this study, we examine the importance of a key "intermediate peer context" for students' outcomes: the friends of a student's friends. Our findings indicate both friends' and friends' friends' characteristics independently predict students' college expectations and their risk of dropping out of high school (although only friends' characteristics predict GPA). Our models suggest the associations with students' friends-of-friends' characteristics are at least as large as those with their friends' characteristics. Overall, the association between the peer context and students' outcomes is considerably larger when accounting for both the characteristics of students' friends and those of the friends of their friends. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. The heat-transfer method: a versatile low-cost, label-free, fast, and user-friendly readout platform for biosensor applications.

    PubMed

    van Grinsven, Bart; Eersels, Kasper; Peeters, Marloes; Losada-Pérez, Patricia; Vandenryt, Thijs; Cleij, Thomas J; Wagner, Patrick

    2014-08-27

    In recent years, biosensors have become increasingly important in various scientific domains including medicine, biology, and pharmacology, resulting in an increased demand for fast and effective readout techniques. In this Spotlight on Applications, we report on the recently developed heat-transfer method (HTM) and illustrate the use of the technique by zooming in on four established bio(mimetic) sensor applications: (i) mutation analysis in DNA sequences, (ii) cancer cell identification through surface-imprinted polymers, (iii) detection of neurotransmitters with molecularly imprinted polymers, and (iv) phase-transition analysis in lipid vesicle layers. The methodology is based on changes in heat-transfer resistance at a functionalized solid-liquid interface. To this end, the device applies a temperature gradient over this interface and monitors the temperature underneath and above the functionalized chip in time. The heat-transfer resistance can be obtained by dividing this temperature gradient by the power needed to achieve a programmed temperature. The low-cost, fast, label-free and user-friendly nature of the technology in combination with a high degree of specificity, selectivity, and sensitivity makes HTM a promising sensor technology.
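
    The readout quantity is simply the applied temperature difference divided by the heater power needed to hold the programmed temperature. A short sketch of turning logged temperatures and power into a heat-transfer resistance trace is given below; the logging format and the binding-induced step in the toy data are assumptions for illustration.

        # Sketch: heat-transfer resistance Rth = (T1 - T2) / P from logged data,
        # where T1 is the temperature below the chip, T2 above it, and P the heater
        # power. The toy log mimics an increase in Rth after a binding event.
        def heat_transfer_resistance(t1_c, t2_c, power_w):
            return [(a - b) / p for a, b, p in zip(t1_c, t2_c, power_w)]

        t1 = [37.00] * 6
        t2 = [36.20, 36.21, 36.19, 36.05, 36.04, 36.03]
        p  = [0.110, 0.108, 0.111, 0.095, 0.094, 0.096]

        for i, r in enumerate(heat_transfer_resistance(t1, t2, p)):
            print(f"sample {i}: Rth = {r:.2f} degC/W")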

  14. CO2calc: A User-Friendly Seawater Carbon Calculator for Windows, Mac OS X, and iOS (iPhone)

    USGS Publications Warehouse

    Robbins, L.L.; Hansen, M.E.; Kleypas, J.A.; Meylan, S.C.

    2010-01-01

    A user-friendly, stand-alone application for the calculation of carbonate system parameters was developed by the U.S. Geological Survey Florida Shelf Ecosystems Response to Climate Change Project in response to its Ocean Acidification Task. The application, by Mark Hansen and Lisa Robbins, USGS St. Petersburg, FL, Joanie Kleypas, NCAR, Boulder, CO, and Stephan Meylan, Jacobs Technology, St. Petersburg, FL, is intended as a follow-on to CO2SYS, originally developed by Lewis and Wallace (1998) and later modified for Microsoft Excel by Denis Pierrot (Pierrot and others, 2006). Besides eliminating the need for Microsoft Excel on the host system, CO2calc offers several improvements on CO2SYS, including: an improved graphical user interface for data entry and results; additional calculations of air-sea CO2 fluxes (for surface water calculations); the ability to tag data with sample name, comments, date, time, and latitude/longitude; the ability to use the system time and date and latitude/longitude (automatic retrieval of latitude and longitude is available on iPhone 3, 3GS, and 4 and on Windows hosts with an attached National Marine Electronics Association (NMEA)-enabled GPS); the ability to process multiple files in a batch processing mode; an option to save sample information, data input, and calculated results as a comma-separated value (CSV) file for use with Microsoft Excel, ArcGIS, or other applications; and an option to export points with geographic coordinates as a KMZ file for viewing and editing in Google Earth.

  15. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
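
    The bookkeeping problem described here, turning a table of assembly records into thousands of per-assembly input files, is easy to sketch. The record fields and the emitted text are illustrative placeholders, not the actual ORIGAMI/SCALE input syntax.

        # Sketch: generate one input file per assembly from tabular records.
        # Field names and the emitted "input" format are illustrative placeholders,
        # not the ORIGAMI/SCALE input language.
        import csv
        from pathlib import Path

        def write_inputs(records_csv, out_dir):
            out = Path(out_dir)
            out.mkdir(exist_ok=True)
            with open(records_csv, newline="") as fh:
                for row in csv.DictReader(fh):
                    text = (
                        f"assembly {row['assembly_id']}\n"
                        f"  enrichment {row['enrichment_wt_pct']}\n"
                        f"  heavy_metal_mass {row['hm_mass_kg']}\n"
                        f"  discharge_date {row['discharge_date']}\n"
                    )
                    (out / f"{row['assembly_id']}.inp").write_text(text)

        # write_inputs("assemblies.csv", "origami_inputs")  # placeholder file names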

  16. CAPTCHA: Impact on User Experience of Users with Learning Disabilities

    ERIC Educational Resources Information Center

    Gafni, Ruti; Nagar, Idan

    2016-01-01

    CAPTCHA is one of the most common solutions to check if the user trying to enter a Website is a real person or an automated piece of software. This challenge-response test, implemented in many Internet Websites, emphasizes the gaps between accessibility and security on the Internet, as it poses an obstacle for the learning-impaired in the reading…

  17. CORE_TF: a user-friendly interface to identify evolutionary conserved transcription factor binding sites in sets of co-regulated genes

    PubMed Central

    Hestand, Matthew S; van Galen, Michiel; Villerius, Michel P; van Ommen, Gert-Jan B; den Dunnen, Johan T; 't Hoen, Peter AC

    2008-01-01

    Background The identification of transcription factor binding sites is difficult since they are only a small number of nucleotides in size, resulting in large numbers of false positives and false negatives in current approaches. Computational methods to reduce false positives are to look for over-representation of transcription factor binding sites in a set of similarly regulated promoters or to look for conservation in orthologous promoter alignments. Results We have developed a novel tool, "CORE_TF" (Conserved and Over-REpresented Transcription Factor binding sites) that identifies common transcription factor binding sites in promoters of co-regulated genes. To improve upon existing binding site predictions, the tool searches for position weight matrices from the TRANSFAC database that are over-represented in an experimental set compared to a random set of promoters and identifies cross-species conservation of the predicted transcription factor binding sites. The algorithm has been evaluated with expression and chromatin-immunoprecipitation on microarray data. We also implement and demonstrate the importance of matching the random set of promoters to the experimental promoters by GC content, which is a unique feature of our tool. Conclusion The program CORE_TF is accessible through a user-friendly web interface at . It provides a table of over-represented transcription factor binding sites in the promoters of the user's input genes and a graphical view of evolutionarily conserved transcription factor binding sites. In our test data sets it successfully predicts target transcription factors and their binding sites. PMID:19036135
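
    The over-representation step can be illustrated with a one-sided Fisher's exact test that compares how often a predicted binding site occurs in the experimental promoter set versus a random (ideally GC-matched) set; the counts below are made up for illustration and are not data from the paper.

        # Sketch: test whether a predicted binding site is over-represented in an
        # experimental promoter set relative to a random (ideally GC-matched) set.
        from scipy.stats import fisher_exact

        def over_representation(hits_exp, n_exp, hits_rand, n_rand):
            table = [[hits_exp, n_exp - hits_exp],
                     [hits_rand, n_rand - hits_rand]]
            _, p = fisher_exact(table, alternative="greater")
            return p

        # 40 of 100 experimental promoters carry the site vs. 15 of 100 random ones.
        print(f"one-sided p = {over_representation(40, 100, 15, 100):.2e}")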

  18. Biblio-MetReS: A bibliometric network reconstruction application and server

    PubMed Central

    2011-01-01

    Background Reconstruction of gene and/or protein networks from automated analysis of the literature is one of the current targets of text mining in biomedical research. Some user-friendly tools already perform this analysis on precompiled databases of abstracts of scientific papers. Other tools allow expert users to elaborate and analyze the full content of a corpus of scientific documents. However, to our knowledge, no user-friendly tool that simultaneously analyzes the latest set of scientific documents available online and reconstructs the set of genes referenced in those documents is available. Results This article presents such a tool, Biblio-MetReS, and compares its functioning and results to those of other user-friendly applications (iHOP, STRING) that are widely used. Under similar conditions, Biblio-MetReS creates networks that are comparable to those of other user-friendly tools. Furthermore, analysis of full text documents provides more complete reconstructions than those that result from using only the abstract of the document. Conclusions Literature-based automated network reconstruction is still far from providing complete reconstructions of molecular networks. However, its value as an auxiliary tool is high and it will increase as standards for reporting biological entities and relationships become more widely accepted and enforced. Biblio-MetReS is an application that can be downloaded from http://metres.udl.cat/. It provides an easy-to-use environment for researchers to reconstruct their networks of interest from an always up-to-date set of scientific documents. PMID:21975133

  19. Automation and Intensity Modulated Radiation Therapy for Individualized High-Quality Tangent Breast Treatment Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purdie, Thomas G., E-mail: Tom.Purdie@rmp.uhn.on.ca; Department of Radiation Oncology, University of Toronto, Toronto, Ontario; Techna Institute, University Health Network, Toronto, Ontario

    Purpose: To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Methods and Materials: Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Results: Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Conclusions: Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented

  20. Automation and intensity modulated radiation therapy for individualized high-quality tangent breast treatment plans.

    PubMed

    Purdie, Thomas G; Dinniwell, Robert E; Fyles, Anthony; Sharpe, Michael B

    2014-11-01

    To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using

  1. TIA Software User's Manual

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Syed, Hazari I.

    1995-01-01

    This user's manual describes the installation and operation of TIA, the Thermal-Imaging acquisition and processing Application, developed by the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center, Hampton, Virginia. TIA is a user-friendly graphical interface application for the Macintosh II and higher series computers. The software has been developed to interface with the Perceptics/Westinghouse Pixelpipe(TM) and PixelStore(TM) NuBus cards and the GW Instruments MacADIOS(TM) input-output (I/O) card for the Macintosh for imaging thermal data. The software is also capable of performing generic image-processing functions.

  2. Romantic Partners, Friends, Friends with Benefits, and Casual Acquaintances As Sexual Partners

    PubMed Central

    Furman, Wyndol; Shaffer, Laura

    2011-01-01

    The purpose of the present study was to provide a detailed examination of sexual behavior with different types of partners. A sample of 163 young adults reported on their light nongenital, heavy nongenital, and genital sexual activity with romantic partners, friends, and casual acquaintances. They described their sexual activity with “friends with benefits” as well as with friends in general. Young adults were most likely to engage in sexual behavior with romantic partners, but sexual behavior also often occurred with some type of nonromantic partner. More young adults engaged in some form of sexual behavior with casual acquaintances than with friends with benefits. The frequencies of sexual behavior, however, were greater with friends with benefits than with friends or casual acquaintances. Interview and questionnaire data revealed that friends with benefits were typically friends, but not necessarily. Nonsexual activities were also less common with friends with benefits than other friends. Taken together, the findings illustrate the value of differentiating among different types of nonromantic partners and different levels of sexual behavior. PMID:21128155

  3. A study into the automation of cognitive assessment tasks for delivery via the telephone: lessons for developing remote monitoring applications for the elderly.

    PubMed

    D'Arcy, Shona; Rapcan, Viliam; Gali, Alessandra; Burke, Nicola; O'Connell, Gloria Crispino; Robertson, Ian H; Reilly, Richard B

    2013-01-01

    Cognitive assessments are valuable tools in assessing neurological conditions. They are critical in measuring deficits in cognitive function in an array of neurological disorders and during the ageing process. Automation of cognitive assessments is one way to address the increasing burden on medical resources for an ever increasing ageing population. This study investigated the suitability of using automated Interactive Voice Response (IVR) technology to deliver a suite of cognitive assessments to older adults using speech as the input modality. Several clinically valid and gold-standard cognitive assessments were selected for implementation in the IVR application. The IVR application was designed using human centred design principles to ensure the experience was as user friendly as possible. Sixty one participants completed two IVR assessments and one face to face (FF) assessment with a neuropsychologist. Completion rates for individual tests were inspected to identify those tests that are most suitable for administration via IVR technology. Interclass correlations were calculated to assess the reliability of the automated administration of the cognitive assessments across delivery modes. While all participants successfully completed all automated assessments, variability in the completion rates for different cognitive tests was observed. Statistical analysis found significant interclass correlations for certain cognitive tests between the different modes of administration. Analysis also suggests that an initial FF assessment reduces the variability in cognitive test scores when introducing automation into such an assessment. [corrected] This study has demonstrated the functional and cognitive reliability of administering specific cognitive tests using an automated, speech driven application. This study has defined the characteristics of existing cognitive tests that are suitable for such an automated delivery system and also informs on the limitations of other

  4. Automating Visualization Service Generation with the WATT Compiler

    NASA Astrophysics Data System (ADS)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (Ruby, Perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the SOAP 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web

  5. Friends and family: A software program for identification of unrelated individuals from molecular marker data.

    PubMed

    de Jager, Deon; Swarts, Petrus; Harper, Cindy; Bloomer, Paulette

    2017-11-01

    The identification of related and unrelated individuals from molecular marker data is often difficult, particularly when no pedigree information is available and the data set is large. High levels of relatedness or inbreeding can influence genotype frequencies and thus genetic marker evaluation, as well as the accurate inference of hidden genetic structure. Identification of related and unrelated individuals is also important in breeding programmes, to inform decisions about breeding pairs and translocations. We present Friends and Family, a Windows executable program with a graphical user interface that identifies unrelated individuals from a pairwise relatedness matrix or table generated in programs such as coancestry and genalex. Friends and Family outputs a list of samples that are all unrelated to each other, based on a user-defined relatedness cut-off value. This unrelated data set can be used in downstream analyses, such as marker evaluation or inference of genetic structure. The results can be compared to those of the full data set to determine the effect related individuals have on the analyses. We demonstrate one of the applications of the program: how the removal of related individuals altered the Hardy-Weinberg equilibrium test outcome for microsatellite markers in an empirical data set. Friends and Family can be obtained from https://github.com/DeondeJager/Friends-and-Family. © 2017 John Wiley & Sons Ltd.
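
    The record does not spell out the selection algorithm inside Friends and Family; the following is only a minimal sketch of the general idea (greedily keep individuals whose pairwise relatedness to everyone already kept stays at or below the user-defined cut-off), written in Python with invented toy data.

```python
# Sketch (not the published tool): greedily pick individuals whose pairwise
# relatedness never exceeds a user-defined cut-off, mirroring the idea of
# extracting an "unrelated" subset from a relatedness matrix.
import numpy as np

def select_unrelated(relatedness, ids, cutoff=0.25):
    """Return a subset of ids whose pairwise relatedness is <= cutoff."""
    keep = []
    for i in range(len(ids)):
        # accept individual i only if it is below the cutoff to everyone kept so far
        if all(relatedness[i, j] <= cutoff for j in keep):
            keep.append(i)
    return [ids[k] for k in keep]

# Toy example: 4 individuals, symmetric relatedness matrix (values invented)
ids = ["A", "B", "C", "D"]
r = np.array([[1.0, 0.5, 0.1, 0.0],
              [0.5, 1.0, 0.2, 0.1],
              [0.1, 0.2, 1.0, 0.4],
              [0.0, 0.1, 0.4, 1.0]])
print(select_unrelated(r, ids, cutoff=0.25))  # ['A', 'C']
```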

  6. Tweeting the Friendly Skies: Investigating Information Exchange among Twitter Users about Airlines

    ERIC Educational Resources Information Center

    Sreenivasan, Nirupama Dharmavaram; Lee, Chei Sian; Goh, Dion Hoe-Lian

    2012-01-01

    Purpose: The purpose of this study is to investigate airline users' microblog postings pertaining to their travel-related information exchange so as to assess their wants, preferences and feedback about airline products and services. Examining such real-time information exchange is important as users rely on this for various purposes such as…

  7. Classification of Automated Search Traffic

    NASA Astrophysics Data System (ADS)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
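
    The paper's actual features and classifiers are not reproduced in this record; the sketch below only illustrates the general pattern of training a binary human-versus-bot classifier on per-session behavioural features, using invented synthetic data and scikit-learn.

```python
# Illustrative sketch only (features and data are invented): train a binary
# classifier to separate human from automated query sessions using simple
# behavioural features such as query rate and click-through rate.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 2000
# Hypothetical per-session features: [queries per minute, click rate, distinct-query ratio]
human = np.column_stack([rng.normal(0.5, 0.2, n), rng.normal(0.6, 0.1, n), rng.normal(0.7, 0.1, n)])
bots  = np.column_stack([rng.normal(5.0, 1.0, n), rng.normal(0.05, 0.05, n), rng.normal(0.2, 0.1, n)])
X = np.vstack([human, bots])
y = np.array([0] * n + [1] * n)  # 0 = human, 1 = automated

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["human", "automated"]))
```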

  8. Step-by-step guide to building an inexpensive 3D printed motorized positioning stage for automated high-content screening microscopy.

    PubMed

    Schneidereit, Dominik; Kraus, Larissa; Meier, Jochen C; Friedrich, Oliver; Gilbert, Daniel F

    2017-06-15

    High-content screening microscopy relies on automation infrastructure that is typically proprietary, non-customizable, costly and requires a high level of skill to use and maintain. The increasing availability of rapid prototyping technology makes it possible to quickly engineer alternatives to conventional automation infrastructure that are low-cost and user-friendly. Here, we describe a 3D-printed, inexpensive, open-source and scalable motorized positioning stage for automated high-content screening microscopy and provide detailed step-by-step instructions for rebuilding the device, including a comprehensive parts list, 3D design files in STEP (Standard for the Exchange of Product model data) and STL (Standard Tessellation Language) format, electronic circuits and wiring diagrams as well as software code. System assembly including 3D printing requires approx. 30 h. The fully assembled device is light-weight (1.1 kg), small (33×20×8 cm) and extremely low-cost (approx. EUR 250). We describe positioning characteristics of the stage, including spatial resolution, accuracy and repeatability, compare imaging data generated with our device to data obtained using a commercially available microplate reader, demonstrate its suitability to high-content microscopy in 96-well high-throughput screening format and validate its applicability to automated functional Cl⁻- and Ca²⁺-imaging with recombinant HEK293 cells as a model system. A time-lapse video of the stage during operation and as part of a custom assembled screening robot can be found at https://vimeo.com/158813199. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Remote access and automation of SPring-8 MX beamlines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ueno, Go, E-mail: ueno@spring8.or.jp; Hikima, Takaaki; Yamashita, Keitaro

    At SPring-8 MX beamlines, a remote access system has been developed and has been in user operation since 2010. The system has been developed based on the automated data collection and data management architecture used for the established SPring-8 mail-in data collection scheme. Currently, further improvements to the remote access and automation, covering data processing and analysis, are being developed.

  10. Low Cost Desktop Image Analysis Workstation With Enhanced Interactive User Interface

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Huang, H. K.

    1989-05-01

    A multimodality picture archiving and communication system (PACS) is in routine clinical use in the UCLA Radiology Department. Several types of workstations are currently implemented for this PACS. Among them, the Apple Macintosh II personal computer was recently chosen to serve as a desktop workstation for display and analysis of radiological images. This personal computer was selected mainly because of its extremely friendly user-interface, its popularity among the academic and medical community and its low cost. In comparison to other microcomputer-based systems, the Macintosh II offers the following advantages: the extreme standardization of its user interface, file system and networking, and the availability of a very large variety of commercial software packages. In the current configuration, the Macintosh II operates as a stand-alone workstation where images are imported from a centralized PACS server through an Ethernet network using a standard TCP-IP protocol, and stored locally on magnetic disk. The use of high-resolution screens (1024x768 pixels x 8 bits) offers sufficient performance for image display and analysis. We focused our project on the design and implementation of a variety of image analysis algorithms ranging from automated structure and edge detection to sophisticated dynamic analysis of sequential images. Specific analysis programs were developed for ultrasound images, digitized angiograms, MRI and CT tomographic images and scintigraphic images.

  11. MOO in Your Face: Researching, Designing, and Programming a User-Friendly Interface.

    ERIC Educational Resources Information Center

    Haas, Mark; Gardner, Clinton

    1999-01-01

    Suggests the learning curve of a multi-user, object-oriented domain (MOO) blockades effective use. Discusses use of an IBM/PC-compatible interface that allows developers to modify the interface to provide a sense of presence for the user. Concludes that work in programming a variety of interfaces has led to a more intuitive environment for…

  12. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    intervals, or a confusion matrix. Conclusions: IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304

  13. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.

  14. GIS-based automated management of highway surface crack inspection system

    NASA Astrophysics Data System (ADS)

    Chung, Hung-Chi; Shinozuka, Masanobu; Soeller, Tony; Girardello, Roberto

    2004-07-01

    An automated in-situ road surface distress surveying and management system, AMPIS, has been developed on the basis of video images within the framework of GIS software. Video image processing techniques are introduced to acquire, process and analyze the road surface images obtained from a moving vehicle. ArcGIS platform is used to integrate the routines of image processing and spatial analysis in handling the full-scale metropolitan highway surface distress detection and data fusion/management. This makes it possible to present user-friendly interfaces in GIS and to provide efficient visualizations of surveyed results not only for the use of transportation engineers to manage road surveying documentations, data acquisition, analysis and management, but also for financial officials to plan maintenance and repair programs and further evaluate the socio-economic impacts of highway degradation and deterioration. A review performed in this study on fundamental principle of Pavement Management System (PMS) and its implementation indicates that the proposed approach of using GIS concept and its tools for PMS application will reshape PMS into a new information technology-based system that can provide convenient and efficient pavement inspection and management.

  15. Brainstorm: A User-Friendly Application for MEG/EEG Analysis

    PubMed Central

    Tadel, François; Baillet, Sylvain; Mosher, John C.; Pantazis, Dimitrios; Leahy, Richard M.

    2011-01-01

    Brainstorm is a collaborative open-source application dedicated to magnetoencephalography (MEG) and electroencephalography (EEG) data visualization and processing, with an emphasis on cortical source estimation techniques and their integration with anatomical magnetic resonance imaging (MRI) data. The primary objective of the software is to connect MEG/EEG neuroscience investigators with both the best-established and cutting-edge methods through a simple and intuitive graphical user interface (GUI). PMID:21584256

  16. USER'S GUIDE FOR GLOED VERSION 1.0 - THE GLOBAL EMISSIONS DATABASE

    EPA Science Inventory

    The document is a user's guide for the EPA-developed, powerful software package, Global Emissions Database (GloED). GloED is a user-friendly, menu-driven tool for storing and retrieving emissions factors and activity data on a country-specific basis. Data can be selected from dat...

  17. Intelligence Preparation of the Battlefield (Automated) IPB(A).

    DTIC Science & Technology

    1980-02-14

    courses of action as the basis for friendly operations planning. The commander’s intelligence system must handle timely combat information...through A8 represent the series of snapshots developed for Option A. Snapshots would be developed for each option, and stored in an automated file for ...process throughout a battle situation demands local ability to modify the IPB products stored in the system, and to develop new ones as enemy courses

  18. The Development of an Automated Device for Asthma Monitoring for Adolescents: Methodologic Approach and User Acceptability

    PubMed Central

    Miner, Sarah; Sterling, Mark; Halterman, Jill S; Fairbanks, Eileen

    2014-01-01

    Background: Many adolescents suffer serious asthma-related morbidity that can be prevented by adequate self-management of the disease. Accurate symptom monitoring by patients is the most fundamental antecedent to effective asthma management. Nonetheless, the adequacy and effectiveness of current methods of symptom self-monitoring have been challenged due to the individuals’ fallible symptom perception, poor adherence, and inadequate technique. Recognition of these limitations led to the development of an innovative device that can facilitate continuous and accurate monitoring of asthma symptoms with minimal disruption of daily routines, thus increasing acceptability to adolescents. Objective: The objectives of this study were to: (1) describe the development of a novel symptom monitoring device for teenagers (teens), and (2) assess their perspectives on the usability and acceptability of the device. Methods: Adolescents (13-17 years old) with and without asthma participated in the evolution of an automated device for asthma monitoring (ADAM), which comprised three phases, including development (Phase 1, n=37), validation/user acceptability (Phase 2, n=84), and post hoc validation (Phase 3, n=10). In Phase 1, symptom algorithms were identified based on the acoustic analysis of raw symptom sounds and programmed into a popular mobile system, the iPod. Phase 2 involved a 7-day trial of ADAM in vivo, and the evaluation of user acceptance using an acceptance survey and individual interviews. ADAM was further modified and enhanced in Phase 3. Results: Through ADAM, incoming audio data were digitized and processed in two steps involving the extraction of a sequence of descriptive feature vectors, and the processing of these sequences by a hidden Markov model-based Viterbi decoder to differentiate symptom sounds from background noise. The number and times of detected symptoms were stored and displayed in the device. The sensitivity (true positive) of the updated cough
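
    The ADAM decoder itself is not published in this record; as a rough illustration of the hidden Markov model/Viterbi step it describes, here is a self-contained two-state toy decoder over already-quantised audio frames (all probabilities and symbols invented).

```python
# Minimal sketch (not the published ADAM algorithm): a two-state HMM with a
# Viterbi decoder that labels each audio frame as "noise" or "cough", assuming
# the feature vectors have already been quantised into discrete symbols 0..2.
import numpy as np

states = ["noise", "cough"]
log_start = np.log([0.9, 0.1])                      # P(first frame state)
log_trans = np.log([[0.95, 0.05],                   # noise -> noise/cough
                    [0.30, 0.70]])                  # cough -> noise/cough
log_emit  = np.log([[0.7, 0.2, 0.1],                # P(symbol | noise)
                    [0.1, 0.3, 0.6]])               # P(symbol | cough)

def viterbi(obs):
    """Return the most likely state sequence for a list of symbol indices."""
    T, S = len(obs), len(states)
    delta = np.empty((T, S))            # best log-probability ending in each state
    back = np.zeros((T, S), dtype=int)
    delta[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans   # scores[i, j]: from state i to j
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emit[:, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi([0, 0, 2, 2, 1, 0]))   # e.g. ['noise', 'noise', 'cough', 'cough', ...]
```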

  19. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 3: HARP Graphics Oriented (GO) input user's guide

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Mittal, Nitin; Koppen, Sandra Howell

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical preprocessor Graphics Oriented (GO) program. GO is a graphical user interface for the HARP engine that enables the drawing of reliability/availability models on a monitor. A mouse is used to select fault tree gates or Markov graphical symbols from a menu for drawing.

  20. User Vulnerability and its Reduction on a Social Networking Site

    DTIC Science & Technology

    2014-01-01

    social networking sites bring about new...and explore other users’ profiles and friend networks. Social networking sites have reshaped business models [Vaynerchuk 2009], provided platform... social networking sites is to enable users to be more social, user privacy and security issues cannot be ignored. On one hand, most social networking sites

  1. The more friends, the less political talk? Predictors of Facebook discussions among college students.

    PubMed

    Jang, S Mo; Lee, Hoon; Park, Yong Jin

    2014-05-01

    Although previous research has indicated that Facebook users, especially young adults, can cultivate their civic values by talking about public matters with their Facebook friends, little research has examined the predictors of political discussion on Facebook. Using survey data from 442 college students in the United States, this study finds that individual characteristics and network size influence college students' expressive behavior on Facebook related to two controversial topics: gay rights issues and politics. In line with previous studies about offline political discussion, the results show that conflict avoidance and ambivalence about target issues are negatively associated with Facebook discussions. Perhaps the most interesting finding is that users who have a large number of Facebook friends are less likely to talk about politics and gay rights issues on Facebook despite having access to increasing human and information resources. Theoretical implications of these findings and future directions are addressed.

  2. Multi-Mission Automated Task Invocation Subsystem

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.

    2009-01-01

    Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned from experience with prior instrument-data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
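
    MATIS's rule syntax is project-specific and not shown in the record; the sketch below only illustrates the general shape of an event-driven workflow manager that runs registered actions when an incoming event matches a user-defined rule (all names and the event format are hypothetical).

```python
# Generic sketch of an event-driven rule engine (the actual MATIS rule syntax
# is project-specific and not reproduced here): each rule pairs an event
# predicate with an action to run when the predicate matches.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # does this event satisfy the rule?
    action: Callable[[dict], None]      # program/step to execute

@dataclass
class WorkflowManager:
    rules: list = field(default_factory=list)

    def register(self, rule):
        self.rules.append(rule)

    def dispatch(self, event):
        for rule in self.rules:
            if rule.condition(event):
                rule.action(event)

# Hypothetical usage: start a calibration pipeline when a new raw file arrives.
manager = WorkflowManager()
manager.register(Rule(
    name="calibrate-on-new-raw-file",
    condition=lambda e: e.get("type") == "new_file" and e.get("level") == "raw",
    action=lambda e: print(f"launching calibration pipeline for {e['path']}"),
))
manager.dispatch({"type": "new_file", "level": "raw", "path": "/data/obs001.raw"})
```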

  3. Adolescent Boys' Intentions of Seeking Help from Male Friends and Female Friends

    ERIC Educational Resources Information Center

    Sears, Heather A.; Graham, Joanna; Campbell, Anna

    2009-01-01

    This study examined adolescent boys' intentions of seeking help from male friends and female friends. We evaluated mean differences in boys' help-seeking intentions; assessed whether boys' individual characteristics predicted their intentions; and examined perceived support from male friends and female friends as mediators of these relationships.…

  4. Computing and Office Automation: Changing Variables.

    ERIC Educational Resources Information Center

    Staman, E. Michael

    1981-01-01

    Trends in computing and office automation and their applications, including planning, institutional research, and general administrative support in higher education, are discussed. Changing aspects of information processing and an increasingly larger user community are considered. The computing literacy cycle may involve programming, analysis, use…

  5. "Friending Facebook?" A Minicourse on the Use of Social Media by Health Professionals

    ERIC Educational Resources Information Center

    George, Daniel R.

    2011-01-01

    Introduction: Health professionals are working in an era of social technologies that empower users to generate content in real time. This article describes a 3-part continuing education minicourse called "Friending Facebook?" undertaken at Penn State Hershey Medical Center that aimed to model the functionality of current technologies in…

  6. MARC and the Library Service Center: Automation at Bargain Rates.

    ERIC Educational Resources Information Center

    Pearson, Karl M.

    Despite recent research and development in the field of library automation, libraries have been unable to reap the benefits promised by technology due to the high cost of building and maintaining their own computer-based systems. Time-sharing and disc mass storage devices will bring automation costs, if spread over a number of users, within the…

  7. User's manual for the National Water Information System of the U.S. Geological Survey: Automated Data Processing System (ADAPS)

    USGS Publications Warehouse

    2003-01-01

    The Automated Data Processing System (ADAPS) was developed for the processing, storage, and retrieval of water data, and is part of the National Water Information System (NWIS) developed by the U.S. Geological Survey. NWIS is a distributed water database in which data can be processed over a network of computers at U.S. Geological Survey offices throughout the United States. NWIS comprises four subsystems: ADAPS, the Ground-Water Site Inventory System (GWSI), the Water-Quality System (QWDATA), and the Site-Specific Water-Use Data System (SWUDS). This section of the NWIS User's Manual describes the automated data processing of continuously recorded water data, which primarily are surface-water data; however, the system also allows for the processing of water-quality and ground-water data. This manual describes various components and features of the ADAPS, and provides an overview of the data processing system and a description of the system framework. The components and features included are: (1) data collection and processing, (2) ADAPS menus and programs, (3) command line functions, (4) steps for processing station records, (5) postprocessor programs control files, (6) the standard format for transferring and entering unit and daily values, and (7) relational database (RDB) formats.

  8. In-camera automation of photographic composition rules.

    PubMed

    Banerjee, Serene; Evans, Brian L

    2007-07-01

    At the time of image acquisition, professional photographers apply many rules of thumb to improve the composition of their photographs. This paper develops a joint optical-digital processing framework for automating composition rules during image acquisition for photographs with one main subject. Within the framework, we automate three photographic composition rules: repositioning the main subject, making the main subject more prominent, and making objects that merge with the main subject less prominent. The idea is to provide the user with alternate pictures, obtained by applying photographic composition rules, in addition to the original picture taken by the user. The proposed algorithms do not depend on prior knowledge of the indoor/outdoor setting or scene content. The proposed algorithms are also designed to be amenable to software implementation on fixed-point programmable digital signal processors available in digital still cameras.
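
    As a rough geometric illustration of the first rule (repositioning the main subject), and not the authors' optical-digital algorithm, the following sketch picks an alternate crop whose nearest rule-of-thirds intersection coincides with the subject's centre; all dimensions are invented.

```python
# Hedged illustration of subject repositioning: given the subject's bounding
# box, choose a crop of fixed size whose rule-of-thirds intersection lands as
# close as possible to the subject centre (generic sketch, toy numbers).
def rule_of_thirds_crop(img_w, img_h, box, crop_w, crop_h):
    """box = (x0, y0, x1, y1) of the main subject; returns crop (x0, y0, x1, y1)."""
    cx, cy = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
    best = None
    # Try placing the subject centre on each of the four thirds intersections.
    for fx in (1 / 3, 2 / 3):
        for fy in (1 / 3, 2 / 3):
            x0 = min(max(cx - fx * crop_w, 0), img_w - crop_w)
            y0 = min(max(cy - fy * crop_h, 0), img_h - crop_h)
            # prefer the candidate whose thirds point ends up closest to the subject
            err = abs(x0 + fx * crop_w - cx) + abs(y0 + fy * crop_h - cy)
            if best is None or err < best[0]:
                best = (err, (int(x0), int(y0), int(x0 + crop_w), int(y0 + crop_h)))
    return best[1]

# 4000x3000 picture with a roughly centred subject; propose a 3200x2400 alternate crop.
print(rule_of_thirds_crop(4000, 3000, (1900, 1400, 2100, 1700), 3200, 2400))
```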

  9. GMOseek: a user friendly tool for optimized GMO testing.

    PubMed

    Morisset, Dany; Novak, Petra Kralj; Zupanič, Darko; Gruden, Kristina; Lavrač, Nada; Žel, Jana

    2014-08-01

    With the increasing pace of new Genetically Modified Organisms (GMOs) authorized or in pipeline for commercialization worldwide, the task of the laboratories in charge to test the compliance of food, feed or seed samples with their relevant regulations became difficult and costly. Many of them have already adopted the so called "matrix approach" to rationalize the resources and efforts used to increase their efficiency within a limited budget. Most of the time, the "matrix approach" is implemented using limited information and some proprietary (if any) computational tool to efficiently use the available data. The developed GMOseek software is designed to support decision making in all the phases of routine GMO laboratory testing, including the interpretation of wet-lab results. The tool makes use of a tabulated matrix of GM events and their genetic elements, of the laboratory analysis history and the available information about the sample at hand. The tool uses an optimization approach to suggest the most suited screening assays for the given sample. The practical GMOseek user interface allows the user to customize the search for a cost-efficient combination of screening assays to be employed on a given sample. It further guides the user to select appropriate analyses to determine the presence of individual GM events in the analyzed sample, and it helps taking a final decision regarding the GMO composition in the sample. GMOseek can also be used to evaluate new, previously unused GMO screening targets and to estimate the profitability of developing new GMO screening methods. The presented freely available software tool offers the GMO testing laboratories the possibility to select combinations of assays (e.g. quantitative real-time PCR tests) needed for their task, by allowing the expert to express his/her preferences in terms of multiplexing and cost. The utility of GMOseek is exemplified by analyzing selected food, feed and seed samples from a national reference
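
    The exact optimisation used by GMOseek is not given in this record; the following toy sketch only conveys the flavour of assay selection over an event/element matrix, using a greedy cover that weighs coverage against cost (the matrix entries, costs and element-to-event associations are invented for illustration).

```python
# Hedged sketch (not GMOseek's actual optimisation): greedily choose screening
# assays so that every GM event in a toy event/element matrix is covered by at
# least one selected assay, preferring cheaper assays that cover more events.
toy_matrix = {            # assay -> set of GM events its target element would detect (toy data)
    "P-35S":  {"MON810", "GT73", "Bt11"},
    "T-NOS":  {"MON810", "Bt11", "MS8"},
    "CryIAb": {"MON810", "Bt11"},
    "EPSPS":  {"GT73", "MS8"},
}
assay_cost = {"P-35S": 1.0, "T-NOS": 1.0, "CryIAb": 1.5, "EPSPS": 1.2}

def select_assays(matrix, cost, events):
    uncovered, chosen = set(events), []
    while uncovered:
        # pick the assay with the best (events newly covered) / cost ratio
        best = max(matrix, key=lambda a: len(matrix[a] & uncovered) / cost[a])
        if not matrix[best] & uncovered:
            break  # remaining events cannot be covered by any assay
        chosen.append(best)
        uncovered -= matrix[best]
    return chosen, uncovered

chosen, missed = select_assays(toy_matrix, assay_cost, {"MON810", "GT73", "Bt11", "MS8"})
print(chosen, missed)   # ['P-35S', 'T-NOS'] set()
```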

  10. Open-Source Assisted Laboratory Automation through Graphical User Interfaces and 3D Printers: Application to Equipment Hyphenation for Higher-Order Data Generation.

    PubMed

    Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C

    2017-10-17

    Higher-order data generation implies some automation challenges, which are mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation are the key to obtaining higher-order data, the required simultaneous control of them demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source code program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are considered as nodes in a network, and every node is associated both with physical and virtual inputs and outputs. Virtual components, such as graphical user interfaces (GUIs) of equipment, are handled by means of image recognition tools provided by Sikuli scripting language, while handling of their physical counterparts is achieved using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments of our research group, related to fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results, in terms of analytical performance, were obtained. Similarly, advantages derived from open-source tools assistance could be appreciated, mainly in terms of lesser intervention of operators and cost savings.

  11. Water-based alkyl ketene dimer ink for user-friendly patterning in paper microfluidics.

    PubMed

    Hamidon, Nurul Nadiah; Hong, Yumiao; Salentijn, Gert Ij; Verpoorte, Elisabeth

    2018-02-13

    We propose the use of water-based alkyl ketene dimer (AKD) ink for fast and user-friendly patterning of paper microfluidic devices either manually or using an inexpensive XY-plotter. The ink was produced by dissolving hydrophobic AKD in chloroform and emulsifying the solution in water. The emulsification was performed in a warm water bath, which led to an increased rate of the evaporation of chloroform. Subsequent cooling led to the final product, an aqueous suspension of fine AKD particles. The effects of surfactant and AKD concentrations, emulsification procedure, and cooling approach on final ink properties are presented, along with an optimized protocol for its formulation. This hydrophobic agent was applied onto paper using a plotter pen, after which the paper was heated to allow spreading of AKD molecules and chemical bonding with cellulose. A paper surface patterned with the ink (10 g L⁻¹ AKD) yielded a contact angle of 135.6° for water. Unlike organic solvent-based solutions of AKD, this AKD ink does not require a fume hood for its use. Moreover, it is compatible with plastic patterning tools, due to the effective removal of chloroform in the production process to less than 2% of the total volume. Furthermore, this water-based ink is easy to prepare and use. Finally, the AKD ink can also be used for the fabrication of so-called selectively permeable barriers for use in paper microfluidic networks. These are barriers that stop the flow of water through paper, but are permeable to solvents with lower surface energies. We applied the AKD ink to confine and preconcentrate sample on paper, and demonstrated the use of this approach to achieve higher detection sensitivities in paper spray ionization-mass spectrometry (PSI-MS). Our patterning approach can be employed outside of the analytical lab or machine workshop for fast prototyping and small-scale production of paper-based analytical tools, for use in limited-resource labs or in the field. Copyright

  12. Pilot-Vehicle Interface

    DTIC Science & Technology

    1993-11-01

    way is to develop a crude but working model of an entire system. The other is by developing a realistic model of the user interface, leaving out most...devices or by incorporating software for a more user-friendly interface. Automation introduces the possibility of making data entry errors. Multimode...across various human-computer interfaces. Memory: Minimize the amount of information that the user must maintain in short-term memory

  13. Characterization of available automated external defibrillators in the market based on the product manuals in 2014

    PubMed Central

    Ho, Chik Leung; Cheng, Ka Wai; Ma, Tze Hang; Wong, Yau Hang; Cheng, Ka Lok; Kam, Chak Wah

    2016-01-01

    BACKGROUND: To popularize the widespread use of automated external defibrillators (AEDs) to save lives in sudden cardiac arrest, we compared the strengths and weaknesses of different types of AEDs to enable a sound selection based on regional requirements. METHODS: This was a retrospective descriptive study. Different types of AEDs were compared according to the information from manuals and brochures provided by the manufacturers. Fifteen types of AEDs were divided into 3 groups: basic, intermediate and advanced. RESULTS: Lifeline™ AUTO AED had the best performance in price, portability and user-friendliness among AEDs of the basic level. It required less time for shock charging. Samaritan PAD defibrillator was superior in price, portability, durability and characteristics among AEDs of the intermediate level. It had the longest warranty and highest protection against water and dust. Lifeline™ PRO AED had the best performance in most of the criteria among AEDs of the advanced level and offered a CPR video and a manual mode for laypersons and clinicians, respectively. CONCLUSION: Lifeline™ AUTO AED, Samaritan PAD defibrillator, and Lifeline™ PRO AED are superior among AEDs of the basic, intermediate and advanced levels, respectively. A feasible AED may be chosen by users according to the regional requirement and the current information about the best available products. PMID:27313810

  14. Two Types of Well Followed Users in the Followership Networks of Twitter

    PubMed Central

    Saito, Kodai; Masuda, Naoki

    2014-01-01

    In the Twitter blogosphere, the number of followers is probably the most basic and succinct quantity for measuring the popularity of users. However, the number of followers can be manipulated in various ways; we can even buy follows. Therefore, alternative popularity measures for Twitter users on the basis of, for example, users' tweets and retweets, have been developed. In the present work, we take a purely network approach to this fundamental question. First, we find that two relatively distinct types of users possessing a large number of followers exist, in particular for Japanese, Russian, and Korean users among the seven language groups that we examined. A first type of user follows a small number of other users. A second type of user follows approximately the same number of other users as the number of follows that the user receives. Then, we compare local (i.e., egocentric) followership networks around the two types of users with many followers. We show that the second type, which presumably consists of uninfluential users despite their large number of followers, is characterized by high link reciprocity, a large number of friends (i.e., those whom a user follows) for the followers, followers' high link reciprocity, a large clustering coefficient, a large fraction of the second type of users among the followers, and a small PageRank. Our network-based results support the view that the number of followers alone is a misleading measure of a user's popularity. We propose that the number of friends, which is simple to measure, also helps us to assess the popularity of Twitter users. PMID:24416209
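
    The metrics named above (reciprocity, clustering coefficient, PageRank, friend and follower counts) can be illustrated on a toy egocentric follower graph; the sketch below uses networkx and invented accounts, not the study's Twitter data.

```python
# Sketch of the kind of egocentric network metrics discussed above, computed
# with networkx on a toy directed follower graph (edge u -> v means "u follows v").
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("alice", "hub"), ("bob", "hub"), ("carol", "hub"),   # followers of "hub"
    ("hub", "alice"), ("hub", "bob"),                      # hub follows back two of them
    ("alice", "bob"), ("bob", "alice"),                    # reciprocal tie among followers
])

followers = list(G.predecessors("hub"))
friends = list(G.successors("hub"))                        # accounts the user follows
print("followers:", len(followers), "friends:", len(friends))
print("link reciprocity:", nx.reciprocity(G))
print("clustering of hub (undirected):", nx.clustering(G.to_undirected(), "hub"))
print("PageRank of hub:", nx.pagerank(G)["hub"])
```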

  15. Three Applications of Automated Test Assembly within a User-Friendly Modeling Environment

    ERIC Educational Resources Information Center

    Cor, Ken; Alves, Cecilia; Gierl, Mark

    2009-01-01

    While linear programming is a common tool in business and industry, there have not been many applications in educational assessment and only a handful of individuals have been actively involved in conducting psychometric research in this area. Perhaps this is due, at least in part, to the complexity of existing software packages. This article…
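
    As a minimal illustration of the linear-programming idea behind automated test assembly (not the modelling environment discussed in the article), the following sketch selects items that maximise information subject to a test-length and content constraint, using the PuLP library and invented item data.

```python
# Hedged sketch of basic automated test assembly: a 0/1 linear program that
# picks items to maximise total item information subject to a test length and
# simple content constraints (toy data, solved with PuLP's default solver).
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, LpBinary

items = {                      # item -> (information at the target ability, content area)
    "i1": (0.9, "algebra"), "i2": (0.7, "algebra"), "i3": (0.8, "geometry"),
    "i4": (0.4, "geometry"), "i5": (0.6, "algebra"), "i6": (0.5, "geometry"),
}
x = {i: LpVariable(f"use_{i}", cat=LpBinary) for i in items}

prob = LpProblem("test_assembly", LpMaximize)
prob += lpSum(items[i][0] * x[i] for i in items)                       # maximise information
prob += lpSum(x[i] for i in items) == 4                                # test length of 4 items
prob += lpSum(x[i] for i in items if items[i][1] == "algebra") >= 2    # at least 2 algebra items
prob += lpSum(x[i] for i in items if items[i][1] == "geometry") >= 2   # at least 2 geometry items
prob.solve()
print([i for i in items if x[i].value() == 1])
```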

  16. Lab-on-a-Chip Proteomic Assays for Psychiatric Disorders.

    PubMed

    Peter, Harald; Wienke, Julia; Guest, Paul C; Bistolas, Nikitas; Bier, Frank F

    2017-01-01

    Lab-on-a-chip assays allow rapid identification of multiple parameters on an automated user-friendly platform. Here we describe a fully automated multiplex immunoassay and readout in less than 15 min using the Fraunhofer in vitro diagnostics (ivD) platform to enable inexpensive point-of-care profiling of sera or a single drop of blood from patients with various diseases such as psychiatric disorders.

  17. AEDs at your fingertips: automated external defibrillators on college campuses and a novel approach for increasing accessibility.

    PubMed

    Berger, Ryan J; O'Shea, Jesse G

    2014-01-01

    The use of automated external defibrillators (AEDs) increases survival in cardiac arrest events. Due to the success of previous efforts and free, readily available mobile mapping software, the discussion is to emphasize the importance of the use of AEDs to prevent sudden cardiac arrest-related deaths on college campuses and abroad, while suggesting a novel approach to aiding in access and awareness issues. A user-friendly mobile application (a low-cost iOS map) was developed at Florida State University to decrease AED retrieval distance and time. The development of mobile AED maps is feasible for a variety of universities and other entities, with the potential to save lives. Just having AEDs installed is not enough--they need to be easily locatable. Society increasingly relies on phones to provide information, and there are opportunities to use mobile technology to locate and share information about relevant emergency devices; these should be incorporated into the chain of survival.

  18. The Design of an Interactive Data Retrieval System for Casual Users.

    ERIC Educational Resources Information Center

    Radhakrishnan, T.; And Others

    1982-01-01

    Describes an interactive data retrieval system which was designed and implemented for casual users and which incorporates a user-friendly interface, aids to train beginners in use of the system, versatility in output, and error recovery protocols. A 14-item reference list and two figures illustrating system operation and output are included. (JL)

  19. Convergent spray process for environmentally friendly coatings

    NASA Technical Reports Server (NTRS)

    Scarpa, Jack

    1995-01-01

    Conventional spray application processes have poor transfer efficiencies, resulting in an exorbitant loss in materials, solvents, and time. Also, with ever tightening Environmental Protection Agency (EPA) regulations and Occupational Safety and Health Administration requirements, the low transfer efficiencies have a significant impact on the quantities of materials and solvents that are released into the environment. High solids spray processes are also limited by material viscosities, thus requiring many passes over the surface to achieve a thickness in the 0.125 -inch range. This results in high application costs and a negative impact on the environment. Until recently, requirements for a 100 percent solid sprayable, environmentally friendly, lightweight thermal protection system that can be applied in a thick (greater than 0.125 inch) single-pass operation exceeded the capability of existing systems. Such coatings must be applied by hand lay-up techniques, especially for thermal and/or fire protection systems. The current formulation of these coatings has presented many problems such as worker safety, environmental hazards, waste, high cost, and application constraints. A system which can apply coatings without using hazardous materials would alleviate many of these problems. Potential applications include the aerospace thermal protective specialty coatings, chemical and petroleum industries that require fire-protection coatings that resist impact, chemicals, and weather. These markets can be penetrated by offering customized coatings applied by automated processes that are environmentally friendly.

  20. Automated data collection based on RoboDiff at the ESRF beamline MASSIF-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nurizzo, Didier, E-mail: Didier.nurizzo@esrf.fr; Guichard, Nicolas; McSweeney, Sean

    2016-07-27

    The European Synchrotron Radiation Facility has a long standing history in the automation of experiments in Macromolecular Crystallography. MASSIF-1 (Massively Automated Sample Screening and evaluation Integrated Facility), a beamline constructed as part of the ESRF Upgrade Phase I program, has been open to the external user community since July 2014 and offers a unique completely automated data collection service to both academic and industrial structural biologists.

  1. The automated multi-stage substructuring system for NASTRAN

    NASA Technical Reports Server (NTRS)

    Field, E. I.; Herting, D. N.; Herendeen, D. L.; Hoesly, R. L.

    1975-01-01

    The substructuring capability developed for eventual installation in Level 16 is now operational in a test version of NASTRAN. Its features are summarized. These include the user-oriented, Case Control type control language, the automated multi-stage matrix processing, the independent direct access data storage facilities, and the static and normal modes solution capabilities. A complete problem analysis sequence is presented with card-by-card description of the user input.

  2. OPTIGRAMI V2 user's guide

    Treesearch

    Penny S. Lawson; R. Edward Thomas; Elizabeth S Walker

    1996-01-01

    OPTIGRAMI V2 is a computer program available for IBM personal computers with 80286 and higher processors. OPTIGRAMI V2 determines the least-cost lumber grade mix required to produce a given cutting order for clear parts from rough lumber of known grades in a crosscut-first rough mill operation. It is a user-friendly integrated application that includes optimization...

  3. Influence of Learning Styles on Graphical User Interface Preferences for e-Learners

    ERIC Educational Resources Information Center

    Dedic, Velimir; Markovic, Suzana

    2012-01-01

    Implementing Web-based educational environment requires not only developing appropriate architectures, but also incorporating human factors considerations. User interface becomes the major channel to convey information in e-learning context: a well-designed and friendly enough interface is thus the key element in helping users to get the best…

  4. End User Evaluations

    NASA Astrophysics Data System (ADS)

    Jay, Caroline; Lunn, Darren; Michailidou, Eleni

    As new technologies emerge, and Web sites become increasingly sophisticated, ensuring they remain accessible to disabled and small-screen users is a major challenge. While guidelines and automated evaluation tools are useful for informing some aspects of Web site design, numerous studies have demonstrated that they provide no guarantee that the site is genuinely accessible. The only reliable way to evaluate the accessibility of a site is to study the intended users interacting with it. This chapter outlines the processes that can be used throughout the design life cycle to ensure Web accessibility, describing their strengths and weaknesses, and discussing the practical and ethical considerations that they entail. The chapter also considers an important emerging trend in user evaluations: combining data from studies of “standard” Web use with data describing existing accessibility issues, to drive accessibility solutions forward.

  5. An interactive user-friendly approach to surface-fitting three-dimensional geometries

    NASA Technical Reports Server (NTRS)

    Cheatwood, F. Mcneil; Dejarnette, Fred R.

    1988-01-01

    A surface-fitting technique has been developed which addresses two problems with existing geometry packages: computer storage requirements and the time required of the user for the initial setup of the geometry model. Coordinates of cross sections are fit using segments of general conic sections. The next step is to blend the cross-sectional curve-fits in the longitudinal direction using general conics to fit specific meridional half-planes. Provisions are made to allow the fitting of fuselages and wings so that entire wing-body combinations may be modeled. This report includes the development of the technique along with a User's Guide for the various menus within the program. Results for the modeling of the Space Shuttle and a proposed Aeroassist Flight Experiment geometry are presented.
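
    As a generic illustration of the conic-fitting step (not the NASA program's actual code), the sketch below fits a general conic A x^2 + B xy + C y^2 + D x + E y + F = 0 to cross-section points by a least-squares/SVD solution, using an invented elliptical cross section.

```python
# Hedged sketch: fit a general conic to cross-section points by minimising the
# algebraic error ||D c|| subject to ||c|| = 1 (solved via the SVD of the
# design matrix). Toy data sampled from the ellipse x^2/9 + y^2/4 = 1.
import numpy as np

def fit_conic(x, y):
    """Return conic coefficients (A, B, C, D, E, F) for points (x, y)."""
    design = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    # The right singular vector for the smallest singular value minimises
    # ||design @ c|| under the unit-norm constraint on c.
    _, _, vt = np.linalg.svd(design)
    return vt[-1]

t = np.linspace(0, 2 * np.pi, 40)
x, y = 3 * np.cos(t), 2 * np.sin(t)
coeffs = fit_conic(x, y)
print(np.round(coeffs / coeffs[0], 3))   # expect roughly [1, 0, 2.25, 0, 0, -9]
```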

  6. Library Automation at the University for Development Studies: Challenges and Prospects

    ERIC Educational Resources Information Center

    Thompson, Edwin S.; Pwadura, Joana

    2014-01-01

    The automation of a library that basically aims at improving the management of the library's resources and increasing access to these same resources by users has caught on so well in the western world that virtually all academic libraries in that part of the world have automated most of their services. In Africa, however, several challenges are…

  7. Automated Coding Software: Development and Use to Enhance Anti-Fraud Activities*

    PubMed Central

    Garvin, Jennifer H.; Watzlaf, Valerie; Moeini, Sohrab

    2006-01-01

    This descriptive research project identified characteristics of automated coding systems that have the potential to detect improper coding and to minimize improper or fraudulent coding practices in the setting of automated coding used with the electronic health record (EHR). Recommendations were also developed for software developers and users of coding products to maximize anti-fraud practices. PMID:17238546

  8. Architecture Views Illustrating the Service Automation Aspect of SOA

    NASA Astrophysics Data System (ADS)

    Gu, Qing; Cuadrado, Félix; Lago, Patricia; Dueñas, Juan C.

    Earlier in this book, Chapter 8 provided a detailed analysis of service engineering, including a review of service engineering techniques and methodologies. This chapter is closely related to Chapter 8, as it shows how such approaches can be used to develop a service, with particular emphasis on the identification of three views (the automation decision view, the degree of service automation view and the service automation related data view) that structure and ease the elicitation and documentation of stakeholders' concerns. This is carried out through two large case studies that explore industrial needs in illustrating service deployment and configuration automation. This set of views adds to more traditional notations such as UML the visual power of drawing users' attention to the addressed concerns, and assists them in their work. This is especially crucial in service-oriented architecting, where service automation is in high demand.

  9. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is a powerful application suite for image processing of electron micrographs. Ordinarily, Eos is operated only through character user interfaces (CUI) under operating systems (OS) such as OS X or Linux, which is not user-friendly. Users of Eos therefore need to be experts in image processing of electron micrographs and also to have some knowledge of computer science; however, not everyone who needs Eos is comfortable with a CUI. We therefore extended Eos into an OS-independent web system with a graphical user interface (GUI) by integrating a web browser. The advantage of using a web browser is not only that Eos gains a GUI, but also that Eos can work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and the usage of each command differs from the others. Since the beginning of development, Eos has managed its user interfaces through an interface definition file, the "OptionControlFile", written in CSV (Comma-Separated Values) format; each command has an "OptionControlFile" that records the information needed to generate its interface and describe its usage. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also reads the "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic actions of the client-side system have been implemented and support auto-generation of web forms, with functions for execution, image preview, and file uploading to a web server. The system can thus execute Eos commands with the options specific to each command and carry out image analysis. Problems remain concerning the image file format for visualization and the workspace for analysis: the image file format information is useful to check whether the input/output file is correct and we also
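
    The real OptionControlFile format is richer than shown here; the sketch below only illustrates the auto-generation idea with a simplified, hypothetical CSV (option name, type, default, help text) turned into an HTML form, and the command name used is likewise hypothetical.

```python
# Simplified sketch of the idea behind auto-generated web forms: read a
# hypothetical interface-definition CSV (option name, type, default, help)
# and emit an HTML form for the command. The real Eos OptionControlFile
# format is richer than this and is not reproduced here.
import csv, io, html

option_control_file = """option,type,default,help
-i,file,,input micrograph
-o,file,,output image
-sigma,float,1.5,Gaussian smoothing width
"""

def form_from_csv(command, text):
    rows = list(csv.DictReader(io.StringIO(text)))
    fields = []
    for r in rows:
        input_type = "number" if r["type"] in ("int", "float") else "text"
        fields.append(
            f'<label>{html.escape(r["option"])} ({html.escape(r["help"])}): '
            f'<input name="{html.escape(r["option"])}" type="{input_type}" '
            f'value="{html.escape(r["default"])}"></label>'
        )
    body = "\n  ".join(fields)
    return (f'<form action="/run/{html.escape(command)}" method="post">\n'
            f'  {body}\n  <button>Execute</button>\n</form>')

print(form_from_csv("exampleEosCommand", option_control_file))
```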

  10. Automation Framework for Flight Dynamics Products Generation

    NASA Technical Reports Server (NTRS)

    Wiegand, Robert E.; Esposito, Timothy C.; Watson, John S.; Jun, Linda; Shoan, Wendy; Matusow, Carla

    2010-01-01

    XFDS provides an easily adaptable automation platform. To date, it has been used to support flight dynamics operations. It coordinates the execution of other applications such as Satellite Tool Kit, FreeFlyer, MATLAB, and Perl code. It provides a mechanism for passing messages among a collection of XFDS processes, and allows sending and receiving of GMSEC messages. A unified and consistent graphical user interface (GUI) is used for the various tools. Its automation configuration is stored in text files, and can be edited either directly or using the GUI.

  11. Constructing Pairing-Friendly Elliptic Curves under Embedding Degree 1 for Securing Critical Infrastructures.

    PubMed

    Wang, Maocai; Dai, Guangming; Choo, Kim-Kwang Raymond; Jayaraman, Prem Prakash; Ranjan, Rajiv

    2016-01-01

    Information confidentiality is an essential requirement for cyber security in critical infrastructure. Identity-based cryptography, an increasingly popular branch of cryptography, is widely used to protect information confidentiality in the critical infrastructure sector due to its ability to compute the user's public key directly from the user's identity. However, computational requirements complicate the practical application of identity-based cryptography. In order to improve the efficiency of identity-based cryptography, this paper presents an effective method to construct pairing-friendly elliptic curves with low Hamming weight 4 under embedding degree 1. Based on the analysis of the Complex Multiplication (CM) method, the soundness of our method for calculating the characteristic of the finite field is proved. Three related algorithms for constructing pairing-friendly elliptic curves are then put forward. Ten elliptic curves with low Hamming weight 4 under 160 bits are presented to demonstrate the utility of our approach. Finally, the evaluation also indicates that it is more efficient to compute the Tate pairing with our curves than with those of Bertoni et al.
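
    For a subgroup of prime order r over F_p, the embedding degree is the smallest k with r dividing p^k - 1, so embedding degree 1 simply means that r divides p - 1. A tiny check of that condition with toy numbers (not the paper's 160-bit curves) might look like this.

```python
# Small sketch of the defining condition above: the embedding degree of a
# prime-order-r subgroup over F_p is the smallest k with r | p**k - 1, so
# embedding degree 1 means r | p - 1. Toy parameters only.
def embedding_degree(p, r, k_max=50):
    """Smallest k <= k_max such that r divides p**k - 1 (0 if none found)."""
    acc = 1
    for k in range(1, k_max + 1):
        acc = (acc * p) % r          # acc = p^k mod r
        if acc == 1:
            return k
    return 0

# Toy example: r = 17 divides p - 1 = 102, so the embedding degree is 1.
print(embedding_degree(103, 17))     # 1
print(embedding_degree(103, 7))      # order of 103 modulo 7, i.e. 6
```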

  12. PR-PR: cross-platform laboratory automation system.

    PubMed

    Linshiz, Gregory; Stawski, Nina; Goyal, Garima; Bi, Changhao; Poust, Sean; Sharma, Monica; Mutalik, Vivek; Keasling, Jay D; Hillson, Nathan J

    2014-08-15

    To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.

  13. What's in Your Techno-Future? Vendors Share Their Views.

    ERIC Educational Resources Information Center

    Gerber, Carole

    1995-01-01

    Examines vendors' views on the future of CD-ROM technology. Topics include the library role, single point access, costs, tape backup, user-friendly library automation systems and databases, improved quality, the growth of Internet access, and perspectives on technology in schools. (AEF)

  14. Adolescent Alcohol Use: Social Comparison Orientation Moderates the Impact of Friend and Sibling Behavior

    PubMed Central

    Litt, Dana M.; Stock, Michelle L.; Gibbons, Frederick X.

    2014-01-01

    Objectives Research has indicated that both peers and siblings influence adolescents' alcohol use (e.g. Windle, 2000). The present two studies examined if social comparison orientation (SCO) moderates the effects of perceived friend and sibling alcohol use on adolescents' alcohol use cognitions and behaviors. Design & Methods Two studies examined the role of SCO as a moderator of social influence (perceived friend alcohol use in Study 1 and both perceived friend use and sibling-reported alcohol use in Study 2) on prototype perceptions and willingness to drink alcohol (Studies 1 & 2) as well as actual alcohol consumption (Study 2) among early adolescents. Results In Study 1, cross-sectional results indicated that SCO moderated the effect of perceived friend alcohol use on favorable images of drinkers and willingness to drink. Study 2 found that SCO moderated the effects of perceived friend use and sibling use on favorable images of alcohol users, willingness to use alcohol, and change in alcohol use over three years such that adolescents who reported engaging in social comparison more often reported greater willingness, more favorable images, and increases in alcohol use when perceived friend use or sibling use was high. Conclusions These studies highlight the importance of SCO as a moderator of susceptibility to the social influences of friends and siblings and may hold important implications for adolescent alcohol use prevention programs and models of health-risk behavior. PMID:25243814

  15. 'We are always in some form of contact': friendships among homeless drug and alcohol users living in hostels.

    PubMed

    Neale, Joanne; Brown, Caral

    2016-09-01

    Homeless drug and alcohol users are one of the most marginalised groups in society. They frequently have complex needs and limited social support. In this paper, we explore the role of friendship in the lives of homeless drug and alcohol users living in hostels, using the concepts of 'social capital' and 'recovery capital' to frame the analyses. The study was undertaken in three hostels, each in a different English city, during 2013-2014. Audio recorded semi-structured interviews were conducted with 30 residents (9 females; 21 males) who self-reported drink and/or drug problems; follow-up interviews were completed 4-6 weeks later with 22 participants (6 females; 16 males). Data were transcribed verbatim, coded using the software package MAXQDA, and analysed using Framework. Only 21 participants reported current friends at interview 1, and friendship networks were small and changeable. Despite this, participants desired friendships that were culturally normative. Eight categories of friend emerged from the data: family-like friends; using friends; homeless friends; childhood friends; online-only friends; drug treatment friends; work friends; and mutual interest friends. Routine and regular contact was highly valued, with family-like friends appearing to offer the most constant practical and emotional support. The use of information and communication technologies (ICTs) was central to many participants' friendships, keeping them connected to social support and recovery capital outside homelessness and substance-using worlds. We conclude that those working with homeless drug and alcohol users - and potentially other marginalised populations - could beneficially encourage their clients to identify and build upon their most positive and reliable relationships. Additionally, they might explore ways of promoting the use of ICTs to combat loneliness and isolation. Texting, emailing, online mutual aid meetings, chatrooms, Internet penpals, skyping and other social media

  16. Seed: a user-friendly tool for exploring and visualizing microbial community data.

    PubMed

    Beck, Daniel; Dennis, Christopher; Foster, James A

    2015-02-15

    In this article we present Simple Exploration of Ecological Data (Seed), a data exploration tool for microbial communities. Seed is written in R using the Shiny library. This provides access to powerful R-based functions and libraries through a simple user interface. Seed allows users to explore ecological datasets using principal coordinate analyses, scatter plots, bar plots, hierarchical clustering and heatmaps. Seed is open source and available at https://github.com/danlbek/Seed. danlbek@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
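    Seed itself is an R/Shiny application; as a rough illustration of the kind of ordination it automates, the sketch below computes a principal coordinate analysis (classical MDS on Bray-Curtis dissimilarities) for a small, made-up abundance table in Python. The data are placeholders, not Seed's code.

```python
# Hypothetical illustration of the kind of exploration Seed automates:
# principal coordinate analysis (classical MDS) of a small abundance table.
import numpy as np
from scipy.spatial.distance import pdist, squareform

# Toy abundance matrix: rows = samples, columns = taxa (made-up counts).
counts = np.array([
    [120,  30,   5,  0],
    [100,  45,  10,  2],
    [  5,  80,  60, 40],
    [  2,  70,  75, 55],
], dtype=float)

# Bray-Curtis dissimilarities between samples.
D = squareform(pdist(counts, metric="braycurtis"))

# Classical MDS / PCoA: double-center the squared distance matrix and
# take the leading eigenvectors as ordination axes.
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
eigvals, eigvecs = np.linalg.eigh(B)
order = np.argsort(eigvals)[::-1]
coords = eigvecs[:, order[:2]] * np.sqrt(np.maximum(eigvals[order[:2]], 0))

for i, (x, y) in enumerate(coords):
    print(f"sample {i}: PCo1={x:+.3f}, PCo2={y:+.3f}")
```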

  17. Developing a Graphical User Interface to Automate the Estimation and Prediction of Risk Values for Flood Protective Structures using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Hasan, M.; Helal, A.; Gabr, M.

    2014-12-01

    In this project, we focus on providing a computer-automated platform for a better assessment of the potential failures and retrofit measures of flood-protecting earth structures, e.g., dams and levees. Such structures play an important role during extreme flooding events as well as during normal operating conditions. Furthermore, they are part of other civil infrastructures such as water storage and hydropower generation. Hence, there is a clear need for accurate evaluation of stability and functionality levels during their service lifetime so that the rehabilitation and maintenance costs are effectively guided. Among condition assessment approaches based on the factor of safety, the limit states (LS) approach utilizes numerical modeling to quantify the probability of potential failures. The parameters for LS numerical modeling include i) geometry and side slopes of the embankment, ii) loading conditions in terms of rate of rise and duration of high water levels in the reservoir, and iii) cycles of rising and falling water levels simulating the effect of consecutive storms throughout the service life of the structure. Sample data regarding the correlations of these parameters are available through previous research studies. We have unified these criteria and extended the risk assessment in terms of loss of life by implementing a graphical user interface that automates the input parameters, divides the data into training and testing sets, and feeds them into an Artificial Neural Network (ANN) tool through MATLAB programming. The ANN modeling allows us to predict risk values of flood protective structures based on user feedback quickly and easily. In the future, we expect to fine-tune the software by adding extensive data on variations of parameters.
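    The abstract describes splitting data into training and testing sets and feeding them into MATLAB's ANN tool. The sketch below is a minimal Python analogue of that step using scikit-learn; the feature columns (side slope, rate of water-level rise, high-water duration, storm cycles), the synthetic data, and the network size are hypothetical stand-ins, not the authors' dataset or model.

```python
# Minimal sketch (not the authors' MATLAB code): train/test split and a small
# neural network that maps embankment/loading parameters to a risk value.
# The feature columns and data here are hypothetical placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Columns: side slope, rate of water-level rise, high-water duration, storm cycles.
X = rng.uniform(low=[1.5, 0.1, 1.0, 1], high=[4.0, 2.0, 30.0, 20], size=(200, 4))
# Synthetic "risk" target for illustration only.
y = 0.3 * X[:, 1] + 0.02 * X[:, 2] + 0.01 * X[:, 3] - 0.05 * X[:, 0]
y += rng.normal(scale=0.05, size=len(y))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
model.fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))
```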

  18. ANGIOCARE: an automated system for fast three-dimensional coronary reconstruction by integrating angiographic and intracoronary ultrasound data.

    PubMed

    Bourantas, Christos V; Kalatzis, Fanis G; Papafaklis, Michail I; Fotiadis, Dimitrios I; Tweddel, Ann C; Kourtis, Iraklis C; Katsouras, Christos S; Michalis, Lampros K

    2008-08-01

    The development of an automated, user-friendly system (ANGIOCARE) for rapid three-dimensional (3D) coronary reconstruction, integrating angiographic and intracoronary ultrasound (ICUS) data. Biplane angiographic and ICUS sequence images are imported into the system where a prevalidated method is used for coronary reconstruction. This incorporates extraction of the catheter path from two end-diastolic X-ray images and detection of regions of interest (lumen, outer vessel wall) in the ICUS sequence by an automated border detection algorithm. The detected borders are placed perpendicular to the catheter path and established algorithms used to estimate their absolute orientation. The resulting 3D object is imported into an advanced visualization module with which the operator can interact, examine plaque distribution (depicted as a color coded map) and assess plaque burden by virtual endoscopy. Data from 19 patients (27 vessels) undergoing biplane angiography and ICUS were examined. The reconstructed vessels were 21.3-80.2 mm long. The mean difference was 0.9 +/- 2.9% between the plaque volumes measured using linear 3D ICUS analysis and the volumes estimated by taking into account the curvature of the vessel. The time required to reconstruct a luminal narrowing of 25 mm was approximately 10 min. The ANGIOCARE system provides rapid coronary reconstruction allowing the operator to accurately estimate the length of the lesion and determine plaque distribution and volume. (c) 2008 Wiley-Liss, Inc.
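    As a purely geometric illustration of the reconstruction step described above (placing detected 2-D ICUS borders in planes perpendicular to the extracted catheter path), the sketch below builds local frames along a toy 3-D path and sweeps a circular border along it. It omits the absolute orientation estimation and border detection, and uses synthetic data rather than the ANGIOCARE algorithms.

```python
# Geometric sketch of placing 2-D ICUS border contours perpendicular to a
# 3-D catheter path (toy data, not the ANGIOCARE algorithms).
import numpy as np

def frames_along_path(path):
    """Return two unit normals spanning the plane perpendicular to the path at each point."""
    tangents = np.gradient(path, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    frames = []
    for t in tangents:
        ref = np.array([0.0, 0.0, 1.0]) if abs(t[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
        n1 = np.cross(t, ref); n1 /= np.linalg.norm(n1)
        n2 = np.cross(t, n1)
        frames.append((n1, n2))
    return frames

# Toy helical catheter path and a circular lumen border of radius 2 mm.
s = np.linspace(0, 4 * np.pi, 40)
path = np.column_stack([5 * np.cos(s), 5 * np.sin(s), 2 * s])
theta = np.linspace(0, 2 * np.pi, 36, endpoint=False)
border_2d = np.column_stack([2 * np.cos(theta), 2 * np.sin(theta)])

surface = []
for point, (n1, n2) in zip(path, frames_along_path(path)):
    ring = point + border_2d[:, :1] * n1 + border_2d[:, 1:] * n2
    surface.append(ring)
surface = np.array(surface)          # shape: (n_frames, n_border_points, 3)
print(surface.shape)
```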

  19. Competence, Problem Behavior, and the Effects of Having No Friends, Aggressive Friends, or Nonaggressive Friends: A Four-Year Longitudinal Study

    ERIC Educational Resources Information Center

    Palmen, Hanneke; Vermande, Marjolijn M.; Dekovic, Maja; van Aken, Marcel A. G.

    2011-01-01

    This study examined the longitudinal relations between competence (academic achievement and social preference) and problem behavior (loneliness and aggression) in 741 elementary school boys and girls in the Netherlands (Grades 1-5). Also, we examined the moderation effects of having no friends, aggressive friends, or nonaggressive friends on the…

  20. Visualization for Hyper-Heuristics. Front-End Graphical User Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroenung, Lauren

    Modern society is faced with ever more complex problems, many of which can be formulated as generate-and-test optimization problems. General-purpose optimization algorithms are not well suited for real-world scenarios where many instances of the same problem class need to be repeatedly and efficiently solved because they are not targeted to a particular scenario. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario. While such automated design has great advantages, it can often be difficult to understand exactly how a design was derived and why it should be trusted. This project aims to address these issues of usability by creating an easy-to-use graphical user interface (GUI) for hyper-heuristics to support practitioners, as well as scientific visualization of the produced automated designs. My contributions to this project are exhibited in the user-facing portion of the developed system and the detailed scientific visualizations created from back-end data.

  1. An Intelligent Automation Platform for Rapid Bioprocess Design.

    PubMed

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
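    A toy sketch of the hand-off the platform automates — an analysis step summarizing measurements, a design step choosing the next condition, and a script-generation step producing instructions for the liquid handler — is shown below. The function names, grid-search logic and pretend script format are hypothetical, not the authors' multiagent architecture.

```python
# Toy sketch of the agent hand-off idea: analyse results, design the next
# experiment, and emit a robot script, with no human in the loop.
# Names and logic are hypothetical, not the authors' implementation.

def analysis_agent(measurements):
    """Summarise raw plate-reader measurements into a single response value."""
    return sum(measurements) / len(measurements)

def design_agent(history):
    """Pick the next precipitant concentration from a simple grid search."""
    tried = {c for c, _ in history}
    candidates = [c / 10 for c in range(1, 11)]        # 0.1 .. 1.0
    untried = [c for c in candidates if c not in tried]
    return untried[0] if untried else max(history, key=lambda h: h[1])[0]

def script_agent(concentration):
    """Generate a (pretend) liquid-handler script for the chosen condition."""
    return f"ASPIRATE stock; DISPENSE {concentration:.1f} M into well; MIX; READ"

history = []
for round_no in range(3):
    conc = design_agent(history)
    print(script_agent(conc))
    fake_measurements = [conc * 0.8, conc * 0.9, conc * 1.0]   # stand-in data
    history.append((conc, analysis_agent(fake_measurements)))
```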

  2. An automated microfluidic DNA microarray platform for genetic variant detection in inherited arrhythmic diseases.

    PubMed

    Huang, Shu-Hong; Chang, Yu-Shin; Juang, Jyh-Ming Jimmy; Chang, Kai-Wei; Tsai, Mong-Hsun; Lu, Tzu-Pin; Lai, Liang-Chuan; Chuang, Eric Y; Huang, Nien-Tsu

    2018-03-12

    In this study, we developed an automated microfluidic DNA microarray (AMDM) platform for point mutation detection of genetic variants in inherited arrhythmic diseases. The platform allows for automated and programmable reagent sequencing under precise conditions of hybridization flow and temperature control. It is composed of a commercial microfluidic control system, a microfluidic microarray device, and a temperature control unit. The automated and rapid hybridization process can be performed in the AMDM platform using Cy3 labeled oligonucleotide exons of SCN5A genetic DNA, which produces proteins associated with sodium channels abundant in the heart (cardiac) muscle cells. We then introduce a graphene oxide (GO)-assisted DNA microarray hybridization protocol to enable point mutation detection. In this protocol, a GO solution is added after the staining step to quench dyes bound to single-stranded DNA or non-perfectly matched DNA, which can improve point mutation specificity. As proof-of-concept we extracted the wild-type and mutant of exon 12 and exon 17 of SCN5A genetic DNA from patients with long QT syndrome or Brugada syndrome by touchdown PCR and performed a successful point mutation discrimination in the AMDM platform. Overall, the AMDM platform can greatly reduce laborious and time-consuming hybridization steps and prevent potential contamination. Furthermore, by introducing the reciprocating flow into the microchannel during the hybridization process, the total assay time can be reduced to 3 hours, which is 6 times faster than the conventional DNA microarray. Given the automatic assay operation, shorter assay time, and high point mutation discrimination, we believe that the AMDM platform has potential for low-cost, rapid and sensitive genetic testing in a simple and user-friendly manner, which may benefit gene screening in medical practice.

  3. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 4: HARP Output (HARPO) graphics display user's guide

    NASA Technical Reports Server (NTRS)

    Sproles, Darrell W.; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.
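    As a stand-in for the kind of trade-off plotting HARPO provides, the sketch below reads two ASCII result files and overlays their curves. The two-column time/unreliability format assumed here is an illustration only; it is not HARP's actual output format.

```python
# Minimal stand-in for HARPO-style trade-off plotting: overlay unreliability
# curves from two ASCII result files. The two-column whitespace format used
# here is an assumption, not HARP's actual output format.
import numpy as np
import matplotlib.pyplot as plt

def load_curve(path):
    data = np.loadtxt(path)          # columns assumed: time, unreliability
    return data[:, 0], data[:, 1]

for label, path in [("model A", "model_a.dat"), ("model B", "model_b.dat")]:
    t, q = load_curve(path)
    plt.plot(t, q, label=label)

plt.xlabel("mission time (h)")
plt.ylabel("unreliability")
plt.yscale("log")
plt.legend()
plt.savefig("tradeoff.png", dpi=150)
```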

  4. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
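    A minimal sketch of the download-and-extract step, assuming the Allen SDK's CellTypesCache interface (pip install allensdk), is shown below; exact call signatures may differ between allensdk versions, and this is not the authors' NeuroManager integration.

```python
# Sketch of pulling one cell's patch-clamp recording via the Allen SDK.
# Assumes the CellTypesCache interface; exact call signatures may differ
# between allensdk versions.
from allensdk.core.cell_types_cache import CellTypesCache

ctc = CellTypesCache(manifest_file="cell_types/manifest.json")

cells = ctc.get_cells()                      # metadata for all cells
specimen_id = cells[0]["id"]                 # take the first cell as an example

data_set = ctc.get_ephys_data(specimen_id)   # downloads the NWB file if needed
sweep_numbers = data_set.get_sweep_numbers()
sweep = data_set.get_sweep(sweep_numbers[0]) # dict with 'stimulus', 'response', ...

print(f"specimen {specimen_id}: {len(sweep_numbers)} sweeps, "
      f"first sweep has {len(sweep['response'])} samples")
```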

  5. Using artificial intelligence to automate remittance processing.

    PubMed

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  6. Semi-Automated Neuron Boundary Detection and Nonbranching Process Segmentation in Electron Microscopy Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jurrus, Elizabeth R.; Watanabe, Shigeki; Giuly, Richard J.

    2013-01-01

    Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of this data makes human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images and visualizing them in three dimensions. It combines both automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes.
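    The sketch below illustrates the general idea of the 3D-linking step — matching labeled 2-D regions in adjacent sections by pixel overlap — on toy label images. It is an illustration of the concept only, not the published pipeline or its correlation measure.

```python
# Toy sketch of linking 2-D cell regions across adjacent EM sections by
# overlap, the idea behind connecting nonbranching processes in 3-D.
# (Illustrative only; not the published pipeline.)
import numpy as np

def link_regions(labels_a, labels_b, min_overlap=0.5):
    """Match each labelled region in section A to the region in section B
    with the largest relative pixel overlap."""
    links = {}
    for region in np.unique(labels_a):
        if region == 0:                      # 0 = background / membrane
            continue
        mask = labels_a == region
        overlapped = labels_b[mask]
        overlapped = overlapped[overlapped != 0]
        if overlapped.size == 0:
            continue
        candidate, count = np.unique(overlapped, return_counts=True)
        best = candidate[np.argmax(count)]
        if count.max() / mask.sum() >= min_overlap:
            links[int(region)] = int(best)
    return links

section1 = np.array([[1, 1, 0, 2, 2],
                     [1, 1, 0, 2, 2]])
section2 = np.array([[3, 3, 0, 0, 4],
                     [3, 3, 0, 4, 4]])
print(link_regions(section1, section2))     # {1: 3, 2: 4}
```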

  7. Semi-automated Neuron Boundary Detection and Nonbranching Process Segmentation in Electron Microscopy Images

    PubMed Central

    Jurrus, Elizabeth; Watanabe, Shigeki; Giuly, Richard J.; Paiva, Antonio R. C.; Ellisman, Mark H.; Jorgensen, Erik M.; Tasdizen, Tolga

    2013-01-01

    Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of this data makes human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images and visualizing them in three dimensions. It combines both automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes. PMID:22644867

  8. Young friendship in HFASD and typical development: friend versus non-friend comparisons.

    PubMed

    Bauminger-Zviely, Nirit; Agam-Ben-Artzi, Galit

    2014-07-01

    This study conducted comparative assessment of friendship in preschoolers with high-functioning autism spectrum disorder (HFASD, n = 29) versus preschoolers with typical development (n = 30), focusing on interactions with friends versus acquaintances. Groups were matched on SES, verbal/nonverbal MA, IQ, and CA. Multidimensional assessments included: mothers' and teachers' reports about friends' and friendship characteristics and observed individual and dyadic behaviors throughout interactions with friends versus non-friends during construction, drawing, and free-play situations. Findings revealed group differences in peer interaction favoring the typical development group, thus supporting the neuropsychological profile of HFASD. However, both groups' interactions with friends surpassed interactions with acquaintances on several key socio-communicative and intersubjective capabilities, thus suggesting that friendship may contribute to enhancement and practice of social interaction in HFASD.

  9. An automated dose tracking system for adaptive radiation therapy.

    PubMed

    Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J

    2018-02-01

    The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Patient image data were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h and the manual work was limited to 13 min per case with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient
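    A conceptual sketch of the dose mapping and accumulation idea — warping each daily dose grid into the reference frame with a deformation field and summing — is given below in Python. The in-house system is written in C++/C# with clinical DIR; the toy 2-D grids and interpolation here are stand-ins for illustration.

```python
# Conceptual sketch of dose mapping and accumulation (not the in-house
# C++/C# modules): warp each daily 2-D dose grid into the reference frame
# using a deformation field, then sum the warped grids.
import numpy as np
from scipy.ndimage import map_coordinates

def accumulate(daily_doses, deformations):
    """daily_doses: list of 2-D arrays; deformations: list of (dy, dx) fields
    mapping reference-frame voxels to the daily-image frame."""
    ref_shape = daily_doses[0].shape
    yy, xx = np.meshgrid(np.arange(ref_shape[0]), np.arange(ref_shape[1]),
                         indexing="ij")
    total = np.zeros(ref_shape)
    for dose, (dy, dx) in zip(daily_doses, deformations):
        coords = np.array([yy + dy, xx + dx])        # sample positions
        total += map_coordinates(dose, coords, order=1, mode="nearest")
    return total

# Two identical toy fractions, one mapped through a one-voxel deformation.
dose = np.zeros((5, 5)); dose[2, 2] = 1.0
identity = (np.zeros((5, 5)), np.zeros((5, 5)))
shift = (np.zeros((5, 5)), np.ones((5, 5)))          # sample one voxel to the right
print(accumulate([dose, dose], [identity, shift]))
```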

  10. Interchange Safety Analysis Tool (ISAT) : user manual

    DOT National Transportation Integrated Search

    2007-06-01

    This User Manual describes the usage and operation of the spreadsheet-based Interchange Safety Analysis Tool (ISAT). ISAT provides design and safety engineers with an automated tool for assessing the safety effects of geometric design and traffic con...

  11. Industrial Wireless Sensors: A User's Perspective on the Impact of Standards on Wide-spread Deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taft, Cyrus W.; Manges, Wayne W; Sorge, John N

    2012-01-01

    The role of wireless sensing technologies in industrial instrumentation will undoubtedly become more important in the years ahead. Deployment of such instrumentation in an industrial setting with its heightened security and robustness criteria hinges on user acceptance of verified performance as well as meeting cost requirements. Today, industrial users face many choices when specifying a wireless sensor network, including radio performance, battery life, interoperability, security, and standards compliance. The potential market for industrial wireless sensors is literally millions of wireless instruments and it is imperative that accurate information for applying the technology to real-world applications be available to the end-user so that they can make informed deployment decisions. The majority of industrial wireless automation designs now being deployed or being considered for deployment are based on three different standards: the HART Communications Foundation's WirelessHART (IEC 62591), the International Society of Automation's ISA100.11a, and the offering from the Industrial Wireless Alliance of China known as WIA-PA (IEC 62601). Aside from these industrial automation standards, users must also be cognizant of the underlying wireless network standards IEEE 802.11, IEEE 802.15.4, and IEEE 802.15.3a and their interactions with the three principal industrial automation protocols mentioned previously. The crucial questions being asked by end users revolve around sensor network performance, interoperability, reliability, and security. This paper will discuss potential wireless sensor applications in power plants, barriers to the acceptance of wireless technology, concerns related to standards, and provide an end-user perspective on the issues affecting wide-spread deployment of wireless sensors. Finally, the authors conclude with a discussion of a recommended path forward including how standards organizations can better facilitate end user decision

  12. TARDIS: An Automation Framework for JPL Mission Design and Navigation

    NASA Technical Reports Server (NTRS)

    Roundhill, Ian M.; Kelly, Richard M.

    2014-01-01

    Mission Design and Navigation at the Jet Propulsion Laboratory has implemented an automation framework tool to assist in orbit determination and maneuver design analysis. This paper describes the lessons learned from previous automation tools and how they have been implemented in this tool. In addition this tool has revealed challenges in software implementation, testing, and user education. This paper describes some of these challenges and invites others to share their experiences.

  13. Operational Based Vision Assessment Automated Vision Test Collection User Guide

    DTIC Science & Technology

    2017-05-15

    This user guide describes the Operational Based Vision Assessment (OBVA) automated vision test (AVT) collection software, provides information concerning the various menu options and operation of each test, and provides a brief description of each of the automated vision tests. The AVT research-grade tests are designed for repeatability to support correlation analysis and also support interservice, international, industry, and academic partnerships.

  14. Reliability of an Automated High-Resolution Manometry Analysis Program across Expert Users, Novice Users, and Speech-Language Pathologists

    ERIC Educational Resources Information Center

    Jones, Corinne A.; Hoffman, Matthew R.; Geng, Zhixian; Abdelhalim, Suzan M.; Jiang, Jack J.; McCulloch, Timothy M.

    2014-01-01

    Purpose: The purpose of this study was to investigate inter- and intrarater reliability among expert users, novice users, and speech-language pathologists with a semiautomated high-resolution manometry analysis program. We hypothesized that all users would have high intrarater reliability and high interrater reliability. Method: Three expert…

  15. Automation bias: a systematic review of frequency, effect mediators, and mitigators.

    PubMed

    Goddard, Kate; Roudsari, Abdul; Wyatt, Jeremy C

    2012-01-01

    Automation bias (AB)--the tendency to over-rely on automation--has been studied in various academic fields. Clinical decision support systems (CDSS) aim to benefit the clinical decision-making process. Although most research shows overall improved performance with use, there is often a failure to recognize the new errors that CDSS can introduce. With a focus on healthcare, a systematic review of the literature from a variety of research fields has been carried out, assessing the frequency and severity of AB, the effect mediators, and interventions potentially mitigating this effect. This is discussed alongside automation-induced complacency, or insufficient monitoring of automation output. A mix of subject-specific and free-text terms around the themes of automation, human-automation interaction, and task performance and error were used to search article databases. Of 13 821 retrieved papers, 74 met the inclusion criteria. User factors such as cognitive style, decision support systems (DSS), and task specific experience mediated AB, as did attitudinal driving factors such as trust and confidence. Environmental mediators included workload, task complexity, and time constraint, which placed pressure on cognitive resources. Mitigators of AB included implementation factors such as training and emphasizing user accountability, and DSS design factors such as the position of advice on the screen, updated confidence levels attached to DSS output, and the provision of information versus recommendation. By uncovering the mechanisms by which AB operates, this review aims to help optimize the clinical decision-making process for CDSS developers and healthcare practitioners.

  16. Identification of User Facility Related Publications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, Robert M; Stahl, Christopher G; Wells, Jack C

    2012-01-01

    Scientific user facilities provide physical resources and technical support that enable scientists to conduct experiments or simulations pertinent to their respective research. One metric for evaluating the scientific value or impact of a facility is the number of publications by users as a direct result of using that facility. Unfortunately, for a variety of reasons, capturing accurate values for this metric proves time consuming and error-prone. This work describes a new approach that leverages automated browser technology combined with text analytics to reduce the time and error involved in identifying publications related to user facilities. With this approach, scientific user facilities gain more accurate measures of their impact as well as insight into policy revisions for user access.

  17. Automated inference procedure for the determination of cell growth parameters

    NASA Astrophysics Data System (ADS)

    Harris, Edouard A.; Koh, Eun Jee; Moffat, Jason; McMillen, David R.

    2016-01-01

    The growth rate and carrying capacity of a cell population are key to the characterization of the population's viability and to the quantification of its responses to perturbations such as drug treatments. Accurate estimation of these parameters necessitates careful analysis. Here, we present a rigorous mathematical approach for the robust analysis of cell count data, in which all the experimental stages of the cell counting process are investigated in detail with the machinery of Bayesian probability theory. We advance a flexible theoretical framework that permits accurate estimates of the growth parameters of cell populations and of the logical correlations between them. Moreover, our approach naturally produces an objective metric of avoidable experimental error, which may be tracked over time in a laboratory to detect instrumentation failures or lapses in protocol. We apply our method to the analysis of cell count data in the context of a logistic growth model by means of a user-friendly computer program that automates this analysis, and present some samples of its output. Finally, we note that a traditional least squares fit can provide misleading estimates of parameter values, because it ignores available information with regard to the way in which the data have actually been collected.
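    To make the model concrete, the sketch below defines the logistic growth curve and evaluates a brute-force posterior over the growth rate r and carrying capacity K on synthetic counts, assuming a simple Gaussian (5%) counting noise. This is a simplified stand-in for illustration, not the paper's full Bayesian treatment of every stage of the counting process.

```python
# Sketch of the underlying idea: a logistic growth model and a brute-force
# posterior over growth rate r and carrying capacity K. Simplified stand-in
# (Gaussian noise on counts), not the paper's full Bayesian analysis.
import numpy as np

def logistic(t, n0, r, K):
    return K / (1 + (K / n0 - 1) * np.exp(-r * t))

# Synthetic count data from known parameters, for illustration.
t = np.linspace(0, 48, 13)                            # hours
true_n0, true_r, true_K = 5e4, 0.15, 1e6
counts = logistic(t, true_n0, true_r, true_K) * np.random.default_rng(1).normal(1, 0.05, t.size)

# Grid posterior (flat priors) over (r, K) with n0 fixed at the first count.
rs = np.linspace(0.05, 0.3, 120)
Ks = np.linspace(5e5, 2e6, 120)
sigma = 0.05 * counts                                  # assumed 5% counting noise
log_post = np.empty((rs.size, Ks.size))
for i, r in enumerate(rs):
    for j, K in enumerate(Ks):
        resid = (counts - logistic(t, counts[0], r, K)) / sigma
        log_post[i, j] = -0.5 * np.sum(resid ** 2)

post = np.exp(log_post - log_post.max())
post /= post.sum()
i_best, j_best = np.unravel_index(post.argmax(), post.shape)
print(f"MAP estimate: r = {rs[i_best]:.3f} /h, K = {Ks[j_best]:.2e} cells")
```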

  18. A User Study: Keyboard and Applications to Simplify Smartphone Adoption for Seniors.

    PubMed

    Austad, Hanne O; Liverud, Anders E; Chan, Richard; Røhne, Mette

    2017-01-01

    The aim of the user study was to evaluate how the developed assistive physical keyboard, the Ezi-PAD, and integrated senior friendly applications, can encourage non-smartphone seniors to start using the smartphone and enable senior smartphone users to continue using a smartphone in spite of increasing motoric or visual impairment. A number of seniors with different experience and impairment, aged 64 to 86, were equipped with a smartphone and an Ezi-PAD assembly. After basic training, their use of the smartphone was monitored for up to 2 months. Five out of nine participants used the system for 2 months, and found the Ezi-PAD easy to use. The senior friendly applications gave extra utilitarian value to the phone.

  19. Automated Software Development Workstation (ASDW)

    NASA Technical Reports Server (NTRS)

    Fridge, Ernie

    1990-01-01

    Software development is a serious bottleneck in the construction of complex automated systems. An increase of the reuse of software designs and components has been viewed as a way to relieve this bottleneck. One approach to achieving software reusability is through the development and use of software parts composition systems. A software parts composition system is a software development environment comprised of a parts description language for modeling parts and their interfaces, a catalog of existing parts, a composition editor that aids a user in the specification of a new application from existing parts, and a code generator that takes a specification and generates an implementation of a new application in a target language. The Automated Software Development Workstation (ASDW) is an expert system shell that provides the capabilities required to develop and manipulate these software parts composition systems. The ASDW is now in Beta testing at the Johnson Space Center. Future work centers on responding to user feedback for capability and usability enhancement, expanding the scope of the software lifecycle that is covered, and in providing solutions to handling very large libraries of reusable components.

  20. Numerical analysis of stiffened shells of revolution. Volume 3: Users' manual for STARS-2B, 2V, shell theory automated for rotational structures, 2 (buckling, vibrations), digital computer programs

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.

    1973-01-01

    The User's manual for the shell theory automated for rotational structures (STARS) 2B and 2V (buckling, vibrations) is presented. Several features of the program are: (1) arbitrary branching of the shell meridians, (2) arbitrary boundary conditions, (3) minimum input requirements to describe a complex, practical shell of revolution structure, and (4) accurate analysis capability using a minimum number of degrees of freedom.

  1. Data publication, documentation and user friendly landing pages - improving data discovery and reuse

    NASA Astrophysics Data System (ADS)

    Elger, Kirsten; Ulbricht, Damian; Bertelmann, Roland

    2016-04-01

    Research data are the basis for scientific research and often irreplaceable (e.g. observational data). Storage of such data in appropriate, theme specific or institutional repositories is an essential part of ensuring their long term preservation and access. The free and open access to research data for reuse and scrutiny has been identified as a key issue by the scientific community as well as by research agencies and the public. To ensure that datasets are intelligible and usable by others, they must be accompanied by comprehensive data descriptions and standardized metadata for data discovery, and ideally should be published with a digital object identifier (DOI). These make datasets citable and ensure their long-term accessibility and are accepted in reference lists of journal articles (http://www.copdess.org/statement-of-commitment/). The GFZ German Research Centre for Geosciences is the national laboratory for Geosciences in Germany and part of the Helmholtz Association, Germany's largest scientific organization. The development and maintenance of data systems is a key component of 'GFZ Data Services' to support state-of-the-art research. The datasets, archived in and published by the GFZ Data Repository, cover all geoscientific disciplines and range from large dynamic datasets deriving from global monitoring seismic or geodetic networks with real-time data acquisition, to remotely sensed satellite products, to automatically generated data publications from a database for data from micro meteorological stations, to various model results, to geochemical and rock mechanical analyses from various labs, and field observations. The user-friendly presentation of published datasets via a DOI landing page is as important for reuse as the storage itself, and the required information is highly specific for each scientific discipline. If dataset descriptions are too general, or require the download of a dataset before knowing its suitability, many researchers often decide

  2. LLCEDATA and LLCECALC for Windows version 1.0, Volume 1: User's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McFadden, J.G.

    LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of data bases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank it originates from. LLCECALC reads the EDF and a gamma assay (AV2) file that is produced by the Flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, which is intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, and Volume 3 is a software verification and validation document.

  3. Automating Software Design Metrics.

    DTIC Science & Technology

    1984-02-01

    INTRODUCTION 1 ", ... 0..1 1.2 HISTORICAL PERSPECTIVE High quality software is of interest to both the software engineering com- munity and its users. As...contributions of many other software engineering efforts, most notably [MCC 77] and [Boe 83b], which have defined and refined a framework for quantifying...AUTOMATION OF DESIGN METRICS Software metrics can be useful within the context of an integrated soft- ware engineering environment. The purpose of this

  4. Active learning of neuron morphology for accurate automated tracing of neurites

    PubMed Central

    Gala, Rohan; Chapeton, Julio; Jitesh, Jayant; Bhavsar, Chintan; Stepanyants, Armen

    2014-01-01

    Automating the process of neurite tracing from light microscopy stacks of images is essential for large-scale or high-throughput quantitative studies of neural circuits. While the general layout of labeled neurites can be captured by many automated tracing algorithms, it is often not possible to differentiate reliably between the processes belonging to different cells. The reason is that some neurites in the stack may appear broken due to imperfect labeling, while others may appear fused due to the limited resolution of optical microscopy. Trained neuroanatomists routinely resolve such topological ambiguities during manual tracing tasks by combining information about distances between branches, branch orientations, intensities, calibers, tortuosities, colors, as well as the presence of spines or boutons. Likewise, to evaluate different topological scenarios automatically, we developed a machine learning approach that combines many of the above-mentioned features. A specifically designed confidence measure was used to actively train the algorithm during the user-assisted tracing procedure. Active learning significantly reduces the training time and makes it possible to obtain generalization error rates of less than 1% from only a few training examples. To evaluate the overall performance of the algorithm a number of image stacks were reconstructed automatically, as well as manually by several trained users, making it possible to compare the automated traces to the baseline inter-user variability. Several geometrical and topological features of the traces were selected for the comparisons. These features include the total trace length, the total numbers of branch and terminal points, the affinity of corresponding traces, and the distances between corresponding branch and terminal points. Our results show that when the density of labeled neurites is sufficiently low, automated traces are not significantly different from manual reconstructions obtained by trained users. PMID
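    The sketch below shows the generic uncertainty-sampling loop behind this kind of active learning: fit a classifier on the currently labeled examples, query the most ambiguous unlabeled case for a user label, and repeat. The random-forest classifier, synthetic "branch pair" features, and thresholds are placeholders, not the authors' classifier or feature set.

```python
# Minimal uncertainty-sampling loop, the general idea behind actively training
# a classifier to resolve topological ambiguities. Features and labels are
# synthetic; this is not the authors' classifier or feature set.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Synthetic "branch pair" features: distance, orientation difference, caliber ratio.
X = rng.normal(size=(2000, 3))
y = (X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(scale=0.5, size=2000) > 0).astype(int)

# Seed the labelled set with five examples of each class ("user-provided" labels).
pos = np.where(y == 1)[0][:5]
neg = np.where(y == 0)[0][:5]
labeled = list(np.concatenate([pos, neg]))
pool = [i for i in range(len(X)) if i not in set(labeled)]
clf = RandomForestClassifier(n_estimators=100, random_state=0)

for round_no in range(20):
    clf.fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])[:, 1]
    uncertainty = 1 - np.abs(proba - 0.5) * 2      # highest near p = 0.5
    query = pool[int(np.argmax(uncertainty))]      # ask the user to label this case
    labeled.append(query)
    pool.remove(query)

print("labelled examples used:", len(labeled))
print("accuracy on remaining pool:", round(clf.score(X[pool], y[pool]), 3))
```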

  5. On the engineering design for systematic integration of agent-orientation in industrial automation.

    PubMed

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have a great potential for increasing autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematic integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  6. If this then that: an introduction to automated task services.

    PubMed

    Hoy, Matthew B

    2015-01-01

    This article explores automated task services, a type of website that allows users to create rules that are triggered by activity on one website and perform a task on another site. The most well-known automated task service is If This Then That (IFTTT), but recently a large number of these services have sprung up. These services can be used to connect websites, apps, business services, and even devices such as phones and home automation equipment. This allows for millions of possible combinations of rules, triggers, and actions. Librarians can put these services to use in many ways, from automating social media postings to remembering to bring their umbrella when rain is in the forecast. A list of popular automated task services is included, as well as a number of ideas for using these services in libraries.
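    The trigger/action pattern these services automate can be illustrated with a toy rule engine, sketched below; the rules and events are made up, and the code does not use IFTTT's actual API.

```python
# A toy "if this then that" rule engine illustrating the trigger/action
# pattern these services automate. It does not use IFTTT's actual API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    trigger: Callable[[dict], bool]   # inspects an incoming event
    action: Callable[[dict], None]    # runs when the trigger fires

rules = [
    Rule("rainy-day reminder",
         trigger=lambda e: e.get("type") == "forecast" and e.get("rain_mm", 0) > 1,
         action=lambda e: print("Reminder: bring your umbrella!")),
    Rule("cross-post new photo",
         trigger=lambda e: e.get("type") == "photo_uploaded",
         action=lambda e: print(f"Posting {e['url']} to the other site...")),
]

events = [
    {"type": "forecast", "rain_mm": 4.2},
    {"type": "photo_uploaded", "url": "http://example.org/p/123"},
]

for event in events:
    for rule in rules:
        if rule.trigger(event):
            rule.action(event)
```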

  7. Airport Information Retrieval System (AIRS) User's Guide

    DOT National Transportation Integrated Search

    1973-08-01

    The handbook is a user's guide for a prototype air traffic flow control automation system developed for the FAA's System Command Center. The system is implemented on a time-sharing computer and is designed to provide airport traffic load predictions ...

  8. Automation, Miniature Robotics and Sensors for Nondestructive Testing and Evaluation, Volume 4

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Y.; Baumgartner, E.; Backes, P.; Sherrit, S.; Bao, X.; Leary, S.; Kennedy, B.; Mavroidis, C.; Pfeiffer, C.; Culbert, C.; hide

    1999-01-01

    The development of NDE techniques has always been driven by the ongoing need for low-cost, rapid, user-friendly, reliable and efficient methods of detecting and characterizing flaws as well as determining material properties.

  9. High School and Beyond: Twins and Siblings' File Users' Manual, User's Manual for Teacher Comment File, Friends File Users' Manual.

    ERIC Educational Resources Information Center

    National Center for Education Statistics (ED), Washington, DC.

    These three users' manuals are for specific files of the High School and Beyond Study, a national longitudinal study of high school sophomores and seniors in 1980. The three files are computerized databases that are available on magnetic tape. As one component of base year data collection, information identifying twins, triplets, and some non-twin…

  10. Space station automation of common module power management and distribution, volume 2

    NASA Technical Reports Server (NTRS)

    Ashworth, B.; Riedesel, J.; Myers, C.; Jakstas, L.; Smith, D.

    1990-01-01

    The new Space Station Module Power Management and Distribution System (SSM/PMAD) testbed automation system is described. The subjects discussed include testbed 120 volt dc star bus configuration and operation, SSM/PMAD automation system architecture, fault recovery and management expert system (FRAMES) rules english representation, the SSM/PMAD user interface, and the SSM/PMAD future direction. Several appendices are presented and include the following: SSM/PMAD interface user manual version 1.0, SSM/PMAD lowest level processor (LLP) reference, SSM/PMAD technical reference version 1.0, SSM/PMAD LLP visual control logic representation's (VCLR's), SSM/PMAD LLP/FRAMES interface control document (ICD) , and SSM/PMAD LLP switchgear interface controller (SIC) ICD.

  11. PR-PR: Cross-Platform Laboratory Automation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linshiz, G; Stawski, N; Goyal, G

    To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.

  12. Implementing an Automated Antenna Measurement System

    NASA Technical Reports Server (NTRS)

    Valerio, Matthew D.; Romanofsky, Robert R.; VanKeuls, Fred W.

    2003-01-01

    We developed an automated measurement system using a PC running a LabView application, a Velmex BiSlide X-Y positioner, and an HP 8510C network analyzer. The system provides high positioning accuracy and requires no user supervision. After the user inputs the necessary parameters into the LabView application, LabView controls the motor positioning and performs the data acquisition. Current parameters and measured data are shown on the PC display in two 3-D graphs and updated after every data point is collected. The final output is a formatted data file for later processing.
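    A rough stand-in for the acquisition loop — stepping the X-Y positioner across a grid and recording one reading per point — is sketched below in Python. The real system is a LabView application driving the Velmex positioner and network analyzer; the instrument calls here are placeholders.

```python
# Stand-in for the acquisition loop: step an X-Y positioner over a grid and
# record one value per point. The real system is a LabView application; these
# instrument calls are placeholders, not actual driver commands.
import numpy as np

def move_to(x_mm, y_mm):
    pass                                  # placeholder for a positioner command

def read_s21_magnitude():
    return np.random.default_rng().normal(-30.0, 0.5)   # placeholder reading (dB)

x_points = np.arange(0, 101, 10)          # mm
y_points = np.arange(0, 101, 10)
field = np.zeros((len(y_points), len(x_points)))

for i, y in enumerate(y_points):
    for j, x in enumerate(x_points):
        move_to(x, y)
        field[i, j] = read_s21_magnitude()

np.savetxt("near_field_scan.csv", field, delimiter=",")
print("scan complete:", field.shape)
```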

  13. Automated search method for AFM and profilers

    NASA Astrophysics Data System (ADS)

    Ray, Michael; Martin, Yves C.

    2001-08-01

    New automation software creates a search model as an initial setup and searches for a user-defined target in atomic force microscopes or stylus profilometers used in semiconductor manufacturing. The need for such automation has become critical in manufacturing lines. The new method starts with a survey map of a small area of a chip obtained from a chip-design database or an image of the area. The user interface requires a user to point to and define a precise location to be measured, and to select a macro function for an application such as line width or contact hole. The search algorithm automatically constructs a range of possible scan sequences within the survey, and provides increased speed and functionality compared to the methods used in instruments to date. Each sequence consists of a starting point relative to the target, a scan direction, and a scan length. The search algorithm stops when the location of a target is found and the criteria for certainty in positioning are met. With today's capability in high speed processing and signal control, the tool can simultaneously scan and search for a target in a robotic and continuous manner. Examples are given that illustrate the key concepts.
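    The sketch below gives a schematic version of the search idea: generate candidate scan sequences around the expected target location, scan each in turn, and stop as soon as the match criterion is satisfied. The simulated step-edge "scan" and the matching rule are stand-ins, not the instrument's algorithm.

```python
# Schematic version of the search idea: scan candidate lines around the
# expected target location and stop as soon as a match criterion is met.
# The "scan" and matching here are stand-ins, not the instrument's algorithm.
import numpy as np

rng = np.random.default_rng(3)
TRUE_EDGE_X = 12.4        # pretend the feature edge really sits here (um)

def scan_line(start_x, length, n_points=200):
    """Fake 1-D height profile: a step edge plus noise."""
    x = np.linspace(start_x, start_x + length, n_points)
    z = np.where(x > TRUE_EDGE_X, 50.0, 0.0) + rng.normal(0, 1.0, n_points)
    return x, z

def find_edge(x, z, threshold=25.0):
    above = np.where(z > threshold)[0]
    return x[above[0]] if above.size else None

# Candidate scan sequences: different offsets and lengths around the target.
candidates = [(10.0, 3.0), (11.0, 3.0), (9.0, 6.0)]
for start, length in candidates:
    x, z = scan_line(start, length)
    edge = find_edge(x, z)
    if edge is not None and abs(edge - (start + length / 2)) < length / 2:
        print(f"target edge located at x = {edge:.2f} um; stopping search")
        break
```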

  14. An Efficient User Interface Design for Nursing Information System Based on Integrated Patient Order Information.

    PubMed

    Chu, Chia-Hui; Kuo, Ming-Chuan; Weng, Shu-Hui; Lee, Ting-Ting

    2016-01-01

    A user friendly interface can enhance the efficiency of data entry, which is crucial for building a complete database. In this study, two user interfaces (traditional pull-down menu vs. check boxes) are proposed and evaluated based on medical records with fever medication orders by measuring the time for data entry, the steps for each data entry record, and the completeness rate of each medical record. The results revealed that the time for data entry was reduced from 22.8 sec/record to 3.2 sec/record. The number of data entry steps was also reduced from 9 with the traditional interface to 3 with the new one. In addition, the completeness of medical records increased from 20.2% to 98%. All these results indicate that the new user interface provides a more user friendly and efficient approach for data entry than the traditional interface.

  15. TmoleX--a graphical user interface for TURBOMOLE.

    PubMed

    Steffen, Claudia; Thomas, Klaus; Huniar, Uwe; Hellweg, Arnim; Rubner, Oliver; Schroer, Alexander

    2010-12-01

    We herein present the graphical user interface (GUI) TmoleX for the quantum chemical program package TURBOMOLE. TmoleX allows users to execute the complete workflow of a quantum chemical investigation from the initial building of a structure to the visualization of the results in a user friendly graphical front end. The purpose of TmoleX is to make TURBOMOLE easy to use and to provide a high degree of flexibility. Hence, it should be a valuable tool for most users from beginners to experts. The program is developed in Java and runs on Linux, Windows, and Mac platforms. It can be used to run calculations on local desktops as well as on remote computers. © 2010 Wiley Periodicals, Inc.

  16. The development of an intelligent user interface for NASA's scientific databases

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Roelofs, Larry H.

    1986-01-01

    The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has as one of its components, the development of an Intelligent User Interface (IUI). The intent of the IUI effort is to develop a friendly and intelligent user interface service that is based on expert systems and natural language processing technologies. This paper presents the design concepts, development approach and evaluation of performance of a prototype Intelligent User Interface Subsystem (IUIS) supporting an operational database.

  17. Automated, Parametric Geometry Modeling and Grid Generation for Turbomachinery Applications

    NASA Technical Reports Server (NTRS)

    Harrand, Vincent J.; Uchitel, Vadim G.; Whitmire, John B.

    2000-01-01

    The objective of this Phase I project is to develop a highly automated software system for rapid geometry modeling and grid generation for turbomachinery applications. The proposed system features a graphical user interface for interactive control, a direct interface to commercial CAD/PDM systems, support for IGES geometry output, and a scripting capability for obtaining a high level of automation and end-user customization of the tool. The developed system is fully parametric and highly automated, and, therefore, significantly reduces the turnaround time for 3D geometry modeling, grid generation and model setup. This facilitates design environments in which a large number of cases need to be generated, such as for parametric analysis and design optimization of turbomachinery equipment. In Phase I we have successfully demonstrated the feasibility of the approach. The system has been tested on a wide variety of turbomachinery geometries, including several impellers and a multi stage rotor-stator combination. In Phase II, we plan to integrate the developed system with turbomachinery design software and with commercial CAD/PDM software.

  18. An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films

    NASA Astrophysics Data System (ADS)

    Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander

    Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn, and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility, and will be available to users via a web portal that facilitates highly parallelized analysis.

  19. Mendel,MD: A user-friendly open-source web tool for analyzing WES and WGS in the diagnosis of patients with Mendelian disorders

    PubMed Central

    D. Linhares, Natália; Pena, Sérgio D. J.

    2017-01-01

    Whole exome and whole genome sequencing have both become widely adopted methods for investigating and diagnosing human Mendelian disorders. As pangenomic agnostic tests, they are capable of more accurate and agile diagnosis compared to traditional sequencing methods. This article describes new software called Mendel,MD, which combines multiple types of filter options and makes use of regularly updated databases to facilitate exome and genome annotation, the filtering process and the selection of candidate genes and variants for experimental validation and possible diagnosis. This tool offers a user-friendly interface and leads clinicians through simple steps by limiting the number of candidates to achieve a final diagnosis of a medical genetics case. A useful innovation is the “1-click” method, which enables listing all the relevant variants in genes present in OMIM for perusal by clinicians. Mendel,MD was experimentally validated using clinical cases from the literature and was tested by students at the Universidade Federal de Minas Gerais, at GENE–Núcleo de Genética Médica in Brazil and at the Children’s University Hospital in Dublin, Ireland. We show in this article how it can simplify and increase the speed of identifying the culprit mutation in each of the clinical cases that were received for further investigation. Mendel,MD proved to be a reliable, open-source and time-efficient web-based tool for identifying the culprit mutation in different clinical cases of patients with Mendelian Disorders. It is also freely accessible for academic users at the following URL: https://mendelmd.org. PMID:28594829
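    A minimal sketch of the kind of filtering such a tool automates — keeping rare, protein-altering variants and intersecting them with an OMIM gene list — is shown below. The column names, thresholds, and toy variant table are hypothetical, not Mendel,MD's schema or code.

```python
# Minimal sketch of the kind of filtering Mendel,MD automates: keep rare,
# protein-altering variants and intersect with an OMIM gene list.
# Column names and thresholds are hypothetical, not the tool's own schema.
import pandas as pd

variants = pd.DataFrame({
    "gene":       ["SCN1A", "TTN", "MECP2", "BRCA1"],
    "impact":     ["missense", "synonymous", "frameshift", "missense"],
    "gnomad_af":  [0.00001, 0.20, 0.0, 0.0004],
})
omim_genes = {"SCN1A", "MECP2", "FBN1"}

candidates = variants[
    (variants["gnomad_af"] < 0.001) &
    (variants["impact"].isin(["missense", "frameshift", "nonsense"])) &
    (variants["gene"].isin(omim_genes))
]
print(candidates)
```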

  20. CEASAW: A User-Friendly Computer Environment Analysis for the Sawmill Owner

    Treesearch

    Guillermo Mendoza; William Sprouse; Philip A. Araman; William G. Luppold

    1991-01-01

    Improved spreadsheet software capabilities have brought optimization to users with little or no background in mathematical programming. Better interface capabilities of spreadsheet models now make it possible to combine optimization models with a spreadsheet system. Sawmill production and inventory systems possess many features that make them suitable application...

  1. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-30

    ... the information that is returned from the tools to the human user, and the forms in which these outputs are presented. ... This document and the Automated Software Tool Monitoring Program (Appendix 1) are ... Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types ...

  2. Friendly tanning: young adults' engagement with friends around indoor tanning.

    PubMed

    Rodríguez, Vivian M; Daniel, Casey L; Welles, Brooke Foucault; Geller, Alan C; Hay, Jennifer L

    2017-08-01

    Indoor tanning (IT), particularly during early adulthood, increases risk for melanoma and is exceedingly common among youth. Social influence, including social norms, promotes IT but little is known about young adults' engagement with friends around tanning. We examined IT behaviors and tanning-related communication with friends at three universities. Of 837 participants, 261 (31%) reported ever tanning (90% female, 85% White). Of those, 113 (43%) were former tanners and 148 (57%) current tanners. Current tanners reported more social tanning and discussions with friends about tanning, more frequent outdoor tanning, high propensity to tan, and greater lifetime IT exposure than former tanners. Risks-to-benefits discussion ratios were greater for former tanners. In adjusted analyses, current tanners were more likely to make plans to tan and to talk about tanning benefits with friends. Findings confirm IT is a social experience. Future work should examine social tanning's role in the promotion and reduction of IT among youth.

  3. KCNSC Automated RAIL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Branson, Donald

    The KCNSC Automated RAIL (Rolling Action Item List) system provides an electronic platform to manage and escalate rolling action items within a business and manufacturing environment at Honeywell. The software enables a tiered approach to issue management where issues are escalated up a management chain based on team input and compared to business metrics. The software manages action items at different levels of the organization and allows all users to discuss action items concurrently. In addition, the software drives accountability through timely emails and proper visibility during team meetings.

  4. Allergy-Friendly Gardening

    MedlinePlus

    A patient education resource from the American Academy of Allergy, Asthma & Immunology (AAAAI) on allergy-friendly gardening.

  5. MAC/GMC 4.0 User's Manual: Example Problem Manual. Volume 3

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2002-01-01

    This document is the third volume in the three volume set of User's Manuals for the Micromechanics Analysis Code with Generalized Method of Cells Version 4.0 (MAC/GMC 4.0). Volume 1 is the Theory Manual, Volume 2 is the Keywords Manual, and this document is the Example Problems Manual. MAC/GMC 4.0 is a composite material and laminate analysis software program developed at the NASA Glenn Research Center. It is based on the generalized method of cells (GMC) micromechanics theory, which provides access to the local stress and strain fields in the composite material. This access grants GMC the ability to accommodate arbitrary local models for inelastic material behavior and various types of damage and failure analysis. MAC/GMC 4.0 has been built around GMC to provide the theory with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material, have been automated in MAC/GMC 4.0. Finally, classical lamination theory has been implemented within MAC/GMC 4.0 wherein GMC is used to model the composite material response of each ply. Consequently, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. This volume provides in-depth descriptions of 43 example problems, which were specially designed to highlight many of the most important capabilities of the code. The actual input files associated with each example problem are distributed with the MAC/GMC 4.0 software; thus providing the user with a convenient starting point for their own specialized problems of interest.

  6. Vestibular rehabilitation using the Nintendo® Wii Balance Board -- a user-friendly alternative for central nervous compensation.

    PubMed

    Sparrer, Ingo; Duong Dinh, Thien An; Ilgner, Justus; Westhofen, Martin

    2013-03-01

    The Nintendo® Wii Balance Board is a cost-effective and user-friendly alternative to other popular, frequently used systems that aid vestibular compensation, particularly in elderly patients. In addition, further treatment in the home environment is possible. This cohort study was designed to investigate the impact of the Nintendo® Wii Balance Board as a visual compensation device after acute vestibular neuritis. Subjects were randomly assigned to one of two treatment groups. Group A (n = 37) performed customized exercises with the Nintendo® Wii Balance Board. Group B (n = 34) performed only two selected exercises as a control group for comparison of the results. Both groups underwent additive therapy with intravenous steroids in decreasing doses (250 mg decreasing to 25 mg over 10 days). The Sensory Organization Test (SOT), Dizziness Handicap Inventory (DHI), Vertigo Symptom Scale (VSS), and Tinetti questionnaire were evaluated immediately before treatment (baseline), at the end of treatment (day 5), and after 10 weeks. The early use of a visual feedback system in the context of balance training supports central nervous vestibular compensation after peripheral labyrinthine disorders. Patients in group B (without training) required a longer in-patient stay (average 2.4 days, SD 0.4) compared with patients following early Wii rehabilitation. The absence of nystagmus under Frenzel's goggles in group A was observed 2.1 days (SD 0.5) earlier than in group B. Group A showed significantly better results in the SOT, DHI, VSS, and Tinetti questionnaire at all time points measured (p < 0.05).

  7. The automated ground network system

    NASA Technical Reports Server (NTRS)

    Smith, Miles T.; Militch, Peter N.

    1993-01-01

    The primary goal of the Automated Ground Network System (AGNS) project is to reduce Ground Network (GN) station life-cycle costs. To accomplish this goal, the AGNS project will employ an object-oriented approach to develop a new infrastructure that will permit continuous application of new technologies and methodologies to the Ground Network's class of problems. The AGNS project is a Total Quality (TQ) project. Through use of an open collaborative development environment, developers and users will have equal input into the end-to-end design and development process. This will permit direct user input and feedback and will enable rapid prototyping for requirements clarification. This paper describes the AGNS objectives, operations concept, and proposed design.

  8. pmx Webserver: A User Friendly Interface for Alchemistry.

    PubMed

    Gapsys, Vytautas; de Groot, Bert L

    2017-02-27

    With the increase in available computational power and improvements in simulation algorithms, alchemical molecular dynamics based free energy calculations have developed into routine usage. To further facilitate the usability of alchemical methods for amino acid mutations, we have developed a web-based infrastructure for obtaining hybrid protein structures and topologies. The presented webserver allows amino acid mutation selection in five contemporary molecular mechanics force fields. In addition, a complete mutation scan with a user defined amino acid is supported. The output generated by the webserver is directly compatible with the Gromacs molecular dynamics engine and can be used with any alchemical free energy calculation setup. Furthermore, we present a database of input files and precalculated free energy differences for tripeptides approximating a disordered state of a protein, of particular use for protein stability studies. Finally, the usage of the webserver and its output is exemplified by performing an alanine scan and investigating the thermodynamic stability of the Trp cage mini protein. The webserver is accessible at http://pmx.mpibpc.mpg.de.

  9. Automating Geospatial Visualizations with Smart Default Renderers for Data Exploration Web Applications

    NASA Astrophysics Data System (ADS)

    Ekenes, K.

    2017-12-01

    This presentation will outline the process of creating a web application for exploring large amounts of scientific geospatial data using modern automated cartographic techniques. Traditional cartographic methods, including data classification, may inadvertently hide geospatial and statistical patterns in the underlying data. This presentation demonstrates how to use smart web APIs that analyze the data as it loads and suggest the most appropriate visualizations based on its statistics. Because only a few visualizations suit any given dataset well, and because many users never move beyond default values, it is important to provide smart default color schemes tailored to the dataset rather than static defaults. Multiple functions for automating visualizations are available in the Smart APIs, along with UI elements that let users create more than one visualization for a dataset, since there is no single best way to visualize it. Because bivariate and multivariate visualizations are particularly difficult to create effectively, this automated approach takes the guesswork out of the process and provides several ways to generate multivariate visualizations for the same variables, letting the user choose which visualization is most appropriate for their presentation. The methods used in these APIs and the renderers generated by them are not available elsewhere. The presentation will show how statistics can be used as the basis for automating default visualizations of data along continuous ramps, creating more refined visualizations while revealing the spread and outliers of the data. Adding interactive components to instantaneously alter visualizations allows users to unearth spatial patterns previously unknown among one or more variables. These applications may focus on a single dataset that is frequently updated, or configurable
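
    As a rough illustration of statistics-driven defaults (not the actual Smart APIs referenced above), the sketch below derives continuous color-ramp stops from the mean and standard deviation of a dataset, clamped to the observed range.

```python
import statistics

def smart_ramp_stops(values):
    """Derive five color-ramp stops from the data itself: centered on the mean,
    one standard deviation to each side, clamped to the observed range."""
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    lo, hi = min(values), max(values)
    stops = [mean - sd, mean - sd / 2, mean, mean + sd / 2, mean + sd]
    return [min(max(s, lo), hi) for s in stops]  # clamping preserves stop order

# a small dataset with one outlier still yields usable default stops
print(smart_ramp_stops([2.1, 2.4, 2.6, 3.0, 3.1, 9.8]))
```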

  10. Expert System for Automated Design Synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.; Barthelemy, Jean-Francois M.

    1987-01-01

    Expert-system computer program EXADS developed to aid users of Automated Design Synthesis (ADS) general-purpose optimization program. EXADS aids engineer in determining best combination based on knowledge of specific problem and expert knowledge stored in knowledge base. Available in two interactive machine versions. IBM PC version (LAR-13687) written in IQ-LISP. DEC VAX version (LAR-13688) written in Franz-LISP.

  11. j5 DNA assembly design automation.

    PubMed

    Hillson, Nathan J

    2014-01-01

    Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software integrated with j5 adds significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.

  12. Clarity: An Open Source Manager for Laboratory Automation

    PubMed Central

    Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.

    2013-01-01

    Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169

  13. A Mechanistically Informed User-Friendly Model to Predict Greenhouse Gas (GHG) Fluxes and Carbon Storage from Coastal Wetlands

    NASA Astrophysics Data System (ADS)

    Abdul-Aziz, O. I.; Ishtiaq, K. S.

    2015-12-01

    We present a user-friendly modeling tool, built in MS Excel, to predict the greenhouse gas (GHG) fluxes and estimate potential carbon sequestration from coastal wetlands. The dominant controls of wetland GHG fluxes and their relative mechanistic linkages with various hydro-climatic, sea level, biogeochemical and ecological drivers were first determined by employing a systematic data-analytics method, including a Pearson correlation matrix, principal component and factor analyses, and exploratory partial least squares regressions. This mechanistic knowledge and understanding was then utilized to develop parsimonious non-linear (power-law) models to predict wetland carbon dioxide (CO2) and methane (CH4) fluxes from a sub-set of climatic, hydrologic and environmental drivers such as the photosynthetically active radiation, soil temperature, water depth, and soil salinity. The models were tested with field data for multiple sites and seasons (2012-13) collected from Waquoit Bay, MA. The model estimated the annual wetland carbon storage by up-scaling the instantaneous predicted fluxes to an extended growing season (e.g., May-October) and by accounting for the net annual lateral carbon fluxes between the wetlands and the estuary. The Excel spreadsheet model is a simple ecological engineering tool for coastal carbon management and its incorporation into a potential carbon market under a changing climate, sea level and environment. Specifically, the model can help to determine appropriate GHG offset protocols and monitoring plans for projects that focus on tidal wetland restoration and maintenance.
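
    The following sketch shows the general power-law form the abstract describes, with placeholder coefficients rather than the fitted values from the Waquoit Bay study.

```python
import numpy as np

def co2_flux(par, soil_temp_c, a=0.05, b_par=0.6, b_temp=1.2):
    """Instantaneous CO2 flux as a product of power-law terms in two drivers.
    The coefficients are placeholders, not the study's fitted values."""
    return a * np.power(par, b_par) * np.power(soil_temp_c, b_temp)

par = np.array([200.0, 800.0, 1500.0])    # photosynthetically active radiation
temp = np.array([12.0, 18.0, 24.0])       # soil temperature, deg C
print(co2_flux(par, temp))
# Up-scaling to annual carbon storage would sum such predictions over the
# May-October growing season and add the lateral flux terms noted above.
```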

  14. Automated Rocket Propulsion Test Management

    NASA Technical Reports Server (NTRS)

    Walters, Ian; Nelson, Cheryl; Jones, Helene

    2007-01-01

    The Rocket Propulsion Test-Automated Management System provides a central location for managing activities associated with Rocket Propulsion Test Management Board, National Rocket Propulsion Test Alliance, and the Senior Steering Group business management activities. A set of authorized users, both on-site and off-site with regard to Stennis Space Center (SSC), can access the system through a Web interface. Web-based forms are used for user input with generation and electronic distribution of reports easily accessible. Major functions managed by this software include meeting agenda management, meeting minutes, action requests, action items, directives, and recommendations. Additional functions include electronic review, approval, and signatures. A repository/library of documents is available for users, and all items are tracked in the system by unique identification numbers and status (open, closed, percent complete, etc.). The system also provides queries and version control for input of all items.

  15. Automated spectral and timing analysis of AGNs

    NASA Astrophysics Data System (ADS)

    Munz, F.; Karas, V.; Guainazzi, M.

    2006-12-01

    We have developed an autonomous script that helps the user to automate the XMM-Newton data analysis for the purposes of extensive statistical investigations. We test this approach by examining X-ray spectra of bright AGNs pre-selected from the public database. The event lists extracted in this process were studied further by constructing their energy-resolved Fourier power-spectrum density. This analysis combines energy distributions, light curves, and their power spectra, and it proves useful for assessing the variability patterns present in the data. As another example, an automated search based on the XSPEC package was used to reveal the emission features in the 2-8 keV range.
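
    A minimal sketch of an energy-resolved power spectrum of the kind described, assuming a plain event list; the band edges, bin size, and synthetic data below are illustrative choices, not the paper's actual setup.

```python
import numpy as np

def band_power_spectrum(times, energies, band, dt=10.0):
    """Light curve of events inside an energy band (keV), then its Fourier power."""
    lo, hi = band
    t = times[(energies >= lo) & (energies < hi)]
    nbins = int(np.ceil((times.max() - times.min()) / dt))
    counts, _ = np.histogram(t, bins=nbins, range=(times.min(), times.max()))
    rate = counts / dt
    power = np.abs(np.fft.rfft(rate - rate.mean())) ** 2
    freqs = np.fft.rfftfreq(nbins, d=dt)
    return freqs[1:], power[1:]            # drop the zero-frequency term

rng = np.random.default_rng(0)             # synthetic event list for illustration
times = np.sort(rng.uniform(0, 10_000, 50_000))   # seconds
energies = rng.uniform(0.3, 10.0, 50_000)         # keV
freqs, power = band_power_spectrum(times, energies, band=(2.0, 8.0))
```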

  16. A Computational Architecture for Programmable Automation Research

    NASA Astrophysics Data System (ADS)

    Taylor, Russell H.; Korein, James U.; Maier, Georg E.; Durfee, Lawrence F.

    1987-03-01

    This short paper describes recent work at the IBM T. J. Watson Research Center directed at developing a highly flexible computational architecture for research on sensor-based programmable automation. The system described here has been designed with a focus on dynamic configurability, layered user interfaces and incorporation of sensor-based real time operations into new commands. It is these features which distinguish it from earlier work. The system is currently being implemented at IBM for research purposes and internal use and is an outgrowth of programmable automation research which has been ongoing since 1972 [e.g., 1, 2, 3, 4, 5, 6].

  17. ARTIP: Automated Radio Telescope Image Processing Pipeline

    NASA Astrophysics Data System (ADS)

    Sharma, Ravi; Gyanchandani, Dolly; Kulkarni, Sarang; Gupta, Neeraj; Pathak, Vineet; Pande, Arti; Joshi, Unmesh

    2018-02-01

    The Automated Radio Telescope Image Processing Pipeline (ARTIP) automates the entire process of flagging, calibrating, and imaging for radio-interferometric data. ARTIP starts with raw data, i.e. a measurement set, and goes through multiple stages, such as flux calibration, bandpass calibration, phase calibration, and imaging, to generate continuum and spectral line images. Each stage can also be run independently. The pipeline provides continuous feedback to the user through various messages, charts and logs. It is written using standard Python libraries and the CASA package. The pipeline can deal with datasets with multiple spectral windows and also multiple target sources which may have arbitrary combinations of flux/bandpass/phase calibrators.

  18. Cognitive anchoring on self-generated decisions reduces operator reliance on automated diagnostic aids.

    PubMed

    Madhavan, Poornima; Wiegmann, Douglas A

    2005-01-01

    Automation users often disagree with diagnostic aids that are imperfectly reliable. The extent to which users' agreements with an aid are anchored to their personal, self-generated diagnoses was explored. Participants (N = 75) performed 200 trials in which they diagnosed pump failures using an imperfectly reliable automated aid. One group (nonforced anchor, n = 50) provided diagnoses only after consulting the aid. Another group (forced anchor, n = 25) provided diagnoses both before and after receiving feedback from the aid. Within the nonforced anchor group, participants' self-reported tendency to prediagnose system failures significantly predicted their tendency to disagree with the aid, revealing a cognitive anchoring effect. Agreement rates of participants in the forced anchor group indicated that public commitment to a diagnosis did not strengthen this effect. Potential applications include the development of methods for reducing cognitive anchoring effects and improving automation utilization in high-risk domains.

  19. Students' Experiences with an Automated Essay Scorer

    ERIC Educational Resources Information Center

    Scharber, Cassandra; Dexter, Sara; Riedel, Eric

    2008-01-01

    The purpose of this research is to analyze preservice teachers' use of and reactions to an automated essay scorer used within an online, case-based learning environment called ETIPS. Data analyzed include post-assignment surveys, a user log of students' actions within the cases, instructor-assigned scores on final essays, and interviews with four…

  20. Usability Testing, User-Centered Design, and LibGuides Subject Guides: A Case Study

    ERIC Educational Resources Information Center

    Sonsteby, Alec; DeJonghe, Jennifer

    2013-01-01

    Usability testing has become a routine way for many libraries to ensure that their Web presence is user-friendly and accessible. At the same time, popular subject guide creation systems, such as LibGuides, decentralize Web content creation and put authorship into the hands of librarians who may not be trained in user-centered design principles. At…

  1. Automation Applications in an Advanced Air Traffic Management System : Volume 5B. DELTA Simulation Model - Programmer's Guide.

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 5 describes the DELTA Simulation Model. It includes all documentation of the DELTA (Determine Effective Levels of Task Automation) computer simulation developed by TRW for use in the Automation Applications Study. Volume 5A includes a user's m...

  2. Formularity: Software for Automated Formula Assignment of Natural and Other Organic Matter from Ultrahigh-Resolution Mass Spectra.

    PubMed

    Tolić, Nikola; Liu, Yina; Liyu, Andrey; Shen, Yufeng; Tfaily, Malak M; Kujawinski, Elizabeth B; Longnecker, Krista; Kuo, Li-Jung; Robinson, Errol W; Paša-Tolić, Ljiljana; Hess, Nancy J

    2017-12-05

    Ultrahigh resolution mass spectrometry, such as Fourier transform ion cyclotron resonance mass spectrometry (FT ICR MS), can resolve thousands of molecular ions in complex organic matrices. A Compound Identification Algorithm (CIA) was previously developed for automated elemental formula assignment for natural organic matter (NOM). In this work, we describe the software Formularity, which provides a user-friendly interface for the CIA function and for a newly developed search function, the Isotopic Pattern Algorithm (IPA). While CIA assigns elemental formulas for compounds containing C, H, O, N, S, and P, IPA is capable of assigning formulas for compounds containing other elements. We used halogenated organic compounds (HOC), a chemical class that is ubiquitous in natural as well as anthropogenic systems, as an example to demonstrate the capability of Formularity with IPA. A HOC standard mix was used to evaluate the identification confidence of IPA. Tap water and a HOC spike in Suwannee River NOM were used to assess HOC identification in complex environmental samples. Strategies for reconciliation of CIA and IPA assignments are discussed. The software and sample databases with documentation are freely available.
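
    A toy sketch in the spirit of exact-mass formula assignment: enumerate small CHNOS compositions and keep those within a ppm tolerance of a measured neutral mass. The search bounds and test mass are illustrative, and this is far simpler than what CIA/IPA actually do.

```python
from itertools import product

MASS = {"C": 12.0, "H": 1.0078250319, "O": 15.9949146221,
        "N": 14.0030740052, "S": 31.97207069}

def assign_formulas(neutral_mass, ppm_tol=1.0, max_atoms=(30, 60, 20, 5, 2)):
    """Return all CcHhOoNnSs compositions within ppm_tol of neutral_mass."""
    hits = []
    for c, h, o, n, s in product(*(range(m + 1) for m in max_atoms)):
        m = (c * MASS["C"] + h * MASS["H"] + o * MASS["O"]
             + n * MASS["N"] + s * MASS["S"])
        if m and abs(m - neutral_mass) / neutral_mass * 1e6 <= ppm_tol:
            hits.append((f"C{c}H{h}O{o}N{n}S{s}", round(m, 6)))
    return hits

# assumed test value: the neutral monoisotopic mass of caffeine, C8H10N4O2
print(assign_formulas(194.080376))
```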

  3. Friends Like Me: Associations in Overweight/Obese Status among Adolescent Friends by Race/Ethnicity, Sex, and Friendship Type.

    PubMed

    Bruening, Meg; MacLehose, Richard; Eisenberg, Marla E; Kim, Sunkyung; Story, Mary; Neumark-Sztainer, Dianne

    2015-12-01

    Little is known about how interpersonal friend relationships are associated with obesity in young people, particularly with regard to how race/ethnicity, type of friendship, and sex affect the association between friends' and adolescents' weight status. This study examined associations in weight status among adolescents and their friends, exploring magnitudes of associations across friendship type, sex, and race/ethnicity. As part of EAT-2010 (Eating and Activity in Teens), friend nominations and anthropometrics were obtained from adolescents (n = 2099: 54% female; 80% nonwhite; mean age: 14.2 ± 1.9 years). Generalized estimating equation logistic regression models were used to test associations between adolescents' overweight/obese status and friends' (i.e., friend group, female friends, male friends, female best friends, and male best friends) overweight/obese status. Interactions by adolescent race/ethnicity were examined. The majority of significant associations were observed among white female adolescents, who had a 22-40% higher prevalence of overweight/obesity if their friends were overweight compared to white females whose friends were not overweight. In contrast, there were few significant differences for other racial/ethnic groups of adolescent girls and boys. Results for friend groups and best friends were generally similar to one another. The association between friend and adolescent overweight/obese status depended on adolescents' sex, race/ethnicity, and friendship type. Given the similarities among friends, obesity interventions targeting youth, especially white females, should consider involving friends.

  4. Friends Like Me: Associations in Overweight/Obese Status among Adolescent Friends by Race/Ethnicity, Sex, and Friendship Type

    PubMed Central

    MacLehose, Richard; Eisenberg, Marla E.; Kim, Sunkyung; Story, Mary; Neumark-Sztainer, Dianne

    2015-01-01

    Background: Little is known about how interpersonal friend relationships are associated with obesity in young people, particularly with regard to how race/ethnicity, type of friendship, and sex affect the association between friends' and adolescents' weight status. This study examined associations in weight status among adolescents and their friends, exploring magnitudes of associations across friendship type, sex, and race/ethnicity. Methods: As part of EAT-2010 (Eating and Activity in Teens), friend nominations and anthropometrics were obtained from adolescents (n = 2099: 54% female; 80% nonwhite; mean age: 14.2 ± 1.9 years). Generalized estimating equation logistic regression models were used to test associations between adolescents' overweight/obese status and friends' (i.e., friend group, female friends, male friends, female best friends, and male best friends) overweight/obese status. Interactions by adolescent race/ethnicity were examined. Results: The majority of significant associations were observed among white female adolescents, who had a 22–40% higher prevalence of overweight/obesity if their friends were overweight compared to white females whose friends were not overweight. In contrast, there were few significant differences for other racial/ethnic groups of adolescent girls and boys. Results for friend groups and best friends were generally similar to one another. Conclusions: The association between friend and adolescent overweight/obese status depended on adolescents' sex, race/ethnicity, and friendship type. Given the similarities among friends, obesity interventions targeting youth, especially white females, should consider involving friends. PMID:26655453

  5. Task automation in a successful industrial telerobot

    NASA Technical Reports Server (NTRS)

    Spelt, Philip F.; Jones, Sammy L.

    1994-01-01

    In this paper, we discuss cooperative work by Oak Ridge National Laboratory and Remotec, Inc., to automate components of the operator's workload using Remotec's Andros telerobot, thereby providing an enhanced user interface which can be retrofit to existing fielded units as well as being incorporated into new production units. Remotec's Andros robots are presently used by numerous electric utilities to perform tasks in reactors where substantial exposure to radiation exists, as well as by the armed forces and numerous law enforcement agencies. The automation of task components, as well as the video graphics display of the robot's position in the environment, will enhance all tasks performed by these users, as well as enabling performance in terrain where the robots cannot presently perform due to lack of knowledge about, for instance, the degree of tilt of the robot. Enhanced performance of a successful industrial mobile robot leads to increased safety and efficiency of performance in hazardous environments. The addition of these capabilities will greatly enhance the utility of the robot, as well as its marketability.

  6. Just-in-time automated counseling for physical activity promotion.

    PubMed

    Bickmore, Timothy; Gruber, Amanda; Intille, Stephen

    2008-11-06

    Preliminary results from a field study into the efficacy of automated health behavior counseling delivered at the moment of user decision-making compared to the same counseling delivered at the end of the day are reported. The study uses an animated PDA-based advisor with an integrated accelerometer that can engage users in dialogues about their physical activity throughout the day. Preliminary results indicate health counseling is more effective when delivered just-in-time than when delivered retrospectively.

  7. Improving treatment plan evaluation with automation.

    PubMed

    Covington, Elizabeth L; Chen, Xiaoping; Younge, Kelly C; Lee, Choonik; Matuszak, Martha M; Kessler, Marc L; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M; Filpansick, Stephanie E; Moran, Jean M

    2016-11-08

    The goal of this work is to evaluate the effectiveness of the Plan-Checker Tool (PCT), which was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted by PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased the visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. © 2016 The Authors.
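
    A hedged sketch of one automated plan-readiness check in this style: pull the same prescription fields from the planning and management systems and flag mismatches. The client objects and their get_prescription method are hypothetical stand-ins, not a vendor API.

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def check_prescription_match(tps_client, tms_client, plan_id):
    """Flag any prescription field that differs between the TPS and the TMS."""
    tps_rx = tps_client.get_prescription(plan_id)
    tms_rx = tms_client.get_prescription(plan_id)
    mismatches = [k for k in tps_rx if tps_rx[k] != tms_rx.get(k)]
    return CheckResult("Prescription TPS/TMS match", not mismatches,
                       "OK" if not mismatches else f"fields differ: {mismatches}")

class _StubClient:                         # stand-in for a real system connection
    def __init__(self, rx): self._rx = rx
    def get_prescription(self, plan_id): return self._rx

tps = _StubClient({"dose_gy": 60.0, "fractions": 30})
tms = _StubClient({"dose_gy": 60.0, "fractions": 25})
print(check_prescription_match(tps, tms, plan_id="PLAN-001"))
```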

  8. PaR-PaR Laboratory Automation Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linshiz, G; Stawski, N; Poust, S

    2013-05-01

    Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.

  9. PaR-PaR laboratory automation platform.

    PubMed

    Linshiz, Gregory; Stawski, Nina; Poust, Sean; Bi, Changhao; Keasling, Jay D; Hillson, Nathan J

    2013-05-17

    Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.

  10. Road user behaviour changes following a self-explaining roads intervention.

    PubMed

    Mackie, Hamish W; Charlton, Samuel G; Baas, Peter H; Villasenor, Pablo C

    2013-01-01

    The self-explaining roads (SER) approach uses road designs that evoke correct expectations and driving behaviours from road users to create a safe and user-friendly road network. Following the implementation of an SER process and retrofitting of local and collector roads in a suburb within Auckland City, lower speeds on local roads and less variation in speed on both local and collector roads were achieved, along with a closer match between actual and perceived safe speeds. Preliminary analyses of crash data show that the project has resulted in a 30% reduction in crash numbers and an 86% reduction in crash costs per annum since the road changes were completed. In order to further understand the outcomes from this project, a study was carried out to measure the effects of the SER intervention on the activity and behaviour of all road users. Video was collected over nine separate days, at nine different locations, both before and after SER construction. Road user behaviour categories were developed for all potential road users at different location types and then used to code the video data. Following SER construction, on local roads there was a relatively higher proportion of pedestrians, less uniformity in vehicle lane keeping and less indicating by motorists, along with less through traffic, reflecting a more informal, low-speed local road environment. Pedestrians were less constrained on local roads following SER construction, possibly reflecting a perceptually safer and more user-friendly environment. These behaviours were not generally evident on collector roads, a trend also shown by the previous study of speed changes. Given that one of the objectives of SER is to match road user behaviour with functionally different road categories, the road user behaviour differences demonstrated on different road types within the SER trial area provide further reinforcement of a successful SER trial. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Formal analysis and automatic generation of user interfaces: approach, methodology, and an algorithm.

    PubMed

    Heymann, Michael; Degani, Asaf

    2007-04-01

    We present a formal approach and methodology for the analysis and generation of user interfaces, with special emphasis on human-automation interaction. A conceptual approach for modeling, analyzing, and verifying the information content of user interfaces is discussed. The proposed methodology is based on two criteria: First, the interface must be correct--that is, given the interface indications and all related information (user manuals, training material, etc.), the user must be able to successfully perform the specified tasks. Second, the interface and related information must be succinct--that is, the amount of information (mode indications, mode buttons, parameter settings, etc.) presented to the user must be reduced (abstracted) to the minimum necessary. A step-by-step procedure for generating the information content of the interface that is both correct and succinct is presented and then explained and illustrated via two examples. Every user interface is an abstract description of the underlying system. The correspondence between the abstracted information presented to the user and the underlying behavior of a given machine can be analyzed and addressed formally. The procedure for generating the information content of user interfaces can be automated, and a software tool for its implementation has been developed. Potential application areas include adaptive interface systems and customized/personalized interfaces.

  12. Apparatus for automated testing of biological specimens

    DOEpatents

    Layne, Scott P.; Beugelsdijk, Tony J.

    1999-01-01

    An apparatus for performing automated testing of infectious biological specimens is disclosed. The apparatus comprises a process controller for translating user commands into test instrument suite commands, and a test instrument suite comprising a means to treat the specimen to manifest an observable result, and a detector for measuring the observable result to generate specimen test results.

  13. TargetVue: Visual Analysis of Anomalous User Behaviors in Online Communication Systems.

    PubMed

    Cao, Nan; Shi, Conglei; Lin, Sabrina; Lu, Jie; Lin, Yu-Ru; Lin, Ching-Yung

    2016-01-01

    Users with anomalous behaviors in online communication systems (e.g. email and social media platforms) are potential threats to society. Automated anomaly detection based on advanced machine learning techniques has been developed to combat this issue; challenges remain, though, due to the difficulty of obtaining proper ground truth for model training and evaluation. Therefore, substantial human judgment on the automated analysis results is often required to better adjust the performance of anomaly detection. Unfortunately, techniques that allow users to understand the analysis results more efficiently, to make a confident judgment about anomalies, and to explore data in their context, are still lacking. In this paper, we propose a novel visual analysis system, TargetVue, which detects anomalous users via an unsupervised learning model and visualizes the behaviors of suspicious users in behavior-rich context through novel visualization designs and multiple coordinated contextual views. Particularly, TargetVue incorporates three new ego-centric glyphs to visually summarize a user's behaviors which effectively present the user's communication activities, features, and social interactions. An efficient layout method is proposed to place these glyphs on a triangle grid, which captures similarities among users and facilitates comparisons of behaviors of different users. We demonstrate the power of TargetVue through its application in a social bot detection challenge using Twitter data, a case study based on email records, and an interview with expert users. Our evaluation shows that TargetVue is beneficial to the detection of users with anomalous communication behaviors.

  14. Users matter : multi-agent systems model of high performance computing cluster users.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Hood, C. S.; Decision and Information Sciences

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.

  15. Toward an automated parallel computing environment for geosciences

    NASA Astrophysics Data System (ADS)

    Zhang, Huai; Liu, Mian; Shi, Yaolin; Yuen, David A.; Yan, Zhenzhen; Liang, Guoping

    2007-08-01

    Software for geodynamic modeling has not kept up with the fast growing computing hardware and network resources. In the past decade supercomputing power has become available to most researchers in the form of affordable Beowulf clusters and other parallel computer platforms. However, to take full advantage of such computing power requires developing parallel algorithms and associated software, a task that is often too daunting for geoscience modelers whose main expertise is in geosciences. We introduce here an automated parallel computing environment built on open-source algorithms and libraries. Users interact with this computing environment by specifying the partial differential equations, solvers, and model-specific properties using an English-like modeling language in the input files. The system then automatically generates the finite element codes that can be run on distributed or shared memory parallel machines. This system is dynamic and flexible, allowing users to address different problems in geosciences. It is capable of providing web-based services, enabling users to generate source codes online. This unique feature will facilitate high-performance computing to be integrated with distributed data grids in the emerging cyber-infrastructures for geosciences. In this paper we discuss the principles of this automated modeling environment and provide examples to demonstrate its versatility.

  16. Development of an Automated Precipitation Processing Model and Applications in Hydrologic Investigations

    NASA Astrophysics Data System (ADS)

    Milewski, A. M.; Markondiah Jayaprakash, S.; Sultan, M.; Becker, R.

    2006-12-01

    Given the advances in new technologies, more and more scientists are beginning to utilize remote sensing or satellite imagery in their research applications. Remote sensing data offer a synoptic view and observational quantitative parameters over large domains and thus provide cost-effective solutions by reducing the labor involved in collecting extensive field observations. One of the valuable data sets that can be extracted from remote sensing observations is precipitation. Prior to the deployment of the relevant satellite-based sensors, users had to resort to rainfall stations to obtain precipitation data. Currently, users can freely download digital Tropical Rainfall Measuring Mission (TRMM) and Special Sensor Microwave/Imager (SSM/I) precipitation data; however, the process of data extraction is not user friendly, as it requires computer programming to fully utilize these datasets. We have developed the Automated Precipitation Processing Module (APPM) to simplify the tedious manual process needed to retrieve rainfall estimates via satellite measurements. The function of the APPM is to process the TRMM and SSM/I data according to the user's spatial and temporal inputs. Using APPM, we processed all available TRMM and SSM/I data for six continents (processed data are available on six compact discs, one per continent; refer to www.esrs.wmich.edu). The input data include global SSM/I (1987-1998) and TRMM (1998-2005) data covering an area extending from 50 degrees North to 50 degrees South. Advantages of using our software include: (1) user-friendly technology, (2) reduction in processing time (e.g., processing of the entire TRMM & SSM/I dataset (1987-2005) for Africa was reduced from one year to one week), and (3) reduction in required computer resources (original TRMM & SSM/I data: 1.5 terabytes; processed: 300 megabytes). The APPM reads raw binary data and allows for: (1) sub-setting the global dataset given user-defined boundaries (latitude and longitude), (2) selection of
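
    An illustrative sketch of the sub-setting step, assuming a regular 0.25-degree global grid ordered north-to-south; the grid layout and resolution are assumptions for the example, not APPM's internal format.

```python
import numpy as np

def subset_precip(grid, lat_bounds, lon_bounds, lat0=50.0, lon0=-180.0, res=0.25):
    """Clip grid[i, j], where lat = lat0 - i*res and lon = lon0 + j*res,
    to the user's latitude/longitude bounds."""
    lat_max, lat_min = max(lat_bounds), min(lat_bounds)
    lon_min, lon_max = min(lon_bounds), max(lon_bounds)
    i0 = int(round((lat0 - lat_max) / res))
    i1 = int(round((lat0 - lat_min) / res))
    j0 = int(round((lon_min - lon0) / res))
    j1 = int(round((lon_max - lon0) / res))
    return grid[i0:i1 + 1, j0:j1 + 1]

global_grid = np.random.rand(401, 1440)    # 50N-50S, 180W-180E at 0.25 degrees
africa = subset_precip(global_grid, lat_bounds=(-35, 37), lon_bounds=(-18, 52))
print(africa.shape)
```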

  17. Waste treatability guidance program. User's guide. Revision 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toth, C.

    1995-12-21

    DOE sites across the country generate and manage radioactive, hazardous, mixed, and sanitary wastes. It is necessary for each site to find the technologies and associated capacities required to manage its waste. One role of the DOE HQ Office of Environmental Restoration and Waste Management is to facilitate the integration of the site-specific plans into coherent national plans. DOE has developed a standard methodology for defining and categorizing waste streams into treatability groups based on characteristic parameters that influence waste management technology needs. This Waste Treatability Guidance Program automates the Guidance Document for the categorization of waste information into treatability groups; this application provides a consistent implementation of the methodology across the National TRU Program. This User's Guide provides instructions on how to use the program, including installation instructions and program operation. This document satisfies the requirements of the Software Quality Assurance Plan.

  18. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  19. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
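
    A rough sketch of the post-processing idea: convert each adhesive spring's force components into shear and peel stresses via a tributary area and sort them in descending order. The field names, areas, and forces are illustrative, not the tools' actual file format.

```python
def adhesive_stresses(springs):
    """springs: dicts of spring force components (N) and tributary area (mm^2)."""
    results = []
    for s in springs:
        shear = (s["fx"] ** 2 + s["fy"] ** 2) ** 0.5 / s["area"]  # in-plane force
        peel = s["fz"] / s["area"]                                # normal force
        results.append({"element": s["element"], "shear": shear, "peel": peel})
    # highest-loaded adhesive locations first, mirroring the sorted output idea
    return sorted(results, key=lambda r: max(abs(r["shear"]), abs(r["peel"])),
                  reverse=True)

springs = [
    {"element": 101, "fx": 120.0, "fy": 35.0, "fz": 60.0, "area": 4.0},
    {"element": 102, "fx": 15.0, "fy": 5.0, "fz": 210.0, "area": 4.0},
]
for row in adhesive_stresses(springs):
    print(row)
```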

  20. CovalentDock Cloud: a web server for automated covalent docking.

    PubMed

    Ouyang, Xuchang; Zhou, Shuo; Ge, Zemei; Li, Runtao; Kwoh, Chee Keong

    2013-07-01

    Covalent binding is an important mechanism for many drugs to gain their function. We developed a computational algorithm to model this chemical event and extended it to a web server, the CovalentDock Cloud, to make it accessible directly online without any local installation or configuration. It provides a simple yet user-friendly web interface for performing covalent docking experiments and analysis online. The web server accepts the structures of both the ligand and the receptor uploaded by the user or retrieved from online databases with a valid access ID. It identifies the potential covalent binding patterns, carries out the covalent docking experiments and provides visualization of the results for user analysis. This web server is free and open to all users at http://docking.sce.ntu.edu.sg/.

  1. CovalentDock Cloud: a web server for automated covalent docking

    PubMed Central

    Ouyang, Xuchang; Zhou, Shuo; Ge, Zemei; Li, Runtao; Kwoh, Chee Keong

    2013-01-01

    Covalent binding is an important mechanism for many drugs to gain their function. We developed a computational algorithm to model this chemical event and extended it to a web server, the CovalentDock Cloud, to make it accessible directly online without any local installation or configuration. It provides a simple yet user-friendly web interface to perform covalent docking experiments and analysis online. The web server accepts the structures of both the ligand and the receptor uploaded by the user or retrieved from online databases with a valid access ID. It identifies the potential covalent binding patterns, carries out the covalent docking experiments and provides visualization of the results for user analysis. This web server is free and open to all users at http://docking.sce.ntu.edu.sg/. PMID:23677616

  2. Computer-automated ABCD versus dermatologists with different degrees of experience in dermoscopy.

    PubMed

    Piccolo, Domenico; Crisman, Giuliana; Schoinas, Spyridon; Altamura, Davide; Peris, Ketty

    2014-01-01

    Dermoscopy is a very useful and non-invasive technique for in vivo observation and preoperative diagnosis of pigmented skin lesions (PSLs), inasmuch as it enables analysis of surface and subsurface structures that are not discernible to the naked eye. The authors used the ABCD rule of dermoscopy to test the accuracy of melanoma diagnosis on a panel of 165 PSLs, as well as the intra- and inter-observer diagnostic agreement obtained between three dermatologists with different degrees of experience, one general practitioner, and a DDA for computer-assisted diagnosis (Nevuscreen®, Arkè s.a.s., Avezzano, Italy). 165 pigmented skin lesions from 165 patients were selected. Histopathological examination revealed 132 benign melanocytic skin lesions and 33 melanomas. The kappa statistic, sensitivity, specificity, and positive and negative predictive values were calculated to measure agreement between all the human observers and in comparison with the automated DDA. Our results revealed poor reproducibility of the semi-quantitative algorithm devised by Stolz et al., independently of observers' experience in dermoscopy. Nevuscreen® (Arkè s.a.s., Avezzano, Italy) proved to be 'user friendly' to all observers, thus enabling a more critical evaluation of each lesion and representing a helpful tool for clinicians without significant experience in dermoscopy in achieving a more accurate diagnosis of PSLs.
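
    For reference, a sketch of the semi-quantitative ABCD scoring (total dermoscopy score) evaluated in the study, using the commonly cited weights and thresholds; treat the exact weights and cut-offs as assumptions to be checked against Stolz et al.

```python
def total_dermoscopy_score(asymmetry, border, colors, structures):
    """ABCD rule: asymmetry 0-2, border 0-8, colors 1-6, structures 1-5."""
    tds = asymmetry * 1.3 + border * 0.1 + colors * 0.5 + structures * 0.5
    if tds < 4.75:
        verdict = "benign"
    elif tds <= 5.45:
        verdict = "suspicious"
    else:
        verdict = "highly suspicious for melanoma"
    return round(tds, 2), verdict

print(total_dermoscopy_score(asymmetry=2, border=5, colors=4, structures=4))
```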

  3. Automated control of hierarchical systems using value-driven methods

    NASA Technical Reports Server (NTRS)

    Pugh, George E.; Burke, Thomas E.

    1990-01-01

    An introduction is given to the Value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the systems operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely-known optimization techniques.

  4. Facebook friends with (health) benefits? Exploring social network site use and perceptions of social support, stress, and well-being.

    PubMed

    Nabi, Robin L; Prestin, Abby; So, Jiyeon

    2013-10-01

    There is clear evidence that interpersonal social support impacts stress levels and, in turn, degree of physical illness and psychological well-being. This study examines whether mediated social networks serve the same palliative function. A survey of 401 undergraduate Facebook users revealed that, as predicted, number of Facebook friends associated with stronger perceptions of social support, which in turn associated with reduced stress, and in turn less physical illness and greater well-being. This effect was minimized when interpersonal network size was taken into consideration. However, for those who have experienced many objective life stressors, the number of Facebook friends emerged as the stronger predictor of perceived social support. The "more-friends-the-better" heuristic is proposed as the most likely explanation for these findings.

  5. Discrete Abstractions of Hybrid Systems: Verification of Safety and Application to User-Interface Design

    NASA Technical Reports Server (NTRS)

    Oishi, Meeko; Tomlin, Claire; Degani, Asaf

    2003-01-01

    Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.

  6. Sharing intelligence: Decision-making interactions between users and software in MAESTRO

    NASA Technical Reports Server (NTRS)

    Geoffroy, Amy L.; Gohring, John R.; Britt, Daniel L.

    1991-01-01

    By combining the best of automated and human decision-making in scheduling, many advantages can accrue. The joint performance of the user and system is potentially much better than either alone. Features of the MAESTRO scheduling system serve to illustrate concepts of user/software cooperation. MAESTRO may be operated at a user-determinable and dynamic level of autonomy. Because the system allows so much flexibility in the allocation of decision-making responsibilities, and provides users with a wealth of information and other support for their own decision-making, better overall schedules may result.

  7. A generic template for automated bioanalytical ligand-binding assays using modular robotic scripts in support of discovery biotherapeutic programs.

    PubMed

    Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J

    2013-07-01

    Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate automated sample preparation and the LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of the automation scripts allowed users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs supporting discovery biotherapeutic programs. The results demonstrated that the modular scripts provided flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those from the manual process, while bioanalytical productivity was significantly improved using the modular robotic scripts.
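
    A generic sketch of the kind of calculation a minimum-required-dilution module performs, breaking a large dilution factor into serial pipetting steps; the volumes and factors are illustrative, and this is plain Python rather than Tecan EVOware scripting.

```python
def dilution_steps(total_factor, step_factor=10, final_volume_ul=200):
    """Break a large dilution factor into serial steps a liquid handler can pipette."""
    steps = []
    remaining = float(total_factor)
    while remaining > 1:
        factor = min(step_factor, remaining)
        sample = final_volume_ul / factor
        steps.append({"sample_ul": round(sample, 1),
                      "diluent_ul": round(final_volume_ul - sample, 1),
                      "factor": factor})
        remaining /= factor
    return steps

# e.g. a 1:200 minimum required dilution performed as 1:10, 1:10, then 1:2
for step in dilution_steps(200):
    print(step)
```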

  8. Satellite services system analysis study. Volume 2: Satellite and services user model

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Satellite services needs are analyzed. Topics include methodology: a satellite user model; representative servicing scenarios; potential service needs; manned, remote, and automated involvement; and inactive satellites/debris. Satellite and services user model development is considered. Groundrules and assumptions, servicing, events, and sensitivity analysis are included. Selection of references satellites is also discussed.

  9. Effects of imperfect automation on decision making in a simulated command and control task.

    PubMed

    Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja

    2007-02-01

    Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Unreliable automation led to a greater cost in decision-making accuracy under the higher automation reliability condition for the three forms of decision automation relative to information automation. At low automation reliability, however, there was a cost in performance for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.

  10. The SSM/PMAD automated test bed project

    NASA Technical Reports Server (NTRS)

    Lollar, Louis F.

    1991-01-01

    The Space Station Module/Power Management and Distribution (SSM/PMAD) autonomous subsystem project was initiated in 1984. The project's goal has been to design and develop an autonomous, user-supportive PMAD test bed simulating the SSF Hab/Lab module(s). An eighteen kilowatt SSM/PMAD test bed model with a high degree of automated operation has been developed. This advanced automation test bed contains three expert/knowledge based systems that interact with one another and with other more conventional software residing in up to eight distributed 386-based microcomputers to perform the necessary tasks of real-time and near real-time load scheduling, dynamic load prioritizing, and fault detection, isolation, and recovery (FDIR).
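
    As a rough, self-contained illustration of the dynamic load-prioritizing function mentioned above (the load names, priorities, and capacity figure below are invented and are not taken from the SSM/PMAD test bed), a priority-based shedding step might look like this in Python:

```python
# Minimal sketch of priority-based load shedding: when demand exceeds
# the available bus capacity, the lowest-priority loads are shed first.
# Loads, priorities, and the capacity figure are illustrative only.
from typing import List, Tuple

def shed_loads(loads: List[Tuple[str, float, int]], capacity_kw: float):
    """loads: (name, demand_kw, priority); lower priority number = more critical."""
    kept, shed, total = [], [], 0.0
    for name, demand, _prio in sorted(loads, key=lambda l: l[2]):
        if total + demand <= capacity_kw:
            kept.append(name)
            total += demand
        else:
            shed.append(name)
    return kept, shed

kept, shed = shed_loads(
    [("life support", 6.0, 0), ("experiment A", 5.0, 2),
     ("lighting", 3.0, 1), ("experiment B", 7.0, 3)],
    capacity_kw=18.0)
print("kept:", kept, "shed:", shed)
```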

  11. MAC/GMC 4.0 User's Manual: Keywords Manual. Volume 2

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2002-01-01

    This document is the second volume in the three volume set of User's Manuals for the Micromechanics Analysis Code with Generalized Method of Cells Version 4.0 (MAC/GMC 4.0). Volume 1 is the Theory Manual, this document is the Keywords Manual, and Volume 3 is the Example Problem Manual. MAC/GMC 4.0 is a composite material and laminate analysis software program developed at the NASA Glenn Research Center. It is based on the generalized method of cells (GMC) micromechanics theory, which provides access to the local stress and strain fields in the composite material. This access grants GMC the ability to accommodate arbitrary local models for inelastic material behavior and various types of damage and failure analysis. MAC/GMC 4.0 has been built around GMC to provide the theory with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, applications of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated in MAC/GMC 4.0. Finally, classical lamination theory has been implemented within MAC/GMC 4.0 wherein GMC is used to model the composite material response of each ply. Consequently, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. This volume describes the basic information required to use the MAC/GMC 4.0 software, including a 'Getting Started' section, and an in-depth description of each of the 22 keywords used in the input file to control the execution of the code.

  12. The historical development and basis of human factors guidelines for automated systems in aeronautical operations

    NASA Technical Reports Server (NTRS)

    Ciciora, J. A.; Leonard, S. D.; Johnson, N.; Amell, J.

    1984-01-01

    In order to derive general design guidelines for automated systems, a study was conducted on the utilization and acceptance of existing automated systems as currently employed in several commercial fields. Four principal study areas were investigated by means of structured interviews and, in some cases, questionnaires. The study areas were aviation, including both scheduled airline and general commercial aviation; process control and factory applications; office automation; and automation in the power industry. The results of over eighty structured interviews were analyzed and responses categorized as various human factors issues for use by both designers and users of automated equipment. These guidelines address such items as general physical features of automated equipment; personnel orientation, acceptance, and training; and both personnel and system reliability.

  13. Comparison of a brain-based adaptive system and a manual adaptable system for invoking automation.

    PubMed

    Bailey, Nathan R; Scerbo, Mark W; Freeman, Frederick G; Mikulka, Peter J; Scott, Lorissa A

    2006-01-01

    Two experiments are presented examining adaptive and adaptable methods for invoking automation. Empirical investigations of adaptive automation have focused on methods used to invoke automation or on automation-related performance implications. However, no research has addressed whether performance benefits associated with brain-based systems exceed those in which users have control over task allocations. Participants performed monitoring and resource management tasks as well as a tracking task that shifted between automatic and manual modes. In the first experiment, participants worked with an adaptive system that used their electroencephalographic signals to switch the tracking task between automatic and manual modes. Participants were also divided between high- and low-reliability conditions for the system-monitoring task as well as high- and low-complacency potential. For the second experiment, participants operated an adaptable system that gave them manual control over task allocations. Results indicated increased situation awareness (SA) of gauge instrument settings for individuals high in complacency potential using the adaptive system. In addition, participants who had control over automation performed more poorly on the resource management task and reported higher levels of workload. A comparison between systems also revealed enhanced SA of gauge instrument settings and decreased workload in the adaptive condition. The present results suggest that brain-based adaptive automation systems may enhance perceptual level SA while reducing mental workload relative to systems requiring user-initiated control. Potential applications include automated systems for which operator monitoring performance and high-workload conditions are of concern.

  14. Hypertext-based design of a user interface for scheduling

    NASA Technical Reports Server (NTRS)

    Woerner, Irene W.; Biefeld, Eric

    1993-01-01

    Operations Mission Planner (OMP) is an ongoing research project at JPL that utilizes AI techniques to create an intelligent, automated planning and scheduling system. The information space reflects the complexity and diversity of tasks necessary in most real-world scheduling problems. Thus the problem of the user interface is to present as much information as possible at a given moment and allow the user to quickly navigate through the various types of displays. This paper describes a design which applies the hypertext model to solve these user interface problems. The general paradigm is to provide maps and search queries to allow the user to quickly find an interesting conflict or problem, and then allow the user to navigate through the displays in a hypertext fashion.

  15. Automated videography for residential communications

    NASA Astrophysics Data System (ADS)

    Kurtz, Andrew F.; Neustaedter, Carman; Blose, Andrew C.

    2010-02-01

    The current widespread use of webcams for personal video communication over the Internet suggests that opportunities exist to develop video communications systems optimized for domestic use. We discuss both prior and existing technologies, and the results of user studies that indicate potential needs and expectations for people relative to personal video communications. In particular, users anticipate an easily used, high image quality video system, which enables multitasking communications during the course of real-world activities and provides appropriate privacy controls. To address these needs, we propose a potential approach premised on automated capture of user activity. We then describe a method that adapts cinematography principles, with a dual-camera videography system, to automatically control image capture relative to user activity, using semantic or activity-based cues to determine user position and motion. In particular, we discuss an approach to automatically manage shot framing, shot selection, and shot transitions, with respect to one or more local users engaged in real-time, unscripted events, while transmitting the resulting video to a remote viewer. The goal is to tightly frame subjects (to provide more detail), while minimizing subject loss and repeated abrupt shot framing changes in the images as perceived by a remote viewer. We also discuss some aspects of the system and related technologies that we have experimented with thus far. In summary, the method enables users to participate in interactive video-mediated communications while engaged in other activities.

  16. Automated Cryocooler Monitor and Control System Software

    NASA Technical Reports Server (NTRS)

    Britchcliffe, Michael J.; Conroy, Bruce L.; Anderson, Paul E.; Wilson, Ahmad

    2011-01-01

    This software is used in an automated cryogenic control system developed to monitor and control the operation of small-scale cryocoolers. The system was designed to automate the cryogenically cooled low-noise amplifier system described in "Automated Cryocooler Monitor and Control System" (NPO-47246), NASA Tech Briefs, Vol. 35, No. 5 (May 2011), page 7a. The software contains algorithms necessary to convert non-linear output voltages from the cryogenic diode-type thermometers and vacuum pressure and helium pressure sensors, to temperature and pressure units. The control function algorithms use the monitor data to control the cooler power, vacuum solenoid, vacuum pump, and electrical warm-up heaters. The control algorithms are based on a rule-based system that activates the required device based on the operating mode. The external interface is Web-based. It acts as a Web server, providing pages for monitor, control, and configuration. No client software from the external user is required.
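
    A minimal sketch, assuming an invented calibration table and rule set, of the two functions the abstract describes: converting a non-linear diode-thermometer voltage to temperature, and choosing device states from the operating mode. This is not the NPO-47246 software, only an illustration of the pattern:

```python
# Sketch of (1) converting a diode-thermometer voltage to temperature by
# interpolating a calibration table, and (2) a rule-based step that sets
# device states for a given operating mode. Calibration points,
# thresholds, and device names are assumptions.
import bisect

# (voltage, kelvin) pairs, voltage increasing
CAL = [(0.50, 300.0), (1.00, 77.0), (1.20, 20.0), (1.60, 4.2)]

def diode_to_kelvin(v: float) -> float:
    volts = [p[0] for p in CAL]
    i = min(max(bisect.bisect_left(volts, v), 1), len(CAL) - 1)
    (v0, t0), (v1, t1) = CAL[i - 1], CAL[i]
    return t0 + (t1 - t0) * (v - v0) / (v1 - v0)  # linear interpolation

def control_step(mode: str, temp_k: float, vacuum_torr: float) -> dict:
    """Very small rule base: which devices to energize for a given mode."""
    if mode == "cooldown":
        return {"vacuum_pump": vacuum_torr > 1e-4, "cooler_power": True, "heater": False}
    if mode == "warmup":
        return {"vacuum_pump": False, "cooler_power": False, "heater": temp_k < 290.0}
    return {"vacuum_pump": False, "cooler_power": False, "heater": False}

print(round(diode_to_kelvin(1.05), 1), control_step("cooldown", 12.0, 5e-3))
```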

  17. Automation bias: a systematic review of frequency, effect mediators, and mitigators

    PubMed Central

    Roudsari, Abdul; Wyatt, Jeremy C

    2011-01-01

    Automation bias (AB)—the tendency to over-rely on automation—has been studied in various academic fields. Clinical decision support systems (CDSS) aim to benefit the clinical decision-making process. Although most research shows overall improved performance with use, there is often a failure to recognize the new errors that CDSS can introduce. With a focus on healthcare, a systematic review of the literature from a variety of research fields has been carried out, assessing the frequency and severity of AB, the effect mediators, and interventions potentially mitigating this effect. This is discussed alongside automation-induced complacency, or insufficient monitoring of automation output. A mix of subject specific and freetext terms around the themes of automation, human–automation interaction, and task performance and error were used to search article databases. Of 13 821 retrieved papers, 74 met the inclusion criteria. User factors such as cognitive style, decision support systems (DSS), and task specific experience mediated AB, as did attitudinal driving factors such as trust and confidence. Environmental mediators included workload, task complexity, and time constraint, which pressurized cognitive resources. Mitigators of AB included implementation factors such as training and emphasizing user accountability, and DSS design factors such as the position of advice on the screen, updated confidence levels attached to DSS output, and the provision of information versus recommendation. By uncovering the mechanisms by which AB operates, this review aims to help optimize the clinical decision-making process for CDSS developers and healthcare practitioners. PMID:21685142

  18. Grasp specific and user friendly interface design for myoelectric hand prostheses.

    PubMed

    Mohammadi, Alireza; Lavranos, Jim; Howe, Rob; Choong, Peter; Oetomo, Denny

    2017-07-01

    This paper presents the design and characterisation of a hand prosthesis and its user interface, focusing on performing the most commonly used grasps in activities of daily living (ADLs). Since the operation of a multi-articulated powered hand prosthesis is difficult to learn and master, there is a significant rate of abandonment by amputees in preference for simpler devices. In doing so, amputees choose to live with fewer features in a prosthesis that more reliably performs the basic operations. In this paper, we look simultaneously at a hand prosthesis design method that aims for a small number of grasps, a low-complexity user interface, and an alternative to the current use of EMG for preshape selection in the form of a simple button, to enable amputees to reach and execute the intended hand movements intuitively, quickly and reliably. An experiment is reported at the end of the paper comparing the speed and accuracy with which able-bodied naive subjects are able to select the intended preshapes through a simplified EMG method and a simple button. It is shown that the button was significantly superior in the speed of successful task completion and marginally superior in accuracy (success on first attempt).

  19. Social Circles: A 3D User Interface for Facebook

    NASA Astrophysics Data System (ADS)

    Rodrigues, Diego; Oakley, Ian

    Online social network services are increasingly popular web applications which display large amounts of rich multimedia content: contacts, status updates, photos and event information. Arguing that this quantity of information overwhelms conventional user interfaces, this paper presents Social Circles, a rich interactive visualization designed to support real world users of social network services in everyday tasks such as keeping up with friends and organizing their network. It achieves this by using 3D UIs, fluid animations and a spatial metaphor to enable direct manipulation of a social network.

  20. Improving treatment plan evaluation with automation

    PubMed Central

    Covington, Elizabeth L.; Chen, Xiaoping; Younge, Kelly C.; Lee, Choonik; Matuszak, Martha M.; Kessler, Marc L.; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M.; Filpansick, Stephanie E.

    2016-01-01

    The goal of this work is to evaluate the effectiveness of Plan‐Checker Tool (PCT) which was created to improve first‐time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted with PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. PACS number(s): 87.55.‐x, 87.55.N‐, 87.55.Qr, 87.55.tm, 89.20.Bb PMID:27929478
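
    As a hedged illustration of the kind of automated check described above, the sketch below compares a few plan fields exported from the TPS against the TMS; the field names, tolerance, and dictionaries are hypothetical stand-ins for the application programming interface queries a real tool would make:

```python
# Illustrative automated plan check: compare fields between the
# treatment planning system (TPS) export and the treatment management
# system (TMS). Field names, values, and the dose tolerance are invented.
def check_plan(tps: dict, tms: dict, dose_tol: float = 0.1):
    failures = []
    for field in ("patient_id", "plan_name", "fractions", "energy_mv"):
        if tps.get(field) != tms.get(field):
            failures.append(f"{field}: TPS={tps.get(field)!r} vs TMS={tms.get(field)!r}")
    if abs(tps["dose_per_fraction_gy"] - tms["dose_per_fraction_gy"]) > dose_tol:
        failures.append("dose_per_fraction_gy outside tolerance")
    return failures

tps = {"patient_id": "12345", "plan_name": "LUNG_SBRT", "fractions": 5,
       "energy_mv": 6, "dose_per_fraction_gy": 10.0}
tms = {"patient_id": "12345", "plan_name": "LUNG_SBRT", "fractions": 5,
       "energy_mv": 6, "dose_per_fraction_gy": 10.0}
print(check_plan(tps, tms) or "all automated checks passed")
```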

  1. "Friends" of Anglican Cathedrals: Norms and Values. Befriending, Friending or Misnomer?

    ERIC Educational Resources Information Center

    Muskett, Judith A.

    2013-01-01

    Loyal supporters of Anglican cathedrals first subscribed to "Friends" associations in the late 1920s. Yet, in 1937, a journalist in "The Times" portrayed cathedrals as a "queer thing to be a friend of." Drawing on theories of friendship from a range of disciplines, and surveys of what has been proclaimed in the public…

  2. Combined process automation for large-scale EEG analysis.

    PubMed

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
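
    Steps (2) and (5) of the automated pipeline, generating a user-defined band-frequency waveform and computing a power spectral density, have standard implementations; a minimal SciPy sketch is shown below (the sampling rate, band edges, and synthetic signal are assumptions, not the authors' parameters):

```python
# Sketch of two of the automated steps described above: a Butterworth
# band-pass "band frequency waveform" and a power spectral density.
# The 1 kHz sampling rate, 4-12 Hz band, and synthetic trace are assumed.
import numpy as np
from scipy.signal import butter, filtfilt, welch

fs = 1000.0                         # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)        # 10 s of synthetic "EEG"
eeg = np.sin(2 * np.pi * 8 * t) + 0.5 * np.random.randn(t.size)

# (2) band-limited waveform for a band of interest
b, a = butter(4, [4.0, 12.0], btype="bandpass", fs=fs)
band = filtfilt(b, a, eeg)

# (5) power spectral density of the filtered trace
freqs, psd = welch(band, fs=fs, nperseg=2048)
print("peak power near", freqs[np.argmax(psd)], "Hz")
```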

  3. Developing an automated water emitting-sensing system, based on integral tensiometers placed in homogenous environment.

    NASA Astrophysics Data System (ADS)

    Dabach, Sharon; Shani, Uri

    2010-05-01

    -Tensiometers. In treatment A, the system discharge changed according to the plant water demand. The discharge changes followed the water uptake changes during the day and over the entire growth period without any user interference. The integration of the Geo-Tensiometer into the emitter system, together with the irrigation regime, maintained high and constant water content in the root zone in comparison to other irrigation methods, such as daily drip irrigation. Reading the matric potential in this medium yielded a better indication of water availability to the plants than sensors placed 3 cm from the emitters. In addition, the amount of water drainage below the root zone decreased significantly, and with it the threat of polluting groundwater. Furthermore, the automated flushing system eliminated the need for manual maintenance of the tensiometers, creating a user-friendly system.

  4. Automated Tutoring in Interactive Environments: A Task-Centered Approach.

    ERIC Educational Resources Information Center

    Wolz, Ursula; And Others

    1989-01-01

    Discusses tutoring and consulting functions in interactive computer environments. Tutoring strategies are considered, the expert model and the user model are described, and GENIE (Generated Informative Explanations)--an answer generating system for the Berkeley Unix Mail system--is explained as an example of an automated consulting system. (33…

  5. Norplant: users' perspective in Pakistan.

    PubMed

    Rehan, N; Inayatullah, A; Chaudhary, I

    1999-01-01

    Five hundred and eighteen Norplant acceptors (260 ever-users and 258 current users) were interviewed to assess their perceptions about Norplant. The mean age of the acceptors was 32.6+/-5.7 years (mean +/- SD). The mean parity was 4.3 and many of the users (40.2%) were illiterate. The most common reason to choose Norplant was its long duration of action (70.1%) followed by doctor's advice (10.4%) and use by other women (10.1%). Norplant was recommended by family planning workers in 35.3% cases, doctors in 29.2% cases and friends in 17.4% cases. Advertisement did not play any role in the women's choice of Norplant. In 77.3% cases, the decision to use Norplant was a joint decision. Only 15% of the users had fears/anxieties before insertion. Most of these women (44%) were concerned about possible ill-effects of Norplant on their health rather than efficacy. The social acceptance of Norplant was very high (76%) and more than half of the users (52.5%) were satisfied with the method. Among current users, 83.9% wanted to continue Norplant for 5 years. Only 39 users (15.1%) intended to discontinue. The main reason for discontinuation was menstrual disturbance (69.2%), followed by weight gain (12.7%). The study suggests that long duration of effective action and high social acceptance are likely to make Norplant a popular method among Pakistani women.

  6. Automation in the graphic arts

    NASA Astrophysics Data System (ADS)

    Truszkowski, Walt

    1995-04-01

    The CHIMES (Computer-Human Interaction Models) tool was designed to help solve a simply stated but important problem, i.e., the problem of generating a user interface to a system that complies with established human factors standards and guidelines. Though designed for use in a fairly restricted user domain, i.e., spacecraft mission operations, the CHIMES system is essentially domain independent and applicable wherever graphical user interfaces or displays are encountered. The CHIMES philosophy and operating strategy are quite simple. Instead of requiring a human designer to actively maintain in his or her head the now encyclopedic knowledge that human factors and user interface specialists have evolved, CHIMES incorporates this information in its knowledge bases. When directed to evaluate a design, CHIMES determines and accesses the appropriate knowledge, performs an evaluation of the design against that information, determines whether the design is compliant with the selected guidelines, and suggests corrective actions if deviations from the guidelines are discovered. This paper will provide an overview of the capabilities of the current CHIMES tool and discuss the potential integration of CHIMES-like technology in automated graphic arts systems.
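
    A toy sketch of the evaluate-against-guidelines idea (the rules, thresholds, and design fields below are invented; the actual CHIMES knowledge bases encode far richer human factors guidance):

```python
# Toy illustration of checking a display design against a small
# "knowledge base" of guidelines and suggesting corrective actions.
# Guideline thresholds and design fields are assumptions.
GUIDELINES = [
    ("font_pt", lambda d: d["font_pt"] >= 10,
     "increase font size to at least 10 pt"),
    ("colors", lambda d: len(d["colors"]) <= 7,
     "limit the palette to 7 or fewer colors"),
    ("abbreviations", lambda d: not d["unexplained_abbreviations"],
     "expand or footnote all abbreviations"),
]

def evaluate_design(design: dict):
    return [fix for name, ok, fix in GUIDELINES if not ok(design)]

design = {"font_pt": 8, "colors": ["red", "green", "blue"],
          "unexplained_abbreviations": True}
for suggestion in evaluate_design(design):
    print("deviation:", suggestion)
```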

  7. An ultraviolet-visible spectrophotometer automation system. Part 3: Program documentation

    NASA Astrophysics Data System (ADS)

    Roth, G. S.; Teuschler, J. M.; Budde, W. L.

    1982-07-01

    The Ultraviolet-Visible Spectrophotometer (UVVIS) automation system accomplishes 'on-line' spectrophotometric quality assurance determinations, report generation, plot generation, and data reduction for chlorophyll or color analysis. This system also has the capability to process manually entered data for the analysis of chlorophyll or color. For each program of the UVVIS system, this document contains a program description, flowchart, variable dictionary, code listing, and symbol cross-reference table. Also included are descriptions of file structures and of routines common to all automated analyses. The programs are written in Data General extended BASIC, Revision 4.3, under the RDOS operating system, Revision 6.2. The BASIC code has been enhanced for real-time data acquisition, which is accomplished by CALLs to assembly language subroutines. Two other related publications are 'An Ultraviolet-Visible Spectrophotometer Automation System - Part I: Functional Specifications' and 'An Ultraviolet-Visible Spectrophotometer Automation System - Part II: User's Guide.'

  8. Gromita: a fully integrated graphical user interface to gromacs 4.

    PubMed

    Sellis, Diamantis; Vlachakis, Dimitrios; Vlassi, Metaxia

    2009-09-07

    Gromita is a fully integrated and efficient graphical user interface (GUI) to the recently updated molecular dynamics suite Gromacs, version 4. Gromita is a cross-platform, perl/tcl-tk based, interactive front end designed to break the command line barrier and introduce a new user-friendly environment to run molecular dynamics simulations through Gromacs. Our GUI features a novel workflow interface that guides the user through each logical step of the molecular dynamics setup process, making it accessible to both advanced and novice users. This tool provides a seamless interface to the Gromacs package, while providing enhanced functionality by speeding up and simplifying the task of setting up molecular dynamics simulations of biological systems. Gromita can be freely downloaded from http://bio.demokritos.gr/gromita/.

  9. Earthquake Early Warning Beta Users: Java, Modeling, and Mobile Apps

    NASA Astrophysics Data System (ADS)

    Strauss, J. A.; Vinci, M.; Steele, W. P.; Allen, R. M.; Hellweg, M.

    2014-12-01

    Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds of warning prior to ground shaking at a user's location. The goal and purpose of such a system is to reduce, or minimize, the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, University of Washington, the USGS, and beta users in California and the Pacific Northwest. The beta users receive earthquake information very rapidly in real time and are providing feedback on their experiences of performance and potential uses within their organizations. Beta user interactions allow the ShakeAlert team to discern which alert delivery options are most effective, what changes would make the UserDisplay more useful in a pre-disaster situation, and, most importantly, what actions users plan to take for various scenarios. Actions could include personal safety approaches, such as drop, cover, and hold on; automated processes and procedures, such as opening elevator or fire station doors; or situational awareness. Users are beginning to determine which policy and technological changes may need to be enacted, as well as the funding required to implement their automated controls. The use of models and mobile apps is beginning to augment the basic Java desktop applet. Modeling allows beta users to test their early warning responses against various scenarios without having to wait for a real event. Mobile apps are also changing the possible response landscape, providing other avenues for people to receive information. All of these combine to improve business continuity and resiliency.

  10. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowell, Michael W

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase the reliability and scope of verification compared to 'hand' comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory's High Flux Isotope Reactor (HFIR).
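
    The general pattern, run benchmark cases and compare results to stored references within a tolerance, is easy to sketch; the example below uses hypothetical reference values and a stubbed run_case, and stands in for the LiveLink for MATLAB scripting used in the actual work:

```python
# Generic installation-verification sketch: run benchmark cases and
# compare the results to stored reference values within a tolerance.
# run_case() is a placeholder for driving the actual solver.
REFERENCE = {"steady_state_temp_K": 352.4, "peak_flux": 1.27e14}
TOLERANCE = 1e-3   # relative tolerance (assumed)

def run_case(name: str) -> float:
    # Placeholder: a real harness would launch the solver here.
    return {"steady_state_temp_K": 352.41, "peak_flux": 1.2701e14}[name]

def verify_installation():
    report = {}
    for case, expected in REFERENCE.items():
        actual = run_case(case)
        rel_err = abs(actual - expected) / abs(expected)
        report[case] = ("PASS" if rel_err <= TOLERANCE else "FAIL", rel_err)
    return report

for case, (status, err) in verify_installation().items():
    print(f"{case}: {status} (relative error {err:.2e})")
```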

  11. Software for Partly Automated Recognition of Targets

    NASA Technical Reports Server (NTRS)

    Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark; Selinsky, T.

    2002-01-01

    The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.

  12. Software for Partly Automated Recognition of Targets

    NASA Technical Reports Server (NTRS)

    Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark

    2003-01-01

    The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.

  13. Pathview Web: user friendly pathway visualization and data integration

    PubMed Central

    Pant, Gaurav; Bhavnasi, Yeshvant K.; Blanchard, Steven G.; Brouwer, Cory

    2017-01-01

    Abstract Pathway analysis is widely used in omics studies. Pathway-based data integration and visualization is a critical component of the analysis. To address this need, we recently developed a novel R package called Pathview. Pathview maps, integrates and renders a large variety of biological data onto molecular pathway graphs. Here we developed the Pathview Web server to make pathway visualization and data integration accessible to all scientists, including those without special computing skills or resources. Pathview Web features an intuitive graphical web interface and a user-centered design. The server not only expands the core functions of Pathview, but also provides many useful features not available in the offline R package. Importantly, the server presents a comprehensive workflow for both regular and integrated pathway analysis of multiple omics data. In addition, the server also provides a RESTful API for programmatic access and convenient integration into third-party software or workflows. Pathview Web is openly and freely accessible at https://pathview.uncc.edu/. PMID:28482075
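
    As a purely hypothetical example of programmatic access of the kind the abstract mentions, a REST call from Python could look like the following; the endpoint path, parameter names, and response fields are invented for illustration and should be checked against the real Pathview Web API documentation:

```python
# Hypothetical example of calling a pathway-rendering REST service from
# Python with the `requests` library. The URL path, parameters, and
# response format are invented for illustration only.
import requests

payload = {
    "species": "hsa",                           # KEGG species code
    "pathway_id": "hsa04110",                   # cell cycle pathway (example)
    "gene_data": {"7157": 1.8, "1026": -2.1},   # Entrez ID -> log2 fold change
}

resp = requests.post("https://pathview.uncc.edu/api/analysis",  # hypothetical endpoint
                     json=payload, timeout=60)
resp.raise_for_status()
print("rendered graph available at:", resp.json().get("result_url"))
```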

  14. Analysis of Delays in Transmitting Time Code Using an Automated Computer Time Distribution System

    DTIC Science & Technology

    1999-12-01

    jlevine@clock.bldrdoc.gov Abstract An automated computer time distribution system broadcasts standard time to users using computers and modems via... contributed to delays - software platform (50% of the delay), transmission speed of time-codes (25%), telephone network (15%), modem and others (10%). The... modems, and telephone lines. Users dial the ACTS server to receive time traceable to the national time scale of Singapore, UTC(PSB). The users can in

  15. Automation Applications in an Advanced Air Traffic Management System : Volume 5A. DELTA Simulation Model - User's Guide

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...

  16. Automated visual imaging interface for the plant floor

    NASA Astrophysics Data System (ADS)

    Wutke, John R.

    1991-03-01

    The paper will provide an overview of the challenges facing a user of automated visual imaging ("AVI") machines and the philosophies that should be employed in designing them. As manufacturing tools and equipment become more sophisticated, it is increasingly difficult to maintain an efficient interaction between the operator and the machine. The typical user of an AVI machine in a production environment is technically unsophisticated. Also, operator and machine ergonomics are often a neglected or poorly addressed part of an efficient manufacturing process. This paper presents a number of man-machine interface design techniques and philosophies that effectively solve these problems.

  17. ESPVI 4.0 ELECTROSTATIC PRECIPITATOR V-I AND PERFORMANCE MODEL: USER'S MANUAL

    EPA Science Inventory

    The manual is the companion document for the microcomputer program ESPVI 4.0, Electrostatic Precipitation VI and Performance Model. The program was developed to provide a user- friendly interface to an advanced model of electrostatic precipitation (ESP) performance. The program i...

  18. Securing the User's Work Environment

    NASA Technical Reports Server (NTRS)

    Cardo, Nicholas P.

    2004-01-01

    High performance computing at the Numerical Aerospace Simulation Facility at NASA Ames Research Center includes C90's, J90's and Origin 2000's. Not only is it necessary to protect these systems from outside attacks, but also to provide a safe working environment on the systems. With the right tools, security anomalies in the user's work environment can be detected and corrected. Validating proper ownership of files against users' permissions will reduce the risk of inadvertent data compromise. The detection of extraneous directories and files hidden amongst user home directories is important for identifying potential compromises. The first runs of these utilities detected over 350,000 files with problems. With periodic scans, automated correction of problems takes only minutes. Tools for detecting these types of problems, as well as their development techniques, will be discussed with emphasis on consistency, portability, and efficiency for both UNICOS and IRIX.
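
    A small Python sketch of the sort of scan described, walking home directories and flagging files with unexpected ownership or world-writable permissions; the /home layout and the specific checks are assumptions (the original utilities targeted UNICOS and IRIX):

```python
# Illustrative scan of user home directories for ownership/permission
# anomalies: files not owned by the directory's owner, and files that
# are world-writable. Paths and checks are assumptions.
import os
import stat

def scan_home(home_root: str = "/home"):
    findings = []
    for user in os.listdir(home_root):
        user_dir = os.path.join(home_root, user)
        if not os.path.isdir(user_dir):
            continue
        owner_uid = os.stat(user_dir).st_uid
        for dirpath, _dirs, files in os.walk(user_dir):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    st = os.stat(path)
                except OSError:
                    continue  # skip files we cannot stat
                if st.st_uid != owner_uid:
                    findings.append((path, "unexpected owner"))
                if st.st_mode & stat.S_IWOTH:
                    findings.append((path, "world-writable"))
    return findings

for path, problem in scan_home():
    print(problem, path)
```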

  19. "Am I a Good Reader?" a Friendly Way to Evaluate Students' Perceptions of Themselves as a Reader

    ERIC Educational Resources Information Center

    Wangsgard, Nichole

    2014-01-01

    Current reading assessments place emphasis on the five components of reading competency, neglecting the self-efficacy needs of students. When students struggle with one or all five components of reading competency, they can have negative feelings about themselves as readers. This article offers teachers an effective, user-friendly assessment…

  20. Program for User-Friendly Management of Input and Output Data Sets

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard

    2003-01-01

    A computer program manages large, hierarchical sets of input and output (I/O) parameters (typically, sequences of alphanumeric data) involved in computational simulations in a variety of technological disciplines. This program represents sets of parameters as structures coded in object-oriented but otherwise standard American National Standards Institute C language. Each structure contains a group of I/O parameters that make sense as a unit in the simulation program with which this program is used. The addition of options and/or elements to sets of parameters amounts to the addition of new elements to data structures. By association of child data generated in response to a particular user input, a hierarchical ordering of input parameters can be achieved. Associated with child data structures are the creation and description mechanisms within the parent data structures. Child data structures can spawn further child data structures. In this program, the creation and representation of a sequence of data structures is effected by one line of code that looks for children of a sequence of structures until there are no more children to be found. A linked list of structures is created dynamically and is completely represented in the data structures themselves. Such hierarchical data presentation can guide users through otherwise complex setup procedures and it can be integrated within a variety of graphical representations.
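
    A compact Python analogue of the hierarchical parameter idea described above, in which a parent group spawns child groups and the whole tree is walked as a linked hierarchy; the original program uses ANSI C structures, and the class and field names here are invented:

```python
# Python analogue of the hierarchical I/O-parameter idea: each parameter
# group can spawn child groups in response to user input, and the whole
# tree is traversed recursively. Names are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ParamGroup:
    name: str
    params: dict = field(default_factory=dict)
    children: List["ParamGroup"] = field(default_factory=list)

    def spawn_child(self, name: str, **params) -> "ParamGroup":
        child = ParamGroup(name, dict(params))
        self.children.append(child)
        return child

    def walk(self, depth: int = 0):
        yield depth, self
        for child in self.children:
            yield from child.walk(depth + 1)

root = ParamGroup("simulation", {"solver": "implicit"})
mesh = root.spawn_child("mesh", nodes=10000)
mesh.spawn_child("refinement", level=2)        # children can spawn children
for depth, group in root.walk():
    print("  " * depth + f"{group.name}: {group.params}")
```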