Sample records for user requirements analysis

  1. Application of Cognitive Task Analysis in User Requirements and Prototype Design Presentation/Briefing

    DTIC Science & Technology

    2005-10-01

    AFRL-HE-WP-TP-2005-0030, Air Force Research Laboratory; contract number FA8650-04-C-6406. Application of Cognitive Task Analysis in User Requirements Definition and Prototype Design, presented by Christopher Curtis.

  2. Course Ontology-Based User's Knowledge Requirement Acquisition from Behaviors within E-Learning Systems

    ERIC Educational Resources Information Center

    Zeng, Qingtian; Zhao, Zhongying; Liang, Yongquan

    2009-01-01

    User's knowledge requirement acquisition and analysis are very important for a personalized or user-adaptive learning system. Two approaches to capture user's knowledge requirement about course content within an e-learning system are proposed and implemented in this paper. The first approach is based on the historical data accumulated by an…

  3. Evolution of user analysis on the grid in ATLAS

    NASA Astrophysics Data System (ADS)

    Dewhurst, A.; Legger, F.; ATLAS Collaboration

    2017-10-01

    More than one thousand physicists analyse data collected by the ATLAS experiment at the Large Hadron Collider (LHC) at CERN through 150 computing facilities around the world. Efficient distributed analysis requires optimal resource usage and the interplay of several factors: robust grid and software infrastructures, and system capability to adapt to different workloads. The continuous automatic validation of grid sites and the user support provided by a dedicated team of expert shifters have proven to yield a solid distributed analysis system for ATLAS users. Typical user workflows on the grid, and their associated metrics, are discussed. Measurements of user job performance and typical requirements are also shown.

  4. CMS Configuration Editor: GUI based application for user analysis job

    NASA Astrophysics Data System (ADS)

    de Cosa, A.

    2011-12-01

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way and integrated within the CMS framework, which organizes user analysis code flexibly. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. Developing analysis jobs and managing the configuration of the many required modules can be a challenging task for users, especially newcomers. For this reason a graphical tool has been conceived to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analysis while visualizing the effects of their actions in real time. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies existing among the modules and check the data flow. They can see the values to which parameters are set and change them according to what their analysis task requires. Integrating the common tools into the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.
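
    The modular, Python-driven configuration pattern this abstract describes can be sketched as follows. This is a minimal illustration under our own naming; the classes, modules, and parameters are invented and are not the actual CMS FWCore/PAT configuration API.

    ```python
    # Illustrative sketch of a modular, Python-driven analysis configuration
    # in the spirit described above. Class and module names are hypothetical;
    # this is NOT the actual CMS FWCore/PAT configuration API.

    class Module:
        """An analysis step with named parameters and declared inputs."""
        def __init__(self, name, depends_on=(), **params):
            self.name = name
            self.depends_on = list(depends_on)
            self.params = params

        def set(self, **params):
            """Change parameter values, as a GUI editor would."""
            self.params.update(params)

    class Workflow:
        """Holds modules and can report structure and dependencies."""
        def __init__(self):
            self.modules = {}

        def add(self, module):
            self.modules[module.name] = module

        def inspect(self):
            for m in self.modules.values():
                deps = ", ".join(m.depends_on) or "none"
                print(f"{m.name}: params={m.params}, depends on: {deps}")

    wf = Workflow()
    wf.add(Module("jetSelector", ptMin=25.0, etaMax=2.4))
    wf.add(Module("jetCleaner", depends_on=["jetSelector"], dR=0.4))
    wf.modules["jetSelector"].set(ptMin=30.0)   # edit a parameter value
    wf.inspect()                                # show structure and data flow
    ```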

  5. Post LANDSAT D Advanced Concept Evaluation (PLACE). [with emphasis on mission planning, technological forecasting, and user requirements]

    NASA Technical Reports Server (NTRS)

    1977-01-01

    An outline is given of the mission objectives and requirements, system elements, system concepts, technology requirements and forecasting, and priority analysis for LANDSAT D. User requirements and mission analysis and technological forecasting are emphasized. Mission areas considered include agriculture, range management, forestry, geology, land use, water resources, environmental quality, and disaster assessment.

  6. Information transfer satellite concept study. Volume 4: computer manual

    NASA Technical Reports Server (NTRS)

    Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.

    1971-01-01

    The Satellite Telecommunications Analysis and Modeling Program (STAMP) provides the user with a flexible and comprehensive tool for the analysis of ITS system requirements. While obtaining minimum cost design points, the program enables the user to perform studies over a wide range of user requirements and parametric demands. The program utilizes a total system approach wherein the ground uplink and downlink, the spacecraft, and the launch vehicle are simultaneously synthesized. A steepest descent algorithm is employed to determine the minimum total system cost design subject to the fixed user requirements and imposed constraints. In the process of converging to the solution, the pertinent subsystem tradeoffs are resolved. This report documents STAMP through a technical analysis and a description of the principal techniques employed in the program.
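
    The cost-minimization approach attributed to STAMP (steepest descent toward a minimum total system cost, subject to fixed user requirements and imposed constraints) can be sketched as projected gradient descent. The cost model and numbers below are invented for illustration; only the technique follows the abstract.

    ```python
    import numpy as np

    # Minimal steepest-descent sketch in the spirit of STAMP: minimize a
    # total-system cost over design variables, with requirements expressed
    # as simple bound constraints. The cost model is invented.

    def total_cost(x):
        antenna_d, tx_power = x                        # design variables
        ground = 2.0 * antenna_d**2                    # ground segment cost
        spacecraft = 5.0 / antenna_d + 0.8 * tx_power  # spacecraft cost
        launch = 0.3 * (antenna_d + 0.1 * tx_power)**2 # launch vehicle cost
        return ground + spacecraft + launch

    def grad(f, x, h=1e-6):
        """Numerical gradient by central differences."""
        g = np.zeros_like(x)
        for i in range(len(x)):
            e = np.zeros_like(x); e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2 * h)
        return g

    lo, hi = np.array([0.5, 1.0]), np.array([5.0, 50.0])  # constraints
    x = np.array([3.0, 20.0])                             # initial design
    for _ in range(500):
        x = np.clip(x - 0.01 * grad(total_cost, x), lo, hi)  # project on bounds

    print(f"design point: {x}, total cost: {total_cost(x):.3f}")
    ```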

  7. Decision Analysis for Remediation Technologies (DART) user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sebo, D.

    1997-09-01

    This user's manual is an introduction to the use of the Decision Analysis for Remediation Technology (DART) Report Generator. DART provides a user interface to a database containing site data (e.g., contaminants, waste depth, area) for sites within the Subsurface Contaminant Focus Area (SCFA). The database also contains SCFA requirements, needs, and technology information. The manual is arranged in two major sections. The first section describes loading DART onto a user system. The second section describes DART operation, organized into sections by the user interface forms. For each form, user input (both optional and required), DART capabilities, and the results of user selections are covered in sufficient detail to enable the user to understand DART capabilities and determine how to use DART to meet specific needs.

  8. Digital Avionics Information System (DAIS): Training Requirements Analysis Model Users Guide. Final Report.

    ERIC Educational Resources Information Center

    Czuchry, Andrew J.; And Others

    This user's guide describes the functions, logical operations and subroutines, input data requirements, and available outputs of the Training Requirements Analysis Model (TRAMOD), a computerized analytical life cycle cost modeling system for use in the early stages of system design. Operable in a stand-alone mode, TRAMOD can be used for the…

  9. Development of a task analysis tool to facilitate user interface design

    NASA Technical Reports Server (NTRS)

    Scholtz, Jean C.

    1992-01-01

    A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

  10. National Land Imaging Requirements (NLIR) Pilot Project summary report: summary of moderate resolution imaging user requirements

    USGS Publications Warehouse

    Vadnais, Carolyn; Stensaas, Gregory

    2014-01-01

    Under the National Land Imaging Requirements (NLIR) Project, the U.S. Geological Survey (USGS) is developing a functional capability to obtain, characterize, manage, maintain and prioritize all Earth observing (EO) land remote sensing user requirements. The goal is a better understanding of community needs that can be supported with land remote sensing resources, and a means to match needs with appropriate solutions in an effective and efficient way. The NLIR Project is composed of two components. The first component is focused on the development of the Earth Observation Requirements Evaluation System (EORES) to capture, store and analyze user requirements, whereas the second component is the mechanism and processes to elicit and document the user requirements that will populate the EORES. To develop the second component, the requirements elicitation methodology was exercised and refined through a pilot project conducted from June to September 2013. The pilot project focused specifically on applications and user requirements for moderate resolution imagery (5–120 meter resolution) as the test case for requirements development. The purpose of this summary report is to provide a high-level overview of the requirements elicitation process that was exercised through the pilot project and an early analysis of the moderate resolution imaging user requirements acquired to date to support ongoing USGS sustainable land imaging study needs. The pilot project engaged a limited set of Federal Government users from the operational and research communities and therefore the information captured represents only a subset of all land imaging user requirements. However, based on a comparison of results, trends, and analysis, the pilot captured a strong baseline of typical applications areas and user needs for moderate resolution imagery. Because these results are preliminary and represent only a sample of users and application areas, the information from this report should only be used to indicate general user needs for the applications covered. Users of the information are cautioned that use of specific numeric results may be inappropriate without additional research. Any information used or cited from this report should specifically be cited as preliminary findings.

  11. Development of user customized smart keyboard using Smart Product Design-Finite Element Analysis Process in the Internet of Things.

    PubMed

    Kim, Jung Woo; Sul, Sang Hun; Choi, Jae Boong

    2018-06-07

    In a hyper-connected society and IoT environment, markets are changing rapidly as smartphones penetrate the global market. As smartphones are applied to various digital media, development of novel smart products is required. In this paper, a Smart Product Design-Finite Element Analysis Process (SPD-FEAP) is developed to adopt fast-changing trends and user requirements that can be visually verified. The user requirements are derived and quantitatively evaluated from Smart Quality Function Deployment (SQFD) using WebData. Then the usage scenarios are created according to the priority of the functions derived from SQFD. 3D shape analysis by Finite Element Analysis (FEA) was conducted and the result printed out through Rapid Prototyping (RP) technology to identify any possible errors. Thus, a User Customized Smart Keyboard has been developed using SPD-FEAP. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
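
    The SQFD step described (quantitatively evaluating user requirements, then prioritizing functions) follows the usual QFD relationship-matrix arithmetic. A minimal sketch with hypothetical requirements, functions, weights, and relationship strengths:

    ```python
    import numpy as np

    # QFD-style prioritization sketch: user-requirement weights (e.g. mined
    # from web data) are propagated through a requirement-vs-function
    # relationship matrix to rank functions. All numbers are hypothetical.

    requirements = ["portability", "typing comfort", "connectivity"]
    req_weight = np.array([0.5, 0.3, 0.2])       # relative importance

    functions = ["foldable body", "key travel", "Bluetooth link"]
    # rows: requirements; cols: functions; entries: strength (0/1/3/9 scale)
    relation = np.array([
        [9, 1, 3],
        [1, 9, 0],
        [3, 0, 9],
    ])

    priority = req_weight @ relation             # weighted function scores
    for name, score in sorted(zip(functions, priority), key=lambda p: -p[1]):
        print(f"{name}: {score:.2f}")
    ```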

  12. Gamified Cognitive Control Training for Remitted Depressed Individuals: User Requirements Analysis

    PubMed Central

    Vervaeke, Jasmien; Van Looy, Jan; Hoorelbeke, Kristof; Baeken, Chris; Koster, Ernst HW

    2018-01-01

    Background The high incidence and relapse rates of major depressive disorder demand novel treatment options. Standard treatments (psychotherapy, medication) usually do not target cognitive control impairments, although these seem to play a crucial role in achieving stable remission. The urgent need for treatment combined with poor availability of adequate psychological interventions has instigated a shift toward internet interventions. Numerous computerized programs have been developed that can be presented online and offline. However, their uptake and adherence are oftentimes low. Objective The aim of this study was to perform a user requirements analysis for an internet-based training targeting cognitive control. This training focuses on ameliorating cognitive control impairments, as these are still present during remission and can be a risk factor for relapse. To facilitate uptake of and adherence to this intervention, a qualitative user requirements analysis was conducted to map mandatory and desirable requirements. Methods We conducted a user requirements analysis through a focus group with 5 remitted depressed individuals and individual interviews with 6 mental health care professionals. All qualitative data were transcribed and examined using a thematic analytic approach. Results Results showed mandatory requirements for the remitted sample in terms of training configuration, technological and personal factors, and desirable requirements regarding knowledge and enjoyment. Furthermore, knowledge and therapeutic benefits were key requirements for therapists. Conclusions The identified requirements provide useful information to be integrated in interventions targeting cognitive control in depression. PMID:29622525

  13. Satellite on-board processing for earth resources data

    NASA Technical Reports Server (NTRS)

    Bodenheimer, R. E.; Gonzalez, R. C.; Gupta, J. N.; Hwang, K.; Rochelle, R. W.; Wilson, J. B.; Wintz, P. A.

    1975-01-01

    Results of a survey of earth resources user applications and their data requirements, earth resources multispectral scanner sensor technology, and preprocessing algorithms for correcting the sensor outputs and for data bulk reduction are presented along with a candidate data format. The computational requirements for implementing the data analysis algorithms are included along with a review of computer architectures and organizations. Computer architectures capable of handling the algorithm computational requirements are suggested, and the environmental effects of an on-board processor are discussed. By relating performance parameters to the system requirements of each user, the feasibility of on-board processing is determined for each user. A tradeoff analysis is performed to determine the sensitivity of results to each of the system parameters. Significant results and conclusions are discussed, and recommendations are presented.

  14. Sustainable Land Imaging User Requirements

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Snyder, G.; Vadnais, C. M.

    2017-12-01

    The US Geological Survey (USGS) Land Remote Sensing Program (LRSP) has collected user requirements from a range of applications to help formulate the Landsat 9 follow-on mission (Landsat 10) through the Requirements, Capabilities and Analysis (RCA) activity. The USGS is working with NASA to develop Landsat 10, which is scheduled to launch in the 2027 timeframe as part of the Sustainable Land Imaging program. User requirements collected through RCA will help inform future Landsat 10 sensor designs and mission characteristics. Current Federal civil community users have provided hundreds of requirements through systematic, in-depth interviews. Academic, State, local, industry, and international Landsat user community input was also incorporated in the process. Emphasis was placed on spatial resolution, temporal revisit, and spectral characteristics, as well as other aspects such as accuracy, continuity, sampling condition, data access and format. We will provide an overview of the Landsat 10 user requirements collection process and summary results of user needs from the broad land imaging community.

  15. Get Your Requirements Straight: Storyboarding Revisited

    NASA Astrophysics Data System (ADS)

    Haesen, Mieke; Luyten, Kris; Coninx, Karin

    Current user-centred software engineering (UCSE) approaches provide many techniques to combine know-how available in multidisciplinary teams. Although the involvement of various disciplines is beneficial for the user experience of the future application, the transition from a user needs analysis to a structured interaction analysis and UI design is not always straightforward. We propose storyboards, enriched by metadata, to specify functional and non-functional requirements. Accompanying tool support should facilitate the creation and use of storyboards. We used a meta-storyboard for the verification of storyboarding approaches.

  16. Mask Analysis Program (MAP) reference manual

    NASA Technical Reports Server (NTRS)

    Mitchell, C. L.

    1976-01-01

    A document intended to serve as a User's Manual and a Programmer's Manual for the Mask Analysis Program is presented. The first portion of the document is devoted to the user. It contains all of the information required to execute MAP. The remainder of the document describes the details of MAP software logic. Although the information in this portion is not required to run the program, it is recommended that every user review it to gain an appreciation for the program functions.

  17. A framework supporting the development of a Grid portal for analysis based on ROI.

    PubMed

    Ichikawa, K; Date, S; Kaishima, T; Shimojo, S

    2005-01-01

    In our research on brain function analysis, users require two different simultaneous types of processing: interactive processing of a specific part of the data and high-performance batch processing of an entire dataset. The difference between these two types of processing is whether or not the analysis is for data in the region of interest (ROI). In this study, we propose a Grid portal that has a mechanism to freely assign computing resources to the users on a Grid environment according to the users' two different types of processing requirements. We constructed a Grid portal which integrates interactive processing and batch processing by the following two mechanisms. First, a job steering mechanism controls job execution based on user-tagged priority among organizations with heterogeneous computing resources; interactive jobs are processed in preference to batch jobs by this mechanism. Second, a priority-based result delivery mechanism administrates a ranking of data significance. The portal ensures the turn-around time of interactive processing through the priority-based job controlling mechanism, and provides the users with quality of service (QoS) for interactive processing. The users can access the analysis results of interactive jobs in preference to the analysis results of batch jobs. The Grid portal has also achieved high-performance computation of MEG analysis with batch processing on the Grid environment. The priority-based job controlling mechanism makes it possible to freely assign computing resources according to the users' requirements. Furthermore, the achievement of high-performance computation contributes greatly to the overall progress of brain science. The portal has thus made it possible for the users to flexibly include large computational power in what they want to analyze.
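
    The job steering mechanism described (interactive ROI jobs dispatched in preference to batch jobs) is, at its core, a priority queue. A minimal sketch, with illustrative job names and a two-level priority scheme of our own choosing:

    ```python
    import heapq
    import itertools

    # Priority-based job steering sketch: interactive (ROI) jobs are
    # dispatched ahead of batch jobs regardless of arrival order.
    # Job names and the two-level scheme are illustrative assumptions.

    INTERACTIVE, BATCH = 0, 1        # lower value = dispatched first
    counter = itertools.count()      # tie-breaker preserving arrival order
    queue = []

    def submit(priority, job):
        heapq.heappush(queue, (priority, next(counter), job))

    submit(BATCH, "MEG full-dataset analysis, run 1")
    submit(INTERACTIVE, "ROI analysis, subject 12")
    submit(BATCH, "MEG full-dataset analysis, run 2")
    submit(INTERACTIVE, "ROI analysis, subject 7")

    while queue:
        _, _, job = heapq.heappop(queue)
        print("dispatch:", job)      # all interactive jobs run before batch
    ```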

  18. Building a Web-based drug ordering system for hospitals: from requirements engineering to prototyping.

    PubMed

    Hübner, U; Klein, F; Hofstetter, J; Kammeyer, G; Seete, H

    2000-01-01

    Web-based drug ordering allows a growing number of hospitals without a pharmacy to communicate seamlessly with their external pharmacy. Business process analysis and object-oriented modelling performed together with the users at a pilot hospital resulted in a comprehensive picture of the user and business requirements for electronic drug ordering. The user requirements were further validated with the help of a software prototype. In order to capture the needs of a large number of users, CAP10, a new method making use of pre-built models, is proposed. Solutions for coping with the technical requirements (interfacing the business software at the pharmacy) and with the legal requirements (signing the orders) are presented.

  19. Unified Framework for Development, Deployment and Robust Testing of Neuroimaging Algorithms

    PubMed Central

    Joshi, Alark; Scheinost, Dustin; Okuda, Hirohito; Belhachemi, Dominique; Murphy, Isabella; Staib, Lawrence H.; Papademetris, Xenophon

    2011-01-01

    Developing both graphical and command-line user interfaces for neuroimaging algorithms requires considerable effort. Neuroimaging algorithms can meet their potential only if they can be easily and frequently used by their intended users. Deployment of a large suite of such algorithms on multiple platforms requires consistency of user interface controls, consistent results across various platforms and thorough testing. We present the design and implementation of a novel object-oriented framework that allows for rapid development of complex image analysis algorithms with many reusable components and the ability to easily add graphical user interface controls. Our framework also allows for simplified yet robust nightly testing of the algorithms to ensure stability and cross platform interoperability. All of the functionality is encapsulated into a software object requiring no separate source code for user interfaces, testing or deployment. This formulation makes our framework ideal for developing novel, stable and easy-to-use algorithms for medical image analysis and computer assisted interventions. The framework has been both deployed at Yale and released for public use in the open source multi-platform image analysis software—BioImage Suite (bioimagesuite.org). PMID:21249532
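
    The core idea, a single self-describing algorithm object from which command-line interfaces, GUI controls, and tests can all be derived, can be sketched as below. The names and the parameter-declaration scheme are hypothetical, not the actual BioImage Suite API.

    ```python
    import argparse

    # Sketch of the "single software object" idea: an algorithm declares its
    # parameters once, and a CLI (and, by the same token, a GUI form or a
    # regression test) is generated from that one declaration.

    class SmoothImage:
        """Example algorithm: parameters declared once, reused everywhere."""
        parameters = {"sigma": (float, 2.0), "iterations": (int, 1)}

        def run(self, data, sigma, iterations):
            for _ in range(iterations):
                data = [x / (1.0 + sigma) for x in data]  # stand-in computation
            return data

        def cli(self, argv=None):
            """Command-line interface generated from the declaration."""
            p = argparse.ArgumentParser(description=self.__doc__)
            for name, (typ, default) in self.parameters.items():
                p.add_argument(f"--{name}", type=typ, default=default)
            return vars(p.parse_args(argv))

    algo = SmoothImage()
    args = algo.cli(["--sigma", "1.5"])       # same object drives CLI and run
    print(algo.run([1.0, 2.0, 3.0], **args))
    ```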

  20. Experiencing the Elicitation of User Requirements and Recording Them in Use Case Diagrams through Role-Play

    ERIC Educational Resources Information Center

    Costain, Gay; McKenna, Brad

    2011-01-01

    This paper describes a role-play exercise used in a second-year tertiary Systems Analysis and Design course, and the quantitative and qualitative analysis of the students' responses to a survey that solicited their perceptions of that role-play experience. The role-play involved students in eliciting user requirements from customers during a Joint…

  1. Gamified Cognitive Control Training for Remitted Depressed Individuals: User Requirements Analysis.

    PubMed

    Vervaeke, Jasmien; Van Looy, Jan; Hoorelbeke, Kristof; Baeken, Chris; Koster, Ernst HW

    2018-04-05

    The high incidence and relapse rates of major depressive disorder demand novel treatment options. Standard treatments (psychotherapy, medication) usually do not target cognitive control impairments, although these seem to play a crucial role in achieving stable remission. The urgent need for treatment combined with poor availability of adequate psychological interventions has instigated a shift toward internet interventions. Numerous computerized programs have been developed that can be presented online and offline. However, their uptake and adherence are oftentimes low. The aim of this study was to perform a user requirements analysis for an internet-based training targeting cognitive control. This training focuses on ameliorating cognitive control impairments, as these are still present during remission and can be a risk factor for relapse. To facilitate uptake of and adherence to this intervention, a qualitative user requirements analysis was conducted to map mandatory and desirable requirements. We conducted a user requirements analysis through a focus group with 5 remitted depressed individuals and individual interviews with 6 mental health care professionals. All qualitative data were transcribed and examined using a thematic analytic approach. Results showed mandatory requirements for the remitted sample in terms of training configuration, technological and personal factors, and desirable requirements regarding knowledge and enjoyment. Furthermore, knowledge and therapeutic benefits were key requirements for therapists. The identified requirements provide useful information to be integrated in interventions targeting cognitive control in depression. ©Jasmien Vervaeke, Jan Van Looy, Kristof Hoorelbeke, Chris Baeken, Ernst HW Koster. Originally published in JMIR Serious Games (http://games.jmir.org), 05.04.2018.

  2. User requirements and user acceptance of current and next-generation satellite mission and sensor complement, oriented toward the monitoring of water resources

    NASA Technical Reports Server (NTRS)

    Castruccio, P. A.; Loats, H. L., Jr.; Fowler, T. R.; Robinson, P.

    1975-01-01

    Principal water resources users were surveyed to determine the applicability of remotely sensed data to their present and future requirements. Analysis of responses was used to assess the levels of adequacy of LANDSAT 1 and 2 in fulfilling hydrological functions, and to derive systems specifications for future water resources-oriented remote sensing satellite systems. The analysis indicates that water resources applications for all but the very large users require: (1) resolutions on the order of 15 meters, (2) a number of radiometric levels of the same order as currently used in LANDSAT 1 (64), (3) a number of spectral bands not in excess of those used in LANDSAT 1, and (4) a repetition frequency on the order of 2 weeks. The users had little feel for the value of new sensors (thermal IR, passive and active microwaves). What is needed in this area is to achieve specific demonstrations of the utility of these sensors and submit the results to the users to elicit their judgement.

  3. Accessibility and Analysis to NASA's New Large Volume Missions

    NASA Astrophysics Data System (ADS)

    Hausman, J.; Gangl, M.; McAuley, J.; Toaz, R., Jr.

    2016-12-01

    Each new satellite mission continues to measure larger volumes of data than the last. This is especially true of the new NASA satellite missions NISAR and SWOT, launching in 2020 and 2021, which will produce petabytes of data a year. A major concern is how users will be able to analyze such volumes. This presentation will show how cloud storage and analysis can help overcome this challenge and accommodate multiple users' needs. While users may only need gigabytes of data for their research, the data center will need to leverage the processing power of the cloud to perform search and subsetting capabilities over the large volume of data. There is also a vast array of user types that require different tools and services to access and analyze the data. Some users need global data to run climate models, while others require small, dynamic regions with lots of analysis and transformations. There will also be a need to generate data with different inputs or correction algorithms that the project may not be able to provide, as those will be very specialized for specific regions or evolve more quickly than the project can reprocess. By having the data and tools side by side, users will be able to access the data they require and analyze it all in one place. By placing data in the cloud, users can analyze the data there, shifting the current "download and analyze" paradigm to "log-in and analyze". The cloud will provide the processing power needed to analyze large volumes of data, subset small regions over large volumes of data, and regenerate/reformat data to the specificity each user requires.

  4. Data System Implications Derived from User Application Requirements for Satellite Data

    NASA Technical Reports Server (NTRS)

    Neiers, J.

    1979-01-01

    An investigation of the data system needs driven by users of space-acquired Earth observation data is documented. Two major categories of users, operational and research, are identified. Limiting data acquisition alleviates some of the delays in processing, thus improving the timeliness of the delivered product. Tradeoffs occur between timeliness and data distribution costs, and between data storage and reprocessing. The complexity of the data system requirements to apply space data to users' needs is such that no single analysis suffices to design and implement the optimum system. A series of iterations is required, with analyses of the salient problems in a general way, followed by a limited implementation of benefit to some users and a continual upgrade in system capacity, functions, and applications served. The resulting most important requirement for the data system is flexibility to accommodate changing requirements as the system is implemented.

  5. Detailed requirements document for the integrated structural analysis system, phase B

    NASA Technical Reports Server (NTRS)

    Rainey, J. A.

    1976-01-01

    The requirements are defined for a software system entitled Integrated Structural Analysis System (ISAS) Phase B, which is being developed to provide the user with a tool by which a complete and detailed analysis of a complex structural system can be performed. This software system will allow for automated interfaces with numerous structural analysis batch programs and for user interaction in the creation, selection, and validation of data. The system will include modifications to the 4 functions developed for ISAS and the development of 25 new functions. The new functions are described.

  6. User-defined Material Model for Thermo-mechanical Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2008-01-01

    Previously a user-defined material model for orthotropic bimodulus materials was developed for linear and nonlinear stress analysis of composite structures using either shell or solid finite elements within a nonlinear finite element analysis tool. Extensions of this user-defined material model to thermo-mechanical progressive failure analysis are described, and the required input data are documented. The extensions include providing for temperature-dependent material properties, archival of the elastic strains, and a thermal strain calculation for materials exhibiting a stress-free temperature.
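
    The thermal strain extension described (temperature-dependent properties, a stress-free temperature, and archival of elastic strains) reduces to a small calculation. A sketch with invented property tables; the secant-CTE form is a common convention and an assumption here, not necessarily the report's exact formulation:

    ```python
    import numpy as np

    # Thermal-strain sketch: temperature-dependent properties interpolated
    # from tables, thermal strain measured from a stress-free temperature
    # T_sf, and the elastic (mechanical) strain recovered for archival.
    # Property values are illustrative only.

    T_table     = np.array([20.0, 100.0, 200.0, 300.0])        # deg C
    alpha_table = np.array([2.0e-6, 2.3e-6, 2.8e-6, 3.5e-6])   # secant CTE, 1/degC

    def thermal_strain(T, T_sf=20.0):
        """eps_th = alpha(T) * (T - T_sf), with alpha interpolated in T."""
        alpha = np.interp(T, T_table, alpha_table)
        return alpha * (T - T_sf)

    total_strain = 1.0e-3               # total strain from the FE solution
    T = 250.0
    eps_th = thermal_strain(T)
    eps_elastic = total_strain - eps_th # elastic strain to be archived
    print(f"thermal strain {eps_th:.3e}, elastic strain {eps_elastic:.3e}")
    ```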

  7. Integrated verification and testing system (IVTS) for HAL/S programs

    NASA Technical Reports Server (NTRS)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  8. Development of Integrated Programs for Aerospace-Vehicle Design (IPAD) - IPAD user requirements

    NASA Technical Reports Server (NTRS)

    Anderton, G. L.

    1979-01-01

    Results of a requirements analysis task for Integrated Programs for Aerospace Vehicle Design (IPAD) are presented. User requirements which, in part, will shape the IPAD system design are given. Requirements considered were: generation, modification, storage, retrieval, communication, reporting, and protection of information. Data manipulation and controls on the system and the information were also considered. Specific needs relative to the product design process are also discussed.

  9. An analysis for high speed propeller-nacelle aerodynamic performance prediction. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.

    1988-01-01

    A user's manual for the computer program developed for the prediction of propeller-nacelle aerodynamic performance reported in "An Analysis for High Speed Propeller-Nacelle Aerodynamic Performance Prediction: Volume 1 -- Theory and Application" is presented. The manual describes the computer program's mode of operation, requirements, input structure, input data requirements, and program output. In addition, it provides the user with documentation of the internal program structure and the software used in the computer program as it relates to the theory presented in Volume 1. Sample input data setups are provided along with selected printouts of the program output for one of the sample setups.

  10. User participation in the development of the human/computer interface for control centers

    NASA Technical Reports Server (NTRS)

    Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert

    1996-01-01

    Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the earth observing system (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.

  11. Analysis of User Need with CD-ROM Databases: A Case Study Based on Work Sampling at One University Library.

    ERIC Educational Resources Information Center

    Wells, Amy Tracy

    Analysis of the needs of users of Compact Disk-Read Only Memory (CD-ROM) was performed at the Tampa campus of the University of South Florida. A review of the literature indicated that problems associated with selecting the appropriate database, searching, and requiring technical assistance were the probable areas of user need. The library has 17…

  12. Technical Requirements Analysis and Control Systems (TRACS) Initial Operating Capability (IOC) documentation

    NASA Technical Reports Server (NTRS)

    Hammond, Dana P.

    1991-01-01

    The Technical Requirements Analysis and Control Systems (TRACS) software package is described. TRACS offers supplemental tools for the analysis, control, and interchange of project requirements. This package provides the fundamental capability to analyze and control requirements, serves as a focal point for project requirements, and integrates a system that supports efficient and consistent operations. TRACS uses relational database technology (ORACLE) in a stand-alone or distributed environment that can be used to coordinate the activities required to support a project through its entire life cycle. TRACS uses a set of keyword- and mouse-driven screens (HyperCard) which imposes adherence through a controlled user interface. The user interface provides an interactive capability to interrogate the database and to display or print project requirement information. TRACS has a limited report capability, but can be extended with PostScript conventions.
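
    The underlying pattern, project requirements held in a relational database that a controlled interface interrogates, can be sketched with an in-memory database. The schema and rows below are hypothetical, not the actual TRACS/ORACLE schema:

    ```python
    import sqlite3

    # Requirements-database sketch: store requirements relationally and
    # interrogate them interactively, as the controlled TRACS screens do.
    # Schema and example rows are hypothetical.

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE requirement (
        id        TEXT PRIMARY KEY,
        subsystem TEXT,
        text      TEXT,
        status    TEXT)""")
    db.executemany("INSERT INTO requirement VALUES (?, ?, ?, ?)", [
        ("R-001", "GN&C", "Pointing accuracy shall be 0.1 deg.", "verified"),
        ("R-002", "GN&C", "Slew rate shall exceed 1 deg/s.",     "open"),
        ("R-003", "COMM", "Downlink rate shall be 10 Mbps.",     "open"),
    ])

    # Interrogate the database, e.g. all open GN&C requirements:
    for row in db.execute(
            "SELECT id, text FROM requirement WHERE subsystem=? AND status=?",
            ("GN&C", "open")):
        print(row)
    ```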

  13. Space station needs, attributes and architectural options. Volume 1, attachment 1: Executive summary NASA

    NASA Technical Reports Server (NTRS)

    1983-01-01

    User alignment plan, physical and life sciences and applications, commercial requirements, national security, space operations, user needs, foreign contacts, mission scenario analysis and architectural concepts, alternative systems concepts, mission operations architectural development, architectural analysis trades, evolution, configuration, and technology development are discussed.

  14. User Requirements for a Chronic Kidney Disease Clinical Decision Support Tool to Promote Timely Referral.

    PubMed

    Gulla, Joy; Neri, Pamela M; Bates, David W; Samal, Lipika

    2017-05-01

    Timely referral of patients with CKD has been associated with cost and mortality benefits, but referrals are often done too late in the course of the disease. Clinical decision support (CDS) offers a potential solution, but interventions have failed because they were not designed to support the physician workflow. We sought to identify user requirements for a chronic kidney disease (CKD) CDS system to promote timely referral. We interviewed primary care physicians (PCPs) to identify data needs for a CKD CDS system that would encourage timely referral and also gathered information about workflow to assess risk factors for progression of CKD. Interviewees were general internists recruited from a network of 14 primary care clinics affiliated with Brigham and Women's Hospital (BWH). We then performed a qualitative analysis to identify user requirements and system attributes for a CKD CDS system. Of the 12 participants, 25% were women; the mean age was 53 (range 37-82), and the mean number of years in clinical practice was 27 (range 11-58). We identified 21 user requirements. Seven of these user requirements were related to support for the referral process workflow, including access to pertinent information and support for longitudinal co-management. Six user requirements were relevant to PCP management of CKD, including management of risk factors for progression, interpretation of biomarkers of CKD severity, and diagnosis of the cause of CKD. Finally, eight user requirements addressed user-centered design of CDS, including the need for actionable information, links to guidelines and reference materials, and visualization of trends. These 21 user requirements can be used to design an intuitive and usable CDS system with the attributes necessary to promote timely referral. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Evolutionary Capability Delivery of Coast Guard Manpower System

    DTIC Science & Technology

    2014-06-01

    ...Office; IID, iterative incremental development model; IT, information technology; MA, major accomplishment; MRA, manpower requirements analysis; MRD, manpower... The CG will need to ensure that development is low risk. The CG uses Manpower Requirements Analyses (MRAs) to collect the necessary manpower data to...of users. The CG uses two business processes to manage human capital: Manpower Requirements Analysis (MRA) and Manpower Requirements...

  16. From users involvement to users' needs understanding: a case study.

    PubMed

    Niès, Julie; Pelayo, Sylvia

    2010-04-01

    Companies developing and commercializing Healthcare IT applications may decide to involve the users in the software development lifecycle in order to better understand the users' needs and to optimize their products. Unfortunately, direct developer-user dialogues are not sufficient to ensure a proper understanding of the users' needs. It is also necessary to involve human factors specialists to analyze the users' expression of their needs and to properly formalize the requirements for design purposes. The objective of this paper is to present a case study reporting the collaborative work between HF experts and a company developing and commercializing a CPOE. This study shows how this collaboration helps resolve the limits of direct user involvement and the usual problems pertaining to the description and understanding of users' needs. The company participating in the study has implemented a procedure to convene regular meetings allowing direct exchanges between the development team and users' representatives. Those meetings aim at getting users' feedback on the existing products and at validating further developments. In parallel with usual HF methods supporting the analysis of the work system (onsite observations followed by debriefing interviews) and the usability evaluation of the application (usability inspection and usability tests), HF experts took the opportunity of the meetings organized by the company to collect, re-interpret and re-formulate the needs expressed by the users. The developers perceived the physicians' requirements concerning the display of the patient's medication list as contradictory. In a previous meeting round the users had required a detailed view of the medication list instead of the synthesized existing one. Once this requirement was satisfied, the users participating in the current meeting round required a synthesized view instead of the existing detailed one. The development team was unable to understand what they perceived as a reverse claim. Relying on a cognitive analysis of the physicians' decision making concerning the patient's treatment, the HF experts helped re-formulate the physicians' cognitive needs in terms of synthesized/detailed display of the medication list depending on the stage of the decision-making process. This led to an astute re-engineering of the application allowing the physicians to easily navigate back and forth between the synthesized and detailed views depending on the progress of their decision making. This study demonstrates that the integration of users' representatives in the software lifecycle is beneficial for the end users. But it remains insufficient to resolve the complex usability problems of the system. Such solutions require the integration of HF expertise. Moreover, such an involvement of HF experts may generate benefits in terms of reducing (i) the number of iterative developments and (ii) the users' training costs. (c) 2009 Elsevier Ireland Ltd. All rights reserved.

  17. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist, but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.

  18. Scientific user requirements for a herbarium data portal.

    PubMed

    Vissers, Jorick; Van den Bosch, Frederik; Bogaerts, Ann; Cocquyt, Christine; Degreef, Jérôme; Diagre, Denis; de Haan, Myriam; De Smedt, Sofie; Engledow, Henry; Ertz, Damien; Fabri, Régine; Godefroid, Sandrine; Hanquart, Nicole; Mergen, Patricia; Ronse, Anne; Sosef, Marc; Stévart, Tariq; Stoffelen, Piet; Vanderhoeven, Sonia; Groom, Quentin

    2017-01-01

    The digitization of herbaria and their online access will greatly facilitate access to plant collections around the world. This will improve the efficiency of taxonomy and help reduce inequalities between scientists. The Botanic Garden Meise, Belgium, is currently digitizing 1.2 million specimens including label data. In this paper we describe the user requirements analysis conducted for a new herbarium web portal. The aim was to identify the required functionality, but also to assist in the prioritization of software development and data acquisition. The Garden conducted the analysis in cooperation with Clockwork, the digital engagement agency of Ordina. Using a series of interactive interviews, potential users were consulted from universities, research institutions, science-policy initiatives and the Botanic Garden Meise. Although digital herbarium data have many potential stakeholders, we focused on the needs of taxonomists, ecologists and historians, who are currently the primary users of the Meise herbarium data portal. The three categories of user have similar needs, all wanted as much specimen data as possible, and for those data, to be interlinked with other digital resources within and outside the Garden. Many users wanted an interactive system that they could comment on, or correct online, particularly if such corrections and annotations could be used to rank the reliability of data. Many requirements depend on the quality of the digitized data associated with each specimen. The essential data fields are the taxonomic name; geographic location; country; collection date; collector name and collection number. Also all researchers valued linkage between biodiversity literature and specimens. Nevertheless, to verify digitized data the researchers still want access to high quality images, even if fully transcribed label information is provided. The only major point of disagreement is the level of access users should have and what they should be allowed to do with the data and images. Not all of the user requirements are feasible given the current technical and regulatory landscape, however, the potential of these suggestions is discussed. Currently, there is no off-the-shelf solution to satisfy all these user requirements, but the intention of this paper is to guide other herbaria who are prioritising their investment in digitization and online web functionality.
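
    The essential data fields listed in this abstract map naturally onto a simple specimen record. A sketch; the field names are our own rendering and the example values are invented:

    ```python
    from dataclasses import dataclass, asdict
    from typing import Optional

    # Specimen-record sketch holding the essential fields the user
    # requirements analysis identified. Example values are invented.

    @dataclass
    class Specimen:
        taxonomic_name: str
        geographic_location: str         # verbatim locality or coordinates
        country: str
        collection_date: str             # ISO 8601 where known
        collector_name: str
        collection_number: str
        image_url: Optional[str] = None  # link to the high-quality scan

    s = Specimen(
        taxonomic_name="Coffea arabica L.",
        geographic_location="Yangambi",
        country="DR Congo",
        collection_date="1937-05-12",
        collector_name="J. Louis",
        collection_number="4280",
    )
    print(asdict(s))
    ```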

  19. Scientific user requirements for a herbarium data portal

    PubMed Central

    Vissers, Jorick; Van den Bosch, Frederik; Bogaerts, Ann; Cocquyt, Christine; Degreef, Jérôme; Diagre, Denis; de Haan, Myriam; De Smedt, Sofie; Engledow, Henry; Ertz, Damien; Fabri, Régine; Godefroid, Sandrine; Hanquart, Nicole; Mergen, Patricia; Ronse, Anne; Sosef, Marc; Stévart, Tariq; Stoffelen, Piet; Vanderhoeven, Sonia; Groom, Quentin

    2017-01-01

    The digitization of herbaria and their online access will greatly facilitate access to plant collections around the world. This will improve the efficiency of taxonomy and help reduce inequalities between scientists. The Botanic Garden Meise, Belgium, is currently digitizing 1.2 million specimens including label data. In this paper we describe the user requirements analysis conducted for a new herbarium web portal. The aim was to identify the required functionality, but also to assist in the prioritization of software development and data acquisition. The Garden conducted the analysis in cooperation with Clockwork, the digital engagement agency of Ordina. Using a series of interactive interviews, potential users were consulted from universities, research institutions, science-policy initiatives and the Botanic Garden Meise. Although digital herbarium data have many potential stakeholders, we focused on the needs of taxonomists, ecologists and historians, who are currently the primary users of the Meise herbarium data portal. The three categories of user have similar needs, all wanted as much specimen data as possible, and for those data, to be interlinked with other digital resources within and outside the Garden. Many users wanted an interactive system that they could comment on, or correct online, particularly if such corrections and annotations could be used to rank the reliability of data. Many requirements depend on the quality of the digitized data associated with each specimen. The essential data fields are the taxonomic name; geographic location; country; collection date; collector name and collection number. Also all researchers valued linkage between biodiversity literature and specimens. Nevertheless, to verify digitized data the researchers still want access to high quality images, even if fully transcribed label information is provided. The only major point of disagreement is the level of access users should have and what they should be allowed to do with the data and images. Not all of the user requirements are feasible given the current technical and regulatory landscape, however, the potential of these suggestions is discussed. Currently, there is no off-the-shelf solution to satisfy all these user requirements, but the intention of this paper is to guide other herbaria who are prioritising their investment in digitization and online web functionality. PMID:28781551

  20. A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program

    NASA Technical Reports Server (NTRS)

    Bartoszek, J. T.; Huckins, B.; Coyle, M.

    1979-01-01

    A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.

  1. The Design of a Graphical User Interface for an Electronic Classroom.

    ERIC Educational Resources Information Center

    Cahalan, Kathleen J.; Levin, Jacques

    2000-01-01

    Describes the design of a prototype for the graphical user interface component of an electronic classroom (ECR) application that supports real-time lectures and question-and-answer sessions between an instructor and students. Based on requirements analysis and an analysis of competing products, a Web-based ECR prototype was produced. Findings show…

  2. USER'S GUIDE: Strategic Waste Minimization Initiative (SWAMI) Version 2.0 - A Software Tool to Aid in Process Analysis for Pollution Prevention

    EPA Science Inventory

    The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool for using process analysis for identifying waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...

  3. Commerce Laboratory: Mission analysis payload integration study

    NASA Technical Reports Server (NTRS)

    Bannister, T. C.

    1984-01-01

    A mission model which will accommodate commercial users and provide a basic data base for further mission planning is reported. The data bases to be developed are: (1) user requirements; (2) apparatus capabilities and availabilities; and (3) carrier capabilities. These data bases are synthesized in a trades and analysis phase along with the STS flight apparatus, and optimum missions will be identified. The completed work is reported. The user requirements data base was expanded to identify within the six scientific disciplines the areas of investigation, investigation categories and status, potential commercial application, interested parties, process, and experiment requirements. The scope of the apparatus data base was expanded to indicate apparatus status as to whether it is ground or flight equipment and, within both categories, whether the apparatus is: (1) existing, (2) under development, (3) planned, or (4) needed. Applications for the apparatus are listed. The methodology is revised in the areas of trades and analysis and mission planning. The carrier capabilities data base was updated and completed.

  4. How to measure the QoS of a web-based EHRs system: development of an instrument.

    PubMed

    de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C

    2012-12-01

    The quality of service (QoS) can be treated as a set of concepts whose satisfaction or dissatisfaction generates a globally positive or negative view of the service provided by any application. The differing nature of services and their features requires an analysis of the factors that have the greatest influence on the users' opinion; therefore, measuring the quality of service in each application requires a specific instrument. This paper will introduce an instrument to measure the QoS offered to users by a general Web application for Electronic Health Records (EHRs). The collection of opinions from a pilot sample and the performance of an exploratory factor analysis will bring together the factors that best sum up the quality of an EHRs application. Subsequently, a confirmatory factor analysis will be performed to make the study reliable and, as its name suggests, to confirm that the structure of the instrument developed indeed measures the QoS in accordance with the requirements of the users.
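
    The exploratory step described, grouping questionnaire items into a few underlying QoS factors, can be sketched with an off-the-shelf factor analysis. The response matrix below is random stand-in data, not the study's sample:

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Exploratory-factor-analysis sketch: factor-analyze Likert-style
    # responses so questionnaire items cluster into a few QoS factors.
    # The responses are synthetic stand-in data.

    rng = np.random.default_rng(0)
    n_respondents, n_items = 120, 10
    latent = rng.normal(size=(n_respondents, 2))      # two underlying factors
    loadings = rng.normal(size=(2, n_items))
    X = latent @ loadings + 0.3 * rng.normal(size=(n_respondents, n_items))

    fa = FactorAnalysis(n_components=2).fit(X)
    for i, item_loadings in enumerate(fa.components_.T):
        factor = int(np.argmax(np.abs(item_loadings)))
        print(f"item {i + 1:2d} loads mainly on factor {factor + 1}")
    ```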

  5. Accessible cell phone design: development and application of a needs analysis framework.

    PubMed

    Smith-Jackson, Tonya; Nussbaum, Maury; Mooney, Aaron

    2003-05-20

    This research describes the development and use of the Needs Analysis and Requirements Acquisition (NARA) framework to elicit and construct user requirements for the design of cell phones (which are a type of assistive technology) that are both usable and accessible to persons with disabilities. Semi-structured interviews and a focus group were used to elicit information, and a systematic approach was used to translate information into requirements (construct). Elicitation and construction are the first two stages of NARA. Requirements for general and feature-specific phone attributes were identified, and several requirements were found to match six of the seven universal design principles. The study demonstrated that NARA is both a straightforward and cost-effective method to develop user requirements and can be used throughout the development cycle.

  6. Chipster: user-friendly analysis software for microarray and other high-throughput data.

    PubMed

    Kallio, M Aleksi; Tuimala, Jarno T; Hupponen, Taavi; Klemelä, Petri; Gentile, Massimiliano; Scheinin, Ilari; Koski, Mikko; Käki, Janne; Korpelainen, Eija I

    2011-10-14

    The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Chipster is user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available.
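
    The reusable-workflow feature described (recording performed analysis steps so they can be re-run or shared) can be sketched as below. The session and step names are illustrative and not Chipster's actual tools or API:

    ```python
    # Reusable-workflow sketch: each analysis step is recorded as it is
    # performed, and the recording can be frozen into a pipeline that runs
    # on new data or is shared. Step names are illustrative.

    class Session:
        def __init__(self):
            self.steps = []                      # (name, function) pairs

        def apply(self, name, func, data):
            self.steps.append((name, func))      # record the step...
            return func(data)                    # ...and perform it

        def as_workflow(self):
            """Freeze the recorded steps into a reusable pipeline."""
            steps = list(self.steps)
            def workflow(data):
                for _, func in steps:
                    data = func(data)
                return data
            return workflow

    s = Session()
    data = s.apply("normalize", lambda d: [x / max(d) for x in d], [2.0, 4.0, 8.0])
    data = s.apply("threshold", lambda d: [x for x in d if x > 0.3], data)

    rerun = s.as_workflow()                      # re-run on another dataset
    print(rerun([1.0, 5.0, 10.0]))
    ```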

  7. Chipster: user-friendly analysis software for microarray and other high-throughput data

    PubMed Central

    2011-01-01

    Background The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Results Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Conclusions Chipster is user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available. PMID:21999641

  8. User's manual for the coupled rotor/airframe vibration analysis graphic package

    NASA Technical Reports Server (NTRS)

    Studwell, R. E.

    1982-01-01

    User instructions for a graphics package for coupled rotor/airframe vibration analysis are presented. Responses to plot package messages which the user must make to activate plot package operations and options are described. Installation instructions required to set up the program on the CDC system are included. The plot package overlay structure and subroutines which have to be modified for the CDC system are also described. Operating instructions for CDC applications are included.

  9. The experience of living with stroke and using technology: opportunities to engage and co-design with end users.

    PubMed

    Nasr, Nasrin; Leon, Beatriz; Mountain, Gail; Nijenhuis, Sharon M; Prange, Gerdienke; Sale, Patrizio; Amirabdollahian, Farshid

    2016-11-01

    We drew on an interdisciplinary research design to examine stroke survivors' experiences of living with stroke and with technology in order to provide technology developers with insight into values, thoughts and feelings of the potential users of a to-be-designed robotic technology for home-based rehabilitation of the hand and wrist. Ten stroke survivors and their family carers were purposefully selected. On the first home visit, they were introduced to cultural probe. On the second visit, the content of the probe packs were used as prompt to conduct one-to-one interviews with them. The data generated was analysed using thematic analysis. A third home visit was conducted to evaluate the early prototype. User requirements were categorised into their network of relationships, their attitude towards technology, their skills, their goals and motivations. The user requirements were used to envision the requirements of the system including providing feedback on performance, motivational aspects and usability of the system. Participants' views on the system requirements were obtained during a participatory evaluation. This study showed that prior to the development of technology, it is important to engage with potential users to identify user requirements and subsequently envision system requirements based on users' views. Implications for Rehabilitation An understanding of how stroke survivors make sense of their experiences of living with stroke is needed to design home-based rehabilitation technologies. Linking stroke survivors' goals, motivations, behaviour, feelings and attitude to user requirements prior to technology development has a significant impact on improving the design.

  10. From user needs to system specifications: multi-disciplinary thematic seminars as a collaborative design method for development of health information systems.

    PubMed

    Scandurra, I; Hägglund, M; Koch, S

    2008-08-01

    This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited and a holistic view of the entire care process was obtained. Health informatics-usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs, and transformed these directly into requirements specifications. Consequently the method was perceived to expedite the entire ICT implementation process.

  11. User's manual for MacPASCO

    NASA Technical Reports Server (NTRS)

    Lucas, S. H.; Davis, R. C.

    1992-01-01

    A user's manual is presented for MacPASCO, which is an interactive, graphic, preprocessor for panel design. MacPASCO creates input for PASCO, an existing computer code for structural analysis and sizing of longitudinally stiffened composite panels. MacPASCO provides a graphical user interface which simplifies the specification of panel geometry and reduces user input errors. The user draws the initial structural geometry and reduces user input errors. The user draws the initial structural geometry on the computer screen, then uses a combination of graphic and text inputs to: refine the structural geometry; specify information required for analysis such as panel load and boundary conditions; and define design variables and constraints for minimum mass optimization. Only the use of MacPASCO is described, since the use of PASCO has been documented elsewhere.

  12. Semantic technologies in a decision support system

    NASA Astrophysics Data System (ADS)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.

    2015-10-01

    The aim of our work is to design a decision support system based on ontological representation of domain(s) and semantic technologies. Specifically, we consider the case when Grid / Cloud user describes his/her requirements regarding a "resource" as a class expression from an ontology, while the instances of (the same) ontology represent available resources. The goal is to help the user to find the best option with respect to his/her requirements, while remembering that user's knowledge may be "limited." In this context, we discuss multiple approaches based on semantic data processing, which involve different "forms" of user interaction with the system. Specifically, we consider: (a) ontological matchmaking based on SPARQL queries and class expression, (b) graph-based semantic closeness of instances representing user requirements (constructed from the class expression) and available resources, and (c) multicriterial analysis based on the AHP method, which utilizes expert domain knowledge (also ontologically represented).

  13. An analysis of specialist and non-specialist user requirements for geographic climate change information.

    PubMed

    Maguire, Martin C

    2013-11-01

    The EU EuroClim project developed a system to monitor and record climate change indicator data based on satellite observations of snow cover, sea ice and glaciers in Northern Europe and the Arctic. It also contained projection data for temperature, rainfall and average wind speed for Europe. These were all stored as data sets in a GIS database for users to download. The process of gathering requirements for a user population including scientists, researchers, policy makers, educationalists and the general public is described. Using an iterative design methodology, a user survey was administered to obtain initial feedback on the system concept followed by panel sessions where users were presented with the system concept and a demonstrator to interact with it. The requirements of both specialist and non-specialist users is summarised together with strategies for the effective communication of geographic climate change information. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  14. EPPRD: An Efficient Privacy-Preserving Power Requirement and Distribution Aggregation Scheme for a Smart Grid.

    PubMed

    Zhang, Lei; Zhang, Jing

    2017-08-07

    A Smart Grid (SG) facilitates bidirectional demand-response communication between individual users and power providers with high computation and communication performance but also brings about the risk of leaking users' private information. Therefore, improving the individual power requirement and distribution efficiency to ensure communication reliability while preserving user privacy is a new challenge for SG. Based on this issue, we propose an efficient and privacy-preserving power requirement and distribution aggregation scheme (EPPRD) based on a hierarchical communication architecture. In the proposed scheme, an efficient encryption and authentication mechanism is proposed for better fit to each individual demand-response situation. Through extensive analysis and experiment, we demonstrate how the EPPRD resists various security threats and preserves user privacy while satisfying the individual requirement in a semi-honest model; it involves less communication overhead and computation time than the existing competing schemes.

  15. Defining Requirements and Related Methods for Designing Sensorized Garments.

    PubMed

    Andreoni, Giuseppe; Standoli, Carlo Emilio; Perego, Paolo

    2016-05-26

    Designing smart garments has strong interdisciplinary implications, specifically related to user and technical requirements, but also because of the very different applications they have: medicine, sport and fitness, lifestyle monitoring, workplace and job conditions analysis, etc. This paper aims to discuss some user, textile, and technical issues to be faced in sensorized clothes development. In relation to the user, the main requirements are anthropometric, gender-related, and aesthetical. In terms of these requirements, the user's age, the target application, and fashion trends cannot be ignored, because they determine the compliance with the wearable system. Regarding textile requirements, functional factors-also influencing user comfort-are elasticity and washability, while more technical properties are the stability of the chemical agents' effects for preserving the sensors' efficacy and reliability, and assuring the proper duration of the product for the complete life cycle. From the technical side, the physiological issues are the most important: skin conductance, tolerance, irritation, and the effect of sweat and perspiration are key factors for reliable sensing. Other technical features such as battery size and duration, and the form factor of the sensor collector, should be considered, as they affect aesthetical requirements, which have proven to be crucial, as well as comfort and wearability.

  16. Information services platforms at geosynchronous earth orbit: A requirements analysis

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The potential user requirements for Information Services Platforms at geosynchronous orbits were investigated. A rationale for identifying the corollary system requirements and supporting research and technology needs was provided.

  17. User's manual for the Simulated Life Analysis of Vehicle Elements (SLAVE) model

    NASA Technical Reports Server (NTRS)

    Paul, D. D., Jr.

    1972-01-01

    The simulated life analysis of vehicle elements model was designed to perform statistical simulation studies for any constant loss rate. The outputs of the model consist of the total number of stages required, stages successfully completing their lifetime, and average stage flight life. This report contains a complete description of the model. Users' instructions and interpretation of input and output data are presented such that a user with little or no prior programming knowledge can successfully implement the program.

  18. AdaNET research project

    NASA Technical Reports Server (NTRS)

    Digman, R. Michael

    1988-01-01

    The components necessary for the success of the commercialization of an Ada Technology Transition Network are reported in detail. The organizational plan presents the planned structure for services development and technical transition of AdaNET services to potential user communities. The Business Plan is the operational plan for the AdaNET service as a commercial venture. The Technical Plan is the plan from which the AdaNET can be designed including detailed requirements analysis. Also contained is an analysis of user fees and charges, and a proposed user fee schedule.

  19. CIAO: A Modern Data Analysis System for X-Ray Astronomy

    NASA Astrophysics Data System (ADS)

    Fruscione, Antonella

    2017-08-01

    It is now eighteen years after launch and Chandra continues to produce spectacular results!A portion of the success is to be attributed to the data analysis software CIAO (Chandra Interactive Analysis of Observations) that the Chandra X-Ray Center (CXC) continues to improve and release year after year.CIAO is downloaded more than 1200 times a year and it is used by a wide variety of users around the world: from novice to experienced X-ray astronomers, high school, undergraduate and graduate students, archival users (many new to X-ray or Chandra data), users with extensive resources and others from smaller countries and institutions.The scientific goals and kinds of datasets and analysis cover a wide range: observations spanning from days to years, different instrument configurations and different kinds of targets, from pointlike stars and quasars, to fuzzy galaxies and clusters, to moving solar objects. These different needs and goals require a variety of specialized software and careful and detailed documentation which is what the CIAO software provides. In general, we strive to build a software system which is easy for beginners, yet powerful for advanced users.The complexity of the Chandra data require a flexible data analysis system which provides an environment where the users can apply our tools, but can also explore and construct their own applications. The main purpose of this talk is to present CIAO as a modern data analysis system for X-ray data analysis.CIAO has grown tremendously over the years and we will highlight (a) the most recent advancements with a particular emphasis on the newly developed high-level scripts which simplify the analysis steps for the most common cases making CIAO more accessible to all users - including beginners and users who are not X-ray astronomy specialists, (b) the python-based Sherpa modelling and fitting application and the new stand-alone version openly developed and distributed on Github and (c) progress on methods to characterize the Chandra PSF.

  20. Determining Relevant Financial Statement Ratios in Department of Defense Service Component General Fund Financial Statements

    DTIC Science & Technology

    2014-06-01

    a . Gross Profit Ratio Modified to Budget Compliance Ratio ....70 b...statements. 1. Objective and Users For a business to make a profit , it is often required to obtain funds from lenders or investors to purchase the...required to determine conclusions regarding the financial position of a business. It is important for users to acknowledge that a ratio analysis is

  1. Background Information and User’s Guide for MIL-F-9490

    DTIC Science & Technology

    1975-01-01

    requirements, although different analysis results will apply to each requirement. Basic differences between the two realibility requirements are: MIL-F-8785B...provides the rationale for establishing such limits. The specific risk analysis comprises the same data which formed the average risk analysis , except...statistical analysis will be based on statistical data taken using limited exposure Limes of components and equipment. The exposure times and resulting

  2. Farm and personal characteristics of the clientele of a community-based animal-health service programme in northern Malawi.

    PubMed

    Hüttner, K; Leidl, K; Pfeiffer, D U; Jere, F B; Kasambara, D

    2001-05-01

    The social background, farm characteristics, indicators of income and self-evaluation returns of 96 randomly selected users of a Basic Animal Health Service (BAHS) programme in northern Malawi were compared with those of 96 matched past-users and 96 non-users, respectively. All 288 farms were visited between July and October 1997. Data analysis was performed using univariate and multivariate techniques. The results showed that, on average, BAHS users had larger cattle herds (16.3) than part-users (14.7) or non-users (12.4). Similarly, the annual yields of crops were higher for users compared to either of the other groups. Users occupied better houses and owned a larger number of farm and household items than did part-users or non-users. A third of all farmers were engaged in additional income generation to lessen the risk of poverty. However, analysis of the livestock management and the educational background of the farmers suggested that usage of the BAHS programme was not only determined by already existing 'wealth'. Improved livestock husbandry and management measures, which do not require capital investment, were more frequently applied by users compared to either of the other groups. Non-users and part-users had attained a lower level of education, were less open towards improved farming methods and felt less knowledgeable than BAHS users. The average straight-line distances from farms using BAHS to their respective village animal health worker (2.2 km) or veterinary assistant (2.9 km) were similar but varied according to ecological zone. Intensified extension and awareness meetings in villages will be required to get more non-users involved in BAHS.

  3. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  4. User engineering: A new look at system engineering

    NASA Technical Reports Server (NTRS)

    Mclaughlin, Larry L.

    1987-01-01

    User Engineering is a new System Engineering perspective responsible for defining and maintaining the user view of the system. Its elements are a process to guide the project and customer, a multidisciplinary team including hard and soft sciences, rapid prototyping tools to build user interfaces quickly and modify them frequently at low cost, and a prototyping center for involving users and designers in an iterative way. The main consideration is reducing the risk that the end user will not or cannot effectively use the system. The process begins with user analysis to produce cognitive and work style models, and task analysis to produce user work functions and scenarios. These become major drivers of the human computer interface design which is presented and reviewed as an interactive prototype by users. Feedback is rapid and productive, and user effectiveness can be measured and observed before the system is built and fielded. Requirements are derived via the prototype and baselined early to serve as an input to the architecture and software design.

  5. Kameleon Live: An Interactive Cloud Based Analysis and Visualization Platform for Space Weather Researchers

    NASA Astrophysics Data System (ADS)

    Pembroke, A. D.; Colbert, J. A.

    2015-12-01

    The Community Coordinated Modeling Center (CCMC) provides hosting for many of the simulations used by the space weather community of scientists, educators, and forecasters. CCMC users may submit model runs through the Runs on Request system, which produces static visualizations of model output in the browser, while further analysis may be performed off-line via Kameleon, CCMC's cross-language access and interpolation library. Off-line analysis may be suitable for power-users, but storage and coding requirements present a barrier to entry for non-experts. Moreover, a lack of a consistent framework for analysis hinders reproducibility of scientific findings. To that end, we have developed Kameleon Live, a cloud based interactive analysis and visualization platform. Kameleon Live allows users to create scientific studies built around selected runs from the Runs on Request database, perform analysis on those runs, collaborate with other users, and disseminate their findings among the space weather community. In addition to showcasing these novel collaborative analysis features, we invite feedback from CCMC users as we seek to advance and improve on the new platform.

  6. DCS: A global satellite environmental data collection system study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Cost analysis and technical feasibility data are presented on five medium orbiting and six geosynchronous satellite data collection systems with varying degrees of spacecraft and local user terminal complexity. Data are also provided on system approaches, user requirements, and user classes. Systems considered include orbiting ERTS and EOS type satellites and geosynchronous SmS and SEOS type data collectors.

  7. User-friendly InSAR Data Products: Fast and Simple Timeseries (FAST) Processing

    NASA Astrophysics Data System (ADS)

    Zebker, H. A.

    2017-12-01

    Interferometric Synthetic Aperture Radar (InSAR) methods provide high resolution maps of surface deformation applicable to many scientific, engineering and management studies. Despite its utility, the specialized skills and computer resources required for InSAR analysis remain as barriers for truly widespread use of the technique. Reduction of radar scenes to maps of temporal deformation evolution requires not only detailed metadata describing the exact radar and surface acquisition geometries, but also a software package that can combine these for the specific scenes of interest. Furthermore, the radar range-Doppler radar coordinate system itself is confusing, so that many users find it hard to incorporate even useful products in their customary analyses. And finally, the sheer data volume needed to represent interferogram time series makes InSAR analysis challenging for many analysis systems. We show here that it is possible to deliver radar data products to users that address all of these difficulties, so that the data acquired by large, modern satellite systems are ready to use in more natural coordinates, without requiring further processing, and in as small volume as possible.

  8. Commerce Lab: Mission analysis. Payload integration study

    NASA Technical Reports Server (NTRS)

    Marvin, G. D.

    1984-01-01

    The objectives of the commerce lab mission analysis and payload integration study are discussed. A mission model which accommodates commercial users and provides a basic data base for future mission planning is described. The data bases developed under this study include: (1) user requirements; (2) apparatus capabilities and availabilities; and (3) carrier capabilities. These data bases are synthesized in a trades and analysis phase along with the STS flight opportunities. Optimum missions are identified.

  9. Understanding the relationship between Kano model's customer satisfaction scores and self-stated requirements importance.

    PubMed

    Mkpojiogu, Emmanuel O C; Hashim, Nor Laily

    2016-01-01

    Customer satisfaction is the result of product quality and viability. The place of the perceived satisfaction of users/customers for a software product cannot be neglected especially in today competitive market environment as it drives the loyalty of customers and promotes high profitability and return on investment. Therefore understanding the importance of requirements as it is associated with the satisfaction of users/customers when their requirements are met is worth the pain considering. It is necessary to know the relationship between customer satisfactions when their requirements are met (or their dissatisfaction when their requirements are unmet) and the importance of such requirement. So many works have been carried out on customer satisfaction in connection with the importance of requirements but the relationship between customer satisfaction scores (coefficients) of the Kano model and users/customers self-stated requirements importance have not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship existing between Kano model's customer satisfaction indexes and users/customers self reported requirements importance. The results of the study indicate some interesting associations between these considered variables. These bivariate associations reveal that customer satisfaction index (SI), and average satisfaction coefficient (ASC) and customer dissatisfaction index (DI) and average satisfaction coefficient (ASC) are highly correlated (r = 96 %) and thus ASC can be used in place of either SI or DI in representing customer satisfaction scores. Also, these Kano model's customer satisfaction variables (SI, DI, and ASC) are each associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met or on features that are incorporated into a product influences the level of satisfaction such customers derive from the product. The worth of a product feature is indicated by the perceived satisfaction customers get from the inclusion of such feature in the product design and development. The satisfaction users/customers derive when a requirement is fulfilled or when a feature is placed in the product (SI or ASC) is strongly influenced by the value the users/customers place on such requirements/features when met (IMP). However, the dissatisfaction users/customers received when a requirement is not met or when a feature is not incorporated into the product (DI), even though related to self-stated requirements importance (IMP), does not have a strong effect on the importance/worth (IMP) of that given requirement/feature as perceived by the users or customers. Therefore, since customer satisfaction is proportionally related to the perceived requirements importance (worth), it is then necessary to give adequate attention to user/customer satisfying requirements (features) from elicitation to design and to the final implementation of the design. Incorporating user or customer satisfying requirements in product design is of great worth or value to the future users or customers of the product.

  10. TADS: A CFD-based turbomachinery and analysis design system with GUI. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Myers, R. A.; Topp, D. A.; Delaney, R. A.

    1995-01-01

    The primary objective of this study was the development of a computational fluid dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a graphical user interface (GUI). The computer codes resulting from this effort are referred to as the Turbomachinery Analysis and Design System (TADS). This document is intended to serve as a user's manual for the computer programs which comprise the TADS system. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of various programs was done in a way that alternative solvers or grid generators could be easily incorporated into the TADS framework.

  11. Usability assessment of an electronic health record in a comprehensive dental clinic.

    PubMed

    Suebnukarn, Siriwan; Rittipakorn, Pawornwan; Thongyoi, Budsara; Boonpitak, Kwanwong; Wongsapai, Mansuang; Pakdeesan, Panu

    2013-12-01

    In this paper we present the development and usability of an electronic health record (EHR) system in a comprehensive dental clinic.The graphic user interface of the system was designed to consider the concept of cognitive ergonomics.The cognitive task analysis was used to evaluate the user interface of the EHR by identifying all sub-tasks and classifying them into mental or physical operators, and to predict task execution time required to perform the given task. We randomly selected 30 cases that had oral examinations for routine clinical care in a comprehensive dental clinic. The results were based on the analysis of 4 prototypical tasks performed by ten EHR users. The results showed that on average a user needed to go through 27 steps to complete all tasks for one case. To perform all 4 tasks of 30 cases, they spent about 91 min (independent of system response time) for data entry, of which 51.8 min were spent on more effortful mental operators. In conclusion, the user interface can be improved by reducing the percentage of mental effort required for the tasks.

  12. Unlocking data: federated identity with LSDMA and dCache

    NASA Astrophysics Data System (ADS)

    Millar, AP; Behrmann, G.; Bernardt, C.; Fuhrmann, P.; Hardt, M.; Hayrapetyan, A.; Litvintsev, D.; Mkrtchyan, T.; Rossi, A.; Schwank, K.

    2015-12-01

    X.509, the dominant identity system from grid computing, has proved unpopular for many user communities. More popular alternatives generally assume the user is interacting via their web-browser. Such alternatives allow a user to authenticate with many services with the same credentials (user-name and password). They also allow users from different organisations form collaborations quickly and simply. Scientists generally require that their custom analysis software has direct access to the data. Such direct access is not currently supported by alternatives to X.509, as they require the use of a web-browser. Various approaches to solve this issue are being investigated as part of the Large Scale Data Management and Analysis (LSDMA) project, a German funded national R&D project. These involve dynamic credential translation (creating an X.509 credential) to allow backwards compatibility in addition to direct SAML- and OpenID Connect-based authentication. We present a summary of the current state of art and the current status of the federated identity work funded by the LSDMA project along with the future road map.

  13. Lamprey: tracking users on the World Wide Web.

    PubMed

    Felciano, R M; Altman, R B

    1996-01-01

    Tracking individual web sessions provides valuable information about user behavior. This information can be used for general purpose evaluation of web-based user interfaces to biomedical information systems. To this end, we have developed Lamprey, a tool for doing quantitative and qualitative analysis of Web-based user interfaces. Lamprey can be used from any conforming browser, and does not require modification of server or client software. By rerouting WWW navigation through a centralized filter, Lamprey collects the sequence and timing of hyperlinks used by individual users to move through the web. Instead of providing marginal statistics, it retains the full information required to recreate a user session. We have built Lamprey as a standard Common Gateway Interface (CGI) that works with all standard WWW browsers and servers. In this paper, we describe Lamprey and provide a short demonstration of this approach for evaluating web usage patterns.

  14. Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.

    PubMed

    Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie

    2010-07-01

    Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to the lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy to use, easy to learn, and error-resistant EHR systems to the users. To evaluate the usability of an EHR system and suggest areas of improvement in the user interface. The user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task into Mental (Internal) or Physical (External) operators. This analysis was performed by two analysts independently and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform the given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users. The results show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, they would spend about 22 min (independent of system response time) for data entry, of which 11 min are spent on more effortful mental operators. The inter-rater reliability analysis performed for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically reveals and identifies the following finding related to the performance of AHLTA: (1) large number of average total steps to complete common tasks, (2) high average execution time and (3) large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort, required for the tasks. 2010 Elsevier Ireland Ltd. All rights reserved.

  15. Usage, Utility, and Usability of the Knowledge Wall During the Global 2000 War Game

    DTIC Science & Technology

    2001-08-01

    evaluation during the Global 2000 War Game. The prototype KW was designed to meet 14 user requirements identified in a previous cognitive task analysis of potential KW users. This report presents the results of the evaluation.

  16. User's manual for rocket combustor interactive design (ROCCID) and analysis computer program. Volume 1: User's manual

    NASA Technical Reports Server (NTRS)

    Muss, J. A.; Nguyen, T. V.; Johnson, C. W.

    1991-01-01

    The user's manual for the rocket combustor interactive design (ROCCID) computer program is presented. The program, written in Fortran 77, provides a standardized methodology using state of the art codes and procedures for the analysis of a liquid rocket engine combustor's steady state combustion performance and combustion stability. The ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like doublet or unlike triplet, showerhead, shear coaxial, and swirl coaxial elements as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID. The properties of other propellants can easily be added. The analysis model in ROCCID can account for the influence of acoustic cavities, helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady state performance and combustion stability of this design is evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.

  17. NASA scientific and technical program: User survey

    NASA Technical Reports Server (NTRS)

    Hunter, Judy F.; Shockley, Cynthia W.

    1993-01-01

    Results are presented of an intensive user requirements survey conducted by NASA's Scientific and Technical Information (STI) Program with the goal of improving the foundation for the user outreach program. The survey was carried out by interviewing 550 NASA scientists, engineers, and contractors and by analyzing 650 individual responses to a mailed out questionnaire. To analyze the user demographic data, a data base was built and used, and will be applied to ongoing analysis by the NASA STI Program.

  18. Reinventing Image Detective: An Evidence-Based Approach to Citizen Science Online

    NASA Astrophysics Data System (ADS)

    Romano, C.; Graff, P. V.; Runco, S.

    2017-12-01

    Usability studies demonstrate that web users are notoriously impatient, spending as little as 15 seconds on a home page. How do you get users to stay long enough to understand a citizen science project? How do you get users to complete complex citizen science tasks online?Image Detective, a citizen science project originally developed by scientists and science engagement specialists at the NASA Johnson Space center to engage the public in the analysis of images taken from space by astronauts to help enhance NASA's online database of astronaut imagery, partnered with the CosmoQuest citizen science platform to modernize, offering new and improved options for participation in Image Detective. The challenge: to create a web interface that builds users' skills and knowledge, creating engagement while learning complex concepts essential to the accurate completion of tasks. The project team turned to usability testing for an objective understanding of how users perceived Image Detective and the steps required to complete required tasks. A group of six users was recruited online for unmoderated and initial testing. The users followed a think-aloud protocol while attempting tasks, and were recorded on video and audio. The usability test examined users' perception of four broad areas: the purpose of and context for Image Detective; the steps required to successfully complete the analysis (differentiating images of Earth's surface from those showing outer space and identifying common surface features); locating the image center point on a map of Earth; and finally, naming geographic locations or natural events seen in the image.Usability test findings demonstrated that the following best practices can increase participation in Image Detective and can be applied to the successful implementation of any citizen science project:• Concise explanation of the project, its context, and its purpose;• Including a mention of the funding agency (in this case, NASA);• A preview of the specific tasks required of participants;• A dedicated user interface for the actual citizen science interaction.In addition, testing revealed that users may require additional context when a task is complex, difficult, or unusual (locating a specific image and its center point on a map of Earth). Video evidence will be made available with this presentation.

  19. Reinventing Image Detective: An Evidence-Based Approach to Citizen Science Online

    NASA Technical Reports Server (NTRS)

    Romano, Cia; Graff, Paige V.; Runco, Susan

    2017-01-01

    Usability studies demonstrate that web users are notoriously impatient, spending as little as 15 seconds on a home page. How do you get users to stay long enough to understand a citizen science project? How do you get users to complete complex citizen science tasks online? Image Detective, a citizen science project originally developed by scientists and science engagement specialists at the NASA Johnson Space center to engage the public in the analysis of images taken from space by astronauts to help enhance NASA's online database of astronaut imagery, partnered with the CosmoQuest citizen science platform to modernize, offering new and improved options for participation in Image Detective. The challenge: to create a web interface that builds users' skills and knowledge, creating engagement while learning complex concepts essential to the accurate completion of tasks. The project team turned to usability testing for an objective understanding of how users perceived Image Detective and the steps required to complete required tasks. A group of six users was recruited online for unmoderated and initial testing. The users followed a think-aloud protocol while attempting tasks, and were recorded on video and audio. The usability test examined users' perception of four broad areas: the purpose of and context for Image Detective; the steps required to successfully complete the analysis (differentiating images of Earth's surface from those showing outer space and identifying common surface features); locating the image center point on a map of Earth; and finally, naming geographic locations or natural events seen in the image. Usability test findings demonstrated that the following best practices can increase participation in Image Detective and can be applied to the successful implementation of any citizen science project: (1) Concise explanation of the project, its context, and its purpose; (2) Including a mention of the funding agency (in this case, NASA); (3) A preview of the specific tasks required of participants; (4) A dedicated user interface for the actual citizen science interaction. In addition, testing revealed that users may require additional context when a task is complex, difficult, or unusual (locating a specific image and its center point on a map of Earth). Video evidence will be made available with this presentation.

  20. Understanding the Requirements for Open Source Software

    DTIC Science & Technology

    2009-06-17

    GNOME and K Development Environment ( KDE ) for end-user interfaces, the Eclipse and NetBeans interactive development environments for Java-based Web...17 4.1. Informal Post-hoc Assertion of OSS Requirements vs . Requirements Elicitation...18 4.2. Requirements Reading, Sense-making, and Accountability vs . Requirements Analysis

  1. User Instructions for the Policy Analysis Modeling System (PAMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeil, Michael A.; Letschert, Virginie E.; Van Buskirk, Robert D.

    PAMS uses country-specific and product-specific data to calculate estimates of impacts of a Minimum Efficiency Performance Standard (MEPS) program. The analysis tool is self-contained in a Microsoft Excel spreadsheet, and requires no links to external data, or special code additions to run. The analysis can be customized to a particular program without additional user input, through the use of the pull-down menus located on the Summary page. In addition, the spreadsheet contains many areas into which user-generated input data can be entered for increased accuracy of projection. The following is a step-by-step guide for using and customizing the tool.

  2. Information Switching Processor (ISP) contention analysis and control

    NASA Technical Reports Server (NTRS)

    Shyy, D.; Inukai, T.

    1993-01-01

    Future satellite communications, as a viable means of communications and an alternative to terrestrial networks, demand flexibility and low end-user cost. On-board switching/processing satellites potentially provide these features, allowing flexible interconnection among multiple spot beams, direct to the user communications services using very small aperture terminals (VSAT's), independent uplink and downlink access/transmission system designs optimized to user's traffic requirements, efficient TDM downlink transmission, and better link performance. A flexible switching system on the satellite in conjunction with low-cost user terminals will likely benefit future satellite network users.

  3. Comparative analysis of data base management systems

    NASA Technical Reports Server (NTRS)

    Smith, R.

    1983-01-01

    A study to determine if the Remote File Inquiry (RFI) system would handle the future requirements of the user community is discussed. RFI is a locally written and locally maintained on-line query/update package. The current and future on-line requirements of the user community were studied. Additional consideration was given to the types of data structuring the users required. The survey indicated the features of greatest benefit were: sort, subtotals, totals, record selection, storage of queries, global updating and the ability to page break. The major deficiencies were: one level of hierarchy, excessive response time, software unreliability, difficult to add, delete and modify records, complicated error messages and the lack of ability to perform interfield comparisons. Missing features users required were: formatted screens, interfield comparions, interfield arithmetic, multiple file access, security and data integrity. The survey team recommended Kennedy Space Center move forward to state-of-the-art software, a Data Base Management System which is thoroughly tested and easy to implement and use.

  4. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are also described as well as users requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  5. Interactive K-Means Clustering Method Based on User Behavior for Different Analysis Target in Medicine.

    PubMed

    Lei, Yang; Yu, Dai; Bin, Zhang; Yang, Yang

    2017-01-01

    Clustering algorithm as a basis of data analysis is widely used in analysis systems. However, as for the high dimensions of the data, the clustering algorithm may overlook the business relation between these dimensions especially in the medical fields. As a result, usually the clustering result may not meet the business goals of the users. Then, in the clustering process, if it can combine the knowledge of the users, that is, the doctor's knowledge or the analysis intent, the clustering result can be more satisfied. In this paper, we propose an interactive K -means clustering method to improve the user's satisfactions towards the result. The core of this method is to get the user's feedback of the clustering result, to optimize the clustering result. Then, a particle swarm optimization algorithm is used in the method to optimize the parameters, especially the weight settings in the clustering algorithm to make it reflect the user's business preference as possible. After that, based on the parameter optimization and adjustment, the clustering result can be closer to the user's requirement. Finally, we take an example in the breast cancer, to testify our method. The experiments show the better performance of our algorithm.

  6. System Engineering Concept Demonstration, Effort Summary. Volume 1

    DTIC Science & Technology

    1992-12-01

    involve only the system software, user frameworks and user tools. U •User Tool....s , Catalyst oExternal 00 Computer Framwork P OSystems • •~ Sysytem...analysis, synthesis, optimization, conceptual design of Catalyst. The paper discusses the definition, design, test, and evaluation; operational concept...This approach will allow system engineering The conceptual requirements for the Process Model practitioners to recognize and tailor the model. This

  7. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    PubMed

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lacks a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  8. Proceedings of the Third International Mobile Satellite Conference (IMSC 1993)

    NASA Technical Reports Server (NTRS)

    Kwan, Robert (Compiler); Rigley, Jack (Compiler); Cassingham, Randy (Editor)

    1993-01-01

    Satellite-based mobile communications systems provide voice and data communications to users over a vast geographic area. The users may communicate via mobile or hand-held terminals, which may also provide access to terrestrial cellular communications services. While the first and second International Mobile Satellite Conferences (IMSC) mostly concentrated on technical advances, this Third IMSC also focuses on the increasing worldwide commercial activities in Mobile Satellite Services. Because of the large service areas provided by such systems, it is important to consider political and regulatory issues in addition to technical and user requirements issues. Topics covered include: the direct broadcast of audio programming from satellites; spacecraft technology; regulatory and policy considerations; advanced system concepts and analysis; propagation; and user requirements and applications.

  9. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    PubMed

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided by NeoAnalysis, users can easily obtain publication-quality figures without writing complex codes. NeoAnalysis is a powerful and valuable toolbox for users doing electrophysiological experiments.

  10. The GEOSS User Requirement Registry (URR): A Cross-Cutting Service-Oriented Infrastructure Linking Science, Society and GEOSS

    NASA Astrophysics Data System (ADS)

    Plag, H.-P.; Foley, G.; Jules-Plag, S.; Ondich, G.; Kaufman, J.

    2012-04-01

    The Group on Earth Observations (GEO) is implementing the Global Earth Observation System of Systems (GEOSS) as a user-driven service infrastructure responding to the needs of users in nine interdependent Societal Benefit Areas (SBAs) of Earth observations (EOs). GEOSS applies an interdisciplinary scientific approach integrating observations, research, and knowledge in these SBAs in order to enable scientific interpretation of the collected observations and the extraction of actionable information. Using EOs to actually produce these societal benefits means getting the data and information to users, i.e., decision-makers. Thus, GEO needs to know what the users need and how they would use the information. The GEOSS User Requirements Registry (URR) is developed as a service-oriented infrastructure enabling a wide range of users, including science and technology (S&T) users, to express their needs in terms of EOs and to understand the benefits of GEOSS for their fields. S&T communities need to be involved in both the development and the use of GEOSS, and the development of the URR accounts for the special needs of these communities. The GEOSS Common Infrastructure (GCI) at the core of GEOSS includes system-oriented registries enabling users to discover, access, and use EOs and derived products and services available through GEOSS. In addition, the user-oriented URR is a place for the collection, sharing, and analysis of user needs and EO requirements, and it provides means for an efficient dialog between users and providers. The URR is a community-based infrastructure for the publishing, viewing, and analyzing of user-need related information. The data model of the URR has a core of seven relations for User Types, Applications, Requirements, Research Needs, Infrastructure Needs, Technology Needs, and Capacity Building Needs. The URR also includes a Lexicon, a number of controlled vocabularies, and

  11. TargetVue: Visual Analysis of Anomalous User Behaviors in Online Communication Systems.

    PubMed

    Cao, Nan; Shi, Conglei; Lin, Sabrina; Lu, Jie; Lin, Yu-Ru; Lin, Ching-Yung

    2016-01-01

    Users with anomalous behaviors in online communication systems (e.g. email and social medial platforms) are potential threats to society. Automated anomaly detection based on advanced machine learning techniques has been developed to combat this issue; challenges remain, though, due to the difficulty of obtaining proper ground truth for model training and evaluation. Therefore, substantial human judgment on the automated analysis results is often required to better adjust the performance of anomaly detection. Unfortunately, techniques that allow users to understand the analysis results more efficiently, to make a confident judgment about anomalies, and to explore data in their context, are still lacking. In this paper, we propose a novel visual analysis system, TargetVue, which detects anomalous users via an unsupervised learning model and visualizes the behaviors of suspicious users in behavior-rich context through novel visualization designs and multiple coordinated contextual views. Particularly, TargetVue incorporates three new ego-centric glyphs to visually summarize a user's behaviors which effectively present the user's communication activities, features, and social interactions. An efficient layout method is proposed to place these glyphs on a triangle grid, which captures similarities among users and facilitates comparisons of behaviors of different users. We demonstrate the power of TargetVue through its application in a social bot detection challenge using Twitter data, a case study based on email records, and an interview with expert users. Our evaluation shows that TargetVue is beneficial to the detection of users with anomalous communication behaviors.

  12. Software design for analysis of multichannel intracardial and body surface electrocardiograms.

    PubMed

    Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A

    2002-11-01

    Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.

  13. A New Method to Retrieve the Data Requirements of the Remote Sensing Community – Exemplarily Demonstrated for Hyperspectral User Needs

    PubMed Central

    Nieke, Jens; Reusen, Ils

    2007-01-01

    User-driven requirements for remote sensing data are difficult to define, especially details on geometric, spectral and radiometric parameters. Even more difficult is a decent assessment of the required degrees of processing and corresponding data quality. It is therefore a real challenge to appropriately assess data costs and services to be provided. In 2006, the HYRESSA project was initiated within the framework 6 programme of the European Commission to analyze the user needs of the hyperspectral remote sensing community. Special focus was given to finding an answer to the key question, “What are the individual user requirements for hyperspectral imagery and its related data products?”. A Value-Benefit Analysis (VBA) was performed to retrieve user needs and address open items accordingly. The VBA is an established tool for systematic problem solving by supporting the possibility of comparing competing projects or solutions. It enables evaluation on the basis of a multidimensional objective model and can be augmented with expert's preferences. After undergoing a VBA, the scaling method (e.g., Law of Comparative Judgment) was applied for achieving the desired ranking judgments. The result, which is the relative value of projects with respect to a well-defined main objective, can therefore be produced analytically using a VBA. A multidimensional objective model adhering to VBA methodology was established. Thereafter, end users and experts were requested to fill out a Questionnaire of User Needs (QUN) at the highest level of detail - the value indicator level. The end user was additionally requested to report personal preferences for his particular research field. In the end, results from the experts' evaluation and results from a sensor data survey can be compared in order to understand user needs and the drawbacks of existing data products. The investigation – focusing on the needs of the hyperspectral user community in Europe – showed that a VBA is a suitable method for analyzing the needs of hyperspectral data users and supporting the sensor/data specification-building process. The VBA has the advantage of being easy to handle, resulting in a comprehensive evaluation. The primary disadvantage is the large effort in realizing such an analysis because the level of detail is extremely high.

  14. Sierra Structural Dynamics User's Notes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reese, Garth M.

    2015-10-19

    Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of weapons systems. This document provides a users guide to the input for Sierra/SD. Details of input specifications for the different solution types, output options, element types and parameters are included. The appendices contain detailed examples, and instructions for running the software on parallel platforms.

  15. Sierra/SD User's Notes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munday, Lynn Brendon; Day, David M.; Bunting, Gregory

    Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of weapons systems. This document provides a users guide to the input for Sierra/SD. Details of input specifications for the different solution types, output options, element types and parameters are included. The appendices contain detailed examples, and instructions for running the software on parallel platforms.

  16. Requirements Engineering in Building Climate Science Software

    NASA Astrophysics Data System (ADS)

    Batcheller, Archer L.

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks to both build a software system according to product requirements but also to conduct their work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the software team or users have control and responsibility for making changes in response to new scientific ideas. Thick infrastructure provides more functionality for users, but gives them less control of it. The stability of infrastructure trades off against the responsiveness that the infrastructure can have to user needs.

  17. A Mobile Food Record For Integrated Dietary Assessment*

    PubMed Central

    Ahmad, Ziad; Kerr, Deborah A.; Bosch, Marc; Boushey, Carol J.; Delp, Edward J.; Khanna, Nitin; Zhu, Fengqing

    2017-01-01

    This paper presents an integrated dietary assessment system based on food image analysis that uses mobile devices or smartphones. We describe two components of our integrated system: a mobile application and an image-based food nutrient database that is connected to the mobile application. An easy-to-use mobile application user interface is described that was designed based on user preferences as well as the requirements of the image analysis methods. The user interface is validated by user feedback collected from several studies. Food nutrient and image databases are also described which facilitates image-based dietary assessment and enable dietitians and other healthcare professionals to monitor patients dietary intake in real-time. The system has been tested and validated in several user studies involving more than 500 users who took more than 60,000 food images under controlled and community-dwelling conditions. PMID:28691119

  18. InterFace: A software package for face image warping, averaging, and principal components analysis.

    PubMed

    Kramer, Robin S S; Jenkins, Rob; Burton, A Mike

    2017-12-01

    We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.

  19. Development of the cancer patient financial aid system and analysis of user satisfaction.

    PubMed

    Park, Joon Ho; Park, Eun-Cheol; Lee, Myung Ha; Kim, Yun-Mi; Choi, Soo Mi

    2006-01-01

    A financial aid program for low income cancer patients in Korea was initiated in 2005, which required a web-based system. Therefore, the Cancer Patient Financial Aid System (CPFAS) was developed. To improve the CPFAS, we evaluated the nationwide satisfaction of public health center users.

  20. Manned Orbital Transfer Vehicle (MOTV). Volume 4: Supporting analysis

    NASA Technical Reports Server (NTRS)

    Boyland, R. E.; Sherman, S. W.; Morfin, H. W.

    1979-01-01

    Generic missions were defined to enable potential users to determine the parameters for suggested user projects. Mission modes were identified for providing operation, interfaces, performance, and cost data for studying payloads. Safety requirements for emergencies during various phases of the mission are considered with emphasis on radiation hazards.

  1. Using task analysis to improve the requirements elicitation in health information system.

    PubMed

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2007-01-01

    This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.

  2. Materials and Nondestructive Evaluation Laboratoriers: User Test Planning Guide

    NASA Technical Reports Server (NTRS)

    Schaschl, Leslie

    2011-01-01

    The Materials and Nondestructive Evaluation Laboratory process, milestones and inputs are unknowns to first-time users. The Materials and Nondestructive Evaluation Laboratory Planning Guide aids in establishing expectations for both NASA and non- NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware developers. It is intended to assist their project engineering personnel in materials analysis planning and execution. Material covered includes a roadmap of the analysis process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, products, and inputs necessary to define scope of analysis, cost, and schedule are included as an appendix to the guide.

  3. Acquisition and production of skilled behavior in dynamic decision-making tasks. Semiannual Status Report M.S. Thesis - Georgia Inst. of Tech., Nov. 1992

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex; Kossack, Merrick Frank

    1993-01-01

    This status report consists of a thesis entitled 'Ecological Task Analysis: A Method for Display Enhancements.' Previous use of various analysis processes for the purpose of display interface design or enhancement has run the risk of failing to improve user performance due to the analysis resulting in only a sequencial listing of user tasks. Adopting an ecological approach to performing the task analysis, however, may result in the necessary modeling of an unpredictable and variable task domain required to improve user performance. Kirlik has proposed an Ecological Task Analysis framework which is designed for this purpose. It is the purpose of this research to measure this framework's effectiveness at enhancing display interfaces in order to improve user performance. Following the proposed framework, an ecological task analysis of experienced users of a complex and dynamic laboratory task, Star Cruiser, was performed. Based on this analysis, display enhancements were proposed and implemented. An experiment was then conducted to compare this new version of Star Cruiser to the original. By measuring user performance at different tasks, it was determined that during early sessions, use of the enhanced display contributed to better user performance compared to that achieved using the original display. Furthermore, the results indicate that the enhancements proposed as a result of the ecological task analysis affected user performance differently depending on whether they are enhancements which aid in the selection of a possible action or in the performance of an action. Generalizations of these findings to larger, more complex systems were avoided since the analysis was only performed on this one particular system.

  4. Investigating Users' Requirements

    PubMed Central

    Walker, Deborah S.; Lee, Wen-Yu; Skov, Neil M.; Berger, Carl F.; Athley, Brian D.

    2002-01-01

    Objective: User data and information about anatomy education were used to guide development of a learning environment that is efficient and effective. The research question focused on how to design instructional software suitable for the educational goals of different groups of users of the Visible Human data set. The ultimate goal of the study was to provide options for students and teachers to use different anatomy learning modules corresponding to key topics, for course work and professional training. Design: The research used both qualitative and quantitative methods. It was driven by the belief that good instructional design must address learning context information and pedagogic content information. The data collection emphasized measurement of users' perspectives, experience, and demands in anatomy learning. Measurement: Users' requirements elicited from 12 focus groups were combined and rated by 11 researchers. Collective data were sorted and analyzed by use of multidimensional scaling and cluster analysis. Results: A set of functions and features in high demand across all groups of users was suggested by the results. However, several subgroups of users shared distinct demands. The design of the learning modules will encompass both unified core components and user-specific applications. The design templates will allow sufficient flexibility for dynamic insertion of different learning applications for different users. Conclusion: This study describes how users' requirements, associated with users' learning experiences, were systematically collected and analyzed and then transformed into guidelines informing the iterative design of multiple learning modules. Information about learning challenges and processes was gathered to define essential anatomy teaching strategies. A prototype instrument to design and polish the Visible Human user interface system is currently being developed using ideas and feedback from users. PMID:12087112

  5. Privacy and data security in E-health: requirements from the user's perspective.

    PubMed

    Wilkowska, Wiktoria; Ziefle, Martina

    2012-09-01

    In this study two currently relevant aspects of using medical assistive technologies were addressed-security and privacy. In a two-step empirical approach that used focus groups (n = 19) and a survey (n = 104), users' requirements for the use of medical technologies were collected and evaluated. Specifically, we focused on the perceived importance of data security and privacy issues. Outcomes showed that both security and privacy aspects play an important role in the successful adoption of medical assistive technologies in the home environment. In particular, analysis of data with respect to gender, health-status and age (young, middle-aged and old users) revealed that females and healthy adults require, and insist on, the highest security and privacy standards compared with males and the ailing elderly.

  6. E-Textbooks and Connectivity: Proposing an Analytical Framework

    ERIC Educational Resources Information Center

    Gueudet, Ghislaine; Pepin, Birgit; Restrepo, Angela; Sabra, Hussein; Trouche, Luc

    2018-01-01

    This paper is concerned with the development of e-textbooks. We claim that analysis (and design) of e-textbooks requires the development of a specific frame. Digital affordances provide particular opportunities (e.g. in terms of interactions between users) that require specific considerations for their analysis, as teachers and students use them…

  7. EPPRD: An Efficient Privacy-Preserving Power Requirement and Distribution Aggregation Scheme for a Smart Grid

    PubMed Central

    Zhang, Lei; Zhang, Jing

    2017-01-01

    A Smart Grid (SG) facilitates bidirectional demand-response communication between individual users and power providers with high computation and communication performance but also brings about the risk of leaking users’ private information. Therefore, improving the individual power requirement and distribution efficiency to ensure communication reliability while preserving user privacy is a new challenge for SG. Based on this issue, we propose an efficient and privacy-preserving power requirement and distribution aggregation scheme (EPPRD) based on a hierarchical communication architecture. In the proposed scheme, an efficient encryption and authentication mechanism is proposed for better fit to each individual demand-response situation. Through extensive analysis and experiment, we demonstrate how the EPPRD resists various security threats and preserves user privacy while satisfying the individual requirement in a semi-honest model; it involves less communication overhead and computation time than the existing competing schemes. PMID:28783122

  8. IDEOM: an Excel interface for analysis of LC-MS-based metabolomics data.

    PubMed

    Creek, Darren J; Jankevics, Andris; Burgess, Karl E V; Breitling, Rainer; Barrett, Michael P

    2012-04-01

    The application of emerging metabolomics technologies to the comprehensive investigation of cellular biochemistry has been limited by bottlenecks in data processing, particularly noise filtering and metabolite identification. IDEOM provides a user-friendly data processing application that automates filtering and identification of metabolite peaks, paying particular attention to common sources of noise and false identifications generated by liquid chromatography-mass spectrometry (LC-MS) platforms. Building on advanced processing tools such as mzMatch and XCMS, it allows users to run a comprehensive pipeline for data analysis and visualization from a graphical user interface within Microsoft Excel, a familiar program for most biological scientists. IDEOM is provided free of charge at http://mzmatch.sourceforge.net/ideom.html, as a macro-enabled spreadsheet (.xlsb). Implementation requires Microsoft Excel (2007 or later). R is also required for full functionality. michael.barrett@glasgow.ac.uk Supplementary data are available at Bioinformatics online.

  9. Analysis and Development of Management Information Systems for Private Messes Afloat

    DTIC Science & Technology

    1988-03-01

    the development phase emphasis was placed on a three step approach starting with an analysis of the requirements as established by... oper - ating the mess divided by number of mess members Total Mess Bill Due Total of old bills, current bill, mess share owed, and special assessment 46...TRANSPARENCY THE SYSTEM BEHAVIOR IS TRANSPARENT TO THE USER. THAT MEANS THAT THE USER CAN DEVELOP A CONSISTENT MODEL OF THE SYSTEM WHEN WORKING

  10. SimHap GUI: An intuitive graphical user interface for genetic association analysis

    PubMed Central

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-01-01

    Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lacks a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877

  11. Comparing Text-based and Graphic User Interfaces for Novice and Expert Users

    PubMed Central

    Chen, Jung-Wei; Zhang, Jiajie

    2007-01-01

    Graphic User Interface (GUI) is commonly considered to be superior to Text-based User Interface (TUI). This study compares GUI and TUI in an electronic dental record system. Several usability analysis techniques compared the relative effectiveness of a GUI and a TUI. Expert users and novice users were evaluated in time required and steps needed to complete the task. A within-subject design was used to evaluate if the experience with either interface will affect task performance. The results show that the GUI interface was not better than the TUI for expert users. GUI interface was better for novice users. For novice users there was a learning transfer effect from TUI to GUI. This means a user interface is user-friendly or not depending on the mapping between the user interface and tasks. GUI by itself may or may not be better than TUI. PMID:18693811

  12. Comparing Text-based and Graphic User Interfaces for novice and expert users.

    PubMed

    Chen, Jung-Wei; Zhang, Jiajie

    2007-10-11

    Graphic User Interface (GUI) is commonly considered to be superior to Text-based User Interface (TUI). This study compares GUI and TUI in an electronic dental record system. Several usability analysis techniques compared the relative effectiveness of a GUI and a TUI. Expert users and novice users were evaluated in time required and steps needed to complete the task. A within-subject design was used to evaluate if the experience with either interface will affect task performance. The results show that the GUI interface was not better than the TUI for expert users. GUI interface was better for novice users. For novice users there was a learning transfer effect from TUI to GUI. This means a user interface is user-friendly or not depending on the mapping between the user interface and tasks. GUI by itself may or may not be better than TUI.

  13. Requirements for the implementation of schedule repair technology in the Experiment Scheduling Program

    NASA Technical Reports Server (NTRS)

    Bullington, Stanley F.

    1992-01-01

    The following list of requirements specifies the proposed revisions to the Experiment Scheduling Program (ESP2) which deal with schedule repair. These requirements are divided into those which are general in nature, those which relate to measurement and analysis functions of the software, those which relate specifically to conflict resolution, and those relating directly to the user interface. (This list is not a complete list of requirements for the user interface, but only a list of those schedule repair requirements which relate to the interface.) Some of the requirements relate only to uses of the software in real-time operations. Others are clearly for future versions of the software, beyond the upcoming revision. In either case, the fact will be clearly stated.

  14. Introduction to TAFI - A Matlab® toolbox for analysis of flexural isostasy

    NASA Astrophysics Data System (ADS)

    Jha, S.; Harry, D. L.; Schutt, D.

    2016-12-01

    The isostatic response of vertical tectonic loads emplaced on thin elastic plates overlying inviscid substrate and the corresponding gravity anomalies are commonly modeled using well established theories and methodologies of flexural analysis. However, such analysis requires some mathematical and coding expertise on part of users. With that in mind, we designed a new interactive Matlab® toolbox called Toolbox for Analysis of Flexural Isostasy (TAFI). TAFI allows users to create forward models (2-D and 3-D) of flexural deformation of the lithosphere and resulting gravity anomaly. TAFI computes Green's Functions for flexure of the elastic plate subjected to point or line loads, and analytical solution for harmonic loads. Flexure due to non-impulsive, distributed 2-D or 3-D loads are computed by convolving the appropriate Green's function with a user-supplied spatially discretized load function. The gravity anomaly associated with each density interface is calculated by using the Fourier Transform of flexural deflection of these interfaces and estimating the gravity in the wavenumber domain. All models created in TAFI are based on Matlab's intrinsic functions and do not require any specialized toolbox, function or library except those distributed with TAFI. Modeling functions within TAFI can be called from Matlab workspace, from within user written programs or from the TAFI's graphical user interface (GUI). The GUI enables the user to model the flexural deflection of lithosphere interactively, enabling real time comparison of model fit with observed data constraining the flexural deformation and gravity, facilitating rapid search for best fitting flexural model. TAFI is a very useful teaching and research tool and have been tested rigorously in graduate level teaching and basic research environment.

  15. Speechlinks: Robust Cross-Lingual Tactical Communication Aids

    DTIC Science & Technology

    2008-06-01

    domain, the ontology based translation has proven to be challenging to build in this domain, however recent developments show promising results...assignments, and the effect of domain knowledge on those requirements. • Improving the front end of the speech recognizer remains one of the most challenging ...users by being very selective. 4.2.3.2 Analysis of the Normal user type inference result Figure 4.11 shows one of the most challenging users to

  16. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    PubMed

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiment. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data is available and most analysis functions in GProX create customizable high quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  17. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data*

    PubMed Central

    Rigbolt, Kristoffer T. G.; Vanselow, Jens T.; Blagoev, Blagoy

    2011-01-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiment. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)1. The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data is available and most analysis functions in GProX create customizable high quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net. PMID:21602510

  18. Overview of ATLAS PanDA Workload Management

    NASA Astrophysics Data System (ADS)

    Maeno, T.; De, K.; Wenaus, T.; Nilsson, P.; Stewart, G. A.; Walker, R.; Stradling, A.; Caballero, J.; Potekhin, M.; Smith, D.; ATLAS Collaboration

    2011-12-01

    The Production and Distributed Analysis System (PanDA) plays a key role in the ATLAS distributed computing infrastructure. All ATLAS Monte-Carlo simulation and data reprocessing jobs pass through the PanDA system. We will describe how PanDA manages job execution on the grid using dynamic resource estimation and data replication together with intelligent brokerage in order to meet the scaling and automation requirements of ATLAS distributed computing. PanDA is also the primary ATLAS system for processing user and group analysis jobs, bringing further requirements for quick, flexible adaptation to the rapidly evolving analysis use cases of the early datataking phase, in addition to the high reliability, robustness and usability needed to provide efficient and transparent utilization of the grid for analysis users. We will describe how PanDA meets ATLAS requirements, the evolution of the system in light of operational experience, how the system has performed during the first LHC data-taking phase and plans for the future.

  19. Overview of ATLAS PanDA Workload Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maeno T.; De K.; Wenaus T.

    2011-01-01

    The Production and Distributed Analysis System (PanDA) plays a key role in the ATLAS distributed computing infrastructure. All ATLAS Monte-Carlo simulation and data reprocessing jobs pass through the PanDA system. We will describe how PanDA manages job execution on the grid using dynamic resource estimation and data replication together with intelligent brokerage in order to meet the scaling and automation requirements of ATLAS distributed computing. PanDA is also the primary ATLAS system for processing user and group analysis jobs, bringing further requirements for quick, flexible adaptation to the rapidly evolving analysis use cases of the early datataking phase, in additionmore » to the high reliability, robustness and usability needed to provide efficient and transparent utilization of the grid for analysis users. We will describe how PanDA meets ATLAS requirements, the evolution of the system in light of operational experience, how the system has performed during the first LHC data-taking phase and plans for the future.« less

  20. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirement; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.

  1. Managing Information On Technical Requirements

    NASA Technical Reports Server (NTRS)

    Mauldin, Lemuel E., III; Hammond, Dana P.

    1993-01-01

    Technical Requirements Analysis and Control Systems/Initial Operating Capability (TRACS/IOC) computer program provides supplemental software tools for analysis, control, and interchange of project requirements so qualified project members have access to pertinent project information, even if in different locations. Enables users to analyze and control requirements, serves as focal point for project requirements, and integrates system supporting efficient and consistent operations. TRACS/IOC is HyperCard stack for use on Macintosh computers running HyperCard 1.2 or later and Oracle 1.2 or later.

  2. Space Station commercial user development

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The commercial utilization of the space station is investigated. The interest of nonaerospace firms in the use of the space station is determined. The user requirements are compared to the space station's capabilities and a feasibility analysis of a commercial firm acting as an intermediary between NASA and the private sector to reduce costs is presented.

  3. Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.

    ERIC Educational Resources Information Center

    Jackson, Robert B.; And Others

    1995-01-01

    Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…

  4. Language workbench user interfaces for data analysis

    PubMed Central

    Benson, Victoria M.

    2015-01-01

    Biological data analysis is frequently performed with command line software. While this practice provides considerable flexibility for computationally savy individuals, such as investigators trained in bioinformatics, this also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to try and provide generic user interfaces that can wrap command line analysis software. These solutions are useful for problems that can be solved with workflows, and that do not require specialized user interfaces. However, some types of analyses can benefit from custom user interfaces. For instance, developing biomarker models from high-throughput data is a type of analysis that can be expressed more succinctly with specialized user interfaces. Here, we show how Language Workbench (LW) technology can be used to model the biomarker development and validation process. We developed a language that models the concepts of Dataset, Endpoint, Feature Selection Method and Classifier. These high-level language concepts map directly to abstractions that analysts who develop biomarker models are familiar with. We found that user interfaces developed in the Meta-Programming System (MPS) LW provide convenient means to configure a biomarker development project, to train models and view the validation statistics. We discuss several advantages of developing user interfaces for data analysis with a LW, including increased interface consistency, portability and extension by language composition. The language developed during this experiment is distributed as an MPS plugin (available at http://campagnelab.org/software/bdval-for-mps/). PMID:25755929

  5. Sentiment of Search: KM and IT for User Expectations

    NASA Technical Reports Server (NTRS)

    Berndt, Sarah Ann; Meza, David

    2014-01-01

    User perceived value is the number one indicator of a successful implementation of KM and IT collaborations. The system known as "Search" requires more strategy and workflow that a mere data dump or ungoverned infrastructure can provide. Monitoring of user sentiment can be a driver for providing objective measures of success and justifying changes to the user interface. The dynamic nature of information technology makes traditional usability metrics difficult to identify, yet easy to argue against. There is little disagreement, however, on the criticality of adapting to user needs and expectations. The Systems Usability Scale (SUS), developed by John Brook in 1986 has become an industry standard for usability engineering. The first phase of a modified SUS, polls the sentiment of representative users of the JSC Search system. This information can be used to correlate user determined value with types of information sought and how the system is (or is not) meeting expectations. Sentiment analysis by way of the SUS assists an organization in identification and prioritization of the KM and IT variables impacting user perceived value. A secondary, user group focused analysis is the topic of additional work that demonstrates the impact of specific changes dictated by user sentiment.

  6. CASKS (Computer Analysis of Storage casKS): A microcomputer based analysis system for storage cask design review. User`s manual to Version 1b (including program reference)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, T.F.; Gerhard, M.A.; Trummer, D.J.

    CASKS (Computer Analysis of Storage casKS) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent-fuel storage casks. The bulk of the complete program and this user`s manual are based upon the SCANS (Shipping Cask ANalysis System) program previously developed at LLNL. A number of enhancements and improvements were added to the original SCANS program to meet requirements unique to storage casks. CASKS is an easy-to-use system that calculates global response of storage casks to impact loads, pressure loads and thermal conditions. This provides reviewers withmore » a tool for an independent check on analyses submitted by licensees. CASKS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.« less

  7. Satellite services system analysis study. Volume 5: Programmatics

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The overall program and resources needed for development and operation of a Satellite Services System is reviewed. Program requirements covered system operations through 1993 and were completed in preliminary form. Program requirements were refined based on equipment preliminary design and analysis. Schedules, costs, equipment utilization, and facility/advanced technology requirements were included in the update. Equipment user charges were developed for each piece of equipment and for representative satellite servicing missions.

  8. Defining Requirements and Related Methods for Designing Sensorized Garments

    PubMed Central

    Andreoni, Giuseppe; Standoli, Carlo Emilio; Perego, Paolo

    2016-01-01

    Designing smart garments has strong interdisciplinary implications, specifically related to user and technical requirements, but also because of the very different applications they have: medicine, sport and fitness, lifestyle monitoring, workplace and job conditions analysis, etc. This paper aims to discuss some user, textile, and technical issues to be faced in sensorized clothes development. In relation to the user, the main requirements are anthropometric, gender-related, and aesthetical. In terms of these requirements, the user’s age, the target application, and fashion trends cannot be ignored, because they determine the compliance with the wearable system. Regarding textile requirements, functional factors—also influencing user comfort—are elasticity and washability, while more technical properties are the stability of the chemical agents’ effects for preserving the sensors’ efficacy and reliability, and assuring the proper duration of the product for the complete life cycle. From the technical side, the physiological issues are the most important: skin conductance, tolerance, irritation, and the effect of sweat and perspiration are key factors for reliable sensing. Other technical features such as battery size and duration, and the form factor of the sensor collector, should be considered, as they affect aesthetical requirements, which have proven to be crucial, as well as comfort and wearability. PMID:27240361

  9. Cytoscape: the network visualization tool for GenomeSpace workflows.

    PubMed

    Demchak, Barry; Hull, Tim; Reich, Michael; Liefeld, Ted; Smoot, Michael; Ideker, Trey; Mesirov, Jill P

    2014-01-01

    Modern genomic analysis often requires workflows incorporating multiple best-of-breed tools. GenomeSpace is a web-based visual workbench that combines a selection of these tools with mechanisms that create data flows between them. One such tool is Cytoscape 3, a popular application that enables analysis and visualization of graph-oriented genomic networks. As Cytoscape runs on the desktop, and not in a web browser, integrating it into GenomeSpace required special care in creating a seamless user experience and enabling appropriate data flows. In this paper, we present the design and operation of the Cytoscape GenomeSpace app, which accomplishes this integration, thereby providing critical analysis and visualization functionality for GenomeSpace users. It has been downloaded over 850 times since the release of its first version in September, 2013.

  10. Cytoscape: the network visualization tool for GenomeSpace workflows

    PubMed Central

    Demchak, Barry; Hull, Tim; Reich, Michael; Liefeld, Ted; Smoot, Michael; Ideker, Trey; Mesirov, Jill P.

    2014-01-01

    Modern genomic analysis often requires workflows incorporating multiple best-of-breed tools. GenomeSpace is a web-based visual workbench that combines a selection of these tools with mechanisms that create data flows between them. One such tool is Cytoscape 3, a popular application that enables analysis and visualization of graph-oriented genomic networks. As Cytoscape runs on the desktop, and not in a web browser, integrating it into GenomeSpace required special care in creating a seamless user experience and enabling appropriate data flows. In this paper, we present the design and operation of the Cytoscape GenomeSpace app, which accomplishes this integration, thereby providing critical analysis and visualization functionality for GenomeSpace users. It has been downloaded over 850 times since the release of its first version in September, 2013. PMID:25165537

  11. An assistive technology for hearing-impaired persons: analysis, requirements and architecture.

    PubMed

    Mielke, Matthias; Grunewald, Armin; Bruck, Rainer

    2013-01-01

    In this contribution, a concept of an assistive technology for hearing-impaired and deaf persons is presented. The concept applies pattern recognition algorithms and makes use of modern communication technology to analyze the acoustic environment around a user, identify critical acoustic signatures and give an alert to the user when an event of interest happened. A detailed analysis of the needs of deaf and hearing-impaired people has been performed. Requirements for an adequate assisting device have been derived from the results of the analysis, and have been turned into an architecture for its implementation that will be presented in this article. The presented concept is the basis for an assistive system which is now under development at the Institute of Microsystem Engineering at the University of Siegen.

  12. Vertical Guidance Performance Analysis of the L1–L5 Dual-Frequency GPS/WAAS User Avionics Sensor

    PubMed Central

    Jan, Shau-Shiun

    2010-01-01

    This paper investigates the potential vertical guidance performance of global positioning system (GPS)/wide area augmentation system (WAAS) user avionics sensor when the modernized GPS and Galileo are available. This paper will first investigate the airborne receiver code noise and multipath (CNMP) confidence (σair). The σair will be the dominant factor in the availability analysis of an L1–L5 dual-frequency GPS/WAAS user avionics sensor. This paper uses the MATLAB Algorithm Availability Simulation Tool (MAAST) to determine the required values for the σair, so that an L1–L5 dual-frequency GPS/WAAS user avionics sensor can meet the vertical guidance requirements of APproach with Vertical guidance (APV) II and CATegory (CAT) I over conterminous United States (CONUS). A modified MAAST that includes the Galileo satellite constellation is used to determine under what user configurations WAAS could be an APV II system or a CAT I system over CONUS. Furthermore, this paper examines the combinations of possible improvements in signal models and the addition of Galileo to determine if GPS/WAAS user avionics sensor could achieve 10 m Vertical Alert Limit (VAL) within the service volume. Finally, this paper presents the future vertical guidance performance of GPS user avionics sensor for the United States’ WAAS, Japanese MTSAT-based satellite augmentation system (MSAS) and European geostationary navigation overlay service (EGNOS). PMID:22319263

  13. Can existing mobile apps support healthier food purchasing behaviour? Content analysis of nutrition content, behaviour change theory and user quality integration.

    PubMed

    Flaherty, Sarah-Jane; McCarthy, Mary; Collins, Alan; McAuliffe, Fionnuala

    2018-02-01

    To assess the quality of nutrition content and the integration of user quality components and behaviour change theory relevant to food purchasing behaviour in a sample of existing mobile apps. Descriptive comparative analysis of eleven mobile apps comprising an assessment of their alignment with existing evidence on nutrition, behaviour change and user quality, and their potential ability to support healthier food purchasing behaviour. Mobile apps freely available for public use in GoogePlay were assessed and scored according to agreed criteria to assess nutrition content quality and integration of behaviour change theory and user quality components. A sample of eleven mobile apps that met predefined inclusion criteria to ensure relevance and good quality. The quality of the nutrition content varied. Improvements to the accuracy and appropriateness of nutrition content are needed to ensure mobile apps support a healthy behaviour change process and are accessible to a wider population. There appears to be a narrow focus towards behaviour change with an overemphasis on behavioural outcomes and a small number of behaviour change techniques, which may limit effectiveness. A significant effort from the user was required to use the mobile apps appropriately which may negatively influence user acceptability and subsequent utilisation. Existing mobile apps may offer a potentially effective approach to supporting healthier food purchasing behaviour but improvements in mobile app design are required to maximise their potential effectiveness. Engagement of mobile app users and nutrition professionals is recommended to support effective design.

  14. PASCO: Structural panel analysis and sizing code: Users manual - Revised

    NASA Technical Reports Server (NTRS)

    Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.

    1981-01-01

    A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.

  15. IAC user manual

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Beste, D. L.; Gregg, J.

    1984-01-01

    The User Manual for the Integrated Analysis Capability (IAC) Level 1 system is presented. The IAC system currently supports the thermal, structures, controls and system dynamics technologies, and its development is influenced by the requirements for design/analysis of large space systems. The system has many features which make it applicable to general problems in engineering, and to management of data and software. Information includes basic IAC operation, executive commands, modules, solution paths, data organization and storage, IAC utilities, and module implementation.

  16. Rapid, autonomous analysis of He spectra I: Overview of the RadID program, user experience, and structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gosnell, Thomas B.; Chavez, Joseph R.; Rowland, Mark S.

    2014-02-26

    RadID is a new gamma-ray spectrum analysis program for rapid screening of HPGe gamma-ray data to reveal the presence of radionuclide signatures. It is an autonomous, rule-based heuristic system that can identify well over 200 radioactive sources with particular interest in uranium and plutonium characteristics. It executes in about one second. RadID does not require knowledge of the detector efficiency, the source-to-detector distance, or the geometry of the inspected radiation source—including any shielding. In this first of a three-document series we sketch the RadID program’s origin, its minimal requirements, the user experience, and the program operation.

  17. Tradeoff analysis of technology needs for public service helicopters

    NASA Technical Reports Server (NTRS)

    Bauchspies, J. S.; Bryant, W. R., Jr.; Simpson, W. E.

    1985-01-01

    The design requirements for a family or type of Public Service Helicopter (PSH) is examined which will satisfy the needs of municipal and state governments in the following mission areas: Emergency Medical Service--Airborne Rescue Squad; Law Enforcement; Search and Rescue; and Environmental Control (Fire Fighting, Pollution, Resource Management). The report compares both design and performance requirements as specified by the PSH user's group against current technological capabilities, RTOPS and US Army LHX design requirements. The study explores various design trade-offs and options available to the aircraft designer/manufacturer in order to meet the several criteria specified by the PSH user's group. In addition, the report includes a brief assessment of the feasibility of employing certain advanced rotorcraft designs to meet the stringent combination of operational capabilities desired by the Public Service Helicopter Users.

  18. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed Central

    Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of users teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed notably regarding to its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation. PMID:12463921

  19. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed

    Staccini, Pascal M; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of users teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed notably regarding to its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation.

  20. The effects of 'ecstasy' (MDMA) on visuospatial memory performance: findings from a systematic review with meta-analyses.

    PubMed

    Murphy, Philip N; Bruno, Raimondo; Ryland, Ida; Wareing, Michele; Fisk, John E; Montgomery, Catharine; Hilton, Joanne

    2012-03-01

    To review, with meta-analyses where appropriate, performance differences between ecstasy (3,4-methylenedioxymethamphetamine) users and non-users on a wider range of visuospatial tasks than previously reviewed. Such tasks have been shown to draw upon working memory executive resources. Abstract databases were searched using the United Kingdom National Health Service Evidence Health Information Resource. Inclusion criteria were publication in English language peer-reviewed journals and the reporting of new findings regarding human ecstasy-users' performance on visuospatial tasks. Data extracted included specific task requirements to provide a basis for meta-analyses for categories of tasks with similar requirements. Fifty-two studies were identified for review, although not all were suitable for meta-analysis. Significant weighted mean effect sizes indicating poorer performance by ecstasy users compared with matched controls were found for tasks requiring recall of spatial stimulus elements, recognition of figures and production/reproduction of figures. There was no evidence of a linear relationship between estimated ecstasy consumption and effect sizes. Given the networked nature of processing for spatial and non-spatial visual information, future scanning and imaging studies should focus on brain activation differences between ecstasy users and non-users in the context of specific tasks to facilitate identification of loci of potentially compromised activity in users. Copyright © 2012 John Wiley & Sons, Ltd.

  1. ReMatch: a web-based tool to construct, store and share stoichiometric metabolic models with carbon maps for metabolic flux analysis.

    PubMed

    Pitkänen, Esa; Akerlund, Arto; Rantanen, Ari; Jouhten, Paula; Ukkonen, Esko

    2008-08-25

    ReMatch is a web-based, user-friendly tool that constructs stoichiometric network models for metabolic flux analysis, integrating user-developed models into a database collected from several comprehensive metabolic data resources, including KEGG, MetaCyc and CheBI. Particularly, ReMatch augments the metabolic reactions of the model with carbon mappings to facilitate (13)C metabolic flux analysis. The construction of a network model consisting of biochemical reactions is the first step in most metabolic modelling tasks. This model construction can be a tedious task as the required information is usually scattered to many separate databases whose interoperability is suboptimal, due to the heterogeneous naming conventions of metabolites in different databases. Another, particularly severe data integration problem is faced in (13)C metabolic flux analysis, where the mappings of carbon atoms from substrates into products in the model are required. ReMatch has been developed to solve the above data integration problems. First, ReMatch matches the imported user-developed model against the internal ReMatch database while considering a comprehensive metabolite name thesaurus. This, together with wild card support, allows the user to specify the model quickly without having to look the names up manually. Second, ReMatch is able to augment reactions of the model with carbon mappings, obtained either from the internal database or given by the user with an easy-touse tool. The constructed models can be exported into 13C-FLUX and SBML file formats. Further, a stoichiometric matrix and visualizations of the network model can be generated. The constructed models of metabolic networks can be optionally made available to the other users of ReMatch. Thus, ReMatch provides a common repository for metabolic network models with carbon mappings for the needs of metabolic flux analysis community. ReMatch is freely available for academic use at http://www.cs.helsinki.fi/group/sysfys/software/rematch/.

  2. Improving the quality of e-commerce web service: what is important for the request scheduling algorithm?

    NASA Astrophysics Data System (ADS)

    Suchacka, Grazyna

    2005-02-01

    The paper concerns a new research area that is Quality of Web Service (QoWS). The need for QoWS is motivated by a still growing number of Internet users, by a steady development and diversification of Web services, and especially by popularization of e-commerce applications. The goal of the paper is a critical analysis of the literature concerning scheduling algorithms for e-commerce Web servers. The paper characterizes factors affecting the load of the Web servers and discusses ways of improving their efficiency. Crucial QoWS requirements of the business Web server are identified: serving requests before their individual deadlines, supporting user session integrity, supporting different classes of users and minimizing a number of rejected requests. It is justified that meeting these requirements and implementing them in an admission control (AC) and scheduling algorithm for the business Web server is crucial to the functioning of e-commerce Web sites and revenue generated by them. The paper presents results of the literature analysis and discusses algorithms that implement these important QoWS requirements. The analysis showed that very few algorithms take into consideration the above mentioned factors and that there is a need for designing an algorithm implementing them.

  3. Are Password Management Applications Viable? An Analysis of User Training and Reactions

    ERIC Educational Resources Information Center

    Ciampa, Mark

    2011-01-01

    Passwords have the distinction of being the most widely-used form of authentication--and the most vulnerable. With the dramatic increase today in the number of accounts that require passwords, overwhelmed users usually resort to creating weak passwords or reusing the same password for multiple accounts, thus making passwords the weakest link in…

  4. iSeq: Web-Based RNA-seq Data Analysis and Visualization.

    PubMed

    Zhang, Chao; Fan, Caoqi; Gan, Jingbo; Zhu, Ping; Kong, Lei; Li, Cheng

    2018-01-01

    Transcriptome sequencing (RNA-seq) is becoming a standard experimental methodology for genome-wide characterization and quantification of transcripts at single base-pair resolution. However, downstream analysis of massive amount of sequencing data can be prohibitively technical for wet-lab researchers. A functionally integrated and user-friendly platform is required to meet this demand. Here, we present iSeq, an R-based Web server, for RNA-seq data analysis and visualization. iSeq is a streamlined Web-based R application under the Shiny framework, featuring a simple user interface and multiple data analysis modules. Users without programming and statistical skills can analyze their RNA-seq data and construct publication-level graphs through a standardized yet customizable analytical pipeline. iSeq is accessible via Web browsers on any operating system at http://iseq.cbi.pku.edu.cn .

  5. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    PubMed Central

    2014-01-01

    Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extend the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312

  6. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    PubMed

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extend the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  7. VDJServer: A Cloud-Based Analysis Portal and Data Commons for Immune Repertoire Sequences and Rearrangements.

    PubMed

    Christley, Scott; Scarborough, Walter; Salinas, Eddie; Rounds, William H; Toby, Inimary T; Fonner, John M; Levin, Mikhail K; Kim, Min; Mock, Stephen A; Jordan, Christopher; Ostmeyer, Jared; Buntzman, Adam; Rubelt, Florian; Davila, Marco L; Monson, Nancy L; Scheuermann, Richard H; Cowell, Lindsay G

    2018-01-01

    Recent technological advances in immune repertoire sequencing have created tremendous potential for advancing our understanding of adaptive immune response dynamics in various states of health and disease. Immune repertoire sequencing produces large, highly complex data sets, however, which require specialized methods and software tools for their effective analysis and interpretation. VDJServer is a cloud-based analysis portal for immune repertoire sequence data that provide access to a suite of tools for a complete analysis workflow, including modules for preprocessing and quality control of sequence reads, V(D)J gene segment assignment, repertoire characterization, and repertoire comparison. VDJServer also provides sophisticated visualizations for exploratory analysis. It is accessible through a standard web browser via a graphical user interface designed for use by immunologists, clinicians, and bioinformatics researchers. VDJServer provides a data commons for public sharing of repertoire sequencing data, as well as private sharing of data between users. We describe the main functionality and architecture of VDJServer and demonstrate its capabilities with use cases from cancer immunology and autoimmunity. VDJServer provides a complete analysis suite for human and mouse T-cell and B-cell receptor repertoire sequencing data. The combination of its user-friendly interface and high-performance computing allows large immune repertoire sequencing projects to be analyzed with no programming or software installation required. VDJServer is a web-accessible cloud platform that provides access through a graphical user interface to a data management infrastructure, a collection of analysis tools covering all steps in an analysis, and an infrastructure for sharing data along with workflows, results, and computational provenance. VDJServer is a free, publicly available, and open-source licensed resource.

  8. Beyond the management and dissemination of projects' results: stakeholders involvement and project co-design

    NASA Astrophysics Data System (ADS)

    L'Astorina, Alba; Tomasoni, Irene; Basoni, Anna; Carrara, Paola

    2015-04-01

    Nowadays scientists are asked to undertake innovative and participative approaches in communicating the results of researches carried out within the various funding programs. In particular, Workpackages of Dissemination and Exploitation of results are considered mandatory at the European level helping the Project Management find innovative knowledge transfer strategies and enhance the project outcomes and impact. In this context, the involvement of stakeholders or users in research projects won a well-defined role and recent trends, in some cases, push to co-design research projects involving a pool of stakeholders or users in them from its first steps. Horizon 2020, the new EU Framework Programme for Innovation and Community financing system (2014-2020), moves clearly in this direction. CNR has an extensive experience in this kind of activity, both at the national and the international level and, in some cases, involve users and analyse their expectations using qualitative and quantitative surveys thus recognizing the role of users as research co-actors. Often products and services derived from research lack of attractiveness among enterprises, the Public Administrations and citizens, are due to the lack of an appropriate knowledge and consideration of the needs and requirements of such users. This paper intends to illustrate a case study where the analysis of the needs and requirements of the users were included in a specific Workpackage collaborating so both with the Project management and the Workpackage of Dissemination. The analysis were conducted within an ongoing project at CNR, i.e. Space4Agri (S4A): Development of Innovative Methodologies Aerospace Earth Observation in Support of the Sector agriculture in Lombardy. The main purpose of S4A is to contribute to the development of tools to improve the ability of the regional system in the planning and management of the agricultural sector Lombard, combining three domains that is scientific and technical areas, namely the remote observations from satellite, aircraft technologies for UAVs and Internet technologies 2.0 for smart exchange of data. The methodology for collecting user requirements was recursive. Once identified target users, their "external needs" were investigated through qualitative tools such as semi - structured interviews. Thanks to the information provided by respondents subsequent deeper interviews were conducted from which additional requirements, such as further case studies and other beneficiaries were derived. During the process, also a second category of requirements, called "internal" emerged, derived from the mutual interactions between the domains of the project. The collection of requirements took more iterations, the results of which were summarized showing the expected contributions, products and by-products from stakeholders; starting from these elements each domain of the project reconstructed its state of the art in order to set methods and plan a work flow in a manner as close as possible to the needs of regional partners. The methodological issues involved both external and internal factors and stripped the complexity involved in the analysis of user needs in multi-domain, highlighting critical issues and operational difficulties but also providing interesting ideas for future developments.

  9. User modeling for distributed virtual environment intelligent agents

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    1999-07-01

    This paper emphasizes the requirement for user modeling by presenting the necessary information to motivate the need for and use of user modeling for intelligent agent development. The paper will present information on our current intelligent agent development program, the Symbiotic Information Reasoning and Decision Support (SIRDS) project. We then discuss the areas of intelligent agents and user modeling, which form the foundation of the SIRDS project. Included in the discussion of user modeling are its major components, which are cognitive modeling and behavioral modeling. We next motivate the need for and user of a methodology to develop user models to encompass work within cognitive task analysis. We close the paper by drawing conclusions from our current intelligent agent research project and discuss avenues of future research in the utilization of user modeling for the development of intelligent agents for virtual environments.

  10. Numerical analysis of stiffened shells of revolution. Volume 3: Users' manual for STARS-2B, 2V, shell theory automated for rotational structures, 2 (buckling, vibrations), digital computer programs

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.

    1973-01-01

    The User's manual for the shell theory automated for rotational structures (STARS) 2B and 2V (buckling, vibrations) is presented. Several features of the program are: (1) arbitrary branching of the shell meridians, (2) arbitrary boundary conditions, (3) minimum input requirements to describe a complex, practical shell of revolution structure, and (4) accurate analysis capability using a minimum number of degrees of freedom.

  11. Thermal APU/hydraulics analysis program. User's guide and programmer's manual

    NASA Technical Reports Server (NTRS)

    Deluna, T. A.

    1976-01-01

    The User's Guide information plus program description necessary to run and have a general understanding of the Thermal APU/Hydraulics Analysis Program (TAHAP) is described. This information consists of general descriptions of the APU/hydraulic system and the TAHAP model, input and output data descriptions, and specific subroutine requirements. Deck setups and input data formats are included and other necessary and/or helpful information for using TAHAP is given. The math model descriptions for the driver program and each of its supporting subroutines are outlined.

  12. An operational satellite scatterometer for wind vector measurements over the ocean

    NASA Technical Reports Server (NTRS)

    Grantham, W. L.; Bracalente, E. M.; Jones, W. L.; Schrader, J. H.; Schroeder, L. C.; Mitchell, J. L.

    1975-01-01

    Performance requirements and design characteristics of a microwave scatterometer wind sensor for measuring surface winds over the oceans on a global basis are described. Scatterometer specifications are developed from user requirements of wind vector measurement range and accuracy, swath width, resolution cell size and measurement grid spacing. A detailed analysis is performed for a baseline fan-beam scatterometer design, and its performance capabilities for meeting the SeaSat-A user requirements. Various modes of operation are discussed which will allow the resolution of questions concerning the effects of sea state on the scatterometer wind sensing ability and to verify design boundaries of the instrument.

  13. System requirements for a computerised patient record information system at a busy primary health care clinic.

    PubMed

    Blignaut, P J; McDonald, T; Tolmie, C J

    2001-05-01

    A prototyping approach was used to determine the essential system requirements of a computerised patient record information system for a typical township primary health care clinic. A pilot clinic was identified and the existing manual system and business processes in this clinic was studied intensively before the first prototype was implemented. Interviews with users, incidental observations and analysis of actual data entered were used as primary techniques to refine the prototype system iteratively until a system with an acceptable data set and adequate functionalities were in place. Several non-functional and user-related requirements were also discovered during the prototyping period.

  14. Personas in online health communities.

    PubMed

    Huh, Jina; Kwon, Bum Chul; Kim, Sung-Hee; Lee, Sukwon; Choo, Jaegul; Kim, Jihoon; Choi, Min-Je; Yi, Ji Soo

    2016-10-01

    Many researchers and practitioners use online health communities (OHCs) to influence health behavior and provide patients with social support. One of the biggest challenges in this approach, however, is the rate of attrition. OHCs face similar problems as other social media platforms where user migration happens unless tailored content and appropriate socialization is supported. To provide tailored support for each OHC user, we developed personas in OHCs illustrating users' needs and requirements in OHC use. To develop OHC personas, we first interviewed 16 OHC users and administrators to qualitatively understand varying user needs in OHC. Based on their responses, we developed an online survey to systematically investigate OHC personas. We received 184 survey responses from OHC users, which informed their values and their OHC use patterns. We performed open coding analysis with the interview data and cluster analysis with the survey data and consolidated the analyses of the two datasets. Four personas emerged-Caretakers, Opportunists, Scientists, and Adventurers. The results inform users' interaction behavior and attitude patterns with OHCs. We discuss implications for how these personas inform OHCs in delivering personalized informational and emotional support. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Generating a Corpus of Mobile Forensic Images for Masquerading user Experimentation.

    PubMed

    Guido, Mark; Brooks, Marc; Grover, Justin; Katz, Eric; Ondricek, Jared; Rogers, Marcus; Sharpe, Lauren

    2016-11-01

    The Periodic Mobile Forensics (PMF) system investigates user behavior on mobile devices. It applies forensic techniques to an enterprise mobile infrastructure, utilizing an on-device agent named TractorBeam. The agent collects changed storage locations for later acquisition, reconstruction, and analysis. TractorBeam provides its data to an enterprise infrastructure that consists of a cloud-based queuing service, relational database, and analytical framework for running forensic processes. During a 3-month experiment with Purdue University, TractorBeam was utilized in a simulated operational setting across 34 users to evaluate techniques to identify masquerading users (i.e., users other than the intended device user). The research team surmises that all masqueraders are undesirable to an enterprise, even when a masquerader lacks malicious intent. The PMF system reconstructed 821 forensic images, extracted one million audit events, and accurately detected masqueraders. Evaluation revealed that developed methods reduced storage requirements 50-fold. This paper describes the PMF architecture, performance of TractorBeam throughout the protocol, and results of the masquerading user analysis. © 2016 American Academy of Forensic Sciences.

  16. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  17. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arndt, S.A.

    1997-07-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for codemore » use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities.« less

  18. Agricultural aviation user requirement priorities

    NASA Technical Reports Server (NTRS)

    Kaplan, R. L.; Meeland, T.; Peterson, J. E.

    1977-01-01

    The results are given of a research project pertaining to the development of agricultural aviation user requirement priorities. The raw data utilized in the project was obtained from the National Agricultural Aviation Association. A specially configured poll, developed by the Actuarial Research Corporation was used to solicit responses from NAAA members and others. The primary product of the poll is the specification of seriousness as determined by the respondents for some selected agricultural aviation problem areas identified and defined during the course of an intensive analysis by the Actuarial Research Corporation.

  19. User needs, benefits and integration of robotic systems in a space station laboratory

    NASA Technical Reports Server (NTRS)

    Farnell, K. E.; Richard, J. A.; Ploge, E.; Badgley, M. B.; Konkel, C. R.; Dodd, W. R.

    1989-01-01

    The methodology, results and conclusions of the User Needs, Benefits, and Integration Study (UNBIS) of Robotic Systems in the Space Station Microgravity and Materials Processing Facility are summarized. Study goals include the determination of user requirements for robotics within the Space Station, United States Laboratory. Three experiments were selected to determine user needs and to allow detailed investigation of microgravity requirements. A NASTRAN analysis of Space Station response to robotic disturbances, and acceleration measurement of a standard industrial robot (Intelledex Model 660) resulted in selection of two ranges of low gravity manipulation: Level 1 (10-3 to 10-5 G at greater than 1 Hz.) and Level 2 (less than = 10-6 G at 0.1 Hz). This included an evaluation of microstepping methods for controlling stepper motors and concluded that an industrial robot actuator can perform milli-G motion without modification. Relative merits of end-effectors and manipulators were studied in order to determine their ability to perform a range of tasks related to the three low gravity experiments. An Effectivity Rating was established for evaluating these robotic system capabilities. Preliminary interface requirements were determined such that definition of requirements for an orbital flight demonstration experiment may be established.

  20. Image-based mobile service: automatic text extraction and translation

    NASA Astrophysics Data System (ADS)

    Berclaz, Jérôme; Bhatti, Nina; Simske, Steven J.; Schettino, John C.

    2010-01-01

    We present a new mobile service for the translation of text from images taken by consumer-grade cell-phone cameras. Such capability represents a new paradigm for users where a simple image provides the basis for a service. The ubiquity and ease of use of cell-phone cameras enables acquisition and transmission of images anywhere and at any time a user wishes, delivering rapid and accurate translation over the phone's MMS and SMS facilities. Target text is extracted completely automatically, requiring no bounding box delineation or related user intervention. The service uses localization, binarization, text deskewing, and optical character recognition (OCR) in its analysis. Once the text is translated, an SMS message is sent to the user with the result. Further novelties include that no software installation is required on the handset, any service provider or camera phone can be used, and the entire service is implemented on the server side.

  1. GLINT: a user-friendly toolset for the analysis of high-throughput DNA-methylation array data.

    PubMed

    Rahmani, Elior; Yedidim, Reut; Shenhav, Liat; Schweiger, Regev; Weissbrod, Omer; Zaitlen, Noah; Halperin, Eran

    2017-06-15

    GLINT is a user-friendly command-line toolset for fast analysis of genome-wide DNA methylation data generated using the Illumina human methylation arrays. GLINT, which does not require any programming proficiency, allows an easy execution of Epigenome-Wide Association Study analysis pipeline under different models while accounting for known confounders in methylation data. GLINT is a command-line software, freely available at https://github.com/cozygene/glint/releases . It requires Python 2.7 and several freely available Python packages. Further information and documentation as well as a quick start tutorial are available at http://glint-epigenetics.readthedocs.io . elior.rahmani@gmail.com or ehalperin@cs.ucla.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  2. Web accessibility support for visually impaired users using link content analysis.

    PubMed

    Iwata, Hajime; Kobayashi, Naofumi; Tachibana, Kenji; Shirogane, Junko; Fukazawa, Yoshiaki

    2013-12-01

    Web pages are used for a variety of purposes. End users must understand dynamically changing content and sequentially follow page links to find desired material, requiring significant time and effort. However, for visually impaired users using screen readers, it can be difficult to find links to web pages when link text and alternative text descriptions are inappropriate. Our method supports the discovery of content by analyzing 8 categories of link types, and allows visually impaired users to be aware of the content represented by links in advance. This facilitates end users access to necessary information on web pages. Our method of classifying web page links is therefore effective as a means of evaluating accessibility.

  3. Candidate Mission from Planet Earth control and data delivery system architecture

    NASA Technical Reports Server (NTRS)

    Shapiro, Phillip; Weinstein, Frank C.; Hei, Donald J., Jr.; Todd, Jacqueline

    1992-01-01

    Using a structured, experienced-based approach, Goddard Space Flight Center (GSFC) has assessed the generic functional requirements for a lunar mission control and data delivery (CDD) system. This analysis was based on lunar mission requirements outlined in GSFC-developed user traffic models. The CDD system will facilitate data transportation among user elements, element operations, and user teams by providing functions such as data management, fault isolation, fault correction, and link acquisition. The CDD system for the lunar missions must not only satisfy lunar requirements but also facilitate and provide early development of data system technologies for Mars. Reuse and evolution of existing data systems can help to maximize system reliability and minimize cost. This paper presents a set of existing and currently planned NASA data systems that provide the basic functionality. Reuse of such systems can have an impact on mission design and significantly reduce CDD and other system development costs.

  4. An Analysis of an Automatic Coolant Bypass in the International Space Station Node 2 Internal Active Thermal Control System

    NASA Technical Reports Server (NTRS)

    Clanton, Stephen E.; Holt, James M.; Turner, Larry D. (Technical Monitor)

    2001-01-01

    A challenging part of International Space Station (ISS) thermal control design is the ability to incorporate design changes into an integrated system without negatively impacting performance. The challenge presents itself in that the typical ISS Internal Active Thermal Control System (IATCS) consists of an integrated hardware/software system that provides active coolant resources to a variety of users. Software algorithms control the IATCS to specific temperatures, flow rates, and pressure differentials in order to meet the user-defined requirements. What may seem to be small design changes imposed on the system may in fact result in system instability or the temporary inability to meet user requirements. The purpose of this paper is to provide a brief description of the solution process and analyses used to implement one such design change that required the incorporation of an automatic coolant bypass in the ISS Node 2 element.

  5. Using a data base management system for modelling SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1985-01-01

    The usefulness of a data base management system (DBMS) for modelling historical test data for the complete series of static test firings for the Space Shuttle Main Engine (SSME) was assessed. From an analysis of user data base query requirements, it became clear that a relational DMBS which included a relationally complete query language would permit a model satisfying the query requirements. Representative models and sample queries are discussed. A list of environment-particular evaluation criteria for the desired DBMS was constructed; these criteria include requirements in the areas of user-interface complexity, program independence, flexibility, modifiability, and output capability. The evaluation process included the construction of several prototype data bases for user assessement. The systems studied, representing the three major DBMS conceptual models, were: MIRADS, a hierarchical system; DMS-1100, a CODASYL-based network system; ORACLE, a relational system; and DATATRIEVE, a relational-type system.

  6. An Expert Assistant for Computer Aided Parallelization

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Chun, Robert; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    The prototype implementation of an expert system was developed to assist the user in the computer aided parallelization process. The system interfaces to tools for automatic parallelization and performance analysis. By fusing static program structure information and dynamic performance analysis data the expert system can help the user to filter, correlate, and interpret the data gathered by the existing tools. Sections of the code that show poor performance and require further attention are rapidly identified and suggestions for improvements are presented to the user. In this paper we describe the components of the expert system and discuss its interface to the existing tools. We present a case study to demonstrate the successful use in full scale scientific applications.

  7. A study and experiment plan for digital mobile communication via satellite

    NASA Technical Reports Server (NTRS)

    Jones, J. J.; Craighill, E. J.; Evans, R. G.; Vincze, A. D.; Tom, N. N.

    1978-01-01

    The viability of mobile communications is examined within the context of a frequency division multiple access, single channel per carrier satellite system emphasizing digital techniques to serve a large population of users. The intent is to provide the mobile users with a grade of service consistant with the requirements for remote, rural (perhaps emergency) voice communications, but which approaches toll quality speech. A traffic model is derived on which to base the determination of the required maximum number of satellite channels to provide the anticipated level of service. Various voice digitalization and digital modulation schemes are reviewed along with a general link analysis of the mobile system. Demand assignment multiple access considerations and analysis tradeoffs are presented. Finally, a completed configuration is described.

  8. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.

  9. Design for interaction between humans and intelligent systems during real-time fault management

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra L.; Thronesbery, Carroll G.

    1992-01-01

    Initial results are reported to provide guidance and assistance for designers of intelligent systems and their human interfaces. The objective is to achieve more effective human-computer interaction (HCI) for real time fault management support systems. Studies of the development of intelligent fault management systems within NASA have resulted in a new perspective of the user. If the user is viewed as one of the subsystems in a heterogeneous, distributed system, system design becomes the design of a flexible architecture for accomplishing system tasks with both human and computer agents. HCI requirements and design should be distinguished from user interface (displays and controls) requirements and design. Effective HCI design for multi-agent systems requires explicit identification of activities and information that support coordination and communication between agents. The effects are characterized of HCI design on overall system design and approaches are identified to addressing HCI requirements in system design. The results include definition of (1) guidance based on information level requirements analysis of HCI, (2) high level requirements for a design methodology that integrates the HCI perspective into system design, and (3) requirements for embedding HCI design tools into intelligent system development environments.

  10. Development and analysis of SCR requirements tables for system scenarios

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Morrison, Jeffery L.

    1995-01-01

    We describe the use of scenarios to develop and refine requirement tables for parts of the Earth Observing System Data and Information System (EOSDIS). The National Aeronautics and Space Administration (NASA) is developing EOSDIS as part of its Mission-To-Planet-Earth (MTPE) project to accept instrument/platform observation requests from end-user scientists, schedule and perform requested observations of the Earth from space, collect and process the observed data, and distribute data to scientists and archives. Current requirements for the system are managed with tools that allow developers to trace the relationships between requirements and other development artifacts, including other requirements. In addition, the user community (e.g., earth and atmospheric scientists), in conjunction with NASA, has generated scenarios describing the actions of EOSDIS subsystems in response to user requests and other system activities. As part of a research effort in verification and validation techniques, this paper describes our efforts to develop requirements tables from these scenarios for the EOSDIS Core System (ECS). The tables specify event-driven mode transitions based on techniques developed by the Naval Research Lab's (NRL) Software Cost Reduction (SCR) project. The SCR approach has proven effective in specifying requirements for large systems in an unambiguous, terse format that enhance identification of incomplete and inconsistent requirements. We describe development of SCR tables from user scenarios and identify the strengths and weaknesses of our approach in contrast to the requirements tracing approach. We also evaluate the capabilities of both approach to respond to the volatility of requirements in large, complex systems.

  11. User's manual for the Shuttle Electric Power System analysis computer program (SEPS), volume 2 of program documentation

    NASA Technical Reports Server (NTRS)

    Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.

    1974-01-01

    The Shuttle Electric Power System Analysis SEPS computer program which performs detailed load analysis including predicting energy demands and consumables requirements of the shuttle electric power system along with parameteric and special case studies on the shuttle electric power system is described. The functional flow diagram of the SEPS program is presented along with data base requirements and formats, procedure and activity definitions, and mission timeline input formats. Distribution circuit input and fixed data requirements are included. Run procedures and deck setups are described.

  12. Operation of controls on consumer products by physically impaired users.

    PubMed

    Kanis, H

    1993-06-01

    The self-reliance of the physically impaired can be seriously jeopardized by their inability to operate everyday products, especially if both upper extremities are impaired. To determine the difficulties impaired users encounter in operating consumer product controls, on-site video recordings were made of subjects suffering from arthritis or a muscular disease. Subjects' force exertion was compared with that of a group of nonimpaired users. The resulting inventory allowed the analysis of the manipulation problems faced by impaired subjects and the development of design recommendations. In this study the force exerted by the subjects and that required to operate the controls were measured. A comparison of the results of these force measurements led to a number of conclusions. This study led to the following design recommendations: the amount of force required to operate controls should be kept as low as possible; the user should not be required to make two manipulations at the same time, such as simultaneously pushing and rotating a control device; pushing is preferable to rotating; and there should be a great degree of freedom to manipulate controls.

  13. User's manual for rocket combustor interactive design (ROCCID) and analysis computer program. Volume 2: Appendixes A-K

    NASA Technical Reports Server (NTRS)

    Muss, J. A.; Nguyen, T. V.; Johnson, C. W.

    1991-01-01

    The appendices A-K to the user's manual for the rocket combustor interactive design (ROCCID) computer program are presented. This includes installation instructions, flow charts, subroutine model documentation, and sample output files. The ROCCID program, written in Fortran 77, provides a standardized methodology using state of the art codes and procedures for the analysis of a liquid rocket engine combustor's steady state combustion performance and combustion stability. The ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like doublet or unlike triplet, showerhead, shear coaxial and swirl coaxial elements as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID. The properties of other propellants can be easily added. The analysis models in ROCCID can account for the influences of acoustic cavities, helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady state performance and combustion stability of this design is evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.

  14. AQUATOX Fact Sheet

    EPA Pesticide Factsheets

    AQUATOX Release 3.1 includes numerous enhancements designed to improve model performance, more closely match data requirements with generally available data, improve data manipulation and analysis, and increase user friendliness.

  15. Development of Web Interfaces for Analysis Codes

    NASA Astrophysics Data System (ADS)

    Emoto, M.; Watanabe, T.; Funaba, H.; Murakami, S.; Nagayama, Y.; Kawahata, K.

    Several codes have been developed to analyze plasma physics. However, most of them are developed to run on supercomputers. Therefore, users who typically use personal computers (PCs) find it difficult to use these codes. In order to facilitate the widespread use of these codes, a user-friendly interface is required. The authors propose Web interfaces for these codes. To demonstrate the usefulness of this approach, the authors developed Web interfaces for two analysis codes. One of them is for FIT developed by Murakami. This code is used to analyze the NBI heat deposition, etc. Because it requires electron density profiles, electron temperatures, and ion temperatures as polynomial expressions, those unfamiliar with the experiments find it difficult to use this code, especially visitors from other institutes. The second one is for visualizing the lines of force in the LHD (large helical device) developed by Watanabe. This code is used to analyze the interference caused by the lines of force resulting from the various structures installed in the vacuum vessel of the LHD. This code runs on PCs; however, it requires that the necessary parameters be edited manually. Using these Web interfaces, users can execute these codes interactively.

  16. LHCb trigger streams optimization

    NASA Astrophysics Data System (ADS)

    Derkach, D.; Kazeev, N.; Neychev, R.; Panin, A.; Trofimov, I.; Ustyuzhanin, A.; Vesterinen, M.

    2017-10-01

    The LHCb experiment stores around 1011 collision events per year. A typical physics analysis deals with a final sample of up to 107 events. Event preselection algorithms (lines) are used for data reduction. Since the data are stored in a format that requires sequential access, the lines are grouped into several output file streams, in order to increase the efficiency of user analysis jobs that read these data. The scheme efficiency heavily depends on the stream composition. By putting similar lines together and balancing the stream sizes it is possible to reduce the overhead. We present a method for finding an optimal stream composition. The method is applied to a part of the LHCb data (Turbo stream) on the stage where it is prepared for user physics analysis. This results in an expected improvement of 15% in the speed of user analysis jobs, and will be applied on data to be recorded in 2017.

  17. User's manual for GAMNAS: Geometric and Material Nonlinear Analysis of Structures

    NASA Technical Reports Server (NTRS)

    Whitcomb, J. D.; Dattaguru, B.

    1984-01-01

    GAMNAS (Geometric and Material Nonlinear Analysis of Structures) is a two dimensional finite-element stress analysis program. Options include linear, geometric nonlinear, material nonlinear, and combined geometric and material nonlinear analysis. The theory, organization, and use of GAMNAS are described. Required input data and results for several sample problems are included.

  18. Web-TCGA: an online platform for integrated analysis of molecular cancer data sets.

    PubMed

    Deng, Mario; Brägelmann, Johannes; Schultze, Joachim L; Perner, Sven

    2016-02-06

    The Cancer Genome Atlas (TCGA) is a pool of molecular data sets publicly accessible and freely available to cancer researchers anywhere around the world. However, wide spread use is limited since an advanced knowledge of statistics and statistical software is required. In order to improve accessibility we created Web-TCGA, a web based, freely accessible online tool, which can also be run in a private instance, for integrated analysis of molecular cancer data sets provided by TCGA. In contrast to already available tools, Web-TCGA utilizes different methods for analysis and visualization of TCGA data, allowing users to generate global molecular profiles across different cancer entities simultaneously. In addition to global molecular profiles, Web-TCGA offers highly detailed gene and tumor entity centric analysis by providing interactive tables and views. As a supplement to other already available tools, such as cBioPortal (Sci Signal 6:pl1, 2013, Cancer Discov 2:401-4, 2012), Web-TCGA is offering an analysis service, which does not require any installation or configuration, for molecular data sets available at the TCGA. Individual processing requests (queries) are generated by the user for mutation, methylation, expression and copy number variation (CNV) analyses. The user can focus analyses on results from single genes and cancer entities or perform a global analysis (multiple cancer entities and genes simultaneously).

  19. Using Airborne Remote Sensing to Increase Situational Awareness in Civil Protection and Humanitarian Relief - the Importance of User Involvement

    NASA Astrophysics Data System (ADS)

    Römer, H.; Kiefl, R.; Henkel, F.; Wenxi, C.; Nippold, R.; Kurz, F.; Kippnich, U.

    2016-06-01

    Enhancing situational awareness in real-time (RT) civil protection and emergency response scenarios requires the development of comprehensive monitoring concepts combining classical remote sensing disciplines with geospatial information science. In the VABENE++ project of the German Aerospace Center (DLR) monitoring tools are being developed by which innovative data acquisition approaches are combined with information extraction as well as the generation and dissemination of information products to a specific user. DLR's 3K and 4k camera system which allow for a RT acquisition and pre-processing of high resolution aerial imagery are applied in two application examples conducted with end users: a civil protection exercise with humanitarian relief organisations and a large open-air music festival in cooperation with a festival organising company. This study discusses how airborne remote sensing can significantly contribute to both, situational assessment and awareness, focussing on the downstream processes required for extracting information from imagery and for visualising and disseminating imagery in combination with other geospatial information. Valuable user feedback and impetus for further developments has been obtained from both applications, referring to innovations in thematic image analysis (supporting festival site management) and product dissemination (editable web services). Thus, this study emphasises the important role of user involvement in application-related research, i.e. by aligning it closer to user's requirements.

  20. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.

  1. Towards an EO-based Landslide Web Mapping and Monitoring Service

    NASA Astrophysics Data System (ADS)

    Hölbling, Daniel; Weinke, Elisabeth; Albrecht, Florian; Eisank, Clemens; Vecchiotti, Filippo; Friedl, Barbara; Kociu, Arben

    2017-04-01

    National and regional authorities and infrastructure maintainers in mountainous regions require accurate knowledge of the location and spatial extent of landslides for hazard and risk management. Information on landslides is often collected by a combination of ground surveying and manual image interpretation following landslide triggering events. However, the high workload and limited time for data acquisition result in a trade-off between completeness, accuracy and detail. Remote sensing data offers great potential for mapping and monitoring landslides in a fast and efficient manner. While facing an increased availability of high-quality Earth Observation (EO) data and new computational methods, there is still a lack in science-policy interaction and in providing innovative tools and methods that can easily be used by stakeholders and users to support their daily work. Taking up this issue, we introduce an innovative and user-oriented EO-based web service for landslide mapping and monitoring. Three central design components of the service are presented: (1) the user requirements definition, (2) the semi-automated image analysis methods implemented in the service, and (3) the web mapping application with its responsive user interface. User requirements were gathered during semi-structured interviews with regional authorities. The potential users were asked if and how they employ remote sensing data for landslide investigation and what their expectations to a landslide web mapping service regarding reliability and usability are. The interviews revealed the capability of our service for landslide documentation and mapping as well as monitoring of selected landslide sites, for example to complete and update landslide inventory maps. In addition, the users see a considerable potential for landslide rapid mapping. The user requirements analysis served as basis for the service concept definition. Optical satellite imagery from different high resolution (HR) and very high resolution (VHR) sensors, e.g. Landsat, Sentinel-2, SPOT-5, WorldView-2/3, was acquired for different study areas in the Alps. Object-based image analysis (OBIA) methods were used for semi-automated mapping of landslides. Selected mapping routines and results, including a step-by-step guidance, are integrated in the service by means of a web processing chain. This allows the user to gain insights into the service idea, the potential of semi-automated mapping methods, and the applicability of various satellite data for specific landslide mapping tasks. Moreover, an easy-to use and guided classification workflow, which includes image segmentation, statistical classification and manual editing options, enables the user to perform his/her own analyses. For validation, the classification results can be downloaded or compared against uploaded reference data using the implemented tools. Furthermore, users can compare the classification results to freely available data such as OpenStreetMap to identify landslide-affected infrastructure (e.g. roads, buildings). They also can upload infrastructure data available at their organization for specific assessments or monitor the evolution of selected landslides over time. 
Further actions will include the validation of the service in collaboration with stakeholders, decision makers and experts, which is essential to produce landslide information products that can assist the targeted management of natural hazards, and the evaluation of the potential towards the development of an operational Copernicus downstream service.

  2. Models of railroad passenger-car requirements in the northeast corridor : volume II user's guide

    DOT National Transportation Integrated Search

    1976-09-30

    Models and techniques for determining passenger-car requirements in railroad service were developed and applied by a research project of which this is the final report. The report is published in two volumes. The solution and analysis of the Northeas...

  3. Thermal/structural Tailoring of Engine Blades (T/STAEBL) User's Manual

    NASA Technical Reports Server (NTRS)

    Brown, K. W.; Clevenger, W. B.; Arel, J. D.

    1994-01-01

    The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a family of computer programs executed by a control program. The T/STAEBL system performs design optimizations of cooled, hollow turbine blades and vanes. This manual contains an overview of the system, fundamentals of the data block structure, and detailed descriptions of the inputs required by the optimizer. Additionally, the thermal analysis input requirements are described as well as the inputs required to perform a finite element blade vibrations analysis.

  4. Mission requirements for a manned earth observatory. Task 2: Reference mission definition and analyiss, volume 2

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The mission requirements and conceptual design of manned earth observatory payloads for the 1980 time period are discussed. Projections of 1980 sensor technology and user data requirements were used to formulate typical basic criteria pertaining to experiments, sensor complements, and reference missions. The subjects discussed are: (1) mission selection and prioritization, (2) baseline mission analysis, (3) earth observation data handling and contingency plans, and (4) analysis of low cost mission definition and rationale.

  5. Space Operations Center system analysis study extension. Volume 4, book 2: SOC system analysis report

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The Space Operations Center (SOC) orbital space station research missions integration, crew requirements, SOC operations, and configurations are analyzed. Potential research and applications missions and their requirements are described. The capabilities of SOC are compared with user requirements. The SOC/space shuttle and shuttle-derived vehicle flight support operations and SOC orbital operations are described. Module configurations and systems options, SOC/external tank configurations, and configurations for geostationary orbits are described. Crew and systems safety configurations are summarized.

  6. An arbitrary grid CFD algorithm for configuration aerodynamics analysis. Volume 2: FEMNAS user guide

    NASA Technical Reports Server (NTRS)

    Manhardt, Paul D.; Orzechowski, J. A.; Baker, A. J.

    1992-01-01

    This report documents the user input and output data requirements for the FEMNAS finite element Navier-Stokes code for real-gas simulations of external aerodynamics flowfields. This code was developed for the configuration aerodynamics branch of NASA ARC, under SBIR Phase 2 contract NAS2-124568 by Computational Mechanics Corporation (COMCO). This report is in two volumes. Volume 1 contains the theory for the derived finite element algorithm and describes the test cases used to validate the computer program described in the Volume 2 user guide.

  7. Opportunistic data locality for end user data analysis

    NASA Astrophysics Data System (ADS)

    Fischer, M.; Heidecker, C.; Kuehn, E.; Quast, G.; Giffels, M.; Schnepf, M.; Heiss, A.; Petzold, A.

    2017-10-01

    With the increasing data volume of LHC Run2, user analyses are evolving towards increasing data throughput. This evolution translates to higher requirements for efficiency and scalability of the underlying analysis infrastructure. We approach this issue with a new middleware to optimise data access: a layer of coordinated caches transparently provides data locality for high-throughput analyses. We demonstrated the feasibility of this approach with a prototype used for analyses of the CMS working groups at KIT. In this paper, we present our experience both with the approach in general, and our prototype in specific.

  8. A proposed concept for a crustal dynamics information management network

    NASA Technical Reports Server (NTRS)

    Lohman, G. M.; Renfrow, J. T.

    1980-01-01

    The findings of a requirements and feasibility analysis of the present and potential producers, users, and repositories of space-derived geodetic information are summarized. A proposed concept is presented for a crustal dynamics information management network that would apply state of the art concepts of information management technology to meet the expanding needs of the producers, users, and archivists of this geodetic information.

  9. Extending the Virtual Solar Observatory (VSO) to Incorporate Data Analysis Capabilities (III)

    NASA Astrophysics Data System (ADS)

    Csillaghy, A.; Etesi, L.; Dennis, B.; Zarro, D.; Schwartz, R.; Tolbert, K.

    2008-12-01

    We will present a progress report on our activities to extend the data analysis capabilities of the VSO. Our efforts to date have focused on three areas: 1. Extending the data retrieval capabilities by developing a centralized data processing server. The server is built with Java, IDL (Interactive Data Language), and the SSW (Solar SoftWare) package with all SSW-related instrument libraries and required calibration data. When a user requests VSO data that requires preprocessing, the data are transparently sent to the server, processed, and returned to the user's IDL session for viewing and analysis. It is possible to have any Java or IDL client connect to the server. An IDL prototype for preparing and calibrating SOHO/EIT data wll be demonstrated. 2. Improving the solar data search in SHOW SYNOP, a graphical user tool connected to VSO in IDL. We introduce the Java-IDL interface that allows a flexible dynamic, and extendable way of searching the VSO, where all the communication with VSO are managed dynamically by standard Java tools. 3. Improving image overlay capability to support coregistration of solar disk observations obtained from different orbital view angles, position angles, and distances - such as from the twin STEREO spacecraft.

  10. Design of a Workstation by a Cognitive Approach

    PubMed Central

    Jaspers, MWM; Steen, T.; Geelen, M.; van den Bos, C.

    2001-01-01

    To ensure ultimate acceptance of computer systems that are easy to use, provide the desired functionality and fits into users work practices requires the use of improved methods for system design and evaluation. Both designing and evaluating workstations that link up smoothly with daily routine of physicians' work requires a thorough understanding of their working practices. The application of methods from cognitive science may contribute to a thorough understanding of the activities involved in medical information processing. We used cognitive task analysis in designing a physicians' workstation, which seems a promising method to ensure that the system meets the user needs.

  11. Recent results of PADReS, the Photon Analysis Delivery and REduction System, from the FERMI FEL commissioning and user operations.

    PubMed

    Zangrando, Marco; Cocco, Daniele; Fava, Claudio; Gerusina, Simone; Gobessi, Riccardo; Mahne, Nicola; Mazzucco, Eric; Raimondi, Lorenzo; Rumiz, Luca; Svetina, Cristian

    2015-05-01

    The Photon Analysis Delivery and REduction System of FERMI (PADReS) has been routinely used during the machine commissioning and operations of FERMI since 2011. It has also served the needs of several user runs at the facility from late 2012. The system is endowed with online and shot-to-shot diagnostics giving information about intensity, spatial-angular distribution, spectral content, as well as other diagnostics to determine coherence, pulse length etc. Moreover, PADReS is capable of manipulating the beam in terms of intensity and optical parameters. Regarding the optics, besides a standard refocusing system based on an ellipsoidal mirror, the Kirkpatrick-Baez active optics systems are key elements and have been used intensively to meet users' requirements. A general description of the system is given, together with some selected results from the commissioning/operations/user beam time.

  12. Lightweight Adaptation of Classifiers to Users and Contexts: Trends of the Emerging Domain

    PubMed Central

    Vildjiounaite, Elena; Gimel'farb, Georgy; Kyllönen, Vesa; Peltola, Johannes

    2015-01-01

    Intelligent computer applications need to adapt their behaviour to contexts and users, but conventional classifier adaptation methods require long data collection and/or training times. Therefore classifier adaptation is often performed as follows: at design time application developers define typical usage contexts and provide reasoning models for each of these contexts, and then at runtime an appropriate model is selected from available ones. Typically, definition of usage contexts and reasoning models heavily relies on domain knowledge. However, in practice many applications are used in so diverse situations that no developer can predict them all and collect for each situation adequate training and test databases. Such applications have to adapt to a new user or unknown context at runtime just from interaction with the user, preferably in fairly lightweight ways, that is, requiring limited user effort to collect training data and limited time of performing the adaptation. This paper analyses adaptation trends in several emerging domains and outlines promising ideas, proposed for making multimodal classifiers user-specific and context-specific without significant user efforts, detailed domain knowledge, and/or complete retraining of the classifiers. Based on this analysis, this paper identifies important application characteristics and presents guidelines to consider these characteristics in adaptation design. PMID:26473165

  13. An online database for plant image analysis software tools.

    PubMed

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-10-09

    Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of central repository, it is challenging for researchers to identify the software that is best suited for their research. We present an online, manually curated, database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software in a uniform and concise manner enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users to find solutions, and to provide developers a way to exchange and communicate about their work.

  14. Feasibility and demonstration of a cloud-based RIID analysis system

    NASA Astrophysics Data System (ADS)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed.

  15. An Automated Algorithm for Producing Land Cover Information from Landsat Surface Reflectance Data Acquired Between 1984 and Present

    NASA Astrophysics Data System (ADS)

    Rover, J.; Goldhaber, M. B.; Holen, C.; Dittmeier, R.; Wika, S.; Steinwand, D.; Dahal, D.; Tolk, B.; Quenzer, R.; Nelson, K.; Wylie, B. K.; Coan, M.

    2015-12-01

    Multi-year land cover mapping from remotely sensed data poses challenges. Producing land cover products at spatial and temporal scales required for assessing longer-term trends in land cover change are typically a resource-limited process. A recently developed approach utilizes open source software libraries to automatically generate datasets, decision tree classifications, and data products while requiring minimal user interaction. Users are only required to supply coordinates for an area of interest, land cover from an existing source such as National Land Cover Database and percent slope from a digital terrain model for the same area of interest, two target acquisition year-day windows, and the years of interest between 1984 and present. The algorithm queries the Landsat archive for Landsat data intersecting the area and dates of interest. Cloud-free pixels meeting the user's criteria are mosaicked to create composite images for training the classifiers and applying the classifiers. Stratification of training data is determined by the user and redefined during an iterative process of reviewing classifiers and resulting predictions. The algorithm outputs include yearly land cover raster format data, graphics, and supporting databases for further analysis. Additional analytical tools are also incorporated into the automated land cover system and enable statistical analysis after data are generated. Applications tested include the impact of land cover change and water permanence. For example, land cover conversions in areas where shrubland and grassland were replaced by shale oil pads during hydrofracking of the Bakken Formation were quantified. Analytical analysis of spatial and temporal changes in surface water included identifying wetlands in the Prairie Pothole Region of North Dakota with potential connectivity to ground water, indicating subsurface permeability and geochemistry.

  16. Radar altimetry systems cost analysis

    NASA Technical Reports Server (NTRS)

    Escoe, D.; Heuring, F. T.; Denman, W. F.

    1976-01-01

    This report discusses the application and cost of two types of altimeter systems (spaceborne (satellite and shuttle) and airborne) to twelve user requirements. The overall design of the systems defined to meet these requirements is predicated on an unconstrained altimetry technology; that is, any level of altimeter or supporting equipment performance is possible.

  17. Saint Lawrence Seaway Navigation-Aid System Study : Volume II - Appendix B - User's Manual and Documentation of Seaway Capacity and Capacity Analysis Programs

    DOT National Transportation Integrated Search

    1978-09-01

    The requirements for a navigation guidance system which will effect an increase in the ship processing capacity of the Saint Lawrence Seaway (Lake Ontario to Montreal, Quebec) are developed. The requirements include a specification of system position...

  18. A User's Guide for the Differential Reduced Ejector/Mixer Analysis "DREA" Program. 1.0

    NASA Technical Reports Server (NTRS)

    DeChant, Lawrence J.; Nadell, Shari-Beth

    1999-01-01

    A system of analytical and numerical two-dimensional mixer/ejector nozzle models that require minimal empirical input has been developed and programmed for use in conceptual and preliminary design. This report contains a user's guide describing the operation of the computer code, DREA (Differential Reduced Ejector/mixer Analysis), that contains these mathematical models. This program is currently being adopted by the Propulsion Systems Analysis Office at the NASA Glenn Research Center. A brief summary of the DREA method is provided, followed by detailed descriptions of the program input and output files. Sample cases demonstrating the application of the program are presented.

  19. ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis

    PubMed Central

    Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas

    2016-01-01

    Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/. PMID:26882475

  20. Ergatis: a web interface and scalable software system for bioinformatics workflows

    PubMed Central

    Orvis, Joshua; Crabtree, Jonathan; Galens, Kevin; Gussman, Aaron; Inman, Jason M.; Lee, Eduardo; Nampally, Sreenath; Riley, David; Sundaram, Jaideep P.; Felix, Victor; Whitty, Brett; Mahurkar, Anup; Wortman, Jennifer; White, Owen; Angiuoli, Samuel V.

    2010-01-01

    Motivation: The growth of sequence data has been accompanied by an increasing need to analyze data on distributed computer clusters. The use of these systems for routine analysis requires scalable and robust software for data management of large datasets. Software is also needed to simplify data management and make large-scale bioinformatics analysis accessible and reproducible to a wide class of target users. Results: We have developed a workflow management system named Ergatis that enables users to build, execute and monitor pipelines for computational analysis of genomics data. Ergatis contains preconfigured components and template pipelines for a number of common bioinformatics tasks such as prokaryotic genome annotation and genome comparisons. Outputs from many of these components can be loaded into a Chado relational database. Ergatis was designed to be accessible to a broad class of users and provides a user friendly, web-based interface. Ergatis supports high-throughput batch processing on distributed compute clusters and has been used for data management in a number of genome annotation and comparative genomics projects. Availability: Ergatis is an open-source project and is freely available at http://ergatis.sourceforge.net Contact: jorvis@users.sourceforge.net PMID:20413634

  1. Intelligent Transportation Infrastructure Deployment Analysis System

    DOT National Transportation Integrated Search

    1997-01-01

    Much of the work on Intelligent Transportation Systems (ITS) to date has emphasized technologies, Standards/protocols, architecture, user services, core infrastructure requirements, and various other technical and institutional issues. ITS implementa...

  2. Chapter 27. Seed testing requirements and regulatory laws

    Treesearch

    Richard Stevens; Kent R. Jorgensen

    2004-01-01

    Federal and State seed laws require that seed used on range and wildland sites be officially tested and appropriately labeled or tagged. It is the responsibility of the seed distributor (who may be the producer, collector, or broker) toward the end user to properly tag each container of seed to comply with these laws. An analysis tag is always required. If seed has...

  3. Definition study of land/sea civil user navigational location monitoring systems for NAVSTAR GPS: User requirements and systems concepts

    NASA Technical Reports Server (NTRS)

    Devito, D. M.

    1981-01-01

    A low-cost GPS civil-user mobile terminal whose purchase cost is substantially an order of magnitude less than estimates for the military counterpart is considered with focus on ground station requirements for position monitoring of civil users requiring this capability and the civil user navigation and location-monitoring requirements. Existing survey literature was examined to ascertain the potential users of a low-cost NAVSTAR receiver and to estimate their number, function, and accuracy requirements. System concepts are defined for low cost user equipments for in-situ navigation and the retransmission of low data rate positioning data via a geostationary satellite to a central computing facility.

  4. Dexterity: A MATLAB-based analysis software suite for processing and visualizing data from tasks that measure arm or forelimb function.

    PubMed

    Butensky, Samuel D; Sloan, Andrew P; Meyers, Eric; Carmel, Jason B

    2017-07-15

    Hand function is critical for independence, and neurological injury often impairs dexterity. To measure hand function in people or forelimb function in animals, sensors are employed to quantify manipulation. These sensors make assessment easier and more quantitative and allow automation of these tasks. While automated tasks improve objectivity and throughput, they also produce large amounts of data that can be burdensome to analyze. We created software called Dexterity that simplifies data analysis of automated reaching tasks. Dexterity is MATLAB software that enables quick analysis of data from forelimb tasks. Through a graphical user interface, files are loaded and data are identified and analyzed. These data can be annotated or graphed directly. Analysis is saved, and the graph and corresponding data can be exported. For additional analysis, Dexterity provides access to custom scripts created by other users. To determine the utility of Dexterity, we performed a study to evaluate the effects of task difficulty on the degree of impairment after injury. Dexterity analyzed two months of data and allowed new users to annotate the experiment, visualize results, and save and export data easily. Previous analysis of tasks was performed with custom data analysis, requiring expertise with analysis software. Dexterity made the tools required to analyze, visualize and annotate data easy to use by investigators without data science experience. Dexterity increases accessibility to automated tasks that measure dexterity by making analysis of large data intuitive, robust, and efficient. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Transforming user needs into functional requirements for an antibiotic clinical decision support system: explicating content analysis for system design.

    PubMed

    Bright, T J

    2013-01-01

    Many informatics studies use content analysis to generate functional requirements for system development. Explication of this translational process from qualitative data to functional requirements can strengthen the understanding and scientific rigor when applying content analysis in informatics studies. To describe a user-centered approach transforming emergent themes derived from focus group data into functional requirements for informatics solutions and to illustrate these methods to the development of an antibiotic clinical decision support system (CDS). THE APPROACH CONSISTED OF FIVE STEPS: 1) identify unmet therapeutic planning information needs via Focus Group Study-I, 2) develop a coding framework of therapeutic planning themes to refine the domain scope to antibiotic therapeutic planning, 3) identify functional requirements of an antibiotic CDS system via Focus Group Study-II, 4) discover informatics solutions and functional requirements from coded data, and 5) determine the types of information needed to support the antibiotic CDS system and link with the identified informatics solutions and functional requirements. The coding framework for Focus Group Study-I revealed unmet therapeutic planning needs. Twelve subthemes emerged and were clustered into four themes; analysis indicated a need for an antibiotic CDS intervention. Focus Group Study-II included five types of information needs. Comments from the Barrier/Challenge to information access and Function/Feature themes produced three informatics solutions and 13 functional requirements of an antibiotic CDS system. Comments from the Patient, Institution, and Domain themes generated required data elements for each informatics solution. This study presents one example explicating content analysis of focus group data and the analysis process to functional requirements from narrative data. Illustration of this 5-step method was used to develop an antibiotic CDS system, resolving unmet antibiotic prescribing needs. As a reusable approach, these techniques can be refined and applied to resolve unmet information needs with informatics interventions in additional domains.

  6. ClonoCalc and ClonoPlot: immune repertoire analysis from raw files to publication figures with graphical user interface.

    PubMed

    Fähnrich, Anke; Krebbel, Moritz; Decker, Normann; Leucker, Martin; Lange, Felix D; Kalies, Kathrin; Möller, Steffen

    2017-03-11

    Next generation sequencing (NGS) technologies enable studies and analyses of the diversity of both T and B cell receptors (TCR and BCR) in human and animal systems to elucidate immune functions in health and disease. Over the last few years, several algorithms and tools have been developed to support respective analyses of raw sequencing data of the immune repertoire. These tools focus on distinct aspects of the data processing and require a strong bioinformatics background. To facilitate the analysis of T and B cell repertoires by less experienced users, software is needed that combines the most common tools for repertoire analysis. We introduce a graphical user interface (GUI) providing a complete analysis pipeline for processing raw NGS data for human and animal TCR and BCR clonotype determination and advanced differential repertoire studies. It provides two applications. ClonoCalc prepares the raw data for downstream analyses. It combines a demultiplexer for barcode splitting and employs MiXCR for paired-end read merging and the extraction of human and animal TCR/BCR sequences. ClonoPlot wraps the R package tcR and further contributes self-developed plots for the descriptive comparative investigation of immune repertoires. This workflow reduces the amount of programming required to perform the respective analyses and supports both communication and training between scientists and technicians, and across scientific disciplines. The Open Source development in Java and R is modular and invites advanced users to extend its functionality. Software and documentation are freely available at https://bitbucket.org/ClonoSuite/clonocalc-plot .

  7. Distributed data analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Nilsson, Paul; Atlas Collaboration

    2012-12-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and Nordu Grid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quixote 2, a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the Ganga Robot functional testing system, and infrequent site stress tests are performed using the Hammer Cloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.

  8. User modeling techniques for enhanced usability of OPSMODEL operations simulation software

    NASA Technical Reports Server (NTRS)

    Davis, William T.

    1991-01-01

    The PC based OPSMODEL operations software for modeling and simulation of space station crew activities supports engineering and cost analyses and operations planning. Using top-down modeling, the level of detail required in the data base can be limited to being commensurate with the results required of any particular analysis. To perform a simulation, a resource environment consisting of locations, crew definition, equipment, and consumables is first defined. Activities to be simulated are then defined as operations and scheduled as desired. These operations are defined within a 1000 level priority structure. The simulation on OPSMODEL, then, consists of the following: user defined, user scheduled operations executing within an environment of user defined resource and priority constraints. Techniques for prioritizing operations to realistically model a representative daily scenario of on-orbit space station crew activities are discussed. The large number of priority levels allows priorities to be assigned commensurate with the detail necessary for a given simulation. Several techniques for realistic modeling of day-to-day work carryover are also addressed.

  9. Study of data collection platform concepts: Data collection system user requirements

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The overall purpose of the survey was to provide real world data on user requirements. The intent was to assess data collection system user requirements by questioning actual potential users rather than speculating on requirements. The end results of the survey are baseline requirements models for both a data collection platform and a data collection system. These models were derived from the survey results. The real value of these models lies in the fact that they are based on actual user requirements as delineated in the survey questionnaires. Some users desire data collection platforms of small size and light weight. These sizes and weights are beyond the present state of the art. Also, the survey provided a wealth of information on the nature and constituency of the data collection user community as well as information on user applications for data collection systems. Finally, the data sheds light on the generalized platform concept. That is, the diversity of user requirements shown in the data indicates the difficulty that can be anticipated in attempting to implement such a concept.

  10. Older driver highway design handbook

    DOT National Transportation Integrated Search

    1998-01-01

    This project included literature reviews and research syntheses, using meta-analytic techniques where : appropriate, in the areas of age-related (diminished) functional capabilities, and human factors and : highway safety. A User-Requirements Analysi...

  11. A mobile phone user interface for image-based dietary assessment

    NASA Astrophysics Data System (ADS)

    Ahmad, Ziad; Khanna, Nitin; Kerr, Deborah A.; Boushey, Carol J.; Delp, Edward J.

    2014-02-01

    Many chronic diseases, including obesity and cancer, are related to diet. Such diseases may be prevented and/or successfully treated by accurately monitoring and assessing food and beverage intakes. Existing dietary assessment methods such as the 24-hour dietary recall and the food frequency questionnaire, are burdensome and not generally accurate. In this paper, we present a user interface for a mobile telephone food record that relies on taking images, using the built-in camera, as the primary method of recording. We describe the design and implementation of this user interface while stressing the solutions we devised to meet the requirements imposed by the image analysis process, yet keeping the user interface easy to use.

  12. A Mobile Phone User Interface for Image-Based Dietary Assessment

    PubMed Central

    Ahmad, Ziad; Khanna, Nitin; Kerr, Deborah A.; Boushey, Carol J.; Delp, Edward J.

    2016-01-01

    Many chronic diseases, including obesity and cancer, are related to diet. Such diseases may be prevented and/or successfully treated by accurately monitoring and assessing food and beverage intakes. Existing dietary assessment methods such as the 24-hour dietary recall and the food frequency questionnaire, are burdensome and not generally accurate. In this paper, we present a user interface for a mobile telephone food record that relies on taking images, using the built-in camera, as the primary method of recording. We describe the design and implementation of this user interface while stressing the solutions we devised to meet the requirements imposed by the image analysis process, yet keeping the user interface easy to use. PMID:28572696

  13. A Mobile Phone User Interface for Image-Based Dietary Assessment.

    PubMed

    Ahmad, Ziad; Khanna, Nitin; Kerr, Deborah A; Boushey, Carol J; Delp, Edward J

    2014-02-02

    Many chronic diseases, including obesity and cancer, are related to diet. Such diseases may be prevented and/or successfully treated by accurately monitoring and assessing food and beverage intakes. Existing dietary assessment methods such as the 24-hour dietary recall and the food frequency questionnaire, are burdensome and not generally accurate. In this paper, we present a user interface for a mobile telephone food record that relies on taking images, using the built-in camera, as the primary method of recording. We describe the design and implementation of this user interface while stressing the solutions we devised to meet the requirements imposed by the image analysis process, yet keeping the user interface easy to use.

  14. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations

    NASA Astrophysics Data System (ADS)

    Vergara-Perez, Sandra; Marucho, Marcelo

    2016-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually take several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptative Poisson-Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post-analysis of structural and electrical properties of biomolecules.

  15. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations

    PubMed Central

    Vergara-Perez, Sandra; Marucho, Marcelo

    2015-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually take several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptative Poisson Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post- analysis of structural and electrical properties of biomolecules. PMID:26924848

  16. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations.

    PubMed

    Vergara-Perez, Sandra; Marucho, Marcelo

    2016-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually take several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptative Poisson Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post- analysis of structural and electrical properties of biomolecules.

  17. Computer program for nonlinear static stress analysis of shuttle thermal protection system: User's manual

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Wallas, M.

    1981-01-01

    User documentation is presented for a computer program which considers the nonlinear properties of the strain isolator pad (SIP) in the static stress analysis of the shuttle thermal protection system. This program is generalized to handle an arbitrary SIP footprint including cutouts for instrumentation and filler bar. Multiple SIP surfaces are defined to model tiles in unique locations such as leading edges, intersections, and penetrations. The nonlinearity of the SIP is characterized by experimental stress displacement data for both normal and shear behavior. Stresses in the SIP are calculated using a Newton iteration procedure to determine the six rigid body displacements of the tile which develop reaction forces in the SIP to equilibrate the externally applied loads. This user documentation gives an overview of the analysis capabilities, a detailed description of required input data and an example to illustrate use of the program.

  18. Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon

    2013-01-01

    This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, they are unsuitable for use in shield design and preliminary analysis studies. The software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions, and accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks.

  19. Improving Requirements Generation Thoroughness in User-Centered Workshops: The Role of Prompting and Shared User Stories

    ERIC Educational Resources Information Center

    Read, Aaron

    2013-01-01

    The rise of stakeholder centered software development has led to organizations engaging users early in the development process to help define system requirements. To facilitate user involvement in the requirements elicitation process, companies can use Group Support Systems (GSS) to conduct requirements elicitation workshops. The effectiveness of…

  20. Automatic generation of user material subroutines for biomechanical growth analysis.

    PubMed

    Young, Jonathan M; Yao, Jiang; Ramasubramanian, Ashok; Taber, Larry A; Perucchio, Renato

    2010-10-01

    The analysis of the biomechanics of growth and remodeling in soft tissues requires the formulation of specialized pseudoelastic constitutive relations. The nonlinear finite element analysis package ABAQUS allows the user to implement such specialized material responses through the coding of a user material subroutine called UMAT. However, hand coding UMAT subroutines is a challenge even for simple pseudoelastic materials and requires substantial time to debug and test the code. To resolve this issue, we develop an automatic UMAT code generation procedure for pseudoelastic materials using the symbolic mathematics package MATHEMATICA and extend the UMAT generator to include continuum growth. The performance of the automatically coded UMAT is tested by simulating the stress-stretch response of a material defined by a Fung-orthotropic strain energy function, subject to uniaxial stretching, equibiaxial stretching, and simple shear in ABAQUS. The MATHEMATICA UMAT generator is then extended to include continuum growth by adding a growth subroutine to the automatically generated UMAT. The MATHEMATICA UMAT generator correctly derives the variables required in the UMAT code, quickly providing a ready-to-use UMAT. In turn, the UMAT accurately simulates the pseudoelastic response. In order to test the growth UMAT, we simulate the growth-based bending of a bilayered bar with differing fiber directions in a nongrowing passive layer. The anisotropic passive layer, being topologically tied to the growing isotropic layer, causes the bending bar to twist laterally. The results of simulations demonstrate the validity of the automatically coded UMAT, used in both standardized tests of hyperelastic materials and for a biomechanical growth analysis.

  1. TADS--A CFD-Based Turbomachinery Analysis and Design System with GUI: User's Manual. 2.0

    NASA Technical Reports Server (NTRS)

    Koiro, M. J.; Myers, R. A.; Delaney, R. A.

    1999-01-01

    The primary objective of this study was the development of a Computational Fluid Dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a Graphical User Interface (GUI). The computer codes resulting from this effort are referred to as TADS (Turbomachinery Analysis and Design System). This document is intended to serve as a User's Manual for the computer programs which comprise the TADS system, developed under Task 18 of NASA Contract NAS3-27350, ADPAC System Coupling to Blade Analysis & Design System GUI and Task 10 of NASA Contract NAS3-27394, ADPAC System Coupling to Blade Analysis & Design System GUI, Phase II-Loss, Design and, Multi-stage Analysis. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis and design capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of the various programs was done in such a way that alternative solvers or grid generators could be easily incorporated into the TADS framework. Results of aerodynamic calculations using the TADS system are presented for a highly loaded fan, a compressor stator, a low speed turbine blade and a transonic turbine vane.

  2. User Interface Requirements for Web-Based Integrated Care Pathways: Evidence from the Evaluation of an Online Care Pathway Investigation Tool.

    PubMed

    Balatsoukas, Panos; Williams, Richard; Davies, Colin; Ainsworth, John; Buchan, Iain

    2015-11-01

    Integrated care pathways (ICPs) define a chronological sequence of steps, most commonly diagnostic or treatment, to be followed in providing care for patients. Care pathways help to ensure quality standards are met and to reduce variation in practice. Although research on the computerisation of ICP progresses, there is still little knowledge on what are the requirements for designing user-friendly and usable electronic care pathways, or how users (normally health care professionals) interact with interfaces that support design, analysis and visualisation of ICPs. The purpose of the study reported in this paper was to address this gap by evaluating the usability of a novel web-based tool called COCPIT (Collaborative Online Care Pathway Investigation Tool). COCPIT supports the design, analysis and visualisation of ICPs at the population level. In order to address the aim of this study, an evaluation methodology was designed based on heuristic evaluations and a mixed method usability test. The results showed that modular visualisation and direct manipulation of information related to the design and analysis of ICPs is useful for engaging and stimulating users. However, designers should pay attention to issues related to the visibility of the system status and the match between the system and the real world, especially in relation to the display of statistical information about care pathways and the editing of clinical information within a care pathway. The paper concludes with recommendations for interface design.

  3. A Pre-launch Analysis of NASA's SMAP Mission Data

    NASA Astrophysics Data System (ADS)

    Escobar, V. M.; Brown, M. E.

    2012-12-01

    Product applications have become an integral part of converting the data collected into actionable knowledge that can be used to inform policy. Successfully bridging scientific research with operational decision making in different application areas requires looking into thematic user requirements and data requirements. NASA's Soil Moisture Active/Passive mission (SMAP) has an applications program that actively seeks to integrate the data prior to launch into a broad range of environmental monitoring and decision making systems from drought and flood guidance to disease risk assessment and national security SMAP is a a combined active/passive microwave instrument, which will be launched into a near-polar orbit in late 2014. It aims to produce a series of soil moisture products and soil freeze/thaw products with an accuracy of +/- 10%, a nominal resolution of between 3 and 40km, and latency between 12 hours and 7 days. These measurements will be used to enhance the understanding of processes that link the water, energy and carbon cycles, and to extend the capabilities of weather and climate prediction models. The driving success of the SMAP applications program is joining mission scientists to thematic end users and leveraging the knowledge base of soil moisture data applications, increase the speed SMAP data product ingestion into critical processes and research, improving societal benefits to science. Because SMAP has not yet launched, the mission is using test algorithms to determine how the data will interact with existing processes. The objective of this profession review is to solicit data requirements, accuracy needs and current understanding of the SMAP mission from the user community and then feed that back into mission product development. Thus, understanding how users will apply SMAP data, prior to the satellite's launch, is an important component of SMAP Applied Sciences and one of NASA's measures for mission success. This paper presents an analysis of an email-based review of expert end-users and earth science researchers to eliciting how pre-launch activities and research is being conducted in thematic group's organizations. Our focus through the SMAP Applications Program will be to (1) improve the missions understanding of the SMAP user community requirements, (2) document and communicate the perceived challenges and advantages to the mission scientists, and (3) facilitate the movement of science into policy and decision making arenas. We will analyze the data of this review to understand the perceived benefits to pre-launch efforts, user engagement and define areas were the connection between science development and user engagement can continue to improve and further benefit future mission pre launch efforts. The research will facilitate collaborative opportunities between agencies, broadening the fields of science where soil moisture observation data can be applied.

  4. South African mental health care service user views on priorities for supporting recovery: implications for policy and service development.

    PubMed

    Kleintjes, Sharon; Lund, Crick; Swartz, Leslie

    2012-01-01

    The paper documents the views of South African mental health care service users on policy directions and service developments that are required to support their recovery. Semi-structured interviews were conducted with forty service users and service user advocates. A framework analysis approach was used to analyse the qualitative data. Service user priorities included addressing stigma, discrimination and disempowerment, and the links between mental health and poverty. They suggested that these challenges be addressed through public awareness campaigns, legislative and policy reform for rights protection, development of a national lobby to advocate for changes, and user empowerment. Users suggested that empowerment can be facilitated through opportunities for improved social relatedness and equitable access to social and economic resources. This study suggests three strategies to bridge the gap between mental health care service users rights and needs on one hand, and unsupportive attitudes, policies and practices on the other. These are: giving priority to service user involvement in policy and service reform, creating empathic alliances to promote user priorities, and building enabling partnerships to effect these priorities.

  5. Coloc-stats: a unified web interface to perform colocalization analysis of genomic features.

    PubMed

    Simovski, Boris; Kanduri, Chakravarthi; Gundersen, Sveinung; Titov, Dmytro; Domanska, Diana; Bock, Christoph; Bossini-Castillo, Lara; Chikina, Maria; Favorov, Alexander; Layer, Ryan M; Mironov, Andrey A; Quinlan, Aaron R; Sheffield, Nathan C; Trynka, Gosia; Sandve, Geir K

    2018-06-05

    Functional genomics assays produce sets of genomic regions as one of their main outputs. To biologically interpret such region-sets, researchers often use colocalization analysis, where the statistical significance of colocalization (overlap, spatial proximity) between two or more region-sets is tested. Existing colocalization analysis tools vary in the statistical methodology and analysis approaches, thus potentially providing different conclusions for the same research question. As the findings of colocalization analysis are often the basis for follow-up experiments, it is helpful to use several tools in parallel and to compare the results. We developed the Coloc-stats web service to facilitate such analyses. Coloc-stats provides a unified interface to perform colocalization analysis across various analytical methods and method-specific options (e.g. colocalization measures, resolution, null models). Coloc-stats helps the user to find a method that supports their experimental requirements and allows for a straightforward comparison across methods. Coloc-stats is implemented as a web server with a graphical user interface that assists users with configuring their colocalization analyses. Coloc-stats is freely available at https://hyperbrowser.uio.no/coloc-stats/.

  6. Service user and caregiver involvement in mental health system strengthening in low- and middle-income countries: a cross-country qualitative study.

    PubMed

    Lempp, H; Abayneh, S; Gurung, D; Kola, L; Abdulmalik, J; Evans-Lacko, S; Semrau, M; Alem, A; Thornicroft, G; Hanlon, C

    2018-02-01

    The aims of this paper are to: (i) explore the experiences of involvement of mental health service users, their caregivers, mental health centre heads and policy makers in mental health system strengthening in three low- and middle-income countries (LMICs) (Ethiopia, Nepal and Nigeria); (ii) analyse the potential benefits and barriers of such involvement; and (iii) identify strategies required to achieve greater service user and caregiver participation. A cross-country qualitative study was conducted, interviewing 83 stakeholders of mental health services. Our analysis showed that service user and caregiver involvement in the health system strengthening process was an alien concept for most participants. They reported very limited access to direct participation. Stigma and poverty were described as the main barriers for involvement. Several strategies were identified by participants to overcome existing hurdles to facilitate service user and caregiver involvement in the mental health system strengthening process, such as support to access treatment, mental health promotion and empowerment of service users. This study suggests that capacity building for service users, and strengthening of user groups would equip them to contribute meaningfully to policy development from informed perspectives. Involvement of service users and their caregivers in mental health decision-making is still in its infancy in LMICs. Effective strategies are required to overcome existing barriers, for example making funding more widely available for Ph.D. studies in participatory research with service users and caregivers to develop, implement and evaluate approaches to involvement that are locally and culturally acceptable in LMICs.

  7. An importance-performance analysis of hospital information system attributes: A nurses' perspective.

    PubMed

    Cohen, Jason F; Coleman, Emma; Kangethe, Matheri J

    2016-02-01

    Health workers have numerous concerns about hospital IS (HIS) usage. Addressing these concerns requires understanding the system attributes most important to their satisfaction and productivity. Following a recent HIS implementation, our objective was to identify priorities for managerial intervention based on user evaluations of the performance of the HIS attributes as well as the relative importance of these attributes to user satisfaction and productivity outcomes. We collected data along a set of attributes representing system quality, data quality, information quality, and service quality from 154 nurse users. Their quantitative responses were analysed using the partial least squares approach followed by an importance-performance analysis. Qualitative responses were analysed using thematic analysis to triangulate and supplement the quantitative findings. Two system quality attributes (responsiveness and ease of learning), one information quality attribute (detail), one service quality attribute (sufficient support), and three data quality attributes (records complete, accurate and never missing) were identified as high priorities for intervention. Our application of importance-performance analysis is unique in HIS evaluation and we have illustrated its utility for identifying those system attributes for which underperformance is not acceptable to users and therefore should be high priorities for intervention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. United States data collection activities and requirements, volume 1

    NASA Technical Reports Server (NTRS)

    Hrin, S.; Mcgregor, D.

    1977-01-01

    The potential market for a data collection system was investigated to determine whether the user needs would be sufficient to support a satellite relay data collection system design. The activities of 107,407 data collections stations were studied to determine user needs in agriculture, climatology, environmental monitoring, forestry, geology, hydrology, meteorology, and oceanography. Descriptions of 50 distinct data collections networks are described and used to form the user data base. The computer program used to analyze the station data base is discussed, and results of the analysis are presented in maps and graphs. Information format and coding is described in the appendix.

  9. Quantifying Therapeutic and Diagnostic Efficacy in 2D Microvascular Images

    NASA Technical Reports Server (NTRS)

    Parsons-Wingerter, Patricia; Vickerman, Mary B.; Keith, Patricia A.

    2009-01-01

    VESGEN is a newly automated, user-interactive program that maps and quantifies the effects of vascular therapeutics and regulators on microvascular form and function. VESGEN analyzes two-dimensional, black and white vascular images by measuring important vessel morphology parameters. This software guides the user through each required step of the analysis process via a concise graphical user interface (GUI). Primary applications of the VESGEN code are 2D vascular images acquired as clinical diagnostic images of the human retina and as experimental studies of the effects of vascular regulators and therapeutics on vessel remodeling.

  10. The role of the user within the medical device design and development process: medical device manufacturers' perspectives

    PubMed Central

    2011-01-01

    Background Academic literature and international standards bodies suggest that user involvement, via the incorporation of human factors engineering methods within the medical device design and development (MDDD) process, offer many benefits that enable the development of safer and more usable medical devices that are better suited to users' needs. However, little research has been carried out to explore medical device manufacturers' beliefs and attitudes towards user involvement within this process, or indeed what value they believe can be added by doing so. Methods In-depth interviews with representatives from 11 medical device manufacturers are carried out. We ask them to specify who they believe the intended users of the device to be, who they consult to inform the MDDD process, what role they believe the user plays within this process, and what value (if any) they believe users add. Thematic analysis is used to analyse the fully transcribed interview data, to gain insight into medical device manufacturers' beliefs and attitudes towards user involvement within the MDDD process. Results A number of high-level themes emerged, relating who the user is perceived to be, the methods used, the perceived value and barriers to user involvement, and the nature of user contributions. The findings reveal that despite standards agencies and academic literature offering strong support for the employment formal methods, manufacturers are still hesitant due to a range of factors including: perceived barriers to obtaining ethical approval; the speed at which such activity may be carried out; the belief that there is no need given the 'all-knowing' nature of senior health care staff and clinical champions; a belief that effective results are achievable by consulting a minimal number of champions. Furthermore, less senior health care practitioners and patients were rarely seen as being able to provide valuable input into the process. Conclusions Medical device manufacturers often do not see the benefit of employing formal human factors engineering methods within the MDDD process. Research is required to better understand the day-to-day requirements of manufacturers within this sector. The development of new or adapted methods may be required if user involvement is to be fully realised. PMID:21356097

  11. Concept Maps as Instructional Tools for Improving Learning of Phase Transitions in Object-Oriented Analysis and Design

    ERIC Educational Resources Information Center

    Shin, Shin-Shing

    2016-01-01

    Students attending object-oriented analysis and design (OOAD) courses typically encounter difficulties transitioning from requirements analysis to logical design and then to physical design. Concept maps have been widely used in studies of user learning. The study reported here, based on the relationship of concept maps to learning theory and…

  12. Health Insurance Portability and Accountability Act-Compliant Ocular Telehealth Network for the Remote Diagnosis and Management of Diabetic Retinopathy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yaquin; Karnowski, Thomas Paul; Tobin Jr, Kenneth William

    2011-01-01

    In this article, we present the design and implementation of a regional ocular telehealth network for remote assessment and management of diabetic retinopathy (DR), including the design requirements, network topology, protocol design, system work flow, graphics user interfaces, and performance evaluation. The Telemedical Retinal Image Analysis and Diagnosis Network is a computer-aided, image analysis telehealth paradigm for the diagnosis of DR and other retinal diseases using fundus images acquired from primary care end users delivering care to underserved patient populations in the mid-South and southeastern United States.

  13. A health insurance portability and accountability act-compliant ocular telehealth network for the remote diagnosis and management of diabetic retinopathy.

    PubMed

    Li, Yaqin; Karnowski, Thomas P; Tobin, Kenneth W; Giancardo, Luca; Morris, Scott; Sparrow, Sylvia E; Garg, Seema; Fox, Karen; Chaum, Edward

    2011-10-01

    In this article, we present the design and implementation of a regional ocular telehealth network for remote assessment and management of diabetic retinopathy (DR), including the design requirements, network topology, protocol design, system work flow, graphics user interfaces, and performance evaluation. The Telemedical Retinal Image Analysis and Diagnosis Network is a computer-aided, image analysis telehealth paradigm for the diagnosis of DR and other retinal diseases using fundus images acquired from primary care end users delivering care to underserved patient populations in the mid-South and southeastern United States.

  14. COSAL: A black-box compressible stability analysis code for transition prediction in three-dimensional boundary layers

    NASA Technical Reports Server (NTRS)

    Malik, M. R.

    1982-01-01

    A fast computer code COSAL for transition prediction in three dimensional boundary layers using compressible stability analysis is described. The compressible stability eigenvalue problem is solved using a finite difference method, and the code is a black box in the sense that no guess of the eigenvalue is required from the user. Several optimization procedures were incorporated into COSAL to calculate integrated growth rates (N factor) for transition correlation for swept and tapered laminar flow control wings using the well known e to the Nth power method. A user's guide to the program is provided.

  15. Data Management System (DMS) Evolution Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Katherine

    1990-01-01

    The all encompassing goal for the Data Management System (DMS) Evolution Analysis task is to develop an advocacy for ensuring that growth and technology insertion issues are properly and adequately addressed during DMS requirements specification, design, and development. The most efficient methods of addressing those issues are via planned and graceful evolution, technology transparency, and system growth margins. It is necessary that provisions, such as those previously mentioned, are made to accommodate advanced missions requirements (e.g., Human Space Exploration Programs) in addition to evolving Space Station Freedom operations and user requirements .

  16. An Analysis of Data Breach Notifications as Negative News

    ERIC Educational Resources Information Center

    Veltsos, Jennifer R.

    2012-01-01

    Forty-six states require organizations to notify users when personally identifiable information has been exposed or when the organization's data security measures have been breached. This article describes a qualitative document analysis of 13 data breach notification templates from state and federal agencies. The results confirm much of the…

  17. Development of a shuttle recovery Commercial Materials Processing in Space (CMPS) program

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The work performed has covered the following tasks: update commercial users requirements; assess availability of carriers and facilities; shuttle availability assessment; development of optimum accommodations plan; and payload documentation requirements assessment. The results from the first four tasks are presented. To update commercial user requirements, contacts were made with the JEA and CCDS partners to obtain copies of their most recent official flight requests. From these requests the commercial partners' short and long range plans for flight dates, flight frequency, experiment hardware and carriers was determined. A 34 by 44 inch chart was completed to give a snapshot view of the progress of commercialization in space. Further, an assessment was made of the availability of carriers and facilities. Both existing carriers and those under development were identified for use by the commercial partners. A data base was compiled to show the capabilities of the carriers. A shuttle availability assessment was performed using the primary and secondary shuttle manifests released by NASA. Analysis of the manifest produced a flight-by-flight list of flight opportunities available to commercial users. Using inputs from the first three tasks, an Optimum Accommodations Plan was developed. The Accommodation Plan shows the commercial users manifested by flight, the experiment flown, the carrier used and complete list of commercial users that could not be manifested in each calendar year.

  18. Aeroelastic analysis for propellers - mathematical formulations and program user's manual

    NASA Technical Reports Server (NTRS)

    Bielawa, R. L.; Johnson, S. A.; Chi, R. M.; Gangwani, S. T.

    1983-01-01

    Mathematical development is presented for a specialized propeller dedicated version of the G400 rotor aeroelastic analysis. The G400PROP analysis simulates aeroelastic characteristics particular to propellers such as structural sweep, aerodynamic sweep and high subsonic unsteady airloads (both stalled and unstalled). Formulations are presented for these expanded propeller related methodologies. Results of limited application of the analysis to realistic blade configurations and operating conditions which include stable and unstable stall flutter test conditions are given. Sections included for enhanced program user efficiency and expanded utilization include descriptions of: (1) the structuring of the G400PROP FORTRAN coding; (2) the required input data; and (3) the output results. General information to facilitate operation and improve efficiency is also provided.

  19. A Web Architecture to Geographically Interrogate CHIRPS Rainfall and eMODIS NDVI for Land Use Change

    NASA Technical Reports Server (NTRS)

    Burks, Jason E.; Limaye, Ashutosh

    2014-01-01

    Monitoring of rainfall and vegetation over the continent of Africa is important for assessing the status of crop health and agriculture, along with long-term changes in land use change. These issues can be addressed through examination of long-term precipitation (rainfall) data sets and remote sensing of land surface vegetation and land use types. Two products have been used previously to address these goals: the Climate Hazard Group Infrared Precipitation with Stations (CHIRPS) rainfall data, and multi-day composites of Normalized Difference Vegetation Index (NDVI) from the USGS eMODIS product. Combined, these are very large data sets that require unique tools and architecture to facilitate a variety of data analysis methods or data exploration by the end user community. To address these needs, a web-enabled system has been developed to allow end-users to interrogate CHIRPS rainfall and eMODIS NDVI data over the continent of Africa. The architecture allows end-users to use custom defined geometries, or the use of predefined political boundaries in their interrogation of the data. The massive amount of data interrogated by the system allows the end-users with only a web browser to extract vital information in order to investigate land use change and its causes. The system can be used to generate daily, monthly and yearly averages over a geographical area and range of dates of interest to the user. It also provides analysis of trends in precipitation or vegetation change for times of interest. The data provided back to the end-user is displayed in graphical form and can be exported for use in other, external tools. The development of this tool has significantly decreased the investment and requirements for end-users to use these two important datasets, while also allowing the flexibility to the end-user to limit the search to the area of interest.

  20. "New Space Explosion" and Earth Observing System Capabilities

    NASA Astrophysics Data System (ADS)

    Stensaas, G. L.; Casey, K.; Snyder, G. I.; Christopherson, J.

    2017-12-01

    This presentation will describe recent developments in spaceborne remote sensing, including introduction to some of the increasing number of new firms entering the market, along with new systems and successes from established players, as well as industry consolidation reactions to these developments from communities of users. The information in this presentation will include inputs from the results of the Joint Agency Commercial Imagery Evaluation (JACIE) 2017 Civil Commercial Imagery Evaluation Workshop and the use of the US Geological Survey's Requirements Capabilities and Analysis for Earth Observation (RCA-EO) centralized Earth observing systems database and how system performance parameters are used with user science applications requirements.

  1. "Recovery" in bipolar disorder: how can service users be supported through a self-management intervention? A qualitative focus group study.

    PubMed

    Todd, Nicholas J; Jones, Steven H; Lobban, Fiona A

    2012-04-01

    Bipolar disorder (BD) is a chronic and recurrent affective disorder. Recovery is defined as the process by which people can live fulfilling lives despite experiencing symptoms. To explore how an opportunistically recruited group of service users with BD experience recovery and self-management to understand more about how a service users' recovery may be supported. Twelve service users with BD took part in a series of focus groups. Service users' responses to questions about their personal experiences of self-management and recovery were analysed. Focus groups were transcribed verbatim and thematic analysis ([ Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101]) was employed to identify common themes in the data. Four key themes were identified: (1) Recovery is not about being symptom free; (2) Recovery requires taking responsibility for your own wellness; (3) Self-management: building on existing techniques; (4) Overcoming barriers to recovery: negativity, stigma and taboo. Service users with BD have provided further support for the concept of recovery and have suggested a number of ways recovery can be supported. A self-management approach informed by the recovery literature has been proposed as a way to support service users' recovery.

  2. Literature Review on Needs of Upper Limb Prosthesis Users.

    PubMed

    Cordella, Francesca; Ciancio, Anna Lisa; Sacchetti, Rinaldo; Davalli, Angelo; Cutti, Andrea Giovanni; Guglielmelli, Eugenio; Zollo, Loredana

    2016-01-01

    The loss of one hand can significantly affect the level of autonomy and the capability of performing daily living, working and social activities. The current prosthetic solutions contribute in a poor way to overcome these problems due to limitations in the interfaces adopted for controlling the prosthesis and to the lack of force or tactile feedback, thus limiting hand grasp capabilities. This paper presents a literature review on needs analysis of upper limb prosthesis users, and points out the main critical aspects of the current prosthetic solutions, in terms of users satisfaction and activities of daily living they would like to perform with the prosthetic device. The ultimate goal is to provide design inputs in the prosthetic field and, contemporary, increase user satisfaction rates and reduce device abandonment. A list of requirements for upper limb prostheses is proposed, grounded on the performed analysis on user needs. It wants to (i) provide guidelines for improving the level of acceptability and usefulness of the prosthesis, by accounting for hand functional and technical aspects; (ii) propose a control architecture of PNS-based prosthetic systems able to satisfy the analyzed user wishes; (iii) provide hints for improving the quality of the methods (e.g., questionnaires) adopted for understanding the user satisfaction with their prostheses.

  3. Literature Review on Needs of Upper Limb Prosthesis Users

    PubMed Central

    Cordella, Francesca; Ciancio, Anna Lisa; Sacchetti, Rinaldo; Davalli, Angelo; Cutti, Andrea Giovanni; Guglielmelli, Eugenio; Zollo, Loredana

    2016-01-01

    The loss of one hand can significantly affect the level of autonomy and the capability of performing daily living, working and social activities. The current prosthetic solutions contribute in a poor way to overcome these problems due to limitations in the interfaces adopted for controlling the prosthesis and to the lack of force or tactile feedback, thus limiting hand grasp capabilities. This paper presents a literature review on needs analysis of upper limb prosthesis users, and points out the main critical aspects of the current prosthetic solutions, in terms of users satisfaction and activities of daily living they would like to perform with the prosthetic device. The ultimate goal is to provide design inputs in the prosthetic field and, contemporary, increase user satisfaction rates and reduce device abandonment. A list of requirements for upper limb prostheses is proposed, grounded on the performed analysis on user needs. It wants to (i) provide guidelines for improving the level of acceptability and usefulness of the prosthesis, by accounting for hand functional and technical aspects; (ii) propose a control architecture of PNS-based prosthetic systems able to satisfy the analyzed user wishes; (iii) provide hints for improving the quality of the methods (e.g., questionnaires) adopted for understanding the user satisfaction with their prostheses. PMID:27242413

  4. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual

  5. Development of a site analysis tool for distributed wind projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, Shawn

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimatesmore » of site and system energy output and feasibility. DSAT-which is accessible online and requires no purchase or download of software-is available in two account types; Standard: This free account allows the user to analyze a limited number of sites and to produce a system performance report for each; and Professional: For a small annual fee users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.« less

  6. ISPyB for BioSAXS, the gateway to user autonomy in solution scattering experiments.

    PubMed

    De Maria Antolinos, Alejandro; Pernot, Petra; Brennich, Martha E; Kieffer, Jérôme; Bowler, Matthew W; Delageniere, Solange; Ohlsson, Staffan; Malbet Monaco, Stephanie; Ashton, Alun; Franke, Daniel; Svergun, Dmitri; McSweeney, Sean; Gordon, Elspeth; Round, Adam

    2015-01-01

    Logging experiments with the laboratory-information management system ISPyB (Information System for Protein crystallography Beamlines) enhances the automation of small-angle X-ray scattering of biological macromolecules in solution (BioSAXS) experiments. The ISPyB interface provides immediate user-oriented online feedback and enables data cross-checking and downstream analysis. To optimize data quality and completeness, ISPyBB (ISPyB for BioSAXS) makes it simple for users to compare the results from new measurements with previous acquisitions from the same day or earlier experiments in order to maximize the ability to collect all data required in a single synchrotron visit. The graphical user interface (GUI) of ISPyBB has been designed to guide users in the preparation of an experiment. The input of sample information and the ability to outline the experimental aims in advance provides feedback on the number of measurements required, calculation of expected sample volumes and time needed to collect the data: all of this information aids the users to better prepare for their trip to the synchrotron. A prototype version of the ISPyBB database is now available at the European Synchrotron Radiation Facility (ESRF) beamline BM29 and is already greatly appreciated by academic users and industrial clients. It will soon be available at the PETRA III beamline P12 and the Diamond Light Source beamlines I22 and B21.

  7. ISPyB for BioSAXS, the gateway to user autonomy in solution scattering experiments

    PubMed Central

    De Maria Antolinos, Alejandro; Pernot, Petra; Brennich, Martha E.; Kieffer, Jérôme; Bowler, Matthew W.; Delageniere, Solange; Ohlsson, Staffan; Malbet Monaco, Stephanie; Ashton, Alun; Franke, Daniel; Svergun, Dmitri; McSweeney, Sean; Gordon, Elspeth; Round, Adam

    2015-01-01

    Logging experiments with the laboratory-information management system ISPyB (Information System for Protein crystallography Beamlines) enhances the automation of small-angle X-ray scattering of biological macromolecules in solution (BioSAXS) experiments. The ISPyB interface provides immediate user-oriented online feedback and enables data cross-checking and downstream analysis. To optimize data quality and completeness, ISPyBB (ISPyB for BioSAXS) makes it simple for users to compare the results from new measurements with previous acquisitions from the same day or earlier experiments in order to maximize the ability to collect all data required in a single synchrotron visit. The graphical user interface (GUI) of ISPyBB has been designed to guide users in the preparation of an experiment. The input of sample information and the ability to outline the experimental aims in advance provides feedback on the number of measurements required, calculation of expected sample volumes and time needed to collect the data: all of this information aids the users to better prepare for their trip to the synchrotron. A prototype version of the ISPyBB database is now available at the European Synchrotron Radiation Facility (ESRF) beamline BM29 and is already greatly appreciated by academic users and industrial clients. It will soon be available at the PETRA III beamline P12 and the Diamond Light Source beamlines I22 and B21. PMID:25615862

  8. SYSTID - A flexible tool for the analysis of communication systems.

    NASA Technical Reports Server (NTRS)

    Dawson, C. T.; Tranter, W. H.

    1972-01-01

    Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.

  9. Transforming User Needs into Functional Requirements for an Antibiotic Clinical Decision Support System

    PubMed Central

    Bright, T.J.

    2013-01-01

    Summary Background Many informatics studies use content analysis to generate functional requirements for system development. Explication of this translational process from qualitative data to functional requirements can strengthen the understanding and scientific rigor when applying content analysis in informatics studies. Objective To describe a user-centered approach transforming emergent themes derived from focus group data into functional requirements for informatics solutions and to illustrate these methods to the development of an antibiotic clinical decision support system (CDS). Methods The approach consisted of five steps: 1) identify unmet therapeutic planning information needs via Focus Group Study-I, 2) develop a coding framework of therapeutic planning themes to refine the domain scope to antibiotic therapeutic planning, 3) identify functional requirements of an antibiotic CDS system via Focus Group Study-II, 4) discover informatics solutions and functional requirements from coded data, and 5) determine the types of information needed to support the antibiotic CDS system and link with the identified informatics solutions and functional requirements. Results The coding framework for Focus Group Study-I revealed unmet therapeutic planning needs. Twelve subthemes emerged and were clustered into four themes; analysis indicated a need for an antibiotic CDS intervention. Focus Group Study-II included five types of information needs. Comments from the Barrier/Challenge to information access and Function/Feature themes produced three informatics solutions and 13 functional requirements of an antibiotic CDS system. Comments from the Patient, Institution, and Domain themes generated required data elements for each informatics solution. Conclusion This study presents one example explicating content analysis of focus group data and the analysis process to functional requirements from narrative data. Illustration of this 5-step method was used to develop an antibiotic CDS system, resolving unmet antibiotic prescribing needs. As a reusable approach, these techniques can be refined and applied to resolve unmet information needs with informatics interventions in additional domains. PMID:24454586

  10. Geoscience data visualization and analysis using GeoMapApp

    NASA Astrophysics Data System (ADS)

    Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha

    2013-04-01

    Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data, images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and to quantitatively interrogate data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). Once data are loaded in GeoMapApp, a variety options are provided to export data and/or 2D/3D visualizations into common formats including grids, images, text files, spreadsheets, etc. Examples of interdisciplinary investigations that make use of GeoMapApp visualization and analysis functionality will be provided.

  11. Use of DAGMan in CRAB3 to Improve the Splitting of CMS User Jobs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolf, M.; Mascheroni, M.; Woodard, A.

    CRAB3 is a workload management tool used by CMS physicists to analyze data acquired by the Compact Muon Solenoid (CMS) detector at the CERN Large Hadron Collider (LHC). Research in high energy physics often requires the analysis of large collections of files, referred to as datasets. The task is divided into jobs that are distributed among a large collection of worker nodes throughout the Worldwide LHC Computing Grid (WLCG). Splitting a large analysis task into optimally sized jobs is critical to efficient use of distributed computing resources. Jobs that are too big will have excessive runtimes and will not distributemore » the work across all of the available nodes. However, splitting the project into a large number of very small jobs is also inefficient, as each job creates additional overhead which increases load on infrastructure resources. Currently this splitting is done manually, using parameters provided by the user. However the resources needed for each job are difficult to predict because of frequent variations in the performance of the user code and the content of the input dataset. As a result, dividing a task into jobs by hand is difficult and often suboptimal. In this work we present a new feature called “automatic splitting” which removes the need for users to manually specify job splitting parameters. We discuss how HTCondor DAGMan can be used to build dynamic Directed Acyclic Graphs (DAGs) to optimize the performance of large CMS analysis jobs on the Grid. We use DAGMan to dynamically generate interconnected DAGs that estimate the processing time the user code will require to analyze each event. This is used to calculate an estimate of the total processing time per job, and a set of analysis jobs are run using this estimate as a specified time limit. Some jobs may not finish within the alloted time; they are terminated at the time limit, and the unfinished data is regrouped into smaller jobs and resubmitted.« less

  12. Use of DAGMan in CRAB3 to improve the splitting of CMS user jobs

    NASA Astrophysics Data System (ADS)

    Wolf, M.; Mascheroni, M.; Woodard, A.; Belforte, S.; Bockelman, B.; Hernandez, J. M.; Vaandering, E.

    2017-10-01

    CRAB3 is a workload management tool used by CMS physicists to analyze data acquired by the Compact Muon Solenoid (CMS) detector at the CERN Large Hadron Collider (LHC). Research in high energy physics often requires the analysis of large collections of files, referred to as datasets. The task is divided into jobs that are distributed among a large collection of worker nodes throughout the Worldwide LHC Computing Grid (WLCG). Splitting a large analysis task into optimally sized jobs is critical to efficient use of distributed computing resources. Jobs that are too big will have excessive runtimes and will not distribute the work across all of the available nodes. However, splitting the project into a large number of very small jobs is also inefficient, as each job creates additional overhead which increases load on infrastructure resources. Currently this splitting is done manually, using parameters provided by the user. However the resources needed for each job are difficult to predict because of frequent variations in the performance of the user code and the content of the input dataset. As a result, dividing a task into jobs by hand is difficult and often suboptimal. In this work we present a new feature called “automatic splitting” which removes the need for users to manually specify job splitting parameters. We discuss how HTCondor DAGMan can be used to build dynamic Directed Acyclic Graphs (DAGs) to optimize the performance of large CMS analysis jobs on the Grid. We use DAGMan to dynamically generate interconnected DAGs that estimate the processing time the user code will require to analyze each event. This is used to calculate an estimate of the total processing time per job, and a set of analysis jobs are run using this estimate as a specified time limit. Some jobs may not finish within the alloted time; they are terminated at the time limit, and the unfinished data is regrouped into smaller jobs and resubmitted.

  13. ProteoSign: an end-user online differential proteomics statistical analysis platform.

    PubMed

    Efstathiou, Georgios; Antonakis, Andreas N; Pavlopoulos, Georgios A; Theodosiou, Theodosios; Divanach, Peter; Trudgian, David C; Thomas, Benjamin; Papanikolaou, Nikolas; Aivaliotis, Michalis; Acuto, Oreste; Iliopoulos, Ioannis

    2017-07-03

    Profiling of proteome dynamics is crucial for understanding cellular behavior in response to intrinsic and extrinsic stimuli and maintenance of homeostasis. Over the last 20 years, mass spectrometry (MS) has emerged as the most powerful tool for large-scale identification and characterization of proteins. Bottom-up proteomics, the most common MS-based proteomics approach, has always been challenging in terms of data management, processing, analysis and visualization, with modern instruments capable of producing several gigabytes of data out of a single experiment. Here, we present ProteoSign, a freely available web application, dedicated in allowing users to perform proteomics differential expression/abundance analysis in a user-friendly and self-explanatory way. Although several non-commercial standalone tools have been developed for post-quantification statistical analysis of proteomics data, most of them are not end-user appealing as they often require very stringent installation of programming environments, third-party software packages and sometimes further scripting or computer programming. To avoid this bottleneck, we have developed a user-friendly software platform accessible via a web interface in order to enable proteomics laboratories and core facilities to statistically analyse quantitative proteomics data sets in a resource-efficient manner. ProteoSign is available at http://bioinformatics.med.uoc.gr/ProteoSign and the source code at https://github.com/yorgodillo/ProteoSign. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. Long-Term Preservation and Advanced Access Services to Archived Data: The Approach of a System Integrator

    NASA Astrophysics Data System (ADS)

    Petitjean, Gilles; de Hauteclocque, Bertrand

    2004-06-01

    EADS Defence and Security Systems (EADS DS SA) have developed an expertise as integrator of archive management systems for both their commercial and defence customers (ESA, CNES, EC, EUMETSAT, French MOD, US DOD, etc.), especially in Earth Observation and in Meteorology fields.The concern of valuable data owners is both their long-term preservation but also the integration of the archive in their information system with in particular an efficient access to archived data for their user community. The system integrator answers to this requirement by a methodology combining understanding of user needs, exhaustive knowledge of the existing solutions both for hardware and software elements and development and integration ability. The system integrator completes the facility development by support activities.The long-term preservation of archived data obviously involves a pertinent selection of storage media and archive library. This selection relies on storage technology survey but the selection criteria depend on the analysis of the user needs. The system integrator will recommend the best compromise for implementing an archive management facility, thanks to its knowledge and its independence of storage market and through the analysis of the user requirements. He will provide a solution, which is able to evolve to take advantage of the storage technology progress.But preserving the data for long-term is not only a question of storage technology. Some functions are required to secure the archive management system against contingency situation: multiple data set copies using operational procedures, active quality control of the archived data, migration policy optimising the cost of ownership.

  15. Advanced user support programme—TEMPUS IML-2

    NASA Astrophysics Data System (ADS)

    Diefenbach, A.; Kratz, M.; Uffelmann, D.; Willnecker, R.

    1995-05-01

    The DLR Microgravity User Support Centre (MUSC) in Cologne has supported microgravity experiments in the field of materials and life sciences since 1979. In the beginning of user support activities, MUSC tasks comprised the basic ground and mission support, whereas present programmes are expanded on, for example, powerful telescience and advanced real time data acquisition capabilities for efficient experiment operation and monitoring. In view of the Space Station era, user support functions will increase further. Additional tasks and growing responsibilities must be covered, e.g. extended science support as well as experiment and facility operations. The user support for TEMPUS IML-2, under contract of the German Space Agency DARA, represents a further step towards the required new-generation of future ground programme. TEMPUS is a new highly sophisticated Spacelab multi-user facility for containerless processing of metallic samples. Electromagnetic levitation technique is applied and various experiment diagnosis tools are offered. Experiments from eight U.S. and German investigator groups have been selected for flight on the second International Microgravity Laboratory Mission IML-2 in 1994. Based on the experience gained in the research programme of the DLR Institute for Space Simulation since 1984, MUSC is performing a comprehensive experiment preparation programme in close collaboration with the investigator teams. Complex laboratory equipment has been built up for technology and experiment preparation development. New experiment techniques have been developed for experiment verification tests. The MUSC programme includes thorough analysis and testing of scientific requirements of every proposed experiment with respect to the facility hard- and software capabilities. In addition, studies on the experiment-specific operation requirements have been performed and suitable telescience scenarios were analysed. The present paper will give a survey of the TEMPUS user support tasks emphasizing the advanced science support activities, which are considered significant for future ground programmes.

  16. Text-Content-Analysis based on the Syntactic Correlations between Ontologies

    NASA Astrophysics Data System (ADS)

    Tenschert, Axel; Kotsiopoulos, Ioannis; Koller, Bastian

    The work presented in this chapter is concerned with the analysis of semantic knowledge structures, represented in the form of Ontologies, through which Service Level Agreements (SLAs) are enriched with new semantic data. The objective of the enrichment process is to enable SLA negotiation in a way that is much more convenient for a Service Users. For this purpose the deployment of an SLA-Management-System as well as the development of an analyzing procedure for Ontologies is required. This chapter will refer to the BREIN, the FinGrid and the LarKC projects. The analyzing procedure examines the syntactic correlations of several Ontologies whose focus lies in the field of mechanical engineering. A method of analyzing text and content is developed as part of this procedure. In order to so, we introduce a formalism as well as a method for understanding content. The analysis and methods are integrated to an SLA Management System which enables a Service User to interact with the system as a service by negotiating the user requests and including the semantic knowledge. Through negotiation between Service User and Service Provider the analysis procedure considers the user requests by extending the SLAs with semantic knowledge. Through this the economic use of an SLA-Management-System is increased by the enhancement of SLAs with semantic knowledge structures. The main focus of this chapter is the analyzing procedure, respectively the Text-Content-Analysis, which provides the mentioned semantic knowledge structures.

  17. 'Fly Like This': Natural Language Interface for UAV Mission Planning

    NASA Technical Reports Server (NTRS)

    Chandarana, Meghan; Meszaros, Erica L.; Trujillo, Anna; Allen, B. Danette

    2017-01-01

    With the increasing presence of unmanned aerial vehicles (UAVs) in everyday environments, the user base of these powerful and potentially intelligent machines is expanding beyond exclusively highly trained vehicle operators to include non-expert system users. Scientists seeking to augment costly and often inflexible methods of data collection historically used are turning towards lower cost and reconfigurable UAVs. These new users require more intuitive and natural methods for UAV mission planning. This paper explores two natural language interfaces - gesture and speech - for UAV flight path generation through individual user studies. Subjects who participated in the user studies also used a mouse-based interface for a baseline comparison. Each interface allowed the user to build flight paths from a library of twelve individual trajectory segments. Individual user studies evaluated performance, efficacy, and ease-of-use of each interface using background surveys, subjective questionnaires, and observations on time and correctness. Analysis indicates that natural language interfaces are promising alternatives to traditional interfaces. The user study data collected on the efficacy and potential of each interface will be used to inform future intuitive UAV interface design for non-expert users.

  18. Full-scale system impact analysis: Digital document storage project

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Digital Document Storage Full Scale System can provide cost effective electronic document storage, retrieval, hard copy reproduction, and remote access for users of NASA Technical Reports. The desired functionality of the DDS system is highly dependent on the assumed requirements for remote access used in this Impact Analysis. It is highly recommended that NASA proceed with a phased, communications requirement analysis to ensure that adequate communications service can be supplied at a reasonable cost in order to validate recent working assumptions upon which the success of the DDS Full Scale System is dependent.

  19. [Development of Hospital Equipment Maintenance Information System].

    PubMed

    Zhou, Zhixin

    2015-11-01

    Hospital equipment maintenance information system plays an important role in improving medical treatment quality and efficiency. By requirement analysis of hospital equipment maintenance, the system function diagram is drawed. According to analysis of input and output data, tables and reports in connection with equipment maintenance process, relationships between entity and attribute is found out, and E-R diagram is drawed and relational database table is established. Software development should meet actual process requirement of maintenance and have a friendly user interface and flexible operation. The software can analyze failure cause by statistical analysis.

  20. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    NASA Technical Reports Server (NTRS)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software which are integrated with relevant analysis parameters specific to SCaN assets and SCaN supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, 2) will provide an all-in-one package for various analysis capabilities that normally requires add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK(Registered Trademark) contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.

  1. IAC-1.5 - INTEGRATED ANALYSIS CAPABILITY

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and a database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a database, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automating data transfer among analysis programs. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation modules are supplied for building and viewing models. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 6) Special purpose modules are included, such as MIMIC (Model Integration via Mesh Interpolation Coefficients), which transforms field values from one model to another; LINK, which simplifies incorporation of user specific modules into IAC modules; and DATAPAC, the National Bureau of Standards statistical analysis package. The IAC database contains structured files which provide a common basis for communication between modules and the executive system, and can contain unstructured files such as NASTRAN checkpoint files, DISCOS plot files, object code, etc. The user can define groups of data and relations between them. A full data manipulation and query system operates with the database. The current interface modules comprise five groups: 1) Structural analysis - IAC contains a NASTRAN interface for standalone analysis or certain structural/control/thermal combinations. IAC provides enhanced structural capabilities for normal modes and static deformation analysis via special DMAP sequences. 2) Thermal analysis - IAC supports finite element and finite difference techniques for steady state or transient analysis. There are interfaces for the NASTRAN thermal analyzer, SINDA/SINFLO, and TRASYS II. 3) System dynamics - A DISCOS interface allows full use of this simulation program for either nonlinear time domain analysis or linear frequency domain analysis. 4) Control analysis - Interfaces for the ORACLS, SAMSAN, NBOD2, and INCA programs allow a wide range of control system analyses and synthesis techniques. 5) Graphics - The graphics packages PLOT and MOSAIC are included in IAC. PLOT generates vector displays of tabular data in the form of curves, charts, correlation tables, etc., while MOSAIC generates color raster displays of either tabular of array type data. Either DI3000 or PLOT-10 graphics software is required for full graphics capability. IAC is available by license for a period of 10 years to approved licensees. The licensed program product includes one complete set of supporting documentation. 
Additional copies of the documentation may be purchased separately. IAC is written in FORTRAN 77 and has been implemented on a DEC VAX series computer operating under VMS. IAC can be executed by multiple concurrent users in batch or interactive mode. The basic central memory requirement is approximately 750KB. IAC includes the executive system, graphics modules, a database, general utilities, and the interfaces to all analysis and controls programs described above. Source code is provided for the control programs ORACLS, SAMSAN, NBOD2, and DISCOS. The following programs are also available from COSMIC as separate packages: NASTRAN, SINDA/SINFLO, TRASYS II, DISCOS, ORACLS, SAMSAN, NBOD2, and INCA.

  2. IAC-1.5 - INTEGRATED ANALYSIS CAPABILITY

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and a database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a database, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automating data transfer among analysis programs. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation modules are supplied for building and viewing models. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 6) Special purpose modules are included, such as MIMIC (Model Integration via Mesh Interpolation Coefficients), which transforms field values from one model to another; LINK, which simplifies incorporation of user specific modules into IAC modules; and DATAPAC, the National Bureau of Standards statistical analysis package. The IAC database contains structured files which provide a common basis for communication between modules and the executive system, and can contain unstructured files such as NASTRAN checkpoint files, DISCOS plot files, object code, etc. The user can define groups of data and relations between them. A full data manipulation and query system operates with the database. The current interface modules comprise five groups: 1) Structural analysis - IAC contains a NASTRAN interface for standalone analysis or certain structural/control/thermal combinations. IAC provides enhanced structural capabilities for normal modes and static deformation analysis via special DMAP sequences. 2) Thermal analysis - IAC supports finite element and finite difference techniques for steady state or transient analysis. There are interfaces for the NASTRAN thermal analyzer, SINDA/SINFLO, and TRASYS II. 3) System dynamics - A DISCOS interface allows full use of this simulation program for either nonlinear time domain analysis or linear frequency domain analysis. 4) Control analysis - Interfaces for the ORACLS, SAMSAN, NBOD2, and INCA programs allow a wide range of control system analyses and synthesis techniques. 5) Graphics - The graphics packages PLOT and MOSAIC are included in IAC. PLOT generates vector displays of tabular data in the form of curves, charts, correlation tables, etc., while MOSAIC generates color raster displays of either tabular or array type data. Either DI3000 or PLOT-10 graphics software is required for full graphics capability. IAC is available by license for a period of 10 years to approved licensees. The licensed program product includes one complete set of supporting documentation.
Additional copies of the documentation may be purchased separately. IAC is written in FORTRAN 77 and has been implemented on a DEC VAX series computer operating under VMS. IAC can be executed by multiple concurrent users in batch or interactive mode. The basic central memory requirement is approximately 750KB. IAC includes the executive system, graphics modules, a database, general utilities, and the interfaces to all analysis and controls programs described above. Source code is provided for the control programs ORACLS, SAMSAN, NBOD2, and DISCOS. The following programs are also available from COSMIC as separate packages: NASTRAN, SINDA/SINFLO, TRASYS II, DISCOS, ORACLS, SAMSAN, NBOD2, and INCA. IAC was developed in 1985.

  3. Automated Tracking of Cell Migration with Rapid Data Analysis.

    PubMed

    DuChez, Brian J

    2017-09-01

    Cell migration is essential for many biological processes including development, wound healing, and metastasis. However, studying cell migration often requires the time-consuming and labor-intensive task of manually tracking cells. To accelerate the task of obtaining coordinate positions of migrating cells, we have developed a graphical user interface (GUI) capable of automating the tracking of fluorescently labeled nuclei. This GUI provides an intuitive user interface that makes automated tracking accessible to researchers with no image-processing experience or familiarity with particle-tracking approaches. Using this GUI, users can interactively determine a minimum of four parameters to identify fluorescently labeled cells and automate acquisition of cell trajectories. Additional features allow for batch processing of numerous time-lapse images, curation of unwanted tracks, and subsequent statistical analysis of tracked cells. Statistical outputs allow users to evaluate migratory phenotypes, including cell speed, distance, displacement, and persistence, as well as measures of directional movement, such as forward migration index (FMI) and angular displacement. © 2017 by John Wiley & Sons, Inc.
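
    The migration measures named above can be computed directly from tracked coordinates. A minimal sketch (not the GUI's own code) is given below; it assumes a track is an array of per-frame (x, y) nucleus positions, and the function name and sample values are illustrative.

        import numpy as np

        def migration_stats(track, dt=1.0):
            """Summary statistics for one cell track of shape (n_frames, 2)."""
            steps = np.diff(track, axis=0)              # per-frame displacement vectors
            path_length = np.linalg.norm(steps, axis=1).sum()   # total distance traveled
            net = track[-1] - track[0]                  # start-to-end displacement vector
            displacement = np.linalg.norm(net)
            speed = path_length / (dt * len(steps))     # mean speed
            # Persistence: net displacement per unit path length (1 = straight line).
            persistence = displacement / path_length if path_length else 0.0
            # Forward migration index: net movement along one axis per unit path length.
            fmi_x = net[0] / path_length if path_length else 0.0
            fmi_y = net[1] / path_length if path_length else 0.0
            return dict(speed=speed, distance=path_length, displacement=displacement,
                        persistence=persistence, fmi_x=fmi_x, fmi_y=fmi_y)

        track = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 0.4], [3.5, 1.0]])
        print(migration_stats(track, dt=5.0))           # e.g. 5 minutes between frames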

  4. Studying Psychosocial Barriers to Drug Treatment Among Chinese Methamphetamine Users Using A 3-Step Latent Class Analysis.

    PubMed

    Wang, Jichuan; Kelly, Brian C; Liu, Tieqiao; Hao, Wei

    2016-03-01

    Given the growth in methamphetamine use in China during the 21st century, we assessed perceived psychosocial barriers to drug treatment among this population. Using a sample of 303 methamphetamine users recruited via Respondent Driven Sampling, we use Latent Class Analysis (LCA) to identify possible distinct latent groups among Chinese methamphetamine users on the basis of their perceptions of psychosocial barriers to drug treatment. Covariates were then included to predict latent class membership using the 3-step modeling approach. Our findings indicate that the Chinese methamphetamine-using population was heterogeneous in its perceptions of drug treatment barriers; four distinct latent classes (subpopulations) were identified--Unsupported Deniers, Deniers, Privacy Anxious, and Low Barriers--and individual characteristics shaped the probability of class membership. Efforts to link Chinese methamphetamine users to treatment may require a multi-faceted approach that attends to differing perceptions about impediments to drug treatment. Copyright © 2015. Published by Elsevier Inc.

  5. JAliEn - A new interface between the AliEn jobs and the central services

    NASA Astrophysics Data System (ADS)

    Grigoras, A. G.; Grigoras, C.; Pedreira, M. M.; Saiz, P.; Schreiner, S.

    2014-06-01

    Since the ALICE experiment began data taking in early 2010, the amount of end-user jobs on the AliEn Grid has increased significantly. Presently, 1/3 of the 40K CPU cores available to ALICE are occupied by jobs submitted by about 400 distinct users, individually or in organized analysis trains. The overall stability of the AliEn middleware has been excellent throughout the 3 years of running, but the massive amount of end-user analysis, with its specific requirements and load, has revealed a few components which can be improved. One of them is the interface between users and the central AliEn services (catalogue, job submission system), which we are currently re-implementing in Java. The interface provides a persistent connection with enhanced data and job submission authenticity. In this paper we will describe the architecture of the new interface, the ROOT binding which enables the use of a single interface in addition to the standard UNIX-like access shell, and the new security-related features.

  6. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  7. User Requirements Analysis For Digital Library Application Using Quality Function Deployment.

    NASA Astrophysics Data System (ADS)

    Wulandari, Lily; Sularto, Lana; Yusnitasari, Tristyanti; Ikasari, Diana

    2017-03-01

    This study attempts to build a Smart Digital Library that can be used by the wider community wherever they are. The system is built in the form of a Smart Digital Library portal which uses a semantic similarity method to search for journals, articles, or books by title or author name. This method is also used to recommend books to visitors of the Smart Digital Library automatically, based on testimonials from previous readers. The steps taken in the development of the Smart Digital Library system are the analysis phase, the design phase, and the testing and implementation phase. The analysis phase, whose purpose is to identify consumer needs and technical requirements, uses WebQual to prepare the instruments distributed to respondents; the data obtained from the respondents are then processed using Quality Function Deployment (QFD). The analysis was performed on existing web digital libraries, including those of Gunadarma University, Bogor Institute of Agriculture, and the University of Indonesia. The questionnaire was distributed to 200 respondents. The research methodology begins with the collection of user requirements and their analysis using QFD. Application design is funded by the government through the Featured Universities Research program of the Directorate General of Higher Education (DIKTI). Conclusions from this research identify the consumer requirements of a digital library application: 13 consumer-requirement elements and 25 engineering-characteristic elements of digital library requirements. The digital library application is therefore designed according to these findings, eliminating features that users do not need, based on the QFD House of Quality.
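
    The central QFD computation mentioned here, turning weighted consumer requirements into technical importance scores for engineering characteristics through a House of Quality relationship matrix, is easy to illustrate. The sketch below is a hedged toy example: the weights and the 3 x 4 relationship matrix are hypothetical stand-ins for the study's 13 x 25 matrix.

        import numpy as np

        # Conventional QFD relationship scale: 9 = strong, 3 = moderate, 1 = weak, 0 = none.
        importance = np.array([5, 3, 4])      # respondent-derived weight per requirement
        relationship = np.array([
            [9, 3, 0, 1],                     # e.g. "fast search"
            [1, 9, 3, 0],                     # e.g. "relevant recommendations"
            [0, 3, 9, 3],                     # e.g. "easy navigation"
        ])

        # Technical importance of each engineering characteristic: weighted column sum.
        raw = importance @ relationship
        relative = 100 * raw / raw.sum()
        for j, (score, rel) in enumerate(zip(raw, relative)):
            print(f"characteristic {j}: score={score}, relative weight={rel:.1f}%")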

  8. MetaNET--a web-accessible interactive platform for biological metabolic network analysis.

    PubMed

    Narang, Pankaj; Khan, Shawez; Hemrom, Anmol Jaywant; Lynn, Andrew Michael

    2014-01-01

    Metabolic reactions have been extensively studied and compiled over the last century. These have provided a theoretical base to implement models, simulations of which are used to identify drug targets and optimize metabolic throughput at a systemic level. While tools for the perturbation of metabolic networks are available, their applications are limited and restricted as they require varied dependencies and often a commercial platform for full functionality. We have developed MetaNET, an open-source, user-friendly, platform-independent, and web-accessible resource consisting of several pre-defined workflows for metabolic network analysis. MetaNET is a web-accessible platform that incorporates a range of functions which can be combined to produce different simulations related to metabolic networks. These include: (i) optimization of an objective function for the wild-type strain, with gene/catalyst/reaction knock-out/knock-down analysis, using flux balance analysis; (ii) flux variability analysis; (iii) chemical species participation; (iv) cycle and extreme path identification; and (v) choke-point reaction analysis to facilitate identification of potential drug targets. The platform is built using custom scripts along with the open-source Galaxy workflow and Systems Biology Research Tool as components. Pre-defined workflows are available for common processes, and an exhaustive list of over 50 functions is provided for user-defined workflows. MetaNET, available at http://metanet.osdd.net, provides a user-friendly rich interface allowing the analysis of genome-scale metabolic networks under various genetic and environmental conditions. The framework permits the storage of previous results, the ability to repeat analyses and share results with other users over the internet, and the ability to run different tools simultaneously using pre-defined or user-created custom workflows.
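
    Flux balance analysis, listed first among the functions above, reduces to a linear program: maximize an objective flux c.v subject to the steady-state constraint S v = 0 and flux bounds. The sketch below is a generic toy example using scipy rather than MetaNET itself; the network, bounds, and objective are hypothetical. A knock-out can be simulated by pinning a reaction's bounds to (0, 0).

        import numpy as np
        from scipy.optimize import linprog

        # Toy network A_ext -> A -> B -> biomass; rows = metabolites (A, B),
        # columns = reactions (v1 uptake, v2 conversion, v3 biomass).
        S = np.array([
            [1, -1,  0],    # A: produced by v1, consumed by v2
            [0,  1, -1],    # B: produced by v2, consumed by v3
        ])
        bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds for v1..v3
        c = np.array([0, 0, 1])                # objective: maximize biomass flux v3

        # linprog minimizes, so negate the objective; S v = 0 enforces steady state.
        res = linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds,
                      method="highs")
        print("optimal biomass flux:", -res.fun, "flux vector:", res.x)
        # Setting bounds[1] = (0, 0) would model knocking out reaction v2.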

  9. Architecture-led Requirements and Safety Analysis of an Aircraft Survivability Situational Awareness System

    DTIC Science & Technology

    2015-05-01

    Prioritization of the utility tree leaves, driven by mission goals, helps the user ensure that critical requirements for quality attributes are well specified.

  10. Space station needs, attributes and architectural options. Volume 3, task 1: Mission requirements

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The mission requirements of the space station program are investigated. Mission parameters are divided into user support from private industry, scientific experimentation, U.S. national security, and space operations away from the space station. These categories define the design and use of the space station. An analysis of cost estimates is included.

  11. Reasons For Physicians Not Adopting Clinical Decision Support Systems: Critical Analysis.

    PubMed

    Khairat, Saif; Marc, David; Crosby, William; Al Sanousi, Ali

    2018-04-18

    Clinical decision support systems (CDSSs) are an integral component of today's health information technologies. They assist with interpretation, diagnosis, and treatment. A CDSS can be embedded throughout the patient safety continuum providing reminders, recommendations, and alerts to health care providers. Although CDSSs have been shown to reduce medical errors and improve patient outcomes, they have fallen short of their full potential. User acceptance has been identified as one of the potential reasons for this shortfall. The purpose of this paper was to conduct a critical review and task analysis of CDSS research and to develop a new framework for CDSS design in order to achieve user acceptance. A critical review of CDSS papers was conducted with a focus on user acceptance. To gain a greater understanding of the problems associated with CDSS acceptance, we conducted a task analysis to identify and describe the goals, user input, system output, knowledge requirements, and constraints from two different perspectives: the machine (ie, the CDSS engine) and the user (ie, the physician). Favorability of CDSSs was based on user acceptance of clinical guidelines, reminders, alerts, and diagnostic suggestions. We propose two models: (1) the user acceptance and system adaptation design model, which includes optimizing CDSS design based on user needs/expectations, and (2) the input-process-output-engage model, which reveals to users the processes that govern CDSS outputs. This research demonstrates that the incorporation of the proposed models will improve user acceptance to support the beneficial effects of CDSS adoption. Ultimately, if a user does not accept technology, this not only poses a threat to the use of the technology but can also pose a threat to the health and well-being of patients. © Saif Khairat, David Marc, William Crosby, Ali Al Sanousi. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 18.04.2018.

  12. Requirements UML Tool (RUT) Expanded for Extreme Programming (CI02)

    NASA Technical Reports Server (NTRS)

    McCoy, James R.

    2003-01-01

    RUT provides a procedure for capturing and managing system requirements that incorporates XP user stories. Because costs associated with identifying problems in requirements increase dramatically over the lifecycle of a project, a method for identifying sources of software risks in user stories is urgently needed. This initiative aims to determine a set of guidelines for user stories that will result in high-quality requirements. To further this initiative, a tool is needed to analyze user stories that can assess the quality of individual user stories, detect sources of software risks, produce software metrics, and identify areas in user stories that can be improved.

  13. Task-specific usability requirements of electronic medical records systems: Lessons learned from a national survey of end-users.

    PubMed

    Farzandipour, Mehrdad; Meidani, Zahra; Riazi, Hossein; Sadeqi Jabali, Monireh

    2018-09-01

    There are various approaches to evaluating the usability of electronic medical record (EMR) systems. User perspectives are an integral part of evaluation. Usability evaluations efficiently and effectively contribute to user-centered design, support tasks, and increase user satisfaction. This study determined the main usability requirements for EMRs by means of an end-user survey. A mixed-method strategy was conducted in three phases. A qualitative approach was employed to collect and formulate EMR usability requirements using the focus group method and the modified Delphi technique. The classic Delphi technique was used to evaluate the proposed requirements among 380 end-users in Iran. The final list of EMR usability requirements was verified and included 163 requirements divided into nine groups. The highest rates of end-user agreement relate to EMR visual clarity (3.65 ± 0.61), fault tolerance (3.58 ± 0.56), and suitability for learning (3.55 ± 0.54). The lowest end-user agreement was for auditory presentation (3.18 ± 0.69). The highest and lowest agreement among end-users was for visual clarity and auditory presentation by EMRs, respectively. This suggests that user priorities in determination of EMR usability, and their understanding of the importance of the types of individual tasks and context characteristics, differ.

  14. Evaluation of the MEDLARS Demand Search Service.

    ERIC Educational Resources Information Center

    Lancaster, F.W.

    A detailed analysis was made by the National Library of Medicine of the performance of the Medical Literature Analysis and Retrieval System (MEDLARS) in relation to 300 actual "demand search" requests made to the system in 1966 and 1967. The objectives of the study were: (1) to study the demand search requirements of MEDLARS users, (2) to…

  15. iCanPlot: Visual Exploration of High-Throughput Omics Data Using Interactive Canvas Plotting

    PubMed Central

    Sinha, Amit U.; Armstrong, Scott A.

    2012-01-01

    Increasing use of high throughput genomic scale assays requires effective visualization and analysis techniques to facilitate data interpretation. Moreover, existing tools often require programming skills, which discourages bench scientists from examining their own data. We have created iCanPlot, a compelling platform for visual data exploration based on the latest technologies. Using the recently adopted HTML5 Canvas element, we have developed a highly interactive tool to visualize tabular data and identify interesting patterns in an intuitive fashion without the need of any specialized computing skills. A module for geneset overlap analysis has been implemented on the Google App Engine platform: when the user selects a region of interest in the plot, the genes in the region are analyzed on the fly. The visualization and analysis are amalgamated for a seamless experience. Further, users can easily upload their data for analysis—which also makes it simple to share the analysis with collaborators. We illustrate the power of iCanPlot by showing an example of how it can be used to interpret histone modifications in the context of gene expression. PMID:22393367
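
    The on-the-fly gene-set overlap analysis described is typically a hypergeometric test: given the genes selected in a plot region, how surprising is the number that fall in an annotated set? A minimal sketch with hypothetical counts (not iCanPlot's server-side code):

        from scipy.stats import hypergeom

        # M genes in the background, n of them in the annotated set,
        # N genes selected in the plot region, k of the selected in the set.
        M, n, N, k = 20000, 350, 120, 9

        # Probability of an overlap of at least k under random selection.
        p_value = hypergeom.sf(k - 1, M, n, N)
        print(f"overlap p-value: {p_value:.3g}")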

  16. Maser: one-stop platform for NGS big data from analysis to visualization

    PubMed Central

    Kinjo, Sonoko; Monma, Norikazu; Misu, Sadahiko; Kitamura, Norikazu; Imoto, Junichi; Yoshitake, Kazutoshi; Gojobori, Takashi; Ikeo, Kazuho

    2018-01-01

    A major challenge in analyzing the data from high-throughput next-generation sequencing (NGS) is how to handle the huge amounts of data and variety of NGS tools and visualize the resultant outputs. To address these issues, we developed a cloud-based data analysis platform, Maser (Management and Analysis System for Enormous Reads), and an original genome browser, Genome Explorer (GE). Maser enables users to manage up to 2 terabytes of data to conduct analyses with easy graphical user interface operations and offers analysis pipelines in which several individual tools are combined as a single pipeline for very common and standard analyses. GE automatically visualizes genome assembly and mapping results output from Maser pipelines, without requiring additional data upload. With this function, the Maser pipelines can graphically display the results output from all the embedded tools and mapping results in a web browser. Maser therefore provides a more user-friendly analysis platform, especially for beginners, by improving the graphical display and offering selected standard pipelines that work with the built-in genome browser. In addition, all the analyses executed on Maser are recorded in the analysis history, helping users to trace and repeat analyses. The entire process of analysis and its history can be shared with collaborators or opened to the public. In conclusion, our system is useful for managing, analyzing, and visualizing NGS data and achieves traceability, reproducibility, and transparency of NGS analysis. Database URL: http://cell-innovation.nig.ac.jp/maser/ PMID:29688385

  17. Database for Safety-Oriented Tracking of Chemicals

    NASA Technical Reports Server (NTRS)

    Stump, Jacob; Carr, Sandra; Plumlee, Debrah; Slater, Andy; Samson, Thomas M.; Holowaty, Toby L.; Skeete, Darren; Haenz, Mary Alice; Hershman, Scot; Raviprakash, Pushpa

    2010-01-01

    SafetyChem is a computer program that maintains a relational database for tracking chemicals and associated hazards at Johnson Space Center (JSC) by use of a Web-based graphical user interface. The SafetyChem database is accessible to authorized users via a JSC intranet. All new chemicals pass through a safety office, where information on hazards, required personal protective equipment (PPE), fire-protection warnings, and target organ effects (TOEs) is extracted from material safety data sheets (MSDSs) and recorded in the database. The database facilitates real-time management of inventory with attention to such issues as stability, shelf life, reduction of waste through transfer of unused chemicals to laboratories that need them, quantification of chemical wastes, and identification of chemicals for which disposal is required. Upon searching the database for a chemical, the user receives information on physical properties of the chemical, hazard warnings, required PPE, a link to the MSDS, and references to the applicable International Standards Organization (ISO) 9000 standard work instructions and the applicable job hazard analysis. Also, to reduce the labor hours needed to comply with reporting requirements of the Occupational Safety and Health Administration, the data can be directly exported into the JSC hazardous-materials database.
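
    The relational pattern described, chemical records joined to hazard, PPE, and MSDS information for a single lookup, can be sketched with sqlite standing in for the actual JSC database; every table and column name here is hypothetical.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE chemical (
            id INTEGER PRIMARY KEY,
            name TEXT NOT NULL,
            msds_url TEXT,
            shelf_life_months INTEGER
        );
        CREATE TABLE hazard (
            chemical_id INTEGER REFERENCES chemical(id),
            hazard_warning TEXT,
            required_ppe TEXT,
            target_organ TEXT
        );
        """)
        con.execute("INSERT INTO chemical VALUES (1, 'Acetone', 'https://example.org/msds/acetone', 24)")
        con.execute("INSERT INTO hazard VALUES (1, 'Highly flammable', 'Nitrile gloves, goggles', 'CNS')")

        # A lookup like the one described: hazard data joined to the chemical record.
        query = """SELECT c.name, h.hazard_warning, h.required_ppe, c.msds_url
                   FROM chemical c JOIN hazard h ON h.chemical_id = c.id
                   WHERE c.name = ?"""
        for row in con.execute(query, ("Acetone",)):
            print(row)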

  18. Joint Program on Rapid Prototyping. RaPIER (Rapid Prototyping to Investigate End-User Requirements).

    DTIC Science & Technology

    1985-03-28

    can be found in [PATCH83]. In this section, we will discuss three systems which represent the state of the technology. A. The DRACO System. The DRACO System [NEIGHBORS80] provides a programming environment in which the design and analysis of programs are reused. DRACO provides mechanisms for... automatic in the sense that the user can make individual implementation choices (called refinements in DRACO) or even insert new tactics into the system

  19. Functional Requirements of a Target Description System for Vulnerability Analysis

    DTIC Science & Technology

    1979-11-01

    called GIFT [1,2]. Together the COMGEOM description model and GIFT codes make up the BRL's target description system. The significance of a target... and modifying target descriptions are described. [1] Lawrence W. Bain, Jr. and Mathew J. Reisinger, "The GIFT Code User Manual; Volume I..." [2] "The GIFT Code User Manual; Volume II, The Output Options," unpublished draft of BRL report. II. UNDERLYING PHILOSOPHY. The BRL has a computer

  20. Comparison of requirements and capabilities of major multipurpose software packages.

    PubMed

    Igo, Robert P; Schnell, Audrey H

    2012-01-01

    The aim of this chapter is to introduce the reader to commonly used software packages and illustrate their input requirements, analysis options, strengths, and limitations. We focus on packages that perform more than one function and include a program for quality control, linkage, and association analyses. Additional inclusion criteria were (1) programs that are free to academic users and (2) programs that are currently supported, maintained, and developed. Using those criteria, we chose to review three programs: Statistical Analysis for Genetic Epidemiology (S.A.G.E.), PLINK, and Merlin. We will describe the required input format and analysis options. We will not go into detail about every possible program in the packages, but we will give an overview of the packages' requirements and capabilities.

  1. The Collaborative Search by Tag-Based User Profile in Social Media

    PubMed Central

    Li, Xiaodong; Li, Qing

    2014-01-01

    Recently, we have witnessed the popularity and proliferation of social media applications (e.g., Delicious, Flickr, and YouTube) in the web 2.0 era. The rapid growth of user-generated data results in the problem of information overload to users. Facing such a tremendous volume of data, it is a big challenge to assist the users to find their desired data. To attack this critical problem, we propose the collaborative search approach in this paper. The core idea is that similar users may have common interests so as to help users to find their demanded data. Similar research has been conducted on the user log analysis in web search. However, the rapid growth and change of user-generated data in social media require us to discover a brand-new approach to address the unsolved issues (e.g., how to profile users, how to measure the similar users, and how to depict user-generated resources) rather than adopting existing methods from web search. Therefore, we investigate various metrics to identify the similar users (user community). Moreover, we conduct the experiment on two real-life data sets by comparing the Collaborative method with the latest baselines. The empirical results show the effectiveness of the proposed approach and validate our observations. PMID:25692176
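
    One straightforward realization of a tag-based user profile, consistent with the approach described, is to represent each user as a tag-frequency vector and rank candidate community members by cosine similarity. The sketch below uses hypothetical profiles and is not the paper's exact metric.

        import numpy as np

        # Rows = users, columns = tag frequencies for ("photo", "travel", "python", "music").
        profiles = np.array([
            [10, 4, 0, 1],   # user A
            [ 8, 5, 1, 0],   # user B
            [ 0, 1, 9, 3],   # user C
        ], dtype=float)

        def cosine_sim(u, v):
            return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

        # Rank the other users by similarity to user A (index 0).
        sims = [(i, cosine_sim(profiles[0], p)) for i, p in enumerate(profiles[1:], start=1)]
        for i, s in sorted(sims, key=lambda t: -t[1]):
            print(f"user {i}: similarity {s:.2f}")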

  2. ISS Mini AERCam Radio Frequency (RF) Coverage Analysis Using iCAT Development Tool

    NASA Technical Reports Server (NTRS)

    Bolen, Steve; Vazquez, Luis; Sham, Catherine; Fredrickson, Steven; Fink, Patrick; Cox, Jan; Phan, Chau; Panneton, Robert

    2003-01-01

    The long-term goals of the National Aeronautics and Space Administration's (NASA's) Human Exploration and Development of Space (HEDS) enterprise may require the development of autonomous free-flier (FF) robotic devices to operate within the vicinity of low-Earth orbiting spacecraft to supplement human extravehicular activities (EVAs) in space. Future missions could require external visual inspection of the spacecraft that would be difficult, or dangerous, for humans to perform. Under some circumstances, it may be necessary to employ an un-tethered communications link between the FF and the users. The interactive coverage analysis tool (iCAT) is a software tool that has been developed to perform critical analysis of the communications link performance for a FF operating in the vicinity of the International Space Station (ISS) external environment. The tool allows users to interactively change multiple parameters of the communications link to efficiently perform systems engineering trades on network performance. These trades can be directly translated into design and requirements specifications. This tool significantly reduces the development time in determining a communications network topology by allowing multiple parameters to be changed, and the results of link coverage to be statistically characterized and plotted interactively.
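
    The calculation such a coverage tool evaluates repeatedly for each geometry is, at its core, a link budget: free-space path loss subtracted from transmit power and antenna gains, compared against receiver sensitivity. A minimal sketch with hypothetical link parameters (this is not iCAT itself):

        import math

        def fspl_db(distance_km, freq_ghz):
            """Free-space path loss in dB, with distance in km and frequency in GHz."""
            return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

        def link_margin_db(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                           distance_km, freq_ghz, rx_sensitivity_dbm, misc_losses_db=3.0):
            received = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                        - fspl_db(distance_km, freq_ghz) - misc_losses_db)
            return received - rx_sensitivity_dbm

        # Hypothetical short-range S-band link: 20 dBm, 200 m separation, 2.4 GHz.
        print(f"link margin: {link_margin_db(20, 3, 6, 0.2, 2.4, -90):.1f} dB")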

  3. Interactive access to LP DAAC satellite data archives through a combination of open-source and custom middleware web services

    USGS Publications Warehouse

    Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.

    2015-01-01

    Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to computer systems of data providers, and away from systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions instead of links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and sub-setting. Only the reduced content required to complete an analysis is then transferred to the user.

  4. User Needs, Benefits, and Integration of Robotic Systems in a Space Station Laboratory

    NASA Technical Reports Server (NTRS)

    Dodd, W. R.; Badgley, M. B.; Konkel, C. R.

    1989-01-01

    The methodology, results and conclusions of all tasks of the User Needs, Benefits, and Integration Study (UNBIS) of Robotic Systems in a Space Station Laboratory are summarized. Study goals included the determination of user requirements for robotics within the Space Station, United States Laboratory. In Task 1, three experiments were selected to determine user needs and to allow detailed investigation of microgravity requirements. In Task 2, a NASTRAN analysis of Space Station response to robotic disturbances, and acceleration measurement of a standard industrial robot (Intelledex Model 660) resulted in selection of two ranges of microgravity manipulation: Level 1 (10^-3 to 10^-5 G at greater than 1 Hz) and Level 2 (less than or equal to 10^-6 G at 0.1 Hz). This task included an evaluation of microstepping methods for controlling stepper motors and concluded that an industrial robot actuator can perform milli-G motion without modification. Relative merits of end-effectors and manipulators were studied in Task 3 in order to determine their ability to perform a range of tasks related to the three microgravity experiments. An Effectivity Rating was established for evaluating these robotic system capabilities. Preliminary interface requirements for an orbital flight demonstration were determined in Task 4. Task 5 assessed the impact of robotics.

  5. Information for the user in design of intelligent systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra L.

    1993-01-01

    Recommendations are made for improving intelligent system reliability and usability based on the use of information requirements in system development. Information requirements define the task-relevant messages exchanged between the intelligent system and the user by means of the user interface medium. Thus, these requirements affect the design of both the intelligent system and its user interface. Many difficulties that users have in interacting with intelligent systems are caused by information problems. These information problems result from the following: (1) not providing the right information to support domain tasks; and (2) not recognizing that using an intelligent system introduces new user supervisory tasks that require new types of information. These problems are especially prevalent in intelligent systems used for real-time space operations, where data problems and unexpected situations are common. Information problems can be solved by deriving information requirements from a description of user tasks. Using information requirements embeds human-computer interaction design into intelligent system prototyping, resulting in intelligent systems that are more robust and easier to use.

  6. STS users study (study 2.2). Volume 2: STS users plan (user data requirements) study

    NASA Technical Reports Server (NTRS)

    Pritchard, E. I.

    1975-01-01

    Pre-flight scheduling and pre-flight requirements of the space transportation system are discussed. Payload safety requirements, shuttle flight manifests, and interface specifications are studied in detail.

  7. Shiny-phyloseq: Web application for interactive microbiome analysis with provenance tracking.

    PubMed

    McMurdie, Paul J; Holmes, Susan

    2015-01-15

    We have created a Shiny-based Web application, called Shiny-phyloseq, for dynamic interaction with microbiome data that runs on any modern Web browser and requires no programming, increasing the accessibility and decreasing the entrance requirement to using phyloseq and related R tools. Along with a data- and context-aware dynamic interface for exploring the effects of parameter and method choices, Shiny-phyloseq also records the complete user input and subsequent graphical results of a user's session, allowing the user to archive, share and reproduce the sequence of steps that created their result, without writing any new code themselves. Shiny-phyloseq is implemented entirely in the R language. It can be hosted/launched by any system with R installed, including Windows, Mac OS and most Linux distributions. Information technology administrators can also host Shiny-phyloseq from a remote server, in which case users need only have a Web browser installed. Shiny-phyloseq is provided free of charge under a GPL-3 open-source license through GitHub at http://joey711.github.io/shiny-phyloseq/. © The Author 2014. Published by Oxford University Press.

  8. THREAT ENSEMBLE VULNERABILITY ASSESSMENT ...

    EPA Pesticide Factsheets

    TEVA-SPOT (software and manual) is used by water utilities to optimize the number and location of contamination detection sensors so that economic and/or public health consequences are minimized. TEVA-SPOT is interactive, allowing a user to specify the minimization objective (e.g., the number of people exposed, the time to detection, or the extent of pipe length contaminated). It also allows a user to specify constraints. For example, a TEVA-SPOT user can employ expert knowledge during the design process by identifying either existing or unfeasible sensor locations. Installation and maintenance costs for sensor placement can also be factored into the analysis. Python and Java are required to run TEVA-SPOT.
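
    Sensor placement of this kind is commonly attacked greedily: given a matrix of consequences per contamination scenario and candidate sensor location, repeatedly add the sensor that most reduces the mean impact. The sketch below is a toy stand-in for TEVA-SPOT's optimizer, with hypothetical impact values.

        import numpy as np

        # impact[s, j]: consequence of scenario s if the first detection occurs at
        # candidate location j (hypothetical numbers).
        impact = np.array([
            [5.0, 2.0, 9.0],
            [1.0, 8.0, 3.0],
            [7.0, 7.0, 2.0],
        ])
        undetected_penalty = 10.0   # consequence when no chosen sensor detects the event

        def mean_impact(chosen):
            if not chosen:
                return undetected_penalty
            per_scenario = impact[:, chosen].min(axis=1)   # best sensor per scenario
            return np.minimum(per_scenario, undetected_penalty).mean()

        chosen, budget = [], 2
        while len(chosen) < budget:   # greedy selection under a sensor budget
            best = min((j for j in range(impact.shape[1]) if j not in chosen),
                       key=lambda j: mean_impact(chosen + [j]))
            chosen.append(best)
        print("sensor locations:", chosen, "mean impact:", mean_impact(chosen))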

  9. Power mobility with collision avoidance for older adults: user, caregiver, and prescriber perspectives.

    PubMed

    Wang, Rosalie H; Korotchenko, Alexandra; Hurd Clarke, Laura; Mortenson, W Ben; Mihailidis, Alex

    2013-01-01

    Collision avoidance technology has the capacity to facilitate safer mobility among older power mobility users with physical, sensory, and cognitive impairments, thus enabling independence for more users. Little is known about consumers' perceptions of collision avoidance. This article draws on interviews (29 users, 5 caregivers, and 10 prescribers) to examine views on design and utilization of this technology. Data analysis identified three themes: "useful situations or contexts," "technology design issues and real-life application," and "appropriateness of collision avoidance technology for a variety of users." Findings support ongoing development of collision avoidance for older adult users. The majority of participants supported the technology and felt that it might benefit current users and users with visual impairments, but might be unsuitable for people with significant cognitive impairments. Some participants voiced concerns regarding the risk for injury with power mobility use and some identified situations where collision avoidance might be beneficial (driving backward, avoiding dynamic obstacles, negotiating outdoor barriers, and learning power mobility use). Design issues include the need for context awareness, reliability, and user interface specifications. User desire to maintain driving autonomy supports development of collaboratively controlled systems. This research lays the groundwork for future development by illustrating consumer requirements for this technology.

  10. MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data

    PubMed Central

    2014-01-01

    Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time-consuming and complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and Drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and Drosophila. The entire set of analyses for 4 datasets (34 total microarrays) finished in about one hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103

  11. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
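
    Of the three techniques, the frequency ratio is the simplest to state: for each class of a conditioning factor, FR is the share of hazard occurrences falling in the class divided by the share of total area occupied by the class. A minimal sketch with hypothetical pixel counts (the tool automates this per factor and map):

        import numpy as np

        # Hypothetical counts for one conditioning factor (e.g. slope classes).
        class_pixels  = np.array([40000, 30000, 20000, 10000])  # pixels per class
        hazard_pixels = np.array([  200,   350,   300,   150])  # hazard pixels per class

        # FR = (% of hazard pixels in class) / (% of all pixels in class).
        fr = (hazard_pixels / hazard_pixels.sum()) / (class_pixels / class_pixels.sum())
        for k, ratio in enumerate(fr):
            print(f"class {k}: FR = {ratio:.2f}")   # FR > 1 suggests positive association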

  12. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  13. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    PubMed

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
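
    Basic usage follows the package's documented pattern: construct a Fit (which estimates xmin and the exponent alpha), then compare candidate distributions by loglikelihood ratio. The data below are synthetic; real observations would replace them.

        import numpy as np
        import powerlaw

        # Synthetic heavy-tailed sample (shifted Pareto, tail exponent about 2.5).
        data = np.random.pareto(1.5, 10000) + 1.0

        fit = powerlaw.Fit(data)          # estimates xmin and alpha from the data
        print("alpha =", fit.power_law.alpha, "xmin =", fit.power_law.xmin)

        # R > 0 favors the power law over the alternative; p gauges significance.
        R, p = fit.distribution_compare("power_law", "lognormal")
        print("R =", R, "p =", p)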

  14. A Health Insurance Portability and Accountability Act–Compliant Ocular Telehealth Network for the Remote Diagnosis and Management of Diabetic Retinopathy

    PubMed Central

    Li, Yaqin; Karnowski, Thomas P.; Tobin, Kenneth W.; Giancardo, Luca; Morris, Scott; Sparrow, Sylvia E.; Garg, Seema; Fox, Karen

    2011-01-01

    Abstract In this article, we present the design and implementation of a regional ocular telehealth network for remote assessment and management of diabetic retinopathy (DR), including the design requirements, network topology, protocol design, system work flow, graphics user interfaces, and performance evaluation. The Telemedical Retinal Image Analysis and Diagnosis Network is a computer-aided, image analysis telehealth paradigm for the diagnosis of DR and other retinal diseases using fundus images acquired from primary care end users delivering care to underserved patient populations in the mid-South and southeastern United States. PMID:21819244

  15. User interface support

    NASA Technical Reports Server (NTRS)

    Lewis, Clayton; Wilde, Nick

    1989-01-01

    Space construction will require heavy investment in the development of a wide variety of user interfaces for the computer-based tools that will be involved at every stage of construction operations. Using today's technology, user interface development is very expensive for two reasons: (1) specialized and scarce programming skills are required to implement the necessary graphical representations and complex control regimes for high-quality interfaces; (2) iteration on prototypes is required to meet user and task requirements, since these are difficult to anticipate with current (and foreseeable) design knowledge. We are attacking this problem by building a user interface development tool based on extensions to the spreadsheet model of computation. The tool provides high-level support for graphical user interfaces and permits dynamic modification of interfaces, without requiring conventional programming concepts and skills.

  16. IMCS reflight certification requirements and design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The requirements for reflight certification are established. Software requirements encompass the software programs that are resident in the PCC, DEP, PDSS, EC, or any related GSE. A design approach for the reflight software packages is recommended. These designs will be of sufficient detail to permit the implementation of reflight software. The PDSS/IMCS Reflight Certification system provides the tools and mechanisms for the user to perform the reflight certification test procedures, test data capture, test data display, and test data analysis. The system as defined will be structured to permit maximum automation of reflight certification procedures and test data analysis.

  17. gRINN: a tool for calculation of residue interaction energies and protein energy network analysis of molecular dynamics simulations.

    PubMed

    Serçinoglu, Onur; Ozbek, Pemra

    2018-05-25

    Atomistic molecular dynamics (MD) simulations generate a wealth of information related to the dynamics of proteins. If properly analyzed, this information can lead to new insights regarding protein function and assist wet-lab experiments. Aiming to identify interactions between individual amino acid residues and the role played by each in the context of MD simulations, we present a stand-alone software called gRINN (get Residue Interaction eNergies and Networks). gRINN features graphical user interfaces (GUIs) and a command-line interface for generating and analyzing pairwise residue interaction energies and energy correlations from protein MD simulation trajectories. gRINN utilizes the features of NAMD or GROMACS MD simulation packages and automates the steps necessary to extract residue-residue interaction energies from user-supplied simulation trajectories, greatly simplifying the analysis for the end-user. A GUI, including an embedded molecular viewer, is provided for visualization of interaction energy time-series, distributions, an interaction energy matrix, interaction energy correlations and a residue correlation matrix. gRINN additionally offers construction and analysis of Protein Energy Networks, providing residue-based metrics such as degrees, betweenness centralities, and closeness centralities, as well as shortest path analysis. gRINN is free and open to all users without login requirement at http://grinn.readthedocs.io.
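
    The Protein Energy Network step can be pictured as follows: residues become nodes, favorable interaction energies become weighted edges, and standard graph metrics are computed on the result. This sketch uses networkx with hypothetical residue pairs and energies; it illustrates the concept rather than gRINN's implementation.

        import networkx as nx

        # Hypothetical residue pairs with favorable interaction energies (kcal/mol).
        pairs = [("ASP32", "ARG45", -5.2), ("ARG45", "GLU60", -3.1),
                 ("GLU60", "LYS12", -2.4), ("ASP32", "LYS12", -1.0)]

        G = nx.Graph()
        for r1, r2, energy in pairs:
            # Stronger interaction = shorter "distance" for path-based metrics.
            G.add_edge(r1, r2, weight=abs(energy), distance=1.0 / abs(energy))

        print("degrees:", dict(G.degree()))
        print("betweenness:", nx.betweenness_centrality(G, weight="distance"))
        print("ASP32 -> GLU60 path:",
              nx.shortest_path(G, "ASP32", "GLU60", weight="distance"))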

  18. The Gap Procedure: for the identification of phylogenetic clusters in HIV-1 sequence data.

    PubMed

    Vrbik, Irene; Stephens, David A; Roger, Michel; Brenner, Bluma G

    2015-11-04

    In the context of infectious disease, sequence clustering can be used to provide important insights into the dynamics of transmission. Cluster analysis is usually performed using a phylogenetic approach whereby clusters are assigned on the basis of sufficiently small genetic distances and high bootstrap support (or posterior probabilities). The computational burden involved in this phylogenetic threshold approach is a major drawback, especially when a large number of sequences are being considered. In addition, this method requires a skilled user to specify the appropriate threshold values which may vary widely depending on the application. This paper presents the Gap Procedure, a distance-based clustering algorithm for the classification of DNA sequences sampled from individuals infected with the human immunodeficiency virus type 1 (HIV-1). Our heuristic algorithm bypasses the need for phylogenetic reconstruction, thereby supporting the quick analysis of large genetic data sets. Moreover, this fully automated procedure relies on data-driven gaps in sorted pairwise distances to infer clusters, thus no user-specified threshold values are required. The clustering results obtained by the Gap Procedure on both real and simulated data, closely agree with those found using the threshold approach, while only requiring a fraction of the time to complete the analysis. Apart from the dramatic gains in computational time, the Gap Procedure is highly effective in finding distinct groups of genetically similar sequences and obviates the need for subjective user-specified values. The clusters of genetically similar sequences returned by this procedure can be used to detect patterns in HIV-1 transmission and thereby aid in the prevention, treatment and containment of the disease.
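
    A simplified version of the core idea, cutting a single-linkage clustering at the largest data-driven gap in the sorted pairwise distances, fits in a few lines; the published procedure is more elaborate, and the synthetic feature vectors below merely stand in for HIV-1 sequence distances.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        # Two synthetic groups standing in for clusters of similar sequences.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 0.05, (10, 4)), rng.normal(1, 0.05, (12, 4))])

        d = np.sort(pdist(X))                  # all pairwise distances, ascending
        threshold = d[np.argmax(np.diff(d))]   # cut just below the largest gap

        labels = fcluster(linkage(X, method="single"), t=threshold,
                          criterion="distance")
        print("clusters found:", len(set(labels)))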

  19. Screening_mgmt: a Python module for managing screening data.

    PubMed

    Helfenstein, Andreas; Tammela, Päivi

    2015-02-01

    High-throughput screening is an established technique in drug discovery and, as such, has also found its way into academia. High-throughput screening generates a considerable amount of data, which is why specific software is used for its analysis and management. The commercially available software packages are often beyond the financial limits of small-scale academic laboratories and, furthermore, lack the flexibility to fulfill certain user-specific requirements. We have developed a Python module, screening_mgmt, which is a lightweight tool for flexible data retrieval, analysis, and storage for different screening assays in one central database. The module reads custom-made analysis scripts and plotting instructions, and it offers a graphical user interface to import, modify, and display the data in a uniform manner. During the test phase, we used this module for the management of 10,000 data points of various origins. It has provided a practical, user-friendly tool for sharing and exchanging information between researchers. © 2014 Society for Laboratory Automation and Screening.

  20. PeptideDepot: flexible relational database for visual analysis of quantitative proteomic data and integration of existing protein information.

    PubMed

    Yu, Kebing; Salomon, Arthur R

    2009-12-01

    Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through MS/MS. Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to various experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our high throughput autonomous proteomic pipeline used in the automated acquisition and post-acquisition analysis of proteomic data.

  1. IAC - INTEGRATED ANALYSIS CAPABILITY

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. With the goal of supporting the unique needs of engineering analysis groups concerned with interdisciplinary problems, IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and database to yield a highly efficient multidisciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a database, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automatic data transfer among analysis programs. IAC 2.5, designed to be compatible as far as possible with Level 1.5, contains a major upgrade in executive and database management system capabilities, and includes interfaces to enable thermal, structures, optics, and control interaction dynamics analysis.

    The IAC system architecture is modular in design:
    1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules.
    2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses.
    3) Graphics and model generation interfaces are supplied for building and viewing models. Advanced graphics capabilities are provided within particular analysis modules such as INCA and NASTRAN.
    4) Interface modules provide for the required data flow between IAC and other modules.
    5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC.
    6) Special purpose modules are included, such as MIMIC (Model Integration via Mesh Interpolation Coefficients), which transforms field values from one model to another (see the interpolation sketch at the end of this record); LINK, which simplifies incorporation of user-specific modules into IAC modules; and DATAPAC, the National Bureau of Standards statistical analysis package.

    The IAC database contains structured files which provide a common basis for communication between modules and the executive system, and can contain unstructured files such as NASTRAN checkpoint files, DISCOS plot files, object code, etc. The user can define groups of data and relations between them. A full data manipulation and query system operates with the database.

    The current interface modules comprise five groups:
    1) Structural analysis - IAC contains a NASTRAN interface for standalone analysis or certain structural/control/thermal combinations. IAC provides enhanced structural capabilities for normal modes and static deformation analysis via special DMAP sequences. IAC 2.5 contains several specialized interfaces from NASTRAN in support of multidisciplinary analysis.
    2) Thermal analysis - IAC supports finite element and finite difference techniques for steady state or transient analysis. There are interfaces for the NASTRAN thermal analyzer, SINDA/SINFLO, and TRASYS II. FEMNET, which converts finite element structural analysis models to finite difference thermal analysis models, is also interfaced with the IAC database.
    3) System dynamics - The DISCOS simulation program, which allows for either nonlinear time domain analysis or linear frequency domain analysis, is fully interfaced to the IAC database management capability.
    4) Control analysis - Interfaces for the ORACLS, SAMSAN, NBOD2, and INCA programs allow a wide range of control system analyses and synthesis techniques. Level 2.5 includes EIGEN, which provides tools for large-order system eigenanalysis, and BOPACE, which allows for geometric capabilities and finite element analysis with nonlinear material. Also included in IAC level 2.5 is SAMSAN 3.1, an engineering analysis program which contains a general-purpose library of over 600 subroutines for numerical analysis.
    5) Graphics - The graphics package IPLOT is included in IAC. IPLOT generates vector displays of tabular data in the form of curves, charts, correlation tables, etc. Either DI3000 or PLOT-10 graphics software is required for full graphic capability. In addition to these analysis tools, IAC 2.5 contains an IGES interface which allows the user to read arbitrary IGES files into an IAC database and to edit and output new IGES files.

    IAC is available by license for a period of 10 years to approved U.S. licensees. The licensed program product includes one set of supporting documentation; additional copies may be purchased separately. IAC is written in FORTRAN 77 and has been implemented on a DEC VAX series computer operating under VMS. IAC can be executed by multiple concurrent users in batch or interactive mode. The program is structured to allow users to easily delete those program capabilities and "how to" examples they do not want in order to reduce the size of the package. The basic central memory requirement for IAC is approximately 750KB. The following programs are also available from COSMIC as separate packages: NASTRAN, SINDA/SINFLO, TRASYS II, DISCOS, ORACLS, SAMSAN, NBOD2, and INCA. The development of level 2.5 of IAC was completed in 1989.
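
    Of the special purpose modules above, MIMIC's job, moving field values between dissimilar meshes, is essentially scattered-data interpolation. The fragment below sketches that idea with scipy.interpolate.griddata; the node coordinates and temperatures are invented stand-ins for a thermal-to-structural transfer, not IAC data or code.

        import numpy as np
        from scipy.interpolate import griddata

        # Invented thermal-model node coordinates (x, y) and temperatures.
        thermal_nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
        thermal_temps = np.array([300.0, 320.0, 310.0, 340.0])

        # Structural-model nodes where the temperature field is needed.
        structural_nodes = np.array([[0.5, 0.5], [0.25, 0.75]])

        # Linear interpolation weights play the role of MIMIC's mesh
        # interpolation coefficients.
        temps = griddata(thermal_nodes, thermal_temps, structural_nodes,
                         method="linear")
        print(temps)  # interpolated temperatures at the structural nodes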

  2. Exploring representations and experiences of case-management users: towards difficulties and solutions to leading qualitative interviews with older people with complex living conditions.

    PubMed

    Balard, Frédéric; Corre, Stéphanie Pin Le; Trouvé, Hélène; Saint-Jean, Olivier; Somme, Dominique

    2013-01-01

    By matching needs to resource services, case management could be a useful tool for improving the care of older people with complex living conditions. Collecting and analysing the users' experiences represents a good way to evaluate the effectiveness and efficiency of a case-management service. However, in the literature, fieldwork is very rarely considered and the users included in qualitative research seem to be the most accessible. This study was undertaken to describe the challenges of conducting qualitative research with older people with complex living conditions in order to understand their experiences with case-management services. Reflective analysis was applied to describe the process of recruiting and interviewing older people with complex living conditions in private homes, describing the protocol with respect to fieldwork chronology. The practical difficulties inherent in this type of study are addressed, particularly in terms of defining a sample, the procedure for contacting the users and conducting the interview. The users are people who suffer from a loss of autonomy because of cognitive impairment, severe disease and/or psychiatric or social problems. Notably, most of them refuse care and assistance. Reflective analysis of our protocol showed that the methodology and difficulties encountered constituted the first phase of data analysis. Understanding the experience of users of case management to analyse the outcomes of case-management services requires a clear methodology for the fieldwork.

  3. Economic impacts of federal policy responses to drought in the Rio Grande Basin

    NASA Astrophysics Data System (ADS)

    Ward, Frank A.; Hurd, Brian H.; Rahmani, Tarik; Gollehon, Noel

    2006-03-01

    Significant growth in the Rio Grande Basin's demand for water has stressed the region's scarce water supply. This paper presents an analysis of the impacts of severe and sustained drought and of minimum in-stream flow requirements to support endangered species in the Rio Grande watershed. These impacts are investigated by modeling the physical and institutional constraints within the Rio Grande Basin and by identifying the hydrologic and economic responses of all major water users. Water supplies, which include all major tributaries, interbasin transfers, and hydrologically connected groundwater, are represented in a yearly time step. A nonlinear programming model is developed to maximize economic benefits subject to hydrologic and institutional constraints. Results indicate that drought produces considerable impacts on both agriculture and municipal and industrial (MI) uses in the Rio Grande watershed. In-stream flow requirements to support endangered species' habitat produce the largest impacts on agricultural water users in New Mexico and Texas. Hydrologic and economic impacts are more pronounced when in-stream flow requirements dictate larger quantities of water for endangered species' habitat. Higher in-stream flow requirements for endangered species in central New Mexico cause considerable losses to New Mexico agriculture above Elephant Butte Reservoir and to MI users in Albuquerque, New Mexico. Those same in-stream flow requirements reduce drought damages to New Mexico agriculture below Elephant Butte Reservoir and reduce the severity of drought damages to MI users in El Paso, Texas. Results provide a framework for formulating federal policy responses to drought in the Rio Grande Basin.
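
    The paper's machinery, a nonlinear program maximizing economic benefits subject to hydrologic and institutional constraints, can be caricatured in a few lines. The toy model below uses scipy.optimize with invented quadratic benefit functions for agricultural and M&I diversions, a fixed supply, and a minimum in-stream flow requirement; none of the numbers come from the paper.

        from scipy.optimize import minimize

        SUPPLY = 100.0       # available water (arbitrary units)
        INSTREAM_MIN = 20.0  # minimum in-stream flow for species habitat

        def neg_benefits(x):
            ag, mi = x
            # Invented concave (diminishing-returns) benefit functions.
            return -((8.0 * ag - 0.05 * ag**2) + (12.0 * mi - 0.10 * mi**2))

        cons = [  # diversions plus required in-stream flow within supply
            {"type": "ineq", "fun": lambda x: SUPPLY - INSTREAM_MIN - x[0] - x[1]}]
        res = minimize(neg_benefits, x0=[40.0, 40.0], method="SLSQP",
                       bounds=[(0, None), (0, None)], constraints=cons)
        ag, mi = res.x
        print(f"agriculture: {ag:.1f}  M&I: {mi:.1f}  benefits: {-res.fun:.1f}")

    Raising INSTREAM_MIN in this toy reproduces the qualitative result above: the optimizer shifts the shortage onto the use with the lower marginal benefit.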

  4. High-resolution Single Particle Analysis from Electron Cryo-microscopy Images Using SPHIRE

    PubMed Central

    Moriya, Toshio; Saur, Michael; Stabrin, Markus; Merino, Felipe; Voicu, Horatiu; Huang, Zhong; Penczek, Pawel A.; Raunser, Stefan; Gatsogiannis, Christos

    2017-01-01

    SPHIRE (SPARX for High-Resolution Electron Microscopy) is a novel open-source, user-friendly software suite for the semi-automated processing of single particle electron cryo-microscopy (cryo-EM) data. The protocol presented here describes in detail how to obtain a near-atomic resolution structure starting from cryo-EM micrograph movies by guiding users through all steps of the single particle structure determination pipeline. These steps are controlled from the new SPHIRE graphical user interface and require minimum user intervention. Using this protocol, a 3.5 Å structure of TcdA1, a Tc toxin complex from Photorhabdus luminescens, was derived from only 9500 single particles. This streamlined approach will help novice users without extensive processing experience and a priori structural information to obtain noise-free and unbiased atomic models of their purified macromolecular complexes in their native state. PMID:28570515

  5. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben

    2005-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.
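
    The heart of such a trade study is enumerating candidate designs, screening out those that violate resource limits, and ranking the remainder. A minimal illustration of that loop (the parts list, budgets, and scoring rule are all invented, not STAT's models):

        from itertools import product

        # Invented catalog: (processor, MIPS capacity, power W, relative cost).
        processors = [("GPP", 200, 10, 1.0), ("DSP", 500, 8, 2.0),
                      ("FPGA", 900, 15, 4.0)]
        waveforms = [("waveform_A", 150), ("waveform_B", 400)]  # (name, MIPS needed)
        POWER_BUDGET = 12.0

        candidates = []
        for (proc, mips, power, cost), (wf, need) in product(processors, waveforms):
            if mips >= need and power <= POWER_BUDGET:  # feasibility screen
                candidates.append((cost, wf, proc))

        # Rank feasible designs by cost; incremental refinement would then
        # re-search the neighborhood of the highest-payoff designs in detail.
        for cost, wf, proc in sorted(candidates):
            print(f"{wf} on {proc}: cost {cost}")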

  6. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  7. Analysis of data systems requirements for global crop production forecasting in the 1985 time frame

    NASA Technical Reports Server (NTRS)

    Downs, S. W.; Larsen, P. A.; Gerstner, D. A.

    1978-01-01

    Data system concepts needed to implement global crop production forecasting in an orderly transition from experimental to operational status in the 1985 time frame were examined. Information needs of users were converted into data system requirements, and the influence of these requirements on the formulation of a conceptual data system was analyzed. Potential problem areas in meeting these data system requirements were identified in an iterative process.

  8. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
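
    Two of the five evaluations, closed-loop eigenvalues and the Bode frequency response, are compact enough to reproduce with open tools. A sketch using numpy and scipy.signal on an invented double-integrator plant with state feedback (MATRIXx itself is a commercial package; this is only an analogue):

        import numpy as np
        from scipy import signal

        # Invented plant xdot = Ax + Bu with state feedback u = -Kx.
        A = np.array([[0.0, 1.0], [0.0, 0.0]])
        B = np.array([[0.0], [1.0]])
        K = np.array([[4.0, 2.0]])  # hypothetical gain matrix

        # Closed-loop eigenvalues (one of the evaluations listed above).
        print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))

        # Bode frequency response of the open-loop transfer function 1/s^2.
        w, mag_db, phase_deg = signal.bode(
            signal.TransferFunction([1.0], [1.0, 0.0, 0.0]))
        print("magnitude at lowest frequency (dB):", mag_db[0])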

  9. Natural Resource Information System, design analysis

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The computer-based system stores, processes, and displays map data relating to natural resources. The system was designed on the basis of requirements established in a user survey and an analysis of decision flow. The design analysis effort is described, and the rationale behind major design decisions, including map processing, cell vs. polygon, choice of classification systems, mapping accuracy, system hardware, and software language is summarized.

  10. Crux: Rapid Open Source Protein Tandem Mass Spectrometry Analysis

    PubMed Central

    2015-01-01

    Efficiently and accurately analyzing big protein tandem mass spectrometry data sets requires robust software that incorporates state-of-the-art computational, machine learning, and statistical methods. The Crux mass spectrometry analysis software toolkit (http://cruxtoolkit.sourceforge.net) is an open source project that aims to provide users with a cross-platform suite of analysis tools for interpreting protein mass spectrometry data. PMID:25182276

  11. One-Click Data Analysis Software for Science Operations

    NASA Astrophysics Data System (ADS)

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, DAS is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, there are very limited funds and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: PIA for ISO (1995), SAS for XMM-Newton (1999), HIPE for Herschel (2009) and EXIA for EXOSAT (1983). The following goals have guided the architecture: support for all operations, post-operations and archive/legacy phases; support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon AWS); support for expert users, requiring full capabilities; and provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt gathered in this project.

  12. RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carson, Susan D.; Hunter, Regina L.; Link, Madison D.

    RAMPART(TM), Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART(TM) has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART(TM) has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART(TM) user interface elicits information from the user about the building. The RAMPART(TM) expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART(TM) database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART(TM) algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART(TM) considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.
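
    The structure described, user-entered building facts evaluated by a rule base against hazard data to produce per-category loss scores, can be caricatured briefly. Every rule, weight, and hazard value below is invented for illustration and has no relation to GSA or SNL data.

        # Toy expert-system flavor: each rule inspects building facts plus
        # hazard data and contributes a score to one loss category.
        building = {"stories": 12, "year_built": 1962, "sprinklers": False}
        hazards = {"earthquake_rate": 0.3}  # hypothetical annual rate

        def fire_rule(b, h):
            # Older buildings without sprinklers score higher for fire loss.
            score = 2.0 if not b["sprinklers"] else 0.5
            return score + (0.5 if b["year_built"] < 1980 else 0.0)

        def quake_rule(b, h):
            # Taller buildings in active seismic zones score higher.
            return h["earthquake_rate"] * (1.0 + 0.1 * b["stories"])

        rules = {"fire": fire_rule, "earthquake": quake_rule}
        risk = {category: rule(building, hazards) for category, rule in rules.items()}
        print(risk)  # per-category scores a report generator would render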

  13. Design features that affect the maneuverability of wheelchairs and scooters.

    PubMed

    Koontz, Alicia M; Brindle, Eric D; Kankipati, Padmaja; Feathers, David; Cooper, Rory A

    2010-05-01

    To determine the minimum space required for wheeled mobility device users to perform 4 maneuverability tasks and to investigate the impact of selected design attributes on space. Case series. University laboratory, Veterans Affairs research facility, vocational training center, and a national wheelchair sport event. The sample of convenience included manual wheelchair (MWC; n=109), power wheelchair (PWC; n=100), and scooter users (n=14). A mock environment was constructed to create passageways to form an L-turn, 360° turn in place, and a U-turn with and without a barrier. Passageway openings were increased in 5-cm increments until the user could successfully perform each task without hitting the walls. Structural dimensions of the device and user were collected using an electromechanical probe. Mobility devices were grouped into categories based on design features and compared using 1-way analysis of variance and post hoc pairwise Bonferroni-corrected tests. Minimum passageway widths for the 4 maneuverability tasks. Ultralight MWCs with rear axles posterior to the shoulder had the shortest lengths and required the least amount of space compared with all other types of MWCs (P<.05). Mid-wheel-drive PWCs required the least space for the 360° turn in place compared with front-wheel-drive and rear-wheel-drive PWCs (P<.01) but performed equally as well as front-wheel-drive models on all other turning tasks. PWCs with seat functions required more space to perform the tasks. Between 10% and 100% of users would not be able to maneuver in spaces that meet current Accessibility Guidelines for Buildings and Facilities specifications. This study provides data that can be used to support wheelchair prescription and home modifications and to update standards to improve the accessibility of public areas.
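
    The statistical recipe used here, one-way ANOVA across device categories followed by Bonferroni-corrected pairwise comparisons, looks like this with scipy; the passage-width samples below are fabricated, not the study's data.

        from itertools import combinations
        from scipy import stats

        # Fabricated minimum passageway widths (cm) for three device groups.
        groups = {
            "ultralight_MWC": [132, 128, 135, 130, 129],
            "mid_wheel_PWC": [138, 141, 136, 140, 139],
            "scooter": [155, 160, 158, 152, 157],
        }

        f, p = stats.f_oneway(*groups.values())
        print(f"ANOVA: F={f:.2f}, p={p:.4f}")

        # Post hoc pairwise t-tests; Bonferroni correction multiplies each
        # p-value by the number of comparisons (capped at 1).
        pairs = list(combinations(groups, 2))
        for a, b in pairs:
            t, p = stats.ttest_ind(groups[a], groups[b])
            print(f"{a} vs {b}: corrected p = {min(p * len(pairs), 1.0):.4f}")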

  14. Integrating the results of user research into medical device development: insights from a case study.

    PubMed

    Martin, Jennifer L; Barnett, Julie

    2012-07-19

    It is well established that considering users is an important aspect of medical device development. However, it is also well established that there are numerous barriers to successfully conducting user research and integrating the results into product development. It is not sufficient to simply conduct user research; it must also be effectively integrated into product development. A case study of the development of a new medical imaging device was conducted to examine in detail how users were involved in a medical device development project. Two user research studies were conducted: a requirements elicitation interview study and an early prototype evaluation using contextual inquiry. A descriptive in situ approach was taken to investigate how these studies contributed to the product development process and how the results of this work influenced the development of the technology. Data was collected qualitatively through interviews with the development team, participant observation at development meetings and document analysis. The focus was on investigating the barriers that exist to prevent user data from being integrated into product development. A number of individual, organisational and system barriers were identified that functioned to prevent the results of the user research being fully integrated into development. The user and technological aspects of development were seen as separate work streams during development. The expectations of the developers were that user research would collect requirements for the appearance of the device, rather than challenge its fundamental concept. The manner that the user data was communicated to the development team was not effective in conveying the significance or breadth of the findings. There are a range of informal and formal organisational processes that can affect the uptake of user data during medical device development. Adopting formal decision making processes may assist manufacturers to take a more integrated and reflective approach to development, which should result in improved business decisions and a higher quality end product.

  15. Integrating the results of user research into medical device development: insights from a case study

    PubMed Central

    2012-01-01

    Background: It is well established that considering users is an important aspect of medical device development. However, it is also well established that there are numerous barriers to successfully conducting user research and integrating the results into product development. It is not sufficient to simply conduct user research; it must also be effectively integrated into product development. Methods: A case study of the development of a new medical imaging device was conducted to examine in detail how users were involved in a medical device development project. Two user research studies were conducted: a requirements elicitation interview study and an early prototype evaluation using contextual inquiry. A descriptive in situ approach was taken to investigate how these studies contributed to the product development process and how the results of this work influenced the development of the technology. Data was collected qualitatively through interviews with the development team, participant observation at development meetings and document analysis. The focus was on investigating the barriers that exist to prevent user data from being integrated into product development. Results: A number of individual, organisational and system barriers were identified that functioned to prevent the results of the user research being fully integrated into development. The user and technological aspects of development were seen as separate work streams during development. The expectations of the developers were that user research would collect requirements for the appearance of the device, rather than challenge its fundamental concept. The manner that the user data was communicated to the development team was not effective in conveying the significance or breadth of the findings. Conclusion: There are a range of informal and formal organisational processes that can affect the uptake of user data during medical device development. Adopting formal decision making processes may assist manufacturers to take a more integrated and reflective approach to development, which should result in improved business decisions and a higher quality end product. PMID:22812565

  16. Definition of Tire Properties Required for Landing System Analysis

    NASA Technical Reports Server (NTRS)

    Clark, S. K.; Dodge, R. N.; Luchini, J. R.

    1978-01-01

    The data bank constructed provided two basic advantages for the user of aircraft tire information. First, computerization of the data bank allowed mechanical property data to be stored, corrected, updated, and revised quickly and easily as more reliable tests and measurements were carried out. Second, the format of the book, which can be printed from the computerized data bank, can be easily adjusted to suit the needs of the users without the great expense normally associated with reprinting and editing books set by ordinary typography.

  17. Ada (Trade Name) Foundation Technology. Volume 4. Software Requirements for WIS (WWMCCS (World Wide Military Command and Control System) Information System) Text Processing Prototypes

    DTIC Science & Technology

    1986-12-01

    graphics: The package allows a character set which can be defined by users, giving the picture for a character by designating its pixels. Such characters... type fonts and user-oriented "help" messages tailored to the operations being performed and user expertise. In general, critical design issues... other volumes include command language, software design, description and analysis tools, database management systems, operating systems, planning and

  18. An accounting system for water and consumptive use along the Colorado River, Hoover Dam to Mexico

    USGS Publications Warehouse

    Owen-Joyce, Sandra J.; Raymond, Lee H.

    1996-01-01

    An accounting system for estimating and distributing consumptive use of water by vegetation to water users was developed for the Colorado River to meet the requirements of a U.S. Supreme Court decree and used with data from calendar year 1984. The system is based on a water-budget method to estimate total consumptive use by vegetation which is apportioned to agricultural users by using percentages of total evapotranspiration by vegetation estimated from digital-image analysis of satellite data.
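
    The accounting step itself is a proration: the basin-wide consumptive-use total from the water budget is distributed to users according to each user's share of the evapotranspiration estimated from the imagery. With invented numbers:

        # Hypothetical basin total and per-user ET shares from image
        # classification; the real system derives these from satellite data.
        total_consumptive_use = 1_200_000  # acre-feet, from the water budget
        et_share = {"district_A": 0.42, "district_B": 0.35, "district_C": 0.23}

        for user, share in et_share.items():
            print(f"{user}: {total_consumptive_use * share:,.0f} acre-feet")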

  19. Space shuttle/payload interface analysis. (Study 2.4) Volume 4: Business Risk and Value of Operations in Space (BRAVO). Part 2: User's manual

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The BRAVO User's Manual is presented which describes the BRAVO methodology in terms of step-by-step procedures, so that it may be used as a tool for a team of analysts performing cost effectiveness analyses on potential future space applications. BRAVO requires a relatively general set of input information and a relatively small expenditure of resources. For Vol. 1, see N74-12493; for Vol. 2, see N74-14530.

  20. An Initial Look at Adjacent Band Interference Between Aeronautical Mobile Telemetry and Long-Term Evolution Wireless Service

    DTIC Science & Technology

    2016-07-04

    required analysis, and further testing. SUBJECT TERMS: Adjacent Channel Interference (ACI), LTE-A, LTE, PCM/FM, SOQPSK-TG, ARTM CPM, AWS-3, User Equipment (UE), Evolved Node B (eNodeB), Resource Blocks. INTRODUCTION: "On... these questions make necessary an improved understanding of the interferers that can be obtained only by hands-on measurements. This work will

  1. phylo-node: A molecular phylogenetic toolkit using Node.js.

    PubMed

    O'Halloran, Damien M

    2017-01-01

    Node.js is an open-source and cross-platform environment that provides a JavaScript codebase for back-end server-side applications. JavaScript has been used to develop very fast and user-friendly front-end tools for bioinformatic and phylogenetic analyses. However, no such toolkits are available using Node.js to conduct comprehensive molecular phylogenetic analysis. To address this problem, I have developed phylo-node, a Node.js toolkit that is stable and scalable and allows the user to perform diverse molecular and phylogenetic tasks. phylo-node can execute the analysis and process the resulting outputs from a suite of software options that provides tools for read processing and genome alignment, sequence retrieval, multiple sequence alignment, primer design, evolutionary modeling, and phylogeny reconstruction. Furthermore, phylo-node enables the user to deploy server dependent applications, and also provides simple integration and interoperation with other Node modules and languages using Node inheritance patterns, and a customized piping module to support the production of diverse pipelines. phylo-node is open-source and freely available to all users without sign-up or login requirements. All source code and user guidelines are openly available at the GitHub repository: https://github.com/dohalloran/phylo-node.

  2. Design of an online health-promoting community: negotiating user community needs with public health goals and service capabilities.

    PubMed

    Ekberg, Joakim; Timpka, Toomas; Angbratt, Marianne; Frank, Linda; Norén, Anna-Maria; Hedin, Lena; Andersen, Emelie; Gursky, Elin A; Gäre, Boel Andersson

    2013-07-04

    An online health-promoting community (OHPC) has the potential to promote health and advance new means of dialogue between public health representatives and the general public. The aim of this study was to examine which aspects of an OHPC are critical for satisfying the needs of the user community and public health goals and service capabilities. Community-based participatory research methods were used for data collection and analysis, and participatory design principles to develop a case study OHPC for adolescents. Qualitative data from adolescents on health appraisals and perspectives on health information were collected in a Swedish health service region and classified into categories of user health information exchange needs. A composite design rationale for the OHPC was completed by linking the identified user needs, user-derived requirements, and technical and organizational systems solutions. Conflicts between end-user requirements and organizational goals and resources were identified. The most prominent health information needs were associated with food, exercise, and well-being. The assessment of the design rationale document and prototype in light of the regional public health goals and service capabilities showed that compromises were needed to resolve conflicts involving the management of organizational resources and responsibilities. The users wanted to discuss health issues with health experts, but the experts had little time to set aside for the OHPC, and it was unclear who should set the norms for the online discussions. OHPCs can be designed to satisfy both the needs of user communities and public health goals and service capabilities. Compromises are needed to resolve conflicts between users' needs to discuss health issues with domain experts and the management of resources and responsibilities in public health organizations.

  3. Systems cost/performance analysis (study 2.3). Volume 2: Systems cost/performance model. [unmanned automated payload programs and program planning

    NASA Technical Reports Server (NTRS)

    Campbell, B. H.

    1974-01-01

    A methodology developed for the balanced design of spacecraft subsystems, interrelating cost, performance, safety, and schedule considerations, was refined. The methodology consists of a two-step process: the first step is one of selecting all hardware designs which satisfy the given performance and safety requirements; the second step is one of estimating the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedule. The user is able to determine the sensitivity of design, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.

  4. CP-ABE Based Privacy-Preserving User Profile Matching in Mobile Social Networks

    PubMed Central

    Cui, Weirong; Du, Chenglie; Chen, Jinchao

    2016-01-01

    Privacy-preserving profile matching, a challenging task in mobile social networks, is getting more attention in recent years. In this paper, we propose a novel scheme that is based on ciphertext-policy attribute-based encryption to tackle this problem. In our scheme, a user can submit a preference-profile and search for users with matching-profile in decentralized mobile social networks. In this process, neither any participant’s profile nor the submitted preference-profile is exposed. Meanwhile, a secure communication channel can be established between the pair of successfully matched users. In contrast to existing related schemes which are mainly based on the secure multi-party computation, our scheme can provide verifiability (neither the initiator nor any unmatched user can cheat the other to pretend to be matched), and requires few interactions among users. We provide thorough security analysis and performance evaluation on our scheme, and show its advantages in terms of security, efficiency and usability over state-of-the-art schemes. PMID:27337001

  5. CP-ABE Based Privacy-Preserving User Profile Matching in Mobile Social Networks.

    PubMed

    Cui, Weirong; Du, Chenglie; Chen, Jinchao

    2016-01-01

    Privacy-preserving profile matching, a challenging task in mobile social networks, is getting more attention in recent years. In this paper, we propose a novel scheme that is based on ciphertext-policy attribute-based encryption to tackle this problem. In our scheme, a user can submit a preference-profile and search for users with matching-profile in decentralized mobile social networks. In this process, neither any participant's profile nor the submitted preference-profile is exposed. Meanwhile, a secure communication channel can be established between the pair of successfully matched users. In contrast to existing related schemes which are mainly based on the secure multi-party computation, our scheme can provide verifiability (neither the initiator nor any unmatched user can cheat the other to pretend to be matched), and requires few interactions among users. We provide thorough security analysis and performance evaluation on our scheme, and show its advantages in terms of security, efficiency and usability over state-of-the-art schemes.

  6. Beyond Logging of Fingertip Actions: Analysis of Collaborative Learning Using Multiple Sources of Data

    ERIC Educational Resources Information Center

    Avouris, N.; Fiotakis, G.; Kahrimanis, G.; Margaritis, M.; Komis, V.

    2007-01-01

    In this article, we discuss key requirements for collecting behavioural data concerning technology-supported collaborative learning activities. It is argued that the common practice of analysis of computer generated log files of user interactions with software tools is not enough for building a thorough view of the activity. Instead, more…

  7. Sensitivity Analysis of Down Woody Material Data Processing Routines

    Treesearch

    Christopher W. Woodall; Duncan C. Lutes

    2005-01-01

    Weight per unit area (load) estimates of Down Woody Material (DWM) are the most common requests by users of the USDA Forest Service's Forest Inventory and Analysis (FIA) program's DWM inventory. Estimating DWM loads requires the uniform compilation of DWM transect data for the entire United States. DWM weights may vary by species, level of decay, woody...

  8. A cognitive task analysis of a visual analytic workflow: Exploring molecular interaction networks in systems biology.

    PubMed

    Mirel, Barbara; Eichinger, Felix; Keller, Benjamin J; Kretzler, Matthias

    2011-03-21

    Bioinformatics visualization tools are often not robust enough to support biomedical specialists’ complex exploratory analyses. Tools need to accommodate the workflows that scientists actually perform for specific translational research questions. To understand and model one of these workflows, we conducted a case-based, cognitive task analysis of a biomedical specialist’s exploratory workflow for the question: What functional interactions among gene products of high throughput expression data suggest previously unknown mechanisms of a disease? From our cognitive task analysis four complementary representations of the targeted workflow were developed. They include: usage scenarios, flow diagrams, a cognitive task taxonomy, and a mapping between cognitive tasks and user-centered visualization requirements. The representations capture the flows of cognitive tasks that led a biomedical specialist to inferences critical to hypothesizing. We created representations at levels of detail that could strategically guide visualization development, and we confirmed this by making a trial prototype based on user requirements for a small portion of the workflow. Our results imply that visualizations should make available to scientific users “bundles of features” consonant with the compositional cognitive tasks purposefully enacted at specific points in the workflow. We also highlight certain aspects of visualizations that: (a) need more built-in flexibility; (b) are critical for negotiating meaning; and (c) are necessary for essential metacognitive support.

  9. Condor-COPASI: high-throughput computing for biochemical networks

    PubMed Central

    2012-01-01

    Background: Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results: We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions: Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
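
    The pattern Condor-COPASI automates, splitting an embarrassingly parallel scan into chunks, running them concurrently, and merging the outputs, can be shown without Condor at all. Below, Python's concurrent.futures stands in for the pool and a stub function stands in for a COPASI model run; only the shape of the workflow matches the tool.

        from concurrent.futures import ProcessPoolExecutor

        def simulate(param):
            # Stand-in for one COPASI simulation at a given parameter value.
            return param, param ** 2

        def run_chunk(chunk):
            # One chunk corresponds to one submitted job in the real system.
            return [simulate(p) for p in chunk]

        if __name__ == "__main__":
            scan = [0.1 * i for i in range(100)]  # full parameter scan
            chunks = [scan[i:i + 25] for i in range(0, len(scan), 25)]
            results = []
            with ProcessPoolExecutor() as pool:
                for part in pool.map(run_chunk, chunks):
                    results.extend(part)  # merge, as Condor-COPASI does
            print(len(results), "runs completed")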

  10. Image segmentation and registration for the analysis of joint motion from 3D MRI

    NASA Astrophysics Data System (ADS)

    Hu, Yangqiu; Haynor, David R.; Fassbind, Michael; Rohr, Eric; Ledoux, William

    2006-03-01

    We report an image segmentation and registration method for studying joint morphology and kinematics from in vivo MRI scans and its application to the analysis of ankle joint motion. Using an MR-compatible loading device, a foot was scanned in a single neutral and seven dynamic positions including maximal flexion, rotation and inversion/eversion. A segmentation method combining graph cuts and level sets was developed which allows a user to interactively delineate 14 bones in the neutral position volume in less than 30 minutes total, including less than 10 minutes of user interaction. In the subsequent registration step, a separate rigid body transformation for each bone is obtained by registering the neutral position dataset to each of the dynamic ones, which produces an accurate description of the motion between them. We have processed six datasets, including 3 normal and 3 pathological feet. For validation our results were compared with those obtained from 3DViewnix, a semi-automatic segmentation program, and achieved good agreement in volume overlap ratios (mean: 91.57%, standard deviation: 3.58%) for all bones. Our tool requires only 1/50 and 1/150 of the user interaction time required by 3DViewnix and NIH Image Plus, respectively, an improvement that has the potential to make joint motion analysis from MRI practical in research and clinical applications.
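
    The per-bone registration step, estimating one rigid transform that carries a bone from the neutral scan into a dynamic position, can be sketched with SimpleITK. File names, the metric, and the optimizer settings below are illustrative choices, not the authors' implementation.

        import SimpleITK as sitk

        # Illustrative inputs: a dynamic-position volume and the neutral
        # volume masked to a single segmented bone.
        fixed = sitk.ReadImage("dynamic_position.nii", sitk.sitkFloat32)
        moving = sitk.ReadImage("neutral_bone_masked.nii", sitk.sitkFloat32)

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMeanSquares()  # simple intensity metric for the sketch
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetOptimizerAsRegularStepGradientDescent(
            learningRate=1.0, minStep=1e-4, numberOfIterations=200)

        # One rigid (6-DOF) transform per bone, initialized at volume centers.
        initial = sitk.CenteredTransformInitializer(
            fixed, moving, sitk.Euler3DTransform(),
            sitk.CenteredTransformInitializerFilter.GEOMETRY)
        reg.SetInitialTransform(initial, inPlace=False)

        rigid = reg.Execute(fixed, moving)
        print(rigid)  # rotation + translation describing the bone's motion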

  11. Identification of MS-Cleavable and Non-Cleavable Chemically Crosslinked Peptides with MetaMorpheus.

    PubMed

    Lu, Lei; Millikin, Robert J; Solntsev, Stefan K; Rolfs, Zach; Scalf, Mark; Shortreed, Michael R; Smith, Lloyd M

    2018-05-25

    Protein chemical crosslinking combined with mass spectrometry has become an important technique for the analysis of protein structure and protein-protein interactions. A variety of crosslinkers are well developed, but reliable, rapid, and user-friendly tools for large-scale analysis of crosslinked proteins are still in need. Here we report MetaMorpheusXL, a new search module within the MetaMorpheus software suite that identifies both MS-cleavable and non-cleavable crosslinked peptides in MS data. MetaMorpheusXL identifies MS-cleavable crosslinked peptides with an ion-indexing algorithm, which enables an efficient large database search. The identification does not require the presence of signature fragment ions, an advantage compared to similar programs such as XlinkX. One complication associated with the need for signature ions from cleavable crosslinkers such as DSSO (disuccinimidyl sulfoxide) is the requirement for multiple fragmentation types and energy combinations, which is not necessary for MetaMorpheusXL. The ability to perform proteome-wide analysis is another advantage of MetaMorpheusXL compared to such programs as MeroX and DXMSMS. MetaMorpheusXL is also faster than other currently available MS-cleavable crosslink search software programs. It is embedded in MetaMorpheus, an open-source and freely available software suite that provides a reliable, fast, user-friendly graphical user interface that is readily accessible to researchers.
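
    Ion indexing, the device that makes the large-database search fast, amounts to a precomputed inverted index from fragment-ion m/z bins to the peptides that could produce them; a spectrum is then scored by counting index hits instead of comparing against every candidate. A toy version (masses, bin width, and peptides are invented, and MetaMorpheusXL's actual index is far more elaborate):

        from collections import defaultdict

        BIN = 0.05  # m/z bin width standing in for the fragment tolerance

        def bin_of(mz):
            return round(mz / BIN)

        # Invented theoretical fragment m/z values per candidate peptide.
        peptides = {
            "PEPTIDEK": [98.06, 227.10, 324.16, 439.18],
            "CROSSLINK": [120.08, 227.10, 350.20, 501.25],
        }

        # Build the inverted index once: fragment bin -> producing peptides.
        index = defaultdict(set)
        for pep, frags in peptides.items():
            for mz in frags:
                index[bin_of(mz)].add(pep)

        # Score an observed spectrum by counting hits, checking neighboring
        # bins to cover the tolerance at bin boundaries.
        spectrum = [98.07, 227.11, 439.17]
        scores = defaultdict(int)
        for mz in spectrum:
            b = bin_of(mz)
            for pep in set().union(*(index[n] for n in (b - 1, b, b + 1))):
                scores[pep] += 1
        print(dict(scores))  # highest count = best-supported candidate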

  12. C-SPADE: a web-tool for interactive analysis and visualization of drug screening experiments through compound-specific bioactivity dendrograms

    PubMed Central

    Alam, Zaid; Peddinti, Gopal

    2017-01-01

    The advent of the polypharmacology paradigm in drug discovery calls for novel chemoinformatic tools for analyzing compounds’ multi-targeting activities. Such tools should provide an intuitive representation of the chemical space through capturing and visualizing underlying patterns of compound similarities linked to their polypharmacological effects. Most of the existing compound-centric chemoinformatics tools lack interactive options and user interfaces that are critical for the real-time needs of chemical biologists carrying out compound screening experiments. Toward that end, we introduce C-SPADE, an open-source exploratory web-tool for interactive analysis and visualization of drug profiling assays (biochemical, cell-based or cell-free) using compound-centric similarity clustering. C-SPADE allows the users to visually map the chemical diversity of a screening panel, explore investigational compounds in terms of their similarity to the screening panel, perform polypharmacological analyses and guide drug-target interaction predictions. C-SPADE requires only the raw drug profiling data as input, and it automatically retrieves the structural information and constructs the compound clusters in real-time, thereby reducing the time required for manual analysis in drug development or repurposing applications. The web-tool provides a customizable visual workspace that can either be downloaded as a figure or Newick tree file or shared as a hyperlink with other users. C-SPADE is freely available at http://cspade.fimm.fi/. PMID:28472495
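
    The clustering behind such dendrograms reduces to three steps: fingerprint each compound, convert pairwise Tanimoto similarity into a distance, and run hierarchical clustering. A sketch with RDKit and scipy (the SMILES panel is arbitrary; C-SPADE performs the analogous computation server-side and renders the tree in the browser):

        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem
        from scipy.cluster.hierarchy import linkage, dendrogram

        # Arbitrary example panel of screening compounds as SMILES.
        smiles = ["CCO", "CCN", "c1ccccc1", "c1ccccc1O", "CC(=O)O"]
        mols = [Chem.MolFromSmiles(s) for s in smiles]
        fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048)
               for m in mols]

        # Condensed distance vector: 1 - Tanimoto similarity for each pair.
        dists = [1.0 - DataStructs.TanimotoSimilarity(fps[i], fps[j])
                 for i in range(len(fps)) for j in range(i + 1, len(fps))]

        # Average-linkage tree, the structure a dendrogram view displays.
        tree = linkage(dists, method="average")
        dendrogram(tree, labels=smiles, no_plot=True)  # no_plot=False to draw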

  13. Activity recognition of assembly tasks using body-worn microphones and accelerometers.

    PubMed

    Ward, Jamie A; Lukowicz, Paul; Tröster, Gerhard; Starner, Thad E

    2006-10-01

    In order to provide relevant information to mobile users, such as workers engaging in the manual tasks of maintenance and assembly, a wearable computer requires information about the user's specific activities. This work focuses on the recognition of activities that are characterized by a hand motion and an accompanying sound. Suitable activities can be found in assembly and maintenance work. Here, we provide an initial exploration into the problem domain of continuous activity recognition using on-body sensing. We use a mock "wood workshop" assembly task to ground our investigation. We describe a method for the continuous recognition of activities (sawing, hammering, filing, drilling, grinding, sanding, opening a drawer, tightening a vise, and turning a screwdriver) using microphones and three-axis accelerometers mounted at two positions on the user's arms. Potentially "interesting" activities are segmented from continuous streams of data using an analysis of the sound intensity detected at the two different locations. Activity classification is then performed on these detected segments using linear discriminant analysis (LDA) on the sound channel and hidden Markov models (HMMs) on the acceleration data. Four different methods at classifier fusion are compared for improving these classifications. Using user-dependent training, we obtain continuous average recall and precision rates (for positive activities) of 78 percent and 74 percent, respectively. Using user-independent training (leave-one-out across five users), we obtain recall rates of 66 percent and precision rates of 63 percent. In isolation, these activities were recognized with accuracies of 98 percent, 87 percent, and 95 percent for the user-dependent, user-independent, and user-adapted cases, respectively.
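
    The sound-channel step, LDA applied to features of each detected segment, is a one-liner with scikit-learn. The feature vectors below are random stand-ins for the intensity and spectral features of segmented activity sounds; only the classifier choice mirrors the paper.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)

        # Stand-in features for segments of three activities: 40 segments
        # each, 8 features per segment, separated by shifted means.
        X = np.vstack([rng.normal(loc=c, size=(40, 8)) for c in (0.0, 1.5, 3.0)])
        y = np.repeat(["sawing", "hammering", "drilling"], 40)

        lda = LinearDiscriminantAnalysis().fit(X[::2], y[::2])  # train on half
        print("held-out accuracy:", lda.score(X[1::2], y[1::2]))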

  14. DiversePathsJ: diverse shortest paths for bioimage analysis.

    PubMed

    Uhlmann, Virginie; Haubold, Carsten; Hamprecht, Fred A; Unser, Michael

    2018-02-01

    We introduce a formulation for the general task of finding diverse shortest paths between two end-points. Our approach is not linked to a specific biological problem and can be applied to a large variety of images thanks to its generic implementation as a user-friendly ImageJ/Fiji plugin. It relies on the introduction of additional layers in a Viterbi path graph, which requires slight modifications to the standard Viterbi algorithm rules. This layered graph construction allows for the specification of various constraints imposing diversity between solutions. The software allows obtaining a collection of diverse shortest paths under some user-defined constraints through a convenient and user-friendly interface. It can be used alone or be integrated into larger image analysis pipelines. Availability: http://bigwww.epfl.ch/algorithms/diversepathsj. Contact: michael.unser@epfl.ch or fred.hamprecht@iwr.uni-heidelberg.de. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
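
    The computation underneath is Viterbi-style dynamic programming over a trellis: each layer holds one column of candidate nodes, and the cheapest path is recovered by backtracking. A generic sketch of that recursion (costs are random; the extra layers DiversePathsJ adds to enforce diversity between successive paths are omitted):

        import numpy as np

        def viterbi_path(cost, smooth=1.0):
            """Minimum-cost path through a trellis. cost[t, j] is the node
            cost of state j in layer t; moving from state i to state j is
            penalized by smooth * |i - j| (a simple smoothness prior)."""
            T, J = cost.shape
            states = np.arange(J)
            total = cost[0].astype(float).copy()
            back = np.zeros((T, J), dtype=int)
            for t in range(1, T):
                # trans[i, j]: cost of reaching state j via predecessor i.
                trans = total[:, None] + smooth * np.abs(states[:, None] - states)
                back[t] = trans.argmin(axis=0)
                total = trans.min(axis=0) + cost[t]
            path = [int(total.argmin())]  # backtrack from cheapest end state
            for t in range(T - 1, 0, -1):
                path.append(int(back[t][path[-1]]))
            return path[::-1]

        rng = np.random.default_rng(1)
        print(viterbi_path(rng.random((6, 5))))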

  15. Embedded CLIPS for SDI BM/C3 simulation and analysis

    NASA Technical Reports Server (NTRS)

    Gossage, Brett; Nanney, Van

    1990-01-01

    Nichols Research Corporation is developing the BM/C3 Requirements Analysis Tool (BRAT) for the U.S. Army Strategic Defense Command. BRAT uses embedded CLIPS/Ada to model the decision making processes used by the human commander of a defense system. Embedding CLIPS/Ada in BRAT allows the user to explore the role of the human in Command and Control (C2) and the use of expert systems for automated C2. BRAT models assert facts about the current state of the system, the simulated scenario, and threat information into CLIPS/Ada. A user-defined rule set describes the decision criteria for the commander. We have extended CLIPS/Ada with user-defined functions that allow the firing of a rule to invoke a system action such as weapons release or a change in strategy. The use of embedded CLIPS/Ada will provide a powerful modeling tool for our customer at minimal cost.

  16. Space station data system analysis/architecture study. Task 1: Functional requirements definition, DR-5. Appendix: Requirements data base

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Appendix A contains data that characterize the system functions in sufficient depth to determine the requirements for the Space Station Data System (SSDS). This data is in the form of: (1) top down traceability report; (2) bottom up traceability report; (3) requirements data sheets; and (4) cross index of requirements paragraphs of the source documents and the requirements numbers. A database user's guide is included that interested parties can use to access the requirements database and get up-to-date information about the functions.

  17. Patient Accounting Systems: Are They Fit with the Users' Requirements?

    PubMed

    Ayatollahi, Haleh; Nazemi, Zahra; Haghani, Hamid

    2016-01-01

    A patient accounting system is a subsystem of a hospital information system. This system, like other information systems, should be carefully designed to be able to meet users' requirements. The main aim of this research was to investigate users' requirements and to determine whether current patient accounting systems meet users' needs or not. This was a survey study, and the participants were the users of six patient accounting systems used in 24 teaching hospitals. A stratified sampling method was used to select the participants (n = 216). The research instruments were a questionnaire and a checklist. A mean value of ≥3 indicated the importance of each data element and the capability of the system. Generally, the findings showed that the current patient accounting systems had some weaknesses and were able to meet between 70% and 80% of users' requirements. The current patient accounting systems need to be improved to be able to meet users' requirements. This approach can also help to provide hospitals with more usable and reliable financial information.

  18. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems}; (3) Coupling large-scale computing and data systems to scientific and engineering instruments (e.g., realtime interaction with experiments through real-time data analysis and interpretation presented to the experimentalist in ways that allow direct interaction with the experiment (instead of just with instrument control); (5) Highly interactive, augmented reality and virtual reality remote collaborations (e.g., Ames / Boeing Remote Help Desk providing field maintenance use of coupled video and NDI to a remote, on-line airframe structures expert who uses this data to index into detailed design databases, and returns 3D internal aircraft geometry to the field); (5) Single computational problems too large for any single system (e.g. the rotocraft reference calculation). Grids also have the potential to provide pools of resources that could be called on in extraordinary / rapid response situations (such as disaster response) because they can provide common interfaces and access mechanisms, standardized management, and uniform user authentication and authorization, for large collections of distributed resources (whether or not they normally function in concert). IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: the scientist / design engineer whose primary interest is problem solving (e.g. 
determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. The results of the analysis of the needs of these two types of users provides a broad set of requirements that gives rise to a general set of required capabilities. The IPG project is intended to address all of these requirements. In some cases the required computing technology exists, and in some cases it must be researched and developed. The project is using available technology to provide a prototype set of capabilities in a persistent distributed computing testbed. Beyond this, there are required capabilities that are not immediately available, and whose development spans the range from near-term engineering development (one to two years) to much longer term R&D (three to six years). Additional information is contained in the original.

  19. Data analysis software for the autoradiographic enhancement process. Volumes 1, 2, and 3, and appendix

    NASA Technical Reports Server (NTRS)

    Singh, S. P.

    1979-01-01

    The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the user's manual. A software description and modified program listings for the data analysis software are included.
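
    Wiener-spectrum (noise power spectrum) analysis of a film scan reduces to Fourier-transforming mean-subtracted patches of a uniformly exposed area and averaging their squared moduli. A compact numpy sketch, with simulated granularity noise standing in for microdensitometer data:

        import numpy as np

        rng = np.random.default_rng(0)
        scan = rng.normal(0.5, 0.02, size=(512, 512))  # simulated flat scan

        N = 128  # patch size
        spectra = []
        for i in range(0, 512, N):
            for j in range(0, 512, N):
                patch = scan[i:i + N, j:j + N]
                patch = patch - patch.mean()  # remove the DC component
                F = np.fft.fft2(patch)
                spectra.append(np.abs(F) ** 2 / N**2)  # patch periodogram

        # Wiener spectrum estimate: the average periodogram over patches.
        wiener = np.mean(spectra, axis=0)
        print(wiener.shape, float(wiener.mean()))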

  20. On Gait Analysis Estimation Errors Using Force Sensors on a Smart Rollator

    PubMed Central

    Ballesteros, Joaquin; Urdiales, Cristina; Martinez, Antonio B.; van Dieën, Jaap H.

    2016-01-01

    Gait analysis can provide valuable information on a person’s condition and rehabilitation progress. Gait is typically captured using external equipment and/or wearable sensors. These tests are largely constrained to specific controlled environments. In addition, gait analysis often requires experts for calibration, operation and/or to place sensors on volunteers. Alternatively, mobility support devices like rollators can be equipped with onboard sensors to monitor gait parameters, while users perform their Activities of Daily Living. Gait analysis in rollators may use odometry and force sensors in the handlebars. However, force based estimation of gait parameters is less accurate than traditional methods, especially when rollators are not properly used. This paper presents an evaluation of force based gait analysis using a smart rollator on different groups of users to determine when this methodology is applicable. In a second stage, the rollator is used in combination with two lab-based gait analysis systems to assess the rollator estimation error. Our results show that: (i) there is an inverse relation between the variance in the force difference between handlebars and support on the handlebars—related to the user condition—and the estimation error; and (ii) this error is lower than 10% when the variation in the force difference is above 7 N. This lower limit was exceeded by the 95.83% of our challenged volunteers. In conclusion, rollators are useful for gait characterization as long as users really need the device for ambulation. PMID:27834911

  1. On Gait Analysis Estimation Errors Using Force Sensors on a Smart Rollator.

    PubMed

    Ballesteros, Joaquin; Urdiales, Cristina; Martinez, Antonio B; van Dieën, Jaap H

    2016-11-10

    Gait analysis can provide valuable information on a person's condition and rehabilitation progress. Gait is typically captured using external equipment and/or wearable sensors. These tests are largely constrained to specific controlled environments. In addition, gait analysis often requires experts for calibration, operation and/or to place sensors on volunteers. Alternatively, mobility support devices like rollators can be equipped with onboard sensors to monitor gait parameters, while users perform their Activities of Daily Living. Gait analysis in rollators may use odometry and force sensors in the handlebars. However, force based estimation of gait parameters is less accurate than traditional methods, especially when rollators are not properly used. This paper presents an evaluation of force based gait analysis using a smart rollator on different groups of users to determine when this methodology is applicable. In a second stage, the rollator is used in combination with two lab-based gait analysis systems to assess the rollator estimation error. Our results show that: (i) there is an inverse relation between the variance in the force difference between handlebars and support on the handlebars (related to the user condition) and the estimation error; and (ii) this error is lower than 10% when the variation in the force difference is above 7 N. This lower limit was exceeded by 95.83% of our challenged volunteers. In conclusion, rollators are useful for gait characterization as long as users really need the device for ambulation.

  2. Addressing and Presenting Quality of Satellite Data via Web-Based Services

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory; Lynnes, C.; Ahmad, S.; Fox, P.; Zednik, S.; West, P.

    2011-01-01

    With the recent attention to climate change and proliferation of remote-sensing data utilization, climate model and various environmental monitoring and protection applications have begun to increasingly rely on satellite measurements. Research application users seek good quality satellite data, with uncertainties and biases provided for each data point. However, different communities address remote sensing quality issues rather inconsistently and differently. We describe our attempt to systematically characterize, capture, and provision quality and uncertainty information as it applies to the NASA MODIS Aerosol Optical Depth data product. In particular, we note the semantic differences in quality/bias/uncertainty at the pixel, granule, product, and record levels. We outline various factors contributing to uncertainty or error budget; errors. Web-based science analysis and processing tools allow users to access, analyze, and generate visualizations of data while alleviating users from having directly managing complex data processing operations. These tools provide value by streamlining the data analysis process, but usually shield users from details of the data processing steps, algorithm assumptions, caveats, etc. Correct interpretation of the final analysis requires user understanding of how data has been generated and processed and what potential biases, anomalies, or errors may have been introduced. By providing services that leverage data lineage provenance and domain-expertise, expert systems can be built to aid the user in understanding data sources, processing, and the suitability for use of products generated by the tools. We describe our experiences developing a semantic, provenance-aware, expert-knowledge advisory system applied to NASA Giovanni web-based Earth science data analysis tool as part of the ESTO AIST-funded Multi-sensor Data Synergy Advisor project.

  3. NASA GES DISC On-line Visualization and Analysis System for Gridded Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory G.; Berrick, S.; Rui, H.; Liu, Z.; Zhu, T.; Teng, W.; Shen, S.; Qin, J.

    2005-01-01

    The ability to use data stored in the current NASA Earth Observing System (EOS) archives for studying regional or global phenomena is highly dependent on having a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. This is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets that are usually of different formats, structures, and resolutions. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has taken a major step towards meeting this challenge by developing an infrastructure with a Web interface that allows users to perform interactive analysis online without downloading any data, the GES-DISC Interactive Online Visualization and Analysis Infrastructure or "Giovanni." Giovanni provides interactive, online, analysis tools for data users to facilitate their research. There have been several instances of this interface created to serve TRMM users, Aerosol scientists, Ocean Color and Agriculture applications users. The first generation of these tools support gridded data only. The user selects geophysical parameters, area of interest, time period; and the system generates an output on screen in a matter of seconds. The currently available output options are: Area plot averaged or accumulated over any available data period for any rectangular area; Time plot time series averaged over any rectangular area; Hovmoller plots image view of any longitude-time and latitude-time cross sections; ASCII output for all plot types; Image animation for area plot. Another analysis suite deals with parameter intercomparison: scatter plots, temporal correlation maps, GIs-compatible outputs, etc. This allow user to focus on data content (i.e. science parameters) and eliminate the need for expensive learning, development and processing tasks that are redundantly incurred by an archive's user community. The current implementation utilizes the GrADS-DODS Server (GDS), and provides subsetting and analysis services across the Internet for any GrADS-readable dataset. The subsetting capability allows users to retrieve a specified temporal and/or spatial subdomain from a large dataset, eliminating the need to download everything simply to access a small relevant portion of a dataset. The analysis capability allows users to retrieve the results of an operation applied to one or more datasets on the server. We use this approach to read pre-processed binary files and/or to read and extract the needed parts directly from HDF or HDF-EOS files. These subsets then serve as inputs into GrADS analysis scripts. It can be used in a wide variety of Earth science applications: climate and weather events study and monitoring; modeling. It can be easily configured for new applications.

  4. netCDF Operators for Rapid Analysis of Measured and Modeled Swath-like Data

    NASA Astrophysics Data System (ADS)

    Zender, C. S.

    2015-12-01

    Swath-like data (hereafter SLD) are defined by non-rectangular and/or time-varying spatial grids in which one or more coordinates are multi-dimensional. It is often challenging and time-consuming to work with SLD, including all Level 2 satellite-retrieved data, non-rectangular subsets of Level 3 data, and model data on curvilinear grids. Researchers and data centers want user-friendly, fast, and powerful methods to specify, extract, serve, manipulate, and thus analyze, SLD. To meet these needs, large research-oriented agencies and modeling center such as NASA, DOE, and NOAA increasingly employ the netCDF Operators (NCO), an open-source scientific data analysis software package applicable to netCDF and HDF data. NCO includes extensive, fast, parallelized regridding features to facilitate analysis and intercomparison of SLD and model data. Remote sensing, weather and climate modeling and analysis communities face similar problems in handling SLD including how to easily: 1. Specify and mask irregular regions such as ocean basins and political boundaries in SLD (and rectangular) grids. 2. Bin, interpolate, average, or re-map SLD to regular grids. 3. Derive secondary data from given quality levels of SLD. These common tasks require a data extraction and analysis toolkit that is SLD-friendly and, like NCO, familiar in all these communities. With NCO users can 1. Quickly project SLD onto the most useful regular grids for intercomparison. 2. Access sophisticated statistical and regridding functions that are robust to missing data and allow easy specification of quality control metrics. These capabilities improve interoperability, software-reuse, and, because they apply to SLD, minimize transmission, storage, and handling of unwanted data. While SLD analysis still poses many challenges compared to regularly gridded, rectangular data, the custom analyses scripts SLD once required are now shorter, more powerful, and user-friendly.

  5. Rapid Prototyping of Hydrologic Model Interfaces with IPython

    NASA Astrophysics Data System (ADS)

    Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.

    2014-12-01

    A significant gulf still exists between the state of practice and state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment in project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site and/or process specific. As a result, we end up with many, customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site and application-specific user interfaces. We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.

  6. Earth Observation Training and Education with ESA LearnEO!

    NASA Astrophysics Data System (ADS)

    Byfield, Valborg; Mathieu, Pierre-Philippe; Dobson, Malcolm; Rosmorduc, Vinca; Del Frate, Fabio; Banks, Chris; Picchiani, Matteo

    2013-04-01

    For society to benefit fully from its investment in Earth observation, EO data must be accessible and familiar to a global community of users who have the skills, knowledge and understanding to use the observations appropriately in their work. Achieving this requires considerable education effort. LearnEO! (www.learn-eo.org) is a new ESA education project that contributes towards making this a reality. LearnEO! has two main aims: to develop new training resources that use data from sensors on ESA satellites to explore a variety of environmental topics, and to stimulate and support members of the EO and education communities who may be willing to develop and share new education resources in the future. The project builds on the UNESCO Bilko project, which currently supplies free software, tutorials, and example data to users in 175 countries. Most of these users are in academic education or research, but the training resources are also of interest to a growing number of professionals in government, NGOs and private enterprise. Typical users are not remote sensing experts, but see satellite data as one of many observational tools. They want an easy, low-cost means to process, display and analyse data from different satellite sensors as part of their work in environmental research, monitoring and policy development. Many of the software improvements and training materials developed in LearnEO! are in response to requests from this user community. The LearnEO! tutorial and peer-reviewed lessons are designed to teach satellite data processing and analysis skills at different levels, from beginner to advanced - where advanced lessons requires some previous experience with Earth observation techniques. The materials are aimed at students and professionals in various branches of Earth sciences who have not yet specialised in specific EO technologies. The lessons are suitable for self-study, university courses at undergraduate to MSc level, or for continued professional development training. Each lesson comes complete with data, analysis tools and background information required to complete the suggested activities and answer the study questions. Model answers are supplied for users working on their own or with limited specialist support. The web site also provides access to annotated data sets and a lesson developers resource library, both designed to support users who wish to develop their own lessons and tutorials and share these with others. Registered users are encouraged to become involved with the project by providing support for future software and lesson development, testing, and peer review.

  7. LANDSAT D local user terminal study

    NASA Technical Reports Server (NTRS)

    Alexander, L.; Louie, M.; Spencer, R.; Stow, W. K.

    1976-01-01

    The effect of the changes incorporated in the LANDSAT D system on the ability of a local user terminal to receive, record and process data in real time was studied. Alternate solutions to the problems raised by these changes were evaluated. A loading analysis was performed in order to determine the quantities of data that a local user terminal (LUT) would be interested in receiving and processing. The number of bits in an MSS and a TM scene were calculated along with the number of scenes per day that an LUT might require for processing. These then combined to a total number of processed bits/day for an LUT as a function of sensor and coverage circle radius.

  8. Failure Analysis for Composition of Web Services Represented as Labeled Transition Systems

    NASA Astrophysics Data System (ADS)

    Nadkarni, Dinanath; Basu, Samik; Honavar, Vasant; Lutz, Robyn

    The Web service composition problem involves the creation of a choreographer that provides the interaction between a set of component services to realize a goal service. Several methods have been proposed and developed to address this problem. In this paper, we consider those scenarios where the composition process may fail due to incomplete specification of goal service requirements or due to the fact that the user is unaware of the functionality provided by the existing component services. In such cases, it is desirable to have a composition algorithm that can provide feedback to the user regarding the cause of failure in the composition process. Such feedback will help guide the user to re-formulate the goal service and iterate the composition process. We propose a failure analysis technique for composition algorithms that views Web service behavior as multiple sequences of input/output events. Our technique identifies the possible cause of composition failure and suggests possible recovery options to the user. We discuss our technique using a simple e-Library Web service in the context of the MoSCoE Web service composition framework.

  9. Local Spatial Obesity Analysis and Estimation Using Online Social Network Sensors.

    PubMed

    Sun, Qindong; Wang, Nan; Li, Shancang; Zhou, Hongyi

    2018-03-15

    Recently, the online social networks (OSNs) have received considerable attentions as a revolutionary platform to offer users massive social interaction among users that enables users to be more involved in their own healthcare. The OSNs have also promoted increasing interests in the generation of analytical, data models in health informatics. This paper aims at developing an obesity identification, analysis, and estimation model, in which each individual user is regarded as an online social network 'sensor' that can provide valuable health information. The OSN-based obesity analytic model requires each sensor node in an OSN to provide associated features, including dietary habit, physical activity, integral/incidental emotions, and self-consciousness. Based on the detailed measurements on the correlation of obesity and proposed features, the OSN obesity analytic model is able to estimate the obesity rate in certain urban areas and the experimental results demonstrate a high success estimation rate. The measurements and estimation experimental findings created by the proposed obesity analytic model show that the online social networks could be used in analyzing the local spatial obesity problems effectively. Copyright © 2018. Published by Elsevier Inc.

  10. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    PubMed

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10(6) variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Graphic analysis of resources by numerical evaluation techniques (Garnet)

    USGS Publications Warehouse

    Olson, A.C.

    1977-01-01

    An interactive computer program for graphical analysis has been developed by the U.S. Geological Survey. The program embodies five goals, (1) economical use of computer resources, (2) simplicity for user applications, (3) interactive on-line use, (4) minimal core requirements, and (5) portability. It is designed to aid (1) the rapid analysis of point-located data, (2) structural mapping, and (3) estimation of area resources. ?? 1977.

  12. Classification of user interfaces for graph-based online analytical processing

    NASA Astrophysics Data System (ADS)

    Michaelis, James R.

    2016-05-01

    In the domain of business intelligence, user-oriented software for conducting multidimensional analysis via Online- Analytical Processing (OLAP) is now commonplace. In this setting, datasets commonly have well-defined sets of dimensions and measures around which analysis tasks can be conducted. However, many forms of data used in intelligence operations - deriving from social networks, online communications, and text corpora - will consist of graphs with varying forms of potential dimensional structure. Hence, enabling OLAP over such data collections requires explicit definition and extraction of supporting dimensions and measures. Further, as Graph OLAP remains an emerging technique, limited research has been done on its user interface requirements. Namely, on effective pairing of interface designs to different types of graph-derived dimensions and measures. This paper presents a novel technique for pairing of user interface designs to Graph OLAP datasets, rooted in Analytic Hierarchy Process (AHP) driven comparisons. Attributes of the classification strategy are encoded through an AHP ontology, developed in our alternate work and extended to support pairwise comparison of interfaces. Specifically, according to their ability, as perceived by Subject Matter Experts, to support dimensions and measures corresponding to Graph OLAP dataset attributes. To frame this discussion, a survey is provided both on existing variations of Graph OLAP, as well as existing interface designs previously applied in multidimensional analysis settings. Following this, a review of our AHP ontology is provided, along with a listing of corresponding dataset and interface attributes applicable toward SME recommendation structuring. A walkthrough of AHP-based recommendation encoding via the ontology-based approach is then provided. The paper concludes with a short summary of proposed future directions seen as essential for this research area.

  13. Shifting from Stewardship to Analytics of Massive Science Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Doyle, R.; Law, E.; Hughes, S.; Huang, T.; Mahabal, A.

    2015-12-01

    Currently, the analysis of large data collections is executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Data collection, archiving and analysis from future remote sensing missions, be it from earth science satellites, planetary robotic missions, or massive radio observatories may not scale as more capable instruments stress existing architectural approaches and systems due to more continuous data streams, data from multiple observational platforms, and measurements and models from different agencies. A new paradigm is needed in order to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural choices, data processing, management, analysis, etc are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections. Future observational systems, including satellite and airborne experiments, and research in climate modeling will significantly increase the size of the data requiring new methodological approaches towards data analytics where users can more effectively interact with the data and apply automated mechanisms for data reduction, reduction and fusion across these massive data repositories. This presentation will discuss architecture, use cases, and approaches for developing a big data analytics strategy across multiple science disciplines.

  14. Data oriented job submission scheme for the PHENIX user analysis in CCJ

    NASA Astrophysics Data System (ADS)

    Nakamura, T.; En'yo, H.; Ichihara, T.; Watanabe, Y.; Yokkaichi, S.

    2011-12-01

    The RIKEN Computing Center in Japan (CCJ) has been developed to make it possible analyzing huge amount of data corrected by the PHENIX experiment at RHIC. The corrected raw data or reconstructed data are transferred via SINET3 with 10 Gbps bandwidth from Brookheaven National Laboratory (BNL) by using GridFTP. The transferred data are once stored in the hierarchical storage management system (HPSS) prior to the user analysis. Since the size of data grows steadily year by year, concentrations of the access request to data servers become one of the serious bottlenecks. To eliminate this I/O bound problem, 18 calculating nodes with total 180 TB local disks were introduced to store the data a priori. We added some setup in a batch job scheduler (LSF) so that user can specify the requiring data already distributed to the local disks. The locations of data are automatically obtained from a database, and jobs are dispatched to the appropriate node which has the required data. To avoid the multiple access to a local disk from several jobs in a node, techniques of lock file and access control list are employed. As a result, each job can handle a local disk exclusively. Indeed, the total throughput was improved drastically as compared to the preexisting nodes in CCJ, and users can analyze about 150 TB data within 9 hours. We report this successful job submission scheme and the feature of the PC cluster.

  15. User experience with on-road electric vehicles in the U.S.A. and Canada

    NASA Technical Reports Server (NTRS)

    Sandberg, J. J.; Leschly, K.

    1978-01-01

    Approximately 3000 on-road electric passenger cars and delivery vans are now in use in the U.S.A. and Canada. The owners and operators of almost one-third of these vehicles have been surveyed directly in an attempt to determine the suitability of commercially sold electric vehicles for real on-road jobs. This paper is primarily concerned with the analysis of the engineering aspects of the user experience with electric vehicles, i.e., mileage and application, failure modes and rates, energy economy, maintenance requirements, life cycle costs, and vehicle performance characteristics. It is concluded that existing electric vehicles can perform satisfactorily in applications that have limited performance requirements, particularly in terms of range.

  16. Space Trajectory Error Analysis Program (STEAP) for halo orbit missions. Volume 1: Analytic and user's manual

    NASA Technical Reports Server (NTRS)

    Byrnes, D. V.; Carney, P. C.; Underwood, J. W.; Vogt, E. D.

    1974-01-01

    Development, test, conversion, and documentation of computer software for the mission analysis of missions to halo orbits about libration points in the earth-sun system is reported. The software consisting of two programs called NOMNAL and ERRAN is part of the Space Trajectories Error Analysis Programs (STEAP). The program NOMNAL targets a transfer trajectory from Earth on a given launch date to a specified halo orbit on a required arrival date. Either impulsive or finite thrust insertion maneuvers into halo orbit are permitted by the program. The transfer trajectory is consistent with a realistic launch profile input by the user. The second program ERRAN conducts error analyses of the targeted transfer trajectory. Measurements including range, doppler, star-planet angles, and apparent planet diameter are processed in a Kalman-Schmidt filter to determine the trajectory knowledge uncertainty. Execution errors at injection, midcourse correction and orbit insertion maneuvers are analyzed along with the navigation uncertainty to determine trajectory control uncertainties and fuel-sizing requirements. The program is also capable of generalized covariance analyses.

  17. Model Based User's Access Requirement Analysis of E-Governance Systems

    NASA Astrophysics Data System (ADS)

    Saha, Shilpi; Jeon, Seung-Hwan; Robles, Rosslin John; Kim, Tai-Hoon; Bandyopadhyay, Samir Kumar

    The strategic and contemporary importance of e-governance has been recognized across the world. In India too, various ministries of Govt. of India and State Governments have taken e-governance initiatives to provide e-services to citizens and the business they serve. To achieve the mission objectives, and make such e-governance initiatives successful it would be necessary to improve the trust and confidence of the stakeholders. It is assumed that the delivery of government services will share the same public network information that is being used in the community at large. In particular, the Internet will be the principal means by which public access to government and government services will be achieved. To provide the security measures main aim is to identify user's access requirement for the stakeholders and then according to the models of Nath's approach. Based on this analysis, the Govt. can also make standards of security based on the e-governance models. Thus there will be less human errors and bias. This analysis leads to the security architecture of the specific G2C application.

  18. Proposing Electronic Health Record Usability Requirements Based on Enriched ISO 9241 Metric Usability Model

    PubMed Central

    Farzandipour, Mehrdad; Riazi, Hossein; Jabali, Monireh Sadeqi

    2018-01-01

    Introduction: System usability assessment is among the important aspects in assessing the quality of clinical information technology, especially when the end users of the system are concerned. This study aims at providing a comprehensive list of system usability. Methods: This research is a descriptive cross-sectional one conducted using Delphi technique in three phases in 2013. After experts’ ideas were concluded, the final version of the questionnaire including 163 items in three phases was presented to 40 users of information systems in hospitals. The grading ranged from 0-4. Data analysis was conducted using SPSS software. Those requirements with a mean point of three or higher were finally confirmed. Results: The list of system usability requirements for electronic health record was designed and confirmed in nine areas including suitability for the task (24 items), self-descriptiveness (22 items), controllability (19 questions), conformity with user expectations (25 items), error tolerance (21 items), suitability for individualization (7 items), suitability for learning (19 items), visual clarity (18 items) and auditory presentation (8 items). Conclusion: A relatively comprehensive model including useful requirements for using EHR was presented which can increase functionality, effectiveness and users’ satisfaction. Thus, it is suggested that the present model be adopted by system designers and healthcare system institutions to assess those systems. PMID:29719310

  19. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2000-01-01

    Hospital information systems have to support quality improvement objectives. The design issues of health care information system can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constrains, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why" are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.

  20. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.

  1. A study of diverse clinical decision support rule authoring environments and requirements for integration

    PubMed Central

    2012-01-01

    Background Efficient rule authoring tools are critical to allow clinical Knowledge Engineers (KEs), Software Engineers (SEs), and Subject Matter Experts (SMEs) to convert medical knowledge into machine executable clinical decision support rules. The goal of this analysis was to identify the critical success factors and challenges of a fully functioning Rule Authoring Environment (RAE) in order to define requirements for a scalable, comprehensive tool to manage enterprise level rules. Methods The authors evaluated RAEs in active use across Partners Healthcare, including enterprise wide, ambulatory only, and system specific tools, with a focus on rule editors for reminder and medication rules. We conducted meetings with users of these RAEs to discuss their general experience and perceived advantages and limitations of these tools. Results While the overall rule authoring process is similar across the 10 separate RAEs, the system capabilities and architecture vary widely. Most current RAEs limit the ability of the clinical decision support (CDS) interventions to be standardized, sharable, interoperable, and extensible. No existing system meets all requirements defined by knowledge management users. Conclusions A successful, scalable, integrated rule authoring environment will need to support a number of key requirements and functions in the areas of knowledge representation, metadata, terminology, authoring collaboration, user interface, integration with electronic health record (EHR) systems, testing, and reporting. PMID:23145874

  2. User and Task Analysis of the Flight Surgeon Console at the Mission Control Center of the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Johnson, Kathy A.; Shek, Molly

    2003-01-01

    Astronauts in a space station are to some extent like patients in an intensive care unit (ICU). Medical support of a mission crew will require acquisition, transmission, distribution, integration, and archiving of significant amounts of data. These data are acquired by disparate systems and will require timely, reliable, and secure distribution to different communities for the execution of various tasks of space missions. The goal of the Comprehensive Medical Information System (CMIS) Project at Johnson Space Center Flight Medical Clinic is to integrate data from all Medical Operations sources, including the reference information sources and the electronic medical records of astronauts. A first step toward the full CMIS implementation is to integrate and organize the reference information sources and the electronic medical record with the Flight Surgeons console. In order to investigate this integration, we need to understand the usability problems of the Flight Surgeon's console in particular and medical information systems in general. One way to achieve this understanding is through the use of user and task analyses whose general purpose is to ensure that only the necessary and sufficient task features that match users capacities will be included in system implementations. The goal of this summer project was to conduct user and task analyses employing cognitive engineering techniques to analyze the task of the Flight Surgeons and Biomedical Engineers (BMEs) while they worked on Console. The techniques employed were user interviews, observations and a questionnaire to collect data for which a hierarchical task analysis and an information resource assessment were performed. They are described in more detail below. Finally, based on our analyses, we make recommendations for improvements to the support structure.

  3. Development of regional climate scenarios in the Netherlands - involvement of users

    NASA Astrophysics Data System (ADS)

    Bessembinder, Janette; Overbeek, Bernadet

    2013-04-01

    Climate scenarios are consistent and plausible pictures of possible future climates. They are intended for use in studies exploring the impacts of climate change, and to formulate possible adaptation strategies. To ensure that the developed climate scenarios are relevant to the intended users, interaction with the users is needed. As part of the research programmes "Climate changes Spatial Planning" and "Knowledge for Climate" several projects on climate services, tailoring of climate information and communication were conducted. Some of the important lessons learned about user interaction are: *) To be able to deliver relevant climate information in the right format, proper knowledge is required on who will be using the climate information and data, how it will be used and why they use it; *) Users' requirements can be very diverse and requirements may change over time. Therefore, sustained (personal) contact with users is required; *) Organising meetings with climate researchers and users of climate information together, and working together in projects results in mutual understanding on the requirements of users and the limitations to deliver certain types of climate information, which facilitates the communication and results in more widely accepted products; *) Information and communication should be adapted to the type of users (e.g. impact researchers or policy makers) and to the type of problem (unstructured problems require much more contact with the users). In 2001 KNMI developed climate scenarios for the National Commission on Water management in the 21st century (WB21 scenarios). In 2006 these were replaced by a the KNMI'06 scenarios, intended for a broader group of users. The above lessons are now taken into account during the development of the next generation of climate scenarios for the Netherlands, expected at the end of 2013, after the publication of the IPCC WG1 report: *) users' requirements are taken into account explicitly in the whole process of the development of the climate scenarios; *) users are involved already in the early phases of the development of new scenarios, among others in the following way: **) workshops on users' requirements to check whether they have changed and to get more information; **) feedback group of users to get more detailed feedback on the modes of communication; **) newsletter with information on the progress and procedures to be followed and separate workshops for researchers and policy makers with different levels of detail; **) projects together with impact researchers: tailoring of data and in order to be able to present impact information consistent with the climate scenarios much earlier. During the presentation more detailed information will be given on the interaction with users.

  4. User requirements for project-oriented remote sensing

    NASA Technical Reports Server (NTRS)

    Hitchcock, H. C.; Baxter, F. P.; Cox, T. L.

    1975-01-01

    Registration of remotely sensed data to geodetic coordinates provides for overlay analysis of land use data. For aerial photographs of a large area, differences in scales, dates, and film types are reconciled, and multispectral scanner data are machine registered at the time of acquisition.

  5. Proceedings of the 3rd Annual Conference on Aerospace Computational Control, volume 1

    NASA Technical Reports Server (NTRS)

    Bernard, Douglas E. (Editor); Man, Guy K. (Editor)

    1989-01-01

    Conference topics included definition of tool requirements, advanced multibody component representation descriptions, model reduction, parallel computation, real time simulation, control design and analysis software, user interface issues, testing and verification, and applications to spacecraft, robotics, and aircraft.

  6. Microbe-ID: An open source toolbox for microbial genotyping and species identification

    USDA-ARS?s Scientific Manuscript database

    Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user...

  7. Human-telerobot interactions - Information, control, and mental models

    NASA Technical Reports Server (NTRS)

    Smith, Randy L.; Gillan, Douglas J.

    1987-01-01

    A part of the NASA's Space Station will be a teleoperated robot (telerobot) with arms for grasping and manipulation, feet for holding onto objects, and television cameras for visual feedback. The objective of the work described in this paper is to develop the requirements and specifications for the user-telerobot interface and to determine through research and testing that the interface results in efficient system operation. The focus of the development of the user-telerobot interface is on the information required by the user, the user inputs, and the design of the control workstation. Closely related to both the information required by the user and the user's control of the telerobot is the user's mental model of the relationship between the control inputs and the telerobot's actions.

  8. CASE/A - COMPUTER AIDED SYSTEM ENGINEERING AND ANALYSIS, ECLSS/ATCS SERIES

    NASA Technical Reports Server (NTRS)

    Bacskay, A.

    1994-01-01

    Design and analysis of Environmental Control and Life Support Systems (ECLSS) and Active Thermal Control Systems (ATCS) for spacecraft missions requires powerful software that is flexible and responsive to the demands of particular projects. CASE/A is an interactive trade study and analysis tool designed to increase productivity during all phases of systems engineering. The graphics-based command-driven package provides a user-friendly environment in which the engineer can analyze the performance and interface characteristics of an ECLS/ATC system. The package is useful during all phases of a spacecraft design program, from initial conceptual design trade studies to the actual flight, including pre-flight prediction and in-flight anomaly analysis. The CASE/A program consists of three fundamental parts: 1) the schematic management system, 2) the database management system, and 3) the simulation control and execution system. The schematic management system allows the user to graphically construct a system model by arranging icons representing system components and connecting the components with physical fluid streams. Version 4.1 contains 51 fully coded and documented default component routines. New components can be added by the user through the "blackbox" component option. The database management system supports the storage and manipulation of component data, output data, and solution control data through interactive edit screens. The simulation control and execution system initiates and controls the iterative solution process, displaying time status and any necessary diagnostic messages. In addition to these primary functions, the program provides three other important functional areas: 1) model output management, 2) system utility commands, and 3) user operations logic capacity. The model output management system provides tabular and graphical output capability. Complete fluid constituent mass fraction and properties data (mass flow, pressure, temperature, specific heat, density, and viscosity) is generated at user-selected output intervals and stored for reference. The Integrated Plot Utility (IPU) provides plotting capability for all data output. System utility commands are provided to enable the user to operate more efficiently in the CASE/A environment. The user is able to customize a simulation through optional operations FORTRAN logic. This user-developed code is compiled and linked with a CASE/A model and enables the user to control and timeline component operating parameters during various phases of the iterative solution process. CASE/A provides for transient tracking of the flow stream constituents and determination of their thermodynamic state throughout an ECLSS/ATCS simulation, performing heat transfer, chemical reaction, mass/energy balance, and system pressure drop analysis based on user-specified operating conditions. The program tracks each constituent through all combination and decomposition states while maintaining a mass and energy balance on the overall system. This allows rapid assessment of ECLSS designs, the impact of alternate technologies, and impacts due to changes in metabolic forcing functions, consumables usage, and system control considerations. CASE/A is written in FORTRAN 77 for the DEC VAX/VMS computer series, and requires 12Mb of disk storage and a minimum paging file quota of 20,000 pages. The program operates on the Tektronix 4014 graphics standard and VT100 text standard. 
The program requires a Tektronix 4014 or later graphics terminal, third party composite graphics/text terminal, or personal computer loaded with appropriate VT100/TEK 4014 emulator software. The use of composite terminals or personal computers with popular emulation software is recommended for enhanced CASE/A operations and general ease of use. The program is available on an unlabeled 9-track 6250 BPI DEC VAX BACKUP format magnetic tape. CASE/A development began in 1985 under contract to NASA/Marshall Space Flight Center. The latest version (4.1) was released in 1990. Tektronix and TEK 4014 are trademarks of Tektronix, Inc. VT100 is a trademark of Digital Equipment Corporation.

  9. Research in image management and access

    NASA Technical Reports Server (NTRS)

    Vondran, Raymond F.; Barron, Billy J.

    1993-01-01

    Presently, the problem of over-all library system design has been compounded by the accretion of both function and structure to a basic framework of requirements. While more device power has led to increased functionality, opportunities for reducing system complexity at the user interface level have not always been pursued with equal zeal. The purpose of this book is therefore to set forth and examine these opportunities, within the general framework of human factors research in man-machine interfaces. Human factors may be viewed as a series of trade-off decisions among four polarized objectives: machine resources and user specifications; functionality and user requirements. In the past, a limiting factor was the availability of systems. However, in the last two years, over one hundred libraries supported by many different software configurations have been added to the Internet. This document includes a statistical analysis of human responses to five Internet library systems by key features, development of the ideal online catalog system, and ideal online catalog systems for libraries and information centers.

  10. Mission Analysis Program for Solar Electric Propulsion (MAPSEP). Volume 1: Analytical manual for earth orbital MAPSEP

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An introduction to the MAPSEP organization and a detailed analytical description of all models and algorithms are given. These include trajectory and error covariance propagation methods, orbit determination processes, thrust modeling, and trajectory correction (guidance) schemes. Earth orbital MAPSEP contains the capability of analyzing almost any currently projected low thrust mission from low earth orbit to super synchronous altitudes. Furthermore, MAPSEP is sufficiently flexible to incorporate extended dynamic models, alternate mission strategies, and almost any other system requirement imposed by the user. As in the interplanetary version, earth orbital MAPSEP represents a trade-off between precision modeling and computational speed consistent with defining necessary system requirements. It can be used in feasibility studies as well as in flight operational support. Pertinent operational constraints are available both implicitly and explicitly. However, the reader should be warned that because of program complexity, MAPSEP is only as good as the user and will quickly succumb to faulty user inputs.

  11. Elimination sequence optimization for SPAR

    NASA Technical Reports Server (NTRS)

    Hogan, Harry A.

    1986-01-01

    SPAR is a large-scale computer program for finite element structural analysis. The program allows user specification of the order in which the joints of a structure are to be eliminated since this order can have significant influence over solution performance, in terms of both storage requirements and computer time. An efficient elimination sequence can improve performance by over 50% for some problems. Obtaining such sequences, however, requires the expertise of an experienced user and can take hours of tedious effort to affect. Thus, an automatic elimination sequence optimizer would enhance productivity by reducing the analysts' problem definition time and by lowering computer costs. Two possible methods for automating the elimination sequence specifications were examined. Several algorithms based on the graph theory representations of sparse matrices were studied with mixed results. Significant improvement in the program performance was achieved, but sequencing by an experienced user still yields substantially better results. The initial results provide encouraging evidence that the potential benefits of such an automatic sequencer would be well worth the effort.

  12. Low cost miniature data collection platform

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The development of the RF elements of a telecommunications package involved detailed study and analysis of concepts and techniques followed by laboratory testing and evaluation of designs. The design goals for a complete telecommunications package excluding antenna were a total weight of 300 grams, in a total volume of 400 cu cm with a capability of unattended operation for a period of six months. Of utmost importance is extremely low cost when produced in lots of 10,000. Early in the program it became apparent that a single Miniature Data Collection Platform would not satisfy all users. A single high efficiency system would not satisfy a user who had available a large battery capacity but required a low cost system. Conversely, the low cost system would not satisfy the end user who had a very limited battery capacity. A system design to satisfy these varied requirements was implemented by designing several versions of the system building blocks and then constructing three systems from these building blocks.

  13. Investigating the Concept of Consumers as Producers in Virtual Worlds: Looking through Social, Technical, Economic, and Legal Lenses

    NASA Astrophysics Data System (ADS)

    Kienle, Holger M.; Lober, Andreas; Vasiliu, Crina A.; Müller, Hausi A.

    Virtual worlds such as World of Warcraft and Second Life enable consumers as producers, that is users can choose to be passive consumers of content, active producers of content, or both. Consumers as producers poses unique challenges and opportunities for both operators and users of virtual worlds. While the degrees of freedom for user-generated content differ depending on the world, instances of consumers as producers can be found in many virtual worlds. In this paper we characterize consumers as producers with the help of four "lenses"—social, technical, economic, and legal—and use the lenses to discuss implications for operators and users. These lenses provide a complementary analysis of consumers as producers from different angels and shows that an understanding of it requires a holistic approach.

  14. Enhancing the Effectiveness of Consumer-Focused Health Information Technology Systems Through eHealth Literacy: A Framework for Understanding Users' Needs.

    PubMed

    Kayser, Lars; Kushniruk, Andre; Osborne, Richard H; Norgaard, Ole; Turner, Paul

    2015-05-20

    eHealth systems and applications are increasingly focused on supporting consumers to directly engage with and use health care services. Involving end users in the design of these systems is critical to ensure a generation of usable and effective eHealth products and systems. Often the end users engaged for these participatory design processes are not actual representatives of the general population, and developers may have limited understanding about how well they might represent the full range of intended users of the eHealth products. As a consequence, resulting information technology (IT) designs may not accommodate the needs, skills, cognitive capacities, and/or contexts of use of the intended broader population of health consumers. This may result in challenges for consumers who use the health IT systems, and could lead to limitations in adoption if the diversity of user attributes has not been adequately considered by health IT designers. The objective of this paper is to propose how users' needs and competences can be taken into account when designing new information and communications technology solutions in health care by expanding the user-task-context matrix model with the domains of a new concept of eHealth literacy. This approach expands an existing method for supporting health IT system development, which advocates use of a three-dimensional user-task-context matrix to comprehensively identify the users of health IT systems, and what their needs and requirements are under differing contexts of use. The extension of this model involved including knowledge about users' competences within the seven domains of eHealth literacy, which had been identified based on systematic engagement with computer scientists, academics, health professionals, and patients recruited from various patient organizations and primary care. A concept map was constructed based on a structured brainstorm procedure, card sorting, and computational analysis. The new eHealth literacy concept (based on 7 domains) was incorporated as a key factor in expanding the user-task-context matrix to describe and qualify user requirements and understanding related to eHealth literacy. This resulted in an expanded framework and a five-step process, which can support health IT designers in understanding and more accurately addressing end-users' needs, capabilities, and contexts to improve effectiveness and broader applicability of consumer-focused health IT systems. It is anticipated that the framework will also be useful for policy makers involved in the planning, procuring, and funding of eHealth infrastructure, applications, and services. Developing effective eHealth products requires complete understanding of the end-users' needs from multiple perspectives. In this paper, we have proposed and detailed a framework for modeling users' needs for designing eHealth systems that merges prior work in development of a user-task-context matrix with the emerging area of eHealth literacy. This framework is intended to be used to guide design of eHealth technologies and to make requirements explicitly related to eHealth literacy, enabling a generation of well-targeted, fit-for-purpose, equitable, and effective products and systems.

  15. User fees for public health care services in Hungary: expectations, experience, and acceptability from the perspectives of different stakeholders.

    PubMed

    Baji, Petra; Pavlova, Milena; Gulácsi, László; Groot, Wim

    2011-10-01

    The introduction of user fees for health care services is a new phenomenon in Central-Eastern European countries. In Hungary, user fees were first introduced in 2007, but abolished one year later after a referendum. The aim of our study is to describe the experiences and expectations of health system stakeholders in Hungary related to user fees as well as their approval of such fees. For our analysis we use both qualitative and quantitative data from focus-group discussions with health care consumers and physicians, and in-depth interviews with policy makers and health insurance representatives. Our findings suggest that the reasons behind the unpopularity of user fees might be (a) the rejection of the objectives of user fees defined by the government, (b) negative personal experiences with user fees, and (c) the general mistrust of the Hungarian population when it comes to the utilization of public resources. Successful implementation of user fees requires social consensus on the policy objectives, as well as real improvements in health care provision that are noticeable to consumers, to ensure acceptance of the fees. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  16. Users Guide for Fire Image Analysis System - Version 5.0: A Tool for Measuring Fire Behavior Characteristics

    Treesearch

    Carl W. Adkins

    1995-01-01

    The Fire Image Analysis System is a tool for quantifying flame geometry and relative position at selected points along a spreading line fire. At present, the system requires uniform terrain (constant slope). The system has been used in field and laboratory studies for determining flame length, depth, cross sectional area, and rate of spread.

  17. Financial modeling/case-mix analysis.

    PubMed

    Heck, S; Esmond, T

    1983-06-01

    The authors describe a case mix system developed by users which goes beyond DRG requirements to respond to management's clinical/financial data needs for marketing, planning, budgeting and financial analysis as well as reimbursement. Lessons learned in development of the system and the clinical/financial base will be helpful to those currently contemplating the implementation of such a system or evaluating available software.

  18. The professional psychiatric/mental health nurse: skills, competencies and supports required to adopt recovery-orientated policy in practice.

    PubMed

    Cusack, E; Killoury, F; Nugent, L E

    2017-03-01

    WHAT IS KNOWN ON THE SUBJECT?: Nationally and internationally there has been a movement away from the traditional medical model towards a more holistic recovery-oriented approach to mental health care delivery. At every level of service provision the emphasis is firmly on recovery and on facilitating active partnership working and involvement of service users, their carers and family members. WHAT THIS PAPER ADDS TO EXISTING KNOWLEDGE?: This is the first study to identify on a national level specific areas of care that are addressed most or least by psychiatric and mental health nurses in care planning for mental health service users in Ireland. In addition, this is the first study to identify nationally how the recovery approach is being implemented by psychiatric and mental health nurses in relation to current recovery-orientated policy. WHAT ARE THE IMPLICATIONS FOR PRACTICE?: Mental healthcare staff require more education on the recovery concept, and this needs to extend across the whole multidisciplinary team. Further research is required to establish how best to develop a shared approach to working with service users and their families within the mental healthcare environment. Further investigation is required to help determine how funding could be allocated appropriately for education, training and service development nationally. Introduction The restructuring of national mental health policy to an integrated recovery ethos demands a clarification of the psychiatric/mental health nurse's role, skills and competencies. Aim/Question To explore the psychiatric/mental health nurse's role and identify skills, competencies and supports required to adopt recovery-orientated policy in practice. Method An exploratory mixed methods study in multiple health services in Ireland with N = 1249 psychiatric/mental health nurses. Data collection used a survey, focus groups and written submissions. Data analysis used descriptive statistics and thematic analysis. Results The medical profession uses a symptom-focused approach to mental healthcare delivery. Nurses viewed this as a primary inhibitor to recovery-orientated practice. Professional development in prevention and early intervention within primary care environments is needed. Nurses require research support to measure the effectiveness of the mental health interventions they provide. Implications and conclusion The effective implementation of the recovery approach requires a multitude of strategies and narrative threads in an overall medical assessment. Nurses need support from medics in providing consistency of assessments/documentation of required psychosocial interventions. A greater range of specialist services provided by nurses, including psychosocial interventions and health promotion, is fundamental to quality care and improving service user outcomes in primary care. © 2016 John Wiley & Sons Ltd.

  19. Personal Electronic Health Records: Understanding User Requirements and Needs in Chronic Cancer Care

    PubMed Central

    Winkler, Eva; Kamradt, Martina; Längst, Gerda; Eckrich, Felicitas; Heinze, Oliver; Bergh, Bjoern; Szecsenyi, Joachim; Ose, Dominik

    2015-01-01

    Background The integration of new information and communication technologies (ICTs) is becoming increasingly important in reorganizing health care. Adapting ICTs as supportive tools to users' needs and daily practices is vital for adoption and use. Objective In order to develop a Web-based personal electronic health record (PEPA), we explored user requirements and needs with regard to desired information and functions. Methods A qualitative study across health care sectors and health professions was conducted in a regional health care setting in Germany. Overall, 10 semistructured focus groups were performed, collecting views of 3 prospective user groups: patients with colorectal cancer (n=12) and representatives from patient support groups (n=2), physicians (n=17), and non-medical HCPs (n=16). Data were audio- and videotaped, transcribed verbatim, and thematically analyzed using qualitative content analysis. Results For both patients and HCPs, it was central to have a tool representing the chronology of illness and its care processes, for example, patients wanted to track their long-term laboratory findings (eg, tumor markers). Designing health information in a patient accessible way was highlighted as important. Users wanted to have general and tumor-specific health information available in a PEPA. Functions such as filtering information and adding information by patients (eg, on their well-being or electronic communication with HCPs via email) were discussed. Conclusions In order to develop a patient/user centered tool that is tailored to user needs, it is essential to address their perspectives. A challenge for implementation will be how to design PEPA’s health data in a patient accessible way. Adequate patient support and technical advice for users have to be addressed. PMID:25998006

  20. ISPyB for BioSAXS, the gateway to user autonomy in solution scattering experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Maria Antolinos, Alejandro; Pernot, Petra; Brennich, Martha E.

    The ISPyB information-management system for crystallography has been adapted to include data from small-angle X-ray scattering of macromolecules in solution experiments. Logging experiments with the laboratory-information management system ISPyB (Information System for Protein crystallography Beamlines) enhances the automation of small-angle X-ray scattering of biological macromolecules in solution (BioSAXS) experiments. The ISPyB interface provides immediate user-oriented online feedback and enables data cross-checking and downstream analysis. To optimize data quality and completeness, ISPyBB (ISPyB for BioSAXS) makes it simple for users to compare the results from new measurements with previous acquisitions from the same day or earlier experiments in order to maximize the ability to collect all data required in a single synchrotron visit. The graphical user interface (GUI) of ISPyBB has been designed to guide users in the preparation of an experiment. The input of sample information and the ability to outline the experimental aims in advance provides feedback on the number of measurements required, calculation of expected sample volumes and time needed to collect the data: all of this information aids the users to better prepare for their trip to the synchrotron. A prototype version of the ISPyBB database is now available at the European Synchrotron Radiation Facility (ESRF) beamline BM29 and is already greatly appreciated by academic users and industrial clients. It will soon be available at the PETRA III beamline P12 and the Diamond Light Source beamlines I22 and B21.

  1. 33 CFR 161.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Radiotelephone Act; or (b) Required to participate in a VMRS within a VTS area (VMRS User). VTS User's Manual...) User means a vessel, or an owner, operator, charterer, Master, or person directing the movement of a... which special operating requirements apply. VTS User means a vessel, or an owner, operator, charterer...

  2. Requirements for the Military Message System (MMS) Family: Data Types and User Commands.

    DTIC Science & Technology

    1986-04-11

    AD-A167 126: Requirements for the Military Message System (MMS) Family: Data Types and User Commands. Constance L. Heitmeyer, Computer Science and Systems Branch, Information Technology Division, Naval Research Laboratory, Washington, DC. April 11, 1986.

  3. An analysis of the Research Team-Service User relationship from the Service User perspective: a consideration of 'The Three Rs' (Roles, Relations, and Responsibilities) for healthcare research organisations.

    PubMed

    Jordan, Melanie; Rowley, Emma; Morriss, Richard; Manning, Nick

    2015-12-01

    This article debates interview data from service users who engaged with the work of a Collaboration for Leadership in Applied Health Research and Care (CLAHRC). The evidence base, to date, concerning the nature of CLAHRC work at the frontline (i.e. What is it actually like to do CLAHRC work?) is meagre; thus, this article represents an original contribution to that literature. Further, this article analyses service users' participation in research - as members of the research team - and so contributes to the body of developing literature regarding involvement too. This article explores the nature of the Research Team-Service User relationship, plus associated roles, relations and responsibilities of collaborative health research. Qualitative social science research was undertaken in a health-care research organization, utilizing an interview method and an analytical framework drawn from medical sociology and organizational sociology. Data utilized originate from a larger evaluation study that focuses on the CLAHRC as an iterative organization and explores members' experiences. There can be a disparity between initial expectations and actual experiences of involvement for service users. Therefore, as structured via 'The Three Rs' (Roles, Relations and Responsibilities), aspects of the relationship are evaluated (e.g. motivation, altruism, satisfaction, transparency, scope, feedback, communication, time). Regarding the inclusion of service users in health research teams, a careful consideration of 'The Three Rs' is required to ensure expectations match experiences. © 2014 John Wiley & Sons Ltd.

  4. MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.

    PubMed

    Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd

    2018-07-01

    Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.
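
    As a rough illustration of the scoring idea, the sketch below (Python; the sample fields and weights are assumptions, not MemAxes internals) scores each sampled memory access against one known performance problem, remote-node accesses with high latency, and extracts the high-scoring samples as a cluster of interest.

      def score_sample(sample, local_node):
          """Higher score = more likely to reflect the known problem pattern."""
          score = 0.5 if sample["numa_node"] != local_node else 0.0
          score += min(sample["latency_cycles"] / 1000.0, 0.5)
          return score

      def clusters_of_interest(samples, local_node, cutoff=0.7):
          return [s for s in samples if score_sample(s, local_node) >= cutoff]

      accesses = [
          {"numa_node": 0, "latency_cycles": 40},
          {"numa_node": 1, "latency_cycles": 800},   # remote and slow
      ]
      print(clusters_of_interest(accesses, local_node=0))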

  5. Patient Accounting Systems: Are They Fit with the Users' Requirements?

    PubMed Central

    Ayatollahi, Haleh; Nazemi, Zahra

    2016-01-01

    Objectives A patient accounting system is a subsystem of a hospital information system. This system, like other information systems, should be carefully designed to be able to meet users' requirements. The main aim of this research was to investigate users' requirements and to determine whether current patient accounting systems meet users' needs or not. Methods This was a survey study, and the participants were the users of six patient accounting systems used in 24 teaching hospitals. A stratified sampling method was used to select the participants (n = 216). The research instruments were a questionnaire and a checklist. A mean value of ≥3 indicated the importance of a data element or an existing capability of the system. Results Generally, the findings showed that the current patient accounting systems had some weaknesses and were able to meet between 70% and 80% of users' requirements. Conclusions The current patient accounting systems need to be improved to be able to meet users' requirements. This approach can also help to provide hospitals with more usable and reliable financial information. PMID:26893945

  6. Transversal analysis of public policies on user fees exemptions in six West African countries.

    PubMed

    Ridde, Valéry; Queuille, Ludovic; Kafando, Yamba; Robert, Emilie

    2012-11-20

    While more and more West African countries are implementing public user fees exemption policies, there is still little knowledge available on this topic. The long time required for scientific production, combined with the needs of decision-makers, led to the creation in 2010 of a project to support implementers in aggregating knowledge on their experiences. This article presents a transversal analysis of user fees exemption policies implemented in Benin, Burkina Faso, Mali, Niger, Togo and Senegal. This was a multiple case study with several embedded levels of analysis. The cases were public user fees exemption policies selected by the participants because of their instructive value. The data used in the countries were taken from documentary analysis, interviews and questionnaires. The transversal analysis was based on a framework for studying five implementation components and five actors' attitudes usually encountered in these policies. The analysis of the implementation components revealed: a majority of State financing; maintenance of centrally organized financing; a multiplicity of reimbursement methods; reimbursement delays and/or stock shortages; almost no implementation guides; a lack of support measures; communication plans that were rarely carried out, funded or renewed; health workers who were given general information but not details; poorly informed populations; almost no evaluation systems; ineffective and poorly funded coordination systems; low levels of community involvement; and incomplete referral-evacuation systems. With regard to actors' attitudes, the analysis revealed: objectives that were appreciated by everyone; dissatisfaction with the implementation; specific tensions between healthcare providers and patients; overall satisfaction among patients, but still some problems; the perception that while the financial barrier has been removed, other barriers persist; occasionally a reorganization of practices, service rationing due to lack of reimbursement, and some overcharging or shifting of resources. This transversal analysis confirms the need to assign a great deal of importance to the implementation of user fees exemption policies once these decisions have been taken. It also highlights some practices that suggest avenues of future research.

  7. Clinical evaluation of BrainTree, a motor imagery hybrid BCI speller

    NASA Astrophysics Data System (ADS)

    Perdikis, S.; Leeb, R.; Williamson, J.; Ramsay, A.; Tavella, M.; Desideri, L.; Hoogerwerf, E.-J.; Al-Khodairy, A.; Murray-Smith, R.; Millán, J. d. R.

    2014-06-01

    Objective. While brain-computer interfaces (BCIs) for communication have reached considerable technical maturity, there is still a great need for state-of-the-art evaluation by the end-users outside laboratory environments. To achieve this primary objective, it is necessary to augment a BCI with a series of components that allow end-users to type text effectively. Approach. This work presents the clinical evaluation of a motor imagery (MI) BCI text-speller, called BrainTree, by six severely disabled end-users and ten able-bodied users. Additionally, we define a generic model of code-based BCI applications, which serves as an analytical tool for evaluation and design. Main results. We show that all users achieved remarkable usability and efficiency outcomes in spelling. Furthermore, our model-based analysis highlights the added value of human-computer interaction techniques and hybrid BCI error-handling mechanisms, and reveals the effects of BCI performances on usability and efficiency in code-based applications. Significance. This study demonstrates the usability potential of code-based MI spellers, with BrainTree being the first to be evaluated by a substantial number of end-users, establishing them as a viable, competitive alternative to other popular BCI spellers. Another major outcome of our model-based analysis is the derivation of an 80% minimum command accuracy requirement for successful code-based application control, revising upwards previous estimates attempted in the literature.

  8. Clinical evaluation of BrainTree, a motor imagery hybrid BCI speller.

    PubMed

    Perdikis, S; Leeb, R; Williamson, J; Ramsay, A; Tavella, M; Desideri, L; Hoogerwerf, E-J; Al-Khodairy, A; Murray-Smith, R; Millán, J D R

    2014-06-01

    While brain-computer interfaces (BCIs) for communication have reached considerable technical maturity, there is still a great need for state-of-the-art evaluation by the end-users outside laboratory environments. To achieve this primary objective, it is necessary to augment a BCI with a series of components that allow end-users to type text effectively. This work presents the clinical evaluation of a motor imagery (MI) BCI text-speller, called BrainTree, by six severely disabled end-users and ten able-bodied users. Additionally, we define a generic model of code-based BCI applications, which serves as an analytical tool for evaluation and design. We show that all users achieved remarkable usability and efficiency outcomes in spelling. Furthermore, our model-based analysis highlights the added value of human-computer interaction techniques and hybrid BCI error-handling mechanisms, and reveals the effects of BCI performances on usability and efficiency in code-based applications. This study demonstrates the usability potential of code-based MI spellers, with BrainTree being the first to be evaluated by a substantial number of end-users, establishing them as a viable, competitive alternative to other popular BCI spellers. Another major outcome of our model-based analysis is the derivation of an 80% minimum command accuracy requirement for successful code-based application control, revising upwards previous estimates attempted in the literature.
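
    The need for high per-command accuracy follows from how errors compound in a code-based speller. The toy calculation below (Python; a simplification, not the paper's full model, which also accounts for error-handling commands) shows the probability of an error-free symbol selection when each symbol requires a sequence of n motor imagery commands delivered with accuracy a.

      def p_symbol_ok(a, n):
          """Chance that all n commands of one symbol's code are correct."""
          return a ** n

      for a in (0.70, 0.80, 0.90):
          print(a, [round(p_symbol_ok(a, n), 2) for n in (3, 4, 5)])
      # At a = 0.70, a 4-command code succeeds only ~24% of the time;
      # a = 0.80 raises this to ~41%, and undo/error-handling commands
      # then make sustained spelling practical, consistent with the
      # ~80% minimum accuracy reported above.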

  9. User-centered requirements engineering in health information systems: a study in the hemophilia field.

    PubMed

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2012-06-01

    The use of sophisticated information and communication technologies (ICTs) in the health care domain is a way to improve the quality of services. However, there are also hazards associated with the introduction of ICTs in this domain, and a great number of projects have failed due to the lack of systematic consideration of human and other non-technology issues throughout the design or implementation process, particularly in the requirements engineering process. This paper presents the methodological approach followed in the design process of a web-based information system (WbIS) for managing the clinical information in hemophilia care, which integrates the values and practices of user-centered design (UCD) activities into the principles of software engineering, particularly in the phase of requirements engineering (RE). This process followed a paradigm that combines a grounded theory for data collection with an evolutionary design based on constant development and refinement of the generic domain model, using three well-known methodological approaches: (a) object-oriented system analysis; (b) task analysis; and (c) prototyping, used together in a triangulated design. This approach seems to be a good solution for the requirements engineering process in this particular case of the health care domain, since the inherent weaknesses of individual methods are reduced, and emergent requirements are easier to elicit. Moreover, the requirements triangulation matrix gives the opportunity to look across the results of all the methods used and decide which requirements are critical for system success. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  10. Vehicle systems and payload requirements evaluation. [computer programs for identifying launch vehicle system requirements

    NASA Technical Reports Server (NTRS)

    Rea, F. G.; Pittenger, J. L.; Conlon, R. J.; Allen, J. D.

    1975-01-01

    Techniques developed for identifying launch vehicle system requirements for NASA automated space missions are discussed. Emphasis is placed on development of computer programs and investigation of astrionics for OSS missions and Scout. The Earth Orbit Mission Program-1, which performs linear error analysis of launch vehicle dispersions for both vehicle and navigation system factors, is described, along with the Interactive Graphic Orbit Selection program, which allows the user to select orbits that satisfy mission requirements and to evaluate the necessary injection accuracy.

  11. A Collaborative Extensible User Environment for Simulation and Knowledge Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Lansing, Carina S.; Porter, Ellen A.

    2015-06-01

    In scientific simulation, scientists use measured data to create numerical models, execute simulations and analyze results from advanced simulators executing on high performance computing platforms. This process usually requires a team of scientists collaborating on data collection, model creation and analysis, and on authorship of publications and data. This paper shows that scientific teams can benefit from a user environment called Akuna that permits subsurface scientists in disparate locations to collaborate on numerical modeling and analysis projects. The Akuna user environment is built on the Velo framework that provides both a rich client environment for conducting and analyzing simulations and a Web environment for data sharing and annotation. Akuna is an extensible toolset that integrates with Velo, and is designed to support any type of simulator. This is achieved through data-driven user interface generation, use of a customizable knowledge management platform, and an extensible framework for simulation execution, monitoring and analysis. This paper describes how the customized Velo content management system and the Akuna toolset are used to integrate and enhance an effective collaborative research and application environment. The extensible architecture of Akuna is also described, and its usage is demonstrated for the creation and execution of a 3D subsurface simulation.

  12. SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Young, M. D.; Hayashi, S.; Gopu, A.

    2014-05-01

    As a new generation of large format, high-resolution imagers comes online (ODI, DECAM, LSST, etc.) we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears to be unfeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in-between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as overlaying data from publicly available source catalogs (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.
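
    The sketch below (Python; tile size and per-tile budgets are assumptions, not the system's actual parameters) illustrates the dynamically scaling level-of-detail idea: at coarse zoom only the brightest sources per spatial tile are delivered to the browser, and the budget grows as the user zooms in.

      from collections import defaultdict

      def decimate(sources, zoom, tile_deg=1.0, base_per_tile=5):
          """sources: iterable of (ra, dec, magnitude) tuples."""
          budget = base_per_tile * (4 ** zoom)        # refine with zoom level
          tiles = defaultdict(list)
          for ra, dec, mag in sources:
              tiles[(int(ra / tile_deg), int(dec / tile_deg))].append((ra, dec, mag))
          kept = []
          for members in tiles.values():
              members.sort(key=lambda s: s[2])        # brightest first (smaller mag)
              kept.extend(members[:budget])
          return kept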

  13. SUGAR: graphical user interface-based data refiner for high-throughput DNA sequencing.

    PubMed

    Sato, Yukuto; Kojima, Kaname; Nariai, Naoki; Yamaguchi-Kabata, Yumi; Kawai, Yosuke; Takahashi, Mamoru; Mimori, Takahiro; Nagasaki, Masao

    2014-08-08

    Next-generation sequencers (NGSs) have become one of the main tools of current biology. To obtain useful insights from NGS data, it is essential to control low-quality portions of the data affected by technical errors such as air bubbles in the sequencing fluidics. We developed SUGAR (subtile-based GUI-assisted refiner), a software tool that can handle ultra-high-throughput data with a user-friendly graphical user interface (GUI) and interactive analysis capability. SUGAR generates high-resolution quality heatmaps of the flowcell, enabling users to find possible signals of technical errors during sequencing. The sequencing data generated from the error-affected regions of a flowcell can be selectively removed by automated analysis or by GUI-assisted operations implemented in SUGAR. The automated data-cleaning function, based on sequence read quality (Phred) scores, was applied to public whole human genome sequencing data, and we showed that the overall mapping quality was improved. The detailed data evaluation and cleaning enabled by SUGAR reduce technical problems in sequence read mapping, improving subsequent variant analyses that require high-quality sequence data and mapping results. The software will therefore be especially useful for controlling the quality of variant calls for low-population cells, e.g., cancers, in a sample affected by technical errors of the sequencing procedure.
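
    The sketch below (Python) is a simplified stand-in for this kind of Phred-based cleaning, not SUGAR's actual implementation: reads whose mean quality score, decoded from Sanger-encoded FASTQ quality strings, falls below a cutoff are dropped before mapping.

      def mean_phred(quality_string, offset=33):
          """Decode Sanger/Illumina 1.8+ ASCII qualities and average them."""
          return sum(ord(c) - offset for c in quality_string) / len(quality_string)

      def clean_reads(records, min_mean_q=20):
          """records: iterable of (header, sequence, quality) tuples."""
          return [r for r in records if mean_phred(r[2]) >= min_mean_q]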

  14. PeptideDepot: Flexible Relational Database for Visual Analysis of Quantitative Proteomic Data and Integration of Existing Protein Information

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2010-01-01

    Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through tandem mass spectrometry (MS/MS). Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to a variety of experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our High Throughput Autonomous Proteomic Pipeline (HTAPP) used in the automated acquisition and post-acquisition analysis of proteomic data. PMID:19834895

  15. CMS users data management service integration and first experiences with its NoSQL data storage

    NASA Astrophysics Data System (ADS)

    Riahi, H.; Spiga, D.; Boccali, T.; Ciangottini, D.; Cinquilli, M.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Santocchia, A.

    2014-06-01

    The distributed data analysis workflow in CMS assumes that jobs run in a different location from where their results are finally stored. Typically the user outputs must be transferred from one site to another by a dedicated CMS service, AsyncStageOut. This new service was originally developed to address the inefficiency in using CMS computing resources when analysis job outputs are transferred synchronously, as soon as they are produced on the job execution node, to the remote site. The AsyncStageOut is designed as a thin application relying only on a NoSQL database (CouchDB) for input and data storage. It has progressed from a limited prototype to a highly adaptable service which manages and monitors all the steps of user file handling, namely file transfer and publication. The AsyncStageOut is integrated with the Common CMS/Atlas Analysis Framework. It is expected to manage nearly 200k user files per day from close to 1000 individual users per month with minimal delays, providing real-time monitoring and reports to users and service operators while being highly available. The associated data volume represents a new set of challenges in the areas of database scalability and service performance and efficiency. In this paper, we present an overview of the AsyncStageOut model and the integration strategy with the Common Analysis Framework. The motivations for using the NoSQL technology are also presented, as well as the data design and the techniques used for efficient indexing and monitoring of the data. We describe the deployment model for the high availability and scalability of the service. We also discuss the hardware requirements and the results achieved, as they were determined by testing with actual data and realistic loads during the commissioning and the initial production phase with the Common Analysis Framework.
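
    The sketch below (Python; the field names are assumptions, not the actual AsyncStageOut schema) illustrates the NoSQL design style the paper describes: each user file is tracked as one CouchDB-style document whose state field drives the transfer-and-publication workflow, with views over the state field serving as the work queue.

      transfer_doc = {
          "_id": "jdoe:output_123.root",            # hypothetical identifier
          "user": "jdoe",
          "source_lfn": "/store/temp/user/jdoe/output_123.root",
          "destination_se": "T2_IT_Pisa",
          "state": "new",                           # new -> acquired -> done/failed
          "retry_count": 0,
          "publish": True,
      }

      def advance(doc, transfer_ok):
          """CouchDB updates rewrite the whole document; monitoring simply
          aggregates documents by state."""
          doc["state"] = "done" if transfer_ok else "failed"
          if not transfer_ok:
              doc["retry_count"] += 1
          return doc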

  16. Tool for the Integrated Dynamic Numerical Propulsion System Simulation (NPSS)/Turbine Engine Closed-Loop Transient Analysis (TTECTrA) User's Guide

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey C.; Csank, Jeffrey T.

    2016-01-01

    The Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA ver2) is a control design tool that enables preliminary estimation of transient performance for models without requiring a full nonlinear controller to be designed. The program is compatible with subsonic engine models implemented in the MATLAB/Simulink (The MathWorks, Inc.) environment and the Numerical Propulsion System Simulation (NPSS) framework. At a specified flight condition, TTECTrA will design a closed-loop controller meeting user-defined requirements in a semi- or fully automated fashion. Multiple specifications may be provided, in which case TTECTrA will design one controller for each, producing a collection of controllers in a single run. Each resulting controller contains a setpoint map, a schedule of setpoint controller gains, and limiters, all contributing to transient characteristics. The goal of the program is to provide steady-state engine designers with more immediate feedback on transient engine performance earlier in the design cycle.
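
    A gain schedule of this kind can be as simple as interpolating designed gains between operating points. The sketch below (Python; the setpoints and gain values are invented for illustration, and TTECTrA itself runs in MATLAB/Simulink) shows the idea.

      schedule = [                 # (setpoint, Kp, Ki) designed at a few points
          (2000.0, 0.010, 0.50),
          (3000.0, 0.008, 0.40),
          (4000.0, 0.006, 0.30),
      ]

      def gains_at(setpoint):
          """Linearly interpolate controller gains at the current setpoint."""
          pts = sorted(schedule)
          if setpoint <= pts[0][0]:
              return pts[0][1:]
          for (x0, kp0, ki0), (x1, kp1, ki1) in zip(pts, pts[1:]):
              if setpoint <= x1:
                  t = (setpoint - x0) / (x1 - x0)
                  return (kp0 + t * (kp1 - kp0), ki0 + t * (ki1 - ki0))
          return pts[-1][1:]

      print(gains_at(2500.0))      # gains blended between the first two points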

  17. The advanced software development workstation project

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Pitman, Charles L.

    1991-01-01

    The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.

  18. On the SAR derived alert in the detection of oil spills according to the analysis of the EGEMP.

    PubMed

    Ferraro, Guido; Baschek, Björn; de Montpellier, Geraldine; Njoten, Ove; Perkovic, Marko; Vespe, Michele

    2010-01-01

    Satellite services that deliver information about possible oil spills at sea currently use different labels of "confidence" to describe the detections based on radar image processing. A common approach is to use a classification differentiating between low, medium and high levels of confidence. There is an ongoing discussion on the suitability of the existing classification systems of possible oil spills detected by radar satellite images with regard to the relevant significance and correspondence to user requirements. This paper contains a basic analysis of user requirements, current technical possibilities of satellite services as well as proposals for a redesign of the classification system as an evolution towards a more structured alert system. This research work offers a first review of implemented methodologies for the categorisation of detected oil spills, together with the proposal of explorative ideas evaluated by the European Group of Experts on satellite Monitoring of sea-based oil Pollution (EGEMP). Copyright 2009 Elsevier Ltd. All rights reserved.

  19. Information-computational platform for collaborative multidisciplinary investigations of regional climatic changes and their impacts

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

    Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts of researchers. To do this in a timely and reliable way, one needs a modern information-computational infrastructure supporting integrated studies in the field of environmental sciences. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for investigations related to regional climate change. The platform combines a modern Web 2.0 approach, GIS functionality, and capabilities to run climate and meteorological models, process large geophysical datasets, and support the relevant analysis. It also supports joint software development by distributed research groups and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the WRF and «Planet Simulator» models integrated into the platform, along with preprocessing and visualization of the modeling results, are also provided. All functions of the platform are accessible through a web portal in a common graphical web browser, via an interactive graphical user interface that provides, in particular, selection of a geographical region of interest (pan and zoom), manipulation of data layers (order, enable/disable, feature extraction), and visualization of results. The platform gives users the capability to analyze heterogeneous geophysical data, including high-resolution data, and to discover tendencies in climatic and ecosystem changes in the framework of different multidisciplinary research efforts. Even an unskilled user without specific knowledge can use it to perform reliable computational processing and visualization of large meteorological, climatic and satellite monitoring datasets through the unified graphical web interface. Partial support of RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2 and Projects 69, 131, 140, and the APN CBA2012-16NSY project is acknowledged.

  20. An arbitrary grid CFD algorithm for configuration aerodynamics analysis. Volume 1: Theory and validations

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Iannelli, G. S.; Manhardt, Paul D.; Orzechowski, J. A.

    1993-01-01

    This report documents the user input and output data requirements for the FEMNAS finite element Navier-Stokes code for real-gas simulations of external aerodynamics flowfields. This code was developed for the configuration aerodynamics branch of NASA ARC, under SBIR Phase 2 contract NAS2-124568 by Computational Mechanics Corporation (COMCO). This report is in two volumes. Volume 1 contains the theory for the derived finite element algorithm and describes the test cases used to validate the computer program described in the Volume 2 user guide.

  1. An Analysis of the Naval Innovation Laboratory’s Virtual Work Environment-Based Management Information System for Application in Joint Service Explosive Ordnance Disposal Notional Concepts Management

    DTIC Science & Technology

    2009-12-01

    Mr. A.N. Briggs and Susie Adams at NAVEODTECHDIV, Mr. Ronald Simmons of the Marine Corps Combat Development Command, and Mr. Michael Jinnett of the...was implemented. It is the system that members of the Notional Concepts Working Group are familiar with and requires little training or investigation...each user to individually perform each of these steps. There is little chance that all users would do this, and even less chance that they would come

  2. Synfuel program analysis. Volume 2: VENVAL users manual

    NASA Astrophysics Data System (ADS)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This volume is intended for program analysts and is a user's manual for the VENVAL model. It contains specific explanations as to input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in evaluation of prospective private sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public sector and other external costs and revenues if unit costs are furnished.

  3. Classification software technique assessment

    NASA Technical Reports Server (NTRS)

    Jayroe, R. R., Jr.; Atkinson, R.; Dasarathy, B. V.; Lybanon, M.; Ramapryian, H. K.

    1976-01-01

    A catalog of software options is presented for the use of local user communities to obtain software for analyzing remotely sensed multispectral imagery. The resources required to utilize a particular software program are described. Descriptions of how a particular program analyzes data and the performance of that program for an application and data set provided by the user are shown. An effort is made to establish a statistical performance base for various software programs with regard to different data sets and analysis applications, to determine the status of the state-of-the-art.

  4. Spacelab data analysis and interactive control study

    NASA Technical Reports Server (NTRS)

    Tarbell, T. D.; Drake, J. F.

    1980-01-01

    The study consisted of two main tasks, a series of interviews of Spacelab users and a survey of data processing and display equipment. Findings from the user interviews on questions of interactive control, downlink data formats, and Spacelab computer software development are presented. Equipment for quick look processing and display of scientific data in the Spacelab Payload Operations Control Center (POCC) was surveyed. Results of this survey effort are discussed in detail, along with recommendations for NASA development of several specific display systems which meet common requirements of many Spacelab experiments.

  5. 40 CFR 35.929-2 - General requirements for all user charge systems.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 1 2014-07-01 2014-07-01 false General requirements for all user charge systems. 35.929-2 Section 35.929-2 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... than every 2 years the waste water contribution of users and user classes, the total costs of operation...

  6. 40 CFR 35.929-2 - General requirements for all user charge systems.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 1 2012-07-01 2012-07-01 false General requirements for all user charge systems. 35.929-2 Section 35.929-2 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... than every 2 years the waste water contribution of users and user classes, the total costs of operation...

  7. 40 CFR 35.929-2 - General requirements for all user charge systems.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 1 2013-07-01 2013-07-01 false General requirements for all user charge systems. 35.929-2 Section 35.929-2 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... than every 2 years the waste water contribution of users and user classes, the total costs of operation...

  8. Clinical guideline representation in a CDS: a human information processing method.

    PubMed

    Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique

    2012-01-01

    The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared not to suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as a requirements elicitation method. An information processing model was developed through an analysis of think-aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the mental strategies clinicians employed in deciding on survivor screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.

  9. Using external data sources to improve audit trail analysis.

    PubMed

    Herting, R L; Asaro, P V; Roth, A C; Barnes, M R

    1999-01-01

    Audit trail analysis is the primary means of detecting inappropriate use of the medical record. While audit logs contain large amounts of information, they often lack the information required to determine useful user-patient relationships, because most audit trail analysis systems rely only on the limited information available within the medical record system. We report a feature of the STAR (System for Text Archive and Retrieval) audit analysis system in which information available in the medical record is augmented with external information sources such as database sources, Lightweight Directory Access Protocol (LDAP) server sources, and World Wide Web (WWW) database sources. We discuss several issues that arise when combining the information from these disparate information sources. Furthermore, we explain how the enhanced person-specific information obtained can be used to determine user-patient relationships that might signify a motive for inappropriately accessing a patient's medical record.
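
    As a sketch of the augmentation step, the Python fragment below uses the ldap3 library to enrich an audit event with directory attributes; the server name, base DN, and attribute names are placeholders for a site's actual directory layout, not STAR's configuration.

      from ldap3 import Server, Connection, ALL

      def enrich_event(event):
          """Attach department/role from the directory to an audit event."""
          server = Server("ldap.example.org", get_info=ALL)
          with Connection(server, auto_bind=True) as conn:
              conn.search("ou=people,dc=example,dc=org",
                          f"(uid={event['user_id']})",
                          attributes=["departmentNumber", "title"])
              if conn.entries:
                  entry = conn.entries[0]
                  event["department"] = str(entry.departmentNumber)
                  event["role"] = str(entry.title)
          return event

      # An access by a user whose department never treats the patient is
      # then a candidate relationship worth reviewing.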

  10. Development of alternative data analysis techniques for improving the accuracy and specificity of natural resource inventories made with digital remote sensing data

    NASA Technical Reports Server (NTRS)

    Lillesand, T. M.; Meisner, D. E. (Principal Investigator)

    1980-01-01

    An investigation was conducted into ways to improve the involvement of state and local user personnel in the digital image analysis process by isolating those elements of the analysis process which require extensive involvement by field personnel and providing means for performing those activities apart from a computer facility. In this way, the analysis procedure can be converted from a centralized activity focused on a computer facility to a distributed activity in which users can interact with the data at the field office level or in the field itself. General-purpose image processing software was developed on the University of Minnesota computer system (Control Data Cyber models 172 and 74). The use of color hardcopy image data as a primary medium in supervised training procedures was investigated, and digital display equipment and a coordinate digitizer were procured.

  11. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced in using the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. The means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
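
    The quantitative roll-up implied by the method can be illustrated with a little arithmetic (Python; the failure rates below are invented, not the study's data): a step with a known workaround only fails when both the primary path and the workaround fail, and the process fails if any required step fails.

      def step_failure(primary, workaround=None):
          """Joint failure of a step and, if present, its workaround."""
          return primary * workaround if workaround is not None else primary

      steps = [
          step_failure(0.10, 0.05),   # look up order; workaround available
          step_failure(0.02),         # verify dose; no workaround
          step_failure(0.08, 0.10),   # document administration
      ]

      p_ok = 1.0
      for p in steps:
          p_ok *= 1.0 - p
      print(f"overall failure: {1.0 - p_ok:.4f}")   # rare, despite frequent rework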

  12. XM1 Gunnery Training and Aptitude Requirements Analyses

    DTIC Science & Technology

    1981-02-01

    of the XM1 tank weapons system. Army materiel systems such as the XM1 tank are initiated, developed, deployed, supported, modified and disposed in...Analysis (TASA) to satisfy the FEA requirement. Users of the TASA at the Armor School were uniformly critical of the work. Generally described as...inaccurate, incomplete and, to a large extent, obsolete, the TASA failed to provide the information necessary for addressing the concerns of future operators

  13. Air Force Global Weather Central System Architecture Study. Final System/Subsystem Summary Report. Volume 2. Requirements Compilation and Analysis. Part 1. User and Model Requirements

    DTIC Science & Technology

    1976-03-01

    Milestones will be established after staffing at 6WW. Long-Term Procedures: A capability will be acquired in automated support to Command and Control under... φ = geocentric latitude, f = 2Ω sin φ, g = geopotential, a = mean radius of the earth, Ω = angular rotation of the earth (7.29 × 10^-5 rad/sec), u, v ...

  14. PASMet: a web-based platform for prediction, modelling and analyses of metabolic systems

    PubMed Central

    Sriyudthsak, Kansuporn; Mejia, Ramon Francisco; Arita, Masanori; Hirai, Masami Yokota

    2016-01-01

    PASMet (Prediction, Analysis and Simulation of Metabolic networks) is a web-based platform for proposing and verifying mathematical models to understand the dynamics of metabolism. The advantages of PASMet include user-friendliness and accessibility, which enable biologists and biochemists to easily perform mathematical modelling. PASMet offers a series of user-functions to handle the time-series data of metabolite concentrations. The functions are organised into four steps: (i) Prediction of a probable metabolic pathway and its regulation; (ii) Construction of mathematical models; (iii) Simulation of metabolic behaviours; and (iv) Analysis of metabolic system characteristics. Each function contains various statistical and mathematical methods that can be used independently. Users who may not have enough knowledge of computing or programming can easily and quickly analyse their local data without software downloads, updates or installations. Users only need to upload their files in comma-separated values (CSV) format or enter their model equations directly into the website. Once the time-series data or mathematical equations are uploaded, PASMet automatically performs the computation on the server side. Then, users can interactively view their results and directly download them to their local computers. PASMet is freely available with no login requirement at http://pasmet.riken.jp/ from major web browsers on Windows, Mac and Linux operating systems. PMID:27174940
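
    As an example of the kind of model such a platform constructs, the sketch below (Python with SciPy; the pathway and rate constants are invented) simulates a two-step mass-action pathway A -> B -> C from an initial pulse of A.

      import numpy as np
      from scipy.integrate import solve_ivp

      k1, k2 = 0.8, 0.3                          # assumed rate constants

      def pathway(t, y):
          a, b, c = y
          return [-k1 * a, k1 * a - k2 * b, k2 * b]

      sol = solve_ivp(pathway, (0.0, 20.0), [1.0, 0.0, 0.0],
                      t_eval=np.linspace(0.0, 20.0, 50))
      print(sol.y[:, -1])                        # metabolite levels at t = 20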

  15. An enhanced biometric authentication scheme for telecare medicine information systems with nonce using chaotic hash function.

    PubMed

    Das, Ashok Kumar; Goswami, Adrijit

    2014-06-01

    Recently, Awasthi and Srivastava proposed a novel biometric remote user authentication scheme for the telecare medicine information system (TMIS) with nonce. Their scheme is very efficient as it is based on an efficient chaotic one-way hash function and bitwise XOR operations. In this paper, we first analyze Awasthi-Srivastava's scheme and then show that their scheme has several drawbacks: (1) incorrect password change phase, (2) fails to preserve the user anonymity property, (3) fails to establish a secret session key between a legal user and the server, (4) fails to protect against strong replay attacks, and (5) lacks rigorous formal security analysis. We then propose a novel and secure biometric-based remote user authentication scheme in order to withstand the security flaws found in Awasthi-Srivastava's scheme and enhance the features required for an ideal user authentication scheme. Through the rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. In addition, we simulate our scheme for the formal security verification using the widely-accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that our scheme is secure against passive and active attacks, including the replay and man-in-the-middle attacks. Our scheme is also efficient as compared to Awasthi-Srivastava's scheme.
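
    The general style of such schemes can be sketched in a few lines (Python; SHA-256 stands in for the chaotic one-way hash, and this toy handshake illustrates only the nonce-plus-hash idea, not the scheme proposed in the paper).

      import hashlib, os

      def h(*parts):
          m = hashlib.sha256()
          for p in parts:
              m.update(p)
          return m.digest()

      shared_secret = h(b"identity", b"password", b"biometric-template")

      # client: a fresh nonce makes every login message unique (anti-replay)
      nonce = os.urandom(16)
      proof = h(shared_secret, nonce)

      # server: recompute the proof from its copy of the secret
      def verify(nonce, proof):
          return proof == h(shared_secret, nonce)

      assert verify(nonce, proof)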

  16. Analysis and preliminary design of Kunming land use and planning management information system

    NASA Astrophysics Data System (ADS)

    Li, Li; Chen, Zhenjie

    2007-06-01

    This article analyzes the Kunming land use planning and management information system in terms of its building objectives and requirements, and pins down the system's users, functional requirements and construction requirements. On this basis, a three-tier system architecture based on C/S and B/S is defined: the user interface layer, the business logic layer and the data services layer. According to the requirements for the construction of a land use planning and management information database, derived from standards of the Ministry of Land and Resources and the construction program of the Golden Land Project, the paper divides the system databases into a planning document database, a planning implementation database, a working map database and a system maintenance database. In the design of the system interface, various methods and data formats are used for data transmission and sharing between upper and lower administrative levels. According to the system analysis results, the main modules of the system are designed as follows: planning data management, planning and annual plan preparation and control, day-to-day planning management, planning revision management, decision-making support, thematic inquiry statistics, planning public participation and so on; in addition, the system realization technologies are discussed in terms of system operation mode, development platform and other aspects.

  17. Point Analysis in Java applied to histological images of the perforant pathway: a user's account.

    PubMed

    Scorcioni, Ruggero; Wright, Susan N; Patrick Card, J; Ascoli, Giorgio A; Barrionuevo, Germán

    2008-01-01

    The freeware Java tool Point Analysis in Java (PAJ), created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (x2 objective) comprised the entire perforant pathway, while the high magnification set (x100 objective) allowed the identification of individual fibers. A preliminary stereological study revealed a striking linear relationship between the fiber count at high magnification and the optical density at low magnification. PAJ enabled fast analysis for down-sampled data sets and a friendly interface with automated plot drawings. Noted strengths included the multi-platform support as well as the free availability of the source code, conducive to a broad user base and maximum flexibility for ad hoc requirements. PAJ has great potential to extend its usability by (a) improving its graphical user interface, (b) increasing its input size limit, (c) improving response time for large data sets, and (d) potentially being integrated with other Java graphical tools such as ImageJ.
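
    The reported linearity is easy to check with an ordinary least-squares fit. The sketch below (Python with NumPy; the data values are invented stand-ins for the study's measurements) fits fiber counts at x100 against optical density at x2.

      import numpy as np

      optical_density = np.array([0.12, 0.25, 0.40, 0.55, 0.71])   # x2 images
      fiber_count     = np.array([ 180,  390,  610,  850, 1080])   # x100 counts

      slope, intercept = np.polyfit(optical_density, fiber_count, 1)
      r = np.corrcoef(optical_density, fiber_count)[0, 1]
      print(slope, intercept, r)    # r near 1 indicates a linear relationship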

  18. Autonomous Information Fading and Provision to Achieve High Response Time in Distributed Information Systems

    NASA Astrophysics Data System (ADS)

    Lu, Xiaodong; Arfaoui, Helene; Mori, Kinji

    In the highly dynamic electronic commerce environment, adaptability and rapid response time have become increasingly important for information service systems. To cope with continuously changing conditions of service provision and utilization, the Faded Information Field (FIF) has been proposed. FIF is a distributed information service system architecture, sustained by push/pull mobile agents, that brings high assurance of services through recursive, demand-oriented provision of the most popular information closer to the users, trading off the cost of information allocation against the cost of access. In this paper, based on an analysis of the relationship among user distribution, information provision and access time, we propose a technology for FIF design that resolves the competing requirements of users and providers and improves users' access time. In addition, to achieve dynamic load balancing under changing user preferences, an autonomous information reallocation technology is proposed. We demonstrate the effectiveness of the proposed technology through simulation and comparison with a conventional system.
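
    The fading idea can be summarized in a few lines: each level of the field retains a shrinking, popularity-ranked fraction of the information, so popular items are cheap to access near the users while rare items stay near the provider. The Python sketch below uses invented popularities and retention fractions and is not the paper's actual allocation algorithm:

        # Toy demand-oriented fading: deeper levels sit closer to the users and
        # keep fewer, more popular items.
        items = {"news": 0.40, "prices": 0.30, "archive": 0.20, "logs": 0.10}  # popularity
        levels = [1.0, 0.5, 0.25]  # fraction retained; level 0 = provider, last = nearest users

        ranked = sorted(items, key=items.get, reverse=True)
        for depth, fraction in enumerate(levels):
            kept = ranked[: max(1, round(fraction * len(ranked)))]
            print(f"level {depth}: keeps {kept}")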

  19. How Analysts Cognitively “Connect the Dots”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradel, Lauren; Self, Jessica S.; Endert, Alexander

    2013-06-04

    As analysts attempt to make sense of a collection of documents, such as intelligence analysis reports, they may wish to “connect the dots” between pieces of information that may initially seem unrelated. This process of synthesizing information requires users to make connections between pairs of documents, creating a conceptual story. We conducted a user study to analyze the process by which users connect pairs of documents and how they spatially arrange information. Users created conceptual stories that connected the dots using organizational strategies that ranged in complexity. We propose taxonomies for the cognitive connections and physical structures used when trying to “connect the dots” between two documents. We compared the user-created stories with a data-mining algorithm that constructs chains of documents using co-occurrence metrics. Using the insight gained into the storytelling process, we offer design considerations for the existing data-mining algorithm and corresponding tools to combine the power of data mining with the complex cognitive processing of analysts.
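
    The Python sketch below shows one way a chain between two documents can be grown greedily from term co-occurrence, in the spirit of (but not reproducing) the data-mining algorithm the study compared against; the documents and terms are invented:

        # Documents as bags of terms; the chain repeatedly picks the unused
        # document that overlaps most with both the current end and the target.
        docs = {
            "report_a": {"embassy", "visa", "courier"},
            "report_b": {"courier", "warehouse", "truck"},
            "report_c": {"truck", "border", "cash"},
            "report_d": {"opera", "tickets"},
        }

        def overlap(x, y):
            return len(docs[x] & docs[y])

        def connect(start, end):
            chain, current = [start], start
            while overlap(current, end) == 0:
                candidates = [d for d in docs if d not in chain and d != end]
                best = max(candidates, key=lambda d: overlap(current, d) + overlap(d, end))
                if overlap(current, best) == 0:
                    return None  # no co-occurrence path exists
                chain.append(best)
                current = best
            return chain + [end]

        print(connect("report_a", "report_c"))  # ['report_a', 'report_b', 'report_c']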

  20. User-oriented views in health care information systems.

    PubMed

    Portoni, Luisa; Combi, Carlo; Pinciroli, Francesco

    2002-12-01

    In this paper, we present the methodology we adopted in designing and developing an object-oriented database system for the management of medical records. The designed system provides technical solutions to important requirements of most clinical information systems, such as 1) support for tools to create and manage views on data and view schemas, offering different users specific perspectives on data tailored to their needs; 2) the capability to handle in a suitable way the temporal aspects of clinical information; and 3) the effective integration of multimedia data. Remote data access for authorized users is also considered. As a clinical application, we describe the prototype of a user-oriented clinical information system for the archiving and management of multimedia and temporally oriented clinical data related to percutaneous transluminal coronary angioplasty (PTCA) patients. Suitable view schemas for various user roles (cath-lab physician, ward nurse, general practitioner) have been modeled and implemented on the basis of a detailed analysis of the clinical environment, carried out with an object-oriented approach.

  1. General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets

    NASA Technical Reports Server (NTRS)

    Marchen, Luis F.

    2011-01-01

    The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest, and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions, and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction in which the user can perform trade studies such as changing engineering requirements, and identifying and isolating stringent requirements. It contains summary tables and graphics that can be instantly used for reporting results in view graphs. The entire process to obtain a coronagraphic telescope performance error budget has been automated into three stages: conversion of optical prescription from Zemax or Code V to MACOS (in-house optical modeling and analysis tool), a linear models process, and an error budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package that contains user-defined functions that automate the process. The user interfaces with the process by utilizing an initialization file where the user defines the parameters of the linear model computations. Other than this, the process is fully automated. The third process was developed based on the Terrestrial Planet Finder coronagraph Error Budget Tool, but was fully automated by using VBA code, form, and ActiveX controls.
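
    The core bookkeeping step, combining sensitivity matrices with assumed motions and rolling independent terms up in quadrature, can be sketched in a few lines of Python. The matrix, motion amplitudes, and dimensions below are invented placeholders, not CPEB's actual data:

        import numpy as np

        # Illustrative stand-in: a sensitivity matrix S maps small optic motions
        # to contrast contributions in the dark hole; independent error terms
        # combine as a root-sum-square (RSS).
        rng = np.random.default_rng(0)
        n_modes, n_optics = 4, 6
        S = np.abs(rng.normal(1e-10, 3e-11, size=(n_modes, n_optics)))  # contrast per nm
        thermal_nm = np.full(n_optics, 0.05)   # assumed thermal drift per optic (nm)
        jitter_nm  = np.full(n_optics, 0.02)   # assumed jitter per optic (nm)

        contrast_thermal = S @ thermal_nm
        contrast_jitter  = S @ jitter_nm
        total = np.sqrt(np.sum(contrast_thermal**2) + np.sum(contrast_jitter**2))
        print(f"RSS contrast estimate: {total:.2e}")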

  2. Alkahest NuclearBLAST : a user-friendly BLAST management and analysis system

    PubMed Central

    Diener, Stephen E; Houfek, Thomas D; Kalat, Sam E; Windham, DE; Burke, Mark; Opperman, Charles; Dean, Ralph A

    2005-01-01

    Background - Sequencing of EST and BAC end datasets is no longer limited to large research groups. Drops in per-base pricing have made high-throughput sequencing accessible to individual investigators. However, there are few options available which provide a free and user-friendly solution to the BLAST result storage and data mining needs of biologists. Results - Here we describe NuclearBLAST, a batch BLAST analysis, storage and management system designed for the biologist. It is a wrapper for NCBI BLAST that provides a user-friendly web interface, including a request wizard and the ability to view and mine the results. All BLAST results are stored in a MySQL database which allows for more advanced data mining through supplied command-line utilities or direct database access. NuclearBLAST can be installed on a single machine or clustered amongst a number of machines to improve analysis throughput. NuclearBLAST provides a platform which eases data mining of multiple BLAST results. With the supplied scripts, the program can export data into a spreadsheet-friendly format, automatically assign Gene Ontology terms to sequences and provide bi-directional best hits between two datasets. Users with SQL experience can use the database to ask even more complex questions and extract any subset of data they require. Conclusion - This tool provides a user-friendly interface for requesting, viewing and mining of BLAST results which makes the management and data mining of large sets of BLAST analyses tractable to biologists. PMID:15958161
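
    As an illustration of the bi-directional best hit logic mentioned above, the Python sketch below marks two sequences as a pair when each is the other's top-scoring hit; the hit tables are hypothetical and stand in for what NuclearBLAST would pull from its MySQL store:

        # Hit tables: query -> list of (subject, bitscore).
        a_vs_b = {"geneA1": [("geneB7", 310.0), ("geneB2", 55.0)],
                  "geneA2": [("geneB2", 120.0)]}
        b_vs_a = {"geneB7": [("geneA1", 305.0)],
                  "geneB2": [("geneA2", 118.0), ("geneA1", 50.0)]}

        def best(hits):
            # top-scoring subject for each query
            return {q: max(subjects, key=lambda s: s[1])[0] for q, subjects in hits.items()}

        best_ab, best_ba = best(a_vs_b), best(b_vs_a)
        bbh = [(a, b) for a, b in best_ab.items() if best_ba.get(b) == a]
        print(bbh)  # [('geneA1', 'geneB7'), ('geneA2', 'geneB2')]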

  3. Data Transfer Study HPSS Archiving

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wynne, James; Parete-Koon, Suzanne T; Mitchell, Quinn

    2015-01-01

    The movement of the large amounts of data produced by codes run in a High Performance Computing (HPC) environment can be a bottleneck for project workflows. To balance filesystem capacity and performance requirements, HPC centers enforce data management policies to purge old files to make room for new computation and analysis results. Users at the Oak Ridge Leadership Computing Facility (OLCF) and many other HPC user facilities must archive data to avoid data loss during purges, therefore the time associated with data movement for archiving is something that all users must consider. This study observed the difference in transfer speed from the originating location on the Lustre filesystem to the more permanent High Performance Storage System (HPSS). The tests were done with a number of different transfer methods for files that spanned a variety of sizes and compositions that reflect OLCF user data. This data will be used to help users of Titan and other Cray supercomputers plan their workflow and data transfers so that they are most efficient for their project. We will also discuss best practices for maintaining data at shared user facilities.

  4. LLCEDATA and LLCECALC for Windows version 1.0, Volume 1: User's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McFadden, J.G.

    LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of data bases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank it originates from. LLCECALC reads the EDF and a gamma assay (AV2) file that is produced by the Flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, which is intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, and Volume 3 is a software verification and validation document.

  5. Generic trending and analysis system

    NASA Technical Reports Server (NTRS)

    Keehan, Lori; Reese, Jay

    1994-01-01

    The Generic Trending and Analysis System (GTAS) is a generic spacecraft performance monitoring tool developed by NASA Code 511 and Loral Aerosys. It is designed to facilitate quick anomaly resolution and trend analysis. Traditionally, the job of off-line analysis has been performed using hardware and software systems developed for real-time spacecraft contacts; then, the systems were supplemented with a collection of tools developed by Flight Operations Team (FOT) members. Since the number of upcoming missions is increasing, NASA can no longer afford to operate in this manner. GTAS improves control center productivity and effectiveness because it provides a generic solution across multiple missions. Thus, GTAS eliminates the need for each individual mission to develop duplicate capabilities. It also allows for more sophisticated tools to be developed because it draws resources from several projects. In addition, the GTAS software system incorporates commercial off-the-shelf (COTS) software packages and reuses components of other NASA-developed systems wherever possible. GTAS has incorporated lessons learned from previous missions by involving the users early in the development process. GTAS users took a proactive role in requirements analysis, design, development, and testing. Because of user involvement, several special tools were designed and are now being developed. GTAS users expressed considerable interest in facilitating data collection for long-term trending and analysis. As a result, GTAS provides easy access to large volumes of processed telemetry data directly in the control center. The GTAS archival and retrieval capabilities are supported by the integration of optical disk technology and a COTS relational database management system.

  6. Interactive analysis of geographically distributed population imaging data collections over light-path data networks

    NASA Astrophysics Data System (ADS)

    van Lew, Baldur; Botha, Charl P.; Milles, Julien R.; Vrooman, Henri A.; van de Giessen, Martijn; Lelieveldt, Boudewijn P. F.

    2015-03-01

    The cohort size required in epidemiological imaging genetics studies often mandates the pooling of data from multiple hospitals. Patient data, however, is subject to strict privacy protection regimes, and physical data storage may be legally restricted to a hospital network. To enable biomarker discovery, fast data access and interactive data exploration must be combined with high-performance computing resources, while respecting privacy regulations. We present a system using fast and inherently secure light-paths to access distributed data, thereby obviating the need for a central data repository. A secure private cloud computing framework facilitates interactive, computationally intensive exploration of this geographically distributed, privacy sensitive data. As a proof of concept, MRI brain imaging data hosted at two remote sites were processed in response to a user command at a third site. The system was able to automatically start virtual machines, run a selected processing pipeline and write results to a user accessible database, while keeping data locally stored in the hospitals. Individual tasks took approximately 50% longer compared to a locally hosted blade server, but the cloud infrastructure reduced the total elapsed time by a factor of 40 using 70 virtual machines in the cloud. We demonstrated that the combination of light-paths and a private cloud is a viable means of building an analysis infrastructure for secure data analysis. The system requires further work in the areas of error handling, load balancing and secure support of multiple users.
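
    A quick back-of-the-envelope check of the quoted figures (a sketch, not taken from the paper):

        # If each task runs ~50% slower in the cloud, 70 VMs give an ideal
        # speedup of 70 / 1.5; comparing with the observed factor of 40 gives
        # the implied parallel efficiency.
        n_vms, slowdown = 70, 1.5
        ideal = n_vms / slowdown            # ~46.7x
        observed = 40.0                     # factor reported in the abstract
        print(f"parallel efficiency ~ {observed / ideal:.0%}")  # ~86%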

  7. Convergence Acceleration and Documentation of CFD Codes for Turbomachinery Applications

    NASA Technical Reports Server (NTRS)

    Marquart, Jed E.

    2005-01-01

    The development and analysis of turbomachinery components for industrial and aerospace applications has been greatly enhanced in recent years through the advent of computational fluid dynamics (CFD) codes and techniques. Although the use of this technology has greatly reduced the time required to perform analysis and design, there still remains much room for improvement in the process. In particular, there is a steep learning curve associated with most turbomachinery CFD codes, and the computation times need to be reduced in order to facilitate their integration into standard work processes. Two turbomachinery codes have recently been developed by Dr. Daniel Dorney (MSFC) and Dr. Douglas Sondak (Boston University). These codes are entitled Aardvark (for 2-D and quasi 3-D simulations) and Phantom (for 3-D simulations). The codes utilize the General Equation Set (GES), structured grid methodology, and overset O- and H-grids. The codes have been used with success by Drs. Dorney and Sondak, as well as others within the turbomachinery community, to analyze engine components and other geometries. One of the primary objectives of this study was to establish a set of parametric input values which will enhance convergence rates for steady state simulations, as well as reduce the runtime required for unsteady cases. The goal is to reduce the turnaround time for CFD simulations, thus permitting more design parametrics to be run within a given time period. In addition, other code enhancements to reduce runtimes were investigated and implemented. The other primary goal of the study was to develop enhanced users manuals for Aardvark and Phantom. These manuals are intended to answer most questions for new users, as well as provide valuable detailed information for the experienced user. The existence of detailed user s manuals will enable new users to become proficient with the codes, as well as reducing the dependency of new users on the code authors. In order to achieve the objectives listed, the following tasks were accomplished: 1) Parametric Study Of Preconditioning Parameters And Other Code Inputs; 2) Code Modifications To Reduce Runtimes; 3) Investigation Of Compiler Options To Reduce Code Runtime; and 4) Development/Enhancement of Users Manuals for Aardvark and Phantom

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Z. J.; Wells, D.; Green, J.

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
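
    The concentration step in activation analysis is commonly done by the relative (comparator) method; the Python sketch below shows that calculation with Poisson counting uncertainties, using invented peak areas, and is an illustration rather than this program's actual formula:

        from math import sqrt

        # Relative method: the sample concentration follows from the ratio of
        # its decay-peak area to that of a co-irradiated standard of known
        # concentration; counting uncertainties propagate in quadrature.
        peak_sample, peak_standard = 15200.0, 20400.0     # net peak areas (counts)
        conc_standard = 12.5                              # ppm in the standard
        mass_sample, mass_standard = 1.00, 1.00           # grams

        conc = conc_standard * (peak_sample / peak_standard) * (mass_standard / mass_sample)
        rel_unc = sqrt(1 / peak_sample + 1 / peak_standard)   # Poisson counting only
        print(f"{conc:.2f} ppm +/- {conc * rel_unc:.2f} ppm")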

  9. A System for the Semantic Multimodal Analysis of News Audio-Visual Content

    NASA Astrophysics Data System (ADS)

    Mezaris, Vasileios; Gidaros, Spyros; Papadopoulos, Georgios Th.; Kasper, Walter; Steffen, Jörg; Ordelman, Roeland; Huijbregts, Marijn; de Jong, Franciska; Kompatsiaris, Ioannis; Strintzis, Michael G.

    2010-12-01

    News-related content is nowadays among the most popular types of content for users in everyday applications. Although the generation and distribution of news content has become commonplace, due to the availability of inexpensive media capturing devices and the development of media sharing services targeting both professional and user-generated news content, the automatic analysis and annotation that is required for supporting intelligent search and delivery of this content remains an open issue. In this paper, a complete architecture for knowledge-assisted multimodal analysis of news-related multimedia content is presented, along with its constituent components. The proposed analysis architecture employs state-of-the-art methods for the analysis of each individual modality (visual, audio, text) separately and proposes a novel fusion technique based on the particular characteristics of news-related content for the combination of the individual modality analysis results. Experimental results on news broadcast video illustrate the usefulness of the proposed techniques in the automatic generation of semantic annotations.

  10. Air Markets Program Data (AMPD)

    EPA Pesticide Factsheets

    The Air Markets Program Data tool allows users to search EPA data to answer scientific, general, policy, and regulatory questions about industry emissions. Air Markets Program Data (AMPD) is a web-based application that allows users easy access to both current and historical data collected as part of EPA's emissions trading programs. This site allows you to create and view reports and to download emissions data for further analysis. AMPD provides a query tool so users can create custom queries of industry source emissions data, allowance data, compliance data, and facility attributes. In addition, AMPD provides interactive maps, charts, reports, and pre-packaged datasets. AMPD does not require any additional software, plug-ins, or security controls and can be accessed using a standard web browser.

  11. Establishment of Kansei Database and Application to Design for Consensus Building

    NASA Astrophysics Data System (ADS)

    Yasuda, Keiichi; Shiraki, Wataru

    Against a social background in which the importance of bridge landscape design is increasingly recognized and citizen-involved infrastructure development has begun, there is a growing need for design that reflects the aesthetic sensibilities of actual users. This research focuses on the Kansei engineering technique, in which users' needs are reflected in product development. A questionnaire survey was conducted with bridge engineers, who are most intensively involved in design work, and with students as actual users. The results were analyzed by factor analysis and Hayashi's quantification method (category I). A tool required at consensus-building occasions was created to change design elements and display the resulting differences in evaluation while using the Kansei database.

  12. User and group storage management at the CMS CERN T2 centre

    NASA Astrophysics Data System (ADS)

    Cerminara, G.; Franzoni, G.; Pfeiffer, A.

    2015-12-01

    A wide range of detector commissioning, calibration and data analysis tasks is carried out by CMS using dedicated storage resources available at the CMS CERN Tier-2 centre. Relying on the functionalities of the EOS disk-only storage technology, optimal exploitation of the CMS user/group resources has required the introduction of policies for data access management, data protection, cleanup campaigns based on access patterns, and long-term tape archival. Resource management has been organised around the definition of working groups, with the composition of each group delegated to an identified responsible person. In this paper we illustrate the user/group storage management and the development and operational experience at the CMS CERN Tier-2 centre in the 2012-2015 period.

  13. Authenticated multi-user quantum key distribution with single particles

    NASA Astrophysics Data System (ADS)

    Lin, Song; Wang, Hui; Guo, Gong-De; Ye, Guo-Hua; Du, Hong-Zhen; Liu, Xiao-Fen

    2016-03-01

    Quantum key distribution (QKD) has been growing rapidly in recent years and has become one of the hottest issues in quantum information science. In the implementation of QKD on a network, identity authentication has been a main problem. In this paper, an efficient authenticated multi-user quantum key distribution (MQKD) protocol with single particles is proposed. In this protocol, any two users on a quantum network can perform mutual authentication and share a secure session key with the assistance of a semi-honest center. Meanwhile, the particles, which are used as quantum information carriers, are not required to be stored; therefore, the proposed protocol is feasible with current technology. Finally, security analysis shows that this protocol is secure in theory.

  14. Attribution of Library Costs

    ERIC Educational Resources Information Center

    Drake, Miriam A.

    1977-01-01

    Universities conduct a variety of cost-allocation studies that require the collection and analysis of library cost data. Cost accounting methods are used in most studies; however, costs are attributed to library user groups in a variety of ways. Cost accounting studies are reviewed and allocation methods are discussed. (Author)

  15. DELIVERing Library Resources to the Virtual Learning Environment

    ERIC Educational Resources Information Center

    Secker, Jane

    2005-01-01

    Purpose: Examines a project to integrate digital libraries and virtual learning environments (VLE) focusing on requirements for online reading list systems. Design/methodology/approach: Conducted a user needs analysis using interviews and focus groups and evaluated three reading or resource list management systems. Findings: Provides a technical…

  16. Real English Project Report.

    ERIC Educational Resources Information Center

    Cautin, Harvey; Regan, Edward

    Requirements are discussed for an information retrieval language that enables users to employ natural language sentences in interaction with computer-stored files. Anticipated modes of operation of the system are outlined. These are: the search mode, the dictionary mode, the tables mode, and the statistical mode. Analysis of sample sentences…

  17. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    PubMed Central

    Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA

    2008-01-01

    Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real-time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy-to-use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools, have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776

  18. Easing access to R using 'shiny' to create graphical user interfaces: An example for the R package 'Luminescence'

    NASA Astrophysics Data System (ADS)

    Burow, Christoph; Kreutzer, Sebastian; Dietze, Michael; Fuchs, Margret C.; Schmidt, Christoph; Fischer, Manfred; Brückner, Helmut

    2017-04-01

    Since the release of the R package 'Luminescence' (Kreutzer et al., 2012) the functionality of the package has been greatly enhanced by implementing further functions for measurement data processing, statistical analysis and graphical output. Despite its capabilities for complex and non-standard analysis of luminescence data, working with the command-line interface (CLI) of R can be tedious at best and overwhelming at worst, especially for users without experience in programming languages. Even though much work is put into simplifying the usage of the package to continuously lower the entry threshold, at least basic knowledge of R will always be required. Thus, the potential user base of the package cannot be exhausted as long as the CLI is the only means of utilising the 'Luminescence' package. But even experienced users may find it tedious to iteratively run a function until a satisfying result is produced. For example, plotting data is at least partly subject to personal aesthetic tastes in accordance with the information it is supposed to convey, and iterating through all the possible options in the R CLI can be a time-consuming task. An alternative approach to the CLI is the graphical user interface (GUI), which allows direct, interactive manipulation of and interaction with the underlying software. For users with little or no command-line experience a GUI offers intuitive access that counteracts the perceived steep learning curve of a CLI. Even though R lacks native support for GUI functions, its capability of linking to other programming languages makes it possible to utilise external frameworks to build graphical user interfaces. A recent attempt to provide a GUI toolkit for R was the introduction of the 'shiny' package (Chang et al., 2016), which allows automatic construction of HTML, CSS and JavaScript based user interfaces straight from R. Here, we give (1) a brief introduction to the 'shiny' framework for R, before we (2) present a GUI for the R package 'Luminescence' in the form of interactive web applications. These applications can be accessed online, so a user is not even required to have a local installation of R, and they provide access to most of the plotting functions of the R package 'Luminescence'. These functionalities will be demonstrated live during the PICO session. References Chang, W., Cheng, J., Allaire, JJ., Xie, Y., McPherson, J., 2016. shiny: Web Application Framework for R. R package version 0.13.2. https://CRAN.R-project.org/package=shiny Kreutzer, S., Schmidt, C., Fuchs, M.C., Dietze, M., Fischer, M., Fuchs, M., 2012. Introducing an R package for luminescence dating analysis. Ancient TL 30: 1-8.

  19. User's manual for the Heat Pipe Space Radiator design and analysis Code (HEPSPARC)

    NASA Technical Reports Server (NTRS)

    Hainley, Donald C.

    1991-01-01

    A heat pipe space radiator code (HEPSPARC) was written for the NASA Lewis Research Center and is used for the design and analysis of a radiator that is constructed from a pumped fluid loop that transfers heat to the evaporative section of heat pipes. This manual is designed to familiarize the user with this new code and to serve as a reference for its use. This manual documents the completed work and is intended to be the first step towards verification of the HEPSPARC code. Details are furnished to provide a description of all the requirements and variables used in the design and analysis of a combined pumped loop/heat pipe radiator system. A description of the subroutines used in the program is furnished for those interested in understanding its detailed workings.

  20. Analyzing microtomography data with Python and the scikit-image library.

    PubMed

    Gouillart, Emmanuelle; Nunez-Iglesias, Juan; van der Walt, Stéfan

    2017-01-01

    The exploration and processing of images is a vital aspect of the scientific workflows of many X-ray imaging modalities. Users require tools that combine interactivity, versatility, and performance. scikit-image is an open-source image processing toolkit for the Python language that supports a large variety of file formats and is compatible with 2D and 3D images. The toolkit exposes a simple programming interface, with thematic modules grouping functions according to their purpose, such as image restoration, segmentation, and measurements. scikit-image users benefit from a rich scientific Python ecosystem that contains many powerful libraries for tasks such as visualization or machine learning. scikit-image combines a gentle learning curve, versatile image processing capabilities, and the scalable performance required for the high-throughput analysis of X-ray imaging data.
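
    A minimal pipeline of the kind described, thresholding a 3D volume, labeling connected components, and measuring them, looks like this in scikit-image (the synthetic volume stands in for a real reconstruction):

        import numpy as np
        from skimage import filters, measure

        # Synthetic stand-in for a reconstructed microtomography scan:
        # one bright inclusion plus background noise.
        volume = np.zeros((64, 64, 64))
        volume[20:30, 20:30, 20:30] = 1.0
        volume += 0.1 * np.random.default_rng(0).random(volume.shape)

        threshold = filters.threshold_otsu(volume)  # global Otsu threshold
        labels = measure.label(volume > threshold)  # 3D connected components
        for region in measure.regionprops(labels):
            print(region.label, region.area, region.centroid)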

  1. A cost and utility analysis of NIM/CAMAC standards and equipment for shuttle payload data acquisition and control systems. Volume 3: Tasks 3 and 4

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The modifications for the Nuclear Instrumentation Modular (NIM) and Computer Automated Measurement Control (CAMAC) equipment, designed for ground based laboratory use, that would be required to permit its use in the Spacelab environments were determined. The cost of these modifications was estimated and the most cost-effective approach to implementing them was identified. A shared equipment implementation in which the various Spacelab users draw their required complement of standard NIM and CAMAC equipment for a given flight from a common equipment pool was considered. The alternative approach studied was a dedicated equipment implementation in which each of the users is responsible for procuring either their own NIM/CAMAC equipment or its custom built equivalent.

  2. An observational study of powered wheelchair provision in Italy.

    PubMed

    Salatino, Claudia; Andrich, Renzo; Converti, R M; Saruggia, M

    2016-01-01

    Powered wheelchairs are complex and expensive assistive devices that must be selected and configured on the basis of individual user needs, lifestyle, motivation, driving ability, and environment. Providing agencies often require evidence that their financial investment will lead to a successful outcome. The authors surveyed a sample of 79 users who had obtained powered wheelchairs from a Regional Health Service in Italy in the period 2008-2013. Follow-up interviews were conducted at the users' homes in order to collect information about wheelchair use, and its effectiveness, usefulness, and economic impact. The instruments used in the interviews included an introductory questionnaire, QUEST (Quebec User Evaluation of Satisfaction with Assistive Technology), PIADS (Psychosocial Impact of Assistive Devices Scale), FABS/M (Facilitators and Barriers Survey/Mobility), and SCAI (Siva Cost Analysis Instrument). The results indicated positive outcomes, especially in relation to user satisfaction and psychosocial impact. A number of barriers were identified in various settings that sometimes restrict user mobility, and suggest corrective actions. The provision of a powered wheelchair generated considerable savings in social costs for most users: an average of about $38,000 per person over a projected 5-year period was estimated by comparing the cost of the intervention with that of non-intervention.

  3. NASTRAN postprocessor program for transient response to input accelerations. [procedure for generating and writing modal input data on tapes using NASTRAN

    NASA Technical Reports Server (NTRS)

    Wingate, R. T.; Jones, T. C.; Stephens, M. V.

    1973-01-01

    The description of a transient analysis program for computing structural responses to input base accelerations is presented. A hybrid modal formulation is used and a procedure is demonstrated for generating and writing all modal input data on user tapes via NASTRAN. Use of several new Level 15 modules is illustrated along with a problem associated with reading the postprocessor program input from a user tape. An example application of the program is presented for the analysis of a spacecraft subjected to accelerations initiated by thrust transients. Experience with the program has indicated it to be very efficient and economical because of its simplicity and small central memory storage requirements.

  4. CIRCAL-2 - General-purpose on-line circuit design.

    NASA Technical Reports Server (NTRS)

    Dertouzos, M. L.; Jessel, G. P.; Stinger, J. R.

    1972-01-01

    CIRCAL-2 is a second-generation general-purpose on-line circuit-design program with the following main features: (1) multiple-analysis capability; (2) uniform and general data structures for handling text editing, network representations, and output results, regardless of analysis; (3) special techniques and structures for minimizing and controlling user-program interaction; (4) use of functionals for the description of hysteresis and heat effects; and (5) ability to define optimization procedures that 'replace' the user. The paper discusses the organization of CIRCAL-2, the aforementioned main features, and their consequences, such as a set of network elements and models general enough for most analyses and a set of functions tailored to circuit-design requirements. The presentation is descriptive, concentrating on conceptual rather than on program implementation details.

  5. A method to model latent heat for transient analysis using NASTRAN

    NASA Technical Reports Server (NTRS)

    Harder, R. L.

    1982-01-01

    A sample heat transfer analysis is demonstrated which includes the heat of fusion. The method can be used to analyze a system with nonconstant specific heat. The enthalpy is introduced as an independent degree of freedom at each node. The user input consists of a curve of temperature as a function of enthalpy, which may include a constant temperature phase change. The basic NASTRAN heat transfer capability is used to model the effects of latent heat with existing direct matrix output and nonlinear load data cards. Although some user care is required, the numerical stability of the integration is quite good when the given recommendations are followed. The theoretical equations used and the NASTRAN techniques are shown.
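
    The essence of the method is that the T(H) curve is monotone with a flat segment of width equal to the latent heat, so interpolation in enthalpy passes smoothly through the phase change. A small Python sketch with invented material numbers:

        import numpy as np

        # Temperature tabulated as a function of enthalpy, with a plateau at the
        # melt temperature whose width is the latent heat.
        c_solid, c_liquid = 2.0, 4.2     # specific heats (J/g-K)
        latent, t_melt = 334.0, 0.0      # latent heat (J/g), melt temperature (C)

        H = np.array([-200.0, 0.0, latent, latent + 420.0])          # enthalpy nodes
        T = np.array([t_melt - 200.0 / c_solid, t_melt, t_melt,
                      t_melt + 420.0 / c_liquid])                    # temperature nodes

        for h in (-100.0, 150.0, latent + 210.0):
            print(f"H = {h:7.1f} J/g  ->  T = {np.interp(h, H, T):6.1f} C")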

  6. Web usability evaluation with screen reader users: implementation of the partial concurrent thinking aloud technique.

    PubMed

    Federici, Stefano; Borsci, Simone; Stamerra, Gianluca

    2010-08-01

    A verbal protocol technique, adopted for a web usability evaluation, requires that users be able to perform a double task: surfing and talking. Nevertheless, when blind users surf using a screen reader and talk about the way they interact with the computer, the evaluation is affected by a structural interference: users are forced to think aloud and listen to the screen reader at the same time. The aim of this study is to build a verbal protocol technique for samples of visually impaired users that overcomes the limits of concurrent and retrospective protocols. The technique we developed, called partial concurrent thinking aloud (PCTA), integrates a modified set of concurrent verbalizations with retrospective analysis. One group of 6 blind users and another group of 6 sighted users evaluated the usability of a website using PCTA. By estimating the number of necessary users by means of an asymptotic test, we found that the two groups had an equivalent ability to identify usability problems, both over 80%. This result suggests that PCTA, while respecting the properties of classic verbal protocols, also overcomes the structural interference and the limits of concurrent and retrospective protocols when used with screen reader users. In this way, PCTA reduces the difference in usability evaluation efficiency between blind and sighted users.

  7. Cloud hosting of the IPython Notebook to Provide Collaborative Research Environments for Big Data Analysis

    NASA Astrophysics Data System (ADS)

    Kershaw, Philip; Lawrence, Bryan; Gomez-Dans, Jose; Holt, John

    2015-04-01

    We explore how the popular IPython Notebook computing system can be hosted on a cloud platform to provide a flexible virtual research hosting environment for Earth Observation data processing and analysis and how this approach can be expanded more broadly into a generic SaaS (Software as a Service) offering for the environmental sciences. OPTIRAD (OPTImisation environment for joint retrieval of multi-sensor RADiances) is a project funded by the European Space Agency to develop a collaborative research environment for Data Assimilation of Earth Observation products for land surface applications. Data Assimilation provides a powerful means to combine multiple sources of data and derive new products for this application domain. To be most effective, it requires close collaboration between specialists in this field, land surface modellers and end users of data generated. A goal of OPTIRAD then is to develop a collaborative research environment to engender shared working. Another significant challenge is that of data volume and complexity. Study of the land surface requires high spatial and temporal resolutions, a relatively large number of variables and the application of algorithms which are computationally expensive. These problems can be addressed with the application of parallel processing techniques on specialist compute clusters. However, scientific users are often deterred by the time investment required to port their codes to these environments. Even when porting is successfully achieved, the result may be difficult to readily change or update. This runs counter to the scientific process of continuous experimentation, analysis and validation. The IPython Notebook provides users with a web-based interface to multiple interactive shells for the Python programming language. Code, documentation and graphical content can be saved and shared, making it directly applicable to OPTIRAD's requirements for a shared working environment. Given the web interface, it can readily be made into a hosted service; Wakari and Microsoft Azure are notable examples. Cloud-hosting of the Notebook allows the same familiar Python interface to be retained but backed by the Cloud Computing attributes of scalability, elasticity and resource pooling. This combination makes it a powerful solution to address the needs of long-tail science users of Big Data: an intuitive interactive interface with which to access powerful compute resources. The IPython Notebook can be hosted as a single-user desktop environment, but the recent development by the IPython community of JupyterHub enables it to be run as a multi-user hosting environment. In addition, IPython.parallel allows parallel compute infrastructure to be exposed through a Python interface. Applying these technologies in combination, a collaborative research environment has been developed for OPTIRAD on the UK JASMIN/CEMS facility's private cloud (http://jasmin.ac.uk). Based on this experience, a generic virtualised solution suitable for use by the wider environmental science community is under development, deployable both on JASMIN and on third-party cloud platforms.
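
    As a flavor of the parallel interface, the sketch below uses the ipyparallel package (the later packaging of IPython.parallel); it assumes a controller and engines are already running, e.g. started with "ipcluster start -n 4", and the task function is a placeholder:

        import ipyparallel as ipp

        rc = ipp.Client()                 # connect to the running cluster
        view = rc.load_balanced_view()    # dispatch tasks to free engines

        def assimilate(tile):
            # placeholder for a per-tile retrieval/assimilation step
            return tile, sum(tile)

        results = view.map_sync(assimilate, [(1, 2), (3, 4), (5, 6)])
        print(results)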

  8. CAPER 3.0: A Scalable Cloud-Based System for Data-Intensive Analysis of Chromosome-Centric Human Proteome Project Data Sets.

    PubMed

    Yang, Shuai; Zhang, Xinlei; Diao, Lihong; Guo, Feifei; Wang, Dan; Liu, Zhongyang; Li, Honglei; Zheng, Junjie; Pan, Jingshan; Nice, Edouard C; Li, Dong; He, Fuchu

    2015-09-04

    The Chromosome-centric Human Proteome Project (C-HPP) aims to catalog genome-encoded proteins using a chromosome-by-chromosome strategy. As the C-HPP proceeds, the increasing requirement for data-intensive analysis of the MS/MS data poses a challenge to the proteomic community, especially small laboratories lacking computational infrastructure. To address this challenge, we have updated the previous CAPER browser into a higher version, CAPER 3.0, which is a scalable cloud-based system for data-intensive analysis of C-HPP data sets. CAPER 3.0 uses cloud computing technology to facilitate MS/MS-based peptide identification. In particular, it can use both public and private cloud, facilitating the analysis of C-HPP data sets. CAPER 3.0 provides a graphical user interface (GUI) to help users transfer data, configure jobs, track progress, and visualize the results comprehensively. These features enable users without programming expertise to easily conduct data-intensive analysis using CAPER 3.0. Here, we illustrate the usage of CAPER 3.0 with four specific mass spectral data-intensive problems: detecting novel peptides, identifying single amino acid variants (SAVs) derived from known missense mutations, identifying sample-specific SAVs, and identifying exon-skipping events. CAPER 3.0 is available at http://prodigy.bprc.ac.cn/caper3.

  9. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  10. ARDS User Manual

    NASA Technical Reports Server (NTRS)

    Fleming, David P.

    2001-01-01

    Personal computers (PCs) are now used extensively for engineering analysis; their capability exceeds that of mainframe computers of only a few years ago. Programs originally written for mainframes have been ported to PCs to make their use easier. One of these programs is ARDS (Analysis of Rotor Dynamic Systems), which was developed at Arizona State University (ASU) by Nelson et al. to quickly and accurately analyze rotor steady-state and transient response using the method of component mode synthesis. The original ARDS program was ported to the PC in 1995. Several extensions were made at ASU to increase the capability of mainframe ARDS. These extensions have also been incorporated into the PC version of ARDS. Each mainframe extension had its own user manual, generally covering only that extension. Thus, exploiting the full capability of ARDS required a large set of user manuals. Moreover, necessary changes and enhancements for PC ARDS were undocumented. The present document is intended to remedy those problems by combining all pertinent information needed for the use of PC ARDS into one volume.

  11. INSPECT: A graphical user interface software package for IDARC-2D

    NASA Astrophysics Data System (ADS)

    AlHamaydeh, Mohammad; Najib, Mohamad; Alawnah, Sameer

    Modern day Performance-Based Earthquake Engineering (PBEE) pivots about nonlinear analysis and its feasibility. IDARC-2D is a widely used and accepted software for nonlinear analysis; it possesses many attractive features and capabilities. However, it is operated from the command prompt in the DOS/Unix systems and requires elaborate text-based input files creation by the user. To complement and facilitate the use of IDARC-2D, a pre-processing GUI software package (INSPECT) is introduced herein. INSPECT is created in the C# environment and utilizes the .NET libraries and SQLite database. Extensive testing and verification demonstrated successful and high-fidelity re-creation of several existing IDARC-2D input files. Its design and built-in features aim at expediting, simplifying and assisting in the modeling process. Moreover, this practical aid enhances the reliability of the results and improves accuracy by reducing and/or eliminating many potential and common input mistakes. Such benefits would be appreciated by novice and veteran IDARC-2D users alike.

  12. A Method for Automated Detection of Usability Problems from Client User Interface Events

    PubMed Central

    Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.

    2005-01-01

    Think-aloud usability (TAU) analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher-order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121
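
    A toy version of the idea, an event record funneled to a central store plus one detection rule replaying a manual coding criterion, is sketched below in Python; the schema, rule, threshold and correction factor are all hypothetical:

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class UIEvent:
            user: str
            subgoal: str        # higher-order task the event belongs to
            action: str         # e.g. "click", "undo", "help"
            elapsed_s: float    # seconds since the subgoal started

        THINK_ALOUD_CORRECTION = 1.3   # assumed slowdown factor while talking
        TIME_LIMIT_S = 60.0            # manual-coding criterion for "stuck"

        def detect_problems(events: List[UIEvent]):
            # flag backtracking or excessive time on a subgoal
            problems = []
            for e in events:
                if e.action == "undo" or e.elapsed_s > TIME_LIMIT_S * THINK_ALOUD_CORRECTION:
                    problems.append((e.user, e.subgoal, e.action))
            return problems

        log = [UIEvent("u1", "order-test", "click", 12.0),
               UIEvent("u1", "order-test", "undo", 15.0),
               UIEvent("u2", "read-slide", "click", 95.0)]
        print(detect_problems(log))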

  13. Scheduler software for tracking and data relay satellite system loading analysis: User manual and programmer guide

    NASA Technical Reports Server (NTRS)

    Craft, R.; Dunn, C.; Mccord, J.; Simeone, L.

    1980-01-01

    A user guide and programmer documentation are provided for a system of PRIME 400 minicomputer programs. The system was designed to support loading analyses of the Tracking and Data Relay Satellite System (TDRSS). The system is a scheduler for various types of data relays (including tape recorder dumps and real-time relays) from orbiting payloads to the TDRSS. Several model options are available to statistically generate data relay requirements. TDRSS timelines (representing resources available for scheduling) and payload/TDRSS acquisition and loss-of-sight timelines are input to the scheduler from disk. Tabulated output from the interactive system includes a summary of scheduler activities over time intervals specified by the user and an overall summary of scheduler input and output information. A history file, which records every event generated by the scheduler, is written to disk to allow further scheduling on remaining resources and to provide data for graphic displays or additional statistical analysis.

  14. A Space and Atmospheric Visualization Science System

    NASA Technical Reports Server (NTRS)

    Szuszczewicz, E. P.; Blanchard, P.; Mankofsky, A.; Goodrich, C.; Kamins, D.; Kulkarni, R.; Mcnabb, D.; Moroh, M.

    1994-01-01

    SAVS (a Space and Atmospheric Visualization Science system) is an integrated system with user-friendly functionality that employs a 'push-button' software environment that mimics the logical scientific processes in data acquisition, reduction, analysis, and visualization. All of this is accomplished without requiring a detailed understanding of the methods, networks, and modules that link the tools and effectively execute the functions. This report describes SAVS and its components, followed by several applications based on generic research interests in interplanetary and magnetospheric physics (IMP/ISTP), active experiments in space (CRRES), and mission planning focused on the earth's thermospheric, ionospheric, and mesospheric domains (TIMED). The final chapters provide a user-oriented description of interface functionalities, hands-on operations, and customized modules, with details of the primary modules presented in the appendices. The overall intent of the report is to reflect the accomplishments of the three-year development effort and to introduce potential users to the power and utility of the integrated data acquisition, analysis, and visualization system.

  15. Tele-rehabilitation using in-house wearable ankle rehabilitation robot.

    PubMed

    Jamwal, Prashant K; Hussain, Shahid; Mir-Nasiri, Nazim; Ghayesh, Mergen H; Xie, Sheng Q

    2018-01-01

    This article explores the wide-ranging potential of a wearable ankle robot for in-house rehabilitation. The presented robot has been conceptualized following a brief analysis of existing technologies, systems, and solutions for in-house physical ankle rehabilitation. Configuration design analysis and component selection for the ankle robot are discussed as part of the conceptual design. The complexities of human-robot interaction are closely encountered while maneuvering a rehabilitation robot; we therefore present a fuzzy-logic-based controller to perform the required robot-assisted ankle rehabilitation treatment. Designs of visual haptic interfaces are also discussed, which make the treatment engaging and motivate the subject to exert more effort and regain lost functions more rapidly. The complex nature of web-based communication between the user and remotely located physiotherapy staff is also discussed. A high-level software architecture appended to the robot ensures user-friendly operation. This software is made up of three important components: a patient-related database, a graphical user interface (GUI), and a library of exercises creating virtual reality, specifically developed for ankle rehabilitation.

  16. SpecDB: The AAVSO’s Public Repository for Spectra of Variable Stars

    NASA Astrophysics Data System (ADS)

    Kafka, Stella; Weaver, John; Silvis, George; Beck, Sara

    2018-01-01

    SpecDB is the American Association of Variable Star Observers (AAVSO) spectral database. Accessible to any astronomer with the capability to perform spectroscopy, SpecDB provides an unprecedented scientific opportunity for amateur and professional astronomers around the globe. Backed by the Variable Star Index, one of the most utilized variable star catalogs, SpecDB is expected to become one of the world's leading databases of its kind. Once verified by a team of expert spectroscopists, an observer can upload spectra of variable star targets easily and efficiently. Uploaded spectra can then be searched for, previewed, and downloaded for inclusion in publications. Close community development and involvement will ensure a user-friendly and versatile database, compatible with the needs of 21st century astrophysics. Observations of 1D spectra are submitted as FITS files. All spectra are required to be preprocessed with wavelength calibration and dark subtraction; bias and flat-field corrections are strongly recommended. First-time observers are required to submit a spectrum of a standard (non-variable) star to be checked for errors in technique or equipment. Regardless of user validation, FITS headers must include several value cards detailing the observation, as well as information regarding the observer, equipment, and observing site in accordance with existing AAVSO records. This enforces consistency and provides the necessary details for follow-up analysis. Requirements are provided to users in a comprehensive guidebook and accompanying technical manual. Upon submission, FITS headers are automatically checked for errors and any anomalies are immediately fed back to the user. Successful candidates can then submit at will, including multiple simultaneous submissions. All published observations can be searched and interactively previewed. Community involvement will be enhanced by an associated forum where users can discuss observation techniques and suggest improvements to the database.
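
    An automated header check of this kind is straightforward with astropy; the sketch below validates a submission against a required-keyword list. The keyword list and file name are hypothetical, not the AAVSO's actual specification:

        from astropy.io import fits

        # Hypothetical required value cards; the real list lives in the guidebook.
        REQUIRED = ["OBJECT", "DATE-OBS", "OBSERVER", "INSTRUME", "SITELAT", "SITELONG"]

        def check_submission(path):
            # return the list of required keywords missing from the primary header
            with fits.open(path) as hdul:
                header = hdul[0].header
                return [key for key in REQUIRED if key not in header]

        missing = check_submission("spectrum.fits")   # hypothetical local file
        print("ok" if not missing else f"rejected, missing: {missing}")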

  17. Collection policy management for the Kuopio University and Kuopio University Hospital, Finland: detecting the needs of users and developing high-quality collections.

    PubMed

    Kananen, Jukka; Ovaska, Tuulevi; Saarti, Jarmo

    2006-09-01

    This article discusses the collection policies of a university library in a modern digital environment. A brief description of national collection policy decisions in Finland is provided. The rapid evolution and growth of scientific publication place new demands on building a collection in a health- and bioscience-oriented university, and require an evidence-based approach to support effective service processes. The aim of the study was to identify the needs of the university's students and staff. Usage statistics were surveyed and analysed. Both usage statistics and user surveys indicate that library use is divided evenly between the traditional use of printed material and library premises and the modern use of digital materials via the Web. The former is mainly how students and hospital staff use the library, while the latter can be viewed as the researchers' way of using the library. Librarians and information specialists act in this as service providers and/or as guides and tutors to end-users. These results, however, must be validated with data collation and analysis over a longer timescale, both of which are an ongoing process within the library. It is important that the requirements and needs of the library's users are monitored regularly and that acquisition policies are updated frequently. It also seems that needs have changed quite dramatically in response to modern ways of disseminating publications, but this supposition will require further study.

  18. Blind information-theoretic multiuser detection algorithms for DS-CDMA and WCDMA downlink systems.

    PubMed

    Waheed, Khuram; Salem, Fathi M

    2005-07-01

    Code division multiple access (CDMA) is based on spread-spectrum technology and is a dominant air interface for 2.5G, 3G, and future wireless networks. For the CDMA downlink, the transmitted CDMA signals from the base station (BS) propagate through a noisy multipath fading communication channel before arriving at the receiver of the user equipment/mobile station (UE/MS). Classical CDMA single-user detection (SUD) algorithms implemented in the UE/MS receiver do not provide the required performance for modern high data-rate applications. In contrast, multiuser detection (MUD) approaches require extensive a priori information that is not available to the UE/MS. In this paper, three promising adaptive Riemannian contra-variant (or natural) gradient-based user detection approaches, capable of handling highly dynamic wireless environments, are proposed. The first approach, blind multiuser detection (BMUD), is the process of simultaneously estimating multiple symbol sequences associated with all the users in the downlink of a CDMA communication system using only the received wireless data and without any knowledge of the user spreading codes. This approach is applicable to CDMA systems with relatively short spreading codes but becomes impractical for systems using long spreading codes. We also propose two other adaptive approaches, namely RAKE-blind source recovery (RAKE-BSR) and RAKE-principal component analysis (RAKE-PCA), that fuse an adaptive stage into a standard RAKE receiver. This adaptation results in robust user detection algorithms with performance exceeding that of linear minimum mean squared error (LMMSE) detectors for both direct-sequence CDMA (DS-CDMA) and wideband CDMA (WCDMA) systems under conditions of congestion, imprecise channel estimation and unmodeled multiple access interference (MAI).
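
    At the core of such blind approaches is a natural-gradient learning rule for a demixing matrix. The sketch below shows the generic Amari-style update on a toy instantaneous-mixture model; the signal model, step size, and tanh score function are illustrative choices, not the paper's exact detector.

    ```python
    # Sketch of the natural (Riemannian contra-variant) gradient update for blind
    # source separation: W += eta * (I - phi(y) y^T) W. Toy data, not the paper's model.
    import numpy as np

    rng = np.random.default_rng(0)
    n_users, n_samples = 3, 5000
    # Heavy-tailed (Laplacian) sources, for which the tanh score is appropriate.
    S = rng.laplace(size=(n_users, n_samples))
    A = rng.standard_normal((n_users, n_users))   # unknown mixing channel
    X = A @ S                                     # received mixtures

    W = np.eye(n_users)                           # demixing matrix
    eta = 1e-3
    for t in range(n_samples):                    # several passes may help in practice
        y = W @ X[:, t]
        W += eta * (np.eye(n_users) - np.tanh(y)[:, None] * y[None, :]) @ W

    # After adaptation, W @ A should approach a scaled permutation matrix.
    print(np.round(W @ A, 2))
    ```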

  19. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time-consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that makes performance metric calculations programmable, thereby allowing the analysis to be automated and reducing application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers so that it can be transferred to novice users.
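
    To make the idea of a programmable metric concrete, the sketch below defines one metric as an ordinary function over trace records; the record format and metric are invented for illustration and do not reflect Paramedir's actual configuration language.

    ```python
    # A "programmable metric" reduced to its essence: a small function evaluated
    # over trace records. Record format is a hypothetical (thread, state, microseconds).
    from collections import defaultdict

    trace = [(0, "compute", 900), (0, "mpi_wait", 100),
             (1, "compute", 600), (1, "mpi_wait", 400)]

    def metric_compute_fraction(records):
        """Fraction of total time spent computing, per thread and overall."""
        busy, total = defaultdict(float), defaultdict(float)
        for thread, state, dt in records:
            total[thread] += dt
            if state == "compute":
                busy[thread] += dt
        per_thread = {t: busy[t] / total[t] for t in total}
        overall = sum(busy.values()) / sum(total.values())
        return per_thread, overall

    print(metric_compute_fraction(trace))   # thread 1 drags efficiency down
    ```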

  20. A Hop-Count Analysis Scheme for Avoiding Wormhole Attacks in MANET

    PubMed Central

    Jen, Shang-Ming; Laih, Chi-Sung; Kuo, Wen-Chung

    2009-01-01

    MANET, due to the nature of wireless transmission, has more security issues compared to wired environments. A specific type of attack, the wormhole attack, does not require exploiting any nodes in the network and can interfere with the route establishment process. Instead of detecting wormholes from the role of administrators, as in previous methods, we implement a new protocol, MHA, which uses hop-count analysis from the viewpoint of users and makes no special environment assumptions. We also discuss previous works that require the administrator role and rely on impractical assumptions, thus showing the advantages of MHA. PMID:22408566
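
    The intuition behind hop-count analysis is that a wormhole tunnel makes some destinations look implausibly close. The sketch below flags routes whose advertised hop count undercuts a lower bound; the decision rule and bounds are illustrative, not MHA's actual protocol logic.

    ```python
    # Schematic hop-count anomaly check: a wormhole shortcut advertises routes
    # with fewer hops than the network geometry allows. Thresholds are assumptions.
    def suspicious_routes(routes, expected_min_hops):
        """Flag destinations whose advertised hop count is below a lower bound.

        routes: dict destination -> advertised hop count
        expected_min_hops: dict destination -> bound (e.g., distance / radio range)
        """
        flagged = []
        for dest, hops in routes.items():
            if hops < expected_min_hops.get(dest, 1):
                flagged.append(dest)   # likely tunnelled through a wormhole
        return flagged

    routes = {"nodeA": 2, "nodeB": 5}
    bounds = {"nodeA": 4, "nodeB": 4}   # nodeA cannot physically be 2 hops away
    print(suspicious_routes(routes, bounds))   # -> ['nodeA']
    ```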

  1. Results of a transparent expert consultation on patient and public involvement in palliative care research.

    PubMed

    Daveson, Barbara A; de Wolf-Linder, Susanne; Witt, Jana; Newson, Kirstie; Morris, Carolyn; Higginson, Irene J; Evans, Catherine J

    2015-12-01

    Support and evidence for patient, unpaid caregiver and public involvement in research (user involvement) are growing. Consensus on how best to involve users in palliative care research is lacking. To determine an optimal user-involvement model for palliative care research. We hosted a consultation workshop using expert presentations, discussion and nominal group technique to generate recommendations and consensus on agreement of importance. A total of 35 users and 32 researchers were approached to attend the workshop, which included break-out groups and a ranking exercise. Descriptive statistical analysis to establish consensus and highlight divergence was applied. Qualitative analysis of discussions was completed to aid interpretation of findings. Participants involved in palliative care research were invited to a global research institute, UK. A total of 12 users and 5 researchers participated. Users wanted their involvement to be more visible, including during dissemination, with a greater emphasis on the difference their involvement makes. Researchers wanted to improve productivity, relevance and quality through involvement. Users and researchers agreed that an optimal model should consist of (a) early involvement to ensure meaningful involvement and impact and (b) diverse virtual and face-to-face involvement methods to ensure flexibility. For involvement in palliative care research to succeed, early and flexible involvement is required. Researchers should advertise opportunities for involvement and promote impact of involvement via dissemination plans. Users should prioritise adding value to research through enhancing productivity, quality and relevance. More research is needed not only to inform implementation and ensure effectiveness but also to investigate the cost-effectiveness of involvement in palliative care research. © The Author(s) 2015.

  2. Results of a transparent expert consultation on patient and public involvement in palliative care research

    PubMed Central

    Daveson, Barbara A; de Wolf-Linder, Susanne; Witt, Jana; Newson, Kirstie; Morris, Carolyn; Higginson, Irene J; Evans, Catherine J

    2015-01-01

    Background: Support and evidence for patient, unpaid caregiver and public involvement in research (user involvement) are growing. Consensus on how best to involve users in palliative care research is lacking. Aim: To determine an optimal user-involvement model for palliative care research. Design: We hosted a consultation workshop using expert presentations, discussion and nominal group technique to generate recommendations and consensus on agreement of importance. A total of 35 users and 32 researchers were approached to attend the workshop, which included break-out groups and a ranking exercise. Descriptive statistical analysis to establish consensus and highlight divergence was applied. Qualitative analysis of discussions was completed to aid interpretation of findings. Setting/participants: Participants involved in palliative care research were invited to a global research institute, UK. Results: A total of 12 users and 5 researchers participated. Users wanted their involvement to be more visible, including during dissemination, with a greater emphasis on the difference their involvement makes. Researchers wanted to improve productivity, relevance and quality through involvement. Users and researchers agreed that an optimal model should consist of (a) early involvement to ensure meaningful involvement and impact and (b) diverse virtual and face-to-face involvement methods to ensure flexibility. Conclusion: For involvement in palliative care research to succeed, early and flexible involvement is required. Researchers should advertise opportunities for involvement and promote impact of involvement via dissemination plans. Users should prioritise adding value to research through enhancing productivity, quality and relevance. More research is needed not only to inform implementation and ensure effectiveness but also to investigate the cost-effectiveness of involvement in palliative care research. PMID:25931336

  3. 78 FR 40786 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-08

    ... the level of trading volume sent to the Exchange by such users, such that a user with significant... size of the firm, every user that connects its systems to the Exchange's trading systems requires at... required to support for connectivity to its trading systems. This would also provide incentive for users to...

  4. 'What makes an excellent mental health doctor?' A response integrating the experiences and views of service users with critical reflections of psychiatrists.

    PubMed

    Gunasekara, Imani; Patterson, Sue; Scott, James G

    2017-11-01

    While therapeutic relationships are appropriately recognised as the foundation of mental health service, service users commonly report suboptimal experiences. With shared understanding critical to improvement in practice, we explored service users' experiences and expectations of psychiatrists and consultations, engaging psychiatrists throughout the process. Using an iterative qualitative approach we co-produced a response to the question 'what makes an excellent mental health doctor?' Experiences and expectations of psychiatrists were explored in interviews with 22 service users. Data collection, analysis and interpretation were informed by consultation with peer workers. Findings were contextualised in formal consultations with psychiatrists. As 'masters of their craft', excellent mental health doctors engage authentically with service users as people (not diagnoses). They listen, validate experiences and empathise affectively and cognitively. They demonstrate phronesis, applying clinical knowledge compassionately. Psychiatrists share service users' aspiration of equitable partnership but competing demands and 'professional boundaries' constrain engagement. Consistent delivery of the person-centred, recovery-oriented care promoted by policy and sought by service users will require substantial revision of the structure and priorities of mental health services. The insights and experiences of service users must be integral to medical education, and systems must provide robust support to psychiatrists. © 2017 John Wiley & Sons Ltd.

  5. A Demonstration and Analysis of Requirements for Maritime Navigation Planning.

    DTIC Science & Technology

    1998-03-01

    it the highest and 0-4 and above the lowest. Once again, the value added by grouping the data and comparing may be nothing...purpose behind a prototype is to ascertain user requirements. It should be created rapidly to speed up the system development life cycle (SDLC). Since...system is contained in Chapter II, Section B. 3. Internet to Sea (SEANET) Program The SeaNet Project is a collaborative effort to bring the

  6. Case Study Analysis of the Impacts of Water Acquisition for Hydraulic Fracturing on Local Water Availability

    EPA Science Inventory

    Hydraulic fracturing (HF) is used to develop unconventional gas reserves, but the technology requires large volumes of water, placing demands on local water resources and potentially creating conflict with other users and ecosystems. This study examines the balance between water ...

  7. A Task-Based Analysis of Information Requirements of Tactical Maps

    DTIC Science & Technology

    1979-08-01

    work began with the precept that military maps are primarily intended to serve users in the performance of functional tasks. By capitalizing on the task...

  8. Using Visualization and Computation in the Analysis of Separation Processes

    ERIC Educational Resources Information Center

    Joo, Yong Lak; Choudhary, Devashish

    2006-01-01

    For decades, every chemical engineer has been asked to have a background in separations. The required separations course can, however, be uninspiring and superficial because understanding many separation processes involves conventional graphical methods and commercial process simulators. We utilize simple, user-­friendly mathematical software,…

  9. Spatially Locating FIA Plots from Pixel Values

    Treesearch

    Greg C. Liknes; Geoffrey R. Holden; Mark D. Nelson; Ronald E. McRoberts

    2005-01-01

    The USDA Forest Service Forest Inventory and Analysis (FIA) program is required to ensure the confidentiality of the geographic locations of plots. To accommodate user requests for data without releasing actual plot coordinates, FIA creates overlays of plot locations on various geospatial data, including satellite imagery. Methods for reporting pixel values associated...

  10. The BCI competition. III: Validating alternative approaches to actual BCI problems.

    PubMed

    Blankertz, Benjamin; Müller, Klaus-Robert; Krusienski, Dean J; Schalk, Gerwin; Wolpaw, Jonathan R; Schlögl, Alois; Pfurtscheller, Gert; Millán, José del R; Schröder, Michael; Birbaumer, Niels

    2006-06-01

    A brain-computer interface (BCI) is a system that allows its users to control external devices with brain activity. Although the proof-of-concept was given decades ago, the reliable translation of user intent into device control commands is still a major challenge. Success requires the effective interaction of two adaptive controllers: the user's brain, which produces brain activity that encodes intent, and the BCI system, which translates that activity into device control commands. In order to facilitate this interaction, many laboratories are exploring a variety of signal analysis techniques to improve the adaptation of the BCI system to the user. In the literature, many machine learning and pattern classification algorithms have been reported to give impressive results when applied to BCI data in offline analyses. However, it is more difficult to evaluate their relative value for actual online use. BCI data competitions have been organized to provide objective formal evaluations of alternative methods. Prompted by the great interest in the first two BCI Competitions, we organized the third BCI Competition to address several of the most difficult and important analysis problems in BCI research. The paper describes the data sets that were provided to the competitors and gives an overview of the results.

  11. OAP- OFFICE AUTOMATION PILOT GRAPHICS DATABASE SYSTEM

    NASA Technical Reports Server (NTRS)

    Ackerson, T.

    1994-01-01

    The Office Automation Pilot (OAP) Graphics Database system offers the IBM PC user assistance in producing a wide variety of graphs and charts. OAP uses a convenient database system, called a chartbase, for creating and maintaining data associated with the charts, and twelve different graphics packages are available to the OAP user. Each of the graphics capabilities is accessed in a similar manner. The user chooses creation, revision, or chartbase/slide show maintenance options from an initial menu. The user may then enter or modify data displayed on a graphic chart. The cursor moves through the chart in a "circular" fashion to facilitate data entries and changes. Various "help" functions and on-screen instructions are available to aid the user. The user data is used to generate the graphics portion of the chart. Completed charts may be displayed in monotone or color, printed, plotted, or stored in the chartbase on the IBM PC. Once completed, the charts may be put in a vector format and plotted for color viewgraphs. The twelve graphics capabilities are divided into three groups: Forms, Structured Charts, and Block Diagrams. There are eight Forms available: 1) Bar/Line Charts, 2) Pie Charts, 3) Milestone Charts, 4) Resources Charts, 5) Earned Value Analysis Charts, 6) Progress/Effort Charts, 7) Travel/Training Charts, and 8) Trend Analysis Charts. There are three Structured Charts available: 1) Bullet Charts, 2) Organization Charts, and 3) Work Breakdown Structure (WBS) Charts. The Block Diagram available is an N x N Chart. Each graphics capability supports a chartbase. The OAP graphics database system provides the IBM PC user with an effective means of managing data which is best interpreted as a graphic display. The OAP graphics database system is written in IBM PASCAL 2.0 and assembler for interactive execution on an IBM PC or XT with at least 384K of memory, and a color graphics adapter and monitor. Printed charts require an Epson, IBM, OKIDATA, or HP Laser printer (or equivalent). Plots require the Tektronix 4662 Penplotter. Source code is supplied to the user for modification and customizing. Executables are also supplied for all twelve graphics capabilities. This system was developed in 1983, and Version 3.1 was released in 1986.

  12. How can we make progress with decision support systems in landscape and river basin management? Lessons learned from a comparative analysis of four different decision support systems.

    PubMed

    Volk, Martin; Lautenbach, Sven; van Delden, Hedwig; Newham, Lachlan T H; Seppelt, Ralf

    2010-12-01

    This article analyses the benefits and shortcomings of the recently developed decision support systems (DSS) FLUMAGIS, Elbe-DSS, CatchMODS, and MedAction. The analysis elaborates on the following aspects: (i) application area/decision problem, (ii) stakeholder interaction/users involved, (iii) structure of DSS/model structure, (iv) usage of the DSS, and finally (v) most important shortcomings. On the basis of this analysis, we formulate four criteria that we consider essential for the successful use of DSS in landscape and river basin management. The criteria relate to (i) system quality, (ii) user support and user training, (iii) perceived usefulness and (iv) user satisfaction. We can show that the availability of tools and technologies for DSS in landscape and river basin management is good to excellent. However, our investigations indicate that several problems have to be tackled. First of all, data availability and homogenisation, uncertainty analysis and uncertainty propagation and problems with model integration require further attention. Furthermore, the appropriate and methodological stakeholder interaction and the definition of 'what end-users really need and want' have been documented as general shortcomings of all four examples of DSS. Thus, we propose an iterative development process that enables social learning of the different groups involved in the development process, because it is easier to design a DSS for a group of stakeholders who actively participate in an iterative process. We also identify two important lines of further development in DSS: the use of interactive visualization tools and the methodology of optimization to inform scenario elaboration and evaluate trade-offs among environmental measures and management alternatives.

  13. How Can We Make Progress with Decision Support Systems in Landscape and River Basin Management? Lessons Learned from a Comparative Analysis of Four Different Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Volk, Martin; Lautenbach, Sven; van Delden, Hedwig; Newham, Lachlan T. H.; Seppelt, Ralf

    2010-12-01

    This article analyses the benefits and shortcomings of the recently developed decision support systems (DSS) FLUMAGIS, Elbe-DSS, CatchMODS, and MedAction. The analysis elaborates on the following aspects: (i) application area/decision problem, (ii) stakeholder interaction/users involved, (iii) structure of DSS/model structure, (iv) usage of the DSS, and finally (v) most important shortcomings. On the basis of this analysis, we formulate four criteria that we consider essential for the successful use of DSS in landscape and river basin management. The criteria relate to (i) system quality, (ii) user support and user training, (iii) perceived usefulness and (iv) user satisfaction. We can show that the availability of tools and technologies for DSS in landscape and river basin management is good to excellent. However, our investigations indicate that several problems have to be tackled. First of all, data availability and homogenisation, uncertainty analysis and uncertainty propagation and problems with model integration require further attention. Furthermore, the appropriate and methodological stakeholder interaction and the definition of `what end-users really need and want' have been documented as general shortcomings of all four examples of DSS. Thus, we propose an iterative development process that enables social learning of the different groups involved in the development process, because it is easier to design a DSS for a group of stakeholders who actively participate in an iterative process. We also identify two important lines of further development in DSS: the use of interactive visualization tools and the methodology of optimization to inform scenario elaboration and evaluate trade-offs among environmental measures and management alternatives.

  14. Eurogrid: a new glideinWMS based portal for CDF data analysis

    NASA Astrophysics Data System (ADS)

    Amerio, S.; Benjamin, D.; Dost, J.; Compostella, G.; Lucchesi, D.; Sfiligoi, I.

    2012-12-01

    The CDF experiment at Fermilab ended its Run-II phase in September 2011 after 11 years of operations and 10 fb-1 of collected data. The CDF computing model is based on a Central Analysis Farm (CAF) consisting of local computing and storage resources, supported by OSG and LCG resources accessed through dedicated portals. At the beginning of 2011 a new portal, Eurogrid, was developed to effectively exploit computing and disk resources in Europe: a dedicated farm and storage area at the TIER-1 CNAF computing center in Italy, and additional LCG computing resources at different TIER-2 sites in Italy, Spain, Germany and France, are accessed through a common interface. The goal of this project is to provide a portal that is easy to integrate into the existing CDF computing model, completely transparent to the user, and requiring a minimum amount of maintenance support from the CDF collaboration. In this paper we review the implementation of this new portal and its performance in the first months of usage. Eurogrid is based on the glideinWMS software, a glidein-based Workload Management System (WMS) that works on top of Condor. As the CDF CAF is based on Condor, the choice of the glideinWMS software was natural and the implementation seamless. Thanks to the pilot jobs, user-specific requirements and site resources are matched in a very efficient way, completely transparent to the users. Official since June 2011, Eurogrid effectively complements and supports CDF computing resources, offering an optimal solution for the future in terms of the manpower required for administration, support and development.
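
    The pilot-based matchmaking that makes this transparency possible can be pictured with a toy example: jobs declare requirements, pilots advertise what their site offers, and a match pairs them. All attribute names and the second site below are invented for the sketch; real glideinWMS matchmaking uses Condor ClassAds.

    ```python
    # Toy illustration of pilot-job matchmaking; not glideinWMS code.
    jobs = [
        {"id": 1, "needs_cdf_software": True, "min_memory_mb": 2000},
        {"id": 2, "needs_cdf_software": True, "min_memory_mb": 8000},
    ]
    pilots = [
        {"site": "CNAF",     "cdf_software": True, "memory_mb": 4000},
        {"site": "TIER2-ES", "cdf_software": True, "memory_mb": 16000},
    ]

    def match(job, pilot):
        """A job runs on a pilot only if every declared requirement is met."""
        return ((not job["needs_cdf_software"] or pilot["cdf_software"])
                and pilot["memory_mb"] >= job["min_memory_mb"])

    for job in jobs:
        site = next((p["site"] for p in pilots if match(job, p)), None)
        print(f"job {job['id']} -> {site}")   # job 1 -> CNAF, job 2 -> TIER2-ES
    ```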

  15. Evaluation of the user requirements processes for NASA terrestrial applications programs

    NASA Technical Reports Server (NTRS)

    1982-01-01

    To support the evolution of increasingly sound user requirements definition processes that would meet the broad range of NASA's terrestrial applications planning and management needs during the 1980's, the user requirements processes as they function in the real world at the senior and middle management levels were evaluated. Special attention was given to geologic mapping and domestic crop reporting to provide insight into problems associated with the development and management of user established conventional practices and data sources. An attempt was made to identify alternative NASA user interfaces that sustain strengths, alleviate weaknesses, maximize application to multiple problems, and simplify management cognizance. Some of the alternatives are outlined and evaluated. It is recommended that NASA have an identified organizational point of focus for consolidation and oversight of the user processes.

  16. New Users | Center for Cancer Research

    Cancer.gov

    New Users Becoming a Core Facilities User The following steps are applicable to anyone who would like to become a user of the CCR SAXS Core facilities. All users are required to follow the Core Facility User Policies.

  17. Ensembles of adaptive spatial filters increase BCI performance: an online evaluation

    NASA Astrophysics Data System (ADS)

    Sannelli, Claudia; Vidaurre, Carmen; Müller, Klaus-Robert; Blankertz, Benjamin

    2016-08-01

    Objective: In electroencephalographic (EEG) data, signals from distinct sources within the brain are widely spread by volume conduction and superimposed such that sensors receive mixtures of a multitude of signals. This reduction of spatial information strongly hampers single-trial analysis of EEG data as, for example, required for brain-computer interfacing (BCI) when using features from spontaneous brain rhythms. Spatial filtering techniques are therefore greatly needed to extract meaningful information from EEG. Our goal is to show, in online operation, that common spatial pattern patches (CSPP) are valuable to counteract this problem. Approach: Even though the effect of spatial mixing can be counteracted by spatial filters, there is a trade-off between performance and the requirement of calibration data. Laplacian derivations do not require calibration data at all, but their performance for single-trial classification is limited. Conversely, data-driven spatial filters, such as common spatial patterns (CSP), can lead to highly distinctive features; however, they require a considerable amount of training data. Recently, we showed in an offline analysis that CSPP can establish a valuable compromise. In this paper, we confirm these results in an online BCI study. In order to demonstrate the paramount feature that CSPP requires little training data, we used them in an adaptive setting with 20 participants and focused on users who did not have success with previous BCI approaches. Main results: The results of the study show that CSPP adapts faster and thereby allows users to achieve better feedback within a shorter time than previous approaches performed with Laplacian derivations and CSP filters. The success of the experiment highlights that CSPP has the potential to further reduce BCI inefficiency. Significance: CSPP are a valuable compromise between CSP and Laplacian filters. They allow users to attain better feedback within a shorter time and thus reduce BCI inefficiency to one-fourth in comparison to previous non-adaptive paradigms.
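
    For readers unfamiliar with CSP, the data-driven filters that CSPP patches are built from come out of a generalized eigenproblem on the two class-conditional covariance matrices. A minimal sketch on placeholder data:

    ```python
    # Minimal common spatial patterns (CSP) computation; random placeholder data.
    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(0)
    # Trials of band-passed EEG: (n_trials, n_channels, n_samples), per class.
    class1 = rng.standard_normal((40, 8, 200))
    class2 = rng.standard_normal((40, 8, 200))

    def mean_cov(trials):
        return np.mean([t @ t.T / t.shape[1] for t in trials], axis=0)

    S1, S2 = mean_cov(class1), mean_cov(class2)
    # Generalized eigenproblem: S1 w = lambda (S1 + S2) w, eigenvalues ascending.
    evals, evecs = eigh(S1, S1 + S2)
    # Filters at both ends of the spectrum maximize variance for one class while
    # minimizing it for the other; keep, e.g., three from each end.
    csp_filters = np.hstack([evecs[:, :3], evecs[:, -3:]])
    print(csp_filters.shape)   # (8, 6): six spatial filters over eight channels
    ```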

  18. Ensembles of adaptive spatial filters increase BCI performance: an online evaluation.

    PubMed

    Sannelli, Claudia; Vidaurre, Carmen; Müller, Klaus-Robert; Blankertz, Benjamin

    2016-08-01

    In electroencephalographic (EEG) data, signals from distinct sources within the brain are widely spread by volume conduction and superimposed such that sensors receive mixtures of a multitude of signals. This reduction of spatial information strongly hampers single-trial analysis of EEG data as, for example, required for brain-computer interfacing (BCI) when using features from spontaneous brain rhythms. Spatial filtering techniques are therefore greatly needed to extract meaningful information from EEG. Our goal is to show, in online operation, that common spatial pattern patches (CSPP) are valuable to counteract this problem. Even though the effect of spatial mixing can be counteracted by spatial filters, there is a trade-off between performance and the requirement of calibration data. Laplacian derivations do not require calibration data at all, but their performance for single-trial classification is limited. Conversely, data-driven spatial filters, such as common spatial patterns (CSP), can lead to highly distinctive features; however, they require a considerable amount of training data. Recently, we showed in an offline analysis that CSPP can establish a valuable compromise. In this paper, we confirm these results in an online BCI study. In order to demonstrate the paramount feature that CSPP requires little training data, we used them in an adaptive setting with 20 participants and focused on users who did not have success with previous BCI approaches. The results of the study show that CSPP adapts faster and thereby allows users to achieve better feedback within a shorter time than previous approaches performed with Laplacian derivations and CSP filters. The success of the experiment highlights that CSPP has the potential to further reduce BCI inefficiency. CSPP are a valuable compromise between CSP and Laplacian filters. They allow users to attain better feedback within a shorter time and thus reduce BCI inefficiency to one-fourth in comparison to previous non-adaptive paradigms.

  19. Navigation Architecture for a Space Mobile Network

    NASA Technical Reports Server (NTRS)

    Valdez, Jennifer E.; Ashman, Benjamin; Gramling, Cheryl; Heckler, Gregory W.; Carpenter, Russell

    2016-01-01

    The Tracking and Data Relay Satellite System (TDRSS) Augmentation Service for Satellites (TASS) is a proposed beacon service to provide a global, space based GPS augmentation service based on the NASA Global Differential GPS (GDGPS) System. The TASS signal will be tied to the GPS time system and usable as an additional ranging and Doppler radiometric source. Additionally, it will provide data vital to autonomous navigation in the near Earth regime, including space weather information, TDRS ephemerides, Earth Orientation Parameters (EOP), and forward commanding capability. TASS benefits include enhancing situational awareness, enabling increased autonomy, and providing near real-time command access for user platforms. As NASA Headquarters' Space Communication and Navigation Office (SCaN) begins to move away from a centralized network architecture and towards a Space Mobile Network (SMN) that allows for user initiated services, autonomous navigation will be a key part of such a system. This paper explores how a TASS beacon service enables the Space Mobile Networking paradigm, what a typical user platform would require, and provides an in-depth analysis of several navigation scenarios and operations concepts. This paper provides an overview of the TASS beacon and its role within the SMN and user community. Supporting navigation analysis is presented for two user mission scenarios: an Earth observing spacecraft in low earth orbit (LEO), and a highly elliptical spacecraft in a lunar resonance orbit. These diverse flight scenarios indicate the breadth of applicability of the TASS beacon for upcoming users within the current network architecture and in the SMN.

  20. An investigation of users' attitudes, requirements and willingness to use mobile phone-based interactive voice response systems for seeking healthcare in Ghana: a qualitative study.

    PubMed

    Brinkel, J; Dako-Gyeke, P; Krämer, A; May, J; Fobil, J N

    2017-03-01

    In implementing mobile health interventions, user requirements and willingness to use are among the most crucial concerns for the success of the investigation, and they have only rarely been examined in sub-Saharan Africa. This study aimed to specify the requirements of caregivers of children in order to use a symptom-based interactive voice response (IVR) system for seeking healthcare. This included (i) the investigation of attitudes towards mobile phone use and user experiences and (ii) the assessment of facilitators of and challenges to use of the IVR system. This is a population-based cross-sectional study. Four qualitative focus group discussions were conducted in peri-urban and rural towns in Shai Osudoku and Ga West districts, as well as in the Tema and Accra Metropolitan Assemblies. Participants included male and female caregivers of at least one child between 0 and 10 years of age. A qualitative content analysis was conducted for data analysis. Participants showed a positive attitude towards the use of mobile phones for seeking healthcare. While no previous experience in using IVR for health information was reported, the majority of participants stated that it offers a huge advantage for improvement in health performance. Barriers to IVR use included concerns about costs, lack of familiarity with the technology, social barriers such as lack of human interaction, and infrastructural challenges. Recommendations included the establishment of a toll-free number and training prior to use of the IVR system. This study suggests that caregivers in the socio-economic environment of Ghana are interested in and willing to use mobile phone-based IVR to receive health information for child healthcare. The identified user needs should be considered by health programme implementers and policy makers to help facilitate the development and implementation of IVR systems in the field of seeking healthcare. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  1. A collaborative design method to support integrated care. An ICT development method containing continuous user validation improves the entire care process and the individual work situation

    PubMed Central

    Scandurra, Isabella; Hägglund, Maria

    2009-01-01

    Introduction: Integrated care involves different professionals, belonging to different care provider organizations, and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose: To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method: Based on Human-Computer Interaction Science, and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user-validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of work situations occurring for different user groups, or professions, in integrated care. Results and conclusions: Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments, not only for current work; it also aimed to improve collaboration in future (ICT-supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].

  2. Perspectives of UV nowcasting to monitor personal pro-health outdoor activities.

    PubMed

    Krzyścin, Janusz W; Lesiak, Aleksandra; Narbutt, Joanna; Sobolewski, Piotr; Guzikowski, Jakub

    2018-07-01

    A nowcasting model for online monitoring of personal outdoor behaviour is proposed. It is envisaged to provide an effective e-tool for smartphone users. The model estimates the maximum duration of safe (erythema-free) outdoor activity. Moreover, there are options to estimate the duration of sunbathing needed to obtain an adequate amount of vitamin D3 and the doses necessary for antipsoriatic heliotherapy. The application requires the starting time of sunbathing and the user's phototype. At the beginning, the user is informed of the approximate duration of sunbathing required to reach the minimum erythemal dose, an adequate amount of vitamin D3, or the dose necessary for antipsoriatic heliotherapy. Every 20 minutes the application recalculates the remaining duration of sunbathing based on the UVI measured in the preceding 20 minutes. If the estimated remaining duration is <20 minutes, the user is informed that the sunbathing deadline is approaching. Finally, a warning signal is sent to stop sunbathing if the measured dose reaches the required dose. The proposed model is verified using data collected at two measuring sites during the warm period of 2017 (1 April-30 September) in large Polish cities (Warsaw and Lodz). The first instrument represents the UVI monitoring station. The information concerning sunbathing duration, which is sent to a remote user, is evaluated on the basis of UVI measurements collected by a second measuring unit at a distance of ~7 km and ~10 km for Warsaw and Lodz, respectively. Statistical analysis of the differences between sunbathing durations from the nowcasting model and observation shows that the model provides reliable doses received by users during outdoor activities in proximity (~10 km) to the UVI source site. A standard 24 h UVI forecast based on prognostic values of total ozone and cloudiness appears to be valid only for sunny days. Copyright © 2018 Elsevier B.V. All rights reserved.
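
    The dose bookkeeping behind such a model is simple to sketch. The conversion 1 UVI = 25 mW/m² of erythemally weighted irradiance is standard; the per-phototype minimum erythemal dose (MED) values below are commonly cited figures and may differ from the thresholds actually used in the paper.

    ```python
    # Back-of-the-envelope erythemal dose bookkeeping; MED values are commonly
    # cited figures (assumption), not necessarily the paper's own thresholds.
    MED_J_PER_M2 = {"I": 200.0, "II": 250.0, "III": 350.0, "IV": 450.0}

    def remaining_minutes(phototype, accumulated_dose, current_uvi):
        """Minutes of further exposure before reaching the MED at the current UVI."""
        irradiance = current_uvi * 0.025          # W/m^2 (1 UVI = 25 mW/m^2)
        left = MED_J_PER_M2[phototype] - accumulated_dose
        return max(left, 0.0) / irradiance / 60.0 if irradiance > 0 else float("inf")

    # Recompute every 20 minutes from the freshly measured UVI, as the model does.
    dose = 0.0
    for uvi in [5.0, 6.0, 7.0]:                   # measured UVI per 20-min interval
        dose += uvi * 0.025 * 20 * 60             # J/m^2 accrued in that interval
        print(f"dose={dose:.0f} J/m^2, ~{remaining_minutes('II', dose, uvi):.0f} min left")
    ```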

  3. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    NASA Astrophysics Data System (ADS)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. To save time and manpower and to minimize error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
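
    Concentration calculations of this kind typically follow the relative (comparator) method: the unknown follows from the ratio of mass-normalized peak areas between the sample and a co-irradiated standard. The sketch below shows that ratio with a simple quadrature uncertainty; it omits the decay, flux, and self-absorption corrections real software must handle, and all numbers are invented.

    ```python
    # Simplified comparator-method concentration with quadrature uncertainty.
    from math import sqrt

    def concentration(area_s, darea_s, mass_s,
                      area_ref, darea_ref, mass_ref, conc_ref):
        """Concentration and uncertainty from sample vs. standard peak areas."""
        c = conc_ref * (area_s / mass_s) / (area_ref / mass_ref)
        # Relative uncertainties of the two peak areas added in quadrature.
        rel = sqrt((darea_s / area_s) ** 2 + (darea_ref / area_ref) ** 2)
        return c, c * rel

    c, dc = concentration(12500, 180, 0.50, 43000, 350, 0.25, conc_ref=100.0)
    print(f"{c:.1f} +/- {dc:.1f} ppm")   # invented example values
    ```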

  4. Distributive On-line Processing, Visualization and Analysis System for Gridded Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Leptoukh, G.; Berrick, S.; Liu, Z.; Pham, L.; Rui, H.; Shen, S.; Teng, W.; Zhu, T.

    2004-01-01

    The ability to use data stored in the current Earth Observing System (EOS) archives for studying regional or global phenomena is highly dependent on having a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. This is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets that are usually of different formats, structures, and resolutions, for example, when preparing data for input into modeling systems. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has taken a major step towards meeting this challenge by developing an infrastructure with a Web interface that allows users to perform interactive analysis online without downloading any data: the GES-DISC Interactive Online Visualization and Analysis Infrastructure, or "Giovanni." Giovanni provides interactive online analysis tools to facilitate users' research. Several instances of this interface have been created to serve TRMM users, aerosol scientists, and ocean color and agriculture applications users. The first generation of these tools supports gridded data only. The user selects geophysical parameters, an area of interest, and a time period, and the system generates an output on screen in a matter of seconds. The currently available output options are: area plots, averaged or accumulated over any available data period for any rectangular area; time plots, i.e., time series averaged over any rectangular area; image views of any longitude-time and latitude-time cross sections; ASCII output for all plot types; and image animation for area plots. In the future, we will add correlation plots, GIS-compatible outputs, etc. This allows users to focus on data content (i.e., science parameters) and eliminates the need for expensive learning, development and processing tasks that would otherwise be redundantly incurred across an archive's user community. The current implementation utilizes the GrADS-DODS Server (GDS), a stable, secure data server that provides subsetting and analysis services across the Internet for any GrADS-readable dataset. The subsetting capability allows users to retrieve a specified temporal and/or spatial subdomain from a large dataset, eliminating the need to download everything simply to access a small relevant portion. The analysis capability allows users to retrieve the results of an operation applied to one or more datasets on the server. In our case, we use this approach to read pre-processed binary files and/or to read and extract the needed parts from HDF or HDF-EOS files. These subsets then serve as inputs to GrADS processing and analysis scripts. The infrastructure can be used in a wide variety of Earth science applications, such as climate and weather event study and monitoring, and modeling, and it can be easily configured for new applications.
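
    The server-side subsetting that GDS provides is what any OPeNDAP-aware client sees: slicing a remote variable transfers only the requested slab. A minimal sketch with the netCDF4 library follows; the URL and variable name are placeholders, not a live GES DISC endpoint.

    ```python
    # Server-side subsetting via OPeNDAP: only the requested slab crosses the
    # network. URL and variable name below are hypothetical placeholders.
    from netCDF4 import Dataset

    url = "http://example.gov/opendap/precip_monthly"   # placeholder OPeNDAP URL
    ds = Dataset(url)
    precip = ds.variables["precipitation"]

    # Slicing the remote variable fetches only this time/lat/lon subdomain.
    subset = precip[0:12, 40:80, 100:160]
    print(subset.mean())          # e.g., an area- and year-averaged value
    ds.close()
    ```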

  5. IAC level "O" program development

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1982-01-01

    The current status of the IAC development activity is summarized. The listed prototype software and documentation were delivered, and plans were detailed for development of the level 1 operational system. The planned end-product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of the required technical disciplines. The long-term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance and instrument optical performance) that function together with the IAC supporting software in an integrated and user-friendly manner; and a general framework whereby new analysis modules can readily be incorporated into IAC or be allowed to communicate with it.

  6. Forcing Interoperability: An Intentionally Fractured Approach

    NASA Astrophysics Data System (ADS)

    Gallaher, D. W.; Brodzik, M.; Scambos, T.; Stroeve, J.

    2008-12-01

    The NSIDC is attempting to rebuild a significant portion of its public-facing cyberinfrastructure to better meet the needs expressed by the cryospheric community. The project initially addresses a specific science need - understanding Greenland's contribution to global sea level rise through comparison and analysis of variables such as temperature, albedo, melt, ice velocity and surface elevation. This project will ultimately be expanded to cover most of NSIDC's cryospheric data. Like many organizations, we need to provide users with data discovery interfaces, collaboration tools and mapping services. Complicating this effort is the need to reduce the volume of raw data delivered to the user. Data growth, especially with time-series data, will overwhelm our software, processors and network like never before. We need to give users the ability to perform first-level analysis directly on our site. To accomplish this, users should be free to modify the behavior of these tools as well as incorporate their own tools and analyses to meet their needs. Rather than building one monolithic system, we have chosen to build three semi-independent systems. One team is building a data discovery and web-based distribution system, the second is building an advanced analysis and workflow system, and the third is building a customized web mapping service. These systems will use the same underlying data structures and services but will employ different technologies and teams, each with its own objectives, schedule and user interface. Obviously, we are adding complexity and risk to the overall project; however, this may be the best method to achieve interoperability, because the development teams will be required to build off each other's work. The teams will be forced to design with other users in mind, as opposed to treating interoperability as an afterthought, which is a tendency in monolithic systems. All three teams will take advantage of pre-existing software and standards whenever possible. We present this topic to stimulate discussion within the development, operational and research communities on how best to proceed.

  7. Understanding USGS user needs and Earth observing data use for decision making

    NASA Astrophysics Data System (ADS)

    Wu, Z.

    2016-12-01

    The US Geological Survey (USGS) initiated the Requirements, Capabilities and Analysis for Earth Observations (RCA-EO) project in the Land Remote Sensing (LRS) program, collaborating with the National Oceanic and Atmospheric Administration (NOAA) to jointly develop the supporting information infrastructure, the Earth Observation Requirements Evaluation System (EORES). RCA-EO enables us to collect information on current data products and projects across the USGS and to evaluate the impacts of Earth observation data from all sources, including spaceborne, airborne, and ground-based platforms. EORES allows users to query, filter, and analyze the usage and impacts of Earth observation data at different organizational levels within the bureau. We engaged over 500 subject matter experts and evaluated more than 1000 different Earth observing data sources and products. RCA-EO provides a comprehensive way to evaluate the impacts of Earth observing data on USGS mission areas and programs through a survey of 345 key USGS products and services. We paid special attention to user feedback about Earth observing data to inform decision making aimed at improving user satisfaction. We believe the approach and philosophy of RCA-EO can be applied in a much broader scope to derive comprehensive knowledge of Earth observing systems' impacts and usage and to inform data product development and remote sensing technology innovation.

  8. 14 CFR 1215.108 - Defining user service requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true Defining user service requirements. 1215.108 Section 1215.108 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION TRACKING AND DATA..., spacecraft design, operations planning, and other significant mission parameters. When these user evaluations...

  9. 14 CFR 1215.108 - Defining user service requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 5 2012-01-01 2012-01-01 false Defining user service requirements. 1215.108 Section 1215.108 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION TRACKING AND... services, spacecraft design, operations planning, and other significant mission parameters. When these user...

  10. Tooth segmentation system with intelligent editing for cephalometric analysis

    NASA Astrophysics Data System (ADS)

    Chen, Shoupu

    2015-03-01

    Cephalometric analysis is the study of the dental and skeletal relationships in the head, and it is used as an assessment and planning tool for improved orthodontic treatment of a patient. Conventional cephalometric analysis identifies bony and soft-tissue landmarks in 2D cephalometric radiographs in order to diagnose facial features and abnormalities prior to treatment, or to evaluate the progress of treatment. Recent studies in orthodontics indicate that there are persistent inaccuracies and inconsistencies in the results provided by conventional 2D cephalometric analysis. Plane geometry is clearly inappropriate for analyzing anatomical volumes and their growth; only a 3D analysis can address the three-dimensional anatomical maxillofacial complex, which requires computing inertia systems for individual or grouped digitally segmented teeth from an image volume of a patient's head. For the study of 3D cephalometric analysis, the current paper proposes a system for semi-automatically segmenting teeth from a cone beam computed tomography (CBCT) volume with two distinct features: an intelligent user-input interface for automatic background seed generation, and a graphics processing unit (GPU) acceleration mechanism for three-dimensional GrowCut volume segmentation. Results show a satisfying average DICE score of 0.92 when 15 novice users segmented a randomly sampled tooth set with the proposed system. The average GrowCut processing time is around one second per tooth, excluding user interaction time.
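
    The DICE score used for that evaluation is straightforward to compute from two binary label masks; a quick sketch on synthetic volumes:

    ```python
    # DICE coefficient: twice the overlap divided by the total size of two masks.
    import numpy as np

    def dice(mask_a, mask_b):
        a, b = mask_a.astype(bool), mask_b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    # Two nearly identical synthetic tooth masks in a small volume.
    auto  = np.zeros((64, 64, 64), dtype=bool); auto[20:40, 20:40, 20:40] = True
    truth = np.zeros((64, 64, 64), dtype=bool); truth[22:40, 20:40, 20:40] = True
    print(round(dice(auto, truth), 3))   # close to 1 for near-identical masks
    ```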

  11. PCIPS 2.0: Powerful multiprofile image processing implemented on PCs

    NASA Technical Reports Server (NTRS)

    Smirnov, O. M.; Piskunov, N. E.

    1992-01-01

    Over the years, the processing power of personal computers has steadily increased. Now, 386- and 486-based PCs are fast enough for many image processing applications, and inexpensive enough even for amateur astronomers. PCIPS is an image processing system based on these platforms that was designed to satisfy a broad range of data analysis needs while requiring minimum hardware and providing maximum expandability. It will run (albeit at a slow pace) even on an 80286 with 640K memory, but will take full advantage of larger memory and faster CPUs. Because the actual image processing is performed by external modules, the system can be easily upgraded by the user for all sorts of scientific data analysis. PCIPS supports large-format 1D and 2D images in any numeric type from 8-bit integer to 64-bit floating point. The images can be displayed, overlaid, and printed, and any part of the data can be examined via an intuitive graphical user interface that employs buttons, pop-up menus, and a mouse. PCIPS automatically converts images between different types and sizes to satisfy the requirements of various applications. PCIPS features an API that lets users develop custom applications in C or FORTRAN. While doing so, a programmer can concentrate on the actual data processing, because PCIPS assumes responsibility for accessing images and interacting with the user. This also ensures that all applications, even custom ones, have a consistent and user-friendly interface. The API is compatible with factory programming, a metaphor for constructing image processing procedures that will be implemented in future versions of the system. Several application packages have been created under PCIPS. The basic package includes elementary arithmetic and statistics, geometric transformations, and import/export in various formats (FITS, binary, ASCII, and GIF). The CCD processing package and the spectral analysis package were successfully used to reduce spectra from the Nordic Telescope at La Palma. A photometry package is also available, and other packages are being developed. A multitasking version of PCIPS that utilizes the factory programming concept is currently under development. This version will remain compatible (at the source code level) with existing application packages and custom applications.

  12. Adaptive interface for personalizing information seeking.

    PubMed

    Narayanan, S; Koppaka, Lavanya; Edala, Narasimha; Loritz, Don; Daley, Raymond

    2004-12-01

    An adaptive interface autonomously adjusts its display and available actions to the current goals and abilities of the user by assessing user status, system task, and context. Knowledge content adaptability is needed for knowledge acquisition and refinement tasks. In the case of knowledge content adaptability, the requirements of interface design focus on the elicitation of information from the user and the refinement of information based on patterns of interaction. In such cases, the emphasis of adaptability is on facilitating information search and knowledge discovery. In this article, we present research on adaptive interfaces that facilitate personalized information seeking from a large data warehouse. The resulting proof-of-concept system, called the source recommendation system (SRS), assists users in locating and navigating data sources in the repository. Based on the initial user query and an analysis of the content of the search results, the SRS system generates a profile of the user tailored to the individual's context during information seeking. The user profiles are refined successively and are used to guide the user progressively to the appropriate set of sources within the knowledge base. The SRS system is implemented as an Internet browser plug-in to provide a seamless, unobtrusive, and personalized experience to users during the information search process. The rationale behind our approach, the system design, an empirical evaluation, and implications for research on adaptive interfaces are described in this paper.

  13. SacLab: A toolbox for saccade analysis to increase usability of eye tracking systems in clinical ophthalmology practice.

    PubMed

    Cercenelli, Laura; Tiberi, Guido; Corazza, Ivan; Giannaccare, Giuseppe; Fresina, Michela; Marcelli, Emanuela

    2017-01-01

    Many open source software packages have recently been developed to expand the usability of eye tracking systems for studying oculomotor behavior, but none of these is specifically designed to encompass all the main functions required for creating eye tracking tests and providing automatic analysis of saccadic eye movements. The aim of this study is to introduce SacLab, an intuitive, freely available MATLAB toolbox based on Graphical User Interfaces (GUIs) that we have developed to increase the usability of the ViewPoint EyeTracker (Arrington Research, Scottsdale, AZ, USA) in clinical ophthalmology practice. SacLab consists of four processing modules that enable the user to easily create visual stimuli tests (Test Designer), record saccadic eye movements (Data Recorder), analyze the recorded data to automatically extract saccadic parameters of clinical interest (Data Analyzer), and provide an aggregate analysis from multiple eye movement recordings (Saccade Analyzer), without requiring any programming effort by the user. A demo application of SacLab to carry out eye tracking tests for the analysis of horizontal saccades is reported. We tested the usability of the SacLab toolbox with three ophthalmologists who had no programming experience; the ophthalmologists were briefly trained in the use of the SacLab GUIs and were asked to perform the demo application. The toolbox received enthusiastic feedback from all the clinicians in terms of intuitiveness, ease of use and flexibility. Test creation and data processing were accomplished in 52±21 s and 46±19 s, respectively, using the SacLab GUIs. SacLab may represent a useful tool to ease the application of the ViewPoint EyeTracker system in routine clinical ophthalmology. Copyright © 2016 Elsevier Ltd. All rights reserved.
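
    Saccadic parameters of the kind Data Analyzer extracts are commonly obtained by differentiating gaze position and applying a velocity threshold. Below is a sketch of that standard approach (in Python for illustration; SacLab itself is MATLAB) on an idealized step-like gaze trace; the sampling rate and the 30 deg/s criterion are typical choices, not SacLab's actual settings.

    ```python
    # Velocity-threshold saccade detection, a standard technique; parameters assumed.
    import numpy as np

    def detect_saccades(gaze_deg, fs=220.0, vel_thresh=30.0):
        """Return (onset_s, offset_s, amplitude_deg, peak_velocity_dps) per saccade."""
        vel = np.abs(np.gradient(gaze_deg) * fs)       # deg/s
        fast = vel > vel_thresh
        edges = np.flatnonzero(np.diff(fast.astype(int)))
        out = []
        for start, stop in zip(edges[::2], edges[1::2]):
            out.append((start / fs, stop / fs,
                        abs(gaze_deg[stop] - gaze_deg[start]),
                        vel[start:stop + 1].max()))
        return out

    t = np.arange(0, 1, 1 / 220.0)
    gaze = np.where(t < 0.5, 0.0, 10.0)                # idealized 10-degree saccade
    print(detect_saccades(gaze))
    ```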

  14. Modeling survival of juvenile salmon during downriver migration in the Columbia River on a microcomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peloquin, R.A.; McKenzie, D.H.

    1994-10-01

    A compartmental model has been implemented on a microcomputer as an aid in analyzing alternative solutions to a problem. The model, entitled Smolt Survival Simulator (SSS), simulates the survival of juvenile salmon during their downstream migration and passage of hydroelectric dams in the Columbia River. The model is designed to function in a workshop environment where resource managers and fisheries biologists can study alternative measures that may potentially increase juvenile anadromous fish survival during downriver migration. The potential application of the model has placed several requirements on the implementing software. It must be available for use in workshop settings. The software must be easy to use with minimal computer knowledge. Scenarios must be created and executed quickly and efficiently. Results must be immediately available. Because of these requirements, software design emphasis was placed on the user interface. The discussion focuses on methods used in the development of the SSS software user interface. These methods should reduce user stress and allow thorough and easy parameter modification.
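
    The SSS model itself is not specified in the abstract; as a hedged illustration of a generic compartmental survival calculation of this kind, per-dam survival can be computed as a mixture over passage routes (spill, bypass, turbine) and multiplied across river reaches. All field names and parameter values below are hypothetical.

        def project_survival(reaches):
            """Compartmental survival through a series of river reaches and dams.

            reaches: one dict per dam with reach survival (s_reach) and
            route proportions/survivals for spill, bypass, and turbine passage.
            Returns the overall downriver survival probability.
            """
            survival = 1.0
            for r in reaches:
                dam = (r["p_spill"] * r["s_spill"]
                       + r["p_bypass"] * r["s_bypass"]
                       + r["p_turbine"] * r["s_turbine"])
                survival *= r["s_reach"] * dam
            return survival

        # Hypothetical two-dam scenario, for illustration only.
        scenario = [
            {"s_reach": 0.95, "p_spill": 0.4, "s_spill": 0.98,
             "p_bypass": 0.3, "s_bypass": 0.96, "p_turbine": 0.3, "s_turbine": 0.90},
            {"s_reach": 0.93, "p_spill": 0.5, "s_spill": 0.98,
             "p_bypass": 0.2, "s_bypass": 0.96, "p_turbine": 0.3, "s_turbine": 0.90},
        ]
        print(f"Overall survival: {project_survival(scenario):.3f}")

    Because each scenario is a plain list of parameters, alternative measures (e.g., increased spill) can be compared in a workshop setting by editing values and re-running, which matches the quick-turnaround requirement the abstract emphasizes.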

  15. Development of CPR security using impact analysis.

    PubMed Central

    Salazar-Kish, J.; Tate, D.; Hall, P. D.; Homa, K.

    2000-01-01

    The HIPAA regulations will require that institutions ensure the prevention of unauthorized access to electronically stored or transmitted patient records. This paper discusses a process for analyzing the impact of security mechanisms on users of computerized patient records through "behind the scenes" electronic access audits. In this way, those impacts can be assessed and refined to an acceptable standard prior to implementation. Through an iterative process of design and evaluation, we develop security algorithms that will protect electronic health information from improper access, alteration or loss, while minimally affecting the flow of work of the user population as a whole. PMID:11079984
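
    A minimal sketch of the "behind the scenes" audit idea: replay a log of observed accesses against a candidate access rule, counting legitimate accesses that would be blocked (workflow impact) versus improper accesses that would be prevented. The event fields and example rule are assumptions, not the authors' schema.

        def assess_rule_impact(audit_log, rule):
            """Replay an access audit log against a candidate security rule.

            audit_log: iterable of event dicts with a boolean 'legitimate' flag
            (assumed schema); rule: function(event) -> True if access is permitted.
            """
            impact = {"blocked_legitimate": 0,   # workflow cost of the rule
                      "blocked_improper": 0,     # protection the rule adds
                      "allowed_improper": 0}     # exposure the rule still misses
            for event in audit_log:
                permitted = rule(event)
                if not permitted and event["legitimate"]:
                    impact["blocked_legitimate"] += 1
                elif not permitted:
                    impact["blocked_improper"] += 1
                elif not event["legitimate"]:
                    impact["allowed_improper"] += 1
            return impact

        # Hypothetical candidate rule: clinicians may open charts only on their own unit.
        same_unit_rule = lambda e: e["role"] != "clinician" or e["user_unit"] == e["patient_unit"]

    Iterating on the rule until blocked_legitimate is acceptably low while blocked_improper stays high is one way to realize the design-and-evaluate loop the abstract describes.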

  16. TDRSS telecommunications system, PN code analysis

    NASA Technical Reports Server (NTRS)

    Dixon, R.; Gold, R.; Kaiser, F.

    1976-01-01

    The pseudo noise (PN) codes required to support the TDRSS telecommunications services are analyzed and the impact of alternate coding techniques on the user transponder equipment, the TDRSS equipment, and all factors that contribute to the acquisition and performance of these telecommunication services is assessed. Possible alternatives to the currently proposed hybrid FH/direct sequence acquisition procedures are considered and compared relative to acquisition time, implementation complexity, operational reliability, and cost. The hybrid FH/direct sequence technique is analyzed and rejected in favor of a recommended approach which minimizes acquisition time and user transponder complexity while maximizing probability of acquisition and overall link reliability.
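
    For readers unfamiliar with PN codes: a maximal-length sequence can be generated with a linear-feedback shift register, and its sharp periodic autocorrelation is what makes acquisition-time analysis tractable. The degree-5 register below is an illustrative toy, not an actual TDRSS code.

        def lfsr_pn_sequence(taps, state, length):
            """Fibonacci LFSR; taps are 1-indexed stages XORed into the feedback."""
            reg, seq = list(state), []
            for _ in range(length):
                seq.append(reg[-1])                    # output is the last stage
                feedback = 0
                for tap in taps:
                    feedback ^= reg[tap - 1]
                reg = [feedback] + reg[:-1]            # shift right, insert feedback
            return seq

        def periodic_autocorrelation(seq, shift):
            """Normalized autocorrelation of the +/-1-mapped sequence."""
            s = [1 if b else -1 for b in seq]
            n = len(s)
            return sum(s[i] * s[(i + shift) % n] for i in range(n)) / n

        # Degree-5 m-sequence: period 31, off-peak autocorrelation -1/31.
        pn = lfsr_pn_sequence(taps=[5, 3], state=[1, 0, 0, 1, 0], length=31)
        print(periodic_autocorrelation(pn, 0), periodic_autocorrelation(pn, 7))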

  17. Federal Emergency Management Information System (FEMIS) system administration guide. Version 1.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burford, M.J.; Burnett, R.A.; Curtis, L.M.

    The Federal Emergency Management Information System (FEMIS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Chemical Biological Defense Command. The FEMIS System Administration Guide defines FEMIS hardware and software requirements and gives instructions for installing the FEMIS system package. System administrators, database administrators, and general users can use this guide to install, configure, and maintain the FEMIS client software package. This document provides a description of the FEMIS environment; distribution media; data, communications, and electronic mail servers; user workstations; and system management.

  18. A study of Minnesota land and water resources using remote sensing, volume 13

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Progress in the use of LANDSAT data to classify wetlands in the Upper Mississippi River Valley and efforts to evaluate stress in corn and soybean crops are described. Satellite remote sensing data were used to measure particle concentrations in Lake Superior, and several different kinds of remote sensing data were synergistically combined in order to identify near-surface bedrock in Minnesota. Data analysis techniques which separate those activities requiring extensive computing from those involving a great deal of user interaction were developed to allow the latter to be done in the user's office or in the field.

  19. Design analysis tracking and data relay satellite simulation system

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The design and development of the equipment necessary to simulate the S-band multiple access link between user spacecraft, the Tracking and Data Relay Satellite, and a ground control terminal are discussed. The core of the S-band multiple access concept is the use of an Adaptive Ground Implemented Phased Array. The array contains thirty channels and provides the multiplexing and demultiplexing equipment required to demonstrate the ground implemented beam forming feature. The system provided will make it possible to demonstrate the performance of a desired user and ten interfering sources attempting to pass data through the multiple access system.
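
    As a hedged sketch of ground-implemented beamforming (not the AGIPA design itself), weights for a thirty-element array can be chosen to give unit gain toward the desired user and nulls toward ten interfering sources by solving the steering-vector constraints in the minimum-norm sense; the element geometry and angles below are illustrative assumptions.

        import numpy as np

        def steering_vector(theta_deg, n_elements, spacing=0.5):
            """Uniform linear array response to a plane wave from theta_deg.
            spacing is element separation in wavelengths."""
            phase = 2 * np.pi * spacing * np.arange(n_elements)
            return np.exp(1j * phase * np.sin(np.radians(theta_deg)))

        def beamforming_weights(desired_deg, interferer_degs, n_elements=30):
            """Minimum-norm weights: unit response to the desired user, nulls on interferers."""
            angles = [desired_deg] + list(interferer_degs)
            A = np.column_stack([steering_vector(a, n_elements) for a in angles])
            g = np.zeros(len(angles), dtype=complex)
            g[0] = 1.0
            # Solve A^H w = g for the minimum-norm w (system is underdetermined).
            w, *_ = np.linalg.lstsq(A.conj().T, g, rcond=None)
            return w

        # Desired user at 10 deg, ten interfering sources elsewhere.
        w = beamforming_weights(10, [-70, -55, -40, -25, -10, 25, 35, 45, 55, 70])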

  20. Public Service Communications Satellite User Requirements Workshop

    NASA Technical Reports Server (NTRS)

    Wolff, E. A.

    1977-01-01

    Information on user requirements for public service communications was acquired to provide the basis of a study to determine the optimum satellite system to satisfy those requirements. The concept for such a system is described. Topics discussed include requirements for data and message services, elementary and secondary education, extension and continuing education, environmental communications, library services, medical education, medical services, public broadcasting, public safety, religious applications, state and local communications, and voluntary services. Information was also obtained on procedures for making the transfer to commercial services.
