AQBE — QBE Style Queries for Archetyped Data
NASA Astrophysics Data System (ADS)
Sachdeva, Shelly; Yaginuma, Daigo; Chu, Wanming; Bhalla, Subhash
Large-scale adoption of electronic healthcare applications requires semantic interoperability. Recent proposals describe an advanced (multi-level) DBMS architecture for repository services for patients' health records. These proposals also require query interfaces at multiple levels, including a level suited to semi-skilled users. In this regard, this study examines a high-level user interface for querying the new form of standardized Electronic Health Records system. It proposes a step-by-step graphical query interface that allows semi-skilled users to write queries. Its aim is to decrease user effort and communication ambiguities, and to increase user friendliness.
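As a hedged illustration of the query-by-example principle behind such an interface (the field names and output query format below are invented for illustration, not taken from AQBE), a form-to-query translation might look like:

```python
# Minimal illustration of the query-by-example (QBE) idea: the user fills in a
# skeleton record instead of writing a query-language statement, and the interface
# translates the filled cells into a formal query.  Field names such as
# "diagnosis" and "age" are hypothetical examples, not AQBE's archetype schema.

def qbe_to_query(table, example):
    """Translate a QBE-style example record into a simple SQL-like string."""
    conditions = [f"{field} {cond}" for field, cond in example.items() if cond]
    where = " AND ".join(conditions) if conditions else "1=1"
    return f"SELECT * FROM {table} WHERE {where}"

# The user only fills the cells they care about; empty cells are ignored.
form = {"diagnosis": "= 'hypertension'", "age": "> 60", "gender": ""}
print(qbe_to_query("health_records", form))
# SELECT * FROM health_records WHERE diagnosis = 'hypertension' AND age > 60
```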
Railroad track inspection interface demonstration : final report.
DOT National Transportation Integrated Search
2016-01-01
This project developed a track data user interface utilizing the Google Glass optical display device. The interface allows the user to recall data stored remotely and view the data on the Google Glass. The technical effort required developing a com...
Mental Effort in Binary Categorization Aided by Binary Cues
ERIC Educational Resources Information Center
Botzer, Assaf; Meyer, Joachim; Parmet, Yisrael
2013-01-01
Binary cueing systems assist in many tasks, often alerting people about potential hazards (such as alarms and alerts). We investigate whether cues, besides possibly improving decision accuracy, also affect the effort users invest in tasks and whether the required effort in tasks affects the responses to cues. We developed a novel experimental tool…
Website Redesign: A Case Study.
Wu, Jin; Brown, Janis F
2016-01-01
A library website redesign is a complicated and at times arduous task, requiring many different steps including determining user needs, analyzing past user behavior, examining other websites, defining design preferences, testing, marketing, and launching the site. Many different types of expertise are required over the entire process. Lessons learned from the Norris Medical Library's experience with the redesign effort may be useful to others undertaking a similar project.
Usability assessment of an electronic health record in a comprehensive dental clinic.
Suebnukarn, Siriwan; Rittipakorn, Pawornwan; Thongyoi, Budsara; Boonpitak, Kwanwong; Wongsapai, Mansuang; Pakdeesan, Panu
2013-12-01
In this paper we present the development and usability of an electronic health record (EHR) system in a comprehensive dental clinic. The graphical user interface of the system was designed with cognitive ergonomics in mind. Cognitive task analysis was used to evaluate the user interface of the EHR by identifying all sub-tasks, classifying them into mental or physical operators, and predicting the execution time required to perform a given task. We randomly selected 30 cases that had oral examinations for routine clinical care in a comprehensive dental clinic. The results were based on the analysis of 4 prototypical tasks performed by ten EHR users. The results showed that on average a user needed to go through 27 steps to complete all tasks for one case. To perform all 4 tasks for the 30 cases, users spent about 91 min (independent of system response time) on data entry, of which 51.8 min were spent on the more effortful mental operators. In conclusion, the user interface can be improved by reducing the proportion of mental effort required for the tasks.
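The operator-based time prediction described above is in the spirit of keystroke-level modelling; a minimal sketch, assuming illustrative per-operator times rather than the study's measured values:

```python
# Hedged sketch of operator-based task-time prediction in the spirit of the
# keystroke-level model: each sub-task step is classified as a mental or physical
# operator and the total time is the sum of per-operator estimates.
# The per-operator times below are illustrative placeholders, not the study's values.

OPERATOR_TIME_S = {"mental": 1.35, "keystroke": 0.28, "point": 1.10, "click": 0.20}

def predict_task_time(steps):
    """steps: list of operator names, e.g. ['mental', 'point', 'click', ...]"""
    return sum(OPERATOR_TIME_S[op] for op in steps)

charting_task = ["mental", "point", "click", "mental"] + ["keystroke"] * 8
total = predict_task_time(charting_task)
mental_share = sum(OPERATOR_TIME_S[op] for op in charting_task if op == "mental") / total
print(f"predicted time: {total:.1f} s, mental share: {mental_share:.0%}")
```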
Nelson, Scott D; Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R
2016-01-01
Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included 1) subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes; 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; and 6) a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system.
Education of MIS Users: One Hospital's Experience
Jacobs, Patt
1982-01-01
Dr. Stanley Jacobs has identified at least five factors which impact the amount of effort required to implement a hospital-based computer system. One of these factors is clearly the users of the system. This paper focuses upon the implementation process at St. Vincent Hospital and Medical Center (SVH&MC), highlighting the user education program developed by the hospital DP staff.
Lightweight Adaptation of Classifiers to Users and Contexts: Trends of the Emerging Domain
Vildjiounaite, Elena; Gimel'farb, Georgy; Kyllönen, Vesa; Peltola, Johannes
2015-01-01
Intelligent computer applications need to adapt their behaviour to contexts and users, but conventional classifier adaptation methods require long data collection and/or training times. Therefore classifier adaptation is often performed as follows: at design time, application developers define typical usage contexts and provide reasoning models for each of these contexts, and at runtime an appropriate model is selected from the available ones. Typically, the definition of usage contexts and reasoning models relies heavily on domain knowledge. However, in practice many applications are used in such diverse situations that no developer can predict them all and collect adequate training and test databases for each situation. Such applications have to adapt to a new user or unknown context at runtime just from interaction with the user, preferably in fairly lightweight ways, that is, requiring limited user effort to collect training data and limited time to perform the adaptation. This paper analyses adaptation trends in several emerging domains and outlines promising ideas proposed for making multimodal classifiers user-specific and context-specific without significant user effort, detailed domain knowledge, and/or complete retraining of the classifiers. Based on this analysis, the paper identifies important application characteristics and presents guidelines for considering these characteristics in adaptation design. PMID:26473165
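As one hedged example of the lightweight adaptation pattern this survey discusses, a pre-trained classifier can be updated incrementally from a handful of user-labelled samples instead of being retrained from scratch (an illustrative pattern using scikit-learn's partial_fit, not a method from the paper):

```python
# Illustrative sketch of lightweight user adaptation: start from a classifier
# trained on generic data, then update it incrementally with a few samples labeled
# by the new user, instead of collecting a large dataset and retraining.
# Not the paper's method; just one common incremental-learning pattern.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
X_generic = rng.normal(size=(500, 4))
y_generic = (X_generic[:, 0] > 0).astype(int)

clf = SGDClassifier(random_state=0)
clf.partial_fit(X_generic, y_generic, classes=[0, 1])   # "design-time" model

# A new user's data follow a slightly shifted distribution.
X_user = rng.normal(loc=0.7, size=(20, 4))
y_user = (X_user[:, 1] > 0.7).astype(int)

clf.partial_fit(X_user, y_user)   # lightweight adaptation from a few labeled samples
print("accuracy on the user's data:", clf.score(X_user, y_user))
```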
Development and analysis of SCR requirements tables for system scenarios
NASA Technical Reports Server (NTRS)
Callahan, John R.; Morrison, Jeffery L.
1995-01-01
We describe the use of scenarios to develop and refine requirement tables for parts of the Earth Observing System Data and Information System (EOSDIS). The National Aeronautics and Space Administration (NASA) is developing EOSDIS as part of its Mission-To-Planet-Earth (MTPE) project to accept instrument/platform observation requests from end-user scientists, schedule and perform requested observations of the Earth from space, collect and process the observed data, and distribute data to scientists and archives. Current requirements for the system are managed with tools that allow developers to trace the relationships between requirements and other development artifacts, including other requirements. In addition, the user community (e.g., earth and atmospheric scientists), in conjunction with NASA, has generated scenarios describing the actions of EOSDIS subsystems in response to user requests and other system activities. As part of a research effort in verification and validation techniques, this paper describes our efforts to develop requirements tables from these scenarios for the EOSDIS Core System (ECS). The tables specify event-driven mode transitions based on techniques developed by the Naval Research Laboratory's (NRL) Software Cost Reduction (SCR) project. The SCR approach has proven effective in specifying requirements for large systems in an unambiguous, terse format that enhances identification of incomplete and inconsistent requirements. We describe the development of SCR tables from user scenarios and identify the strengths and weaknesses of our approach in contrast to the requirements tracing approach. We also evaluate the capabilities of both approaches to respond to the volatility of requirements in large, complex systems.
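A hedged sketch of the SCR-style tabular format: an event-driven mode transition table encoded as data, with a lookup that exposes incompleteness. Mode and event names are hypothetical, not drawn from the EOSDIS requirements:

```python
# Hedged sketch of an SCR-style mode transition table: keys are
# (current_mode, event) pairs, values are next modes.  Mode and event names
# are hypothetical, not taken from the EOSDIS/ECS requirements.
TRANSITIONS = {
    ("Idle",       "@T(request_received)"):  "Scheduling",
    ("Scheduling", "@T(observation_start)"): "Observing",
    ("Observing",  "@T(data_downlinked)"):   "Processing",
    ("Processing", "@T(archive_complete)"):  "Idle",
}

def next_mode(mode, event):
    """Return the next mode, or report an incompleteness in the table."""
    try:
        return TRANSITIONS[(mode, event)]
    except KeyError:
        raise ValueError(f"Table is incomplete: no transition for {mode!r} on {event!r}")

print(next_mode("Idle", "@T(request_received)"))   # Scheduling
# next_mode("Observing", "@T(request_received)")   # would flag a gap in the table
```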
Low-Fatigue Hand Controller For Remote Manipulator
NASA Technical Reports Server (NTRS)
Maclaren, Brice; Mcmurray, Gary; Lipkin, Harvey
1993-01-01
Universal master controller used in brace mode, in which user's forearm rests atop upper (forearm) module. Alternatively, user manipulates hand controller in side mode, which gives greater latitude for motion but requires more muscular effort. Controller provides six degrees of freedom and reflects, back to user, scaled versions of forces experienced by manipulator. Manipulator designed to condense work space into user's natural work volume. Operated by both right-handed and left-handed users. Does not interfere with user's natural movements or obstruct line of sight. Controller compact and portable.
Wildlife monitoring program plan
NASA Technical Reports Server (NTRS)
Sebesta, P.; Arno, R.
1979-01-01
A plan for integrating the various requirements for wildlife monitoring with modern aerospace technology is presented. This plan is responsive to user needs, recognizes legal requirements, and is based on an evolutionary growth from domestic and larger animals to smaller, scarcer, and more remote species. The basis for animal study selection was drawn from the 1973 Santa Cruz Summer Study on Wildlife Monitoring. As techniques are developed, the monitoring and management tasks will be interfaced with, and eventually operated by, the user agencies. Field efforts using aircraft and satellites will be supplemented by laboratory investigations. Sixty percent of the effort will be in hardware research and development (satellite technology, microminiaturization), and the rest in gathering and interpreting data.
NASA Technical Reports Server (NTRS)
Kiusalaas, J.; Reddy, G. B.
1977-01-01
A finite element program is presented for computer-automated, minimum-weight design of elastic structures with constraints on stresses (including local instability criteria) and displacements. Volume 1 of the report contains the theoretical and user's manual for the program. Sample problems and the listing of the program are included in Volumes 2 and 3. The element subroutines are organized so as to facilitate additions and changes by the user. As a result, a relatively minor programming effort would be required to turn DESAP 1 into a special-purpose program that handles the user's specific design requirements and failure criteria.
System Engineering Concept Demonstration, Effort Summary. Volume 1
1992-12-01
involve only the system software, user frameworks, and user tools (Catalyst, external computer systems). The paper discusses the definition, design, test, and evaluation; the operational concept; and the analysis, synthesis, optimization, and conceptual design of Catalyst. This approach will allow system engineering practitioners to recognize and tailor the model. The conceptual requirements for the Process Model...
Rapid Prototyping of Hydrologic Model Interfaces with IPython
NASA Astrophysics Data System (ADS)
Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.
2014-12-01
A significant gulf still exists between the state of practice and state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment in project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site and/or process specific. As a result, we end up with many, customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site and application-specific user interfaces. We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.
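A minimal sketch of such a tailored notebook interface, assuming a Jupyter/IPython environment with ipywidgets installed; the model call and parameter names are illustrative placeholders, not taken from the applications above:

```python
# Minimal sketch of a tailored notebook interface: expose only the handful of model
# parameters relevant to one site/application, and rerun the analysis whenever the
# user moves a slider.  Assumes a Jupyter/IPython notebook with ipywidgets available;
# the "model" and its parameters are illustrative placeholders, not from the paper.
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import interact, FloatSlider

def run_levee_seepage(conductivity=1e-5, head_difference=2.0):
    """Placeholder stand-in for a call into the real computational model."""
    x = np.linspace(0.0, 50.0, 200)
    seepage = head_difference * conductivity * np.exp(-x / 20.0)
    plt.plot(x, seepage)
    plt.xlabel("distance along levee (m)")
    plt.ylabel("seepage flux (m/s)")
    plt.show()

interact(run_levee_seepage,
         conductivity=FloatSlider(min=1e-6, max=1e-4, step=1e-6, value=1e-5,
                                  readout_format=".1e"),
         head_difference=FloatSlider(min=0.5, max=5.0, step=0.1, value=2.0))
```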
An Exploratory Survey of Information Requirements for Instrument Approach Charts
DOT National Transportation Integrated Search
1995-03-01
This report documents a user-centered survey and interview effort conducted to analyze the information content of current Instrument Approach Plates (IAP). In the pilot opinion survey of approach chart information requirements, respondents indica...
The study on knowledge transferring incentive for information system requirement development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yang
2015-03-10
Information system requirements development is a process of sharing and transferring users' knowledge. However, developing tacit requirements is a major problem in this process, because such knowledge is difficult to encode, express, and communicate. Knowledge fusion and cooperative effort are needed to uncover tacit requirements. Against this background, our paper tries to identify the rules governing the dynamic evolution of effort by software developers and users by building an evolutionary game model under an incentive system, and provides an in-depth discussion at the end of the paper.
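The abstract does not give the game-theoretic details; as a hedged illustration, a standard two-population replicator dynamic for the shares of users and developers exerting effort takes the form below, with payoff entries determined by the incentive scheme (the notation is assumed, not the paper's):

```latex
% Hedged illustration: standard two-population replicator dynamics for the
% proportions x (users exerting knowledge-transfer effort) and y (developers
% exerting elicitation effort); the payoff entries a_ij and b_ij would come from
% the incentive scheme.  Notation is assumed, not taken from the paper.
\begin{align}
  \dot{x} &= x(1-x)\,\bigl[(a_{11}-a_{21})\,y + (a_{12}-a_{22})(1-y)\bigr],\\
  \dot{y} &= y(1-y)\,\bigl[(b_{11}-b_{12})\,x + (b_{21}-b_{22})(1-x)\bigr].
\end{align}
```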
Yovcheva, Zornitza; van Elzakker, Corné P J M; Köbben, Barend
2013-11-01
Web-based tools developed in the last couple of years offer unique opportunities to effectively support scientists in their effort to collaborate. Communication among environmental researchers often involves not only work with geographical (spatial), but also with temporal data and information. Literature still provides limited documentation when it comes to user requirements for effective geo-collaborative work with spatio-temporal data. To start filling this gap, our study adopted a User-Centered Design approach and first explored the user requirements of environmental researchers working on distributed research projects for collaborative dissemination, exchange and work with spatio-temporal data. Our results show that system design will be mainly influenced by the nature and type of data users work with. From the end-users' perspective, optimal conversion of huge files of spatio-temporal data for further dissemination, accuracy of conversion, organization of content and security have a key role for effective geo-collaboration. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Leroy, Gondy; Xu, Jennifer; Chung, Wingyan; Eggers, Shauna; Chen, Hsinchun
2007-01-01
Retrieving sufficient relevant information online is difficult for many people because they use too few keywords to search and search engines do not provide many support tools. To further complicate the search, users often ignore support tools when available. Our goal is to evaluate in a realistic setting when users use support tools and how they perceive these tools. We compared three medical search engines with support tools that require more or less effort from users to form a query and evaluate results. We carried out an end user study with 23 users who were asked to find information, i.e., subtopics and supporting abstracts, for a given theme. We used a balanced within-subjects design and report on the effectiveness, efficiency and usability of the support tools from the end user perspective. We found significant differences in efficiency but did not find significant differences in effectiveness between the three search engines. Dynamic user support tools requiring less effort led to higher efficiency. Fewer searches were needed and more documents were found per search when both query reformulation and result review tools dynamically adjust to the user query. The query reformulation tool that provided a long list of keywords, dynamically adjusted to the user query, was used most often and led to more subtopics. As hypothesized, the dynamic result review tools were used more often and led to more subtopics than static ones. These results were corroborated by the usability questionnaires, which showed that support tools that dynamically optimize output were preferred.
ERIC Educational Resources Information Center
Crawford, Natalie D.; Amesty, Silvia; Rivera, Alexis V.; Harripersaud, Katherine; Turner, Alezandria; Fuller, Crystal M.
2014-01-01
Objectives: In an effort to reduce HIV transmission among injection drug users (IDUs), New York State deregulated pharmacy syringe sales in 2001 through the Expanded Syringe Access Program by removing the requirement of a prescription. With evidence suggesting pharmacists' ability to expand their public health role, a structural, pharmacy-based…
Elimination sequence optimization for SPAR
NASA Technical Reports Server (NTRS)
Hogan, Harry A.
1986-01-01
SPAR is a large-scale computer program for finite element structural analysis. The program allows user specification of the order in which the joints of a structure are to be eliminated, since this order can have a significant influence on solution performance, in terms of both storage requirements and computer time. An efficient elimination sequence can improve performance by over 50% for some problems. Obtaining such sequences, however, requires the expertise of an experienced user and can take hours of tedious effort to effect. Thus, an automatic elimination sequence optimizer would enhance productivity by reducing the analysts' problem definition time and by lowering computer costs. Two possible methods for automating the elimination sequence specification were examined. Several algorithms based on graph theory representations of sparse matrices were studied, with mixed results. Significant improvement in program performance was achieved, but sequencing by an experienced user still yields substantially better results. The initial results provide encouraging evidence that the potential benefits of such an automatic sequencer would be well worth the effort.
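One graph-theoretic heuristic of the kind examined here is minimum-degree ordering; a hedged sketch on a toy joint-connectivity graph (it illustrates the general idea only, not SPAR's sequencer):

```python
# Hedged sketch of a minimum-degree elimination ordering: repeatedly eliminate the
# joint (graph node) with the fewest remaining neighbours, connecting its neighbours
# to each other to model the fill-in that elimination creates.
# Illustrates the general graph-theoretic idea only, not SPAR's sequencer.
def minimum_degree_order(adjacency):
    adj = {v: set(nbrs) for v, nbrs in adjacency.items()}
    order = []
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))   # lowest-degree joint
        nbrs = adj.pop(v)
        for u in nbrs:                            # clique the neighbours (fill-in)
            adj[u].discard(v)
            adj[u] |= (nbrs - {u})
        order.append(v)
    return order

# Small frame: joints 1-5 with member connectivity.
structure = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2, 4, 5}, 4: {2, 3, 5}, 5: {3, 4}}
print(minimum_degree_order(structure))
```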
Commercial Research and Development: Power to Explore, Opportunities from Discovery
NASA Technical Reports Server (NTRS)
Casas, Joseph C.; Nall, Mark; Powers, C. Blake; Henderson, Robin N. (Technical Monitor)
2002-01-01
The technical and economic goals of commercial use of space are laudable, and are addressed as a high priority by almost every national space program and most major aerospace companies the world over. Yet most organizational agendas and discussions tend to focus on one or two very narrow enabling aspects of this potentially large technological and economic opportunity. While government-sponsored commercial launch activities and private space platforms are an integral part of efforts to leverage the commercial use of space, these activities are possibly one of the smallest parts of creating a viable and sustainable market for the commercial use of space. Most current programs do not appropriately address some of the critical issues of the current, already interested, potential space user communities. Current programs place the focus of the majority of the user requirements on vehicle payload weight and mass performance considerations as the primary economic factor in providing a commercial market with a stimulating price for gaining access to the space environment. The larger user challenges of transforming from Earth-based research and development approaches to space-environment approaches are not addressed early enough in programs to influence the business considerations of potential users. Currently, space-based research and development activities require a large user investment in time, in development of new areas of support expertise, in development of new systems, in schedule risk, and in long-term capital positioning. The larger opportunities for stimulating strong market-driven interest in commercial use of space that could result from the development of vehicle payload "leap ahead technologies" for users are being missed, and there is a real risk of limiting the potentially broader market base needed to support a more technologically advanced and economically lucrative outcome. A major driving force for strengthening commercial space activities is not only technological advances in launch vehicles or newer satellites, but the myriad of enabling payload technologies that could, as a goal, make regular access to space and microgravity environments almost transparent for future users from existing Earth-based research and development market segments. Rather than focusing only on developing high-lift-performance launch vehicles and then developing payloads to fit them, the real focus from a business model perspective should be on customer payload requirements, and on designing launch vehicle and platform systems for a space transportation and facility infrastructure that supports all aspects of the business model for the user market.
To harness the full potential of space commercialization, new efforts need to be made to comprehensively examine all the critical business model areas for commercial research, development, and manufacturing in space so as to identify specific products and efforts; to determine how such operations must be both similar to and different from current Earth-based activities; to evaluate the enabling technological devices, processes and efforts so that like efforts can be addressed in a synergistic fashion for maximum user cost effectiveness; to delineate the services that are both needed and can be provided by such activities; and to use this information to drive design and development of space commercialization efforts and policy.
NASA Technical Reports Server (NTRS)
1974-01-01
A computer printout is presented of the mission requirements for the TERSSE missions and their associated user tasks. The data included in the data base represent a broad-based attempt to define the amount, extent, and type of information needed for an earth resources management program in the era of the space shuttle. An effort was made to consider all aspects of remote sensing and resource management; because of its broad scope, the data are not intended to be used without verification for in-depth studies of particular missions and/or users. The data base represents the quantitative structure necessary to define the TERSSE architecture and requirements, and to provide an overall integrated view of the earth resources technology requirements of the 1980's.
PharmTeX: a LaTeX-Based Open-Source Platform for Automated Reporting Workflow.
Rasmussen, Christian Hove; Smith, Mike K; Ito, Kaori; Sundararajan, Vijayakumar; Magnusson, Mats O; Niclas Jonsson, E; Fostvedt, Luke; Burger, Paula; McFadyen, Lynn; Tensfeldt, Thomas G; Nicholas, Timothy
2018-03-16
Every year, the pharmaceutical industry generates a large number of scientific reports related to drug research, development, and regulatory submissions. Many of these reports are created using text processing tools such as Microsoft Word. Given the large number of figures, tables, references, and other elements, this is often a tedious task involving hours of copying and pasting and substantial effort in quality control (QC). In this article, we present the LaTeX-based open-source reporting platform PharmTeX, a community-based effort to make reporting simple, reproducible, and user-friendly. The PharmTeX creators put substantial effort into simplifying the sometimes complex elements of LaTeX into user-friendly functions that rely on advanced LaTeX and Perl code running in the background. This setup makes LaTeX much more accessible to users with no prior LaTeX experience. A software collection was compiled for users not wanting to manually install the required software components. The PharmTeX templates allow for inclusion of tables directly from mathematical software output as well as figures in several formats. Code listings can be included directly from source. No previous experience and only a few hours of training are required to start writing reports using PharmTeX. PharmTeX significantly reduces the time required to create a scientific report fully compliant with regulatory and industry expectations. QC is made much simpler, since there is a direct link between analysis output and report input. PharmTeX makes the strengths of LaTeX document processing available to report authors without the need for extensive training.
Changes in inertia and effect on turning effort across different wheelchair configurations.
Caspall, Jayme J; Seligsohn, Erin; Dao, Phuc V; Sprigle, Stephen
2013-01-01
When executing turning maneuvers, manual wheelchair users must overcome the rotational inertia of the wheelchair system. Differences in wheelchair rotational inertia can result in increases in torque required to maneuver, resulting in greater propulsion effort and stress on the shoulder joints. The inertias of various configurations of an ultralightweight wheelchair were measured using a rotational inertia-measuring device. Adjustments in axle position, changes in wheel and tire type, and the addition of several accessories had various effects on rotational inertias. The configuration with the highest rotational inertia (solid tires, mag wheels with rearward axle) exceeded the configuration with the lowest (pneumatic tires, spoke wheels with forward axle) by 28%. The greater inertia requires increased torque to accelerate the wheelchair during turning. At a representative maximum acceleration, the reactive torque spanned the range of 11.7 to 15.0 N-m across the wheelchair configurations. At higher accelerations, these torques exceeded that required to overcome caster scrub during turning. These results indicate that a wheelchair's rotational inertia can significantly influence the torque required during turning and that this influence will affect active users who turn at higher speeds. Categorizing wheelchairs using both mass and rotational inertia would better represent differences in effort during wheelchair maneuvers.
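The reported torques follow from the rigid-body relation tau = I * alpha; a hedged back-of-the-envelope check, with inertia and acceleration values chosen only to be consistent with the reported 28% span rather than taken from the measured data:

```python
# Back-of-the-envelope check of the turning-torque relationship tau = I * alpha.
# The inertia and acceleration values below are illustrative only (chosen to be
# consistent with the reported 28% inertia span), not the measured values.
I_low = 1.50                 # kg*m^2, lightest configuration (illustrative)
I_high = I_low * 1.28        # 28% higher inertia, heaviest configuration
alpha = 7.8                  # rad/s^2, representative peak turning acceleration (illustrative)

tau_low, tau_high = I_low * alpha, I_high * alpha
print(f"torque range: {tau_low:.1f} to {tau_high:.1f} N*m")   # ~11.7 to ~15.0 N*m
```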
ERTS direct readout ground station study
NASA Technical Reports Server (NTRS)
1971-01-01
A system configuration which provides for a wide variety of user requirements is described. Two distinct user types are considered and optimized configurations are provided. Independent satellite transmission systems allow simultaneous signal transmission to Regional Collection Centers via a high data rate channel and to local users who require near real time consumption of lower rate data. In order to maximize the ultimate utility of this study effort, a parametric system description is given such that in essence a shopping list is provided. To achieve these results, it was necessary to consider all technical disciplines associated with high resolution satellite imaging systems including signal processing, modulation and coding, recording, and display techniques. A total systems study was performed.
Paper simulation techniques in user requirements analysis for interactive computer systems
NASA Technical Reports Server (NTRS)
Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.
1979-01-01
This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.
The Database Query Support Processor (QSP)
NASA Technical Reports Server (NTRS)
1993-01-01
The number and diversity of databases available to users continues to increase dramatically. Currently, the trend is towards decentralized, client-server architectures that (on the surface) are less expensive to acquire, operate, and maintain than information architectures based on centralized, monolithic mainframes. The database query support processor (QSP) effort evaluates the performance of a network-level, heterogeneous database access capability. Air Force Materiel Command's Rome Laboratory has developed an approach, based on ANSI standard X3.138-1988, 'The Information Resource Dictionary System (IRDS)', to seamless access to heterogeneous databases based on extensions to data dictionary technology. To successfully query a decentralized information system, users must know what data are available from which source, or have the knowledge and system privileges necessary to find out this information. Privacy and security considerations prohibit free and open access to every information system in every network. Even in completely open systems, the time required to locate relevant data (in systems of any appreciable size) would be better spent analyzing the data, assuming the original question was not forgotten. Extensions to data dictionary technology have the potential to more fully automate the search and retrieval of relevant data in a decentralized environment. Substantial amounts of time and money could be saved by not having to teach users what data reside in which systems and how to access each of those systems. Information describing data and how to get it could be removed from the application and placed in a dedicated repository where it belongs. The result is simplified applications that are less brittle and less expensive to build and maintain. Software technology providing the required functionality is off the shelf. The key difficulty is in defining the metadata required to support the process. The database query support processor effort will provide quantitative data on the amount of effort required to implement an extended data dictionary at the network level, add new systems, and adapt to changing user needs, and will provide sound estimates of operations and maintenance costs and savings.
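A hedged sketch of the extended data dictionary idea: a central repository maps logical data elements to the systems holding them, and applications consult it (with access checks) instead of embedding source knowledge. The names below are hypothetical and do not reproduce the IRDS schema or the QSP design:

```python
# Hedged sketch of the extended data dictionary idea: a central repository maps
# logical data elements to the systems that hold them and to access details, so
# applications look the routing up instead of hard-coding it.  All names are
# hypothetical; this does not reproduce the IRDS schema or the QSP design.
DATA_DICTIONARY = {
    "part_inventory":  {"system": "depot_db", "table": "PARTS",        "min_clearance": 1},
    "maintenance_log": {"system": "field_db", "table": "MAINT_EVENTS", "min_clearance": 2},
}

def locate(element, clearance):
    entry = DATA_DICTIONARY.get(element)
    if entry is None:
        raise LookupError(f"no source registered for {element!r}")
    if clearance < entry["min_clearance"]:          # privacy/security gate
        raise PermissionError(f"insufficient clearance for {element!r}")
    return entry["system"], entry["table"]

print(locate("part_inventory", clearance=1))        # ('depot_db', 'PARTS')
```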
Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey
2012-01-01
Tight glycemic control (TGC) has shown benefits but has been difficult to implement. Model-based methods and computerized protocols offer the opportunity to improve TGC quality and compliance. This research presents an interface design to maximize compliance, minimize real and perceived clinical effort, and minimize error based on simple human factors and end user input. The graphical user interface (GUI) design is presented by construction based on a series of simple, short design criteria based on fundamental human factors engineering and includes the use of user feedback and focus groups comprising nursing staff at Christchurch Hospital. The overall design maximizes ease of use and minimizes (unnecessary) interaction and use. It is coupled to a protocol that allows nurse staff to select measurement intervals and thus self-manage workload. The overall GUI design is presented and requires only one data entry point per intervention cycle. The design and main interface are heavily focused on the nurse end users who are the predominant users, while additional detailed and longitudinal data, which are of interest to doctors guiding overall patient care, are available via tabs. This dichotomy of needs and interests based on the end user's immediate focus and goals shows how interfaces must adapt to offer different information to multiple types of users. The interface is designed to minimize real and perceived clinical effort, and ongoing pilot trials have reported high levels of acceptance. The overall design principles, approach, and testing methods are based on fundamental human factors principles designed to reduce user effort and error and are readily generalizable. © 2012 Diabetes Technology Society.
Huo, Zhimin; Summers, Ronald M.; Paquerault, Sophie; Lo, Joseph; Hoffmeister, Jeffrey; Armato, Samuel G.; Freedman, Matthew T.; Lin, Jesse; Ben Lo, Shih-Chung; Petrick, Nicholas; Sahiner, Berkman; Fryd, David; Yoshida, Hiroyuki; Chan, Heang-Ping
2013-01-01
Computer-aided detection/diagnosis (CAD) is increasingly used for decision support by clinicians for detection and interpretation of diseases. However, there are no quality assurance (QA) requirements for CAD in clinical use at present. QA of CAD is important so that end users can be made aware of changes in CAD performance both due to intentional or unintentional causes. In addition, end-user training is critical to prevent improper use of CAD, which could potentially result in lower overall clinical performance. Research on QA of CAD and user training are limited to date. The purpose of this paper is to bring attention to these issues, inform the readers of the opinions of the members of the American Association of Physicists in Medicine (AAPM) CAD subcommittee, and thus stimulate further discussion in the CAD community on these topics. The recommendations in this paper are intended to be work items for AAPM task groups that will be formed to address QA and user training issues on CAD in the future. The work items may serve as a framework for the discussion and eventual design of detailed QA and training procedures for physicists and users of CAD. Some of the recommendations are considered by the subcommittee to be reasonably easy and practical and can be implemented immediately by the end users; others are considered to be “best practice” approaches, which may require significant effort, additional tools, and proper training to implement. The eventual standardization of the requirements of QA procedures for CAD will have to be determined through consensus from members of the CAD community, and user training may require support of professional societies. It is expected that high-quality CAD and proper use of CAD could allow these systems to achieve their true potential, thus benefiting both the patients and the clinicians, and may bring about more widespread clinical use of CAD for many other diseases and applications. It is hoped that the awareness of the need for appropriate CAD QA and user training will stimulate new ideas and approaches for implementing such procedures efficiently and effectively as well as funding opportunities to fulfill such critical efforts. PMID:23822459
Co-robotic ultrasound imaging: a cooperative force control approach
NASA Astrophysics Data System (ADS)
Finocchi, Rodolfo; Aalamifar, Fereshteh; Fang, Ting Yun; Taylor, Russell H.; Boctor, Emad M.
2017-03-01
Ultrasound (US) imaging remains one of the most commonly used imaging modalities in medical practice. However, due to the physical effort required to perform US imaging tasks, 63-91% of ultrasonographers develop musculoskeletal disorders throughout their careers. The goal of this work is to provide ultrasonographers with a system that facilitates and reduces strain in US image acquisition. To this end, we propose a system for admittance force robot control that uses the six-degree-of-freedom UR5 industrial robot. A six-axis force sensor is used to measure the forces and torques applied by the sonographer on the probe. As the sonographer pushes against the US probe, the robot complies with these forces, following the user's desired path. A one-axis load cell is used to measure contact forces between the patient and the probe in real time. When imaging, the robot augments the axial forces applied by the user, lessening the physical effort required. User studies showed an overall decrease in hand tremor while imaging at high forces, improvements in image stability, and a decrease in difficulty and strenuousness.
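The compliant behaviour described is typically realized with an admittance law that maps measured hand forces to a commanded probe velocity; a hedged sketch, with gains, rates, and the axial-augmentation rule chosen for illustration rather than taken from the cited system:

```python
# Hedged sketch of an admittance control loop: measured hand forces on the probe are
# mapped to a commanded end-effector velocity, and the axial (probe-to-patient) force
# applied by the user is augmented by the robot.  Gains, rates and the augmentation
# factor are illustrative, not the values used in the cited system.
import numpy as np

ADMITTANCE_GAIN = np.diag([0.02, 0.02, 0.02])   # (m/s) per N, per axis
AXIAL_BOOST = 1.5                               # robot multiplies the user's axial push
DT = 0.008                                      # 125 Hz control period

def admittance_step(hand_force, position):
    """One control cycle: force (N, 3-vector) -> velocity command -> new position."""
    commanded_force = hand_force.copy()
    commanded_force[2] *= AXIAL_BOOST           # augment the axial component only
    velocity = ADMITTANCE_GAIN @ commanded_force
    return position + velocity * DT

pos = np.zeros(3)
for _ in range(125):                            # simulate 1 s of a steady 4 N axial push
    pos = admittance_step(np.array([0.0, 0.0, 4.0]), pos)
print("probe displacement after 1 s:", pos)     # ~[0, 0, 0.12] m
```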
NASA Technical Reports Server (NTRS)
1972-01-01
Study efforts directed at defining all TDRS system elements are summarized. Emphasis was placed on synthesis of a space segment design optimized to support low and medium data rate user spacecraft and launched with Delta 2914. A preliminary design of the satellite was developed and conceptual designs of the user spacecraft terminal and TDRS ground station were defined. As a result of the analyses and design effort it was determined that (1) a 3-axis-stabilized tracking and data relay satellite launched on a Delta 2914 provides telecommunications services considerably in excess of that required by the study statement; and (2) the design concept supports the needs of the space shuttle and has sufficient growth potential and flexibility to provide telecommunications services to high data rate users. Recommendations for further study are included.
2005 5th Annual CMMI Technology Conference and User Group. Volume 1: Monday
2005-11-17
Slide excerpt, by the numbers: the impact of requirements on total effort (work 59%, rework 41%); sources cited include Dion, McConnell, Davis, and Novorita; requirements management bibliography: http://mozart.uccs.edu/adavis/reqbib.html
ANNIE - Interactive Processing of Data Bases for Hydrologic Models.
Lumb, Alan M.; Kittle, John L.
1985-01-01
ANNIE is a data storage and retrieval system that was developed to reduce the time and effort required to calibrate, verify, and apply watershed models that continuously simulate water quantity and quality. Watershed models have three categories of input: parameters to describe segments of a drainage area, linkage of the segments, and time-series data. Additional goals for ANNIE include the development of software that is easily implemented on minicomputers and some microcomputers and software that has no special requirements for interactive display terminals. Another goal is for the user interaction to be based on the experience of the user so that ANNIE is helpful to the inexperienced user and yet efficient and brief for the experienced user. Finally, the code should be designed so that additional hydrologic models can easily be added to ANNIE.
Psychological Issues in Online Adaptive Task Allocation
NASA Technical Reports Server (NTRS)
Morris, N. M.; Rouse, W. B.; Ward, S. L.; Frey, P. R.
1984-01-01
Adaptive aiding is an idea that offers potential for improvement over many current approaches to aiding in human-computer systems. The expected return of tailoring the system to fit the user could be in the form of improved system performance and/or increased user satisfaction. Issues such as the manner in which information is shared between human and computer, the appropriate division of labor between them, and the level of autonomy of the aid are explored. A simulated visual search task was developed. Subjects are required to identify targets in a moving display while performing a compensatory sub-critical tracking task. By manipulating characteristics of the situation such as imposed task-related workload and effort required to communicate with the computer, it is possible to create conditions in which interaction with the computer would be more or less desirable. The results of preliminary research using this experimental scenario are presented, and future directions for this research effort are discussed.
Web accessibility support for visually impaired users using link content analysis.
Iwata, Hajime; Kobayashi, Naofumi; Tachibana, Kenji; Shirogane, Junko; Fukazawa, Yoshiaki
2013-12-01
Web pages are used for a variety of purposes. End users must understand dynamically changing content and sequentially follow page links to find desired material, requiring significant time and effort. However, for visually impaired users using screen readers, it can be difficult to find links to web pages when link text and alternative text descriptions are inappropriate. Our method supports the discovery of content by analyzing 8 categories of link types, and allows visually impaired users to be aware of the content represented by links in advance. This facilitates end users' access to necessary information on web pages. Our method of classifying web page links is therefore effective as a means of evaluating accessibility.
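A hedged sketch of link-content analysis using Python's standard HTML parser: collect each link's href, visible text, and image alt text, then assign a coarse category. The rules below are illustrative, since the paper's eight link types are not enumerated in the abstract:

```python
# Hedged sketch of link-content analysis with Python's standard-library HTML parser:
# collect each anchor's href, visible text and any image alt text, then assign a
# coarse category heuristically.  The category rules are illustrative; the paper's
# eight link types are not enumerated in the abstract.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links, self._current = [], None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            self._current = {"href": attrs.get("href", ""), "text": "", "alt": ""}
        elif tag == "img" and self._current is not None:
            self._current["alt"] = attrs.get("alt", "")

    def handle_data(self, data):
        if self._current is not None:
            self._current["text"] += data.strip()

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            self.links.append(self._current)
            self._current = None

def categorize(link):
    if not (link["text"] or link["alt"]):
        return "undescribed (problematic for screen readers)"
    if link["href"].startswith(("http://", "https://")):
        return "external navigation"
    return "internal navigation"

parser = LinkCollector()
parser.feed('<a href="/news"><img src="n.png"></a> <a href="https://example.org">Weather</a>')
for link in parser.links:
    print(link["href"], "->", categorize(link))
```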
NASA Astrophysics Data System (ADS)
Ritchey, N. A.; Brewer, M.; Houston, T.; Hollingshead, A.; Jones, N.; Dissen, J.
2017-12-01
NOAA's National Centers for Environmental Information (NCEI) is the world's largest repository of climate data. Customer analytics and uses of NCEI information are critical to understanding and evolving NCEI's suite of use-inspired data and information to make them applicable to decision making. Over the past three years, NCEI's Center for Weather and Climate has made a concerted effort to: 1) Establish a system for collection of user requirements, 2) Ensure that collected information informs product area management and prioritization activities, and 3) Include user insights into future products and product versions. These process changes require a long-term commitment to climate services and success is not possible with a "build it and they will come" mentality nor with a "drop-in, drop-out" customer engagement strategy. This presentation will focus on the path necessary to get from effective user engagement, centered on collection and adjudication of user requirements, all the way through the outcomes of the changed products and services and how those have benefitted users, including economic examples.
Unified Framework for Development, Deployment and Robust Testing of Neuroimaging Algorithms
Joshi, Alark; Scheinost, Dustin; Okuda, Hirohito; Belhachemi, Dominique; Murphy, Isabella; Staib, Lawrence H.; Papademetris, Xenophon
2011-01-01
Developing both graphical and command-line user interfaces for neuroimaging algorithms requires considerable effort. Neuroimaging algorithms can meet their potential only if they can be easily and frequently used by their intended users. Deployment of a large suite of such algorithms on multiple platforms requires consistency of user interface controls, consistent results across various platforms and thorough testing. We present the design and implementation of a novel object-oriented framework that allows for rapid development of complex image analysis algorithms with many reusable components and the ability to easily add graphical user interface controls. Our framework also allows for simplified yet robust nightly testing of the algorithms to ensure stability and cross platform interoperability. All of the functionality is encapsulated into a software object requiring no separate source code for user interfaces, testing or deployment. This formulation makes our framework ideal for developing novel, stable and easy-to-use algorithms for medical image analysis and computer assisted interventions. The framework has been both deployed at Yale and released for public use in the open source multi-platform image analysis software—BioImage Suite (bioimagesuite.org). PMID:21249532
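A hedged sketch of the single-specification idea: an algorithm object declares its parameters once, and a command-line interface and a nightly regression test are derived from that one declaration (class and method names are invented for illustration, not BioImage Suite's actual API):

```python
# Hedged sketch of the "one object, many interfaces" idea: the algorithm module
# declares its parameters once; the command-line interface and a nightly regression
# test are generated from that single declaration.  Class and method names are
# invented for illustration and are not BioImage Suite's actual API.
import argparse

class SmoothImageAlgorithm:
    name = "smooth_image"
    parameters = [
        {"flag": "--sigma",      "type": float, "default": 1.0, "help": "Gaussian sigma (mm)"},
        {"flag": "--iterations", "type": int,   "default": 1,   "help": "number of passes"},
    ]

    def run(self, sigma, iterations):
        # A real implementation would smooth an image; return a checksum-like value here.
        return round(sigma * iterations, 6)

    def make_cli(self):
        parser = argparse.ArgumentParser(prog=self.name)
        for p in self.parameters:
            parser.add_argument(p["flag"], type=p["type"], default=p["default"], help=p["help"])
        return parser

    def regression_test(self, expected):
        defaults = {p["flag"].lstrip("-"): p["default"] for p in self.parameters}
        assert self.run(**defaults) == expected, "nightly regression failed"

algo = SmoothImageAlgorithm()
args = algo.make_cli().parse_args(["--sigma", "2.0"])
print(algo.run(args.sigma, args.iterations))
algo.regression_test(expected=1.0)
```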
Development of an information platform for new grid users in the biomedical field.
Skrowny, Daniela; Dickmann, Frank; Löhnhardt, Benjamin; Knoch, Tobias A; Sax, Ulrich
2010-01-01
Bringing new users into grids is a top priority for all grid initiatives and one of the most challenging tasks. Especially in the life sciences it is essential to have a certain number of users to establish a critical mass for a sustainable grid and to give feedback to the technological middleware layer. Given new users' presumable lack of grid IT knowledge, it is notably more arduous to satisfy user demands, even though the requirements in this field are especially demanding. Therefore, the development of an information and learning platform could support the efforts of grid experts to guide new users. By providing a platform about grid technology and its possibilities for users in the biomedical community, potential users could be supported in exploiting the high potential of their discipline.
NASA Technical Reports Server (NTRS)
1987-01-01
The Cosmic Dust Collection and Gas Grain Simulation Facilities represent collaborative efforts between the Life Sciences and Solar System Exploration Divisions designed to strengthen a natural Exobiology/Planetary Sciences connection. The Cosmic Dust Collection Facility is a Planetary Science facility, with Exobiology a primary user. Conversely, the Gas Grain Facility is an Exobiology facility, with Planetary Science a primary user. Requirements for the construction and operation of the two facilities, contained herein, were developed through joint workshops between the two disciplines, as were the representative experiments comprising the reference payloads. In the case of the Gas Grain Simulation Facility, the Astrophysics Division is an additional potential user, having participated in the workshop to select experiments and define requirements.
KSC Space Station Operations Language (SSOL)
NASA Technical Reports Server (NTRS)
1985-01-01
The Space Station Operations Language (SSOL) will serve a large community of diverse users dealing with the integration and checkout of Space Station modules. Kennedy Space Center's plan to achieve Level A specification of the SSOL system, encompassing both its language and its automated support environment, is presented in the format of a briefing. The SSOL concept is a collection of fundamental elements that span languages, operating systems, software development, software tools and several user classes. The approach outlines a thorough process that combines the benefits of rapid prototyping with a coordinated requirements gathering effort, yielding a Level A specification of the SSOL requirements.
Assessing and Adapting Scientific Results for Space Weather Research to Operations (R2O)
NASA Astrophysics Data System (ADS)
Thompson, B. J.; Friedl, L.; Halford, A. J.; Mays, M. L.; Pulkkinen, A. A.; Singer, H. J.; Stehr, J. W.
2017-12-01
Why doesn't a solid scientific paper necessarily result in a tangible improvement in space weather capability? A well-known challenge in space weather forecasting is investing effort to turn the results of basic scientific research into operational knowledge. This process is commonly known as "Research to Operations," abbreviated R2O. There are several aspects of this process: 1) How relevant is the scientific result to a particular space weather process? 2) If fully utilized, how much will that result improve the reliability of the forecast for the associated process? 3) How much effort will this transition require? Is it already in a relatively usable form, or will it require a great deal of adaptation? 4) How much burden will be placed on forecasters? Is it "plug-and-play" or will it require effort to operate? 5) How can robust space weather forecasting identify challenges for new research? This presentation will cover several approaches that have potential utility in assessing scientific results for use in space weather research. The demonstration of utility is the first step, relating to the establishment of metrics to ensure that there will be a clear benefit to the end user. The presentation will then move to means of determining cost vs. benefit, (where cost involves the full effort required to transition the science to forecasting, and benefit concerns the improvement of forecast reliability), and conclude with a discussion of the role of end users and forecasters in driving further innovation via "O2R."
Intelligent user interface concept for space station
NASA Technical Reports Server (NTRS)
Comer, Edward; Donaldson, Cameron; Bailey, Elizabeth; Gilroy, Kathleen
1986-01-01
The space station computing system must interface with a wide variety of users, from highly skilled operations personnel to payload specialists from all over the world. The interface must accommodate a wide variety of operations from the space platform, ground control centers and from remote sites. As a result, there is a need for a robust, highly configurable and portable user interface that can accommodate the various space station missions. The concept of an intelligent user interface executive, written in Ada, that would support a number of advanced human interaction techniques, such as windowing, icons, color graphics, animation, and natural language processing is presented. The user interface would provide intelligent interaction by understanding the various user roles, the operations and mission, the current state of the environment and the current working context of the users. In addition, the intelligent user interface executive must be supported by a set of tools that would allow the executive to be easily configured and to allow rapid prototyping of proposed user dialogs. This capability would allow human engineering specialists acting in the role of dialog authors to define and validate various user scenarios. The set of tools required to support development of this intelligent human interface capability is discussed and the prototyping and validation efforts required for development of the Space Station's user interface are outlined.
A MySQL-Based Data Archiver: Preliminary Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthew Bickley; Christopher Slominski
2008-01-23
Following an evaluation of the archival requirements of the Jefferson Laboratory accelerator's user community, a prototyping effort was executed to determine if an archiver based on MySQL had sufficient functionality to meet those requirements. This approach was chosen because an archiver based on a relational database enables the development effort to focus on data acquisition and management, letting the database take care of storage, indexing, and data consistency. It was clear from the prototype effort that there were no performance impediments to successful implementation of a final system. With our performance concerns addressed, the lab undertook the design and development of an operational system. The system is now in its operational testing phase. This paper discusses the archiver system requirements and some of the design choices and their rationale, and presents the acquisition, storage, and retrieval performance.
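A hedged sketch of the relational-archiver approach, in which the database handles storage and indexing while application code only inserts and queries timestamped samples; sqlite3 stands in here purely to keep the example self-contained, and the schema is invented rather than Jefferson Lab's:

```python
# Hedged sketch of a relational data archiver: the database provides storage, indexing
# and consistency; the acquisition code only inserts timestamped samples and runs
# range queries for retrieval.  sqlite3 is used only to keep this example
# self-contained; the actual system is MySQL-based and its schema is invented here.
import sqlite3, time

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE samples (
                channel TEXT NOT NULL,
                stamp   REAL NOT NULL,
                value   REAL NOT NULL)""")
db.execute("CREATE INDEX idx_channel_stamp ON samples (channel, stamp)")

def archive(channel, value, stamp=None):
    db.execute("INSERT INTO samples VALUES (?, ?, ?)",
               (channel, stamp if stamp is not None else time.time(), value))

def retrieve(channel, t0, t1):
    cur = db.execute("SELECT stamp, value FROM samples "
                     "WHERE channel = ? AND stamp BETWEEN ? AND ? ORDER BY stamp",
                     (channel, t0, t1))
    return cur.fetchall()

for i in range(5):
    archive("beam_current", 120.0 + i, stamp=1000.0 + i)
print(retrieve("beam_current", 1001.0, 1003.0))   # three samples
```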
Transferring technology to the public sector.
NASA Technical Reports Server (NTRS)
Alper, M. E.
1972-01-01
Approximately four years ago the Jet Propulsion Laboratory, under NASA sponsorship, began to devote some of its resources to examining ways to transfer space technology to the civil sector. As experience accumulated under this program, certain principles basic to success in technology transfer became apparent. An adequate definition of each problem must be developed before any substantial effort is expended on a solution. In most instances, a source of funds other than the potential user is required to support the problem definition phase of the work. Sensitivity to the user's concerns and effective interpersonal communications between the user and technical personnel are essential to success.
NASA Astrophysics Data System (ADS)
Schneller, D.
2014-08-01
The E-ELT has completed its design phase and is now entering construction. ESO is acting as prime contractor and usually procures subsystems, including their design, from industry. This, in turn, leads to a large number of requirements, whose validity, consistency and conformity with user needs require extensive management. Therefore E-ELT Systems Engineering has chosen to follow a systematic approach, based on a reasoned requirement architecture that follows the product breakdown structure of the observatory. The challenge ahead is the controlled flow-down of science user needs into engineering requirements, requirement specifications and system design documents. This paper shows how the E-ELT project manages this. The project has adopted IBM DOORS™ as a supporting requirements management tool. This paper discusses emerging problems and outlines potential solutions. It shows trade-offs made to reach a proper balance between the effort put into this activity and potential overheads, and the benefit for the project.
NASTRAN users' experience of Avco Aerostructures Division
NASA Technical Reports Server (NTRS)
Blackburn, C. L.; Wilhelm, C. A.
1973-01-01
The NASTRAN experiences of a major structural design and fabrication subcontractor that has less engineering personnel and computer facilities than those available to large prime contractors are discussed. Efforts to obtain sufficient computer capacity and the development and implementation of auxiliary programs to reduce manpower requirements are described. Applications of the NASTRAN program for training users, checking out auxiliary programs, performing in-house research and development, and structurally analyzing an Avco designed and manufactured missile case are presented.
User Requirements Gathering for 3d Geographic Information in the United Kingdom
NASA Astrophysics Data System (ADS)
Wong, K.; Ellul, C.
2017-10-01
Despite significant developments, 3D technologies are still not fully exploited in practice due to the lack of awareness as well as the lack of understanding of who the users of 3D will be and what the user requirements are. From a National Mapping & Cadastral Agency and data acquisition perspective, each new 3D feature type and element within a feature added (such as doors, windows, chimneys, street lights) requires additional processing and cost to create. There is therefore a need to understand the importance of different 3D features and components for different applications. This will allow the direction of capture effort towards items that will be relevant to a wide range of users, as well as to understand the current status of, and interest in, 3D at a national level. This paper reports the results of an initial requirements gathering exercise for 3D geographic information in the United Kingdom (UK). It describes a user-centred design approach where usability and user needs are given extensive attention at each stage of the design process. Web-based questionnaires and semi-structured face-to-face interviews were used as complementary data collection methods to understand the user needs. The results from this initial study showed that while some applications lead the field with a high adoption of 3D, others are laggards, predominantly from organisational inertia. While individuals may be positive about the use of 3D, many struggle to justify the value and business case for 3D GI. Further work is required to identify the specific geometric and semantic requirements for different applications and to repeat the study with a larger sample.
Reporting of teratology studies.
Barrow, Paul C; Reynaud, Lucie
2013-01-01
The regulatory toxicology report is an unusual document that requires a particular skill to write. The report must be clear, accurate, concise, and focused. A clear and direct writing style is required. The end-users of the report will hope to find the information they seek with as little effort as possible. Few, or none, will read the entire document. The author should therefore aim to accommodate the reader, requiring him or her to read as little text and turn as few pages as possible. This chapter gives tips and guidance on how to present the experimental data and write the narrative text in the final study report for a teratology study.
NASA Astrophysics Data System (ADS)
Gonzalez, J. C.; Kurlandczyk, H.; Schmid, C.; Schneller, D.
2016-08-01
One of the critical activities in the systems engineering scope of work is managing requirements. In line with this, E-ELT devotes a significant effort to this activity, which follows a well-established process. This involves optimally deriving requirements from the user (Top-Level Requirements) through the system Level 1 Requirements and from here down to subsystems procurement specifications. This paper describes the process, which is illustrated with some practical examples, including in particular the role of technical budgets to derive requirements on subsystems. Also, the provisions taken for the requirements verification are discussed.
Reagan, Ian J; McClafferty, Julie A; Berlin, Sharon P; Hankey, Jonathan M
2013-01-01
Seat belt use is one of the most effective countermeasures to reduce traffic fatalities and injuries. The success of efforts to increase use is measured by roadside observations and self-report questionnaires. These methods have shortcomings, with the former requiring a binary point estimate and the latter being subjective. The 100-car naturalistic driving study presented a unique opportunity to study seat belt use in that seat belt status was known for every trip each driver made during a 12-month period. Drivers were grouped into infrequent, occasional, or consistent seat belt users based on the frequency of belt use. Analyses were then completed to assess whether these groups differed on several measures including personality, demographics, and self-reported driving style variables, as well as measures from the 100-car study instrumentation suite (average trip speed, trips per day). In addition, detailed analyses of the occasional belt user group were completed to identify factors that were predictive of occasional belt users wearing their belts. The analyses indicated that consistent seat belt users took fewer trips per day, and that increased average trip speed was associated with increased belt use among occasional belt users. The results of this project may help focus messaging efforts to convert occasional and inconsistent seat belt users to consistent users. Copyright © 2012 Elsevier Ltd. All rights reserved.
Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.
Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie
2010-07-01
Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to the lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy-to-use, easy-to-learn, and error-resistant EHR systems to the users. The objective was to evaluate the usability of an EHR system and suggest areas of improvement in the user interface. The user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke-Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task as either a mental (internal) or physical (external) operator. This analysis was performed by two analysts independently, and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform the given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users. The results show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, they would spend about 22 min (independent of system response time) for data entry, of which 11 min are spent on more effortful mental operators. The inter-rater reliability for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically identifies the following findings related to the performance of AHLTA: (1) a large number of average total steps to complete common tasks, (2) high average execution time, and (3) a large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort required for the tasks. 2010 Elsevier Ireland Ltd. All rights reserved.
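The KLM step of the analysis above amounts to summing standard operator times over the steps of a task. A minimal sketch follows; the operator times are the commonly cited Keystroke-Level Model values, and the task sequence is hypothetical, not taken from the AHLTA study.

```python
# KLM-style execution-time estimate: classify each step as an operator and sum
# standard operator times. Values are commonly cited KLM estimates; the task
# sequence below is hypothetical.
KLM_SECONDS = {
    "K": 0.28,  # keystroke or button press
    "P": 1.10,  # point with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation (the "effortful mental operator")
}

def estimate_task_time(operators):
    """Return total predicted execution time and the share spent on mental operators."""
    total = sum(KLM_SECONDS[op] for op in operators)
    mental = sum(KLM_SECONDS[op] for op in operators if op == "M")
    return total, mental / total

# Hypothetical data-entry step: think, point to a field, home to keyboard, type 5 characters.
task = ["M", "P", "H"] + ["K"] * 5
total_s, mental_share = estimate_task_time(task)
print(f"predicted time: {total_s:.2f} s, mental share: {mental_share:.0%}")
```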
NASA Technical Reports Server (NTRS)
Hurd, W. A.
1985-01-01
Modifications required to change the near-ultraviolet source in the Optical Contamination Monitor to a source with output at or near the Lyman-alpha hydrogen line are discussed. The effort consisted of selecting, acquiring and testing candidate miniature ultraviolet lamps with significant output at or near 121.6 nm. The effort also included selection of a miniature dc high-voltage power supply capable of operating the lamp. The power supply was required to operate from the available primary power supplied by the Optical Effect Module (DEM) and to be flight qualified, or to be qualifiable by the user.
Generation of development environments for the Arden Syntax.
Bång, M.; Eriksson, H.
1997-01-01
Providing appropriate development environments for specialized languages requires a significant development and maintenance effort. Specialized environments are therefore expensive when compared to their general-language counterparts. The Arden Syntax for Medical Logic Modules (MLM) is a standardized language for representing medical knowledge. We have used PROTEGE-II, a knowledge-engineering environment, to generate a number of experimental development environments for the Arden Syntax. MEDAILLE is the resulting MLM editor, which provides a user-friendly environment that allows users to create and modify MLM definitions. Although MEDAILLE is a generated editor, it provides functionality similar to that of other MLM editors developed using traditional programming techniques, while requiring less programming effort. We discuss how developers can use PROTEGE-II to generate development environments for other standardized languages and for general programming languages. PMID:9357639
Digital optical tape: Technology and standardization issues
NASA Technical Reports Server (NTRS)
Podio, Fernando L.
1996-01-01
During the coming years, digital data storage technologies will continue an aggressive growth to satisfy users' needs for higher storage capacities, higher data transfer rates and long-term archival media properties. Digital optical tape is a promising technology to satisfy these needs. As with any emerging data storage technology, the industry faces many technological and standardization challenges. The technological challenges are great, but feasible to overcome. Although it is too early to consider formal industry standards, the optical tape industry has decided to work together by initiating prestandardization efforts that may lead in the future to formal voluntary industry standards. This paper will discuss current industry optical tape drive developments and the types of standards that will be required for the technology. The status of current industry prestandardization efforts will also be discussed.
NASA Technical Reports Server (NTRS)
Quick, Jason
2009-01-01
The Upper Stage (US) section of the National Aeronautics and Space Administration's (NASA) Ares I rocket will require internal access platforms for maintenance tasks performed by humans inside the vehicle. Tasks will occur during expensive critical path operations at Kennedy Space Center (KSC), including vehicle stacking and launch preparation activities. Platforms must be translated through a small human access hatch, installed in an enclosed worksite environment, support the weight of ground operators, and be removed before flight - and their design must minimize additional vehicle mass at attachment points. This paper describes the application of a user-centered conceptual design process and the unique challenges encountered within NASA's systems engineering culture focused on requirements and "heritage hardware". The NASA design team at Marshall Space Flight Center (MSFC) initiated the user-centered design process by studying heritage internal access kits and proposing new design concepts during brainstorming sessions. Simultaneously, they partnered with the Technology Transfer/Innovative Partnerships Program to research inflatable structures and dynamic scaffolding solutions that could enable ground operator access. While this creative, technology-oriented exploration was encouraged by upper management, some design stakeholders consistently opposed ideas utilizing novel, untested equipment. Subsequent collaboration with an engineering consulting firm improved the technical credibility of several options; however, there was continued resistance from team members focused on meeting system requirements with pre-certified hardware. After a six-month idea-generating phase, an intensive six-week effort produced viable design concepts that justified additional vehicle mass while optimizing the human factors of platform installation and use. Although these selected final concepts closely resemble heritage internal access platforms, challenges from the application of the user-centered process provided valuable lessons for improving future collaborative conceptual design efforts.
Early-Stage Software Design for Usability
ERIC Educational Resources Information Center
Golden, Elspeth
2010-01-01
In spite of the goodwill and best efforts of software engineers and usability professionals, systems continue to be built and released with glaring usability flaws that are costly and difficult to fix after the system has been built. Although user interface (UI) designers, be they usability or design experts, communicate usability requirements to…
Tools to Support Expository Video Capture and Access
ERIC Educational Resources Information Center
Carter, Scott; Cooper, Matthew; Adcock, John; Branham, Stacy
2014-01-01
Video tends to be imbalanced as a medium. Typically, content creators invest enormous effort creating work that is then watched passively. However, learning tasks require that users not only consume video but also engage, interact with, and repurpose content. Furthermore, to promote learning across domains where content creators are not…
Developmental Challenges of SMES Technology for Applications
NASA Astrophysics Data System (ADS)
Rong, Charles C.; Barnes, Paul N.
2017-12-01
This paper reviews the current status of high temperature superconductor (HTS) based superconducting magnetic energy storage (SMES) technology as a developmental effort. Discussion centres on the major challenges in magnet optimization, loss reduction, cooling improvement, and new development of quench detection. The cryogenic operation for superconductivity in this technological application requires continued research and development, especially with a greater engineering effort that involves the end user. For the SMES-based technology to more fully mature, some suggestions are given for consideration and discussion.
Assessing country-level efforts to link research to action.
Lavis, John N.; Lomas, Jonathan; Hamid, Maimunah; Sewankambo, Nelson K.
2006-01-01
We developed a framework for assessing country-level efforts to link research to action. The framework has four elements. The first element assesses the general climate (how those who fund research, universities, researchers and users of research support or place value on efforts to link research to action). The second element addresses the production of research (how priority setting ensures that users' needs are identified and how scoping reviews, systematic reviews and single studies are undertaken to address these needs). The third element addresses the mix of four clusters of activities used to link research to action. These include push efforts (how strategies are used to support action based on the messages arising from research), efforts to facilitate "user pull" (how "one-stop shopping" is provided for optimally packaged high-quality reviews either alone or as part of a national electronic library for health, how these reviews are profiled during "teachable moments" such as intense media coverage, and how rapid-response units meet users' needs for the best research), "user pull" efforts undertaken by those who use research (how users assess their capacity to use research and how structures and processes are changed to support the use of research) and exchange efforts (how meaningful partnerships between researchers and users help them to jointly ask and answer relevant questions). The fourth element addresses approaches to evaluation (how support is provided for rigorous evaluations of efforts to link research to action). PMID:16917649
Onboard processor technology review
NASA Technical Reports Server (NTRS)
Benz, Harry F.
1990-01-01
The general need and requirements for the onboard embedded processors necessary to control and manipulate data in spacecraft systems are discussed. The current known requirements are reviewed from a user perspective, based on current practices in the spacecraft development process. The current capabilities of available processor technologies are then discussed, and these are projected to the generation of spacecraft computers currently under identified, funded development. An appraisal is provided for the current national developmental effort.
IITET and shadow TT: an innovative approach to training at the point of need
NASA Astrophysics Data System (ADS)
Gross, Andrew; Lopez, Favio; Dirkse, James; Anderson, Darran; Berglie, Stephen; May, Christopher; Harkrider, Susan
2014-06-01
The Image Intensification and Thermal Equipment Training (IITET) project is a joint effort between Night Vision and Electronics Sensors Directorate (NVESD) Modeling and Simulation Division (MSD) and the Army Research Institute (ARI) Fort Benning Research Unit. The IITET effort develops a reusable and extensible training architecture that supports the Army Learning Model and trains Manned-Unmanned Teaming (MUM-T) concepts to Shadow Unmanned Aerial Systems (UAS) payload operators. The training challenge of MUM-T during aviation operations is that UAS payload operators traditionally learn few of the scout-reconnaissance skills and coordination appropriate to MUM-T at the schoolhouse. The IITET effort leveraged the simulation experience and capabilities at NVESD and ARI's research to develop a novel payload operator training approach consistent with the Army Learning Model. Based on the training and system requirements, the team researched and identified candidate capabilities in several distinct technology areas. The training capability will support a variety of training missions as well as a full campaign. Data from these missions will be captured in a fully integrated AAR capability, which will provide objective feedback to the user in near-real-time. IITET will be delivered via a combination of browser and video streaming technologies, eliminating the requirement for a client download and reducing user computer system requirements. The result is a novel UAS Payload Operator training capability, nested within an architecture capable of supporting a wide variety of training needs for air and ground tactical platforms and sensors, and potentially several other areas requiring vignette-based serious games training.
Save medical personnel's time by improved user interfaces.
Kindler, H
1997-01-01
Common objectives in the industrial countries are the improvement of quality of care, clinical effectiveness, and cost control. Cost control, in particular, has been addressed through the introduction of case-mix systems for reimbursement by social-security institutions. More data are required to enable quality improvement and increased clinical effectiveness, and for juridical reasons. At first glance, this documentation effort appears to contradict cost reduction. However, integrated services for resource management based on better documentation should help to reduce costs. The clerical effort for documentation should be decreased by providing a co-operative working environment for healthcare professionals applying sophisticated human-computer interface technology. Additional services, e.g., automatic report generation, increase the efficiency of healthcare personnel. Modelling the medical work flow forms an essential prerequisite for integrated resource management services and for co-operative user interfaces. A user interface aware of the work flow provides intelligent assistance by offering the appropriate tools at the right moment. Nowadays there is a trend towards client/server systems with relational databases or object-oriented databases as the repository. The work flows used for controlling purposes and to steer the user interfaces must be represented in the repository.
A Pre-launch Analysis of NASA's SMAP Mission Data
NASA Astrophysics Data System (ADS)
Escobar, V. M.; Brown, M. E.
2012-12-01
Product applications have become an integral part of converting the data collected into actionable knowledge that can be used to inform policy. Successfully bridging scientific research with operational decision making in different application areas requires looking into thematic user requirements and data requirements. NASA's Soil Moisture Active/Passive (SMAP) mission has an applications program that actively seeks to integrate the data, prior to launch, into a broad range of environmental monitoring and decision making systems, from drought and flood guidance to disease risk assessment and national security. SMAP is a combined active/passive microwave instrument, which will be launched into a near-polar orbit in late 2014. It aims to produce a series of soil moisture products and soil freeze/thaw products with an accuracy of +/- 10%, a nominal resolution of between 3 and 40 km, and latency between 12 hours and 7 days. These measurements will be used to enhance the understanding of processes that link the water, energy and carbon cycles, and to extend the capabilities of weather and climate prediction models. The driving success of the SMAP applications program lies in joining mission scientists with thematic end users, leveraging the knowledge base of soil moisture data applications, increasing the speed of SMAP data product ingestion into critical processes and research, and improving the societal benefit of the science. Because SMAP has not yet launched, the mission is using test algorithms to determine how the data will interact with existing processes. The objective of this review is to solicit data requirements, accuracy needs and the current understanding of the SMAP mission from the user community and then feed that back into mission product development. Thus, understanding how users will apply SMAP data, prior to the satellite's launch, is an important component of SMAP Applied Sciences and one of NASA's measures for mission success. This paper presents an analysis of an email-based review of expert end users and earth science researchers, eliciting how pre-launch activities and research are being conducted within thematic groups' organizations. Our focus through the SMAP Applications Program will be to (1) improve the mission's understanding of the SMAP user community requirements, (2) document and communicate the perceived challenges and advantages to the mission scientists, and (3) facilitate the movement of science into policy and decision making arenas. We will analyze the data of this review to understand the perceived benefits of pre-launch efforts and user engagement, and to define areas where the connection between science development and user engagement can continue to improve and further benefit future mission pre-launch efforts. The research will facilitate collaborative opportunities between agencies, broadening the fields of science where soil moisture observation data can be applied.
Central American information system for energy planning (in English; Spanish)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fonseca, M.G.; Lyon, P.C.; Heskett, J.C.
1991-04-01
SICAPE (Sistema de Información Centroamericano para Planificación Energética) is an expandable information system designed for energy planning. Its objective is to satisfy ongoing information requirements by means of a menu-driven operational environment. SICAPE is as easily used by the novice computer user as by those with more experience. Moreover, the system is capable of evolving concurrently with the future requirements of the individual country. The expansion is accomplished by menu restructuring as data and user requirements change. The new menu configurations require no programming effort. The use and modification of SICAPE are separate menu-driven processes that allow for rapid data query, minimal training, and effortless continued growth. SICAPE's data are organized by country or region. Information is available in the following areas: energy balance, macroeconomics, electricity generation capacity, and electricity and petroleum product pricing. (JF)
Mission Applications Support at NASA: The Proposal Surface Water and Ocean Topography Mission
NASA Astrophysics Data System (ADS)
Srinivasan, Margaret; Peterson, Craig; Callahan, Phil
2013-09-01
The NASA Applied Sciences Program is actively supporting an agency-wide effort to formalize a mission-level data applications approach. The program goal is to engage early-phase NASA Earth satellite mission project teams with applied science representation in the flight mission planning process. The end objective is "to engage applications-oriented users and organizations early in the satellite mission lifecycle to enable them to envision possible applications and integrate end-user needs into satellite mission planning as a way to increase the benefits to the nation." Two mission applications representatives have been selected for each early-phase Tier 2 mission, including the Surface Water and Ocean Topography (SWOT) mission concept. These representatives are tasked with identifying and organizing the applications communities, and with developing and promoting a process for the mission to optimize the reach of existing applications efforts in order to enhance the applications value of the missions. An early project-level awareness of mission planning decisions that may increase or decrease the utility of data products to diverse user and potential-user communities (communities of practice and communities of potential, respectively) has high value and potential return to the mission and to the users. Successful strategies to enhance science and practical applications of projected SWOT data streams will require engaging with, and facilitating exchange between, representatives in the science, societal applications, and mission planning communities. Some of the elements of this program include:
• Identify early adopters of data products
• Coordinate the applications team, including the Project Scientist, Payload Scientist, Project Manager, and data processing lead
• Describe the mission and products sufficiently in the early stages of development to effectively incorporate all potential users
Products and activities resulting from this effort will include (but are not limited to) workshops, workshop summaries, web pages, email lists of interested users/scientists, an Applications Plan, printed materials (posters, brochures), and participation in key meetings.
Classification software technique assessment
NASA Technical Reports Server (NTRS)
Jayroe, R. R., Jr.; Atkinson, R.; Dasarathy, B. V.; Lybanon, M.; Ramapryian, H. K.
1976-01-01
A catalog of software options is presented for the use of local user communities to obtain software for analyzing remotely sensed multispectral imagery. The resources required to utilize a particular software program are described. Descriptions of how a particular program analyzes data and the performance of that program for an application and data set provided by the user are shown. An effort is made to establish a statistical performance base for various software programs with regard to different data sets and analysis applications, to determine the status of the state-of-the-art.
Spacelab data analysis and interactive control study
NASA Technical Reports Server (NTRS)
Tarbell, T. D.; Drake, J. F.
1980-01-01
The study consisted of two main tasks, a series of interviews of Spacelab users and a survey of data processing and display equipment. Findings from the user interviews on questions of interactive control, downlink data formats, and Spacelab computer software development are presented. Equipment for quick look processing and display of scientific data in the Spacelab Payload Operations Control Center (POCC) was surveyed. Results of this survey effort are discussed in detail, along with recommendations for NASA development of several specific display systems which meet common requirements of many Spacelab experiments.
Automatic Dynamic Aircraft Modeler (ADAM) for the Computer Program NASTRAN
NASA Technical Reports Server (NTRS)
Griffis, H.
1985-01-01
Large general-purpose finite element programs require users to develop large quantities of input data. General-purpose pre-processors are used to decrease the effort required to develop structural models. Further reduction of effort can be achieved by application-specific pre-processors. The Automatic Dynamic Aircraft Modeler (ADAM) is one such application-specific pre-processor. General-purpose pre-processors use points, lines and surfaces to describe geometric shapes. Specifying that ADAM is used only for aircraft structures allows generic structural sections, wing boxes and bodies, to be pre-defined. Hence, with only gross dimensions, thicknesses, material properties and pre-defined boundary conditions, a complete model of an aircraft can be created.
Nam, Junghyun; Choo, Kim-Kwang Raymond; Han, Sangchul; Kim, Moonseong; Paik, Juryon; Won, Dongho
2015-01-01
A smart-card-based user authentication scheme for wireless sensor networks (hereafter referred to as a SCA-WSN scheme) is designed to ensure that only users who possess both a smart card and the corresponding password are allowed to gain access to sensor data and their transmissions. Despite many research efforts in recent years, it remains a challenging task to design an efficient SCA-WSN scheme that achieves user anonymity. The majority of published SCA-WSN schemes use only lightweight cryptographic techniques (rather than public-key cryptographic techniques) for the sake of efficiency, and have been demonstrated to suffer from the inability to provide user anonymity. Some schemes employ elliptic curve cryptography for better security but require sensors with strict resource constraints to perform computationally expensive scalar-point multiplications; despite the increased computational requirements, these schemes do not provide user anonymity. In this paper, we present a new SCA-WSN scheme that not only achieves user anonymity but also is efficient in terms of the computation loads for sensors. Our scheme employs elliptic curve cryptography but restricts its use only to anonymous user-to-gateway authentication, thereby allowing sensors to perform only lightweight cryptographic operations. Our scheme also enjoys provable security in a formal model extended from the widely accepted Bellare-Pointcheval-Rogaway (2000) model to capture the user anonymity property and various SCA-WSN specific attacks (e.g., stolen smart card attacks, node capture attacks, privileged insider attacks, and stolen verifier attacks). PMID:25849359
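The key design point above is that elliptic curve cryptography is confined to the user-to-gateway exchange, so the sensor itself performs only lightweight symmetric operations. The sketch below illustrates that division of labor only; the field names, message layout, and key handling are hypothetical and this is not the paper's actual protocol.

```python
# Illustration of the sensor-side workload in a gateway-mediated WSN authentication:
# the resource-constrained sensor checks an HMAC over a pre-shared sensor-gateway
# key plus a freshness window, while any public-key (ECC) work stays on the
# user-gateway side. Hypothetical message format; not the scheme from the paper.
import hashlib
import hmac
import os
import time

SENSOR_GATEWAY_KEY = os.urandom(32)   # pre-shared key installed at deployment (hypothetical)

def gateway_build_request(sensor_id: bytes, session_nonce: bytes) -> dict:
    """Gateway side: after the (ECC-based) user authentication succeeds, forward a
    lightweight, MAC-protected data request to the sensor."""
    timestamp = str(int(time.time())).encode()
    tag = hmac.new(SENSOR_GATEWAY_KEY, sensor_id + session_nonce + timestamp,
                   hashlib.sha256).digest()
    return {"sensor_id": sensor_id, "nonce": session_nonce,
            "timestamp": timestamp, "tag": tag}

def sensor_verify_request(msg: dict, max_skew_s: int = 30) -> bool:
    """Sensor side: one HMAC plus a freshness check; no public-key operations."""
    expected = hmac.new(SENSOR_GATEWAY_KEY,
                        msg["sensor_id"] + msg["nonce"] + msg["timestamp"],
                        hashlib.sha256).digest()
    fresh = abs(time.time() - int(msg["timestamp"].decode())) <= max_skew_s
    return fresh and hmac.compare_digest(expected, msg["tag"])

request = gateway_build_request(b"sensor-017", os.urandom(16))
assert sensor_verify_request(request)
```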
A Demonstration and Analysis of Requirements for Maritime Navigation Planning.
1998-03-01
The purpose behind a prototype is to ascertain user requirements; it should be created rapidly to speed up the system development life cycle (SDLC). The SeaNet (Internet to Sea) Project is a collaborative effort to bring the Internet to sea.
An Authoring System for Creating Computer-Based Role-Performance Trainers.
ERIC Educational Resources Information Center
Guralnick, David; Kass, Alex
This paper describes a multimedia authoring system called MOPed-II. Like other authoring systems, MOPed-II reduces the time and expense of producing end-user applications by eliminating much of the programming effort they require. However, MOPed-II reflects an approach to authoring tools for educational multimedia which is different from most…
The cost of doing business: cost structure of electronic immunization registries.
Fontanesi, John M; Flesher, Don S; De Guire, Michelle; Lieberthal, Allan; Holcomb, Kathy
2002-10-01
To predict the true cost of developing and maintaining an electronic immunization registry, and to set the framework for developing future cost-effectiveness and cost-benefit analyses. Primary data were collected at three immunization registries located in California, accounting for 90 percent of all immunization records in registries in the state during the study period. A parametric cost analysis compared registry development and maintenance expenditures to registry performance requirements. Data were collected at each registry through interviews and reviews of expenditure records, technical accomplishments, development schedules, and immunization coverage rates. The cost of building immunization registries is predictable and independent of the hardware/software combination employed. The development effort requires four man-years of technical work, or approximately $250,000 in 1998 dollars. Costs for maintaining a registry were approximately $5,100 per end user per three-year period. There is a predictable cost structure for both developing and maintaining immunization registries. The cost structure can be used as a framework for examining the cost-effectiveness and cost-benefits of registries. The greatest factor affecting improvement in coverage rates was ongoing, user-based administrative investment.
Development of Sensors for Aerospace Applications
NASA Technical Reports Server (NTRS)
Medelius, Pedro
2005-01-01
Advances in technology have led to the availability of smaller and more accurate sensors. Computer power to process large amounts of data is no longer the prevailing issue; thus multiple and redundant sensors can be used to obtain more accurate and comprehensive measurements in a space vehicle. The successful integration and commercialization of micro- and nanotechnology for aerospace applications require that a close and interactive relationship be developed between the technology provider and the end user early in the project. Close coordination between the developers and the end users is critical, since qualification for flight is time-consuming and expensive. The successful integration of micro- and nanotechnology into space vehicles requires a coordinated effort throughout the design, development, installation, and integration processes.
NASA Technical Reports Server (NTRS)
1975-01-01
A system is presented which processes FORTRAN-based software systems to surface potential problems before they become execution malfunctions. The system complements the diagnostic capabilities of compilers, loaders, and execution monitors rather than duplicating these functions. It also emphasizes frequent sources of FORTRAN problems which require inordinate manual effort to identify. The principal value of the system is extracting small sections of unusual code from the bulk of normal sequences. Code structures likely to cause immediate or future problems are brought to the user's attention. These messages stimulate timely corrective action on solid errors and promote identification of 'tricky' code. Corrective action may require recoding or simply extending the software documentation to explain the unusual technique.
Optimizing Resources for Trustworthiness and Scientific Impact of Domain Repositories
NASA Astrophysics Data System (ADS)
Lehnert, K.
2017-12-01
Domain repositories, i.e. data archives tied to specific scientific communities, are widely recognized and trusted by their user communities for ensuring a high level of data quality and for enhancing data value, access, and reuse through a unique combination of disciplinary and digital curation expertise. Their data services are guided by the practices and values of the specific community they serve and are designed to support the advancement of its science. Domain repositories need to meet user expectations for scientific utility in order to be successful, but they also need to fulfill the requirements for trustworthy repository services to be acknowledged by scientists, funders, and publishers as reliable facilities that curate and preserve data following international standards. Domain repositories therefore need to carefully plan and balance investments to optimize the scientific impact of their data services and user satisfaction on the one hand, while maintaining a reliable and robust operation of the repository infrastructure on the other. Staying abreast of evolving repository standards, conducting regular self-assessments, and achieving certification as a trustworthy repository alone require resources that compete with the demands for improving data holdings or the usability of systems. The Interdisciplinary Earth Data Alliance (IEDA), a data facility funded by the US National Science Foundation, operates repositories for geochemical, marine geoscience, and Antarctic research data, while also maintaining data products (global syntheses) and data visualization and analysis tools that are of high value to the science community and have demonstrated considerable scientific impact. Balancing the investments in the growth and utility of the syntheses with the resources required for certification of IEDA's repository services has been challenging, and a major self-assessment effort has been difficult to accommodate. IEDA is exploring a partnership model to share generic repository functions (e.g. metadata registration, long-term archiving) with other repositories. This could substantially reduce the effort of certification and allow effort to focus on domain-specific data curation and value-added services.
Crowdsourced Contributions to the Nation's Geodetic Elevation Infrastructure
NASA Astrophysics Data System (ADS)
Stone, W. A.
2014-12-01
NOAA's National Geodetic Survey (NGS), a United States Department of Commerce agency, is engaged in providing the nation's fundamental positioning infrastructure - the National Spatial Reference System (NSRS) - which includes the framework for latitude, longitude, and elevation determination as well as various geodetic models, tools, and data. Capitalizing on Global Navigation Satellite System (GNSS) technology for improved access to the nation's precise geodetic elevation infrastructure requires use of a geoid model, which relates GNSS-derived heights (ellipsoid heights) with traditional elevations (orthometric heights). NGS is facilitating the use of crowdsourced GNSS observations collected at published elevation control stations by the professional surveying, geospatial, and scientific communities to help improve NGS' geoid modeling capability. This collocation of published elevation data and newly collected GNSS data integrates together the two height systems. This effort in turn supports enhanced access to accurate elevation information across the nation, thereby benefiting all users of geospatial data. By partnering with the public in this collaborative effort, NGS is not only helping facilitate improvements to the elevation infrastructure for all users but also empowering users of NSRS with the capability to do their own high-accuracy positioning. The educational outreach facet of this effort helps inform the public, including the scientific community, about the utility of various NGS tools, including the widely used Online Positioning User Service (OPUS). OPUS plays a key role in providing user-friendly and high accuracy access to NSRS, with optional sharing of results with NGS and the public. All who are interested in helping evolve and improve the nationwide elevation determination capability are invited to participate in this nationwide partnership and to learn more about the geodetic infrastructure which is a vital component of viable spatial data for many disciplines, including the geosciences.
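The geoid model's role described above corresponds to the standard first-order height relation, added here for illustration (the abstract does not write it out):

```latex
% Orthometric height H from a GNSS-derived ellipsoid height h and the geoid
% height (undulation) N taken from the geoid model at the same point:
H \approx h - N
```

Each crowdsourced GNSS observation on a published elevation benchmark therefore supplies an (h, H) pair from which the local geoid height N can be checked or refined.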
The New NASA Orbital Debris Engineering Model ORDEM2000
NASA Technical Reports Server (NTRS)
Liou, Jer-Chyi; Matney, Mark J.; Anz-Meador, Phillip D.; Kessler, Donald; Jansen, Mark; Theall, Jeffery R.
2002-01-01
The NASA Orbital Debris Program Office at Johnson Space Center has developed a new computer-based orbital debris engineering model, ORDEM2000, which describes the orbital debris environment in the low Earth orbit region between 200 and 2000 km altitude. The model is appropriate for those engineering solutions requiring knowledge and estimates of the orbital debris environment (debris spatial density, flux, etc.). ORDEM2000 can also be used as a benchmark for ground-based debris measurements and observations. We incorporated a large set of observational data, covering the object size range from 10 µm to 10 m, into the ORDEM2000 debris database, utilizing a maximum likelihood estimator to convert observations into debris population probability distribution functions. These functions then form the basis of the debris populations. We developed a finite element model to process the debris populations to form the debris environment. A more capable input and output structure and a user-friendly graphical user interface are also implemented in the model. ORDEM2000 has been subjected to a significant verification and validation effort. This document describes ORDEM2000, which supersedes the previous model, ORDEM96. The availability of new sensor and in situ data, as well as new analytical techniques, has enabled the construction of this new model. Section 1 describes the general requirements and scope of an engineering model. Data analyses and the theoretical formulation of the model are described in Sections 2 and 3. Section 4 describes the verification and validation effort and the sensitivity and uncertainty analyses. Finally, Section 5 describes the graphical user interface, software installation, and test cases for the user.
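The maximum-likelihood step mentioned above can be illustrated with a toy Poisson counting model; this is only a sketch of the idea, not the ORDEM2000 estimator, and the numbers are hypothetical.

```python
# Toy illustration of maximum-likelihood flux estimation from debris detections:
# counts in a size bin are modeled as a Poisson process over the sensor's exposure,
# and the flux maximizing the likelihood of the observed count is n / exposure.
def poisson_mle_flux(detections: int, collecting_area_m2: float, hours: float) -> float:
    """MLE of flux (objects per m^2 per year) for a Poisson counting process.
    With lam = flux * A * t, the likelihood lam**n * exp(-lam) / n! is maximized
    at flux = n / (A * t)."""
    exposure_m2_yr = collecting_area_m2 * hours / (24.0 * 365.25)
    return detections / exposure_m2_yr

# Hypothetical campaign: 42 detections, 1000 m^2 effective area, 500 hours of observation.
flux = poisson_mle_flux(42, 1000.0, 500.0)
print(f"estimated flux: {flux:.3e} objects / m^2 / yr")
```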
NASA Technical Reports Server (NTRS)
Carroll, Bonnie C.; Jack, Robert F.; Cotter, Gladys A.
1990-01-01
An explosion of information has created a crisis for today's information age: it must be determined how to make the best use of available information sources, tools, and technology. To do this, leadership is needed at the interagency level to promote a coherent information policy. It is also important to find ways to educate the users of information regarding the tools available to them. Advances in technology have resulted in efforts to shift from Disciplinary and Mission-oriented Systems to Decision Support Systems and Personalized Information Systems. One such effort is being made by the Interagency Working Group on Data Management for Global Change (IAWGDMGC). Five federal agencies - the Department of Commerce (DOC), Department of Energy (DOE), National Aeronautics and Space Administration (NASA), National Library of Medicine (NLM), and Department of Defense (DOD) - have an ongoing cooperative information management group, CENDI (Commerce, Energy, NASA, NLM, and Defense Information), that is meeting the challenge of coordinating and integrating their information management systems. Although it is beginning to be technically feasible to have a system with text, bibliographic, and numeric data online for users to manipulate at their own workstations, it will require national recognition that the resource investment in such a system is worthwhile in order to promote its full development. It also requires close cooperation between the producers and users of the information - that is, the research and policy community, and the information community. National resources need to be mobilized in a coordinated manner to move people into the next generation of information support systems.
Multidisciplinary propulsion simulation using NPSS
NASA Technical Reports Server (NTRS)
Claus, Russell W.; Evans, Austin L.; Follen, Gregory J.
1992-01-01
The current status of the Numerical Propulsion System Simulation (NPSS) program, a cooperative effort of NASA, industry, and universities to reduce the cost and time of advanced technology propulsion system development, is reviewed. The technologies required for this program include (1) interdisciplinary analysis to couple the relevant disciplines, such as aerodynamics, structures, heat transfer, combustion, acoustics, controls, and materials; (2) integrated systems analysis; (3) a high-performance computing platform, including massively parallel processing; and (4) a simulation environment providing a user-friendly interface. Several research efforts to develop these technologies are discussed.
Extending the Reach of National Assessments: Addressing Local and Regional Needs
NASA Astrophysics Data System (ADS)
Lewis, K.; Carter, T.
2016-12-01
While climate change is global in scope, many impacts of greatest societal concern (and accompanying response decisions) occur on local to regional scales. The U.S. Global Change Research Program (USGCRP) is tasked with conducting quadrennial national climate assessments, and efforts for the fourth such assessment (NCA4) are underway. Recognizing that there is a growing appetite for climate information on more local scales, however, USGCRP is actively pursuing higher-resolution scientific information, while also seeking engagement with local and regional entities to ensure that NCA4 is well-positioned to address users' needs across geospatial scales. Effectively meeting user needs at regional scales requires robust observations and projections at sub-national scales, as well as a widespread network of agencies and organizations. We discuss our efforts to leverage existing relationships to identify potential users and their needs early in the assessment process. We also discuss plans for future mechanisms to engage additional regional stakeholders from resource managers to policy makers and scientists not only for quadrennial assessment but as part of a sustained process.
Support for Debugging Automatically Parallelized Programs
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Hood, Robert; Biegel, Bryan (Technical Monitor)
2001-01-01
We describe a system that simplifies the process of debugging programs produced by computer-aided parallelization tools. The system uses relative debugging techniques to compare serial and parallel executions in order to show where the computations begin to differ. If the original serial code is correct, errors due to parallelization will be isolated by the comparison. One of the primary goals of the system is to minimize the effort required of the user. To that end, the debugging system uses information produced by the parallelization tool to drive the comparison process. In particular the debugging system relies on the parallelization tool to provide information about where variables may have been modified and how arrays are distributed across multiple processes. User effort is also reduced through the use of dynamic instrumentation. This allows us to modify the program execution without changing the way the user builds the executable. The use of dynamic instrumentation also permits us to compare the executions in a fine-grained fashion and only involve the debugger when a difference has been detected. This reduces the overhead of executing instrumentation.
Relative Debugging of Automatically Parallelized Programs
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Hood, Robert; Biegel, Bryan (Technical Monitor)
2002-01-01
We describe a system that simplifies the process of debugging programs produced by computer-aided parallelization tools. The system uses relative debugging techniques to compare serial and parallel executions in order to show where the computations begin to differ. If the original serial code is correct, errors due to parallelization will be isolated by the comparison. One of the primary goals of the system is to minimize the effort required of the user. To that end, the debugging system uses information produced by the parallelization tool to drive the comparison process. In particular, the debugging system relies on the parallelization tool to provide information about where variables may have been modified and how arrays are distributed across multiple processes. User effort is also reduced through the use of dynamic instrumentation. This allows us to modify the program execution without changing the way the user builds the executable. The use of dynamic instrumentation also permits us to compare the executions in a fine-grained fashion and only involve the debugger when a difference has been detected. This reduces the overhead of executing instrumentation.
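The core comparison in relative debugging, as described in the two abstracts above, can be sketched as follows. The capture format and tolerances are hypothetical; real tools compare distributed arrays in place via instrumentation rather than via collected traces.

```python
# Sketch of a relative-debugging comparison: values captured at matching
# instrumentation points in the serial and parallelized runs are compared, and
# the debugger is only involved once the first difference appears.
import numpy as np

def first_divergence(serial_trace, parallel_trace, rtol=1e-10, atol=1e-12):
    """Return the label of the first instrumentation point whose values differ,
    or None if the two executions agree within tolerance."""
    for (label_s, value_s), (label_p, value_p) in zip(serial_trace, parallel_trace):
        assert label_s == label_p, "traces must record the same program points"
        if not np.allclose(value_s, value_p, rtol=rtol, atol=atol):
            return label_s          # hand control to the debugger here
    return None

# Hypothetical traces: (program point, array value) pairs captured by instrumentation.
serial   = [("loop_1:a", np.arange(4.0)), ("loop_2:b", np.array([1.0, 2.0, 3.0]))]
parallel = [("loop_1:a", np.arange(4.0)), ("loop_2:b", np.array([1.0, 2.0, 3.1]))]
print(first_divergence(serial, parallel))   # -> "loop_2:b"
```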
EFL Learners' Uses of Adverbs in Argumentative Essays
ERIC Educational Resources Information Center
Yilmaz, Ercan; Dikilitas, Kenan
2017-01-01
Adverbs require a great deal of effort to be mastered, and even the most advanced users of a language have difficulty in using them correctly (Narita & Sugiura, 2006; Peacock, 2010; Lei, 2012; Leedham & Cai, 2013). The purpose of this study is to find out to what extent relatively high proficiency level EFL learners use different types…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-20
... Collection; Comment Request; Harvest of Pacific Halibut by Guided Sport Charter Vessel Anglers off Alaska... halibut fishing effort and harvest by all user groups, including the guided sport charter sector of the... requirements. The State of Alaska Department of Fish and Game (ADF&G) Division of Sport Fish initiated a...
Architecture and Data Management Challenges in GEOSS and IEOS
NASA Technical Reports Server (NTRS)
Fontaine, Kathleen S.
2007-01-01
The international Group on Earth Observations (GEO) was initiated in 2003 to engage all the nations of the Earth in building a coordinated, comprehensive, and sustained Earth observation capability, known as the Global Earth Observation System of Systems (GEOSS). The GEO website describes GEOSS this way: "GEOSS will build on and add value to existing Earth-observation systems by coordinating their efforts, addressing critical gaps, supporting their interoperability, sharing information, reaching a common understanding of user requirements, and improving delivery of information to users." Each member nation has responded to GEO by establishing some sort of coordinating body; within the United States, that is the United States Group on Earth Observations (USGEO). This paper will describe the establishment of GEO and USGEO, will provide an overview of the activities and challenges in the area of architecture and data management, and will highlight some of the major efforts underway within USGEO today.
Continuation of research into software for space operations support, volume 1
NASA Technical Reports Server (NTRS)
Collier, Mark D.; Killough, Ronnie; Martin, Nancy L.
1990-01-01
A prototype workstation executive called the Hardware Independent Software Development Environment (HISDE) was developed. Software technologies relevant to workstation executives were researched and evaluated, and HISDE was used as a test bed for prototyping efforts. New X Windows software concepts and technology were introduced into workstation executives and related applications. The four research efforts performed included: (1) research into the usability and efficiency of Motif (an X Windows based graphical user interface), which consisted of converting the existing Athena-widget-based HISDE user interface to Motif, demonstrating the usability of Motif and providing insight into the level of effort required to translate an application from one widget set to another; (2) prototyping a real-time data display widget, which consisted of researching methods for, and prototyping the selected method of, displaying textual values in an efficient manner; (3) an X Windows performance evaluation, which consisted of a series of performance measurements demonstrating the ability of low-level X Windows to display textual information; and (4) converting the Display Manager, the application used by NASA for data display during operations, to X Windows/Motif.
On the design of script languages for neural simulation.
Brette, Romain
2012-01-01
In neural network simulators, models are specified according to a language, either specific to the simulator or based on a general programming language (e.g. Python). There are also ongoing efforts to develop standardized languages, for example NeuroML. When designing these languages, efforts are often focused on expressivity, that is, on maximizing the number of model types that can be described and simulated. I argue that a complementary goal should be to minimize the cognitive effort required on the part of the user to use the language. I try to formalize this notion with the concept of "language entropy", and I propose a few practical guidelines to minimize the entropy of languages for neural simulation.
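One way to make the "language entropy" notion concrete, under the assumption (mine, not stated in the abstract) that it is scored with a Shannon-style measure over the choices a user must make when writing a model:

```latex
% If specifying a model requires choices c_1, ..., c_n, each made among the
% syntactically valid options x with probabilities p_i(x), the language can be
% scored by the total entropy of those choices:
H(\mathrm{language}) = \sum_{i=1}^{n} H(c_i)
                     = -\sum_{i=1}^{n} \sum_{x} p_i(x)\,\log_2 p_i(x)
```

A language that makes the common case the default, so that each p_i is concentrated on one option, then imposes fewer bits of decision-making on the user.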
Documentation requirements for Applications Systems Verification and Transfer projects (ASVTs)
NASA Technical Reports Server (NTRS)
Suchy, J. T.
1977-01-01
NASA's Application Systems Verification and Transfer Projects (ASVTs) are deliberate efforts to facilitate the transfer of applications of NASA-developed space technology to users such as federal agencies, state and local governments, regional planning groups, public service institutions, and private industry. This study focused on the role of documentation in facilitating technology transfer both to primary users identified during project planning and to others with similar information needs. It was understood that documentation can be used effectively when it is combined with informal (primarily verbal) communication within each user community and with other formal techniques such as organized demonstrations and training programs. Documentation examples from eight ASVT projects and one potential project were examined to give scope to the investigation.
NASA Astrophysics Data System (ADS)
Early, A. B.; Chen, G.; Beach, A. L., III; Northup, E. A.
2016-12-01
NASA has conducted airborne tropospheric chemistry studies for over three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gases and aerosol properties. The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center in Hampton, Virginia, originally developed the Toolsets for Airborne Data (TAD) web application in September 2013 to meet user community needs for manipulating aircraft data for scientific research on climate change and air quality relevant issues. The analysis of airborne data typically requires data subsetting, which can be challenging and resource-intensive for end users. In an effort to streamline this process, the TAD toolset enhancements will include new data subsetting features and updates to the current database model. These will include two subsetters: temporal and spatial, and vertical profile. The temporal and spatial subsetter will allow users to focus on data from a specific location, a specific time period, or both. The vertical profile subsetter will retrieve data collected during an individual aircraft ascent or descent spiral. This effort will allow for the automation of the typically labor-intensive manual data subsetting process, which will provide users with data tailored to their specific research interests. The development of these enhancements will be discussed in this presentation.
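As a rough illustration of the two subsetters described above, the sketch below shows how temporal/spatial and vertical-profile subsetting might look in pandas; the column names (time, lat, lon, altitude) and the spiral-detection heuristic are assumptions, not the actual TAD schema or algorithm.

```python
# Hypothetical sketch of the two TAD subsetting operations (assumed schema).
import pandas as pd

def temporal_spatial_subset(df, t_start, t_end, lat_bounds, lon_bounds):
    """Keep records inside a time window and a lat/lon bounding box."""
    mask = (
        (df["time"] >= t_start) & (df["time"] <= t_end)
        & df["lat"].between(*lat_bounds)
        & df["lon"].between(*lon_bounds)
    )
    return df[mask]

def vertical_profile_subset(df, min_altitude_change=500.0):
    """Split a flight into monotonic ascent/descent segments (spirals)."""
    climb = df["altitude"].diff().fillna(0.0) >= 0       # True while ascending
    segment_id = (climb != climb.shift()).cumsum()       # new id at each turn
    profiles = []
    for _, seg in df.groupby(segment_id):
        # keep only segments with a substantial altitude change
        if seg["altitude"].max() - seg["altitude"].min() >= min_altitude_change:
            profiles.append(seg)
    return profiles
```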
QoS support for end users of I/O-intensive applications using shared storage systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Marion Kei; Zhang, Xuechen; Jiang, Song
2011-01-19
I/O-intensive applications are becoming increasingly common on today's high-performance computing systems. While performance of compute-bound applications can be effectively guaranteed with techniques such as space sharing or QoS-aware process scheduling, it remains a challenge to meet QoS requirements for end users of I/O-intensive applications using shared storage systems because it is difficult to differentiate I/O services for different applications with individual quality requirements. Furthermore, it is difficult for end users to accurately specify performance goals to the storage system using I/O-related metrics such as request latency or throughput. As access patterns, request rates, and the system workload change in time, a fixed I/O performance goal, such as bounds on throughput or latency, can be expensive to achieve and may not lead to meaningful performance guarantees such as bounded program execution time. We propose a scheme supporting end users' QoS goals, specified in terms of program execution time, in shared storage environments. We automatically translate the users' performance goals into instantaneous I/O throughput bounds using a machine learning technique, and use dynamically determined service time windows to efficiently meet the throughput bounds. We have implemented this scheme in the PVFS2 parallel file system and have conducted an extensive evaluation. Our results show that this scheme can satisfy realistic end-user QoS requirements by making highly efficient use of the I/O resources. The scheme seeks to balance programs' attainment of QoS requirements, and saves as much of the remaining I/O capacity as possible for best-effort programs.
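A minimal sketch of the central idea, translating an execution-time goal into an instantaneous throughput bound that is re-evaluated each service time window; the actual system uses a machine learning model for this translation, so the simple time-budget arithmetic below is only an illustrative assumption.

```python
# Hedged sketch: convert a program-level deadline into a per-window I/O
# throughput bound (the paper uses a learned model instead of this split).
def required_throughput(bytes_remaining, deadline_remaining_s, est_compute_remaining_s):
    """Return the throughput (bytes/s) needed so the remaining I/O finishes
    within whatever time is left after the estimated remaining computation."""
    io_budget_s = max(deadline_remaining_s - est_compute_remaining_s, 1e-6)
    return bytes_remaining / io_budget_s

# Example: 40 GB of I/O left, 600 s to the deadline, ~200 s of compute expected
bound = required_throughput(40e9, 600.0, 200.0)   # = 1e8 bytes/s (100 MB/s)
# Capacity above this bound can be left to best-effort programs.
```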
Human motion retrieval from hand-drawn sketch.
Chao, Min-Wen; Lin, Chao-Hung; Assa, Jackie; Lee, Tong-Yee
2012-05-01
The rapid growth of motion capture data increases the importance of motion retrieval. The majority of the existing motion retrieval approaches are based on a labor-intensive step in which the user browses and selects a desired query motion clip from the large motion clip database. In this work, a novel sketching interface for defining the query is presented. This simple approach allows users to define the required motion by sketching several motion strokes over a drawn character, which requires less effort and extends the users’ expressiveness. To support the real-time interface, a specialized encoding of the motions and the hand-drawn query is required. Here, we introduce a novel hierarchical encoding scheme based on a set of orthonormal spherical harmonic (SH) basis functions, which provides a compact representation and avoids the CPU/processing-intensive stage of temporal alignment used by previous solutions. Experimental results show that the proposed approach retrieves motions well and is capable of retrieving logically and numerically similar motions, outperforming previous approaches. The user study shows that the proposed system can be a useful tool for inputting a motion query once users are familiar with it. Finally, an application of generating a 3D animation from a hand-drawn comic strip is demonstrated.
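For a sense of what an SH-based motion signature can look like, here is a hedged sketch that projects sampled direction vectors onto spherical harmonics with SciPy; the paper's hierarchical scheme and its handling of hand-drawn strokes are more involved, so the sampling and weighting below are illustrative assumptions.

```python
# Hedged sketch of an SH-coefficient signature for a motion stroke.
import numpy as np
from scipy.special import sph_harm

def sh_descriptor(directions, l_max=4):
    """directions: (N, 3) unit vectors sampled along a joint trajectory or
    hand-drawn stroke. Returns a compact coefficient-magnitude signature."""
    x, y, z = directions[:, 0], directions[:, 1], directions[:, 2]
    theta = np.arctan2(y, x) % (2 * np.pi)   # azimuth in [0, 2*pi)
    phi = np.arccos(np.clip(z, -1.0, 1.0))   # polar angle in [0, pi]
    coeffs = []
    for l in range(l_max + 1):
        for m in range(-l, l + 1):
            # discrete projection: average of the basis values over the samples
            coeffs.append(np.mean(sph_harm(m, l, theta, phi)))
    return np.abs(np.array(coeffs))

# Motions and sketched queries encoded this way can be compared directly
# (e.g. by Euclidean distance), sidestepping explicit temporal alignment.
```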
A study of Minnesota land and water resources using remote sensing, volume 13
NASA Technical Reports Server (NTRS)
1980-01-01
Progress in the use of LANDSAT data to classify wetlands in the Upper Mississippi River Valley and efforts to evaluate stress in corn and soybean crops are described. Satellite remote sensing data was used to measure particle concentrations in Lake Superior, and several different kinds of remote sensing data were synergistically combined in order to identify near-surface bedrock in Minnesota. Data analysis techniques which separate those activities requiring extensive computing from those involving a great deal of user interaction were developed to allow the latter to be done in the user's office or in the field.
Tabletop computed lighting for practical digital photography.
Mohan, Ankit; Bailey, Reynold; Waite, Jonathan; Tumblin, Jack; Grimm, Cindy; Bodenheimer, Bobby
2007-01-01
We apply simplified image-based lighting methods to reduce the equipment, cost, time, and specialized skills required for high-quality photographic lighting of desktop-sized static objects such as museum artifacts. We place the object and a computer-steered moving-head spotlight inside a simple foam-core enclosure and use a camera to record photos as the light scans the box interior. Optimization, guided by interactive user sketching, selects a small set of these photos whose weighted sum best matches the user-defined target sketch. Unlike previous image-based relighting efforts, our method requires only a single area light source, yet it can achieve high-resolution light positioning to avoid multiple sharp shadows. A reduced version uses only a handheld light and may be suitable for battery-powered field photography equipment that fits into a backpack.
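The selection step can be pictured as a constrained fit: find non-negative weights over the scanned-light photos so their weighted sum matches the target sketch, then keep only a handful of lights. The sketch below uses non-negative least squares as a stand-in; it is an assumed formulation, not the authors' exact optimizer or their sketch-guided weighting.

```python
# Hedged sketch: pick a few basis photos whose weighted sum matches a target.
import numpy as np
from scipy.optimize import nnls

def select_lighting(photos, target, k=4):
    """photos: (n_photos, n_pixels) stack of basis images; target: (n_pixels,).
    Greedily keeps the k photos whose fitted weights are largest."""
    A = photos.T                      # pixels x photos
    weights, _ = nnls(A, target)      # non-negative least squares fit
    keep = np.argsort(weights)[-k:]   # retain the k most useful lights
    w_k, _ = nnls(A[:, keep], target) # refit with only those photos
    relit = A[:, keep] @ w_k          # resulting relit image
    return keep, w_k, relit
```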
Tracking and data acquisition system for the 1990's. Volume 7: TDAS space technology assessment
NASA Technical Reports Server (NTRS)
Khatri, R.
1983-01-01
The results of the TDAS and user spacecraft technology assessment effort are provided. For each TDAS satellite enhancement and user spacecraft element previously enumerated, the technology issues are identified and the R&D needed to resolve these issues is delineated. Subsequently, taking into account developments taking place elsewhere, the additional unique TDAS satellite module and user spacecraft element R&D efforts needed are identified, and conclusions are drawn in each case. From these conclusions, it is evident that with additional unique R&D efforts carried out for TDAS and appropriate user spacecraft elements, the desired TDAS capabilities for the 1990's can be realized and user spacecraft can be implemented that adequately interface with the projected TDAS.
Workshop AccessibleTV "Accessible User Interfaces for Future TV Applications"
NASA Astrophysics Data System (ADS)
Hahn, Volker; Hamisu, Pascal; Jung, Christopher; Heinrich, Gregor; Duarte, Carlos; Langdon, Pat
Approximately half of the elderly people over 55 suffer from some type of typically mild visual, auditory, motor or cognitive impairment. For them, interaction, especially with PCs and other complex devices, is sometimes challenging, although accessible ICT applications could make much of a difference to their quality of life. Basically, they have the potential to enable or simplify participation and inclusion in the surrounding private and professional communities. However, the availability of accessible user interfaces capable of adapting to the specific needs and requirements of users with individual impairments is very limited. Although there are a number of APIs [1, 2, 3, 4] available for various platforms that allow developers to provide accessibility features within their applications, today none of them provides features for the automatic adaptation of multimodal interfaces capable of automatically fitting the individual requirements of users with different kinds of impairments. Moreover, the provision of accessible user interfaces is still expensive and risky for application developers, as they need special experience and effort for user tests. Today many implementations simply neglect the needs of elderly people, thus locking out a large portion of their potential users. The workshop is organized as part of the dissemination activity for the European-funded project GUIDE, "Gentle user interfaces for elderly people", which aims to address this situation with a comprehensive approach to the realization of multimodal user interfaces capable of adapting to the needs of users with different kinds of mild impairments. As an application platform, GUIDE will mainly target TVs and set-top boxes, such as the emerging Connected-TV or WebTV platforms, as they have the potential to address the needs of elderly users with applications such as home automation, communication or continuing education.
Computer-Based Tools for Evaluating Graphical User Interfaces
NASA Technical Reports Server (NTRS)
Moore, Loretta A.
1997-01-01
The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; rather, it should be part of an iterative design cycle with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research, an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.
COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL
NASA Technical Reports Server (NTRS)
Roush, G. B.
1994-01-01
The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Adapting COSTMODL to any organization's particular environment can yield a significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse-sensitive, with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo Professional 5.0 for recompilation. An executable is provided on the distribution diskettes. COSTMODL requires 512K RAM. The standard distribution medium for COSTMODL is three 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. COSTMODL was developed in 1991. IBM PC is a registered trademark of International Business Machines. Borland and Turbo Pascal are registered trademarks of Borland International, Inc. Turbo Professional is a trademark of TurboPower Software. MS-DOS is a registered trademark of Microsoft Corporation.
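To make the kind of estimate COSTMODL produces concrete, here is the Basic COCOMO model, one of the five algorithms listed above, using the standard published coefficients rather than any NASA-calibrated values.

```python
# Basic COCOMO effort/schedule estimate (published coefficients; illustrative only).
COCOMO_BASIC = {
    # mode: (a, b, c, d)
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    a, b, c, d = COCOMO_BASIC[mode]
    effort_pm = a * kloc ** b          # effort in person-months
    schedule_mo = c * effort_pm ** d   # development schedule in months
    staffing = effort_pm / schedule_mo # average staffing level
    return effort_pm, schedule_mo, staffing

# e.g. basic_cocomo(32, "semidetached") -> roughly 146 person-months,
# a 14-month schedule, and an average staff of about 10.
```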
ERIC Educational Resources Information Center
Lin, Yen-Ting; Jou, Min
2013-01-01
Advancements in information and communication technology (ICT) allowed several tools and systems to be proposed for improving classroom experiences to both instructors and students. However, most of these tools were brand-new and stand-alone programs that require users to invest additional time and effort to become familiar with their use. This…
Workshop Report: Joint Requirements Oversight Council Process.
1996-02-28
provides media for professional exchange and peer criticism among students, theoreticians, practitioners, and users of military operations research. These...exchange of ideas and methods...involvement in the annual Joint Warfare Interoperability Demonstrations (JWID)...Subsequent efforts could include multiple...forums for exchange of ideas at the working level, but studies and analysis opportunities as well...clear, visible relations between the JWCAs need to
Defense Security Enterprise Architecture (DSEA) Product Reference Guide. Revision 1.0
2016-06-01
research and development efforts and functional requirements to provide an information sharing capability across all defense security domains. The...Office of the Secretary of Defense (OSD) Research and Development (RDT&E) initiative addressing vertical and horizontal information sharing across the...legal responsibilities to ensure data received by analysts meets user-specified criteria. This advancement in information sharing is made
The Open Translation MOOC: Creating Online Communities to Transcend Linguistic Barriers
ERIC Educational Resources Information Center
Beaven, Tita; Comas-Quinn, Anna; Hauck, Mirjam; de los Arcos, Beatriz; Lewis, Timothy
2013-01-01
One of the main barriers to the reuse of Open Educational Resources (OER) is language (OLnet, 2009). OER may be available but in a language that users cannot access, so a preliminary step to reuse is their translation or localization. One of the obvious solutions to the vast effort required to translate OER is to crowd-source the translation, as…
The Cost of Doing Business: Cost Structure of Electronic Immunization Registries
Fontanesi, John M; Flesher, Don S; De Guire, Michelle; Lieberthal, Allan; Holcomb, Kathy
2002-01-01
Objective To predict the true cost of developing and maintaining an electronic immunization registry, and to set the framework for developing future cost-effectiveness and cost-benefit analyses. Data Sources/Study Setting Primary data collected at three immunization registries located in California, accounting for 90 percent of all immunization records in registries in the state during the study period. Study Design A parametric cost analysis compared registry development and maintenance expenditures to registry performance requirements. Data Collection/Extraction Methods Data were collected at each registry through interviews and reviews of expenditure records, technical accomplishments, development schedules, and immunization coverage rates. Principal Findings The cost of building immunization registries is predictable and independent of the hardware/software combination employed. Building a registry requires about four man-years of technical effort, or approximately $250,000 in 1998 dollars. Costs for maintaining a registry were approximately $5,100 per end user per three-year period. Conclusions There is a predictable cost structure for both developing and maintaining immunization registries. The cost structure can be used as a framework for examining the cost-effectiveness and cost-benefits of registries. The greatest factor affecting improvement in coverage rates was ongoing, user-based administrative investment. PMID:12479497
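Applying the reported figures directly (development of roughly $250,000 in 1998 dollars plus maintenance of about $5,100 per end user per three-year period) gives a simple back-of-the-envelope calculator; the linear scaling with users and years is our simplifying assumption, not a claim from the study.

```python
# Back-of-the-envelope registry cost using the figures quoted in the abstract.
def registry_cost(n_end_users, years):
    development = 250_000                              # 1998 dollars
    maintenance = 5_100 * n_end_users * (years / 3.0)  # per-user, per 3-year period
    return development + maintenance

# e.g. registry_cost(200, 6) -> 250,000 + 5,100*200*2 = 2,290,000 (1998 dollars)
```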
Building a Smart Portal for Astronomy
NASA Astrophysics Data System (ADS)
Derriere, S.; Boch, T.
2011-07-01
The development of a portal for accessing astronomical resources is not an easy task. The ever-increasing complexity of the data products can result in very complex user interfaces, requiring a lot of effort and learning from the user in order to perform searches. This is often a design choice, where the user must explicitly set many constraints, while the portal search logic remains simple. We investigated a different approach, where the query interface is kept as simple as possible (ideally, a simple text field, like for Google search), and the search logic is made much more complex to interpret the query in a relevant manner. We will present the implications of this approach in terms of interpretation and categorization of the query parameters (related to astronomical vocabularies), translation (mapping) of these concepts into the portal components metadata, identification of query schemes and use cases matching the input parameters, and delivery of query results to the user.
TADS: A CFD-based turbomachinery and analysis design system with GUI. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Myers, R. A.; Topp, D. A.; Delaney, R. A.
1995-01-01
The primary objective of this study was the development of a computational fluid dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a graphical user interface (GUI). The computer codes resulting from this effort are referred to as the Turbomachinery Analysis and Design System (TADS). This document is intended to serve as a user's manual for the computer programs which comprise the TADS system. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of various programs was done in a way that alternative solvers or grid generators could be easily incorporated into the TADS framework.
NASA Astrophysics Data System (ADS)
Cortes Arevalo, Juliette; den Haan, Robert-Jan; van der Voort, Mascha; Hulscher, Suzanne
2016-04-01
Effective communication strategies between different scientific disciplines, practitioners and non-experts are necessary for a shared understanding and better implementation of river management measures. In that context, the RiverCare program aims to gain a better understanding of riverine measures that are being implemented towards self-sustaining multifunctional rivers in the Netherlands. During the RiverCare program, user committees are organized between researchers and practitioners to discuss the aim and value of RiverCare outputs and the assumptions and uncertainties behind scientific results. Beyond the end of the RiverCare program, knowledge about river interventions, integrated effects, management and self-sustaining applications will be available to experts and non-experts by means of the RiverCare communication tools: a web-collaborative platform and a serious gaming environment. As part of the communication project of RiverCare, we are designing the RiverCare web-collaborative platform and the knowledge base behind that platform. We aim to promote collaborative efforts and knowledge exchange in river management. However, knowledge exchange does not magically happen. Consultation and discussion of RiverCare outputs, as well as elicitation of perspectives and preferences from different actors about the effects of riverine measures, have to be facilitated. During the RiverCare research activities, the platform will support the user committees or collaborative sessions that are regularly held with the organizations directly benefiting from our research, at project level or in study areas. The design process of the collaborative platform follows a user-centred approach to identify user requirements, co-create a conceptual design, and iteratively develop and evaluate prototypes of the platform. The envisioned web-collaborative platform opens with an explanation and visualisation of the RiverCare outputs that are available in the knowledge base. Collaborative sessions are initiated by a facilitator who invites other users to contribute by agreeing on an objective for the session and on the ways and period of collaboration. Upon login, users can join the sessions to which they are invited or in which they are willing to participate. Within these sessions, users collaboratively engage on the topic at hand, acquiring knowledge about the ongoing results of RiverCare, sharing knowledge between actors and co-constructing new knowledge in the process as input for RiverCare research activities. An overview of each session will be presented to registered and non-registered users to document collaboration efforts and promote interaction with actors outside RiverCare. At the user requirements analysis stage of the collaborative platform, a questionnaire and a workshop session were launched to uncover end users' preferences and expectations about the tool to be designed. The results comprised insights about design criteria for the collaborative platform. The user requirements analysis will be followed by interview sessions with RiverCare researchers and user committee members to identify considerations for data management, objectives of collaboration, expected outputs and indicators to evaluate the collaborative platform. On the one hand, the considerations of intended users are important for co-designing tools that effectively communicate and promote a shared understanding of scientific outputs. On the other hand, the active involvement of end users is important for establishing measurable indicators to evaluate the tool and the collaborative process.
A simple tool for neuroimaging data sharing
Haselgrove, Christian; Poline, Jean-Baptiste; Kennedy, David N.
2014-01-01
Data sharing is becoming increasingly common, but despite encouragement and facilitation by funding agencies, journals, and some research efforts, most neuroimaging data acquired today is still not shared due to political, financial, social, and technical barriers to sharing data that remain. In particular, technical solutions are few for researchers that are not a part of larger efforts with dedicated sharing infrastructures, and social barriers such as the time commitment required to share can keep data from becoming publicly available. We present a system for sharing neuroimaging data, designed to be simple to use and to provide benefit to the data provider. The system consists of a server at the International Neuroinformatics Coordinating Facility (INCF) and user tools for uploading data to the server. The primary design principle for the user tools is ease of use: the user identifies a directory containing Digital Imaging and Communications in Medicine (DICOM) data, provides their INCF Portal authentication, and provides identifiers for the subject and imaging session. The user tool anonymizes the data and sends it to the server. The server then runs quality control routines on the data, and the data and the quality control reports are made public. The user retains control of the data and may change the sharing policy as they need. The result is that in a few minutes of the user’s time, DICOM data can be anonymized and made publicly available, and an initial quality control assessment can be performed on the data. The system is currently functional, and user tools and access to the public image database are available at http://xnat.incf.org/. PMID:24904398
Understanding User Preferences and Awareness: Privacy Mechanisms in Location-Based Services
NASA Astrophysics Data System (ADS)
Burghardt, Thorben; Buchmann, Erik; Müller, Jens; Böhm, Klemens
Location based services (LBS) let people retrieve and share information related to their current position. Examples are Google Latitude or Panoramio. Since LBS share user-related content, location information etc., they put user privacy at risk. Literature has proposed various privacy mechanisms for LBS. However, it is unclear which mechanisms humans really find useful, and how they make use of them. We present a user study that addresses these issues. To obtain realistic results, we have implemented a geotagging application on the web and on GPS cellphones, and our study participants use this application in their daily lives. We test five privacy mechanisms that differ in the awareness, mental effort and degree of informedness required from the users. Among other findings, we have observed that in situations where a single simple mechanism does not meet all privacy needs, people want to use simple and sophisticated mechanisms in combination. Further, individuals are concerned about the privacy of others, even when they do not value privacy for themselves.
Flaherty, Sarah-Jane; McCarthy, Mary; Collins, Alan; McAuliffe, Fionnuala
2018-02-01
To assess the quality of nutrition content and the integration of user quality components and behaviour change theory relevant to food purchasing behaviour in a sample of existing mobile apps. Descriptive comparative analysis of eleven mobile apps comprising an assessment of their alignment with existing evidence on nutrition, behaviour change and user quality, and their potential ability to support healthier food purchasing behaviour. Mobile apps freely available for public use in Google Play were assessed and scored according to agreed criteria to assess nutrition content quality and the integration of behaviour change theory and user quality components. A sample of eleven mobile apps that met predefined inclusion criteria to ensure relevance and good quality. The quality of the nutrition content varied. Improvements to the accuracy and appropriateness of nutrition content are needed to ensure mobile apps support a healthy behaviour change process and are accessible to a wider population. There appears to be a narrow focus towards behaviour change, with an overemphasis on behavioural outcomes and a small number of behaviour change techniques, which may limit effectiveness. A significant effort from the user was required to use the mobile apps appropriately, which may negatively influence user acceptability and subsequent utilisation. Existing mobile apps may offer a potentially effective approach to supporting healthier food purchasing behaviour, but improvements in mobile app design are required to maximise their potential effectiveness. Engagement of mobile app users and nutrition professionals is recommended to support effective design.
NASA Technical Reports Server (NTRS)
Westerlund, F. V.
1975-01-01
User applications of remote sensing in Washington State are described. The first project created a multi-temporal land use/land cover data base for the environs of the Seattle-Tacoma International Airport, to serve planning and management operations of the Port of Seattle. The second is an on-going effort to develop a capability within the Puget Sound Governmental Conference, a council of governments (COG), to inventory and monitor land use within its four county jurisdiction. Developmental work has focused on refinement of land use/cover classification systems applicable at this regional scale and various levels of detail in relation to program requirements of the agency. Related research, refinement of manual methods, user training and approaches to technology transfer are discussed.
Professionals' views on mental health service users' education: challenges and support.
Nieminen, I; Kaunonen, M
2017-02-01
WHAT IS KNOWN ON THE SUBJECT?: Mental health service users (MHSUs) may experience disruptions in their education. However, education has been shown to have a positive influence on their recovery, potentially offering them broader employment opportunities. The literature suggests that providing support for MHSUs in their educational efforts may be beneficial and is wished for by the service users themselves. However, there is a lack of mental health professionals' views on the topic in the setting of a community mental health centre. WHAT DOES THIS PAPER ADD TO THE EXISTING KNOWLEDGE?: In the perception of mental health professionals, the predominance of disease in the life of MHSUs and their marginalization may form barriers to their success in education. Professionals can support MHSUs in their educational efforts by strengthening the MHSUs' internal resources and creating a supportive environment with professional expertise available. A service user-centred education might further help MHSUs to achieve their educational goals. Our findings confirm previous knowledge of a recovery-oriented approach to supporting MHSUs' education. This study explored the topic from the professionals' perspective in the context of community mental health centres, which is a fresh view in the research literature. WHAT ARE THE IMPLICATIONS FOR PRACTICE?: The findings suggest which types of support professionals perceive to be required for MHSUs to advance their studies. Knowledge of adequate forms of support can be applied in the mental health nursing practice to develop support measures for service users to advance in their studies. All levels of the community mental health centres should be aware of and adopt a recovery-oriented approach. MHSUs and professionals need to have a shared opinion on the definition of recovery orientation. This requires mutual discussion and the more active involvement of MHSUs in the design of their own rehabilitation process. Introduction Studies show the importance of providing support for mental health service users' (MHSUs') education. However, none of these studies explored this support in the community mental health centre setting. The range of MHSUs' educational activities identified in this study varied from participation in courses at the mental health centres to independent studies at different levels of education outside the centres. Aim (1) How do mental health professionals perceive the challenges that may limit service users' potential when they apply for, and complete, their education? (2) How do the professionals describe the methods of rehabilitation aimed at supporting the service users in achieving their educational goals? Method The data were collected from 14 mental health professionals using focus group interviews. Inductive content analysis was then performed. Results Professionals perceive that the predominance of disease and marginalization may be barriers to MHSUs' success in education. Strengthening the MHSUs' internal resources, creating a supportive environment with professional expertise available and service user-centred education appeared to support the MHSUs' educational achievements. Our findings confirm previous knowledge of a recovery-oriented approach to support MHSUs' education. However, professionals' views on this topic in the context of community mental health centres have not been investigated previously. Discussion Professionals perceive that a recovery-oriented approach to rehabilitation may support MHSUs in their educational efforts. 
Implications for practice A recovery-oriented approach should be adopted by all levels of the community mental health centres. MHSUs and professionals need to have a shared opinion on the definition of recovery orientation. This requires mutual discussion and a more active involvement of MHSUs in the design of their own rehabilitation process. © 2017 John Wiley & Sons Ltd.
Growth requirements for multidiscipline research and development on the evolutionary space station
NASA Technical Reports Server (NTRS)
Meredith, Barry; Ahlf, Peter; Saucillo, Rudy; Eakman, David
1988-01-01
The NASA Space Station Freedom is being designed to facilitate on-orbit evolution and growth to accommodate changing user needs and future options for U.S. space exploration. In support of the Space Station Freedom Program Preliminary Requirements Review, the Langley Space Station Office has identified a set of resource requirements for Station growth which is deemed adequate for the various evolution options. As part of that effort, analysis was performed to scope requirements for Space Station as an expanding, multidiscipline facility for scientific research, technology development and commercial production. This report describes the assumptions, approach and results of the study.
Modified weighted fair queuing for packet scheduling in mobile WiMAX networks
NASA Astrophysics Data System (ADS)
Satrya, Gandeva B.; Brotoharsono, Tri
2013-03-01
The increase in user mobility and the need for data access anytime also increase the interest in broadband wireless access (BWA). The best available quality of experience is assured for mobile data service users on IEEE 802.16e-based networks. The main problem in assuring a high QoS value is how to allocate available resources among users in order to meet QoS requirements for criteria such as delay, throughput, packet loss and fairness. The IEEE standards state no specific scheduling mechanism, leaving it to implementer differentiation. There are five QoS service classes defined by IEEE 802.16: Unsolicited Grant Service (UGS), Extended Real Time Polling Service (ertPS), Real Time Polling Service (rtPS), Non Real Time Polling Service (nrtPS) and Best Effort Service (BE). Each class has different QoS parameter requirements for throughput and delay/jitter constraints. This paper proposes a Modified Weighted Fair Queuing (MWFQ) scheduling scenario based on Weighted Round Robin (WRR) and Weighted Fair Queuing (WFQ). The performance of MWFQ was assessed using the above five QoS criteria. The simulation shows that using the concept of total packet size calculation improves the network's performance.
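As background for the MWFQ proposal, the sketch below shows plain weighted fair queuing, in which each class accumulates virtual finish times inversely proportional to its weight; the system virtual time and the paper's total-packet-size modification are omitted, so this is a simplified illustration rather than the authors' algorithm, and the class weights are made-up examples.

```python
# Simplified weighted fair queuing: serve packets in order of virtual finish time.
import heapq

class WFQScheduler:
    def __init__(self, weights):
        self.weights = weights                # e.g. {"UGS": 8, "rtPS": 4, "nrtPS": 2, "BE": 1}
        self.virtual_finish = dict.fromkeys(weights, 0.0)
        self.queue = []                       # heap of (finish_time, seq, class, size)
        self._seq = 0

    def enqueue(self, cls, packet_size):
        start = self.virtual_finish[cls]      # last finish time of this class
        finish = start + packet_size / self.weights[cls]
        self.virtual_finish[cls] = finish
        heapq.heappush(self.queue, (finish, self._seq, cls, packet_size))
        self._seq += 1

    def dequeue(self):
        """Return the (class, packet_size) with the smallest virtual finish time."""
        _, _, cls, packet_size = heapq.heappop(self.queue)
        return cls, packet_size
```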
Hwang, Beomsoo; Jeon, Doyoung
2015-04-09
In exoskeletal robots, the quantification of the user's muscular effort is important to recognize the user's motion intentions and evaluate motor abilities. In this paper, we attempt to estimate users' muscular efforts accurately using joint torque sensors, whose measurements contain the dynamic effects of the human body, such as the inertial, Coriolis, and gravitational torques, as well as the torque produced by active muscular effort. It is important to extract the dynamic effects of the user's limb accurately from the measured torque. The user's limb dynamics are formulated and a convenient method of identifying user-specific parameters is suggested for estimating the user's muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower limb exoskeleton, EXOwheel, which was equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated by 10 healthy participants during body weight-supported gait training. The experimental results show that the torque sensors can be used to estimate the muscular torque accurately under both relaxed and activated muscle conditions.
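The extraction step described above can be written in the standard rigid-body form, with the active muscular torque obtained by subtracting the modeled limb dynamics from the sensor reading; the paper's user-specific parameter identification procedure is not reproduced here.

```latex
\[
  \tau_{\mathrm{muscle}}
    \;=\; \tau_{\mathrm{sensor}}
    \;-\; \bigl( M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + g(q) \bigr)
\]
% q: joint angles; M: inertia matrix; C: Coriolis/centrifugal terms;
% g: gravitational torque. M, C and g use the identified user-specific parameters.
```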
2014-04-25
CAPE CANAVERAL, Fla. – Construction workers have installed the framing and some of the inner walls inside Firing Room 4 in the Launch Control Center at NASA's Kennedy Space Center in Florida. Three rows of upper level management consoles remain. The Ground Systems Development and Operations Program is overseeing efforts to create a new firing room based on a multi-user concept. The design of Firing Room 4 will incorporate five control room areas that are flexible to meet current and future NASA and commercial user requirements. The equipment and most of the consoles from Firing Room 4 were moved to Firing Room 2 for possible future reuse. Photo credit: NASA/Dimitri Gerondidakis
When is it time to get married? Or when should the assay user and the assay developer collaborate?
Swan, S H; Lasley, B L
1991-01-01
Hormone assays are being developed in the laboratory to detect specific molecular markers in nonclinical populations. Epidemiology is increasingly using these assays to improve the precision with which disease processes and exposures can be defined. This growing body of molecular epidemiology requires a high degree of cooperation between the assay developer and the assay user. We draw on our experience in using a sensitive hormone assay for the detection of early pregnancy via urinary human chorionic gonadotropin to illustrate these points. We conclude that this collaborative effort, in addition to making this study possible, has provided unexpected rewards. PMID:1954925
Nieke, Jens; Reusen, Ils
2007-01-01
User-driven requirements for remote sensing data are difficult to define, especially details on geometric, spectral and radiometric parameters. Even more difficult is a decent assessment of the required degrees of processing and corresponding data quality. It is therefore a real challenge to appropriately assess data costs and services to be provided. In 2006, the HYRESSA project was initiated within the Sixth Framework Programme of the European Commission to analyze the user needs of the hyperspectral remote sensing community. Special focus was given to finding an answer to the key question, “What are the individual user requirements for hyperspectral imagery and its related data products?”. A Value-Benefit Analysis (VBA) was performed to retrieve user needs and address open items accordingly. The VBA is an established tool for systematic problem solving that supports the comparison of competing projects or solutions. It enables evaluation on the basis of a multidimensional objective model and can be augmented with experts' preferences. After the VBA, a scaling method (e.g., the Law of Comparative Judgment) was applied to obtain the desired ranking judgments. The result, which is the relative value of projects with respect to a well-defined main objective, can therefore be produced analytically using a VBA. A multidimensional objective model adhering to VBA methodology was established. Thereafter, end users and experts were requested to fill out a Questionnaire of User Needs (QUN) at the highest level of detail - the value indicator level. The end user was additionally requested to report personal preferences for his particular research field. In the end, results from the experts' evaluation and results from a sensor data survey can be compared in order to understand user needs and the drawbacks of existing data products. The investigation – focusing on the needs of the hyperspectral user community in Europe – showed that a VBA is a suitable method for analyzing the needs of hyperspectral data users and supporting the sensor/data specification-building process. The VBA has the advantage of being easy to handle, resulting in a comprehensive evaluation. The primary disadvantage is the large effort required to carry out such an analysis, because the level of detail is extremely high.
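The aggregation at the heart of a Value-Benefit Analysis is a weighted scoring of alternatives against the objective model; the sketch below shows that step only, with made-up criteria and weights rather than HYRESSA's actual objective model or the Law of Comparative Judgment scaling.

```python
# Hedged sketch of the VBA scoring step with hypothetical criteria and weights.
def value_benefit_score(ratings, weights):
    """ratings, weights: dicts keyed by criterion; weights should sum to 1."""
    return sum(weights[c] * ratings[c] for c in weights)

alternatives = {
    "sensor_A": {"spectral": 8, "geometric": 6, "processing_level": 7},
    "sensor_B": {"spectral": 6, "geometric": 9, "processing_level": 5},
}
weights = {"spectral": 0.5, "geometric": 0.3, "processing_level": 0.2}

# Rank alternatives by their weighted value-benefit score.
ranking = sorted(alternatives,
                 key=lambda a: value_benefit_score(alternatives[a], weights),
                 reverse=True)
```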
NASA Technical Reports Server (NTRS)
Wakim, Nagi T.; Srivastava, Sadanand; Bousaidi, Mehdi; Goh, Gin-Hua
1995-01-01
Agent-based technologies answer to several challenges posed by additional information processing requirements in today's computing environments. In particular, (1) users desire interaction with computing devices in a mode which is similar to that used between people, (2) the efficiency and successful completion of information processing tasks often require a high-level of expertise in complex and multiple domains, (3) information processing tasks often require handling of large volumes of data and, therefore, continuous and endless processing activities. The concept of an agent is an attempt to address these new challenges by introducing information processing environments in which (1) users can communicate with a system in a natural way, (2) an agent is a specialist and a self-learner and, therefore, it qualifies to be trusted to perform tasks independent of the human user, and (3) an agent is an entity that is continuously active performing tasks that are either delegated to it or self-imposed. The work described in this paper focuses on the development of an interface agent for users of a complex information processing environment (IPE). This activity is part of an on-going effort to build a model for developing agent-based information systems. Such systems will be highly applicable to environments which require a high degree of automation, such as, flight control operations and/or processing of large volumes of data in complex domains, such as the EOSDIS environment and other multidisciplinary, scientific data systems. The concept of an agent as an information processing entity is fully described with emphasis on characteristics of special interest to the User-System Interface Agent (USIA). Issues such as agent 'existence' and 'qualification' are discussed in this paper. Based on a definition of an agent and its main characteristics, we propose an architecture for the development of interface agents for users of an IPE that is agent-oriented and whose resources are likely to be distributed and heterogeneous in nature. The architecture of USIA is outlined in two main components: (1) the user interface which is concerned with issues as user dialog and interaction, user modeling, and adaptation to user profile, and (2) the system interface part which deals with identification of IPE capabilities, task understanding and feasibility assessment, and task delegation and coordination of assistant agents.
Natural Resource Information System, design analysis
NASA Technical Reports Server (NTRS)
1972-01-01
The computer-based system stores, processes, and displays map data relating to natural resources. The system was designed on the basis of requirements established in a user survey and an analysis of decision flow. The design analysis effort is described, and the rationale behind major design decisions, including map processing, cell vs. polygon, choice of classification systems, mapping accuracy, system hardware, and software language is summarized.
An Assessment of Joint Chat Requirements From Current Usage Patterns
2006-06-01
Army Special Forces firebases always had SATCOM connectivity with the text messaging capability running, and most business within the firebases was...overreact, or proactive action on bad information, and points to the need for good business rules. Some organizations, like USCENTAF, have already...developed chat business rules. Users like how chat facilitates understanding with written text. Time and effort are saved from repeating questions
Stakeholder Collaboration in Air Force Acquisition: Adaptive Design Using System Representations
2003-06-01
...closer collaboration spanning requirements activities in the user community and acquisition activities. Drafts in 2003 of new versions of DoD...come to grips with the necessary changes in their activities and processes to effectively implement these objectives. As this research effort
NASA Astrophysics Data System (ADS)
Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.
2009-12-01
As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community-valued intermediary data sets, results and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost and expertise. We propose a modeling, data and knowledge center that houses NASA satellite data, climate data and ancillary data, where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform, named the Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis and compute environments that are customizable, “archivable” and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce costs and time associated with scientific efforts by freeing users from redundantly retrieving and integrating data sets and building modeling and analysis codes. The EMC platform also allows users to receive indirect assistance from experts through prefabricated compute environments, potentially reducing study “ramp-up” times.
The development of an intelligent user interface for NASA's scientific databases
NASA Technical Reports Server (NTRS)
Campbell, William J.; Roelofs, Larry H.
1986-01-01
The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has as one of its components, the development of an Intelligent User Interface (IUI). The intent of the IUI effort is to develop a friendly and intelligent user interface service that is based on expert systems and natural language processing technologies. This paper presents the design concepts, development approach and evaluation of performance of a prototype Intelligent User Interface Subsystem (IUIS) supporting an operational database.
NASA Technical Reports Server (NTRS)
1986-01-01
The Johnson Space Center Management Information System (JSCMIS) is an interface to computer data bases at NASA Johnson which allows an authorized user to browse and retrieve information from a variety of sources with minimum effort. This issue gives requirements definition and design specifications for versions 2.1 and 2.1.1, along with documented test scenario environments, and security object design and specifications.
NASA Technical Reports Server (NTRS)
McComas, David C.; Strege, Susanne L.; Carpenter, Paul B.; Hartman, Randy
2015-01-01
The core Flight System (cFS) is a flight software (FSW) product line developed by the Flight Software Systems Branch (FSSB) at NASA's Goddard Space Flight Center (GSFC). The cFS uses compile-time configuration parameters to implement variable requirements to enable portability across embedded computing platforms and to implement different end-user functional needs. The verification and validation of these requirements is proving to be a significant challenge. This paper describes the challenges facing the cFS and the results of a pilot effort to apply EXB Solution's testing approach to the cFS applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baart, T. A.; Vandersypen, L. M. K.; Kavli Institute of Nanoscience, Delft University of Technology, P.O. Box 5046, 2600 GA Delft
We report the computer-automated tuning of gate-defined semiconductor double quantum dots in GaAs heterostructures. We benchmark the algorithm by creating three double quantum dots inside a linear array of four quantum dots. The algorithm sets the correct gate voltages for all the gates to tune the double quantum dots into the single-electron regime. The algorithm only requires (1) prior knowledge of the gate design and (2) the pinch-off value of the single gate T that is shared by all the quantum dots. This work significantly alleviates the user effort required to tune multiple quantum dot devices.
Onyx-Advanced Aeropropulsion Simulation Framework Created
NASA Technical Reports Server (NTRS)
Reed, John A.
2001-01-01
The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.
ATR NSUF Instrumentation Enhancement Efforts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joy L. Rempe; Mitchell K. Meyer; Darrell L. Knudson
A key component of the Advanced Test Reactor (ATR) National Scientific User Facility (NSUF) effort is to expand instrumentation available to users conducting irradiation tests in this unique facility. In particular, development of sensors capable of providing real-time measurements of key irradiation parameters is emphasized because of their potential to increase data fidelity and reduce posttest examination costs. This paper describes the strategy for identifying new instrumentation needed for ATR irradiations and the program underway to develop and evaluate new sensors to address these needs. Accomplishments from this program are illustrated by describing new sensors now available to users of the ATR NSUF. In addition, progress is reported on current research efforts to provide improved in-pile instrumentation to users.
Standardised Benchmarking in the Quest for Orthologs
Altenhoff, Adrian M.; Boeckmann, Brigitte; Capella-Gutierrez, Salvador; Dalquen, Daniel A.; DeLuca, Todd; Forslund, Kristoffer; Huerta-Cepas, Jaime; Linard, Benjamin; Pereira, Cécile; Pryszcz, Leszek P.; Schreiber, Fabian; Sousa da Silva, Alan; Szklarczyk, Damian; Train, Clément-Marie; Bork, Peer; Lecompte, Odile; von Mering, Christian; Xenarios, Ioannis; Sjölander, Kimmen; Juhl Jensen, Lars; Martin, Maria J.; Muffato, Matthieu; Gabaldón, Toni; Lewis, Suzanna E.; Thomas, Paul D.; Sonnhammer, Erik; Dessimoz, Christophe
2016-01-01
The identification of evolutionarily related genes across different species—orthologs in particular—forms the backbone of many comparative, evolutionary, and functional genomic analyses. Achieving high accuracy in orthology inference is thus essential. Yet the true evolutionary history of genes, required to ascertain orthology, is generally unknown. Furthermore, orthologs are used for very different applications across different phyla, with different requirements in terms of the precision-recall trade-off. As a result, assessing the performance of orthology inference methods remains difficult for both users and method developers. Here, we present a community effort to establish standards in orthology benchmarking and facilitate orthology benchmarking through an automated web-based service (http://orthology.benchmarkservice.org). Using this new service, we characterise the performance of 15 well-established orthology inference methods and resources on a battery of 20 different benchmarks. Standardised benchmarking provides a way for users to identify the most effective methods for the problem at hand, sets a minimal requirement for new tools and resources, and guides the development of more accurate orthology inference methods. PMID:27043882
Geographic information system development in the CARETS project
Mitchell, William B.; Fegeas, Robin G.; Fitzpatrick, Katherine A.; Hallam, Cheryl A.
1977-01-01
Experience in the development of a geographic information system to support the CARETS project has confirmed the considerable advantages that may accrue by paralleling the system development with a rational and balanced system production effort which permits the integration of the education and training of users with interim deliverable products to them. Those advantages include support for a long-term staff plan that recognizes substantial staff changes through system development and implementation, a fiscal plan that provides continuity in resources necessary for total system development, and a feedback system which allows the user to communicate his experiences in using the system. Thus far balance between system development and system production has not been achieved because of continuing large-scale spatial data processing requirements coupled with strong and insistent demands from users for immediately deliverable products from the system. That imbalance has refocussed staffing and fiscal plans from long-term system development to short- and near-term production requirements, continuously extends total system development time, and increases the possibility that later system development may reduce the usefulness of current interim products.
The Value of Metrics for Science Data Center Management
NASA Astrophysics Data System (ADS)
Moses, J.; Behnke, J.; Watts, T. H.; Lu, Y.
2005-12-01
The Earth Observing System Data and Information System (EOSDIS) has been collecting and analyzing records of science data archive, processing and product distribution for more than 10 years. The types of information collected and the analysis performed have matured and progressed to become an integral and necessary part of the system management and planning functions. Science data center managers are realizing the importance that metrics can play in influencing and validating their business model. New efforts focus on better understanding of users and their methods. Examples include tracking user web site interactions and conducting user surveys such as the government authorized American Customer Satisfaction Index survey. This paper discusses the metrics methodology, processes and applications that are growing in EOSDIS, the driving requirements and compelling events, and the future envisioned for metrics as an integral part of earth science data systems.
Design of an immersive simulator for assisted power wheelchair driving.
Devigne, Louise; Babel, Marie; Nouviale, Florian; Narayanan, Vishnu K; Pasteau, Francois; Gallien, Philippe
2017-07-01
Driving a power wheelchair is a difficult and complex visual-cognitive task. As a result, some people with visual and/or cognitive disabilities cannot access the benefits of a power wheelchair because their impairments prevent them from driving safely. In order to improve their access to mobility, we have previously designed a semi-autonomous assistive wheelchair system which progressively corrects the trajectory as the user manually drives the wheelchair and smoothly avoids obstacles. Developing and testing such systems for wheelchair driving assistance requires a significant amount of material resources and clinician time. With Virtual Reality technology, prototypes can be developed and tested in a risk-free and highly flexible Virtual Environment before equipping and testing a physical prototype. Additionally, users can "virtually" test and train more easily during the development process. In this paper, we introduce a power wheelchair driving simulator allowing the user to navigate with a standard wheelchair in an immersive 3D Virtual Environment. The simulation framework is designed to be flexible so that we can use different control inputs. In order to validate the framework, we first performed tests on the simulator with able-bodied participants during which the user's Quality of Experience (QoE) was assessed through a set of questionnaires. Results show that the simulator is a promising tool for future works as it generates a good sense of presence and requires rather low cognitive effort from users.
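To make the correction behavior concrete, the sketch below shows one plausible way a shared-control step could blend a joystick command with an obstacle-driven correction; the control law, parameters, and thresholds are illustrative assumptions of ours, not the scheme used in the paper.

```python
# Minimal sketch of a shared-control correction step for simulated wheelchair
# driving (hypothetical; the paper does not specify its actual control law).
import math

def corrected_command(user_v, user_w, obstacles, influence_radius=1.5, gain=0.8):
    """Blend the user's joystick command (linear v, angular w) with a
    repulsive turn-rate correction derived from nearby obstacles.

    obstacles: list of (distance_m, bearing_rad) pairs in the robot frame.
    Returns the (v, w) actually sent to the simulated wheelchair.
    """
    correction = 0.0
    for dist, bearing in obstacles:
        if dist < influence_radius:
            # Push the heading away from the obstacle; closer obstacles push harder.
            weight = (influence_radius - dist) / influence_radius
            correction += -math.copysign(weight, bearing)
    w = user_w + gain * correction
    # Slow down when a correction is active so avoidance stays smooth.
    v = user_v * (1.0 - min(abs(correction), 0.7))
    return v, w

print(corrected_command(0.8, 0.0, [(0.9, 0.3)]))  # steers away from the obstacle and slows slightly
```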
Knoblock-Hahn, Amy L; LeRouge, Cynthia M
2014-04-01
Consumer health technologies (CHTs) are a growing part of the continuum of care for self-management of overweight and obesity. Parents positively or negatively influence adolescent weight-management efforts and are especially important throughout continuum-of-care settings. User-centered design (UCD) applications have been developed to assist primary users, such as adolescents, with their weight management, but less is known about the influence of parents as secondary users across many socio-ecological environments. The purpose of this study was to use the Unified Theory of Acceptance and Use of Technology (UTAUT) to inform the design of a UCD application in a qualitative study that sought to determine parental views on how technology can support previously learned behaviors that require ongoing management and support beyond formal lifestyle interventions. Parents of overweight and obese adolescents (n=14) were interviewed about perceived usefulness and planned user-intent of a CHT that was designed for adolescents. UTAUT provided theoretical parental constructs (intention, performance and effort expectancy, and social influence) and framed their interactions within several socio-ecological contexts, including the home food environment and restaurant dining experiences. Although generalizations of this qualitative study are limited by a small sample size with predominantly mothers (n=13) of overweight and obese daughters (n=12), the exploratory inquiry using a parent as a secondary consumer user can complement the adoption of applications designed by adolescents. Copyright © 2014 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reisman, D.J.
A variety of issues must be addressed in the development of software for information resources. One is accessibility and use of information. Another is that properly designing, abstracting, indexing, and performing quality control on a database requires the effort of well-trained and knowledgeable personnel as well as substantial financial resources. Transferring data to other locations has inherent difficulties, including those related to incompatibility. The main issue in developing health risk assessment databases is the needs of the user.
Irregular Applications: Architectures & Algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feo, John T.; Villa, Oreste; Tumeo, Antonino
Irregular applications are characterized by irregular data structures, control, and communication patterns. Novel irregular high-performance applications that deal with large data sets have recently appeared. Unfortunately, current high-performance systems and software infrastructures execute irregular algorithms poorly. Only coordinated efforts by end users, area specialists, and computer scientists that consider both the architecture and the software stack may be able to provide solutions to the challenges of modern irregular applications.
Public Service Communication Satellite Program
NASA Technical Reports Server (NTRS)
Brown, J. P.
1977-01-01
The proposed NASA Public Service Communication Satellite Program consists of four different activities designed to fulfill the needs of the public service sector. These are: interaction with the users, experimentation with existing satellites, development of a limited capability satellite for the earliest possible launch, and initiation of an R&D program to develop the greatly increased capability that future systems will require. This paper will discuss NASA efforts in each of these areas.
Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia
Segkouli, Sofia; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos
2015-01-01
Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI), such as usability and utility, through a large number of analytic, usability-oriented approaches such as cognitive models, in order to provide users with experiences fitting their specific needs. However, there is demand for more specific modules, embodied in a cognitive architecture, that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort to enhance the accessibility of ICT products for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied in cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multiple tasks, and monitored the performance of infotainment-related tasks. This provides more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interface design, with increased task complexity capturing a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data, to be used for more reliable interface evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282
Trends in communicative access solutions for children with cerebral palsy.
Myrden, Andrew; Schudlo, Larissa; Weyand, Sabine; Zeyl, Timothy; Chau, Tom
2014-08-01
Access solutions may facilitate communication in children with limited functional speech and motor control. This study reviews current trends in access solution development for children with cerebral palsy, with particular emphasis on the access technology that harnesses a control signal from the user (eg, movement or physiological change) and the output device (eg, augmentative and alternative communication system) whose behavior is modulated by the user's control signal. Access technologies have advanced from simple mechanical switches to machine vision (eg, eye-gaze trackers), inertial sensing, and emerging physiological interfaces that require minimal physical effort. Similarly, output devices have evolved from bulky, dedicated hardware with limited configurability, to platform-agnostic, highly personalized mobile applications. Emerging case studies encourage the consideration of access technology for all nonverbal children with cerebral palsy with at least nascent contingency awareness. However, establishing robust evidence of the effectiveness of the aforementioned advances will require more expansive studies. © The Author(s) 2014.
Error Characterization of Altimetry Measurements at Climate Scales
NASA Astrophysics Data System (ADS)
Ablain, Michael; Larnicol, Gilles; Faugere, Yannice; Cazenave, Anny; Meyssignac, Benoit; Picot, Nicolas; Benveniste, Jerome
2013-09-01
Thanks to studies performed in the framework of the SALP project (supported by CNES) since the TOPEX era and, more recently, in the framework of the Sea-Level Climate Change Initiative project (supported by ESA), strong improvements have been made in the estimation of the global and regional mean sea level over the whole altimeter period for all the altimetric missions. Thanks to these efforts, a better characterization of altimeter measurement errors at climate scales has been performed and is presented in this paper. These errors have been compared to user requirements in order to determine whether the scientific goals are met by altimeter missions. The main message of this paper is the importance of strengthening the link between the altimetry and climate communities to improve or refine user requirements, to better specify future altimeter systems for climate applications, and also to reprocess older missions beyond their original specifications.
Support of surgical process modeling by using adaptable software user interfaces
NASA Astrophysics Data System (ADS)
Neumuth, T.; Kaschek, B.; Czygan, M.; Goldstein, D.; Strauß, G.; Meixensberger, J.; Burgert, O.
2010-03-01
Surgical Process Modeling (SPM) is a powerful method for acquiring data about the evolution of surgical procedures. Surgical Process Models are used in a variety of use cases including evaluation studies, requirements analysis and procedure optimization, surgical education, and workflow management scheme design. This work proposes the use of adaptive, situation-aware user interfaces for observation support software for SPM. We developed a method to support the modeling of the observer by using an ontological knowledge base. This is used to drive the graphical user interface for the observer, restricting the search space of terminology depending on the current situation. The evaluation study shows that the workload of the observer was decreased significantly by using adaptive user interfaces. 54 SPM observation protocols were analyzed using the NASA Task Load Index, and the adaptive user interface was shown to significantly reduce the observer's workload on the effort, mental demand, and temporal demand criteria, helping the observer concentrate on the essential task of modeling the surgical process.
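The following sketch illustrates the general idea of a situation-aware terminology restriction; the phase names, vocabulary, and mapping are invented for illustration and do not reproduce the authors' ontology or software.

```python
# Illustrative sketch (not the authors' implementation): a situation-aware
# filter that narrows the terminology offered to the observer based on the
# current surgical phase, mimicking the ontology-driven GUI restriction.
SURGICAL_ONTOLOGY = {  # hypothetical phase -> admissible activity terms
    "approach": ["incise", "retract", "coagulate"],
    "resection": ["dissect", "coagulate", "irrigate", "suction"],
    "closure": ["suture", "irrigate"],
}

def admissible_terms(current_phase, all_terms):
    """Return only the terms plausible in the current phase, falling back to
    the full vocabulary if the phase is unknown."""
    allowed = SURGICAL_ONTOLOGY.get(current_phase)
    if allowed is None:
        return list(all_terms)
    return [t for t in all_terms if t in allowed]

vocabulary = ["incise", "retract", "dissect", "suture", "coagulate", "irrigate", "suction"]
print(admissible_terms("closure", vocabulary))  # ['suture', 'irrigate']
```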
Wang, Jichuan; Kelly, Brian C; Liu, Tieqiao; Hao, Wei
2016-03-01
Given the growth in methamphetamine use in China during the 21st century, we assessed perceived psychosocial barriers to drug treatment among this population. Using a sample of 303 methamphetamine users recruited via Respondent Driven Sampling, we use Latent Class Analysis (LCA) to identify possible distinct latent groups among Chinese methamphetamine users on the basis of their perceptions of psychosocial barriers to drug treatment. After covariates were included to predict latent class membership, the 3-step modeling approach was applied. Our findings indicate that the Chinese methamphetamine using population was heterogeneous on perceptions of drug treatment barriers; four distinct latent classes (subpopulations) were identified--Unsupported Deniers, Deniers, Privacy Anxious, and Low Barriers--and individual characteristics shaped the probability of class membership. Efforts to link Chinese methamphetamine users to treatment may require a multi-faceted approach that attends to differing perceptions about impediments to drug treatment. Copyright © 2015. Published by Elsevier Inc.
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schreckenghost, Debra L.; Woods, David D.; Potter, Scott S.; Johannesen, Leila; Holloway, Matthew; Forbus, Kenneth D.
1991-01-01
Initial results are reported from a multi-year, interdisciplinary effort to provide guidance and assistance for designers of intelligent systems and their user interfaces. The objective is to achieve more effective human-computer interaction (HCI) for systems with real-time fault management capabilities. Intelligent fault management systems within NASA were evaluated for insight into the design of systems with complex HCI. Preliminary results include: (1) a description of real time fault management in aerospace domains; (2) recommendations and examples for improving intelligent systems design and user interface design; (3) identification of issues requiring further research; and (4) recommendations for a development methodology integrating HCI design into intelligent system design.
Enhanced In-Pile Instrumentation at the Advanced Test Reactor
NASA Astrophysics Data System (ADS)
Rempe, Joy L.; Knudson, Darrell L.; Daw, Joshua E.; Unruh, Troy; Chase, Benjamin M.; Palmer, Joe; Condie, Keith G.; Davis, Kurt L.
2012-08-01
Many of the sensors deployed at materials and test reactors cannot withstand the high flux/high temperature test conditions often requested by users at U.S. test reactors, such as the Advanced Test Reactor (ATR) at the Idaho National Laboratory. To address this issue, an instrumentation development effort was initiated as part of the ATR National Scientific User Facility in 2007 to support the development and deployment of enhanced in-pile sensors. This paper provides an update on this effort. Specifically, this paper identifies the types of sensors currently available to support in-pile irradiations and those sensors currently available to ATR users. Accomplishments from new sensor technology deployment efforts are highlighted by describing new temperature and thermal conductivity sensors now available to ATR users. Efforts to deploy enhanced in-pile sensors for detecting elongation and real-time flux detectors are also reported, and recently-initiated research to evaluate the viability of advanced technologies to provide enhanced accuracy for measuring key parameters during irradiation testing is noted.
D-MATRIX: a web tool for constructing weight matrix of conserved DNA motifs.
Sen, Naresh; Mishra, Manoj; Khan, Feroz; Meena, Abha; Sharma, Ashok
2009-07-27
Despite considerable efforts to date, DNA motif prediction in whole genomes remains a challenge for researchers. Currently, genome-wide motif prediction tools require either a direct pattern sequence (for a single motif) or a weight matrix (for multiple motifs). Although there are known motif pattern databases and tools for genome-level prediction, there is no tool for weight matrix construction. Considering this, we developed the D-MATRIX tool, which predicts different types of weight matrix based on a user-defined aligned motif sequence set and motif width. For retrieval of known motif sequences, users can access commonly used databases such as TFD, RegulonDB, DBTBS, and Transfac. The D-MATRIX program uses a simple statistical approach for weight matrix construction, and the resulting matrix can be converted into different file formats according to user requirements. It provides the possibility to identify conserved motifs in co-regulated genes or the whole genome. As an example, we successfully constructed the weight matrix of the LexA transcription factor binding site with the help of known sos-box cis-regulatory elements in the Deinococcus radiodurans genome. The algorithm is implemented in C-Sharp and wrapped in ASP.Net to maintain a user-friendly web interface. The D-MATRIX tool is accessible through the CIMAP domain network. http://203.190.147.116/dmatrix/
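For readers unfamiliar with weight matrices, the sketch below builds a minimal log-odds position weight matrix from a few aligned motif sequences; it only illustrates the general technique, since the exact statistics used by D-MATRIX are not described here, and the toy motifs, background model, and pseudocount are our own assumptions.

```python
# A minimal sketch of position weight matrix (PWM) construction from a set of
# aligned motif sequences, in the spirit of what D-MATRIX does (the exact
# statistics used by the tool are not specified here, so this is illustrative).
from collections import Counter
import math

def build_pwm(aligned_motifs, pseudocount=1.0):
    """Return a log-odds weight matrix: one dict {A,C,G,T: score} per column."""
    width = len(aligned_motifs[0])
    assert all(len(m) == width for m in aligned_motifs), "motifs must be aligned"
    background = 0.25  # assume uniform background base frequencies
    pwm = []
    for col in range(width):
        counts = Counter(m[col].upper() for m in aligned_motifs)
        total = len(aligned_motifs) + 4 * pseudocount
        column = {}
        for base in "ACGT":
            freq = (counts.get(base, 0) + pseudocount) / total
            column[base] = math.log2(freq / background)
        pwm.append(column)
    return pwm

motifs = ["TACTGTATAT", "TACTGTACAT", "AACTGTTTAT"]  # toy sos-box-like examples
for i, col in enumerate(build_pwm(motifs)):
    print(i, {b: round(s, 2) for b, s in col.items()})
```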
An intelligent interface for satellite operations: Your Orbit Determination Assistant (YODA)
NASA Technical Reports Server (NTRS)
Schur, Anne
1988-01-01
An intelligent interface is often characterized by the ability to adapt evaluation criteria as the environment and user goals change. Some factors that impact these adaptations are redefinition of task goals and, hence, user requirements; time criticality; and system status. To implement adaptations affected by these factors, a new set of capabilities must be incorporated into the human-computer interface design. These capabilities include: (1) dynamic update and removal of control states based on user inputs, (2) generation and removal of logical dependencies as change occurs, (3) uniform and smooth interfacing to numerous processes, databases, and expert systems, and (4) unobtrusive on-line assistance to users. These concepts were applied and incorporated into a human-computer interface using artificial intelligence techniques to create a prototype expert system, Your Orbit Determination Assistant (YODA). YODA is a smart interface that supports, in real time, orbit analysts who must determine the location of a satellite during the station acquisition phase of a mission. Also described is the integration of four knowledge sources required to support the orbit determination assistant: orbital mechanics, spacecraft specifications, characteristics of the mission support software, and orbit analyst experience. This initial effort is continuing with expansion of YODA's capabilities, including evaluation of results of the orbit determination task.
Yu, Jessica S; Pertusi, Dante A; Adeniran, Adebola V; Tyo, Keith E J
2017-03-15
High throughput screening by fluorescence activated cell sorting (FACS) is a common task in protein engineering and directed evolution. It can also be a rate-limiting step if high false positive or negative rates necessitate multiple rounds of enrichment. Current FACS software requires the user to define sorting gates by intuition and is practically limited to two dimensions. In cases when multiple rounds of enrichment are required, the software cannot forecast the enrichment effort required. We have developed CellSort, a support vector machine (SVM) algorithm that identifies optimal sorting gates based on machine learning using positive and negative control populations. CellSort can take advantage of more than two dimensions to enhance the ability to distinguish between populations. We also present a Bayesian approach to predict the number of sorting rounds required to enrich a population from a given library size. This Bayesian approach allowed us to determine strategies for biasing the sorting gates in order to reduce the required number of enrichment rounds. This algorithm should be generally useful for improving sorting outcomes and reducing effort when using FACS. Source code available at http://tyolab.northwestern.edu/tools/ . k-tyo@northwestern.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
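As a rough illustration of an SVM-defined sorting gate (not the published CellSort implementation; the control populations, channels, and margin below are synthetic assumptions):

```python
# Illustrative sketch of an SVM-derived sorting gate, in the spirit of CellSort
# (this is not the published implementation; features and numbers are made up).
import numpy as np
from sklearn import svm

rng = np.random.default_rng(0)
# Hypothetical control populations measured in three fluorescence channels.
negative = rng.normal(loc=[2.0, 2.0, 2.0], scale=0.5, size=(500, 3))
positive = rng.normal(loc=[4.0, 3.5, 4.5], scale=0.5, size=(500, 3))

X = np.vstack([negative, positive])
y = np.array([0] * len(negative) + [1] * len(positive))

# Train the classifier that defines the sorting gate in all three dimensions.
gate = svm.SVC(kernel="rbf", gamma="scale", C=1.0).fit(X, y)

# Bias the gate toward purity by requiring a margin above the decision surface.
def keep_event(event, margin=0.5):
    return gate.decision_function(event.reshape(1, -1))[0] > margin

library = rng.normal(loc=[3.0, 2.8, 3.2], scale=1.0, size=(1000, 3))
kept = np.array([keep_event(e) for e in library])
print(f"Gate keeps {kept.mean():.1%} of the library events")
```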
Building Format-Agnostic Metadata Repositories
NASA Astrophysics Data System (ADS)
Cechini, M.; Pilone, D.
2010-12-01
This presentation will discuss the problems that surround persisting and discovering metadata in multiple formats; a set of tenets that must be addressed in a solution; and NASA’s Earth Observing System (EOS) ClearingHOuse’s (ECHO) proposed approach. In order to facilitate cross-discipline data analysis, Earth Scientists will potentially interact with more than one data source. The most common data discovery paradigm relies on services and/or applications facilitating the discovery and presentation of metadata. What may not be common are the formats in which the metadata are formatted. As the number of sources and datasets utilized for research increases, it becomes more likely that a researcher will encounter conflicting metadata formats. Metadata repositories, such as the EOS ClearingHOuse (ECHO), along with data centers, must identify ways to address this issue. In order to define the solution to this problem, the following tenets are identified: - There exists a set of ‘core’ metadata fields recommended for data discovery. - There exists a set of users who will require the entire metadata record for advanced analysis. - There exists a set of users who will require a ‘core’ set of metadata fields for discovery only. - There will never be a cessation of new formats or a total retirement of all old formats. - Users should be presented metadata in a consistent format. ECHO has undertaken an effort to transform its metadata ingest and discovery services in order to support the growing set of metadata formats. In order to address the previously listed items, ECHO’s new metadata processing paradigm utilizes the following approach: - Identify a cross-format set of ‘core’ metadata fields necessary for discovery. - Implement format-specific indexers to extract the ‘core’ metadata fields into an optimized query capability. - Archive the original metadata in its entirety for presentation to users requiring the full record. - Provide on-demand translation of ‘core’ metadata to any supported result format. With this identified approach, the Earth Scientist is provided with a consistent data representation as they interact with a variety of datasets that utilize multiple metadata formats. They are then able to focus their efforts on the more critical research activities which they are undertaking.
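A minimal sketch of the format-specific indexer idea follows; the format names, field paths, and helper functions are hypothetical and simply show how per-format extractors can feed a common set of 'core' discovery fields while the original record is archived whole.

```python
# A minimal sketch of format-agnostic 'core' field extraction: one indexer per
# metadata format feeding a common discovery record, while the original
# document is archived untouched. Field and format names here are illustrative.
import json
import xml.etree.ElementTree as ET

def index_xml_record(xml_text):
    root = ET.fromstring(xml_text)
    return {"title": root.findtext("ShortName"),
            "start_time": root.findtext("Temporal/RangeDateTime/BeginningDateTime")}

def index_json_record(json_text):
    doc = json.loads(json_text)
    return {"title": doc.get("title"), "start_time": doc.get("time_start")}

INDEXERS = {"xml": index_xml_record, "json": index_json_record}

def ingest(record_text, fmt, archive):
    archive.append(record_text)          # keep the full record for power users
    return INDEXERS[fmt](record_text)    # core fields for discovery queries

archive = []
core = ingest('{"title": "MODIS L2 SST", "time_start": "2010-01-01T00:00:00Z"}',
              "json", archive)
print(core)
```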
An indoor navigation system for the visually impaired.
Guerrero, Luis A; Vasquez, Francisco; Ochoa, Sergio F
2012-01-01
Navigation in indoor environments is highly challenging for the severely visually impaired, particularly in spaces visited for the first time. Several solutions have been proposed to deal with this challenge. Although some of them have shown to be useful in real scenarios, they involve an important deployment effort or use artifacts that are not natural for blind users. This paper presents an indoor navigation system that was designed taking into consideration usability as the quality requirement to be maximized. This solution enables one to identify the position of a person and to calculate the velocity and direction of his movements. Using this information, the system determines the user's trajectory, locates possible obstacles in that route, and offers navigation information to the user. The solution has been evaluated using two experimental scenarios. Although the results are still not enough to provide strong conclusions, they indicate that the system is suitable to guide visually impaired people through an unknown built environment.
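One simple way the position-to-guidance chain could work is sketched below; the geometry, thresholds, and obstacle check are illustrative assumptions of ours rather than the system described in the paper.

```python
# Minimal sketch (assumptions ours): estimate walking velocity and heading from
# two successive indoor position fixes, then check whether a known obstacle
# lies close to the projected path.
import math

def velocity_and_heading(p_prev, p_curr, dt):
    vx, vy = (p_curr[0] - p_prev[0]) / dt, (p_curr[1] - p_prev[1]) / dt
    speed = math.hypot(vx, vy)
    heading = math.atan2(vy, vx)          # radians, 0 = +x axis
    return speed, heading

def obstacle_ahead(p_curr, heading, obstacle, look_ahead=3.0, lateral_tol=0.5):
    dx, dy = obstacle[0] - p_curr[0], obstacle[1] - p_curr[1]
    along = dx * math.cos(heading) + dy * math.sin(heading)    # distance ahead
    across = -dx * math.sin(heading) + dy * math.cos(heading)  # lateral offset
    return 0.0 < along < look_ahead and abs(across) < lateral_tol

speed, heading = velocity_and_heading((0.0, 0.0), (0.8, 0.1), dt=1.0)
print(round(speed, 2), obstacle_ahead((0.8, 0.1), heading, obstacle=(2.5, 0.2)))
```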
Addressing Earth Science Data Access Challenges through User Experience Research
NASA Astrophysics Data System (ADS)
Hemmings, S. N.; Banks, B.; Kendall, J.; Lee, C. M.; Irwin, D.; Toll, D. L.; Searby, N. D.
2013-12-01
The NASA Capacity Building Program (Earth Science Division, Applied Sciences Program) works to enhance end-user capabilities to employ Earth observation and Earth science (EO/ES) data in decision-making. Open data access and user-tailored data delivery strategies are critical elements towards this end. User Experience (UX) and User Interface (UI) research methods can offer important contributions towards addressing data access challenges, particularly at the interface of science application/product development and product transition to end-users. This presentation focuses on developing nation contexts and describes methods, results, and lessons learned from two recent UX/UI efforts conducted in collaboration with NASA: the SERVIRglobal.net redesign project and the U.S. Water Partnership (USWP) Portal development effort. SERVIR, a collaborative venture among NASA, USAID, and global partners, seeks to improve environmental management and climate change response by helping governments and other stakeholders integrate EO and geospatial technologies into decision-making. The USWP, a collaboration among U.S. public and private sectors, harnesses U.S.-based resources and expertise to address water challenges in developing nations. SERVIR's study, conducted from 2010-2012, assessed and tested user needs, preferences, and online experiences to generate a more user-friendly online data portal at SERVIRglobal.net. The portal provides a central access interface to data and products from SERVIR's network of hubs in East Africa, the Hindu Kush Himalayas, and Mesoamerica. The second study, conducted by the USWP Secretariat and funded by the U.S. Department of State, seeks to match U.S.-based water information resources with developing nation stakeholder needs. The USWP study utilizes a multi-pronged approach to identify key design requirements and to understand the existing water data portal landscape. Adopting UX methods allows data distributors to design customized UIs that help users find, interpret, and obtain appropriate content quickly. The data access challenge for both SERVIR and USWP consisted of organizing a wide range of content for their respective user bases, which are diverse, international, and in some cases loosely characterized. The UX/UI design approach generated profiles of prototypical users and corresponding task flows and organizational schemes for their preferred types of content. Wireframe acceptance testing by SERVIR helped elicit and optimize how users interact with the information online. These approaches produced customized UIs and knowledge management strategies to address the data access challenges faced by each user type. Both studies revealed critical considerations for user experiences in developing nations (e.g., low-bandwidth internet connections, rolling power outages at data storage or network centers). For SERVIR, these findings influenced not only the portal infrastructure; they also informed the transition of the platform to a Cloud-based model, as well as the development of custom data delivery tools such as SMS and other mobile solutions. While SERVIR's data access solutions are customized for the network's community of users, they are also standardized and interoperable according to GEO and ISO standards, providing a model for other initiatives such as the ongoing USWP Portal development effort.
Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems
NASA Astrophysics Data System (ADS)
Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika
2017-06-01
Objective. This work proposes principled strategies for self-adaptations in EEG-based Brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and a way to enable fluent and intuitive interaction in embodiment systems. The main focus is laid upon inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. Approach. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be recursively applied upon the arrival of evidences, i.e. user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of the robot/environment (simulated and physical), the type of the interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Main results. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Significance. Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for the individual users. The proposed methods can be easily integrated in devising more advanced SC schemes and/or strategies for automatic BCI self-adaptations.
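The recursive Bayesian update at the heart of such goal recognition can be sketched as follows; the likelihood model, goal layout, and evidence stream here are toy assumptions, not the authors' user-agnostic rule.

```python
# Sketch of a recursive Bayesian update over candidate navigation goals, in the
# spirit of the goal-recognition rule described above (the actual likelihood
# models used by the authors are not reproduced here; ours are toy weights).
import math

def normalize(belief):
    total = sum(belief.values())
    return {g: p / total for g, p in belief.items()}

def update_belief(belief, evidence_angle, goal_angles, kappa=4.0):
    """Multiply the prior belief by a likelihood that favors goals whose
    direction agrees with the observed evidence (user input or gaze angle)."""
    posterior = {}
    for goal, prior in belief.items():
        diff = evidence_angle - goal_angles[goal]
        likelihood = math.exp(kappa * math.cos(diff))  # von-Mises-like weight
        posterior[goal] = prior * likelihood
    return normalize(posterior)

goal_angles = {"door": 0.0, "desk": math.pi / 2, "window": math.pi}
belief = {g: 1 / 3 for g in goal_angles}            # uniform prior
for observed in [0.2, 0.1, -0.05]:                  # stream of gaze/input evidence
    belief = update_belief(belief, observed, goal_angles)
print({g: round(p, 3) for g, p in belief.items()})  # mass concentrates on "door"
```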
Interactive Vulnerability Analysis Enhancement Results
2012-12-01
from Java EE web-based applications to other non-web-based Java programs. Technology developed in this effort should be generally applicable to other... Generating a rule is a two-click process that requires no input from the user. • Task 3: Added support for non-Java EE applications. Aspect's... investigated a variety of Java-based technologies and how IAST can support them. We were successful in adding support for Scala, a popular new language, and
NASA Technical Reports Server (NTRS)
Corey, Stephen; Carnahan, Richard S., Jr.
1990-01-01
A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be automatically performed with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.
Effective organizational solutions for implementation of DBMS software packages
NASA Technical Reports Server (NTRS)
Jones, D.
1984-01-01
The Space Telescope management information system development effort serves as a guideline for discussing effective organizational solutions used in implementing DBMS software. Focus is on the importance of strategic planning. The value of constructing an information system architecture to conform to the organization's managerial needs, the need for a senior decision maker, dealing with shifting user requirements, and the establishment of a reliable working relationship with the DBMS vendor are examined. Requirements for a schedule to demonstrate progress against a defined timeline and the importance of continued monitoring for production software control, production data control, and software enhancements are also discussed.
1981-02-01
the machine. ARI's efforts in this area focus on human performance problems related to interactions with command and control centers, and on issues... improvement of the user-machine interface. Lacking consistent design principles, current practice results in a fragmented and unsystematic approach to system... complexity in the user-machine interface of BAS, ARI supported this effort for development of an online language for Army tactical intelligence
User recruitment, training, and support at NOAO Data Lab
NASA Astrophysics Data System (ADS)
Nikutta, Robert; Fitzpatrick, Michael J.; NOAO Data Lab
2018-06-01
The NOAO Data Lab (datalab.noao.edu) is a fully-fledged science data & analysis platform. However, simply building a science platform is not enough to declare it a success. Like any such system built for users, it needs actual users who see enough value in it to be willing to overcome the inertia of registering an account, studying the documentation, working through examples, and ultimately attempting to solve their own science problems using the platform. The NOAO Data Lab has been open to users since June 2016. In this past year we have registered hundreds of users and improved the system, not least through the interaction with and feedback from our users. The poster will delineate our efforts to recruit new users through conference presentations, platform demos and user workshops, and what we do to assure that users experience their first steps and their learning process with Data Lab as easy, competent, and inspiring. It will also present our efforts in user retention and user support, from a human-staffed helpdesk, to one-on-one sessions, to regular "bring-your-own-problem (BYOP)" in-house sessions with interested users.
GeosciNET: Building a Global Geoinformatics Partnership
NASA Astrophysics Data System (ADS)
Snyder, W. S.; Lehnert, K. A.; Ito, E.; Harms, U.; Klump, J.
2008-12-01
GeosciNET is a collaboration of several existing geoinformatics efforts organized to provide a more effective data system for geoscience projects. Current members are: CoreWall (www.corewall.org), Geoinformatics for Geochemistry (GfG; www.geoinfogeochem.org), System for Earth Sample Registration (SESAR; www.geosamples.org ), GeoStrat SYS (www.geostratsys.org (formerly: PaleoStrat, www.paleostrat.org)), and the International Continental Drilling Program (ICDP; www.icdp-online.org). GeosciNET's basic goal is to advance coordination, complementarity, and interoperability, and minimize duplication of efforts among the involved partner systems in order to streamline the development and operation of geoinformatics efforts. We believe that by advancing the development and data holdings of its member groups, the overall value of each site will be significantly enhanced and better meet the needs of the users. With the existing membership, GeosciNET can offer a comprehensive, integrated system for data acquisition, dissemination, archiving, visualization, integration, and analysis. The system will enable a single researcher or a group of collaborators to keep track of, visualize, and digitally archive any type of sample- or stratigraphic-based data produced from drill holes, dredges, measured stratigraphic sections, the field, or the laboratory. The challenge is to build a linked system that provides users a library of research data as well as tools to input, discover, access, integrate, manipulate, analyze, and model interdisciplinary data - all without corrupting the original data and insuring that the data are attributed to the originator at all times. Science runs on data, but despite the importance of data (legacy or otherwise), there are currently few convenient mechanisms that enable users to easily input their data into databases. While some efforts such as GfG databases, PetDB and SedDB have worked hard to compile such data, only users' active participation can capture the major part of critical legacy data, and insure that new data enter the digital stream as they are generated. GeosciNET wants to lower the barriers so users can take advantage of geoinformatics resources and embrace its promise as the platform for doing the science of the future. Once these benefits are understood by the user community, the obstacles that currently exist in building a larger geoinformatics system will start to erode. User participation requires the proper tools such as translators that can recognize tags and parse the data accordingly, and incentives such as tools for visualization, synthesis and analysis, and digital collaboration space. A major focus for GeosciNET is to support individual researchers and projects that do not have their own dedicated data management and education and outreach programs. One of the greatest challenges for geoinformatics lies in being perceived as a friendly resource by its users where they can easily link their observations and analyses and integrate them with other data. GeosciNET will be experimenting with mechanisms to accomplish these goals.
NASA Astrophysics Data System (ADS)
Lang, Sherman Y. T.; Brooks, Martin; Gauthier, Marc; Wein, Marceli
1993-05-01
A data display system for embedded realtime systems has been developed for use as an operator's user interface and debugging tool. The motivation for development of the On-Line Data Display (ODD) has come from several sources. In particular the design reflects the needs of researchers developing an experimental mobile robot within our laboratory. A proliferation of specialized user interfaces revealed a need for a flexible communications and graphical data display system. At the same time the system had to be readily extensible for arbitrary graphical display formats which would be required for data visualization needs of the researchers. The system defines a communication protocol transmitting 'datagrams' between tasks executing on the realtime system and virtual devices displaying the data in a meaningful way on a graphical workstation. The communication protocol multiplexes logical channels on a single data stream. The current implementation consists of a server for the Harmony realtime operating system and an application written for the Macintosh computer. Flexibility requirements resulted in a highly modular server design, and a layered modular object-oriented design for the Macintosh part of the system. Users assign data types to specific channels at run time. Then devices are instantiated by the user and connected to channels to receive datagrams. The current suite of device types does not provide enough functionality for most users' specialized needs. Instead the system design allows the creation of new device types with modest programming effort. The protocol, design and use of the system are discussed.
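The channel-multiplexed datagram idea can be sketched as below; the class names and API are invented for illustration and are not the ODD or Harmony interfaces.

```python
# Sketch of the channel-multiplexing idea described above: datagrams tagged
# with a logical channel id are routed to whichever display devices the user
# attached to that channel. Class and field names are our own, not ODD's API.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Datagram:
    channel: int
    payload: object

class StripChart:
    def __init__(self, name): self.name, self.samples = name, []
    def receive(self, payload): self.samples.append(payload)

class Dispatcher:
    def __init__(self): self.devices = defaultdict(list)
    def connect(self, channel, device):    # user attaches a device at run time
        self.devices[channel].append(device)
    def dispatch(self, datagram):          # demultiplex one datagram
        for device in self.devices[datagram.channel]:
            device.receive(datagram.payload)

dispatcher = Dispatcher()
speed_plot = StripChart("wheel_speed")
dispatcher.connect(channel=3, device=speed_plot)
for value in (0.12, 0.15, 0.17):           # stream from the realtime side
    dispatcher.dispatch(Datagram(channel=3, payload=value))
print(speed_plot.samples)
```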
International laser-safety regulations: a status update
NASA Astrophysics Data System (ADS)
Weiner, Robert M.
1990-07-01
There is an increase in international laser safety requirements as part of the emphasis on world-wide standardization of products and regulations. In particular, the documents which will evolve from the 1992 consolidation efforts of the European Community (EC) will impact both laser manufacturers and users. This paper provides a discussion of the current status of the various laser radiation standards. NORTH AMERICAN REQUIREMENTS. United States: Requirements on manufacturers from the Food and Drug Administration (FDA) have been in effect since 1975. The Center for Devices and Radiological Health (CDRH) within that agency ensures that these mandatory requirements [1] are satisfied. The CDRH regulations include the division of products into classes depending on their potential for hazard, criteria for power measurement, and requirements for product features, labels and manuals, and records and reports. Manufacturers must test products and certify that they comply with the CDRH requirements. User requirements are found in a standard published by the American National Standards Institute (ANSI) and in requirements from several individual states. Specific ANSI standards have also been published for fiber communications systems [34] and for lasers in medical applications [35]. Please note that the Appendix includes additional information on the standards discussed in this paper, including sources for obtaining the documents. Canada: In the past Canada has had requirements for two specified product categories (bar code scanners and educational lasers) [26]. These will be replaced
ICT Solutions for Highly-Customized Water Demand Management Strategies
NASA Astrophysics Data System (ADS)
Giuliani, M.; Cominola, A.; Castelletti, A.; Fraternali, P.; Guardiola, J.; Barba, J.; Pulido-Velazquez, M.; Rizzoli, A. E.
2016-12-01
The recent deployment of smart metering networks is opening new opportunities for advancing the design of residential water demand management strategies (WDMS) relying on improved understanding of water consumers' behaviors. Recent applications showed that retrieving information on users' consumption behaviors, along with their explanatory and/or causal factors, is key to spotting potential areas where water-saving efforts should be targeted, and to designing user-tailored WDMS. In this study, we explore the potential of ICT-based solutions in supporting the design and implementation of highly customized WDMS. On one side, the collection of consumption data at high spatial and temporal resolutions requires big data analytics and machine learning techniques to extract typical consumption features from the metered population of water users. On the other side, ICT solutions and gamifications can be used as effective means for facilitating both users' engagement and the collection of socio-psychographic users' information. The latter allows interpreting and improving the extracted profiles, ultimately supporting the customization of WDMS, such as awareness campaigns or personalized recommendations. Our approach is implemented in the SmartH2O platform and demonstrated in a pilot application in Valencia, Spain. Results show how the analysis of the smart metered consumption data, combined with the information retrieved from an ICT gamified web user portal, successfully identifies the typical consumption profiles of the metered users and supports the design of alternative WDMS targeting the different users' profiles.
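As an illustration of the profiling step, the sketch below clusters synthetic hourly consumption profiles with k-means; the data, feature choice, and cluster count are assumptions and do not reproduce the SmartH2O analytics.

```python
# Toy sketch of extracting typical consumption profiles from smart-meter data
# with k-means, as a stand-in for the profiling step described above (the
# SmartH2O pipeline itself is not reproduced; all numbers are synthetic).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
hours = np.arange(24)
# Synthetic daily profiles (liters/hour) for two behavioral groups of households.
morning_peak = 5 + 8 * np.exp(-((hours - 7) ** 2) / 4)
evening_peak = 5 + 8 * np.exp(-((hours - 20) ** 2) / 4)
profiles = np.vstack([
    morning_peak + rng.normal(0, 1, (60, 24)),
    evening_peak + rng.normal(0, 1, (40, 24)),
])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
for k, centroid in enumerate(model.cluster_centers_):
    peak_hour = int(np.argmax(centroid))
    share = np.mean(model.labels_ == k)
    print(f"profile {k}: peak at {peak_hour:02d}:00, {share:.0%} of households")
```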
Testing and validation of computerized decision support systems.
Sailors, R M; East, T D; Wallace, C J; Carlson, D A; Franklin, M A; Heermann, L K; Kinder, A T; Bradshaw, R L; Randolph, A G; Morris, A H
1996-01-01
Systematic, thorough testing of decision support systems (DSSs) prior to release to general users is a critical aspect of high quality software design. Omission of this step may lead to the dangerous, and potentially fatal, condition of relying on a system with outputs of uncertain quality. Thorough testing requires a great deal of effort and is a difficult job because tools necessary to facilitate testing are not well developed. Testing is a job ill-suited to humans because it requires tireless attention to a large number of details. For these reasons, the majority of DSSs available are probably not well tested prior to release. We have successfully implemented a software design and testing plan which has helped us meet our goal of continuously improving the quality of our DSS software prior to release. Although it requires a large amount of effort, we feel that documenting and standardizing our testing methods are important steps toward meeting recognized national and international quality standards. Our testing methodology includes both functional and structural testing and requires input from all levels of development. Our system does not focus solely on meeting design requirements but also addresses the robustness of the system and the completeness of testing.
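A minimal example of the functional-testing style advocated above is sketched below; the decision rule, thresholds, and test cases are entirely invented and are not the authors' protocol logic.

```python
# A minimal sketch of the kind of functional test harness the abstract argues
# for: exercising a decision-support rule against cases with known expected
# outputs. The rule and thresholds below are invented for illustration only.
import unittest

def ventilation_advice(spo2_percent, fio2_fraction):
    """Toy DSS rule: recommend an FiO2 adjustment from a pulse-ox reading."""
    if not (0 < spo2_percent <= 100 and 0.21 <= fio2_fraction <= 1.0):
        raise ValueError("input out of physiologic/device range")
    if spo2_percent < 88:
        return "increase FiO2"
    if spo2_percent > 96 and fio2_fraction > 0.4:
        return "decrease FiO2"
    return "no change"

class FunctionalTests(unittest.TestCase):
    def test_hypoxemia_triggers_increase(self):
        self.assertEqual(ventilation_advice(85, 0.40), "increase FiO2")
    def test_weaning_when_saturated(self):
        self.assertEqual(ventilation_advice(98, 0.60), "decrease FiO2")
    def test_rejects_impossible_input(self):   # robustness, not just requirements
        with self.assertRaises(ValueError):
            ventilation_advice(120, 0.30)

if __name__ == "__main__":
    unittest.main()
```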
RELAP-7 Code Assessment Plan and Requirement Traceability Matrix
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.
2016-10-01
The RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods and physical models over the last decades. Recently, INL has also been putting effort into establishing a code assessment plan, which aims to ensure an improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements, etc. To this end, we first survey the literature (i.e., international/domestic reports, research articles) addressing the desirable features generally required for advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts as well as the legacy issues of several recently-developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and identify the code development process and its present capability.
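The sketch below shows one lightweight way a requirement traceability matrix can be kept as data and checked for coverage; the requirement IDs, descriptions, and case names are hypothetical and are not taken from the RELAP-7 plan.

```python
# Illustrative sketch of a requirement traceability matrix (RTM) kept as data:
# each requirement lists the assessment cases that exercise it, so gaps in
# coverage can be reported automatically. IDs and names are hypothetical.
REQUIREMENTS = {
    "R-101": "Two-phase flow model reproduces analytic manometer oscillation",
    "R-205": "User input errors produce actionable diagnostics",
    "R-310": "Restart produces bit-identical continuation of a transient",
}

TRACES = {                      # requirement id -> verification cases
    "R-101": ["manometer_benchmark", "sodium_loop_regression"],
    "R-310": ["restart_smoke_test"],
}

def coverage_report(requirements, traces):
    for req_id, text in sorted(requirements.items()):
        cases = traces.get(req_id, [])
        status = f"covered by {len(cases)} case(s)" if cases else "NOT COVERED"
        print(f"{req_id}: {status} -- {text}")

coverage_report(REQUIREMENTS, TRACES)
```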
NASA Astrophysics Data System (ADS)
Rajib, M. A.; Merwade, V.; Song, C.; Zhao, L.; Kim, I. L.; Zhe, S.
2014-12-01
Setting up any hydrologic model requires a large amount of effort, including compilation of all the data, creation of input files, calibration, and validation. Given the amount of effort involved, it is possible that models for a watershed get created multiple times by multiple groups or organizations to accomplish different research, educational or policy goals. To reduce the duplication of efforts and enable collaboration among different groups or organizations around an already existing hydrology model, a platform is needed where anyone can search for existing models, perform simple scenario analysis and visualize model results. The creator and users of a model on such a platform can then collaborate to accomplish new research or educational objectives. From this perspective, a prototype cyber-infrastructure (CI), called SWATShare, is developed for sharing, running and visualizing Soil Water Assessment Tool (SWAT) models in an interactive GIS-enabled web environment. Users can utilize SWATShare to publish or upload their own models, search and download existing SWAT models developed by others, and run simulations, including calibration, using high-performance resources provided by XSEDE and the Cloud. Besides running and sharing, SWATShare hosts a novel spatio-temporal visualization system for SWAT model outputs. At the temporal scale, the system creates time-series plots for all the hydrology and water quality variables available along the reach as well as at the watershed level. At the spatial scale, the system can dynamically generate sub-basin-level thematic maps for any variable at any user-defined date or date range, thereby allowing users to run animations or download the data for subsequent analyses. In addition to research, SWATShare can also be used within a classroom setting as an educational tool for modeling and comparing the hydrologic processes under different geographic and climatic settings. SWATShare is publicly available at https://www.water-hub.org/swatshare.
A semi-automatic annotation tool for cooking video
NASA Astrophysics Data System (ADS)
Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe
2013-03-01
In order to create a cooking assistant application to guide the users in the preparation of the dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods of the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious, and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.
General presentation including new structure
NASA Astrophysics Data System (ADS)
Soons, A.
2002-12-01
Electrical, electronic and electro-mechanical components play an essential role in the functional performance, quality, life cycle and costs of space systems. Their standardisation, product specification, development, evaluation, qualification and procurement must be based on a coherent and efficient approach, paying due attention to present and prospective European space policies, and must be commensurate with user needs, market developments and technology trends. The European Space Components Coordination (ESCC) is established with the objective of harmonising the efforts concerning the various aspects of EEE space components by ESA, European national and international public space organisations, the component manufacturers, and the user industries. The goal of the ESCC is to improve the availability of strategic EEE space components with the required performance and at affordable costs for institutional and commercial space programmes. It is the objective of ESCC to achieve this goal by harmonising the resources and development efforts for space components in the ESA Member States and by providing a single and unified system for the standardisation, product specification, evaluation, qualification and procurement of European EEE space components and for the certification of components and component manufacturers.
Providing oceanographic data and information for Pacific Island communities
NASA Astrophysics Data System (ADS)
Potemra, James; Maurer, John; Burns, Echelle
2016-04-01
The Pacific Islands Ocean Observing System (PacIOOS; http://pacioos.org) is a data-serving group that relies on and promotes data interoperability. The PacIOOS "enterprise" is part of a large, US National effort aimed at providing information about the ocean environment to a wide range of users. These users range from casual beach-goers interested in the latest weather forecast or wave conditions to federal agencies responsible for public safety. In an effort to bridge the gap between the scientific community, who are responsible for making measurements and running forecast models, and the wide-ranging end-users, the data management group in PacIOOS has developed the infrastructure to host and distribute ocean-related data. The efficiency of this system has also allowed the group to build web-based tools to further help users. In this presentation we describe these efforts in more detail.
2014-04-03
CAPE CANAVERAL, Fla. – The Ground Systems Development and Operations Program is overseeing efforts to create a new multi-user firing room in Firing Room 4 in the Launch Control Center at NASA's Kennedy Space Center in Florida. The main floor consoles, cabling and wires below the floor and ceiling tiles have been removed. Sub-flooring has been installed and the room is marked off to create four separate rooms on the main floor. The design of Firing Room 4 will incorporate five control room areas that are flexible to meet current and future NASA and commercial user requirements. The equipment and most of the consoles from Firing Room 4 were moved to Firing Room 2 for possible future reuse. Photo credit: NASA/Ben Smegelsky
2014-04-03
CAPE CANAVERAL, Fla. – Three rows of upper level management consoles are all that remain in Firing Room 4 in the Launch Control Center at NASA’s Kennedy Space Center in Florida. The main floor consoles, cabling and wires below the floor and ceiling tiles above have been removed. The Ground Systems Development and Operations Program is overseeing efforts to create a new firing room based on a multi-user concept that will support NASA and commercial launch needs. The design of Firing Room 4 will incorporate five control room areas that are flexible to meet current and future NASA and commercial user requirements. The equipment and most of the consoles from Firing Room 4 were moved to Firing Room 2 for possible future reuse. Photo credit: NASA/Ben Smegelsky
The Conference on High Temperature Electronics
NASA Technical Reports Server (NTRS)
Hamilton, D. J.; Mccormick, J. B.; Kerwin, W. J.; Narud, J. A.
1981-01-01
The status of and directions for high temperature electronics research and development were evaluated. Major objectives were to (1) identify common user needs; (2) put into perspective the directions for future work; and (3) address the problem of bringing to practical fruition the results of these efforts. More than half of the presentations dealt with materials and devices, rather than circuits and systems. Conference session titles and an example of a paper presented in each session are (1) User requirements: High temperature electronics applications in space explorations; (2) Devices: Passive components for high temperature operation; (3) Circuits and systems: Process characteristics and design methods for a 300 degree QUAD or AMP; and (4) Packaging: Presently available energy supply for high temperature environment.
User documentation for the FHWA Carpool Matching Program (second edition)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1975-01-01
This document provides persons interested in computerized carpool/buspool matching programs a complete description of the user documentation for the FHWA Carpool Matching Program. The FHWA program is written in American National Standard COBOL and thus should be readily transferable to environments other than the IBM 360/65 (OS) under which it has been developed and tested. The program has a compile-time core requirement of 110K and a maximum execution-time core requirement of 110K. While considerable effort has been made to test the program in several applications and to achieve accuracy and completeness in the program and supporting documentation, the FHWA cannot guarantee the proper operation of this program by any user nor can it assume liability for any damage, loss, or inconvenience resulting from the operation of this program or the results obtained thereby. This present version of the carpool matching program represents the latest version of the first generation of an ongoing multi-phase process of improvements and refinements. The ultimate goal is an effective carpool and transit information system that will produce individualized information covering not only carpooling opportunities, but also transit routing, scheduling, and other identifying information for the commuter. (MCW)
Extending the Virtual Solar Observatory (VSO) to Incorporate Data Analysis Capabilities (III)
NASA Astrophysics Data System (ADS)
Csillaghy, A.; Etesi, L.; Dennis, B.; Zarro, D.; Schwartz, R.; Tolbert, K.
2008-12-01
We will present a progress report on our activities to extend the data analysis capabilities of the VSO. Our efforts to date have focused on three areas: 1. Extending the data retrieval capabilities by developing a centralized data processing server. The server is built with Java, IDL (Interactive Data Language), and the SSW (Solar SoftWare) package with all SSW-related instrument libraries and required calibration data. When a user requests VSO data that require preprocessing, the data are transparently sent to the server, processed, and returned to the user's IDL session for viewing and analysis. It is possible to have any Java or IDL client connect to the server. An IDL prototype for preparing and calibrating SOHO/EIT data will be demonstrated. 2. Improving the solar data search in SHOW SYNOP, a graphical user tool connected to the VSO in IDL. We introduce the Java-IDL interface that allows a flexible, dynamic, and extendable way of searching the VSO, where all communication with the VSO is managed dynamically by standard Java tools. 3. Improving image overlay capability to support coregistration of solar disk observations obtained from different orbital view angles, position angles, and distances, such as from the twin STEREO spacecraft.
Clarke, Kris; Harris, Debra; Zweifler, John A; Lasher, Marc; Mortimer, Roger B; Hughes, Susan
2016-01-01
Infectious disease remains a significant social and health concern in the United States. Preventing more people from contracting HIV/AIDS or Hepatitis C (HCV) requires a complex understanding of the interconnection between the biomedical and social dimensions of infectious disease. Opiate addiction in the US has skyrocketed in recent years. Preventing more cases of HIV/AIDS and HCV will require dealing with the social determinants of health. Needle exchange programs (NEPs) are based on a harm reduction approach that seeks to minimize the risk of infection and damage to the user and community. This article presents an exploratory small-scale quantitative study of the injection drug-using habits of a group of injection drug users (IDUs) at a needle exchange program in Fresno, California. Respondents reported significant decreases in high-risk IDU behaviors, including sharing of needles and, to a lesser extent, re-using of needles. They also reported frequent use of clean paraphernalia. Greater collaboration between social and health outreach professionals at NEPs could provide important frontline assistance to people excluded from mainstream office-based services and enhance efforts to reduce HIV/AIDS or HCV infection.
An intelligent tool for activity data collection.
Sarkar, A M Jehad
2011-01-01
Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collection. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool's performance in producing reliable datasets.
Minimal-effort planning of active alignment processes for beam-shaping optics
NASA Astrophysics Data System (ADS)
Haag, Sebastian; Schranner, Matthias; Müller, Tobias; Zontar, Daniel; Schlette, Christian; Losch, Daniel; Brecher, Christian; Roßmann, Jürgen
2015-03-01
In science and industry, the alignment of beam-shaping optics is usually a manual procedure. Many industrial applications utilizing beam-shaping optical systems require more scalable production solutions, and therefore effort has been invested in research regarding the automation of optics assembly. In previous works, the authors and other researchers have proven the feasibility of automated alignment of beam-shaping optics such as collimation lenses or homogenization optics. Nevertheless, the planning efforts as well as additional knowledge from the fields of automation and control required for such alignment processes are immense. This paper presents a novel approach to planning active alignment processes for beam-shaping optics, with a focus on minimizing the planning effort for active alignment. The approach utilizes optical simulation and the genetic programming paradigm from computer science for automatically extracting, from a simulated data set, features that have a high correlation coefficient with the individual degrees of freedom of alignment. The strategy is capable of finding active alignment strategies that can be executed by an automated assembly system. The paper presents a tool that makes the algorithm available to end users and discusses the results of planning the active alignment of the well-known assembly of a fast-axis collimator. The paper concludes with an outlook on the transferability to other use cases, such as application-specific intensity distributions, which will benefit from reduced planning effort.
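The feature-mining step described above can be illustrated with a much simpler stand-in for the genetic-programming search: score a fixed set of candidate features by their absolute Pearson correlation with one alignment degree of freedom over simulated data and keep the best one. The sketch below (Python) is only that simplified illustration; the synthetic data and feature names (beam_width, peak_intensity) are assumptions, not the paper's simulation output or feature set.

    import numpy as np

    def correlation_score(feature_values, dof_values):
        # Absolute Pearson correlation between a candidate feature and one
        # alignment degree of freedom across simulated samples.
        return abs(np.corrcoef(feature_values, dof_values)[0, 1])

    def select_best_feature(candidates, dof_values):
        # candidates: dict mapping feature name -> values over the simulated data set.
        scores = {name: correlation_score(vals, dof_values)
                  for name, vals in candidates.items()}
        return max(scores, key=scores.get), scores

    # Synthetic data standing in for optical-simulation output.
    rng = np.random.default_rng(0)
    tilt = rng.uniform(-1.0, 1.0, 200)  # simulated misalignment (arbitrary units)
    candidates = {
        "beam_width": 0.8 * tilt + 0.1 * rng.normal(size=200),  # informative feature
        "peak_intensity": rng.normal(size=200),                  # uncorrelated distractor
    }
    best, scores = select_best_feature(candidates, tilt)
    print(best, scores)

A genetic-programming search differs in that it composes new feature expressions rather than ranking a fixed list, but the selection criterion, correlation with the degree of freedom, is the same idea.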
Generating unstructured nuclear reactor core meshes in parallel
Jain, Rajeev; Tautges, Timothy J.
2014-10-24
Recent advances in supercomputers and parallel solver techniques have enabled users to run large simulation problems using millions of processors. Techniques for multiphysics nuclear reactor core simulations are under active development in several countries. Most of these techniques require large unstructured meshes that can be hard to generate on standalone desktop computers because of high memory requirements, limited processing power, and other complexities. We have previously reported on a hierarchical lattice-based approach for generating reactor core meshes. Here, we describe efforts to exploit coarse-grained parallelism during the reactor assembly and reactor core mesh generation processes. We highlight several reactor core examples including a very high temperature reactor, a full-core model of the Korean MONJU reactor, a ¼ pressurized water reactor core, the fast reactor Experimental Breeder Reactor-II core with a XX09 assembly, and an advanced breeder test reactor core. The times required to generate large mesh models, along with speedups obtained from running these problems in parallel, are reported. A graphical user interface to the tools described here has also been developed.
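Coarse-grained parallelism of the kind described, where whole assemblies are meshed independently before the core-level step, can be sketched with a simple MPI decomposition. The fragment below (Python with mpi4py) only illustrates the work-partitioning pattern; mesh_assembly is a placeholder for the actual meshing tools, and the assembly names are hypothetical.

    from mpi4py import MPI  # requires an MPI installation; run with mpiexec -n <ranks>

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    assemblies = [f"assembly_{i:03d}" for i in range(48)]  # hypothetical lattice positions

    def mesh_assembly(name):
        # Placeholder for the per-assembly meshing step done by the real tools.
        return f"meshed {name} on rank {rank}"

    # Each rank meshes a disjoint subset of assemblies (coarse-grained decomposition).
    local_results = [mesh_assembly(a) for a in assemblies[rank::size]]

    # Rank 0 collects the per-rank results, standing in for the core-level merge.
    all_results = comm.gather(local_results, root=0)
    if rank == 0:
        print(sum(len(r) for r in all_results), "assemblies meshed in total")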
ERIC Educational Resources Information Center
Cheney-Stern, Marilyn R.; Phelps, L. Allen
As part of Project IMPACT's efforts to develop procedures for complying with the impact requirements of Public Law 94-482, a case study was made of the Illinois Occupational Curriculum Project (IOCP). The top-down study traced the IOCP from its developers to its users and documented measurable changes in the 1971 versus the 1979 curriculum…
A user-friendly software package to ease the use of VIC hydrologic model for practitioners
NASA Astrophysics Data System (ADS)
Wi, S.; Ray, P.; Brown, C.
2016-12-01
The VIC (Variable Infiltration Capacity) hydrologic and river routing model simulates the water and energy fluxes that occur near the land surface and provides users with useful information regarding the quantity and timing of available water at points of interest within the basin. However, despite its popularity (evidenced by numerous applications in the literature), its wider adoption is hampered by the considerable effort required to prepare model inputs, e.g., input files storing spatial information related to watershed topography, soil properties, and land cover. This study presents a user-friendly software package (named VIC Setup Toolkit) developed within the MATLAB (matrix laboratory) framework and accessible through an intuitive graphical user interface. The VIC Setup Toolkit enables users to navigate the model building process confidently through prompts and automation, with the intention of promoting the use of the model for both practical and academic purposes. The automated processes include watershed delineation, climate and geographical input set-up, model parameter calibration, graph generation and output evaluation. We demonstrate the package's usefulness in case studies of the American River, Oklahoma River, Feather River and Zambezi River basins.
Hu, Yu-Chi J; Grossberg, Michael D; Mageras, Gikas S
2008-01-01
Planning radiotherapy and surgical procedures usually requires onerous manual segmentation of anatomical structures from medical images. In this paper we present a semi-automatic and accurate segmentation method that dramatically reduces the time and effort required of expert users. This is accomplished by giving a user an intuitive graphical interface to indicate samples of target and non-target tissue by loosely drawing a few brush strokes on the image. We use these brush strokes to provide the statistical input for a Conditional Random Field (CRF) based segmentation. Since we extract purely statistical information from the user input, we eliminate the need for assumptions about boundary contrast previously used by many other methods. A new feature of our method is that the statistics on one image can be reused on related images without registration. To demonstrate this, we show that boundary statistics provided on a few 2D slices of volumetric medical data can be propagated through the entire 3D stack of images without using the geometric correspondence between images. In addition, the image segmentation from the CRF can be formulated as a minimum s-t graph cut problem, which has a solution that is both globally optimal and fast. The combination of a fast segmentation and minimal user input that is reusable makes this a powerful technique for the segmentation of medical images.
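The minimum s-t cut formulation mentioned above can be shown on a toy example. The sketch below (Python, networkx) builds the standard binary graph-cut construction for a five-pixel 1-D "image": terminal edge capacities play the role of the unary costs that, in the paper, come from brush-stroke statistics, and a constant pairwise capacity penalises label changes between neighbours. It is a generic illustration, not the authors' CRF model.

    import networkx as nx

    # Toy 1-D "image": five pixel intensities; bright pixels should be target.
    pixels = [0.1, 0.2, 0.8, 0.9, 0.15]
    SMOOTHNESS = 0.5  # pairwise penalty for separating neighbouring pixels

    def unary(p, as_target):
        # Crude quadratic cost around assumed class means; in the paper these
        # statistics would instead be estimated from the user's brush strokes.
        mean = 0.85 if as_target else 0.15
        return (p - mean) ** 2 / 0.05

    G = nx.DiGraph()
    for i, p in enumerate(pixels):
        G.add_edge("s", i, capacity=unary(p, as_target=True))   # paid if i is labelled target
        G.add_edge(i, "t", capacity=unary(p, as_target=False))  # paid if i is labelled background
    for i in range(len(pixels) - 1):  # neighbouring pixels in 1-D
        G.add_edge(i, i + 1, capacity=SMOOTHNESS)
        G.add_edge(i + 1, i, capacity=SMOOTHNESS)

    cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
    target_pixels = sorted(n for n in sink_side if n != "t")
    print("cut cost:", round(cut_value, 3), "target pixels:", target_pixels)

With these numbers the cut labels pixels 2 and 3 as target, paying only the two smoothness edges plus small unary costs, which is the globally optimal labelling for this energy.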
Visualization of medical data based on EHR standards.
Kopanitsa, G; Hildebrand, C; Stausberg, J; Englmeier, K H
2013-01-01
To organize efficient interaction between a doctor and an EHR, the data have to be presented in the most convenient way. Medical data presentation methods and models must be flexible in order to cover the needs of users with different backgrounds and requirements. Most visualization methods are doctor oriented; however, there are indications that the involvement of patients can optimize healthcare. The research aims at specifying the state of the art of medical data visualization. The paper analyzes a number of projects and defines requirements for a generic ISO 13606 based data visualization method. To do so, it starts with a systematic search for studies on EHR user interfaces. To identify best practices, visualization methods were evaluated according to the following criteria: limits of application, customizability, and re-usability. The visualization methods were compared using the specified criteria. The review showed that the analyzed projects can contribute knowledge to the development of a generic visualization method. However, none of them proposed a model that meets all the necessary criteria for a re-usable, standard-based visualization method. The shortcomings were mostly related to the structure of current medical concept specifications. The analysis showed that medical data visualization methods use hardcoded GUIs, which give little flexibility, so medical data visualization has to move from hardcoded user interfaces to generic methods. This requires a great effort because current standards are not suitable for organizing the management of visualization data. This contradiction between a generic method and a flexible, user-friendly data layout has to be overcome.
NASA Astrophysics Data System (ADS)
Walker, J. D.; Ash, J. M.; Bowring, J.; Bowring, S. A.; Deino, A. L.; Kislitsyn, R.; Koppers, A. A.
2009-12-01
One of the most onerous tasks in rigorous development of data reporting and databases for geochronological and thermochronological studies is to fully capture all of the metadata needed to completely document both the analytical work as well as the interpretation effort. This information is available in the data reduction programs used by researchers, but has proven difficult to harvest into either publications or databases. For this reason, the EarthChem and EARTHTIME efforts are collaborating to foster the next generation of data management and discovery for age information by integrating data reporting with data reduction. EarthChem is a community-driven effort to facilitate the discovery, access, and preservation of geochemical data of all types and to support research and enable new and better science. EARTHTIME is also a community-initiated project whose aim is to foster the next generation of high-precision geochronology and thermochronology. In addition, collaboration with the CRONUS effort for cosmogenic radionuclides is in progress. EarthChem workers have met with groups working on the Ar-Ar, U-Pb, and (U-Th)/He systems to establish data reporting requirements as well as XML schemas to be used for transferring data from reduction programs to database. At present, we have prototype systems working for the U-Pb_Redux, ArArCalc, MassSpec, and Helios programs. In each program, the user can select to upload data and metadata to the GEOCHRON system hosted at EarthChem. There are two additional requirements for upload. The first is having a unique identifier (IGSN), obtained from the SESAR system either manually or via web services contained within the reduction program. The second is that the user selects whether the sample is to be available for discovery (public) or remain hidden (private). Search for data at the GEOCHRON portal can be done using age, method, mineral, or location parameters. Data can be downloaded in the full XML format for ingestion back into the reduction program or as abbreviated tables.
Biomechanical energy harvesting: generating electricity during walking with minimal user effort.
Donelan, J M; Li, Q; Naing, V; Hoffer, J A; Weber, D J; Kuo, A D
2008-02-08
We have developed a biomechanical energy harvester that generates electricity during human walking with little extra effort. Unlike conventional human-powered generators that use positive muscle work, our technology assists muscles in performing negative work, analogous to regenerative braking in hybrid cars, where energy normally dissipated during braking drives a generator instead. The energy harvester mounts at the knee and selectively engages power generation at the end of the swing phase, thus assisting deceleration of the joint. Test subjects walking with one device on each leg produced an average of 5 watts of electricity, which is about 10 times that of shoe-mounted devices. The cost of harvesting, the additional metabolic power required to produce 1 watt of electricity, is less than one-eighth of that for conventional human power generation. Producing substantial electricity with little extra effort makes this method well-suited for charging powered prosthetic limbs and other portable medical devices.
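The "cost of harvesting" can be written explicitly as the extra metabolic power per unit of electrical power. The expression below is a minimal formalisation of that ratio; the numerical example is illustrative only and does not reproduce the study's measured values.

    \mathrm{COH} = \frac{\Delta P_{\mathrm{met}}}{P_{\mathrm{elec}}} = \frac{P_{\mathrm{met,\,device}} - P_{\mathrm{met,\,control}}}{P_{\mathrm{elec}}}

For instance, if generating P_elec = 5 W raised metabolic power by about 4 W, the cost would be roughly 0.8 W of metabolic power per watt of electricity; taking an assumed conventional-generation cost on the order of 6.4 W/W, that would be consistent with the "less than one-eighth" comparison made in the abstract.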
Modeling and Analysis of Power Processing Systems (MAPPS), initial phase 2
NASA Technical Reports Server (NTRS)
Yu, Y.; Lee, F. C.; Wangenheim, H.; Warren, D.
1977-01-01
The overall objective of the program is to provide the engineering tools to reduce the analysis, design, and development effort, and thus the cost, in achieving the required performances for switching regulators and dc-dc converter systems. The program was both tutorial and application oriented. Various analytical methods were described in detail and supplemented with examples, and those with standardization appeals were reduced into computer-based subprograms. Major program efforts included those concerning small and large signal control-dependent performance analysis and simulation, control circuit design, power circuit design and optimization, system configuration study, and system performance simulation. Techniques including discrete time domain, conventional frequency domain, Lagrange multiplier, nonlinear programming, and control design synthesis were employed in these efforts. To enhance interactive conversation between the modeling and analysis subprograms and the user, a working prototype of the Data Management Program was also developed to facilitate expansion as future subprogram capabilities increase.
BioSearch: a semantic search engine for Bio2RDF
Qiu, Honglei; Huang, Jiacheng
2017-01-01
Biomedical data are growing at an incredible pace and require substantial expertise to organize in a manner that makes them easily findable, accessible, interoperable and reusable. Massive effort has been devoted to using Semantic Web standards and technologies to create a network of Linked Data for the life sciences, among others. However, while these data are accessible through programmatic means, effective user interfaces to SPARQL endpoints for non-experts are few and far between. Contributing to user frustration is that data are not necessarily described using common vocabularies, thereby making it difficult to aggregate results, especially when they are distributed across multiple SPARQL endpoints. We propose BioSearch, a semantic search engine that uses ontologies to enhance federated query construction and organize search results. BioSearch also features a simplified query interface that allows users to optionally filter their keywords according to classes, properties and datasets. User evaluation demonstrated that BioSearch is more effective and usable than two state-of-the-art search and browsing solutions. Database URL: http://ws.nju.edu.cn/biosearch/ PMID:29220451
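The class-filtered keyword search described above can be sketched with the SPARQLWrapper Python library. The endpoint URL, the use of rdfs:label for matching, and the optional class filter are assumptions made for illustration; they do not describe BioSearch's actual query construction or any specific Bio2RDF dataset layout.

    from SPARQLWrapper import SPARQLWrapper, JSON

    ENDPOINT = "https://bio2rdf.org/sparql"  # assumed endpoint; any SPARQL endpoint works

    def keyword_search(keyword, class_iri=None, limit=10):
        # Restrict to a class when class_iri is given (the "filter keywords by
        # class" idea from the abstract); otherwise search all labelled resources.
        type_clause = f"?s a <{class_iri}> ." if class_iri else ""
        query = f"""
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT DISTINCT ?s ?label WHERE {{
          ?s rdfs:label ?label .
          {type_clause}
          FILTER(CONTAINS(LCASE(STR(?label)), LCASE("{keyword}")))
        }} LIMIT {limit}
        """
        sparql = SPARQLWrapper(ENDPOINT)
        sparql.setQuery(query)
        sparql.setReturnFormat(JSON)
        results = sparql.query().convert()
        return [(b["s"]["value"], b["label"]["value"])
                for b in results["results"]["bindings"]]

    for iri, label in keyword_search("aspirin"):
        print(iri, label)

A federated version would issue the same query to several endpoints and merge the results by shared vocabulary terms, which is where the ontology-based organisation described in the abstract comes in.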
PI in the sky: The astronaut science advisor on SLS-2
NASA Technical Reports Server (NTRS)
Hazelton, Lyman R.; Groleau, Nicolas; Frainier, Richard J.; Compton, Michael M.; Colombano, Silvano P.; Szolovits, Peter
1994-01-01
The Astronaut Science Advisor (ASA, also known as Principal-Investigator-in-a-Box) is an advanced engineering effort to apply expert systems technology to experiment monitoring and control. Its goal is to increase the scientific value of information returned from experiments on manned space missions. The first in-space test of the system will be in conjunction with Professor Larry Young's (MIT) vestibulo-ocular 'Rotating Dome' experiment on the Spacelab Life Sciences 2 mission (STS-58) in the Fall of 1993. In a cost-saving effort, off-the-shelf equipment was employed wherever possible. Several modifications were necessary in order to make the system flight-worthy. The software consists of three interlocking modules. A real-time data acquisition system digitizes and stores all experiment data and then characterizes the signals in symbolic form; a rule-based expert system uses the symbolic signal characteristics to make decisions concerning the experiment; and a highly graphic user interface requiring a minimum of user intervention presents information to the astronaut operator. Much has been learned about the design of software and user interfaces for interactive computing in space. In addition, we gained a great deal of knowledge about building relatively inexpensive hardware and software for use in space. New technologies are being assessed to make the system a much more powerful ally in future scientific research in space and on the ground.
Camp, J M; Krakow, M; McCarty, D; Argeriou, M
1992-01-01
There is increased interest in documenting the characteristics and treatment outcomes of clients served with Alcohol, Drug Abuse, and Mental Health Block Grant funds. The evolution of federal client-based management systems for substance abuse treatment services demonstrates that data collection systems are important but require continued support. A review of the Massachusetts substance abuse management information system illustrates the utility of a client-based data set. The development and implementation of a comprehensive information system require overcoming organizational barriers and project delays, fostering collaborative efforts among staff from diverse agencies, and employing considerable resources. In addition, the need to develop mechanisms for increasing the reliability of the data and ongoing training for the users is presented. Finally, three applications of the management information system's role in shaping policy are reviewed: developing services for special populations (communities of color, women, and pregnant substance abusers, and injection drug users), utilizing MIS data for evaluation purposes, and determining funding allocations.
Masters, Matthew N; Haardörfer, Regine; Windle, Michael; Berg, Carla
2018-02-01
Limited research has examined psychosocial factors that differ among cigarette users, marijuana users, and co-users and influence their cessation efforts. We examined: 1) sociodemographic, mental health, and other substance use in relation to user category; and 2) associations among these factors in relation to recent quit attempts and readiness to quit among single product versus co-users. We used a cross-sectional design to study college students aged 18-25 from seven Georgia campuses, focusing on the 721 reporting cigarette and/or marijuana use in the past 4 months (238 cigarette-only; 331 marijuana-only; 152 co-users). Multinomial logistic regression showed that correlates (p's<0.05) of cigarette-only versus co-use included attending public or technical colleges (vs. private) and not using little cigars/cigarillos (LCCs), e-cigarettes, and alcohol. Correlates of marijuana-only versus co-use included being Black or Hispanic (vs. White), not attending technical school, and not using LCCs and e-cigarettes. Importance was rated higher for quitting cigarettes versus marijuana, but confidence was rated lower for quitting cigarettes versus marijuana (p's<0.001). Co-users were more likely to report readiness to quit and quit attempts of cigarettes versus marijuana (p's<0.001). While 23.26% of marijuana-only and 15.13% of cigarette-only users reported readiness to quit, 41.18% of cigarette-only and 21.75% of marijuana-only users reported recent quit attempts (p's<0.001). Binary logistic regressions indicated distinct correlates of readiness to quit and quit attempts of cigarettes and marijuana. Cessation efforts of the respective products must attend to co-use with the other product to better understand relative perceptions of importance and confidence in quitting and actual cessation efforts. Copyright © 2017 Elsevier Ltd. All rights reserved.
A review of medical terminology standards and structured reporting.
Awaysheh, Abdullah; Wilcke, Jeffrey; Elvinger, François; Rees, Loren; Fan, Weiguo; Zimmerman, Kurt
2018-01-01
Much effort has been invested in standardizing medical terminology for representation of medical knowledge, storage in electronic medical records, retrieval, reuse for evidence-based decision making, and for efficient messaging between users. We only focus on those efforts related to the representation of clinical medical knowledge required for capturing diagnoses and findings from a wide range of general to specialty clinical perspectives (e.g., internists to pathologists). Standardized medical terminology and the usage of structured reporting have been shown to improve the usage of medical information in secondary activities, such as research, public health, and case studies. The impact of standardization and structured reporting is not limited to secondary activities; standardization has been shown to have a direct impact on patient healthcare.
Tank waste remediation system functions and requirements document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, K.E
1996-10-03
This is the Tank Waste Remediation System (TWRS) Functions and Requirements Document derived from the TWRS Technical Baseline. The document consists of several text sections that provide the purpose, scope, background information, and an explanation of how this document assists the application of Systems Engineering to the TWRS. The primary functions identified in the TWRS Functions and Requirements Document are identified in Figure 4.1 (Section 4.0). Currently, this document is part of the overall effort to develop the TWRS Functional Requirements Baseline, and contains the functions and requirements needed to properly define the top three TWRS function levels. TWRS Technical Baseline information (RDD-100 database) included in the appendices of the attached document contains the TWRS functions, requirements, and architecture necessary to define the TWRS Functional Requirements Baseline. Document organization and user directions are provided in the introductory text. This document will continue to be modified during the TWRS life-cycle.
User Data Spectrum Theory: Collecting, Interpreting, and Implementing User Data in Organizations
ERIC Educational Resources Information Center
Peer, Andrea Jo
2017-01-01
Organizations interested in increasing their user experience (UX) capacity lack the tools they need to know how to do so. This dissertation addresses this challenge via three major research efforts: 1) the creation of User Data Spectrum theory and a User Data Spectrum survey for helping organizations better invest resources to grow their UX…
47 CFR 64.604 - Mandatory minimum standards.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... (vi) TRS providers must make best efforts to accommodate a TRS user's requested CA gender when a call... the CA does not interfere with the independence of the user, the user maintains control of the... that confidentiality of VRS users is maintained. (3) Types of calls. (i) Consistent with the...
Technology advances and market forces: Their impact on high performance architectures
NASA Technical Reports Server (NTRS)
Best, D. R.
1978-01-01
Reasonable projections into future supercomputer architectures and technology require an analysis of the computer industry market environment, the current capabilities and trends within the component industry, and the research activities on computer architecture in the industrial and academic communities. Management, programmer, architect, and user must cooperate to increase the efficiency of supercomputer development efforts. Care must be taken to match the funding, compiler, architecture and application with greater attention to testability, maintainability, reliability, and usability than supercomputer development programs of the past.
Public release of the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009)
Storchak, Dmitry A.; Di Giacomo, Domenico; Bondár, István; Engdahl, E. Robert; Harris, James; Lee, William H.K.; Villaseñor, Antonio; Bormann, Peter
2013-01-01
The International Seismological Centre–Global Earthquake Model (ISC–GEM) Global Instrumental Earthquake Catalogue (1900–2009) is the result of a special effort to substantially extend and improve currently existing global catalogues to serve the requirements of specific user groups who assess and model seismic hazard and risk. The data from the ISC–GEM Catalogue will be used worldwide, but will prove absolutely essential in those regions where a high seismicity level strongly correlates with a high population density.
Functional MRI of inhibitory processing in abstinent adolescent marijuana users.
Tapert, Susan F; Schweinsburg, Alecia D; Drummond, Sean P A; Paulus, Martin P; Brown, Sandra A; Yang, Tony T; Frank, Lawrence R
2007-10-01
Marijuana intoxication appears to impair response inhibition, but it is unclear if impaired inhibition and associated brain abnormalities persist after prolonged abstinence among adolescent users. We hypothesized that brain activation during a go/no-go task would show persistent abnormalities in adolescent marijuana users after 28 days of abstinence. Adolescents with (n = 16) and without (n = 17) histories of marijuana use were compared on blood oxygen level dependent (BOLD) response to a go/no-go task during functional magnetic resonance imaging (fMRI) after 28 days of monitored abstinence. Participants had no neurological problems or Axis I diagnoses other than cannabis abuse/dependence. Marijuana users did not differ from non-users on task performance but showed more BOLD response than non-users during inhibition trials in right dorsolateral prefrontal, bilateral medial frontal, bilateral inferior and superior parietal lobules, and right occipital gyri, as well as during "go" trials in right prefrontal, insular, and parietal cortices (p < 0.05, clusters > 943 microl). Differences remained significant even after controlling for lifetime and recent alcohol use. Adolescent marijuana users relative to non-users showed increased brain processing effort during an inhibition task in the presence of similar task performance, even after 28 days of abstinence. Thus, increased brain processing effort to achieve inhibition may predate the onset of regular use or result from it. Future investigations will need to determine whether increased brain processing effort is associated with risk to use.
2014-04-03
CAPE CANAVERAL, Fla. – The Mobile Launcher is visible through a window inside Firing Room 4 in the Launch Control Center at NASA's Kennedy Space Center in Florida. The Ground Systems Development and Operations Program is overseeing efforts to create a new multi-user firing room in Firing Room 4. The main floor consoles, cabling and wires below the floor and ceiling tiles above have been removed. Sub-flooring has been installed and the room is marked off to create four separate rooms on the main floor. The design of Firing Room 4 will incorporate five control room areas that are flexible to meet current and future NASA and commercial user requirements. The equipment and most of the consoles from Firing Room 4 were moved to Firing Room 2 for possible future reuse. Photo credit: NASA/Ben Smegelsky
McCarthy, Ilana Olin; Wojno, Abbey E; Joseph, Heather A; Teesdale, Scott
2017-11-14
The response to the 2014-2016 Ebola epidemic included an unprecedented effort from federal, state, and local public health authorities to monitor the health of travelers entering the United States from countries with Ebola outbreaks. The Check and Report Ebola (CARE) Hotline, a novel approach to monitoring, was designed to enable travelers to report their health status daily to an interactive voice recognition (IVR) system. The system was tested with 70 Centers for Disease Control and Prevention (CDC) federal employees returning from deployments in outbreak countries. The objective of this study was to describe the development of the CARE Hotline as a tool for postarrival monitoring and examine the usage characteristics and user experience of the tool during a public health emergency. Data were obtained from two sources. First, the CARE Hotline system produced a call log which summarized the usage characteristics of all 70 users' daily health reports. Second, we surveyed federal employees (n=70) who used the CARE Hotline to engage in monitoring. A total of 21 (21/70, 30%) respondents were included in the survey analytic sample. While the CARE Hotline was used for monitoring, 70 users completed a total of 1313 calls. We found that 94.06% (1235/1313) of calls were successful, and the average call time significantly decreased from the beginning of the monitoring period to the end by 32 seconds (Z score=-6.52, P<.001). CARE Hotline call log data were confirmed by user feedback; survey results indicated that users became more familiar with the system and found the system easier to use, from the beginning to the end of their monitoring period. The majority of the users were highly satisfied (90%, 19/21) with the system, indicating ease of use and convenience as primary reasons, and would recommend it for future monitoring efforts (90%, 19/21). The CARE Hotline garnered high user satisfaction, required minimal reporting time from users, and was an easily learned tool for monitoring. This phone-based technology can be modified for future public health emergencies. ©Ilana Olin McCarthy, Abbey E Wojno, Heather A Joseph, Scott Teesdale. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.11.2017.
Scheduling the future NASA Space Network: Experiences with a flexible scheduling prototype
NASA Technical Reports Server (NTRS)
Happell, Nadine; Moe, Karen L.; Minnix, Jay
1993-01-01
NASA's Space Network (SN) provides telecommunications and tracking services to low earth orbiting spacecraft. One proposal for improving resource allocation and automating conflict resolution for the SN is the concept of flexible scheduling. In this concept, each Payload Operations Control Center (POCC) will possess a Space Network User POCC Interface (SNUPI) to support the development and management of flexible requests. Flexible requests express the flexibility, constraints, and repetitious nature of the user's communications requirements. Flexible scheduling is expected to improve SN resource utilization and user satisfaction, as well as reduce the effort to produce and maintain a schedule. A prototype testbed has been developed to better understand flexible scheduling as it applies to the SN. This testbed consists of a SNUPI workstation, an SN scheduler, and a flexible request language that conveys information between the two systems. All three are being evaluated by operations personnel. Benchmark testing is being conducted on the scheduler to quantify the productivity improvements achieved with flexible requests.
An Indoor Navigation System for the Visually Impaired
Guerrero, Luis A.; Vasquez, Francisco; Ochoa, Sergio F.
2012-01-01
Navigation in indoor environments is highly challenging for the severely visually impaired, particularly in spaces visited for the first time. Several solutions have been proposed to deal with this challenge. Although some of them have been shown to be useful in real scenarios, they involve a significant deployment effort or use artifacts that are not natural for blind users. This paper presents an indoor navigation system that was designed with usability as the quality requirement to be maximized. This solution makes it possible to identify the position of a person and to calculate the velocity and direction of his or her movements. Using this information, the system determines the user's trajectory, locates possible obstacles in that route, and offers navigation information to the user. The solution has been evaluated in two experimental scenarios. Although the results are not yet sufficient to support strong conclusions, they indicate that the system is suitable for guiding visually impaired people through an unknown built environment. PMID:22969398
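The velocity-and-direction step mentioned above reduces to simple arithmetic on successive position fixes. The sketch below (Python) assumes 2-D positions in metres and timestamps in seconds; it is a generic illustration, not the system's actual positioning code.

    import math

    def velocity_and_heading(p_prev, p_curr, t_prev, t_curr):
        # p_prev, p_curr: (x, y) in metres; t_prev, t_curr: timestamps in seconds.
        dx, dy = p_curr[0] - p_prev[0], p_curr[1] - p_prev[1]
        dt = t_curr - t_prev
        if dt <= 0:
            raise ValueError("timestamps must be strictly increasing")
        speed = math.hypot(dx, dy) / dt                    # metres per second
        heading = math.degrees(math.atan2(dx, dy)) % 360   # 0 = +y axis, clockwise
        return speed, heading

    # Example: the user moved from (0, 0) to (1.2, 1.2) in 2 seconds.
    speed, heading = velocity_and_heading((0.0, 0.0), (1.2, 1.2), 0.0, 2.0)
    print(f"speed = {speed:.2f} m/s, heading = {heading:.0f} deg")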
Geographic Information System Tools for Conservation Planning: User's Manual
Fox, Timothy J.; Rohweder, Jason J.; Kenow, K.P.; Korschgen, C.E.; DeHaan, H.C.
2003-01-01
Public and private land managers desire better ways to incorporate landscape, species, and habitat relations into their conservation planning processes. We present three tools, developed for the Environmental Systems Research Institute's ArcView 3.x platform, applicable to many types of wildlife conservation management and planning efforts. These tools provide managers and planners with the ability to rapidly assess landscape attributes and link these attributes with species-habitat information. To use the tools, the user provides a detailed land cover spatial database and develops a matrix to identify species-habitat relations for the landscape of interest. The tools are applicable to any taxa or suite of taxa for which the required data are available. The user also has the ability to interactively make polygon-specific changes to the landscape and re-examine species-habitat relations. The development of these tools has given resource managers the means to evaluate the merits of proposed landscape management scenarios and to choose the scenario that best fits the goals of the managed area.
Computerization of a preanesthetic evaluation and user satisfaction evaluation.
Arias, Antonio; Benítez, Sonia; Canosa, Daniela; Borbolla, Damián; Staccia, Gustavo; Plazzotta, Fernando; Casais, Marcela; Michelangelo, Hernán; Luna, Daniel; Bernaldo de Quirós, Fernán Gonzalez
2010-01-01
The purpose of preanesthetic evaluation is to reduce morbidity and mortality through review of the patient's medical history, clinical examination, and targeted clinical studies, providing referrals for medical consultations when appropriate. Changes in patient care, standards of health information management, and patterns of perioperative care have resulted in a re-conceptualization of this process, in which the documentation of patient medical information, training efforts, and maintaining the integrity of the medical-legal evaluation are areas of concern. The aim of this paper is to describe the design, development, training, and implementation of a computerized preanesthetic evaluation form, together with an evaluation of user satisfaction with the system. Since the system went live in September 2008 there were 15121 closed structured forms, 60% for ambulatory procedures and 40% for procedures that required hospital admission. 82% of the closed structured forms recorded a procedure risk of 1-2, according to the American Society of Anesthesiologists classification. The survey indicates a positive general satisfaction of the users with the system.
A Space and Atmospheric Visualization Science System
NASA Technical Reports Server (NTRS)
Szuszczewicz, E. P.; Blanchard, P.; Mankofsky, A.; Goodrich, C.; Kamins, D.; Kulkarni, R.; Mcnabb, D.; Moroh, M.
1994-01-01
SAVS (a Space and Atmospheric Visualization Science system) is an integrated system with user-friendly functionality that employs a 'push-button' software environment that mimics the logical scientific processes in data acquisition, reduction, analysis, and visualization. All of this is accomplished without requiring a detailed understanding of the methods, networks, and modules that link the tools and effectively execute the functions. This report describes SAVS and its components, followed by several applications based on generic research interests in interplanetary and magnetospheric physics (IMP/ISTP), active experiments in space (CRRES), and mission planning focused on the earth's thermospheric, ionospheric, and mesospheric domains (TIMED). The final chapters provide a user-oriented description of interface functionalities, hands-on operations, and customized modules, with details of the primary modules presented in the appendices. The overall intent of the report is to reflect the accomplishments of the three-year development effort and to introduce potential users to the power and utility of the integrated data acquisition, analysis, and visualization system.
Time synchronized video systems
NASA Technical Reports Server (NTRS)
Burnett, Ron
1994-01-01
The idea of synchronizing multiple video recordings to some type of 'range' time has been tried with varying degrees of success in the past. Combining this requirement with existing time code standards (SMPTE) and new innovations in desktop multimedia, however, affords an opportunity to increase the flexibility and usefulness of such efforts without adding cost over traditional data recording and reduction systems. The concept described can use IRIG, GPS, or a battery-backed internal clock as the master time source. By converting that time source to Vertical Interval Time Code or Longitudinal Time Code, both in accordance with the SMPTE standards, the user obtains a tape that contains machine/computer-readable time code suitable for use with off-the-shelf editing equipment. Accuracy on playback is then determined by the playback system chosen by the user. Accuracies of +/- 2 frames are common among inexpensive systems, and complete frame accuracy is more a matter of the user's budget than the capability of the recording system.
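Converting a master time source into SMPTE-style timecode is, for the non-drop-frame case, straightforward frame arithmetic. The sketch below (Python) assumes the master time is expressed as seconds since midnight (as it might be after decoding IRIG or GPS) and a 30 fps frame rate; drop-frame timecode and the actual VITC/LTC signal generation are outside its scope.

    def to_smpte(seconds_since_midnight, fps=30):
        # Non-drop-frame conversion of a master time value to HH:MM:SS:FF.
        total_frames = int(round(seconds_since_midnight * fps))
        frames = total_frames % fps
        total_seconds = total_frames // fps
        seconds = total_seconds % 60
        minutes = (total_seconds // 60) % 60
        hours = (total_seconds // 3600) % 24
        return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

    # Example: 13:02:07 and 15 frames into the day at 30 fps.
    print(to_smpte(13 * 3600 + 2 * 60 + 7 + 15 / 30))  # -> 13:02:07:15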
Requirements Development for Interoperability Simulation Capability for Law Enforcement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holter, Gregory M.
2004-05-19
The National Counterdrug Center (NCC) was initially authorized by Congress in FY 1999 appropriations to create a simulation-based counterdrug interoperability training capability. As the lead organization for Research and Analysis to support the NCC, the Pacific Northwest National Laboratory (PNNL) was responsible for developing the requirements for this interoperability simulation capability. These requirements were structured to address the hardware and software components of the system, as well as the deployment and use of the system. The original set of requirements was developed through a process of conducting a user-based survey of requirements for the simulation capability, coupled with an analysis of similar development efforts. The user-based approach ensured that existing concerns with respect to interoperability within the law enforcement community would be addressed. Law enforcement agencies within the designated pilot area of Cochise County, Arizona, were surveyed using interviews and ride-alongs during actual operations. The results of this survey were then accumulated, organized, and validated with the agencies to ensure the accuracy of the results. These requirements were then supplemented by adapting operational requirements from existing systems to ensure system reliability and operability. The NCC adopted a development approach providing incremental capability through the fielding of a phased series of progressively more capable versions of the system. This allowed for feedback from system users to be incorporated into subsequent revisions of the system requirements, and also allowed the addition of new elements as needed to adapt the system to broader geographic and geopolitical areas, including areas along the southwest and northwest U.S. borders. This paper addresses the processes used to develop and refine requirements for the NCC interoperability simulation capability, as well as the response of the law enforcement community to the use of the NCC system. The paper also addresses the applicability of such an interoperability simulation capability to a broader set of law enforcement, border protection, site/facility security, and first-responder needs.
Particulate matter exposure of bicycle path users in a high-altitude city
NASA Astrophysics Data System (ADS)
Fajardo, Oscar A.; Rojas, Nestor Y.
2012-01-01
Cyclists' exposure to particulate matter needs to be evaluated, along with whether they are at higher risk due to their increased breathing rate and their exposure to freshly emitted pollutants. The aim of this pilot study was to determine cyclists' exposure to PM10 in a highly polluted, high-altitude city such as Bogotá, and to comment on the appropriateness of building bicycle paths alongside roads with heavy traffic in third world cities. A total of 29 particulate matter (PM10) measurements, taken at two sampling sites using Harvard impactors, were used for estimating the exposure of users of the 80th street bicycle path to this pollutant. The PM10 dose can be considered high, especially due to the high concentrations and the cyclists' increased inhalation rates. A random survey of 73 bicycle path users was conducted to determine cyclists' time, distance and speed on the bicycle path on a daily and weekly basis, their level of effort when cycling, and general characteristics such as this population's gender and age. Based on this information, the PM10 average daily dose (ADDc) for different bicycle path users and the ratio between ADDc and a reference ADD (ADDr) for people at rest exposed to an indoor concentration of 25 μg m-3 were estimated. The average increase in ADD was 6%-9% when riding with light effort and 12%-18% when riding with moderate effort. The most enthusiastic bicycle path users showed ADDc/ADDr ratios as high as 1.30 when riding with light effort and 1.64 when riding with moderate effort, thereby significantly increasing their PM10 exposure-associated health risks.
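The dose metric used above can be written out explicitly. The expression below is a generic average-daily-dose formulation of the kind commonly used in inhalation exposure assessment; it is given as an assumption, since the abstract does not state the exact equation applied in the study.

    \mathrm{ADD} = \frac{C \times \mathrm{IR} \times \mathrm{ET} \times \mathrm{EF}}{\mathrm{BW} \times \mathrm{AT}}

Here C is the PM10 concentration (μg m-3), IR the inhalation rate (higher for light or moderate cycling effort than at rest), ET the exposure time per trip, EF the trip frequency over the averaging period, BW the body weight and AT the averaging time. The ratio reported in the study is then ADDc/ADDr, with ADDr evaluated at a resting inhalation rate and C = 25 μg m-3.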
Measurement Assurance for End-Item Users
NASA Technical Reports Server (NTRS)
Mimbs, Scott M.
2008-01-01
The goal of a Quality Management System (QMS) as specified in ISO 9001 and AS9100 is to assure that the end product meets specifications and customer requirements. Measuring devices, often called measuring and test equipment (MTE), provide the evidence of product conformity to the prescribed requirements. Therefore the processes which employ MTE can become a weak link in the overall QMS if proper attention is not given to the development and execution of these processes. Traditionally, calibration of MTE is given more focus in industry standards and process control efforts than the equally important proper usage of the same equipment. It is a common complaint of calibration laboratory personnel that MTE users are only interested in "a sticker." If the QMS requires the MTE "to demonstrate conformity of the product," then the quality of the measurement process must be adequate for the task. This leads to an ad hoc definition: measurement assurance is a discipline that assures that all processes, activities, environments, standards, and procedures involved in making a measurement produce a result that can be rigorously evaluated for validity and accuracy. To evaluate whether the existing measurement processes are providing an adequate level of quality to support the decisions based upon this measurement data, an understanding of measurement assurance basics is essential. This topic is complementary to the calibration standard, ANSI/NCSL Z540.3-2006, which targets the calibration of MTE at the organizational level. This paper discusses general measurement assurance when MTE is used to provide evidence of product conformity; the target audience is therefore end-item users of MTE. A central focus of the paper is the verification of tolerances and the associated risks, so calibration professionals may also find the paper useful in communicating with their customers, the MTE users.
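The tolerance-verification risk mentioned above can be made concrete with the standard false-accept probability. The formulation below assumes a symmetric two-sided tolerance of ±T with the acceptance limit set equal to the tolerance; it is a generic expression, not necessarily the treatment used in the paper or mandated verbatim by ANSI/NCSL Z540.3.

    \mathrm{PFA} = \int_{|x| > T} \int_{|y| \le T} f_{Y \mid X}(y \mid x)\, f_X(x)\, \mathrm{d}y\, \mathrm{d}x

Here X is the true value of the parameter under test with density f_X, Y is the measured value whose conditional density f_{Y|X} reflects the measurement-process uncertainty, and PFA is the probability that an out-of-tolerance item is nevertheless accepted, which is the kind of risk the Z540.3 requirement is intended to bound.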
Evaluation of the resistance of a geopolymer-based drug delivery system to tampering.
Cai, Bing; Engqvist, Håkan; Bredenberg, Susanne
2014-04-25
Tamper-resistance is an important property of controlled-release formulations of opioid drugs. Tamper-resistant formulations aim to increase the degree of effort required to override the controlled release of the drug molecules from extended-release formulations for the purpose of non-medical use. In this study, the resistance of a geopolymer-based formulation to tampering was evaluated by comparing it with a commercial controlled-release tablet using several methods commonly used by drug abusers. Because of its high compressive strength and resistance to heat, much more effort and time were required to extract the drug from the geopolymer-based formulation. Moreover, in the drug-release test, the geopolymer-based formulation maintained its controlled-release characteristics after milling, while the drug was released immediately from the milled commercial tablets, potentially resulting in dose dumping. Although the tampering methods used in this study do not cover all methods that abusers could use, the results obtained by the described methods showed that the geopolymer matrix increased the degree of effort required to override the controlled release of the drug, suggesting that the formulation has improved resistance to some common drug-abuse tampering methods. The geopolymer matrix has the potential to make the opioid product less accessible and attractive to non-medical users. Copyright © 2014 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, S.A.
In a computing landscape with a plethora of different hardware architectures and supporting software systems ranging from compilers to operating systems, there is an obvious and strong need for a philosophy of software development that lends itself to the design and construction of portable code systems. The current efforts to standardize software bear witness to this need. SABrE is an effort to implement a software development environment which is itself portable and promotes the design and construction of portable applications. SABrE does not include such important tools as editors and compilers; well-built tools of that kind are readily available across virtually all computer platforms. The areas that SABrE addresses are at a higher level, involving issues such as data portability, portable inter-process communication, and graphics. These blocks of functionality have particular significance to the kind of code development done at LLNL, which is partly why the general computing community has not supplied us with these tools already. This is another key feature of software development environments that we must recognize: the general computing community cannot and should not be expected to produce all of the tools which we require.
Manufacturing of Wearable Sensors for Human Health and Performance Monitoring
NASA Astrophysics Data System (ADS)
Alizadeh, Azar
2015-03-01
Continuous monitoring of physiological and biological parameters is expected to improve performance and medical outcomes by assessing overall health status and alerting for life-saving interventions. Continuous monitoring of these parameters requires wearable devices with an appropriate form factor (lightweight, comfortable, low energy consuming and even single-use) to avoid disrupting daily activities thus ensuring operation relevance and user acceptance. Many previous efforts to implement remote and wearable sensors have suffered from high cost and poor performance, as well as low clinical and end-use acceptance. New manufacturing and system level design approaches are needed to make the performance and clinical benefits of these sensors possible while satisfying challenging economic, regulatory, clinical, and user-acceptance criteria. In this talk we will review several recent design and manufacturing efforts aimed at designing and building prototype wearable sensors. We will discuss unique opportunities and challenges provided by additive manufacturing, including 3D printing, to drive innovation through new designs, faster prototyping and manufacturing, distributed networks, and new ecosystems. We will also show alternative hybrid self-assembly based integration techniques for low cost large scale manufacturing of single use wearable devices. Coauthors: Prabhjot Singh and Jeffrey Ashe.
A Human Reliability Based Usability Evaluation Method for Safety-Critical Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillippe Palanque; Regina Bernhaupt; Ronald Boring
2006-04-01
Recent years have seen an increasing use of sophisticated interaction techniques, including in the field of safety-critical interactive software [8]. The use of such techniques has been required in order to increase the bandwidth between users and systems and thus to help them deal efficiently with increasingly complex systems. These techniques come from research and innovation done in the field of human-computer interaction (HCI). A significant effort is currently being undertaken by the HCI community to apply and extend current usability evaluation techniques to these new kinds of interaction techniques. However, very little has been done to improve the reliability of software offering these kinds of interaction techniques. Even testing basic graphical user interfaces remains a challenge that has rarely been addressed in the field of software engineering [9]. Moreover, the non-reliability of interactive software can jeopardize usability evaluation by showing unexpected or undesired behaviors. The aim of this SIG is to provide a forum for both researchers and practitioners interested in testing interactive software. Our goal is to define a roadmap of activities to cross-fertilize usability and reliability testing of these kinds of systems to minimize duplicate efforts in both communities.
NASA Technical Reports Server (NTRS)
Foster, P.
1977-01-01
The NASA Lewis Research Center has held a series of six major and unique technology utilization conferences which were major milestones in planned, structured efforts to establish effective working relationships with specific technology user communities. These efforts were unique in that the activities undertaken prior to the conference were extensive, and effectively laid the groundwork for productive technology transfer following, and as a direct result of, the conferences. The effort leading to each conference was tailored to the characteristics of the potential user community; however, the common factors comprise a basic framework applicable to similar endeavors. The process is essentially a planned sequence of steps that constitute a technical market survey and a marketing program for the development of beneficial applications of aerospace technology beyond the aerospace field.
LaPeyre, Megan K.; Nix, Ashby; Laborde, Luke; Piazza, Bryan P.
2012-01-01
Successful oyster reef restoration, like many conservation challenges, requires not only biological understanding of the resource, but also stakeholder cooperation and political support. To measure perceptions of oyster reef restoration activities and priorities for future restoration along the northern Gulf of Mexico coast, a survey of 1500 individuals representing 4 user groups (oyster harvesters, shrimpers, environmental organization members, professionals), across 5 states (Texas, Louisiana, Mississippi, Alabama, Florida) was conducted in 2011. All respondents highly supported reef restoration efforts, but there was a dichotomy in preferred restoration goals with commercial fishermen more likely to support oyster reef restoration for stock enhancement, while professionals and environmental organization members were more likely to support oyster reef restoration to enhance ecosystem services. All user groups identified enforcement, funding, and appropriate site selection as basic requirements for successful reef restoration. For management of restored oyster reefs, oyster harvesters and shrimpers were less likely to support options that restricted the use of reefs, including gear restrictions and permanent closures, but did support rotating annual reef closures, while other stakeholders were willing to consider all options, including annual reef closures and sanctuary reefs. Overall, there were clear differences in management and communication preferences across user groups, but few differences across states. Understanding these key differences in stakeholder support for, and willingness to accept specific management actions is critical in moving management and restoration forward while minimizing conflict.
Technological, biological, and acoustical constraints to music perception in cochlear implant users.
Limb, Charles J; Roy, Alexis T
2014-02-01
Despite advances in technology, the ability to perceive music remains limited for many cochlear implant users. This paper reviews the technological, biological, and acoustical constraints that make music an especially challenging stimulus for cochlear implant users, while highlighting recent research efforts to overcome these shortcomings. The limitations of cochlear implant devices, which have been optimized for speech comprehension, become evident when applied to music, particularly with regards to inadequate spectral, fine-temporal, and dynamic range representation. Beyond the impoverished information transmitted by the device itself, both peripheral and central auditory nervous system deficits are seen in the presence of sensorineural hearing loss, such as auditory nerve degeneration and abnormal auditory cortex activation. These technological and biological constraints to effective music perception are further compounded by the complexity of the acoustical features of music itself that require the perceptual integration of varying rhythmic, melodic, harmonic, and timbral elements of sound. Cochlear implant users not only have difficulty perceiving spectral components individually (leading to fundamental disruptions in perception of pitch, melody, and harmony) but also display deficits with higher perceptual integration tasks required for music perception, such as auditory stream segregation. Despite these current limitations, focused musical training programs, new assessment methods, and improvements in the representation and transmission of the complex acoustical features of music through technological innovation offer the potential for significant advancements in cochlear implant-mediated music perception. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ratib, Osman; Rosset, Antoine; Dahlbom, Magnus; Czernin, Johannes
2005-04-01
Display and interpretation of multidimensional data obtained from the combination of 3D data acquired from different modalities (such as PET-CT) require complex software tools allowing the user to navigate and modify the different image parameters. With faster scanners it is now possible to acquire dynamic images of a beating heart or the transit of a contrast agent, adding a fifth dimension to the data. We developed DICOM-compliant software for real-time navigation in very large sets of five-dimensional data based on an intuitive multidimensional jog-wheel widely used in the video-editing industry. The software, provided under open source licensing, allows interactive, single-handed navigation through 3D images while adjusting blending of image modalities, image contrast and intensity, and the rate of cine display of dynamic images. In this study we focused our effort on the user interface and means for interactively navigating these large data sets while easily and rapidly changing multiple parameters such as image position, contrast, intensity, blending of colors, magnification, etc. Conventional mouse-driven user interfaces requiring the user to manipulate cursors and sliders on the screen are too cumbersome and slow. We evaluated several hardware devices and identified a category of multipurpose jog-wheel device used in the video-editing industry that is particularly suitable for rapidly navigating in five dimensions while adjusting several display parameters interactively. The application of this tool will be demonstrated in cardiac PET-CT imaging and functional cardiac MRI studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Studwell, Sara; Robinson, Carly; Elliott, Jannean
2017-04-04
Scientific research is producing ever-increasing amounts of data. Organizing and reflecting relationships across data collections, datasets, publications, and other research objects are essential functionalities of the modern science environment, yet challenging to implement. Landing pages are often used for providing ‘big picture’ contextual frameworks for datasets and data collections, and many large-volume data holders are utilizing them in thoughtful, creative ways. The benefits of their organizational efforts, however, are not realized unless the user eventually sees the landing page at the end point of their search. What if that organization and ‘big picture’ context could benefit the user at the beginning of the search? That is a challenging approach, but the Department of Energy’s (DOE) Office of Scientific and Technical Information (OSTI) is redesigning the database functionality of the DOE Data Explorer (DDE) with that goal in mind. Phase I is focused on redesigning the DDE database to leverage relationships between two existing distinct populations in DDE, data Projects and individual Datasets, and then adding a third, intermediate population, data Collections. Mapped, structured linkages, designed to show user relationships, will allow users to make informed search choices. These linkages will be sustainable and scalable, created automatically with the use of new metadata fields and existing authorities. Phase II will study selected DOE Data ID Service clients, analyzing how their landing pages are organized and how that organization might be used to improve DDE search capabilities. At the heart of both phases is the realization that adding more metadata information for cross-referencing may require additional effort for data scientists. Finally, OSTI’s approach seeks to leverage existing metadata and landing page intelligence without imposing an additional burden on the data creators.
NASA Astrophysics Data System (ADS)
Ross, A.; Stackhouse, P. W.; Tisdale, B.; Tisdale, M.; Chandler, W.; Hoell, J. M., Jr.; Kusterer, J.
2014-12-01
The NASA Langley Research Center Science Directorate and Atmospheric Science Data Center have initiated a pilot program to utilize Geographic Information System (GIS) tools that enable, generate, and store climatological averages using spatial queries and calculations in a spatial database, resulting in greater accessibility of data for government agencies, industry, and private-sector individuals. The major objectives of this effort include 1) processing and reformulating current data to be consistent with ESRI and OpenGIS tools; 2) developing functions to improve capability and analysis that produce "on-the-fly" data products, extending these beyond a single location to regional and global scales; 3) updating the current websites to enable both web-based and mobile application displays, optimized for mobile platforms; 4) interacting with user communities in government and industry to test formats and usage of these optimizations; and 5) developing a series of metrics that allow for monitoring of progressive performance. Significant project results will include the development of Open Geospatial Consortium (OGC) compliant web services (WMS, WCS, WFS, WPS) that serve renewable energy and agricultural application products to users through GIS software and tools. Each data product and OGC service will be registered within ECHO, the Common Metadata Repository, the Geospatial Platform, and Data.gov to ensure the data are easily discoverable and to provide data users with enhanced access to SSE data, parameters, services, and applications. This effort supports cross-agency and cross-organization interoperability of SSE data products and services by collaborating with DOI, NRCan, NREL, NCAR, and HOMER for requirements vetting and test-bed users before making the services available to the wider public.
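As a concrete illustration of how data users might consume the OGC-compliant services described above, the following sketch issues a WMS GetMap request with the owslib Python library. The service URL, layer name, and style are hypothetical placeholders rather than actual SSE endpoints; the owslib calls follow its documented interface.

```python
# Illustrative sketch only: retrieve a rendered map from an OGC-compliant WMS
# endpoint using owslib. The service URL, layer name, and style below are
# hypothetical placeholders, not actual NASA/SSE endpoints.
from owslib.wms import WebMapService

wms = WebMapService("https://example.gov/sse/wms", version="1.1.1")  # hypothetical URL

# Inspect the layers the service advertises
print(list(wms.contents))

# Request a global PNG of a (hypothetical) surface irradiance layer
img = wms.getmap(
    layers=["surface_irradiance"],   # hypothetical layer name
    styles=["default"],              # hypothetical style
    srs="EPSG:4326",
    bbox=(-180, -90, 180, 90),
    size=(720, 360),
    format="image/png",
    transparent=True,
)
with open("irradiance.png", "wb") as f:
    f.write(img.read())
```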
Lotte, Fabien; Larrue, Florian; Mühl, Christian
2013-01-01
While recent research on Brain-Computer Interfaces (BCI) has highlighted their potential for many applications, they remain barely used outside laboratories. The main reason is their lack of robustness. Indeed, with current BCI, mental state recognition is usually slow and often incorrect. Spontaneous BCI (i.e., mental imagery-based BCI) often rely on mutual learning efforts by the user and the machine, with BCI users learning to produce stable ElectroEncephaloGraphy (EEG) patterns (spontaneous BCI control being widely acknowledged as a skill) while the computer learns to automatically recognize these EEG patterns, using signal processing. Most research so far was focused on signal processing, mostly neglecting the human in the loop. However, how well the user masters the BCI skill is also a key element explaining BCI robustness. Indeed, if the user is not able to produce stable and distinct EEG patterns, then no signal processing algorithm would be able to recognize them. Unfortunately, despite the importance of BCI training protocols, they have been scarcely studied so far, and used mostly unchanged for years. In this paper, we advocate that current human training approaches for spontaneous BCI are most likely inappropriate. We notably study instructional design literature in order to identify the key requirements and guidelines for a successful training procedure that promotes a good and efficient skill learning. This literature study highlights that current spontaneous BCI user training procedures satisfy very few of these requirements and hence are likely to be suboptimal. We therefore identify the flaws in BCI training protocols according to instructional design principles, at several levels: in the instructions provided to the user, in the tasks he/she has to perform, and in the feedback provided. For each level, we propose new research directions that are theoretically expected to address some of these flaws and to help users learn the BCI skill more efficiently. PMID:24062669
State and local response to damaging land subsidence in United States urban areas
Holzer, T.L.
1989-01-01
Land subsidence caused by man-induced depressuring of underground reservoirs has occurred in at least nine urban areas in the United States. Significant efforts to control it have been made in three areas: Long Beach, California; Houston-Galveston, Texas; and Santa Clara Valley, California. In these areas coastal flooding and its control cost more than $300 million. Institutional changes were required in each area to ameliorate its subsidence problem. In Long Beach and Houston Galveston, efforts were made to mitigate subsidence only after significant flood damage had occurred. To arrest subsidence at Long Beach, the city lobbied for a special state law, the California Subsidence Act, that required unitization and repressuring of the Wilmington oil field. In the Houston-Galveston region, the Texas State Legislature authorized formation of the Harris-Galveston Coastal Subsidence District with authority to regulate groundwater pumping by permit. This solution, which was achieved through efforts of entities affected by subsidence, was the product of a series of compromises necessitated by political fragmentation and disjointed water planning in the region. Amelioration of subsidence in the Santa Clara Valley was a collateral benefit from the effort by water users to curtail ground-water overdraft in the valley. Importation of surface water and a tax on ground-water pumpage reduced ground-water use, thereby allowing the recovery of water level and the arresting of subsidence.
SOFTCOST - DEEP SPACE NETWORK SOFTWARE COST MODEL
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1994-01-01
The early-on estimation of required resources and a schedule for the development and maintenance of software is usually the least precise aspect of the software life cycle. However, it is desirable to make some sort of an orderly and rational attempt at estimation in order to plan and organize an implementation effort. The Software Cost Estimation Model program, SOFTCOST, was developed to provide a consistent automated resource and schedule model which is more formalized than the often used guesswork model based on experience, intuition, and luck. SOFTCOST was developed after the evaluation of a number of existing cost estimation programs indicated that there was a need for a cost estimation program with a wide range of application and adaptability to diverse kinds of software. SOFTCOST combines several software cost models found in the open literature into one comprehensive set of algorithms that compensate for nearly fifty implementation factors relative to size of the task, inherited baseline, organizational and system environment, and difficulty of the task. SOFTCOST produces mean and variance estimates of software size, implementation productivity, recommended staff level, probable duration, amount of computer resources required, and amount and cost of software documentation. Since the confidence level for a project using mean estimates is small, the user is given the opportunity to enter risk-biased values for effort, duration, and staffing, to achieve higher confidence levels. SOFTCOST then produces a PERT/CPM file with subtask efforts, durations, and precedences defined so as to produce the Work Breakdown Structure (WBS) and schedule having the asked-for overall effort and duration. The SOFTCOST program operates in an interactive environment prompting the user for all of the required input. The program builds the supporting PERT data base in a file for later report generation or revision. The PERT schedule and the WBS schedule may be printed and stored in a file for later use. The SOFTCOST program is written in Microsoft BASIC for interactive execution and has been implemented on an IBM PC-XT/AT operating MS-DOS 2.1 or higher with 256K bytes of memory. SOFTCOST was originally developed for the Zylog Z80 system running under CP/M in 1981. It was converted to run on the IBM PC XT/AT in 1986. SOFTCOST is a copyrighted work with all copyright vested in NASA.
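The following sketch illustrates, in Python, the kind of mean/variance effort, duration, and staffing estimate (plus a risk-biased figure) that SOFTCOST reports. It is not the SOFTCOST algorithm; the power-law coefficients are generic COCOMO-style values chosen only to make the example runnable.

```python
# Minimal illustrative sketch of the kind of resource/schedule estimate SOFTCOST
# produces. This is NOT the SOFTCOST algorithm; it uses generic COCOMO-style
# power laws with invented coefficients purely to show the idea of mean/variance
# estimates and a risk-biased (higher-confidence) figure.
import math

def estimate(ksloc, productivity_factor=1.0, sigma_frac=0.3, risk_z=1.28):
    """Return mean/variance effort (person-months) and duration (months).

    ksloc               -- estimated size in thousands of source lines
    productivity_factor -- aggregate of environment/difficulty multipliers
    sigma_frac          -- assumed relative standard deviation of the estimate
    risk_z              -- z-score for the risk-biased estimate (1.28 ~ 90%)
    """
    effort_mean = 2.4 * ksloc ** 1.05 * productivity_factor   # person-months
    effort_var = (sigma_frac * effort_mean) ** 2
    duration_mean = 2.5 * effort_mean ** 0.38                 # months
    staff_mean = effort_mean / duration_mean                  # average staff level
    effort_biased = effort_mean + risk_z * math.sqrt(effort_var)
    return {
        "effort_pm": effort_mean,
        "effort_var": effort_var,
        "effort_90pct": effort_biased,
        "duration_months": duration_mean,
        "avg_staff": staff_mean,
    }

print(estimate(ksloc=32.0, productivity_factor=1.15))
```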
Cognitive consequences of clumsy automation on high workload, high consequence human performance
NASA Technical Reports Server (NTRS)
Cook, Richard I.; Woods, David D.; Mccolligan, Elizabeth; Howie, Michael B.
1991-01-01
The growth of computational power has fueled attempts to automate more of the human role in complex problem solving domains, especially those where system faults have high consequences and where periods of high workload may saturate the performance capacity of human operators. Examples of these domains include flightdecks, space stations, air traffic control, nuclear power operation, ground satellite control rooms, and surgical operating rooms. Automation efforts may have unanticipated effects on human performance, particularly if they increase the workload at peak workload times or change the practitioners' strategies for coping with workload. Smooth and effective changes in automation requires detailed understanding of the congnitive tasks confronting the user: it has been called user centered automation. The introduction of a new computerized technology in a group of hospital operating rooms used for heart surgery was observed. The study revealed how automation, especially 'clumsy automation', effects practitioner work patterns and suggest that clumsy automation constrains users in specific and significant ways. Users tailor both the new system and their tasks in order to accommodate the needs of process and production. The study of this tailoring may prove a powerful tool for exposing previously hidden patterns of user data processing, integration, and decision making which may, in turn, be useful in the design of more effective human-machine systems.
MDMA, cortisol, and heightened stress in recreational ecstasy users.
Parrott, Andrew C; Montgomery, Cathy; Wetherell, Mark A; Downey, Luke A; Stough, Con; Scholey, Andrew B
2014-09-01
Stress develops when an organism requires additional metabolic resources to cope with demanding situations. This review will debate how recreational 3,4-methylenedioxymethamphetamine (MDMA, 'ecstasy') can increase some aspects of acute and chronic stress in humans. Laboratory studies on the acute effects of MDMA on cortisol release and neurohormone levels in drug-free regular ecstasy/MDMA users have been reviewed, and the role of the hypothalamic-pituitary-adrenal (HPA) axis in chronic changes in anxiety, stress, and cognitive coping is debated. In the laboratory, acute ecstasy/MDMA use can increase cortisol levels by 100-200%, whereas ecstasy/MDMA-using dance clubbers experience an 800% increase in cortisol levels, because of the combined effects of the stimulant drug and dancing. Three-month hair samples of abstinent users revealed cortisol levels 400% higher than those in controls. Chronic users show heightened cortisol release in stressful environments and deficits in complex neurocognitive tasks. Event-related evoked response potential studies show altered patterns of brain activation, suggestive of increased mental effort, during basic information processing. Chronic mood deficits include more daily stress and higher depression in susceptible individuals. We conclude that ecstasy/MDMA increases cortisol levels acutely and subchronically and that changes in the HPA axis may explain why recreational ecstasy/MDMA users show various aspects of neuropsychobiological stress.
Satellite services system program plan
NASA Technical Reports Server (NTRS)
Hoffman, Stephen J.
1985-01-01
The purpose is to determine the potential for servicing from the Space Shuttle Orbiter and to assess NASA's role as the catalyst in bringing about routine on-orbit servicing. Specifically, this study seeks to determine what requirements, in terms of both funds and time, are needed to make the Shuttle Orbiter not only a transporter of spacecraft but a servicing vehicle for those spacecraft as well. The scope of this effort is to focus on the near-term development of a generic servicing capability. Making this capability truly generic and attractive requires that the customer's point of view be taken and transformed into a widely usable set of hardware, and maintaining a near-term advent of this capability requires that minimal reliance be placed on advanced technology. With this background and scope, this study will proceed through three general phases to arrive at the desired program costs and schedule. The first step will be to determine the servicing requirements of the user community. This will provide the basis for the second phase, which is to develop hardware concepts to meet these needs. Finally, a cost estimate will be made for each of the new hardware concepts and a phased hardware development plan will be established for the acquisition of these items based on the inputs obtained from the user community.
Airplane Mesh Development with Grid Density Studies
NASA Technical Reports Server (NTRS)
Cliff, Susan E.; Baker, Timothy J.; Thomas, Scott D.; Lawrence, Scott L.; Rimlinger, Mark J.
1999-01-01
Automatic grid generation wish list: Geometry handling, including CAD cleanup and mesh generation, remains a major bottleneck in the application of CFD methods. There is a pressing need for greater automation in several aspects of geometry preparation in order to reduce set-up time and eliminate user intervention as much as possible. Starting from the CAD representation of a configuration, there may be holes or overlapping surfaces which require an intensive effort to establish cleanly abutting surface patches, and collections of many patches may need to be combined for more efficient use of the geometrical representation. Obtaining an accurate and suitable body-conforming grid with an adequate distribution of points throughout the flow field, for the flow conditions of interest, is often the most time-consuming task for complex CFD applications. There is a need for a clean, unambiguous definition of the CAD geometry. Ideally this would be carried out automatically by smart CAD cleanup software. One could also define a standard piecewise-smooth surface representation suitable for use by computational methods and then create software to translate between the various CAD descriptions and the standard representation. Surface meshing remains a time-consuming, user-intensive procedure. There is a need for automated surface meshing, requiring only minimal user intervention to define the overall density of mesh points. The surface mesher should produce well-shaped elements (triangles or quadrilaterals) whose size is determined initially according to the surface curvature, with a minimum size for flat pieces, and later refined by the user in other regions if necessary. Present techniques for volume meshing all require some degree of user intervention. There is a need for fully automated and reliable volume mesh generation. In addition, it should be possible to create both surface and volume meshes that meet guaranteed measures of mesh quality (e.g. minimum and maximum angle, stretching ratios, etc.).
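To make the quality measures named at the end of the abstract concrete, the sketch below computes minimum/maximum interior angles and a simple stretching ratio for surface triangles. It assumes triangles are given as index triples into a vertex array; the coordinates and any thresholds are illustrative only.

```python
# Illustrative sketch: per-triangle mesh-quality measures (minimum/maximum
# interior angle and a crude stretching/aspect ratio) so a surface mesh could
# be screened against guaranteed quality bounds. Input format is assumed to
# be a vertex array plus index triples.
import numpy as np

def triangle_quality(p0, p1, p2):
    e = [p1 - p0, p2 - p1, p0 - p2]              # edge vectors
    a, b, c = (np.linalg.norm(v) for v in e)     # edge lengths
    # Interior angles via the law of cosines (degrees)
    angles = np.degrees([
        np.arccos(np.clip((b**2 + c**2 - a**2) / (2 * b * c), -1.0, 1.0)),
        np.arccos(np.clip((a**2 + c**2 - b**2) / (2 * a * c), -1.0, 1.0)),
        np.arccos(np.clip((a**2 + b**2 - c**2) / (2 * a * b), -1.0, 1.0)),
    ])
    stretching = max(a, b, c) / min(a, b, c)     # simple aspect measure
    return angles.min(), angles.max(), stretching

verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.2, 0.9, 0.0]])
tris = [(0, 1, 2)]
for i, j, k in tris:
    print(triangle_quality(verts[i], verts[j], verts[k]))
```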
Carbon Dioxide Observational Platform System (CO-OPS), feasibility study
NASA Technical Reports Server (NTRS)
Bouquet, D. L.; Hall, D. W.; Mcelveen, R. P.
1987-01-01
The Carbon Dioxide Observational Platform System (CO-OPS) is a near-space, geostationary, multi-user, unmanned, microwave-powered monitoring platform system. This systems engineering feasibility study addressed identified existing requirements such as carbon dioxide observational data requirements, communications requirements, and eye-in-the-sky requirements of other groups like the Defense Department, the Forestry Service, and the Coast Guard. In addition, potential applications in earth system science, space system sciences, and test and verification (satellite sensors and data management techniques) were considered. The eleven-month effort is summarized. Past work and methods of gathering the required observational data were assessed, and rough-order-of-magnitude cost estimates have shown the CO-OPS system to be most cost effective (less than $30 million within a 10-year lifetime). It was also concluded that there are no technical, schedule, or other obstacles that would prevent achieving the objectives of the total 5-year CO-OPS program.
CANISTER HANDLING FACILITY DESCRIPTION DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.F. Beesley
The purpose of this facility description document (FDD) is to establish requirements and associated bases that drive the design of the Canister Handling Facility (CHF), which will allow the design effort to proceed to license application. This FDD will be revised at strategic points as the design matures. This FDD identifies the requirements and describes the facility design as it currently exists, with emphasis on attributes of the design provided to meet the requirements. This FDD is an engineering tool for design control; accordingly, the primary audience and users are design engineers. This FDD is part of an iterative design process. It leads the design process with regard to the flowdown of upper-tier requirements onto the facility. Knowledge of these requirements is essential in performing the design process. The FDD follows the design with regard to the description of the facility. The description provided in this FDD reflects the current results of the design process.
Dufendach, Kevin R; Koch, Sabine; Unertl, Kim M; Lehmann, Christoph U
2017-10-26
Early involvement of stakeholders in the design of medical software is particularly important due to the need to incorporate complex knowledge and actions associated with clinical work. Standard user-centered design methods include focus groups and participatory design sessions with individual stakeholders, which generally limit user involvement to a small number of individuals due to the significant time investments from designers and end users. The goal of this project was to reduce the effort for end users to participate in co-design of a software user interface by developing an interactive web-based crowdsourcing platform. In a randomized trial, we compared a new web-based crowdsourcing platform to standard participatory design sessions. We developed an interactive, modular platform that allows responsive remote customization and design feedback on a visual user interface based on user preferences. The responsive canvas is a dynamic HTML template that responds in real time to user preference selections. Upon completion, the design team can view the user's interface creations through an administrator portal and download the structured selections through a REDCap interface. We have created a software platform that allows users to customize a user interface and see the results of that customization in real time, receiving immediate feedback on the impact of their design choices. Neonatal clinicians used the new platform to successfully design and customize a neonatal handoff tool. They received no specific instruction and yet were able to use the software easily and reported high usability. VandAID, a new web-based crowdsourcing platform, can involve multiple users in user-centered design simultaneously and provides means of obtaining design feedback remotely. The software can provide design feedback at any stage in the design process, but it will be of greatest utility for specifying user requirements and evaluating iterative designs with multiple options.
Lee, Chien-Ching; Lin, Shih-Pin; Yang, Shu-Ling; Tsou, Mei-Yung; Chang, Kuang-Yi
2013-03-01
Medical institutions are eager to introduce new information technology to improve patient safety and clinical efficiency. However, the acceptance of new information technology by medical personnel plays a key role in its adoption and application. This study aims to investigate whether perceived organizational learning capability (OLC) is associated with user acceptance of information technology among operating room nurse staff. Nurse anesthetists and operating room nurses were recruited in this questionnaire survey. A pilot study was performed to ensure the reliability and validity of the translated questionnaire, which consisted of 14 items from the four dimensions of OLC, and 16 items from the four constructs of user acceptance of information technology, including performance expectancy, effort expectancy, social influence, and behavioral intention. Confirmatory factor analysis was applied in the main survey to evaluate the construct validity of the questionnaire. Structural equation modeling was used to test the hypothetical relationships between the four dimensions of user acceptance of information technology and the second-ordered OLC. Goodness of fit of the hypothetic model was also assessed. Performance expectancy, effort expectancy, and social influence positively influenced behavioral intention of users of the clinical information system (all p < 0.001) and accounted for 75% of its variation. The second-ordered OLC was positively associated with performance expectancy, effort expectancy, and social influence (all p < 0.001). However, the hypothetic relationship between perceived OLC and behavioral intention was not significant (p = 0.87). The fit statistical analysis indicated reasonable model fit to data (root mean square error of approximation = 0.07 and comparative fit index = 0.91). Perceived OLC indirectly affects user behavioral intention through the mediation of performance expectancy, effort expectancy, and social influence in the operating room setting. Copyright © 2013. Published by Elsevier B.V.
A Case Study on the Geocuration of Multidisciplinary Data Products and Services
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2015-12-01
Data curation at an interdisciplinary scientific data center that focuses on human interactions in the environment provides opportunities for the geocuration of data from diverse natural, social, health, and engineering disciplines to offer data products and services to users representing a variety of fields of inquiry, levels of expertise, and vocations. Addressing pressing issues such as disaster risk management, climate change, resource depletion, and environment-conflict interactions requires accessing and integrating different types of data from diverse sources, often collected with quite disparate methods, scales, levels of uncertainty and quality, and access and usage rights. Particular challenges for geocuration include identifying relevant data sets from diverse sources, assessing their suitability for integration, conversion to forms that enhance interoperability, obtaining suitable access and usage rights for data, documentation of methods in ways understandable to diverse users, and evaluation of the effectiveness of geocuration efforts. We describe here a number of efforts to develop geocurated data collections in such areas as environmental indicators, land use/land cover change, and human settlements and infrastructure. In addition to describing the incremental development of these collections, we examine how planning and anticipation of the needs of user communities are important to the collection development process. We assess the development and continuing enhancement of the cyberinfrastructure and capabilities needed to support efficient and effective geocuration throughout the data lifecycle. We conclude with selected observations and lessons learned from the development of these geocurated collections.
Numerical aerodynamic simulation program long haul communications prototype
NASA Technical Reports Server (NTRS)
Cmaylo, Bohden K.; Foo, Lee
1987-01-01
This document is a report of the Numerical Aerodynamic Simulation (NAS) Long Haul Communications Prototype (LHCP). It describes the accomplishments of the LHCP group, presents the results from all LHCP experiments and testing activities, makes recommendations for present and future LHCP activities, and evaluates the remote workstation accesses from Langley Research Center, Lewis Research Center, and Colorado State University to Ames Research Center. The report is the final effort of the Long Haul (Wideband) Communications Prototype Plan (PT-1133-02-N00), 3 October 1985, which defined the requirements for the development, test, and operation of the LHCP network and was the plan used to evaluate the remote user bandwidth requirements for the Numerical Aerodynamic Simulation Processing System Network.
Earthdata Search: How Usability Drives Innovation To Enable A Broad User Base
NASA Astrophysics Data System (ADS)
Reese, M.; Siarto, J.; Lynnes, C.; Shum, D.
2017-12-01
Earthdata Search (https://search.earthdata.nasa.gov) is a modern web application allowing users to search, discover, visualize, refine, and access NASA Earth observation data using a wide array of service offerings. Its goal is to ease the technical burden on data users by providing a high-quality application that makes it simple to interact with NASA Earth observation data, freeing them to spend more effort on innovative endeavors. This talk will detail how we put end users first in our design and development process, focusing on usability and letting usability needs drive requirements for the underlying technology. To give just a few examples of how this plays out in practice: Earthdata Search is paired with a lightning-fast metadata repository, allowing it to provide an extremely responsive UI that updates as the user changes criteria, not only at the dataset level but also at the file level. This results in a better exploration experience, as the time penalty is greatly reduced. Also, since Earthdata Search uses metadata from over 35,000 datasets managed by different data providers, metadata standards, quality, and consistency vary. We found that this was negatively impacting users' search and exploration experience. We have addressed this problem by introducing "humanizers", a community-driven process to both "smooth out" metadata values and provide non-jargonistic representations of some content within the Earthdata Search UI. This is helpful both for the experienced data scientist and for users who are brand new to the discipline.
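A minimal sketch of the "humanizer" idea, assuming a simple lookup-table form; the field names and mappings below are invented for illustration and are not the actual Earthdata Search humanizer rules.

```python
# Purely hypothetical sketch of the "humanizer" concept described above: a
# community-maintained mapping that smooths inconsistent metadata values and
# replaces jargon with readable labels before display. Field and value names
# are invented for illustration.
HUMANIZERS = {
    "platform": {
        "AQUA": "Aqua",
        "aqua": "Aqua",
        "TERRA": "Terra",
    },
    "processing_level": {
        "L1B": "Level 1B (calibrated radiances)",
        "L3": "Level 3 (gridded)",
    },
}

def humanize(field, raw_value):
    """Return the display-friendly form of a metadata value, if one is defined."""
    return HUMANIZERS.get(field, {}).get(raw_value, raw_value)

print(humanize("platform", "AQUA"))            # -> "Aqua"
print(humanize("processing_level", "L1B"))     # -> "Level 1B (calibrated radiances)"
```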
Chadwell, Alix; Kenney, Laurence; Thies, Sibylle; Galpin, Adam; Head, John
2016-01-01
Users of myoelectric prostheses can often find them difficult to control. This can lead to passive-use of the device or total rejection, which can have detrimental effects on the contralateral limb due to overuse. Current clinically available prostheses are “open loop” systems, and although considerable effort has been focused on developing biofeedback to “close the loop,” there is evidence from laboratory-based studies that other factors, notably improving predictability of response, may be as, if not more, important. Interestingly, despite a large volume of research aimed at improving myoelectric prostheses, it is not currently known which aspect of clinically available systems has the greatest impact on overall functionality and everyday usage. A protocol has, therefore, been designed to assess electromyographic (EMG) skill of the user and predictability of the prosthesis response as significant parts of the control chain, and to relate these to functionality and everyday usage. Here, we present the protocol and results from early pilot work. A set of experiments has been developed. First, to characterize user skill in generating the required level of EMG signal, as well as the speed with which users are able to make the decision to activate the appropriate muscles. Second, to measure unpredictability introduced at the skin–electrode interface, in order to understand the effects of the socket-mounted electrode fit under different loads on the variability of time taken for the prosthetic hand to respond. To evaluate prosthesis user functionality, four different outcome measures are assessed. Using a simple upper limb functional task prosthesis users are assessed for (1) success of task completion, (2) task duration, (3) quality of movement, and (4) gaze behavior. To evaluate everyday usage away from the clinic, the symmetricity of their real-world arm use is assessed using activity monitoring. These methods will later be used to assess a prosthesis user cohort to establish the relative contribution of each control factor to the individual measures of functionality and everyday usage (using multiple regression models). The results will support future researchers, designers, and clinicians in concentrating their efforts on the area that will have the greatest impact on improving prosthesis use. PMID:27597823
Model-based software for simulating ultrasonic pulse/echo inspections of metal components
NASA Astrophysics Data System (ADS)
Chiou, Chien-Ping; Margetan, Frank J.; Taylor, Jared L.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.; Barnard, Daniel J.
2017-02-01
Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at Iowa State University, an effort was initiated in 2015 to repackage existing research-grade software into user-friendly tools for the rapid estimation of signal-to-noise ratio (S/N) for ultrasonic inspections of metals. The software combines: (1) a Python-based graphical user interface for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signals and backscattered grain noise characteristics. The latter makes use of the Thompson-Gray Model for the response from an internal defect and the Independent Scatterer Model for backscattered grain noise. This paper provides an overview of the ongoing modeling effort with emphasis on recent developments. These include treatment of angle-beam inspections, implementation of distance-amplitude corrections, changes in the generation of "invented" calibration signals, efforts to simulate ultrasonic C-scans, and experimental testing of model predictions. The simulation software can now treat both normal- and oblique-incidence immersion inspections of curved metal components having equiaxed microstructures in which the grain size varies with depth. Both longitudinal and shear-wave inspections are treated. The model transducer can be planar, spherically focused, or bi-cylindrically focused. A calibration (or reference) signal is required and is used to deduce the measurement system efficiency function. This can be "invented" by the software using center frequency and bandwidth information specified by the user, or, alternatively, a measured calibration signal can be used. Defect types include flat-bottomed-hole reference reflectors, and spherical pores and inclusions. Simulation outputs include estimated defect signal amplitudes, root-mean-squared grain noise amplitudes, and S/N as functions of the depth of the defect within the metal component. At any particular depth, the user can view a simulated A-scan displaying the superimposed defect and grain-noise waveforms. The realistic grain noise signals used in the A-scans are generated from a set of measured "universal" noise signals whose strengths and spectral characteristics are altered to match predicted noise characteristics for the simulation at hand. We present simulation examples demonstrating recent developments and discuss plans to improve simulator capabilities.
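The sketch below shows the shape of the S/N-versus-depth output the simulator reports. The defect-amplitude and grain-noise functions are toy stand-ins, not the Thompson-Gray or Independent Scatterer models, and all parameter values are invented.

```python
# Illustrative sketch only: the kind of signal-to-noise-versus-depth curve the
# simulator reports. The attenuation/focusing behaviour below is a toy
# stand-in, NOT the Thompson-Gray defect model or the Independent Scatterer
# noise model used by the actual software.
import numpy as np

def defect_amplitude(depth_mm, focal_depth_mm=25.0, alpha_np_per_mm=0.02):
    """Toy defect response: focusing gain near the focal depth, two-way attenuation."""
    focusing = np.exp(-((depth_mm - focal_depth_mm) / 15.0) ** 2)
    return focusing * np.exp(-2 * alpha_np_per_mm * depth_mm)

def rms_grain_noise(depth_mm, alpha_np_per_mm=0.02, noise_level=0.05):
    """Toy backscattered grain-noise level with the same two-way attenuation."""
    return noise_level * np.exp(-2 * alpha_np_per_mm * depth_mm)

depths = np.linspace(5.0, 60.0, 12)           # mm below the entry surface
snr = defect_amplitude(depths) / rms_grain_noise(depths)
for d, s in zip(depths, snr):
    print(f"depth {d:5.1f} mm   S/N {s:6.2f}")
```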
Engineering large-scale agent-based systems with consensus
NASA Technical Reports Server (NTRS)
Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.
1994-01-01
The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.
Myokit: A simple interface to cardiac cellular electrophysiology.
Clerx, Michael; Collins, Pieter; de Lange, Enno; Volders, Paul G A
2016-01-01
Myokit is a new powerful and versatile software tool for modeling and simulation of cardiac cellular electrophysiology. Myokit consists of an easy-to-read modeling language, a graphical user interface, single and multi-cell simulation engines and a library of advanced analysis tools accessible through a Python interface. Models can be loaded from Myokit's native file format or imported from CellML. Model export is provided to C, MATLAB, CellML, CUDA and OpenCL. Patch-clamp data can be imported and used to estimate model parameters. In this paper, we review existing tools to simulate the cardiac cellular action potential to find that current tools do not cater specifically to model development and that there is a gap between easy-to-use but limited software and powerful tools that require strong programming skills from their users. We then describe Myokit's capabilities, focusing on its model description language, simulation engines and import/export facilities in detail. Using three examples, we show how Myokit can be used for clinically relevant investigations, multi-model testing and parameter estimation in Markov models, all with minimal programming effort from the user. This way, Myokit bridges a gap between performance, versatility and user-friendliness. Copyright © 2015 Elsevier Ltd. All rights reserved.
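A minimal sketch of the workflow described above, based on Myokit's documented Python interface (load a native .mmt model, run a single-cell simulation, plot the membrane potential). The file name and variable names are placeholders and should be checked against the model actually loaded.

```python
# Sketch of a typical Myokit workflow based on its documented Python interface.
# The model file name and logged variable names are placeholders; they depend
# on the model that is loaded.
import myokit
import matplotlib.pyplot as plt

# Load model, pacing protocol and embedded script from a native .mmt file
model, protocol, _ = myokit.load("example-model.mmt")

sim = myokit.Simulation(model, protocol)
log = sim.run(1000)                                 # simulate 1000 ms

plt.plot(log["engine.time"], log["membrane.V"])     # variable names depend on the model
plt.xlabel("Time (ms)")
plt.ylabel("Membrane potential (mV)")
plt.show()
```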
Lazinski, David W; Camilli, Andrew
2013-01-01
The amplification of DNA fragments, cloned between user-defined 5' and 3' end sequences, is a prerequisite step in the use of many current applications including massively parallel sequencing (MPS). Here we describe an improved method, called homopolymer tail-mediated ligation PCR (HTML-PCR), that requires very little starting template, minimal hands-on effort, is cost-effective, and is suited for use in high-throughput and robotic methodologies. HTML-PCR starts with the addition of homopolymer tails of controlled lengths to the 3' termini of a double-stranded genomic template. The homopolymer tails enable the annealing-assisted ligation of a hybrid oligonucleotide to the template's recessed 5' ends. The hybrid oligonucleotide has a user-defined sequence at its 5' end. This primer, together with a second primer composed of a longer region complementary to the homopolymer tail and fused to a second 5' user-defined sequence, are used in a PCR reaction to generate the final product. The user-defined sequences can be varied to enable compatibility with a wide variety of downstream applications. We demonstrate our new method by constructing MPS libraries starting from nanogram and sub-nanogram quantities of Vibrio cholerae and Streptococcus pneumoniae genomic DNA.
Cervantes-Trejo, Arturo; Leenen, Iwin; Fabila-Carrasco, John Stewart; Rojas-Vargas, Roy
2016-11-01
We explore demographic, temporal, and geographic patterns of 256,588 road traffic fatalities from 1998 to 2013 in Mexico, in the context of the UN's Decade of Action for Road Safety 2010-2020 (DARS). Combined traffic mortality data and population counts were analyzed using mixed-effects logistic regression, distinguishing sex-age groups, vulnerable and protected road users, and municipal size. Rapid growth in traffic mortality rates from 1998 to 2008 has been reversed since 2009. Most deaths averted are among young male protected road users (a reduction of 0.95 fatalities per 100,000 per year in males aged 12-49). In spite of a steady decrease over the full study period, mortality rates remain high in vulnerable road users over 50, with a mortality rate of 26 per 100,000 males over 75 years in 2013. Progress on reducing deaths is advancing in Mexico, in line with DARS targets, but national road safety efforts require strengthening. Initiatives should target vulnerable road users, specifically adults >50 years in urban areas. Strengthening of drink-driving programs aimed at young drivers/occupants is promising.
Rushton, Paula W; Kairy, Dahlia; Archambault, Philippe; Pituch, Evelina; Torkia, Caryne; El Fathi, Anas; Stone, Paula; Routhier, François; Forget, Robert; Pineau, Joelle; Gourdeau, Richard; Demers, Louise
2015-05-01
To explore power wheelchair users', caregivers' and clinicians' perspectives regarding the potential impact of intelligent power wheelchair use on social participation. Semi-structured interviews were conducted with power wheelchair users (n = 12), caregivers (n = 4) and clinicians (n = 12). An illustrative video was used to facilitate discussion. The transcribed interviews were analyzed using thematic analysis. Three main themes were identified based on the experiences of the power wheelchair users, caregivers and clinicians: (1) increased social participation opportunities, (2) changing how social participation is experienced and (3) decreased risk of accidents during social participation. Findings from this study suggest that an intelligent power wheelchair would enhance social participation in a variety of important ways, thereby providing support for continued design and development of this assistive technology. An intelligent power wheelchair has the potential to: Increase social participation opportunities by overcoming challenges associated with navigating through crowds and small spaces. Change how social participation is experienced through "normalizing" social interactions and decreasing the effort required to drive a power wheelchair. Decrease the risk of accidents during social participation by reducing the need for dangerous compensatory strategies and minimizing the impact of the physical environment.
GES DISC Datalist Enables Easy Data Selection For Natural Phenomena Studies
NASA Technical Reports Server (NTRS)
Li, Angela; Shie, Chung-Lin; Hegde, Mahabaleshwa; Petrenko, Maksym; Teng, William; Bryant, Keith; Liu, Zhong; Hearty, Thomas; Shen, Suhung; Seiler, Edward;
2017-01-01
In order to investigate and assess natural hazards such as tropical storms, winter storms, volcanic eruptions, floods, and drought in a timely manner, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has been developing an efficient data search and access service. Called "Datalist," this service enables users to acquire their data of interest "all at once," with minimum effort. A Datalist is a virtual collection of predefined or user-defined data variables from one or more archived data sets. Datalists are more than just data. Datalists effectively provide users with a sophisticated integrated data and services package, including metadata, citation, documentation, visualization, and data-specific services (e.g., subset and OPeNDAP), all available from one-stop shopping. The predefined Datalists, created by the experienced GES DISC science support team, should save a significant amount of time that users would otherwise have to spend. The Datalist service is an extension of the new GES DISC website, which is completely data-driven. A Datalist, also known as "data bundle," is treated just as any other data set. Being a virtual collection, a Datalist requires no extra storage space.
Salathé, Marcel; Khandelwal, Shashank
2011-10-01
There is great interest in the dynamics of health behaviors in social networks and how they affect collective public health outcomes, but measuring population health behaviors over time and space requires substantial resources. Here, we use publicly available data from 101,853 users of online social media collected over a time period of almost six months to measure the spatio-temporal sentiment towards a new vaccine. We validated our approach by identifying a strong correlation between sentiments expressed online and CDC-estimated vaccination rates by region. Analysis of the network of opinionated users showed that information flows more often between users who share the same sentiments - and less often between users who do not share the same sentiments - than expected by chance alone. We also found that most communities are dominated by either positive or negative sentiments towards the novel vaccine. Simulations of infectious disease transmission show that if clusters of negative vaccine sentiments lead to clusters of unprotected individuals, the likelihood of disease outbreaks is greatly increased. Online social media provide unprecedented access to data allowing for inexpensive and efficient tools to identify target areas for intervention efforts and to evaluate their effectiveness.
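One way to quantify the reported tendency of information to flow between like-minded users is an attribute assortativity coefficient on the user network. The sketch below shows this with networkx on an invented toy graph; it is not the authors' pipeline.

```python
# Illustrative sketch (not the authors' pipeline): quantify whether information
# flows preferentially between users sharing the same vaccine sentiment by
# computing sentiment assortativity on a user graph. The toy graph and
# sentiment labels below are invented.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("a", "b"), ("b", "c"), ("c", "a"),
                  ("d", "e"), ("e", "f"), ("c", "d")])
sentiment = {"a": "positive", "b": "positive", "c": "positive",
             "d": "negative", "e": "negative", "f": "negative"}
nx.set_node_attributes(G, sentiment, "sentiment")

# > 0 means same-sentiment users are connected more often than expected by chance
r = nx.attribute_assortativity_coefficient(G, "sentiment")
print(f"sentiment assortativity: {r:.2f}")
```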
ARIES: Acquisition of Requirements and Incremental Evolution of Specifications
NASA Technical Reports Server (NTRS)
Roberts, Nancy A.
1993-01-01
This paper describes a requirements/specification environment specifically designed for large-scale software systems. This environment is called ARIES (Acquisition of Requirements and Incremental Evolution of Specifications). ARIES provides assistance to requirements analysts for developing operational specifications of systems. This development begins with the acquisition of informal system requirements. The requirements are then formalized and gradually elaborated (transformed) into formal and complete specifications. ARIES provides guidance to the user in validating formal requirements by translating them into natural language representations and graphical diagrams. ARIES also provides ways of analyzing the specification to ensure that it is correct, e.g., testing the specification against a running simulation of the system to be built. Another important ARIES feature, especially when developing large systems, is the sharing and reuse of requirements knowledge. This leads to much less duplication of effort. ARIES combines all of its features in a single environment that makes the process of capturing a formal specification quicker and easier.
Feasibility of wake vortex monitoring systems for air terminals
NASA Technical Reports Server (NTRS)
Wilson, D. J.; Shrider, K. R.; Lawrence, T. R.
1972-01-01
Wake vortex monitoring systems, especially those using laser Doppler sensors, were investigated. The initial phases of the effort involved talking with potential users (air traffic controllers, pilots, etc.) of a wake vortex monitoring system to determine system requirements from the user's viewpoint. These discussions involved the volumes of airspace to be monitored for vortices and potential methods of using the monitored vortex data once the data are available. A subsequent task led to determining a suitable mathematical model of the vortex phenomena and developing a mathematical model of the laser Doppler sensor for monitoring the vortex flow field. The mathematical models were used in combination to help evaluate the capability of laser Doppler instrumentation in monitoring vortex flow fields both in the near vicinity of the sensor (within 1 kilometer) and at long ranges (10 kilometers).
Space Transportation Materials and Structures Technology Workshop. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Cazier, F. W., Jr. (Compiler); Gardner, J. E. (Compiler)
1992-01-01
The workshop was held to provide a forum for communication within the space materials and structures technology developer and user communities. Workshop participants were organized into a Vehicle Technology Requirements session and three working panels: Materials and Structures Technologies for Vehicle Systems; Propulsion Systems; and Entry Systems. The goals accomplished were (1) to develop important strategic planning information necessary to transition materials and structures technologies from lab research programs into robust and affordable operational systems; (2) to provide a forum for the exchange of information and ideas between technology developers and users; and (3) to provide senior NASA management with a review of current space transportation programs, related subjects, and specific technology needs. The workshop thus provided a foundation on which a NASA and industry effort to address space transportation materials and structures technologies can grow.
2014-04-03
CAPE CANAVERAL, Fla. – The Ground Systems Development and Operations Program is overseeing efforts to create a new multi-user firing room in Firing Room 4 in the Launch Control Center at NASA's Kennedy Space Center in Florida. The main floor consoles, cabling and wires below the floor and ceiling tiles above have been removed. Sub-flooring has been installed and the room is marked off to create four separate rooms on the main floor. In view along the soffit are space shuttle launch plaques for 21 missions launched from Firing Room 4. The design of Firing Room 4 will incorporate five control room areas that are flexible to meet current and future NASA and commercial user requirements. The equipment and most of the consoles from Firing Room 4 were moved to Firing Room 2 for possible future reuse. Photo credit: NASA/Ben Smegelsky
Remote Operations and Ground Control Centers
NASA Technical Reports Server (NTRS)
Bryant, Barry S.; Lankford, Kimberly; Pitts, R. Lee
2004-01-01
The Payload Operations Integration Center (POIC) at the Marshall Space Flight Center supports the International Space Station (ISS) through remote interfaces around the world. The POIC was originally designed as a gateway to space for remote facilities, ranging from an individual user to a full-scale multiuser environment. This achievement was accomplished while meeting program requirements and accommodating the injection of modern technology on an ongoing basis to ensure cost-effective operations. This paper will discuss the open POIC architecture developed to support similar and dissimilar remote operations centers. It will include the technologies, protocols, and compromises which on a day-to-day basis support ongoing operations. Additional areas covered include centralized management of shared resources and methods utilized to provide highly available and restricted resources to remote users. Finally, the effort of coordinating the actions of participants will be discussed.
Eruptive event generator based on the Gibson-Low magnetic configuration
NASA Astrophysics Data System (ADS)
Borovikov, D.; Sokolov, I. V.; Manchester, W. B.; Jin, M.; Gombosi, T. I.
2017-08-01
Coronal mass ejections (CMEs), a class of energetic solar eruption, are an integral subject of space weather research. Numerical magnetohydrodynamic (MHD) modeling, which requires powerful computational resources, is one of the primary means of studying the phenomenon. As such resources become more accessible, the demand grows for user-friendly tools that would facilitate the process of simulating CMEs for scientific and operational purposes. The Eruptive Event Generator based on Gibson-Low flux rope (EEGGL), a new publicly available computational model presented in this paper, is an effort to meet this demand. EEGGL allows one to compute the parameters of a model flux rope driving a CME via an intuitive graphical user interface. We provide a brief overview of the physical principles behind EEGGL and its functionality. Ways toward future improvements of the tool are outlined.
Plain packaging: a logical progression for tobacco control in one of the world's ‘darkest markets’
Scollo, Michelle; Bayly, Megan; Wakefield, Melanie
2015-01-01
The Australian approach to tobacco control has been a comprehensive one, encompassing mass media campaigns, consumer information, taxation policy, access for smokers to smoking cessation advice and pharmaceutical treatments, protection from exposure to tobacco smoke and regulation of promotion. World-first legislation to standardise the packaging of tobacco was a logical next step to further reduce misleadingly reassuring promotion of a product known for the past 50 years to kill a high proportion of its long-term users. Similarly, refreshed, larger pack warnings which started appearing on packs at the end of 2012 were a logical progression of efforts to ensure that consumers are better informed about the health risks associated with smoking. Regardless of the immediate effects of legislation, further progress will continue to require a comprehensive approach to maintain momentum and ensure that government efforts on one front are not undermined by more vigorous efforts and greater investment by tobacco companies elsewhere. PMID:28407604
NASA Technical Reports Server (NTRS)
Hall, Callie; Arnone, Robert
2006-01-01
The NASA Applied Sciences Program seeks to transfer NASA data, models, and knowledge into the hands of end-users by forming links with partner agencies and associated decision support tools (DSTs). Through the NASA REASoN (Research, Education and Applications Solutions Network) Cooperative Agreement, the Oceanography Division of the Naval Research Laboratory (NRLSSC) is developing new products through the integration of data from NASA Earth-Sun System assets with coastal ocean forecast models and other available data to enhance coastal management in the Gulf of Mexico. The recipient federal agency for this research effort is the National Oceanic and Atmospheric Administration (NOAA). The contents of this report detail the effort to further the goals of the NASA Applied Sciences Program by demonstrating the use of NASA satellite products combined with data-assimilating ocean models to provide near real-time information to maritime users and coastal managers of the Gulf of Mexico. This effort provides new and improved capabilities for monitoring, assessing, and predicting the coastal environment. Coastal managers can exploit these capabilities through enhanced DSTs at federal, state and local agencies. The project addresses three major issues facing coastal managers: 1) Harmful Algal Blooms (HABs); 2) hypoxia; and 3) freshwater fluxes to the coastal ocean. A suite of ocean products capable of describing Ocean Weather is assembled on a daily basis as the foundation for this semi-operational multiyear effort. This continuous real-time capability brings decision makers a new ability to monitor both normal and anomalous coastal ocean conditions with a steady flow of satellite and ocean model conditions. Furthermore, as the baseline data sets are used more extensively and the customer list grows, customer feedback is obtained and additional customized products are developed and provided to decision makers. Continual feedback between researchers and customers, answered with new and improved products, is required. This document details the methods by which these coastal ocean products are produced, including the data flow, distribution, and verification. Product applications and the degree to which these products are used successfully within NOAA and coordinated with the Mississippi Department of Marine Resources (MDMR) are benchmarked.
NASA Technical Reports Server (NTRS)
Hua, Chongyu; Volakis, John L.
1990-01-01
AUTOMESH-2D is a computer program specifically designed as a preprocessor for the scattering analysis of two-dimensional bodies by the finite element method. This program was developed due to a need for reducing the effort required to define and check the geometry data, element topology, and material properties. There are six modules in the program: (1) Parameter Specification; (2) Data Input; (3) Node Generation; (4) Element Generation; (5) Mesh Smoothing; and (6) Data File Generation.
Automated CPX support system preliminary design phase
NASA Technical Reports Server (NTRS)
Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.
1984-01-01
The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercise was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.
OLIFE: Tight Binding Code for Transmission Coefficient Calculation
NASA Astrophysics Data System (ADS)
Mijbil, Zainelabideen Yousif
2018-05-01
A new and human-friendly transport calculation code has been developed. It requires a simple tight-binding Hamiltonian as the only input file and uses a convenient graphical user interface to control calculations. The effect of a magnetic field on the junction has also been included. Furthermore, the transmission coefficient can be calculated between any two points on the scatterer, which gives high flexibility for checking the system. OLIFE can therefore be highly recommended as a tool for pretesting, studying, and teaching electron transport in molecular devices, saving a lot of time and effort.
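As a point of reference for the kind of calculation a code like OLIFE automates, the sketch below computes a two-terminal transmission coefficient from a small tight-binding Hamiltonian using the standard Green's-function formula T(E) = Tr[Gamma_L G Gamma_R G†]. It assumes single-orbital one-dimensional leads attached to the first and last sites and is an illustrative approximation, not the OLIFE implementation itself.

import numpy as np

def lead_self_energy(E, eps=0.0, t=-1.0):
    # Retarded self-energy of a semi-infinite 1D tight-binding lead with hopping t.
    z = E + 1e-9j - eps
    g = (z - np.sqrt(z**2 - 4.0 * t**2)) / (2.0 * t**2)   # surface Green's function
    if g.imag > 0:                                        # keep the retarded (decaying) branch
        g = (z + np.sqrt(z**2 - 4.0 * t**2)) / (2.0 * t**2)
    return t**2 * g

def transmission(E, H, t_lead=-1.0):
    # T(E) for a device Hamiltonian H with 1D leads on its first and last sites.
    n = H.shape[0]
    sigma_L = np.zeros((n, n), complex)
    sigma_R = np.zeros((n, n), complex)
    sigma_L[0, 0] = lead_self_energy(E, t=t_lead)
    sigma_R[-1, -1] = lead_self_energy(E, t=t_lead)
    G = np.linalg.inv(E * np.eye(n) - H - sigma_L - sigma_R)
    gamma_L = 1j * (sigma_L - sigma_L.conj().T)
    gamma_R = 1j * (sigma_R - sigma_R.conj().T)
    return np.trace(gamma_L @ G @ gamma_R @ G.conj().T).real

# Example: a perfect 4-site chain (energies in units of the lead hopping) gives T ~ 1 inside the band.
H = np.diag([-1.0] * 3, 1) + np.diag([-1.0] * 3, -1)
print(transmission(0.0, H))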
Ease of adoption of clinical natural language processing software: An evaluation of five systems.
Zheng, Kai; Vydiswaran, V G Vinod; Liu, Yang; Wang, Yue; Stubbs, Amber; Uzuner, Özlem; Gururaj, Anupama E; Bayer, Samuel; Aberdeen, John; Rumshisky, Anna; Pakhomov, Serguei; Liu, Hongfang; Xu, Hua
2015-12-01
In recognition of potential barriers that may inhibit the widespread adoption of biomedical software, the 2014 i2b2 Challenge introduced a special track, Track 3 - Software Usability Assessment, in order to develop a better understanding of the adoption issues that might be associated with the state-of-the-art clinical NLP systems. This paper reports the ease of adoption assessment methods we developed for this track, and the results of evaluating five clinical NLP system submissions. A team of human evaluators performed a series of scripted adoptability test tasks with each of the participating systems. The evaluation team consisted of four "expert evaluators" with training in computer science, and eight "end user evaluators" with mixed backgrounds in medicine, nursing, pharmacy, and health informatics. We assessed how easy it is to adopt the submitted systems along the following three dimensions: communication effectiveness (i.e., how effective a system is in communicating its designed objectives to its intended audience), effort required to install, and effort required to use. We used a formal software usability testing tool, TURF, to record the evaluators' interactions with the systems and 'think-aloud' data revealing their thought processes when installing and using the systems and when resolving unexpected issues. Overall, the ease of adoption ratings that the five systems received are unsatisfactory. Installation of some of the systems proved to be rather difficult, and some systems failed to adequately communicate their designed objectives to intended adopters. Further, the average ratings provided by the end user evaluators on ease of use and ease of interpreting output are -0.35 and -0.53, respectively, indicating that this group of users generally deemed the systems extremely difficult to work with. While the ratings provided by the expert evaluators are higher, 0.6 and 0.45, respectively, these ratings are still low, indicating that they also experienced considerable struggles. The results of the Track 3 evaluation show that the adoptability of the five participating clinical NLP systems leaves considerable room for improvement. Remedy strategies suggested by the evaluators included (1) more detailed and operating-system-specific usage instructions; (2) provision of more pertinent onscreen feedback for easier diagnosis of problems; (3) including screen walk-throughs in the usage instructions so users know what to expect and what might have gone wrong; (4) avoiding jargon and acronyms in materials intended for end users; and (5) packaging prerequisites required within software distributions so that prospective adopters of the software do not have to obtain each of the third-party components on their own. Copyright © 2015 Elsevier Inc. All rights reserved.
User Impact on Selection, Digitization, and the Development of Digital Special Collections
ERIC Educational Resources Information Center
Mills, Alexandra
2015-01-01
Libraries and archives digitize their special collections in an effort to increase access to rare and unique items. To ensure that resulting digital collections meet user needs, institutions have made assessment, consultation, and user participation integral to digitization initiatives and the selection process. Institutions must also build…
Cooperative organic mine avoidance path planning
NASA Astrophysics Data System (ADS)
McCubbin, Christopher B.; Piatko, Christine D.; Peterson, Adam V.; Donnald, Creighton R.; Cohen, David
2005-06-01
The JHU/APL Path Planning team has developed path planning techniques to look for paths that balance the utility and risk associated with different routes through a minefield. Extending on previous years' efforts, we investigated real-world Naval mine avoidance requirements and developed a tactical decision aid (TDA) that satisfies those requirements. APL has developed new mine path planning techniques using graph based and genetic algorithms which quickly produce near-minimum risk paths for complicated fitness functions incorporating risk, path length, ship kinematics, and naval doctrine. The TDA user interface, a Java Swing application that obtains data via Corba interfaces to path planning databases, allows the operator to explore a fusion of historic and in situ mine field data, control the path planner, and display the planning results. To provide a context for the minefield data, the user interface also renders data from the Digital Nautical Chart database, a database created by the National Geospatial-Intelligence Agency containing charts of the world's ports and coastal regions. This TDA has been developed in conjunction with the COMID (Cooperative Organic Mine Defense) system. This paper presents a description of the algorithms, architecture, and application produced.
End-User Applications of Real-Time Earthquake Information in Europe
NASA Astrophysics Data System (ADS)
Cua, G. B.; Gasparini, P.; Giardini, D.; Zschau, J.; Filangieri, A. R.; Reakt Wp7 Team
2011-12-01
The primary objective of European FP7 project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction) is to improve the efficiency of real-time earthquake risk mitigation methods and their capability of protecting structures, infrastructures, and populations. REAKT aims to address the issues of real-time earthquake hazard and response from end-to-end, with efforts directed along the full spectrum of methodology development in earthquake forecasting, earthquake early warning, and real-time vulnerability systems, through optimal decision-making, and engagement and cooperation of scientists and end users for the establishment of best practices for use of real-time information. Twelve strategic test cases/end users throughout Europe have been selected. This diverse group of applications/end users includes civil protection authorities, railway systems, hospitals, schools, industrial complexes, nuclear plants, lifeline systems, national seismic networks, and critical structures. The scale of target applications covers a wide range, from two school complexes in Naples, to individual critical structures, such as the Rion Antirion bridge in Patras, and the Fatih Sultan Mehmet bridge in Istanbul, to large complexes, such as the SINES industrial complex in Portugal and the Thessaloniki port area, to distributed lifeline and transportation networks and nuclear plants. Some end-users are interested in in-depth feasibility studies for use of real-time information and development of rapid response plans, while others intend to install real-time instrumentation and develop customized automated control systems. From the onset, REAKT scientists and end-users will work together on concept development and initial implementation efforts using the data products and decision-making methodologies developed with the goal of improving end-user risk mitigation. The aim of this scientific/end-user partnership is to ensure that scientific efforts are applicable to operational, real-world problems.
Earthdata 3.0: A Unified Experience and Platform for Earth Science Discovery
NASA Astrophysics Data System (ADS)
Plofchan, P.; McLaughlin, B. D.
2015-12-01
NASA's EOSDIS (Earth Observing System Data and Information System) has a multitude of websites and applications focused on serving the Earth Science community's extensive data needs. With no central user interface, theme, or mechanism for accessing that data, interrelated systems are confusing and potentially disruptive in users' searches for EOSDIS data holdings. To bring consistency across these systems, an effort was undertaken to develop Earthdata 3.0: a complete information architecture overhaul of the Earthdata website, a significant update to the Earthdata user experience and user interface, and an increased focus on searching across EOSDIS data holdings, including those housed and made available through DAAC websites. As part of this effort, and in a desire to unify the user experience across related websites, the Earthdata User Interface (EUI) was developed. The EUI is a collection of responsive design components and layouts geared toward creating websites and applications within the Earthdata ecosystem. Each component and layout has been designed specifically for Earth science-related projects, which eliminates some of the complexities of building a website or application from the ground up. Its adoption will ensure both consistent markup and a unified look and feel for end users, thereby increasing usability and accessibility. Additionally, through the use of a Google Search Appliance, custom Clojure code, and in cooperation with DAACs, Earthdata 3.0 presents a variety of search results upon a user's keyword(s) entry. These results are not just textual links, but also direct links to downloadable datasets, visualizations of datasets and collections of data, and related articles and videos for further research. The end result of the development of the EUI and the enhanced multi-response type search is a consistent and usable platform for Earth scientists and users to navigate and locate data to further their research.
JSpOC Mission System Application Development Environment
NASA Astrophysics Data System (ADS)
Luce, R.; Reele, P.; Sabol, C.; Zetocha, P.; Echeverry, J.; Kim, R.; Golf, B.
2012-09-01
The Joint Space Operations Center (JSpOC) Mission System (JMS) is the program of record tasked with replacing the legacy Space Defense Operations Center (SPADOC) and Astrodynamics Support Workstation (ASW) capabilities by the end of FY2015 as well as providing additional Space Situational Awareness (SSA) and Command and Control (C2) capabilities post-FY2015. To meet the legacy replacement goal, the JMS program is maturing a government Service Oriented Architecture (SOA) infrastructure that supports the integration of mission applications while acquiring mature industry and government mission applications. Future capabilities required by the JSpOC after 2015 will require development of new applications and procedures as well as the exploitation of new SSA data sources. To support the post FY2015 efforts, the JMS program is partnering with the Air Force Research Laboratory (AFRL) to build a JMS application development environment. The purpose of this environment is to: 1) empower the research & development community, through access to relevant tools and data, to accelerate technology development, 2) allow the JMS program to communicate user capability priorities and requirements to the developer community, 3) provide the JMS program with access to state-of-the-art research, development, and computing capabilities, and 4) support market research efforts by identifying outstanding performers that are available to shepherd into the formal transition process. The application development environment will consist of both unclassified and classified environments that can be accessed over common networks (including the Internet) to provide software developers, scientists, and engineers everything they need (e.g., building block JMS services, modeling and simulation tools, relevant test scenarios, documentation, data sources, user priorities/requirements, and SOA integration tools) to develop and test mission applications. The developed applications will be exercised in these relevant environments with representative data sets to help bridge the gap between development and integration into the operational JMS enterprise.
Takanokura, Masato
2010-03-22
A four-wheeled walker is a valuable tool for assisting elderly persons with walking. The handgrip height is one of the most important factors determining the usefulness of the walker. However, the optimal handgrip height for elderly users has not been considered from a biomechanical viewpoint. In this study, the handgrip height was optimized by a two-dimensional mechanical model to reduce muscular loads in the lower body as well as in the upper body with various road conditions during steady walking. A critical height of the handgrip existed at 48% of the body height for the user regardless of gender and body dimension. A lower handgrip relieved muscular load for stooping users with a lower standing height. The stooping user pushed the handgrip strongly in the perpendicular direction by leaning the upper body on the walker. However, upright users with a higher standing height should use a four-wheeled walker with a higher handgrip for maintaining their upright posture. For downhill movement, the optimal handgrip height depended on the slope angle and the friction coefficient between the road and the wheels of the walker. On a low-friction downhill such as asphalt with a steeper slope angle, the user was required to maintain an erect trunk with a higher handgrip and to press on the handgrip strongly in the perpendicular direction. Movement on a low-friction road was easier for users on a flat road and an uphill road, but it compelled distinct effort from users when moving downhill. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Chaumont, Diane; Huard, David; Logan, Travis; Sottile, Marie-France; Brown, Ross; Gauvin St-Denis, Blaise; Grenier, Patrick; Braun, Marco
2013-04-01
Planning and adapting to a changing climate requires credible information about the magnitude and rate of projected changes. Ouranos, a consortium on regional climatology and adaptation to climate change, was launched in the Province of Québec, Canada, ten years ago, with the objective of developing and providing climate information and expertise in support of adaptation. Ouranos differs from most other climate service centers by integrating climate modeling activities, impacts and adaptation expertise and climate analysis services under one roof. The Climate Scenarios Group operates at the interface between climate modellers and users and is responsible for developing, producing and communicating climate scenarios to end-users in a consistent manner. This process requires close collaboration with users to define, understand and eventually anticipate their needs. The varied scientific expertise of climate scenario specialists, who also act as communicators, has proven to be a key element for successful communication. A large amount of effort is spent on the characterization and communication of the uncertainties involved in scenario construction. Two main activities have been put in place by the experts in climate modeling to address this: (1) a training course on climate models and (2) a fact-sheet summarizing the uncertainty and robustness of the climate change scenario provided for each I&A application. The latter tool ensures the transparency, traceability, and accountability of our products, and at the same time, encourages a sense of shared responsibility for the final choice of climate scenarios. In addition to uncertainty, two other main issues have been identified as essential in communication with users: 1) observed natural variability at relevant scales and 2) reconciliation of the projected trend with the recent observed trend. Our group has devoted substantial resources to the advancement of communication with end-users in these particular areas. This presentation will provide an overview of progress in communicating climate information at the Ouranos Consortium. We will discuss successes, failures, and future plans, in particular the extent to which Ouranos needs to work with users in decision-making activities.
ESA Earth Observation Ground Segment Evolution Strategy
NASA Astrophysics Data System (ADS)
Benveniste, J.; Albani, M.; Laur, H.
2016-12-01
One of the key elements driving the evolution of EO Ground Segments, in particular in Europe, has been to enable the creation of added value from EO data and products. This requires the ability to constantly adapt and improve the service to a user base expanding far beyond the `traditional' EO user community of remote sensing specialists. Citizen scientists, the general public, media and educational actors form another user group that is expected to grow. Technological advances, Open Data policies, including those implemented by ESA and the EU, as well as an increasing number of satellites in operations (e.g. Copernicus Sentinels) have led to an enormous increase in available data volumes. At the same time, even with modern network and data handling services, fewer users can afford to bulk-download and consider all potentially relevant data and associated knowledge. The "EO Innovation Europe" concept is being implemented in Europe in coordination between the European Commission, ESA and other European Space Agencies, and industry. This concept is encapsulated in the main ideas of "Bringing the User to the Data" and "Connecting the Users" to complement the traditional one-to-one "data delivery" approach of the past. Both ideas are aiming to better "empower the users" and to create a "sustainable system of interconnected EO Exploitation Platforms", with the objective to enable large scale exploitation of European EO data assets for stimulating innovation and to maximize their impact. These interoperable/interconnected platforms are virtual environments in which the users - individually or collaboratively - have access to the required data sources and processing tools, as opposed to downloading and handling the data `at home'. EO-Innovation Europe has been structured around three elements: an enabling element (acting as a back office), a stimulating element and an outreach element (acting as a front office). Within the enabling element, a "mutualisation" of efforts and funding between public institutions should prevent an unnecessary duplication of investments for enabling infrastructures in Europe and will stimulate the existence of many exploitation platforms or value-adding add-ons funded by different public and private entities in the outreach element (front office).
ERIC Educational Resources Information Center
Erickson, William A.; Dumoulin-Smith, Adrien
2009-01-01
The mission of the Cornell StatsRRTC is to bridge the divide between the sources of disability data and the users of disability statistics. One product of this effort is a set of "User Guides" to national survey data that collect information on the disability population. The purpose of each of the "User Guides" is to provide…
NASA Technical Reports Server (NTRS)
Engle, H. A.; Christensen, D. L.
1975-01-01
The development and application of educational programs to improve public awareness of the space shuttle/space lab capabilities are reported. Special efforts were made to: identify the potential user, identify and analyze space education programs, plan methods for user involvement, develop techniques and programs to encourage new users, and compile follow-on ideas.
The NASA Electric Propulsion Program
NASA Technical Reports Server (NTRS)
Callahan, Lisa Wood; Curran, Francis M.
1996-01-01
Nearly all space missions require on-board propulsion systems and these systems typically have a major impact on spacecraft mass and cost. Electric propulsion systems offer major performance advantages over conventional chemical systems for many mission functions and the NASA Office of Space Access and Technology (OSAT) supports an extensive effort to develop the technology for high-performance, on-board electric propulsion system options to enhance and enable near- and far-term US space missions. This program includes research and development efforts on electrothermal, electrostatic, and electromagnetic propulsion system technologies to cover a wide range of potential applications. To maximize expectations of technology transfer, the program emphasizes strong interaction with the user community through a variety of cooperative and contracted approaches. This paper provides an overview of the OSAT electric propulsion program with an emphasis on recent progress and future directions.
Terminology development towards harmonizing multiple clinical neuroimaging research repositories.
Turner, Jessica A; Pasquerello, Danielle; Turner, Matthew D; Keator, David B; Alpert, Kathryn; King, Margaret; Landis, Drew; Calhoun, Vince D; Potkin, Steven G; Tallis, Marcelo; Ambite, Jose Luis; Wang, Lei
2015-07-01
Data sharing and mediation across disparate neuroimaging repositories requires extensive effort to ensure that the different domains of data types are referred to by commonly agreed upon terms. Within the SchizConnect project, which enables querying across decentralized databases of neuroimaging, clinical, and cognitive data from various studies of schizophrenia, we developed a model for each data domain, identified common usable terms that could be agreed upon across the repositories, and linked them to standard ontological terms where possible. We had the goal of facilitating both the current user experience in querying and future automated computations and reasoning regarding the data. We found that existing terminologies are incomplete for these purposes, even with the history of neuroimaging data sharing in the field; and we provide a model for efforts focused on querying multiple clinical neuroimaging repositories.
Terminology development towards harmonizing multiple clinical neuroimaging research repositories
Turner, Jessica A.; Pasquerello, Danielle; Turner, Matthew D.; Keator, David B.; Alpert, Kathryn; King, Margaret; Landis, Drew; Calhoun, Vince D.; Potkin, Steven G.; Tallis, Marcelo; Ambite, Jose Luis; Wang, Lei
2015-01-01
Data sharing and mediation across disparate neuroimaging repositories requires extensive effort to ensure that the different domains of data types are referred to by commonly agreed upon terms. Within the SchizConnect project, which enables querying across decentralized databases of neuroimaging, clinical, and cognitive data from various studies of schizophrenia, we developed a model for each data domain, identified common usable terms that could be agreed upon across the repositories, and linked them to standard ontological terms where possible. We had the goal of facilitating both the current user experience in querying and future automated computations and reasoning regarding the data. We found that existing terminologies are incomplete for these purposes, even with the history of neuroimaging data sharing in the field; and we provide a model for efforts focused on querying multiple clinical neuroimaging repositories. PMID:26688838
Durkalski, Valerie; Wenle Zhao; Dillon, Catherine; Kim, Jaemyung
2010-04-01
Clinical trial investigators and sponsors invest vast amounts of resources and energy into conducting trials and often face daily challenges with data management, project management, and data quality control. Rather than waiting months for study progress reports, investigators need the ability to use real-time data for the coordination and management of study activities across all study team members including site investigators, oversight committees, data and safety monitoring boards, and medical safety monitors. Web-based data management systems are beginning to meet this need but what distinguishes one system from the other are user needs/requirements and cost. To illustrate the development and implementation of a web-based data and project management system for a multicenter clinical trial designed to test the superiority of repeated transcranial magnetic stimulation versus sham for the treatment of patients with major depression. The authors discuss the reasons for not using a commercially available system for this study and describe the approach to developing their own web-based system for the OPT-TMS study. Timelines, effort, system architecture, and lessons learned are shared with the hope that this information will direct clinical trial researchers and software developers towards more efficient, user-friendly systems. The developers use a combination of generic and custom application code to allow for the flexibility to adapt the system to the needs of the study. Features of the system include: central participant registration and randomization; secure data entry at the site; participant progress/study calendar; safety data reporting; device accounting; monitor verification; and user-configurable generic reports and built-in customized reports. Hard coding was more time-efficient to address project-specific issues compared with the effort of creating a generic code application. As a consequence of this strategy, the required maintenance of the system is increased and the value of using this system for other trials is reduced. Web-based central computerized systems offer time-saving, secure options for managing clinical trial data. The choice of a commercially available system or an internally developed system is determined by the requirements of the study and users. Pros and cons to both approaches were discussed. If the intention is to use the system for various trials (single and multi-center, phases I-III) across various therapeutic areas, then the overall design should be a generic structure that simplifies the general application with minimal loss of functionality.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-19
..., Parent Company or Corporate Company, Name of Company Point of Contact (POC) for E-Verify Usage, POC Phone... information about user system usage. The information collected specifically on users includes: Name (last... improvement efforts and system enhancement planning, which may include conducting surveys, user interviews...
Exploration of Web Users' Search Interests through Automatic Subject Categorization of Query Terms.
ERIC Educational Resources Information Center
Pu, Hsiao-tieh; Yang, Chyan; Chuang, Shui-Lung
2001-01-01
Proposes a mechanism that carefully integrates human and machine efforts to explore Web users' search interests. The approach consists of a four-step process: extraction of core terms; construction of subject taxonomy; automatic subject categorization of query terms; and observation of users' search interests. Research findings are proved valuable…
ERIC Educational Resources Information Center
Ho, Jeannette; Crowley, Gwyneth H.
2003-01-01
Explored user perceptions of dependability and accuracy of Texas A&M library services through focus groups. Reports user difficulties in locating materials, inaccurate catalog and circulation records, inadequate signage, searching the online catalog, and late notification of interlibrary loan arrivals; and discusses the library's efforts to…
Design and evaluation of a trilateral shared-control architecture for teleoperated training robots.
Shamaei, Kamran; Kim, Lawrence H; Okamura, Allison M
2015-08-01
Multilateral teleoperated robots can be used to train humans to perform complex tasks that require collaborative interaction and expert supervision, such as laparoscopic surgical procedures. In this paper, we explain the design and performance evaluation of a shared-control architecture that can be used in trilateral teleoperated training robots. The architecture includes dominance and observation factors inspired by the determinants of motor learning in humans, including observational practice, focus of attention, feedback and augmented feedback, and self-controlled practice. Toward the validation of such an architecture, we (1) verify the stability of a trilateral system by applying Llewellyn's criterion on a two-port equivalent architecture, and (2) demonstrate that system transparency remains generally invariant across relevant observation factors and movement frequencies. In a preliminary experimental study, a dyad of two human users (one novice, one expert) collaborated on the control of a robot to follow a trajectory. The experiment showed that the framework can be used to modulate the efforts of the users and adjust the source and level of haptic feedback to the novice user.
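For illustration, a minimal version of the Llewellyn absolute-stability check mentioned above might look like the following sketch; the two-port parameters are placeholder values, not numbers from the paper, and the test must hold at every frequency of interest.

import numpy as np

def llewellyn_stable(Z):
    # Z is the 2x2 immittance matrix of the two-port equivalent at one frequency.
    z11, z12, z21, z22 = Z[0, 0], Z[0, 1], Z[1, 0], Z[1, 1]
    return bool(z11.real >= 0
                and z22.real >= 0
                and 2 * z11.real * z22.real >= abs(z12 * z21) + (z12 * z21).real)

# Illustrative check at a single frequency
Z = np.array([[1.0 + 0.2j, 0.3 - 0.1j],
              [0.3 + 0.1j, 0.8 + 0.05j]])
print(llewellyn_stable(Z))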
CernVM WebAPI - Controlling Virtual Machines from the Web
NASA Astrophysics Data System (ADS)
Charalampidis, I.; Berzano, D.; Blomer, J.; Buncic, P.; Ganis, G.; Meusel, R.; Segal, B.
2015-12-01
Lately, there has been a trend in scientific projects to look for computing resources in the volunteering community. In addition, to reduce the development effort required to port the scientific software stack to all the known platforms, the use of Virtual Machines (VMs) is becoming increasingly popular. Unfortunately, their use further complicates the software installation and operation, restricting the volunteer audience to sufficiently expert people. CernVM WebAPI is a software solution addressing this specific case in a way that opens up new application opportunities. It offers a very simple API for setting up, controlling and interfacing with a VM instance on the user's computer, while at the same time relieving the user of the burden of downloading, installing and configuring the hypervisor. WebAPI comes with a lightweight JavaScript library that guides the user through the application installation process. Malicious usage is prohibited by offering a per-domain PKI validation mechanism. In this contribution we will overview this new technology, discuss its security features and examine some test cases where it is already in use.
Martín, Diego; Alcarria, Ramón; Sánchez-Picot, Álvaro; Robles, Tomás
2015-10-01
End-user development is a new trend to provide tailored services to dynamic environments such as hospitals. These services not only facilitate daily work for pharmacy personnel but also improve self-care in elder people that are still related to hospital, such as discharged patients. This paper presents an ambient intelligence (AmI) environment for End-user service provisioning in the pharmacy department of Gregorio Marañón Hospital in Madrid, composed of a drug traceability infrastructure (DP-TraIN) and a ubiquitous application for enabling the pharmacy staff to create and execute their own services for facilitating drug management and dispensing. The authors carried out a case study with various experiments where different roles from the pharmacy department of Gregorio Marañón Hospital were involved in activities such as drug identification, dispensing and medication administering. The authors analyzed the effort required to create services by pharmacy staff, the discharged patients' perception of the AmI environment and the quantifiable benefits in reducing patient waiting time for drug dispensing.
Jurrus, Elizabeth; Watanabe, Shigeki; Giuly, Richard J.; Paiva, Antonio R. C.; Ellisman, Mark H.; Jorgensen, Erik M.; Tasdizen, Tolga
2013-01-01
Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of this data makes human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images and visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes. PMID:22644867
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jurrus, Elizabeth R.; Watanabe, Shigeki; Giuly, Richard J.
2013-01-01
Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of this data makes human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images and visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes.
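A minimal sketch of the kind of section-to-section region linking described above is given below; it uses simple pixel overlap between labelled 2D masks as a stand-in for the correlation step and is not the authors' actual implementation.

import numpy as np

def link_regions(labels_a, labels_b, min_overlap=0.5):
    # Link each labelled region in section A to the region in the adjacent
    # section B that it overlaps most, keeping only sufficiently strong links.
    links = []
    for la in np.unique(labels_a):
        if la == 0:                       # 0 = background / membrane
            continue
        mask_a = labels_a == la
        candidates = labels_b[mask_a]
        candidates = candidates[candidates != 0]
        if candidates.size == 0:
            continue
        lb, counts = np.unique(candidates, return_counts=True)
        best = np.argmax(counts)
        if counts[best] / mask_a.sum() >= min_overlap:
            links.append((int(la), int(lb[best])))
    return links

# Example with two tiny labelled sections
a = np.array([[1, 1, 0], [0, 2, 2], [0, 0, 2]])
b = np.array([[1, 1, 0], [0, 0, 2], [0, 2, 2]])
print(link_regions(a, b))   # -> [(1, 1), (2, 2)]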
Boosting a Low-Cost Smart Home Environment with Usage and Access Control Rules.
Barsocchi, Paolo; Calabrò, Antonello; Ferro, Erina; Gennaro, Claudio; Marchetti, Eda; Vairo, Claudio
2018-06-08
Smart Home has gained widespread attention due to its flexible integration into everyday life. Pervasive sensing technologies are used to recognize and track the activities that people perform during the day, and to allow communication and cooperation of physical objects. Usually, the available infrastructures and applications leveraging these smart environments have a critical impact on the overall cost of Smart Home construction, preferably need to be installed during home construction, and are still not user-centric. In this paper, we propose a low-cost, easy-to-install, user-friendly, dynamic and flexible infrastructure able to perform runtime resource management by decoupling the different levels of control rules. The basic idea relies on the usage of off-the-shelf sensors and technologies to guarantee the regular exchange of critical information, without requiring the user to develop accurate models for managing resources or regulating their access/usage. This allows us to simplify the continuous updating and improvement, to reduce the maintenance effort and to improve residents' living and security. A first validation of the proposed infrastructure on a case study is also presented.
Telemetry Monitoring and Display Using LabVIEW
NASA Technical Reports Server (NTRS)
Wells, George; Baroth, Edmund C.
1993-01-01
The Measurement Technology Center of the Instrumentation Section configures automated data acquisition systems to meet the diverse needs of JPL's experimental research community. These systems are based on personal computers or workstations (Apple, IBM/Compatible, Hewlett-Packard, and Sun Microsystems) and often include integrated data analysis, visualization and experiment control functions in addition to data acquisition capabilities. These integrated systems may include sensors, signal conditioning, data acquisition interface cards, software, and a user interface. Graphical programming is used to simplify configuration of such systems. Employment of a graphical programming language is the most important factor in enabling the implementation of data acquisition, analysis, display and visualization systems at low cost. Other important factors are the use of commercial software packages and off-the-shelf data acquisition hardware where possible. Understanding the experimenter's needs is also critical. An interactive approach to user interface construction and training of operators is also important. One application was created as a result of a competitive effort between a graphical programming language team and a text-based C language programming team to verify the advantages of using a graphical programming language approach. With approximately eight weeks of funding over a period of three months, the text-based programming team accomplished about 10% of the basic requirements, while the Macintosh/LabVIEW team accomplished about 150%, having gone beyond the original requirements to simulate a telemetry stream and provide utility programs. This application verified that using graphical programming can significantly reduce software development time. As a result of this initial effort, additional follow-on work was awarded to the graphical programming team.
Semantic-gap-oriented active learning for multilabel image annotation.
Tang, Jinhui; Zha, Zheng-Jun; Tao, Dacheng; Chua, Tat-Seng
2012-04-01
User interaction is an effective way to handle the semantic gap problem in image annotation. To minimize user effort in the interactions, many active learning methods have been proposed. These methods treat the semantic concepts individually or correlatively. However, they still neglect the key motivation of user feedback: to tackle the semantic gap. The size of the semantic gap of each concept is an important factor that affects the performance of user feedback. Users should devote more effort to concepts with large semantic gaps, and vice versa. In this paper, we propose a semantic-gap-oriented active learning method, which incorporates the semantic gap measure into the information-minimization-based sample selection strategy. The basic learning model used in the active learning framework is an extended multilabel version of the sparse-graph-based semisupervised learning method that incorporates the semantic correlation. Extensive experiments conducted on two benchmark image data sets demonstrated the importance of bringing the semantic gap measure into the active learning process.
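As a hedged illustration of the selection idea (not the authors' exact algorithm), the following sketch biases an uncertainty-based sample selection toward concepts with a larger estimated semantic gap, so that user feedback is requested where it is expected to help most; all names and weights are assumptions made for the example.

import numpy as np

def select_samples(probs, gap, k=10):
    # probs: (n_samples, n_concepts) predicted label probabilities
    # gap:   (n_concepts,) semantic-gap measure per concept (larger = harder concept)
    uncertainty = 1.0 - np.abs(2.0 * probs - 1.0)       # maximal at p = 0.5
    scores = (uncertainty * gap[None, :]).sum(axis=1)   # gap-weighted informativeness
    return np.argsort(scores)[::-1][:k]                 # indices to show to the user

# Example with random predictions over 5 concepts
rng = np.random.default_rng(0)
probs = rng.random((100, 5))
gap = np.array([0.9, 0.4, 0.7, 0.2, 0.5])
print(select_samples(probs, gap, k=3))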
Turner, Anne M; Reeder, Blaine; Ramey, Judith
2013-08-01
Despite years of effort and millions of dollars spent to create unified electronic communicable disease reporting systems, the goal remains elusive. A major barrier has been a lack of understanding by system designers of communicable disease (CD) work and the public health workers who perform this work. This study reports on the application of user-centered design representations, traditionally used for improving interface design, to translate the complex CD work identified through ethnographic studies to guide designers and developers of CD systems. The purpose of this work is to: (1) better understand public health practitioners and their information workflow with respect to CD monitoring and control at a local health agency, and (2) to develop evidence-based design representations that model this CD work to inform the design of future disease surveillance systems. We performed extensive onsite semi-structured interviews, targeted work shadowing and a focus group to characterize local health agency CD workflow. Informed by principles of design ethnography and user-centered design we created persona, scenarios and user stories to accurately represent the user to system designers. We sought to convey to designers the key findings from ethnographic studies: (1) public health CD work is mobile and episodic, in contrast to current CD reporting systems, which are stationary and fixed, (2) health agency efforts are focused on CD investigation and response rather than reporting and (3) current CD information systems must conform to public health workflow to ensure their usefulness. In an effort to illustrate our findings to designers, we developed three contemporary design-support representations: persona, scenario, and user story. Through application of user-centered design principles, we were able to create design representations that illustrate complex public health communicable disease workflow and key user characteristics to inform the design of CD information systems for public health. Copyright © 2013 Elsevier Inc. All rights reserved.
Minimizing Experimental Setup Time and Effort at APS beamline 1-ID through Instrumentation Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benda, Erika; Almer, Jonathan; Kenesei, Peter
2016-01-01
Sector 1-ID at the APS accommodates a number of different experimental techniques in the same spatial envelope of the E-hutch end station. These include high-energy small- and wide-angle X-ray scattering (SAXS and WAXS), high-energy diffraction microscopy (HEDM, both near and far field modes) and high-energy X-ray tomography. These techniques are frequently combined to allow the users to obtain multimodal data, often attaining 1 μm spatial resolution and <0.05º angular resolution. Furthermore, these techniques are utilized while the sample is thermo-mechanically loaded to mimic real operating conditions. The instrumentation required for each of these techniques and environments has been designed and configured in a modular way with a focus on stability and repeatability between changeovers. This approach allows the end station to be more versatile, capable of collecting multi-modal data in-situ while reducing the time and effort typically required for setup and alignment, resulting in more efficient beam time use. Key instrumentation design features and layout of the end station are presented.
Application of dexterous space robotics technology to myoelectric prostheses
NASA Astrophysics Data System (ADS)
Hess, Clifford; Li, Larry C. H.; Farry, Kristin A.; Walker, Ian D.
1994-02-01
Future space missions will require robots equipped with highly dexterous robotic hands to perform a variety of tasks. A major technical challenge in making this possible is an improvement in the way these dexterous robotic hands are remotely controlled or teleoperated. NASA is currently investigating the feasibility of using myoelectric signals to teleoperate a dexterous robotic hand. In theory, myoelectric control of robotic hands will require little or no mechanical parts and will greatly reduce the bulk and weight usually found in dexterous robotic hand control devices. An improvement in myoelectric control of multifinger hands will also benefit prosthetics users. Therefore, as an effort to transfer dexterous space robotics technology to prosthetics applications and to benefit from existing myoelectric technology, NASA is collaborating with the Limbs of Love Foundation, the Institute for Rehabilitation and Research, and Rice University in developing improved myoelectric control multifinger hands and prostheses. In this paper, we will address the objectives and approaches of this collaborative effort and discuss the technical issues associated with myoelectric control of multifinger hands. We will also report our current progress and discuss plans for future work.
Application of dexterous space robotics technology to myoelectric prostheses
NASA Technical Reports Server (NTRS)
Hess, Clifford; Li, Larry C. H.; Farry, Kristin A.; Walker, Ian D.
1994-01-01
Future space missions will require robots equipped with highly dexterous robotic hands to perform a variety of tasks. A major technical challenge in making this possible is an improvement in the way these dexterous robotic hands are remotely controlled or teleoperated. NASA is currently investigating the feasibility of using myoelectric signals to teleoperate a dexterous robotic hand. In theory, myoelectric control of robotic hands will require little or no mechanical parts and will greatly reduce the bulk and weight usually found in dexterous robotic hand control devices. An improvement in myoelectric control of multifinger hands will also benefit prosthetics users. Therefore, as an effort to transfer dexterous space robotics technology to prosthetics applications and to benefit from existing myoelectric technology, NASA is collaborating with the Limbs of Love Foundation, the Institute for Rehabilitation and Research, and Rice University in developing improved myoelectric control multifinger hands and prostheses. In this paper, we will address the objectives and approaches of this collaborative effort and discuss the technical issues associated with myoelectric control of multifinger hands. We will also report our current progress and discuss plans for future work.
'Requirements for an Automatic Collision Avoidance System'
NASA Astrophysics Data System (ADS)
Cooper, R. W.
I am disturbed by Perkins and Redfern's paper in the May 1996 Journal. The COLREGS have been developed over many years of tinkering and tuning. They are not designed only for educated European masters driving big merchant ships. The users (fishermen, yachtsmen, oarsmen, tugmasters, Rhine bargemasters, Yangtse junkmen, etc.) are from all educational standards and from all the world's cultures. The COLREGS are now well-known. There is such a huge investment of time and effort, of learning by millions of different people, that the prospect of tampering with their fundamentals is horrific, even if it would suit a small class of user belonging to the more advanced countries. Change is painful, and too much change, too fast, kills. Compare the practical decision not to change the side of the road on which we British drive: it might be convenient, but it would cost too much. The same applies to the COLREGS.
Web-based three-dimensional Virtual Body Structures: W3D-VBS.
Temkin, Bharti; Acosta, Eric; Hatfield, Paul; Onal, Erhan; Tong, Alex
2002-01-01
Major efforts are being made to improve the teaching of human anatomy to foster cognition of visuospatial relationships. The Visible Human Project of the National Library of Medicine makes it possible to create virtual reality-based applications for teaching anatomy. Integration of traditional cadaver and illustration-based methods with Internet-based simulations brings us closer to this goal. Web-based three-dimensional Virtual Body Structures (W3D-VBS) is a next-generation immersive anatomical training system for teaching human anatomy over the Internet. It uses Visible Human data to dynamically explore, select, extract, visualize, manipulate, and stereoscopically palpate realistic virtual body structures with a haptic device. Tracking user's progress through evaluation tools helps customize lesson plans. A self-guided "virtual tour" of the whole body allows investigation of labeled virtual dissections repetitively, at any time and place a user requires it.
Expanding the domain of drug delivery for HIV prevention: exploration of the transdermal route.
Puri, Ashana; Sivaraman, Arunprasad; Zhang, Wei; Clark, Meredith R; Banga, Ajay K
2017-01-01
Constant efforts for HIV prevention using antiretroviral drugs, pre- and postexposure prophylactic agents, and microbicides are being made by researchers. Drug-delivery systems such as oral tablets and coitally dependent vaginal gels are short acting, require daily application, and are associated with user adherence issues, whereas the coitally independent systems such as injectables and biodegradable implants are long acting, lasting several months, during which time the termination of prophylaxis is impractical in case of adverse effects. An effective drug-delivery system to be used for an intermediate duration, if available, would be an attractive alternative option for users in terms of adherence. Transdermal delivery systems, overcoming most of the limitations of the other routes of administration and aiming to provide sustained delivery of drugs through skin, may be explored for HIV prevention. Passive and physical enhancement techniques may be designed strategically to improve the transdermal delivery of HIV preventive agents.
BioMot exoskeleton - Towards a smart wearable robot for symbiotic human-robot interaction.
Bacek, Tomislav; Moltedo, Marta; Langlois, Kevin; Prieto, Guillermo Asin; Sanchez-Villamanan, Maria Carmen; Gonzalez-Vargas, Jose; Vanderborght, Bram; Lefeber, Dirk; Moreno, Juan C
2017-07-01
This paper presents the design of a novel modular lower-limb gait exoskeleton built within the FP7 BioMot project. The exoskeleton employs a variable stiffness actuator in all six joints, a directional-flexibility structure, and a novel physical human-robot interface, which allow it to deliver the required output while minimally constraining the user's gait by providing passive degrees of freedom. Due to its modularity, the exoskeleton can be used as a full lower-limb orthosis, a single-joint orthosis in any of the three joints, and a two-joint orthosis combining any two of the joints. By employing a simple torque control strategy, the exoskeleton can be used to deliver user-specific assistance, both in gait rehabilitation and in assisting people suffering from musculoskeletal impairments. The result of the presented BioMot efforts is a low-footprint exoskeleton with powerful compliant actuators, a simple yet effective torque controller, and an easily adjustable flexible structure.
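The specific BioMot control law is not given here; as a minimal sketch of a "simple torque control strategy" of the kind mentioned, an assist-as-needed torque can be generated with a PD impedance law around a reference gait trajectory, where the gains and assistance level below are illustrative assumptions.

def assistive_torque(q, dq, q_ref, dq_ref, kp=30.0, kd=2.0, assistance=0.5):
    # Joint torque command (Nm): a PD impedance law around the reference trajectory,
    # scaled by an assistance level in [0, 1] to deliver user-specific support.
    return assistance * (kp * (q_ref - q) + kd * (dq_ref - dq))

# Example: slight lag behind the reference knee angle (radians) and velocity (rad/s)
print(assistive_torque(q=0.30, dq=1.0, q_ref=0.35, dq_ref=1.2))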
SensorWeb 3G: Extending On-Orbit Sensor Capabilities to Enable Near Realtime User Configurability
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Tran, Daniel; Davies, Ashley; Sullivan, Don; Ames, Troy;
2010-01-01
This research effort prototypes an implementation of a standard interface, the Web Coverage Processing Service (WCPS), an Open Geospatial Consortium (OGC) standard, to enable users to define, test, upload and execute algorithms for on-orbit sensor systems. The user is able to customize on-orbit data products that result from raw data streaming from an instrument. This extends the SensorWeb 2.0 concept that was developed under a previous Advanced Information System Technology (AIST) effort, in which web services wrap sensors and a standardized Extensible Markup Language (XML) based scripting workflow language orchestrates processing steps across multiple domains. SensorWeb 3G extends the concept by giving the user controls into the flight software modules associated with an on-orbit sensor and thus provides a degree of flexibility which does not presently exist. The successful demonstrations to date will be presented, which include a realistic HyspIRI decadal mission testbed. Furthermore, benchmarks that were run will also be presented along with planned future demonstrations and benchmark tests. Finally, we conclude with implications for the future and how this concept dovetails into efforts to develop "cloud computing" methods and standards.
Navigation Performance of Global Navigation Satellite Systems in the Space Service Volume
NASA Technical Reports Server (NTRS)
Force, Dale A.
2013-01-01
This paper extends the results I reported at this year's ION International Technical Meeting on multi-constellation GNSS coverage by showing how the use of multi-constellation GNSS improves Geometric Dilution of Precision (GDOP). Originally developed to provide position, navigation, and timing for terrestrial users, GPS has found increasing use in space for precision orbit determination, precise time synchronization, real-time spacecraft navigation, and three-axis attitude control of Earth orbiting satellites. With additional Global Navigation Satellite Systems (GNSS) coming into service (GLONASS, Galileo, and Beidou) and the development of Satellite Based Augmentation Services, it is possible to obtain improved precision by using evolving multi-constellation receivers. The Space Service Volume is formally defined as the volume of space between an altitude of three thousand kilometers and geosynchronous altitude (approximately 36,500 km), with the volume below three thousand kilometers defined as the Terrestrial Service Volume (TSV). The USA has established signal requirements for the Space Service Volume (SSV) as part of the GPS Capability Development Documentation (CDD). Diplomatic efforts are underway to extend Space Service Volume commitments to the other Position, Navigation, and Timing (PNT) service providers in an effort to assure that all space users will benefit from the enhanced capabilities of interoperating GNSS services in the space domain.
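As a concrete illustration of the metric discussed above, the sketch below computes GDOP from the receiver-to-satellite geometry using the standard textbook formulation; it is not code from the paper, and the satellite positions in the example are hypothetical.

```python
import numpy as np

def gdop(user_pos, sat_positions):
    """GDOP from the geometry matrix H, whose rows are the unit line-of-sight
    vectors to each visible satellite augmented with a clock column of ones.
    GDOP = sqrt(trace((H^T H)^-1)); adding satellites from other constellations
    generally improves the geometry and lowers this value."""
    los = np.asarray(sat_positions, dtype=float) - np.asarray(user_pos, dtype=float)
    unit = los / np.linalg.norm(los, axis=1, keepdims=True)
    H = np.hstack([unit, np.ones((unit.shape[0], 1))])
    Q = np.linalg.inv(H.T @ H)
    return float(np.sqrt(np.trace(Q)))

# Hypothetical example: a geosynchronous user with four visible satellites (km, ECEF).
user = [42164.0, 0.0, 0.0]
sats = [[26560, 0, 0], [18000, 19000, 5000], [18000, -19000, 5000], [20000, 0, -17000]]
print(round(gdop(user, sats), 2))
```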
A qualitative exploration of driving stress and driving discourtesy.
Scott-Parker, B; Jones, C M; Rune, K; Tucker, J
2018-05-31
Driving courtesy, and conversely driving discourtesy, recently has been of great interest in the public domain. In addition, there has been increasing recognition of the negative impact of stress upon the individual's health and wellbeing, with a plethora of interventions aimed at minimising stress more generally. The research literature regarding driving dis/courtesy, in comparison, is scant, with a handful of studies examining the dis/courteous driving behaviour of road users, and the relationship between driving discourtesy and driving stress. The aim was to examine courteous and discourteous driving experiences, and to explore the impact of stress associated with such driving experiences. Thirty-eight drivers (20 females) from the Sunshine Coast region volunteered to participate in one of four 1-1.5 h focus groups. Content analysis used the verbatim utterances captured via an MP3 device. Three themes pertaining to stressful and discourteous interactions were identified. Theme one pertained to the driving context: road infrastructure (e.g., roundabouts, roadwork), vehicles (e.g., features), location (e.g., country vs city, unfamiliar areas), and temporal aspects (e.g., holidays). Theme two pertained to other road users: their behaviour (e.g., tailgating, merging), and unknown factors (e.g., illicit and licit drug use). Theme three pertained to the self as road user: their own behaviours (e.g., deliberate intimidation), and their emotions (e.g., angry reaction to other drivers, being in control). Driving dis/courtesy and driving stress are complex phenomena, suggesting that complex intervention efforts are required. Driving discourtesy was reported as being highly stressful; therefore, intervention efforts which encourage driving courtesy and which foster emotional capacity to cope with stressful circumstances appear warranted. Copyright © 2018. Published by Elsevier Ltd.
Portable image-manipulation software: what is the extra development cost?
Ligier, Y; Ratib, O; Funk, M; Perrier, R; Girard, C; Logean, M
1992-08-01
A hospital-wide picture archiving and communication system (PACS) project is currently under development at the University Hospital of Geneva. The visualization and manipulation of images provided by different imaging modalities constitutes one of the most challenging components of a PACS. It was necessary to provide this visualization software on a number of types of workstations because of the varying requirements imposed by the range of clinical uses it must serve. The user interface must be the same, independent of the underlying workstation. In addition to a standard set of image-manipulation and processing tools, there is a need for more specific clinical tools that can be easily adapted to specific medical requirements. To achieve this goal, it was decided to develop modular and portable software called OSIRIS. This software is available on two different platforms (standard UNIX X-11/OSF-Motif based workstations and the Macintosh family) and can be easily ported to other systems. The extra effort required to design such software in a modular and portable way was worthwhile because it resulted in a platform that can be easily expanded and adapted to a variety of specific clinical applications. Its portability allows users to benefit from the rapidly evolving workstation technology and to adapt the performance to suit their needs.
Earth Observation Training and Education with ESA LearnEO!
NASA Astrophysics Data System (ADS)
Byfield, Valborg; Mathieu, Pierre-Philippe; Dobson, Malcolm; Rosmorduc, Vinca; Del Frate, Fabio; Banks, Chris; Picchiani, Matteo
2013-04-01
For society to benefit fully from its investment in Earth observation, EO data must be accessible and familiar to a global community of users who have the skills, knowledge and understanding to use the observations appropriately in their work. Achieving this requires considerable education effort. LearnEO! (www.learn-eo.org) is a new ESA education project that contributes towards making this a reality. LearnEO! has two main aims: to develop new training resources that use data from sensors on ESA satellites to explore a variety of environmental topics, and to stimulate and support members of the EO and education communities who may be willing to develop and share new education resources in the future. The project builds on the UNESCO Bilko project, which currently supplies free software, tutorials, and example data to users in 175 countries. Most of these users are in academic education or research, but the training resources are also of interest to a growing number of professionals in government, NGOs and private enterprise. Typical users are not remote sensing experts, but see satellite data as one of many observational tools. They want an easy, low-cost means to process, display and analyse data from different satellite sensors as part of their work in environmental research, monitoring and policy development. Many of the software improvements and training materials developed in LearnEO! are in response to requests from this user community. The LearnEO! tutorial and peer-reviewed lessons are designed to teach satellite data processing and analysis skills at different levels, from beginner to advanced - where advanced lessons require some previous experience with Earth observation techniques. The materials are aimed at students and professionals in various branches of Earth sciences who have not yet specialised in specific EO technologies. The lessons are suitable for self-study, university courses at undergraduate to MSc level, or for continued professional development training. Each lesson comes complete with data, analysis tools and background information required to complete the suggested activities and answer the study questions. Model answers are supplied for users working on their own or with limited specialist support. The web site also provides access to annotated data sets and a lesson developers' resource library, both designed to support users who wish to develop their own lessons and tutorials and share these with others. Registered users are encouraged to become involved with the project by providing support for future software and lesson development, testing, and peer review.
Improving Access to Precipitation Data for GIS Users: Designing for Ease of Use
NASA Technical Reports Server (NTRS)
Stocker, Erich F.; Kelley, Owen A.
2007-01-01
The Global Precipitation Measurement Mission (GPM) is a NASA/JAXA-led international mission to configure a constellation of space-based radiometers to monitor precipitation over the globe. The GPM goal of making global 3-hour precipitation products available in near real-time will make such global products more useful to a broader community of modelers and Geographic Information Systems (GIS) users than is currently the case with remotely sensed precipitation products. Based on the existing interest in making Tropical Rainfall Measuring Mission (TRMM) data available to a growing community of GIS users, as well as what will certainly be an expanded community during the GPM era, it is clear that data systems must make a greater effort to provide data in formats easily used by GIS. We describe precipitation GIS products being developed for TRMM data. These products will serve as prototypes for production efforts during the GPM era. We describe efforts to convert TRMM precipitation data to GeoTIFF, Shapefile, and ASCII grid. Clearly, our goal is to format GPM data so that it can be easily used within GIS applications. We desire feedback on these efforts and any additions or direction changes that should be undertaken by the data system.
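To make the conversion concrete, here is a minimal sketch of writing a gridded rain-rate field to the Esri ASCII grid format mentioned above; the grid values, resolution, and file name are hypothetical placeholders, and this is not the TRMM/GPM production code.

```python
import numpy as np

def write_ascii_grid(path, grid, xll, yll, cellsize, nodata=-9999.0):
    """Write a 2-D precipitation grid (rows ordered north to south) as an
    Esri ASCII grid, one of the GIS-friendly formats discussed above."""
    nrows, ncols = grid.shape
    header = (f"ncols {ncols}\nnrows {nrows}\n"
              f"xllcorner {xll}\nyllcorner {yll}\n"
              f"cellsize {cellsize}\nNODATA_value {nodata}\n")
    filled = np.where(np.isnan(grid), nodata, grid)
    with open(path, "w") as f:
        f.write(header)
        np.savetxt(f, filled, fmt="%.3f")

# Hypothetical 0.25-degree global 3-hour rain accumulation (mm).
rain = np.random.rand(720, 1440).astype(np.float32)
write_ascii_grid("gpm_3hr_rain.asc", rain, xll=-180.0, yll=-90.0, cellsize=0.25)
```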
Review of ride quality technology needs of industry and user groups
NASA Technical Reports Server (NTRS)
Mckenzie, J. R.; Brumaghim, S. H.
1975-01-01
A broad survey of ride quality technology state-of-the-art and a review of user evaluation of this technology were conducted. During the study 17 users of ride quality technology in 10 organizations representing land, marine and air passenger transportation modes were interviewed. Interim results and conclusions of this effort are reported.
Lazinski, David W.; Camilli, Andrew
2013-01-01
The amplification of DNA fragments, cloned between user-defined 5′ and 3′ end sequences, is a prerequisite step in the use of many current applications including massively parallel sequencing (MPS). Here we describe an improved method, called homopolymer tail-mediated ligation PCR (HTML-PCR), that requires very little starting template and minimal hands-on effort, is cost-effective, and is suited for use in high-throughput and robotic methodologies. HTML-PCR starts with the addition of homopolymer tails of controlled lengths to the 3′ termini of a double-stranded genomic template. The homopolymer tails enable the annealing-assisted ligation of a hybrid oligonucleotide to the template's recessed 5′ ends. The hybrid oligonucleotide has a user-defined sequence at its 5′ end. This primer, together with a second primer composed of a longer region complementary to the homopolymer tail and fused to a second 5′ user-defined sequence, is used in a PCR reaction to generate the final product. The user-defined sequences can be varied to enable compatibility with a wide variety of downstream applications. We demonstrate our new method by constructing MPS libraries starting from nanogram and sub-nanogram quantities of Vibrio cholerae and Streptococcus pneumoniae genomic DNA. PMID:23311318
Geo-reCAPTCHA: Crowdsourcing large amounts of geographic information from earth observation data
NASA Astrophysics Data System (ADS)
Hillen, Florian; Höfle, Bernhard
2015-08-01
The reCAPTCHA concept provides a large amount of valuable information for various applications. First, it provides security, e.g., for a form on a website, by means of a test that only a human could solve. Second, the effort of the user for this test is used to generate additional information, e.g., digitization of books or identification of house numbers. In this work, we present a concept for adapting the reCAPTCHA idea to create user-generated geographic information from earth observation data, and the requirements that arise during conception and implementation are described in detail. Furthermore, the essential parts of a Geo-reCAPTCHA system are described and afterwards transferred to a prototype implementation. An empirical user study is conducted to investigate the Geo-reCAPTCHA approach, assessing the time and quality of the resulting geographic information. Our results show that, in our building-digitization study, a Geo-reCAPTCHA can be solved by users in a short amount of time (19.2 s on average), with an overall average digitization accuracy of 82.2%. In conclusion, Geo-reCAPTCHA has the potential to be a reasonable alternative to the typical reCAPTCHA, and to become a new data-rich channel of crowdsourced geographic information.
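As an illustration of how the quality of a single digitization task might be scored, the sketch below compares a user-traced building footprint with a reference polygon using intersection-over-union; shapely is assumed to be available, and this metric is a stand-in rather than the exact accuracy measure used in the study.

```python
from shapely.geometry import Polygon

def digitization_accuracy(user_coords, reference_coords):
    """Score a user's building digitization against a reference footprint
    using intersection-over-union (a hypothetical quality metric)."""
    user_poly = Polygon(user_coords)
    ref_poly = Polygon(reference_coords)
    union = user_poly.union(ref_poly).area
    return user_poly.intersection(ref_poly).area / union if union > 0 else 0.0

# A user traces a building roughly; the reference is the surveyed footprint.
ref = [(0, 0), (10, 0), (10, 8), (0, 8)]
traced = [(0.4, 0.2), (10.3, 0.1), (10.1, 7.8), (0.2, 8.2)]
print(round(digitization_accuracy(traced, ref), 3))
```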
NASA Technical Reports Server (NTRS)
Moore, W. F.; Forsythe, C.
1977-01-01
A preliminary draft policy for reimbursement for Space Shuttle flights has been developed by NASA in the form of pricing criteria for Space Transportation System (STS) users in domestic and foreign government and industry. The reimbursement policy, the transition from expendable launch vehicles to STS, the new user services, and the interaction of the economics of new user services and STS cost to fly are discussed in the present paper. Current efforts to develop new users are noted.
Certification of Completion of ASC FY08 Level-2 Milestone ID #2933
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lipari, D A
2008-06-12
This report documents the satisfaction of the completion criteria associated with ASC FY08 Milestone ID No.2933: 'Deploy Moab resource management services on BlueGene/L'. Specifically, this milestone represents LLNL efforts to enhance both SLURM and Moab to extend Moab's capabilities to schedule and manage BlueGene/L and to increase the portability of user scripts between ASC systems. The completion criteria for the milestone are the following: (1) Batch jobs can be specified, submitted to Moab, scheduled and run on the BlueGene/L system; (2) Moab will be able to support the markedly increased scale in node count as well as the wiring geometry that is unique to BlueGene/L; and (3) Moab will also prepare and report statistics of job CPU usage just as it does for the current systems it supports. This document presents the completion evidence for both of the stated milestone certification methods: Completion evidence for this milestone will be in the form of (1) documentation--a report that certifies that the completion criteria have been met; and (2) user hand-off. As the selected Tri-Lab workload manager, Moab was chosen to replace LCRM as the enterprise-wide scheduler across Livermore Computing (LC) systems. While LCRM/SLURM successfully scheduled jobs on BG/L, the effort to replace LCRM with Moab on BG/L represented a significant challenge. Moab is a commercial product developed and sold by Cluster Resources, Inc. (CRI). Moab receives the users' batch job requests and dispatches these jobs to run on a specific cluster. SLURM is an open-source resource manager whose development is managed by members of the Integrated Computational Resource Management Group (ICRMG) within the Services and Development Division at LLNL. SLURM is responsible for launching and running jobs on an individual cluster. Replacing LCRM with Moab on BG/L required substantial changes to both Moab and SLURM. While the ICRMG could directly manage the SLURM development effort, the work to enhance Moab had to be done by Moab's vendor. Members of the ICRMG held many meetings with CRI developers to develop the design and specify the requirements for what Moab needed to do. Extensions to SLURM are used to run jobs on the BlueGene/L architecture. These extensions support the three dimensional network topology unique to BG/L. While BG/L geometry support was already in SLURM, enhancements were needed to provide backfill capability and answer 'will-run' queries from Moab. For its part, the Moab architecture needed to be modified to interact with SLURM in a more coordinated way. It needed enhancements to support SLURM's shorthand notation for representing thousands of compute nodes and report this information using Moab's existing status commands. The LCRM wrapper scripts that emulated LCRM commands also needed to be enhanced to support BG/L usage. The effort was successful as Moab 5.2.2 and SLURM 1.3 were installed on the 106,496-node BG/L machine on May 21, 2008, and turned over to the users to run production.
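For readers unfamiliar with the hand-off, a hedged sketch of the user-facing result follows: a batch script is submitted to Moab with msub, and Moab dispatches it through SLURM onto the BlueGene/L partition. The directive names, queue, node count, and application are assumptions for illustration, not details taken from the milestone report.

```python
import subprocess
import textwrap

# Compose a hypothetical Moab batch script and submit it with msub.
# The #MSUB directives, queue name, and node count are assumed from typical
# Moab deployments; they are not quoted from the milestone documentation.
script = textwrap.dedent("""\
    #!/bin/bash
    #MSUB -N bgl_test_job
    #MSUB -l nodes=512
    #MSUB -l walltime=01:00:00
    #MSUB -q pbatch
    srun ./my_bgl_application
""")

with open("bgl_job.sh", "w") as f:
    f.write(script)

subprocess.run(["msub", "bgl_job.sh"], check=True)  # Moab schedules; SLURM launches the job
```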
Mission-oriented requirements for updating MIL-H-8501: Calspan proposed structure and rationale
NASA Technical Reports Server (NTRS)
Chalk, C. R.; Radford, R. C.
1985-01-01
This report documents the effort by Arvin/Calspan Corporation to formulate a revision of MIL-H-8501A in terms of Mission-Oriented Flying Qualities Requirements for Military Rotorcraft. Emphasis is placed on development of a specification structure which will permit addressing Operational Missions and Flight Phases, Flight Regions, Classification of Required Operational Capability, Categorization of Flight Phases, and Levels of Flying Qualities. A number of definitions are established to permit addressing the rotorcraft state, flight envelopes, environments, and the conditions under which degraded flying qualities are permitted. Tentative requirements are drafted for Required Operational Capability Class 1. Also included is a Background Information and Users Guide for the draft specification structure proposed for the MIL-H-8501A revision. The report also contains a discussion of critical data gaps and attempts to prioritize these data gaps and to suggest experiments that should be performed to generate data needed to support formulation of quantitative design criteria for the additional Operational Capability Classes 2, 3, and 4.
MARC ES: a computer program for estimating medical information storage requirements.
Konoske, P J; Dobbins, R W; Gauker, E D
1998-01-01
During combat, documentation of medical treatment information is critical for maintaining continuity of patient care. However, knowledge of prior status and treatment of patients is limited to the information noted on a paper field medical card. The Multi-technology Automated Reader Card (MARC), a smart card, has been identified as a potential storage mechanism for casualty medical information. Focusing on data capture and storage technology, this effort developed a Windows program, MARC ES, to estimate storage requirements for the MARC. The program calculates storage requirements for a variety of scenarios using medical documentation requirements, casualty rates, and casualty flows and provides the user with a tool to estimate the space required to store medical data at each echelon of care for selected operational theaters. The program can also be used to identify the point at which data must be uploaded from the MARC if size constraints are imposed. Furthermore, this model can be readily extended to other systems that store or transmit medical information.
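A minimal sketch of the kind of arithmetic such a tool performs is shown below; the record types, sizes, casualty counts, and echelon structure are hypothetical placeholders, not values from MARC ES.

```python
# Storage per echelon = casualties reaching that echelon x records generated
# there x bytes per record. All figures are illustrative assumptions.
BYTES_PER_RECORD = {"treatment_note": 256, "vital_signs": 64, "medication": 96}

def echelon_storage(casualties, records_per_casualty):
    """records_per_casualty maps record type -> count generated per casualty."""
    per_casualty = sum(BYTES_PER_RECORD[r] * n for r, n in records_per_casualty.items())
    return casualties * per_casualty

scenario = {
    "Echelon I":  (1200, {"treatment_note": 1, "vital_signs": 2}),
    "Echelon II": (400,  {"treatment_note": 2, "vital_signs": 4, "medication": 3}),
}
for echelon, (casualties, records) in scenario.items():
    print(echelon, echelon_storage(casualties, records), "bytes")
```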
MOEMS Modeling Using the Geometrical Matrix Toolbox
NASA Technical Reports Server (NTRS)
Wilson, William C.; Atkinson, Gary M.
2005-01-01
New technologies such as MicroOptoElectro-Mechanical Systems (MOEMS) require new modeling tools. These tools must simultaneously model the optical, electrical, and mechanical domains and the interactions between these domains. To facilitate rapid prototyping of these new technologies an optical toolbox has been developed for modeling MOEMS devices. The toolbox models are constructed using MATLAB's dynamical simulator, Simulink. Modeling toolboxes will allow users to focus their efforts on system design and analysis as opposed to developing component models. This toolbox was developed to facilitate rapid modeling and design of a MOEMS based laser ultrasonic receiver system.
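As a rough illustration of what multi-domain modeling of a MOEMS component involves (shown here in Python rather than the toolbox's MATLAB/Simulink environment), the sketch below couples an electrostatic drive to a mechanical resonator and derives an optical deflection from the result; all parameter values and the lever-arm relation are assumptions, not taken from the toolbox.

```python
import numpy as np
from scipy.integrate import solve_ivp

eps0, area, gap = 8.854e-12, 1e-7, 2e-6   # permittivity, electrode area (m^2), gap (m)
m, c, k = 1e-9, 1e-6, 1.0                 # mass (kg), damping (N s/m), stiffness (N/m)
lever_arm = 50e-6                         # hypothetical mirror lever arm (m)

def micromirror(t, state, volts):
    """Electrical domain: parallel-plate electrostatic force.
    Mechanical domain: damped spring-mass response of the mirror."""
    x, v = state
    f_elec = eps0 * area * volts**2 / (2.0 * (gap - x) ** 2)
    a = (f_elec - c * v - k * x) / m
    return [v, a]

sol = solve_ivp(micromirror, (0.0, 2e-3), [0.0, 0.0], args=(1.0,), max_step=1e-6)
tilt = sol.y[0] / lever_arm               # optical domain: small-angle beam steering (rad)
print(f"final deflection {sol.y[0][-1]:.2e} m, tilt {tilt[-1]:.2e} rad")
```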
Accessing multimedia content from mobile applications using semantic web technologies
NASA Astrophysics Data System (ADS)
Kreutel, Jörn; Gerlach, Andrea; Klekamp, Stefanie; Schulz, Kristin
2014-02-01
We describe the ideas and results of an applied research project that aims at leveraging the expressive power of semantic web technologies as a server-side backend for mobile applications that provide access to location and multimedia data and allow for a rich user experience in mobile scenarios, ranging from city and museum guides to multimedia enhancements of any kind of narrative content, including e-book applications. In particular, we will outline a reusable software architecture for both server-side functionality and native mobile platforms that is aimed at significantly decreasing the effort required for developing particular applications of that kind.
Cargo Logistics Airlift Systems Study (CLASS). Volume 5: Summary
NASA Technical Reports Server (NTRS)
Burby, R. J.; Kuhlman, W. H.
1980-01-01
Findings and conclusions derived during the study of freighter aircraft requirements to the year 2008 are summarized. These results represent the stepping-off point for the much-needed coordinated planning efforts by government agencies, the airlines, the users, and the aircraft manufacturers. The methodology utilized in the investigations is shown. The analysis of the current system encompassed evaluations of the past and current cargo markets and on-site surveys of airport and cargo terminals. The findings that resulted provided the basis for formulating the case study procedures, developing the future scenario, and developing the future cargo market demand.
The Systems Autonomy Demonstration Project - Catalyst for Space Station advanced automation
NASA Technical Reports Server (NTRS)
Healey, Kathleen J.
1988-01-01
The Systems Autonomy Demonstration Project (SADP) was initiated by NASA to address the advanced automation needs for the Space Station program. The application of advanced automation to the Space Station's operations management system (OMS) is discussed. The SADP's future goals and objectives are discussed with respect to OMS functional requirements, design, and desired evolutionary capabilities. Major technical challenges facing the designers, developers, and users of the OMS are identified in order to guide the definition of objectives, plans, and scenarios for future SADP demonstrations, and to focus the efforts on the supporting research.
Accounting and Accountability for Distributed and Grid Systems
NASA Technical Reports Server (NTRS)
Thigpen, William; McGinnis, Laura F.; Hacker, Thomas J.
2001-01-01
While the advent of distributed and grid computing systems will open new opportunities for scientific exploration, the reality of such implementations could prove to be a system administrator's nightmare. A lot of effort is being spent on identifying and resolving the obvious problems of security, scheduling, authentication and authorization. Lurking in the background, though, are the largely unaddressed issues of accountability and usage accounting: (1) mapping resource usage to resource users; (2) defining usage economies or methods for resource exchange; (3) describing implementation standards that minimize and compartmentalize the tasks required for a site to participate in a grid.
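A toy sketch of issue (1), with a nod to issue (2), follows; the record fields and charging scheme are hypothetical and not drawn from any particular grid accounting standard.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class UsageRecord:
    """Maps consumed resources back to an authenticated user and site (issue 1).
    Field names are illustrative assumptions."""
    grid_user: str          # global identity, e.g. a certificate distinguished name
    local_account: str      # account the job actually ran under at the site
    site: str
    cpu_hours: float
    start: datetime
    end: datetime

def charge(records, rate_per_cpu_hour):
    """Aggregate charges per grid user -- a toy 'usage economy' (issue 2)."""
    totals = {}
    for r in records:
        totals[r.grid_user] = totals.get(r.grid_user, 0.0) + r.cpu_hours * rate_per_cpu_hour
    return totals
```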
Development of a mileage-based user fee research website.
DOT National Transportation Integrated Search
2011-01-01
The University Transportation Center for Mobility (UTCM) previously funded several research projects : related to mileage-based user fees (MBUFs). As part of these research efforts a website was developed to : support the planning for the first-ever ...
Cresswell, Kathrin; Morrison, Zoe; Crowe, Sarah; Robertson, Ann; Sheikh, Aziz
2011-01-01
The absence of meaningful end user engagement has repeatedly been highlighted as a key factor contributing to 'failed' implementations of electronic health records (EHRs), but achieving this is particularly challenging in the context of national scale initiatives. In 2002, the National Health Service (NHS) embarked on a so-called 'top-down' national implementation strategy aimed at introducing commercial, centrally procured, EHRs into hospitals throughout England. We aimed to examine approaches to, and experiences of, user engagement in the context of a large-scale EHR implementation across purposefully selected hospital care providers implementing early versions of nationally procured software. We conducted a qualitative, case-study based, socio-technically informed, longitudinal investigation, purposefully sampling and collecting data from four hospitals. Our data comprised a total of 123 semi-structured interviews with users and managers, 15 interviews with additional stakeholders, 43 hours of non-participant observations of meetings and system use, and relevant organisation-specific documents from each case study site. Analysis was thematic, building on an existing model of user engagement that was originally developed in the context of studying the implementation of relatively simple technologies in commercial settings. NVivo8 software was used to facilitate coding. Despite an enduring commitment to the vision of shared EHRs and an appreciation of their potential benefits, meaningful end user engagement was never achieved. Hospital staff were not consulted in systems choice, leading to frustration; they were then further alienated by the implementation of systems that they perceived as inadequately customised. Various efforts to achieve local engagement were attempted, but these were in effect risk mitigation strategies. We found the role of clinical champions to be important in these engagement efforts, but progress was hampered by the hierarchical structures within healthcare teams. As a result, engagement efforts focused mainly on clinical staff with inadequate consideration of management and administrative staff. This work has allowed us to further develop an existing model of user engagement from the commercial sector and adapt it to inform user engagement in the context of large-scale eHealth implementations. By identifying key points of possible engagement, disengagement and re-engagement, this model will we hope both help those planning similar large-scale EHR implementation efforts and act as a much needed catalyst to further research in this neglected field of enquiry.
New Technologies to Reclaim Arid Lands User's Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
W. K. Ostler
2002-10-01
Approximately 70 percent of all U.S. military training lands are located in arid and semi-arid areas. Training activities in such areas frequently adversely affect vegetation, damaging plants and reducing the resilience of vegetation to recover once disturbed. Fugitive dust resulting from a loss of vegetation creates additional problems for human health, increasing accidents due to decreased visibility, and increasing maintenance costs for roads, vehicles, and equipment. Under conventional technologies to mitigate these impacts, it is estimated that up to 35 percent of revegetation projects in arid areas will fail due to unpredictable natural environmental conditions, such as drought, and reclamation techniques that were inadequate to restore vegetative cover in a timely and cost-effective manner. New reclamation and restoration techniques are needed in desert ranges to help mitigate the adverse effects of military training and other activities to arid-land environments. In 1999, a cooperative effort between the U.S. Department of Energy (DOE), the U.S. Department of Defense (DoD), and selected university scientists was undertaken to focus on mitigating military impacts in arid lands. As arid lands are impacted due to DoD and DOE activities, biological and soil resources are gradually lost and the habitat is altered. A conceptual model of that change in habitat quality is described for varying levels of disturbance in the Mojave Desert. As the habitat quality degrades and more biological and physical resources are lost from training areas, greater costs are required to return the land to sustainable levels. The purpose of this manual is to assist land managers in recognizing thresholds associated with habitat degradation and provide reclamation planning and techniques that can reduce the costs of mitigation for these impacted lands to ensure sustainable use of these lands. The importance of reclamation planning is described in this manual with suggestions about establishing project objectives, scheduling, budgeting, and selecting cost-effective techniques. Reclamation techniques include sections describing: (1) erosion control (physical, chemical, and biological), (2) site preparation, (3) soil amendments, (4) seeding, (5) planting, (6) grazing and weed control, (7) mulching, (8) irrigation, and (9) site protection. Each section states the objectives of the technique, the principles, an in-depth look at the techniques, and any special considerations as it relates to DoD or DOE lands. The need for monitoring and remediation is described to guide users in monitoring reclamation efforts to evaluate their cost-effectiveness. Costs are provided for the proposed techniques for the major deserts of the southwestern U.S. showing the average and range of costs. A set of decision tools are provided in the form of a flow diagram and table to guide users in selecting effective reclamation techniques to achieve mitigation objectives. Recommendations are provided to help summarize key reclamation principles and to assist users in developing a successful program that contributes to sustainable uses of DoD and DOE lands. The users manual is helpful to managers in communicating to installation management the needs and consequences of training decisions and the costs required to achieve successful levels of sustainable use.
This users manual focuses on the development of new reclamation techniques that have been implemented at the National Training Center at Fort Irwin, California, and are applicable to most arid land reclamation efforts.
NASA Technical Reports Server (NTRS)
Devito, D. M.
1981-01-01
A low-cost GPS civil-user mobile terminal, whose purchase cost is essentially an order of magnitude less than estimates for its military counterpart, is considered, with focus on ground station requirements for position monitoring of civil users requiring this capability and on civil-user navigation and location-monitoring requirements. Existing survey literature was examined to ascertain the potential users of a low-cost NAVSTAR receiver and to estimate their number, function, and accuracy requirements. System concepts are defined for low-cost user equipment for in-situ navigation and for the retransmission of low-data-rate positioning data via a geostationary satellite to a central computing facility.
Bryce, Thomas N.; Dijkers, Marcel P.
2015-01-01
Background: Powered exoskeletons have been demonstrated as being safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. Objective: To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. Methods: A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort was quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact guard assistance. Results: Of 22 enrolled participants, 9 screen-failed, and 7 had complete data. All of these 7 were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact guard or close supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. Conclusion: This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor complete C7 tetraplegia and motor incomplete C4 tetraplegia, may be able to learn to use this device. PMID:26364280
Kozlowski, Allan J; Bryce, Thomas N; Dijkers, Marcel P
2015-01-01
Powered exoskeletons have been demonstrated as being safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort was quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact guard assistance. Of 22 enrolled participants, 9 screen-failed, and 7 had complete data. All of these 7 were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact guard or close supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor complete C7 tetraplegia and motor incomplete C4 tetraplegia, may be able to learn to use this device.
Turner, Anne M; Reeder, Blaine; Ramey, Judith
2014-01-01
Purpose: Despite years of effort and millions of dollars spent to create a unified electronic communicable disease reporting system, the goal remains elusive. A major barrier has been a lack of understanding by system designers of communicable disease (CD) work and the public health workers who perform this work. This study reports on the application of User-Centered Design representations, traditionally used for improving interface design, to translate the complex CD work identified through ethnographic studies to guide designers and developers of CD systems. The purpose of this work is to: (1) better understand public health practitioners and their information workflow with respect to communicable disease (CD) monitoring and control at a local health department, and (2) to develop evidence-based design representations that model this CD work to inform the design of future disease surveillance systems. Methods: We performed extensive onsite semi-structured interviews, targeted work shadowing and a focus group to characterize local health department communicable disease workflow. Informed by principles of design ethnography and user-centered design (UCD), we created personas, scenarios and user stories to accurately represent the user to system designers. Results: We sought to convey to designers the key findings from ethnographic studies: (1) public health CD work is mobile and episodic, in contrast to current CD reporting systems, which are stationary and fixed; (2) health department efforts are focused on CD investigation and response rather than reporting; and (3) current CD information systems must conform to PH workflow to ensure their usefulness. In an effort to illustrate our findings to designers, we developed three contemporary design-support representations: persona, scenario, and user story. Conclusions: Through application of user-centered design principles, we were able to create design representations that illustrate complex public health communicable disease workflow and key user characteristics to inform the design of CD information systems for public health. PMID:23618996
Development of the Community Health Improvement Navigator Database of Interventions.
Roy, Brita; Stanojevich, Joel; Stange, Paul; Jiwani, Nafisa; King, Raymond; Koo, Denise
2016-02-26
With the passage of the Patient Protection and Affordable Care Act, the requirements for hospitals to achieve tax-exempt status include performing a triennial community health needs assessment and developing a plan to address identified needs. To address community health needs, multisector collaborative efforts to improve both health care and non-health care determinants of health outcomes have been the most effective and sustainable. In 2015, CDC released the Community Health Improvement Navigator to facilitate the development of these efforts. This report describes the development of the database of interventions included in the Community Health Improvement Navigator. The database of interventions allows the user to easily search for multisector, collaborative, evidence-based interventions to address the underlying causes of the greatest morbidity and mortality in the United States: tobacco use and exposure, physical inactivity, unhealthy diet, high cholesterol, high blood pressure, diabetes, and obesity.
Enhancing the Value of the Federal Climate-Relevant Data Through the Climate Data Initiative
NASA Astrophysics Data System (ADS)
Meyer, D. J.; Pinheiro Privette, A. C.; Bugbee, K.
2016-12-01
The Climate Data Initiative (CDI), launched by the Obama Administration in March of 2014, is an effort to leverage the extensive open Federal data to spur innovation and private-sector entrepreneurship around climate resilience. As part of this initiative, the federal agencies identified key climate-relevant datasets and made them discoverable through an online catalog at data.gov/climate. Although this was a critical and foundational step to improve the discoverability of these federal data, enhancements to their accessibility and usability require a deeper understanding of the data needs of the different user communities. More recently, the focus of the CDI project has evolved toward extended engagement with communities of resilience through the identification of use cases. This effort aims to guide the next steps of the CDI project to make the CDI resources more easily integrated into decision support systems.
The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diachin, L F; Garaizar, F X; Henson, V E
2009-10-12
In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.
Development of the Community Health Improvement Navigator Database of Interventions
Roy, Brita; Stanojevich, Joel; Stange, Paul; Jiwani, Nafisa; King, Raymond; Koo, Denise
2016-01-01
Summary With the passage of the Patient Protection and Affordable Care Act, the requirements for hospitals to achieve tax-exempt status include performing a triennial community health needs assessment and developing a plan to address identified needs. To address community health needs, multisector collaborative efforts to improve both health care and non–health care determinants of health outcomes have been the most effective and sustainable. In 2015, CDC released the Community Health Improvement Navigator to facilitate the development of these efforts. This report describes the development of the database of interventions included in the Community Health Improvement Navigator. The database of interventions allows the user to easily search for multisector, collaborative, evidence-based interventions to address the underlying causes of the greatest morbidity and mortality in the United States: tobacco use and exposure, physical inactivity, unhealthy diet, high cholesterol, high blood pressure, diabetes, and obesity. PMID:26917110
Group-Based Active Learning of Classification Models.
Luo, Zhipeng; Hauskrecht, Milos
2017-05-01
Learning of classification models from real-world data often requires additional human expert effort to annotate the data. However, this process can be rather costly and finding ways of reducing the human annotation effort is critical for this task. The objective of this paper is to develop and study new ways of providing human feedback for efficient learning of classification models by labeling groups of examples. Briefly, unlike traditional active learning methods that seek feedback on individual examples, we develop a new group-based active learning framework that solicits label information on groups of multiple examples. In order to describe groups in a user-friendly way, conjunctive patterns are used to compactly represent groups. Our empirical study on 12 UCI data sets demonstrates the advantages and superiority of our approach over both classic instance-based active learning work, as well as existing group-based active-learning methods.
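The sketch below is a simplified, assumption-laden re-implementation of the idea rather than the authors' algorithm: groups are defined by conjunctive patterns over discretized features, the group the current model is least certain about is sent to the oracle, and the returned group label is propagated to all of its members before retraining.

```python
import numpy as np

def group_members(X, pattern):
    """A group is the set of examples matching a conjunctive pattern,
    given as {feature_index: required_value}. Patterns are assumed non-empty."""
    mask = np.ones(len(X), dtype=bool)
    for j, v in pattern.items():
        mask &= (X[:, j] == v)
    return np.where(mask)[0]

def select_group(model, X, patterns):
    """Pick the group whose members the model is least certain about."""
    probs = model.predict_proba(X)[:, 1]
    uncertainty = 1.0 - 2.0 * np.abs(probs - 0.5)    # 1 = maximally uncertain
    scores = [uncertainty[group_members(X, p)].mean() for p in patterns]
    return patterns[int(np.argmax(scores))]

def one_round(model, X, y_partial, patterns, oracle):
    """One querying round. The oracle returns a single (majority) label for the
    chosen group, which is propagated to every member; unlabeled entries of
    y_partial are -1. model is any classifier with fit/predict_proba that has
    already been fitted on a small seed set."""
    pattern = select_group(model, X, patterns)
    idx = group_members(X, pattern)
    y_partial[idx] = oracle(pattern)
    labeled = y_partial >= 0
    model.fit(X[labeled], y_partial[labeled])
    return model
```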
Study of data collection platform concepts: Data collection system user requirements
NASA Technical Reports Server (NTRS)
1973-01-01
The overall purpose of the survey was to provide real world data on user requirements. The intent was to assess data collection system user requirements by questioning actual potential users rather than speculating on requirements. The end results of the survey are baseline requirements models for both a data collection platform and a data collection system. These models were derived from the survey results. The real value of these models lies in the fact that they are based on actual user requirements as delineated in the survey questionnaires. Some users desire data collection platforms of small size and light weight. These sizes and weights are beyond the present state of the art. Also, the survey provided a wealth of information on the nature and constituency of the data collection user community as well as information on user applications for data collection systems. Finally, the data sheds light on the generalized platform concept. That is, the diversity of user requirements shown in the data indicates the difficulty that can be anticipated in attempting to implement such a concept.
Flow Cytometry Scientist | Center for Cancer Research
PROGRAM DESCRIPTION
The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR).
KEY ROLES/RESPONSIBILITIES
The Flow Cytometry Core (Flow Core) in the Cancer and Inflammation Program (CIP) is a service core which supports the research efforts of the CCR by providing expertise in the field of flow cytometry (using analyzers and sorters) with the goal of gaining a more thorough understanding of the biology of the immune system, cancer, and inflammation processes. The Flow Core provides service to 12-15 CIP laboratories and more than 22 non-CIP laboratories. Flow core staff provide technical advice on the experimental design of applications, which include immunological phenotyping, cell function assays, and cell cycle analysis. Work is performed per customer requirements, and no independent research is involved.
The Flow Cytometry Scientist will be responsible for:
Daily management of the Flow Cytometry Core, to include the supervision and guidance of technical staff members
Monitor performance of and maintain high dimensional flow cytometer analyzers and cell sorters
Operate high dimensional flow cytometer analyzers and cell sorters
Provide scientific expertise to the user community and facilitate the development of cutting edge technologies
Interact with Flow Core users and customers, and provide technical and scientific advice, and guidance regarding their experiments, including possible collaborations
Train staff and scientific end users on the use of flow cytometry in their research, as well as teach them how to operate and troubleshoot the bench-top analyzer instruments
Prepare and deliver lectures, as well as one-on-one training sessions, with customers/users
Ensure that protocols are up-to-date, and appropriately adhered to
Experience with sterile technique and tissue culture
Grass Roots Design for the Ocean Science of Tomorrow
NASA Astrophysics Data System (ADS)
Jul, S.; Peach, C. L.; Kilb, D. L.; Schofield, O.; Fisher, C.; Quintana, C.; Keen, C. S.
2010-12-01
Current technologies offer the opportunity for ocean science to expand its traditional expeditionary base by embracing e-science methods of continuous interactive real-time research. The Ocean Observatories Initiative Cyberinfrastructure (OOI CI) is an NSF-funded effort to develop a national cyberinfrastructure that will allow researchers, educators and others to share in this new type of oceanography. The OOI is an environmental observatory spanning coastal waters to the deep ocean, enabled by the CI to offer scientists continuous interactive access to instruments in the ocean, and allow them to search, subscribe to and access real-time or archival data streams. It will also supply interactive analysis and visualization tools, and a virtual social environment for discovering and realizing collaborative opportunities. Most importantly, it provides an extensible open-access cyberinfrastructure that supports integration of new technologies and observatories, and which will allow adoption of its tools elsewhere, such as by the Integrated Ocean Observing System (IOOS). The eventual success of such a large and flexible system requires the input of a large number of people, and user-centered design has been a driving philosophy of the OOI CI from its beginning. Support for users’ real needs cannot be designed as an add-on or casual afterthought, but must be deeply embedded in all aspects of a project, from inception through architecture, implementation, and deployment. The OOI CI strategy is to employ the skills and knowledge of a small number of user experience professionals to channel and guide a very large collective effort to deliver tools, interfaces and interactions that are intellectually stimulating, scientifically productive, and conducive to innovation. Participation from all parts of the user community early in the design process is vital to meeting these goals. The OOI user experience team will be on hand to meet members of the Earth and ocean sciences community, and invites them to become partners in the design of the Ocean Observatory by offering their thoughts, ideas and observations.
NASA Astrophysics Data System (ADS)
Ganter, John H.; Reeves, Paul C.
2017-05-01
Processing remote sensing data is the epitome of computation, yet real-time collection systems remain human-labor intensive. Operator labor is consumed by both overhead tasks (cost) and value-added production (benefit). In effect, labor is taxed and then lost. When an operator comes on-shift, they typically duplicate setup work that their teammates have already performed many times. "Pass down" of state information can be difficult if security restrictions require total logouts and blank screens - hours or even days of valuable history and context are lost. As work proceeds, duplicative effort is common because it is typically easier for operators to "do it over" rather than share what others have already done. As we begin a major new system version, we are refactoring the user interface to reduce time and motion losses. Working with users, we are developing "click budgets" to streamline interface use. One basic function is shared clipboards to reduce the use of sticky notes and verbal communication of data strings. We illustrate two additional designs to share work: window copying and window sharing. Copying (technically, shallow or deep object cloning) allows any system user to duplicate a window and configuration for themselves or another to use. Sharing allows a window to have multiple users: shareholders with read-write functionality and viewers with read-only. These solutions would allow windows to persist across multiple shifts, with a rotating cast of shareholders and viewers. Windows thus become durable objects of shared effort and persistent state. While these are low-tech functions, the cumulative labor savings in a 24X7 crew position (525,000 minutes/year spread over multiple individuals) would be significant. New design and implementation is never free and these investments typically do not appeal to government acquisition officers with short-term acquisition-cost concerns rather than a long-term O and M (operations and maintenance) perspective. We share some successes in educating some officers, in collaboration with system users, about the human capital involved in operating the systems they are acquiring.
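A toy sketch of the two designs follows; the class name, permission model, and configuration fields are hypothetical, and a real implementation would also have to handle security markings and persistence across shifts.

```python
import copy

class SharedWindow:
    """A display window whose configuration can be copied or shared.
    'Shallow' vs 'deep' copying maps onto copy.copy / copy.deepcopy."""
    def __init__(self, config):
        self.config = config          # filters, layers, data selections, ...
        self.shareholders = set()     # read-write users
        self.viewers = set()          # read-only users

    def clone_for(self, user, deep=True):
        """Window copying: give a user their own duplicate of this window."""
        w = SharedWindow(copy.deepcopy(self.config) if deep else copy.copy(self.config))
        w.shareholders.add(user)
        return w

    def update(self, user, key, value):
        """Window sharing: only shareholders may modify the live window."""
        if user not in self.shareholders:
            raise PermissionError(f"{user} has view-only access")
        self.config[key] = value

night_shift = SharedWindow({"sensor": "band-3", "region": "Pacific"})
night_shift.shareholders.add("operator_a")
day_copy = night_shift.clone_for("operator_b")   # survives operator_a's logout
```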
Action Information Management System (AIMS): a User's View
NASA Technical Reports Server (NTRS)
Wiskerchen, M.
1984-01-01
The initial approach used in establishing a user-defined information system to fulfill the needs of users at NASA Headquarters was unsuccessful in bringing this pilot endeavor to full project status. The persistence of several users and the full involvement of the Ames Research Center were the ingredients needed to make the AIMS project a success. The lesson learned from this effort is that NASA should always work from its organizational strengths as a Headquarters-Center partnership.
Proceedings of the Workshop on Government Oil Spill Modeling
NASA Technical Reports Server (NTRS)
Bishop, J. M. (Compiler)
1980-01-01
Oil spill model users and modelers were brought together for the purpose of fostering joint communication and increasing understanding of mutual problems. The workshop concentrated on defining user needs, presentations on ongoing modeling programs, and discussions of supporting research for these modeling efforts. Specific user recommendations include the development of an oil spill model user library which identifies and describes available models. The development of models for the long-term fate and effect of spilled oil was examined.
NASA work unit system users manual
NASA Technical Reports Server (NTRS)
1972-01-01
The NASA Work Unit System is a management information system for research tasks (i.e., work units) performed under NASA grants and contracts. It supplies profiles to indicate how much effort is being expended on what types of research, where the effort is being expended, and how funds are being distributed. The user obtains information by entering requests on the keyboard of a time-sharing terminal. Responses are received as video displays or typed messages at the terminal, or as lists printed in the computer room for subsequent delivery by messenger.
Vucovich, Lee A; Gordon, Valerie S; Mitchell, Nicole; Ennis, Lisa A
2013-01-01
Librarians are using social networking sites as one means of sharing information and connecting with users from diverse groups. Usage statistics and other metrics compiled in 2011 for the library's Facebook page, representative library blogs, and the library YouTube channel were analyzed in an effort to understand how patrons use the library's social networking sites. Librarians also hoped to get a sense of these tools' effectiveness in reaching users at the point of need and engaging them in different ways.
Space Station module Power Management And Distribution (PMAD) system
NASA Technical Reports Server (NTRS)
Walls, Bryan
1990-01-01
This project consists of several tasks which are unified toward experimentally demonstrating the operation of a highly autonomous, user-supportive power management and distribution system for Space Station Freedom (SSF) habitation/laboratory modules. This goal will be extended to a demonstration of autonomous, cooperative power system operation for the whole SSF power system through a joint effort with NASA's Lewis Research Center, using their Autonomous Power System. Short-term goals for the space station module power management and distribution include having an operational breadboard reflecting current plans for SSF, improving performance of the system communications, and improving the organization and mutability of the artificial intelligence (AI) systems. In the middle term, intermediate levels of autonomy will be added, user interfaces will be modified, and enhanced modeling capabilities will be integrated in the system. Long-term goals involve conversion of all software into Ada, vigorous verification and validation efforts and, finally, seeing an impact of this research on the operation of SSF. Conversion of the system to a DC Star configuration is now in progress, and should be completed by the end of October 1989. This configuration reflects the latest SSF module architecture. Hardware is now being procured which will improve system communications significantly. The Knowledge-Based Management System (KBMS) has been initially developed, and the rules from FRAMES have been implemented in the KBMS. Rules in the other two AI systems are also being grouped modularly, making them more tractable, and easier to eventually move into the KBMS. Adding an intermediate level of autonomy will require development of a planning utility, which will also be built using the KBMS. These changes will require having the user interface for the whole system available from one interface. An Enhanced Model will be developed, which will allow exercise of the system through the interface without requiring all of the power hardware to be operational. The functionality of the AI systems will continue to be advanced, including incipient failure detection. Ada conversion will begin with the lowest level processor (LLP) code. Then selected pieces of the higher level functionality will be recoded in Ada and, where possible, moved to the LLP level. Validation and verification will be done on the Ada code, and will be completed sometime after the Ada conversion.
Interactive semiautomatic contour delineation using statistical conditional random fields framework.
Hu, Yu-Chi; Grossberg, Michael D; Wu, Abraham; Riaz, Nadeem; Perez, Carmen; Mageras, Gig S
2012-07-01
Contouring a normal anatomical structure during radiation treatment planning requires significant time and effort. The authors present a fast and accurate semiautomatic contour delineation method to reduce the time and effort required of expert users. Following an initial segmentation on one CT slice, the user marks the target organ and nontarget pixels with a few simple brush strokes. The algorithm calculates statistics from this information that, in turn, determines the parameters of an energy function containing both boundary and regional components. The method uses a conditional random field graphical model to define the energy function to be minimized for obtaining an estimated optimal segmentation, and a graph partition algorithm to efficiently solve the energy function minimization. Organ boundary statistics are estimated from the segmentation and propagated to subsequent images; regional statistics are estimated from the simple brush strokes that are either propagated or redrawn as needed on subsequent images. This greatly reduces the user input needed and speeds up segmentations. The proposed method can be further accelerated with graph-based interpolation of alternating slices in place of user-guided segmentation. CT images from phantom and patients were used to evaluate this method. The authors determined the sensitivity and specificity of organ segmentations using physician-drawn contours as ground truth, as well as the predicted-to-ground truth surface distances. Finally, three physicians evaluated the contours for subjective acceptability. Interobserver and intraobserver analysis was also performed and Bland-Altman plots were used to evaluate agreement. Liver and kidney segmentations in patient volumetric CT images show that boundary samples provided on a single CT slice can be reused through the entire 3D stack of images to obtain accurate segmentation. In liver, our method has better sensitivity and specificity (0.925 and 0.995) than region growing (0.897 and 0.995) and level set methods (0.912 and 0.985) as well as shorter mean predicted-to-ground truth distance (2.13 mm) compared to regional growing (4.58 mm) and level set methods (8.55 mm and 4.74 mm). Similar results are observed in kidney segmentation. Physician evaluation of ten liver cases showed that 83% of contours did not need any modification, while 6% of contours needed modifications as assessed by two or more evaluators. In interobserver and intraobserver analysis, Bland-Altman plots showed our method to have better repeatability than the manual method while the delineation time was 15% faster on average. Our method achieves high accuracy in liver and kidney segmentation and considerably reduces the time and labor required for contour delineation. Since it extracts purely statistical information from the samples interactively specified by expert users, the method avoids heuristic assumptions commonly used by other methods. In addition, the method can be expanded to 3D directly without modification because the underlying graphical framework and graph partition optimization method fit naturally with the image grid structure.
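To make the two energy components concrete, the sketch below builds regional costs from Gaussian intensity statistics of the brush strokes and a contrast-sensitive boundary cost between neighboring pixels; the Gaussian model and parameter choices are our assumptions, and the graph-partition (min-cut) step that actually minimizes the energy is omitted.

```python
import numpy as np

def regional_costs(image, fg_pixels, bg_pixels):
    """Negative log-likelihood of every pixel under Gaussians fitted to the
    foreground (organ) and background brush strokes (boolean masks)."""
    def nll(samples, x):
        mu, sigma = samples.mean(), samples.std() + 1e-6
        return 0.5 * ((x - mu) / sigma) ** 2 + np.log(sigma)
    return nll(image[fg_pixels], image), nll(image[bg_pixels], image)

def boundary_cost(image, sigma=10.0):
    """Contrast-sensitive penalty for giving different labels to horizontally
    adjacent pixels (vertical pairs are handled analogously)."""
    diff = np.diff(image, axis=1)
    return np.exp(-(diff ** 2) / (2.0 * sigma ** 2))

# Total energy of a binary labeling y (1 = organ), up to the vertical pair terms:
# E(y) = sum_p [ y_p * cost_fg[p] + (1 - y_p) * cost_bg[p] ]
#      + lambda * sum_{(p,q) adjacent} boundary[p,q] * [y_p != y_q]
```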
ERIC Educational Resources Information Center
Read, Aaron
2013-01-01
The rise of stakeholder centered software development has led to organizations engaging users early in the development process to help define system requirements. To facilitate user involvement in the requirements elicitation process, companies can use Group Support Systems (GSS) to conduct requirements elicitation workshops. The effectiveness of…
Automated Flight Dynamics Product Generation for the EOS AM-1 Spacecraft
NASA Technical Reports Server (NTRS)
Matusow, Carla
1999-01-01
As part of NASA's Earth Science Enterprise, the Earth Observing System (EOS) AM-1 spacecraft is designed to monitor long-term, global, environmental changes. Because of the complexity of the AM-1 spacecraft, the mission operations center requires more than 80 distinct flight dynamics products (reports). To create these products, the AM-1 Flight Dynamics Team (FDT) will use a combination of modified commercial software packages (e.g., Analytical Graphics' Satellite ToolKit) and NASA-developed software applications. While providing the most cost-effective solution to meeting the mission requirements, the integration of these software applications raises several operational concerns: (1) Routine product generation requires knowledge of multiple applications executing on a variety of hardware platforms. (2) Generating products is a highly interactive process requiring a user to interact with each application multiple times to generate each product. (3) Routine product generation requires several hours to complete. (4) User interaction with each application introduces the potential for errors, since users are required to manually enter filenames and input parameters as well as run applications in the correct sequence. Generating products requires some level of flight dynamics expertise to determine the appropriate inputs and sequencing. To address these issues, the FDT developed an automation software tool called AutoProducts, which runs on a single hardware platform and provides all necessary coordination and communication among the various flight dynamics software applications. AutoProducts autonomously retrieves necessary files, sequences and executes applications with the correct input parameters, and delivers the final flight dynamics products to the appropriate customers. Although AutoProducts will normally generate pre-programmed sets of routine products, its graphical interface allows for easy configuration of customized and one-of-a-kind products. Additionally, AutoProducts has been designed as a mission-independent tool, and can be easily reconfigured to support other missions or incorporate new flight dynamics software packages. After the AM-1 launch, AutoProducts will run automatically at pre-determined time intervals. The AutoProducts tool reduces many of the concerns associated with flight dynamics product generation. Although AutoProducts required a significant effort to develop because of the complexity of the interfaces involved, its use will provide significant cost savings through reduced operator time and maximum product reliability. In addition, user satisfaction is significantly improved and flight dynamics experts have more time to perform valuable analysis work. This paper will describe the evolution of the AutoProducts tool, highlighting the cost savings and customer satisfaction resulting from its development. It will also provide details about the tool including its graphical interface and operational capabilities.
Lawrence Berkeley Laboratory/University of California lighting program overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berman, S.
1981-12-01
The objective of the Lighting Program is to assist and work in concert with the lighting community (composed of manufacturers, designers, and users) to achieve a more efficient lighting economy. To implement its objectives, the Lighting Program has been divided into three major categories: technical engineering, building applications, and human impacts (impacts on health and vision). The technical program aims to undertake research and development projects that are both long-range and high-risk and which the lighting industry has little interest in pursuing on its own, but from which significant benefits could accrue to both the public and the industry. The building applications program studies the effects that introducing daylighting in commercial buildings has on lighting and cooling electrical energy requirements as well as on peak demand. This program also examines optimization strategies for integrating energy-efficient design, lighting hardware, daylighting, and overall building energy requirements. The impacts program examines relationships between the user and the physical lighting environment, in particular how new energy-efficient technologies relate to human productivity and health. These efforts are interdisciplinary, involving engineering, optometry, and medicine. The program facilities are described and the personnel in the program are identified.
An integrated approach to stakeholder engagement.
Carr, Dafna; Howells, Arlene; Chang, Melissa; Hirji, Nadir; English, Ann
2009-01-01
The Wait Time Information System (WTIS) project was a complex change-management initiative. For the first time in Ontario, wait time data would be captured directly from clinician offices and publicly reported in an effort to improve access to care. The change meant using new technology, new business processes and, most importantly, a new dimension of accountability for making improvements within the health system. Success required engaging thousands of individuals at all levels of healthcare, many of whom were skeptical and resistant to the upcoming change, and subsequently gaining their support and motivating them to use the WTIS and its data. To achieve the level of stakeholder engagement that would be required to deploy and sustain the WTIS, the project team needed to address both the business reasons for change, and the emotional reactions to it. The team applied a three-pronged approach encompassing strong communications, compelling adoption efforts and hands-on training. Communication focused on awareness and education, ensuring that information was coordinated, consistent and transparent. Adoption efforts involved helping hospitals and users understand and prepare for the impact of change. Training provided hands-on practice to get people comfortable with using the system. This article explores how information management/information technology (IM/IT) projects can integrate communications, adoption and training to drive stakeholder engagement. It also provides insight around how, when used effectively, these functions can maximize limited resources and provide valuable benefits.
Scheduling with Automatic Resolution of Conflicts
NASA Technical Reports Server (NTRS)
Clement, Bradley; Schaffer, Steve
2006-01-01
DSN Requirement Scheduler is a computer program that automatically schedules, reschedules, and resolves conflicts for allocations of resources of NASA's Deep Space Network (DSN) on the basis of ever-changing project requirements for DSN services. As used here, "resources" signifies, primarily, DSN antennas, ancillary equipment, and times during which they are available. Examples of project-required DSN services include arraying, segmentation, very-long-baseline interferometry, and multiple spacecraft per aperture. Requirements can include periodic reservations of specific or optional resources during specific time intervals or within ranges specified in terms of starting times and durations. This program is built on the Automated Scheduling and Planning Environment (ASPEN) software system (aspects of which have been described in previous NASA Tech Briefs articles), with customization to reflect requirements and constraints involved in allocation of DSN resources. Unlike prior DSN-resource-scheduling programs that make single passes through the requirements and require human intervention to resolve conflicts, this program makes repeated passes in a continuing search for all possible allocations, provides a best-effort solution at any time, and presents alternative solutions among which users can choose.
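As an illustration of the repeated-pass, best-effort behavior described above, the following minimal Python sketch assigns requests to antennas and then iteratively repairs conflicts, always retaining the least-conflicted schedule found so far. It is an assumed, generic iterative-repair scheme, not the ASPEN or DSN Requirement Scheduler code; all identifiers and numbers are illustrative.

import random

def overlaps(s1, d1, s2, d2):
    # Two activities overlap if each starts before the other ends.
    return s1 < s2 + d2 and s2 < s1 + d1

def conflict_pairs(schedule, reqs):
    ids = list(schedule)
    pairs = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            ant_a, start_a = schedule[a]
            ant_b, start_b = schedule[b]
            if ant_a == ant_b and overlaps(start_a, reqs[a]["dur"], start_b, reqs[b]["dur"]):
                pairs.append((a, b))
    return pairs

def iterative_repair(requests, antennas, passes=500, seed=0):
    rng = random.Random(seed)
    reqs = {r["id"]: r for r in requests}
    # Initial best-effort assignment: random antenna, earliest allowed start time.
    schedule = {r["id"]: (rng.choice(antennas), r["window"][0]) for r in requests}
    best, best_bad = dict(schedule), len(conflict_pairs(schedule, reqs))
    for _ in range(passes):
        bad = conflict_pairs(schedule, reqs)
        if not bad:
            return schedule, 0
        victim = rng.choice(rng.choice(bad))   # move one request from a conflicting pair
        lo, hi = reqs[victim]["window"]
        schedule[victim] = (rng.choice(antennas), rng.randint(lo, hi))
        n_bad = len(conflict_pairs(schedule, reqs))
        if n_bad < best_bad:                   # a best-effort solution is available at any time
            best, best_bad = dict(schedule), n_bad
    return best, best_bad

requests = [
    {"id": "track-A", "dur": 4, "window": (0, 10)},
    {"id": "track-B", "dur": 6, "window": (0, 8)},
    {"id": "vlbi-C", "dur": 3, "window": (2, 12)},
]
print(iterative_repair(requests, antennas=["DSS-1", "DSS-2"]))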
Clark, Alex M; Bunin, Barry A; Litterman, Nadia K; Schürer, Stephan C; Visser, Ubbo
2014-01-01
Bioinformatics and computer aided drug design rely on the curation of a large number of protocols for biological assays that measure the ability of potential drugs to achieve a therapeutic effect. These assay protocols are generally published by scientists in the form of plain text, which needs to be more precisely annotated in order to be useful to software methods. We have developed a pragmatic approach to describing assays according to the semantic definitions of the BioAssay Ontology (BAO) project, using a hybrid of machine learning based on natural language processing, and a simplified user interface designed to help scientists curate their data with minimum effort. We have carried out this work based on the premise that pure machine learning is insufficiently accurate, and that expecting scientists to find the time to annotate their protocols manually is unrealistic. By combining these approaches, we have created an effective prototype for which annotation of bioassay text within the domain of the training set can be accomplished very quickly. Well-trained annotations require single-click user approval, while annotations from outside the training set domain can be identified using the search feature of a well-designed user interface, and subsequently used to improve the underlying models. By drastically reducing the time required for scientists to annotate their assays, we can realistically advocate for semantic annotation to become a standard part of the publication process. Once even a small proportion of the public body of bioassay data is marked up, bioinformatics researchers can begin to construct sophisticated and useful searching and analysis algorithms that will provide a diverse and powerful set of tools for drug discovery researchers.
Bunin, Barry A.; Litterman, Nadia K.; Schürer, Stephan C.; Visser, Ubbo
2014-01-01
Bioinformatics and computer aided drug design rely on the curation of a large number of protocols for biological assays that measure the ability of potential drugs to achieve a therapeutic effect. These assay protocols are generally published by scientists in the form of plain text, which needs to be more precisely annotated in order to be useful to software methods. We have developed a pragmatic approach to describing assays according to the semantic definitions of the BioAssay Ontology (BAO) project, using a hybrid of machine learning based on natural language processing, and a simplified user interface designed to help scientists curate their data with minimum effort. We have carried out this work based on the premise that pure machine learning is insufficiently accurate, and that expecting scientists to find the time to annotate their protocols manually is unrealistic. By combining these approaches, we have created an effective prototype for which annotation of bioassay text within the domain of the training set can be accomplished very quickly. Well-trained annotations require single-click user approval, while annotations from outside the training set domain can be identified using the search feature of a well-designed user interface, and subsequently used to improve the underlying models. By drastically reducing the time required for scientists to annotate their assays, we can realistically advocate for semantic annotation to become a standard part of the publication process. Once even a small proportion of the public body of bioassay data is marked up, bioinformatics researchers can begin to construct sophisticated and useful searching and analysis algorithms that will provide a diverse and powerful set of tools for drug discovery researchers. PMID:25165633
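The suggest-then-approve workflow described in the two records above can be illustrated with a very small sketch: a model trained on previously curated assay texts scores candidate ontology terms, the top suggestions are presented for single-click approval, and anything the model cannot score falls back to a manual ontology search. This is an assumed, simplified stand-in for the actual natural-language-processing models and BAO curation interface; the assay texts and terms below are invented.

from collections import Counter, defaultdict

def train(examples):
    """examples: list of (assay_text, set_of_annotation_terms)."""
    term_word_counts = defaultdict(Counter)
    for text, terms in examples:
        words = text.lower().split()
        for term in terms:
            term_word_counts[term].update(words)
    return term_word_counts

def suggest(text, model, top_n=3):
    # Score each known term by word overlap with the new assay text.
    words = Counter(text.lower().split())
    scores = {t: sum(words[w] * c for w, c in wc.items()) for t, wc in model.items()}
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [t for t, s in ranked[:top_n] if s > 0]

model = train([
    ("luciferase reporter gene assay in HEK293 cells", {"reporter gene assay"}),
    ("fluorescence polarization binding assay", {"binding assay"}),
])
print(suggest("cell based luciferase reporter assay", model))
# A curator would confirm the top suggestion with a single click, or search the
# ontology manually when no suggestion scores above a threshold.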
fNIRS suggests increased effort during executive access in ecstasy polydrug users.
Roberts, C A; Montgomery, C
2015-05-01
Ecstasy use is associated with cognitive impairment, believed to result from damage to 5-HT axons. Neuroimaging techniques to investigate executive dysfunction in ecstasy users provide a more sensitive measure of cognitive impairment than behavioural indicators. The present study assessed executive access to semantic memory in ecstasy polydrug users and non-users. Twenty ecstasy polydrug users and 20 non-user controls completed an oral variant of the Chicago Word Fluency Test (CWFT), whilst the haemodynamic response to the task was measured using functional near-infrared spectroscopy (fNIRS). There were no between-group differences in a range of background measures, including measures of sleep and mood state (anxiety, arousal, hedonic tone). No behavioural differences were observed on the CWFT. However, there were significant differences in oxy-Hb level change at several voxels relating to the left dorsolateral prefrontal cortex (DLPFC) and right medial prefrontal cortex (PFC) during the CWFT, indicating increased cognitive effort in ecstasy users relative to controls. Regression analyses showed that frequency of ecstasy use, total lifetime dose and amount used in the last 30 days were significant predictors of oxy-Hb increase at several voxels after controlling for alcohol and cannabis use indices. The results suggest that ecstasy users show increased activation in the PFC as a compensatory mechanism to achieve equivalent performance to non-users. These findings are in agreement with much of the literature in the area which suggests that ecstasy may be a selective serotonin neurotoxin in humans.
Criteria for successful uptake of AAL technologies: lessons learned from Norwegian pilot projects.
Svagård, Ingrid; Ausen, Dag; Standal, Kristin
2013-01-01
Implementation of AAL-technology as an integrated part of public health and care services requires a systematic and multidisciplinary approach. There are several challenges that need to be handled in parallel and with sustained effort over time, to tackle the multidimensional problem of building the value chain that is required for widespread uptake of AAL technology. Several pilot projects are on-going in Norway, involving municipalities, technology providers and research partners. Examples are "Home Safety" (NO: Trygghetspakken) and "Safe Tracks" (NO: Trygge spor). This paper will elaborate on our lessons learned with focus on five main points: 1) User-friendly and robust technology 2) Technology adapted organization 3) Service oriented technology providers 4) Care service organizations as demanding customer and 5) Sustainable financial model.
Automated crystallographic system for high-throughput protein structure determination.
Brunzelle, Joseph S; Shafaee, Padram; Yang, Xiaojing; Weigand, Steve; Ren, Zhong; Anderson, Wayne F
2003-07-01
High-throughput structural genomic efforts require software that is highly automated, distributive and requires minimal user intervention to determine protein structures. Preliminary experiments were set up to test whether automated scripts could utilize a minimum set of input parameters and produce a set of initial protein coordinates. From this starting point, a highly distributive system was developed that could determine macromolecular structures at a high throughput rate, warehouse and harvest the associated data. The system uses a web interface to obtain input data and display results. It utilizes a relational database to store the initial data needed to start the structure-determination process as well as generated data. A distributive program interface administers the crystallographic programs which determine protein structures. Using a test set of 19 protein targets, 79% were determined automatically.
The IRMIS object model and services API.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saunders, C.; Dohan, D. A.; Arnold, N. D.
2005-01-01
The relational model developed for the Integrated Relational Model of Installed Systems (IRMIS) toolkit has been successfully used to capture the Advanced Photon Source (APS) control system software (EPICS process variables and their definitions). The relational tables are populated by a crawler script that parses each Input/Output Controller (IOC) start-up file when an IOC reboot is detected. User interaction is provided by a Java Swing application that acts as a desktop for viewing the process variable information. Mapping between the display objects and the relational tables was carried out with the Hibernate Object Relational Modeling (ORM) framework. Work is well underway at the APS to extend the relational modeling to include control system hardware. For this work, due in part to the complex user interaction required, the primary application development environment has shifted from the relational database view to the object oriented (Java) perspective. With this approach, the business logic is executed in Java rather than in SQL stored procedures. This paper describes the object model used to represent control system software, hardware, and interconnects in IRMIS. We also describe the services API used to encapsulate the required behaviors for creating and maintaining the complex data. In addition to the core schema and object model, many important concepts in IRMIS are captured by the services API. IRMIS is an ambitious collaborative effort for defining and developing a relational database and associated applications to comprehensively document the large and complex EPICS-based control systems of today's accelerators. The documentation effort includes process variables, control system hardware, and interconnections. The approach could also be used to document all components of the accelerator, including mechanical, vacuum, power supplies, etc. One key aspect of IRMIS is that it is a documentation framework, not a design and development tool. We do not generate EPICS control system configurations from IRMIS, and hence do not impose any additional requirements on EPICS developers.
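The object-model-plus-services-API pattern described above can be sketched generically: entity classes mirror the relational tables, and a service object encapsulates create/maintain/query behavior so callers never touch the storage layer directly. The sketch below is a hedged illustration of that pattern only; the class and method names are invented and are not the actual IRMIS Java/Hibernate classes.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ProcessVariable:
    name: str
    record_type: str

@dataclass
class Ioc:
    name: str
    boot_file: str
    pvs: List[ProcessVariable] = field(default_factory=list)

class ControlSystemService:
    """Encapsulates create/maintain behavior so callers never issue SQL directly."""
    def __init__(self):
        self._iocs: Dict[str, Ioc] = {}        # in-memory stand-in for the relational store

    def save_ioc(self, ioc: Ioc) -> None:
        self._iocs[ioc.name] = ioc             # an ORM layer would persist this instead

    def find_pvs(self, substring: str) -> List[ProcessVariable]:
        return [pv for ioc in self._iocs.values() for pv in ioc.pvs
                if substring in pv.name]

svc = ControlSystemService()
svc.save_ioc(Ioc("ioc-storage-ring-1", "st.cmd",
                 [ProcessVariable("SR:Current", "ai")]))
print(svc.find_pvs("Current"))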
Kim, Myunghee; Collins, Steven H
2015-05-01
Individuals with below-knee amputation have more difficulty balancing during walking, yet few studies have explored balance enhancement through active prosthesis control. We previously used a dynamical model to show that prosthetic ankle push-off work affects both sagittal and frontal plane dynamics, and that appropriate step-by-step control of push-off work can improve stability. We hypothesized that this approach could be applied to a robotic prosthesis to partially fulfill the active balance requirements of human walking, thereby reducing balance-related activity and associated effort for the person using the device. We conducted experiments on human participants (N = 10) with simulated amputation. Prosthetic ankle push-off work was varied on each step in ways expected to either stabilize, destabilize or have no effect on balance. Average ankle push-off work, known to affect effort, was kept constant across conditions. Stabilizing controllers commanded more push-off work on steps when the mediolateral velocity of the center of mass was lower than usual at the moment of contralateral heel strike. Destabilizing controllers enforced the opposite relationship, while a neutral controller maintained constant push-off work regardless of body state. A random disturbance to landing foot angle and a cognitive distraction task were applied, further challenging participants' balance. We measured metabolic rate, foot placement kinematics, center of pressure kinematics, distraction task performance, and user preference in each condition. We expected the stabilizing controller to reduce active control of balance and balance-related effort for the user, improving user preference. The best stabilizing controller lowered metabolic rate by 5.5% (p = 0.003) and 8.5% (p = 0.02), and step width variability by 10.0% (p = 0.009) and 10.7% (p = 0.03) compared to conditions with no control and destabilizing control, respectively. Participants tended to prefer stabilizing controllers. These effects were not due to differences in average push-off work, which was unchanged across conditions, or to average gait mechanics, which were also unchanged. Instead, benefits were derived from step-by-step adjustments to prosthesis behavior in response to variations in mediolateral velocity at heel strike. Once-per-step control of prosthetic ankle push-off work can reduce both active control of foot placement and balance-related metabolic energy use during walking.
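The once-per-step control rule described above (more push-off work when mediolateral center-of-mass velocity at contralateral heel strike is lower than usual, the opposite for the destabilizing condition, and no dependence for the neutral condition) can be written as a one-line feedback law. The sketch below is illustrative only; the gain, velocity reference, and nominal work values are assumptions, not the values used in the study.

def pushoff_work(v_ml, v_ref=0.15, w_avg=0.10, gain=0.5, mode="stabilizing"):
    """Return prosthetic ankle push-off work for the coming step.

    v_ml : mediolateral COM velocity at contralateral heel strike (m/s)
    mode : 'stabilizing' adds work when v_ml is lower than usual,
           'destabilizing' does the opposite, 'neutral' ignores v_ml.
    Averaged over many steps the (v_ref - v_ml) term is near zero, so mean
    push-off work stays approximately constant across conditions.
    """
    sign = {"stabilizing": 1.0, "destabilizing": -1.0, "neutral": 0.0}[mode]
    return w_avg + sign * gain * (v_ref - v_ml)

# Example: slower-than-usual mediolateral velocity -> more push-off when stabilizing.
print(pushoff_work(0.10, mode="stabilizing"))    # 0.125
print(pushoff_work(0.10, mode="destabilizing"))  # 0.075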
DOT National Transportation Integrated Search
The GPS-based Odograph Prototype (GOP or GPS Odograph) was developed in an effort sponsored by The Federal Highway Administration (FHWA). The purpose of this effort was to develop a means of using inexpensive commercial off-the-shelf laptop (or notebo...
Human Factors Plan for the Aeronautical Information Subsystem
DOT National Transportation Integrated Search
1994-10-01
This human factors plan covers the human factors effort for the development of the Aeronautical Information Subsystem (AIS) of the Operational Data Management System (ODMS). Broadly the goals of the human factors effort are to provide a user interfac...
Safety | Argonne National Laboratory
laboratory's ongoing effort to provide a safe and productive environment for employees, users, other site
Earth Observations and the Role of UAVs: A Capabilities Assessment. Version 1.1
NASA Technical Reports Server (NTRS)
Cox, Timothy H.; Somers, Ivan; Fratello, David J.
2006-01-01
This document provides an assessment of the civil UAV missions and technologies and is intended to parallel the Office of the Secretary of Defense UAV Roadmap. The intent of this document is four-fold: 1. Determine and document desired future missions of Earth observation UAVs based on user-defined needs; 2. Determine and document the technologies necessary to support those missions; 3. Discuss the present state of the platform capabilities and required technologies, identifying those in progress, those planned, and those for which no current plans exist; and 4. Provide the foundations for development of a comprehensive civil UAV roadmap to complement the Department of Defense (DoD) effort (http://www.acq.osd.mil/uas/). Two aspects of the President's Management Agenda (refer to the document located at: www.whitehouse.gov/omb/budget/fy2002/mgmt.pdf) are supported by this undertaking. First, it is one that will engage multiple Agencies in the effort as stakeholders and benefactors of the systems. In that sense, the market will be driven by the user requirements and applications. The second aspect is one of supporting economic development in the commercial sector. Market forecasts for the civil use of UAVs have indicated an infant market stage at present with a sustained forecasted growth. There is some difficulty in quantifying the value of the market since the typical estimate excludes system components other than the aerial platforms. Section 2.4 addresses the civil UAV market forecast and lists several independent forecasts. One conclusion that can be drawn from these forecasts is that all show a sustained growth for the duration of each long-term forecast.
Evaluation of DVD-R for Archival Applications
NASA Technical Reports Server (NTRS)
Martin, Michael D.; Hyon, Jason J.
2000-01-01
For more than a decade, CD-ROM and CD-R have provided an unprecedented level of reliability, low cost and cross-platform compatibility to support federal data archiving and distribution efforts. However, it should be remembered that years of effort were required to achieve the standardization that has supported the growth of the CD industry. Incompatibilities in the interpretation of the ISO-9660 standard on different operating systems had to be dealt with, and the imprecise specifications in the Orange Book Part II and Part III led to incompatibilities between CD-R media and CD-R recorders. Some of these issues were presented by the authors at Optical Data Storage '95. The major current problem with the use of CD technology is the growing volume of digital data that needs to be stored. CD-ROM collections of hundreds of volumes and CD-R collections of several thousand volumes are becoming almost too cumbersome to be useful. The emergence of Digital Video Disk-Recordable (DVD-R) technology promises to reduce the number of discs required for archive applications by a factor of seven while providing improved reliability. It is important to identify problem areas for DVD-R media and provide guidelines to manufacturers, file system developers and users in order to provide reliable data storage and interchange. The Data Distribution Laboratory (DDL) at NASA's Jet Propulsion Laboratory began its evaluation of DVD-R technology in early 1998. The initial plan was to obtain a DVD-Recorder for preliminary testing, deploy reader hardware to user sites for compatibility testing, evaluate the quality and longevity of DVD-R media and develop proof-of-concept archive collections to test the reliability and usability of DVD-R media and jukebox hardware.
R2R--software to speed the depiction of aesthetic consensus RNA secondary structures.
Weinberg, Zasha; Breaker, Ronald R
2011-01-04
With continuing identification of novel structured noncoding RNAs, there is an increasing need to create schematic diagrams showing the consensus features of these molecules. RNA structural diagrams are typically made either with general-purpose drawing programs like Adobe Illustrator, or with automated or interactive programs specific to RNA. Unfortunately, the use of applications like Illustrator is extremely time consuming, while existing RNA-specific programs produce figures that are useful, but usually not of the same aesthetic quality as those produced at great cost in Illustrator. Additionally, most existing RNA-specific applications are designed for drawing single RNA molecules, not consensus diagrams. We created R2R, a computer program that facilitates the generation of aesthetic and readable drawings of RNA consensus diagrams in a fraction of the time required with general-purpose drawing programs. Since the inference of a consensus RNA structure typically requires a multiple-sequence alignment, the R2R user annotates the alignment with commands directing the layout and annotation of the RNA. R2R creates SVG or PDF output that can be imported into Adobe Illustrator, Inkscape or CorelDRAW. R2R can be used to create consensus sequence and secondary structure models for novel RNA structures or to revise models when new representatives for known RNA classes become available. Although R2R does not currently have a graphical user interface, it has proven useful in our efforts to create 100 schematic models of distinct noncoding RNA classes. R2R makes it possible to obtain high-quality drawings of the consensus sequence and structural models of many diverse RNA structures with a more practical amount of effort. R2R software is available at http://breaker.research.yale.edu/R2R and as an Additional file.
The DYNES Instrument: A Description and Overview
NASA Astrophysics Data System (ADS)
Zurawski, Jason; Ball, Robert; Barczyk, Artur; Binkley, Mathew; Boote, Jeff; Boyd, Eric; Brown, Aaron; Brown, Robert; Lehman, Tom; McKee, Shawn; Meekhof, Benjeman; Mughal, Azher; Newman, Harvey; Rozsa, Sandor; Sheldon, Paul; Tackett, Alan; Voicu, Ramiro; Wolff, Stephen; Yang, Xi
2012-12-01
Scientific innovation continues to increase requirements for the computing and networking infrastructures of the world. Collaborative partners, instrumentation, storage, and processing facilities are often geographically and topologically separated, as is the case with LHC virtual organizations. These separations challenge the technology used to interconnect available resources, often delivered by Research and Education (R&E) networking providers, and lead to complications in the overall process of end-to-end data management. Capacity and traffic management are key concerns of R&E network operators; a delicate balance is required to serve both long-lived, high-capacity network flows and more traditional end-user activities. The advent of dynamic circuit services, a technology that enables the creation of variable duration, guaranteed bandwidth networking channels, allows for the efficient use of common network infrastructures. These gains are seen particularly in locations where overall capacity is scarce compared to the (sustained peak) needs of user communities. Related efforts, including those of the LHCOPN [3] operations group and the emerging LHCONE [4] project, may take advantage of available resources by designating specific network activities as a “high priority”, allowing reservation of dedicated bandwidth or optimizing for deadline scheduling and predictable delivery patterns. This paper presents the DYNES instrument, an NSF funded cyberinfrastructure project designed to facilitate end-to-end dynamic circuit services [2]. This combination of hardware and software innovation is being deployed across R&E networks in the United States at selected end-sites located on university campuses. DYNES is peering with international efforts in other countries using similar solutions, and is increasing the reach of this emerging technology. This global data movement solution could be integrated into computing paradigms such as cloud and grid computing platforms, and through the use of APIs can be integrated into existing data movement software.
No Pixel Left Behind - Peeling Away NASA's Satellite Swaths
NASA Astrophysics Data System (ADS)
Cechini, M. F.; Boller, R. A.; Schmaltz, J. E.; Roberts, J. T.; Alarcon, C.; Huang, T.; McGann, M.; Murphy, K. J.
2014-12-01
Discovery and identification of Earth Science products should not be the majority effort of scientific research. Search aids based on text metadata go to great lengths to simplify this process. However, the process is still cumbersome and requires too much data download and analysis to down-select to valid products. The EOSDIS Global Imagery Browse Services (GIBS) is attempting to improve this process by providing "visual metadata" in the form of full-resolution visualizations representing geophysical parameters taken directly from the data. Through the use of accompanying interpretive information such as color legends and the natural visual processing of the human eye, researchers are able to search and filter through data products in a more natural and efficient way. The GIBS "visual metadata" products are generated as representations of Level 3 data or as temporal composites of the Level 2 granule- or swath-based data products projected across a geographic or polar region. Such an approach allows for low-latency tiled access to pre-generated imagery products. For many GIBS users, the resulting image suffices for a basic representation of the underlying data. However, composite imagery presents an insurmountable problem: for areas of spatial overlap within the composite, only one observation is visually represented. This is especially problematic in the polar regions where a significant portion of sensed data is "lost." In response to its user community, the GIBS team coordinated with its stakeholders to begin developing an approach to ensure that there is "no pixel left behind." In this presentation we will discuss the use cases and requirements guiding our efforts, considerations regarding standards compliance and interoperability, and near term goals. We will also discuss opportunities to actively engage with the GIBS team on this topic to continually improve our services.
Application of a single-flicker online SSVEP BCI for spatial navigation.
Chen, Jingjing; Zhang, Dan; Engel, Andreas K; Gong, Qin; Maye, Alexander
2017-01-01
A promising approach for brain-computer interfaces (BCIs) employs the steady-state visual evoked potential (SSVEP) for extracting control information. Main advantages of these SSVEP BCIs are a simple and low-cost setup, little effort to adjust the system parameters to the user, and comparatively high information transfer rates (ITR). However, traditional frequency-coded SSVEP BCIs require the user to gaze directly at the selected flicker stimulus, which is liable to cause fatigue or even photic epileptic seizures. The spatially coded SSVEP BCI we present in this article addresses this issue. It uses a single flicker stimulus that always appears in the extrafoveal field of view, yet it gives the user four control channels. We demonstrate the embedding of this novel SSVEP stimulation paradigm in the user interface of an online BCI for navigating a 2-dimensional computer game. Offline analysis of the training data reveals an average classification accuracy of 96.9±1.64%, corresponding to an information transfer rate of 30.1±1.8 bits/min. In online mode, the average classification accuracy reached 87.9±11.4%, which resulted in an ITR of 23.8±6.75 bits/min. We did not observe a strong relation between a subject's offline and online performance. Analysis of the online performance over time shows that users can reliably control the new BCI paradigm with stable performance over at least 30 minutes of continuous operation.
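The bits/min figures quoted above follow from the standard Wolpaw information-transfer-rate formula commonly used for SSVEP BCIs. The short check below applies that formula; the selection time is an assumed example value, since it is not reported in the abstract.

from math import log2

def itr_bits_per_min(n_targets, accuracy, selection_time_s):
    # Wolpaw ITR: bits per selection scaled by selections per minute.
    p = accuracy
    bits = log2(n_targets)
    if 0 < p < 1:
        bits += p * log2(p) + (1 - p) * log2((1 - p) / (n_targets - 1))
    return bits * 60.0 / selection_time_s

# Four control channels at 96.9% accuracy, assuming roughly 3.5 s per selection:
print(round(itr_bits_per_min(4, 0.969, 3.5), 1))   # ~30 bits/min, consistent with 30.1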
Survey of upper limb prosthesis users in Sweden, the United Kingdom and Canada.
Kyberd, Peter J; Hill, Wendy
2011-06-01
As part of the process of improving prosthetic arms, it is important to obtain the opinions of the user population. To identify factors that should be focused on to improve prosthesis provision. Postal questionnaire. The questionnaire was sent to 292 adults (aged 18 to 70 years) with upper-limb loss or absence at five centres (four in Europe). Participants were identified as regular attendees of the centres. This questionnaire received a response from 180 users (response rate 62%) of different types of prosthetic devices. Responses showed that the type of prosthesis generally used was associated with gender, level of loss and use for work (Pearson chi-square, p-values below 0.05). The type of prosthesis was not associated with cause, side, usage (length per day, sports or driving) or reported problems. The findings did not identify any single factor requiring focus for the improvement of prostheses or prosthetic provision. Every part of the process of fitting a prosthesis can be improved, which will have an effect for some of the population who use their devices regularly. There is, however, no single factor that would bring greater improvement to all users. Based on information gained from a broad range of prosthesis users, no single aspect of prosthetic provision will have a greater impact on the use of upper limb prostheses than any other. Efforts to improve the designs of prosthetic systems can cover any aspect of provision.
OAP- OFFICE AUTOMATION PILOT GRAPHICS DATABASE SYSTEM
NASA Technical Reports Server (NTRS)
Ackerson, T.
1994-01-01
The Office Automation Pilot (OAP) Graphics Database system offers the IBM PC user assistance in producing a wide variety of graphs and charts. OAP uses a convenient database system, called a chartbase, for creating and maintaining data associated with the charts, and twelve different graphics packages are available to the OAP user. Each of the graphics capabilities is accessed in a similar manner. The user chooses creation, revision, or chartbase/slide show maintenance options from an initial menu. The user may then enter or modify data displayed on a graphic chart. The cursor moves through the chart in a "circular" fashion to facilitate data entries and changes. Various "help" functions and on-screen instructions are available to aid the user. The user data is used to generate the graphics portion of the chart. Completed charts may be displayed in monotone or color, printed, plotted, or stored in the chartbase on the IBM PC. Once completed, the charts may be put in a vector format and plotted for color viewgraphs. The twelve graphics capabilities are divided into three groups: Forms, Structured Charts, and Block Diagrams. There are eight Forms available: 1) Bar/Line Charts, 2) Pie Charts, 3) Milestone Charts, 4) Resources Charts, 5) Earned Value Analysis Charts, 6) Progress/Effort Charts, 7) Travel/Training Charts, and 8) Trend Analysis Charts. There are three Structured Charts available: 1) Bullet Charts, 2) Organization Charts, and 3) Work Breakdown Structure (WBS) Charts. The Block Diagram available is an N x N Chart. Each graphics capability supports a chartbase. The OAP graphics database system provides the IBM PC user with an effective means of managing data which is best interpreted as a graphic display. The OAP graphics database system is written in IBM PASCAL 2.0 and assembler for interactive execution on an IBM PC or XT with at least 384K of memory, and a color graphics adapter and monitor. Printed charts require an Epson, IBM, OKIDATA, or HP Laser printer (or equivalent). Plots require the Tektronix 4662 Penplotter. Source code is supplied to the user for modification and customizing. Executables are also supplied for all twelve graphics capabilities. This system was developed in 1983, and Version 3.1 was released in 1986.
Development and Performance of the ACTS High Speed VSAT
NASA Technical Reports Server (NTRS)
Quintana, J.; Tran, Q.; Dendy, R.
1999-01-01
The Advanced Communication Technology Satellite (ACTS), developed by the U.S. National Aeronautics and Space Administration (NASA), has demonstrated the breakthrough technologies of Ka-band, spot beam antennas, and on-board processing. These technologies have enabled the development of very small aperture terminals (VSAT) and ultra-small aperture terminals (USAT) which have capabilities greater than were previously possible with conventional satellite technologies. However, the ACTS baseband processor (BBP) is designed using a time division multiple access (TDMA) scheme, which requires each earth station using the BBP to transmit data at a burst rate which is much higher than the user throughput data rate. This tends to mitigate the advantage of the new technologies by requiring a larger earth station antenna and/or a higher-powered uplink amplifier than would be necessary for a continuous transmission at the user data rate. Conversely, the user data rate is much less than the rate that can be supported by the antenna size and amplifier. For example, the ACTS T1 VSAT operates at a burst rate of 27.5 Mbps, but the maximum user data rate is 1.792 Mbps. The throughput efficiency is slightly more than 6.5%. For an operational network, this level of overhead will greatly increase the cost of the user earth stations, and that increased cost must be repeated thousands of times, which may ultimately reduce the market for such a system. The ACTS High Speed VSAT (HS VSAT) is an effort to experimentally demonstrate the maximum user throughput data rate which can be achieved using the technologies developed and implemented on ACTS. Specifically, this was done by operating the system uplinks as frequency division multiple access (FDMA), essentially assigning all available TDMA time slots to a single user on each of two uplink frequencies. Preliminary results show that using a 1.2-m antenna in this mode, the HS VSAT can achieve between 22 and 24 Mbps out of the 27.5 Mbps burst rate, for a throughput efficiency of 80-88%. This paper describes the modifications made to the T1 VSAT to enable it to operate at high speed, including hardware considerations, interface modifications, and software modifications. In addition, it describes the results of NASA HS VSAT experiments, continuing work on an improved user interface, and plans for future experiments.
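A quick back-of-the-envelope check of the efficiency figures quoted above, using only the rates given in the abstract (the small difference at the upper end versus the quoted 88% presumably reflects rounding or a slightly higher measured rate):

burst_rate = 27.5             # Mbps burst rate of the T1 VSAT
tdma_user_rate = 1.792        # Mbps maximum user data rate in TDMA mode
hs_vsat_rates = (22.0, 24.0)  # Mbps achieved by the High Speed VSAT in FDMA mode

print(f"TDMA throughput efficiency: {tdma_user_rate / burst_rate:.1%}")   # ~6.5%
low, high = (r / burst_rate for r in hs_vsat_rates)
print(f"HS VSAT throughput efficiency: {low:.0%} to {high:.0%}")          # ~80% to ~87%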
ERIC Educational Resources Information Center
Murphy, Molly; Franklin, Shelly; Raia, Ann
2007-01-01
Sooner Xpress service arose out of a need to improve and expand services for library users at the University of Oklahoma. After several years of service for our distance education students, a decision was made to expand those services to include all campus and local users in an effort to streamline retrieval services in the library. Both…
An Assessment of the Need for Standard Variable Names for Airborne Field Campaigns
NASA Astrophysics Data System (ADS)
Beach, A. L., III; Chen, G.; Northup, E. A.; Kusterer, J.; Quam, B. M.
2017-12-01
The NASA Earth Venture Program has led to a dramatic increase in airborne observations, requiring updated data management practices with clearly defined data standards and protocols for metadata. An airborne field campaign can involve multiple aircraft and a variety of instruments. It is quite common to have different instruments/techniques measure the same parameter on one or more aircraft platforms. This creates a need to allow instrument Principal Investigators (PIs) to name their variables in a way that would distinguish them across various data sets. A lack of standardization of variable names presents a challenge for data search tools in enabling discovery of similar data across airborne studies, aircraft platforms, and instruments. This was also identified by data users as one of the top issues in data use. One effective approach for mitigating this problem is to enforce variable name standardization, which can effectively map the unique PI variable names to fixed standard names. In order to ensure consistency amongst the standard names, it will be necessary to choose them from a controlled list. However, no such list currently exists despite a number of previous efforts to establish a sufficient list of atmospheric variable names. The Atmospheric Composition Variable Standard Name Working Group was established under the auspices of NASA's Earth Science Data Systems Working Group (ESDSWG) to solicit research community feedback to create a list of standard names that are acceptable to data providers and data users. This presentation will discuss the challenges and recommendations of standard variable names in an effort to demonstrate how airborne metadata curation/management can be improved to streamline data ingest and improve interoperability and discoverability for a broader user community.
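In practice, a controlled list of standard names amounts to a lookup from instrument-specific (PI) variable names to shared terms. The tiny sketch below illustrates the idea only; the PI names and standard names are invented examples, not entries from the working group's list.

standard_name = {
    "O3_ppbv_TECO49": "ozone_mixing_ratio",        # hypothetical PI name -> standard name
    "O3_UV_photometer": "ozone_mixing_ratio",
    "CO_ppbv_DACOM": "carbon_monoxide_mixing_ratio",
}

def to_standard(pi_variable: str) -> str:
    # Unmapped names are flagged so they can be added to the controlled list.
    return standard_name.get(pi_variable, "UNMAPPED")

print(to_standard("O3_UV_photometer"))   # ozone_mixing_ratio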
Hasegawa, Kohei; Sullivan, Ashley F; Tovar Hirashima, Eva; Gaeta, Theodore J; Fee, Christopher; Turner, Stuart J; Massaro, Susan; Camargo, Carlos A
2014-01-01
Despite the substantial burden of asthma-related emergency department (ED) visits, there have been no recent multicenter efforts to characterize this high-risk population. We aimed to characterize patients with asthma according to their frequency of ED visits and to identify factors associated with frequent ED visits. A multicenter chart review study of 48 EDs across 23 US states. We identified ED patients ages 18 to 54 years with acute asthma during 2011 and 2012. Primary outcome was frequency of ED visits for acute asthma in the past year, excluding the index ED visit. Of the 1890 enrolled patients, 863 patients (46%) had 1 or more (frequent) ED visits in the past year. Specifically, 28% had 1 to 2 visits, 11% had 3 to 5 visits, and 7% had 6 or more visits. Among frequent ED users, guideline-recommended management was suboptimal. For example, of patients with 6 or more ED visits, 85% lacked evidence of prior evaluation by an asthma specialist, and 43% were not treated with inhaled corticosteroids. In a multivariable model, significant predictors of frequent ED visits were public insurance, no insurance, and markers for chronic asthma severity (all P < .05). Stronger associations were found among those with a higher frequency of asthma-related ED visits (eg, 6 or more ED visits). This multicenter study of US adults with acute asthma demonstrated many frequent ED users and suboptimal preventive management in this high-risk population. Future reductions in asthma morbidity and associated health care utilization will require continued efforts to bridge these major gaps in asthma care. Copyright © 2014 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
User Experience Evaluation of a Smoking Cessation App in People With Serious Mental Illness
Rizo, Javier; Kientz, Julie A.; McDonell, Michael G.; Ries, Richard K.; Sobel, Kiley
2016-01-01
Introduction: Smoking rates among people with serious mental illness are 3 to 4 times higher than in the general population, yet currently there are no smoking cessation apps specifically designed to address this need. We report the results of a User Experience (UX) evaluation of a National Cancer Institute smoking cessation app, QuitPal, and provide user-centered design data that can be used to tailor smoking cessation apps for this population. Methods: Two hundred forty hours of field experience with QuitPal, 10 hours of recorded interviews and task performances, usage logs, and a self-reported usability scale informed the results of our study. Participants were five individuals recruited from a community mental health clinic with a reported serious mental illness history. Performance, self-reports, usage logs and interview data were triangulated to identify critical usability errors and UX themes emerging from this population. Results: Data suggest that QuitPal has below-average usability, elevated time-on-task performance, and requires considerable guidance. UX themes provided critical information to tailor smoking cessation apps for this population, such as the importance of breaking down “cessation” into smaller steps and use of a reward system. Conclusions: This is the first study to examine the UX of a smoking cessation app among people with serious mental illness. Data from this study will inform future research efforts to expand the effectiveness and reach of smoking cessation apps for this highly nicotine dependent yet under-served population. Implications: Data from this study will inform future research efforts to expand the effectiveness and reach of smoking cessation apps for people with serious mental illness, a highly nicotine dependent yet under-served population. PMID:26581430
Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift
Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias
2015-01-01
Integration of sensors into various kinds of products and machines provides access to in-depth usage information as basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable efforts in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives. PMID:26703606
Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift.
Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias
2015-12-19
Integration of sensors into various kinds of products and machines provides access to in-depth usage information as basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable efforts in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives.
Using a Java Web-based Graphical User Interface to access the SOHO Data Archive
NASA Astrophysics Data System (ADS)
Scholl, I.; Girard, Y.; Bykowski, A.
This paper presents the architecture of a Java web-based graphical interface dedicated to the access of the SOHO Data archive. This application allows local and remote users to search in the SOHO data catalog and retrieve the SOHO data files from the archive. It has been developed at MEDOC (Multi-Experiment Data and Operations Centre), located at the Institut d'Astrophysique Spatiale (Orsay, France), which is one of the European Archives for the SOHO data. This development is part of a joint effort between ESA, NASA and IAS in order to implement long term archive systems for the SOHO data. The software architecture is built as a client-server application using Java language and SQL above a set of components such as an HTTP server, a JDBC gateway, a RDBMS server, a data server and a Web browser. Since HTML pages and CGI scripts are not powerful enough to allow user interaction during a multi-instrument catalog search, this type of requirement enforces the choice of Java as the main language. We also discuss performance issues, security problems and portability on different Web browsers and operating systems.
Cross Support Transfer Service (CSTS) Framework Library
NASA Technical Reports Server (NTRS)
Ray, Timothy
2014-01-01
Within the Consultative Committee for Space Data Systems (CCSDS), there is an effort to standardize data transfer between ground stations and control centers. CCSDS plans to publish a collection of transfer services that will each address the transfer of a particular type of data (e.g., tracking data). These services will be called Cross Support Transfer Services (CSTSs). All of these services will make use of a common foundation that is called the CSTS Framework. This library implements the User side of the CSTS Framework. "User side" means that the library performs the role that is typically expected of the control center. This library was developed in support of the Goddard Data Standards program. This technology could be applicable for control centers, and possibly for use in control center simulators needed to test ground station capabilities. The main advantages of this implementation are its flexibility and simplicity. It provides the framework capabilities, while allowing the library user to provide a wrapper that adapts the library to any particular environment. The main purpose of this implementation was to support the inter-operability testing required by CCSDS. In addition, it is likely that the implementation will be useful within the Goddard mission community (for use in control centers).
FDA-Required Tobacco Product Inserts & Onserts–and the First Amendment.
Lindblom, Eric N; Berman, Micah L; Thrasher, James F
In 2012, a federal court of appeals struck down an FDA rule requiring graphic health warnings on cigarettes as violating First Amendment commercial speech protections. Tobacco product inserts and onserts can more readily avoid First Amendment constraints while delivering more extensive information to tobacco users, and can work effectively to support and encourage smoking cessation. This paper examines FDA’s authority to require effective inserts and onserts and shows how FDA could design and support them to avoid First Amendment problems. Through this process, the paper offers helpful insights regarding how key Tobacco Control Act provisions can and should be interpreted and applied to follow and promote the statute’s purposes and objectives. The paper’s rigorous analysis of existing First Amendment case law relating to compelled commercial speech also provides useful guidance for any government efforts either to compel product disclosures or to require government messaging in or on commercial products or their advertising, whether done for remedial, purely informational, or behavior modification purposes.
Vadnais, Carolyn; Stensaas, Gregory
2014-01-01
Under the National Land Imaging Requirements (NLIR) Project, the U.S. Geological Survey (USGS) is developing a functional capability to obtain, characterize, manage, maintain and prioritize all Earth observing (EO) land remote sensing user requirements. The goal is a better understanding of community needs that can be supported with land remote sensing resources, and a means to match needs with appropriate solutions in an effective and efficient way. The NLIR Project is composed of two components. The first component is focused on the development of the Earth Observation Requirements Evaluation System (EORES) to capture, store and analyze user requirements, whereas the second component is the mechanism and processes to elicit and document the user requirements that will populate the EORES. To develop the second component, the requirements elicitation methodology was exercised and refined through a pilot project conducted from June to September 2013. The pilot project focused specifically on applications and user requirements for moderate resolution imagery (5–120 meter resolution) as the test case for requirements development. The purpose of this summary report is to provide a high-level overview of the requirements elicitation process that was exercised through the pilot project and an early analysis of the moderate resolution imaging user requirements acquired to date to support ongoing USGS sustainable land imaging study needs. The pilot project engaged a limited set of Federal Government users from the operational and research communities, and therefore the information captured represents only a subset of all land imaging user requirements. However, based on a comparison of results, trends, and analysis, the pilot captured a strong baseline of typical application areas and user needs for moderate resolution imagery. Because these results are preliminary and represent only a sample of users and application areas, the information from this report should only be used to indicate general user needs for the applications covered. Users of the information are cautioned that use of specific numeric results may be inappropriate without additional research. Any information used or cited from this report should specifically be cited as preliminary findings.
Crowd science user contribution patterns and their implications
Sauermann, Henry; Franzoni, Chiara
2015-01-01
Scientific research performed with the involvement of the broader public (the crowd) attracts increasing attention from scientists and policy makers. A key premise is that project organizers may be able to draw on underused human resources to advance research at relatively low cost. Despite a growing number of examples, systematic research on the effort contributions volunteers are willing to make to crowd science projects is lacking. Analyzing data on seven different projects, we quantify the financial value volunteers can bring by comparing their unpaid contributions with counterfactual costs in traditional or online labor markets. The volume of total contributions is substantial, although some projects are much more successful in attracting effort than others. Moreover, contributions received by projects are very uneven across time—a tendency toward declining activity is interrupted by spikes typically resulting from outreach efforts or media attention. Analyzing user-level data, we find that most contributors participate only once and with little effort, leaving a relatively small share of users who return responsible for most of the work. Although top contributor status is earned primarily through higher levels of effort, top contributors also tend to work faster. This speed advantage develops over multiple sessions, suggesting that it reflects learning rather than inherent differences in skills. Our findings inform recent discussions about potential benefits from crowd science, suggest that involving the crowd may be more effective for some kinds of projects than others, provide guidance for project managers, and raise important questions for future research. PMID:25561529
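The counterfactual-cost idea in the abstract above reduces to valuing volunteer hours at the wage those hours would command in a traditional or online labor market. The sketch below is purely illustrative; the project names, hours, and hourly rates are invented, not figures from the paper.

volunteer_hours = {"project_A": 12500, "project_B": 800}          # assumed totals
hourly_rate = {"research_assistant": 18.0, "online_microtask": 6.0}  # assumed USD/hour

for project, hours in volunteer_hours.items():
    for market, rate in hourly_rate.items():
        # counterfactual cost = what the same effort would cost if paid at market rates
        print(f"{project}: ${hours * rate:,.0f} at {market} rates")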
Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans
2015-02-01
The Department of Energy (DOE) has made significant progress in developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the ‘Center’) will be a resource for industry, DOE Programs, and academia validation efforts.
User-Centered Design Strategies for Massive Open Online Courses (MOOCs)
ERIC Educational Resources Information Center
Mendoza-Gonzalez, Ricardo, Ed.
2016-01-01
In today's society, educational opportunities have evolved beyond the traditional classroom setting. Most universities have implemented virtual learning environments in an effort to provide more opportunities for potential or current students seeking alternative and more affordable learning solutions. "User-Centered Design Strategies for…
International epidemiology of HIV and AIDS among injecting drug users.
Des Jarlais, D C; Friedman, S R; Choopanya, K; Vanichseni, S; Ward, T P
1992-10-01
HIV/AIDS and IV drug use (IVDU) are of significant multinational scope and growing. Supporting increased IVDU in many countries are countries' geographical proximity to illicit drug trafficking distribution routes, law enforcement efforts which increase the demand for more efficient drug distribution and consumption, and countries' infrastructural and social modernization. Given the failures of intensified law enforcement efforts to thwart the use and proliferation of illegal drugs, countries with substantial IVDU should shift their focus from preventing drug use to preventing HIV transmission within drug user populations. With HIV seroprevalence rates rapidly reaching 40-50% in some developing country IVDU groups, a variety of prevention programs is warranted. Such programs should be supported and implemented while prevention remains feasible. This paper examines the variation in HIV seroprevalence among IVD users, rapid HIV spread among users, HIV among IVDUs in Bangkok, emerging issues in HIV transmission among IVDUs, non-AIDS manifestations of HIV infection among IVDUs, prevention programs and effectiveness, and harm reduction.
Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide
NASA Astrophysics Data System (ADS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.
1993-03-01
A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960s and early 1970s and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with the NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.
Transforming data into usable knowledge: the CIRC experience
NASA Astrophysics Data System (ADS)
Mote, P.; Lach, D.; Hartmann, H.; Abatzoglou, J. T.; Stevenson, J.
2017-12-01
NOAA's northwest RISA, the Climate Impacts Research Consortium, emphasizes the transformation of data into usable knowledge. This effort involves physical scientists (e.g., Abatzoglou) building web-based tools with climate and hydrologic data and model output, a team performing data mining to link crop loss claims to droughts, social scientists (e.g., Lach, Hartmann) evaluating the effectiveness of such tools at communicating with end users, and two-way engagement with a wide variety of audiences who are interested in using and improving the tools. Unusual in this effort is the seamless integration across timescales past, present, and future; data mining; and the level of effort in evaluating the tools. We provide examples of agriculturally relevant climate variables (e.g., growing degree days, day of first fall freeze) and describe the iterative process of incorporating user feedback.
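For instance, growing degree days, one of the agriculturally relevant variables named above, can be computed from daily temperature extremes. The sketch below uses the simple averaging method with an assumed base temperature and made-up data; it is not CIRC's tool code.

```python
# Illustrative sketch of one agriculturally relevant variable: cumulative growing
# degree days (GDD) from daily min/max temperatures. Base temperature and the
# sample data are assumptions for demonstration only.

def growing_degree_days(tmax_c, tmin_c, base_c=10.0):
    """Single-day GDD using the simple averaging method."""
    return max(0.0, (tmax_c + tmin_c) / 2.0 - base_c)

daily = [(24.0, 11.0), (27.5, 13.0), (19.0, 8.5)]  # (tmax, tmin) in deg C
cumulative = sum(growing_degree_days(tmax, tmin) for tmax, tmin in daily)
print(f"Cumulative GDD over {len(daily)} days: {cumulative:.1f}")
```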
The Changing Conduct of Geoscience in a Data Intensive World (Ian McHarg Medal Lecture)
NASA Astrophysics Data System (ADS)
Fox, P.
2012-04-01
Electronic facilitation of scientific research (often called eResearch or eScience) is increasingly prevalent in geosciences. Among the consequences of new and diversifying means of complex (*) data generation is that as many branches of science have become data-intensive (so-called fourth paradigm), they in turn broaden their long-tail distributions: smaller volume, but often complex data, will always lead to excellent science. There are many familiar informatics functions that enable the conduct of science (by specialists or non-specialists) in this new regime. For example, the need for any user to be able to discover relations among and between the results of data analyses and informational queries. Unfortunately, true science exploration, for example visual discovery, over complex data remains more of an art form than an easily conducted practice. In general, the resource costs of creating useful visualizations have been increasing. Less than 10 years ago, it was assessed that data-centric science required a rough split between the time to generate, analyze, and publish data and the science based on that data. Today however, the visualization and analysis component has become a bottleneck, requiring considerably more of the overall effort, and this trend will continue. Potentially even worse is the choice to simplify analyses to 'get the work out'. Extra effort to make data understandable, something that should be routine, is now consuming considerable resources that could be used for many other purposes. It is now time to change that trend. This contribution lays out informatics paths for truly 'exploratory' conduct of science cast in the present and rapidly changing reality of Web/Internet-based data and software infrastructures. A logical consequence of these paths is that the people working in this new mode of research, i.e. data scientists, require additional and different education to become effective and routine users of new informatics capabilities. One goal is to achieve the same fluency that researchers may have in lab techniques, instrument utilization, model development and use, etc. Thus, in conclusion, curriculum and skill requirements for data scientists will be presented and discussed. * complex/intensive = large volume, multi-scale, multi-modal, multi-dimensional, multi-disciplinary, and heterogeneous structure.
Increasing the Automation and Autonomy for Spacecraft Operations with Criteria Action Table
NASA Technical Reports Server (NTRS)
Li, Zhen-Ping; Savki, Cetin
2005-01-01
The Criteria Action Table (CAT) is an automation tool developed for monitoring real time system messages for specific events and processes in order to take user-defined actions based on a set of user-defined rules. CAT was developed by Lockheed Martin Space Operations as a part of a larger NASA effort at the Goddard Space Flight Center (GSFC) to create a component-based, middleware-based, and standard-based general purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. CAT has been integrated into the upgraded ground systems for the Tropical Rainfall Measuring Mission (TRMM) and Small Explorer (SMEX) satellites and it plays the central role in their automation effort to reduce the cost and increase the reliability of spacecraft operations. The GMSEC architecture provides a standard communication interface and protocol for components to publish/subscribe messages to an information bus. It also provides a standard message definition so components can send and receive messages to the bus interface rather than each other, thus reducing the component-to-component coupling, interface, protocols, and link (socket) management. With the GMSEC architecture, components can publish standard event messages to the bus for all nominal, significant, and surprising events in regard to satellite, celestial, ground system, or any other activity. In addition to sending standard event messages, each GMSEC compliant component is required to accept and process GMSEC directive request messages.
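The criteria/action pattern the abstract describes can be pictured as a table of user-defined rules applied to each published event message. The sketch below is a hypothetical illustration; the message fields, rule format, and actions are assumptions rather than the GMSEC or CAT formats.

```python
# Minimal sketch of a criteria/action table: user-defined rules match fields of
# published event messages and trigger actions. All fields and rules are invented.

rules = [
    {"criteria": {"subject": "TLM.LIMIT", "severity": "RED"},
     "action": lambda msg: print(f"Paging on-call engineer: {msg['text']}")},
    {"criteria": {"subject": "PASS.START"},
     "action": lambda msg: print("Starting automated pass procedures")},
]

def dispatch(message, rules):
    """Apply every rule whose criteria all match fields of the message."""
    for rule in rules:
        if all(message.get(k) == v for k, v in rule["criteria"].items()):
            rule["action"](message)

dispatch({"subject": "TLM.LIMIT", "severity": "RED", "text": "Battery temp high"}, rules)
```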
Vali, Faisal; Hong, Robert
2007-10-11
With the evolution of AJAX, Ruby on Rails, advanced dynamic XHTML technologies and the advent of powerful user interface libraries for JavaScript (EXT, Yahoo User Interface Library), developers now have the ability to provide truly rich interfaces within web browsers, with reasonable effort and without third-party plugins. We designed and developed an example of such a solution. The user interface allows radiation oncology practices to intuitively manage different dose fractionation schemes by helping estimate total dose to irradiated organs.
Tensoral for post-processing users and simulation authors
NASA Technical Reports Server (NTRS)
Dresselhaus, Eliot
1993-01-01
The CTR post-processing effort aims to make turbulence simulations and data more readily and usefully available to the research and industrial communities. The Tensoral language, which provides the foundation for this effort, is introduced here in the form of a user's guide. The Tensoral user's guide is presented in two main sections. Section one acts as a general introduction and guides database users who wish to post-process simulation databases. Section two gives a brief description of how database authors and other advanced users can make simulation codes and/or the databases they generate available to the user community via Tensoral database back ends. The two-part structure of this document conforms to the two-level design structure of the Tensoral language. Tensoral has been designed to be a general computer language for performing tensor calculus and statistics on numerical data. Tensoral's generality allows it to be used for stand-alone native coding of high-level post-processing tasks (as described in section one of this guide). At the same time, Tensoral's specialization to a minute task (namely, to numerical tensor calculus and statistics) allows it to be easily embedded into applications written partly in Tensoral and partly in other computer languages (here, C and Vectoral). Embedded Tensoral, aimed at advanced users for more general coding (e.g. of efficient simulations, for interfacing with pre-existing software, for visualization, etc.), is described in section two of this guide.
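The following is not Tensoral code; it is a minimal NumPy sketch of the kind of tensor statistics a post-processing user would express in the language, here the mean and RMS fluctuation of a synthetic velocity field.

```python
# Sketch of a typical turbulence post-processing statistic (not Tensoral itself):
# compute the mean velocity and RMS fluctuations over a synthetic 3-D field.

import numpy as np

rng = np.random.default_rng(0)
u = 1.0 + 0.1 * rng.standard_normal((64, 64, 64))  # synthetic streamwise velocity

u_mean = u.mean()                              # volume average
u_rms = np.sqrt(((u - u_mean) ** 2).mean())    # fluctuation intensity
print(f"mean={u_mean:.3f}, rms={u_rms:.3f}")
```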
A Generic Evaluation Model for Semantic Web Services
NASA Astrophysics Data System (ADS)
Shafiq, Omair
Semantic Web Services research has gained momentum over the last few years and by now several realizations exist. They are being used in a number of industrial use-cases. Soon software developers will be expected to use this infrastructure to build their B2B applications requiring dynamic integration. However, there is still a lack of guidelines for the evaluation of tools developed to realize Semantic Web Services and applications built on top of them. In normal software engineering practice such guidelines can already be found for traditional component-based systems. Also some efforts are being made to build performance models for service-based systems. Drawing on these related efforts in component-oriented and service-based systems, we identified the need for a generic evaluation model for Semantic Web Services applicable to any realization. The generic evaluation model will help users and customers to orient their systems and solutions towards using Semantic Web Services. In this chapter, we have presented the requirements for the generic evaluation model for Semantic Web Services and further discussed the initial steps that we took to sketch such a model. Finally, we discuss related activities for evaluating semantic technologies.
NASA Astrophysics Data System (ADS)
Gao, Guoyou; Jiang, Chunsheng; Chen, Tao; Hui, Chun
2018-05-01
Industrial robots are widely used in various processes of surface manufacturing, such as thermal spraying. The established robot programming methods are highly time-consuming and not accurate enough to fulfil the demands of the actual market. Many off-line programming methods have been developed to reduce the robot programming effort. This work introduces the principles of several robot trajectory generation strategies for planar and curved surfaces. Since off-line programming software is widely used, facilitates the robot programming effort, and improves the accuracy of the robot trajectory, the analysis in this work is based on the secondary development of the off-line programming software RobotStudio™. To meet the requirements of the automotive paint industry, this kind of software extension helps provide special functions according to user-defined operation parameters. The presented planning strategy generates the robot trajectory by moving an orthogonal surface according to the information of the coating surface; a series of intersection curves is then employed to generate the trajectory points. The simulation results show that the path curve created with this method is continuous and smooth, which corresponds to the requirements of automotive spray industrial applications.
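As an illustration of the kind of trajectory generation discussed above, the sketch below builds a simple raster spray path over a planar surface from user-defined parameters. The parameter values and the simplification to a plane are assumptions, not the paper's RobotStudio™ extension.

```python
# Hypothetical sketch: generate a zig-zag spray trajectory over a planar surface
# from user-defined path spacing, point spacing, and standoff distance.

def planar_spray_path(width, length, spacing, step, standoff):
    """Return (x, y, z) trajectory points raster-scanning a width x length plane."""
    points = []
    n_rows = int(round(length / spacing)) + 1
    n_cols = int(round(width / step)) + 1
    for row in range(n_rows):
        y = row * spacing
        xs = [col * step for col in range(n_cols)]
        if row % 2 == 1:            # reverse every other pass to form a zig-zag
            xs.reverse()
        points.extend((x, y, standoff) for x in xs)
    return points

path = planar_spray_path(width=0.4, length=0.3, spacing=0.05, step=0.02, standoff=0.2)
print(f"{len(path)} trajectory points, first: {path[0]}, last: {path[-1]}")
```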
Low Cost Environmental Sensors for Spaceflight: NMP Space Environmental Monitor (SEM) Requirements
NASA Technical Reports Server (NTRS)
Garrett, Henry B.; Buehler, Martin G.; Brinza, D.; Patel, J. U.
2005-01-01
An outstanding problem in spaceflight is the lack of adequate sensors for monitoring the space environment and its effects on engineering systems. By adequate, we mean low cost in terms of mission impact (e.g., low price, low mass/size, low power, low data rate, and low design impact). The New Millennium Program (NMP) is investigating the development of such a low-cost Space Environmental Monitor (SEM) package for inclusion on its technology validation flights. This effort follows from the need by NMP to characterize the space environment during testing so that potential users can extrapolate the test results to end-use conditions. The immediate objective of this effort is to develop a small diagnostic sensor package that could be obtained from commercial sources. Environments being considered are: contamination, atomic oxygen, ionizing radiation, cosmic radiation, EMI, and temperature. This talk describes the requirements and rationale for selecting these environments and reviews a preliminary design that includes a micro-controller data logger with data storage and interfaces to the sensors and spacecraft. If successful, such a sensor package could be the basis of a unique, long-term program for monitoring the effects of the space environment on spacecraft systems.
Improving computer security by health smart card.
Nisand, Gabriel; Allaert, François-André; Brézillon, Régine; Isphording, Wilhem; Roeslin, Norbert
2003-01-01
The University Hospitals of Strasbourg have worked for several years on the computer security of medical data and have therefore been the first to use the Health Care Professional Smart Card (CPS). This new tool must provide security to the information processing systems and especially to the medical data exchanges between the partners who collaborate in the care of the patient. Beyond the purely data-processing aspects of the safety functions offered by the CPS, safety depends above all on the practices of the users: their knowledge concerning the legislation, the risks and the stakes, and their adherence to the procedures and protection measures in place. The aim of this study is to evaluate this level of knowledge, the practices and the feelings of the users concerning the computer security of medical data, to check the relevance of the approach taken, and if required, to try to improve it. The survey by questionnaires involved 648 users. The practices of users in terms of data security are clearly improved by the implementation of the security server and the use of the CPS system, but security breaches due to bad practices are not completely eliminated. This confirms that it is illusory to believe that data security is first and foremost a technical issue. Technical measures are of course indispensable, but the greatest efforts are required after their implementation and consist in making the key players [2], i.e. users, aware and responsible. However, it must be stressed that the user-friendliness of the security interface has a major effect on the results observed. For instance, it is highly probable that the bad practices continued or introduced upon the implementation of the security server and CPS scheme are due to the complicated nature or functional defects of the proposed solution, which must therefore be improved. Besides, this is only the pilot phase, and card holders can be expected to become more responsible as time goes by, along with the gradual national implementation of the CPS project and the introduction of new functions using electronic signatures and encryption.
Usability Guidelines for Product Recommenders Based on Example Critiquing Research
NASA Astrophysics Data System (ADS)
Pu, Pearl; Faltings, Boi; Chen, Li; Zhang, Jiyong; Viappiani, Paolo
Over the past decade, our group has developed a suite of decision tools based on example critiquing to help users find their preferred products in e-commerce environments. In this chapter, we survey important usability research work relative to example critiquing and summarize the major results by deriving a set of usability guidelines. Our survey is focused on three key interaction activities between the user and the system: the initial preference elicitation process, the preference revision process, and the presentation of the system's recommendation results. To provide a basis for the derivation of the guidelines, we developed a multi-objective framework of three interacting criteria: accuracy, confidence, and effort (ACE). We use this framework to analyze our past work and provide a specific context for each guideline: when the system should maximize its ability to increase users' decision accuracy, when to increase user confidence, and when to minimize the interaction effort for the users. Due to the general nature of this multi-criteria model, the set of guidelines that we propose can be used to ease the usability engineering process of other recommender systems, especially those used in e-commerce environments. The ACE framework presented here is also the first in the field to evaluate the performance of preference-based recommenders from a user-centric point of view.
TADS--A CFD-Based Turbomachinery Analysis and Design System with GUI: User's Manual. 2.0
NASA Technical Reports Server (NTRS)
Koiro, M. J.; Myers, R. A.; Delaney, R. A.
1999-01-01
The primary objective of this study was the development of a Computational Fluid Dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a Graphical User Interface (GUI). The computer codes resulting from this effort are referred to as TADS (Turbomachinery Analysis and Design System). This document is intended to serve as a User's Manual for the computer programs which comprise the TADS system, developed under Task 18 of NASA Contract NAS3-27350, ADPAC System Coupling to Blade Analysis & Design System GUI, and Task 10 of NASA Contract NAS3-27394, ADPAC System Coupling to Blade Analysis & Design System GUI, Phase II-Loss, Design, and Multi-stage Analysis. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis and design capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of the various programs was done in such a way that alternative solvers or grid generators could be easily incorporated into the TADS framework. Results of aerodynamic calculations using the TADS system are presented for a highly loaded fan, a compressor stator, a low speed turbine blade and a transonic turbine vane.
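A minimal sketch of the coupling idea follows, with toy stand-in functions in place of the ADPAC and RVCQ3D solvers (their real interfaces are not reproduced here): the throughflow and blade-to-blade solutions are alternated until the blade force stops changing.

```python
# Toy illustration of alternating two coupled solvers to a fixed point.
# The solver functions below are invented stand-ins, not ADPAC or RVCQ3D calls.

def throughflow_solve(blade_force):
    """Stand-in for a throughflow pass: returns a blade-loading estimate."""
    return 0.5 * blade_force + 1.0

def blade_to_blade_solve(loading):
    """Stand-in for a quasi-3D blade-to-blade pass: returns an updated blade force."""
    return 0.8 * loading

blade_force = 0.0
for iteration in range(50):
    loading = throughflow_solve(blade_force)
    new_force = blade_to_blade_solve(loading)
    if abs(new_force - blade_force) < 1e-6:
        break
    blade_force = new_force
print(f"converged after {iteration + 1} iterations, blade force = {blade_force:.4f}")
```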
Software and Systems Producibility Collaboration and Experimentation Environment (SPRUCE)
2014-04-01
represent course materials and assignments from Vanderbilt University’s Dr. Gokhale’s courses. 3.2.4. Communities of Interest Current list of...blogging platforms of Twitter, Facebook and LinkedIn today, these user interactions represent low-effort means for users to start getting involved.
Progress in The Semantic Analysis of Scientific Code
NASA Technical Reports Server (NTRS)
Stewart, Mark
2000-01-01
This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
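A toy example of the idea, under the assumption that semantic declarations take the form of physical units attached to primitive variables (the variable names and the check are invented, not the paper's parsers):

```python
# Attach semantic declarations (here, physical units) to primitive variables and
# flag statements whose operands disagree. Names and units are assumptions.

declarations = {"rho": "kg/m^3", "u": "m/s", "p": "Pa", "dt": "s"}

def check_addition(var_a, var_b):
    """Adding quantities with different units is a likely semantic error."""
    ua, ub = declarations[var_a], declarations[var_b]
    if ua != ub:
        print(f"possible semantic error: {var_a} [{ua}] + {var_b} [{ub}]")
    else:
        print(f"ok: {var_a} + {var_b} share unit [{ua}]")

check_addition("p", "rho")  # flagged
check_addition("dt", "dt")  # ok
```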
Nonlinear and progressive failure aspects of transport composite fuselage damage tolerance
NASA Technical Reports Server (NTRS)
Walker, Tom; Ilcewicz, L.; Murphy, Dan; Dopker, Bernhard
1993-01-01
The purpose is to provide an end-user's perspective on the state of the art in life prediction and failure analysis by focusing on subsonic transport fuselage issues being addressed in the NASA/Boeing Advanced Technology Composite Aircraft Structure (ATCAS) contract and a related task-order contract. First, some discrepancies between the ATCAS tension-fracture test database and classical prediction methods are discussed, followed by an overview of material modeling work aimed at explaining some of these discrepancies. Finally, analysis efforts associated with a pressure-box test fixture are addressed, as an illustration of modeling complexities required to model and interpret tests.
Failure modes and effects analysis automation
NASA Technical Reports Server (NTRS)
Kamhieh, Cynthia H.; Cutts, Dannie E.; Purves, R. Byron
1988-01-01
A failure modes and effects analysis (FMEA) assistant was implemented as a knowledge based system and will be used during design of the Space Station to aid engineers in performing the complex task of tracking failures throughout the entire design effort. The three major directions in which automation was pursued were the clerical components of the FMEA process, the knowledge acquisition aspects of FMEA, and the failure propagation/analysis portions of the FMEA task. The system is accessible to design, safety, and reliability engineers at single user workstations and, although not designed to replace conventional FMEA, it is expected to decrease by many man years the time required to perform the analysis.
Requirements UML Tool (RUT) Expanded for Extreme Programming (CI02)
NASA Technical Reports Server (NTRS)
McCoy, James R.
2003-01-01
A procedure for capturing and managing system requirements that incorporates XP user stories is described. Because costs associated with identifying problems in requirements increase dramatically over the lifecycle of a project, a method for identifying sources of software risk in user stories is urgently needed. This initiative aims to determine a set of guidelines for user stories that will result in high-quality requirements. To further this initiative, a tool is needed to analyze user stories that can assess the quality of individual user stories, detect sources of software risk, produce software metrics, and identify areas in user stories that can be improved.
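A hypothetical sketch of the kind of user-story analysis such a tool might perform is shown below; the vague-term list and the risk score are illustrative assumptions, not the RUT implementation.

```python
# Flag vague or unverifiable wording in a user story as a potential source of
# requirements risk. The indicator list and metric are invented for illustration.

VAGUE_TERMS = {"fast", "easy", "user-friendly", "appropriate", "etc", "flexible"}

def story_risk(story: str):
    words = {w.strip(".,").lower() for w in story.split()}
    hits = sorted(words & VAGUE_TERMS)
    return {"vague_terms": hits, "risk_score": len(hits), "length": len(story.split())}

print(story_risk("As an operator, I want a fast, user-friendly display of telemetry, etc."))
```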
Farzandipour, Mehrdad; Meidani, Zahra; Riazi, Hossein; Sadeqi Jabali, Monireh
2018-09-01
There are various approaches to evaluating the usability of electronic medical record (EMR) systems. User perspectives are an integral part of evaluation. Usability evaluations efficiently and effectively contribute to user-centered design, support tasks, and increase user satisfaction. This study determined the main usability requirements for EMRs by means of an end-user survey. A mixed-method strategy was conducted in three phases. A qualitative approach was employed to collect and formulate EMR usability requirements using the focus group method and the modified Delphi technique. The classic Delphi technique was used to evaluate the proposed requirements among 380 end-users in Iran. The final list of EMR usability requirements was verified and included 163 requirements divided into nine groups. The highest rates of end-user agreement relate to EMR visual clarity (3.65 ± 0.61), fault tolerance (3.58 ± 0.56), and suitability for learning (3.55 ± 0.54). The lowest end-user agreement was for auditory presentation (3.18 ± 0.69). The highest and lowest agreement among end-users was for visual clarity and auditory presentation by EMRs, respectively. This suggests that user priorities in determination of EMR usability and their understanding of the importance of the types of individual tasks and context characteristics differ.
Hughes, Sarah E; Hutchings, Hayley A; Rapport, Frances L; McMahon, Catherine M; Boisvert, Isabelle
2018-02-08
Individuals with hearing loss often report a need for increased effort when listening, particularly in challenging acoustic environments. Despite audiologists' recognition of the impact of listening effort on individuals' quality of life, there are currently no standardized clinical measures of listening effort, including patient-reported outcome measures (PROMs). To generate items and content for a new PROM, this qualitative study explored the perceptions, understanding, and experiences of listening effort in adults with severe-profound sensorineural hearing loss before and after cochlear implantation. Three focus groups (1 to 3) were conducted. Purposive sampling was used to recruit 17 participants from a cochlear implant (CI) center in the United Kingdom. The participants included adults (n = 15, mean age = 64.1 years, range 42 to 84 years) with acquired severe-profound sensorineural hearing loss who satisfied the UK's national candidacy criteria for cochlear implantation and their normal-hearing significant others (n = 2). Participants were CI candidates who used hearing aids (HAs) and were awaiting CI surgery or CI recipients who used a unilateral CI or a CI and contralateral HA (CI + HA). Data from a pilot focus group conducted with 2 CI recipients were included in the analysis. The data, verbatim transcripts of the focus group proceedings, were analyzed qualitatively using constructivist grounded theory (GT) methodology. A GT of listening effort in cochlear implantation was developed from participants' accounts. The participants provided rich, nuanced descriptions of the complex and multidimensional nature of their listening effort. Interpreting and integrating these descriptions through GT methodology, listening effort was described as the mental energy required to attend to and process the auditory signal, as well as the effort required to adapt to, and compensate for, a hearing loss. Analyses also suggested that listening effort for most participants was motivated by a need to maintain a sense of social connectedness (i.e., the subjective awareness of being in touch with one's social world). Before implantation, low social connectedness in the presence of high listening effort encouraged self-alienating behaviors and resulted in social isolation with adverse effects for participants' well-being and quality of life. A CI moderated but did not remove the requirement for listening effort. Listening effort, in combination with the improved auditory signal supplied by the CI, enabled most participants to listen and communicate more effectively. These participants reported a restored sense of social connectedness and an acceptance of the continued need for listening effort. Social connectedness, effort-reward balance, and listening effort as a multidimensional phenomenon were the core constructs identified as important to participants' experiences and understanding of listening effort. The study's findings suggest: (1) perceived listening effort is related to social and psychological factors and (2) these factors may influence how individuals with hearing loss report on the actual cognitive processing demands of listening. These findings provide evidence in support of the Framework for Understanding Effortful Listening, a heuristic that describes listening effort as a function of both motivation and demands on cognitive capacity. This GT will inform item development and establish the content validity for a new PROM for measuring listening effort.
Expanding Access to NCAR's Digital Assets: Towards a Unified Scientific Data Management System
NASA Astrophysics Data System (ADS)
Stott, D.
2016-12-01
In 2014 the National Center for Atmospheric Research (NCAR) Directorate created the Data Stewardship Engineering Team (DSET) to plan and implement the strategic vision of an integrated front door for data discovery and access across the organization, including all laboratories, the library, and UCAR Community Programs. The DSET is focused on improving the quality of users' experiences in finding and using NCAR's digital assets. This effort also supports new policies included in federal mandates, NSF requirements, and journal publication rules. An initial survey with 97 respondents identified 68 persons responsible for more than 3 petabytes of data. An inventory, using the Data Asset Framework produced by the UK Digital Curation Centre as a starting point, identified asset types that included files and metadata, publications, images, and software (visualization, analysis, model codes). User story sessions with representatives from each lab identified and ranked desired features for a unified Scientific Data Management System (SDMS). A process beginning with an organization-wide assessment of metadata by the HDF Group and followed by meetings with labs to identify key documentation concepts, culminated in the development of an NCAR metadata dialect that leverages the DataCite and ISO 19115 standards. The tasks ahead are to build out an SDMS and populate it with rich standardized metadata. Software packages have been prototyped and currently are being tested and reviewed by DSET members. Key challenges for the DSET include technical and non-technical issues. First, the status quo with regard to how assets are managed varies widely across the organization. There are differences in file format standards, technologies, and discipline-specific vocabularies. Metadata diversity is another real challenge. The types of metadata, the standards used, and the capacity to create new metadata varies across the organization. Significant effort is required to develop tools to create new standard metadata across the organization, adapt and integrate current digital assets, and establish consistent data management practices going forward. To be successful, best practices must be infused into daily activities. This poster will highlight the processes, lessons learned, and current status of the DSET effort at NCAR.
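As a loose illustration of the standardized metadata concept, the sketch below checks a toy record against a DataCite-style set of required fields; the field list and record values are assumptions, not NCAR's actual dialect.

```python
# Hypothetical completeness check for a DataCite-style metadata record.
# Field names and the example record are illustrative assumptions only.

REQUIRED = {"identifier", "creators", "title", "publisher", "publicationYear", "resourceType"}

record = {
    "identifier": "doi:10.5065/EXAMPLE",
    "creators": ["Example, Researcher"],
    "title": "Hypothetical field campaign dataset",
    "publisher": "NCAR",
    "publicationYear": 2016,
    "resourceType": "Dataset",
}

missing = REQUIRED - record.keys()
print("record complete" if not missing else f"missing fields: {sorted(missing)}")
```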
Development of a portable multispectral thermal infrared camera
NASA Technical Reports Server (NTRS)
Osterwisch, Frederick G.
1991-01-01
The purpose of this research and development effort was to design and build a prototype instrument designated the 'Thermal Infrared Multispectral Camera' (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument designated AA465 has application in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0 - 13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5 micron wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display to be used by the operator to identify spectral and spatial variations in the scene's emissivity and/or irradiance. This fundamental instrument capability has a wide variety of commercial and research applications. While ideally suited for two-man operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) instrument exposure times. As such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument was performed during the period of June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program. The instrument was delivered to NASA in March 1991. Potential commercial and research uses for this instrument are in its primary application as a field geologist's exploration tool. Other applications have been suggested but not investigated in depth. These are measurements of process control in commercial materials processing and quality control functions which require information on surface heterogeneity.
Status of multijunction solar cells
NASA Technical Reports Server (NTRS)
Yeh, Y. C. M.; Chu, C. L.
1996-01-01
This paper describes Applied Solar's present activity on Multijunction (MJ) space cells. We have worked on a variety of MJ cells, both monolithic and mechanically stacked. In recent years, most effort has been directed to GaInP2/GaAs monolithic cells, grown on Ge substrates, and the status of this cell design will be reviewed here. MJ cells are in demand to provide satellite power because of the acceptance of the overwhelming importance of high efficiency to reduce the area, weight and cost of space PV power systems. The need for high efficiencies has already accelerated the production of GaAs/Ge cells, with efficiencies of 18.5-19%. When users realized that MJ cells could provide higher efficiencies (from 22% to 26%) with only a fractional increase in costs, the demand for production MJ cells increased rapidly. The main purpose of the work described is to transfer the MOCVD growth technology of MJ high efficiency cells to a production environment, providing all the space requirements of users.
D'Erchia, Frank; Korschgen, Carl E.; Nyquist, M.; Root, Ralph; Sojda, Richard S.; Stine, Peter
2001-01-01
Workshops in the late 1990s launched the commitment of the U.S. Geological Survey's Biological Resources Division (BRD) to develop and implement decision support systems (DSS) applications. One of the primary goals of this framework document is to provide sufficient background and information for Department of the Interior (DOI) bureau stakeholders and other clients to determine the potential for DSS development. Such an understanding can assist them in carrying out effective land planning and management practices. This document provides a definition of DSS and its characteristics and capabilities. It proceeds to describe issues related to meeting resource managers' needs, such as the needs for specific applications, customer requirements, information and technology transfer, user support, and institutionalization. Using the decision process as a means to guide DSS development and determine users' needs is also discussed. We conclude with information on methods to evaluate DSS development efforts and recommended procedures for verification and validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.
2008-05-04
This work presents the ScalaBLAST Web Application (SWA), a web based application implemented using the PHP script language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.
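The backend pattern described, a web front end whose backend writes the files needed to submit a parallel job on a dedicated cluster, might look roughly like the sketch below; the directory layout, scheduler directives, and the placeholder command line are assumptions and do not reflect SWA's or ScalaBLAST's actual interfaces.

```python
# Hypothetical sketch only: write a batch script that a scheduler could run.
# Paths, SBATCH directives, and the placeholder 'parallel_blast' command are
# illustrative assumptions, not SWA's backend or ScalaBLAST's real invocation.

from pathlib import Path

def write_job_script(job_id, query_fasta, database, nodes=8, workdir="/tmp/swa_jobs"):
    job_dir = Path(workdir) / job_id
    job_dir.mkdir(parents=True, exist_ok=True)
    script = job_dir / "run.sh"
    script.write_text(
        "#!/bin/bash\n"
        f"#SBATCH --job-name={job_id}\n"
        f"#SBATCH --nodes={nodes}\n"
        f"parallel_blast --query {query_fasta} --db {database} "
        f"--out {job_dir}/results.out\n"
    )
    return script

print(write_job_script("job0001", "query.fa", "nr"))
```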
Operating a production pilot factory serving several scientific domains
NASA Astrophysics Data System (ADS)
Sfiligoi, I.; Würthwein, F.; Andrews, W.; Dost, J. M.; MacNeill, I.; McCrea, A.; Sheripon, E.; Murphy, C. W.
2011-12-01
Pilot infrastructures are becoming prominent players in the Grid environment. One of the major advantages is represented by the reduced effort required by the user communities (also known as Virtual Organizations or VOs) due to the outsourcing of the Grid interfacing services, i.e. the pilot factory, to Grid experts. One such pilot factory, based on the glideinWMS pilot infrastructure, is being operated by the Open Science Grid at University of California San Diego (UCSD). This pilot factory is serving multiple VOs from several scientific domains. Currently the three major clients are the analysis operations of the HEP experiment CMS, the community VO HCC, which serves mostly math, biology and computer science users, and the structural biology VO NEBioGrid. The UCSD glidein factory allows the served VOs to use Grid resources distributed over 150 sites in North and South America, in Europe, and in Asia. This paper presents the steps taken to create a production quality pilot factory, together with the challenges encountered along the road.
Design and evaluation of Mina: a robotic orthosis for paraplegics.
Neuhaus, Peter D; Noorden, Jerryll H; Craig, Travis J; Torres, Tecalote; Kirschbaum, Justin; Pratt, Jerry E
2011-01-01
Mobility options for persons suffering from paraplegia or paraparesis are limited to mainly wheeled devices. There are significant health, psychological, and social consequences related to being confined to a wheelchair. We present the Mina, a robotic orthosis for assisting mobility, which offers a legged mobility option for these persons. Mina is an overground robotic device that is worn on the back and around the legs to provide mobility assistance for people suffering from paraplegia or paraparesis. Mina uses compliant actuation to power the hip and knee joints. For paralyzed users, balance is provided with the assistance of forearm crutches. This paper presents the evaluation of Mina with two paraplegics (SCI ASIA-A). We confirmed that with a few hours of training and practice, Mina is currently able to provide paraplegics walking mobility at speeds of up to 0.20 m/s. We further confirmed that using Mina is not physically taxing and requires little cognitive effort, allowing the user to converse and maintain eye contact while walking. © 2011 IEEE
Dugan, J M; Berrios, D C; Liu, X; Kim, D K; Kaizer, H; Fagan, L M
1999-01-01
Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models.
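A simplified sketch of the concept-model/query-model relationship described above follows; the concepts, question, and marked passages are invented examples, not the ACQUIRE implementation.

```python
# Toy concept model, query model, and expert-marked text, plus a lookup that
# returns passages tagged with every concept a question requires.

concept_model = {"beta-blocker", "hypertension", "dosage"}

query_model = {
    "What is the starting dose of a beta-blocker for hypertension?":
        {"beta-blocker", "hypertension", "dosage"},
}

# Text marked by the domain expert: passage id -> concepts tagged in that passage
marked_text = {
    "ch12-p3": {"beta-blocker", "dosage", "hypertension"},
    "ch12-p4": {"beta-blocker", "adverse-effects"},
}

def answer(question):
    wanted = query_model[question]
    return [pid for pid, tags in marked_text.items() if wanted <= tags]

print(answer("What is the starting dose of a beta-blocker for hypertension?"))
```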
Information for the user in design of intelligent systems
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schreckenghost, Debra L.
1993-01-01
Recommendations are made for improving intelligent system reliability and usability based on the use of information requirements in system development. Information requirements define the task-relevant messages exchanged between the intelligent system and the user by means of the user interface medium. Thus, these requirements affect the design of both the intelligent system and its user interface. Many difficulties that users have in interacting with intelligent systems are caused by information problems. These information problems result from the following: (1) not providing the right information to support domain tasks; and (2) not recognizing that using an intelligent system introduces new user supervisory tasks that require new types of information. These problems are especially prevalent in intelligent systems used for real-time space operations, where data problems and unexpected situations are common. Information problems can be solved by deriving information requirements from a description of user tasks. Using information requirements embeds human-computer interaction design into intelligent system prototyping, resulting in intelligent systems that are more robust and easier to use.
Sustainable Land Imaging User Requirements
NASA Astrophysics Data System (ADS)
Wu, Z.; Snyder, G.; Vadnais, C. M.
2017-12-01
The US Geological Survey (USGS) Land Remote Sensing Program (LRSP) has collected user requirements from a range of applications to help formulate the Landsat 9 follow-on mission (Landsat 10) through the Requirements, Capabilities and Analysis (RCA) activity. The USGS is working with NASA to develop Landsat 10, which is scheduled to launch in the 2027 timeframe as part of the Sustainable Land Imaging program. User requirements collected through RCA will help inform future Landsat 10 sensor designs and mission characteristics. Current Federal civil community users have provided hundreds of requirements through systematic, in-depth interviews. Academic, State, local, industry, and international Landsat user community input was also incorporated in the process. Emphasis was placed on spatial resolution, temporal revisit, and spectral characteristics, as well as other aspects such as accuracy, continuity, sampling condition, data access and format. We will provide an overview of the Landsat 10 user requirements collection process and summary results of user needs from the broad land imaging community.
STS users study (study 2.2). Volume 2: STS users plan (user data requirements) study
NASA Technical Reports Server (NTRS)
Pritchard, E. I.
1975-01-01
Pre-flight scheduling and pre-flight requirements of the space transportation system are discussed. Payload safety requirements, shuttle flight manifests, and interface specifications are studied in detail.
Informing Extension Program Development through Audience Segmentation: Targeting High Water Users
ERIC Educational Resources Information Center
Huang, Pei-wen; Lamm, Alexa J.; Dukes, Michael D.
2016-01-01
Human reliance on water has led to water issues globally. Although extension professionals have made efforts successfully to educate the general public about water conservation to enhance water resource sustainability, difficulty has been found in reaching high water users, defined as residents irrigating excessively to their landscape irrigation…
A GIS-Interface Web Site: Exploratory Learning for Geography Curriculum
ERIC Educational Resources Information Center
Huang, Kuo Hung
2011-01-01
Although Web-based instruction provides learners with sufficient resources for self-paced learning, previous studies have confirmed that browsing navigation-oriented Web sites possibly hampers users' comprehension of information. Web sites designed as "categories of materials" for navigation demand more cognitive effort from users to orient their…
ERIC Educational Resources Information Center
Oblinger, Diana
The Internet is an international network linking hundreds of smaller computer networks in North America, Europe, and Asia. Using the Internet, computer users can connect to a variety of computers with little effort or expense. The potential for use by college faculty is enormous. The largest problem faced by most users is understanding what such…
NASA Technology Investments in Electric Propulsion: New Directions in the New Millennium
NASA Technical Reports Server (NTRS)
Sankovic, John M.
2002-01-01
The last decade was a period of unprecedented acceptance of NASA-developed electric propulsion by the user community. The benefits of high-performance electric propulsion systems are now widely recognized, and new technologies have been accepted across the community. NASA clearly recognizes the need for new, high-performance electric propulsion technologies for future solar system missions and is sponsoring aggressive efforts in this area. These efforts are mainly conducted under the Office of Aerospace Technology. Plans over the next six years include the development of next-generation ion thrusters for end-of-decade missions. Additional efforts are planned for the development of very-high-power thrusters, including magnetoplasmadynamic, pulsed inductive, and VASIMR, and clusters of Hall thrusters. In addition to the in-house technology efforts, NASA continues to work closely with both supplier and user communities to maximize the acceptance of new technology in a timely and cost-effective manner. This paper provides an overview of NASA's activities in the area of electric propulsion with an emphasis on future program directions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hudson, C.R.
Industrial consumers of energy now have the opportunity to participate directly in electricity generation. This report seeks to give the reader (1) insights into the various types of generation services that distributed generation (DG) units could provide, (2) a mechanism to evaluate the economics of using DG, (3) an overview of the status of DG deployment in selected states, and (4) a summary of the communication technologies involved with DG and what testing activities are needed to encourage industrial application of DG. Section 1 provides details on electricity markets and the types of services that can be offered. Subsequent sections in the report address the technical requirements for participating in such markets, the economic decision process that an industrial energy user should go through in evaluating distributed generation, the status of current deployment efforts, and the requirements for test-bed or field demonstration projects.
Space station experiment definition: Advanced power system test bed
NASA Technical Reports Server (NTRS)
Pollard, H. E.; Neff, R. E.
1986-01-01
A conceptual design for an advanced photovoltaic power system test bed was provided, and the requirements for advanced photovoltaic power system experiments were better defined. Results of this study will be used in the design efforts conducted in phase B and phase C/D of the space station program so that the test bed capabilities will be responsive to user needs. Critical PV and energy storage technologies were identified, and inputs were received from industry (government and commercial, U.S. and international), which identified experimental requirements. These inputs were used to develop a number of different conceptual designs. Pros and cons of each were discussed and a strawman candidate identified. A preliminary evolutionary plan, which included necessary precursor activities, was established and cost estimates presented which would allow for a successful implementation to the space station in the 1994 time frame.
Training evaluation final report
NASA Technical Reports Server (NTRS)
Sepulveda, Jose A.
1992-01-01
In the area of management training, 'evaluation' refers both to the specific evaluation instrument used to determine whether a training effort was considered effective, and to the procedures followed to evaluate specific training requests. This report recommends evaluating new training requests in the same way new procurements or new projects are evaluated. This includes examining training requests from the perspective of KSC goals and objectives, and determining the expected ROI of each proposed training program (does the training result in improved productivity through time savings, improved outputs, and/or personnel reduction?). To determine whether a specific training course is effective, a statement of what constitutes 'good performance' is required. The user (NOT the Training Branch) must define the 'required level of performance'. This 'model' will be the basis for the design and development of an objective, performance-based training evaluation instrument.
NASA Technical Reports Server (NTRS)
1993-01-01
During America's space shuttle flights, press and public attention focuses on the Johnson Space Center in Houston. The press and public often put questions to JSC technical and management staff. This fourth JSC Almanac supplies answers for many such questions, and provides an informational resource for speeches to general interest groups. This Almanac is not necessarily comprehensive or definitive. It is not intended as a statement of JSC or NASA policy. However, it does provide a much-needed compilation of information from diverse sources. These sources are given as references, permitting the reader to obtain additional information as required. While every effort has been made to ensure accuracy and to reconcile statistics, users requiring the most up-to-date and accurate information should contact the office supplying the information at issue. The Almanac is updated periodically as needed. The following offices were responsible for supplying material for this update.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alstone, Peter; Jacobson, Arne; Mills, Evan
Efforts to promote rechargeable electric lighting as a replacement for fuel-based light sources in developing countries are typically predicated on the notion that lighting service levels can be maintained or improved while reducing the costs and environmental impacts of existing practices. However, the extremely low incomes of those who depend on fuel-based lighting create a need to balance the hypothetically possible or desirable levels of light with those that are sufficient and affordable. In a pilot study of four night vendors in Kenya, we document a field technique we developed to simultaneously measure the effectiveness of lighting service provided by a lighting system and conduct a survey of lighting service demand by end-users. We took gridded illuminance measurements across each vendor's working and selling area, with users indicating the sufficiency of light at each point. User light sources included a mix of kerosene-fueled hurricane lanterns, pressure lamps, and LED lanterns. We observed illuminance levels ranging from just above zero to 150 lux. The LED systems markedly improved the lighting service levels over those provided by kerosene-fueled hurricane lanterns. Users reported that the minimum acceptable threshold was about 2 lux. The results also indicated that the LED lamps in use by the subjects did not always provide sufficient illumination over the desired retail areas. Our sample size is much too small, however, to reach any conclusions about requirements in the broader population. Given the small number of subjects and very specific type of user, our results should be regarded as indicative rather than conclusive. We recommend replicating the method at larger scales and across a variety of user types and contexts. Policymakers should revisit the subject of recommended illuminance levels regularly as LED technology advances and the price/service balance point evolves.
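The gridded-illuminance method described above lends itself to a simple sufficiency analysis. The following sketch is illustrative only: the grid values are invented, and only the roughly 2 lux threshold and the up-to-150-lux range come from the abstract.

    # Illustrative sketch (not the authors' analysis code): given gridded illuminance
    # readings over a vendor's selling area, report the fraction of grid points at or
    # above the ~2 lux sufficiency threshold that users reported in the pilot study.
    # The sample grid values below are made up for demonstration.

    SUFFICIENCY_LUX = 2.0   # minimum acceptable level reported by users in the abstract

    grid_lux = [
        [0.5, 1.8, 3.2, 4.0],
        [1.1, 2.5, 6.7, 9.3],
        [0.2, 0.9, 2.1, 150.0],   # readings in the study ranged up to ~150 lux
    ]

    readings = [v for row in grid_lux for v in row]
    sufficient = sum(1 for v in readings if v >= SUFFICIENCY_LUX)
    print(f"{sufficient}/{len(readings)} grid points "
          f"({100.0 * sufficient / len(readings):.0f}%) meet the {SUFFICIENCY_LUX} lux threshold")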
Pérez-Pérez, Martín; Glez-Peña, Daniel; Fdez-Riverola, Florentino; Lourenço, Anália
2015-02-01
Document annotation is a key task in the development of Text Mining methods and applications. High-quality annotated corpora are invaluable, but their preparation requires a considerable amount of resources and time. Although the existing annotation tools offer good user interaction interfaces to domain experts, project management and quality control abilities are still limited. Therefore, the current work introduces Marky, a new Web-based document annotation tool equipped to manage multi-user and iterative projects, and to evaluate annotation quality throughout the project life cycle. At the core, Marky is a Web application based on the open source CakePHP framework. The user interface relies on HTML5 and CSS3 technologies. The Rangy library assists in browser-independent implementation of common DOM range and selection tasks, and Ajax and JQuery technologies are used to enhance user-system interaction. Marky provides solid management of inter- and intra-annotator work. Most notably, its annotation tracking system supports systematic and on-demand agreement analysis and annotation amendment. Each annotator may work over documents as usual, but all the annotations made are saved by the tracking system and may be further compared. Thus, the project administrator is able to evaluate annotation consistency among annotators and across rounds of annotation, while annotators are able to reject or amend subsets of annotations made in previous rounds. As a side effect, the tracking system minimises resource and time consumption. Marky is a novel environment for managing multi-user and iterative document annotation projects. Compared to other tools, Marky offers a similar visually intuitive annotation experience while providing unique means to minimise annotation effort and enforce annotation quality, and therefore corpus consistency. Marky is freely available for non-commercial use at http://sing.ei.uvigo.es/marky. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
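The agreement analysis that Marky's tracking system supports can be approximated with a standard pairwise measure such as Cohen's kappa. The sketch below is not Marky's implementation; the label set and annotator data are invented for illustration.

    # Minimal sketch of the kind of inter-annotator agreement analysis the abstract
    # describes (Cohen's kappa over aligned span labels). This is not Marky's
    # implementation; labels and annotators are illustrative.

    from collections import Counter

    def cohen_kappa(labels_a, labels_b):
        assert len(labels_a) == len(labels_b)
        n = len(labels_a)
        observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
        freq_a = Counter(labels_a)
        freq_b = Counter(labels_b)
        expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in set(labels_a) | set(labels_b))
        return (observed - expected) / (1.0 - expected)

    annotator_1 = ["GENE", "O", "GENE", "PATHWAY", "O", "O"]
    annotator_2 = ["GENE", "O", "O",    "PATHWAY", "O", "GENE"]
    print(f"Cohen's kappa: {cohen_kappa(annotator_1, annotator_2):.2f}")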
Promoting free online CME for intimate partner violence: what works at what cost?
Harris, John M; Novalis-Marine, Cheryl; Amend, Robert W; Surprenant, Zita J
2009-01-01
There is a need to provide practicing physicians with training on the recognition and management of intimate partner violence (IPV). Online continuing medical education (CME) could help meet this need, but there is little information on the costs and effectiveness of promoting online CME to physicians. This lack of information may discourage IPV training efforts and the use of online CME in general. We promoted an interactive, multimedia, online IPV CME program, which offered free CME credit, to 92,000 California physicians for 24 months. We collected data on user satisfaction, the costs of different promotional strategies, and self-reported user referral source. We evaluated California physician awareness of the promotion via telephone surveys. Over 2 years, the CME program was used by 1869 California physicians (2% of market), who rated the program's overall quality highly (4.52 on a 1-5 scale; 5 = excellent). The average promotional cost per physician user was $75. Direct mail was the most effective strategy, costing $143 each for 821 users. E-promotion via search engine advertising and e-mail solicitation had less reach, but was more cost efficient ($30-$80 per user). Strategies with no direct cost, such as notices in professional newsletters, accounted for 31% (578) of physician users. Phone surveys found that 24% of California physicians were aware of the online IPV CME program after 18 months of promotion. Promoting online CME, even well-received free CME, to busy community physicians requires resources, in this case at least $75 per physician reached. The effective use of promotional resources needs to be considered when developing social marketing strategies to improve community physician practices. Organizations with an interest in promoting online training might consider the use of e-promotion techniques along with conventional promotion strategies.
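The reported figures allow a rough consistency check of the per-channel promotion costs. The sketch below uses only numbers stated in the abstract; the e-promotion spend and user count are inferred by subtraction, not reported by the authors.

    # Back-of-the-envelope check using only figures reported in the abstract; the
    # per-channel split of the remaining users and spend is inferred, not reported.

    total_users = 1869
    avg_cost_per_user = 75          # "$75 per physician user" (reported average)
    direct_mail_users, direct_mail_cost_per_user = 821, 143
    free_channel_users = 578        # newsletter notices and other no-direct-cost channels

    total_spend = total_users * avg_cost_per_user                        # ~ $140,175
    direct_mail_spend = direct_mail_users * direct_mail_cost_per_user    # ~ $117,403

    e_promo_users = total_users - direct_mail_users - free_channel_users
    e_promo_spend = total_spend - direct_mail_spend
    print(f"Inferred e-promotion: ~${e_promo_spend:,.0f} over {e_promo_users} users "
          f"(~${e_promo_spend / e_promo_users:.0f}/user)")   # consistent with the $30-$80 range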
NASA Technical Reports Server (NTRS)
Lewis, Clayton; Wilde, Nick
1989-01-01
Space construction will require heavy investment in the development of a wide variety of user interfaces for the computer-based tools that will be involved at every stage of construction operations. Using today's technology, user interface development is very expensive for two reasons: (1) specialized and scarce programming skills are required to implement the necessary graphical representations and complex control regimes for high-quality interfaces; (2) iteration on prototypes is required to meet user and task requirements, since these are difficult to anticipate with current (and foreseeable) design knowledge. We are attacking this problem by building a user interface development tool based on extensions to the spreadsheet model of computation. The tool provides high-level support for graphical user interfaces and permits dynamic modification of interfaces, without requiring conventional programming concepts and skills.
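The spreadsheet model of computation that the tool extends can be illustrated with a toy reactive-cell structure: derived display properties recompute from input cells whenever they are read. This is not the NASA tool itself, only a sketch of the underlying idea.

    # Toy illustration of the spreadsheet model of computation the abstract builds on:
    # cells hold either constants or formulas over other cells, and reads always
    # reflect the latest inputs. Names are invented; this is not the described tool.

    class Sheet:
        def __init__(self):
            self._cells = {}

        def set(self, name, value_or_formula):
            # a formula is any callable taking the sheet; a constant is stored as-is
            self._cells[name] = value_or_formula

        def get(self, name):
            cell = self._cells[name]
            return cell(self) if callable(cell) else cell

    ui = Sheet()
    ui.set("slider_value", 30)                                    # e.g. an input widget
    ui.set("bar_width",  lambda s: 2 * s.get("slider_value"))     # derived display property
    ui.set("bar_label",  lambda s: f"{s.get('slider_value')} units")

    print(ui.get("bar_width"), ui.get("bar_label"))   # 60 "30 units"
    ui.set("slider_value", 45)                        # dynamic modification re-derives outputs
    print(ui.get("bar_width"), ui.get("bar_label"))   # 90 "45 units"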
ERIC Educational Resources Information Center
Zeng, Qingtian; Zhao, Zhongying; Liang, Yongquan
2009-01-01
User's knowledge requirement acquisition and analysis are very important for a personalized or user-adaptive learning system. Two approaches to capture user's knowledge requirement about course content within an e-learning system are proposed and implemented in this paper. The first approach is based on the historical data accumulated by an…
2005-10-01
AFRL-HE-WP-TP-2005-0030, Air Force Research Laboratory. Application of Cognitive Task Analysis in User Requirements Definition and Prototype Design (presentation), Christopher Curtis; contract FA8650-04-C-6406.
Computerized Adaptive Testing (CAT): A User Manual
1984-03-12
NPRDC TR 84-32. Computerized Adaptive Testing (CAT): A User Manual. Susan Hardwick, Lawrence Eastman, Ross Cooper; Rehab Group, Incorporated. Final Report, Aug 1981-June 1982. A joint-service effort is underway to develop a computerized adaptive testing (CAT) system and to…
Global Framework for Climate Services (GFCS): implementation approach
NASA Astrophysics Data System (ADS)
Lucio, Filipe
2013-04-01
The Extraordinary Session of the World Meteorological Congress, held from 29 to 31 October 2012, adopted the Implementation Plan of the Global Framework for Climate Services, for subsequent consideration by the Intergovernmental Board on Climate Services, which will hold its first session in July 2013. The Extraordinary Congress called for an immediate move to action, so that the work undertaken can result in activities on the ground which will benefit, in particular, vulnerable countries. The development of the GFCS through a broad consultation process across the pillars of the GFCS (User Interface Platform; Observations and Monitoring; Climate Services Information System; Research, Modelling and Prediction; and Capacity Development) and the initial four priority areas (Agriculture and Food Security; Water; Health; and Disaster Risk Reduction) identified a number of challenges, which in some cases constitute barriers to implementation: - Accessibility: many countries do not have climate services at all, and all countries have scope to improve access to such services; - Capacity: many countries lack the capacity to anticipate and manage climate-related risks and opportunities; - Data: the current availability and quality of climate observations and impacts data are inadequate for large parts of the globe; - Partnerships: mechanisms to enhance interaction between climate users and providers are not always well developed, and user requirements are not always adequately understood and addressed; - Quality: operational climate services are lagging advances in climate and applications science, and the spatial and temporal resolution of information to support decision-making is often insufficient to match user requirements. To address these challenges, the Implementation Plan of the GFCS identified initial implementation projects and activities. The initial priority is to establish the leadership and management capacity to take the GFCS forward at all levels. Capacity development is seen as the critical element to build the foundation for progress. This includes, but is not limited to: - linking climate services users and providers, e.g., through User Interface mechanisms; - developing capacities at the national level; - strengthening regional climate capabilities. Taking advantage of existing mechanisms and others under planning, the GFCS offers an adequate platform for coordination and integration of efforts towards effective action to deliver user-tailored climate services. This paper will provide details on the implementation approach of the GFCS and highlight progress made thus far.
Space Station communications and tracking systems modeling and RF link simulation
NASA Technical Reports Server (NTRS)
Tsang, Chit-Sang; Chie, Chak M.; Lindsey, William C.
1986-01-01
In this final report, the effort spent on Space Station Communications and Tracking System Modeling and RF Link Simulation is described in detail. The effort is mainly divided into three parts: frequency division multiple access (FDMA) system simulation modeling and software implementation; a study on design and evaluation of a functional computerized RF link simulation/analysis system for Space Station; and a study on design and evaluation of simulation system architecture. This report documents the results of these studies. In addition, a separate User's Manual on Space Communications Simulation System (SCSS) (Version 1) documents the software developed for the Space Station FDMA communications system simulation. The final report, SCSS user's manual, and the software located in the NASA JSC system analysis division's VAX 750 computer together serve as the deliverables from LinCom for this project effort.
Ehn, Maria; Eriksson, Lennie Carlén; Åkerberg, Nina; Johansson, Ann-Christin
2018-02-01
Falls are a major threat to the health and independence of seniors. Regular physical activity (PA) can prevent 40% of all fall injuries. The challenge is to motivate and support seniors to be physically active. Persuasive systems can constitute valuable support for persons aiming at establishing and maintaining healthy habits. However, these systems need to support effective behavior change techniques (BCTs) for increasing older adults' PA and meet the senior users' requirements and preferences. Therefore, involving users as codesigners of new systems can be fruitful. Prestudies of the user's experience with similar solutions can facilitate future user-centered design of novel persuasive systems. The aim of this study was to investigate how seniors experience using activity monitors (AMs) as support for PA in daily life. The addressed research questions are as follows: (1) What are the overall experiences of senior persons, of different ages and balance function, in using wearable AMs in daily life? (2) Which aspects did the users perceive as relevant to making the measurements meaningful and useful in the long-term perspective? (3) What needs and requirements did the users perceive as most relevant for the activity monitors to be useful in a long-term perspective? This qualitative interview study included 8 community-dwelling older adults (median age: 83 years). The participants' experiences in using two commercial AMs together with tablet-based apps for 9 days were investigated. Activity diaries kept during the usage and interviews conducted after the usage were used to gather user experience. Comments in diaries were summarized, and interviews were analyzed by inductive content analysis. The users (n=8) perceived that, by using the AMs, their awareness of their own PA had increased. However, the AMs' impact on the users' motivation for PA and activity behavior varied between participants. The diaries showed that self-estimated physical effort varied between participants and varied for each individual over time. Additionally, participants reported different types of accomplished activities; taking walks was most frequently reported. To be meaningful, measurements need to provide the user with a reliable indication of whether his or her current activity behavior is sufficient for reaching an activity goal. Moreover, praise when reaching a goal was described as motivating feedback. To be useful, the devices must be easy to handle. In this study, the users perceived wearables as easy to handle, whereas tablets were perceived as difficult to maneuver. Users reported in the diaries that the devices had been functional for 78% (58/74) of the total test days. Activity monitors can be valuable for supporting seniors' PA. However, the potential of the solutions for a broader group of seniors can be significantly increased. Areas of improvement include reliability, usability, and content supporting effective BCTs with respect to increasing older adults' PA. ©Maria Ehn, Lennie Carlén Eriksson, Nina Åkerberg, Ann-Christin Johansson. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 01.02.2018.
Bridging the Gap: Need for a Data Repository to Support Vaccine Prioritization Efforts*
Madhavan, Guruprasad; Phelps, Charles; Sangha, Kinpritma; Levin, Scott; Rappuoli, Rino
2015-01-01
As the mechanisms for discovery, development, and delivery of new vaccines become increasingly complex, strategic planning and priority setting have become ever more crucial. Traditional single value metrics such as disease burden or cost-effectiveness no longer suffice to rank vaccine candidates for development. The Institute of Medicine—in collaboration with the National Academy of Engineering—has developed a novel software system to support vaccine prioritization efforts. The Strategic Multi-Attribute Ranking Tool for Vaccines—SMART Vaccines—allows decision makers to specify their own value structure, selecting from among 28 pre-defined and up to 7 user-defined attributes relevant to the ranking of vaccine candidates. Widespread use of SMART Vaccines will require compilation of a comprehensive data repository for numerous relevant populations—including their demographics, disease burdens and associated treatment costs, as well as characterizing performance features of potential or existing vaccines that might be created, improved, or deployed. While the software contains preloaded data for a modest number of populations, a large gap exists between the existing data and a comprehensive data repository necessary to make full use of SMART Vaccines. While some of these data exist in disparate sources and forms, constructing a data repository will require much new coordination and focus. Finding strategies to bridge the gap to a comprehensive data repository remains the most important task in bringing SMART Vaccines to full fruition, and to support strategic vaccine prioritization efforts in general. PMID:26022565
GOCE User Toolbox and Tutorial
NASA Astrophysics Data System (ADS)
Benveniste, Jérôme; Knudsen, Per
2016-07-01
The GOCE User Toolbox (GUT) is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models is made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of the GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, it added the capability to compute the Simple Bouguer Anomaly (Solid Earth). During this fall a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at implementing the remaining functionalities and facilitating a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies. Accordingly, GUT version 3 has: - an attractive and easy-to-use graphical user interface (GUI) for the toolbox; - further software functionalities, such as support for the use of gradients, anisotropic diffusive filtering, and computation of Bouguer and isostatic gravity anomalies; - an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.
Astronomical Data Integration Beyond the Virtual Observatory
NASA Astrophysics Data System (ADS)
Lemson, G.; Laurino, O.
2015-09-01
"Data integration" generally refers to the process of combining data from different source data bases into a unified view. Much work has been devoted in this area by the International Virtual Observatory Alliance (IVOA), allowing users to discover and access databases through standard protocols. However, different archives present their data through their own schemas and users must still select, filter, and combine data for each archive individually. An important reason for this is that the creation of common data models that satisfy all sub-disciplines is fraught with difficulties. Furthermore it requires a substantial amount of work for data providers to present their data according to some standard representation. We will argue that existing standards allow us to build a data integration framework that works around these problems. The particular framework requires the implementation of the IVOA Table Access Protocol (TAP) only. It uses the newly developed VO data modelling language (VO-DML) specification, which allows one to define extensible object-oriented data models using a subset of UML concepts through a simple XML serialization language. A rich mapping language allows one to describe how instances of VO-DML data models are represented by the TAP service, bridging the possible mismatch between a local archive's schema and some agreed-upon representation of the astronomical domain. In this so called local-as-view approach to data integration, “mediators" use the mapping prescriptions to translate queries phrased in terms of the common schema to the underlying TAP service. This mapping language has a graphical representation, which we expose through a web based graphical “drag-and-drop-and-connect" interface. This service allows any user to map the holdings of any TAP service to the data model(s) of choice. The mappings are defined and stored outside of the data sources themselves, which allows the interface to be used in a kind of crowd-sourcing effort to annotate any remote database of interest. This reduces the burden of publishing one's data and allows a great flexibility in the definition of the views through which particular communities might wish to access remote archives. At the same time, the framework easies the user's effort to select, filter, and combine data from many different archives, so as to build knowledge bases for their analysis. We will present the framework and demonstrate a prototype implementation. We will discuss ideas for producing the missing elements, in particular the query language and the implementation of mediator tools to translate object queries to ADQL
User-oriented evaluation of a medical image retrieval system for radiologists.
Markonis, Dimitrios; Holzer, Markus; Baroz, Frederic; De Castaneda, Rafael Luis Ruiz; Boyer, Célia; Langs, Georg; Müller, Henning
2015-10-01
This article reports the user-oriented evaluation of a text- and content-based medical image retrieval system. User tests with radiologists using a search system for images in the medical literature are presented. The goal of the tests is to assess the usability of the system and to identify system and interface aspects that need improvement or useful additions. Another objective is to investigate the system's added value for radiology information retrieval. The study provides insight into required specifications and potential shortcomings of medical image retrieval systems through a concrete methodology for conducting user tests. User tests with a working image retrieval system for images from the biomedical literature were performed in an iterative manner, where each iteration had the participants perform radiology information-seeking tasks and was followed by refinement of the system as well as of the user study design itself. During these tasks the interaction of the users with the system was monitored, usability aspects were measured, retrieval success rates were recorded, and feedback was collected through survey forms. In total, 16 radiologists participated in the user tests. The success rates in finding relevant information were on average 87% and 78% for image and case retrieval tasks, respectively. The average time for a successful search was below 3 min in both cases. Users quickly felt comfortable with the novel techniques and tools (after 5 to 15 min), such as content-based image retrieval and relevance feedback. User satisfaction measures show a very positive attitude toward the system's functionalities, while the user feedback helped identify the system's weak points. The participants proposed several potentially useful new functionalities, such as filtering by imaging modality and searching for articles using image examples. The iterative character of the evaluation helped to obtain diverse and detailed feedback on all system aspects. Radiologists quickly became familiar with the functionalities but had several comments on desired functionalities. The analysis of the results can potentially assist system refinement for future medical information retrieval systems. Moreover, the methodology presented, as well as the discussion of the limitations and challenges of such studies, can be useful for user-oriented medical image retrieval evaluation, as user-oriented evaluation of interactive systems is still only rarely performed. Such interactive evaluations can be limited in effort if done iteratively and can give many insights for developing better systems. Copyright © 2015. Published by Elsevier Ireland Ltd.
Khawaja, Zain-Ul-Abdin; Ali, Khudejah Iqbal; Khan, Shanze
2017-02-01
Social marketing related to sexual health is a problematic task, especially in religiously and/or culturally conservative countries. Social media presents a possible alternative channel for sexual health efforts to disseminate information and engage new users. In an effort to understand how well sexual health campaigns and organizations have leveraged this opportunity, this study presents a systematic examination of ongoing Facebook-based sexual health efforts in conservative Asian countries. It was discovered that out of hundreds of sexual health organizations identified in the region, less than half had created a Facebook page. Of those that had, only 31 were found to have posted sexual health-relevant content at least once a month. Many of these 31 organizations were also unsuccessful in maintaining regular official and user activity on their page. In order to assess the quality of the Facebook pages as Web-based information resources, the sexual health-related official activity on each page was analyzed for information (a) value, (b) reliability, (c) currency, and (d) system accessibility. User responsiveness to official posts on the pages was also used to discuss the potential of Facebook as a sexual health information delivery platform.
The Essential Elements of a Risk Governance Framework for Current and Future Nanotechnologies.
Stone, Vicki; Führ, Martin; Feindt, Peter H; Bouwmeester, Hans; Linkov, Igor; Sabella, Stefania; Murphy, Finbarr; Bizer, Kilian; Tran, Lang; Ågerstrand, Marlene; Fito, Carlos; Andersen, Torben; Anderson, Diana; Bergamaschi, Enrico; Cherrie, John W; Cowan, Sue; Dalemcourt, Jean-Francois; Faure, Michael; Gabbert, Silke; Gajewicz, Agnieszka; Fernandes, Teresa F; Hristozov, Danail; Johnston, Helinor J; Lansdown, Terry C; Linder, Stefan; Marvin, Hans J P; Mullins, Martin; Purnhagen, Kai; Puzyn, Tomasz; Sanchez Jimenez, Araceli; Scott-Fordsmand, Janeck J; Streftaris, George; van Tongeren, Martie; Voelcker, Nicolas H; Voyiatzis, George; Yannopoulos, Spyros N; Poortvliet, P Marijn
2017-12-14
Societies worldwide are investing considerable resources into the safe development and use of nanomaterials. Although each of these protective efforts is crucial for governing the risks of nanomaterials, they are insufficient in isolation. What is missing is a more integrative governance approach that goes beyond legislation. Development of this approach must be evidence based and involve key stakeholders to ensure acceptance by end users. The challenge is to develop a framework that coordinates the variety of actors involved in nanotechnology and civil society to facilitate consideration of the complex issues that occur in this rapidly evolving research and development area. Here, we propose three sets of essential elements required to generate an effective risk governance framework for nanomaterials. (1) Advanced tools to facilitate risk-based decision making, including an assessment of the needs of users regarding risk assessment, mitigation, and transfer. (2) An integrated model of predicted human behavior and decision making concerning nanomaterial risks. (3) Legal and other (nano-specific and general) regulatory requirements to ensure compliance and to stimulate proactive approaches to safety. The implementation of such an approach should facilitate and motivate good practice for the various stakeholders to allow the safe and sustainable future development of nanotechnology. © 2017 Society for Risk Analysis.
An Intelligent Tool for Activity Data Collection
Jehad Sarkar, A. M.
2011-01-01
Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to setup an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive and alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user’s activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool’s performance in producing reliable datasets. PMID:22163832
Nurses' Experiences of an Initial and Reimplemented Electronic Health Record Use.
Chang, Chi-Ping; Lee, Ting-Ting; Liu, Chia-Hui; Mills, Mary Etta
2016-04-01
The electronic health record is a key component of healthcare information systems. Currently, numerous hospitals have adopted electronic health records to replace paper-based records, to document care processes and improve care quality. Integrating a healthcare information system into traditional daily nursing operations requires time and effort for nurses to become familiar with the new technology. In the stages of electronic health record implementation, smooth adoption can streamline clinical nursing activities. To explore the adoption process, a descriptive qualitative study design and focus group interviews were conducted 3 months after and 2 years after electronic health record system implementation (the system was aborted for 1 year in between) in one hospital located in southern Taiwan. Content analysis was performed on the interview data, and six main themes were derived. In the first stage: (1) liability, work stress, and anticipation of the electronic health record; (2) slow network speed and user-unfriendly design for the learning process; (3) insufficient information technology/organizational support. In the second stage: (4) getting used to the electronic health record and further system requirements; (5) benefits of the electronic health record in time saving and documentation; (6) unrealistic information technology competence expectations and future use. The study concluded that user-friendly design, together with information technology support and manpower backup, would facilitate this adoption process.
Automated Planning Enables Complex Protocols on Liquid-Handling Robots.
Whitehead, Ellis; Rudolf, Fabian; Kaltenbach, Hans-Michael; Stelling, Jörg
2018-03-16
Robotic automation in synthetic biology is especially relevant for liquid handling to facilitate complex experiments. However, research tasks that are not highly standardized are still rarely automated in practice. Two main reasons for this are the substantial investments required to translate molecular biological protocols into robot programs, and the fact that the resulting programs are often too specific to be easily reused and shared. Recent developments of standardized protocols and dedicated programming languages for liquid-handling operations addressed some aspects of ease-of-use and portability of protocols. However, either they focus on simplicity, at the expense of enabling complex protocols, or they entail detailed programming, with corresponding skills and efforts required from the users. To reconcile these trade-offs, we developed Roboliq, a software system that uses artificial intelligence (AI) methods to integrate (i) generic formal, yet intuitive, protocol descriptions, (ii) complete, but usually hidden, programming capabilities, and (iii) user-system interactions to automatically generate executable, optimized robot programs. Roboliq also enables high-level specifications of complex tasks with conditional execution. To demonstrate the system's benefits for experiments that are difficult to perform manually because of their complexity, duration, or time-critical nature, we present three proof-of-principle applications for the reproducible, quantitative characterization of GFP variants.
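The core idea, expanding a compact declarative protocol step into executable liquid-handling commands, can be sketched as follows. The step vocabulary and command names below are invented for illustration and are not Roboliq's actual protocol format or planner.

    # Toy illustration of expanding a high-level protocol step into low-level
    # liquid-handling commands. The step names and command vocabulary are invented
    # and do not represent Roboliq's actual protocol format.

    def expand(step):
        """Expand one declarative step into executable robot commands."""
        if step["action"] == "dilute":
            src, dst, factor, vol = step["source"], step["dest"], step["factor"], step["volume_ul"]
            sample = vol / factor
            return [
                {"cmd": "aspirate", "from": "diluent", "volume_ul": vol - sample},
                {"cmd": "dispense", "to": dst,        "volume_ul": vol - sample},
                {"cmd": "aspirate", "from": src,      "volume_ul": sample},
                {"cmd": "dispense", "to": dst,        "volume_ul": sample},
                {"cmd": "mix",      "well": dst,      "cycles": 3},
            ]
        raise ValueError(f"no expansion rule for action {step['action']!r}")

    protocol = [{"action": "dilute", "source": "A1", "dest": "B1", "factor": 10, "volume_ul": 200}]
    for step in protocol:
        for command in expand(step):
            print(command)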
TELICS—A Telescope Instrument Control System for Small/Medium Sized Astronomical Observatories
NASA Astrophysics Data System (ADS)
Srivastava, Mudit K.; Ramaprakash, A. N.; Burse, Mahesh P.; Chordia, Pravin A.; Chillal, Kalpesh S.; Mestry, Vilas B.; Das, Hillol K.; Kohok, Abhay A.
2009-10-01
For any modern astronomical observatory, it is essential to have an efficient interface between the telescope and its back-end instruments. However, for small and medium-sized observatories, this requirement is often limited by tight financial constraints. Therefore a simple yet versatile and low-cost control system is required for such observatories to minimize cost and effort. Here we report the development of a modern, multipurpose instrument control system TELICS (Telescope Instrument Control System) to integrate the controls of various instruments and devices mounted on the telescope. TELICS consists of an embedded hardware unit known as a common control unit (CCU) in combination with Linux-based data acquisition and user interface. The hardware of the CCU is built around the ATmega 128 microcontroller (Atmel Corp.) and is designed with a backplane, master-slave architecture. A Qt-based graphical user interface (GUI) has been developed and the back-end application software is based on C/C++. TELICS provides feedback mechanisms that give the operator good visibility and a quick-look display of the status and modes of instruments as well as data. TELICS has been used for regular science observations since 2008 March on the 2 m, f/10 IUCAA Telescope located at Girawali in Pune, India.
The human role in space (THURIS) applications study. Final briefing
NASA Technical Reports Server (NTRS)
Maybee, George W.
1987-01-01
The THURIS (The Human Role in Space) application is an iterative process involving successive assessments of man/machine mixes in terms of performance, cost and technology to arrive at an optimum man/machine mode for the mission application. The process begins with user inputs which define the mission in terms of an event sequence and performance time requirements. The desired initial operational capability date is also an input requirement. THURIS terms and definitions (e.g., generic activities) are applied to the input data converting it into a form which can be analyzed using the THURIS cost model outputs. The cost model produces tabular and graphical outputs for determining the relative cost-effectiveness of a given man/machine mode and generic activity. A technology database is provided to enable assessment of support equipment availability for selected man/machine modes. If technology gaps exist for an application, the database contains information supportive of further investigation into the relevant technologies. The present study concentrated on testing and enhancing the THURIS cost model and subordinate data files and developing a technology database which interfaces directly with the user via technology readiness displays. This effort has resulted in a more powerful, easy-to-use applications system for optimization of man/machine roles. Volume 1 is an executive summary.
Supporting Dictation Speech Recognition Error Correction: The Impact of External Information
ERIC Educational Resources Information Center
Shi, Yongmei; Zhou, Lina
2011-01-01
Although speech recognition technology has made remarkable progress, its wide adoption is still restricted by notable effort made and frustration experienced by users while correcting speech recognition errors. One of the promising ways to improve error correction is by providing user support. Although support mechanisms have been proposed for…
DOT National Transportation Integrated Search
1981-09-01
Volume II is the second volume of a three volume document describing the computer program HEVSIM for use with buses and heavy duty trucks. This volume is a user's manual describing how to prepare data input and execute the program. A strong effort ha...
Secret Shopping as User Experience Assessment Tool
ERIC Educational Resources Information Center
Boyce, Crystal M.
2015-01-01
Secret shopping is a form of unobtrusive evaluation that can be accomplished with minimal effort, but still produce rich results. With as few as 11 shoppers, the author was able to identify trends in user satisfaction with services provided across two entry-level desks at Illinois Wesleyan University's The Ames Library. The focus of this secret…
A Framework and Implementation of User Interface and Human-Computer Interaction Instruction
ERIC Educational Resources Information Center
Peslak, Alan
2005-01-01
Researchers have suggested that up to 50 % of the effort in development of information systems is devoted to user interface development (Douglas, Tremaine, Leventhal, Wills, & Manaris, 2002; Myers & Rosson, 1992). Yet little study has been performed on the inclusion of important interface and human-computer interaction topics into a current…
Alternative Films for Making Presentation Slides for the Occasional User.
ERIC Educational Resources Information Center
Hunt, Harold R., Jr.
1985-01-01
As alternatives to the well-known Kodak Kodalith film for making presentation slides, suggests using Kodak Technical Pan Film, 2415 and Kodak Precision Fine Film LPD4. Although less known, both films are capable of making excellent quality slides with minimum effort and, for the occasional user, offer advantages over the Kodalith-Diazochrome…
ERIC Educational Resources Information Center
Hudomalj, Emil; Jauk, Avgust
2006-01-01
Purpose: To give an overview of the current state and trends in authentication and authorisation in satisfying academic library users' mobility and instant access to digital information resources, and to propose that libraries strongly support efforts to establish a global authentication and authorisation infrastructure.…
Toward a Theory of Media Reconciliation: A Closed Captioning Exploratory Study
ERIC Educational Resources Information Center
Snell, Nicole Elaine
2012-01-01
This project is an interdisciplinary empirical study that explores the emotional experiences resulting from the use of the assistive technology closed captioning. More specifically, this study focuses on documenting the user experiences of both the D/deaf and Hearing multimedia user in an effort to better identify and understand those variables…
Databases post-processing in Tensoral
NASA Technical Reports Server (NTRS)
Dresselhaus, Eliot
1994-01-01
The Center for Turbulence Research (CTR) post-processing effort aims to make turbulence simulations and data more readily and usefully available to the research and industrial communities. The Tensoral language, introduced in this document and currently existing in prototype form, is the foundation of this effort. Tensoral provides a convenient and powerful protocol to connect users who wish to analyze fluids databases with the authors who generate them. In this document we introduce Tensoral and its prototype implementation in the form of a user's guide. This guide focuses on use of Tensoral for post-processing turbulence databases. The corresponding document - the Tensoral 'author's guide' - which focuses on how authors can make databases available to users via the Tensoral system - is currently unwritten. Section 1 of this user's guide defines Tensoral's basic notions: we explain the class of problems at hand and how Tensoral abstracts them. Section 2 defines Tensoral syntax for mathematical expressions. Section 3 shows how these expressions make up Tensoral statements. Section 4 shows how Tensoral statements and expressions are embedded into other computer languages (such as C or Vectoral) to make Tensoral programs. We conclude with a complete example program.
Patient Accounting Systems: Are They Fit with the Users' Requirements?
Ayatollahi, Haleh; Nazemi, Zahra; Haghani, Hamid
2016-01-01
A patient accounting system is a subsystem of a hospital information system. This system, like other information systems, should be carefully designed to be able to meet users' requirements. The main aim of this research was to investigate users' requirements and to determine whether current patient accounting systems meet users' needs or not. This was a survey study, and the participants were the users of six patient accounting systems used in 24 teaching hospitals. A stratified sampling method was used to select the participants (n = 216). The research instruments were a questionnaire and a checklist. A mean value of ≥3 indicated that a data element was important or that the system had a given capability. Generally, the findings showed that the current patient accounting systems had some weaknesses and were able to meet between 70% and 80% of users' requirements. The current patient accounting systems need to be improved to be able to meet users' requirements. This approach can also help to provide hospitals with more usable and reliable financial information.
Chen, I-Min A; Markowitz, Victor M; Palaniappan, Krishna; Szeto, Ernest; Chu, Ken; Huang, Jinghua; Ratner, Anna; Pillay, Manoj; Hadjithomas, Michalis; Huntemann, Marcel; Mikhailova, Natalia; Ovchinnikova, Galina; Ivanova, Natalia N; Kyrpides, Nikos C
2016-04-26
The exponential growth of genomic data from next-generation technologies renders traditional manual expert curation effort unsustainable. Many genomic systems have included community annotation tools to address the problem. Most of these systems adopted a "Wiki-based" approach to take advantage of existing wiki technologies, but encountered obstacles in issues such as usability, authorship recognition, information reliability and incentive for community participation. Here, we present a different approach, relying on a tightly integrated method rather than a "Wiki-based" one, to support community annotation and user collaboration in the Integrated Microbial Genomes (IMG) system. The IMG approach allows users to use the existing IMG data warehouse and analysis tools to add gene, pathway and biosynthetic cluster annotations, to analyze/reorganize contigs, genes and functions using workspace datasets, and to share private user annotations and workspace datasets with collaborators. We show that the annotation effort using IMG can be part of the research process to overcome the user incentive and authorship recognition problems, thus fostering collaboration among domain experts. The usability and reliability issues are addressed by the integration of curated information and analysis tools in IMG, together with DOE Joint Genome Institute (JGI) expert review. By incorporating annotation operations into IMG, we provide an integrated environment for users to perform deeper and extended data analysis and annotation in a single system that can lead to publications and community knowledge sharing, as shown in the case studies.
Sheehan, Barbara; Kaufman, David; Stetson, Peter; Currie, Leanne M.
2009-01-01
Computerized decision support systems have been used to help ensure safe medication prescribing. However, the acceptance of these types of decision support has been reported to be low. It has been suggested that decreased acceptance may be due to lack of clinical relevance. Additionally, cognitive fit between the user interface and clinical task may impact the response of clinicians as they interact with the system. In order to better understand clinician responses to such decision support, we used cognitive task analysis methods to evaluate clinical alerts for antibiotic prescribing in a neonatal intensive care unit. Two methods were used: 1) a cognitive walkthrough; and 2) usability testing with a ‘think-aloud’ protocol. Data were analyzed for impact on cognitive effort according to categories of cognitive distance. We found that responses to alerts may be context specific and that lack of screen cues often increases cognitive effort required to use a system. PMID:20351922
Dearing, James W; Maibach, Edward W; Buller, David B
2006-10-01
Approaches from diffusion of innovations and social marketing are used here to propose efficient means to promote and enhance the dissemination of evidence-based physical activity programs. While both approaches have traditionally been conceptualized as top-down, center-to-periphery, centralized efforts at social change, their operational methods have usually differed. The operational methods of diffusion theory have a strong relational emphasis, while the operational methods of social marketing have a strong transactional emphasis. Here, we argue for a convergence of diffusion of innovation and social marketing principles to stimulate the efficient dissemination of proven-effective programs. In general terms, we are encouraging a focus on societal sectors as a logical and efficient means for enhancing the impact of dissemination efforts. This requires an understanding of complex organizations and the functional roles played by different individuals in such organizations. In specific terms, ten principles are provided for working effectively within societal sectors and enhancing user involvement in the processes of adoption and implementation.
Computational Control Workstation: Users' perspectives
NASA Technical Reports Server (NTRS)
Roithmayr, Carlos M.; Straube, Timothy M.; Tave, Jeffrey S.
1993-01-01
A Workstation has been designed and constructed for rapidly simulating motions of rigid and elastic multibody systems. We examine the Workstation from the point of view of analysts who use the machine in an industrial setting. Two aspects of the device distinguish it from other simulation programs. First, one uses a series of windows and menus on a computer terminal, together with a keyboard and mouse, to provide a mathematical and geometrical description of the system under consideration. The second hallmark is a facility for animating simulation results. An assessment of the amount of effort required to numerically describe a system to the Workstation is made by comparing the process to that used with other multibody software. The apparatus for displaying results as a motion picture is critiqued as well. In an effort to establish confidence in the algorithms that derive, encode, and solve equations of motion, simulation results from the Workstation are compared to answers obtained with other multibody programs. Our study includes measurements of computational speed.
Over-the-counter but out of reach: a pharmacy-based survey of OTC syringe sales in Tijuana, Mexico.
Pollini, Robin A; Gallardo, Manuel; Ruiz, Serena; Case, Patricia; Zaller, Nickolas; Lozada, Remedios
2014-05-01
Sterile syringe access is critical to HIV prevention efforts targeting injection drug users (IDUs) but some pharmacies do not sell syringes over-the-counter (OTC) even where such sales are legal. We conducted a pharmacy survey in Tijuana, Mexico (where OTC sales are legal) to characterize attitudes toward syringe sales and to explore support for expanding pharmacy-based HIV prevention efforts. Of 203 respondents, 28% supported OTC syringe sales to IDUs and 74% said their pharmacy required a prescription for at least some syringe sales. Support for OTC syringe sales was independently associated with selling OTC syringes, understanding the role of sterile syringes in HIV prevention, and recognizing pharmacies as an important health resource for IDUs. Most respondents supported an expanded role for pharmacies in HIV prevention, exclusive of OTC syringe sales. Our study provides information for developing interventions to promote OTC syringe sales and expanding pharmacy-based distribution of HIV-related information and resources.
Considerations in change management related to technology.
Luo, John S; Hilty, Donald M; Worley, Linda L; Yager, Joel
2006-01-01
The authors describe the complexity of social processes for implementing technological change. Once a new technology is available, information about its availability and benefits must be made available to the community of users, with opportunities to try the innovations and find them worthwhile, despite organizational resistances. The authors reviewed the literature from psychiatry, psychology, sociology, business, and technology to distill common denominators for success and failure related to implementing technology. Beneficial technological innovations that are simple to use and obviously save everyone time and effort are easy to inaugurate. However, innovations that primarily serve management rather than subordinates or front-line utilizers may fail, despite considerable institutional effort. This article reviews and outlines several of the more prominent theoretical models governing successful institutional change. Successful implementation of difficult technological changes requires visionary leadership that has carefully considered the benefits, consulted with influence leaders at all organizational levels to spot unintended consequences and sources of resistance, and developed a detailed plan and continuous quality assurance process to foster implementation over time.
Novel pervasive scenarios for home management: the Butlers architecture.
Denti, Enrico
2014-01-01
Many efforts today aim at energy saving, promoting the user's awareness and virtuous behavior from a sustainability perspective. Our houses, appliances, energy meters and devices are becoming smarter and connected, domotics is increasing possibilities in house automation and control, and ambient intelligence and assisted living are bringing attention onto people's needs from different viewpoints. Our assumption is that considering these aspects together allows for novel, intriguing possibilities. To this end, in this paper we combine home energy management with domotics, coordination technologies, intelligent agents, ambient intelligence, ubiquitous technologies and gamification to devise novel scenarios, where energy monitoring and management is just the basic brick of a much wider and comprehensive home management system. The aim is to control home appliances well beyond energy consumption, combining home comfort, appliance scheduling, safety constraints, etc. with dynamically-changeable users' preferences, goals and priorities. At the same time, usability and attractiveness are seen as key success factors: so, the intriguing technologies available in most houses and smart devices are exploited to make the system configuration and use simpler, entertaining and attractive for users. These aspects are also integrated with ubiquitous and pervasive technologies, geo-localization, social networks and communities to provide enhanced functionalities and support smarter application scenarios, thereby further strengthening technology acceptance and diffusion. Accordingly, we first analyse the system requirements and define a reference multi-layer architectural model - the Butlers architecture - that specifies seven layers of functionalities, correlating the requirements, the corresponding technologies and the consequent value-added for users in each layer. Then, we outline a set of notable scenarios of increasing functionalities and complexity, discuss the structure of the corresponding system patterns in terms of the proposed architecture, and make this concrete by presenting some comprehensive interaction examples as comic strip stories. Next, we discuss the implementation requirements and how they can be met with the available technologies, discuss a possible architecture, refine it in the concrete case of the TuCSoN coordination technology, present a subsystem prototype and discuss its properties in the Butlers perspective.
Mkpojiogu, Emmanuel O C; Hashim, Nor Laily
2016-01-01
Customer satisfaction is the result of product quality and viability. The perceived satisfaction of users/customers with a software product cannot be neglected, especially in today's competitive market environment, as it drives customer loyalty and promotes high profitability and return on investment. Understanding the importance of requirements, as it relates to the satisfaction of users/customers when their requirements are met, is therefore worth the effort. It is necessary to know the relationship between customer satisfaction when requirements are met (or dissatisfaction when they are unmet) and the importance of such requirements. Many works have examined customer satisfaction in connection with the importance of requirements, but the relationship between the customer satisfaction scores (coefficients) of the Kano model and users'/customers' self-stated requirements importance has not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship between the Kano model's customer satisfaction indexes and users'/customers' self-reported requirements importance. The results of the study indicate some interesting associations between the considered variables. These bivariate associations reveal that the customer satisfaction index (SI) and average satisfaction coefficient (ASC), and the customer dissatisfaction index (DI) and average satisfaction coefficient (ASC), are highly correlated (r = 96%), and thus ASC can be used in place of either SI or DI in representing customer satisfaction scores. Also, each of these Kano model customer satisfaction variables (SI, DI, and ASC) is associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met, or on features that are incorporated into a product, influences the level of satisfaction such customers derive from the product. The worth of a product feature is indicated by the perceived satisfaction customers get from the inclusion of that feature in the product design and development. The satisfaction users/customers derive when a requirement is fulfilled or when a feature is included in the product (SI or ASC) is strongly influenced by the value the users/customers place on such requirements/features when met (IMP). However, the dissatisfaction users/customers experience when a requirement is not met or when a feature is not incorporated into the product (DI), even though related to self-stated requirements importance (IMP), does not have a strong effect on the importance/worth (IMP) of that requirement/feature as perceived by the users or customers. Therefore, since customer satisfaction is proportionally related to perceived requirements importance (worth), adequate attention should be given to user/customer-satisfying requirements (features) from elicitation to design and through to the final implementation of the design. Incorporating user- or customer-satisfying requirements in product design is of great worth or value to the future users or customers of the product.
The NASA Program Management Tool: A New Vision in Business Intelligence
NASA Technical Reports Server (NTRS)
Maluf, David A.; Swanson, Keith; Putz, Peter; Bell, David G.; Gawdiak, Yuri
2006-01-01
This paper describes a novel approach to business intelligence and program management for large technology enterprises like the U.S. National Aeronautics and Space Administration (NASA). Two key distinctions of the approach are that 1) standard business documents are the user interface, and 2) a "schema-less" XML database enables flexible integration of technology information for use by both humans and machines in a highly dynamic environment. The implementation utilizes patent-pending NASA software called the NASA Program Management Tool (PMT) and its underlying "schema-less" XML database called Netmark. Initial benefits of PMT include elimination of discrepancies between business documents that use the same information and "paperwork reduction" for program and project management in the form of reducing the effort required to understand standard reporting requirements and to comply with those reporting requirements. We project that the underlying approach to business intelligence will enable significant benefits in the timeliness, integrity and depth of business information available to decision makers on all organizational levels.
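Netmark's internals are not described in the abstract, but the "schema-less" idea can be illustrated independently. The sketch below, in Python with lxml, ingests arbitrary business XML documents without any predeclared schema and queries them by structure with XPath; the element names and the in-memory list standing in for the database are hypothetical.

```python
# Minimal sketch of the "schema-less" XML idea behind a PMT/Netmark-style store:
# arbitrary business documents are ingested without a predeclared schema and
# queried by structure/content with XPath. Element names here are hypothetical.
from lxml import etree

documents = []  # in-memory stand-in for the XML database

def ingest(xml_text):
    """Store any well-formed XML document; no schema validation is applied."""
    documents.append(etree.fromstring(xml_text))

def query(xpath_expr):
    """Run one XPath expression across every stored document."""
    hits = []
    for doc in documents:
        hits.extend(doc.xpath(xpath_expr))
    return hits

ingest("<statusReport><project>X-38</project><milestone state='late'>CDR</milestone></statusReport>")
ingest("<budget><project>X-38</project><amount currency='USD'>1200000</amount></budget>")

# Find every document fragment that mentions project X-38, regardless of document type.
for node in query("//*[project='X-38']"):
    print(etree.tostring(node, pretty_print=True).decode())
```

A document-type-agnostic query of this kind is what allows status reports, budgets, and other standard business documents to serve as the interface to the same underlying store.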
PACS--and beyond. A journey to the digital promised land.
Viau, Mark A
2004-01-01
A successful picture archiving and communication system (PACS) integration depends on much more than the technology; marketing also plays a large role. This fact was evident from the inception of the PACS project at Boca Raton Community Hospital (BRCH). Strategic and effective marketing efforts should target technologists, nurses, physicians (including radiologists), administration, and colleagues in other departments. The buy-in of these users is critical to the project's success. BRCH's first marketing effort took place during the initial PACS presentation made to the hospital's board of directors. Once approval was given and a 6-month implementation target was set, a strategic and effective marketing/education plan commenced. Posters, brochures, t-shirts, and promotional items were distributed in a coordinated effort to target hospital staff and referring physician offices. Through its "Got PACS?" branding and other identity materials, BRCH implemented a marketing plan that informed, educated, and engaged PACS users.
MISSIONS: The Mobile-Based Disaster Mitigation System in Indonesia
NASA Astrophysics Data System (ADS)
Passarella, Rossi; Putri Raflesia, Sarifah; Lestarini, Dinda; Rifai, Ahmad; Veny, Harumi
2018-04-01
Disaster mitigation is essential to minimize the effects of disasters. Indonesia is one of the disaster-prone areas in Asia, and the government is exploring the use of information technology (IT) to aid its mitigation efforts. Currently, there are Indonesian websites which hold information regarding weather monitoring, climate conditions, and geophysics, but there is no clear indication of mitigation efforts or of what to do during an emergency. Therefore, this research proposes MISSIONS, a disaster mitigation model using a geo-fencing technique to detect the location of users through their mobile devices. MISSIONS uses a mobile-based disaster mitigation system to disseminate critical information to victims during an emergency when they are inside disaster zones marked by virtual fences. It aims to help the government reduce the effects of disasters and aid in mitigation efforts. The implementation results show that MISSIONS has high accuracy in detecting users' whereabouts.
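The abstract does not spell out the geo-fence computation, so the following is a minimal sketch under simple assumptions: circular fences around disaster zones and a haversine great-circle distance test against the user's reported position. The zone names and coordinates are invented.

```python
# Minimal geo-fencing sketch (assumption: circular fences around disaster zones,
# haversine great-circle distance). Fence coordinates here are made up.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def fences_containing(user_lat, user_lon, fences):
    """Return the fences whose radius covers the user's reported position."""
    return [f for f in fences
            if haversine_km(user_lat, user_lon, f["lat"], f["lon"]) <= f["radius_km"]]

disaster_zones = [{"name": "Mt. Sinabung ash zone", "lat": 3.17, "lon": 98.39, "radius_km": 7.0}]
for zone in fences_containing(3.20, 98.42, disaster_zones):
    print("Push mitigation instructions for", zone["name"])
```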
An Upgrade of the Aeroheating Software "MINIVER"
NASA Technical Reports Server (NTRS)
Louderback, Pierce
2013-01-01
Many software packages assist engineers with performing flight vehicle analysis, but some of these packages have gone many years without updates or significant improvements to their workflows. One such software package, known as MINIVER, is a powerful yet lightweight tool used for aeroheating analyses. However, it is an aging program that has not seen major improvements within the past decade. As part of a collaborative effort with the Florida Institute of Technology, MINIVER has received a major user interface overhaul, a change in program language, and will be continually receiving updates to improve its capabilities. The user interface update includes a migration from a command-line interface to that of a graphical user interface supported in the Windows operating system. The organizational structure of the pre-processor has been transformed to clearly defined categories to provide ease of data entry. Helpful tools have been incorporated, including the ability to copy sections of cases as well as a generalized importer which aids in bulk data entry. A visual trajectory editor has been included, as well as a CAD Editor which allows the user to input simplified geometries in order to generate MINIVER cases in bulk. To demonstrate its continued effectiveness, a case involving the JAXA OREX flight vehicle will be included, providing comparisons to captured flight data as well as other computational solutions. The most recent upgrade effort incorporated the use of the CAD Editor, and current efforts are investigating methods to link MINIVER projects with SINDA/Fluint and Thermal Desktop.
An Upgrade of the Aeroheating Software "MINIVER"
NASA Technical Reports Server (NTRS)
Louderback, Pierce M.
2013-01-01
Many software packages assist engineers with performing flight vehicle analysis, but some of these packages have gone many years without updates or significant improvements to their workflows. One such software, known as MINIVER, is a powerful yet lightweight tool that is used for aeroheating analyses. However, it is an aging program that has not seen major improvements within the past decade. As part of a collaborative effort with Florida Institute of Technology, MINIVER has received a major user interface overhaul, a change in program language, and will be continually receiving updates to improve its capabilities. The user interface update includes a migration from a command-line interface to that of a graphical user interface supported in the Windows operating system. The organizational structure of the preprocessor has been transformed to clearly defined categories to provide ease of data entry. Helpful tools have been incorporated, including the ability to copy sections of cases as well as a generalized importer which aids in bulk data entry. A visual trajectory editor has been included, as well as a CAD Editor which allows the user to input simplified geometries in order to generate MINIVER cases in bulk. To demonstrate its continued effectiveness, a case involving the JAXA OREX flight vehicle will be included, providing comparisons to captured flight data as well as other computational solutions. The most recent upgrade effort incorporated the use of the CAD Editor, and current efforts are investigating methods to link MINIVER projects with SINDA/Fluint and Thermal Desktop.
NASA Astrophysics Data System (ADS)
McGibbney, L. J.; Hausman, J.; Laurencelle, J. C.; Toaz, R., Jr.; McAuley, J.; Freeborn, D. J.; Stoner, C.
2016-12-01
The Surface Water & Ocean Topography (SWOT) mission brings together two communities focused on a better understanding of the world's oceans and its terrestrial surface waters. U.S. and French oceanographers and hydrologists and international partners have joined forces to develop this new space mission. At NASA JPL's PO.DAAC, the team is currently engaged in the gathering of SWOT User Stories (access patterns, metadata requirements, primary and value-added product requirements, data access protocols, etc.) to better inform the adaptive planning of what will be known as the next-generation PO.DAAC Information Architecture (IA). The IA effort acknowledges that missions such as SWOT (and NISAR) have little or no precedent in terms of data volume, hot and cold storage, archival, analysis, existing system engineering complexities, etc., and that the only way we can better understand the projected impacts of such requirements is to interface directly with the user community. Additionally, it also acknowledges that collective learning has taken place to understand certain limitations in the existing data models (DM) underlying the existing PO.DAAC Data Management and Archival System. This work documents an evolutionary, use-case-based, standards-driven approach to adapting the legacy DM and accompanying knowledge representation infrastructure at NASA JPL's PO.DAAC to address forthcoming DAAC mission requirements presented by missions such as SWOT. Some of the topics covered in this evolution include, but are not limited to: How we are leveraging lessons learned from the development of existing DM (such as that generated for SMAP) in an attempt to map them to SWOT. What is the governance model for the SWOT IA? What are the 'governing' entities? What is the hierarchy of the 'governed' entities? How are elements grouped? How is the design working group formed? How is model independence maintained, and what choices/requirements do we have for the implementation language? The use of standards such as CF Conventions, NetCDF, HDF and ISO Metadata, etc. Beyond SWOT: what choices were made such that the new PO.DAAC IA will be flexible enough and adequately designed so that future missions with even more advanced requirements can be accommodated within PO.DAAC?
Sharma, Vinod; Simpson, Richard; Lopresti, Edmund; Schmeler, Mark
2010-01-01
Some individuals with disabilities are denied powered mobility because they lack the visual, motor, and/or cognitive skills required to safely operate a power wheelchair. The Drive-Safe System (DSS) is an add-on, distributed, shared-control navigation assistance system for power wheelchairs intended to provide safe and independent mobility to such individuals. The DSS is a human-machine system in which the user is responsible for high-level control of the wheelchair, such as choosing the destination, path planning, and basic navigation actions, while the DSS overrides unsafe maneuvers through autonomous collision avoidance, wall following, and door crossing. In this project, the DSS was clinically evaluated in a controlled laboratory with blindfolded, nondisabled individuals. Further, these individuals' performance with the DSS was compared with standard cane use for navigation assistance by people with visual impairments. Results indicate that compared with a cane, the DSS significantly reduced the number of collisions. Users rated the DSS favorably even though they took longer to navigate the same obstacle course than they would have using a standard long cane. Participants experienced less physical demand, effort, and frustration when using the DSS as compared with a cane. These findings suggest that the DSS can be a viable powered mobility solution for wheelchair users with visual impairments.
Embracing value co-creation in primary care services research: a framework for success.
Janamian, Tina; Crossland, Lisa; Jackson, Claire L
2016-04-18
Value co-creation redresses a key criticism of researcher-driven approaches to research - that researchers may lack insight into the end users' needs and values across the research journey. Value co-creation creates, in a step-wise way, value with, and for, multiple stakeholders through regular, ongoing interactions leading to innovation, increased productivity and co-created outcomes of value to all parties - thus creating a "win more-win more" environment. The Centre of Research Excellence (CRE) in Building Primary Care Quality, Performance and Sustainability has co-created outcomes of value that have included robust and enduring partnerships, research findings that have value to end users (such as the Primary Care Practice Improvement Tool and the best-practice governance framework), an International Implementation Research Network in Primary Care and the International Primary Health Reform Conference. Key lessons learned in applying the strategies of value co-creation have included the recognition that partnership development requires an investment of time and effort to ensure meaningful interactions and enriched end user experiences, that research management systems including governance, leadership and communication also need to be "co-creative", and that openness and understanding is needed to work across different sectors and cultures with flexibility, fairness and transparency being essential to the value co-creation process.
The PARIGA server for real time filtering and analysis of reciprocal BLAST results.
Orsini, Massimiliano; Carcangiu, Simone; Cuccuru, Gianmauro; Uva, Paolo; Tramontano, Anna
2013-01-01
BLAST-based similarity searches are commonly used in several applications involving both nucleotide and protein sequences. These applications span from simple tasks such as mapping sequences over a database to more complex procedures such as clustering or annotation processes. When the amount of analysed data increases, manual inspection of BLAST results becomes a tedious procedure, and tools for parsing or filtering BLAST results for different purposes are then required. We describe here PARIGA (http://resources.bioinformatica.crs4.it/pariga/), a server that enables users to perform all-against-all BLAST searches on two sets of sequences selected by the user. Moreover, since it stores the two BLAST outputs in a python-serialized-objects database, results can be filtered according to several parameters in real time, without re-running the process and avoiding additional programming effort. Results can be interrogated by the user using logical operations, for example to retrieve cases where two queries match the same targets, or where sequences from the two datasets are reciprocal best hits, or where a query matches a target in multiple regions. The PARIGA web server is designed to be a helpful tool for managing the results of sequence similarity searches. The design and implementation of the server render all operations very fast and easy to use.
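One of the filters mentioned above, reciprocal best hits between the two sequence sets, can be sketched directly from standard BLAST tabular output (-outfmt 6). This is an independent illustration, not PARIGA's python-serialized-objects implementation, and the file names are placeholders.

```python
# Minimal sketch of one PARIGA-style filter: reciprocal best hits between two
# sequence sets, computed from standard BLAST tabular output (-outfmt 6).
# File names are hypothetical; PARIGA itself stores parsed results differently.
def best_hits(tabular_path):
    """Map each query to its best target (highest bit score, column 12)."""
    best = {}
    with open(tabular_path) as fh:
        for line in fh:
            cols = line.rstrip("\n").split("\t")
            query, target, bitscore = cols[0], cols[1], float(cols[11])
            if query not in best or bitscore > best[query][1]:
                best[query] = (target, bitscore)
    return {q: t for q, (t, _) in best.items()}

a_vs_b = best_hits("setA_vs_setB.tsv")
b_vs_a = best_hits("setB_vs_setA.tsv")

reciprocal = [(a, b) for a, b in a_vs_b.items() if b_vs_a.get(b) == a]
for a, b in reciprocal:
    print(f"{a} <-> {b} are reciprocal best hits")
```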
Design and Implementation of a Metadata-rich File System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ames, S; Gokhale, M B; Maltzahn, C
2010-01-19
Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
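The graph data model can be illustrated with a small stand-alone sketch: each file carries user-defined attributes plus typed relationships to other files, and queries mix attribute matching with relationship traversal. This is plain Python, not the Quasar/QFS implementation, and the attribute and relationship names are invented.

```python
# Illustration of the metadata-rich idea: files carry user-defined attributes and
# typed relationships to other files, and queries traverse both. This is a plain
# Python sketch, not the Quasar/QFS implementation; attribute names are made up.
files = {
    "run42/raw.h5":     {"attrs": {"instrument": "laserA", "run": 42}, "links": []},
    "run42/derived.h5": {"attrs": {"stage": "calibrated", "run": 42},
                         "links": [("derived_from", "run42/raw.h5")]},
    "run42/notes.txt":  {"attrs": {"author": "ames"},
                         "links": [("describes", "run42/derived.h5")]},
}

def find(attr, value):
    """All files whose user-defined attribute matches a value."""
    return [p for p, f in files.items() if f["attrs"].get(attr) == value]

def follow(path, relation):
    """Files reachable from 'path' over one hop of the given relationship type."""
    return [dst for rel, dst in files[path]["links"] if rel == relation]

print(find("run", 42))                              # attribute query
print(follow("run42/derived.h5", "derived_from"))   # relationship traversal
```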
Modernization of the NASA scientific and technical information program
NASA Technical Reports Server (NTRS)
Cotter, Gladys A.; Hunter, Judy F.; Ostergaard, K.
1993-01-01
The NASA Scientific and Technical Information Program utilizes a technology infrastructure assembled in the mid 1960s to late 1970s to process and disseminate its information products. When this infrastructure was developed it placed NASA as a leader in processing STI. The retrieval engine for the STI database was the first of its kind and was used as the basis for developing commercial, other U.S., and foreign government agency retrieval systems. Due to the combination of changes in user requirements and the tremendous increase in technological capabilities readily available in the marketplace, this infrastructure is no longer the most cost-effective or efficient methodology available. Consequently, the NASA STI Program is pursuing a modernization effort that applies new technology to current processes to provide near-term benefits to the user. In conjunction with this activity, we are developing a long-term modernization strategy designed to transition the Program to a multimedia, global 'library without walls.' Critical pieces of the long-term strategy include streamlining access to sources of STI by using advances in computer networking and graphical user interfaces; creating and disseminating technical information in various electronic media including optical disks, video, and full text; and establishing a Technology Focus Group to maintain a current awareness of emerging technology and to plan for the future.
Bridging Hydroinformatics Services Between HydroShare and SWATShare
NASA Astrophysics Data System (ADS)
Merwade, V.; Zhao, L.; Song, C. X.; Tarboton, D. G.; Goodall, J. L.; Stealey, M.; Rajib, A.; Morsy, M. M.; Dash, P. K.; Miles, B.; Kim, I. L.
2016-12-01
Many cyberinfrastructure systems in the hydrologic and related domains emerged in the past decade, with more being developed to address various data management and modeling needs. Although clearly beneficial to the broad user community, it is a challenging task to build interoperability across these systems due to various obstacles including technological, organizational, semantic, and social issues. This work presents our experience in developing interoperability between two hydrologic cyberinfrastructure systems - SWATShare and HydroShare. HydroShare is a large-scale online system aimed at enabling the hydrologic user community to share their data, models, and analysis online for solving complex hydrologic research questions. On the other side, SWATShare is a focused effort to allow SWAT (Soil and Water Assessment Tool) modelers to share, execute and analyze SWAT models using high performance computing resources. Making these two systems interoperable required common sign-in through OAuth, sharing of models through common metadata standards, and use of standard web services for implementing key import/export functionalities. As a result, users from either community can leverage the resources and services across these systems without having to manually import, export, or process their models. Overall, this use case is an example that can serve as a model for interoperability among other systems, as no one system can provide all the functionality needed to address large interdisciplinary problems.
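The integration pattern described above (a common OAuth sign-in plus standard web services for import/export) can be sketched as follows. The endpoint paths, field names, and URLs are placeholders for illustration only, not the documented HydroShare or SWATShare APIs.

```python
# Sketch of the cross-system pattern described above: authenticate once with an
# OAuth2 bearer token, then move a model resource between systems over plain
# web services. Endpoint paths and field names below are placeholders, not the
# documented HydroShare or SWATShare APIs.
import requests

TOKEN = "oauth2-access-token-obtained-at-sign-in"   # from the common OAuth flow
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def export_model(source_base, resource_id):
    """Download a shared model resource (e.g., a zipped SWAT model) by its id."""
    resp = requests.get(f"{source_base}/resource/{resource_id}/", headers=HEADERS, timeout=60)
    resp.raise_for_status()
    return resp.content

def import_model(target_base, name, payload):
    """Register the downloaded model with the second system for HPC execution."""
    resp = requests.post(f"{target_base}/models/",
                         headers=HEADERS,
                         files={"archive": (f"{name}.zip", payload)},
                         data={"name": name},
                         timeout=60)
    resp.raise_for_status()
    return resp.json()

archive = export_model("https://example.org/hsapi", "abc123")
print(import_model("https://example.org/swatshare/api", "LittleRiverSWAT", archive))
```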
NASA Technical Reports Server (NTRS)
Arya, Vinod K.; Halford, Gary R. (Technical Monitor)
2003-01-01
This manual presents the FLAPS computer programs for characterizing and predicting the fatigue and creep-fatigue resistance of metallic materials in the high-temperature, long-life regime for isothermal and nonisothermal fatigue. The programs use the Total Strain version of Strainrange Partitioning (TS-SRP) and several other life prediction methods described in this manual. The user should be thoroughly familiar with TS-SRP and these life prediction methods before attempting to use any of these programs; improper understanding can lead to incorrect use of the methods and erroneous life predictions. An extensive database has also been developed in a parallel effort. The database is probably the largest source of high-temperature creep-fatigue test data available in the public domain and can be used with other life-prediction methods as well. This users' manual, the software, and the database are all in the public domain and can be obtained by contacting the author. The Compact Disk (CD) accompanying this manual contains an executable file for the FLAPS program, two datasets required for the example problems in the manual, and the creep-fatigue data in a format compatible with these programs.
Advances in model-based software for simulating ultrasonic immersion inspections of metal components
NASA Astrophysics Data System (ADS)
Chiou, Chien-Ping; Margetan, Frank J.; Taylor, Jared L.; Engle, Brady J.; Roberts, Ronald A.
2018-04-01
Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was initiated in 2015 to repackage existing research-grade software into user-friendly tools for the rapid estimation of signal-to-noise ratio (SNR) for ultrasonic inspections of metals. The software combines: (1) a Python-based graphical user interface for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signals and backscattered grain noise characteristics. The latter makes use of the Thompson-Gray measurement model for the response from an internal defect, and the Thompson-Margetan independent scatterer model for backscattered grain noise. This paper, the third in the series [1-2], provides an overview of the ongoing modeling effort with emphasis on recent developments. These include the ability to: (1) treat microstructures where grain size, shape and tilt relative to the incident sound direction can all vary with depth; and (2) simulate C-scans of defect signals in the presence of backscattered grain noise. The simulation software can now treat both normal and oblique-incidence immersion inspections of curved metal components. Both longitudinal and shear-wave inspections are treated. The model transducer can be planar, spherically focused, or bi-cylindrically focused. A calibration (or reference) signal is required and is used to deduce the measurement system efficiency function. This can be "invented" by the software using center frequency and bandwidth information specified by the user, or, alternatively, a measured calibration signal can be used. Defect types include flat-bottomed-hole reference reflectors, and spherical pores and inclusions. Simulation outputs include estimated defect signal amplitudes, root-mean-square values of grain noise amplitudes, and SNR as functions of the depth of the defect within the metal component. At any particular depth, the user can view simulated A-, B-, and C-scans displaying the superimposed defect and grain-noise waveforms. The realistic grain noise signals used in the A-scans are generated from a set of measured "universal" noise signals whose strengths and spectral characteristics are altered to match predicted noise characteristics for the simulation at hand.
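The simulation outputs listed above reduce, at each depth, to a ratio of the predicted defect signal amplitude to the root-mean-square grain-noise amplitude. The sketch below shows only that bookkeeping; the depth profiles are invented placeholders, not the Thompson-Gray or independent-scatterer physics.

```python
# Bookkeeping sketch only: given a predicted defect amplitude and rms grain-noise
# amplitude at each depth, report SNR versus depth. The amplitude profiles below
# are invented placeholders, not the Thompson-Gray / independent-scatterer models.
import numpy as np

depths_mm = np.linspace(5, 50, 10)
defect_amp = 1.0 * np.exp(-0.04 * depths_mm)          # placeholder attenuation profile
noise_rms = 0.05 + 0.001 * depths_mm                  # placeholder grain-noise profile

snr = defect_amp / noise_rms
snr_db = 20.0 * np.log10(snr)

for z, s in zip(depths_mm, snr_db):
    print(f"depth {z:5.1f} mm : SNR {s:5.1f} dB")
```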
Focusing the research agenda for simulation training visual system requirements
NASA Astrophysics Data System (ADS)
Lloyd, Charles J.
2014-06-01
Advances in the capabilities of the display-related technologies with potential uses in simulation training devices continue to occur at a rapid pace. Simultaneously, ongoing reductions in defense spending stimulate the services to push a higher proportion of training into ground-based simulators to reduce their operational costs. These two trends result in increased customer expectations and desires for more capable training devices, while the money available for these devices is decreasing. Thus, there exists an increasing need to improve the efficiency of the acquisition process and to increase the probability that users get the training devices they need at the lowest practical cost. In support of this need the IDEAS program was initiated in 2010 with the goal of improving display system requirements associated with unmet user needs and expectations and disrupted acquisitions. This paper describes a process of identifying, rating, and selecting the design parameters that should receive research attention. Analyses of existing requirements documents reveal that between 40 and 50 specific design parameters (i.e., resolution, contrast, luminance, field of view, frame rate, etc.) are typically called out for the acquisition of a simulation training display system. Obviously no research effort can address the effects of this many parameters. Thus, we developed a defensible strategy for focusing limited R&D resources on a fraction of these parameters. This strategy encompasses six criteria to identify the parameters most worthy of research attention. Examples based on display design parameters recommended by stakeholders are provided.
System Considerations and Challenges in 3D Mapping and Modeling Using Low-Cost UAV Systems
NASA Astrophysics Data System (ADS)
Lari, Z.; El-Sheimy, N.
2015-08-01
In the last few years, low-cost UAV systems have been acknowledged as an affordable technology for geospatial data acquisition that can meet the needs of a variety of traditional and non-traditional mapping applications. In spite of its proven potential, UAV-based mapping is still lacking in terms of what is needed for it to become an acceptable mapping tool. In other words, a well-designed system architecture that considers payload restrictions as well as the specifications of the utilized direct geo-referencing component and the imaging systems in light of the required mapping accuracy and intended application is still required. Moreover, efficient data processing workflows, which are capable of delivering the mapping products with the specified quality while considering the synergistic characteristics of the sensors onboard, the wide range of potential users who might lack deep knowledge in mapping activities, and the time constraints of emerging applications, still need to be adopted. Therefore, the challenges introduced by having low-cost imaging and georeferencing sensors onboard UAVs with limited payload capability, the necessity of efficient data processing techniques for delivering the required products for the intended applications, and the diversity of potential users with insufficient mapping-related expertise need to be fully investigated and addressed by UAV-based mapping research efforts. This paper addresses these challenges and reviews system considerations, adaptive processing techniques, and quality assurance/quality control procedures for achievement of accurate mapping products from these systems.
Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.
Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda
2013-01-01
How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition the reader will already know that the answer is: with difficulty or not at all. In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advanced knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a list of desiderata with our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.
Active opioid use does not attenuate the humoral responses to inactivated influenza vaccine
Moroz, Ekaterina; Albrecht, Randy A.; Aden, Brandon; Beeder, Ann Bordwine; Yuan, Jianda; García-Sastre, Adolfo; Edlin, Brian R.; Salvatore, Mirella
2016-01-01
Background Influenza vaccination is recommended for vulnerable individuals, including active drug users, to prevent influenza complications and decrease influenza spread. Recent studies suggest that opioids negatively regulate immune responses in experimental models, but the extent to which opioid use will affect the humoral responses to influenza vaccine in humans is unknown. This information is critical for maximizing vaccination efforts. Objective To determine whether there is a difference in antibody response after influenza vaccination in heroin or methadone users compared to control subjects. Methods We studied active heroin users, subjects on methadone maintenance treatment (MMT) and subjects that did not use any drugs before and 1 and 4 weeks after vaccination with trivalent influenza vaccine (TIV). We measured hemagglutination inhibition and microneutralization titers, and we compared geometric mean titers (GMT) and rates of seroprotection and seroconversion for each of the vaccine strains among the 3 groups of subjects. Results Heroin users, subjects on MMT and non-user controls mounted a similarly robust serologic response to TIV. GMT and rates of seroprotection and seroconversion were not significantly different among groups. Conclusion Our results suggest that opioid use does not significantly alter antibody responses to influenza vaccine, supporting the vaccination effort in these populations. PMID:26859239
Don't Trust a Management Metric, Especially in Life Support
NASA Technical Reports Server (NTRS)
Jones, Harry W.
2014-01-01
Goodhart's law states that metrics do not work. Metrics become distorted when used and they deflect effort away from more important goals. These well-known and unavoidable problems occurred when the closure and system mass metrics were used to manage life support research. The intent of life support research should be to develop flyable, operable, reliable systems, not merely to increase life support system closure or to reduce its total mass. It would be better to design life support systems to meet the anticipated mission requirements and user needs. Substituting the metrics of closure and total mass for these goals seems to have led life support research to solve the wrong problems.
NASA Technical Reports Server (NTRS)
Aaron, Susan
1991-01-01
One of the many services NSI provides as an extension of customer/user support is attendance at major scientific conferences. The conference effort provides NASA/OSSA scientists with many benefits: (1) scientists get to see NSI in action; they utilize the network to read email, and have recently begun to demonstrate their scientific research to their colleagues; (2) scientists get an opportunity to meet and interact with NSI staff, which gives them a chance to get status on their requirements, ask about network status, get acquainted with our procedures, and learn about services; and (3) scientists are exposed to networking in a larger sense, particularly by learning about other NASA groups who provide valuable scientific resources over the Internet.
CCMC Plans to Support SDO Operations
NASA Technical Reports Server (NTRS)
MacNeice, Peter
2008-01-01
The CCMC will actively support the SDO Mission. It will do this, wherever feasible, by installing and running those models which the SDO science planners deem both appropriate and necessary to enable the science goals of SDO. In this presentation I will outline our philosophy in offering this support, the models we are actively pursuing to enable this, and the modes in which we intend to run these models. I will discuss how users of SDO data will be able to request model runs and analyse their outputs. I will also describe the facilities which we have at our disposal to support this effort, and our expectations for the resource requirements which this support will need.
Laboratory Information Management System (LIMS): A case study
NASA Technical Reports Server (NTRS)
Crandall, Karen S.; Auping, Judith V.; Megargle, Robert G.
1987-01-01
In the late 1970s, a refurbishment of the analytical laboratories serving the Materials Division at NASA Lewis Research Center was undertaken. As part of the modernization efforts, a Laboratory Information Management System (LIMS) was to be included. Preliminary studies indicated a custom-designed system as the best choice in order to satisfy all of the requirements. A scaled-down version of the original design has been in operation since 1984. The LIMS, a combination of computer hardware and software, provides the chemical characterization laboratory with an information database, a report generator, a user interface, and networking capabilities. This paper is an account of the processes involved in designing and implementing that LIMS.
Numerical aerodynamic simulation facility. Preliminary study extension
NASA Technical Reports Server (NTRS)
1978-01-01
The production of an optimized design of key elements of the candidate facility was the primary objective of this report. This was accomplished by effort in the following tasks: (1) to further develop, optimize, and describe the functional description of the custom hardware; (2) to delineate trade-off areas between performance, reliability, availability, serviceability, and programmability; (3) to develop metrics and models for validation of the candidate system's performance; (4) to conduct a functional simulation of the system design; (5) to perform a reliability analysis of the system design; and (6) to develop the software specifications, including a user-level high-level programming language, a correspondence between the programming language and the instruction set, and an outline of the operating system requirements.
DIGIMEN, optical mass memory investigations, volume 2
NASA Technical Reports Server (NTRS)
1977-01-01
The DIGIMEM phase of the Optical Mass Memory Investigation Program addressed problems related to the analysis, design, and implementation of a direct digital optical recorder/reproducer. Effort was placed on developing an operational archival mass storage system to support one or more key NASA missions. The primary activity of the DIGIMEM program phase was the design, fabrication, test, and evaluation of a breadboard digital optical recorder/reproducer. Starting with technology and subsystems perfected during the HOLOMEM program phase, a fully operational optical spot recording breadboard that met or exceeded all program goals was evaluated. A thorough evaluation of several high-resolution electrophotographic recording films was performed, and a preliminary database management/end user requirements survey was completed.
Economic model for QoS guarantee on the Internet
NASA Astrophysics Data System (ADS)
Zhang, Chi; Wei, Jiaolong
2001-09-01
This paper describes a QoS guarantee architecture suited for best-effort environments, based on ideas from microeconomics and non-cooperative game theory. First, an analytic model is developed for the study of resource allocation in the Internet. Then we show that with a simple pricing mechanism (from the network implementation and users' points of view), we are able to provide QoS guarantees at the per-flow level without resource allocation, complicated scheduling mechanisms, or maintaining per-flow state in the core network. Unlike previous work in this area, we extend the basic model to support inelastic applications, which require minimum bandwidth guarantees for a given time period, by introducing a derivatives market.
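The abstract does not give the model's equations, so the following is a minimal sketch of the standard setup such pricing schemes build on: each user buys bandwidth to maximize a logarithmic utility minus the posted price, and the network sets the single price at which total demand equals link capacity. The weights and capacity are invented; this is not the paper's exact formulation.

```python
# Minimal sketch of the kind of pricing equilibrium such models build on
# (assumption: logarithmic user utility w*log(x), a single bottleneck link, and a
# price adjusted until demand equals capacity). Not the paper's exact formulation.
def demand(weights, price):
    """Each user maximizes w*log(x) - price*x, so buys x = w / price."""
    return [w / price for w in weights]

def market_clearing_price(weights, capacity):
    """With log utilities, total demand sum(w)/p equals capacity at p = sum(w)/C."""
    return sum(weights) / capacity

weights = [1.0, 2.0, 4.0]        # users' willingness to pay (hypothetical)
capacity = 10.0                  # link capacity in Mb/s (hypothetical)
p = market_clearing_price(weights, capacity)
print("price:", p, "allocations:", demand(weights, p))
# Higher-paying users get proportionally more bandwidth without per-flow state
# or complicated scheduling: the price alone enforces the differentiation.
```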
Optimization of a Monte Carlo Model of the Transient Reactor Test Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kristin; DeHart, Mark; Goluoglu, Sedat
2017-03-01
The ultimate goal of modeling and simulation is to obtain reasonable answers to problems that don't have representations which can be easily evaluated, while minimizing the amount of computational resources. With the advances during the last twenty years of large scale computing centers, researchers have had the ability to create a multitude of tools to minimize the number of approximations necessary when modeling a system. The tremendous power of these centers requires the user to possess an immense amount of knowledge to optimize the models for accuracy and efficiency. This paper seeks to evaluate the KENO model of TREAT to optimize calculational efforts.
An online semi-supervised brain-computer interface.
Gu, Zhenghui; Yu, Zhuliang; Shen, Zhifang; Li, Yuanqing
2013-09-01
Practical brain-computer interface (BCI) systems should require only a low training effort from the user, and the algorithms used to classify the user's intent should be computationally efficient. However, due to inter- and intra-subject variations in EEG signals, intermittent training/calibration is often unavoidable. In this paper, we present an online semi-supervised P300 BCI speller system. After a short initial training session (around or less than 1 min in our experiments), the system is switched to a mode where the user can input characters through selective attention. In this mode, a self-training least squares support vector machine (LS-SVM) classifier is updated in the back end with the unlabeled EEG data collected online after every character input, so that the classifier is gradually enhanced. Even though the user may experience some input errors at the beginning due to the small initial training dataset, the accuracy approaches that of the fully supervised method within a few minutes. The algorithm based on LS-SVM and its sequential update has low computational complexity; thus, it is suitable for online applications. The effectiveness of the algorithm has been validated through data analysis on BCI Competition III dataset II (P300 speller BCI data). The performance of the online system was evaluated through experimental results on eight healthy subjects, all of whom achieved a spelling accuracy of 85% or above within an average online semi-supervised learning time of around 3 min.
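The self-training loop can be sketched with a generic classifier standing in for the LS-SVM (which is not available in scikit-learn); the feature extraction, data shapes, and epoch counts below are hypothetical placeholders.

```python
# Sketch of the online self-training loop described above. A ridge classifier
# stands in for the paper's LS-SVM; feature extraction and data shapes are
# hypothetical placeholders.
import numpy as np
from sklearn.linear_model import RidgeClassifier

def train_initial(X_labeled, y_labeled):
    """Fit the classifier on the short calibration session."""
    clf = RidgeClassifier()
    clf.fit(X_labeled, y_labeled)
    return clf

def self_training_update(clf, X_labeled, y_labeled, X_unlabeled):
    """After each character input, pseudo-label the new unlabeled epochs and refit."""
    pseudo = clf.predict(X_unlabeled)
    X_all = np.vstack([X_labeled, X_unlabeled])
    y_all = np.concatenate([y_labeled, pseudo])
    clf.fit(X_all, y_all)
    return clf, X_all, y_all

rng = np.random.default_rng(0)
X0, y0 = rng.normal(size=(60, 20)), rng.integers(0, 2, size=60)   # ~1 min of calibration
clf = train_initial(X0, y0)
for _ in range(10):                      # one iteration per spelled character
    X_new = rng.normal(size=(12, 20))    # epochs collected during that character
    clf, X0, y0 = self_training_update(clf, X0, y0, X_new)
```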
The AAS Working Group on Accessibility and Disability (WGAD) Year 1 Highlights and Database Access
NASA Astrophysics Data System (ADS)
Knierman, Karen A.; Diaz Merced, Wanda; Aarnio, Alicia; Garcia, Beatriz; Monkiewicz, Jacqueline A.; Murphy, Nicholas Arnold
2017-06-01
The AAS Working Group on Accessibility and Disability (WGAD) was formed in January of 2016 with the express purpose of seeking equity of opportunity and building inclusive practices for disabled astronomers at all educational and career stages. In this presentation, we will provide a summary of current activities, focusing on developing best practices for accessibility with respect to astronomical databases, publications, and meetings. Due to the reliance of the space sciences on databases, it is important to have user-centered design systems for data retrieval. The cognitive overload that may be experienced by users of current databases may be mitigated by the use of multi-modal interfaces such as xSonify. Such interfaces would sit in parallel with or outside the original database and would not require additional software effort from the original database. WGAD is partnering with the IAU Commission C1 WG Astronomy for Equity and Inclusion to develop such accessibility tools for databases and methods for user testing. To collect data on astronomical conference and meeting accessibility considerations, WGAD solicited feedback from January AAS attendees via a web form. These data, together with upcoming input from the community and analysis of accessibility documents of similar conferences, will be used to create a meeting accessibility document. Additionally, we will report progress on journal access guidelines and our social media presence via Twitter. We recommend that astronomical journals form committees to evaluate the accessibility of their publications by performing user-centered usability studies.
Yoon, Young
2017-01-01
Web of Things (WoT) platforms are growing fast, and so are the needs for composing WoT apps more easily and efficiently. We have recently commenced a campaign to develop an interface where users can issue requests for WoT apps entirely in natural language. This requires an effort to build a system that can learn to identify relevant WoT functions that fulfill users' requests. In our preceding work, we trained a supervised learning system with thousands of publicly available IFTTT app recipes based on conditional random fields (CRF). However, the sub-par accuracy and excessive training time motivated us to devise a better approach. In this paper, we present a novel solution that creates a separate learning engine for each trigger service. With this approach, parallel and incremental learning becomes possible. For inference, our system first identifies the most relevant trigger service for a given user request by using an information retrieval technique. Then, the learning engine associated with that trigger service predicts the most likely pair of trigger and action functions. We expect that such a two-phase inference method, given parallel learning engines, would improve the accuracy of identifying related WoT functions. We verify our new solution through an empirical evaluation with training and test sets sampled from a pool of refined IFTTT app recipes. We also meticulously analyze the characteristics of the recipes to find future research directions.
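A minimal sketch of the two-phase inference described above: TF-IDF retrieval first picks the most relevant trigger service for the request, and that service's own classifier then predicts a trigger/action pair. The toy recipes, service names, and labels below are invented stand-ins for the refined IFTTT data.

```python
# Sketch of the two-phase inference described above: (1) retrieve the most
# relevant trigger service for the request with TF-IDF similarity, (2) let that
# service's own model predict a trigger/action pair. Recipes below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import cosine_similarity

# Per-service training data: request text -> "trigger_function::action_function"
recipes = {
    "weather": (["text me when it will rain tomorrow", "email me the daily forecast"],
                ["rain_forecast::send_sms", "daily_forecast::send_email"]),
    "door_sensor": (["turn on the hallway light when the front door opens",
                     "notify me if the garage door is left open"],
                    ["door_opened::light_on", "door_open_too_long::send_notification"]),
}

# Phase 1: a retrieval index over one "description" document per trigger service.
service_descriptions = {s: " ".join(texts) for s, (texts, _) in recipes.items()}
retriever = TfidfVectorizer().fit(service_descriptions.values())
service_matrix = retriever.transform(service_descriptions.values())
service_names = list(service_descriptions)

# Phase 2: one small classifier per trigger service (trainable in parallel).
engines = {}
for service, (texts, labels) in recipes.items():
    vec = TfidfVectorizer().fit(texts)
    engines[service] = (vec, LogisticRegression().fit(vec.transform(texts), labels))

def infer(request):
    sims = cosine_similarity(retriever.transform([request]), service_matrix)[0]
    service = service_names[sims.argmax()]
    vec, model = engines[service]
    return service, model.predict(vec.transform([request]))[0]

print(infer("send me a text if rain is expected tomorrow"))
```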
Development of regional climate scenarios in the Netherlands - involvement of users
NASA Astrophysics Data System (ADS)
Bessembinder, Janette; Overbeek, Bernadet
2013-04-01
Climate scenarios are consistent and plausible pictures of possible future climates. They are intended for use in studies exploring the impacts of climate change and to formulate possible adaptation strategies. To ensure that the developed climate scenarios are relevant to the intended users, interaction with the users is needed. As part of the research programmes "Climate changes Spatial Planning" and "Knowledge for Climate", several projects on climate services, tailoring of climate information and communication were conducted. Some of the important lessons learned about user interaction are:
*) To be able to deliver relevant climate information in the right format, proper knowledge is required on who will be using the climate information and data, how it will be used and why they use it;
*) Users' requirements can be very diverse and may change over time. Therefore, sustained (personal) contact with users is required;
*) Organising meetings with climate researchers and users of climate information together, and working together in projects, results in mutual understanding of the requirements of users and the limitations to delivering certain types of climate information, which facilitates communication and results in more widely accepted products;
*) Information and communication should be adapted to the type of users (e.g. impact researchers or policy makers) and to the type of problem (unstructured problems require much more contact with the users).
In 2001 KNMI developed climate scenarios for the National Commission on Water management in the 21st century (WB21 scenarios). In 2006 these were replaced by the KNMI'06 scenarios, intended for a broader group of users. The above lessons are now taken into account during the development of the next generation of climate scenarios for the Netherlands, expected at the end of 2013, after the publication of the IPCC WG1 report:
*) users' requirements are taken into account explicitly in the whole process of the development of the climate scenarios;
*) users are involved already in the early phases of the development of new scenarios, among others in the following ways:
**) workshops on users' requirements to check whether they have changed and to get more information;
**) a feedback group of users to get more detailed feedback on the modes of communication;
**) a newsletter with information on the progress and procedures to be followed, and separate workshops for researchers and policy makers with different levels of detail;
**) projects together with impact researchers: tailoring of data in order to be able to present impact information consistent with the climate scenarios much earlier.
During the presentation more detailed information will be given on the interaction with users.
The intelligent user interface for NASA's advanced information management systems
NASA Technical Reports Server (NTRS)
Campbell, William J.; Short, Nicholas, Jr.; Rolofs, Larry H.; Wattawa, Scott L.
1987-01-01
NASA has initiated the Intelligent Data Management Project to design and develop advanced information management systems. The project's primary goal is to formulate, design and develop advanced information systems that are capable of supporting the agency's future space research and operational information management needs. The first effort of the project was the development of a prototype Intelligent User Interface to an operational scientific database, using expert systems and natural language processing technologies. An overview of Intelligent User Interface formulation and development is given.
SmartG: Spontaneous Malaysian Augmented Reality Tourist Guide
NASA Astrophysics Data System (ADS)
Kasinathan, Vinothini; Mustapha, Aida; Subramaniam, Tanabalan
2016-11-01
In an effort to attract higher tourist expenditure along with higher tourist arrivals, this paper proposes a travel application called SmartG, an acronym for Spontaneous Malaysian Augmented Reality Tourist Guide, which operates by making recommendations to the user based on the travel objective and individual budget constraints. The application relies on augmented reality technology, whereby a three-dimensional model is presented to the user based on input from the real-world environment. User testing returned favorable feedback on the concept of using augmented reality to promote Malaysian tourism.
Human-telerobot interactions - Information, control, and mental models
NASA Technical Reports Server (NTRS)
Smith, Randy L.; Gillan, Douglas J.
1987-01-01
Part of NASA's Space Station will be a teleoperated robot (telerobot) with arms for grasping and manipulation, feet for holding onto objects, and television cameras for visual feedback. The objective of the work described in this paper is to develop the requirements and specifications for the user-telerobot interface and to determine through research and testing that the interface results in efficient system operation. The focus of the development of the user-telerobot interface is on the information required by the user, the user inputs, and the design of the control workstation. Closely related to both the information required by the user and the user's control of the telerobot is the user's mental model of the relationship between the control inputs and the telerobot's actions.
Characterizing Interference in Radio Astronomy Observations through Active and Unsupervised Learning
NASA Technical Reports Server (NTRS)
Doran, G.
2013-01-01
In the process of observing signals from astronomical sources, radio astronomers must mitigate the effects of manmade radio sources such as cell phones, satellites, aircraft, and observatory equipment. Radio frequency interference (RFI) often occurs as short bursts (< 1 ms) across a broad range of frequencies, and can be confused with signals from sources of interest such as pulsars. With ever-increasing volumes of data being produced by observatories, automated strategies are required to detect, classify, and characterize these short "transient" RFI events. We investigate an active learning approach in which an astronomer labels events that are most confusing to a classifier, minimizing the human effort required for classification. We also explore the use of unsupervised clustering techniques, which automatically group events into classes without user input. We apply these techniques to data from the Parkes Multibeam Pulsar Survey to characterize several million detected RFI events from over a thousand hours of observation.
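The active-learning strategy can be sketched as pool-based uncertainty sampling: at each round the classifier asks the astronomer to label only the candidate event it is least certain about. The features, labels, and stand-in classifier below are synthetic; this is not the actual Parkes processing pipeline.

```python
# Minimal pool-based active-learning sketch (uncertainty sampling): each round,
# the classifier requests a label for the candidate event it is least sure about,
# minimizing the number of expert labels needed. Features and the hidden "truth"
# below are synthetic stand-ins, not the Parkes pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X_pool = rng.normal(size=(500, 8))                                  # unlabeled candidate events
true_labels = (X_pool[:, 0] + 0.5 * X_pool[:, 1] > 0).astype(int)   # hidden ground truth

# Seed set: one example of each class, as if labeled up front by the astronomer.
labeled = [int(np.argmax(true_labels)), int(np.argmin(true_labels))]
clf = LogisticRegression()

for _ in range(20):                                    # 20 expert labels in total
    clf.fit(X_pool[labeled], true_labels[labeled])
    proba = clf.predict_proba(X_pool)[:, 1]
    uncertainty = np.abs(proba - 0.5)
    uncertainty[labeled] = np.inf                      # don't ask about the same event twice
    query = int(np.argmin(uncertainty))                # most confusing event
    labeled.append(query)                              # "astronomer" supplies its label

print("labels used:", len(labeled),
      "accuracy on pool:", (clf.predict(X_pool) == true_labels).mean())
```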
A Multidimensional View of Personal Health Systems for Underserved Populations
Botts, Nathan E; Burkhard, Richard J
2010-01-01
The advent of electronic personal health records (PHR) provides a major opportunity to encourage positive health management practices, such as chronic disease management. Yet, to date there has been little attention toward the use of PHRs where advanced health information services are perhaps most needed, namely, in underserved communities. Drawing upon research conducted with safety net providers and patients, the authors propose a multi-level analytical framework for guiding actions aimed at fostering PHR adoption and utilization. The authors first outline distinctive user and technical requirements that need to be considered. Next, they assess organizational requirements necessary to implement PHRs within health systems bound by limited resources. Finally, the authors analyze the overriding health care policy context that can facilitate or thwart such efforts. The conclusion notes that heightened national attention toward health information technology and reform provides a significant opportunity for initiatives whose goal is to increase widespread access to PHRs. PMID:20685644
An image analysis system for near-infrared (NIR) fluorescence lymph imaging
NASA Astrophysics Data System (ADS)
Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.
2011-03-01
Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of an NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software [3, 4] is underdeveloped, requiring extensive time and effort to analyze an NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.
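A sketch of how the two reported quantities could be derived from fluorescence intensity traces at two regions of interest (ROIs) along a vessel: propulsion frequency from peak counting and velocity from the cross-correlation lag between ROIs. The traces, frame rate, and ROI spacing below are synthetic placeholders, not the paper's algorithm.

```python
# Sketch only: propulsion frequency from peak counting at one ROI, and velocity
# from the time lag between two ROIs along the vessel. All signals and geometry
# below are synthetic placeholders.
import numpy as np
from scipy.signal import find_peaks

fps = 10.0                      # imaging frame rate (frames per second)
t = np.arange(0, 120, 1 / fps)  # two minutes of imaging
lag_s, roi_distance_mm = 1.5, 6.0

def pulse(tt):
    """Synthetic propulsion waveform: one bump roughly every 20 s."""
    return np.maximum(0, np.sin(2 * np.pi * tt / 20.0)) ** 4

roi_upstream = pulse(t) + 0.02 * np.random.default_rng(0).normal(size=t.size)
roi_downstream = pulse(t - lag_s) + 0.02 * np.random.default_rng(1).normal(size=t.size)

peaks, _ = find_peaks(roi_upstream, height=0.5, distance=5 * fps)
frequency_per_min = len(peaks) / (t[-1] / 60.0)

xcorr = np.correlate(roi_downstream - roi_downstream.mean(),
                     roi_upstream - roi_upstream.mean(), mode="full")
lag_frames = xcorr.argmax() - (len(t) - 1)
velocity_mm_s = roi_distance_mm / (lag_frames / fps)

print(f"propulsion frequency: {frequency_per_min:.1f}/min, velocity: {velocity_mm_s:.2f} mm/s")
```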
Advanced Design Methodology for Robust Aircraft Sizing and Synthesis
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.
1997-01-01
Contract efforts are focused on refining the Robust Design Methodology for Conceptual Aircraft Design. Robust Design Simulation (RDS) was developed earlier as a potential solution to the need to do rapid trade-offs while accounting for risk, conflict, and uncertainty. The core of the simulation revolved around Response Surface Equations as approximations of bounded design spaces. An ongoing investigation is concerned with the advantages of using Neural Networks in conceptual design. Thought was also given to the development of a systematic way to choose or create a baseline configuration based on specific mission requirements. An expert system was developed, which selects aerodynamics, performance, and weights models from several configurations based on the user's mission requirements for a subsonic civil transport. The research has also resulted in a step-by-step illustration of how to use the AMV method for distribution generation and the search for robust design solutions to multivariate constrained problems.
Gulla, Joy; Neri, Pamela M; Bates, David W; Samal, Lipika
2017-05-01
Timely referral of patients with CKD has been associated with cost and mortality benefits, but referrals are often done too late in the course of the disease. Clinical decision support (CDS) offers a potential solution, but interventions have failed because they were not designed to support the physician workflow. We sought to identify user requirements for a chronic kidney disease (CKD) CDS system to promote timely referral. We interviewed primary care physicians (PCPs) to identify data needs for a CKD CDS system that would encourage timely referral and also gathered information about workflow to assess risk factors for progression of CKD. Interviewees were general internists recruited from a network of 14 primary care clinics affiliated with Brigham and Women's Hospital (BWH). We then performed a qualitative analysis to identify user requirements and system attributes for a CKD CDS system. Of the 12 participants, 25% were women, the mean age was 53 (range 37-82), mean years in clinical practice was 27 (range 11-58). We identified 21 user requirements. Seven of these user requirements were related to support for the referral process workflow, including access to pertinent information and support for longitudinal co-management. Six user requirements were relevant to PCP management of CKD, including management of risk factors for progression, interpretation of biomarkers of CKD severity, and diagnosis of the cause of CKD. Finally, eight user requirements addressed user-centered design of CDS, including the need for actionable information, links to guidelines and reference materials, and visualization of trends. These 21 user requirements can be used to design an intuitive and usable CDS system with the attributes necessary to promote timely referral. Copyright © 2017 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Bock, R. Darrell
Efforts have been made to increase the dissemination and use of data generated by the National Assessment of Educational Progress (NAEP). Potential users include those concerned with curriculum and methods evaluation, public policymakers, and researchers. NAEP can provide data for curriculum evaluation, including item analysis data which assist in…
SILVAH-OAK: ensuring adoption by engaging users in the full cycle of forest research
Susan L. Stout; Pat Brose; Kurt Gottschalk; Gary Miller; Pete Knopp; Gary Rutherford; Mark Deibler; Gary Frank; Gary Gilmore
2007-01-01
Recent Forest Service Research and Development (FS R&D) logic modeling efforts focused on program delivery stated that an important precondition for effective science delivery was the engagement of users and partners throughout the full research and development cycle. The ongoing partnership among the Pennsylvania Department of Conservation and Natural Resources...
ERIC Educational Resources Information Center
National Council on Disability, Washington, DC.
This report investigates the use of the graphical user interface (GUI) in computer programs, the problems it creates for individuals with visual impairments or blindness, and advocacy efforts concerning this issue, which have been targeted primarily at Microsoft, producer of Windows. The report highlights the concerns of individuals with visual…
Managing urban parks for a racially and ethnically diverse clientele
Paul H. Gobster
2002-01-01
A major planning effort for Chicago's largest park provided an opportunity to examine outdoor recreation use patterns and preferences among a racially and ethnically diverse clientele. Results from on-site surveys of 898 park users (217 Black, 210 Latino, 182 Asian, and 289 White) showed that park users shared a core set of interests, preferences, and concerns...
ERIC Educational Resources Information Center
Tebbutt, John
1999-01-01
Discusses efforts at National Institute of Standards and Technology (NIST) to construct an information discovery tool through the fusion of hypertext and information retrieval that works by parsing a contiguous document base into smaller documents and inserting semantic links between them. Also presents a case study that evaluated user reactions.…
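A rough sketch of the general idea described above, namely splitting a contiguous document base into smaller documents and inserting semantic links between related pieces; this is not the NIST tool itself, it uses TF-IDF cosine similarity as a stand-in for whatever retrieval model the tool employed, and the function and parameter names are assumptions.

    # Hypothetical sketch: split a text into passages and link passages whose
    # TF-IDF cosine similarity exceeds a threshold.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    def link_passages(text, min_similarity=0.2):
        passages = [p.strip() for p in text.split("\n\n") if p.strip()]
        tfidf = TfidfVectorizer(stop_words="english").fit_transform(passages)
        sims = cosine_similarity(tfidf)
        links = [(i, j) for i in range(len(passages))
                 for j in range(i + 1, len(passages))
                 if sims[i, j] >= min_similarity]
        return passages, links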
Seamless Digital Environment – Data Analytics Use Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna
Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for and design of an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts on Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, the need for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the requested analysis, and present the result to the user with minimal or no effort on the user's part. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics, employing information from plant sensors and databases to develop improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that displays the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in a collaboration between Idaho National Laboratory, Arizona Public Service – Palo Verde Nuclear Generating Station, and NextAxiom Inc.
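As a purely illustrative sketch of the three layers the report anticipates (data integration, predictive analytics, presentation), and not the NITSL/INL solution itself, the fragment below joins two hypothetical sensor feeds, flags readings whose rolling z-score is extreme, and prints a short summary; the column names, window, and threshold are assumptions.

    # Hypothetical sketch of an equipment-reliability analytics pipeline:
    # integration layer -> simple predictive layer -> presentation layer.
    import pandas as pd

    def integrate(vibration: pd.DataFrame, temperature: pd.DataFrame) -> pd.DataFrame:
        """Join two time-indexed sensor feeds on their timestamps."""
        return vibration.join(temperature, how="inner")

    def flag_anomalies(df: pd.DataFrame, column: str,
                       window: int = 48, z: float = 3.0) -> pd.Series:
        """Flag points whose rolling z-score exceeds the threshold."""
        rolling = df[column].rolling(window, min_periods=window)
        score = (df[column] - rolling.mean()) / rolling.std()
        return score.abs() > z

    def present(df: pd.DataFrame, flags: pd.Series) -> None:
        """Minimal 'user interface layer': a plain-text summary."""
        print(f"{int(flags.sum())} anomalous readings out of {len(df)}")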
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1988-01-01
Research focused on two major areas. The first effort addressed the design and implementation of a technique that allows for the visualization of the real-time variation of physical properties. The second effort focused on the design and implementation of an on-line help system with components designed for both authors and users of help information.
Milicic, Sandra; Piérard, Emma; DeCicca, Philip; Leatherdale, Scott T
2017-11-01
Youth e-cigarette use is common worldwide, but the profile of e-cigarette users compared with tobacco users is unclear. This study examines how sport participation and activity levels among youth differ between e-cigarette users and smokers. Using Canadian data from 38,977 grade 9 to 12 students who participated in Year 3 (2014-15) of the COMPASS study, logistic regression models were used to examine the likelihood of sport participation and activity level based on e-cigarette use and smoking status. Pearson's chi-square tests were used to examine subgroup differences by gender. E-cigarette users are more likely to participate in intramural, competitive, and team sports compared to non-users. Current and former smokers are less likely to participate in those sports than non-smokers. Youth e-cigarette users are more likely than non-users to meet the physical activity guidelines. Current smokers are more likely than non-smokers to undertake physical activity at least 60 minutes daily but less likely than non-smokers to tone at least 3 times per week. Youth e-cigarette users are less likely than non-users to be sedentary less than 2 hours daily. Analyses by gender show that male e-cigarette users drive the general relationship. Results suggest that e-cigarette users are more likely to engage in physical activity compared to non-users. Youth e-cigarette users are more likely to be physically active, while the opposite is true for smokers. Although e-cigarettes may be less harmful to health than cigarette smoking, the increased uptake among youth of differing profiles should be considered in prevention efforts. These results highlight the importance of addressing e-cigarette use in youth who undertake health-promoting behaviours. Prevention efforts should not focus only on youth who may undertake riskier health habits; e-cigarette prevention programs should go beyond the domain of tobacco control. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
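For readers unfamiliar with the modeling approach, the sketch below shows a generic logistic regression of sport participation on e-cigarette use with a couple of covariates; it is not the COMPASS analysis, and the data frame, variable names, and covariates are invented for illustration.

    # Hypothetical sketch of the logistic-regression approach described above;
    # the data and variable names are made up.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "team_sport":     [1, 0, 1, 1, 0, 1, 0, 1, 0, 1, 1, 0],
        "ecig_user":      [1, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0],
        "current_smoker": [0, 1, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0],
        "female":         [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0],
    })
    model = smf.logit("team_sport ~ ecig_user + current_smoker + female",
                      data=df).fit(disp=0)
    print(model.params)  # log-odds; exponentiate for odds ratios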
Code of Federal Regulations, 2010 CFR
2010-07-01
... Radiotelephone Act; or (b) Required to participate in a VMRS within a VTS area (VMRS User). VTS User's Manual... VTS User means a vessel, or an owner, operator, charterer, Master, or person directing the movement of a... to which special operating requirements apply...
Requirements for the Military Message System (MMS) Family: Data Types and User Commands.
1986-04-11
AD-A167 126. Requirements for the Military Message System (MMS) Family: Data Types and User Commands. Constance L. Heitmeyer, Computer Science and Systems Branch, Information Technology Division, Naval Research Laboratory, Washington, DC. April 11, 1986.
SEDAC information gateway plan V(1)
NASA Technical Reports Server (NTRS)
Chen, Robert S. (Compiler)
1995-01-01
This annual update of the Information Gateway Plan incorporates changes recommended by the Socioeconomic Data and Applications Center (SEDAC) User Working Group (UWG) and reflects comments and suggestions from users, collaborators, and the Contracting Officer Technical Representative (COTR). The Information Gateway Plan is a concise and specific plan that outlines SEDAC activities and services in support of the earth and social sciences and other user communities. The SEDAC Information Gateway effort is a primary means by which the Earth Observing System Data and Information System (EOSDIS) can link meaningfully with a broad range of social science data sources and users in ways that lead to tangible benefits to the American people. The SEDAC Information Gateway provides interdisciplinary access to socioeconomic and physical science data and information resources held by SEDAC and numerous other institutions and networks around the world. The Plan describes the areas of research of earth scientists and socioeconomic scientists where interchange of data and information is most needed. It sets guidelines for the continued development of SEDAC's directory of social science datasets and establishes priorities for efforts to make data held by SEDAC or accessible through SEDAC available to the user community. The Plan also describes the means by which the SEDAC user community can access information products specified by the SEDAC Data and Applications Development Plan (DADP). Among other major activities, SEDAC will continue to enhance and operate a directory capability, interoperable with the Global Change Master Directory, that provides the socioeconomic community with information about earth science products and the earth science research community with information about socioeconomic data. The Information Gateway also serves as a unique and powerful access pathway for a wide range of users and potential users of socioeconomic and earth science data, including especially remote sensing data.
A CMMI-based approach for medical software project life cycle study.
Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi
2013-01-01
In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries of other nations. In addition, systematic development processes are indispensable elements of software development: they help developers increase their productivity and efficiency and avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. The application integrates the user requirements, system design, and testing phases of the software development process into a three-layer (Domain, Concept, and Instance) model, expressed in structural Systems Modeling Language (SysML) diagrams, and converts part of the manual effort necessary for project management maintenance into computational effort, for example (semi-)automatic delivery of traceability management. The application supports establishing the artifacts "requirement specification document", "project execution plan document", "system design document", and "system test document", and can deliver a prototype of a lightweight project management tool for the nuclear medicine software project. The results of this application can serve as a reference for other medical institutions in developing medical information systems and supporting project management to achieve the aim of patient safety.
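Purely as an illustration of what (semi-)automatic traceability delivery can look like, and not the LW-CMMI tooling described in the paper, the sketch below links requirement IDs to design elements and test cases and reports requirements that lack test coverage; all identifiers are hypothetical.

    # Hypothetical sketch of automatic traceability checking between
    # requirements, design elements, and tests; IDs are illustrative only.
    requirements = {"REQ-01": "record uptake values", "REQ-02": "export scan report"}
    design_links = {"REQ-01": ["DES-10"], "REQ-02": ["DES-11", "DES-12"]}
    test_links = {"DES-10": ["TST-100"], "DES-11": []}   # DES-12 has no tests

    def untested_requirements():
        """Return requirement IDs with no design element covered by a test."""
        missing = []
        for req in requirements:
            tests = [t for des in design_links.get(req, [])
                     for t in test_links.get(des, [])]
            if not tests:
                missing.append(req)
        return missing

    print(untested_requirements())  # ['REQ-02']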
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara
2013-04-01
Analysis of a growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts of researchers. To do this in a timely and reliable way, one needs a modern information-computational infrastructure supporting integrated studies in the field of environmental sciences. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for regional climate change investigations. The platform combines a modern Web 2.0 approach, GIS functionality, and capabilities to run climate and meteorological models, process large geophysical datasets, and support the relevant analysis. It also supports joint software development by distributed research groups and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the integrated WRF and «Planet Simulator» models, as well as preprocessing and visualization of the modeling results, are also provided. All functions of the platform are accessible to a user through a web portal from a common graphical web browser, in the form of an interactive graphical user interface that provides, in particular, selection of a geographical region of interest (pan and zoom), data layer manipulation (order, enable/disable, feature extraction), and visualization of results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary research efforts. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic, and satellite monitoring datasets through a unified graphical web interface. Partial support of RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2 and Projects 69, 131, 140, and the APN CBA2012-16NSY project is acknowledged.
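As an illustration of the kind of region selection and aggregation such a web interface exposes, and not the Climate platform's own code, the sketch below subsets a gridded temperature dataset to a latitude/longitude box and computes an annual regional mean; the file name, variable name, and coordinate ordering are assumptions.

    # Hypothetical sketch of regional subsetting and aggregation of a gridded
    # climate dataset; file and variable names are assumptions.
    import xarray as xr

    ds = xr.open_dataset("air_temperature.nc")                     # assumed NetCDF file
    t2m = ds["t2m"]                                                # assumed variable name
    region = t2m.sel(lat=slice(50, 70), lon=slice(60, 90))         # assumes ascending coords
    annual_mean = region.mean(dim=["lat", "lon"]).groupby("time.year").mean()
    print(annual_mean.values)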