The Strategic Waste Minimization Initiative (SWAMI) software, Version 2.0, is a tool that applies process analysis to identify waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...
Charliecloud: Unprivileged containers for user-defined software stacks in HPC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Priedhorsky, Reid; Randles, Timothy C.
Supercomputing centers are seeing increasing demand for user-defined software stacks (UDSS), instead of or in addition to the stack provided by the center. These UDSS support user needs such as complex dependencies or build requirements, externally required configurations, portability, and consistency. The challenge for centers is to provide these services in a usable manner while minimizing the risks: security, support burden, missing functionality, and performance. We present Charliecloud, which uses the Linux user and mount namespaces to run industry-standard Docker containers with no privileged operations or daemons on center resources. Our simple approach avoids most security risks while maintaining access to the performance and functionality already on offer, doing so in less than 500 lines of code. Charliecloud promises to bring an industry-standard UDSS user workflow to existing, minimally altered HPC resources.
Optimal design method to minimize users' thinking mapping load in human-machine interactions.
Huang, Yanqun; Li, Xu; Zhang, Jie
2015-01-01
The discrepancy between human cognition and machine requirements/behaviors usually results in serious mental thinking-mapping loads or even disasters in product operation. In today's mentally demanding society, it is important to help people avoid confusion and difficulty in human-machine interaction by improving the usability of a product and minimizing the user's thinking-mapping and interpreting load. An optimal human-machine interface design method is introduced, based on minimizing the mental load of the thinking-mapping process between users' intentions and the affordances of product interface states. By analyzing the users' thinking-mapping problem, an operating action model is constructed. According to human natural instincts and acquired knowledge, an expected ideal design with minimized thinking load is uniquely determined first. Then, creative alternatives, in terms of the way humans obtain operational information, are provided as digital interface state datasets. Finally, using the cluster analysis method, an optimum solution is picked out from the alternatives by calculating the distances between the two datasets. Considering multiple factors to minimize users' thinking-mapping loads, a solution nearest to the ideal value is found in a human-car interaction design case. The clustering results show the method's effectiveness in finding an optimum solution to mental-load minimization problems in human-machine interaction design.
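The abstract does not spell out the final selection computation; a minimal sketch of the "nearest to ideal" step, with hypothetical load scores (mapping steps, recall demand, interpretation time) invented for illustration, might look like:

```python
import math

def nearest_to_ideal(alternatives, ideal):
    """Pick the alternative whose feature vector lies closest
    (Euclidean distance) to the ideal minimal-load design."""
    def dist(v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, ideal)))
    return min(alternatives, key=lambda item: dist(item[1]))

# Hypothetical load scores for three candidate interface states;
# lower is better on every axis.
candidates = [
    ("dial",  [3.0, 2.0, 4.0]),
    ("touch", [1.0, 1.5, 2.0]),
    ("voice", [2.0, 3.0, 1.0]),
]
ideal = [1.0, 1.0, 1.0]  # the expected ideal (minimal-load) design
print(nearest_to_ideal(candidates, ideal)[0])  # -> touch
```

The real method clusters alternatives before measuring distances; this sketch keeps only the distance-to-ideal comparison.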
Text-Based On-Line Conferencing: A Conceptual and Empirical Analysis Using a Minimal Prototype.
ERIC Educational Resources Information Center
McCarthy, John C.; And Others
1993-01-01
Analyzes requirements for text-based online conferencing through the use of a minimal prototype. Topics discussed include prototyping with a minimal system; text-based communication; the system as a message passer versus the system as a shared data structure; and three exercises that showed how users worked with the prototype. (Contains 61…
THREAT ENSEMBLE VULNERABILITY ASSESSMENT ...
The TEVA-SPOT software and manual are used by water utilities to optimize the number and location of contamination detection sensors so that economic and/or public health consequences are minimized. TEVA-SPOT is interactive, allowing a user to specify the minimization objective (e.g., the number of people exposed, the time to detection, or the extent of pipe length contaminated). It also allows a user to specify constraints; for example, a TEVA-SPOT user can employ expert knowledge during the design process by identifying either existing or infeasible sensor locations. Installation and maintenance costs for sensor placement can also be factored into the analysis. Python and Java are required to run TEVA-SPOT.
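TEVA-SPOT's actual solvers are not described here; a toy greedy sketch of the underlying sensor-placement idea, with a made-up scenario-impact matrix, a mean-impact objective, and support for fixed or banned locations, could be:

```python
def greedy_sensor_placement(impact, candidates, n_sensors, fixed=(), banned=()):
    """Greedily choose sensor locations minimizing the mean impact over
    contamination scenarios. impact[s][l] is the consequence of scenario s
    if the first detection occurs at candidate location l."""
    chosen = list(fixed)                       # expert-specified existing sensors
    pool = [l for l in candidates if l not in banned and l not in chosen]

    def mean_impact(sensors):
        if not sensors:
            return float("inf")
        # Each scenario is mitigated by whichever chosen sensor detects first.
        return sum(min(row[l] for l in sensors) for row in impact) / len(impact)

    while len(chosen) < n_sensors and pool:
        best = min(pool, key=lambda l: mean_impact(chosen + [l]))
        chosen.append(best)
        pool.remove(best)
    return chosen, mean_impact(chosen)

# Invented impacts: 3 scenarios x 4 candidate locations.
impact = [
    [9, 1, 5, 7],
    [2, 8, 5, 7],
    [9, 8, 1, 7],
]
chosen, obj = greedy_sensor_placement(impact, [0, 1, 2, 3], n_sensors=2)
print(chosen)  # -> [2, 1]
```

Greedy placement is a standard heuristic for this p-median-style objective; TEVA-SPOT itself offers several formulations and solvers.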
36 CFR 1193.41 - Input, control, and mechanical functions.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... Provide at least one mode that does not require user speech. (i) Operable with limited cognitive skills. Provide at least one mode that minimizes the cognitive, memory, language, and learning skills required of...
36 CFR 1193.41 - Input, control, and mechanical functions.
Code of Federal Regulations, 2014 CFR
2014-07-01
.... Provide at least one mode that does not require user speech. (i) Operable with limited cognitive skills. Provide at least one mode that minimizes the cognitive, memory, language, and learning skills required of...
36 CFR § 1193.41 - Input, control, and mechanical functions.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... Provide at least one mode that does not require user speech. (i) Operable with limited cognitive skills. Provide at least one mode that minimizes the cognitive, memory, language, and learning skills required of...
Visual Information for the Desktop, version 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2006-03-29
VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.
Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey
2012-01-01
Introduction Tight glycemic control (TGC) has shown benefits but has been difficult to implement. Model-based methods and computerized protocols offer the opportunity to improve TGC quality and compliance. This research presents an interface design to maximize compliance, minimize real and perceived clinical effort, and minimize error based on simple human factors and end user input. Method The graphical user interface (GUI) design is presented by construction based on a series of simple, short design criteria based on fundamental human factors engineering and includes the use of user feedback and focus groups comprising nursing staff at Christchurch Hospital. The overall design maximizes ease of use and minimizes (unnecessary) interaction and use. It is coupled to a protocol that allows nurse staff to select measurement intervals and thus self-manage workload. Results The overall GUI design is presented and requires only one data entry point per intervention cycle. The design and main interface are heavily focused on the nurse end users who are the predominant users, while additional detailed and longitudinal data, which are of interest to doctors guiding overall patient care, are available via tabs. This dichotomy of needs and interests based on the end user's immediate focus and goals shows how interfaces must adapt to offer different information to multiple types of users. Conclusions The interface is designed to minimize real and perceived clinical effort, and ongoing pilot trials have reported high levels of acceptance. The overall design principles, approach, and testing methods are based on fundamental human factors principles designed to reduce user effort and error and are readily generalizable. PMID:22401330
Schulze, H Georg; Turner, Robin F B
2014-01-01
Charge-coupled device detectors are vulnerable to cosmic rays that can contaminate Raman spectra with positive-going spikes. Because spikes can adversely affect spectral processing and data analyses, they must be removed. Although both hardware-based and software-based spike removal methods exist, they typically require parameter and threshold specification dependent on well-considered user input. Here, we present a fully automated spike removal algorithm that proceeds without requiring user input. It is minimally dependent on sample attributes, and those that are required (e.g., the standard deviation of spectral noise) can be determined with other fully automated procedures. At the core of the method is the identification and location of spikes via coincident second derivatives along both the spectral and spatiotemporal dimensions of two-dimensional datasets. The method can be applied to spectra that are relatively inhomogeneous because it provides fairly effective and selective targeting of spikes, resulting in minimal distortion of spectra. Relatively effective spike removal obtained with full automation could provide substantial benefits to users where large numbers of spectra must be processed.
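As a rough illustration (not the authors' code) of the coincident-second-derivative test, assuming a simple threshold of k times the noise standard deviation and neighbour-averaging as the repair step:

```python
def second_diff(x, i):
    # Discrete second derivative; strongly negative at a positive spike.
    return x[i - 1] - 2 * x[i] + x[i + 1]

def remove_spikes(data, noise_sd, k=5.0):
    """data[t][w]: spectra over time t and wavenumber w. Flag points whose
    second derivative is spike-like along BOTH axes, then patch them with
    the mean of their spectral neighbours."""
    thresh = k * noise_sd
    rows, cols = len(data), len(data[0])
    clean = [row[:] for row in data]
    for t in range(1, rows - 1):
        for w in range(1, cols - 1):
            spectral = second_diff(data[t], w)
            temporal = second_diff([data[i][w] for i in range(rows)], t)
            if spectral < -thresh and temporal < -thresh:
                clean[t][w] = 0.5 * (data[t][w - 1] + data[t][w + 1])
    return clean

# Flat toy baseline with a single cosmic-ray spike at [1][2].
data = [[10.0] * 5,
        [10.0, 10.0, 100.0, 10.0, 10.0],
        [10.0] * 5]
print(remove_spikes(data, noise_sd=1.0)[1][2])  # -> 10.0
```

A genuine Raman band persists across adjacent spectra, so its temporal second derivative stays small and it survives the coincidence test; a one-shot spike fails both.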
Minimum Requirements for the CUS (Common User Subsystem) Workstation
1987-04-20
Efficient sparse matrix multiplication scheme for the CYBER 203
NASA Technical Reports Server (NTRS)
Lambiotte, J. J., Jr.
1984-01-01
This work has been directed toward the development of an efficient algorithm for performing sparse matrix multiplication on the CYBER-203. The desire to provide software which gives the user the choice between the often conflicting goals of minimizing central processing unit (CPU) time or storage requirements has led to a diagonal-based algorithm in which one of three types of storage is selected for each diagonal. For each storage type, an initialization subroutine estimates the CPU and storage requirements based upon results from previously performed numerical experimentation. These requirements are adjusted by weights provided by the user which reflect the relative importance the user places on the resources. The three storage types employed were chosen to be efficient on the CYBER-203 for diagonals which are sparse, moderately sparse, or dense; however, for many densities, no storage type is most efficient with respect to both resource requirements. The user-supplied weights then dictate the choice.
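A schematic of the weighted selection logic, with invented cost coefficients standing in for the paper's measured CPU/storage estimates:

```python
# Hypothetical (CPU-cost, storage-cost) per stored element for three
# diagonal storage schemes; the real values came from timing
# experiments on the CYBER-203.
SCHEMES = {
    "sparse":   (5.0, 1.0),   # index list: cheap storage, costly CPU
    "moderate": (2.0, 2.0),
    "dense":    (1.0, 4.0),   # full diagonal: fast, storage-hungry
}

def pick_scheme(density, w_cpu, w_mem, n=1000):
    """Choose the storage type minimizing the user-weighted cost of a
    diagonal of given density (fraction of nonzeros) and length n."""
    def cost(scheme):
        cpu, mem = SCHEMES[scheme]
        # Dense storage keeps every element; the others keep only nonzeros.
        stored = n if scheme == "dense" else int(density * n)
        return w_cpu * cpu * stored + w_mem * mem * stored
    return min(SCHEMES, key=cost)
```

With these made-up numbers, a CPU-focused user with a nearly full diagonal gets dense storage, while a storage-focused user with a very sparse diagonal gets the index-list scheme.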
DNASynth: a software application to optimization of artificial gene synthesis
NASA Astrophysics Data System (ADS)
Muczyński, Jan; Nowak, Robert M.
2017-08-01
DNASynth is a client-server software application whose client runs in a web browser. The aim of this program is to support and optimize the process of artificial gene synthesis using the Ligase Chain Reaction (LCR). Thanks to LCR it is possible to obtain a DNA strand coding a user-defined peptide. The DNA sequence is calculated by an optimization algorithm that considers optimal codon usage, minimal energy of secondary structures, and a minimal number of required LCR steps. Additionally, the absence of sequences recognized by a user-defined set of restriction enzymes is guaranteed. The presented software was tested on synthetic and real data.
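A greedy sketch of just the codon-usage and restriction-site parts of such an optimization (DNASynth additionally minimizes secondary-structure energy and the LCR count); the usage table here is illustrative, not a real organism's:

```python
# Toy codon-usage table: per amino acid, (codon, usage fraction).
USAGE = {
    "M": [("ATG", 1.00)],
    "K": [("AAA", 0.74), ("AAG", 0.26)],
    "F": [("TTT", 0.58), ("TTC", 0.42)],
}

def design_dna(peptide, forbidden_sites):
    """Per residue, take the highest-usage codon unless appending it
    creates a forbidden restriction site; then fall back to the next."""
    dna = ""
    for aa in peptide:
        for codon, _freq in sorted(USAGE[aa], key=lambda c: -c[1]):
            trial = dna + codon
            if not any(site in trial for site in forbidden_sites):
                dna = trial
                break
        else:
            raise ValueError(f"no admissible codon for {aa!r}")
    return dna

print(design_dna("MK", forbidden_sites=["ATGAAA"]))  # -> ATGAAG
```

Banning the (hypothetical) site ATGAAA forces the second-choice lysine codon AAG even though AAA has higher usage.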
User's Manual for the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA)
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.; Cheatwood, F. McNeil
1996-01-01
This user's manual provides detailed instructions for the installation and application of version 4.1 of the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA). LAURA provides simulation of flow fields in thermochemical nonequilibrium around vehicles traveling at hypersonic velocities through the atmosphere. Earlier versions of LAURA were predominantly research codes with minimal (or no) documentation. This manual describes UNIX-based utilities for customizing the code for special applications that also minimize system resource requirements. The algorithm is reviewed, and the various program options are related to specific equations and variables in the theoretical development.
Residual Neurocognitive Features of Long-Term Ecstasy Users With Minimal Exposure to Other Drugs
Halpern, John H.; Sherwood, Andrea R.; Hudson, James I.; Gruber, Staci; Kozin, David; Pope, Harrison G.
2010-01-01
Aims In field studies assessing cognitive function in illicit ecstasy users, there are several frequent confounding factors that might plausibly bias the findings toward an overestimate of ecstasy-induced neurocognitive toxicity. We designed an investigation seeking to minimize these possible sources of bias. Design We compared illicit ecstasy users and non-users while 1) excluding individuals with significant lifetime exposure to other illicit drugs or alcohol; 2) requiring that all participants be members of the “rave” subculture; and 3) testing all participants with breath, urine, and hair samples at the time of evaluation to exclude possible surreptitious substance use. We compared groups with adjustment for age, gender, race/ethnicity, family-of-origin variables, and childhood history of conduct disorder and attention deficit hyperactivity disorder. We provide significance levels without correction for multiple comparisons. Setting Field study. Participants Fifty-two illicit ecstasy users and 59 non-users, age 18-45. Measurements Battery of 15 neuropsychological tests tapping a range of cognitive functions. Findings We found little evidence of decreased cognitive performance in ecstasy users, save for poorer strategic self-regulation, possibly reflecting increased impulsivity. However, this finding might have reflected a premorbid attribute of ecstasy users, rather than a residual neurotoxic effect of the drug. Conclusions In a study designed to minimize limitations found in many prior investigations, we failed to demonstrate marked residual cognitive effects in ecstasy users. This finding contrasts with many previous findings—including our own—and emphasizes the need for continued caution in interpreting field studies of cognitive function in illicit ecstasy users. PMID:21205042
NASA Technical Reports Server (NTRS)
Ebert, D. H.; Chase, P. E.; Dye, J.; Fahline, W. C.; Johnson, R. H.
1973-01-01
The impact of a conical scan versus a linear scan multispectral scanner (MSS) instrument on a small local-user data processing facility was studied. User data requirements were examined to determine the unique system requirements for a low cost ground system (LCGS) compatible with the Earth Observatory Satellite (EOS) system. Candidate concepts were defined for the LCGS and preliminary designs were developed for selected concepts. The impact of a conical scan MSS versus a linear scan MSS was evaluated for the selected concepts. It was concluded that there are valid user requirements for the LCGS and, as a result of these requirements, the impact of the conical scanner is minimal, although some new hardware development for the LCGS is necessary to handle conical scan data.
Olvingson, Christina; Hallberg, Niklas; Timpka, Toomas; Greenes, Robert A
2002-12-18
The introduction of computer-based information systems (ISs) in public health provides enhanced possibilities for service improvements and hence also for improvement of the population's health. Not least, new communication systems can help in the socialization and integration process needed between the different professions and geographical regions. Therefore, development of ISs that truly support public health practices requires that technical, cognitive, and social issues be taken into consideration. A notable problem is capturing the 'voices' of all potential users, i.e., the viewpoints of different public health practitioners. Failing to capture these voices will result in inefficient or even useless systems. The aim of this study is to develop a minimal data set for capturing users' voices on problems experienced by public health professionals in their daily work, and opinions about how these problems can be solved. The issues of concern thus captured can be used both as the basis for formulating the requirements of ISs for public health professionals and to create an understanding of the use context. Further, the data can help in directing the design to the features most important for the users.
Further Automate Planned Cluster Maintenance to Minimize System Downtime during Maintenance Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springmeyer, R.
This report documents the integration and testing of the automated update process for compute clusters in LC to minimize impact on user productivity. Description: A set of scripts will be written and deployed to further standardize cluster maintenance activities and minimize downtime during planned maintenance windows. Completion Criteria: When the scripts have been deployed and used during planned maintenance windows and a timing comparison is completed between the existing process and the new, more automated process, this milestone is complete. This milestone was completed on Aug 23, 2016 on the new CTS1 cluster called Jade, when a request to upgrade the version of TOSS 3 was initiated while SWL jobs and normal user jobs were running. Jobs that were running when the update to the system began continued to run to completion. New jobs on the cluster started on the new release of TOSS 3. No system administrator action was required. Current update procedures in TOSS 2 begin by killing all user jobs. Then all diskfull nodes are updated, which can take a few hours. Only after the updates are applied are all nodes rebooted and finally put back into service. A system administrator is required for all steps. In terms of human time spent during a cluster OS update, the TOSS 3 automated procedure on Jade took 0 FTE hours; doing the same update without the Toss Update Tool would have required 4 FTE hours.
Live minimal path for interactive segmentation of medical images
NASA Astrophysics Data System (ADS)
Chartrand, Gabriel; Tang, An; Chav, Ramnada; Cresson, Thierry; Chantrel, Steeve; De Guise, Jacques A.
2015-03-01
Medical image segmentation is nowadays required for medical device development and in a growing number of clinical and research applications. Since dedicated automatic segmentation methods are not always available, generic and efficient interactive tools can alleviate the burden of manual segmentation. In this paper we propose an interactive segmentation tool based on image warping and minimal path segmentation that is efficient for a wide variety of segmentation tasks. While the user roughly delineates the desired organ's boundary, a narrow band along the cursor's path is straightened, providing an ideal subspace for feature-aligned filtering and a minimal path algorithm. Once the segmentation is performed on the narrow band, the path is warped back onto the original image, precisely delineating the desired structure. This tool was found to have a highly intuitive dynamic behavior. It is especially robust against misleading edges and requires only coarse interaction from the user to achieve good precision. The proposed segmentation method was tested on 10 difficult liver segmentations on CT and MRI images, and the resulting 2D overlap Dice coefficient was 99% on average.
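The minimal-path step on the straightened band can be illustrated with a standard dynamic-programming formulation (a sketch, not the authors' implementation), where each column of the band contributes one contour point and the path may move at most one row per column:

```python
def minimal_path(cost):
    """cost[r][c]: (edge-derived) cost of placing the contour at row r
    in column c of the straightened band. Returns the row index chosen
    for each column along the cheapest left-to-right path."""
    rows, cols = len(cost), len(cost[0])
    acc = [row[:] for row in cost]            # accumulated cost
    back = [[0] * cols for _ in range(rows)]  # backpointers
    for c in range(1, cols):
        for r in range(rows):
            prev = [(acc[pr][c - 1], pr) for pr in (r - 1, r, r + 1)
                    if 0 <= pr < rows]
            best, pr = min(prev)
            acc[r][c] = cost[r][c] + best
            back[r][c] = pr
    # Trace back from the cheapest end point.
    r = min(range(rows), key=lambda r: acc[r][cols - 1])
    path = [r]
    for c in range(cols - 1, 0, -1):
        r = back[r][c]
        path.append(r)
    return path[::-1]

# A low-cost valley along row 1 attracts the whole path.
print(minimal_path([[9, 9, 9, 9], [1, 1, 1, 1], [9, 9, 9, 9]]))  # -> [1, 1, 1, 1]
```

In the real tool the cost image would come from edge filtering of the straightened band, and the resulting path is warped back onto the original image.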
Cooperative data dissemination to mission sites
NASA Astrophysics Data System (ADS)
Chen, Fangfei; Johnson, Matthew P.; Bar-Noy, Amotz; La Porta, Thomas F.
2010-04-01
Timely dissemination of information to mobile users is vital in many applications. In a critical situation, no network infrastructure may be available for use in dissemination beyond the on-board storage capability of the mobile users themselves. We consider the following specialized content distribution application: a group of users equipped with wireless devices build an ad hoc network in order to cooperatively retrieve information from certain regions (the mission sites). Each user requires access to some set of information items originating from sources lying within a region, and desires low-latency access to its desired data items upon request (i.e., when pulled). In order to minimize average response time, we allow users to pull data either directly from sources or, when possible, from other nearby users who have already pulled, and continue to carry, the desired data items. That is, we allow for data to be pushed to one user and then pulled by one or more additional users. The total latency experienced by a user vis-à-vis a certain data item is then in general a combination of the push delay and the pull delay. We assume each delay time is a function of the hop distance between the pair of points in question. Our goal in this paper is to assign data to mobile users in order to minimize the total cost and the average latency experienced by all the users. In a static setting, we solve this problem with two different schemes: one is easy to solve but wasteful, while the other relates to NP-hard problems but is less wasteful. In a dynamic setting, we adapt the algorithm for the static setting and develop a new algorithm that handles users' gradual arrival. Finally, we show that a trade-off can be made between minimizing the cost and the latency.
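A minimal sketch of the latency model described above, assuming delay is simply proportional to hop count and that a single carrier user is chosen per data item (both simplifications of the paper's setting):

```python
def avg_latency(hops, source, users, carrier, delay=lambda h: h):
    """Mean response time when an item is pushed from `source` to
    `carrier` and each user then pulls from the carrier or the source,
    whichever is faster. hops[a][b] is the hop distance."""
    push = delay(hops[source][carrier])
    total = 0
    for u in users:
        direct = delay(hops[u][source])          # pull straight from source
        via = push + delay(hops[u][carrier])     # push delay + pull delay
        total += min(direct, via)
    return total / len(users)

def best_carrier(hops, source, users):
    # Exhaustively pick the carrier minimizing average latency.
    return min(users, key=lambda c: avg_latency(hops, source, users, c))

# Toy topology: node 0 is the source; users 1-3 cluster far from it.
hops = [[0, 4, 6, 6],
        [4, 0, 1, 1],
        [6, 1, 0, 2],
        [6, 1, 2, 0]]
print(best_carrier(hops, source=0, users=[1, 2, 3]))  # -> 1
```

User 1 sits closest to the source and one hop from the others, so caching the item there beats every user pulling directly.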
Evolution of the INMARSAT aeronautical system: Service, system, and business considerations
NASA Technical Reports Server (NTRS)
Sengupta, Jay R.
1995-01-01
A market-driven approach was adopted to develop enhancements to the Inmarsat-Aeronautical system, to address the requirements of potential new market segments. An evolutionary approach and a well-differentiated product/service portfolio were required, to minimize system upgrade costs and maximize market penetration, respectively. The evolved system definition serves to minimize equipment cost/size/mass for short/medium range aircraft, by reducing the antenna gain requirement and relaxing the performance requirements for non-safety-related communications. A validation program involving simulation, laboratory tests, over-satellite tests, and flight trials is being conducted to confirm the system definition. Extensive market research has been conducted to determine user requirements and to quantify market demand for future Inmarsat Aero-1 AES, using sophisticated computer-assisted survey techniques.
User interface for a tele-operated robotic hand system
Crawford, Anthony L
2015-03-24
Disclosed here is a user interface for a robotic hand. The user interface anchors a user's palm in a relatively stationary position and determines various angles of interest necessary for a user's finger to achieve a specific fingertip location. The user interface additionally conducts a calibration procedure to determine the user's applicable physiological dimensions. The user interface uses the applicable physiological dimensions and the specific fingertip location, and treats the user's finger as a two link three degree-of-freedom serial linkage in order to determine the angles of interest. The user interface communicates the angles of interest to a gripping-type end effector which closely mimics the range of motion and proportions of a human hand. The user interface requires minimal contact with the operator and provides distinct advantages in terms of available dexterity, work space flexibility, and adaptability to different users.
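Ignoring the third degree of freedom, the planar two-link part of such a linkage can be solved in closed form with the law of cosines; a sketch (link lengths and targets are arbitrary, not taken from the patent):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Planar two-link inverse kinematics: joint angles placing the
    fingertip at (x, y) for proximal/distal link lengths l1, l2."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)                              # "knuckle" flexion
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2

def forward(t1, t2, l1, l2):
    """Forward kinematics, used here to verify the IK round trip."""
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y
```

The patent's calibration step would supply the user's actual phalanx lengths as l1 and l2; the returned angles of interest are then streamed to the end effector.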
Lin, Di; Labeau, Fabrice; Yao, Yuanzhe; Vasilakos, Athanasios V; Tang, Yu
2016-07-01
Wireless technologies and vehicle-mounted or wearable medical sensors are pervasive in supporting ubiquitous healthcare applications. However, a critical issue of using wireless communications in a healthcare scenario lies in the electromagnetic interference (EMI) caused by radio frequency transmission. A high level of EMI may lead to a critical malfunction of medical sensors, and in such a scenario, a few users who are not transmitting emergency data could be required to reduce their transmit power or even temporarily disconnect from the network in order to guarantee the normal operation of medical sensors as well as the transmission of emergency data. In this paper, we propose a joint power and admission control algorithm to schedule the users' transmission of medical data. The objective of this algorithm is to minimize the number of users who are forced to disconnect from the network while keeping the EMI on medical sensors at an acceptable level. We show that a fixed point of the proposed algorithm always exists, and at the fixed point, our algorithm minimizes the number of low-priority users who are required to disconnect from the network. Numerical results illustrate that the proposed algorithm achieves robust performance against the variations of mobile hospital environments.
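A toy version of such a joint power/admission iteration (the gains, SINR target, and total-received-power EMI proxy are invented for illustration; the paper's actual model is more detailed):

```python
def power_admission(gains, priority, sinr_target, noise, emi_limit, iters=200):
    """Iterate a standard power-control fixed point; whenever the EMI
    proxy (total received power) exceeds the limit, drop the
    lowest-priority active user and retry."""
    active = set(range(len(gains)))
    while active:
        p = {i: 0.01 for i in active}
        for _ in range(iters):
            # Each user sets power to just meet its SINR target
            # against the interference from the other active users.
            p = {i: sinr_target * (noise + sum(gains[j] * p[j]
                 for j in active if j != i)) / gains[i] for i in active}
        emi = sum(gains[i] * p[i] for i in active)
        if emi <= emi_limit:
            return sorted(active), p
        drop = min(active, key=lambda i: priority[i])
        active.discard(drop)
    return [], {}

# Three symmetric users; user 1 has the lowest priority.
active, power = power_admission([1.0, 1.0, 1.0], priority=[3, 1, 2],
                                sinr_target=0.2, noise=1.0, emi_limit=0.9)
print(active)  # -> [0, 2]
```

With all three users active the fixed-point powers sum past the EMI limit, so the lowest-priority user is disconnected and the remaining two settle at a feasible operating point.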
Van Hoecke, Sofie; Steurbaut, Kristof; Taveirne, Kristof; De Turck, Filip; Dhoedt, Bart
2010-01-01
We designed a broker platform for e-homecare services using web service technology. The broker allows efficient data communication and guarantees quality requirements such as security, availability and cost-efficiency by dynamic selection of services, minimizing user interactions and simplifying authentication through a single user sign-on. A prototype was implemented, with several e-homecare services (alarm, telemonitoring, audio diary and video-chat). It was evaluated by patients with diabetes and multiple sclerosis. The patients found that the start-up time and overhead imposed by the platform was satisfactory. Having all e-homecare services integrated into a single application, which required only one login, resulted in a high quality of experience for the patients.
A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program
NASA Technical Reports Server (NTRS)
Bartoszek, J. T.; Huckins, B.; Coyle, M.
1979-01-01
A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davida, G.I.; Frankel, Y.; Matt, B.J.
In developing secure applications and systems, designers often must incorporate secure user identification in the design specification. In this paper, the authors study secure off-line authenticated user identification schemes based on a biometric system that can measure a user's biometric accurately (up to some Hamming distance). The schemes presented here enhance identification and authorization in secure applications by binding a biometric template with authorization information on a token such as a magnetic strip. Also developed here are schemes specifically designed to minimize the compromise of a user's private biometric data, encapsulated in the authorization information, without requiring secure hardware tokens. The authors furthermore study the feasibility of biometrics performing as an enabling technology for secure system and application design, and investigate a new technology which allows a user's biometrics to facilitate cryptographic mechanisms.
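The error-tolerant matching such schemes assume can be sketched as a Hamming-distance test (the actual constructions additionally bind the template to authorization data and protect it cryptographically so raw biometrics are never exposed on the token):

```python
def hamming(a, b):
    """Hamming distance between two equal-length bit strings."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

def authenticate(template, candidate, tolerance):
    """Accept a biometric reading within `tolerance` bit errors of the
    enrolled template, modeling the 'accurate up to some Hamming
    distance' assumption of the paper."""
    return hamming(template, candidate) <= tolerance

enrolled = "1011001110"                             # made-up template
print(authenticate(enrolled, "1011001010", tolerance=2))  # 1 flipped bit -> True
print(authenticate(enrolled, "0100110001", tolerance=2))  # far away -> False
```

The tolerance trades false rejections (noisy but genuine readings) against false acceptances, which is why the paper's schemes pair it with error-correcting codes rather than storing the raw template.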
RIMS: Resource Information Management System
NASA Technical Reports Server (NTRS)
Symes, J.
1983-01-01
An overview is given of the capabilities and functions of the Resource Information Management System (RIMS). It is a simple interactive DMS tool which allows users to build, modify, and maintain data management applications. RIMS minimizes the programmer support required to develop and maintain small database applications. RIMS also assists in bringing the United Information Services (UIS) budget system work in-house. Information is also given on the relationship between RIMS and the user community.
Bronson, N R
1984-05-01
A new A-mode biometry system for determining axial length measurements of the eye has been developed that incorporates a soft-membrane transducer. The soft transducer decreases the risk of indenting the cornea with the probe, which can result in inaccurate measurements. A microprocessor evaluates echo patterns and determines whether or not axial alignment has been obtained, eliminating possible user error. The new A-scan requires minimal user skill and can be used successfully by both physicians and technicians.
Candidate Mission from Planet Earth control and data delivery system architecture
NASA Technical Reports Server (NTRS)
Shapiro, Phillip; Weinstein, Frank C.; Hei, Donald J., Jr.; Todd, Jacqueline
1992-01-01
Using a structured, experience-based approach, Goddard Space Flight Center (GSFC) has assessed the generic functional requirements for a lunar mission control and data delivery (CDD) system. This analysis was based on lunar mission requirements outlined in GSFC-developed user traffic models. The CDD system will facilitate data transportation among user elements, element operations, and user teams by providing functions such as data management, fault isolation, fault correction, and link acquisition. The CDD system for the lunar missions must not only satisfy lunar requirements but also facilitate and provide early development of data system technologies for Mars. Reuse and evolution of existing data systems can help to maximize system reliability and minimize cost. This paper presents a set of existing and currently planned NASA data systems that provide the basic functionality. Reuse of such systems can have an impact on mission design and significantly reduce CDD and other system development costs.
Implementing Model-Check for Employee and Management Satisfaction
NASA Technical Reports Server (NTRS)
Jones, Corey; LaPha, Steven
2013-01-01
This presentation will discuss methods by which ModelCheck can be implemented to not only improve model quality but also satisfy both employees and management through different sets of quality checks. This approach allows a standard set of modeling practices to be upheld throughout a company, with minimal interaction required by the end user. The presenter will demonstrate how to create multiple ModelCheck standards, preventing users from evading the system, and how it can improve the quality of drawings and models.
Integration of an expert system into a user interface language demonstration
NASA Technical Reports Server (NTRS)
Stclair, D. C.
1986-01-01
The need for a User Interface Language (UIL) has been recognized by the Space Station Program Office as a necessary tool to aid in minimizing the cost of software generation by multiple users. Previous history in the Space Shuttle Program has shown that many different areas of software generation, such as operations, integration, and testing, each used a different user command language although the types of operations being performed were similar in many respects. Since the Space Station represents a much more complex software task, a common user command language--a user interface language--is required to support the large spectrum of Space Station software developers and users. To assist in the selection of an appropriate set of definitions for a UIL, a series of demonstration programs was generated with which to test UIL concepts against specific Space Station scenarios, using operators from the astronaut and scientific communities. Because of the importance of expert systems in the Space Station, it was decided that an expert system should be embedded in the UIL. This would not only provide insight into the UIL components required but would also indicate the effectiveness with which an expert system could function in such an environment.
A Multi-User Model for Effectively Communicating Research Through Electronic Media
NASA Astrophysics Data System (ADS)
Hinds, J. J.; Fairley, J. P.
2003-12-01
Electronic media have demonstrated potential for data exchange, dissemination of results to other scientists, communication with community interest groups, and education of the general public regarding scientific advances. Few researchers, however, receive training in the skills required to capture the attention of the broad spectrum of Internet users. Because different people assimilate information in different ways, effective communication is best accomplished using an appropriate mix of photographs, graphics, tables, and text. In addition, effective web page design requires a clear, consistent organizational structure, easily navigated layout, and attention to details such as page printability, downloading time, and minimal page scrolling. One of the strengths of electronic media is that the user can choose an appropriate level of involvement for his or her interest. In designing a web page for the multidisciplinary NSF/EPSCoR "Biocomplexity in Extreme Environments" project, we divided potential users into three categories based on our perception of the level of detail they required: 1) project participants, 2) non-participants with technical backgrounds, and 3) the general public. By understanding the needs and expectations of potential viewers, it was possible to present each group with an appropriate balance of visual and textual elements. For example, project participants are often most interested in raw data, which can be effectively presented in tabular format. Non-participants with technical backgrounds are more interested in analyzed data, while a project overview, presented through photographs and graphics with minimal text, will be most effective for communicating with the general public. The completed web page illustrates one solution for effectively communicating with a diverse audience, and provides examples for meeting many of the challenges of web page design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gosnell, Thomas B.; Chavez, Joseph R.; Rowland, Mark S.
2014-02-26
RadID is a new gamma-ray spectrum analysis program for rapid screening of HPGe gamma-ray data to reveal the presence of radionuclide signatures. It is an autonomous, rule-based heuristic system that can identify well over 200 radioactive sources with particular interest in uranium and plutonium characteristics. It executes in about one second. RadID does not require knowledge of the detector efficiency, the source-to-detector distance, or the geometry of the inspected radiation source—including any shielding. In this first of a three-document series we sketch the RadID program’s origin, its minimal requirements, the user experience, and the program operation.
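RadID's rule base is not published in this abstract; a minimal sketch of the underlying idea, matching measured peak energies against a small nuclide line library within an energy tolerance, might look like the following (the library contents and tolerance are illustrative, not RadID's):

```python
# Tiny illustrative line library (keV); these are well-known gamma lines,
# but RadID's actual library and rules are far richer.
LIBRARY = {
    "Cs-137": [661.7],
    "Co-60": [1173.2, 1332.5],
    "U-235": [185.7],
}

def identify(peaks_kev, tol=1.0):
    """Return nuclides whose every library line matches some measured peak."""
    hits = []
    for nuclide, lines in LIBRARY.items():
        if all(any(abs(p - line) <= tol for p in peaks_kev) for line in lines):
            hits.append(nuclide)
    return sorted(hits)
```

Requiring every library line to match (rather than any) is what keeps a single chance coincidence, such as one Co-60 line without its partner, from producing a false identification.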
Process Improvements in Training Device Acceptance Testing: A Study in Total Quality Management
1990-12-12
Quality Management, a small group of Government and industry specialists examined the existing training device acceptance test process for potential improvements. The agreed-to mission of the Air Force/Industry partnership was to continuously identify and promote implementable approaches to minimize the cost and time required for acceptance testing while ensuring that validated performance supports the user training requirements. Application of a Total Quality process improvement model focused on the customers and their requirements, analyzed how work was accomplished, and
NASA Technical Reports Server (NTRS)
Moerder, Daniel D.
2014-01-01
MADS (Minimization Assistant for Dynamical Systems) is a trajectory optimization code in which a user-specified performance measure is directly minimized, subject to constraints placed on a low-order discretization of user-supplied plant ordinary differential equations. This document describes the mathematical formulation of the set of trajectory optimization problems for which MADS is suitable, and describes the user interface. Usage examples are provided.
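The MADS interface itself is not reproduced here, but the class of problems it targets, direct minimization of a performance measure over a low-order discretization of user-supplied plant ODEs, can be sketched on a toy problem (all problem details below are illustrative assumptions, not MADS code):

```python
import numpy as np
from scipy.optimize import minimize

# Toy trajectory problem (illustrative): minimize the integral of u^2
# subject to x' = u, x(0) = 0, x(1) = 1, using forward-Euler collocation.
N = 20
h = 1.0 / N

def cost(z):
    u = z[N + 1:]                       # z = [x_0..x_N, u_0..u_{N-1}]
    return h * np.sum(u ** 2)

def defects(z):
    x, u = z[:N + 1], z[N + 1:]
    dyn = x[1:] - x[:-1] - h * u        # Euler defect at each step
    return np.concatenate([dyn, [x[0], x[-1] - 1.0]])

z0 = np.zeros(2 * N + 1)
sol = minimize(cost, z0, method="SLSQP",
               constraints={"type": "eq", "fun": defects})
```

With the defect constraints satisfied, the recovered control is u &#8776; 1 everywhere and the cost approaches the analytic optimum of 1 for this convex problem.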
Bringing Text Display Digital Radio to Consumers with Hearing Loss
ERIC Educational Resources Information Center
Sheffield, Ellyn G.; Starling, Michael; Schwab, Daniel
2011-01-01
Radio is migrating to digital transmission, expanding its offerings to include captioning for individuals with hearing loss. Text display radio requires a large amount of word throughput with minimal screen display area, making good user interface design crucial to its success. In two experiments, we presented hearing, hard-of-hearing, and deaf…
User-Centered Evaluation of the Quality of Blogs
ERIC Educational Resources Information Center
Chuenchom, Sutthinan
2011-01-01
Blogs serve multiple purposes, resulting in several types of blogs that vary greatly in terms of quality and content. It is important to evaluate the quality of blogs, which requires appropriate evaluation criteria. Unfortunately, there are minimal studies on frameworks and on the specific criteria and indicators for evaluating the quality of blogs.…
Nelson, Scott D; Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R
2016-01-01
Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Our objective here is to describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included 1) having subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes; 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; and 6) a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system.
Central American information system for energy planning (in English; Spanish)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fonseca, M.G.; Lyon, P.C.; Heskett, J.C.
1991-04-01
SICAPE (Sistema de Información Centroamericano para Planificación Energética) is an expandable information system designed for energy planning. Its objective is to satisfy ongoing information requirements by means of a menu-driven operational environment. SICAPE is as easily used by the novice computer user as by those with more experience. Moreover, the system is capable of evolving concurrently with future requirements of the individual country. The expansion is accomplished by menu restructuring as data and user requirements change. The new menu configurations require no programming effort. The use and modification of SICAPE are separate menu-driven processes that allow for rapid data query, minimal training, and effortless continued growth. SICAPE's data are organized by country or region. Information is available in the following areas: energy balance, macroeconomics, electricity generation capacity, and electricity and petroleum product pricing. (JF)
Aligning Technology with the Organisation Using Focus and User Groups
NASA Astrophysics Data System (ADS)
Owens, Simeon
As an IT Manager of nine years in a small healthcare organisation, which has transitioned from a minimal IT base to fully fledged systems, I have discovered two structures that have helped enormously in this transition. These structures are, firstly, the focus group, which looks at the IT requirements of the business, and, secondly, the user group, a group of super users who help in the day-to-day running of the systems. I have put together a number of lessons learnt over the years through experience of the workings of these groups, the benefits they offer, and the value they bring to the organisation.
An efficient sparse matrix multiplication scheme for the CYBER 205 computer
NASA Technical Reports Server (NTRS)
Lambiotte, Jules J., Jr.
1988-01-01
This paper describes the development of an efficient algorithm for computing the product of a matrix and vector on a CYBER 205 vector computer. The desire to provide software which allows the user to choose between the often conflicting goals of minimizing central processing unit (CPU) time or storage requirements has led to a diagonal-based algorithm in which one of four types of storage is selected for each diagonal. The candidate storage types were chosen to be efficient on the CYBER 205 for diagonals whose nonzero structure is dense, moderately sparse, very sparse and short, or very sparse and long; however, for many densities, no single storage type is most efficient with respect to both resources, and a trade-off must be made. For each diagonal, an initialization subroutine estimates the CPU time and storage required for each storage type based on results from previously performed numerical experimentation. These requirements are adjusted by weights provided by the user which reflect the relative importance the user places on the two resources. The adjusted resource requirements are then compared to select the most efficient storage and computational scheme.
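The CYBER 205 implementation is not shown; the diagonal-by-diagonal product at the heart of such an algorithm can be sketched as follows (dense-diagonal storage only, with the paper's storage-type selection omitted):

```python
import numpy as np

def diag_matvec(diagonals, x):
    """y = A @ x where A is given as {offset: diagonal values}.

    Offset k >= 0 is the k-th superdiagonal, k < 0 a subdiagonal;
    diagonals[k] has length n - |k|.  Illustrative sketch only.
    """
    n = len(x)
    y = np.zeros(n)
    for k, d in diagonals.items():
        if k >= 0:
            y[:n - k] += d * x[k:]     # superdiagonal: row i uses x[i + k]
        else:
            y[-k:] += d * x[:n + k]    # subdiagonal: row i uses x[i + k]
    return y
```

Each diagonal contributes one long elementwise multiply-add, which is exactly the kind of contiguous vector operation a pipelined machine like the CYBER 205 executed efficiently.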
AMMOS2: a web server for protein-ligand-water complexes refinement via molecular mechanics.
Labbé, Céline M; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O; Pajeva, Ilza; Miteva, Maria A
2017-07-03
AMMOS2 is an interactive web server for efficient computational refinement of protein-small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein-ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein-ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein-ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein-ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein-ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein-ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
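AMMP's force field is not reproduced here; the energy-minimization step at the core of such refinement can be illustrated with steepest descent on a single Lennard-Jones pair (a sketch of the generic technique, not the AMMOS2 protocol):

```python
def lj_energy_grad(r, eps=1.0, sigma=1.0):
    """Lennard-Jones 12-6 energy and dE/dr for one pair distance r."""
    sr6 = (sigma / r) ** 6
    e = 4 * eps * (sr6 ** 2 - sr6)
    de = 4 * eps * (-12 * sr6 ** 2 + 6 * sr6) / r
    return e, de

def minimize_distance(r0, step=1e-3, iters=20000):
    """Steepest descent on the pair distance; the analytic minimum
    of the 12-6 potential is at r = 2**(1/6) * sigma."""
    r = r0
    for _ in range(iters):
        _, g = lj_energy_grad(r)
        r -= step * g
    return r
```

A production minimizer works on all atomic coordinates at once with a full force field and smarter line searches, but the contract is the same: follow the negative energy gradient until the forces vanish.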
AMMOS2: a web server for protein–ligand–water complexes refinement via molecular mechanics
Labbé, Céline M.; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O.; Pajeva, Ilza
2017-01-01
Abstract AMMOS2 is an interactive web server for efficient computational refinement of protein–small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein–ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein–ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein–ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein–ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein–ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein–ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. PMID:28486703
Multiple linear regression analysis
NASA Technical Reports Server (NTRS)
Edwards, T. R.
1980-01-01
Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
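The FORTRAN IV source is not included in this record; the stepwise idea, repeatedly adding the variable that most improves the fit and stopping when the gain is no longer meaningful, can be sketched as follows (the relative-improvement threshold stands in for the program's significance test and is an illustrative assumption):

```python
import numpy as np

def forward_select(X, y, min_improve=0.01):
    """Greedy forward selection: add the column that most reduces the
    residual sum of squares, stopping when the relative improvement
    falls below `min_improve` (a stand-in for a significance test)."""
    n, p = X.shape
    chosen, rss = [], float(np.sum((y - y.mean()) ** 2))
    while len(chosen) < p:
        best, best_rss = None, rss
        for j in range(p):
            if j in chosen:
                continue
            A = np.column_stack([np.ones(n)] + [X[:, k] for k in chosen + [j]])
            resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
            cand = float(resid @ resid)
            if cand < best_rss:
                best, best_rss = j, cand
        if best is None or (rss - best_rss) / rss < min_improve:
            break
        chosen.append(best)
        rss = best_rss
    return chosen
```

The final model keeps only the columns that earned their place, mirroring the abstract's "final regression contains only most statistically significant coefficients."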
Application-Defined Decentralized Access Control
Xu, Yuanzhong; Dunn, Alan M.; Hofmann, Owen S.; Lee, Michael Z.; Mehdi, Syed Akbar; Witchel, Emmett
2014-01-01
DCAC is a practical OS-level access control system that supports application-defined principals. It allows normal users to perform administrative operations within their privilege, enabling isolation and privilege separation for applications. It does not require centralized policy specification or management, giving applications freedom to manage their principals while the policies are still enforced by the OS. DCAC uses hierarchically-named attributes as a generic framework for user-defined policies such as groups defined by normal users. For both local and networked file systems, its execution time overhead is between 0%–9% on file system microbenchmarks, and under 1% on applications. This paper shows the design and implementation of DCAC, as well as several real-world use cases, including sandboxing applications, enforcing server applications’ security policies, supporting NFS, and authenticating user-defined sub-principals in SSH, all with minimal code changes. PMID:25426493
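The abstract does not give DCAC's exact rules; the flavor of hierarchically-named attributes, where holding `.u.alice` implicitly dominates anything beneath it such as `.u.alice.photos`, can be sketched like this (the names and check semantics are illustrative):

```python
def controls(held, target):
    """True if attribute `held` dominates `target` in the hierarchy.
    '.u.alice' controls '.u.alice' and '.u.alice.photos',
    but not the distinct sibling '.u.alice2'."""
    return target == held or target.startswith(held + ".")

def may_access(principal_attrs, required_attrs):
    """A process may access an object if, for every attribute the
    object requires, it holds some attribute that dominates it."""
    return all(any(controls(h, r) for h in principal_attrs)
               for r in required_attrs)
```

Because dominance is pure string-prefix structure, ordinary users can mint sub-attributes (e.g. groups) under their own prefix without any centralized policy administration, which is the property the paper emphasizes.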
Identifying and Tracing User Needs
NASA Astrophysics Data System (ADS)
To, C.; Tauer, E.
2017-12-01
Providing adequate tools to the user community hinges on reaching the specific goals and needs behind the intended application of the tool. While the approach of leveraging user-supplied inputs and use cases to identify those goals is not new, there frequently remains the challenge of tracing those use cases through to implementation in an efficient and manageable fashion. Processes can become overcomplicated very quickly, and additionally, explicitly mapping progress towards the achievement of the user demands can become overwhelming when hundreds of use-cases are at play. This presentation will discuss a demonstrated use-case approach that has achieved an initial success with a tool re-design and deployment, the means to apply use cases in the generation of a roadmap for future releases over time, and the ability to include and adjust to new user requirements and suggestions with minimal disruption to the traceability. It is hoped that the findings and lessons learned will help make use case employment easier for others seeking to create user-targeted capabilities.
Development of CPR security using impact analysis.
Salazar-Kish, J.; Tate, D.; Hall, P. D.; Homa, K.
2000-01-01
The HIPAA regulations will require that institutions ensure the prevention of unauthorized access to electronically stored or transmitted patient records. This paper discusses a process for analyzing the impact of security mechanisms on users of computerized patient records through "behind the scenes" electronic access audits. In this way, those impacts can be assessed and refined to an acceptable standard prior to implementation. Through an iterative process of design and evaluation, we develop security algorithms that will protect electronic health information from improper access, alteration or loss, while minimally affecting the flow of work of the user population as a whole. PMID:11079984
TDRSS telecommunications system, PN code analysis
NASA Technical Reports Server (NTRS)
Dixon, R.; Gold, R.; Kaiser, F.
1976-01-01
The pseudo-noise (PN) codes required to support the TDRSS telecommunications services are analyzed, and the impact of alternate coding techniques on the user transponder equipment, the TDRSS equipment, and all factors that contribute to the acquisition and performance of these telecommunication services is assessed. Possible alternatives to the currently proposed hybrid FH/direct-sequence acquisition procedures are considered and compared relative to acquisition time, implementation complexity, operational reliability, and cost. The hybrid FH/direct-sequence technique is analyzed and rejected in favor of a recommended approach which minimizes acquisition time and user transponder complexity while maximizing probability of acquisition and overall link reliability.
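The actual TDRSS codes are not given here; PN sequences of the kind analyzed are classically generated by linear-feedback shift registers, sketched below (the degree-4 register is a textbook maximal-length example, not a TDRSS code):

```python
def lfsr_sequence(taps, state, length):
    """Generate a PN sequence from a Fibonacci LFSR.

    `taps` are 0-based register positions XORed to form the feedback bit;
    `state` is the initial (nonzero) register contents.
    """
    out = []
    reg = list(state)
    for _ in range(length):
        out.append(reg[0])
        fb = 0
        for t in taps:
            fb ^= reg[t]
        reg = reg[1:] + [fb]           # shift and append feedback
    return out

# Degree-4 maximal-length register (recurrence x^4 = x + 1):
seq = lfsr_sequence(taps=[0, 1], state=[1, 0, 0, 0], length=15)
```

A maximal-length register of degree n repeats with period 2^n - 1 (here 15) and is nearly balanced, with 2^(n-1) ones; these are the correlation-friendly properties that make PN codes usable for spread-spectrum acquisition.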
NASA Technical Reports Server (NTRS)
Schoen, A. H.; Rosenstein, H.; Stanzione, K.; Wisniewski, J. S.
1980-01-01
This report describes the use of the V/STOL Aircraft Sizing and Performance Computer Program (VASCOMP II). The program is useful in performing aircraft parametric studies in a quick and cost efficient manner. Problem formulation and data development were performed by the Boeing Vertol Company and reflects the present preliminary design technology. The computer program, written in FORTRAN IV, has a broad range of input parameters, to enable investigation of a wide variety of aircraft. User oriented features of the program include minimized input requirements, diagnostic capabilities, and various options for program flexibility.
Barnette, Daniel W.
2002-01-01
The present invention provides a method of grid generation that uses the geometry of the problem space and the governing relations to generate a grid. The method can generate a grid with minimized discretization errors, and with minimal user interaction. The method of the present invention comprises assigning grid cell locations so that, when the governing relations are discretized using the grid, at least some of the discretization errors are substantially zero. Conventional grid generation is driven by the problem space geometry; grid generation according to the present invention is driven by problem space geometry and by governing relations. The present invention accordingly can provide two significant benefits: more efficient and accurate modeling since discretization errors are minimized, and reduced cost grid generation since less human interaction is required.
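The patent's procedure is only summarized above; a related, standard grid-generation idea, equidistributing a weight such as solution curvature so that cells concentrate where discretization error would otherwise be largest, can be sketched (this is the generic equidistribution technique, not the patented method):

```python
import numpy as np

def equidistribute(weight, x, n_cells):
    """Place n_cells + 1 grid points so each cell holds an equal share
    of the integral of `weight` sampled at `x` (trapezoid rule)."""
    w = np.maximum(weight, 1e-12)      # keep the measure strictly positive
    cum = np.concatenate([[0.0],
                          np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
    targets = np.linspace(0.0, cum[-1], n_cells + 1)
    return np.interp(targets, cum, x)  # invert the cumulative weight
```

With a weight that peaks near a boundary layer, the resulting cells are small where the weight is large and coarse elsewhere, automatically and without user interaction.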
Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.
2016-01-01
Background: Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective: To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods: Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results: Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included 1) having subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes; 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; and 6) a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404
User requirements for a patient scheduling system
NASA Technical Reports Server (NTRS)
Zimmerman, W.
1979-01-01
A rehabilitation institute's needs and wants from a scheduling system were established by (1) studying the existing scheduling system and the variables that affect patient scheduling, (2) conducting a human-factors study to establish the human interfaces that affect patients' meeting prescribed therapy schedules, and (3) developing and administering a questionnaire to the staff which pertains to the various interface problems in order to identify staff requirements to minimize scheduling problems and other factors that may limit the effectiveness of any new scheduling system.
KRISSY: user's guide to modeling three-dimensional wind flow in complex terrain
Michael A. Fosberg; Michael L. Sestak
1986-01-01
KRISSY is a computer model for generating three-dimensional wind flows in complex terrain from data that were not, or perhaps cannot be, collected. The model is written in FORTRAN IV. This guide describes data requirements, modeling, and output from an applications viewpoint rather than that of programming or theoretical modeling. KRISSY is designed to minimize...
BFEE: A User-Friendly Graphical Interface Facilitating Absolute Binding Free-Energy Calculations.
Fu, Haohao; Gumbart, James C; Chen, Haochuan; Shao, Xueguang; Cai, Wensheng; Chipot, Christophe
2018-03-26
Quantifying protein-ligand binding has attracted the attention of both theorists and experimentalists for decades. Many methods for estimating binding free energies in silico have been reported in recent years. Proper use of the proposed strategies requires, however, adequate knowledge of the protein-ligand complex, the mathematical background for deriving the underlying theory, and time for setting up the simulations, bookkeeping, and postprocessing. Here, to minimize human intervention, we propose a toolkit aimed at facilitating the accurate estimation of standard binding free energies using a geometrical route, coined the binding free-energy estimator (BFEE), and introduce it as a plug-in of the popular visualization program VMD. Benefitting from recent developments in new collective variables, BFEE can be used to generate the simulation input files based solely on the structure of the complex. Once the simulations are completed, BFEE can also be utilized to perform the post-treatment of the free-energy calculations, allowing the absolute binding free energy to be estimated directly from the one-dimensional potentials of mean force in simulation outputs. The minimal amount of human intervention required during the whole process, combined with the ergonomic graphical interface, makes BFEE a very effective and practical tool for the end-user.
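BFEE's geometrical route combines several potentials of mean force; the final step of turning a one-dimensional PMF W(z) into a free-energy contribution by Boltzmann-weighted integration can be illustrated as follows (the units and example PMFs are assumptions for the sketch, not BFEE's actual workup):

```python
import math

def boltzmann_free_energy(z, w, kT=0.596):   # kT ~ 0.596 kcal/mol near 300 K
    """G = -kT * ln( integral of exp(-W(z)/kT) dz ) over the sampled
    window, using the trapezoid rule on tabulated PMF values."""
    f = [math.exp(-wi / kT) for wi in w]
    integral = sum(0.5 * (f[i] + f[i + 1]) * (z[i + 1] - z[i])
                   for i in range(len(z) - 1))
    return -kT * math.log(integral)
```

A flat PMF over a unit window contributes zero, and a uniformly raised PMF contributes exactly its offset, which is a convenient sanity check on any such integrator.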
NASA Technical Reports Server (NTRS)
Hockensmith, R.; Devine, E.; Digiacomo, M.; Hager, F.; Moss, R.
1983-01-01
Satellites that use the NASA Tracking and Data Relay Satellite System (TDRSS) require antennas that are crucial for performing and achieving reliable TDRSS link performance at the desired data rate. Technical guidelines are presented to assist the prospective TDRSS medium- and high-data-rate user in selecting and procuring a viable, steerable high-gain antenna system. Topics addressed include the antenna gain/transmitter power/data rate relationship; Earth power flux-density limitations; electromechanical requirements dictated by the small beamwidths, desired angular coverage, and minimal torque disturbance to the spacecraft; weight and moment considerations; mechanical, electrical, and thermal interfaces; design lifetime failure modes; and handling and storage. Proven designs are cited and space-qualified assemblies and components are identified.
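The antenna gain/transmitter power/data rate relationship mentioned above follows from the standard link-budget equation; a sketch with assumed numbers (not actual TDRSS parameters) shows the trade:

```python
BOLTZMANN_DBW = -228.6   # 10*log10(Boltzmann constant), dBW/K/Hz

def supported_data_rate_dbhz(eirp_dbw, g_over_t_dbk, path_loss_db,
                             required_ebn0_db, margin_db=3.0):
    """Data rate (in dB-Hz) a link can close, from the standard budget:
    C/N0 = EIRP + G/T - Lpath - 10log10(k);  R = C/N0 - Eb/N0,req - margin."""
    cn0 = eirp_dbw + g_over_t_dbk - path_loss_db - BOLTZMANN_DBW
    return cn0 - required_ebn0_db - margin_db

# Assumed example: 10 W transmitter (+10 dBW) plus a 30 dBi antenna
# gives 40 dBW EIRP; 5 dB/K relay G/T; 207 dB path loss; 9.6 dB Eb/N0.
rate_dbhz = supported_data_rate_dbhz(40.0, 5.0, 207.0, 9.6)
rate_bps = 10 ** (rate_dbhz / 10)
```

Because everything is additive in decibels, each extra 3 dB of antenna gain buys either a halving of transmitter power or a doubling of data rate, which is precisely the trade the guideline helps the user navigate.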
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2010-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, William L.
2013-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the Fun3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2011-01-01
This users manual provides in-depth information concerning installation and execution of Laura, version 5. Laura is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 Laura code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, Laura now shares gas-physics modules, MPI modules, and other low-level modules with the Fun3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2009-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multiphysics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Astrophysics Data System (ADS)
Brown, Nicholas J.; Lloyd, David S.; Reynolds, Melvin I.; Plummer, David L.
2002-05-01
A visible digital image is rendered from a set of digital image data. Medical digital image data can be stored in either (a) a pre-rendered format, corresponding to a photographic print, or (b) an un-rendered format, corresponding to a photographic negative. The appropriate image data storage format and associated header data (metadata) required by a user of the results of a diagnostic procedure recorded electronically depend on the task(s) to be performed. The DICOM standard provides a rich set of metadata that supports the needs of complex applications. Many end-user applications, such as simple report text viewing and display of a selected image, are less demanding, and generic image formats such as JPEG are sometimes used. However, these formats lack some basic identification requirements. In this paper we make specific proposals for minimal extensions to generic image metadata, of value in various domains, that enable safe use in two simple healthcare end-user scenarios: (a) viewing of text and a selected JPEG image activated by a hyperlink and (b) viewing of one or more JPEG images together with superimposed text and graphics annotation using a file specified by a profile of the ISO/IEC Basic Image Interchange Format (BIIF).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petrini, Fabrizio; Nieplocha, Jarek; Tipparaju, Vinod
2006-04-15
In this paper we will present a new technology that we are currently developing within the SFT: Scalable Fault Tolerance FastOS project, which seeks to implement fault tolerance at the operating system level. Major design goals include dynamic reallocation of resources to allow continuing execution in the presence of hardware failures, very high scalability, high efficiency (low overhead), and transparency, requiring no changes to user applications. Our technology is based on a global coordination mechanism that enforces transparent recovery lines in the system, and TICK, a lightweight, incremental checkpointing software architecture implemented as a Linux kernel module. TICK is completely user-transparent and does not require any changes to user code or system libraries; it is highly responsive: an interrupt, such as a timer interrupt, can trigger a checkpoint in as little as 2.5 μs; and it supports incremental and full checkpoints with minimal overhead: less than 6% with full checkpointing to disk performed as frequently as once per minute.
Healey, Benjamin; Hoek, Janet; Edwards, Richard
2014-01-01
Online Cessation Support Networks (OCSNs) are associated with increased quit success rates, but few studies have examined their use over time. We identified usage patterns in New Zealand's largest OCSN over two years and explored implications for OCSN intervention design and evaluation. We analysed metadata relating to 133,096 OCSN interactions during 2011 and 2012. Metrics covered aggregate network activity, user posting activity and longevity, and between-user commenting. Binary logistic regression models were estimated to investigate the feasibility of predicting low user engagement using early interaction data. Repeating periodic peaks and troughs in aggregate activity related not only to seasonality (e.g., New Year), but also to day of the week. Out of 2,062 unique users, 69 Highly Engaged Users (180+ interactions each) contributed 69% of all OCSN interactions in 2012 compared to 1.3% contributed by 864 Minimally Engaged Users (≤2 items each). The proportion of Highly Engaged Users increased with network growth between 2011 and 2012 (with marginal significance), but the proportion of Minimally Engaged Users did not decline substantively. First week interaction data enabled identification of Minimally Engaged Users with high specificity and sensitivity (AUROC = 0.94). Results suggest future research should develop and test interventions that promote activity, and hence cessation support, amongst specific user groups or at key time points. For example, early usage information could help identify Minimally Engaged Users for tests of targeted messaging designed to improve their integration into, or re-engagement with, the OCSN. Furthermore, although we observed strong growth over time on varied metrics including posts and comments, this change did not coincide with large gains in first-time user persistence. Researchers assessing intervention effects should therefore examine multiple measures when evaluating changes in network dynamics over time.
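The engagement-prediction step above lends itself to a compact illustration. The sketch below uses synthetic interaction counts, not the study's data or model: it scores users by their first-week interaction count and computes AUROC directly via the Mann-Whitney rank formulation, treating "minimally engaged" as the positive class.

```python
# Illustrative sketch (synthetic counts, not the study's data): score users
# by first-week interaction count and compute AUROC via the Mann-Whitney
# rank formulation, with "minimally engaged" as the positive class.

def auroc(pos_scores, neg_scores):
    """AUROC = P(pos score > neg score) + 0.5 * P(tie)."""
    wins = ties = 0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(pos_scores) * len(neg_scores))

# Hypothetical first-week interaction counts per user.
minimally_engaged = [0, 0, 1, 1, 2]   # positive class
engaged = [3, 5, 8, 1, 12]            # negative class

# Fewer first-week interactions -> higher "minimally engaged" risk score.
score = auroc([-c for c in minimally_engaged], [-c for c in engaged])
print(round(score, 2))
```

With real OCSN metadata the scores would come from a fitted binary logistic regression over several early-interaction metrics rather than a single raw count.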
BioPCD - A Language for GUI Development Requiring a Minimal Skill Set.
Alvare, Graham Gm; Roche-Lima, Abiel; Fristensky, Brian
2012-11-01
BioPCD is a new language whose purpose is to simplify the creation of Graphical User Interfaces (GUIs) by biologists with minimal programming skills. The first step in developing BioPCD was to create a minimal superset of the language referred to as PCD (Pythonesque Command Description). PCD defines the core of terminals and high-level nonterminals required to describe data of almost any type. BioPCD adds to PCD the constructs necessary to describe GUI components and the syntax for executing system commands. BioPCD is implemented using JavaCC to convert the grammar into code. BioPCD is designed to be terse, readable, and simple enough to be learned by copying and modifying existing BioPCD files. We demonstrate that BioPCD can easily be used to generate GUIs for existing command line programs. Although BioPCD was designed to make it easier to run bioinformatics programs, it could be used in any domain in which many useful command line programs exist that do not have GUI interfaces.
ROCOPT: A user friendly interactive code to optimize rocket structural components
NASA Technical Reports Server (NTRS)
Rule, William K.
1989-01-01
ROCOPT is a user-friendly, graphically-interfaced, microcomputer-based computer program (IBM compatible) that optimizes rocket components by minimizing the structural weight. The rocket components considered are ring stiffened truncated cones and cylinders. The applied loading is static, and can consist of any combination of internal or external pressure, axial force, bending moment, and torque. Stress margins are calculated by means of simple closed form strength of material type equations. Stability margins are determined by approximate, orthotropic-shell, closed-form equations. A modified form of Powell's method, in conjunction with a modified form of the external penalty method, is used to determine the minimum weight of the structure subject to stress and stability margin constraints, as well as user input constraints on the structural dimensions. The graphical interface guides the user through the required data prompts, explains program options and graphically displays results for easy interpretation.
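The optimization strategy described, a Powell-type direct search on an exterior-penalty transformation of the constrained weight minimization, can be sketched on a toy problem. Everything below is hypothetical (a two-variable stand-in objective, not ROCOPT's shell equations): a "weight" x1 + x2 is minimized subject to a "stress margin" x1*x2 >= 4, with the constraint folded into the objective as a quadratic exterior penalty and a crude coordinate search standing in for Powell's method.

```python
# Toy exterior-penalty minimization in the spirit of ROCOPT. The objective
# and constraint are hypothetical stand-ins, not the program's equations:
# minimize "weight" x1 + x2 subject to "stress margin" x1*x2 >= 4.

def penalized(x, r):
    f = x[0] + x[1]                    # stand-in for structural weight
    g = max(0.0, 4.0 - x[0] * x[1])    # violation of the margin constraint
    return f + r * g * g               # exterior quadratic penalty

def coordinate_search(x, r, step=0.5, tol=1e-6):
    # Crude stand-in for Powell's direct search: improve one coordinate
    # at a time, halving the step when no move helps.
    x = list(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                if penalized(trial, r) < penalized(x, r):
                    x = trial
                    improved = True
        if not improved:
            step *= 0.5
    return x

x = [5.0, 5.0]
for r in (1.0, 10.0, 100.0, 1000.0):   # tighten the penalty each pass
    x = coordinate_search(x, r)
print(x)
```

Increasing the penalty weight r across passes drives the iterate toward the feasible boundary, here x1 = x2 = 2; the same pattern lets stress, stability, and dimension constraints all enter a single unconstrained search.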
More realistic power estimation for new user, active comparator studies: an empirical example.
Gokhale, Mugdha; Buse, John B; Pate, Virginia; Marquis, M Alison; Stürmer, Til
2016-04-01
Pharmacoepidemiologic studies are often expected to be sufficiently powered to study rare outcomes, but there is sequential loss of power with implementation of study design options minimizing bias. We illustrate this using a study comparing pancreatic cancer incidence after initiating dipeptidyl-peptidase-4 inhibitors (DPP-4i) versus thiazolidinediones or sulfonylureas. We identified Medicare beneficiaries with at least one claim of DPP-4i or comparators during 2007-2009 and then applied the following steps: (i) exclude prevalent users, (ii) require a second prescription of same drug, (iii) exclude prevalent cancers, (iv) exclude patients age <66 years and (v) censor for treatment changes during follow-up. Power to detect hazard ratios (an effect measure strongly driven by the number of events) ≥ 2.0 estimated after step 5 was compared with the naïve power estimated prior to step 1. There were 19,388 and 28,846 DPP-4i and thiazolidinedione initiators during 2007-2009. The number of drug initiators dropped most after requiring a second prescription, outcomes dropped most after excluding patients with prevalent cancer and person-time dropped most after requiring a second prescription and as-treated censoring. The naïve power (>99%) was considerably higher than the power obtained after the final step (~75%). In designing new-user active-comparator studies, one should be mindful of how steps taken to minimize bias affect sample size, number of outcomes and person-time. While actual numbers will depend on specific settings, applying generic losses in percentages will improve estimates of power compared with the naïve approach, which mostly ignores steps taken to increase validity. Copyright © 2015 John Wiley & Sons, Ltd.
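The power erosion described can be made concrete with Schoenfeld's approximation for a two-group time-to-event comparison. The event counts below are invented for illustration; only the initiator counts come from the abstract.

```python
# Back-of-envelope sketch of power erosion using Schoenfeld's approximation
# for a two-group Cox comparison:
#   power = Phi( sqrt(d * p * (1-p)) * |ln HR| - z_{1-alpha/2} )
# where d is the number of outcome events and p the exposed proportion.
# The event counts (120 and 55) are hypothetical, not the study's.
import math

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def cox_power(events, p_exposed, hr, z_alpha=1.96):
    return normal_cdf(math.sqrt(events * p_exposed * (1 - p_exposed))
                      * abs(math.log(hr)) - z_alpha)

# DPP-4i share of initiators, from the abstract's cohort counts.
p = 19388 / (19388 + 28846)
print(round(cox_power(120, p, 2.0), 2))   # naive design, all events kept
print(round(cox_power(55, p, 2.0), 2))    # after design restrictions
```

Halving the event count through exclusions and censoring drops power from the mid-90s to the low 70s in this sketch, the same qualitative pattern the study reports.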
Design and evaluation of nonverbal sound-based input for those with motor handicaps.
Punyabukkana, Proadpran; Chanjaradwichai, Supadaech; Suchato, Atiwong
2013-03-01
Most personal computing interfaces rely on the users' ability to use their hand and arm movements to interact with on-screen graphical widgets via mainstream devices, including keyboards and mice. Without proper assistive devices, this style of input poses difficulties for motor-handicapped users. We propose a sound-based input scheme enabling users to operate Windows' Graphical User Interface by producing hums and fricatives through regular microphones. Hierarchically arranged menus are utilized so that only a minimal number of different actions is required at a time. The proposed scheme was found to be accurate and capable of responding promptly compared to other sound-based schemes. Being able to select from multiple item-selecting modes helped reduce the average time needed to complete tasks in the test scenarios to almost half that required when the tasks were performed solely through cursor movements. Still, improvements in helping users select the most appropriate modes for desired tasks should improve the overall usability of the proposed scheme.
A User's Guide for the Differential Reduced Ejector/Mixer Analysis "DREA" Program. 1.0
NASA Technical Reports Server (NTRS)
DeChant, Lawrence J.; Nadell, Shari-Beth
1999-01-01
A system of analytical and numerical two-dimensional mixer/ejector nozzle models that require minimal empirical input has been developed and programmed for use in conceptual and preliminary design. This report contains a user's guide describing the operation of the computer code, DREA (Differential Reduced Ejector/mixer Analysis), that contains these mathematical models. This program is currently being adopted by the Propulsion Systems Analysis Office at the NASA Glenn Research Center. A brief summary of the DREA method is provided, followed by detailed descriptions of the program input and output files. Sample cases demonstrating the application of the program are presented.
Rankin, Jeffery W; Kwarciak, Andrew M; Richter, W Mark; Neptune, Richard R
2012-11-01
The majority of manual wheelchair users will experience upper extremity injuries or pain, in part due to the high force requirements, repetitive motion and extreme joint postures associated with wheelchair propulsion. Recent studies have identified cadence, contact angle and peak force as important factors for reducing upper extremity demand during propulsion. However, studies often make comparisons between populations (e.g., able-bodied vs. paraplegic) or do not investigate specific measures of upper extremity demand. The purpose of this study was to use a musculoskeletal model and forward dynamics simulations of wheelchair propulsion to investigate how altering cadence, peak force and contact angle influence individual muscle demand. Forward dynamics simulations of wheelchair propulsion were generated to emulate group-averaged experimental data during four conditions: 1) self-selected propulsion technique, and while 2) minimizing cadence, 3) maximizing contact angle, and 4) minimizing peak force using biofeedback. Simulations were used to determine individual muscle mechanical power and stress as measures of muscle demand. Minimizing peak force and cadence had the lowest muscle power requirements. However, minimizing peak force increased cadence and recovery power, while minimizing cadence increased average muscle stress. Maximizing contact angle increased muscle stress and had the highest muscle power requirements. Minimizing cadence appears to have the most potential for reducing muscle demand and fatigue, which could decrease upper extremity injuries and pain. However, altering any of these variables to extreme values appears to be less effective; instead small to moderate changes may better reduce overall muscle demand. Copyright © 2012 Elsevier Ltd. All rights reserved.
Hadavand, Mostafa; Mirbagheri, Alireza; Behzadipour, Saeed; Farahmand, Farzam
2014-06-01
An effective master robot for haptic tele-surgery applications needs to provide a solution for the inversed movements of the surgical tool, in addition to sufficient workspace and manipulability, with minimal moving inertia. A novel 4 + 1-DOF mechanism was proposed, based on a triple parallelogram linkage, which provided a Remote Center of Motion (RCM) at the back of the user's hand. The kinematics of the robot was analyzed and a prototype was fabricated and evaluated by experimental tests. With an RCM at the back of the user's hand and the actuators far from the end effector, the robot could produce the sensation of hand-inside surgery with minimal moving inertia. The target workspace was achieved with an acceptable manipulability. The trajectory tracking experiments revealed small errors, due to backlash at the joints. The proposed mechanism meets the basic requirements of an effective master robot for haptic tele-surgery applications. Copyright © 2013 John Wiley & Sons, Ltd.
Developing an integrated electronic nursing record based on standards.
van Grunsven, Arno; Bindels, Rianne; Coenen, Chel; de Bel, Ernst
2006-01-01
The Radboud University Nijmegen Medical Centre in the Netherlands is developing a multidisciplinary EHR (Electronic Health Record) based on the latest HL7 v3 (Health Level 7, version 3) D-MIM: Care Provision. As part of this process we are trying to establish which nursing diagnoses and activities are minimally required. These NMDS (Nursing Minimal Data Set) items are mapped or translated to ICF (for diagnoses) and CEN1828 structures (for activities). The mappings will be the foundation for the development of user interfaces for the registration of nursing activities. A custom-made, web-based configuration tool is used to exploit the possibilities of HL7 v3. This enables rapid rollout of user interfaces that can accommodate the diversity of health care work processes. The first screens will be developed to support history taking for the nursing chart of the Neurology ward. The screens will contain both Dutch NMDS items and ward-specific information. This will be configured dynamically per ward or group of wards.
2011-01-01
Background Academic literature and international standards bodies suggest that user involvement, via the incorporation of human factors engineering methods within the medical device design and development (MDDD) process, offers many benefits that enable the development of safer and more usable medical devices better suited to users' needs. However, little research has been carried out to explore medical device manufacturers' beliefs and attitudes towards user involvement within this process, or indeed what value they believe can be added by doing so. Methods In-depth interviews with representatives from 11 medical device manufacturers are carried out. We ask them to specify who they believe the intended users of the device to be, who they consult to inform the MDDD process, what role they believe the user plays within this process, and what value (if any) they believe users add. Thematic analysis is used to analyse the fully transcribed interview data, to gain insight into medical device manufacturers' beliefs and attitudes towards user involvement within the MDDD process. Results A number of high-level themes emerged, relating to who the user is perceived to be, the methods used, the perceived value of and barriers to user involvement, and the nature of user contributions. The findings reveal that despite standards agencies and academic literature offering strong support for the employment of formal methods, manufacturers are still hesitant due to a range of factors including: perceived barriers to obtaining ethical approval; the speed at which such activity may be carried out; the belief that there is no need, given the 'all-knowing' nature of senior health care staff and clinical champions; and a belief that effective results are achievable by consulting a minimal number of champions. Furthermore, less senior health care practitioners and patients were rarely seen as being able to provide valuable input into the process.
Conclusions Medical device manufacturers often do not see the benefit of employing formal human factors engineering methods within the MDDD process. Research is required to better understand the day-to-day requirements of manufacturers within this sector. The development of new or adapted methods may be required if user involvement is to be fully realised. PMID:21356097
Waste minimization for commercial radioactive materials users generating low-level radioactive waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischer, D.K.; Gitt, M.; Williams, G.A.
1991-07-01
The objective of this document is to provide a resource for all states and compact regions interested in promoting the minimization of low-level radioactive waste (LLW). This project was initiated by the Commonwealth of Massachusetts, and Massachusetts waste streams have been used as examples; however, the methods of analysis presented here are applicable to similar waste streams generated elsewhere. This document is a guide for states/compact regions to use in developing a system to evaluate and prioritize various waste minimization techniques in order to encourage individual radioactive materials users (LLW generators) to consider these techniques in their own independent evaluations. This review discusses the application of specific waste minimization techniques to waste streams characteristic of three categories of radioactive materials users: (1) industrial operations using radioactive materials in the manufacture of commercial products, (2) health care institutions, including hospitals and clinics, and (3) educational and research institutions. Massachusetts waste stream characterization data from key radioactive materials users in each category are used to illustrate the applicability of various minimization techniques. The utility group is not included because extensive information specific to this category of LLW generators is available in the literature.
ERIC Educational Resources Information Center
Guzei, Ilia A.; Hill, Nicholas J.; Zakai, Uzma I.
2010-01-01
Bruker SMART X2S is a portable benchtop diffractometer that requires only a 110 V outlet to operate. The instrument operation is intuitive and facile with an automation layer governing the workflow from behind the scenes. The user participation is minimal. At the end of an experiment, the instrument attempts to solve the structure automatically;…
Graphic analysis of resources by numerical evaluation techniques (Garnet)
Olson, A.C.
1977-01-01
An interactive computer program for graphical analysis has been developed by the U.S. Geological Survey. The program embodies five goals, (1) economical use of computer resources, (2) simplicity for user applications, (3) interactive on-line use, (4) minimal core requirements, and (5) portability. It is designed to aid (1) the rapid analysis of point-located data, (2) structural mapping, and (3) estimation of area resources. © 1977.
Atcherson, Samuel R; Damji, Zohra; Upson, Steve
2011-11-01
We explored the feasibility of a subtraction technique described by Friesen and Picton for removing the cochlear implant (CI) artifact from responses to long-duration stimuli presented in the soundfield and via direct input, all through the participant's preferred MAP. Friesen and Picton previously explored this technique by recording cortical potentials in four CI users with 1000 pulse-per-second (pps) stimuli, bypassing the speech processor. Cortical auditory evoked potentials (N1-P2) to 1000 Hz tones were recorded from a post-lingually deafened adult with three different stimulus presentation setups: soundfield to processor T-mic (SF), soundfield to lapel mic (SF-LM), and direct input (DI). Stimuli were presented at 65 dB SPL(A). The SF setup required stabilizing the head to minimize changes in the magnitude of the CI artifact. The SF-LM and DI setups did not require head stabilization, and were evaluated as alternatives to the SF setup. Clear N1-P2 responses were obtained with comparable waveform morphologies, amplitudes, and latencies despite some differences in the magnitude of the CI artifact across the stimulus presentation setups. The results of this study demonstrate that the subtraction technique is feasible for recording N1-P2 responses in CI users, though further studies are needed for the three stimulation setups.
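The subtraction logic can be caricatured in a few lines. This is a schematic of artifact subtraction in general, not Friesen and Picton's published procedure, and the waveforms are invented: if the CI artifact is identical in two averaged recordings while the neural response of interest differs between them, a sample-wise subtraction cancels the artifact.

```python
# Schematic of artifact subtraction (a caricature with invented numbers,
# not the published procedure): when two averaged recordings share the
# same sustained CI artifact, subtracting them cancels the artifact and
# leaves the neural response of interest.

def subtract(recording_a, recording_b):
    return [a - b for a, b in zip(recording_a, recording_b)]

# Hypothetical averaged epochs, one value per sample (arbitrary units).
artifact = [5.0, 5.0, 5.0, 5.0, 5.0, 5.0]    # sustained CI artifact
neural   = [0.0, -1.2, 0.0, 2.1, 0.8, 0.0]   # N1-P2-like deflection

with_response = [x + y for x, y in zip(artifact, neural)]
artifact_only = artifact                      # artifact-matched control

recovered = subtract(with_response, artifact_only)   # ~= neural
```

In practice the head stabilization discussed above matters precisely because the cancellation only works when the artifact really is the same in both recordings.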
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crandall, Duard W; Rutz, Frederick C
2004-08-12
Military maneuvers and training exercises are essential for national and world defense. These maneuvers must however be performed in a manner that will have a minimal effect on the environment and local civilians. As residential areas continue to develop near military sites, possible impacts from military traffic and exercises to these areas begin to become of greater concern. Concerns facing the military include the effects of particulate air quality and atmospheric dust dispersion caused by such maneuvers. To aid the Department of Defense with this problem, Pacific Northwest National Laboratory proposed a plan to develop, document and test a modeling system for use in dust dispersion reduction and management near government sites. To accomplish this task a user interface was developed that would be user friendly yet sophisticated enough to accommodate the needs of the client. One such need is to integrate a geographic information system (GIS) with the dust dispersion modeling software. This allows the user to enter the point, area, or line source required for the model runs. Incorporating the GIS with the software will also allow the user to view plume rise and expansion over actual data maps of the desired site. Data collected during previous field studies will be used to verify the results generated by the dust dispersion models. Thus utilizing historical, current, and user defined data, near real-time dust dispersion models will be able to aid in estimating and minimizing the effects of military exercises on the environment and nonmilitary personnel.
A user-driven treadmill control scheme for simulating overground locomotion.
Kim, Jonghyun; Stanley, Christopher J; Curatalo, Lindsey A; Park, Hyung-Soon
2012-01-01
Treadmill-based locomotor training should simulate overground walking as closely as possible for optimal skill transfer. The constant speed of a standard treadmill encourages automaticity rather than engagement and fails to simulate the variable speeds encountered during real-world walking. To address this limitation, this paper proposes a user-driven treadmill velocity control scheme that allows the user to experience natural fluctuations in walking velocity with minimal unwanted inertial force due to acceleration/deceleration of the treadmill belt. A smart estimation limiter in the scheme effectively attenuates the inertial force during velocity changes. The proposed scheme requires measurement of pelvic and swing foot motions, and is developed for a treadmill of typical belt length (1.5 m). The proposed scheme is quantitatively evaluated here with four healthy subjects by comparing it with the most advanced control scheme identified in the literature.
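The role of the "smart estimation limiter" can be illustrated with a minimal rate limiter. The gains and limits below are hypothetical, not the paper's controller: the belt-speed command follows the user's pelvis offset, but the per-tick change is clamped so that belt acceleration, and hence the unwanted inertial force felt by the user, stays bounded.

```python
# Minimal user-driven belt-speed sketch (hypothetical gains and limits,
# not the paper's controller): track the pelvis offset from mid-belt,
# but clamp the command's rate of change to bound belt acceleration.

def limit_rate(target, current, max_accel, dt):
    # Clamp the commanded change to at most max_accel * dt per tick.
    step = max(-max_accel * dt, min(max_accel * dt, target - current))
    return current + step

def belt_command(pelvis_offset_m, v_belt, dt=0.01, gain=2.0, max_accel=0.5):
    v_target = v_belt + gain * pelvis_offset_m   # walk ahead -> speed up
    return limit_rate(v_target, v_belt, max_accel, dt)

# User surges 0.3 m ahead of mid-belt at a 1.0 m/s belt speed: the raw
# target jumps to 1.6 m/s, but only a 0.005 m/s change is applied this tick.
v = belt_command(0.3, 1.0)
```

The paper's scheme additionally uses swing-foot motion in its velocity estimate; this sketch only shows why rate limiting attenuates the inertial force during speed changes.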
Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J
2013-07-01
Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed in Tecan Freedom EVO liquid handling software to facilitate automated sample preparation and the LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of the automation scripts allowed users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs supporting discovery biotherapeutic programs. The results demonstrated that the modular scripts provided flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those from the manual process, while bioanalytical productivity was significantly improved using the modular robotic scripts.
Standardised Benchmarking in the Quest for Orthologs
Altenhoff, Adrian M.; Boeckmann, Brigitte; Capella-Gutierrez, Salvador; Dalquen, Daniel A.; DeLuca, Todd; Forslund, Kristoffer; Huerta-Cepas, Jaime; Linard, Benjamin; Pereira, Cécile; Pryszcz, Leszek P.; Schreiber, Fabian; Sousa da Silva, Alan; Szklarczyk, Damian; Train, Clément-Marie; Bork, Peer; Lecompte, Odile; von Mering, Christian; Xenarios, Ioannis; Sjölander, Kimmen; Juhl Jensen, Lars; Martin, Maria J.; Muffato, Matthieu; Gabaldón, Toni; Lewis, Suzanna E.; Thomas, Paul D.; Sonnhammer, Erik; Dessimoz, Christophe
2016-01-01
The identification of evolutionarily related genes across different species—orthologs in particular—forms the backbone of many comparative, evolutionary, and functional genomic analyses. Achieving high accuracy in orthology inference is thus essential. Yet the true evolutionary history of genes, required to ascertain orthology, is generally unknown. Furthermore, orthologs are used for very different applications across different phyla, with different requirements in terms of the precision-recall trade-off. As a result, assessing the performance of orthology inference methods remains difficult for both users and method developers. Here, we present a community effort to establish standards in orthology benchmarking and facilitate orthology benchmarking through an automated web-based service (http://orthology.benchmarkservice.org). Using this new service, we characterise the performance of 15 well-established orthology inference methods and resources on a battery of 20 different benchmarks. Standardised benchmarking provides a way for users to identify the most effective methods for the problem at hand, sets a minimal requirement for new tools and resources, and guides the development of more accurate orthology inference methods. PMID:27043882
Evolving bipartite authentication graph partitions
Pope, Aaron Scott; Tauritz, Daniel Remy; Kent, Alexander D.
2017-01-16
As large-scale enterprise computer networks become more ubiquitous, finding the appropriate balance between user convenience and user access control is an increasingly challenging proposition. Suboptimal partitioning of users’ access and available services contributes to the vulnerability of enterprise networks. Previous edge-cut partitioning methods unduly restrict users’ access to network resources. This paper introduces a novel method of network partitioning, superior to the current state of the art, which minimizes user impact by providing alternate avenues for access that reduce vulnerability. Networks are modeled as bipartite authentication access graphs, and a multi-objective evolutionary algorithm is used to simultaneously minimize the size of large connected components while minimizing overall restrictions on network users. Lastly, results are presented on a real-world data set that demonstrate the effectiveness of the introduced method compared to previous naive methods.
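A minimal sketch of the two competing objectives, assuming a simple edge-list representation of the bipartite authentication graph (the representation and function names are illustrative, not the authors' code): for a candidate set of removed access edges, score the size of the largest remaining connected component against the number of restrictions imposed.

```python
# Evaluate the two objectives an evolutionary algorithm would trade off:
# (largest connected component size, number of removed access edges).
def largest_component(edges, nodes):
    """Size of the largest connected component of an undirected graph."""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, best = set(), 0
    for n in nodes:
        if n in seen:
            continue
        stack, size = [n], 0
        seen.add(n)
        while stack:
            cur = stack.pop()
            size += 1
            for nxt in adj[cur]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        best = max(best, size)
    return best

def objectives(all_edges, removed, nodes):
    """Score a candidate partition: smaller components, fewer restrictions."""
    kept = [e for e in all_edges if e not in removed]
    return largest_component(kept, nodes), len(removed)
```

A multi-objective optimizer would search over `removed` sets, keeping the Pareto front of these two scores rather than a single edge-cut solution.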
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peloquin, R.A.; McKenzie, D.H.
1994-10-01
A compartmental model has been implemented on a microcomputer as an aid in the analysis of alternative solutions to a problem. The model, entitled Smolt Survival Simulator, simulates the survival of juvenile salmon during their downstream migration and passage of hydroelectric dams in the Columbia River. The model is designed to function in a workshop environment where resource managers and fisheries biologists can study alternative measures that may potentially increase juvenile anadromous fish survival during downriver migration. The potential application of the model has placed several requirements on the implementing software. It must be available for use in workshop settings. The software must be easy to use with minimal computer knowledge. Scenarios must be created and executed quickly and efficiently. Results must be immediately available. Software design emphasis was placed on the user interface because of these requirements. The discussion focuses on methods used in the development of the SSS software user interface. These methods should reduce user stress and allow thorough and easy parameter modification.
BioPCD - A Language for GUI Development Requiring a Minimal Skill Set
Alvare, Graham GM; Roche-Lima, Abiel; Fristensky, Brian
2016-01-01
BioPCD is a new language whose purpose is to simplify the creation of Graphical User Interfaces (GUIs) by biologists with minimal programming skills. The first step in developing BioPCD was to create a minimal superset of the language referred to as PCD (Pythonesque Command Description). PCD defines the core of terminals and high-level nonterminals required to describe data of almost any type. BioPCD adds to PCD the constructs necessary to describe GUI components and the syntax for executing system commands. BioPCD is implemented using JavaCC to convert the grammar into code. BioPCD is designed to be terse and readable and simple enough to be learned by copying and modifying existing BioPCD files. We demonstrate that BioPCD can easily be used to generate GUIs for existing command line programs. Although BioPCD was designed to make it easier to run bioinformatics programs, it could be used in any domain in which many useful command line programs exist that do not have GUI interfaces. PMID:27818582
I-deas TMG to NX Space Systems Thermal Model Conversion and Computational Performance Comparison
NASA Technical Reports Server (NTRS)
Somawardhana, Ruwan
2011-01-01
CAD/CAE packages change on a continuous basis as the power of the tools increases to meet demands. End-users must adapt to new products as they come to market and replace legacy packages. CAE modeling has continued to evolve and is constantly becoming more detailed and complex, though this comes at the cost of increased computing requirements. Parallel processing coupled with appropriate hardware can minimize computation time. Users of Maya Thermal Model Generator (TMG) are faced with transitioning from NX I-deas to NX Space Systems Thermal (SST). It is important to understand what differences there are when changing software packages; we are looking for consistency in results.
Trends in communicative access solutions for children with cerebral palsy.
Myrden, Andrew; Schudlo, Larissa; Weyand, Sabine; Zeyl, Timothy; Chau, Tom
2014-08-01
Access solutions may facilitate communication in children with limited functional speech and motor control. This study reviews current trends in access solution development for children with cerebral palsy, with particular emphasis on the access technology that harnesses a control signal from the user (eg, movement or physiological change) and the output device (eg, augmentative and alternative communication system) whose behavior is modulated by the user's control signal. Access technologies have advanced from simple mechanical switches to machine vision (eg, eye-gaze trackers), inertial sensing, and emerging physiological interfaces that require minimal physical effort. Similarly, output devices have evolved from bulky, dedicated hardware with limited configurability, to platform-agnostic, highly personalized mobile applications. Emerging case studies encourage the consideration of access technology for all nonverbal children with cerebral palsy with at least nascent contingency awareness. However, establishing robust evidence of the effectiveness of the aforementioned advances will require more expansive studies. © The Author(s) 2014.
On the design of script languages for neural simulation.
Brette, Romain
2012-01-01
In neural network simulators, models are specified according to a language, either specific or based on a general programming language (e.g. Python). There are also ongoing efforts to develop standardized languages, for example NeuroML. When designing these languages, efforts are often focused on expressivity, that is, on maximizing the number of model types that can be described and simulated. I argue that a complementary goal should be to minimize the cognitive effort required on the part of the user to use the language. I try to formalize this notion with the concept of "language entropy", and I propose a few practical guidelines to minimize the entropy of languages for neural simulation.
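One loose way to make "language entropy" concrete: if users must choose among many, roughly equally likely ways to express the same model, the information content of those choices grows. The toy measure below is an illustrative assumption, not Brette's formal definition.

```python
from math import log2
from collections import Counter

# Shannon entropy (in bits) of the distribution of construct choices users
# make when writing models: a language where one idiom dominates scores
# near 0; one where every model is written differently scores high.
def choice_entropy(observed_choices):
    """Entropy of observed language-construct choices, in bits."""
    counts = Counter(observed_choices)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())
```

Under this reading, a low-entropy language is one where the "obvious" way to write a model is also the only common way, reducing the reader's and writer's cognitive load.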
Optimization of a Monte Carlo Model of the Transient Reactor Test Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kristin; DeHart, Mark; Goluoglu, Sedat
2017-03-01
The ultimate goal of modeling and simulation is to obtain reasonable answers to problems that don't have representations which can be easily evaluated, while minimizing the amount of computational resources. With the advances in large-scale computing centers during the last twenty years, researchers have had the ability to create a multitude of tools to minimize the number of approximations necessary when modeling a system. The tremendous power of these centers requires the user to possess an immense amount of knowledge to optimize the models for accuracy and efficiency. This paper seeks to evaluate the KENO model of TREAT to optimize calculational efforts.
Applying Minimal Manual Principles for Documentation of Graphical User Interfaces.
ERIC Educational Resources Information Center
Nowaczyk, Ronald H.; James, E. Christopher
1993-01-01
Investigates the need to include computer screens in documentation for software using a graphical user interface. Describes the uses and purposes of "minimal manuals" and their principles. Studies student reaction to their use of one of three on-screen manuals: screens, icon, and button. Finds some benefit for including icon and button…
Small, Lightweight, Collapsible Glove Box
NASA Technical Reports Server (NTRS)
James, Jerry
2009-01-01
A small, lightweight, collapsible glove box enables its user to perform small experiments and other tasks. Originally intended for use aboard a space shuttle or the International Space Station (ISS), this glove box could also be attractive for use on Earth in settings in which work space or storage space is severely limited and, possibly, in which it is desirable to minimize weight. The development of this glove box was prompted by the findings that in the original space-shuttle or ISS setting, (1) it was necessary to perform small experiments in a large general-purpose work station, so that, in effect, they occupied excessive space; and (2) it took excessive amounts of time to set up small experiments. The design of the glove box reflects the need to minimize the space occupied by experiments and the time needed to set up experiments, plus the requirement to limit the launch weight of the box and the space needed to store the box during transport into orbit. To prepare the glove box for use, the astronaut or other user has merely to insert hands through the two fabric glove ports in the side walls of the box and move two hinges to a locking vertical position (see figure). The user could do this while seated with the glove box on the user's lap. When stowed, the glove box is flat and has approximately the thickness of two pieces of 8-in. (20 cm) polycarbonate.
Accessibility of insulin pumps for blind and visually impaired people.
Uslan, Mark M; Burton, Darren M; Chertow, Bruce S; Collins, Ronda
2004-10-01
Continuous subcutaneous insulin infusion using an insulin pump (IP) more closely mimics the normal pancreas than multiple insulin injections. It is an effective, and often a preferred, means of maintaining normal blood glucose levels, but IPs were not designed to be fully accessible to blind or visually impaired people. This study will identify accessibility issues related to the design of IPs and focus on the key improvements required in the user interface to provide access for people who are blind or visually impaired. IPs that are commercially available were evaluated, and features and functions such as operating procedures, user interface design, and user manuals were tabulated and analyzed. Potential failures and design priorities were identified through a Failure Modes and Effects Analysis (FMEA). Although the IPs do provide some limited audio output, in general, it was found to be of minimal use to people who are blind or visually impaired. None of the IPs uses high-contrast displays with consistently large fonts preferred by people who are visually impaired. User manuals were also found to be of minimal use. Results of the FMEA emphasize the need to focus design improvements on communicating and verifying information so that errors and failures can be detected and corrected. The most important recommendation for future IP development is speech output capability, which, more than any other improvement, would break down accessibility barriers and allow blind and visually impaired people to take advantage of the benefits of IP technology.
Obayashi, Chihiro; Tamei, Tomoya; Shibata, Tomohiro
2014-05-01
This paper proposes a novel robotic trainer for motor skill learning. It is user-adaptive, inspired by the assist-as-needed principle well known in the field of physical therapy. Most previous studies in the field of robotic assistance of motor skill learning have used predetermined desired trajectories, and it has not been examined intensively whether these trajectories were optimal for each user. Furthermore, the guidance hypothesis states that humans tend to rely too much on external assistive feedback, resulting in interference with the internal feedback necessary for motor skill learning. A few studies have proposed a system that adjusts its assistive strength according to the user's performance in order to prevent the user from relying too much on the robotic assistance. There are, however, problems in these studies, in that a physical model of the user's motor system is required, which is inherently difficult to construct. In this paper, we propose a framework for a robotic trainer that is user-adaptive and that requires neither a specific desired trajectory nor a physical model of the user's motor system, and we achieve this using model-free reinforcement learning. We chose dart-throwing as an example motor-learning task as it is one of the simplest throwing tasks, and its performance can be easily and quantitatively measured. Training experiments with novices, aiming at maximizing the score with the darts and minimizing the physical robotic assistance, demonstrate the feasibility and plausibility of the proposed framework. Copyright © 2014 Elsevier Ltd. All rights reserved.
OxMaR: open source free software for online minimization and randomization for clinical trials.
O'Callaghan, Christopher A
2014-01-01
Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.
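A minimal sketch of minimization-style allocation in the spirit of OxMaR (the actual OxMaR algorithm, factor weighting, and options may differ): score each arm by how many existing participants share the new participant's factor levels, and assign to the less-matched arm, breaking ties at random.

```python
import random

# Illustrative minimization allocator. For each new participant, the arm
# whose existing members least resemble the newcomer (over the chosen
# balancing factors) receives them, keeping the arms' distributions close.
def minimize_allocate(new_participant, allocations, factors, rng=None):
    """allocations: list of (arm, participant) pairs already made.
    new_participant: dict mapping factor name -> level, e.g. {"sex": "F"}.
    Returns "control" or "experimental"."""
    rng = rng or random.Random()
    scores = {}
    for arm in ("control", "experimental"):
        # Imbalance score: matching factor levels already present in this arm.
        scores[arm] = sum(
            1
            for a, p in allocations
            if a == arm
            for f in factors
            if p[f] == new_participant[f]
        )
    if scores["control"] == scores["experimental"]:
        return rng.choice(["control", "experimental"])  # tie: randomize
    return min(scores, key=scores.get)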
NASA Astrophysics Data System (ADS)
Suchacka, Grazyna
2005-02-01
The paper concerns a new research area, Quality of Web Service (QoWS). The need for QoWS is motivated by a still-growing number of Internet users, by the steady development and diversification of Web services, and especially by the popularization of e-commerce applications. The goal of the paper is a critical analysis of the literature on scheduling algorithms for e-commerce Web servers. The paper characterizes factors affecting the load of Web servers and discusses ways of improving their efficiency. Crucial QoWS requirements of a business Web server are identified: serving requests before their individual deadlines, supporting user session integrity, supporting different classes of users, and minimizing the number of rejected requests. It is argued that meeting these requirements, and implementing them in an admission control (AC) and scheduling algorithm for the business Web server, is crucial to the functioning of e-commerce Web sites and the revenue generated by them. The paper presents results of the literature analysis and discusses algorithms that implement these important QoWS requirements. The analysis showed that very few algorithms take the above-mentioned requirements into consideration and that there is a need to design an algorithm implementing them.
Embedded CLIPS for SDI BM/C3 simulation and analysis
NASA Technical Reports Server (NTRS)
Gossage, Brett; Nanney, Van
1990-01-01
Nichols Research Corporation is developing the BM/C3 Requirements Analysis Tool (BRAT) for the U.S. Army Strategic Defense Command. BRAT uses embedded CLIPS/Ada to model the decision-making processes used by the human commander of a defense system. Embedding CLIPS/Ada in BRAT allows the user to explore the role of the human in Command and Control (C2) and the use of expert systems for automated C2. BRAT models assert facts about the current state of the system, the simulated scenario, and threat information into CLIPS/Ada. A user-defined rule set describes the decision criteria for the commander. We have extended CLIPS/Ada with user-defined functions that allow the firing of a rule to invoke a system action such as weapons release or a change in strategy. The use of embedded CLIPS/Ada will provide a powerful modeling tool for our customer at minimal cost.
NASA Astrophysics Data System (ADS)
Yoo, Jongsoo; Jara-Almonte, J.; Majeski, S.; Frank, S.; Ji, H.; Yamada, M.
2016-10-01
FLARE (Facility for Laboratory Reconnection Experiments) will be operated as a flexible user facility, and so a complete set of research diagnostics is under development, including magnetic probe arrays, Langmuir probes, Mach probes, spectroscopic probes, and a laser interferometer. In order to accommodate the various requirements of users, large-scale (1 m), variable resolution (0.5-4 cm) magnetic probes have been designed, and are currently being prototyped. Moreover, a fully fiber-coupled laser interferometer has been designed to measure the line-integrated electron density. This fiber-coupled interferometer system will reduce the complexity of alignment processes and minimize maintenance of the system. Finally, improvements to the electrostatic probes and spectroscopic probes currently used in the Magnetic Reconnection Experiment (MRX) are discussed. The specifications of other subsystems, such as integrators and digitizers, are also presented. This work is supported by DoE Contract No. DE-AC0209CH11466.
Evaluation of the Next-Gen Exercise Software Interface in the NEEMO Analog
NASA Technical Reports Server (NTRS)
Hanson, Andrea; Kalogera, Kent; Sandor, Aniko; Hardy, Marc; Frank, Andrew; English, Kirk; Williams, Thomas; Perera, Jeevan; Amonette, William
2017-01-01
NSBRI (National Space Biomedical Research Institute) funded research grant to develop the 'NextGen' exercise software for the NEEMO (NASA Extreme Environment Mission Operations) analog. Develop a software architecture to integrate instructional, motivational and socialization techniques into a common portal to enhance exercise countermeasures in remote environments. Increase user efficiency and satisfaction, and institute commonality across multiple exercise systems. Utilized GUI (Graphical User Interface) design principles focused on intuitive ease of use to minimize training time and realize early user efficiency. Project requirement to test the software in an analog environment. Top Level Project Aims: 1) Improve the usability of crew interface software to exercise CMS (Crew Management System) through common app-like interfaces. 2) Introduce virtual instructional motion training. 3) Use virtual environment to provide remote socialization with family and friends, improve exercise technique, adherence, motivation and ultimately performance outcomes.
NASA Technical Reports Server (NTRS)
Vanderplaats, G. N.; Chen, Xiang; Zhang, Ning-Tian
1988-01-01
The use of formal numerical optimization methods for the design of gears is investigated. To achieve this, computer codes were developed for the analysis of spur gears and spiral bevel gears. These codes calculate the life, dynamic load, bending strength, surface durability, gear weight and size, and various geometric parameters. It is necessary to calculate all such important responses because they all represent competing requirements in the design process. The codes developed here were written in subroutine form and coupled to the COPES/ADS general-purpose optimization program. This code allows the user to define the optimization problem at the time of program execution. Typical design variables include face width, number of teeth, and diametral pitch. The user is free to choose any calculated response as the design objective to minimize or maximize and may impose lower and upper bounds on any calculated responses. Typical examples include life maximization with limits on dynamic load, stress, weight, etc., or minimization of weight subject to limits on life, dynamic load, etc. The research codes were written in modular form for easy expansion and so that they could be combined to create a multiple-reduction optimization capability in the future.
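The formulation described above (any calculated response as the objective, bounds on the rest) can be illustrated with a toy stand-in for the COPES/ADS coupling: minimize a weight function over candidate face widths subject to a lower bound on life. The `weight` and `life` formulas below are invented placeholders, not the report's gear models; only the objective-plus-constraints structure mirrors it.

```python
# Toy "minimize weight subject to a life limit" search over one design
# variable (face width). Real gear analysis codes would supply weight()
# and life(); these linear placeholders are assumptions for illustration.
def optimize_face_width(widths, min_life):
    """Pick the face width minimizing weight while meeting a life bound."""
    weight = lambda w: 2.0 * w    # placeholder: weight grows with width
    life = lambda w: 100.0 * w    # placeholder: life grows with width
    feasible = [w for w in widths if life(w) >= min_life]
    if not feasible:
        return None  # no design satisfies the constraint
    return min(feasible, key=weight)
```

Swapping the roles of the two responses (maximize life with a weight limit) is the same structure with objective and constraint exchanged, which is exactly the flexibility the COPES/ADS coupling offered at run time.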
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ripeanu, Matei; Al-Kiswany, Samer; Iamnitchi, Adriana
2009-03-01
The avalanche of data from scientific instruments and the ensuing interest from geographically distributed users to analyze and interpret it accentuates the need for efficient data dissemination. A suitable data distribution scheme will find the delicate balance between conflicting requirements of minimizing transfer times, minimizing the impact on the network, and uniformly distributing load among participants. We identify several data distribution techniques, some successfully employed by today's peer-to-peer networks: staging, data partitioning, orthogonal bandwidth exploitation, and combinations of the above. We use simulations to explore the performance of these techniques in contexts similar to those used by today's data-centric scientific collaborations and derive several recommendations for efficient data dissemination. Our experimental results show that the peer-to-peer solutions that offer load balancing and good fault tolerance properties and have embedded participation incentives lead to unjustified costs in today's scientific data collaborations deployed on over-provisioned network cores. However, as user communities grow and these deployments scale, peer-to-peer data delivery mechanisms will likely outperform other techniques.
Performance Test Data Analysis of Scintillation Cameras
NASA Astrophysics Data System (ADS)
Demirkaya, Omer; Mazrou, Refaat Al
2007-10-01
In this paper, we present a set of image analysis tools to calculate the performance parameters of gamma camera systems from test data acquired according to the National Electrical Manufacturers Association (NEMA) NU 1-2001 guidelines. The calculation methods are either completely automated or require minimal user interaction, minimizing potential human errors. The developed methods are robust with respect to the varying conditions under which these tests may be performed. The core algorithms have been validated for accuracy. They have been extensively tested on images acquired by gamma cameras from different vendors. All the algorithms are incorporated into a graphical user interface that provides a convenient way to process the data and report the results. The entire application has been developed in the MATLAB programming environment and is compiled to run as a stand-alone program. The developed image analysis tools provide an automated, convenient, and accurate means to calculate the performance parameters of gamma cameras and SPECT systems. The developed application is available upon request for personal or non-commercial use. The results of this study have been partially presented at the Society of Nuclear Medicine Annual Meeting as an InfoSNM presentation.
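As one example of the kind of automated computation such tools perform, the full width at half maximum (FWHM) of a peaked profile (e.g. a line-spread function) can be estimated by linear interpolation at the half-maximum crossings. This is an illustrative sketch; the NU 1-2001 processing details (fitting, pixel-size calibration) are omitted.

```python
# Estimate FWHM of a 1-D peaked profile (in sample units) by walking out
# from the peak and linearly interpolating the two half-maximum crossings.
def fwhm(profile):
    """Full width at half maximum of a peaked profile, in samples."""
    peak = max(profile)
    half = peak / 2.0
    i = profile.index(peak)
    # Left half-maximum crossing.
    l = i
    while profile[l] > half:
        l -= 1
    xl = l + (half - profile[l]) / (profile[l + 1] - profile[l])
    # Right half-maximum crossing.
    r = i
    while profile[r] > half:
        r += 1
    xr = r - (half - profile[r]) / (profile[r - 1] - profile[r])
    return xr - xl
```

Multiplying the result by the detector pixel size would convert it to millimetres, as a resolution test requires.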
Meader, Nicholas; Li, Ryan; Des Jarlais, Don C; Pilling, Stephen
2010-01-20
Drug users (including both injection drug users and crack cocaine users) are at high levels of risk for contracting HIV. Therefore it is important to reduce the injection and/or sexual risk behaviours of these groups, both for the benefit of themselves and for society as a whole. To assess the efficacy of multi-session psychosocial interventions in comparison with standard education and minimal intervention controls for the reduction of injection and sexual risk behaviour. Electronic searches were conducted of a number of bibliographic databases (including the Cochrane Library, CINAHL, MEDLINE, PsycINFO). In addition, other methods of locating papers were employed, including contacting various authors working in the field of HIV risk reduction and examining reference lists of applicable papers identified in the electronic search. The inclusion criteria consisted of randomised and quasi-randomised trials assessing the efficacy of psychosocial interventions in the reduction of injection and sexual risk behaviour for people who misused opiates, cocaine, or a combination of these drugs. Two authors independently assessed the eligibility of studies identified by the search strategy, quality assessed these studies and extracted the data. A total of 35 trials met the eligibility criteria of the review, providing data on 11,867 participants. There were minimal differences identified between multi-session psychosocial interventions and standard educational interventions for both injection and sexual risk behaviour, although it should be noted there were large pre-post changes for both groups, suggesting both were effective in reducing risk behaviours. In addition, there was some evidence of benefit for multi-session psychosocial interventions when compared with minimal controls. Subgroup analyses suggest that people in formal treatment are likely to respond to multi-session psychosocial interventions. It also appears single-gender groups may be associated with greater benefit.
There is limited support for the widespread use of formal multi-session psychosocial interventions for reducing injection and sexual risk behaviour. Brief standard education interventions appear to be a more cost-effective option. Further research is required to assess if there are particular groups of drug users more likely to respond to such interventions.
Wu, Dongrui; Lance, Brent J; Parsons, Thomas D
2013-01-01
Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.
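The core transfer-learning idea, sketched: augment a few user-specific labelled samples with auxiliary samples from similar users, then classify with a simple nearest-neighbour rule. The feature vectors, labels, and 1-NN choice are illustrative assumptions; the paper's classifiers (k-NN and SVM) and its similarity-based weighting are more elaborate.

```python
# 1-NN over the union of user-specific and auxiliary labelled samples,
# illustrating how borrowed data reduces the per-user training burden.
def nn_classify(x, user_samples, auxiliary_samples):
    """Classify feature vector x; each sample is (feature_vector, label)."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    combined = list(user_samples) + list(auxiliary_samples)
    return min(combined, key=lambda s: dist(x, s[0]))[1]
```

With only one user-specific sample per class, the auxiliary pool supplies the coverage that would otherwise require a long individual calibration session.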
Karpievitch, Yuliya V; Almeida, Jonas S
2006-01-01
Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple.
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707
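The core idea of user-code distribution, independent of mGrid's actual Matlab/PHP implementation, can be sketched in Python: pack the user-defined code together with its run-time variables, ship the bytes to a worker, and execute there. All names here are illustrative:

```python
import pickle

# Conceptual sketch of user-code distribution (mGrid itself is Matlab
# plus PHP/Apache; this Python version only illustrates the idea).
# The user-defined function and its run-time variables are packed
# together, shipped to a worker, and executed remotely.
def pack_task(func_source, args):
    return pickle.dumps({"source": func_source, "args": args})

def run_task(blob):
    # On the worker: unpack, define the user function, run it.
    task = pickle.loads(blob)
    namespace = {}
    exec(task["source"], namespace)
    return namespace["task"](*task["args"])

blob = pack_task("def task(x, y):\n    return x * y\n", (6, 7))
result = run_task(blob)
print(result)  # → 42
```

Shipping the code alongside its arguments is what removes the need for manual library installation on each participating machine.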
Pilot Study of iPad Incorporation Into Graduate Medical Education.
Lobo, Mark J; Crandley, Edwin F; Rumph, Jake S; Kirk, Susan E; Dunlap, Neal E; Rahimi, Asal S; Turner, A Benton; Larner, James M; Read, Paul W
2013-03-01
Increased documentation and charting requirements are challenging for residents, given duty hour limits. Use of mobile electronic devices may help residents complete these tasks efficiently. To collect initial data on usage rates, information technology (IT) support requirements, and resident use of iPads during training. In this pilot study, we provided 12 residents/fellows from various specialties at the University of Virginia with an iPad with IT support. The system used a virtual private network with access to the institution's electronic health record. Participants were allowed to develop their own methods and systems for personalized iPad use, and after 9 months they provided data on the utility of the iPad. Feedback from the IT team also was obtained. Average iPad use was 2.1 h/d (range, 0.5-6 h/d). The average self-reported reduction in administrative work due to the iPad was 2.7 h/wk (range, 0-9 h/wk). A total of 75% (9 of 12) of the users would recommend universal adoption among residents and fellows. More than 90% (11 of 12) of users reported the iPad would improve communication for coordination of care. A total of 67% (8 of 12) of users reported that an iPad facilitated their activities as educators of medical students and junior residents. Residents cited slow data entry into the electronic health record and hospital areas lacking Wi-Fi connectivity as potential drawbacks to iPad use. The IT team reported minimal support time for device setup, maintenance, and upgrades, and limited security risks. The iPad may contribute to increased clinical efficiency, reduced hours spent on administrative tasks, and enhanced educational opportunities for residents, with minimal IT support.
The Role of Direct and Visual Force Feedback in Suturing Using a 7-DOF Dual-Arm Teleoperated System.
Talasaz, Ali; Trejos, Ana Luisa; Patel, Rajni V
2017-01-01
The lack of haptic feedback in robotics-assisted surgery can result in tissue damage or accidental tool-tissue hits. This paper focuses on exploring the effect of haptic feedback via direct force reflection and visual presentation of force magnitudes on performance during suturing in robotics-assisted minimally invasive surgery (RAMIS). For this purpose, a haptics-enabled dual-arm master-slave teleoperation system capable of measuring tool-tissue interaction forces in all seven Degrees-of-Freedom (DOFs) was used. Two suturing tasks, tissue puncturing and knot-tightening, were chosen to assess user skills when suturing on phantom tissue. Sixteen subjects participated in the trials and their performance was evaluated from various points of view: force consistency, number of accidental hits with tissue, amount of tissue damage, quality of the suture knot, and the time required to accomplish the task. According to the results, visual force feedback was not very useful during the tissue puncturing task as different users needed different amounts of force depending on the penetration of the needle into the tissue. Direct force feedback, however, was more useful for this task to apply less force and to minimize the amount of damage to the tissue. Statistical results also reveal that both visual and direct force feedback were required for effective knot tightening: direct force feedback could reduce the number of accidental hits with the tissue and also the amount of tissue damage, while visual force feedback could help to securely tighten the suture knots and maintain force consistency among different trials/users. These results provide evidence of the importance of 7-DOF force reflection when performing complex tasks in a RAMIS setting.
Pneumatic Muscle Actuated Equipment for Continuous Passive Motion
NASA Astrophysics Data System (ADS)
Deaconescu, Tudor T.; Deaconescu, Andrea I.
2009-10-01
Applying continuous passive rehabilitation movements as part of the recovery programme of patients with post-traumatic disabilities of the weight-bearing joints of the lower limbs requires the development of new high performance equipment. This chapter discusses a study of the kinematics and performance of such a new, continuous passive motion based rehabilitation system actuated by pneumatic muscles. The utilized energy source is compressed air ensuring complete absorption of the end of stroke shocks, thus minimizing user discomfort.
Culvert analysis program for indirect measurement of discharge
Fulford, Janice M.; ,
1993-01-01
A program based on the U.S. Geological Survey (USGS) methods for indirectly computing peak discharges through culverts allows users to employ input data formats used by the water surface profile program (WSPRO). The program can be used to compute discharge rating surfaces or curves that describe the behavior of flow through a particular culvert, or to compute discharges from measurements of upstream water-surface elevation. The program is based on the gradually varied flow equations and has been adapted slightly to provide solutions that minimize the need for the user to distinguish between different flow regimes. The program source is written in Fortran 77 and has been run on mini-computers and personal computers. The program does not use or require graphics capability, a color monitor, or a mouse.
Efficient monitoring of CRAB jobs at CMS
NASA Astrophysics Data System (ADS)
Silva, J. M. D.; Balcas, J.; Belforte, S.; Ciangottini, D.; Mascheroni, M.; Rupeika, E. A.; Ivanov, T. T.; Hernandez, J. M.; Vaandering, E.
2017-10-01
CRAB is a tool used for distributed analysis of CMS data. Users can submit sets of jobs with similar requirements (tasks) with a single request. CRAB uses a client-server architecture, where a lightweight client, a server, and ancillary services work together and are maintained by CMS operators at CERN. As with most complex software, good monitoring tools are crucial for efficient use and long-term maintainability. This work gives an overview of the monitoring tools developed to ensure the CRAB server and infrastructure are functional, help operators debug user problems, and minimize overhead and operating cost. This work also illustrates the design choices and gives a report on our experience with the tools we developed and the external ones we used.
Automated imaging system for single molecules
Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel
2012-09-18
There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.
CIRCAL-2 - General-purpose on-line circuit design.
NASA Technical Reports Server (NTRS)
Dertouzos, M. L.; Jessel, G. P.; Stinger, J. R.
1972-01-01
CIRCAL-2 is a second-generation general-purpose on-line circuit-design program with the following main features: (1) multiple-analysis capability; (2) uniform and general data structures for handling text editing, network representations, and output results, regardless of analysis; (3) special techniques and structures for minimizing and controlling user-program interaction; (4) use of functionals for the description of hysteresis and heat effects; and (5) ability to define optimization procedures that 'replace' the user. The paper discusses the organization of CIRCAL-2, the aforementioned main features, and their consequences, such as a set of network elements and models general enough for most analyses and a set of functions tailored to circuit-design requirements. The presentation is descriptive, concentrating on conceptual rather than on program implementation details.
NLINEAR - NONLINEAR CURVE FITTING PROGRAM
NASA Technical Reports Server (NTRS)
Everhart, J. L.
1994-01-01
A common method for fitting data is a least-squares fit. In the least-squares method, a user-specified fitting function is utilized in such a way as to minimize the sum of the squares of distances between the data points and the fitting curve. The Nonlinear Curve Fitting Program, NLINEAR, is an interactive curve fitting routine based on a description of the quadratic expansion of the chi-squared statistic. NLINEAR utilizes a nonlinear optimization algorithm that calculates the best statistically weighted values of the parameters of the fitting function and the chi-square that is to be minimized. The inputs to the program are the mathematical form of the fitting function and the initial values of the parameters to be estimated. This approach provides the user with statistical information such as goodness of fit and estimated values of parameters that produce the highest degree of correlation between the experimental data and the mathematical model. In the mathematical formulation of the algorithm, the Taylor expansion of chi-square is first introduced, and justification for retaining only the first term is presented. From the expansion, a set of n simultaneous linear equations is derived and solved by matrix algebra. To achieve convergence, the algorithm requires meaningful initial estimates for the parameters of the fitting function. NLINEAR is written in Fortran 77 for execution on a CDC Cyber 750 under NOS 2.3. It has a central memory requirement of 5K 60 bit words. Optionally, graphical output of the fitting function can be plotted. Tektronix PLOT-10 routines are required for graphics. NLINEAR was developed in 1987.
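The iteration described above (linearize, solve the resulting simultaneous linear equations by matrix algebra, repeat from meaningful initial estimates) can be sketched for a two-parameter model. This is a generic Gauss-Newton illustration of the technique, not NLINEAR's Fortran source:

```python
import math

# Generic sketch for the two-parameter model y = a*exp(b*x): linearize
# about the current parameters, solve the 2x2 normal equations, repeat.
def fit(xs, ys, sigmas, a, b, iters=50):
    for _ in range(iters):
        m11 = m12 = m22 = v1 = v2 = 0.0
        for x, y, s in zip(xs, ys, sigmas):
            f = a * math.exp(b * x)
            da, db = math.exp(b * x), a * x * math.exp(b * x)  # partial derivatives
            w = 1.0 / (s * s)                                  # statistical weight
            r = y - f                                          # residual
            m11 += w * da * da; m12 += w * da * db; m22 += w * db * db
            v1 += w * da * r; v2 += w * db * r
        det = m11 * m22 - m12 * m12
        if abs(det) < 1e-30:
            break
        step_a = (m22 * v1 - m12 * v2) / det   # solve the 2x2 linear system
        step_b = (m11 * v2 - m12 * v1) / det
        a, b = a + step_a, b + step_b
        if abs(step_a) + abs(step_b) < 1e-12:  # converged
            break
    chi2 = sum(((y - a * math.exp(b * x)) / s) ** 2
               for x, y, s in zip(xs, ys, sigmas))
    return a, b, chi2

# Noise-free data generated from a = 2.0, b = 0.5; a nearby initial
# estimate converges in a handful of iterations.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 * math.exp(0.5 * x) for x in xs]
a, b, chi2 = fit(xs, ys, [1.0] * len(xs), 1.8, 0.45)
print(round(a, 6), round(b, 6))  # → 2.0 0.5
```

As the abstract notes, poor initial estimates can prevent convergence; production codes add damping or line searches to guard against this.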
[Provision of building maintenance services in healthcare facilities].
Amorim, Gláucia Maria; Quintão, Eliana Cardoso Vieira; Martelli Júnior, Hercílio; Bonan, Paulo Rogério Ferreti
2013-01-01
The scope of this paper was to evaluate the provision of building maintenance services in health units, by means of a descriptive, quantitative and cross-sectional study, considering the five types of facilities (Primary Health, Emergency, Specialty, Hospital and Mental Health Units). The research was approved by the Research Ethics Committee of FHEMIG with the Terms of Agreement signed with the Unified Health System of Betim. Comparative analysis was conducted by checking the requirements of "Physical-Functional Structure Management" of the "Brazilian Hospital Accreditation Manual" of the National Accreditation Organization. Nonconformities were noted in the physical-functional management of the health centers, especially the primary health units. The assessment was important, considering that compliance with formal, technical and structural requirements in care activities, organized appropriately to the service profile and complexity, can help to minimize risks to users. To improve the quality of health care establishments, it is essential that managers, backed by "top management," prioritize financial, human and material resources in planning to ensure compliance with security requirements of users in buildings.
Clarke, Kris; Harris, Debra; Zweifler, John A; Lasher, Marc; Mortimer, Roger B; Hughes, Susan
2016-01-01
Infectious disease remains a significant social and health concern in the United States. Preventing more people from contracting HIV/AIDS or Hepatitis C (HCV), requires a complex understanding of the interconnection between the biomedical and social dimensions of infectious disease. Opiate addiction in the US has skyrocketed in recent years. Preventing more cases of HIV/AIDS and HCV will require dealing with the social determinants of health. Needle exchange programs (NEPs) are based on a harm reduction approach that seeks to minimize the risk of infection and damage to the user and community. This article presents an exploratory small-scale quantitative study of the injection drug using habits of a group of injection drug users (IDUs) at a needle exchange program in Fresno, California. Respondents reported significant decreases in high risk IDU behaviors, including sharing of needles and to a lesser extent re-using of needles. They also reported frequent use of clean paraphernalia. Greater collaboration between social and health outreach professionals at NEPs could provide important frontline assistance to people excluded from mainstream office-based services and enhance efforts to reduce HIV/AIDS or HCV infection.
Computer programming for geosciences: Teach your students how to make tools
NASA Astrophysics Data System (ADS)
Grapenthin, Ronni
2011-12-01
When I announced my intention to pursue a Ph.D. in geophysics, some people gave me confused looks, because I was working on a master's degree in computer science at the time. My friends, like many incoming geoscience graduate students, have trouble linking these two fields. From my perspective, it is pretty straightforward: Much of geoscience revolves around novel analyses of large data sets that require custom tools—computer programs—to minimize the drudgery of manual data handling; other disciplines share this characteristic. While most faculty adapted to the need for tool development quite naturally, as they grew up around computer terminal interfaces, incoming graduate students lack intuitive understanding of programming concepts such as generalization and automation. I believe the major cause is the intuitive graphical user interfaces of modern operating systems and applications, which isolate the user from all technical details. Generally, current curricula do not recognize this gap between user and machine. For students to operate effectively, they require specialized courses teaching them the skills they need to make tools that operate on particular data sets and solve their specific problems. Courses in computer science departments are aimed at a different audience and are of limited help.
Head-mounted display systems and the special operations soldier
NASA Astrophysics Data System (ADS)
Loyd, Rodney B.
1998-08-01
In 1997, the Boeing Company, working with DARPA under the Smart Modules program and the US Army Soldier Systems Command, embarked on an advanced research and development program to develop a wearable computer system tailored for use with soldiers of the US Special Operations Command. The 'special operations combat management system' is a rugged advanced wearable tactical computer, designed to provide the special operations soldier with enhanced situation awareness and battlefield information capabilities. Many issues must be considered during the design of wearable computers for a combat soldier, including the system weight, placement on the body with respect to other equipment, user interfaces and display system characteristics. During the initial feasibility study for the system, the operational environment was examined and potential users were interviewed to establish the proper display solution for the system. Many display system requirements resulted, such as head or helmet mounting, Night Vision Goggle compatibility, minimal visible light emissions, environmental performance and even the need for handheld or other 'off the head' type display systems. This paper will address these issues and other end user requirements for display systems for applications in the harsh and demanding environment of the Special Operations soldier.
McNeil, Ryan; Small, Will; Lampkin, Hugh; Shannon, Kate; Kerr, Thomas
2013-01-01
People who require help injecting are disproportionately vulnerable to drug-related harm, including HIV transmission. North America’s only sanctioned supervised injection facility (SIF) operates in Vancouver, Canada under an exemption to federal drug laws, which imposes operating regulations prohibiting assisted injections. In response, the Vancouver Area Network of Drug Users (VANDU) launched a peer-run unsanctioned SIF in which trained peer volunteers provide assisted injections to increase the coverage of supervised injection services and minimize drug-related harm. We undertook qualitative interviews (n=23) and ethnographic observation (50 hours) to explore how this facility shaped assisted injection practices. Findings indicated that VANDU reshaped the social, structural, and spatial contexts of assisted injection practices in a manner that minimized HIV and other health risks, while allowing people who require help injecting to escape drug scene violence. Findings underscore the need for changes to regulatory frameworks governing SIFs to ensure that they accommodate people who require help injecting. PMID:23797831
NASA Astrophysics Data System (ADS)
Messerotti, M.
2009-04-01
Earth and Space Science research, as well as many other disciplines, can nowadays benefit from advanced data handling techniques and tools capable to significantly relieve the scientist of the burden of data search, retrieval, visualization and manipulation, and to exploit the data information content. Some typical examples are Virtual Observatories (VO) specific to a variety of sub-disciplines but anyway interlinked, a feature intrinsic to the VO architecture, Virtual Globes as advanced 3D selection and visualization interfaces to distributed data repositories, and the Global Earth Observation System of Systems. These information systems are also proving effective in education and outreach activities as they are usable via web interfaces to give access to, to display and to download nonhomogeneous datasets in order to raise the awareness of the students and the public on the relevant disciplines. Despite this, these effective tools are still poorly used both by the scientific community and by the community active in education and outreach. All such infrastructures are designed and developed according to the state-of-the-art information and computer engineering techniques and are provided with top features such as ontology- and semantics-based data management, and advanced unified web-based interfaces. Anyway, a careful analysis of the issue mentioned above indicates a key aspect that plays a major role, i.e., the inadequate interaction with the users' communities during the design, the development, the deployment and the test phases. Even the best technical tool can appear inadequate to the final user when it does not meet the user's requirements in terms of achievable goals and user friendliness. 
In this work, we consider the user-side features to be taken into account for optimum exploitation of an information system, in the framework of the interaction between design engineers and target communities, toward establishing good practice for minimizing the developer-user divide.
Detailed requirements document for common software of shuttle program information management system
NASA Technical Reports Server (NTRS)
Everette, J. M.; Bradfield, L. D.; Horton, C. L.
1975-01-01
Common software was investigated as a method for minimizing development and maintenance cost of the shuttle program information management system (SPIMS) applications while reducing the time-frame of their development. Those requirements satisfying these criteria are presented along with the stand-alone modules which may be used directly by applications. The SPIMS applications operating on the CYBER 74 computer, are specialized information management systems which use System 2000 as a data base manager. Common software provides the features to support user interactions on a CRT terminal using form input and command response capabilities. These features are available as subroutines to the applications.
Ryu, Joonghyun; Lee, Mokwon; Cha, Jehyun; Laskowski, Roman A.; Ryu, Seong Eon; Kim, Deok-Soo
2016-01-01
Many applications, such as protein design, homology modeling, flexible docking, etc. require the prediction of a protein's optimal side-chain conformations from just its amino acid sequence and backbone structure. Side-chain prediction (SCP) is an NP-hard energy minimization problem. Here, we present BetaSCPWeb which efficiently computes a conformation close to optimal using a geometry-prioritization method based on the Voronoi diagram of spherical atoms. Its outputs are visual, textual and PDB file format. The web server is free and open to all users at http://voronoi.hanyang.ac.kr/betascpweb with no login requirement. PMID:27151195
Space Station Freedom power management and distribution design status
NASA Technical Reports Server (NTRS)
Javidi, S.; Gholdston, E.; Stroh, P.
1989-01-01
The design status of the power management and distribution electric power system for the Space Station Freedom is presented. The current design is a star architecture, which has been found to be the best approach for meeting the requirement to deliver 120 V dc to the user interface. The architecture minimizes mass and power losses while improving element-to-element isolation and system flexibility. The design is partitioned into three elements: energy collection, storage, and conversion; system protection and distribution; and management and control.
Subband Coding Methods for Seismic Data Compression
NASA Technical Reports Server (NTRS)
Kiely, A.; Pollara, F.
1995-01-01
This paper presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The compression technique described could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
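The progressive-transmission idea can be illustrated with a single-level Haar subband split (a sketch of the general technique, not the paper's filter bank): the half-rate coarse band alone gives a low-resolution preview, and the detail band, requested later, refines it to perfect reconstruction.

```python
# Single-level Haar subband analysis/synthesis as a sketch of
# progressive transmission: send `coarse` first, `detail` on request.
def analyze(signal):
    coarse = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return coarse, detail

def synthesize(coarse, detail):
    out = []
    for c, d in zip(coarse, detail):
        out.extend([c + d, c - d])   # invert the average/difference pair
    return out

waveform = [4.0, 6.0, 10.0, 2.0, 3.0, 3.0, 8.0, 0.0]
coarse, detail = analyze(waveform)
print(coarse)                                  # coarse preview at half the rate
print(synthesize(coarse, detail) == waveform)  # → True
```

A seismologist could inspect the coarse band with half the channel usage and request the detail band only where refinement is needed; applying the split recursively to the coarse band yields further levels.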
Rambrain - a library for virtually extending physical memory
NASA Astrophysics Data System (ADS)
Imgrund, Maximilian; Arth, Alexander
2017-08-01
We introduce Rambrain, a user space library that manages memory consumption of your code. Using Rambrain you can overcommit memory over the size of physical memory present in the system. Rambrain takes care of temporarily swapping out data to disk and can handle multiples of the physical memory size present. Rambrain is thread-safe, OpenMP- and MPI-compatible, and supports asynchronous I/O. The library was designed to require minimal changes to existing programs and to be easy to use.
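The swap-out/swap-in behavior described above can be illustrated conceptually. This sketch is not the Rambrain API; the class and method names are illustrative assumptions:

```python
import os, pickle, tempfile

# Conceptual sketch of managed swapping (NOT the Rambrain API): hold
# data in memory, move it to disk on demand, reload transparently.
class ManagedObject:
    def __init__(self, data):
        self._data, self._path = data, None

    def swap_out(self):
        # Temporarily move the data to a file on disk, freeing memory.
        if self._path is None:
            fd, self._path = tempfile.mkstemp()
            with os.fdopen(fd, "wb") as f:
                pickle.dump(self._data, f)
            self._data = None

    def get(self):
        # Transparently swap the data back in when it is needed.
        if self._data is None:
            with open(self._path, "rb") as f:
                self._data = pickle.load(f)
            os.remove(self._path)
            self._path = None
        return self._data

obj = ManagedObject(list(range(1000)))
obj.swap_out()              # memory released; data now lives on disk
value = obj.get()[999]      # first access triggers the reload
print(value)  # → 999
```

A real implementation swaps automatically under memory pressure rather than on explicit calls, which is what lets existing programs run with minimal changes.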
NASA Astrophysics Data System (ADS)
Kromp, Florian; Taschner-Mandl, Sabine; Schwarz, Magdalena; Blaha, Johanna; Weiss, Tamara; Ambros, Peter F.; Reiter, Michael
2015-02-01
We propose a user-driven method for the segmentation of neuroblastoma nuclei in microscopic fluorescence images involving the gradient energy tensor. Multispectral fluorescence images contain intensity and spatial information about antigene expression, fluorescence in situ hybridization (FISH) signals and nucleus morphology. The latter serves as basis for the detection of single cells and the calculation of shape features, which are used to validate the segmentation and to reject false detections. Accurate segmentation is difficult due to varying staining intensities and aggregated cells. It requires several (meta-) parameters, which have a strong influence on the segmentation results and have to be selected carefully for each sample (or group of similar samples) by user interactions. Because our method is designed for clinicians and biologists, who may have only limited image processing background, an interactive parameter selection step allows the implicit tuning of parameter values. With this simple but intuitive method, segmentation results with high precision for a large number of cells can be achieved by minimal user interaction. The strategy was validated on hand-segmented datasets of three neuroblastoma cell lines.
Hu, Yu-Chi J; Grossberg, Michael D; Mageras, Gikas S
2008-01-01
Planning radiotherapy and surgical procedures usually require onerous manual segmentation of anatomical structures from medical images. In this paper we present a semi-automatic and accurate segmentation method to dramatically reduce the time and effort required of expert users. This is accomplished by giving a user an intuitive graphical interface to indicate samples of target and non-target tissue by loosely drawing a few brush strokes on the image. We use these brush strokes to provide the statistical input for a Conditional Random Field (CRF) based segmentation. Since we extract purely statistical information from the user input, we eliminate the need for assumptions on boundary contrast previously used by many other methods. A new feature of our method is that the statistics on one image can be reused on related images without registration. To demonstrate this, we show that boundary statistics provided on a few 2D slices of volumetric medical data can be propagated through the entire 3D stack of images without using the geometric correspondence between images. In addition, the image segmentation from the CRF can be formulated as a minimum s-t graph cut problem which has a solution that is both globally optimal and fast. The combination of a fast segmentation and minimal user input that is reusable makes this a powerful technique for the segmentation of medical images.
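The minimum s-t cut reduction works as follows: each pixel gets terminal edges whose capacities encode the statistical (unary) costs of the two labels, neighboring pixels get edges encoding a smoothness penalty, and the minimum cut yields the globally optimal binary labeling. A toy sketch on a 1-D "image" (the capacities here are illustrative, not the paper's CRF energies):

```python
from collections import deque

# Binary segmentation as a minimum s-t cut, solved via Edmonds-Karp
# max-flow on a residual graph `cap` (dict of dicts, modified in place).
def min_cut_labels(n_pixels, cap, s, t):
    while True:
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:          # no augmenting path left: flow is maximum
            break
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[u][v] for u, v in path)
        for u, v in path:            # update residual capacities
            cap[u][v] -= push
            cap[v][u] = cap[v].get(u, 0) + push
    # Pixels still reachable from s lie on the source side of the cut.
    reachable, queue = {s}, deque([s])
    while queue:
        u = queue.popleft()
        for v, c in cap[u].items():
            if c > 0 and v not in reachable:
                reachable.add(v)
                queue.append(v)
    return [1 if p in reachable else 0 for p in range(n_pixels)]

# Toy 1-D "image": two dark background pixels, two bright target pixels.
# Unary capacities come from (assumed) brush-stroke mean intensities.
pixels, mu_bg, mu_fg, smooth = [10, 12, 90, 95], 10, 90, 5
s, t = "s", "t"
cap = {s: {}, t: {}}
for p, ip in enumerate(pixels):
    cap[p] = {}
    cap[s][p] = abs(ip - mu_bg)   # cost paid if p is labeled background
    cap[p][t] = abs(ip - mu_fg)   # cost paid if p is labeled target tissue
for p in range(len(pixels) - 1):  # smoothness penalty between neighbors
    cap[p][p + 1] = smooth
    cap[p + 1][p] = smooth
labels = min_cut_labels(len(pixels), cap, s, t)
print(labels)  # → [0, 0, 1, 1]
```

Production systems use specialized max-flow solvers tuned for grid graphs, but the construction is the same: the cut severs exactly one terminal edge per pixel plus the boundary edges, so its capacity equals the labeling's energy.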
Martin, G. T.; Yoon, S. S.; Mott, K. E.
1991-01-01
Schistosomiasis, a group of parasitic diseases caused by Schistosoma parasites, is associated with water resources development and affects more than 200 million people in 76 countries. Depending on the species of parasite involved, disease of the liver, spleen, gastrointestinal or urinary tract, or kidneys may result. A computer-assisted teaching package has been developed by WHO for use in the training of public health workers involved in schistosomiasis control. The package consists of the software, ZOOM, and a schistosomiasis information file, Dr Schisto, and uses hypermedia technology to link pictures and text. ZOOM runs on the IBM-PC and IBM-compatible computers, is user-friendly, requires a minimal hardware configuration, and can interact with the user in English, French, Spanish or Portuguese. The information files for ZOOM can be created or modified by the instructor using a word processor, and thus can be designed to suit the need of students. No programming knowledge is required to create the stacks. PMID:1786618
Web-Enabled Optoelectronic Particle-Fallout Monitor
NASA Technical Reports Server (NTRS)
Lineberger, Lewis P.
2008-01-01
A Web-enabled optoelectronic particle-fallout monitor has been developed as a prototype of future such instruments that (1) would be installed in multiple locations for which assurance of cleanliness is required and (2) could be interrogated and controlled in nearly real time by multiple remote users. Like prior particle-fallout monitors, this instrument provides a measure of particles that accumulate on a surface as an indication of the quantity of airborne particulate contaminants. The design of this instrument reflects requirements to: Reduce the cost and complexity of its optoelectronic sensory subsystem relative to those of prior optoelectronic particle fallout monitors while maintaining or improving capabilities; Use existing network and office computers for distributed display and control; Derive electric power for the instrument from a computer network, a wall outlet, or a battery; Provide for Web-based retrieval and analysis of measurement data and of a file containing such ancillary data as a log of command attempts at remote units; and Use the User Datagram Protocol (UDP) for maximum performance and minimal network overhead.
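UDP's appeal for this kind of monitor is that a query/response exchange needs no connection setup or teardown. A minimal loopback sketch (the message contents and query name are illustrative assumptions, not the instrument's protocol):

```python
import socket

# Minimal loopback sketch of a UDP query/response exchange of the kind
# a networked monitor can serve with no connection overhead.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))            # OS assigns a free port
port = server.getsockname()[1]

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"READ_FALLOUT", ("127.0.0.1", port))

query, addr = server.recvfrom(1024)      # connectionless: no handshake
if query == b"READ_FALLOUT":
    server.sendto(b"count=42", addr)     # reply straight to the sender

reply, _ = client.recvfrom(1024)
print(reply.decode())  # → count=42
client.close()
server.close()
```

The trade-off is that UDP gives no delivery guarantee, so a real monitor must tolerate lost or reordered datagrams, for example by having clients retry unanswered queries.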
NASA Technical Reports Server (NTRS)
Liberman, Eugene M.; Manner, David B.; Dolce, James L.; Mellor, Pamela A.
1993-01-01
Expert systems are widely used in health monitoring and fault detection applications. One of the key features of an expert system is that it possesses a large body of knowledge about the application for which it was designed. When the user consults this knowledge base, it is essential that the expert system's reasoning process and its conclusions be as concise as possible. If, in addition, an expert system is part of a process monitoring system, the expert system's conclusions must be combined with current events of the process. Under these circumstances, it is difficult for a user to absorb and respond to all the available information. For example, a user can become distracted and confused if two or more unrelated devices in different parts of the system require attention. A human interface designed to integrate expert system diagnoses with process data and to focus the user's attention to the important matters provides a solution to the 'information overload' problem. This paper will discuss a user interface to the power distribution expert system for Space Station Freedom. The importance of features which simplify assessing system status and which minimize navigating through layers of information will be discussed. Design rationale and implementation choices will also be presented.
Sawers, Andrew; Hafner, Brian J
2018-05-08
Challenging clinical balance tests are needed to expose balance deficits in lower-limb prosthesis users. This study examined whether narrowing beam-walking could overcome conceptual and practical limitations identified in fixed-width beam-walking. Cross-sectional. Unilateral lower-limb prosthesis users. Participants walked 10 times along a low, narrowing beam. Performance was quantified using the normalized distance walked. Heuristic rules were applied to determine whether the narrowing beam task was "too easy," "too hard," or "appropriately challenging" for each participant. Linear regression and Bland-Altman plots were used to determine whether combinations of the first 5 trials could predict participants' stable beam-walking performance. Forty unilateral lower-limb prosthesis users participated. Narrowing beam-walking was appropriately challenging for 98% of participants. Performance stabilized for 93% of participants within 5 trials, while 62% were stable across all trials. The mean of trials 3-5 accurately predicted stable performance. A clinical narrowing beam-walking test is likely to challenge a range of lower-limb prosthesis users, have minimal administrative burden, and exhibit no floor or ceiling effects. Narrowing beam-walking is therefore a clinically viable method to evaluate lower-limb prosthesis users' balance ability, but requires psychometric testing before it is used to assess fall risk.
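The scoring described above reduces to a small computation: each trial's distance walked is normalized by beam length, and the mean of trials 3-5 estimates stable performance. The sketch below is illustrative only; the trial distances and beam length are invented, not study data.

```python
# Illustrative scoring for a narrowing beam-walking test (not study data):
# normalize each trial's distance by beam length, then estimate stable
# performance as the mean of trials 3-5.

def normalized_scores(distances_m, beam_length_m):
    return [d / beam_length_m for d in distances_m]

def predicted_stable(scores):
    return sum(scores[2:5]) / 3.0          # mean of trials 3, 4, 5

trials = [2.4, 3.1, 3.6, 3.8, 3.7, 3.9, 3.8, 3.8, 3.9, 3.8]  # metres, invented
scores = normalized_scores(trials, beam_length_m=4.8)
estimate = predicted_stable(scores)
```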
Usability Issues in the User Interfaces of Privacy-Enhancing Technologies
ERIC Educational Resources Information Center
LaTouche, Lerone W.
2013-01-01
Privacy on the Internet has become one of the leading concerns for Internet users. These users are not wrong in their concerns if personally identifiable information is not protected and under their control. To minimize the collection of Internet users' personal information and help solve the problem of online privacy, a number of…
NEWSUMT: A FORTRAN program for inequality constrained function minimization, users guide
NASA Technical Reports Server (NTRS)
Miura, H.; Schmit, L. A., Jr.
1979-01-01
A computer program written in FORTRAN subroutine form for the solution of linear and nonlinear constrained and unconstrained function minimization problems is presented. The algorithm is the sequence of unconstrained minimizations technique, with Newton's method performing each unconstrained minimization. The use of NEWSUMT and the definition of all parameters are described.
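The technique named above solves a constrained problem as a sequence of unconstrained Newton minimizations of a penalized objective. The following is a minimal sketch of that idea on a toy two-variable problem, not NEWSUMT itself; the penalty form and schedule are assumptions for illustration.

```python
# Sequential Unconstrained Minimization Technique (SUMT) sketch.
# Toy problem: minimize x0^2 + x1^2 subject to g(x) = x0 + x1 - 1 >= 0;
# the constrained minimum is (0.5, 0.5). An exterior quadratic penalty
# is tightened over outer iterations; each stage is minimized by Newton.

def sumt(x, r=1.0, growth=10.0, outer=8, inner=20):
    for _ in range(outer):
        for _ in range(inner):
            g = x[0] + x[1] - 1.0
            v = min(0.0, g)                      # constraint violation
            grad = [2*x[0] + 2*r*v, 2*x[1] + 2*r*v]
            # Hessian is 2*I, plus 2*r on every entry when g < 0
            h11 = 2.0 + (2.0*r if g < 0 else 0.0)
            h12 = 2.0*r if g < 0 else 0.0
            det = h11*h11 - h12*h12
            # Newton step: solve the 2x2 system H * dx = -grad
            dx0 = (-grad[0]*h11 + grad[1]*h12) / det
            dx1 = (-grad[1]*h11 + grad[0]*h12) / det
            x = [x[0] + dx0, x[1] + dx1]
        r *= growth                              # tighten the penalty
    return x

x = sumt([0.0, 0.0])
```

For this quadratic toy each inner Newton loop converges in one step; the penalty growth drives the iterate toward the constraint boundary.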
NASA Technical Reports Server (NTRS)
Stahara, S. S.
1984-01-01
An investigation was carried out to complete the preliminary development of a combined perturbation/optimization procedure and associated computational code for designing optimized blade-to-blade profiles of turbomachinery blades. The overall purpose of the procedures developed is to provide demonstration of a rapid nonlinear perturbation method for minimizing the computational requirements associated with parametric design studies of turbomachinery flows. The method combines the multiple parameter nonlinear perturbation method, successfully developed in previous phases of this study, with the NASA TSONIC blade-to-blade turbomachinery flow solver, and the COPES-CONMIN optimization procedure into a user's code for designing optimized blade-to-blade surface profiles of turbomachinery blades. Results of several design applications and a documented version of the code together with a user's manual are provided.
Progress in the development of paper-based diagnostics for low-resource point-of-care settings
Byrnes, Samantha; Thiessen, Gregory; Fu, Elain
2014-01-01
This Review focuses on recent work in the field of paper microfluidics that specifically addresses the goal of translating the multistep processes that are characteristic of gold-standard laboratory tests to low-resource point-of-care settings. A major challenge is to implement multistep processes with the robust fluid control required to achieve the necessary sensitivity and specificity of a given application in a user-friendly package that minimizes equipment. We review key work in the areas of fluidic controls for automation in paper-based devices, readout methods that minimize dedicated equipment, and power and heating methods that are compatible with low-resource point-of-care settings. We also highlight a focused set of recent applications and discuss future challenges. PMID:24256361
High-Resolution Strain Analysis of the Human Heart with Fast-DENSE
NASA Astrophysics Data System (ADS)
Aletras, Anthony H.; Balaban, Robert S.; Wen, Han
1999-09-01
Single breath-hold displacement data from the human heart were acquired with fast-DENSE (fast displacement encoding with stimulated echoes) during systolic contraction at 2.5 × 2.5 mm in-plane resolution. Encoding strengths of 0.86-1.60 mm/π were utilized in order to extend the dynamic range of the phase measurements and minimize effects of physiologic and instrument noise. The noise level in strain measurements for both contraction and dilation corresponded to a strain value of 2.8%. In the human heart, strain analysis has sufficient resolution to reveal transmural variation across the left ventricular wall. Data processing required minimal user intervention and provided a rapid quantitative feedback. The intrinsic temporal integration of fast-DENSE achieves high accuracy at the expense of temporal resolution.
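The relationship the abstract relies on can be shown compactly: measured phase maps to displacement through the encoding strength (here quoted in mm per π radians), and strain is the spatial gradient of displacement. This is an illustrative one-dimensional sketch with invented phase values, not the authors' processing pipeline.

```python
# DENSE-style strain sketch (illustrative values, not patient data):
# displacement = phase * (encoding strength / pi), strain = d(displ)/dx.
import math

ENC_MM_PER_PI = 0.86          # encoding strength from the quoted range
PIXEL_MM = 2.5                # in-plane resolution

phases = [0.00, 0.20, 0.41, 0.63, 0.86]   # radians along one pixel line

displ = [p * ENC_MM_PER_PI / math.pi for p in phases]   # displacement, mm
# forward-difference strain between neighbouring pixels (dimensionless)
strain = [(displ[i+1] - displ[i]) / PIXEL_MM for i in range(len(displ) - 1)]
```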
Systems identification using a modified Newton-Raphson method: A FORTRAN program
NASA Technical Reports Server (NTRS)
Taylor, L. W., Jr.; Iliff, K. W.
1972-01-01
A FORTRAN program is offered which computes a maximum likelihood estimate of the parameters of any linear, constant coefficient, state space model. For the case considered, the maximum likelihood estimate can be identical to that which minimizes simultaneously the weighted mean square difference between the computed and measured response of a system and the weighted square of the difference between the estimated and a priori parameter values. A modified Newton-Raphson or quasilinearization method is used to perform the minimization which typically requires several iterations. A starting technique is used which ensures convergence for any initial values of the unknown parameters. The program and its operation are described in sufficient detail to enable the user to apply the program to his particular problem with a minimum of difficulty.
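The cost function described above, a weighted mean-square fit error plus a weighted penalty toward a priori parameter values, can be sketched on a toy one-parameter model. This is not the NASA program: the model, weights, and Gauss-Newton curvature approximation below are illustrative assumptions.

```python
# Modified Newton-Raphson (Gauss-Newton) sketch for a one-parameter fit.
# Toy model y(t) = exp(-a*t); cost = w_fit * sum(residual^2)
#                                   + w_prior * (a - a_prior)^2.
import math

ts = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.exp(-0.8 * t) for t in ts]   # synthetic "measurements", true a = 0.8

w_fit, w_prior, a_prior = 1.0, 0.01, 1.0

def estimate(a, iters=25):
    for _ in range(iters):
        # residuals and sensitivities dy/da at the current estimate
        r = [y - math.exp(-a * t) for t, y in zip(ts, ys)]
        s = [-t * math.exp(-a * t) for t in ts]
        # gradient and Gauss-Newton (approximate) curvature of the cost
        grad = -2*w_fit*sum(ri*si for ri, si in zip(r, s)) + 2*w_prior*(a - a_prior)
        curv = 2*w_fit*sum(si*si for si in s) + 2*w_prior
        a -= grad / curv                 # Newton-Raphson update
    return a

a_hat = estimate(a=2.0)                  # deliberately poor starting value
```

The small prior weight pulls the estimate slightly toward a_prior, so a_hat lands just above the true value of 0.8.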
EDMUS, a European database for multiple sclerosis.
Confavreux, C; Compston, D A; Hommes, O R; McDonald, W I; Thompson, A J
1992-08-01
EDMUS is a minimal descriptive record developed for research purposes to document clinical and laboratory data in patients with multiple sclerosis (MS). It has been designed by a committee of the European Concerted Action for MS, organised under the auspices of the Commission of the European Communities. The software is user-friendly and fast, with a minimal set of obligatory data. Priority has been given to analytical data and the system is capable of automatically generating data, such as diagnosis classification, using appropriate algorithms. This procedure saves time, ensures a uniform approach to individual cases and allows automatic updating of the classification whenever additional information becomes available. It is also compatible with future developments and requirements since new algorithms can be entered in the programme when necessary. This system is flexible and may be adapted to the user's needs. It is run on Apple and IBM-PC personal microcomputers. Great care has been taken to preserve confidentiality of the data. It is anticipated that this "common" language will enable the collection of appropriate cases for specific purposes, including population-based studies of MS and will be particularly useful in projects where the collaboration of several centres is needed to recruit a critical number of patients.
PELE web server: atomistic study of biomolecular systems at your fingertips.
Madadkar-Sobhani, Armin; Guallar, Victor
2013-07-01
PELE, Protein Energy Landscape Exploration, our novel technology based on protein structure prediction algorithms and a Monte Carlo sampling, is capable of modelling the all-atom protein-ligand dynamical interactions in an efficient and fast manner, with two orders of magnitude reduced computational cost when compared with traditional molecular dynamics techniques. PELE's heuristic approach generates trial moves based on protein and ligand perturbations followed by side chain sampling and global/local minimization. The collection of accepted steps forms a stochastic trajectory. Furthermore, several processors may be run in parallel towards a collective goal or defining several independent trajectories; the whole procedure has been parallelized using the Message Passing Interface. Here, we introduce the PELE web server, designed to make the whole process of running simulations easier and more practical by minimizing input file demand, providing a user-friendly interface and producing abstract outputs (e.g. interactive graphs and tables). The web server has been implemented in C++ using Wt (http://www.webtoolkit.eu) and MySQL (http://www.mysql.com). The PELE web server, accessible at http://pele.bsc.es, is free and open to all users with no login requirement.
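The accept/reject logic behind a trajectory of "accepted steps" is standard Metropolis Monte Carlo. The sketch below abstracts PELE's perturbation-plus-minimization move to a one-dimensional toy energy function; it illustrates the acceptance criterion only, not PELE's algorithm.

```python
# Metropolis Monte Carlo sketch: trial perturbations are accepted when
# they lower the energy, or with Boltzmann probability otherwise; the
# accepted steps form a stochastic trajectory. Toy 1-D energy landscape.
import math
import random

def energy(x):
    return (x - 1.0) ** 2                 # toy energy, minimum at x = 1

def metropolis(x, kT=0.5, steps=2000, seed=42):
    rng = random.Random(seed)
    trajectory = [x]
    for _ in range(steps):
        trial = x + rng.uniform(-0.5, 0.5)        # trial "perturbation" move
        dE = energy(trial) - energy(x)
        if dE <= 0 or rng.random() < math.exp(-dE / kT):
            x = trial                     # accepted step joins the trajectory
            trajectory.append(x)
    return trajectory

traj = metropolis(x=5.0)                  # starts far from the minimum
```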
NASA Technical Reports Server (NTRS)
Scharfstein, Gregory; Cox, Russell
2012-01-01
A document discusses a simulation chamber that represents a shift from the thermal-vacuum chamber stereotype. This innovation, currently in development, combines the capabilities of space simulation chambers, the user-friendliness of modern-day electronics, and the modularity of plug-and-play computing. The Mobile Chamber is a customized test chamber that can be deployed with great ease, and is capable of bringing payloads at temperatures down to 20 K, in high vacuum, and with the desired metrology instruments integrated to the systems control. Flexure plans to lease Mobile Chambers, making them affordable for smaller budgets and available to a larger customer base. A key feature of this design will be an Apple iPad-like user interface that allows someone with minimal training to control the environment inside the chamber, and to simulate the required extreme environments. The feedback of thermal, pressure, and other measurements is delivered in a 3D CAD model of the chamber's payload and support hardware. This GUI will provide the user with a better understanding of the payload than any existing thermal-vacuum system.
Main control computer security model of closed network systems protection against cyber attacks
NASA Astrophysics Data System (ADS)
Seymen, Bilal
2014-06-01
The model brings data input/output under control in closed network systems, maintains system security, and routes the flow of information through a Main Control Computer, which also guards network traffic against cyber-attacks. The network can be controlled single-handedly thanks to a design that enables network users to enter data into, or extract data from, the system securely, and the model is intended to minimize security gaps. Moreover, data input/output records can be kept by means of the user account assigned to each user, making retroactive tracking possible if requested. Because the measures that would need to be taken for every computer on the network to ensure cyber security carry a high cost, this model is intended to provide a cost-effective working environment in which only the Main Control Computer requires updated hardware.
Tool for a configurable integrated circuit that uses determination of dynamic power consumption
NASA Technical Reports Server (NTRS)
Davoodi, Azadeh (Inventor); French, Matthew C. (Inventor); Agarwal, Deepak (Inventor); Wang, Li (Inventor)
2011-01-01
A configurable logic tool that allows minimization of dynamic power within an FPGA design without changing user-entered specifications. The minimization of power may use minimized clock nets as a first order operation, and a second order operation that minimizes other factors, such as area of placement, area of clocks and/or slack.
A 3D virtual reality simulator for training of minimally invasive surgery.
Mi, Shao-Hua; Hou, Zeng-Guang; Yang, Fan; Xie, Xiao-Liang; Bian, Gui-Bin
2014-01-01
For the last decade, remarkable progress has been made in the field of cardiovascular disease treatment. However, these complex medical procedures require a combination of rich experience and technical skills. In this paper, a 3D virtual reality simulator for core skills training in minimally invasive surgery is presented. The system can generate realistic 3D vascular models segmented from patient datasets, including a beating heart, and provide a real-time computation of force and force feedback module for surgical simulation. Instruments, such as a catheter or guide wire, are represented by a multi-body mass-spring model. In addition, a realistic user interface with multiple windows and real-time 3D views is developed. Moreover, the simulator is also provided with a human-machine interaction module that gives doctors the sense of touch during the surgery training, and enables them to control the motion of a virtual catheter/guide wire inside a complex vascular model. Experimental results show that the simulator is suitable for minimally invasive surgery training.
Syroid, Noah; Liu, David; Albert, Robert; Agutter, James; Egan, Talmage D; Pace, Nathan L; Johnson, Ken B; Dowdle, Michael R; Pulsipher, Daniel; Westenskow, Dwayne R
2012-11-01
Drug administration errors are frequent and are often associated with the misuse of IV infusion pumps. One source of these errors may be the infusion pump's user interface. We used failure modes-and-effects analyses to identify programming errors and to guide the design of a new syringe pump user interface. We designed the new user interface to clearly show the pump's operating state simultaneously in more than 1 monitoring location. We evaluated anesthesia residents in laboratory and simulated environments on programming accuracy and error detection between the new user interface and the user interface of a commercially available infusion pump. With the new user interface, we observed the number of programming errors reduced by 81%, the number of keystrokes per task reduced from 9.2 ± 5.0 to 7.5 ± 5.5 (mean ± SD), the time required per task reduced from 18.1 ± 14.1 seconds to 10.9 ± 9.5 seconds, and significantly less perceived workload. Residents detected 38 of 70 (54%) of the events with the new user interface and 37 of 70 (53%) with the existing user interface, despite no experience with the new user interface and extensive experience with the existing interface. The number of programming errors and workload were reduced partly because it took less time and fewer keystrokes to program the pump when using the new user interface. Despite minimal training, residents quickly identified preexisting infusion pump problems with the new user interface. Intuitive and easy-to-program infusion pump interfaces may reduce drug administration errors and infusion pump-related adverse events.
Bandodkar, Amay J; Jia, Wenzhao; Ramírez, Julian; Wang, Joseph
2015-06-03
The development of enzymatic-ink-based roller pens for direct drawing of biocatalytic sensors, in general, and for realizing renewable glucose sensor strips, in particular, is described. The resulting enzymatic-ink pen allows facile fabrication of high-quality inexpensive electrochemical biosensors of any design by the user on a wide variety of surfaces having complex textures with minimal user training. Unlike prefabricated sensors, this approach empowers the end user with the ability of "on-demand" and "on-site" designing and fabricating of biocatalytic sensors to suit their specific requirement. The resulting devices are thus referred to as "do-it-yourself" sensors. The bio-active pens produce highly reproducible biocatalytic traces with minimal edge roughness. The composition of the new enzymatic inks has been optimized for ensuring good biocatalytic activity, electrical conductivity, biocompatibility, reproducible writing, and surface adherence. The resulting inks are characterized using spectroscopic, viscometric, electrochemical, thermal and microscopic techniques. Applicability to renewable blood glucose testing, epidermal glucose monitoring, and on-leaf phenol detection are demonstrated in connection to glucose oxidase and tyrosinase-based carbon inks. The "do-it-yourself" renewable glucose sensor strips offer a "fresh," reproducible, low-cost biocatalytic sensor surface for each blood test. The ability to directly draw biocatalytic conducting traces even on unconventional surfaces opens up new avenues in various sensing applications in low-resource settings and holds great promise for diverse healthcare, environmental, and defense domains. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ryu, Joonghyun; Lee, Mokwon; Cha, Jehyun; Laskowski, Roman A; Ryu, Seong Eon; Kim, Deok-Soo
2016-07-08
Many applications, such as protein design, homology modeling, flexible docking, etc. require the prediction of a protein's optimal side-chain conformations from just its amino acid sequence and backbone structure. Side-chain prediction (SCP) is an NP-hard energy minimization problem. Here, we present BetaSCPWeb which efficiently computes a conformation close to optimal using a geometry-prioritization method based on the Voronoi diagram of spherical atoms. Its outputs are visual, textual and PDB file format. The web server is free and open to all users at http://voronoi.hanyang.ac.kr/betascpweb with no login requirement. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Technical Reports Server (NTRS)
McElroy, Mark W.
2017-01-01
This document serves as a user guide for the AF-Shell 1.0 software, an efficient tool for progressive damage simulation in composite laminates. This guide contains minimal technical material and is meant solely as a guide for a new user to apply AF-Shell 1.0 to laminate damage simulation problems.
21 CFR 874.5840 - Antistammering device.
Code of Federal Regulations, 2010 CFR
2010-04-01
... it senses the user's speech and that is intended to prevent the user from hearing the sounds of his or her own voice. The device is used to minimize a user's involuntary hesitative or repetitive speech. (b) Classification. Class I (general controls). The device is exempt from the premarket notification...
Development of a Haptic Interface for Natural Orifice Translumenal Endoscopic Surgery Simulation
Dargar, Saurabh; Sankaranarayanan, Ganesh
2016-01-01
Natural orifice translumenal endoscopic surgery (NOTES) is a minimally invasive procedure, which utilizes the body’s natural orifices to gain access to the peritoneal cavity. The NOTES procedure is designed to minimize external scarring and patient trauma; however, flexible endoscopy-based pure NOTES procedures require critical scope handling skills. The delicate nature of the NOTES procedure requires extensive training, thus to improve access to training while reducing risk to patients we have designed and developed the VTEST©, a virtual reality NOTES simulator. As part of the simulator, a novel decoupled 2-DOF haptic device was developed to provide realistic force feedback to the user in training. A series of experiments were performed to determine the behavioral characteristics of the device. The device was found capable of rendering up to 5.62N and 0.190Nm of continuous force and torque in the translational and rotational DOF, respectively. The device possesses 18.1Hz and 5.7Hz of force bandwidth in the translational and rotational DOF, respectively. A feedforward friction compensator was also successfully implemented to minimize the negative impact of friction during the interaction with the device. In this work we have presented the detailed development and evaluation of the haptic device for the VTEST©. PMID:27008674
Transformation of an uncertain video search pipeline to a sketch-based visual analytics loop.
Legg, Philip A; Chung, David H S; Parry, Matthew L; Bown, Rhodri; Jones, Mark W; Griffiths, Iwan W; Chen, Min
2013-12-01
Traditional sketch-based image or video search systems rely on machine learning concepts as their core technology. However, in many applications, machine learning alone is impractical since videos may not be semantically annotated sufficiently, there may be a lack of suitable training data, and the search requirements of the user may frequently change for different tasks. In this work, we develop a visual analytics system that overcomes the shortcomings of the traditional approach. We make use of a sketch-based interface to enable users to specify search requirements in a flexible manner without depending on semantic annotation. We employ active machine learning to train different analytical models for different types of search requirements. We use visualization to facilitate knowledge discovery at the different stages of visual analytics. This includes visualizing the parameter space of the trained model, visualizing the search space to support interactive browsing, visualizing candidate search results to support rapid interaction for active learning while minimizing watching videos, and visualizing aggregated information of the search results. We demonstrate the system for searching spatiotemporal attributes from sports video to identify key instances of the team and player performance.
Earthquake Early Warning Beta Users: Java, Modeling, and Mobile Apps
NASA Astrophysics Data System (ADS)
Strauss, J. A.; Vinci, M.; Steele, W. P.; Allen, R. M.; Hellweg, M.
2014-12-01
Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds warning prior to ground shaking at a user's location. The goal and purpose of such a system is to reduce, or minimize, the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, University of Washington, the USGS, and beta users in California and the Pacific Northwest. The beta users receive earthquake information very rapidly in real-time and are providing feedback on their experiences of performance and potential uses within their organization. Beta user interactions allow the ShakeAlert team to discern: which alert delivery options are most effective, what changes would make the UserDisplay more useful in a pre-disaster situation, and most importantly, what actions users plan to take for various scenarios. Actions could include: personal safety approaches, such as drop, cover, and hold on; automated processes and procedures, such as opening elevator doors or fire station doors; or situational awareness. Users are beginning to determine which policy and technological changes may need to be enacted, and funding requirements to implement their automated controls. The use of models and mobile apps are beginning to augment the basic Java desktop applet. Modeling allows beta users to test their early warning responses against various scenarios without having to wait for a real event. Mobile apps are also changing the possible response landscape, providing other avenues for people to receive information. All of these combine to improve business continuity and resiliency.
Automated Design of Restraint Layer of an Inflatable Vessel
NASA Technical Reports Server (NTRS)
Spexarth, Gary
2007-01-01
A Mathcad computer program largely automates the design and analysis of the restraint layer (the primary load-bearing layer) of an inflatable vessel that consists of one or more sections having cylindrical, toroidal, and/or spherical shape(s). A restraint layer typically comprises webbing in the form of multiple straps. The design task includes choosing indexing locations along the straps, computing the load at every location in each strap, computing the resulting stretch at each location, and computing the amount of undersizing required of each strap so that, once the vessel is inflated and the straps thus stretched, the vessel can be expected to assume the desired shape. Prior to the development of this program, the design task was performed by use of a difficult-to-use spreadsheet program that required manual addition of rows and columns depending on the numbers of strap rows and columns of a given design. In contrast, this program is completely parametric and includes logic that automatically adds or deletes rows and columns as needed. With minimal input from the user, this program automatically computes indexing locations, strap lengths, undersizing requirements, and all design data required to produce detailed drawings and assembly procedures. It also generates textual comments that help the user understand the calculations.
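The core undersizing calculation described above can be shown in miniature: each strap is fabricated shorter than its inflated geometry by its predicted stretch, so that under load it stretches to the desired shape. The sketch assumes linear webbing behaviour; the stiffness (EA) value and load are illustrative, not program data.

```python
# Strap undersizing sketch (linear elasticity assumed; values illustrative):
# stretch = (load / EA) * length, cut length = target length - stretch.

def undersized_length(target_len_m, load_N, stiffness_EA_N):
    stretch = load_N / stiffness_EA_N * target_len_m   # elongation under load
    return target_len_m - stretch                      # length to fabricate

cut_len = undersized_length(target_len_m=2.0, load_N=8000.0,
                            stiffness_EA_N=400000.0)
```

With these numbers the strap stretches 40 mm under load, so it is cut 40 mm short of its inflated length.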
Satellite services system program plan
NASA Technical Reports Server (NTRS)
Hoffman, Stephen J.
1985-01-01
The purpose is to determine the potential for servicing from the Space Shuttle Orbiter and to assess NASA's role as the catalyst in bringing about routine on-orbit servicing. Specifically this study seeks to determine what requirements, in terms of both funds and time, are needed to make the Shuttle Orbiter not only a transporter of spacecraft but a servicing vehicle for those spacecraft as well. The scope of this effort is to focus on the near term development of a generic servicing capability. To make this capability truly generic and attractive requires that the customer's point of view be taken and transformed into a widely usable set of hardware. And to maintain a near term advent of this capability requires that a minimal reliance be made on advanced technology. With this background and scope, this study will proceed through three general phases to arrive at the desired program costs and schedule. The first step will be to determine the servicing requirements of the user community. This will provide the basis for the second phase which is to develop hardware concepts to meet these needs. Finally, a cost estimate will be made for each of the new hardware concepts and a phased hardware development plan will be established for the acquisition of these items based on the inputs obtained from the user community.
GASOLINE: Smoothed Particle Hydrodynamics (SPH) code
NASA Astrophysics Data System (ADS)
N-Body Shop
2017-10-01
Gasoline solves the equations of gravity and hydrodynamics in astrophysical problems, including simulations of planets, stars, and galaxies. It uses an SPH method that features correct mixing behavior in multiphase fluids and minimal artificial viscosity. This method is identical to the SPH method used in the ChaNGa code (ascl:1105.005), allowing users to extend results to problems requiring >100,000 cores. Gasoline uses a fast, memory-efficient O(N log N) KD-Tree to solve Poisson's Equation for gravity and avoids artificial viscosity in non-shocking compressive flows.
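As general background to the SPH method mentioned above, many SPH codes weight neighbour contributions with a cubic-spline smoothing kernel of compact support. The sketch below shows the standard 3-D cubic spline; Gasoline's exact kernel choice and normalization should be checked against its own documentation.

```python
# Standard 3-D cubic-spline SPH smoothing kernel W(r, h) with compact
# support at r = 2h (generic textbook form, not Gasoline's source code).
import math

def w_cubic_spline(r, h):
    sigma = 1.0 / (math.pi * h ** 3)      # 3-D normalization constant
    q = r / h
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0                            # outside the compact support
```

The two polynomial branches match in value at q = 1, and the kernel vanishes smoothly at q = 2, which keeps each particle's neighbour list finite.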
LaPeyre, Megan K.; Nix, Ashby; Laborde, Luke; Piazza, Bryan P.
2012-01-01
Successful oyster reef restoration, like many conservation challenges, requires not only biological understanding of the resource, but also stakeholder cooperation and political support. To measure perceptions of oyster reef restoration activities and priorities for future restoration along the northern Gulf of Mexico coast, a survey of 1500 individuals representing 4 user groups (oyster harvesters, shrimpers, environmental organization members, professionals), across 5 states (Texas, Louisiana, Mississippi, Alabama, Florida) was conducted in 2011. All respondents highly supported reef restoration efforts, but there was a dichotomy in preferred restoration goals with commercial fishermen more likely to support oyster reef restoration for stock enhancement, while professionals and environmental organization members were more likely to support oyster reef restoration to enhance ecosystem services. All user groups identified enforcement, funding, and appropriate site selection as basic requirements for successful reef restoration. For management of restored oyster reefs, oyster harvesters and shrimpers were less likely to support options that restricted the use of reefs, including gear restrictions and permanent closures, but did support rotating annual reef closures, while other stakeholders were willing to consider all options, including annual reef closures and sanctuary reefs. Overall, there were clear differences in management and communication preferences across user groups, but few differences across states. Understanding these key differences in stakeholder support for, and willingness to accept specific management actions is critical in moving management and restoration forward while minimizing conflict.
Expert System for Building TRU Waste Payloads - 13554
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruemmer, Heather; Slater, Bryant
2013-07-01
Grouping TRU waste drums into payloads for shipment to the Waste Isolation Pilot Plant (WIPP) for disposal is a very complex process. Transportation and regulatory requirements must be met, along with striving for the goals of shipment efficiency: maximize the number of waste drums in a shipment and minimize the use of empty drums which take up precious underground storage space. The restrictions on payloads range from weight restrictions, to limitations on flammable gas in the headspace, to minimum TRU alpha activity concentration requirements. The Overpack and Payload Assistant Tool (OPAT) has been developed as a mixed-initiative intelligent system within the WIPP Waste Data System (WDS) to guide the construction of multiple acceptable payloads. OPAT saves the user time while maximizing the efficiency of shipments for the given drum population. The tool provides the user with the flexibility to tune critical factors that guide OPAT's operation based on real-time feedback concerning the results of the execution. This feedback complements the user's external knowledge of the drum population (such as location of drums, known challenges, internal shipment goals). This work demonstrates how software can be utilized to complement the unique domain knowledge of the users. The mixed-initiative approach combines the insight and intuition of the human expert with the proficiency of automated computational algorithms. The result is the ability to thoroughly and efficiently explore the search space of possible solutions and derive the best waste management decision. (authors)
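A heavily simplified version of the payload-assembly problem is constrained bin packing: group drums into payloads subject to a weight cap and a drum-count limit. The sketch below uses a greedy lightest-first heuristic with invented limits; WIPP's actual transportation and regulatory rules are far richer.

```python
# Greedy payload-packing sketch (illustrative limits, not WIPP rules):
# drums are packed lightest-first into payloads bounded by a weight cap
# and a maximum drum count, maximizing drums per payload.

def build_payloads(drum_weights_kg, cap_kg=3000.0, drums_per_payload=14):
    payloads, current, total = [], [], 0.0
    for w in sorted(drum_weights_kg):        # lightest first packs more drums
        if len(current) == drums_per_payload or total + w > cap_kg:
            payloads.append(current)         # close the current payload
            current, total = [], 0.0
        current.append(w)
        total += w
    if current:
        payloads.append(current)
    return payloads

payloads = build_payloads([210, 180, 250, 300, 195, 220, 240, 205] * 4)
```

A real mixed-initiative tool would let the operator re-tune the limits and re-run, rather than accept the first greedy grouping.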
NASA Astrophysics Data System (ADS)
Rover, J.; Goldhaber, M. B.; Holen, C.; Dittmeier, R.; Wika, S.; Steinwand, D.; Dahal, D.; Tolk, B.; Quenzer, R.; Nelson, K.; Wylie, B. K.; Coan, M.
2015-12-01
Multi-year land cover mapping from remotely sensed data poses challenges. Producing land cover products at the spatial and temporal scales required for assessing longer-term trends in land cover change is typically a resource-limited process. A recently developed approach utilizes open source software libraries to automatically generate datasets, decision tree classifications, and data products while requiring minimal user interaction. Users are only required to supply coordinates for an area of interest, land cover from an existing source such as the National Land Cover Database, percent slope from a digital terrain model for the same area of interest, two target acquisition year-day windows, and the years of interest between 1984 and present. The algorithm queries the Landsat archive for Landsat data intersecting the area and dates of interest. Cloud-free pixels meeting the user's criteria are mosaicked to create composite images for training and applying the classifiers. Stratification of training data is determined by the user and redefined during an iterative process of reviewing classifiers and resulting predictions. The algorithm outputs include yearly land cover raster format data, graphics, and supporting databases for further analysis. Additional analytical tools are also incorporated into the automated land cover system and enable statistical analysis after data are generated. Applications tested include assessing the impact of land cover change and monitoring water permanence. For example, land cover conversions in areas where shrubland and grassland were replaced by shale oil pads during hydrofracking of the Bakken Formation were quantified. Analysis of spatial and temporal changes in surface water included identifying wetlands in the Prairie Pothole Region of North Dakota with potential connectivity to ground water, indicating subsurface permeability and geochemistry.
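The decision-tree idea behind the automated classifier can be illustrated at its smallest scale: a single threshold split ("stump") on one spectral feature. The feature values and class labels below are invented for illustration; the real system trains full decision trees on multi-band Landsat composites.

```python
# One-split decision-tree ("stump") sketch for land-cover classification.
# Training pairs are (NDVI-like feature value, class label) and are
# illustrative only: low values stand in for water, high for vegetation.

def train_stump(samples):
    """Find the threshold split minimizing misclassification count."""
    best = None
    for t in sorted(v for v, _ in samples):
        for lo, hi in [("water", "vegetation"), ("vegetation", "water")]:
            errs = sum(1 for v, c in samples
                       if (lo if v <= t else hi) != c)
            if best is None or errs < best[0]:
                best = (errs, t, lo, hi)
    _, t, lo, hi = best
    return lambda v: lo if v <= t else hi

train = [(0.05, "water"), (0.10, "water"), (0.62, "vegetation"),
         (0.55, "vegetation"), (0.08, "water"), (0.70, "vegetation")]
classify = train_stump(train)
```

A full tree simply applies this split search recursively to each side of the best split.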
Elimination of water pathogens with solar radiation using an automated sequential batch CPC reactor.
Polo-López, M I; Fernández-Ibáñez, P; Ubomba-Jaswa, E; Navntoft, C; García-Fernández, I; Dunlop, P S M; Schmid, M; Byrne, J A; McGuigan, K G
2011-11-30
Solar disinfection (SODIS) of water is a well-known, effective treatment process which is practiced at household level in many developing countries. However, this process is limited by the small volume treated and there is no indication of treatment efficacy for the user. Low cost glass tube reactors, together with compound parabolic collector (CPC) technology, have been shown to significantly increase the efficiency of solar disinfection. However, these reactors still require user input to control each batch SODIS process and there is no feedback that the process is complete. Automatic operation of the batch SODIS process, controlled by UVA-radiation sensors, can provide information on the status of the process, can ensure the required UVA dose to achieve complete disinfection is received and reduces user work-load through automatic sequential batch processing. In this work, an enhanced CPC photo-reactor with a concentration factor of 1.89 was developed. The apparatus was automated to achieve exposure to a pre-determined UVA dose. Treated water was automatically dispensed into a reservoir tank. The reactor was tested using Escherichia coli as a model pathogen in natural well water. A 6-log inactivation of E. coli was achieved following exposure to the minimum uninterrupted lethal UVA dose. The enhanced reactor decreased the exposure time required to achieve the lethal UVA dose, in comparison to a CPC system with a concentration factor of 1.0. Doubling the lethal UVA dose prevented the need for a period of post-exposure dark inactivation and reduced the overall treatment time. Using this reactor, SODIS can be automatically carried out at an affordable cost, with reduced exposure time and minimal user input. Copyright © 2011 Elsevier B.V. All rights reserved.
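The dose-controlled exposure logic described above amounts to integrating the UVA sensor signal until a target dose is reached, then dispensing and starting the next batch. The following is a minimal sketch of that integrator; the sample period, irradiance values, and dose threshold are illustrative assumptions, not figures from the paper.

```python
def run_sodis_batch(uva_readings_w_m2, dose_target_j_m2, sample_period_s=60):
    """Integrate UVA irradiance readings (W/m^2) over time; return the index
    of the sample at which the cumulative dose first reaches the target
    (J/m^2), or None if the target is never reached."""
    dose = 0.0
    for i, irradiance in enumerate(uva_readings_w_m2):
        dose += irradiance * sample_period_s
        if dose >= dose_target_j_m2:
            return i  # controller would now dispense and begin the next batch
    return None

# Constant 30 W/m^2 for 100 one-minute samples against a 108 kJ/m^2 target.
batch_done_at = run_sodis_batch([30.0] * 100, 108000.0)
```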
An information model to support user-centered design of medical devices.
Hagedorn, Thomas J; Krishnamurty, Sundar; Grosse, Ian R
2016-08-01
The process of engineering design requires the product development team to balance the needs and limitations of many stakeholders, including those of the user, regulatory organizations, and the designing institution. This is particularly true in medical device design, where additional consideration must be given to a much more complex user base that can only be accessed on a limited basis. Given this inherent challenge, few projects exist that simultaneously consider design domain concepts, such as aspects of a detailed design, alongside a detailed view of various stakeholders, their capabilities, and the user needs. In this paper, we present a novel information model approach that combines a detailed model of design elements with a model of the design itself, customer requirements, and the capabilities of the customers themselves. The information model is used to facilitate knowledge capture and automated reasoning across domains with a minimal set of rules by adopting a terminology that treats customer- and design-specific factors identically, thus enabling straightforward assessments. A unique aspect of this approach is that it systematically provides an integrated perspective, linking the key usability information that drives design decisions towards more universal or effective outcomes with the very design information that the usability information impacts. This can lead to cost-efficient, optimal designs based on a direct inclusion of the needs of customers alongside business, marketing, and engineering requirements. Two case studies are presented to show the method's potential as a more effective knowledge management tool with built-in automated inferences that provide design insight, as well as its overall effectiveness as a platform to develop and execute medical device design from a holistic perspective. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Quick, Jason
2009-01-01
The Upper Stage (US) section of the National Aeronautics and Space Administration's (NASA) Ares I rocket will require internal access platforms for maintenance tasks performed by humans inside the vehicle. Tasks will occur during expensive critical path operations at Kennedy Space Center (KSC), including vehicle stacking and launch preparation activities. Platforms must be translated through a small human access hatch, installed in an enclosed worksite environment, support the weight of ground operators, and be removed before flight, and their design must minimize additional vehicle mass at attachment points. This paper describes the application of a user-centered conceptual design process and the unique challenges encountered within NASA's systems engineering culture focused on requirements and "heritage hardware". The NASA design team at Marshall Space Flight Center (MSFC) initiated the user-centered design process by studying heritage internal access kits and proposing new design concepts during brainstorming sessions. Simultaneously, they partnered with the Technology Transfer/Innovative Partnerships Program to research inflatable structures and dynamic scaffolding solutions that could enable ground operator access. While this creative, technology-oriented exploration was encouraged by upper management, some design stakeholders consistently opposed ideas utilizing novel, untested equipment. Subsequent collaboration with an engineering consulting firm improved the technical credibility of several options; however, there was continued resistance from team members focused on meeting system requirements with pre-certified hardware. After a six-month idea-generating phase, an intensive six-week effort produced viable design concepts that justified additional vehicle mass while optimizing the human factors of platform installation and use. Although these selected final concepts closely resemble heritage internal access platforms, challenges from the application of the user-centered process provided valuable lessons for improving future collaborative conceptual design efforts.
Anon-Pass: Practical Anonymous Subscriptions.
Lee, Michael Z; Dunn, Alan M; Katz, Jonathan; Waters, Brent; Witchel, Emmett
2013-12-31
We present the design, security proof, and implementation of an anonymous subscription service. Users register for the service by providing some form of identity, which might or might not be linked to a real-world identity such as a credit card, a web login, or a public key. A user logs on to the system by presenting a credential derived from information received at registration. Each credential allows only a single login in any authentication window, or epoch . Logins are anonymous in the sense that the service cannot distinguish which user is logging in any better than random guessing. This implies unlinkability of a user across different logins. We find that a central tension in an anonymous subscription service is the service provider's desire for a long epoch (to reduce server-side computation) versus users' desire for a short epoch (so they can repeatedly "re-anonymize" their sessions). We balance this tension by having short epochs, but adding an efficient operation for clients who do not need unlinkability to cheaply re-authenticate themselves for the next time period. We measure performance of a research prototype of our protocol that allows an independent service to offer anonymous access to existing services. We implement a music service, an Android-based subway-pass application, and a web proxy, and show that adding anonymity adds minimal client latency and only requires 33 KB of server memory per active user.
NASA Technical Reports Server (NTRS)
Harper, Richard
1989-01-01
In a fault-tolerant parallel computer, a functional programming model can facilitate distributed checkpointing, error recovery, load balancing, and graceful degradation. Such a model has been implemented on the Draper Fault-Tolerant Parallel Processor (FTPP). When used in conjunction with the FTPP's fault detection and masking capabilities, this implementation results in a graceful degradation of system performance after faults. Three graceful degradation algorithms have been implemented and are presented. A user interface has been implemented which requires minimal cognitive overhead by the application programmer, masking such complexities as the system's redundancy, distributed nature, variable complement of processing resources, load balancing, fault occurrence and recovery. This user interface is described and its use demonstrated. The applicability of the functional programming style to the Activation Framework, a paradigm for intelligent systems, is then briefly described.
Myokit: A simple interface to cardiac cellular electrophysiology.
Clerx, Michael; Collins, Pieter; de Lange, Enno; Volders, Paul G A
2016-01-01
Myokit is a new powerful and versatile software tool for modeling and simulation of cardiac cellular electrophysiology. Myokit consists of an easy-to-read modeling language, a graphical user interface, single and multi-cell simulation engines and a library of advanced analysis tools accessible through a Python interface. Models can be loaded from Myokit's native file format or imported from CellML. Model export is provided to C, MATLAB, CellML, CUDA and OpenCL. Patch-clamp data can be imported and used to estimate model parameters. In this paper, we review existing tools to simulate the cardiac cellular action potential to find that current tools do not cater specifically to model development and that there is a gap between easy-to-use but limited software and powerful tools that require strong programming skills from their users. We then describe Myokit's capabilities, focusing on its model description language, simulation engines and import/export facilities in detail. Using three examples, we show how Myokit can be used for clinically relevant investigations, multi-model testing and parameter estimation in Markov models, all with minimal programming effort from the user. This way, Myokit bridges a gap between performance, versatility and user-friendliness. Copyright © 2015 Elsevier Ltd. All rights reserved.
Dong, Yu-Shuang; Xu, Gao-Chao; Fu, Xiao-Dong
2014-01-01
The cloud platform provides various services to users. More and more cloud centers provide infrastructure as their main mode of operation. To improve the utilization rate of the cloud center and to decrease operating cost, the cloud center provides services according to users' requirements by sharding the resources with virtualization. Considering both QoS for users and cost savings for cloud computing providers, we aim to maximize performance while minimizing energy cost. In this paper, we propose a distributed parallel genetic algorithm (DPGA) placement strategy for virtual machine deployment on the cloud platform. In the first stage, it executes the genetic algorithm in parallel and in a distributed fashion on several selected physical hosts. It then executes the second-stage genetic algorithm using the solutions obtained from the first stage as the initial population. The solution calculated by the second-stage genetic algorithm is the optimal one of the proposed approach. The experimental results show that the proposed VM placement strategy can ensure QoS for users and is more effective and more energy efficient than other placement strategies on the cloud platform. PMID:25097872
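The two-stage flow described above can be illustrated with a toy single-process sketch. The fitness function (active-host count plus a heavy capacity-violation penalty as an energy/QoS proxy), the genetic operators, and all parameters here are invented for illustration, and the per-host parallelism of stage one is only simulated by sequential runs.

```python
import random

def fitness(assign, demand, capacity):
    """Energy proxy: number of hosts switched on, plus a heavy penalty
    for any host whose capacity is exceeded (a QoS violation)."""
    load = {}
    for vm, host in enumerate(assign):
        load[host] = load.get(host, 0) + demand[vm]
    penalty = sum(max(0, l - capacity) for l in load.values())
    return len(load) + 100 * penalty

def ga(pop, demand, capacity, n_hosts, gens=200):
    """Elitist GA over VM->host assignment vectors."""
    for _ in range(gens):
        pop.sort(key=lambda a: fitness(a, demand, capacity))
        elite = pop[: len(pop) // 2]
        children = []
        while len(elite) + len(children) < len(pop):
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(1, len(p1))       # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < 0.2:                # point mutation
                child[random.randrange(len(child))] = random.randrange(n_hosts)
            children.append(child)
        pop = elite + children
    return sorted(pop, key=lambda a: fitness(a, demand, capacity))

random.seed(1)
demand, capacity, n_hosts = [2, 3, 1, 4, 2, 2], 8, 4
rand = lambda: [random.randrange(n_hosts) for _ in demand]
# Stage 1: independent GA runs (stand-ins for per-host parallel execution).
stage1 = [ga([rand() for _ in range(20)], demand, capacity, n_hosts)[0]
          for _ in range(3)]
# Stage 2: seed a final GA with the stage-1 winners as initial population.
best = ga(stage1 + [rand() for _ in range(17)], demand, capacity, n_hosts)[0]
```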
Lazinski, David W; Camilli, Andrew
2013-01-01
The amplification of DNA fragments, cloned between user-defined 5' and 3' end sequences, is a prerequisite step in the use of many current applications including massively parallel sequencing (MPS). Here we describe an improved method, called homopolymer tail-mediated ligation PCR (HTML-PCR), that requires very little starting template, minimal hands-on effort, is cost-effective, and is suited for use in high-throughput and robotic methodologies. HTML-PCR starts with the addition of homopolymer tails of controlled lengths to the 3' termini of a double-stranded genomic template. The homopolymer tails enable the annealing-assisted ligation of a hybrid oligonucleotide to the template's recessed 5' ends. The hybrid oligonucleotide has a user-defined sequence at its 5' end. This primer, together with a second primer composed of a longer region complementary to the homopolymer tail and fused to a second 5' user-defined sequence, are used in a PCR reaction to generate the final product. The user-defined sequences can be varied to enable compatibility with a wide variety of downstream applications. We demonstrate our new method by constructing MPS libraries starting from nanogram and sub-nanogram quantities of Vibrio cholerae and Streptococcus pneumoniae genomic DNA.
Leszczuk, Mikołaj; Dudek, Łukasz; Witkowski, Marcin
The VQiPS (Video Quality in Public Safety) Working Group, supported by the U.S. Department of Homeland Security, has been developing a user guide for public safety video applications. According to VQiPS, five parameters have particular influence on the ability to achieve a recognition task: usage time-frame, discrimination level, target size, lighting level, and level of motion. These parameters form what are referred to as Generalized Use Classes (GUCs). The aim of our research was to develop algorithms that automatically assist classification of input sequences into one of the GUCs. The target size and lighting level parameters were addressed. The experiment described reveals the experts' ambiguity and hesitation during the manual target size determination process. However, the automatic methods developed for target size classification make it possible to determine GUC parameters with 70% agreement with the end users' opinion. Lighting levels of the entire sequence can be classified with an efficiency reaching 93%. To make the algorithms available for use, a test application has been developed. It processes video files and displays classification results through a very simple user interface requiring only minimal user interaction.
Dong, Yu-Shuang; Xu, Gao-Chao; Fu, Xiao-Dong
2014-01-01
The cloud platform provides various services to users. More and more cloud centers provide infrastructure as the main way of operating. To improve the utilization rate of the cloud center and to decrease the operating cost, the cloud center provides services according to requirements of users by sharding the resources with virtualization. Considering both QoS for users and cost saving for cloud computing providers, we try to maximize performance and minimize energy cost as well. In this paper, we propose a distributed parallel genetic algorithm (DPGA) of placement strategy for virtual machines deployment on cloud platform. It executes the genetic algorithm parallelly and distributedly on several selected physical hosts in the first stage. Then it continues to execute the genetic algorithm of the second stage with solutions obtained from the first stage as the initial population. The solution calculated by the genetic algorithm of the second stage is the optimal one of the proposed approach. The experimental results show that the proposed placement strategy of VM deployment can ensure QoS for users and it is more effective and more energy efficient than other placement strategies on the cloud platform.
Rushton, Paula W; Kairy, Dahlia; Archambault, Philippe; Pituch, Evelina; Torkia, Caryne; El Fathi, Anas; Stone, Paula; Routhier, François; Forget, Robert; Pineau, Joelle; Gourdeau, Richard; Demers, Louise
2015-05-01
To explore power wheelchair users', caregivers' and clinicians' perspectives regarding the potential impact of intelligent power wheelchair use on social participation. Semi-structured interviews were conducted with power wheelchair users (n = 12), caregivers (n = 4) and clinicians (n = 12). An illustrative video was used to facilitate discussion. The transcribed interviews were analyzed using thematic analysis. Three main themes were identified based on the experiences of the power wheelchair users, caregivers and clinicians: (1) increased social participation opportunities, (2) changing how social participation is experienced and (3) decreased risk of accidents during social participation. Findings from this study suggest that an intelligent power wheelchair would enhance social participation in a variety of important ways, thereby providing support for continued design and development of this assistive technology. An intelligent power wheelchair has the potential to: Increase social participation opportunities by overcoming challenges associated with navigating through crowds and small spaces. Change how social participation is experienced through "normalizing" social interactions and decreasing the effort required to drive a power wheelchair. Decrease the risk of accidents during social participation by reducing the need for dangerous compensatory strategies and minimizing the impact of the physical environment.
DyNAVacS: an integrative tool for optimized DNA vaccine design.
Harish, Nagarajan; Gupta, Rekha; Agarwal, Parul; Scaria, Vinod; Pillai, Beena
2006-07-01
DNA vaccines have slowly emerged as keystones in preventive immunology due to their versatility in inducing both cell-mediated as well as humoral immune responses. The design of an efficient DNA vaccine, involves choice of a suitable expression vector, ensuring optimal expression by codon optimization, engineering CpG motifs for enhancing immune responses and providing additional sequence signals for efficient translation. DyNAVacS is a web-based tool created for rapid and easy design of DNA vaccines. It follows a step-wise design flow, which guides the user through the various sequential steps in the design of the vaccine. Further, it allows restriction enzyme mapping, design of primers spanning user specified sequences and provides information regarding the vectors currently used for generation of DNA vaccines. The web version uses Apache HTTP server. The interface was written in HTML and utilizes the Common Gateway Interface scripts written in PERL for functionality. DyNAVacS is an integrated tool consisting of user-friendly programs, which require minimal information from the user. The software is available free of cost, as a web based application at URL: http://miracle.igib.res.in/dynavac/.
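Codon optimization, one of the steps DyNAVacS automates, amounts to back-translating the antigen's protein sequence using the expression host's preferred codons. The sketch below is illustrative only: the small codon table is an invented subset, not the tool's actual usage data, and a real optimizer would draw on full codon-usage frequency tables for the chosen host.

```python
# Illustrative (not exhaustive) table of preferred human codons; a real
# tool would use complete codon-usage frequencies for the chosen host.
PREFERRED = {"M": "ATG", "K": "AAG", "L": "CTG", "S": "AGC",
             "G": "GGC", "E": "GAG", "*": "TGA"}

def codon_optimize(protein):
    """Back-translate a protein sequence with the host's preferred codons."""
    return "".join(PREFERRED[aa] for aa in protein)

def gc_content(dna):
    """Fraction of G/C bases, a common sanity check on optimized sequences."""
    return sum(base in "GC" for base in dna) / len(dna)

seq = codon_optimize("MKLGE*")
```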
Brock, Douglas; Kim, Sara; Palmer, Odawni; Gallagher, Thomas; Holmboe, Eric
2013-01-01
Usability evaluation provides developers and educators with the means to understand user needs, improve overall product utility, and increase user satisfaction. The application of "discount usability" principles developed to make usability testing more practical and useful may improve user experience at minimal cost and require little existing expertise to conduct. We describe an application of discount usability to a high-fidelity online communications assessment application developed by the University of Washington for the American Board of Internal Medicine. Eight internal medicine physicians completed a discount usability test. Sessions were recorded and the videos analyzed for significant usability concerns. Concerns were identified, summarized, discussed, and prioritized by the authors in collaboration with the software developers before implementing any changes to the interface. Thirty-eight significant usability issues were detected and four technical problems were identified. Each issue was responded to through modification of the software, by providing additional instruction, or delayed for a later version to be developed. Discount usability can be easily implemented in academic developmental activities. Our study resulted in the discovery and remediation of significant user problems, in addition to giving important insight into the novel methods built into the application.
Yu, Kebing; Salomon, Arthur R
2009-12-01
Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through MS/MS. Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to various experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our high throughput autonomous proteomic pipeline used in the automated acquisition and post-acquisition analysis of proteomic data.
Airplane Mesh Development with Grid Density Studies
NASA Technical Reports Server (NTRS)
Cliff, Susan E.; Baker, Timothy J.; Thomas, Scott D.; Lawrence, Scott L.; Rimlinger, Mark J.
1999-01-01
Automatic Grid Generation Wish List: Geometry handling, including CAD clean-up and mesh generation, remains a major bottleneck in the application of CFD methods. There is a pressing need for greater automation in several aspects of geometry preparation in order to reduce set-up time and eliminate user intervention as much as possible. Starting from the CAD representation of a configuration, there may be holes or overlapping surfaces which require an intensive effort to establish cleanly abutting surface patches, and collections of many patches may need to be combined for more efficient use of the geometrical representation. Obtaining an accurate and suitable body-conforming grid with an adequate distribution of points throughout the flow field, for the flow conditions of interest, is often the most time-consuming task in complex CFD applications. There is a need for a clean, unambiguous definition of the CAD geometry. Ideally this would be carried out automatically by smart CAD clean-up software. One could also define a standard piecewise smooth surface representation suitable for use by computational methods and then create software to translate between the various CAD descriptions and the standard representation. Surface meshing remains a time-consuming, user-intensive procedure. There is a need for automated surface meshing, requiring only minimal user intervention to define the overall density of mesh points. The surface mesher should produce well-shaped elements (triangles or quadrilaterals) whose size is determined initially according to the surface curvature, with a minimum size for flat pieces, and later refined by the user in other regions if necessary. Present techniques for volume meshing all require some degree of user intervention. There is a need for fully automated and reliable volume mesh generation. In addition, it should be possible to create both surface and volume meshes that meet guaranteed measures of mesh quality (e.g. minimum and maximum angle, stretching ratios, etc.).
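Mesh-quality guarantees of the kind wished for above are usually phrased in terms of element angles and edge-length ratios. A minimal sketch of such a check for a single 2-D triangle follows; the specific metrics (min/max angle via the law of cosines, stretch as longest-to-shortest edge ratio) are common choices, not ones prescribed by the paper.

```python
import math

def triangle_quality(p1, p2, p3):
    """Return (min_angle_deg, max_angle_deg, stretch) for a 2-D triangle,
    where stretch is the longest-to-shortest edge-length ratio."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # Side lengths opposite each vertex.
    a, b, c = dist(p2, p3), dist(p1, p3), dist(p1, p2)
    def angle(opp, s1, s2):
        # Law of cosines: angle opposite side `opp`.
        return math.degrees(math.acos((s1 * s1 + s2 * s2 - opp * opp)
                                      / (2 * s1 * s2)))
    angles = [angle(a, b, c), angle(b, a, c), angle(c, a, b)]
    return min(angles), max(angles), max(a, b, c) / min(a, b, c)

# Equilateral triangle: all angles 60 degrees, stretch 1 (the ideal element).
mn, mx, stretch = triangle_quality((0, 0), (1, 0), (0.5, math.sqrt(3) / 2))
```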
23 CFR 630.1106 - Policy and procedures for work zone safety management.
Code of Federal Regulations, 2010 CFR
2010-04-01
... established in accordance with 23 CFR 630.1006, shall include the consideration and management of road user...; Exposure Control Measures to avoid or minimize worker exposure to motorized traffic and road user exposure... road users. (b) Agency processes, procedures, and/or guidance should be based on consideration of...
Smart HVAC Control in IoT: Energy Consumption Minimization with User Comfort Constraints
Serra, Jordi; Pubill, David; Antonopoulos, Angelos; Verikoukis, Christos
2014-01-01
Smart grid is one of the main applications of the Internet of Things (IoT) paradigm. Within this context, this paper addresses the efficient energy consumption management of heating, ventilation, and air conditioning (HVAC) systems in smart grids with variable energy price. To that end, first, we propose an energy scheduling method that minimizes the energy consumption cost for a particular time interval, taking into account the energy price and a set of comfort constraints, that is, a range of temperatures according to user's preferences for a given room. Then, we propose an energy scheduler where the user may select to relax the temperature constraints to save more energy. Moreover, thanks to the IoT paradigm, the user may interact remotely with the HVAC control system. In particular, the user may decide remotely the temperature of comfort, while the temperature and energy consumption information is sent through Internet and displayed at the end user's device. The proposed algorithms have been implemented in a real testbed, highlighting the potential gains that can be achieved in terms of both energy and cost. PMID:25054163
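The abstract does not give the scheduler's exact formulation. As a toy illustration, assume that holding a setpoint T against outdoor temperature O costs k·|T − O| energy units per slot; with a static comfort band, the cost-minimizing feasible setpoint is then simply the band edge nearest the outdoor temperature, and relaxing (widening) the band can only lower the cost. All constants below are invented for the sketch.

```python
def schedule_hvac(prices, outdoor, comfort=(20.0, 24.0), k=0.5):
    """Pick one setpoint per time slot inside the comfort band [lo, hi].
    Assumed thermal model: holding setpoint T against outdoor temperature O
    costs k * |T - O| energy units, so the cheapest feasible setpoint is the
    band edge nearest the outdoor temperature (or O itself if inside)."""
    lo, hi = comfort
    setpoints = [min(max(o, lo), hi) for o in outdoor]
    cost = sum(p * k * abs(t - o)
               for p, t, o in zip(prices, setpoints, outdoor))
    return setpoints, cost

# Three slots with varying energy price and outdoor temperature.
sp, cost = schedule_hvac(prices=[0.10, 0.30, 0.20],
                         outdoor=[30.0, 18.0, 22.0])
```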
Semantic-gap-oriented active learning for multilabel image annotation.
Tang, Jinhui; Zha, Zheng-Jun; Tao, Dacheng; Chua, Tat-Seng
2012-04-01
User interaction is an effective way to handle the semantic gap problem in image annotation. To minimize user effort in these interactions, many active learning methods have been proposed. These methods treat the semantic concepts individually or correlatively. However, they still neglect the key motivation for user feedback: to tackle the semantic gap. The size of the semantic gap of each concept is an important factor that affects the performance of user feedback. Users should devote more effort to concepts with large semantic gaps, and vice versa. In this paper, we propose a semantic-gap-oriented active learning method, which incorporates the semantic gap measure into the information-minimization-based sample selection strategy. The basic learning model used in the active learning framework is an extended multilabel version of the sparse-graph-based semisupervised learning method that incorporates semantic correlation. Extensive experiments conducted on two benchmark image data sets demonstrate the importance of bringing the semantic gap measure into the active learning process.
Nagy, Jennifer; Winslow, Amy; Brown, Jessica M; Adams, Lisa; O'Brien, Kathleen; Boninger, Michael; Nemunaitis, Gregory
2012-01-01
To assess the peak force during wheelchair propulsion of individuals with spinal cord injury propelling over obstacles from the Wheelchair Skills Test. Twenty-three individuals with spinal cord injury (SCI) who are full-time manual wheelchair users were included in this prospective study. A SmartWheel (Three Rivers Holdings, LLC) was used to analyze each push while subjects negotiated standardized obstacles used in the Wheelchair Skills Test, including tile, carpet, a soft surface, 5° and 10° ramps, and 2 cm, 5 cm, and 15 cm curbs. When the peak forces of the advanced skills were compared to those on level 10 m tile/10 m carpet, there was a statistically significant increase in all peak forces (P values ranged from .0001 to .0268). It is well documented that a large number of individuals with SCI develop upper limb pain. One of the recommendations to preserve the upper limb is to minimize force during repetitive tasks. Advanced wheelchair skills require increased force to accomplish: the increase in forces ranged from 18% to 130% over that required for level 10 m tile/10 m carpet.
Semiautomated Segmentation of Polycystic Kidneys in T2-Weighted MR Images.
Kline, Timothy L; Edwards, Marie E; Korfiatis, Panagiotis; Akkus, Zeynettin; Torres, Vicente E; Erickson, Bradley J
2016-09-01
The objective of the present study is to develop and validate a fast, accurate, and reproducible method that will increase and improve institutional measurement of total kidney volume and thereby avoid the higher costs, increased operator processing time, and inherent subjectivity associated with manual contour tracing. We developed a semiautomated segmentation approach, known as the minimal interaction rapid organ segmentation (MIROS) method, which results in human interaction during measurement of total kidney volume on MR images being reduced to a few minutes. This software tool automatically steps through slices and requires rough definition of kidney boundaries supplied by the user. The approach was verified on T2-weighted MR images of 40 patients with autosomal dominant polycystic kidney disease of varying degrees of severity. The MIROS approach required less than 5 minutes of user interaction in all cases. When compared with the ground-truth reference standard, MIROS showed no significant bias and had low variability (mean ± 2 SD, 0.19% ± 6.96%). The MIROS method will greatly facilitate future research studies in which accurate and reproducible measurements of cystic organ volumes are needed.
CAVE3: A general transient heat transfer computer code utilizing eigenvectors and eigenvalues
NASA Technical Reports Server (NTRS)
Palmieri, J. V.; Rathjen, K. A.
1978-01-01
The method of solution is a hybrid analytical-numerical technique which utilizes eigenvalues and eigenvectors. The method is inherently stable, permitting large time steps even with the best of conductors and the finest of mesh sizes, which can provide a factor of five reduction in machine time compared to conventional explicit finite difference methods when structures with small time constants are analyzed over long time periods. This code will find utility in analyzing hypersonic missile and aircraft structures, which fall naturally into this class. The code is completely general in that problems involving any geometry, boundary conditions, and materials can be analyzed. This is made possible by requiring the user to establish the thermal network conductances between nodes. Dynamic storage allocation is used to minimize core storage requirements. This report is primarily a user's manual for the CAVE3 code. Input and output formats are presented and explained. Sample problems are included which illustrate the usage of the code as well as establish the validity and accuracy of the method.
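The stability property described above comes from solving the network equations exactly in the eigenbasis: for a linear thermal network dT/dt = A·T, the solution T(t) = V·exp(Λt)·V⁻¹·T(0) is exact for any step size, so stiff (small-time-constant) networks do not force tiny time steps the way explicit finite differences do. The two-node conduction network below is an invented example, not one of the report's sample problems.

```python
import numpy as np

# Two-node conduction network: dT/dt = A T, where A encodes the user-supplied
# conductances divided by nodal capacitances (invented values).
A = np.array([[-2.0,  2.0],
              [ 2.0, -2.0]])
lam, V = np.linalg.eigh(A)          # eigenvalues Λ and eigenvectors V

def temps(T0, t):
    """Exact solution T(t) = V exp(Λ t) V^{-1} T(0), stable for any t."""
    return V @ (np.exp(lam * t) * (np.linalg.inv(V) @ T0))

T0 = np.array([100.0, 0.0])
T_late = temps(T0, 10.0)            # both nodes relax toward the mean, 50
```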
Controlling a multi-degree of freedom upper limb prosthesis using foot controls: user experience.
Resnik, Linda; Klinger, Shana Lieberman; Etter, Katherine; Fantini, Christopher
2014-07-01
The DEKA Arm, a pre-commercial upper limb prosthesis funded by the DARPA Revolutionizing Prosthetics Program, offers increased degrees of freedom while requiring a large number of user control inputs to operate. To address this challenge, DEKA developed prototype foot controls. Although the concept of utilizing foot controls to operate an upper limb prosthesis has been discussed for decades, only small studies have been performed and no commercial product exists. The purpose of this paper is to report amputee user perspectives on using three different iterations of foot controls to operate the DEKA Arm. Qualitative data were collected from 36 subjects as part of the Department of Veterans Affairs (VA) Study to Optimize the DEKA Arm through surveys, interviews, audio memos, and videotaped sessions. Three major, interrelated themes were identified using the constant comparative method: attitudes towards foot controls, psychomotor learning, and the physical experience of using foot controls. Feedback about foot controls was generally positive for all iterations. The final version of foot controls was viewed most favorably. Our findings indicate that foot controls are a viable control option that can enable control of a multifunction upper limb prosthesis (the DEKA Arm). Multifunction upper limb prostheses require many user control inputs to operate. Foot controls offer additional control input options for such advanced devices, yet have had minimal study. This study found that foot controls were a viable option for controlling multifunction upper limb prostheses. Most of the 36 subjects in this study were willing to adopt foot controls to control the multiple degrees of freedom of the DEKA Arm. With training and practice, all users were able to develop the psychomotor skills needed to successfully operate foot controls. Some had initial difficulty, but acclimated over time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reynolds, John; Jankovsky, Zachary; Metzroth, Kyle G
2018-04-04
The purpose of the ADAPT code is to generate Dynamic Event Trees (DET) using a user-specified set of simulators. ADAPT can utilize any simulation tool which meets a minimal set of requirements. ADAPT is based on the concept of the DET, which uses explicit modeling of the deterministic dynamic processes that take place during a nuclear reactor plant system (or other complex system) evolution along with stochastic modeling. When DETs are used to model various aspects of Probabilistic Risk Assessment (PRA), all accident progression scenarios starting from an initiating event are considered simultaneously. The DET branching occurs at user-specified times and/or when an action is required by the system and/or the operator. These outcomes then decide how the dynamic system variables will evolve in time for each DET branch. Since two different outcomes at a DET branching may lead to completely different paths for system evolution, the next branching for these paths may occur not only at separate times, but can be based on different branching criteria. The computational infrastructure allows for flexibility in ADAPT to link with different system simulation codes, parallel processing of the scenarios under consideration, on-line scenario management (initiation as well as termination), analysis of results, and user-friendly graphical capabilities. The ADAPT system is designed for a distributed computing environment; the scheduler can track multiple concurrent branches simultaneously. The scheduler is modularized so that the DET branching strategy can be modified (e.g., biasing towards the worst-case scenario/event). Independent database systems store data from the simulation tasks and the DET structure so that the event tree can be constructed and analyzed later. ADAPT is provided with a user-friendly client which can easily sort through and display the results of an experiment, precluding the need for the user to manually inspect individual simulator runs.
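The DET expansion loop itself is conceptually simple: run each branch until a branching condition, then spawn one child per stochastic outcome, carrying the accumulated probability. The sketch below is a toy illustration of that idea, not the ADAPT scheduler; the valve example and the probability cutoff are assumptions.

```python
from collections import deque

def expand_det(init_state, simulate, branch_rules, prob_cutoff=1e-6):
    """Breadth-first dynamic event tree expansion (toy sketch, not ADAPT).
    simulate(state) advances a branch to its next branching condition;
    branch_rules(state) yields (child_state, probability) pairs, or nothing
    for a terminal state. Low-probability branches are pruned, standing in
    for a scheduler's biasing/termination strategy."""
    leaves, queue = [], deque([(init_state, 1.0)])
    while queue:
        state, prob = queue.popleft()
        state = simulate(state)
        children = list(branch_rules(state))
        if not children:
            leaves.append((state, prob))  # scenario end point
            continue
        for child, p in children:
            if prob * p >= prob_cutoff:
                queue.append((child, prob * p))
    return leaves

# Assumed demo: a valve is demanded at t=1 and t=2 and fails with prob 0.1.
def simulate(state):
    t, failed = state
    return (t + 1, failed)

def branch_rules(state):
    t, failed = state
    if failed or t > 2:
        return []
    return [((t, False), 0.9), ((t, True), 0.1)]

leaves = expand_det((0, False), simulate, branch_rules)
```

The scenario probabilities at the leaves sum to one, mirroring how all accident progressions from one initiating event are tracked simultaneously.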
TELMA: Technology-enhanced learning environment for minimally invasive surgery.
Sánchez-González, Patricia; Burgos, Daniel; Oropesa, Ignacio; Romero, Vicente; Albacete, Antonio; Sánchez-Peralta, Luisa F; Noguera, José F; Sánchez-Margallo, Francisco M; Gómez, Enrique J
2013-06-01
Cognitive skills training for minimally invasive surgery has traditionally relied upon diverse tools, such as seminars or lectures. Web technologies for e-learning have been adopted to provide ubiquitous training and serve as structured repositories for the vast amount of laparoscopic video sources available. However, these technologies fail to offer such features as formative and summative evaluation, guided learning, or collaborative interaction between users. The "TELMA" environment is presented as a new technology-enhanced learning platform that enhances the user's experience through a four-pillared architecture: (1) an authoring tool for the creation of didactic contents; (2) a learning content and knowledge management system that incorporates a modular and scalable system to capture, catalogue, search, and retrieve multimedia content; (3) an evaluation module that provides learning feedback to users; and (4) a professional network for collaborative learning between users. Face validation of the environment and the authoring tool are presented. Face validation of TELMA reveals the positive perception of surgeons regarding the implementation of TELMA and their willingness to use it as a cognitive skills training tool. Preliminary validation data also reflect the importance of providing an easy-to-use, functional authoring tool to create didactic content. The TELMA environment is currently installed and used at the Jesús Usón Minimally Invasive Surgery Centre and several other Spanish hospitals. Face validation results ascertain the acceptance and usefulness of this new minimally invasive surgery training environment. Copyright © 2013 Elsevier Inc. All rights reserved.
The 30/20 GHz mixed user architecture development study
NASA Technical Reports Server (NTRS)
1979-01-01
A mixed-user system is described which provides cost-effective communications services to a wide range of user terminal classes, ranging from support of one or two voice channels in a direct-to-user mode to support of multiple 500 Mbps trunking channels. Advanced satellite capabilities are utilized to minimize the cost of small terminals. In a system with thousands of small terminals, this approach results in minimum system cost.
Atalağ, Koray; Bilgen, Semih; Gür, Gürden; Boyacioğlu, Sedat
2007-09-01
There are very few evaluation studies of the Minimal Standard Terminology for Digestive Endoscopy. This study aims to evaluate the usage of the Turkish translation of the Minimal Standard Terminology by developing an endoscopic information system. After elicitation of requirements, database modeling and software development were performed. Minimal Standard Terminology driven forms were designed for rapid data entry. The endoscopic report was rapidly created by applying basic Turkish syntax and grammar rules. Entering free text and editing of the final report were also possible. After three years of live usage, data analysis was performed and the results were evaluated. The system has been used for reporting of all endoscopic examinations. 15,638 valid records were analyzed, including 11,381 esophagogastroduodenoscopies, 2,616 colonoscopies, 1,079 rectoscopies and 562 endoscopic retrograde cholangiopancreatographies. In accordance with previous validation studies, the overall usage of Minimal Standard Terminology terms was very high: 85% for examination characteristics, 94% for endoscopic findings and 94% for endoscopic diagnoses. Some new terms, attributes and allowed values were also added for better clinical coverage. Minimal Standard Terminology has been shown to cover a high proportion of routine endoscopy reports. Good user acceptance proves that both the terms and structure of Minimal Standard Terminology are consistent with usual clinical thinking. However, future work on Minimal Standard Terminology is mandatory for better coverage of endoscopic retrograde cholangiopancreatography examinations. Technically, new software development methodologies have to be sought to lower the cost of development and the maintenance phase. They should also address integration and interoperability of disparate information systems.
The effects of oral d-amphetamine on impulsivity in smoked and intranasal cocaine users.
Reed, Stephanie Collins; Evans, Suzette M
2016-06-01
Effective treatments for cocaine use disorders remain elusive. Two factors that may be related to treatment failures are route of cocaine used and impulsivity. Smoked cocaine users are more likely to have poorer treatment outcomes compared to intranasal cocaine users. Further, cocaine users are impulsive and impulsivity is associated with poor treatment outcomes. While stimulants are used to treat Attention Deficit Hyperactivity Disorder (ADHD) and attenuate certain cocaine-related behaviors, few studies have comprehensively examined whether stimulants can reduce behavioral impulsivity in cocaine users, and none examined route of cocaine use as a factor. The effects of immediate release oral d-amphetamine (AMPH) were examined in 34 cocaine users (13 intranasal, 21 smoked). Participants had three separate sessions where they were administered AMPH (0, 10, or 20 mg) and completed behavioral measures of impulsivity and risk-taking and subjective measures of abuse liability. Smoked cocaine users were more impulsive on the Delayed Memory Task, the GoStop task and the Delay Discounting Task than intranasal cocaine users. Smoked cocaine users also reported more cocaine craving and negative mood than intranasal cocaine users. AMPH produced minimal increases on measures of abuse liability (e.g., Drug Liking). Smoked cocaine users were more impulsive than intranasal cocaine users on measures of impulsivity that had a delay component. Additionally, although AMPH failed to attenuate impulsive responding, there was minimal evidence of abuse liability in cocaine users. These preliminary findings need to be confirmed in larger samples that control for route and duration of cocaine use. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Auction-based bandwidth allocation in the Internet
NASA Astrophysics Data System (ADS)
Wei, Jiaolong; Zhang, Chi
2002-07-01
It has been widely accepted that auctioning, the pricing approach with minimal information requirements, is a proper tool for managing scarce network resources. Previous works focus on the Vickrey auction, which is incentive compatible in classic auction theory. This paper first discusses the faults of the most representative auction-based mechanisms. It then proposes a new method called uniform-price auction (UPA), which has the simplest auction rule, and proves its incentive compatibility in the network environment. Finally, the basic model is extended to support applications that require minimum bandwidth guarantees for a given time period by introducing a derivative market, completing a market mechanism for network resource allocation that is predictable, riskless, and simple for end-users.
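A uniform-price auction clears all accepted bids at a single market price. The sketch below is one common formulation (winners pay the price of the last accepted bid); the paper's exact rule and tie-breaking may differ.

```python
def uniform_price_auction(bids, capacity):
    """Uniform-price bandwidth auction (one common formulation; the
    paper's exact rule may differ). bids: list of (user, price, quantity).
    Highest bids are served first, and every winner pays the same
    clearing price, here the price of the last accepted bid."""
    allocation, clearing_price = {}, 0.0
    for user, price, qty in sorted(bids, key=lambda b: -b[1]):
        if capacity <= 0:
            break
        granted = min(qty, capacity)
        allocation[user] = granted
        capacity -= granted
        clearing_price = price
    return allocation, clearing_price

# Three users bid for 80 units of bandwidth (illustrative numbers).
alloc, price = uniform_price_auction(
    [("a", 5.0, 40), ("b", 3.0, 50), ("c", 2.0, 30)], capacity=80)
```

Because no winner's payment depends on their own bid level (only on the marginal accepted bid), users have little to gain from shading bids, which is the intuition behind the incentive-compatibility claim.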
OpenMM 7: Rapid development of high performance algorithms for molecular dynamics
Swails, Jason; Zhao, Yutong; Beauchamp, Kyle A.; Wang, Lee-Ping; Stern, Chaya D.; Brooks, Bernard R.; Pande, Vijay S.
2017-01-01
OpenMM is a molecular dynamics simulation toolkit with a unique focus on extensibility. It allows users to easily add new features, including forces with novel functional forms, new integration algorithms, and new simulation protocols. Those features automatically work on all supported hardware types (including both CPUs and GPUs) and perform well on all of them. In many cases they require minimal coding, just a mathematical description of the desired function. They also require no modification to OpenMM itself and can be distributed independently of OpenMM. This makes it an ideal tool for researchers developing new simulation methods, and also allows those new methods to be immediately available to the larger community. PMID:28746339
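The "just a mathematical description" idea can be illustrated outside OpenMM: accept an energy expression as a plain string and derive the force numerically. This pure-Python sketch only mimics the concept (OpenMM compiles such expressions for CPUs and GPUs); the helper name and parameters are illustrative assumptions.

```python
import math

def custom_force(expression, **params):
    """Sketch of a 'custom force from a math expression': the user supplies
    only an energy expression in r (a string), and the force is obtained by
    numerical differentiation. Mimics the concept behind OpenMM's custom
    forces; not OpenMM code, and parameter names are assumptions."""
    def energy(r):
        names = dict(params, r=r, exp=math.exp, sqrt=math.sqrt)
        return eval(expression, {"__builtins__": {}}, names)
    def force(r, h=1e-6):
        return -(energy(r + h) - energy(r - h)) / (2.0 * h)  # F = -dE/dr
    return energy, force

# Lennard-Jones potential written as a plain string.
energy, force = custom_force("4*eps*((sig/r)**12 - (sig/r)**6)",
                             eps=1.0, sig=1.0)
```

At the potential minimum r = 2^(1/6)·sigma the energy is -eps and the force vanishes, which makes a convenient sanity check for any expression-driven force.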
A Vision and Roadmap for Increasing User Autonomy in Flight Operations in the National Airspace
NASA Technical Reports Server (NTRS)
Cotton, William B.; Hilb, Robert; Koczo, Stefan; Wing, David
2016-01-01
The purpose of Air Transportation is to move people and cargo safely, efficiently and swiftly to their destinations. The companies and individuals who use aircraft for this purpose, the airspace users, desire to operate their aircraft according to a dynamically optimized business trajectory for their specific mission and operational business model. In current operations, the dynamic optimization of business trajectories is limited by constraints built into operations in the National Airspace System (NAS) for reasons of safety and operational needs of the air navigation service providers. NASA has been developing and testing means to overcome many of these constraints and permit operations to be conducted closer to the airspace user's changing business trajectory as conditions unfold before and during the flight. A roadmap of logical steps progressing toward increased user autonomy is proposed, beginning with NASA's Traffic Aware Strategic Aircrew Requests (TASAR) concept that enables flight crews to make informed, deconflicted flight-optimization requests to air traffic control. These steps include the use of data communications for route change requests and approvals, integration with time-based arrival flow management processes under development by the Federal Aviation Administration (FAA), increased user authority for defining and modifying downstream, strategic portions of the trajectory, and ultimately application of self-separation. This progression takes advantage of existing FAA NextGen programs and RTCA standards development, and it is designed to minimize the number of hardware upgrades required of airspace users to take advantage of these advanced capabilities to achieve dynamically optimized business trajectories in NAS operations. The roadmap is designed to provide operational benefits to first adopters so that investment decisions do not depend upon a large segment of the user community becoming equipped before benefits can be realized. 
The issues of equipment certification and operational approval of new procedures are addressed in a way that minimizes their impact on the transition by deferring a change in the assignment of separation responsibility until a large body of operational data is available to support the safety case for this change in the last roadmap step. This paper relates the roadmap steps to ongoing activities to clarify the economics-based transition to these technologies for operational use.
Smart Bandwidth Assignation in an Underlay Cellular Network for Internet of Vehicles.
de la Iglesia, Idoia; Hernandez-Jayo, Unai; Osaba, Eneko; Carballedo, Roberto
2017-09-27
The evolution of the IoT (Internet of Things) paradigm applied to new scenarios as VANETs (Vehicular Ad Hoc Networks) has gained momentum in recent years. Both academia and industry have triggered advanced studies in the IoV (Internet of Vehicles), which is understood as an ecosystem where different types of users (vehicles, elements of the infrastructure, pedestrians) are connected. How to efficiently share the available radio resources among the different types of eligible users is one of the important issues to be addressed. This paper briefly analyzes various concepts presented hitherto in the literature and it proposes an enhanced algorithm for ensuring a robust co-existence of the aforementioned system users. Therefore, this paper introduces an underlay RRM (Radio Resource Management) methodology which is capable of (1) improving cellular spectral efficiency while making a minimal impact on cellular communications and (2) ensuring the different QoS (Quality of Service) requirements of ITS (Intelligent Transportation Systems) applications. Simulation results, where we compare the proposed algorithm to the other two RRM, show the promising spectral efficiency performance of the proposed RRM methodology.
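The heart of any underlay RRM scheme is an admission test: a secondary (vehicular) link may reuse a cellular resource only if the primary user's SINR stays above its QoS threshold. The function and link-budget numbers below are illustrative assumptions, not the paper's algorithm.

```python
import math

def admit_underlay(p_cell, g_cell, p_v2v, g_interf, noise_w, sinr_min_db):
    """Underlay admission test (illustrative sketch): grant the vehicular
    link the cellular resource block only if the cellular user's SINR,
    including the added V2V interference, meets its QoS threshold.
    Powers and gains are linear; the threshold is in dB."""
    sinr = (p_cell * g_cell) / (noise_w + p_v2v * g_interf)
    return 10.0 * math.log10(sinr) >= sinr_min_db

# Assumed link budget: a 0.1 W V2V transmitter barely perturbs the
# cellular link, so reuse is admitted.
ok = admit_underlay(1.0, 1e-6, 0.1, 1e-9, 1e-9, 20.0)
```

Raising the V2V power in the same budget pushes the cellular SINR below 20 dB and the reuse is refused, which is exactly the "minimal impact on cellular communications" constraint.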
A method for real-time generation of augmented reality work instructions via expert movements
NASA Astrophysics Data System (ADS)
Bhattacharya, Bhaskar; Winer, Eliot
2015-03-01
Augmented Reality (AR) offers tremendous potential for a wide range of fields including entertainment, medicine, and engineering. AR allows digital models to be integrated with a real scene (typically viewed through a video camera) to provide useful information in a variety of contexts. The difficulty in authoring and modifying scenes is one of the biggest obstacles to widespread adoption of AR. 3D models must be created, textured, oriented and positioned to create the complex overlays viewed by a user. This often requires using multiple software packages in addition to performing model format conversions. In this paper, a new authoring tool is presented which uses a novel method to capture product assembly steps performed by a user with a depth+RGB camera. Through a combination of computer vision and image processing techniques, each individual step is decomposed into objects and actions. The objects are matched to those in a predetermined geometry library and the actions turned into animated assembly steps. The subsequent instruction set is then generated with minimal user input. A proof of concept is presented to establish the method's viability.
Support for Debugging Automatically Parallelized Programs
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Hood, Robert; Biegel, Bryan (Technical Monitor)
2001-01-01
We describe a system that simplifies the process of debugging programs produced by computer-aided parallelization tools. The system uses relative debugging techniques to compare serial and parallel executions in order to show where the computations begin to differ. If the original serial code is correct, errors due to parallelization will be isolated by the comparison. One of the primary goals of the system is to minimize the effort required of the user. To that end, the debugging system uses information produced by the parallelization tool to drive the comparison process. In particular the debugging system relies on the parallelization tool to provide information about where variables may have been modified and how arrays are distributed across multiple processes. User effort is also reduced through the use of dynamic instrumentation. This allows us to modify the program execution without changing the way the user builds the executable. The use of dynamic instrumentation also permits us to compare the executions in a fine-grained fashion and only involve the debugger when a difference has been detected. This reduces the overhead of executing instrumentation.
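The comparison process can be pictured as walking matched checkpoints of the serial and parallel runs and stopping at the first variable that diverges. A minimal sketch (the traces and tolerance are assumptions; the real system compares distributed arrays via dynamic instrumentation):

```python
def first_divergence(serial_trace, parallel_trace, tol=1e-9):
    """Core of a relative-debugging comparison (sketch): walk matched
    checkpoints of the serial and parallel runs and report the first
    checkpoint and variable where the computations begin to differ."""
    for i, (s, p) in enumerate(zip(serial_trace, parallel_trace)):
        for name, val in s.items():
            if name not in p or abs(val - p[name]) > tol:
                return i, name
    return None

# Assumed per-checkpoint snapshots of one variable.
serial   = [{"x": 1.0}, {"x": 2.0}, {"x": 4.0}]
parallel = [{"x": 1.0}, {"x": 2.0}, {"x": 4.5}]  # parallelization bug surfaces
where = first_divergence(serial, parallel)
```

If the serial run is trusted, the reported checkpoint localizes the parallelization error without the user stepping through either execution by hand.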
DIA2: Web-based Cyberinfrastructure for Visual Analysis of Funding Portfolios.
Madhavan, Krishna; Elmqvist, Niklas; Vorvoreanu, Mihaela; Chen, Xin; Wong, Yuetling; Xian, Hanjun; Dong, Zhihua; Johri, Aditya
2014-12-01
We present a design study of the Deep Insights Anywhere, Anytime (DIA2) platform, a web-based visual analytics system that allows program managers and academic staff at the U.S. National Science Foundation to search, view, and analyze their research funding portfolio. The goal of this system is to facilitate users' understanding of both past and currently active research awards in order to make more informed decisions about future funding. This user group is characterized by high domain expertise yet not necessarily high literacy in visualization and visual analytics (they are essentially casual experts), and thus requires careful visual and information design, including adhering to user experience standards, providing a self-instructive interface, and progressively refining visualizations to minimize complexity. We discuss the challenges of designing a system for casual experts and highlight how we addressed this issue by modeling the organizational structure and workflows of the NSF within our system. We discuss each stage of the design process, starting with formative interviews and prototypes, and finally live deployments and evaluation with stakeholders.
An intelligent multi-media human-computer dialogue system
NASA Technical Reports Server (NTRS)
Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.
1988-01-01
Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.
Relative Debugging of Automatically Parallelized Programs
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Hood, Robert; Biegel, Bryan (Technical Monitor)
2002-01-01
We describe a system that simplifies the process of debugging programs produced by computer-aided parallelization tools. The system uses relative debugging techniques to compare serial and parallel executions in order to show where the computations begin to differ. If the original serial code is correct, errors due to parallelization will be isolated by the comparison. One of the primary goals of the system is to minimize the effort required of the user. To that end, the debugging system uses information produced by the parallelization tool to drive the comparison process. In particular, the debugging system relies on the parallelization tool to provide information about where variables may have been modified and how arrays are distributed across multiple processes. User effort is also reduced through the use of dynamic instrumentation. This allows us to modify the program execution without changing the way the user builds the executable. The use of dynamic instrumentation also permits us to compare the executions in a fine-grained fashion and only involve the debugger when a difference has been detected. This reduces the overhead of executing instrumentation.
Robust model-based 3D/3D fusion using sparse matching for minimally invasive surgery.
Neumann, Dominik; Grbic, Sasa; John, Matthias; Navab, Nassir; Hornegger, Joachim; Ionasec, Razvan
2013-01-01
Classical surgery is being disrupted by minimally invasive and transcatheter procedures. As there is no direct view or access to the affected anatomy, advanced imaging techniques such as 3D C-arm CT and C-arm fluoroscopy are routinely used for intra-operative guidance. However, intra-operative modalities have limited image quality of the soft tissue and a reliable assessment of the cardiac anatomy can only be made by injecting contrast agent, which is harmful to the patient and requires complex acquisition protocols. We propose a novel sparse matching approach for fusing high quality pre-operative CT and non-contrasted, non-gated intra-operative C-arm CT by utilizing robust machine learning and numerical optimization techniques. Thus, high-quality patient-specific models can be extracted from the pre-operative CT and mapped to the intra-operative imaging environment to guide minimally invasive procedures. Extensive quantitative experiments demonstrate that our model-based fusion approach has an average execution time of 2.9 s, while the accuracy lies within expert user confidence intervals.
Integrated tools for control-system analysis
NASA Technical Reports Server (NTRS)
Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.
1989-01-01
The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
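As a flavor of the frequency-response evaluations listed, a single Bode point can be computed directly from transfer-function polynomial coefficients. This stdlib sketch is generic, not MATRIXx functionality:

```python
import cmath
import math

def bode_point(num, den, w):
    """Bode magnitude (dB) and phase (degrees) at frequency w (rad/s) for
    a transfer function given by numerator/denominator polynomial
    coefficients, highest power first."""
    s = 1j * w
    poly = lambda c: sum(ck * s ** k for k, ck in enumerate(reversed(c)))
    val = poly(num) / poly(den)
    return 20.0 * math.log10(abs(val)), math.degrees(cmath.phase(val))

# First-order lag G(s) = 1/(s + 1): -3 dB and -45 degrees at the corner.
mag_db, phase_deg = bode_point([1.0], [1.0, 1.0], w=1.0)
```

Sweeping `w` over a logarithmic grid yields the full Bode plot; closed-loop eigenvalues would come from the denominator's roots instead.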
NASA Technical Reports Server (NTRS)
Griffin, Timothy P.; Naylor, Guy R.; Haskell, William D.; Breznik, Greg S.; Mizell, Carolyn A.; Helms, William R.; Voska, N. (Technical Monitor)
2002-01-01
An on-line gas monitoring system was developed to replace the older systems used to monitor for cryogenic leaks on the Space Shuttles before launch. The system uses a mass spectrometer to monitor multiple locations in the process, which allows the system to monitor all gas constituents of interest in a nearly simultaneous manner. The system is fully redundant and meets all requirements for ground support equipment (GSE). This includes ruggedness to withstand launch on the Mobile Launcher Platform (MLP), ease of operation, and minimal operator intervention. The system can be fully automated so that an operator is notified when an unusual situation or fault is detected. User inputs are made through a personal computer using mouse and keyboard commands via a graphical user interface. Although developed for detecting cryogenic leaks, many other gas constituents could be monitored using the Hazardous Gas Detection System (HGDS) 2000.
Improving the accuracy of burn-surface estimation.
Nichter, L S; Williams, J; Bryant, C A; Edlich, R F
1985-09-01
A user-friendly computer-assisted method of calculating total body surface area burned (TBSAB) has been developed. This method is more accurate, faster, and subject to less error than conventional methods. For comparison, the ability of 30 physicians to estimate TBSAB was tested. Parameters studied included the effect of prior burn care experience, the influence of burn size, the ability to accurately sketch the size of burns on standard burn charts, and the ability to estimate percent TBSAB from the sketches. Despite the ability of physicians at all levels of training to accurately sketch TBSAB, significant burn size overestimation (p < 0.01) and large interrater variability of potential consequence were noted. The direct benefits of a computerized system are many. These include the need for minimal user experience and the ability for wound-trend analysis, permanent record storage, calculation of fluid and caloric requirements, hemodynamic parameters, and the ability to compare meaningfully the different treatment protocols.
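One of the listed benefits, calculation of fluid requirements from TBSAB, is commonly done with the standard Parkland formula. The sketch below uses that well-known formula as an example; the abstract does not specify which formula its software applied.

```python
def parkland_fluid_ml(weight_kg, tbsa_percent):
    """Parkland formula (a standard burn-resuscitation rule, used here as
    an example; not necessarily the abstract's method): 4 mL x body weight
    (kg) x %TBSA of crystalloid over the first 24 h, with half of the
    total given in the first 8 h."""
    total_24h = 4.0 * weight_kg * tbsa_percent
    return total_24h, total_24h / 2.0

# A 70 kg patient with 30% TBSAB: 8400 mL over 24 h, 4200 mL in 8 h.
total_24h, first_8h = parkland_fluid_ml(weight_kg=70.0, tbsa_percent=30.0)
```

Because the formula is linear in %TBSA, the overestimation the study measured propagates directly into overestimated fluid volumes, which is one practical argument for the computerized TBSAB calculation.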
Development of a Computer Writing System Based on EOG
López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian
2017-01-01
The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders. PMID:28672863
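A writing interface of this kind ultimately needs a classifier that maps processed EOG samples to discrete commands. The toy thresholding sketch below illustrates the idea only; the paper's actual processing (noise minimization, signal classification) is more involved, and the threshold value here is an assumption.

```python
from statistics import mean

def classify_eog(window_uv, thresh_uv=200.0):
    """Toy EOG direction classifier (illustrative sketch): average a
    horizontal-channel window (microvolts) and threshold it into
    left/right/rest. Real systems add filtering and per-user calibration;
    the 200 uV threshold is an assumption."""
    level = mean(window_uv)
    if level > thresh_uv:
        return "right"
    if level < -thresh_uv:
        return "left"
    return "rest"

gesture = classify_eog([250.0, 310.0, 280.0, 240.0])
```

A sequence of such gestures can then drive cursor movement over an on-screen keyboard, which is the interaction model the writing system builds on.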
BioMot exoskeleton - Towards a smart wearable robot for symbiotic human-robot interaction.
Bacek, Tomislav; Moltedo, Marta; Langlois, Kevin; Prieto, Guillermo Asin; Sanchez-Villamanan, Maria Carmen; Gonzalez-Vargas, Jose; Vanderborght, Bram; Lefeber, Dirk; Moreno, Juan C
2017-07-01
This paper presents the design of a novel modular lower-limb gait exoskeleton built within the FP7 BioMot project. The exoskeleton employs a variable stiffness actuator in all 6 joints, a directional-flexibility structure, and a novel physical human-robot interface, which allows it to deliver the required output while minimally constraining the user's gait by providing passive degrees of freedom. Owing to its modularity, the exoskeleton can be used as a full lower-limb orthosis, a single-joint orthosis at any of the three joints, or a two-joint orthosis at any combination of two joints. By employing a simple torque control strategy, the exoskeleton can deliver user-specific assistance, both in gait rehabilitation and in assisting people suffering from musculoskeletal impairments. The result of the presented BioMot efforts is a low-footprint exoskeleton with powerful compliant actuators, a simple yet effective torque controller, and an easily adjustable flexible structure.
Development of a Computer Writing System Based on EOG.
López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian
2017-06-26
The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.
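Subsystem (2) can be roughly illustrated in code. The sketch below is not the system's actual algorithm: the thresholds and the deliberately simple mean-amplitude classifier are invented here, only to show the smooth-then-classify pattern.

```python
# Rough illustration of subsystem (2): smooth the raw EOG samples to reduce
# noise, then classify a window by mean amplitude against fixed thresholds.
# The thresholds and the classifier are invented for this sketch.

def moving_average(samples, width=5):
    """Simple noise reduction: replace each sample by a local average."""
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - width // 2), min(len(samples), i + width // 2 + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def classify(window, threshold=50.0):
    """Map a smoothed EOG window (microvolts) to a gaze command."""
    mean = sum(window) / len(window)
    if mean > threshold:
        return "right"
    if mean < -threshold:
        return "left"
    return "rest"

# A noisy rightward saccade: amplitude about 80 uV with +/-10 uV ripple.
noisy_right = [80.0 + ((-1) ** i) * 10.0 for i in range(20)]
print(classify(moving_average(noisy_right)))  # right
```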
NASA Astrophysics Data System (ADS)
Fragkoulis, Alexandros; Kondi, Lisimachos P.; Parsopoulos, Konstantinos E.
2015-03-01
We propose a method for the fair and efficient allocation of wireless resources over a cognitive radio system network to transmit multiple scalable video streams to multiple users. The method exploits the dynamic architecture of the Scalable Video Coding extension of the H.264 standard, along with the diversity that OFDMA networks provide. We use a game-theoretic Nash Bargaining Solution (NBS) framework to ensure that each user receives the minimum video quality requirements, while maintaining fairness over the cognitive radio system. An optimization problem is formulated, where the objective is the maximization of the Nash product while minimizing the waste of resources. The problem is solved by using a Swarm Intelligence optimizer, namely Particle Swarm Optimization. Due to the high dimensionality of the problem, we also introduce a dimension-reduction technique. Our experimental results demonstrate the fairness imposed by the employed NBS framework.
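A minimal sketch of the optimization step, under toy assumptions (linear per-user utilities and a single normalized resource budget, neither taken from the paper): maximize the Nash product of utilities above each user's minimum requirement with a bare-bones Particle Swarm Optimization loop.

```python
import random

# Toy sketch of the allocation step (assumed details, not the paper's model):
# split a normalized resource budget among users to maximize the Nash product
# of surpluses above each user's minimum quality requirement, via basic PSO.

def nash_product(x, rates, q_min):
    """Product of per-user surpluses; -inf if any user is below minimum."""
    surplus = [r * xi - qm for r, xi, qm in zip(rates, x, q_min)]
    if any(s <= 0.0 for s in surplus):
        return float("-inf")
    prod = 1.0
    for s in surplus:
        prod *= s
    return prod

def pso(objective, dim, iters=200, swarm=30, seed=0):
    rng = random.Random(seed)

    def normalize(x):
        # Keep allocations non-negative and summing to 1.
        total = sum(abs(v) for v in x) or 1.0
        return [abs(v) / total for v in x]

    pos = [normalize([rng.random() for _ in range(dim)]) for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    best = [p[:] for p in pos]
    best_val = [objective(p) for p in pos]
    g = max(range(swarm), key=lambda i: best_val[i])
    gbest, gbest_val = best[g][:], best_val[g]

    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.4 * rng.random() * (best[i][d] - pos[i][d])
                             + 1.4 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            pos[i] = normalize(pos[i])
            val = objective(pos[i])
            if val > best_val[i]:
                best[i], best_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Two users: rates are per-unit video qualities, q_min the minimum qualities.
rates, q_min = [2.0, 1.0], [0.2, 0.2]
alloc, value = pso(lambda x: nash_product(x, rates, q_min), dim=2)
# Analytically, the optimum here is alloc[0] = 0.45 with Nash product 0.245.
```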
ConfocalGN: A minimalistic confocal image generator
NASA Astrophysics Data System (ADS)
Dmitrieff, Serge; Nédélec, François
Validating image analysis pipelines and training machine-learning segmentation algorithms require images with known features. Synthetic images can be used for this purpose, with the advantage that large reference sets can be produced easily. It is, however, essential to obtain images that are as realistic as possible in terms of noise and resolution, which is challenging in the field of microscopy. We describe ConfocalGN, a user-friendly software package that can generate synthetic microscopy stacks from a ground truth (i.e., the observed object) specified as a 3D bitmap or a list of fluorophore coordinates. The software can analyze a real microscope image stack to set the noise parameters and directly generate new images of the object with noise characteristics similar to those of the sample image. With minimal input from the user and a modular architecture, ConfocalGN is easily integrated with existing image analysis solutions.
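The noise-matching idea can be sketched as follows. This is a simplified stand-in, not ConfocalGN's actual interface, and it assumes a purely Gaussian background: estimate noise parameters from a sample image, then apply matched noise to a ground-truth image.

```python
import random
import statistics

# Simplified stand-in for the noise-matching step (not ConfocalGN's actual
# interface; assumes a purely Gaussian background): estimate noise parameters
# from a sample image, then apply matched noise to a ground-truth image.

def noise_stats(background_pixels):
    """Estimate background mean and standard deviation from a real image."""
    return statistics.fmean(background_pixels), statistics.pstdev(background_pixels)

def add_noise(ground_truth, mu, sigma, rng):
    """Add background offset plus Gaussian noise, clipped at zero intensity."""
    return [max(0.0, v + rng.gauss(mu, sigma)) for v in ground_truth]

rng = random.Random(42)
# "Real" sample background: mean 100, standard deviation 10 (arbitrary units).
sample_background = [rng.gauss(100.0, 10.0) for _ in range(5000)]
mu, sigma = noise_stats(sample_background)

truth = [0.0] * 900 + [500.0] * 100    # dark field with one bright object
noisy = add_noise(truth, mu, sigma, rng)
```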
BPM Button Optimization to Minimize Distortion Due to Trapped Mode Heating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cameron,P.; Blednyk, A.; Kosciuk, B.
2009-05-04
The outer circumference of a BPM button and the inner circumference of the button housing comprise a transmission line. This transmission line typically presents an impedance of a few tens of ohms to the beam, and couples very weakly to the 50 ohm coaxial transmission line that comprises the signal path out of the button. The modes which are consequently excited and trapped often have quality factors of several hundred, permitting resonant excitation by the beam. The thermal distortion resulting from trapped mode heating is potentially problematic for achieving the high precision beam position measurements needed to provide the sub-micron beam position stability required by light source users. We present a button design that has been optimized via material selection and component geometry to minimize both the trapped mode heating and the resulting thermal distortion.
Linte, Cristian A.; Davenport, Katherine P.; Cleary, Kevin; Peters, Craig; Vosburgh, Kirby G.; Navab, Nassir; Edwards, Philip “Eddie”; Jannin, Pierre; Peters, Terry M.; Holmes, David R.; Robb, Richard A.
2013-01-01
Mixed reality environments for medical applications have been explored and developed over the past three decades in an effort to enhance the clinician’s view of anatomy and facilitate the performance of minimally invasive procedures. These environments must faithfully represent the real surgical field and require seamless integration of pre- and intra-operative imaging, surgical instrument tracking, and display technology into a common framework centered around and registered to the patient. However, in spite of their reported benefits, few mixed reality environments have been successfully translated into clinical use. Several challenges that contribute to the difficulty in integrating such environments into clinical practice are presented here and discussed in terms of both technical and clinical limitations. This article should raise awareness among both developers and end-users toward facilitating a greater application of such environments in the surgical practice of the future. PMID:23632059
NASA Astrophysics Data System (ADS)
Nalli, N. R.; Gambacorta, A.; Tan, C.; Iturbide, F.; Barnet, C. D.; Reale, A.; Sun, B.; Liu, Q.
2017-12-01
This presentation overviews the performance of the operational SNPP NOAA Unique Combined Atmospheric Processing System (NUCAPS) environmental data record (EDR) products. The SNPP Cross-track Infrared Sounder and Advanced Technology Microwave Sounder (CrIS/ATMS) suite, the first of the Joint Polar Satellite System (JPSS) Program, is one of NOAA's major investments in our nation's future operational environmental observation capability. The NUCAPS algorithm is a world-class NOAA-operational IR/MW retrieval algorithm, based upon the well-established AIRS science team algorithm, for deriving temperature, moisture, ozone, and carbon trace gases to provide users with state-of-the-art EDR products. Operational use of the products includes the NOAA National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS), along with numerous science-user applications. NUCAPS EDR product assessments are made with reference to JPSS Level 1 global requirements, which provide the definitive metrics for assessing whether the products have minimally met predefined global performance specifications. The NESDIS/STAR NUCAPS development and validation team recently delivered the Phase 4 algorithm, which incorporated critical updates necessary for compatibility with full spectral-resolution (FSR) CrIS sensor data records (SDRs). Based on comprehensive analyses, the NUCAPS Phase 4 CrIS-FSR temperature, moisture, and ozone profile EDRs, as well as the carbon trace gas EDRs (CO, CH4, and CO2), are shown to be meeting or close to meeting the JPSS program global requirements. Regional and temporal assessments of interest to EDR users (e.g., AWIPS) will also be presented.
Anon-Pass: Practical Anonymous Subscriptions
Lee, Michael Z.; Dunn, Alan M.; Katz, Jonathan; Waters, Brent; Witchel, Emmett
2014-01-01
We present the design, security proof, and implementation of an anonymous subscription service. Users register for the service by providing some form of identity, which might or might not be linked to a real-world identity such as a credit card, a web login, or a public key. A user logs on to the system by presenting a credential derived from information received at registration. Each credential allows only a single login in any authentication window, or epoch. Logins are anonymous in the sense that the service cannot distinguish which user is logging in any better than random guessing. This implies unlinkability of a user across different logins. We find that a central tension in an anonymous subscription service is the service provider’s desire for a long epoch (to reduce server-side computation) versus users’ desire for a short epoch (so they can repeatedly “re-anonymize” their sessions). We balance this tension by having short epochs, but adding an efficient operation for clients who do not need unlinkability to cheaply re-authenticate themselves for the next time period. We measure performance of a research prototype of our protocol that allows an independent service to offer anonymous access to existing services. We implement a music service, an Android-based subway-pass application, and a web proxy, and show that adding anonymity adds minimal client latency and only requires 33 KB of server memory per active user. PMID:24504081
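The epoch mechanism can be sketched as follows. This is a toy illustration with an invented epoch length; the real protocol uses cryptographic credentials that are unlinkable across epochs, not a plain reusable token.

```python
# Minimal sketch of the epoch mechanism (epoch length invented; the actual
# protocol uses cryptographic credentials unlinkable across epochs, not a
# plain token): each credential is good for one login per epoch.

EPOCH_SECONDS = 300   # a deliberately short epoch, per the design tension

def epoch_of(timestamp):
    """Map a Unix timestamp to its authentication-window number."""
    return int(timestamp) // EPOCH_SECONDS

class EpochGate:
    """Accept at most one login per credential token per epoch."""
    def __init__(self):
        self.last_epoch = {}    # token -> epoch of its last accepted login

    def try_login(self, token, timestamp):
        epoch = epoch_of(timestamp)
        if self.last_epoch.get(token) == epoch:
            return False        # second login in the same epoch: rejected
        self.last_epoch[token] = epoch
        return True

gate = EpochGate()
print(gate.try_login("cred", 1000.0))   # True
print(gate.try_login("cred", 1100.0))   # False: same 300 s epoch
print(gate.try_login("cred", 1500.0))   # True: next epoch
```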
Training Persons with Spinal Cord Injury to Ambulate Using a Powered Exoskeleton
Asselin, Pierre K.; Avedissian, Manuel; Knezevic, Steven; Kornfeld, Stephen; Spungen, Ann M.
2016-01-01
Powered exoskeletons have become available for overground ambulation in persons with paralysis due to spinal cord injury (SCI) who have intact upper extremity function and are able to maintain upright balance using forearm crutches. To ambulate in an exoskeleton, the user must acquire the ability to maintain balance while standing and sitting and to shift weight appropriately with each step. This can be a challenging task for those with deficits in sensation and proprioception in their lower extremities. This manuscript describes screening criteria and a training program developed at the James J. Peters VA Medical Center, Bronx, NY, to teach users the skills needed to utilize these devices in institutional, home, or community environments. Before training can begin, potential users are screened for appropriate range of motion of the hip, knee, and ankle joints. Persons with SCI are at increased risk of sustaining lower extremity fractures, even with minimal strain or trauma, so a bone mineral density assessment is performed to reduce the risk of fracture. Also as part of screening, a physical examination is performed to identify additional health-related contraindications. Once the person has successfully passed all screening requirements, they are cleared to begin the training program. The device is properly adjusted to fit the user, and a series of static and dynamic balance tasks are taught and performed before the user learns to walk. The person is taught to ambulate in various environments, ranging from indoor level surfaces to outdoors over uneven or changing surfaces. Once skilled enough to be a candidate for home use of the exoskeleton, the user is required to designate a companion-walker who will train alongside them. Together, the pair must demonstrate the ability to perform various advanced tasks in order to be permitted to use the exoskeleton in their home/community environment. PMID:27340808
Efficient Parallel Engineering Computing on Linux Workstations
NASA Technical Reports Server (NTRS)
Lou, John Z.
2010-01-01
A C software module has been developed that creates lightweight processes (LWPs) dynamically to achieve parallel computing performance in a variety of engineering simulation and analysis applications to support NASA and DoD project tasks. The required interface between the module and the application it supports is simple, minimal and almost completely transparent to the user applications, and it can achieve nearly ideal computing speed-up on multi-CPU engineering workstations of all operating system platforms. The module can be integrated into an existing application (C, C++, Fortran and others) either as part of a compiled module or as a dynamically linked library (DLL).
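The module itself is written in C; the Python sketch below only illustrates the kind of minimal, nearly transparent parallel interface described, with a thread pool standing in for the dynamically created LWPs.

```python
from concurrent.futures import ThreadPoolExecutor

# The module in the abstract is written in C; this Python sketch illustrates
# only the idea of a minimal, nearly transparent parallel interface: the
# caller hands over a worker function and data, and gets results back in
# order, with the pool (standing in for LWPs) managed behind the scenes.

def parallel_map(func, items, workers=4):
    """Apply func to each item using a small pool of lightweight workers."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(func, items))

# Example: square a list of numbers in parallel.
print(parallel_map(lambda x: x * x, [1, 2, 3, 4]))  # [1, 4, 9, 16]
```

The design point is that the application never sees the worker management: it supplies a function and data, just as the C module's callers do.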
MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration
NASA Technical Reports Server (NTRS)
Ansar, Adnan I.
2011-01-01
MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to the overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input and performs registration over scale and in-plane rotation fully automatically.
Material control and accountancy at EDF PWR plants; GCN: Gestion du Combustible Nucleaire
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Cormis, F.
1991-01-01
The paper describes the comprehensive system which is developed and implemented at Electricite de France to provide a single reliable nuclear material control and accounting system for all nuclear plants. This software aims at several objectives among which are: the control and the accountancy of nuclear material at the plant, the optimization of the consistency of data by minimizing the possibility of transcription errors, the fulfillment of the statutory requirements by automatic transfer of reports to national and international safeguards authorities, the servicing of other EDF users of nuclear material data for technical or commercial purposes.
Virtual Ultrasound Guidance for Inexperienced Operators
NASA Technical Reports Server (NTRS)
Caine, Timothy; Martin, Davis
2012-01-01
Medical ultrasound or echocardiographic studies are highly operator-dependent and generally require lengthy training and internship to perfect. To obtain quality echocardiographic images in remote environments, such as on-orbit, remote guidance of studies has been employed. This technique involves minimal training for the user, coupled with remote guidance from an expert. When real-time communication or expert guidance is not available, a more autonomous system of guiding an inexperienced operator through an ultrasound study is needed. One example would be missions beyond low Earth orbit, in which the time delay inherent with communication will make remote guidance impractical.
Accounting and Accountability for Distributed and Grid Systems
NASA Technical Reports Server (NTRS)
Thigpen, William; McGinnis, Laura F.; Hacker, Thomas J.
2001-01-01
While the advent of distributed and grid computing systems will open new opportunities for scientific exploration, the reality of such implementations could prove to be a system administrator's nightmare. A lot of effort is being spent on identifying and resolving the obvious problems of security, scheduling, authentication and authorization. Lurking in the background, though, are the largely unaddressed issues of accountability and usage accounting: (1) mapping resource usage to resource users; (2) defining usage economies or methods for resource exchange; (3) describing implementation standards that minimize and compartmentalize the tasks required for a site to participate in a grid.
Development of an electronic radiation oncology patient information management system.
Mandal, Abhijit; Asthana, Anupam Kumar; Aggarwal, Lalit Mohan
2008-01-01
The quality of patient care is critically influenced by the availability of accurate information and its efficient management. Radiation oncology involves many information components: information related to the patient (e.g., profile, disease site, stage), to people (radiation oncologists, radiological physicists, technologists, etc.), and to equipment (diagnostic, planning, treatment, etc.). These different data must be integrated, and a comprehensive information management system is essential for efficient storage and retrieval of the enormous amounts of information. A radiation therapy patient information system (RTPIS) has been developed using open source software, with PHP and JavaScript as the programming languages, MySQL as the database, and HTML and CSS as the design tools. The system uses standard web browsing technology on a WAMP5 server. Any user with a unique user ID and password can access the RTPIS. The user ID and password are issued separately to each individual according to the person's job responsibilities and accountability, so that users can only access data related to their job responsibilities. Authentic users gain instant access through a simple web browsing procedure, and all types of users in the radiation oncology department should find the system user-friendly. Maintenance of the system does not require large human resources or space. File storage and retrieval are satisfactory, unique, uniform, and easily accessible, with adequate data protection, minimal possibility of unauthorized handling, and minimal risk of loss or accidental destruction of information.
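The role-based access idea can be sketched as follows. The role and resource names below are invented for illustration and are not the RTPIS schema: each user ID maps to a role, and each role to the data it may access.

```python
# Hedged sketch of role-based access control as described above (role and
# resource names are invented for illustration, not the RTPIS schema).

ROLE_PERMISSIONS = {
    "radiation_oncologist": {"patient_profile", "treatment_plan", "dose_record"},
    "physicist": {"treatment_plan", "dose_record", "equipment_qa"},
    "technologist": {"treatment_schedule", "equipment_qa"},
}

USERS = {"alice": "radiation_oncologist", "bob": "technologist"}

def can_access(user_id, resource):
    """True only if the user's role grants access to the resource."""
    role = USERS.get(user_id)
    if role is None:
        return False            # unknown user ID: deny by default
    return resource in ROLE_PERMISSIONS.get(role, set())

print(can_access("alice", "treatment_plan"))   # True
print(can_access("bob", "dose_record"))        # False
```

Denying by default for unknown users or roles is what keeps "very little possibility of unauthorized handling" true as accounts change.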
Basirat, Anahita
2017-01-01
Cochlear implant (CI) users frequently achieve good speech understanding based on phoneme and word recognition. However, there is a significant variability between CI users in processing prosody. The aim of this study was to examine the abilities of an excellent CI user to segment continuous speech using intonational cues. A post-lingually deafened adult CI user and 22 normal hearing (NH) subjects segmented phonemically identical and prosodically different sequences in French such as 'l'affiche' (the poster) versus 'la fiche' (the sheet), both [lafiʃ]. All participants also completed a minimal pair discrimination task. Stimuli were presented in auditory-only and audiovisual presentation modalities. The performance of the CI user in the minimal pair discrimination task was 97% in the auditory-only and 100% in the audiovisual condition. In the segmentation task, contrary to the NH participants, the performance of the CI user did not differ from the chance level. Visual speech did not improve word segmentation. This result suggests that word segmentation based on intonational cues is challenging when using CIs even when phoneme/word recognition is very well rehabilitated. This finding points to the importance of the assessment of CI users' skills in prosody processing and the need for specific interventions focusing on this aspect of speech communication.
Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2010-01-01
Purpose: Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods: A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results: The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion: A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2011-07-01
Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
Maes, Wouter H; Heuvelmans, Griet; Muys, Bart
2009-10-01
Although the importance of green (evaporative) water flows in delivering ecosystem services has been recognized, most operational impact assessment methods still focus only on blue water flows. In this paper, we present a new model to evaluate the effect of land use occupation and transformation on water quantity. Conceptually based on the supply of ecosystem services by terrestrial and aquatic ecosystems, the model is developed for, but not limited to, land use impact assessment in life cycle assessment (LCA) and requires a minimum amount of input data. Impact is minimal when evapotranspiration is equal to that of the potential natural vegetation, and maximal when evapotranspiration is zero or when it exceeds a threshold value derived from the concept of environmental water requirement. Three refinements to the model, requiring more input data, are proposed. The first refinement considers a minimal impact over a certain range based on the boundary evapotranspiration of the potential natural vegetation. In the second refinement the effects of evaporation and transpiration are accounted for separately, and in the third refinement a more correct estimate of evaporation from a fully sealed surface is incorporated. The simplicity and user friendliness of the proposed impact assessment method are illustrated with two examples.
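A minimal sketch of the basic impact rule, with symbols assumed rather than taken from the authors' exact formulation: impact is zero when actual evapotranspiration (ET) equals that of the potential natural vegetation, and maximal when ET is zero or exceeds the environmental threshold.

```python
# Hedged sketch of the basic idea (symbols assumed, not the authors' exact
# formulation): land-use water impact is minimal when actual
# evapotranspiration (et) equals that of the potential natural vegetation
# (et_pnv), and maximal when et is zero or exceeds a threshold (et_max)
# derived from the environmental water requirement.

def water_impact(et, et_pnv, et_max):
    """Return an impact score in [0, 1]; 0 = no impact, 1 = maximal."""
    if et_max <= et_pnv:
        raise ValueError("threshold must exceed ET of natural vegetation")
    if et <= 0.0 or et >= et_max:
        return 1.0
    if et <= et_pnv:                              # too little green water flow
        return 1.0 - et / et_pnv
    return (et - et_pnv) / (et_max - et_pnv)      # too much green water flow

print(water_impact(500.0, 500.0, 800.0))  # 0.0 (matches natural vegetation)
print(water_impact(650.0, 500.0, 800.0))  # 0.5 (halfway to the threshold)
```

The paper's first refinement (a range of minimal impact around et_pnv) would replace the single point `et == et_pnv` with an interval of zero impact.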
Computational Prediction of the Immunomodulatory Potential of RNA Sequences.
Nagpal, Gandharva; Chaudhary, Kumardeep; Dhanda, Sandeep Kumar; Raghava, Gajendra Pal Singh
2017-01-01
Advances in the knowledge of various roles played by non-coding RNAs have stimulated the application of RNA molecules as therapeutics. Among these molecules, miRNA, siRNA, and CRISPR-Cas9 associated gRNA have been identified as the most potent RNA molecule classes with diverse therapeutic applications. One of the major limitations of RNA-based therapeutics is immunotoxicity of RNA molecules as it may induce the innate immune system. In contrast, RNA molecules that are potent immunostimulators are strong candidates for use in vaccine adjuvants. Thus, it is important to understand the immunotoxic or immunostimulatory potential of these RNA molecules. The experimental techniques for determining immunostimulatory potential of siRNAs are time- and resource-consuming. To overcome this limitation, recently our group has developed a web-based server "imRNA" for predicting the immunomodulatory potential of RNA sequences. This server integrates a number of modules that allow users to perform various tasks including (1) generation of RNA analogs with reduced immunotoxicity, (2) identification of highly immunostimulatory regions in RNA sequence, and (3) virtual screening. This server may also assist users in the identification of minimum mutations required in a given RNA sequence to minimize its immunomodulatory potential that is required for designing RNA-based therapeutics. Besides, the server can be used for designing RNA-based vaccine adjuvants as it may assist users in the identification of mutations required for increasing immunomodulatory potential of a given RNA sequence. In summary, this chapter describes major applications of the "imRNA" server in designing RNA-based therapeutics and vaccine adjuvants (http://www.imtech.res.in/raghava/imrna/).
Recreational System Optimization to Reduce Conflict on Public Lands
NASA Astrophysics Data System (ADS)
Shilling, Fraser; Boggs, Jennifer; Reed, Sarah
2012-09-01
In response to federal administrative rule, the Tahoe National Forest (TNF), California, USA engaged in trail-route prioritization for motorized recreation (e.g., off-highway-vehicles) and other recreation types. The prioritization was intended to identify routes that were suitable and ill-suited for maintenance in a transportation system. A recreational user survey was conducted online ( n = 813) for user preferences for trail system characteristics, recreational use patterns, and demographics. Motorized trail users and non-motorized users displayed very clear and contrasting preferences for the same system. As has been found by previous investigators, non-motorized users expressed antagonism to motorized use on the same recreational travel system, whereas motorized users either supported multiple-use routes or dismissed non-motorized recreationists' concerns. To help the TNF plan for reduced conflict, a geographic information system (GIS) based modeling approach was used to identify recreational opportunities and potential environmental impacts of all travel routes. This GIS-based approach was based on an expert-derived rule set. The rules addressed particular environmental and recreation concerns in the TNF. Route segments were identified that could be incorporated into minimal-impact networks to support various types of recreation. The combination of potential impacts and user-benefits supported an optimization approach for an appropriate recreational travel network to minimize environmental impacts and user-conflicts in a multi-purpose system.
Distributed Energy Resources Customer Adoption Model - Graphical User Interface, Version 2.1.8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewald, Friedrich; Stadler, Michael; Cardoso, Goncalo F
The DER-CAM Graphical User Interface has been redesigned to consist of a dynamic tree structure on the left side of the application window to allow users to quickly navigate between different data categories and views. Views can either be tables with model parameters and input data, the optimization results, or a graphical interface to draw circuit topology and visualize investment results. The model parameters and input data consist of tables where values are assigned to specific keys. The aggregation of all model parameters and input data amounts to the data required to build a DER-CAM model, and is passed to the GAMS solver when users initiate the DER-CAM optimization process. Passing data to the GAMS solver relies on the use of a Java server that handles DER-CAM requests, queuing, and results delivery. This component of the DER-CAM GUI can be deployed either locally or remotely, and constitutes an intermediate step between the user data input and manipulation and the execution of a DER-CAM optimization in the GAMS engine. The results view shows the results of the DER-CAM optimization and distinguishes between a single- and a multi-objective process. The single optimization runs the DER-CAM optimization once and presents the results as a combination of summary charts and hourly dispatch profiles. The multi-objective optimization process consists of a sequence of runs initiated by the GUI, including: 1) CO2 minimization, 2) cost minimization, and 3) a user-defined number of points in between objectives 1) and 2). The multi-objective results view includes both access to the detailed results of each point generated by the process and the generation of a Pareto frontier graph to illustrate the trade-off between objectives. DER-CAM GUI 2.1.8 also introduces the ability to graphically generate circuit topologies, enabling support for DER-CAM 5.0.0.
This feature consists of: 1) the drawing area, where users can manually create nodes, define their properties (e.g., point of common coupling, slack bus, load), and connect them through edges representing power lines, transformers, or heat pipes, all with user-defined characteristics (e.g., length, ampacity, inductance, or heat loss); and 2) the tables, which display the user-defined topology in the final numerical form that will be passed to the DER-CAM optimization. Finally, the DER-CAM GUI is also deployed with a database schema that allows users to provide different energy load profiles, solar irradiance profiles, and tariff data that can be stored locally and later used in any DER-CAM model. However, no real data will be delivered with this version.
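The multi-objective sequence can be illustrated with a toy model. The one-dimensional cost and CO2 curves below are invented, and a brute-force scan stands in for the GAMS solver: sweep a weight between the two objectives and record one optimum per weight to trace a Pareto frontier.

```python
# Toy illustration of the multi-objective sequence described above (the cost
# and CO2 curves are invented; a brute-force scan stands in for GAMS): sweep
# a weight between the two objectives and record one optimum per weight.

def cost(x):
    return (x - 1.0) ** 2          # toy cost curve, minimized at x = 1

def co2(x):
    return (x + 1.0) ** 2          # toy emissions curve, minimized at x = -1

def minimize_1d(f, lo=-2.0, hi=2.0, steps=4001):
    """Brute-force scan over a grid; stands in for the real solver."""
    xs = [lo + i * (hi - lo) / (steps - 1) for i in range(steps)]
    return min(xs, key=f)

def pareto_sweep(n_points=5):
    """One run per weight: w = 0 is pure cost minimization, w = 1 pure CO2."""
    frontier = []
    for i in range(n_points):
        w = i / (n_points - 1)
        x = minimize_1d(lambda z: (1.0 - w) * cost(z) + w * co2(z))
        frontier.append((cost(x), co2(x)))
    return frontier

frontier = pareto_sweep()
# Moving along the frontier trades lower CO2 for higher cost, and vice versa.
```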
Analysis of counting data: Development of the SATLAS Python package
NASA Astrophysics Data System (ADS)
Gins, W.; de Groote, R. P.; Bissell, M. L.; Granados Buitrago, C.; Ferrer, R.; Lynch, K. M.; Neyens, G.; Sels, S.
2018-01-01
For the analysis of low-statistics counting experiments, a traditional nonlinear least-squares minimization routine may not always provide correct parameter and uncertainty estimates due to the assumptions inherent in the algorithms. In response to this, a user-friendly Python package (SATLAS) was written to provide an easy interface between the data and a variety of minimization algorithms suited for analyzing low- as well as high-statistics data. The advantage of this package is that it allows the user to define their own model function and then compare different minimization routines to determine the optimal parameter values and their respective (correlated) errors. Experimental validation of the different approaches in the package is done through analysis of hyperfine structure data of 203Fr gathered by the CRIS experiment at ISOLDE, CERN.
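The failure mode the abstract alludes to can be shown in a few lines. For a constant-rate model, weighted least squares with the common choice sigma_i^2 = n_i (Neyman chi-square) yields the harmonic mean of the counts, which underestimates the rate, while the Poisson maximum-likelihood estimator is the arithmetic mean. A self-contained sketch (not SATLAS code):

```python
def neyman_chi2_rate(counts):
    # Weighted least squares with sigma_i^2 = n_i (requires n_i > 0):
    # minimizing sum_i (n_i - lam)^2 / n_i over lam gives the harmonic mean,
    # which is biased low because low-count bins get too much weight.
    return len(counts) / sum(1.0 / n for n in counts)

def poisson_mle_rate(counts):
    # Maximizing the Poisson likelihood for a constant rate gives the
    # arithmetic mean -- the appropriate estimator for counting data.
    return sum(counts) / len(counts)

counts = [1, 2, 1, 5, 3, 1, 2, 9]  # low-statistics bins
print(neyman_chi2_rate(counts))    # biased low
print(poisson_mle_rate(counts))    # 3.0
```

Packages like SATLAS let the user swap the cost function (chi-square vs. likelihood) for the same model and compare the resulting parameters and errors directly.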
Preventing Shoulder-Surfing Attack with the Concept of Concealing the Password Objects' Information
Ho, Peng Foong; Kam, Yvonne Hwei-Syn; Wee, Mee Chin
2014-01-01
Traditionally, picture-based password systems employ password objects (pictures/icons/symbols) as input during an authentication session, thus making them vulnerable to “shoulder-surfing” attack because the visual interface by function is easily observed by others. Recent software-based approaches attempt to minimize this threat by requiring users to enter their passwords indirectly by performing certain mental tasks to derive the indirect password, thus concealing the user's actual password. However, weaknesses in the positioning of distracter and password objects introduce usability and security issues. In this paper, a new method, which conceals information about the password objects as much as possible, is proposed. Besides concealing the password objects and the number of password objects, the proposed method allows both password and distracter objects to be used as the challenge set's input. The correctly entered password appears to be random and can only be derived with the knowledge of the full set of password objects. Therefore, it would be difficult for a shoulder-surfing adversary to identify the user's actual password. Simulation results indicate that the correct input object and its location are random for each challenge set, thus preventing frequency of occurrence analysis attack. User study results show that the proposed method is able to prevent shoulder-surfing attack. PMID:24991649
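The idea of deriving an indirect response from secret knowledge can be illustrated with a toy rule. This is a hypothetical scheme for illustration only, not the method proposed in the paper: the user mentally counts their secret pass-objects in the challenge and clicks the item at that offset, so the clicked item may well be a distracter.

```python
def indirect_response(challenge, pass_objects):
    # Hypothetical indirect-entry rule (illustrative only): count how many
    # secret pass-objects appear in the challenge set, then answer with the
    # challenge item at that offset. The clicked item itself can be a
    # distracter, so observing the click does not reveal the secret set.
    k = sum(1 for obj in challenge if obj in pass_objects)
    return challenge[k % len(challenge)]

secret = {"key", "star"}
# Two secret objects appear, so the response is the item at index 2.
print(indirect_response(["moon", "key", "fish", "star"], secret))  # "fish"
```

A shoulder-surfer who sees "fish" clicked learns nothing directly about which objects are secret, which is the property the paper's (more elaborate) method aims to strengthen.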
Lazinski, David W.; Camilli, Andrew
2013-01-01
The amplification of DNA fragments, cloned between user-defined 5′ and 3′ end sequences, is a prerequisite step in the use of many current applications including massively parallel sequencing (MPS). Here we describe an improved method, called homopolymer tail-mediated ligation PCR (HTML-PCR), that requires very little starting template and minimal hands-on effort, is cost-effective, and is suited for use in high-throughput and robotic methodologies. HTML-PCR starts with the addition of homopolymer tails of controlled lengths to the 3′ termini of a double-stranded genomic template. The homopolymer tails enable the annealing-assisted ligation of a hybrid oligonucleotide to the template's recessed 5′ ends. The hybrid oligonucleotide has a user-defined sequence at its 5′ end. This primer and a second primer, composed of a longer region complementary to the homopolymer tail fused to a second 5′ user-defined sequence, are used in a PCR to generate the final product. The user-defined sequences can be varied to enable compatibility with a wide variety of downstream applications. We demonstrate our new method by constructing MPS libraries starting from nanogram and sub-nanogram quantities of Vibrio cholerae and Streptococcus pneumoniae genomic DNA. PMID:23311318
NASA Astrophysics Data System (ADS)
Zhang, Hao; Chen, Minghua; Parekh, Abhay; Ramchandran, Kannan
2011-09-01
We design a distributed multi-channel P2P Video-on-Demand (VoD) system using "plug-and-play" helpers. Helpers are heterogeneous "micro-servers" with limited storage, bandwidth and number of users they can serve simultaneously. Our proposed system has the following salient features: (1) it jointly optimizes over helper-user connection topology, video storage distribution and transmission bandwidth allocation; (2) it minimizes server load, and is adaptable to varying supply and demand patterns across multiple video channels irrespective of video popularity; and (3) it is fully distributed and requires little or no maintenance overhead. The combinatorial nature of the problem and the system demand for distributed algorithms make the problem uniquely challenging. By utilizing Lagrangian decomposition and Markov chain approximation based arguments, we address this challenge by designing two distributed algorithms running in tandem: a primal-dual storage and bandwidth allocation algorithm and a "soft-worst-neighbor-choking" topology-building algorithm. Our scheme provably converges to a near-optimal solution, and is easy to implement in practice. Packet-level simulation results show that the proposed scheme achieves minimum server load under highly heterogeneous combinations of supply and demand patterns, and is robust to system dynamics of user/helper churn, user/helper asynchrony, and random delays in the network.
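The objective of pushing demand onto capacity-limited helpers so the server only carries the residual can be sketched with a crude iterative allocation. This is a stand-in for intuition only, not the paper's primal-dual algorithm: allocations grow toward unmet demand and are then projected back onto each helper's capacity.

```python
def allocate(demands, caps, links, steps=200, lr=0.05):
    # x[h, u] is bandwidth from helper h to user u over the allowed links.
    # Each step: push allocations toward unmet demand, then rescale any
    # over-committed helper back onto its capacity (a crude stand-in for
    # primal-dual updates with multipliers on the capacity constraints).
    x = {(h, u): 0.0 for h, u in links}
    for _ in range(steps):
        unmet = {u: d - sum(x[h2, u2] for h2, u2 in links if u2 == u)
                 for u, d in enumerate(demands)}
        for h, u in links:
            x[h, u] = max(0.0, x[h, u] + lr * unmet[u])
        for h, cap in enumerate(caps):
            tot = sum(x[h2, u] for h2, u in links if h2 == h)
            if tot > cap:                      # projection onto capacity
                for h2, u in links:
                    if h2 == h:
                        x[h2, u] *= cap / tot
    served = sum(x.values())
    return sum(demands) - served               # residual server load

# Two users demanding 1.0 each, two helpers capped at 0.6, fully connected:
# helpers saturate at 1.2 total, so the server carries ~0.8.
print(allocate([1.0, 1.0], [0.6, 0.6], [(0, 0), (0, 1), (1, 0), (1, 1)]))
```

The real system additionally optimizes which videos each helper stores and which users it chokes, and does all of this with purely local message passing.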
Yu, Kebing; Salomon, Arthur R.
2010-01-01
Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through tandem mass spectrometry (MS/MS). Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to a variety of experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our High Throughput Autonomous Proteomic Pipeline (HTAPP) used in the automated acquisition and post-acquisition analysis of proteomic data. PMID:19834895
Oba, Sandra I.; Galvin, John J.; Fu, Qian-Jie
2014-01-01
Auditory training has been shown to significantly improve cochlear implant (CI) users’ speech and music perception. However, it is unclear whether post-training gains in performance were due to improved auditory perception or to generally improved attention, memory and/or cognitive processing. In this study, speech and music perception, as well as auditory and visual memory were assessed in ten CI users before, during, and after training with a non-auditory task. A visual digit span (VDS) task was used for training, in which subjects recalled sequences of digits presented visually. After the VDS training, VDS performance significantly improved. However, there were no significant improvements for most auditory outcome measures (auditory digit span, phoneme recognition, sentence recognition in noise, digit recognition in noise), except for small (but significant) improvements in vocal emotion recognition and melodic contour identification. Post-training gains were much smaller with the non-auditory VDS training than observed in previous auditory training studies with CI users. The results suggest that post-training gains observed in previous studies were not solely attributable to improved attention or memory, and were more likely due to improved auditory perception. The results also suggest that CI users may require targeted auditory training to improve speech and music perception. PMID:23516087
1993-11-01
way is to develop a crude but working model of an entire system. The other is by developing a realistic model of the user interface, leaving out most... devices or by incorporating software for a more user-friendly interface. Automation introduces the possibility of making data entry errors. Multimode... across various human-computer interfaces. ... Memory: Minimize the amount of information that the user must maintain in short-term memory
Dead simple OWL design patterns
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osumi-Sutherland, David; Courtot, Melanie; Balhoff, James P.
2017-06-05
Bio-ontologies typically require multiple axes of classification to support the needs of their users. Development of such ontologies can only be made scalable and sustainable by the use of inference to automate classification via consistent patterns of axiomatization. Many bio-ontologies originating in OBO or OWL follow this approach. These patterns need to be documented in a form that requires minimal expertise to understand and edit and that can be validated and applied using any of the various programmatic approaches to working with OWL ontologies. We describe a system, Dead Simple OWL Design Patterns (DOS-DPs), which fulfills these requirements, illustrating the system with examples from the Gene Ontology. In conclusion, the rapid adoption of DOS-DPs by multiple ontology development projects illustrates both the ease of use and the pressing need for the simple design pattern system we have developed.
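The core mechanic of a design-pattern system is template expansion: a pattern holds templated axiom strings with variables, and binding the variables yields concrete axioms. A minimal Python sketch in the DOS-DP spirit (field names and the Manchester-syntax-style strings are illustrative, not the DOS-DP schema):

```python
def expand_pattern(pattern, bindings):
    # Fill every templated field of the pattern with the supplied variable
    # bindings, producing concrete annotation and axiom strings.
    return {fieldname: template.format(**bindings)
            for fieldname, template in pattern.items()}

# Illustrative "X part_of Y" pattern, loosely Manchester-syntax flavored.
part_of_pattern = {
    "name": "{part} of {whole}",
    "equivalentTo": "'{part}' and ('part_of' some '{whole}')",
}
print(expand_pattern(part_of_pattern, {"part": "nucleus", "whole": "neuron"}))
```

Because every generated class follows the same axiom shape, a reasoner can classify them automatically, which is what makes this approach scale.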
Automated crystallographic system for high-throughput protein structure determination.
Brunzelle, Joseph S; Shafaee, Padram; Yang, Xiaojing; Weigand, Steve; Ren, Zhong; Anderson, Wayne F
2003-07-01
High-throughput structural genomic efforts require software that is highly automated, distributed, and requires minimal user intervention to determine protein structures. Preliminary experiments were set up to test whether automated scripts could utilize a minimum set of input parameters and produce a set of initial protein coordinates. From this starting point, a highly distributed system was developed that could determine macromolecular structures at a high throughput rate and warehouse and harvest the associated data. The system uses a web interface to obtain input data and display results. It utilizes a relational database to store the initial data needed to start the structure-determination process as well as generated data. A distributed program interface administers the crystallographic programs which determine protein structures. Using a test set of 19 protein targets, 79% were determined automatically.
Interactive semiautomatic contour delineation using statistical conditional random fields framework.
Hu, Yu-Chi; Grossberg, Michael D; Wu, Abraham; Riaz, Nadeem; Perez, Carmen; Mageras, Gig S
2012-07-01
Contouring a normal anatomical structure during radiation treatment planning requires significant time and effort. The authors present a fast and accurate semiautomatic contour delineation method to reduce the time and effort required of expert users. Following an initial segmentation on one CT slice, the user marks the target organ and nontarget pixels with a few simple brush strokes. The algorithm calculates statistics from this information that, in turn, determines the parameters of an energy function containing both boundary and regional components. The method uses a conditional random field graphical model to define the energy function to be minimized for obtaining an estimated optimal segmentation, and a graph partition algorithm to efficiently solve the energy function minimization. Organ boundary statistics are estimated from the segmentation and propagated to subsequent images; regional statistics are estimated from the simple brush strokes that are either propagated or redrawn as needed on subsequent images. This greatly reduces the user input needed and speeds up segmentations. The proposed method can be further accelerated with graph-based interpolation of alternating slices in place of user-guided segmentation. CT images from phantom and patients were used to evaluate this method. The authors determined the sensitivity and specificity of organ segmentations using physician-drawn contours as ground truth, as well as the predicted-to-ground truth surface distances. Finally, three physicians evaluated the contours for subjective acceptability. Interobserver and intraobserver analysis was also performed and Bland-Altman plots were used to evaluate agreement. Liver and kidney segmentations in patient volumetric CT images show that boundary samples provided on a single CT slice can be reused through the entire 3D stack of images to obtain accurate segmentation. 
In liver, our method has better sensitivity and specificity (0.925 and 0.995) than region growing (0.897 and 0.995) and level set methods (0.912 and 0.985) as well as shorter mean predicted-to-ground truth distance (2.13 mm) compared to region growing (4.58 mm) and level set methods (8.55 mm and 4.74 mm). Similar results are observed in kidney segmentation. Physician evaluation of ten liver cases showed that 83% of contours did not need any modification, while 6% of contours needed modifications as assessed by two or more evaluators. In interobserver and intraobserver analysis, Bland-Altman plots showed our method to have better repeatability than the manual method while the delineation time was 15% faster on average. Our method achieves high accuracy in liver and kidney segmentation and considerably reduces the time and labor required for contour delineation. Since it extracts purely statistical information from the samples interactively specified by expert users, the method avoids heuristic assumptions commonly used by other methods. In addition, the method can be expanded to 3D directly without modification because the underlying graphical framework and graph partition optimization method fit naturally with the image grid structure.
A user view of office automation or the integrated workstation
NASA Technical Reports Server (NTRS)
Schmerling, E. R.
1984-01-01
Central data bases are useful only if they are kept up to date and easily accessible in an interactive (query) mode rather than in monthly reports that may be out of date and must be searched by hand. The concepts of automatic data capture, data base management and query languages require good communications and readily available work stations to be useful. The minimal necessary work station is a personal computer which can be an important office tool if connected into other office machines and properly integrated into an office system. It has a great deal of flexibility and can often be tailored to suit the tastes, work habits and requirements of the user. Unlike dumb terminals, there is less tendency to saturate a central computer, since its free-standing capabilities are available after downloading a selection of data. The PC also permits the sharing of many other facilities, like larger computing power, sophisticated graphics programs, laser printers and communications. It can provide rapid access to common data bases able to provide more up to date information than printed reports. Portable computers can access the same familiar office facilities from anywhere in the world where a telephone connection can be made.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saw, C; Baikadi, M; Peters, C
2015-06-15
Purpose: To use systems engineering to design the HDR skin treatment operation for small lesions using shielded applicators, enhancing patient safety. Methods: Systems engineering is an interdisciplinary field that offers formal methodologies to study, design, implement, and manage complex engineering systems as a whole over their life-cycles. The methodologies deal with human work-processes, coordination of different teams, optimization, and risk management. The V-model of systems engineering emphasizes two streams: the specification stream, consisting of user requirements, functional requirements, and design specifications, and the testing stream, consisting of installation, operational, and performance specifications. In applying systems engineering to this project, the user and functional requirements are that (a) HDR unit parameters be downloaded from the treatment planning system, (b) dwell times and positions be generated by the treatment planning system, (c) source decay be computer calculated, and (d) a double-check system of treatment parameters comply with the NRC regulation. These requirements are intended to reduce human intervention and so improve patient safety. Results: A formal investigation indicated that the user requirements can be satisfied. The treatment operation consists of using the treatment planning system to generate a pseudo plan that is adjusted for different shielded applicators to compute the dwell times. The dwell positions, channel numbers, and dwell times are verified by the medical physicist and downloaded into the HDR unit. The decayed source strength is transferred to a spreadsheet that computes the dwell times based on the type of applicator and prescribed dose used. Prior to treatment, the source strength, dwell times, dwell positions, and channel numbers are double-checked by the radiation oncologist. No dosimetric parameters are manually calculated.
Conclusion: Systems engineering provides methodologies to effectively design an HDR treatment operation that minimizes human intervention and improves patient safety.
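The decay and dwell-time arithmetic that the spreadsheet step automates is straightforward. A sketch assuming an Ir-192 source (half-life 73.83 days; the source type is an assumption, the abstract does not name it): delivered dose scales with strength times dwell time, so dwell times scale inversely with current strength.

```python
import math

IR192_HALF_LIFE_DAYS = 73.83  # Ir-192, a common HDR source (assumption)

def decayed_strength(s0, days):
    # Exponential decay of source strength since calibration.
    return s0 * math.exp(-math.log(2) * days / IR192_HALF_LIFE_DAYS)

def corrected_dwell_time(t_plan, s_plan, s_now):
    # Delivered dose ~ strength x dwell time, so the planned dwell time
    # scales inversely with the current source strength.
    return t_plan * s_plan / s_now

# One half-life after calibration, strength halves and dwell times double.
s_now = decayed_strength(100.0, 73.83)
print(corrected_dwell_time(10.0, 100.0, s_now))  # ~20.0 s
```

Having the computer perform exactly this chain, with independent double-checks of the inputs, is the "no manual dosimetric calculation" goal described above.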
Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes
2014-01-01
The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to a specific dataset at hand. Software to create publication-quality heatmaps is developed with the R programming language, C++ programming language, and OpenGL application programming interface (API) to create industry-grade high performance graphics. We create a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative for researchers with minimal prior coding experience to allow them to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. The simplicity of HeatmapGenerator is that it only requires the user to upload a preformatted input file and download the publicly available R software language, among a few other operating system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/. 
The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
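The "scaling" feature mentioned above usually means row-wise standardization before plotting (what R's heatmap functions call scale="row"), so that each gene's expression pattern is comparable across samples regardless of its absolute level. HeatmapGenerator itself is R-based; a Python sketch of just that preprocessing step:

```python
import math

def row_scale(matrix):
    # Row-wise z-score: subtract each row's mean and divide by its sample
    # standard deviation -- the usual preprocessing before plotting a gene
    # expression heatmap so rows are comparable.
    scaled = []
    for row in matrix:
        mean = sum(row) / len(row)
        sd = math.sqrt(sum((x - mean) ** 2 for x in row) / (len(row) - 1))
        scaled.append([(x - mean) / sd for x in row])
    return scaled

expr = [[2.0, 4.0, 6.0],     # gene A across three samples
        [10.0, 20.0, 30.0]]  # gene B: same pattern, different magnitude
print(row_scale(expr))       # both rows become [-1.0, 0.0, 1.0]
```

After scaling, both genes map to the same color gradient even though their raw levels differ by a factor of five.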
Reasoning and memory: People make varied use of the information available in working memory.
Hardman, Kyle O; Cowan, Nelson
2016-05-01
Working memory (WM) is used for storing information in a highly accessible state so that other mental processes, such as reasoning, can use that information. Some WM tasks require that participants not only store information, but also reason about that information to perform optimally on the task. In this study, we used visual WM tasks that had both storage and reasoning components to determine both how ideally people are able to reason about information in WM and if there is a relationship between information storage and reasoning. We developed novel psychological process models of the tasks that allowed us to estimate for each participant both how much information they had in WM and how efficiently they reasoned about that information. Our estimates of information use showed that participants are not all ideal information users or minimal information users, but rather that there are individual differences in the thoroughness of information use in our WM tasks. However, we found that our participants tended to be more ideal than minimal. One implication of this work is that to accurately estimate the amount of information in WM, it is important to also estimate how efficiently that information is used. This new analysis contributes to the theoretical premise that human rationality may be bounded by the complexity of task demands. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Graafland, Maurits; Bok, Kiki; Schreuder, Henk W R; Schijven, Marlies P
2014-06-01
Untrained laparoscopic camera assistants in minimally invasive surgery (MIS) may cause a suboptimal view of the operating field, thereby increasing the risk for errors. Camera navigation is often performed by the least experienced member of the operating team, such as inexperienced surgical residents, operating room nurses, and medical students. The operating room nurses and medical students are currently not included as key user groups in structured laparoscopic training programs. A new virtual reality laparoscopic camera navigation (LCN) module was specifically developed for these key user groups. This multicenter prospective cohort study assesses face validity and construct validity of the LCN module on the Simendo virtual reality simulator. Face validity was assessed through a questionnaire on resemblance to reality and perceived usability of the instrument among experts and trainees. Construct validity was assessed by comparing scores of groups with different levels of experience on outcome parameters of speed and movement proficiency. The results obtained show uniform and positive evaluation of the LCN module among expert users and trainees, signifying face validity. Experts and intermediate experience groups performed significantly better in task time and camera stability during three repetitions, compared to the less experienced user groups (P < .007). Comparison of learning curves showed significant improvement of proficiency in time and camera stability for all groups during three repetitions (P < .007). The results of this study show face validity and construct validity of the LCN module. The module is suitable for use in training curricula for operating room nurses and novice surgical trainees, aimed at improving team performance in minimally invasive surgery. © The Author(s) 2013.
Bandwidth Allocation to Interactive Users in DBS-Based Hybrid Internet
1998-01-01
policies... 3.1 Framework for queuing analysis: ON/OFF source traffic model; 3.2 Service quality... minimizing the queuing delay. In consequence, we were interested in obtaining improvements in the service quality, as perceived by the users. ... the service quality as perceived by users. The merit of this approach, first introduced in [8], is the ability to capture the characteristics of the
General Mission Analysis Tool (GMAT) Architectural Specification. Draft
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Conway, Darrel J.
2007-01-01
Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross-platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. 
The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.
Pahwa, Mrinal; Kusner, Matthew; Hacker, Carl D; Bundy, David T; Weinberger, Kilian Q; Leuthardt, Eric C
2015-01-01
Previous studies suggest stable and robust control of a brain-computer interface (BCI) can be achieved using electrocorticography (ECoG). Translation of this technology from the laboratory to the real world requires additional methods that allow users to operate their ECoG-based BCI autonomously. In such an environment, users must be able to perform all tasks currently performed by the experimenter, including manually switching the BCI system on/off. Although a simple task, it can be challenging for target users (e.g., individuals with tetraplegia) due to severe motor disability. In this study, we present an automated and practical strategy to switch a BCI system on or off based on the cognitive state of the user. Using logistic regression, we built probabilistic models that utilized sub-dural ECoG signals from humans to estimate in pseudo real-time whether a person is awake or in a sleep-like state, and subsequently, whether to turn a BCI system on or off. Furthermore, we constrained these models to identify the optimal anatomical and spectral parameters for delineating states. Other methods exist to differentiate wake and sleep states using ECoG, but none account for the practical requirements of BCI application, such as minimizing the size of an ECoG implant and predicting states in real time. Our results demonstrate that, across 4 individuals, wakeful and sleep-like states can be classified with over 80% accuracy (up to 92%) in pseudo real-time using high gamma (70-110 Hz) band limited power from only 5 electrodes (platinum discs with a diameter of 2.3 mm) located above the precentral and posterior superior temporal gyrus.
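The model family involved is ordinary logistic regression mapping band-limited power to a binary state. A self-contained sketch on synthetic data (one "high-gamma power" feature standing in for the 5-electrode feature vector; the data and parameters here are made up for illustration):

```python
import math, random

def train_logistic(xs, ys, steps=200, lr=0.1):
    # Plain one-feature logistic regression trained by stochastic gradient
    # ascent on the log-likelihood: p = sigmoid(w*x + b).
    w, b = 0.0, 0.0
    for _ in range(steps):
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            w += lr * (y - p) * x
            b += lr * (y - p)
    return w, b

random.seed(0)
# Synthetic band power: higher when awake (label 1), lower when asleep (0).
awake = [random.gauss(1.0, 0.3) for _ in range(50)]
asleep = [random.gauss(0.0, 0.3) for _ in range(50)]
xs, ys = awake + asleep, [1] * 50 + [0] * 50

w, b = train_logistic(xs, ys)
acc = sum((1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5) == bool(y)
          for x, y in zip(xs, ys)) / len(xs)
print(acc)
```

The study's contribution is less the classifier itself than showing which electrodes and frequency bands make such a classifier accurate enough, with a small enough implant, to gate a BCI on and off autonomously.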
The elastic ratio: introducing curvature into ratio-based image segmentation.
Schoenemann, Thomas; Masnou, Simon; Cremers, Daniel
2011-09-01
We present the first ratio-based image segmentation method that allows imposing curvature regularity of the region boundary. Our approach is a generalization of the ratio framework pioneered by Jermyn and Ishikawa so as to allow penalty functions that take into account the local curvature of the curve. The key idea is to cast the segmentation problem as one of finding cyclic paths of minimal ratio in a graph where each graph node represents a line segment. Among ratios whose discrete counterparts can be globally minimized with our approach, we focus in particular on the elastic ratio [Formula: see text] that depends, given an image I, on the oriented boundary C of the segmented region candidate. Minimizing this ratio amounts to finding a curve, neither too small nor too curvy, through which the brightness flux is maximal. We prove the existence of minimizers for this criterion among continuous curves with mild regularity assumptions. We also prove that the discrete minimizers provided by our graph-based algorithm converge, as the resolution increases, to continuous minimizers. In contrast to most existing segmentation methods with computable and meaningful, i.e., nondegenerate, global optima, the proposed approach is fully unsupervised in the sense that it does not require any kind of user input such as seed nodes. Numerical experiments demonstrate that curvature regularity allows substantial improvement of the quality of segmentations. Furthermore, our results allow drawing conclusions about global optima of a parameterization-independent version of the snakes functional: the proposed algorithm allows determining parameter values where the functional has a meaningful solution and simultaneously provides the corresponding global solution.
Yuan, Michael Juntao; Finley, George Mike; Long, Ju; Mills, Christy; Johnson, Ron Kim
2013-01-31
Clinical decision support systems (CDSS) are important tools to improve health care outcomes and reduce preventable medical adverse events. However, the effectiveness and success of CDSS depend on their implementation context and usability in complex health care settings. As a result, usability design and validation, especially in real world clinical settings, are crucial aspects of successful CDSS implementations. Our objective was to develop a novel CDSS to help frontline nurses better manage critical symptom changes in hospitalized patients, hence reducing preventable failure-to-rescue cases. A robust user interface and implementation strategy that fit into existing workflows were key for the success of the CDSS. Guided by a formal usability evaluation framework, UFuRT (user, function, representation, and task analysis), we developed a high-level specification of the product that captures key usability requirements and is flexible to implement. We interviewed users of the proposed CDSS to identify requirements, and listed the functions and operations the system must perform. We then designed visual and workflow representations of the product to perform the operations. The user interface and workflow design were evaluated via heuristic and end user performance evaluation. The heuristic evaluation was done after the first prototype, and its results were incorporated into the product before the end user evaluation was conducted. First, we recruited 4 evaluators with strong domain expertise to study the initial prototype. Heuristic violations were coded and rated for severity. Second, after development of the system, we assembled a panel of nurses, consisting of 3 licensed vocational nurses and 7 registered nurses, to evaluate the user interface and workflow via simulated use cases. We recorded whether each session was successfully completed and its completion time.
Each nurse was asked to use the National Aeronautics and Space Administration (NASA) Task Load Index to self-evaluate the amount of cognitive and physical burden associated with using the device. A total of 83 heuristic violations were identified in the studies. The distribution of the heuristic violations and their average severity are reported. The nurse evaluators successfully completed all 30 sessions of the performance evaluations. All nurses were able to use the device after a single training session. On average, the nurses took 111 seconds (SD 30 seconds) to complete the simulated task. The NASA Task Load Index results indicated that the work overhead on the nurses was low. In fact, most of the burden measures were consistent with zero. The only potentially significant burden was temporal demand, which was consistent with the primary use case of the tool. The evaluation has shown that our design was functional and met the requirements demanded by the nurses' tight schedules and heavy workloads. The user interface embedded in the tool provided compelling utility to the nurse with minimal distraction.
Ammerman, Alice S; Hartman, Terry; DeMarco, Molly M
2017-02-01
The Supplemental Nutrition Assistance Program (SNAP) serves as an important nutritional safety net program for many Americans. Given its aim to use traditional economic levers to provide access to food, the SNAP program includes minimal nutritional requirements and restrictions. As food choices are influenced by more than just economic constraints, behavioral economics may offer insights and tools for altering food purchases for SNAP users. This manuscript outlines behavioral economics strategies that have potential to encourage healthier food choices within the SNAP program. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Probabilistic sparse matching for robust 3D/3D fusion in minimally invasive surgery.
Neumann, Dominik; Grbic, Sasa; John, Matthias; Navab, Nassir; Hornegger, Joachim; Ionasec, Razvan
2015-01-01
Classical surgery is being overtaken by minimally invasive and transcatheter procedures. As there is no direct view or access to the affected anatomy, advanced imaging techniques such as 3D C-arm computed tomography (CT) and C-arm fluoroscopy are routinely used in clinical practice for intraoperative guidance. However, due to constraints regarding acquisition time and device configuration, intraoperative modalities have limited soft tissue image quality and reliable assessment of the cardiac anatomy typically requires contrast agent, which is harmful to the patient and requires complex acquisition protocols. We propose a probabilistic sparse matching approach to fuse high-quality preoperative CT images and nongated, noncontrast intraoperative C-arm CT images by utilizing robust machine learning and numerical optimization techniques. Thus, high-quality patient-specific models can be extracted from the preoperative CT and mapped to the intraoperative imaging environment to guide minimally invasive procedures. Extensive quantitative experiments on 95 clinical datasets demonstrate that our model-based fusion approach has an average execution time of 1.56 s, while the accuracy of 5.48 mm between the anchor anatomy in both images lies within expert user confidence intervals. In direct comparison with image-to-image registration based on an open-source state-of-the-art medical imaging library and a recently proposed quasi-global, knowledge-driven multi-modal fusion approach for thoracic-abdominal images, our model-based method exhibits superior performance in terms of registration accuracy and robustness with respect to both target anatomy and anchor anatomy alignment errors.
OWLing Clinical Data Repositories With the Ontology Web Language
Pastor, Xavier; Lozano, Esther
2014-01-01
Background: The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and the rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools to answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. Objective: The objective was to design and implement a framework for the development of clinical data repositories, capable of facing the continuous change in the biomedicine domain and minimizing the technical knowledge required from final users. Methods: We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with a solid relational storage and a Web graphical user interface. Results: Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All required information to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. Conclusions: OntoCRF is a complete framework to build data repositories with a solid relational storage.
Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems, and it does not require highly skilled technical staff, which facilitates the engineering of clinical software systems. PMID:25599697
Absorbed Power Minimization in Cellular Users with Circular Antenna Arrays
NASA Astrophysics Data System (ADS)
Christofilakis, Vasilis; Votis, Constantinos; Tatsis, Giorgos; Raptis, Vasilis; Kostarakis, Panos
2010-01-01
Nowadays the electromagnetic pollution from non-ionizing radiation generated by cellular phones concerns millions of people. In this paper the use of a circular antenna array as a means of minimizing the power absorbed by cellular phone users is introduced. In particular, the different characteristics of the radiation patterns produced by a conventional helical antenna used in mobile phones operating at 900 MHz, and those produced by a circular antenna array hypothetically used in the same mobile phones, are examined in detail. Furthermore, the percentage decrease in the power absorbed in the head is estimated as a function of the direction of arrival for the circular antenna array.
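The pattern comparison above rests on the array factor of a uniform circular array. A small illustrative sketch of the azimuth-plane array factor follows (the element count, radius, and steering scheme here are assumptions for illustration, not the paper's configuration):

```python
import cmath, math

def circular_array_factor(n_elems, radius_wl, theta_deg, phi0_deg=0.0):
    """Normalized azimuth-plane array factor of a uniform circular array.

    n_elems   -- number of elements on the ring
    radius_wl -- ring radius in wavelengths
    theta_deg -- azimuth observation angle (degrees)
    phi0_deg  -- steering angle (degrees); element phases are chosen so the
                 pattern peaks toward phi0
    """
    k = 2 * math.pi                       # wavenumber times one wavelength
    theta = math.radians(theta_deg)
    phi0 = math.radians(phi0_deg)
    af = 0j
    for m in range(n_elems):
        phi_m = 2 * math.pi * m / n_elems     # angular position of element m
        # received phase toward theta minus the steering phase toward phi0
        af += cmath.exp(1j * k * radius_wl *
                        (math.cos(theta - phi_m) - math.cos(phi0 - phi_m)))
    return abs(af) / n_elems              # 1.0 exactly at the steered angle
```

Steering the main beam away from the head is what lets such an array reduce absorbed power; at the steered angle the normalized factor is 1, and it falls off elsewhere.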
Eckmann, Christian; Olbrich, Guenter; Shekarriz, Hodjat; Bruch, Hans-Peter
2003-01-01
The reproducible advantages of minimally invasive surgery have led to a worldwide spread of these techniques. Nevertheless, the increasing use of technology causes problems in the operating room (OR). The workstation environment and workflow are handicapped by a great number of isolated solutions that demand a large amount of space. The Center of Excellence in Medical Technology (CEMET) was established in 2001 as an institution for close cooperation between users, science, and manufacturers of medical devices in the State of Schleswig-Holstein, Germany. The future OR, as a major project, began with a detailed process analysis, which identified a large number of medical devices with different interfaces and poor standardisation as the main problems. Smaller and more flexible devices are necessary, as well as functional modules located outside the OR. Only actuators should be positioned near the operating area. The future OR should include a flexible-room concept and less equipment than is currently in use. A uniform human-user interface is needed to control the OR environment. This article addresses the need for a clear workspace environment, intelligent user interfaces, and a flexible-room concept to realize the full potential of minimally invasive surgery.
A mixed-integer linear programming approach to the reduction of genome-scale metabolic networks.
Röhl, Annika; Bockmayr, Alexander
2017-01-03
Constraint-based analysis has become a widely used method to study metabolic networks. While some of the associated algorithms can be applied to genome-scale network reconstructions with several thousands of reactions, others are limited to small or medium-sized models. In 2015, Erdrich et al. introduced a method called NetworkReducer, which reduces large metabolic networks to smaller subnetworks, while preserving a set of biological requirements that can be specified by the user. Already in 2001, Burgard et al. developed a mixed-integer linear programming (MILP) approach for computing minimal reaction sets under a given growth requirement. Here we present an MILP approach for computing minimum subnetworks with the given properties. The minimality (with respect to the number of active reactions) is not guaranteed by NetworkReducer, while the method by Burgard et al. does not allow specifying the different biological requirements. Our procedure is about 5-10 times faster than NetworkReducer and can enumerate all minimum subnetworks when several exist. This allows identifying common reactions that are present in all subnetworks, and reactions appearing in alternative pathways. Applying complex analysis methods to genome-scale metabolic networks is often not possible in practice. Thus it may become necessary to reduce the size of the network while keeping important functionalities. We propose an MILP solution to this problem. Compared to previous work, our approach is more efficient and allows computing not only one, but even all minimum subnetworks satisfying the required properties.
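For intuition, the idea of enumerating all minimum subnetworks that preserve a user-specified requirement can be sketched by brute force on a toy network; a real genome-scale instance needs the MILP formulation and a solver. The reaction names and the biomass requirement below are invented:

```python
from itertools import combinations

# Toy metabolic network: reactions as (inputs, outputs) over metabolite names.
# The preserved biological requirement: biomass 'BM' must still be producible
# from the external substrate 'S'.
REACTIONS = {
    "r1": ({"S"}, {"A"}),
    "r2": ({"A"}, {"B"}),
    "r3": ({"B"}, {"BM"}),
    "r4": ({"S"}, {"C"}),
    "r5": ({"C"}, {"BM"}),
    "r6": ({"A"}, {"C"}),
}

def produces_biomass(active):
    """Forward closure: can 'BM' be made from 'S' using only active reactions?"""
    have = {"S"}
    changed = True
    while changed:
        changed = False
        for r in active:
            ins, outs = REACTIONS[r]
            if ins <= have and not outs <= have:
                have |= outs
                changed = True
    return "BM" in have

def minimum_subnetworks():
    """Enumerate ALL minimum-cardinality reaction sets meeting the requirement."""
    names = sorted(REACTIONS)
    for k in range(1, len(names) + 1):
        found = [set(c) for c in combinations(names, k) if produces_biomass(c)]
        if found:
            return found
    return []
```

Enumerating subsets is exponential in the number of reactions, which is exactly why the paper encodes minimality and the biological requirements as an MILP instead.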
Using generalizability theory to develop clinical assessment protocols.
Preuss, Richard A
2013-04-01
Clinical assessment protocols must produce data that are reliable, with a clinically attainable minimal detectable change (MDC). In a reliability study, generalizability theory has 2 advantages over classical test theory. These advantages provide information that allows assessment protocols to be adjusted to match individual patient profiles. First, generalizability theory allows the user to simultaneously consider multiple sources of measurement error variance (facets). Second, it allows the user to generalize the findings of the main study across the different study facets and to recalculate the reliability and MDC based on different combinations of facet conditions. In doing so, clinical assessment protocols can be chosen based on minimizing the number of measures that must be taken to achieve a realistic MDC, using repeated measures to minimize the MDC, or simply based on the combination that best allows the clinician to monitor an individual patient's progress over a specified period of time.
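The quantities involved can be sketched from their standard definitions: the generalizability coefficient divides universe-score (person) variance by itself plus the averaged error variance, and the MDC at 95% confidence is 1.96 * sqrt(2) * SEM. A minimal sketch with invented variance components:

```python
import math

def g_coefficient(var_person, var_error_components, n_reps):
    """Generalizability (reliability) coefficient when each error facet's
    variance is averaged over n_reps repeated conditions."""
    err = sum(v / n_reps for v in var_error_components)
    return var_person / (var_person + err)

def mdc95(var_error_components, n_reps):
    """Minimal detectable change at 95% confidence for the mean of n_reps trials."""
    sem = math.sqrt(sum(v / n_reps for v in var_error_components))
    return 1.96 * math.sqrt(2) * sem

# Invented variance components: person variance 9.0, one error facet 4.0.
# Averaging over more repeated measures shrinks the SEM and hence the MDC,
# which is the trade-off the abstract describes.
g_single = g_coefficient(9.0, [4.0], 1)
mdc_single = mdc95([4.0], 1)
mdc_quad = mdc95([4.0], 4)
```

Quadrupling the repeated measures halves the MDC here, illustrating how a protocol can be tuned to reach a clinically attainable MDC.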
Learning free energy landscapes using artificial neural networks.
Sidky, Hythem; Whitmer, Jonathan K
2018-03-14
Existing adaptive bias techniques, which seek to estimate free energies and physical properties from molecular simulations, are limited by their reliance on fixed kernels or basis sets which hinder their ability to efficiently conform to varied free energy landscapes. Further, user-specified parameters are in general non-intuitive yet significantly affect the convergence rate and accuracy of the free energy estimate. Here we propose a novel method, wherein artificial neural networks (ANNs) are used to develop an adaptive biasing potential which learns free energy landscapes. We demonstrate that this method is capable of rapidly adapting to complex free energy landscapes and is not prone to boundary or oscillation problems. The method is made robust to hyperparameters and overfitting through Bayesian regularization which penalizes network weights and auto-regulates the number of effective parameters in the network. ANN sampling represents a promising innovative approach which can resolve complex free energy landscapes in less time than conventional approaches while requiring minimal user input.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Z. J.; Wells, D.; Green, J.
Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind it, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
The optimal design of service level agreement in IAAS based on BDIM
NASA Astrophysics Data System (ADS)
Liu, Xiaochen; Zhan, Zhiqiang
2013-03-01
Cloud computing has become more and more prevalent over the past few years, highlighting the importance of Infrastructure-as-a-Service (IaaS). This kind of service enables scaling of bandwidth, memory, computing power, and storage. But SLAs in IaaS are also complex and varied, and users consider the business value of the service. To meet most users' requirements, a methodology for designing optimal SLAs in IaaS from a business perspective is proposed. This method differs from conventional SLA design in that it considers not only the service provider's perspective but also the customer's. The methodology better captures the linkage between service provider and service client by minimizing the business loss originating from performance degradation and IT infrastructure failures while maximizing profits for both provider and clients. An optimal design in an IaaS model is provided, and an example is analyzed to show that this approach obtains higher profit.
Exascale Storage Systems the SIRIUS Way
NASA Astrophysics Data System (ADS)
Klasky, S. A.; Abbasi, H.; Ainsworth, M.; Choi, J.; Curry, M.; Kurc, T.; Liu, Q.; Lofstead, J.; Maltzahn, C.; Parashar, M.; Podhorszki, N.; Suchyta, E.; Wang, F.; Wolf, M.; Chang, C. S.; Churchill, M.; Ethier, S.
2016-10-01
As the exascale computing age emerges, data-related issues are becoming critical factors that determine how and where we do computing. Popular approaches used by traditional I/O solutions and storage libraries become increasingly bottlenecked due to their assumptions about data movement, re-organization, and storage. While new technologies, such as “burst buffers”, can help address some of the short-term performance issues, it is essential that we reexamine the underlying storage and I/O infrastructure to effectively support requirements and challenges at exascale and beyond. In this paper we present a new approach to the exascale Storage System and I/O (SSIO), which is based on allowing users to inject application knowledge into the system and leverage this knowledge to better manage, store, and access large data volumes so as to minimize the time to scientific insights. Central to our approach is the distinction between the data, the metadata, and the knowledge contained therein, transferred from the user to the system by describing the “utility” of data as it ages.
Developing a Ruggedized User-Friendly UAS for Monitoring Volcanic Emissions
NASA Astrophysics Data System (ADS)
Wardell, L. J.; Elston, J. S.; Stachura, M.
2017-12-01
Using lessons learned from a history of airborne volcano measurements and a range of UAS R&D, a reliable and ruggedized UAS is being developed specifically for volcano monitoring and response. A key feature is the user interface (UI) that allows for a menu of automated flight plans that will account for terrain and sensor requirements. Due to variation in the response times of miniaturized airborne sensors, flight plan options are extended to account for sensor lag when needed. By automating such complicating variables into the UI, the amount of background and training needed for operation is further minimized. Payload options include simultaneous in situ gas and particle sensors combined with downward-looking imagers to provide a wide range of data products. Currently under development by Black Swift Technologies, the latest updates and test results will be presented. Specifications of the Superswift airframe include a 6,000 m flight ceiling, 2.4 kg payload capacity, and 2 hr endurance.
The graphics and data acquisition software package
NASA Technical Reports Server (NTRS)
Crosier, W. G.
1981-01-01
A software package was developed for use with micro and minicomputers, particularly the LSI-11/DPD-11 series. The package has a number of Fortran-callable subroutines which perform a variety of frequently needed tasks for biomedical applications. All routines are well documented, flexible, easy to use and modify, and require minimal programmer knowledge of peripheral hardware. The package is also economical of memory and CPU time. A single subroutine call can perform any one of the following functions: (1) plot an array of integer values from sampled A/D data, (2) plot an array of Y values versus an array of X values; (3) draw horizontal and/or vertical grid lines of selectable type; (4) annotate grid lines with user units; (5) get coordinates of user controlled crosshairs from the terminal for interactive graphics; (6) sample any analog channel with program selectable gain; (7) wait a specified time interval, and (8) perform random access I/O of one or more blocks of a sequential disk file. Several miscellaneous functions are also provided.
Drawing road networks with focus regions.
Haunert, Jan-Henrik; Sering, Leon
2011-12-01
Mobile users of maps typically need detailed information about their surroundings plus some context information about remote places. To keep parts of the map from becoming too dense, cartographers have designed mapping functions that enlarge a user-defined focus region; such functions are sometimes called fish-eye projections. The extra map space occupied by the enlarged focus region is compensated by distorting other parts of the map. We argue that, in a map showing a network of roads relevant to the user, distortion should preferably take place in those areas where the network is sparse. Therefore, we do not apply a predefined mapping function. Instead, we consider the road network as a graph whose edges are the road segments. We compute a new spatial mapping with a graph-based optimization approach, minimizing the square sum of distortions at edges. Our optimization method is based on a convex quadratic program (CQP); CQPs can be solved in polynomial time. Important requirements on the output map are expressed as linear inequalities. In particular, we show how to forbid edge crossings. We have implemented our method in a prototype tool. For instances of different sizes, our method generated output maps that were far less distorted than those generated with a predefined fish-eye projection. Future work is needed to automate the selection of roads relevant to the user. Furthermore, we aim at fast heuristics for application in real-time systems. © 2011 IEEE
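The optimization idea, minimizing the square sum of distortions at edges with some node positions held fixed, can be sketched as an unconstrained least-squares layout solved by gradient descent. The paper itself solves a convex quadratic program with additional linear inequalities (e.g. to forbid edge crossings); the toy network below is invented:

```python
def fit_positions(coords, edges, desired, fixed, iters=2000, lr=0.05):
    """Move free nodes so each edge vector matches its desired (possibly
    enlarged) vector; the squared edge distortions are minimized by
    gradient descent on the quadratic objective."""
    pos = {v: list(p) for v, p in coords.items()}
    for _ in range(iters):
        grad = {v: [0.0, 0.0] for v in pos}
        for (u, v), (dx, dy) in zip(edges, desired):
            ex = pos[v][0] - pos[u][0] - dx      # edge-vector residual, x
            ey = pos[v][1] - pos[u][1] - dy      # edge-vector residual, y
            grad[v][0] += ex; grad[v][1] += ey
            grad[u][0] -= ex; grad[u][1] -= ey
        for v in pos:
            if v not in fixed:
                pos[v][0] -= lr * grad[v][0]
                pos[v][1] -= lr * grad[v][1]
    return pos

# Three collinear road junctions. Edge (0, 1) lies in the focus region and
# should be doubled in length; edge (1, 2) should keep its original vector.
coords = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (2.0, 0.0)}
edges = [(0, 1), (1, 2)]
desired = [(2.0, 0.0), (1.0, 0.0)]
pos = fit_positions(coords, edges, desired, fixed={0})
```

Because only the focus edge's desired vector is enlarged, all the distortion is absorbed where the network can accommodate it, which is the intuition behind preferring distortion in sparse areas.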
Worst-Case Energy Efficiency Maximization in a 5G Massive MIMO-NOMA System.
Chinnadurai, Sunil; Selvaprabhu, Poongundran; Jeong, Yongchae; Jiang, Xueqin; Lee, Moon Ho
2017-09-18
In this paper, we examine the robust beamforming design to tackle the energy efficiency (EE) maximization problem in a 5G massive multiple-input multiple-output (MIMO)-non-orthogonal multiple access (NOMA) downlink system with imperfect channel state information (CSI) at the base station. A novel joint user pairing and dynamic power allocation (JUPDPA) algorithm is proposed to minimize the inter-user interference and also to enhance the fairness between the users. This work assumes imperfect CSI by adding uncertainties to channel matrices with a worst-case model, i.e., the ellipsoidal uncertainty model (EUM). A fractional non-convex optimization problem is formulated to maximize the EE subject to the transmit power constraints and the minimum rate requirement for the cell edge user. The designed problem is difficult to solve due to its nonlinear fractional objective function. We first employ the properties of fractional programming to transform the non-convex problem into its equivalent parametric form. Then, an efficient iterative algorithm is proposed, based on the constrained concave-convex procedure (CCCP), that solves and achieves convergence to a stationary point of the above problem. Finally, Dinkelbach's algorithm is employed to determine the maximum energy efficiency. Comprehensive numerical results illustrate that the proposed scheme attains higher worst-case energy efficiency as compared with the existing NOMA schemes and the conventional orthogonal multiple access (OMA) scheme.
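Dinkelbach's algorithm, used above for the final energy-efficiency step, reduces a fractional objective f(x)/g(x) to a sequence of parametric subproblems. A minimal sketch over a finite candidate set follows (the rate and power models below are invented stand-ins, not the paper's system model):

```python
import math

def dinkelbach(candidates, f, g, tol=1e-9, max_iter=100):
    """Maximize f(x)/g(x) with g > 0 over a finite candidate set by repeatedly
    solving the parametric subproblem max_x f(x) - lam*g(x); at the optimum
    the parametric maximum reaches zero and lam equals the best ratio."""
    lam = 0.0
    x = candidates[0]
    for _ in range(max_iter):
        x = max(candidates, key=lambda c: f(c) - lam * g(c))
        if abs(f(x) - lam * g(x)) < tol:
            break                      # F(lam) ~ 0: lam is the optimal ratio
        lam = f(x) / g(x)
    return x, lam

# Toy stand-in for EE maximization: choose a transmit power maximizing
# rate / total consumed power (transmit power plus 1 W of circuit power).
powers = [0.5, 1.0, 2.0, 4.0]
rate = lambda p: math.log2(1 + 10 * p)
power_used = lambda p: p + 1.0
best_p, best_ee = dinkelbach(powers, rate, power_used)
```

In the paper the inner parametric problem is itself a non-trivial CCCP step; here it is a simple maximum over candidates, which keeps the Dinkelbach update visible.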
BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark.
Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung
2016-05-01
Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today's data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG's simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact.
Cost-efficient scheduling of FAST observations
NASA Astrophysics Data System (ADS)
Luo, Qi; Zhao, Laiping; Yu, Ce; Xiao, Jian; Sun, Jizhou; Zhu, Ming; Zhong, Yi
2018-03-01
A cost-efficient schedule for the Five-hundred-meter Aperture Spherical radio Telescope (FAST) requires maximizing the number of observable proposals and the overall scientific priority, and minimizing the overall slew cost generated by telescope shifting, while taking into account constraints including astronomical-object visibility, user-defined observable times, and avoidance of Radio Frequency Interference (RFI). In this contribution, we first solve the problem of maximizing the number of observable proposals and the scientific priority by modeling it as a Minimum Cost Maximum Flow (MCMF) problem. The optimal schedule can be found by any MCMF solution algorithm. Then, to minimize the slew cost of the generated schedule, we devise a detection-based method for maximally matchable edges to reduce the problem size, and propose a backtracking algorithm to find the perfect matching with minimum slew cost. Experiments on a real dataset from the NASA/IPAC Extragalactic Database (NED) show that the proposed scheduler can increase the usage of available times with high scientific priority and reduce the slew cost significantly in a very short time.
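The first stage's reduction to Minimum Cost Maximum Flow can be illustrated with a toy proposal-to-time-slot assignment and a textbook successive-shortest-path MCMF solver (the instance, node layout, and costs below are invented; a production scheduler would use a faster MCMF variant):

```python
def min_cost_max_flow(n, edges, s, t):
    """Successive shortest paths with Bellman-Ford.
    edges: list of [u, v, capacity, cost]; returns (max_flow, total_cost)."""
    graph = [[] for _ in range(n)]
    for u, v, cap, cost in edges:
        graph[u].append([v, cap, cost, len(graph[v])])      # forward arc
        graph[v].append([u, 0, -cost, len(graph[u]) - 1])   # residual arc
    flow = total_cost = 0
    while True:
        dist = [float("inf")] * n
        dist[s] = 0
        parent = [None] * n          # parent[v] = (node u, index of arc u->v)
        for _ in range(n - 1):       # Bellman-Ford handles negative residuals
            for u in range(n):
                if dist[u] == float("inf"):
                    continue
                for i, (v, cap, cost, _) in enumerate(graph[u]):
                    if cap > 0 and dist[u] + cost < dist[v]:
                        dist[v] = dist[u] + cost
                        parent[v] = (u, i)
        if dist[t] == float("inf"):
            return flow, total_cost  # no augmenting path remains
        push, v = float("inf"), t    # bottleneck capacity along the path
        while v != s:
            u, i = parent[v]
            push = min(push, graph[u][i][1])
            v = u
        v = t                        # apply the augmentation
        while v != s:
            u, i = parent[v]
            graph[u][i][1] -= push
            graph[v][graph[u][i][3]][1] += push
            total_cost += push * graph[u][i][2]
            v = u
        flow += push

# Toy instance: source 0, proposals 1-2, time slots 3-4, sink 5.
# Arc costs on proposal->slot edges play the role of slew costs.
instance = [[0, 1, 1, 0], [0, 2, 1, 0],
            [1, 3, 1, 2], [1, 4, 1, 5],
            [2, 3, 1, 3], [2, 4, 1, 1],
            [3, 5, 1, 0], [4, 5, 1, 0]]
sched_flow, sched_cost = min_cost_max_flow(6, instance, 0, 5)
```

Both proposals get scheduled (flow 2), and the assignment proposal 1 to slot 3 plus proposal 2 to slot 4 attains the minimum total cost of 3.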
Linte, Cristian A; Davenport, Katherine P; Cleary, Kevin; Peters, Craig; Vosburgh, Kirby G; Navab, Nassir; Edwards, Philip Eddie; Jannin, Pierre; Peters, Terry M; Holmes, David R; Robb, Richard A
2013-03-01
Mixed reality environments for medical applications have been explored and developed over the past three decades in an effort to enhance the clinician's view of anatomy and facilitate the performance of minimally invasive procedures. These environments must faithfully represent the real surgical field and require seamless integration of pre- and intra-operative imaging, surgical instrument tracking, and display technology into a common framework centered around and registered to the patient. However, in spite of their reported benefits, few mixed reality environments have been successfully translated into clinical use. Several challenges that contribute to the difficulty in integrating such environments into clinical practice are presented here and discussed in terms of both technical and clinical limitations. This article should raise awareness among both developers and end-users toward facilitating a greater application of such environments in the surgical practice of the future. Copyright © 2013 Elsevier Ltd. All rights reserved.
Logging roads and log decks for wildlife habitat
William H. Healy
1989-01-01
Roads are essential to manage and use forest land. They can improve wildlife habitat and provide recreational opportunities. But roads are often controversial because they have so many different users: loggers, hikers, hunters, and off-road-vehicle drivers. Benefits to wildlife can be maximized and user conflicts minimized by careful planning and design. Decisions about...
Secret Shopping as User Experience Assessment Tool
ERIC Educational Resources Information Center
Boyce, Crystal M.
2015-01-01
Secret shopping is a form of unobtrusive evaluation that can be accomplished with minimal effort, but still produce rich results. With as few as 11 shoppers, the author was able to identify trends in user satisfaction with services provided across two entry-level desks at Illinois Wesleyan University's The Ames Library. The focus of this secret…
Airport and Airway Costs: Allocation and Recovery in the 1980’s.
1987-02-01
1997 [8]. Volume 4, FAA Cost Recovery Options [9]. Volume 5, Econometric Cost Functions for FAA Cost Allocation Model [10]. Volume 6, Users... and relative price elasticities (Ramsey pricing technique). User fees based on Ramsey pricing tend to be less burdensome on users and minimize... A full discussion of the Ramsey pricing techniques is provided in Allocation of Federal Airport and Airway Costs for FY 1985 [6]. In step 5...
TDRSS telecommunications study. Phase 1: Final report
NASA Technical Reports Server (NTRS)
Cahn, C. R.; Cnossen, R. S.
1974-01-01
A parametric analysis of the telecommunications support capability of the Tracking and Data Relay Satellite System (TDRSS) was performed. Emphasis was placed on maximizing the support capability provided to the user while minimizing the impact on the user spacecraft. This study evaluates the present TDRSS configuration, as presented in the TDRSS Definition Phase Study Report of December 1973, to determine potential changes for improving the overall performance. In addition, it provides specifications of the user transponder equipment to be used in the TDRSS.
Moustgaard, Heta; Joutsenniemi, Kaisla; Myrskylä, Mikko; Martikainen, Pekka
2014-01-01
A marked decline in suicide rates has co-occurred with increased antidepressant sales in several countries, but the causal connection between the trends remains debated. Most previous studies have focused on overall suicide rates and neglected differential effects in population subgroups. Our objective was to investigate whether increasing sales of non-tricyclic antidepressants have reduced alcohol- and non-alcohol-related suicide risk in different population subgroups. We followed a nationally representative sample of 950,158 Finnish adults in 1995-2007 for alcohol-related (n = 2,859) and non-alcohol-related (n = 8,632) suicides. We assessed suicide risk by gender and social group according to regional sales of non-tricyclic antidepressants, measured by sold doses per capita, prevalence of antidepressant users, and the proportion of antidepressant users with doses reflecting minimally adequate treatment. Fixed-effects Poisson regression models controlled for regional differences and time trends that may influence suicide risk irrespective of antidepressant sales. The number of sold antidepressant doses per capita and the prevalence of antidepressant users were unrelated to male suicide risk. However, a one-percentage-point increase in the proportion of antidepressant users receiving minimally adequate treatment reduced non-alcohol-related male suicide risk by one percent (relative risk 0.987, 95% confidence interval 0.976-0.998). This beneficial effect emerged only among men with high education, high income, and employment, among men without a partner, and among men not owning their home. Alcohol-related suicides and female suicides were unrelated to all measures of antidepressant sales. We found little evidence that increases in overall sales or in the prevalence of non-tricyclic antidepressant users caused the fall in suicide rates in Finland in 1995-2007. However, the rise in the proportion of antidepressant users receiving minimally adequate treatment, possibly due to enhanced treatment compliance, may have prevented non-alcohol-related suicides among men.
Mobile app self-care versus in-office care for stress reduction: a cost minimization analysis.
Luxton, David D; Hansen, Ryan N; Stanfill, Katherine
2014-12-01
We calculated the cost of providing stress reduction care with a mobile phone app (Breathe2Relax) in comparison with normal in-person care, the standard method for managing stress in military and civilian populations. We conducted a cost-minimization analysis. The total cost to the military healthcare system of treating 1000 patients with the app was $106,397. Treating 1000 patients with in-office care cost $68,820. Treatment using the app became less expensive than in-office treatment at approximately 1600 users. From the perspective of the civilian healthcare system, treatment using the app became less expensive than in-office treatment at approximately 1500 users. An online tool was used to obtain data about the number of app downloads and usage sessions. A total of 47,000 users had accessed the app for 10-30 min sessions in the 2.5 years since the release of the app. Assuming that all 47,000 users were military beneficiaries, the savings to the military healthcare system would be $2.7 million; if the 47,000 users were civilian, the savings to the civilian healthcare system would be $2.9 million. Because of the large number of potential users, the total societal savings resulting from self-care using the app may be considerable. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
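The figures above imply a cost model with a large fixed component and a small per-user component for the app. As a hedged back-of-the-envelope sketch (the fixed/variable split below is inferred from the reported totals and the ~1600-user military break-even, not stated in the study), the implied model can be reconstructed:

```python
# Assumption: app cost = F + v*n (fixed plus per-user), in-office cost = c*n.
# Only the three input figures come from the abstract; F and v are derived.
in_office_per_patient = 68_820 / 1000      # $68.82 per patient, in-office care
app_total_1000 = 106_397                   # app cost for 1000 patients
breakeven_users = 1600                     # reported approximate break-even

# At break-even: F + v*1600 = c*1600, and F + v*1000 = 106,397. Solve for v, F.
v = (in_office_per_patient * breakeven_users - app_total_1000) / (breakeven_users - 1000)
F = app_total_1000 - v * 1000
implied_breakeven = F / (in_office_per_patient - v)
print(round(v, 2), round(F), round(implied_breakeven))
```

Under this (assumed) linear model, the implied per-user app cost is about $6 against roughly $69 per in-office patient, which is why the savings grow quickly once the fixed cost is amortized over tens of thousands of users.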
Bath, Timothy G; Bozdag, Selcuk; Afzal, Vackar; Crowther, Daniel
2011-05-13
Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy, and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation; however, smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and the complexity of the processing procedures involved, especially with the use of high-throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample-related data within a translational medicine research setting, especially where limited IT support is available. We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies and easily extended as specific requirements dictate. A reference implementation using an Oracle 11g database and the Python web framework Django is presented. By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications, and user fatigue issues that typify many enterprise software implementations.
The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured and backed-up database brings the immediate benefits of improved data visibility, audit and overall data quality. The open-source availability of this software allows others to rapidly implement a LIMS which in itself might sufficiently address user requirements. In situations where this software does not meet requirements, it can serve to elicit more accurate specifications from end-users for a more heavyweight LIMS by acting as a demonstrable prototype.
Surles, M C; Richardson, J S; Richardson, D C; Brooks, F P
1994-02-01
We describe a new paradigm for modeling proteins in interactive computer graphics systems--continual maintenance of a physically valid representation, combined with direct user control and visualization. This is achieved by a fast algorithm for energy minimization, capable of real-time performance on all atoms of a small protein, plus graphically specified user tugs. The modeling system, called Sculpt, rigidly constrains bond lengths, bond angles, and planar groups (similar to existing interactive modeling programs), while it applies elastic restraints to minimize the potential energy due to torsions, hydrogen bonds, and van der Waals and electrostatic interactions (similar to existing batch minimization programs), and user-specified springs. The graphical interface can show bad and/or favorable contacts, and individual energy terms can be turned on or off to determine their effects and interactions. Sculpt finds a local minimum of the total energy that satisfies all the constraints using an augmented Lagrange-multiplier method; calculation time increases only linearly with the number of atoms because the matrix of constraint gradients is sparse and banded. On a 100-MHz MIPS R4000 processor (Silicon Graphics Indigo), Sculpt achieves 11 updates per second on a 20-residue fragment and 2 updates per second on an 80-residue protein, using all atoms except non-H-bonding hydrogens, and without electrostatic interactions. Applications of Sculpt are described: to reverse the direction of bundle packing in a designed 4-helix bundle protein, to fold up a 2-stranded beta-ribbon into an approximate beta-barrel, and to design the sequence and conformation of a 30-residue peptide that mimics one partner of a protein subunit interaction. 
Computer models that are both interactive and physically realistic (within the limitations of a given force field) have 2 significant advantages: (1) they make feasible the modeling of very large changes (such as needed for de novo design), and (2) they help the user understand how different energy terms interact to stabilize a given conformation. The Sculpt paradigm combines many of the best features of interactive graphical modeling, energy minimization, and actual physical models, and we propose it as an especially productive way to use current and future increases in computer speed.
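The augmented Lagrange-multiplier scheme Sculpt uses for constrained minimization can be illustrated on a toy one-dimensional "protein" of two atoms: minimize a quadratic energy subject to a rigid bond-length constraint. The energy function, step sizes, and iteration counts below are invented for illustration and bear no relation to Sculpt's actual force field:

```python
# Augmented Lagrangian: minimize E(x) = x1^2 + (x2-3)^2 subject to the rigid
# "bond" constraint c(x) = x2 - x1 - 1 = 0. Inner loop: gradient descent on
# E + lam*c + (mu/2)*c^2; outer loop: multiplier update lam += mu*c.
def solve(mu=10.0, lr=0.01, outer=50, inner=200):
    x1 = x2 = lam = 0.0
    for _ in range(outer):
        for _ in range(inner):
            c = x2 - x1 - 1.0
            g1 = 2 * x1 - (lam + mu * c)          # d/dx1 of augmented objective
            g2 = 2 * (x2 - 3.0) + (lam + mu * c)  # d/dx2 of augmented objective
            x1 -= lr * g1
            x2 -= lr * g2
        lam += mu * (x2 - x1 - 1.0)               # Lagrange multiplier update
    return x1, x2
```

The analytic constrained minimum is (x1, x2) = (1, 2), and the iteration converges there while driving the constraint violation to zero, which is the essence of satisfying rigid bond constraints exactly while elastically minimizing the remaining energy terms.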
Automated information retrieval using CLIPS
NASA Technical Reports Server (NTRS)
Raines, Rodney Doyle, III; Beug, James Lewis
1991-01-01
Expert systems have considerable potential to assist computer users in managing the large volume of information available to them. One possible use of an expert system is to model the information retrieval interests of a human user and then make recommendations to the user as to articles of interest. At Cal Poly, a prototype expert system written in the C Language Integrated Production System (CLIPS) serves as an Automated Information Retrieval System (AIRS). AIRS monitors a user's reading preferences, develops a profile of the user, and then evaluates items returned from the information base. When prompted by the user, AIRS returns a list of items of interest to the user. In order to minimize the impact on system resources, AIRS is designed to run in the background during periods of light system use.
Kiel, Menno A; Röder, Esther; Gerth van Wijk, Roy; Al, Maiwenn J; Hop, Wim C J; Rutten-van Mölken, Maureen P M H
2013-08-01
Subcutaneous allergen immunotherapy (SCIT) and sublingual allergen immunotherapy (SLIT) are safe and effective treatments of allergic rhinitis, but high levels of compliance and persistence are crucial to achieving the desired clinical effects. Our objective was to assess levels and predictors of compliance and persistence among grass pollen, tree pollen, and house dust mite immunotherapy users in real life and to estimate the costs of premature discontinuation. We performed a retrospective analysis of a community pharmacy database from The Netherlands containing data from 6486 patients starting immunotherapy for 1 or more of the allergens of interest between 1994 and 2009. Two thousand seven hundred ninety-six patients received SCIT, and 3690 received SLIT. Time to treatment discontinuation was analyzed using Cox proportional hazards models with time-dependent covariates, where appropriate. Overall, only 18% of users reached the minimally required treatment duration of 3 years (SCIT, 23%; SLIT, 7%). Median durations for SCIT and SLIT users were 1.7 and 0.6 years, respectively (P < .001). Other independent predictors of premature discontinuation were prescriber, with patients of general practitioners demonstrating longer persistence than those of allergologists and other medical specialists; single-allergen immunotherapy; lower socioeconomic status; and younger age. Of the persistent patients, 56% were never late in picking up their medication from the pharmacy. Direct medication costs per nonpersistent patient discontinuing in the third year of treatment were €3800, an amount that was largely misspent. Real-life persistence is better in SCIT users than in SLIT users, although it is low overall. There is an urgent need for further identification of potential barriers and of measures that will enhance persistence and compliance. Copyright © 2013 American Academy of Allergy, Asthma & Immunology. Published by Mosby, Inc. All rights reserved.
Evaluation of a proximity card authentication system for health care settings.
Fontaine, Jacqueline; Zheng, Kai; Van De Ven, Cosmas; Li, Huiyang; Hiner, James; Mitchell, Kathy; Gendler, Stephen; Hanauer, David A
2016-08-01
Multiple users access computer workstations in busy clinical settings, requiring many logins throughout the day as users switch from one computer to another. This can lead to workflow inefficiencies as well as security concerns resulting from users sharing login sessions to save time. Proximity cards and readers have the potential to improve efficiency and security by allowing users to access clinical workstations simply by bringing the card near the reader, without the need for manual entry of a username and password. To assess the perceived impact of proximity cards and readers for rapid user authentication to clinical workstations in the setting of an existing electronic health record with single sign-on software already installed. Questionnaires were administered to clinical faculty and staff five months before and three months after the installation of proximity card readers in an inpatient birthing center and an outpatient obstetrics clinic. Open-ended feedback was also collected and qualitatively analyzed. There were 71 and 33 responses to the pre- and post-implementation surveys, respectively. There was a significant increase in the perceived speed of login with the proximity cards, and a significant decrease in the self-reported occurrence of shared login sessions between users. Feedback regarding the system was mostly positive, although several caveats were noted, including minimal benefit when used with an obstetric application that did not support single sign-on. Proximity cards and readers, along with single sign-on software, have the potential to enhance workflow efficiency by allowing for faster login times and diminish security concerns by reducing shared logins on clinical workstations. The positive feedback was used by our health system leadership to support the expanded implementation of the proximity card readers throughout the clinical setting. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
CARE 3 user-friendly interface user's guide
NASA Technical Reports Server (NTRS)
Martensen, A. L.
1987-01-01
CARE 3 predicts the unreliability of highly reliable reconfigurable fault-tolerant systems that include redundant computers or computer systems. CARE3MENU is a user-friendly interface used to create an input for the CARE 3 program. The CARE3MENU interface has been designed to minimize user input errors. Although a CARE3MENU session may be successfully completed and all parameters may be within specified limits or ranges, the CARE 3 program is not guaranteed to produce meaningful results if the user incorrectly interprets the CARE 3 stochastic model. The CARE3MENU User Guide provides complete information on how to create a CARE 3 model with the interface. The CARE3MENU interface runs under the VAX/VMS operating system.
NASA Technical Reports Server (NTRS)
Devito, D. M.
1981-01-01
A low-cost GPS civil-user mobile terminal whose purchase cost is roughly an order of magnitude less than estimates for its military counterpart is considered, with focus on ground station requirements for position monitoring of civil users requiring this capability and on civil-user navigation and location-monitoring requirements. Existing survey literature was examined to ascertain the potential users of a low-cost NAVSTAR receiver and to estimate their number, function, and accuracy requirements. System concepts are defined for low-cost user equipment for in-situ navigation and for the retransmission of low-data-rate positioning data via a geostationary satellite to a central computing facility.
Yu, Rong; Zhong, Weifeng; Xie, Shengli; Zhang, Yan; Zhang, Yun
2016-02-01
As the next-generation power grid, smart grid will be integrated with a variety of novel communication technologies to support the explosive data traffic and the diverse requirements of quality of service (QoS). Cognitive radio (CR), which has the favorable ability to improve the spectrum utilization, provides an efficient and reliable solution for smart grid communications networks. In this paper, we study the QoS differential scheduling problem in the CR-based smart grid communications networks. The scheduler is responsible for managing the spectrum resources and arranging the data transmissions of smart grid users (SGUs). To guarantee the differential QoS, the SGUs are assigned to have different priorities according to their roles and their current situations in the smart grid. Based on the QoS-aware priority policy, the scheduler adjusts the channels allocation to minimize the transmission delay of SGUs. The entire transmission scheduling problem is formulated as a semi-Markov decision process and solved by the methodology of adaptive dynamic programming. A heuristic dynamic programming (HDP) architecture is established for the scheduling problem. By the online network training, the HDP can learn from the activities of primary users and SGUs, and adjust the scheduling decision to achieve the purpose of transmission delay minimization. Simulation results illustrate that the proposed priority policy ensures the low transmission delay of high priority SGUs. In addition, the emergency data transmission delay is also reduced to a significantly low level, guaranteeing the differential QoS in smart grid.
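The static priority policy underlying the scheduler above can be illustrated with a toy channel allocator: each time slot, the free channels serve the most urgent smart-grid users first, so high-priority (e.g. emergency) traffic finishes earliest. The user set, priorities, and packet counts are invented, and the sketch omits the paper's actual contribution, the HDP learner that adapts the allocation online:

```python
def schedule(users, n_channels):
    """users: list of (priority, name, packets); lower priority number = more
    urgent. Each slot, up to n_channels pending packets are served, most
    urgent users first. Returns the slot in which each user finishes
    (a proxy for its transmission delay)."""
    remaining = {name: pkts for _, name, pkts in users}
    finish, slot = {}, 0
    while any(remaining.values()):
        slot += 1
        pending = sorted(u for u in users if remaining[u[1]] > 0)
        for _, name, _ in pending[:n_channels]:
            remaining[name] -= 1
            if remaining[name] == 0:
                finish[name] = slot
    return finish
```

With two channels and three users, an emergency user with 2 packets finishes in slot 2 while a low-priority bulk user with the same load waits until slot 4, matching the differential-QoS behavior the abstract reports.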
NASA Technical Reports Server (NTRS)
Idris, Husni; Shen, Ni; Wing, David J.
2011-01-01
The growing demand for air travel is increasing the need for mitigating air traffic congestion and complexity problems, which are already at high levels. At the same time, new surveillance, navigation, and communication technologies are enabling major transformations in the air traffic management system, including net-based information sharing and collaboration, performance-based access to airspace resources, and trajectory-based rather than clearance-based operations. The new system will feature different schemes for allocating tasks and responsibilities between the ground and airborne agents and between the human and automation, with potential capacity and cost benefits. Therefore, complexity management requires new metrics and methods that can support these new schemes. This paper presents metrics and methods for preserving trajectory flexibility that have been proposed to support a trajectory-based approach for complexity management by airborne or ground-based systems. It presents extensions to these metrics as well as to the initial research conducted to investigate the hypothesis that using these metrics to guide user and service provider actions will naturally mitigate traffic complexity. The analysis showed promising results in that: (1) trajectory flexibility preservation mitigated traffic complexity, as indicated by inducing self-organization in the traffic patterns and lowering traffic complexity indicators such as dynamic density and traffic entropy; (2) trajectory flexibility preservation reduced the potential for secondary conflicts in separation assurance; and (3) trajectory flexibility metrics showed potential application to support user and service provider negotiations for minimizing the constraints imposed on trajectories without jeopardizing their objectives.
Study of data collection platform concepts: Data collection system user requirements
NASA Technical Reports Server (NTRS)
1973-01-01
The overall purpose of the survey was to provide real world data on user requirements. The intent was to assess data collection system user requirements by questioning actual potential users rather than speculating on requirements. The end results of the survey are baseline requirements models for both a data collection platform and a data collection system. These models were derived from the survey results. The real value of these models lies in the fact that they are based on actual user requirements as delineated in the survey questionnaires. Some users desire data collection platforms of small size and light weight. These sizes and weights are beyond the present state of the art. Also, the survey provided a wealth of information on the nature and constituency of the data collection user community as well as information on user applications for data collection systems. Finally, the data sheds light on the generalized platform concept. That is, the diversity of user requirements shown in the data indicates the difficulty that can be anticipated in attempting to implement such a concept.
van Velsen, Evert F S; Niessen, Wiro J; de Weert, Thomas T; de Monyé, Cécile; van der Lugt, Aad; Meijering, Erik; Stokking, Rik
2007-07-01
Vessel image analysis is crucial when considering therapeutic options for (cardio)vascular diseases. Our method, VAMPIRE (Vascular Analysis using Multiscale Paths Inferred from Ridges and Edges), involves two parts: first, a user defines a start point and an endpoint, from which a lumen path is automatically determined and used for initialization; second, the vessel lumen is automatically segmented on computed tomographic angiography (CTA) images. Both parts are based on the detection of vessel-like structures by analyzing intensity, edge, and ridge information. A multi-observer evaluation study was performed to compare VAMPIRE with a conventional method on the CTA data of 15 patients with carotid artery stenosis. In addition to the start point and endpoint, the two radiologists required on average 2.5 (SD: 1.9) additional points to define a lumen path when using the conventional method, and 0.1 (SD: 0.3) when using VAMPIRE. The segmentation results were quantitatively evaluated using Similarity Indices, which were slightly lower between VAMPIRE and the two radiologists (0.90 and 0.88, respectively) than the Similarity Index between the radiologists (0.92). The evaluation shows that the improved definition of a lumen path requires minimal user interaction, and that using this path as initialization leads to good automatic lumen segmentation results.
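The Similarity Index reported above is, in segmentation studies of this kind, usually the Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|). That interpretation is an assumption here; the sketch below computes it on toy voxel-index sets rather than 3-D CTA lumen masks:

```python
# Dice similarity between two segmentations represented as sets of voxel indices.
# An index of 1.0 means identical masks; 0.0 means no overlap.
def similarity_index(a, b):
    a, b = set(a), set(b)
    return 2 * len(a & b) / (len(a) + len(b))
```

For example, two masks of four voxels each sharing three voxels score 2·3/8 = 0.75, so values like the study's 0.88-0.92 indicate very high overlap.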
Dynamic Task Optimization in Remote Diabetes Monitoring Systems.
Suh, Myung-Kyung; Woodbridge, Jonathan; Moin, Tannaz; Lan, Mars; Alshurafa, Nabil; Samy, Lauren; Mortazavi, Bobak; Ghasemzadeh, Hassan; Bui, Alex; Ahmadi, Sheila; Sarrafzadeh, Majid
2012-09-01
Diabetes is the seventh leading cause of death in the United States, but careful symptom monitoring can prevent adverse events. A real-time patient monitoring and feedback system is one solution to help patients with diabetes and their healthcare professionals monitor health-related measurements and provide dynamic feedback. However, data-driven methods to dynamically prioritize and generate tasks are not well investigated in the domain of remote health monitoring. This paper presents a wireless health project (WANDA) that leverages sensor technology and wireless communication to monitor the health status of patients with diabetes. The WANDA dynamic task management function applies data analytics in real time to discretize continuous features, applying data clustering and association rule mining techniques to manage a sliding window size dynamically and to prioritize required user tasks. The developed algorithm minimizes the number of daily action items required of patients with diabetes using association rules that satisfy minimum support, confidence, and conditional probability thresholds. Each of these tasks maximizes information gain, thereby improving the overall level of patient adherence and satisfaction. Experimental results from applying EM-based clustering and Apriori algorithms show that the developed algorithm can predict further events with higher confidence levels and reduce the number of user tasks by up to 76.19%.
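The association-rule filtering step the abstract describes can be sketched minimally: keep single-item rules that meet minimum support and confidence over a log of daily task observations. The transactions and thresholds below are invented toy data; WANDA's pipeline additionally uses EM clustering, a dynamic sliding window, and conditional probability thresholds, all omitted here:

```python
from itertools import combinations

def mine_rules(transactions, min_support=0.5, min_confidence=0.7):
    """transactions: list of sets of completed task names.
    Returns rules (lhs, rhs, support, confidence) meeting both thresholds."""
    n = len(transactions)
    def support(itemset):
        return sum(itemset <= t for t in transactions) / n
    rules = []
    for a, b in combinations(sorted(set().union(*transactions)), 2):
        for lhs, rhs in ((a, b), (b, a)):          # test both rule directions
            sup = support({lhs, rhs})
            conf = sup / support({lhs}) if support({lhs}) else 0.0
            if sup >= min_support and conf >= min_confidence:
                rules.append((lhs, rhs, sup, conf))
    return rules
```

A rule such as "log weight → check glucose" with confidence 1.0 would let the system drop the redundant reminder, which is how pruning rules of this kind reduces the daily action items a patient sees.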
TELICS—A Telescope Instrument Control System for Small/Medium Sized Astronomical Observatories
NASA Astrophysics Data System (ADS)
Srivastava, Mudit K.; Ramaprakash, A. N.; Burse, Mahesh P.; Chordia, Pravin A.; Chillal, Kalpesh S.; Mestry, Vilas B.; Das, Hillol K.; Kohok, Abhay A.
2009-10-01
For any modern astronomical observatory, it is essential to have an efficient interface between the telescope and its back-end instruments. However, for small and medium-sized observatories, this requirement is often limited by tight financial constraints. Therefore a simple yet versatile and low-cost control system is required for such observatories to minimize cost and effort. Here we report the development of a modern, multipurpose instrument control system TELICS (Telescope Instrument Control System) to integrate the controls of various instruments and devices mounted on the telescope. TELICS consists of an embedded hardware unit known as a common control unit (CCU) in combination with Linux-based data acquisition and user interface. The hardware of the CCU is built around the ATmega 128 microcontroller (Atmel Corp.) and is designed with a backplane, master-slave architecture. A Qt-based graphical user interface (GUI) has been developed and the back-end application software is based on C/C++. TELICS provides feedback mechanisms that give the operator good visibility and a quick-look display of the status and modes of instruments as well as data. TELICS has been used for regular science observations since 2008 March on the 2 m, f/10 IUCAA Telescope located at Girawali in Pune, India.
Smartphone-Based Food Diagnostic Technologies: A Review.
Rateni, Giovanni; Dario, Paolo; Cavallo, Filippo
2017-06-20
A new generation of mobile sensing approaches offers significant advantages over traditional platforms in terms of test speed, control, low cost, ease-of-operation, and data management, and requires minimal equipment and user involvement. The marriage of novel sensing technologies with cellphones enables the development of powerful lab-on-smartphone platforms for many important applications including medical diagnosis, environmental monitoring, and food safety analysis. This paper reviews the recent advancements and developments in the field of smartphone-based food diagnostic technologies, with an emphasis on custom modules to enhance smartphone sensing capabilities. These devices typically comprise multiple components such as detectors, sample processors, disposable chips, batteries and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. To date, researchers have demonstrated several promising approaches employing various sensing techniques and device configurations. We aim to provide a systematic classification according to the detection strategy, providing a critical discussion of strengths and weaknesses. We have also extended the analysis to the food scanning devices that are increasingly populating the Internet of Things (IoT) market, demonstrating how this field is indeed promising, as the research outputs are quickly capitalized on new start-up companies.
Smartphones for cell and biomolecular detection.
Liu, Xiyuan; Lin, Tung-Yi; Lillehoj, Peter B
2014-11-01
Recent advances in biomedical science and technology have played a significant role in the development of new sensors and assays for cell and biomolecular detection. Generally, these efforts are aimed at reducing the complexity and costs associated with diagnostic testing so that it can be performed outside of a laboratory or hospital setting, requiring minimal equipment and user involvement. In particular, point-of-care (POC) testing offers immense potential for many important applications including medical diagnosis, environmental monitoring, food safety, and biosecurity. When coupled with smartphones, POC systems can offer portability, ease of use and enhanced functionality while maintaining performance. This review article focuses on recent advancements and developments in smartphone-based POC systems within the last 6 years with an emphasis on cell and biomolecular detection. These devices typically comprise multiple components, such as detectors, sample processors, disposable chips, batteries, and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. Researchers have demonstrated several promising approaches employing various detection schemes and device configurations, and it is expected that further developments in biosensors, battery technology and miniaturized electronics will enable smartphone-based POC technologies to become more mainstream tools in the scientific and biomedical communities.
Alor-Hernández, Giner; Sánchez-Cervantes, José Luis; Juárez-Martínez, Ulises; Posada-Gómez, Rubén; Cortes-Robles, Guillermo; Aguilar-Laserre, Alberto
2012-03-01
Emergency healthcare is one of the emerging application domains for information services, and one that requires highly multimodal services. The time consumed by the pre-hospital emergency process is critical. Therefore, minimizing the time required to provide primary care and consultation to patients is one of the crucial factors in improving healthcare delivery in emergency situations. In this sense, dynamically locating medical entities is a complex, time-consuming process, and it can be critical when a person requires medical attention. This work presents a multimodal location-based system, called ITOHealth, for locating and assigning medical entities. ITOHealth provides a multimodal, middleware-oriented integrated architecture built on a service-oriented architecture in order to provide information about medical entities on mobile devices and web browsers, with enriched interfaces supporting multimodality. ITOHealth's multimodality is based on the use of Microsoft Agent characters, the integration of natural-language voice into the characters, and multi-language and multi-character support, providing an advantage for users with visual impairments.
Harvesting geographic features from heterogeneous raster maps
NASA Astrophysics Data System (ADS)
Chiang, Yao-Yi
2010-11-01
Raster maps offer a great deal of geospatial information and are easily accessible compared to other geospatial data. However, harvesting geographic features locked in heterogeneous raster maps to obtain the geospatial information is challenging. This is because of the varying image quality of raster maps (e.g., scanned maps with poor image quality and computer-generated maps with good image quality), the overlapping geographic features in maps, and the typical lack of metadata (e.g., map geocoordinates, map source, and original vector data). Previous work on map processing is typically limited to a specific type of map and often relies on intensive manual work. In contrast, this thesis investigates a general approach that does not rely on any prior knowledge and requires minimal user effort to process heterogeneous raster maps. This approach includes automatic and supervised techniques to process raster maps for separating individual layers of geographic features from the maps and recognizing geographic features in the separated layers (i.e., detecting road intersections, generating and vectorizing road geometry, and recognizing text labels). The automatic technique eliminates user intervention by exploiting common map properties of how road lines and text labels are drawn in raster maps. For example, the road lines are elongated linear objects and the characters are small connected objects. The supervised technique utilizes labels of road and text areas to handle complex raster maps, or maps with poor image quality, and can process a variety of raster maps with minimal user input. The results show that the general approach can handle raster maps with varying map complexity, color usage, and image quality.
By matching extracted road intersections to another geospatial dataset, we can identify the geocoordinates of a raster map and further align the raster map, separated feature layers from the map, and recognized features from the layers with the geospatial dataset. The road vectorization and text recognition results outperform state-of-the-art commercial products, with considerably less user input. The approach in this thesis allows us to make use of the geospatial information of heterogeneous maps locked in raster format.
ERIC Educational Resources Information Center
Read, Aaron
2013-01-01
The rise of stakeholder centered software development has led to organizations engaging users early in the development process to help define system requirements. To facilitate user involvement in the requirements elicitation process, companies can use Group Support Systems (GSS) to conduct requirements elicitation workshops. The effectiveness of…
The Berlin Brain–Computer Interface: Non-Medical Uses of BCI Technology
Blankertz, Benjamin; Tangermann, Michael; Vidaurre, Carmen; Fazli, Siamac; Sannelli, Claudia; Haufe, Stefan; Maeder, Cecilia; Ramsey, Lenny; Sturm, Irene; Curio, Gabriel; Müller, Klaus-Robert
2010-01-01
Brain–computer interfacing (BCI) is a steadily growing area of research. While initially BCI research was focused on applications for paralyzed patients, increasingly more alternative applications in healthy human subjects are proposed and investigated. In particular, monitoring of mental states and decoding of covert user states have seen a strong rise of interest. Here, we present some examples of such novel applications which provide evidence for the promising potential of BCI technology for non-medical uses. Furthermore, we discuss distinct methodological improvements required to bring non-medical applications of BCI technology to a diversity of layperson target groups, e.g., ease of use, minimal training, general usability, short control latencies. PMID:21165175
Passive and hybrid solar technologies program summary
NASA Astrophysics Data System (ADS)
1985-05-01
The goal of the national energy policy is to foster an adequate supply of energy at reasonable prices. This policy recognizes that adequate supply requires flexibility, with no undue reliance on any single source of supply. The goal of reasonable prices suggests economic efficiency, so that consumers (individuals, commercial and industrial users alike) are not penalized by government regulation or subsidy. The strategies for achieving this energy policy goal are: (1) to minimize federal regulation in energy pricing while maintaining public health and safety and environmental quality, and (2) to promote a balanced and mixed energy resource system through research and development. One of the keys to energy sufficiency is the scientific application of passive solar energy techniques.
Assembling, maintaining and servicing Space Station
NASA Technical Reports Server (NTRS)
Doetsch, K. H.; Werstiuk, H.; Creasy, W.; Browning, R.
1987-01-01
The assembly, maintenance, and servicing of the Space Station and its facilities are discussed. The tools and facilities required for the assembly, maintenance, and servicing of the Station are described; the ground and transportation infrastructures needed for the Space Station are examined. The roles of automation and robotics in reducing the EVAs of the crew, minimizing disturbances to the Space Station environment, and enhancing user friendliness are investigated. Servicing/maintenance tasks are categorized based on: (1) urgency, (2) location of servicing/maintenance, (3) environmental control, (4) dexterity, (5) transportation, (6) crew interactions, (7) equipment interactions, and (8) Space Station servicing architecture. An example of a servicing mission by the Space Station for the Hubble Space Telescope is presented.
Spaceflight Operations Services Grid (SOSG) Prototype Implementation and Feasibility Study
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Thigpen, William W.; Lisotta, Anthony J.; Redman, Sandra
2004-01-01
Science Operations Services Grid is focusing on building a prototype grid-based environment that incorporates existing and new spaceflight services to enable current and future NASA programs with cost savings and new and evolvable methods to conduct science in a distributed environment. The Science Operations Services Grid (SOSG) will provide a distributed environment for widely disparate organizations to conduct their systems and processes in a more efficient and cost effective manner. These organizations include those that: 1) engage in space-based science and operations, 2) develop space-based systems and processes, and 3) conduct scientific research, bringing together disparate scientific disciplines like geology and oceanography to create new information. In addition, educational outreach will be significantly enhanced by providing to schools the same tools used by NASA, with the ability of the schools to actively participate on many levels in the science generated by NASA from space and on the ground. The services range from voice, video and telemetry processing and display to data mining, high level processing and visualization tools, all accessible from a single portal. In this environment, users would not require high-end systems or processes at their home locations to use these services. Also, the user would need to know minimal details about the applications in order to utilize the services. In addition, security at all levels is an underlying goal of the project. The Science Operations Services Grid will focus on four tools that are currently used by the ISS Payload community along with nine more that are new to the community. Under the prototype, four Grid virtual organizations (VOs) will be developed to represent four types of users. They are a Payload (experimenters) VO, a Flight Controllers VO, an Engineering and Science Collaborators VO, and an Education and Public Outreach VO.
The User-based services will be implemented to replicate the operational voice, video, telemetry, and commanding systems. Once the User-based services are in place, they will be analyzed to establish the feasibility of Grid enabling. If feasible, each User-based service will be Grid enabled. The remaining non-Grid services, if not already Web enabled, will be made so. In the end, four portals will be developed, one for each VO. Each portal will contain the appropriate User-based services required for that VO to operate.
Yuan, Michael Juntao; Finley, George Mike; Mills, Christy; Johnson, Ron Kim
2013-01-01
Background Clinical decision support systems (CDSS) are important tools to improve health care outcomes and reduce preventable medical adverse events. However, the effectiveness and success of CDSS depend on their implementation context and usability in complex health care settings. As a result, usability design and validation, especially in real world clinical settings, are crucial aspects of successful CDSS implementations. Objective Our objective was to develop a novel CDSS to help frontline nurses better manage critical symptom changes in hospitalized patients, hence reducing preventable failure to rescue cases. A robust user interface and implementation strategy that fit into existing workflows was key for the success of the CDSS. Methods Guided by a formal usability evaluation framework, UFuRT (user, function, representation, and task analysis), we developed a high-level specification of the product that captures key usability requirements and is flexible to implement. We interviewed users of the proposed CDSS to identify requirements, listed functions, and operations the system must perform. We then designed visual and workflow representations of the product to perform the operations. The user interface and workflow design were evaluated via heuristic and end user performance evaluation. The heuristic evaluation was done after the first prototype, and its results were incorporated into the product before the end user evaluation was conducted. First, we recruited 4 evaluators with strong domain expertise to study the initial prototype. Heuristic violations were coded and rated for severity. Second, after development of the system, we assembled a panel of nurses, consisting of 3 licensed vocational nurses and 7 registered nurses, to evaluate the user interface and workflow via simulated use cases. We recorded whether each session was successfully completed and its completion time. 
Each nurse was asked to use the National Aeronautics and Space Administration (NASA) Task Load Index to self-evaluate the amount of cognitive and physical burden associated with using the device. Results A total of 83 heuristic violations were identified in the studies. The distribution of the heuristic violations and their average severity are reported. The nurse evaluators successfully completed all 30 sessions of the performance evaluations. All nurses were able to use the device after a single training session. On average, the nurses took 111 seconds (SD 30 seconds) to complete the simulated task. The NASA Task Load Index results indicated that the work overhead on the nurses was low. In fact, most of the burden measures were consistent with zero. The only potentially significant burden was temporal demand, which was consistent with the primary use case of the tool. Conclusions The evaluation has shown that our design was functional and met the requirements demanded by the nurses’ tight schedules and heavy workloads. The user interface embedded in the tool provided compelling utility to the nurse with minimal distraction. PMID:23612350
MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data
2014-01-01
Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and Drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and Drosophila. The entire set of analyses for 4 datasets (34 total microarrays) finished in about one hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103
An Optimized Handover Scheme with Movement Trend Awareness for Body Sensor Networks
Sun, Wen; Zhang, Zhiqiang; Ji, Lianying; Wong, Wai-Choong
2013-01-01
When a body sensor network (BSN) that is linked to the backbone via a wireless network interface moves from one coverage zone to another, a handover is required to maintain network connectivity. This paper presents an optimized handover scheme with movement trend awareness for BSNs. The proposed scheme predicts the future position of a BSN user using the movement trend extracted from historical positions, and adjusts the handover decision accordingly. Handover initiation time is optimized when the unnecessary handover rate is estimated to meet the requirement and the outage probability is minimized. The proposed handover scheme is simulated in a BSN deployment area in a hospital environment in the UK. Simulation results show that the proposed scheme reduces the outage probability by 22% as compared with the existing hysteresis-based handover scheme under the constraint of an acceptable handover rate. PMID:23736852
Auto identification technology and its impact on patient safety in the Operating Room of the Future.
Egan, Marie T; Sandberg, Warren S
2007-03-01
Automatic identification technologies, such as bar coding and radio frequency identification, are ubiquitous in everyday life but virtually nonexistent in the operating room. User expectations, based on everyday experience with automatic identification technologies, have generated much anticipation that these systems will improve readiness, workflow, and safety in the operating room, with minimal training requirements. We report, in narrative form, a multi-year experience with various automatic identification technologies in the Operating Room of the Future Project at Massachusetts General Hospital. In each case, the additional human labor required to make these 'labor-saving' technologies function in the medical environment has proved to be their undoing. We conclude that while automatic identification technologies show promise, significant barriers to realizing their potential still exist. Nevertheless, overcoming these obstacles is necessary if the vision of an operating room of the future in which all processes are monitored, controlled, and optimized is to be achieved.
Characterizing Interference in Radio Astronomy Observations through Active and Unsupervised Learning
NASA Technical Reports Server (NTRS)
Doran, G.
2013-01-01
In the process of observing signals from astronomical sources, radio astronomers must mitigate the effects of manmade radio sources such as cell phones, satellites, aircraft, and observatory equipment. Radio frequency interference (RFI) often occurs as short bursts (< 1 ms) across a broad range of frequencies, and can be confused with signals from sources of interest such as pulsars. With ever-increasing volumes of data being produced by observatories, automated strategies are required to detect, classify, and characterize these short "transient" RFI events. We investigate an active learning approach in which an astronomer labels events that are most confusing to a classifier, minimizing the human effort required for classification. We also explore the use of unsupervised clustering techniques, which automatically group events into classes without user input. We apply these techniques to data from the Parkes Multibeam Pulsar Survey to characterize several million detected RFI events from over a thousand hours of observation.
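The active-learning strategy described in this abstract — having the astronomer label the events a classifier finds most confusing — is commonly implemented as entropy-based uncertainty sampling. A minimal sketch under that assumption (the survey's actual classifier and feature pipeline are not specified here, so the function below is illustrative only):

```python
import numpy as np

def most_confusing(probabilities, n_queries=5):
    """Rank events for human labeling by predictive entropy.

    Rows of `probabilities` are per-event class probability vectors
    (e.g. pulsar vs. RFI burst); the events whose predictions are
    closest to uniform carry the most uncertainty and are queried first.
    """
    p = np.clip(np.asarray(probabilities, dtype=float), 1e-12, 1.0)
    entropy = -(p * np.log(p)).sum(axis=1)     # Shannon entropy per event
    return np.argsort(entropy)[::-1][:n_queries]
```

Each labeling round would retrain the classifier on the newly labeled events and re-rank the remainder, so human effort concentrates on the decision boundary rather than on millions of easily classified bursts.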
Energy usage while maintaining thermal comfort: A case study of a UNT dormitory
NASA Astrophysics Data System (ADS)
Gambrell, Dusten
Campus dormitories for the University of North Texas house over 5500 students per year; each of them requires certain comfortable living conditions while living there. There is an inherent cost required to achieve minimal comfort levels; the cost is mostly natural gas for water and room heating, and electricity for cooling, lighting, and peripherals. The US Department of Energy has developed several programs to aid in performing energy simulations that help those interested create more cost-effective building designs. Energy-10 is one such program; it allows users to conduct whole-house evaluations by reviewing and altering a few parameters such as building materials, solar heating, energy-efficient windows, etc. The idea of this project was to recreate a campus dormitory, emulate its existing energy consumption, and then find ways of lowering that usage while maintaining a high level of personal comfort.
NASA Astrophysics Data System (ADS)
Nagel, Markus; Hoheisel, Martin; Petzold, Ralf; Kalender, Willi A.; Krause, Ulrich H. W.
2007-03-01
Integrated solutions for navigation systems with CT, MR or US systems become more and more popular for medical products. Such solutions improve the medical workflow, reduce hardware, space and costs requirements. The purpose of our project was to develop a new electromagnetic navigation system for interventional radiology which is integrated into C-arm CT systems. The application is focused on minimally invasive percutaneous interventions performed under local anaesthesia. Together with a vacuum-based patient immobilization device and newly developed navigation tools (needles, panels) we developed a safe and fully automatic navigation system. The radiologist can directly start with navigated interventions after loading images without any prior user interaction. The complete system is adapted to the requirements of the radiologist and to the clinical workflow. For evaluation of the navigation system we performed different phantom studies and achieved an average accuracy of better than 2.0 mm.
NASA Technical Reports Server (NTRS)
Creason, A. S.; Miranda, F. A.
1996-01-01
Knowledge of the microwave properties at cryogenic temperatures of components fabricated using High-Temperature Superconductors (HTS) is useful in the design of HTS-based microwave circuits. Therefore, fast and reliable characterization techniques have been developed to study the aforementioned properties. In this paper, we discuss computer analysis techniques employed in the cryogenic characterization of HTS-based resonators. The revised data analysis process requires minimal user input and organizes the data in a form that is easily accessible by the user for further examination. These programs retrieve data generated during the cryogenic characterization at microwave frequencies of HTS-based resonators and use it to calculate parameters such as the loaded and unloaded quality factors (Q and Q(sub o), respectively), the resonant frequency (f(sub o)), and the coupling coefficient (k), which are important quantities in the evaluation of HTS resonators. While the data are also stored for further use, the programs allow the user to obtain a graphical representation of any of the measured parameters as a function of temperature soon after the completion of the cryogenic measurement cycle. Although these programs were developed to study planar HTS-based resonators operating in the reflection mode, they could also be used in the cryogenic characterization of two-port (i.e., reflection/transmission) resonators.
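For a one-port reflection resonator of the kind this abstract describes, the quantities it names are related by Q(sub o) = Q(1 + k). The sketch below estimates them from a measured |S11| trace; it is a hedged illustration assuming the standard single-resonance reflection model and an undercoupled device (k < 1), not a reproduction of the authors' actual programs:

```python
import numpy as np

def resonator_params(freq, s11_db, undercoupled=True):
    """Estimate (f0, loaded Q, unloaded Q, coupling k) from a one-port
    reflection trace |S11| in dB, assuming a single well-resolved resonance.
    """
    freq = np.asarray(freq, dtype=float)
    s11 = 10.0 ** (np.asarray(s11_db, dtype=float) / 20.0)  # linear |S11|
    i0 = int(np.argmin(s11))            # resonance = deepest reflection dip
    f0, gamma0 = freq[i0], s11[i0]
    # Half-power points of the reflection dip: |S11|^2 = (1 + |Γ0|^2) / 2
    half = np.sqrt((1.0 + gamma0 ** 2) / 2.0)
    inside = s11 <= half
    bandwidth = freq[inside][-1] - freq[inside][0]
    q_loaded = f0 / bandwidth
    # Coupling coefficient from the dip depth; the formula flips for an
    # over-coupled resonator, which |S11| alone cannot distinguish
    k = (1 - gamma0) / (1 + gamma0) if undercoupled else (1 + gamma0) / (1 - gamma0)
    q_unloaded = q_loaded * (1 + k)
    return f0, q_loaded, q_unloaded, k
```

With the model Γ(δ) = (1 - k + j2Q(sub o)δ)/(1 + k + j2Q(sub o)δ), δ = (f - f(sub o))/f(sub o), the half-power criterion above yields exactly Q = f(sub o)/Δf, so the three returned Q-related values are mutually consistent.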
Precise Haptic Device Co-Location for Visuo-Haptic Augmented Reality.
Eck, Ulrich; Pankratz, Frieder; Sandor, Christian; Klinker, Gudrun; Laga, Hamid
2015-12-01
Visuo-haptic augmented reality systems enable users to see and touch digital information that is embedded in the real world. PHANToM haptic devices are often employed to provide haptic feedback. Precise co-location of computer-generated graphics and the haptic stylus is necessary to provide a realistic user experience. Previous work has focused on calibration procedures that compensate for the non-linear position error caused by inaccuracies in the joint angle sensors. In this article, we present a more complete procedure that additionally compensates for errors in the gimbal sensors and improves position calibration. The proposed procedure further includes software-based temporal alignment of sensor data and a method for the estimation of a reference for position calibration, resulting in increased robustness against haptic device initialization and external tracker noise. We designed our procedure to require minimal user input to maximize usability. We conducted an extensive evaluation with two different PHANToMs, two different optical trackers, and a mechanical tracker. Compared to state-of-the-art calibration procedures, our approach significantly improves the co-location of the haptic stylus. This results in higher fidelity visual and haptic augmentations, which are crucial for fine-motor tasks in areas such as medical training simulators, assembly planning tools, or rapid prototyping applications.
Effects of Meteorological Data Quality on Snowpack Modeling
NASA Astrophysics Data System (ADS)
Havens, S.; Marks, D. G.; Robertson, M.; Hedrick, A. R.; Johnson, M.
2017-12-01
Detailed quality control of meteorological inputs is the most time-intensive component of running the distributed, physically-based iSnobal snow model, and the effect of data quality of the inputs on the model is unknown. The iSnobal model has been run operationally since WY2013, and is currently run in several basins in Idaho and California. The largest amount of user input during modeling is for the quality control of precipitation, temperature, relative humidity, solar radiation, wind speed and wind direction inputs. Precipitation inputs require detailed user input and are crucial to correctly model the snowpack mass. This research applies a range of quality control methods to meteorological input, from raw input with minimal cleaning, to complete user-applied quality control. The meteorological input cleaning generally falls into two categories. The first is global minimum/maximum and missing value correction that could be corrected and/or interpolated with automated processing. The second category is quality control for inputs that are not globally erroneous, yet are still unreasonable and generally indicate malfunctioning measurement equipment, such as temperature or relative humidity that remains constant, or does not correlate with daily trends observed at nearby stations. This research will determine how sensitive model outputs are to different levels of quality control and guide future operational applications.
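The first, automatable category of cleaning described above (global min/max screening with interpolation over missing values) and the stuck-sensor case (a temperature or humidity signal that remains constant) can be sketched as simple filters. The functions and thresholds below are illustrative assumptions, not the operational iSnobal pipeline:

```python
import numpy as np

def qc_series(values, vmin, vmax):
    """First-pass automated QC: flag out-of-range or missing samples,
    then linearly interpolate across the flagged gaps."""
    v = np.array(values, dtype=float)
    bad = ~np.isfinite(v) | (v < vmin) | (v > vmax)
    good = ~bad
    if bad.any() and good.sum() >= 2:
        idx = np.arange(v.size)
        v[bad] = np.interp(idx[bad], idx[good], v[good])
    return v, bad

def flag_flatlines(values, window=6, tol=0.0):
    """Flag runs where the signal stays constant (within tol) over
    `window` consecutive samples, a typical stuck-sensor signature.
    These samples are not globally erroneous, so they need their own check."""
    v = np.asarray(values, dtype=float)
    flags = np.zeros(v.size, dtype=bool)
    for i in range(v.size - window + 1):
        seg = v[i:i + window]
        if np.max(seg) - np.min(seg) <= tol:
            flags[i:i + window] = True
    return flags
```

For example, an hourly air-temperature series with a -999 sentinel and a spike would have both samples flagged and filled from their neighbors, while the flat-line check catches the second failure category, which global bounds cannot.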
Robertson, Eden G; Wakefield, Claire E; Cohn, Richard J; O'Brien, Tracey; Ziegler, David S; Fardell, Joanna E
2018-05-04
The internet is increasingly being used to disseminate health information. Given the complexity of pediatric oncology clinical trials, we developed Delta, a Web-based decision aid to support families deciding whether or not to enroll their child with cancer in a clinical trial. This paper details the Agile development process of Delta and the results of user testing. Development was iterative and involved 5 main stages: a requirements analysis, planning, design, development, and user testing. For user testing, we conducted 13 eye-tracking analyses and think-aloud interviews with health care professionals (n=6) and parents (n=7). Results suggested that there was minimal rereading of, and a high level of engagement with, the content. However, there were some navigational problems. Participants reported high acceptability (12/13) and high usability of the website (8/13). Delta demonstrates the utility of Agile in the development of a Web-based decision aid for health purposes. Our study provides a clear step-by-step guide to developing a Web-based psychosocial tool within the health setting. ©Eden G Robertson, Claire E Wakefield, Richard J Cohn, Tracey O'Brien, David S Ziegler, Joanna E Fardell. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 04.05.2018.
An overview on the emerging area of identification, characterization, and assessment of health apps.
Paglialonga, Alessia; Lugo, Alessandra; Santoro, Eugenio
2018-05-28
The need to characterize and assess health apps has inspired a significant amount of research in recent years, in search of methods able to provide potential app users with relevant, meaningful knowledge. This article presents an overview of the recent literature in this field and categorizes, by discussing specific examples, the various methodologies introduced so far for the identification, characterization, and assessment of health apps. Specifically, this article outlines the most significant web-based resources for app identification, relevant frameworks for descriptive characterization of apps' features, and a number of methods for the assessment of quality along its various components (e.g., evidence base, trustworthiness, privacy, or user engagement). The development of methods to characterize apps' features and to assess their quality is important to define benchmarks and minimum requirements. Similarly, such methods are important to categorize potential risks and challenges in the field so that risks can be minimized, whenever possible, by design. Understanding methods to assess apps is key to raising the standards of quality of health apps on the market, towards the final goal of delivering apps that are built on the pillars of evidence base, reliability, long-term effectiveness, and user-oriented quality. Copyright © 2018. Published by Elsevier Inc.
EMPHASIS™/Nevada CABANA User Guide Version 2.1.2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, C. David; Bohnhoff, William J.; Powell, Jennifer L.
2017-11-15
The CABle ANAlysis (CABANA) portion of the EMPHASIS™ suite is designed specifically for the simulation of cable SGEMP. The code can be used to evaluate the response of a specific cable design to a threat, or to compare and minimize the relative response of different designs. This document provides user-specific information to facilitate the application of the code to cables of interest.
IntellEditS: intelligent learning-based editor of segmentations.
Harrison, Adam P; Birkbeck, Neil; Sofka, Michal
2013-01-01
Automatic segmentation techniques, despite demonstrating excellent overall accuracy, can often produce inaccuracies in local regions. As a result, correcting segmentations remains an important task that is often laborious, especially when done manually for 3D datasets. This work presents a powerful tool called Intelligent Learning-Based Editor of Segmentations (IntellEditS) that minimizes user effort and further improves segmentation accuracy. The tool partners interactive learning with an energy-minimization approach to editing. Based on interactive user input, a discriminative classifier is trained and applied to the edited 3D region to produce soft voxel labeling. The labels are integrated into a novel energy functional along with the existing segmentation and image data. Unlike the state of the art, IntellEditS is designed to correct segmentation results represented not only as masks but also as meshes. In addition, IntellEditS accepts intuitive boundary-based user interactions. The versatility and performance of IntellEditS are demonstrated on both MRI and CT datasets consisting of varied anatomical structures and resolutions.
Iturrate, Iñaki; Montesano, Luis; Chavarriaga, Ricardo; del R Millán, Jose; Minguez, Javier
2011-01-01
One of the main problems of both synchronous and asynchronous EEG-based BCIs is the need for an initial calibration phase before the system can be used. This phase is necessary due to the high non-stationarity of the EEG, which changes between sessions and users. The calibration process limits BCI systems to scenarios where the outputs are tightly controlled, and makes these systems unfriendly and exhausting for users. Although reducing calibration time has been studied for asynchronous signals, it remains an open issue for event-related potentials. Here, we propose minimizing the calibration time for single-trial error potentials by using classifiers based on inter-subject information. The results show that it is possible to have a high-performance classifier from the beginning of the experiment, one that is able to adapt itself, making the calibration phase shorter and transparent to the user.
Gener: a minimal programming module for chemical controllers based on DNA strand displacement
Kahramanoğulları, Ozan; Cardelli, Luca
2015-01-01
Summary: Gener is a development module for programming chemical controllers based on DNA strand displacement. Gener is developed with the aim of providing a simple interface that minimizes the opportunities for programming errors: Gener allows the user to test the computations of the DNA programs based on a simple two-domain strand displacement algebra, the minimal available so far. The tool allows the user to perform stepwise computations with respect to the rules of the algebra as well as exhaustive search of the computation space with different options for exploration and visualization. Gener can be used in combination with existing tools, and in particular, its programs can be exported to Microsoft Research's DSD tool as well as to LaTeX. Availability and implementation: Gener is available for download at the Cosbi website at http://www.cosbi.eu/research/prototypes/gener as a Windows executable that can be run on Mac OS X and Linux by using Mono. Contact: ozan@cosbi.eu PMID:25957353
Design for disassembly and sustainability assessment to support aircraft end-of-life treatment
NASA Astrophysics Data System (ADS)
Savaria, Christian
Gas turbine engine design is a multidisciplinary and iterative process. Many design iterations are necessary to address the challenges among the disciplines. In the creation of a new engine architecture, design time is crucial to capturing new business opportunities. At the detail design phase, it has proven very difficult to correct an unsatisfactory design. To overcome this difficulty, the concept of Multi-Disciplinary Optimization (MDO) at the preliminary design phase (Preliminary MDO, or PMDO) is used, allowing more freedom to make design changes. PMDO also reduces the time spent in the preliminary design phase. The PMDO concept was used to create parametric models and new correlations for high pressure turbine housings and shroud segments as part of a new design process. First, dedicated parametric models were created for their reusability and versatility. Their ease of use compared to non-parameterized models allows more design iterations and reduces set-up and design time. Second, geometry correlations were created to minimize the number of parameters used in turbine housing and shroud segment design. Since the turbine housing and shroud segment geometries are required in tip clearance analyses, care was taken not to oversimplify the parametric formulation. In addition, a user interface was developed to interact with the parametric models and further reduce design time. Third, the cooling flow predictions require many engine parameters (i.e., geometric and performance parameters and air properties) and a reference shroud segment. A second correlation study was conducted to minimize the number of engine parameters required in the cooling flow predictions and to facilitate the selection of a reference shroud segment.
Finally, the parametric models, the geometry correlations, and the user interface resulted in a time saving of 50% and an increase in accuracy of 56% in the new design system compared to the existing one. Regarding the cooling flow correlations, the number of engine parameters was reduced by a factor of 6, yielding a simplified prediction model and hence a faster shroud segment selection process.
Study of power management technology for orbital multi-100KWe applications. Volume 2: Study results
NASA Technical Reports Server (NTRS)
Mildice, J. W.
1980-01-01
The preliminary requirements and technology advances required for cost-effective space power management systems at multi-100 kW levels were identified. System requirements were defined by establishing a baseline space platform in the 250 kWe range and examining typical user loads and interfaces. The most critical design parameters identified for detailed analysis include: increased distribution voltages and space plasma losses, the choice between ac and dc distribution systems, shuttle servicing effects on reliability, life cycle costs, and frequency impacts on the power management system and payload systems for ac transmission. The first choice for a power management system for this kind of application and size range is a hybrid ac/dc combination with the following major features: modular design and construction, sized for minimum weight/life cycle cost; high voltage transmission (100 Vac RMS); medium voltage array (up to 440 Vdc); resonant inversion; transformer rotary joint; high frequency power transmission line (up to 20 kHz); energy storage on the array side of the rotary joint; full redundancy; and 10-year life with minimal replacement and repair.
NASA Technical Reports Server (NTRS)
1983-01-01
The program generates a profile of altitude, airspeed, and flight path angle as a function of range between a given set of origin and destination points, for particular models of transport aircraft provided by NASA. Inputs to the program include the vertical wind profile, the aircraft takeoff weight, the costs of time and fuel, certain constraint parameters, and control flags. The profile can be near optimum in the sense of minimizing: (1) fuel, (2) time, or (3) a combination of fuel and time (direct operating cost (DOC)). The user can also, as an option, specify the length of time the flight is to span. The theory behind the technical details of this program is also presented.
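The time/fuel trade-off behind option (3) can be illustrated with a toy direct-operating-cost model; the fuel-flow law, coefficients, and speed range below are hypothetical, not the NASA program's aircraft models:

```python
# Toy direct-operating-cost model: DOC = c_t*T + c_f*F, with a hypothetical
# fuel-flow law flow(v) = a + b*v**2 (lb/h) at cruise speed v (knots).
def doc(v, dist=2000.0, c_t=2000.0, c_f=0.5, a=1000.0, b=0.05):
    t = dist / v               # flight time, hours
    fuel = (a + b * v**2) * t  # fuel burned, lb
    return c_t * t + c_f * fuel

# Grid search over integer cruise speeds; for this model the analytic
# optimum is v* = sqrt((c_t/c_f + a)/b), about 316 kt
best_v = min(range(150, 501), key=doc)
```

Raising the cost of time c_t pushes the optimum toward faster, fuel-hungrier profiles; raising c_f does the opposite, which is the essence of a cost-index-style optimization.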
Vadnais, Carolyn; Stensaas, Gregory
2014-01-01
Under the National Land Imaging Requirements (NLIR) Project, the U.S. Geological Survey (USGS) is developing a functional capability to obtain, characterize, manage, maintain, and prioritize all Earth observing (EO) land remote sensing user requirements. The goal is a better understanding of community needs that can be supported with land remote sensing resources, and a means to match needs with appropriate solutions in an effective and efficient way. The NLIR Project is composed of two components. The first component is focused on the development of the Earth Observation Requirements Evaluation System (EORES) to capture, store, and analyze user requirements, whereas the second component comprises the mechanisms and processes to elicit and document the user requirements that will populate the EORES. To develop the second component, the requirements elicitation methodology was exercised and refined through a pilot project conducted from June to September 2013. The pilot project focused specifically on applications and user requirements for moderate resolution imagery (5–120 meter resolution) as the test case for requirements development. The purpose of this summary report is to provide a high-level overview of the requirements elicitation process that was exercised through the pilot project and an early analysis of the moderate resolution imaging user requirements acquired to date to support ongoing USGS sustainable land imaging study needs. The pilot project engaged a limited set of Federal Government users from the operational and research communities, and therefore the information captured represents only a subset of all land imaging user requirements. However, based on a comparison of results, trends, and analysis, the pilot captured a strong baseline of typical application areas and user needs for moderate resolution imagery.
Because these results are preliminary and represent only a sample of users and application areas, the information from this report should only be used to indicate general user needs for the applications covered. Users of the information are cautioned that use of specific numeric results may be inappropriate without additional research. Any information used or cited from this report should specifically be cited as preliminary findings.
Evaluation of users' satisfaction on pedestrian facilities using pair-wise comparison approach
NASA Astrophysics Data System (ADS)
Zainol, R.; Ahmad, F.; Nordin, N. A.; Aripin, A. W. M.
2014-02-01
Global climate change issues demand that people the world over change the way they live today. Current cities thus need to be redeveloped towards lower-carbon day-to-day operations. A pedestrianized environment is one of the approaches used to reduce the carbon footprint of cities. Heritage cities are the first to be looked into, since they were built in an era in which motorized vehicles were minimal. The research therefore explores users' satisfaction with the physical attributes of pedestrianization in Melaka Historical City, a UNESCO World Heritage Site. It aims to examine users' satisfaction with the pedestrian facilities provided within the study area using a pair-wise comparison questionnaire approach. A survey of 200 respondents using random sampling was conducted in six different sites, namely Jonker Street, Church Street, Kota Street, Goldsmith Street, Merdeka Street to Taming Sari Tower, and Merdeka Street to the River Cruise terminal. The survey consists of an assessment tool based on a nine-point scale of users' satisfaction with pathway properties, zebra pedestrian crossings, street furniture, personal safety, adjacency to traffic flow, aesthetics, and amenities. The analytic hierarchy process (AHP) was used to avoid bias in analyzing the data collected. Findings show that the Merdeka Street to Taming Sari Tower route scores the highest satisfaction level and best fulfils the required needs of a pedestrianized environment. Similar assessment elements can be used to evaluate existing streets in other cities, and these criteria should also be used in planning future cities.
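A minimal sketch of the AHP weighting step, assuming a hypothetical 3-criterion pairwise comparison matrix on Saaty's nine-point scale (the criteria names and judgments are invented for illustration; the geometric-mean approximation of the principal eigenvector is one common weighting method):

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale); criteria,
# e.g.: pathway properties, pedestrian crossings, street furniture
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
n = A.shape[0]

# Geometric-mean (row product) approximation of the principal eigenvector
w = np.prod(A, axis=1) ** (1.0 / n)
w /= w.sum()  # normalized priority weights

# Consistency ratio: CR < 0.1 is conventionally acceptable (RI = 0.58 for n=3)
lam_max = float(np.mean((A @ w) / w))
CR = ((lam_max - n) / (n - 1)) / 0.58
```

The consistency ratio is what guards against the "bias" the abstract mentions: wildly inconsistent judgments (e.g. A>B, B>C, C>A) produce CR well above 0.1 and flag the questionnaire for review.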
AsyncStageOut: Distributed user data management for CMS Analysis
NASA Astrophysics Data System (ADS)
Riahi, H.; Wildish, T.; Ciangottini, D.; Hernández, J. M.; Andreeva, J.; Balcas, J.; Karavakis, E.; Mascheroni, M.; Tanasijczuk, A. J.; Vaandering, E. W.
2015-12-01
AsyncStageOut (ASO) is a new component of the distributed data analysis system of CMS, CRAB, designed for managing users' data. It addresses a major weakness of the previous model, namely that mass storage of output data was part of the job execution resulting in inefficient use of job slots and an unacceptable failure rate at the end of the jobs. ASO foresees the management of up to 400k files per day of various sizes, spread worldwide across more than 60 sites. It must handle up to 1000 individual users per month, and work with minimal delay. This creates challenging requirements for system scalability, performance and monitoring. ASO uses FTS to schedule and execute the transfers between the storage elements of the source and destination sites. It has evolved from a limited prototype to a highly adaptable service, which manages and monitors the user file placement and bookkeeping. To ensure system scalability and data monitoring, it employs new technologies such as a NoSQL database and re-uses existing components of PhEDEx and the FTS Dashboard. We present the asynchronous stage-out strategy and the architecture of the solution we implemented to deal with those issues and challenges. The deployment model for the high availability and scalability of the service is discussed. The performance of the system during the commissioning and the first phase of production are also shown, along with results from simulations designed to explore the limits of scalability.
BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark
Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung
2016-01-01
Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today’s data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG’s simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact. PMID:27390389
Computational high-resolution heart phantoms for medical imaging and dosimetry simulations
NASA Astrophysics Data System (ADS)
Gu, Songxiang; Gupta, Rajiv; Kyprianou, Iacovos
2011-09-01
Cardiovascular disease in general and coronary artery disease (CAD) in particular, are the leading cause of death worldwide. They are principally diagnosed using either invasive percutaneous transluminal coronary angiograms or non-invasive computed tomography angiograms (CTA). Minimally invasive therapies for CAD such as angioplasty and stenting are rendered under fluoroscopic guidance. Both invasive and non-invasive imaging modalities employ ionizing radiation and there is concern for deterministic and stochastic effects of radiation. Accurate simulation to optimize image quality with minimal radiation dose requires detailed, gender-specific anthropomorphic phantoms with anatomically correct heart and associated vasculature. Such phantoms are currently unavailable. This paper describes an open source heart phantom development platform based on a graphical user interface. Using this platform, we have developed seven high-resolution cardiac/coronary artery phantoms for imaging and dosimetry from seven high-quality CTA datasets. To extract a phantom from a coronary CTA, the relationship between the intensity distribution of the myocardium, the ventricles and the coronary arteries is identified via histogram analysis of the CTA images. By further refining the segmentation using anatomy-specific criteria such as vesselness, connectivity criteria required by the coronary tree and image operations such as active contours, we are able to capture excellent detail within our phantoms. For example, in one of the female heart phantoms, as many as 100 coronary artery branches could be identified. Triangular meshes are fitted to segmented high-resolution CTA data. We have also developed a visualization tool for adding stenotic lesions to the coronaries. The male and female heart phantoms generated so far have been cross-registered and entered in the mesh-based Virtual Family of phantoms with matched age/gender information. 
Any phantom in this family, along with user-defined stenoses, can be used to obtain clinically realistic projection images with the Monte Carlo code penMesh for optimizing imaging and dosimetry.
GPU-based Parallel Application Design for Emerging Mobile Devices
NASA Astrophysics Data System (ADS)
Gupta, Kshitij
A revolution is underway in the computing world that is causing a fundamental paradigm shift in device capabilities and form factor, with a move from well-established legacy desktop/laptop computers to mobile devices in varying sizes and shapes. Amongst all the tasks these devices must support, graphics has emerged as the 'killer app' for providing a fluid user interface and high-fidelity game rendering, effectively making the graphics processor (GPU) one of the key components in (present and future) mobile systems. By utilizing the GPU as a general-purpose parallel processor, this dissertation explores the GPU computing design space from an applications standpoint, in the mobile context, by focusing on key challenges presented by these devices---limited compute, memory bandwidth, and stringent power consumption requirements---while improving the overall application efficiency of the increasingly important speech recognition workload for mobile user interaction. We broadly partition trends in GPU computing into four major categories. We analyze hardware and programming model limitations in current-generation GPUs and detail an alternate programming style called Persistent Threads, identify four use-case patterns, and propose minimal modifications that would be required to extend native support. We show how, by manually extracting data locality and altering the speech recognition pipeline, we are able to achieve significant savings in memory bandwidth while simultaneously reducing the compute burden on GPU-like parallel processors. As we foresee GPU computing evolving from its current 'co-processor' model into an independent 'applications processor' capable of executing complex work independently, we create an alternate application framework that enables the GPU to handle all control-flow dependencies autonomously at run time while minimizing host involvement to just issuing commands, facilitating an efficient application implementation.
Finally, as compute and communication capabilities of mobile devices improve, we analyze energy implications of processing speech recognition locally (on-chip) and offloading it to servers (in-cloud).
NSLS-II HIGH LEVEL APPLICATION INFRASTRUCTURE AND CLIENT API DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, G.; Yang; L.
2011-03-28
The beam commissioning software framework of the NSLS-II project adopts a client/server-based architecture to replace the more traditional monolithic high level application approach. It is an open-structure platform, and we try to provide a narrow API set for client applications. With this narrow API, existing applications developed in different languages under different architectures can be ported to our platform with small modifications. This paper describes the system infrastructure design, client API, system integration, and latest progress. As a new 3rd-generation synchrotron light source with ultra-low emittance, there are new requirements and challenges in controlling and manipulating the beam. A use case study and a theoretical analysis have been performed to clarify the requirements and challenges for the high level application (HLA) software environment. To satisfy those requirements and challenges, an adequate system architecture for the software framework is critical for beam commissioning, study, and operation. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted the concept of a middle layer to separate low level hardware processing from numerical algorithm computing, physics modelling, data manipulation, plotting, and error handling. However, none of the existing approaches can satisfy the requirements. A new design has been proposed by introducing service-oriented architecture technology. The HLA is a combination of tools for accelerator physicists and operators, as in the traditional approach. In NSLS-II, these include monitoring applications and control routines. A scripting environment is very important for the latter part of the HLA, and both parts are designed on a common set of APIs. Physicists and operators are the users of these APIs, while control system engineers and a few accelerator physicists are their developers.
With our client/server-based approach, we leave how to retrieve information to the developers of the APIs, and how to use them to form a physics application to the users. For example, how channels are related to magnets, and what the current real-time setting of a magnet is in physics units, are internals of the APIs; a chromaticity measurement routine is a user of the APIs. All API users work with magnet and instrument names in physics units; low level communications in current or voltage units are minimized. In this paper, we discuss the recent progress of our infrastructure development and the client API.
Ono, Shunsuke
2017-04-01
Minimizing the L0 gradient, the number of non-zero gradients of an image, together with a quadratic data-fidelity to an input image has been recognized as a powerful edge-preserving filtering method. However, L0 gradient minimization has an inherent difficulty: the user-given parameter controlling the degree of flatness has no physical meaning, since it merely balances the relative importance of the L0 gradient term against the quadratic data-fidelity term. As a result, setting the parameter is troublesome in L0 gradient minimization. To circumvent this difficulty, we propose a new edge-preserving filtering method with a novel use of the L0 gradient. Our method is formulated as the minimization of the quadratic data-fidelity subject to the hard constraint that the L0 gradient is less than a user-given parameter α. This strategy is much more intuitive than L0 gradient minimization because the parameter α has a clear meaning: the L0 gradient value of the output image itself, so one can directly impose a desired degree of flatness via α. We also provide an efficient algorithm, based on the alternating direction method of multipliers (ADMM), for computing an approximate solution of the nonconvex problem, where we decompose it into two subproblems and derive closed-form solutions to them. The advantages of our method are demonstrated through extensive experiments.
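In symbols (our notation, with input image f, output u, and discrete gradient operator ∇), the contrast between the penalized prior work and the proposed constrained formulation is:

```latex
% Prior work: penalized L0 gradient smoothing; \lambda has no physical meaning
\min_{u}\ \|u - f\|_2^2 + \lambda\,\|\nabla u\|_0

% Proposed: constrained form; \alpha is the L0 gradient of the output itself
\min_{u}\ \|u - f\|_2^2 \quad \text{subject to} \quad \|\nabla u\|_0 \le \alpha
```

The two problems trace out the same trade-off curve, but α directly fixes how many gradient discontinuities survive in the output, whereas λ only weights the two terms against each other.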
Efficient data communication protocols for wireless networks
NASA Astrophysics Data System (ADS)
Zeydan, Engin
In this dissertation, efficient decentralized algorithms are investigated for cost minimization problems in wireless networks. For wireless sensor networks, we investigate energy consumption reduction and throughput maximization problems separately, using multi-hop data aggregation for correlated data. The proposed algorithms exploit data redundancy using a game-theoretic framework. For energy minimization, routes are chosen to minimize the total energy expended by the network, using best-response dynamics applied to local data. The cost function used in routing takes into account distance, interference, and in-network data aggregation. The proposed energy-efficient correlation-aware routing algorithm significantly reduces the energy consumption in the network and converges iteratively in a finite number of steps. For throughput maximization, we consider both the interference distribution across the network and the correlation between forwarded data when establishing routes. Nodes along each route are chosen to minimize the interference impact in their neighborhood and to maximize in-network data aggregation. The resulting network topology maximizes the global network throughput, and the algorithm is guaranteed to converge in a finite number of steps using best-response dynamics. For multiple-antenna wireless ad-hoc networks, we present distributed cooperative and regret-matching-based learning schemes for the joint transmit beamformer and power level selection problem for nodes operating in a multi-user interference environment. Total network transmit power is minimized while ensuring a constant received signal-to-interference-and-noise ratio at each receiver. In the cooperative and regret-matching-based power minimization algorithms, transmit beamformers are selected from a predefined codebook to minimize the total power.
By selecting transmit beamformers judiciously and performing power adaptation, the cooperative algorithm is shown to converge to a pure-strategy Nash equilibrium with high probability throughout the iterations in the interference-impaired network. On the other hand, the regret-matching learning algorithm is noncooperative and requires a minimal amount of overhead. The proposed cooperative and regret-matching-based distributed algorithms are also compared with centralized solutions through simulation results.
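The SINR-constrained power minimization described above can be written in a standard form (the symbols h_ji for channel vectors, σ² for noise power, codebook 𝒞, and target SINR γ are our notation, not the dissertation's):

```latex
% Total transmit power minimization with per-receiver SINR constraints;
% beamformers w_i are drawn from a finite codebook \mathcal{C}
\min_{\{p_i,\;\mathbf{w}_i \in \mathcal{C}\}} \ \sum_{i} p_i
\quad \text{subject to} \quad
\frac{p_i\,\lvert \mathbf{h}_{ii}^{H}\mathbf{w}_i \rvert^2}
     {\sigma^2 + \sum_{j \ne i} p_j\,\lvert \mathbf{h}_{ji}^{H}\mathbf{w}_j \rvert^2}
\;\ge\; \gamma \quad \forall i
```

Each node's best response (or regret-matching update) changes only its own (p_i, w_i), which is what makes the distributed, iterative solution possible.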
Interface Prostheses With Classifier-Feedback-Based User Training.
Fang, Yinfeng; Zhou, Dalin; Li, Kairu; Liu, Honghai
2017-11-01
It is evident that user training significantly affects the performance of pattern-recognition-based myoelectric prosthetic device control. Despite plausible classification accuracy on offline datasets, online accuracy usually suffers from changes in physiological conditions and electrode displacement. The user's ability to generate consistent electromyographic (EMG) patterns can be enhanced via proper user training strategies in order to improve online performance. This study proposes a clustering-feedback strategy that provides real-time feedback to users by means of a visualized online EMG signal input as well as the centroids of the training samples, whose dimensionality is reduced to a minimal number by dimension reduction. Clustering feedback provides a criterion that guides users to intentionally adjust motion gestures and muscle contraction forces. The experimental results demonstrate that hand motion recognition accuracy increases steadily over the course of clustering-feedback-based user training, whereas conventional classifier-feedback methods, i.e., label feedback, hardly achieve any improvement. These results suggest that the use of proper classifier feedback can accelerate the process of user training, and imply a promising future for amputees with limited or no experience in pattern-recognition-based prosthetic device manipulation.
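The clustering-feedback display described above can be sketched as follows: project EMG feature vectors into a low-dimensional space and compare a live sample against each class centroid. The PCA projection and the toy feature data below are illustrative assumptions, not the authors' exact dimension-reduction method.

```python
import numpy as np

def pca_project(X, k=2):
    """Project samples onto the top-k principal components (via SVD)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    W = Vt[:k].T
    return (X - mu) @ W, mu, W

def class_centroids(Z, labels):
    """Centroid of the projected training samples for each motion class."""
    return {c: Z[labels == c].mean(axis=0) for c in np.unique(labels)}

# Toy "EMG feature" data: two motion classes in a 6-D feature space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 6)),
               rng.normal(1.0, 0.1, (20, 6))])
y = np.array([0] * 20 + [1] * 20)

Z, mu, W = pca_project(X, k=2)
cents = class_centroids(Z, y)

# Feedback signal: distance of a new sample's projection to its class centroid.
z_new = (rng.normal(0.0, 0.1, 6) - mu) @ W
d = np.linalg.norm(z_new - cents[0])
```

In a real training loop, `d` (or a plot of `z_new` against the centroids) would be shown to the user in real time as the adjustment cue.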
User's manual for the Macintosh version of PASCO
NASA Technical Reports Server (NTRS)
Lucas, S. H.; Davis, Randall C.
1991-01-01
A user's manual for Macintosh PASCO is presented. Macintosh PASCO is an Apple Macintosh version of PASCO, an existing computer code for structural analysis and optimization of longitudinally stiffened composite panels. PASCO combines a rigorous buckling analysis program with a nonlinear mathematical optimization routine to minimize panel mass. Macintosh PASCO accepts the same input as mainframe versions of PASCO. As output, Macintosh PASCO produces a text file and mode shape plots in the form of Apple Macintosh PICT files. Only the user interface for Macintosh is discussed here.
Service user involvement in care planning: the mental health nurse's perspective.
Anthony, P; Crawford, P
2000-10-01
A dissonance between espoused values of consumerism within mental health care and the 'reality' of clinical practice has been firmly established in the literature, not least in terms of service user involvement in care planning. In order to begin to minimize such dissonance, it is vital that mental health nurse perceptions of service user involvement in the core activity of care planning are better understood. The main findings of this qualitative study, which uses semistructured interviews, suggest that mental health nurses value the concept of user involvement but consider it to be problematic in certain circumstances. The study reveals that nurses hold similar views about the 'meaning' of patient involvement in care planning but limited resources, individual patients characteristics and limitations in nursing care are the main inhibiting factors. Factors perceived as promoting and increasing user involvement included: provision of accurate information, 'user-friendly' documentation, mechanisms for gaining service user feedback, and high staff morale.
Incorporating User Input in Template-Based Segmentation
Vidal, Camille; Beggs, Dale; Younes, Laurent; Jain, Sanjay K.; Jedynak, Bruno
2015-01-01
We present a simple and elegant method to incorporate user input in a template-based segmentation method for diseased organs. The user provides a partial segmentation of the organ of interest, which is used to guide the template towards its target. The user also highlights some elements of the background that should be excluded from the final segmentation. We derive by likelihood maximization a registration algorithm from a simple statistical image model in which the user labels are modeled as Bernoulli random variables. The resulting registration algorithm minimizes the sum of square differences between the binary template and the user labels, while preventing the template from shrinking, and penalizing for the inclusion of background elements into the final segmentation. We assess the performance of the proposed algorithm on synthetic images in which the amount of user annotation is controlled. We demonstrate our algorithm on the segmentation of the lungs of Mycobacterium tuberculosis infected mice from μCT images. PMID:26146532
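The core of the registration objective, a sum of squared differences between the binary template and the user labels, can be sketched with a masked SSD minimized over integer translations. The exhaustive shift search and array names here are a simplified illustration, not the authors' likelihood-maximization algorithm with shrinkage and background penalties.

```python
import numpy as np

def ssd(template, labels, mask):
    """Sum of squared differences, restricted to user-labeled pixels."""
    return float(np.sum(mask * (template - labels) ** 2))

def best_shift(template, labels, mask, max_shift=3):
    """Exhaustive search over integer translations minimizing masked SSD."""
    best, best_val = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(template, dy, axis=0), dx, axis=1)
            v = ssd(shifted, labels, mask)
            if v < best_val:
                best_val, best = v, (dy, dx)
    return best, best_val

# Binary template: a small square; user labels: the same square shifted by (1, 2).
T = np.zeros((16, 16)); T[4:8, 4:8] = 1
L = np.roll(np.roll(T, 1, axis=0), 2, axis=1)
M = np.ones_like(T)          # here the user labeled every pixel
(dy, dx), val = best_shift(T, L, M)
```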
Requirements UML Tool (RUT) Expanded for Extreme Programming (CI02)
NASA Technical Reports Server (NTRS)
McCoy, James R.
2003-01-01
A procedure for capturing and managing system requirements that incorporates XP user stories. Because costs associated with identifying problems in requirements increase dramatically over the lifecycle of a project, a method for identifying sources of software risk in user stories is urgently needed. This initiative aims to determine a set of guidelines for user stories that will result in high-quality requirements. To further this initiative, a tool is needed to analyze user stories that can assess the quality of individual user stories, detect sources of software risk, produce software metrics, and identify areas in user stories that can be improved.
Farzandipour, Mehrdad; Meidani, Zahra; Riazi, Hossein; Sadeqi Jabali, Monireh
2018-09-01
There are various approaches to evaluating the usability of electronic medical record (EMR) systems. User perspectives are an integral part of evaluation. Usability evaluations contribute efficiently and effectively to user-centered design, support user tasks, and increase user satisfaction. This study determined the main usability requirements for EMRs by means of an end-user survey. A mixed-method strategy was conducted in three phases. A qualitative approach was employed to collect and formulate EMR usability requirements using the focus group method and the modified Delphi technique. The classic Delphi technique was used to evaluate the proposed requirements among 380 end-users in Iran. The final list of EMR usability requirements was verified and included 163 requirements divided into nine groups. The highest rates of end-user agreement relate to EMR visual clarity (3.65 ± 0.61), fault tolerance (3.58 ± 0.56), and suitability for learning (3.55 ± 0.54). The lowest end-user agreement was for auditory presentation (3.18 ± 0.69). The highest and lowest agreement among end-users was for visual clarity and auditory presentation by EMRs, respectively. This suggests that user priorities in the determination of EMR usability, and their understanding of the importance of individual tasks and context characteristics, differ.
Innovative Technology Transfer Partnerships
NASA Technical Reports Server (NTRS)
Kohler, Jeff
2004-01-01
The National Aeronautics and Space Administration (NASA) seeks to license its Advanced Tire and Strut Pressure Monitor (TSPM) technology. The TSPM is a handheld system to accurately measure tire and strut pressure and temperature over a wide temperature range (20 to 120 °F), as well as improve personnel safety. Sensor accuracy, electronics design, and a simple user interface allow operators quick, easy access to required measurements. The handheld electronics, powered by 12-VAC or by 9-VDC batteries, provide the user with an easy-to-read visual display of pressure/temperature or the streaming of pressure/temperature data via an RS-232 interface. When connected to a laptop computer, this new measurement system can provide users with automated data recording and trending, eliminating the chance for data hand-recording errors. In addition, calibration software allows for calibration data to be automatically utilized for the generation of new data conversion equations, simplifying the calibration processes that are so critical to reliable measurements. The design places a high-accuracy pressure sensor (also used as a temperature sensor) as close to the tire or strut measurement location as possible, allowing the user to make accurate measurements rapidly, minimizing the amount of high-pressure volumes, and allowing reasonable distance between the tire or strut and the operator. The pressure sensor attaches directly to the pressure supply/relief valve on the tire and/or strut, with necessary electronics contained in the handheld enclosure. A software algorithm ensures high accuracy of the device over the wide temperature range. Using the pressure sensor as a temperature sensor permits measurement of the actual temperature of the pressurized gas. This device can be adapted to create a portable calibration standard that does not require thermal conditioning. This allows accurate pressure measurements without disturbing the gas temperature. 
In-place calibration can save considerable time and money and is suitable in many process applications throughout industry.
Optimal Sampling to Provide User-Specific Climate Information.
NASA Astrophysics Data System (ADS)
Panturat, Suwanna
The types of weather-related world problems which are of socio-economic importance selected in this study as representative of three different levels of user groups include: (i) a regional problem concerned with air pollution plumes which lead to acid rain in the north eastern United States, (ii) a state-level problem in the form of winter wheat production in Oklahoma, and (iii) an individual-level problem involving reservoir management given errors in rainfall estimation at Lake Ellsworth, upstream from Lawton, Oklahoma. The study is aimed at designing optimal sampling networks which are based on customer value systems and also abstracting from data sets that information which is most cost-effective in reducing the climate-sensitive aspects of a given user problem. Three process models being used in this study to interpret climate variability in terms of the variables of importance to the user comprise: (i) the HEFFTER-SAMSON diffusion model as the climate transfer function for acid rain, (ii) the CERES-MAIZE plant process model for winter wheat production and (iii) the AGEHYD streamflow model selected as "a black box" for reservoir management. A state-of-the-art Non Linear Program (NLP) algorithm for minimizing an objective function is employed to determine the optimal number and location of various sensors. Statistical quantities considered in determining sensor locations include Bayes risk, the chi-squared value, the probability of a Type I error (alpha), the probability of a Type II error (beta), and the noncentrality parameter delta^2. 
Moreover, the number of years required to detect a climate change resulting in a given bushel per acre change in mean wheat production is determined; the number of seasons of observations required to reduce the standard deviation of the error variance of the ambient sulfur dioxide to less than a certain percent of the mean is found; and finally the policy of maintaining pre-storm flood pools at selected levels is examined given information from the optimal sampling network as defined by the study.
Army Energy and Water Reporting System Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deprez, Peggy C.; Giardinelli, Michael J.; Burke, John S.
There are many areas of desired improvement for the Army Energy and Water Reporting System. The purpose of the system is to serve as a data repository for collecting information from energy managers, which is then compiled into an annual energy report. This document summarizes reported shortcomings of the system and provides several alternative approaches for improving application usability and adding functionality. The U.S. Army has been using the Army Energy and Water Reporting System (AEWRS) for many years to collect and compile energy data from installations for facilitating compliance with Federal and Department of Defense energy management program reporting requirements. In this analysis, staff from Pacific Northwest National Laboratory found that substantial opportunities exist to expand AEWRS functions to better assist the Army to effectively manage energy programs. Army leadership must decide if it wants to invest in expanding AEWRS capabilities as a web-based, enterprise-wide tool for improving the Army Energy and Water Management Program or simply maintaining a bottom-up reporting tool. This report considers not only improvements to system functionality and user-friendliness from an operational perspective, but also the system's potential as a tool for increasing program effectiveness. The authors of this report recommend focusing on making the system easier for energy managers to input accurate data as the top priority for improving AEWRS. The next major focus of improvement would be improved reporting. The AEWRS user interface is dated and not user friendly, and a new system is recommended. While there are relatively minor improvements that could be made to the existing system to make it easier to use, significant improvements will be achieved with a user-friendly interface, new architecture, and a design that permits scalability and reliability. 
An expanded data set would naturally require additional requirements gathering and a focus on integrating with other existing data sources, thus minimizing manually entered data.
Linux containers for fun and profit in HPC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Priedhorsky, Reid; Randles, Timothy C.
2017-10-01
This article outlines options for user-defined software stacks from an HPC perspective. Here, we argue that a lightweight approach based on Linux containers is most suitable for HPC centers because it provides the best balance between maximizing service of user needs and minimizing risks. We also discuss how containers work and several implementations, including Charliecloud, our own open-source solution developed at Los Alamos.
Water-Energy Nexus: Examining The Crucial Connection Through Simulation Based Optimization
NASA Astrophysics Data System (ADS)
Erfani, T.; Tan, C. C.
2014-12-01
With growing urbanisation and the emergence of climate change, the world is facing a more water-constrained future. This phenomenon will have direct impacts on the resilience and performance of the energy sector, as water plays a key role in electricity generation processes. As energy is becoming a thirstier resource and the pressure on finite water sources is increasing, modelling and analysing this closely interlinked and interdependent loop, called the 'water-energy nexus', is becoming an important cross-disciplinary challenge. Conflict often arises in transboundary rivers, where several countries share the same source of water to be used in productive sectors for economic growth. From the perspective of the upstream users, it would be ideal to store the water for hydropower generation and protect the city against drought, whereas the downstream users need the supply of water for growth. This research uses a case study of the transboundary Blue Nile River basin in northeast Africa, where the Ethiopian government decided to invest in building a new dam to store water and generate hydropower. This has led to opposition by downstream users, as they believe that the introduction of the dam would reduce the amount of water available downstream. This calls for a compromise management approach in which the reservoir operating rules are derived considering the interdependencies between the resources available and the requirements proposed by all users. For this, we link a multiobjective optimization algorithm to a water-energy use simulation model to achieve effective management of the transboundary reservoir operating strategies. The objective functions aim to attain social and economic welfare by minimizing the deficit of water supply and maximizing hydropower generation. The study helps improve policy by clarifying the value of water and energy in their alternative uses. 
The results show how different optimal reservoir release rules generate different trade-off solutions inherently involved in upstream and downstream users requirements and decisions. This study stimulates the research in this context by using simulation based optimization techniques to manage for security for food, water and energy generation, which leads to improve sustainability and long-term political stability.
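The trade-off described above, minimizing supply deficit while rewarding hydropower, can be sketched with a toy fixed-fraction release rule and a scalarized objective. The inflow series, reservoir parameters, and weighting below are invented for illustration and stand in for the study's simulation model and multiobjective algorithm.

```python
import numpy as np

# Hypothetical monthly inflows and downstream demand (arbitrary volume units).
inflow = np.array([80., 60., 40., 30., 50., 90.])
demand = np.array([50., 50., 50., 50., 50., 50.])
capacity, head_coeff = 200.0, 0.05   # assumed reservoir size / power factor

def simulate(release_frac):
    """Fixed-fraction release rule; returns (total deficit, total hydropower)."""
    storage, deficit, power = 100.0, 0.0, 0.0
    for q, d in zip(inflow, demand):
        storage = min(storage + q, capacity)
        release = release_frac * storage
        storage -= release
        deficit += max(d - release, 0.0)
        power += head_coeff * release * storage  # crude head-dependent proxy
    return deficit, power

# Scalarized trade-off: minimize deficit minus a weighted hydropower benefit.
fracs = np.linspace(0.05, 0.95, 91)
scores = [simulate(f)[0] - 0.01 * simulate(f)[1] for f in fracs]
best = float(fracs[int(np.argmin(scores))])
```

A true multiobjective run would instead trace the whole Pareto front over (deficit, power) rather than fixing one weight.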
Space communications scheduler: A rule-based approach to adaptive deadline scheduling
NASA Technical Reports Server (NTRS)
Straguzzi, Nicholas
1990-01-01
Job scheduling is a deceptively complex subfield of computer science. The highly combinatorial nature of the problem, which is NP-complete in nearly all cases, requires a scheduling program to intelligently traverse an immense search tree to create the best possible schedule in a minimal amount of time. In addition, the program must continually make adjustments to the initial schedule when faced with last-minute user requests, cancellations, unexpected device failures, etc. A good scheduler must be quick, flexible, and efficient, even at the expense of generating slightly less-than-optimal schedules. The Space Communication Scheduler (SCS) is an intelligent rule-based scheduling system. SCS is an adaptive deadline scheduler which allocates modular communications resources to meet an ordered set of user-specified job requests on board the NASA Space Station. SCS uses pattern matching techniques to detect potential conflicts through algorithmic and heuristic means. As a result, the system generates and maintains high density schedules without relying heavily on backtracking or blind search techniques. SCS is suitable for many common real-world applications.
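The deadline-driven allocation described above can be illustrated with a minimal non-preemptive earliest-deadline-first (EDF) scheduler. This is a generic sketch of deadline scheduling, not SCS's rule-based pattern-matching approach, and the job set is invented.

```python
import heapq

def edf_schedule(jobs):
    """Non-preemptive earliest-deadline-first scheduling.
    jobs: list of (release, duration, deadline, name) tuples.
    Returns (execution order, jobs that missed their deadline)."""
    jobs = sorted(jobs)                     # by release time
    t, i, ready, order, missed = 0, 0, [], [], []
    while i < len(jobs) or ready:
        # Move all jobs released by time t into the ready heap, keyed by deadline.
        while i < len(jobs) and jobs[i][0] <= t:
            r, dur, dl, name = jobs[i]
            heapq.heappush(ready, (dl, dur, name))
            i += 1
        if not ready:
            t = jobs[i][0]                  # idle until the next release
            continue
        dl, dur, name = heapq.heappop(ready)
        t += dur
        order.append(name)
        if t > dl:
            missed.append(name)
    return order, missed

order, missed = edf_schedule([(0, 3, 10, "A"), (1, 2, 4, "B"), (2, 1, 9, "C")])
```

Here job B misses its deadline because A is already running when B arrives; a real scheduler like SCS would try to repair such conflicts heuristically.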
DATMAN: A reliability data analysis program using Bayesian updating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, M.; Feltus, M.A.
1996-12-31
Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
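The Bayesian update at the heart of such tools can be made concrete with the conjugate Gamma-Poisson model commonly used for failure-rate data. This is a textbook sketch, not DATMAN's actual distribution fitting; the prior parameters and failure counts are invented.

```python
# Conjugate Gamma-Poisson update for a component failure rate lambda.
def gamma_update(alpha, beta, failures, exposure_hours):
    """Prior Gamma(alpha, beta) on the rate; observe `failures` events in
    `exposure_hours` of operation -> posterior Gamma(alpha', beta')."""
    return alpha + failures, beta + exposure_hours

alpha0, beta0 = 2.0, 1000.0          # prior belief: mean rate 2 per 1000 hours
a1, b1 = gamma_update(alpha0, beta0, failures=3, exposure_hours=5000.0)
posterior_mean_rate = a1 / b1        # updated point estimate of lambda
```

As more operating hours and failures accumulate, the same update is applied again with the posterior serving as the new prior.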
Standardizing Activation Analysis: New Software for Photon Activation Analysis
NASA Astrophysics Data System (ADS)
Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.
2011-06-01
Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the nuclide library and the underlying formulas, the new software can be easily extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
Vanky, Anthony P; Verma, Santosh K; Courtney, Theodore K; Santi, Paolo; Ratti, Carlo
2017-12-01
We examined the association between meteorological (weather) conditions in a given locale and pedestrian trip frequency and duration, through the use of locative digital data. These associations were determined for seasonality, urban microclimate, and commuting. We analyzed GPS data from a broadly available activity tracking mobile phone application that automatically recorded 247,814 trips from 5432 unique users in Boston and 257,697 trips from 8256 users in San Francisco over a 50-week period. Generally, we observed that increased air temperature and the presence of light cloud cover had a positive association with hourly trip frequency in both cities, regardless of seasonality. Temperature and weather conditions generally showed greater associations with weekend and discretionary travel than with weekday and required travel. Weather conditions had minimal association with the duration of the trip, once the trip was initiated. The observed associations in some cases differed between the two cities. Our study illustrates the opportunity that emerging technology presents to study active transportation, and introduces new methods for wider consideration in preventive medicine.
Toward interactive scheduling systems for managing medical resources.
Oddi, A; Cesta, A
2000-10-01
Managers of medico-hospital facilities are facing two general problems when allocating resources to activities: (1) to find an agreement between several and contrasting requirements; (2) to manage dynamic and uncertain situations when constraints suddenly change over time due to medical needs. This paper describes the results of a research aimed at applying constraint-based scheduling techniques to the management of medical resources. A mixed-initiative problem solving approach is adopted in which a user and a decision support system interact to incrementally achieve a satisfactory solution to the problem. A running prototype is described called Interactive Scheduler which offers a set of functionalities for a mixed-initiative interaction to cope with the medical resource management. Interactive Scheduler is endowed with a representation schema used for describing the medical environment, a set of algorithms that address the specific problems of the domain, and an innovative interaction module that offers functionalities for the dialogue between the support system and its user. A particular contribution of this work is the explicit representation of constraint violations, and the definition of scheduling algorithms that aim at minimizing the amount of constraint violations in a solution.
Interactive-rate Motion Planning for Concentric Tube Robots.
Torres, Luis G; Baykal, Cenk; Alterovitz, Ron
2014-05-01
Concentric tube robots may enable new, safer minimally invasive surgical procedures by moving along curved paths to reach difficult-to-reach sites in a patient's anatomy. Operating these devices is challenging due to their complex, unintuitive kinematics and the need to avoid sensitive structures in the anatomy. In this paper, we present a motion planning method that computes collision-free motion plans for concentric tube robots at interactive rates. Our method's high speed enables a user to continuously and freely move the robot's tip while the motion planner ensures that the robot's shaft does not collide with any anatomical obstacles. Our approach uses a highly accurate mechanical model of tube interactions, which is important since small movements of the tip position may require large changes in the shape of the device's shaft. Our motion planner achieves its high speed and accuracy by combining offline precomputation of a collision-free roadmap with online position control. We demonstrate our interactive planner in a simulated neurosurgical scenario where a user guides the robot's tip through the environment while the robot automatically avoids collisions with the anatomical obstacles.
Cross-sensor iris recognition through kernel learning.
Pillai, Jaishanker K; Puertas, Maria; Chellappa, Rama
2014-01-01
Due to the increasing popularity of iris biometrics, new sensors are being developed for acquiring iris images and existing ones are being continuously upgraded. Re-enrolling users every time a new sensor is deployed is expensive and time-consuming, especially in applications with a large number of enrolled users. However, recent studies show that cross-sensor matching, where the test samples are verified using data enrolled with a different sensor, often leads to reduced performance. In this paper, we propose a machine learning technique to mitigate the cross-sensor performance degradation by adapting the iris samples from one sensor to another. We first present a novel optimization framework for learning transformations on iris biometrics. We then utilize this framework for sensor adaptation, by reducing the distance between samples of the same class, and increasing it between samples of different classes, irrespective of the sensors acquiring them. Extensive evaluations on iris data from multiple sensors demonstrate that the proposed method leads to improvement in cross-sensor recognition accuracy. Furthermore, since the proposed technique requires minimal changes to the iris recognition pipeline, it can easily be incorporated into existing iris recognition systems.
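The core idea, pulling same-class samples together and pushing different-class samples apart irrespective of sensor, can be illustrated with a simplified linear analogue: whitening the within-class scatter. This is not the paper's kernel-learning formulation, and the two-"sensor" synthetic data are invented for the sketch.

```python
import numpy as np

def whitening_transform(X, y, eps=1e-6):
    """Linear map that whitens the pooled within-class scatter, so same-class
    samples (regardless of source) move closer relative to other classes."""
    Sw = np.zeros((X.shape[1], X.shape[1]))
    for c in np.unique(y):
        Xc = X[y == c] - X[y == c].mean(axis=0)
        Sw += Xc.T @ Xc
    vals, vecs = np.linalg.eigh(Sw / len(X))
    return vecs @ np.diag(1.0 / np.sqrt(vals + eps)) @ vecs.T

# Synthetic features: class 1 is offset along a low-noise feature dimension.
rng = np.random.default_rng(1)
A = rng.normal([0.0, 0.0], [1.0, 0.1], (30, 2))   # class 0
B = rng.normal([0.0, 3.0], [1.0, 0.1], (30, 2))   # class 1
X = np.vstack([A, B]); y = np.array([0] * 30 + [1] * 30)

W = whitening_transform(X, y)
Z = X @ W.T            # adapted samples: class gap grows in within-class units
```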
Using a simulation assistant in modeling manufacturing systems
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.
1988-01-01
Numerous simulation languages exist for modeling discrete event processes, and are now ported to microcomputers. Graphics and animation capabilities were added to many of these languages to assist users in building models and evaluating the simulation results. With all these languages and added features, the user is still burdened with learning the simulation language. Furthermore, the time to construct and then to validate the simulation model is always greater than originally anticipated. One approach to minimize the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system will be modeled using the simulation assistant and the advantages and disadvantages discussed.
Farahmand, Farid; Khadivi, Kevin O.; Rodrigues, Joel J. P. C.
2009-01-01
The utility of a novel, high-precision, non-intrusive, wireless, accelerometer-based patient orientation monitoring system (APOMS) in determining orientation change in patients undergoing radiation treatment is reported here. Using this system, a small wireless accelerometer sensor is placed on a patient's skin, broadcasting its orientation to the receiving station connected to a PC in the control area. A threshold-based algorithm is developed to identify the exact amount of the patient's head orientation change. Through real-time measurements, an audible alarm can alert the radiation therapist if the user-defined orientation threshold is violated. Our results indicate that, in spite of its low cost and simplicity, the APOMS is highly sensitive and offers accurate measurements. Furthermore, the APOMS is patient friendly, vendor neutral, and requires minimal user training. The versatile architecture of the APOMS makes it potentially suitable for a variety of applications, including the study of correlation between external and internal markers during Image-Guided Radiation Therapy (IGRT), with no major changes in hardware setup or algorithm. PMID:22423196
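A minimal sketch of the threshold-based alarm logic described above, assuming tilt is derived from the accelerometer's measured gravity vector. The axis convention and the 2-degree threshold are illustrative, not the APOMS algorithm.

```python
import math

def tilt_deg(ax, ay, az):
    """Angle (degrees) between the sensor z-axis and the gravity vector."""
    return math.degrees(math.acos(az / math.sqrt(ax * ax + ay * ay + az * az)))

def orientation_alarm(baseline, sample, threshold_deg=2.0):
    """True if orientation has drifted beyond the user-defined threshold."""
    return abs(tilt_deg(*sample) - tilt_deg(*baseline)) > threshold_deg

baseline = (0.0, 0.0, 1.0)               # flat: z-axis aligned with gravity
ok      = orientation_alarm(baseline, (0.0, 0.017, 0.9998))   # ~1 degree tilt
alarmed = orientation_alarm(baseline, (0.0, 0.10, 0.995))     # ~5.7 degree tilt
```

In the real system this comparison would run continuously on streamed samples, triggering the audible alarm when it returns true.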
Information for the user in design of intelligent systems
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schreckenghost, Debra L.
1993-01-01
Recommendations are made for improving intelligent system reliability and usability based on the use of information requirements in system development. Information requirements define the task-relevant messages exchanged between the intelligent system and the user by means of the user interface medium. Thus, these requirements affect the design of both the intelligent system and its user interface. Many difficulties that users have in interacting with intelligent systems are caused by information problems. These information problems result from the following: (1) not providing the right information to support domain tasks; and (2) not recognizing that using an intelligent system introduces new user supervisory tasks that require new types of information. These problems are especially prevalent in intelligent systems used for real-time space operations, where data problems and unexpected situations are common. Information problems can be solved by deriving information requirements from a description of user tasks. Using information requirements embeds human-computer interaction design into intelligent system prototyping, resulting in intelligent systems that are more robust and easier to use.
Sustainable Land Imaging User Requirements
NASA Astrophysics Data System (ADS)
Wu, Z.; Snyder, G.; Vadnais, C. M.
2017-12-01
The US Geological Survey (USGS) Land Remote Sensing Program (LRSP) has collected user requirements from a range of applications to help formulate the Landsat 9 follow-on mission (Landsat 10) through the Requirements, Capabilities and Analysis (RCA) activity. The USGS is working with NASA to develop Landsat 10, which is scheduled to launch in the 2027 timeframe as part of the Sustainable Land Imaging program. User requirements collected through RCA will help inform future Landsat 10 sensor designs and mission characteristics. Current Federal civil community users have provided hundreds of requirements through systematic, in-depth interviews. Academic, State, local, industry, and international Landsat user community input was also incorporated in the process. Emphasis was placed on spatial resolution, temporal revisit, and spectral characteristics, as well as other aspects such as accuracy, continuity, sampling condition, data access and format. We will provide an overview of the Landsat 10 user requirements collection process and summary results of user needs from the broad land imagining community.
Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong
2017-01-01
Background: Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line–based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. Results: We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. Conclusions: As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. PMID:28327936
Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee
2017-04-01
Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the OAuth2 authentication protocols when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.
Augmented reality in surgical procedures
NASA Astrophysics Data System (ADS)
Samset, E.; Schmalstieg, D.; Vander Sloten, J.; Freudenthal, A.; Declerck, J.; Casciaro, S.; Rideng, Ø.; Gersak, B.
2008-02-01
Minimally invasive therapy (MIT) is one of the most important trends in modern medicine. It includes a wide range of therapies in videoscopic surgery and interventional radiology and is performed through small incisions. It reduces hospital stays by allowing faster recovery and offers substantially improved cost-effectiveness for hospitals and society. However, the introduction of MIT has also led to new problems. The manipulation of structures within the body through small incisions reduces dexterity and tactile feedback. It requires a different approach than conventional surgical procedures, since eye-hand co-ordination is not based on direct vision, but more predominantly on image guidance via endoscopes or radiological imaging modalities. ARIS*ER is a multidisciplinary consortium developing a new generation of decision support tools for MIT by augmenting visual and sensorial feedback. We will present tools based on novel concepts in visualization, robotics and haptics providing tailored solutions for a range of clinical applications. Examples from radio-frequency ablation of liver tumors, laparoscopic liver surgery and minimally invasive cardiac surgery will be presented. Demonstrators were developed with the aim to provide a seamless workflow for the clinical user conducting image-guided therapy.
STS users study (study 2.2). Volume 2: STS users plan (user data requirements) study
NASA Technical Reports Server (NTRS)
Pritchard, E. I.
1975-01-01
Pre-flight scheduling and pre-flight requirements of the space transportation system are discussed. Payload safety requirements, shuttle flight manifests, and interface specifications are studied in detail.
al3c: high-performance software for parameter inference using Approximate Bayesian Computation.
Stram, Alexander H; Marjoram, Paul; Chen, Gary K
2015-11-01
The development of Approximate Bayesian Computation (ABC) algorithms for parameter inference which are both computationally efficient and scalable in parallel computing environments is an important area of research. Monte Carlo rejection sampling, a fundamental component of ABC algorithms, is trivial to distribute over multiple processors but is inherently inefficient. While development of algorithms such as ABC Sequential Monte Carlo (ABC-SMC) helps address the inherent inefficiencies of rejection sampling, such approaches are not as easily scaled on multiple processors. As a result, current Bayesian inference software offerings that use ABC-SMC lack the ability to scale in parallel computing environments. We present al3c, a C++ framework for implementing ABC-SMC in parallel. By requiring only that users define essential functions such as the simulation model and prior distribution function, al3c abstracts the user from both the complexities of parallel programming and the details of the ABC-SMC algorithm. By using the al3c framework, the user is able to scale the ABC-SMC algorithm in parallel computing environments for his or her specific application, with minimal programming overhead. al3c is offered as a static binary for Linux and OS X computing environments. The user completes an XML configuration file and C++ plug-in template for the specific application, which are used by al3c to obtain the desired results. Users can download the static binaries, source code, reference documentation and examples (including those in this article) by visiting https://github.com/ahstram/al3c. astram@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
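The rejection-sampling kernel that ABC builds on is easy to sketch, and makes clear why each draw can go to a different processor. The Python toy below is not al3c's C++ plug-in interface, and the normal-mean example is invented for illustration:

```python
import random

def abc_rejection(observed, simulate, prior_sample, distance, eps, n_accept):
    """Generic ABC rejection sampler: draw a parameter from the prior,
    simulate data, and keep the draw if the simulated data lie within eps
    of the observation. Draws are independent, which is why this step is
    trivial to distribute over multiple processors."""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample()
        if distance(simulate(theta), observed) <= eps:
            accepted.append(theta)
    return accepted

# Toy example: infer the mean of a normal distribution with known sd = 1.
random.seed(0)
observed_mean = 2.0
posterior = abc_rejection(
    observed=observed_mean,
    simulate=lambda th: sum(random.gauss(th, 1.0) for _ in range(50)) / 50,
    prior_sample=lambda: random.uniform(-5, 5),
    distance=lambda x, y: abs(x - y),
    eps=0.1,
    n_accept=200,
)
estimate = sum(posterior) / len(posterior)
```

ABC-SMC improves on this kernel by reusing accepted particles across a schedule of shrinking eps values, which is exactly the part that is harder to parallelize.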
Towards Integrating Distributed Energy Resources and Storage Devices in Smart Grid.
Xu, Guobin; Yu, Wei; Griffith, David; Golmie, Nada; Moulema, Paul
2017-02-01
Internet of Things (IoT) provides a generic infrastructure for different applications to integrate information communication techniques with physical components to achieve automatic data collection, transmission, exchange, and computation. The smart grid, one of the typical applications supported by IoT and a re-engineering and modernization of the traditional power grid, aims to provide reliable, secure, and efficient energy transmission and distribution to consumers. How to effectively integrate distributed (renewable) energy resources and storage devices to satisfy the energy service requirements of users, while minimizing the power generation and transmission cost, remains a highly pressing challenge in the smart grid. To address this challenge and assess the effectiveness of integrating distributed energy resources and storage devices, in this paper we develop a theoretical framework to model and analyze three types of power grid systems: the power grid with only bulk energy generators, the power grid with distributed energy resources, and the power grid with both distributed energy resources and storage devices. Based on the metrics of the power cumulative cost and the service reliability to users, we formally model and analyze the impact of integrating distributed energy resources and storage devices in the power grid. We also use the concept of network calculus, which has been traditionally used for carrying out traffic engineering in computer networks, to derive the bounds of both power supply and user demand to achieve a high service reliability to users. Through an extensive performance evaluation, our data shows that integrating distributed energy resources conjointly with energy storage devices can reduce generation costs, smooth the curve of bulk power generation over time, reduce bulk power generation and power distribution losses, and provide a sustainable service reliability to users in the power grid.
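The basic network-calculus quantity involved is the backlog bound sup over t of α(t) − β(t), where α is an arrival (demand) curve and β a service (supply) curve. A minimal sketch with invented curves and numbers, read here as the storage capacity needed to cover unserved demand, not as the paper's actual model:

```python
def backlog_bound(alpha, beta, horizon):
    """Network-calculus backlog bound: the largest gap between cumulative
    demand alpha(t) and cumulative supply beta(t) over the horizon."""
    return max(alpha(t) - beta(t) for t in range(horizon + 1))

# Token-bucket-style demand: burst of 5 kWh plus a sustained 2 kWh/h rate.
demand = lambda t: 5.0 + 2.0 * t
# Rate-latency supply: generation ramps at 3 kWh/h after a 2 h latency.
supply = lambda t: 3.0 * max(0, t - 2)

# Maximum unserved energy that storage must cover (hypothetical numbers).
required_storage = backlog_bound(demand, supply, horizon=24)
```

Because the supply rate (3) eventually exceeds the demand rate (2), the gap peaks at the end of the latency period and the bound stays finite.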
Mehta, A; Patel, S; Robison, W; Senkowski, T; Allen, J; Shaw, E; Senkowski, C
2018-03-01
New techniques in minimally invasive and robotic surgical platforms require staged curricula to ensure proficiency. Scant literature exists as to how much simulation should play a role in training those who have skills in advanced surgical technology. The abilities of novel users may help discriminate whether surgically experienced users should start at a higher simulation level or whether the tasks are too rudimentary. The study's purpose is to explore the ability of General Surgery residents to gain proficiency on the dVSS as compared to novel users. The hypothesis is that Surgery residents will have increased proficiency in skills acquisition as compared to naive users. Six General Surgery residents at a single institution were compared with six teenagers using metrics measured by the dVSS. Participants were given two 1-h sessions to achieve an MScore™ in the 90th percentile on each of the five simulations. MScore™ software compiles a variety of metrics including total time, number of attempts, and high score. Statistical analysis was run using Student's t test. Significance was set at p value <0.05. Total time, attempts, and high score were compared between the two groups. The General Surgery residents took significantly less Total Time to complete Pegboard 1 (PB1) (p = 0.043). No significant difference was evident between the two groups in the other four simulations across the same MScore™ metrics. A focused look at the energy dissection task revealed that overall score might not be discriminant enough. Our findings indicate that prior medical knowledge or surgical experience does not significantly impact one's ability to acquire new skills on the dVSS. It is recommended that residency-training programs begin to include exposure to robotic technology.
NASA Technical Reports Server (NTRS)
Lewis, Clayton; Wilde, Nick
1989-01-01
Space construction will require heavy investment in the development of a wide variety of user interfaces for the computer-based tools that will be involved at every stage of construction operations. Using today's technology, user interface development is very expensive for two reasons: (1) specialized and scarce programming skills are required to implement the necessary graphical representations and complex control regimes for high-quality interfaces; (2) iteration on prototypes is required to meet user and task requirements, since these are difficult to anticipate with current (and foreseeable) design knowledge. We are attacking this problem by building a user interface development tool based on extensions to the spreadsheet model of computation. The tool provides high-level support for graphical user interfaces and permits dynamic modification of interfaces, without requiring conventional programming concepts and skills.
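The spreadsheet model of computation the tool extends can be illustrated with a minimal sketch. Python stands in for the actual tool, and the `Sheet` class and cell names are hypothetical:

```python
class Sheet:
    """Minimal spreadsheet-style evaluator: each cell holds either a value
    or a formula over other cells; reading a cell recomputes it from its
    dependencies, so edits propagate automatically. This dataflow style is
    what lets an interface be modified dynamically without conventional
    programming."""
    def __init__(self):
        self.cells = {}

    def set(self, name, value_or_formula):
        self.cells[name] = value_or_formula

    def get(self, name):
        v = self.cells[name]
        return v(self) if callable(v) else v

sheet = Sheet()
sheet.set("width", 4)
sheet.set("height", 3)
sheet.set("area", lambda s: s.get("width") * s.get("height"))
before = sheet.get("area")
sheet.set("width", 10)   # editing an input cell updates its dependents
after = sheet.get("area")
```

In a UI-building context the "cells" would hold widget geometry and state rather than numbers, but the recomputation discipline is the same.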
Inferior vena cava segmentation with parameter propagation and graph cut.
Yan, Zixu; Chen, Feng; Wu, Fa; Kong, Dexing
2017-09-01
The inferior vena cava (IVC) is one of the vital veins inside the human body. Accurate segmentation of the IVC from contrast-enhanced CT images is of great importance. This extraction not only helps the physician understand its quantitative features such as blood flow and volume, but also it is helpful during the hepatic preoperative planning. However, manual delineation of the IVC is time-consuming and poorly reproducible. In this paper, we propose a novel method to segment the IVC with minimal user interaction. The proposed method performs the segmentation block by block between user-specified beginning and end masks. At each stage, the proposed method builds the segmentation model based on information from image regional appearances, image boundaries, and a prior shape. The intensity range and the prior shape for this segmentation model are estimated based on the segmentation result from the last block, or from the user-specified beginning mask at the first stage. Then, the proposed method minimizes the energy function and generates the segmentation result for the current block using graph cut. Finally, a backward tracking step from the end of the IVC is performed if necessary. We have tested our method on 20 clinical datasets and compared our method to three other vessel extraction approaches. The evaluation was performed using three quantitative metrics: the Dice coefficient (Dice), the mean symmetric distance (MSD), and the Hausdorff distance (MaxD). The proposed method has achieved a Dice of [Formula: see text], an MSD of [Formula: see text] mm, and a MaxD of [Formula: see text] mm, respectively, in our experiments. The proposed approach can achieve sound performance with relatively low computational cost and minimal user interaction. The proposed algorithm has high potential to be applied for the clinical applications in the future.
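The graph-cut step minimizes an energy with regional (terminal) and boundary (pairwise) terms via max-flow/min-cut. The sketch below runs Edmonds-Karp on a 1-D toy "image"; the terminal weights, smoothness constant and intensity model are invented and do not reproduce the paper's energy:

```python
from collections import deque, defaultdict

def max_flow(cap, s, t):
    """Edmonds-Karp max-flow. Afterwards, nodes still reachable from the
    source in the residual graph form the source side of a minimum cut."""
    for u in list(cap):                   # materialize reverse residual edges
        for v in list(cap[u]):
            cap[v][u] += 0
    flow = defaultdict(lambda: defaultdict(int))
    while True:
        parent, q = {s: None}, deque([s])
        while q and t not in parent:      # BFS for a shortest augmenting path
            u = q.popleft()
            for v in cap[u]:
                if v not in parent and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            break
        path, v = [], t                   # walk back from t to find the path
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[u][v] - flow[u][v] for u, v in path)
        for u, v in path:                 # augment along the path
            flow[u][v] += push
            flow[v][u] -= push
    seen, q = {s}, deque([s])             # residual reachability = min cut side
    while q:
        u = q.popleft()
        for v in cap[u]:
            if v not in seen and cap[u][v] - flow[u][v] > 0:
                seen.add(v)
                q.append(v)
    return seen

# Toy 1-D "image": bright vessel pixels vs. dark background.
pixels = [200, 195, 190, 40, 30]
FG_MEAN, BG_MEAN, SMOOTH = 200, 30, 50
cap = defaultdict(lambda: defaultdict(int))
for i, p in enumerate(pixels):
    cap["s"][i] = abs(p - BG_MEAN)       # penalty for labelling pixel background
    cap[i]["t"] = abs(p - FG_MEAN)       # penalty for labelling pixel foreground
for i in range(len(pixels) - 1):         # smoothness terms between neighbours
    cap[i][i + 1] = SMOOTH
    cap[i + 1][i] = SMOOTH

source_side = max_flow(cap, "s", "t")
labels = [1 if i in source_side else 0 for i in range(len(pixels))]
```

The cut lands on the boundary where terminal penalties plus one smoothness edge are cheapest, labelling the three bright pixels as vessel.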
Scharschmidt, D; Preiß, S; Brähler, E; Fischer, T; Borkenhagen, A
2017-12-01
More and more people worldwide and also in Germany are using botulinum toxin type A (BoNT-A) and hyaluronic acid injections for skin rejuvenation. The aim was to study body image and self-esteem in women undergoing BoNT-A and/or hyaluronic acid filler treatment. A total of 145 women who requested BoNT-A and/or hyaluronic acid injections completed a survey comprising the body dysmorphic disorder questionnaire, the Rosenberg self-esteem scale and questionnaires on attitudes and motives regarding measures for optimization of the body, as well as demographic features. Using this instrument, data on body image and self-esteem as well as attitudes and motives for utilization of minimally invasive skin rejuvenation were collated. Female users of minimally invasive skin rejuvenation showed an overall higher socioeconomic status and an above average high monthly income. They lived in a partnership more often in comparison to women of equal age living in Berlin. The users of BoNT-A and/or hyaluronic acid fillers showed no conspicuous differences in body image and self-esteem. They showed a moderately positive attitude to body optimization procedures, and 91% achieved their standard weight with a body mass index (BMI) of ≤25 kg/m² in comparison to 56% of German women in the same age range (25 to ≥75 years old). In the first study of body image and self-esteem in users of BoNT‑A and/or dermal fillers in German women, the users showed no signs of body dysmorphic patterns or disorders of self-esteem.
ERIC Educational Resources Information Center
Zeng, Qingtian; Zhao, Zhongying; Liang, Yongquan
2009-01-01
User's knowledge requirement acquisition and analysis are very important for a personalized or user-adaptive learning system. Two approaches to capture user's knowledge requirement about course content within an e-learning system are proposed and implemented in this paper. The first approach is based on the historical data accumulated by an…
NASA Astrophysics Data System (ADS)
Areeda, J. S.; Smith, J. R.; Lundgren, A. P.; Maros, E.; Macleod, D. M.; Zweizig, J.
2017-01-01
Gravitational-wave observatories around the world, including the Laser Interferometer Gravitational-Wave Observatory (LIGO), record a large volume of gravitational-wave output data and auxiliary data about the instruments and their environments. These data are stored at the observatory sites and distributed to computing clusters for data analysis. LigoDV-web is a web-based data viewer that provides access to data recorded at the LIGO Hanford, LIGO Livingston and GEO600 observatories, and the 40 m prototype interferometer at Caltech. The challenge addressed by this project is to provide meaningful visualizations of small data sets to anyone in the collaboration in a fast, secure and reliable manner with minimal software, hardware and training required of the end users. LigoDV-web is implemented as a Java Enterprise Application, with Shibboleth Single Sign On for authentication and authorization, and a proprietary network protocol used for data access on the back end. Collaboration members with proper credentials can request data be displayed in any of several general formats from any Internet appliance that supports a modern browser with Javascript and minimal HTML5 support, including personal computers, smartphones, and tablets. Since its inception in 2012, 634 unique users have visited the LigoDV-web website in a total of 33,861 sessions and generated a total of 139,875 plots. This infrastructure has been helpful in many analyses within the collaboration including follow-up of the data surrounding the first gravitational-wave events observed by LIGO in 2015.
NASA Astrophysics Data System (ADS)
Ahlers, Volker; Weigl, Paul; Schachtzabel, Hartmut
2005-04-01
Due to the increasing demand for high-quality ceramic crowns and bridges, the CAD/CAM-based production of dental restorations has been a subject of intensive research during the last fifteen years. A prerequisite for the efficient processing of the 3D measurement of prepared teeth with a minimal amount of user interaction is the automatic determination of the preparation line, which defines the sealing margin between the restoration and the prepared tooth. Current dental CAD/CAM systems mostly require the interactive definition of the preparation line by the user, at least by means of giving a number of start points. Previous approaches to the automatic extraction of the preparation line rely on single contour detection algorithms. In contrast, we use a combination of different contour detection algorithms to find several independent potential preparation lines from a height profile of the measured data. The different algorithms (gradient-based, contour-based, and region-based) show their strengths and weaknesses in different clinical situations. A classifier consisting of three stages (range check, decision tree, support vector machine), which is trained by human experts with real-world data, finally decides which is the correct preparation line. In a test with 101 clinical preparations, a success rate of 92.0% has been achieved. Thus the combination of different contour detection algorithms yields a reliable method for the automatic extraction of the preparation line, which enables the setup of a turn-key dental CAD/CAM process chain with a minimal amount of interactive screen work.
Automatic detection and visualisation of MEG ripple oscillations in epilepsy.
van Klink, Nicole; van Rosmalen, Frank; Nenonen, Jukka; Burnos, Sergey; Helle, Liisa; Taulu, Samu; Furlong, Paul Lawrence; Zijlmans, Maeike; Hillebrand, Arjan
2017-01-01
High frequency oscillations (HFOs, 80-500 Hz) in invasive EEG are a biomarker for the epileptic focus. Ripples (80-250 Hz) have also been identified in non-invasive MEG, yet detection is impeded by noise, their low occurrence rates, and the workload of visual analysis. We propose a method that identifies ripples in MEG through noise reduction, beamforming and automatic detection with minimal user effort. We analysed 15 min of presurgical resting-state interictal MEG data of 25 patients with epilepsy. The MEG signal-to-noise ratio was improved by using a cross-validation signal space separation method, and by calculating ~2400 beamformer-based virtual sensors in the grey matter. Ripples in these sensors were automatically detected by an algorithm optimized for MEG. A small subset of the identified ripples was visually checked. Ripple locations were compared with MEG spike dipole locations and the resection area if available. Running the automatic detection algorithm resulted in on average 905 ripples per patient, of which on average 148 ripples were visually reviewed. Reviewing took approximately 5 min per patient, and identified ripples in 16 out of 25 patients. In 14 patients the ripple locations showed good or moderate concordance with the MEG spikes. For six out of eight patients who had surgery, the ripple locations showed concordance with the resection area: 4/5 with good outcome and 2/3 with poor outcome. Automatic ripple detection in beamformer-based virtual sensors is a feasible non-invasive tool for the identification of ripples in MEG. Our method requires minimal user effort and is easily applicable in a clinical setting.
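At its core, an automatic HFO-style detector thresholds a smoothed envelope of the (band-limited) signal and keeps excursions of sufficient duration. The toy sketch below uses hypothetical parameters and a synthetic trace, and omits the band-pass filtering, beamforming and noise reduction a real MEG pipeline needs:

```python
import math

def detect_events(signal, fs, win_ms=10.0, k=3.0, min_ms=6.0):
    """Toy burst detector: rectify and smooth the signal into an envelope,
    threshold at k times a median baseline, and keep excursions that last
    at least min_ms milliseconds. All parameters are illustrative."""
    w = max(1, int(fs * win_ms / 1000))
    env = [sum(abs(x) for x in signal[max(0, i - w):i + w + 1]) / (2 * w + 1)
           for i in range(len(signal))]
    baseline = sorted(env)[len(env) // 2]        # median as robust baseline
    thresh = k * baseline
    events, start = [], None
    for i, e in enumerate(env):
        if e > thresh and start is None:
            start = i
        elif e <= thresh and start is not None:
            if (i - start) / fs * 1000 >= min_ms:
                events.append((start / fs, i / fs))
            start = None
    return events

# Synthetic 1 s trace: slow low-amplitude background plus a 100 Hz burst
# between 0.5 s and 0.55 s.
fs = 1000
signal = [0.05 * math.sin(2 * math.pi * 7 * t / fs) for t in range(fs)]
for t in range(500, 550):
    signal[t] += math.sin(2 * math.pi * 100 * t / fs)
events = detect_events(signal, fs)
```

The duration criterion is what rejects single noise spikes; the median baseline keeps rare bursts from inflating the threshold.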
Body Image, Personality Traits, and Quality of Life in Botulinum Toxin A and Dermal Filler Patients.
Scharschmidt, Dagmar; Mirastschijski, Ursula; Preiss, Simone; Brähler, Elmar; Fischer, Tanja; Borkenhagen, A
2018-06-11
The demand for minimally invasive cosmetic procedures has continued to rise, especially in Germany, yet few studies have examined this patient population. The literature in Germany has repeatedly voiced the speculation that users of minimally invasive, skin-rejuvenating procedures displayed a higher tendency toward dysmorphic behavior patterns or, respectively, other abnormal personality traits. The aim of this study was to investigate body image, personality traits, quality of life, and socioeconomic parameters in users of botulinum toxin and/or facial fillers. One hundred forty-five women presenting for botulinum toxin and/or soft tissue filler injections completed demographic and standardized psychometric questionnaires such as the World Health Organization Quality of Life-Short Form, the Big Five Inventory-10, and the Body Dysmorphic Disorder Questionnaire before treatment. Patients undergoing injectable aesthetic treatments in an urban dermatology practice were women, middle-aged, highly educated, and mostly employed. Furthermore, participants showed higher quality of life, especially health-related quality of life, and a lower body mass index than controls. Concerning personality traits, our participants scored significantly higher on extraversion, agreeableness, openness to experience, and neuroticism. This study helps to better understand the psychosocial factors characterizing this patient population. Patients differ from controls by having a higher level of quality of life. No signs of body dysmorphic patterns or problematic personality traits were found. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.
Search and retrieval of office files using dBASE 3
NASA Technical Reports Server (NTRS)
Breazeale, W. L.; Talley, C. R.
1986-01-01
Described is a method of automating the office files retrieval process using a commercially available software package (dBASE III). The resulting product is a menu-driven computer program which requires no computer skills to operate. One part of the document is written for the potential user who has minimal computer experience and uses sample menu screens to explain the program, while a second part is oriented toward the computer-literate individual and includes rather detailed descriptions of the methodology and search routines. Although many of the programming techniques are explained, this document is not intended to be a tutorial on dBASE III. It is hoped that the document will serve as a stimulus for other applications of dBASE III.
Natural Aggregation Approach based Home Energy Management System with User Satisfaction Modelling
NASA Astrophysics Data System (ADS)
Luo, F. J.; Ranzi, G.; Dong, Z. Y.; Murata, J.
2017-07-01
With the prevalence of advanced sensing and two-way communication technologies, the Home Energy Management System (HEMS) has attracted much attention in recent years. This paper proposes a HEMS that optimally schedules the controllable Residential Energy Resources (RERs) in a Time-of-Use (TOU) pricing and high solar power penetrated environment. The HEMS aims to minimize the overall operational cost of the home, and the user's satisfaction with and requirements on the operation of different household appliances are modelled and considered in the HEMS. Further, a new biological self-aggregation intelligence-based optimization technique previously proposed by the authors, i.e., the Natural Aggregation Algorithm (NAA), is applied to solve the proposed HEMS optimization model. Simulations are conducted to validate the proposed method.
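One HEMS decision, scheduling a shiftable load under TOU prices with solar self-consumption, can be sketched greedily. This is not the NAA optimizer, and the tariff, solar profile and one-kWh-per-hour limit are invented for illustration:

```python
def schedule_appliance(tou_price, solar, demand_kwh, allowed_hours):
    """Greedy sketch of one HEMS decision: run a shiftable load of
    demand_kwh within the user's allowed window (a satisfaction
    constraint), filling the cheapest effective hours first, where free
    solar output offsets the grid price."""
    effective = {h: max(0.0, tou_price[h] - solar[h]) for h in allowed_hours}
    plan, remaining = {}, demand_kwh
    for h in sorted(effective, key=effective.get):
        if remaining <= 0:
            break
        run = min(1.0, remaining)          # at most 1 kWh per hour slot
        plan[h] = run
        remaining -= run
    cost = sum(run * effective[h] for h, run in plan.items())
    return plan, cost

# Hypothetical TOU tariff ($/kWh) and solar offset for hours 8-13.
price = {8: 0.10, 9: 0.10, 10: 0.30, 11: 0.30, 12: 0.30, 13: 0.10}
solar = {8: 0.00, 9: 0.00, 10: 0.25, 11: 0.25, 12: 0.05, 13: 0.00}
plan, cost = schedule_appliance(price, solar, demand_kwh=3.0,
                                allowed_hours=range(8, 14))
```

The greedy choice picks the solar-rich midday hours despite their high tariff, which is the qualitative effect the paper's optimization captures more rigorously.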
Representation of thermal infrared imaging data in the DICOM using XML configuration files.
Ruminski, Jacek
2007-01-01
The DICOM standard has become a widely accepted and implemented format for the exchange and storage of medical imaging data. Different imaging modalities are supported; however, there is no dedicated solution for thermal infrared imaging in medicine. In this article we propose new ideas and improvements to the final proposal of the new DICOM Thermal Infrared Imaging structures and services. Additionally, we designed, implemented and tested software packages for universal conversion of existing thermal imaging files to the DICOM format using XML configuration files. The proposed solution works fast and requires a minimal number of user interactions. The XML configuration file makes it possible to compose a set of attributes for any source file format of thermal imaging camera.
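The XML-driven conversion idea can be sketched as follows; the configuration schema, the `(0018,9999)` tag and the header field names are hypothetical and are not the proposed DICOM thermal-imaging structures:

```python
import xml.etree.ElementTree as ET

CONFIG = """<config>
  <attribute tag="(0010,0010)" name="PatientName" source="patient"/>
  <attribute tag="(0008,0060)" name="Modality" value="TG"/>
  <attribute tag="(0018,9999)" name="EmissivityHypothetical" source="emissivity"/>
</config>"""

def build_attributes(xml_text, header):
    """Map a thermal camera file's header fields to DICOM-style attributes
    as directed by an XML configuration file, so supporting a new camera
    format means writing a new config rather than new code."""
    attrs = {}
    for elem in ET.fromstring(xml_text).iter("attribute"):
        if "value" in elem.attrib:          # fixed value from the config
            attrs[elem.get("tag")] = elem.get("value")
        else:                               # copied from the source header
            attrs[elem.get("tag")] = header[elem.get("source")]
    return attrs

header = {"patient": "DOE^JOHN", "emissivity": "0.98"}
attrs = build_attributes(CONFIG, header)
```

Splitting fixed values from header-sourced ones is what makes one converter serve many proprietary camera formats.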
PETRO.CALC.PLOT, Microsoft Excel macros to aid petrologic interpretation
Sidder, G.B.
1994-01-01
PETRO.CALC.PLOT is a package of macros which normalizes whole-rock oxide data to 100%, calculates the cation percentages and molecular proportions used for normative mineral calculations, computes the apices for ternary diagrams, determines sums and ratios of specific elements of petrologic interest, and plots 33 X-Y graphs and five ternary diagrams. PETRO.CALC.PLOT also may be used to create other diagrams as desired by the user. The macros run in Microsoft Excel 3.0 and 4.0 for Macintosh computers and in Microsoft Excel 3.0 and 4.0 for Windows. Macros provided in PETRO.CALC.PLOT minimize repetition and time required to recalculate and plot whole-rock oxide data for petrologic analysis. © 1994.
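The macros themselves are Excel-specific, but two of the computations they perform (normalization to a 100% total and ternary apex coordinates) can be sketched in Python. The sample oxide values and the apex-at-top coordinate convention are illustrative assumptions:

```python
def normalize_oxides(oxides):
    """Normalize a whole-rock oxide analysis so the oxides total 100%,
    the first step before norm calculations."""
    total = sum(oxides.values())
    return {k: 100.0 * v / total for k, v in oxides.items()}

def ternary_xy(a, b, c):
    """Convert three apex percentages into x, y plotting coordinates on a
    unit-edge ternary diagram with apex A at the top, B at bottom-left
    and C at bottom-right."""
    s = a + b + c
    x = 0.5 * (2 * c + a) / s
    y = (3 ** 0.5 / 2) * a / s
    return x, y

# Hypothetical basaltic analysis (wt%), deliberately summing below 100.
sample = {"SiO2": 48.5, "Al2O3": 14.2, "FeO": 10.1, "MgO": 7.6,
          "CaO": 11.0, "Na2O": 2.4, "K2O": 0.6}
norm = normalize_oxides(sample)
```

A plotting layer (in Excel, the 33 X-Y charts and five ternary diagrams) would then consume these normalized values and coordinates.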
Robotic follower experimentation results: ready for FCS increment I
NASA Astrophysics Data System (ADS)
Jaczkowski, Jeffrey J.
2003-09-01
Robotics is a fundamental enabling technology required to meet the U.S. Army's vision to be a strategically responsive force capable of domination across the entire spectrum of conflict. The U.S. Army Research, Development and Engineering Command (RDECOM) Tank Automotive Research, Development & Engineering Center (TARDEC), in partnership with the U.S. Army Research Laboratory, is developing a leader-follower capability for Future Combat Systems. The Robotic Follower Advanced Technology Demonstration (ATD) utilizes a manned leader to provide a high-level proofing of the follower's path, which operates with minimal user intervention. This paper will give a programmatic overview and discuss both the technical approach and operational experimentation results obtained during testing conducted at Ft. Bliss, New Mexico in February-March 2003.
Modular optical detector system
Horn, Brent A [Livermore, CA; Renzi, Ronald F [Tracy, CA
2006-02-14
A modular optical detector system. The detector system is designed to detect the presence of molecules or molecular species by inducing fluorescence with exciting radiation and detecting the emitted fluorescence. Because the system is capable of accurately detecting and measuring picomolar concentrations, it is ideally suited for use with microchemical analysis systems generally and capillary chromatographic systems in particular. By employing a modular design, the detector system provides both the ability to replace various elements of the detector system without requiring extensive realignment or recalibration of the components as well as minimal user interaction with the system. In addition, the modular concept provides for the use and addition of a wide variety of components, including optical elements (lenses and filters), light sources, and detection means, to fit particular needs.
Application of Cognitive Task Analysis in User Requirements Definition and Prototype Design
Curtis, Christopher
2005-10-01
Air Force Research Laboratory technical paper AFRL-HE-WP-TP-2005-0030 (contract FA8650-04-C-6406): a presentation on applying cognitive task analysis to user requirements definition and prototype design, drawing on maintainer experience.
Lifetime Alcohol Use & Cognitive Performance in Older Adults
Kalapatapu, Raj K.; Ventura, Maria I.; Barnes, Deborah E.
2016-01-01
Background: Substance use is an important clinical issue in the older adult population. As older adults are susceptible to cognitive disorders, the intersection of the fields of substance use and cognitive neuroscience is an active area of research. Prior studies of alcohol use and cognitive performance are mixed, and inconsistencies may be due to under- or over-adjustment for confounders. Aim: This manuscript adds to this literature by conducting a secondary analysis of self-reported lifetime history of alcohol use and cognitive performance in older adults (n = 133). We hypothesized that current alcohol users would have poorer cognitive performance compared to never/minimal and former alcohol users. Methods: Older adult participants were classified into never/minimal alcohol users, former alcohol users, and current alcohol users. A neurocognitive battery included a global cognitive measure and individual measures of attention, memory, fluency, and executive function. A directed acyclic graph (DAG)-based approach was used to select variables to be included in the multiple linear regression models. Results: Though unadjusted analyses showed some significant associations between alcohol use and cognitive performance, all associations between alcohol use and cognitive performance were eliminated after adjusting for age, education, sex, race and smoking pack years. Alcohol drink years were not significantly associated with cognitive performance among current and former alcohol users. Discussion: These results suggest that lifetime alcohol use is not significantly associated with cognitive performance in older adults after adjustment for key confounders. Inconsistencies in prior studies may be due to uncontrolled confounding and/or unnecessary adjustment of mediators and/or colliders. PMID:27719514
Design of Launcher Towards Spacecraft Comfort: Ariane 6 Objectives
NASA Astrophysics Data System (ADS)
Mourey, Patrick; Lambare, Hadrien; Valbuena, Matias F.
2014-06-01
Preliminary advanced studies were performed recently to select the possible concepts for a launcher that could succeed to Ariane 5. During the end of 2012 Space Ministry Conference, a configuration defining the propellant of the stages and the coarse staging ("PPH") was frozen in order to engage the preliminary selection concept studies. The first phase consisted to select the main features of the architecture in order to go deeper in the different matters or the advanced studies. The concept was selected mid of 2013.During all these phases of the preliminary project, different criteria (such as the recurring cost which is a major one) were used to quote the different concepts, among which the "payload comfort", ie the minimization of the environment generated by the launcher toward the satellites.The minimization of the environment was first expressed in term of objectives in the Mission Requirement Document (MRD) for the different mechanical environment such as quasi-static loads, dynamic loads, acoustics, shocks... Criteria such as usable volume, satellites frequency requirement and interface requirement are also expressed in the MRD.The definition of these different criteria was of course fixed taking benefit from the launcher operator experience based on a long story of dealing with spacecraft-launcher interface issues on Ariane, Soyouz and Vega. The general idea is to target improved or similar levels than those currently applicable for Ariane 5. 
For environments where a special need is anticipated from potential end users, a dedicated effort is planned. The preliminary advanced study phase is currently running and has to address specific topics such as the definition of the upper-part layout including the geometry of the fairing, the definition of the launch pad with preliminary ideas to minimize acoustics and blast wave, and first calculations on dimensioning dynamic load cases such as thrust oscillations of the solid rocket motors (SRMs). The present paper gives a preliminary overview of the different topics related to these general launcher-spacecraft issues.
BnmrOffice: A Free Software for β-nmr Data Analysis
NASA Astrophysics Data System (ADS)
Saadaoui, Hassan
A data-analysis framework with a graphical user interface (GUI) has been developed to analyze β-nmr spectra in an automated and intuitive way. The program, named BnmrOffice, is written in C++ and employs the Qt libraries and tools for designing the GUI, and CERN's Minuit optimization routines for minimization. The program runs on multiple platforms and is available for free under the terms of the GNU GPL. The GUI is structured in tabs to search, plot, and analyze data, among other functionalities. The user can tweak the minimization options and fit multiple data files (or runs) using single or global fitting routines with predefined or new models. Currently, BnmrOffice reads TRIUMF's MUD data and ASCII files, and can be extended to other formats.
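The fitting step described above, minimizing a chi-square between a model and a measured spectrum, can be sketched without Minuit itself. The following pure-Python illustration is an assumption-laden stand-in, not BnmrOffice's actual API: the exponential relaxation model and all names are invented, and a brute-force grid scan replaces Minuit's MIGRAD.

```python
import math

def chi2(params, t, y, sigma):
    """Chi-square of an exponential polarization decay A*exp(-lam*t)."""
    A, lam = params
    return sum(((yi - A * math.exp(-lam * ti)) / si) ** 2
               for ti, yi, si in zip(t, y, sigma))

def grid_minimize(t, y, sigma, A_range, lam_range, steps=100):
    """Brute-force stand-in for a real minimizer: scan a parameter grid."""
    best_chi2, best_params = float("inf"), None
    for i in range(steps + 1):
        A = A_range[0] + (A_range[1] - A_range[0]) * i / steps
        for j in range(steps + 1):
            lam = lam_range[0] + (lam_range[1] - lam_range[0]) * j / steps
            c = chi2((A, lam), t, y, sigma)
            if c < best_chi2:
                best_chi2, best_params = c, (A, lam)
    return best_params

# Synthetic "spectrum" with A = 1.0 and lam = 0.5
t = [0.1 * k for k in range(50)]
y = [1.0 * math.exp(-0.5 * ti) for ti in t]
sigma = [0.01] * len(t)
A_fit, lam_fit = grid_minimize(t, y, sigma, (0.5, 1.5), (0.1, 1.0))
```

A real analysis would use Minuit's gradient-based MIGRAD, which also returns parameter uncertainties; the grid scan only conveys the shape of the problem.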
Validation of semi-automatic segmentation of the left atrium
NASA Astrophysics Data System (ADS)
Rettmann, M. E.; Holmes, D. R., III; Camp, J. J.; Packer, D. L.; Robb, R. A.
2008-03-01
Catheter ablation therapy has become increasingly popular for the treatment of left atrial fibrillation. The effect of this treatment on left atrial morphology, however, has not yet been completely quantified. Initial studies have indicated a decrease in left atrial size with a concomitant decrease in pulmonary vein diameter. In order to effectively study if catheter based therapies affect left atrial geometry, robust segmentations with minimal user interaction are required. In this work, we validate a method to semi-automatically segment the left atrium from computed-tomography scans. The first step of the technique utilizes seeded region growing to extract the entire blood pool including the four chambers of the heart, the pulmonary veins, aorta, superior vena cava, inferior vena cava, and other surrounding structures. Next, the left atrium and pulmonary veins are separated from the rest of the blood pool using an algorithm that searches for thin connections between user defined points in the volumetric data or on a surface rendering. Finally, pulmonary veins are separated from the left atrium using a three dimensional tracing tool. A single user segmented three datasets three times using both the semi-automatic technique as well as manual tracing. The user interaction time for the semi-automatic technique was approximately forty-five minutes per dataset and the manual tracing required between four and eight hours per dataset depending on the number of slices. A truth model was generated using a simple voting scheme on the repeated manual segmentations. A second user segmented each of the nine datasets using the semi-automatic technique only. Several metrics were computed to assess the agreement between the semi-automatic technique and the truth model including percent differences in left atrial volume, DICE overlap, and mean distance between the boundaries of the segmented left atria. 
Overall, the semi-automatic approach was demonstrated to be repeatable within and between raters, and accurate when compared to the truth model. Finally, we generated a visualization to assess the spatial variability of the segmentation errors between the semi-automatic approach and the truth model. The visualization demonstrates that the highest errors occur at the boundaries between the left atrium and the pulmonary veins, as well as between the left atrium and the left atrial appendage. In conclusion, we describe a semi-automatic approach for left atrial segmentation that demonstrates repeatability and accuracy, with the advantage of a significant reduction in user interaction time.
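The first step of the pipeline above, seeded region growing over the blood pool, can be sketched generically. This is a minimal 2D illustration under assumed names and an assumed intensity-tolerance criterion, not the authors' implementation, which operates on 3D CT volumes:

```python
from collections import deque

def region_grow(image, seed, tol):
    """Grow a region from `seed`, adding 4-connected pixels whose
    intensity is within `tol` of the seed intensity (BFS traversal)."""
    rows, cols = len(image), len(image[0])
    seed_val = image[seed[0]][seed[1]]
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(image[nr][nc] - seed_val) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

# Bright "blood pool" (values near 200) inside a dark background (10)
img = [[10, 10, 10, 10],
       [10, 200, 205, 10],
       [10, 198, 202, 10],
       [10, 10, 10, 10]]
pool = region_grow(img, (1, 1), tol=20)
```

In the paper's setting the grown region spans the entire contrast-enhanced blood pool, which is why the subsequent thin-connection and tracing steps are needed to isolate the left atrium.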
van Stralen, Marijn; Bosch, Johan G; Voormolen, Marco M; van Burken, Gerard; Krenning, Boudewijn J; van Geuns, Robert-Jan M; Lancée, Charles T; de Jong, Nico; Reiber, Johan H C
2005-10-01
We propose a semiautomatic endocardial border detection method for three-dimensional (3D) time series of cardiac ultrasound (US) data, based on pattern matching and dynamic programming operating on two-dimensional (2D) slices of the 3D-plus-time data, for the estimation of full-cycle left ventricular volume with minimal user interaction. The presented method is generally applicable to 3D US data and is evaluated on data acquired with the Fast Rotating Ultrasound (FRU) Transducer, developed by Erasmus Medical Center (Rotterdam, the Netherlands): a conventional phased-array transducer rotating at very high speed around its image axis. The detection is based on endocardial edge pattern matching using dynamic programming, constrained by a 3D-plus-time shape model. It is applied to an automatically selected subset of 2D images of the original data set, typically 10 equidistant rotation angles and 16 cardiac phases (160 images). Initialization requires manually drawing four contours per patient. We evaluated this method on 14 patients against MRI end-diastolic (ED) and end-systolic (ES) volumes. The semiautomatic border detection approach shows good correlation with MRI ED/ES volumes (r = 0.938) and low interobserver variability (y = 1.005x - 16.7, r = 0.943) over full-cycle volume estimations. It shows high consistency in tracking the user-defined initial borders over space and time. We show that the ease of acquisition using the FRU transducer and the semiautomatic endocardial border detection method together provide a way to quickly estimate left ventricular volume over the full cardiac cycle with little user interaction.
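The dynamic-programming core of such border detection, finding a smooth minimal-cost path through an edge-cost image, can be illustrated in miniature. The cost matrix, step constraint, and names below are assumptions for illustration; the paper's version adds edge pattern matching and a 3D-plus-time shape model on top of this idea.

```python
def dp_border(cost, max_step=1):
    """Return, per column, the row of a minimal-cost left-to-right path
    through `cost`, allowing the row to shift by at most `max_step`
    between adjacent columns (the smoothness constraint)."""
    rows, cols = len(cost), len(cost[0])
    acc = [row[:] for row in cost]                 # accumulated cost
    back = [[0] * cols for _ in range(rows)]       # backpointers
    for c in range(1, cols):
        for r in range(rows):
            best, best_r = float("inf"), r
            for pr in range(max(0, r - max_step), min(rows, r + max_step + 1)):
                if acc[pr][c - 1] < best:
                    best, best_r = acc[pr][c - 1], pr
            acc[r][c] += best
            back[r][c] = best_r
    # Trace back from the cheapest final row
    r = min(range(rows), key=lambda rr: acc[rr][cols - 1])
    path = [r]
    for c in range(cols - 1, 0, -1):
        r = back[r][c]
        path.append(r)
    return path[::-1]

# A low-cost "edge" that drifts from row 0 down to row 2
m = [[0, 9, 9, 9],
     [9, 0, 0, 9],
     [9, 9, 9, 0]]
edge = dp_border(m)
```

The globally optimal path is what makes dynamic programming robust to locally missing or noisy edges, a key property for ultrasound data.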
netCDF Operators for Rapid Analysis of Measured and Modeled Swath-like Data
NASA Astrophysics Data System (ADS)
Zender, C. S.
2015-12-01
Swath-like data (hereafter SLD) are defined by non-rectangular and/or time-varying spatial grids in which one or more coordinates are multi-dimensional. It is often challenging and time-consuming to work with SLD, including all Level 2 satellite-retrieved data, non-rectangular subsets of Level 3 data, and model data on curvilinear grids. Researchers and data centers want user-friendly, fast, and powerful methods to specify, extract, serve, manipulate, and thus analyze, SLD. To meet these needs, large research-oriented agencies and modeling centers such as NASA, DOE, and NOAA increasingly employ the netCDF Operators (NCO), an open-source scientific data analysis software package applicable to netCDF and HDF data. NCO includes extensive, fast, parallelized regridding features to facilitate analysis and intercomparison of SLD and model data. The remote sensing, weather, and climate modeling and analysis communities face similar problems in handling SLD, including how to easily: 1. Specify and mask irregular regions such as ocean basins and political boundaries in SLD (and rectangular) grids. 2. Bin, interpolate, average, or re-map SLD to regular grids. 3. Derive secondary data from given quality levels of SLD. These common tasks require a data extraction and analysis toolkit that is SLD-friendly and, like NCO, familiar to all these communities. With NCO, users can: 1. Quickly project SLD onto the most useful regular grids for intercomparison. 2. Access sophisticated statistical and regridding functions that are robust to missing data and allow easy specification of quality control metrics. These capabilities improve interoperability and software reuse and, because they apply to SLD, minimize transmission, storage, and handling of unwanted data. While SLD analysis still poses many challenges compared to regularly gridded, rectangular data, the custom analysis scripts SLD once required are now shorter, more powerful, and user-friendly.
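Task 2 above, binning swath samples onto a regular grid, can be illustrated with a tiny pure-Python averager. NCO's actual regridders are far more sophisticated (conservative remapping, weights, missing-data handling); this sketch, with invented names and data, only conveys the basic idea of collapsing irregularly located samples into regular cells.

```python
def bin_swath(lons, lats, vals, lon_edges, lat_edges):
    """Average swath samples falling in each regular grid cell;
    cells receiving no samples get None (i.e., missing data)."""
    nx, ny = len(lon_edges) - 1, len(lat_edges) - 1
    sums = [[0.0] * nx for _ in range(ny)]
    counts = [[0] * nx for _ in range(ny)]
    for lon, lat, v in zip(lons, lats, vals):
        for i in range(nx):
            if lon_edges[i] <= lon < lon_edges[i + 1]:
                break
        else:
            continue                      # sample outside the grid
        for j in range(ny):
            if lat_edges[j] <= lat < lat_edges[j + 1]:
                break
        else:
            continue
        sums[j][i] += v
        counts[j][i] += 1
    return [[sums[j][i] / counts[j][i] if counts[j][i] else None
             for i in range(nx)] for j in range(ny)]

# Three swath samples binned onto a 2x1 regular grid
grid = bin_swath(lons=[0.5, 0.6, 1.5], lats=[0.5, 0.4, 0.5],
                 vals=[10.0, 20.0, 5.0],
                 lon_edges=[0, 1, 2], lat_edges=[0, 1])
```

Once on a regular grid, the data can be compared cell-for-cell against model output, which is the intercomparison use case the abstract emphasizes.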
An object-oriented approach to deploying highly configurable Web interfaces for the ATLAS experiment
NASA Astrophysics Data System (ADS)
Lange, Bruno; Maidantchik, Carmen; Pommes, Kathy; Pavani, Varlen; Arosa, Breno; Abreu, Igor
2015-12-01
The ATLAS Technical Coordination maintains 17 Web systems to support its operation. These applications, ranging from managing the process of publishing scientific papers to monitoring radiation levels in the equipment in the experimental cavern, are constantly prone to changes in requirements due to the collaborative nature of the experiment and its management. In this context, a Web framework is proposed to unify the generation of the supporting interfaces. FENCE assembles classes to build applications by making extensive use of JSON configuration files. It relies heavily on Glance, a technology set forth in 2003 to create an abstraction layer on top of the heterogeneous sources that store the technical coordination data. Once Glance maps out the database modeling, records can be referenced in the configuration files by wrapping unique identifiers in double enclosing brackets. The deployed content can be individually secured by attaching clearance attributes to its description, thus ensuring that view/edit privileges are granted to eligible users only. The framework also provides tools for securely writing into a database. Fully HTML5-compliant multi-step forms can be generated from their JSON description to ensure that the submitted data comply with a series of constraints. Input validation is carried out primarily on the server side but, following progressive enhancement guidelines, verification might also be performed on the client side by enabling specific markup data attributes, which are then handed over to the jQuery validation plug-in. User monitoring is accomplished by thoroughly logging user requests along with any POST data. Documentation is built from the source code using the phpDocumentor tool and made readily available to developers online. FENCE, therefore, speeds up the implementation of Web interfaces and reduces the response time to requirement changes by minimizing maintenance overhead.
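The double-enclosing-bracket referencing described above amounts to template substitution against Glance-mapped records. A minimal sketch of that mechanism follows; the placeholder names, the flat dictionary lookup, and the `render` function are illustrative assumptions, not FENCE's actual code.

```python
import re

def render(template, records):
    """Replace {{identifier}} placeholders with values looked up from
    mapped records; unknown identifiers are left untouched."""
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(records.get(m.group(1), m.group(0))),
                  template)

# A fragment of a JSON configuration referencing two mapped records
cfg = '{"title": "{{paper_title}}", "author": "{{author_name}}"}'
out = render(cfg, {"paper_title": "Run 2 results",
                   "author_name": "A. Physicist"})
```

In the real framework the lookup resolves through the Glance abstraction layer rather than an in-memory dictionary, but the configuration-driven substitution pattern is the same.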
Program Aids Analysis And Optimization Of Design
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.; Lamarsh, William J., II
1994-01-01
NETS/PROSSS (NETS Coupled With Programming System for Structural Synthesis) computer program developed to provide system for combining NETS (MSC-21588), neural-network application program, and CONMIN (Constrained Function Minimization, ARC-10836), optimization program. Enables user to reach nearly optimal design. Design then used as starting point in normal optimization process, possibly enabling user to converge to optimal solution in significantly fewer iterations. NETS/PROSSS written in C language and FORTRAN 77.
Research on a dynamic workflow access control model
NASA Astrophysics Data System (ADS)
Liu, Yiliang; Deng, Jinxia
2007-12-01
In recent years, access control technology has been researched widely in workflow systems; two typical approaches are the RBAC (Role-Based Access Control) and TBAC (Task-Based Access Control) models, which have been used successfully, to a certain extent, for role authorization and assignment. However, as a system's structure becomes more complex, these two technologies cannot enforce minimal privileges and separation of duties, and they are inapplicable when users frequently request changes to the workflow process. To avoid these weaknesses in practice, a fine-grained, variable-flow dynamic role-task-view access control model (abbreviated DRTVBAC) is constructed on the basis of the existing models. In applying this model, an algorithm is constructed to satisfy users' application requirements and security needs under the fine-grained principles of minimal privilege and dynamic separation of duties. The DRTVBAC model has been implemented in an actual system. The results show that the dynamic management of roles associated with tasks, and role assignment, are more flexible with respect to granting and revoking authority; the principle of least privilege is met because only the permissions of the specific task being performed are activated for the role; authority is separated across the duties completed in the workflow; sensitive information is prevented from being disclosed through the concise, dynamic view interface; and the requirement of frequently varying task flows is satisfied.
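The two fine-grained principles the model enforces, least privilege and dynamic separation of duties, can be sketched in a toy class. Everything here (class and task names, the conflict-pair representation) is an illustrative assumption, not the DRTVBAC implementation:

```python
class WorkflowACL:
    """Toy model of task-scoped permissions with dynamic separation of
    duties: a user who performed one task of a conflicting pair may not
    perform the other within the same workflow instance."""

    def __init__(self, task_perms, conflicting_pairs):
        self.task_perms = task_perms        # task -> minimal permission set
        self.conflicts = conflicting_pairs  # list of frozensets of tasks
        self.history = {}                   # instance -> {user: {tasks done}}

    def can_perform(self, instance, user, task):
        done = self.history.get(instance, {}).get(user, set())
        for pair in self.conflicts:
            if task in pair and (done & (pair - {task})):
                return False                # dynamic separation of duties
        return True

    def perform(self, instance, user, task):
        if not self.can_perform(instance, user, task):
            raise PermissionError(f"{user} may not do {task}")
        self.history.setdefault(instance, {}).setdefault(user, set()).add(task)
        return self.task_perms[task]        # least privilege: this task only

acl = WorkflowACL({"submit": {"write"}, "approve": {"sign"}},
                  [frozenset({"submit", "approve"})])
acl.perform("wf1", "alice", "submit")
```

Having submitted in workflow instance "wf1", alice is now barred from approving it, while another user remains free to approve; only the performed task's permission set is ever activated.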
Solving bezel reliability and CRT obsolescence
NASA Astrophysics Data System (ADS)
Schwartz, Richard J.; Bowen, Arlen R.; Knowles, Terry
2003-09-01
Scientific Research Corporation designed a Smart Multi-Function Color Display with Positive Pilot Feedback under the funding of a U.S. Navy Small Business Innovative Research program. The Smart Multi-Function Color Display can replace the obsolete monochrome Cathode Ray Tube display currently on the T-45C aircraft built by Boeing. The design utilizes a flat-panel color Active Matrix Liquid Crystal Display (AMLCD) and TexZec's patented Touch Thru Metal bezel technology, providing both visual and biomechanical feedback to the pilot in a form, fit, and function replacement for the current T-45C display. Use of an existing color AMLCD requires the least adaptation to fill the requirements of this application, thereby minimizing the risk associated with developing a new display technology and maximizing the investment in improved user interface technology. The improved user interface uses TexZec's Touch Thru Metal technology to eliminate all of the moving parts that traditionally have limited Mean-Time-Between-Failure. The touch detection circuit consists of Commercial-Off-The-Shelf components, creating touch detection circuitry that is simple and durable. This technology provides robust switch activation and a high level of environmental immunity, both mechanical and electrical. Replacement of all the T-45C multi-function displays with this design will improve the Mean-Time-Between-Failure and drastically reduce display life cycle costs. The design methodology described in this paper can be adapted to any new or replacement display.
Software Aids for radiologists: Part 1, Useful Photoshop skills.
Gross, Joel A; Thapa, Mahesh M
2012-12-01
The purpose of this review is to describe the use of several essential techniques and tools in Adobe Photoshop image-editing software. The techniques shown expand on those previously described in the radiologic literature. Radiologists, especially those with minimal experience with image-editing software, can quickly apply a few essential Photoshop tools to minimize the frustration that can result from attempting to navigate a complex user interface.
Cast Coil Transformer Fire Susceptibility and Reliability Study
1991-04-01
Cast coil transformers reduce risk to the user compared to liquid-filled units, eliminate environmental impacts, are more efficient than most transformer designs, and add minimal risk to the facility in a fire situation. Cast coil transformers have a long record of operation and have proven to be reliable.
Amber Plug-In for Protein Shop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliva, Ricardo
2004-05-10
The Amber Plug-in for ProteinShop has two main components: an AmberEngine library to compute the protein energy models, and a module to solve the energy minimization problem using an optimization algorithm in the OPT++ library. Together, these components allow the visualization of the protein folding process in ProteinShop. AmberEngine is an object-oriented library to compute molecular energies based on the Amber model. The main class is called ProteinEnergy. Its main interface methods are (1) "init", to initialize internal variables needed to compute the energy, and (2) "eval", to evaluate the total energy given a vector of coordinates. Additional methods allow the user to evaluate the individual components of the energy model (bond, angle, dihedral, non-bonded 1-4, and non-bonded energies) and to obtain the energy of each individual atom. The AmberEngine library source code includes examples and test routines that illustrate the use of the library in stand-alone programs. The energy minimization module uses the AmberEngine library and the nonlinear optimization library OPT++, which is open-source software available under the GNU Lesser General Public License. The minimization module currently uses the LBFGS optimization algorithm in OPT++ to perform the energy minimization. Future releases may give the user a choice of other algorithms available in OPT++.
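The minimization loop the plug-in delegates to OPT++ can be illustrated on the simplest Amber term, the harmonic bond energy. This pure-Python sketch substitutes steepest descent with numerical gradients for LBFGS, and all coordinates, constants, and names are invented for illustration:

```python
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def bond_energy(coords, bonds, k=1.0, r0=1.0):
    """Harmonic bond term of an Amber-style force field:
    sum over bonds of k * (r - r0)^2."""
    return sum(k * (dist(coords[i], coords[j]) - r0) ** 2 for i, j in bonds)

def minimize(coords, bonds, steps=500, lr=0.05, h=1e-5):
    """Steepest descent with central-difference numerical gradients;
    a stand-in for the LBFGS routine taken from OPT++."""
    coords = [list(p) for p in coords]
    for _ in range(steps):
        for a in range(len(coords)):
            for d in range(len(coords[a])):
                coords[a][d] += h
                e_plus = bond_energy(coords, bonds)
                coords[a][d] -= 2 * h
                e_minus = bond_energy(coords, bonds)
                coords[a][d] += h          # restore, then step downhill
                coords[a][d] -= lr * (e_plus - e_minus) / (2 * h)
    return coords

# Two atoms placed too far apart relax toward the ideal bond length 1.0
atoms = [(0.0, 0.0), (2.0, 0.0)]
relaxed = minimize(atoms, [(0, 1)])
```

A full Amber evaluation sums angle, dihedral, and non-bonded terms as well, and quasi-Newton methods like LBFGS converge far faster than this first-order sketch.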
Minimal-effort planning of active alignment processes for beam-shaping optics
NASA Astrophysics Data System (ADS)
Haag, Sebastian; Schranner, Matthias; Müller, Tobias; Zontar, Daniel; Schlette, Christian; Losch, Daniel; Brecher, Christian; Roßmann, Jürgen
2015-03-01
In science and industry, the alignment of beam-shaping optics is usually a manual procedure. Many industrial applications utilizing beam-shaping optical systems require more scalable production solutions, and therefore effort has been invested in research regarding the automation of optics assembly. In previous works, the authors and other researchers have proven the feasibility of automated alignment of beam-shaping optics such as collimation lenses or homogenization optics. Nevertheless, the planning effort, as well as the additional knowledge from the fields of automation and control required for such alignment processes, is immense. This paper presents a novel approach to planning active alignment processes for beam-shaping optics, with the focus on minimizing the planning effort for active alignment. The approach utilizes optical simulation and the genetic programming paradigm from computer science to automatically extract, from a simulated data set, features with a high correlation coefficient with respect to the individual degrees of freedom of alignment. The strategy is capable of finding active alignment strategies that can be executed by an automated assembly system. The paper presents a tool making the algorithm available to end users, and it discusses the results of planning the active alignment of the well-known assembly of a fast-axis collimator. The paper concludes with an outlook on the transferability to other use cases, such as application-specific intensity distributions, which will benefit from reduced planning effort.
Bryce, Thomas N.; Dijkers, Marcel P.
2015-01-01
Background: Powered exoskeletons have been demonstrated as being safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. Objective: To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. Methods: A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort was quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact guard assistance. Results: Of 22 enrolled participants, 9 screen-failed, and 7 had complete data. All of these 7 were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact guard or close supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. Conclusion: This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor complete C7 tetraplegia and motor incomplete C4 tetraplegia, may be able to learn to use this device. PMID:26364280
Pharmit: interactive exploration of chemical space.
Sunseri, Jocelyn; Koes, David Ryan
2016-07-08
Pharmit (http://pharmit.csb.pitt.edu) provides an online, interactive environment for the virtual screening of large compound databases using pharmacophores, molecular shape and energy minimization. Users can import, create and edit virtual screening queries in an interactive browser-based interface. Queries are specified in terms of a pharmacophore, a spatial arrangement of the essential features of an interaction, and molecular shape. Search results can be further ranked and filtered using energy minimization. In addition to a number of pre-built databases of popular compound libraries, users may submit their own compound libraries for screening. Pharmit uses state-of-the-art sub-linear algorithms to provide interactive screening of millions of compounds. Queries typically take a few seconds to a few minutes depending on their complexity. This allows users to iteratively refine their search during a single session. The easy access to large chemical datasets provided by Pharmit simplifies and accelerates structure-based drug design. Pharmit is available under a dual BSD/GPL open-source license. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
van Engen-Verheul, Mariëtte M; Peute, Linda W P; de Keizer, Nicolette F; Peek, Niels; Jaspers, Monique W M
2016-03-01
Cumbersome electronic patient record (EPR) interfaces may complicate data-entry in clinical practice. Completeness of data entered in the EPR determines, among other things, the value of computerized clinical decision support (CCDS). Quantitative usability evaluations can provide insight into mismatches between the system design model of data entry and users' data entry behavior, but not into the underlying causes for these mismatches. Mixed method usability evaluation studies may provide these insights, and thus support generating redesign recommendations for improving an EPR system's data entry interface. To improve the usability of the data entry interface of an EPR system with CCDS in the field of cardiac rehabilitation (CR), and additionally, to assess the value of a mixed method usability approach in this context. Seven CR professionals performed a think-aloud usability evaluation both before (beta-version) and after the redesign of the system. Observed usability problems from both evaluations were analyzed and categorized using Zhang et al.'s heuristic principles of good interface design. We combined the think-aloud usability evaluation of the system's beta-version with the measurement of a new usability construct: users' deviations in action sequence from the system's predefined data entry order sequence. Recommendations for redesign were implemented. We assessed whether the redesign improved CR professionals' (1) task efficacy (with respect to the completeness of data they collected), and (2) task efficiency (with respect to the average number of mouse clicks they needed to complete data entry subtasks). With the system's beta version, 40% of health care professionals' navigation actions through the system deviated from the predefined next system action. 
The causes of these deviations, as revealed by the think-aloud method, mostly concerned mismatches between the system design model for data entry action sequences and users' expectations of these action sequences, based on their paper-based daily routines. This caused non-completion of data entry tasks (31% of main tasks completed) and more navigation actions than minimally required (146% of the minimum required). In the redesigned system, the data entry navigational structure was organized flexibly around an overview screen to better mimic users' paper-based daily routines of collecting patient data. This redesign resulted in an increased number of completed main tasks (70%) and a decrease in navigation actions (133% of the minimum required). The think-aloud usability evaluation of the redesigned system showed that the remaining problems concerned flexibility (e.g., lack of customization options) and consistency (mainly with the layout and position of items on the screen). The mixed method usability evaluation was supportive in revealing the magnitude and causes of mismatches between the system design model of data entry and users' data entry behavior. However, as both task efficacy and efficiency were still not optimal with the redesigned EPR, we advise performing a cognitive analysis of end users' mental processes and behavior patterns in daily work processes, specifically during the requirements analysis phase of development of interactive healthcare information systems. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhu, Tingju; Marques, Guilherme Fernandes; Lund, Jay R.
2015-05-01
Efficient reallocation and conjunctive operation of existing water supplies is gaining importance as demands grow, competition among users intensifies, and new supplies become more costly. This paper analyzes the roles and benefits of conjunctive use of surface water and groundwater, and of market-based water transfers, in an integrated regional water system where agricultural and urban water users coordinate supply and demand management based on supply reliability and the economic value of water. Agricultural users optimize land and water use for annual and perennial crops to maximize farm income, while urban users choose short-term and long-term water conservation actions to maintain reliability and minimize costs. The temporal order of these decisions is represented in a two-stage optimization that maximizes the net expected benefits of crop production, urban conservation, and water management including conjunctive use and water transfers. Long-term decisions form the first stage, and short-term decisions form a second stage based on probabilities of water availability events. Analytical and numerical analyses are performed. Results show that conjunctive use and water transfers can substantially stabilize farmers' income and reduce system costs by reducing expensive urban water conservation or construction. Water transfers can equalize marginal values of water across users, while conjunctive use minimizes differences in the marginal value of water over time. The model results are useful for exploring the integration of different water demands and supplies through water transfers, conjunctive use, and conservation, providing valuable insights for improving system management.
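The two-stage structure described above, long-term decisions first and short-term recourse weighted by scenario probabilities, can be reduced to a few lines. All costs, probabilities, and option names below are invented for illustration; the paper's model optimizes continuous crop and conservation portfolios rather than discrete options.

```python
def best_longterm_plan(options, scenarios):
    """Two-stage decision sketch: pick the first-stage (long-term)
    option minimizing expected total cost, where the second-stage
    (short-term shortage) cost depends on the water-availability
    scenario that materializes."""
    def expected_cost(opt):
        fixed = opt["capital_cost"]                      # stage 1
        recourse = sum(p * opt["shortage_cost"][s]       # stage 2
                       for s, p in scenarios.items())
        return fixed + recourse

    return min(options, key=expected_cost)

options = [
    {"name": "no conservation", "capital_cost": 0,
     "shortage_cost": {"wet": 0, "dry": 100}},
    {"name": "long-term conservation", "capital_cost": 30,
     "shortage_cost": {"wet": 0, "dry": 20}},
]
scenarios = {"wet": 0.5, "dry": 0.5}   # probabilities of availability events
plan = best_longterm_plan(options, scenarios)
```

With these numbers, paying the long-term conservation cost up front (expected cost 40) beats relying entirely on expensive short-term measures in dry years (expected cost 50), which mirrors the paper's point about hedging long-term decisions against hydrologic uncertainty.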
Patient Accounting Systems: Are They Fit with the Users' Requirements?
Ayatollahi, Haleh; Nazemi, Zahra; Haghani, Hamid
2016-01-01
A patient accounting system is a subsystem of a hospital information system. This system like other information systems should be carefully designed to be able to meet users' requirements. The main aim of this research was to investigate users' requirements and to determine whether current patient accounting systems meet users' needs or not. This was a survey study, and the participants were the users of six patient accounting systems used in 24 teaching hospitals. A stratified sampling method was used to select the participants (n = 216). The research instruments were a questionnaire and a checklist. The mean value of ≥3 showed the importance of each data element and the capability of the system. Generally, the findings showed that the current patient accounting systems had some weaknesses and were able to meet between 70% and 80% of users' requirements. The current patient accounting systems need to be improved to be able to meet users' requirements. This approach can also help to provide hospitals with more usable and reliable financial information.
A segmentation editing framework based on shape change statistics
NASA Astrophysics Data System (ADS)
Mostapha, Mahmoud; Vicory, Jared; Styner, Martin; Pizer, Stephen
2017-02-01
Segmentation is a key task in medical image analysis because its accuracy significantly affects successive steps. Automatic segmentation methods often produce inadequate segmentations, which require the user to manually edit the produced segmentation slice by slice. Because editing is time-consuming, an editing tool that enables the user to produce accurate segmentations by drawing only a sparse set of contours is needed. This paper describes such a framework as applied to a single object. Constrained by the additional information provided by the manually segmented contours, the proposed framework utilizes object shape statistics to transform the failed automatic segmentation into a more accurate version. Instead of modeling the object shape directly, the proposed framework utilizes shape change statistics generated to capture the object deformation from the failed automatic segmentation to its corresponding correct segmentation. An optimization procedure minimizes an energy function consisting of two terms: an external contour-match term and an internal shape-change regularity term. The high accuracy of the proposed segmentation editing approach was confirmed by testing it on a simulated data set based on 10 in-vivo infant magnetic resonance brain data sets using four similarity metrics. Segmentation results indicated that our method can provide efficient and adequately accurate segmentations (a Dice segmentation accuracy increase of 10%) with very sparse contours (only 10%), which is promising for greatly decreasing the work expected from the user.
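The two-term objective described above, an external contour-match term plus an internal regularity term, can be illustrated in one dimension. The quadratic penalties, the coordinate-descent minimizer, and the sparse constraints below are illustrative assumptions, not the paper's shape-change statistics:

```python
def edit_energy(deform, target_idx, target_vals, lam=1.0):
    """Energy of a candidate deformation: an external term pulling it
    toward sparse user-drawn targets plus an internal term penalizing
    irregular (non-smooth) change between neighbors."""
    external = sum((deform[i] - v) ** 2
                   for i, v in zip(target_idx, target_vals))
    internal = sum((deform[i + 1] - deform[i]) ** 2
                   for i in range(len(deform) - 1))
    return external + lam * internal

def minimize_deform(n, target_idx, target_vals, lam=1.0,
                    steps=2000, lr=0.1, h=1e-5):
    """Coordinate descent with numerical gradients on the energy."""
    deform = [0.0] * n
    for _ in range(steps):
        for i in range(n):
            deform[i] += h
            e_plus = edit_energy(deform, target_idx, target_vals, lam)
            deform[i] -= 2 * h
            e_minus = edit_energy(deform, target_idx, target_vals, lam)
            deform[i] += h
            deform[i] -= lr * (e_plus - e_minus) / (2 * h)
    return deform

# Sparse "user contours" constrain points 0 and 4; the regularity term
# propagates a smooth deformation to the unconstrained points between.
d = minimize_deform(5, [0, 4], [0.0, 2.0])
```

The key behavior, visible in the result, is that a handful of user constraints shape the entire deformation: the interior points interpolate smoothly between the drawn contours, just as sparse contours correct a whole 3D segmentation in the paper.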
Mkpojiogu, Emmanuel O C; Hashim, Nor Laily
2016-01-01
Customer satisfaction is the result of product quality and viability. The perceived satisfaction of users/customers with a software product cannot be neglected, especially in today's competitive market environment, as it drives customer loyalty and promotes high profitability and return on investment. Therefore, understanding the importance of requirements, as it is associated with the satisfaction of users/customers when those requirements are met, is worthwhile. It is necessary to know the relationship between customer satisfaction when requirements are met (or dissatisfaction when requirements are unmet) and the importance of such requirements. Many studies have examined customer satisfaction in connection with the importance of requirements, but the relationship between the customer satisfaction scores (coefficients) of the Kano model and users'/customers' self-stated requirements importance has not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship between the Kano model's customer satisfaction indexes and users'/customers' self-reported requirements importance. The results of the study indicate some interesting associations between the considered variables. These bivariate associations reveal that the customer satisfaction index (SI) and the average satisfaction coefficient (ASC), and likewise the customer dissatisfaction index (DI) and the ASC, are highly correlated (r = 0.96), and thus the ASC can be used in place of either SI or DI in representing customer satisfaction scores. Also, the Kano model's customer satisfaction variables (SI, DI, and ASC) are each associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met, or on features that are incorporated into a product, influences the level of satisfaction such customers derive from the product.
The worth of a product feature is indicated by the perceived satisfaction customers get from the inclusion of such feature in the product design and development. The satisfaction users/customers derive when a requirement is fulfilled or when a feature is placed in the product (SI or ASC) is strongly influenced by the value the users/customers place on such requirements/features when met (IMP). However, the dissatisfaction users/customers received when a requirement is not met or when a feature is not incorporated into the product (DI), even though related to self-stated requirements importance (IMP), does not have a strong effect on the importance/worth (IMP) of that given requirement/feature as perceived by the users or customers. Therefore, since customer satisfaction is proportionally related to the perceived requirements importance (worth), it is then necessary to give adequate attention to user/customer satisfying requirements (features) from elicitation to design and to the final implementation of the design. Incorporating user or customer satisfying requirements in product design is of great worth or value to the future users or customers of the product.
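The Kano coefficients discussed above can be computed from questionnaire response counts. The sketch below uses the Berger-style formulas for SI and DI; treating the ASC as the mean of SI and |DI| is an assumption made here for illustration, and the counts are invented, not the study's data:

```python
import numpy as np

def kano_indices(a, o, m, i):
    """Berger-style Kano coefficients from per-feature response counts:
    a = Attractive, o = One-dimensional, m = Must-be, i = Indifferent."""
    total = a + o + m + i
    si = (a + o) / total           # satisfaction index
    di = -(o + m) / total          # dissatisfaction index (negative by convention)
    return si, di

# Invented counts for five hypothetical product features.
counts = np.array([[30, 10, 5, 5],
                   [5, 25, 15, 5],
                   [10, 20, 10, 10],
                   [20, 15, 5, 10],
                   [5, 10, 30, 5]], dtype=float)

si, di = kano_indices(*counts.T)
asc = (si + np.abs(di)) / 2        # ASC taken here as the mean of SI and |DI|
r = np.corrcoef(asc, si)[0, 1]     # bivariate correlation, as examined in the study
```

With real questionnaire data, `r` between ASC and SI (or DI) is the kind of coefficient the study reports.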
Russ, Alissa L; Saleem, Jason J
2018-02-01
The quality of usability testing is highly dependent upon the associated usability scenarios. To promote usability testing as part of electronic health record (EHR) certification, the Office of the National Coordinator (ONC) for Health Information Technology requires that vendors test specific capabilities of EHRs with clinical end-users and report their usability testing process - including the test scenarios used - along with the results. The ONC outlines basic expectations for usability testing, but there is little guidance in usability texts or the scientific literature on how to develop usability scenarios for healthcare applications. The objective of this article is to outline key factors to consider when developing usability scenarios and tasks to evaluate computer-interface based health information technologies. To achieve this goal, we draw upon a decade of our experience conducting usability tests with a variety of healthcare applications and a wide range of end-users, including healthcare professionals as well as patients. We discuss 10 key factors that influence scenario development: objectives of usability testing; roles of end-user(s); target performance goals; evaluation time constraints; clinical focus; fidelity; scenario-related bias and confounders; embedded probes; minimizing risks to end-users; and healthcare-related outcome measures. For each factor, we present an illustrative example. This article is intended to aid usability researchers and practitioners in their efforts to advance health information technologies. The article provides broad guidance on usability scenario development and can be applied to a wide range of clinical information systems and applications. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Xie, Dengling; Xie, Yanjun; Liu, Peng; Tong, Lieshu; Chu, Kaiqin; Smith, Zachary J.
2017-02-01
Current flow-based blood counting devices require expensive and centralized medical infrastructure and are not appropriate for field use. In this paper we report a method to count red blood cells, white blood cells, as well as platelets with a low-cost and fully-automated blood counting system. The approach consists of using a compact, custom-built microscope with a large field-of-view to record bright-field and fluorescence images of samples that are diluted with a single, stable reagent mixture and counted using automatic algorithms. Sample collection is performed manually using a spring-loaded lancet and volume-metering capillary tubes. The capillaries are then dropped into a tube of pre-measured reagents and gently shaken for 10-30 seconds. The sample is loaded into a measurement chamber and placed on a custom 3D-printed platform. Sample translation and focusing is fully automated, and a user has only to press a button for the measurement and analysis to commence. Cost of the system is minimized through the use of custom-designed motorized components. We performed a series of comparative experiments by trained and untrained users on blood from adults and children. We compare the performance of our system, as operated by trained and untrained users, to the clinical gold standard using a Bland-Altman analysis, demonstrating good agreement of our system with the clinical standard. The system's low cost, complete automation, and good field performance indicate that it can be successfully translated for use in low-resource settings where central hematology laboratories are not accessible.
CMS users data management service integration and first experiences with its NoSQL data storage
NASA Astrophysics Data System (ADS)
Riahi, H.; Spiga, D.; Boccali, T.; Ciangottini, D.; Cinquilli, M.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Santocchia, A.
2014-06-01
The distributed data analysis workflow in CMS assumes that jobs run in a different location to where their results are finally stored. Typically the user outputs must be transferred from one site to another by a dedicated CMS service, AsyncStageOut. This service was originally developed to address the inefficiency of using CMS computing resources when analysis job outputs were transferred synchronously from the job execution node to the remote site. The AsyncStageOut is designed as a thin application relying only on a NoSQL database (CouchDB) as input and data storage. It has progressed from a limited prototype to a highly adaptable service which manages and monitors all stages of user file handling, namely file transfer and publication. The AsyncStageOut is integrated with the Common CMS/ATLAS Analysis Framework. It is expected to manage nearly 200k user files per day from close to 1000 individual users per month with minimal delays, while providing real-time monitoring and reports to users and service operators and remaining highly available. The associated data volume represents a new set of challenges in the areas of database scalability and service performance and efficiency. In this paper, we present an overview of the AsyncStageOut model and the integration strategy with the Common Analysis Framework. The motivations for using the NoSQL technology are also presented, as well as the data design and the techniques used for efficient indexing and monitoring of the data. We describe the deployment model for the high availability and scalability of the service. We also discuss the hardware requirements and the results achieved as they were determined by testing with actual data and realistic loads during the commissioning and the initial production phase with the Common Analysis Framework.
Collaborative Planetary GIS with JMARS
NASA Astrophysics Data System (ADS)
Dickenshied, S.; Christensen, P. R.; Edwards, C. S.; Prashad, L. C.; Anwar, S.; Engle, E.; Noss, D.; Jmars Development Team
2010-12-01
Traditional GIS tools have allowed users to work locally with their own datasets in their own computing environment. More recently, data providers have started offering online repositories of preprocessed data which helps minimize the learning curve required to access new datasets. The ideal collaborative GIS tool provides the functionality of a traditional GIS and easy access to preprocessed data repositories while also enabling users to contribute data, analysis, and ideas back into the very tools they're using. JMARS (Java Mission-planning and Analysis for Remote Sensing) is a suite of geospatial applications developed by the Mars Space Flight Facility at Arizona State University. This software is used for mission planning and scientific data analysis by several NASA missions, including Mars Odyssey, Mars Reconnaissance Orbiter, and the Lunar Reconnaissance Orbiter. It is used by scientists, researchers and students of all ages from more than 40 countries around the world. In addition to offering a rich set of global and regional maps and publicly released orbiter images, the JMARS software development team has been working on ways to encourage the creation of collaborative datasets. Bringing together users from diverse teams and backgrounds allows new features to be developed with an interest in making the application useful and accessible to as wide a potential audience as possible. Actively engaging the scientific community in development strategy and hands on tasks allows the creation of user driven data content that would not otherwise be possible. The first community generated dataset to result from this effort is a tool mapping peer-reviewed papers to the locations they relate to on Mars with links to ancillary data. This allows users of JMARS to browse to an area of interest and then quickly locate papers corresponding to that area. 
Alternately, users can search for published papers over a specified time interval and visually see what areas of Mars have received the most attention over the requested time span.
Lung fissure detection in CT images using global minimal paths
NASA Astrophysics Data System (ADS)
Appia, Vikram; Patil, Uday; Das, Bipul
2010-03-01
Pulmonary fissures separate human lungs into five distinct regions called lobes. Detection of fissures is essential for localization of the lobar distribution of lung diseases, surgical planning and follow-up. Treatment planning also requires calculation of the lobe volume. This volume estimation mandates accurate segmentation of the fissures. Presence of other structures (like vessels) near the fissure, along with its high variability in position and shape, makes lobe segmentation a challenging task. Also, false or incomplete fissures and the occurrence of disease add to the complications of fissure detection. In this paper, we propose a semi-automated fissure segmentation algorithm using a minimal path approach on CT images. An energy function is defined such that the path integral over the fissure is the global minimum. Based on a few user-defined points on a single slice of the CT image, the proposed algorithm minimizes a 2D energy function on the sagittal slice computed using (a) intensity, (b) distance to the vasculature, (c) curvature in 2D, and (d) continuity in 3D. The fissure is the infimum energy path between a representative point on the fissure and the nearest lung boundary point in this energy domain. The algorithm has been tested on 10 CT volume datasets acquired from GE scanners at multiple clinical sites. The datasets span different pathological conditions and varying imaging artifacts.
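The core of such an approach, finding the globally minimal-energy path between two points, can be sketched with Dijkstra's algorithm on a 2-D cost map. This is a simplification: the paper's energy combines intensity, vessel distance, curvature, and 3-D continuity, whereas this toy uses a single precomputed scalar energy per pixel:

```python
import heapq
import numpy as np

def minimal_path(cost, start, end):
    """Dijkstra shortest path on a 2-D energy map (4-connected grid).
    Path cost = sum of node energies; returns a list of (row, col)."""
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    prev = {}
    dist[start] = cost[start]
    pq = [(cost[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == end:
            break
        if d > dist[r, c]:
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], end
    while node != start:                  # walk predecessors back to start
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

# Low-energy "fissure" along row 2 of a high-energy background.
energy = np.full((5, 5), 10.0)
energy[2, :] = 1.0
path = minimal_path(energy, (2, 0), (2, 4))
```

Because every detour off the low-energy row costs more, the recovered path follows the simulated fissure exactly.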
System for robot-assisted real-time laparoscopic ultrasound elastography
NASA Astrophysics Data System (ADS)
Billings, Seth; Deshmukh, Nishikant; Kang, Hyun Jae; Taylor, Russell; Boctor, Emad M.
2012-02-01
Surgical robots provide many advantages for surgery, including minimal invasiveness, precise motion, high dexterity, and crisp stereovision. One limitation of current robotic procedures, compared to open surgery, is the loss of haptic information for such purposes as palpation, which can be very important in minimally invasive tumor resection. Numerous studies have reported the use of real-time ultrasound elastography, in conjunction with conventional B-mode ultrasound, to differentiate malignant from benign lesions. Several groups (including our own) have reported integration of ultrasound with the da Vinci robot, and ultrasound elastography is a very promising image guidance method for robot-assisted procedures that will further enable the role of robots in interventions where precise knowledge of sub-surface anatomical features is crucial. We present a novel robot-assisted real-time ultrasound elastography system for minimally invasive robot-assisted interventions. Our system combines a da Vinci surgical robot with a non-clinical experimental software interface, a robotically articulated laparoscopic ultrasound probe, and our GPU-based elastography system. Elasticity and B-mode ultrasound images are displayed as picture-in-picture overlays in the da Vinci console. Our system minimizes dependence on human performance factors by incorporating computer-assisted motion control that automatically generates the tissue palpation required for elastography imaging, while leaving high-level control in the hands of the user. In addition to ensuring consistent strain imaging, the elastography assistance mode avoids the cognitive burden of tedious manual palpation. Preliminary tests of the system with an elasticity phantom demonstrate the ability to differentiate simulated lesions of varied stiffness and to clearly delineate lesion boundaries.
2012-09-01
…scheduler to adapt its uplink and downlink assignments to channel conditions. Sleep mode is used by the MS to minimize power drain and radio … is addressed in one resource unit, while for multi-user (MU) schemes, multiple users can be scheduled in one resource unit. Open-loop techniques …
Evaluation of construction strategies for PCC pavement rehabilitation projects.
DOT National Transportation Integrated Search
2010-09-30
This study investigated project-management-level solutions for optimizing resources, minimizing costs (including user costs) and time for PCC pavement rehabilitation projects. This study extensively evaluated the applicability of the Construction …
Micromilling: A method for ultra-rapid prototyping of plastic microfluidic devices
Guckenberger, David J.; de Groot, Theodorus E.; Wan, Alwin M.D.; Beebe, David J.; Young, Edmond W. K.
2015-01-01
This tutorial review offers protocols, tips, insight, and considerations for practitioners interested in using micromilling to create microfluidic devices. The objective is to provide a potential user with information to guide them on whether micromilling would fill a specific need within their overall fabrication strategy. Comparisons are made between micromilling and other common fabrication methods for plastics in terms of technical capabilities and cost. The main discussion focuses on “how-to” aspects of micromilling, to enable a user to select proper equipment and tools, and obtain usable microfluidic parts with minimal start-up time and effort. The supplementary information provides more extensive discussion on CNC mill setup, alignment, and programming. We aim to reach an audience with minimal prior experience in milling, but with strong interests in fabrication of microfluidic devices. PMID:25906246
NASA Astrophysics Data System (ADS)
Watson, Clifton L.; Biswas, Subir
2014-06-01
With an increasing demand for spectrum, dynamic spectrum access (DSA) has been proposed as a viable means for providing the flexibility and greater access to spectrum necessary to meet this demand. Within the DSA concept, unlicensed secondary users temporarily "borrow" or access licensed spectrum, while respecting the licensed primary user's rights to that spectrum. As key enablers for DSA, cognitive radios (CRs) are based on software-defined radios which allow them to sense, learn, and adapt to the spectrum environment. These radios can operate independently and rapidly switch channels. Thus, the initial setup and maintenance of cognitive radio networks are dependent upon the ability of CR nodes to find each other, in a process known as rendezvous, and create a link on a common channel for the exchange of data and control information. In this paper, we propose a novel rendezvous protocol, known as QLP, which is based on Q-learning and the p-persistent CSMA protocol. With the QLP protocol, CR nodes learn which channels are best for rendezvous and thus adapt their behavior to visit those channels more frequently. We demonstrate through simulation that the QLP protocol provides a rendezvous capability for DSA environments with different dynamics of PU activity, while attempting to achieve the following performance goals: (1) minimize the average time-to-rendezvous, (2) maximize system throughput, (3) minimize primary user interference, and (4) minimize collisions among CR nodes.
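The channel-learning idea behind QLP can be sketched as a stateless (bandit-style) Q-learning loop. This is a deliberate simplification of the protocol described above (no CSMA, no multi-node coordination), with reward values and parameters chosen purely for illustration:

```python
import random

def learn_channels(pu_busy_prob, episodes=5000, alpha=0.1, eps=0.1, seed=0):
    """Stateless Q-learning sketch of channel selection for rendezvous.
    pu_busy_prob[c] is the probability that the primary user occupies
    channel c; reward is +1 for an idle channel, -1 for a busy one."""
    rng = random.Random(seed)
    q = [0.0] * len(pu_busy_prob)
    for _ in range(episodes):
        if rng.random() < eps:                   # explore a random channel
            c = rng.randrange(len(q))
        else:                                    # exploit the best estimate
            c = max(range(len(q)), key=q.__getitem__)
        reward = -1.0 if rng.random() < pu_busy_prob[c] else 1.0
        q[c] += alpha * (reward - q[c])          # running value estimate
    return q

q = learn_channels([0.9, 0.5, 0.1])              # channel 2 is mostly idle
best = max(range(len(q)), key=q.__getitem__)
```

After training, the node visits the mostly idle channel far more often, mirroring the behavior the abstract describes.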
Young, I; Rajić, A; Hendrick, S; Parker, S; Sanchez, J; McClure, J T; McEwen, S A
2010-04-01
To harmonize good production practices (GPP) for dairy producers in Canada, the Canadian dairy industry has developed and is implementing a program called Canadian Quality Milk (CQM). A postal questionnaire was administered to all Canadian dairy producers enrolled in dairy herd-improvement organizations in 2008 (n=10,474) to investigate their attitudes towards the program and to establish baseline information on their use of GPP. The response percentage was 20.9% (2185/10,474). Two-thirds of producers (67.6%) reported participation in CQM and 61.4% of these indicated that the requirements were easy to implement. Most producers (85.0%) reported the use of cats as a pest-control method in their barns. For dead-livestock disposal, 65.0% and 38.0% indicated use of a collection service and burial, respectively. Nearly 40.0% of respondents indicated that they purchase replacement cattle, and somatic cell-count score was the main health indicator considered before purchase. Over 70% of producers reported that they clean and disinfect maternity, calf and weaned-calf pens, while only 34.1% and 53.1% reported that they provide visitors and employees, respectively, with clean clothes and boots. Through latent-class analysis, five groups (classes) of producers with distinctive patterns of reported use of GPP were identified. These were labelled as "minimal", "sanitation-only", "employee-visitor hygiene", "typical" and "ideal" user groups, with 11.1%, 23.8%, 20.2%, 37.1% and 7.7% of respondents, respectively. Respondents in the "ideal users" group had a higher probability of reporting the use of each GPP and were more likely to have completed an educational course in food safety compared to respondents in each other group. They were also more likely to have a herd size in the uppermost quartile (>65 cows) and report participation in CQM compared to each other group except the "employee-visitor hygiene users". 
The greatest differences were observed when compared to the "minimal users" group for completion of a food-safety course (OR=2.81), participation in CQM (OR=2.39) and having a herd size of >65 vs. <36 cows (OR=3.04). Targeted education of dairy producers on the importance of various GPP (e.g. detailed health assessments of replacement cattle before purchase) for infection control is warranted.
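The odds ratios reported above compare the odds of a characteristic (e.g. completion of a food-safety course) between two user groups. A minimal sketch of the calculation, with invented counts rather than the study's data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a = group 1 with trait, b = group 1 without,
    c = group 2 with trait, d = group 2 without."""
    return (a / b) / (c / d)

# Hypothetical counts: food-safety course completion, "ideal" vs "minimal" users.
or_course = odds_ratio(60, 40, 35, 65)
```

An odds ratio above 1 indicates the trait is more common in the first group, as with the "ideal users" group in the study.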
Towards Efficient Scientific Data Management Using Cloud Storage
NASA Technical Reports Server (NTRS)
He, Qiming
2013-01-01
A software prototype allows users to back up and restore data to/from both public and private cloud storage such as Amazon's S3 and NASA's Nebula. Unlike other off-the-shelf tools, this software ensures user data security in the cloud (through encryption), and minimizes users' operating costs by using space- and bandwidth-efficient compression and incremental backup. Parallel data processing utilities have also been developed by using massively scalable cloud computing in conjunction with cloud storage. One of the innovations in this software is using modified open source components to work with a private cloud like NASA Nebula. Another innovation is porting the complex backup-to-cloud software to embedded Linux, running on home networking devices, in order to benefit more users.
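The space- and bandwidth-efficient incremental backup described above can be sketched with content-addressed chunks: only chunks whose hash is not already in the store are compressed and uploaded. This is an assumed design for illustration, not the prototype's actual implementation, and the dictionary stands in for a cloud object store such as S3 or Nebula:

```python
import hashlib
import zlib

def incremental_backup(chunks, remote):
    """Upload only chunks whose content hash is not already stored.
    `remote` maps sha256 digest -> compressed bytes."""
    uploaded = 0
    manifest = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        manifest.append(digest)
        if digest not in remote:
            remote[digest] = zlib.compress(chunk)   # space-efficient upload
            uploaded += 1
    return manifest, uploaded

def restore(manifest, remote):
    """Reassemble a backup from its manifest of chunk digests."""
    return b"".join(zlib.decompress(remote[d]) for d in manifest)

store = {}
m1, up1 = incremental_backup([b"aaaa", b"bbbb"], store)
m2, up2 = incremental_backup([b"aaaa", b"cccc"], store)   # only "cccc" is new
```

The second backup transfers one chunk instead of two, which is the bandwidth saving incremental backup provides.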
NASA Technical Reports Server (NTRS)
Liberman, Eugene M.; Manner, David B.; Dolce, James L.; Mellor, Pamela A.
1993-01-01
A user interface to the power distribution expert system for Space Station Freedom is discussed. The importance of features which simplify assessing system status and which minimize navigating through layers of information are examined. Design rationale and implementation choices are also presented. The amalgamation of such design features as message linking arrows, reduced information content screens, high salience anomaly icons, and color choices with failure detection and diagnostic explanation from an expert system is shown to provide an effective status-at-a-glance monitoring system for power distribution. This user interface design offers diagnostic reasoning without compromising the monitoring of current events. The display can convey complex concepts in terms that are clear to its users.
Weyand, Sabine; Takehara-Nishiuchi, Kaori; Chau, Tom
2015-10-30
Near-infrared spectroscopy (NIRS) brain-computer interfaces (BCIs) enable users to interact with their environment using only cognitive activities. This paper presents the results of a comparison of four methodological frameworks used to select a pair of tasks to control a binary NIRS-BCI; specifically, three novel personalized task paradigms and the state-of-the-art prescribed task framework were explored. Three types of personalized task selection approaches were compared, including: user-selected mental tasks using weighted slope scores (WS-scores), user-selected mental tasks using pair-wise accuracy rankings (PWAR), and researcher-selected mental tasks using PWAR. These paradigms, along with the state-of-the-art prescribed mental task framework, where mental tasks are selected based on the most commonly used tasks in literature, were tested by ten able-bodied participants who took part in five NIRS-BCI sessions. The frameworks were compared in terms of their accuracy, perceived ease-of-use, computational time, user preference, and length of training. Most notably, researcher-selected personalized tasks resulted in significantly higher accuracies, while user-selected personalized tasks resulted in significantly higher perceived ease-of-use. It was also concluded that PWAR minimized the amount of data that needed to be collected; while, WS-scores maximized user satisfaction and minimized computational time. In comparison to the state-of-the-art prescribed mental tasks, our findings show that overall, personalized tasks appear to be superior to prescribed tasks with respect to accuracy and perceived ease-of-use. The deployment of personalized rather than prescribed mental tasks ought to be considered and further investigated in future NIRS-BCI studies. Copyright © 2015 Elsevier B.V. All rights reserved.
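Pair-wise ranking of candidate task pairs (PWAR) can be sketched as follows; note that the framework described above ranks pairs by classification accuracy on NIRS data, whereas this toy uses a simple separability score on invented scalar features as a stand-in:

```python
import itertools
import statistics

def pwar_rank(task_signals):
    """Rank candidate task pairs by a crude separability score
    (distance between mean signals, in pooled-stdev units).
    task_signals: dict of task name -> list of scalar trial features."""
    scores = {}
    for a, b in itertools.combinations(task_signals, 2):
        xa, xb = task_signals[a], task_signals[b]
        pooled = (statistics.stdev(xa) + statistics.stdev(xb)) / 2
        scores[(a, b)] = abs(statistics.mean(xa) - statistics.mean(xb)) / pooled
    return sorted(scores, key=scores.get, reverse=True)

# Invented trial features for three candidate mental tasks.
trials = {"mental math": [1.0, 1.2, 0.9],
          "word generation": [1.1, 1.0, 1.2],
          "motor imagery": [2.0, 2.2, 1.9]}
ranking = pwar_rank(trials)
```

The top-ranked pair is the one whose signals are easiest to tell apart, which is the pair a binary BCI would want to control its two states.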
Development of regional climate scenarios in the Netherlands - involvement of users
NASA Astrophysics Data System (ADS)
Bessembinder, Janette; Overbeek, Bernadet
2013-04-01
Climate scenarios are consistent and plausible pictures of possible future climates. They are intended for use in studies exploring the impacts of climate change, and to formulate possible adaptation strategies. To ensure that the developed climate scenarios are relevant to the intended users, interaction with the users is needed. As part of the research programmes "Climate changes Spatial Planning" and "Knowledge for Climate", several projects on climate services, tailoring of climate information and communication were conducted. Some of the important lessons learned about user interaction are: *) To be able to deliver relevant climate information in the right format, proper knowledge is required on who will be using the climate information and data, how it will be used and why they use it; *) Users' requirements can be very diverse and requirements may change over time. Therefore, sustained (personal) contact with users is required; *) Organising meetings with climate researchers and users of climate information together, and working together in projects, results in mutual understanding of the requirements of users and the limitations to deliver certain types of climate information, which facilitates the communication and results in more widely accepted products; *) Information and communication should be adapted to the type of users (e.g. impact researchers or policy makers) and to the type of problem (unstructured problems require much more contact with the users). In 2001 KNMI developed climate scenarios for the National Commission on Water management in the 21st century (WB21 scenarios). In 2006 these were replaced by the KNMI'06 scenarios, intended for a broader group of users.
The above lessons are now taken into account during the development of the next generation of climate scenarios for the Netherlands, expected at the end of 2013, after the publication of the IPCC WG1 report: *) users' requirements are taken into account explicitly in the whole process of the development of the climate scenarios; *) users are involved already in the early phases of the development of new scenarios, among others in the following ways: **) workshops on users' requirements, to check whether they have changed and to get more information; **) a feedback group of users, to get more detailed feedback on the modes of communication; **) a newsletter with information on the progress and procedures to be followed, and separate workshops for researchers and policy makers with different levels of detail; **) projects together with impact researchers on tailoring of data, in order to be able to present impact information consistent with the climate scenarios much earlier. During the presentation more detailed information will be given on the interaction with users.
Human-telerobot interactions - Information, control, and mental models
NASA Technical Reports Server (NTRS)
Smith, Randy L.; Gillan, Douglas J.
1987-01-01
A part of the NASA's Space Station will be a teleoperated robot (telerobot) with arms for grasping and manipulation, feet for holding onto objects, and television cameras for visual feedback. The objective of the work described in this paper is to develop the requirements and specifications for the user-telerobot interface and to determine through research and testing that the interface results in efficient system operation. The focus of the development of the user-telerobot interface is on the information required by the user, the user inputs, and the design of the control workstation. Closely related to both the information required by the user and the user's control of the telerobot is the user's mental model of the relationship between the control inputs and the telerobot's actions.
Gulla, Joy; Neri, Pamela M; Bates, David W; Samal, Lipika
2017-05-01
Timely referral of patients with CKD has been associated with cost and mortality benefits, but referrals are often done too late in the course of the disease. Clinical decision support (CDS) offers a potential solution, but interventions have failed because they were not designed to support the physician workflow. We sought to identify user requirements for a chronic kidney disease (CKD) CDS system to promote timely referral. We interviewed primary care physicians (PCPs) to identify data needs for a CKD CDS system that would encourage timely referral and also gathered information about workflow to assess risk factors for progression of CKD. Interviewees were general internists recruited from a network of 14 primary care clinics affiliated with Brigham and Women's Hospital (BWH). We then performed a qualitative analysis to identify user requirements and system attributes for a CKD CDS system. Of the 12 participants, 25% were women, the mean age was 53 (range 37-82), mean years in clinical practice was 27 (range 11-58). We identified 21 user requirements. Seven of these user requirements were related to support for the referral process workflow, including access to pertinent information and support for longitudinal co-management. Six user requirements were relevant to PCP management of CKD, including management of risk factors for progression, interpretation of biomarkers of CKD severity, and diagnosis of the cause of CKD. Finally, eight user requirements addressed user-centered design of CDS, including the need for actionable information, links to guidelines and reference materials, and visualization of trends. These 21 user requirements can be used to design an intuitive and usable CDS system with the attributes necessary to promote timely referral. Copyright © 2017 Elsevier B.V. All rights reserved.
The transfer of East Coast fever immunisation to veterinary paraprofessionals in Zambia.
Marcotty, T; Chaka, G; Brandt, J; Berkvens, D; Thys, E; Mulumba, M; Mataa, L; Van den Bossche, P
2008-12-01
In eastern Zambia, immunisation by 'infection and treatment' is the main method used to control East Coast fever, an acute and lethal cattle disease. This service, which requires a stringent cold chain, used to be free of charge. When a minimal user fee was introduced, attendance dropped drastically. Consequently, this complex immunisation programme was transferred to veterinary paraprofessionals working on their own account, with the aim of boosting a more sustainable distribution of the vaccine. Paraprofessionals were provided with a motorbike and the required specific equipment, but fuel and drugs were at their own expense. The paraprofessionals recovered their costs, with a profit margin, by charging the cattle owners for immunisation. The reasons for the successful transfer of immunisation to paraprofessionals (despite the maintenance of a fee) are attributed mainly to the absence of information asymmetry between the paraprofessional and the livestock owner, the appreciable level of effort of the paraprofessionals and the verifiable outcome of the service provided.
An integrated dexterous robotic testbed for space applications
NASA Technical Reports Server (NTRS)
Li, Larry C.; Nguyen, Hai; Sauer, Edward
1992-01-01
An integrated dexterous robotic system was developed as a testbed to evaluate various robotics technologies for advanced space applications. The system configuration consisted of a Utah/MIT Dexterous Hand, a PUMA 562 arm, a stereo vision system, and a multiprocessing computer control system. In addition to these major subsystems, a proximity sensing system was integrated with the Utah/MIT Hand to provide capability for non-contact sensing of a nearby object. A high-speed fiber-optic link was used to transmit digitized proximity sensor signals back to the multiprocessing control system. The hardware system was designed to satisfy the requirements for both teleoperated and autonomous operations. The software system was designed to exploit parallel processing capability, pursue functional modularity, incorporate artificial intelligence for robot control, allow high-level symbolic robot commands, maximize reusable code, minimize compilation requirements, and provide an interactive application development and debugging environment for the end users. An overview is presented of the system hardware and software configurations, and implementation is discussed of subsystem functions.
NASA Astrophysics Data System (ADS)
Brandic, Ivona; Music, Dejan; Dustdar, Schahram
Nowadays, novel computing paradigms such as Cloud Computing are gaining in importance. In Cloud Computing, users pay for the usage of computing power provided as a service. Beforehand, they can negotiate specific functional and non-functional requirements relevant to application execution. However, providing computing power as a service poses several research challenges. On the one hand, dynamic, versatile, and adaptable services are required that can cope with system failures and environmental changes. On the other hand, human interaction with the system should be minimized. In this chapter we present the first results in establishing adaptable, versatile, and dynamic services, considering negotiation bootstrapping and service mediation, achieved in the context of the Foundations of Self-Governing ICT Infrastructures (FoSII) project. We discuss novel meta-negotiation and SLA mapping solutions for Cloud services, bridging the gap between current QoS models and Cloud middleware and representing important prerequisites for the establishment of autonomic Cloud services.
A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.
O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A
2015-02-01
Research in understanding biofilm formation depends on accurate and representative measurements of the steric forces arising from polymer brushes on bacterial surfaces. A MATLAB program has been developed to analyze force curves from an AFM efficiently, accurately, and with minimal user bias. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the time required to process 100 force curves from several days to less than 2 min. Using this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison, thereby enabling higher-quality results in a shorter period of time. Copyright © 2014 Elsevier B.V. All rights reserved.
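The brush force the abstract describes can be sketched in a few lines. The following is a minimal Python sketch of one common form of the Alexander-de Gennes steric force used in AFM work; the paper's modified AdG variant and its density variable may differ, and all parameter values here are illustrative:

```python
import math

def adg_steric_force(D, L0, s, R, T=298.0):
    """Steric force (N) between a spherical AFM probe of radius R (m) and a
    polymer brush of equilibrium thickness L0 (m) with mean anchor spacing
    s (m), at tip-sample separation D (m) and temperature T (K).

    One common form of the Alexander-de Gennes (AdG) brush force; valid for
    0 < D < 2*L0. Beyond brush contact the force is taken as zero.
    """
    k_B = 1.380649e-23                  # Boltzmann constant, J/K
    if D >= 2.0 * L0:
        return 0.0
    u = D / (2.0 * L0)                  # normalized separation
    bracket = 7.0 * u ** -1.25 + 5.0 * u ** 1.75 - 12.0
    return (16.0 * math.pi * k_B * T * R * L0 / (35.0 * s ** 3)) * bracket
```

Note that the bracketed term vanishes at D = 2\*L0, so the force goes continuously to zero at brush contact and grows steeply as the probe compresses the brush.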
Code of Federal Regulations, 2010 CFR
2010-07-01
... Radiotelephone Act; or (b) Required to participate in a VMRS within a VTS area (VMRS User). VTS User's Manual...) User means a vessel, or an owner, operator, charterer, Master, or person directing the movement of a... which special operating requirements apply. VTS User means a vessel, or an owner, operator, charterer...
Requirements for the Military Message System (MMS) Family: Data Types and User Commands.
1986-04-11
AD-A167 126 REQUIREMENTS FOR THE MILITARY MESSAGE SYSTEM (MMS) FAMILY: DATA TYPES AND USER COMMANDS(U) NAVAL RESEARCH LAB WASHINGTON DC C L HEITMEYER... System (MMS) Family: Data Types and User Commands CONSTANCE L. HEITMEYER Computer Science and Systems Branch Information Technology Division April 11... Security Classification) Requirements for the Military Message System (MMS) Family: Data Types and User Commands 12. PERSONAL AUTHOR(S) Heitmeyer, Constance
Patient Accounting Systems: Are They Fit with the Users' Requirements?
Ayatollahi, Haleh; Nazemi, Zahra
2016-01-01
Objectives A patient accounting system is a subsystem of a hospital information system. Like other information systems, it should be carefully designed to meet users' requirements. The main aim of this research was to investigate users' requirements and to determine whether current patient accounting systems meet users' needs. Methods This was a survey study, and the participants were the users of six patient accounting systems used in 24 teaching hospitals. A stratified sampling method was used to select the participants (n = 216). The research instruments were a questionnaire and a checklist. A mean value of ≥3 indicated the importance of each data element and the capability of the system. Results Generally, the findings showed that the current patient accounting systems had some weaknesses and were able to meet between 70% and 80% of users' requirements. Conclusions The current patient accounting systems need to be improved to meet users' requirements. This approach can also help to provide hospitals with more usable and reliable financial information. PMID:26893945
Environmental management system for transportation maintenance operations : [technical brief].
DOT National Transportation Integrated Search
2014-04-01
This report provides the framework for an environmental management system to analyze greenhouse gas emissions from transportation maintenance operations. The system enables users to compare different scenarios and make informed decisions to minim...
Research into display sharing techniques for distributed computing environments
NASA Technical Reports Server (NTRS)
Hugg, Steven B.; Fitzgerald, Paul F., Jr.; Rosson, Nina Y.; Johns, Stephen R.
1990-01-01
The X-based Display Sharing solution for distributed computing environments is described. The Display Sharing prototype includes the base functionality for telecast and display copy requirements. Since the prototype implementation is modular and the system design provided flexibility for Mission Control Center Upgrade (MCCU) operational considerations, the prototype implementation can serve as the baseline for a production Display Sharing implementation. To facilitate this process, the following discussions are presented: theory of operation; system architecture; using the prototype; software description; research tools; prototype evaluation; and outstanding issues. The prototype is based on the concept of a dedicated central host performing the majority of the Display Sharing processing, allowing minimal impact on each individual workstation. Each workstation participating in Display Sharing hosts programs that facilitate the user's access to Display Sharing from that machine.
Image Navigation and Registration Performance Assessment Evaluation Tools for GOES-R ABI and GLM
NASA Technical Reports Server (NTRS)
Houchin, Scott; Porter, Brian; Graybill, Justin; Slingerland, Philip
2017-01-01
The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24 hour evaluation period. This paper describes the software design and implementation of IPATS and provides preliminary test results.
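The prefix-reuse idea behind such a modular pipeline design can be illustrated with a small sketch. The stage names and operations below are hypothetical, not IPATS's actual processing chains: chains selected for different metrics that share a common prefix of stages reuse the cached intermediate result instead of recomputing it.

```python
def run_chain(chain, stages, data, cache):
    """Run a processing chain (ordered list of stage names) over `data`,
    caching each stage's output keyed by the chain prefix, so that chains
    sharing a common prefix reuse earlier results rather than duplicating
    work. Illustrative only; not IPATS's real stages.
    """
    key = ()
    out = data
    for name in chain:
        key += (name,)               # cache key = chain prefix so far
        if key not in cache:
            cache[key] = stages[name](out)
        out = cache[key]
    return out

# Two hypothetical metrics share an expensive "ingest" stage.
calls = {"ingest": 0, "register": 0, "navigate": 0}

def counted(name, fn):
    def stage(x):
        calls[name] += 1             # count invocations to show reuse
        return fn(x)
    return stage

stages = {
    "ingest": counted("ingest", lambda x: x * 2),
    "register": counted("register", lambda x: x + 1),
    "navigate": counted("navigate", lambda x: x + 10),
}
cache = {}
a = run_chain(["ingest", "register"], stages, 3, cache)   # -> 7
b = run_chain(["ingest", "navigate"], stages, 3, cache)   # -> 16
```

After both chains run, the shared "ingest" stage has executed only once, which is the duplication-avoidance property the abstract attributes to the modular design.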
Usability factors of mobile health application for chronic diseases
NASA Astrophysics Data System (ADS)
Zahra, Fatima; Hussain, Azham; Mohd, Haslina
2016-08-01
M-health has changed the conventional delivery of health care, permitting continuous, pervasive health care anywhere, anytime. Chronic disease apps are increasing in number, as many health workers, patients, and clinicians already embrace smartphones in their comprehensive and diverse practices. Many challenges and requirements need to be addressed for mobile health applications to prevent or eliminate design problems and minimize potential threats to users; the factors proposed here for chronic disease mobile applications can be used as a guide for app developers. Meanwhile, usability testing and evaluation of chronic disease apps have not yet reached the level of other web-based applications. This study is being conducted to learn about the challenges of m-health apps and to identify the factors that affect the usability of such applications.
Goh, Glenn; Tan, Ngiap Chuan; Malhotra, Rahul; Padmanabhan, Uma; Barbier, Sylvaine; Allen, John Carson; Østbye, Truls
2015-02-03
Self-management plays an important role in maintaining good control of diabetes mellitus, and mobile phone interventions have been shown to improve such self-management. The Health Promotion Board of Singapore has created a caloric-monitoring mobile health app, the "interactive Diet and Activity Tracker" (iDAT). The objective was to identify and describe short-term (8-week) trajectories of use of the iDAT app among patients with type 2 diabetes mellitus in a primary care setting in Singapore, and identify patient characteristics associated with each trajectory. A total of 84 patients with type 2 diabetes mellitus from a public primary care clinic in Singapore who had not previously used the iDAT app were enrolled. The app was demonstrated and patients' weekly use of the app was monitored over 8 weeks. Weekly use was defined as any record in terms of food entry or exercise workout entry in that week. Information on demographics, diet and exercise motivation, diabetes self-efficacy (Diabetes Empowerment Scale-Short Form), and clinical variables (body mass index, blood pressure, and glycosylated hemoglobin/HbA1c) were collected at baseline. iDAT app use trajectories were delineated using latent-class growth modeling (LCGM). Association of patient characteristics with the trajectories was ascertained using logistic regression analysis. Three iDAT app use trajectories were observed: Minimal Users (66 out of 84 patients, 78.6%, with either no iDAT use at all or use only in the first 2 weeks), Intermittent-Waning Users (10 out of 84 patients, 11.9%, with occasional weekly use mainly in the first 4 weeks), and Consistent Users (8 out of 84 patients, 9.5%, with weekly use throughout all or most of the 8 weeks). The adjusted odds ratio of being a Consistent User, relative to a Minimal User, was significantly higher for females (OR 19.55, 95% CI 1.78-215.42) and for those with higher exercise motivation scores at baseline (OR 4.89, 95% CI 1.80-13.28). 
The adjusted odds ratio of being an Intermittent-Waning User relative to a Minimal User was also significantly higher for those with higher exercise motivation scores at baseline (OR 1.82, 95% CI 1.00-3.32). This study provides insight into the nature and extent of usage of a caloric-monitoring app among patients with type 2 diabetes and managed in primary care. The application of LCGM provides a useful framework for evaluating future app use in other patient populations.
Unique strategies for technical information management at Johnson Space Center
NASA Technical Reports Server (NTRS)
Krishen, Vijay
1994-01-01
In addition to the current NASA manned programs, the maturation of Space Station and the introduction of the Space Exploration programs are anticipated to add substantially to the number and variety of data and documentation at NASA Johnson Space Center (JSC). This growth in the next decade has been estimated at five- to ten-fold compared to the current numbers. There will be an increased requirement for the tracking and currency of space program data and documents, with national pressure to realize economic benefits from the research and technological developments of space programs. From a global perspective, the demand for NASA's technical data and documentation is anticipated to increase at local, national, and international levels. The primary users will be government, industry, and academia. In the present national strategy, NASA's research and technology will assume a greater role in the revitalization of the economy and in gaining international competitiveness. Thus, greater demand will be placed on NASA's data and documentation resources. In this paper the strategies and procedures developed by DDMS, Inc., to accommodate present and future information utilization needs are presented. The DDMS, Inc., strategies and procedures rely on understanding user requirements, library management issues, and technological applications for acquiring, searching, storing, and retrieving specific information accurately and quickly. The proposed approach responds to changing customer requirements and product deliveries. The unique features of the proposed strategy include: (1) To establish customer-driven data and documentation management through innovative and unique methods of identifying needs and requirements. (2) To implement a structured process which responds to user needs, aimed at minimizing costs and maximizing services, resulting in increased productivity. (3) To provide a process of standardization of services and procedures.
This standardization is the central theme of the strategic approach. It will allow Division-level Data and Documentation Libraries (DDLs) to function independently and optimize efficiency at the Directorate level. This process also facilitates interconnectivity between Division-level DDLs and makes them transparent to users. (4) To implement a 'cost savings' process while gaining substantial improvement in the organization, categorization, and preservation of JSC-generated data and documentation. (5) To find, locate, retrace, restore, and preserve the crucial Center-generated scientific and technical information that has been and is being provided by the engineers and scientists of JSC; this is important to the preservation of 'lessons learned'. Preliminary estimates of the cost savings expected from implementing this process are also discussed in this paper.
ERIC Educational Resources Information Center
Anderson, Barry D.
Little is known about the costs of setting up and implementing legislated minimal competency testing (MCT). To estimate the financial obstacles which lie between the idea and its implementation, MCT requirements are viewed from two perspectives. The first, government regulation, views legislated minimal competency requirements as an attempt by the…
McCarthy, Ilana Olin; Wojno, Abbey E; Joseph, Heather A; Teesdale, Scott
2017-11-14
The response to the 2014-2016 Ebola epidemic included an unprecedented effort from federal, state, and local public health authorities to monitor the health of travelers entering the United States from countries with Ebola outbreaks. The Check and Report Ebola (CARE) Hotline, a novel approach to monitoring, was designed to enable travelers to report their health status daily to an interactive voice recognition (IVR) system. The system was tested with 70 Centers for Disease Control and Prevention (CDC) federal employees returning from deployments in outbreak countries. The objective of this study was to describe the development of the CARE Hotline as a tool for postarrival monitoring and examine the usage characteristics and user experience of the tool during a public health emergency. Data were obtained from two sources. First, the CARE Hotline system produced a call log which summarized the usage characteristics of all 70 users' daily health reports. Second, we surveyed federal employees (n=70) who used the CARE Hotline to engage in monitoring. A total of 21 (21/70, 30%) respondents were included in the survey analytic sample. While the CARE Hotline was used for monitoring, 70 users completed a total of 1313 calls. We found that 94.06% (1235/1313) of calls were successful, and the average call time significantly decreased from the beginning of the monitoring period to the end by 32 seconds (Z score=-6.52, P<.001). CARE Hotline call log data were confirmed by user feedback; survey results indicated that users became more familiar with the system and found the system easier to use, from the beginning to the end of their monitoring period. The majority of the users were highly satisfied (90%, 19/21) with the system, indicating ease of use and convenience as primary reasons, and would recommend it for future monitoring efforts (90%, 19/21). 
The CARE Hotline garnered high user satisfaction, required minimal reporting time from users, and was an easily learned tool for monitoring. This phone-based technology can be modified for future public health emergencies. ©Ilana Olin McCarthy, Abbey E Wojno, Heather A Joseph, Scott Teesdale. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.11.2017.
A Fast and Robust Poisson-Boltzmann Solver Based on Adaptive Cartesian Grids
Boschitsch, Alexander H.; Fenley, Marcia O.
2011-01-01
An adaptive Cartesian grid (ACG) concept is presented for the fast and robust numerical solution of the 3D Poisson-Boltzmann Equation (PBE) governing the electrostatic interactions of large-scale biomolecules and highly charged multi-biomolecular assemblies such as ribosomes and viruses. The ACG offers numerous advantages over competing grid topologies such as regular 3D lattices and unstructured grids. For very large biological molecules and multi-biomolecule assemblies, the total number of grid points is several orders of magnitude less than that required in a conventional lattice grid used in current PBE solvers, thus allowing the end user to obtain accurate and stable nonlinear PBE solutions on a desktop computer. Compared to tetrahedral-based unstructured grids, the ACG offers a simpler hierarchical grid structure, which is naturally suited to multigrid, relieves indirect addressing requirements, and uses fewer neighboring nodes in the finite difference stencils. Construction of the ACG and determination of the dielectric/ionic maps are straightforward, fast, and require minimal user intervention. Charge singularities are eliminated by reformulating the problem to produce the reaction field potential in the molecular interior and the total electrostatic potential in the exterior ionic solvent region. This approach minimizes grid dependency and alleviates the need for fine grid spacing near atomic charge sites. The technical portion of this paper contains three parts. First, the ACG and its construction for general biomolecular geometries are described. Next, a discrete approximation to the PBE upon this mesh is derived. Finally, the overall solution procedure and multigrid implementation are summarized.
Results obtained with the ACG-based PBE solver are presented for: (i) a low dielectric spherical cavity, containing interior point charges, embedded in a high dielectric ionic solvent – analytical solutions are available for this case, thus allowing rigorous assessment of the solution accuracy; (ii) a pair of low dielectric charged spheres embedded in an ionic solvent to compute electrostatic interaction free energies as a function of the distance between sphere centers; (iii) surface potentials of proteins, nucleic acids and their larger-scale assemblies such as ribosomes; and (iv) electrostatic solvation free energies and their salt sensitivities – obtained with both the linear and nonlinear Poisson-Boltzmann equation – for a large set of proteins. These latter results along with timings can serve as benchmarks for comparing the performance of different PBE solvers. PMID:21984876
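As a toy illustration of the screened electrostatics such solvers compute, the sketch below solves the 1D linearized Poisson-Boltzmann (Debye-Hückel) equation with second-order finite differences and the Thomas tridiagonal algorithm. This is a minimal stand-in for the 3D problem, not the paper's adaptive Cartesian grid method:

```python
def solve_linear_pbe_1d(kappa, L, n, phi0):
    """Solve phi''(x) = kappa^2 * phi(x) on [0, L] with phi(0) = phi0 and
    phi(L) = 0, using second-order finite differences and the Thomas
    (tridiagonal) algorithm. Returns the grid points and the potential.
    """
    h = L / n
    diag = 2.0 + (kappa * h) ** 2    # -phi_{i-1} + diag*phi_i - phi_{i+1} = 0
    m = n - 1                        # interior unknowns phi_1 .. phi_{n-1}
    cp = [0.0] * m                   # modified superdiagonal (forward sweep)
    dp = [0.0] * m                   # modified right-hand side
    cp[0] = -1.0 / diag
    dp[0] = phi0 / diag              # phi(0) boundary value enters the RHS
    for i in range(1, m):
        denom = diag + cp[i - 1]
        cp[i] = -1.0 / denom
        dp[i] = dp[i - 1] / denom    # interior RHS is zero
    phi = [0.0] * (n + 1)
    phi[0] = phi0                    # Dirichlet boundary values
    phi[m] = dp[m - 1]               # back substitution
    for i in range(m - 1, 0, -1):
        phi[i] = dp[i - 1] - cp[i - 1] * phi[i + 1]
    return [i * h for i in range(n + 1)], phi

# With kappa*L >> 1 the result closely tracks phi0 * exp(-kappa * x).
x, phi = solve_linear_pbe_1d(kappa=1.0, L=10.0, n=1000, phi0=1.0)
```

For kappa\*L large the far boundary barely matters and the computed potential matches the analytical semi-infinite decay phi0\*exp(-kappa\*x) to within the discretization error.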
NASA Technical Reports Server (NTRS)
Choi, H. J.; Su, Y. T.
1986-01-01
The User Constraint Measurement System (UCMS) is a hardware/software package developed by NASA Goddard to measure the signal parameter constraints of the user transponder in the TDRSS environment by means of an all-digital signal sampling technique. An account is presently given of the features of UCMS design and of its performance capabilities and applications; attention is given to such important aspects of the system as RF interface parameter definitions, hardware minimization, the emphasis on offline software signal processing, and end-to-end link performance. Applications to the measurement of other signal parameters are also discussed.
Heroin-associated anthrax with minimal morbidity.
Black, Heather; Chapman, Ann; Inverarity, Donald; Sinha, Satyajit
2017-03-08
In 2010, during an outbreak of anthrax affecting people who inject drugs, a heroin user aged 37 years presented with soft tissue infection. He subsequently was found to have anthrax. We describe his management and the difficulty in distinguishing anthrax from non-anthrax lesions. His full recovery, despite an overall mortality of 30% for injectional anthrax, demonstrates that some heroin-related anthrax cases can be managed predominately with oral antibiotics and minimal surgical intervention. 2017 BMJ Publishing Group Ltd.
An Algorithm to Atmospherically Correct Visible and Thermal Airborne Imagery
NASA Technical Reports Server (NTRS)
Rickman, Doug L.; Luvall, Jeffrey C.; Schiller, Stephen; Arnold, James E. (Technical Monitor)
2000-01-01
The program Watts implements a system of physically based models, developed by the authors and described elsewhere, for the removal of atmospheric effects in multispectral imagery. The band range treated covers the visible, near IR, and the thermal IR. Input to the program begins with atmospheric models specifying transmittance and path radiance. The system also requires the sensor's spectral response curves and knowledge of the scanner's geometric definition. Radiometric characterization of the sensor during data acquisition is also necessary. While the authors contend that active calibration is critical for serious analytical efforts, we recognize that most remote sensing systems, either airborne or spaceborne, do not as yet attain that minimal level of sophistication. Therefore, Watts will also use semi-active calibration where necessary and available. All of the input is then reduced to common physical units, from which it is practical to convert raw sensor readings into geophysically meaningful units. A large number of intricate details are necessary to bring an algorithm of this type to fruition and even to use the program. Further, at this stage of development the authors are uncertain as to the optimal presentation or the minimal analytical techniques that users of this type of software must have. Therefore, Watts permits users to break out and analyze the input in various ways. Implemented in REXX under OS/2, the program is designed with attention to the probability that it will be ported to other systems and other languages. Further, as it is in REXX, it is relatively simple for anyone literate in any computer language to open the code and modify it to meet their needs. The authors have employed Watts in their research addressing precision agriculture and urban heat islands.
Rotondi, Armando J; Eack, Shaun M; Hanusa, Barbara H; Spring, Michael B; Haas, Gretchen L
2015-03-01
E-health applications are becoming integral components of general medical care delivery models and are emerging for mental health care. Few exist for the treatment of those with severe mental illness (SMI), in part because of a lack of models for designing such technologies for persons with cognitive impairments and lower technology experience. This study evaluated the effectiveness of an e-health design model for persons with SMI termed the Flat Explicit Design Model (FEDM). Persons with schizophrenia (n = 38) performed tasks to evaluate the effectiveness of 5 Web site designs: 4 were prominent public Web sites, and 1 was designed according to the FEDM. Linear mixed-effects regression models were used to examine differences in usability between the Web sites. Omnibus tests of between-site differences were conducted, followed by post hoc pairwise comparisons of means to examine specific Web site differences when omnibus tests reached statistical significance. The Web site designed using the FEDM required less time to find information, had a higher success rate, and was rated easier to use and less frustrating than the other Web sites. The home page design of one of the other Web sites provided the best indication to users about a Web site's contents. The results are consistent with and were used to expand the FEDM. The FEDM provides evidence-based guidelines for designing e-health applications for persons with SMI, including: minimize an application's layers or hierarchy, use explicit text, employ navigational memory aids, group hyperlinks in one area, and minimize the number of disparate subjects an application addresses. © The Author 2013. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
40 CFR 35.929-2 - General requirements for all user charge systems.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 1 2014-07-01 2014-07-01 false General requirements for all user charge systems. 35.929-2 Section 35.929-2 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... than every 2 years the waste water contribution of users and user classes, the total costs of operation...
40 CFR 35.929-2 - General requirements for all user charge systems.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 1 2012-07-01 2012-07-01 false General requirements for all user charge systems. 35.929-2 Section 35.929-2 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... than every 2 years the waste water contribution of users and user classes, the total costs of operation...
40 CFR 35.929-2 - General requirements for all user charge systems.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 1 2013-07-01 2013-07-01 false General requirements for all user charge systems. 35.929-2 Section 35.929-2 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... than every 2 years the waste water contribution of users and user classes, the total costs of operation...
Evaluating and Improving Metadata for Data Use and Understanding
NASA Astrophysics Data System (ADS)
Habermann, T.
2013-12-01
The last several decades have seen an extraordinary increase in the number and breadth of environmental data available to the scientific community and the general public. These increases have focused the environmental data community on creating metadata for discovering data and on the creation and population of catalogs and portals for facilitating discovery. This focus is reflected in the fields required by commonly used metadata standards and has resulted in collections populated with metadata that meet, but don't go far beyond, minimal discovery requirements. Discovery is only the first step towards addressing scientific questions using data. As more data are discovered and accessed, users need metadata that 1) automates use and integration of these data in tools and 2) facilitates understanding of the data when they are compared to similar datasets or when internal variations are observed. When data discovery is the primary goal, it is important to create records for as many datasets as possible. The content of these records is controlled by minimum requirements, and evaluation is generally limited to testing for required fields and counting records. As use and understanding needs become more important, more comprehensive evaluation tools are needed. An approach is described for evaluating existing metadata in the light of these new requirements and for improving the metadata to meet them.
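The tiered evaluation the abstract argues for (minimal discovery fields first, then fields supporting use and understanding) can be sketched as a per-tier completeness score. The field names below are hypothetical examples, not drawn from any specific metadata standard:

```python
def score_metadata(record, tiers):
    """Fraction of fields present per tier for one metadata record,
    modeling evaluation that goes beyond required-field checks.
    """
    return {tier: sum(1 for f in fields if record.get(f)) / len(fields)
            for tier, fields in tiers.items()}

# Hypothetical tier definitions: discovery is the minimum; use and
# understanding capture the richer needs the text describes.
tiers = {
    "discovery": ["title", "abstract", "keywords"],
    "use": ["format", "access_url", "units"],
    "understanding": ["lineage", "quality_report", "method"],
}
record = {"title": "SST analysis", "abstract": "Daily SST fields.",
          "keywords": "ocean temperature", "format": "netCDF"}
scores = score_metadata(record, tiers)   # discovery complete, others not
```

A record like this one passes a discovery-only check yet scores poorly on the use and understanding tiers, which is exactly the gap the abstract describes.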
Performance of a new test strip for FreeStyle blood glucose monitoring systems.
Lock, John Paul; Brazg, Ronald; Bernstein, Robert M; Taylor, Elizabeth; Patel, Mona; Ward, Jeanne; Alva, Shridhara; Chen, Ting; Welsh, Zoë; Amor, Walter; Bhogal, Claire; Ng, Ronald
2011-01-01
A new strip, designed to enhance ease of use and minimize interference from non-glucose sugars, has been developed to replace the current FreeStyle (Abbott Diabetes Care, Alameda, CA) blood glucose test strip. We evaluated the performance of this new strip. Laboratory evaluation included precision, linearity, dynamic range, effects of operating temperature, humidity, altitude, hematocrit, interferents, and blood reapplication. System accuracy, lay user performance, and ease of use for finger capillary blood testing, and accuracy for venous blood testing, were evaluated at clinics. Lay users also compared the speed and ease of use between the new strip and the current FreeStyle strip. For glucose concentrations <75 mg/dL, 73%, 100%, and 100% of the individual capillary blood glucose results obtained by lay users fell within ±5, 10, and 15 mg/dL, respectively, of the reference. For glucose concentrations ≥75 mg/dL, 68%, 95%, 99%, and 99% of the lay user results fell within ±5%, 10%, 15%, and 20%, respectively, of the reference. Comparable accuracy was obtained in the venous blood study. Lay users found the new test strip easy to use, and faster and easier to use than the current FreeStyle strip. The new strip maintained accuracy under various challenging conditions, including high concentrations of various interferents, sample reapplication up to 60 s, and extremes of hematocrit, altitude, and operating temperature and humidity. Our results demonstrated excellent accuracy of the new FreeStyle test strip and validated the improvements in minimizing interference and enhancing ease of use.
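The accuracy figures quoted follow a standard pattern: readings are judged against an absolute limit below a glucose cutoff and a relative limit above it. A minimal sketch of that computation follows; the thresholds are illustrative, patterned on ISO 15197-style limits, not the study's exact protocol:

```python
def fraction_within(meter, reference, abs_limit=15.0, rel_limit=20.0,
                    cutoff=75.0):
    """Fraction of meter readings within +/-abs_limit mg/dL of the
    reference when reference < cutoff mg/dL, or within +/-rel_limit %
    of the reference otherwise. Thresholds are illustrative.
    """
    hits = 0
    for m, r in zip(meter, reference):
        if r < cutoff:
            hits += abs(m - r) <= abs_limit
        else:
            hits += abs(m - r) <= r * rel_limit / 100.0
    return hits / len(meter)

# Three paired readings: two meet the criterion, one misses.
acc = fraction_within([60.0, 100.0, 200.0], [50.0, 110.0, 150.0])
```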
Smart building temperature control using occupant feedback
NASA Astrophysics Data System (ADS)
Gupta, Santosh K.
This work was motivated by the problem of computing optimal, commonly agreeable thermal settings in spaces with multiple occupants. We propose algorithms that take into account each occupant's preferences along with the thermal correlations between different zones of a building, to arrive at optimal thermal settings for all zones in a coordinated manner. In the first part of this work we incorporate active occupant feedback to minimize aggregate user discomfort and total energy cost. User feedback is used to estimate the users' comfort ranges, taking into account possible inaccuracies in the feedback. The control algorithm takes the energy cost into account, trading it off optimally against aggregate user discomfort. A lumped heat-transfer model based on thermal resistance and capacitance is used to model a multi-zone building. We provide a stability analysis and establish convergence of the proposed solution to a desired temperature that minimizes the sum of energy cost and aggregate user discomfort. However, convergence to the optimum requires sufficient separation between the user feedback frequency and the dynamics of the system; otherwise, the user feedback provided does not correctly reflect the effect of the current control input on user discomfort. The algorithm is further extended using singular perturbation theory to determine the minimum time between successive user feedback solicitations. Under sufficient time-scale separation, we establish convergence of the proposed solution. Simulation studies and experimental runs on the Watervliet-based test facility demonstrate the performance of the algorithm. In the second part we develop a consensus algorithm for attaining a common temperature set point agreeable to all occupants of a zone in a typical multi-occupant space. The information on the comfort range functions is held privately by each occupant.
Using occupant-differentiated, dynamically adjusted prices as feedback signals, we propose a distributed solution that ensures a consensus is attained among all occupants upon convergence, irrespective of whether their temperature preferences are coherent or conflicting. Occupants are only assumed to be rational, in that they choose their own temperature set points so as to minimize their individual energy cost plus discomfort. We use the Alternating Direction Method of Multipliers (ADMM) to solve our consensus problem. We further establish the convergence of the proposed algorithm to the optimal thermal set-point values that minimize the sum of the energy cost and the aggregate discomfort of all occupants in a multi-zone building. For simulating our consensus algorithm we use realistic building parameters based on the Watervliet test facility. The simulation study based on real-world building parameters establishes the validity of our theoretical model and provides insights into the dynamics of the system with a mobile user population. In the third part we present a game-theoretic (auction) mechanism that requires occupants to "purchase" their individualized comfort levels beyond what is provided by default by the building operator. The comfort pricing policy, derived as an extension of Vickrey-Clarke-Groves (VCG) pricing, ensures incentive compatibility of the mechanism, i.e., an occupant acting in self-interest cannot benefit from declaring their comfort function untruthfully, irrespective of the choices made by other occupants. The declared (or estimated) occupant comfort ranges (functions) are then utilized by the building operator---along with the energy cost information---to set the environment controls to optimally balance the aggregate discomfort of the occupants and the energy cost of the building operator.
We use a realistic building model and parameters based on our test facility to demonstrate the convergence of the actual temperatures in different zones to the desired temperatures, and provide insight into the pricing structure necessary for truthful comfort feedback from the occupants. Finally, we present an end-to-end framework designed for enabling occupant feedback collection and incorporating the feedback data towards energy-efficient operation of a building. We have designed a mobile application that occupants can use on their smartphones to provide their thermal preference feedback. When relaying the occupant feedback to the central server, the mobile application also uses indoor localization techniques to tie the occupant preference to their current thermal zone. Texas Instruments SensorTags are used for real-time zonal temperature readings. The mobile application relays the occupant preference along with the location to a central server that also hosts our learning algorithm, which learns the environment and, using occupant feedback, calculates the optimal temperature set-point. The entire process is triggered upon change of occupancy, environmental conditions, and/or occupant preference. The learning algorithm is scheduled to run at regular intervals to respond dynamically to environmental and occupancy changes. We describe results from experimental studies in two different settings: a single-family residential home and a university-based laboratory space. (Abstract shortened by UMI.).
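The VCG-style pricing mentioned in this abstract can be sketched as a charge equal to the externality each occupant imposes on everyone else. This sketch again assumes quadratic declared discomfort a_i*(t - t_i*)^2 and quadratic energy cost; the functional forms and numbers are illustrative, not the thesis's exact pricing rule.

```python
# Illustrative VCG-style charges: each occupant pays the extra cost their
# presence imposes on the other occupants plus the operator's energy cost.
def optimal_t(a, t_star, b, t_out):
    # Closed-form minimizer of sum_i a_i*(t - t_i*)^2 + b*(t - t_out)^2.
    return (sum(ai * ti for ai, ti in zip(a, t_star)) + b * t_out) / (sum(a) + b)

def others_cost(t, a, t_star, b, t_out, exclude):
    # Energy cost plus discomfort of everyone except occupant `exclude`.
    cost = b * (t - t_out) ** 2
    for i, (ai, ti) in enumerate(zip(a, t_star)):
        if i != exclude:
            cost += ai * (t - ti) ** 2
    return cost

def vcg_charges(a, t_star, b, t_out):
    t_all = optimal_t(a, t_star, b, t_out)
    charges = []
    for i in range(len(a)):
        # Re-optimize as if occupant i were absent ...
        a_wo, ts_wo = a[:i] + a[i + 1:], t_star[:i] + t_star[i + 1:]
        t_wo = optimal_t(a_wo, ts_wo, b, t_out)
        # ... and charge i the cost increase they cause for the others.
        charges.append(others_cost(t_all, a, t_star, b, t_out, i)
                       - others_cost(t_wo, a, t_star, b, t_out, i))
    return charges

# Two occupants with symmetric, conflicting preferences around the chosen 22 C.
charges = vcg_charges(a=[1.0, 1.0], t_star=[20.0, 24.0], b=1.0, t_out=22.0)
```

Because t_wo minimizes the others' cost, each charge is nonnegative, and under this payment rule truthful declaration of the comfort function is the standard VCG dominant strategy.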
The FITS model office ergonomics program: a model for best practice.
Chim, Justine M Y
2014-01-01
An effective office ergonomics program can predict positive results in reducing musculoskeletal injury rates, enhancing productivity, and improving staff well-being and job satisfaction. Its objective is to provide a systematic solution to manage the potential risk of musculoskeletal disorders among computer users in an office setting. The FITS Model Office Ergonomics Program has been developed, drawing on the legislative requirements for promoting the health and safety of workers using computers for extended periods as well as on previous research findings. The Model is developed according to practical industrial knowledge in ergonomics, occupational health and safety management, and human resources management in Hong Kong and overseas. This paper proposes a comprehensive office ergonomics program, the FITS Model, which considers (1) Furniture Evaluation and Selection; (2) Individual Workstation Assessment; (3) Training and Education; and (4) Stretching Exercises and Rest Breaks as elements of an effective program. An experienced ergonomics practitioner should be included in the program design and implementation. Through the FITS Model Office Ergonomics Program, the risk of musculoskeletal disorders among computer users can be eliminated or minimized, and workplace health and safety and employees' wellness enhanced.
Autonomous Modelling of X-ray Spectra Using Robust Global Optimization Methods
NASA Astrophysics Data System (ADS)
Rogers, Adam; Safi-Harb, Samar; Fiege, Jason
2015-08-01
The standard approach to model fitting in X-ray astronomy is by means of local optimization methods. However, these local optimizers suffer from a number of problems, such as a tendency for the fit parameters to become trapped in local minima, and can require an involved process of detailed user intervention to guide them through the optimization process. In this work we introduce a general GUI-driven global optimization method for fitting models to X-ray data, written in MATLAB, which searches for optimal models with minimal user interaction. We directly interface with the commonly used XSPEC libraries to access the full complement of pre-existing spectral models that describe a wide range of physics appropriate for modelling astrophysical sources, including supernova remnants and compact objects. Our algorithm is powered by the Ferret genetic algorithm and Locust particle swarm optimizer from the Qubist Global Optimization Toolbox, which are robust at finding families of solutions and identifying degeneracies. This technique will be particularly instrumental for multi-parameter models and high-fidelity data. In this presentation, we provide details of the code and use our techniques to analyze X-ray data obtained from a variety of astrophysical sources.
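The global-optimization approach this abstract describes can be sketched generically: a bare-bones differential-evolution loop (standing in here for the Ferret and Locust optimizers, which are part of the proprietary Qubist toolbox) recovering power-law parameters from noiseless synthetic "counts". The model, energy grid, and hyperparameters are illustrative assumptions; there is no XSPEC interface in this sketch.

```python
import random

# Toy global fit: recover power-law parameters (norm A, index gamma) of
# f(E) = A * E**(-gamma) from synthetic data via differential evolution.
random.seed(0)

def model(E, A, gamma):
    return A * E ** (-gamma)

energies = [0.5 + 0.1 * k for k in range(50)]     # illustrative keV grid
true_A, true_gamma = 3.0, 1.7
data = [model(E, true_A, true_gamma) for E in energies]

def chi2(params):
    A, gamma = params
    return sum((model(E, A, gamma) - d) ** 2 for E, d in zip(energies, data))

def differential_evolution(obj, bounds, pop_size=30, F=0.7, CR=0.9, gens=200):
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutate three distinct other members, then crossover with member i.
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [pop[i][d] if random.random() > CR
                     else min(max(a[d] + F * (b[d] - c[d]), bounds[d][0]), bounds[d][1])
                     for d in range(dim)]
            if obj(trial) < obj(pop[i]):   # greedy selection
                pop[i] = trial
    return min(pop, key=obj)

best = differential_evolution(chi2, bounds=[(0.1, 10.0), (0.1, 5.0)])
```

Population-based searches like this avoid the local-minimum trapping described above because candidate solutions explore the full bounded parameter space rather than descending from a single user-supplied starting point.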
Open source integrated modeling environment Delta Shell
NASA Astrophysics Data System (ADS)
Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.
2012-04-01
In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to model using a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remains a challenging task. The integrated modelling environment Delta Shell simplifies this task. The software components of Delta Shell are easy to reuse, both separately from each other and as part of an integrated environment that can run in a command-line or a graphical user interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end-user and developer perspectives. The first example shows coupling of a rainfall-runoff, a river flow and a run-time control model. The second example shows how a coastal morphological database integrates with a coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.
A Human Reliability Based Usability Evaluation Method for Safety-Critical Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillippe Palanque; Regina Bernhaupt; Ronald Boring
2006-04-01
Recent years have seen an increasing use of sophisticated interaction techniques, including in the field of safety-critical interactive software [8]. The use of such techniques has been required in order to increase the bandwidth between users and systems and thus to help them deal efficiently with increasingly complex systems. These techniques come from research and innovation done in the field of human-computer interaction (HCI). A significant effort is currently being undertaken by the HCI community in order to apply and extend current usability evaluation techniques to these new kinds of interaction techniques. However, very little has been done to improve the reliability of software offering these kinds of interaction techniques. Even testing basic graphical user interfaces remains a challenge that has rarely been addressed in the field of software engineering [9]. However, the non-reliability of interactive software can jeopardize usability evaluation by showing unexpected or undesired behaviors. The aim of this SIG is to provide a forum for both researchers and practitioners interested in testing interactive software. Our goal is to define a roadmap of activities to cross-fertilize usability and reliability testing of these kinds of systems to minimize duplicate efforts in both communities.
Interactive-rate Motion Planning for Concentric Tube Robots
Torres, Luis G.; Baykal, Cenk; Alterovitz, Ron
2014-01-01
Concentric tube robots may enable new, safer minimally invasive surgical procedures by moving along curved paths to reach difficult-to-reach sites in a patient’s anatomy. Operating these devices is challenging due to their complex, unintuitive kinematics and the need to avoid sensitive structures in the anatomy. In this paper, we present a motion planning method that computes collision-free motion plans for concentric tube robots at interactive rates. Our method’s high speed enables a user to continuously and freely move the robot’s tip while the motion planner ensures that the robot’s shaft does not collide with any anatomical obstacles. Our approach uses a highly accurate mechanical model of tube interactions, which is important since small movements of the tip position may require large changes in the shape of the device’s shaft. Our motion planner achieves its high speed and accuracy by combining offline precomputation of a collision-free roadmap with online position control. We demonstrate our interactive planner in a simulated neurosurgical scenario where a user guides the robot’s tip through the environment while the robot automatically avoids collisions with the anatomical obstacles. PMID:25436176
Viger, R.J.
2008-01-01
The GIS Weasel is a freely available, open-source software package built on top of ArcInfo Workstation [ESRI, Inc., 2001, ArcInfo Workstation (8.1 ed.), Redlands, CA] for creating maps and parameters of geographic features used in environmental simulation models. The software has been designed to minimize the need for GIS expertise and automate the preparation of the geographic information as much as possible. Although many kinds of data can be exploited with the GIS Weasel, the only information required is a raster dataset of elevation for the user's area of interest (AOI). The user-defined AOI serves as a starting point from which to create maps of many different types of geographic features, including sub-watersheds, streams, elevation bands, land cover patches, land parcels, or anything else that can be discerned from the available data. The GIS Weasel has a library of over 200 routines that can be applied to any raster map of geographic features to generate information about shape, area, or topological association with other features of the same or different maps. In addition, a wide variety of parameters can be derived using ancillary data layers such as soil and vegetation maps.
Enriching Triangle Mesh Animations with Physically Based Simulation.
Li, Yijing; Xu, Hongyi; Barbic, Jernej
2017-10-01
We present a system to combine arbitrary triangle mesh animations with physically based Finite Element Method (FEM) simulation, enabling control over the combination both in space and time. The input is a triangle mesh animation obtained using any method, such as keyframed animation, character rigging, 3D scanning, or geometric shape modeling. The input may be non-physical, crude or even incomplete. The user provides weights, specified using a minimal user interface, for how much physically based simulation should be allowed to modify the animation in any region of the model, and in time. Our system then computes a physically-based animation that is constrained to the input animation to the amount prescribed by these weights. This permits smoothly turning physics on and off over space and time, making it possible for the output to strictly follow the input, to evolve purely based on physically based simulation, and anything in between. Achieving such results requires a careful combination of several system components. We propose and analyze these components, including proper automatic creation of simulation meshes (even for non-manifold and self-colliding undeformed triangle meshes), converting triangle mesh animations into animations of the simulation mesh, and resolving collisions and self-collisions while following the input.
PredGuid+A: Orion Entry Guidance Modified for Aerocapture
NASA Technical Reports Server (NTRS)
Lafleur, Jarret
2013-01-01
PredGuid+A software was developed to enable a unique numerical predictor-corrector aerocapture guidance capability that builds on heritage Orion entry guidance algorithms. The software can be used for both planetary entry and aerocapture applications. Furthermore, PredGuid+A implements a new Delta-V minimization guidance option that can take the place of traditional targeting guidance and can result in substantial propellant savings. PredGuid+A allows the user to set a mode flag and input a target orbit's apoapsis and periapsis. Using bank angle control, the guidance will then guide the vehicle to the appropriate post-aerocapture orbit using one of two algorithms: Apoapsis Targeting or Delta-V Minimization (as chosen by the user). Recently, the PredGuid guidance algorithm was adapted for use in skip-entry scenarios for NASA's Orion multi-purpose crew vehicle (MPCV). To leverage flight heritage, most of Orion's entry guidance routines are adapted from the Apollo program.
Interactive information retrieval systems with minimalist representation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Domeshek, E.; Kedar, S.; Gordon, A.
Almost any information you might want is becoming available on-line. The problem is how to find what you need. One strategy to improve access to existing information sources is intelligent information agents - an approach based on extensive representation and inference. Another alternative is to simply concentrate on better information organization and indexing. Our systems use a form of conceptual indexing sensitive to users' task-specific information needs. We aim for minimalist representation, coding only select aspects of stored items. Rather than supporting reliable automated inference, the primary purpose of our representations is to provide sufficient discrimination and guidance to a user for a given domain and task. This paper argues, using case studies, that minimal representations can make strong contributions to the usefulness and usability of interactive information systems, while minimizing knowledge engineering effort. We demonstrate this approach in several broad-spectrum applications including video retrieval and advisory systems.
Hübner, U; Klein, F; Hofstetter, J; Kammeyer, G; Seete, H
2000-01-01
Web-based drug ordering allows a growing number of hospitals without pharmacy to communicate seamlessly with their external pharmacy. Business process analysis and object oriented modelling performed together with the users at a pilot hospital resulted in a comprehensive picture of the user and business requirements for electronic drug ordering. The user requirements were further validated with the help of a software prototype. In order to capture the needs of a large number of users CAP10, a new method making use of pre-built models, is proposed. Solutions for coping with the technical requirements (interfacing the business software at the pharmacy) and with the legal requirements (signing the orders) are presented.
Implementation of Emergency Medical Text Classifier for Syndromic Surveillance
Travers, Debbie; Haas, Stephanie W.; Waller, Anna E.; Schwartz, Todd A.; Mostafa, Javed; Best, Nakia C.; Crouch, John
2013-01-01
Public health officials use syndromic surveillance systems to facilitate early detection and response to infectious disease outbreaks. Emergency department clinical notes are becoming more available for surveillance but present the challenge of accurately extracting concepts from these text data. The purpose of this study was to implement a new system, Emergency Medical Text Classifier (EMT-C), into daily production for syndromic surveillance and evaluate system performance and user satisfaction. The system was designed to meet user preferences for a syndromic classifier that maximized positive predictive value and minimized false positives in order to provide a manageable workload. EMT-C performed better than the baseline system on all metrics and users were slightly more satisfied with it. It is vital to obtain user input and test new systems in the production environment. PMID:24551413
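The design goal described here, a classifier tuned to maximize positive predictive value so the flagged workload stays manageable, can be sketched as a score-threshold search. The scores, labels, and the minimum-recall constraint below are synthetic illustrations, not EMT-C's actual decision rule.

```python
# Pick the score threshold that maximizes PPV (precision) while keeping
# recall above a floor, so surveillance staff are not flooded with false positives.
def best_threshold(scores, labels, min_recall=0.5):
    positives = sum(labels)
    best = None
    for thr in sorted(set(scores)):
        flagged = [s >= thr for s in scores]
        tp = sum(1 for f, y in zip(flagged, labels) if f and y)
        fp = sum(1 for f, y in zip(flagged, labels) if f and not y)
        if tp == 0:
            continue
        ppv, recall = tp / (tp + fp), tp / positives
        if recall >= min_recall and (best is None or ppv > best[1]):
            best = (thr, ppv, recall)
    return best

# Synthetic classifier scores with ground-truth labels (1 = true syndromic case).
scores = [0.1, 0.2, 0.35, 0.4, 0.6, 0.7, 0.8, 0.9]
labels = [0,   0,   1,    0,   1,   0,   1,   1]
thr, ppv, recall = best_threshold(scores, labels, min_recall=0.5)
```

Raising the threshold trades recall for precision; the recall floor makes that trade-off explicit rather than letting PPV maximization silently discard most true cases.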
How to Create, Modify, and Interface Aspen In-House and User Databanks for System Configuration 1:
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camp, D W
2000-10-27
The goal of this document is to provide detailed instructions to create, modify, interface, and test Aspen User and In-House databanks with minimal frustration. The level of instruction is aimed at a novice Aspen Plus simulation user who is neither a programming nor a computer-system expert. The instructions are tailored to Version 10.1 of Aspen Plus and the specific computing configuration summarized in the title of this document and detailed in Section 2. Many details of setting up databanks depend on the specifics of the computing environment, such as the machines, operating systems, command languages, directory structures, inter-computer communications software, the version of the Aspen Engine and Graphical User Interface (GUI), and the directory structure of how these were installed.
Implementation of Emergency Medical Text Classifier for syndromic surveillance.
Travers, Debbie; Haas, Stephanie W; Waller, Anna E; Schwartz, Todd A; Mostafa, Javed; Best, Nakia C; Crouch, John
2013-01-01
Public health officials use syndromic surveillance systems to facilitate early detection and response to infectious disease outbreaks. Emergency department clinical notes are becoming more available for surveillance but present the challenge of accurately extracting concepts from these text data. The purpose of this study was to implement a new system, Emergency Medical Text Classifier (EMT-C), into daily production for syndromic surveillance and evaluate system performance and user satisfaction. The system was designed to meet user preferences for a syndromic classifier that maximized positive predictive value and minimized false positives in order to provide a manageable workload. EMT-C performed better than the baseline system on all metrics and users were slightly more satisfied with it. It is vital to obtain user input and test new systems in the production environment.
Selecting a Control Strategy for Plug and Process Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lobato, C.; Sheppy, M.; Brackney, L.
2012-09-01
Plug and Process Loads (PPLs) are building loads that are not related to general lighting, heating, ventilation, cooling, and water heating, and typically do not provide comfort to the building occupants. PPLs in commercial buildings account for almost 5% of U.S. primary energy consumption. On an individual building level, they account for approximately 25% of the total electrical load in a minimally code-compliant commercial building, and can exceed 50% in an ultra-high-efficiency building such as the National Renewable Energy Laboratory's (NREL) Research Support Facility (RSF) (Lobato et al. 2010). Minimizing these loads is a primary challenge in the design and operation of an energy-efficient building. A complex array of technologies that measure and manage PPLs has emerged in the marketplace. Some fall short of manufacturer performance claims, however. NREL has been actively engaged in developing an evaluation and selection process for PPL controls, and is using this process to evaluate a range of technologies for active PPL management that will cap RSF plug loads. Using a control strategy to match plug load use to users' required job functions is a huge untapped potential for energy savings.
Augmented reality in the surgery of cerebral aneurysms: a technical report.
Cabrilo, Ivan; Bijlenga, Philippe; Schaller, Karl
2014-06-01
Augmented reality is the overlay of computer-generated images on real-world structures. It has previously been used for image guidance during surgical procedures, but it has never been used in the surgery of cerebral aneurysms. To report our experience of cerebral aneurysm surgery aided by augmented reality. Twenty-eight patients with 39 unruptured aneurysms were operated on in a prospective manner with augmented reality. Preoperative 3-dimensional image data sets (angio-magnetic resonance imaging, angio-computed tomography, and 3-dimensional digital subtraction angiography) were used to create virtual segmentations of patients' vessels, aneurysms, aneurysm necks, skulls, and heads. These images were injected intraoperatively into the eyepiece of the operating microscope. An example case of an unruptured posterior communicating artery aneurysm clipping is illustrated in a video. The described operating procedure allowed continuous monitoring of the accuracy of patient registration with neuronavigation data and assisted in the performance of tailored surgical approaches and optimal clipping with minimized exposure. Augmented reality may add to the performance of a minimally invasive approach, although further studies need to be performed to evaluate whether certain groups of aneurysms are more likely to benefit from it. Further technological development is required to improve its user-friendliness.
AUCTION MECHANISMS FOR IMPLEMENTING TRADABLE NETWORK PERMIT MARKETS
NASA Astrophysics Data System (ADS)
Wada, Kentaro; Akamatsu, Takashi
This paper proposes a new auction mechanism for implementing tradable network permit markets. Assuming that each user makes a trip from an origin to a destination along a path in a specific time period, we design an auction mechanism that enables each user to purchase a bundle of permits corresponding to a set of links in the user's preferred path. The objective of the proposed mechanism is to achieve a socially optimal state with minimal revelation of users' private information. In order to achieve this, the mechanism employs an evolutionary approach that has an auction phase and a path capacity adjustment phase, which are repeated on a day-to-day basis. We prove that the proposed mechanism has the following desirable properties: (1) truthful bidding is the dominant strategy for each user, and (2) the proposed mechanism converges to an approximate socially optimal state in the sense that the achieved value of the social surplus reaches its maximum value when the number of users is large.
Cloud Computing for the Grid: GridControl: A Software Platform to Support the Smart Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
GENI Project: Cornell University is creating a new software platform for grid operators called GridControl that will utilize cloud computing to more efficiently control the grid. In a cloud computing system, there are minimal hardware and software demands on users. The user can tap into a network of computers that is housed elsewhere (the cloud) and the network runs computer applications for the user. The user only needs interface software to access all of the cloud's data resources, which can be as simple as a web browser. Cloud computing can reduce costs, facilitate innovation through sharing, empower users, and improve the overall reliability of a dispersed system. Cornell's GridControl will focus on 4 elements: delivering the state of the grid to users quickly and reliably; building networked, scalable grid-control software; tailoring services to emerging smart grid uses; and simulating smart grid behavior under various conditions.
Nasr, Nasrin; Leon, Beatriz; Mountain, Gail; Nijenhuis, Sharon M; Prange, Gerdienke; Sale, Patrizio; Amirabdollahian, Farshid
2016-11-01
We drew on an interdisciplinary research design to examine stroke survivors' experiences of living with stroke and with technology in order to provide technology developers with insight into values, thoughts and feelings of the potential users of a to-be-designed robotic technology for home-based rehabilitation of the hand and wrist. Ten stroke survivors and their family carers were purposefully selected. On the first home visit, they were introduced to a cultural probe. On the second visit, the contents of the probe packs were used as prompts to conduct one-to-one interviews with them. The data generated were analysed using thematic analysis. A third home visit was conducted to evaluate the early prototype. User requirements were categorised into their network of relationships, their attitude towards technology, their skills, and their goals and motivations. The user requirements were used to envision the requirements of the system, including providing feedback on performance, motivational aspects and usability of the system. Participants' views on the system requirements were obtained during a participatory evaluation. This study showed that prior to the development of technology, it is important to engage with potential users to identify user requirements and subsequently envision system requirements based on users' views. Implications for Rehabilitation An understanding of how stroke survivors make sense of their experiences of living with stroke is needed to design home-based rehabilitation technologies. Linking stroke survivors' goals, motivations, behaviour, feelings and attitude to user requirements prior to technology development has a significant impact on improving the design.
Data System Implications Derived from User Application Requirements for Satellite Data
NASA Technical Reports Server (NTRS)
Neiers, J.
1979-01-01
An investigation of the data system needs as driven by users of space-acquired Earth observation data is documented. Two major categories of users, operational and research, are identified. Limiting data acquisition alleviates some of the delays in processing, thus improving the timeliness of the delivered product. Trade-offs occur between timeliness and data distribution costs, and between data storage and reprocessing. The complexity of the data system requirements to apply space data to users' needs is such that no single analysis suffices to design and implement the optimum system. A series of iterations is required, with analyses of the salient problems in a general way, followed by a limited implementation of benefit to some users with a continual upgrade in system capacity, functions, and applications served. The resulting most important requirement for the data system is flexibility to accommodate changing requirements as the system is implemented.
Wipfli, Rolf; Teodoro, Douglas; Sarrey, Everlyne; Walesa, Magali; Lovis, Christian
2013-01-01
Background Working in a clinical environment requires unfettered mobility. This is especially true for nurses who are always on the move providing patients’ care in different locations. Since the introduction of clinical information systems in hospitals, this mobility has often been considered hampered by interactions with computers. The popularity of personal mobile assistants such as smartphones makes it possible to gain easy access to clinical data anywhere. Objective To identify the challenges involved in the deployment of clinical applications on handheld devices and to share our solutions to these problems. Methods A team of experts underwent an iterative development process of a mobile application prototype that aimed to improve the mobility of nurses during their daily clinical activities. Through the process, challenges inherent to mobile platforms have emerged. These issues have been classified, focusing on factors related to ensuring information safety and quality, as well as pleasant and efficient user experiences. Results The team identified five main challenges related to the deployment of clinical mobile applications and presents solutions to overcome each of them: (1) Financial: Equipping every caregiver with a new mobile device requires substantial investment that can be lowered if users use their personal device instead, (2) Hardware: The constraints inherent to the clinical environment made us choose the mobile device with the best tradeoff between size and portability, (3) Communication: the connection of the mobile application with any existing clinical information systems (CIS) is ensured by a bridge formatting the information appropriately, (4) Security: In order to guarantee the confidentiality and safety of the data, the amount of data stored on the device is minimized, and (5) User interface: The design of our user interface relied on homogeneity, hierarchy, and indexicality principles to prevent an increase in data acquisition errors. 
Conclusions The introduction of nomadic computing often raises enthusiastic reactions from users, but several challenges due to specific constraints of mobile platforms must be overcome. The ease of development of mobile applications and their rapid spread should not overshadow the real challenges of clinical applications and the potential threats for patient safety and the liability of people and organizations using them. For example, careful attention must be given to the overall architecture of the system and to user interfaces. If these precautions are not taken, it can easily lead to unexpected failures such as an increased number of input errors, loss of data, or decreased efficiency. PMID:25100680
Ehrler, Frederic; Wipfli, Rolf; Teodoro, Douglas; Sarrey, Everlyne; Walesa, Magali; Lovis, Christian
2013-06-12
Working in a clinical environment requires unfettered mobility. This is especially true for nurses who are always on the move providing patients' care in different locations. Since the introduction of clinical information systems in hospitals, this mobility has often been considered hampered by interactions with computers. The popularity of personal mobile assistants such as smartphones makes it possible to gain easy access to clinical data anywhere. To identify the challenges involved in the deployment of clinical applications on handheld devices and to share our solutions to these problems. A team of experts underwent an iterative development process of a mobile application prototype that aimed to improve the mobility of nurses during their daily clinical activities. Through the process, challenges inherent to mobile platforms have emerged. These issues have been classified, focusing on factors related to ensuring information safety and quality, as well as pleasant and efficient user experiences. The team identified five main challenges related to the deployment of clinical mobile applications and presents solutions to overcome each of them: (1) Financial: Equipping every caregiver with a new mobile device requires substantial investment that can be lowered if users use their personal device instead, (2) Hardware: The constraints inherent to the clinical environment made us choose the mobile device with the best tradeoff between size and portability, (3) Communication: the connection of the mobile application with any existing clinical information systems (CIS) is ensured by a bridge formatting the information appropriately, (4) Security: In order to guarantee the confidentiality and safety of the data, the amount of data stored on the device is minimized, and (5) User interface: The design of our user interface relied on homogeneity, hierarchy, and indexicality principles to prevent an increase in data acquisition errors. 
The introduction of nomadic computing often raises enthusiastic reactions from users, but several challenges due to specific constraints of mobile platforms must be overcome. The ease of development of mobile applications and their rapid spread should not overshadow the real challenges of clinical applications and the potential threats for patient safety and the liability of people and organizations using them. For example, careful attention must be given to the overall architecture of the system and to user interfaces. If these precautions are not taken, it can easily lead to unexpected failures such as an increased number of input errors, loss of data, or decreased efficiency.
Autonomous Flight Rules Concept: User Implementation Costs and Strategies
NASA Technical Reports Server (NTRS)
Cotton, William B.; Hilb, Robert
2014-01-01
The costs to implement Autonomous Flight Rules (AFR) were examined for estimates in acquisition, installation, training and operations. The user categories were airlines, fractional operators, general aviation and unmanned aircraft systems. Transition strategies to minimize costs while maximizing operational benefits were also analyzed. The primary cost category was found to be the avionics acquisition. Cost ranges for AFR equipment were given to reflect the uncertainty of the certification level for the equipment and the extent of existing compatible avionics in the aircraft to be modified.
NECAP 4.1: NASA's energy-cost analysis program user's manual
NASA Technical Reports Server (NTRS)
Jensen, R. N.; Henninger, R. H.; Miner, D. L.
1983-01-01
NASA's Energy-Cost Analysis Program (NECAP) is a powerful computerized method to determine and to minimize building energy consumption. The program calculates hourly heat gains or losses, taking into account the building's thermal resistance and mass, using hourly weather data and a "response factor" method. Internal temperatures are allowed to vary in accordance with thermostat settings and equipment capacity. A simplified input procedure and numerous other technical improvements are presented. This user's manual describes the program and provides examples.
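The "response factor" method mentioned above computes each hour's heat gain as a weighted sum (a convolution) of current and past temperature differences. A minimal illustrative sketch follows; the function name, response-factor coefficients, and temperatures are all hypothetical, not NECAP's actual values:

```python
# Illustrative sketch of a "response factor" load calculation: the hourly
# heat gain through a wall is modeled as a weighted sum of current and past
# indoor/outdoor temperature differences. Coefficients are hypothetical.

def hourly_heat_gain(outdoor_temps, indoor_temp, response_factors):
    """Return hourly heat gain (W) for each hour with enough history."""
    n = len(response_factors)
    gains = []
    for hour in range(n - 1, len(outdoor_temps)):
        gain = sum(
            rf * (outdoor_temps[hour - k] - indoor_temp)
            for k, rf in enumerate(response_factors)
        )
        gains.append(gain)
    return gains

# Hypothetical wall: strong immediate response, decaying thermal memory.
factors = [10.0, 6.0, 3.0, 1.0]        # W/K per hour of temperature history
outdoor = [30, 32, 35, 33, 31, 28]     # hourly outdoor temperatures (deg C)
print(hourly_heat_gain(outdoor, indoor_temp=24, response_factors=factors))
```

The decaying coefficients capture the building's thermal mass: an outdoor temperature spike several hours ago still contributes to the current load, just with a smaller weight.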
1984-09-01
subsystems including labor reporting, Prime BEEF (Base Engineer Emergency Forces) composition, work order control, material control, and cost accounting ... AFRCE (United Kingdom), Ruislip AB UK, Ramstein AB ... Air Force personnel and minimize the information burden on users, providers, and handlers, thereby reducing costs, labor intensiveness, and time
Halon 1301 management planning guidance
NASA Astrophysics Data System (ADS)
1995-05-01
This ETL provides guidance to help the Base Civil Engineer (BCE) and other users manage inventories of Halon 1301, an ozone-depleting substance used in many facility fire protection systems. This guidance will allow Halon 1301 users to develop the transition plans necessary to implement the DOD and Air Force policies on ozone-depleting substances. Attachment 3 to this ETL contains detailed instructions on how to develop a Base Halon 1301 Management Plan and comply with Air Force policies and regulations designed to minimize dependency on Halon 1301.
A systematic review: the influence of real time feedback on wheelchair propulsion biomechanics.
Symonds, Andrew; Barbareschi, Giulia; Taylor, Stephen; Holloway, Catherine
2018-01-01
Clinical guidelines recommend that, in order to minimize upper limb injury risk, wheelchair users adopt a semi-circular pattern with a slow cadence and a large push arc. To examine whether real time feedback can be used to influence manual wheelchair propulsion biomechanics. Clinical trials and case series comparing the use of real time feedback against no feedback were included. A general review was performed and methodological quality assessed by two independent practitioners using the Downs and Black checklist. The review was completed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Six papers met the inclusion criteria. Selected studies involved 123 participants and analysed the effect of visual and, in one case, haptic feedback. Across the studies it was shown that participants were able to achieve significant changes in propulsion biomechanics when provided with real time feedback. However, the effect of targeting a single propulsion variable might lead to unwanted alterations in other parameters. Methodological assessment identified weaknesses in external validity. Visual feedback could be used to consistently increase push arc and decrease push rate, and may be the best focus for feedback training. Further investigation is required to assess such intervention during outdoor propulsion. Implications for Rehabilitation Upper limb pain and injuries are common secondary disorders that negatively affect wheelchair users' physical activity and quality of life. Clinical guidelines suggest that manual wheelchair users should aim to propel with a semi-circular pattern, a low push rate, and a large push arc in order to minimise upper limb loading. Real time visual and haptic feedback are effective tools for improving propulsion biomechanics in both complete novices and experienced manual wheelchair users.
Mooney, Karen; McElnay, James C; Donnelly, Ryan F
2015-08-01
Microneedle (MN) arrays could offer an alternative method to traditional drug delivery and blood sampling methods. However, acceptance among key end-users is critical for new technologies to succeed. MNs have been advocated for use in children, and so paediatricians are key potential end-users. However, the opinions of paediatricians on MN use have been previously unexplored. The aim of this study was to investigate the views of UK paediatricians on the use of MN technology within neonatal and paediatric care. An online survey was developed and distributed among UK paediatricians to gain their opinions of MN technology and its use in the neonatal and paediatric care settings, particularly for MN-mediated monitoring. A total of 145 responses were obtained, with a completion response rate of 13.7%. Respondents believed an alternative monitoring technique to blood sampling in children was required. Furthermore, 83% of paediatricians believed there was a particular need in premature neonates. Overall, this potential end-user group approved of the MN technology and an MN-mediated monitoring approach. Minimal pain and the perceived ease of use were important elements in gaining favour. Concerns included the need for confirmation of correct application and the potential for skin irritation. The findings of this study provide an initial indication of MN acceptability among a key potential end-user group. Furthermore, the concerns identified present a challenge to those working within the MN field to provide solutions to further improve this technology. The work strengthens the rationale behind MN technology and facilitates the translation of MN technology from lab bench into the clinical setting.
Sensor fusion and computer vision for context-aware control of a multi degree-of-freedom prosthesis
NASA Astrophysics Data System (ADS)
Markovic, Marko; Dosen, Strahinja; Popovic, Dejan; Graimann, Bernhard; Farina, Dario
2015-12-01
Objective. Myoelectric activity volitionally generated by the user is often used for controlling hand prostheses in order to replicate the synergistic actions of muscles in healthy humans during grasping. Muscle synergies in healthy humans are based on the integration of visual perception, heuristics and proprioception. Here, we demonstrate how sensor fusion that combines artificial vision and proprioceptive information with the high-level processing characteristics of biological systems can be effectively used in transradial prosthesis control. Approach. We developed a novel context- and user-aware prosthesis (CASP) controller integrating computer vision and inertial sensing with myoelectric activity in order to achieve semi-autonomous and reactive control of a prosthetic hand. The presented method semi-automatically provides simultaneous and proportional control of multiple degrees-of-freedom (DOFs), thus decreasing overall physical effort while retaining full user control. The system was compared against a major commercial state-of-the-art myoelectric control system in ten able-bodied subjects and one amputee subject. All subjects used a transradial prosthesis with an active wrist to grasp objects typically associated with activities of daily living. Main results. The CASP significantly outperformed the myoelectric interface when controlling all of the prosthesis DOFs. However, when tested with a less complex prosthetic system (a smaller number of DOFs), the CASP was slower but produced reaching motions that contained fewer compensatory movements. Another important finding is that the CASP system required minimal user adaptation and training. Significance. The CASP constitutes a substantial improvement for the control of multi-DOF prostheses. The application of the CASP will have a significant impact when translated to real-life scenarios, particularly with respect to improving the usability and acceptance of highly complex systems (e.g., full prosthetic arms) by amputees.
Long Term Use of Aspirin and the Risk of Gastrointestinal Bleeding
Huang, Edward S.; Strate, Lisa L.; Ho, Wendy W.; Lee, Salina S.; Chan, Andrew T.
2011-01-01
Background In short-term trials, aspirin is associated with gastrointestinal bleeding. However, the effect of dose and duration of aspirin use on risk remains unclear. Methods We conducted a prospective study of 87,680 women enrolled in the Nurses' Health Study in 1990 who provided biennial data on aspirin use. We examined the relative risk (RR) of major gastrointestinal bleeding requiring hospitalization or blood transfusion. Results Over a 24-year follow-up, 1537 women reported major gastrointestinal bleeding. Among women who used aspirin regularly (≥2 standard [325-mg] tablets/week), the multivariate RR of gastrointestinal bleeding was 1.43 (95% confidence interval [CI], 1.29-1.59) compared with non-regular users. Compared with women who denied any aspirin use, the multivariate RRs of gastrointestinal bleeding were 1.03 (95% CI, 0.85-1.24) for women who used 0.5 to 1.5 standard aspirin tablets/week, 1.30 (95% CI, 1.07-1.58) for 2 to 5 tablets/week, 1.77 (95% CI, 1.44-2.18) for 6-14 tablets/week, and 2.24 (95% CI, 1.66-3.03) for >14 tablets/week (Ptrend<.001). Similar dose-response relationships were observed among short-term users (≤5 years; Ptrend<.001) and long-term users (>5 years; Ptrend<.001). In contrast, after adjustments were made for dose, increasing duration of use did not confer greater risk of bleeding (Ptrend=0.28). Conclusions Regular aspirin use is associated with gastrointestinal bleeding. Risk appears more strongly related to dose than duration of aspirin use. Efforts to minimize adverse effects of aspirin therapy should emphasize using the lowest effective dose among both short-term and long-term users. PMID:21531232
Sensor fusion and computer vision for context-aware control of a multi degree-of-freedom prosthesis.
Markovic, Marko; Dosen, Strahinja; Popovic, Dejan; Graimann, Bernhard; Farina, Dario
2015-12-01
Myoelectric activity volitionally generated by the user is often used for controlling hand prostheses in order to replicate the synergistic actions of muscles in healthy humans during grasping. Muscle synergies in healthy humans are based on the integration of visual perception, heuristics and proprioception. Here, we demonstrate how sensor fusion that combines artificial vision and proprioceptive information with the high-level processing characteristics of biological systems can be effectively used in transradial prosthesis control. We developed a novel context- and user-aware prosthesis (CASP) controller integrating computer vision and inertial sensing with myoelectric activity in order to achieve semi-autonomous and reactive control of a prosthetic hand. The presented method semi-automatically provides simultaneous and proportional control of multiple degrees-of-freedom (DOFs), thus decreasing overall physical effort while retaining full user control. The system was compared against a major commercial state-of-the-art myoelectric control system in ten able-bodied subjects and one amputee subject. All subjects used a transradial prosthesis with an active wrist to grasp objects typically associated with activities of daily living. The CASP significantly outperformed the myoelectric interface when controlling all of the prosthesis DOFs. However, when tested with a less complex prosthetic system (a smaller number of DOFs), the CASP was slower but produced reaching motions that contained fewer compensatory movements. Another important finding is that the CASP system required minimal user adaptation and training. The CASP constitutes a substantial improvement for the control of multi-DOF prostheses. The application of the CASP will have a significant impact when translated to real-life scenarios, particularly with respect to improving the usability and acceptance of highly complex systems (e.g., full prosthetic arms) by amputees.
Flexible and Transparent User Authentication for Mobile Devices
NASA Astrophysics Data System (ADS)
Clarke, Nathan; Karatzouni, Sevasti; Furnell, Steven
The mobile device has become a ubiquitous technology that is capable of supporting an increasingly large array of services, applications and information. Given their increasing importance, it is imperative to ensure that such devices are not misused or abused. Unfortunately, a key enabling control to prevent this, user authentication, has not kept up with the advances in device technology. This paper presents the outcomes of a two-year study that proposes the use of transparent and continuous biometric authentication of the user: providing more comprehensive identity verification; minimizing user inconvenience; and providing security throughout the period of use. A Non-Intrusive and Continuous Authentication (NICA) system is described that maintains a continuous measure of confidence in the identity of the user, removing access to sensitive services and information at low confidence levels and providing automatic access at higher confidence levels. An evaluation of the framework is undertaken from an end-user perspective via a trial involving 27 participants. Whilst the findings raise concerns over education, privacy and intrusiveness, overall 92% of users felt the system offered a more secure environment when compared to existing forms of authentication.
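The core mechanism described above, a continuously decaying confidence score that is refreshed by transparent biometric samples and gates access by service sensitivity, can be sketched as follows. This is an illustration of the idea only, not the NICA implementation; the decay rate, blend weights, and thresholds are hypothetical:

```python
# Sketch of confidence-based continuous authentication: confidence in the
# user's identity decays during idle periods, is refreshed by transparent
# biometric samples, and gates access by service sensitivity. All numeric
# parameters are hypothetical, not NICA's actual values.

class ContinuousAuthenticator:
    DECAY_PER_MINUTE = 0.05          # confidence lost per idle minute

    def __init__(self):
        self.confidence = 1.0        # fully authenticated (e.g. after a PIN)

    def tick(self, idle_minutes):
        """Decay confidence while no biometric sample is captured."""
        self.confidence = max(0.0, self.confidence - self.DECAY_PER_MINUTE * idle_minutes)

    def biometric_sample(self, match_score):
        """Blend in a new transparent sample (match_score in [0, 1])."""
        self.confidence = 0.7 * self.confidence + 0.3 * match_score

    def allowed(self, service_sensitivity):
        """Grant access when confidence meets the service's threshold."""
        thresholds = {"low": 0.2, "medium": 0.5, "high": 0.8}
        return self.confidence >= thresholds[service_sensitivity]

auth = ContinuousAuthenticator()
auth.tick(idle_minutes=8)                 # confidence decays toward 0.6
print(auth.allowed("high"), auth.allowed("low"))
auth.biometric_sample(match_score=0.95)   # strong sample restores confidence
print(auth.allowed("medium"))
```

The point of the design is that a lapse in confidence degrades access gracefully (sensitive services first) rather than forcing an intrusive re-authentication for every use.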
NASA Astrophysics Data System (ADS)
Koo, Cheol Hea; Lee, Hoon Hee; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok
2013-08-01
In aerospace research and practical development, the use of simulation in software development, component design, and system operation has been steadily increasing, and the pace of this increase is accelerating. This trend stems from the ease of handling simulations and the power of their output. Simulation brings many benefits owing to several of its characteristics: it is easy to handle (it is never broken or damaged by mistake); it never wears out (it never gets old); and it is cost effective (once built, it can be distributed to 100 to 1000 people). GenSim (Generic Simulator), which is being developed by KARI and is compatible with the ESA SMP standard, provides such a simulation platform to support flight software validation and mission operation verification. The user interface of GenSim is shown in Figure 1 [1,2]. As Figure 1 shows, GenSim has, as most simulation platforms typically do, a GRD (Graphical Display) and an AND (Alpha Numeric Display). But actual system validation, for example mission operation, frequently requires more complex and powerful handling of the simulated data. In Figure 2, a system simulation result for COMS (Communication, Ocean, and Meteorological Satellite) is drawn by the Celestia 3D program. In this case, the data needed by Celestia is provided by one of the simulation models resident in the system simulator, through a UDP network connection. But the required display format, data size, and communication rate vary, so the developer has to manage the connection protocol manually each time and for each case. This creates chaos in simulation model design and development, and ultimately a performance issue. The performance issue arises when the magnitude of the required data exceeds the capacity of the simulation kernel to process that data safely.
The problem is that the data sent to a visualization tool such as Celestia is provided by a simulation model, not by the kernel. Because the simulation model has no way to know the simulation kernel's load in processing simulation events, the model sends data as frequently as it wishes. This can create many potential problems, such as lack of response, failure to meet deadlines, and data integrity problems with the model data during the simulation. SIMSAT and EuroSim give a warning message if a user-requested event, such as printing a log, cannot be processed as planned or requested. As a consequence the requested event is delayed or cannot be processed at all, which means this behavior may violate the planned deadline. In most soft real-time simulations this can be neglected, causing only minor inconvenience to users. But it should be noted that if user requests are not managed properly in certain critical situations, the simulation results may end in a mess. Having traced the disadvantages of letting the simulation model service user requests, we conclude that the simulation model is not the appropriate place to provide such a service. This kind of work should be minimized as much as possible.
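The remedy implied above, routing visualization traffic through a kernel that knows its own load rather than letting a model send "as frequently as needed", can be sketched as follows. All class names, thresholds, and rates here are hypothetical illustrations, not GenSim's actual design:

```python
# Sketch of load-aware, kernel-mediated visualization sends: the model asks
# the kernel to forward data, and the kernel drops best-effort sends when it
# is overloaded or when updates arrive faster than a configured rate.

import time

class SimulationKernel:
    MAX_QUEUE = 100             # event backlog above which sends are refused
    MIN_SEND_INTERVAL = 0.125   # seconds between visualization updates (8 Hz)

    def __init__(self):
        self.event_queue_len = 0
        self._last_send = float("-inf")
        self.sent = self.dropped = 0

    def request_visualization_send(self, payload, now=None):
        """Model asks the kernel to forward data; kernel may refuse."""
        now = time.monotonic() if now is None else now
        overloaded = self.event_queue_len > self.MAX_QUEUE
        too_soon = (now - self._last_send) < self.MIN_SEND_INTERVAL
        if overloaded or too_soon:
            self.dropped += 1        # visualization is best-effort
            return False
        self._last_send = now
        self.sent += 1               # real system: socket.sendto(payload, addr)
        return True

kernel = SimulationKernel()
for step in range(10):               # model requests updates at 16 Hz
    kernel.request_visualization_send(b"state", now=step * 0.0625)
print(kernel.sent, kernel.dropped)   # kernel throttles to every other request
```

Because the drop decision lives in the kernel, the model never blocks on a slow visualization link, and the simulation's own event deadlines are protected.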
Zheng, Wenjing; Balzer, Laura; van der Laan, Mark; Petersen, Maya
2018-01-30
Binary classification problems are ubiquitous in health and social sciences. In many cases, one wishes to balance two competing optimality considerations for a binary classifier. For instance, in resource-limited settings, a human immunodeficiency virus prevention program based on offering pre-exposure prophylaxis (PrEP) to select high-risk individuals must balance the sensitivity of the binary classifier in detecting future seroconverters (and hence offering them PrEP regimens) with the total number of PrEP regimens that is financially and logistically feasible for the program. In this article, we consider a general class of constrained binary classification problems wherein the objective function and the constraint are both monotonic with respect to a threshold. These include the minimization of the rate of positive predictions subject to a minimum sensitivity, the maximization of sensitivity subject to a maximum rate of positive predictions, and the Neyman-Pearson paradigm, which minimizes the type II error subject to an upper bound on the type I error. We propose an ensemble approach to these binary classification problems based on the Super Learner methodology. This approach linearly combines a user-supplied library of scoring algorithms, with combination weights and a discriminating threshold chosen to minimize the constrained optimality criterion. We then illustrate the application of the proposed classifier to develop an individualized PrEP targeting strategy in a resource-limited setting, with the goal of minimizing the number of PrEP offerings while achieving a minimum required sensitivity. This proof of concept data analysis uses baseline data from the ongoing Sustainable East Africa Research in Community Health study. Copyright © 2017 John Wiley & Sons, Ltd.
Thakur, Chandar S.; Brown, Margaret E.; Sama, Jacob N.; Jackson, Melantha E.
2010-01-01
Since RNAs lie at the center of most cellular processes, there is a need for synthesizing large amounts of RNAs made from stable isotope-labeled nucleotides to advance the study of their structure and dynamics by nuclear magnetic resonance (NMR) spectroscopy. A particularly effective means of obtaining labeled nucleotides is to harvest these nucleotides from bacteria grown in defined minimal media supplemented with 15NH4Cl and various carbon sources. Given the high cost of carbon precursors required for labeling nucleic acids for NMR studies, it becomes important to evaluate the optimal growth for commonly used strains under standard minimal media conditions. Such information is lacking. In this study, we characterize the growth for Escherichia coli strains K12, K10zwf, and DL323 in three minimal media with isotopic-labeled carbon sources of acetate, glycerol, and glycerol combined with formate. Of the three media, the LeMaster-Richards and the Studier media outperform the commonly used M9 media and both support optimal growth of E. coli for the production of nucleotides. However, the growth of all three E. coli strains in acetate is reduced almost twofold compared to growth in glycerol. Analysis of the metabolic pathway and previous gene array studies help to explain this differential growth in glycerol and acetate. These studies should benefit efforts to make selective 13C-15N isotopic-labeled nucleotides for synthesizing biologically important RNAs. Electronic supplementary material The online version of this article (doi:10.1007/s00253-010-2813-y) contains supplementary material, which is available to authorized users. PMID:20730533
Development of a User Interface for a Regression Analysis Software Tool
NASA Technical Reports Server (NTRS)
Ulbrich, Norbert Manfred; Volden, Thomas R.
2010-01-01
An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-08
... the level of trading volume sent to the Exchange by such users, such that a user with significant... size of the firm, every user that connects its systems to the Exchange's trading systems requires at... required to support for connectivity to its trading systems. This would also provide incentive for users to...
BioWord: A sequence manipulation suite for Microsoft Word
2012-01-01
Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326
BioWord: a sequence manipulation suite for Microsoft Word.
Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan
2012-06-07
The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.
Smart sensing to drive real-time loads scheduling algorithm in a domotic architecture
NASA Astrophysics Data System (ADS)
Santamaria, Amilcare Francesco; Raimondo, Pierfrancesco; De Rango, Floriano; Vaccaro, Andrea
2014-05-01
Nowadays, power consumption is a central concern, given its associated costs and environmental sustainability problems. Automatic load control based on power consumption and usage cycles represents the optimal solution for cost restraint. The purpose of these systems is to modulate the electricity demand, avoiding uncoordinated operation of the loads, by using intelligent techniques to manage them based on real-time scheduling algorithms. The goal is to coordinate a set of electrical loads to optimize energy costs and consumption based on the stipulated contract terms. The proposed algorithm uses two main notions: priority-driven loads and smart-scheduling loads. Priority-driven loads can be turned off (put on standby) according to a priority policy established by the user if consumption exceeds a defined threshold; smart-scheduling loads, on the contrary, are scheduled so as not to interrupt their life cycle (LC), safeguarding the devices' functions and allowing the user to use the devices freely without the risk of exceeding the power threshold. Using these two notions, and taking user requirements into account, the algorithm manages load activation and deactivation, allowing loads to complete their operation cycle without exceeding the consumption threshold, in an off-peak time range according to the electricity fare. This logic is inspired by industrial lean manufacturing, whose focus is to minimize any kind of power waste by optimizing the available resources.
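The priority-driven notion described above can be sketched as a simple shedding pass: when total demand exceeds the contractual threshold, interruptible loads are put on standby in ascending priority order, while loads mid-cycle (the "smart scheduling" kind) are never interrupted. Load names, wattages, and the threshold below are hypothetical:

```python
# Priority-driven load shedding sketch: shed the lowest-priority
# interruptible loads first until total demand fits under the contractual
# power threshold; loads protected mid-cycle are never interrupted.

def shed_loads(loads, threshold_w):
    """loads: list of (name, watts, priority, interruptible) tuples.
    Returns (names put on standby, resulting total demand in watts)."""
    total = sum(watts for _, watts, _, _ in loads)
    standby = []
    # Candidates in ascending priority order: least important sheds first.
    for name, watts, priority, interruptible in sorted(loads, key=lambda l: l[2]):
        if total <= threshold_w:
            break
        if interruptible:            # smart-scheduling loads are protected
            standby.append(name)
            total -= watts
    return standby, total

loads = [
    ("washing_machine", 2000, 5, False),  # mid-cycle: must finish its LC
    ("water_heater",    1500, 2, True),
    ("air_conditioner", 1200, 3, True),
    ("pool_pump",        800, 1, True),
]
print(shed_loads(loads, threshold_w=3300))
```

A real scheduler would also decide *when* to restart shed loads (e.g. in off-peak hours per the electricity fare), but the shedding decision itself reduces to this priority-ordered pass.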
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berg, Larry K.; Allwine, K Jerry; Rutz, Frederick C.
2004-08-23
A new modeling system has been developed to provide a non-meteorologist with tools to predict air pollution transport in regions of complex terrain. This system couples the Penn State/NCAR Mesoscale Model 5 (MM5) with Earth Tech’s CALMET-CALPUFF system using a unique Graphical User Interface (GUI) developed at Pacific Northwest National Laboratory. This system is most useful in data-sparse regions, where there are limited observations to initialize the CALMET model. The user is able to define the domain of interest, provide details about the source term, and enter a surface weather observation through the GUI. The system then generates initial conditionsmore » and time constant boundary conditions for use by MM5. MM5 is run and the results are piped to CALPUFF for the dispersion calculations. Contour plots of pollutant concentration are prepared for the user. The primary advantages of the system are the streamlined application of MM5 and CALMET, limited data requirements, and the ability to run the coupled system on a desktop or laptop computer. In comparison with data collected as part of a field campaign, the new modeling system shows promise that a full-physics mesoscale model can be used in an applied modeling system to effectively simulate locally thermally-driven winds with minimal observations as input. An unexpected outcome of this research was how well CALMET represented the locally thermally-driven flows.« less
Evaluation of the user requirements processes for NASA terrestrial applications programs
NASA Technical Reports Server (NTRS)
1982-01-01
To support the evolution of increasingly sound user requirements definition processes that would meet the broad range of NASA's terrestrial applications planning and management needs during the 1980's, the user requirements processes as they function in the real world at the senior and middle management levels were evaluated. Special attention was given to geologic mapping and domestic crop reporting to provide insight into problems associated with the development and management of user established conventional practices and data sources. An attempt was made to identify alternative NASA user interfaces that sustain strengths, alleviate weaknesses, maximize application to multiple problems, and simplify management cognizance. Some of the alternatives are outlined and evaluated. It is recommended that NASA have an identified organizational point of focus for consolidation and oversight of the user processes.
New Users | Center for Cancer Research
New Users Becoming a Core Facilities User The following steps are applicable to anyone who would like to become a user of the CCR SAXS Core facilities. All users are required to follow the Core Facility User Policies.
NASA Technical Reports Server (NTRS)
1999-01-01
The full complement of EDOMP investigations called for a broad spectrum of flight hardware ranging from commercial items, modified for spaceflight, to custom designed hardware made to meet the unique requirements of testing in the space environment. In addition, baseline data collection before and after spaceflight required numerous items of ground-based hardware. Two basic categories of ground-based hardware were used in EDOMP testing before and after flight: (1) hardware used for medical baseline testing and analysis, and (2) flight-like hardware used both for astronaut training and medical testing. To ensure post-landing data collection, hardware was required at both the Kennedy Space Center (KSC) and the Dryden Flight Research Center (DFRC) landing sites. Items that were very large or sensitive to the rigors of shipping were housed permanently at the landing site test facilities. Therefore, multiple sets of hardware were required to adequately support the prime and backup landing sites plus the Johnson Space Center (JSC) laboratories. Development of flight hardware was a major element of the EDOMP. The challenges included obtaining or developing equipment that met the following criteria: (1) compact (small size and light weight), (2) battery-operated or requiring minimal spacecraft power, (3) sturdy enough to survive the rigors of spaceflight, (4) quiet enough to pass acoustics limitations, (5) shielded and filtered adequately to assure electromagnetic compatibility with spacecraft systems, (6) user-friendly in a microgravity environment, and (7) accurate and efficient operation to meet medical investigative requirements.
Mepham, Nick; Bouman, Walter P; Arcelus, Jon; Hayter, Mark; Wylie, Kevan R
2014-12-01
There is a scarcity of research into the use of non-physician-sourced cross-sex hormones in the transgender population. However, when medication is not prescribed by health professionals, users' knowledge of such medication may be adversely affected. This study aims to define the prevalence of Internet-sourced sex hormone use in a population attending for initial assessment at a gender identity clinic, to compare the prevalence between gender-dysphoric men and women, and to compare knowledge of cross-sex hormone side effects between users who source cross-sex hormones from medical doctors and those who source them elsewhere. In the first part of the study, a cross-sectional design is used to measure the overall prevalence of sex hormone use among individuals referred to a gender clinic. The second part is a questionnaire survey aimed at measuring sex hormone knowledge among individuals referred to this clinic. Main outcome measures were (i) categorical data on the prevalence and source of cross-sex hormone use and (ii) knowledge of sex hormone side effects in a population referred to a gender clinic. Cross-sex hormone use was present in 23% of gender clinic referrals, of whom 70% sourced the hormones via the Internet. Trans men using testosterone had a sex hormone usage prevalence of 6%; one-third of users sourced it from the Internet. Trans women had a sex hormone usage prevalence of 32%; approximately 70% of users sourced hormones from the Internet. Cross-sex hormone users who sourced their hormones from physicians were more aware of side effects than those who used other sources to access hormones. One in four trans women self-prescribe cross-sex hormones before attending gender clinics, most commonly via the Internet. This practice is currently rare among trans men. Self-prescribing without medical advice leaves individuals without the knowledge required to minimize health risks. © 2014 International Society for Sexual Medicine.
Gaze Tracking System for User Wearing Glasses
Gwon, Su Yeong; Cho, Chul Woo; Lee, Hyeon Chang; Lee, Won Oh; Park, Kang Ryoung
2014-01-01
Conventional gaze tracking systems are limited in cases where the user is wearing glasses because the glasses usually produce noise due to reflections caused by the gaze tracker's lights. This makes it difficult to locate the pupil and the specular reflections (SRs) from the cornea of the user's eye. These difficulties increase the likelihood of gaze detection errors because the gaze position is estimated based on the location of the pupil center and the positions of the corneal SRs. In order to overcome these problems, we propose a new gaze tracking method that can be used by subjects who are wearing glasses. Our research is novel in the following four ways: First, we construct a new control device for the illuminator, which includes four illuminators that are positioned at the four corners of a monitor. Second, our system automatically determines whether a user is wearing glasses or not in the initial stage by counting the number of white pixels in an image that is captured using the low exposure setting on the camera. Third, if it is determined that the user is wearing glasses, the four illuminators are turned on and off sequentially in order to obtain an image that has a minimal amount of noise due to reflections from the glasses. As a result, it is possible to avoid the reflections and accurately locate the pupil center and the positions of the four corneal SRs. Fourth, by turning off one of the four illuminators, only three corneal SRs exist in the captured image. Since the proposed gaze detection method requires four corneal SRs for calculating the gaze position, the unseen SR position is estimated based on the parallelogram shape that is defined by the three SR positions, and the gaze position is calculated. Experimental results showed that the average gaze detection error with 20 persons was about 0.70° and the processing time was 63.72 ms per frame. PMID:24473283
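The parallelogram-based recovery of the unseen corneal SR described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the function name and the assumption that the three visible SRs are consecutive corners A, B, C of parallelogram ABCD are mine.

```python
def estimate_missing_sr(a, b, c):
    """Estimate the fourth corner D of parallelogram ABCD from the three
    visible corneal specular reflections A, B, C (taken in order).
    Because a parallelogram's diagonals bisect each other, D = A + C - B."""
    ax, ay = a
    bx, by = b
    cx, cy = c
    return (ax + cx - bx, ay + cy - by)
```

For example, with visible SRs at (0, 0), (2, 0), and (3, 1), the estimated fourth SR is (1, 1), completing the parallelogram.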
Freckmann, Guido; Jendrike, Nina; Baumstark, Annette; Pleus, Stefan; Liebing, Christina; Haug, Cornelia
2018-04-01
The international standard ISO 15197:2013 requires a user performance evaluation to assess if intended users are able to obtain accurate blood glucose measurement results with a self-monitoring of blood glucose (SMBG) system. In this study, user performance was evaluated for four SMBG systems on the basis of ISO 15197:2013, and possibly related insulin dosing errors were calculated. Additionally, accuracy was assessed in the hands of study personnel. Accu-Chek® Performa Connect (A), Contour® plus ONE (B), FreeStyle Optium Neo (C), and OneTouch Select® Plus (D) were evaluated with one test strip lot. After familiarization with the systems, subjects collected a capillary blood sample and performed an SMBG measurement. Study personnel observed the subjects' measurement technique. Then, study personnel performed SMBG measurements and comparison measurements. Number and percentage of SMBG measurements within ±15 mg/dl and ±15% of the comparison measurements at glucose concentrations <100 and ≥100 mg/dl, respectively, were calculated. In addition, insulin dosing errors were modelled. In the hands of lay-users three systems fulfilled ISO 15197:2013 accuracy criteria with the investigated test strip lot showing 96% (A), 100% (B), and 98% (C) of results within the defined limits. All systems fulfilled minimum accuracy criteria in the hands of study personnel [99% (A), 100% (B), 99.5% (C), 96% (D)]. Measurements with all four systems were within zones of the consensus error grid and surveillance error grid associated with no or minimal risk. Regarding calculated insulin dosing errors, all 99% ranges were between dosing errors of -2.7 and +1.4 units for measurements in the hands of lay-users and between -2.5 and +1.4 units for study personnel. Frequent lay-user errors were not checking the test strips' expiry date and applying blood incorrectly.
Data obtained in this study show that not all available SMBG systems complied with ISO 15197:2013 accuracy criteria when measurements were performed by lay-users. The study was registered at ClinicalTrials.gov (NCT02916576). Ascensia Diabetes Care Deutschland GmbH.
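The ISO 15197:2013 system-accuracy limits used above (within ±15 mg/dl of the comparison value below 100 mg/dl, within ±15% at or above 100 mg/dl) translate directly into code. This is a minimal sketch of the per-measurement criterion only, not of the study's full analysis or its insulin-dosing model; function names are illustrative.

```python
def within_iso15197(measured, reference):
    """True if one SMBG result meets the ISO 15197:2013 accuracy limit:
    within ±15 mg/dl of the reference below 100 mg/dl, within ±15% otherwise."""
    if reference < 100.0:
        return abs(measured - reference) <= 15.0
    return abs(measured - reference) <= 0.15 * reference

def accuracy_rate(pairs):
    """Fraction of (measured, reference) pairs meeting the limit.
    ISO 15197:2013 requires at least 95% of results within these limits."""
    hits = sum(within_iso15197(m, r) for m, r in pairs)
    return hits / len(pairs)
```

For instance, a reading of 115 mg/dl against a 100 mg/dl reference passes (15% deviation), while 120 mg/dl against the same reference fails.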
Data Sciences Summer Institute Topology Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, Seth
DSSI_TOPOPT is a 2D topology optimization code that designs stiff structures made of a single linear elastic material and void space. The code generates a finite element mesh of a rectangular design domain on which the user specifies displacement and load boundary conditions. The code iteratively designs a structure that minimizes the compliance (maximizes the stiffness) of the structure under the given loading, subject to an upper bound on the amount of material used. Depending on user options, the code can evaluate the performance of a user-designed structure, or create a design from scratch. Output includes the finite element mesh, design, and visualizations of the design.
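The iterative design loop described above, minimizing compliance subject to a material bound, is typically driven by an optimality-criteria density update in SIMP-style codes. The sketch below shows one such update step under assumed inputs (element densities and sensitivities from an external finite-element analysis, which is not included); the function name, parameters, and numerical values are illustrative assumptions, not DSSI_TOPOPT's actual implementation.

```python
import numpy as np

def oc_update(x, dc, dv, volfrac, move=0.2, eta=0.5):
    """One optimality-criteria density update for compliance minimization.
    x: current element densities in [0, 1]; dc: compliance sensitivities (<= 0)
    from the FE analysis; dv: volume sensitivities (> 0); volfrac: target
    volume fraction. A bisection on the Lagrange multiplier enforces the
    material (volume) constraint."""
    l1, l2 = 1e-9, 1e9
    xnew = x
    while (l2 - l1) / (l1 + l2) > 1e-6:
        lmid = 0.5 * (l1 + l2)
        # Scaled density update, limited by the move limit and box bounds [0, 1]
        xnew = np.clip(x * (np.maximum(-dc, 0.0) / (lmid * dv)) ** eta,
                       np.maximum(0.0, x - move),
                       np.minimum(1.0, x + move))
        if xnew.mean() > volfrac:   # too much material: raise the multiplier
            l1 = lmid
        else:                        # too little material: lower it
            l2 = lmid
    return xnew
```

Repeating this update with freshly computed sensitivities each iteration drives the density field toward a stiff design that uses no more than the allowed material fraction.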
Product Related Advantages of a Structured PACS Architecture
Greinacher, C.F.C.; Fuchs, D.; Perry, J.
1986-01-01
Most of the previously described PACS solutions are either developed and evaluated by the users themselves or they are pilot projects evaluated in close cooperation between a user and one or several manufacturers. Many of the prerequisites that will have to be met by future PACS are of minor interest in these pilot projects, e.g., minimization of costs, compatibility between different manufacturers' equipment, feasibility of systems engineering and customizing of a standard product, serviceability of different manufacturers' subsystems, stepwise introduction of the system into the daily routine, and fail-safety of the system. The paper shows a structured PACS architecture as a prerequisite to make PACS a high-quality system from the user's point of view.
14 CFR 1215.108 - Defining user service requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true Defining user service requirements. 1215.108 Section 1215.108 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION TRACKING AND DATA..., spacecraft design, operations planning, and other significant mission parameters. When these user evaluations...
14 CFR 1215.108 - Defining user service requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 5 2012-01-01 2012-01-01 false Defining user service requirements. 1215.108 Section 1215.108 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION TRACKING AND... services, spacecraft design, operations planning, and other significant mission parameters. When these user...
Evaluation of consolidation problems in thicker Portland cement concrete pavements
DOT National Transportation Integrated Search
2003-08-01
Minimizing the amount of entrapped air in concrete is necessary to produce quality concrete with a longer pavement performance life, lower maintenance costs and fewer delays to the roadway users. Good quality concrete with low entrapped air content w...
Spiral-Bevel-Gear Damage Detected Using Decision Fusion Analysis
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.; Handschuh, Robert F.
2003-01-01
Helicopter transmission integrity is critical to helicopter safety because helicopters depend on the power train for propulsion, lift, and flight maneuvering. To detect impending transmission failures, the ideal diagnostic tools used in the health-monitoring system would provide real-time health monitoring of the transmission, demonstrate a high level of reliable detection to minimize false alarms, and provide end users with clear information on the health of the system without requiring them to interpret large amounts of sensor data. A diagnostic tool for detecting damage to spiral bevel gears was developed. (Spiral bevel gears are used in helicopter transmissions to transfer power between nonparallel intersecting shafts.) Data fusion was used to integrate two different monitoring technologies, oil debris analysis and vibration, into a health-monitoring system for detecting surface fatigue pitting damage on the gears.
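The record above integrates two monitoring technologies, oil-debris analysis and vibration, into one damage decision. The snippet below is a deliberately simplified weighted-average fusion sketch to illustrate the idea; the weights, thresholds, status labels, and function name are illustrative assumptions and do not reflect the actual fusion method used in the NASA system.

```python
def fuse_health_indicators(oil_debris, vibration, w_oil=0.6, w_vib=0.4,
                           warn=0.3, alarm=0.6):
    """Combine two normalized damage indicators (0 = healthy, 1 = failed)
    into a single fused level and a status label. All constants here are
    illustrative assumptions, not values from the referenced work."""
    fused = w_oil * oil_debris + w_vib * vibration
    if fused >= alarm:
        status = "damage"
    elif fused >= warn:
        status = "inspect"
    else:
        status = "healthy"
    return fused, status
```

Fusing two weak indications (e.g., 0.4 and 0.3) yields an "inspect" status rather than an alarm, which is one way such systems trade detection sensitivity against false alarms.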
Symmetrically private information retrieval based on blind quantum computing
NASA Astrophysics Data System (ADS)
Sun, Zhiwei; Yu, Jianping; Wang, Ping; Xu, Lingling
2015-05-01
Universal blind quantum computation (UBQC) is a new secure quantum computing protocol which allows a user Alice who does not have any sophisticated quantum technology to delegate her computing to a server Bob without leaking any privacy. Using the features of UBQC, we propose a protocol to achieve symmetrically private information retrieval, which allows a quantum-limited Alice to query an item from Bob, who holds a fully fledged quantum computer; meanwhile, the privacy of both parties is preserved. The security of our protocol is based on the assumption that malicious Alice has no quantum computer, which avoids the impossibility proof of Lo. For the honest Alice, she is almost classical and only requires minimal quantum resources to carry out the proposed protocol. Therefore, she does not need any expensive laboratory which can maintain the coherence of complicated quantum experimental setups.
A training paradigm to enhance performance and safe use of an innovative neuroendovascular device
Ricci, Donald R.; Marotta, Thomas R.; Riina, Howard A.; Wan, Martina; De Vries, Joost
2016-01-01
Training has been important to facilitate the safe use of new devices designed to repair vascular structures. This paper outlines the generic elements of a training program for vascular devices and uses as an example the actual training requirements for a novel device developed for the treatment of bifurcation intracranial aneurysms. Critical elements of the program include awareness of the clinical problem, technical features of the device, case selection, and use of a simulator. Formal proctoring, evaluation of the training, and recording of the clinical outcomes complement these elements. Interventional physicians should embrace the merits of a training module to improve the user experience, and vendors, physicians, and patients alike should be aligned in the goal of device training to improve its success rate and minimize complications of the procedure. PMID:27867466
Soga, Kenichi; Schooling, Jennifer
2016-08-06
Design, construction, maintenance and upgrading of civil engineering infrastructure require fresh thinking to minimize the use of materials, energy and labour. This can only be achieved by understanding the performance of the infrastructure, both during its construction and throughout its design life, through innovative monitoring. Advances in sensor systems offer intriguing possibilities to radically alter methods of condition assessment and monitoring of infrastructure. In this paper, it is hypothesized that the future of infrastructure relies on smarter information; the rich information obtained from embedded sensors within infrastructure will act as a catalyst for new design, construction, operation and maintenance processes for integrated infrastructure systems linked directly with user behaviour patterns. Some examples of emerging sensor technologies for infrastructure sensing are given. They include distributed fibre-optics sensors, computer vision, wireless sensor networks, low-power micro-electromechanical systems, energy harvesting and citizens as sensors.
Yoon, Hyejin; Leitner, Thomas
2014-12-17
Analyses of entire viral genomes or mtDNA require comprehensive design of many primers across their genomes. In addition, simultaneous optimization of several DNA primer design criteria may improve overall experimental efficiency and downstream bioinformatic processing. To achieve these goals, we developed PrimerDesign-M. It includes several options for multiple-primer design, allowing researchers to efficiently design walking primers that cover long DNA targets, such as entire HIV-1 genomes, and that are optimized simultaneously with respect to genetic diversity in multiple alignments and experimental design constraints given by the user. PrimerDesign-M can also design primers that include DNA barcodes and minimize primer dimerization. PrimerDesign-M finds optimal primers for highly variable DNA targets and facilitates design flexibility by suggesting alternative designs to adapt to experimental conditions.
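Minimizing primer dimerization, one of the design criteria mentioned above, amounts to avoiding complementarity at primer 3' ends. The check below is an illustrative sketch of that idea, not PrimerDesign-M's actual scoring: it flags a pair when the last k bases of one primer can base-pair (antiparallel, Watson-Crick) anywhere on the other. The cutoff k=4 and all names are assumptions.

```python
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq):
    """Reverse complement of a DNA sequence (uppercase A/C/G/T)."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def three_prime_dimer(primer1, primer2, k=4):
    """True if the last k bases of primer1 can anneal anywhere on primer2
    via antiparallel Watson-Crick pairing: a common primer-dimer red flag.
    The cutoff k=4 is an illustrative assumption, not a validated threshold."""
    tail = primer1[-k:]
    return reverse_complement(tail) in primer2
```

A designer would run such a check across every primer pair in a walking-primer set and penalize or regenerate candidates that trigger it.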
New trends in the virtualization of hospitals--tools for global e-Health.
Graschew, Georgi; Roelofs, Theo A; Rakowsky, Stefan; Schlag, Peter M; Heinzlreiter, Paul; Kranzlmüller, Dieter; Volkert, Jens
2006-01-01
The development of virtual hospitals and digital medicine helps to bridge the digital divide between different regions of the world and enables equal access to high-level medical care. Pre-operative planning, intra-operative navigation and minimally-invasive surgery require a digital and virtual environment supporting the perception of the physician. As data and computing resources in a virtual hospital are distributed over many sites the concept of the Grid should be integrated with other communication networks and platforms. A promising approach is the implementation of service-oriented architectures for an invisible grid, hiding complexity for both application developers and end-users. Examples of promising medical applications of Grid technology are the real-time 3D-visualization and manipulation of patient data for individualized treatment planning and the creation of distributed intelligent databases of medical images.