Development and evaluation of nursing user interface screens using multiple methods.
Hyun, Sookyung; Johnson, Stephen B; Stetson, Peter D; Bakken, Suzanne
2009-12-01
Building upon the foundation of the Structured Narrative Electronic Health Record (EHR) model, we applied theory-based (combined Technology Acceptance Model and Task-Technology Fit Model) and user-centered methods to explore nurses' perceptions of functional requirements for an electronic nursing documentation system, design user interface screens reflective of the nurses' perspectives, and assess nurses' perceptions of the usability of the prototype user interface screens. The methods resulted in user interface screens that were perceived to be easy to use, potentially useful, and well-matched to nursing documentation tasks associated with Nursing Admission Assessment, Blood Administration, and Nursing Discharge Summary. The methods applied in this research may serve as a guide for others wishing to implement user-centered processes to develop or extend EHR systems. In addition, some of the insights obtained in this study may be informative to the development of safe and efficient user interface screens for nursing document templates in EHRs.
Social Image Captioning: Exploring Visual Attention and User Attention.
Wang, Leiquan; Chu, Xiaoliang; Zhang, Weishan; Wei, Yiwei; Sun, Weichen; Wu, Chunlei
2018-02-22
Image captioning in natural language has been an emerging trend. However, the social image, which is associated with a set of user-contributed tags, has rarely been investigated for a similar task. The user-contributed tags, which can reflect user attention, have been neglected in conventional image captioning, and most existing image captioning models cannot be applied directly to social image captioning. In this work, a dual attention model is proposed for social image captioning that combines visual attention and user attention simultaneously. Visual attention is used to compress a large amount of salient visual information, while user attention is applied to adjust the description of the social images with user-contributed tags. Experiments conducted on the Microsoft (MS) COCO dataset demonstrate the superiority of the proposed dual attention method.
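To make the combination concrete, the sketch below mixes an attention distribution over visual region features with one over user-contributed tag embeddings; the array names, dimensions, and fixed mixing weight are illustrative assumptions and not the architecture reported in the paper.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dual_attention(visual_feats, tag_embeds, query, w_visual=0.5):
    """Blend attention over image regions (visual attention) with attention
    over user-contributed tag embeddings (user attention)."""
    vis_weights = softmax(visual_feats @ query)      # attention over regions
    tag_weights = softmax(tag_embeds @ query)        # attention over tags
    vis_ctx = vis_weights @ visual_feats             # attended visual context
    tag_ctx = tag_weights @ tag_embeds               # attended tag context
    return w_visual * vis_ctx + (1.0 - w_visual) * tag_ctx

# Toy usage with random features: 36 image regions, 5 tags, 64-d embeddings.
rng = np.random.default_rng(0)
ctx = dual_attention(rng.normal(size=(36, 64)), rng.normal(size=(5, 64)),
                     rng.normal(size=64))
print(ctx.shape)  # (64,) fused context that would feed a caption decoder
```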
USER'S MANUAL FOR THE INSTREAM SEDIMENT-CONTAMINANT TRANSPORT MODEL SERATRA
This manual guides the user in applying the sediment-contaminant transport model SERATRA. SERATRA is an unsteady, two-dimensional code that uses the finite element computation method with the Galerkin weighted residual technique. The model has general convection-diffusion equatio...
Users' Interaction with World Wide Web Resources: An Exploratory Study Using a Holistic Approach.
ERIC Educational Resources Information Center
Wang, Peiling; Hawk, William B.; Tenopir, Carol
2000-01-01
Presents results of a study that explores factors of user-Web interaction in finding factual information, develops a conceptual framework for studying user-Web interaction, and applies a process-tracing method for conducting holistic user-Web studies. Describes measurement techniques and proposes a model consisting of the user, interface, and the…
User Modeling for Contextual Suggestion
2014-11-01
information retrieval literature (Salton et al., 1975). To apply this metric, we converted the user interest model into a vector representation with all...Discovering Virtual Interest Groups across Chat Rooms, International Conference on Knowledge Management and Information Sharing (KMIS 2012). [7] Salton, G., A
The motivation for drug abuse treatment: testing cognitive and 12-step theories.
Bell, D C; Montoya, I D; Richard, A J; Dayton, C A
1998-11-01
The purpose of this paper is to evaluate two models of behavior change: cognitive theory and 12-step theory. Research subjects were drawn from three separate, but parallel, samples of adults. The first sample consisted of out-of-treatment chronic drug users, the second consisted of drug users who had applied for treatment at a publicly funded multiple-provider drug treatment facility, and the third consisted of drug users who had applied for treatment at an intensive outpatient program for crack cocaine users. Cognitive theory was supported. Study participants applying for drug abuse treatment reported a higher level of perceived problem severity and a higher level of cognitive functioning than out-of-treatment drug users. Two hypotheses drawn from 12-step theory were not supported. Treatment applicants had more positive emotional functioning than out-of-treatment drug users, and one treatment-seeking sample had higher self-esteem.
ERIC Educational Resources Information Center
Amershi, Saleema; Conati, Cristina
2009-01-01
In this paper, we present a data-based user modeling framework that uses both unsupervised and supervised classification to build student models for exploratory learning environments. We apply the framework to build student models for two different learning environments and using two different data sources (logged interface and eye-tracking data).…
Social relevance: toward understanding the impact of the individual in an information cascade
NASA Astrophysics Data System (ADS)
Hall, Robert T.; White, Joshua S.; Fields, Jeremy
2016-05-01
Information Cascades (IC) through a social network occur due to the decision of users to disseminate content. We define this decision process as User Diffusion (UD). IC models typically describe an information cascade by treating a user as a node within a social graph, where a node's reception of an idea is represented by some activation state. The probability of activation then becomes a function of a node's connectedness to other activated nodes as well as, potentially, the history of activation attempts. We enrich this Coarse-Grained User Diffusion (CGUD) model by applying actor type logics to the nodes of the graph. The resulting Fine-Grained User Diffusion (FGUD) model utilizes prior research in actor typing to generate a predictive model regarding the future influence a user will have on an Information Cascade. Furthermore, we introduce a measure of Information Resonance that is used to aid in predictions regarding user behavior.
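As a rough illustration of how activation spreads in a coarse-grained user diffusion model, the sketch below runs an independent-cascade-style simulation with a single global activation probability; the toy graph and probability value are assumptions, not the FGUD model itself.

```python
import random

def simulate_cascade(neighbors, seeds, p=0.1, seed=42):
    """Coarse-grained cascade: each newly activated node gets one chance to
    activate each inactive neighbor with a fixed probability p."""
    rng = random.Random(seed)
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        newly_active = []
        for node in frontier:
            for nb in neighbors.get(node, []):
                if nb not in active and rng.random() < p:
                    active.add(nb)
                    newly_active.append(nb)
        frontier = newly_active
    return active

graph = {0: [1, 2], 1: [2, 3], 2: [3], 3: [4], 4: []}  # toy social graph
print(simulate_cascade(graph, seeds=[0], p=0.5))
```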
BASINS enables users to efficiently access nationwide environmental databases and local user-specified datasets, apply assessment and planning tools, and run a variety of proven nonpoint loading and water quality models within a single GIS format.
Online Video-Based Training in the Use of Hydrologic Models: A Case Example Using SWAT
NASA Astrophysics Data System (ADS)
Frankenberger, J.
2009-12-01
Hydrologic models are increasingly important tools in public decision-making. For example, watershed models are used to develop Total Maximum Daily Load (TMDL) plans, quantify pollutant loads, and estimate the effects of watershed restoration efforts funded by the public. One widely-used tool is the Soil and Water Assessment Tool (SWAT), which has been applied by state and federal agencies, consultants, and university researchers to assess sources of nonpoint source pollution and the effects of potential solutions, and used in testimony in at least one lawsuit. The SWAT model has the capability to evaluate the relative effects of different management scenarios on water quality, sediment, and agricultural chemical yield at the watershed scale. As with all models, the model user and the decisions that s/he makes in the modeling process are important determinants of model performance. The SWAT model has an open structure, leaving most decisions up to the model user, which was especially appropriate when the model was primarily used in research by highly-experienced modelers. However, as the model has become more widely applied in planning and assessment, by people who may have limited hydrology background and modeling knowledge, the possibility that users may be using the model inconsistently or even incorrectly becomes a concern. Consistent training can lead to a minimum standard of knowledge that model users are expected to have, and therefore to higher use of best practices in modeling efforts. In addition, widespread availability of training can lead to better decisions about when and where using the model is appropriate, and what level of data needs to be available for confidence in predictions. Currently, most training in model use takes place in occasional face-to-face workshops, courses offered at a few universities, and a short tutorial available in the manual. Many new users simply acquire the model and learn from the manual, other users, trial and error, and posing questions to the web-based group. Although excellent model documentation is available (http://www.brc.tamus.edu/swat/), the extent to which users conduct careful study to ensure that their understanding goes beyond the superficial level is unknown. Online video-on-demand technology provides a way for users to learn consistent content while saving travel resources, and allows for training to fit into people’s schedule. Online videos were created to teach the basics of setting up the model, acquiring input data, parameterizing, calibrating and evaluating model results, and analyzing outputs, as well as model science, uncertainty, and appropriate use. Feedback was sought from the SWAT modeling community to determine typical backgrounds of model users and how the video training fits into the available means of learning SWAT. Evaluation is being undertaken to assess results. The processes used in developing and evaluating the online SWAT training could be applied to other computer models for which use among broader groups of people is increasing or has the potential to grow.
An approach to developing user interfaces for space systems
NASA Astrophysics Data System (ADS)
Shackelford, Keith; McKinney, Karen
1993-08-01
Inherent weakness in the traditional waterfall model of software development has led to the definition of the spiral model. The spiral software development lifecycle model, however, has not been applied to NASA projects. This paper describes its use in developing real-time user interface software for an Environmental Control and Life Support System (ECLSS) Process Control Prototype at NASA's Marshall Space Flight Center.
Cognitive Modeling of Video Game Player User Experience
NASA Technical Reports Server (NTRS)
Bohil, Corey J.; Biocca, Frank A.
2010-01-01
This paper argues for the use of cognitive modeling to gain a detailed and dynamic look into user experience during game play. Applying cognitive models to game play data can help researchers understand a player's attentional focus, memory status, learning state, and decision strategies (among other things) as these cognitive processes occurred throughout game play. This is a stark contrast to the common approach of trying to assess the long-term impact of games on cognitive functioning after game play has ended. We describe what cognitive models are, what they can be used for, and how game researchers could benefit by adopting these methods. We also provide details of a single model - based on decision field theory - that has been successfully applied to data sets from memory, perception, and decision making experiments, and has recently found application in real-world scenarios. We examine possibilities for applying this model to game-play data.
Modeling User Behavior in Computer Learning Tasks.
ERIC Educational Resources Information Center
Mantei, Marilyn M.
Model building techniques from Artificial Intelligence and Information-Processing Psychology are applied to human-computer interface tasks to evaluate existing interfaces and suggest new and better ones. The model is in the form of an augmented transition network (ATN) grammar which is built by applying grammar induction heuristics on a sequential…
Statistical modeling for visualization evaluation through data fusion.
Chen, Xiaoyu; Jin, Ran
2017-11-01
There is high demand for data visualization that provides insights to users in various applications. However, a consistent, online visualization evaluation method to quantify mental workload or user preference is lacking, which leads to an inefficient visualization and user interface design process. Recently, the advancement of interactive and sensing technologies has made electroencephalogram (EEG) signals, eye movements, and visualization logs available for user-centered evaluation. This paper proposes a data fusion model and the application procedure for quantitative and online visualization evaluation. Fifteen participants joined the study, which was based on three different visualization designs. The results provide a regularized regression model which can accurately predict the user's evaluation of task complexity, and indicate the significance of all three types of sensing data sets for visualization evaluation. This model can be widely applied to data visualization evaluation, and to other user-centered design evaluation and data analysis in human factors and ergonomics.
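A minimal sketch of the data fusion idea, assuming synthetic EEG, eye-movement, and log features are simply concatenated and fed to a regularized (ridge) regression; the feature counts and target are invented for illustration and do not reproduce the paper's model.

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 90
eeg = rng.normal(size=(n, 8))      # e.g., band-power features per trial
eye = rng.normal(size=(n, 4))      # e.g., fixation/saccade statistics
logs = rng.normal(size=(n, 3))     # e.g., interaction-log counts
X = np.hstack([eeg, eye, logs])    # simple feature-level fusion
y = 0.8 * X[:, 0] - 0.5 * X[:, 9] + rng.normal(scale=0.3, size=n)  # synthetic complexity rating

model = make_pipeline(StandardScaler(), RidgeCV(alphas=np.logspace(-3, 3, 13)))
model.fit(X, y)
print("R^2 on training data:", round(model.score(X, y), 3))
```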
Shin, Dong-Hee; Kim, Won-Yong; Kim, Won-Young
2008-06-01
This study explores attitudinal and behavioral patterns when using Cyworld by adopting an expanded Technology Acceptance Model (TAM). A model for Cyworld acceptance is used to examine how various factors modified from the TAM influence acceptance and its antecedents. This model is examined through an empirical study involving Cyworld users, using structural equation modeling techniques. The model shows reasonably good measurement properties and the constructs are validated. The results not only confirm the model but also reveal general factors applicable to Web 2.0. A set of constructs in the model can be considered Web 2.0-specific factors, acting as enhancing factors for attitudes and intention.
Measuring Levels of End-Users' Acceptance and Use of Hybrid Library Services
ERIC Educational Resources Information Center
Tibenderana, Prisca; Ogao, Patrick; Ikoja-Odongo, J.; Wokadala, James
2010-01-01
This study concerns the adoption of Information Communication Technology (ICT) services in libraries. The study collected 445 usable data from university library end-users using a cross-sectional survey instrument. It develops, applies and tests a research model of acceptance and use of such services based on an existing UTAUT model by Venkatesh,…
Artificial Intelligence for VHSIC Systems Design (AIVD) User Reference Manual
1988-12-01
The goal of this program was to develop prototype tools which would use artificial intelligence techniques to extend the Architecture Design and Assessment (ADAS) software capabilities. These techniques were applied in a number of ways to increase the productivity of ADAS users. AIVD will reduce the amount of time spent on tedious and error-prone steps. It will also provide documentation that will assist users in verifying that the models they build are correct. Finally, AIVD will help make ADAS models more reusable.
Development of 3D browsing and interactive web system
NASA Astrophysics Data System (ADS)
Shi, Xiaonan; Fu, Jian; Jin, Chaolin
2017-09-01
In the current market, users must download dedicated software or plug-ins to browse a 3D model; such browsing can be unstable and does not support interaction with the model. To solve this problem, this paper presents a solution for interactive browsing in which the model is parsed on the server side. When the system is applied, the user only needs to enter the system URL and upload a 3D model file to browse it. The server parses the 3D model in real time and keeps the interactive response fast. This follows a minimalist philosophy for the user and removes the barriers that currently hinder 3D content development.
Wu, Yiping; Liu, Shu-Guang
2012-01-01
R program language-Soil and Water Assessment Tool-Flexible Modeling Environment (R-SWAT-FME) (Wu and Liu, 2012) is a comprehensive modeling framework that adopts an R package, Flexible Modeling Environment (FME) (Soetaert and Petzoldt, 2010), for the Soil and Water Assessment Tool (SWAT) model (Arnold and others, 1998; Neitsch and others, 2005). This framework provides the functionalities of parameter identifiability, model calibration, and sensitivity and uncertainty analysis with instant visualization. This user's guide shows how to apply this framework for a customized SWAT project.
The Snowmelt-Runoff Model (SRM) user's manual
NASA Technical Reports Server (NTRS)
Martinec, J.; Rango, A.; Major, E.
1983-01-01
A manual to provide a means by which a user may apply the snowmelt runoff model (SRM) unaided is presented. Model structure, conditions of application, and data requirements, including remote sensing, are described. Guidance is given for determining various model variables and parameters. Possible sources of error are discussed, and conversion of the snowmelt runoff model (SRM) from the simulation mode to the operational forecasting mode is explained. A computer program for running SRM is presented that is easily adaptable to most systems used by water resources agencies.
Product Recommendation System Based on Personal Preference Model Using CAM
NASA Astrophysics Data System (ADS)
Murakami, Tomoko; Yoshioka, Nobukazu; Orihara, Ryohei; Furukawa, Koichi
A product recommendation system is realized by applying business rules acquired with data mining techniques. Business rules, such as demographic patterns of purchase, can cover the groups of users that have a tendency to purchase products, but it is difficult to recommend products adapted to varied personal preferences using such rules alone. In addition, it is very costly to gather the large volume of high-quality survey data that is necessary for good recommendation based on a personal preference model. A method for collecting kansei information automatically, without questionnaire surveys, is required. Constructing a personal preference model from limited preference data is also necessary, since it is costly for the user to input preference data. In this paper, we propose a product recommendation system based on kansei information extracted by text mining and a user preference model constructed by Category-guided Adaptive Modeling (CAM for short). CAM is a feature construction method that generates new features defining a space in which examples with the same label are close together and examples with different labels are far apart, starting from a small set of labeled examples. CAM therefore makes it possible to construct a personal preference model even with little information about liked and disliked categories. In the system, a retrieval agent gathers product specifications and a user agent manages the preference model and the user's likes and dislikes. Kansei information about the products is obtained by applying text mining to reputation documents about the products on web sites. We carry out experimental studies to confirm that the preference model obtained by our method performs effectively.
Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example
Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.
2016-02-10
The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z) prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level.The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, the flexibility inherent in this modeling tool makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications. The goal of this modeling tool is to be a user-friendly modeling tool for developing fish population models useful to natural resource managers to inform their decision-making processes; however, as with all population models, caution is needed, and a full understanding of the limitations of a model and the veracity of user-supplied parameters should always be considered when using such model output in the management of any species.
Shin, Wonkyoung; Park, Minyong
2017-01-01
Background/Study Context: The increasing longevity and health of older users, as well as aging populations, have created the need to develop senior-oriented product interfaces. This study aims to find user interface (UI) priorities for older user groups based on their lifestyle and to develop quality of UI (QUI) models for large electronic home appliances and mobile products. A segmentation table designed to show how older users can be categorized was created through a review of the literature and used to survey 252 subjects with a questionnaire. Factor analysis was performed to extract six preliminary lifestyle factors, which were then used for subsequent cluster analysis. The analysis resulted in four groups. Cross-analysis was carried out to investigate which characteristics were included in the groups. Analysis of variance was then applied to investigate the differences in the UI priorities among the user groups for various electronic devices. Finally, QUI models were developed and applied to those electronic devices. Differences in UI priorities were found according to the four lifestyles ("money-oriented," "innovation-oriented," "stability- and simplicity-oriented," and "innovation- and intellectual-oriented"). Twelve QUI models were developed for four different lifestyle groups associated with different products. Three washers and three smartphones were used as examples for testing the QUI models. The UI differences among the older user groups found by the segmentation in this study, which uses several key variables (i.e., demographic, socioeconomic, and physical-cognitive), are distinct from those of earlier studies based on a single variable. The differences in responses clearly indicate the benefits of integrating various factors of older users, rather than a single variable, in order to design and develop more innovative and better consumer products in the future. The results of this study showed that older users with potentially high buying power in the future are likely to have higher satisfaction when selecting products customized for their lifestyle. Designers could also use the results of UI evaluation for older users based on their lifestyle, through QUI modeling, before developing products. This approach would save time and costs.
Dabek, Filip; Caban, Jesus J
2017-01-01
Despite the recent popularity of visual analytics focusing on big data, little is known about how to support users that use visualization techniques to explore multi-dimensional datasets and accomplish specific tasks. Our lack of models that can assist end-users during the data exploration process has made it challenging to learn from the user's interactive and analytical process. The ability to model how a user interacts with a specific visualization technique and what difficulties they face are paramount in supporting individuals with discovering new patterns within their complex datasets. This paper introduces the notion of visualization systems understanding and modeling user interactions with the intent of guiding a user through a task thereby enhancing visual data exploration. The challenges faced and the necessary future steps to take are discussed; and to provide a working example, a grammar-based model is presented that can learn from user interactions, determine the common patterns among a number of subjects using a K-Reversible algorithm, build a set of rules, and apply those rules in the form of suggestions to new users with the goal of guiding them along their visual analytic process. A formal evaluation study with 300 subjects was performed showing that our grammar-based model is effective at capturing the interactive process followed by users and that further research in this area has the potential to positively impact how users interact with a visualization system.
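As a simplified stand-in for the grammar-based approach (not the K-Reversible algorithm used in the paper), the sketch below learns common action-to-action transitions from several users' interaction logs and suggests likely next actions to a new user; the session data are hypothetical.

```python
from collections import Counter, defaultdict

def learn_transitions(sessions):
    """Count action-to-action transitions across many users' interaction logs."""
    trans = defaultdict(Counter)
    for actions in sessions:
        for a, b in zip(actions, actions[1:]):
            trans[a][b] += 1
    return trans

def suggest_next(trans, last_action, k=2):
    """Suggest the k most common follow-up actions seen in other users' sessions."""
    return [a for a, _ in trans[last_action].most_common(k)]

sessions = [["filter", "zoom", "select", "export"],
            ["filter", "zoom", "pan", "select"],
            ["zoom", "select", "export"]]
trans = learn_transitions(sessions)
print(suggest_next(trans, "zoom"))  # e.g., ['select', 'pan']
```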
UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.
Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun
2013-12-01
Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely used methods based on probabilistic modeling have drawbacks in terms of consistency across multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.
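A minimal sketch of NMF-based topic modeling in the spirit of UTOPIAN's underlying factorization, using scikit-learn on a toy corpus; it omits the semi-supervised, interactive steering that the system adds.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

docs = ["interactive visual analytics for text",
        "topic modeling of document collections",
        "nonnegative matrix factorization for topics",
        "user steering of interactive topic models"]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)
nmf = NMF(n_components=2, init="nndsvd", random_state=0)
W = nmf.fit_transform(X)     # document-topic weights
H = nmf.components_          # topic-term weights
terms = vec.get_feature_names_out()
for t, row in enumerate(H):
    top = [terms[i] for i in row.argsort()[::-1][:3]]
    print(f"topic {t}: {top}")
```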
ERIC Educational Resources Information Center
Wu, Xiaoyu; Gao, Yuan
2011-01-01
This paper applies the extended technology acceptance model (exTAM) in information systems research to the use of clickers in student learning. The technology acceptance model (TAM) posits that perceived ease of use and perceived usefulness of technology influence users' attitudes toward using and intention to use technology. Research subsequent…
Intelligent Context-Aware and Adaptive Interface for Mobile LBS
Liu, Yanhong
2015-01-01
Context-aware user interfaces play an important role in many human-computer interaction tasks of location-based services. Although spatial models for context-aware systems have been studied extensively, how to locate specific spatial information for users is still not well resolved, which is important in the mobile environment where location-based services users are impeded by device limitations. Better context-aware human-computer interaction models of mobile location-based services are needed not just to predict performance outcomes, such as whether people will be able to find the information needed to complete a human-computer interaction task, but to understand the human processes that interact in spatial query, which will in turn inform the detailed design of better user interfaces in mobile location-based services. In this study, a context-aware adaptive model for the mobile location-based services interface is proposed, which contains three major sections: purpose, adjustment, and adaptation. Based on this model, we describe the process of user operation and interface adaptation through the dynamic interaction between users and the interface. We then show how the model addresses users' demands in a complicated environment and demonstrate its feasibility through experimental results. PMID:26457077
Tang, Jessica Pui-Shan; Tse, Samson Shu-Ki; Davidson, Larry; Cheng, Patrick
2017-12-22
Current models of user participation in mental health services were developed within Western culture and thus may not be applicable to Chinese communities. To present a new model of user participation, which emerged from research within a Chinese community, for understanding the processes of and factors influencing user participation in a non-Western culture. Multiple qualitative methods, including focus groups, individual in-depth interviews, and photovoice, were applied within the framework of constructivist grounded theory and collaborative research. Diverging from conceptualizations of user participation with emphasis on civil rights and the individual as a central agent, participants in the study highlighted the interpersonal dynamics between service users and different players affecting the participation intensity and outcomes. They valued a reciprocal relationship with their caregivers in making treatment decisions, cooperated with staff to observe power hierarchies and social harmony, identified the importance of peer support in enabling service engagement and delivery, and emphasized professional facilitation in advancing involvement at the policy level. User participation in Chinese culture embeds dynamic interdependence. The proposed model adds this new dimension to the existing frameworks and calls for attention to the complex local ecology and cultural consistency in realizing user participation.
An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System
1989-09-01
residual and it is described as the residual divided by its standard deviation (13:App A,17). Neter, Wasserman, and Kutner, in Applied Linear Regression Models...others. Applied Linear Regression Models. Homewood IL: Irwin, 1983. 19. Raduchel, William J. "A Professional’s Perspective on User-Friendliness," Byte
ANNIE - INTERACTIVE PROCESSING OF DATA BASES FOR HYDROLOGIC MODELS.
Lumb, Alan M.; Kittle, John L.
1985-01-01
ANNIE is a data storage and retrieval system that was developed to reduce the time and effort required to calibrate, verify, and apply watershed models that continuously simulate water quantity and quality. Watershed models have three categories of input: parameters to describe segments of a drainage area, linkage of the segments, and time-series data. Additional goals for ANNIE include the development of software that is easily implemented on minicomputers and some microcomputers and software that has no special requirements for interactive display terminals. Another goal is for the user interaction to be based on the experience of the user so that ANNIE is helpful to the inexperienced user and yet efficient and brief for the experienced user. Finally, the code should be designed so that additional hydrologic models can easily be added to ANNIE.
A study on the attitude of use the mobile clinic registration system in Taiwan.
Lai, Yi-Horng; Huang, Fen-Fen; Yang, Hsieh-Hua
2015-01-01
Mobile apps provide diverse services and various convenient functions. This study applied the modified technology acceptance model (MTAM) in information systems research to the use of the mobile hospital registration system in Taiwan. The MTAM posits that perceived ease of use and perceived usefulness of technology influence users' attitudes toward using technology. Research studies using MTAM have identified information technology experience as a factor in predicting attitude. The objective of the present study is to test the validity of the MTAM when it is applied to the mobile registration system. The data were collected from 501 patients at a medical center in Taiwan. Path analysis results show that the model is applicable for examining factors influencing users' attitudes toward using the mobile registration system. Perceived usefulness and perceived ease of use are positively associated with users' attitudes toward using the mobile registration system and can improve those attitudes. In addition, perceived ease of use is positively associated with perceived usefulness. As for personal prior experience, information technology experience is positively associated with perceived usefulness and perceived ease of use.
Coarse cluster enhancing collaborative recommendation for social network systems
NASA Astrophysics Data System (ADS)
Zhao, Yao-Dong; Cai, Shi-Min; Tang, Ming; Shang, Min-Sheng
2017-10-01
Traditional collaborative filtering based recommender systems for social network systems impose very high time complexity, because similarities must be computed for all pairs of users from resource usage and annotation actions, which strongly suppresses recommending speed. In this paper, to overcome this drawback, we propose a novel approach, namely coarse clustering, that partitions similar users and associated items at high speed to enhance user-based collaborative filtering, and then develop a fast collaborative user model for social tagging systems. The experimental results based on the Delicious dataset show that the proposed model is able to reduce the processing time cost by more than 90% and relatively improve the accuracy in comparison with ordinary user-based collaborative filtering, and is robust to the initial parameter. Most importantly, the proposed model can be conveniently extended by introducing more user information (e.g., profiles) and practically applied to large-scale social network systems to enhance recommending speed without accuracy loss.
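A rough sketch of the coarse-cluster idea, assuming users are first partitioned with k-means so that similarities are only computed within a user's cluster before standard user-based collaborative filtering; the interaction matrix is random toy data, not the Delicious dataset.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)
ratings = (rng.random((200, 50)) > 0.9).astype(float)   # user-item interactions

# Coarse step: cluster users so similarities are computed only within a cluster.
clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(ratings)

def recommend(user, k=3):
    members = np.where(clusters == clusters[user])[0]
    sims = cosine_similarity(ratings[user:user + 1], ratings[members])[0]
    scores = sims @ ratings[members]        # weighted votes from similar users
    scores[ratings[user] > 0] = -np.inf     # do not re-recommend known items
    return np.argsort(scores)[::-1][:k]

print(recommend(0))
```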
Thermal Network Modelling Handbook
NASA Technical Reports Server (NTRS)
1972-01-01
Thermal mathematical modelling is discussed in detail. A three-fold purpose was established: (1) to acquaint the new user with the terminology and concepts used in thermal mathematical modelling, (2) to present the more experienced and occasional user with quick formulas and methods for solving everyday problems, coupled with study cases which lend insight into the relationships that exist among the various solution techniques and parameters, and (3) to begin to catalog in an orderly fashion the common formulas which may be applied to automated conversational language techniques.
[War on Drugs or War against Health? The pitfalls for public health of Puerto Rican drug policy].
Santiago-Negrón, Salvador; Albizu-García, Carmen E
2003-03-01
Puerto Rico has followed the United States in adopting drug policy based on a criminal justice model that limits the opportunities to address problematic drug use through public health interventions. Demand for illegal drugs is controlled by criminalizing drug use and applying jail sentences for drug offenses. These strategies marginalize drug users and reduce opportunities to minimize health risks through public health measures. Production and sale of illegal drugs are criminalized with the intent of dissuading drug use, with adverse unintended health effects that impact both drug users and non-drug users in the community. The present work reviews the assumptions of the punitive prohibitionist model and its outcomes, which present themselves as public health challenges in Puerto Rico. It also presents the principles that should sustain a pragmatic drug policy addressing problematic drug use from a health and social perspective.
An Inter-Personal Information Sharing Model Based on Personalized Recommendations
NASA Astrophysics Data System (ADS)
Kamei, Koji; Funakoshi, Kaname; Akahani, Jun-Ichi; Satoh, Tetsuji
In this paper, we propose an inter-personal information sharing model among individuals based on personalized recommendations. In the proposed model, we define an information resource as shared between people when both of them consider it important --- not merely when they both possess it. In other words, the model defines the importance of information resources based on personalized recommendations from identifiable acquaintances. The proposed method is based on a collaborative filtering system that focuses on evaluations from identifiable acquaintances. It utilizes both user evaluations for documents and their contents. In other words, each user profile is represented as a matrix of credibility to the other users' evaluations on each domain of interests. We extended the content-based collaborative filtering method to distinguish other users to whom the documents should be recommended. We also applied a concept-based vector space model to represent the domain of interests instead of the previous method which represented them by a term-based vector space model. We introduce a personalized concept-base compiled from each user's information repository to improve the information retrieval in the user's environment. Furthermore, the concept-spaces change from user to user since they reflect the personalities of the users. Because of different concept-spaces, the similarity between a document and a user's interest varies for each user. As a result, a user receives recommendations from other users who have different view points, achieving inter-personal information sharing based on personalized recommendations. This paper also describes an experimental simulation of our information sharing model. In our laboratory, five participants accumulated a personal repository of e-mails and web pages from which they built their own concept-base. Then we estimated the user profiles according to personalized concept-bases and sets of documents which others evaluated. We simulated inter-personal recommendation based on the user profiles and evaluated the performance of the recommendation method by comparing the recommended documents to the result of the content-based collaborative filtering.
NASA Technical Reports Server (NTRS)
John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger
2002-01-01
CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. Of the approaches available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail dependent on the predictions required. Although GOMS has proven useful in HCI, tools to support the construction of GOMS models have not yet come into general use.
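For a flavor of GOMS-style prediction, the sketch below computes a serial Keystroke-Level Model time estimate; it does not reproduce the parallel, interleaved scheduling of CPM-GOMS in Apex, and the operator times are only the commonly cited approximate textbook values.

```python
# Approximate Keystroke-Level Model operator times in seconds (commonly cited
# textbook figures, used here purely for illustration).
KLM_TIMES = {"K": 0.28,   # keystroke / button press
             "P": 1.10,   # point with mouse
             "H": 0.40,   # home hands between keyboard and mouse
             "M": 1.35}   # mental preparation

def klm_estimate(operators):
    """Sum operator times for a serial operator sequence such as 'MHPK'."""
    return sum(KLM_TIMES[op] for op in operators)

# e.g., think, move hand to mouse, point at an ATM button, click it
print(round(klm_estimate("MHPK"), 2), "seconds")
```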
Unit Cohesion and the Surface Navy: Does Cohesion Affect Performance
1989-12-01
v. 68, 1968. Neter, J., Wasserman, W., and Kutner, M. H., Applied Linear Regression Models, 2d ed., Boston, MA: Irwin, 1989. Rand Corporation R-2607...Neter, J., Wasserman, W., and Kutner, M. H., Applied Linear Regression Models, 2d ed., Boston, MA: Irwin, 1989. SAS User’s Guide: Basics, Version 5 ed
Location contexts of user check-ins to model urban geo life-style patterns.
Hasan, Samiul; Ukkusuri, Satish V
2015-01-01
Geo-location data from social media offers us information, in new ways, to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e.g., location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) to characterize the patterns of user interests in different types of places and ii) to characterize the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins from users in New York City. The co-existence of several location contexts and the corresponding probabilities in a given pattern provide useful information about user interests and choices. It is found that geo life-style patterns have similar items: either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior.
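A minimal sketch of the topic-model idea, treating each user's bag of check-in location categories as a document and fitting LDA with scikit-learn; the toy check-in data and two-pattern setting are assumptions for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Each "document" is one user's bag of check-in location categories (toy data).
users = ["coffee office gym coffee office",
         "bar nightclub restaurant bar",
         "park museum restaurant coffee",
         "office gym office coffee park"]

vec = CountVectorizer()
X = vec.fit_transform(users)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X)          # per-user pattern mixtures
terms = vec.get_feature_names_out()
for t, row in enumerate(lda.components_):
    top = [terms[i] for i in row.argsort()[::-1][:3]]
    print(f"pattern {t}: {top}")
print(theta.round(2))                 # how strongly each user expresses each pattern
```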
Developing Access Control Model of Web OLAP over Trusted and Collaborative Data Warehouses
NASA Astrophysics Data System (ADS)
Fugkeaw, Somchart; Mitrpanont, Jarernsri L.; Manpanpanich, Piyawit; Juntapremjitt, Sekpon
This paper proposes the design and development of a Role-based Access Control (RBAC) model for Single Sign-On (SSO) Web-OLAP queries spanning multiple data warehouses (DWs). The model is based on PKI Authentication and Privilege Management Infrastructure (PMI); it presents a binding model of RBAC authorization based on the dimension privileges specified in an attribute certificate (AC) and user identification. In particular, the attribute mapping between DW user authentication and dimensional access privileges is illustrated. In our approach, we apply a multi-agent system to automate flexible and effective management of user authentication, role delegation, and system accountability. Finally, the paper culminates in the prototype system A-COLD (Access Control of web-OLAP over multiple DWs), which incorporates the OLAP features and authentication and authorization enforcement in a multi-user, multi-data warehouse environment.
CHISSL: A Human-Machine Collaboration Space for Unsupervised Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arendt, Dustin L.; Komurlu, Caner; Blaha, Leslie M.
We developed CHISSL, a human-machine interface that utilizes supervised machine learning in an unsupervised context to help the user group unlabeled instances by her own mental model. The user primarily interacts via correction (moving a misplaced instance into its correct group) or confirmation (accepting that an instance is placed in its correct group). Concurrent with the user's interactions, CHISSL trains a classification model guided by the user's grouping of the data. It then predicts the group of unlabeled instances and arranges some of these alongside the instances manually organized by the user. We hypothesize that this mode of human and machine collaboration is more effective than Active Learning, wherein the machine decides for itself which instances should be labeled by the user. We found supporting evidence for this hypothesis in a pilot study where we applied CHISSL to organize a collection of handwritten digits.
Predicting future protection of respirator users: Statistical approaches and practical implications.
Hu, Chengcheng; Harber, Philip; Su, Jing
2016-01-01
The purpose of this article is to describe a statistical approach for predicting a respirator user's fit factor in the future based upon results from initial tests. A statistical prediction model was developed based upon joint distribution of multiple fit factor measurements over time obtained from linear mixed effect models. The model accounts for within-subject correlation as well as short-term (within one day) and longer-term variability. As an example of applying this approach, model parameters were estimated from a research study in which volunteers were trained by three different modalities to use one of two types of respirators. They underwent two quantitative fit tests at the initial session and two on the same day approximately six months later. The fitted models demonstrated correlation and gave the estimated distribution of future fit test results conditional on past results for an individual worker. This approach can be applied to establishing a criterion value for passing an initial fit test to provide reasonable likelihood that a worker will be adequately protected in the future; and to optimizing the repeat fit factor test intervals individually for each user for cost-effective testing.
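A minimal sketch, assuming log fit factors with a random intercept per worker to capture within-subject correlation, fitted with statsmodels; the synthetic data and simple visit effect stand in for the richer short- and long-term variance structure described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
workers = np.repeat(np.arange(30), 4)          # 30 workers, 4 tests each
visit = np.tile([0, 0, 1, 1], 30)              # initial vs. six-month session
subject_effect = rng.normal(scale=0.3, size=30)[workers]
log_ff = 2.0 + 0.1 * visit + subject_effect + rng.normal(scale=0.2, size=workers.size)
df = pd.DataFrame({"log_fit_factor": log_ff, "visit": visit, "worker": workers})

# Random intercept per worker captures within-subject correlation of fit factors.
result = smf.mixedlm("log_fit_factor ~ visit", df, groups=df["worker"]).fit()
print(result.summary())
```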
Toward User Interfaces and Data Visualization Criteria for Learning Design of Digital Textbooks
ERIC Educational Resources Information Center
Railean, Elena
2014-01-01
User interface and data visualisation criteria are central issues in digital textbook design. However, when applying mathematical modelling of the learning process to the analysis of possible solutions, it can be observed that results differ. Mathematical learning views cognition on the basis of statistics and probability theory, graph…
Effects of Meteorological Data Quality on Snowpack Modeling
NASA Astrophysics Data System (ADS)
Havens, S.; Marks, D. G.; Robertson, M.; Hedrick, A. R.; Johnson, M.
2017-12-01
Detailed quality control of meteorological inputs is the most time-intensive component of running the distributed, physically-based iSnobal snow model, and the effect of data quality of the inputs on the model is unknown. The iSnobal model has been run operationally since WY2013, and is currently run in several basins in Idaho and California. The largest amount of user input during modeling is for the quality control of precipitation, temperature, relative humidity, solar radiation, wind speed and wind direction inputs. Precipitation inputs require detailed user input and are crucial to correctly model the snowpack mass. This research applies a range of quality control methods to meteorological input, from raw input with minimal cleaning, to complete user-applied quality control. The meteorological input cleaning generally falls into two categories. The first is global minimum/maximum and missing value correction that could be corrected and/or interpolated with automated processing. The second category is quality control for inputs that are not globally erroneous, yet are still unreasonable and generally indicate malfunctioning measurement equipment, such as temperature or relative humidity that remains constant, or does not correlate with daily trends observed at nearby stations. This research will determine how sensitive model outputs are to different levels of quality control and guide future operational applications.
Matsubara, Takamitsu; Morimoto, Jun
2013-08-01
In this study, we propose a multiuser myoelectric interface that can easily adapt to novel users. When a user performs different motions (e.g., grasping and pinching), different electromyography (EMG) signals are measured. When different users perform the same motion (e.g., grasping), different EMG signals are also measured. Therefore, designing a myoelectric interface that can be used by multiple users to perform multiple motions is difficult. To cope with this problem, we propose for EMG signals a bilinear model that is composed of two linear factors: 1) user dependent and 2) motion dependent. By decomposing the EMG signals into these two factors, the extracted motion-dependent factors can be used as user-independent features. We can construct a motion classifier on the extracted feature space to develop the multiuser interface. For novel users, the proposed adaptation method estimates the user-dependent factor through only a few interactions. The bilinear EMG model with the estimated user-dependent factor can extract the user-independent features from the novel user data. We applied our proposed method to a recognition task of five hand gestures for robotic hand control using four-channel EMG signals measured from subject forearms. Our method resulted in 73% accuracy, which was statistically significantly different from the accuracy of standard nonmultiuser interfaces, as the result of a two-sample t-test at a significance level of 1%.
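As a rough illustration of separating motion-dependent from user-dependent structure (not the paper's estimation procedure), the sketch below builds toy bilinear EMG-like data, extracts a shared subspace from training users via SVD, and projects a novel user's samples onto it.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_motions, n_feat = 5, 4, 8
# Toy bilinear data: each observation mixes a per-user factor with per-motion factors.
user_f = rng.normal(size=(n_users, 3))
motion_f = rng.normal(size=(n_motions, 3, n_feat))
X = np.einsum("uk,mkf->umf", user_f, motion_f) \
    + 0.05 * rng.normal(size=(n_users, n_motions, n_feat))

# Estimate a motion-related (user-independent) subspace from training users via SVD.
train = X[:4].reshape(4 * n_motions, n_feat)
_, _, Vt = np.linalg.svd(train, full_matrices=False)
basis = Vt[:3]                                    # shared subspace (3 components)

# A novel user's samples projected on this basis act as user-independent features.
novel_features = X[4].reshape(n_motions, n_feat) @ basis.T
print(novel_features.shape)                       # (4 motions, 3 features)
```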
Dynamic, stochastic models for congestion pricing and congestion securities.
DOT National Transportation Integrated Search
2010-12-01
This research considers congestion pricing under demand uncertainty. In particular, a robust optimization (RO) approach is applied to optimal congestion pricing problems under user equilibrium. A mathematical model is developed and an analysis perfor...
A review of game-theoretic models of road user behaviour.
Elvik, Rune
2014-01-01
This paper reviews game-theoretic models that have been developed to explain road user behaviour in situations where road users interact with each other. The paper includes the following game-theoretic models:
1. A general model of the interaction between road users and their possible reaction to measures improving safety (behavioural adaptation).
2. Choice of vehicle size as a Prisoners’ dilemma game.
3. Speed choice as a co-ordination game.
4. Speed compliance as a game between drivers and the police.
5. Merging into traffic from an acceleration lane as a mixed-strategy game.
6. Choice of level of attention in following situations as an evolutionary game.
7. Choice of departure time to avoid congestion as a variant of a Prisoners’ dilemma game.
8. Interaction between cyclists crossing the road and car drivers.
9. Dipping headlights at night well ahead of the point when glare becomes noticeable.
10. Choice of evasive action in a situation when cars are on collision course.
The models reviewed are different in many respects, but a common feature of the models is that they can explain how informal norms of behaviour can develop among road users and be sustained even if these informal norms violate the formal regulations of the traffic code. Game-theoretic models are not applicable to every conceivable interaction between road users or to situations in which road users choose behaviour without interacting with other road users. Nevertheless, it is likely that game-theoretic models can be applied more widely than they have been until now.
Understanding Deep Representations Learned in Modeling Users Likes.
Guntuku, Sharath Chandra; Zhou, Joey Tianyi; Roy, Sujoy; Lin, Weisi; Tsang, Ivor W
2016-08-01
Automatically understanding and discriminating different users' liking for an image is a challenging problem. This is because the relationship between image features (even semantic ones extracted by existing tools, viz., faces, objects, and so on) and users' likes is non-linear, influenced by several subtle factors. This paper presents a deep bi-modal knowledge representation of images based on their visual content and associated tags (text). A mapping step between the different levels of visual and textual representations allows for the transfer of semantic knowledge between the two modalities. Feature selection is applied before learning the deep representation to identify the important features for a user to like an image. The proposed representation is shown to be effective in discriminating users based on images they like and also in recommending images that a given user likes, outperforming the state-of-the-art feature representations by ∼15%-20%. Beyond this test-set performance, an attempt is made to qualitatively understand the representations learned by the deep architecture used to model user likes.
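A minimal sketch of the feature-selection-then-classification step on invented image features, using mutual information to pick informative features before a logistic regression on like/dislike labels; it does not implement the deep bi-modal representation itself.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 40))      # stand-in visual + tag features per image
y = (X[:, 3] + 0.5 * X[:, 17] + rng.normal(scale=0.5, size=300) > 0).astype(int)  # liked or not

clf = make_pipeline(SelectKBest(mutual_info_classif, k=10),
                    LogisticRegression(max_iter=1000))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
```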
Zradziński, Patryk; Karpowicz, Jolanta; Gryz, Krzysztof; Leszko, Wiesław
2017-06-27
Low frequency magnetic fields, inducing an electric field (Ein) inside conductive structures, may directly affect the human body, e.g., by electrostimulation in the nervous system. In addition, the spatial distribution and level of Ein are disturbed in tissues neighbouring a medical implant. Numerical models of a magneto-therapeutic applicator (emitting a sinusoidal magnetic field of frequency 100 Hz) and of the user of a hearing implant (based on bone conduction: Bonebridge type - IS-BB or BAHA (bone anchored hearing aid) type - IS-BAHA) were developed. Values of Ein were analyzed in the model of the implant user's head, e.g., a physiotherapist, placed next to the applicator. It was demonstrated that the use of IS-BB or IS-BAHA makes electromagnetic hazards significantly higher (up to 4-fold) compared to a person without an implant exposed to a spatially heterogeneous magnetic field. Hazards for IS-BAHA users are higher than those for IS-BB users. It was found that, applying the principles of Directive 2013/35/EU, the direct biophysical effects of exposure in hearing prosthesis users may exceed the relevant limits even at exposure to magnetic fields below the exposure limits. In contrast, applying the principles and limits set by Polish labour law or the International Commission on Non-Ionizing Radiation Protection (ICNIRP) guidelines, compliance with the exposure limits also ensures compliance with the relevant limits on the electric field induced in the body of a hearing implant user. It is necessary to assess electromagnetic hazards to hearing implant users individually, bearing in mind the significantly higher hazards they face compared to persons without implants and the differences between the hazard levels faced by users of implants with various structural or technological solutions. Med Pr 2017;68(4):469-477.
Constrained reduced-order models based on proper orthogonal decomposition
Reddy, Sohail R.; Freno, Brian Andrew; Cizmas, Paul G. A.; ...
2017-04-09
A novel approach is presented to constrain reduced-order models (ROM) based on proper orthogonal decomposition (POD). The Karush–Kuhn–Tucker (KKT) conditions were applied to the traditional reduced-order model to constrain the solution to user-defined bounds. The constrained reduced-order model (C-ROM) was applied and validated against the analytical solution to the first-order wave equation. C-ROM was also applied to the analysis of fluidized beds. Lastly, it was shown that the ROM and C-ROM produced accurate results and that C-ROM was less sensitive to error propagation through time than the ROM.
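A small sketch of a POD reduced-order reconstruction with bounds enforced afterwards by simple clipping; the paper instead derives the constraint through the KKT conditions, so this is only a stand-in on toy snapshot data.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
# Toy snapshot matrix: travelling Gaussian pulses, one snapshot per column.
snapshots = np.array([np.exp(-100 * (x - 0.1 * k) ** 2) for k in range(10)]).T

# POD basis from the snapshot matrix via SVD.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :4]                                  # keep 4 POD modes

new_state = np.exp(-100 * (x - 0.45) ** 2)        # state not in the snapshot set
coeffs = basis.T @ new_state                      # projection onto the POD basis
recon = basis @ coeffs                            # unconstrained ROM reconstruction

# Enforce user-defined bounds on the reconstruction (a simple projection;
# the paper applies the KKT conditions to build the constraint into the ROM).
recon_constrained = np.clip(recon, 0.0, 1.0)
print(float(recon.min()), float(recon_constrained.min()))
```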
Generic Software Architecture for Prognostics (GSAP) User Guide
NASA Technical Reports Server (NTRS)
Teubert, Christopher Allen; Daigle, Matthew John; Watkins, Jason; Sankararaman, Shankar; Goebel, Kai
2016-01-01
The Generic Software Architecture for Prognostics (GSAP) is a framework for applying prognostics. It makes applying prognostics easier by implementing many of the common elements across prognostic applications. The standard interface enables reuse of prognostic algorithms and models across systems using the GSAP framework.
Impact of remote sensing upon the planning, management, and development of water resources
NASA Technical Reports Server (NTRS)
Castruccio, P. A.; Loats, H. L.; Fowler, T. R.; Frech, S. L.
1975-01-01
Principal water resources users were surveyed to determine the impact of remote data streams on hydrologic computer models. Analysis of responses demonstrated that: most water resources effort suitable to remote sensing inputs is conducted through federal agencies or through federally stimulated research; and, most hydrologic models suitable to remote sensing data are federally developed. Computer usage by major water resources users was analyzed to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era.
NASA Astrophysics Data System (ADS)
Zhang, Qi; Kosaka, Michitaka; Shirahada, Kunio; Yabutani, Takashi
This paper proposes a new framework for the B to B collaboration process based on a concept of service. Service value, which gives users satisfaction with provided services, depends on the situation, user characteristics, and user objectives in seeking the service. Vargo proposed Service Dominant Logic (SDL), which determines service value according to “value in use”. This concept illustrates the importance of the relationship between the service itself and its situation. This relationship is analogous to electro-magnetic field theory in physics. We developed the concept of service fields to create service value based on an analogy with the electro-magnetic field. By applying this concept to B to B collaboration, a model of service value co-creation in the collaboration can be formulated. The collaboration can then be described by the four steps of the KIKI model (Knowledge sharing related to the service system, Identification of the service field, Knowledge creation for a new service idea, Implementation of the service idea). As an application to B to B collaboration, an energy-saving service business is reported to demonstrate the validity of the proposed collaboration model. This concept can be applied to make a collaboration process effective.
Photonic band gap structure simulator
Chen, Chiping; Shapiro, Michael A.; Smirnova, Evgenya I.; Temkin, Richard J.; Sirigiri, Jagadishwar R.
2006-10-03
A system and method for designing photonic band gap structures. The system and method provide a user with the capability to produce a model of a two-dimensional array of conductors corresponding to a unit cell. The model involves a linear equation. Boundary conditions representative of conditions at the boundary of the unit cell are applied to a solution of the Helmholtz equation defined for the unit cell. The linear equation can be approximated by a Hermitian matrix. An eigenvalue of the Helmholtz equation is calculated. One computational approach involves calculating finite differences. The model can include a symmetry element, such as a center of inversion, a rotation axis, or a mirror plane. A graphical user interface is provided for the user's convenience. A display presents to the user the calculated eigenvalue, corresponding to a photonic energy level in the Brillouin zone of the unit cell.
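For illustration only (this is not the patented simulator), the computational core described above can be sketched in a few lines of Python: a finite-difference discretization of the Laplacian on a square unit cell with periodic boundary conditions yields a Hermitian matrix whose lowest eigenvalues approximate photonic energy levels. The grid size and boundary treatment here are assumptions.

```python
# Illustrative sketch: finite-difference eigenvalues of the 2-D Laplace/Helmholtz
# operator on a square unit cell with periodic boundary conditions.
import numpy as np
import scipy.sparse as sp

n = 32                      # grid points per side (assumption)
h = 1.0 / n                 # unit-cell spacing

# 1-D periodic second-difference operator
d1 = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n), format="lil")
d1[0, n - 1] = 1.0          # periodic wrap-around entries
d1[n - 1, 0] = 1.0
d1 = d1.tocsr() / h**2

eye = sp.identity(n)
laplacian = sp.kron(d1, eye) + sp.kron(eye, d1)   # 2-D operator (Hermitian)

# Lowest eigenvalues of -Laplacian approximate the photonic levels
vals = np.linalg.eigvalsh((-laplacian).toarray())
print(np.round(vals[:6], 2))
```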
A mathematical model of medial consonant identification by cochlear implant users.
Svirsky, Mario A; Sagi, Elad; Meyer, Ted A; Kaiser, Adam R; Teoh, Su Wooi
2011-04-01
The multidimensional phoneme identification model is applied to consonant confusion matrices obtained from 28 postlingually deafened cochlear implant users. This model predicts consonant matrices based on these subjects' ability to discriminate a set of postulated spectral, temporal, and amplitude speech cues as presented to them by their device. The model produced confusion matrices that matched many aspects of individual subjects' consonant matrices, including information transfer for the voicing, manner, and place features, despite individual differences in age at implantation, implant experience, device and stimulation strategy used, as well as overall consonant identification level. The model was able to match the general pattern of errors between consonants, but not the full complexity of all consonant errors made by each individual. The present study represents an important first step in developing a model that can be used to test specific hypotheses about the mechanisms cochlear implant users employ to understand speech.
A mathematical model of medial consonant identification by cochlear implant users
Svirsky, Mario A.; Sagi, Elad; Meyer, Ted A.; Kaiser, Adam R.; Teoh, Su Wooi
2011-01-01
The multidimensional phoneme identification model is applied to consonant confusion matrices obtained from 28 postlingually deafened cochlear implant users. This model predicts consonant matrices based on these subjects’ ability to discriminate a set of postulated spectral, temporal, and amplitude speech cues as presented to them by their device. The model produced confusion matrices that matched many aspects of individual subjects’ consonant matrices, including information transfer for the voicing, manner, and place features, despite individual differences in age at implantation, implant experience, device and stimulation strategy used, as well as overall consonant identification level. The model was able to match the general pattern of errors between consonants, but not the full complexity of all consonant errors made by each individual. The present study represents an important first step in developing a model that can be used to test specific hypotheses about the mechanisms cochlear implant users employ to understand speech. PMID:21476674
Location Contexts of User Check-Ins to Model Urban Geo Life-Style Patterns
Hasan, Samiul; Ukkusuri, Satish V.
2015-01-01
Geo-location data from social media offers us information, in new ways, to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e.g., location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) to characterize the patterns of user interests in different types of places and ii) to characterize the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins of users from New York City. The co-existence of several location contexts and the corresponding probabilities in a given pattern provide useful information about user interests and choices. It is found that geo life-style patterns have similar items—either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior. PMID:25970430
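As a hedged illustration of the kind of probabilistic topic model described (not the authors' code; the check-in counts and category names are invented), one can fit a latent Dirichlet allocation model to a user-by-category count matrix and read off both the patterns and each user's mixture over them.

```python
# Minimal sketch of inferring "geo life-style" patterns with a topic model
# over check-in location categories (synthetic data).
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

categories = ["coffee shop", "gym", "bar", "museum", "park"]
# Hypothetical user-by-category check-in counts (rows = users)
counts = np.array([
    [12, 0, 1, 0, 3],
    [0, 9, 0, 1, 7],
    [2, 1, 14, 0, 0],
])

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
for k, topic in enumerate(lda.components_):
    top = [categories[i] for i in topic.argsort()[::-1][:3]]
    print(f"pattern {k}: {top}")          # most probable categories per pattern
print(lda.transform(counts))              # each user's mixture over patterns
```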
Identity-Based Authentication for Cloud Computing
NASA Astrophysics Data System (ADS)
Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao
Cloud computing is a recently developed technology for complex systems with massive-scale services shared among numerous users. Therefore, authentication of both users and services is a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied in cloud computing, becomes so complicated that users bear a heavy load in both computation and communication. Based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, this paper presents a new identity-based authentication protocol for cloud computing and services. Simulation testing shows that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This merit, together with the great scalability of our model, makes it well suited to the massive-scale cloud.
An Interactive Tool For Semi-automated Statistical Prediction Using Earth Observations and Models
NASA Astrophysics Data System (ADS)
Zaitchik, B. F.; Berhane, F.; Tadesse, T.
2015-12-01
We developed a semi-automated statistical prediction tool applicable to concurrent analysis or seasonal prediction of any time series variable in any geographic location. The tool was developed using Shiny, JavaScript, HTML and CSS. A user can extract a predictand by drawing a polygon over a region of interest on the provided user interface (global map). The user can select the Climatic Research Unit (CRU) precipitation or Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) as predictand. They can also upload their own predictand time series. Predictors can be extracted from sea surface temperature, sea level pressure, winds at different pressure levels, air temperature at various pressure levels, and geopotential height at different pressure levels. By default, reanalysis fields are applied as predictors, but the user can also upload their own predictors, including a wide range of compatible satellite-derived datasets. The package generates correlations of the variables selected with the predictand. The user also has the option to generate composites of the variables based on the predictand. Next, the user can extract predictors by drawing polygons over the regions that show strong correlations (composites). Then, the user can select some or all of the statistical prediction models provided. Provided models include Linear Regression models (GLM, SGLM), Tree-based models (bagging, random forest, boosting), Artificial Neural Network, and other non-linear models such as Generalized Additive Model (GAM) and Multivariate Adaptive Regression Splines (MARS). Finally, the user can download the analysis steps they used, such as the region they selected, the time period they specified, the predictand and predictors they chose and preprocessing options they used, and the model results in PDF or HTML format. Key words: Semi-automated prediction, Shiny, R, GLM, ANN, RF, GAM, MARS
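The tool itself is built with Shiny/R, but the workflow it automates can be sketched in a few lines of Python under stated assumptions (synthetic data; predictor names and the correlation threshold are invented): screen candidate predictor fields by correlation with the predictand, then fit one of the offered statistical models.

```python
# Hedged sketch of the workflow the tool automates: correlation screening
# followed by fitting one of the offered models (here a random forest).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
years = np.arange(1981, 2011)
predictors = pd.DataFrame({
    "sst_region_a": rng.normal(size=years.size),   # hypothetical predictor fields
    "slp_region_b": rng.normal(size=years.size),
    "u850_region_c": rng.normal(size=years.size),
}, index=years)
rainfall = 0.8 * predictors["sst_region_a"] + 0.2 * rng.normal(size=years.size)

# Step 1: correlation screening, mimicking the tool's correlation maps
corr = predictors.corrwith(rainfall)
selected = corr[corr.abs() > 0.3].index.tolist()    # threshold is an assumption
print("selected predictors:", selected)

# Step 2: fit a prediction model on the selected predictors
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(predictors[selected], rainfall)
print("in-sample R^2:", round(model.score(predictors[selected], rainfall), 2))
```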
Hypertext-based design of a user interface for scheduling
NASA Technical Reports Server (NTRS)
Woerner, Irene W.; Biefeld, Eric
1993-01-01
Operations Mission Planner (OMP) is an ongoing research project at JPL that utilizes AI techniques to create an intelligent, automated planning and scheduling system. The information space reflects the complexity and diversity of tasks necessary in most real-world scheduling problems. Thus the problem of the user interface is to present as much information as possible at a given moment and allow the user to quickly navigate through the various types of displays. This paper describes a design which applies the hypertext model to solve these user interface problems. The general paradigm is to provide maps and search queries to allow the user to quickly find an interesting conflict or problem, and then allow the user to navigate through the displays in a hypertext fashion.
Applying Utility Functions to Adaptation Planning for Home Automation Applications
NASA Astrophysics Data System (ADS)
Bratskas, Pyrros; Paspallis, Nearchos; Kakousis, Konstantinos; Papadopoulos, George A.
A pervasive computing environment typically comprises multiple embedded devices that may interact together and with mobile users. These users are part of the environment, and they experience it through a variety of devices embedded in the environment. This perception involves technologies which may be heterogeneous, pervasive, and dynamic. Due to the highly dynamic properties of such environments, the software systems running on them have to face problems such as user mobility, service failures, or resource and goal changes which may happen in an unpredictable manner. To cope with these problems, such systems must be autonomous and self-managed. In this chapter we deal with a special kind of a ubiquitous environment, a smart home environment, and introduce a user-preference-based model for adaptation planning. The model, which dynamically forms a set of configuration plans for resources, reasons automatically and autonomously, based on utility functions, on which plan is likely to best achieve the user's goals with respect to resource availability and user needs.
2012-08-01
calculation of the erosion rate is based on the United States Department of Agriculture (USDA) Universal Soil Loss Equation (USLE). ...to specifying the USLE input parameters, the user must select which method to use for computing the soil loss type (i.e., “SDR” or “Without SDR...”) Soil Model
Erosion Risk Management Tool (ERMiT) user manual (version 2006.01.18)
Peter R. Robichaud; William J. Elliot; Fredrick B. Pierson; David E. Hall; Corey A. Moffet; Louise E. Ashmun
2007-01-01
The decision of where, when, and how to apply the most effective post-fire erosion mitigation treatments requires land managers to assess the risk of damaging runoff and erosion events occurring after a fire. To aid in this assessment, the Erosion Risk Management Tool (ERMiT) was developed. This user manual describes the input parameters, input interface, model...
User-Adapted Recommendation of Content on Mobile Devices Using Bayesian Networks
NASA Astrophysics Data System (ADS)
Iwasaki, Hirotoshi; Mizuno, Nobuhiro; Hara, Kousuke; Motomura, Yoichi
Mobile devices, such as cellular phones and car navigation systems, are essential to daily life. People acquire necessary information and preferred content over communication networks anywhere, anytime. However, usability issues arise from the simplicity of user interfaces themselves. Thus, a recommendation of content that is adapted to a user's preference and situation will help the user select content. In this paper, we describe a method to realize such a system using Bayesian networks. This user-adapted mobile system is based on a user model that provides recommendation of content (i.e., restaurants, shops, and music that are suitable to the user and situation) and that learns incrementally based on accumulated usage history data. However, sufficient samples are not always guaranteed, since a user model would require combined dependency among users, situations, and contents. Therefore, we propose the LK method for modeling, which complements incomplete and insufficient samples using knowledge data, and CPT incremental learning for adaptation based on a small number of samples. In order to evaluate the methods proposed, we applied them to restaurant recommendations made on car navigation systems. The evaluation results confirmed that our model based on the LK method can be expected to provide better generalization performance than that of the conventional method. Furthermore, our system would require much less operation than current car navigation systems from the beginning of use. Our evaluation results also indicate that learning a user's individual preference through CPT incremental learning would be beneficial to many users, even with only a few samples. As a result, we have developed the technology of a system that becomes more adapted to a user the more it is used.
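A minimal sketch of the incremental-learning idea (this is not the authors' LK method or their CPT algorithm; the situations and contents are invented): keep Laplace-smoothed counts for P(content | situation) and update them one usage-history record at a time, so recommendations adapt from the very first samples.

```python
# Sketch of incremental conditional-probability-table (CPT) learning with
# Laplace-smoothed counts; names are hypothetical.
from collections import defaultdict

contents = ["ramen", "sushi", "cafe"]
counts = defaultdict(lambda: {c: 1.0 for c in contents})   # Laplace prior

def update(situation, chosen_content):
    """Incrementally update the CPT with one usage-history record."""
    counts[situation][chosen_content] += 1.0

def recommend(situation):
    """Return contents ranked by the current estimate of P(content | situation)."""
    table = counts[situation]
    total = sum(table.values())
    return sorted(((c, n / total) for c, n in table.items()), key=lambda x: -x[1])

update(("lunch", "driving"), "ramen")
update(("lunch", "driving"), "ramen")
update(("evening", "parked"), "cafe")
print(recommend(("lunch", "driving")))
```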
SSDA code to apply data assimilation in soil water flow modeling: Documentation and user manual
USDA-ARS?s Scientific Manuscript database
Soil water flow models are based on simplified assumptions about the mechanisms, processes, and parameters of water retention and flow. That causes errors in soil water flow model predictions. Data assimilation (DA) with the ensemble Kalman filter (EnKF) corrects modeling results based on measured s...
Applied Meteorology Unit (AMU)
NASA Technical Reports Server (NTRS)
Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Wheeler, Mark
2010-01-01
This report summarizes the Applied Meteorology Unit (AMU) activities for the first quarter of Fiscal Year 2010 (October - December 2009). A detailed project schedule is included in the Appendix. Included tasks are: (1) Peak Wind Tool for User Launch Commit Criteria (LCC), (2) Objective Lightning Probability Tool, Phase III, (3) Peak Wind Tool for General Forecasting, Phase II, (4) Upgrade Summer Severe Weather Tool in Meteorological Interactive Data Display System (MIDDS), (5) Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS) Update and Maintainability, (6) Verify 12-km resolution North American Model (MesoNAM) Performance, and (7) Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) Graphical User Interface.
Collaborative Filtering Based on Sequential Extraction of User-Item Clusters
NASA Astrophysics Data System (ADS)
Honda, Katsuhiro; Notsu, Akira; Ichihashi, Hidetomo
Collaborative filtering is a computational realization of “word-of-mouth” in a network community, in which the items preferred by “neighbors” are recommended. This paper proposes a new item-selection model for extracting user-item clusters from rectangular relation matrices, in which mutual relations between users and items are denoted in an alternative process of “liking or not”. A technique for sequential co-cluster extraction from rectangular relational data is given by combining the structural balancing-based user-item clustering method with a sequential fuzzy cluster extraction approach. The technique is then applied to the collaborative filtering problem, in which some items may be shared by several user clusters.
NETPATH-WIN: an interactive user version of the mass-balance model, NETPATH
El-Kadi, A. I.; Plummer, Niel; Aggarwal, P.
2011-01-01
NETPATH-WIN is an interactive user version of NETPATH, an inverse geochemical modeling code used to find mass-balance reaction models that are consistent with the observed chemical and isotopic composition of waters from aquatic systems. NETPATH-WIN was constructed to migrate NETPATH applications into the Microsoft WINDOWS® environment. The new version facilitates model utilization by eliminating difficulties in data preparation and results analysis of the DOS version of NETPATH, while preserving all of the capabilities of the original version. Through example applications, the note describes some of the features of NETPATH-WIN as applied to adjustment of radiocarbon data for geochemical reactions in groundwater systems.
Pragmatic Applications of RE-AIM for Health Care Initiatives in Community and Clinical Settings
Estabrooks, Paul E.
2018-01-01
The RE-AIM (Reach Effectiveness Adoption Implementation Maintenance) planning and evaluation framework has been applied broadly, but users often have difficulty in applying the model because of data collection needs across multiple domains and sources. Questions in the more common “who, what, where, how, when, and why” format may be an effective guide to ensure that individual participants, organization staff, and the perspectives of the setting are considered in planning and evaluation. Such a format can also help users in typical community and clinical settings to identify which outcomes are most valued and to focus limited measurement resources. Translations of RE-AIM that are easy to understand and apply are needed for application in real-world community and clinical settings where research and evaluation resources are limited. The purpose of this article is to provide simplified, pragmatic, user-centered and stakeholder-centered recommendations to increase the use of RE-AIM in community and clinical settings and in translational research. PMID:29300695
Prospero - A tool for organizing Internet resources
NASA Technical Reports Server (NTRS)
Neuman, B. C.
1992-01-01
This article describes Prospero, a distributed file system based on the Virtual System Model. Prospero provides tools to help users organize Internet resources. These tools allow users to construct customized views of available resources, while taking advantage of the structure imposed by others. Prospero provides a framework that can tie together various indexing services producing the fabric on which resource discovery techniques can be applied.
Gillingham, Philip
2016-01-01
Recent developments in digital technology have facilitated the recording and retrieval of administrative data from multiple sources about children and their families. Combined with new ways to mine such data using algorithms which can ‘learn’, it has been claimed that it is possible to develop tools that can predict which individual children within a population are most likely to be maltreated. The proposed benefit is that interventions can then be targeted to the most vulnerable children and their families to prevent maltreatment from occurring. As expertise in predictive modelling increases, the approach may also be applied in other areas of social work to predict and prevent adverse outcomes for vulnerable service users. In this article, a glimpse inside the ‘black box’ of predictive tools is provided to demonstrate how their development for use in social work may not be straightforward, given the nature of the data recorded about service users and service activity. The development of predictive risk modelling (PRM) in New Zealand is focused on as an example as it may be the first such tool to be applied as part of ongoing reforms to child protection services. PMID:27559213
Gillingham, Philip
2016-06-01
Recent developments in digital technology have facilitated the recording and retrieval of administrative data from multiple sources about children and their families. Combined with new ways to mine such data using algorithms which can 'learn', it has been claimed that it is possible to develop tools that can predict which individual children within a population are most likely to be maltreated. The proposed benefit is that interventions can then be targeted to the most vulnerable children and their families to prevent maltreatment from occurring. As expertise in predictive modelling increases, the approach may also be applied in other areas of social work to predict and prevent adverse outcomes for vulnerable service users. In this article, a glimpse inside the 'black box' of predictive tools is provided to demonstrate how their development for use in social work may not be straightforward, given the nature of the data recorded about service users and service activity. The development of predictive risk modelling (PRM) in New Zealand is focused on as an example as it may be the first such tool to be applied as part of ongoing reforms to child protection services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berg, Larry K.; Allwine, K Jerry; Rutz, Frederick C.
2004-08-23
A new modeling system has been developed to provide a non-meteorologist with tools to predict air pollution transport in regions of complex terrain. This system couples the Penn State/NCAR Mesoscale Model 5 (MM5) with Earth Tech’s CALMET-CALPUFF system using a unique Graphical User Interface (GUI) developed at Pacific Northwest National Laboratory. This system is most useful in data-sparse regions, where there are limited observations to initialize the CALMET model. The user is able to define the domain of interest, provide details about the source term, and enter a surface weather observation through the GUI. The system then generates initial conditions and time constant boundary conditions for use by MM5. MM5 is run and the results are piped to CALPUFF for the dispersion calculations. Contour plots of pollutant concentration are prepared for the user. The primary advantages of the system are the streamlined application of MM5 and CALMET, limited data requirements, and the ability to run the coupled system on a desktop or laptop computer. In comparison with data collected as part of a field campaign, the new modeling system shows promise that a full-physics mesoscale model can be used in an applied modeling system to effectively simulate locally thermally-driven winds with minimal observations as input. An unexpected outcome of this research was how well CALMET represented the locally thermally-driven flows.
Co-streaming classes: a follow-up study in improving the user experience to better reach users.
Hayes, Barrie E; Handler, Lara J; Main, Lindsey R
2011-01-01
Co-streaming classes have enabled library staff to extend open classes to distance education students and other users. Student evaluations showed that the model could be improved. Two areas required attention: audio problems experienced by online participants and staff teaching methods. Staff tested equipment and adjusted software configuration to improve user experience. Staff training increased familiarity with specialized teaching techniques and troubleshooting procedures. Technology testing and staff training were completed, and best practices were developed and applied. Class evaluations indicate improvements in classroom experience. Future plans include expanding co-streaming to more classes and on-going data collection, evaluation, and improvement of classes.
NASA Astrophysics Data System (ADS)
Madani, K.; Dinar, A.
2013-12-01
Tragedy of the commons is generally recognized as one of the possible destinies for common pool resources (CPRs). To avoid the tragedy of the commons and prolong the life of CPRs, users may show different behavioral characteristics and use different rationales for CPR planning and management. Furthermore, regulators may adopt different strategies for sustainable management of CPRs. The effectiveness of different regulatory exogenous management institutions cannot be evaluated through conventional CPR models, since these assume either that users base their behavior on individual rationality and adopt a selfish behavior (Nash behavior), or that users seek the system's optimal solution without giving priority to their own interests. Therefore, conventional models fail to reliably predict the outcome of CPR problems in which parties may have a range of behavioral characteristics, putting them somewhere in between the two types of behaviors traditionally considered. This work examines the effectiveness of different regulatory exogenous CPR management institutions through a user-based model (as opposed to a system-based model). The new modeling framework allows for consideration of the sensitivity of the results to different behavioral characteristics of interacting CPR users. The suggested modeling approach is applied to a benchmark groundwater management problem. Results indicate that some well-known exogenous management institutions (e.g., taxing) are ineffective for sustainable management of CPRs in most cases. Bankruptcy-based management can be helpful, but determination of the fair level of cutbacks remains challenging under this type of institution. Furthermore, some bankruptcy rules, such as the Constrained Equal Award (CEA) method, are more beneficial to wealthier users, failing to establish social justice. Quota-based and CPR status-based management perform as the most promising and robust regulatory exogenous institutions in prolonging the CPR's life and increasing the long-term benefits to its users.
Progressively consolidating historical visual explorations for new discoveries
NASA Astrophysics Data System (ADS)
Zhao, Kaiyu; Ward, Matthew O.; Rundensteiner, Elke A.; Higgins, Huong N.
2013-12-01
A significant task within data mining is to identify data models of interest. While facilitating exploration tasks, most visualization systems do not make use of all the data models that are generated during the exploration. In this paper, we introduce a system that allows the user to gain insights from the data space progressively by forming data models and consolidating the generated models on the fly. Each model can be a computationally extracted or user-defined subset that contains a certain degree of interest and might lead to some discoveries. When the user generates more and more data models, the degree of interest of some portion of some models will either grow (indicating higher occurrence) or will fluctuate or decrease (corresponding to lower occurrence). Our system maintains a collection of such models and accumulates the interestingness of each model into a consolidated model. In order to consolidate the models, the system summarizes the associations between the models in the collection and identifies support (models reinforce each other), complementarity (models complement each other), and overlap of the models. The accumulated interestingness keeps track of the historical exploration and helps the user summarize their findings, which can lead to new discoveries. This mechanism for integrating results from multiple models can be applied to a wide range of decision support systems. We demonstrate our system in a case study involving the financial status of US companies.
Identifying the causes of road crashes in Europe
Thomas, Pete; Morris, Andrew; Talbot, Rachel; Fagerlind, Helen
2013-01-01
This research applies a recently developed model of accident causation, originally developed to investigate industrial accidents, to a specially gathered sample of 997 crashes investigated in-depth in 6 countries. Based on the work of Hollnagel, the model considers a collision to be a consequence of a breakdown in the interaction between road users, vehicles and the organisation of the traffic environment. 54% of road users experienced interpretation errors, while 44% made observation errors and 37% planning errors. In contrast to other studies, only 11% of drivers were identified as distracted and 8% inattentive. There was remarkably little variation in these errors between the main road user types. The application of the model to future in-depth crash studies offers the opportunity to identify new measures to improve safety and to mitigate the social impact of collisions. Examples given include the potential value of co-driver advisory technologies to reduce observation errors and predictive technologies to avoid conflicting interactions between road users. PMID:24406942
This technical report describes the new one-dimensional (1D) hydrodynamic and sediment transport model EFDC1D. This model can be applied to stream networks. The model code and two sample data sets are included on the distribution CD. EFDC1D can simulate bi-directional unstea...
NASA Astrophysics Data System (ADS)
Friedl, L. A.; Cox, L.
2008-12-01
The NASA Applied Sciences Program collaborates with organizations to discover and demonstrate applications of NASA Earth science research and technology to decision making. The desired outcome is for public and private organizations to use NASA Earth science products in innovative applications for sustained, operational uses to enhance their decisions. In addition, the program facilitates end-user feedback to Earth science to improve products and inform demands for research. The Program thus serves as a bridge between Earth science research and technology and the applied organizations and end-users with management, policy, and business responsibilities. Since 2002, the Applied Sciences Program has sponsored over 115 applications-oriented projects to apply Earth observations and model products to decision making activities. Projects have spanned numerous topics - agriculture, air quality, water resources, disasters, public health, aviation, etc. The projects have involved government agencies, private companies, universities, non-governmental organizations, and foreign entities in multiple types of teaming arrangements. The paper will examine this set of applications projects and present specific examples of successful use of Earth science in decision making. The paper will discuss scientific, organizational, and management factors that contribute to or impede the integration of the Earth science research in policy and management. The paper will also present new methods the Applied Sciences Program plans to implement to improve linkages between science and end users.
The secure authorization model for healthcare information system.
Hsu, Wen-Shin; Pan, Jiann-I
2013-10-01
Exploring healthcare systems for assisting medical services or transmitting patients' personal health information in web applications has been widely investigated. Information and communication technologies have been applied to medical services and the healthcare area for a number of years to resolve problems in medical management. In a healthcare system, not all users are allowed to access all the information. Several authorization models for restricting users to access specific information with specific permissions have been proposed. However, as the number of users and the amount of information grow, the difficulty of administrating user authorization increases. This critical problem limits the widespread usage of healthcare systems. This paper proposes a role-based approach and extends it to deal with the authorization of information in the healthcare system. We propose a role-based authorization model which supports authorizations for different kinds of objects, and a new authorization domain. Based on this model, we discuss the issues and requirements of security in healthcare systems. The security issues for services shared between different healthcare industries are also discussed.
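A small, hedged sketch of the kind of role-based check with an authorization domain that the paper argues for (the roles, permissions, and domain names below are illustrative, not the paper's schema):

```python
# Minimal role-based authorization sketch with an authorization domain.
ROLE_PERMISSIONS = {
    "physician": {("read", "medical_record"), ("write", "medical_record")},
    "nurse": {("read", "medical_record")},
    "billing_clerk": {("read", "billing_record")},
}
USER_ROLES = {"alice": {"physician"}, "bob": {"nurse"}}
USER_DOMAIN = {"alice": "cardiology", "bob": "cardiology"}

def is_authorized(user, action, obj_type, obj_domain):
    """Grant access only if some role permits the action and the domains match."""
    if USER_DOMAIN.get(user) != obj_domain:
        return False
    return any((action, obj_type) in ROLE_PERMISSIONS[role]
               for role in USER_ROLES.get(user, ()))

print(is_authorized("bob", "write", "medical_record", "cardiology"))    # False
print(is_authorized("alice", "write", "medical_record", "cardiology"))  # True
```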
Nicholas L. Crookston; Donald C. E. Robinson; Sarah J. Beukema
2003-01-01
The Fire and Fuels Extension (FFE) to the Forest Vegetation Simulator (FVS) simulates fuel dynamics and potential fire behavior over time, in the context of stand development and management. This chapter presents the model's options, provides annotated examples, describes the outputs, and describes how to use and apply the model.
Models of railroad passenger-car requirements in the northeast corridor : volume II user's guide
DOT National Transportation Integrated Search
1976-09-30
Models and techniques for determining passenger-car requirements in railroad service were developed and applied by a research project of which this is the final report. The report is published in two volumes. The solution and analysis of the Northeas...
GeneOnEarth: fitting genetic PC plots on the globe.
Torres-Sánchez, Sergio; Medina-Medina, Nuria; Gignoux, Chris; Abad-Grau, María M; González-Burchard, Esteban
2013-01-01
Principal component (PC) plots have become widely used to summarize genetic variation of individuals in a sample. The similarity between genetic distance in PC plots and geographical distance has shown to be quite impressive. However, in most situations, individual ancestral origins are not precisely known or they are heterogeneously distributed; hence, they are hardly linked to a geographical area. We have developed GeneOnEarth, a user-friendly web-based tool to help geneticists understand whether a linear isolation-by-distance model may apply to a genetic data set; thus, whether genetic distances among a set of individuals resemble geographical distances among their origins. Its main goal is to allow users to first apply a by-view Procrustes method to visually learn whether this model holds. To do that, the user can choose the exact geographical area from an online 2D or 3D world map by using, respectively, Google Maps or Google Earth, and rotate, flip, and resize the images. GeneOnEarth can also compute the optimal rotation angle using Procrustes analysis and assess statistical evidence of similarity when a different rotation angle has been chosen by the user. An online version of GeneOnEarth is available for testing and use at http://bios.ugr.es/GeneOnEarth.
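The analytical Procrustes step can be illustrated with SciPy (synthetic coordinates; this is not the GeneOnEarth code): the disparity returned by the fit measures how well genetic PC space matches geography after the optimal rotation, scaling, and translation.

```python
# Hedged sketch of a Procrustes fit between genetic PC coordinates and
# geographic coordinates; small disparity suggests isolation by distance.
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(2)
geo = rng.uniform(size=(50, 2))                         # scaled longitude/latitude per individual
theta = np.deg2rad(30)                                  # unknown rotation of PC space (assumption)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
pcs = geo @ rot.T + rng.normal(scale=0.02, size=geo.shape)   # PC1/PC2 per individual

geo_std, pcs_fit, disparity = procrustes(geo, pcs)      # optimal rotation/scale/translation
print("Procrustes disparity:", round(disparity, 4))     # small => genetics resembles geography
```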
Ghomi, Haniyeh; Bagheri, Morteza; Fu, Liping; Miranda-Moreno, Luis F
2016-11-16
The main objective of this study is to identify the main factors associated with injury severity of vulnerable road users (VRUs) involved in accidents at highway railroad grade crossings (HRGCs) using data mining techniques. This article applies an ordered probit model, association rules, and classification and regression tree (CART) algorithms to the U.S. Federal Railroad Administration's (FRA) HRGC accident database for the period 2007-2013 to identify VRU injury severity factors at HRGCs. The results show that train speed is a key factor influencing injury severity. Further analysis illustrated that the presence of illumination does not reduce the severity of accidents for high-speed trains. In addition, there is a greater propensity toward fatal accidents for elderly road users compared to younger individuals. Interestingly, at night, injury accidents involving female road users are more severe compared to those involving males. The ordered probit model was the primary technique, with CART and association rules serving to support it and to identify interactions between variables. All three algorithms' results consistently show that the most influential accident factors are train speed, VRU age, and gender. The findings of this research could be applied to identify high-risk hotspots and to develop cost-effective countermeasures targeting VRUs at HRGCs.
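An illustrative ordered probit fit on synthetic data (not the FRA database; the variable names, coefficients, and severity cut-offs below are invented) shows the form of the primary model used:

```python
# Sketch of an ordered probit model of injury severity with train speed,
# age, and gender as covariates (synthetic data).
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "train_speed": rng.uniform(10, 80, n),
    "age": rng.integers(15, 90, n),
    "female": rng.integers(0, 2, n),
})
latent = 0.04 * df["train_speed"] + 0.01 * df["age"] + rng.normal(size=n)
df["severity"] = pd.cut(latent, [-np.inf, 1.5, 2.5, np.inf],
                        labels=["minor", "injury", "fatal"], ordered=True)

model = OrderedModel(df["severity"], df[["train_speed", "age", "female"]],
                     distr="probit")
print(model.fit(method="bfgs", disp=False).summary())
```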
Chien, Tsair-Wei; Shao, Yang; Kuo, Shu-Chun
2017-01-10
Many continuous item responses (CIRs) are encountered in healthcare settings, but item response theory's (IRT) probabilistic modeling has not been used to present graphical interpretations of CIR results. A computer module programmed to deal with CIRs is required. The aims were to present a computer module, validate it, and verify its usefulness in dealing with CIR data, and then to apply the model to real healthcare data in order to show how CIR can be applied to healthcare settings, with an example concerning a safety attitude survey. Using Microsoft Excel VBA (Visual Basic for Applications), we designed a computer module that minimizes the residuals and calculates the model's expected scores according to person responses across items. Rasch models based on a Wright map and on KIDMAP were demonstrated to interpret the results of the safety attitude survey. The author-made CIR module yielded OUTFIT mean square (MNSQ) and person measures equivalent to those yielded by the professional Rasch Winsteps software. The probabilistic modeling of the CIR module provides messages that are much more valuable to users and shows the advantage of CIR over classical test theory. Because of advances in computer technology, healthcare users who are familiar with MS Excel can easily apply the study's CIR module to deal with continuous variables, benefiting comparisons of data with a logistic distribution and model fit statistics.
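A hedged re-expression of the module's core idea in Python rather than Excel VBA (the item difficulties and responses below are invented): each continuous response, rescaled to 0-1, is given a logistic expected score, and a person measure is estimated by minimizing the squared residuals across items.

```python
# Sketch of estimating a person measure from continuous item responses by
# residual minimization against logistic expected scores.
import numpy as np
from scipy.optimize import minimize_scalar

item_difficulty = np.array([-1.0, -0.3, 0.2, 0.8, 1.5])     # logits (assumed known)
responses = np.array([0.95, 0.80, 0.70, 0.55, 0.35])        # one person's rescaled CIRs

def expected_score(theta, b):
    """Logistic expected score for ability theta and difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def sum_sq_residuals(theta):
    return np.sum((responses - expected_score(theta, item_difficulty)) ** 2)

theta_hat = minimize_scalar(sum_sq_residuals, bounds=(-6, 6), method="bounded").x
residuals = responses - expected_score(theta_hat, item_difficulty)
print("person measure (logits):", round(theta_hat, 2))
print("residuals:", np.round(residuals, 3))
```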
The iCARE R package allows researchers to quickly build models for absolute risk and apply them to estimate an individual's risk of developing disease during a specified time interval, based on a set of user-defined input parameters.
Choi, Wona; Rho, Mi Jung; Park, Jiyun; Kim, Kwang-Jum; Kwon, Young Dae; Choi, In Young
2013-06-01
Intensified competitiveness in the healthcare industry has increased the number of healthcare centers and propelled the introduction of customer relationship management (CRM) systems to meet diverse customer demands. This study aimed to develop the information system success model of the CRM system by investigating previously proposed indicators within the model. The evaluation areas of the CRM system include three areas: the system characteristics area (system quality, information quality, and service quality), the user area (perceived usefulness and user satisfaction), and the performance area (personal performance and organizational performance). Detailed evaluation criteria for the three areas were developed, and their validity was verified by a survey administered to CRM system users in 13 nationwide health promotion centers. The survey data were analyzed by the structural equation modeling method, and the results confirmed that the model is feasible. Information quality and service quality showed a statistically significant relationship with perceived usefulness and user satisfaction. Consequently, perceived usefulness and user satisfaction had a significant influence on individual performance as well as an indirect influence on organizational performance. This study extends the research area on information success from general information systems to CRM systems in health promotion centers by applying a previous information success model. This lays a foundation for evaluating health promotion center systems and provides a useful guide for the successful implementation of hospital CRM systems.
Choi, Wona; Rho, Mi Jung; Park, Jiyun; Kim, Kwang-Jum; Kwon, Young Dae
2013-01-01
Objectives: Intensified competitiveness in the healthcare industry has increased the number of healthcare centers and propelled the introduction of customer relationship management (CRM) systems to meet diverse customer demands. This study aimed to develop the information system success model of the CRM system by investigating previously proposed indicators within the model. Methods: The evaluation areas of the CRM system include three areas: the system characteristics area (system quality, information quality, and service quality), the user area (perceived usefulness and user satisfaction), and the performance area (personal performance and organizational performance). Detailed evaluation criteria for the three areas were developed, and their validity was verified by a survey administered to CRM system users in 13 nationwide health promotion centers. The survey data were analyzed by the structural equation modeling method, and the results confirmed that the model is feasible. Results: Information quality and service quality showed a statistically significant relationship with perceived usefulness and user satisfaction. Consequently, perceived usefulness and user satisfaction had a significant influence on individual performance as well as an indirect influence on organizational performance. Conclusions: This study extends the research area on information success from general information systems to CRM systems in health promotion centers by applying a previous information success model. This lays a foundation for evaluating health promotion center systems and provides a useful guide for the successful implementation of hospital CRM systems. PMID:23882416
ERIC Educational Resources Information Center
Torres, Francisco; Neira Tovar, Leticia A.; del Rio, Marta Sylvia
2017-01-01
This study aims to explore the results of welding virtual training performance, designed using a learning model based on cognitive and usability techniques and applying an immersive concept focused on the person's attention. Moreover, it was also intended to demonstrate that there exists a moderating effect of performance improvement when the user experience is taken…
Polcicová, Gabriela; Tino, Peter
2004-01-01
We introduce topographic versions of two latent class models (LCM) for collaborative filtering. Latent classes are topologically organized on a square grid. Topographic organization of latent classes makes orientation in the rating/preference patterns captured by the latent classes easier and more systematic. The variation in film rating patterns is modelled by multinomial and binomial distributions with varying independence assumptions. In the first stage of topographic LCM construction, self-organizing maps with a neural field organized according to the LCM topology are employed. We apply our system to a large collection of user ratings for films. The system can provide useful visualization plots unveiling user preference patterns buried in the data, without losing potential to be a good recommender model. It appears that the multinomial distribution is most adequate if the model is regularized by tight grid topologies. Since we deal with probabilistic models of the data, we can readily use tools from probability and information theories to interpret and visualize information extracted by our system.
A New Approach to Predict user Mobility Using Semantic Analysis and Machine Learning.
Fernandes, Roshan; D'Souza G L, Rio
2017-10-19
Mobility prediction is a technique in which the future location of a user is identified in a given network. Mobility prediction provides solutions to many day-to-day problems. It helps in seamless handovers in wireless networks, provides better location-based services, and supports path recalculation in Mobile Ad hoc Networks (MANETs). In the present study, a framework is presented which predicts user mobility in the presence and absence of mobility history. A Naïve Bayesian classification algorithm and a Markov model are used to predict a user's future location when mobility history is available. An attempt is made to predict the user's future location using Short Message Service (SMS) content and instantaneous geological coordinates in the absence of mobility patterns. The proposed technique compares performance metrics with the commonly used Markov chain model. From the experimental results it is evident that the techniques used in this work give better results when considering both spatial and temporal information. The proposed method also predicts a user's future location fairly well in the absence of mobility history. The proposed work is applied to predict the mobility of medical rescue vehicles and social security systems.
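A minimal sketch of the baseline first-order Markov predictor referred to above (the location history is invented): transition counts are tallied from past movements and the most frequent successor of the current location is returned.

```python
# First-order Markov mobility prediction from a toy location history.
from collections import Counter, defaultdict

history = ["home", "office", "gym", "home", "office", "cafe", "home", "office", "gym"]

transitions = defaultdict(Counter)
for current, nxt in zip(history, history[1:]):
    transitions[current][nxt] += 1          # count observed transitions

def predict_next(location):
    """Return the most frequent successor of `location`, or None if unseen."""
    successors = transitions.get(location)
    return successors.most_common(1)[0][0] if successors else None

print(predict_next("office"))   # 'gym' for this toy history
```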
ERIC Educational Resources Information Center
Zan, Xinxing Anna; Yoon, Sang Won; Khasawneh, Mohammad; Srihari, Krishnaswami
2013-01-01
In an effort to develop a low-cost and user-friendly forecasting model to minimize forecasting error, we have applied average and exponentially weighted return ratios to project undergraduate student enrollment. We tested the proposed forecasting models with different sets of historical enrollment data, such as university-, school-, and…
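The return-ratio idea can be sketched as follows (the enrollment figures and smoothing weight are made up; this is not the authors' model): compute year-over-year ratios of enrollment, then project the next year using either their plain average or an exponentially weighted average that favors recent years.

```python
# Sketch of enrollment projection with average and exponentially weighted
# year-over-year return ratios (synthetic data).
import numpy as np

enrollment = np.array([4200, 4350, 4310, 4500, 4620, 4700], dtype=float)
ratios = enrollment[1:] / enrollment[:-1]            # year-over-year return ratios

alpha = 0.5                                          # smoothing weight (assumption)
weights = alpha * (1 - alpha) ** np.arange(len(ratios))[::-1]   # recent years weigh more
ewr = np.sum(weights * ratios) / np.sum(weights)     # exponentially weighted ratio

print(f"average-ratio forecast:       {enrollment[-1] * ratios.mean():.0f}")
print(f"exp.-weighted-ratio forecast: {enrollment[-1] * ewr:.0f}")
```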
A Compositional Relevance Model for Adaptive Information Retrieval
NASA Technical Reports Server (NTRS)
Mathe, Nathalie; Chen, James; Lu, Henry, Jr. (Technical Monitor)
1994-01-01
There is a growing need for rapid and effective access to information in large electronic documentation systems. Access can be facilitated if information relevant in the current problem solving context can be automatically supplied to the user. This includes information relevant to particular user profiles, tasks being performed, and problems being solved. However most of this knowledge on contextual relevance is not found within the contents of documents, and current hypermedia tools do not provide any easy mechanism to let users add this knowledge to their documents. We propose a compositional relevance network to automatically acquire the context in which previous information was found relevant. The model records information on the relevance of references based on user feedback for specific queries and contexts. It also generalizes such information to derive relevant references for similar queries and contexts. This model lets users filter information by context of relevance, build personalized views of documents over time, and share their views with other users. It also applies to any type of multimedia information. Compared to other approaches, it is less costly and doesn't require any a priori statistical computation, nor an extended training period. It is currently being implemented into the Computer Integrated Documentation system which enables integration of various technical documents in a hypertext framework.
Kristoffersen, Agnete Egilsdatter; Fønnebø, Vinjar; Norheim, Arne Johan
2008-10-01
Self-reported use of complementary and alternative medicine (CAM) among patients varies widely between studies, possibly because the definition of a CAM user is not comparable. This makes it difficult to compare studies. The aim of this study is to present a six-level model for classifying patients' reported exposure to CAM. Prayer, physical exercise, special diets, over-the-counter products/CAM techniques, and personal visits to a CAM practitioner are successively removed from the model in a reductive fashion. By applying the model to responses given by Norwegian patients with cancer, we found that 72% use CAM if the user was defined to include all types of CAM. This proportion was reduced successively to only 11% in the same patient group when a CAM user was defined as a user visiting a CAM practitioner four or more times. When considering a sample of 10 recently published studies of CAM use among patients with breast cancer, we found 98% use when the CAM user was defined to include all sorts of CAM. This proportion was reduced successively to only 20% when a CAM user was defined as a user of a CAM practitioner. We recommend future surveys of CAM use to report at more than one level and to clarify which intensity level of CAM use the report is based on.
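A sketch of how the reductive six-level classification could be operationalized (the field names and exact level cut-offs are an interpretation of the abstract, not the published instrument): each successive level drops one category of CAM exposure, so the proportion classified as users shrinks as the level number rises.

```python
# Illustrative six-level, reductive classification of CAM users.
def strictest_cam_level(p):
    """Return the most restrictive level (1-6) at which the patient still counts as a CAM user, else 0."""
    if p["practitioner_visits"] >= 4:
        return 6
    if p["practitioner_visits"] >= 1:
        return 5
    if p["otc_product_or_technique"]:
        return 4
    if p["special_diet"]:
        return 3
    if p["physical_exercise"]:
        return 2
    if p["prayer"]:
        return 1
    return 0

patients = [
    {"prayer": True, "physical_exercise": False, "special_diet": False,
     "otc_product_or_technique": False, "practitioner_visits": 0},
    {"prayer": True, "physical_exercise": True, "special_diet": False,
     "otc_product_or_technique": True, "practitioner_visits": 5},
]
for level in range(1, 7):
    users = sum(strictest_cam_level(p) >= level for p in patients)
    print(f"level {level}: {users}/{len(patients)} classified as CAM users")
```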
CLUSTERnGO: a user-defined modelling platform for two-stage clustering of time-series data.
Fidaner, Işık Barış; Cankorur-Cetinkaya, Ayca; Dikicioglu, Duygu; Kirdar, Betul; Cemgil, Ali Taylan; Oliver, Stephen G
2016-02-01
Simple bioinformatic tools are frequently used to analyse time-series datasets regardless of their ability to deal with transient phenomena, limiting the meaningful information that may be extracted from them. This situation requires the development and exploitation of tailor-made, easy-to-use and flexible tools designed specifically for the analysis of time-series datasets. We present a novel statistical application called CLUSTERnGO, which uses a model-based clustering algorithm that fulfils this need. This algorithm involves two components of operation: Component 1 constructs a Bayesian non-parametric model (Infinite Mixture of Piecewise Linear Sequences), and Component 2 applies a novel clustering methodology (Two-Stage Clustering). The software can also assign biological meaning to the identified clusters using an appropriate ontology. It applies multiple hypothesis testing to report the significance of these enrichments. The algorithm has a four-phase pipeline. The application can be executed using either command-line tools or a user-friendly Graphical User Interface. The latter has been developed to address the needs of both specialist and non-specialist users. We use three diverse test cases to demonstrate the flexibility of the proposed strategy. In all cases, CLUSTERnGO not only outperformed existing algorithms in assigning unique GO term enrichments to the identified clusters, but also revealed novel insights regarding the biological systems examined, which were not uncovered in the original publications. The C++ and Qt source codes, the GUI applications for Windows, OS X and Linux operating systems and the user manual are freely available for download under the GNU GPL v3 license at http://www.cmpe.boun.edu.tr/content/CnG. sgo24@cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
NASA Technical Reports Server (NTRS)
Mitchell, Paul H.
1991-01-01
F77NNS (FORTRAN 77 Neural Network Simulator) computer program simulates popular back-error-propagation neural network. Designed to take advantage of vectorization when used on computers having this capability, also used on any computer equipped with ANSI-77 FORTRAN Compiler. Problems involving matching of patterns or mathematical modeling of systems fit class of problems F77NNS designed to solve. Program has restart capability so neural network solved in stages suitable to user's resources and desires. Enables user to customize patterns of connections between layers of network. Size of neural network F77NNS applied to limited only by amount of random-access memory available to user.
BUMPER: the Bayesian User-friendly Model for Palaeo-Environmental Reconstruction
NASA Astrophysics Data System (ADS)
Holden, Phil; Birks, John; Brooks, Steve; Bush, Mark; Hwang, Grace; Matthews-Bird, Frazer; Valencia, Bryan; van Woesik, Robert
2017-04-01
We describe the Bayesian User-friendly Model for Palaeo-Environmental Reconstruction (BUMPER), a Bayesian transfer function for inferring past climate and other environmental variables from microfossil assemblages. The principal motivation for a Bayesian approach is that the palaeoenvironment is treated probabilistically, and can be updated as additional data become available. Bayesian approaches therefore provide a reconstruction-specific quantification of the uncertainty in the data and in the model parameters. BUMPER is fully self-calibrating, straightforward to apply, and computationally fast, requiring 2 seconds to build a 100-taxon model from a 100-site training-set on a standard personal computer. We apply the model's probabilistic framework to generate thousands of artificial training-sets under ideal assumptions. We then use these to demonstrate both the general applicability of the model and the sensitivity of reconstructions to the characteristics of the training-set, considering assemblage richness, taxon tolerances, and the number of training sites. We demonstrate general applicability to real data, considering three different organism types (chironomids, diatoms, pollen) and different reconstructed variables. In all of these applications an identically configured model is used, the only change being the input files that provide the training-set environment and taxon-count data.
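A minimal, hedged sketch of a Bayesian transfer function in this spirit (not the BUMPER code; the taxon optima, tolerances, and fossil counts are invented): Gaussian taxon response curves over a discretized environmental grid combine with a multinomial likelihood to give a posterior for the reconstructed variable, from which a mean and uncertainty follow directly.

```python
# Sketch of a Bayesian transfer function: posterior over an environmental
# variable given an assemblage, assuming Gaussian taxon response curves.
import numpy as np

env_grid = np.linspace(0, 30, 301)                    # candidate values (e.g., temperature)
optima = np.array([8.0, 14.0, 22.0])                  # assumed calibrated taxon optima
tolerances = np.array([3.0, 4.0, 3.5])                # assumed taxon tolerances
counts = np.array([5, 30, 2])                         # fossil assemblage counts

# Relative abundance of each taxon at each grid point (Gaussian responses)
response = np.exp(-0.5 * ((env_grid[:, None] - optima) / tolerances) ** 2)
p = response / response.sum(axis=1, keepdims=True)    # multinomial probabilities

log_post = counts @ np.log(p).T                       # flat prior + multinomial likelihood
post = np.exp(log_post - log_post.max())
post /= post.sum()

mean = np.sum(env_grid * post)
sd = np.sqrt(np.sum((env_grid - mean) ** 2 * post))
print(f"posterior mean: {mean:.1f} +/- {sd:.1f}")
```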
Evaluating Model-Driven Development for large-scale EHRs through the openEHR approach.
Christensen, Bente; Ellingsen, Gunnar
2016-05-01
In healthcare, the openEHR standard is a promising Model-Driven Development (MDD) approach for electronic healthcare records. This paper aims to identify key socio-technical challenges when the openEHR approach is put to use in Norwegian hospitals. More specifically, key fundamental assumptions are investigated empirically. These assumptions promise a clear separation of technical and domain concerns, users being in control of the modelling process, and widespread user commitment. Finally, these assumptions promise an easy way to model and map complex organizations. This longitudinal case study is based on an interpretive approach, whereby data were gathered through 440h of participant observation, 22 semi-structured interviews and extensive document studies over 4 years. The separation of clinical and technical concerns seemed to be aspirational, because both designing the technical system and modelling the domain required technical and clinical competence. Hence developers and clinicians found themselves working together in both arenas. User control and user commitment seemed not to apply in large-scale projects, as modelling the domain turned out to be too complicated and hence to appeal only to especially interested users worldwide, not the local end-users. Modelling proved to be a complex standardization process that shaped both the actual modelling and healthcare practice itself. A broad assemblage of contributors seems to be needed for developing an archetype-based system, in which roles, responsibilities and contributions cannot be clearly defined and delimited. The way MDD occurs has implications for medical practice per se in the form of the need to standardize practices to ensure that medical concepts are uniform across practices. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Modeling Freight Ocean Rail and Truck Transportation Flows to Support Policy Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, Jared Lee; Wang, Hao; Nozick, Linda Karen
Freight transportation represents about 9.5% of GDP, is responsible for about 8% of greenhouse gas emissions, and supports the import and export of about 3.6 trillion in international trade; hence it is important that our national freight transportation system is designed and operated efficiently and embodies user fees and other policies that balance costs and environmental consequences. This paper therefore develops a mathematical model to estimate international and domestic freight flows across ocean, rail, and truck modes, which can be used to study the impacts of changes in our infrastructure as well as the imposition of new user fees and changes in operating policies. This model is applied to two case studies: (1) a disruption of the maritime ports at Los Angeles/Long Beach similar to the impacts that would be felt in an earthquake; and (2) implementation of new user fees at the California ports.
Visualizing Dataflow Graphs of Deep Learning Models in TensorFlow.
Wongsuphasawat, Kanit; Smilkov, Daniel; Wexler, James; Wilson, Jimbo; Mane, Dandelion; Fritz, Doug; Krishnan, Dilip; Viegas, Fernanda B; Wattenberg, Martin
2018-01-01
We present a design study of the TensorFlow Graph Visualizer, part of the TensorFlow machine intelligence platform. This tool helps users understand complex machine learning architectures by visualizing their underlying dataflow graphs. The tool works by applying a series of graph transformations that enable standard layout techniques to produce a legible interactive diagram. To declutter the graph, we decouple non-critical nodes from the layout. To provide an overview, we build a clustered graph using the hierarchical structure annotated in the source code. To support exploration of nested structure on demand, we perform edge bundling to enable stable and responsive cluster expansion. Finally, we detect and highlight repeated structures to emphasize a model's modular composition. To demonstrate the utility of the visualizer, we describe example usage scenarios and report user feedback. Overall, users find the visualizer useful for understanding, debugging, and sharing the structures of their models.
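One of the transformations described, building a clustered overview from the hierarchical name scopes annotated in the source code, can be sketched in a few lines (the node names are illustrative; this is not the TensorFlow implementation):

```python
# Group dataflow nodes into clusters by their top-level name scope.
from collections import defaultdict

node_names = [
    "input/reshape", "conv1/weights", "conv1/conv2d", "conv1/relu",
    "conv2/weights", "conv2/conv2d", "softmax/logits", "softmax/probs",
]

clusters = defaultdict(list)
for name in node_names:
    scope = name.split("/")[0]          # top-level name scope = cluster
    clusters[scope].append(name)

for scope, members in clusters.items():
    print(f"{scope}: {len(members)} ops -> {members}")
```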
Water Energy Simulation Toolset
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Thuy; Jeffers, Robert
The Water-Energy Simulation Toolset (WEST) is an interactive simulation model that helps visualize the impacts of different stakeholders on the water quantity and quality of a watershed. The case study is applied to the Snake River Basin under the fictional name Cutthroat River Basin. There are four groups of stakeholders of interest: hydropower, agriculture, flood control, and environmental protection. Currently, the quality component depicts the nitrogen-nitrate contaminant. Users can easily interact with the model by changing certain inputs (climate change, fertilizer inputs, etc.) to observe the changes over the entire system. Users can also change certain parameters to test their management policies.
Systems and methods for knowledge discovery in spatial data
Obradovic, Zoran; Fiez, Timothy E.; Vucetic, Slobodan; Lazarevic, Aleksandar; Pokrajac, Dragoljub; Hoskinson, Reed L.
2005-03-08
Systems and methods are provided for knowledge discovery in spatial data as well as to systems and methods for optimizing recipes used in spatial environments such as may be found in precision agriculture. A spatial data analysis and modeling module is provided which allows users to interactively and flexibly analyze and mine spatial data. The spatial data analysis and modeling module applies spatial data mining algorithms through a number of steps. The data loading and generation module obtains or generates spatial data and allows for basic partitioning. The inspection module provides basic statistical analysis. The preprocessing module smoothes and cleans the data and allows for basic manipulation of the data. The partitioning module provides for more advanced data partitioning. The prediction module applies regression and classification algorithms on the spatial data. The integration module enhances prediction methods by combining and integrating models. The recommendation module provides the user with site-specific recommendations as to how to optimize a recipe for a spatial environment such as a fertilizer recipe for an agricultural field.
Validation of Fatigue Modeling Predictions in Aviation Operations
NASA Technical Reports Server (NTRS)
Gregory, Kevin; Martinez, Siera; Flynn-Evans, Erin
2017-01-01
Bio-mathematical fatigue models that predict levels of alertness and performance are one potential tool for use within integrated fatigue risk management approaches. A number of models have been developed that provide predictions based on acute and chronic sleep loss, circadian desynchronization, and sleep inertia. Some are publicly available and gaining traction in settings such as commercial aviation as a means of evaluating flight crew schedules for potential fatigue-related risks. Yet, most models have not been rigorously evaluated and independently validated for the operations to which they are being applied, and many users are not fully aware of the limitations within which model results should be interpreted and applied.
Draft user's guide for UDOT mechanistic-empirical pavement design.
DOT National Transportation Integrated Search
2009-10-01
Validation of the new AASHTO Mechanistic-Empirical Pavement Design Guides (MEPDG) nationally calibrated pavement distress and smoothness prediction models when applied under Utah conditions, and local calibration of the new hot-mix asphalt (HMA) p...
ERIC Educational Resources Information Center
Pek, Jolynn; Chalmers, R. Philip; Kok, Bethany E.; Losardo, Diane
2015-01-01
Structural equation mixture models (SEMMs), when applied as a semiparametric model (SPM), can adequately recover potentially nonlinear latent relationships without their specification. This SPM is useful for exploratory analysis when the form of the latent regression is unknown. The purpose of this article is to help users familiar with structural…
Information technology acceptance in health information management.
Abdekhoda, M; Ahmadi, M; Dehnad, A; Hosseini, A F
2014-01-01
User acceptance of information technology has been a significant area of research for more than two decades in the field of information technology. This study assessed the acceptance of information technology in the context of Health Information Management (HIM) by utilizing the Technology Acceptance Model (TAM), which was modified and applied to assess user acceptance of health information technology as well as the viability of TAM as a research construct in the context of HIM. This was a descriptive-analytical study in which a sample of 187 personnel, from a population of 363 personnel working in the medical records departments of hospitals affiliated to Tehran University of Medical Sciences, was selected. Users' perception of applying information technology was studied by a researcher-developed questionnaire. Collected data were analyzed by SPSS software (version 16) using descriptive statistics and regression analysis. The results suggest that TAM is a useful construct to assess user acceptance of information technology in the context of HIM. The findings also evidenced that perceived ease of use (PEOU) and perceived usefulness (PU) were positively associated with favorable users' attitudes towards HIM. PU was relatively more associated (r = 0.22, p = 0.05) than PEOU (r = 0.014, p = 0.05) with favorable user attitudes towards HIM. Users' perceptions of usefulness and ease of use are important determinants providing the incentive for users to accept information technologies when the application of a successful HIM system is attempted. The findings of the present study suggest that user acceptance is a key element and should subsequently be the major concern of health organizations and health policy makers.
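As a hedged illustration of the kind of analysis reported (correlating PEOU and PU with user attitude), the sketch below runs Pearson correlations on synthetic Likert-style data; the study itself used SPSS on real survey responses, and all values here are invented.

```python
# Minimal sketch of the kind of analysis reported above: correlating perceived
# usefulness (PU) and perceived ease of use (PEOU) scores with user attitude.
# The data here are synthetic; the study used SPSS on survey responses.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 187                                   # sample size reported in the abstract
pu = rng.integers(1, 6, n)                # 5-point Likert responses
peou = rng.integers(1, 6, n)
attitude = 0.4 * pu + 0.2 * peou + rng.normal(0, 1, n)

for name, x in [("PU", pu), ("PEOU", peou)]:
    r, p = pearsonr(x, attitude)
    print(f"{name}: r = {r:.3f}, p = {p:.3f}")
```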
A Conjoint Analysis Framework for Evaluating User Preferences in Machine Translation
Kirchhoff, Katrin; Capurro, Daniel; Turner, Anne M.
2013-01-01
Despite much research on machine translation (MT) evaluation, there is surprisingly little work that directly measures users’ intuitive or emotional preferences regarding different types of MT errors. However, the elicitation and modeling of user preferences is an important prerequisite for research on user adaptation and customization of MT engines. In this paper we explore the use of conjoint analysis as a formal quantitative framework to assess users’ relative preferences for different types of translation errors. We apply our approach to the analysis of MT output from translating public health documents from English into Spanish. Our results indicate that word order errors are clearly the most dispreferred error type, followed by word sense, morphological, and function word errors. The conjoint analysis-based model is able to predict user preferences more accurately than a baseline model that chooses the translation with the fewest errors overall. Additionally we analyze the effect of using a crowd-sourced respondent population versus a sample of domain experts and observe that main preference effects are remarkably stable across the two samples. PMID:24683295
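A minimal sketch of a conjoint-style estimation, assuming dummy/count-coded error profiles and synthetic ratings: part-worth (dis)utilities for each error type are recovered by ordinary least squares. It illustrates the general idea only, not the authors' exact design or data.

```python
# Hedged sketch of a conjoint-style analysis: estimate part-worth (dis)utilities
# of MT error types from user ratings of translation profiles via least squares.
# Profiles, counts and ratings below are made up for illustration only.
import numpy as np

# Columns: counts of word-order, word-sense, morphological, function-word errors
profiles = np.array([
    [2, 0, 1, 0],
    [0, 2, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 2, 2],
    [3, 1, 0, 0],
])
ratings = np.array([2.0, 3.5, 3.0, 4.0, 1.5])      # user preference ratings

X = np.column_stack([np.ones(len(profiles)), profiles])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
labels = ["intercept", "word order", "word sense", "morphology", "function word"]
for name, w in zip(labels, coef):
    print(f"{name:13s} {w:+.2f}")      # more negative = more dispreferred
```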
NASA Astrophysics Data System (ADS)
Breuer, Glynn E.
The purpose of this study was to determine whether applying Gilbert's Behavior Engineering Model to military tactical aviation organizations would foster effective user integration of retro-fit digital avionics in analog-instrumented flight decks. This study examined the relationship between the reported presence of environmental supports and personal repertory supports as defined by Gilbert and the reported self-efficacy of users of retro-fit digital avionics in analog flight decks, and examined the efficacious behaviors of users as they attain mastery of the equipment and procedures, as well as user-reported best practices and criteria for masterful performance in the use of retro-fit digital avionics and components. This study used a mixed methodology: quantitative surveys to measure the perceived level of organizational supports that foster mastery of retro-fit digital avionic components, and qualitative interviews to ascertain the efficacious behaviors and best practices of masterful users of these devices. The results of this study indicate that there is some relationship between the reported presence of organizational supports and personal repertory supports and the reported self-mastery and perceived organizational mastery of retro-fit digital avionics applied to the operation of the research aircraft. The primary recommendation is that unit leadership decide exactly the capabilities desired from retro-fit equipment, publish these standards, ensure training in these standards is effective, and evaluate performance based on these standards. Conclusions indicate that sufficient time and resources are available to the individual within the study population, and the organization as a whole, to apply Gilbert's criteria toward the mastery of retro-fit digital avionics applied to the operation of the research aircraft.
Ragonnet, Romain; Trauer, James M; Denholm, Justin T; Marais, Ben J; McBryde, Emma S
2017-05-30
Multidrug-resistant and rifampicin-resistant tuberculosis (MDR/RR-TB) represent an important challenge for global tuberculosis (TB) control. The high rates of MDR/RR-TB observed among re-treatment cases can arise from diverse pathways: de novo amplification during initial treatment, inappropriate treatment of undiagnosed MDR/RR-TB, relapse despite appropriate treatment, or reinfection with MDR/RR-TB. Mathematical modelling allows quantification of the contribution made by these pathways in different settings. This information provides valuable insights for TB policy-makers, allowing better contextualised solutions. However, mathematical modelling outputs need to consider local data and be easily accessible to decision makers in order to improve their usefulness. We present a user-friendly web-based modelling interface, which can be used by people without technical knowledge. Users can input their own parameter values and produce estimates for their specific setting. This innovative tool provides easy access to mathematical modelling outputs that are highly relevant to national TB control programs. In future, the same approach could be applied to a variety of modelling applications, enhancing local decision making.
Detection of Anomalous Insiders in Collaborative Environments via Relational Analysis of Access Logs
Chen, You; Malin, Bradley
2014-01-01
Collaborative information systems (CIS) are deployed within a diverse array of environments, ranging from the Internet to intelligence agencies to healthcare. It is increasingly the case that such systems are applied to manage sensitive information, making them targets for malicious insiders. While sophisticated security mechanisms have been developed to detect insider threats in various file systems, they are neither designed to model nor to monitor collaborative environments in which users function in dynamic teams with complex behavior. In this paper, we introduce a community-based anomaly detection system (CADS), an unsupervised learning framework to detect insider threats based on information recorded in the access logs of collaborative environments. CADS is based on the observation that typical users tend to form community structures, such that users with low affinity to such communities are indicative of anomalous and potentially illicit behavior. The model consists of two primary components: relational pattern extraction and anomaly detection. For relational pattern extraction, CADS infers community structures from CIS access logs, and subsequently derives communities, which serve as the CADS pattern core. CADS then uses a formal statistical model to measure the deviation of users from the inferred communities to predict which users are anomalies. To empirically evaluate the threat detection model, we perform an analysis with six months of access logs from a real electronic health record system in a large medical center, as well as a publicly-available dataset for replication purposes. The results illustrate that CADS can distinguish simulated anomalous users in the context of real user behavior with a high degree of certainty and with significant performance gains in comparison to several competing anomaly detection models. PMID:25485309
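The sketch below is a simplified illustration in the spirit of CADS, not the authors' implementation: users are scored by their nearest-neighbor similarity in a co-access matrix, so that users with low affinity to any community receive high anomaly scores. The access matrix is invented.

```python
# Simplified sketch in the spirit of CADS (not the authors' implementation):
# score users by how weakly they attach to the communities implied by shared
# record accesses; low nearest-neighbor similarity suggests an anomaly.
import numpy as np

def anomaly_scores(access, k=2):
    """access: binary user-by-record matrix built from CIS access logs."""
    norms = np.linalg.norm(access, axis=1, keepdims=True)
    unit = access / np.clip(norms, 1e-12, None)
    sim = unit @ unit.T                      # cosine similarity between users
    np.fill_diagonal(sim, -np.inf)           # ignore self-similarity
    nearest = np.sort(sim, axis=1)[:, -k:]   # k most similar users
    return 1.0 - nearest.mean(axis=1)        # higher = less community affinity

if __name__ == "__main__":
    logs = np.array([[1, 1, 0, 0],
                     [1, 1, 1, 0],
                     [1, 0, 1, 0],
                     [0, 0, 0, 1]])          # last user accesses unusual records
    print(anomaly_scores(logs).round(2))
```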
Applying Human-Centered Design Methods to Scientific Communication Products
NASA Astrophysics Data System (ADS)
Burkett, E. R.; Jayanty, N. K.; DeGroot, R. M.
2016-12-01
Knowing your users is a critical part of developing anything to be used or experienced by a human being. User interviews, journey maps, and personas are all techniques commonly employed in human-centered design practices because they have proven effective for informing the design of products and services that meet the needs of users. Many non-designers are unaware of the usefulness of personas and journey maps. Scientists who are interested in developing more effective products and communication can adopt and employ user-centered design approaches to better reach intended audiences. Journey mapping is a qualitative data-collection method that captures the story of a user's experience over time as related to the situation or product that requires development or improvement. Journey maps help define user expectations, where they are coming from, what they want to achieve, what questions they have, their challenges, and the gaps and opportunities that can be addressed by designing for them. A persona is a tool used to describe the goals and behavioral patterns of a subset of potential users or customers. The persona is a qualitative data model that takes the form of a character profile, built upon data about the behaviors and needs of multiple users. Gathering data directly from users avoids the risk of basing models on assumptions, which are often limited by misconceptions or gaps in understanding. Journey maps and user interviews together provide the data necessary to build the composite character that is the persona. Because a persona models the behaviors and needs of the target audience, it can then be used to make informed product design decisions. We share the methods and advantages of developing and using personas and journey maps to create more effective science communication products.
Regression Model Optimization for the Analysis of Experimental Data
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2009-01-01
A candidate math model search algorithm was developed at Ames Research Center that determines a recommended math model for the multivariate regression analysis of experimental data. The search algorithm is applicable to classical regression analysis problems as well as wind tunnel strain gage balance calibration analysis applications. The algorithm compares the predictive capability of different regression models using the standard deviation of the PRESS residuals of the responses as a search metric. This search metric is minimized during the search. Singular value decomposition is used during the search to reject math models that lead to a singular solution of the regression analysis problem. Two threshold dependent constraints are also applied. The first constraint rejects math models with insignificant terms. The second constraint rejects math models with near-linear dependencies between terms. The math term hierarchy rule may also be applied as an optional constraint during or after the candidate math model search. The final term selection of the recommended math model depends on the regressor and response values of the data set, the user's function class combination choice, the user's constraint selections, and the result of the search metric minimization. A frequently used regression analysis example from the literature is used to illustrate the application of the search algorithm to experimental data.
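A minimal sketch of the search metric described above, assuming synthetic data: candidate models are compared by the standard deviation of their PRESS (leave-one-out) residuals, e_i / (1 - h_ii), computed from the hat matrix.

```python
# Minimal sketch of the search metric described above: compare candidate
# regression models by the standard deviation of their PRESS residuals,
# e_i / (1 - h_ii), computed from the hat matrix. Data are synthetic.
import numpy as np

def press_std(X, y):
    H = X @ np.linalg.solve(X.T @ X, X.T)        # hat matrix
    resid = y - H @ y
    press = resid / (1.0 - np.diag(H))           # leave-one-out residuals
    return press.std(ddof=1)

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 30)
y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.normal(0, 0.1, 30)

candidates = {
    "linear":    np.column_stack([np.ones_like(x), x]),
    "quadratic": np.column_stack([np.ones_like(x), x, x**2]),
    "cubic":     np.column_stack([np.ones_like(x), x, x**2, x**3]),
}
for name, X in candidates.items():
    print(f"{name:9s} PRESS-residual std = {press_std(X, y):.4f}")
```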
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.
2014-10-01
Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.
2015-03-01
Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
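As a hedged illustration of one of the named BSA techniques, the frequency ratio, the sketch below computes FR per factor class as the share of hazard occurrences in the class divided by the share of area the class covers; the class and hazard arrays are invented.

```python
# Hedged sketch of one of the BSA techniques named above, the frequency ratio:
# FR for a factor class is the share of hazard occurrences in that class divided
# by the share of total area the class covers. Values here are illustrative.
import numpy as np

def frequency_ratio(class_ids, hazard):
    """class_ids: class label per pixel; hazard: 1 where an event occurred."""
    class_ids, hazard = np.asarray(class_ids), np.asarray(hazard)
    fr = {}
    for c in np.unique(class_ids):
        in_class = class_ids == c
        pct_hazard = hazard[in_class].sum() / max(hazard.sum(), 1)
        pct_area = in_class.sum() / class_ids.size
        fr[c] = pct_hazard / pct_area
    return fr

slope_class = [1, 1, 1, 2, 2, 2, 2, 3, 3, 3]
landslide   = [0, 0, 0, 1, 1, 0, 0, 1, 1, 1]
print(frequency_ratio(slope_class, landslide))
# FR > 1 indicates a class positively associated with the hazard
```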
Human-Robot Cooperation with Commands Embedded in Actions
NASA Astrophysics Data System (ADS)
Kobayashi, Kazuki; Yamada, Seiji
In this paper, we first propose a novel interaction model, CEA (Commands Embedded in Actions). It explains how some existing systems reduce their users' workload. We next extend CEA into the ECEA (Extended CEA) model, which enables robots to achieve more complicated tasks. For this extension, we employ the ACS (Action Coding System), which describes segmented human acts and clarifies the relationship between the user's actions and the robot's actions in a task. The ACS exploits CEA's strong point, which enables a user to send a command to a robot through his/her natural action for the task. The instance of the ECEA derived using the ACS is a temporal extension in which the user maintains the final state of a previous action. We apply this temporal extension of the ECEA to a sweeping task, realizing a high-level cooperative task between the user and the robot: a robot with simple reactive behavior can sweep the region under an object when the user picks up that object. In addition, we measure users' cognitive loads for the ECEA and a traditional method, DCM (Direct Commanding Method), in the sweeping task, and compare them. The results show that the ECEA imposes a significantly lower cognitive load than the DCM.
Human Systems Integration Competency Development for Navy Systems Commands
2012-09-01
KSA: fundamental cognizance of Applied Engineering/Psychology relative to knowledge engineering, training, teamwork, user interface design, and decision sciences; requirements (as required).
Probabilistic rainfall warning system with an interactive user interface
NASA Astrophysics Data System (ADS)
Koistinen, Jarmo; Hohti, Harri; Kauhanen, Janne; Kilpinen, Juha; Kurki, Vesa; Lauri, Tuomo; Nurmi, Pertti; Rossi, Pekka; Jokelainen, Miikka; Heinonen, Mari; Fred, Tommi; Moisseev, Dmitri; Mäkelä, Antti
2013-04-01
A real time 24/7 automatic alert system is in operational use at the Finnish Meteorological Institute (FMI). It consists of gridded forecasts of the exceedance probabilities of rainfall class thresholds in the continuous lead time range of 1 hour to 5 days. Nowcasting up to six hours applies ensemble member extrapolations of weather radar measurements. With 2.8 GHz processors using 8 threads it takes about 20 seconds to generate 51 radar-based ensemble members in a grid of 760 x 1226 points. Nowcasting also exploits lightning density and satellite-based pseudo rainfall estimates. The latter utilize the convective rain rate (CRR) estimate from Meteosat Second Generation. The extrapolation technique applies atmospheric motion vectors (AMV) originally developed for upper wind estimation with satellite images. Exceedance probabilities of four rainfall accumulation categories are computed for the future 1 h and 6 h periods and they are updated every 15 minutes. For longer forecasts, exceedance probabilities are calculated for future 6 and 24 h periods during the next 4 days. From approximately 1 hour to 2 days, the Poor Man's Ensemble Prediction System (PEPS) is used, applying, e.g., the high-resolution short-range Numerical Weather Prediction models HIRLAM and AROME. The longest forecasts apply EPS data from the European Centre for Medium-Range Weather Forecasts (ECMWF). The blending of the ensemble sets from the various forecast sources is performed by mixing accumulations with equal exceedance probabilities. The blending system contains a real-time adaptive estimator of the predictability of radar-based extrapolations. The uncompressed output data are written to file for each member, with a total size of 10 GB. Ensemble data from other sources (satellite, lightning, NWP) are converted to the same geometry as the radar data and blended as explained above. A verification system utilizing telemetering rain gauges has been established. Alert dissemination, e.g. for citizens and professional end users, applies SMS messages and, in the near future, smartphone maps. The present interactive user interface facilitates free selection of alert sites and two warning thresholds (any rain, heavy rain) at any location in Finland. The pilot service was tested by 1000-3000 users during the summers of 2010 and 2012. As an example of dedicated end-user services, gridded exceedance scenarios (of probabilities 5%, 50% and 90%) of hourly rainfall accumulations for the next 3 hours have been utilized as online input data for the influent model at the Greater Helsinki Wastewater Treatment Plant.
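A minimal sketch of the core product described, assuming a synthetic ensemble: the exceedance probability of a rainfall accumulation threshold is simply the fraction of ensemble members at or above that threshold in each grid cell.

```python
# Minimal sketch of the core product described above: the exceedance probability
# of a rainfall accumulation threshold estimated from ensemble members (here 51,
# matching the radar nowcast ensemble size mentioned in the abstract).
import numpy as np

def exceedance_probability(members, thresholds):
    """members: (n_members, ny, nx) accumulations; thresholds: mm values."""
    members = np.asarray(members)
    return {t: (members >= t).mean(axis=0) for t in thresholds}

rng = np.random.default_rng(7)
ensemble = rng.gamma(shape=1.5, scale=2.0, size=(51, 4, 4))   # fake 1 h totals, mm
probs = exceedance_probability(ensemble, thresholds=[1.0, 5.0, 10.0])
print(probs[5.0].round(2))        # grid of P(accumulation >= 5 mm)
```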
Studying User Income through Language, Behaviour and Affect in Social Media.
Preoţiuc-Pietro, Daniel; Volkova, Svitlana; Lampos, Vasileios; Bachrach, Yoram; Aletras, Nikolaos
2015-01-01
Automatically inferring user demographics from social media posts is useful for both social science research and a range of downstream applications in marketing and politics. We present the first extensive study where user behaviour on Twitter is used to build a predictive model of income. We apply non-linear methods for regression, i.e. Gaussian Processes, achieving strong correlation between predicted and actual user income. This allows us to shed light on the factors that characterise income on Twitter and analyse their interplay with user emotions and sentiment, perceived psycho-demographics and language use expressed through the topics of their posts. Our analysis uncovers correlations between different feature categories and income, some of which reflect common beliefs (e.g. higher perceived education and intelligence indicate higher earnings) and known differences (e.g. gender and age differences), while others show novel findings, e.g. higher income users express more fear and anger, whereas lower income users more often express emotion and opinions.
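As a hedged illustration of the modelling approach named above (Gaussian Process regression of income on user features), the sketch below fits scikit-learn's GaussianProcessRegressor to synthetic features; it is not the study's feature set or code.

```python
# Hedged sketch of the modelling approach named above: Gaussian Process
# regression of (log) income on user features. Features and incomes below are
# synthetic stand-ins for the Twitter-derived features used in the study.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))                     # e.g. topic/affect features
y = 10 + X @ np.array([0.8, -0.3, 0.5, 0.0, 0.2]) + rng.normal(0, 0.3, 200)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X[:150], y[:150])
pred = gp.predict(X[150:])
print("correlation:", np.corrcoef(pred, y[150:])[0, 1].round(3))
```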
Framework for End-User Programming of Cross-Smart Space Applications
Palviainen, Marko; Kuusijärvi, Jarkko; Ovaska, Eila
2012-01-01
Cross-smart space applications are specific types of software services that enable users to share information, monitor the physical and logical surroundings and control it in a way that is meaningful for the user's situation. For developing cross-smart space applications, this paper makes two main contributions: it introduces (i) a component design and scripting method for end-user programming of cross-smart space applications and (ii) a backend framework of components that interwork to support the brunt of the RDFScript translation, and the use and execution of ontology models. Before end-user programming activities, the software professionals must develop easy-to-apply Driver components for the APIs of existing software systems. Thereafter, end-users are able to create applications from the commands of the Driver components with the help of the provided toolset. The paper also introduces the reference implementation of the framework, tools for the Driver component development and end-user programming of cross-smart space applications and the first evaluation results on their application. PMID:23202169
Studying User Income through Language, Behaviour and Affect in Social Media
Preoţiuc-Pietro, Daniel; Volkova, Svitlana; Lampos, Vasileios; Bachrach, Yoram; Aletras, Nikolaos
2015-01-01
Automatically inferring user demographics from social media posts is useful for both social science research and a range of downstream applications in marketing and politics. We present the first extensive study where user behaviour on Twitter is used to build a predictive model of income. We apply non-linear methods for regression, i.e. Gaussian Processes, achieving strong correlation between predicted and actual user income. This allows us to shed light on the factors that characterise income on Twitter and analyse their interplay with user emotions and sentiment, perceived psycho-demographics and language use expressed through the topics of their posts. Our analysis uncovers correlations between different feature categories and income, some of which reflect common beliefs (e.g. higher perceived education and intelligence indicate higher earnings) and known differences (e.g. gender and age differences), while others show novel findings, e.g. higher income users express more fear and anger, whereas lower income users more often express emotion and opinions. PMID:26394145
The application of remote sensing to the development and formulation of hydrologic planning models
NASA Technical Reports Server (NTRS)
Fowler, T. R.; Castruccio, P. A.; Loats, H. L., Jr.
1977-01-01
The development of a remote sensing model and its efficiency in determining parameters of hydrologic models are reviewed. Procedures for extracting hydrologic data from LANDSAT imagery, and the visual analysis of composite imagery are presented. A hydrologic planning model is developed and applied to determine seasonal variations in watershed conditions. The transfer of this technology to a user community and contract arrangements are discussed.
Cimperman, Miha; Makovec Brenčič, Maja; Trkman, Peter
2016-06-01
Although telehealth offers an improved approach to providing healthcare services, its adoption by end users remains slow. With an older population as the main target, these traditionally conservative users pose a big challenge to the successful implementation of innovative telehealth services. The objective of this study was to develop and empirically test a model for predicting the factors affecting older users' acceptance of Home Telehealth Services (HTS). A survey instrument was administered to 400 participants aged 50 years and above from both rural and urban environments in Slovenia. Structural equation modeling was applied to analyze the causal effect of seven hypothesized predicting factors. HTS were introduced as a bundle of functionalities, representing future services that currently do not exist. This enabled users' perceptions to be measured on the conceptual level, rather than attitudes to a specific technical solution. Six relevant predictors were confirmed in older users' HTS acceptance behavior, with Performance Expectancy (r=0.30), Effort Expectancy (r=0.49), Facilitating Conditions (r=0.12), and Perceived Security (r=0.16) having a direct impact on behavioral intention to use HTS. In addition, Computer Anxiety is positioned as an antecedent of Effort Expectancy with a strong negative influence (r=-0.61), and Doctor's Opinion showed a strong influence on Performance Expectancy (r=0.31). The results also indicate that Social Influence is an irrelevant predictor of acceptance behavior. The model of six predictors explained 77% of the total variance in older adults' measured Behavioral Intention to Use HTS. The level at which HTS are perceived as easy to use and manage is the leading predictor of older users' HTS acceptance. Together with Perceived Usefulness and Perceived Security, these three factors represent the key influences on older people's HTS acceptance behavior. When promoting HTS, interventions should focus on portraying the services as secure. Marketing interventions should also focus on promoting HTS among health professionals, using them as social agents to frame the services as useful and beneficial. The important role of computer anxiety may result in a need to use different equipment, such as a tablet computer, to access HTS. Finally, this paper introduces important methodological guidelines for measuring perceptions, on a conceptual level, of future services that currently do not exist. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Persuasive Conversational Agent with Persuasion Tactics
NASA Astrophysics Data System (ADS)
Narita, Tatsuya; Kitamura, Yasuhiko
Persuasive conversational agents persuade people to change their attitudes or behaviors through conversation, and are expected to be applied as virtual sales clerks in e-shopping sites. As an approach to creating such an agent, we have developed a learning agent with the Wizard of Oz method, in which a person called the Wizard talks to the user while pretending to be the agent. The agent observes the conversations between the Wizard and the user, and learns how to persuade people. In this method, the Wizard has to reply to most of the user's inputs at the beginning, but the burden gradually falls because the agent learns how to reply as the conversation model grows.
New similarity of triangular fuzzy number and its application.
Zhang, Xixiang; Ma, Weimin; Chen, Liping
2014-01-01
The similarity of triangular fuzzy numbers is an important metric for their application. There exist several approaches to measuring the similarity of triangular fuzzy numbers; however, some of them tend to yield values that are too large. To make the similarity well distributed, a new method, SIAM (Shape's Indifferent Area and Midpoint), for measuring triangular fuzzy numbers is put forward, which takes the shape's indifferent area and the midpoint of two triangular fuzzy numbers into consideration. Comparison with other similarity measurements shows the effectiveness of the proposed method. Then, it is applied to collaborative filtering recommendation to measure users' similarity. A collaborative filtering case is used to illustrate users' similarity based on the cloud model and triangular fuzzy numbers; the result indicates that users' similarity based on triangular fuzzy numbers can obtain better discrimination. Finally, a simulated collaborative filtering recommendation system is developed which uses the cloud model and triangular fuzzy numbers to express users' comprehensive evaluation of items, and the result shows that the accuracy of collaborative filtering recommendation based on triangular fuzzy numbers is higher.
NASA Astrophysics Data System (ADS)
Made Tirta, I.; Anggraeni, Dian
2018-04-01
Statistical models have been developed rapidly in various directions to accommodate various types of data. Data collected from longitudinal, repeated-measures, or clustered designs (whether continuous, binary, count, or ordinal) are likely to be correlated. Therefore, statistical models for independent responses, such as the Generalized Linear Model (GLM) and Generalized Additive Model (GAM), are not appropriate. There are several models available for correlated responses, including GEEs (Generalized Estimating Equations) for marginal models and various mixed-effects models, such as GLMM (Generalized Linear Mixed Models) and HGLM (Hierarchical Generalized Linear Models), for subject-specific models. These models are available in the free open-source software R, but they can only be accessed through a command-line interface (using scripts). On the other hand, most practical researchers rely heavily on menu-based or Graphical User Interfaces (GUI). We develop, using the Shiny framework, a standard pull-down-menu Web-GUI that unifies most models for correlated responses. The Web-GUI has accommodated almost all needed features. It enables users to perform and compare various models for repeated-measures data (GEE, GLMM, HGLM, GEE for nominal responses) much more easily through online menus. This paper discusses the features of the Web-GUI and illustrates their use. In general, we find that GEE, GLMM, and HGLM gave very close results.
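The tool above wraps R packages behind a Web GUI; as a hedged illustration of the same class of marginal models, the sketch below fits a GEE with an exchangeable working correlation to synthetic clustered binary data using Python's statsmodels.

```python
# Hedged illustration of the class of models the Web-GUI exposes: a GEE fit for
# correlated binary responses on a synthetic clustered data set (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_subjects, n_visits = 50, 4
subject = np.repeat(np.arange(n_subjects), n_visits)
x = rng.normal(size=n_subjects * n_visits)
re = np.repeat(rng.normal(0, 0.8, n_subjects), n_visits)   # within-subject correlation
p = 1 / (1 + np.exp(-(0.5 * x + re)))
y = rng.binomial(1, p)
df = pd.DataFrame({"y": y, "x": x, "subject": subject})

model = smf.gee("y ~ x", groups="subject", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```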
Development and application of virtual reality for man/systems integration
NASA Technical Reports Server (NTRS)
Brown, Marcus
1991-01-01
While the graphical presentation of computer models signified a quantum leap over presentations limited to text and numbers, it still has the problem of presenting an interface barrier between the human user and the computer model. The user must learn a command language in order to orient themselves in the model. For example, to move left from the current viewpoint of the model, they might be required to type 'LEFT' at a keyboard. This command is fairly intuitive, but if the viewpoint moves far enough that there are no visual cues overlapping with the first view, the user does not know if the viewpoint has moved inches, feet, or miles to the left, or perhaps remained in the same position but rotated to the left. Until the user becomes quite familiar with the interface language of the computer model presentation, they will be prone to losing their bearings frequently. Even a highly skilled user will occasionally get lost in the model. A new approach to presenting this type of information is to directly interpret the user's body motions as the input language for determining what view to present. When the user's head turns 45 degrees to the left, the viewpoint should be rotated 45 degrees to the left. Since the head moves through several intermediate angles between the original view and the final one, several intermediate views should be presented, providing the user with a sense of continuity between the original view and the final one. Since the hands are the primary way a human physically interacts with their environment, the system should also monitor the movements of the user's hands and alter objects in the virtual model in a way consistent with the way an actual object would move when manipulated using the same hand movements. Since this approach to the man-computer interface closely models the same type of interface that humans have with the physical world, this type of interface is often called virtual reality, and the model is referred to as a virtual world. The task of this summer fellowship was to set up a virtual reality system at MSFC and begin applying it to some of the questions which concern scientists and engineers involved in space flight. A brief discussion of this work is presented.
NASA Technical Reports Server (NTRS)
Castruccio, P. A.; Loats, H. L., Jr.
1975-01-01
An analysis of current computer usage by major water resources users was made to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era. The analysis shows a significant impact due to the utilization and processing of ERTS CCT data.
Tilahun, Binyam; Fritz, Fleur
2015-08-01
With the increasing implementation of Electronic Medical Record Systems (EMR) in developing countries, there is a growing need to identify antecedents of EMR success to measure and predict the level of adoption before costly implementation. However, less evidence is available about EMR success in the context of low-resource setting implementations. Therefore, this study aims to fill this gap by examining the constructs and relationships of the widely used DeLone and McLean (D&M) information system success model to determine whether it can be applied to measure EMR success in those settings. A quantitative cross-sectional study design using self-administered questionnaires was used to collect data from 384 health professionals working in five governmental hospitals in Ethiopia. The hospitals have been using a comprehensive EMR system for three years. Descriptive and structural equation modeling methods were applied to describe and validate the extent of the relationships of the constructs and mediating effects. The findings of the structural equation modeling show that system quality has significant influence on EMR use (β = 0.32, P < 0.05) and user satisfaction (β = 0.53, P < 0.01); information quality has significant influence on EMR use (β = 0.44, P < 0.05) and user satisfaction (β = 0.48, P < 0.01); and service quality has strong significant influence on EMR use (β = 0.36, P < 0.05) and user satisfaction (β = 0.56, P < 0.01). User satisfaction has significant influence on EMR use (β = 0.41, P < 0.05), but the effect of EMR use on user satisfaction was not significant. Both EMR use and user satisfaction have significant influence on perceived net benefit (β = 0.31, P < 0.01; β = 0.60, P < 0.01, respectively). Additionally, computer literacy was found to be a mediating factor in the relationship between service quality and EMR use (P < 0.05) as well as user satisfaction (P < 0.01). Among all the constructs, user satisfaction showed the strongest effect on the perceived net benefit for health professionals. EMR implementers and managers in developing countries are in urgent need of implementation models to design proper implementation strategies. In this study, the constructs and relationships depicted in the updated D&M model were found to be applicable to assess the success of EMR in low-resource settings. Additionally, computer literacy was found to be a mediating factor in EMR use and user satisfaction of health professionals. Hence, EMR implementers and managers in those settings should give priority to improving the service quality of the hospitals, such as technical support and infrastructure; provide continuous basic computer training to health professionals; and give attention to the system and information quality of the systems they want to implement.
Section 3. The SPARROW Surface Water-Quality Model: Theory, Application and User Documentation
Schwarz, G.E.; Hoos, A.B.; Alexander, R.B.; Smith, R.A.
2006-01-01
SPARROW (SPAtially Referenced Regressions On Watershed attributes) is a watershed modeling technique for relating water-quality measurements made at a network of monitoring stations to attributes of the watersheds containing the stations. The core of the model consists of a nonlinear regression equation describing the non-conservative transport of contaminants from point and diffuse sources on land to rivers and through the stream and river network. The model predicts contaminant flux, concentration, and yield in streams and has been used to evaluate alternative hypotheses about the important contaminant sources and watershed properties that control transport over large spatial scales. This report provides documentation for the SPARROW modeling technique and computer software to guide users in constructing and applying basic SPARROW models. The documentation gives details of the SPARROW software, including the input data and installation requirements, and guidance in the specification, calibration, and application of basic SPARROW models, as well as descriptions of the model output and its interpretation. The documentation is intended for both researchers and water-resource managers with interest in using the results of existing models and developing and applying new SPARROW models. The documentation of the model is presented in two parts. Part 1 provides a theoretical and practical introduction to SPARROW modeling techniques, which includes a discussion of the objectives, conceptual attributes, and model infrastructure of SPARROW. Part 1 also includes background on the commonly used model specifications and the methods for estimating and evaluating parameters, evaluating model fit, and generating water-quality predictions and measures of uncertainty. Part 2 provides a user's guide to SPARROW, which includes a discussion of the software architecture and details of the model input requirements and output files, graphs, and maps. The text documentation and computer software are available on the Web at http://usgs.er.gov/sparrow/sparrow-mod/.
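A greatly simplified, hedged sketch of the kind of nonlinear regression at SPARROW's core, under the assumption of a single exponential in-stream decay term: predicted flux is the sum of source inputs times estimated source coefficients, attenuated over travel time. This is not the USGS code, and all data and parameter names are invented.

```python
# Greatly simplified, hedged sketch of a SPARROW-like nonlinear regression:
# predicted flux = (sum of source inputs x source coefficients) attenuated by an
# exponential in-stream decay over travel time. Data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def sparrow_like(X, b_point, b_diffuse, delta):
    point, diffuse, travel_time = X
    return (b_point * point + b_diffuse * diffuse) * np.exp(-delta * travel_time)

rng = np.random.default_rng(11)
n = 80
point = rng.uniform(0, 10, n)          # point-source input
diffuse = rng.uniform(0, 50, n)        # diffuse-source input (e.g. fertilizer)
travel = rng.uniform(0, 5, n)          # travel time to the monitoring station
flux = sparrow_like((point, diffuse, travel), 0.9, 0.1, 0.3)
flux *= rng.lognormal(0, 0.1, n)       # multiplicative observation error

params, _ = curve_fit(sparrow_like, (point, diffuse, travel), flux, p0=[1, 0.05, 0.1])
print("estimated [b_point, b_diffuse, delta]:", params.round(3))
```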
ERIC Educational Resources Information Center
Holden, Heather; Rada, Roy
2011-01-01
The Technology Acceptance Model (TAM) represents how users come to accept and use a given technology and can be applied to teachers' use of educational technologies. Here the model is extended to incorporate teachers' perceived usability and self-efficacy measures toward the technologies they are currently using. The authors administered a survey…
DOT National Transportation Integrated Search
2003-04-01
The Louisiana Department of Transportation and Development (LADOTD) is interested in applying the Federal Highway Administration's (FHWA) life cycle cost analysis procedures and model to large roadway construction, maintenance, and rehabilitation pro...
Graphical User Interface for an Observing Control System for the UK Infrared Telescope
NASA Astrophysics Data System (ADS)
Tan, M.; Bridger, A.; Wright, G. S.; Adamson, A. J.; Currie, M. J.; Economou, F.
A graphical user interface for the observing control system of the UK Infrared Telescope has been developed as part of the ORAC (Observatory Reduction and Acquisition Control) project. We analyzed and designed the system using the Unified Modelling Language (UML) with the CASE tool Rational Rose 98. The system has been implemented in a modular way with Java packages using Swing and RMI. The system is component-based and pluggable. Object-orientation concepts and UML notations have been applied throughout the development.
An Open Software Platform for Sharing Water Resource Models, Code and Data
NASA Astrophysics Data System (ADS)
Knox, Stephen; Meier, Philipp; Mohamed, Khaled; Korteling, Brett; Matrosov, Evgenii; Huskova, Ivana; Harou, Julien; Rosenberg, David; Tilmant, Amaury; Medellin-Azuara, Josue; Wicks, Jon
2016-04-01
The modelling of managed water resource systems requires new approaches in the face of increasing future uncertainty. Water resources management models, even if applied to diverse problem areas, use common approaches such as representing the problem as a network of nodes and links. We propose a data management software platform, called Hydra, that uses this commonality to allow multiple models using a node-link structure to be managed and run using a single software system. Hydra's user interface allows users to manage network topology and associated data. Hydra feeds this data directly into a model, importing from and exporting to different file formats using Apps. An App connects Hydra to a custom model, a modelling system such as GAMS or MATLAB or to different file formats such as MS Excel, CSV and ESRI Shapefiles. Hydra allows users to manage their data in a single, consistent place. Apps can be used to run domain-specific models and allow users to work with their own required file formats. The Hydra App Store offers a collaborative space where model developers can publish, review and comment on Apps, models and data. Example Apps and open-source libraries are available in a variety of languages (Python, Java and .NET). The App Store can act as a hub for water resource modellers to view and share Apps, models and data easily. This encourages an ecosystem of development using a shared platform, resulting in more model integration and potentially greater unity within resource modelling communities. www.hydraplatform.org www.hydraappstore.com
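A hedged sketch of the generic node-link representation such a platform builds on (illustrative only, not Hydra's actual schema or API): a network with nodes, links, and attached attribute data, serialized to JSON for exchange with model Apps.

```python
# Hedged sketch of a generic node-link network representation with attribute
# data, serialisable to JSON for exchange between tools (not Hydra's schema).
import json

network = {
    "name": "demo basin",
    "nodes": [
        {"name": "reservoir", "type": "storage", "data": {"capacity": 120.0}},
        {"name": "city",      "type": "demand",  "data": {"demand": 35.0}},
    ],
    "links": [
        {"name": "main canal", "from": "reservoir", "to": "city",
         "data": {"max_flow": 50.0}},
    ],
}

def node(net, name):
    return next(n for n in net["nodes"] if n["name"] == name)

print(json.dumps(network, indent=2))
print("city demand:", node(network, "city")["data"]["demand"])
```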
The discounting model selector: Statistical software for delay discounting applications.
Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A
2017-05-01
Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods on user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best-performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). The results of independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
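A minimal sketch of one piece of what such software automates, assuming Mazur's hyperbolic model V = A / (1 + kD): fit k to indifference-point data and report ED50 = 1/k, the delay at which value falls to half. The full tool performs approximate Bayesian selection across several candidate models; the data here are invented.

```python
# Hedged sketch: fit Mazur's hyperbolic discounting model V = A / (1 + k*D) to
# indifference-point data and report the effective delay 50 (ED50 = 1/k).
import numpy as np
from scipy.optimize import curve_fit

def hyperbolic(delay, k):
    return 1.0 / (1.0 + k * delay)          # value as a proportion of A

delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)      # days
values = np.array([0.95, 0.80, 0.55, 0.35, 0.22, 0.15])       # indifference points

(k,), _ = curve_fit(hyperbolic, delays, values, p0=[0.01])
print(f"k = {k:.4f} per day, ED50 = {1.0 / k:.1f} days")
```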
Fast flexible modeling of RNA structure using internal coordinates.
Flores, Samuel Coulbourn; Sherman, Michael A; Bruns, Christopher M; Eastman, Peter; Altman, Russ Biagio
2011-01-01
Modeling the structure and dynamics of large macromolecules remains a critical challenge. Molecular dynamics (MD) simulations are expensive because they model every atom independently, and are difficult to combine with experimentally derived knowledge. Assembly of molecules using fragments from libraries relies on the database of known structures and thus may not work for novel motifs. Coarse-grained modeling methods have yielded good results on large molecules but can suffer from difficulties in creating more detailed full atomic realizations. There is therefore a need for molecular modeling algorithms that remain chemically accurate and economical for large molecules, do not rely on fragment libraries, and can incorporate experimental information. RNABuilder works in the internal coordinate space of dihedral angles and thus has time requirements proportional to the number of moving parts rather than the number of atoms. It provides accurate physics-based response to applied forces, but also allows user-specified forces for incorporating experimental information. A particular strength of RNABuilder is that all Leontis-Westhof basepairs can be specified as primitives by the user to be satisfied during model construction. We apply RNABuilder to predict the structure of an RNA molecule with 160 bases from its secondary structure, as well as experimental information. Our model matches the known structure to 10.2 Angstroms RMSD and has low computational expense.
P-Care BPJS Acceptance Model in Primary Health Centers.
Markam, Hosizah
2017-01-01
Electronic Medical Records (EMR) are increasingly adopted in healthcare facilities. Implementation failures of electronic information systems are known to be caused not only by the quality of technical aspects but also by users' behavior, which is commonly examined by applying the Technology Acceptance Model (TAM). This research aimed to analyze the acceptance model of p-care BPJS in primary health centers. A total sample of 30 p-care BPJS users was drawn by multistage random sampling from 30 participating primary health centers. Data analysis used both descriptive and inferential statistics. In the structural model phase, the results indicated that the p-care BPJS acceptance model in the primary health centers was formed by Perceived Ease of Use (PEOU) and Perceived Usefulness (PU) through Attitude towards use of p-care BPJS and Behavioral Intention to use p-care BPJS.
Scaled CMOS Technology Reliability Users Guide
NASA Technical Reports Server (NTRS)
White, Mark
2010-01-01
The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect to the high reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope β = 1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm² for several scaled SDRAM generations is presented, revealing a power relationship. General models describing the soft error rates across scaled product generations are presented. The analysis methodology may be applied to other scaled microelectronic products and their key parameters.
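An illustrative sketch on synthetic data (not the products tested above): fit a Weibull distribution to retention-time failures to inspect the slope (beta near 1 suggests randomly distributed weak bits), and convert an observed failure count into a FIT rate, i.e. failures per 10^9 device-hours.

```python
# Illustrative sketch on synthetic data: Weibull fit of retention-time failures
# and a simple point-estimate FIT rate (failures per 1e9 device-hours).
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(4)
fail_hours = weibull_min.rvs(1.0, scale=5e4, size=200, random_state=rng)

beta, _, eta = weibull_min.fit(fail_hours, floc=0)
print(f"Weibull slope beta = {beta:.2f}, characteristic life eta = {eta:.0f} h")

failures, devices, test_hours = 12, 1000, 2000
fit_rate = failures / (devices * test_hours) * 1e9
print(f"point-estimate failure rate = {fit_rate:.0f} FIT")
```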
Applying Risk and Resilience Metrics to Energy Investments
2015-12-01
the model as a positive aspect, though the user can easily devalue risk and resiliency while increasing the value of the cost and policy categories to... decision making model. The model developed for this project includes cost metrics and policy mandates that the current model considers and adds the
Analysis of Learning Curve Fitting Techniques.
1987-09-01
1986. 15. Neter, John, and others. Applied Linear Regression Models. Homewood, IL: Irwin, 19-33. 16. SAS User's Guide: Basics, Version 5 Edition. SAS... Linear Regression Techniques (15:23-52). Random errors are assumed to be normally distributed when using ordinary least-squares, according to Johnston... lot estimated by the improvement curve formula. For a more detailed explanation of the ordinary least-squares technique, see Neter et al., Applied
Requirements Modeling with the Aspect-oriented User Requirements Notation (AoURN): A Case Study
NASA Astrophysics Data System (ADS)
Mussbacher, Gunter; Amyot, Daniel; Araújo, João; Moreira, Ana
The User Requirements Notation (URN) is a recent ITU-T standard that supports requirements engineering activities. The Aspect-oriented URN (AoURN) adds aspect-oriented concepts to URN, creating a unified framework that allows for scenario-based, goal-oriented, and aspect-oriented modeling. AoURN is applied to the car crash crisis management system (CCCMS), modeling its functional and non-functional requirements (NFRs). AoURN generally models all use cases, NFRs, and stakeholders as individual concerns and provides general guidelines for concern identification. AoURN handles interactions between concerns, capturing their dependencies and conflicts as well as their resolutions. We present a qualitative comparison of aspect-oriented techniques for scenario-based and goal-oriented requirements engineering. An evaluation based on metrics adapted from the literature and a task-based evaluation suggest that AoURN models are more scalable than URN models and exhibit better modularity, reusability, and maintainability.
ISBP: Understanding the Security Rule of Users' Information-Sharing Behaviors in Partnership.
Wu, Hongchen; Wang, Xinjun
2016-01-01
The rapid growth of social network data has given rise to high security awareness among users, especially when they exchange and share their personal information. However, because users have different feelings about sharing their information, they are often puzzled about who their partners for exchanging information can be and what information they can share. Is it possible to assist users in forming a partnership network in which they can exchange and share information with little worry? We propose a modified information sharing behavior prediction (ISBP) model that can help in understanding the underlying rules by which users share their information with partners in light of three common aspects: what types of items users are likely to share, what characteristics of users make them likely to share information, and what features of users' sharing behavior are easy to predict. This model is applied with machine learning techniques in WEKA to predict users' decisions pertaining to information sharing behavior and to form them into trustable partnership networks by learning their features. In the experiment section, using two real-life datasets consisting of citizens' sharing behavior, we identify the effect of highly sensitive requests on sharing behavior alongside individual variables: the younger participants' partners are more difficult to predict than those of the older participants, whereas the partners of people who are not computer majors are easier to predict than those of people who are computer majors. Based on these findings, we believe that it is necessary and feasible to offer users personalized suggestions on information sharing decisions, and this is pioneering work that could benefit college researchers focusing on user-centric strategies and website owners who want to collect more user information without raising their privacy awareness or losing their trustworthiness.
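As a hedged illustration of the prediction step (the study used WEKA; here an analogous decision-tree classifier in scikit-learn on invented data): predict whether a user shares an item from the request's sensitivity and simple user traits.

```python
# Hedged sketch of the prediction step described above, using an analogous
# decision-tree classifier in scikit-learn on made-up data: predict whether a
# user shares an item given its sensitivity and user traits.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
n = 500
sensitivity = rng.integers(1, 6, n)              # 1 = low, 5 = highly sensitive
age = rng.integers(18, 70, n)
cs_major = rng.integers(0, 2, n)
p_share = 1 / (1 + np.exp(-(2.5 - 0.9 * sensitivity + 0.02 * age - 0.3 * cs_major)))
shared = rng.binomial(1, p_share)

X = np.column_stack([sensitivity, age, cs_major])
X_tr, X_te, y_tr, y_te = train_test_split(X, shared, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```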
Computer Models Simulate Fine Particle Dispersion
NASA Technical Reports Server (NTRS)
2010-01-01
Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.
NASA Astrophysics Data System (ADS)
Malard, J. J.; Rojas, M.; Adamowski, J. F.; Anandaraja, N.; Tuy, H.; Melgar-Quiñonez, H.
2016-12-01
While several well-validated crop growth models are currently widely used, very few crop pest models of the same caliber have been developed or applied, and pest models that take trophic interactions into account are even rarer. This may be due to several factors, including 1) the difficulty of representing complex agroecological food webs in a quantifiable model, and 2) the general belief that pesticides effectively remove insect pests from immediate concern. However, pests currently claim a substantial amount of harvests every year (and account for additional control costs), and the impact of insects and of their trophic interactions on agricultural crops cannot be ignored, especially in the context of changing climates and increasing pressures on crops across the globe. Unfortunately, most integrated pest management frameworks rely on very simple models (if at all), and most examples of successful agroecological management remain more anecdotal than scientifically replicable. In light of this, there is a need for validated and robust agroecological food web models that allow users to predict the response of these webs to changes in management, crops or climate, both in order to predict future pest problems under a changing climate as well as to develop effective integrated management plans. Here we present Tiko'n, a Python-based software whose API allows users to rapidly build and validate trophic web agroecological models that predict pest dynamics in the field. The programme uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. We apply the model to the coconut black-headed caterpillar (Opisina arenosella) and associated parasitoid data from Sri Lanka, showing how the modeling framework can be used to rapidly develop, calibrate and validate models that elucidate how the internal structures of food webs determine their behaviour and allow users to evaluate different integrated management options.
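As an illustration of the calibration idea only, the sketch below fits the attack-rate parameter of a simple Nicholson-Bailey host-parasitoid model to noisy counts with a random-walk Metropolis sampler. It is not the Tiko'n API; the model form, prior, noise level and synthetic data are all assumptions made for the example.

```python
# Illustrative Bayesian calibration of a host-parasitoid model parameter.
# NOT the Tiko'n API; everything below is an assumption for demonstration.
import numpy as np

rng = np.random.default_rng(0)

def simulate(a, R=2.0, H0=50.0, P0=10.0, steps=12):
    """Nicholson-Bailey host-parasitoid dynamics; returns host densities."""
    H, P, out = H0, P0, []
    for _ in range(steps):
        esc = np.exp(-a * P)              # fraction of hosts escaping parasitism
        H, P = R * H * esc, H * (1.0 - esc)
        out.append(H)
    return np.array(out)

observed = simulate(0.07) + rng.normal(0.0, 5.0, 12)   # synthetic "field counts"

def log_post(a, sigma=5.0):
    if not 0.001 < a < 1.0:                            # flat prior on (0.001, 1)
        return -np.inf
    resid = observed - simulate(a)
    return -0.5 * np.sum((resid / sigma) ** 2)         # Gaussian log-likelihood

a, chain = 0.05, []
for _ in range(5000):                                  # random-walk Metropolis
    prop = a + rng.normal(0.0, 0.005)
    if np.log(rng.random()) < log_post(prop) - log_post(a):
        a = prop
    chain.append(a)
print("posterior mean attack rate:", np.mean(chain[1000:]))
```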
Recommendation in evolving online networks
NASA Astrophysics Data System (ADS)
Hu, Xiao; Zeng, An; Shang, Ming-Sheng
2016-02-01
Recommender system is an effective tool to find the most relevant information for online users. By analyzing the historical selection records of users, recommender system predicts the most likely future links in the user-item network and accordingly constructs a personalized recommendation list for each user. So far, the recommendation process is mostly investigated in static user-item networks. In this paper, we propose a model which allows us to examine the performance of the state-of-the-art recommendation algorithms in evolving networks. We find that the recommendation accuracy in general decreases with time if the evolution of the online network fully depends on the recommendation. Interestingly, some randomness in users' choice can significantly improve the long-term accuracy of the recommendation algorithm. When a hybrid recommendation algorithm is applied, we find that the optimal parameter gradually shifts towards the diversity-favoring recommendation algorithm, indicating that recommendation diversity is essential to keep a high long-term recommendation accuracy. Finally, we confirm our conclusions by studying the recommendation on networks with the real evolution data.
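One widely used hybrid of this kind interpolates between mass diffusion (ProbS) and heat conduction (HeatS) on the user-item bipartite network. The paper does not spell out which hybrid it evaluates, so the sketch below is a representative assumption rather than the authors' exact algorithm.

```python
# Sketch of a ProbS/HeatS hybrid recommender on a toy user-item matrix.
import numpy as np

A = np.array([[1, 1, 0, 0],        # rows: users, columns: items (1 = collected)
              [1, 0, 1, 0],
              [0, 1, 1, 1]], dtype=float)

def hybrid_scores(A, lam=0.5):
    k_item = A.sum(axis=0)                       # item degrees
    k_user = A.sum(axis=1)                       # user degrees
    # item-item resource-transfer matrix W[alpha, beta]
    overlap = (A / k_user[:, None]).T @ A        # sum_u a_ua * a_ub / k_u
    W = overlap / np.outer(k_item ** (1 - lam), k_item ** lam)
    scores = A @ W.T                             # resource received by each item
    scores[A > 0] = -np.inf                      # never re-recommend collected items
    return scores

# lam -> 1 behaves like ProbS (accuracy-favoring), lam -> 0 like HeatS (diversity-favoring)
print(hybrid_scores(A, lam=0.5))
```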
Baek, Seungsoo; Kim, Seungjoo
2014-01-01
There has been an explosive increase in the population of the OSN (online social network) in recent years. The OSN provides users with many opportunities to communicate among friends and family. Further, it facilitates developing new relationships with previously unknown people having similar beliefs or interests. However, the OSN can expose users to adverse effects such as privacy breaches, the disclosing of uncontrolled material, and the disseminating of false information. Traditional access control models such as MAC, DAC, and RBAC are applied to the OSN to address these problems. However, these models are not suitable for the dynamic OSN environment because user behavior in the OSN is unpredictable and static access control imposes a burden on the users to change the access control rules individually. We propose a dynamic trust-based access control for the OSN to address the problems of the traditional static access control. Moreover, we provide novel criteria for evaluating trust factors, drawing on a sociological approach, and a method for calculating dynamic trust values. The proposed method can monitor negative behavior and modify access permission levels dynamically to prevent the indiscriminate disclosure of information.
Bruns, Eric J.; Hyde, Kelly L.; Sather, April; Hook, Alyssa; Lyon, Aaron R.
2015-01-01
Health information technology (HIT) and care coordination for individuals with complex needs are high priorities for quality improvement in health care. However, there is little empirical guidance about how best to design electronic health record systems and related technologies to facilitate implementation of care coordination models in behavioral health, or how best to apply user input to the design and testing process. In this paper, we describe an iterative development process that incorporated user/stakeholder perspectives at multiple points and resulted in an electronic behavioral health information system (EBHIS) specific to the wraparound care coordination model for youth with serious emotional and behavioral disorders. First, we review foundational HIT research on how EBHIS can enhance efficiency and outcomes of wraparound that was used to inform development. After describing the rationale for and functions of a prototype EBHIS for wraparound, we describe methods and results for a series of six small studies that informed system development across four phases of effort – predevelopment, development, initial user testing, and commercialization – and discuss how these results informed system design and refinement. Finally, we present next steps, challenges to dissemination, and guidance for others aiming to develop specialized behavioral health HIT. The research team's experiences reinforce the opportunity presented by EBHIS to improve care coordination for populations with complex needs, while also pointing to a litany of barriers and challenges to be overcome to implement such technologies. PMID:26060099
A Multiple Indicators Multiple Causes (MIMIC) model of internal barriers to drug treatment in China.
Qi, Chang; Kelly, Brian C; Liao, Yanhui; He, Haoyu; Luo, Tao; Deng, Huiqiong; Liu, Tieqiao; Hao, Wei; Wang, Jichuan
2015-03-01
Although evidence exists for distinct barriers to drug abuse treatment (BDATs), investigations of their inter-relationships and the effect of individual characteristics on the barrier factors have been sparse, especially in China. A Multiple Indicators Multiple Causes (MIMIC) model is applied to this end. A sample of 262 drug users was recruited from three drug rehabilitation centers in Hunan Province, China. We applied a MIMIC approach to investigate the effect of gender, age, marital status, education, primary substance use, duration of primary drug use, and drug treatment experience on the internal barrier factors: absence of problem (AP), negative social support (NSS), fear of treatment (FT), and privacy concerns (PC). Drug users of various characteristics were found to report different internal barrier factors. Younger participants were more likely to report NSS (-0.19, p=0.038) and PC (-0.31, p<0.001). Compared to other drug users, ice users were more likely to report AP (0.44, p<0.001) and NSS (0.25, p=0.010). Drug treatment experiences related to AP (0.20, p=0.012). In addition, differential item functioning (DIF) occurred in three items when participants came from groups differing in duration of drug use, ice use, or marital status. Individual characteristics had significant effects on internal barriers to drug treatment. On this basis, BDAT perceived by different individuals could be assessed before tactics were utilized to successfully remove perceived barriers to drug treatment. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
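For reference, the MIMIC structure referred to above takes the standard two-part form below, with the latent barrier factors (AP, NSS, FT, PC) regressed on the observed individual characteristics and measured by the BDAT items; this is the generic formulation rather than the paper's fitted parameter values.

```latex
% Generic MIMIC (Multiple Indicators Multiple Causes) structure
\begin{aligned}
  \eta &= \Gamma x + \zeta
    && \text{structural part: observed causes (gender, age, \dots)} \rightarrow \text{latent barrier factors}\\
  y &= \Lambda \eta + \varepsilon
    && \text{measurement part: latent barrier factors} \rightarrow \text{observed BDAT items}
\end{aligned}
```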
Preference heterogeneity in a count data model of demand for off-highway vehicle recreation
Thomas P Holmes; Jeffrey E Englin
2010-01-01
This paper examines heterogeneity in the preferences for OHV recreation by applying the random parameters Poisson model to a data set of off-highway vehicle (OHV) users at four National Forest sites in North Carolina. The analysis develops estimates of individual consumer surplus and finds that estimates are systematically affected by the random parameter specification...
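In its usual form (a standard formulation, not necessarily the paper's exact specification), the random parameters Poisson trip-demand model is:

```latex
% Mixed (random parameters) Poisson model of individual i's trip count y_i
\Pr(y_i = k \mid \beta_i) = \frac{e^{-\lambda_i}\,\lambda_i^{k}}{k!},
\qquad \lambda_i = \exp\!\left(x_i^{\top}\beta_i\right),
\qquad \beta_i \sim \mathcal{N}(\beta,\,\Sigma).
```

Per-trip consumer surplus is then commonly estimated as $-1/\beta_{tc,i}$, where $\beta_{tc,i}$ is individual $i$'s travel-cost coefficient, which is why the surplus estimates vary systematically with the random parameter specification.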
A users manual for the method of moments Aircraft Modeling Code (AMC), version 2
NASA Technical Reports Server (NTRS)
Peters, M. E.; Newman, E. H.
1994-01-01
This report serves as a user's manual for Version 2 of the 'Aircraft Modeling Code' or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. This report describes the input command language and also includes several examples which illustrate typical code inputs and outputs.
Design and Implementation of User-Created Information Systems with Mobile RFID
NASA Astrophysics Data System (ADS)
Lee, Jae Kwoen; Chin, Sungho; Kim, Hee Cheon; Chung, Kwang Sik
RFID (Radio Frequency Identification) has usually been applied in the physical distribution field, and Mobile RFID may be the only technology with which we can lead this market. In Korea, ETRI has standardized MOBION (MOBile Identification ON), and the mobile telecommunication companies have provided trial Mobile RFID services since 2006. In these trial services, the Broker model is used to decode the mobile RFID code. However, the Broker model has some problems, such as communication overhead caused by frequent ODS queries, limited service performance, and a narrow range of services for users. In this paper, we develop a device application capable of filtering codes unrelated to the RFID service in order to improve decoding performance. We also improve performance by simplifying the connection process between the device application and the broker. Finally, we propose and develop a user-created information system to widely distribute the Mobile RFID service.
A user's manual for the method of moments Aircraft Modeling Code (AMC)
NASA Technical Reports Server (NTRS)
Peters, M. E.; Newman, E. H.
1989-01-01
This report serves as a user's manual for the Aircraft Modeling Code or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. The input command language is described and several examples which illustrate typical code inputs and outputs are also included.
NASA Technical Reports Server (NTRS)
Jiang, Jian-Ping; Murphy, Elizabeth D.; Bailin, Sidney C.; Truszkowski, Walter F.
1993-01-01
Capturing human factors knowledge about the design of graphical user interfaces (GUI's) and applying this knowledge on-line are the primary objectives of the Computer-Human Interaction Models (CHIMES) project. The current CHIMES prototype is designed to check a GUI's compliance with industry-standard guidelines, general human factors guidelines, and human factors recommendations on color usage. Following the evaluation, CHIMES presents human factors feedback and advice to the GUI designer. The paper describes the approach to modeling human factors guidelines, the system architecture, a new method developed to convert quantitative RGB primaries into qualitative color representations, and the potential for integrating CHIMES with user interface management systems (UIMS). Both the conceptual approach and its implementation are discussed. This paper updates the presentation on CHIMES at the first International Symposium on Ground Data Systems for Spacecraft Control.
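The abstract does not detail CHIMES' conversion method, so the following is only a hypothetical sketch of what mapping quantitative RGB primaries to qualitative color terms can look like: convert to HSV and threshold on value, saturation and hue.

```python
# Hypothetical RGB -> qualitative color mapping via HSV thresholds.
# The category boundaries are illustrative assumptions, not CHIMES' actual rules.
import colorsys

def qualitative_color(r, g, b):
    """r, g, b in 0..255 -> rough qualitative color name."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if v < 0.2:
        return "black"
    if s < 0.15:
        return "white" if v > 0.85 else "gray"
    hue = h * 360.0
    for name, upper in [("red", 20), ("orange", 45), ("yellow", 70),
                        ("green", 170), ("cyan", 200), ("blue", 260),
                        ("purple", 320), ("red", 360)]:
        if hue <= upper:
            return name

print(qualitative_color(30, 144, 255))   # -> "blue"
```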
TOOKUIL: A case study in user interface development for safety code application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, D.L.; Harkins, C.K.; Hoole, J.G.
1997-07-01
Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.
User-Centered Design Practices to Redesign a Nursing e-Chart in Line with the Nursing Process.
Schachner, María B; Recondo, Francisco J; González, Zulma A; Sommer, Janine A; Stanziola, Enrique; Gassino, Fernando D; Simón, Mariana; López, Gastón E; Benítez, Sonia E
2016-01-01
Using the user-centered design (UCD) practices in place at Hospital Italiano of Buenos Aires, the nursing e-chart user interface was redesigned to improve the quality of nursing process records, based on an adapted Virginia Henderson theoretical model and on patient safety standards, in order to fulfil Joint Commission accreditation requirements. UCD practices were applied as standardized and recommended for the usability evaluation of electronic medical records. Implementation of these practices yielded a series of prototypes over 5 iterative cycles of incremental improvement, achieving the usability goals and resulting in an interface that general care nurses used and perceived as satisfactory. Nurses' involvement allowed a balance between their needs and institutional requirements.
A psychotechnological review on eye-tracking systems: towards user experience.
Mele, Maria Laura; Federici, Stefano
2012-07-01
The aim of the present work is to show a critical review of the international literature on eye-tracking technologies by focusing on those features that characterize them as 'psychotechnologies'. A critical literature review was conducted through the main psychology, engineering, and computer sciences databases by following specific inclusion and exclusion criteria. A total of 46 matches from 1998 to 2010 were selected for content analysis. Results have been divided into four broad thematic areas. We found that, although there is a growing attention to end-users, most of the studies reviewed in this work are far from being considered as adopting holistic human-computer interaction models that include both individual differences and needs of users. User is often considered only as a measurement object of the functioning of the technological system and not as a real alter-ego of the intrasystemic interaction. In order to fully benefit from the communicative functions of gaze, the research on eye-tracking must emphasize user experience. Eye-tracking systems would become an effective assistive technology for integration, adaptation and neutralization of the environmental barrier only when a holistic model can be applied for both design processes and assessment of the functional components of the interaction.
Interactive shape metamorphosis
NASA Technical Reports Server (NTRS)
Chen, David T.; State, Andrei; Banks, David
1994-01-01
A technique for controlled metamorphosis between surfaces in 3-space is described. Well-understood techniques to produce shape metamorphosis between models in a 2D parametric space are applied. The user selects morphable features interactively, and the morphing process executes in real time on a high-performance graphics multicomputer.
Process for selecting engineering tools : applied to selecting a SysML tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Spain, Mark J.; Post, Debra S.; Taylor, Jeffrey L.
2011-02-01
Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature and users could use the process to select most engineering tools and software applications.
NASA Astrophysics Data System (ADS)
Cox, S. J.; Wyborn, L. A.; Fraser, R.; Rankine, T.; Woodcock, R.; Vote, J.; Evans, B.
2012-12-01
The Virtual Geophysics Laboratory (VGL) is a web portal that provides geoscientists with an integrated online environment that: seamlessly accesses geophysical and geoscience data services from the AuScope national geoscience information infrastructure; loosely couples these data to a variety of geoscience software tools; and provides large scale processing facilities via cloud computing. VGL is a collaboration between CSIRO, Geoscience Australia, National Computational Infrastructure, Monash University, Australian National University and the University of Queensland. The VGL provides a distributed system whereby a user can enter an online virtual laboratory to seamlessly connect to OGC web services for geoscience data. The data is supplied in open standards formats using international standards like GeoSciML. A VGL user uses a web mapping interface to discover and filter the data sources using spatial and attribute filters to define a subset. Once the data is selected the user is not required to download the data. VGL collates the service query information for later in the processing workflow where it will be staged directly to the computing facilities. The combination of deferring data download and access to Cloud computing enables VGL users to access their data at higher resolutions and to undertake larger scale inversions, more complex models and simulations than their own local computing facilities might allow. Inside the Virtual Geophysics Laboratory, the user has access to a library of existing models, complete with exemplar workflows for specific scientific problems based on those models. For example, the user can load a geological model published by Geoscience Australia, apply a basic deformation workflow provided by a CSIRO scientist, and have it run in a scientific code from Monash. Finally the user can publish these results to share with a colleague or cite in a paper. This opens new opportunities for access and collaboration as all the resources (models, code, data, processing) are shared in the one virtual laboratory. VGL provides end users with access to an intuitive, user-centered interface that leverages cloud storage and cloud and cluster processing from both the research communities and commercial suppliers (e.g. Amazon). As the underlying data and information services are agnostic of the scientific domain, they can support many other data types. This fundamental characteristic results in a highly reusable virtual laboratory infrastructure that could also be used for other applications such as natural hazards, satellite processing, soil geochemistry, climate modeling, and agricultural crop modeling.
An integrated view of data quality in Earth observation
Yang, X.; Blower, J. D.; Bastin, L.; Lush, V.; Zabala, A.; Masó, J.; Cornford, D.; Díaz, P.; Lumsden, J.
2013-01-01
Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this study, we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation (EO). We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of EO, we believe that the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research. PMID:23230156
PseudoBase: a database with RNA pseudoknots.
van Batenburg, F H; Gultyaev, A P; Pleij, C W; Ng, J; Oliehoek, J
2000-01-01
PseudoBase is a database containing structural, functional and sequence data related to RNA pseudo-knots. It can be reached at http://wwwbio.LeidenUniv.nl/~Batenburg/PKB.html. This page will direct the user to a retrieval page from where a particular pseudoknot can be chosen, or to a submission page which enables the user to add pseudoknot information to the database or to an informative page that elaborates on the various aspects of the database. For each pseudoknot, 12 items are stored, e.g. the nucleotides of the region that contains the pseudoknot, the stem positions of the pseudoknot, the EMBL accession number of the sequence that contains this pseudoknot and the support that can be given regarding the reliability of the pseudoknot. Access is via a small number of steps, using 16 different categories. The development process was done by applying the evolutionary methodology for software development rather than by applying the methodology of the classical waterfall model or the more modern spiral model.
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1988-01-01
The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to. Furthermore, this improved environment is achieved by a specification language that is more natural to the user's problem domain and to the user's way of thinking and looking at the problem. The goal of this research is to apply the concepts of automatic programming (AP) to modeling discrete event simulation systems. Specific emphasis is on the design and development of simulation tools to assist the modeler in defining or constructing a model of the system and to then automatically write the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.
Op den Akker, Harm; Cabrita, Miriam; Op den Akker, Rieks; Jones, Valerie M; Hermens, Hermie J
2015-06-01
This paper presents a comprehensive and practical framework for automatic generation of real-time tailored messages in behavior change applications. Basic aspects of motivational messages are time, intention, content and presentation. Tailoring of messages to the individual user may involve all aspects of communication. A linear modular system is presented for generating such messages. It is explained how properties of user and context are taken into account in each of the modules of the system and how they affect the linguistic presentation of the generated messages. The model of motivational messages presented is based on an analysis of existing literature as well as the analysis of a corpus of motivational messages used in previous studies. The model extends existing 'ontology-based' approaches to message generation for real-time coaching systems found in the literature. Practical examples are given on how simple tailoring rules can be implemented throughout the various stages of the framework. Such examples can guide further research by clarifying what it means to use, e.g., user targeting to tailor a message. As a primary example we look at the issue of promoting daily physical activity. Future work includes applying the present model and framework, defining efficient ways of evaluating individual tailoring components, and improving effectiveness through the creation of accurate and complete user and context models. Copyright © 2015 Elsevier Inc. All rights reserved.
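A minimal sketch of such a tailoring rule is shown below. It assumes a hypothetical step-count coaching context rather than the authors' system: one rule picks the feedback content from goal progress and time of day, and another adapts the presentation (tone) to the user profile.

```python
# Illustrative tailoring sketch (not the authors' system); names are hypothetical.
from dataclasses import dataclass

@dataclass
class UserContext:
    name: str
    steps_today: int
    daily_goal: int
    prefers_formal: bool
    evening: bool

def tailor_message(ctx: UserContext) -> str:
    progress = ctx.steps_today / ctx.daily_goal
    # content rule: choose feedback from goal progress and time of day
    if progress >= 1.0:
        core = "you reached today's activity goal"
    elif ctx.evening:
        core = f"you are at {progress:.0%} of today's goal; a short walk would close the gap"
    else:
        core = f"you are at {progress:.0%} of today's goal so far"
    # presentation rule: adapt tone to the user profile
    opener = "Dear" if ctx.prefers_formal else "Hey"
    return f"{opener} {ctx.name}, {core}."

print(tailor_message(UserContext("Alex", 6200, 10000, False, True)))
```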
Workie, Demeke Lakew; Zike, Dereje Tesfaye; Fenta, Haile Mekonnen; Mekonnen, Mulusew Admasu
2018-05-10
Ethiopia is among the countries with a low prevalence of contraceptive use, which results in a high total fertility rate and unwanted pregnancies and in turn affects maternal and child health. This study aimed to investigate the major factors that affect the number of modern contraceptive users at service delivery points in Ethiopia. The Performance Monitoring and Accountability 2020/Ethiopia data collected between March and April 2016 at round 4 from 461 eligible service delivery points were used in this study. A weighted log-linear negative binomial model was applied to analyze the service delivery point data. The median number of modern contraceptive users served per service delivery point in Ethiopia was 61, with an interquartile range of 0.62. The expected log number of modern contraceptive users at rural service delivery points was 1.05 (95% Wald CI: -1.42 to -0.68) lower than at urban ones. In addition, the expected log count of modern contraceptive users at other facility types was 0.58 lower than at health centers. The number of nurses/midwives also affected the number of modern contraceptive users: the incidence rate of modern contraceptive users increased by one for each additional nurse at the delivery point. Among the factors considered in this study, residence, region, facility type, the number of days per week family planning is offered, the number of nurses/midwives, and the number of medical assistants were found to be associated with the number of modern contraceptive users. The Government of Ethiopia should therefore take immediate steps to address these determinants of the number of modern contraceptive users in Ethiopia.
Omar, Abdurahman; Ellenius, Johan; Lindemalm, Synnöve
2017-01-01
This study aims to evaluate pediatricians' acceptance, perception and use of an Electronic Prescribing Decision Support System (EPDSS) in a tertiary care setting using the Extended Technology Acceptance Model (TAM2). A qualitative research methodology was applied, with semi-structured questions developed according to the TAM2 model. Pediatricians perceived the EPDSS as useful and showed a favorable attitude towards it. However, perceived ease of use and output quality appeared to affect use of the EPDSS. Concerns were expressed about complicated screens, difficulty in reading and viewing the patient's medication overview, navigation that requires many clicks, and a medication system that does not meet their needs. End users had difficulty ordering drugs for polyclinic patients and were unable to cancel or stop medications. Junior pediatricians were influenced by senior colleagues, since they could get better advice about medication orders from them than from the system. Applying the TAM2 framework revealed pediatricians' attitudes towards and acceptance of the electronic prescribing system. This study has identified factors that are important for end-user acceptance as well as suggestions for system improvement. Although pediatricians are positive about the usefulness of the EPDSS, there appear to be some acceptance problems due to ease-of-use concerns and usability issues with the system.
Stance and influence of Twitter users regarding the Brexit referendum.
Grčar, Miha; Cherepnalkoski, Darko; Mozetič, Igor; Kralj Novak, Petra
2017-01-01
Social media are an important source of information about the political issues, reflecting, as well as influencing, public mood. We present an analysis of Twitter data, collected over 6 weeks before the Brexit referendum, held in the UK in June 2016. We address two questions: what is the relation between the Twitter mood and the referendum outcome, and who were the most influential Twitter users in the pro- and contra-Brexit camps? First, we construct a stance classification model by machine learning methods, and are then able to predict the stance of about one million UK-based Twitter users. The demography of Twitter users is, however, very different from the demography of the voters. By applying a simple age-adjusted mapping to the overall Twitter stance, the results show the prevalence of the pro-Brexit voters, something unexpected by most of the opinion polls. Second, we apply the Hirsch index to estimate the influence, and rank the Twitter users from both camps. We find that the most productive Twitter users are not the most influential, that the pro-Brexit camp was four times more influential, and had considerably larger impact on the campaign than the opponents. Third, we find that the top pro-Brexit communities are considerably more polarized than the contra-Brexit camp. These results show that social media provide a rich resource of data to be exploited, but accumulated knowledge and lessons learned from the opinion polls have to be adapted to the new data sources.
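The influence ranking rests on a Hirsch-type index: a user's h is the largest number such that h of their tweets each attracted at least h retweets (or an analogous count over followers). A small sketch with made-up retweet counts shows the computation.

```python
# Hirsch-index style influence score; retweet counts are made-up illustration data.
def h_index(retweet_counts):
    counts = sorted(retweet_counts, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i           # still have i tweets with >= i retweets each
        else:
            break
    return h

print(h_index([50, 18, 7, 7, 5, 2, 1, 0]))   # -> 5
```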
Estimating User Influence in Online Social Networks Subject to Information Overload
NASA Astrophysics Data System (ADS)
Li, Pei; Sun, Yunchuan; Chen, Yingwen; Tian, Zhi
2014-11-01
Online social networks have attracted remarkable attention since they provide various approaches for hundreds of millions of people to stay connected with their friends. Due to the existence of information overload, the research on diffusion dynamics in epidemiology cannot be adopted directly to that in online social networks. In this paper, we consider diffusion dynamics in online social networks subject to information overload, and model the information-processing process of a user by a queue with a batch arrival and a finite buffer. We use the average number of times a message is processed after it is generated by a given user to characterize the user influence, which is then estimated through theoretical analysis for a given network. We validate the accuracy of our estimation by simulations, and apply the results to study the impacts of different factors on the user influence. Among the observations, we find that the impact of network size on the user influence is marginal while the user influence decreases with assortativity due to information overload, which is particularly interesting.
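The finite-buffer idea can be mimicked with a toy simulation: messages arrive in batches, only those that fit in the buffer are kept, and the user processes a fixed number per step, so overload shows up as dropped messages. The parameters below are illustrative assumptions, not values from the paper.

```python
# Toy simulation of information overload: batch arrivals into a finite buffer.
import random

random.seed(1)

def simulate(buffer_size=20, batch_mean=3.0, service_per_step=2, steps=10_000):
    buffer, processed, dropped = 0, 0, 0
    for _ in range(steps):
        arrivals = int(random.expovariate(1.0 / batch_mean))   # batch arrival
        accepted = min(arrivals, buffer_size - buffer)
        dropped += arrivals - accepted                         # overload losses
        buffer += accepted
        served = min(service_per_step, buffer)                 # limited attention
        buffer -= served
        processed += served
    return processed, dropped

p, d = simulate()
print(f"processed={p}, dropped due to overload={d}")
```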
NASA Astrophysics Data System (ADS)
Jaume-i-Capó, Antoni; Varona, Javier; González-Hidalgo, Manuel; Mas, Ramon; Perales, Francisco J.
2012-02-01
Human motion capture has a wide variety of applications, and in vision-based motion capture systems a major issue is the human body model and its initialization. We present a computer vision algorithm for building a human body model skeleton in an automatic way. The algorithm is based on the analysis of the human shape. We decompose the body into its main parts by computing the curvature of a B-spline parameterization of the human contour. This algorithm has been applied in a context where the user is standing in front of a camera stereo pair. The process is completed after the user assumes a predefined initial posture so as to identify the main joints and construct the human model. Using this model, the initialization problem of a vision-based markerless motion capture system of the human body is solved.
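In the spirit of the contour-curvature step described above, the sketch below fits a B-spline to a synthetic closed contour with SciPy and flags the highest-curvature points as candidate cut points between body parts; the contour, point count and number of candidates are illustrative only.

```python
# Curvature of a B-spline fit to a 2-D contour (synthetic ellipse as stand-in).
import numpy as np
from scipy.interpolate import splprep, splev

t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
x, y = 2.0 * np.cos(t), 1.0 * np.sin(t)          # stand-in for the body silhouette

tck, u = splprep([x, y], s=0.0)                   # parametric B-spline fit
dx, dy = splev(u, tck, der=1)
ddx, ddy = splev(u, tck, der=2)
kappa = (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5   # signed curvature

# highest-curvature samples are candidate split points between body parts
candidates = np.argsort(-np.abs(kappa))[:5]
print("candidate split indices:", sorted(candidates.tolist()))
```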
Computer software tool REALM for sustainable water allocation and management.
Perera, B J C; James, B; Kularathna, M D U
2005-12-01
REALM (REsource ALlocation Model) is a generalised computer simulation package that models harvesting and bulk distribution of water resources within a water supply system. It is a modeling tool, which can be applied to develop specific water allocation models. Like other water resource simulation software tools, REALM uses mass-balance accounting at nodes, while the movement of water within carriers is subject to capacity constraints. It uses a fast network linear programming algorithm to optimise the water allocation within the network during each simulation time step, in accordance with user-defined operating rules. This paper describes the main features of REALM and provides potential users with an appreciation of its capabilities. In particular, it describes two case studies covering major urban and rural water supply systems. These case studies illustrate REALM's capabilities in the use of stochastically generated data in water supply planning and management, modelling of environmental flows, and assessing security of supply issues.
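REALM itself is a purpose-built package, but the underlying idea of mass-balance allocation through capacity-limited carriers solved as a network linear program can be sketched on a toy system; the reservoir volumes, carrier capacities and demands below are invented for illustration.

```python
# Toy water-allocation LP: two reservoirs, two demand zones, capacity-limited carriers.
from scipy.optimize import linprog

# decision variables x = [q11, q12, q21, q22, s1, s2]
#   qij = flow from reservoir i to demand zone j, sj = shortfall at zone j
c = [0, 0, 0, 0, 1, 1]                      # minimise total shortfall

A_eq = [[1, 1, 0, 0, 0, 0],                 # reservoir 1 releases its 60 units
        [0, 0, 1, 1, 0, 0],                 # reservoir 2 releases its 40 units
        [1, 0, 1, 0, 1, 0],                 # deliveries + shortfall = demand 1 (70)
        [0, 1, 0, 1, 0, 1]]                 # deliveries + shortfall = demand 2 (50)
b_eq = [60, 40, 70, 50]

bounds = [(0, 50), (0, 30), (0, 40), (0, 40), (0, None), (0, None)]  # carrier capacities

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("flows and shortfalls:", res.x, "total shortfall:", res.fun)
```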
Integrating Health Behavior Theory and Design Elements in Serious Games.
Cheek, Colleen; Fleming, Theresa; Lucassen, Mathijs Fg; Bridgman, Heather; Stasiak, Karolina; Shepherd, Matthew; Orpin, Peter
2015-01-01
Internet interventions for improving health and well-being have the potential to reach many people and fill gaps in service provision. Serious gaming interfaces provide opportunities to optimize user adherence and impact. Health interventions based in theory and evidence and tailored to psychological constructs have been found to be more effective to promote behavior change. Defining the design elements which engage users and help them to meet their goals can contribute to better informed serious games. To elucidate design elements important in SPARX, a serious game for adolescents with depression, from a user-centered perspective. We proposed a model based on an established theory of health behavior change and practical features of serious game design to organize ideas and rationale. We analyzed data from 5 studies comprising a total of 22 focus groups and 66 semistructured interviews conducted with youth and families in New Zealand and Australia who had viewed or used SPARX. User perceptions of the game were applied to this framework. A coherent framework was established using the three constructs of self-determination theory (SDT), autonomy, competence, and relatedness, to organize user perceptions and design elements within four areas important in design: computer game, accessibility, working alliance, and learning in immersion. User perceptions mapped well to the framework, which may assist developers in understanding the context of user needs. By mapping these elements against the constructs of SDT, we were able to propose a sound theoretical base for the model. This study's method allowed for the articulation of design elements in a serious game from a user-centered perspective within a coherent overarching framework. The framework can be used to deliberately incorporate serious game design elements that support a user's sense of autonomy, competence, and relatedness, key constructs which have been found to mediate motivation at all stages of the change process. The resulting model introduces promising avenues for future exploration. Involving users in program design remains an imperative if serious games are to be fit for purpose.
Operational research as implementation science: definitions, challenges and research priorities.
Monks, Thomas
2016-06-06
Operational research (OR) is the discipline of using models, either quantitative or qualitative, to aid decision-making in complex implementation problems. The methods of OR have been used in healthcare since the 1950s in diverse areas such as emergency medicine and the interface between acute and community care; hospital performance; scheduling and management of patient home visits; scheduling of patient appointments; and many other complex implementation problems of an operational or logistical nature. To date, there has been limited debate about the role that operational research should take within implementation science. I detail three such roles for OR all grounded in upfront system thinking: structuring implementation problems, prospective evaluation of improvement interventions, and strategic reconfiguration. Case studies from mental health, emergency medicine, and stroke care are used to illustrate each role. I then describe the challenges for applied OR within implementation science at the organisational, interventional, and disciplinary levels. Two key challenges include the difficulty faced in achieving a position of mutual understanding between implementation scientists and research users and a stark lack of evaluation of OR interventions. To address these challenges, I propose a research agenda to evaluate applied OR through the lens of implementation science, the liberation of OR from the specialist research and consultancy environment, and co-design of models with service users. Operational research is a mature discipline that has developed a significant volume of methodology to improve health services. OR offers implementation scientists the opportunity to do more upfront system thinking before committing resources or taking risks. OR has three roles within implementation science: structuring an implementation problem, prospective evaluation of implementation problems, and a tool for strategic reconfiguration of health services. Challenges facing OR as implementation science include limited evidence and evaluation of impact, limited service user involvement, a lack of managerial awareness, effective communication between research users and OR modellers, and availability of healthcare data. To progress the science, a focus is needed in three key areas: evaluation of OR interventions, embedding the knowledge of OR in health services, and educating OR modellers about the aims and benefits of service user involvement.
INTEGRATED AIR POLLUTION CONTROL SYSTEM, VERSION 4.0 - VOLUME 1: USER'S GUIDE
The Integrated Air Pollution Control System (IAPCS) was developed for the U.S. EPA's Air and Energy Engineering Research Laboratory to estimate costs and performance for emission control systems applied to coal-fired utility boilers. The model can project a material balance, and ...
Detailed model for practical pulverized coal furnaces and gasifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, P.J.; Smoot, L.D.
1989-08-01
This study has been supported by a consortium of nine industrial and governmental sponsors. Work was initiated on May 1, 1985 and completed August 31, 1989. The central objective of this work was to develop, evaluate and apply a practical combustion model for utility boilers, industrial furnaces and gasifiers. Key accomplishments have included: development of an advanced first-generation computer model for combustion in three-dimensional furnaces; development of a new first-generation fouling and slagging submodel; detailed evaluation of an existing NOx submodel; development and evaluation of an improved radiation submodel; preparation and distribution of a three-volume final report: (a) Volume 1: General Technical Report; (b) Volume 2: PCGC-3 User's Manual; (c) Volume 3: Data Book for Evaluation of Three-Dimensional Combustion Models; and organization of a user's workshop on the three-dimensional code. The furnace computer model developed under this study requires further development before it can be applied generally to all applications; however, it can be used now by specialists for many specific applications, including non-combusting systems and combusting gaseous systems. A new combustion center was organized and work was initiated to continue the important research effort initiated by this study. 212 refs., 72 figs., 38 tabs.
Can Counter-Gang Models be Applied to Counter ISIS’s Internet Recruitment Campaign
2016-06-10
limitation that exists is the lack of reliable statistics from social media companies in regards to the quantity of ISIS-affiliated sites, which exist on... statistics, they have approximately 320-million monthly active users with thirty-five-plus languages supported and 77 percent of accounts located... Justice and Delinquency Prevention program. For deterrence-based models, the primary point of research is focused deterrence models with emphasis placed
Factors Affecting Intention to Use in Social Networking Sites: An Empirical Study on Thai Society
NASA Astrophysics Data System (ADS)
Jairak, Rath; Sahakhunchai, Napath; Jairak, Kallaya; Praneetpolgrang, Prasong
This research aims to explore the factors that affect the intention to use Social Networking Sites (SNS). We apply the Technology Acceptance Model (TAM), intrinsic motivation theory, and trust properties to develop a theoretical framework for SNS users' intention. The results show that the important factors influencing SNS users' intention for general-purpose use and collaborative learning are task orientation, pleasure orientation, and familiarity-based trust. For marketing usage, dispositional trust and pleasure orientation are the two main factors that shape the intention to use SNS.
Internet-based data warehousing
NASA Astrophysics Data System (ADS)
Boreisha, Yurii
2001-10-01
In this paper, we consider the process of data warehouse creation and population using the latest Internet and database access technologies. A logical three-tier model is applied. This approach allows an enterprise schema to be developed by analyzing the various processes in the organization and extracting the relevant entities and relationships from them. Integration with local schemas and population of the data warehouse are done through the corresponding user, business, and data services components. The hierarchy of these components is used to hide the complexity of the online analytical processing functionality from the data warehouse users.
Constructing Agent Model for Virtual Training Systems
NASA Astrophysics Data System (ADS)
Murakami, Yohei; Sugimoto, Yuki; Ishida, Toru
Constructing highly realistic agents is essential if agents are to be employed in virtual training systems. In training for collaboration based on face-to-face interaction, the generation of emotional expressions is one key. In training for guidance based on one-to-many interaction, such as direction giving for evacuations, emotional expressions must be supplemented by diverse agent behaviors to make the training realistic. To reproduce diverse behavior, we characterize agents by using various combinations of operation rules instantiated by the user operating the agent. To accomplish this goal, we introduce a user modeling method based on participatory simulations. These simulations enable us to acquire the information observed by each user in the simulation and the operating history. Using these data and domain knowledge, including known operation rules, we can generate an explanation for each behavior. Moreover, the application of hypothetical reasoning, which offers consistent selection of hypotheses, to the generation of explanations allows us to use otherwise incompatible operation rules as domain knowledge. In order to validate the proposed modeling method, we apply it to the acquisition of an evacuee's model in a fire-drill experiment. We successfully acquire a subject's model corresponding to the results of an interview with the subject.
Scenario driven data modelling: a method for integrating diverse sources of data and data streams
2011-01-01
Background Biology is rapidly becoming a data intensive, data-driven science. It is essential that data is represented and connected in ways that best represent its full conceptual content and allow both automated integration and data driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web make it possible to deal with complicated heterogeneous data in new and interesting ways. Results This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created an RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements. Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting massive data streams into usable knowledge. PMID:22165854
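The multi-relational directed graph at the heart of SDDM can be illustrated with a small RDF sketch using rdflib; the namespace, resources and relations are hypothetical stand-ins rather than the paper's actual NDM-1 model, and the SPARQL query stands in for the kind of graph traversal derived from end-user requirements.

```python
# Minimal RDF multi-relational directed graph; all names are hypothetical examples.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/sddm/")
g = Graph()
g.bind("ex", EX)

# link a resistance gene, a clinical isolate and a media report in one graph
g.add((EX.blaNDM1, EX.confersResistanceTo, EX.carbapenems))
g.add((EX.isolate42, EX.carriesGene, EX.blaNDM1))
g.add((EX.isolate42, EX.reportedIn, EX.newsItem7))
g.add((EX.newsItem7, EX.publishedOn, Literal("2010-08-11")))

# a graph traversal of the kind that end-user requirements translate into
q = """
PREFIX ex: <http://example.org/sddm/>
SELECT ?isolate ?report WHERE {
  ?isolate ex:carriesGene ex:blaNDM1 .
  ?isolate ex:reportedIn ?report .
}"""
for row in g.query(q):
    print(row.isolate, row.report)
```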
31 CFR 344.3 - What provisions apply to the SLGSafe Service?
Code of Federal Regulations, 2010 CFR
2010-07-01
... SLGSafe transactions: (1) SLGSafe Application for Internet Access and SLGSafe User Acknowledgment; and (2) SLGSafe User's Manual. (d) Who can apply for SLGSafe access? If you are an owner or a potential owner of... access. Other potential users of SLGSafe include, but are not limited to, underwriters, financial...
SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling
NASA Technical Reports Server (NTRS)
Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)
1994-01-01
The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches which too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user with a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically-consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapo-transpiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.
ERIC Educational Resources Information Center
Smith, Harvey A.
This module is designed to apply mathematical models to nuclear deterrent problems, and to aid users in developing enlightened skepticism about the use of linear models in stability analyses and long-term predictions. An attempt is made at avoiding overwhelming complexities through concentration on land-based missile forces. It is noted that after…
Regional Disparities in Online Map User Access Volume and Determining Factors
NASA Astrophysics Data System (ADS)
Li, R.; Yang, N.; Li, R.; Huang, W.; Wu, H.
2017-09-01
Regional disparities in online map user access volume ('user access volume' hereafter) are a topic of growing interest as online maps gain popularity among public users, and understanding them helps to target the construction of geographic information services for different areas. We first statistically analysed online map user access logs and quantified these regional access disparities at different scales. The results show that, for China as a whole, the volume of user access decreases from east to west, with East China producing the most access volume; the cities with the highest volumes are also crucial economic and transport centres. Principal Component Regression (PCR) is then applied to explore the regional disparities in user access volume. A determining model for Online Map access volume is proposed, which indicates that area scale is the primary determining factor for regional disparities, followed by public transport development level and public service development level. Other factors, such as the user quality index and financial index, have very limited influence on user access volume.
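A compact sketch of the Principal Component Regression step is given below; the predictors are random placeholders standing in for the regional indicators (area scale, public transport level, public service level, and so on), since the paper's data are not reproduced here.

```python
# Principal Component Regression sketch with placeholder regional indicators.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(31, 5))                      # 31 regions, 5 candidate indicators
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=31)   # access volume

pcr = make_pipeline(StandardScaler(), PCA(n_components=3), LinearRegression())
pcr.fit(X, y)
print("components kept:", pcr.named_steps["pca"].n_components_)
print("in-sample R^2:", pcr.score(X, y))
```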
Investigation of Micro-Scale Architectural Effects on Damage of Composites
NASA Technical Reports Server (NTRS)
Stier, Bertram; Bednarcyk, Brett A.; Simon, Jaan W.; Reese, Stefanie
2015-01-01
This paper presents a three-dimensional, energy based, anisotropic, stiffness reduction, progressive damage model for composite materials and composite material constituents. The model has been implemented as a user-defined constitutive model within the Abaqus finite element software package and applied to simulate the nonlinear behavior of a damaging epoxy matrix within a unidirectional composite material. Three different composite microstructures were considered as finite element repeating unit cells, with appropriate periodicity conditions applied at the boundaries. Results representing predicted transverse tensile, longitudinal shear, and transverse shear stress-strain curves are presented, along with plots of the local fields indicating the damage progression within the microstructure. It is demonstrated that the damage model functions appropriately at the matrix scale, enabling localization of the damage to simulate failure of the composite material. The influence of the repeating unit cell geometry and the effect of the directionality of the applied loading are investigated and discussed.
Clark, David W.; Skinner, Kenneth D.; Pollock, David W.
2006-01-01
A flow and transport model was created with a graphical user interface to simplify the evaluation of nitrogen loading and nitrate transport in the mid-Snake region in south-central Idaho. This model and interface package, the Snake River Nitrate Scenario Simulator, uses the U.S. Geological Survey's MODFLOW 2000 and MOC3D models. The interface, which is enabled for use with geographic information systems (GIS), was created using ESRI's royalty-free MapObjects LT software. The interface lets users view initial nitrogen-loading conditions (representing conditions as of 1998), alter the nitrogen loading within selected zones by specifying a multiplication factor and applying it to the initial condition, run the flow and transport model, and view a graphical representation of the modeling results. The flow and transport model of the Snake River Nitrate Scenario Simulator was created by rediscretizing and recalibrating a clipped portion of an existing regional flow model. The new subregional model was recalibrated with newly available water-level data and spring and ground-water nitrate concentration data for the study area. An updated nitrogen input GIS layer controls the application of nitrogen to the flow and transport model. Users can alter the nitrogen application to the flow and transport model by altering the nitrogen load in predefined spatial zones contained within similar political, hydrologic, and size-constrained boundaries.
A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service
Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin
2014-01-01
Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016
A framework for sharing and integrating remote sensing and GIS models based on Web service.
Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin
2014-01-01
Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.
ISBP: Understanding the Security Rule of Users' Information-Sharing Behaviors in Partnership
Wu, Hongchen; Wang, Xinjun
2016-01-01
The rapid growth of social network data has given rise to high security awareness among users, especially when they exchange and share their personal information. However, because users have different feelings about sharing their information, they are often puzzled about who their partners for exchanging information can be and what information they can share. Is it possible to assist users in forming a partnership network in which they can exchange and share information with little worry? We propose a modified information sharing behavior prediction (ISBP) model that can help in understanding the underlying rules by which users share their information with partners in light of three common aspects: what types of items users are likely to share, what characteristics of users make them likely to share information, and what features of users’ sharing behavior are easy to predict. This model is applied with machine learning techniques in WEKA to predict users’ decisions pertaining to information sharing behavior and form them into trustable partnership networks by learning their features. In the experiment section, by using two real-life datasets consisting of citizens’ sharing behavior, we identify the effect of highly sensitive requests on sharing behavior adjacent to individual variables: the younger participants’ partners are more difficult to predict than those of the older participants, whereas the partners of people who are not computer majors are easier to predict than those of people who are computer majors. Based on these findings, we believe that it is necessary and feasible to offer users personalized suggestions on information sharing decisions, and this is pioneering work that could benefit college researchers focusing on user-centric strategies and website owners who want to collect more user information without raising their privacy awareness or losing their trustworthiness. PMID:26950064
Rosella, Laura C; Kornas, Kathy; Yao, Zhan; Manuel, Douglas G; Bornbaum, Catherine; Fransoo, Randall; Stukel, Therese
2017-11-17
A large proportion of health care spending is incurred by a small proportion of the population. Population-based health planning tools that consider both the clinical and upstream determinants of high resource users (HRU) of the health system are lacking. To develop and validate the High Resource User Population Risk Tool (HRUPoRT), a predictive model of adults who will become the top 5% of health care users over a 5-year period, based on self-reported clinical, sociodemographic, and health behavioral predictors in population survey data. The HRUPoRT model was developed in a prospective cohort design using the combined 2005 and 2007/2008 Canadian Community Health Surveys (CCHS) (N=58,617), and validated using the external 2009/2010 CCHS cohort (N=28,721). Health care utilization for each of the 5 years following the CCHS interview date was determined by applying a person-centered costing algorithm to the linked health administrative databases. Discrimination and calibration of the model were assessed using the c-statistic and the Hosmer-Lemeshow (HL) χ² statistic. The best prediction model for 5-year transition to HRU status included 12 predictors and had good discrimination (c-statistic=0.8213) and calibration (HL χ²=18.71) in the development cohort. The model performed similarly in the validation cohort (c-statistic=0.8171; HL χ²=19.95). The strongest predictors in the HRUPoRT model were age, perceived general health, and body mass index. HRUPoRT can accurately project the proportion of individuals in the population that will become HRUs over 5 years. HRUPoRT can be applied to inform health resource planning and prevention strategies at the community level. This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CCBY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal. http://creativecommons.org/licenses/by-nc-nd/4.0/.
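For orientation only, the sketch below shows one common way to estimate the discrimination (c-statistic, i.e., ROC AUC) of a prediction model in Python; the predictors and data are simulated placeholders, not CCHS data or the HRUPoRT specification.

```python
# Hypothetical illustration of fitting a risk model and computing the
# c-statistic (ROC AUC); data and predictors are simulated, not CCHS data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000
age = rng.uniform(20, 90, n)
bmi = rng.normal(27, 5, n)
poor_health = rng.integers(0, 2, n)          # self-rated poor general health
logit = -8.0 + 0.06 * age + 0.08 * bmi + 1.2 * poor_health
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # becomes a high resource user

X = np.column_stack([age, bmi, poor_health])
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.33, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
print("c-statistic (validation):",
      roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))
```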
Stanley, Clayton; Byrne, Michael D
2016-12-01
The growth of social media and user-created content on online sites provides unique opportunities to study models of human declarative memory. By framing the task of choosing a hashtag for a tweet and tagging a post on Stack Overflow as a declarative memory retrieval problem, 2 cognitively plausible declarative memory models were applied to millions of posts and tweets and evaluated on how accurately they predict a user's chosen tags. An ACT-R based Bayesian model and a random permutation vector-based model were tested on the large data sets. The results show that past user behavior of tag use is a strong predictor of future behavior. Furthermore, past behavior was successfully incorporated into the random permutation model that previously used only context. Also, ACT-R's attentional weight term was linked to an entropy-weighting natural language processing method used to attenuate high-frequency words (e.g., articles and prepositions). Word order was not found to be a strong predictor of tag use, and the random permutation model performed comparably to the Bayesian model without including word order. This shows that the strength of the random permutation model is not in the ability to represent word order, but rather in the way in which context information is successfully compressed. The results of the large-scale exploration show how the architecture of the 2 memory models can be modified to significantly improve accuracy, and may suggest task-independent general modifications that can help improve model fit to human data in a much wider range of domains. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
INTEGRATED AIR POLLUTION CONTROL SYSTEM VERSION 5.0 - VOLUME 1: USER'S GUIDE
The three volume report and two diskettes document the Integrated Air Pollution Control System (IAPCS), developed for the U.S. EPA to estimate costs and performance for emission control systems applied to coal-fired utility boilers. The model can project a material balance, an eq...
A Design Analysis Model for Developing World Wide Web Sites.
ERIC Educational Resources Information Center
Ma, Yan
2002-01-01
Examines the relationship between and among designers, text, and users of the Galter Health Sciences Library Web site at Northwestern University by applying reader-response criticism. Highlights include Web site design; comparison of designers' intentions with the actual organization of knowledge on the Web site; and compares designer's intentions…
ACIRF user's guide: Theory and examples
NASA Astrophysics Data System (ADS)
Dana, Roger A.
1989-12-01
Design and evaluation of radio frequency systems that must operate through ionospheric disturbances resulting from high altitude nuclear detonations require an accurate channel model. This model must include the effects of high gain antennas that may be used to receive the signals. Such a model can then be used to construct realizations of the received signal for use in digital simulations of trans-ionospheric links or for use in hardware channel simulators. The FORTRAN channel model ACIRF (Antenna Channel Impulse Response Function) generates random realizations of the impulse response function at the outputs of multiple antennas. This user's guide describes the FORTRAN program ACIRF (version 2.0) that generates realizations of channel impulse response functions at the outputs of multiple antennas with arbitrary beamwidths, pointing angles, and relative positions. This channel model is valid under strong scattering conditions when Rayleigh fading statistics apply. Both frozen-in and turbulent models for the temporal fluctuations are included in this version of ACIRF. The theory of the channel model is described and several examples are given.
Augmented Reality Implementation in Watch Catalog as e-Marketing Based on Mobile Aplication
NASA Astrophysics Data System (ADS)
Adrianto, D.; Luwinda, F. A.; Yesmaya, V.
2017-01-01
Augmented Reality is one of the important methods for providing users with a more interactive user interface. In this research, Augmented Reality in a mobile application is applied to provide users with useful information related to a watch catalogue. The research focuses on the design and implementation of an application using Augmented Reality. The process model used is Extreme Programming, which has several steps: planning, design, coding, and testing. The result of this research is an Augmented Reality application based on Android. The research concludes that implementing Android-based Augmented Reality in the watch catalogue helps customers collect useful information related to a specific watch.
Semantics of User Interface for Image Retrieval: Possibility Theory and Learning Techniques.
ERIC Educational Resources Information Center
Crehange, M.; And Others
1989-01-01
Discusses the need for a rich semantics for the user interface in interactive image retrieval and presents two methods for building such interfaces: possibility theory applied to fuzzy data retrieval, and a machine learning technique applied to learning the user's deep need. Prototypes developed using videodisks and knowledge-based software are…
User's Guide for Monthly Vector Wind Profile Model
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1999-01-01
The background, theoretical concepts, and methodology for construction of vector wind profiles based on a statistical model are presented. The derived monthly vector wind profiles are to be applied by the launch vehicle design community for establishing realistic estimates of critical vehicle design parameter dispersions related to wind profile dispersions. During initial studies a number of months are used to establish the model profiles that produce the largest monthly dispersions of ascent vehicle aerodynamic load indicators. The largest monthly dispersions for wind, which occur during the winter high-wind months, are used for establishing the design reference dispersions for the aerodynamic load indicators. This document includes a description of the computational process for the vector wind model including specification of input data, parameter settings, and output data formats. Sample output data listings are provided to aid the user in the verification of test output.
Users manual for a one-dimensional Lagrangian transport model
Schoellhamer, D.H.; Jobson, H.E.
1986-01-01
A Users Manual for the Lagrangian Transport Model (LTM) is presented. The LTM uses Lagrangian calculations that are based on a reference frame moving with the river flow. The Lagrangian reference frame eliminates the need to numerically solve the convective term of the convection-diffusion equation and provides significant numerical advantages over the more commonly used Eulerian reference frame. When properly applied, the LTM can simulate riverine transport and decay processes within the accuracy required by most water quality studies. The LTM is applicable to steady or unsteady one-dimensional unidirectional flows in fixed channels with tributary and lateral inflows. Application of the LTM is relatively simple and optional capabilities improve the model's convenience. Appendices give file formats and three example LTM applications that include the incorporation of the QUAL II water quality model's reaction kinetics into the LTM. (Author's abstract)
NASA Astrophysics Data System (ADS)
Horvath, Denis; Gazda, Juraj; Brutovsky, Branislav
Evolutionary species and quasispecies models provide a universal and flexible basis for a large-scale description of the dynamics of evolutionary systems, which can be conceived as a constraint satisfaction dynamics. This represents a general framework to design and study many novel, technologically contemporary models and their variants. Here, we apply the classical quasispecies concept to model the emerging dynamic spectrum access (DSA) markets. The theory describes the mechanisms of mimetic transfer, competitive interactions between socioeconomic strata of the end-users, their perception of utility, and inter-operator switching in the variable technological environments of the operators offering wireless spectrum services. The algorithmization and numerical modeling demonstrate the long-term evolutionary socioeconomic changes which reflect the end-user preferences and the results of the majorization of their irrational decisions, in the same manner as the prevailing tendencies embodied in the efficient market hypothesis.
Designing Class Methods from Dataflow Diagrams
NASA Astrophysics Data System (ADS)
Shoval, Peretz; Kabeli-Shani, Judith
A method for designing the class methods of an information system is described. The method is part of FOOM - Functional and Object-Oriented Methodology. In the analysis phase of FOOM, two models defining the users' requirements are created: a conceptual data model - an initial class diagram; and a functional model - hierarchical OO-DFDs (object-oriented dataflow diagrams). Based on these models, a well-defined process of methods design is applied. First, the OO-DFDs are converted into transactions, i.e., system processes that support user tasks. The components and the process logic of each transaction are described in detail, using pseudocode. Then, each transaction is decomposed, according to well-defined rules, into class methods of various types: basic methods, application-specific methods and main transaction (control) methods. Each method is attached to a proper class; messages between methods express the process logic of each transaction. The methods are defined using pseudocode or message charts.
Predicting Operator Execution Times Using CogTool
NASA Technical Reports Server (NTRS)
Santiago-Espada, Yamira; Latorella, Kara A.
2013-01-01
Researchers and developers of NextGen systems can use predictive human performance modeling tools as an initial approach to obtain skilled user performance times analytically, before system testing with users. This paper describes the CogTool models for a two-pilot crew executing two different types of datalink clearance acceptance tasks on two different simulation platforms. The CogTool time estimates for accepting and executing Required Time of Arrival and Interval Management clearances were compared to empirical data observed in video tapes and registered in simulation files. Results indicate no statistically significant difference between empirical data and the CogTool predictions. A population comparison test found no significant differences between the CogTool estimates and the empirical execution times for any of the four test conditions. We discuss modeling caveats and considerations for applying CogTool to crew performance modeling in advanced cockpit environments.
PSAMM: A Portable System for the Analysis of Metabolic Models
Steffensen, Jon Lund; Dufault-Thompson, Keith; Zhang, Ying
2016-01-01
The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM), a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies. PMID:26828591
Optimizing Sensing: From Water to the Web
2009-05-01
e.g., when applied to business practices, the Pareto Principle says that “80% of your sales come from 20% of your clients.” In water distribution...of this challenge included a realistic model of a real metropolitan area water distribution network (Figure 4(a)) with 12,527 nodes, as well as a...we learn a probabilistic model P(Y, XV) from training data collected by [37] from 20 users. This model encodes the statistical dependencies between
Research on a dynamic workflow access control model
NASA Astrophysics Data System (ADS)
Liu, Yiliang; Deng, Jinxia
2007-12-01
In recent years, access control technology has been researched widely in workflow systems. Two typical approaches are the RBAC (Role-Based Access Control) and TBAC (Task-Based Access Control) models, which have been used successfully, to a certain extent, for role authorization and assignment. However, as a system's structure becomes more complex, these two technologies cannot minimize privileges or separate duties, and they are inapplicable when users frequently request changes to the workflow process. To avoid these weaknesses in practice, a variable-flow dynamic role_task_view fine-grained access control model (abbreviated DRTVBAC) is constructed on the basis of the existing models. When the model is applied, an algorithm is constructed to satisfy users' application requirements and security needs under the fine-grained principles of least privilege and dynamic separation of duties. The DRTVBAC model has been implemented in an actual system; the figure shows that associating tasks with dynamic role management makes role assignment more flexible with respect to granting and revoking authority, so that the principle of least privilege is met when the permissions of a specific task are activated; authority is separated from the process of completing duties in the workflow; disclosure of sensitive information is prevented through a concise, dynamic view interface; and the requirement of frequently varying task flows is satisfied.
Marsh, T; Wright, P; Smith, S
2001-04-01
New and emerging media technologies have the potential to induce a variety of experiences in users. In this paper, it is argued that the inducement of experience presupposes that users are absorbed in the illusion created by these media. Looking to another successful visual medium, film, this paper borrows from the techniques used in "shaping experience" to hold spectators' attention in the illusion of film, and identifies what breaks the illusion/experience for spectators. This paper focuses on one medium, virtual reality (VR), and advocates a transparent or "invisible style" of interaction. We argue that transparency keeps users in the "flow" of their activities and consequently enhances experience in users. Breakdown in activities breaks the experience and subsequently provides opportunities to identify and analyze potential causes of usability problems. Adopting activity theory, we devise a model of interaction with VR--through consciousness and activity--and introduce the concept of breakdown in illusion. From this, a model of effective interaction with VR is devised and the occurrence of breakdown in interaction and illusion is identified along a continuum of engagement. Evaluation guidelines for the design of experience are proposed and applied to usability problems detected in an empirical study of a head-mounted display (HMD) VR system. This study shows that the guidelines are effective in the evaluation of VR. Finally, we look at the potential experiences that may be induced in users and propose a way to evaluate user experience in virtual environments (VEs) and other new and emerging media.
Santos, Debora de Souza; Mishima, Silvana Martins; Merhy, Emerson Elias
2018-03-01
This is a study with a qualitative approach that aims to analyze the subjective dimension of the Family Health teams' practices conducted in order to reconfigure the care model, using the work process in health as the fundamental analytical category from the Marxist standpoint. The data gathering tool used was the focus group, applied in 13 Family Health teams in Maceió, Alagoas. Thematic analysis combined with principles of hermeneutics and dialectics was used for treatment and interpretation of the data. The results indicate that the user's needs identified by the teams are multiple and permeated by a lack of sympathetic care, and that the user repeatedly wishes to be "heard." The teams show that they are open to the use of soft technologies of compassionate care, although they do not legitimate them as instruments of their work. Clinical knowledge and techniques are prioritized in the work process, which limits the potential of its subjective dimension in order to (re)configure a care model which attends the user's needs and is sustained by integrated care.
A Two-Phase Model for Trade Matching and Price Setting in Double Auction Water Markets
NASA Astrophysics Data System (ADS)
Xu, Tingting; Zheng, Hang; Zhao, Jianshi; Liu, Yicheng; Tang, Pingzhong; Yang, Y. C. Ethan; Wang, Zhongjing
2018-04-01
Delivery in water markets is generally operated by agencies through channel systems, which imposes physical and institutional market constraints. Many water markets allow water users to post selling and buying requests on a board. However, water users may not be able to choose efficiently when the information (including the constraints) becomes complex. This study proposes an innovative two-phase model to address this problem based on practical experience in China. The first phase seeks and determines the optimal assignment that maximizes the incremental improvement of the system's social welfare according to the bids and asks in the water market. The second phase sets appropriate prices under constraints. Applying this model to China's Xiying Irrigation District shows that it can improve social welfare more than the current "pool exchange" method can. Within the second phase, we evaluate three objective functions (minimum variance, threshold-based balance, and two-sided balance), which represent different managerial goals. The threshold-based balance function should be preferred by most users, while the two-sided balance should be preferred by players who post extreme prices.
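The paper's two-phase formulation is not reproduced here; as a rough, hypothetical illustration of the idea behind the first phase, the sketch below matches posted bids and asks so as to maximize the total gain from trade (ignoring the channel and institutional constraints the paper handles), leaving price setting, the second phase, aside.

```python
# Hypothetical double-auction matching that maximizes total gains from trade.
# Entries are (trader, price per unit, quantity); real water-market
# physical/institutional constraints are ignored in this sketch.
bids = [("B1", 12.0, 30), ("B2", 10.0, 20), ("B3", 8.0, 25)]   # buyers
asks = [("S1", 6.0, 40), ("S2", 9.0, 30), ("S3", 11.0, 20)]    # sellers

bids.sort(key=lambda b: -b[1])   # highest willingness to pay first
asks.sort(key=lambda a: a[1])    # cheapest supply first

trades, welfare = [], 0.0
i = j = 0
bq, aq = bids[0][2], asks[0][2]
while i < len(bids) and j < len(asks) and bids[i][1] >= asks[j][1]:
    q = min(bq, aq)
    trades.append((bids[i][0], asks[j][0], q))
    welfare += q * (bids[i][1] - asks[j][1])   # incremental surplus
    bq -= q
    aq -= q
    if bq == 0:
        i += 1
        bq = bids[i][2] if i < len(bids) else 0
    if aq == 0:
        j += 1
        aq = asks[j][2] if j < len(asks) else 0

print("matched trades:", trades)
print("total welfare gain:", welfare)
```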
Dawson, Carolyn H; Mackrill, Jamie B; Cain, Rebecca
2017-12-01
Hand hygiene (HH) prevents harmful contaminants spreading in settings including domestic, health care and food handling. Strategies to improve HH range from behavioural techniques through to automated sinks that ensure hand surface cleaning. This study aimed to assess user experience and acceptance towards a new automated sink, compared to a normal sink. An adapted version of the technology acceptance model (TAM) assessed each mode of handwashing. A within-subjects design enabled N = 46 participants to evaluate both sinks. Perceived Ease of Use and Satisfaction of Use were significantly lower for the automated sink, compared to the conventional sink (p < 0.005). Across the remaining TAM factors, there was no significant difference. Participants suggested design features including jet strength, water temperature and device affordance may improve HH technology. We provide recommendations for future HH technology development to contribute a positive user experience, relevant to technology developers, ergonomists and those involved in HH across all sectors. Practitioner Summary: The need to facilitate timely, effective hand hygiene to prevent illness has led to a rise in automated handwashing systems across different contexts. User acceptance is a key factor in system uptake. This paper applies the technology acceptance model as a means to explore and optimise the design of such systems.
Predicting mosaics and wildlife diversity resulting from fire disturbance to a forest ecosystem
NASA Astrophysics Data System (ADS)
Potter, Meredith W.; Kessell, Stephen R.
1980-05-01
A model for predicting community mosaics and wildlife diversity resulting from fire disturbance to a forest ecosystem is presented. It applies an algorithm that delineates the size and shape of each patch from grid-based input data and calculates standard diversity measures for the entire mosaic of community patches and their included animal species. The user can print these diversity calculations, maps of the current community-type-age-class mosaic, and maps of habitat utilization by each animal species. Furthermore, the user can print estimates of changes in each resulting from natural disturbance. Although data and resolution level independent, the model is demonstrated and tested with data from the Lewis and Clark National Forest in Montana.
Modelling operations and security of cloud systems using Z-notation and Chinese Wall security policy
NASA Astrophysics Data System (ADS)
Basu, Srijita; Sengupta, Anirban; Mazumdar, Chandan
2016-11-01
Enterprises are increasingly using cloud computing for hosting their applications. Availability of fast Internet and cheap bandwidth are causing greater number of people to use cloud-based services. This has the advantage of lower cost and minimum maintenance. However, ensuring security of user data and proper management of cloud infrastructure remain major areas of concern. Existing techniques are either too complex, or fail to properly represent the actual cloud scenario. This article presents a formal cloud model using the constructs of Z-notation. Principles of the Chinese Wall security policy have been applied to design secure cloud-specific operations. The proposed methodology will enable users to safely host their services, as well as process sensitive data, on cloud.
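The paper specifies its operations formally in Z-notation; purely as an informal illustration of the Chinese Wall principle it builds on, the Python sketch below grants a user access to a dataset only if the user has not previously accessed a competing dataset in the same conflict-of-interest class (all names are hypothetical).

```python
# Informal sketch of a Chinese Wall access check (not the paper's Z model).
# conflict_classes: datasets grouped by conflict-of-interest class.
conflict_classes = {
    "banks": {"BankA", "BankB"},
    "energy": {"OilCo", "GasCo"},
}
history = {}  # user -> set of datasets already accessed

def may_access(user, dataset):
    """Allow access unless the user has already accessed a competitor."""
    accessed = history.get(user, set())
    for members in conflict_classes.values():
        if dataset in members and accessed & (members - {dataset}):
            return False
    return True

def access(user, dataset):
    if may_access(user, dataset):
        history.setdefault(user, set()).add(dataset)
        return True
    return False

print(access("alice", "BankA"))   # True: no prior conflict
print(access("alice", "OilCo"))   # True: different conflict class
print(access("alice", "BankB"))   # False: competes with BankA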
Precise tracking of remote sensing satellites with the Global Positioning System
NASA Technical Reports Server (NTRS)
Yunck, Thomas P.; Wu, Sien-Chong; Wu, Jiun-Tsong; Thornton, Catherine L.
1990-01-01
The Global Positioning System (GPS) can be applied in a number of ways to track remote sensing satellites at altitudes below 3000 km with accuracies of better than 10 cm. All techniques use a precise global network of GPS ground receivers operating in concert with a receiver aboard the user satellite, and all estimate the user orbit, GPS orbits, and selected ground locations simultaneously. The GPS orbit solutions are always dynamic, relying on the laws of motion, while the user orbit solution can range from purely dynamic to purely kinematic (geometric). Two variations show considerable promise. The first one features an optimal synthesis of dynamics and kinematics in the user solution, while the second introduces a novel gravity model adjustment technique to exploit data from repeat ground tracks. These techniques, to be demonstrated on the Topex/Poseidon mission in 1992, will offer subdecimeter tracking accuracy for dynamically unpredictable satellites down to the lowest orbital altitudes.
A Fuzzy Query Mechanism for Human Resource Websites
NASA Astrophysics Data System (ADS)
Lai, Lien-Fu; Wu, Chao-Chin; Huang, Liang-Tsung; Kuo, Jung-Chih
Users' preferences often contain imprecision and uncertainty that are difficult for traditional human resource websites to deal with. In this paper, we apply the fuzzy logic theory to develop a fuzzy query mechanism for human resource websites. First, a storing mechanism is proposed to store fuzzy data into conventional database management systems without modifying DBMS models. Second, a fuzzy query language is proposed for users to make fuzzy queries on fuzzy databases. User's fuzzy requirement can be expressed by a fuzzy query which consists of a set of fuzzy conditions. Third, each fuzzy condition associates with a fuzzy importance to differentiate between fuzzy conditions according to their degrees of importance. Fourth, the fuzzy weighted average is utilized to aggregate all fuzzy conditions based on their degrees of importance and degrees of matching. Through the mutual compensation of all fuzzy conditions, the ordering of query results can be obtained according to user's preference.
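As a small, hypothetical illustration of the aggregation step described here, the sketch below combines the degree of matching of each fuzzy condition with its degree of importance using a fuzzy weighted average (a weighted mean of membership degrees); the candidate and condition names are invented, not from the paper.

```python
# Hypothetical fuzzy-weighted-average aggregation of query conditions.
# Each condition is (degree of matching in [0,1], degree of importance in [0,1]),
# e.g. salary range, commuting distance, years of experience.
candidates = {
    "applicant_1": [(0.9, 1.0), (0.6, 0.5), (0.8, 0.8)],
    "applicant_2": [(0.7, 1.0), (0.9, 0.5), (0.5, 0.8)],
}

def fuzzy_weighted_average(conditions):
    num = sum(match * weight for match, weight in conditions)
    den = sum(weight for _, weight in conditions)
    return num / den if den else 0.0

# Rank candidates by aggregated satisfaction of the fuzzy query.
ranking = sorted(candidates.items(),
                 key=lambda kv: fuzzy_weighted_average(kv[1]),
                 reverse=True)
for name, conds in ranking:
    print(name, round(fuzzy_weighted_average(conds), 3))
```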
NASA Technical Reports Server (NTRS)
Shih, Ming H.; Soni, Bharat K.
1993-01-01
The issue of time efficiency in grid generation is addressed by developing a user-friendly graphical interface for interactive/automatic construction of structured grids around complex turbomachinery/axisymmetric configurations. Accurate, high-fidelity geometry modeling is achieved by adapting the nonuniform rational B-spline (NURBS) representation. A customized interactive grid generation code, TIGER, has been developed to facilitate the grid generation process for complicated internal, external, and internal-external turbomachinery field simulations. The FORMS Library is utilized to build the user-friendly graphical interface. The algorithm allows a user to redistribute grid points interactively on curves/surfaces using the NURBS formulation with an accurate geometric definition. TIGER's features include multiblock, multiduct/shroud, multiblade row, uneven blade count, and patched/overlapping block interfaces. It has been applied to generate grids for various complicated turbomachinery geometries, as well as rocket and missile configurations.
Individual and Environmental Correlates to Quality of Life in Park Users in Colombia
Camargo, Diana Marina
2017-01-01
Purpose: To explore individual and environmental correlates to quality of life (QoL) in park users in Colombia. Methods: A cross-sectional study with face-to-face interviews was conducted with 1392 park users from ten parks in Colombia. The survey included sociodemographic questions and health condition assessed with the EuroQol-5-Dimensions-5-Levels; in addition, questions about accessibility to the parks and perceptions about quality of infrastructure and green areas were asked. The Spanish version of the EUROHIS-QOL-8 questionnaire was applied to assess QoL. Log-binomial regression models were applied for analyses. Results: Years of schooling, visits to the park with a companion, active use of the park, a maximum score for quality of trees and walking paths, and the perception of safety on the way to the park were positively associated with a better QoL (p < 0.05). Health conditions related to problems in the ability to perform activities of daily living and anxiety/depression showed negative associations. Conclusions: The present study contributes to Latin American studies by providing information on how parks in an intermediate city may contribute to increased QoL of park users through safety in neighborhoods, social support, active use, and aesthetics, cleanliness, and care of green areas. PMID:29048373
An effective trust-based recommendation method using a novel graph clustering algorithm
NASA Astrophysics Data System (ADS)
Moradi, Parham; Ahmadian, Sajad; Akhlaghian, Fardin
2015-10-01
Recommender systems are programs that aim to provide personalized recommendations to users for specific items (e.g. music, books) in online sharing communities or on e-commerce sites. Collaborative filtering methods are important and widely accepted types of recommender systems that generate recommendations based on the ratings of like-minded users. On the other hand, these systems confront several inherent issues such as data sparsity and cold start problems, caused by fewer ratings against the unknowns that need to be predicted. Incorporating trust information into the collaborative filtering systems is an attractive approach to resolve these problems. In this paper, we present a model-based collaborative filtering method by applying a novel graph clustering algorithm and also considering trust statements. In the proposed method first of all, the problem space is represented as a graph and then a sparsest subgraph finding algorithm is applied on the graph to find the initial cluster centers. Then, the proposed graph clustering algorithm is performed to obtain the appropriate users/items clusters. Finally, the identified clusters are used as a set of neighbors to recommend unseen items to the current active user. Experimental results based on three real-world datasets demonstrate that the proposed method outperforms several state-of-the-art recommender system methods.
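The paper's specific sparsest-subgraph and graph-clustering algorithms are not reproduced here; as a generic, hypothetical sketch of the final recommendation step it describes, the code below clusters users on their rating vectors (with ordinary k-means standing in for the proposed trust-aware graph clustering) and predicts an unseen item's rating from the active user's cluster members.

```python
# Generic cluster-based collaborative filtering sketch (k-means stands in for
# the paper's graph clustering; the ratings matrix is simulated, 0 = unrated).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
ratings = rng.integers(0, 6, size=(50, 20)).astype(float)   # 50 users x 20 items
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(ratings)

def predict(user, item):
    """Average the observed ratings of the user's cluster members for the item."""
    members = np.where(labels == labels[user])[0]
    observed = ratings[members, item]
    observed = observed[observed > 0]
    return observed.mean() if observed.size else ratings[ratings > 0].mean()

print("predicted rating for user 3, item 7:", round(predict(3, 7), 2))
```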
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarocki, John Charles; Zage, David John; Fisher, Andrew N.
LinkShop is a software tool for applying the method of Linkography to the analysis of time-sequence data. LinkShop provides command line, web, and application programming interfaces (API) for input and processing of time-sequence data, abstraction models, and ontologies. The software creates graph representations of the abstraction model, ontology, and derived linkograph. Finally, the tool allows the user to perform statistical measurements of the linkograph and refine the ontology through direct manipulation of the linkograph.
2014-06-01
from the ODM standard. Leveraging SPARX EA’s Java application programming interface (API), the team built a tool called OWL2EA that can ingest an OWL...server MySQL creates the physical schema that enables a user to store and retrieve data conforming to the vocabulary of the JC3IEDM. 6. GENERATING AN
2015-04-29
in which we applied these adaptation patterns to an adaptive news web server intended to tolerate extremely heavy, unexpected loads. To address...collection of existing models used as benchmarks for OO-based refactoring and an existing web-based repository called REMODD to provide users with model...invariant properties. Specifically, we developed Avida-MDE (based on the Avida digital evolution platform) to support the automatic generation of software
Modelling of information diffusion on social networks with applications to WeChat
NASA Astrophysics Data System (ADS)
Liu, Liang; Qu, Bo; Chen, Bin; Hanjalic, Alan; Wang, Huijuan
2018-04-01
Traces of user activities recorded in online social networks open new possibilities to systematically understand the information diffusion process on social networks. From the online social network WeChat, we collected a large number of information cascade trees, each of which tells the spreading trajectory of a message/information, such as which user creates the information and which users view or forward the information shared by which neighbours. In this work, we propose two heterogeneous non-linear models, one for the topologies of the information cascade trees and the other for the stochastic process of information diffusion on a social network. Both models are validated by the WeChat data in reproducing and explaining key features of cascade trees. Specifically, we apply the Random Recursive Tree (RRT) to model the growth of cascade trees. The RRT model could capture key features, i.e. the average path length and degree variance of a cascade tree in relation to the number of nodes (size) of the tree. Its single identified parameter quantifies the relative depth or broadness of the cascade trees and indicates whether information propagates via star-like broadcasting or viral-like hop-by-hop spreading. The RRT model explains the appearance of hubs, thus a possibly smaller average path length as the cascade size increases, as observed in WeChat. We further propose the stochastic Susceptible View Forward Removed (SVFR) model to depict the dynamic user behaviour including creating, viewing, forwarding and ignoring a message on a given social network. Besides the average path length and degree variance of the cascade trees in relation to their sizes, the SVFR model could further explain the power-law cascade size distribution in WeChat and unravel that a user with a large number of friends may actually have a smaller probability of reading a message (s)he receives due to limited attention.
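To make the cascade-tree growth idea concrete, the sketch below grows a tree by attaching each new node either to the root (star-like broadcasting) or to a uniformly random existing node (viral, hop-by-hop spreading); this single-parameter toy variant is only loosely inspired by the RRT model, and the parameter values are illustrative, not fitted to WeChat data.

```python
# Simplified, hypothetical cascade-tree growth loosely inspired by recursive
# tree models: p_broadcast controls star-like vs. hop-by-hop spreading.
import random

def grow_cascade(n_nodes, p_broadcast, seed=0):
    random.seed(seed)
    parent = {0: None}   # node 0 is the message creator (root)
    depth = {0: 0}
    for new in range(1, n_nodes):
        attach = 0 if random.random() < p_broadcast else random.randrange(new)
        parent[new] = attach
        depth[new] = depth[attach] + 1
    return depth

for p in (0.9, 0.1):
    depth = grow_cascade(2000, p)
    avg_depth = sum(depth.values()) / len(depth)
    print(f"p_broadcast={p}: average node depth = {avg_depth:.2f}")
```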
Bohnet-Joschko, Sabine; Kientzler, Fionn
2010-01-01
Management science defines user-generated innovations as open innovation and lead user innovation. The medical technology industry finds user-generated innovations profitable and even indispensable. Innovative medical doctors as lead users need medical technology innovations in order to improve patient care. Their motivation to innovate is mostly intrinsic. But innovations may also involve extrinsic motivators such as gain in reputation or monetary incentives. Medical doctors' innovative activities often take place in hospitals and are thus embedded into the hospital's organisational setting. Hospitals find it difficult to gain short-term profits from in-house generated innovations and sometimes hesitate to support them. Strategic investment in medical doctors' innovative activities may be profitable for hospitals in the long run if innovations provide first-mover competitive advantages. Industry co-operations with innovative medical doctors offer chances but also bear potential risks. Innovative ideas generated by expert users may result in even higher complexity of medical devices; this could cause mistakes when applied by less specialised users and thus affect patient safety. Innovations that yield benefits for patients, medical doctors, hospitals and the medical technology industry can be advanced by offering adequate support for knowledge transfer and co-operation models.
Trajectory Based Behavior Analysis for User Verification
NASA Astrophysics Data System (ADS)
Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah
Many of our activities on computer need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid the possible copy or simulation from other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distribution in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold learnt tuning for catching the pairwise relationship. Based on the pairwise relationship, we plug-in any effective classification or clustering methods for the detection of unauthorized access. The method can also be applied for the task of recognition, predicting the trajectory type without pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest whom owns the trajectory if the input identity is not provided.
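As a rough, hypothetical illustration of the modelling idea (not the paper's Markov-chain formulation or its manifold-tuned dissimilarity measure), the sketch below fits a single Gaussian to the step vectors of an enrolled user's trajectories and scores a new trajectory by its average log-likelihood; a markedly lower score suggests a different user. All data are simulated.

```python
# Hypothetical trajectory-verification sketch: model an enrolled user's step
# vectors with one Gaussian and score new trajectories by mean log-likelihood.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)

def steps(traj):
    return np.diff(traj, axis=0)            # displacement vectors

# Enrolment: simulated trajectories of the legitimate user (drift + noise).
enrol = [np.cumsum(rng.normal([1.0, 0.2], 0.5, size=(200, 2)), axis=0)
         for _ in range(5)]
S = np.vstack([steps(t) for t in enrol])
mu, cov = S.mean(axis=0), np.cov(S, rowvar=False)

def score(traj):
    return multivariate_normal(mu, cov).logpdf(steps(traj)).mean()

genuine = np.cumsum(rng.normal([1.0, 0.2], 0.5, size=(200, 2)), axis=0)
intruder = np.cumsum(rng.normal([-0.5, 1.0], 0.8, size=(200, 2)), axis=0)
print("genuine score :", round(score(genuine), 2))
print("intruder score:", round(score(intruder), 2))
```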
NASA Astrophysics Data System (ADS)
Quattrochi, D. A.; Estes, M. G., Jr.; Al-Hamdan, M. Z.; Thom, R.; Woodruff, D.; Judd, C.; Ellis, J. T.; Swann, R.; Johnson, H., III
2010-12-01
New data, tools, and capabilities for decision making are significant needs in the northern Gulf of Mexico and other coastal areas. The goal of this project is to support NASA’s Earth Science Mission Directorate and its Applied Science Program and the Gulf of Mexico Alliance by producing and providing NASA data and products that will benefit decision making by coastal resource managers and other end users in the Gulf region. Data and research products are being developed to assist coastal resource managers adapt and plan for changing conditions by evaluating how climate changes and urban expansion will impact land cover/land use (LCLU), hydrodynamics, water properties, and shallow water habitats; to identify priority areas for conservation and restoration; and to distribute datasets to end-users and facilitating user interaction with models. The proposed host sites for data products are NOAA’s National Coastal Data Development Center Regional Ecosystem Data Management, and Mississippi-Alabama Habitat Database. Tools will be available on the Gulf of Mexico Regional Collaborative website with links to data portals to enable end users to employ models and datasets to develop and evaluate LCLU and climate scenarios of particular interest. These data will benefit the Mobile Bay National Estuary Program in ongoing efforts to protect and restore the Fish River watershed and around Weeks Bay National Estuarine Research Reserve. The usefulness of data products and tools will be demonstrated at an end-user workshop.
NASA Technical Reports Server (NTRS)
Quattrochi, Dale; Estes, Maurice, Jr.; Al-Hamdan, Mohammad; Thom, Ron; Woodruff, Dana; Judd, Chaeli; Ellis, Jean; Swann, Roberta; Johnson, Hoyt, III
2010-01-01
New data, tools, and capabilities for decision making are significant needs in the northern Gulf of Mexico and other coastal areas. The goal of this project is to support NASA's Earth Science Mission Directorate and its Applied Science Program and the Gulf of Mexico Alliance by producing and providing NASA data and products that will benefit decision making by coastal resource managers and other end users in the Gulf region. Data and research products are being developed to assist coastal resource managers adapt and plan for changing conditions by evaluating how climate changes and urban expansion will impact land cover/land use (LCLU), hydrodynamics, water properties, and shallow water habitats; to identify priority areas for conservation and restoration; and to distribute datasets to end-users and facilitating user interaction with models. The proposed host sites for data products are NOAA's National Coastal Data Development Center Regional Ecosystem Data Management, and Mississippi-Alabama Habitat Database. Tools will be available on the Gulf of Mexico Regional Collaborative website with links to data portals to enable end users to employ models and datasets to develop and evaluate LCLU and climate scenarios of particular interest. These data will benefit the Mobile Bay National Estuary Program in ongoing efforts to protect and restore the Fish River watershed and around Weeks Bay National Estuarine Research Reserve. The usefulness of data products and tools will be demonstrated at an end-user workshop.
Augmented Computer Mouse Would Measure Applied Force
NASA Technical Reports Server (NTRS)
Li, Larry C. H.
1993-01-01
Proposed computer mouse measures force of contact applied by user. Adds another dimension to two-dimensional-position-measuring capability of conventional computer mouse; force measurement designated to represent any desired continuously variable function of time and position, such as control force, acceleration, velocity, or position along axis perpendicular to computer video display. Proposed mouse enhances sense of realism and intuition in interaction between operator and computer. Useful in such applications as three-dimensional computer graphics, computer games, and mathematical modeling of dynamics.
Yoon, Sunmoo
2017-01-01
Background Twitter can address the mental health challenges of dementia care. The aim of this study is to explore the contents and user interactions of tweets mentioning dementia to gain insights for dementia care. Methods We collected 35,260 tweets mentioning Alzheimer’s or dementia on World Alzheimer’s Day, September 21st in 2015. Topic modeling and social network analysis were applied to uncover the content and structure of user communication. Results Global users generated keywords related to mental health and care including #psychology and #mental health. There were similarities and differences between the UK and the US in tweet content. The macro-level analysis uncovered substantial public interest in dementia. The meso-level network analysis revealed that top leaders of communities were spiritual organizations and traditional media. Conclusions The application of topic modeling and multi-level network analysis while incorporating visualization techniques can promote a global-level understanding regarding public attention, interests, and insights regarding dementia care and mental health. PMID:27803262
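As a generic, hypothetical illustration of the topic-modeling step (the study's actual pipeline and tweet corpus are not reproduced), the sketch below fits a small LDA model to a few toy tweet-like documents with scikit-learn and prints the top words per topic.

```python
# Toy topic-modeling sketch with LDA; the documents are invented examples,
# not the 35,260 collected tweets.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

tweets = [
    "caring for mom with dementia is exhausting but rewarding",
    "new research on alzheimers biomarkers announced today",
    "support groups help caregivers manage stress and mental health",
    "world alzheimers day raises awareness of dementia research",
    "psychology of memory loss and caregiver wellbeing",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```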
Dieckmann, P; Rall, M; Ostergaard, D
2009-01-01
We describe how simulation and incident reporting can be used in combination to make the interaction between people, (medical) technology and organisation safer for patients and users. We provide the background rationale for our conceptual ideas and apply the concepts to the analysis of an actual incident report. Simulation can serve as a laboratory to analyse such cases and to create relevant and effective training scenarios based on such analyses. We will describe a methodological framework for analysing simulation scenarios in a way that allows discovering and discussing mismatches between conceptual models of the device design and mental models users hold about the device and its use. We further describe how incident reporting systems can be used as one source of data to conduct the necessary needs analyses - both for training and further needs for closer analysis of specific devices or some of their special features or modes during usability analyses.
Nunes, David; Tran, Thanh-Dien; Raposo, Duarte; Pinto, André; Gomes, André; Silva, Jorge Sá
2012-01-01
As the Internet evolved, social networks (such as Facebook) have bloomed and brought together an astonishing number of users. Mashing up mobile phones and sensors with these social environments enables the creation of people-centric sensing systems which have great potential for expanding our current social networking usage. However, such systems also have many associated technical challenges, such as privacy concerns, activity detection mechanisms or intermittent connectivity, as well as limitations due to the heterogeneity of sensor nodes and networks. Considering the openness of the Web 2.0, good technical solutions for these cases consist of frameworks that expose sensing data and functionalities as common Web-Services. This paper presents our RESTful Web Service-based model for people-centric sensing frameworks, which uses sensors and mobile phones to detect users' activities and locations, sharing this information amongst the user's friends within a social networking site. We also present some screenshot results of our experimental prototype.
Remote health coaching for interactive exercise with older adults in a home environment.
Jimison, Holly B; Hagler, Stuart; Kurillo, Gregorij; Bajcsy, Ruzena; Pavel, Misha
2015-01-01
Optimal health coaching interventions are tailored to individuals' needs, preferences, motivations, barriers, timing, and readiness to change. Technology approaches are useful in both monitoring a user's adherence to their behavior change goals and also in providing just-in-time feedback and coaching messages. User models that incorporate dynamically varying behavior change variables with algorithms that trigger tailored messages provide a framework for making health interventions more effective. These principles are applied in the described system for assisting older adults in meeting their physical exercise goals with a tailored interactive video system with just-in-time feedback and encouragement.
Understanding Teacher Users of a Digital Library Service: A Clustering Approach
ERIC Educational Resources Information Center
Xu, Beijie
2011-01-01
This research examined teachers' online behaviors while using a digital library service--the Instructional Architect (IA)--through three consecutive studies. In the first two studies, a statistical model called latent class analysis (LCA) was applied to cluster different groups of IA teachers according to their diverse online behaviors. The third…
Spatial modeling of potential woody biomass flow
Woodam Chung; Nathaniel Anderson
2012-01-01
The flow of woody biomass to end users is determined by economic factors, especially the amount available across a landscape and the costs of delivery to bioenergy facilities. The objective of this study was to develop a methodology to quantify landscape-level stocks and potential biomass flows using a currently available spatial database and road network analysis tool. We applied this...
The Chaos Theory of Careers: A User's Guide
ERIC Educational Resources Information Center
Bright, Jim E. H.; Pryor, Robert G. L.
2005-01-01
The purpose of this article is to set out the key elements of the Chaos Theory of Careers. The complexity of influences on career development presents a significant challenge to traditional predictive models of career counseling. Chaos theory can provide a more appropriate description of career behavior, and the theory can be applied with clients…
Akiyama, Miki; Abraham, Chon
2017-08-01
Tele-homecare is gaining prominence as a viable care alternative, as evidenced by the increase in financial support from international governments to fund initiatives in their respective countries. The primary reason for the funding is to support efforts to reduce lags and increase capacity in access to care as well as to promote preventive measures that can avert costly emergent issues from arising. These efforts are especially important to super-aged and aging societies such as in Japan, many European countries, and the United States (US). However, to date and to our knowledge, a direct comparison of non-government vs. government-supported funding models for tele-homecare is particularly lacking in Japan. The aim of this study is to compare these operational models (i.e., non-government vs. government-supported funding) from a cost-benefit perspective. This simulation study applies to a Japanese hypothetical cohort with implications for other super-aged and aging societies abroad. We performed a cost-benefit analysis (CBA) on two operational models for enabling tele-homecare for elderly community-dwelling cohorts based on a decision tree model, which we created with parameters from published literature. The two models examined are (a) Model 1-non-government-supported funding that includes monthly fixed charges paid by users for a portion of the operating costs, and (b) Model 2-government-supported funding that includes startup and installation costs only (i.e., no operating costs) and no monthly user charges. We performed base case cost-benefit analysis and probabilistic cost-benefit analysis with a Monte Carlo simulation. We calculated net benefit and benefit-to-cost ratios (BCRs) from the societal perspective with a five-year time horizon applying a 3% discount rate for both cost and benefit values. The cost of tele-homecare included (a) the startup system expense, averaged over a five-year depreciation period, and (b) operation expenses (i.e., labor and non-labor) per user per year. The benefit of tele-homecare was measured by annual willingness to pay (WTP) for tele-homecare by a user and medical expenditures avoided. Both costs and benefits were inflated using the relevant Japanese consumer price index (CPI) and converted into 2015 US dollars with purchasing power parity (PPP) adjusted. Base case net benefits of Model 1 and Model 2 were $417.00 and $97.30, respectively. Base case BCR of Model 1 tele-homecare was 1.63, while Model 2 was 1.03. The probabilistic analysis estimated mean (95%CI) for BCRs of Model 1 and Model 2 was 1.84 (1.89, 1.88) and 1.46 (1.43, 1.49), respectively. Sensitivity analysis showed robustness of Model 1 in 7 parameters but Model 2 was sensitive in all key parameters such as initial system cost, device cost, number of users, and medical expenditure saved. Break-even analysis showed that the system cost of Model 2 had to be under $187,500. Our results for each model collectively showed that tele-homecare in Japan is cost-saving to some extent. However, the government-funded model (i.e., Model 2), which typically requires use of all startup funding to be spent within the first year on system costs, was inferior to the monthly fee model (i.e., Model 1) that did not use the government funding for installation or continued operations, but rather incorporated a monthly fee from users to support the receipt of services via tele-homecare. 
While the benefits of Model 1 outweighed the benefits of Model 2, the government-subsidized method employed in Model 2 could be more beneficial in general if some explicit prequalifying estimated metrics are instituted prior to funding. Thus, governments need to require applicants requesting funding to note, at a minimum, (a) estimated costs, (b) the expected number of tele-homecare users, and expected benefits such as (c) WTP by the user, or (d) medical expenditure saved by tele-homecare as a means of financing some of the operational costs. Copyright © 2017 Elsevier B.V. All rights reserved.
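For illustration only, the sketch below computes a discounted benefit-to-cost ratio and net benefit for a stylized tele-homecare program over a five-year horizon at a 3% discount rate; all monetary values and counts are invented placeholders, not the study's parameters or results.

```python
# Stylized cost-benefit calculation for a tele-homecare program.
# All values are hypothetical placeholders; 5-year horizon, 3% discount rate.
DISCOUNT = 0.03
YEARS = 5

startup_cost = 150_000.0               # system and installation, year 0
annual_operating_cost = 40_000.0       # labor + non-labor per year
users = 200
wtp_per_user = 120.0                   # annual willingness to pay per user
avoided_expenditure_per_user = 300.0   # annual medical expenditure avoided

def pv(amount, year):
    """Present value of a cash flow occurring at the end of `year`."""
    return amount / (1.0 + DISCOUNT) ** year

costs = startup_cost + sum(pv(annual_operating_cost, t) for t in range(1, YEARS + 1))
benefits = sum(pv(users * (wtp_per_user + avoided_expenditure_per_user), t)
               for t in range(1, YEARS + 1))

print(f"discounted costs     : {costs:,.0f}")
print(f"discounted benefits  : {benefits:,.0f}")
print(f"net benefit          : {benefits - costs:,.0f}")
print(f"benefit-to-cost ratio: {benefits / costs:.2f}")
```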
Public Health Analysis Transport Optimization Model v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyeler, Walt; Finley, Patrick; Walser, Alex
PHANTOM models the logistic functions of national public health systems. The system enables public health officials to visualize and coordinate options for public health surveillance, diagnosis, response and administration in an integrated analytical environment. Users may simulate and analyze system performance by applying scenarios that represent current conditions or future contingencies, as well as what-if analyses of potential systemic improvements. Public health networks are visualized as interactive maps, with graphical displays of relevant system performance metrics as calculated by the simulation modeling components.
Notional Scoring for Technical Review Weighting As Applied to Simulation Credibility Assessment
NASA Technical Reports Server (NTRS)
Hale, Joseph Peter; Hartway, Bobby; Thomas, Danny
2008-01-01
NASA's Modeling and Simulation Standard requires a credibility assessment for critical engineering data produced by models and simulations. Credibility assessment is thus a "qualifying factor" in reporting results from simulation-based analysis. The degree to which assessors should be independent of the simulation developers, users and decision makers is a recurring question. This paper provides alternative "weighting algorithms" for calculating the value added by independence across the levels of technical review defined for the NASA Modeling and Simulation Standard.
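As a rough illustration of how a weighting algorithm might reward reviewer independence, the sketch below computes a credibility score as a weighted average over review levels. The level names, weights, and score scale are hypothetical placeholders, not the algorithm defined in the NASA standard.

```python
# Hypothetical weights rewarding greater reviewer independence;
# illustrative values only, not the standard's actual weighting scheme.
INDEPENDENCE_WEIGHTS = {
    "developer_self_review": 0.6,
    "internal_peer_review": 0.8,
    "independent_external_review": 1.0,
}

def weighted_credibility(scores):
    """scores: mapping of review level -> raw factor score (0-4 scale assumed)."""
    num = sum(INDEPENDENCE_WEIGHTS[level] * s for level, s in scores.items())
    den = sum(INDEPENDENCE_WEIGHTS[level] for level in scores)
    return num / den

print(weighted_credibility({
    "developer_self_review": 3.0,
    "internal_peer_review": 2.5,
    "independent_external_review": 2.0,
}))
```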
Robust camera calibration for sport videos using court models
NASA Astrophysics Data System (ADS)
Farin, Dirk; Krabbe, Susanne; de With, Peter H. N.; Effelsberg, Wolfgang
2003-12-01
We propose an automatic camera calibration algorithm for court sports. The obtained camera calibration parameters are required for applications that need to convert positions in the video frame to real-world coordinates or vice versa. Our algorithm uses a model of the arrangement of court lines for calibration. Since the court model can be specified by the user, the algorithm can be applied to a variety of different sports. The algorithm starts with a model initialization step which locates the court in the image without any user assistance or a-priori knowledge about the most probable position. Image pixels are classified as court line pixels if they pass several tests including color and local texture constraints. A Hough transform is applied to extract line elements, forming a set of court line candidates. The subsequent combinatorial search establishes correspondences between lines in the input image and lines from the court model. For the succeeding input frames, an abbreviated calibration algorithm is used, which predicts the camera parameters for the new image and optimizes the parameters using a gradient-descent algorithm. We have conducted experiments on a variety of sport videos (tennis, volleyball, and goal area sequences of soccer games). Video scenes with considerable difficulties were selected to test the robustness of the algorithm. Results show that the algorithm is very robust to occlusions, partial court views, bad lighting conditions, or shadows.
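The line-candidate extraction step (pixel classification, edge detection, Hough transform) can be sketched with OpenCV as below. This covers only the candidate-line stage, not the paper's court-model correspondence search or gradient-descent refinement; the thresholds, the brightness-based stand-in for the color/texture classifier, and the input file name are placeholder assumptions.

```python
import cv2
import numpy as np

def court_line_candidates(frame_bgr):
    """Return candidate line segments (x1, y1, x2, y2) from a video frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Crude stand-in for the paper's color/texture court-line classifier:
    # keep bright pixels only, then detect edges.
    _, mask = cv2.threshold(gray, 180, 255, cv2.THRESH_BINARY)
    edges = cv2.Canny(mask, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=60, maxLineGap=10)
    return [] if lines is None else [tuple(l[0]) for l in lines]

frame = cv2.imread("tennis_frame.png")   # hypothetical input frame
print(len(court_line_candidates(frame)), "candidate lines")
```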
BUMPER v1.0: a Bayesian user-friendly model for palaeo-environmental reconstruction
NASA Astrophysics Data System (ADS)
Holden, Philip B.; Birks, H. John B.; Brooks, Stephen J.; Bush, Mark B.; Hwang, Grace M.; Matthews-Bird, Frazer; Valencia, Bryan G.; van Woesik, Robert
2017-02-01
We describe the Bayesian user-friendly model for palaeo-environmental reconstruction (BUMPER), a Bayesian transfer function for inferring past climate and other environmental variables from microfossil assemblages. BUMPER is fully self-calibrating, straightforward to apply, and computationally fast, requiring ~2 s to build a 100-taxon model from a 100-site training set on a standard personal computer. We apply the model's probabilistic framework to generate thousands of artificial training sets under ideal assumptions. We then use these to demonstrate the sensitivity of reconstructions to the characteristics of the training set, considering assemblage richness, taxon tolerances, and the number of training sites. We find that a useful guideline for the size of a training set is to provide, on average, at least 10 samples of each taxon. We demonstrate general applicability to real data, considering three different organism types (chironomids, diatoms, pollen) and different reconstructed variables. An identically configured model is used in each application, the only change being the input files that provide the training-set environment and taxon-count data. The performance of BUMPER is shown to be comparable with weighted average partial least squares (WAPLS) in each case. Additional artificial datasets are constructed with similar characteristics to the real data, and these are used to explore the reasons for the differing performances of the different training sets.
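A generic form of the Bayesian inversion behind such transfer functions, sketched here with a standard Gaussian taxon-response model rather than BUMPER's exact likelihood, is:

```latex
P(E \mid A) \;\propto\; P(E)\,\prod_{i \in A} P(\text{taxon}_i \mid E),
\qquad
P(\text{taxon}_i \mid E) \;=\; p_i \exp\!\left[-\frac{(E - u_i)^2}{2\,t_i^2}\right],
```

where $E$ is the environmental variable, $A$ the observed assemblage, and $u_i$, $t_i$, $p_i$ the optimum, tolerance, and maximum occurrence probability of taxon $i$; the posterior is obtained by multiplying the per-taxon responses over the assemblage and normalizing.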
A Unified Model for BDS Wide Area and Local Area Augmentation Positioning Based on Raw Observations.
Tu, Rui; Zhang, Rui; Lu, Cuixian; Zhang, Pengfei; Liu, Jinhai; Lu, Xiaochun
2017-03-03
In this study, a unified model for BeiDou Navigation Satellite System (BDS) wide area and local area augmentation positioning based on raw observations has been proposed. Applying this model, both the Real-Time Kinematic (RTK) and Precise Point Positioning (PPP) service can be realized by performing different corrections at the user end. This algorithm was assessed and validated with the BDS data collected at four regional stations from Day of Year (DOY) 080 to 083 of 2016. When the users are located within the local reference network, the fast and high precision RTK service can be achieved using the regional observation corrections, revealing a convergence time of about several seconds and a precision of about 2-3 cm. For the users out of the regional reference network, the global broadcast State-Space Represented (SSR) corrections can be utilized to realize the global PPP service which shows a convergence time of about 25 min for achieving an accuracy of 10 cm. With this unified model, it can not only integrate the Network RTK (NRTK) and PPP into a seamless positioning service, but also recover the ionosphere Vertical Total Electronic Content (VTEC) and Differential Code Bias (DCB) values that are useful for the ionosphere monitoring and modeling.
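The "raw observations" referred to above are the undifferenced code and carrier-phase measurements; a simplified single-frequency textbook form (not the paper's exact parameterization) is:

```latex
P_r^s = \rho_r^s + c\,(dt_r - dt^s) + T_r^s + I_r^s + \varepsilon_P,
\qquad
\Phi_r^s = \rho_r^s + c\,(dt_r - dt^s) + T_r^s - I_r^s + \lambda N_r^s + \varepsilon_\Phi,
```

where $\rho_r^s$ is the geometric range, $dt_r$ and $dt^s$ the receiver and satellite clock offsets, $T$ and $I$ the tropospheric and ionospheric delays, and $\lambda N$ the carrier-phase ambiguity. Regional corrections (NRTK) or broadcast SSR corrections (PPP) enter by constraining or removing these terms at the user end.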
A Unified Model for BDS Wide Area and Local Area Augmentation Positioning Based on Raw Observations
Tu, Rui; Zhang, Rui; Lu, Cuixian; Zhang, Pengfei; Liu, Jinhai; Lu, Xiaochun
2017-01-01
In this study, a unified model for BeiDou Navigation Satellite System (BDS) wide area and local area augmentation positioning based on raw observations has been proposed. Applying this model, both the Real-Time Kinematic (RTK) and Precise Point Positioning (PPP) service can be realized by performing different corrections at the user end. This algorithm was assessed and validated with the BDS data collected at four regional stations from Day of Year (DOY) 080 to 083 of 2016. When the users are located within the local reference network, the fast and high precision RTK service can be achieved using the regional observation corrections, revealing a convergence time of about several seconds and a precision of about 2–3 cm. For the users out of the regional reference network, the global broadcast State-Space Represented (SSR) corrections can be utilized to realize the global PPP service which shows a convergence time of about 25 min for achieving an accuracy of 10 cm. With this unified model, it can not only integrate the Network RTK (NRTK) and PPP into a seamless positioning service, but also recover the ionosphere Vertical Total Electronic Content (VTEC) and Differential Code Bias (DCB) values that are useful for the ionosphere monitoring and modeling. PMID:28273814
Service Modeling Language Applied to Critical Infrastructure
NASA Astrophysics Data System (ADS)
Baldini, Gianmarco; Fovino, Igor Nai
The modeling of dependencies in complex infrastructure systems is still a very difficult task. Many methodologies have been proposed, but a number of challenges still remain, including the definition of the right level of abstraction, the presence of different views on the same critical infrastructure and how to adequately represent the temporal evolution of systems. We propose a modeling methodology where dependencies are described in terms of the service offered by the critical infrastructure and its components. The model provides a clear separation between services and the underlying organizational and technical elements, which may change in time. The model uses the Service Modeling Language proposed by the W3 consortium for describing critical infrastructure in terms of interdependent services nodes including constraints, behavior, information flows, relations, rules and other features. Each service node is characterized by its technological, organizational and process components. The model is then applied to a real case of an ICT system for users authentication.
Newe, Axel; Becker, Linda; Schenk, Andrea
2014-01-01
Background & Objectives The Portable Document Format (PDF) is the de-facto standard for the exchange of electronic documents. It is platform-independent, suitable for the exchange of medical data, and allows for the embedding of three-dimensional (3D) surface mesh models. In this article, we present the first clinical routine application of interactive 3D surface mesh models which have been integrated into PDF files for the presentation and the exchange of Computer Assisted Surgery Planning (CASP) results in liver surgery. We aimed to prove the feasibility of applying 3D PDF in medical reporting and investigated the user experience with this new technology. Methods We developed an interactive 3D PDF report document format and implemented a software tool to create these reports automatically. After more than 1000 liver CASP cases that have been reported in clinical routine using our 3D PDF report, an international user survey was carried out online to evaluate the user experience. Results Our solution enables the user to interactively explore the anatomical configuration and to have different analyses and various resection proposals displayed within a 3D PDF document covering only a single page that acts more like a software application than like a typical PDF file (“PDF App”). The new 3D PDF report offers many advantages over the previous solutions. According to the results of the online survey, the users have assessed the pragmatic quality (functionality, usability, perspicuity, efficiency) as well as the hedonic quality (attractiveness, novelty) very positively. Conclusion The usage of 3D PDF for reporting and sharing CASP results is feasible and well accepted by the target audience. Using interactive PDF with embedded 3D models is an enabler for presenting and exchanging complex medical information in an easy and platform-independent way. Medical staff as well as patients can benefit from the possibilities provided by 3D PDF. Our results open the door for a wider use of this new technology, since the basic idea can and should be applied for many medical disciplines and use cases. PMID:25551375
Newe, Axel; Becker, Linda; Schenk, Andrea
2014-01-01
The Portable Document Format (PDF) is the de-facto standard for the exchange of electronic documents. It is platform-independent, suitable for the exchange of medical data, and allows for the embedding of three-dimensional (3D) surface mesh models. In this article, we present the first clinical routine application of interactive 3D surface mesh models which have been integrated into PDF files for the presentation and the exchange of Computer Assisted Surgery Planning (CASP) results in liver surgery. We aimed to prove the feasibility of applying 3D PDF in medical reporting and investigated the user experience with this new technology. We developed an interactive 3D PDF report document format and implemented a software tool to create these reports automatically. After more than 1000 liver CASP cases that have been reported in clinical routine using our 3D PDF report, an international user survey was carried out online to evaluate the user experience. Our solution enables the user to interactively explore the anatomical configuration and to have different analyses and various resection proposals displayed within a 3D PDF document covering only a single page that acts more like a software application than like a typical PDF file ("PDF App"). The new 3D PDF report offers many advantages over the previous solutions. According to the results of the online survey, the users have assessed the pragmatic quality (functionality, usability, perspicuity, efficiency) as well as the hedonic quality (attractiveness, novelty) very positively. The usage of 3D PDF for reporting and sharing CASP results is feasible and well accepted by the target audience. Using interactive PDF with embedded 3D models is an enabler for presenting and exchanging complex medical information in an easy and platform-independent way. Medical staff as well as patients can benefit from the possibilities provided by 3D PDF. Our results open the door for a wider use of this new technology, since the basic idea can and should be applied for many medical disciplines and use cases.
Versatile clinical information system design for emergency departments.
Amouh, Teh; Gemo, Monica; Macq, Benoît; Vanderdonckt, Jean; El Gariani, Abdul Wahed; Reynaert, Marc S; Stamatakis, Lambert; Thys, Frédéric
2005-06-01
Compared to other hospital units, the emergency department presents some distinguishing characteristics of its own. Emergency health-care delivery is a collaborative process involving the contribution of several individuals who accomplish their tasks while working autonomously under pressure and sometimes with limited resources. Effective computerization of the emergency department information system presents a real challenge due to the complexity of the scenario. Current computerized support suffers from several problems, including inadequate data models, clumsy user interfaces, and poor integration with other clinical information systems. To tackle such complexity, we propose an approach combining three points of view, namely the transactions (in and out of the department), the (mono and multi) user interfaces and data management. Unlike current systems, we pay particular attention to the user-friendliness and versatility of our system. This means that intuitive user interfaces have been conceived and specific software modeling methodologies have been applied to provide our system with the flexibility and adaptability necessary for the individual and group coordinated tasks. Our approach has been implemented by prototyping a web-based, multiplatform, multiuser, and versatile clinical information system built upon multitier software architecture, using the Java programming language.
Dynamic Computation of Change Operations in Version Management of Business Process Models
NASA Astrophysics Data System (ADS)
Küster, Jochen Malte; Gerth, Christian; Engels, Gregor
Version management of business process models requires that changes can be resolved by applying change operations. In order to give a user maximal freedom concerning the application order of change operations, position parameters of change operations must be computed dynamically during change resolution. In such an approach, change operations with computed position parameters must be applicable on the model and dependencies and conflicts of change operations must be taken into account because otherwise invalid models can be constructed. In this paper, we study the concept of partially specified change operations where parameters are computed dynamically. We provide a formalization for partially specified change operations using graph transformation and provide a concept for their applicability. Based on this, we study potential dependencies and conflicts of change operations and show how these can be taken into account within change resolution. Using our approach, a user can resolve changes of business process models without being unnecessarily restricted to a certain order.
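A minimal sketch of the idea of checking applicability and conflicts of change operations, using hypothetical insert/delete operation types rather than the paper's graph-transformation formalization:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ChangeOp:
    kind: str      # "insert" or "delete" (hypothetical operation types)
    element: str   # affected process-model element
    anchor: str    # position parameter; may be computed dynamically

def applicable(op, model_elements):
    """An op is applicable if its anchor exists and, for inserts, the element is new."""
    if op.anchor not in model_elements:
        return False
    return op.element not in model_elements if op.kind == "insert" else op.element in model_elements

def in_conflict(a, b):
    """Two ops conflict if they touch the same element in incompatible ways."""
    return a.element == b.element and a.kind != b.kind

model = {"Start", "TaskA", "End"}
ops = [ChangeOp("insert", "TaskB", "TaskA"), ChangeOp("delete", "TaskB", "TaskA")]
print(applicable(ops[0], model), in_conflict(ops[0], ops[1]))
```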
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-24
...-01] RIN 0694-AF26 Amendment to Existing Validated End-User Authorizations for Applied Materials (China), Inc., Boeing Tianjin Composites Co. Ltd., CSMC Technologies Corporation, Lam Research... VEUs in the People's Republic of China (PRC) and one VEU in India. For Applied Materials (China), Inc...
Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony
2018-01-01
This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, a simple climate matching tool CLIMEX Match Climates, the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity effects was compared using a generalized linear mixed model. Each modeling tool itself had a low statistical significance, while weed species alone accounted for 69.1 and 48.5% of the variance for prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones in the case of predicting potential distribution for a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and testing both new and experienced users under blind test conditions that approximate operational conditions.
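A sketch of the kind of mixed model used for the comparison, written here as a linear mixed-effects analogue with statsmodels (a full GLMM would need a different estimator) and a made-up results table with columns `sensitivity`, `tool`, and `species`:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical results table: one row per (species, tool) run.
df = pd.DataFrame({
    "species":     ["sp1", "sp1", "sp2", "sp2", "sp3", "sp3"] * 2,
    "tool":        ["Proto3", "MaxEnt"] * 6,
    "sensitivity": [0.71, 0.69, 0.55, 0.58, 0.80, 0.77,
                    0.70, 0.68, 0.52, 0.60, 0.82, 0.75],
})

# Random intercept for species, fixed effect for modeling tool.
fit = smf.mixedlm("sensitivity ~ tool", df, groups=df["species"]).fit()
print(fit.summary())
```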
Sturgeon, Gregory M; Kiarashi, Nooshin; Lo, Joseph Y; Samei, E; Segars, W P
2016-05-01
The authors are developing a series of computational breast phantoms based on breast CT data for imaging research. In this work, the authors develop a program that will allow a user to alter the phantoms to simulate the effect of gravity and compression of the breast (craniocaudal or mediolateral oblique) making the phantoms applicable to multimodality imaging. This application utilizes a template finite-element (FE) breast model that can be applied to their presegmented voxelized breast phantoms. The FE model is automatically fit to the geometry of a given breast phantom, and the material properties of each element are set based on the segmented voxels contained within the element. The loading and boundary conditions, which include gravity, are then assigned based on a user-defined position and compression. The effect of applying these loads to the breast is computed using a multistage contact analysis in FEBio, a freely available and well-validated FE software package specifically designed for biomedical applications. The resulting deformation of the breast is then applied to a boundary mesh representation of the phantom that can be used for simulating medical images. An efficient script performs the above actions seamlessly. The user only needs to specify which voxelized breast phantom to use, the compressed thickness, and orientation of the breast. The authors utilized their FE application to simulate compressed states of the breast indicative of mammography and tomosynthesis. Gravity and compression were simulated on example phantoms and used to generate mammograms in the craniocaudal or mediolateral oblique views. The simulated mammograms show a high degree of realism illustrating the utility of the FE method in simulating imaging data of repositioned and compressed breasts. The breast phantoms and the compression software can become a useful resource to the breast imaging research community. These phantoms can then be used to evaluate and compare imaging modalities that involve different positioning and compression of the breast.
Automatic control of finite element models for temperature-controlled radiofrequency ablation.
Haemmerich, Dieter; Webster, John G
2005-07-14
The finite element method (FEM) has been used to simulate cardiac and hepatic radiofrequency (RF) ablation. The FEM allows modeling of complex geometries that cannot be solved by analytical methods or finite difference models. In both hepatic and cardiac RF ablation a common control mode is temperature-controlled mode. Commercial FEM packages don't support automating temperature control. Most researchers manually control the applied power by trial and error to keep the tip temperature of the electrodes constant. We implemented a PI controller in a control program written in C++. The program checks the tip temperature after each step and controls the applied voltage to keep temperature constant. We created a closed loop system consisting of a FEM model and the software controlling the applied voltage. The control parameters for the controller were optimized using a closed loop system simulation. We present results of a temperature controlled 3-D FEM model of a RITA model 30 electrode. The control software effectively controlled applied voltage in the FEM model to obtain, and keep electrodes at target temperature of 100 degrees C. The closed loop system simulation output closely correlated with the FEM model, and allowed us to optimize control parameters. The closed loop control of the FEM model allowed us to implement temperature controlled RF ablation with minimal user input.
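The control loop described above can be sketched as a simple PI controller. The thermal response below is a toy first-order model standing in for the FEM solver, and the gains are placeholders rather than the paper's optimized parameters.

```python
TARGET_C = 100.0        # target electrode tip temperature (deg C)
KP, KI = 0.08, 0.01     # placeholder PI gains (the paper optimizes these)
DT = 1.0                # control time step (s)

def fem_step(temp, voltage):
    """Toy first-order thermal response standing in for the FEM model."""
    return temp + DT * (0.5 * voltage - 0.05 * (temp - 37.0))

temp, voltage, integral = 37.0, 0.0, 0.0
for step in range(300):
    error = TARGET_C - temp
    integral += error * DT
    voltage = max(0.0, KP * error + KI * integral)   # PI law, clamped at 0 V
    temp = fem_step(temp, voltage)

print(f"final tip temperature: {temp:.1f} C at {voltage:.2f} V")
```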
Addressing the unmet need for visualizing conditional random fields in biological data
2014-01-01
Background The biological world is replete with phenomena that appear to be ideally modeled and analyzed by one archetypal statistical framework - the Graphical Probabilistic Model (GPM). The structure of GPMs is a uniquely good match for biological problems that range from aligning sequences to modeling the genome-to-phenome relationship. The fundamental questions that GPMs address involve making decisions based on a complex web of interacting factors. Unfortunately, while GPMs ideally fit many questions in biology, they are not an easy solution to apply. Building a GPM is not a simple task for an end user. Moreover, applying GPMs is also impeded by the insidious fact that the “complex web of interacting factors” inherent to a problem might be easy to define and also intractable to compute upon. Discussion We propose that the visualization sciences can contribute to many domains of the bio-sciences, by developing tools to address archetypal representation and user interaction issues in GPMs, and in particular a variety of GPM called a Conditional Random Field(CRF). CRFs bring additional power, and additional complexity, because the CRF dependency network can be conditioned on the query data. Conclusions In this manuscript we examine the shared features of several biological problems that are amenable to modeling with CRFs, highlight the challenges that existing visualization and visual analytics paradigms induce for these data, and document an experimental solution called StickWRLD which, while leaving room for improvement, has been successfully applied in several biological research projects. Software and tutorials are available at http://www.stickwrld.org/ PMID:25000815
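For reference, the conditional distribution a CRF defines, shown here in its linear-chain form for concreteness, is:

```latex
P(y \mid x) \;=\; \frac{1}{Z(x)} \exp\!\left( \sum_{t=1}^{T} \sum_{k} \lambda_k\, f_k\!\left(y_{t-1}, y_t, x, t\right) \right),
\qquad
Z(x) \;=\; \sum_{y'} \exp\!\left( \sum_{t=1}^{T} \sum_{k} \lambda_k\, f_k\!\left(y'_{t-1}, y'_t, x, t\right) \right),
```

which makes explicit why the dependency network is conditioned on the query data $x$: both the feature functions $f_k$ and the normalizer $Z(x)$ depend on it.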
Validation of the United States Marine Corps Qualified Candidate Population Model
2003-03-01
time. Fields are created in the database to support this forecasting. User forms and a macro are programmed in Microsoft VBA to develop the...at 0.001. To accomplish 50,000 iterations of a minimization problem, this study wrote a macro in the VBA programming language that guides the solver...success in the commissioning process. To improve the diagnostics of this propensity model, other factors were considered as well. Applying SQL
Luo, Mei; Wang, Hao; Lyu, Zhi
2017-12-01
Species distribution models (SDMs) are widely used by researchers and conservationists. Results of prediction from different models vary significantly, which makes it difficult for users to select a model. In this study, we evaluated the performance of two commonly used SDMs, Biomod2 and Maximum Entropy (MaxEnt), with real presence/absence data for the giant panda, and used three indicators, i.e., area under the ROC curve (AUC), true skill statistic (TSS), and Cohen's Kappa, to evaluate the accuracy of the two models' predictions. The results showed that both models could produce accurate predictions with adequate occurrence inputs and simulation repeats. Compared to MaxEnt, Biomod2 made more accurate predictions, especially when occurrence inputs were few. However, Biomod2 was more difficult to apply, required longer running time, and had less data processing capability. To choose the right model, users should refer to the error requirements of their objectives. MaxEnt should be considered if the error requirement is clear and both models can achieve it; otherwise, we recommend using Biomod2 as much as possible.
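The three accuracy indicators named above can be computed from presence/absence predictions as in the following sketch (scikit-learn, with made-up example data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, cohen_kappa_score, confusion_matrix

# Made-up example: true presence/absence and model-predicted probabilities.
y_true = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])
y_prob = np.array([0.9, 0.7, 0.4, 0.2, 0.3, 0.1, 0.8, 0.6, 0.55, 0.05])
y_pred = (y_prob >= 0.5).astype(int)

auc = roc_auc_score(y_true, y_prob)
kappa = cohen_kappa_score(y_true, y_pred)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
tss = sensitivity + specificity - 1          # true skill statistic

print(f"AUC={auc:.2f}  TSS={tss:.2f}  Kappa={kappa:.2f}")
```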
Health effects of the London bicycle sharing system: health impact modelling study.
Woodcock, James; Tainio, Marko; Cheshire, James; O'Brien, Oliver; Goodman, Anna
2014-02-13
To model the impacts of the bicycle sharing system in London on the health of its users. Health impact modelling and evaluation, using a stochastic simulation model. Central and inner London, England. Total population operational registration and usage data for the London cycle hire scheme (collected April 2011-March 2012), surveys of cycle hire users (collected 2011), and London data on travel, physical activity, road traffic collisions, and particulate air pollution (PM2.5) (collected 2005-12). 578,607 users of the London cycle hire scheme, aged 14 years and over, with an estimated 78% of travel time accounted for by users younger than 45 years. Change in lifelong disability adjusted life years (DALYs) based on one year impacts on incidence of disease and injury, modelled through medium term changes in physical activity, road traffic injuries, and exposure to air pollution. Over the year examined the users made 7.4 million cycle hire trips (estimated 71% of cycling time by men). These trips would mostly otherwise have been made on foot (31%) or by public transport (47%). To date there has been a trend towards fewer fatalities and injuries than expected on cycle hire bicycles. Using these observed injury rates, the population benefits from the cycle hire scheme substantially outweighed harms (net change -72 DALYs (95% credible interval -110 to -43) among men using cycle hire per accounting year; -15 (-42 to -6) among women; note that negative DALYs represent a health benefit). When we modelled cycle hire injury rates as being equal to background rates for all cycling in central London, these benefits were smaller and there was no evidence of a benefit among women (change -49 DALYs (-88 to -17) among men; -1 DALY (-27 to 12) among women). This sex difference largely reflected higher road collision fatality rates for female cyclists. At older ages the modelled benefits of cycling were much larger than the harms. Using background injury rates in the youngest age group (15 to 29 years), the medium term benefits and harms were both comparatively small and potentially negative. London's bicycle sharing system has positive health impacts overall, but these benefits are clearer for men than for women and for older users than for younger users. The potential benefits of cycling may not currently apply to all groups in all settings.
NASA Astrophysics Data System (ADS)
Toutin, Thierry; Wang, Huili; Charbonneau, Francois; Schmitt, Carla
2013-08-01
This paper presents two methods for the orthorectification of full/compact polarimetric SAR data: the polarimetric processing is performed either in the image space (scientist's idealism) or in the ground space (user's realism), i.e., before or after the geometric processing, respectively. Radarsat-2 (R2) fine-quad and simulated very high-resolution RCM data acquired with different look angles over a hilly-relief study site were processed using an accurate lidar digital surface model. Quantitative evaluations between the two methods as a function of different geometric and radiometric parameters were performed to evaluate their impact during orthorectification. The results demonstrated that the ground-space method can be safely applied to polarimetric R2 SAR data, with the exception of steep look angles and steep terrain slopes. On the other hand, the ground-space method cannot be applied to simulated compact RCM data due to the 17 dB noise floor and oversampling.
Lee, Chien-Ching; Lin, Shih-Pin; Yang, Shu-Ling; Tsou, Mei-Yung; Chang, Kuang-Yi
2013-03-01
Medical institutions are eager to introduce new information technology to improve patient safety and clinical efficiency. However, the acceptance of new information technology by medical personnel plays a key role in its adoption and application. This study aims to investigate whether perceived organizational learning capability (OLC) is associated with user acceptance of information technology among operating room nursing staff. Nurse anesthetists and operating room nurses were recruited in this questionnaire survey. A pilot study was performed to ensure the reliability and validity of the translated questionnaire, which consisted of 14 items from the four dimensions of OLC, and 16 items from the four constructs of user acceptance of information technology, including performance expectancy, effort expectancy, social influence, and behavioral intention. Confirmatory factor analysis was applied in the main survey to evaluate the construct validity of the questionnaire. Structural equation modeling was used to test the hypothesized relationships between the four dimensions of user acceptance of information technology and the second-order OLC. Goodness of fit of the hypothesized model was also assessed. Performance expectancy, effort expectancy, and social influence positively influenced behavioral intention of users of the clinical information system (all p < 0.001) and accounted for 75% of its variation. The second-order OLC was positively associated with performance expectancy, effort expectancy, and social influence (all p < 0.001). However, the hypothesized relationship between perceived OLC and behavioral intention was not significant (p = 0.87). The fit statistics indicated a reasonable model fit to the data (root mean square error of approximation = 0.07 and comparative fit index = 0.91). Perceived OLC indirectly affects user behavioral intention through the mediation of performance expectancy, effort expectancy, and social influence in the operating room setting. Copyright © 2013. Published by Elsevier B.V.
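The hypothesized structure (OLC predicting the acceptance constructs, which in turn predict behavioral intention) can be written in lavaan-style syntax. The sketch below uses the semopy package and treats OLC as a single first-order factor for brevity; the item column names, the data file, and the package choice are illustrative assumptions, not the study's tooling.

```python
import pandas as pd
import semopy

# Hypothetical measurement and structural model in lavaan-style syntax.
desc = """
OLC =~ olc1 + olc2 + olc3 + olc4
PE  =~ pe1 + pe2 + pe3
EE  =~ ee1 + ee2 + ee3
SI  =~ si1 + si2 + si3
BI  =~ bi1 + bi2 + bi3
PE ~ OLC
EE ~ OLC
SI ~ OLC
BI ~ PE + EE + SI
"""

df = pd.read_csv("or_nurse_survey.csv")   # hypothetical item-level data
model = semopy.Model(desc)
model.fit(df)
print(model.inspect())                     # parameter estimates and p-values
```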
2014-01-01
Background Internet-based physical activity interventions have great potential in supporting patients in cardiac rehabilitation. Health behavior change theories and user input are identified as important contributors in the effectiveness of the interventions, but they are rarely combined in a systematic way in the design of the interventions. Objective The aim of this study is to identify the appropriate theoretical framework, along with the needs of the users of a physical activity intervention for cardiac rehabilitation, and to combine them into an effective Internet- and mobile-based intervention. Methods We explain the theoretical framework of the intervention in a narrative overview of the existing health behavior change literature as it applies to physical activity. We also conducted a focus group with 11 participants of a cardiac rehabilitation program and used thematic analysis to identify and analyze patterns of meaning in the transcribed data. Results We chose stage-based approaches, specifically the transtheoretical model and the health action process approach as our main framework for tailoring, supplemented with other theoretical concepts such as regulatory focus within the appropriate stages. From the thematic analysis of the focus group data, we identified seven themes: (1) social, (2) motivation, (3) integration into everyday life, (4) information, (5) planning, (6) monitoring and feedback, and (7) concerns and potential problems. The final design of the intervention was based on both the theoretical review and the user input, and it is explained in detail. Conclusions We applied a combination of health behavioral theory and user input in designing our intervention. We think this is a promising design approach with the potential to combine the high efficacy of theory-based interventions with the higher perceived usefulness of interventions designed according to user input. Trial Registration Clinicaltrials.gov NCT01223170; http://clinicaltrials.gov/show/NCT01223170 (Archived by WebCite at http://www.webcitation.org/6M5FqT9Q2). PMID:24413185
Object-oriented software design in semiautomatic building extraction
NASA Astrophysics Data System (ADS)
Guelch, Eberhard; Mueller, Hardo
1997-08-01
Developing a system for semiautomatic building acquisition is a complex process that requires constant integration and updating of software modules and user interfaces. To facilitate these processes, we apply an object-oriented design not only to the data but also to the software involved. We use the Unified Modeling Language (UML) to describe the object-oriented modeling of the system at different levels of detail. We can distinguish between use cases from the user's point of view, which represent a sequence of actions yielding an observable result, and use cases for programmers, who can use the system as a class library to integrate the acquisition modules into their own software. The structure of the system is based on the model-view-controller (MVC) design pattern. An example from the integration of automated texture extraction for the visualization of results demonstrates the feasibility of this approach.
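A minimal, generic illustration of the model-view-controller separation mentioned above (not the paper's actual building-extraction classes):

```python
class BuildingModel:                      # Model: holds acquired building data
    def __init__(self):
        self.buildings, self.observers = [], []

    def add(self, footprint):
        self.buildings.append(footprint)
        for view in self.observers:       # notify registered views of the change
            view.refresh(self.buildings)

class ConsoleView:                        # View: renders the current model state
    def refresh(self, buildings):
        print(f"{len(buildings)} building(s) acquired")

class AcquisitionController:              # Controller: turns user actions into model updates
    def __init__(self, model):
        self.model = model

    def user_digitizes(self, footprint):
        self.model.add(footprint)

model, view = BuildingModel(), ConsoleView()
model.observers.append(view)
AcquisitionController(model).user_digitizes([(0, 0), (10, 0), (10, 8), (0, 8)])
```

The observer-style notification is one common way to keep views synchronized with the model; the original system's class design may differ.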
Daveson, Barbara A; de Wolf-Linder, Susanne; Witt, Jana; Newson, Kirstie; Morris, Carolyn; Higginson, Irene J; Evans, Catherine J
2015-12-01
Support and evidence for patient, unpaid caregiver and public involvement in research (user involvement) are growing. Consensus on how best to involve users in palliative care research is lacking. To determine an optimal user-involvement model for palliative care research. We hosted a consultation workshop using expert presentations, discussion and nominal group technique to generate recommendations and consensus on agreement of importance. A total of 35 users and 32 researchers were approached to attend the workshop, which included break-out groups and a ranking exercise. Descriptive statistical analysis to establish consensus and highlight divergence was applied. Qualitative analysis of discussions was completed to aid interpretation of findings. Participants involved in palliative care research were invited to a global research institute, UK. A total of 12 users and 5 researchers participated. Users wanted their involvement to be more visible, including during dissemination, with a greater emphasis on the difference their involvement makes. Researchers wanted to improve productivity, relevance and quality through involvement. Users and researchers agreed that an optimal model should consist of (a) early involvement to ensure meaningful involvement and impact and (b) diverse virtual and face-to-face involvement methods to ensure flexibility. For involvement in palliative care research to succeed, early and flexible involvement is required. Researchers should advertise opportunities for involvement and promote impact of involvement via dissemination plans. Users should prioritise adding value to research through enhancing productivity, quality and relevance. More research is needed not only to inform implementation and ensure effectiveness but also to investigate the cost-effectiveness of involvement in palliative care research. © The Author(s) 2015.
Daveson, Barbara A; de Wolf-Linder, Susanne; Witt, Jana; Newson, Kirstie; Morris, Carolyn; Higginson, Irene J; Evans, Catherine J
2015-01-01
Background: Support and evidence for patient, unpaid caregiver and public involvement in research (user involvement) are growing. Consensus on how best to involve users in palliative care research is lacking. Aim: To determine an optimal user-involvement model for palliative care research. Design: We hosted a consultation workshop using expert presentations, discussion and nominal group technique to generate recommendations and consensus on agreement of importance. A total of 35 users and 32 researchers were approached to attend the workshop, which included break-out groups and a ranking exercise. Descriptive statistical analysis to establish consensus and highlight divergence was applied. Qualitative analysis of discussions was completed to aid interpretation of findings. Setting/participants: Participants involved in palliative care research were invited to a global research institute, UK. Results: A total of 12 users and 5 researchers participated. Users wanted their involvement to be more visible, including during dissemination, with a greater emphasis on the difference their involvement makes. Researchers wanted to improve productivity, relevance and quality through involvement. Users and researchers agreed that an optimal model should consist of (a) early involvement to ensure meaningful involvement and impact and (b) diverse virtual and face-to-face involvement methods to ensure flexibility. Conclusion: For involvement in palliative care research to succeed, early and flexible involvement is required. Researchers should advertise opportunities for involvement and promote impact of involvement via dissemination plans. Users should prioritise adding value to research through enhancing productivity, quality and relevance. More research is needed not only to inform implementation and ensure effectiveness but also to investigate the cost-effectiveness of involvement in palliative care research. PMID:25931336
Addressing the “Risk Environment” for Injection Drug Users: The Mysterious Case of the Missing Cop
Burris, Scott; Blankenship, Kim M; Donoghoe, Martin; Sherman, Susan; Vernick, Jon S; Case, Patricia; Lazzarini, Zita; Koester, Stephen
2004-01-01
Ecological models of the determinants of health and the consequent importance of structural interventions have been widely accepted, but using these models in research and practice has been challenging. Examining the role of criminal law enforcement in the “risk environment” of injection drug users (IDUs) provides an opportunity to apply structural thinking to the health problems associated with drug use. This article reviews international evidence that laws and law enforcement practices influence IDU risk. It argues that more research is needed at four levels—laws; management of law enforcement agencies; knowledge, attitudes, beliefs, and practices of frontline officers; and attitudes and experiences of IDUs—and that such research can be the basis of interventions within law enforcement to enhance IDU health. PMID:15016246
Classification of a set of vectors using self-organizing map- and rule-based technique
NASA Astrophysics Data System (ADS)
Ae, Tadashi; Okaniwa, Kaishirou; Nosaka, Kenzaburou
2005-02-01
There exist various objects, such as pictures, music, texts, etc., around our environment. We form a view of these objects by looking, reading, or listening. Our view is deeply connected with our behaviors and is very important for understanding them. We form a view of an object and decide the next action (data selection, etc.) based on that view. Such a series of actions constructs a sequence. Therefore, we propose a method which acquires a view as a vector from several words describing the view, and apply the vector to sequence generation. We focus on sequences of data that a user selects from a multimedia database containing pictures, music, movies, etc. These data cannot be stereotyped because each user's view of them differs. Therefore, we represent the structure of the multimedia database with a vector representing the user's view and a stereotyped vector, and acquire sequences containing the structure as elements. Such vectors can be classified by a SOM (Self-Organizing Map). The Hidden Markov Model (HMM) is a method for generating sequences. Therefore, we use an HMM in which each state corresponds to a representative vector of the user's view, and acquire sequences that capture changes in the user's view. We call this the Vector-state Markov Model (VMM). We introduce rough set theory as a rule-based technique, which plays the role of classifying sets of data such as the set "Tour".
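The SOM classification step can be sketched with the MiniSom library: each view vector is mapped to its best-matching unit, whose coordinates can then serve as discrete states for a downstream Markov model. The vector dimensionality, map size, and random data are arbitrary assumptions for illustration.

```python
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(0)
view_vectors = rng.random((200, 8))          # 200 hypothetical 8-dim "view" vectors

som = MiniSom(5, 5, input_len=8, sigma=1.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(view_vectors)
som.train_random(view_vectors, 1000)         # unsupervised training

# Each user selection is classified by its best-matching unit (BMU);
# the sequence of BMU coordinates could feed a Markov model over view states.
states = [som.winner(v) for v in view_vectors[:10]]
print(states)
```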
Marsac, Meghan L.; Winston, Flaura K.; Hildenbrand, Aimee K.; Kohser, Kristen L.; March, Sonja; Kenardy, Justin; Kassam-Adams, Nancy
2015-01-01
Background Millions of children are affected by acute medical events annually, creating need for resources to promote recovery. While web-based interventions promise wide reach and low cost for users, development can be time- and cost-intensive. A systematic approach to intervention development can help to minimize costs and increase likelihood of effectiveness. Using a systematic approach, our team integrated evidence on the etiology of traumatic stress, an explicit program theory, and a user-centered design process to intervention development. Objective To describe evidence and the program theory model applied to the Coping Coach intervention and present pilot data evaluating intervention feasibility and acceptability. Method Informed by empirical evidence on traumatic stress prevention, an overarching program theory model was articulated to delineate pathways from a) specific intervention content to b) program targets and proximal outcomes to c) key longer-term health outcomes. Systematic user-testing with children ages 8–12 (N = 42) exposed to an acute medical event and their parents was conducted throughout intervention development. Results Functionality challenges in early prototypes necessitated revisions. Child engagement was positive throughout revisions to the Coping Coach intervention. Final pilot-testing demonstrated promising feasibility and high user-engagement and satisfaction. Conclusion Applying a systematic approach to the development of Coping Coach led to the creation of a functional intervention that is accepted by children and parents. Development of new e-health interventions may benefit from a similar approach. Future research should evaluate the efficacy of Coping Coach in achieving targeted outcomes of reduced trauma symptoms and improved health-related quality of life. PMID:25844276
TIME Impact - a new user-friendly tuberculosis (TB) model to inform TB policy decisions.
Houben, R M G J; Lalli, M; Sumner, T; Hamilton, M; Pedrazzoli, D; Bonsu, F; Hippner, P; Pillay, Y; Kimerling, M; Ahmedov, S; Pretorius, C; White, R G
2016-03-24
Tuberculosis (TB) is the leading cause of death from infectious disease worldwide, predominantly affecting low- and middle-income countries (LMICs), where resources are limited. As such, countries need to be able to choose the most efficient interventions for their respective setting. Mathematical models can be valuable tools to inform rational policy decisions and improve resource allocation, but are often unavailable or inaccessible for LMICs, particularly in TB. We developed TIME Impact, a user-friendly TB model that enables local capacity building and strengthens country-specific policy discussions to inform and support funding applications at the (sub-)national level (e.g. Ministry of Finance) or to international donors (e.g. the Global Fund to Fight AIDS, Tuberculosis and Malaria). TIME Impact is an epidemiological transmission model nested in TIME, a set of TB modelling tools available for free download within the widely-used Spectrum software. The TIME Impact model reflects key aspects of the natural history of TB, with additional structure for HIV/ART, drug resistance, treatment history and age. TIME Impact enables national TB programmes (NTPs) and other TB policymakers to better understand their own TB epidemic, plan their response, apply for funding and evaluate the implementation of the response. The explicit aim of TIME Impact's user-friendly interface is to enable training of local and international TB experts towards independent use. During application of TIME Impact, close involvement of the NTPs and other local partners also builds critical understanding of the modelling methods, assumptions and limitations inherent to modelling. This is essential to generate broad country-level ownership of the modelling data inputs and results. In turn, it stimulates discussions and a review of the current evidence and assumptions, strengthening the decision-making process in general. TIME Impact has been effectively applied in a variety of settings. In South Africa, it informed the first South African HIV and TB Investment Cases and successfully leveraged additional resources from the National Treasury at a time of austerity. In Ghana, a long-term TIME model-centred interaction with the NTP provided new insights into the local epidemiology and guided resource allocation decisions to improve impact.
Improvements to NASA's Debris Assessment Software
NASA Technical Reports Server (NTRS)
Opiela, J.; Johnson, Nicholas L.
2007-01-01
NASA's Debris Assessment Software (DAS) has been substantially revised and expanded. DAS is designed to assist NASA programs in performing orbital debris assessments, as described in NASA's Guidelines and Assessment Procedures for Limiting Orbital Debris. The extensive upgrade of DAS was undertaken to reflect changes in the debris mitigation guidelines, to incorporate recommendations from DAS users, and to take advantage of recent software capabilities for greater user utility. DAS 2.0 includes an updated environment model and enhanced orbital propagators and reentry-survivability models. The ORDEM96 debris environment model has been replaced by ORDEM2000 in DAS 2.0, which is also designed to accept anticipated revisions to the environment definition. Numerous upgrades have also been applied to the assessment of human casualty potential due to reentering debris. Routines derived from the Object Reentry Survival Analysis Tool, Version 6 (ORSAT 6), determine which objects are assessed to survive reentry, and the resulting risk of human casualty is calculated directly based upon the orbital inclination and a future world population database. When evaluating reentry risks, the user may enter up to 200 unique hardware components for each launched object, in up to four nested levels. This last feature allows the software to more accurately model components that are exposed below the initial breakup altitude. The new DAS 2.0 provides an updated set of tools for users to assess their mission's compliance with the NASA Safety Standard and does so with a clear and easy-to-understand interface. The new native Microsoft Windows graphical user interface (GUI) is a vast improvement over the previous DOS-based interface. In the new version, functions are more clearly laid out, and the GUI includes the standard Windows-style Help functions. The underlying routines within the DAS code are also improved.
Vector representation of user's view using self-organizing map
NASA Astrophysics Data System (ADS)
Ae, Tadashi; Yamaguchi, Tomohisa; Monden, Eri; Kawabata, Shunji; Kamitani, Motoki
2004-05-01
There exist various objects, such as pictures, music, texts, etc., around our environment. We form a view of these objects by looking, reading, or listening. Our view is deeply connected with our behaviors and is very important for understanding them. Therefore, we propose a method which acquires a view as a vector, and apply the vector to sequence generation. We focus on sequences of data that a user selects from a multimedia database containing pictures, music, movies, etc. These data cannot be stereotyped because each user's view of them differs. Therefore, we represent the structure of the multimedia database with a vector representing the user's view and a stereotyped vector, and acquire sequences containing the structure as elements. We demonstrate a city-sequence generation system which reflects the user's intention as an application of sequence generation incorporating the user's view. We apply the self-organizing map to this system to represent the user's view.
Addressing and Presenting Quality of Satellite Data via Web-Based Services
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory; Lynnes, C.; Ahmad, S.; Fox, P.; Zednik, S.; West, P.
2011-01-01
With the recent attention to climate change and the proliferation of remote-sensing data utilization, climate modeling and various environmental monitoring and protection applications have begun to rely increasingly on satellite measurements. Research application users seek good quality satellite data, with uncertainties and biases provided for each data point. However, different communities address remote sensing quality issues rather inconsistently and differently. We describe our attempt to systematically characterize, capture, and provision quality and uncertainty information as it applies to the NASA MODIS Aerosol Optical Depth data product. In particular, we note the semantic differences in quality/bias/uncertainty at the pixel, granule, product, and record levels. We outline various factors contributing to the uncertainty or error budget. Web-based science analysis and processing tools allow users to access, analyze, and generate visualizations of data while relieving users of directly managing complex data processing operations. These tools provide value by streamlining the data analysis process, but usually shield users from details of the data processing steps, algorithm assumptions, caveats, etc. Correct interpretation of the final analysis requires user understanding of how data has been generated and processed and what potential biases, anomalies, or errors may have been introduced. By providing services that leverage data lineage provenance and domain expertise, expert systems can be built to aid the user in understanding data sources, processing, and the suitability for use of products generated by the tools. We describe our experiences developing a semantic, provenance-aware, expert-knowledge advisory system applied to the NASA Giovanni web-based Earth science data analysis tool as part of the ESTO AIST-funded Multi-sensor Data Synergy Advisor project.
3-dimensional orthodontics visualization system with dental study models and orthopantomograms
NASA Astrophysics Data System (ADS)
Zhang, Hua; Ong, S. H.; Foong, K. W. C.; Dhar, T.
2005-04-01
The aim of this study is to develop a system that provides 3-dimensional visualization of orthodontic treatments. Dental plaster models and the corresponding orthopantomogram (dental panoramic tomogram) are first digitized and fed into the system. A semi-automatic segmentation technique is applied to the plaster models to detect the dental arches, tooth interstices and gum margins, which are used to extract individual crown models. A 3-dimensional representation of the roots, generated by deforming generic tooth models to the orthopantomogram using radial basis functions, is attached to the corresponding crowns to enable visualization of complete teeth. An optional algorithm to close the gaps between the deformed roots and the actual crowns using multiquadric radial basis functions is also presented, which is capable of generating a smooth mesh representation of complete 3-dimensional teeth. The user interface is carefully designed to achieve a flexible system with as much user friendliness as possible. Manual calibration and correction are possible throughout the data processing steps to compensate for occasional failures of the automatic procedures. By allowing users to move and re-arrange individual teeth (with their roots) in a full dentition, this orthodontic visualization system provides an easy and accurate way of simulating and planning orthodontic treatment. Its capability of presenting 3-dimensional root information with only study models and an orthopantomogram is especially useful for patients who do not undergo CT scanning, which is not a routine procedure in most orthodontic cases.
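The radial-basis-function deformation used to fit generic roots to actual crowns can be illustrated with SciPy: given corresponding landmark points on the generic and patient-specific surfaces, one RBF per coordinate maps any point of the generic model into patient space. The landmark values and mesh points below are made up for illustration.

```python
import numpy as np
from scipy.interpolate import Rbf

# Made-up corresponding landmarks: generic tooth model -> patient crown surface.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], float)
dst = src + np.array([[0.1, 0, 0], [0.2, 0.1, 0], [0, 0.15, 0.05],
                      [0.05, 0, 0.2], [0.1, 0.1, 0.1]])

# One multiquadric RBF per output coordinate (x, y, z).
warps = [Rbf(src[:, 0], src[:, 1], src[:, 2], dst[:, i], function='multiquadric')
         for i in range(3)]

def deform(points):
    """Map points from generic-model space into patient space."""
    return np.column_stack([w(points[:, 0], points[:, 1], points[:, 2]) for w in warps])

root_vertices = np.array([[0.5, 0.5, -1.0], [0.2, 0.8, -0.5]])  # hypothetical root mesh points
print(deform(root_vertices))
```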
NASA Astrophysics Data System (ADS)
Dewi, D. S.; Sudiarno, A.; Saputra, H.; Dewi, R. S.
2018-04-01
The number of internet users in Indonesia has increased rapidly over the last decade. A survey conducted by the Association of Internet Service Providers Indonesia shows that internet users have reached 34.9% of the total population in Indonesia. The increase in internet users has led to a shift in trading practice from conventional trade to online trade. It is predicted that in the coming years the number of online consumers in Indonesia will continue to increase, providing many opportunities for online businesses. However, the huge number of internet users is not necessarily matched by a high number of online purchases. It has therefore become of interest to many researchers to investigate the factors that influence online purchasing decisions. This research proposes a model that assesses the effect of emotional design and customer reviews on customer intention toward e-repeat purchase. An online questionnaire was designed and distributed randomly through Google Forms. A total of 187 respondents filled in the questionnaire, of whom only 162 had actual experience with online purchasing. These data were then processed using statistical analysis. A model was developed by applying a structural equation modeling (SEM) approach. This study revealed that customer reviews, especially objective reviews, have a significant effect on repeat purchase, and that emotional design, particularly visual attractiveness, also shows a significant effect on e-repeat purchase.
MobRISK: a model for assessing the exposure of road users to flash flood events
NASA Astrophysics Data System (ADS)
Shabou, Saif; Ruin, Isabelle; Lutoff, Céline; Debionne, Samuel; Anquetin, Sandrine; Creutin, Jean-Dominique; Beaufils, Xavier
2017-09-01
Recent flash flood impact studies highlight that road networks are often disrupted due to adverse weather and flash flood events. Road users are thus particularly exposed to road flooding during their daily mobility. Previous exposure studies, however, do not take into consideration population mobility. Recent advances in transportation research provide an appropriate framework for simulating individual travel-activity patterns using an activity-based approach. These activity-based mobility models enable the prediction of the sequence of activities performed by individuals and locating them with a high spatial-temporal resolution. This paper describes the development of the MobRISK microsimulation system: a model for assessing the exposure of road users to extreme hydrometeorological events. MobRISK aims at providing an accurate spatiotemporal exposure assessment by integrating travel-activity behaviors and mobility adaptation with respect to weather disruptions. The model is applied in a flash-flood-prone area in southern France to assess motorists' exposure to the September 2002 flash flood event. The results show that risk of flooding mainly occurs in principal road links with considerable traffic load. However, a lag time between the timing of the road submersion and persons crossing these roads contributes to reducing the potential vehicle-related fatal accidents. It is also found that sociodemographic variables have a significant effect on individual exposure. Thus, the proposed model demonstrates the benefits of considering spatiotemporal dynamics of population exposure to flash floods and presents an important improvement in exposure assessment methods. Such improved characterization of road user exposures can present valuable information for flood risk management services.
Applying Dynamic Fuzzy Petri Net to Web Learning System
ERIC Educational Resources Information Center
Chen, Juei-Nan; Huang, Yueh-Min; Chu, William
2005-01-01
This investigation presents a DFPN (Dynamic Fuzzy Petri Net) model to increase the flexibility of the tutoring agent's behaviour and thus provide a learning content structure for a lecture course. The tutoring agent is a software assistant for a single user, who may be an expert in an e-Learning course. Based on each learner's behaviour, the…
The Impact of Different Scoring Rubrics for Grading Virtual Patient-Based Exams
ERIC Educational Resources Information Center
Fors, Uno G. H.; Gunning, William T.
2014-01-01
Virtual patient cases (VPs) are used for healthcare education and assessment. Most VP systems track user interactions to be used for assessment. Few studies have investigated how virtual exam cases should be scored and graded. We have applied eight different scoring models on a data set from 154 students. Issues studied included the impact of…
Bodenmann, Patrick; Baggio, Stéphanie; Iglesias, Katia; Althaus, Fabrice; Velonaki, Venetia-Sofia; Stucki, Stephanie; Ansermet, Corine; Paroz, Sophie; Trueb, Lionel; Hugli, Olivier; Griffin, Judith L; Daeppen, Jean-Bernard
2015-12-09
Frequent emergency department (ED) users meet several of the criteria of vulnerability, but this needs to be further examined taking into consideration all of vulnerability's different dimensions. This study aimed to characterize frequent ED users and to define risk factors of frequent ED use within a universal health care coverage system, applying a conceptual framework of vulnerability. A controlled, cross-sectional study comparing frequent ED users to a control group of non-frequent users was conducted at the Lausanne University Hospital, Switzerland. Frequent users were defined as patients with five or more visits to the ED in the previous 12 months. The two groups were compared using validated scales for each one of the five dimensions of an innovative conceptual framework: socio-demographic characteristics; somatic, mental, and risk-behavior indicators; and use of health care services. Independent t-tests, Wilcoxon rank-sum tests, Pearson's Chi-squared test and Fisher's exact test were used for the comparison. To examine the vulnerability-related risk factors for being a frequent ED user, univariate and multivariate logistic regression models were used. We compared 226 frequent users and 173 controls. Frequent users had more vulnerabilities in all five dimensions of the conceptual framework. They were younger and more often immigrants from low/middle-income countries or unemployed, had more somatic and psychiatric comorbidities, were more often tobacco users, and had more primary care physician (PCP) visits. The most significant frequent ED use risk factors were a history of more than three hospital admissions in the previous 12 months (adj OR:23.2, 95%CI = 9.1-59.2), the absence of a PCP (adj OR:8.4, 95%CI = 2.1-32.7), living less than 5 km from an ED (adj OR:4.4, 95%CI = 2.1-9.0), and household income lower than USD 2,800/month (adj OR:4.3, 95%CI = 2.0-9.2). Frequent ED users within a universal health coverage system form a highly vulnerable population when taking into account all five dimensions of a conceptual framework of vulnerability. The predictive factors identified could be useful in the early detection of future frequent users, in order to address their specific needs and decrease vulnerability, a key priority for health care policy makers. Application of the conceptual framework in future research is warranted.
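The adjusted odds ratios above come from a multivariate logistic regression; the minimal sketch below shows how such odds ratios and confidence intervals are typically obtained. The variable names and the tiny dataset are hypothetical, not the study data.

```python
# Hedged illustration of a multivariate logistic regression with adjusted odds ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "frequent_user":        [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    "hospitalizations_gt3": [1, 0, 1, 1, 0, 0, 1, 0, 0, 1],
    "no_pcp":               [1, 0, 0, 0, 1, 0, 0, 1, 0, 0],
    "lives_within_5km":     [1, 1, 1, 0, 1, 0, 0, 0, 0, 1],
})

X = sm.add_constant(df[["hospitalizations_gt3", "no_pcp", "lives_within_5km"]])
fit = sm.Logit(df["frequent_user"], X).fit(disp=0)

# Exponentiated coefficients give adjusted odds ratios with 95% confidence intervals.
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())
print(pd.concat([odds_ratios, conf_int], axis=1))
```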
Cannabis, alcohol use, psychological distress, and decision-making style.
Phillips, James G; Ogeil, Rowan P
2017-09-01
There have been suggestions of hypofrontality in cannabis users. To understand cannabis-related differences in decisional processes, Janis and Mann's conflict model of decision making was applied to recreational cannabis smokers who varied in their alcohol use and level of psychological distress. An online sample of recreational substance users (114 male, 119 female) completed the Melbourne Decision Making Questionnaire, the Alcohol Use Disorders Identification Test (AUDIT), Kessler's Psychological Distress Scale (K10), and the Severity of Dependence Scale (SDS) for cannabis. Multivariate analysis of variance examined self-reported decision-making styles as a function of gender, recent cannabis use, risky alcohol consumption, and levels of psychological distress. Psychological distress was associated with lower decisional self-esteem and higher levels of procrastination and buck-passing. There were gender differences associated with cannabis use. Female cannabis users reported higher levels of hypervigilance, while male cannabis users reported lower levels of buck-passing. Although there was little indication of an avoidant decisional style in cannabis users, the results suggest that cannabis affects decisional processes, contributing to panic in females and impulsivity in males.
Salamone, Francesco; Belussi, Lorenzo; Currò, Cristian; Danza, Ludovico; Ghellere, Matteo; Guazzi, Giulia; Lenzi, Bruno; Megale, Valentino; Meroni, Italo
2018-05-17
Thermal comfort has become a key issue in building performance assessment as well as energy efficiency. Three methods are mainly recognized for its assessment. Two of them, based on standardized methodologies, address the problem either by considering the indoor environment in steady-state conditions (PMV and PPD) or by treating users as active subjects whose thermal perception is influenced by outdoor climatic conditions (adaptive approach). The third method is the starting point for investigating thermal comfort from an overall perspective by considering endogenous variables besides the traditional physical and environmental ones. Following this perspective, the paper describes the results of an in-field investigation of thermal conditions through the use of nearable and wearable solutions, parametric models and machine learning techniques. The aim of the research is the exploration of the reliability of IoT-based solutions combined with advanced algorithms, in order to create a replicable framework for the assessment and improvement of user thermal satisfaction. For this purpose, an experimental test in real offices was carried out involving eight workers. Parametric models are applied for the assessment of thermal comfort; IoT solutions are used to monitor the environmental variables and the users' parameters; and the machine learning CART method is used to predict the users' profiles and their thermal comfort perception with respect to the indoor environment.
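As a pointer to how the CART step could look in practice, here is a minimal classification-tree sketch. The feature names, sensor readings and comfort labels are assumptions for illustration, not the monitored data from the offices.

```python
# Illustrative CART sketch: predict a reported comfort class from monitored variables.
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical readings: air temperature (C), relative humidity (%), heart rate (bpm).
X = [[21.0, 45, 62], [24.5, 50, 70], [27.0, 55, 78], [22.5, 40, 65],
     [26.0, 60, 80], [20.0, 35, 60], [25.5, 52, 74], [23.0, 48, 68]]
y = [1, 1, 0, 1, 0, 0, 0, 1]   # 1 = comfortable, 0 = uncomfortable (made-up labels)

cart = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(cart, feature_names=["air_temp", "rel_humidity", "heart_rate"]))
print(cart.predict([[24.0, 50, 72]]))   # predicted comfort class for a new sample
```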
Quantitative analysis of bloggers' collective behavior powered by emotions
NASA Astrophysics Data System (ADS)
Mitrović, Marija; Paltoglou, Georgios; Tadić, Bosiljka
2011-02-01
Large-scale data resulting from users' online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study the emergence of emotional behavior among Web users. Mapping the high-resolution data from digg.com onto bipartite networks of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion classifier developed for this type of text. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and temporal correlations. To explore the robustness of these critical states, we design a network-automaton model on realistic network connections and several control parameters, which can be inferred from the dataset. Dissemination of emotions by a small fraction of very active users appears to critically tune the collective states.
Integrating Health Behavior Theory and Design Elements in Serious Games
Fleming, Theresa; Lucassen, Mathijs FG; Bridgman, Heather; Stasiak, Karolina; Shepherd, Matthew; Orpin, Peter
2015-01-01
Background Internet interventions for improving health and well-being have the potential to reach many people and fill gaps in service provision. Serious gaming interfaces provide opportunities to optimize user adherence and impact. Health interventions based in theory and evidence and tailored to psychological constructs have been found to be more effective to promote behavior change. Defining the design elements which engage users and help them to meet their goals can contribute to better informed serious games. Objective To elucidate design elements important in SPARX, a serious game for adolescents with depression, from a user-centered perspective. Methods We proposed a model based on an established theory of health behavior change and practical features of serious game design to organize ideas and rationale. We analyzed data from 5 studies comprising a total of 22 focus groups and 66 semistructured interviews conducted with youth and families in New Zealand and Australia who had viewed or used SPARX. User perceptions of the game were applied to this framework. Results A coherent framework was established using the three constructs of self-determination theory (SDT), autonomy, competence, and relatedness, to organize user perceptions and design elements within four areas important in design: computer game, accessibility, working alliance, and learning in immersion. User perceptions mapped well to the framework, which may assist developers in understanding the context of user needs. By mapping these elements against the constructs of SDT, we were able to propose a sound theoretical base for the model. Conclusions This study’s method allowed for the articulation of design elements in a serious game from a user-centered perspective within a coherent overarching framework. The framework can be used to deliberately incorporate serious game design elements that support a user’s sense of autonomy, competence, and relatedness, key constructs which have been found to mediate motivation at all stages of the change process. The resulting model introduces promising avenues for future exploration. Involving users in program design remains an imperative if serious games are to be fit for purpose. PMID:26543916
Factors affecting success of an integrated community-based telehealth system.
Hsieh, Hui-Lung; Tsai, Chung-Hung; Chih, Wen-Hai; Lin, Huei-Hsieh
2015-01-01
The rise of chronic and degenerative diseases in developed countries has become a critical epidemiologic issue. Telehealth can provide one viable way to enhance health care, public health, and health education delivery and support. The study aims to empirically examine and evaluate the success factors of community-based telehealth system adoption. The 336 valid respondents were residents of a rural community in Taiwan. Structural equation modeling (SEM) was used to assess the proposed model applied to telehealth. The findings showed the research model had good explanatory power and fitness. Also, the findings indicated that system quality exerted the strongest overall effect on intention to use. Furthermore, service quality exerted the strongest overall effect on user satisfaction. The findings also illustrated that the joint effects of three intrinsic qualities (system quality, information quality, and service quality) on use were mediated by user satisfaction and intention to use. The study implies that community-based telehealth service providers should improve these three intrinsic qualities to enhance user satisfaction and intention to use, which in turn can lead to increased usage of the telehealth equipment. The integrated community-based telehealth system may become an innovative and suitable way to deliver better care to the residents of communities.
Ravi, Logesh; Vairavasundaram, Subramaniyaswamy
2016-01-01
The rapid growth of the web and its applications has made recommender systems enormously important. Applied in various domains, recommender systems are designed to generate suggestions for items or services based on user interests. Recommender systems, however, suffer from many issues that reduce their effectiveness. Integrating powerful data management techniques into recommender systems can address such issues and significantly increase the quality of recommendations. Recent research on recommender systems reveals the idea of utilizing social network data to enhance traditional recommender systems with better prediction and improved accuracy. This paper reviews recommender systems based on social network data by considering the recommendation algorithms used, system functionalities, types of interfaces, filtering techniques, and artificial intelligence techniques. By examining the objectives, methodologies, and data sources of existing models in depth, the paper supports anyone interested in the development of travel recommendation systems and points to future research directions. We also propose a location recommendation system based on the social pertinent trust walker (SPTW) and compare its results with existing baseline random walk models. We then enhance the SPTW model to provide recommendations for groups of users. The results obtained from the experiments are presented.
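To make the trust-walker idea concrete, the sketch below runs a basic random walk with restart over a small, hypothetical trust graph and ranks the locations visited by trusted users. It is not the SPTW algorithm from the paper; all names, weights and visit sets are invented.

```python
# Generic trust-based random walk with restart (illustrative only, not SPTW).
import random

# user -> list of (trusted_user, trust_weight)
trust = {
    "alice": [("bob", 0.7), ("carol", 0.3)],
    "bob":   [("carol", 1.0)],
    "carol": [("alice", 0.5), ("bob", 0.5)],
}
# locations each user has visited or rated
visited = {"alice": {"museum"}, "bob": {"cafe", "park"}, "carol": {"park"}}

def recommend(start, steps=10000, restart=0.15, seed=0):
    rng = random.Random(seed)
    counts = {}
    current = start
    for _ in range(steps):
        if rng.random() < restart or not trust.get(current):
            current = start                          # jump back to the source user
        else:
            users, weights = zip(*trust[current])
            current = rng.choices(users, weights=weights)[0]
        for place in visited.get(current, ()):       # credit places known to trusted users
            if place not in visited[start]:
                counts[place] = counts.get(place, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)

print(recommend("alice"))   # locations ranked by how often the walk reached them
```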
Holtyn, August F; Koffarnus, Mikhail N; DeFulio, Anthony; Sigurdsson, Sigurdur O; Strain, Eric C; Schwartz, Robert P; Silverman, Kenneth
2014-01-01
We examined the use of employment-based abstinence reinforcement in out-of-treatment injection drug users, in this secondary analysis of a previously reported trial. Participants (N = 33) could work in the therapeutic workplace, a model employment-based program for drug addiction, for 30 weeks and could earn approximately $10 per hr. During a 4-week induction, participants only had to work to earn pay. After induction, access to the workplace was contingent on enrollment in methadone treatment. After participants met the methadone contingency for 3 weeks, they had to provide opiate-negative urine samples to maintain maximum pay. After participants met those contingencies for 3 weeks, they had to provide opiate- and cocaine-negative urine samples to maintain maximum pay. The percentage of drug-negative urine samples remained stable until the abstinence reinforcement contingency for each drug was applied. The percentage of opiate- and cocaine-negative urine samples increased abruptly and significantly after the opiate- and cocaine-abstinence contingencies, respectively, were applied. These results demonstrate that the sequential administration of employment-based abstinence reinforcement can increase opiate and cocaine abstinence among out-of-treatment injection drug users. © Society for the Experimental Analysis of Behavior.
Holtyn, August F.; Koffarnus, Mikhail N.; DeFulio, Anthony; Sigurdsson, Sigurdur O.; Strain, Eric C.; Schwartz, Robert P.; Silverman, Kenneth
2016-01-01
We examined the use of employment-based abstinence reinforcement in out-of-treatment injection drug users, in this secondary analysis of a previously reported trial. Participants (N = 33) could work in the therapeutic workplace, a model employment-based program for drug addiction, for 30 weeks and could earn approximately $10 per hr. During a 4-week induction, participants only had to work to earn pay. After induction, access to the workplace was contingent on enrollment in methadone treatment. After participants met the methadone contingency for 3 weeks, they had to provide opiate-negative urine samples to maintain maximum pay. After participants met those contingencies for 3 weeks, they had to provide opiate- and cocaine-negative urine samples to maintain maximum pay. The percentage of drug-negative urine samples remained stable until the abstinence reinforcement contingency for each drug was applied. The percentage of opiate- and cocaine-negative urine samples increased abruptly and significantly after the opiate- and cocaine-abstinence contingencies, respectively, were applied. These results demonstrate that the sequential administration of employment-based abstinence reinforcement can increase opiate and cocaine abstinence among out-of-treatment injection drug users. PMID:25292399
PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.
Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco
2016-07-11
Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to face those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
Dynamic robustness of knowledge collaboration network of open source product development community
NASA Astrophysics Data System (ADS)
Zhou, Hong-Li; Zhang, Xiao-Dong
2018-01-01
As an emergent innovative design style, open source product development communities are characterized by a self-organizing, mass collaborative, networked structure. The robustness of the community is critical to its performance. Using the complex network modeling method, the knowledge collaboration network of the community is formulated, and the robustness of the network is systematically and dynamically studied. The characteristics of the network along the development period determine that its robustness should be studied from three time stages: the start-up, development and mature stages of the network. Five kinds of user-loss pattern are designed, to assess the network's robustness under different situations in each of these three time stages. Two indexes - the largest connected component and the network efficiency - are used to evaluate the robustness of the community. The proposed approach is applied in an existing open source car design community. The results indicate that the knowledge collaboration networks show different levels of robustness in different stages and different user loss patterns. Such analysis can be applied to provide protection strategies for the key users involved in knowledge dissemination and knowledge contribution at different stages of the network, thereby promoting the sustainable and stable development of the open source community.
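The two robustness indexes named above are standard network measures, and the sketch below shows how they can be tracked while users are progressively removed. The graph, the removal fractions and the "targeted" loss pattern are illustrative stand-ins, not the community's actual data or the paper's five loss patterns.

```python
# Minimal robustness sketch: largest connected component and global efficiency
# under progressive node removal (illustrative graph and removal order).
import networkx as nx

G = nx.barabasi_albert_graph(200, 2, seed=1)   # stand-in for the collaboration network

def robustness(graph, removal_order, fractions=(0.0, 0.1, 0.2, 0.3)):
    n = graph.number_of_nodes()
    results = []
    for frac in fractions:
        H = graph.copy()
        H.remove_nodes_from(removal_order[:int(frac * n)])
        lcc = max(nx.connected_components(H), key=len) if H.number_of_nodes() else set()
        results.append((frac, len(lcc) / n, nx.global_efficiency(H)))
    return results

# One possible loss pattern: remove the most-connected users first.
by_degree = sorted(G.nodes, key=G.degree, reverse=True)
for frac, lcc_share, eff in robustness(G, by_degree):
    print(f"removed {frac:.0%}: LCC share={lcc_share:.2f}, efficiency={eff:.3f}")
```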
Prediction of applied forces in handrim wheelchair propulsion.
Lin, Chien-Ju; Lin, Po-Chou; Guo, Lan-Yuen; Su, Fong-Chin
2011-02-03
Researchers of wheelchair propulsion have usually suggested that a wheelchair can be properly designed using anthropometrics to reduce high mechanical load and thus reduce pain and damage to joints. A model based on physiological features and biomechanical principles can be used to determine anthropometric relationships for wheelchair fitting. To improve the understanding of man-machine interaction and the mechanism through which propulsion performance been enhanced, this study develops and validates an energy model for wheelchair propulsion. Kinematic data obtained from ten able-bodied and ten wheelchair-dependent users during level propulsion at an average velocity of 1m/s were used as the input of a planar model with the criteria of increasing efficiency and reducing joint load. Results demonstrate that for both experienced and inexperienced users, predicted handrim contact forces agree with experimental data through an extensive range of the push. Significant deviations that were mostly observed in the early stage of the push phase might result from the lack of consideration of muscle dynamics and wrist joint biomechanics. The proposed model effectively verified the handrim contact force patterns during dynamic propulsion. Users do not aim to generate mechanically most effective forces to avoid high loadings on the joints. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Thomas, W. A.; McAnally, W. H., Jr.
1985-07-01
TABS-2 is a generalized numerical modeling system for open-channel flows, sedimentation, and constituent transport. It consists of more than 40 computer programs to perform modeling and related tasks. The major modeling components--RMA-2V, STUDH, and RMA-4--calculate two-dimensional, depth-averaged flows, sedimentation, and dispersive transport, respectively. The other programs in the system perform digitizing, mesh generation, data management, graphical display, output analysis, and model interfacing tasks. Utilities include file management and automatic generation of computer job control instructions. TABS-2 has been applied to a variety of waterways, including rivers, estuaries, bays, and marshes. It is designed for use by engineers and scientists who may not have a rigorous computer background. Use of the various components is described in Appendices A-O. The bound version of the report does not include the appendices. A looseleaf form with Appendices A-O is distributed to system users.
Modeling Vortex Generators in the Wind-US Code
NASA Technical Reports Server (NTRS)
Dudek, Julianne C.
2010-01-01
A source term model which simulates the effects of vortex generators was implemented into the Wind-US Navier Stokes code. The source term added to the Navier-Stokes equations simulates the lift force which would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, supersonic flow in a rectangular duct with a counterrotating vortex generator pair, and subsonic flow in an S-duct with 22 co-rotating vortex generators. The validation results indicate that the source term vortex generator model provides a useful tool for screening vortex generator configurations and gives comparable results to solutions computed using a gridded vane.
Crowdsourcing Based 3d Modeling
NASA Astrophysics Data System (ADS)
Somogyi, A.; Barsi, A.; Molnar, B.; Lovas, T.
2016-06-01
Web-based photo albums that support organizing and viewing the users' images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, the users attach geotags to the images in order to enable using them e.g. in location based applications on social networks. Our paper discusses a procedure that collects open access images from a site frequently visited by tourists. Geotagged pictures showing the image of a sight or tourist attraction are selected and processed in photogrammetric processing software that produces the 3D model of the captured object. For the particular investigation we selected three attractions in Budapest. To assess the geometrical accuracy, we used laser scanner and DSLR as well as smart phone photography to derive reference values to enable verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models could be derived applying photogrammetric processing software, simply by using images of the community, without visiting the site.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-09-14
This package contains statistical routines for extracting features from multivariate time-series data which can then be used for subsequent multivariate statistical analysis to identify patterns and anomalous behavior. It calculates local linear or quadratic regression model fits to moving windows for each series and then summarizes the model coefficients across user-defined time intervals for each series. These methods are domain agnostic, but they have been successfully applied to a variety of domains, including commercial aviation and electric power grid data.
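A minimal sketch of the windowed-regression feature idea follows: fit a local polynomial in each moving window and keep the coefficients as features, then summarize them across windows. The window size, polynomial degree and the synthetic series are assumptions, not the package's defaults.

```python
# Illustrative windowed-regression feature extraction for one time series.
import numpy as np

def window_features(series, window=20, degree=1):
    """Return one row of polynomial-fit coefficients per (non-overlapping) window."""
    x = np.arange(window)
    feats = []
    for start in range(0, len(series) - window + 1, window):
        y = series[start:start + window]
        feats.append(np.polyfit(x, y, degree))   # local trend coefficients
    return np.array(feats)

rng = np.random.default_rng(0)
signal = np.cumsum(rng.normal(size=200))          # stand-in for one monitored series
coeffs = window_features(signal, window=20, degree=2)
print(coeffs.shape)         # (10, 3): quadratic, linear and constant term per window
print(coeffs.mean(axis=0))  # summarized across windows, analogous to per-interval summaries
```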
Rajasekaran, Vijaykumar; López-Larraz, Eduardo; Trincado-Alonso, Fernando; Aranda, Joan; Montesano, Luis; Del-Ama, Antonio J; Pons, Jose L
2018-01-03
Gait training for individuals with neurological disorders is challenging in terms of providing suitable assistance and behaviour that adapts to user needs. User-specific adaptation can be defined based on the user's interaction with the orthosis and by monitoring the user's intentions. In this paper, an adaptive control model, commanded by the user intention, is evaluated using a lower limb exoskeleton with individuals with incomplete spinal cord injury (SCI). A user-intention-based adaptive control model was developed and evaluated with four individuals with incomplete SCI across three training sessions per individual. The adaptive control model modifies the joint impedance properties of the exoskeleton as a function of the human-orthosis interaction torques and the joint trajectory evolution along the gait sequence, in real time. The volitional input of the user is identified by monitoring the neural signals pertaining to the user's motor activity. These volitional inputs are used as a trigger to initiate the gait movement, allowing the user to control the initialization of the exoskeleton movement independently. A finite-state-machine-based control model is used in this set-up, which helps combine the volitional orders with the gait adaptation. The exoskeleton demonstrated adaptive assistance depending on the patients' performance without guiding them to follow an imposed trajectory. The exoskeleton initiated the trajectory based on the user intention command received from the brain machine interface, demonstrating it as a reliable trigger. The exoskeleton maintained equilibrium by providing suitable assistance throughout the experiments. A progressive change in the maximum flexion of the knee joint was observed at the end of each session, which shows improvement in patient performance. Results of the adaptive impedance were evaluated by comparison with the application of a constant impedance value. Participants reported that the movement of the exoskeleton was flexible and the walking patterns were similar to their own distinct patterns. This study demonstrates that user-specific adaptive control can be applied to a wearable robot based on the human-orthosis interaction torques by modifying the joints' impedance properties. The patients perceived no external or impulsive force and felt comfortable with the assistance provided by the exoskeleton. The main goal of such user-dependent control is to assist the patients' needs and adapt to their characteristics, thus maximizing their engagement in the therapy and avoiding slacking. In addition, the initiation directly controlled by the brain allows synchronizing the user's intention with the afferent stimulus provided by the movement of the exoskeleton, which maximizes the potential of the system in neuro-rehabilitative therapies.
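To illustrate the general idea of a joint impedance law whose stiffness is adjusted from interaction measurements, here is a conceptual sketch. It is not the authors' controller: the adaptation rule, gains, thresholds and signals are all hypothetical.

```python
# Conceptual sketch of an adaptive joint impedance law (not the paper's controller).
def impedance_torque(theta_ref, theta, dtheta_ref, dtheta, stiffness, damping):
    """Assistive torque from a spring-damper around the reference trajectory."""
    return stiffness * (theta_ref - theta) + damping * (dtheta_ref - dtheta)

def adapt_stiffness(stiffness, interaction_torque, error,
                    k_min=5.0, k_max=60.0, rate=0.5):
    """Soften assistance when the user tracks well with little interaction torque."""
    if abs(interaction_torque) < 2.0 and abs(error) < 0.05:
        stiffness -= rate          # user is leading: assist less
    else:
        stiffness += rate          # user deviates or pushes: assist more
    return min(max(stiffness, k_min), k_max)

# One control step for a knee joint (radians, N*m); the numbers are made up.
k = 30.0
tau_interaction, tracking_error = 1.2, 0.03
k = adapt_stiffness(k, tau_interaction, tracking_error)
tau_cmd = impedance_torque(theta_ref=0.6, theta=0.57, dtheta_ref=1.0, dtheta=0.9,
                           stiffness=k, damping=2.0)
print(k, tau_cmd)
```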
NASA Astrophysics Data System (ADS)
Zou, Jie; Gattani, Abhishek
2005-01-01
When completely automated systems don't yield acceptable accuracy, many practical pattern recognition systems involve the human either at the beginning (pre-processing) or towards the end (handling rejects). We believe that it may be more useful to involve the human throughout the recognition process rather than just at the beginning or end. We describe a methodology of interactive visual recognition for human-centered low-throughput applications, Computer Assisted Visual InterActive Recognition (CAVIAR), and discuss the prospects of implementing CAVIAR over the Internet. The novelty of CAVIAR is image-based interaction through a domain-specific parameterized geometrical model, which reduces the semantic gap between humans and computers. The user may interact with the computer anytime that she considers its response unsatisfactory. The interaction improves the accuracy of the classification features by improving the fit of the computer-proposed model. The computer makes subsequent use of the parameters of the improved model to refine not only its own statistical model-fitting process, but also its internal classifier. The CAVIAR methodology was applied to implement a flower recognition system. The principal conclusions from the evaluation of the system include: 1) the average recognition time of the CAVIAR system is significantly shorter than that of the unaided human; 2) its accuracy is significantly higher than that of the unaided machine; 3) it can be initialized with as few as one training sample per class and still achieve high accuracy; and 4) it demonstrates a self-learning ability. We have also implemented a Mobile CAVIAR system, where a pocket PC, as a client, connects to a server through wireless communication. The motivation behind a mobile platform for CAVIAR is to apply the methodology in a human-centered pervasive environment, where the user can seamlessly interact with the system for classifying field-data. Deploying CAVIAR to a networked mobile platform poses the challenge of classifying field images and programming under constraints of display size, network bandwidth, processor speed, and memory size. Editing of the computer-proposed model is performed on the handheld while statistical model fitting and classification take place on the server. The possibility that the user can easily take several photos of the object poses an interesting information fusion problem. The advantage of the Internet is that the patterns identified by different users can be pooled together to benefit all peer users. When users identify patterns with CAVIAR in a networked setting, they also collect training samples and provide opportunities for machine learning from their intervention. CAVIAR implemented over the Internet provides a perfect test bed for, and extends, the concept of Open Mind Initiative proposed by David Stork. Our experimental evaluation focuses on human time, machine and human accuracy, and machine learning. We devoted much effort to evaluating the use of our image-based user interface and on developing principles for the evaluation of interactive pattern recognition system. The Internet architecture and Mobile CAVIAR methodology have many applications. We are exploring in the directions of teledermatology, face recognition, and education.
Service quality of Early Childhood Education web portals in Finnish municipalities
NASA Astrophysics Data System (ADS)
Koskivaara, Eija; Pihlaja, Päivi
An increasing number of governmental organizations have made material available on their web sites as a way of providing users with information about their products and services. In this paper, we apply the instrument of Yang et al. (2005) to analyze municipal early childhood education (ECE) web sites in Finland. The objective of the study was to assess the quality of ECE web portals and to suggest how to improve their value from the users' point of view. In general, the five dimensions of the Yang et al. model (usability, usefulness of content, adequacy of information, accessibility, and interaction) seem to be applicable in the early childhood education environment as well.
Ochs, Christopher; Case, James T; Perl, Yehoshua
2017-03-01
Thousands of changes are applied to SNOMED CT's concepts during each release cycle. These changes are the result of efforts to improve or expand the coverage of health domains in the terminology. Understanding which concepts changed, how they changed, and the overall impact of a set of changes is important for editors and end users. Each SNOMED CT release comes with delta files, which identify all of the individual additions and removals of concepts and relationships. These files typically contain tens of thousands of individual entries, overwhelming users. They also do not identify the editorial processes that were applied to individual concepts and they do not capture the overall impact of a set of changes on a subhierarchy of concepts. In this paper we introduce a methodology and accompanying software tool called a SNOMED CT Visual Semantic Delta ("semantic delta" for short) to enable a comprehensive review of changes in SNOMED CT. The semantic delta displays a graphical list of editing operations that provides semantics and context to the additions and removals in the delta files. However, there may still be thousands of editing operations applied to a set of concepts. To address this issue, a semantic delta includes a visual summary of changes that affected sets of structurally and semantically similar concepts. The software tool for creating semantic deltas offers views of various granularities, allowing a user to control how much change information they view. In this tool a user can select a set of structurally and semantically similar concepts and review the editing operations that affected their modeling. The semantic delta methodology is demonstrated on SNOMED CT's Bacterial infectious disease subhierarchy, which has undergone a significant remodeling effort over the last two years. Copyright © 2017 Elsevier Inc. All rights reserved.
Ochs, Christopher; Case, James T.; Perl, Yehoshua
2017-01-01
Thousands of changes are applied to SNOMED CT’s concepts during each release cycle. These changes are the result of efforts to improve or expand the coverage of health domains in the terminology. Understanding which concepts changed, how they changed, and the overall impact of a set of changes is important for editors and end users. Each SNOMED CT release comes with delta files, which identify all of the individual additions and removals of concepts and relationships. These files typically contain tens of thousands of individual entries, overwhelming users. They also do not identify the editorial processes that were applied to individual concepts and they do not capture the overall impact of a set of changes on a subhierarchy of concepts. In this paper we introduce a methodology and accompanying software tool called a SNOMED CT Visual Semantic Delta (“semantic delta” for short) to enable a comprehensive review of changes in SNOMED CT. The semantic delta displays a graphical list of editing operations that provides semantics and context to the additions and removals in the delta files. However, there may still be thousands of editing operations applied to a set of concepts. To address this issue, a semantic delta includes a visual summary of changes that affected sets of structurally and semantically similar concepts. The software tool for creating semantic deltas offers views of various granularities, allowing a user to control how much change information they view. In this tool a user can select a set of structurally and semantically similar concepts and review the editing operations that affected their modeling. The semantic delta methodology is demonstrated on SNOMED CT’s Bacterial infectious disease subhierarchy, which has undergone a significant remodeling effort over the last two years. PMID:28215561
Choi, Kelvin; Sabado, Melanie; El-Toukhy, Sherine; Vogtmann, Emily; Freedman, Neal D; Hatsukami, Dorothy
2017-10-01
Background: Few studies have examined differences in product consumption patterns and nicotine and tobacco-specific nitrosamines (TSNA) exposure between single versus dual- and poly-tobacco users. We applied the Tobacco Product Use Patterns (T-PUPs) model to fill this gap in the literature. Methods: Data from adults (age ≥18 years) who used any tobacco products during the 5 days prior to participating in the 1999-2012 National Health and Nutrition Examination Survey (NHANES) were analyzed. Participants were classified into seven T-PUPs: (1) cigarettes only, (2) noncigarette combustibles only, (3) noncombustibles only, (4) dual noncigarette combustibles and noncombustibles, (5) dual cigarettes and noncombustibles, (6) dual cigarettes and noncigarette combustibles, and (7) poly-tobacco use. Weighted regression models were used to compare product consumption, serum cotinine, and urinary total 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanol (i.e., NNAL) levels between single-, dual-, and poly-tobacco T-PUPs. Results: Dual- and poly-tobacco T-PUPs were associated with lower product consumption compared with single-product T-PUPs only in some cases (e.g., dual cigarette and noncombustible users smoked cigarettes on 0.6 fewer days in the past 5 days compared with cigarette-only users; P < 0.05). Dual- and poly-tobacco T-PUPs had either nondistinguishable or higher levels of serum cotinine and urinary total NNAL than corresponding single-product T-PUPs. Conclusions: Product consumption, and nicotine and TSNAs exposure, of dual- and poly-tobacco product category users somewhat differ from those of single-product category users as defined by the T-PUPs model. Impact: Higher levels of cotinine and NNAL among dual- and poly-tobacco T-PUPs users compared with single-product T-PUPs users may indicate health concerns. Cancer Epidemiol Biomarkers Prev; 26(10); 1525-30. ©2017 American Association for Cancer Research.
Extracting Association Patterns in Network Communications
Portela, Javier; Villalba, Luis Javier García; Trujillo, Alejandra Guadalupe Silva; Orozco, Ana Lucila Sandoval; Kim, Tai-hoon
2015-01-01
In network communications, mixes provide protection against observers hiding the appearance of messages, patterns, length and links between senders and receivers. Statistical disclosure attacks aim to reveal the identity of senders and receivers in a communication network setting when it is protected by standard techniques based on mixes. This work aims to develop a global statistical disclosure attack to detect relationships between users. The only information used by the attacker is the number of messages sent and received by each user for each round, the batch of messages grouped by the anonymity system. A new modeling framework based on contingency tables is used. The assumptions are more flexible than those used in the literature, allowing to apply the method to multiple situations automatically, such as email data or social networks data. A classification scheme based on combinatoric solutions of the space of rounds retrieved is developed. Solutions about relationships between users are provided for all pairs of users simultaneously, since the dependence of the data retrieved needs to be addressed in a global sense. PMID:25679311
Extracting association patterns in network communications.
Portela, Javier; Villalba, Luis Javier García; Trujillo, Alejandra Guadalupe Silva; Orozco, Ana Lucila Sandoval; Kim, Tai-hoon
2015-02-11
In network communications, mixes provide protection against observers hiding the appearance of messages, patterns, length and links between senders and receivers. Statistical disclosure attacks aim to reveal the identity of senders and receivers in a communication network setting when it is protected by standard techniques based on mixes. This work aims to develop a global statistical disclosure attack to detect relationships between users. The only information used by the attacker is the number of messages sent and received by each user for each round, the batch of messages grouped by the anonymity system. A new modeling framework based on contingency tables is used. The assumptions are more flexible than those used in the literature, allowing to apply the method to multiple situations automatically, such as email data or social networks data. A classification scheme based on combinatoric solutions of the space of rounds retrieved is developed. Solutions about relationships between users are provided for all pairs of users simultaneously, since the dependence of the data retrieved needs to be addressed in a global sense.
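The round-based counting idea can be illustrated with a much simplified stand-in for the paper's contingency-table framework: accumulate, per round, the co-occurrence of messages sent and received per user, then flag sender-receiver pairs whose counts exceed what independence would predict. The traffic model, thresholds and hidden links below are invented.

```python
# Simplified illustration of a round-based statistical disclosure heuristic.
import numpy as np

rng = np.random.default_rng(1)
n_users, n_rounds = 5, 400
true_partner = {0: 3, 1: 4}                  # hidden sender -> receiver relationships

sent = rng.poisson(0.3, size=(n_rounds, n_users))
received = rng.poisson(0.3, size=(n_rounds, n_users))
for s, r in true_partner.items():            # messages from s tend to reach r
    received[:, r] += sent[:, s]

co = sent.T @ received                        # co-occurrence counts per (sender, receiver)
expected = np.outer(sent.sum(0), received.sum(0)) / max(sent.sum(), 1)
score = (co - expected) / np.sqrt(expected + 1e-9)
print(np.argwhere(score > 3))                 # pairs flagged as likely related
```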
Wang, Jichuan; Kelly, Brian C; Liu, Tieqiao; Hao, Wei
2016-03-01
Given the growth in methamphetamine use in China during the 21st century, we assessed perceived psychosocial barriers to drug treatment among this population. Using a sample of 303 methamphetamine users recruited via Respondent Driven Sampling, we use Latent Class Analysis (LCA) to identify possible distinct latent groups among Chinese methamphetamine users on the basis of their perceptions of psychosocial barriers to drug treatment. After covariates were included to predict latent class membership, the 3-step modeling approach was applied. Our findings indicate that the Chinese methamphetamine using population was heterogeneous on perceptions of drug treatment barriers; four distinct latent classes (subpopulations) were identified--Unsupported Deniers, Deniers, Privacy Anxious, and Low Barriers--and individual characteristics shaped the probability of class membership. Efforts to link Chinese methamphetamine users to treatment may require a multi-faceted approach that attends to differing perceptions about impediments to drug treatment. Copyright © 2015. Published by Elsevier Inc.
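For readers unfamiliar with latent class analysis, the compact EM routine below fits a generic latent class model with binary indicator items. It is included only to illustrate the kind of model the study estimates; it is not the authors' implementation and omits covariates and the 3-step procedure.

```python
# Generic EM for latent class analysis with binary items (illustrative only).
import numpy as np

def lca_em(X, n_classes=4, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    n, m = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)                # class proportions
    theta = rng.uniform(0.25, 0.75, size=(n_classes, m))    # P(item = 1 | class)
    for _ in range(n_iter):
        # E-step: posterior class membership for each respondent
        log_post = (X @ np.log(theta.T) + (1 - X) @ np.log(1 - theta.T) + np.log(pi))
        log_post -= log_post.max(axis=1, keepdims=True)
        resp = np.exp(log_post)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update class proportions and item-response probabilities
        pi = resp.mean(axis=0)
        theta = np.clip((resp.T @ X) / resp.sum(axis=0)[:, None], 1e-4, 1 - 1e-4)
    return pi, theta, resp

X = (np.random.default_rng(1).random((300, 6)) < 0.4).astype(float)  # fake yes/no items
pi, theta, resp = lca_em(X, n_classes=4)
print(pi.round(3))                 # estimated class sizes
print(resp.argmax(axis=1)[:10])    # most likely class for the first respondents
```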
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.; Nicewarner, Keith
2006-01-01
We present a multi-agent, model-based autonomy architecture with monitoring, planning, diagnosis, and execution elements. We discuss an internal spacecraft free-flying robot prototype controlled by an implementation of this architecture and a ground test facility used for development. In addition, we discuss a simplified environment control life support system for the spacecraft domain, also controlled by an implementation of this architecture. We discuss adjustable autonomy and how it applies to this architecture. We describe an interface that provides the user with situation awareness of both autonomous systems and enables the user to dynamically edit the plans prior to and during execution as well as control these agents at various levels of autonomy. This interface also permits the agents to query the user or request the user to perform tasks to help achieve the commanded goals. We conclude by describing a scenario where these two agents and a human interact to cooperatively detect, diagnose and recover from a simulated spacecraft fault.
Information diffusion in structured online social networks
NASA Astrophysics Data System (ADS)
Li, Pei; Zhang, Yini; Qiao, Fengcai; Wang, Hui
2015-05-01
Nowadays, due to the word-of-mouth effect, online social networks are considered efficient channels for viral marketing, which makes it very important to understand the diffusion dynamics in online social networks. However, most research on diffusion dynamics in epidemiology and existing social networks cannot be applied directly to characterize online social networks. In this paper, we propose models to characterize information diffusion in structured online social networks with a push-based forwarding mechanism. We introduce the term user influence to characterize the average number of message views generated when a user of a given type posts a message, and study the diffusion threshold, above which the user influence of generating a message approaches infinity. We conduct simulations and present the results, which are perfectly consistent with the theoretical analysis. These results are useful for understanding diffusion dynamics in online social networks and are also critical for advertisers in viral marketing who want to estimate user influence before posting an advertisement.
Carrino, Stefano; Caon, Maurizio; Angelini, Leonardo; Mugellini, Elena; Abou Khaled, Omar; Orte, Silvia; Vargiu, Eloisa; Coulson, Neil; Serrano, José C E; Tabozzi, Sarah; Lafortuna, Claudio; Rizzo, Giovanna
2014-01-01
Unhealthy alimentary behaviours and physical inactivity habits are key risk factors for major non-communicable diseases. Several studies demonstrate that juvenile obesity can lead to serious medical conditions and pathologies and can have important psycho-social consequences. PEGASO is a multidisciplinary project aimed at promoting healthy lifestyles among teenagers through assistive technology. The core of the project is the ICT system, which provides tailored interventions to users through their smartphones in order to motivate them. The novelty of this approach consists of developing a Virtual Individual Model (VIM) for user characterization, which is based on physical, functional and behavioural parameters opportunely selected by experts. These parameters are digitised and updated through smartphone-based user monitoring; data mining algorithms are applied to detect activity and nutrition habits, and this information is used to provide personalised feedback. The user interface will be developed using gamified approaches and integrating serious games to effectively promote health literacy and facilitate behaviour change.
Modeling Regular Replacement for String Constraint Solving
NASA Technical Reports Server (NTRS)
Fu, Xiang; Li, Chung-Chih
2010-01-01
Bugs in user input sanitization in software systems often lead to vulnerabilities. Many of them are caused by improper use of regular replacement. This paper presents a precise modeling of various semantics of regular substitution, such as the declarative, finite, greedy, and reluctant, using finite state transducers (FST). By projecting an FST to its input/output tapes, we are able to solve atomic string constraints, which can be applied to both the forward and backward image computation in model checking and symbolic execution of text processing programs. We report several interesting discoveries, e.g., certain fragments of the general problem can be handled using less expressive deterministic FST. A compact representation of FST is implemented in SUSHI, a string constraint solver. It is applied to detecting vulnerabilities in web applications.
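As a small, concrete illustration of why replacement semantics matter for sanitizers, the snippet below applies the same pattern and replacement under greedy and reluctant (lazy) matching and gets very different results. This is only a motivating example, not the FST construction from the paper.

```python
# Greedy vs. reluctant replacement semantics on the same input.
import re

payload = '<b>hello</b><i>world</i>'

print(re.sub(r'<.*>', '', payload))    # greedy: one match swallows the whole string -> ''
print(re.sub(r'<.*?>', '', payload))   # reluctant: each tag removed -> 'helloworld'
```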
Determining the sources of fine-grained sediment using the Sediment Source Assessment Tool (Sed_SAT)
Gorman Sanisaca, Lillian E.; Gellis, Allen C.; Lorenz, David L.
2017-07-27
A sound understanding of sources contributing to instream sediment flux in a watershed is important when developing total maximum daily load (TMDL) management strategies designed to reduce suspended sediment in streams. Sediment fingerprinting and sediment budget approaches are two techniques that, when used jointly, can qualify and quantify the major sources of sediment in a given watershed. The sediment fingerprinting approach uses trace element concentrations from samples in known potential source areas to determine a clear signature of each potential source. A mixing model is then used to determine the relative source contribution to the target suspended sediment samples. The computational steps required to apportion sediment for each target sample are quite involved and time intensive, a problem the Sediment Source Assessment Tool (Sed_SAT) addresses. Sed_SAT is a user-friendly statistical model that guides the user through the necessary steps in order to quantify the relative contributions of sediment sources in a given watershed. The model is written using the statistical software R (R Core Team, 2016b) and utilizes Microsoft Access® as a user interface but requires no prior knowledge of R or Microsoft Access® to run the model successfully. Sed_SAT identifies outliers, corrects for differences in size and organic content in the source samples relative to the target samples, evaluates the conservative behavior of tracers used in fingerprinting by applying a “Bracket Test,” identifies tracers with the highest discriminatory power, and provides robust error analysis through a Monte Carlo simulation following the mixing model. Quantifying sediment source contributions using the sediment fingerprinting approach provides local, State, and Federal land management agencies with important information needed to implement effective strategies to reduce sediment. Sed_SAT is designed to assist these agencies in applying the sediment fingerprinting approach to quantify sediment sources in the sediment TMDL framework.
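The unmixing step at the heart of sediment fingerprinting can be sketched as a constrained least-squares problem: find non-negative source proportions that sum to one and best reproduce the target sample's tracer concentrations. The sketch below is a generic illustration in Python, not Sed_SAT's R implementation, and the tracer values are made up.

```python
# Generic unmixing sketch for sediment fingerprinting (illustrative values).
import numpy as np
from scipy.optimize import minimize

# rows = candidate sources (e.g. cropland, stream banks, roads), columns = tracers
sources = np.array([[12.0, 3.1, 40.0],
                    [ 8.5, 5.0, 55.0],
                    [15.0, 2.2, 30.0]])
target = np.array([10.8, 3.8, 44.0])      # tracer concentrations in the suspended-sediment sample

def misfit(p):
    return np.sum(((p @ sources - target) / target) ** 2)

n = sources.shape[0]
res = minimize(misfit, x0=np.full(n, 1.0 / n),
               bounds=[(0, 1)] * n,
               constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1}])
print(res.x.round(3))   # estimated proportion of sediment contributed by each source
```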
Context-Enabled Business Intelligence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Troy Hiltbrand
To truly understand context and apply it in business intelligence, it is vital to understand what context is and how it can be applied in addressing organizational needs. Context describes the facets of the environment that impact the way that end users interact with the system. Context includes aspects of location, chronology, access method, demographics, social influence/relationships, end-user attitude/emotional state, behavior/past behavior, and presence. To be successful in making Business Intelligence context enabled, it is important to be able to capture the context of the user. With advances in technology, there are a number of ways in which this user-based information can be gathered and exposed to enhance the overall end user experience.
Model construction of “earning money by taking photos”
NASA Astrophysics Data System (ADS)
Yang, Jingmei
2018-03-01
In the information era, with increasingly developed networks, “earning money by taking photos” is a self-service model enabled by the mobile Internet. The user downloads the APP, registers as a member, takes a task that requires photographs from the APP, and earns the reward for the task on the APP. The article uses the task data and membership information of an already completed project, including each member's location and reputation value. On the basis of reasonable assumptions, the data were processed with MATLAB, SPSS and Excel. The article mainly studies the functional relationship between task completion, task position (GPS latitude and GPS longitude) and task price; analyzes the project's task pricing rules and the reasons why some tasks were not completed; and applies multivariate regression and the GeoQ software to the data. Charts are used to present the complex data clearly, and a simulation of realistic conditions is applied to analyze why tasks were not completed. Finally, compared with the original scheme, a new task pricing scheme is designed for the project: confidence levels obtained with SPSS are used to estimate a reasonable range for task pricing, and a new pricing scheme is predicted and designed within that range.
HESS Opinions "Should we apply bias correction to global and regional climate model data?"
NASA Astrophysics Data System (ADS)
Ehret, U.; Zehe, E.; Wulfmeyer, V.; Warrach-Sagi, K.; Liebert, J.
2012-04-01
Despite considerable progress in recent years, output of both Global and Regional Circulation Models is still afflicted with biases to a degree that precludes its direct use, especially in climate change impact studies. This is well known, and to overcome this problem bias correction (BC), i.e. the correction of model output towards observations in a post-processing step for its subsequent application in climate change impact studies, has now become a standard procedure. In this paper we argue that bias correction, which has a considerable influence on the results of impact studies, is not a valid procedure in the way it is currently used: it impairs the advantages of Circulation Models which are based on established physical laws by altering spatiotemporal field consistency, relations among variables and by violating conservation principles. Bias correction largely neglects feedback mechanisms and it is unclear whether bias correction methods are time-invariant under climate change conditions. Applying bias correction increases agreement of Climate Model output with observations in hindcasts and hence narrows the uncertainty range of simulations and predictions without, however, providing a satisfactory physical justification. This is in most cases not transparent to the end user. We argue that this masks rather than reduces uncertainty, which may lead to avoidable forejudging of end users and decision makers. We present here a brief overview of state-of-the-art bias correction methods, discuss the related assumptions and implications, draw conclusions on the validity of bias correction and propose ways to cope with biased output of Circulation Models in the short term and how to reduce the bias in the long term. The most promising strategy for improved future Global and Regional Circulation Model simulations is the increase in model resolution to the convection-permitting scale in combination with ensemble predictions based on sophisticated approaches for ensemble perturbation. With this article, we advocate communicating the entire uncertainty range associated with climate change predictions openly and hope to stimulate a lively discussion on bias correction among the atmospheric and hydrological community and end users of climate change impact studies.
The NASTRAN user's manual (level 17.0)
NASA Technical Reports Server (NTRS)
1979-01-01
NASTRAN embodies a lumped element approach, wherein the distributed physical properties of a structure are represented by a model consisting of a finite number of idealized substructures or elements that are interconnected at a finite number of grid points, to which loads are applied. All input and output data pertain to the idealized structural model. The general procedures for defining structural models are described and instructions are given for each of the bulk data cards and case control cards. Additional information on the case control cards and use of parameters is included for each rigid format.
NASA Astrophysics Data System (ADS)
Pournazeri, S.
2011-12-01
A comprehensive optimization model named the Cooperative Water Allocation Model (CWAM) is developed for equitable and efficient water allocation and valuation of the Zab river basin in order to address the drought problems of Orumieh Lake in northwest Iran. The model's methodology consists of three phases. The first represents an initial water rights allocation among competing users. The second comprises the water reallocation process for complete usage by consumers. The third phase performs an allocation of the net benefit of the stakeholders participating in a coalition by applying cooperative game theory. The environmental constraints are accounted for in the water allocation model by entering probable environmental damage in the objective function and inputting the minimum water requirements of users. The potential of groundwater usage is evaluated in order to compensate for the variation in the amount of surface water. This is conducted by applying an integrated economic-hydrologic river basin model. A node-link river basin network is utilized in CWAM, which consists of two major blocks. The first covers the internal water rights allocation and the second is associated with water and net benefit reallocation. System control, loss in links by evaporation or seepage, modification of inflow into the node, loss in nodes and loss in outflow are considered in this model. Water valuation is calculated for environmental, industrial, municipal and agricultural usage by a net benefit function. It can be seen that the water rights are allocated efficiently and incomes are distributed appropriately based on quality and quantity limitations.
Computational Software for Fitting Seismic Data to Epidemic-Type Aftershock Sequence Models
NASA Astrophysics Data System (ADS)
Chu, A.
2014-12-01
Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work introduces software to implement two of the ETAS models described in Ogata (1998). To find the Maximum-Likelihood Estimates (MLEs), my software provides estimates of the homogeneous background rate parameter and the temporal and spatial parameters that govern triggering effects by applying the Expectation-Maximization (EM) algorithm introduced in Veen and Schoenberg (2008). Although other computer programs exist for similar data modeling purposes, using the EM algorithm has the benefits of stability and robustness (Veen and Schoenberg, 2008). Spatial shapes that are very long and narrow cause difficulties in optimization convergence, and flat or multi-modal log-likelihood functions lead to similar issues. My program uses a robust method to preset a parameter to overcome this non-convergence issue. In addition to model fitting, the software is equipped with useful tools for examining model fitting results, for example visualization of the estimated conditional intensity and estimation of the expected number of triggered aftershocks. A simulation generator is also provided, with flexible spatial shapes that may be defined by the user. This open-source software has a very simple user interface. The user may execute it on a local computer, and the program also has potential to be hosted online. The Java language is used for the software's core computing part and an optional interface to the statistical package R is provided.
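For reference, a common form of the space-time ETAS conditional intensity is shown below, up to the particular choice of productivity, temporal and spatial kernels; the exact parameterization implemented in the software described above may differ.

```latex
\lambda(t, x, y \mid \mathcal{H}_t)
  = \mu(x, y)
  + \sum_{i \,:\, t_i < t} \kappa(m_i)\, g(t - t_i)\, f(x - x_i,\, y - y_i \mid m_i),
\qquad
\kappa(m) = K_0\, e^{\alpha (m - m_0)}, \quad
g(t) \propto (t + c)^{-p},
```

where \mu is the homogeneous (or spatially varying) background rate, \kappa the magnitude-dependent productivity, g a modified Omori-law temporal kernel, and f a spatial triggering kernel.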
Data Programming: Creating Large Training Sets, Quickly.
Ratner, Alexander; De Sa, Christopher; Wu, Sen; Selsam, Daniel; Ré, Christopher
2016-12-01
Large labeled training sets are the critical building blocks of supervised learning methods and are key enablers of deep learning techniques. For some applications, creating labeled training sets is the most time-consuming and expensive part of applying machine learning. We therefore propose a paradigm for the programmatic creation of training sets called data programming in which users express weak supervision strategies or domain heuristics as labeling functions, which are programs that label subsets of the data, but that are noisy and may conflict. We show that by explicitly representing this training set labeling process as a generative model, we can "denoise" the generated training set, and establish theoretically that we can recover the parameters of these generative models in a handful of settings. We then show how to modify a discriminative loss function to make it noise-aware, and demonstrate our method over a range of discriminative models including logistic regression and LSTMs. Experimentally, on the 2014 TAC-KBP Slot Filling challenge, we show that data programming would have led to a new winning score, and also show that applying data programming to an LSTM model leads to a TAC-KBP score almost 6 F1 points over a state-of-the-art LSTM baseline (and into second place in the competition). Additionally, in initial user studies we observed that data programming may be an easier way for non-experts to create machine learning models when training data is limited or unavailable.
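The labeling-function idea can be sketched in a few lines: each function votes noisily on each example or abstains, and the votes are combined into probabilistic training labels. The snippet below is a simplified illustration that combines votes with an unweighted average rather than the learned generative model described in the paper; the task, functions and keywords are hypothetical.

```python
# Simplified sketch of data programming: users write noisy labeling functions
# (LFs) that vote SPOUSE (+1), NOT_SPOUSE (-1), or ABSTAIN (0) on each example.
# Votes are combined here by a plain average; the actual approach learns LF
# accuracies with a generative model to "denoise" the resulting labels.
from typing import Callable, List

ABSTAIN, NEGATIVE, POSITIVE = 0, -1, 1

def lf_keyword_married(text: str) -> int:
    return POSITIVE if "married" in text.lower() else ABSTAIN

def lf_keyword_coworker(text: str) -> int:
    return NEGATIVE if "coworker" in text.lower() else ABSTAIN

def lf_mentions_wedding(text: str) -> int:      # hypothetical heuristic
    return POSITIVE if "wedding" in text.lower() else ABSTAIN

LFS: List[Callable[[str], int]] = [lf_keyword_married, lf_keyword_coworker, lf_mentions_wedding]

def weak_label(text: str) -> float:
    """Return a probabilistic label in [0, 1]; 0.5 means the LFs gave no signal."""
    votes = [v for v in (lf(text) for lf in LFS) if v != ABSTAIN]
    if not votes:
        return 0.5
    return 0.5 + 0.5 * sum(votes) / len(votes)

print(weak_label("Alice and Bob got married in 2014."))
print(weak_label("Alice is a coworker of Bob."))
```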
Data Programming: Creating Large Training Sets, Quickly
Ratner, Alexander; De Sa, Christopher; Wu, Sen; Selsam, Daniel; Ré, Christopher
2018-01-01
Large labeled training sets are the critical building blocks of supervised learning methods and are key enablers of deep learning techniques. For some applications, creating labeled training sets is the most time-consuming and expensive part of applying machine learning. We therefore propose a paradigm for the programmatic creation of training sets called data programming in which users express weak supervision strategies or domain heuristics as labeling functions, which are programs that label subsets of the data, but that are noisy and may conflict. We show that by explicitly representing this training set labeling process as a generative model, we can “denoise” the generated training set, and establish theoretically that we can recover the parameters of these generative models in a handful of settings. We then show how to modify a discriminative loss function to make it noise-aware, and demonstrate our method over a range of discriminative models including logistic regression and LSTMs. Experimentally, on the 2014 TAC-KBP Slot Filling challenge, we show that data programming would have led to a new winning score, and also show that applying data programming to an LSTM model leads to a TAC-KBP score almost 6 F1 points over a state-of-the-art LSTM baseline (and into second place in the competition). Additionally, in initial user studies we observed that data programming may be an easier way for non-experts to create machine learning models when training data is limited or unavailable. PMID:29872252
Modeling integrated water user decisions in intermittent supply systems
NASA Astrophysics Data System (ADS)
Rosenberg, David E.; Tarawneh, Tarek; Abdel-Khaleq, Rania; Lund, Jay R.
2007-07-01
We apply systems analysis to estimate household water use in an intermittent supply system considering numerous interdependent water user behaviors. Some 39 household actions include conservation; improving local storage or water quality; and accessing sources having variable costs, availabilities, reliabilities, and qualities. A stochastic optimization program with recourse decisions identifies the infrastructure investments and short-term coping actions a customer can adopt to cost-effectively respond to a probability distribution of piped water availability. Monte Carlo simulations show effects for a population of customers. Model calibration reproduces the distribution of billed residential water use in Amman, Jordan. Parametric analyses suggest economic and demand responses to increased availability and alternative pricing. They also suggest the potential market penetration of conservation actions, the associated water savings, and the subsidies needed to entice further adoption. We discuss new insights to size, target, and finance conservation.
Automatic control of finite element models for temperature-controlled radiofrequency ablation
Haemmerich, Dieter; Webster, John G
2005-01-01
Background The finite element method (FEM) has been used to simulate cardiac and hepatic radiofrequency (RF) ablation. The FEM allows modeling of complex geometries that cannot be solved by analytical methods or finite difference models. In both hepatic and cardiac RF ablation a common control mode is temperature-controlled mode. Commercial FEM packages do not support automated temperature control. Most researchers manually control the applied power by trial and error to keep the tip temperature of the electrodes constant. Methods We implemented a PI controller in a control program written in C++. The program checks the tip temperature after each step and controls the applied voltage to keep temperature constant. We created a closed loop system consisting of a FEM model and the software controlling the applied voltage. The control parameters for the controller were optimized using a closed loop system simulation. Results We present results of a temperature controlled 3-D FEM model of a RITA model 30 electrode. The control software effectively controlled applied voltage in the FEM model to obtain and keep the electrodes at the target temperature of 100°C. The closed loop system simulation output closely correlated with the FEM model, and allowed us to optimize control parameters. Discussion The closed loop control of the FEM model allowed us to implement temperature controlled RF ablation with minimal user input. PMID:16018811
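The closed-loop scheme can be illustrated with a small PI loop driving a toy first-order thermal model in place of the FEM solver; the plant parameters, PI gains and the assumed 50-ohm load below are illustrative values, not those of the study.

```python
# Sketch of temperature-controlled RF ablation: a PI controller updates the
# applied power (and hence voltage) after every coupling step so the electrode
# tip temperature approaches the 100 degC target. A first-order lumped thermal
# model stands in for the FEM solver; all parameters are illustrative assumptions.
import math

T_SET, T_BODY = 100.0, 37.0      # target and baseline tissue temperature (degC)
DT, T_END = 0.1, 60.0            # coupling time step and simulated duration (s)
KP, KI = 2.0, 0.2                # PI gains (assumed)
GAIN, TAU = 1.0, 8.0             # toy plant: degC per W at steady state, time constant (s)
R_LOAD = 50.0                    # assumed tissue load (ohm) to convert power to voltage

def plant_step(temp: float, power: float) -> float:
    """One explicit Euler step of the first-order thermal surrogate."""
    t_inf = T_BODY + GAIN * power          # equilibrium temperature for this power
    return temp + DT * (t_inf - temp) / TAU

temp, integral = T_BODY, 0.0
for step in range(int(T_END / DT) + 1):
    error = T_SET - temp
    integral += error * DT
    power = max(0.0, KP * error + KI * integral)    # PI law, clamped at zero
    voltage = math.sqrt(power * R_LOAD)
    if step % 100 == 0:
        print(f"t={step*DT:5.1f} s  P={power:6.1f} W  U={voltage:5.1f} V  T_tip={temp:6.2f} degC")
    temp = plant_step(temp, power)
```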
Stropahl, Maren; Schellhardt, Sebastian; Debener, Stefan
2017-06-01
The concurrent presentation of different auditory and visual syllables may result in the perception of a third syllable, reflecting an illusory fusion of visual and auditory information. This well-known McGurk effect is frequently used for the study of audio-visual integration. Recently, it was shown that the McGurk effect is strongly stimulus-dependent, which complicates comparisons across perceivers and inferences across studies. To overcome this limitation, we developed the freely available Oldenburg audio-visual speech stimuli (OLAVS), consisting of 8 different talkers and 12 different syllable combinations. The quality of the OLAVS set was evaluated with 24 normal-hearing subjects. All 96 stimuli were characterized based on their stimulus disparity, which was obtained from a probabilistic model (cf. Magnotti & Beauchamp, 2015). Moreover, the McGurk effect was studied in eight adult cochlear implant (CI) users. By applying the individual, stimulus-independent parameters of the probabilistic model, the predicted effect of stronger audio-visual integration in CI users could be confirmed, demonstrating the validity of the new stimulus material.
A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, Heather M; Graham, Paul S; Morgan, Keith S
2008-01-01
Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
Exploring the safety in numbers effect for vulnerable road users on a macroscopic scale.
Tasic, Ivana; Elvik, Rune; Brewer, Simon
2017-12-01
A "Safety in Numbers" effect for a certain group of road users is present if the number of crashes increases at a lower rate than the number of road users. The existence of this effect has been invoked to justify investments in multimodal transportation improvements in order to create more sustainable urban transportation systems by encouraging walking, biking, and transit ridership. The goal of this paper is to explore safety in numbers effect for cyclists and pedestrians in areas with different levels of access to multimodal infrastructure. Data from Chicago served to estimate the expected number of crashes on the census tract level by applying Generalized Additive Models (GAM) to capture spatial dependence in crash data. Measures of trip generation, multimodal infrastructure, network connectivity and completeness, and accessibility were used to model travel exposure in terms of activity, number of trips, trip length, travel opportunities, and conflicts. The results show that a safety in numbers effect exists on a macroscopic level for motor vehicles, pedestrians, and bicyclists. Copyright © 2017 Elsevier Ltd. All rights reserved.
Exploring beliefs about dietary supplement use: focus group discussions with Dutch adults.
Pajor, Emília Margit; Oenema, Anke; Eggers, Sander Matthijs; de Vries, Hein
2017-10-01
Although dietary supplement use is increasing in Europe and the USA, little research involving adults' beliefs regarding dietary supplements has been conducted. Therefore, the present study aimed to explore and compare users' and non-users' beliefs towards dietary supplements. Thirteen focus group discussions were conducted of which seven groups were dietary supplement users and six groups were non-users. Based on the socio-cognitive factors of the Integrated Change Model, a semi-structured topic guide was set up. The discussions were audio-recorded and subjected to qualitative content analysis, applying the framework approach. Data were collected in Maastricht, the Netherlands, in 2014 and 2015. In total fifty-six individuals participated in the study, of whom twenty-eight were dietary supplement users and twenty-eight non-users. The average age of participants was 42·9 years. Dietary supplement users' attitude beliefs were mainly related to mental and physical health enhancement, illness prevention and curative health benefits. Users were critical of the nutritional knowledge of health professionals and of the quality of food products. Non-users were convinced that the human body does not need any support and that regular food is enough to cover one's nutritional needs. Users and non-users held comparable beliefs regarding the definition and risks of dietary supplements, and perceived social influences. In their decision about dietary supplement use, both groups were guided by their own convictions to a great extent. Both groups would benefit from improved understanding of the health effects of dietary supplements to improve informed decision making.
Kerr, Cicely; Murray, Elizabeth; Burns, Jo; Turner, Indra; Westwood, Mark A; Macadam, Catherine; Nazareth, Irwin; Patterson, David
2008-01-01
Internet interventions can help people to self-manage chronic disease. However, they are only likely to be used if they meet patients' perceived needs. We have developed an Internet intervention in two stages to meet the needs of patients with coronary heart disease (CHD). First, user-generated criteria were applied to an existing US-based intervention called 'CHESS Living with Heart Disease', which provides information, emotional and social support, self-assessment and monitoring tools, and behavioural change support. This identified the development work required. Then we conducted a user evaluation with a panel of five patients with CHD. Overall, users generally made positive comments about the information content. However, they were critical of the presentation, the ease of navigation through the content, and the difficulty of understanding what was offered in the different services and finding the information they were after. Applying user-generated quality criteria proved useful in developing an intervention to meet the needs of UK patients with CHD.
NASA Technical Reports Server (NTRS)
Brown, Molly E.; Escobar, Vanessa M.
2013-01-01
NASA's Soil Moisture Active and Passive (SMAP) mission is planned for launch in October 2014 and will provide global measurements of soil moisture and freeze thaw state. The project is driven by both basic research and applied science goals. Understanding how application driven end-users will apply SMAP data, prior to the satellite's launch, is an important goal of NASA's applied science program and SMAP mission success. Because SMAP data are unique, there are no direct proxy data sets that can be used in research and operational studies to determine how the data will interact with existing processes. The objective of this study is to solicit data requirements, accuracy needs, and current understanding of the SMAP mission from the potential user community. This study showed that the data to be provided by the SMAP mission did substantially meet the user community needs. Although there was a broad distribution of requirements stated, the SMAP mission fit within these requirements.
Recreation conflict potential and management in the northern/central Black Forest Nature Park
C. Mann; J. D. Absher
2008-01-01
This study explores conflict in recreational use of the Black Forest Nature Park (BFNP) by six different nature sports groups as a function of infrastructure, forest management and other users. A multi-step, methodological triangulation conflict model from US recreation management was applied and tested in the Park. Results from two groups, hikers and mountain bikers,...
ERIC Educational Resources Information Center
Kotesky, Arturo A.
Feedback procedures and information provided to instructors within computer managed learning environments were assessed to determine current usefulness and meaningfulness to users, and to present the design of a different instructor feedback instrument. Kaufman's system model was applied to accomplish the needs assessment phase of the study; and…
Statistical Package User’s Guide.
1980-08-01
Excerpt from the guide's table of contents and program notes: STACH computes nonparametric descriptive statistics and CHIRA computes the coefficient of concordance on ranked data; program test data were drawn from John Neter and William Wasserman, Applied Linear Statistical Models.
Simulation of the wastewater temperature in sewers with TEMPEST.
Dürrenmatt, David J; Wanner, Oskar
2008-01-01
TEMPEST is a new interactive simulation program for the estimation of the wastewater temperature in sewers. Intuitive graphical user interfaces assist the user in managing data, performing calculations and plotting results. The program calculates the dynamics and longitudinal spatial profiles of the wastewater temperature in sewer lines. Interactions between wastewater, sewer air and surrounding soil are modeled in TEMPEST by mass balance equations, rate expressions found in the literature and a new empirical model of the airflow in the sewer. TEMPEST was developed as a tool which can be applied in practice, i.e., it requires as few input data as possible. These data include the upstream wastewater discharge and temperature, geometric and hydraulic parameters of the sewer, material properties of the sewer pipe and surrounding soil, ambient conditions, and estimates of the capacity of openings for air exchange between sewer and environment. Based on a case study it is shown how TEMPEST can be applied to estimate the decrease of the downstream wastewater temperature caused by heat recovery from the sewer. Because the efficiency of nitrification strongly depends on the wastewater temperature, this application is of practical relevance for situations in which the sewer ends at a nitrifying wastewater treatment plant.
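The longitudinal cooling captured by such a model can be illustrated with a drastically simplified steady-state heat balance in which the wastewater relaxes toward the soil temperature along the reach; the flow, geometry and heat-transfer values below are assumptions, and the sewer-air and transient terms modeled by TEMPEST are omitted.

```python
# Drastically simplified longitudinal heat balance for a sewer reach:
# m_dot * c_p * dT/dx = -k * P * (T - T_soil), i.e. the wastewater relaxes
# exponentially toward the soil temperature. All numbers are illustrative.
import numpy as np

CP = 4186.0          # specific heat of water (J/kg/K)
M_DOT = 50.0         # wastewater mass flow (kg/s), assumed
K_WALL = 5.0         # effective wastewater-to-soil heat transfer coeff (W/m^2/K), assumed
PERIMETER = 2.0      # exchange perimeter (m), assumed
T_SOIL = 8.0         # soil temperature (degC), assumed
T_IN = 16.0          # upstream wastewater temperature (degC), assumed
LENGTH = 5000.0      # reach length (m)

x = np.linspace(0.0, LENGTH, 6)
decay = K_WALL * PERIMETER / (M_DOT * CP)        # 1/m
temperature = T_SOIL + (T_IN - T_SOIL) * np.exp(-decay * x)

for xi, ti in zip(x, temperature):
    print(f"x = {xi:6.0f} m   T = {ti:5.2f} degC")
```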
Toward visual user interfaces supporting collaborative multimedia content management
NASA Astrophysics Data System (ADS)
Husein, Fathi; Leissler, Martin; Hemmje, Matthias
2000-12-01
Supporting collaborative multimedia content management activities, such as image and video acquisition, exploration, and access dialogues between naive users and multimedia information systems, is a non-trivial task. Although a wide variety of experimental and prototypical multimedia storage technologies as well as corresponding indexing and retrieval engines are available, most of them lack appropriate support for collaborative, end-user oriented user interface front ends. The development of advanced, user-adaptable interfaces is necessary for building collaborative multimedia information-space presentations based upon advanced tools for information browsing, searching, filtering, and brokering, to be applied on potentially very large and highly dynamic multimedia collections with a large number of users and user groups. Therefore, the development of advanced and at the same time adaptable and collaborative computer-graphical information presentation schemes that make it easy to apply adequate visual metaphors for defined target user stereotypes has to become a key focus within ongoing research activities trying to support collaborative information work with multimedia collections.
Speech Perception With Combined Electric-Acoustic Stimulation: A Simulation and Model Comparison.
Rader, Tobias; Adel, Youssef; Fastl, Hugo; Baumann, Uwe
2015-01-01
The aim of this study is to simulate speech perception with combined electric-acoustic stimulation (EAS), verify the advantage of combined stimulation in normal-hearing (NH) subjects, and then compare it with cochlear implant (CI) and EAS user results from the authors' previous study. Furthermore, an automatic speech recognition (ASR) system was built to examine the impact of low-frequency information and is proposed as an applied model to study different hypotheses of the combined-stimulation advantage. Signal-detection-theory (SDT) models were applied to assess predictions of subject performance without the need to assume any synergistic effects. Speech perception was tested using a closed-set matrix test (Oldenburg sentence test), and its speech material was processed to simulate CI and EAS hearing. A total of 43 NH subjects and a customized ASR system were tested. CI hearing was simulated by an aurally adequate signal spectrum analysis and representation, the part-tone-time-pattern, which was vocoded at 12 center frequencies according to the MED-EL DUET speech processor. Residual acoustic hearing was simulated by low-pass (LP)-filtered speech with cutoff frequencies 200 and 500 Hz for NH subjects and in the range from 100 to 500 Hz for the ASR system. Speech reception thresholds were determined in amplitude-modulated noise and in pseudocontinuous noise. Previously proposed SDT models were lastly applied to predict NH subject performance with EAS simulations. NH subjects tested with EAS simulations demonstrated the combined-stimulation advantage. Increasing the LP cutoff frequency from 200 to 500 Hz significantly improved speech reception thresholds in both noise conditions. In continuous noise, CI and EAS users showed generally better performance than NH subjects tested with simulations. In modulated noise, performance was comparable except for the EAS at cutoff frequency 500 Hz where NH subject performance was superior. The ASR system showed similar behavior to NH subjects despite a positive signal-to-noise ratio shift for both noise conditions, while demonstrating the synergistic effect for cutoff frequencies ≥300 Hz. One SDT model largely predicted the combined-stimulation results in continuous noise, while falling short of predicting performance observed in modulated noise. The presented simulation was able to demonstrate the combined-stimulation advantage for NH subjects as observed in EAS users. Only NH subjects tested with EAS simulations were able to take advantage of the gap listening effect, while CI and EAS user performance was consistently degraded in modulated noise compared with performance in continuous noise. The application of ASR systems seems feasible to assess the impact of different signal processing strategies on speech perception with CI and EAS simulations. In continuous noise, SDT models were largely able to predict the performance gain without assuming any synergistic effects, but model amendments are required to explain the gap listening effect in modulated noise.
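The residual-hearing side of such a simulation can be sketched by low-pass filtering a speech signal at the 200 and 500 Hz cutoffs mentioned above; the Butterworth filter and its order are assumptions, and the CI side (the 12-channel part-tone-time-pattern vocoder) is not shown.

```python
# Sketch of simulating residual acoustic hearing for an EAS condition:
# low-pass filter a speech signal at 200 or 500 Hz. The filter type and
# order are assumptions for illustration.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def simulate_residual_hearing(speech: np.ndarray, fs: float, cutoff_hz: float) -> np.ndarray:
    """Zero-phase low-pass filtering of `speech` sampled at `fs` Hz."""
    sos = butter(N=6, Wn=cutoff_hz, btype="low", fs=fs, output="sos")
    return sosfiltfilt(sos, speech)

# Toy example: a 1-second mixture of a 150 Hz and a 1 kHz tone.
fs = 16000
t = np.arange(fs) / fs
speech = np.sin(2 * np.pi * 150 * t) + np.sin(2 * np.pi * 1000 * t)

for cutoff in (200.0, 500.0):
    lp = simulate_residual_hearing(speech, fs, cutoff)
    print(f"cutoff {cutoff:5.1f} Hz -> RMS {np.sqrt(np.mean(lp**2)):.3f}")
```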
Automatic User Interface Generation for Visualizing Big Geoscience Data
NASA Astrophysics Data System (ADS)
Yu, H.; Wu, J.; Zhou, Y.; Tang, Z.; Kuo, K. S.
2016-12-01
Along with advanced computing and observation technologies, geoscience and its related fields have been generating a large amount of data at an unprecedented growth rate. Visualization becomes an increasingly attractive and feasible means for researchers to effectively and efficiently access and explore data to gain new understandings and discoveries. However, visualization has been challenging due to a lack of effective data models and visual representations to tackle the heterogeneity of geoscience data. We propose a new geoscience data visualization framework that leverages interface automata theory to automatically generate the user interface (UI). Our study has three main contributions. First, geoscience data has a unique hierarchical structure and complex formats, and therefore it is relatively easy for users to get lost or confused during their exploration of the data. By applying an interface automata model to the UI design, users can be clearly guided to find the exact visualization and analysis that they want. In addition, from a development perspective, an interface automaton is also easier to understand than conditional statements, which can simplify the development process. Second, it is common for geoscience data to have discontinuities in its hierarchical structure. The application of interface automata can prevent users from suffering automation surprises and enhance the user experience. Third, to support a variety of different data visualizations and analyses, our design with interface automata also makes applications extendable, in that a new visualization function or a new data group can easily be added to an existing application, which significantly reduces the maintenance overhead. We demonstrate the effectiveness of our framework using real-world applications.
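The central idea, an interface automaton whose states expose only the actions that are currently valid, can be conveyed with a small state machine; the states, actions and transitions below are invented for illustration and do not come from the framework itself.

```python
# Minimal interface-automaton sketch for guiding a visualization UI: the user
# can only take actions that are enabled in the current state, which prevents
# "automation surprises" such as plotting before a variable is selected.
# States, actions and transitions are hypothetical, for illustration only.
TRANSITIONS = {
    ("start", "open_dataset"): "dataset_loaded",
    ("dataset_loaded", "select_variable"): "variable_selected",
    ("variable_selected", "plot_map"): "map_view",
    ("variable_selected", "plot_timeseries"): "timeseries_view",
    ("map_view", "back"): "variable_selected",
    ("timeseries_view", "back"): "variable_selected",
}

class InterfaceAutomaton:
    def __init__(self, initial: str = "start"):
        self.state = initial

    def enabled_actions(self):
        return [a for (s, a) in TRANSITIONS if s == self.state]

    def fire(self, action: str) -> str:
        if (self.state, action) not in TRANSITIONS:
            raise ValueError(f"'{action}' is not enabled in state '{self.state}'")
        self.state = TRANSITIONS[(self.state, action)]
        return self.state

ui = InterfaceAutomaton()
for action in ["open_dataset", "select_variable", "plot_map", "back"]:
    print(f"{action:18s} -> {ui.fire(action)}   (next: {ui.enabled_actions()})")
```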
NASA Technical Reports Server (NTRS)
1979-01-01
The computer program Linear SCIDNT, which evaluates rotorcraft stability and control coefficients from flight or wind tunnel test data, is described. It implements the maximum likelihood method to maximize the likelihood function of the parameters based on measured input/output time histories. Linear SCIDNT may be applied to systems modeled by linear constant-coefficient differential equations. This restriction in scope allows the application of several analytical results which simplify the computation and improve its efficiency over the general nonlinear case.
Improving global flood risk awareness through collaborative research: Id-Lab
NASA Astrophysics Data System (ADS)
Weerts, A.; Zijderveld, A.; Cumiskey, L.; Buckman, L.; Verlaan, M.; Baart, F.
2015-12-01
Scientific and end-user collaboration on operational flood risk modelling and forecasting requires an environment where scientists and end-users can physically work together and demonstrate, enhance and learn about new tools, methods and models for forecasting and warning purposes. Therefore, Deltares has built a real-time demonstration, training and research infrastructure ('operational' room and ICT backend). This research infrastructure supports various functions: (1) real-time response and disaster management, (2) training, (3) collaborative research, and (4) demonstration. The research infrastructure will be used for a mixture of these functions on a regular basis by Deltares and a multitude of scientists and end users, such as universities, research institutes, consultants, governments and aid agencies. This infrastructure facilitates emergency advice and support during international and national disasters caused by rainfall, tropical cyclones or tsunamis. It hosts research flood and storm surge forecasting systems at global, continental and regional scales. It facilitates training for emergency and disaster management (including hosting forecasting-system user trainings in, for instance, the forecasting platform Delft-FEWS) both internally and externally. The facility is expected to inspire and initiate creative innovations by bringing together different experts from various organizations. The room hosts interactive modelling developments, participatory workshops and stakeholder meetings. State-of-the-art tools, models and software applied across the globe are available and on display within the facility. We will present the Id-Lab in detail, with particular focus on the global operational forecasting systems GLOFFIS (Global Flood Forecasting Information System) and GLOSSIS (Global Storm Surge Information System).
Investigating User Search Tactic Patterns and System Support in Using Digital Libraries
ERIC Educational Resources Information Center
Joo, Soohyung
2013-01-01
This study aims to investigate users' search tactic application and system support in using digital libraries. A user study was conducted with sixty digital library users. The study was designed to answer three research questions: 1) How do users engage in a search process by applying different types of search tactics while conducting different…
Modeling Vortex Generators in a Navier-Stokes Code
NASA Technical Reports Server (NTRS)
Dudek, Julianne C.
2011-01-01
A source-term model that simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, subsonic flow in an S-duct with 22 corotating vortex generators, and supersonic flow in a rectangular duct with a counter-rotating vortex-generator pair. The model was also used to successfully simulate microramps in supersonic flow by treating each microramp as a pair of vanes with opposite angles of incidence. The validation results indicate that the source-term vortex-generator model provides a useful tool for screening vortex-generator configurations and gives comparable results to solutions computed using gridded vanes.
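The essence of the source-term approach is to add, over the user-specified block of grid cells, a body force equivalent to the lift of the physical vane. The sketch below computes such a force from the three user inputs named above; the thin-airfoil lift slope and the uniform volumetric spreading are simplifications for illustration, not the Wind-US formulation.

```python
# Sketch of a vane-type vortex-generator source term: compute the lift force a
# physical vane would produce and spread it uniformly over the user-specified
# block of grid cells. The thin-airfoil lift slope (2*pi per radian) and the
# uniform distribution are simplifying assumptions.
import numpy as np

def vg_source_term(rho, u_local, planform_area_m2, incidence_deg, cell_volumes):
    """Return a per-cell body force density (N/m^3) for the momentum equations."""
    alpha = np.radians(incidence_deg)
    cl = 2.0 * np.pi * alpha                                  # thin-airfoil lift coefficient
    lift = 0.5 * rho * u_local**2 * planform_area_m2 * cl     # total side force (N)
    weights = cell_volumes / cell_volumes.sum()               # uniform volumetric spreading
    return lift * weights / cell_volumes                      # force per unit volume in each cell

# Example: a 16-degree vane in air at 100 m/s, force applied over a 4x4x2 block of cells.
cells = np.full((4, 4, 2), 1.0e-6)                            # cell volumes (m^3), assumed
f = vg_source_term(rho=1.2, u_local=100.0, planform_area_m2=2.0e-4,
                   incidence_deg=16.0, cell_volumes=cells)
print("total force recovered (N):", float((f * cells).sum()))
```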
HydroCube: an entity-relationship hydrogeological data model
NASA Astrophysics Data System (ADS)
Wojda, Piotr; Brouyère, Serge; Derouane, Johan; Dassargues, Alain
2010-12-01
Managing, handling and accessing hydrogeological information depends heavily on the applied hydrogeological data models, which differ between institutions and countries. The effective dissemination of hydrogeological information requires the convergence of such models to make hydrogeological information accessible to multiple users such as universities, water suppliers, and administration and research organisations. Furthermore, because hydrogeological studies are complex, they require a wide variety of high-quality hydrogeological data with appropriate metadata in clearly designed and coherent structures. A need exists, therefore, to develop and implement hydrogeological data models that cover, as much as possible, the full hydrogeological domain. A new data model, called HydroCube, was developed for the Walloon Region in Belgium in 2005. The HydroCube model presents an innovative holistic project-based approach which covers a full set of hydrogeological concepts and features, allowing for effective hydrogeological project management. The model stores data relating to the project locality, hydrogeological equipment, and related observations and measurements. In particular, it focuses on specialized hydrogeological field experiments such as pumping and tracer tests. This logical data model uses entity-relationship diagrams and it has been implemented in the Microsoft Access environment. It has been enriched with a fully functional user interface.
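The project-based structure can be conveyed with a few linked record types: a project groups localities (such as boreholes), which carry equipment and observations, and specialized experiments such as pumping tests reference them. The entity and attribute names in the sketch below are illustrative guesses at the kind of information HydroCube stores, not its actual schema.

```python
# Illustrative (not actual) sketch of a project-based hydrogeological data
# model in the spirit of HydroCube: a project groups localities, installed
# equipment, observations and specialized field experiments such as pumping tests.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Observation:
    parameter: str            # e.g. "groundwater_level", "tracer_concentration"
    value: float
    unit: str
    observed_at: datetime

@dataclass
class Locality:
    name: str                 # e.g. a borehole or spring identifier
    x: float
    y: float
    equipment: List[str] = field(default_factory=list)       # e.g. "piezometer"
    observations: List[Observation] = field(default_factory=list)

@dataclass
class PumpingTest:
    pumped_locality: str
    observation_localities: List[str]
    pumping_rate_m3_per_h: float
    started_at: datetime

@dataclass
class Project:
    title: str
    localities: List[Locality] = field(default_factory=list)
    pumping_tests: List[PumpingTest] = field(default_factory=list)

project = Project("Walloon well field study")
well = Locality("BH-01", x=205000.0, y=135000.0, equipment=["piezometer"])
well.observations.append(Observation("groundwater_level", 12.4, "m below surface",
                                     datetime(2005, 6, 1, 9, 0)))
project.localities.append(well)
project.pumping_tests.append(PumpingTest("BH-01", ["BH-02"], 30.0, datetime(2005, 6, 2, 8, 0)))
print(project.title, "-", len(project.localities), "locality,",
      len(project.pumping_tests), "pumping test")
```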
Requirements Engineering in Building Climate Science Software
NASA Astrophysics Data System (ADS)
Batcheller, Archer L.
Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks to both build a software system according to product requirements but also to conduct their work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the software team or users have control and responsibility for making changes in response to new scientific ideas. Thick infrastructure provides more functionality for users, but gives them less control of it. The stability of infrastructure trades off against the responsiveness that the infrastructure can have to user needs.
User's Guide for Mixed-Size Sediment Transport Model for Networks of One-Dimensional Open Channels
Bennett, James P.
2001-01-01
This user's guide describes a mathematical model for predicting the transport of mixed sizes of sediment by flow in networks of one-dimensional open channels. The simulation package is useful for general sediment routing problems, prediction of erosion and deposition following dam removal, and scour in channels at road embankment crossings or other artificial structures. The model treats input hydrographs as stepwise steady-state, and the flow computation algorithm automatically switches between sub- and supercritical flow as dictated by channel geometry and discharge. A variety of boundary conditions including weirs and rating curves may be applied both external and internal to the flow network. The model may be used to compute flow around islands and through multiple openings in embankments, but the network must be 'simple' in the sense that the flow directions in all channels can be specified before simulation commences. The location and shape of channel banks are user specified, and all bed-elevation changes take place between these banks and above a user-specified bedrock elevation. Computation of sediment-transport emphasizes the sand-size range (0.0625-2.0 millimeter) but the user may select any desired range of particle diameters including silt and finer (<0.0625 millimeter). As part of data input, the user may set the original bed-sediment composition of any number of layers of known thickness. The model computes the time evolution of total transport and the size composition of bed- and suspended-load sand through any cross section of interest. It also tracks bed-surface elevation and size composition. The model is written in the FORTRAN programming language for implementation on personal computers using the WINDOWS operating system and, along with certain graphical output display capability, is accessed from a graphical user interface (GUI). The GUI provides a framework for selecting input files and parameters of a number of components of the sediment-transport process. There are no restrictions in the use of the model as to numbers of channels, channel junctions, cross sections per channel, or points defining the cross sections. Following completion of the simulation computations, the GUI accommodates display of longitudinal plots of either bed elevation and size composition, or of transport rate and size composition of the various components, for individual channels and selected times during the simulation period. For individual cross sections, the GUI also allows display of time series of transport rate and size composition of the various components and of bed elevation and size composition.
Health effects of the London bicycle sharing system: health impact modelling study
Tainio, Marko; Cheshire, James; O’Brien, Oliver; Goodman, Anna
2014-01-01
Objective To model the impacts of the bicycle sharing system in London on the health of its users. Design Health impact modelling and evaluation, using a stochastic simulation model. Setting Central and inner London, England. Data sources Total population operational registration and usage data for the London cycle hire scheme (collected April 2011-March 2012), surveys of cycle hire users (collected 2011), and London data on travel, physical activity, road traffic collisions, and particulate air pollution (PM2.5; collected 2005-12). Participants 578 607 users of the London cycle hire scheme, aged 14 years and over, with an estimated 78% of travel time accounted for by users younger than 45 years. Main outcome measures Change in lifelong disability adjusted life years (DALYs) based on one year impacts on incidence of disease and injury, modelled through medium term changes in physical activity, road traffic injuries, and exposure to air pollution. Results Over the year examined the users made 7.4 million cycle hire trips (estimated 71% of cycling time by men). These trips would mostly otherwise have been made on foot (31%) or by public transport (47%). To date there has been a trend towards fewer fatalities and injuries than expected on cycle hire bicycles. Using these observed injury rates, the population benefits from the cycle hire scheme substantially outweighed harms (net change −72 DALYs (95% credible interval −110 to −43) among men using cycle hire per accounting year; −15 (−42 to −6) among women; note that negative DALYs represent a health benefit). When we modelled cycle hire injury rates as being equal to background rates for all cycling in central London, these benefits were smaller and there was no evidence of a benefit among women (change −49 DALYs (−88 to −17) among men; −1 DALY (−27 to 12) among women). This sex difference largely reflected higher road collision fatality rates for female cyclists. At older ages the modelled benefits of cycling were much larger than the harms. Using background injury rates in the youngest age group (15 to 29 years), the medium term benefits and harms were both comparatively small and potentially negative. Conclusion London’s bicycle sharing system has positive health impacts overall, but these benefits are clearer for men than for women and for older users than for younger users. The potential benefits of cycling may not currently apply to all groups in all settings. PMID:24524928
Mars Global Reference Atmospheric Model (Mars-GRAM) Version 3.8: Users Guide
NASA Astrophysics Data System (ADS)
Justus, C. G.; James, B. F.
1999-05-01
Mars Global Reference Atmospheric Model (Mars-GRAM) Version 3.8 is presented and its new features are discussed. Mars-GRAM uses new values of planetary reference ellipsoid radii, gravity term, and rotation rate (consistent with current JPL values) and includes centrifugal effects on gravity. The model now uses NASA Ames Global Circulation Model low resolution topography. Curvature corrections are applied to winds and limits based on speed of sound are applied. Altitude of the F1 ionization peak and density scale height, including effects of change of molecular weight with altitude, are computed. A check is performed to disallow temperatures below CO2 sublimation. This memorandum includes instructions on obtaining Mars-GRAM source code and data files and running the program. Sample input and output are provided. An example of incorporating Mars-GRAM as an atmospheric subroutine in a trajectory code is also given.
Performability evaluation of the SIFT computer
NASA Technical Reports Server (NTRS)
Meyer, J. F.; Furchtgott, D. G.; Wu, L. T.
1979-01-01
Performability modeling and evaluation techniques are applied to the SIFT computer as it might operate in the computational environment of an air transport mission. User-visible performance of the total system (SIFT plus its environment) is modeled as a random variable taking values in a set of levels of accomplishment. These levels are defined in terms of four attributes of total system behavior: safety, no change in mission profile, no operational penalties, and no economic penalties. The base model is a stochastic process whose states describe the internal structure of SIFT as well as relevant conditions of the environment. Base model state trajectories are related to accomplishment levels via a capability function which is formulated in terms of a 3-level model hierarchy. Performability evaluation algorithms are then applied to determine the performability of the total system for various choices of computer and environment parameter values. Numerical results of those evaluations are presented and, in conclusion, some implications of this effort are discussed.
Mars Global Reference Atmospheric Model (Mars-GRAM) Version 3.8: Users Guide
NASA Technical Reports Server (NTRS)
Justus, C. G.; James, B. F.
1999-01-01
Mars Global Reference Atmospheric Model (Mars-GRAM) Version 3.8 is presented and its new features are discussed. Mars-GRAM uses new values of planetary reference ellipsoid radii, gravity term, and rotation rate (consistent with current JPL values) and includes centrifugal effects on gravity. The model now uses NASA Ames Global Circulation Model low resolution topography. Curvature corrections are applied to winds and limits based on speed of sound are applied. Altitude of the F1 ionization peak and density scale height, including effects of change of molecular weight with altitude, are computed. A check is performed to disallow temperatures below CO2 sublimation. This memorandum includes instructions on obtaining Mars-GRAM source code and data files and running the program. Sample input and output are provided. An example of incorporating Mars-GRAM as an atmospheric subroutine in a trajectory code is also given.
a Target Aware Texture Mapping for Sculpture Heritage Modeling
NASA Astrophysics Data System (ADS)
Yang, C.; Zhang, F.; Huang, X.; Li, D.; Zhu, Y.
2017-08-01
In this paper, we propose a target-aware image-to-model registration method using silhouettes as the matching cue. The target sculpture in a natural environment can be automatically detected from an image with a complex background with the assistance of 3D geometric data. The silhouette can then be automatically extracted and applied in image-to-model matching. Because the user does not need to deliberately draw the target area, the time required for precise image-to-model matching is greatly reduced. To enhance the method further, we also improved the silhouette matching algorithm to support conditional silhouette matching. Two experiments, using a stone lion sculpture of the Ming Dynasty and a portable relic in a museum, are given to evaluate the proposed method. The method proposed in this paper has been extended and developed into mature software applied in many cultural heritage documentation projects.
User's Manual and Final Report for Hot-SMAC GUI Development
NASA Technical Reports Server (NTRS)
Yarrington, Phil
2001-01-01
A new software package called Higher Order Theory-Structural/Micro Analysis Code (HOT-SMAC) has been developed as an effective alternative to the finite element approach for Functionally Graded Material (FGM) modeling. HOT-SMAC is a self-contained package including pre- and post-processing through an intuitive graphical user interface, along with the well-established Higher Order Theory for Functionally Graded Materials (HOTFGM) thermomechanical analysis engine. This document represents a Getting Started/User's Manual for HOT-SMAC and a final report for its development. First, the features of the software are presented in a simple step-by-step example where a HOT-SMAC model representing a functionally graded material is created, mechanical and thermal boundary conditions are applied, the model is analyzed and results are reviewed. In a second step-by-step example, a HOT-SMAC model of an actively cooled metallic channel with ceramic thermal barrier coating is built and analyzed. HOT-SMAC results from this model are compared to recently published results (NASA/TM-2001-210702) for two grid densities. Finally, a prototype integration of HOTSMAC with the commercially available HyperSizer(R) structural analysis and sizing software is presented. In this integration, local strain results from HyperSizer's structural analysis are fed to a detailed HOT-SMAC model of the flange-to-facesheet bond region of a stiffened panel. HOT-SMAC is then used to determine the peak shear and peel (normal) stresses between the facesheet and bonded flange of the panel and determine the "free edge" effects.
Kherfi, Mohammed Lamine; Ziou, Djemel
2006-04-01
In content-based image retrieval, understanding the user's needs is a challenging task that requires integrating him in the process of retrieval. Relevance feedback (RF) has proven to be an effective tool for taking the user's judgement into account. In this paper, we present a new RF framework based on a feature selection algorithm that nicely combines the advantages of a probabilistic formulation with those of using both the positive example (PE) and the negative example (NE). Through interaction with the user, our algorithm learns the importance he assigns to image features, and then applies the results obtained to define similarity measures that correspond better to his judgement. The use of the NE allows images undesired by the user to be discarded, thereby improving retrieval accuracy. As for the probabilistic formulation of the problem, it presents a multitude of advantages and opens the door to more modeling possibilities that achieve a good feature selection. It makes it possible to cluster the query data into classes, choose the probability law that best models each class, model missing data, and support queries with multiple PE and/or NE classes. The basic principle of our algorithm is to assign more importance to features with a high likelihood and those which distinguish well between PE classes and NE classes. The proposed algorithm was validated separately and in image retrieval context, and the experiments show that it performs a good feature selection and contributes to improving retrieval effectiveness.
3DVEM Software Modules for Efficient Management of Point Clouds and Photorealistic 3d Models
NASA Astrophysics Data System (ADS)
Fabado, S.; Seguí, A. E.; Cabrelles, M.; Navarro, S.; García-De-San-Miguel, D.; Lerma, J. L.
2013-07-01
Cultural heritage managers in general, and information users in particular, are not usually used to dealing with highly technological hardware and software. On the contrary, the information providers of metric surveys are most of the time applying the latest developments in real-life conservation and restoration projects. This paper addresses the software issue of handling and managing either 3D point clouds or (photorealistic) 3D models to bridge the gap between information users and information providers as regards the management of information which users and providers share as a tool for decision-making, analysis, visualization and management. There are not many viewers specifically designed to easily handle, manage and create animations of architectural and/or archaeological 3D objects, monuments and sites, among others. 3DVEM - 3D Viewer, Editor & Meter software will be introduced to the scientific community, as well as 3DVEM - Live and 3DVEM - Register. The advantages of managing projects with both sets of data, 3D point clouds and photorealistic 3D models, will be introduced. Different visualizations of true documentation projects in the fields of architecture, archaeology and industry will be presented. Emphasis will be placed on highlighting the features of new user-friendly software to manage virtual projects. Furthermore, the ease of creating controlled interactive animations (both walk-through and fly-through) by the user, either on-the-fly or as a traditional movie file, will be demonstrated through 3DVEM - Live.
ADRPM-VII applied to the long-range acoustic detection problem
NASA Technical Reports Server (NTRS)
Shalis, Edward; Koenig, Gerald
1990-01-01
An acoustic detection range prediction model (ADRPM-VII) has been written for IBM PC/AT machines running on the MS-DOS operating system. The software allows the user to predict detection distances of ground combat vehicles and their associated targets when they are involved in quasi-military settings. The program can also calculate individual attenuation losses due to spherical spreading, atmospheric absorption, ground reflection and atmospheric refraction due to temperature and wind gradients, while varying parameters affecting the source-receiver problem. The purpose here is to examine the strengths and limitations of ADRPM-VII by modeling the losses due to atmospheric refraction and ground absorption, commonly known as excess attenuation, when applied to the long-range detection problem for distances greater than 3 kilometers.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Radiotelephone Act; or (b) Required to participate in a VMRS within a VTS area (VMRS User). VTS User's Manual...) User means a vessel, or an owner, operator, charterer, Master, or person directing the movement of a... which special operating requirements apply. VTS User means a vessel, or an owner, operator, charterer...
Learning to Detect Vandalism in Social Content Systems: A Study on Wikipedia
NASA Astrophysics Data System (ADS)
Javanmardi, Sara; McDonald, David W.; Caruana, Rich; Forouzan, Sholeh; Lopes, Cristina V.
A challenge facing user generated content systems is vandalism, i.e. edits that damage content quality. The high visibility and easy access to social networks makes them popular targets for vandals. Detecting and removing vandalism is critical for these user generated content systems. Because vandalism can take many forms, there are many different kinds of features that are potentially useful for detecting it. The complex nature of vandalism, and the large number of potential features, make vandalism detection difficult and time consuming for human editors. Machine learning techniques hold promise for developing accurate, tunable, and maintainable models that can be incorporated into vandalism detection tools. We describe a method for training classifiers for vandalism detection that yields classifiers that are more accurate on the PAN 2010 corpus than others previously developed. Because of the high turnaround in social network systems, it is important for vandalism detection tools to run in real-time. To this aim, we use feature selection to find the minimal set of features consistent with high accuracy. In addition, because some features are more costly to compute than others, we use cost-sensitive feature selection to reduce the total computational cost of executing our models. In addition to the features previously used for spam detection, we introduce new features based on user action histories. The user history features contribute significantly to classifier performance. The approach we use is general and can easily be applied to other user generated content systems.
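The cost-sensitive feature selection step can be sketched as a greedy search that, at each round, adds the feature with the largest cross-validated accuracy gain per unit of computational cost. The data, features, costs and the plain logistic regression below are synthetic stand-ins rather than the classifiers and feature set used in the study.

```python
# Sketch of cost-sensitive greedy feature selection for a vandalism classifier:
# at each round, add the candidate feature whose cross-validated accuracy gain
# per unit of (assumed) computation cost is largest. Synthetic data and costs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1000
features = {                      # name -> (column of values, relative compute cost)
    "num_chars_added":   (rng.normal(0, 1, n), 1.0),
    "uppercase_ratio":   (rng.normal(0, 1, n), 1.0),
    "user_edit_history": (rng.normal(0, 1, n), 5.0),   # costlier: needs a history lookup
    "bad_word_count":    (rng.normal(0, 1, n), 2.0),
}
y = (0.8 * features["bad_word_count"][0]
     + 0.8 * features["user_edit_history"][0]
     + rng.normal(0, 1, n) > 0).astype(int)

def cv_accuracy(names):
    X = np.column_stack([features[f][0] for f in names])
    return cross_val_score(LogisticRegression(), X, y, cv=5).mean()

selected, best_acc = [], 0.5            # start from the majority-class baseline
while len(selected) < len(features):
    scores = {f: (cv_accuracy(selected + [f]) - best_acc) / features[f][1]
              for f in features if f not in selected}
    best_feature, gain_per_cost = max(scores.items(), key=lambda kv: kv[1])
    if gain_per_cost <= 0:
        break
    selected.append(best_feature)
    best_acc = cv_accuracy(selected)
    print(f"added {best_feature:18s} accuracy={best_acc:.3f}")
print("selected features:", selected)
```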
A design space of visualization tasks.
Schulz, Hans-Jörg; Nocke, Thomas; Heitzler, Magnus; Schumann, Heidrun
2013-12-01
Knowledge about visualization tasks plays an important role in choosing or building suitable visual representations to pursue them. Yet, tasks are a multi-faceted concept and it is thus not surprising that the many existing task taxonomies and models all describe different aspects of tasks, depending on what these task descriptions aim to capture. This results in a clear need to bring these different aspects together under the common hood of a general design space of visualization tasks, which we propose in this paper. Our design space consists of five design dimensions that characterize the main aspects of tasks and that have so far been distributed across different task descriptions. We exemplify its concrete use by applying our design space in the domain of climate impact research. To this end, we propose interfaces to our design space for different user roles (developers, authors, and end users) that allow users of different levels of expertise to work with it.
EV Charging Algorithm Implementation with User Price Preference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Bin; Hu, Boyang; Qiu, Charlie
2015-02-17
In this paper, we propose and implement a smart Electric Vehicle (EV) charging algorithm to control EV charging infrastructure according to users’ price preferences. EVSE (Electric Vehicle Supply Equipment), equipped with bidirectional communication devices and smart meters, can be remotely monitored by the proposed charging algorithm applied to the EV control center and a mobile app. On the server side, an ARIMA model is utilized to fit historical charging load data and perform day-ahead prediction. A pricing strategy with an energy bidding policy is proposed and implemented to generate a charging price list that is broadcast to EV users through the mobile app. On the user side, EV drivers can submit their price preferences and daily travel schedules to negotiate with the control center to consume the expected energy and minimize charging cost simultaneously. The proposed algorithm is tested and validated through experimental implementations in UCLA parking lots.
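The server-side day-ahead prediction can be sketched with statsmodels' ARIMA on an hourly charging-load series; the synthetic load profile and the ARIMA order below are assumptions rather than values from the deployed system.

```python
# Sketch of the server-side step: fit an ARIMA model to historical hourly
# charging load and produce a day-ahead (24-step) forecast, which the pricing
# module would then turn into a price list for the mobile app.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
hours = pd.date_range("2015-01-01", periods=14 * 24, freq="h")
daily_shape = 20 + 15 * np.sin(2 * np.pi * (hours.hour - 18) / 24)   # evening peak (kW), assumed
load = pd.Series(daily_shape + rng.normal(0, 2, len(hours)), index=hours)

model = ARIMA(load, order=(2, 0, 1))        # order chosen for illustration only
fit = model.fit()
day_ahead = fit.forecast(steps=24)          # next 24 hourly load predictions (kW)
print(day_ahead.round(1).head())
```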
NASA Technical Reports Server (NTRS)
Giles, G. L.; Wallas, M.
1981-01-01
User documentation is presented for a computer program which considers the nonlinear properties of the strain isolator pad (SIP) in the static stress analysis of the shuttle thermal protection system. This program is generalized to handle an arbitrary SIP footprint including cutouts for instrumentation and filler bar. Multiple SIP surfaces are defined to model tiles in unique locations such as leading edges, intersections, and penetrations. The nonlinearity of the SIP is characterized by experimental stress displacement data for both normal and shear behavior. Stresses in the SIP are calculated using a Newton iteration procedure to determine the six rigid body displacements of the tile which develop reaction forces in the SIP to equilibrate the externally applied loads. This user documentation gives an overview of the analysis capabilities, a detailed description of required input data and an example to illustrate use of the program.
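The core numerical step, finding tile rigid-body displacements at which the nonlinear SIP reactions balance the externally applied loads, can be illustrated with a small root solve. The sketch below uses only two degrees of freedom and assumed polynomial stiffening curves in place of the experimental SIP stress-displacement data, and scipy's solver stands in for the program's six-degree-of-freedom Newton iteration.

```python
# Sketch of the equilibrium solve: find tile displacements at which the
# nonlinear SIP reaction forces balance the externally applied loads.
# Only normal and shear DOFs are used, and the hardening reaction curves
# are assumed placeholders for the experimental SIP data.
import numpy as np
from scipy.optimize import fsolve

def sip_reaction(disp):
    """Nonlinear pad reactions (N) for normal and shear displacement (m)."""
    d_n, d_s = disp
    f_n = 4.0e5 * d_n + 3.0e9 * d_n**3      # stiffening normal behavior (assumed)
    f_s = 1.5e5 * d_s + 1.0e9 * d_s**3      # stiffening shear behavior (assumed)
    return np.array([f_n, f_s])

applied = np.array([800.0, 250.0])          # external normal and shear load (N), assumed

def residual(disp):
    return sip_reaction(disp) - applied

disp = fsolve(residual, np.zeros(2))
print("equilibrium displacements (mm):", (1e3 * disp).round(3))
print("residual force (N):", residual(disp).round(6))
```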
NETL CO2 Storage prospeCtive Resource Estimation Excel aNalysis (CO2-SCREEN) User's Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanguinito, Sean M.; Goodman, Angela; Levine, Jonathan
This user’s manual guides the use of the National Energy Technology Laboratory’s (NETL) CO2 Storage prospeCtive Resource Estimation Excel aNalysis (CO2-SCREEN) tool, which was developed to aid users screening saline formations for prospective CO2 storage resources. CO2-SCREEN applies U.S. Department of Energy (DOE) methods and equations for estimating prospective CO2 storage resources for saline formations. CO2-SCREEN was developed to be substantive and user-friendly. It also provides a consistent method for calculating prospective CO2 storage resources that allows for consistent comparison of results between different research efforts, such as the Regional Carbon Sequestration Partnerships (RCSP). CO2-SCREEN consists of an Excel spreadsheet containing geologic inputs and outputs, linked to a GoldSim Player model that calculates prospective CO2 storage resources via Monte Carlo simulation.
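The underlying DOE-style volumetric estimate multiplies formation area, thickness, porosity and CO2 density by a storage efficiency factor, sampled by Monte Carlo. The sketch below reproduces that arithmetic with hypothetical input ranges; the real tool takes its inputs from the spreadsheet and runs the sampling through its GoldSim Player model.

```python
# Monte Carlo sketch of a DOE-style volumetric estimate in the spirit of
# CO2-SCREEN: G_CO2 = A * h * phi * rho_CO2 * E. The input ranges below are
# hypothetical placeholders, not values for any real formation.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

area_m2 = rng.uniform(900.0, 1100.0, n) * 1.0e6        # formation area: 900-1100 km^2
thickness_m = rng.triangular(20.0, 50.0, 90.0, n)      # net thickness (m)
porosity = rng.uniform(0.10, 0.20, n)                  # effective porosity (-)
rho_co2 = rng.uniform(600.0, 750.0, n)                 # CO2 density at depth (kg/m^3)
efficiency = rng.lognormal(np.log(0.024), 0.4, n)      # saline storage efficiency (-)

mass_kg = area_m2 * thickness_m * porosity * rho_co2 * efficiency
mass_mt = mass_kg / 1.0e9                              # kg -> million tonnes (Mt)

p10, p50, p90 = np.percentile(mass_mt, [10, 50, 90])
print(f"prospective storage resource: P10={p10:.0f} Mt, P50={p50:.0f} Mt, P90={p90:.0f} Mt")
```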
Determining prescription durations based on the parametric waiting time distribution.
Støvring, Henrik; Pottegård, Anton; Hallas, Jesper
2016-12-01
The purpose of the study is to develop a method to estimate the duration of single prescriptions in pharmacoepidemiological studies when the single prescription duration is not available. We developed an estimation algorithm based on maximum likelihood estimation of a parametric two-component mixture model for the waiting time distribution (WTD). The distribution component for prevalent users estimates the forward recurrence density (FRD), which is related to the distribution of time between subsequent prescription redemptions, the inter-arrival density (IAD), for users in continued treatment. We exploited this to estimate percentiles of the IAD by inversion of the estimated FRD and defined the duration of a prescription as the time within which 80% of current users will have presented themselves again. Statistical properties were examined in simulation studies, and the method was applied to empirical data for four model drugs: non-steroidal anti-inflammatory drugs (NSAIDs), warfarin, bendroflumethiazide, and levothyroxine. Simulation studies found negligible bias when the data-generating model for the IAD coincided with the FRD used in the WTD estimation (Log-Normal). When the IAD consisted of a mixture of two Log-Normal distributions, but was analyzed with a single Log-Normal distribution, relative bias did not exceed 9%. Using a Log-Normal FRD, we estimated prescription durations of 117, 91, 137, and 118 days for NSAIDs, warfarin, bendroflumethiazide, and levothyroxine, respectively. Similar results were found with a Weibull FRD. The algorithm allows valid estimation of single prescription durations, especially when the WTD reliably separates current users from incident users, and may replace ad-hoc decision rules in automated implementations. Copyright © 2016 John Wiley & Sons, Ltd.
Ionospheric error contribution to GNSS single-frequency navigation at the 2014 solar maximum
NASA Astrophysics Data System (ADS)
Orus Perez, Raul
2017-04-01
For single-frequency users of global navigation satellite systems (GNSS), one of the main error contributors is the ionospheric delay, which impacts the received signals. As is well known, GPS and Galileo transmit global models to correct the ionospheric delay, while the International GNSS Service (IGS) computes precise post-process global ionospheric maps (GIM) that are considered reference ionospheres. Moreover, accurate ionospheric maps have been recently introduced, which allow for the fast convergence of real-time precise point positioning (PPP) globally. Therefore, testing of the ionospheric models is a key issue for code-based single-frequency users, which constitute the main user segment. The testing proposed in this paper is straightforward and uses PPP modeling applied to single- and dual-frequency code observations worldwide for 2014. The usage of PPP modeling allows us to quantify, for dual-frequency users, the degradation of the navigation solutions caused by noise and multipath with respect to the different ionospheric modeling solutions, and allows us, in turn, to obtain an independent assessment of the ionospheric models. Compared to the dual-frequency solutions, the GPS and Galileo ionospheric models present worse global performance, with horizontal root mean square (RMS) differences of 1.04 and 0.49 m and vertical RMS differences of 0.83 and 0.40 m, respectively. While very precise global ionospheric models can improve the dual-frequency solution globally, resulting in a horizontal RMS difference of 0.60 m and a vertical RMS difference of 0.74 m, they exhibit a strong dependence on the geographical location and ionospheric activity.
Measuring user experience in digital gaming: theoretical and methodological issues
NASA Astrophysics Data System (ADS)
Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte
2007-01-01
There are innumerable concepts, terms and definitions for user experience. Few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to creating both theoretically and methodologically sound methods for quantification of the rich user experience in different digital environments. Our approach is based on the idea that the experience received from content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We present our two basic measurement frameworks, which have been developed and tested with a large data set (n=2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, the contents for specific viewing devices or viewing situations.
A resource facility for kinetic analysis: modeling using the SAAM computer programs.
Foster, D M; Boston, R C; Jacquez, J A; Zech, L
1989-01-01
Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.
User document for computer programs for ring-stiffened shells of revolution
NASA Technical Reports Server (NTRS)
Cohen, G. A.
1973-01-01
A user manual and related program documentation are presented for six compatible computer programs for structural analysis of axisymmetric shell structures. The programs apply to a common structural model but analyze different modes of structural response. In particular, they are: (1) Linear static response under asymmetric loads; (2) Buckling of linear states under asymmetric loads; (3) Nonlinear static response under axisymmetric loads; (4) Buckling of nonlinear states under axisymmetric loads; (5) Imperfection sensitivity of buckling modes under axisymmetric loads; and (6) Vibrations about nonlinear states under axisymmetric loads. These programs treat branched shells of revolution with an arbitrary arrangement of a large number of open branches but with at most one closed branch.
Server-Controlled Identity-Based Authenticated Key Exchange
NASA Astrophysics Data System (ADS)
Guo, Hua; Mu, Yi; Zhang, Xiyong; Li, Zhoujun
We present a threshold identity-based authenticated key exchange protocol that can be applied to an authenticated server-controlled gateway-user key exchange. The objective is to allow a user and a gateway to establish a shared session key with the permission of the back-end servers, while the back-end servers cannot obtain any information about the established session key. Our protocol has potential applications in strong access control of confidential resources. In particular, our protocol possesses the semantic security and demonstrates several highly-desirable security properties such as key privacy and transparency. We prove the security of the protocol based on the Bilinear Diffie-Hellman assumption in the random oracle model.
Heuristics for Relevancy Ranking of Earth Dataset Search Results
NASA Astrophysics Data System (ADS)
Lynnes, C.; Quinn, P.; Norton, J.
2016-12-01
As the variety of Earth science datasets increases, science researchers find it more challenging to discover and select the datasets that best fit their needs. The most common way for search providers to address this problem is to rank the datasets returned for a query by their likely relevance to the user. Large web page search engines typically use text matching supplemented with reverse link counts, semantic annotations and user intent modeling. However, this produces uneven results when applied to dataset metadata records simply externalized as a web page. Fortunately, data and search providers have decades of experience in serving data user communities, allowing them to form heuristics that leverage the structure in the metadata together with knowledge about the user community. Some of these heuristics include specific ways of matching the user input to the essential measurements in the dataset and determining overlaps of time range and spatial areas. Heuristics based on the novelty of the datasets can prioritize later, better versions of data over similar predecessors. And knowledge of how different user types and communities use data can be brought to bear in cases where characteristics of the user (discipline, expertise) or their intent (applications, research) can be divined. The Earth Observing System Data and Information System has begun implementing some of these heuristics in the relevancy algorithm of its Common Metadata Repository search engine.
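A hedged sketch of the kind of heuristic relevance scoring described above; the weights, field names and score terms are illustrative assumptions, not the actual Common Metadata Repository algorithm.

```python
# Toy relevance score combining keyword match, temporal overlap and "novelty".
from datetime import date

def temporal_overlap(q_start, q_end, d_start, d_end):
    """Fraction of the query time range covered by the dataset."""
    overlap = (min(q_end, d_end) - max(q_start, d_start)).days
    span = (q_end - q_start).days
    return max(0.0, overlap) / max(span, 1)

def relevance(dataset, query):
    score = 0.0
    # Keyword match against the dataset's essential measurements.
    terms = set(query["keywords"])
    score += 2.0 * len(terms & set(dataset["measurements"])) / max(len(terms), 1)
    # Temporal overlap with the query window.
    score += temporal_overlap(query["start"], query["end"],
                              dataset["start"], dataset["end"])
    # Prefer later versions of otherwise similar datasets ("novelty" heuristic).
    score += 0.1 * dataset.get("version", 1)
    return score

query = {"keywords": ["precipitation", "rate"], "start": date(2015, 1, 1),
         "end": date(2015, 12, 31)}
ds = {"measurements": ["precipitation"], "start": date(2014, 1, 1),
      "end": date(2016, 1, 1), "version": 7}
print(round(relevance(ds, query), 2))
```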
Heuristics for Relevancy Ranking of Earth Dataset Search Results
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Quinn, Patrick; Norton, James
2016-01-01
As the variety of Earth science datasets increases, science researchers find it more challenging to discover and select the datasets that best fit their needs. The most common way for search providers to address this problem is to rank the datasets returned for a query by their likely relevance to the user. Large web page search engines typically use text matching supplemented with reverse link counts, semantic annotations and user intent modeling. However, this produces uneven results when applied to dataset metadata records simply externalized as a web page. Fortunately, data and search providers have decades of experience in serving data user communities, allowing them to form heuristics that leverage the structure in the metadata together with knowledge about the user community. Some of these heuristics include specific ways of matching the user input to the essential measurements in the dataset and determining overlaps of time range and spatial areas. Heuristics based on the novelty of the datasets can prioritize later, better versions of data over similar predecessors. And knowledge of how different user types and communities use data can be brought to bear in cases where characteristics of the user (discipline, expertise) or their intent (applications, research) can be divined. The Earth Observing System Data and Information System has begun implementing some of these heuristics in the relevancy algorithm of its Common Metadata Repository search engine.
Relevancy Ranking of Satellite Dataset Search Results
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Quinn, Patrick; Norton, James
2017-01-01
As the variety of Earth science datasets increases, science researchers find it more challenging to discover and select the datasets that best fit their needs. The most common way for search providers to address this problem is to rank the datasets returned for a query by their likely relevance to the user. Large web page search engines typically use text matching supplemented with reverse link counts, semantic annotations and user intent modeling. However, this produces uneven results when applied to dataset metadata records simply externalized as a web page. Fortunately, data and search providers have decades of experience in serving data user communities, allowing them to form heuristics that leverage the structure in the metadata together with knowledge about the user community. Some of these heuristics include specific ways of matching the user input to the essential measurements in the dataset and determining overlaps of time range and spatial areas. Heuristics based on the novelty of the datasets can prioritize later, better versions of data over similar predecessors. And knowledge of how different user types and communities use data can be brought to bear in cases where characteristics of the user (discipline, expertise) or their intent (applications, research) can be divined. The Earth Observing System Data and Information System has begun implementing some of these heuristics in the relevancy algorithm of its Common Metadata Repository search engine.
New generation of exploration tools: interactive modeling software and microcomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krajewski, S.A.
1986-08-01
Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.
Strudwick, Gillian
2015-05-01
The benefits of healthcare technologies can only be attained if nurses accept and intend to fully use them. One of the most common models utilized to understand user acceptance of technology is the Technology Acceptance Model. This model and modified versions of it have only recently been applied in the healthcare literature among nurse participants. An integrative literature review was conducted on this topic. Ovid/MEDLINE, PubMed, Google Scholar, and CINAHL were searched yielding a total of 982 references. Upon eliminating duplicates and applying the inclusion and exclusion criteria, the review included a total of four dissertations, three symposium proceedings, and 13 peer-reviewed journal articles. These documents were appraised and reviewed. The results show that a modified Technology Acceptance Model with added variables could provide a better explanation of nurses' acceptance of healthcare technology. These added variables to modified versions of the Technology Acceptance Model are discussed, and the studies' methodologies are critiqued. Limitations of the studies included in the integrative review are also examined.
Revell, Kirsten M A; Stanton, Neville A
2016-11-01
Householders' behaviour with their home heating systems is a considerable contributor to domestic energy consumption. To create a design specification for the 'scaffolding' needed for sustainable behaviour with home heating controls, Norman's (1986) Gulf of Execution and Evaluation was applied to the home heating system. A Home Heating Design Model (DM) was produced with a home heating expert. Norman's (1986) 7 Stages of Activity were considered to derive a Compatible User Mental Model (CUMM) of a typical Heating System. Considerable variation in the concepts needed at each stage was found. Elements that could be derived from the DM supported stages relating to action specification, execution, perception and interpretation, but many are not communicated in the design of typical heating controls. Stages relating to goals, intentions and evaluation required concepts beyond the DM. A systems view that tackles design for sustainable behaviour from a variety of levels is needed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Sensor-Based Optimization Model for Air Quality Improvement in Home IoT
Kim, Jonghyuk
2018-01-01
We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that the present introduction of a practical marketing application method will contribute to enhancing the expansion of the home IoT market. PMID:29570684
Sensor-Based Optimization Model for Air Quality Improvement in Home IoT.
Kim, Jonghyuk; Hwangbo, Hyunwoo
2018-03-23
We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that the present introduction of a practical marketing application method will contribute to enhancing the expansion of the home IoT market.
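The spline step mentioned in both records can be illustrated with SciPy's natural cubic spline; this is a minimal sketch with made-up sensor readings, not data or code from the study.

```python
# Minimal sketch: natural cubic spline interpolation of sparse air-quality readings.
import numpy as np
from scipy.interpolate import CubicSpline

hours = np.array([0, 3, 6, 9, 12, 15, 18, 21], dtype=float)
pm25 = np.array([12.0, 15.0, 30.0, 42.0, 35.0, 28.0, 20.0, 14.0])  # ug/m3 (made up)

# bc_type="natural" gives zero second derivative at the ends (a natural spline).
spline = CubicSpline(hours, pm25, bc_type="natural")

fine_t = np.linspace(0, 21, 64)
smoothed = spline(fine_t)
print(float(smoothed.max()), float(fine_t[smoothed.argmax()]))  # interpolated peak
```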
Comparison of LiST measles mortality model and WHO/IVB measles model.
Chen, Wei-Ju
2011-04-13
The Lives Saved Tool (LiST) has been developed to estimate the impact of health interventions and can consider multiple interventions simultaneously. Given its increasing usage by donor organizations and national program planners, we compare the LiST measles model to the widely used World Health Organization's Department of Immunization, Vaccines and Biologicals (WHO/IVB) measles model, which is used to produce estimates serving as a major indicator for monitoring country measles epidemics and the progress of measles control. We analyzed the WHO/IVB models and the LiST measles model and identified the components and assumptions held in each model. We contrasted the important components, and compared results from the two models by applying historical measles containing vaccine (MCV) coverages and the default values of all parameters set in the models. We also conducted analyses following a hypothetical scenario to understand how both models performed when the proportion of the population protected by MCV declined to zero percent in a short time period. The WHO/IVB measles model and the LiST measles model structures differ: the former is a mixed model which applies surveillance data adjusted for reporting completeness for countries with good disease surveillance systems and applies a natural history model for countries with poorer disease control programs and surveillance systems, and the latter is a cohort model incorporating country-specific cause-of-death (CoD) profiles among children under five. The trends of the estimates from the two models are similar, but the estimates for the first year differ in most of the countries included in the analysis. The two models are comparable if we adjust the measles CoD in the LiST to produce the same baseline estimates. In addition, we used the models to estimate the potential impact of stopping measles vaccination over a 7-year period. The WHO/IVB model produced estimates similar to the LiST model with adjusted CoD, but the LiST model produced low estimates for countries with very low or eliminated measles infection that may be inappropriate. The study presents methodological and quantitative comparisons between the WHO/IVB and the LiST measles models that highlight differences in model structures and may help users to better interpret and contrast estimates of measles deaths from the two models. The major differences result from the use of the case-fatality rate (CFR) in the WHO/IVB model and the CoD profile in the LiST. Both models have their own advantages and limitations. Users should be aware of this issue and apply country parameters that are as up to date as possible. Advanced models are expected to validate these policy-planning tools in the future.
Comparison of LiST measles mortality model and WHO/IVB measles model
2011-01-01
Background The Lives Saved Tool (LiST) has been developed to estimate the impact of health interventions and can consider multiple interventions simultaneously. Given its increasing usage by donor organizations and national program planners, we compare the LiST measles model to the widely used World Health Organization's Department of Immunization, Vaccines and Biologicals (WHO/IVB) measles model, which is used to produce estimates serving as a major indicator for monitoring country measles epidemics and the progress of measles control. Methods We analyzed the WHO/IVB models and the LiST measles model and identified the components and assumptions held in each model. We contrasted the important components, and compared results from the two models by applying historical measles containing vaccine (MCV) coverages and the default values of all parameters set in the models. We also conducted analyses following a hypothetical scenario to understand how both models performed when the proportion of the population protected by MCV declined to zero percent in a short time period. Results The WHO/IVB measles model and the LiST measles model structures differ: the former is a mixed model which applies surveillance data adjusted for reporting completeness for countries with good disease surveillance systems and applies a natural history model for countries with poorer disease control programs and surveillance systems, and the latter is a cohort model incorporating country-specific cause-of-death (CoD) profiles among children under five. The trends of the estimates from the two models are similar, but the estimates for the first year differ in most of the countries included in the analysis. The two models are comparable if we adjust the measles CoD in the LiST to produce the same baseline estimates. In addition, we used the models to estimate the potential impact of stopping measles vaccination over a 7-year period. The WHO/IVB model produced estimates similar to the LiST model with adjusted CoD, but the LiST model produced low estimates for countries with very low or eliminated measles infection that may be inappropriate. Conclusions The study presents methodological and quantitative comparisons between the WHO/IVB and the LiST measles models that highlight differences in model structures and may help users to better interpret and contrast estimates of measles deaths from the two models. The major differences result from the use of the case-fatality rate (CFR) in the WHO/IVB model and the CoD profile in the LiST. Both models have their own advantages and limitations. Users should be aware of this issue and apply country parameters that are as up to date as possible. Advanced models are expected to validate these policy-planning tools in the future. PMID:21501452
Play-Personas: Behaviours and Belief Systems in User-Centred Game Design
NASA Astrophysics Data System (ADS)
Canossa, Alessandro; Drachen, Anders
Game designers attempt to ignite affective, emotional responses from players by engineering game designs to incite definite user experiences. Theories of emotion state that definite emotional responses are individual, and caused by the individual interaction sequence or history. Engendering desired emotions in the audience of traditional audiovisual media is a considerable challenge; however, it is potentially even more difficult to achieve the same goal for the audience of interactive entertainment, because a substantial degree of control rests in the hands of the end user rather than the designer. This paper presents a possible solution to the challenge of integrating the user in the design of interactive entertainment such as computer games by employing the "persona" framework introduced by Alan Cooper. This approach is already in use in interaction design. The method can be improved by complementing the traditional narrative description of personas with quantitative, data-oriented models of predicted patterns of user behaviour for a specific computer game. Additionally, persona constructs can be applied both as design-oriented metaphors during the development of games, and as analytical lenses to existing games, e.g. for evaluation of patterns of player behaviour.
Neighborhood Influences on Vehicle-Pedestrian Crash Severity.
Toran Pour, Alireza; Moridpour, Sara; Tay, Richard; Rajabifard, Abbas
2017-12-01
Socioeconomic factors are known to be contributing factors for vehicle-pedestrian crashes. Although several studies have examined the socioeconomic factors related to the location of the crashes, limited studies have considered the socioeconomic factors of the neighborhood where the road users live in vehicle-pedestrian crash modelling. This research aims to identify the socioeconomic factors, related to both the neighborhoods where the road users live and where crashes occur, that have an influence on vehicle-pedestrian crash severity. Data on vehicle-pedestrian crashes that occurred at mid-blocks in Melbourne, Australia, were analyzed. Neighborhood factors associated with road users' residences and crash locations were investigated using boosted regression trees (BRT). Furthermore, partial dependence plots were applied to illustrate the interactions between these factors. We found that socioeconomic factors accounted for 60% of the 20 top contributing factors to vehicle-pedestrian crashes. This research reveals that socioeconomic factors of the neighborhoods where the road users live and where the crashes occur are important in determining the severity of the crashes, with the former having a greater influence. Hence, road safety countermeasures, especially those focussing on the road users, should be targeted at these high-risk neighborhoods.
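A small sketch of the modelling approach described above, using scikit-learn's gradient boosting as a stand-in for the BRT package used in the study and computing partial dependence for one feature; the features and data are hypothetical.

```python
# Illustrative boosted-tree severity model with partial dependence (not the study's code).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import partial_dependence

rng = np.random.default_rng(1)
n = 1000
X = np.column_stack([
    rng.normal(50, 15, n),    # median income of road user's neighbourhood (k$)
    rng.uniform(0, 1, n),     # unemployment rate at the crash location
    rng.integers(30, 80, n),  # speed limit (km/h)
])
# Hypothetical severity label loosely driven by speed limit and income.
y = (0.03 * X[:, 2] - 0.02 * X[:, 0] + rng.normal(0, 1, n) > 0).astype(int)

model = GradientBoostingClassifier(n_estimators=200, max_depth=3).fit(X, y)

# Partial dependence of predicted severity on the speed-limit feature (index 2).
# Note: the exact return keys vary with scikit-learn version; "average" is used here.
pd_result = partial_dependence(model, X, features=[2], grid_resolution=20,
                               kind="average")
print(pd_result["average"].shape)   # averaged predicted severity over the grid
```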
Soysal, Ergin; Wang, Jingqi; Jiang, Min; Wu, Yonghui; Pakhomov, Serguei; Liu, Hongfang; Xu, Hua
2017-11-24
Existing general clinical natural language processing (NLP) systems such as MetaMap and Clinical Text Analysis and Knowledge Extraction System have been successfully applied to information extraction from clinical text. However, end users often have to customize existing systems for their individual tasks, which can require substantial NLP skills. Here we present CLAMP (Clinical Language Annotation, Modeling, and Processing), a newly developed clinical NLP toolkit that provides not only state-of-the-art NLP components, but also a user-friendly graphic user interface that can help users quickly build customized NLP pipelines for their individual applications. Our evaluation shows that the CLAMP default pipeline achieved good performance on named entity recognition and concept encoding. We also demonstrate the efficiency of the CLAMP graphic user interface in building customized, high-performance NLP pipelines with 2 use cases, extracting smoking status and lab test values. CLAMP is publicly available for research use, and we believe it is a unique asset for the clinical NLP community. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Competitive diffusion in online social networks with heterogeneous users
NASA Astrophysics Data System (ADS)
Li, Pei; He, Su; Wang, Hui; Zhang, Xin
2014-06-01
Online social networks have attracted increasing attention since they provide various approaches for hundreds of millions of people to stay connected with their friends. However, most research on diffusion dynamics in epidemiology cannot be applied directly to characterize online social networks, where users are heterogeneous and may act differently according to their standpoints. In this paper, we propose models to characterize the competitive diffusion in online social networks with heterogeneous users. We classify messages into two types (i.e., positive and negative) and users into three types (i.e., positive, negative and neutral). We estimate the positive (negative) influence of a user generating a given type of message, defined as the number of times that positive (negative) messages are processed (i.e., read) as a result of this action. We then consider the diffusion threshold, above which the corresponding influence will approach infinity, and the effect threshold, above which the unexpected influence of generating a message will exceed the expected one. We verify all these results by simulations, which show that the analysis results are perfectly consistent with the simulation results. These results are of importance in understanding the diffusion dynamics in online social networks, and are also critical for advertisers in viral marketing where there are fans, haters and neutrals.
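A toy, hedged simulation in the spirit of the abstract: positive and negative messages spread through a follower graph of heterogeneous users, and the "influence" of a seed message is counted as the number of reads it triggers. This is an agent-based illustration, not the authors' analytical model or their threshold derivations.

```python
# Toy competitive-diffusion simulation with heterogeneous users (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
n_users = 2000
# User types: +1 positive, -1 negative, 0 neutral.
user_type = rng.choice([1, -1, 0], size=n_users, p=[0.3, 0.2, 0.5])
followers = [rng.choice(n_users, size=rng.integers(1, 20), replace=False)
             for _ in range(n_users)]

def simulate(seed_user, message_sign, steps=10, base_repost=0.05):
    """Count how many times the message is read when seeded at one user."""
    active, reads = {seed_user}, 0
    for _ in range(steps):
        next_active = set()
        for u in active:
            for v in followers[u]:
                reads += 1                      # follower reads (processes) it
                # Users aligned with the message are more likely to repost it.
                p = base_repost * (2.0 if user_type[v] == message_sign else 1.0)
                if user_type[v] != -message_sign and rng.random() < p:
                    next_active.add(int(v))
        active = next_active
        if not active:
            break
    return reads

print("positive influence:", simulate(0, +1))
print("negative influence:", simulate(0, -1))
```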
Estimation of the Driving Style Based on the Users' Activity and Environment Influence.
Sysoev, Mikhail; Kos, Andrej; Guna, Jože; Pogačnik, Matevž
2017-10-21
New models and methods have been designed to predict the influence of the user's environment and activity information on the driving style in standard automotive environments. For these purposes, an experiment was conducted providing two types of analysis: (i) the evaluation of a self-assessment of the driving style; (ii) the prediction of an aggressive driving style based on drivers' activity and environment parameters. Sixty-seven hours of driving data from 10 drivers were collected for analysis in this study. The new parameters used in the experiment are the manner of car door opening and closing, which were applied to improve the prediction accuracy. An Android application called Sensoric was developed to collect low-level smartphone data about the users' activity. The driving style was predicted from the user's environment and activity data collected before driving. The prediction was tested against the actual driving style, calculated from objective driving data. The prediction has shown encouraging results, with precision values ranging from 0.727 up to 0.909 for the aggressive driving recognition rate. The obtained results lend support to the hypothesis that users' environment and activity data could be used to predict an aggressive driving style in advance, before driving starts.
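A hedged sketch of the prediction task described above: a classifier trained on pre-drive context features to flag an aggressive driving style, evaluated with precision. The feature set and data are invented stand-ins for the environment and activity parameters used in the study.

```python
# Illustrative pre-drive context classifier (not the study's features or data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 600
X = np.column_stack([
    rng.uniform(0, 24, n),        # hour of day
    rng.integers(0, 2, n),        # rush-hour flag
    rng.normal(3.0, 1.0, n),      # door-closing intensity (arbitrary units)
    rng.normal(6000, 2000, n),    # steps recorded before the drive
])
y = ((X[:, 2] > 3.5) & (X[:, 1] == 1)).astype(int)  # hypothetical "aggressive" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("precision:", round(precision_score(y_te, clf.predict(X_te), zero_division=0), 3))
```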
Human-computer interface including haptically controlled interactions
Anderson, Thomas G.
2005-10-11
The present invention provides a method of human-computer interfacing that provides haptic feedback to control interface interactions such as scrolling or zooming within an application. Haptic feedback in the present method allows the user more intuitive control of the interface interactions, and allows the user's visual focus to remain on the application. The method comprises providing a control domain within which the user can control interactions. For example, a haptic boundary can be provided corresponding to scrollable or scalable portions of the application domain. The user can position a cursor near such a boundary, feeling its presence haptically (reducing the requirement for visual attention for control of scrolling of the display). The user can then apply force relative to the boundary, causing the interface to scroll the domain. The rate of scrolling can be related to the magnitude of applied force, providing the user with additional intuitive, non-visual control of scrolling.
NASA Astrophysics Data System (ADS)
Retscher, G.
2017-09-01
Positioning of mobile users in indoor environments with Wireless Fidelity (Wi-Fi) has become very popular, with location fingerprinting and trilateration the most commonly employed methods. In both, the received signal strength (RSS) of the surrounding access points (APs) is scanned and used to estimate the user's position. Within the scope of this study, the advantageous qualities of both methods are identified and selected to benefit from their combination. By fusing these techniques, a higher performance for Wi-Fi positioning is achievable. For that purpose, a novel approach based on the well-known Differential GPS (DGPS) principle of operation is developed and applied. This approach for user localization and tracking is termed Differential Wi-Fi (DWi-Fi) by analogy with DGPS. From reference stations deployed in the area of interest, differential measurement corrections are derived and applied at the mobile user side. Hence, range or coordinate corrections can be estimated from a network of reference station observations, as is done in common CORS GNSS networks. A low-cost realization with Raspberry Pi units is employed for these reference stations. These units serve at the same time as APs broadcasting Wi-Fi signals and as reference stations scanning the receivable Wi-Fi signals of the surrounding APs. As the RSS measurements are carried out continuously at the reference stations, dynamically changing maps of RSS distributions, so-called radio maps, are derived. As in location fingerprinting, these radio maps represent the RSS fingerprints at certain locations. From the areal modelling of the correction parameters, in combination with the dynamically updated radio maps, the location of the user can be estimated in real time. The novel approach is presented and its performance demonstrated in this paper.
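A conceptual sketch of the DWi-Fi idea under a simple log-distance path-loss assumption: a reference station at a known distance from an AP turns the deviation between its expected and measured RSS into a correction, which the mobile user applies before converting RSS to range. All parameter values are assumptions for illustration.

```python
# Conceptual DWi-Fi-style correction sketch (assumed path-loss model and values).
import math

TX_POWER_DBM = -40.0   # assumed RSS at 1 m
PATH_LOSS_EXP = 3.0    # assumed indoor path-loss exponent

def expected_rss(distance_m):
    return TX_POWER_DBM - 10.0 * PATH_LOSS_EXP * math.log10(max(distance_m, 0.1))

def rss_to_range(rss_dbm):
    return 10.0 ** ((TX_POWER_DBM - rss_dbm) / (10.0 * PATH_LOSS_EXP))

# Reference station 5 m from the AP measures -57 dBm instead of the expected value:
ref_distance = 5.0
ref_measured = -57.0
correction = expected_rss(ref_distance) - ref_measured   # dB, analogous to a DGPS correction

# The mobile user applies the same correction to its own measurement of that AP.
rover_measured = -66.0
range_raw = rss_to_range(rover_measured)
range_corrected = rss_to_range(rover_measured + correction)
print(round(range_raw, 2), round(range_corrected, 2))
```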
Ravi, Logesh; Vairavasundaram, Subramaniyaswamy
2016-01-01
The rapid growth of the web and its applications has made recommender systems critically important. Applied in various domains, recommender systems are designed to generate suggestions of items or services based on user interests. However, recommender systems experience many issues that reduce their effectiveness. Integrating powerful data management techniques into recommender systems can address such issues, and the quality of recommendations can be increased significantly. Recent research on recommender systems reveals the idea of utilizing social network data to enhance traditional recommender systems with better prediction and improved accuracy. This paper surveys recommender systems based on social network data by considering the recommendation algorithms used, system functionalities, types of interfaces, filtering techniques, and artificial intelligence techniques. After examining the objectives, methodologies, and data sources of the existing models in depth, the paper is intended to help anyone interested in the development of travel recommendation systems and to facilitate future research directions. We have also proposed a location recommendation system based on the social pertinent trust walker (SPTW) and compared the results with the existing baseline random walk models. Later, we enhanced the SPTW model to provide recommendations for groups of users. The results obtained from the experiments are presented. PMID:27069468
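A toy sketch of a trust-based random walk with restart over a user-user trust graph, in the spirit of the SPTW idea summarized above; the graph, trust weights and restart probability are illustrative assumptions rather than the published algorithm.

```python
# Toy trust-aware random walk with restart for location recommendation (illustrative).
import numpy as np

# Row-stochastic trust matrix: trust[i, j] = probability of stepping from i to j.
trust = np.array([
    [0.0, 0.6, 0.4, 0.0],
    [0.3, 0.0, 0.3, 0.4],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.7, 0.3, 0.0],
])
# Locations each user has visited (user x location "check-in" matrix).
visits = np.array([
    [1, 0, 0],
    [0, 1, 0],
    [1, 1, 0],
    [0, 0, 1],
])

def recommend(source, restart=0.15, iters=100):
    """Stationary visiting probabilities of a walk restarting at `source`."""
    p = np.zeros(len(trust)); p[source] = 1.0
    e = p.copy()
    for _ in range(iters):
        p = (1 - restart) * trust.T @ p + restart * e
    # Score each location by trust-weighted popularity among reachable users.
    scores = (visits.T @ p).astype(float)
    scores[visits[source] > 0] = 0.0          # drop places the user already knows
    return np.argsort(scores)[::-1]

print(recommend(0))   # ranked location indices for user 0
```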
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2000-01-01
Hospital information systems have to support quality improvement objectives. The design issues of health care information systems can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why", are poorly represented by the data model alone, this method of requirement elicitation fits the dynamics of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2001-12-01
Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.
ERIC Educational Resources Information Center
Marques, Bertil P.; Carvalho, Piedade; Escudeiro, Paula; Barata, Ana; Silva, Ana; Queiros, Sandra
2017-01-01
Promoted by the significant increase of large scale internet access, many audiences have turned to the web and to its resources for learning and inspiration, with diverse sets of skills and intents. In this context, Multimedia Online Open Courses (MOOC) consist in learning models supported on user-friendly web tools that allow anyone with minimum…
Based on user interest level of modeling scenarios and browse content
NASA Astrophysics Data System (ADS)
Zhao, Yang
2017-08-01
User interest modeling is the core of personalized service, and situational information has a clear impact on user preferences. This paper proposes a method of user interest modeling based on scenario information. A set of scenarios approximating the user's current scenario is obtained by calculating situational similarity, and a situational pre-filtering method is used to reduce the dimensionality of the three-dimensional "user - interest items - scenarios" model. Keywords for each topic of interest are extracted by analysing the content of the pages the user browses, and user interest is represented with a vector space model. The experimental results show that the scenario-based user interest model predicts user interest to within 9%, demonstrating its effectiveness.
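A hedged sketch of the two ingredients described above: cosine similarity between the current scenario and stored scenarios (the pre-filtering step), and a topic-level interest vector restricted to the most similar scenarios. The feature encodings and numbers are invented for illustration.

```python
# Illustrative scenario similarity and topic-interest aggregation (made-up encodings).
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Scenarios encoded as (time-of-day, weekday, at-home, at-work) feature vectors.
stored_scenarios = np.array([
    [0.9, 1.0, 1.0, 0.0],   # evening, weekday, at home
    [0.3, 1.0, 0.0, 1.0],   # morning, weekday, at work
    [0.8, 0.0, 1.0, 0.0],   # evening, weekend, at home
])
current = np.array([0.85, 1.0, 1.0, 0.0])

sims = np.array([cosine(current, s) for s in stored_scenarios])
similar = np.argsort(sims)[::-1][:2]          # approximate scenario set (pre-filter)

# Keyword-weighted interest per topic, restricted to the similar scenarios.
# Rows = scenarios, columns = topics (e.g. finance, sport, travel).
topic_weights = np.array([
    [0.7, 0.1, 0.2],
    [0.9, 0.0, 0.1],
    [0.2, 0.5, 0.3],
])
interest = topic_weights[similar].mean(axis=0)
print(similar, np.round(interest, 2))
```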
NASA Technical Reports Server (NTRS)
Egolf, T. A.; Landgrebe, A. J.
1982-01-01
A user's manual is provided which includes the technical approach for the Prescribed Wake Rotor Inflow and Flow Field Prediction Analysis. The analysis is used to provide the rotor wake induced velocities at the rotor blades for use in blade airloads and response analyses and to provide induced velocities at arbitrary field points such as at a tail surface. This analysis calculates the distribution of rotor wake induced velocities based on a prescribed wake model. Section operating conditions are prescribed from blade motion and controls determined by a separate blade response analysis. The analysis represents each blade by a segmented lifting line, and the rotor wake by discrete segmented trailing vortex filaments. Blade loading and circulation distributions are calculated based on blade element strip theory including the local induced velocity predicted by the numerical integration of the Biot-Savart Law applied to the vortex wake model.
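The induced-velocity calculation at the heart of such an analysis can be illustrated with the standard finite-segment Biot-Savart formula; a discrete wake model sums this contribution over all trailing vortex segments. The sketch below is generic, not code from the documented analysis.

```python
# Induced velocity of one straight vortex filament segment (standard Biot-Savart form).
import numpy as np

def segment_induced_velocity(p, a, b, gamma, core=1e-6):
    """Velocity induced at point p by a straight vortex segment from a to b."""
    r1, r2 = p - a, p - b
    r0 = b - a
    cross = np.cross(r1, r2)
    cross_sq = float(cross @ cross)
    if cross_sq < core ** 2:          # point (nearly) on the filament axis
        return np.zeros(3)
    k = gamma / (4.0 * np.pi * cross_sq) * float(
        r0 @ (r1 / np.linalg.norm(r1) - r2 / np.linalg.norm(r2)))
    return k * cross

# Example: unit-circulation segment along x, evaluation point 1 m above its centre.
a, b = np.array([-0.5, 0.0, 0.0]), np.array([0.5, 0.0, 0.0])
p = np.array([0.0, 0.0, 1.0])
print(segment_induced_velocity(p, a, b, gamma=1.0))
```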
Apply 3D model on the customized product color combination for the interior decoration
NASA Astrophysics Data System (ADS)
Chen, Cheih-Ying
2013-03-01
The customized product color interface for interior decoration is designed to simulate the display of sofas in various color combinations in interior rooms. There are 144 color combinations of the spatial image, resulting from four interior rooms and 36 popular sofa colors. An image compositing technique is adopted to display the 144 color combinations of the spatial image on the computer screen. This study tests the experience of using the interface with the Questionnaire for User Interface Satisfaction (QUIS). The results show high ratings for evaluation items including wonderful, easy, satisfying, stimulating and flexible. Therefore, entrepreneurs who want to display products in which color is a primary attribute could use the customized color combination interface with 3D models, giving consumers the opportunity to find appropriate products that fit their interior rooms and shortening communication time between entrepreneurs and consumers.
Applying User Centered Design to Research Work
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean; Love, Oriana J.; Pike, William A.
The SuperIdentity (SID) research project is a collaboration between six universities in the UK (Bath, Dundee, Kent, Leicester, Oxford, and Southampton) and the Pacific Northwest National Laboratory (PNNL). SID offers an innovative and exciting new approach to the concept of identity. The assumption underlying our hypothesis is that while there may be many dimensions to an identity - some more stable than others - all should ultimately reference back to a single core identity or a 'SuperIdentity.' The obvious consequence is that identification is improved by the combination of measures. Our work at PNNL has focused on developing use cases to use in developing a model of identity and in developing visualizations for both researchers to explore the model and, in the future, for end users to use in determining various paths that may be possible to obtain various identity attributes from a set that is already known.
NASA Astrophysics Data System (ADS)
Quinn, J. D.; Larour, E. Y.; Cheng, D. L. C.; Halkides, D. J.
2016-12-01
The Virtual Earth System Laboratory (VESL) is a Web-based tool, under development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. It contains features geared toward a range of applications, spanning research and outreach. It offers an intuitive user interface, in which model inputs are changed using sliders and other interactive components. Current capabilities include simulation of polar ice sheet responses to climate forcing, based on NASA's Ice Sheet System Model (ISSM). We believe that the visualization of data is most effective when tailored to the target audience, and that many of the best practices for modern Web design/development can be applied directly to the visualization of data: use of negative space, color schemes, typography, accessibility standards, tooltips, et cetera. We present our prototype website, and invite input from potential users, including researchers, educators, and students.
NASA Astrophysics Data System (ADS)
Camargo, F. R.; Henson, B.
2015-02-01
The notion that more or less of a physical feature affects, to different degrees, users' impressions of an underlying attribute of a product has frequently been applied in affective engineering. However, those attributes exist only as a premise that cannot be measured directly and, therefore, inferences based on their assessment are error-prone. To establish and improve measurement of latent attributes, this paper presents the concept of a stochastic framework using the Rasch model for a wide range of independent variables, referred to as an item bank. Based on an item bank, computerized adaptive testing (CAT) can be developed. A CAT system converges on a sequence of items that bracket a user's particular endorsement level, maximizing the information conveyed. It is through item banking and CAT that the financial benefits of using the Rasch model in affective engineering can be realised.
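A minimal sketch of the Rasch model and the adaptive-testing loop it enables: select the unused item whose difficulty is closest to the current trait estimate (which maximizes Fisher information under the Rasch model), record the response, and update the estimate. The item bank, update rule and simulated responses are illustrative assumptions.

```python
# Minimal Rasch-model CAT loop (illustrative item bank and crude theta update).
import numpy as np

def p_endorse(theta, b):
    """Rasch probability of endorsing an item of difficulty b at trait level theta."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def update_theta(theta, responses, difficulties, steps=20, lr=0.5):
    """Crude gradient ascent on the Rasch log-likelihood (sketch, not production)."""
    for _ in range(steps):
        grad = sum(x - p_endorse(theta, b) for x, b in zip(responses, difficulties))
        theta += lr * grad / max(len(responses), 1)
    return theta

item_bank = np.linspace(-3, 3, 25)        # item difficulties (logits)
true_theta, theta = 1.2, 0.0
asked, responses = [], []
rng = np.random.default_rng(4)

for _ in range(10):
    # Rasch item information is p(1-p): maximal when difficulty is near theta.
    remaining = [i for i in range(len(item_bank)) if i not in asked]
    nxt = min(remaining, key=lambda i: abs(item_bank[i] - theta))
    asked.append(nxt)
    responses.append(int(rng.random() < p_endorse(true_theta, item_bank[nxt])))
    theta = update_theta(theta, responses, item_bank[asked])
print(round(theta, 2), "vs true", true_theta)
```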
Between-User Reliability of Tier 1 Exposure Assessment Tools Used Under REACH.
Lamb, Judith; Galea, Karen S; Miller, Brian G; Hesse, Susanne; Van Tongeren, Martie
2017-10-01
When applying simple screening (Tier 1) tools to estimate exposure to chemicals in a given exposure situation under the Registration, Evaluation, Authorisation and restriction of CHemicals Regulation 2006 (REACH), users must select from several possible input parameters. Previous studies have suggested that results from exposure assessments using expert judgement and from the use of modelling tools can vary considerably between assessors. This study aimed to investigate the between-user reliability of Tier 1 tools. A remote-completion exercise and in person workshop were used to identify and evaluate tool parameters and factors such as user demographics that may be potentially associated with between-user variability. Participants (N = 146) generated dermal and inhalation exposure estimates (N = 4066) from specified workplace descriptions ('exposure situations') and Tier 1 tool combinations (N = 20). Interactions between users, tools, and situations were investigated and described. Systematic variation associated with individual users was minor compared with random between-user variation. Although variation was observed between choices made for the majority of input parameters, differing choices of Process Category ('PROC') code/activity descriptor and dustiness level impacted most on the resultant exposure estimates. Exposure estimates ranging over several orders of magnitude were generated for the same exposure situation by different tool users. Such unpredictable between-user variation will reduce consistency within REACH processes and could result in under-estimation or overestimation of exposure, risking worker ill-health or the implementation of unnecessary risk controls, respectively. Implementation of additional support and quality control systems for all tool users is needed to reduce between-assessor variation and so ensure both the protection of worker health and avoidance of unnecessary business risk management expenditure. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
The likelihood of sunburn in sunscreen users is disproportionate to the SPF.
Pissavini, Marc; Diffey, Brian
2013-06-01
Sunburn is a common feature in sunscreen users. The purpose of this paper is to estimate the expected frequency and magnitude of sunburn resulting from typical use of sunscreens labelled SPF15 and SPF30 by people spending long periods outdoors in strong summer sunshine. By combining the probability distribution of the measured sun protection factor (SPF) in vivo with those for the average application thickness and the uniformity of application over the skin surface, a simulation model was developed to estimate the variation in delivered protection over the exposed skin surface from consumer use of sunscreens. While either sunscreen, if delivering the nominal SPF over the entire exposed skin, would be sufficient to prevent any erythema, the simulation indicates that the combination of the average quantity applied with the variability in thickness over the skin surface will lead to erythema, especially in SPF15 sunscreen users. People who intend spending long periods outside in strong sunshine would be better advised to use SPF30 labelled sunscreens than SPF15 sunscreens, and to apply the product carefully over exposed skin if they wish to minimize their risk of sunburn and, by implication, skin cancer. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
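A hedged Monte Carlo sketch of the kind of simulation described above, using the commonly assumed exponential relation between delivered protection and application thickness (delivered SPF roughly equal to the labelled SPF raised to the power t/2 for thickness t in mg/cm², since labelled SPFs are measured at 2 mg/cm²); the thickness distribution and exposure level are illustrative, not the paper's inputs.

```python
# Illustrative Monte Carlo of delivered protection under realistic application.
import numpy as np

rng = np.random.default_rng(5)

def simulate_burn_fraction(label_spf, n_sites=100_000,
                           mean_thickness=0.8, cv=0.4, exposure_seds=30.0):
    """Fraction of skin sites receiving more than ~1 SED of erythemal exposure."""
    # Log-normal spread of thickness over the skin surface (assumption).
    sigma = np.sqrt(np.log(1 + cv ** 2))
    mu = np.log(mean_thickness) - 0.5 * sigma ** 2
    thickness = rng.lognormal(mu, sigma, n_sites)             # mg/cm^2
    # Assumed exponential dose-response: delivered SPF = label_spf ** (t / 2).
    delivered_spf = label_spf ** (thickness / 2.0)
    delivered_dose = exposure_seds / delivered_spf            # SED reaching the skin
    return float(np.mean(delivered_dose > 1.0))

for spf in (15, 30):
    print(f"SPF{spf}: ~{simulate_burn_fraction(spf):.0%} of sites above 1 SED")
```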
Land subsidence in Yunlin, Taiwan, due to Agricultural and Domestic Water Use
NASA Astrophysics Data System (ADS)
Hsu, K.; Lin, P.; Lin, Z.
2013-12-01
Subsidence in a layered aquifer is caused by excess groundwater extraction and results in complicated problems in Taiwan. Commonly, the respective responsibility of agricultural and domestic water users for subsidence is difficult to identify due to the lack of quantitative evidence. An integrated model was proposed to analyze the subsidence problem. The flow field calculation uses the analytical solution of Neuman and Witherspoon (1969) for pumping in a layered system to calculate the head drawdown variation. The subsidence estimation applies Terzaghi's (1943) one-dimensional consolidation theory to calculate the deformation in each layer. The proposed model was applied to estimate land subsidence and drawdown variation at Yuanchang Township of Yunlin County in Taiwan. Groundwater data for dry-season periods were used for calibration and validation. The seasonal effect in groundwater variation was first filtered out, and the dry-season pumping effect on land subsidence was analyzed. The results show that multi-layer pumping contributes more than single-layer pumping to the response of drawdown and land subsidence in aquifer 2, with a contribution of 97% of the total change at Yuanchang station. Pumping in aquifer 2 contributes more significantly than pumping in aquifer 3 to changes in drawdown and land subsidence in aquifer 2, with a contribution of 70% of the total change at Yuanchang station. The larger area of subsidence in Yuanchang Township was attributed to pumping from aquifer 2, while pumping from aquifer 3 results in significant subsidence near the well field. Single-layer users contribute to subsidence over the largest area, but multi-layer users generate more serious subsidence.
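A simplified sketch of the consolidation step: head drawdown raises effective stress by rho_w * g * dh, and each layer compresses by m_v * delta_sigma' * thickness in a Terzaghi-type one-dimensional calculation. The layer properties and drawdowns below are illustrative, not the Yunlin values.

```python
# Simplified drawdown-to-settlement calculation (illustrative layer properties).
RHO_W, G = 1000.0, 9.81   # kg/m^3, m/s^2

def settlement(layers):
    """layers: list of (thickness_m, m_v per kPa, drawdown_m); returns metres."""
    total = 0.0
    for thickness, mv, drawdown in layers:
        d_sigma_kpa = RHO_W * G * drawdown / 1000.0   # added effective stress, kPa
        total += mv * d_sigma_kpa * thickness
    return total

# Hypothetical two-aquifer column: (thickness, compressibility m_v, drawdown).
layers = [
    (40.0, 1.0e-4, 3.0),   # aquifer 2: thicker, more compressible, larger drawdown
    (30.0, 5.0e-5, 1.0),   # aquifer 3
]
print(f"estimated subsidence: {settlement(layers) * 100:.1f} cm")
```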
Staying True to the Core: Designing the Future Academic Library Experience
ERIC Educational Resources Information Center
Bell, Steven J.
2014-01-01
In 2014, the practice of user experience design in academic libraries continues to evolve. It is typically applied in the context of interactions with digital interfaces. Some academic librarians are applying user experience approaches more broadly to design both environments and services with human-centered strategies. As the competition for the…
Magliano, Lorenza; Puviani, Marta; Rega, Sonia; Marchesini, Nadia; Rossetti, Marisa; Starace, Fabrizio
2016-01-30
This controlled, non-randomized study explored the feasibility of introducing a Combined Individual and Group Intervention (CIGI) for users with mental disorders in residential facilities, and tested whether users who received the CIGI had better functioning than users who received Treatment-As-Usual (TAU) at two-year follow-up. In the CIGI, a structured cognitive-behavioral approach called VADO (in English, Skills Assessment and Definition of Goals) was used to set specific goals with each user, while Falloon's psychoeducational treatment was applied with the users as a group. Thirty-one professionals attended a training course in CIGI, open to users' voluntary participation, and applied it for two years with all users living in 8 residential facilities of the Mental Health Department of Modena, Italy. In the same department, 5 other residential facilities providing TAU were used as controls. ANOVA for repeated measures showed a significant interaction effect between users' functioning at baseline and follow-up assessments and the intervention. In particular, the change in global functioning was higher in the 55 CIGI users than in the 44 TAU users. These results suggest that the CIGI can be successfully introduced in residential facilities and may be useful for improving functioning in users with severe mental disorders. Copyright © 2016. Published by Elsevier Ireland Ltd.
Landlab: A numerical modeling framework for evolving Earth surfaces from mountains to the coast
NASA Astrophysics Data System (ADS)
Gasparini, N. M.; Adams, J. M.; Tucker, G. E.; Hobley, D. E. J.; Hutton, E.; Istanbulluoglu, E.; Nudurupati, S. S.
2016-02-01
Landlab is an open-source, user-friendly, component-based modeling framework for exploring the evolution of Earth's surface. Landlab itself is not a model. Instead, it is a computational framework that facilitates the development of numerical models of coupled earth surface processes. The Landlab Python library includes a gridding engine and process components, along with support functions for tasks such as reading in DEM data and input variables, setting boundary conditions, and plotting and outputting data. Each user of Landlab builds his or her own unique model. The first step in building a Landlab model is generally initializing a grid, either regular (raster) or irregular (e.g. delaunay or radial), and process components. This initialization process involves reading in relevant parameter values and data. The process components act on the grid to alter grid properties over time. For example, a component exists that can track the growth, death, and succession of vegetation over time. There are also several components that evolve surface elevation, through processes such as fluvial sediment transport and linear diffusion, among others. Users can also build their own process components, taking advantage of existing functions in Landlab such as those that identify grid connectivity and calculate gradients and flux divergence. The general nature of the framework makes it applicable to diverse environments - from bedrock rivers to a pile of sand - and processes acting over a range of spatial and temporal scales. In this poster we illustrate how a user builds a model using Landlab and propose a number of ways in which Landlab can be applied in coastal environments - from dune migration to channelization of barrier islands. We seek input from the coastal community as to how the process component library can be expanded to explore the diverse phenomena that act to shape coastal environments.
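A minimal sketch of the initialize-and-run pattern described above, assuming the Landlab 2.x API (RasterModelGrid plus the LinearDiffuser component); the grid size, diffusivity and time step are arbitrary.

```python
# Minimal Landlab-style model, assuming the Landlab 2.x API.
import numpy as np
from landlab import RasterModelGrid
from landlab.components import LinearDiffuser

# 1) Initialize a regular grid and attach an elevation field at the nodes.
grid = RasterModelGrid((25, 40), xy_spacing=10.0)          # 25 x 40 nodes, 10 m apart
z = grid.add_zeros("topographic__elevation", at="node")
z += np.random.default_rng(6).uniform(0.0, 1.0, z.size)    # small initial roughness

# 2) Initialize a process component that will act on the grid.
diffuser = LinearDiffuser(grid, linear_diffusivity=0.01)   # m^2/yr, hillslope diffusion

# 3) Advance the model through time by repeatedly calling run_one_step.
dt = 100.0   # years
for _ in range(500):
    diffuser.run_one_step(dt)

print(float(z.mean()), float(z.std()))   # initial roughness is smoothed by diffusion
```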
M-Split: A Graphical User Interface to Analyze Multilayered Anisotropy from Shear Wave Splitting
NASA Astrophysics Data System (ADS)
Abgarmi, Bizhan; Ozacar, A. Arda
2017-04-01
Shear wave splitting analyses are commonly used to infer deep anisotropic structure. For simple cases, delay times and fast-axis orientations obtained from reliable results are averaged to define anisotropy beneath recording seismic stations. However, splitting parameters show systematic variations with back azimuth in the presence of complex anisotropy and cannot be represented by an average time delay and fast-axis orientation. Previous researchers have identified anisotropic complexities in different tectonic settings and applied various approaches to model them. Most commonly, such complexities are modeled using multiple anisotropic layers with a priori constraints from geologic data. In this study, a graphical user interface called M-Split is developed to easily process and model multilayered anisotropy, with capabilities to properly address the inherited non-uniqueness. The M-Split program runs user-defined grid searches through the model parameter space for two-layer anisotropy using the formulation of Silver and Savage (1994) and creates sensitivity contour plots to locate local maxima and analyze all possible models with parameter tradeoffs. In order to minimize model ambiguity and identify the robust model parameters, various misfit calculation procedures are also developed and embedded in M-Split, which can be used depending on the quality of the observations and their back-azimuthal coverage. Case studies were carried out to evaluate the reliability of the program using real, noisy data; for this purpose, stations from two different networks were utilized. The first seismic network is the Kandilli Observatory and Earthquake Research Institute (KOERI), which includes long-running permanent stations, and the second comprises seismic stations deployed temporarily as part of the "Continental Dynamics-Central Anatolian Tectonics (CD-CAT)" project funded by NSF. It is also worth noting that M-Split is designed as an open-source program which can be modified by users for additional capabilities or other applications.
Use Of REX Control System For The Ball On Spool Model
NASA Astrophysics Data System (ADS)
Ožana, Štěpán; Pieš, Martin; Hájovský, Radovan; Dočekal, Tomáš
2015-07-01
This paper deals with the design and implementation of a linear quadratic regulator (LQR) for the Ball on Spool model. The paper presents the entire process, starting from the mathematical model, through control design, to application of the controller on the given hardware platform. The proposed solution, based on the REX Control System, provides a high level of user comfort regarding implementation of the control loop, diagnostics, and automatically generated visualization based on HTML5. It represents an ideal example of a complex nonlinear mechatronic system with many possibilities for applying other types of controllers.
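A generic sketch of the LQR design step, solving the continuous-time algebraic Riccati equation with SciPy for an assumed linearized state-space model; the A and B matrices are placeholders, not the actual Ball on Spool dynamics or the REX implementation.

```python
# Generic LQR gain computation for an assumed linearized plant (placeholder A, B).
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical linearized plant: x = [position, velocity], single input.
A = np.array([[0.0, 1.0],
              [2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # penalize position error more than velocity
R = np.array([[0.1]])      # control effort weight

# Solve the continuous-time algebraic Riccati equation and form K = R^-1 B^T P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
print("state-feedback gain K =", np.round(K, 3))

# Closed-loop poles should all lie in the left half-plane.
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```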
NASA Astrophysics Data System (ADS)
Nadi, S.; Delavar, M. R.
2011-06-01
This paper presents a generic model for using different decision strategies in multi-criteria, personalized route planning. Some researchers have considered user preferences in navigation systems. However, these prior studies typically employed a high-tradeoff decision strategy, which used a weighted linear aggregation rule, and neglected other decision strategies. The proposed model integrates a pairwise comparison method and quantifier-guided ordered weighted averaging (OWA) aggregation operators to form a personalized route planning method that incorporates different decision strategies. The model can be used to calculate the impedance of each link with regard to user preferences in terms of the route criteria, criteria importance and the selected decision strategy. Depending on the decision strategy, the calculated impedance lies between aggregations that use a logical "and" (which requires all the criteria to be satisfied) and a logical "or" (which requires at least one criterion to be satisfied), and it also includes taking the average of the criteria scores. The model results in multiple alternative routes, which apply different decision strategies and provide users with the flexibility to select one of them en route based on the real-world situation. The model also defines the robust personalized route under different decision strategies. The influence of different decision strategies on the results is investigated in an illustrative example. This model is implemented in a web-based geographical information system (GIS) for Isfahan in Iran and verified in a tourist routing scenario. The results demonstrated, in real-world situations, the validity of the route planning carried out in the model.
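A short sketch of quantifier-guided OWA aggregation as described above: weights are derived from a regular increasing monotone quantifier Q(r) = r^alpha and applied to the criteria scores sorted in descending order, so that alpha moves the operator between "or-like" and "and-like" behaviour. The criteria scores and alpha values are illustrative.

```python
# Quantifier-guided OWA aggregation for per-link criteria scores (illustrative values).
import numpy as np

def owa_weights(n, alpha):
    """w_i = Q(i/n) - Q((i-1)/n) with Q(r) = r**alpha."""
    r = np.arange(n + 1) / n
    return np.diff(r ** alpha)

def owa(scores, alpha):
    """OWA aggregation: weights applied to scores sorted in descending order."""
    ordered = np.sort(np.asarray(scores, float))[::-1]
    return float(owa_weights(len(scores), alpha) @ ordered)

# Normalized criteria scores for one link (e.g. travel time, scenery, safety).
link_scores = [0.9, 0.4, 0.7]
for alpha, label in [(0.1, "or-like (optimistic)"),
                     (1.0, "average"),
                     (10.0, "and-like (pessimistic)")]:
    print(f"{label:24s} aggregated score: {owa(link_scores, alpha):.3f}")
```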
Winston, Richard B.; Voss, Clifford I.
2004-01-01
This report describes SutraGUI, a flexible graphical user-interface (GUI) that supports two-dimensional (2D) and three-dimensional (3D) simulation with the U.S. Geological Survey (USGS) SUTRA ground-water-flow and transport model (Voss and Provost, 2002). SutraGUI allows the user to create SUTRA ground-water models graphically. SutraGUI provides all of the graphical functionality required for setting up and running SUTRA simulations that range from basic to sophisticated, but it is also possible for advanced users to apply programmable features within Argus ONE to meet the unique demands of particular ground-water modeling projects. SutraGUI is a public-domain computer program designed to run with the proprietary Argus ONE package, which provides 2D Geographic Information System (GIS) and meshing support. For 3D simulation, GIS and meshing support is provided by programming contained within SutraGUI. When preparing a 3D SUTRA model, the model and all of its features are viewed within Argus ONE in 2D projection. For 2D models, SutraGUI is only slightly changed in functionality from the previous 2D-only version (Voss and others, 1997) and it provides visualization of simulation results. In 3D, only model preparation is supported by SutraGUI, and 3D simulation results may be viewed in SutraPlot (Souza, 1999) or Model Viewer (Hsieh and Winston, 2002). A comprehensive online Help system is included in SutraGUI. For 3D SUTRA models, the 3D model domain is conceptualized as bounded on the top and bottom by 2D surfaces. The 3D domain may also contain internal surfaces extending across the model that divide the domain into tabular units, which can represent hydrogeologic strata or other features intended by the user. These surfaces can be non-planar and non-horizontal. The 3D mesh is defined by one or more 2D meshes at different elevations that coincide with these surfaces. If the nodes in the 3D mesh are vertically aligned, only a single 2D mesh is needed. For nonaligned meshes, two or more 2D meshes of similar connectivity are used. Between each set of 2D meshes (and model surfaces), the vertical space in the 3D mesh is evenly divided into a user-specified number of layers of finite elements. Boundary conditions may be specified for 3D models in SutraGUI using a variety of geometric shapes that may be located freely within the 3D model domain. These shapes include points, lines, sheets, and solids. These are represented by 2D contours (within the vertically-projected Argus ONE view) with user-defined elevations. In addition, boundary conditions may be specified for 3D models as points, lines, and areas that are located exactly within the surfaces that define the model top and the bottoms of the tabular units. Aquifer properties may be specified separately for each tabular unit. If the aquifer properties vary vertically within a unit, SutraGUI provides the Sutra_Z function that can be used to specify such variation.
Distributive On-line Processing, Visualization and Analysis System for Gridded Remote Sensing Data
NASA Technical Reports Server (NTRS)
Leptoukh, G.; Berrick, S.; Liu, Z.; Pham, L.; Rui, H.; Shen, S.; Teng, W.; Zhu, T.
2004-01-01
The ability to use data stored in the current Earth Observing System (EOS) archives for studying regional or global phenomena is highly dependent on having a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. This is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets that are usually of different formats, structures, and resolutions, for example, when preparing data for input into modeling systems. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has taken a major step towards meeting this challenge by developing an infrastructure with a Web interface that allows users to perform interactive analysis online without downloading any data: the GES-DISC Interactive Online Visualization and Analysis Infrastructure, or "Giovanni." Giovanni provides interactive, online analysis tools that facilitate users' research. Several instances of this interface have been created to serve TRMM users, aerosol scientists, and ocean color and agriculture applications users. The first generation of these tools supports gridded data only. The user selects geophysical parameters, an area of interest, and a time period, and the system generates an output on screen in a matter of seconds. The currently available output options are: an area plot averaged or accumulated over any available data period for any rectangular area; a time plot, i.e., a time series averaged over any rectangular area; image views of any longitude-time and latitude-time cross sections; ASCII output for all plot types; and image animation for the area plot. In the future, we will add correlation plots, GIS-compatible outputs, etc. This allows users to focus on data content (i.e., science parameters) and eliminates the need for expensive learning, development, and processing tasks that are otherwise redundantly incurred across an archive's user community. The current implementation utilizes the GrADS-DODS Server (GDS), a stable, secure data server that provides subsetting and analysis services across the Internet for any GrADS-readable dataset. The subsetting capability allows users to retrieve a specified temporal and/or spatial subdomain from a large dataset, eliminating the need to download everything simply to access a small relevant portion of a dataset. The analysis capability allows users to retrieve the results of an operation applied to one or more datasets on the server. In our case, we use this approach to read pre-processed binary files and/or to read and extract the needed parts from HDF or HDF-EOS files. These subsets then serve as inputs into GrADS processing and analysis scripts. The system can be used in a wide variety of Earth science applications, such as the study and monitoring of climate and weather events and the preparation of model inputs, and it can be easily configured for new applications.
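The server-side subsetting idea can be illustrated with a short sketch that lazily opens a remote OPeNDAP-style dataset and extracts an area-averaged time series without a bulk download; the endpoint URL, variable name, and coordinate names are hypothetical placeholders rather than an actual Giovanni or GDS service.

```python
# Illustrative spatial/temporal subsetting through an OPeNDAP-style service,
# similar in spirit to the GrADS-DODS subsetting that Giovanni relies on.
# The dataset URL and variable/coordinate names are hypothetical placeholders.
import xarray as xr

URL = "http://example.gov/dods/precip_daily"       # hypothetical GDS endpoint

ds = xr.open_dataset(URL)                          # lazy access, no bulk download
subset = ds["precipitation"].sel(
    lat=slice(-10, 10), lon=slice(90, 150),        # area of interest
    time=slice("1998-01-01", "1998-01-31"),        # time period
)
area_mean = subset.mean(dim=("lat", "lon"))        # "area plot" style time series
print(area_mean.values)
```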
A user-friendly model for spray drying to aid pharmaceutical product development.
Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L J; Frijlink, Henderik W
2013-01-01
The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open-source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to the production of glassy sugars, excipients often used in formulations of biopharmaceuticals, by spray drying. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray-dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach.
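The kind of prediction such a spreadsheet model performs can be sketched with a simplified steady-state energy balance for the outlet temperature; this is a generic approximation under stated assumptions (all feed water evaporates, constant air heat capacity), not the authors' fitted Büchi B-290 model, and all numbers are illustrative.

```python
# Simplified steady-state energy balance for the outlet air temperature of a
# spray dryer. A sketch of the general approach, not the authors' fitted model;
# all numerical values are illustrative assumptions for a lab-scale dryer.

def outlet_temperature(T_in, air_flow_kg_s, feed_rate_kg_s, water_fraction,
                       q_loss_w=0.0, cp_air=1006.0, dh_vap=2.26e6):
    """Outlet air temperature [°C], assuming all feed water evaporates."""
    evaporation_load = feed_rate_kg_s * water_fraction * dh_vap    # W
    dT = (evaporation_load + q_loss_w) / (air_flow_kg_s * cp_air)  # K
    return T_in - dT

# Example: 120 °C inlet air, ~0.01 kg/s drying air, 0.1 g/s of a 5 % solids feed.
print(outlet_temperature(T_in=120.0, air_flow_kg_s=0.010,
                         feed_rate_kg_s=1.0e-4, water_fraction=0.95))
```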
NASA Astrophysics Data System (ADS)
Odijk, Dennis; Zhang, Baocheng; Khodabandeh, Amir; Odolinski, Robert; Teunissen, Peter J. G.
2016-01-01
The concept of integer ambiguity resolution-enabled Precise Point Positioning (PPP-RTK) relies on appropriate network information for the parameters that are common between the single-receiver user that applies this information and the network that provides it. Most of the current methods for PPP-RTK are based on forming the ionosphere-free combination using dual-frequency Global Navigation Satellite System (GNSS) observations. These methods are therefore restrictive in the light of the development of new multi-frequency GNSS constellations, as well as from the point of view that the PPP-RTK user requires ionospheric corrections to obtain integer ambiguity resolution based on short observation time spans. The method for PPP-RTK that is presented in this article does not have the above limitations, as it is based on the undifferenced, uncombined GNSS observation equations, thereby keeping all parameters in the model. Working with the undifferenced observation equations implies that the models are rank-deficient; not all parameters are unbiasedly estimable, but only combinations of them. By application of S-system theory the model is made of full rank by constraining a minimum set of parameters, or S-basis. The choice of this S-basis determines the estimability and the interpretation of the parameters that are transmitted to the PPP-RTK users. As this choice is not unique, one has to be very careful when comparing network solutions in different S-systems; in that case the S-transformation, which is provided by the S-system method, should be used to make the comparison. Knowing the estimability and interpretation of the parameters estimated by the network is shown to be crucial for a correct interpretation of the estimable PPP-RTK user parameters, among others the essential ambiguity parameters, whose integer property follows directly from the interpretation of the satellite phase biases provided by the network. The flexibility of the S-system method is furthermore demonstrated by the fact that all models in this article are derived in multi-epoch mode, allowing dynamic model constraints to be incorporated on all parameters or subsets of them.
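For orientation, a schematic form of the undifferenced, uncombined code and phase observation equations is shown below in generic notation; the exact parameterization and symbols used in the article may differ.

```latex
% Schematic undifferenced, uncombined observation equations on frequency j,
% for receiver r and satellite s (generic notation; an illustrative sketch,
% not the article's exact parameterization).
\begin{align*}
E(p^{s}_{r,j})       &= \rho^{s}_{r} + c\,dt_{r} - c\,dt^{s} + \mu_{j}\,\iota^{s}_{r}
                        + d_{r,j} - d^{s}_{j},\\
E(\varphi^{s}_{r,j}) &= \rho^{s}_{r} + c\,dt_{r} - c\,dt^{s} - \mu_{j}\,\iota^{s}_{r}
                        + \lambda_{j}\bigl(a^{s}_{r,j} + \delta_{r,j} - \delta^{s}_{j}\bigr),
\qquad \mu_{j} = \lambda_{j}^{2}/\lambda_{1}^{2}.
\end{align*}
```

Because clock terms, code and phase biases, ionospheric delays, and ambiguities enter only in combination, the associated design matrix is rank-deficient, which is precisely why a minimum set of parameters (the S-basis) must be constrained.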
Integration of GCAM-USA into GLIMPSE: Update and ...
The purpose of this presentation is to (i) discuss changes made to the GCAM-USA model to more fully support long-term, coordinated environmental-climate-energy planning within the U.S., and (ii) demonstrate the graphical user interface that has been developed to build modeling scenarios, execute GCAM-USA, and visualize and compare model outputs. GLIMPSE is intended to provide insights into linkages and synergies among the goals of air quality management, climate change mitigation, and long-range energy planning. We have expanded GLIMPSE to also incorporate the open-source Global Change Assessment Model-USA (GCAM-USA), which has state-level representation of the U.S. energy system. With GCAM-USA, GLIMPSE can consider more aspects of the economy, linkages to the water and climate systems, and interactions with other regions of the world. A user-friendly graphical interface allows the system to be applied by analysts to explore a range of policies, such as emission taxes or caps, efficiency standards, and renewable portfolio standards. We expect GLIMPSE to be used within research and planning activities, both within the EPA and beyond.
Segmentation in low-penetration and low-involvement categories: an application to lottery games.
Guesalaga, Rodrigo; Marshall, Pablo
2013-09-01
Market segmentation is accepted as a fundamental concept in marketing, and several authors have recently proposed a segmentation model in which personal and environmental variables intersect with each other to form motivating conditions that drive behavior and preferences. This model of segmentation has been applied to packaged goods. This paper extends this literature by proposing a segmentation model for low-penetration and low-involvement (LP-LI) products. An application to lottery games in Chile supports the proposed model. The results of the study show that for this type of product (LP-LI), the attitude towards the product category is the most important factor distinguishing consumers from non-consumers and heavy users from light users, and is consequently a critical segmentation variable. In addition, a cluster analysis shows the existence of three segments: (1) the impulsive dreamers, who believe in chance and that lottery games can change their lives; (2) the skeptical, who believe neither in chance nor that lottery games can change their lives; and (3) the willing, who value the benefits of playing.
NASA Astrophysics Data System (ADS)
Liu, Jinjie
2017-08-01
To fully consider the impact of future policies and technologies on the electricity sales market, improve the efficiency of electricity market operation, and realize the dual goals of power reform and energy saving and emission reduction, this paper uses multi-level decision theory to put forward a double-layer game model that considers both ETS and blockchain. We set the maximization of electricity sales profit as the upper-level objective and establish a game-strategy model of electricity purchase, while we set the maximization of user satisfaction as the lower-level objective and build a choice behavior model based on customer satisfaction. The strategy is applied to the simulation of a sales company's transactions, with a horizontal comparison against competitors in the same industry as well as a longitudinal comparison of game strategies considering different factors. The results show that the double-layer game model is reasonable and effective: it can significantly improve the efficiency of electricity sales companies and user satisfaction, while promoting new-energy consumption and achieving energy-saving and emission-reduction goals.
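The two-layer structure can be illustrated with a toy bilevel sketch in which an upper-level retailer chooses a price anticipating a lower-level user response; the utility and cost forms and all parameter values are invented for illustration and omit the ETS and blockchain details of the paper.

```python
# Toy sketch of a two-layer (bilevel) game: a retailer picks a retail price to
# maximize profit, anticipating that users choose consumption to maximize their
# own satisfaction. All forms and numbers are illustrative assumptions.
import numpy as np

WHOLESALE = 0.30   # assumed wholesale purchase cost per kWh
CARBON    = 0.05   # assumed emission-related cost per kWh

def user_demand(price, a=10.0, b=1.0):
    # Lower level: users maximize satisfaction u(q) = a*q - 0.5*b*q^2 - price*q,
    # whose maximizer is q* = (a - price) / b, truncated at zero.
    return max((a - price) / b, 0.0)

def retailer_profit(price):
    # Upper level: profit = (price - wholesale - carbon cost) * demand(price).
    return (price - WHOLESALE - CARBON) * user_demand(price)

prices = np.linspace(0.0, 10.0, 1001)
best = max(prices, key=retailer_profit)
print(f"optimal retail price ≈ {best:.2f}, demand ≈ {user_demand(best):.2f}")
```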
Quantifying the web browser ecosystem
Ferdman, Sela; Minkov, Einat; Gefen, David
2017-01-01
Contrary to the assumption that web browsers are designed to support the user, an examination of 900,000 distinct PCs shows that web browsers comprise a complex ecosystem with millions of addons collaborating and competing with each other. It is possible for addons to “sneak in” through third-party installations or to get “kicked out” by their competitors without user involvement. This study examines that ecosystem quantitatively by constructing a large-scale graph with nodes corresponding to users, addons, and words (terms) that describe addon functionality. Analyzing addon interactions at the user level using the Personalized PageRank (PPR) random walk measure shows that the graph demonstrates ecological resilience. Adapting the PPR model to analyze the browser ecosystem at the level of addon manufacturer, the study shows that some addon companies are in symbiosis while others clash with each other, as shown by analyzing the behavior of 18 prominent addon manufacturers. The results may offer insight into how other evolving internet ecosystems behave, and suggest a methodology for measuring this behavior. Specifically, applying such a methodology could transform the addon market. PMID:28644833
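A minimal sketch of the Personalized PageRank step on a toy user-addon graph is shown below; the nodes, edges, and restart distribution are made up for illustration and do not reflect the study's 900,000-PC dataset.

```python
# Minimal Personalized PageRank (PPR) on a toy user-addon graph; node names and
# edges are invented to illustrate the ranking step, not the study's actual data.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("user1", "addonA"), ("user1", "addonB"),
    ("user2", "addonB"), ("user2", "addonC"),
    ("addonA", "toolbar"), ("addonC", "toolbar"),   # shared descriptive term
])

# A restart distribution concentrated on one user yields that user's view of the ecosystem.
ppr = nx.pagerank(G, alpha=0.85, personalization={"user1": 1.0})
print(sorted(ppr.items(), key=lambda kv: -kv[1]))
```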
The amount of ergonomics and user involvement in 151 design processes.
Kok, Barbara N E; Slegers, Karin; Vink, Peter
2012-01-01
Ergonomics, usability and user-centered design are terms that are well known among designers. Yet, products often seem to fail to meet the users' needs, resulting in a gap between expected and experienced usability. To understand the possible causes of this gap, the actions taken by the designer during the design process are studied in this paper. This can show whether and how certain actions influence the user-friendliness of the designed products. The aim of this research was to understand whether ergonomic principles and methods are included in the design process, whether users are involved in this process, and whether the experience of the designer (in ergonomics/user involvement) has an effect on the usability of the end product. In this study the design processes of 151 tangible products of students in design were analyzed. In 75% of the cases some ergonomic principles were applied. Users were involved in only one third of the design cases. Hardly any correlation was found between the designers' experience with ergonomic principles and the way they applied them, and no correlations were found between the designers' experience with user involvement and the users' involvement in the design process.
Towards a Ubiquitous User Model for Profile Sharing and Reuse
de Lourdes Martinez-Villaseñor, Maria; Gonzalez-Mendoza, Miguel; Hernandez-Gress, Neil
2012-01-01
People interact with systems and applications through several devices and are willing to share information about preferences, interests and characteristics. Social networking profiles, data from advanced sensors attached to personal gadgets, and semantic web technologies such as FOAF and microformats are valuable sources of personal information that could provide a fair understanding of the user, but profile information is scattered over different user models. Some researchers in the ubiquitous user modeling community envision the need to share user model information from heterogeneous sources. In this paper, we address the syntactic and semantic heterogeneity of user models in order to enable user modeling interoperability. We present a dynamic user profile structure based on the Simple Knowledge Organization System (SKOS) to provide knowledge representation for a ubiquitous user model. We propose a two-tier matching strategy for concept schema alignment to enable user modeling interoperability. Our proposal is demonstrated in an application scenario of sharing and reusing data to deal with overweight and obesity. PMID:23201995
User modeling for distributed virtual environment intelligent agents
NASA Astrophysics Data System (ADS)
Banks, Sheila B.; Stytz, Martin R.
1999-07-01
This paper emphasizes the requirement for user modeling by presenting the information necessary to motivate the need for and use of user modeling in intelligent agent development. The paper presents information on our current intelligent agent development program, the Symbiotic Information Reasoning and Decision Support (SIRDS) project. We then discuss the areas of intelligent agents and user modeling, which form the foundation of the SIRDS project. Included in the discussion of user modeling are its major components, cognitive modeling and behavioral modeling. We next motivate the need for and use of a methodology for developing user models that encompasses work in cognitive task analysis. We close the paper by drawing conclusions from our current intelligent agent research project and discuss avenues of future research in the utilization of user modeling for the development of intelligent agents for virtual environments.
Advanced solar irradiances applied to satellite and ionospheric operational systems
NASA Astrophysics Data System (ADS)
Tobiska, W. Kent; Schunk, Robert; Eccles, Vince; Bouwer, Dave
Satellite and ionospheric operational systems require solar irradiances in a variety of time scales and spectral formats. We describe the development of a system using operational grade solar irradiances that are applied to empirical thermospheric density models and physics-based ionospheric models used by operational systems that require a space weather characterization. The SOLAR2000 (S2K) and SOLARFLARE (SFLR) models developed by Space Environment Technologies (SET) provide solar irradiances from the soft X-rays (XUV) through the Far Ultraviolet (FUV) spectrum. The irradiances are provided as integrated indices for the JB2006 empirical atmosphere density models and as line/band spectral irradiances for the physics-based Ionosphere Forecast Model (IFM) developed by the Space Environment Corporation (SEC). We describe the integration of these irradiances in historical, current epoch, and forecast modes through the Communication Alert and Prediction System (CAPS). CAPS provides real-time and forecast HF radio availability for global and regional users and global total electron content (TEC) conditions.
NASA Astrophysics Data System (ADS)
Plag, H.-P.; Foley, G.; Jules-Plag, S.; Ondich, G.; Kaufman, J.
2012-04-01
The Group on Earth Observations (GEO) is implementing the Global Earth Observation System of Systems (GEOSS) as a user-driven service infrastructure responding to the needs of users in nine interdependent Societal Benefit Areas (SBAs) of Earth observations (EOs). GEOSS applies an interdisciplinary scientific approach integrating observations, research, and knowledge in these SBAs in order to enable scientific interpretation of the collected observations and the extraction of actionable information. Using EOs to actually produce these societal benefits means getting the data and information to users, i.e., decision-makers. Thus, GEO needs to know what the users need and how they would use the information. The GEOSS User Requirements Registry (URR) is developed as a service-oriented infrastructure enabling a wide range of users, including science and technology (S&T) users, to express their needs in terms of EOs and to understand the benefits of GEOSS for their fields. S&T communities need to be involved in both the development and the use of GEOSS, and the development of the URR accounts for the special needs of these communities. The GEOSS Common Infrastructure (GCI) at the core of GEOSS includes system-oriented registries enabling users to discover, access, and use EOs and derived products and services available through GEOSS. In addition, the user-oriented URR is a place for the collection, sharing, and analysis of user needs and EO requirements, and it provides means for an efficient dialog between users and providers. The URR is a community-based infrastructure for the publishing, viewing, and analyzing of user-need related information. The data model of the URR has a core of seven relations for User Types, Applications, Requirements, Research Needs, Infrastructure Needs, Technology Needs, and Capacity Building Needs. The URR also includes a Lexicon, a number of controlled vocabularies, and
NASA Astrophysics Data System (ADS)
Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.
2017-12-01
Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and likely many more are applied in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined the impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus improved data and enhanced assumptions, on model outcomes and thus, ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.
Addiction Research Ethics and the Belmont Principles: Do Drug Users Have a Different Moral Voice?
Fisher, Celia B.
2013-01-01
This study used semi-structured interviews and content analysis to examine moral principles that street drug users apply to three hypothetical addiction research ethical dilemmas. Participants (n = 90) were ethnically diverse, economically disadvantaged drug users recruited in New York City in 2009. Participants applied a wide range of contextually sensitive moral precepts, including respect, beneficence, justice, relationality, professional obligations, rules, and pragmatic self-interest. Limitations and implications for future research and the responsible conduct of addiction research are discussed. PMID:21073412
NASA Technical Reports Server (NTRS)
Nutter, Paul; Manobianco, John
1998-01-01
This report describes the Applied Meteorology Unit's objective verification of the National Centers for Environmental Prediction 29-km eta model during separate warm and cool season periods from May 1996 through January 1998. The verification of surface and upper-air point forecasts was performed at three selected stations important for 45th Weather Squadron, Spaceflight Meteorology Group, and National Weather Service, Melbourne operational weather concerns. The statistical evaluation identified model biases that may result from inadequate parameterization of physical processes. Since model biases are relatively small compared to the random error component, most of the total model error results from day-to-day variability in the forecasts and/or observations. To some extent, these nonsystematic errors reflect the variability in point observations that sample spatial and temporal scales of atmospheric phenomena that cannot be resolved by the model. On average, Meso-Eta point forecasts provide useful guidance for predicting the evolution of the larger scale environment. A more substantial challenge facing model users in real time is the discrimination of nonsystematic errors that tend to inflate the total forecast error. It is important that model users maintain awareness of ongoing model changes. Such changes are likely to modify the basic error characteristics, particularly near the surface.
Reconceptualizing Early and Late Onset: A Life Course Analysis of Older Heroin Users
ERIC Educational Resources Information Center
Boeri, Miriam Williams; Sterk, Claire E.; Elifson, Kirk W.
2008-01-01
Purpose: Researchers' knowledge regarding older users of illicit drugs is limited despite the increasing numbers of users. In this article, we apply a life course perspective to gain a further understanding of older adult drug use, specifically contrasting early- and late-onset heroin users. Design and Methods: We collected qualitative data from…
[An Introduction to Methods for Evaluating Health Care Technology].
Lee, Ting-Ting
2015-06-01
The rapid and continual advance of healthcare technology makes ensuring that this technology is used effectively to achieve its original goals a critical issue. This paper presents three methods that may be applied by healthcare professionals in the evaluation of healthcare technology. These methods include: the perception/experiences of users, user work-pattern changes, and chart review or data mining. The first method includes two categories: using interviews to explore the user experience and using theory-based questionnaire surveys. The second method applies work sampling to observe the work pattern changes of users. The last method conducts chart reviews or data mining to analyze the designated variables. In conclusion, while evaluative feedback may be used to improve the design and development of healthcare technology applications, the informatics competency and informatics literacy of users may be further explored in future research.
Liverpool's Discovery: A University Library Applies a New Search Tool to Improve the User Experience
ERIC Educational Resources Information Center
Kenney, Brian
2011-01-01
This article features the University of Liverpool's arts and humanities library, which applies a new search tool to improve the user experience. In nearly every way imaginable, the Sydney Jones Library and the Harold Cohen Library--the university's two libraries that serve science, engineering, and medical students--support the lives of their…
MOPET: a context-aware and user-adaptive wearable system for fitness training.
Buttussi, Fabio; Chittaro, Luca
2008-02-01
Cardiovascular disease, obesity, and lack of physical fitness are increasingly common and negatively affect people's health, requiring medical assistance and decreasing people's wellness and productivity. In recent years, researchers as well as companies have been increasingly investigating wearable devices for fitness applications with the aim of improving users' health, in terms of cardiovascular benefits, loss of weight, or muscle strength. Dedicated GPS devices, accelerometers, step counters and heart rate monitors are already commercially available, but they are usually very limited in terms of user interaction and artificial intelligence capabilities. This significantly limits the training and motivation support provided by current systems, making them poorly suited for untrained people who are more interested in fitness for health rather than competitive purposes. To better train and motivate users, we propose the mobile personal trainer (MOPET) system. MOPET is a wearable system that supervises a physical fitness activity based on alternating jogging and fitness exercises in outdoor environments. By exploiting real-time data coming from sensors, knowledge elicited from a sport physiologist and a professional trainer, and a user model that is built and periodically updated through a guided autotest, MOPET can provide motivation as well as safety and health advice, adapted to the user and the context. To better interact with the user, MOPET also displays a 3D embodied agent that speaks, suggests stretching or strengthening exercises according to the user's current condition, and demonstrates how to correctly perform exercises with interactive 3D animations. By describing MOPET, we show how context-aware and user-adaptive techniques can be applied to the fitness domain. In particular, we describe how such techniques can be exploited to train, motivate, and supervise users in a wearable personal training system for outdoor fitness activity.
Boland, Mary Regina; Rusanov, Alexander; So, Yat; Lopez-Jimenez, Carlos; Busacca, Linda; Steinman, Richard C; Bakken, Suzanne; Bigger, J Thomas; Weng, Chunhua
2014-12-01
Underspecified user needs and frequent lack of a gold standard reference are typical barriers to technology evaluation. To address this problem, this paper presents a two-phase evaluation framework involving usability experts (phase 1) and end-users (phase 2). In phase 1, a cross-system functionality alignment between expert-derived user needs and system functions was performed to inform the choice of "the best available" comparison system, enabling a cognitive walkthrough in phase 1 and a comparative effectiveness evaluation in phase 2. During phase 2, five quantitative and qualitative evaluation methods are mixed to assess usability: time-motion analysis, software log, questionnaires - the System Usability Scale and the Unified Theory of Acceptance and Use of Technology, think-aloud protocols, and unstructured interviews. Each method contributes data for a unique measure (e.g., time-motion analysis contributes task completion time; the software log contributes action transition frequency). The measures are triangulated to yield complementary insights regarding user-perceived ease-of-use, functionality integration, anxiety during use, and workflow impact. To illustrate its use, we applied this framework in a formative evaluation of a software system called Integrated Model for Patient Care and Clinical Trials (IMPACT). We conclude that this mixed-methods evaluation framework enables an integrated assessment of user needs satisfaction and user-perceived usefulness and usability of a novel design. This evaluation framework effectively bridges the gap between co-evolving user needs and technology designs during iterative prototyping and is particularly useful when it is difficult for users to articulate their needs for technology support due to the lack of a baseline. Copyright © 2013 Elsevier Inc. All rights reserved.
Software Models Impact Stresses
NASA Technical Reports Server (NTRS)
Hanshaw, Timothy C.; Roy, Dipankar; Toyooka, Mark
1991-01-01
Generalized Impact Stress Software designed to assist engineers in predicting stresses caused by variety of impacts. Program straightforward, simple to implement on personal computers, "user friendly", and handles variety of boundary conditions applied to struck body being analyzed. Applications include mathematical modeling of motions and transient stresses of spacecraft, analysis of slamming of piston, of fast valve shutoffs, and play of rotating bearing assembly. Provides fast and inexpensive analytical tool for analysis of stresses and reduces dependency on expensive impact tests. Written in FORTRAN 77. Requires use of commercial software package PLOT88.
Pragmatic User Model Implementation in an Intelligent Help System.
ERIC Educational Resources Information Center
Fernandez-Manjon, Baltasar; Fernandez-Valmayor, Alfredo; Fernandez-Chamizo, Carmen
1998-01-01
Describes Aran, a knowledge-based system designed to help users deal with problems related to Unix operation. Highlights include adaptation to the individual user; user modeling knowledge; stereotypes; content of the individual user model; instantiation, acquisition, and maintenance of the individual model; dynamic acquisition of objective and…
Flexible augmented reality architecture applied to environmental management
NASA Astrophysics Data System (ADS)
Correia, Nuno M. R.; Romao, Teresa; Santos, Carlos; Trabuco, Adelaide; Santos, Rossana; Romero, Luis; Danado, Jose; Dias, Eduardo; Camara, Antonio; Nobre, Edmundo
2003-05-01
Environmental management often requires in loco observation of the area under analysis. Augmented Reality (AR) technologies allow real-time superimposition of synthetic objects on real images, providing augmented knowledge about the surrounding world. Users of an AR system can visualize the real surrounding world together with additional data generated in real time in a contextual way. The work reported in this paper was done in the scope of the ANTS (Augmented Environments) project. ANTS is an AR project that explores the development of an augmented-reality technological infrastructure for environmental management. This paper presents the architecture and the most relevant modules of ANTS. The system's architecture follows the client-server model and is based on several independent, but functionally interdependent, modules. It has a flexible design, which allows the transfer of some modules to and from the client side, according to the available processing capacities of the client device and the application's requirements. It combines several techniques to identify the user's position and orientation, allowing the system to adapt to the particular characteristics of each environment. The determination of the data associated with a certain location involves the use of both a 3D model of the location and a multimedia geo-referenced database.
Gravity compensation of an upper extremity exoskeleton.
Moubarak, S; Pham, M T; Moreau, R; Redarce, T
2010-01-01
This paper presents a new gravity compensation method for an upper extremity exoskeleton mounted on a wheelchair. This new device is dedicated to regular and efficient rehabilitation training for post-stroke and injured people without the continuous presence of a therapist. The exoskeleton is a wearable robotic device attached to the human arm. The user provides information signals to the controller by means of force sensors around the wrist and the arm, and the robot controller generates the appropriate control signals for different training strategies and paradigms. This upper extremity exoskeleton covers four basic degrees of freedom of the shoulder and elbow joints, with three additional adaptability degrees of freedom in order to match the arm anatomy of different users. For comfortable and efficient rehabilitation, a new heuristic method has been studied and applied to our prototype in order to calculate the gravity compensation model without the need to identify the mass parameters. It is based on the geometric model of the robot and accurate torque measurements of the prototype's actuators in a set of specifically chosen joint positions. The weight effect is successfully compensated, so that the user can move the arm freely while wearing the exoskeleton without feeling its mass.
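A sketch of this kind of measurement-driven gravity compensation is shown below for a planar two-link case: static torques measured at a few chosen postures are regressed onto trigonometric basis functions of the joint angles, avoiding explicit mass identification. The basis, postures, and torque values are illustrative assumptions, not the prototype's calibration data.

```python
# Gravity-compensation fit in the spirit described above: static actuator
# torques measured at chosen joint positions are regressed onto trigonometric
# basis functions of the joint angles, so no explicit mass or centre-of-mass
# identification is needed. All data and the basis choice are illustrative.
import numpy as np

def basis(q1, q2):
    # The gravity torque of a planar two-link arm is a linear combination of these terms.
    return np.array([np.sin(q1), np.sin(q1 + q2), 1.0])

# Hypothetical calibration postures (rad) and measured static torques (N·m).
postures = [(0.0, 0.0), (0.5, 0.0), (1.0, 0.5), (1.2, 1.0), (0.3, 1.3)]
tau_meas = np.array([4.1, 6.0, 7.6, 6.9, 5.2])            # e.g. shoulder joint

Phi = np.vstack([basis(q1, q2) for q1, q2 in postures])   # regression matrix
coef, *_ = np.linalg.lstsq(Phi, tau_meas, rcond=None)     # least-squares fit

def gravity_torque(q1, q2):
    """Feedforward compensation torque at an arbitrary posture."""
    return float(basis(q1, q2) @ coef)

print(gravity_torque(0.8, 0.4))
```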
Roberts, Michaela; Hanley, Nick; Cresswell, Will
2017-09-15
While ecological links between ecosystems have long been recognised, management rarely crosses ecosystem boundaries. Coral reefs are susceptible to damage through terrestrial run-off, and failing to account for this within management threatens reef protection. In order to quantify the extent to which coral reef users are willing to support management actions to improve ecosystem quality, we conducted a choice experiment with SCUBA divers on the island of Bonaire, Caribbean Netherlands. Specifically, we estimated their willingness to pay to reduce terrestrial overgrazing as a means to improve reef health. Willingness to pay was estimated using the multinomial, random parameter and latent class logit models. Willingness to pay for improvements to reef quality was positive for the majority of respondents. Estimates from the latent class model placed willingness to pay for reef improvements between $31.17 and $413.18 per year, depending on class membership. This represents a significant source of funding for terrestrial conservation, and illustrates the potential for user fees to be applied across ecosystem boundaries. We argue that such across-ecosystem-boundary funding mechanisms are an important avenue for future investigation in many connected systems. Copyright © 2017 Elsevier Ltd. All rights reserved.
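Willingness-to-pay figures from such logit models are typically obtained as the ratio of an attribute coefficient to the (negative) cost coefficient, as in the brief sketch below; the coefficient values are invented for illustration and are not the paper's estimates.

```python
# Sketch of deriving willingness to pay (WTP) from choice-model coefficients:
# WTP = -(attribute coefficient) / (cost coefficient). Values are illustrative.
beta_reef_quality = 0.042   # utility per unit improvement in reef quality
beta_cost         = -0.011  # utility per extra dollar of dive fee

wtp_per_unit = -beta_reef_quality / beta_cost
print(f"WTP ≈ ${wtp_per_unit:.2f} per unit of reef-quality improvement")
```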
Psychophysical experiments on the PicHunter image retrieval system
NASA Astrophysics Data System (ADS)
Papathomas, Thomas V.; Cox, Ingemar J.; Yianilos, Peter N.; Miller, Matt L.; Minka, Thomas P.; Conway, Tiffany E.; Ghosn, Joumana
2001-01-01
Psychophysical experiments were conducted on PicHunter, a content-based image retrieval (CBIR) experimental prototype with the following properties: (1) Based on a model of how users respond, it uses Bayes's rule to predict what target users want, given their actions. (2) It possesses an extremely simple user interface. (3) It employs an entropy-based scheme to improve convergence. (4) It introduces a paradigm for assessing the performance of CBIR systems. Experiments 1-3 studied human judgment of image similarity to obtain data for the model. Experiment 4 studied the importance of using: (a) semantic information, (b) memory of earlier input, and (c) relative and absolute judgments of similarity. Experiment 5 tested an approach that we propose for comparing performances of CBIR systems objectively. Finally, experiment 6 evaluated the most informative display-updating scheme that is based on entropy minimization, and confirmed earlier simulation results. These experiments represent one of the first attempts to quantify CBIR performance based on psychophysical studies, and they provide valuable data for improving CBIR algorithms. Even though they were designed with PicHunter in mind, their results can be applied to any CBIR system and, more generally, to any system that involves judgment of image similarity by humans.
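The Bayesian update at the core of property (1) can be sketched in a few lines: a prior over candidate targets is reweighted by the likelihood of the observed user action under each candidate. The likelihood values below are a simplified stand-in for PicHunter's user-response model.

```python
# Minimal Bayesian target-update step in the style PicHunter uses: the posterior
# over candidate target images is the prior reweighted by a user-action likelihood.
import numpy as np

def bayes_update(prior, likelihood):
    """posterior ∝ prior * P(observed user action | target = i)."""
    post = prior * likelihood
    return post / post.sum()

n_images = 5
prior = np.full(n_images, 1.0 / n_images)   # start with a uniform belief

# Hypothetical likelihoods: how probable the observed selection would be if
# each image were the user's target (e.g. derived from image similarity).
likelihood = np.array([0.10, 0.60, 0.15, 0.10, 0.05])

posterior = bayes_update(prior, likelihood)
print(posterior)   # the system would next choose the most informative images to display
```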
NASA Technical Reports Server (NTRS)
Sidwell, Kenneth W.; Baruah, Pranab K.; Bussoletti, John E.; Medan, Richard T.; Conner, R. S.; Purdon, David J.
1990-01-01
A comprehensive description of user problem definition for the PAN AIR (Panel Aerodynamics) system is given. PAN AIR solves the 3-D linear integral equations of subsonic and supersonic flow. Influence coefficient methods are used which employ source and doublet panels as boundary surfaces. Both analysis and design boundary conditions can be used. This User's Manual describes the information needed to use the PAN AIR system. The structure and organization of PAN AIR are described, including the job control and module execution control languages for execution of the program system. The engineering input data are described, including the mathematical and physical modeling requirements. This version of the manual strictly applies only to PAN AIR version 3.0. The major revisions include: (1) inputs and guidelines for the new FDP module (which calculates streamlines and offbody points); (2) nine new class 1 and class 2 boundary conditions to cover commonly used modeling practices, in particular the vorticity-matching Kutta condition; (3) use of the CRAY Solid-state Storage Device (SSD); and (4) incorporation of errata and typo corrections together with additional explanations and guidelines.
NASA Astrophysics Data System (ADS)
Lin, Y.; Bajcsy, P.; Valocchi, A. J.; Kim, C.; Wang, J.
2007-12-01
Natural systems are complex, thus extensive data are needed for their characterization. However, data acquisition is expensive; consequently, we develop models using sparse, uncertain information. When all uncertainties in the system are considered, the number of alternative conceptual models is large. Traditionally, the development of a conceptual model has relied on subjective professional judgment. Good judgment is based on experience in coordinating and understanding auxiliary information which is correlated with the model but difficult to quantify in the mathematical model. For example, groundwater recharge and discharge (R&D) processes are known to relate to multiple information sources such as soil type, river and lake location, irrigation patterns and land use. Although hydrologists have been trying to understand and model the interaction between each of these information sources and R&D processes, it is extremely difficult to quantify their correlations using a universal approach due to the complexity of the processes and their spatiotemporal distribution and uncertainty. There is currently no single method capable of estimating R&D rates and patterns for all practical applications. Chamberlin (1890) recommended use of "multiple working hypotheses" (alternative conceptual models) for rapid advancement in understanding of applied and theoretical problems. Therefore, cross-analyzing R&D rates and patterns from various estimation methods and related field information will likely be superior to using only a single estimation method. We have developed the Pattern Recognition Utility (PRU) to help GIS users recognize spatial patterns in noisy 2D images. This GIS plug-in utility has been applied to help hydrogeologists establish alternative R&D conceptual models in a more efficient way than conventional methods. The PRU uses numerical methods and image processing algorithms to estimate and visualize shallow R&D patterns and rates. It can provide a fast initial estimate prior to planning labor-intensive and time-consuming field R&D measurements. Furthermore, the Spatial Pattern 2 Learn (SP2L) tool was developed to cross-analyze results from the PRU with ancillary field information, such as land coverage, soil type, topographic maps and previous estimates. The learning process of SP2L cross-examines each initially recognized R&D pattern with the ancillary spatial dataset, and then calculates a quantifiable reliability index for each R&D map using a supervised machine learning technique, the decision tree. This Java-based software package is capable of generating alternative R&D maps if the user decides to apply certain conditions recognized by the learning process. The reliability indices from SP2L will improve the traditionally subjective approach to initiating conceptual models by providing objectively quantifiable conceptual bases for further probabilistic and uncertainty analyses. Both the PRU and SP2L have been designed to be user-friendly and universal utilities for pattern recognition and learning, to improve model predictions from sparse measurements by computer-assisted integration of spatially dense geospatial image data and machine learning of model dependencies.
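The decision-tree learning step described for SP2L can be sketched as follows, with synthetic ancillary attributes and labels standing in for the real land-cover, soil, and R&D data; the class probabilities serve as a simple per-cell reliability index.

```python
# Illustrative decision-tree step of the kind SP2L's learning stage describes:
# ancillary attributes (synthetic land-cover and soil codes plus slope) are used
# to learn recharge (1) vs discharge (0) labels, and class probabilities serve
# as a simple reliability index for each mapped cell. All data are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.integers(0, 4, 200),        # land-cover class
    rng.integers(0, 3, 200),        # soil-type class
    rng.random(200),                # normalized slope
])
y = (X[:, 2] < 0.5).astype(int)     # synthetic rule: flat cells tend to recharge

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
reliability = tree.predict_proba(X)[:, 1]   # per-cell "recharge" probability
print(reliability[:10])
```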
The Airspace Concepts Evaluation System Architecture and System Plant
NASA Technical Reports Server (NTRS)
Windhorst, Robert; Meyn, Larry; Manikonda, Vikram; Carlos, Patrick; Capozzi, Brian
2006-01-01
The Airspace Concepts Evaluation System is a simulation of the National Airspace System. It includes models of flights, airports, airspaces, air traffic control, traffic flow management, and airline operation centers operating throughout the United States. It is used to predict system delays in response to future capacity and demand scenarios and to perform benefits assessments of current and future airspace technologies and operational concepts. Facilitating these studies requires that the simulation architecture support plug-and-play of different air traffic control, traffic flow management, and airline operation center models, as well as multi-fidelity modeling of flights, airports, and airspaces. The simulation is divided into two parts that are named, borrowing from classical control theory terminology, control and plant. The control consists of air traffic control, traffic flow management, and airline operation center models, and the plant consists of flight, airport, and airspace models. The plant can run open loop, in the absence of the control; however, undesired effects, such as conflicts and congestion in the airspaces and airports, can occur. Different controls are applied, "plug and played," to the plant. A particular control is evaluated by analyzing how well it manages conflicts and congestion. The terminal-area plants consist of models of airports and terminal airspaces. Each model consists of a set of nodes and links which are connected by the user to form a network. Nodes model runways, fixes, taxi intersections, gates, and/or other points of interest, and links model taxiways, departure paths, and arrival paths. Metering, flow distribution, and sequencing functions can be applied at nodes. Links can use models of different fidelity for how a flight transits them. The fidelity of the model can be adjusted by the user by changing either the complexity of the node/link network or the way that the links model how flights transit from one node to the next.
How should Fitts' Law be applied to human-computer interaction?
NASA Technical Reports Server (NTRS)
Gillan, D. J.; Holden, K.; Adam, S.; Rudisill, M.; Magee, L.
1992-01-01
The paper challenges the notion that any Fitts' Law model can be applied generally to human-computer interaction, and proposes instead that applying Fitts' Law requires knowledge of the users' sequence of movements, direction of movement, and typical movement amplitudes as well as target sizes. Two experiments examined a text selection task with sequences of controlled movements (point-click and point-drag). For the point-click sequence, a Fitts' Law model that used the diagonal across the text object in the direction of pointing (rather than the horizontal extent of the text object) as the target size provided the best fit for the pointing time data, whereas for the point-drag sequence, a Fitts' Law model that used the vertical size of the text object as the target size gave the best fit. Dragging times were fitted well by Fitts' Law models that used either the vertical or horizontal size of the terminal character in the text object. Additional results of note were that pointing in the point-click sequence was consistently faster than in the point-drag sequence, and that pointing in either sequence was consistently faster than dragging. The discussion centres around the need to define task characteristics before applying Fitts' Law to an interface design or analysis, analyses of pointing and of dragging, and implications for interface design.
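The study's central point, that the choice of target width W changes the Fitts' Law prediction, can be sketched with the common Shannon formulation MT = a + b·log2(A/W + 1); the coefficients and object dimensions below are illustrative assumptions, and the paper itself compares several Fitts' Law variants rather than this one specifically.

```python
# Sketch of how the predicted movement time depends on which extent of a text
# object is taken as the Fitts' Law target width W. Coefficients a, b and the
# object dimensions are illustrative assumptions.
import math

def fitts_mt(a, b, A, W):
    """Movement time from the Shannon formulation MT = a + b*log2(A/W + 1)."""
    return a + b * math.log2(A / W + 1)

a, b = 0.10, 0.15          # s and s/bit (hypothetical regression coefficients)
A = 200.0                  # movement amplitude in pixels
width, height = 120.0, 16.0
diag = math.hypot(width, height)

print("W = diagonal:", fitts_mt(a, b, A, diag))    # reported best fit for point-click
print("W = height:  ", fitts_mt(a, b, A, height))  # reported best fit for point-drag
```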
ERIC Educational Resources Information Center
Godwin, Stephen; McAndrew, Patrick; Santos, Andreia
2008-01-01
Web-enabled technology is now being applied on a large scale. In this paper we look at open access provision of teaching and learning leading to many users with varying patterns and motivations for use. This has provided us with a research challenge to find methods that help us understand and explain such initiatives. We describe ways to model the…
NASA Gulf of Mexico Initiative Hypoxia Research
NASA Technical Reports Server (NTRS)
Armstrong, Curtis D.
2012-01-01
The Applied Science & Technology Project Office at Stennis Space Center (SSC) manages NASA's Gulf of Mexico Initiative (GOMI). Addressing short-term crises and long-term issues, GOMI participants seek to understand the environment using remote sensing, in-situ observations, laboratory analyses, field observations and computational models. New capabilities are transferred to end-users to help them make informed decisions. Some GOMI activities of interest to the hypoxia research community are highlighted.
Experimental Investigation and Numerical Prediction of a Cross-Flow Fan
2006-12-01
Recoverable fragments: Figure 3, combination probes and pressure tap layout; Figure 4, CFF_DAQ graphical user interface. Instrumentation included United Sensor Devices model USD-C-161 3 mm (1/8-inch) combination thermocouple/pressure probes and static pressure taps, with three static pressure taps at the throat of the bell-mouth and two in the exhaust duct.
SSM - SOLID SURFACE MODELER, VERSION 6.0
NASA Technical Reports Server (NTRS)
Goza, S. P.
1994-01-01
The Solid Surface Modeler (SSM) is an interactive graphics software application for solid-shaded and wireframe three- dimensional geometric modeling. It enables the user to construct models of real-world objects as simple as boxes or as complex as Space Station Freedom. The program has a versatile user interface that, in many cases, allows mouse input for intuitive operation or keyboard input when accuracy is critical. SSM can be used as a stand-alone model generation and display program and offers high-fidelity still image rendering. Models created in SSM can also be loaded into other software for animation or engineering simulation. (See the information below for the availability of SSM with the Object Orientation Manipulator program, OOM, a graphics software application for three-dimensional rendering and animation.) Models are constructed within SSM using functions of the Create Menu to create, combine, and manipulate basic geometric building blocks called primitives. Among the simpler primitives are boxes, spheres, ellipsoids, cylinders, and plates; among the more complex primitives are tubes, skinned-surface models and surfaces of revolution. SSM also provides several methods for duplicating models. Constructive Solid Geometry (CSG) is one of the most powerful model manipulation tools provided by SSM. The CSG operations implemented in SSM are union, subtraction and intersection. SSM allows the user to transform primitives with respect to each axis, transform the camera (the user's viewpoint) about its origin, apply texture maps and bump maps to model surfaces, and define color properties; to select and combine surface-fill attributes, including wireframe, constant, and smooth; and to specify models' points of origin (the positions about which they rotate). SSM uses Euler angle transformations for calculating the results of translation and rotation operations. The user has complete control over the modeling environment from within the system. A variety of file formats are supported to facilitate modification of models and to provide for translation to other formats. This combination of features makes SSM valuable for research and development beyond its intended role in the creation of simulation and animation models. SSM makes an important distinction between models, objects, and surfaces. Models consist of one or more objects and are the highest level geometric entity upon which SSM operates. File operations are performed solely at the model level. (All primitives are models consisting of a single object.) The majority of SSM's manipulation functions operate at the object level. Objects consist of one or more surfaces and surfaces may consist of one or more polygons, which are the structural basis for the modeling method used by SSM. Surfaces are the lowest-level geometric entity upon which SSM operates. Surface-fill attributes, for example, may be assigned at the surface level. Surfaces cannot exist except as part of an object and objects cannot exist except as part of a model. SSM can simultaneously accommodate as many models as the host computer's memory permits. In its default display mode, SSM renders model surfaces using two shading methods: constant shading and smooth shading. Constant shading reveals each polygon of an object's surfaces, giving the object an angular appearance. Smooth shading causes an object's polygons to blend into one another, giving its surfaces a smooth, continuous appearance. 
When used in proper combination, each of these methods contributes to object realism. SSM applies each method automatically during the creation of primitives, but the user can manually override the default settings. Both fill attributes and shading characteristics can be defined for individual surfaces, objects, and models. SSM provides two optional display modes for reducing rendering time for complex models. In wireframe mode, SSM represents all model geometry data in unshaded line drawings, and no hidden-surface removal is performed. In simple mode, only the outermost boundaries (or bounding volume) that define each model are depicted. In either case the user is allowed to trade off visual authenticity for update speed. SSM is written in C for implementation on SGI IRIS 4D series workstations running the IRIX operating system. A minimum of 8Mb of RAM is recommended for this program. The standard distribution medium for SSM is a .25 inch streaming magnetic IRIX tape cartridge in UNIX tar format. SSM is also offered as a bundle with a related program, OOM (Object Orientation Manipulator). Please see the abstract for SSM/OOM (COS-10047) for information about the bundled package. Version 6.0 of SSM was released in 1993.
Development of a software tool to support chemical and biological terrorism intelligence analysis
NASA Astrophysics Data System (ADS)
Hunt, Allen R.; Foreman, William
1997-01-01
AKELA has developed a software tool which uses a systems-analytic approach to model the critical processes that support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these model components.
Sensitivity analysis of dynamic biological systems with time-delays.
Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang
2010-10-15
Mathematical modeling has long been applied to the study and analysis of complex biological systems. Some processes in biological systems, such as gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of the model and sensitivity equations with time delays. The major effort is the computation of the Jacobian matrix when computing the solution of the sensitivity equations. The computation of partial derivatives of complex equations, either by the analytic method or by symbolic manipulation, is time consuming, inconvenient, and prone to human error. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We have proposed an efficient algorithm with adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). The adaptive direct-decoupled algorithm is extended here to solve for the solution and dynamic sensitivities of time-delay systems described by DDEs. To save human effort and avoid human errors in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis of DDE models with little user intervention. Compared with direct-coupled methods in theory, the extended algorithm is efficient, accurate, and easy to use for end users without a programming background to perform dynamic sensitivity analysis on complex biological systems with time delays.
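As a minimal illustration of what a DDE model looks like numerically, the sketch below integrates the delayed logistic (Hutchinson) equation with a fixed-step Euler scheme and a history buffer; it is a generic textbook example, not one of the two biological models analyzed in the paper, and it does not implement the adaptive direct-decoupled sensitivity algorithm.

```python
# Minimal illustration of a delay differential equation (DDE): the delayed
# logistic (Hutchinson) equation dx/dt = r*x(t)*(1 - x(t - tau)/K), solved with
# a fixed-step Euler scheme and a history buffer. A generic textbook example.
import numpy as np

r, K, tau = 1.0, 1.0, 1.5       # growth rate, carrying capacity, delay
dt, t_end = 0.01, 40.0
lag = int(round(tau / dt))      # number of steps spanned by the delay

n = int(t_end / dt)
x = np.empty(n + 1)
x[:lag + 1] = 0.1               # constant history x(t) = 0.1 for t <= 0

for i in range(lag, n):
    x_delayed = x[i - lag]      # x(t - tau)
    x[i + 1] = x[i] + dt * r * x[i] * (1.0 - x_delayed / K)

print(x[-5:])                   # oscillatory approach to K (a limit cycle appears for r*tau > pi/2)
```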
NASA Astrophysics Data System (ADS)
Leavesley, G.; Markstrom, S.; Frevert, D.; Fulp, T.; Zagona, E.; Viger, R.
2004-12-01
Increasing demands for limited fresh-water supplies, and increasing complexity of water-management issues, present the water-resource manager with the difficult task of achieving an equitable balance of water allocation among a diverse group of water users. The Watershed and River System Management Program (WARSMP) is a cooperative effort between the U.S. Geological Survey (USGS) and the Bureau of Reclamation (BOR) to develop and deploy a database-centered, decision-support system (DSS) to address these multi-objective, resource-management problems. The decision-support system couples the USGS Modular Modeling System (MMS) with the BOR RiverWare tools using a shared relational database. MMS is an integrated system of computer software that provides a research and operational framework to support the development and integration of a wide variety of hydrologic and ecosystem models, and their application to water- and ecosystem-resource management. RiverWare is an object-oriented reservoir and river-system modeling framework developed to provide tools for evaluating and applying water-allocation and management strategies. The modeling capabilities of MMS and RiverWare include simulating watershed runoff, reservoir inflows, and the impacts of resource-management decisions on municipal, agricultural, and industrial water users, environmental concerns, power generation, and recreational interests. Forecasts of future climatic conditions are a key component in the application of MMS models to resource-management decisions. Forecast methods applied in MMS include a modified version of the National Weather Service's Extended Streamflow Prediction Program (ESP) and statistical downscaling from atmospheric models. The WARSMP DSS is currently operational in the Gunnison River Basin, Colorado; Yakima River Basin, Washington; Rio Grande Basin in Colorado and New Mexico; and Truckee River Basin in California and Nevada.
A Pre-launch Analysis of NASA's SMAP Mission Data
NASA Astrophysics Data System (ADS)
Escobar, V. M.; Brown, M. E.
2012-12-01
Product applications have become an integral part of converting the data collected into actionable knowledge that can be used to inform policy. Successfully bridging scientific research with operational decision making in different application areas requires looking into thematic user requirements and data requirements. NASA's Soil Moisture Active/Passive (SMAP) mission has an applications program that actively seeks to integrate the data prior to launch into a broad range of environmental monitoring and decision making systems, from drought and flood guidance to disease risk assessment and national security. SMAP is a combined active/passive microwave instrument which will be launched into a near-polar orbit in late 2014. It aims to produce a series of soil moisture and soil freeze/thaw products with an accuracy of +/- 10%, a nominal resolution of between 3 and 40 km, and a latency between 12 hours and 7 days. These measurements will be used to enhance the understanding of processes that link the water, energy and carbon cycles, and to extend the capabilities of weather and climate prediction models. The driving success of the SMAP applications program lies in joining mission scientists with thematic end users, leveraging the knowledge base of soil moisture data applications, increasing the speed of SMAP data product ingestion into critical processes and research, and improving the societal benefits of the science. Because SMAP has not yet launched, the mission is using test algorithms to determine how the data will interact with existing processes. The objective of this review is to solicit data requirements, accuracy needs and current understanding of the SMAP mission from the user community and then feed that back into mission product development. Thus, understanding how users will apply SMAP data, prior to the satellite's launch, is an important component of SMAP Applied Sciences and one of NASA's measures for mission success. This paper presents an analysis of an email-based review of expert end users and earth science researchers, eliciting how pre-launch activities and research are being conducted within thematic groups' organizations. Our focus through the SMAP Applications Program will be to (1) improve the mission's understanding of the SMAP user community requirements, (2) document and communicate the perceived challenges and advantages to the mission scientists, and (3) facilitate the movement of science into policy and decision making arenas. We will analyze the data from this review to understand the perceived benefits of pre-launch efforts and user engagement, and to define areas where the connection between science development and user engagement can continue to improve and further benefit future missions' pre-launch efforts. The research will facilitate collaborative opportunities between agencies, broadening the fields of science where soil moisture observation data can be applied.
Tree Colors: Color Schemes for Tree-Structured Data.
Tennekes, Martijn; de Jonge, Edwin
2014-12-01
We present a method to map tree structures to colors from the Hue-Chroma-Luminance color model, which is known for its well-balanced perceptual properties. The Tree Colors method can be tuned with several parameters, whose effect on the resulting color schemes is discussed in detail. We provide a free and open source implementation with sensible parameter defaults. Categorical data are very common in statistical graphics, and often these categories form a classification tree. We evaluate applying Tree Colors to tree-structured data with a survey on a large group of users from a national statistical institute. Our user study suggests that Tree Colors are useful, not only for improving node-link diagrams, but also for unveiling tree structure in non-hierarchical visualizations.
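The paper's exact parameterization is not reproduced here, but the basic mechanism of assigning perceptually organized colors to a tree can be sketched as recursive subdivision of the hue circle among siblings, with chroma and luminance varied by depth. Everything below (the hue-shrink fraction, the chroma and luminance steps, and the example taxonomy) is an illustrative assumption rather than the published defaults; conversion from HCL triples to RGB is left to a color library.

```python
# Assign (hue, chroma, luminance) triples to the nodes of a tree by recursively
# splitting each node's hue range among its children; chroma and luminance
# change with depth. Parameter values are illustrative, not the paper's defaults.
def tree_colors(tree, hue_range=(0.0, 360.0), depth=0,
                fraction=0.75, chroma0=60.0, lum0=90.0):
    """tree is a dict {node_name: subtree_dict}; returns {node_name: (H, C, L)}."""
    colors = {}
    lo, hi = hue_range
    children = list(tree.items())
    width = (hi - lo) / max(len(children), 1)
    for i, (name, subtree) in enumerate(children):
        child_lo, child_hi = lo + i * width, lo + (i + 1) * width
        mid = 0.5 * (child_lo + child_hi)
        # deeper nodes: higher chroma, lower luminance (one of many possible choices)
        colors[name] = (mid % 360.0, chroma0 + 15.0 * depth, lum0 - 10.0 * depth)
        # shrink each child's hue range around its midpoint to keep siblings separated
        shrink = 0.5 * fraction * (child_hi - child_lo)
        colors.update(tree_colors(subtree, (mid - shrink, mid + shrink), depth + 1,
                                  fraction, chroma0, lum0))
    return colors

taxonomy = {"Economy": {"Trade": {}, "Industry": {"Mining": {}, "Energy": {}}},
            "Society": {"Health": {}, "Education": {}}}
for node, hcl in tree_colors(taxonomy).items():
    print(node, hcl)
```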
Updated Results for the Wake Vortex Inverse Model
NASA Technical Reports Server (NTRS)
Robins, Robert E.; Lai, David Y.; Delisi, Donald P.; Mellman, George R.
2008-01-01
NorthWest Research Associates (NWRA) has developed an Inverse Model for inverting aircraft wake vortex data. The objective of the inverse modeling is to obtain estimates of the vortex circulation decay and crosswind vertical profiles, using time history measurements of the lateral and vertical position of aircraft vortices. The Inverse Model performs iterative forward model runs using estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Iterations are performed until a user-defined criterion is satisfied. Outputs from an Inverse Model run are the best estimates of the time history of the vortex circulation derived from the observed data, the vertical crosswind profile, and several vortex parameters. The forward model, named SHRAPA, used in this inverse modeling is a modified version of the Shear-APA model, and it is described in Section 2 of this document. Details of the Inverse Model are presented in Section 3. The Inverse Model was applied to lidar-observed vortex data at three airports: FAA acquired data from San Francisco International Airport (SFO) and Denver International Airport (DEN), and NASA acquired data from Memphis International Airport (MEM). The results are compared with observed data. This Inverse Model validation is documented in Section 4. A summary is given in Section 5. A user's guide for the inverse wake vortex model is presented in a separate NorthWest Research Associates technical report (Lai and Delisi, 2007a).
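SHRAPA and the actual convergence criterion are not described in the abstract, so the following sketch only illustrates the overall structure of such an inversion: repeated forward-model runs driven by a least-squares fit until a tolerance is met. The forward model, parameter set, and tolerance below are hypothetical stand-ins, not the NWRA formulation.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical stand-in for the forward model: given circulation, decay and
# crosswind parameters, predict vortex lateral position versus wake age.
def forward_model(params, t):
    gamma0, decay, crosswind = params          # illustrative parameter set
    drift = crosswind * t                      # advection by the crosswind
    induced = 0.01 * gamma0 * (1.0 - np.exp(-decay * t)) / decay
    return drift + induced                     # lateral position, metres

def invert(t_obs, y_obs, guess=(400.0, 0.05, 2.0), tol=1e-8):
    """Iterate forward-model runs until the residual-based criterion is met."""
    res = least_squares(lambda p: forward_model(p, t_obs) - y_obs,
                        x0=guess, xtol=tol)
    return res.x                               # best-estimate parameters

t = np.linspace(1.0, 60.0, 30)                 # wake age, s
truth = (500.0, 0.04, 1.5)
y = forward_model(truth, t) + np.random.default_rng(0).normal(0, 0.5, t.size)
print(invert(t, y))                            # recovers values near `truth`
```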
Morris-Kukoski, Cynthia L; Montgomery, Madeline A; Hammer, Rena L
2014-01-01
Samples from a self-proclaimed cocaine (COC) user, from 19 drug users (postmortem) and from 27 drug chemists were extensively washed and analyzed for COC, benzoylecgonine, norcocaine (NC), cocaethylene (CE) and aryl hydroxycocaines by liquid chromatography-tandem mass spectrometry. Published wash criteria and cutoffs were applied to the results. Additionally, the data were used to formulate new reporting criteria and interpretation guidelines for forensic casework. Applying the wash and reporting criteria, hair that was externally contaminated with COC was distinguished from hair collected from individuals known to have consumed COC. In addition, CE, NC and hydroxycocaine metabolites were only present in COC users' hair and not in drug chemists' hair. When properly applied, the use of an extended wash, along with the reporting criteria defined here, will exclude false-positive results from environmental contact with COC. Published by Oxford University Press 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.
NASA Astrophysics Data System (ADS)
Roncoli, Carla; Kirshen, Paul; Etkin, Derek; Sanon, Moussa; Somé, Léopold; Dembélé, Youssouf; Sanfo, Bienvenue J.; Zoungrana, Jacqueline; Hoogenboom, Gerrit
2009-10-01
This study focuses on the potential role of technical and institutional innovations for improving water management in a multi-user context in Burkina Faso. We focus on a system centered on three reservoirs that capture the waters of the Upper Comoé River Basin and service a diversity of users, including a sugar manufacturing company, an urban water supply utility, a farmer cooperative, and other downstream users. Due to variable and declining rainfall and expanding users' needs, drastic fluctuations in water supply and demand occur during each dry season. A decision support tool was developed through participatory research to enable users to assess the impact of alternative release and diversion schedules on the deficits faced by each user. The tool is meant to be applied in the context of consultative planning by a local user committee that has been created by a new national integrated water management policy. We contend that both solid science and good governance are instrumental in realizing efficient and equitable water management and adaptation to climate variability and change. But, while modeling tools and negotiation platforms may assist users in managing climate risk, they also introduce additional uncertainties into the deliberative process. It is therefore imperative to understand how these technological and institutional innovations frame water use issues and decisions, to ensure that such framing is consistent with the goals of integrated water resource management.
Perceptual grouping effects on cursor movement expectations.
Dorneich, Michael C; Hamblin, Christopher J; Lancaster, Jeff A; Olofinboba, Olu
2014-05-01
Two studies were conducted to develop an understanding of factors that drive user expectations when navigating between discrete elements on a display via a limited degree-of-freedom cursor control device. For the Orion Crew Exploration Vehicle spacecraft, a free-floating cursor with a graphical user interface (GUI) would require an unachievable level of accuracy due to expected acceleration and vibration conditions during dynamic phases of flight. Therefore, the Orion program proposed using a "caged" cursor to "jump" from one controllable element (node) on the GUI to another. However, nodes are not likely to be arranged on a rectilinear grid, and so movements between nodes are not obvious. Proximity between nodes, direction of nodes relative to each other, and context features may all contribute to user cursor movement expectations. In an initial study, we examined user expectations based on the nodes themselves. In a second study, we examined the effect of context features on user expectations. The studies established that perceptual grouping effects influence expectations to varying degrees. Based on these results, a simple rule set was developed to support users in building a straightforward mental model that closely matches their natural expectations for cursor movement. The results will help designers of display formats take advantage of the natural context-driven cursor movement expectations of users to reduce navigation errors, increase usability, and decrease access time. The rule set and guidelines tie theory to practice and can be applied in environments where vibration or acceleration are significant, including spacecraft, aircraft, and automobiles.
NASA Astrophysics Data System (ADS)
Yang, Z. L.; Cao, J.; Hu, K.; Gui, Z. P.; Wu, H. Y.; You, L.
2016-06-01
Efficiently discovering and applying geospatial information resources (GIRs) online is critical in the Earth Science domain as well as for cross-disciplinary applications. However, achieving this is challenging due to the heterogeneity, complexity and privacy of online GIRs. In this article, GeoSquare, a collaborative online geospatial information sharing and geoprocessing platform, was developed to tackle this problem. Specifically, (1) GIR registration and multi-view query functions allow users to publish and discover GIRs more effectively. (2) Online geoprocessing and real-time execution status checking help users process data and conduct analysis without pre-installation of cumbersome professional tools on their own machines. (3) A service chain orchestration function enables domain experts to contribute and share their domain knowledge with community members through workflow modeling. (4) User inventory management allows registered users to collect and manage their own GIRs, monitor their execution status, and track their own geoprocessing histories. In addition, to enhance the flexibility and capacity of GeoSquare, distributed storage and cloud computing technologies are employed. To support interactive teaching and training, GeoSquare adopts rich internet application (RIA) technology to create a user-friendly graphical user interface (GUI). Results show that GeoSquare can integrate and foster collaboration among dispersed GIRs, computing resources and people. Subsequently, educators and researchers can share and exchange resources in an efficient and harmonious way.
Nieke, Jens; Reusen, Ils
2007-01-01
User-driven requirements for remote sensing data are difficult to define, especially details of geometric, spectral and radiometric parameters. Even more difficult is a decent assessment of the required degrees of processing and corresponding data quality. It is therefore a real challenge to appropriately assess data costs and the services to be provided. In 2006, the HYRESSA project was initiated within the Framework 6 programme of the European Commission to analyze the user needs of the hyperspectral remote sensing community. Special focus was given to finding an answer to the key question, “What are the individual user requirements for hyperspectral imagery and its related data products?”. A Value-Benefit Analysis (VBA) was performed to retrieve user needs and address open items accordingly. The VBA is an established tool for systematic problem solving that supports the comparison of competing projects or solutions. It enables evaluation on the basis of a multidimensional objective model and can be augmented with experts' preferences. After the VBA, a scaling method (e.g., the Law of Comparative Judgment) was applied to achieve the desired ranking judgments. The result, which is the relative value of projects with respect to a well-defined main objective, can therefore be produced analytically using a VBA. A multidimensional objective model adhering to VBA methodology was established. Thereafter, end users and experts were requested to fill out a Questionnaire of User Needs (QUN) at the highest level of detail - the value indicator level. Each end user was additionally requested to report personal preferences for his or her particular research field. In the end, results from the experts' evaluation and results from a sensor data survey can be compared in order to understand user needs and the drawbacks of existing data products. The investigation – focusing on the needs of the hyperspectral user community in Europe – showed that a VBA is a suitable method for analyzing the needs of hyperspectral data users and supporting the sensor/data specification-building process. The VBA has the advantage of being easy to handle, resulting in a comprehensive evaluation. The primary disadvantage is the large effort required to conduct such an analysis, because the level of detail is extremely high.
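The questionnaire items and expert weights used in HYRESSA are not given in the abstract; the sketch below only illustrates the arithmetic core of a value-benefit analysis, a weighted aggregation of indicator scores over a two-level objective model. The criteria, weights, indicator scores, and sensor names are invented for illustration.

```python
# Aggregate value-indicator scores up a simple two-level objective model with
# expert-supplied weights. Criteria, weights, scores and sensor names are invented.
def vba_score(weights, scores):
    """weights: {criterion: {indicator: weight}}; scores: {indicator: 1..10 rating}."""
    return sum(w * scores[indicator]
               for indicators in weights.values()
               for indicator, w in indicators.items())

weights = {"geometric":  {"spatial_resolution": 0.20, "geolocation_accuracy": 0.10},
           "spectral":   {"band_count": 0.15, "snr": 0.25},
           "processing": {"atmospheric_correction": 0.20, "delivery_latency": 0.10}}
candidates = {"sensor_A": {"spatial_resolution": 7, "geolocation_accuracy": 8,
                           "band_count": 9, "snr": 6,
                           "atmospheric_correction": 5, "delivery_latency": 8},
              "sensor_B": {"spatial_resolution": 9, "geolocation_accuracy": 6,
                           "band_count": 6, "snr": 8,
                           "atmospheric_correction": 8, "delivery_latency": 5}}
ranking = sorted(candidates, key=lambda s: vba_score(weights, candidates[s]), reverse=True)
print(ranking)   # relative value with respect to the weighted objective model
```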
A Study into the Method of Precise Orbit Determination of a HEO Orbiter by GPS and Accelerometer
NASA Technical Reports Server (NTRS)
Ikenaga, Toshinori; Hashida, Yoshi; Unwin, Martin
2007-01-01
In the present day, orbit determination by the Global Positioning System (GPS) is not unusual. Especially for low-cost small satellites, position determination by an on-board GPS receiver provides a cheap, reliable and precise method. However, the original purpose of GPS is to serve ground users, so the transmissions from all of the GPS satellites are directed toward the Earth's surface. Hence there are some restrictions for users above the GPS constellation in detecting those signals. On the other hand, a desire for precise orbit determination exists for users in orbits higher than the GPS constellation. For example, the next Japanese Very Long Baseline Interferometry (VLBI) mission "ASTRO-G" aims to determine its orbit to an accuracy of a few centimeters at apogee. The use of GPS is essential for such ultra-accurate orbit determination. This study aims to construct a method for precise orbit determination for such high-orbit users, especially in Highly Elliptical Orbits (HEOs). There are several approaches to this objective. In this study, a hybrid method with GPS and an accelerometer is chosen. Basically, while the position cannot be determined by an on-board GPS receiver or another Range and Range Rate (RARR) method, all we can do to estimate the user satellite's position is to propagate the orbit along with the force model, which is not perfectly correct. However, if the satellite carries an accelerometer (ACC), the coefficients of the air drag and the solar radiation pressure applied to the user satellite can be updated, and propagation along with the "updated" force model can then improve the fitting accuracy of the user satellite's orbit. In this study, an accelerometer available on the present market is assumed. The effects of an accelerometer bias error will also be discussed in this paper.
NASA Astrophysics Data System (ADS)
Burkett, E. R.; Jayanty, N. K.; Sellnow, D. D.; Given, D. D.; DeGroot, R. M.
2016-12-01
Methods that use storytelling to gather and synthesize data from people can be advantageous in understanding user needs and designing successful communication products. Using a multidisciplinary approach, we research and prioritize user needs for the ShakeAlert Earthquake Early Warning system (http://pubs.usgs.gov/fs/2014/3083/), drawing on best practices from social and behavioral science, risk communication, and human-centered design. We apply quantitative and qualitative human data collection methods including user surveys, interviews, journey maps, personas, and scenarios. Human-centered design methods leverage storytelling (a) in the acquisition of qualitative behavioral data (e.g. with journey mapping), (b) through goal-driven behaviors and needs that are synthesized into a persona as a composite model of the data, and (c) within context scenarios (the story plot or projected circumstances) in which the persona is placed in context to inform the design of relevant and usable products or services. ShakeAlert, operated by the USGS and partners, has transitioned into a production prototype phase in which users are permitted to begin testing pilot implementations to take protective actions in response to an earthquake alert. While a subset of responses will be automated (e.g., opening fire house doors), other applications of the technology will alert individuals by broadcast, public address, or mobile device notifications and require self-protective behavioral decisions (e.g., "Drop, Cover, and Hold On"). To better understand ShakeAlert user decisions and needs, we use human-centered design methods to synthesize aggregated behavioral data into "personas," which model the common behavioral patterns that can be used to guide plans for the ShakeAlert interface, messaging, and training. We present user data, methods, and resulting personas that will inform decisions moving forward to shape ShakeAlert messaging and training that will be most usable by alert recipients.
Bailey, B A; Hare, D J; Hatton, C; Limb, K
2006-03-01
Previous studies have attempted to apply Weiner's attributional model of helping behaviour to care staff who work with service users with intellectual disabilities and challenging behaviours by using studies based on vignettes. The aims of the current study were to investigate the application of Weiner's model to 'real' service users with intellectual disabilities and challenging behaviours and to observe the care staff's actual responses to challenging behaviours displayed by service users. Also, to compare care staff attributions, emotions, optimism, willingness to help and observed helping behaviours for self-injurious behaviours in comparison to other forms of challenging behaviours. A total of 27 care staff completed two sets of measures, one set regarding a self-injurious behaviour and the other regarding other forms of challenging behaviour. An additional 16 staff completed one set of measures. The measures focused on care staff attributions, emotions, optimism and willingness to help. Also, 16 of the care staff were observed interacting with the service users to collect data regarding their responses to challenging behaviours. For both self-injurious behaviours and other forms of challenging behaviour, associations were found between the care staff internal, stable and uncontrollable attribution scores and care staff negative emotion scores. However, no associations were found between the care staff levels of emotion, optimism and willingness to help. Some associations were found between the care staff levels of willingness to help and observed helping behaviours. There were significant differences between the care staff attribution scores with higher scores being obtained for uncontrollable and stable attributions for other forms of challenging behaviours. No significant differences were found between the care staff emotions, optimism, willingness to help and observed helping behaviours. The results did not provide support for Weiner's attributional model of helping behaviour. However, a preliminary model of negative care staff behaviour was derived from the exploratory analyses completed. This model proposes that there are associations between internal, stable and uncontrollable attributions and negative emotions in care staff and also between negative emotions and negative behaviours displayed by care staff in response to the actions of service users.
Binaural enhancement for bilateral cochlear implant users.
Brown, Christopher A
2014-01-01
Bilateral cochlear implant (BCI) users receive limited binaural cues and, thus, show little improvement to speech intelligibility from spatial cues. The feasibility of a method for enhancing the binaural cues available to BCI users is investigated. This involved extending interaural differences of levels, which typically are restricted to high frequencies, into the low-frequency region. Speech intelligibility was measured in BCI users listening over headphones and with direct stimulation, with a target talker presented to one side of the head in the presence of a masker talker on the other side. Spatial separation was achieved by applying either naturally occurring binaural cues or enhanced cues. In this listening configuration, BCI patients showed greater speech intelligibility with the enhanced binaural cues than with naturally occurring binaural cues. In some situations, it is possible for BCI users to achieve greater speech intelligibility when binaural cues are enhanced by applying interaural differences of levels in the low-frequency region.
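The published enhancement algorithm is not reproduced in the abstract; the sketch below shows one simple way the idea could be realized: estimate the frame-wise interaural level difference (ILD) from the high-frequency band and impose a corresponding attenuation on the low-frequency band of the far ear. Band edges, frame length, filter orders, and the test signal are illustrative assumptions, not the author's processing chain.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 16000
def split_bands(x, fc=1500.0):
    """Split a signal into low- and high-frequency bands at fc (Hz)."""
    sos_lo = butter(4, fc, btype="low", fs=fs, output="sos")
    sos_hi = butter(4, fc, btype="high", fs=fs, output="sos")
    return sosfiltfilt(sos_lo, x), sosfiltfilt(sos_hi, x)

def enhance_ild(left, right, frame=512):
    """Copy the high-frequency interaural level difference onto the low band."""
    out_l, out_r = left.copy(), right.copy()
    for start in range(0, len(left) - frame, frame):
        sl = slice(start, start + frame)
        lo_l, hi_l = split_bands(left[sl])
        lo_r, hi_r = split_bands(right[sl])
        rms = lambda s: np.sqrt(np.mean(s ** 2) + 1e-12)
        ild_db = 20.0 * np.log10(rms(hi_l) / rms(hi_r))   # positive: source to the left
        gain = 10.0 ** (-abs(ild_db) / 20.0)              # attenuate the far ear's low band
        if ild_db > 0:
            lo_r = lo_r * gain
        else:
            lo_l = lo_l * gain
        out_l[sl], out_r[sl] = lo_l + hi_l, lo_r + hi_r
    return out_l, out_r

# Example: a 500 Hz tone plus a 3 kHz component carrying a 10 dB ILD.
t = np.arange(fs) / fs
left = np.sin(2 * np.pi * 500 * t) + 0.5 * np.sin(2 * np.pi * 3000 * t)
right = np.sin(2 * np.pi * 500 * t) + 0.5 * 10 ** (-10 / 20) * np.sin(2 * np.pi * 3000 * t)
enh_l, enh_r = enhance_ild(left, right)   # low band now also carries the level difference
```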
Development and application of computational aerothermodynamics flowfield computer codes
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj
1994-01-01
Research was performed in the area of computational modeling and application of hypersonic, high-enthalpy, thermo-chemical nonequilibrium flow (aerothermodynamics) problems. A number of computational fluid dynamics (CFD) codes were developed and applied to simulate high-altitude rocket plumes, the Aeroassist Flight Experiment (AFE), hypersonic base flow for planetary probes, the single expansion ramp nozzle (SERN) model connected with the National Aerospace Plane, hypersonic drag devices, hypersonic ramp flows, ballistic range models, shock tunnel facility nozzles, transient and steady flows in the shock tunnel facility, arc-jet flows, thermochemical nonequilibrium flows around simple and complex bodies, axisymmetric ionized flows of interest to re-entry, unsteady shock-induced combustion phenomena, high-enthalpy pulsed facility simulations, and unsteady shock boundary layer interactions in shock tunnels. Computational modeling involved developing appropriate numerical schemes for the flows of interest and developing, applying, and validating appropriate thermochemical processes. As part of improving the accuracy of the numerical predictions, adaptive grid algorithms were explored, and a user-friendly, self-adaptive code (SAGE) was developed. Aerothermodynamic flows of interest included energy transfer due to strong radiation, and a significant level of effort was spent in developing computational codes for calculating radiation and radiation modeling. In addition, computational tools were developed and applied to predict the radiative heat flux and spectra that reach the model surface.
Code of Federal Regulations, 2010 CFR
2010-07-01
... term Industrial User or User means a source of Indirect Discharge. (k) The term Interference means a... the EPA in accordance with section 307 (b) and (c) of the Act, which applies to Industrial Users. This... feasibility, engineering, and design studies do not constitute a contractual obligation under this paragraph...
Code of Federal Regulations, 2012 CFR
2012-07-01
... term Industrial User or User means a source of Indirect Discharge. (k) The term Interference means a... the EPA in accordance with section 307 (b) and (c) of the Act, which applies to Industrial Users. This... feasibility, engineering, and design studies do not constitute a contractual obligation under this paragraph...
Code of Federal Regulations, 2013 CFR
2013-07-01
... term Industrial User or User means a source of Indirect Discharge. (k) The term Interference means a... the EPA in accordance with section 307 (b) and (c) of the Act, which applies to Industrial Users. This... feasibility, engineering, and design studies do not constitute a contractual obligation under this paragraph...
Code of Federal Regulations, 2014 CFR
2014-07-01
... term Industrial User or User means a source of Indirect Discharge. (k) The term Interference means a... the EPA in accordance with section 307 (b) and (c) of the Act, which applies to Industrial Users. This... feasibility, engineering, and design studies do not constitute a contractual obligation under this paragraph...
Code of Federal Regulations, 2011 CFR
2011-07-01
... term Industrial User or User means a source of Indirect Discharge. (k) The term Interference means a... the EPA in accordance with section 307 (b) and (c) of the Act, which applies to Industrial Users. This... feasibility, engineering, and design studies do not constitute a contractual obligation under this paragraph...
NASA Technical Reports Server (NTRS)
Carnahan, Richard S., Jr.; Corey, Stephen M.; Snow, John B.
1989-01-01
Applications of rapid prototyping and Artificial Intelligence techniques to problems associated with Space Station-era information management systems are described. In particular, the work is centered on issues related to: (1) intelligent man-machine interfaces applied to scientific data user support, and (2) the requirement that intelligent information management systems (IIMS) be able to efficiently process metadata updates concerning the types of data handled. The advanced IIMS represents functional capabilities driven almost entirely by the needs of potential users. The volume of scientific data projected to be generated in the Space Station era is likely to be significantly greater than the data currently processed and analyzed. Information about scientific data must be presented clearly, concisely, and with support features to allow users at all levels of expertise efficient and cost-effective data access. Additionally, mechanisms for allowing more efficient IIMS metadata update processes must be addressed. The work reported covers the following IIMS design aspects: IIMS data and metadata modeling, including the automatic updating of IIMS-contained metadata; IIMS user-system interface considerations, including significant problems associated with remote access, user profiles, and on-line tutorial capabilities; and development of an IIMS query and browse facility, including the capability to deal with spatial information. A working prototype has been developed and is being enhanced.
A 3-D mixed-reality system for stereoscopic visualization of medical dataset.
Ferrari, Vincenzo; Megali, Giuseppe; Troia, Elena; Pietrabissa, Andrea; Mosca, Franco
2009-11-01
We developed a simple, light, and cheap 3-D visualization device based on mixed reality that can be used by physicians to see preoperative radiological exams in a natural way. The system allows the user to see stereoscopic "augmented images," which are created by mixing 3-D virtual models of anatomies, obtained by processing preoperative volumetric radiological images (computed tomography or MRI), with live images of the real patient grabbed by means of cameras. The interface of the system consists of a head-mounted display equipped with two high-definition cameras. The cameras are mounted in correspondence with the user's eyes and allow live images of the patient to be grabbed from the same point of view as the user. The system does not use any external tracker to detect movements of the user or the patient. The movements of the user's head and the alignment of the virtual patient with the real one are handled using machine vision methods applied to pairs of live images. Experimental results, concerning frame rate and alignment precision between the virtual and real patient, demonstrate that the machine vision methods used for localization are appropriate for the specific application and that systems based on stereoscopic mixed reality are feasible and can be proficiently adopted in clinical practice.
Parametric Structural Model for a Mars Entry Concept
NASA Technical Reports Server (NTRS)
Lane, Brittney M.; Ahmed, Samee W.
2017-01-01
This paper outlines the process of developing a parametric model for a vehicle that can withstand Earth launch and Mars entry conditions. This model allows the user to change a variety of parameters ranging from dimensions and meshing to materials and atmospheric entry angles to perform finite element analysis on the model for the specified load cases. While this work focuses on an aeroshell for Earth launch aboard the Space Launch System (SLS) and Mars entry, the model can be applied to different vehicles and destinations. This specific project derived from the need to deliver large payloads to Mars efficiently, safely, and cheaply. Doing so requires minimizing the structural mass of the body as much as possible. The code developed for this project allows for dozens of cases to be run with the single click of a button. The end result of the parametric model gives the user a sense of how the body reacts under different loading cases so that it can be optimized for its purpose. The data are reported in this paper and can provide engineers with a good understanding of the model and valuable information for improving the design of the vehicle. In addition, conclusions show that the frequency analysis drives the design and suggestions are made to reduce the significance of normal modes in the design.
Designing an intuitive web application for drug discovery scientists.
Karamanis, Nikiforos; Pignatelli, Miguel; Carvalho-Silva, Denise; Rowland, Francis; Cham, Jennifer A; Dunham, Ian
2018-06-01
We discuss how we designed the Open Targets Platform (www.targetvalidation.org), an intuitive application for bench scientists working in early drug discovery. To meet the needs of our users, we applied lean user experience (UX) design methods: we started engaging with users very early and carried out research, design and evaluation activities within an iterative development process. We also emphasize the collaborative nature of applying lean UX design, which we believe is a foundation for success in this and many other scientific projects. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Modelling exclusive meson pair production at hadron colliders
NASA Astrophysics Data System (ADS)
Harland-Lang, L. A.; Khoze, V. A.; Ryskin, M. G.
2014-04-01
We present a study of the central exclusive production of light meson pairs, concentrating on the region of lower invariant masses of the central system and/or meson transverse momentum, where perturbative QCD cannot be reliably applied. We describe in detail a phenomenological model, using the tools of Regge theory, that may be applied with some success in this regime, and we present the new, publicly available Dime Monte Carlo (MC) implementation of this model for light meson pair production. The MC implementation includes a fully differential treatment of the survival factor, which in general depends on all kinematic variables, and allows the so far reasonably unconstrained model parameters to be set by the user. We present predictions for the Tevatron and LHC, discuss and estimate the size of the proton-dissociative background, and show how future measurements may further test this Regge-based approach, as well as the soft hadronic model required to calculate the survival factor, in particular in the presence of tagged protons.
Deterministic propagation model for RFID using site-specific and FDTD
NASA Astrophysics Data System (ADS)
Cunha de Azambuja, Marcelo; Passuelo Hessel, Fabiano; Luís Berz, Everton; Bauermann Porfírio, Leandro; Ruhnke Valério, Paula; De Pieri Baladei, Suely; Jung, Carlos Fernando
2015-06-01
The conduction of experiments to evaluate a tag orientation and its readability in a laboratory offers great potential for reducing time and costs for users. This article presents a novel methodology for developing simulation models for RFID (radio-frequency identification) environments. The main challenges in adopting this model are: (1) to find out how the properties of each of the materials on which the tag is applied influence the read range, and to determine the power necessary for tag reading; and (2) to find out the power of the backscattered signal received by the tag when energised by the RF wave transmitted by the reader. The validation tests, performed in four different kinds of environments, with tags applied to six different kinds of materials, at six different distances and with a reader configured with three different powers, achieved an average of 95.3% accuracy in the best scenario and 87.0% in the worst scenario. The methodology can be easily duplicated to generate simulation models for other RFID environments.
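The site-specific and FDTD components of the model are beyond a short sketch, but the link-budget piece that relates transmit power, material loss, and read range can be illustrated with a free-space Friis calculation. The chip sensitivity and material loss values below are illustrative assumptions, not measurements from the article.

```python
import math

def tag_received_power_dbm(ptx_dbm, gtx_dbi, gtag_dbi, freq_hz, dist_m, material_loss_db=0.0):
    """Forward-link power at the tag chip via the free-space Friis equation,
    with an extra loss term standing in for the material the tag is applied to."""
    wavelength = 3e8 / freq_hz
    path_loss_db = 20 * math.log10(4 * math.pi * dist_m / wavelength)
    return ptx_dbm + gtx_dbi + gtag_dbi - path_loss_db - material_loss_db

def max_read_range_m(ptx_dbm, gtx_dbi, gtag_dbi, freq_hz,
                     chip_sensitivity_dbm=-18.0, material_loss_db=0.0):
    """Largest distance at which the chip still receives its activation power."""
    wavelength = 3e8 / freq_hz
    link_margin_db = (ptx_dbm + gtx_dbi + gtag_dbi
                      - material_loss_db - chip_sensitivity_dbm)
    return wavelength / (4 * math.pi) * 10 ** (link_margin_db / 20.0)

# 30 dBm reader, 6 dBi reader antenna, 2 dBi tag antenna at 915 MHz;
# the 3 dB material loss is an illustrative value, not a measured one.
print(round(max_read_range_m(30, 6, 2, 915e6, material_loss_db=3.0), 1), "m")
```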
NASA Technical Reports Server (NTRS)
Morgenthaler, George W.; Glover, Fred W.; Woodcock, Gordon R.; Laguna, Manuel
2005-01-01
The 1/14/04 USA Space Exploration/Utilization Initiative invites all Space-faring Nations, all Space User Groups in Science, Space Entrepreneuring, Advocates of Robotic and Human Space Exploration, Space Tourism and Colonization Promoters, etc., to join an International Space Partnership. With more Space-faring Nations and Space User Groups each year, such a Partnership would require Multi-year (35 yr.-45 yr.) Space Mission Planning. With each Nation and Space User Group demanding priority for its missions, one needs a methodology for objectively selecting the best mission sequences to be added annually to this 45 yr. Moving Space Mission Plan. How can this be done? Planners have suggested building a Reusable, Sustainable Space Transportation Infrastructure (RSSTI) to increase Mission synergism, reduce cost, and increase scientific and societal returns from this Space Initiative. Morgenthaler and Woodcock presented a Paper at the 55th IAC, Vancouver B.C., Canada, entitled Constrained Optimization Models For Optimizing Multi-Year Space Programs. This Paper showed that a Binary Integer Programming (BIP) Constrained Optimization Model combined with the NASA ATLAS Cost and Space System Operational Parameter Estimating Model has the theoretical capability to solve such problems. IAA Commission III, Space Technology and Space System Development, in its ACADEMY DAY meeting at Vancouver, requested that the Authors and NASA experts find several Space Exploration Architectures (SEAs), apply the combined BIP/ATLAS Models, and report the results at the 56th Fukuoka IAC. While the mathematical Model is given in Ref. [2], this Paper presents the application saga of that effort.
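The ATLAS cost model and the full constraint set are not available from this abstract, but the structure of a binary integer programming mission-selection model can be sketched with the open-source PuLP library. Mission names, values, costs, and the single budget constraint below are invented; the real model would add precedence, infrastructure-reuse, and schedule constraints.

```python
# Select the mission set that maximizes total scientific value subject to an
# annual budget, as a toy binary integer program (values and costs are invented).
import pulp

missions = {"lunar_depot": (8, 4.0), "mars_sample": (10, 6.5),
            "neo_survey": (5, 2.0), "space_tug": (6, 3.0)}   # (value, cost in $B)
budget = 9.0

prob = pulp.LpProblem("mission_selection", pulp.LpMaximize)
x = {m: pulp.LpVariable(m, cat="Binary") for m in missions}
prob += pulp.lpSum(missions[m][0] * x[m] for m in missions)            # total value
prob += pulp.lpSum(missions[m][1] * x[m] for m in missions) <= budget  # budget cap
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([m for m in missions if x[m].value() == 1])                      # selected missions
```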
Dynamic taxonomies applied to a web-based relational database for geo-hydrological risk mitigation
NASA Astrophysics Data System (ADS)
Sacco, G. M.; Nigrelli, G.; Bosio, A.; Chiarle, M.; Luino, F.
2012-02-01
In its 40 years of activity, the Research Institute for Geo-hydrological Protection of the Italian National Research Council has amassed a vast and varied collection of historical documentation on landslides, muddy-debris flows, and floods in northern Italy from 1600 to the present. Since 2008, the archive resources have been maintained through a relational database management system. The database is used for routine study and research purposes as well as for providing support during geo-hydrological emergencies, when data need to be quickly and accurately retrieved. Retrieval speed and accuracy are the main objectives of an implementation based on a dynamic taxonomies model. Dynamic taxonomies are a general knowledge management model for configuring complex, heterogeneous information bases that support exploratory searching. At each stage of the process, the user can explore or browse the database in a guided yet unconstrained way by selecting the alternatives suggested for further refining the search. Dynamic taxonomies have been successfully applied to such diverse and apparently unrelated domains as e-commerce and medical diagnosis. Here, we describe the application of dynamic taxonomies to our database and compare it to traditional relational database query methods. The dynamic taxonomy interface, essentially a point-and-click interface, is considerably faster and less error-prone than traditional form-based query interfaces that require the user to remember and type in the "right" search keywords. Finally, dynamic taxonomy users have confirmed that one of the principal benefits of this approach is the confidence of having considered all the relevant information. Dynamic taxonomies and relational databases work in synergy to provide fast and precise searching: one of the most important factors in timely response to emergencies.
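The sketch below illustrates the basic interaction pattern of a dynamic-taxonomy (faceted) interface over event records: after each user selection the remaining facet values are re-counted, so the user only ever sees choices that lead to non-empty results. The facet names and records are illustrative, not the Institute's actual schema.

```python
# Faceted (dynamic-taxonomy) refinement over a tiny set of event records.
from collections import Counter

records = [
    {"process": "debris flow", "period": "1900-1950", "basin": "Po"},
    {"process": "flood",       "period": "1950-2000", "basin": "Po"},
    {"process": "landslide",   "period": "1950-2000", "basin": "Tanaro"},
    {"process": "flood",       "period": "1900-1950", "basin": "Tanaro"},
]

def refine(records, **selection):
    """Keep only the records matching every selected facet value."""
    return [r for r in records if all(r[f] == v for f, v in selection.items())]

def facet_counts(records, facet):
    """Counts shown next to each clickable term for the current selection."""
    return Counter(r[facet] for r in records)

current = refine(records, basin="Po")      # the user clicks 'Po' under the 'basin' facet
print(facet_counts(current, "process"))    # remaining process choices, with counts
print(facet_counts(current, "period"))     # remaining period choices, with counts
```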
Comprehensive Areal Model of Earthquake-Induced Landslides: Technical Specification and User Guide
Miles, Scott B.; Keefer, David K.
2007-01-01
This report describes the complete design of a comprehensive areal model of earthquake-induced landslides (CAMEL). The report presents the design process and technical specification of CAMEL. It also provides a guide to using the CAMEL source code and the template ESRI ArcGIS map document file for applying CAMEL, both of which can be obtained by contacting the authors. CAMEL is a regional-scale model of earthquake-induced landslide hazard developed using fuzzy logic systems. CAMEL currently estimates areal landslide concentration (number of landslides per square kilometer) for six aggregated types of earthquake-induced landslides - three types each for rock and soil.
1990-11-01
Applying the chain rule (Eq. 180), it remains to calculate dBi/dHi and dBi+1/dHi+1 as the calculation proceeds from node to node... Simulation models can be powerful tools for studying these issues. However, to be useful, the water quality model must be properly suited for the problem... GN-1(QN, AN, QN-1, AN-1) = 0; GN(QN, AN) = 0 (Eq. 47). The solution of these nonlinear equations can proceed in two ways; first, the nonlinear terms...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cottam, Joseph A.; Blaha, Leslie M.
Systems have biases. Their interfaces naturally guide a user toward specific patterns of action. For example, modern word-processors and spreadsheets are both capable of handling word wrapping, checking spelling, storing tables, and calculating formulas. You could write a paper in a spreadsheet or do simple business modeling in a word-processor. However, their interfaces naturally communicate which function they are designed for. Visual analytic interfaces also have biases. In this paper, we outline why simple Markov models are a plausible tool for investigating that bias and how they might be applied. We also discuss some anticipated difficulties in such modeling and touch briefly on what some Markov model extensions might provide.
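As a concrete illustration of the kind of Markov modeling the authors propose, the sketch below estimates a first-order transition matrix from logged action sequences and compares stationary distributions across two hypothetical tools; a distribution concentrated on a few actions is one simple, toy notion of interface bias. The action vocabulary and logs are invented.

```python
import numpy as np

# Estimate a first-order Markov model from logged sequences of interface actions
# and compare its stationary distribution across two hypothetical tools.
ACTIONS = ["type", "format", "insert_table", "compute_formula"]

def transition_matrix(sequences, actions=ACTIONS):
    idx = {a: i for i, a in enumerate(actions)}
    counts = np.ones((len(actions), len(actions)))      # add-one smoothing
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[idx[a], idx[b]] += 1
    return counts / counts.sum(axis=1, keepdims=True)   # row-stochastic matrix

def stationary(P, iters=200):
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):                              # power iteration: pi P -> pi
        pi = pi @ P
    return pi

word_processor_logs = [["type", "type", "format", "type"],
                       ["type", "format", "format", "type", "insert_table"]]
spreadsheet_logs = [["insert_table", "compute_formula", "compute_formula"],
                    ["type", "compute_formula", "compute_formula", "format"]]
for name, logs in [("word processor", word_processor_logs),
                   ("spreadsheet", spreadsheet_logs)]:
    print(name, np.round(stationary(transition_matrix(logs)), 2))
```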
User Modeling in Adaptive Hypermedia Educational Systems
ERIC Educational Resources Information Center
Martins, Antonio Constantino; Faria, Luiz; Vaz de Carvalho, Carlos; Carrapatoso, Eurico
2008-01-01
This document is a survey of the research area of User Modeling (UM) for the specific field of Adaptive Learning. The aims of this document are: to define what a User Model is; to present existing and well-known User Models; to analyze the existing standards related to UM; and to compare existing systems. In the scientific area of User Modeling…
Practical Application of Model Checking in Software Verification
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Skakkebaek, Jens Ulrik
1999-01-01
This paper presents our experiences in applying JAVA PATHFINDER (JPF), a recently developed JAVA-to-SPIN translator, to finding synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of JPF and the subset of JAVA that it supports, and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary for obtaining sufficiently small models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.
Schmutz, Sven; Sonderegger, Andreas; Sauer, Juergen
2017-09-01
The present study examined whether implementing recommendations of Web accessibility guidelines would have different effects on nondisabled users than on users with visual impairments. The predominant approach for making Web sites accessible to users with disabilities is to apply accessibility guidelines. However, it has hardly been examined whether this approach has side effects for nondisabled users. A comparison of the effects on both user groups would contribute to a better understanding of the possible advantages and drawbacks of applying accessibility guidelines. Participants from two matched samples, comprising 55 participants with visual impairments and 55 without impairments, took part in a synchronous remote testing of a Web site. Each participant was randomly assigned to one of three Web sites, which differed in their level of accessibility (very low, low, and high) according to recommendations of the well-established Web Content Accessibility Guidelines 2.0 (WCAG 2.0). Performance (i.e., task completion rate and task completion time) and a range of subjective variables (i.e., perceived usability, positive affect, negative affect, perceived aesthetics, perceived workload, and user experience) were measured. Higher conformance to Web accessibility guidelines resulted in increased performance and more positive user ratings (e.g., perceived usability or aesthetics) for both user groups. There was no interaction between user group and accessibility level. Higher conformance to WCAG 2.0 may result in benefits for nondisabled users and users with visual impairments alike. Practitioners may use the present findings as a basis for deciding whether and how best to implement accessibility.
User-Centered Design through Learner-Centered Instruction
ERIC Educational Resources Information Center
Altay, Burçak
2014-01-01
This article initially demonstrates the parallels between the learner-centered approach in education and the user-centered approach in design disciplines. Afterward, a course on human factors that applies learner-centered methods to teach user-centered design is introduced. The focus is on three tasks to identify the application of theoretical and…
NASA Technical Reports Server (NTRS)
McElroy, Mark W.
2017-01-01
This document serves as a user guide for the AF-Shell 1.0 software, an efficient tool for progressive damage simulation in composite laminates. This guide contains minimal technical material and is meant solely as a guide for a new user to apply AF-Shell 1.0 to laminate damage simulation problems.
Freckmann, Guido; Jendrike, Nina; Baumstark, Annette; Pleus, Stefan; Liebing, Christina; Haug, Cornelia
2018-04-01
The international standard ISO 15197:2013 requires a user performance evaluation to assess if intended users are able to obtain accurate blood glucose measurement results with a self-monitoring of blood glucose (SMBG) system. In this study, user performance was evaluated for four SMBG systems on the basis of ISO 15197:2013, and possibly related insulin dosing errors were calculated. Additionally, accuracy was assessed in the hands of study personnel. Accu-Chek ® Performa Connect (A), Contour ® plus ONE (B), FreeStyle Optium Neo (C), and OneTouch Select ® Plus (D) were evaluated with one test strip lot. After familiarization with the systems, subjects collected a capillary blood sample and performed an SMBG measurement. Study personnel observed the subjects' measurement technique. Then, study personnel performed SMBG measurements and comparison measurements. Number and percentage of SMBG measurements within ± 15 mg/dl and ± 15% of the comparison measurements at glucose concentrations < 100 and ≥ 100 mg/dl, respectively, were calculated. In addition, insulin dosing errors were modelled. In the hands of lay-users three systems fulfilled ISO 15197:2013 accuracy criteria with the investigated test strip lot showing 96% (A), 100% (B), and 98% (C) of results within the defined limits. All systems fulfilled minimum accuracy criteria in the hands of study personnel [99% (A), 100% (B), 99.5% (C), 96% (D)]. Measurements with all four systems were within zones of the consensus error grid and surveillance error grid associated with no or minimal risk. Regarding calculated insulin dosing errors, all 99% ranges were between dosing errors of - 2.7 and + 1.4 units for measurements in the hands of lay-users and between - 2.5 and + 1.4 units for study personnel. Frequent lay-user errors were not checking the test strips' expiry date and applying blood incorrectly. Data obtained in this study show that not all available SMBG systems complied with ISO 15197:2013 accuracy criteria when measurements were performed by lay-users. The study was registered at ClinicalTrials.gov (NCT02916576). Ascensia Diabetes Care Deutschland GmbH.
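For reference, the ISO 15197:2013 system-accuracy criterion used in the study can be expressed as a short calculation: at least 95% of results must fall within +/-15 mg/dl of the comparison method below 100 mg/dl and within +/-15% at or above 100 mg/dl. The paired reference and meter values in the sketch are invented for illustration.

```python
# Percentage of SMBG results within the ISO 15197:2013 system-accuracy limits:
# +/-15 mg/dl at reference glucose < 100 mg/dl, +/-15% at >= 100 mg/dl.
def within_iso_limits(reference_mgdl, measured_mgdl):
    if reference_mgdl < 100:
        return abs(measured_mgdl - reference_mgdl) <= 15
    return abs(measured_mgdl - reference_mgdl) <= 0.15 * reference_mgdl

pairs = [(85, 96), (92, 80), (140, 155), (210, 190), (250, 297), (105, 118)]  # invented
hits = sum(within_iso_limits(ref, meas) for ref, meas in pairs)
print(f"{100.0 * hits / len(pairs):.1f}% within limits")   # criterion: >= 95%
```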
In the mood: the dynamics of collective sentiments on Twitter
Charlton, Nathaniel; Singleton, Colin; Greetham, Danica Vukadinović
2016-01-01
We study the relationship between the sentiment levels of Twitter users and the evolving network structure that the users created by @-mentioning each other. We use a large dataset of tweets to which we apply three sentiment scoring algorithms, including the open source SentiStrength program. Specifically we make three contributions. Firstly, we find that people who have potentially the largest communication reach (according to a dynamic centrality measure) use sentiment differently than the average user: for example, they use positive sentiment more often and negative sentiment less often. Secondly, we find that when we follow structurally stable Twitter communities over a period of months, their sentiment levels are also stable, and sudden changes in community sentiment from one day to the next can in most cases be traced to external events affecting the community. Thirdly, based on our findings, we create and calibrate a simple agent-based model that is capable of reproducing measures of emotive response comparable with those obtained from our empirical dataset. PMID:27429774
Interactive Machine Learning at Scale with CHISSL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arendt, Dustin L.; Grace, Emily A.; Volkova, Svitlana
We demonstrate CHISSL, a scalable client-server system for real-time interactive machine learning. Our system is capable of incorporating user feedback incrementally and immediately, without a structured or pre-defined prediction task. Computation is partitioned between a lightweight web client and a heavyweight server. The server relies on representation learning and agglomerative clustering to learn a dendrogram, a hierarchical approximation of a representation space. The client uses only this dendrogram to incorporate user feedback into the model via transduction. Distances and predictions for each unlabeled instance are updated incrementally and deterministically, with O(n) space and time complexity. Our algorithm is implemented in a functional prototype, designed to be easy to use by non-experts. The prototype organizes the large amounts of data into recommendations. This allows the user to interact with actual instances by dragging and dropping to provide feedback in an intuitive manner. We applied CHISSL to several domains including cyber, social media, and geo-temporal analysis.
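The CHISSL code itself is not shown in the abstract; the sketch below only illustrates the general pattern it describes: cluster the unlabeled instances once on the server into a hierarchy, then propagate a handful of user-supplied labels to every instance, here simply via the nearest labelled neighbour in the representation space. The data, labels, and the nearest-neighbour transduction rule are illustrative simplifications.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)),          # toy representation space with
               rng.normal(3, 0.3, (50, 2))])          # two well-separated groups

# Server side: build the hierarchy once (agglomerative clustering -> dendrogram).
Z = linkage(X, method="average")

# Client side: the user drags a couple of instances into named groups.
user_feedback = {0: "benign", 60: "suspicious"}        # instance index -> label

def transduce(X, feedback):
    """Assign every instance the label of its nearest user-labelled instance."""
    labelled_idx = np.array(list(feedback))
    labels = np.array(list(feedback.values()))
    nearest = cdist(X, X[labelled_idx]).argmin(axis=1)
    return labels[nearest]

pred = transduce(X, user_feedback)
print(dict(zip(*np.unique(pred, return_counts=True))))  # roughly 50 of each label

# The dendrogram can additionally be cut to surface one recommendation group
# per top-level cluster, e.g.:
top_level = fcluster(Z, t=2, criterion="maxclust")
```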
Employment Trajectories: Exploring Gender Differences and Impacts of Drug Use
Huang, David Y.C.; Evans, Elizabeth; Hara, Motoaki; Weiss, Robert E.; Hser, Yih-Ing
2010-01-01
This study investigated the impact of drug use on employment over 20 years among men and women, utilizing data on 7,661 participants in the National Longitudinal Survey of Youth. Growth mixture modeling was applied, and five distinct employment trajectory groups were identified for both men and women. The identified patterns were largely similar for men and women except that a U-shape employment trajectory was uniquely identified for women. Early-initiation drug users, users of “hard” drugs, and frequent drug users were more likely to demonstrate consistently low levels of employment, and the negative relationship between drug use and employment was more apparent among men than women. Also, positive associations between employment and marriage became more salient for men over time, as did negative associations between employment and childrearing among women. Processes are dynamic and complex, suggesting that throughout the life course, protective factors that reduce the risk of employment problems emerge and change, as do critical periods for maximizing the impact of drug prevention and intervention efforts. PMID:21765533
Designing for Motivation, Engagement and Wellbeing in Digital Experience
Peters, Dorian; Calvo, Rafael A.; Ryan, Richard M.
2018-01-01
Research in psychology has shown that both motivation and wellbeing are contingent on the satisfaction of certain psychological needs. Yet, despite a long-standing pursuit in human-computer interaction (HCI) for design strategies that foster sustained engagement, behavior change and wellbeing, the basic psychological needs shown to mediate these outcomes are rarely taken into account. This is possibly due to the lack of a clear model to explain these needs in the context of HCI. Herein we introduce such a model: Motivation, Engagement and Thriving in User Experience (METUX). The model provides a framework grounded in psychological research that can allow HCI researchers and practitioners to form actionable insights with respect to how technology designs support or undermine basic psychological needs, thereby increasing motivation and engagement, and ultimately, improving user wellbeing. We propose that in order to address wellbeing, psychological needs must be considered within five different spheres of analysis including: at the point of technology adoption, during interaction with the interface, as a result of engagement with technology-specific tasks, as part of the technology-supported behavior, and as part of an individual's life overall. These five spheres of experience sit within a sixth, society, which encompasses both direct and collateral effects of technology use as well as non-user experiences. We build this model based on existing evidence for basic psychological need satisfaction, including evidence within the context of the workplace, computer games, and health. We extend and hone these ideas to provide practical advice for designers along with real world examples of how to apply the model to design practice. PMID:29892246
Sittig, Dean F.; Singh, Hardeep
2011-01-01
Conceptual models have been developed to address challenges inherent in studying health information technology (HIT). This manuscript introduces an 8-dimensional model specifically designed to address the socio-technical challenges involved in design, development, implementation, use, and evaluation of HIT within complex adaptive healthcare systems. The 8 dimensions are not independent, sequential, or hierarchical, but rather are interdependent and interrelated concepts similar to compositions of other complex adaptive systems. Hardware and software computing infrastructure refers to equipment and software used to power, support, and operate clinical applications and devices. Clinical content refers to textual or numeric data and images that constitute the “language” of clinical applications. The human computer interface includes all aspects of the computer that users can see, touch, or hear as they interact with it. People refers to everyone who interacts in some way with the system, from developer to end-user, including potential patient-users. Workflow and communication are the processes or steps involved in assuring that patient care tasks are carried out effectively. Two additional dimensions of the model are internal organizational features (e.g., policies, procedures, and culture) and external rules and regulations, both of which may facilitate or constrain many aspects of the preceding dimensions. The final dimension is measurement and monitoring, which refers to the process of measuring and evaluating both intended and unintended consequences of HIT implementation and use. We illustrate how our model has been successfully applied in real-world complex adaptive settings to understand and improve HIT applications at various stages of development and implementation. PMID:20959322
Sittig, Dean F; Singh, Hardeep
2010-10-01
Conceptual models have been developed to address challenges inherent in studying health information technology (HIT). This manuscript introduces an eight-dimensional model specifically designed to address the sociotechnical challenges involved in design, development, implementation, use and evaluation of HIT within complex adaptive healthcare systems. The eight dimensions are not independent, sequential or hierarchical, but rather are interdependent and inter-related concepts similar to compositions of other complex adaptive systems. Hardware and software computing infrastructure refers to equipment and software used to power, support and operate clinical applications and devices. Clinical content refers to textual or numeric data and images that constitute the 'language' of clinical applications. The human--computer interface includes all aspects of the computer that users can see, touch or hear as they interact with it. People refers to everyone who interacts in some way with the system, from developer to end user, including potential patient-users. Workflow and communication are the processes or steps involved in ensuring that patient care tasks are carried out effectively. Two additional dimensions of the model are internal organisational features (eg, policies, procedures and culture) and external rules and regulations, both of which may facilitate or constrain many aspects of the preceding dimensions. The final dimension is measurement and monitoring, which refers to the process of measuring and evaluating both intended and unintended consequences of HIT implementation and use. We illustrate how our model has been successfully applied in real-world complex adaptive settings to understand and improve HIT applications at various stages of development and implementation.
Collaborative Recurrent Neural Networks for Dynamic Recommender Systems
2016-11-22
formulation leads to an efficient and practical method. Furthermore, we demonstrate the versatility of our model by applying it to two different tasks: music ... form (user id, location id, check-in time). The LastFM dataset consists of sequences of songs played by a user's music player collected by using a ...
Surles, M C; Richardson, J S; Richardson, D C; Brooks, F P
1994-02-01
We describe a new paradigm for modeling proteins in interactive computer graphics systems--continual maintenance of a physically valid representation, combined with direct user control and visualization. This is achieved by a fast algorithm for energy minimization, capable of real-time performance on all atoms of a small protein, plus graphically specified user tugs. The modeling system, called Sculpt, rigidly constrains bond lengths, bond angles, and planar groups (similar to existing interactive modeling programs), while it applies elastic restraints to minimize the potential energy due to torsions, hydrogen bonds, and van der Waals and electrostatic interactions (similar to existing batch minimization programs), and user-specified springs. The graphical interface can show bad and/or favorable contacts, and individual energy terms can be turned on or off to determine their effects and interactions. Sculpt finds a local minimum of the total energy that satisfies all the constraints using an augmented Lagrange-multiplier method; calculation time increases only linearly with the number of atoms because the matrix of constraint gradients is sparse and banded. On a 100-MHz MIPS R4000 processor (Silicon Graphics Indigo), Sculpt achieves 11 updates per second on a 20-residue fragment and 2 updates per second on an 80-residue protein, using all atoms except non-H-bonding hydrogens, and without electrostatic interactions. Applications of Sculpt are described: to reverse the direction of bundle packing in a designed 4-helix bundle protein, to fold up a 2-stranded beta-ribbon into an approximate beta-barrel, and to design the sequence and conformation of a 30-residue peptide that mimics one partner of a protein subunit interaction. Computer models that are both interactive and physically realistic (within the limitations of a given force field) have 2 significant advantages: (1) they make feasible the modeling of very large changes (such as needed for de novo design), and (2) they help the user understand how different energy terms interact to stabilize a given conformation. The Sculpt paradigm combines many of the best features of interactive graphical modeling, energy minimization, and actual physical models, and we propose it as an especially productive way to use current and future increases in computer speed.
A User-Friendly Model for Spray Drying to Aid Pharmaceutical Product Development
Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L. J.; Frijlink, Henderik W.
2013-01-01
The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to the production of glassy sugars, excipients often used in formulations of biopharmaceuticals, by spray drying. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach. PMID:24040240
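The fitted model itself is not reproduced in the abstract, so the following is only a minimal sketch of the kind of steady-state energy balance such a spreadsheet model can build on; the heat-capacity, latent-heat, and heat-loss values are generic assumptions, not the calibrated Büchi B-290 parameters.

```python
# Minimal sketch: steady-state energy balance for a lab-scale spray dryer.
# All parameter values are illustrative assumptions, not the fitted model
# from the study above.

CP_AIR = 1.006e3      # specific heat of drying air, J/(kg*K), approximate
LATENT_HEAT = 2.26e6  # latent heat of water evaporation, J/kg, approximate

def outlet_temperature(t_in_c, air_flow_kg_s, feed_rate_kg_s,
                       solid_fraction=0.1, heat_loss_w=0.0):
    """Estimate outlet gas temperature assuming full evaporation of feed water."""
    water_rate = feed_rate_kg_s * (1.0 - solid_fraction)   # kg water / s
    evap_power = water_rate * LATENT_HEAT                   # W used for evaporation
    dt = (evap_power + heat_loss_w) / (air_flow_kg_s * CP_AIR)
    return t_in_c - dt

if __name__ == "__main__":
    # Example: 150 degC inlet air, ~12 kg/h drying gas, 3 mL/min aqueous feed.
    t_out = outlet_temperature(t_in_c=150.0,
                               air_flow_kg_s=12.0 / 3600.0,
                               feed_rate_kg_s=3.0e-3 / 60.0,
                               solid_fraction=0.05,
                               heat_loss_w=20.0)
    print(f"Estimated outlet temperature: {t_out:.1f} degC")
```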
User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework
Markstrom, Steven L.; Koczot, Kathryn M.
2008-01-01
The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.
Integrating Cache Performance Modeling and Tuning Support in Parallelization Tools
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
With the resurgence of distributed shared memory (DSM) systems based on cache-coherent Non Uniform Memory Access (ccNUMA) architectures and increasing disparity between memory and processor speeds, data locality overheads are becoming the greatest bottlenecks in the way of realizing the potential high performance of these systems. While parallelization tools and compilers help users port their sequential applications to a DSM system, a lot of time and effort is needed to tune the memory performance of these applications to achieve reasonable speedup. In this paper, we show that integrating cache performance modeling and tuning support within a parallelization environment can alleviate this problem. The Cache Performance Modeling and Prediction Tool (CPMP) employs trace-driven simulation techniques without the overhead of generating and managing detailed address traces. CPMP predicts the cache performance impact of source code level "what-if" modifications in a program to assist a user in the tuning process. CPMP is built on top of a customized version of the Computer Aided Parallelization Tools (CAPTools) environment. Finally, we demonstrate how CPMP can be applied to tune a real Computational Fluid Dynamics (CFD) application.
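To make the trace-driven idea concrete, the sketch below replays a synthetic address trace through a tiny direct-mapped cache and compares two "what-if" loop orderings; the cache geometry and the trace are illustrative assumptions and not CPMP's internal machinery (CPMP specifically avoids managing full address traces).

```python
# Minimal sketch of trace-driven cache simulation: a direct-mapped cache
# replayed over a synthetic address trace. Geometry and trace are
# illustrative assumptions, not CPMP's internal model.

def simulate_direct_mapped(trace, cache_lines=64, line_bytes=64):
    tags = [None] * cache_lines          # one tag per cache line
    hits = misses = 0
    for addr in trace:
        block = addr // line_bytes       # which memory block is touched
        index = block % cache_lines      # line the block maps to
        tag = block // cache_lines
        if tags[index] == tag:
            hits += 1
        else:
            misses += 1
            tags[index] = tag            # replace on miss
    return hits, misses

if __name__ == "__main__":
    # "What-if" comparison: row-major vs column-major traversal of a
    # 128x128 array of 8-byte elements laid out row-major in memory.
    n, elem = 128, 8
    row_major = [(i * n + j) * elem for i in range(n) for j in range(n)]
    col_major = [(i * n + j) * elem for j in range(n) for i in range(n)]
    for name, trace in [("row-major", row_major), ("column-major", col_major)]:
        h, m = simulate_direct_mapped(trace)
        print(f"{name}: hit rate = {h / (h + m):.2%}")
```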
A case for user-generated sensor metadata
NASA Astrophysics Data System (ADS)
Nüst, Daniel
2015-04-01
Cheap and easy-to-use sensing technology and new developments in ICT towards a global network of sensors and actuators promise previously unthought-of changes for our understanding of the environment. Large professional as well as amateur sensor networks exist, and they are used for specific yet diverse applications across domains such as hydrology, meteorology or early warning systems. However, the impact this "abundance of sensors" has had so far is somewhat disappointing. There is a gap between (community-driven) sensor networks that could provide very useful data and the users of the data. In our presentation, we argue that this is due to a lack of metadata that would allow users to determine the fitness for use of a dataset. Syntactic and semantic interoperability for sensor webs have made great progress and continue to be an active field of research, yet the solutions are often quite complex, which is of course due to the complexity of the problem at hand. Still, we see a dataset's provenance as the most generic information for determining fitness for use, because it allows users to make up their own minds independently of existing classification schemes for data quality. In this work we make the case that curated, user-contributed metadata has the potential to improve this situation. This applies especially to scenarios in which an observed property is applicable in different domains, and to set-ups where the understanding of metadata concepts and (meta-)data quality differs between data provider and user. On the one hand, a citizen does not understand the ISO provenance metadata. On the other hand, a researcher might find issues in publicly accessible time series published by citizens, which the latter might not be aware of or care about. Because users will have to determine fitness for use for each application on their own anyway, we suggest an online collaboration platform for user-generated metadata based on an extremely simplified data model. In the most basic fashion, metadata generated by users can be boiled down to a basic property of the world wide web: many information items, such as news or blog posts, allow users to create comments and rate the content. Therefore we argue for focusing a core data model on one text field for a textual comment, one optional numerical field for a rating, and a resolvable identifier for the dataset being commented on. We present a conceptual framework that integrates user comments into existing standards and relevant applications of online sensor networks and discuss possible approaches, such as linked data, brokering, or standalone metadata portals. We relate this framework to existing work in user-generated content, such as proprietary rating systems on commercial websites, microformats, the GeoViQua User Quality Model, the CHARMe annotations, or W3C Open Annotation. These systems are also explored for commonalities, and based on their useful concepts and ideas we present an outline for future extensions of the minimal model. Building on this framework, we present a concept for how a simple comment-and-rating system can be extended to capture provenance information for spatio-temporal observations in the sensor web, and how this framework can be evaluated.
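Since the proposed core data model is deliberately minimal (one comment, one optional rating, one resolvable dataset identifier), it can be sketched directly; the field names and JSON serialization below are assumptions for illustration, not a published schema.

```python
# Minimal sketch of the user-generated metadata record described above:
# a free-text comment, an optional rating, and a resolvable dataset
# identifier. Field names and serialization are illustrative assumptions.
from dataclasses import dataclass, asdict
from typing import Optional
import json

@dataclass
class DatasetComment:
    dataset_id: str                 # resolvable identifier (e.g., a URI) of the dataset
    comment: str                    # free-text observation about the dataset
    rating: Optional[float] = None  # optional numeric rating, e.g., 1-5
    author: Optional[str] = None    # optional attribution (not required by the minimal model)

    def to_json(self) -> str:
        return json.dumps(asdict(self))

if __name__ == "__main__":
    note = DatasetComment(
        dataset_id="http://example.org/timeseries/river-gauge-42",
        comment="Water level spikes on 2014-06-03 look like sensor maintenance, not flooding.",
        rating=3.0,
    )
    print(note.to_json())
```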
Participatory Design in Gerontechnology: A Systematic Literature Review.
Merkel, Sebastian; Kucharski, Alexander
2018-05-19
Participatory design (PD) is widely used within gerontechnology, but there is no common understanding about which methods are used for what purposes. This review aims to examine what different forms of PD exist in the field of gerontechnology and how these can be categorized. We conducted a systematic literature review covering several databases. The search strategy was based on 3 elements: (1) participatory methods and approaches with (2) older persons aiming at developing (3) technology for older people. Our final review included 26 studies representing a variety of technologies designed/developed and methods/instruments applied. According to the technologies, the publications reviewed can be categorized into 3 groups: studies that (1) use already existing technology with the aim of finding new ways of use; (2) aim at creating new devices; or (3) test and/or modify prototypes. The implementation of PD depends on why a participatory approach is applied, who is involved as future user(s), when those future users are involved, and how they are incorporated into the innovation process. There are multiple ways, methods, and instruments to integrate users into the innovation process. Which methods should be applied depends on the context. However, most studies do not evaluate whether participatory approaches lead to better acceptance and/or use of the co-developed products. Therefore, participatory design should follow a comprehensive strategy, starting with the users' needs and ending with an evaluation of whether the applied methods have led to better results.
SAFOD Brittle Microstructure and Mechanics Knowledge Base (SAFOD BM2KB)
NASA Astrophysics Data System (ADS)
Babaie, H. A.; Hadizadeh, J.; di Toro, G.; Mair, K.; Kumar, A.
2008-12-01
We have developed a knowledge base to store and present the data collected by a group of investigators studying the microstructures and mechanics of brittle faulting using core samples from the SAFOD (San Andreas Fault Observatory at Depth) project. The investigations are carried out with a variety of analytical and experimental methods, primarily to better understand the physics of strain localization in fault gouge. The knowledge base instantiates a specially designed brittle rock deformation ontology developed at Georgia State University. The inference rules embedded in the semantic web languages used in our ontology, such as OWL, RDF, and RDFS, allow the Pellet reasoner used in this application to derive additional truths about the ontology and knowledge of this domain. Access to the knowledge base is via a public website, which is designed to provide the knowledge acquired by all the investigators involved in the project. The stored data will be products of studies such as: experiments (e.g., high-velocity friction experiment), analyses (e.g., microstructural, chemical, mass transfer, mineralogical, surface, image, texture), microscopy (optical, HRSEM, FESEM, HRTEM), tomography, porosity measurement, microprobe, and cathodoluminescence. Data about laboratories, experimental conditions, methods, assumptions, equipment, and the mechanical properties and lithology of the studied samples will also be presented on the website per investigation. The ontology was modeled applying the UML (Unified Modeling Language) in Rational Rose, and implemented in OWL-DL (Web Ontology Language) using the Protégé ontology editor. The UML model was converted to OWL-DL by first mapping it to Ecore (.ecore) and Generator model (.genmodel) with the help of the EMF (Eclipse Modeling Framework) plugin in Eclipse. The Ecore model was then mapped to a .uml file, which was later converted into an .owl file and subsequently imported into the Protégé ontology editing environment. The web interface was developed in Java using Eclipse as the IDE. The web interfaces to query and submit data were implemented applying JSP, servlets, JavaScript, and AJAX. The Jena API, a Java framework for building Semantic Web applications, was used to develop the web interface; Jena provided a programmatic environment for RDF, RDFS, OWL, and a SPARQL query engine. Building web applications with AJAX helps retrieve data from the server asynchronously in the background without interfering with the display and behavior of the existing page. The application was deployed on an Apache Tomcat server at GSU. The SAFOD BM2KB website provides user-friendly search, submit, feedback, and other services. The General Search option allows users to search the knowledge base by selecting classes (e.g., Experiment, Surface Analysis), their respective attributes (e.g., apparatus, date performed), and the relationships to other classes (e.g., Sample, Laboratory). The Search by Sample option allows users to search the knowledge base based on sample number. The Search by Investigator option lets users search the knowledge base by choosing an investigator who is involved in this project. The website also allows users to submit new data. The Submit Data option opens a page where users can submit SAFOD data to our knowledge base by selecting specific classes and attributes. The submitted data then become available for query as part of the knowledge base. The SAFOD BM2KB can be accessed from the main SAFOD website.
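As a hypothetical illustration of the query side of such a knowledge base, the sketch below builds a tiny RDF graph and runs a SPARQL query in the spirit of the "Search by Sample" service; the namespace, classes, and properties are invented stand-ins for the SAFOD ontology, and rdflib is used here instead of the Jena/JSP stack described above.

```python
# Minimal sketch: querying a tiny RDF graph by sample number.
# Namespace, classes, and properties are hypothetical stand-ins for the
# SAFOD ontology; rdflib replaces the Jena-based web stack for brevity.
from rdflib import Graph, Namespace, Literal, RDF

EX = Namespace("http://example.org/safod#")
g = Graph()
g.bind("ex", EX)

# One experiment linked to one core sample.
g.add((EX.exp1, RDF.type, EX.Experiment))
g.add((EX.exp1, EX.apparatus, Literal("high-velocity rotary shear")))
g.add((EX.exp1, EX.usesSample, EX.sample42))
g.add((EX.sample42, RDF.type, EX.Sample))
g.add((EX.sample42, EX.sampleNumber, Literal("SAFOD-42")))

# "Search by Sample": find experiments performed on a given sample number.
query = """
PREFIX ex: <http://example.org/safod#>
SELECT ?exp ?apparatus WHERE {
    ?exp a ex:Experiment ;
         ex:apparatus ?apparatus ;
         ex:usesSample ?s .
    ?s ex:sampleNumber "SAFOD-42" .
}
"""
for row in g.query(query):
    print(row.exp, "|", row.apparatus)
```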
An R package for analyzing and modeling ranking data
2013-01-01
Background In medical informatics, psychology, market research and many other fields, researchers often need to analyze and model ranking data. However, there is no statistical software that provides tools for the comprehensive analysis of ranking data. Here, we present pmr, an R package for analyzing and modeling ranking data with a bundle of tools. The pmr package enables descriptive statistics (mean rank, pairwise frequencies, and marginal matrix), Analytic Hierarchy Process models (with Saaty’s and Koczkodaj’s inconsistencies), probability models (Luce model, distance-based model, and rank-ordered logit model), and the visualization of ranking data with multidimensional preference analysis. Results Examples of the use of package pmr are given using a real ranking dataset from medical informatics, in which 566 Hong Kong physicians ranked the top five incentives (1: competitive pressures; 2: increased savings; 3: government regulation; 4: improved efficiency; 5: improved quality care; 6: patient demand; 7: financial incentives) to the computerization of clinical practice. The mean rank showed that item 4 is the most preferred item and item 3 is the least preferred item, and a significant difference was found between physicians’ preferences with respect to their monthly income. A multidimensional preference analysis identified two dimensions that explain 42% of the total variance. The first can be interpreted as the overall preference of the seven items (labeled as “internal/external”), and the second dimension can be interpreted as their overall variance (labeled as “push/pull factors”). Various statistical models were fitted, and the best were found to be weighted distance-based models with Spearman’s footrule distance. Conclusions In this paper, we presented the R package pmr, the first package for analyzing and modeling ranking data. The package provides insight to users through descriptive statistics of ranking data. Users can also visualize ranking data by applying multidimensional preference analysis. Various probability models for ranking data are also included, allowing users to choose the one most suitable to their specific situations. PMID:23672645
An R package for analyzing and modeling ranking data.
Lee, Paul H; Yu, Philip L H
2013-05-14
In medical informatics, psychology, market research and many other fields, researchers often need to analyze and model ranking data. However, there is no statistical software that provides tools for the comprehensive analysis of ranking data. Here, we present pmr, an R package for analyzing and modeling ranking data with a bundle of tools. The pmr package enables descriptive statistics (mean rank, pairwise frequencies, and marginal matrix), Analytic Hierarchy Process models (with Saaty's and Koczkodaj's inconsistencies), probability models (Luce model, distance-based model, and rank-ordered logit model), and the visualization of ranking data with multidimensional preference analysis. Examples of the use of package pmr are given using a real ranking dataset from medical informatics, in which 566 Hong Kong physicians ranked the top five incentives (1: competitive pressures; 2: increased savings; 3: government regulation; 4: improved efficiency; 5: improved quality care; 6: patient demand; 7: financial incentives) to the computerization of clinical practice. The mean rank showed that item 4 is the most preferred item and item 3 is the least preferred item, and a significant difference was found between physicians' preferences with respect to their monthly income. A multidimensional preference analysis identified two dimensions that explain 42% of the total variance. The first can be interpreted as the overall preference of the seven items (labeled as "internal/external"), and the second dimension can be interpreted as their overall variance (labeled as "push/pull factors"). Various statistical models were fitted, and the best were found to be weighted distance-based models with Spearman's footrule distance. In this paper, we presented the R package pmr, the first package for analyzing and modeling ranking data. The package provides insight to users through descriptive statistics of ranking data. Users can also visualize ranking data by applying multidimensional preference analysis. Various probability models for ranking data are also included, allowing users to choose the one most suitable to their specific situations.
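The descriptive statistics named above are straightforward to reproduce outside of R; the sketch below computes the mean rank and the pairwise preference frequencies for a toy ranking dataset (the data are invented and this is not the pmr package's API).

```python
# Minimal sketch: descriptive statistics for ranking data (mean rank and
# pairwise preference frequencies). The toy data are made up; this is an
# illustration, not the pmr package's API.
import numpy as np

# Each row is one respondent's ranking of 4 items (1 = most preferred).
rankings = np.array([
    [1, 2, 3, 4],
    [2, 1, 4, 3],
    [1, 3, 2, 4],
    [3, 1, 2, 4],
    [1, 2, 4, 3],
])
n_items = rankings.shape[1]

# Mean rank per item (lower = more preferred overall).
mean_rank = rankings.mean(axis=0)
print("mean rank:", np.round(mean_rank, 2))

# Pairwise frequency matrix: entry (i, j) = number of respondents
# ranking item i ahead of item j.
pairwise = np.zeros((n_items, n_items), dtype=int)
for row in rankings:
    for i in range(n_items):
        for j in range(n_items):
            if i != j and row[i] < row[j]:
                pairwise[i, j] += 1
print("pairwise preference counts:\n", pairwise)
```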
SOA-based model for value-added ITS services delivery.
Herrera-Quintero, Luis Felipe; Maciá-Pérez, Francisco; Marcos-Jorquera, Diego; Gilart-Iglesias, Virgilio
2014-01-01
Integration is currently a key factor in intelligent transportation systems (ITS), especially because of the ever increasing service demands originating from the ITS industry and ITS users. The current ITS landscape is made up of multiple technologies that are tightly coupled, and its interoperability is extremely low, which limits ITS services generation. Given this fact, novel information technologies (IT) based on the service-oriented architecture (SOA) paradigm have begun to introduce new ways to address this problem. The SOA paradigm allows the construction of loosely coupled distributed systems that can help to integrate the heterogeneous systems that are part of ITS. In this paper, we focus on developing an SOA-based model for integrating information technologies (IT) into ITS to achieve ITS service delivery. To develop our model, the ITS technologies and services involved were identified, catalogued, and decoupled. In doing so, we applied our SOA-based model to integrate all of the ITS technologies and services, ranging from the lowest-level technical components, such as roadside unit as a service (RSUAAS), to the most abstract ITS services that will be offered to ITS users (value-added services). To validate our model, a functionality case study that included all of the components of our model was designed.
NASA Astrophysics Data System (ADS)
Liu, Fang; Cao, San-xing; Lu, Rui
2012-04-01
This paper proposes a user credit assessment model based on a clustering ensemble, aimed at the problem of users illegally spreading pirated and pornographic media content on self-service broadband new-media platforms. The idea is to assess new-media user credit by establishing an index system based on user credit behaviors; illegal users can then be identified from the credit assessment results, thereby curbing the bad videos and audios transmitted on the network. The proposed clustering ensemble combines the advantages of two methods: swarm intelligence clustering, which is well suited to user credit behavior analysis, and K-means clustering, which can eliminate the scattered users left in the result of swarm intelligence clustering, so that all users are credit-classified automatically. The model is verified with experiments on a standard credit application dataset from the UCI machine learning repository. The statistical results of a comparative experiment with a single swarm intelligence clustering model indicate that the clustering ensemble has a stronger ability to distinguish creditworthiness, especially in predicting the user clusters with the best and worst credit, which helps operators apply incentive or punitive measures accurately. In addition, compared with the experimental results of a logistic regression based model under the same conditions, the clustering ensemble model is robust and has better prediction accuracy.
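The two-stage idea, a first clustering pass that can leave scattered users, followed by K-means so that every user ends up in a credit cluster, can be sketched as follows; DBSCAN is used here only as a stand-in for the swarm intelligence clustering stage, and the two-feature synthetic data are assumptions.

```python
# Minimal sketch of a two-stage clustering ensemble: a first pass that can
# leave unassigned ("scattered") users, refined by K-means so that every
# user receives a credit cluster. DBSCAN stands in for the swarm
# intelligence stage; the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.cluster import DBSCAN, KMeans

rng = np.random.default_rng(0)
# Toy credit-behavior features for 300 users (e.g., violation rate, activity).
good = rng.normal(loc=[0.1, 0.8], scale=0.05, size=(140, 2))
middling = rng.normal(loc=[0.4, 0.5], scale=0.05, size=(120, 2))
bad = rng.normal(loc=[0.9, 0.2], scale=0.08, size=(40, 2))
X = np.vstack([good, middling, bad])

# Stage 1: density-based clustering; label -1 marks scattered users.
labels1 = DBSCAN(eps=0.08, min_samples=5).fit(X).labels_
n_clusters = len(set(labels1)) - (1 if -1 in labels1 else 0)
print("stage 1 clusters:", n_clusters, "| scattered users:", int(np.sum(labels1 == -1)))

# Stage 2: K-means seeded with the stage-1 centroids assigns every user
# (including the scattered ones) to a credit cluster.
centroids = np.array([X[labels1 == k].mean(axis=0) for k in range(n_clusters)])
labels2 = KMeans(n_clusters=n_clusters, init=centroids, n_init=1, random_state=0).fit(X).labels_
print("stage 2 cluster sizes:", np.bincount(labels2))
```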
Serious Games for Health: The Potential of Metadata.
Göbel, Stefan; Maddison, Ralph
2017-02-01
Numerous serious games and health games exist, either as commercial products (typically with a focus on entertaining a broad user group) or smaller games and game prototypes, often resulting from research projects (typically tailored to a smaller user group with a specific health characteristic). A major drawback of existing health games is that they are not very well described and attributed with (machine-readable, quantitative, and qualitative) metadata such as the characterizing goal of the game, the target user group, or expected health effects well proven in scientific studies. This makes it difficult or even impossible for end users to find and select the most appropriate game for a specific situation (e.g., health needs). Therefore, the aim of this article was to motivate the need and potential/benefit of metadata for the description and retrieval of health games and to describe a descriptive model for the qualitative description of games for health. It was not the aim of the article to describe a stable, running system (portal) for health games. This will be addressed in future work. Building on previous work toward a metadata format for serious games, a descriptive model for the formal description of games for health is introduced. For the conceptualization of this model, classification schemata of different existing health game repositories are considered. The classification schema consists of three levels: a core set of mandatory descriptive fields relevant for all games for health application areas, a detailed level with more comprehensive, optional information about the games, and so-called extension as level three with specific descriptive elements relevant for dedicated health games application areas, for example, cardio training. A metadata format provides a technical framework to describe, find, and select appropriate health games matching the needs of the end user. Future steps to improve, apply, and promote the metadata format in the health games market are discussed.
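A minimal sketch of the three-level structure (mandatory core, optional detail, application-area extensions) is shown below; all field names and example values are hypothetical illustrations rather than the actual metadata format.

```python
# Minimal sketch of a three-level health-game metadata record: a mandatory
# core, optional detail, and application-area extensions. All field names
# are hypothetical illustrations, not the published metadata format.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class CoreMetadata:            # level 1: mandatory for every game for health
    title: str
    characterizing_goal: str   # e.g., "improve cardiovascular fitness"
    target_user_group: str     # e.g., "adults in cardiac rehabilitation"
    application_area: str      # e.g., "cardio training"

@dataclass
class DetailedMetadata:        # level 2: optional, more comprehensive description
    platforms: List[str] = field(default_factory=list)
    play_time_minutes: Optional[int] = None
    evidence: Optional[str] = None   # e.g., reference to an effectiveness study

@dataclass
class HealthGameRecord:
    core: CoreMetadata
    detail: Optional[DetailedMetadata] = None
    extensions: Dict[str, dict] = field(default_factory=dict)  # level 3, per application area

if __name__ == "__main__":
    record = HealthGameRecord(
        core=CoreMetadata(
            title="PedalQuest",
            characterizing_goal="improve cardiovascular fitness",
            target_user_group="adults in cardiac rehabilitation",
            application_area="cardio training",
        ),
        detail=DetailedMetadata(platforms=["exergame bike", "tablet"], play_time_minutes=20),
        extensions={"cardio training": {"target_heart_rate_zone": "60-70% HRmax"}},
    )
    print(record.core.title, "->", record.extensions["cardio training"])
```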
Hoos, Anne B.; Patel, Anant R.
1996-01-01
Model-adjustment procedures were applied to the combined data bases of storm-runoff quality for Chattanooga, Knoxville, and Nashville, Tennessee, to improve predictive accuracy for storm-runoff quality for urban watersheds in these three cities and throughout Middle and East Tennessee. Data for 45 storms at 15 different sites (five sites in each city) constitute the data base. Comparison of observed values of storm-runoff load and event-mean concentration to the predicted values from the regional regression models for 10 constituents shows prediction errors as large as 806,000 percent. Model-adjustment procedures, which combine the regional model predictions with local data, are applied to improve predictive accuracy. Standard error of estimate after model adjustment ranges from 67 to 322 percent. Calibration results may be biased due to sampling error in the Tennessee data base. The relatively large values of standard error of estimate for some of the constituent models, although representing a significant reduction (at least 50 percent) in prediction error compared to estimation with unadjusted regional models, may be unacceptable for some applications. The user may wish to collect additional local data for these constituents and repeat the analysis, or calibrate an independent local regression model.
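The abstract does not spell out the adjustment procedures, so the sketch below shows one simple adjustment in the same spirit: a log-space regression of local observations against the regional-model predictions, with a rough percent standard error before and after; the data are synthetic assumptions, not the Tennessee data base.

```python
# Minimal sketch of adjusting a regional regression model with local data:
# fit a log-space correction of observed local loads against the regional
# predictions. Synthetic numbers only; not the Tennessee data base or the
# exact USGS model-adjustment procedures.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "regional model" predictions and local observations for one
# constituent at a handful of monitored storms.
regional_pred = rng.lognormal(mean=2.0, sigma=0.8, size=30)           # predicted load
observed = 2.5 * regional_pred ** 0.9 * rng.lognormal(0.0, 0.3, 30)   # local "truth"

# Fit log(observed) = a + b * log(predicted) on the local data.
b, a = np.polyfit(np.log(regional_pred), np.log(observed), deg=1)

def adjusted(pred):
    """Apply the locally calibrated correction to a regional prediction."""
    return np.exp(a + b * np.log(pred))

def se_percent(pred, obs):
    """Approximate standard error of estimate, in percent, from log residuals."""
    resid = np.log(obs) - np.log(pred)
    return 100.0 * np.sqrt(np.exp(np.var(resid, ddof=1)) - 1.0)

print("SE before adjustment: %.0f%%" % se_percent(regional_pred, observed))
print("SE after adjustment:  %.0f%%" % se_percent(adjusted(regional_pred), observed))
```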
Save medical personnel's time by improved user interfaces.
Kindler, H
1997-01-01
Common objectives in the industrial countries are the improvement of quality of care, clinical effectiveness, and cost control. Cost control, in particular, has been addressed through the introduction of case mix systems for reimbursement by social-security institutions. More data is required to enable quality improvement, increases in clinical effectiveness and for juridical reasons. At first glance, this documentation effort is contradictory to cost reduction. However, integrated services for resource management based on better documentation should help to reduce costs. The clerical effort for documentation should be decreased by providing a co-operative working environment for healthcare professionals applying sophisticated human-computer interface technology. Additional services, e.g., automatic report generation, increase the efficiency of healthcare personnel. Modelling the medical work flow forms an essential prerequisite for integrated resource management services and for co-operative user interfaces. A user interface aware of the work flow provides intelligent assistance by offering the appropriate tools at the right moment. Nowadays there is a trend to client/server systems with relational databases or object-oriented databases as repository. The work flows used for controlling purposes and to steer the user interfaces must be represented in the repository.
NASA Technical Reports Server (NTRS)
Yliniemi, Logan; Agogino, Adrian K.; Tumer, Kagan
2014-01-01
Accurate simulation of the effects of integrating new technologies into a complex system is critical to the modernization of our antiquated air traffic system, where there exist many layers of interacting procedures, controls, and automation all designed to cooperate with human operators. Additions of even simple new technologies may result in unexpected emergent behavior due to complex human/ machine interactions. One approach is to create high-fidelity human models coming from the field of human factors that can simulate a rich set of behaviors. However, such models are difficult to produce, especially to show unexpected emergent behavior coming from many human operators interacting simultaneously within a complex system. Instead of engineering complex human models, we directly model the emergent behavior by evolving goal directed agents, representing human users. Using evolution we can predict how the agent representing the human user reacts given his/her goals. In this paradigm, each autonomous agent in a system pursues individual goals, and the behavior of the system emerges from the interactions, foreseen or unforeseen, between the agents/actors. We show that this method reflects the integration of new technologies in a historical case, and apply the same methodology for a possible future technology.
Barnes, Rebecca O; Schacter, Brent; Kodeeswaran, Sugy; Watson, Peter H
2014-10-01
Biorepositories, the coordinating hubs for the collection and annotation of biospecimens, are under increasing financial pressure and are challenged to remain sustainable. To gain a better understanding of the current funding situation for Canadian biorepositories and the relative contributions they receive from different funding sources, the Canadian Tumour Repository Network (CTRNet) conducted two surveys. The first survey targeted CTRNet's six main nodes to ascertain the relative funding sources and levels of user fees. The second survey was targeted to a broader range of biorepositories (n=45) to ascertain business practices in application of user fees. The results show that >70% of Canadian biorepositories apply user fees and that the majority apply differential fees to different user groups (academic vs. industry, local vs. international). However, user fees typically comprise only 6% of overall operational budgets. We conclude that while strategies to drive up user fee levels need to be implemented, it is essential for the many stakeholders in the biomedical health research sector to consider this issue in order to ensure the ongoing availability of research biospecimens and data that are standardized, high-quality, and that are therefore capable of meeting research needs.
Advanced display object selection methods for enhancing user-computer productivity
NASA Technical Reports Server (NTRS)
Osga, Glenn A.
1993-01-01
The User-Interface Technology Branch at NCCOSC RDT&E Division has been conducting a series of studies to address the suitability of commercial off-the-shelf (COTS) graphic user-interface (GUI) methods for efficiency and performance in critical naval combat systems. This paper presents an advanced selection algorithm and method developed to increase user performance when making selections on tactical displays. The method has also been applied with considerable success to a variety of cursor and pointing tasks. Typical GUI's allow user selection by: (1) moving a cursor with a pointing device such as a mouse, trackball, joystick, touchscreen; and (2) placing the cursor on the object. Examples of GUI objects are the buttons, icons, folders, scroll bars, etc. used in many personal computer and workstation applications. This paper presents an improved method of selection and the theoretical basis for the significant performance gains achieved with various input devices tested. The method is applicable to all GUI styles and display sizes, and is particularly useful for selections on small screens such as notebook computers. Considering the amount of work-hours spent pointing and clicking across all styles of available graphic user-interfaces, the cost/benefit in applying this method to graphic user-interfaces is substantial, with the potential for increasing productivity across thousands of users and applications.
Design Application Translates 2-D Graphics to 3-D Surfaces
NASA Technical Reports Server (NTRS)
2007-01-01
Fabric Images Inc., specializing in the printing and manufacturing of fabric tension architecture for the retail, museum, and exhibit/tradeshow communities, designed software to translate 2-D graphics for 3-D surfaces prior to print production. Fabric Images' fabric-flattening design process models a 3-D surface based on computer-aided design (CAD) specifications. The surface geometry of the model is used to form a 2-D template, similar to a flattening process developed by NASA's Glenn Research Center. This template or pattern is then applied in the development of a 2-D graphic layout. Benefits of this process include 11.5 percent time savings per project, less material wasted, and the ability to improve upon graphic techniques and offer new design services. Partners include Exhibitgroup/Giltspur (end-user client: TAC Air, a division of Truman Arnold Companies Inc.), Jack Morton Worldwide (end-user client: Nickelodeon), as well as 3D Exhibits Inc., and MG Design Associates Corp.
Galván-Tejada, Carlos E; García-Vázquez, Juan Pablo; Galván-Tejada, Jorge I; Delgado-Contreras, J Rubén; Brena, Ramon F
2015-08-18
In this paper, we present the development of an infrastructure-less indoor location system (ILS), which relies on the use of a microphone, a magnetometer and a light sensor of a smartphone, all three of which are essentially passive sensors, relying on signals available practically in any building in the world, no matter how developed the region is. In our work, we merge the information from those sensors to estimate the user's location in an indoor environment. A multivariate model is applied to find the user's location, and we evaluate the quality of the resulting model in terms of sensitivity and specificity. Our experiments were carried out in an office environment during summer and winter, to take into account changes in light patterns, as well as changes in the Earth's magnetic field irregularities. The experimental results clearly show the benefits of using the information fusion of multiple sensors when contrasted with the use of a single source of information.
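The multivariate model is not detailed above, so the sketch below illustrates the general sensor-fusion idea with a standard classifier over synthetic microphone, magnetometer, and light features, reporting per-room sensitivity and specificity; every value in it is an assumption.

```python
# Minimal sketch of fusing passive smartphone sensors (sound level,
# magnetic field magnitude, light level) to classify the room a user is
# in, with per-class sensitivity and specificity. Synthetic data only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
rooms = ["office", "corridor", "meeting room"]
# Per-room mean [sound dB, magnetic uT, light lux]; purely illustrative.
means = np.array([[45.0, 48.0, 300.0], [55.0, 52.0, 150.0], [40.0, 44.0, 500.0]])
X = np.vstack([rng.normal(m, [4.0, 1.5, 60.0], size=(200, 3)) for m in means])
y = np.repeat(np.arange(3), 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
cm = confusion_matrix(y_te, clf.predict(X_te))

for k, room in enumerate(rooms):
    tp = cm[k, k]
    fn = cm[k].sum() - tp
    fp = cm[:, k].sum() - tp
    tn = cm.sum() - tp - fn - fp
    print(f"{room:12s} sensitivity={tp/(tp+fn):.2f} specificity={tn/(tn+fp):.2f}")
```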
Inferring the interplay between network structure and market effects in Bitcoin
NASA Astrophysics Data System (ADS)
Kondor, Dániel; Csabai, István; Szüle, János; Pósfai, Márton; Vattay, Gábor
2014-12-01
A main focus in economics research is understanding the time series of prices of goods and assets. While statistical models using only the properties of the time series itself have been successful in many aspects, we expect to gain a better understanding of the phenomena involved if we can model the underlying system of interacting agents. In this article, we consider the history of Bitcoin, a novel digital currency system, for which the complete list of transactions is available for analysis. Using this dataset, we reconstruct the transaction network between users and analyze changes in the structure of the subgraph induced by the most active users. Our approach is based on the unsupervised identification of important features of the time variation of the network. Applying the widely used method of Principal Component Analysis to the matrix constructed from snapshots of the network at different times, we are able to show how structural changes in the network accompany significant changes in the exchange price of bitcoins.
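As an illustration of this kind of analysis, the sketch below applies PCA to a matrix whose rows are network snapshots and whose columns are per-node activity features; the synthetic snapshots are assumptions and not the reconstructed Bitcoin transaction network.

```python
# Minimal sketch: PCA over a snapshots-by-features matrix, as in analyses
# that track structural change of a transaction network over time.
# The synthetic snapshot features are illustrative, not Bitcoin data.
import numpy as np

rng = np.random.default_rng(42)
n_snapshots, n_nodes = 60, 50

# Feature matrix M: row t = activity of each of the most active nodes in
# snapshot t (e.g., weighted degree). A slow trend plus noise.
trend = np.linspace(0.0, 3.0, n_snapshots)[:, None] * rng.random(n_nodes)[None, :]
M = trend + rng.normal(scale=0.3, size=(n_snapshots, n_nodes))

# Center columns and take the SVD; principal components are rows of Vt.
Mc = M - M.mean(axis=0)
U, S, Vt = np.linalg.svd(Mc, full_matrices=False)

explained = S**2 / np.sum(S**2)
scores = Mc @ Vt[0]          # projection of each snapshot on the first PC
print("variance explained by PC1: %.1f%%" % (100 * explained[0]))
print("first/last snapshot scores:", round(scores[0], 2), round(scores[-1], 2))
```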
LDA-Based Unified Topic Modeling for Similar TV User Grouping and TV Program Recommendation.
Pyo, Shinjee; Kim, Eunhui; Kim, Munchurl
2015-08-01
Social TV is a social media service via TV and social networks through which TV users exchange their experiences about TV programs that they are viewing. For social TV service, two technical aspects are envisioned: grouping of similar TV users to create social TV communities and recommending TV programs based on group and personal interests for personalizing TV. In this paper, we propose a unified topic model based on grouping of similar TV users and recommending TV programs as a social TV service. The proposed unified topic model employs two latent Dirichlet allocation (LDA) models. One is a topic model of TV users, and the other is a topic model of the description words for viewed TV programs. The two LDA models are then integrated via a topic proportion parameter for TV programs, which enforces the grouping of similar TV users and associated description words for watched TV programs at the same time in a unified topic modeling framework. The unified model identifies the semantic relation between TV user groups and TV program description word groups so that more meaningful TV program recommendations can be made. The unified topic model also overcomes an item ramp-up problem such that new TV programs can be reliably recommended to TV users. Furthermore, from the topic model of TV users, TV users with similar tastes can be grouped as topics, which can then be recommended as social TV communities. To verify our proposed method of unified topic-modeling-based TV user grouping and TV program recommendation for social TV services, in our experiments, we used real TV viewing history data and electronic program guide data from a seven-month period collected by a TV poll agency. The experimental results show that the proposed unified topic model yields an average 81.4% precision for 50 topics in TV program recommendation and its performance is an average of 6.5% higher than that of the topic model of TV users only. For TV user prediction with new TV programs, the average prediction precision was 79.6%. Also, we showed the superiority of our proposed model in terms of both topic modeling performance and recommendation performance compared to two related topic models such as polylingual topic model and bilingual topic model.
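A single LDA pass over program-description words, which is only one half of the unified model described above, can be sketched as follows; scikit-learn is used for brevity and the toy descriptions stand in for real electronic program guide data.

```python
# Minimal sketch: one LDA topic model over TV-program description words.
# This is a toy, single-model illustration with scikit-learn, not the
# unified dual-LDA formulation of the paper; the descriptions are made up.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

descriptions = [
    "football league match highlights goals",
    "cooking chef recipe kitchen dessert",
    "news politics election debate economy",
    "soccer cup final penalty goals",
    "baking pastry chef chocolate recipe",
    "election results parliament economy interview",
]

vec = CountVectorizer()
X = vec.fit_transform(descriptions)

lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topics = lda.fit_transform(X)          # per-program topic proportions

terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[-4:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
print("program 0 topic proportions:", doc_topics[0].round(2))
```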
EM Modelling of RF Propagation Through Plasma Plumes
NASA Astrophysics Data System (ADS)
Pandolfo, L.; Bandinelli, M.; Araque Quijano, J. L.; Vecchi, G.; Pawlak, H.; Marliani, F.
2012-05-01
Electric propulsion is a commercially attractive solution for attitude and position control of geostationary satellites. Hall-effect ion thrusters generate a localized plasma flow in the surrounding of the satellite, whose impact on the communication system needs to be qualitatively and quantitatively assessed. An electromagnetic modelling tool has been developed and integrated into the Antenna Design Framework- ElectroMagnetic Satellite (ADF-EMS). The system is able to guide the user from the plume definition phases through plume installation and simulation. A validation activity has been carried out and the system has been applied to the plume modulation analysis of SGEO/Hispasat mission.
QoS prediction for web services based on user-trust propagation model
NASA Astrophysics Data System (ADS)
Thinh, Le-Van; Tu, Truong-Dinh
2017-10-01
Web service providers and users play an important online role; however, the rapidly growing number of service providers and users leads to many web services with similar functions. This is an active research area, and researchers seek to propose solutions that deliver the best service to users. Collaborative filtering (CF) algorithms are widely used in recommendation systems, although these are less effective for cold-start users. Recently, some recommender systems have been developed based on social network models, and the results show that social network models achieve better performance than conventional CF, especially for cold-start users. However, most social network-based recommendations do not consider the user's mood. This is a hidden source of information, and it is very useful in improving prediction efficiency. In this paper, we introduce a new model called User-Trust Propagation (UTP). The model uses a combination of trust and the mood of users to predict the QoS value, and matrix factorisation (MF) is used to train the model. The experimental results show that the proposed model gives better accuracy than other models, especially for the cold-start problem.
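The training step relies on matrix factorisation; a bare-bones sketch of factorising a sparse user-service QoS matrix by stochastic gradient descent is shown below, on synthetic data and without the trust and mood terms that distinguish the UTP model.

```python
# Minimal sketch: matrix factorisation of a sparse user-service QoS matrix
# by stochastic gradient descent. Synthetic data; trust and mood terms of
# the UTP model are intentionally omitted.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_services, k = 30, 20, 5

# Ground-truth low-rank QoS matrix with most entries unobserved.
true_U, true_V = rng.random((n_users, k)), rng.random((n_services, k))
Q = true_U @ true_V.T
mask = rng.random(Q.shape) < 0.35                   # observed entries only

U = 0.1 * rng.standard_normal((n_users, k))
V = 0.1 * rng.standard_normal((n_services, k))
lr, reg = 0.02, 0.01
obs = list(zip(*np.nonzero(mask)))

for epoch in range(200):
    for idx in rng.permutation(len(obs)):
        u, s = obs[idx]
        err = Q[u, s] - U[u] @ V[s]
        Uu = U[u].copy()
        U[u] += lr * (err * V[s] - reg * Uu)        # gradient step on user factors
        V[s] += lr * (err * Uu - reg * V[s])        # gradient step on service factors

pred = U @ V.T
rmse_obs = np.sqrt(np.mean((Q[mask] - pred[mask]) ** 2))
rmse_unobs = np.sqrt(np.mean((Q[~mask] - pred[~mask]) ** 2))
print(f"RMSE observed={rmse_obs:.3f}  held-out={rmse_unobs:.3f}")
```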
Voss, Clifford I.; Boldt, David; Shapiro, Allen M.
1997-01-01
This report describes a Graphical-User Interface (GUI) for SUTRA, the U.S. Geological Survey (USGS) model for saturated-unsaturated variable-fluid-density ground-water flow with solute or energy transport, which combines a USGS-developed code that interfaces SUTRA with Argus ONE, a commercial software product developed by Argus Interware. This product, known as Argus Open Numerical Environments (Argus ONE™), is a programmable system with geographic-information-system-like (GIS-like) functionality that includes automated gridding and meshing capabilities for linking geospatial information with finite-difference and finite-element numerical model discretizations. The GUI for SUTRA is based on a public-domain Plug-In Extension (PIE) to Argus ONE that automates the use of Argus ONE to: automatically create the appropriate geospatial information coverages (information layers) for SUTRA, provide menus and dialogs for inputting geospatial information and simulation control parameters for SUTRA, and allow visualization of SUTRA simulation results. Following simulation control data and geospatial data input by the user through the GUI, Argus ONE creates text files in a format required for normal input to SUTRA, and SUTRA can be executed within the Argus ONE environment. Then, hydraulic head, pressure, solute concentration, temperature, saturation and velocity results from the SUTRA simulation may be visualized. Although the GUI for SUTRA discussed in this report provides all of the graphical pre- and post-processor functions required for running SUTRA, it is also possible for advanced users to apply programmable features within Argus ONE to modify the GUI to meet the unique demands of particular ground-water modeling projects.
Developing Vocabularies to Improve Understanding and Use of NOAA Observing Systems
NASA Astrophysics Data System (ADS)
Austin, M.
2014-12-01
The NOAA Observing System Integrated Analysis project (NOSIA II) is an attempt to capture and tell the story of how valuable observing systems are in producing the products and services that are required to fulfill NOAA's diverse mission. NOAA's goals and mission areas cover a broad range of environmental data, and a complexity exists in the terms and vocabulary applied to the creation of observing-system-derived products. The NOSIA data collection focused first on decomposing NOAA's goals through the creation and acceptance of Mission Service Areas (MSAs) by NOAA senior leadership. Products and services that support the MSAs were then identified by interviewing product producers across the NOAA organization. Product data inputs, including models, databases, and observing systems, were also identified. The NOSIA model contains over 20,000 nodes, each representing a level in a network connecting products, data sources, users, and desired outcomes. It became apparent that the complexity and variety of the data collected required data management to mature the quality and content of the NOSIA model. The NOSIA Analysis Database (ADB) was developed initially to improve the consistency of terms and data types and to allow for the linkage of observing systems, products, and NOAA's goals and mission. The ADB also allowed the prototyping of reports and products in an easily accessible and comprehensive format for the first time. Web-based visualizations of relationships between products, data sources, users, and producers were generated to make the information easily understood. This includes developing ontologies/vocabularies that are used for the development of user-type-specific products for NOAA leadership, Observing System Portfolio managers, and the users of NOAA data.
Implications of Artificial Intelligence for End User Use of Online Systems.
ERIC Educational Resources Information Center
Smith, Linda C.
1980-01-01
Reviewed are several studies which demonstrate how artificial intelligence techniques can be applied in the design of end user-oriented interfaces (which would eliminate the need for an intermediary) to existing online systems, as well as in the development of future generations of online systems intended for the end user. (Author/SW)
Predictors of outcomes of assertive outreach teams: a 3-year follow-up study in North East England.
Carpenter, John; Luce, Anna; Wooff, David
2011-06-01
Assertive outreach (AO) is a required component of services for people with severe mental illness in England. However, the claims to its effectiveness have been contested and the relationships between team organisation, including model fidelity, the use of mental health interventions and outcomes for service users remain unclear. Three-year follow up of 33 AO teams was conducted using standardised measures of model fidelity and mental health interventions, and of current location and a range of outcomes for service users (n = 628). Predictors of the number of hospital admissions, mental health and social functioning at T2, and discharge from the team as 'improved' were modelled using multivariate regression analyses. Teams had moderate mean ratings of fidelity to the AO model. All rated highly on the core intervention modalities of engagement, assessment and care co-ordination, but ratings for psychosocial interventions were comparatively low. Two-thirds (462) of service users were still in AO and data were returned on 400 (87%). There was evidence of small improvements in mental health and social functioning and a reduction in the mean number of hospital admissions in the previous 2 years (from 2.09 to 1.39). Poor outcomes were predicted variously by service users' characteristics, previous psychiatric history, poor collaboration with services, homelessness and dual diagnosis. Fidelity to the AO model did not emerge as a predictor of outcome, but the team working for extended hours was associated with more frequent in-patient admissions and less likelihood of discharge from AO. Supportive interventions in daily living, together with the team's use of family and psychological interventions were also associated with poorer outcomes. Possible explanations for these unexpected findings are considered. AO appears to have been quite successful in keeping users engaged over a substantial period and to have an impact in supporting many people to live in the community and to avoid the necessity of psychiatric hospital admission. However, teams should focus on those with a history of hospital admissions, who do not engage well with services and for whom outcomes are less good. Psychosocial interventions should be applied. The relationship between model fidelity, team organisation, mental health interventions and outcomes is not straightforward and deserves further study.
Making Cloud Computing Available For Researchers and Innovators (Invited)
NASA Astrophysics Data System (ADS)
Winsor, R.
2010-12-01
High Performance Computing (HPC) facilities exist in most academic institutions but are almost invariably over-subscribed. Access is allocated based on academic merit, the only practical method of assigning valuable finite compute resources. Cloud computing on the other hand, and particularly commercial clouds, draw flexibly on an almost limitless resource as long as the user has sufficient funds to pay the bill. How can the commercial cloud model be applied to scientific computing? Is there a case to be made for a publicly available research cloud and how would it be structured? This talk will explore these themes and describe how Cybera, a not-for-profit non-governmental organization in Alberta Canada, aims to leverage its high speed research and education network to provide cloud computing facilities for a much wider user base.
NASA Technical Reports Server (NTRS)
Happell, Nadine; Miksell, Steve; Carlisle, Candace
1989-01-01
A major barrier in taking expert systems from prototype to operational status involves instilling end user confidence in the operational system. Different software life cycle models are examined, and the advantages and disadvantages of each when applied to expert system development are explored. The Fault Isolation Expert System for Tracking and data relay satellite system Applications (FIESTA) is presented as a case study of the development of an expert system. The end user confidence necessary for operational use of this system is accentuated by the fact that it will handle real-time data in a secure environment, allowing little tolerance for errors. How FIESTA is dealing with transition problems as it moves from an off-line standalone prototype to an on-line real-time system is discussed.
NASA Technical Reports Server (NTRS)
Hall, Callie; Arnone, Robert
2006-01-01
The NASA Applied Sciences Program seeks to transfer NASA data, models, and knowledge into the hands of end-users by forming links with partner agencies and associated decision support tools (DSTs). Through the NASA REASoN (Research, Education and Applications Solutions Network) Cooperative Agreement, the Oceanography Division of the Naval Research Laboratory (NRLSSC) is developing new products through the integration of data from NASA Earth-Sun System assets with coastal ocean forecast models and other available data to enhance coastal management in the Gulf of Mexico. The recipient federal agency for this research effort is the National Oceanic and Atmospheric Administration (NOAA). The contents of this report detail the effort to further the goals of the NASA Applied Sciences Program by demonstrating the use of NASA satellite products combined with data-assimilating ocean models to provide near real-time information to maritime users and coastal managers of the Gulf of Mexico. This effort provides new and improved capabilities for monitoring, assessing, and predicting the coastal environment. Coastal managers can exploit these capabilities through enhanced DSTs at federal, state and local agencies. The project addresses three major issues facing coastal managers: 1) Harmful Algal Blooms (HABs); 2) hypoxia; and 3) freshwater fluxes to the coastal ocean. A suite of ocean products capable of describing Ocean Weather is assembled on a daily basis as the foundation for this semi-operational multiyear effort. This continuous real-time capability brings decision makers a new ability to monitor both normal and anomalous coastal ocean conditions with a steady flow of satellite and ocean model data. Furthermore, as the baseline data sets are used more extensively and the customer list grows, customer feedback is obtained and additional customized products are developed and provided to decision makers. Continual customer feedback and response with new, improved products are required between the researchers and customers. This document details the methods by which these coastal ocean products are produced, including the data flow, distribution, and verification. Product applications, and the degree to which these products are used successfully within NOAA and coordinated with the Mississippi Department of Marine Resources (MDMR), are benchmarked.
NASA Astrophysics Data System (ADS)
Carniel, Roberto; Di Cecca, Mauro; Jaquet, Olivier
2006-05-01
In the framework of the EU-funded project "Multi-disciplinary monitoring, modelling and forecasting of volcanic hazard" (MULTIMO), multiparametric data have been recorded at the MULTIMO station in Montserrat. Moreover, several other long time series, recorded at Montserrat and at other volcanoes, have been acquired in order to test stochastic and deterministic methodologies under development. Creating a general framework to handle data efficiently is a considerable task even for homogeneous data. In the case of heterogeneous data, this becomes a major issue. A need for a consistent way of browsing such a heterogeneous dataset in a user-friendly way therefore arose. Additionally, a framework for applying the calculation of the developed dynamical parameters on the data series was also needed in order to easily keep these parameters under control, e.g. for monitoring, research or forecasting purposes. The solution which we present is completely based on Open Source software, including Linux operating system, MySql database management system, Apache web server, Zope application server, Scilab math engine, Plone content management framework, Unified Modelling Language. From the user point of view the main advantage is the possibility of browsing through datasets recorded on different volcanoes, with different instruments, with different sampling frequencies, stored in different formats, all via a consistent, user- friendly interface that transparently runs queries to the database, gets the data from the main storage units, generates the graphs and produces dynamically generated web pages to interact with the user. The involvement of third parties for continuing the development in the Open Source philosophy and/or extending the application fields is now sought.
Smith, Vincent S; Rycroft, Simon D; Harman, Kehan T; Scott, Ben; Roberts, David
2009-01-01
Background Natural History science is characterised by a single immense goal (to document, describe and synthesise all facets pertaining to the diversity of life) that can only be addressed through a seemingly infinite series of smaller studies. The discipline's failure to meaningfully connect these small studies with natural history's goal has made it hard to demonstrate the value of natural history to a wider scientific community. Digital technologies provide the means to bridge this gap. Results We describe the system architecture and template design of "Scratchpads", a data-publishing framework for groups of people to create their own social networks supporting natural history science. Scratchpads cater to the particular needs of individual research communities through a common database and system architecture. This is flexible and scalable enough to support multiple networks, each with its own choice of features, visual design, and constituent data. Our data model supports web services on standardised data elements that might be used by related initiatives such as GBIF and the Encyclopedia of Life. A Scratchpad allows users to organise data around user-defined or imported ontologies, including biological classifications. Automated semantic annotation and indexing is applied to all content, allowing users to navigate intuitively and curate diverse biological data, including content drawn from third party resources. A system of archiving citable pages allows stable referencing with unique identifiers and provides credit to contributors through normal citation processes. Conclusion Our framework currently serves more than 1,100 registered users across 100 sites, spanning academic, amateur and citizen-science audiences. These users have generated more than 130,000 nodes of content in the first two years of use. The template of our architecture may serve as a model to other research communities developing data publishing frameworks outside biodiversity research. PMID:19900302
2007-06-01
... adoption. It focuses on cost and benefit uncertainty as well as network effects applied to end-users and their organizations. Specifically, it explores Department of Defense (DoD) acquisition programs ...