Sample records for client database applications

  1. SQLGEN: a framework for rapid client-server database application development.

    PubMed

    Nadkarni, P M; Cheung, K H

    1995-12-01

    SQLGEN is a framework for rapid client-server relational database application development. It relies on an active data dictionary on the client machine that stores metadata on one or more database servers to which the client may be connected. The dictionary generates dynamic Structured Query Language (SQL) to perform common database operations; it also stores information about the access rights of the user at log-in time, which is used to partially self-configure the behavior of the client to disable inappropriate user actions. SQLGEN uses a microcomputer database as the client to store metadata in relational form, to transiently capture server data in tables, and to allow rapid application prototyping followed by porting to client-server mode with modest effort. SQLGEN is currently used in several production biomedical databases.
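A minimal sketch of the dictionary-driven SQL generation idea the abstract describes: client-side metadata, rather than hand-written SQL, determines the statement that is emitted. The table name, column names, and function names below are invented for illustration, not SQLGEN's actual interface.

```python
def generate_select(dictionary, table, where=None):
    """Build a SELECT statement from data-dictionary metadata."""
    cols = ", ".join(col["name"] for col in dictionary[table]["columns"])
    sql = f"SELECT {cols} FROM {table}"
    if where:
        sql += f" WHERE {where}"
    return sql

# Hypothetical client-side data dictionary describing one server table.
data_dictionary = {
    "patient": {
        "columns": [{"name": "patient_id"}, {"name": "surname"}, {"name": "dob"}]
    }
}

print(generate_select(data_dictionary, "patient", "surname = :name"))
# SELECT patient_id, surname, dob FROM patient WHERE surname = :name
```

Because the SQL is derived from metadata at run time, adding a column to the dictionary changes every generated statement without touching application code.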

  2. Database architectures for Space Telescope Science Institute

    NASA Astrophysics Data System (ADS)

    Lubow, Stephen

    1993-08-01

    At STScI nearly all large applications require database support. A general purpose architecture has been developed and is in use that relies upon an extended client-server paradigm. Processing is in general distributed across three processes, each of which generally resides on its own processor. Database queries are evaluated on one such process, called the DBMS server. The DBMS server software is provided by a database vendor. The application issues database queries and is called the application client. This client uses a set of generic DBMS application programming calls through our STDB/NET programming interface. Intermediate between the application client and the DBMS server is the STDB/NET server. This server accepts generic query requests from the application and converts them into the specific requirements of the DBMS server. In addition, it accepts query results from the DBMS server and passes them back to the application. Typically the STDB/NET server is local to the DBMS server, while the application client may be remote. The STDB/NET server provides additional capabilities such as database deadlock restart and performance monitoring. This architecture is currently in use for some major STScI applications, including the ground support system. We are currently investigating means of providing ad hoc query support to users through the above architecture. Such support is critical for providing flexible user interface capabilities. The Universal Relation advocated by Ullman, Kernighan, and others appears to be promising. In this approach, the user sees the entire database as a single table, thereby freeing the user from needing to understand the detailed schema. A software layer provides the translation between the user and detailed schema views of the database. However, many subtle issues arise in making this transformation. We are currently exploring this scheme for use in the Hubble Space Telescope user interface to the data archive system (DADS).
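The intermediary-server pattern described above can be sketched as a thin translation layer: the application client issues a generic request, and the middle tier maps it onto whatever the vendor DBMS expects. The dialect table and function names here are assumptions for illustration, not the actual STDB/NET interface.

```python
# Hypothetical vendor dialects; a real middle tier would also manage
# connections, deadlock restart, and performance monitoring.
DIALECTS = {
    "vendor_a": lambda table, limit: f"SELECT * FROM {table} FETCH FIRST {limit} ROWS ONLY",
    "vendor_b": lambda table, limit: f"SELECT * FROM {table} LIMIT {limit}",
}

def translate(generic_request, dialect):
    """Convert a DBMS-independent request into a vendor-specific query."""
    return DIALECTS[dialect](generic_request["table"], generic_request["limit"])

req = {"table": "observations", "limit": 10}
print(translate(req, "vendor_b"))  # SELECT * FROM observations LIMIT 10
```

The payoff of this indirection is the one the abstract notes: the application client can be remote and vendor-agnostic, while only the middle tier knows the specific DBMS.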

  3. Open Clients for Distributed Databases

    NASA Astrophysics Data System (ADS)

    Chayes, D. N.; Arko, R. A.

    2001-12-01

    We are actively developing a collection of open-source example clients that demonstrate use of our "back end" data management infrastructure. The data management system is reported elsewhere at this meeting (Arko and Chayes: A Scaleable Database Infrastructure). In addition to their primary goal of being examples for others to build upon, some of these clients may have limited utility in themselves. More information about the clients and the data infrastructure is available online at http://data.ldeo.columbia.edu. The examples to be demonstrated include several web-based clients: those developed for the Community Review System of the Digital Library for Earth System Education, a real-time watch-stander's log book, an offline interface to log book entries, a simple client for searching multibeam metadata, and others. These are Internet-enabled, generally web-based front ends that support searches against one or more relational databases using industry-standard SQL queries. In addition to the web-based clients, simple SQL searches from within Excel and similar applications will be demonstrated. By defining, documenting, and publishing a clear interface to the fully searchable databases, it becomes relatively easy to construct client interfaces that are optimized for specific applications, in comparison to building a monolithic data and user interface system.

  4. Kentucky geotechnical database.

    DOT National Transportation Integrated Search

    2005-03-01

    Development of a comprehensive, dynamic geotechnical database is described. Computer software selected to program the client/server application in a Windows environment, components and structure of the geotechnical database, and primary factors cons...

  5. Develop a Prototype Personal Health Record Application (PHR-A) that Captures Information About Daily Living Important for Diabetes and Provides Decision Support with Actionable Advice for Diabetes Self Care

    DTIC Science & Technology

    2012-10-01

    higher  Java v5Apache Struts v2  Hibernate v2  C3PO  SQL*Net client / JDBC Database Server  Oracle 10.0.2 Desktop Client  Internet Explorer...for mobile Smartphones - A Java -based framework utilizing Apache Struts on the server - Relational database to handle data storage requirements B...technologies are as follows: Technology Use Requirements Java Application Provides the backend application software to drive the PHR-A 7 BEA Web

  6. Can "patient keeper" help in-patients?

    PubMed

    Al-Hinnawi, M F

    2009-06-01

    The aim of this paper is to present our "Patient Keeper" application, a client-server medical application. "Patient Keeper" is designed to run on a mobile phone for the client application and on a PC for the server application, using J2ME and Java 2, respectively. This application can help doctors during visits to their patients in hospitals. The client application allows doctors to store on their mobile phones the results of their diagnoses and findings, such as temperature, blood pressure, medications, and analyses, and to send this information to the server via short message service (SMS) for storage in a database. The server can also respond to any request from the client and send the result via Bluetooth, infrared, or over the air. Experimental results showed a significant improvement in healthcare delivery and a reduction in the length of in-patient stays.

  7. Chemical Inventory Management at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Kraft, Shirley S.; Homan, Joseph R.; Bajorek, Michael J.; Dominguez, Manuel B.; Smith, Vanessa L.

    1997-01-01

    The Chemical Management System (CMS) is a client/server application developed with PowerBuilder and Sybase for the Lewis Research Center (LeRC). PowerBuilder is a client-server application development tool; Sybase is a relational database management system. The entire LeRC community can access the CMS from any desktop environment. The multiple functions and benefits of the CMS are addressed.

  8. Service Management Database for DSN Equipment

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
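The integrated event-logging pattern the abstract describes, persist an event, then publish it to a message bus for synchronous and asynchronous consumers, can be sketched in miniature. The in-process list and queue below stand in for SMDB's Oracle tables and JMS messaging system; the event fields are invented.

```python
import queue

event_log = []            # stands in for the database table of events
bus = queue.Queue()       # stands in for the JMS messaging system

def publish(severity, source, message):
    """Persist the event, then hand it to subscribers via the bus."""
    event = {"severity": severity, "source": source, "message": message}
    event_log.append(event)
    bus.put(event)

publish("INFO", "scheduler", "track configured")
print(bus.get()["message"])  # track configured
```

Writing to durable storage before publishing is what lets monitoring applications replay historical archives even if a subscriber was offline when the event fired.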

  9. Time and Space Efficient Algorithms for Two-Party Authenticated Data Structures

    NASA Astrophysics Data System (ADS)

    Papamanthou, Charalampos; Tamassia, Roberto

    Authentication is increasingly relevant to data management. Data is being outsourced to untrusted servers, and clients want to securely update and query their data. For example, in database outsourcing, a client's database is stored and maintained by an untrusted server. Also, in simple storage systems, clients can store very large amounts of data but, at the same time, want to verify its integrity when they retrieve it. In this paper, we present a model and protocol for two-party authentication of data structures. Namely, a client outsources its data structure and verifies that the answers to its queries have not been tampered with. We provide efficient algorithms to securely outsource a skip list with logarithmic time overhead at the server and client and logarithmic communication cost, thus providing an efficient authentication primitive for outsourced data, both structured (e.g., relational databases) and semi-structured (e.g., XML documents). In our technique, the client stores only a constant amount of space, which is optimal. Our two-party authentication framework can be deployed on top of existing storage applications, thus providing an efficient authentication service. Finally, we present experimental results that demonstrate the practical efficiency and scalability of our scheme.
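The setting can be illustrated with a toy Merkle-tree example (not the paper's authenticated skip-list construction): the client keeps only a constant-size digest, the server answers queries with a logarithmic-size proof, and the client replays the proof to detect tampering. This sketch assumes a power-of-two number of leaves.

```python
import hashlib

def h(*parts):
    m = hashlib.sha256()
    for p in parts:
        m.update(p)
    return m.digest()

def build_tree(leaves):
    """Server side: all levels of a Merkle tree over byte-string leaves."""
    level = [h(leaf) for leaf in leaves]
    levels = [level]
    while len(level) > 1:
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def prove(levels, index):
    """Server side: sibling hashes from leaf to root (logarithmic size)."""
    proof = []
    for level in levels[:-1]:
        proof.append((index % 2, level[index ^ 1]))  # (am-I-right-child, sibling)
        index //= 2
    return proof

def verify(root, leaf, proof):
    """Client side: replay the path and compare with the stored digest."""
    digest = h(leaf)
    for is_right, sibling in proof:
        digest = h(sibling, digest) if is_right else h(digest, sibling)
    return digest == root

data = [b"row0", b"row1", b"row2", b"row3"]
levels = build_tree(data)
root = levels[-1][0]          # the only state the client must keep
assert verify(root, b"row2", prove(levels, 2))
assert not verify(root, b"tampered", prove(levels, 2))
```

The client's storage is one hash regardless of database size, which mirrors the constant-space optimality claim in the abstract.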

  10. Retrieving high-resolution images over the Internet from an anatomical image database

    NASA Astrophysics Data System (ADS)

    Strupp-Adams, Annette; Henderson, Earl

    1999-12-01

    The Visible Human Data Set is an important contribution to the national collection of anatomical images. To enhance the availability of these images, the National Library of Medicine has supported the design and development of a prototype object-oriented image database which imports, stores, and distributes high-resolution anatomical images in both pixel and voxel formats. One of the key database modules is its client-server Internet interface. This Web interface provides a query engine with retrieval access to high-resolution anatomical images that range in size from 100 KB for browser-viewable rendered images to 1 GB for anatomical structures in voxel file formats. The Web query and retrieval client-server system is composed of applet GUIs, servlets, and RMI application modules which communicate with each other to allow users to query for specific anatomical structures and retrieve image data, as well as associated anatomical images, from the database. Selected images can be downloaded individually as single files via HTTP or downloaded in batch mode over the Internet to the user's machine through an applet that uses Netscape's Object Signing mechanism. The image database uses ObjectDesign's object-oriented DBMS, ObjectStore, which has a Java interface. The query and retrieval system has been tested with a Java-CDE window system and on the x86 architecture using Windows NT 4.0. This paper describes the Java applet client search engine that queries the database; the Java client module that enables users to view anatomical images online; the Java application server interface to the database, which organizes data returned to the user; and its distribution engine, which allows users to download image files individually and/or in batch mode.

  11. Supply Chain Collaboration: Information Sharing in a Tactical Operating Environment

    DTIC Science & Technology

    2013-06-01

    architecture, there are four tiers: Client (Web Application Clients), Presentation (Web-Server), Processing (Application-Server), Data (Database...organization in each period. This data will be collected for analysis. i) Analyses and Validation: We will perform statistical tests on this data, Pareto ...notes, outstanding deliveries, and inventory. i) Analyses and Validation: We will perform statistical tests on this data, Pareto analyses and confirmation

  12. The Network Configuration of an Object Relational Database Management System

    NASA Technical Reports Server (NTRS)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  13. Design and Development of a Web-Based Self-Monitoring System to Support Wellness Coaching.

    PubMed

    Zarei, Reza; Kuo, Alex

    2017-01-01

    We analyzed, designed and deployed a web-based self-monitoring system to support wellness coaching. A wellness coach can plan clients' exercise and diet through the system and is able to monitor the changes in body dimensions and body composition that the client reports. The system can also visualize the client's data in the form of graphs for both the client and the coach. Both parties can communicate through the messaging feature embedded in the application. A reminder component is also incorporated into the system and sends reminder messages to clients when their reporting is due. The web-based self-monitoring application uses Oracle 11g XE as the backend database and Application Express 4.2 as the user-interface development tool. The system allows users to access, update and modify data through a web browser anytime, anywhere, and on any device.

  14. Design and implementation of a distributed large-scale spatial database system based on J2EE

    NASA Astrophysics Data System (ADS)

    Gong, Jianya; Chen, Nengcheng; Zhu, Xinyan; Zhang, Xia

    2003-03-01

    With the increasing maturity of distributed object technology, CORBA, .NET and EJB are widely used in the traditional IT field. However, the theory and practice of distributed spatial databases need further improvement, owing to the contradictions between large-scale spatial data and limited network bandwidth, and between transitory sessions and long transaction processing. Differences and trends among CORBA, .NET and EJB are discussed in detail; afterwards, the concept, architecture and characteristics of a distributed large-scale seamless spatial database system based on J2EE are provided, comprising a GIS client application, web server, GIS application server and spatial data server. Moreover, the design and implementation of its components are explained: the GIS client application based on JavaBeans, the GIS engine based on servlets, and the GIS application server based on GIS Enterprise JavaBeans (containing session beans and entity beans). Besides, experiments on the relation between spatial data volume and response time under different conditions are conducted, which prove that a distributed spatial database system based on J2EE can be used to manage, distribute and share large-scale spatial data on the Internet. Lastly, a distributed large-scale seamless image database based on the Internet is presented.

  15. Fine-grained policy control in U.S. Army Research Laboratory (ARL) multimodal signatures database

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly; Grueneberg, Keith; Wood, David; Calo, Seraphin

    2014-06-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) consists of a number of colocated relational databases representing a collection of data from various sensors. Role-based access to this data is granted to external organizations, such as DoD contractors and other government agencies, through a client Web portal. In the current MMSDB system, access control is only at the database and firewall level. In order to offer finer-grained security, changes to existing user profile schemas and authentication mechanisms are usually needed. In this paper, we describe a software middleware architecture and implementation that allows fine-grained access control to the MMSDB at a dataset, table, and row level. Result sets from MMSDB queries issued in the client portal are filtered with the use of a policy enforcement proxy, with minimal changes to the existing client software and database. Before resulting data is returned to the client, policies are evaluated to determine whether the user or role is authorized to access the data. Policies can be authored to filter data at the row, table or column level of a result set. The system uses various technologies developed in the International Technology Alliance in Network and Information Science (ITA) for policy-controlled information sharing and dissemination. Use of the Policy Management Library provides a mechanism for the management and evaluation of policies to support finer-grained access to the data in the MMSDB system. The GaianDB is a policy-enabled, federated database that acts as a proxy between the client application and the MMSDB system.
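The proxy idea above, evaluate policies against a result set before it reaches the client, can be sketched in a few lines. The policy representation, role names, and field names below are invented for illustration; the real system uses the ITA Policy Management Library and GaianDB.

```python
# Hypothetical role-based policies: which tables a role may query and
# which columns must be stripped from any result it receives.
POLICIES = {
    "contractor": {"allowed_tables": {"acoustic"}, "drop_columns": {"location"}},
    "analyst":    {"allowed_tables": {"acoustic", "seismic"}, "drop_columns": set()},
}

def enforce(role, table, rows):
    """Return only the rows/columns this role may see; empty if the table is denied."""
    policy = POLICIES[role]
    if table not in policy["allowed_tables"]:
        return []
    return [
        {k: v for k, v in row.items() if k not in policy["drop_columns"]}
        for row in rows
    ]

rows = [{"id": 1, "amplitude": 0.7, "location": "site A"}]
print(enforce("contractor", "acoustic", rows))
# [{'id': 1, 'amplitude': 0.7}]
print(enforce("contractor", "seismic", rows))  # []
```

Filtering in the proxy rather than in the database is what lets the existing client software and schemas stay unchanged.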

  16. Group-oriented coordination models for distributed client-server computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Hughes, Craig S.

    1994-01-01

    This paper describes group-oriented control models for distributed client-server interactions. These models transparently coordinate requests for services that involve multiple servers, such as queries across distributed databases. Specific capabilities include: decomposing and replicating client requests; dispatching request subtasks or copies to independent, networked servers; and combining server results into a single response for the client. The control models were implemented by combining request broker and process group technologies with an object-oriented communication middleware tool. The models are illustrated in the context of a distributed operations support application for space-based systems.

  17. RefPrimeCouch—a reference gene primer CouchApp

    PubMed Central

    Silbermann, Jascha; Wernicke, Catrin; Pospisil, Heike; Frohme, Marcus

    2013-01-01

    To support a quantitative real-time polymerase chain reaction standardization project, a new reference gene database application was required. The new database application was built with the explicit goal of simplifying not only the development process but also making the user interface more responsive and intuitive. To this end, CouchDB was used as the backend with a lightweight dynamic user interface implemented client-side as a one-page web application. Data entry and curation processes were streamlined using an OpenRefine-based workflow. The new RefPrimeCouch database application provides its data online under an Open Database License. Database URL: http://hpclife.th-wildau.de:5984/rpc/_design/rpc/view.html PMID:24368831

  18. RefPrimeCouch--a reference gene primer CouchApp.

    PubMed

    Silbermann, Jascha; Wernicke, Catrin; Pospisil, Heike; Frohme, Marcus

    2013-01-01

    To support a quantitative real-time polymerase chain reaction standardization project, a new reference gene database application was required. The new database application was built with the explicit goal of simplifying not only the development process but also making the user interface more responsive and intuitive. To this end, CouchDB was used as the backend with a lightweight dynamic user interface implemented client-side as a one-page web application. Data entry and curation processes were streamlined using an OpenRefine-based workflow. The new RefPrimeCouch database application provides its data online under an Open Database License. Database URL: http://hpclife.th-wildau.de:5984/rpc/_design/rpc/view.html.

  19. Preliminary Results on Design and Implementation of a Solar Radiation Monitoring System

    PubMed Central

    Balan, Mugur C.; Damian, Mihai; Jäntschi, Lorentz

    2008-01-01

    The paper presents a solar radiation monitoring system using two scientific pyranometers and an online, home-made computer data acquisition system. The first pyranometer measures the global solar radiation and the other one, which is shaded, measures the diffuse radiation. The values of total and diffuse solar radiation are continuously stored in a database on a server. Original software was created for data acquisition and interrogation of the system. The server application acquires the data from the pyranometers and stores it in the database at a rate of one record every 50 seconds. The client-server application queries the database and provides descriptive statistics. A web interface allows any user to define inclusion criteria and obtain the results. In terms of results, the system is able to provide direct, diffuse and total radiation intensities as time series. Our client-server application also computes derived heat quantities. The ability of the system to evaluate the local solar energy potential is highlighted. PMID:27879746
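Since the shaded pyranometer measures the diffuse component and the unshaded one the global (total) component, the direct component on the horizontal plane can be derived by subtraction. A small sketch of that derivation over a time series (the W/m2 values are invented):

```python
def direct_component(total, diffuse):
    """Direct horizontal irradiance = global - diffuse, floored at zero
    to absorb small sensor disagreements under overcast sky."""
    return [max(t - d, 0.0) for t, d in zip(total, diffuse)]

total_series   = [820.0, 640.0, 95.0]   # unshaded pyranometer, W/m2
diffuse_series = [110.0, 120.0, 95.0]   # shaded pyranometer, W/m2
print(direct_component(total_series, diffuse_series))  # [710.0, 520.0, 0.0]
```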

  20. Concept locator: a client-server application for retrieval of UMLS metathesaurus concepts through complex boolean query.

    PubMed

    Nadkarni, P M

    1997-08-01

    Concept Locator (CL) is a client-server application that accesses a Sybase relational database server containing a subset of the UMLS Metathesaurus for the purpose of retrieval of concepts corresponding to one or more query expressions supplied to it. CL's query grammar permits complex Boolean expressions, wildcard patterns, and parenthesized (nested) subexpressions. CL translates the query expressions supplied to it into one or more SQL statements that actually perform the retrieval. The generated SQL is optimized by the client to take advantage of the strengths of the server's query optimizer, and sidesteps its weaknesses, so that execution is reasonably efficient.
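The translation step, a Boolean query expression becoming SQL, can be illustrated with a deliberately simplified sketch. CL's actual grammar supports nesting and more; the table and column names below follow UMLS conventions but are assumptions here, not CL's generated SQL.

```python
def term_to_sql(term):
    """Map one query term to a LIKE predicate; '*' is the wildcard."""
    return "str LIKE '{}'".format(term.replace("*", "%"))

def query_to_sql(terms, op="AND"):
    """Combine term predicates into one SELECT over the concept table."""
    predicate = f" {op} ".join(term_to_sql(t) for t in terms)
    return f"SELECT cui, str FROM mrcon WHERE {predicate}"

print(query_to_sql(["heart*", "failure"]))
# SELECT cui, str FROM mrcon WHERE str LIKE 'heart%' AND str LIKE 'failure'
```

Generating the SQL on the client, as the abstract notes, is also where optimizer-specific rewrites would be applied before the statement is sent to the server.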

  1. JPEG2000 and dissemination of cultural heritage over the Internet.

    PubMed

    Politou, Eugenia A; Pavlidis, George P; Chamzas, Christodoulos

    2004-03-01

    By applying the latest technologies in image compression for managing the storage of massive image data within cultural heritage databases, and by exploiting the universality of the Internet, we are now able not only to effectively digitize, record and preserve, but also to promote the dissemination of cultural heritage. In this work we present an application of the latest image compression standard, JPEG2000, in managing and browsing image databases, focusing on the image transmission aspect rather than on database management and indexing. We combine JPEG2000 image compression with client-server socket connections and a client browser plug-in, so as to provide an all-in-one package for remote browsing of JPEG2000-compressed image databases, suitable for the effective dissemination of cultural heritage.

  2. Distributed data collection for a database of radiological image interpretations

    NASA Astrophysics Data System (ADS)

    Long, L. Rodney; Ostchega, Yechiam; Goh, Gin-Hua; Thoma, George R.

    1997-01-01

    The National Library of Medicine, in collaboration with the National Center for Health Statistics and the National Institute for Arthritis and Musculoskeletal and Skin Diseases, has built a system for collecting radiological interpretations for a large set of x-ray images acquired as part of the data gathered in the second National Health and Nutrition Examination Survey. This system is capable of delivering across the Internet 5- and 10-megabyte x-ray images to Sun workstations equipped with X Window based 2048 X 2560 image displays, for the purpose of having these images interpreted for the degree of presence of particular osteoarthritic conditions in the cervical and lumbar spines. The collected interpretations can then be stored in a database at the National Library of Medicine, under control of the Illustra DBMS. This system is a client/server database application which integrates (1) distributed server processing of client requests, (2) a customized image transmission method for faster Internet data delivery, (3) distributed client workstations with high resolution displays, image processing functions and an on-line digital atlas, and (4) relational database management of the collected data.

  3. Shuttle-Data-Tape XML Translator

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2005-01-01

    JSDTImport is a computer program for translating native Shuttle Data Tape (SDT) files from American Standard Code for Information Interchange (ASCII) format into databases in other formats. JSDTImport solves the problem of organizing the SDT content, affording flexibility to enable users to choose how to store the information in a database to better support client and server applications. JSDTImport can be dynamically configured by use of a simple Extensible Markup Language (XML) file. JSDTImport uses this XML file to define how each record and field will be parsed, its layout and definition, and how the resulting database will be structured. JSDTImport also includes a client application programming interface (API) layer that provides abstraction for the data-querying process. The API enables a user to specify the search criteria to apply in gathering all the data relevant to a query. The API can be used to organize the SDT content and translate into a native XML database. The XML format is structured into efficient sections, enabling excellent query performance by use of the XPath query language. Optionally, the content can be translated into a Structured Query Language (SQL) database for fast, reliable SQL queries on standard database server computers.
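The XML-driven parsing idea behind JSDTImport can be sketched briefly: a layout file declares each field's name and width, and the parser slices fixed-width records accordingly. The layout and record below are invented for illustration, not the real SDT format or JSDTImport's configuration schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical layout definition: one <field> element per record field.
LAYOUT_XML = """
<record>
  <field name="msid"  width="8"/>
  <field name="value" width="6"/>
</record>
"""

def load_layout(xml_text):
    """Read (name, width) pairs from the layout definition."""
    root = ET.fromstring(xml_text)
    return [(f.get("name"), int(f.get("width"))) for f in root.findall("field")]

def parse_record(line, layout):
    """Slice one fixed-width record into a dict according to the layout."""
    out, pos = {}, 0
    for name, width in layout:
        out[name] = line[pos:pos + width].strip()
        pos += width
    return out

layout = load_layout(LAYOUT_XML)
print(parse_record("V74X1234  27.5", layout))
# {'msid': 'V74X1234', 'value': '27.5'}
```

Keeping the layout in a configuration file rather than in code is what makes the translator "dynamically configured": a new record format means editing XML, not recompiling.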

  4. HOED: Hypermedia Online Educational Database.

    ERIC Educational Resources Information Center

    Duval, E.; Olivie, H.

    This paper presents HOED, a distributed hypermedia client-server system for educational resources. The aim of HOED is to provide a library facility for hyperdocuments that is accessible via the world wide web. Its main application domain is education. The HOED database not only holds the educational resources themselves, but also data describing…

  5. FirebrowseR: an R client to the Broad Institute’s Firehose Pipeline

    PubMed Central

    Deng, Mario; Brägelmann, Johannes; Kryukov, Ivan; Saraiva-Agostinho, Nuno; Perner, Sven

    2017-01-01

    With its Firebrowse service (http://firebrowse.org/), the Broad Institute is making large-scale multi-platform omics data analysis results publicly available through a Representational State Transfer (REST) Application Programming Interface (API). Querying this database through an API client from an arbitrary programming environment is an essential task, allowing other developers and researchers to focus on their analysis and avoid data wrangling. Hence, as a first result, we developed a workflow to automatically generate, test and deploy such clients for rapid response to API changes. Its underlying infrastructure, a combination of free and publicly available web services, facilitates the development of API clients. It decouples changes in server software from the client software by reacting to changes in the RESTful service and removing direct dependencies on a specific implementation of an API. As a second result, FirebrowseR, an R client to the Broad Institute's RESTful Firehose Pipeline, is provided as a working example built by means of the presented workflow. The package's features are demonstrated by an example analysis of cancer gene expression data. Database URL: https://github.com/mariodeng/ PMID:28062517

  6. FirebrowseR: an R client to the Broad Institute's Firehose Pipeline.

    PubMed

    Deng, Mario; Brägelmann, Johannes; Kryukov, Ivan; Saraiva-Agostinho, Nuno; Perner, Sven

    2017-01-01

    With its Firebrowse service (http://firebrowse.org/), the Broad Institute is making large-scale multi-platform omics data analysis results publicly available through a Representational State Transfer (REST) Application Programming Interface (API). Querying this database through an API client from an arbitrary programming environment is an essential task, allowing other developers and researchers to focus on their analysis and avoid data wrangling. Hence, as a first result, we developed a workflow to automatically generate, test and deploy such clients for rapid response to API changes. Its underlying infrastructure, a combination of free and publicly available web services, facilitates the development of API clients. It decouples changes in server software from the client software by reacting to changes in the RESTful service and removing direct dependencies on a specific implementation of an API. As a second result, FirebrowseR, an R client to the Broad Institute's RESTful Firehose Pipeline, is provided as a working example built by means of the presented workflow. The package's features are demonstrated by an example analysis of cancer gene expression data. Database URL: https://github.com/mariodeng/. © The Author(s) 2017. Published by Oxford University Press.
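What a generated REST client wrapper fundamentally does is turn function arguments into a request against an endpoint URL. A minimal sketch of that step follows; the endpoint path and parameter names are assumptions for illustration, not the documented Firebrowse API, and a real client would then fetch the URL and parse the JSON response.

```python
from urllib.parse import urlencode

BASE = "http://firebrowse.org/api/v1"

def build_request(endpoint, **params):
    """Compose a request URL from an endpoint name and query parameters."""
    return f"{BASE}/{endpoint}?{urlencode(sorted(params.items()))}"

url = build_request("Samples/mRNASeq", gene="TP53", format="json")
print(url)
# http://firebrowse.org/api/v1/Samples/mRNASeq?format=json&gene=TP53
```

Auto-generating many such wrappers from a service description is what lets the client package track API changes without hand-editing each function.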

  7. The BiolAD-DB system: an informatics system for clinical and genetic data.

    PubMed

    Nielsen, David A; Leidner, Marty; Haynes, Chad; Krauthammer, Michael; Kreek, Mary Jeanne

    2007-01-01

    The Biology of Addictive Diseases-Database (BiolAD-DB) system is a research bioinformatics system for archiving, analyzing, and processing complex clinical and genetic data. The database schema employs design principles for handling complex clinical information, such as response items in genetic questionnaires. Data access and validation are provided by the BiolAD-DB client application, which features a data validation engine tightly coupled to a graphical user interface. Data integrity is provided by the password-protected, SQL-compliant BiolAD-DB server and database. BiolAD-DB tools further provide functionalities for generating customized reports and views. The BiolAD-DB system schema, client, and installation instructions are freely available at http://www.rockefeller.edu/biolad-db/.

  8. Visualization of historical data for the ATLAS detector controls - DDV

    NASA Astrophysics Data System (ADS)

    Maciejewski, J.; Schlenker, S.

    2017-10-01

    The ATLAS experiment is one of four detectors located on the Large Hadron Collider (LHC) at CERN. Its detector control system (DCS) stores the slow-control data acquired within the back end of distributed WinCC OA applications in an Oracle relational database, which enables the data to be retrieved for future analysis, debugging and detector development. The ATLAS DCS Data Viewer (DDV) is a client-server application providing access to the historical data from outside the experiment network. The server builds optimized SQL queries, retrieves the data from the database and serves it to the clients via HTTP connections. The server also implements protection methods to prevent malicious use of the database. The client is an AJAX-style web application based on Vaadin (a framework built around the Google Web Toolkit, GWT) which gives users the possibility to access the data with ease. The DCS metadata can be selected using a column-tree navigation or a search engine supporting regular expressions. The data is visualized by a selection of output modules such as JavaScript value-over-time plots or a lazy-loading table widget. Additional plugins give users the possibility to retrieve the data in ROOT format or as an ASCII file. Control system alarms can also be visualized in a dedicated table if necessary. The client can generate Python mock-up scripts that query the pythonic DDV server directly, so that users can embed them in more complex analysis programs. Users are also able to store searches and output configurations as XML on the server, to share with others via URL or to embed in HTML.

  9. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  10. Electronic Reference Library: Silverplatter's Database Networking Solution.

    ERIC Educational Resources Information Center

    Millea, Megan

    Silverplatter's Electronic Reference Library (ERL) provides wide area network access to its databases using TCP/IP communications and client-server architecture. ERL has two main components: The ERL clients (retrieval interface) and the ERL server (search engines). ERL clients provide patrons with seamless access to multiple databases on multiple…

  11. Promotion Assistance Tool for Mobile Phone Users

    NASA Astrophysics Data System (ADS)

    Intraprasert, P.; Jatikul, N.; Chantrapornchai, C.

    In this paper, we propose an application tool to help analyze the usage of a mobile phone for a typical user. From the past usage, the tool can analyze the promotion that is suitable for the user, which may save the total expense. The application consists of both client and server sides. On the server side, the information for each promotion package of a phone operator is stored, as well as the usage database for each client. The client side is a user interface for both phone operators and users to enter their information. The analysis engine is based on KNN, ANN, decision tree and Naïve Bayes models. In comparison, KNN and decision tree outperform the others.
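A minimal k-nearest-neighbour sketch of the recommendation idea: classify a user's monthly usage vector (call minutes, SMS count, data MB) by the promotion chosen for the most similar past users. The history data and package names below are entirely made up; the paper's actual features and models are not specified here.

```python
# Toy KNN promotion recommender; all data is illustrative.
from collections import Counter
import math

history = [  # (usage vector, best package) pairs, purely invented
    ((600, 20, 100), "talk-heavy"),
    ((550, 30, 200), "talk-heavy"),
    ((50, 400, 150), "sms-heavy"),
    ((40, 350, 100), "sms-heavy"),
    ((80, 30, 4000), "data-heavy"),
    ((60, 10, 3500), "data-heavy"),
]

def recommend(usage, k=3):
    """Majority vote among the k nearest past usage vectors."""
    nearest = sorted(history, key=lambda rec: math.dist(rec[0], usage))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

print(recommend((70, 25, 3800)))  # a data-dominated usage profile
```

A production version would normalize each feature before computing distances, since data MB dwarfs the other axes.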

  12. Highway rock slope management program.

    DOT National Transportation Integrated Search

    2001-06-30

    Development of a comprehensive geotechnical database for risk management of highway rock slope problems is described. Computer software selected to program the client/server application in windows environment, components and structure of the geote...

  13. WebEAV: automatic metadata-driven generation of web interfaces to entity-attribute-value databases.

    PubMed

    Nadkarni, P M; Brandt, C M; Marenco, L

    2000-01-01

    The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples.

  14. A radiology department intranet: development and applications.

    PubMed

    Willing, S J; Berland, L L

    1999-01-01

    An intranet is a "private Internet" that uses the protocols of the World Wide Web to share information resources within a company or with the company's business partners and clients. The hardware requirements for an intranet begin with a dedicated Web server permanently connected to the departmental network. The heart of a Web server is the hypertext transfer protocol (HTTP) service, which receives a page request from a client's browser and transmits the page back to the client. Although knowledge of hypertext markup language (HTML) is not essential for authoring a Web page, a working familiarity with HTML is useful, as is knowledge of programming and database management. Security can be ensured by using scripts to write information in hidden fields or by means of "cookies." Interfacing databases and database management systems with the Web server and conforming the user interface to HTML syntax can be achieved by means of the common gateway interface (CGI), Active Server Pages (ASP), or other methods. An intranet in a radiology department could include the following types of content: on-call schedules, work schedules and a calendar, a personnel directory, resident resources, memorandums and discussion groups, software for a radiology information system, and databases.
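The abstract mentions ensuring security by writing information into hidden fields or cookies; a common hardening step, sketched below under stated assumptions, is to sign such values so clients cannot tamper with them. The secret and cookie format are illustrative, not from the article.

```python
# Signing and verifying a cookie value with HMAC-SHA256 (stdlib only).
# SECRET and the "value|mac" layout are assumptions for this sketch.
import hmac, hashlib

SECRET = b"departmental-secret"  # assumed server-side key

def sign(value: str) -> str:
    mac = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return f"{value}|{mac}"

def verify(cookie: str):
    """Return the original value if the signature checks out, else None."""
    value, _, mac = cookie.rpartition("|")
    good = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return value if hmac.compare_digest(mac, good) else None

cookie = sign("user=jsmith")
```

`hmac.compare_digest` is used instead of `==` to avoid timing side channels during verification.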

  15. The Common Gateway Interface (CGI) for Enhancing Access to Database Servers via the World Wide Web (WWW).

    ERIC Educational Resources Information Center

    Machovec, George S., Ed.

    1995-01-01

    Explains the Common Gateway Interface (CGI) protocol as a set of rules for passing information from a Web server to an external program such as a database search engine. Topics include advantages over traditional client/server solutions, limitations, sample library applications, and sources of information from the Internet. (LRW)
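The CGI protocol summarized above can be sketched concretely: the web server passes the query string to the external program in the `QUERY_STRING` environment variable, and the program writes an HTTP header block plus a body to standard output. The search logic here is a stand-in for a real database engine.

```python
# Minimal CGI-style request handler: parse QUERY_STRING, emit headers + body.
from urllib.parse import parse_qs

def handle_request(environ):
    """Build a complete CGI response string from a CGI-style environment."""
    params = parse_qs(environ.get("QUERY_STRING", ""))
    term = params.get("q", [""])[0]
    body = f"<html><body>Results for: {term}</body></html>"
    # CGI requires a header block, a blank line, then the body.
    return "Content-Type: text/html\r\n\r\n" + body

response = handle_request({"QUERY_STRING": "q=client+server&limit=10"})
```

In a real deployment the dictionary would be `os.environ`, populated by the web server before it launches the gateway program.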

  16. 47 CFR 15.703 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... database or spectrum sensing. (b) Client device. A TVBD operating in client mode. (c) Client mode. An... TVBD is able to select a channel itself based on a list provided by the database and initiate a network... does not require use of a geo-location capability or access to the TV bands database and requires...

  17. Development of a web-based video management and application processing system

    NASA Astrophysics Data System (ADS)

    Chan, Shermann S.; Wu, Yi; Li, Qing; Zhuang, Yueting

    2001-07-01

    How to facilitate efficient video manipulation and access in a web-based environment is becoming a popular trend for video applications. In this paper, we present a web-oriented video management and application processing system, based on our previous work on multimedia databases and content-based retrieval. In particular, we extend the VideoMAP architecture with specific web-oriented mechanisms, which include: (1) Concurrency control facilities for the editing of video data among different types of users, such as Video Administrator, Video Producer, Video Editor, and Video Query Client; different users are assigned various priority levels for different operations on the database. (2) A versatile video retrieval mechanism which employs a hybrid approach by integrating a query-based (database) mechanism with content-based retrieval (CBR) functions; its specific language (CAROL/ST with CBR) supports spatio-temporal semantics of video objects, and also offers an improved mechanism to describe visual content of videos by a content-based analysis method. (3) A query profiling database which records the 'histories' of various clients' query activities; such profiles can be used to provide the default query template when a similar query is encountered by the same kind of users. An experimental prototype system is being developed based on the existing VideoMAP prototype system, using Java and VC++ on the PC platform.

  18. Asynchronous data change notification between database server and accelerator controls system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, W.; Morris, J.; Nemesure, S.

    2011-10-10

    Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMSs which support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work. This makes the setup of DCN between a database server and interested clients tedious and time consuming. In accelerator control systems, there are many well established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. Asynchronous data change notification (ADCN) between database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMS systems, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
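A compact sketch of the trigger-plus-reflection-server pattern, with SQLite standing in for the production DBMS and a plain callback list standing in for a CDEV/EPICS/ADO-style server; all table and frame names are invented.

```python
# ADCN sketch: a trigger records changes in a notification table; a
# reflection-server loop drains it and fans each change out to subscribers.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE settings(name TEXT PRIMARY KEY, value REAL);
    CREATE TABLE pending_dcn(name TEXT, value REAL);
    CREATE TRIGGER notify_change AFTER UPDATE ON settings
    BEGIN
        INSERT INTO pending_dcn VALUES (NEW.name, NEW.value);
    END;
""")

subscribers = []          # client callbacks, as a SET/GET-style server would hold
received = []
subscribers.append(lambda name, value: received.append((name, value)))

db.execute("INSERT INTO settings VALUES ('magnet_current', 1.0)")
db.execute("UPDATE settings SET value = 2.5 WHERE name = 'magnet_current'")

# Reflection server: drain the notification table and notify every client.
for name, value in db.execute("SELECT name, value FROM pending_dcn"):
    for notify in subscribers:
        notify(name, value)
db.execute("DELETE FROM pending_dcn")
```

The paper's approach pushes notifications asynchronously rather than polling, but the division of labor (trigger writes, server process distributes) is the same.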

  19. Development of a statewide landslide inventory program.

    DOT National Transportation Integrated Search

    2003-02-01

    Development of a comprehensive geotechnical database for risk management of highway landslide problems is described. Computer software selected to program the client/server application in a data window, components and structure of the geotechnical da...

  20. CyBy(2): a structure-based data management tool for chemical and biological data.

    PubMed

    Höck, Stefan; Riedl, Rainer

    2012-01-01

    We report the development of a powerful data management tool for chemical and biological data: CyBy(2). CyBy(2) is a structure-based information management tool used to store and visualize structural data alongside additional information such as project assignment, physical information, spectroscopic data, biological activity, functional data and synthetic procedures. The application consists of a database, an application server, used to query and update the database, and a client application with a rich graphical user interface (GUI) used to interact with the server.

  1. KeyWare: an open wireless distributed computing environment

    NASA Astrophysics Data System (ADS)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple-client multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  2. U.S. Security-Related Agreements in Force Since 1955: Introducing a New Database

    DTIC Science & Technology

    2014-01-01

    PAF, a division of the RAND Corporation, is the U.S. Air Force's federally funded research and development center for studies and analyses. The report introduces a new database of U.S. security-related agreements in force since 1955 and discusses additional applications of the treaty and agreement database.

  3. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Actual frame transforms are updated by their owners. FM updates site and saved frames for the surface tree. As the rover drives to a new area, a new site frame with an incremented site index can be created. Several clients, including ARM and RSM (Remote Sensing Mast), update the related rover frames that they own. Through the onboard centralized FM frame tree database, client modules can query transforms between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of the frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
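The frame-tree query idea can be illustrated with a toy model: each frame stores a transform to its parent (simplified here to 2-D translations rather than full 6-DOF transforms), and the transform between any two frames is composed along the tree paths to the root. The frame names and offsets below are illustrative, not MSL's actual tree.

```python
# Toy frame tree: parent links plus per-frame translation to the parent.
parent = {"rover": "site3", "arm": "rover", "rsm_camera": "rover", "site3": None}
offset = {"rover": (5.0, 2.0), "arm": (1.0, 0.0),
          "rsm_camera": (0.0, 1.5), "site3": (0.0, 0.0)}

def to_root(frame):
    """Cumulative translation from `frame` up to the tree root."""
    x = y = 0.0
    while frame is not None:
        dx, dy = offset[frame]
        x, y = x + dx, y + dy
        frame = parent[frame]
    return x, y

def transform(src, dst):
    """Translation taking coordinates expressed in `src` into `dst`."""
    sx, sy = to_root(src)
    dx, dy = to_root(dst)
    return sx - dx, sy - dy
```

With one centralized tree, a client such as a camera-pointing module queries `transform("arm", "rsm_camera")` instead of hand-computing the chain of offsets.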

  4. Lean Middleware

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David G.; Ashish, Naveen

    2005-01-01

    This paper describes an approach to achieving data integration across multiple sources in an enterprise, in a manner that is cost-efficient and economically scalable. We present an approach that does not rely on major investment in structured, heavy-weight database systems for data storage or heavy-weight middleware responsible for integrated access. The approach is centered around pushing any required data structure and semantics functionality (schema) to application clients, as well as pushing integration specification and functionality to clients, where integration can be performed on-the-fly.

  5. WebEAV

    PubMed Central

    Nadkarni, Prakash M.; Brandt, Cynthia M.; Marenco, Luis

    2000-01-01

    The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples. PMID:10887163

  6. Validation of the European Prototype for Integrated Care at Municipal Level in Savona: Updating and Maintenance

    DTIC Science & Technology

    2001-10-25

    within one of the programmes sponsored by the European Commission. The system mainly consists of a shared care database in which each community facility, or group of facilities, is supported by a local area network (LAN). Each of these LANs is connected over... functions. The software is layered, so that the client application is not affected by how the servers are implemented or which database system they use

  7. Web data mining

    NASA Astrophysics Data System (ADS)

    Wibonele, Kasanda J.; Zhang, Yanqing

    2002-03-01

    A web data mining system using granular computing and ASP programming is proposed. This is a web-based application which allows web users to submit survey data for many different companies. The survey is a collection of questions that will help these companies develop and improve their business and customer service with their clients by analyzing survey data. This web application allows users to submit data from anywhere. All the survey data is collected into a database for further analysis. An administrator of this web application can log in to the system and view all the data submitted. The web application resides on a web server, and the database resides on the MS SQL server.

  8. Management system for the SND experiments

    NASA Astrophysics Data System (ADS)

    Pugachev, K.; Korol, A.

    2017-09-01

    A new management system for the SND detector experiments (at the VEPP-2000 collider in Novosibirsk) has been developed. We describe here the interaction between a user and the SND databases. These databases contain experiment configuration, conditions and metadata. The new system is designed in a client-server architecture. It has several logical layers corresponding to the users' roles. A new template engine has been created. A web application is implemented using the Node.js framework. At present the application provides: showing and editing configuration; showing experiment metadata and the experiment conditions data index; and showing the SND log (prototype).

  9. Enhancing Clients' Communication Regarding Goals for Using Psychiatric Medications.

    PubMed

    Deegan, Patricia E; Carpenter-Song, Elizabeth; Drake, Robert E; Naslund, John A; Luciano, Alison; Hutchison, Shari L

    2017-08-01

    Discordance between psychiatric care providers' and clients' goals for medication treatment is prevalent and is a barrier to person-centered care. Power statements-short self-advocacy statements prepared by clients in response to a two-part template-offer a novel approach to help clients clarify and communicate their personal goals for using psychiatric medications. This study described the power statement method and examined a sample of power statements to understand clients' goals for medication treatment. More than 17,000 adults with serious mental illness at 69 public mental health clinics had the option to develop power statements by using a Web application located in the clinic waiting areas. A database query determined the percentage of clients who entered power statements into the Web application. The authors examined textual data from a random sample of 300 power statements by using content analysis. Nearly 14,000 (79%) clients developed power statements. Of the 277 statements in the sample deemed appropriate for content analysis, 272 statements had responses to the first part of the template and 230 had responses to the second part. Clients wanted psychiatric medications to help control symptoms in the service of improving functioning. Common goals for taking psychiatric medications (N=230 statements) were to enhance relationships (51%), well-being (32%), self-sufficiency (23%), employment (19%), hobbies (15%), and self-improvement (10%). People with serious mental illness typically viewed medications as a means to pursue meaningful life goals. Power statements appear to be a simple and scalable technique to enhance clients' communication of their goals for psychiatric medication treatment.

  10. Towards the Interoperability of Web, Database, and Mass Storage Technologies for Petabyte Archives

    NASA Technical Reports Server (NTRS)

    Moore, Reagan; Marciano, Richard; Wan, Michael; Sherwin, Tom; Frost, Richard

    1996-01-01

    At the San Diego Supercomputer Center, a massive data analysis system (MDAS) is being developed to support data-intensive applications that manipulate terabyte-sized data sets. The objective is to support scientific application access to data whether it is located at a Web site, stored as an object in a database, and/or stored in an archival storage system. We are developing a suite of demonstration programs which illustrate how Web, database (DBMS), and archival storage (mass storage) technologies can be integrated. An application presentation interface is being designed that integrates data access to all of these sources. We have developed a data movement interface between the Illustra object-relational database and the NSL UniTree archival storage system running in a production mode at the San Diego Supercomputer Center. With this interface, an Illustra client can transparently access data on UniTree under the control of the Illustra DBMS server. The current implementation is based on the creation of a new DBMS storage manager class, and a set of library functions that allow the manipulation and migration of data stored as Illustra 'large objects'. We have extended this interface to allow a Web client application to control data movement between its local disk, the Web server, the DBMS Illustra server, and the UniTree mass storage environment. This paper describes some of the current approaches that successfully integrate these technologies. This framework is measured against a representative sample of environmental data extracted from the San Diego Bay Environmental Data Repository. Practical lessons are drawn and critical research areas are highlighted.

  11. Family Expense Manager Application in Android

    NASA Astrophysics Data System (ADS)

    Rajaprabha, M. N.

    2017-11-01

    FAMILY EXPENSES MANAGER is an android application that monitors your own costs, family costs and incidental costs, like a modern expenses day book in your mobile. This application helps you to track your daily costs, settlement details, a general summary, detailed reports and occasional cost details. All of the data is stored in a database and can be retrieved by the client and their relatives.

  12. Computing and Communications Infrastructure for Network-Centric Warfare: Exploiting COTS, Assuring Performance

    DTIC Science & Technology

    2004-06-01

    remote databases, has seen little vendor acceptance. Each database (Oracle, DB2, MySQL, etc.) has its own client-server protocol. Therefore each...existing standards – SQL, X.500/LDAP, FTP, etc. • View information dissemination as selective replication – State-oriented vs. message-oriented...allowing the application to start. The resource management system would serve as a broker to the resources, making sure that resources are not

  13. Incorporating client-server database architecture and graphical user interface into outpatient medical records.

    PubMed Central

    Fiacco, P. A.; Rice, W. H.

    1991-01-01

    Computerized medical record systems require structured database architectures for information processing. However, the data must be able to be transferred across heterogeneous platforms and software systems. Client-server architecture allows for distributive processing of information among networked computers and provides the flexibility needed to link diverse systems together effectively. We have incorporated this client-server model with a graphical user interface into an outpatient medical record system, known as SuperChart, for the Department of Family Medicine at SUNY Health Science Center at Syracuse. SuperChart was developed using SuperCard and Oracle. SuperCard uses modern object-oriented programming to support a hypermedia environment. Oracle is a powerful relational database management system that incorporates a client-server architecture. This provides both a distributed database and distributed processing, which improves performance. PMID:1807732

  14. Recent advancements on the development of web-based applications for the implementation of seismic analysis and surveillance systems

    NASA Astrophysics Data System (ADS)

    Friberg, P. A.; Luis, R. S.; Quintiliani, M.; Lisowski, S.; Hunter, S.

    2014-12-01

    Recently, a novel set of modules has been included in the Open Source Earthworm seismic data processing system, supporting the use of web applications. These include the Mole sub-system, for storing relevant event data in a MySQL database (see M. Quintiliani and S. Pintore, SRL, 2013), and an embedded web server, Moleserv, for serving such data to web clients in QuakeML format. These modules have enabled, for the first time using Earthworm, the use of web applications for seismic data processing. These can greatly simplify the operation and maintenance of seismic data processing centers by having one or more servers provide the relevant data, as well as the data processing applications themselves, to client machines running arbitrary operating systems. Web applications with secure online web access allow operators to work anywhere, without the often cumbersome and bandwidth-hungry use of secure shell or virtual private networks. Furthermore, web applications can seamlessly access third-party data repositories to acquire additional information, such as maps. Finally, the usage of HTML email has brought the possibility of specialized web applications to be used in email clients. This is the case of EWHTMLEmail, which produces event notification emails that are in fact simple web applications for plotting relevant seismic data. Providing web services as part of Earthworm has enabled a number of other tools as well. One is ISTI's EZ Earthworm, a web-based command and control system for an otherwise command-line-driven system; another is a waveform web service. The waveform web service serves Earthworm data to additional web clients for plotting, picking, and other web-based processing tools. The current Earthworm waveform web service hosts an advanced plotting capability for providing views of event-based waveforms from a Mole database served by Moleserv. The current trend towards the usage of cloud services supported by web applications is driving improvements in JavaScript, CSS and HTML, as well as faster and more efficient web browsers, including mobile. It is foreseeable that in the near future web applications will be as powerful and efficient as native applications. Hence the work described here has been a first step towards bringing the Open Source Earthworm seismic data processing system to this new paradigm.
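Since the abstract centers on serving event data to web clients in QuakeML (an XML format), a client-side sketch helps: the fragment below is hand-written and heavily simplified (real QuakeML documents carry namespaces and many more elements), but it shows how a scripted client could pull magnitudes out of a served document.

```python
# Parse a simplified QuakeML-like fragment and extract magnitude values.
import xml.etree.ElementTree as ET

quakeml = """<quakeml><eventParameters>
  <event><magnitude><mag><value>3.2</value></mag></magnitude></event>
  <event><magnitude><mag><value>4.7</value></mag></magnitude></event>
</eventParameters></quakeml>"""

root = ET.fromstring(quakeml)
mags = [float(v.text) for v in root.iter("value")]
```

A browser client would do the equivalent with an XML or JSON parser after an AJAX request to the embedded web server.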

  15. Federal Emergency Management Information System (FEMIS) system administration guide, version 1.4.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arp, J.A.; Burnett, R.A.; Carter, R.J.

    The Federal Emergency Management Information System (FEMIS) is an emergency management planning and response tool that was developed by the Pacific Northwest National Laboratory (PNNL) under the direction of the US Army Chemical Biological Defense Command. The FEMIS System Administration Guide provides information necessary for the system administrator to maintain the FEMIS system. The FEMIS system is designed for a single Chemical Stockpile Emergency Preparedness Program (CSEPP) site that has multiple Emergency Operations Centers (EOCs). Each EOC has personal computers (PCs) that emergency planners and operations personnel use to do their jobs. These PCs are connected via a local area network (LAN) to servers that provide EOC-wide services. Each EOC is interconnected to other EOCs via a Wide Area Network (WAN). Thus, FEMIS is an integrated software product that resides on a client/server computer architecture. The main body of FEMIS software, referred to as the FEMIS Application Software, resides on the PC client(s) and is directly accessible to emergency management personnel. The remainder of the FEMIS software, referred to as the FEMIS Support Software, resides on the UNIX server. The Support Software provides the communication, data distribution, and notification functionality necessary to operate FEMIS in a networked, client/server environment. The UNIX server provides Oracle relational database management system (RDBMS) services, ARC/INFO GIS (optional) capabilities, and basic file management services. PNNL-developed utilities that reside on the server include the Notification Service, the Command Service that executes the evacuation model, and AutoRecovery. To operate FEMIS, the Application Software must have access to a site-specific FEMIS emergency management database. Data that pertains to an individual EOC's jurisdiction is stored on the EOC's local server. Information that needs to be accessible to all EOCs is automatically distributed by the FEMIS database to the other EOCs at the site.

  16. MOOsburg: Multi-User Domain Support for a Community Network.

    ERIC Educational Resources Information Center

    Carroll, John M.; Rosson, Mary Beth; Isenhour, Philip L.; Van Metre, Christina; Schafer, Wendy A.; Ganoe, Craig H.

    2001-01-01

    Explains MOOsburg, a community-oriented MOO that models the geography of the town of Blacksburg, Virginia and is designed to be used by local residents. Highlights include the software architecture; client-server communication; spatial database; user interface; interaction; map-based navigation; application development; and future plans. (LRW)

  17. Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App

    NASA Astrophysics Data System (ADS)

    Nurnawati, E. K.; Ermawati, E.

    2018-02-01

    An integration database is a database which acts as the data store for multiple applications and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all its client applications into account. The benefit of such a schema is that sharing data among applications does not require an extra layer of integration services on the applications. Any changes to data made in a single application are made available to all applications at the time of database commit, thus keeping the applications' data use better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile device platform, based on the smart city system. The resulting database can be used by various applications, whether together or separately. The design and development of the database emphasize the flexibility, security, and completeness of attributes that can be used together by the various applications to be built. The method used in this study is to choose the appropriate logical database structure (patterns of data), build the relational database models (database design), test the resulting design with some prototype apps, and analyze system performance with test data. The integrated database can be utilized by both the admin and the user in an integral and comprehensive platform. This system can help the admin, manager, and operator manage the application easily and efficiently. This Android-based app is built on a dynamic client-server model where data is extracted from an external MySQL database, so if data changes in the database, the data in the Android application also changes. The app assists users in searching for information related to Yogyakarta (as a smart city), especially culture, government, hotels, and transportation.
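The integration-database idea can be sketched in a few lines: several client apps share one schema, so a change committed by one app is immediately visible to the others with no integration layer in between. SQLite stands in for the paper's MySQL back end, and the table, column, and row values are invented for illustration.

```python
# One shared schema serving multiple "apps"; a commit by one is seen by all.
import sqlite3

shared = sqlite3.connect(":memory:")
shared.execute("""CREATE TABLE places(
    id INTEGER PRIMARY KEY, category TEXT, name TEXT, district TEXT)""")

# The "hotel app" writes a row...
shared.execute("INSERT INTO places(category, name, district) "
               "VALUES ('hotel', 'Hotel Melia', 'Sleman')")
shared.commit()

# ...and the "tourism app" reads it at once from the same schema.
rows = shared.execute(
    "SELECT name FROM places WHERE category = 'hotel'").fetchall()
```

The trade-off the abstract notes still applies: the shared schema must account for every client application up front.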

  18. NMRPro: an integrated web component for interactive processing and visualization of NMR spectra.

    PubMed

    Mohamed, Ahmed; Nguyen, Canh Hao; Mamitsuka, Hiroshi

    2016-07-01

    The popularity of using NMR spectroscopy in metabolomics and natural products has driven the development of an array of NMR spectral analysis tools and databases. In particular, web applications have become widely used because they are platform-independent and easy to extend through reusable web components. Currently available web applications provide the analysis of NMR spectra. However, they still lack the necessary processing and interactive visualization functionalities. To overcome these limitations, we present NMRPro, a web component that can be easily incorporated into current web applications, enabling easy-to-use online interactive processing and visualization. NMRPro integrates server-side processing with client-side interactive visualization through three parts: a Python package to efficiently process large NMR datasets on the server side, a Django app managing server-client interaction, and SpecdrawJS for client-side interactive visualization. Demo and installation instructions are available at http://mamitsukalab.org/tools/nmrpro/. Contact: mohamed@kuicr.kyoto-u.ac.jp. Supplementary data are available at Bioinformatics online.

  19. CheD: chemical database compilation tool, Internet server, and client for SQL servers.

    PubMed

    Trepalin, S V; Yarkov, A V

    2001-01-01

    An efficient program for the storage, retrieval, and processing of chemical information, which runs on a personal computer, is presented. The program can work either as a stand-alone application or in conjunction with a specifically written Web server application or with standard SQL servers, e.g., Oracle, InterBase, and MS SQL. New types of data fields are introduced, e.g., arrays for spectral information storage, HTML and database links, and user-defined functions. CheD has an open architecture; thus, custom data types, controls, and services may be added. A WWW server application for chemical data retrieval features an easy and user-friendly installation on Windows NT or 95 platforms.

  20. Web-based access to near real-time and archived high-density time-series data: cyber infrastructure challenges & developments in the open-source Waveform Server

    NASA Astrophysics Data System (ADS)

    Reyes, J. C.; Vernon, F. L.; Newman, R. L.; Steidl, J. H.

    2010-12-01

    The Waveform Server is an interactive web-based interface to multi-station, multi-sensor, and multi-channel high-density time-series data stored in Center for Seismic Studies (CSS) 3.0 schema relational databases (Newman et al., 2009). In the last twelve months, based on expanded specifications and current user feedback, both the server-side infrastructure and the client-side interface have been extensively rewritten. The Python Twisted server-side code base has been fundamentally modified and now presents waveform data stored in cluster-based databases using a multi-threaded architecture, in addition to supporting the pre-existing single-database model. This allows interactive web-based access to high-density (broadband @ 40 Hz to strong motion @ 200 Hz) waveform data that can span multiple years, the common lifetime of broadband seismic networks. The client-side interface expands on its use of simple JSON-based AJAX queries and now incorporates a variety of user interface (UI) improvements, including standardized calendars for defining time ranges, on-the-fly data calibration to display SI-unit data, and increased rendering speed. This presentation will outline the various cyber infrastructure challenges we have faced while developing this application, the use cases currently in existence, and the limitations of web-based application development.
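    The "on-the-fly data calibration" step can be illustrated with a minimal sketch: raw digitizer counts are scaled by a per-channel calibration factor to obtain SI units. The factor used below (1.6 nm per count) is invented for illustration, not a real instrument value.

```python
# Illustrative sketch of on-the-fly calibration: raw counts from a
# digitizer are converted to displacement in metres using a
# per-channel calibration factor (here a made-up value).
def calibrate(counts, calib_nm_per_count):
    """Convert raw integer counts to displacement in metres."""
    return [c * calib_nm_per_count * 1e-9 for c in counts]

raw = [100, -250, 400]      # raw digitizer counts
si = calibrate(raw, 1.6)    # hypothetical 1.6 nm per count
```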

  1. Building I.S. Professionals through a Real-World Client Project in a Database Application Development Course

    ERIC Educational Resources Information Center

    Podeschi, R. J.

    2016-01-01

    Information systems curricula are increasingly using active learning methodologies to help students learn "through" technology rather than just "about" technology. While one way to achieve this is through the assignment of semester-long projects, previous research suggests that real-world projects provide more meaningful…

  2. Dcs Data Viewer, an Application that Accesses ATLAS DCS Historical Data

    NASA Astrophysics Data System (ADS)

    Tsarouchas, C.; Schlenker, S.; Dimitrov, G.; Jahn, G.

    2014-06-01

    The ATLAS experiment at CERN is one of the four Large Hadron Collider experiments. The Detector Control System (DCS) of ATLAS is responsible for the supervision of the detector equipment, the reading of operational parameters, the propagation of alarms, and the archiving of important operational data in a relational database (DB). DCS Data Viewer (DDV) is an application that provides access to the ATLAS DCS historical data through a web interface. Its design is structured using a client-server architecture. The Python server connects to the DB and fetches the data using optimized SQL requests. It communicates with the outside world by accepting HTTP requests, and it can also be used standalone. The client is an AJAX (Asynchronous JavaScript and XML) interactive web application developed under the Google Web Toolkit (GWT) framework. Its web interface is user-friendly, platform- and browser-independent. The selection of metadata is done via a column-tree view or with a powerful search engine. The final visualization of the data is done using Java applets or JavaScript applications as plugins. The default output is a value-over-time chart, but other types of output, such as tables, ASCII, or ROOT files, are supported too. Excessive access or malicious use of the database is prevented by a dedicated protection mechanism, allowing the tool to be exposed to hundreds of inexperienced users. The current configuration of the client and of the outputs can be saved in an XML file. Protection against web security attacks is foreseen, and authentication constraints have been taken into account, allowing the exposure of the tool to hundreds of users worldwide. Due to its flexible interface and its generic and modular approach, DDV could easily be used for other experiment control systems.

  3. Registered File Support for Critical Operations Files at (Space Infrared Telescope Facility) SIRTF

    NASA Technical Reports Server (NTRS)

    Turek, G.; Handley, Tom; Jacobson, J.; Rector, J.

    2001-01-01

    The SIRTF Science Center's (SSC) Science Operations System (SOS) has to contend with nearly one hundred critical operations files via comprehensive file management services. The management is accomplished via the registered file system (otherwise known as TFS) which manages these files in a registered file repository composed of a virtual file system accessible via a TFS server and a file registration database. The TFS server provides controlled, reliable, and secure file transfer and storage by registering all file transactions and meta-data in the file registration database. An API is provided for application programs to communicate with TFS servers and the repository. A command line client implementing this API has been developed as a client tool. This paper describes the architecture, current implementation, but more importantly, the evolution of these services based on evolving community use cases and emerging information system technology.
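    A registered file repository of the kind described above can be sketched as follows: each file transaction records a checksum and timestamp in a registration database, and later retrievals can be verified against it. The schema and function names are assumptions for illustration, not the actual TFS API.

```python
import hashlib
import sqlite3
import time

# Minimal sketch of a "registered file repository": every stored file
# gets a row of metadata (name, checksum, timestamp) in a registration
# database. All names here are illustrative assumptions.
reg = sqlite3.connect(":memory:")
reg.execute("""CREATE TABLE registration (
    filename TEXT PRIMARY KEY,
    sha256 TEXT NOT NULL,
    registered_at REAL NOT NULL)""")

def register_file(conn, filename, payload):
    # Record the transaction: checksum plus registration time.
    digest = hashlib.sha256(payload).hexdigest()
    conn.execute("INSERT INTO registration VALUES (?, ?, ?)",
                 (filename, digest, time.time()))
    conn.commit()
    return digest

def verify_file(conn, filename, payload):
    # A retrieved copy is valid only if its checksum matches the
    # registered one.
    row = conn.execute(
        "SELECT sha256 FROM registration WHERE filename = ?",
        (filename,)).fetchone()
    return row is not None and row[0] == hashlib.sha256(payload).hexdigest()

register_file(reg, "ops/sequence_001.cmd", b"SLEW 12.3 45.6")
print(verify_file(reg, "ops/sequence_001.cmd", b"SLEW 12.3 45.6"))  # True
```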

  4. Open source database of images DEIMOS: extension for large-scale subjective image quality assessment

    NASA Astrophysics Data System (ADS)

    Vítek, Stanislav

    2014-09-01

    DEIMOS (Database of Images: Open Source) is an open-source database of images and video sequences for testing, verification, and comparison of various image and/or video processing techniques such as compression, reconstruction, and enhancement. This paper deals with an extension of the database that allows performing large-scale web-based subjective image quality assessment. The extension implements both an administrative and a client interface. The proposed system is aimed mainly at mobile communication devices, taking advantage of HTML5 technology; this means that participants don't need to install any application, and assessment can be performed using a web browser. The assessment campaign administrator can select images from the large database and then apply rules defined by various test procedure recommendations. The standard test procedures may be fully customized and saved as templates. Alternatively, the administrator can define a custom test using images from the pool and other components, such as evaluation forms and ongoing questionnaires. The image sequence is delivered to the online client, e.g. a smartphone or tablet, as a fully automated assessment sequence, or the viewer can decide on the timing of the assessment if required. Environmental data and viewing conditions (e.g. illumination, vibrations, GPS coordinates) may be collected and subsequently analyzed.
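    The scoring step of such a subjective assessment campaign typically reduces the collected ratings to a mean opinion score (MOS) per image; a minimal sketch with invented ratings (MOS is the standard metric in this area, though the abstract does not name the exact statistic used):

```python
# Reduce per-image ratings on a five-point scale to a mean opinion
# score (MOS). Image names and rating values are invented.
def mean_opinion_score(ratings):
    if not ratings:
        raise ValueError("no ratings collected")
    return sum(ratings) / len(ratings)

ratings_per_image = {
    "img_001": [5, 4, 4, 5],
    "img_002": [2, 3, 2, 2],
}
mos = {name: mean_opinion_score(r) for name, r in ratings_per_image.items()}
print(mos)  # {'img_001': 4.5, 'img_002': 2.25}
```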

  5. Web application for detailed real-time database transaction monitoring for CMS condition data

    NASA Astrophysics Data System (ADS)

    de Gruttola, Michele; Di Guida, Salvatore; Innocente, Vincenzo; Pierro, Antonio

    2012-12-01

    In the upcoming LHC era, databases have become an essential part of the experiments collecting data from the LHC, in order to safely store, and consistently retrieve, the wide amount of data produced by different sources. In the CMS experiment at CERN, all this information is stored in ORACLE databases hosted on several servers, both inside and outside the CERN network. In this scenario, the task of monitoring the different databases is a crucial database administration issue, since different information may be required depending on users' tasks, such as data transfer, inspection, planning, and security. We present here a web application based on a Python web framework and Python modules for data mining. To customize the GUI, we record traces of user interactions, which are used to build use-case models. In addition, the application detects errors in database transactions (for example, it identifies mistakes made by users, application failures, unexpected network shutdowns, or Structured Query Language (SQL) statement errors) and provides warning messages from the different users' perspectives. Finally, in order to fulfill the requirements of the CMS experiment community and to keep pace with new developments in web client tools, our application was further developed and new features were deployed.
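    The error-detection idea above can be sketched, under heavy simplification, as a classifier over transaction log entries so that per-category warnings can be raised. The log lines and categories below are invented for illustration; they are not the application's actual rules.

```python
# Toy classifier over transaction log entries. Categories and the
# substring rules are illustrative assumptions only.
def classify(entry):
    msg = entry.lower()
    if "ora-" in msg or "sql" in msg:
        return "sql_error"       # e.g. a failed SQL statement
    if "network" in msg or "timeout" in msg:
        return "network_error"   # e.g. unexpected network shutdown
    return "ok"

log = [
    "commit ok in 12 ms",
    "ORA-00942: table or view does not exist",
    "network timeout contacting server cmsdb02",
]
print([classify(e) for e in log])  # ['ok', 'sql_error', 'network_error']
```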

  6. Installation of the National Transport Code Collaboration Data Server at the ITPA International Multi-tokamak Confinement Profile Database

    NASA Astrophysics Data System (ADS)

    Roach, Colin; Carlsson, Johan; Cary, John R.; Alexander, David A.

    2002-11-01

    The National Transport Code Collaboration (NTCC) has developed an array of software, including a data client/server. The data server, which is written in C++, serves local data (in the ITER Profile Database format) as well as remote data (by accessing one or several MDS+ servers). The client, a web-invocable Java applet, provides a uniform, intuitive, user-friendly graphical interface to the data server. The uniformity of the interface relieves users from the trouble of mastering the differences between data formats and lets them focus on the essentials: plotting and viewing the data. The user runs the client by visiting a web page using any Java-capable web browser. The client is automatically downloaded and run by the browser. A reference to the data server is then retrieved via the standard Web protocol (HTTP). The communication between the client and the server is subsequently handled by the mature, industry-standard CORBA middleware. CORBA has bindings for all common languages, and many high-quality implementations are available (both open source and commercial). The NTCC data server has been installed at the ITPA International Multi-tokamak Confinement Profile Database, which is hosted by the UKAEA at Culham Science Centre. The installation of the data server is protected by an Internet firewall; to make it accessible to clients outside the firewall, some modifications of the server were required. The working version of the ITPA confinement profile database is not open to the public. Authentication of legitimate users is performed using built-in Java security features, which require a password to download the client. We present an overview of the NTCC data client/server and some details of how the CORBA firewall-traversal issues were resolved and how the user authentication is implemented.

  7. [Research and development of medical case database: a novel medical case information system integrating with biospecimen management].

    PubMed

    Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang

    2010-04-01

    To meet the need to manage medical case information and biospecimens simultaneously, we developed a novel medical case information system integrated with biospecimen management. The database, established with MS SQL Server 2000, covered basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis, and clinical treatment of the patient; physicochemical properties, inventory management, and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed with Visual C++ 6.0 and based on the client/server model, was used to implement medical case and biospecimen management. The system can perform input, browsing, querying, and summarization of cases and related biospecimen information, and can automatically synthesize case records from the database. It supports management not only of long-term follow-up of individuals, but also of grouped cases organized according to the aim of the research. The system can improve the efficiency and quality of clinical research in which biospecimens are used coordinately. It realizes synthesized and dynamic management of medical cases and biospecimens, and may be considered a new management platform.

  8. An Introduction to MAMA (Meta-Analysis of MicroArray data) System.

    PubMed

    Zhang, Zhe; Fenstermacher, David

    2005-01-01

    Analyzing microarray data across multiple experiments has been proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server side for the storage of microarray datasets collected from various resources. The client side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. The MAMA implementation will integrate several analytical methods, including meta-analysis, within an open-source framework offering other developers the flexibility to plug in additional statistical algorithms.
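    One common meta-analysis step a system like MAMA could implement is combining per-experiment p-values for a gene, for example with Stouffer's Z method. The sketch below uses invented p-values; Stouffer's method is a standard statistical technique, not necessarily the specific algorithm MAMA ships.

```python
import math
from statistics import NormalDist

# Combine one-sided p-values from several experiments with
# Stouffer's Z method. The p-values below are invented.
def stouffer_combine(p_values):
    nd = NormalDist()
    # Convert each p-value to a z-score, average, and convert back.
    z = sum(nd.inv_cdf(1.0 - p) for p in p_values) / math.sqrt(len(p_values))
    return 1.0 - nd.cdf(z)  # combined one-sided p-value

combined = stouffer_combine([0.04, 0.10, 0.03])
```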

  9. Applying Analogical Reasoning Techniques for Teaching XML Document Querying Skills in Database Classes

    ERIC Educational Resources Information Center

    Mitri, Michel

    2012-01-01

    XML has become the most ubiquitous format for exchange of data between applications running on the Internet. Most Web Services provide their information to clients in the form of XML. The ability to process complex XML documents in order to extract relevant information is becoming as important a skill for IS students to master as querying…

  10. Development, Deployment, and Cost Effectiveness of a Self-Administered Stereo Non Mydriatic Automated Retinal Camera (SNARC) Containing Automated Retinal Lesion (ARL) Detection Using Adaptive Optics

    DTIC Science & Technology

    2010-10-01

    Application Server requirements: BEA WebLogic Express 9.2 or higher; Java v5; Apache Struts v2; Hibernate v2; C3PO; SQL*Net client / JDBC. Database Server … an HTML and JavaScript browser-based front end designed for the desktop; an HTML and JavaScript browser-based front end designed for mobile smartphones; a Java-based framework utilizing Apache … Technology Requirements: the recommended technologies are as follows: Java Application, which provides the backend application …

  11. jSPyDB, an open source database-independent tool for data management

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open source, they provide interfaces only to a specific kind of database, and they are platform-dependent and very CPU- and memory-intensive. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases in order to perform data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create customized views for better data visualization. In this way, we optimize the performance of the database servers by avoiding short connections and concurrent sessions. In addition, security is enforced, since we do not give users the ability to directly execute arbitrary SQL statements.
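    The layering described above, in which only the server touches the database and hands results to the client as structured data, can be sketched with the standard library. Here sqlite3 stands in for the SQLAlchemy-backed access layer, and the function name is illustrative, not jSPyDB's actual API.

```python
import json
import sqlite3

# Server-side sketch: run a parameterized query and return the rows
# as JSON for the browser client. sqlite3 stands in for the real
# multi-database access layer; all names are illustrative.
def run_query_as_json(conn, sql, params=()):
    cur = conn.execute(sql, params)
    cols = [d[0] for d in cur.description]
    return json.dumps([dict(zip(cols, row)) for row in cur.fetchall()])

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE run (id INTEGER, status TEXT)")
conn.executemany("INSERT INTO run VALUES (?, ?)",
                 [(1, "ok"), (2, "failed")])
print(run_query_as_json(conn, "SELECT * FROM run WHERE status = ?", ("ok",)))
# [{"id": 1, "status": "ok"}]
```

    Because the client only ever receives serialized results, it never executes SQL itself, which matches the security point made in the abstract.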

  12. Development of a Personal Digital Assistant (PDA) based client/server NICU patient data and charting system.

    PubMed

    Carroll, A E; Saluja, S; Tarczy-Hornoch, P

    2001-01-01

    Personal Digital Assistants (PDAs) offer clinicians the ability to enter and manage critical information at the point of care. Although PDAs have always been designed to be intuitive and easy to use, recent advances in technology have made them even more accessible. The ability to link data on a PDA (client) to a central database (server) allows for near-unlimited potential in developing point of care applications and systems for patient data management. Although many stand-alone systems exist for PDAs, none are designed to work in an integrated client/server environment. This paper describes the design, software and hardware selection, and preliminary testing of a PDA based patient data and charting system for use in the University of Washington Neonatal Intensive Care Unit (NICU). This system will be the subject of a subsequent study to determine its impact on patient outcomes and clinician efficiency.

  13. Web servicing the biological office.

    PubMed

    Szugat, Martin; Güttler, Daniel; Fundel, Katrin; Sohler, Florian; Zimmer, Ralf

    2005-09-01

    Biologists routinely use Microsoft Office applications for standard analysis tasks. Despite ubiquitous internet resources, information needed for everyday work is often not directly and seamlessly available. Here we describe a very simple and easily extendable mechanism using Web Services to enrich standard MS Office applications with internet resources. We demonstrate its capabilities by providing a Web-based thesaurus for biological objects, which maps names to database identifiers and vice versa via an appropriate synonym list. The client application ProTag makes these features available in MS Office applications using Smart Tags and Add-Ins. http://services.bio.ifi.lmu.de/prothesaurus/

  14. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    PubMed

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

    To develop a software solution that supports management and clinical review of patient data from electronic medical record databases or claims databases for pharmacoepidemiological drug safety studies. We used open-source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. The system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases in the individual clinical context. It can therefore make an important contribution to efficient validation of outcome assessment in drug safety database studies.

  15. WMT: The CSDMS Web Modeling Tool

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can:

    - Design a model from a set of components
    - Edit component parameters
    - Save models to a web-accessible server
    - Share saved models with the community
    - Submit runs to an HPC system
    - Download simulation results

    The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse, and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API:

    - wmt-db: database of component, model, and simulation metadata and output
    - wmt-api: configure and connect components
    - wmt-exe: launch simulations on remote execution servers

    The database server provides, as JSON-encoded messages, the metadata users need to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, that contain the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server, where it is stored and from which a user can download it as a single compressed archive file.
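    The "uses and provides ports" metadata can be illustrated with a small sketch: a component whose used port is provided by another component can be coupled to it. The component and port names below are invented for illustration, not actual CSDMS metadata.

```python
# Toy component metadata in the spirit of the JSON messages described
# above. Component and port names are invented.
components = {
    "river": {"provides": ["discharge"], "uses": []},
    "delta": {"provides": [], "uses": ["discharge"]},
}

def couple(components):
    """Return (consumer, provider, port) links where a used port is
    provided by some other component."""
    links = []
    for name, meta in components.items():
        for port in meta["uses"]:
            for other, ometa in components.items():
                if other != name and port in ometa["provides"]:
                    links.append((name, other, port))
    return links

print(couple(components))  # [('delta', 'river', 'discharge')]
```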

  16. A Call for Feminist Research: A Limited Client Perspective

    ERIC Educational Resources Information Center

    Murray, Kirsten

    2006-01-01

    Feminist approaches embrace a counselor stance that is both collaborative and supportive, seeking client empowerment. On review of the feminist family and couple counseling literature of the past 20 years using several academic databases, no research was found that explored a client's experience of feminist-informed family and couple counseling. The…

  17. Content-based image retrieval on mobile devices

    NASA Astrophysics Data System (ADS)

    Ahmad, Iftikhar; Abdullah, Shafaq; Kiranyaz, Serkan; Gabbouj, Moncef

    2005-03-01

    The content-based image retrieval area holds tremendous potential for exploration and utilization, equally for researchers and industry, due to its promising results. Expeditious retrieval of desired images requires indexing of the content in large-scale databases, along with extraction of low-level features based on the content of these images. With the recent advances in wireless communication technology and the availability of multimedia-capable phones, it has become vital to enable query operations on image databases and to retrieve results based on image content. In this paper we present a content-based image retrieval system for mobile platforms, providing content-based query capability to any mobile device that supports the Java platform. The system consists of a lightweight client application running on a Java-enabled device and a server containing a servlet running inside a Java-enabled web server. The server responds to image queries using efficient native code over the selected image database. The client application, running on a mobile phone, is able to initiate a query request, which is handled by a servlet on the server to find the closest match to the queried image. The retrieved results are transmitted over the mobile network, and the images are displayed on the mobile phone. We conclude that such a system serves as a basis for content-based information retrieval on wireless devices and needs to cope with factors such as the constraints of hand-held devices and the reduced network bandwidth available in mobile environments.
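    The retrieval idea, reducing each image to a low-level feature and returning the database entry whose feature is closest to the query's, can be sketched with grey-level histograms and an L1 distance. The toy 4-level "images" below are invented; real systems use richer features (color, texture, shape).

```python
# Toy content-based retrieval: each "image" is a list of grey levels
# 0..3, reduced to a normalized histogram; the closest match
# minimizes L1 distance between histograms. Data values are invented.
def histogram(pixels, levels=4):
    h = [0] * levels
    for p in pixels:
        h[p] += 1
    total = len(pixels)
    return [c / total for c in h]

def l1(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def closest_match(query, database):
    qh = histogram(query)
    return min(database, key=lambda name: l1(qh, histogram(database[name])))

db_images = {
    "bright": [3, 3, 2, 3, 3, 2],
    "dark":   [0, 0, 1, 0, 1, 0],
}
print(closest_match([3, 2, 3, 3, 3, 2], db_images))  # bright
```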

  18. blastjs: a BLAST+ wrapper for Node.js.

    PubMed

    Page, Martin; MacLean, Dan; Schudoma, Christian

    2016-02-27

    To cope with the ever-increasing amount of sequence data generated in the field of genomics, the demand for efficient and fast database searches that drive functional and structural annotation in both large- and small-scale genome projects is on the rise. The tools of the BLAST+ suite are the most widely employed bioinformatic method for these database searches. Recent trends in bioinformatics application development show an increasing number of JavaScript apps based on modern frameworks such as Node.js. Until now, there has been no way of using database searches with the BLAST+ suite from a Node.js codebase. We developed blastjs, a Node.js library that wraps the search tools of the BLAST+ suite and thus makes it easy to add significant functionality to any Node.js-based application. blastjs is a library that allows the incorporation of BLAST+ functionality into bioinformatics applications based on JavaScript and Node.js. The library was designed to be as user-friendly as possible and therefore requires only a minimal amount of code in the client application. The library is freely available under the MIT license at https://github.com/teammaclean/blastjs.

  19. Availability of software services for a hospital information system.

    PubMed

    Sakamoto, N

    1998-03-01

    Hospital information systems (HISs) are becoming more important and covering more parts of daily hospital operations as order-entry systems become popular and electronic charts are introduced. Thus, HISs today need to be able to provide the necessary services for hospital operations 24 h a day, 365 days a year. The provision of services discussed here does not simply mean the availability of computers, in which all that matters is that the computer is functioning. It means the provision of the necessary information for hospital operations by the computer software, and we will call it the availability of software services. HISs these days are mostly client-server systems. To increase the availability of software services in these systems, it is not enough to just use system structures that are highly reliable in existing host-centred systems. Four main components support the availability of software services: network systems, client computers, server computers, and application software. In this paper, we suggest how to structure these four components to provide the minimum requested software services even if part of the system stops functioning. The network system should be double-protected in strata, using Asynchronous Transfer Mode (ATM) as its base network. Client computers should be fat clients with as much application logic as possible, and reference information that does not require frequent updates (master files, for example) should be replicated on clients. It would be best if all server computers could be double-protected. However, if that is physically impossible, one database file should be made accessible by several server computers. Still, at least the basic patient information and the latest clinical records should be double-protected physically. Application software should be tested carefully before introduction, and different versions of the application software should always be kept and managed in case the new version has problems. If a hospital information system is designed and developed with these points in mind, its availability of software services should increase greatly.
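    The suggestion that one database file be accessible by several server computers implies client-side failover: the client tries replicas in order and returns the first answer, so one server failing does not stop the service. A minimal sketch, with stand-in server functions rather than a real HIS API:

```python
# Failover sketch: try each server in order; a ConnectionError moves
# on to the next replica. Server functions here are stand-ins.
def query_with_failover(servers, patient_id):
    last_error = None
    for server in servers:
        try:
            return server(patient_id)
        except ConnectionError as exc:
            last_error = exc  # this replica is down; try the next one
    raise RuntimeError("all servers unavailable") from last_error

def dead_server(_pid):
    raise ConnectionError("host down")

def live_server(pid):
    # Invented record; a real server would query the shared database.
    return {"patient": pid, "allergies": ["penicillin"]}

record = query_with_failover([dead_server, live_server], "P-001")
print(record)  # {'patient': 'P-001', 'allergies': ['penicillin']}
```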

  20. KAGLVis - On-line 3D Visualisation of Earth-observing-satellite Data

    NASA Astrophysics Data System (ADS)

    Szuba, Marek; Ameri, Parinaz; Grabowski, Udo; Maatouki, Ahmad; Meyer, Jörg

    2015-04-01

    One of the goals of the Large-Scale Data Management and Analysis project is to provide a high-performance framework facilitating management of data acquired by Earth-observing satellites such as Envisat. On the client-facing facet of this framework, we strive to provide a visualisation and basic analysis tool that can be used by scientists with minimal to no knowledge of the underlying infrastructure. Our tool, KAGLVis, is a JavaScript client-server web application which leverages modern web technologies to provide three-dimensional visualisation of satellite observables on a wide range of client systems. It takes advantage of the WebGL API to employ locally available GPU power for 3D rendering; this approach has been demonstrated to perform well even on relatively weak hardware, such as the integrated graphics chipsets found in modern laptop computers, and with some user-interface tuning could even be usable on embedded devices such as smartphones or tablets. Data is fetched from the database back-end using a REST API and cached locally, both in memory and using HTML5 Web Storage, to minimise network use. Computations, for instance the calculation of cloud altitude from cloud-index measurements, can, depending on configuration, be performed on either the client or the server side. Keywords: satellite data, Envisat, visualisation, 3D graphics, Web application, WebGL, MEAN stack.
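    The local caching strategy, keeping fetched results in a local store (in-memory here, HTML5 Web Storage in a browser) so repeated requests skip the network, can be sketched as follows. The fetch function and dataset key are invented stand-ins for the real REST calls.

```python
# Cache-aside sketch of the client's data layer: a miss calls the
# fetch function (an HTTP GET in the real app); a hit is served
# locally. All names and values are illustrative.
class CachingFetcher:
    def __init__(self, fetch):
        self._fetch = fetch
        self._cache = {}          # stands in for HTML5 Web Storage
        self.network_calls = 0

    def get(self, key):
        if key not in self._cache:
            self.network_calls += 1
            self._cache[key] = self._fetch(key)
        return self._cache[key]

fetcher = CachingFetcher(lambda key: {"dataset": key, "points": 1024})
fetcher.get("envisat/orbit-42")
fetcher.get("envisat/orbit-42")   # second call is served from cache
print(fetcher.network_calls)      # 1
```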

  1. Flexible augmented reality architecture applied to environmental management

    NASA Astrophysics Data System (ADS)

    Correia, Nuno M. R.; Romao, Teresa; Santos, Carlos; Trabuco, Adelaide; Santos, Rossana; Romero, Luis; Danado, Jose; Dias, Eduardo; Camara, Antonio; Nobre, Edmundo

    2003-05-01

    Environmental management often requires in loco observation of the area under analysis. Augmented Reality (AR) technologies allow real-time superimposition of synthetic objects on real images, providing augmented knowledge about the surrounding world. Users of an AR system can visualize the real surrounding world together with additional data generated in real time in a contextual way. The work reported in this paper was done in the scope of the ANTS (Augmented Environments) project. ANTS is an AR project that explores the development of an augmented reality technological infrastructure for environmental management. This paper presents the architecture and the most relevant modules of ANTS. The system's architecture follows the client-server model and is based on several independent, but functionally interdependent, modules. It has a flexible design, which allows the transfer of some modules to and from the client side, according to the available processing capacity of the client device and the application's requirements. It combines several techniques to identify the user's position and orientation, allowing the system to adapt to the particular characteristics of each environment. The determination of the data associated with a certain location involves the use of both a 3D model of the location and a multimedia geo-referenced database.

  2. Netlib services and resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Browne, S.V.; Green, S.C.; Moore, K.

    1994-04-01

    The Netlib repository, maintained by the University of Tennessee and Oak Ridge National Laboratory, contains freely available software, documents, and databases of interest to the numerical, scientific computing, and other communities. This report includes both the Netlib User's Guide and the Netlib System Manager's Guide, and contains information about Netlib's databases, interfaces, and system implementation. The Netlib repository's databases include the Performance Database, the Conferences Database, and the NA-NET mail forwarding and Whitepages Databases. A variety of user interfaces enable users to access the Netlib repository in the manner most convenient and compatible with their networking capabilities. These interfaces include the Netlib email interface, the Xnetlib X Windows client, the netlibget command-line TCP/IP client, anonymous FTP, anonymous RCP, and gopher.

  3. NUREBASE: database of nuclear hormone receptors.

    PubMed

    Duarte, Jorge; Perrière, Guy; Laudet, Vincent; Robinson-Rechavi, Marc

    2002-01-01

    Nuclear hormone receptors are an abundant class of ligand-activated transcriptional regulators, found in varying numbers in all animals. Based on our experience of managing the official nomenclature of nuclear receptors, we have developed NUREBASE, a database containing protein and DNA sequences, reviewed protein alignments and phylogenies, taxonomy and annotations for all nuclear receptors. The reviewed NUREBASE is completed by NUREBASE_DAILY, automatically updated every 24 h. Both databases are organized under a client/server architecture, with a client written in Java which runs on any platform. This client, named FamFetch, integrates a graphical interface allowing selection of families, and manipulation of phylogenies and alignments. NUREBASE sequence data is also accessible through a World Wide Web server, allowing complex queries. All information on accessing and installing NUREBASE may be found at http://www.ens-lyon.fr/LBMC/laudet/nurebase.html.

  4. Searching and exploitation of distributed geospatial data sources via the Naval Research Lab's Geospatial Information Database (GIDB) Portal System

    NASA Astrophysics Data System (ADS)

    McCreedy, Frank P.; Sample, John T.; Ladd, William P.; Thomas, Michael L.; Shaw, Kevin B.

    2005-05-01

    The Naval Research Laboratory's Geospatial Information Database (GIDB™) Portal System has been extended to include an extensive geospatial search functionality. The GIDB Portal System interconnects over 600 distributed geospatial data sources via the Internet with a thick client, a thin client and a PDA client. As the GIDB Portal System has rapidly grown over the last two years (adding hundreds of geospatial sources), the obvious requirement has arisen to mine the interconnected sources more effectively in near real time. How the GIDB Search addresses this issue is the prime focus of this paper.

  5. Creation of a unified geoinformation system for monitoring social and economic development of Departamento Quindio (Colombia) on the basis of isolated data sources

    NASA Astrophysics Data System (ADS)

    Buravlev, V.; Sereshnikov, S. V.; Mayorov, A. A.; Vila, J. J.

    At each level of state and municipal management, the information resources that support administrative decision-making usually exist as a number of disparate, unconnected electronic data sources, such as databases, geoinformation projects, electronic archives of documents, etc. These sources are held by various organizations, run on different software, and are updated according to different rules. Building unified information systems on top of such isolated sources, so that any information stored in them can be browsed and analyzed in real time, helps improve the adequacy of the administrative decisions taken. The Distributed Data Service technology (TrisoftDDS), developed by Trisoft, Ltd, supports the construction of horizontal, territorially distributed heterogeneous information systems (TeRGIS). TrisoftDDS allows systems to be created, supported, and modified quickly, using already existing information complexes as data sources without disrupting their operation, and provides remote, regulated, multi-user access to different types of data sources over the Internet/intranet. Relational databases, GIS projects, and files of various types (MS Office documents, images, HTML documents, etc.) can be used as data sources in a TeRGIS. A TeRGIS is created as a three-tier client-server Internet/intranet application. Access to the information in the existing data sources is carried out by means of the distributed data service (DDS), whose nucleus is the distributed data service server (DSServer), located in the middle tier.
The TrisoftDDS technology includes the following components. The client, DSBrowser (Data Service Browser), is a client application that connects to the DSServer over the Internet/intranet and supports both selection and viewing of documents. Database tables, database queries, queries to geoinformation projects, and files of various types (MS Office documents, images, HTML files, etc.) can act as documents. For work with complex data sources, DSBrowser lets the user build queries and view and filter data. The server of the distributed data service, DSServer (Data Service Server), is a web application that provides access to the data sources and serves clients' requests for the chosen documents. The toolkit, Toolkit DDS, comprises the catalogue manager, DCMan (Data Catalog Manager), a client-server application for organizing and administering the data catalogue, and the documentor, DSDoc (Data Source Documentor), a client-server application for documenting the procedure by which a required document is produced from a data source. The documentation created by DSDoc takes the form of metadata tables, which are added to the data catalogue with the help of the catalogue manager. The logic of a territorially distributed heterogeneous information system based on DDS technology is as follows: the DSBrowser client application contacts the DSServer at a specified Internet address. In reply, the DSServer sends the client the catalogue of the system's information resources. The catalogue is an XML document that the client processes and displays as a tree structure in a dedicated window. The user looks through the list and chooses the necessary documents, and DSBrowser sends the corresponding request to the DSServer.
The DSServer, in turn, consults the metadata tables that describe the chosen document, forwards the request to the corresponding data source, and returns the result to the client application. The catalogue of data services contains the full Internet address of each document, which makes it possible to build catalogues of distributed information resources whose separate parts (documents) are located on different servers in various places on the Internet. A catalogue can thus be hosted by any Internet provider that supports the necessary software. Document lists in the catalogue are grouped into thematic blocks, allowing user-friendly navigation through the information sources of the system. The strength of the TrisoftDDS technology lies, first of all, in the organization and functionality of the distributed data service that processes requests for documents. The distributed data service hides from the external user the complex and, in most cases, unnecessary details of the structure of complex data sources and of the ways of connecting to them. Instead, the user receives aliases for connections and file directories, whose real parameters are stored in the registry of the web server hosting the DSServer. This scheme also provides broad opportunities for data protection and differentiation of access rights to the information. The work also presents the application of this technology to classifying the level of territorial social and economic development of the Quindio Departamento (Colombia). This includes the creation of thematic maps on the basis of the ESRI software products ArcView and Erdas, and offers some ways of analysing regional social and economic conditions in order to compare the optimality of decisions.
The parameters considered include: dynamics of demographic processes; education; health and nutrition; infrastructure; political and social stability; culture, social and family values; condition of the environment; political and civil institutions; income of the population; unemployment and use of labour; and poverty and inequality. The methodology allows other parameters to be included with the help of expert estimation methods and optimization theory, and a module is provided for verifying forecasts through field checks in the district.
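The catalogue exchange described above (an XML document rendered client-side as a tree of thematic blocks and documents) can be sketched as follows. The element and attribute names are assumptions for illustration; the real TrisoftDDS catalogue schema is not given in the abstract.

```python
import xml.etree.ElementTree as ET

# Hypothetical catalogue layout; the real TrisoftDDS schema is not published here.
CATALOGUE_XML = """
<catalogue>
  <block name="Demography">
    <document title="Population by municipality"
              url="http://server-a.example/ds/doc1"/>
  </block>
  <block name="Environment">
    <document title="Land use map" url="http://server-b.example/ds/doc2"/>
  </block>
</catalogue>
"""

def catalogue_tree(xml_text):
    """Parse the catalogue into {block name: [(title, url), ...]} for display."""
    root = ET.fromstring(xml_text)
    tree = {}
    for block in root.findall("block"):
        docs = [(d.get("title"), d.get("url")) for d in block.findall("document")]
        tree[block.get("name")] = docs
    return tree

tree = catalogue_tree(CATALOGUE_XML)
```

Note that each document entry carries a full URL, so the documents of one catalogue can live on different servers, exactly as the text describes.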

  6. An Application Server for Scientific Collaboration

    NASA Astrophysics Data System (ADS)

    Cary, John R.; Luetkemeyer, Kelly G.

    1998-11-01

    Tech-X Corporation has developed SciChat, an application server for scientific collaboration. Connections are made to the server through a Java client, which can be either an application or an applet served in a web page. Once connected, the client may choose to start or join a session. A session includes not only other clients but also an application. Any client can send a command to the application. This command is executed on the server and echoed to all clients. The results of the command, whether numerical or graphical, are then distributed to all of the clients; thus, multiple clients can interact collaboratively with a single application. The client is developed in Java, the server in C++, and the middleware is the Common Object Request Broker Architecture. In this system, the graphical user interface processing is on the client machine, so one does not suffer the bandwidth problems that occur when running X over the Internet. Because the server, client, and middleware are object-oriented, new types of servers and clients specialized to particular scientific applications are more easily developed.

  7. Event Driven Messaging with Role-Based Subscriptions

    NASA Technical Reports Server (NTRS)

    Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Zendejas, Silvino; Sadaqathulla, Syed

    2009-01-01

    Event Driven Messaging with Role-Based Subscriptions (EDM-RBS) is a framework integrated into the Service Management Database (SMDB) to allow for role-based and subscription-based delivery of synchronous and asynchronous messages over JMS (Java Messaging Service), SMTP (Simple Mail Transfer Protocol), or SMS (Short Messaging Service). This allows for 24/7 operation with users in all parts of the world. The software classifies messages by triggering data type, application source, owner of the data triggering the event (mission), classification, sub-classification and various other secondary classifying tags. Messages are routed to applications or users based on subscription rules using a combination of the above message attributes. The framework identifies connected users and their applications for targeted delivery of messages over JMS to the client applications the user is logged into. EDM-RBS also provides the ability to send notifications over e-mail or pager rather than relying on a live human to do so. It is implemented as an Oracle application that uses intrinsic functions of the Oracle relational database management system. It is configurable to use the Oracle AQ JMS API or an external JMS provider for messaging, and it fully integrates into the event-logging framework of SMDB.
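Routing messages by matching their attributes against subscription rules, as described above, can be sketched in a few lines. The field names and the rule format here are illustrative assumptions, not the SMDB schema.

```python
# Sketch of attribute-based message routing in the spirit of EDM-RBS.
# Field names and rule format are illustrative assumptions, not the SMDB schema.

def matches(rule, message):
    """A rule matches when every attribute it constrains equals the message's value."""
    return all(message.get(attr) == want for attr, want in rule["filter"].items())

def route(message, subscriptions):
    """Return the delivery channels of every subscription the message matches."""
    return [sub["deliver_to"] for sub in subscriptions if matches(sub, message)]

subscriptions = [
    {"filter": {"mission": "MRO", "classification": "alarm"},
     "deliver_to": ("jms", "ops-console")},        # targeted JMS delivery
    {"filter": {"classification": "alarm"},
     "deliver_to": ("smtp", "oncall@example.org")}, # e-mail notification
]

message = {"mission": "MRO", "classification": "alarm", "source": "scheduler"}
targets = route(message, subscriptions)
```

A message matching several rules is delivered over each matching channel, which mirrors the framework's mix of JMS, SMTP and SMS deliveries.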

  8. MetNetAPI: A flexible method to access and manipulate biological network data from MetNet

    PubMed Central

    2010-01-01

    Background Convenient programmatic access to different biological databases allows automated integration of scientific knowledge. Many databases support a function to download files or data snapshots, or a webservice that offers "live" data. However, the functionality that a database offers cannot be represented in a static data download file, and webservices may consume considerable computational resources from the host server. Results MetNetAPI is a versatile Application Programming Interface (API) to the MetNetDB database. It abstracts, captures and retains operations away from a biological network repository and website. A range of database functions, previously only available online, can be immediately (and independently from the website) applied to a dataset of interest. Data is available in four layers: molecular entities, localized entities (linked to a specific organelle), interactions, and pathways. Navigation between these layers is intuitive (e.g. one can request the molecular entities in a pathway, as well as request in what pathways a specific entity participates). Data retrieval can be customized: Network objects allow the construction of new and integration of existing pathways and interactions, which can be uploaded back to our server. In contrast to webservices, the computational demand on the host server is limited to processing data-related queries only. Conclusions An API provides several advantages to a systems biology software platform. MetNetAPI illustrates an interface with a central repository of data that represents the complex interrelationships of a metabolic and regulatory network. As an alternative to data-dumps and webservices, it allows access to a current and "live" database and exposes analytical functions to application developers. Yet it only requires limited resources on the server-side (thin server/fat client setup). 
The API is available for Java, Microsoft .NET and R programming environments and offers flexible query and broad data-retrieval methods. Data retrieval can be customized to client needs and the API offers a framework to construct and manipulate user-defined networks. The design principles can be used as a template to build programmable interfaces for other biological databases. The API software and tutorials are available at http://www.metnetonline.org/api. PMID:21083943
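The two-way navigation between data layers described above (entities of a pathway, and pathways of an entity) can be sketched with a tiny in-memory model. The function names and data are illustrative assumptions, not the actual MetNetAPI signatures, which target Java, .NET and R.

```python
# Minimal in-memory sketch of two-way navigation between entities and pathways.
pathways = {
    "glycolysis": {"glucose", "pyruvate"},
    "TCA cycle": {"pyruvate", "citrate"},
}

def entities_in(pathway):
    """Molecular entities participating in a pathway (one navigation direction)."""
    return sorted(pathways[pathway])

def pathways_of(entity):
    """Reverse lookup: every pathway in which an entity participates."""
    return sorted(p for p, ents in pathways.items() if entity in ents)
```

The point of the sketch is the symmetry: a client can descend from pathways to entities or ascend from an entity to all pathways that contain it, without round trips to the website.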

  9. A client/server system for Internet access to biomedical text/image databanks.

    PubMed

    Thoma, G R; Long, L R; Berman, L E

    1996-01-01

    Internet access to mixed text/image databanks is finding application in the medical world. An example is a database of medical X-rays and associated data consisting of demographic, socioeconomic, physician's exam, medical laboratory and other information collected as part of a nationwide health survey conducted by the government. Another example is a collection of digitized cryosection images, CT and MR taken of cadavers as part of the National Library of Medicine's Visible Human Project. In both cases, the challenge is to provide access to both the image and the associated text for a wide end-user community, so that users can create atlases, conduct epidemiological studies, and develop image-specific algorithms for compression, enhancement and other types of image processing, among many other applications. The databanks mentioned above are being created in prototype form. This paper describes the prototype system developed for the archiving of the data, and the client software that enables a broad range of end users to access the archive, retrieve text and image data, display the data and manipulate the images. System design considerations include: data organization in a relational database management system with object-oriented extensions; a hierarchical organization of the image data by different resolution levels for different user classes; client design based on common hardware and software platforms incorporating SQL search capability, X Window, Motif and TAE (a development environment supporting rapid prototyping and management of graphic-oriented user interfaces); the potential to include ultra-high-resolution display monitors as a user option; an intuitive user interface paradigm for building complex queries; and contrast enhancement, magnification and mensuration tools for better viewing by the user.

  10. [The database server for the medical bibliography database at Charles University].

    PubMed

    Vejvalka, J; Rojíková, V; Ulrych, O; Vorísek, M

    1998-01-01

    In the medical community, bibliographic databases are widely accepted as a most important source of information both for theoretical and clinical disciplines. To improve access to medical bibliographic databases at Charles University, a database server (ERL by Silver Platter) was set up at the 2nd Faculty of Medicine in Prague. The server, accessible by Internet 24 hours/7 days, hosts now 14 years' MEDLINE and 10 years' EMBASE Paediatrics. Two different strategies are available for connecting to the server: a specialized client program that communicates over the Internet (suitable for professional searching) and a web-based access that requires no specialized software (except the WWW browser) on the client side. The server is now offered to academic community to host further databases, possibly subscribed by consortia whose individual members would not subscribe them by themselves.

  11. Analysis Tool Web Services from the EMBL-EBI.

    PubMed

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
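For the data-retrieval services mentioned above, a dbfetch request is parameterized by database, entry identifier, format and style. A minimal sketch of building such a request URL follows; the endpoint pattern follows the EMBL-EBI dbfetch documentation, and the UniProtKB identifier is illustrative.

```python
from urllib.parse import urlencode

def dbfetch_url(db, ids, fmt="fasta", style="raw"):
    """Build an EMBL-EBI dbfetch request URL for one or more entry identifiers.
    Endpoint pattern as documented for dbfetch; no request is made here."""
    base = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"
    query = urlencode({"db": db, "id": ",".join(ids), "format": fmt, "style": style})
    return f"{base}?{query}"

url = dbfetch_url("uniprotkb", ["P12345"])
# The returned string can be fetched with any HTTP client to obtain the entry.
```

Keeping URL construction separate from transport makes the same helper usable from scripts, pipelines or workflow engines, which is how the EMBL-EBI sample clients are typically structured.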

  12. Analysis Tool Web Services from the EMBL-EBI

    PubMed Central

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-01-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338

  13. Delayed Instantiation Bulk Operations for Management of Distributed, Object-Based Storage Systems

    DTIC Science & Technology

    2009-08-01

    source and destination object sets, while they have attribute pages to indicate that history. Fourth, we allow for operations to occur on any objects...client dialogue to the PostgreSQL database where server-side functions implement the service logic for the requests. The translation is done...to satisfy client requests, and performs delayed instantiation bulk operations. It is built around a PostgreSQL database with tables for storing

  14. Web client and ODBC access to legacy database information: a low cost approach.

    PubMed Central

    Sanders, N. W.; Mann, N. H.; Spengler, D. M.

    1997-01-01

    A new method has been developed for the Department of Orthopaedics of Vanderbilt University Medical Center to access departmental clinical data. Previously this data was stored only in the medical center's mainframe DB2 database; it is now additionally stored in a departmental SQL database. Access to this data is available via any ODBC-compliant front-end or a web client. With a small budget and no full-time staff, we were able to give our department on-line access to many years' worth of patient data that was previously inaccessible. PMID:9357735

  15. Real Time Monitor of Grid job executions

    NASA Astrophysics Data System (ADS)

    Colling, D. J.; Martyniak, J.; McGough, A. S.; Křenek, A.; Sitera, J.; Mulač, M.; Dvořák, F.

    2010-04-01

    In this paper we describe the architecture and operation of the Real Time Monitor (RTM), developed by the Grid team in the HEP group at Imperial College London. This is arguably the most popular dissemination tool within the EGEE [1] Grid, having been used on many occasions including the GridFest and LHC inauguration events held at CERN in October 2008. The RTM gathers information from EGEE sites hosting Logging and Bookkeeping (LB) services. Information is cached locally at a dedicated server at Imperial College London and made available for clients to use in near real time. The system consists of three main components: the RTM server, the enquirer and an Apache web server which is queried by clients. The RTM server queries the LB servers at fixed time intervals, collecting job-related information and storing it in a local database. Job-related data includes not only the job state (i.e. Scheduled, Waiting, Running or Done) along with timing information, but also other attributes such as the Virtual Organization and the Computing Element (CE) queue, if known. The job data stored in the RTM database is read by the enquirer every minute and converted to an XML format which is stored on a web server. This decouples the RTM server database from the clients, removing the bottleneck caused by many clients simultaneously accessing the database. The information can be visualized through either a 2D or a 3D Java-based client, with live job data either overlaid on a two-dimensional map of the world or rendered in three dimensions over a globe using OpenGL.

  16. Tank Information System (TIS): a Case Study in Migrating a Web Mapping Application from Flex to Dojo for ArcGIS Server and then to Open Source

    NASA Astrophysics Data System (ADS)

    Pulsani, B. R.

    2017-11-01

    Tank Information System is a web application which provides comprehensive information about the minor irrigation tanks of Telangana State. As part of the program, a web mapping application using Flex and ArcGIS Server was developed to make the data available to the public. In the course of time, as Flex became outdated, the client interface was migrated to the latest JavaScript-based technologies. Initially, the Flex-based application was migrated to the ArcGIS JavaScript API using the Dojo Toolkit. Both client applications used published services from ArcGIS Server. To examine the migration pattern from proprietary software to open source, the JavaScript-based ArcGIS application was later migrated to OpenLayers and the Dojo Toolkit, using published services from GeoServer. The migration pattern observed in the study especially emphasizes the use of the Dojo Toolkit and a PostgreSQL database with ArcGIS Server, so that migration to open source can be performed effortlessly. The application provides a case study that could assist organizations in migrating their proprietary ArcGIS web applications to open source. Furthermore, the study reveals the cost benefits of adopting open source over commercial software.

  17. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    NASA Astrophysics Data System (ADS)

    Ma, Xiuzeng; Li, Yingkui; Bourgeois, Mike; Caffee, Marc; Elmore, David; Granger, Darryl; Muzikar, Paul; Smith, Preston

    2007-06-01

    Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this it is critical to establish an effective, easily accessible and well-defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on nuclide concentrations measured by accelerator mass spectrometry. WebCN for 10Be and 26Al has been completed and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for 36Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open-source PostgreSQL for database management and PHP for the interface design and calculations. On the client side, an Internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both the spatial and temporal distribution of the cosmic ray flux in calculating the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
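The core computation behind such a tool can be illustrated with the standard zero-erosion case: the nuclide concentration grows as N(t) = (P/λ)(1 − e^(−λt)), so the exposure age is t = −ln(1 − λN/P)/λ. This is a simplified sketch, not WebCN's algorithm (which also accounts for spatial and temporal flux variation); the 10Be decay constant below is an illustrative value.

```python
import math

# Illustrative 10Be decay constant [1/yr], from a half-life of ~1.387 Myr.
LAMBDA_BE10 = math.log(2) / 1.387e6

def concentration(t, P, lam=LAMBDA_BE10):
    """Forward model, zero erosion: N(t) = (P/lam) * (1 - exp(-lam*t)),
    with N in atoms/g, P in atoms/g/yr, t in years."""
    return (P / lam) * (1.0 - math.exp(-lam * t))

def exposure_age(N, P, lam=LAMBDA_BE10):
    """Invert the forward model: t = -ln(1 - lam*N/P) / lam."""
    return -math.log(1.0 - lam * N / P) / lam

# Round trip: an age of 25 kyr should be recovered from its own concentration.
N = concentration(25_000.0, P=10.0)
t = exposure_age(N, P=10.0)
```

For young surfaces (λt ≪ 1) the formula reduces to the familiar N ≈ Pt; the logarithm matters as the concentration approaches its saturation value P/λ.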

  18. Clients' collaboration in therapy: Self-perceptions and relationships with client psychological functioning, interpersonal relations, and motivation.

    PubMed

    Bachelor, Alexandra; Laverdière, Olivier; Gamache, Dominick; Bordeleau, Vincent

    2007-06-01

    To gain a closer understanding of client collaboration and its determinants, the first goal of this study involved the investigation of clients' perceptions of collaboration using a discovery-oriented methodology. Content analysis of 30 clients' written descriptions revealed three different modes of client collaboration, labeled active, mutual, and therapist-dependent, which emphasized client initiative and active participation, joint participation, and reliance on therapists' contributions to the work and change process, respectively. The majority of clients valued the therapist's active involvement and also emphasized the helpfulness of their collaborative experiences. In general, the therapist actions and attitudes involved in clients' views of good collaboration varied among clients. A second goal was to examine the relationships between client psychological functioning, quality of interpersonal relationships, and motivation, and clients' collaborative contributions, as rated by clients and therapists. Of these, only motivation was significantly associated with client collaboration, particularly in the perceptions of therapists. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  19. JBioWH: an open-source Java framework for bioinformatics data integration

    PubMed Central

    Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor

    2013-01-01

    The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database scheme and includes JAVA API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh PMID:23846595
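The warehouse idea above, loading several public sources into one relational schema so a single SQL query can span them, can be sketched as follows. JBioWH itself uses MySQL and a much richer schema; here sqlite3 stands in, and the two-table schema is an illustration only.

```python
import sqlite3

# Sketch of a warehouse-style cross-source join. JBioWH uses MySQL and a far
# richer schema; this tiny two-table layout is an illustrative stand-in.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE protein (id TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE pathway_member (pathway TEXT, protein_id TEXT);
    INSERT INTO protein VALUES ('P1', 'hexokinase'), ('P2', 'catalase');
    INSERT INTO pathway_member VALUES ('glycolysis', 'P1');
""")

# One SQL query spans what would otherwise be two separate data sources.
rows = db.execute("""
    SELECT p.name, m.pathway
    FROM protein p JOIN pathway_member m ON m.protein_id = p.id
""").fetchall()
```

Once the sources share one schema, "intensive querying of multiple data sources" reduces to ordinary joins executed locally, which is the efficiency argument the abstract makes.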

  20. JBioWH: an open-source Java framework for bioinformatics data integration.

    PubMed

    Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor

    2013-01-01

    The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database scheme and includes JAVA API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh.

  1. [A telemedicine electrocardiography system based on the component-architecture soft].

    PubMed

    Potapov, I V; Selishchev, S V

    2004-01-01

    The paper deals with a universal component-oriented architecture for creating telemedicine applications. The system supports ECG recording, pressure measurement and pulse measurement. The system design comprises a central database server and a client telemedicine module. Data can be transmitted via different interfaces, from an ordinary local network to digital satellite phones. Data protection is ensured by smart cards used to implement 3DES-based authentication.

  2. An ECG storage and retrieval system embedded in client server HIS utilizing object-oriented DB.

    PubMed

    Wang, C; Ohe, K; Sakurai, T; Nagase, T; Kaihara, S

    1996-02-01

    In the University of Tokyo Hospital, the improved client-server HIS has been applied to clinical practice: physicians can order prescriptions, laboratory examinations, ECG examinations, radiographic examinations, etc., directly by themselves, and read the results of these examinations, except medical signal waves, schemata and images, on UNIX workstations. Recently, we designed and developed an ECG storage and retrieval system embedded in the client-server HIS utilizing an object-oriented database, to take the first step in dealing with digitized signal, schema and image data and to show waves, graphics and images directly to physicians through the client-server HIS. The system was developed based on object-oriented analysis and design, and implemented with an object-oriented database management system (OODMS) and the C++ programming language. In this paper, we describe the ECG data model, the functions of the storage and retrieval system, the features of the user interface and the results of its implementation in the HIS.

  3. Internet calculations of thermodynamic properties of substances: Some problems and results

    NASA Astrophysics Data System (ADS)

    Ustyuzhanin, E. E.; Ochkov, V. F.; Shishakov, V. V.; Rykov, S. V.

    2016-11-01

    Internet resources (databases, web sites and others) on the thermodynamic properties R = (p, T, s, ...) of technologically important substances are analyzed. Databases put online by a number of organizations (the Joint Institute for High Temperatures of the Russian Academy of Sciences, Standartinform, the National Institute of Standards and Technology (USA), the Institute for Thermal Physics of the Siberian Branch of the Russian Academy of Sciences, etc.) are investigated. Software codes are developed in this work in the form of “client functions”, which have the following characteristics: (i) they are placed on a remote server, and (ii) they serve as open interactive Internet resources. A client can use them to calculate the R properties of substances. “Complex client functions” are also considered; these combine (i) software codes developed for the design of power plants (PP) and (ii) client functions that calculate the R properties of the working fluids of PP.

  4. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; Bartoldus, R.

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle-tier 'CORAL server' deployed close to the database and a tree of 'CORAL server proxies', providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer of 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.

  5. Design and development of a web-based application for diabetes patient data management.

    PubMed

    Deo, S S; Deobagkar, D N; Deobagkar, Deepti D

    2005-01-01

    A web-based database management system developed for collecting, managing and analysing information of diabetes patients is described here. It is a searchable, client-server, relational database application, developed on the Windows platform using Oracle, Active Server Pages (ASP), Visual Basic Script (VB Script) and Java Script. The software is menu-driven and allows authorized healthcare providers to access, enter, update and analyse patient information. Graphical representation of data can be generated by the system using bar charts and pie charts. An interactive web interface allows users to query the database and generate reports. Alpha- and beta-testing of the system was carried out and the system at present holds records of 500 diabetes patients and is found useful in diagnosis and treatment. In addition to providing patient data on a continuous basis in a simple format, the system is used in population and comparative analysis. It has proved to be of significant advantage to the healthcare provider as compared to the paper-based system.

  6. SPANG: a SPARQL client supporting generation and reuse of queries for distributed RDF databases.

    PubMed

    Chiba, Hirokazu; Uchiyama, Ikuo

    2017-02-08

    Toward improved interoperability of distributed biological databases, an increasing number of datasets have been published in the standardized Resource Description Framework (RDF). Although the powerful SPARQL Protocol and RDF Query Language (SPARQL) provides a basis for exploiting RDF databases, writing SPARQL code is burdensome for users including bioinformaticians. Thus, an easy-to-use interface is necessary. We developed SPANG, a SPARQL client that has unique features for querying RDF datasets. SPANG dynamically generates typical SPARQL queries according to specified arguments. It can also call SPARQL template libraries constructed in a local system or published on the Web. Further, it enables combinatorial execution of multiple queries, each with a distinct target database. These features facilitate easy and effective access to RDF datasets and integrative analysis of distributed data. SPANG helps users to exploit RDF datasets by generation and reuse of SPARQL queries through a simple interface. This client will enhance integrative exploitation of biological RDF datasets distributed across the Web. This software package is freely available at http://purl.org/net/spang .
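    SPANG's central idea, generating typical SPARQL queries from a few specified arguments, can be sketched as a small query builder. The function below is illustrative only; it is not SPANG's actual interface:

```python
def build_sparql(subject_type, predicate, limit=10, prefixes=None):
    """Generate a typical SPARQL SELECT query from a few arguments,
    in the spirit of SPANG's dynamic query generation (the argument
    names and query shape here are illustrative, not SPANG's API)."""
    prefixes = prefixes or {
        "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    }
    header = "\n".join(f"PREFIX {p}: <{uri}>" for p, uri in prefixes.items())
    return (header + "\n"
            "SELECT ?s ?o WHERE {\n"
            f"  ?s rdf:type {subject_type} .\n"
            f"  ?s {predicate} ?o .\n"
            f"}} LIMIT {limit}")

q = build_sparql("<http://example.org/Gene>", "<http://example.org/name>", limit=5)
print(q)
```

    A template library, as in SPANG, would replace the hard-coded SELECT pattern with named, parameterized templates, and combinatorial execution would run several such generated queries against distinct endpoints.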

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soldevilla, M.; Salmons, S.; Espinosa, B.

    The new application BDDR (Reactor database) has been developed at CEA in order to manage nuclear reactors technological and operating data. This application is a knowledge management tool which meets several internal needs: -) to facilitate scenario studies for any set of reactors, e.g. non-proliferation assessments; -) to make core physics studies easier, whatever the reactor design (PWR-Pressurized Water Reactor-, BWR-Boiling Water Reactor-, MAGNOX- Magnesium Oxide reactor-, CANDU - CANada Deuterium Uranium-, FBR - Fast Breeder Reactor -, etc.); -) to preserve the technological data of all reactors (past and present, power generating or experimental, naval propulsion,...) in a uniquemore » repository. Within the application database are enclosed location data and operating history data as well as a tree-like structure containing numerous technological data. These data address all kinds of reactors features and components. A few neutronics data are also included (neutrons fluxes). The BDDR application is based on open-source technologies and thin client/server architecture. The software architecture has been made flexible enough to allow for any change. (authors)« less

  8. A JEE RESTful service to access Conditions Data in ATLAS

    NASA Astrophysics Data System (ADS)

    Formica, Andrea; Gallas, E. J.

    2015-12-01

    Usage of condition data in ATLAS is extensive for offline reconstruction and analysis (e.g. alignment, calibration, data quality). The system is based on the LCG Conditions Database infrastructure, with read and write access via an ad hoc C++ API (COOL), a system which was developed before Run 1 data taking began. The infrastructure dictates that the data is organized into separate schemas (assigned to subsystems/groups storing distinct and independent sets of conditions), making it difficult to access information from several schemas at the same time. We have thus created PL/SQL functions containing queries to provide content extraction at multi-schema level. The PL/SQL API has been exposed to external clients by means of a Java application providing DB access via REST services, deployed inside an application server (JBoss WildFly). The services allow navigation over multiple schemas via simple URLs. The data can be retrieved either in XML or JSON formats, via simple clients (like curl or Web browsers).
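    The access pattern the abstract describes (navigation over multiple schemas via simple URLs, with results in JSON) can be sketched as follows; the base URL, path layout and payload shape are assumptions for illustration, not the actual ATLAS service:

```python
import json
from urllib.parse import urljoin

BASE = "https://example.cern.ch/conditions/"  # hypothetical base URL

def condition_url(schema, folder, tag):
    # Simple URL navigation over schema/folder, as the abstract describes;
    # the exact path layout is an assumption for illustration.
    return urljoin(BASE, f"{schema}/{folder}?tag={tag}&format=json")

# Parsing a JSON response of the general shape such a service might return.
sample_payload = ('{"schema": "CALO", "iovs": '
                  '[{"since": 0, "until": 100, "payload": {"gain": 1.02}}]}')
data = json.loads(sample_payload)
gains = [iov["payload"]["gain"] for iov in data["iovs"]]
print(condition_url("CALO", "Calibration", "RUN2-v1"))
print(gains)  # → [1.02]
```

    A simple client such as curl would fetch the same URL directly, which is exactly the kind of lightweight access the REST layer was built to allow.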

  9. Mining large heterogeneous data sets in drug discovery.

    PubMed

    Wild, David J

    2009-10-01

    Increasingly, effective drug discovery involves the searching and data mining of large volumes of information from many sources covering the domains of chemistry, biology and pharmacology amongst others. This has led to a proliferation of databases and data sources relevant to drug discovery. This paper provides a review of the publicly-available large-scale databases relevant to drug discovery, describes the kinds of data mining approaches that can be applied to them and discusses recent work in integrative data mining that looks for associations that span multiple sources, including the use of Semantic Web techniques. The future of mining large data sets for drug discovery requires intelligent, semantic aggregation of information from all of the data sources described in this review, along with the application of advanced methods such as intelligent agents and inference engines in client applications.

  10. Family tree and ancestry inference: is there a need for a 'generational' consent?

    PubMed

    Wallace, Susan E; Gourna, Elli G; Nikolova, Viktoriya; Sheehan, Nuala A

    2015-12-09

    Genealogical research and ancestry testing are popular recreational activities but little is known about the impact of the use of these services on clients' biological and social families. Ancestry databases are being enriched with self-reported data and data from deoxyribonucleic acid (DNA) analyses, but also are being linked to other direct-to-consumer genetic testing and research databases. As both family history data and DNA can provide information on more than just the individual, we asked whether companies, as a part of the consent process, were informing clients, and through them clients' relatives, of the potential implications of the use and linkage of their personal data. We used content analysis to analyse publicly available consent and informational materials provided to potential clients of ancestry and direct-to-consumer genetic testing companies to determine what consent is required, what risks associated with participation were highlighted, and whether the consent or notification of third parties was suggested or required. We identified four categories of companies providing: 1) services based only on self-reported data, such as personal or family history; 2) services based only on DNA provided by the client; 3) services using both; and 4) services using both that also have a research component. The amount of information provided on the potential issues varied significantly across the categories of companies. 'Traditional' ancestry companies showed the greatest awareness of the implications for family members, while companies only asking for DNA focused solely on the client. While in some cases companies included text recommending that clients inform their relatives, showing they recognised the issues, often it was located within lengthy terms and conditions or privacy statements that may not be read by potential clients. We recommend that companies make it clearer that clients should inform third parties about their plans to participate, that third parties' data will be provided to companies, and that those data will be linked to other databases, raising privacy and data-use issues. We also suggest investigating whether a 'generational consent' should be created that would include more than just the individual in decisions about participating in genetic investigations.

  11. The EarthServer project: Exploiting Identity Federations, Science Gateways and Social and Mobile Clients for Big Earth Data Analysis

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Messina, Antonio; Pappalardo, Marco; Passaro, Gianluca

    2013-04-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" - based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. Six Lighthouse Applications are being established in EarthServer, each of which poses distinct challenges on Earth Data Analytics: Cryospheric Science, Airborne Science, Atmospheric Science, Geology, Oceanography, and Planetary Science. Altogether, they cover all Earth Science domains; the Planetary Science use case has been added to challenge concepts and standards in non-standard environments. In addition, EarthLook (maintained by Jacobs University) showcases use of OGC standards in 1D through 5D use cases. In this contribution we will report on the first applications integrated in the EarthServer Science Gateway and on the clients for mobile appliances developed to access them. We will also show how federated and social identity services can allow Big Earth Data Providers to expose their data in a distributed environment keeping a strict and fine-grained control on user authentication and authorisation. 
The degree of fulfilment of the EarthServer implementation with the recommendations made in the recent TERENA Study on AAA Platforms For Scientific Resources in Europe (https://confluence.terena.org/display/aaastudy/AAA+Study+Home+Page) will also be assessed.

  12. Use of XML and Java for collaborative petroleum reservoir modeling on the Internet

    NASA Astrophysics Data System (ADS)

    Victorine, John; Watney, W. Lynn; Bhattacharya, Saibal

    2005-11-01

    The GEMINI (Geo-Engineering Modeling through INternet Informatics) is a public-domain, web-based freeware that is made up of an integrated suite of 14 Java-based software tools to accomplish on-line, real-time geologic and engineering reservoir modeling. GEMINI facilitates distant collaborations for small company and academic clients, negotiating analyses of both single and multiple wells. The system operates on a single server and an enterprise database. External data sets must be uploaded into this database. Feedback from GEMINI users provided the impetus to develop Stand Alone Web Start Applications of GEMINI modules that reside in and operate from the user's PC. In this version, the GEMINI modules run as applets, which may reside in local user PCs, on the server, or Java Web Start. In this enhanced version, XML-based data handling procedures are used to access data from remote and local databases and save results for later access and analyses. The XML data handling process also integrates different stand-alone GEMINI modules enabling the user(s) to access multiple databases. It provides flexibility to the user to customize analytical approach, database location, and level of collaboration. An example integrated field-study using GEMINI modules and Stand Alone Web Start Applications is provided to demonstrate the versatile applicability of this freeware for cost-effective reservoir modeling.
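    The XML-based data handling GEMINI relies on to pass datasets between modules and databases can be sketched as a simple round trip; the element and attribute names here are illustrative, not GEMINI's actual format:

```python
import xml.etree.ElementTree as ET

def dataset_to_xml(well, depths_ft):
    # Serialize a (hypothetical) per-well dataset so another module,
    # possibly on another machine, can consume it.
    root = ET.Element("dataset", {"well": well})
    for d in depths_ft:
        ET.SubElement(root, "depth", {"ft": str(d)})
    return ET.tostring(root, encoding="unicode")

def xml_to_depths(xml_text):
    # The receiving module parses the same document back into values.
    root = ET.fromstring(xml_text)
    return [float(e.get("ft")) for e in root.findall("depth")]

doc = dataset_to_xml("KGS-1", [3050.0, 3055.5])
print(doc)
```

    Because the interchange format is plain XML, the producing and consuming modules need not share a database connection, which is what lets stand-alone clients work against remote or local stores interchangeably.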

  13. Use of XML and Java for collaborative petroleum reservoir modeling on the Internet

    USGS Publications Warehouse

    Victorine, J.; Watney, W.L.; Bhattacharya, S.

    2005-01-01

    The GEMINI (Geo-Engineering Modeling through INternet Informatics) is a public-domain, web-based freeware that is made up of an integrated suite of 14 Java-based software tools to accomplish on-line, real-time geologic and engineering reservoir modeling. GEMINI facilitates distant collaborations for small company and academic clients, negotiating analyses of both single and multiple wells. The system operates on a single server and an enterprise database. External data sets must be uploaded into this database. Feedback from GEMINI users provided the impetus to develop Stand Alone Web Start Applications of GEMINI modules that reside in and operate from the user's PC. In this version, the GEMINI modules run as applets, which may reside in local user PCs, on the server, or Java Web Start. In this enhanced version, XML-based data handling procedures are used to access data from remote and local databases and save results for later access and analyses. The XML data handling process also integrates different stand-alone GEMINI modules enabling the user(s) to access multiple databases. It provides flexibility to the user to customize analytical approach, database location, and level of collaboration. An example integrated field-study using GEMINI modules and Stand Alone Web Start Applications is provided to demonstrate the versatile applicability of this freeware for cost-effective reservoir modeling. © 2005 Elsevier Ltd. All rights reserved.

  14. 78 FR 46395 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-31

    ... after logon 10:00:020--CAS receives a message from Client Application --Counter re-starts 10:00:070--No... receives a message from Client Application --Counter restarts (2) 10:00:000--Heartbeat Request sent to Client Application within login 10:00:020--CAS receives a message from Client Application --Counter re...

  15. A simple versatile solution for collecting multidimensional clinical data based on the CakePHP web application framework.

    PubMed

    Biermann, Martin

    2014-04-01

    Clinical trials aiming for regulatory approval of a therapeutic agent must be conducted according to Good Clinical Practice (GCP). Clinical Data Management Systems (CDMS) are specialized software solutions geared toward GCP trials. They are however less suited for data management in small non-GCP research projects. For use in researcher-initiated non-GCP studies, we developed a client-server database application based on the public domain CakePHP framework. The underlying MySQL database uses a simple data model based on only five data tables. The graphical user interface can be run in any web browser inside the hospital network. Data are validated upon entry. Data contained in external database systems can be imported interactively. Data are automatically anonymized on import, with the key lists identifying the subjects logged to a restricted part of the database. Data analysis is performed by separate statistics and analysis software connecting to the database via a generic Open Database Connectivity (ODBC) interface. Since its first pilot implementation in 2011, the solution has been applied to seven different clinical research projects covering different clinical problems in different organ systems such as cancer of the thyroid and the prostate glands. This paper shows how the adoption of a generic web application framework is a feasible, flexible, low-cost, and user-friendly way of managing multidimensional research data in researcher-initiated non-GCP clinical projects. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
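    The anonymize-on-import step the abstract mentions, with the key list kept apart from the research data, can be sketched as follows; the hashing scheme, salt handling and field names are assumptions for illustration, not the paper's implementation:

```python
import hashlib

def pseudonym(national_id, salt="project-secret"):
    # Derive a stable pseudonym from a subject identifier. A real system
    # would manage the salt as a protected secret; this is illustrative.
    digest = hashlib.sha256((salt + national_id).encode()).hexdigest()
    return "SUBJ-" + digest[:8].upper()

key_list = {}       # restricted store: pseudonym -> identity
research_rows = []  # what analysts see: pseudonymous data only

def import_record(national_id, name, measurement):
    p = pseudonym(national_id)
    key_list[p] = name                                # logged separately
    research_rows.append({"subject": p, "value": measurement})

import_record("010180-12345", "Jane Doe", 4.7)
print(research_rows[0]["subject"].startswith("SUBJ-"))  # → True
```

    Re-importing data for the same subject yields the same pseudonym, so longitudinal records line up without ever exposing the identity to the analysis side.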

  16. Insect barcode information system.

    PubMed

    Pratheepa, Maria; Jalali, Sushil Kumar; Arokiaraj, Robinson Silvester; Venkatesan, Thiruvengadam; Nagesh, Mandadi; Panda, Madhusmita; Pattar, Sharath

    2014-01-01

    The Insect Barcode Information System, called Insect Barcode Informática (IBIn), is an online database resource developed by the National Bureau of Agriculturally Important Insects, Bangalore. This database provides acquisition, storage, analysis and publication of DNA barcode records of agriculturally important insects, for researchers specifically in India and other countries. It bridges a gap in bioinformatics by integrating molecular, morphological and distribution details of agriculturally important insects. IBIn was developed in PHP/MySQL using the relational database management concept. This database is based on the client-server architecture, where many clients can access data simultaneously. IBIn is freely available on-line and is user-friendly. IBIn allows registered users to input new information and to search and view information related to DNA barcodes of agriculturally important insects. This paper provides the current status of insect barcoding in India and a brief introduction to the IBIn database. http://www.nabg-nbaii.res.in/barcode.

  17. A Symphony of Software.

    ERIC Educational Resources Information Center

    Currents, 2002

    2002-01-01

    Offers a descriptive table of databases that help higher education institutions orchestrate advancement operations. Information includes vendor, contact, software, price, database engine/server platform, recommended reporting tools, record capacity, and client type. (EV)

  18. CDC WONDER: a cooperative processing architecture for public health.

    PubMed Central

    Friede, A; Rosen, D H; Reid, J A

    1994-01-01

    CDC WONDER is an information management architecture designed for public health. It provides access to information and communications without the user's needing to know the location of data or communication pathways and mechanisms. CDC WONDER users have access to extractions from some 40 databases; electronic mail (e-mail); and surveillance data processing. System components include the Remote Client, the Communications Server, the Queue Managers, and Data Servers and Process Servers. The Remote Client software resides in the user's machine; other components are at the Centers for Disease Control and Prevention (CDC). The Remote Client, the Communications Server, and the Applications Server provide access to the information and functions in the Data Servers and Process Servers. The system architecture is based on cooperative processing, and components are coupled via pure message passing, using several protocols. This architecture allows flexibility in the choice of hardware and software. One system limitation is that final results from some subsystems are obtained slowly. Although designed for public health, CDC WONDER could be useful for other disciplines that need flexible, integrated information exchange. PMID:7719813

  19. Best kept secrets ... First Coast Systems, Inc. (FCS).

    PubMed

    Andrew, W F

    1991-04-01

    The FCS/APaCS system is a viable option for small- to medium-size hospitals (up to 400 beds). The table-driven system takes full advantage of IBM AS/400 computer architecture. A comprehensive application set, provided in an integrated database environment, is adaptable to multi-facility environments. Price/performance appears to be competitive. Commitment to the IBM AS/400 environment assures cost-effective hardware platforms backed by IBM support and resources. As an IBM Health Industry Business Partner, FCS (and its clients) benefits from IBM's well-known commitment to quality and service. Corporate emphasis on user involvement and satisfaction, along with a commitment to quality and service for the APaCS systems, assures clients of "leading edge" capabilities in this evolutionary healthcare delivery environment. FCS/APaCS will be a strong contender in selected marketing environments.

  20. GLobal Integrated Design Environment

    NASA Technical Reports Server (NTRS)

    Kunkel, Matthew; McGuire, Melissa; Smith, David A.; Gefert, Leon P.

    2011-01-01

    The GLobal Integrated Design Environment (GLIDE) is a collaborative engineering application built to resolve the design session issues of real-time passing of data between multiple discipline experts in a collaborative environment. Utilizing Web protocols and multiple programming languages, GLIDE allows engineers to use the applications to which they are accustomed (in this case, Excel) to send and receive datasets via the Internet to a database-driven Web server. Traditionally, a collaborative design session consists of one or more engineers representing each discipline meeting together in a single location. The discipline leads exchange parameters and iterate through their respective processes to converge on an acceptable dataset. In cases in which the engineers are unable to meet, their parameters are passed via e-mail, telephone, facsimile, or even postal mail. This slow process of data exchange could stretch a design session to weeks or even months. While the iterative process remains in place, software can now exchange parameters securely and efficiently, while at the same time allowing much more information about a design session to be made available. GLIDE is written in a compilation of several programming languages, including REALbasic, PHP, and Microsoft Visual Basic. GLIDE client installers are available to download for both Microsoft Windows and Macintosh systems. The GLIDE client software is compatible with Microsoft Excel 2000 or later on Windows systems, and with Microsoft Excel X or later on Macintosh systems. GLIDE follows the Client-Server paradigm, transferring encrypted and compressed data via standard Web protocols. Currently, the engineers use Excel as a front end to the GLIDE Client, as many of their custom tools run in Excel.
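    The parameter exchange GLIDE performs (serialize a discipline's dataset, compress it, ship it over standard Web protocols) can be sketched as a round trip; encryption is omitted here, and the session and field names are illustrative:

```python
import json
import zlib

# A discipline expert's parameter set, as it might be pulled from Excel.
dataset = {"session": "lander-study-01", "discipline": "propulsion",
           "parameters": {"thrust_kN": 45.0, "isp_s": 320.0}}

# Client -> server: serialize and compress for transfer over HTTP(S).
wire = zlib.compress(json.dumps(dataset).encode())

# Server -> other clients: decompress and deserialize on arrival.
received = json.loads(zlib.decompress(wire).decode())

print(received["parameters"]["isp_s"])  # → 320.0
```

    In the real system the compressed payload would also be encrypted and posted to the database-driven Web server, which is what lets every discipline lead pull the latest converged values instead of waiting on e-mail or fax.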

  1. 76 FR 5840 - Supermedia LLC, Formerly Known as Idearc Media LLC, Supermedia Information Services LLC, Client...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-02

    ..., and Database; Middleton, Massachusetts; Notice of Revised Determination on Reconsideration By... from Advantage (TAC), Resprcconn, Tataconssv, Modis, Amdocs, and Database, Middleton, Massachusetts...

  2. 34 CFR 370.40 - What are allowable costs?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION CLIENT ASSISTANCE PROGRAM What Post-Award... in § 370.4, the cost of travel in connection with the provision to a client or client applicant of... if the attendant must accompany the client or client applicant. (d) The State and the designated...

  3. 34 CFR 370.44 - What reporting requirement applies to each designated agency?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (Continued) OFFICE OF SPECIAL EDUCATION AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION CLIENT... designated agency from clients or client applicants; (d) The number of the requests for advocacy services from clients or client applicants that the designated agency was unable to serve; (e) The reasons that...

  4. Cryptographically secure biometrics

    NASA Astrophysics Data System (ADS)

    Stoianov, A.

    2010-04-01

    Biometric systems usually do not possess a cryptographic level of security: it has been deemed impossible to perform biometric authentication in the encrypted domain because of the natural variability of biometric samples and the cryptographic intolerance of even a single bit error. Encrypted biometric data need to be decrypted on authentication, which creates privacy and security risks. On the other hand, the known solutions called "Biometric Encryption (BE)" or "Fuzzy Extractors" can be cracked by various attacks, for example, by running offline a database of images against the stored helper data in order to obtain a false match. In this paper, we present a novel approach which combines Biometric Encryption with the classical Blum-Goldwasser cryptosystem. In the "Client - Service Provider (SP)" or the "Client - Database - SP" architecture it is possible to keep the biometric data encrypted at all stages of storage and authentication, so that the SP never has access to unencrypted biometric data. It is shown that this approach is suitable for two of the most popular BE schemes, Fuzzy Commitment and Quantized Index Modulation (QIM). The approach has clear practical advantages over biometric systems using "homomorphic encryption". Future work will deal with the application of the proposed solution to one-to-many biometric systems.
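    The Fuzzy Commitment scheme mentioned above can be sketched in a few lines: a random secret is bound to a biometric bit-string via XOR "helper data", and a noisy but close sample recovers the secret through an error-correcting code. A 3x repetition code stands in for a real ECC here, and all sizes are toy values:

```python
import hashlib

def encode(bits):
    # Repetition code, rate 1/3 (a stand-in for a real ECC such as BCH).
    return [b for b in bits for _ in range(3)]

def decode(bits):
    # Majority vote within each 3-bit group corrects single-bit errors.
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

def commit(secret_bits, biometric_bits):
    # Helper data = codeword XOR biometric; a hash of the secret is stored
    # so a later recovery attempt can be verified.
    code = encode(secret_bits)
    helper = [c ^ b for c, b in zip(code, biometric_bits)]
    return helper, hashlib.sha256(bytes(secret_bits)).hexdigest()

def open_commitment(helper, noisy_biometric):
    # helper XOR noisy sample = codeword with a few flipped bits; the ECC
    # absorbs the biometric variability.
    return decode([h ^ b for h, b in zip(helper, noisy_biometric)])

secret = [1, 0, 1, 1]
bio = [0, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1]   # enrollment sample (12 bits)
helper, check = commit(secret, bio)
noisy = bio[:]
noisy[2] ^= 1                                 # one bit differs at verification
recovered = open_commitment(helper, noisy)
print(recovered == secret)  # → True
```

    The offline attack the abstract warns about works by XOR-ing candidate biometrics from a database against the stored helper data until the hash check passes, which is the weakness the combination with the Blum-Goldwasser cryptosystem is meant to address.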

  5. A web-based approach for electrocardiogram monitoring in the home.

    PubMed

    Magrabi, F; Lovell, N H; Celler, B G

    1999-05-01

    A Web-based electrocardiogram (ECG) monitoring service, in which a longitudinal clinical record is used for the management of patients, is described. The Web application is used to collect clinical data from the patient's home. A database on the server acts as a central repository where this clinical information is stored. A Web browser provides access to the patient's records and ECG data. We discuss the technologies used to automate the retrieval and storage of clinical data from a patient database, and the recording and reviewing of clinical measurement data. On the client's Web browser, ActiveX controls embedded in the Web pages provide a link between the various components including the Web server, Web page, the specialised client-side ECG review and acquisition software, and the local file system. The ActiveX controls also implement FTP functions to retrieve and submit clinical data to and from the server. An intelligent software agent on the server is activated whenever new ECG data is sent from the home. The agent compares historical data with newly acquired data. Using this method, an optimum patient care strategy can be evaluated, and a summarised report along with reminders and suggestions for action is sent to the doctor and patient by email.

  6. Global Play Evaluation TOol (GPETO) assists Mobil explorationists with play evaluation and ranking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Withers, K.D.; Brown, P.J.; Clary, R.C.

    1996-01-01

    GPETO is a relational database and application containing information about over 2500 plays around the world. It also has information about approximately 30,000 fields and the related provinces. The GPETO application has been developed to assist Mobil geoscientists, planners and managers with global play evaluations and portfolio management. The main features of GPETO allow users to: (1) view or modify play and province information, (2) composite user-specified plays in a statistically valid way, (3) view threshold information for plays and provinces, including curves, (4) examine field size data, including discovered, future and ultimate field sizes for provinces and plays, (5) use a database browser to look up and validate data by geographic, volumetric, technical and business criteria, (6) display ranged values and graphical displays of future and ultimate potential for plays, provinces, countries, and continents, (7) run, view and print a number of informative reports containing input and output data from the system. The GPETO application is written in C and Fortran, runs on a UNIX-based system, utilizes an Ingres database, and was implemented using a 3-tiered client/server architecture.

  7. Global Play Evaluation TOol (GPETO) assists Mobil explorationists with play evaluation and ranking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Withers, K.D.; Brown, P.J.; Clary, R.C.

    1996-12-31

    GPETO is a relational database and application containing information about over 2500 plays around the world. It also has information about approximately 30,000 fields and the related provinces. The GPETO application has been developed to assist Mobil geoscientists, planners and managers with global play evaluations and portfolio management. The main features of GPETO allow users to: (1) view or modify play and province information, (2) composite user-specified plays in a statistically valid way, (3) view threshold information for plays and provinces, including curves, (4) examine field size data, including discovered, future and ultimate field sizes for provinces and plays, (5) use a database browser to look up and validate data by geographic, volumetric, technical and business criteria, (6) display ranged values and graphical displays of future and ultimate potential for plays, provinces, countries, and continents, (7) run, view and print a number of informative reports containing input and output data from the system. The GPETO application is written in C and Fortran, runs on a UNIX-based system, utilizes an Ingres database, and was implemented using a 3-tiered client/server architecture.

  8. 34 CFR 370.20 - What must be included in a request for a grant?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... SPECIAL EDUCATION AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION CLIENT ASSISTANCE PROGRAM How Does... rights of clients or client applicants within the State. (2) The authority to pursue remedies described... in the State will advise all clients and client applicants of the existence of the CAP, the services...

  9. Embedded neural recording with TinyOS-based wireless-enabled processor modules.

    PubMed

    Farshchi, Shahin; Pesterev, Aleksey; Nuyujukian, Paul; Guenterberg, Eric; Mody, Istvan; Judy, Jack W

    2010-04-01

    To create a wireless neural recording system that can benefit from the continuous advancements being made in embedded microcontroller and communications technologies, an embedded-system-based architecture for wireless neural recording has been designed, fabricated, and tested. The system consists of commercial off-the-shelf wireless-enabled processor modules (motes) for communicating the neural signals, and a back-end database server and client application for archiving and browsing the neural signals. A neural-signal-acquisition application has been developed to enable the mote to either acquire neural signals at a rate of 4000 12-bit samples per second, or detect and transmit spike heights and widths sampled at a rate of 16,670 12-bit samples per second on a single channel. The motes acquire neural signals via a custom low-noise neural-signal amplifier, with adjustable gain and high-pass corner frequency, designed and fabricated in a 1.5-µm CMOS process. In addition to browsing acquired neural data, the client application enables the user to remotely toggle modes of operation (real-time or spike-only), as well as amplifier gain and high-pass corner frequency.
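    The spike-only mode described above amounts to reporting a height and width for each suprathreshold excursion. A minimal sketch of such detection logic, with an illustrative threshold and function name (not from the paper):

```python
def detect_spikes(samples, threshold=2048):
    """Return (height, width) pairs for each suprathreshold excursion.

    `samples` are 12-bit ADC codes (0..4095); a spike is a maximal run of
    consecutive samples at or above `threshold`. The threshold value is an
    illustrative assumption, not a parameter from the paper.
    """
    spikes = []
    i, n = 0, len(samples)
    while i < n:
        if samples[i] >= threshold:
            j = i
            while j < n and samples[j] >= threshold:
                j += 1
            height = max(samples[i:j])   # peak ADC code within the excursion
            width = j - i                # duration in samples (divide by Fs for seconds)
            spikes.append((height, width))
            i = j
        else:
            i += 1
    return spikes

# At the 16,670-Hz spike-detection rate, a width of w samples is w / 16670 seconds.
wave = [0, 0, 2100, 3000, 2500, 0, 0, 2200, 0]
spikes = detect_spikes(wave)
```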

  10. Preparing College Students To Search Full-Text Databases: Is Instruction Necessary?

    ERIC Educational Resources Information Center

    Riley, Cheryl; Wales, Barbara

    Full-text databases allow Central Missouri State University's clients to access some of the serials that libraries have had to cancel due to escalating subscription costs; EbscoHost, the subject of this study, is one such database. The database is available free to all Missouri residents. A survey was designed consisting of 21 questions intended…

  11. An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nicholas; Sellis, Timos

    1994-01-01

    We investigated a number of design and performance issues of interoperable database management systems (DBMSs). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMSs, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMSs. Experiments and simulations were then run to compare its performance with standard client-server architectures. The focus of this research was on adaptive optimization methods for heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect the actual values, as opposed to static ones computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback both for adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the distributions and selectivities, we use curve-fitting techniques, such as least squares and splines, to regress on these values.
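    The query-feedback idea for selectivities can be sketched with a simple least-squares fit; the linear model, function name and feedback values below are illustrative (the authors also mention splines):

```python
def fit_line(points):
    """Closed-form least-squares fit y = a*x + b over (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Feedback from prior executions: (predicate value, observed selectivity).
# Instead of a static off-line estimate, the estimator is refit as feedback arrives.
feedback = [(10, 0.12), (20, 0.22), (30, 0.31), (40, 0.42)]
a, b = fit_line(feedback)
estimate = a * 25 + b   # refined selectivity estimate for predicate value 25
```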

  12. Markerless motion capture systems as training device in neurological rehabilitation: a systematic review of their use, application, target population and efficacy.

    PubMed

    Knippenberg, Els; Verbrugghe, Jonas; Lamers, Ilse; Palmaers, Steven; Timmermans, Annick; Spooren, Annemie

    2017-06-24

    Client-centred task-oriented training is important in neurological rehabilitation but is time-consuming and costly in clinical practice. The use of technology, especially motion capture systems (MCS), which are low cost and easy to apply in clinical practice, may support this kind of training, but knowledge and evidence of their use for training are scarce. The present review aims to investigate 1) which motion capture systems are used as training devices in neurological rehabilitation, 2) how they are applied, 3) in which target populations, 4) what the content of the training is, and 5) what the efficacy of training with MCS is. A computerised systematic literature review was conducted in four databases (PubMed, Cinahl, Cochrane Database and IEEE). The following MeSH terms and key words were used: Motion, Movement, Detection, Capture, Kinect, Rehabilitation, Nervous System Diseases, Multiple Sclerosis, Stroke, Spinal Cord, Parkinson Disease, Cerebral Palsy and Traumatic Brain Injury. The Van Tulder quality assessment was used to score the methodological quality of the selected studies. The descriptive analysis is reported by MCS, target population, training parameters and training efficacy. Eighteen studies were selected (mean Van Tulder score = 8.06 ± 3.67). Based on methodological quality, six studies were selected for analysis of training efficacy. The most commonly used MCS was the Microsoft Kinect; training was mostly conducted in upper-limb stroke rehabilitation. Training programs varied in intensity, frequency and content. None of the studies reported an individualised training program based on a client-centred approach. Motion capture systems are training devices with potential in neurological rehabilitation to increase motivation during training and may assist improvement on one or more International Classification of Functioning, Disability and Health (ICF) levels. Although client-centred task-oriented training is important in neurological rehabilitation, the client-centred approach was not included in the reviewed studies. Future technological developments should take up the challenge of combining MCS with the principles of a client-centred task-oriented approach and prove efficacy using randomised controlled trials with long-term follow-up. Prospero registration number: 42016035582.

  13. Elements affecting wound healing time: An evidence based analysis.

    PubMed

    Khalil, Hanan; Cullen, Marianne; Chambers, Helen; Carroll, Matthew; Walker, Judi

    2015-01-01

    The purpose of this study was to identify the predominant client factors and comorbidities that affected the time taken for wounds to heal. A prospective study design used the Mobile Wound Care (MWC) database to capture and collate detailed medical histories, comorbidities, healing times and consumable costs for clients with wounds in Gippsland, Victoria. There were 3,726 wounds documented from 2,350 clients, an average of 1.6 wounds per client. Half (49.6%) of all clients were female, indicating no gender difference in wound prevalence. The clients were primarily older people, with an average age of 64.3 years (range 0.7 to 102.9 years). The majority of the wounds (56%) were acute, described as surgical, crush and trauma. The MWC database categorized the elements that influenced wound healing into three groups: factors affecting healing (FAH), comorbidities, and medications known to affect wound healing. While there were a multitude of significant associations, multiple linear regression identified the following key elements: age over 65 years, obesity, nonadherence to the treatment plan, peripheral vascular disease, specific wounds associated with pressure/friction/shear, confirmed infection, and cerebrovascular accident (stroke). Wound healing is a complex process that requires a thorough understanding of influencing elements to improve healing times. © 2015 by the Wound Healing Society.

  14. Medicaid care management: description of high-cost addictions treatment clients.

    PubMed

    Neighbors, Charles J; Sun, Yi; Yerneni, Rajeev; Tesiny, Ed; Burke, Constance; Bardsley, Leland; McDonald, Rebecca; Morgenstern, Jon

    2013-09-01

    High utilizers of alcohol and other drug treatment (AODTx) services are a priority for healthcare cost control. We examine characteristics of Medicaid-funded AODTx clients, comparing three groups: individuals <90th percentile of AODTx expenditures (n=41,054); high-cost clients in the top decile of AODTx expenditures (HC; n=5,718); and 1760 enrollees in a chronic care management (CM) program for HC clients implemented in 22 counties in New York State. Medicaid and state AODTx registry databases were combined to draw demographic, clinical, social needs and treatment history data. HC clients accounted for 49% of AODTx costs funded by Medicaid. As expected, HC clients had significant social welfare needs, comorbid medical and psychiatric conditions, and use of inpatient services. The CM program was successful in enrolling some high-needs, high-cost clients but faced barriers to reaching the most costly and disengaged individuals. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Supervision of tunnelling constructions and software used for their evaluation

    NASA Astrophysics Data System (ADS)

    Caravanas, Aristotelis; Hilar, Matous

    2017-09-01

    Supervision is a common instrument for controlling the construction of tunnels. The supervision procedure is adapted to suit each project's purposes, reflecting local conditions, habits, codes and the way a particular tunnelling project is allocated. The duties of tunnel supervision are specified in an agreement with the client and can include a wide range of activities. On large-scale tunnelling projects the supervision tasks are performed by a large number of people of different professions, so teamwork, smooth communication and coordination are required to fulfil them successfully. The efficiency and quality of tunnel supervision work are enhanced when specialised software applications are used. Such applications should allow on-line data management and the prompt evaluation, reporting and sharing of relevant construction information and other aspects. The client is provided with an as-built database containing all the relevant information related to the construction process, which is a valuable tool for claim management as well as for the evaluation of structural defects that may occur in the future. As a result, the level of risk related to tunnel construction is decreased.

  16. The research and implementation of PDM systems based on the .NET platform

    NASA Astrophysics Data System (ADS)

    Gao, Hong-li; Jia, Ying-lian; Yang, Ji-long; Jiang, Wei

    2005-12-01

    A new kind of PDM system scheme based on the .NET platform, designed to solve application problems of current PDM systems in enterprises, is described. The key technologies of this system, such as .NET, data access, information processing and the Web, are discussed. The 3-tier architecture of a PDM system based on a mixed C/S and B/S mode is presented. In this system, all users share the same database server in order to ensure the coherence and safety of client data. ADO.NET leverages the power of XML to provide disconnected access to data, which frees the connection to be used by other clients; using this approach, system performance was improved. Moreover, the important function modules of a PDM system, such as project management, product structure management and document management, were developed and realized.

  17. Using the STOQS Web Application for Access to in situ Oceanographic Data

    NASA Astrophysics Data System (ADS)

    McCann, M. P.

    2012-12-01

    With the increasing measurement and sampling capabilities of autonomous oceanographic platforms (e.g., gliders, autonomous underwater vehicles, Wavegliders), the need to efficiently access and visualize the data they collect is growing. The Monterey Bay Aquarium Research Institute has designed and built the Spatial Temporal Oceanographic Query System (STOQS) specifically to address this issue. The need for STOQS arises from inefficiencies discovered in using CF-NetCDF point observation conventions for these data: access efficiency decreases with decreasing dimension of CF-NetCDF data. For example, the Trajectory Common Data Model feature type has only one coordinate dimension, usually time; positions of the trajectory (depth, latitude, longitude) are stored as non-indexed record variables within the NetCDF file. If client software needs to access data between two depth values or from a bounded geographic area, the whole data set must be read and the selection made within the client software, which is very inefficient. What is needed is a way to easily select data of interest from an archive given any number of spatial, temporal, or other constraints; geospatial relational database technology provides this capability. The full STOQS application consists of a Postgres/PostGIS database, MapServer, and Python-Django running on a server, and Web 2.0 technology (jQuery, OpenLayers, Twitter Bootstrap) running in a modern web browser. The web application provides faceted search capabilities allowing a user to quickly drill into the data of interest. Data selection can be constrained by spatial, temporal, and depth selections as well as by parameter value and platform name. 
The web application layer also provides a REST (Representational State Transfer) Application Programming Interface, allowing tools such as the Matlab stoqstoolbox to retrieve data directly from the database. STOQS is an open source software project built upon a framework of free and open source software and is available for anyone to use for making their data more accessible and usable. For more information please see: http://code.google.com/p/stoqs/.
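    As a sketch of how a client such as the stoqstoolbox might compose a REST query against a faceted-search API of this kind, the following builds a constrained query URL; the endpoint path and parameter names are assumptions for illustration, not the documented STOQS API:

```python
from urllib.parse import urlencode

def build_query(base, **constraints):
    """Compose a faceted-search URL from spatial/temporal/parameter constraints.

    Constraints are passed as keyword arguments and sorted for a stable URL.
    """
    return base + "?" + urlencode(sorted(constraints.items()))

# Hypothetical host and parameter names: request chlorophyll measurements
# restricted to the upper 5 m of the water column.
url = build_query(
    "http://example.org/stoqs/api/measuredparameter.json",
    parameter__name="mass_concentration_of_chlorophyll_in_sea_water",
    measurement__depth__lte=5,
)
```

A real client would then fetch `url` and decode the JSON response; only the URL construction is shown here.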

  18. RPPAML/RIMS: A metadata format and an information management system for reverse phase protein arrays

    PubMed Central

    Stanislaus, Romesh; Carey, Mark; Deus, Helena F; Coombes, Kevin; Hennessy, Bryan T; Mills, Gordon B; Almeida, Jonas S

    2008-01-01

    Background Reverse Phase Protein Arrays (RPPA) are convenient assay platforms to investigate the presence of biomarkers in tissue lysates. As with other high-throughput technologies, substantial amounts of analytical data are generated. Over 1000 samples may be printed on a single nitrocellulose slide, and up to 100 different proteins may be assessed using immunoperoxidase or immunofluorescence techniques in order to determine relative amounts of protein expression in the samples of interest. Results In this report an RPPA Information Management System (RIMS) is described and made available as open source software. To implement the proposed system, we propose a metadata format known as Reverse Phase Protein Array Markup Language (RPPAML). RPPAML would enable researchers to describe, document and disseminate RPPA data. The complexity of the data structure needed to describe the results, and the graphic tools necessary to visualize them, require a software deployment distributed between client and server applications. This was achieved without sacrificing interoperability between individual deployments through the use of an open source semantic database, S3DB. This data service backbone is available to multiple client-side applications that can also access other server-side deployments. The RIMS platform was designed to interoperate with other data analysis and data visualization tools such as Cytoscape. Conclusion The proposed RPPAML data format aims to standardize RPPA data. Standardization would allow diverse client applications to operate on the same set of data; additionally, having data in a standard format would enable data dissemination and data analysis. PMID:19102773

  19. Multi-resolution extension for transmission of geodata in a mobile context

    NASA Astrophysics Data System (ADS)

    Follin, Jean-Michel; Bouju, Alain; Bertrand, Frédéric; Boursier, Patrice

    2005-03-01

    A solution is proposed for the management of multi-resolution vector data in a mobile spatial information visualization system. The client-server architecture and the models of data and transfer of the system are presented first. The aim of this system is to reduce data exchanged between client and server by reusing data already present on the client side. Then, an extension of this system to multi-resolution data is proposed. Our solution is based on the use of increments in a multi-scale database. A database architecture where data sets for different predefined scales are precomputed and stored on the server side is adopted. In this model, each object representing the same real world entities at different levels of detail has to be linked beforehand. Increments correspond to the difference between two datasets with different levels of detail. They are transmitted in order to increase (or decrease) the detail to the client upon request. They include generalization and refinement operators allowing transitions between the different levels. Finally, a framework suited to the transfer of multi-resolution data in a mobile context is presented. This allows reuse of data locally available at different levels of detail and, in this way, reduces the amount of data transferred between client and server.
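    The increment mechanism described above can be sketched as a diff between two precomputed level-of-detail datasets; the dataset representation and names below are illustrative (object deletion on generalization is omitted for brevity):

```python
def increment(coarse, fine):
    """Operators for moving between two linked LOD datasets.

    `refine` carries the objects to transmit for coarse -> fine;
    `generalize` carries the geometries to restore for fine -> coarse.
    Datasets map entity ids to geometries (here, lists of points).
    """
    refine = {oid: geom for oid, geom in fine.items()
              if coarse.get(oid) != geom}           # new or more detailed objects
    generalize = {oid: coarse[oid] for oid in coarse
                  if fine.get(oid) != coarse[oid]}  # coarser geometries to restore
    return refine, generalize

def apply_ops(dataset, ops):
    """Apply increment operators to a client-side dataset, reusing local data."""
    out = dict(dataset)
    out.update(ops)
    return out

# Hypothetical example: a road simplified at LOD 1, detailed (plus a lane) at LOD 2.
lod1 = {"road42": [(0, 0), (10, 10)]}
lod2 = {"road42": [(0, 0), (4, 6), (10, 10)], "lane7": [(1, 1), (2, 2)]}
refine, generalize = increment(lod1, lod2)
```

Only the increment is transferred, so data already held at the coarser level is reused on the client.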

  20. Implementing a Dynamic Database-Driven Course Using LAMP

    ERIC Educational Resources Information Center

    Laverty, Joseph Packy; Wood, David; Turchek, John

    2011-01-01

    This paper documents the formulation of a database driven open source architecture web development course. The design of a web-based curriculum faces many challenges: a) relative emphasis of client and server-side technologies, b) choice of a server-side language, and c) the cost and efficient delivery of a dynamic web development, database-driven…

  1. maxdLoad2 and maxdBrowse: standards-compliant tools for microarray experimental annotation, data management and dissemination.

    PubMed

    Hancock, David; Wilson, Michael; Velarde, Giles; Morrison, Norman; Hayes, Andrew; Hulme, Helen; Wood, A Joseph; Nashar, Karim; Kell, Douglas B; Brass, Andy

    2005-11-03

    maxdLoad2 is a relational database schema and Java application for microarray experimental annotation and storage. It is compliant with all standards for microarray meta-data capture, including the specification of what data should be recorded, extensive use of standard ontologies and support for data exchange formats. The output from maxdLoad2 is of a form acceptable for submission to the ArrayExpress microarray repository at the European Bioinformatics Institute. maxdBrowse is a PHP web application that makes the contents of maxdLoad2 databases accessible via web browser, the command line and web-service environments; it thus acts as both a dissemination and a data-mining tool. maxdLoad2 presents an easy-to-use interface to an underlying relational database and provides a full complement of facilities for browsing, searching and editing. There is a tree-based visualization of data connectivity and the ability to explore the links between any pair of data elements, irrespective of how many intermediate links lie between them. Its principal novel features are: the flexibility of the meta-data that can be captured; the tools provided for importing data from spreadsheets and other tabular representations; the tools provided for the automatic creation of structured documents; and the ability to browse and access the data via web and web-services interfaces. Within maxdLoad2 it is very straightforward to customise the meta-data that is being captured or to change the definitions of the meta-data. These meta-data definitions are stored within the database itself, allowing client software to connect properly to a modified database without having to be specially configured. The meta-data definitions (configuration file) can also be centralized, allowing changes made in response to revisions of standards or terminologies to be propagated to clients without user intervention. maxdBrowse is hosted on a web server and presents multiple interfaces to the contents of maxd databases. 
It emulates many of the browse and search features available in the maxdLoad2 application via a web browser, allowing users who are not familiar with maxdLoad2 to browse and export microarray data from the database for their own analysis. The same browse and search features are also available via command-line and SOAP server interfaces. This enables both the scripting of data export for embedding in data repositories and analysis environments, and access to the maxd databases via web-service architectures. maxdLoad2 (http://www.bioinf.man.ac.uk/microarray/maxd/) and maxdBrowse (http://dbk.ch.umist.ac.uk/maxdBrowse) are portable and compatible with all common operating systems and major database servers. They provide a powerful, flexible package for the annotation of microarray experiments and a convenient dissemination environment. They are available for download and are open sourced under the Artistic License.

  2. The real relationship inventory: development and psychometric investigation of the client form.

    PubMed

    Kelley, Frances A; Gelso, Charles J; Fuertes, Jairo N; Marmarosh, Cheri; Lanier, Stacey Holmes

    2010-12-01

    The development and validation of a client version of the Real Relationship Inventory (RRI-C) is reported. Using a sample of clients (n = 94) who were currently in psychotherapy, a 24-item measure was developed consisting of two subscales (Realism and Genuineness) and a total score. This 24-item version and other measures used for validation were completed by 93 additional clients. Results of the present study offer initial support for the validity and reliability of the RRI-C. The RRI-C correlated significantly in theoretically expected ways with measures of the client-rated working alliance and therapists' congruence, clients' observing ego, and client ratings of client and therapist real relationship on an earlier measure of the real relationship (Eugster & Wampold, 1996). A nonsignificant relation was found between the RRI-C and a measure of social desirability, providing support for discriminant validity. A confirmatory factor analysis supported the two theorized factors of the RRI-C. The authors discuss the importance of measuring clients' perceptions of the real relationship. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  3. Adverse events among Ontario home care clients associated with emergency room visit or hospitalization: a retrospective cohort study

    PubMed Central

    2013-01-01

    Background Home care (HC) is a critical component of the ongoing restructuring of healthcare in Canada. It impacts three dimensions of healthcare delivery: primary healthcare, chronic disease management, and aging-at-home strategies. The purpose of our study is to investigate a significant safety dimension of HC: the occurrence of adverse events and their related outcomes. The study reports on the incidence of HC adverse events, the magnitude of the events, the types of events that occur, and the consequences experienced by HC clients in the province of Ontario. Methods A retrospective cohort design was used, utilizing comprehensive secondary databases available for Ontario HC clients for the years 2008 and 2009. The data were derived from the Canadian Home Care Reporting System, the Hospital Discharge Abstract Database, the National Ambulatory Care Reporting System, the Ontario Mental Health Reporting System, and the Continuing Care Reporting System. Descriptive analysis was used to identify the type and frequency of the adverse events recorded and the consequences of the events. Logistic regression analysis was used to examine the association between the events and their consequences. Results The study found that the incidence rate of adverse events for the HC clients included in the cohort was 13%. The most frequent adverse events identified in the databases were injurious falls, injuries other than from a fall, and medication-related incidents. With respect to outcomes, we determined that an injurious fall was associated with a significant increase in the odds of a client requiring long-term-care facility admission and of client death. We further determined that three types of events, delirium, sepsis, and medication-related incidents, were associated directly with an increase in the odds of client death. Conclusions Our study concludes that 13% of clients in home care experience an adverse event annually. 
We also determined that an injurious fall was the most frequent of the adverse events and was associated with increased admission to long-term care or death. We recommend the use of tools that are presently available in Canada, such as the Resident Assessment Instrument and its Clinical Assessment Protocols, for assessing and mitigating the risk of an adverse event occurring. PMID:23800280

  4. Community ambulation: influences on therapists and clients reasoning and decision making.

    PubMed

    Corrigan, Rosemary; McBurney, Helen

    2008-01-01

    Community ambulation is an important element of a rehabilitation training programme and its achievement is a goal shared by rehabilitation professionals and clients. The factors that influence a physiotherapist's or other health professional's decision making around preparing a client for community ambulation, and the factors that influence a client's decision to return to walking in their community, are unclear. To review the available literature about the factors that have influenced the reasoning and decision making of rehabilitation therapists and clients around the topic of ambulation in the community, three separate searches were undertaken using the Ovid, Cinahl, ProQuest, Medline and Ebscohost databases. Databases were searched from 1966 to October 2006. The first search explored the literature for factors that influence the clinical reasoning of rehabilitation therapists. The second search explored the literature for factors that influence a client's decision to ambulate in the community. A third search explored the literature on the demands of community ambulation in rural communities. Very few studies were found that explored community ambulation in the context of clinical reasoning and decision making, the facilitators of and barriers to a client's return to ambulation in their community, or the demands of ambulation in a rural community. Consideration of the environment is key to a successful return to walking in the community for clients with mobility problems, yet little literature was found to guide physiotherapists' decision making about preparing a client to return to walking in the community. An individual's participation in their society is also a result of the interaction between their personal characteristics and their environment. The influence of these characteristics may vary from one individual to another, yet the factors that influence a person's decision to return to walking in their community after stroke remain unclear.

  5. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    NASA Technical Reports Server (NTRS)

    Baxes, Gregory; Mixon, Brian; Linger, TIm

    2013-01-01

    Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets (in particular, massively sized datasets) has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data-server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level-of-detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. 
The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics, and yields significant improvements in user-interactive geospatial client and data server interaction and associated network bandwidth requirements. The innovation uses a C- or PHP-like grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data is structured in a quadtree format: as the user zooms into the dataset, geographic regions are subdivided into four child regions; conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region.
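    The cascading-KML strategy can be sketched as a server-side generator that, for each requested tile, emits four Region-gated NetworkLinks for the child quadrants; the tile URL pattern and the minLodPixels value below are illustrative assumptions:

```python
def child_boxes(n, s, e, w):
    """Quadtree split: four (north, south, east, west) child regions."""
    midlat, midlon = (n + s) / 2, (e + w) / 2
    return [(n, midlat, e, midlon), (n, midlat, midlon, w),
            (midlat, s, e, midlon), (midlat, s, midlon, w)]

def network_link(n, s, e, w, level):
    """A NetworkLink that fetches a higher-LOD tile only when its Region is active."""
    return f"""<NetworkLink>
  <Region>
    <LatLonAltBox><north>{n}</north><south>{s}</south>
      <east>{e}</east><west>{w}</west></LatLonAltBox>
    <Lod><minLodPixels>128</minLodPixels></Lod>
  </Region>
  <Link><href>tiles?l={level}&amp;n={n}&amp;s={s}&amp;e={e}&amp;w={w}</href>
    <viewRefreshMode>onRegion</viewRefreshMode></Link>
</NetworkLink>"""

def tile_kml(n, s, e, w, level):
    """KML document for one tile: four child NetworkLinks for the next LOD."""
    links = "\n".join(network_link(*box, level + 1) for box in child_boxes(n, s, e, w))
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
            f"{links}\n</Document></kml>")

doc = tile_kml(40.0, 30.0, -100.0, -110.0, level=0)
```

Because each child link fires only when its Region becomes active in the viewer, the client walks down the quadtree as the user zooms in, never requesting detail it cannot display.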

  6. The database on transgenic luminescent microorganisms as an instrument of studying a microbial component of closed ecosystems

    NASA Astrophysics Data System (ADS)

    Boyandin, A. N.; Lankin, Y. P.; Kargatova, T. V.; Popova, L. Y.; Pechurkin, N. S.

    Luminescent transgenic microorganisms are widely used to study the functioning of microbial communities, including closed ones. Bioluminescence is highly sensitive to the effects of different environmental factors, and integrating lux genes into different metabolic pathways allows many aspects of microbial life to be studied, permitting measurements in situ. Much information is available about applications of bioluminescent bacteria in different research areas, but using these data effectively requires summarizing and accumulating them in a common source. An information system on the characteristics of transgenic microorganisms with cloned lux genes was therefore created, comprising a database and related client software. The database structure includes information on the common characteristics of cloned lux genes, their sources and properties, the regulation of gene expression in bacterial cells, and the dependence of bioluminescence on biotic, abiotic and anthropogenic environmental factors. The database can also store descriptions of changes in bacterial populations in response to environmental changes. It allows storing and using bibliographic information as well as links to the web sites of world collections of microorganisms. Internet publishing software that opens access to the database through the Internet has been developed.

  7. Ontology-based geospatial data query and integration

    USGS Publications Warehouse

    Zhao, T.; Zhang, C.; Wei, M.; Peng, Z.-R.

    2008-01-01

Geospatial data sharing is an increasingly important subject as large amounts of data are produced by a variety of sources, stored in incompatible formats, and accessed through different GIS applications. Past efforts to enable sharing have produced standardized data formats such as GML and data access protocols such as Web Feature Service (WFS). While these standards help client applications gain access to heterogeneous data stored in different formats from diverse sources, the usability of that access is limited by the lack of data semantics encoded in the WFS feature types. Past research has used ontology languages to describe the semantics of geospatial data, but ontology-based queries cannot be applied directly to legacy data stored in databases or shapefiles, or to feature data in WFS services. This paper presents a method to enable ontology queries on spatial data available from WFS services and on data stored in databases. We do not create ontology instances explicitly and thus avoid the problems of data replication. Instead, user queries are rewritten into WFS GetFeature requests and SQL queries to databases. The method also has the benefit of being able to use existing database, WFS, and GML tools while enabling queries based on ontology semantics. © 2008 Springer-Verlag Berlin Heidelberg.
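The core rewriting idea — translating an ontology-level query into a standard WFS GetFeature request — can be sketched as follows. This is a simplified illustration, not the paper's actual rewriting rules: the mapping table, feature type name, attribute names, and function names are all hypothetical, and a real system would handle far richer query patterns.

```python
from urllib.parse import urlencode

# Hypothetical mapping from ontology terms to WFS feature types and
# attributes; the paper's rewriting machinery is far more general.
ONTOLOGY_TO_WFS = {
    "River": ("hydro:river", {"hasName": "NAME"}),
}

def rewrite_to_getfeature(base_url, onto_class, onto_prop, value):
    """Rewrite a simple ontology triple pattern into a WFS GetFeature URL."""
    type_name, props = ONTOLOGY_TO_WFS[onto_class]
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": type_name,
        # OGC Filter Encoding; the attribute name comes from the mapping
        "filter": (f"<Filter><PropertyIsEqualTo>"
                   f"<PropertyName>{props[onto_prop]}</PropertyName>"
                   f"<Literal>{value}</Literal>"
                   f"</PropertyIsEqualTo></Filter>"),
    }
    return base_url + "?" + urlencode(params)
```

Because the rewriting happens at request time, no ontology instances are materialized, which is exactly how the paper avoids data replication.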

  8. Infectious Disease Information Collection System at the Scene of Disaster Relief Based on a Personal Digital Assistant.

    PubMed

    Li, Ya-Pin; Gao, Hong-Wei; Fan, Hao-Jun; Wei, Wei; Xu, Bo; Dong, Wen-Long; Li, Qing-Feng; Song, Wen-Jing; Hou, Shi-Ke

    2017-12-01

The objective of this study was to build a database for collecting infectious disease information at the scene of a disaster, using 128 epidemiological questionnaires and 47 types of options, and to enable rapid acquisition of infectious disease information and rapid questionnaire customization at the scene of disaster relief by use of a personal digital assistant (PDA). SQL Server 2005 (Microsoft Corp, Redmond, WA) was used to create the option database for the infectious disease investigation, to develop a client application for the PDA, and to deploy the application on the server side. Users accessed the server for data collection and questionnaire customization with the PDA. A database with a comprehensive set of options was created, and an application system was developed for the Android operating system (Google Inc, Mountain View, CA). On this basis, an infectious disease information collection system was built for use at the scene of disaster relief. The system integrates computer technology and mobile communication technology to collect infectious disease information and to allow rapid questionnaire customization at the scene of disaster relief. (Disaster Med Public Health Preparedness. 2017;11:668-673).

  9. Modis, SeaWIFS, and Pathfinder funded activities

    NASA Technical Reports Server (NTRS)

    Evans, Robert H.

    1995-01-01

    MODIS (Moderate Resolution Imaging Spectrometer), SeaWIFS (Sea-viewing Wide Field Sensor), Pathfinder, and DSP (Digital Signal Processor) objectives are summarized. An overview of current progress is given for the automatic processing database, client/server status, matchup database, and DSP support.

  10. Fostering engagement during termination: Applying attachment theory and research.

    PubMed

    Marmarosh, Cheri L

    2017-03-01

Therapists often struggle to determine the most important things to focus on during termination. Reviewing the treatment, identifying plans for the future, summarizing positive gains, and saying goodbye receive the most attention. Despite our best intentions, termination can end up becoming intellectualized. Attachment theory and recent developments in neuroscience offer us a road map for facilitating endings that address clients' underlying relational needs, direct us to foster engagement, and help us facilitate new relational experiences that can be transformative for clients. We argue that endings in therapy activate clients' and therapists' attachments, and that these endings trigger emotion-regulating strategies that can elicit clients' engagement or more defensiveness. The current paper highlights, through de-identified case examples, how clients automatically respond to termination and how therapists can foster rich relational experiences in the here-and-now that clients can take with them. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  11. Clients' and therapists' real relationship and session quality in brief therapy: an actor partner interdependence analysis.

    PubMed

    Markin, Rayna D; Kivlighan, Dennis M; Gelso, Charles J; Hummel, Ann M; Spiegel, Eric B

    2014-09-01

    This study used the Actor Partner Interdependence Model (APIM; Kenny & Cook, 1999) to examine the associations of client- and therapist-rated real relationship (RR) and session quality over time. Eighty-seven clients and their therapists (n = 25) completed RR and session quality measures after every session of brief therapy. Therapists' current session quality ratings were significantly related to all of the following: session number (b = .04), their session quality rating of the previous session (b = .24), their RR in the previous session (b = 1.091), their client's RR in the previous session (b = .17), and interactions between their own and their clients' RR and session number (b = -.16 and β = -.04, respectively). Clients' ratings of current session quality were significantly related to only their own RR in the previous session (b = .47). Implications for future research and practice are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  12. Systematic plan of building Web geographic information system based on ActiveX control

    NASA Astrophysics Data System (ADS)

    Zhang, Xia; Li, Deren; Zhu, Xinyan; Chen, Nengcheng

    2003-03-01

A systematic plan for building a Web Geographic Information System (WebGIS) using ActiveX technology is proposed in this paper. In the proposed plan, ActiveX control technology is adopted for building the client-side application, and two different schemas are introduced to implement communication between controls in the user's browser and the middle application server. One is based on the Distributed Component Object Model (DCOM), the other on sockets. In the former schema, the middle service application is developed as a DCOM object that communicates with the ActiveX control through Object Remote Procedure Call (ORPC) and accesses data in the GIS Data Server through Open Database Connectivity (ODBC). In the latter, the middle service application is developed in Java; it communicates with the ActiveX control through sockets based on TCP/IP and accesses data in the GIS Data Server through Java Database Connectivity (JDBC). The first schema is usually developed in C/C++ and is difficult to develop and deploy. The second is relatively easy to develop, but its data-transfer performance depends on Web bandwidth. A sample application was developed using the latter schema, and its performance proved better, to some degree, than that of some other WebGIS applications.
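The socket-based schema — a client-side control sending a query string over TCP to a middle server that would normally forward it to the GIS data server — can be sketched in miniature. This is a hedged illustration only: the paper's middle tier is written in Java and talks JDBC, whereas this sketch uses Python, a trivial one-line request protocol, and a canned response in place of a real database call.

```python
import socket
import threading

# Sketch of the socket-based middle tier: the client-side control sends a
# query over TCP; the middle server would normally run it against the GIS
# data server (via JDBC in the paper) -- here it returns a canned answer.

def middle_server(srv):
    conn, _ = srv.accept()
    with conn:
        query = conn.recv(1024).decode().strip()
        # A real middle tier would execute this against the GIS database.
        conn.sendall(f"RESULT for {query}\n".encode())

def client_request(port, query):
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(query.encode() + b"\n")
        return c.recv(1024).decode().strip()

def demo():
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))          # ephemeral port
    srv.listen(1)
    port = srv.getsockname()[1]
    t = threading.Thread(target=middle_server, args=(srv,))
    t.start()
    reply = client_request(port, "SELECT roads")
    t.join()
    srv.close()
    return reply
```

The design trade-off the paper notes falls out of this shape: the protocol is simple and portable, but every result set crosses the network, so perceived performance tracks Web bandwidth.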

  13. A Client/Server Architecture for Supporting Science Data Using EPICS Version 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dalesio, Leo

    2015-04-21

The Phase 1 grant that serves as a precursor to this proposal prototyped complex storage techniques for the high-speed structured data produced in accelerator diagnostics and beam line experiments. It demonstrated the technologies that can be used to archive and retrieve complex data structures and to provide the performance required by new accelerators, instrumentation, and detectors. Phase 2 is proposed to develop a high-performance platform for data acquisition and analysis that gives physicists and operators a better understanding of beam dynamics. The proposal includes developing a platform for reading 109 MHz data at 10 KHz rates through a multicore front-end processor and archiving the data to a repository that is indexed for fast retrieval. Data retrieved from this archive is integrated with scalar data to provide data sets to client applications for analysis, for use in feedback, and to aid in identifying problems with the instrumentation, plant, beam steering, or model. This development is built on EPICS version 4, which is being successfully deployed to implement physics applications. Through prior SBIR grants, EPICS version 4 has a solid communication protocol for middle-layer services (PVAccess), structured data representation and methods for efficient transport and access (PVData), an operational hierarchical record environment (JAVA IOC), and prototypes for standard structured data (Normative Types). This work was further developed through project funding to successfully deploy the first service-based physics application environment, with demonstrated services that provide arbitrary object views, save sets, model, lattice, and unit conversion. Thin-client physics applications have been developed in Python that implement quad centering, orbit display, bump control, and slow orbit feedback. This service-based architecture has provided a very modular and robust environment that enables commissioning teams to rapidly develop and deploy small scripts that build on powerful services. These services are all built on relational database data stores and scalar data. The work proposed herein builds on these previous successes to provide acquisition of high-speed data for online analysis clients.

  14. Addressing clients' racism and racial prejudice in individual psychotherapy: Therapeutic considerations.

    PubMed

    Bartoli, Eleonora; Pyati, Aarti

    2009-06-01

    Psychotherapists lack clear guidelines regarding how to address clients' racist and prejudicial comments in clinical work. The authors explore the contributions of multicultural, social justice, feminist, and ethical theories to the field of psychotherapy and apply these theories to 2 clinical vignettes in which clients made racially charged statements. These clinical examples highlight the importance of using racial, in addition to traditional, theories to decipher the clinical meanings of racial comments and dynamics in clinical work. The article provides therapeutic conceptualizations regarding how to address clients' racist and prejudicial comments in psychotherapy and elaborates on the complex meanings that might arise from engaging in racially charged discussions with clients depending on the racial composition of the therapeutic dyad. In addition to highlighting how social justice, multicultural, and feminist lenses are necessary to fully understand the meaning of clients' comments, the argument is made that addressing clients' racist and prejudicial comments is at once a clinical and a social justice issue. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  15. Integrating a local database into the StarView distributed user interface

    NASA Technical Reports Server (NTRS)

    Silberberg, D. P.

    1992-01-01

A distributed user interface to the Space Telescope Data Archive and Distribution Service (DADS) known as StarView is being developed. The DADS architecture consists of the data archive as well as a relational database catalog describing the archive. StarView is a client/server system in which the user interface is the front-end client to the DADS catalog and archive servers. Users query the DADS catalog from the StarView interface. Query commands are transmitted via a network and evaluated by the database; the results are returned via the network and displayed on StarView forms. Based on the results, users decide which data sets to retrieve from the DADS archive. Archive requests are packaged by StarView and sent to DADS, which returns the requested data sets to the users. The advantages of distributed client/server user interfaces over traditional one-machine systems are well known. Since users run software on machines separate from the database, the overall client response time is much faster; and since the server is free to process only database requests, the database response time is much faster. Disadvantages inherent in this architecture are slow overall database access time due to network delays, the lack of a 'get previous row' command, and the fact that refinements of a previously issued query must be resubmitted to the database server even though the domain of values has already been returned by the previous query. This architecture also does not allow users to cross-correlate DADS catalog data with other catalogs. Clearly, a distributed user interface would be more powerful if it overcame these disadvantages. A local database is being integrated into StarView to overcome them. When a query is made through a StarView form, which is often composed of fields from multiple tables, it is translated to an SQL query and issued to the DADS catalog. At the same time, a local database table is created to contain the resulting rows of the query.
The returned rows are displayed on the form as well as inserted into the local database table. Identical results are produced by reissuing the query to either the DADS catalog or the local table. Relational databases do not provide a 'get previous row' function because of the inherent complexity of retrieving previous rows of multiple-table joins. However, since this function is easily implemented on a single table, StarView uses the local table to retrieve the previous row. Also, StarView issues subsequent query refinements to the local table instead of the DADS catalog, eliminating the network transmission overhead. Finally, other catalogs can be imported into the local database for cross-correlation with local tables. Overall, it is believed that this is a more powerful architecture for distributed database user interfaces.
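The local-table idea — cache the remote result set once, then serve "previous row" navigation and query refinements locally — can be sketched with an in-memory SQLite table. This is a hedged miniature, not StarView's implementation: the column names and functions are hypothetical, and SQLite stands in for the microcomputer-side database.

```python
import sqlite3

# Sketch of the local-table approach: rows returned by a remote catalog
# query are inserted into a single local table, after which 'get previous
# row' and refinements are answered without touching the network.

def cache_results(rows):
    """Insert remote query results (target, exposure) into a local table."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE results (rowid_ INTEGER PRIMARY KEY, "
               "target TEXT, exposure REAL)")
    db.executemany("INSERT INTO results (target, exposure) VALUES (?, ?)",
                   rows)
    return db

def previous_row(db, current_id):
    """'Get previous row' is trivial on a single local table."""
    cur = db.execute("SELECT rowid_, target, exposure FROM results "
                     "WHERE rowid_ < ? ORDER BY rowid_ DESC LIMIT 1",
                     (current_id,))
    return cur.fetchone()

def refine(db, min_exposure):
    """A refinement of the original query, answered from the local table."""
    cur = db.execute("SELECT target FROM results WHERE exposure >= ? "
                     "ORDER BY rowid_", (min_exposure,))
    return [r[0] for r in cur.fetchall()]
```

Both operations that are awkward in the distributed architecture become single-table scans once the result set is local, which is the crux of the abstract's argument.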

  16. Laboratory Information Systems.

    PubMed

    Henricks, Walter H

    2015-06-01

    Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Database Organisation in a Web-Enabled Free and Open-Source Software (foss) Environment for Spatio-Temporal Landslide Modelling

    NASA Astrophysics Data System (ADS)

    Das, I.; Oberai, K.; Sarathi Roy, P.

    2012-07-01

Landslides manifest themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the earth's surface. Making a landslide database available online via the WWW (World Wide Web) promotes the spread of landslide information to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios from available historical records of landslides and geo-environmental factors, and to make it available over the Web using geospatial Free and Open Source Software (FOSS). FOSS drastically reduces the cost of the project, since proprietary software is very costly. Landslide data generated for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets, along with the landslide susceptibility map, were served through a WebGIS client interface. The open source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front end, and PostgreSQL with the PostGIS extension serves as the back end for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings an understanding of the landslides and the resulting damage closer to the affected people and the user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open source or proprietary GIS software.

  18. 77 FR 38581 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-28

    ...: Minority Business Development Agency. Title: Online Customer Relationship Management (CRM)/Performance... client information, service activities and progress on attainment of program goals via the Online CRM/Performance Databases. The data collected through the Online CRM/Performance Databases is used to regularly...

  19. Saying good goodbyes to your clients: A functional analytic psychotherapy (FAP) perspective.

    PubMed

    Tsai, Mavis; Gustafsson, Tore; Kanter, Jonathan; Plummer Loudon, Mary; Kohlenberg, Robert J

    2017-03-01

    Functional analytic psychotherapy (FAP) promotes client growth by shaping clients' daily life problems that also show up in session with their therapists. FAP therapists create evocative contexts within therapy that afford clients the opportunity to practice, refine, and be reinforced for new, more adaptive behaviors which then can be generalized into their outside lives. In FAP, the termination process will vary from client to client depending on the nature of the client's problems and targets. For many clients, the process can be a rich, multifaceted, final opportunity to evoke, reinforce, and promote generalization of clients' in-session improvements, particularly improvements related to vulnerable self-expression in the service of intimate and close relationships. By making explicit agreements at the outset of therapy to participate in an intentional termination process, and by later providing an evocative structure for ending therapy with vulnerable emotional expression, clients have the opportunity to develop more adaptive behaviors in the context of relationship endings which can be a painful part of the human experience. Equipped with the skills of open-hearted communication developed from an authentic relationship with their therapist, clients can leave therapy on a trajectory of further growth in interpersonal connection and living more boldly. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. MADGE: scalable distributed data management software for cDNA microarrays.

    PubMed

    McIndoe, Richard A; Lanzen, Aaron; Hurtz, Kimberly

    2003-01-01

The human genome project and the development of new high-throughput technologies have created unparalleled opportunities to study the mechanisms of disease, monitor disease progression and evaluate effective therapies. Gene expression profiling is a critical tool to accomplish these goals. The use of nucleic acid microarrays to assess the gene expression of thousands of genes simultaneously has seen phenomenal growth over the past five years. Although commercial sources of microarrays exist, investigators wanting more flexibility in the genes represented on the array will turn to in-house production. The creation and use of cDNA microarrays is a complicated process that generates an enormous amount of information. Effective data management of this information is essential to efficiently access, analyze, troubleshoot and evaluate the microarray experiments. We have developed a distributable software package designed to track and store the various pieces of data generated by a cDNA microarray facility. This includes the clone collection storage data, annotation data, workflow queues, microarray data, data repositories, sample submission information, and project/investigator information. This application was designed using a 3-tier client-server model. The data access layer (1st tier) contains the relational database system, tuned to support a large number of transactions. The data services layer (2nd tier) is a distributed COM server with full database transaction support. The application layer (3rd tier) is an internet-based user interface that contains both client- and server-side code for dynamic interactions with the user. This software is freely available to academic institutions and non-profit organizations at http://www.genomics.mcg.edu/niddkbtc.
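The 3-tier split described above can be shown in miniature. This is a hedged, hypothetical sketch only: MADGE's real tiers are a tuned RDBMS, a distributed COM server, and a web UI, whereas here SQLite stands in for the data tier, a plain class for the transactional services tier, and a formatting function for the application tier.

```python
import sqlite3

# Miniature 3-tier layout: data tier (SQLite stands in for the RDBMS),
# services tier (a plain class stands in for the distributed COM server,
# with transaction support), and an application-tier view function.

class DataTier:
    """1st tier: owns the relational store."""
    def __init__(self):
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE clones (name TEXT, plate TEXT)")

class ServicesTier:
    """2nd tier: mediates all writes, wrapping them in transactions."""
    def __init__(self, data):
        self.data = data

    def register_clone(self, name, plate):
        with self.data.db:   # commit on success, roll back on error
            self.data.db.execute(
                "INSERT INTO clones VALUES (?, ?)", (name, plate))

def application_tier(services):
    """3rd tier: renders the clone inventory for the user interface."""
    cur = services.data.db.execute(
        "SELECT name, plate FROM clones ORDER BY name")
    return [f"{n} @ {p}" for n, p in cur]
```

The point of the layering is that the application tier never touches the database directly; every write goes through the services tier, which is where transaction boundaries live.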

  1. Mobile Application "Neurogame" for Assessment the Attention, Focus and Concentration.

    PubMed

    Loleski, Mario; Loleska, Sofija; Pop-Jordanova, Nada

    2017-12-01

Smartphones are ubiquitous, but it is still unknown which physiological functions can be monitored at clinical quality. In medicine their use is cited in many fields (cardiology, pulmonology, endocrinology, rheumatology, pediatrics, as well as mental health). The aim of this paper is to explain how the use of a mobile application can help clients improve their focus, concentration and motor skills. Our original application for the Android operating system, named "neurogame," is based on an open source platform and enables assessment and therapeutic stimulation of focus and concentration, with the ability to monitor the progress of the results obtained in a larger number of participants (normal subjects as well as patients with different disorders) over a period of time. While the predominant focus nowadays is on pharmacological treatments, there is rapidly growing interest in research on alternative options, such as mobile application games, that may help in many cases of disorder management. In order to have some kind of "norms," we evaluated a group of healthy participants. The results obtained, presented here as a database, will serve for comparison with future results.

  2. New NED XML/VOtable Services and Client Interface Applications

    NASA Astrophysics Data System (ADS)

    Pevunova, O.; Good, J.; Mazzarella, J.; Berriman, G. B.; Madore, B.

    2005-12-01

    The NASA/IPAC Extragalactic Database (NED) provides data and cross-identifications for over 7 million extragalactic objects fused from thousands of survey catalogs and journal articles. The data cover all frequencies from radio through gamma rays and include positions, redshifts, photometry and spectral energy distributions (SEDs), sizes, and images. NED services have traditionally supplied data in HTML format for connections from Web browsers, and a custom ASCII data structure for connections by remote computer programs written in the C programming language. We describe new services that provide responses from NED queries in XML documents compliant with the international virtual observatory VOtable protocol. The XML/VOtable services support cone searches, all-sky searches based on object attributes (survey names, cross-IDs, redshifts, flux densities), and requests for detailed object data. Initial services have been inserted into the NVO registry, and others will follow soon. The first client application is a Style Sheet specification for rendering NED VOtable query results in Web browsers that support XML. The second prototype application is a Java applet that allows users to compare multiple SEDs. The new XML/VOtable output mode will also simplify the integration of data from NED into visualization and analysis packages, software agents, and other virtual observatory applications. We show an example SED from NED plotted using VOPlot. The NED website is: http://nedwww.ipac.caltech.edu.
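Because a VOtable response is plain XML, a client application can parse it with standard XML tools. The sketch below is an illustration only, assuming a bare, namespace-free single-table TABLEDATA layout with hypothetical field names; real NED VOtable documents carry a namespace and richer metadata.

```python
import xml.etree.ElementTree as ET

# Minimal VOTable TABLEDATA parser: pull out field names and data rows.
# Assumes a simplified, namespace-free document for illustration.

SAMPLE = """<VOTABLE><RESOURCE><TABLE>
  <FIELD name="objname"/><FIELD name="redshift"/>
  <DATA><TABLEDATA>
    <TR><TD>MESSIER 087</TD><TD>0.00428</TD></TR>
    <TR><TD>NGC 1275</TD><TD>0.01756</TD></TR>
  </TABLEDATA></DATA>
</TABLE></RESOURCE></VOTABLE>"""

def parse_votable(text):
    """Return (field_names, rows) from a simple VOTable document."""
    root = ET.fromstring(text)
    fields = [f.get("name") for f in root.iter("FIELD")]
    rows = [[td.text for td in tr] for tr in root.iter("TR")]
    return fields, rows
```

This self-describing structure is what lets the same response feed a browser Style Sheet, a Java applet, or an analysis package without per-client data formats.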

  3. When Problem Gambling Is the Primary Reason for Seeking Addiction Treatment

    ERIC Educational Resources Information Center

    Jamieson, John; Mazmanian, Dwight; Penney, Alexander; Black, Nancy; Nguyen, An

    2011-01-01

    An existing database was used to compare problem gamblers (N = 138) who presented for treatment of their gambling problem to two other groups: alcohol and/or drug addiction clients who also had a gambling problem (N = 280) or who did not have a gambling problem (N = 2178). Clients with gambling as their primary problem were more likely to be…

  4. Software for Building Models of 3D Objects via the Internet

    NASA Technical Reports Server (NTRS)

    Schramer, Tim; Jensen, Jeff

    2003-01-01

    The Virtual EDF Builder (where EDF signifies Electronic Development Fixture) is a computer program that facilitates the use of the Internet for building and displaying digital models of three-dimensional (3D) objects that ordinarily comprise assemblies of solid models created previously by use of computer-aided-design (CAD) programs. The Virtual EDF Builder resides on a Unix-based server computer. It is used in conjunction with a commercially available Web-based plug-in viewer program that runs on a client computer. The Virtual EDF Builder acts as a translator between the viewer program and a database stored on the server. The translation function includes the provision of uniform resource locator (URL) links to other Web-based computer systems and databases. The Virtual EDF builder can be used in two ways: (1) If the client computer is Unix-based, then it can assemble a model locally; the computational load is transferred from the server to the client computer. (2) Alternatively, the server can be made to build the model, in which case the server bears the computational load and the results are downloaded to the client computer or workstation upon completion.

  5. The Human Transcript Database: A Catalogue of Full Length cDNA Inserts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Bouck, John; McLeod, Michael; Worley, Kim

    1999-09-10

The BCM Search Launcher provided improved access to web-based sequence analysis services during the granting period and beyond. The Search Launcher web site grouped analysis procedures by function and provided default parameters that gave reasonable search results for most applications. For instance, most queries were automatically masked for repeat sequences prior to sequence database searches to avoid spurious matches. In addition to web-based access and arrangements that made the functions easier to use, the BCM Search Launcher provided unique value-added applications like the BEAUTY sequence database search tool, which combined information about protein domains with sequence database search results to give an enhanced, more complete picture of the reliability and relative value of the information reported. This enhanced search tool made evaluating search results more straightforward and consistent. Some of the favorite features of the web site are the sequence utilities and the batch client functionality that allows processing of multiple samples from the command-line interface. One measure of the success of the BCM Search Launcher is the number of sites that have adopted the models first developed there. The graphic display of the BLAST search on the NCBI web site is one such outgrowth, as is the display of protein domain search results within BLAST search results, and the design of the Biology Workbench application. The logs of usage and comments from users confirm the great utility of this resource.

  6. An XML-based Generic Tool for Information Retrieval in Solar Databases

    NASA Astrophysics Data System (ADS)

    Scholl, Isabelle F.; Legay, Eric; Linsolas, Romain

This paper presents the current architecture of the 'Solar Web Project,' now in its development phase. This tool will provide scientists interested in solar data with a single web-based interface for browsing distributed and heterogeneous catalogs of solar observations. The main goal is to have a generic application that can be easily extended to new sets of data or to new missions with a low level of maintenance. It is developed in Java, with XML used as a powerful configuration language. The server, independent of any database scheme, can communicate with a client (the user interface) and several local or remote archive access systems (such as existing web pages, ftp sites or SQL databases). Archive access systems are described externally in XML files. The user interface is also generated dynamically from an XML file containing the window-building rules and a simplified database description. This project is developed at MEDOC (Multi-Experiment Data and Operations Centre), located at the Institut d'Astrophysique Spatiale (Orsay, France). Successful tests have been conducted with other solar archive access systems.

  7. Concierge: Personal Database Software for Managing Digital Research Resources

    PubMed Central

    Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro

    2007-01-01

This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800

  8. The GMOS cyber(e)-infrastructure: advanced services for supporting science and policy.

    PubMed

    Cinnirella, S; D'Amore, F; Bencardino, M; Sprovieri, F; Pirrone, N

    2014-03-01

The need for coordinated, systematized and catalogued databases on mercury in the environment is of paramount importance, as improved information can help assess the effectiveness of measures established to phase out and ban mercury. Long-term monitoring sites have been established in a number of regions and countries for the measurement of mercury in ambient air and wet deposition. Long-term measurements of mercury concentration in biota have also produced a huge amount of information, but such initiatives are far from constituting a global, systematic and interoperable approach. To address these weaknesses, the on-going Global Mercury Observation System (GMOS) project ( www.gmos.eu ) established a coordinated global observation system for mercury and retrieved historical data ( www.gmos.eu/sdi ). To manage such a large amount of information, a technological infrastructure was planned. This high-performance back-end resource, associated with sophisticated client applications, enables data storage, computing services, telecommunications networks and all services necessary to support the activity. This paper reports the architecture of the GMOS Cyber(e)-Infrastructure and the services developed to support science and policy, including the United Nations Environment Programme. Finally, it describes new possibilities in data analysis and data management through client applications.

  9. 45 CFR 1634.9 - Selection criteria.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... key staff; (6) The applicant's knowledge of the various components of the legal services delivery... services to eligible clients; and (ii) its knowledge of and willingness to cooperate with other legal... applicant's capacity to ensure continuity in client services and representation of eligible clients with...

  10. 45 CFR 1634.9 - Selection criteria.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... key staff; (6) The applicant's knowledge of the various components of the legal services delivery... services to eligible clients; and (ii) its knowledge of and willingness to cooperate with other legal... applicant's capacity to ensure continuity in client services and representation of eligible clients with...

  11. 45 CFR 1634.9 - Selection criteria.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... key staff; (6) The applicant's knowledge of the various components of the legal services delivery... services to eligible clients; and (ii) its knowledge of and willingness to cooperate with other legal... applicant's capacity to ensure continuity in client services and representation of eligible clients with...

  12. 45 CFR 1634.9 - Selection criteria.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... key staff; (6) The applicant's knowledge of the various components of the legal services delivery... services to eligible clients; and (ii) its knowledge of and willingness to cooperate with other legal... applicant's capacity to ensure continuity in client services and representation of eligible clients with...

  13. Emotional congruence between clients and therapists and its effect on treatment outcome.

    PubMed

    Atzil-Slonim, Dana; Bar-Kalifa, Eran; Fisher, Hadar; Peri, Tuvia; Lutz, Wolfgang; Rubel, Julian; Rafaeli, Eshkol

    2018-01-01

    The present study aimed to (a) explore 2 indices of emotional congruence-temporal similarity and directional discrepancy-between clients' and therapists' ratings of their emotions as they cofluctuate session-by-session; and (b) examine whether client/therapist emotional congruence predicts clients' symptom relief and improved functioning. The sample comprised 109 clients treated by 62 therapists in a university setting. Clients and therapists self-reported their negative (NE) and positive emotions (PE) after each session. Symptom severity and functioning level were assessed at the beginning of each session using the clients' self-reports. To assess emotional congruence, an adaptation of West and Kenny's (2011) Truth and Bias model was applied. To examine the consequences of emotional congruence, polynomial regression and response surface analyses were conducted (Edwards & Parry, 1993). Clients and therapists were temporally similar in both PE and NE. Therapists experienced less intense PE on average, but did not experience more or less intense NE than their clients. Those therapists who experienced more intense NE than their clients were more temporally similar in their emotions to their clients. Therapist/client incongruence in both PE and NE predicted poorer next-session symptomatology; incongruence in PE was also associated with lower client next-session functioning. Session-level symptoms were better when therapists experienced more intense emotions (both PE and NE) than their clients. The findings highlight the importance of recognizing the dynamic nature of emotions in client-therapist interactions and the contribution of session-by-session emotional dynamics to outcomes. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  14. An integrated biomedical telemetry system for sleep monitoring employing a portable body area network of sensors (SENSATION).

    PubMed

    Astaras, Alexander; Arvanitidou, Marina; Chouvarda, Ioanna; Kilintzis, Vassilis; Koutkias, Vassilis; Sanchez, Eduardo Monton; Stalidis, George; Triantafyllidis, Andreas; Maglaveras, Nicos

    2008-01-01

    A flexible, scalable and cost-effective medical telemetry system is described for monitoring sleep-related disorders in the home environment. The system was designed and built for real-time data acquisition and processing, allowing for additional use in intensive care unit scenarios where rapid medical response is required in case of emergency. It comprises a wearable body area network of Zigbee-compatible wireless sensors worn by the subject, a central database repository residing in the medical centre and thin client workstations located at the subject's home and in the clinician's office. The system supports heterogeneous setup configurations, involving a variety of data acquisition sensors to suit several medical applications. All telemetry data is securely transferred and stored in the central database under the clinicians' ownership and control.
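    The sensor-to-repository data flow described above can be pictured with a small sketch. The record layout, field names, and JSON encoding below are illustrative assumptions, not the SENSATION system's actual wire format:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical sketch of one reading from a body-area-network sensor node,
# serialized for upload to the central repository. Field names are
# illustrative, not taken from the SENSATION system.
@dataclass
class SensorReading:
    node_id: str        # address of the wearable wireless sensor node
    channel: str        # e.g. "ecg", "spo2", "accelerometer"
    timestamp_ms: int   # acquisition time, milliseconds since epoch
    value: float

def encode_batch(readings):
    """Pack a batch of readings into a JSON payload for transfer."""
    return json.dumps([asdict(r) for r in readings])

payload = encode_batch([SensorReading("node-07", "spo2", 1_200_000, 97.5)])
```

In a heterogeneous setup, the `channel` field is what lets one repository schema absorb readings from different sensor types.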

  15. Examining the relationship between therapeutic self-care and adverse events for home care clients in Ontario, Canada: a retrospective cohort study.

    PubMed

    Sun, Winnie; Doran, Diane M; Wodchis, Walter P; Peter, Elizabeth

    2017-03-14

    In an era of a rapidly aging population that requires home care services, clients must possess or develop therapeutic self-care ability in order to manage their health conditions safely in their homes. Therapeutic self-care is the ability to take medications as prescribed and to recognize and manage symptoms that may be experienced, such as pain. The purpose of this research study was to investigate whether therapeutic self-care ability explained variation in the frequency and types of adverse events experienced by home care clients. A retrospective cohort design was used, utilizing secondary databases available for Ontario home care clients from the years 2010 to 2012. The data were derived from (1) Health Outcomes for Better Information and Care; (2) Resident Assessment Instrument-Home Care; (3) National Ambulatory Care Reporting System; and (4) Discharge Abstract Database. Descriptive analysis was used to identify the types and prevalence of adverse events experienced by home care clients. Logistic regression analysis was used to examine the association between therapeutic self-care ability and the occurrence of adverse events in home care. The results indicated that low therapeutic self-care ability was associated with an increase in adverse events. In particular, logistic regression results indicated that low therapeutic self-care ability was associated with an increase in clients experiencing: (1) unplanned hospital visits; (2) a decline in activities of daily living; (3) falls; (4) unintended weight loss, and (5) non-compliance with medication. This study advances the understanding about the role of therapeutic self-care ability in supporting the safety of home care clients. High levels of therapeutic self-care ability can be a protective factor against the occurrence of adverse events among home care clients. A clear understanding of the nature of the relationship between therapeutic self-care ability and adverse events helps to pinpoint the areas of home care service delivery required to improve clients' health and functioning. Such knowledge is vital for informing health care leaders about effective strategies that promote therapeutic self-care, as well as providing evidence for policy formulation in relation to risk mitigation in home care.

  16. Introduction

    NASA Astrophysics Data System (ADS)

    Zhao, Ben; Garbacki, Paweł; Gkantsidis, Christos; Iamnitchi, Adriana; Voulgaris, Spyros

    After a decade of intensive investigation, peer-to-peer computing has established itself as an accepted research field in the general area of distributed systems. Peer-to-peer computing can be seen as the democratization of computing, overthrowing the traditional hierarchical designs favored in client-server systems, a shift largely brought about by last-mile network improvements that have made individual PCs first-class citizens in the network community. Much of the early focus in peer-to-peer systems was on best-effort file sharing applications. In recent years, however, research has focused on peer-to-peer systems that provide operational properties and functionality similar to those shown by more traditional distributed systems. These properties include stronger consistency, reliability, and security guarantees suitable for supporting traditional applications such as databases.

  17. Intersectionality in psychotherapy: The experiences of an AfroLatinx queer immigrant.

    PubMed

    Adames, Hector Y; Chavez-Dueñas, Nayeli Y; Sharma, Shweta; La Roche, Martin J

    2018-03-01

    Culturally responsive and racially conscious psychotherapeutic work requires that therapists recognize the ways clients are impacted by their multiple marginalized identities and by systems of oppression (e.g., racism, ethnocentrism, sexism, heterosexism, and nativism). Attending exclusively to clients' marginalized identities (i.e., weak intersectionality) may drive therapists to only focus on internal, subjective, and emotional experiences, hence, missing the opportunity to consider and address how multiple sociostructural dimensions (i.e., strong intersectionality) may be impacting the client's presenting problems. Alternatively, focusing solely on the impact of sociostructural dimensions on the lives of clients may miss the more nuanced and variable individual personal experiences. In this article, we highlight the challenge of maintaining a culturally responsive and racially conscious stance when considering multiple marginalized identities, overlapping systemic inequities, and how both affect clients' lives and experiences. The case of an AfroLatinx queer immigrant is presented to illustrate some of the challenges and opportunities while simultaneously considering (a) the client's multiple marginalized identities, (b) the way clients are impacted by systemic oppression, and (c) integrating the client's personal experiences and narratives in psychotherapy. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  18. Transference and insight in psychotherapy with gay and bisexual male clients: the role of sexual orientation identity integration.

    PubMed

    Mohr, Jonathan J; Fuertes, Jairo N; Stracuzzi, Thomas I

    2015-03-01

    Clinical writing has suggested that the therapeutic process and relationship in work with lesbian, gay, and bisexual clients may be influenced by the extent to which clients have accepted their sexual orientation and developed a social network supportive of their sexual orientation, a construct we refer to as sexual orientation identity integration. The present cross-sectional study investigated this proposition by examining the identity integration ratings of 90 gay and bisexual male clients in relation to elements of treatment as rated by both the therapist (insight, negative transference, working alliance, session depth, and client improvement) and client (working alliance, session depth, and client improvement). Participants were male-male therapy dyads recruited from lesbian, gay, and bisexual-affirming practices. Client identity integration was negatively associated with transference, and positively associated with ratings of insight, alliance, depth, and improvement. Insight, but not transference, uniquely mediated the positive association between identity integration and most indicators of therapeutic quality. Results from an exploratory model suggested that transference may indirectly influence therapeutic quality by serving as a barrier to insight. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  19. Intellectual Production Supervision Perform based on RFID Smart Electricity Meter

    NASA Astrophysics Data System (ADS)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; Chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    This project develops an RFID smart electricity meter production supervision and project management system. The system addresses the schedule, quality, and cost information management requirements of RFID smart meter production supervision, providing more comprehensive, timely, and accurate quantitative information for the management decisions of supervision engineers and project managers, and supplying technical documentation for the product manufacturing stage. The development of the system is discussed from the perspectives of scheme analysis, design, implementation, and testing. Combined with the main business applications and management mode at this stage, the discussion focuses on the functions for monitoring energy meter progress, quality, and cost information. The paper introduces the design scheme of the system: an overall client/server architecture, with a general-purpose graphical client for supervision project management and interactive display of transaction information, and a server that implements the main program logic. The system is programmed in C# on the .NET runtime; the client and server platforms use the Windows operating system, and the database server runs Oracle. The overall platform supports mainstream information standards and has good scalability.

  20. Beauty from the beast: Avoiding errors in responding to client questions.

    PubMed

    Waehler, Charles A; Grandy, Natalie M

    2016-09-01

    Those rare moments when clients ask direct questions of their therapists likely represent a point when they are particularly open to new considerations, thereby representing an opportunity for substantial therapeutic gains. However, clinical errors abound in this area because clients' questions often engender apprehension in therapists, causing therapists to respond with too little or too much information or shutting down the discussion prematurely. These response types can damage the therapeutic relationship, the psychotherapy process, or both. We explore the nature of these clinical errors in response to client questions by providing examples from our own clinical work, suggesting potential reasons why clinicians may not make optimal use of client questions, and discussing how the mixed psychological literature further complicates the issue. We also present four guidelines designed to help therapists, trainers, and supervisors respond constructively to clinical questions in order to create constructive interactions. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  1. The necessary and sufficient conditions of therapeutic personality change.

    PubMed

    Rogers, Carl R

    2007-09-01

    This reprinted article originally appeared in Journal of Consulting Psychology, 1957(Apr), Vol 21(2), 95-103. (The following abstract of the original article appeared in record 1959-00842-001.) "For constructive personality change to occur, it is necessary that these conditions exist and continue over a period of time: (1) Two persons are in psychological contact. (2) The first, whom we shall term the client, is in a state of incongruence, being vulnerable or anxious. (3) The second person, whom we shall term the therapist, is congruent or integrated in the relationship. (4) The therapist experiences unconditional positive regard for the client. (5) The therapist experiences an empathic understanding of the client's internal frame of reference and endeavors to communicate this experience to the client. (6) The communication to the client of the therapist's empathic understanding and unconditional positive regard is to a minimal degree achieved." (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  2. Transitioning Client Based NALCOMIS to a Multi Function Web Based Application

    DTIC Science & Technology

    2016-09-23

    NAVAL POSTGRADUATE SCHOOL MONTEREY, CALIFORNIA THESIS TRANSITIONING CLIENT-BASED NALCOMIS TO A MULTI-FUNCTION WEB-BASED APPLICATION by Aaron P...TITLE AND SUBTITLE TRANSITIONING CLIENT-BASED NALCOMIS TO A MULTI-FUNCTION WEB-BASED APPLICATION 5. FUNDING NUMBERS 6. AUTHOR(S) Aaron P. Schnetzler 7...NALCOMIS. NALCOMIS has two configurations that are used by organizational and intermediate level maintenance activities, Optimized Organizational

  3. A collaborative computer auditing system under SOA-based conceptual model

    NASA Astrophysics Data System (ADS)

    Cong, Qiushi; Huang, Zuoming; Hu, Jibing

    2013-03-01

    Some of the current challenges of computer auditing are the obstacles to retrieving, converting and translating data from different database schemas. During the last few years, many data exchange standards have been under continuous development, such as the Extensible Business Reporting Language (XBRL). These XML document standards can be used for data exchange among companies, financial institutions, and audit firms. However, for many companies it is still expensive and time-consuming to translate and provide XML messages with commercial application packages, because it is complicated and laborious to search and transform data from thousands of tables in ERP databases. How to transfer transaction documents between audit firms and their client companies, in support of continuous or real-time auditing, is an important topic. In this paper, a collaborative computer auditing system under an SOA-based conceptual model is proposed. By utilizing the widely used XML document standards and existing data transformation applications developed by different companies and software vendors, these applications can be wrapped as commercial web services that are easily implemented under the forthcoming application environment: service-oriented architecture (SOA). Under SOA environments, the multiagency mechanism will help the maturity and popularity of data assurance services over the Internet. Wrapping data transformation components for heterogeneous databases or platforms will create new component markets, composed of many software vendors and assurance service companies, that provide data assurance services for audit firms, regulators or third parties.
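    The wrapping idea described above can be sketched with a hypothetical transformation component: converting a flat ERP-style transaction record into a simple XML message of the kind such a web service might exchange. Element and field names below are assumptions for illustration, not taken from the paper or from XBRL:

```python
import xml.etree.ElementTree as ET

# Illustrative sketch (not the paper's implementation): a data-transformation
# component that turns a flat ERP transaction record into an XML message.
# Under SOA, a function like this would be exposed as a web service operation.
def transaction_to_xml(record: dict) -> str:
    """Serialize a flat record as an XML <transaction> element."""
    root = ET.Element("transaction")
    for field, value in record.items():
        ET.SubElement(root, field).text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_msg = transaction_to_xml({"id": "T-1001", "amount": "250.00", "currency": "USD"})
```

A real deployment would map records onto a shared schema such as XBRL rather than ad hoc element names; the point here is only the transform-then-wrap pattern.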

  4. Providers' perspectives of factors influencing implementation of evidence-based treatments in a community mental health setting: A qualitative investigation of the training-practice gap.

    PubMed

    Marques, Luana; Dixon, Louise; Valentine, Sarah E; Borba, Christina P C; Simon, Naomi M; Wiltsey Stirman, Shannon

    2016-08-01

    This study aims to elucidate relations between provider perceptions of aspects of the consolidated framework for implementation research (Damschroder et al., 2009) and provider attitudes toward the implementation of evidence-based treatments (EBTs) in an ethnically diverse community health setting. Guided by directed content analysis, we analyzed 28 semistructured interviews that were conducted with providers during the pre-implementation phase of a larger implementation study for cognitive processing therapy for posttraumatic stress disorder (Resick et al., 2008). Our findings extend the existing literature by also presenting provider-identified client-level factors that contribute to providers' positive and negative attitudes toward EBTs. Provider-identified client-level factors include the following: client motivation to engage in treatment, client openness to EBTs, support networks of family and friends, client use of community and government resources, the connection and relationship with their therapist, client treatment adherence, client immediate needs or crises, low literacy or illiteracy, low levels of education, client cognitive limitations, and misconceptions about therapy. These results highlight the relations between provider perceptions of their clients, provider engagement in EBT training, and subsequent adoption of EBTs. We present suggestions for future implementation research in this area. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  5. PsychVACS: a system for asynchronous telepsychiatry.

    PubMed

    Odor, Alberto; Yellowlees, Peter; Hilty, Donald; Parish, Michelle Burke; Nafiz, Najia; Iosif, Ana-Maria

    2011-05-01

    To describe the technical development of an asynchronous telepsychiatry application, the Psychiatric Video Archiving and Communication System. A client-server application was developed in Visual Basic .NET with a Microsoft SQL database as the backend. It includes the capability of storing video-recorded psychiatric interviews and manages the workflow of the system with automated messaging. The Psychiatric Video Archiving and Communication System has been used to conduct the first-ever series of asynchronous telepsychiatry consultations worldwide. A review of the software application and the process as part of this project has led to a number of improvements that are being implemented in the next version, which is being written in Java. This is the first description of the use of video-recorded data in an asynchronous telemedicine application. Primary care providers and consulting psychiatrists have found it easy to work with and a valuable resource for increasing the availability of psychiatric consultation in remote rural locations.

  6. 14 CFR 1261.317 - Attorney-client privilege.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Attorney-client privilege. 1261.317 Section... Injury or Death-Accruing On or After January 18, 1967 § 1261.317 Attorney-client privilege. (a) Attorneys... traditional attorney-client relationship with the employee with respect to application of the attorney-client...

  7. The impact of counselor self-disclosure on clients: a meta-analytic review of experimental and quasi-experimental research.

    PubMed

    Henretty, Jennifer R; Currier, Joseph M; Berman, Jeffrey S; Levitt, Heidi M

    2014-04-01

    In an attempt to make sense of contradictory findings, meta-analysis was used to review 53 studies that examined counselor self-disclosure (CSD) vs. nondisclosure. CSD, overall, was found to have a favorable impact on clients/participants, with clients/participants having favorable perceptions of disclosing counselors and rating themselves more likely to disclose to counselors who had self-disclosed. Specifically, CSD that (a) revealed similarity between client and counselor; (b) was of negative content valence; or (c) was related to intra- or, especially, extratherapy experiences, had favorable impacts on clients/participants compared with nondisclosure. These types of disclosure resulted in more favorable perceptions of the counselor, especially in the area of professional attractiveness. CSD that revealed similarity between client and counselor also had a favorable impact on clients'/participants' allegiance-specifically, on their willingness to return-to disclosing counselors. Significant moderators of the impact of CSD on clients included researcher bias for or against CSD, type of "session" (e.g., written transcript, interview, real session), timing of CSD (whether before or after client self-disclosure), verb tense of extratherapy CSD, experimental setting, type of control group, and the number of CSDs in the experiment. Clinical implications include that CSD may be beneficial for building rapport, strengthening alliance, and eliciting client disclosure, with similar CSD being especially beneficial. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  8. Dual diagnosis clients' treatment satisfaction - a systematic review

    PubMed Central

    2011-01-01

    Background The aim of this systematic review is to synthesize existing evidence about treatment satisfaction among clients with substance misuse and mental health co-morbidity (dual diagnoses, DD). Methods We examined satisfaction with treatment received, variations in satisfaction levels by type of treatment intervention and by diagnosis (i.e. DD clients vs. single diagnosis clients), and the influence of factors other than treatment type on satisfaction. Peer-reviewed studies published in English since 1970 were identified by searching electronic databases using pre-defined search strings. Results Across the 27 studies that met inclusion criteria, high average satisfaction scores were found. In most studies, integrated DD treatment yielded greater client satisfaction than standard treatment without explicit DD focus. In standard treatment without DD focus, DD clients tended to be less satisfied than single diagnosis clients. Whilst the evidence base on client and treatment variables related to satisfaction is small, it suggested client demographics and symptom severity to be unrelated to treatment satisfaction. However, satisfaction tended to be linked to other treatment process and outcome variables. Findings are limited in that many studies had very small sample sizes, did not use validated satisfaction instruments and may not have controlled for potential confounders. A framework for further research in this important area is discussed. Conclusions High satisfaction levels with current treatment provision, especially among those in integrated treatment, should enhance therapeutic optimism among practitioners dealing with DD clients. PMID:21501510

  9. The insecure psychotherapy base: Using client and therapist attachment styles to understand the early alliance.

    PubMed

    Marmarosh, Cheri L; Kivlighan, Dennis M; Bieri, Kathryn; LaFauci Schutt, Jean M; Barone, Carrie; Choi, Jaehwa

    2014-09-01

    The purpose of this study was to test the notion that complementary attachments are best for achieving a secure base in psychotherapy. Specifically, we predicted third to fifth session alliance from client- and therapist-rated attachment style interactions. Using a combined sample of 46 therapy dyads from a community mental health clinic and university counseling center, the client- and therapist-perceived therapy alliance, attachment anxiety, and attachment avoidance were examined at the beginning of therapy. The results of an Actor-Partner Interdependence Model (APIM; Kenny & Cook, 1999, Partner effects in relationship research: Conceptual issues, analytic difficulties, and illustrations. Personal Relationships, 6, 433-448.) indicated that there was no direct effect of either client or therapist attachment style on therapist or client early ratings of the alliance. One significant interaction emerged and indicated that client-perceived alliance was influenced by therapist and client attachment anxiety. The client-perceived early alliance was higher when more anxious therapists worked with clients with decreasing anxiety. The client early alliance was higher when less anxious therapists worked with clients with increasing anxiety. The findings partially support the notion that different attachment configurations between the therapist and client facilitate greater alliance, but this was the case only when assessing client-perceived early alliance and only with regards to the dimension of attachment anxiety. There were no significant main effects or interactions when exploring therapist-perceived alliance. Implications of the findings are discussed along with recommendations for future study and clinical training. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  10. Nencki Genomics Database--Ensembl funcgen enhanced with intersections, user data and genome-wide TFBS motifs.

    PubMed

    Krystkowiak, Izabella; Lenart, Jakub; Debski, Konrad; Kuterba, Piotr; Petas, Michal; Kaminska, Bozena; Dabrowski, Michal

    2013-01-01

    We present the Nencki Genomics Database, which extends the functionality of Ensembl Regulatory Build (funcgen) for the three species: human, mouse and rat. The key enhancements over Ensembl funcgen include the following: (i) a user can add private data, analyze them alongside the public data and manage access rights; (ii) inside the database, we provide efficient procedures for computing intersections between regulatory features and for mapping them to the genes. To Ensembl funcgen-derived data, which include data from ENCODE, we add information on conserved non-coding (putative regulatory) sequences, and on genome-wide occurrence of transcription factor binding site motifs from the current versions of two major motif libraries, namely, Jaspar and Transfac. The intersections and mapping to the genes are pre-computed for the public data, and the result of any procedure run on the data added by the users is stored back into the database, thus incrementally increasing the body of pre-computed data. As the Ensembl funcgen schema for the rat is currently not populated, our database is the first database of regulatory features for this frequently used laboratory animal. The database is accessible without registration using the mysql client: mysql -h database.nencki-genomics.org -u public. Registration is required only to add or access private data. A WSDL webservice provides access to the database from any SOAP client, including the Taverna Workbench with a graphical user interface.
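    The intersection procedures described above can be illustrated with a minimal sketch. The feature tuples and half-open coordinate convention below are assumptions for illustration, not the database's actual schema or stored procedures:

```python
# Minimal sketch of the kind of intersection the database precomputes:
# overlap between two sets of genomic features, each represented here as a
# (chrom, start, end) tuple with half-open coordinates. The real database
# runs this inside MySQL over Ensembl funcgen-derived tables.
def intersect_features(set_a, set_b):
    """Return pairs of features from set_a and set_b that overlap."""
    hits = []
    for a in set_a:
        for b in set_b:
            # same chromosome, and the intervals overlap
            if a[0] == b[0] and a[1] < b[2] and b[1] < a[2]:
                hits.append((a, b))
    return hits

tfbs = [("chr1", 100, 120)]                        # e.g. a TFBS motif occurrence
features = [("chr1", 110, 300), ("chr1", 500, 600)]  # e.g. regulatory features
overlaps = intersect_features(tfbs, features)
```

Precomputing these pairs for the public data, as the abstract describes, trades storage for query speed; user-added data would be intersected on demand and the results stored back.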

  11. Nencki Genomics Database—Ensembl funcgen enhanced with intersections, user data and genome-wide TFBS motifs

    PubMed Central

    Krystkowiak, Izabella; Lenart, Jakub; Debski, Konrad; Kuterba, Piotr; Petas, Michal; Kaminska, Bozena; Dabrowski, Michal

    2013-01-01

    We present the Nencki Genomics Database, which extends the functionality of Ensembl Regulatory Build (funcgen) for the three species: human, mouse and rat. The key enhancements over Ensembl funcgen include the following: (i) a user can add private data, analyze them alongside the public data and manage access rights; (ii) inside the database, we provide efficient procedures for computing intersections between regulatory features and for mapping them to the genes. To Ensembl funcgen-derived data, which include data from ENCODE, we add information on conserved non-coding (putative regulatory) sequences, and on genome-wide occurrence of transcription factor binding site motifs from the current versions of two major motif libraries, namely, Jaspar and Transfac. The intersections and mapping to the genes are pre-computed for the public data, and the result of any procedure run on the data added by the users is stored back into the database, thus incrementally increasing the body of pre-computed data. As the Ensembl funcgen schema for the rat is currently not populated, our database is the first database of regulatory features for this frequently used laboratory animal. The database is accessible without registration using the mysql client: mysql -h database.nencki-genomics.org -u public. Registration is required only to add or access private data. A WSDL webservice provides access to the database from any SOAP client, including the Taverna Workbench with a graphical user interface. Database URL: http://www.nencki-genomics.org. PMID:24089456

  12. Programming Wireless Handheld Devices for Applications in Teaching Astronomy

    NASA Astrophysics Data System (ADS)

    Budiardja, R.; Saranathan, V.; Guidry, M.

    2002-12-01

    Wireless technology implemented with handheld devices has attractive features because of the potential to access large amounts of data and the prospect of on-the-fly computational analysis from a device that can be carried in a shirt pocket. We shall describe applications of such technology to the general paradigm of making digital wireless connections from the field to upload information and queries to network servers, executing (potentially complex) data analysis and/or database operations on fast network computers, and returning real-time information from this analysis to the handheld device in the field. As illustration, we shall describe several client/server programs that we have written for applications in teaching introductory astronomy. For example, one program allows static and dynamic properties of astronomical objects to be accessed in a remote observation laboratory setting using a digital cell phone or PDA. Another implements interactive quizzing over a cell phone or PDA using a 700-question introductory astronomy quiz database, thus permitting students to study for astronomy quizzes in any environment in which they have a few free minutes and a digital cell phone or wireless PDA. The presentation will include hands-on demonstrations with real devices.
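    The quiz interaction described above can be sketched as server-side logic: pick a question, send it to the handheld client, grade the reply. The tiny in-memory question bank and function names below are hypothetical stand-ins for the real 700-question database and network layer:

```python
import random

# Hypothetical sketch of server-side quiz logic for the handheld client.
# In the described system, the question bank lives in a server database and
# the question/answer exchange travels over the wireless link.
QUESTION_BANK = [
    {"q": "Which planet is largest?", "choices": ["Mars", "Jupiter", "Venus"], "answer": 1},
    {"q": "What powers the Sun?", "choices": ["Fusion", "Fission", "Combustion"], "answer": 0},
]

def serve_question(rng=random):
    """Select a question to send to the client device."""
    return rng.choice(QUESTION_BANK)

def grade(question, choice_index):
    """Grade the index the client sent back."""
    return choice_index == question["answer"]
```

Keeping grading on the server, as this architecture does, is what lets a thin cell-phone or PDA client stay simple while the heavy lifting happens on fast network computers.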

  13. A Dynamic Enhancement With Background Reduction Algorithm: Overview and Application to Satellite-Based Dust Storm Detection

    NASA Astrophysics Data System (ADS)

    Miller, Steven D.; Bankert, Richard L.; Solbrig, Jeremy E.; Forsythe, John M.; Noh, Yoo-Jeong; Grasso, Lewis D.

    2017-12-01

    This paper describes a Dynamic Enhancement Background Reduction Algorithm (DEBRA) applicable to multispectral satellite imaging radiometers. DEBRA uses ancillary information about the clear-sky background to reduce false detections of atmospheric parameters in complex scenes. Applied here to the detection of lofted dust, DEBRA enlists a surface emissivity database coupled with a climatological database of surface temperature to approximate the clear-sky equivalent signal for selected infrared-based multispectral dust detection tests. This background allows for suppression of false alarms caused by land surface features while retaining some ability to detect dust above those problematic surfaces. The algorithm is applicable to both day and nighttime observations and enables weighted combinations of dust detection tests. The results are provided quantitatively, as a detection confidence factor [0, 1], but are also readily visualized as enhanced imagery. Utilizing the DEBRA confidence factor as a scaling factor in false-color red/green/blue imagery enables depiction of the targeted parameter in the context of the local meteorology and topography. In this way, the method is useful to automated clients and human analysts alike. Examples of DEBRA performance during notable dust storms, and comparisons against other detection methods and independent observations, are presented.
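    The abstract does not give DEBRA's actual formulation; the following toy sketch illustrates only the general pattern of a background-referenced test mapped to a [0, 1] confidence that then scales imagery. All thresholds, the blend color, and the input values are invented for illustration:

```python
def dust_confidence(observed_btd, clear_sky_btd, noise=0.5, span=2.0):
    """Toy background-referenced test: score how strongly the observed
    split-window brightness-temperature difference (K) departs from the
    precomputed clear-sky background value, clipped onto [0, 1]."""
    departure = clear_sky_btd - observed_btd  # lofted dust lowers the BTD
    conf = (departure - noise) / span
    return max(0.0, min(1.0, conf))

def enhance_pixel(rgb, confidence):
    """Blend a false-color pixel toward a dust-like yellow in proportion
    to the detection confidence, leaving low-confidence pixels unchanged."""
    dust_rgb = (255, 220, 0)
    return tuple(round((1 - confidence) * c + confidence * d)
                 for c, d in zip(rgb, dust_rgb))

conf = dust_confidence(observed_btd=-1.5, clear_sky_btd=1.0)
print(conf, enhance_pixel((60, 80, 120), conf))
```

    The point of the sketch is the division of labor the paper describes: the ancillary databases supply `clear_sky_btd` per pixel, so that surface features with unusual emissivity no longer masquerade as dust.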

  14. Commentary on Dinger et al.: Therapist's attachment, interpersonal problems and alliance development over time in inpatient psychotherapy.

    PubMed

    Holmes, Jeremy

    2009-09-01

    This short article is a commentary on a research study investigating therapist and client attachment styles and their relationship to alliance development in a 12-week psychodynamic psychotherapy program for nonpsychotic inpatients. The relationship is complex; unsurprisingly, securely attached therapists with less distressed clients formed the strongest alliances. A significant proportion of therapists were insecure, almost entirely in the preoccupied or hyperactivating mode. It is argued that collusive relationships between such therapists and similarly overaroused clients may be common. Therapists need both to accommodate to their client's attachment style and to confound it if positive change is to result. Therapist self-scrutiny is likely to be a precondition for such positive outcomes. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  15. Clinical errors and therapist discomfort with client disclosure of troublesome pornography use: Implications for clinical practice and error reduction.

    PubMed

    Walters, Nathan T; Spengler, Paul M

    2016-09-01

    Mental health professionals are increasingly aware of the need for competence in the treatment of clients with pornography-related concerns. However, while researchers have recently sought to explore efficacious treatments for pornography-related concerns, few explorations of potential clinical judgment issues have occurred. Due to the sensitive, and at times uncomfortable, nature of client disclosures of sexual concerns within therapy, therapists are required to manage their own discomfort while retaining fidelity to treatment. The present paper explores clinician examples of judgment errors that may result from feelings of discomfort, and specifically from client use of pornography. Issues of potential bias, bias management techniques, and therapeutic implications are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. 37 CFR 10.57 - Preservation of confidences and secrets of a client.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... and secrets of a client. 10.57 Section 10.57 Patents, Trademarks, and Copyrights UNITED STATES PATENT... confidences and secrets of a client. (a) “Confidence” refers to information protected by the attorney-client or agent-client privilege under applicable law. “Secret” refers to other information gained in the...

  17. 37 CFR 10.57 - Preservation of confidences and secrets of a client.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... and secrets of a client. 10.57 Section 10.57 Patents, Trademarks, and Copyrights UNITED STATES PATENT... confidences and secrets of a client. (a) “Confidence” refers to information protected by the attorney-client or agent-client privilege under applicable law. “Secret” refers to other information gained in the...

  18. 37 CFR 10.57 - Preservation of confidences and secrets of a client.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and secrets of a client. 10.57 Section 10.57 Patents, Trademarks, and Copyrights UNITED STATES PATENT... confidences and secrets of a client. (a) “Confidence” refers to information protected by the attorney-client or agent-client privilege under applicable law. “Secret” refers to other information gained in the...

  19. Next generation tools for genomic data generation, distribution, and visualization

    PubMed Central

    2010-01-01

    Background With the rapidly falling cost and availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Results Here we present three open-source, platform independent, software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. Conclusions These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq. PMID:20828407

  20. The new ALICE DQM client: a web access to ROOT-based objects

    NASA Astrophysics Data System (ADS)

    von Haller, B.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Delort, C.; Dénes, E.; Diviá, R.; Fuchs, U.; Niedziela, J.; Simonetti, G.; Soós, C.; Telesca, A.; Vande Vyvre, P.; Wegrzynek, A.

    2015-12-01

    A Large Ion Collider Experiment (ALICE) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The online Data Quality Monitoring (DQM) plays an essential role in the experiment operation by providing shifters with immediate feedback on the data being recorded in order to quickly identify and overcome problems. Immediate access to the DQM results is needed not only by shifters in the control room but also by detector experts worldwide. As a consequence, a new web application has been developed to dynamically display and manipulate the ROOT-based objects produced by the DQM system in a flexible and user-friendly interface. The architecture and design of the tool, its main features and the technologies that were used, both on the server and the client side, are described. In particular, we detail how we took advantage of the most recent ROOT JavaScript I/O and web server library to give interactive access to ROOT objects stored in a database. We also describe the use of modern web techniques and packages such as AJAX, DHTMLX and jQuery, which have been instrumental in the successful implementation of a reactive and efficient application. We finally present the resulting application and how code quality was ensured. We conclude with a roadmap for future technical and functional developments.

  1. Network oriented radiological and medical archive

    NASA Astrophysics Data System (ADS)

    Ferraris, M.; Frixione, P.; Squarcia, S.

    2001-10-01

    In this paper the basic ideas of NORMA (Network Oriented Radiological and Medical Archive) are discussed. NORMA is an original project built by a team of physicists in collaboration with radiologists in order to select the best treatment planning in radiotherapy. It allows physicians and health physicists working in different places to discuss interesting clinical cases, visualizing the same diagnostic images at the same time and highlighting zones of interest (tumors and organs at risk). NORMA has a client/server architecture in order to be platform independent. Applying World Wide Web technologies, it can be easily used by people with no specific computer knowledge, providing verbose help to guide the user through the right steps of execution. The client side is an applet while the server side is a Java application. In order to optimize execution, the project also includes a proprietary protocol, lying over the TCP/IP suite, that organizes data exchanges and control messages. Diagnostic images are retrieved from a relational database or from a standard DICOM (Digital Imaging and Communications in Medicine) PACS through the DICOM-WWW gateway, allowing connection of the usual Web browsers used by the NORMA system to DICOM applications via the HTTP protocol. Browser requests are sent to the gateway from the Web server through CGI (Common Gateway Interface). The DICOM software translates the requests into DICOM messages and organizes the communication with the remote DICOM application.
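    The DICOM-WWW gateway is driven by CGI requests issued from the Web server; a minimal sketch of how such a request URL might be assembled in Python. The script path and parameter names are hypothetical, not NORMA's actual interface:

```python
from urllib.parse import urlencode, urljoin

def gateway_query_url(base, patient_id, study_uid):
    """Build a CGI query URL for a hypothetical DICOM-WWW gateway script.
    The gateway would translate these parameters into DICOM query/retrieve
    messages toward the remote DICOM application."""
    params = urlencode({"patientID": patient_id,
                        "studyUID": study_uid,
                        "action": "retrieve"})
    return urljoin(base, "cgi-bin/dicomgw") + "?" + params

print(gateway_query_url("http://norma.example.org/", "P0042",
                        "1.2.840.113619.2.55.3"))
```

    The design choice the paper describes, an HTTP/CGI front end over DICOM, is what lets an unmodified browser-side applet talk to a PACS without speaking DICOM itself.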

  2. Automatic meta-data collection of STP observation data

    NASA Astrophysics Data System (ADS)

    Ishikura, S.; Kimura, E.; Murata, K.; Kubo, T.; Shinohara, I.

    2006-12-01

    For geoscience and STP (Solar-Terrestrial Physics) studies, various observations have been made by satellites and ground-based observatories. These data are saved and managed at many organizations, but there is no common procedure or rule for providing and/or sharing the data files. Researchers have had difficulty searching for and analyzing such different types of data distributed over the Internet. To support such cross-over analyses of observation data, we have developed the STARS (Solar-Terrestrial data Analysis and Reference System). The STARS consists of a client application (STARS-app), the meta-database (STARS-DB), the portal Web service (STARS-WS) and the download agent Web service (STARS DLAgent-WS). The STARS-DB includes directory information, access permissions, protocol information to retrieve data files, hierarchy information of mission/team/data and user information. Users of the STARS are able to download observation data files without knowing the locations of the files by using the STARS-DB. We have implemented the Portal-WS to retrieve meta-data from the meta-database. One reason we use a Web service is to overcome the variety of firewall restrictions, which have become stricter in recent years and now make it difficult for the STARS client application to access the STARS-DB directly by sending SQL queries. Using the Web service, we succeeded in placing the STARS-DB behind the Portal-WS and preventing it from being exposed on the Internet. The STARS accesses the Portal-WS by sending a SOAP (Simple Object Access Protocol) request over HTTP; meta-data are received as a SOAP response. The STARS DLAgent-WS provides clients with data files downloaded from data sites. The data files are provided with a variety of protocols (e.g., FTP, HTTP, FTPS and SFTP), individually selected at each site. The clients send a SOAP request with download request messages and receive observation data files as a SOAP response with a DIME attachment. By introducing the DLAgent-WS, we overcame the problem that the data management policies of each data site are independent. Another important issue is how to collect the meta-data of the observation data files. So far, STARS-DB managers have added new records to the meta-database and updated them manually; maintaining the meta-database in this way has been laborious because observation data are generated every day and the quantity of data files increases explosively. We have therefore attempted to automate the collection of the meta-data. In this research, we adopted RSS 1.0 (RDF Site Summary) as a format to exchange meta-data in the STP fields. RSS is an RDF vocabulary that provides a multipurpose, extensible meta-data description and is suitable for syndication of meta-data. Most of the data in the present study are described in the CDF (Common Data Format), a self-describing data format. We convert meta-information extracted from the CDF data files into RSS files. The program that generates the RSS files is executed on each data site server once a day, and the RSS files provide information on new data files. The RSS files are collected by an RSS collection server once a day, and the meta-data are stored in the STARS-DB.
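    The RSS-generation step can be illustrated with a short Python sketch. The element choices follow RSS 1.0's RDF structure, but the channel fields and file metadata below are invented examples, not the STARS program itself:

```python
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
RSS = "http://purl.org/rss/1.0/"

def metadata_to_rss(site_url, files):
    """Emit a minimal RSS 1.0 (RDF) document advertising new data files;
    `files` is a list of (filename, description) pairs extracted, e.g.,
    from CDF global attributes."""
    root = ET.Element(f"{{{RDF}}}RDF")
    channel = ET.SubElement(root, f"{{{RSS}}}channel",
                            {f"{{{RDF}}}about": site_url})
    ET.SubElement(channel, f"{{{RSS}}}title").text = "New observation data"
    ET.SubElement(channel, f"{{{RSS}}}link").text = site_url
    for name, desc in files:
        item = ET.SubElement(root, f"{{{RSS}}}item",
                             {f"{{{RDF}}}about": site_url + name})
        ET.SubElement(item, f"{{{RSS}}}title").text = name
        ET.SubElement(item, f"{{{RSS}}}description").text = desc
    return ET.tostring(root, encoding="unicode")

rss = metadata_to_rss("http://data.example.org/",
                      [("ge_k0_epic_20061201.cdf", "EPIC key parameters")])
print(rss)
```

    A collection server run daily then only needs to fetch each site's RSS file and upsert the items into the meta-database, which is the harvesting loop the abstract describes.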

  3. Development of Web-based Distributed Cooperative Development Environment of Sign-Language Animation System and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yuizono, Takaya; Hara, Kousuke; Nakayama, Shigeru

    A web-based distributed cooperative development environment for a sign-language animation system has been developed. We have extended the previous animation system, which was constructed as a three-tiered system consisting of a sign-language animation interface layer, a sign-language data processing layer, and a sign-language animation database. Two components, a web client using a VRML plug-in and a web servlet, are added to the previous system. The system can support a humanoid-model avatar for interoperability, and can use the stored sign-language animation data shared on the database. The evaluation of this system showed that the inverse kinematics function of the web client improves sign-language animation making.

  4. The Data Acquisition System of the Stockholm Educational Air Shower Array

    NASA Astrophysics Data System (ADS)

    Hofverberg, P.; Johansson, H.; Pearce, M.; Rydstrom, S.; Wikstrom, C.

    2005-12-01

    The Stockholm Educational Air Shower Array (SEASA) project is deploying an array of plastic scintillator detector stations on school roofs in the Stockholm area. Signals from GPS satellites are used to time-synchronise signals from the widely separated detector stations, allowing cosmic ray air showers to be identified and studied. A low-cost and highly scalable data acquisition system has been produced using embedded Linux processors which communicate station data to a central server running a MySQL database. Air shower data can be visualised in real time using a Java-applet client. It is also possible to query the database and manage detector stations from the client. In this paper, the design and performance of the system are described.

  5. Mobile object retrieval in server-based image databases

    NASA Astrophysics Data System (ADS)

    Manger, D.; Pagel, F.; Widak, H.

    2013-05-01

    The increasing number of mobile phones equipped with powerful cameras leads to huge collections of user-generated images. To utilize the information of the images on site, image retrieval systems are becoming more and more popular for searching for similar objects in one's own image database. As the computational performance and the memory capacity of mobile devices are constantly increasing, this search can often be performed on the device itself. This is feasible, for example, if the images are represented with global image features or if the search is done using EXIF or textual metadata. However, for larger image databases, if multiple users are meant to contribute to a growing image database, or if powerful content-based image retrieval methods with local features are required, a server-based image retrieval backend is needed. In this work, we present a content-based image retrieval system with a client-server architecture working with local features. On the server side, the scalability to large image databases is addressed with the popular bag-of-words model with state-of-the-art extensions. The client end of the system focuses on a lightweight user interface presenting the most similar images in the database and highlighting the visual information that is common with the query image. Additionally, new images can be added to the database, making it a powerful and interactive tool for mobile content-based image retrieval.
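    The server-side bag-of-words model quantizes each image's local descriptors against a visual-word codebook and compares the resulting histograms; a minimal pure-Python sketch of the quantization step, with toy 2-D descriptors standing in for real SIFT-like vectors:

```python
def bow_histogram(descriptors, codebook):
    """Quantize local feature descriptors onto a visual-word codebook and
    return an L1-normalized bag-of-words histogram."""
    hist = [0] * len(codebook)
    for d in descriptors:
        # Assign the descriptor to its nearest codeword (squared Euclidean).
        best = min(range(len(codebook)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(d, codebook[i])))
        hist[best] += 1
    total = sum(hist)
    return [h / total for h in hist]

codebook = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
image_descriptors = [(0.1, 0.0), (0.9, 1.1), (0.2, 0.1), (0.0, 0.8)]
print(bow_histogram(image_descriptors, codebook))
```

    In a production backend the codebook has many thousands of words learned by clustering, and an inverted index over the histograms keeps the nearest-image search scalable, which is the extension the paper builds on.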

  6. Surfing for Data: A Gathering Trend in Data Storage Is the Use of Web-Based Applications that Make It Easy for Authorized Users to Access Hosted Server Content with Just a Computing Device and Browser

    ERIC Educational Resources Information Center

    Technology & Learning, 2005

    2005-01-01

    In recent years, the widespread availability of networks and the flexibility of Web browsers have shifted the industry from a client-server model to a Web-based one. In the client-server model of computing, clients run applications locally, with the servers managing storage, printing functions, and network traffic. Because every client is…

  7. Applications For Real Time NOMADS At NCEP To Disseminate NOAA's Operational Model Data Base

    NASA Astrophysics Data System (ADS)

    Alpert, J. C.; Wang, J.; Rutledge, G.

    2007-05-01

    A wide range of environmental information, in digital form, with metadata descriptions and supporting infrastructure is contained in the NOAA Operational Modeling Archive Distribution System (NOMADS) and its Real Time (RT) project prototype at the National Centers for Environmental Prediction (NCEP). NOMADS is now delivering on its goal of a seamless framework, from archival to real-time data dissemination for NOAA's operational model data holdings. A process is under way to make NOMADS part of NCEP's operational production of products. A goal is to foster collaborations among the research and education communities, value-added retailers, and public access for science and development. In the National Research Council's "Completing the Forecast", Recommendation 3.4 states: "NOMADS should be maintained and extended to include (a) long-term archives of the global and regional ensemble forecasting systems at their native resolution, and (b) re-forecast datasets to facilitate post-processing." As one of many participants of NOMADS, NCEP serves the operational model data base using the data access protocol OPeNDAP and other services for participants to serve their data sets and for users to obtain them. Using the NCEP global ensemble data as an example, we show an OPeNDAP (also known as DODS) client application that provides a request-and-fulfill mechanism for access to the complex ensemble matrix of holdings. As an example of the DAP service, we show a client application which accesses the Global or Regional Ensemble data set to produce user-selected weather element event probabilities. The event probabilities are easily extended over model forecast time to show probability histograms defining the future trend of user-selected events. This approach ensures an efficient use of computer resources because users transmit only the data necessary for their tasks. Data sets are served by OPeNDAP, allowing commercial clients such as MATLAB or IDL, as well as freeware clients such as GrADS, to access the NCEP real-time database. We will demonstrate how users can use NOMADS services to repackage area subsets and select levels and variables that are sent to a user-selected FTP site. NOMADS can also display plots on demand for area subsets, selected levels, time series and selected variables.
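    The event-probability product described above reduces, per grid point and lead time, to the fraction of ensemble members meeting the event criterion; a minimal Python sketch with invented precipitation values:

```python
def event_probability(members, threshold):
    """Fraction of ensemble members forecasting the event
    (value >= threshold) at each forecast lead time; `members` is a list
    of per-member time series for one grid point."""
    n_members = len(members)
    n_times = len(members[0])
    return [sum(m[t] >= threshold for m in members) / n_members
            for t in range(n_times)]

# Toy 24 h precipitation forecasts (mm) from a 5-member ensemble
# at 3 lead times.
ensemble = [[0.0, 2.5, 8.0],
            [0.1, 4.0, 6.5],
            [0.0, 0.5, 3.0],
            [0.2, 3.5, 7.0],
            [0.0, 1.0, 9.5]]
print(event_probability(ensemble, threshold=2.0))
```

    Served through OPeNDAP, a client would subset exactly the members, levels and times it needs before computing this fraction, which is the "transmit only the data necessary" efficiency the abstract points to.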

  8. A Web application for the management of clinical workflow in image-guided and adaptive proton therapy for prostate cancer treatments.

    PubMed

    Yeung, Daniel; Boes, Peter; Ho, Meng Wei; Li, Zuofeng

    2015-05-08

    Image-guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X-rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk,(1) the daily post-treatment DIPS data were analyzed to determine if an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The designed features included state-of-the-art Web technologies, a domain model closely matching the workflow, a database supporting concurrency and data mining, access to the DIPS database, secure user access and role management, and graphing and analysis tools. The Model-View-Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client-side technologies, such as jQuery, jQuery plug-ins, and Ajax, were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logics. Data entry, analysis, workflow logics, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process.
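    The margin rationale referenced above can be illustrated with the widely cited van Herk recipe, margin = 2.5 Sigma + 0.7 sigma; whether the application uses exactly this formula is not stated in the abstract, and the shift data below are invented:

```python
from statistics import mean, pstdev

def van_herk_margin(per_patient_shifts):
    """CTV-to-PTV margin in the spirit of van Herk: 2.5 * Sigma + 0.7 * sigma,
    where Sigma is the SD of per-patient mean setup errors (systematic
    component) and sigma is the root-mean-square of per-patient SDs (random
    component). Shifts are in mm along one axis."""
    means = [mean(p) for p in per_patient_shifts]
    sds = [pstdev(p) for p in per_patient_shifts]
    big_sigma = pstdev(means)                       # systematic error Sigma
    small_sigma = mean(s ** 2 for s in sds) ** 0.5  # random error sigma
    return 2.5 * big_sigma + 0.7 * small_sigma

# Invented daily post-treatment shifts (mm) for three patients.
daily_shifts_mm = [[1.0, 1.5, 0.5, 1.0],
                   [-0.5, 0.0, -1.0, -0.5],
                   [2.0, 2.5, 1.5, 2.0]]
print(round(van_herk_margin(daily_shifts_mm), 2))
```

    In the workflow the paper describes, a computation of this kind over the accumulating DIPS records is what would trigger the decision that an adaptive plan is necessary.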

  9. Image Reference Database in Teleradiology: Migrating to WWW

    NASA Astrophysics Data System (ADS)

    Pasqui, Valdo

    The paper presents a multimedia Image Reference Data Base (IRDB) used in Teleradiology. The application was developed at the University of Florence in the framework of the European Community TELEMED Project. TELEMED overall goals and IRDB requirements are outlined and the resulting architecture is described. IRDB is a multisite database containing radiological images, selected for their scientific interest, and their related information. The architecture consists of a set of IRDB Installations which are accessed from Viewing Stations (VS) located at different medical sites. The interaction between VS and IRDB Installations follows the client-server paradigm and uses an OSI level-7 protocol named Telemed Communication Language. After reviewing the Florence prototype implementation and experimentation, IRDB migration to the World Wide Web (WWW) is discussed. A possible scenario for implementing IRDB on the basis of the WWW model is depicted in order to exploit the capabilities of WWW servers and browsers. Finally, the advantages of this conversion are outlined.

  10. 45 CFR 1621.1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Regulations Relating to Public Welfare (Continued) LEGAL SERVICES CORPORATION CLIENT GRIEVANCE PROCEDURES... assistance to clients as required by the LSC Act and are accountable to clients and applicants for legal... about the denial of legal assistance and clients about the manner or quality of legal assistance...

  11. Momentary assessment of interpersonal process in psychotherapy.

    PubMed

    Thomas, Katherine M; Hopwood, Christopher J; Woody, Erik; Ethier, Nicole; Sadler, Pamela

    2014-01-01

    To demonstrate how a novel computer joystick coding method can illuminate the study of interpersonal processes in psychotherapy sessions, we applied it to Shostrom's (1966) well-known films in which a client, Gloria, had sessions with 3 prominent psychotherapists. The joystick method, which records interpersonal behavior as nearly continuous flows on the plane defined by the interpersonal dimensions of control and affiliation, provides an excellent sampling of variability in each person's interpersonal behavior across the session. More important, it yields extensive information about the temporal dynamics that interrelate clients' and therapists' behaviors. Gloria's 3 psychotherapy sessions were characterized using time-series statistical indices and graphical representations. Results demonstrated that patterns of within-person variability tended to be markedly asymmetric, with a predominant, set-point-like interpersonal style from which deviations mostly occurred in just 1 direction (e.g., occasional submissive departures from a modal dominant style). In addition, across each session, the therapist and client showed strongly cyclical variations in both control and affiliation, and these oscillations were entrained to different extents depending on the therapist. We interpreted different patterns of moment-to-moment complementarity of interpersonal behavior in terms of different therapeutic goals, such as fostering a positive alliance versus disconfirming the client's interpersonal expectations. We also showed how this method can be used to provide a more detailed analysis of specific shorter segments from each of the sessions. Finally, we compared our approach to alternative techniques, such as act-to-act lagged relations and dynamic systems and pointed to a variety of possible research and training applications. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  12. Trainees' use of supervision for therapy with sexual minority clients: A qualitative study.

    PubMed

    Chui, Harold; McGann, Kevin J; Ziemer, Kathryn S; Hoffman, Mary Ann; Stahl, Jessica

    2018-01-01

    In the supervision literature, research on sexual orientation considerations often focuses on sexual minority supervisees and less often on their work with sexual minority clients. Yet both heterosexual and sexual minority supervisees serve sexual minority clients and may have different supervision needs. Twelve predoctoral interns from 12 APA-accredited counseling center internships were interviewed about how they made use of supervision for their work with a sexual minority client. The sample consisted of 6 heterosexual-identified supervisees and 6 supervisees who identified as lesbian, gay, or queer (LGQ). Data were analyzed using consensual qualitative research. All participants reported positive gains from supervision that carried over to their work with heterosexual and sexual minority clients, even when not all supervisors disclosed or discussed their own sexual orientation. Heterosexual supervisees used supervision to ensure that their heterosexuality does not interfere with an affirmative experience for their sexual minority client, whereas LGQ supervisees used supervision to explore differences in sexual identity development between themselves and their client to minimize the negative impact of overidentification. Thus, affirmative supervision may unfold with different foci depending on supervisees' sexual identity. Implications for training and supervision are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. Counseling services for Asian, Latino/a, and White American students: Initial severity, session attendance, and outcome.

    PubMed

    Kim, Jin E; Park, Samuel S; La, Amy; Chang, Jenss; Zane, Nolan

    2016-07-01

    The current study examined racial/ethnic differences in initial severity, session attendance, and counseling outcomes in a large and diverse sample of Asian American, Latino/a, and White student clients who utilized university counseling services between 2008 and 2012. We used archival data of 5,472 clients (62% female; M age = 23.1, SD = 4.3) who self-identified their race/ethnicity as being Asian American (38.9%), Latino/a (14.9%), or White (46.2%). Treatment engagement was measured by the number of counseling sessions attended; initial severity and treatment outcome were measured using the Outcome Questionnaire-45. Asian American clients, particularly Chinese, Filipino/a, Korean, and Vietnamese Americans, had greater initial severity compared with White clients. Asian Indian, Korean, and Vietnamese American clients used significantly fewer sessions of counseling than White clients after controlling for initial severity. All racial/ethnic minority groups continued to have clinically significant distress in certain areas (e.g., social role functioning) at counseling termination. These findings highlight the need to devote greater attention to the counseling experiences of racial/ethnic minority clients, especially certain Asian American groups. Further research directions are provided. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  14. Integrated Controlling System and Unified Database for High Throughput Protein Crystallography Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaponov, Yu.A.; Igarashi, N.; Hiraki, M.

    2004-05-12

    An integrated controlling system and a unified database for high throughput protein crystallography experiments have been developed. The main features of protein crystallography experiments (purification, crystallization, crystal harvesting, data collection, data processing) were integrated into the software under development. All information necessary to perform protein crystallography experiments is stored in a MySQL relational database (except raw X-ray data, which are stored in a central data server). The database contains four mutually linked hierarchical trees describing protein crystals, data collection of protein crystals and experimental data processing. A database editor was designed and developed. The editor supports basic database functions to view, create, modify and delete user records in the database. Two search engines were realized: direct search of necessary information in the database and object-oriented search. The system is based on TCP/IP secure UNIX sockets with four predefined sending and receiving behaviors, which support communications between all connected servers and clients with remote control functions (creating and modifying data for experimental conditions, data acquisition, viewing experimental data, and performing data processing). Two secure login schemes were designed and developed: a direct method (using the developed Linux clients with secure connection) and an indirect method (using a secure SSL connection with secure X11 support from any operating system with X-terminal and SSH support). A part of the system has been implemented on a new MAD beamline, NW12, at the Photon Factory Advanced Ring for general user experiments.
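    The socket protocol with predefined sending and receiving behaviors can be caricatured with a minimal request/response exchange over a local TCP socket; the JSON command vocabulary here is invented for illustration and is not the system's actual protocol:

```python
import json
import socket
import threading

def serve_once(sock):
    """Toy stand-in for the controlling system's server side: accept one
    connection, read a newline-terminated JSON command, answer with JSON."""
    conn, _ = sock.accept()
    with conn:
        request = json.loads(conn.makefile().readline())
        if request.get("command") == "view":
            reply = {"status": "ok", "record": {"crystal": request["id"]}}
        else:
            reply = {"status": "error"}
        conn.sendall((json.dumps(reply) + "\n").encode())

server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral local port
server.listen(1)
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())
client.sendall(b'{"command": "view", "id": "xtal-17"}\n')
reply_line = client.makefile().readline().strip()
print(reply_line)
client.close()
server.close()
```

    The real system layers authentication and four fixed message behaviors over connections of this shape; the sketch shows only the request-then-reply skeleton.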

  15. Distributed cyberinfrastructure tools for automated data processing of structural monitoring data

    NASA Astrophysics Data System (ADS)

    Zhang, Yilan; Kurata, Masahiro; Lynch, Jerome P.; van der Linden, Gwendolyn; Sederat, Hassan; Prakash, Atul

    2012-04-01

    The emergence of cost-effective sensing technologies has now enabled the use of dense arrays of sensors to monitor the behavior and condition of large-scale bridges. The continuous operation of dense networks of sensors presents a number of new challenges, including how to manage the massive amounts of data that can be created by the system. This paper reports on the progress of the creation of cyberinfrastructure tools which hierarchically control networks of wireless sensors deployed in a long-span bridge. The internet-enabled cyberinfrastructure is centrally managed by a powerful database which controls the flow of data in the entire monitoring system architecture. A client-server model built upon the database provides both data providers and system end-users with secured access to various levels of information about a bridge. In the system, information on bridge behavior (e.g., acceleration, strain, displacement) and environmental condition (e.g., wind speed, wind direction, temperature, humidity) is uploaded to the database from sensor networks installed in the bridge. Then, data interrogation services interface with the database via client APIs to autonomously process data. The current research effort focuses on an assessment of the scalability and long-term robustness of the proposed cyberinfrastructure framework, which has been implemented along with a permanent wireless monitoring system on the New Carquinez (Alfred Zampa Memorial) Suspension Bridge in Vallejo, CA. Many data interrogation tools are under development using sensor data and bridge metadata (e.g., geometric details, material properties). Sample data interrogation clients include those for the detection of faulty sensors and for automated modal parameter extraction.
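    A faulty-sensor detection client of the kind mentioned above might, at its simplest, flag flatlined or spiking channels; a toy Python sketch with invented thresholds and readings, not the deployed system's actual logic:

```python
from statistics import pstdev

def find_faulty_sensors(readings, flatline_tol=1e-6, spike_limit=100.0):
    """Toy interrogation client: flag channels that flatline (near-zero
    variance, e.g. a dead accelerometer) or spike outside a plausible
    physical range. Returns (sensor, fault_type) pairs."""
    faulty = []
    for sensor, series in readings.items():
        if pstdev(series) < flatline_tol:
            faulty.append((sensor, "flatline"))
        elif max(abs(x) for x in series) > spike_limit:
            faulty.append((sensor, "spike"))
    return faulty

# Invented acceleration samples (milli-g) from three wireless channels.
accel_mg = {"A1": [1.2, -0.8, 0.9, -1.1],
            "A2": [0.0, 0.0, 0.0, 0.0],
            "A3": [0.5, 250.0, -0.4, 0.2]}
print(find_faulty_sensors(accel_mg))
```

    In the framework described above, such a client would run against the central database through its API and write its verdicts back, so that downstream tools can exclude bad channels automatically.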

  16. Access to DNA and protein databases on the Internet.

    PubMed

    Harper, R

    1994-02-01

    During the past year, the number of biological databases that can be queried via Internet has dramatically increased. This increase has resulted from the introduction of networking tools, such as Gopher and WAIS, that make it easy for research workers to index databases and make them available for on-line browsing. Biocomputing in the nineties will see the advent of more client/server options for the solution of problems in bioinformatics.

  17. A proposed group management scheme for XTP multicast

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    The purpose of a group management scheme is to enable its associated transfer layer protocol to be responsive to user determined reliability requirements for multicasting. Group management (GM) must assist the client process in coordinating multicast group membership, allow the user to express the subset of the multicast group that a particular multicast distribution must reach in order to be successful (reliable), and provide the transfer layer protocol with the group membership information necessary to guarantee delivery to this subset. GM provides services and mechanisms that respond to the need of the client process or process level management protocols to coordinate, modify, and determine attributes of the multicast group, especially membership. XTP GM provides a link between process groups and their multicast groups by maintaining a group membership database that identifies members in a name space understood by the underlying transfer layer protocol. Other attributes of the multicast group useful to both the client process and the data transfer protocol may be stored in the database. Examples include the relative dispersion, most recent update, and default delivery parameters of a group.
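The membership-database idea can be illustrated with a minimal sketch; the table layout and the success rule below are hypothetical stand-ins, not XTP's actual mechanism:

```python
# Illustrative group-membership database: a process group maps to
# transfer-layer addresses plus group attributes, and a multicast counts as
# reliable only when every member of the user-selected subset is reached.
group_db = {
    "sensor-group": {
        "members": {"node-a": "10.0.0.1", "node-b": "10.0.0.2",
                    "node-c": "10.0.0.3"},
        "attributes": {"default_delivery": "best-effort"},
    }
}

def multicast_ok(group, must_reach, acked):
    """must_reach: member names required for success; acked: names that replied."""
    members = set(group_db[group]["members"])
    required = set(must_reach) & members      # ignore names not in the group
    return required <= set(acked)

print(multicast_ok("sensor-group", ["node-a", "node-b"],
                   ["node-a", "node-b", "node-c"]))   # True
print(multicast_ok("sensor-group", ["node-a", "node-b"],
                   ["node-a"]))                       # False
```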

  18. IAIMS Architecture

    PubMed Central

    Hripcsak, George

    1997-01-01

    Abstract An information system architecture defines the components of a system and the interfaces among the components. A good architecture is essential for creating an Integrated Advanced Information Management System (IAIMS) that works as an integrated whole yet is flexible enough to accommodate many users and roles, multiple applications, changing vendors, evolving user needs, and advancing technology. Modularity and layering promote flexibility by reducing the complexity of a system and by restricting the ways in which components may interact. Enterprise-wide mediation promotes integration by providing message routing, support for standards, dictionary-based code translation, a centralized conceptual data schema, business rule implementation, and consistent access to databases. Several IAIMS sites have adopted a client-server architecture, and some have adopted a three-tiered approach, separating user interface functions, application logic, and repositories. PMID:9067884

  19. Integrated opioid substitution therapy and HIV care: a qualitative systematic review and synthesis of client and provider experiences.

    PubMed

    Guise, Andy; Seguin, Maureen; Mburu, Gitau; McLean, Susie; Grenfell, Pippa; Islam, Zahed; Filippovych, Sergii; Assan, Happy; Low, Andrea; Vickerman, Peter; Rhodes, Tim

    2017-09-01

    People who use drugs in many contexts have limited access to opioid substitution therapy and HIV care. Service integration is one strategy identified to support increased access. We reviewed and synthesized literature exploring client and provider experiences of integrated opioid substitution therapy and HIV care to identify acceptable approaches to care delivery. We systematically reviewed qualitative literature. We searched nine bibliographic databases, supplemented by manual searches of reference lists of articles from the database search, relevant journals, conferences, key organizations and consultation with experts. Thematic synthesis was used to develop descriptive themes in client and provider experiences. The search yielded 11 articles for inclusion, along with 8 expert and policy reports. We identify five descriptive themes: the convenience and comprehensive nature of co-located care, contrasting care philosophies and their role in shaping integration, the limits to disclosure and communication between clients and providers, opioid substitution therapy enabling HIV care access and engagement, and health system challenges to delivering integrated services. The discussion explores how integrated opioid substitution therapy and HIV care needs to adapt to specific social conditions, rather than following universal approaches. We identify priorities for future research. Acceptable integrated opioid substitution therapy and HIV care for people who use drugs and providers is most likely through co-located care and relies upon attention to stigma, supportive relationships and client centred cultures of delivery. Further research is needed to understand experiences of integrated care, particularly delivery in low and middle income settings and models of care focused on community and non-clinic based delivery.

  20. 76 FR 63974 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-14

    ... charges to assess a fee for each CMI Login ID. Firms may access CBOEdirect via either a CMI Client... Login IDs, accessing the same CMI Client Application Server, allowing the firm to only pay the monthly fee once. Alternatively, a firm may use the same Login ID to access different CMI Client Application...

  1. Building a Massive Volcano Archive and the Development of a Tool for the Science Community

    NASA Technical Reports Server (NTRS)

    Linick, Justin

    2012-01-01

    The Jet Propulsion Laboratory has traditionally housed one of the world's largest databases of volcanic satellite imagery, the ASTER Volcano Archive (10Tb), making these data accessible online for public and scientific use. However, a series of changes in how satellite imagery is housed by the Earth Observing System (EOS) Data Information System has meant that JPL has been unable to systematically maintain its database for the last several years. We have provided a fast, transparent, machine-to-machine client that has updated JPL's database and will keep it current in near real-time. The development of this client has also given us the capability to retrieve any data provided by NASA's Earth Observing System Clearinghouse (ECHO) that covers a volcanic event reported by U.S. Air Force Weather Agency (AFWA). We will also provide a publicly available tool that interfaces with ECHO that can provide functionality not available in any of ECHO's Earth science discovery tools.

  2. Towards Direct Manipulation and Remixing of Massive Data: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2012-04-01

Complex analytics on "big data" is one of the core challenges of current Earth science, generating strong requirements for on-demand processing and filtering of massive data sets. Issues under discussion include flexibility, performance, scalability, and the heterogeneity of the information types involved. In other domains, high-level query languages (such as those offered by database systems) have proven successful in the quest for flexible, scalable data access interfaces to massive amounts of data. However, due to the lack of support for many of the Earth science data structures, database systems are only used for registries and catalogs, but not for the bulk of spatio-temporal data. One core information category in this field is given by coverage data. ISO 19123 defines coverages, simplifying, as a representation of a "space-time varying phenomenon". This model can express a large class of Earth science data structures, including rectified and non-rectified rasters, curvilinear grids, point clouds, TINs, general meshes, trajectories, surfaces, and solids. This abstract definition, which is too high-level to establish interoperability, is concretized by the OGC GML 3.2.1 Application Schema for Coverages Standard into an interoperable representation. The OGC Web Coverage Processing Service (WCPS) Standard defines a declarative query language on multi-dimensional raster-type coverages, such as 1D in-situ sensor timeseries, 2D EO imagery, 3D x/y/t image time series and x/y/z geophysical data, 4D x/y/z/t climate and ocean data. Hence, important ingredients for versatile coverage retrieval are given - however, this potential has not been fully unleashed by service architectures up to now. The EU FP7-INFRA project EarthServer, launched in September 2011, aims at enabling standards-based on-demand analytics over the Web for Earth science data based on an integration of W3C XQuery for alphanumeric data and OGC-WCPS for raster data.
Ultimately, EarthServer will support all OGC coverage types. The platform used by EarthServer is the rasdaman raster database system. To exploit heterogeneous multi-parallel platforms, automatic request distribution and orchestration is being established. Client toolkits are under development that will allow bespoke interactive clients to be composed quickly, ranging from mobile devices over Web clients to high-end immersive virtual reality. The EarthServer platform has been deployed in six large-scale data centres with the aim of setting up Lighthouse Applications addressing all Earth Sciences, including satellite and airborne earth observation as well as use cases from atmosphere, ocean, snow, and ice monitoring, and geology on Earth and Mars. These services, each of which will ultimately host at least 100 TB, will form a peer cloud with distributed query processing for arbitrarily mixing database and in-situ access. With its ability to directly manipulate, analyze and remix massive data, the goal of EarthServer is to lift the data providers' semantic level from data stewardship to service stewardship.

  3. A complete history of everything

    NASA Astrophysics Data System (ADS)

    Lanclos, Kyle; Deich, William T. S.

    2012-09-01

    This paper discusses Lick Observatory's local solution for retaining a complete history of everything. Leveraging our existing deployment of a publish/subscribe communications model that is used to broadcast the state of all systems at Lick Observatory, a monitoring daemon runs on a dedicated server that subscribes to and records all published messages. Our success with this system is a testament to the power of simple, straightforward approaches to complex problems. The solution itself is written in Python, and the initial version required about a week of development time; the data are stored in PostgreSQL database tables using a distinctly simple schema. Over time, we addressed scaling issues as the data set grew, which involved reworking the PostgreSQL database schema on the back-end. We also duplicate the data in flat files to enable recovery or migration of the data from one server to another. This paper will cover both the initial design as well as the solutions to the subsequent deployment issues, the trade-offs that motivated those choices, and the integration of this history database with existing client applications.
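The recorder's distinctly simple schema might look like the following sketch, with SQLite standing in for the PostgreSQL back-end and a plain message list standing in for the observatory's publish/subscribe client; all names are illustrative:

```python
# Sketch of a "record everything" daemon: every published (keyword, value)
# message is appended to one history table with a timestamp.
import sqlite3
import time

def make_history_db(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS history (
                      keyword TEXT NOT NULL,
                      value   TEXT NOT NULL,
                      stamp   REAL NOT NULL)""")
    return db

def record(db, messages):
    # History is append-only; nothing is ever updated or deleted.
    for keyword, value in messages:
        db.execute("INSERT INTO history VALUES (?, ?, ?)",
                   (keyword, str(value), time.time()))
    db.commit()

db = make_history_db()
record(db, [("dome.azimuth", 143.2), ("dome.azimuth", 143.5)])
rows = db.execute("SELECT value FROM history WHERE keyword = ? ORDER BY stamp",
                  ("dome.azimuth",)).fetchall()
print([v for (v,) in rows])   # ['143.2', '143.5']
```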

  4. 75 FR 66797 - PricewaterhouseCoopers LLP (“PwC”) Internal Firm Services Client Account Administrators Group...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    ... LLP (``PwC'') Internal Firm Services Client Account Administrators Group, Charlotte, NC; Amended... Firm Services Client Account Administrators Group. Accordingly, the Department is amending this... Firm Services Client Account Administrators Group. The amended notice applicable to TA-W-73,608 is...

  5. 34 CFR 370.43 - What requirement applies to the use of mediation procedures?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (Continued) OFFICE OF SPECIAL EDUCATION AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION CLIENT... client or client applicant who is a party to the mediation; and (2) Has not previously advocated for or otherwise represented or been involved with advocating for or otherwise representing that same client or...

  6. 31 CFR 8.34 - Knowledge of client's omission.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Knowledge of client's omission. 8.34... client's omission. Each attorney, certified public accountant, or enrolled practitioner who knows that a client has not complied with applicable law, or has made an error in or omission from any document...

  7. 75 FR 66796 - Pricewaterhousecoopers LLP (“PwC”), Internal Firm Services Client Account Administrators Group...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    ... LLP (``PwC''), Internal Firm Services Client Account Administrators Group Atlanta, GA; Amended...''), Internal Firm Services Client Account Administrators Group. Accordingly, the Department is amending this... Firm Services Client Account Administrators Group. The amended notice applicable to TA-W-73,630 is...

  8. RIMS: An Integrated Mapping and Analysis System with Applications to Earth Sciences and Hydrology

    NASA Astrophysics Data System (ADS)

    Proussevitch, A. A.; Glidden, S.; Shiklomanov, A. I.; Lammers, R. B.

    2011-12-01

A web-based information and computational system for analysis of spatially distributed Earth system, climate, and hydrologic data has been developed. The system allows visualization, data exploration, querying, manipulation and arbitrary calculations with any loaded gridded or vector polygon dataset. The system's acronym, RIMS, stands for its core functionality as a Rapid Integrated Mapping System. The system can be deployed for global-scale projects as well as for regional hydrology and climatology studies. In particular, the Water Systems Analysis Group of the University of New Hampshire developed the global and regional (Northern Eurasia, pan-Arctic) versions of the system with different map projections and specific data. The system has demonstrated its potential for applications in other fields of Earth sciences and education. The key Web server/client components of the framework include (a) a visualization engine built on Open Source libraries (GDAL, PROJ.4, etc.) that are utilized in a MapServer; (b) multi-level data querying tools built on XML server-client communication protocols that allow downloading map data on-the-fly to a client web browser; and (c) data manipulation and grid cell level calculation tools that mimic desktop GIS software functionality via a web interface. Server side data management of the system is designed around a simple database of dataset metadata facilitating mounting of new data to the system and maintaining existing data in an easy manner. RIMS contains "built-in" river network data that allows upstream areas to be queried on demand, which can be used for spatial data aggregation and analysis of sub-basin areas. RIMS is an ongoing effort and is currently being used to serve a number of websites hosting a suite of hydrologic, environmental and other GIS data.
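The on-demand upstream-area query can be sketched as a traversal over a downstream-neighbor table; the cell ids, table, and function name below are illustrative assumptions, not RIMS internals:

```python
# Given each grid cell's downstream neighbor, collect all cells upstream of a
# query cell so their values can be aggregated over the sub-basin.
from collections import defaultdict

downstream = {"c1": "c3", "c2": "c3", "c3": "c5", "c4": "c5"}

def upstream_cells(outlet):
    # Invert the downstream table into an inflow adjacency map ...
    inflows = defaultdict(set)
    for src, dst in downstream.items():
        inflows[dst].add(src)
    # ... then walk upstream from the outlet with a depth-first traversal.
    found, stack = set(), [outlet]
    while stack:
        cell = stack.pop()
        for up in inflows[cell]:
            if up not in found:
                found.add(up)
                stack.append(up)
    return found

print(sorted(upstream_cells("c5")))   # ['c1', 'c2', 'c3', 'c4']
```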

  9. Integration of treatment innovation planning and implementation: strategic process models and organizational challenges.

    PubMed

    Lehman, Wayne E K; Simpson, D Dwayne; Knight, Danica K; Flynn, Patrick M

    2011-06-01

    Sustained and effective use of evidence-based practices in substance abuse treatment services faces both clinical and contextual challenges. Implementation approaches are reviewed that rely on variations of plan-do-study-act (PDSA) cycles, but most emphasize conceptual identification of core components for system change strategies. A two-phase procedural approach is therefore presented based on the integration of Texas Christian University (TCU) models and related resources for improving treatment process and program change. Phase 1 focuses on the dynamics of clinical services, including stages of client recovery (cross-linked with targeted assessments and interventions), as the foundations for identifying and planning appropriate innovations to improve efficiency and effectiveness. Phase 2 shifts to the operational and organizational dynamics involved in implementing and sustaining innovations (including the stages of training, adoption, implementation, and practice). A comprehensive system of TCU assessments and interventions for client and program-level needs and functioning are summarized as well, with descriptions and guidelines for applications in practical settings. (PsycINFO Database Record (c) 2011 APA, all rights reserved).

  10. Scalable and expressive medical terminologies.

    PubMed

    Mays, E; Weida, R; Dionne, R; Laker, M; White, B; Liang, C; Oles, F J

    1996-01-01

The K-Rep system, based on description logic, is used to represent and reason with large and expressive controlled medical terminologies. Expressive concept descriptions incorporate semantically precise definitions composed using logical operators, together with important non-semantic information such as synonyms and codes. Examples are drawn from our experience with K-Rep in modeling the InterMed laboratory terminology and also developing a large clinical terminology now in production use at Kaiser-Permanente. System-level scalability of performance is achieved through an object-oriented database system which efficiently maps persistent memory to virtual memory. Equally important is conceptual scalability-the ability to support collaborative development, organization, and visualization of a substantial terminology as it evolves over time. K-Rep addresses this need by logically completing concept definitions and automatically classifying concepts in a taxonomy via subsumption inferences. The K-Rep system includes a general-purpose GUI environment for terminology development and browsing, a custom interface for formulary term maintenance, a C++ application program interface, and a distributed client-server mode which provides lightweight clients with efficient run-time access to K-Rep by means of a scripting language.

  11. The Resource Manager the ATLAS Trigger and Data Acquisition System

    NASA Astrophysics Data System (ADS)

    Aleksandrov, I.; Avolio, G.; Lehmann Miotto, G.; Soloviev, I.

    2017-10-01

The Resource Manager is one of the core components of the Data Acquisition system of the ATLAS experiment at the LHC. The Resource Manager marshals the right for applications to access resources which may exist in multiple but limited copies, in order to avoid conflicts due to program faults or operator errors. The access to resources is managed in a manner similar to what a lock manager would do in other software systems. All the available resources and their association to software processes are described in the Data Acquisition configuration database. The Resource Manager is queried about the availability of resources every time an application needs to be started. The Resource Manager’s design is based on a client-server model, hence it consists of two components: the Resource Manager “server” application and the “client” shared library. The Resource Manager server implements all the needed functionalities, while the Resource Manager client library provides remote access to the “server” (i.e., to allocate and free resources, to query about the status of resources). During the LHC’s Long Shutdown period, the Resource Manager’s requirements have been reviewed in the light of the experience gained during the LHC’s Run 1. As a consequence, the Resource Manager has undergone a full re-design and re-implementation cycle with the result of a reduction of the code base by 40% with respect to the previous implementation. This contribution will focus on the way the design and the implementation of the Resource Manager could leverage the new features available in the C++11 standard, and how the introduction of external libraries (like Boost multi-container) led to a more maintainable system.
Additionally, particular attention will be given to the technical solutions adopted to ensure that the Resource Manager can sustain the typical request rates of the Data Acquisition system: about 30000 requests in a time window of a few seconds, coming from more than 1000 clients.
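The lock-manager-style allocation described above can be sketched as follows; the class, method names, and resource names are illustrative, not the ATLAS TDAQ API:

```python
# Each resource exists in a limited number of copies; a client may allocate
# one only while a copy is free, and must free it before another waiting
# client can succeed.
class ResourceManager:
    def __init__(self, resources):
        self.capacity = dict(resources)           # resource -> max copies
        self.holders = {r: set() for r in resources}

    def allocate(self, resource, client):
        held = self.holders[resource]
        if client in held:
            raise RuntimeError(f"{client} already holds {resource}")
        if len(held) >= self.capacity[resource]:
            return False                          # all copies in use
        held.add(client)
        return True

    def free(self, resource, client):
        self.holders[resource].discard(client)

rm = ResourceManager({"ROS-readout": 2})
print(rm.allocate("ROS-readout", "app-1"))   # True
print(rm.allocate("ROS-readout", "app-2"))   # True
print(rm.allocate("ROS-readout", "app-3"))   # False (both copies taken)
rm.free("ROS-readout", "app-1")
print(rm.allocate("ROS-readout", "app-3"))   # True
```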

  12. Accessing and distributing EMBL data using CORBA (common object request broker architecture).

    PubMed

    Wang, L; Rodriguez-Tomé, P; Redaschi, N; McNeil, P; Robinson, A; Lijnzaad, P

    2000-01-01

    The EMBL Nucleotide Sequence Database is a comprehensive database of DNA and RNA sequences and related information traditionally made available in flat-file format. Queries through tools such as SRS (Sequence Retrieval System) also return data in flat-file format. Flat files have a number of shortcomings, however, and the resources therefore currently lack a flexible environment to meet individual researchers' needs. The Object Management Group's common object request broker architecture (CORBA) is an industry standard that provides platform-independent programming interfaces and models for portable distributed object-oriented computing applications. Its independence from programming languages, computing platforms and network protocols makes it attractive for developing new applications for querying and distributing biological data. A CORBA infrastructure developed by EMBL-EBI provides an efficient means of accessing and distributing EMBL data. The EMBL object model is defined such that it provides a basis for specifying interfaces in interface definition language (IDL) and thus for developing the CORBA servers. The mapping from the object model to the relational schema in the underlying Oracle database uses the facilities provided by PersistenceTM, an object/relational tool. The techniques of developing loaders and 'live object caching' with persistent objects achieve a smart live object cache where objects are created on demand. The objects are managed by an evictor pattern mechanism. The CORBA interfaces to the EMBL database address some of the problems of traditional flat-file formats and provide an efficient means for accessing and distributing EMBL data. CORBA also provides a flexible environment for users to develop their applications by building clients to our CORBA servers, which can be integrated into existing systems.
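The 'live object caching' idea, with objects created on demand and evicted once the cache grows too large, can be sketched as a bounded cache; here a simple LRU policy stands in for the evictor pattern, and all names are illustrative rather than the EBI implementation:

```python
# Objects are loaded from the underlying store only on first access; the
# least recently used object is evicted when the cache exceeds its bound.
from collections import OrderedDict

class LiveObjectCache:
    def __init__(self, loader, max_objects=2):
        self.loader = loader             # fetches an object from the database
        self.max_objects = max_objects
        self.cache = OrderedDict()

    def get(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)          # mark as recently used
        else:
            self.cache[key] = self.loader(key)   # create on demand
            if len(self.cache) > self.max_objects:
                self.cache.popitem(last=False)   # evict least recently used
        return self.cache[key]

loads = []
cache = LiveObjectCache(lambda acc: loads.append(acc) or f"entry:{acc}")
cache.get("X56734"); cache.get("U00096"); cache.get("X56734")
print(loads)   # ['X56734', 'U00096'] -- the second X56734 lookup hit the cache
```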

  13. Accessing and distributing EMBL data using CORBA (common object request broker architecture)

    PubMed Central

    Wang, Lichun; Rodriguez-Tomé, Patricia; Redaschi, Nicole; McNeil, Phil; Robinson, Alan; Lijnzaad, Philip

    2000-01-01

    Background: The EMBL Nucleotide Sequence Database is a comprehensive database of DNA and RNA sequences and related information traditionally made available in flat-file format. Queries through tools such as SRS (Sequence Retrieval System) also return data in flat-file format. Flat files have a number of shortcomings, however, and the resources therefore currently lack a flexible environment to meet individual researchers' needs. The Object Management Group's common object request broker architecture (CORBA) is an industry standard that provides platform-independent programming interfaces and models for portable distributed object-oriented computing applications. Its independence from programming languages, computing platforms and network protocols makes it attractive for developing new applications for querying and distributing biological data. Results: A CORBA infrastructure developed by EMBL-EBI provides an efficient means of accessing and distributing EMBL data. The EMBL object model is defined such that it provides a basis for specifying interfaces in interface definition language (IDL) and thus for developing the CORBA servers. The mapping from the object model to the relational schema in the underlying Oracle database uses the facilities provided by PersistenceTM, an object/relational tool. The techniques of developing loaders and 'live object caching' with persistent objects achieve a smart live object cache where objects are created on demand. The objects are managed by an evictor pattern mechanism. Conclusions: The CORBA interfaces to the EMBL database address some of the problems of traditional flat-file formats and provide an efficient means for accessing and distributing EMBL data. CORBA also provides a flexible environment for users to develop their applications by building clients to our CORBA servers, which can be integrated into existing systems. PMID:11178259

  14. Rapid HIS, RIS, PACS Integration Using Graphical CASE Tools

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Breant, Claudine M.; Stepczyk, Frank M.; Kho, Hwa T.; Valentino, Daniel J.; Tashima, Gregory H.; Materna, Anthony T.

    1994-05-01

    We describe the clinical requirements of the integrated federation of databases and present our client-mediator-server design. The main body of the paper describes five important aspects of integrating information systems: (1) global schema design, (2) establishing sessions with remote database servers, (3) development of schema translators, (4) integration of global system triggers, and (5) development of job workflow scripts.

  15. Application of the human needs conceptual model to dental hygiene practice.

    PubMed

    Darby, M L; Walsh, M M

    2000-01-01

The Human Needs Conceptual Model is relevant to dental hygiene because of the need for dental hygienists to be client focused, humanistic, and accountable in practice. Application of the Human Needs Conceptual Model provides a formal framework for identifying and understanding the unique needs of the client that can be met through dental hygiene care. Practitioners find that the Human Needs Conceptual Model can help them not only in assessment and diagnosis, but also in client education, decision-making, care implementation, and the evaluation of treatment outcomes. By using the model, the dental hygienist is able to manage client care humanistically and holistically, and ensure that care is client-centered rather than task-oriented. With the model, a professional practice can be made operational.

  16. A qualitative meta-analysis examining clients' experiences of psychotherapy: A new agenda.

    PubMed

    Levitt, Heidi M; Pomerville, Andrew; Surace, Francisco I

    2016-08-01

    This article argues that psychotherapy practitioners and researchers should be informed by the substantive body of qualitative evidence that has been gathered to represent clients' own experiences of therapy. The current meta-analysis examined qualitative research studies analyzing clients' experiences within adult individual psychotherapy that appeared in English-language journals. This omnibus review integrates research from across psychotherapy approaches and qualitative methods, focusing on the cross-cutting question of how clients experience therapy. It utilized an innovative method in which 67 studies were subjected to a grounded theory meta-analysis in order to develop a hierarchy of data and then 42 additional studies were added into this hierarchy using a content meta-analytic method-summing to 109 studies in total. Findings highlight the critical psychotherapy experiences for clients, based upon robust findings across these research studies. Process-focused principles for practice are generated that can enrich therapists' understanding of their clients in key clinical decision-making moments. Based upon these findings, an agenda is suggested in which research is directed toward heightening therapists' understanding of clients and recognizing them as agents of change within sessions, supporting the client as self-healer paradigm. This research aims to improve therapists' sensitivity to clients' experiences and thus can expand therapists' attunement and intentionality in shaping interventions in accordance with whichever theoretical orientation is in use. The article advocates for the full integration of the qualitative literature in psychotherapy research in which variables are conceptualized in reference to an understanding of clients' experiences in sessions. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. Design and deployment of a large brain-image database for clinical and nonclinical research

    NASA Astrophysics Data System (ADS)

    Yang, Guo Liang; Lim, Choie Cheio Tchoyoson; Banukumar, Narayanaswami; Aziz, Aamer; Hui, Francis; Nowinski, Wieslaw L.

    2004-04-01

An efficient database is an essential component of organizing diverse information on image metadata and patient information for research in medical imaging. This paper describes the design, development and deployment of a large database system serving as a brain image repository that can be used across different platforms in various medical researches. It forms the infrastructure that links hospitals and institutions together and shares data among them. The database contains patient-, pathology-, image-, research- and management-specific data. The functionalities of the database system include image uploading, storage, indexing, downloading and sharing as well as database querying and management with security and data anonymization concerns well taken care of. The structure of the database is a multi-tier client-server architecture with Relational Database Management System, Security Layer, Application Layer and User Interface. Image source adapter has been developed to handle most of the popular image formats. The database has a user interface based on web browsers and is easy to handle. We have used the Java programming language for its platform independence and vast function libraries. The brain image database can sort data according to clinically relevant information. This can be effectively used in research from the clinicians' points of view. The database is suitable for validation of algorithms on large populations of cases. Medical images for processing could be identified and organized based on information in image metadata. Clinical research in various pathologies can thus be performed with greater efficiency and large image repositories can be managed more effectively. The prototype of the system has been installed in a few hospitals and is working to the satisfaction of the clinicians.

  18. Security Behavior Observatory: Infrastructure for Long-term Monitoring of Client Machines

    DTIC Science & Technology

    2014-07-14

desired data. In Windows, this is most often a .NET language (e.g., C#, PowerShell), a command-line batch script, or Java. 3) Least privilege: To ensure...modules are written in Java, and thus should be easily portable to any OS. B. Deployment There are several high-level requirements the SBO must meet...practically feasible with such solutions. Instead, one researcher with access to all the clients’ keys (stored in an isolated and secured MySQL database

  19. A Brief Assessment of LC2IEDM, MIST and Web Services for use in Naval Tactical Data Management

    DTIC Science & Technology

    2004-07-01

server software, messaging between the client and server, and a database. The MIST database is implemented in an open source DBMS named PostgreSQL... PostgreSQL had its beginnings at the University of California, Berkeley, in 1986 [11]. The development of PostgreSQL has since evolved into a...contact history from the database. DRDC Atlantic TM 2004-148

  20. Access Control of Web- and Java-Based Applications

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.; Pajevski, Michael J.

    2013-01-01

Cybersecurity has become a great concern as threats of service interruption, unauthorized access, stealing and altering of information, and spreading of viruses have become more prevalent and serious. Application layer access control of applications is a critical component in the overall security solution that also includes encryption, firewalls, virtual private networks, antivirus, and intrusion detection. An access control solution, based on an open-source access manager augmented with custom software components, was developed to provide protection to both Web-based and Java-based client and server applications. The DISA Security Service (DISA-SS) provides common access control capabilities for AMMOS software applications through a set of application programming interfaces (APIs) and network-accessible security services for authentication, single sign-on, authorization checking, and authorization policy management. The OpenAM access management technology designed for Web applications can be extended to meet the needs of Java thick clients and standalone servers that are commonly used in the JPL AMMOS environment. The DISA-SS reusable components have greatly reduced the effort for each AMMOS subsystem to develop its own access control strategy. The novelty of this work is that it leverages an open-source access management product that was designed for Web-based applications to provide access control for Java thick clients and Java standalone servers. Thick clients and standalone servers are still commonly used in businesses and government, especially for applications that require rich graphical user interfaces and high-performance visualization that cannot be met by thin clients running on Web browsers.

  1. Secure UNIX socket-based controlling system for high-throughput protein crystallography experiments.

    PubMed

    Gaponov, Yurii; Igarashi, Noriyuki; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Suzuki, Mamoru; Kosuge, Takashi; Wakatsuki, Soichi

    2004-01-01

    A control system for high-throughput protein crystallography experiments has been developed based on multilevel secure (SSL v2/v3) UNIX sockets under the Linux operating system. The main stages of protein crystallography experiments (purification, crystallization, loop preparation, data collection, data processing) are handled by the software. All information necessary to perform protein crystallography experiments is stored in a relational database (MySQL), except raw X-ray data, which are stored on a network file server. The system consists of several servers and clients. TCP/IP secure UNIX sockets with four predefined behaviors [(a) listening for a request followed by a reply, (b) sending a request and waiting for a reply, (c) listening for a broadcast message, and (d) sending a broadcast message] support communications between all servers and clients, allowing one to control experiments, view data, edit experimental conditions, and perform data processing remotely. The interface software is well suited to developing well-organized control software with a hierarchical structure of different software units (Gaponov et al., 1998) that pass and receive different types of information. All communication is divided into two levels: low and top. Large and complicated control tasks are split into several smaller ones, which can be processed by control clients independently. For communicating with experimental equipment (beamline optical elements, robots, specialized experimental equipment, etc.), the STARS server, developed at the Photon Factory, is used (Kosuge et al., 2002). The STARS server allows any application with an open socket to be connected with any other clients that control experimental equipment. The majority of the source code is written in C/C++. GUI modules of the system were built mainly using the Glade user interface builder for GTK+ and GNOME under the Red Hat Linux 7.1 operating system.
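
    The request/reply behavior described above, behaviors (a) and (b), can be sketched as follows. This is a minimal illustration, not the production code: plain TCP stands in for the SSL-wrapped sockets of the real system, and the JSON-over-newline framing and command name are hypothetical.

```python
import json
import socket
import threading

def serve_once(sock: socket.socket) -> None:
    """Behavior (a): listen for one request, then send a reply."""
    conn, _ = sock.accept()
    with conn, conn.makefile("rw") as stream:
        request = json.loads(stream.readline())
        reply = {"status": "ok", "echo": request["command"]}
        stream.write(json.dumps(reply) + "\n")
        stream.flush()

def send_request(port: int, command: str) -> dict:
    """Behavior (b): send a request and wait for the reply."""
    with socket.create_connection(("127.0.0.1", port)) as conn:
        with conn.makefile("rw") as stream:
            stream.write(json.dumps({"command": command}) + "\n")
            stream.flush()
            return json.loads(stream.readline())

server = socket.socket()
server.bind(("127.0.0.1", 0))   # ephemeral port for the sketch
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

reply = send_request(port, "collect_data")
print(reply)  # {'status': 'ok', 'echo': 'collect_data'}
```

    In the real system each such socket would be wrapped with an `ssl.SSLContext` before use, and the broadcast behaviors (c) and (d) would fan the same framed messages out to all connected clients.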

  2. A novel application in the study of client language: Alcohol and marijuana-related statements in substance-using adolescents during a simulation task.

    PubMed

    Ladd, Benjamin O; Garcia, Tracey A; Anderson, Kristen G

    2016-09-01

    The current study explored whether laboratory-based techniques can provide a strategy for studying client language as a mechanism of behavior change. Specifically, this study examined the potential of a simulation task to elicit healthy talk, or self-motivational statements in favor of healthy behavior, related to marijuana and alcohol use. Participants (N = 84) were adolescents reporting at least 10 lifetime substance use episodes, recruited from various community settings in an urban Pacific Northwest area. Participants completed the Adolescent Simulated Intoxication Digital Elicitation (A-SIDE), a validated paradigm for assessing substance use decision making in peer contexts. Participants responded to 4 types of offers in the A-SIDE: (a) marijuana, (b) food (marijuana control), (c) alcohol, and (d) soda (alcohol control). Using a validated coding scheme adapted for the current study, client language during a structured interview assessing participants' responses to the simulated offers was evaluated. Associations between percent healthy talk (PHT, calculated by dividing the number of healthy statements by the sum of all substance-related statements) and cross-sectional outcomes of interest (previous substance use, substance use expectancies, and behavioral willingness) were explored. The frequency of substance-related statements differed in response to offer type; the rate of PHT did not. PHT was associated with behavioral willingness to accept the offer. However, PHT was not associated with decontextualized measures of substance use. Associations between PHT and global expectancies were limited. Simulation methods may be useful in investigating the impact of context on self-talk and in systematically exploring client language as a mechanism of change. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
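
    The PHT measure defined in the abstract is straightforward to compute; a minimal sketch, with hypothetical statement counts for one interview:

```python
def percent_healthy_talk(healthy: int, unhealthy: int) -> float:
    """PHT as defined above: healthy statements divided by all
    substance-related statements (healthy + unhealthy), as a percentage."""
    total = healthy + unhealthy
    if total == 0:
        raise ValueError("no substance-related statements coded")
    return 100.0 * healthy / total

# Hypothetical counts: 12 healthy and 8 unhealthy coded statements.
print(percent_healthy_talk(12, 8))  # 60.0
```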

  3. Innovative Technology for Teaching Introductory Astronomy

    NASA Astrophysics Data System (ADS)

    Guidry, Mike

    The application of state-of-the-art technology (primarily Java and Flash MX ActionScript on the client side and Java, PHP, Perl, XML, and SQL databases on the server side) to the teaching of introductory astronomy will be discussed. A completely online syllabus in introductory astronomy built around more than 350 interactive animations, called "Online Journey through Astronomy," and a new set of 20 online virtual laboratories in astronomy that we are currently developing will be used as illustrations. In addition to a demonstration of the technology, our experience using these technologies to teach introductory astronomy to thousands of students, in settings ranging from traditional classrooms to full distance learning, will be summarized. Recent experiments using Java and vector-graphics programming of handheld devices (personal digital assistants and cell phones) with wireless wide-area connectivity for applications in astronomy education will also be described.

  4. Biblio-Link and Pro-Cite: The Searcher's Workstation.

    ERIC Educational Resources Information Center

    Hoyle, Norman; McNamara, Kathleen

    1987-01-01

    Describes the Biblio-Link and Pro-Cite software packages, which can be used together to create local databases with downloaded records, or to reorganize and repackage downloaded records for client reports. (CLB)

  5. Exchanging the Context between OGC Geospatial Web clients and GIS applications using Atom

    NASA Astrophysics Data System (ADS)

    Maso, Joan; Díaz, Paula; Riverola, Anna; Pons, Xavier

    2013-04-01

    Currently, the discovery and sharing of geospatial information over the web still presents difficulties. News distribution through website content was simplified by the use of the Really Simple Syndication (RSS) and Atom syndication formats. This communication presents an extension of Atom to redistribute references to geospatial information in a distributed Spatial Data Infrastructure environment. A geospatial client can save the status of an application that involves several OGC services of different kinds and direct data, and share this status with other users who need the same information but use different client vendor products, in an interoperable way. The extensibility of the Atom format was essential to define a format that could be used in RSS-enabled web browsers, mass-market map viewers, and emerging geospatially enabled integrated clients that support Open Geospatial Consortium (OGC) services. Since OWS Context has been designed as an Atom extension, the document can be viewed anywhere Atom documents are valid. Internet web browsers are able to present the document as a list of items with title, abstract, time, description, and downloading features. OWS Context uses GeoRSS, so the document can be interpreted by both Google Maps and Bing Maps as items whose extent is represented on a dynamic map. Another way to exploit an OWS Context document is to develop an XSLT to transform the Atom feed into an HTML5 document that shows the exact status of the client view window that saved the context document. To do so, we use the width and height of the client window and the extent of the view in world (geographic) coordinates to calculate the scale of the map. Then, we can mix elements in world coordinates (such as CF-netCDF files or GML) with elements in pixel coordinates (such as WMS maps, WMTS tiles, and direct SVG content).
    A smarter map browser application called MiraMon Map Browser is able to write a context document and read it again to recover the context of the previous view, or to load a context generated by another application. The possibility of storing direct links to files in OWS Context is particularly interesting for desktop GIS solutions. This communication also presents the development made in the MiraMon desktop GIS solution to include OWS Context. MiraMon software is able to deal with local files, web services, and database connections. As in any other GIS solution, the MiraMon team designed its own file format (MiraMon Map, MMM) for storing and sharing the status of a GIS session. The new OWS Context format is now adopted as an interoperable substitute for the MMM. The extensibility of the format makes it possible to map concepts in the MMM to current OWS Context elements (such as titles, data links, and extent) and to generate new elements that include all extra metadata not currently covered by OWS Context. These developments were done in the ninth OGC Web Services Interoperability Experiment (OWS-9) and are demonstrated in this communication.
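
    The scale computation mentioned above (window size in pixels plus view extent in geographic coordinates) can be sketched as follows. The constants are assumptions for the sketch: a 96 DPI rendering and the approximate metres-per-degree figure at the equator.

```python
# Assumptions for this sketch, not values taken from the paper:
METERS_PER_DEGREE = 111_319.49        # approximate, at the equator
METERS_PER_PIXEL_AT_96DPI = 0.0254 / 96

def scale_denominator(width_px: int, lon_min: float, lon_max: float) -> float:
    """Estimate the map scale denominator for a view of the given
    pixel width covering the given longitude range."""
    ground_meters = (lon_max - lon_min) * METERS_PER_DEGREE
    meters_per_pixel = ground_meters / width_px
    return meters_per_pixel / METERS_PER_PIXEL_AT_96DPI

# A 1024-pixel-wide client window showing 10 degrees of longitude:
print(round(scale_denominator(1024, 0.0, 10.0)))
```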

  6. Designing and Implementing a Distributed System Architecture for the Mars Rover Mission Planning Software (Maestro)

    NASA Technical Reports Server (NTRS)

    Goldgof, Gregory M.

    2005-01-01

    Distributed systems allow scientists from around the world to plan missions concurrently, while being updated on the revisions of their colleagues in real time. However, permitting multiple clients to simultaneously modify a single data repository can quickly lead to data corruption or inconsistent states between users. Since our message broker, the Java Message Service, does not ensure that messages will be received in the order they were published, we must implement our own numbering scheme to guarantee that changes to mission plans are performed in the correct sequence. Furthermore, distributed architectures must ensure that as new users connect to the system, they synchronize with the database without missing any messages or falling into an inconsistent state. Robust systems must also guarantee that all clients will remain synchronized with the database even in the case of multiple client failure, which can occur at any time due to lost network connections or a user's own system instability. The final design for the distributed system behind the Mars rover mission planning software fulfills all of these requirements and upon completion will be deployed to MER at the end of 2005 as well as Phoenix (2007) and MSL (2009).
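
    The numbering scheme described above, needed because the message broker does not guarantee delivery order, can be sketched as a client-side reordering buffer. Class and message names here are hypothetical, not Maestro's actual API.

```python
class OrderedApplier:
    """Buffer out-of-order messages and apply them strictly in sequence."""

    def __init__(self) -> None:
        self.next_seq = 0
        self.pending: dict[int, str] = {}   # seq -> message, held until its turn
        self.applied: list[str] = []

    def receive(self, seq: int, message: str) -> None:
        self.pending[seq] = message
        # Drain every message whose turn has come.
        while self.next_seq in self.pending:
            self.applied.append(self.pending.pop(self.next_seq))
            self.next_seq += 1

applier = OrderedApplier()
# Messages arrive out of order, as the broker permits:
for seq, msg in [(1, "move rover"), (0, "plan sol 42"), (2, "add observation")]:
    applier.receive(seq, msg)
print(applier.applied)  # ['plan sol 42', 'move rover', 'add observation']
```

    A newly connecting client would first fetch the database state plus its sequence number, then start this loop from that number, which covers the synchronization requirement described above.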

  7. The IVTANTHERMO-Online database for thermodynamic properties of individual substances with web interface

    NASA Astrophysics Data System (ADS)

    Belov, G. V.; Dyachkov, S. A.; Levashov, P. R.; Lomonosov, I. V.; Minakov, D. V.; Morozov, I. V.; Sineva, M. A.; Smirnov, V. N.

    2018-01-01

    The database structure, main features, and user interface of the IVTANTHERMO-Online system are reviewed. This system continues the series of IVTANTHERMO packages developed at JIHT RAS. It includes a database of thermodynamic properties of individual substances and related software for the analysis of experimental results, data fitting, and the calculation and estimation of thermodynamic functions and thermochemical quantities. In contrast to previous IVTANTHERMO versions, it has a new extensible database design, a client-server architecture, and a user-friendly web interface with a number of new features for online and offline data processing.

  8. CIVET: Continuous Integration, Verification, Enhancement, and Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alger, Brian; Gaston, Derek R.; Permann, Cody J

    A Git server (GitHub, GitLab, BitBucket) sends event notifications to the Civet server, either a "Pull Request" or a "Push" notification. Civet then checks the database to determine what tests need to be run and marks them as ready. Civet clients, running on dedicated machines, query the server for available jobs that are ready to run. When a client gets a job, it executes the scripts attached to the job and reports the output and exit status back to the server. When the client updates the server, the server in turn updates the Git server with the result of the job, as well as updating the main web page.
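
    The client's poll-run-report cycle described above can be sketched as follows. This is a hypothetical illustration: the job fields and the callables standing in for the HTTP endpoints are assumptions, not the real Civet API.

```python
import subprocess

def run_next_job(get_job, post_result) -> bool:
    """Fetch one ready job from the server; return False if none available."""
    job = get_job()                       # stand-in for an HTTP GET to the server
    if job is None:
        return False
    proc = subprocess.run(job["script"], shell=True,
                          capture_output=True, text=True)
    # Report exit status and combined output back to the server.
    post_result(job["id"], proc.returncode, proc.stdout + proc.stderr)
    return True

# Simulated server with a single queued job:
queue = [{"id": 7, "script": "echo pass"}]
results = {}
ran = run_next_job(
    lambda: queue.pop() if queue else None,
    lambda jid, code, out: results.update({jid: (code, out.strip())}),
)
print(ran, results)  # True {7: (0, 'pass')}
```

    A real client would wrap this in a loop with a sleep between empty polls, which matches the "query the server for available jobs" behavior above.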

  9. Client value models provide a framework for rational library planning (or, phrasing the answer in the form of a question).

    PubMed

    Van Moorsel, Guillaume

    2005-01-01

    Libraries often do not know how clients value their product/service offerings. Yet at a time when the mounting costs of library support are increasingly difficult to justify to the parent institution, the library's ability to gauge the value of its offerings to clients has never been more critical. Client Value Models (CVMs) establish a common definition of value elements, or a "value vocabulary," for libraries and their clients, thereby providing a basis upon which to make rational planning decisions regarding product/service acquisition and development. The CVM concept is borrowed from business and industry, but its application is a natural fit in libraries. This article offers a theoretical consideration and practical illustration of CVM application in libraries.

  10. A mobile trauma database with charge capture.

    PubMed

    Moulton, Steve; Myung, Dan; Chary, Aron; Chen, Joshua; Agarwal, Suresh; Emhoff, Tim; Burke, Peter; Hirsch, Erwin

    2005-11-01

    Charge capture plays an important role in every surgical practice. We have developed and merged a custom mobile database (DB) system with our trauma registry (TRACS) to better understand our billing methods, revenue generators, and areas for improved revenue capture. The mobile database runs on handheld devices using the Windows Compact Edition platform. The front end was written in C#, and the back end is SQL. The mobile database operates as a thick client; it includes active and inactive patient lists, billing screens, hot pick lists, and Current Procedural Terminology and International Classification of Diseases, Ninth Revision code sets. Microsoft Internet Information Server provides secure data transaction services between the back ends stored on each device. Traditional, handwritten billing information for three of five adult trauma surgeons was averaged over a 5-month period. Electronic billing information was then collected over a 3-month period using handheld devices and the subject software application. One surgeon used the software for all 3 months, and two surgeons used it for the latter 2 months of the electronic data collection period. This electronic billing information was combined with TRACS data to determine the clinical characteristics of the trauma patients who were and were not captured using the mobile database. Total charges increased by 135%, 148%, and 228% for each of the three trauma surgeons who used the mobile DB application. The majority of additional charges were for evaluation and management services. Patients who were captured and billed at the point of care using the mobile DB had higher Injury Severity Scores, were more likely to undergo an operative procedure, and had longer lengths of stay compared with those who were not captured. Total charges more than doubled using a mobile database to bill at the point of care.
A subsequent comparison of TRACS data with billing information revealed a large amount of uncaptured patient revenue. Greater familiarity and broader use of mobile database technology holds the potential for even greater revenue capture.

  11. A Visual Galaxy Classification Interface and its Classroom Application

    NASA Astrophysics Data System (ADS)

    Kautsch, Stefan J.; Phung, Chau; VanHilst, Michael; Castro, Victor H

    2014-06-01

    Galaxy morphology is an important topic in modern astronomy for understanding questions concerning the evolution and formation of galaxies and their dark matter content. In order to engage students in exploring galaxy morphology, we developed a web-based graphical interface that allows students to visually classify galaxy images according to various morphological types. The website is designed with HTML5, JavaScript, PHP, and a MySQL database. The classification interface provides hands-on research experience and training for students and interested clients, and allows them to contribute to studies of galaxy morphology. We present the first results of a pilot study and compare the visually classified types from our interface with those from automated classification routines.

  12. Geothopica and the interactive analysis and visualization of the updated Italian National Geothermal Database

    NASA Astrophysics Data System (ADS)

    Trumpy, Eugenio; Manzella, Adele

    2017-02-01

    The Italian National Geothermal Database (BDNG) is the largest collection of Italian geothermal data and was set up in the 1980s. It has since been updated both in terms of content and management tools: information on deep wells and thermal springs (with temperatures > 30 °C) is currently organized and stored in a PostgreSQL relational database management system, which guarantees high performance, data security, and easy access through different client applications. The BDNG is the core of the Geothopica website, whose webGIS tool allows different types of users to access geothermal data, visualize multiple types of datasets, and perform integrated analyses. The webGIS tool has recently been improved by two specially designed, programmed, and implemented visualization tools to display data on well lithology and underground temperatures. This paper describes the contents of the database and its software and data updates, as well as the webGIS tool, including the new tools for lithology and temperature visualization. The geoinformation organized in the database and accessible through Geothopica is of use not only for geothermal purposes but also for any kind of georesource and CO2 storage project requiring the organization of, and access to, deep underground data. Geothopica also supports project developers, researchers, and decision makers in the assessment, management, and sustainable deployment of georesources.

  13. MODBUS APPLICATION AT JEFFERSON LAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Jianxun; Seaton, Chad; Philip, Sarin

    Modbus is a client/server communication model. In our applications, the embedded Ethernet device XPort acts as the server, and a SoftIOC running EPICS Modbus is the client. The SoftIOC builds a Modbus request from parameters contained in a demand sent by the EPICS application to the Modbus client interface. On reception of the Modbus request, the Modbus server activates a local action to read, write, or perform some other action. The main Modbus server functions are therefore to wait for a Modbus request on TCP port 502, process the request, and build a Modbus response.
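
    The request the client builds has a fixed layout defined by the Modbus TCP specification: a 7-byte MBAP header (transaction ID, protocol ID 0, remaining length, unit ID) followed by the protocol data unit. A sketch for the common "read holding registers" request (function code 0x03); the register address and count are arbitrary examples, not Jefferson Lab's actual register map:

```python
import struct

def read_holding_registers_request(transaction_id: int, unit_id: int,
                                   start_addr: int, count: int) -> bytes:
    """Build a Modbus TCP frame for function code 0x03 (read holding registers)."""
    pdu = struct.pack(">BHH", 0x03, start_addr, count)          # function, addr, qty
    mbap = struct.pack(">HHHB", transaction_id, 0,              # protocol ID is 0
                       len(pdu) + 1, unit_id)                   # length counts unit ID + PDU
    return mbap + pdu

frame = read_holding_registers_request(1, 0x11, 0x006B, 3)
print(frame.hex())  # 0001000000061103006b0003
```

    The server's reply reuses the same MBAP header layout, with the PDU carrying the function code followed by the register values, which is the "build a Modbus response" step described above.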

  14. Treatment refusal and premature termination in psychotherapy, pharmacotherapy, and their combination: A meta-analysis of head-to-head comparisons.

    PubMed

    Swift, Joshua K; Greenberg, Roger P; Tompkins, Kelley A; Parkin, Susannah R

    2017-03-01

    The purpose of this meta-analysis was to examine rates of treatment refusal and premature termination for pharmacotherapy alone, psychotherapy alone, pharmacotherapy plus psychotherapy, and psychotherapy plus pill placebo treatments. A systematic review of the literature resulted in 186 comparative trials that included a report of treatment refusal and/or premature termination for at least 2 of the 4 treatment conditions. The data from these studies were pooled using a random-effects analysis. Odds Ratio effect sizes were then calculated to compare the rates between treatment conditions, once across all studies and then again for specific client disorder categories. An average treatment refusal rate of 8.2% was found across studies. Clients who were assigned to pharmacotherapy were 1.76 times more likely to refuse treatment compared with clients who were assigned psychotherapy. Differences in refusal rates for pharmacotherapy and psychotherapy were particularly evident for depressive disorders, panic disorder, and social anxiety disorder. On average, 21.9% of clients prematurely terminated their treatment. Across studies, clients who were assigned to pharmacotherapy were 1.20 times more likely to drop out compared with clients who were assigned to psychotherapy. Pharmacotherapy clients with anorexia/bulimia and depressive disorders dropped out at higher rates compared with psychotherapy clients with these disorders. Treatment refusal and dropout are significant problems in both psychotherapy and pharmacotherapy and providers of these treatments should seek to employ strategies to reduce their occurrence. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
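
    The odds ratio effect size used in this meta-analysis compares the odds of an event (refusal or dropout) between two treatment conditions. A minimal sketch with hypothetical counts, not data from the reviewed trials:

```python
def odds_ratio(events_a: int, total_a: int, events_b: int, total_b: int) -> float:
    """Odds of the event in condition A divided by the odds in condition B."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# Hypothetical trial: 15 of 100 pharmacotherapy clients refused treatment,
# versus 9 of 100 psychotherapy clients.
print(round(odds_ratio(15, 100, 9, 100), 2))  # 1.78
```

    A value above 1 means the event is more likely in the first condition, which is how the 1.76 and 1.20 figures above should be read.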

  15. ThermoFit: A Set of Software Tools, Protocols and Schema for the Organization of Thermodynamic Data and for the Development, Maintenance, and Distribution of Internally Consistent Thermodynamic Data/Model Collections

    NASA Astrophysics Data System (ADS)

    Ghiorso, M. S.

    2013-12-01

    Internally consistent thermodynamic databases are critical resources that facilitate the calculation of heterogeneous phase equilibria and thereby support geochemical, petrological, and geodynamical modeling. These 'databases' are actually derived data/model systems that depend on a diverse suite of physical property measurements, calorimetric data, and experimental phase equilibrium brackets. In addition, such databases are calibrated with the adoption of various models for extrapolation of heat capacities and volumetric equations of state to elevated temperature and pressure conditions. Finally, these databases require specification of thermochemical models for the mixing properties of solid, liquid, and fluid solutions, which are often rooted in physical theory and, in turn, depend on additional experimental observations. The process of 'calibrating' a thermochemical database involves considerable effort and an extensive computational infrastructure. Because of these complexities, the community tends to rely on a small number of thermochemical databases, generated by a few researchers; these databases often have limited longevity and are universally difficult to maintain. ThermoFit is a software framework and user interface whose aim is to provide a modeling environment that facilitates creation, maintenance and distribution of thermodynamic data/model collections. Underlying ThermoFit are data archives of fundamental physical property, calorimetric, crystallographic, and phase equilibrium constraints that provide the essential experimental information from which thermodynamic databases are traditionally calibrated. ThermoFit standardizes schema for accessing these data archives and provides web services for data mining these collections. Beyond simple data management and interoperability, ThermoFit provides a collection of visualization and software modeling tools that streamline the model/database generation process. 
    Most notably, ThermoFit facilitates the rapid visualization of predicted model outcomes and permits the user to modify these outcomes using tactile- or mouse-based GUI interaction, with real-time updates that reflect the user's choices, preferences, and priorities for the derived model results. This ability permits some resolution of the problem of correlated model parameters in the common situation where thermodynamic models must be calibrated from inadequate data resources. It also allows modeling constraints to be imposed using natural data and observations (i.e., petrologic or geochemical intuition). Once formulated, data/model collections are deployed through automatically created web services, which users consume via web, Excel, or desktop clients. ThermoFit is currently under active development and not yet generally available; a limited-capability prototype has been coded for Macintosh computers and used to construct thermochemical models for H2O-CO2 mixed-fluid saturation in silicate liquids. The longer-term goal is to release ThermoFit as a web portal application client with server-based cloud computations supporting the modeling environment.

  16. Thin Client Architecture: The Promise and the Problems.

    ERIC Educational Resources Information Center

    Machovec, George S.

    1997-01-01

    Describes thin clients, a networking technology that allows organizations to provide software applications over networked workstations connected to a central server. Topics include corporate settings; major advantages, including cost effectiveness and increased computer security; problems; and possible applications for large public and academic…

  17. The CMS dataset bookkeeping service

    NASA Astrophysics Data System (ADS)

    Afaq, A.; Dolgert, A.; Guo, Y.; Jones, C.; Kosyakov, S.; Kuznetsov, V.; Lueking, L.; Riley, D.; Sekhri, V.

    2008-07-01

    The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and Detector sources. It provides the ability to identify MC or trigger source, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via a Python API, Command Line, and a Discovery web page interfaces. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.

  18. Design of Instant Messaging System of Multi-language E-commerce Platform

    NASA Astrophysics Data System (ADS)

    Yang, Heng; Chen, Xinyi; Li, Jiajia; Cao, Yaru

    2017-09-01

    This paper studies the message component of an instant messaging system for a multi-language e-commerce platform, with the goals of designing an instant messaging system for a multi-language environment, presenting information with national characteristics, and applying national languages to e-commerce. To develop an attractive and friendly interface for the front end of the message system and to reduce development cost, the mature jQuery framework is adopted. The high-performance Tomcat server is used at the back end to process user requests, the MySQL database is used to persistently store user data, and the Oracle database serves as a message buffer for system optimization. Moreover, AJAX technology is used by the client to actively pull the newest data from the server at specified intervals. In practical application, the system has strong reliability, good extensibility, short response times, high throughput, and high user concurrency.

  19. Remote Adaptive Communication System

    DTIC Science & Technology

    2001-10-25

    manage several different devices using the software tool. A. Client/Server Architecture: the architecture we are proposing is based on the Client ...Server model (see figure 3). We want both client and server to be accessible from anywhere via the internet. The computer acting as a server is in...the other hand, each of the client applications will act as sender or receiver, depending on the associated interface: user interface or device

  20. OLS Client and OLS Dialog: Open Source Tools to Annotate Public Omics Datasets.

    PubMed

    Perez-Riverol, Yasset; Ternent, Tobias; Koch, Maximilian; Barsnes, Harald; Vrousgou, Olga; Jupp, Simon; Vizcaíno, Juan Antonio

    2017-10-01

    The availability of user-friendly software to annotate biological datasets and experimental details is becoming essential in data management practices, both in local storage systems and in public databases. The Ontology Lookup Service (OLS, http://www.ebi.ac.uk/ols) is a popular centralized service to query, browse and navigate biomedical ontologies and controlled vocabularies. Recently, the OLS framework has been completely redeveloped (version 3.0), including enhancements in the data model, like the added support for Web Ontology Language based ontologies, among many other improvements. However, the new OLS is not backwards compatible and new software tools are needed to enable access to this widely used framework now that the previous version is no longer available. We here present the OLS Client as a free, open-source Java library to retrieve information from the new version of the OLS. It enables rapid tool creation by providing a robust, pluggable programming interface and common data model to programmatically access the OLS. The library has already been integrated and is routinely used by several bioinformatics resources and related data annotation tools. Secondly, we also introduce an updated version of the OLS Dialog (version 2.0), a Java graphical user interface that can be easily plugged into Java desktop applications to access the OLS. The software and related documentation are freely available at https://github.com/PRIDE-Utilities/ols-client and https://github.com/PRIDE-Toolsuite/ols-dialog. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Verifying the secure setup of UNIX client/servers and detection of network intrusion

    NASA Astrophysics Data System (ADS)

    Feingold, Richard; Bruestle, Harry R.; Bartoletti, Tony; Saroyan, R. A.; Fisher, John M.

    1996-03-01

    This paper describes our technical approach to developing and delivering Unix host- and network-based security products to meet the increasing challenges in information security. Today's global 'Infosphere' presents us with a networked environment that knows no geographical, national, or temporal boundaries, and no ownership, laws, or identity cards. This seamless aggregation of computers, networks, databases, and applications stores, transmits, and processes information. This information is now recognized as an asset to governments, corporations, and individuals alike, and it must be protected from misuse. The Security Profile Inspector (SPI) performs static analyses of Unix-based clients and servers to check their security configuration. SPI's broad range of security tests and flexible usage options support the needs of novice and expert system administrators alike. SPI's use within the Department of Energy and Department of Defense has resulted in more secure systems, less vulnerable to hostile intentions. Host-based information protection techniques and tools must also be supported by network-based capabilities. Our experience shows that a weak link in a network of clients and servers presents itself sooner or later, and can be more readily identified by dynamic intrusion detection techniques and tools. The Network Intrusion Detector (NID) is one such tool. NID is designed to monitor and analyze activity on an Ethernet broadcast local area network segment and produce transcripts of suspicious user connections. NID's retrospective and real-time modes have proven invaluable to security officers faced with ongoing attacks on their systems and networks.

  2. Recent improvements in the NASA technical report server

    NASA Technical Reports Server (NTRS)

    Maa, Ming-Hokng; Nelson, Michael L.

    1995-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web (WWW) report distribution service, has been modified to allow parallel database queries (significantly decreasing user access time, by an average factor of 2.3), access from clients behind firewalls and/or proxies that truncate excessively long Uniform Resource Locators (URLs), access to non-Wide Area Information Server (WAIS) databases, and compatibility with the Z39.50 protocol.
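    The parallel-query technique credited with the average 2.3x speedup can be sketched as a concurrent fan-out over the back-end databases. The database names and the query function below are stand-ins, not the actual NTRS back ends:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def query_database(name):
    """Stand-in for one back-end search; the real system queries
    distributed report databases such as WAIS servers."""
    time.sleep(0.1)  # simulated per-database latency
    return (name, [f"{name}-report-1", f"{name}-report-2"])

databases = ["ARC", "GSFC", "JPL", "LaRC"]

# Fan the user's query out to all databases at once instead of
# serially: total wall time is one latency, not the sum of four.
with ThreadPoolExecutor(max_workers=len(databases)) as pool:
    results = dict(pool.map(query_database, databases))

print(sorted(results))  # ['ARC', 'GSFC', 'JPL', 'LaRC']
```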

  3. Java RMI Software Technology for the Payload Planning System of the International Space Station

    NASA Technical Reports Server (NTRS)

    Bryant, Barrett R.

    1999-01-01

    The Payload Planning System (PPS) supports experiment planning on the International Space Station. The planning process has a number of different aspects that need to be stored in a database, which is then used to generate reports on the planning process in a variety of formats. The system is currently structured as a 3-tier client/server software architecture comprising a Java applet at the front end, a Java server in the middle, and an Oracle database in the third tier. It presently uses CGI, the Common Gateway Interface, to communicate between the user-interface and server tiers, and Active Data Objects (ADO) to communicate between the server and database tiers. This project investigated other methods and tools for performing the communications between the three tiers so that both system performance and software development time could be improved. We specifically found that, for the hardware and software platforms on which PPS is required to run, the best solution is to use Java Remote Method Invocation (RMI) for communication between the client and server and SQLJ (Structured Query Language for Java) for server interaction with the database. Prototype implementations showed that RMI combined with SQLJ significantly improved performance and also greatly facilitated construction of the communication software.
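    Python has no RMI, but the remote-invocation pattern the project adopted can be sketched with the standard library's XML-RPC machinery as an analogue: the middle tier registers a method, and the client invokes it as if it were local. The method name and payload are invented for illustration.

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Middle tier: expose a (hypothetical) planning query as a remotely
# callable method, analogous to an RMI remote interface.
server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
port = server.server_address[1]
server.register_function(lambda name: {"experiment": name, "slots": 3},
                         "get_plan")
threading.Thread(target=server.serve_forever, daemon=True).start()

# Front tier: the client calls the remote method as if it were local.
client = ServerProxy(f"http://localhost:{port}")
plan = client.get_plan("crystal-growth")
print(plan["experiment"], plan["slots"])  # crystal-growth 3
server.shutdown()
```

The same call shape carries over to RMI: a registry lookup yields a stub, and method calls on the stub are marshalled to the server transparently.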

  4. 42 CFR 51.45 - Confidentiality of protection and advocacy system records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... information contained in any automated electronic database pertaining to: (i) Clients to the same extent as is... criminal prosecution. (c) For purposes of any periodic audit, report, or evaluation of the performance of...

  5. Software Update.

    ERIC Educational Resources Information Center

    Currents, 2000

    2000-01-01

    A chart of 40 alumni-development database systems provides information on vendor/Web site, address, contact/phone, software name, price range, minimum suggested workstation/suggested server, standard reports/reporting tools, minimum/maximum record capacity, and number of installed sites/client type. (DB)

  6. TERRESTRIAL ECOSYSTEM SIMULATOR

    EPA Science Inventory

    The Terrestrial Habitats Project at the Western Ecology Division (Corvallis, OR) is developing tools and databases to meet the needs of Program Office clients for assessing risks to wildlife and terrestrial ecosystems. Because habitat is a dynamic condition in real-world environm...

  7. Federated Web-accessible Clinical Data Management within an Extensible NeuroImaging Database

    PubMed Central

    Keator, David B.; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R.; Bockholt, Jeremy; Grethe, Jeffrey S.

    2010-01-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site. PMID:20567938

  8. Federated web-accessible clinical data management within an extensible neuroimaging database.

    PubMed

    Ozyurt, I Burak; Keator, David B; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R; Bockholt, Jeremy; Grethe, Jeffrey S

    2010-12-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site.

  9. BioAssay Research Database (BARD): chemical biology and probe-development enabled by structured metadata and result types

    PubMed Central

    Howe, E.A.; de Souza, A.; Lahr, D.L.; Chatwin, S.; Montgomery, P.; Alexander, B.R.; Nguyen, D.-T.; Cruz, Y.; Stonich, D.A.; Walzer, G.; Rose, J.T.; Picard, S.C.; Liu, Z.; Rose, J.N.; Xiang, X.; Asiedu, J.; Durkin, D.; Levine, J.; Yang, J.J.; Schürer, S.C.; Braisted, J.C.; Southall, N.; Southern, M.R.; Chung, T.D.Y.; Brudz, S.; Tanega, C.; Schreiber, S.L.; Bittker, J.A.; Guha, R.; Clemons, P.A.

    2015-01-01

    BARD, the BioAssay Research Database (https://bard.nih.gov/), is a public database and suite of tools developed to provide access to bioassay data produced by the NIH Molecular Libraries Program (MLP). Data from 631 MLP projects were migrated to a new structured vocabulary designed to capture bioassay data in a formalized manner, with particular emphasis placed on the description of assay protocols. New data can be submitted to BARD with a user-friendly set of tools that assist in the creation of appropriately formatted datasets and assay definitions. Data published through the BARD application program interface (API) can be accessed by researchers using web-based query tools or a desktop client. Third-party developers wishing to create new tools can use the API to produce stand-alone tools or new plug-ins that can be integrated into BARD. The entire BARD suite of tools therefore supports three classes of researcher: those who wish to publish data, those who wish to mine data for testable hypotheses, and those in the developer community who wish to build tools that leverage this carefully curated chemical biology resource. PMID:25477388

  10. Earth science big data at users' fingertips: the EarthServer Science Gateway Mobile

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Fargetta, Marco; Pappalardo, Marco; Rundo, Francesco

    2014-05-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as the client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" -- based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. In addition, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On the server side, highly effective optimizations -- such as parallel and distributed query processing -- ensure scalability to Exabyte volumes. In this contribution we will report on the EarthServer Science Gateway Mobile, an app for both iOS- and Android-based devices that allows users to seamlessly access some of the EarthServer applications using SAML-based federated authentication and fine-grained authorisation mechanisms.

  11. An Adaptive Priority Tuning System for Optimized Local CPU Scheduling using BOINC Clients

    NASA Astrophysics Data System (ADS)

    Mnaouer, Adel B.; Ragoonath, Colin

    2010-11-01

    Volunteer Computing (VC) is a distributed computing model that utilizes idle CPU cycles from computing resources donated by volunteers connected through the Internet, forming a very large-scale, loosely coupled high-performance computing environment. Distributed volunteer computing environments such as the BOINC framework are concerned mainly with the efficient scheduling of the available resources to the applications that require them. The BOINC framework thus contains a number of scheduling policies/algorithms, both on the server side and on the client, which work together to maximize the available resources and to provide a degree of QoS in an environment that is highly volatile. This paper focuses on the BOINC client and introduces an adaptive priority-tuning client-side middleware application that improves the execution times of Work Units (WUs) while maintaining an acceptable Maximum Response Time (MRT) for the end user. We have conducted extensive experimentation with the proposed system, and the results show a clear speedup of BOINC applications using our optimized middleware as opposed to running under the original BOINC client.
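    The adaptive idea, letting the volunteer workload claim more CPU the longer the user is away, can be sketched as a pure policy function mapping idle time to a Unix nice value. The thresholds and values are illustrative assumptions, not the middleware's actual tuning rules.

```python
def tune_priority(idle_seconds, base_nice=19):
    """Map user idle time to a nice value for the worker process:
    fully polite (19) while the user is active, progressively more
    aggressive the longer the machine sits idle.
    Thresholds are illustrative, not BOINC's actual policy."""
    if idle_seconds < 60:        # user active: stay out of the way
        return base_nice
    if idle_seconds < 600:       # briefly idle: claim a bit more CPU
        return 10
    return 5                     # long idle: run near-normal priority

# Active user, short idle gap, machine idle for an hour:
print([tune_priority(t) for t in (5, 120, 3600)])  # [19, 10, 5]
```

A real implementation would sample idle time periodically and apply the result with a renice-style system call; this sketch isolates the decision logic so it can be tuned and tested on its own.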

  12. Moving from affirmation to liberation in psychological practice with transgender and gender nonconforming clients.

    PubMed

    Singh, Anneliese A

    2016-11-01

    While affirmative approaches with transgender and gender nonconforming (TGNC) clients are gaining momentum within psychological practice (American Counseling Association, 2010; American Psychological Association, 2015), there is a simultaneous and pressing need to move beyond TGNC-affirmative to TGNC-liberatory approaches to more fully address how societal gender binaries influence both psychologist and client. Psychologists may use the lens of liberation psychology (Martín-Baró, 1996) to transform the ways they work with TGNC clients. Using this perspective, psychologists can reflect on their own gender journey and experiences, identify how cisgender privilege influences counseling and psychological practice, and advocate for TGNC people to be better served in the settings in which they work. Psychologists are then able to engage in social change on behalf of, and in collaboration with, TGNC people in ways that simultaneously liberate psychologists from their own gender oppression experiences. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. Client/Server Architecture Promises Radical Changes.

    ERIC Educational Resources Information Center

    Freeman, Grey; York, Jerry

    1991-01-01

    This article discusses the emergence of the client/server paradigm for the delivery of computer applications, its emergence in response to the proliferation of microcomputers and local area networks, the applicability of the model in academic institutions, and its implications for college campus information technology organizations. (Author/DB)

  14. BossPro: a biometrics-based obfuscation scheme for software protection

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham

    2013-05-01

    This paper proposes to integrate biometric-based key generation into an obfuscated interpretation algorithm to protect authentication application software from illegitimate use or reverse-engineering. This is especially necessary for mCommerce because application programmes on mobile devices, such as smartphones and tablet PCs, are typically open to misuse by hackers. The scheme proposed in this paper therefore ensures that correct interpretation/execution of the obfuscated program code of the authentication application requires a valid biometric-generated key of the actual person to be authenticated, in real time. Without this key, the real semantics of the program cannot be understood by an attacker even if he/she gains access to the application code. Furthermore, the security provided by this scheme can be a vital aspect in protecting any application running on mobile devices that are increasingly used to perform business/financial or other security-related applications, but are easily lost or stolen. The scheme starts by creating a personalised copy of any application based on the biometric key generated during an enrolment process with the authenticator, as well as a nonce created at the time of communication between the client and the authenticator. The obfuscated code is then shipped to the client's mobile device and integrated with real-time biometric data extracted from the client to form the unlocking key during execution. The novelty of this scheme is achieved by the close binding of the application program to the biometric key of the client, thus making the application unusable for others. Trials and experimental results on biometric key generation, based on clients' faces, and an implemented scheme prototype, based on the Android emulator, prove the concept and novelty of this proposed scheme.
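    The key-binding step can be sketched as follows: a key is derived from the enrolled biometric template plus the per-session nonce, and only that key recovers the protected code. The hash construction and XOR obfuscation below are simplifications for illustration; the actual scheme obfuscates an interpretation algorithm rather than XORing bytes.

```python
import hashlib
from itertools import cycle

def derive_key(biometric_template: bytes, nonce: bytes) -> bytes:
    """Bind the unlocking key to both the enrolled biometric and a
    per-session nonce, so each personalised copy is unique."""
    return hashlib.sha256(biometric_template + nonce).digest()

def xor_obfuscate(code: bytes, key: bytes) -> bytes:
    """Toy obfuscation: XOR the code with the derived key stream.
    Applying it twice with the same key restores the original."""
    return bytes(b ^ k for b, k in zip(code, cycle(key)))

key = derive_key(b"face-template-of-alice", b"session-nonce-42")
protected = xor_obfuscate(b"AUTHENTICATE", key)
assert protected != b"AUTHENTICATE"                    # unreadable at rest
assert xor_obfuscate(protected, key) == b"AUTHENTICATE"  # key recovers it
```

Without the correct template and nonce, `derive_key` yields a different key and the round trip fails, which is the property the scheme relies on.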

  15. DICOM image integration into an electronic medical record using thin viewing clients

    NASA Astrophysics Data System (ADS)

    Stewart, Brent K.; Langer, Steven G.; Taira, Ricky K.

    1998-07-01

    Purpose -- To integrate radiological DICOM images into our currently existing web-browsable Electronic Medical Record (MINDscape). Over the last five years the University of Washington has created a clinical data repository combining, in a distributed relational database, information from multiple departmental databases (MIND). A text-based view of this data, called the Mini Medical Record (MMR), has been available for three years. MINDscape, unlike the text-based MMR, provides a platform-independent, web browser view of the MIND dataset that can easily be linked to other information resources on the network. We have now added the integration of radiological images into MINDscape through a DICOM webserver. Methods/New Work -- We have integrated a commercial webserver that acts as a DICOM Storage Class Provider to our computed radiography (CR), computed tomography (CT), digital fluoroscopy (DF), magnetic resonance (MR) and ultrasound (US) scanning devices. These images can be accessed through CGI queries or by linking the image server database using ODBC or SQL gateways. This allows the use of dynamic HTML links to the images on the DICOM webserver from MINDscape, so that the radiology reports already resident in the MIND repository can be married with the associated images through the unique examination accession number generated by our Radiology Information System (RIS). The web browser plug-in used provides a wavelet decompression engine (up to 16 bits per pixel) and performs the following image manipulation functions: window/level, flip, invert, sort, rotate, zoom, cine-loop and save as JPEG. Results -- Radiological DICOM image sets (CR, CT, MR and US) are displayed with associated exam reports for referring physicians and clinicians anywhere within the widespread academic medical center on PCs, Macs, X-terminals and Unix computers. This system is also being used for home teleradiology applications.
Conclusion -- Radiological DICOM images can be made available medical center wide to physicians quickly using low-cost and ubiquitous, thin client browsing technology and wavelet compression.
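    The window/level operation the viewer plug-in performs can be sketched as a linear mapping from raw pixel values to display intensities, with values outside the window clamped to black or white. The example window settings are typical CT soft-tissue values, not taken from the paper.

```python
def window_level(pixel, window, level, out_max=255):
    """Map a raw pixel value (e.g. a 12- or 16-bit DICOM sample) to a
    display intensity: values below the window clamp to 0, values
    above it clamp to out_max, and values inside scale linearly."""
    low = level - window / 2
    if pixel <= low:
        return 0
    if pixel >= low + window:
        return out_max
    return round((pixel - low) / window * out_max)

# A typical CT soft-tissue setting: window 400, level 40.
print([window_level(p, 400, 40) for p in (-200, 40, 240)])  # [0, 128, 255]
```

Narrowing the window increases displayed contrast over a smaller range of raw values, which is why the control is exposed interactively in the viewer.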

  16. Web Program for Development of GUIs for Cluster Computers

    NASA Technical Reports Server (NTRS)

    Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward

    2003-01-01

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.

  17. Online Maps and Cloud-Supported Location-Based Services across a Manifold of Devices

    NASA Astrophysics Data System (ADS)

    Kröpfl, M.; Buchmüller, D.; Leberl, F.

    2012-07-01

    Online mapping, miniaturization of computing devices, the "cloud", Global Navigation Satellite System (GNSS) and cell tower triangulation all coalesce into an entirely novel infrastructure for numerous innovative map applications. This impacts the planning of human activities, navigating and tracking these activities as they occur, and finally documenting their outcome for either a single user or a network of connected users in a larger context. In this paper, we provide an example of a simple geospatial application making use of this model, which we will use to explain the basic steps necessary to deploy an application involving a web service hosting geospatial information and client software consuming the web service through an API. The application allows an insurance claim specialist to add claims to a cloud-based database including a claim location. A field agent then uses a smartphone application to query the database by proximity, and heads out to capture photographs as supporting documentation for the claim. Once the photos have been uploaded to the web service, a second web service for image matching is called in order to try and match the current photograph to previously submitted assets. Image matching is used as a pre-verification step to determine whether the coverage of the respective object is sufficient for the claim specialist to process the claim. The development of the application was based on Microsoft's® Bing Maps™, Windows Phone™, Silverlight™, Windows Azure™ and Visual Studio™, and was completed in approximately 30 labour hours split between two developers.
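    The query-by-proximity step the field agent's app performs can be sketched with a plain haversine filter over claim records; the record fields and coordinates are invented for illustration (the actual application queries a cloud-hosted web service):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km: mean Earth radius

def claims_near(claims, lat, lon, radius_km):
    """Return ids of claims within radius_km of the agent's position."""
    return [c["id"] for c in claims
            if haversine_km(lat, lon, c["lat"], c["lon"]) <= radius_km]

claims = [{"id": 1, "lat": 47.07, "lon": 15.44},   # Graz
          {"id": 2, "lat": 48.21, "lon": 16.37}]   # Vienna
print(claims_near(claims, 47.07, 15.45, 25))  # [1]
```

A production service would push this filter into the database (a spatial index or geo query) rather than scanning records client-side, but the distance test is the same.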

  18. Applications of Professional Ethics in Educational Assessment.

    ERIC Educational Resources Information Center

    Wickwire, Pat Nellor

    In the schools, the primary clients are the students. Other clients include parents, citizens, the community, and educators--all stakeholders in the processes and the products of schools. Professionals in administrative, instructional, and student services are committed to serving these internal and external clients by providing for offerings and…

  19. Development of mobile preventive notification system (PreNotiS)

    NASA Astrophysics Data System (ADS)

    Kumar, Abhinav; Akopian, David; Chen, Philip

    2009-02-01

    The tasks achievable by mobile handsets continuously exceed our imagination. Statistics show that mobile phone sales are soaring, rising exponentially year after year, with predictions that they will reach a billion units in 2009, a large share of these being smartphones. Mobile service providers, mobile application developers and researchers have worked closely over the past decade to bring about revolutionary hardware and software advancements in handsets, such as embedded digital cameras, large memory capacity, accelerometers, touch-sensitive screens, GPS and Wi-Fi capabilities, as well as in the network infrastructure to support these features. Recently we presented PreNotiS, a multi-platform system for massive data collection from distributed sources such as cell phone users. This technology was intended to significantly simplify the response to events and help, for example, special agencies to gather crucial information in time and respond as quickly as possible to prevent or contain potential emergency situations, acting as a massive, centralized evidence-collection mechanism that effectively exploits the advancements in mobile application development platforms and the existing network infrastructure to present an easy-to-use, fast and effective tool to mobile phone users. We successfully demonstrated the functionality of the client-server application suite for posting user information to the server. This paper presents a new version of the PreNotiS system, with a revised client application and all-new server capabilities. PreNotiS still puts forth the idea of a fast, efficient client-server application suite for mobile phones which, through a highly simplified user interface, collects security/calamity information in a structured format from first responders and relays that structured information to a central server, where the data is sorted into a database in a predefined manner.
This information, which includes selections, images and text, is instantly available to authorities and action forces through a secure web portal, helping them to make decisions in a timely and prompt manner. All cell phones have self-localizing capability under the FCC E911 mandate, so the communicated information can be further tagged automatically with location and time information at the server, making all of it available through the secure web portal.
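    The structured-report idea can be sketched as assembling selections, free text, and image references into one JSON payload, auto-tagged with location and time, which the client would then POST to the server. All field names here are assumptions, not the PreNotiS schema.

```python
import json
from datetime import datetime, timezone

def build_report(selections, text, image_refs, lat, lon):
    """Assemble a structured incident report. Field names are
    illustrative; the real system defines its own schema."""
    return {
        "selections": selections,              # predefined menu choices
        "text": text,                          # free-text description
        "images": image_refs,                  # captured photo references
        "location": {"lat": lat, "lon": lon},  # E911-style auto-tag
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

report = build_report(["fire", "smoke"], "smoke near gate 3",
                      ["img_001.jpg"], 29.42, -98.49)
payload = json.dumps(report)  # what the client would POST to the server
print(sorted(report))  # ['images', 'location', 'selections', 'text', 'timestamp']
```

Because every report shares this shape, the server can sort incoming submissions straight into database columns instead of parsing free-form messages.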

  20. Three-Dimensional Audio Client Library

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.

    2005-01-01

    The Three-Dimensional Audio Client Library (3DAudio library) is a group of software routines written to facilitate development of both stand-alone (audio only) and immersive virtual-reality application programs that utilize three-dimensional audio displays. The library is intended to enable the development of three-dimensional audio client application programs by use of a code base common to multiple audio server computers. The 3DAudio library calls vendor-specific audio client libraries and currently supports the AuSIM Gold-Server and Lake Huron audio servers. 3DAudio library routines contain common functions for (1) initiation and termination of a client/audio server session, (2) configuration-file input, (3) positioning functions, (4) coordinate transformations, (5) audio transport functions, (6) rendering functions, (7) debugging functions, and (8) event-list-sequencing functions. The 3DAudio software is written in the C++ programming language and currently operates under the Linux, IRIX, and Windows operating systems.

  1. An Array Library for Microsoft SQL Server with Astrophysical Applications

    NASA Astrophysics Data System (ADS)

    Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.

    2012-09-01

    Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques beyond what is used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out-of-the-box commercial database systems perform very well in scientific applications from the perspective of data storage optimization, data retrieval, and memory management, but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fixed-size arrays of all basic numeric data types can be created and manipulated efficiently. The library is also designed to integrate seamlessly with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transformations, can be done on the fly, from SQL code, inside the database server process. 
We are currently testing the prototype with two different scientific data sets: The Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory project will use it to store galaxy simulation data.
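    The underlying idea, keeping fixed-size numeric arrays as binary values inside the database so computation can happen near the data, can be sketched with stdlib tools (SQLite and the `array` module stand in here for SQL Server and the Array Library):

```python
import sqlite3
from array import array

# Store a fixed-size array of doubles as a binary blob, the same idea
# the Array Library realises natively inside SQL Server.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE densities (id INTEGER PRIMARY KEY, data BLOB)")

values = array("d", [0.1, 0.2, 0.7])
db.execute("INSERT INTO densities (id, data) VALUES (1, ?)",
           (values.tobytes(),))

# Retrieve the blob and reconstruct the array without any per-element
# row overhead.
blob = db.execute("SELECT data FROM densities WHERE id = 1").fetchone()[0]
restored = array("d")
restored.frombytes(blob)
print(list(restored))  # [0.1, 0.2, 0.7]
```

The SQL Server extension goes further by making such arrays first-class SQL values that server-side code (BLAS, FFTW, etc.) can operate on directly, avoiding the round trip to the client entirely.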

  2. The Database Query Support Processor (QSP)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The number and diversity of databases available to users continues to increase dramatically. Currently, the trend is towards decentralized, client-server architectures that (on the surface) are less expensive to acquire, operate, and maintain than information architectures based on centralized, monolithic mainframes. The database query support processor (QSP) effort evaluates the performance of a network-level, heterogeneous database access capability. Air Force Materiel Command's Rome Laboratory has developed an approach to seamless access to heterogeneous databases, based on ANSI standard X3.138-1988, 'The Information Resource Dictionary System (IRDS)', and on extensions to data dictionary technology. To successfully query a decentralized information system, users must know what data are available from which source, or have the knowledge and system privileges necessary to find out. Privacy and security considerations prohibit free and open access to every information system in every network. Even in completely open systems, the time required to locate relevant data (in systems of any appreciable size) would be better spent analyzing the data, assuming the original question was not forgotten. Extensions to data dictionary technology have the potential to more fully automate the search and retrieval of relevant data in a decentralized environment. Substantial amounts of time and money could be saved by not having to teach users what data reside in which systems and how to access each of those systems. Information describing data and how to get it could be removed from the application and placed in a dedicated repository where it belongs. The result is simplified applications that are less brittle and less expensive to build and maintain. Software technology providing the required functionality is off the shelf. The key difficulty is in defining the metadata required to support the process. 
The database query support processor effort will provide quantitative data on the amount of effort required to implement an extended data dictionary at the network level, add new systems, adapt to changing user needs, and provide sound estimates on operations and maintenance costs and savings.
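    The extended-data-dictionary idea can be sketched as a single metadata repository that resolves a dataset name to the system holding it, so applications carry no hard-wired knowledge of the network. All entries and field names below are invented for illustration:

```python
# A toy extended data dictionary: metadata about where data lives and
# how to reach it is held in one repository instead of being baked
# into each application.
DATA_DICTIONARY = {
    "engine_test_results": {"system": "propulsion_db",
                            "host": "lab4.example.mil",
                            "access": "sql"},
    "parts_inventory":     {"system": "logistics_db",
                            "host": "depot1.example.mil",
                            "access": "odbc"},
}

def locate(dataset):
    """Resolve a dataset name to the system that holds it, so the
    user never needs to know the network layout."""
    entry = DATA_DICTIONARY.get(dataset)
    if entry is None:
        raise KeyError(f"no source registered for {dataset!r}")
    return entry["system"], entry["host"]

print(locate("parts_inventory"))  # ('logistics_db', 'depot1.example.mil')
```

Adding a new system then means adding dictionary entries, not modifying every application that might query it, which is where the projected maintenance savings come from.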

  3. Using feminist, emotion-focused, and developmental approaches to enhance cognitive-behavioral therapies for posttraumatic stress disorder related to childhood sexual abuse.

    PubMed

    Cohen, Jacqueline N

    2008-06-01

    A body of research indicates the efficacy of cognitive-behavioral interventions for the treatment of posttraumatic stress disorder (PTSD) subsequent to sexual assault in adulthood. The generalizability of these treatments to women who present with trauma symptoms associated with childhood sexual abuse (CSA) has yet to be shown, however. A number of characteristics and dynamics of CSA that make it unique from sexual assault in adulthood are described, specifically its disruption of normal childhood development, its impact on attachment style and interpersonal relationships, its inescapability, and the stigma attached to it. Then, drawing on the developmental, emotion-focused, and feminist literatures, a number of considerations that would enhance the application of cognitive-behavioral trauma therapies to the treatment of women with PTSD related to CSA are delineated. These considerations relate to providing clients with corrective interpersonal experiences, creating new relationship events, enhancing affect regulation skills before initiating exposure therapy, considering the time elapsed since the abuse, addressing themes of power, betrayal, self-blame, stigma, and sex-related cognitions and emotions, and helping clients develop a feminist consciousness. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  4. NeisseriaBase: a specialised Neisseria genomic resource and analysis platform.

    PubMed

    Zheng, Wenning; Mutha, Naresh V R; Heydari, Hamed; Dutta, Avirup; Siow, Cheuk Chuen; Jakubovics, Nicholas S; Wee, Wei Yee; Tan, Shi Yang; Ang, Mia Yang; Wong, Guat Jah; Choo, Siew Woh

    2016-01-01

    Background. The Gram-negative Neisseria is associated with two of the most potent human epidemic diseases: meningococcal meningitis and gonorrhoea. In both cases, disease is caused by bacteria colonizing human mucosal membrane surfaces. Overall, the genus shows great diversity and genetic variation, mainly due to its ability to acquire and incorporate genetic material from a diverse range of sources through horizontal gene transfer. Although a number of databases exist for the Neisseria genomes, they are mostly focused on the pathogenic species. In this study we present the freely available NeisseriaBase, a database dedicated to the genus Neisseria encompassing the complete and draft genomes of 15 pathogenic and commensal Neisseria species. Methods. The genomic data were retrieved from the National Center for Biotechnology Information (NCBI), annotated using the RAST server, and then stored in a MySQL database. The protein-coding genes were further analyzed to obtain information such as GC content (%), predicted hydrophobicity and molecular weight (Da) using in-house Perl scripts. The web application was developed following the secure four-tier web application architecture: (1) client workstation, (2) web server, (3) application server, and (4) database server. The web interface was constructed using PHP, JavaScript, jQuery, AJAX and CSS, utilizing the model-view-controller (MVC) framework. The in-house bioinformatics tools implemented in NeisseriaBase were developed using Python, Perl, BioPerl and R. Results. Currently, NeisseriaBase houses 603,500 coding sequences (CDSs), 16,071 RNAs and 13,119 tRNA genes from 227 Neisseria genomes. The database is equipped with interactive web interfaces. Incorporation of the JBrowse genome browser in the database enables fast and smooth browsing of Neisseria genomes.
NeisseriaBase includes the standard BLAST program to facilitate homology searching, and for Virulence Factor Database (VFDB) specific homology searches, the VFDB BLAST is also incorporated into the database. In addition, NeisseriaBase is equipped with in-house designed tools such as the Pairwise Genome Comparison tool (PGC) for comparative genomic analysis and the Pathogenomics Profiling Tool (PathoProT) for the comparative pathogenomics analysis of Neisseria strains. Discussion. This user-friendly database not only provides access to a host of genomic resources on Neisseria but also enables high-quality comparative genome analysis, which is crucial for the expanding scientific community interested in Neisseria research. This database is freely available at http://neisseria.um.edu.my.
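    As a rough illustration of the kind of per-gene statistic the abstract mentions, GC content can be computed in a few lines. The originals were in-house Perl scripts; this Python sketch is not the authors' code, and the sequence is invented.

```python
def gc_content(seq):
    """Percentage of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

value = gc_content("ATGCGC")  # 4 of the 6 bases are G or C
```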

  5. NeisseriaBase: a specialised Neisseria genomic resource and analysis platform

    PubMed Central

    Zheng, Wenning; Mutha, Naresh V.R.; Heydari, Hamed; Dutta, Avirup; Siow, Cheuk Chuen; Jakubovics, Nicholas S.; Wee, Wei Yee; Tan, Shi Yang; Ang, Mia Yang; Wong, Guat Jah

    2016-01-01

    Background. The Gram-negative Neisseria is associated with two of the most potent human epidemic diseases: meningococcal meningitis and gonorrhoea. In both cases, disease is caused by bacteria colonizing human mucosal membrane surfaces. Overall, the genus shows great diversity and genetic variation, mainly due to its ability to acquire and incorporate genetic material from a diverse range of sources through horizontal gene transfer. Although a number of databases exist for the Neisseria genomes, they are mostly focused on the pathogenic species. In this study we present the freely available NeisseriaBase, a database dedicated to the genus Neisseria encompassing the complete and draft genomes of 15 pathogenic and commensal Neisseria species. Methods. The genomic data were retrieved from the National Center for Biotechnology Information (NCBI), annotated using the RAST server, and then stored in a MySQL database. The protein-coding genes were further analyzed to obtain information such as GC content (%), predicted hydrophobicity and molecular weight (Da) using in-house Perl scripts. The web application was developed following the secure four-tier web application architecture: (1) client workstation, (2) web server, (3) application server, and (4) database server. The web interface was constructed using PHP, JavaScript, jQuery, AJAX and CSS, utilizing the model-view-controller (MVC) framework. The in-house bioinformatics tools implemented in NeisseriaBase were developed using Python, Perl, BioPerl and R. Results. Currently, NeisseriaBase houses 603,500 coding sequences (CDSs), 16,071 RNAs and 13,119 tRNA genes from 227 Neisseria genomes. The database is equipped with interactive web interfaces. Incorporation of the JBrowse genome browser in the database enables fast and smooth browsing of Neisseria genomes.
NeisseriaBase includes the standard BLAST program to facilitate homology searching, and for Virulence Factor Database (VFDB) specific homology searches, the VFDB BLAST is also incorporated into the database. In addition, NeisseriaBase is equipped with in-house designed tools such as the Pairwise Genome Comparison tool (PGC) for comparative genomic analysis and the Pathogenomics Profiling Tool (PathoProT) for the comparative pathogenomics analysis of Neisseria strains. Discussion. This user-friendly database not only provides access to a host of genomic resources on Neisseria but also enables high-quality comparative genome analysis, which is crucial for the expanding scientific community interested in Neisseria research. This database is freely available at http://neisseria.um.edu.my. PMID:27017950

  6. Letter Writing as a Tool To Increase Client Motivation To Change: Application to an Inpatient Crisis Unit.

    ERIC Educational Resources Information Center

    Tubman, Jonathan G.; Montgomery, Marilyn J.; Wagner, Eric F.

    2001-01-01

    Describes the application of a letter writing exercise as a motivational technique for group counseling in contemporary crisis unit settings. Discusses guidelines and implications for clinical practice with clients with multiple, chronic problems. (Contains 37 references and 1 table.) (GCP)

  7. Effectiveness of motivational interviewing on lifestyle modification and health outcomes of clients at risk or diagnosed with cardiovascular diseases: A systematic review.

    PubMed

    Lee, Windy W M; Choi, K C; Yum, Royce W Y; Yu, Doris S F; Chair, S Y

    2016-01-01

    Clinically, there is an increasing trend in using motivational interviewing as a counseling method to help clients with cardiovascular diseases to modify their unhealthy lifestyle in order to decrease the risk of disease occurrence. As motivational interviewing has gained increased attention, research has been conducted to examine its effectiveness. This review attempts to identify the best available evidence related to the effectiveness of motivational interviewing on lifestyle modification, physiological and psychological outcomes for clients at risk of developing or with established cardiovascular diseases. Systematic review of studies incorporating motivational interviewing in modifying lifestyles, improving physiological and psychological outcomes for clients at risk of or diagnosed with cardiovascular diseases. Major English and Chinese electronic databases were searched to identify citations that reported the effectiveness of motivational interviewing. The searched databases included MEDLINE, British Nursing Index, CINAHL Plus, PsycINFO, SCOPUS, CJN, CBM, HyRead, WanFang Data, Digital Dissertation Consortium, and so on. Two reviewers independently assessed the relevance of citations based on the inclusion criteria. Full texts of potential citations were retrieved for more detailed review. Critical appraisal was conducted by using the standardized critical appraisal checklist for randomized and quasi-randomized controlled studies from the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI). After eligibility screening, 14 articles describing 9 studies satisfied the inclusion criteria and were included in the analysis. Only certain outcomes in certain studies were pooled for meta-analysis because of the large variability of the studies included; other findings were presented in narrative form.
For lifestyle modification, the review showed that motivational interviewing could be more effective than usual care in altering smoking habits. For physiological outcomes, the review showed that motivational interviewing improved clients' systolic and diastolic blood pressure, but the result was not statistically significant. For psychological outcomes, the review showed that motivational interviewing might have a favorable effect on improving clients' depression. For other outcomes, the review showed that motivational interviewing did not differ from usual care, or usual care was even more effective. The review showed that motivational interviewing might have favorable effects on changing clients' smoking habits, depression, and three SF-36 domains. For the other outcomes, most of the results were inconclusive. Further studies should be performed to identify the optimal format and frequency of motivational interviewing. Primary research on the effectiveness of motivational interviewing on increasing clients' motivation and their actual changes in healthy behavior is also recommended. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Implementing home care in Canada: four critical elements.

    PubMed

    Richardson, B

    2000-01-01

    While MacAdam proposes a "national approach to home care," the obstacles to this are well known and substantial. They are the likely cost and the limitations of the federal government's role in healthcare. Building on MacAdam's assessment, this paper outlines four problems embedded in the various home-care service delivery models in Canada: the lack of factual client outcome information to support decision-making, the limited client choice of provider, the perverse incentive of fee for service and the bias against the for-profit provider. The paper proposes that the assessment, classification and measurement of outcomes for every recipient of home-care services be standardized using a proven assessment instrument, such as OASIS-B or MDS-HC, by healthcare professionals certified in its use. The resulting information would be captured in a regional database and available for analysis and research. CIHI would be contracted to manage a national database and to fund the training and certification of assessors. The paper proposes a new service delivery and funding model, utilizing standard client outcome information, different roles for regional health authorities and service providers, and a prospective payment mechanism replacing fee for service. A national home care program may be an elusive dream, but that shouldn't stop experimentation, evaluation and improvement.

  9. Use of a secure Internet Web site for collaborative medical research.

    PubMed

    Marshall, W W; Haley, R W

    2000-10-11

    Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
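    One of the security measures listed, user-name and password authentication, is commonly implemented by storing only salted password hashes so that a stolen database does not reveal plaintext passwords. A minimal sketch using Python's standard library (not taken from the paper; the example password is invented):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; store (salt, digest), never the plaintext."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Re-derive the hash and compare in constant time."""
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(digest, stored_digest)

salt, stored = hash_password("correct horse")
```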

  10. GenExp: an interactive web-based genomic DAS client with client-side data rendering.

    PubMed

    Gel Moreno, Bernat; Messeguer Peypoch, Xavier

    2011-01-01

    The Distributed Annotation System (DAS) offers a standard protocol for sharing and integrating annotations on biological sequences. There are more than 1000 DAS sources available and the number is steadily increasing. Clients are an essential part of the DAS system and integrate data from several independent sources in order to create a useful representation to the user. While web-based DAS clients exist, most of them do not have direct interaction capabilities such as dragging and zooming with the mouse. Here we present GenExp, a web-based and fully interactive visual DAS client. GenExp is a genome-oriented DAS client capable of creating informative representations of genomic data, zooming out from base level to complete chromosomes. It proposes a novel approach to genomic data rendering and uses the latest HTML5 web technologies to create the data representation inside the client browser. Thanks to client-side rendering, most position changes do not need a network request to the server, and so responses to zooming and panning are almost immediate. In GenExp it is possible to explore the genome intuitively, moving it with the mouse just like in geographical map applications. Additionally, in GenExp it is possible to have more than one data viewer at the same time and to save the current state of the application to revisit it later on. GenExp is a new interactive web-based client for DAS and addresses some of the shortcomings of the existing clients. It uses client-side data rendering techniques resulting in easier genome browsing and exploration. GenExp is open source under the GPL license and it is freely available at http://gralggen.lsi.upc.edu/recerca/genexp.
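    For readers unfamiliar with DAS, a client request is an ordinary HTTP GET whose response is XML. The sketch below builds a DAS 1.5-style features URL and parses an abbreviated DASGFF document; the server name, segment, and feature values are invented for illustration.

```python
import xml.etree.ElementTree as ET

def das_features_url(server, source, segment, start, stop):
    """Build a DAS 'features' request URL (DAS 1.5-style)."""
    return f"{server}/das/{source}/features?segment={segment}:{start},{stop}"

def parse_features(xml_text):
    """Extract (id, start, end) triples from a DASGFF response."""
    root = ET.fromstring(xml_text)
    return [(f.get("id"), int(f.findtext("START")), int(f.findtext("END")))
            for f in root.iter("FEATURE")]

# Abbreviated example response (a real one carries many more fields).
sample = """<DASGFF><GFF><SEGMENT id="1" start="100" stop="200">
<FEATURE id="f1"><TYPE id="exon">exon</TYPE><START>120</START><END>180</END></FEATURE>
</SEGMENT></GFF></DASGFF>"""

url = das_features_url("http://example.org", "hg18", "1", 100, 200)
features = parse_features(sample)
```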

  11. GenExp: An Interactive Web-Based Genomic DAS Client with Client-Side Data Rendering

    PubMed Central

    Gel Moreno, Bernat; Messeguer Peypoch, Xavier

    2011-01-01

    Background The Distributed Annotation System (DAS) offers a standard protocol for sharing and integrating annotations on biological sequences. There are more than 1000 DAS sources available and the number is steadily increasing. Clients are an essential part of the DAS system and integrate data from several independent sources in order to create a useful representation to the user. While web-based DAS clients exist, most of them do not have direct interaction capabilities such as dragging and zooming with the mouse. Results Here we present GenExp, a web-based and fully interactive visual DAS client. GenExp is a genome-oriented DAS client capable of creating informative representations of genomic data, zooming out from base level to complete chromosomes. It proposes a novel approach to genomic data rendering and uses the latest HTML5 web technologies to create the data representation inside the client browser. Thanks to client-side rendering, most position changes do not need a network request to the server, and so responses to zooming and panning are almost immediate. In GenExp it is possible to explore the genome intuitively, moving it with the mouse just like in geographical map applications. Additionally, in GenExp it is possible to have more than one data viewer at the same time and to save the current state of the application to revisit it later on. Conclusions GenExp is a new interactive web-based client for DAS and addresses some of the shortcomings of the existing clients. It uses client-side data rendering techniques resulting in easier genome browsing and exploration. GenExp is open source under the GPL license and it is freely available at http://gralggen.lsi.upc.edu/recerca/genexp. PMID:21750706

  12. The GRIDView Visualization Package

    NASA Astrophysics Data System (ADS)

    Kent, B. R.

    2011-07-01

    Large three-dimensional data cubes, catalogs, and spectral line archives are increasingly important elements of the data discovery process in astronomy. Visualization of large data volumes is of vital importance for the success of large spectral line surveys. Examples of data reduction utilizing the GRIDView software package are shown. The package allows users to manipulate data cubes, extract spectral profiles, and measure line properties. The package and included graphical user interfaces (GUIs) are designed with pipeline infrastructure in mind. The software has been used with great success analyzing spectral line and continuum data sets obtained from large radio survey collaborations. The tools are also important for multi-wavelength cross-correlation studies and incorporate Virtual Observatory client applications for overlaying database information in real time as cubes are examined by users.

  13. Effect of Temporal Relationships in Associative Rule Mining for Web Log Data

    PubMed Central

    Mohd Khairudin, Nazli; Mustapha, Aida

    2014-01-01

    The advent of web-based applications and services has created diverse and voluminous web log data stored in web servers, proxy servers, client machines, or organizational databases. This paper attempts to investigate the effect of a temporal attribute in association rule mining for web log data. We incorporated the characteristics of time in the rule mining process and analysed the effect of various temporal parameters. The rules generated from temporal association rule mining are then compared against the rules generated from classical rule mining approaches such as the Apriori and FP-Growth algorithms. The results showed that by incorporating the temporal attribute via time, the number of rules generated is smaller but comparable in terms of quality. PMID:24587757
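    The idea of constraining rule mining by a temporal attribute can be illustrated with a toy session log: restrict the transactions to a time window first, then count co-occurrences. This sketch uses invented page names and a simple pair-counting stand-in for Apriori; a real miner would also iterate to larger itemsets and derive rules.

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(sessions, min_support):
    """Count page pairs co-occurring in at least min_support sessions."""
    counts = Counter()
    for pages in sessions:
        for pair in combinations(sorted(set(pages)), 2):
            counts[pair] += 1
    return {p: c for p, c in counts.items() if c >= min_support}

# Temporal slicing: mine only sessions inside a time window, which
# typically yields fewer (but comparable-quality) rules.
log = [
    (9, ["home", "news"]),           # (hour of day, pages visited)
    (10, ["home", "news", "sport"]),
    (22, ["home", "sport"]),
]
daytime = [pages for hour, pages in log if 8 <= hour <= 18]
rules_input = frequent_pairs(daytime, 2)
```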

  14. Data exchange technology based on handshake protocol for industrial automation system

    NASA Astrophysics Data System (ADS)

    Astafiev, A. V.; Shardin, T. O.

    2018-05-01

    In this article, data exchange technology based on a handshake protocol for industrial automation systems is considered. Methods of organizing the technology in client-server applications are analyzed. The main threats to client-server applications that arise during information interaction between users are identified. A comparative analysis of analogous systems was carried out, as a result of which the most suitable option was chosen for further use. The basic schemes of operation of the handshake protocol are shown, as well as the general scheme of the implemented application, which describes the entire process of interaction between the client and the server.
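    The article does not give its protocol's message format, so the following is a generic model of an application-level handshake with invented message fields: the client's HELLO is answered with a session token, and data is accepted only when the token matches.

```python
import secrets

def server_handle(state, message):
    """Server side of a toy handshake; returns (reply, new_state)."""
    kind = message.get("type")
    if kind == "HELLO":
        # Handshake: issue a fresh session token the client must echo back.
        token = secrets.token_hex(8)
        return {"type": "ACK", "token": token}, {"token": token}
    if kind == "DATA" and message.get("token") == state.get("token"):
        return {"type": "OK"}, state
    return {"type": "REJECT"}, state

state = {}
reply, state = server_handle(state, {"type": "HELLO"})
ok, state = server_handle(state, {"type": "DATA",
                                  "token": reply["token"], "payload": 42})
bad, state = server_handle(state, {"type": "DATA", "token": "forged"})
```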

  15. Doing Your Science While You're in Orbit

    NASA Astrophysics Data System (ADS)

    Green, Mark L.; Miller, Stephen D.; Vazhkudai, Sudharshan S.; Trater, James R.

    2010-11-01

    Large-scale neutron facilities such as the Spallation Neutron Source (SNS) located at Oak Ridge National Laboratory need easy-to-use access to Department of Energy Leadership Computing Facilities and experiment repository data. The Orbiter thick- and thin-client and its supporting Service Oriented Architecture (SOA) based services (available at https://orbiter.sns.gov) consist of standards-based components that are reusable and extensible for accessing high performance computing, data and computational grid infrastructure, and cluster-based resources easily from a user-configurable interface. The primary Orbiter system goals consist of (1) developing infrastructure for the creation and automation of virtual instrumentation experiment optimization, (2) developing user interfaces for thin- and thick-client access, (3) providing a prototype incorporating major instrument simulation packages, and (4) facilitating neutron science community access and collaboration. The secure Orbiter SOA authentication and authorization is achieved through the developed Virtual File System (VFS) services, which use Role-Based Access Control (RBAC) for data repository file access, thin- and thick-client functionality and application access, and computational job workflow management. The VFS Relational Database Management System (RDBMS) consists of approximately 45 database tables describing 498 user accounts with 495 groups over 432,000 directories with 904,077 repository files. Over 59 million NeXus file metadata records are associated to the 12,800 unique NeXus file field/class names generated from the 52,824 repository NeXus files.
Services that enable (a) summary dashboards of data repository status with Quality of Service (QoS) metrics, (b) data repository NeXus file field/class name full-text search capabilities within a Google-like interface, (c) a fully functional RBAC browser for the read-only data repository and shared areas, (d) user/group-defined and shared metadata for data repository files, and (e) user, group, repository, and Web 2.0-based global positioning with additional service capabilities are currently available. The SNS-based Orbiter SOA integration progress with the Distributed Data Analysis for Neutron Scattering Experiments (DANSE) software development project is summarized, with an emphasis on DANSE Central Services and the Virtual Neutron Facility (VNF). Additionally, the DANSE utilization of the Orbiter SOA authentication, authorization, and data transfer services best practice implementations are presented.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher,D

    Concerns about the long-term viability of SFS as the metadata store for HPSS have been increasing. A concern that Transarc may discontinue support for SFS motivates us to consider alternative means to store HPSS metadata. The obvious alternative is a commercial database. Commercial databases have the necessary characteristics for storage of HPSS metadata records. They are robust and scalable and can easily accommodate the volume of data that must be stored. They provide programming interfaces, transactional semantics and a full set of maintenance and performance enhancement tools. A team was organized within the HPSS project to study and recommend an approach for the replacement of SFS. Members of the team are David Fisher, Jim Minton, Donna Mecozzi, Danny Cook, Bart Parliman and Lynn Jones. We examined several possible solutions to the problem of replacing SFS, and recommended on May 22, 2000, in a report to the HPSS Technical and Executive Committees, to change HPSS into a database application over either Oracle or DB2. We recommended either Oracle or DB2 on the basis of market share and technical suitability. Oracle and DB2 are dominant offerings in the market, and it is in the best interest of HPSS to use a major player's product. Both databases provide a suitable programming interface. Transaction management functions, support for multi-threaded clients and data manipulation languages (DML) are available. These findings were supported in meetings held with technical experts from both companies. In both cases, the evidence indicated that either database would provide the features needed to host HPSS.
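    The transactional semantics the report values can be illustrated with any SQL database. The sketch below uses SQLite purely for illustration (HPSS considered Oracle and DB2; the table and values are invented): when one statement in a transaction fails, the whole transaction rolls back, leaving the metadata consistent.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bitfile (id TEXT PRIMARY KEY, size INTEGER)")

# Either both inserts commit or neither does: the connection used as a
# context manager wraps the block in a single transaction.
try:
    with conn:
        conn.execute("INSERT INTO bitfile VALUES ('f1', 1024)")
        conn.execute("INSERT INTO bitfile VALUES ('f1', 2048)")  # PK violation
except sqlite3.IntegrityError:
    pass  # the failed transaction was rolled back

rows = conn.execute("SELECT COUNT(*) FROM bitfile").fetchone()[0]
```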

  17. Ultrabroadband photonic internet: safety aspects

    NASA Astrophysics Data System (ADS)

    Kalicki, Arkadiusz; Romaniuk, Ryszard

    2008-11-01

    Web applications have become the most popular medium on the Internet. Their popularity and the ease of use of web application frameworks, combined with careless development, result in a high number of vulnerabilities and attacks. Several types of attacks are possible because of improper input validation. SQL injection is the ability to execute arbitrary SQL queries in a database through an existing application. Cross-site scripting is a vulnerability that allows malicious web users to inject code into the web pages viewed by other users. Cross-site request forgery (CSRF) is an attack that tricks the victim into loading a page that contains a malicious request. Web spam in blogs is a further problem. There are several techniques to mitigate these attacks. Most important are strong web application design, correct input validation, defined data types for each field, and parameterized statements in SQL queries. Server hardening with a firewall, modern security policy systems, and safe configuration of the web framework interpreter are essential. It is advisable to maintain a proper security level on the client side, keep software updated, and install personal web firewalls or IDS/IPS systems. Good habits include logging out from services immediately after finishing work and using a separate web browser for the most important sites, such as e-banking.
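    The parameterized-statement defence mentioned above binds user input as data rather than splicing it into SQL text, so a classic `' OR '1'='1` payload matches nothing. A minimal demonstration (table and values invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup(name):
    # Parameterized statement: the input is bound as a value, never
    # interpreted as SQL text, so injection payloads are inert.
    return conn.execute("SELECT secret FROM users WHERE name = ?",
                        (name,)).fetchall()

safe = lookup("alice")
attack = lookup("' OR '1'='1")  # treated as a literal (non-matching) name
```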

  18. Becoming psychotherapists: Experiences of novice trainees in a beginning graduate class.

    PubMed

    Hill, Clara E; Sullivan, Catherine; Knox, Sarah; Schlosser, Lewis Z

    2007-12-01

    The authors investigated the experiences related to becoming psychotherapists for 5 counseling psychology doctoral trainees in their first prepracticum course. Qualitative analyses of weekly journals indicated that trainees discussed challenges related to becoming psychotherapists (e.g., being self-critical, having troubling reactions to clients, learning to use helping skills), gains made during the semester related to becoming psychotherapists (e.g., using helping skills more effectively, becoming less self-critical, being able to connect with clients), as well as experiences in supervision and activities that helped them cope with their anxieties. Results are discussed in 5 broad areas: feelings about self in role of psychotherapist, awareness of reactions to clients, learning and using helping skills, reactions to supervision, and experiences that fostered growth. Implications for training and research are provided. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  19. The implementation of psychiatric advance directives: experiences from a Dutch crisis card initiative.

    PubMed

    van der Ham, Alida J; Voskes, Yolande; van Kempen, Nel; Broerse, Jacqueline E W; Widdershoven, Guy A M

    2013-06-01

    The crisis card is a specific form of psychiatric advance directive, documenting mental health clients' treatment preferences in advance of a potential psychiatric crisis. In this paper, we aim to provide insight into implementation issues surrounding the crisis card. A Dutch crisis-card project formed the scope of this study. Data were collected through interviews with 15 participants from six stakeholder groups. Identified implementation issues are: (a) the role of the crisis-card counselor, (b) lack of distribution and familiarity, (c) care professionals' routines, and (d) client readiness. The crisis-card counselor appears to play a key role in fostering benefits of the crisis card by supporting clients' perspectives. More structural integration of the crisis card in care processes may enhance its impact, but should be carefully explored. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  20. Geosciences Information Network (GIN): A modular, distributed, interoperable data network for the geosciences

    NASA Astrophysics Data System (ADS)

    Allison, M.; Gundersen, L. C.; Richard, S. M.; Dickinson, T. L.

    2008-12-01

    A coalition of the state geological surveys (AASG), the U.S. Geological Survey (USGS), and partners will receive NSF funding over 3 years under the INTEROP solicitation to start building the Geoscience Information Network (www.geoinformatics.info/gin), a distributed, interoperable data network. The GIN project will develop standardized services to link existing and in-progress components using a few standards and protocols, and work with data providers to implement these services. The key components of this network are 1) catalog system(s) for data discovery; 2) service definitions for interfaces for searching catalogs and accessing resources; 3) shared interchange formats to encode information for transmission (e.g. various XML markup languages); 4) data providers that publish information using standardized services defined by the network; and 5) client applications adapted to use information resources provided by the network. The GIN will integrate and use catalog resources that currently exist or are in development. We are working with the USGS National Geologic Map Database's existing map catalog, with the USGS National Geological and Geophysical Data Preservation Program, which is developing a metadata catalog (National Digital Catalog) for geoscience information resource discovery, and with the GEON catalog. Existing interchange formats will be used, such as GeoSciML, ChemML, and Open Geospatial Consortium sensor, observation and measurement MLs. Client application development will be fostered by collaboration with industry and academic partners. The GIN project will focus on the remaining aspects of the system -- service definitions and assistance to data providers to implement the services and bring content online -- and on system integration of the modules.
Initial formal collaborators include the OneGeology-Europe consortium of 27 nations that is building a comparable network under the EU INSPIRE initiative, GEON, EarthChem, and GIS software company ESRI. OneGeology-Europe and GIN have agreed to integrate their networks, effectively adopting shared global standards across geological surveys. ESRI is creating a Geology Data Model for ArcGIS software to be compatible with GIN, and other companies are expressing interest in adapting their services, applications, and clients to take advantage of the large data resources planned to become available through GIN.

  1. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    PubMed

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

    Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application to the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective to identify the most affected areas and their spatial impact.
This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.

  2. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision-making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact.
This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392
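
    Spatial density maps of the kind described above are commonly computed by evaluating a kernel density estimate over the geocoded case locations on a regular grid. The sketch below is illustrative only: the coordinates, grid size, and bandwidth are invented for the example and are not taken from the study.

```python
import math

def density_grid(points, bbox, nx, ny, bandwidth):
    """Gaussian kernel density estimate of geocoded cases on a regular grid.

    points: list of (x, y) coordinates (e.g. projected case addresses)
    bbox:   (xmin, ymin, xmax, ymax) of the study area
    """
    xmin, ymin, xmax, ymax = bbox
    dx, dy = (xmax - xmin) / nx, (ymax - ymin) / ny
    grid = [[0.0] * nx for _ in range(ny)]
    for px, py in points:
        for j in range(ny):
            cy = ymin + (j + 0.5) * dy          # cell-center y
            for i in range(nx):
                cx = xmin + (i + 0.5) * dx      # cell-center x
                d2 = (cx - px) ** 2 + (cy - py) ** 2
                grid[j][i] += math.exp(-d2 / (2 * bandwidth ** 2))
    return grid

# Two clustered cases near (1, 1) produce a hot spot in that corner.
cases = [(1.0, 1.0), (1.2, 1.1), (8.0, 8.0)]
g = density_grid(cases, (0, 0, 10, 10), 10, 10, 1.0)
```

    A geoprocessing web service would run such a computation server-side against the backend spatial database and return the grid for the map viewer to render.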

  3. Database for the collection and analysis of clinical data and images of neoplasms of the sinonasal tract.

    PubMed

    Trimarchi, Matteo; Lund, Valerie J; Nicolai, Piero; Pini, Massimiliano; Senna, Massimo; Howard, David J

    2004-04-01

The Neoplasms of the Sinonasal Tract software package (NSNT v 1.0) implements a complete visual database for patients with sinonasal neoplasia, facilitating standardization of data and statistical analysis. The software, which is compatible with the Macintosh and Windows platforms, provides a multiuser application with a dedicated server (on Windows NT or 2000, or Macintosh OS 9 or X, with a network of clients) together with web access, if required. The system hardware consists of an Apple Power Macintosh G4 500 MHz computer with PCI bus, 256 MB of RAM and a 60 GB hard disk, or any IBM-compatible computer with a Pentium II processor. Image acquisition may be performed with different frame-grabber cards for analog or digital video input of different standards (PAL, SECAM, or NTSC) and levels of quality (VHS, S-VHS, Betacam, Mini DV, DV). The visual database is based on 4th Dimension by 4D, Inc., and video compression is performed in real time in MPEG format. Six sections have been developed: demographics, symptoms, extent of disease, radiology, treatment, and follow-up. Acquisition of data includes computed tomography and magnetic resonance imaging, histology, and endoscopy images, allowing sequential comparison. Statistical analysis integral to the program provides Kaplan-Meier survival curves. The development of a dedicated, user-friendly database for sinonasal neoplasia facilitates a multicenter network and has obvious clinical and research benefits.

  4. Experiences with the Application of Services Oriented Approaches to the Federation of Heterogeneous Geologic Data Resources

    NASA Astrophysics Data System (ADS)

    Cervato, C.; Fils, D.; Bohling, G.; Diver, P.; Greer, D.; Reed, J.; Tang, X.

    2006-12-01

The federation of databases is not a new endeavor. Great strides have been made, for example, in the health and astrophysics communities. Reviews of those successes indicate that they have been able to leverage key cross-community core concepts. In its simplest implementation, a federation of databases with identical base schemas that can be extended to address individual efforts is relatively easy to accomplish. Efforts of groups like the Open Geospatial Consortium have shown methods to geospatially relate data between different sources. We present here a summary of CHRONOS's (http://www.chronos.org) experience with highly heterogeneous data. Our experience with the federation of very diverse databases shows that the wide variety of encoding options for items like locality, time scale, taxon ID, and other key parameters makes it difficult to effectively join data across them. However, the response to this is not to develop one large, monolithic database, which would suffer growing pains due to social, national, and operational issues, but rather to systematically develop the architecture that will enable cross-resource (database, repository, tool, interface) interaction. CHRONOS has cleared the major hurdle of federating small IT database efforts with service-oriented and XML-based approaches. The application of easy-to-use procedures that allow groups of all sizes to implement and experiment with searches across various databases and to use externally created tools is vital. We are sharing with the geoinformatics community the difficulties with application frameworks, user authentication, standards compliance, and data storage encountered in setting up web sites and portals for various science initiatives (e.g., ANDRILL, EARTHTIME).
The ability to incorporate CHRONOS data, services, and tools into the existing framework of a group is crucial to the development of a model that supports and extends the vitality of the small- to medium-sized research effort that is essential for a vibrant scientific community. This presentation will directly address issues of portal development related to JSR-168 and other portal APIs, as well as issues related to both federated and local directory-based authentication. The application of service-oriented architecture in connection with REST-based approaches is vital to facilitate service use by experienced and less experienced information technology groups. Application of these services with XML-based schemas allows for the connection to third-party tools such as GIS-based tools and software designed to perform a specific scientific analysis. The connection of all these capabilities into a combined framework based on the standard XHTML Document Object Model and CSS 2.0 standards used in traditional web development will be demonstrated. CHRONOS also utilizes newer client techniques such as AJAX and cross-domain scripting along with traditional server-side database, application, and web servers. The combination of the various components of this architecture creates an environment based on open and free standards that allows for the discovery, retrieval, and integration of tools and data.
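
    XML-based service responses of the kind described above can be consumed by third-party tools with very little client code. The sketch below is hypothetical: the element and attribute names are invented for illustration and are not CHRONOS's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML payload, shaped the way a federated data service
# might return sample records to a client tool.
payload = """<samples>
  <sample taxon="Globigerina" depth="12.5"/>
  <sample taxon="Orbulina" depth="14.0"/>
</samples>"""

def parse_samples(xml_text):
    """Turn a service response into plain dicts a GIS or analysis tool can use."""
    root = ET.fromstring(xml_text)
    return [{"taxon": s.get("taxon"), "depth": float(s.get("depth"))}
            for s in root.findall("sample")]

rows = parse_samples(payload)
```

    Because the payload is a self-describing XML document rather than a database-specific wire format, any group can write an equivalent parser in its own toolchain, which is the point of the service-oriented approach.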

  5. Information Collection using Handheld Devices in Unreliable Networking Environments

    DTIC Science & Technology

    2014-06-01

different types of mobile devices that connect wirelessly to a database server. The actual backend database is not important to the mobile clients...Google’s infrastructure and local servers with MySQL and PostgreSQL on the backend (ODK 2014b). (2) Google Fusion Tables are used to do basic link...how we conduct business. Our requirements to share information do not change simply because there is little or no existing infrastructure in our

  6. A Rich Client-Server Based Framework for Convenient Security and Management of Mobile Applications

    NASA Astrophysics Data System (ADS)

    Badan, Stephen; Probst, Julien; Jaton, Markus; Vionnet, Damien; Wagen, Jean-Frédéric; Litzistorf, Gérald

    Contact lists, Emails, SMS or custom applications on a professional smartphone could hold very confidential or sensitive information. What could happen in case of theft or accidental loss of such devices? Such events could be detected by the separation between the smartphone and a Bluetooth companion device. This event should typically block the applications and delete personal and sensitive data. Here, a solution is proposed based on a secured framework application running on the mobile phone as a rich client connected to a security server. The framework offers strong and customizable authentication and secured connectivity. A security server manages all security issues. User applications are then loaded via the framework. User data can be secured, synchronized, pushed or pulled via the framework. This contribution proposes a convenient although secured environment based on a client-server architecture using external authentications. Several features of the proposed system are exposed and a practical demonstrator is described.

  7. Assessing Client-Caregiver Relationships and the Applicability of the "Student-Teacher Relationship Scale" for People with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Roeden, John M.; Maaskant, Marian A.; Koomen, Helma M. Y.; Candel, Math J. J. M.; Curfs, Leopold M. G.

    2012-01-01

    Improvements in client-caregiver relationships may lead to improvements in the quality of life of clients with intellectual disabilities (ID). For this reason, interventions aimed at influencing these relationships are important. To gain insight into the nature and intention of these relationships in the ID population, suitable measurement…

  8. A Public-Key Based Authentication and Key Establishment Protocol Coupled with a Client Puzzle.

    ERIC Educational Resources Information Center

    Lee, M. C.; Fung, Chun-Kan

    2003-01-01

    Discusses network denial-of-service attacks which have become a security threat to the Internet community and suggests the need for reliable authentication protocols in client-server applications. Presents a public-key based authentication and key establishment protocol coupled with a client puzzle protocol and validates it through formal logic…
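
    Client puzzles of the kind referenced above are often realized hashcash-style: the server issues a fresh nonce, the client must find a counter whose hash meets a difficulty target, and the server verifies the work with a single hash. The sketch below is a generic illustration, not the protocol from this paper; the difficulty measure and message encoding are assumptions.

```python
import hashlib
import os

def make_puzzle(difficulty=2):
    """Server side: issue a fresh random nonce and a required difficulty
    (number of leading zero hex digits in the solution hash)."""
    return os.urandom(8).hex(), difficulty

def solve_puzzle(nonce, difficulty):
    """Client side: brute-force a counter whose hash meets the difficulty.
    The cost here is what deters denial-of-service floods."""
    counter = 0
    while True:
        digest = hashlib.sha256(f"{nonce}:{counter}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return counter
        counter += 1

def verify(nonce, difficulty, counter):
    """Server side: one cheap hash checks the client's expensive work."""
    digest = hashlib.sha256(f"{nonce}:{counter}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce, diff = make_puzzle()
solution = solve_puzzle(nonce, diff)
```

    The asymmetry (many hashes to solve, one hash to verify) is what lets the server stay cheap while attackers pay per connection attempt.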

  9. Treatment Outcome and Follow-Up Evaluation Based on Client Case Records in a Mental Health Center.

    ERIC Educational Resources Information Center

    Simons, Lynn S.; And Others

    1978-01-01

    Evaluated the application of Goal Attainment Scaling (GAS) to client case records as a measure of treatment effectiveness and examined its correspondence to other measures of outcome. Findings were that GAS scores converged significantly with therapist ratings of global improvement and GAS scores obtained from client reports at follow-up.…

  10. Delivering Cognitive Processing Therapy in a Community Health Setting: The Influence of Latino Culture and Community Violence on Posttraumatic Cognitions

    PubMed Central

    Marques, Luana; Eustis, Elizabeth H.; Dixon, Louise; Valentine, Sarah E.; Borba, Christina; Simon, Naomi; Kaysen, Debra; Wiltsey-Stirman, Shannon

    2015-01-01

    Despite the applicability of Cognitive Processing Therapy (CPT) for Posttraumatic Stress Disorder (PTSD) to addressing sequelae of a range of traumatic events, few studies have evaluated whether the treatment itself is applicable across diverse populations. The present study examined differences and similarities amongst non-Latino, Latino Spanish-speaking, and Latino English-speaking clients in rigid beliefs – or “stuck points” – associated with PTSD symptoms in a sample of community mental health clients. We utilized the procedures of content analysis to analyze stuck point logs and impact statements of 29 participants enrolled in a larger implementation trial for CPT. Findings indicated that the content of stuck points was similar across Latino and non-Latino clients, although fewer total stuck points were identified for Latino clients compared to non-Latino clients. Given that identification of stuck points is central to implementing CPT, difficulty identifying stuck points could pose significant challenges for implementing CPT among Latino clients and warrants further examination. Thematic analysis of impact statements revealed the importance of family, religion, and the urban context (e.g., poverty, violence exposure) in understanding how clients organize beliefs and emotions associated with trauma. Clinical recommendations for implementing CPT in community settings and the identification of stuck points are provided. PMID:25961865

  11. Delivering cognitive processing therapy in a community health setting: The influence of Latino culture and community violence on posttraumatic cognitions.

    PubMed

    Marques, Luana; Eustis, Elizabeth H; Dixon, Louise; Valentine, Sarah E; Borba, Christina P C; Simon, Naomi; Kaysen, Debra; Wiltsey-Stirman, Shannon

    2016-01-01

Despite the applicability of cognitive processing therapy (CPT) for posttraumatic stress disorder (PTSD) to addressing sequelae of a range of traumatic events, few studies have evaluated whether the treatment itself is applicable across diverse populations. The present study examined differences and similarities among non-Latino, Latino Spanish-speaking, and Latino English-speaking clients in rigid beliefs, or "stuck points," associated with PTSD symptoms in a sample of community mental health clients. We utilized the procedures of content analysis to analyze stuck point logs and impact statements of 29 participants enrolled in a larger implementation trial for CPT. Findings indicated that the content of stuck points was similar across Latino and non-Latino clients, although fewer total stuck points were identified for Latino clients compared to non-Latino clients. Given that identification of stuck points is central to implementing CPT, difficulty identifying stuck points could pose significant challenges for implementing CPT among Latino clients and warrants further examination. Thematic analysis of impact statements revealed the importance of family, religion, and the urban context (e.g., poverty, violence exposure) in understanding how clients organize beliefs and emotions associated with trauma. Clinical recommendations for implementing CPT in community settings and the identification of stuck points are provided. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Verifying the secure setup of Unix client/servers and detection of network intrusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feingold, R.; Bruestle, H.R.; Bartoletti, T.

    1995-07-01

This paper describes our technical approach to developing and delivering Unix host- and network-based security products to meet the increasing challenges in information security. Today's global "Infosphere" presents us with a networked environment that knows no geographical, national, or temporal boundaries, and no ownership, laws, or identity cards. This seamless aggregation of computers, networks, databases, applications, and the like stores, transmits, and processes information. This information is now recognized as an asset to governments, corporations, and individuals alike, and it must be protected from misuse. The Security Profile Inspector (SPI) performs static analyses of Unix-based clients and servers to check on their security configuration. SPI's broad range of security tests and flexible usage options support the needs of novice and expert system administrators alike. SPI's use within the Department of Energy and Department of Defense has resulted in more secure systems, less vulnerable to hostile intentions. Host-based information protection techniques and tools must also be supported by network-based capabilities. Our experience shows that a weak link in a network of clients and servers presents itself sooner or later, and can be more readily identified by dynamic intrusion detection techniques and tools. The Network Intrusion Detector (NID) is one such tool. NID is designed to monitor and analyze activity on an Ethernet broadcast Local Area Network segment and produce transcripts of suspicious user connections. NID's retrospective and real-time modes have proven invaluable to security officers faced with ongoing attacks on their systems and networks.

  13. Forecasting the value of credit scoring

    NASA Astrophysics Data System (ADS)

    Saad, Shakila; Ahmad, Noryati; Jaffar, Maheran Mohd

    2017-08-01

Nowadays, credit scoring systems play an important role in the banking sector. Credit scoring is important in assessing the creditworthiness of customers requesting credit from banks or other financial institutions, and is usually applied when customers submit applications for credit facilities. Based on the credit score, the bank is able to segregate "good" clients from "bad" clients. However, in most cases the score is useful only at that specific time and cannot be used to forecast the creditworthiness of the same applicant later. Hence, the bank cannot know whether "good" clients will remain good over time or whether "bad" clients may become "good" clients after a certain period. To fill this gap, this study proposes an equation to forecast the credit score of potential borrowers at a given time by using historical scores. The Mean Absolute Percentage Error (MAPE) is used to measure the accuracy of the forecast score. Results show the forecast score is highly accurate compared with the actual credit score.
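
    MAPE, the accuracy measure used above, averages the absolute percentage deviation of the forecasts from the actual values. A small sketch with invented credit scores (the numbers are illustrative, not data from the study):

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    assert len(actual) == len(forecast) and all(a != 0 for a in actual)
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical actual vs. forecast credit scores for four applicants.
actual = [620.0, 710.0, 580.0, 650.0]
forecast = [600.0, 720.0, 590.0, 660.0]
error = mape(actual, forecast)  # a low MAPE indicates an accurate forecast
```

    A MAPE near zero means the forecast scores track the actual scores closely; the guard against zero actual values matters because the percentage error is undefined there.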

  14. Phased development of a web-based PACS viewer

    NASA Astrophysics Data System (ADS)

    Gidron, Yoad; Shani, Uri; Shifrin, Mark

    2000-05-01

The Web browser is an excellent environment for the rapid development of an effective and inexpensive PACS viewer. In this paper we share our experience in developing a browser-based viewer, from the inception and prototype stages to its current state of maturity. There are many operational advantages to a browser-based viewer, even when native viewers already exist in the system (with multiple and/or high-resolution screens): (1) It can be used on existing personal workstations throughout the hospital. (2) It is easy to make the service available from physicians' homes. (3) The viewer is extremely portable and platform independent. There is a wide variety of means available for implementing the browser-based viewer. Each file sent to the client by the server can perform some end-user or client/server interaction. These means range from HTML (HyperText Markup Language) files, through JavaScript, to Java applets. Some data types may also invoke plug-in code in the client; although this would reduce the portability of the viewer, it would provide the needed efficiency in critical places. On the server side the range of means is also very rich: (1) a set of files: HTML, JavaScript, Java applets, etc.; (2) extensions of the server via cgi-bin programs; (3) extensions of the server via servlets; (4) any other helper application residing and working with the server to access the DICOM archive. The viewer architecture consists of two basic parts: the first part performs query and navigation through the DICOM archive image folders; the second part handles image access and display. While the first part deals with low data traffic, it involves many database transactions. The second part is simple as far as access transactions are concerned, but requires much more data traffic and display functions. Our web-based viewer has gone through three development stages characterized by the complexity of the means and tools employed on both client and server sides.

  15. Application of a Judgment Model toward Measurement of Clinical Judgment in Senior Nursing Students

    ERIC Educational Resources Information Center

    Pongmarutai, Tiwaporn

    2010-01-01

    Clinical judgment, defined as "the application of the nurse's knowledge and experience in making decisions about client care" (The National Council of State Boards of Nursing, 2005, p. 2), has been recognized as a vital and essential skill for healthcare providers when caring for clients. Undisputedly, nurses represent the largest…

  16. LISA, the next generation: from a web-based application to a fat client.

    PubMed

    Pierlet, Noëlla; Aerts, Werner; Vanautgaerden, Mark; Van den Bosch, Bart; De Deurwaerder, André; Schils, Erik; Noppe, Thomas

    2008-01-01

The LISA application, developed by the University Hospitals Leuven, permits referring physicians to consult the electronic medical records of their patients over the internet in a highly secure way. We decided to completely change the way we secured the application: we discarded the existing web application and built a completely new one, based on the in-house-developed hospital information system used in the University Hospitals Leuven. The result is a fat Java client, running on a Windows Terminal Server, secured by a commercial SSL-VPN solution.

  17. Gene and protein nomenclature in public databases

    PubMed Central

    Fundel, Katrin; Zimmer, Ralf

    2006-01-01

Background Frequently, several alternative names are in use for biological objects such as genes and proteins. Applications like manual literature search, automated text-mining, named entity identification, gene/protein annotation, and linking of knowledge from different information sources require knowledge of all used names referring to a given gene or protein. Various organism-specific or general public databases aim at organizing knowledge about genes and proteins. These databases can be used for deriving gene and protein name dictionaries. So far, little is known about the differences between databases in terms of size, ambiguities, and overlap. Results We compiled five gene and protein name dictionaries for each of the five model organisms (yeast, fly, mouse, rat, and human) from different organism-specific and general public databases. We analyzed the degree of ambiguity of gene and protein names within and between dictionaries and with respect to a lexicon of common English words and domain-related non-gene terms, and we compared different data sources in terms of the size of the extracted dictionaries and the overlap of synonyms between them. The study shows that the number of genes/proteins and synonyms covered in individual databases varies significantly for a given organism, and that the degree of ambiguity of synonyms varies significantly between different organisms. Furthermore, it shows that, despite considerable efforts of co-curation, the overlap of synonyms in different data sources is rather moderate and that the degree of ambiguity of gene names with common English words and domain-related non-gene terms varies depending on the considered organism. Conclusion In conclusion, these results indicate that the combination of data contained in different databases allows the generation of gene and protein name dictionaries that contain significantly more used names than dictionaries obtained from individual data sources.
Furthermore, curation of combined dictionaries considerably increases size and decreases ambiguity. The entries of the curated synonym dictionary are available for manual querying, editing, and PubMed- or Google-search via the ProThesaurus-wiki. For automated querying via custom software, we offer a web service and an exemplary client application. PMID:16899134
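
    Combining per-source dictionaries and measuring name ambiguity, as the study does, can be sketched in a few lines. The gene IDs and the ID-to-synonym mapping below are toy data invented for illustration, not the study's dictionaries.

```python
# Two hypothetical synonym dictionaries mapping gene IDs to name sets.
dict_a = {"g1": {"TP53", "p53"}, "g2": {"MYC", "c-Myc"}}
dict_b = {"g1": {"TP53", "tumor protein 53"}, "g3": {"BRCA1"}}

def merge(dicts):
    """Combine per-source dictionaries into one ID -> synonym-set mapping."""
    merged = {}
    for d in dicts:
        for gid, names in d.items():
            merged.setdefault(gid, set()).update(names)
    return merged

def ambiguous_names(d):
    """Names that refer to more than one ID (intra-dictionary ambiguity)."""
    owners = {}
    for gid, names in d.items():
        for n in names:
            owners.setdefault(n, set()).add(gid)
    return {n for n, gids in owners.items() if len(gids) > 1}

combined = merge([dict_a, dict_b])
```

    Merging grows the synonym coverage per gene (here "g1" gains a name from each source) while the ambiguity check flags any name shared by multiple IDs, which is exactly the trade-off the abstract quantifies.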

  18. The Architecture of an Automatic eHealth Platform With Mobile Client for Cerebrovascular Disease Detection

    PubMed Central

    Wang, Xingce; Bie, Rongfang; Wu, Zhongke; Zhou, Mingquan; Cao, Rongfei; Xie, Lizhi; Zhang, Dong

    2013-01-01

Background In recent years, cerebrovascular disease has been the leading cause of death and adult disability in the world. This study describes an efficient approach to detecting cerebrovascular disease. Objective In order to improve cerebrovascular treatment, prevention, and care, an automatic cerebrovascular disease detection eHealth platform is designed and studied. Methods We designed an automatic eHealth platform for cerebrovascular disease detection with a four-level architecture: object control layer, data transmission layer, service supporting layer, and application service layer. The platform has eight main functions: cerebrovascular database management, preprocessing of cerebral image data, image viewing and adjustment, image cropping, compression, and measurement, cerebrovascular segmentation, 3-dimensional cerebrovascular reconstruction, cerebrovascular rendering, cerebrovascular virtual endoscopy, and automatic detection. Several key technologies were employed in the implementation of the platform. The anisotropic diffusion model was used to reduce noise. Statistical segmentation with a Gaussian-Markov random field (G-MRF) model and the Stochastic Estimation Maximization (SEM) parameter estimation method was used to realize the cerebrovascular segmentation. A ball B-spline curve was proposed to model the cerebral blood vessels. Compute Unified Device Architecture (CUDA)-based ray-casting volume rendering with curvature enhancement and boundary enhancement was used to realize the volume rendering model. We implemented the platform with a network client and a mobile phone client to fit different users. Results The implemented platform runs on a common personal computer. Experiments on 32 patients' brain computed tomography or brain magnetic resonance imaging data stored in the system verified the feasibility and validity of each model we proposed.
The platform is partly used in cranial nerve surgery at the First Hospital Affiliated to the General Hospital of the People's Liberation Army and in radiology at Beijing Navy General Hospital. It is also used in medical imaging specialty teaching at Tianjin Medical University. The application results have been validated by our neurosurgeon and radiologist. Conclusions The platform appears beneficial in the diagnosis of cerebrovascular disease. The long-term benefits and additional applications of this technology warrant further study. The research built a diagnosis and treatment platform for human tissue with complex geometry and topology, such as brain vessels, based on the Internet of Things. PMID:25098861
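
    The anisotropic diffusion model mentioned above for noise reduction is commonly the Perona-Malik scheme: each pixel relaxes toward its neighbours, but an edge-stopping function suppresses smoothing across strong gradients so vessel boundaries survive. The sketch below is a generic, pure-Python illustration; the parameters are invented and are not the platform's settings.

```python
import math

def anisotropic_diffusion(img, iterations=10, kappa=20.0, lam=0.2):
    """Perona-Malik diffusion on a 2-D list of floats (e.g. one image slice).

    kappa: gradient scale of the edge-stopping function
    lam:   step size (lam * 4 < 1 keeps the scheme stable)
    """
    h, w = len(img), len(img[0])
    g = lambda d: math.exp(-(d / kappa) ** 2)  # edge-stopping function
    for _ in range(iterations):
        out = [row[:] for row in img]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                n = img[y - 1][x] - img[y][x]   # north difference
                s = img[y + 1][x] - img[y][x]   # south difference
                e = img[y][x + 1] - img[y][x]   # east difference
                wd = img[y][x - 1] - img[y][x]  # west difference
                out[y][x] = img[y][x] + lam * (
                    g(abs(n)) * n + g(abs(s)) * s + g(abs(e)) * e + g(abs(wd)) * wd)
        img = out
    return img

# A noisy spike in a flat patch is pulled toward its neighbours.
patch = [[10.0] * 5 for _ in range(5)]
patch[2][2] = 14.0
smoothed = anisotropic_diffusion(patch, iterations=5)
```

    Because small differences pass through the edge-stopping function almost unattenuated while large ones are damped, isolated noise is smoothed far faster than genuine edges.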

  19. An object-oriented, technology-adaptive information model

    NASA Technical Reports Server (NTRS)

    Anyiwo, Joshua C.

    1995-01-01

The primary objective was to develop a computer information system for effectively presenting NASA's technologies to American industries, for appropriate commercialization. To this end a comprehensive information management model, applicable to a wide variety of situations, and immune to computer software/hardware technological gyrations, was developed. The model consists of four main elements: a DATA_STORE, a data PRODUCER/UPDATER_CLIENT, and a data PRESENTATION_CLIENT, anchored to a central object-oriented SERVER engine. This server engine facilitates exchanges among the other model elements and safeguards the integrity of the DATA_STORE element. It is designed to support new technologies, as they become available, such as Object Linking and Embedding (OLE), on-demand audio-video data streaming with compression (such as is required for video conferencing), World Wide Web (WWW) and other information services and browsing, fax-back data requests, presentation of information on CD-ROM, and regular in-house database management, regardless of the data model in place. The four components of this information model interact through a system of intelligent message agents which are customized to specific information exchange needs. This model is at the leading edge of modern information management models. It is independent of technological changes and can be implemented in a variety of ways to meet the specific needs of any communications situation. This summer a partial implementation of the model was achieved. The structure of the DATA_STORE has been fully specified and successfully tested using Microsoft's FoxPro 2.6 database management system. Data PRODUCER/UPDATER and PRESENTATION architectures have been developed and also successfully implemented in FoxPro; and work has started on a full implementation of the SERVER engine.
The model has also been successfully applied to a CD-ROM presentation of NASA's technologies in support of Langley Research Center's TAG efforts.

  20. A Virtual Reality Exposure Therapy Application for Iraq War Post Traumatic Stress Disorder

    DTIC Science & Technology

    2006-01-01

denial of social problems. Prior to the availability of VR therapy applications, the existing standard of care for PTSD was imaginal exposure...The application is built on ICT’s FlatWorld Simulation Control Architecture (FSCA) [13]. The FSCA enables a network-centric system of client displays...client-side interaction despite potential network delays. FSCA scripting is based on the Lua programming language [14] and provides facilities for real

  1. Privacy preserving, real-time and location secured biometrics for mCommerce authentication

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Al-Assam, Hisham; Jassim, Sabah; Lami, Ihsan A.

    2011-06-01

Secure wireless connectivity between mobile devices and financial/commercial establishments is mature, and so is the security of remote authentication for mCommerce. However, current techniques are open to hacking, false misrepresentation, replay, and other attacks, because the authentication process lacks real-time, precise-location information. This paper proposes a new technique that combines freshly generated real-time personal biometric data of the client with the present position of the mobile device used by the client to perform the mCommerce transaction, so as to form a real-time biometric representation that authenticates any remote transaction. A fresh GPS fix generates the "time and location" used to stamp the freshly captured biometric data, producing a single, real-time biometric representation on the mobile device. A trusted Certification Authority (CA) acts as an independent authenticator of the client's claimed real-time location and freshly provided biometric data, which eliminates the need for user enrolment with many mCommerce service and application providers. The CA can also, independently from the client and at that instant of time, collect the "time and location" of the client's mobile device from the cellular network operator and compare it with the received information, together with the client's stored biometric information. Finally, to preserve the client's location privacy and to eliminate the possibility of cross-application client tracking, this paper proposes shielding the real location of the mobile device prior to submission to the CA or authenticators.
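
    One way to realize the time-and-location stamping described above is to bind the GPS fix and the biometric sample together under a key shared with the CA, for example with an HMAC that the CA can recompute and compare. This is an illustrative sketch, not the paper's actual construction; the key material, fix fields, and biometric sample below are placeholders.

```python
import hashlib
import hmac
import json

def realtime_representation(biometric_bytes, gps_fix, device_key):
    """Bind a freshly captured biometric sample to a GPS time-and-location stamp.

    gps_fix:    dict with 'lat', 'lon', 'utc' as a GPS receiver might report them
    device_key: secret shared with the certification authority (CA)
    """
    stamp = json.dumps(gps_fix, sort_keys=True).encode()
    # HMAC over stamp + sample: the CA recomputes this tag to check that
    # neither the location/time claim nor the biometric data was altered.
    tag = hmac.new(device_key, stamp + biometric_bytes, hashlib.sha256).hexdigest()
    return {"fix": gps_fix, "tag": tag}

key = b"shared-device-secret"   # placeholder key material
sample = b"\x01\x02\x03"        # stand-in for a captured biometric sample
fix = {"lat": 41.39, "lon": 2.17, "utc": 1309478400}
token = realtime_representation(sample, fix, key)
```

    Any tampering with the reported fix or the sample changes the tag, so a replayed token with a stale location fails verification at the CA.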

  2. The relationship between clients' depression etiological beliefs and psychotherapy orientation preferences, expectations, and credibility beliefs.

    PubMed

    Tompkins, Kelley A; Swift, Joshua K; Rousmaniere, Tony G; Whipple, Jason L

    2017-06-01

The purpose of this study was to examine the relationship between clients' etiological beliefs for depression and treatment preferences, credibility beliefs, and outcome expectations for five different depression treatments: behavioral activation, cognitive therapy, interpersonal psychotherapy, pharmacotherapy, and psychodynamic psychotherapy. Adult psychotherapy clients (N = 98) were asked to complete an online survey that included the Reasons for Depression Questionnaire, a brief description of each of the five treatment options, and credibility, expectancy, and preference questions for each option. On average, the participating clients rated pharmacotherapy as significantly less credible, having a lower likelihood of success, and being less preferred than the four types of psychotherapy. In general, interpersonal psychotherapy was also rated more negatively than the other types of psychotherapy. However, these findings depended somewhat on whether the participating client was personally experiencing depression. Credibility beliefs, outcome expectations, and preferences for pharmacotherapy were positively associated with biological beliefs for depression; however, the other hypothesized relationships between etiological beliefs and treatment attitudes were not supported. Although the study is limited by the specific sample and treatment descriptions that were used, the results may still have implications for psychotherapy research, training, and practice. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Diversity within African American, female therapists: variability in clients' expectations and assumptions about the therapist.

    PubMed

    Kelly, Jennifer F; Greene, Beverly

    2010-06-01

    Despite the presence of some literature that has addressed the characteristics of the African American female therapist, most psychotherapy training proceeds with the assumption that therapists are members of dominant groups, and most of the psychological and psychotherapy literature has been written by therapists and psychologists who come from dominant cultural perspectives. Not as much has been written about psychological paradigms or the process of psychotherapy from the perspective of the therapist who is not a dominant group member. This article explores both the common and divergent experiences that we, the authors, share as African American female therapists and the different reactions we frequently elicit in clients. We also explore how individual differences in our physical appearances, personal backgrounds, and different characteristics of our respective practices elicit distinct responses from clients that we believe are based on differences between us, despite the fact that we are both African American women. We believe that many of the stereotypes that affect perceptions of African American female clients also exist for African American female therapists. We will address how the intersection of gender, race, and sexual orientation of the client highlights the complexity of culturally competent practice. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  4. Exploring mortality among drug treatment clients: The relationship between treatment type and mortality.

    PubMed

    Lloyd, Belinda; Zahnow, Renee; Barratt, Monica J; Best, David; Lubman, Dan I; Ferris, Jason

    2017-11-01

    Studies consistently identify substance treatment populations as more likely to die prematurely than the age-matched general population, with mortality risk higher out of treatment than in treatment. While opioid-using pharmacotherapy cohorts have been studied extensively, less evidence exists regarding the effects of other treatment types and of clients in treatment for other drugs. This paper examines mortality during and following treatment across treatment modalities. A retrospective seven-year cohort was utilised to examine mortality during and in the two years following treatment among clients from Victoria, Australia, recorded on the Alcohol and Drug Information Service database, by linking with the National Death Index. A total of 18,686 clients treated over a 12-month period were included. Crude mortality rates (CMRs) and standardised mortality rates (SMRs) were analysed by treatment modality and time in or out of treatment. Higher risk of premature death was associated with residential withdrawal as the last type of treatment engagement, while mortality following counselling was significantly lower than all other treatment types in the year post-treatment. Both CMRs and SMRs were significantly higher in-treatment than post-treatment. A better understanding of the factors contributing to elevated mortality risk for clients during and following treatment is needed to ensure that treatment systems provide optimal outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Designing and Developing a NASA Research Projects Knowledge Base and Implementing Knowledge Management and Discovery Techniques

    NASA Astrophysics Data System (ADS)

    Dabiru, L.; O'Hara, C. G.; Shaw, D.; Katragadda, S.; Anderson, D.; Kim, S.; Shrestha, B.; Aanstoos, J.; Frisbie, T.; Policelli, F.; Keblawi, N.

    2006-12-01

    The Research Project Knowledge Base (RPKB) is currently being designed and will be implemented in a manner that is fully compatible and interoperable with the enterprise architecture tools developed to support NASA's Applied Sciences Program. Through user needs assessment and collaboration with Stennis Space Center, Goddard Space Flight Center, and NASA DEVELOP staff, insight into the information needs for the RPKB was gathered from across NASA's scientific communities of practice. To enable efficient, consistent, standard, structured, and managed data entry and compilation of research results, a prototype RPKB has been designed and fully integrated with the existing NASA Earth Science Systems Components database. The RPKB will compile research project and keyword information relevant to the six major science focus areas, the 12 national applications, and the Global Change Master Directory (GCMD). It will include information about projects awarded from NASA research solicitations, project investigator information, research publications, NASA data products employed, and model or decision support tools used or developed, as well as new data product information. The RPKB will be developed in a multi-tier architecture that includes a SQL Server relational database backend, middleware, and front-end client interfaces for data entry. The purpose of this project is to intelligently harvest the results of research sponsored by the NASA Applied Sciences Program and related research programs. We present various approaches for a wide spectrum of knowledge discovery of research results, publications, projects, etc. from the NASA Systems Components database and global information systems, and show how this is implemented in a SQL Server database. The application of knowledge discovery is useful for intelligent query answering and multiple-layered database construction. 
Using advanced EA tools such as the Earth Science Architecture Tool (ESAT), the RPKB will enable NASA and partner agencies to efficiently identify significant results for new experiment directions and will help principal investigators formulate experiment directions for new proposals.

  6. Optimizing the NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maa, Ming-Hokng

    1996-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web service for distributing NASA technical publications, was modified for performance enhancement, greater protocol support, and human-interface optimization. Results include: parallel database queries, decreasing user access times by an average factor of 2.3; access for clients behind firewalls and/or proxies that truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.
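
The parallel-query idea can be sketched as follows: instead of searching each backend database in sequence, issue the searches concurrently and merge the hits, so total latency approaches that of the slowest backend rather than the sum. The per-center SQLite files and schema below are stand-ins, not NTRS internals.

```python
import os
import sqlite3
import tempfile
from concurrent.futures import ThreadPoolExecutor

# Build a few small report databases (stand-ins for the per-center
# collections that a federated report server would query).
tmpdir = tempfile.mkdtemp()
paths = []
for i, title in enumerate(["Aeroacoustics survey", "Hypersonics review",
                           "CFD methods"]):
    path = os.path.join(tmpdir, f"center{i}.db")
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE reports (id INTEGER PRIMARY KEY, title TEXT)")
    con.execute("INSERT INTO reports (title) VALUES (?)", (title,))
    con.commit()
    con.close()
    paths.append(path)

def search(path, term):
    # Each worker opens its own connection: sqlite3 handles are thread-bound.
    con = sqlite3.connect(path)
    rows = con.execute(
        "SELECT title FROM reports WHERE title LIKE ?", (f"%{term}%",)
    ).fetchall()
    con.close()
    return rows

# Query all databases concurrently and merge the hits.
with ThreadPoolExecutor() as pool:
    hits = [r for rows in pool.map(lambda p: search(p, "s"), paths)
            for r in rows]
print(len(hits))   # 3 -- every sample title contains an "s"
```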

  7. The CMS dataset bookkeeping service

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Afaq, Anzar (Fermilab); Dolgert, Andrew

    2007-10-01

    The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and detector sources. It provides the ability to identify MC or trigger source, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels, and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via a Python API, a command line, and a Discovery web page interface. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS, with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.

  8. Managing Attribute-Value Clinical Trials Data Using the ACT/DB Client-Server Database System

    PubMed Central

    Nadkarni, Prakash M.; Brandt, Cynthia; Frawley, Sandra; Sayward, Frederick G.; Einbinder, Robin; Zelterman, Daniel; Schacter, Lee; Miller, Perry L.

    1998-01-01

    ACT/DB is a client-server database application for storing clinical trials and outcomes data, which is currently undergoing initial pilot use. It stores most of its data in entity-attribute-value form. Such data are segregated according to data type to allow indexing by value when possible, and binary large object data are managed in the same way as other data. ACT/DB lets an investigator design a study rapidly by defining the parameters (or attributes) that are to be gathered, as well as their logical grouping for purposes of display and data entry. ACT/DB generates customizable data-entry forms. The data can be viewed through several standard reports as well as exported as text to external analysis programs. ACT/DB is designed to encourage reuse of parameters across multiple studies and has facilities for dictionary search and maintenance. It uses a Microsoft Access client running on Windows 95 machines, which communicates with an Oracle server running on a UNIX platform. ACT/DB is being used to manage the data for seven studies in its initial deployment. PMID:9524347
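
The type-segregated entity-attribute-value layout can be sketched in SQLite: one metadata table describes each attribute, and per-type fact tables hold the values so that each value column can be indexed natively. Table and attribute names here are illustrative, not ACT/DB's actual schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# EAV storage segregated by data type: numeric and string values live in
# separate fact tables, so the numeric column can carry a real index.
con.executescript("""
CREATE TABLE attribute (attr_id INTEGER PRIMARY KEY, name TEXT, dtype TEXT);
CREATE TABLE fact_num  (patient_id INT, attr_id INT, value REAL);
CREATE TABLE fact_str  (patient_id INT, attr_id INT, value TEXT);
CREATE INDEX idx_num ON fact_num (attr_id, value);
""")
con.execute("INSERT INTO attribute VALUES (1, 'systolic_bp', 'num')")
con.execute("INSERT INTO attribute VALUES (2, 'adverse_event', 'str')")
con.execute("INSERT INTO fact_num VALUES (101, 1, 142.0)")
con.execute("INSERT INTO fact_str VALUES (101, 2, 'headache')")

# A query by value hits the index on the numeric fact table.
rows = con.execute("""
  SELECT f.patient_id, a.name, f.value
  FROM fact_num f JOIN attribute a ON a.attr_id = f.attr_id
  WHERE a.name = 'systolic_bp' AND f.value > 140
""").fetchall()
print(rows)   # [(101, 'systolic_bp', 142.0)]
```

The appeal of EAV for trials data is that adding a new study parameter is a row insert, not a schema change.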

  9. Development and feasibility of smartphone application for cognitive-behavioural case management of individuals with early psychosis.

    PubMed

    Kim, Sung-Wan; Lee, Ga-Young; Yu, Hye-Young; Jung, Eun-I; Lee, Ju-Yeon; Kim, Seon-Young; Kim, Jae-Min; Shin, Il-Seon; Yoon, Jin-Sang

    2017-05-18

    This article describes the development of the smartphone application for cognitive-behavioural case management of young individuals with early psychosis and examines the acceptance and potential clinical benefits of this application through a pilot survey. Gwangju Bukgu-Community Mental Health Center developed and launched a smartphone application (Heal Your Mind [HYM]) for cognitive-behavioural case management and symptom monitoring. The HYM application for clients includes 6 main modules including thought record, symptom record, daily life record, official notices, communication and scales. The key module is the "thought record" for self-directed cognitive-behavioural treatment. When the client writes and sends the self-cognitive-behavioural therapy sheet to the case manager, the latter receives a notification and can provide feedback in real time. We conducted a survey to investigate the acceptance and feasibility of this approach among young clients with early psychosis. A total of 24 clients with early psychosis participated in this survey. More than 80% of participants reported that it was easy to learn to use this application, and no one described this application as very complicated or reported that they needed a long time to learn how to use it. About 80% of participants were satisfied with this application, and 70% reported that they received help as a result of using this application. This study suggests that this smartphone application is useful for young individuals with early psychosis and that it may contribute to the development of both young customer- and case manager-friendly systems for this clinical population. © 2017 John Wiley & Sons Australia, Ltd.

  10. Creating Web-Based Scientific Applications Using Java Servlets

    NASA Technical Reports Server (NTRS)

    Palmer, Grant; Arnold, James O. (Technical Monitor)

    2001-01-01

    There are many advantages to developing web-based scientific applications. Any number of people can access the application concurrently. The application can be accessed from a remote location. The application becomes essentially platform-independent because it can be run from any machine that has Internet access and can run a web browser. Maintenance and upgrades to the application are simplified since only one copy of the application exists, in a centralized location. This paper details the creation of web-based applications using Java servlets. Java is a powerful, versatile programming language that is well suited to developing web-based programs. A Java servlet provides the interface between the central server and the remote client machines. The servlet accepts input data from the client, runs the application on the server, and sends the output back to the client machine. The type of servlet that supports the HTTP protocol will be discussed in depth. Among the topics the paper will discuss are how to write an HTTP servlet, how the servlet can run applications written in Java and other languages, and how to set up a Java web server. The entire process will be demonstrated by building a web-based application to compute stagnation-point heat transfer.
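
The servlet's accept-input / run-on-server / return-output cycle can be illustrated with Python's standard http.server as a language-neutral analog; the endpoint and computation below are invented for the demo (the paper itself uses Java servlets).

```python
import json
import math
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
from urllib.request import urlopen

class ComputeHandler(BaseHTTPRequestHandler):
    """Accepts input from the client, runs a calculation on the server,
    and returns the output, mirroring a servlet's doGet() role."""
    def do_GET(self):
        qs = parse_qs(urlparse(self.path).query)
        x = float(qs.get("x", ["0"])[0])
        body = json.dumps({"sqrt": math.sqrt(x)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):   # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), ComputeHandler)   # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "remote client" only needs a URL; no local copy of the application.
with urlopen(f"http://127.0.0.1:{server.server_port}/calc?x=9") as resp:
    result = json.loads(resp.read())
print(result)   # {'sqrt': 3.0}
server.shutdown()
```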

  11. GarlicESTdb: an online database and mining tool for garlic EST sequences.

    PubMed

    Kim, Dae-Won; Jung, Tae-Sung; Nam, Seong-Hyeuk; Kwon, Hyuk-Ryul; Kim, Aeri; Chae, Sung-Hwa; Choi, Sang-Haeng; Kim, Dong-Wook; Kim, Ryong Nam; Park, Hong-Seog

    2009-05-18

    Allium sativum, commonly known as garlic, is a species in the onion genus (Allium), a large and diverse genus containing over 1,250 species. Its close relatives include chives, onion, leek and shallot. Garlic has been used throughout recorded history for culinary and medicinal purposes and for its health benefits. Interest in garlic is currently growing rapidly because of its nutritional and pharmaceutical value, including potential benefits for high blood pressure, high cholesterol, atherosclerosis and cancer. Despite this, no comprehensive database of garlic Expressed Sequence Tags (ESTs) has been available for gene discovery and future genome annotation efforts, which is why we developed a new garlic database and applications to enable comprehensive analysis of garlic gene expression. GarlicESTdb is an integrated database and mining tool for large-scale garlic (Allium sativum) EST sequencing. A total of 21,595 ESTs collected from an in-house cDNA library were used to construct the database. The analysis pipeline is an automated system written in Java and consists of the following components: automatic preprocessing of EST reads, assembly of raw sequences, annotation of the assembled sequences, storage of the analyzed information in MySQL databases, and graphic display of all processed data. A web application was implemented with J2EE (Java 2 Platform, Enterprise Edition) technology (JSP/EJB/Java Servlet) for browsing and querying the database and for creating dynamic web pages on the client side; for mapping annotated enzymes to KEGG pathways, the AJAX framework was also used in part. The online resources, such as putative annotation, single nucleotide polymorphism (SNP) and tandem repeat data sets, can be searched by text, explored on the website, searched using BLAST, and downloaded. To archive more significant BLAST results, a curation system was introduced with which biologists can easily edit best-hit annotation information for others to view. 
The GarlicESTdb web application is freely available at http://garlicdb.kribb.re.kr. GarlicESTdb is the first incorporated online information database of EST sequences isolated from garlic that can be freely accessed and downloaded. It has many useful features for interactive mining of EST contigs and datasets from each library, including curation of annotated information, expression profiling, information retrieval, and summary of statistics of functional annotation. Consequently, the development of GarlicESTdb will provide a crucial contribution to biologists for data-mining and more efficient experimental studies.

  12. ClearedLeavesDB: an online database of cleared plant leaf images

    PubMed Central

    2014-01-01

    Background Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. Description The Cleared Leaf Image Database (ClearedLeavesDB) is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. Conclusions We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. 
The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org. PMID:24678985

  13. ClearedLeavesDB: an online database of cleared plant leaf images.

    PubMed

    Das, Abhiram; Bucksch, Alexander; Price, Charles A; Weitz, Joshua S

    2014-03-28

    Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. The Cleared Leaf Image Database (ClearedLeavesDB) is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. 
The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org.
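
A minimal client for such an upload API might construct a POST carrying image metadata. The route, field names, and JSON payload below are purely hypothetical, since the abstract does not document ClearedLeavesDB's actual API.

```python
import json
from urllib.request import Request

# Hypothetical endpoint and metadata fields; only the site's base URL is real.
def build_upload_request(image_name, species, area_cm2,
                         base="http://clearedleavesdb.org/api"):
    """Assemble a POST request uploading trait data for one cleared-leaf image."""
    meta = {"image": image_name, "species": species, "leaf_area_cm2": area_cm2}
    return Request(
        f"{base}/images",
        data=json.dumps(meta).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_upload_request("quercus_alba_001.png", "Quercus alba", 58.3)
print(req.get_method(), req.full_url)
# POST http://clearedleavesdb.org/api/images
```

The request is only constructed, not sent; a real client would pass it to urlopen with the credentials the API requires.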

  14. Calculating Clinically Significant Change: Applications of the Clinical Global Impressions (CGI) Scale to Evaluate Client Outcomes in Private Practice

    ERIC Educational Resources Information Center

    Kelly, Peter James

    2010-01-01

    The Clinical Global Impressions (CGI) scale is a therapist-rated measure of client outcome that has been widely used within the research literature. The current study aimed to develop reliable and clinically significant change indices for the CGI, and to demonstrate its application in private psychological practice. Following the guidelines…

  15. 77 FR 3017 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-20

    ... be charged on a per-Login ID basis. Firms may access C2 via either a CMI Client Application [[Page..., using different Login IDs, accessing the same CMI Client Application Server or FIX Port, allowing the firm to only pay the monthly fee once. Alternatively, a firm may use the same Login ID to access...

  16. Fulfillment of HTTP Authentication Based on Alcatel OmniSwitch 9700

    NASA Astrophysics Data System (ADS)

    Liu, Hefu

    This paper describes an approach to HTTP authentication on the Alcatel OmniSwitch 9700. Authenticated VLANs control user access to network resources based on VLAN assignment and user authentication. The user can be authenticated through the switch via any standard web browser; the browser client displays the username and password prompts. HTML forms can then be used to pass the HTTP authentication data when the form is submitted. A RADIUS server provides a database of user information that the switch checks whenever a user tries to authenticate through it. Before or after authentication, the client can obtain an address from a DHCP server.
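
The user database a RADIUS server consults can be sketched as a table of salted password hashes checked on each authentication attempt. The SQLite schema and PBKDF2 scheme here are illustrative assumptions for the credential store only, not Alcatel's implementation or the RADIUS wire protocol.

```python
import hashlib
import hmac
import os
import sqlite3

# Stand-in for the authentication server's user database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT PRIMARY KEY, salt BLOB, pwhash BLOB)")

def add_user(name, password):
    """Store a salted, stretched hash rather than the plaintext password."""
    salt = os.urandom(16)
    pwhash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    con.execute("INSERT INTO users VALUES (?, ?, ?)", (name, salt, pwhash))

def authenticate(name, password):
    """The check the switch triggers for each login attempt."""
    row = con.execute("SELECT salt, pwhash FROM users WHERE name = ?",
                      (name,)).fetchone()
    if row is None:
        return False
    salt, pwhash = row
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, pwhash)

add_user("alice", "vlan10-pass")
print(authenticate("alice", "vlan10-pass"), authenticate("alice", "wrong"))
# True False
```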

  17. Design of multi-language trading system of ethnic characteristic agricultural products based on android

    NASA Astrophysics Data System (ADS)

    Huanqin, Wu; Yasheng, Jin; Yugang, Dai

    2017-06-01

    With Internet technology developing rapidly, mobile e-commerce has brought great convenience to our lives. At present, the graphical user interface (GUI) of most e-commerce platforms supports only Chinese, so the development of an Android e-commerce client that supports ethnic languages has great prospects. This paper adopts an approach that combines front-end design with database technology to construct an Android client system for e-commerce platforms supporting ethnic languages, which realizes the display, browsing, querying, searching, trading and other functions for ethnic characteristic agricultural products on Android platforms.

  18. Quality Primary Care and Family Planning Services for LGBT Clients: A Comprehensive Review of Clinical Guidelines.

    PubMed

    Klein, David A; Malcolm, Nikita M; Berry-Bibee, Erin N; Paradise, Scott L; Coulter, Jessica S; Keglovitz Baker, Kristin; Schvey, Natasha A; Rollison, Julia M; Frederiksen, Brittni N

    2018-04-01

    LGBT clients have unique healthcare needs but experience a wide range of quality in the care that they receive. This study provides a summary of clinical guideline recommendations related to the provision of primary care and family planning services for LGBT clients. In addition, we identify gaps in current guidelines, and inform future recommendations and guidance for clinical practice and research. PubMed, Cochrane, and Agency for Healthcare Research and Quality electronic bibliographic databases, and relevant professional organizations' websites, were searched to identify clinical guidelines related to the provision of primary care and family planning services for LGBT clients. Information obtained from a technical expert panel was used to inform the review. Clinical guidelines meeting the inclusion criteria were assessed to determine their alignment with Institute of Medicine (IOM) standards for the development of clinical practice guidelines and content relevant to the identified themes. The search parameters identified 2,006 clinical practice guidelines. Seventeen clinical guidelines met the inclusion criteria. Two of the guidelines met all eight IOM criteria. However, many recommendations were consistent regarding provision of services to LGBT clients within the following themes: clinic environment, provider cultural sensitivity and awareness, communication, confidentiality, coordination of care, general clinical principles, mental health considerations, and reproductive health. Guidelines for the primary and family planning care of LGBT clients are evolving. The themes identified in this review may guide professional organizations during guideline development, clinicians when providing care, and researchers conducting LGBT-related studies.

  19. Earthdata Search Client: Usability Review Process, Results, and Implemented Changes, Using Earthdata Search Client as a Case Study

    NASA Technical Reports Server (NTRS)

    Siarto, Jeff; Reese, Mark; Shum, Dana; Baynes, Katie

    2016-01-01

    User experience and visual design are greatly improved when usability testing is performed on a periodic basis. Design decisions should be tested by real users so that application owners can understand the effectiveness of each decision and identify areas for improvement. It is important that applications be tested not just once, but as part of a continuing process that builds upon previous tests. NASA's Earthdata Search Client has undergone a usability study to ensure its users' needs are being met and that users understand how to use the tool efficiently and effectively. This poster highlights the process followed for the usability study, the results of the study, and what has been implemented in light of those results to improve the application's interface.

  20. Visualization of Vgi Data Through the New NASA Web World Wind Virtual Globe

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Kilsedar, C. E.; Zamboni, G.

    2016-06-01

    GeoWeb 2.0, laying the foundations of Volunteered Geographic Information (VGI) systems, has led to platforms where users can contribute to a geographic knowledge base that is open to access. Moreover, as a result of advancements in 3D visualization, virtual globes able to visualize geographic data even in browsers have emerged. However, the integration of VGI systems and virtual globes has not been fully realized. The study presented aims to visualize volunteered data in 3D, considering also ease of use for the general public, using Free and Open Source Software (FOSS). The new Application Programming Interface (API) of NASA, Web World Wind, written in JavaScript and based on the Web Graphics Library (WebGL), is cross-platform and cross-browser, so a virtual globe created with it is accessible through any WebGL-supported browser on different operating systems and devices. As a result, no installation or configuration is required on the client side, making the collected data more usable; this is not the case with World Wind for Java, where installation and configuration of the Java Virtual Machine (JVM) are required. Furthermore, the data collected through various VGI platforms might be in different formats, stored in a traditional relational database or in a NoSQL database. The project developed aims to visualize and query data collected through the Open Data Kit (ODK) platform and a cross-platform application, with the data stored in a relational PostgreSQL database and a NoSQL CouchDB database, respectively.

  1. Global Ocean Currents Database

    NASA Astrophysics Data System (ADS)

    Boyer, T.; Sun, L.

    2016-02-01

    NOAA's National Centers for Environmental Information has released an ocean currents database portal that aims 1) to integrate global ocean currents observations from a variety of instruments, with different resolutions, accuracies and responses to spatial and temporal variability, into a uniform network common data form (NetCDF) format and 2) to provide dedicated online data discovery and access to NCEI-hosted and distributed sources of ocean currents data. The portal provides a tailored web application that allows users to search for ocean currents data by platform type and by spatial/temporal ranges of interest; it is available at http://www.nodc.noaa.gov/gocd/index.html. The NetCDF format supports widely used data access protocols and catalog services such as OPeNDAP (Open-source Project for a Network Data Access Protocol) and THREDDS (Thematic Real-time Environmental Distributed Data Services), which let GOCD users work with the data files in their favorite analysis and visualization client software without downloading them to a local machine. The potential users of the ocean currents database include, but are not limited to, 1) ocean modelers assessing model skill, 2) scientists and researchers studying the impact of ocean circulation on climate variability, 3) the ocean shipping industry, for safe navigation and for finding optimal routes for ship fuel efficiency, 4) ocean resources managers planning optimal sites for waste and sewage dumping and for renewable hydro-kinetic energy, and 5) state and federal governments, which can use historical (analyzed) ocean circulations as an aid for search and rescue.

  2. Research on cloud-based remote measurement and analysis system

    NASA Astrophysics Data System (ADS)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push and mobile computing allows for the creation and delivery of new types of cloud service. Drawing on these ideas, this paper presents a cloud-based remote measurement and analysis system. The system consists of three main parts: a signal acquisition client, a web server deployed on the cloud service, and a remote client. The system is a website developed using ASP.NET and Flex RIA technology, which bridges the trade-off between the two monitoring modes, B/S (browser/server) and C/S (client/server). The platform, deployed on the cloud server, supplies condition monitoring and data analysis services to customers over the Internet. The signal acquisition device is responsible for data collection (sensor data, audio, video, etc.) and regularly pushes the monitoring data to the cloud storage database. Data acquisition equipment in this system needs only data collection and networking capability, as provided by, for example, a smartphone or smart sensor. The system's scale can be adjusted dynamically according to the number of applications and users, so it does not waste resources. As a representative case study, we developed a prototype system based on the Ali cloud service, using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  3. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of the millions of computers on the Internet, and to use them for running large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language, JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily allow visitors to volunteer their computing resources to help run advanced hydrological models and simulations. A web-based system lets users start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational pieces. A relational database system is used to manage data connections and the work queue for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
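The abstract mentions a relational database used for queue management of the distributed nodes. A minimal sketch of such a work queue, using SQLite and invented table and column names (the paper does not specify its schema):

```python
import sqlite3

# Illustrative work queue for volunteer nodes: each spatial tile of the
# simulation is a task, claimed atomically so two browsers never compute
# the same tile.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE tasks (
    id INTEGER PRIMARY KEY,
    tile TEXT,                    -- spatial subdomain identifier
    status TEXT DEFAULT 'pending',
    result REAL)""")
conn.executemany("INSERT INTO tasks (tile) VALUES (?)",
                 [("tile-1",), ("tile-2",), ("tile-3",)])

def claim_task(conn):
    """Atomically claim one pending task; return (id, tile) or None."""
    with conn:  # transaction: select + mark as running
        row = conn.execute(
            "SELECT id, tile FROM tasks WHERE status='pending' LIMIT 1"
        ).fetchone()
        if row is None:
            return None
        conn.execute("UPDATE tasks SET status='running' WHERE id=?", (row[0],))
        return row

def submit_result(conn, task_id, value):
    """Record a finished node's result and mark the task done."""
    with conn:
        conn.execute("UPDATE tasks SET status='done', result=? WHERE id=?",
                     (value, task_id))

task = claim_task(conn)
submit_result(conn, task[0], 0.42)
print(conn.execute("SELECT status FROM tasks WHERE id=?",
                   (task[0],)).fetchone()[0])  # prints: done
```

A production queue would also handle volunteers that disappear mid-task (e.g. a timeout that returns 'running' tasks to 'pending'), since browser nodes can leave at any time.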

  4. A life scientist's gateway to distributed data management and computing: the PathPort/ToolBus framework.

    PubMed

    Eckart, J Dana; Sobral, Bruno W S

    2003-01-01

    The emergent needs of the bioinformatics community challenge current information systems. The pace of biological data generation far outstrips Moore's Law. Therefore, a gap continues to widen between the capability to produce biological (molecular and cell) data sets and the capability to manage and analyze these data sets. As a result, Federal investments in large data set generation produce diminishing returns in terms of the community's ability to understand biology and to leverage that understanding to make scientific and technological advances that improve society. We are building an open framework to address various data management issues, including data and tool interoperability, nomenclature and data communication standardization, and database integration. PathPort, short for Pathogen Portal, employs a generic, web-services-based framework to deal with some of the problems identified by the bioinformatics community. The motivating research goal of a scalable system to provide data management and analysis for key pathosystems, especially relating to molecular data, has resulted in a generic framework with two major components. On the server side, we employ web services. On the client side, a Java application called ToolBus acts as a client-side "bus" for contacting data and tools and for viewing results through a single, consistent user interface.

  5. Web tools for effective retrieval, visualization, and evaluation of cardiology medical images and records

    NASA Astrophysics Data System (ADS)

    Masseroli, Marco; Pinciroli, Francesco

    2000-12-01

    To provide easy retrieval, integration, and evaluation of multimodal cardiology images and data in a web browser environment, distributed application technologies and Java programming were used to implement a client-server architecture based on software agents. The server side manages secure connections and queries to heterogeneous remote databases and file systems containing patient personal and clinical data. The client side is a Java applet running in a web browser that provides a friendly medical user interface to perform queries on patient and medical test data and to properly integrate and visualize the various query results. A set of tools based on the Java Advanced Imaging API enables users to process and analyze the retrieved cardiology images and quantify their features in different regions of interest. The platform independence of Java technology makes the developed prototype easy to manage centrally and to deploy at any site with an intranet or Internet connection. By giving healthcare providers effective tools for comprehensively querying, visualizing, and evaluating cardiology medical images and records in all the locations where they may need them (i.e., emergency rooms, operating theaters, wards, or even outpatient clinics), the developed prototype represents an important aid in providing more efficient diagnoses and medical treatments.

  6. GIS Methodic and New Database for Magmatic Rocks. Application for Atlantic Oceanic Magmatism.

    NASA Astrophysics Data System (ADS)

    Asavin, A. M.

    2001-12-01

    Several geochemical databases are now available on the Internet. One of the main peculiarities of the stored geochemical information is that each sample carries geographical coordinates. As a rule, the database software uses this spatial information only in user-interface search procedures. On the other hand, GIS software (Geographical Information System software), for example ARC/INFO, which is used to create and analyze special geological, geochemical, and geophysical e-maps, is deeply concerned with the geographical coordinates of samples. We join the capabilities of GIS systems and a relational geochemical database through special software. Our geochemical information system was created at the Vernadsky State Geological Museum and the Institute of Geochemistry and Analytical Chemistry in Moscow. We have tested the system with geochemical data on oceanic rocks from the Atlantic and Pacific oceans, about 10,000 chemical analyses. The GIS information content consists of e-map covers of the world globe. Parts of these maps are Atlantic Ocean covers: a gravity map (with a 2'' grid), oceanic bottom heat flow, altimetric maps, seismic activity, a tectonic map, and a geological map. Combining these information layers makes it possible to create new geochemical maps and to combine spatial analysis with numerical geochemical modeling of volcanic processes in an ocean segment. We have tested the information system with a thick-client technology. The interface between the GIS system ArcView and the database resides in a special sequence of multiple SQL queries. The result of these queries is a simple DBF file with geographical coordinates. This file is used at the moment of creating geochemical and other special e-maps of an oceanic region. For geophysical data we used a more complex method: from ArcView we created a grid cover for polygonal spatial geophysical information.
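The bridge described here, a SQL query sequence whose result is a flat coordinate file consumed by the GIS as a map layer, can be sketched as follows. The schema, column names, and element choice are invented for illustration (the actual system used ArcView and DBF files; CSV stands in here):

```python
import csv
import sqlite3

# Illustrative schema: samples with coordinates joined to their analyses.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE samples (id INTEGER PRIMARY KEY, lat REAL, lon REAL);
    CREATE TABLE analyses (sample_id INTEGER, element TEXT, ppm REAL);
    INSERT INTO samples VALUES (1, -10.5, -41.2), (2, 12.3, -45.0);
    INSERT INTO analyses VALUES (1, 'Ni', 120.0), (2, 'Ni', 95.5);
""")

# SQL query joining sample coordinates with one element's concentrations,
# producing the rows a GIS would render as a geochemical map layer.
rows = conn.execute("""
    SELECT s.lat, s.lon, a.ppm
    FROM samples s JOIN analyses a ON a.sample_id = s.id
    WHERE a.element = 'Ni'
    ORDER BY s.id
""").fetchall()

# Write a flat coordinate file (standing in for the DBF file the original
# system passed to ArcView for e-map creation).
with open("ni_layer.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["lat", "lon", "ni_ppm"])
    writer.writerows(rows)
print(rows)
```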

  7. Client - server programs analysis in the EPOCA environment

    NASA Astrophysics Data System (ADS)

    Donatelli, Susanna; Mazzocca, Nicola; Russo, Stefano

    1996-09-01

    Client-server processing is a popular paradigm for distributed computing. In the development of client-server programs, the designer first has to ensure that the implementation behaves correctly, in particular that it is deadlock free. Second, he has to guarantee that the program meets predefined performance requirements. This paper addresses the issues in the analysis of client-server programs in EPOCA. EPOCA is a computer-aided software engineering (CASE) support system that allows the automated construction and analysis of generalized stochastic Petri net (GSPN) models of concurrent applications. The paper describes, on the basis of a realistic case study, how client-server systems are modelled in EPOCA, and the kind of qualitative and quantitative analysis supported by its tools.

  8. Swedish social insurance officers' experiences of difficulties in assessing applications for disability pensions--an interview study.

    PubMed

    Ydreborg, Berit; Ekberg, Kerstin; Nilsson, Kerstin

    2007-06-27

    In this study the focus is on social insurance officers judging applications for disability pensions. The number of applications for disability pensions increased during the late 1990s, which has resulted in an increasing number of disability pensions in Sweden. A more restrictive attitude towards the clients has, however, evolved, as societal costs have increased and governmental guidelines now focus on reducing costs. As a consequence, the quantitative and qualitative demands on social insurance officers when handling applications for disability pensions may have increased. The aim of this study was therefore to describe the social insurance officers' experiences of assessing applications for disability pensions after the government's introduction of stricter regulations. Qualitative methodology was employed, and a total of ten social insurance officers representing different experiences and ages were chosen. Open-ended interviews were performed with the ten social insurance officers. Data was analysed with inductive content analysis. Three themes could be identified as problematic in the social insurance officers' descriptions of dealing with the applications in order to reach a decision on whether the issue qualified applicants for a disability pension or not: 1. Clients are heterogeneous. 2. Ineffective and time-consuming waiting for medical certificates impedes the decision process. 3. Perspectives on the issue of work capacity differed among different stakeholders. The backgrounds of the clients differ considerably, leading to variation in the quality and content of applications. Social insurance officers had to make rapid decisions within a limited time frame, based on limited information, mainly on the basis of medical certificates that were often insufficient to judge work capacity. The role as coordinating actor with other stakeholders in the welfare system was perceived as frustrating, since different stakeholders have different goals and demands. The social insurance officers experience a lack of control over the decision process, as regulations and other stakeholders restrict their work. A picture emerges of difficulties due to disharmonized systems and stakeholder-bound goals, causing some clients to fall between two stools or leading to unnecessary waiting times, which may limit the clients' ability to take an active part in a constructive process. Increased communication with physicians about how to elaborate the medical certificates might improve the quality of the certificates and thereby reduce the clients' waiting time.

  9. Swedish social insurance officers' experiences of difficulties in assessing applications for disability pensions – an interview study

    PubMed Central

    Ydreborg, Berit; Ekberg, Kerstin; Nilsson, Kerstin

    2007-01-01

    Background In this study the focus is on social insurance officers judging applications for disability pensions. The number of applications for disability pensions increased during the late 1990s, which has resulted in an increasing number of disability pensions in Sweden. A more restrictive attitude towards the clients has, however, evolved, as societal costs have increased and governmental guidelines now focus on reducing costs. As a consequence, the quantitative and qualitative demands on social insurance officers when handling applications for disability pensions may have increased. The aim of this study was therefore to describe the social insurance officers' experiences of assessing applications for disability pensions after the government's introduction of stricter regulations. Methods Qualitative methodology was employed, and a total of ten social insurance officers representing different experiences and ages were chosen. Open-ended interviews were performed with the ten social insurance officers. Data was analysed with inductive content analysis. Results Three themes could be identified as problematic in the social insurance officers' descriptions of dealing with the applications in order to reach a decision on whether the issue qualified applicants for a disability pension or not: 1. Clients are heterogeneous. 2. Ineffective and time-consuming waiting for medical certificates impedes the decision process. 3. Perspectives on the issue of work capacity differed among different stakeholders. The backgrounds of the clients differ considerably, leading to variation in the quality and content of applications. Social insurance officers had to make rapid decisions within a limited time frame, based on limited information, mainly on the basis of medical certificates that were often insufficient to judge work capacity. The role as coordinating actor with other stakeholders in the welfare system was perceived as frustrating, since different stakeholders have different goals and demands. The social insurance officers experience a lack of control over the decision process, as regulations and other stakeholders restrict their work. Conclusion A picture emerges of difficulties due to disharmonized systems and stakeholder-bound goals, causing some clients to fall between two stools or leading to unnecessary waiting times, which may limit the clients' ability to take an active part in a constructive process. Increased communication with physicians about how to elaborate the medical certificates might improve the quality of the certificates and thereby reduce the clients' waiting time. PMID:17597536

  10. Information resources assessment of a healthcare integrated delivery system.

    PubMed Central

    Gadd, C. S.; Friedman, C. P.; Douglas, G.; Miller, D. J.

    1999-01-01

    While clinical healthcare systems may have lagged behind computer applications in other fields in the shift from mainframes to client-server architectures, the rapid deployment of newer applications is closing that gap. Organizations considering the transition to client-server must identify and position themselves to provide the resources necessary to implement and support the infrastructure requirements of client-server architectures and to manage the accelerated complexity at the desktop, including hardware and software deployment, training, and maintenance needs. This paper describes an information resources assessment of the recently aligned Pennsylvania regional Veterans Administration Stars and Stripes Health Network (VISN4), in anticipation of the shift from a predominantly mainframe to a client-server information systems architecture in its well-established VistA clinical information system. The multimethod assessment study is described here to demonstrate this approach and its value to regional healthcare networks undergoing organizational integration and/or significant information technology transformations. PMID:10566414

  11. Role-based access control permissions

    DOEpatents

    Staggs, Kevin P.; Markham, Thomas R.; Hull Roskos, Julie J.; Chernoguzov, Alexander

    2017-04-25

    Devices, systems, and methods for role-based access control permissions are disclosed. One method includes: a policy decision point receiving up-to-date security context information from one or more outside sources to determine whether to grant a data client access to a portion of the system, and creating an access vector including the determination; receiving, via a policy agent, a request by the data client for access to the portion of the computing system, wherein the policy agent checks to ensure there is a session established with the communications and user/application enforcement points; receiving, via the communications policy enforcement point, the request from the policy agent, wherein the communications policy enforcement point determines whether the data client is an authorized node, based upon the access vector received from the policy decision point; and receiving, via the user/application policy enforcement point, the request from the communications policy enforcement point.
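The division of labor described, a policy decision point (PDP) that evaluates context and emits an access vector, and enforcement points (PEPs) that merely consult that vector, can be sketched in a much-simplified form. The class names, roles, and resources below are illustrative inventions, not terms from the patent:

```python
from dataclasses import dataclass

# Simplified role-based access control: the PDP decides and records its
# decision in an access vector; the PEP only checks the vector.

@dataclass(frozen=True)
class AccessVector:
    client_id: str
    resource: str
    allowed: bool

ROLE_PERMISSIONS = {  # role -> resources it may access (illustrative)
    "operator": {"sensor_readings"},
    "engineer": {"sensor_readings", "control_setpoints"},
}

def decide(client_id, role, resource):
    """PDP: evaluate security context (here, just the role) and emit a vector."""
    allowed = resource in ROLE_PERMISSIONS.get(role, set())
    return AccessVector(client_id, resource, allowed)

def enforce(vector, client_id, resource):
    """PEP: grant only if the PDP's vector covers this exact request."""
    return (vector.client_id == client_id
            and vector.resource == resource
            and vector.allowed)

vec = decide("client-7", "operator", "sensor_readings")
print(enforce(vec, "client-7", "sensor_readings"))   # prints: True
print(enforce(vec, "client-7", "control_setpoints")) # prints: False
```

Keeping the decision out of the enforcement path is the point of the pattern: the PEP stays simple and fast, while the PDP can incorporate up-to-date context from outside sources when it builds the vector.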

  12. The ISMARA client

    PubMed Central

    Ioannidis, Vassilios; van Nimwegen, Erik; Stockinger, Heinz

    2016-01-01

    ISMARA (ismara.unibas.ch) automatically infers the key regulators and regulatory interactions from high-throughput gene expression or chromatin state data. However, given the large sizes of current next-generation sequencing (NGS) datasets, data uploading times are a major bottleneck. Additionally, for proprietary data, users may be uncomfortable with uploading entire raw datasets to an external server. Both of these problems could be alleviated by providing a means by which users could pre-process their raw data locally, transferring only a small summary file to the ISMARA server. We developed a stand-alone client application that pre-processes large input files (RNA-seq or ChIP-seq data) on the user's computer for performing ISMARA analysis in a completely automated manner, including uploading of the small processed summary files to the ISMARA server. This reduces file sizes by up to a factor of 1000, and upload times from many hours to mere seconds. The client application is available from ismara.unibas.ch/ISMARA/client. PMID:28232860
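The client's core idea, summarizing a large raw dataset locally so that only a small file leaves the user's machine, can be illustrated with a toy sketch. The summary format below is invented; the real client produces ISMARA-specific summaries from RNA-seq or ChIP-seq files:

```python
import json
from collections import Counter

# Toy stand-in for local pre-processing: collapse a long list of raw read
# assignments into per-feature counts, so only the small summary is
# uploaded (and the raw, possibly proprietary, reads stay local).
raw_reads = ["geneA", "geneB", "geneA", "geneC", "geneA", "geneB"] * 1000

summary = Counter(raw_reads)              # per-feature counts
payload = json.dumps(summary)             # small summary sent to the server

raw_size = sum(len(r) for r in raw_reads)  # rough proxy for raw file size
print(len(payload), "bytes uploaded vs", raw_size, "bytes of raw records")
```

The size ratio grows with the input: the summary stays a few entries regardless of how many millions of reads were processed, which is the same effect the abstract quantifies as "up to a factor of 1000".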

  13. Blending Technology with Camp Tradition: Technology Can Simplify Camp Operations.

    ERIC Educational Resources Information Center

    Salzman, Jeff

    2000-01-01

    Discusses uses of technology appropriate for camps, which are service organizations based on building relationships. Describes relationship marketing and how it can be enhanced through use of Web sites, interactive brochures, and client databases. Outlines other technology uses at camp: automated dispensing of medications, satellite tracking of…

  14. Get It Together: Integrating Data with XML.

    ERIC Educational Resources Information Center

    Miller, Ron

    2003-01-01

    Discusses the use of XML for data integration to move data across different platforms, including across the Internet, from a variety of sources. Topics include flexibility; standards; organizing databases; unstructured data and the use of meta tags to encode it with XML information; cost effectiveness; and eliminating client software licenses.…

  15. Efficient transmission of compressed data for remote volume visualization.

    PubMed

    Krishnan, Karthik; Marcellin, Michael W; Bilgin, Ali; Nadar, Mariappan S

    2006-09-01

    One of the goals of telemedicine is to enable remote visualization and browsing of medical volumes. There is a need to employ scalable compression schemes and efficient client-server models to obtain interactivity and an enhanced viewing experience. First, we present a scheme that uses JPEG2000 and JPIP (JPEG2000 Interactive Protocol) to transmit data in a multi-resolution and progressive fashion. The server exploits the spatial locality offered by the wavelet transform and packet indexing information to transmit, in so far as possible, compressed volume data relevant to the client's query. Once the client identifies its volume of interest (VOI), the volume is refined progressively within the VOI from an initial lossy to a final lossless representation. Contextual background information can also be made available, with quality fading away from the VOI. Second, we present a prioritization scheme that enables the client to progressively visualize scene content from a compressed file. In our specific example, the client is able to make requests to progressively receive data corresponding to any tissue type. The server is now capable of reordering the same compressed data file on the fly to serve data packets prioritized as per the client's request. Lastly, we describe the effect of compression parameters on compression ratio, decoding times, and interactivity. We also present suggestions for optimizing JPEG2000 for remote volume visualization and volume browsing applications. The resulting system is ideally suited for client-server applications in which the server maintains the compressed volume data, to be browsed by a client with a low bandwidth constraint.
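The on-the-fly reordering described, serving the same stored compressed packets in a client-driven priority order, can be sketched abstractly. The packet metadata fields below are invented for illustration; a real JPIP server works with JPEG2000 tile/precinct/quality-layer indices rather than tissue labels:

```python
# Toy sketch of priority-driven packet reordering: the same stored packets
# are re-sorted per request, so a client interested in one tissue class
# receives its packets first, coarse quality layers before fine ones.

packets = [
    # (tissue_class, quality_layer, payload) -- illustrative metadata
    ("bone", 0, b"\x01"), ("soft", 0, b"\x02"),
    ("bone", 1, b"\x03"), ("soft", 1, b"\x04"),
    ("air",  0, b"\x05"),
]

def reorder(packets, preferred_class):
    """Serve the preferred tissue class first; within a class, send
    coarse quality layers before fine ones (progressive refinement)."""
    return sorted(packets,
                  key=lambda p: (p[0] != preferred_class, p[1]))

order = [(cls, layer) for cls, layer, _ in reorder(packets, "soft")]
print(order)
```

Because the sort touches only metadata, the server never re-compresses anything: one compressed file serves every prioritization a client might request, which is what makes the scheme cheap enough for interactive browsing.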

  16. Android Video Streaming

    DTIC Science & Technology

    2014-05-01

    natural choice. In this document, we describe several aspects of video streaming and the challenges of performing video streaming between Android-based...client application was needed. Typically something like VideoLAN Client (VLC) is used for this purpose in a desktop environment. However, while VLC is...a very mature application on Windows and Linux, VLC for Android is still in a beta testing phase, and versions have only been developed to work

  17. Client-Side Data Processing and Training for Multispectral Imagery Applications in the GOES-R Era

    NASA Technical Reports Server (NTRS)

    Fuell, Kevin; Gravelle, Chad; Burks, Jason; Berndt, Emily; Schultz, Lori; Molthan, Andrew; Leroy, Anita

    2016-01-01

    RGB imagery can be created locally (i.e., client-side) from single-band imagery already on the system, with little impact, given a recommended change to the texture cache in AWIPS II. Training and reference material accessible to forecasters within their operational display system improves RGB interpretation and application, as demonstrated at the OPG. Application examples from experienced forecasters are needed to support wider community use of RGB imagery, and these can be integrated into the user's display system.

  18. Significant events in psychotherapy: An update of research findings.

    PubMed

    Timulak, Ladislav

    2010-11-01

    Significant events research represents a specific approach to studying client-identified important moments in the therapy process. The current study provides an overview of the significant events research conducted, the methodology used, together with findings and implications. The PsycINFO database was searched with keywords such as significant events, important events, significant moments, important moments, and counselling or psychotherapy. The references of the selected studies were also searched. This process led to the identification of 41 primary studies that used client-identified significant event(s) as a main or secondary focus of the study. These were consequently reviewed with regard to their methodology and findings. The findings are presented according to type of study conducted. The impacts of helpful events reported by clients are focused on contributions to the therapeutic relationship and to in-session outcomes. Hindering events focus on some client disappointment with the therapist or therapy. The group therapy modality highlighted additional helpful impacts (like learning from others). Perspectives on what is significant in therapy differ between clients and therapists. The intensive qualitative studies reviewed confirm that the processes involved in significant events are complex and ambiguous. Studies show that helpful events may also contain many hindering elements and that specific events are deeply contextually embedded in the preceding events of therapy. Some studies suggest that helpful significant events are therapeutically productive, although this may need to be established further. Specific intensive studies show that the clients' perceptions in therapy may differ dramatically from those of the therapist. Furthermore, the relational and emotional aspects of significant moments may be more important for the clients than the cognitive aspects of therapy, which are frequently stressed by therapists. 2010 The British Psychological Society.

  19. Essential issues in the design of shared document/image libraries

    NASA Astrophysics Data System (ADS)

    Gladney, Henry M.; Mantey, Patrick E.

    1990-08-01

    We consider what is needed to create electronic document libraries which mimic physical collections of books, papers, and other media. The quantitative measures of merit for personal workstations (cost, speed, size of volatile and persistent storage) will improve by at least an order of magnitude in the next decade. Every professional worker will be able to afford a very powerful machine, but databases and libraries are not really economical and useful unless they are shared. We therefore see a two-tier world emerging, in which custodians of information make it available to network-attached workstations. A client-server model is the natural description of this world. In collaboration with several state governments, we have considered what would be needed to replace paper-based record management for a dozen different applications. We find that a professional worker can anticipate most data needs and that (s)he is interested in each clump of data for a period of days to months. We further find that only a small fraction of any collection will be used in any period. Given expected bandwidths, data sizes, search times and costs, and other such parameters, an effective strategy to support user interaction is to bring large clumps from their sources, to transform them into convenient representations, and only then start whatever investigation is intended. A system-managed hierarchy of caches and archives is indicated. Each library is a combination of a catalog and a collection, and each stored item has a primary instance which is the standard by which the correctness of any copy is judged. Catalog records mostly refer to 1 to 3 stored items. Weighted by the number of bytes to be stored, immutable data dominate collections. These characteristics affect how consistency, currency, and access control of replicas distributed in the network should be managed. We present the large features of a design for network document/image library services. 
A prototype is being built for State of California pilot applications. The design allows library servers in any environment with an ANSI SQL database; clients execute in any environment; communications are with either TCP/IP or SNA LU 6.2.

  20. Migration to Current Open Source Technologies by MagIC Enables a More Responsive Website, Quicker Development Times, and Increased Community Engagement

    NASA Astrophysics Data System (ADS)

    Jarboe, N.; Minnett, R.; Koppers, A.; Constable, C.; Tauxe, L.; Jonestrask, L.

    2017-12-01

    The Magnetics Information Consortium (MagIC) supports an online database for the paleomagnetic, geomagnetic, and rock magnetic communities (https://earthref.org/MagIC). Researchers can upload data into the archive and download selected data with a sophisticated search system. MagIC has completed the transition from an Oracle-backed, Perl-based, server-oriented website to an ElasticSearch-backed, Meteor-based thick-client technology stack. Using JavaScript on both the server and the client enables increased code reuse and allows many computational operations to be offloaded to the client for faster response. On-the-fly data validation, column header suggestion, and online spreadsheet editing are some of the new features available in the new system. The 3.0 data model, method codes, and vocabulary lists can be browsed via the MagIC website and are more easily updated. Source code for MagIC is publicly available on GitHub (https://github.com/earthref/MagIC). The MagIC file format is natively compatible with the PmagPy (https://github.com/PmagPy/PmagPy) paleomagnetic analysis software. MagIC files can now be downloaded from the database and viewed and interpreted in the PmagPy GUI-based tool, pmag_gui. Changes or interpretations of the data can then be saved by pmag_gui in the MagIC 3.0 data format and easily uploaded to the MagIC database. The rate of new contributions to the database has been increasing, with many labs contributing measurement-level data for the first time in the last year. Over a dozen file format conversion scripts are available for translating non-MagIC measurement data files into the MagIC format for easy uploading. We will continue to work with more labs until the whole community has a manageable workflow for contributing their measurement-level data. 
MagIC will continue to provide a global repository for archiving and retrieving paleomagnetic and rock magnetic data and, with the new system in place, will be able to respond more quickly to the community's requests for changes and improvements.

  1. Evolutionary conceptual analysis: faith community nursing.

    PubMed

    Ziebarth, Deborah

    2014-12-01

    The aim of the study was to report an evolutionary concept analysis of faith community nursing (FCN). FCN is a source of healthcare delivery in the USA which has grown in comprehensiveness and complexity. With increasing healthcare cost and a focus on access and prevention, FCN has extended beyond the physical walls of the faith community building. Faith communities and healthcare organizations invest in FCN and standardized training programs exist. Using Rodgers' evolutionary analysis, the literature was examined for antecedents, attributes, and consequences of the concept. This design allows for understanding the historical and social nature of the concept and how it changes over time. A search of databases using the keywords FCN, faith community nurse, parish nursing, and parish nurse was done. The concept of FCN was explored using research and theoretical literature. A theoretical definition and model were developed with relevant implications. The search results netted a sample of 124 reports of research and theoretical articles from multiple disciplines: medicine, education, religion and philosophy, international health, and nursing. Theoretical definition: FCN is a method of healthcare delivery that is centered in a relationship between the nurse and client (client as person, family, group, or community). The relationship occurs in an iterative motion over time when the client seeks or is targeted for wholistic health care with the goal of optimal wholistic health functioning. Faith integrating is a continuous occurring attribute. Health promoting, disease managing, coordinating, empowering and accessing health care are other essential attributes. All essential attributes occur with intentionality in a faith community, home, health institution and other community settings with fluidity as part of a community, national, or global health initiative. 
A new theoretical definition and corresponding conceptual model of FCN provides a basis for future nursing knowledge and model-based applications for evidence-based practice and research.

  2. The use of imagery in phase 1 treatment of clients with complex dissociative disorders

    PubMed Central

    van der Hart, Onno

    2012-01-01

    The “standard of care” for clients with complex dissociative disorders and other complex trauma-related disorders is phase-oriented treatment. Within this frame, therapeutic progress can be enhanced by the use of imagery-based therapeutic techniques. In this article, the emphasis is on their application in phase 1 treatment, stabilization, symptom reduction, and skills training, but attention is also paid to applications in phase 2 and phase 3 treatment. Many of the existing imagery techniques are geared toward clients becoming more able to function in a more adaptive way in daily life, which, however, requires the involvement of various dissociative parts of the personality. Such collaborative involvement is also essential in the later treatment phases. Therefore, understanding the dissociative nature of these disorders is helpful in the judicious application of these techniques. PMID:22893843

  3. Client-therapist agreement in the termination process and its association with therapeutic relationship.

    PubMed

    Olivera, Julieta; Challú, Laura; Gómez Penedo, Juan Martín; Roussos, Andrés

    2017-03-01

    There is no consensus among different therapeutic approaches on the process of termination when therapy does not have a prefixed duration. Moreover, both clinicians and researchers are still exploring decision making in the termination of treatment. The present study assessed former clients' perspectives on therapy termination in a nonprobabilistic sample from Buenos Aires, Argentina. Seventy-three semistructured interviews, lasting ∼60 min each, were conducted with participants who had finished a therapeutic treatment or dropped out. They were asked about several aspects of therapy, including their experience of termination: specifically, who decided to terminate, whether there was agreement on termination or not, and their thoughts on the termination process. All interviews were transcribed and analyzed using an adaptation of Consensual Qualitative Research (CQR). Quantitative analyses were also conducted to examine associations between variables. Two main factors emerged from the analysis: client/therapist initiative on termination, and level of agreement between client and therapist regarding termination. Whereas nearly all (95%) of therapist-initiated termination cases agreed on termination, client-initiated termination cases could be sorted into agreed (49%) and disagreed (51%) terminations. Both therapist-initiated terminations and agreed-upon terminations presented more categories of positive termination motives, better therapeutic bond, and higher overall satisfaction with treatment. Implications for research and clinical practice are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Shadow-Bitcoin: Scalable Simulation via Direct Execution of Multi-Threaded Applications

    DTIC Science & Technology

    2015-08-10

    Shadow-Bitcoin: Scalable Simulation via Direct Execution of Multi-threaded Applications. Andrew Miller, University of Maryland, amiller@cs.umd.edu, Rob...Shadow plug-in that directly executes the Bitcoin reference client software. To demonstrate the usefulness of this tool, we present novel denial-of-service attacks against the Bitcoin software that exploit low-level implementation artifacts in the Bitcoin reference client; our deterministic

  5. NSLS-II HIGH LEVEL APPLICATION INFRASTRUCTURE AND CLIENT API DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, G.; Yang, L.

    2011-03-28

    The beam commissioning software framework of the NSLS-II project adopts a client/server-based architecture to replace the more traditional monolithic high-level application approach. It is an open-structure platform, and we aim to provide a narrow API set for client applications. With this narrow API, existing applications developed in different languages under different architectures can be ported to our platform with minimal modification. This paper describes the system infrastructure design, the client API, system integration, and the latest progress. As a new third-generation synchrotron light source with ultra-low emittance, NSLS-II poses new requirements and challenges for controlling and manipulating the beam. A use-case study and a theoretical analysis have been performed to clarify the requirements and challenges for the high-level application (HLA) software environment. To satisfy them, an adequate system architecture for the software framework is critical for beam commissioning, study, and operation. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted the concept of a middle layer to separate low-level hardware processing from numerical algorithm computing, physics modelling, data manipulation, plotting, and error handling. However, none of the existing approaches satisfies the requirements. A new design has been proposed that introduces service-oriented architecture technology. The HLA is a combination of tools for accelerator physicists and operators, as in the traditional approach. In NSLS-II, these include monitoring applications and control routines. A scripting environment is very important for the latter, and both parts are designed on top of a common set of APIs. Physicists and operators are the users of these APIs, while control-system engineers and a few accelerator physicists are their developers.
With our client/server-based approach, we leave how to retrieve information to the developers of the APIs, and how to use them to form a physics application to the users. For example, how channels are related to a magnet, and what the current real-time setting of a magnet is in physics units, are internals of the APIs; a chromaticity measurement routine is a user of the APIs. All API users work with magnet and instrument names in physics units, and low-level communications in current or voltage units are minimized. In this paper, we discuss the recent progress of our infrastructure development and the client API.
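The narrow-API idea described above can be sketched as a thin facade: client code works only with magnet names and physics units, while channel names and unit conversions stay inside the service. The following Python sketch is purely illustrative; the class, channel names, and conversion factors are invented, not part of the NSLS-II software.

```python
# Hypothetical sketch of a "narrow API" facade: client code sees magnet
# names and physics units; channel names and raw units stay internal.

class MagnetService:
    """Service hiding channel mapping and unit conversion from clients."""

    # Internal mapping from magnet name to a low-level channel and a
    # current-to-focusing-strength factor (illustrative values only).
    _CHANNELS = {
        "QH1": ("SR:C01-MG:QH1:I", 0.052),
        "QH2": ("SR:C01-MG:QH2:I", 0.049),
    }

    def __init__(self, channel_reader):
        # channel_reader: callable mapping a channel name to a current (A)
        self._read = channel_reader

    def get_setting(self, magnet):
        """Return the magnet setting in physics units (K value)."""
        channel, k_per_amp = self._CHANNELS[magnet]
        return self._read(channel) * k_per_amp

# A fake control system stands in for low-level channel access:
fake_control_system = {"SR:C01-MG:QH1:I": 100.0, "SR:C01-MG:QH2:I": 80.0}
service = MagnetService(fake_control_system.__getitem__)
k1 = service.get_setting("QH1")
```

A physics application built on such a facade never handles channel names or raw currents, which is what allows the server side to change hardware details without breaking clients.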

  6. 37 CFR 11.5 - Register of attorneys and agents in patent matters; practice before the Office.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., preparing and prosecuting any patent application, consulting with or giving advice to a client in... in trademark matters includes, but is not limited to, consulting with or giving advice to a client in...

  7. 37 CFR 11.5 - Register of attorneys and agents in patent matters; practice before the Office.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., preparing and prosecuting any patent application, consulting with or giving advice to a client in... in trademark matters includes, but is not limited to, consulting with or giving advice to a client in...

  8. Social exchange as a framework for client-nurse interaction during public health nursing maternal-child home visits.

    PubMed

    Byrd, Mary E

    2006-01-01

    The purpose of this paper was to develop a nursing-focused use of social exchange theory within the context of maternal-child home visiting. The nature of social exchange theory, its application to client-nurse interaction, and its fit with an existing data set from a field research investigation were examined. Resources exchanged between the nurse and clients were categorized and compared across the patterns of home visiting, nursing strategies based on exchange notions were identified, and variations in exchange were linked with client outcomes. The nurse provided resources within the categories of information, status, service, and goods. Clients provided time, access to the home, space within the home to conduct the visit, opportunities to observe maternal-child interaction, access to the infant, and information. The ease and breadth of resource exchange varied across the patterns of home visiting. The social exchange perspective was useful in categorizing resources, specifying and uncovering new resource categories, understanding nursing strategies to initiate and maintain the client-nurse relationship, and linking client-nurse interactive phenomena with client outcomes. Social exchange theory is potentially useful for understanding client-nurse interaction in the context of maternal-child home visits.

  9. Design and evaluation of web-based image transmission and display with different protocols

    NASA Astrophysics Data System (ADS)

    Tan, Bin; Chen, Kuangyi; Zheng, Xichuan; Zhang, Jianguo

    2011-03-01

    There are many Web-based image access technologies used in the medical imaging area, such as component-based (ActiveX control) thick-client Web display, zero-footprint thin-client Web viewers (also called server-side processing Web viewers), Flash Rich Internet Application (RIA), and HTML5-based Web display. Different Web display methods perform differently in different network environments. In this presentation, we give an evaluation of two Web-based image display systems we developed. The first one is used for thin-client Web display. It works between a PACS Web server with a WADO interface and a thin client; the PACS Web server provides JPEG-format images to HTML pages. The second one is for thick-client Web display. It works between a PACS Web server with a WADO interface and a thick client running in browsers containing an ActiveX control, a Flash RIA program, or HTML5 scripts; the PACS Web server provides native DICOM-format images or a JPIP stream for these clients.
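For context, a WADO-URI request of the kind such a thin client issues is an HTTP GET whose query string names the study, series, and object, plus the desired content type. A minimal Python sketch follows; the server address and UIDs are placeholders, not values from the paper.

```python
# Build a WADO-URI GET URL such as a thin-client page might issue to a
# PACS Web server (base URL and UIDs below are hypothetical).
from urllib.parse import urlencode

def wado_uri(base_url, study_uid, series_uid, object_uid,
             content_type="image/jpeg"):
    """Return a WADO-URI URL for retrieving a single DICOM object.

    contentType "image/jpeg" suits thin-client display in an HTML page;
    "application/dicom" would request the native DICOM object instead.
    """
    params = urlencode({
        "requestType": "WADO",
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,
    })
    return f"{base_url}?{params}"

url = wado_uri("http://pacs.example.org/wado",
               "1.2.840.1.1", "1.2.840.1.2", "1.2.840.1.3")
```

The thin client simply embeds such URLs in `<img>` tags, while a thick client would request `application/dicom` (or a JPIP stream) and render locally.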

  10. 3D GIS for Flood Modelling in River Valleys

    NASA Astrophysics Data System (ADS)

    Tymkow, P.; Karpina, M.; Borkowski, A.

    2016-06-01

    The objective of this study is the implementation of a system architecture for collecting and analysing data, as well as visualizing results, for hydrodynamic modelling of flood flows in river valleys using remote sensing methods, three-dimensional geometry of spatial objects, and GPU multithreaded processing. The proposed solution includes: a spatial data acquisition segment, data processing and transformation, mathematical modelling of flow phenomena, and results visualization. The data acquisition segment was based on aerial laser scanning supplemented by images in the visible range. Vector data creation was based on automatic and semiautomatic algorithms for DTM and 3D spatial feature modelling. Algorithms for modelling building and vegetation geometry were proposed or adopted from the literature. The framework was implemented as modular software using open specifications and partially reusing open-source projects. The database structure for gathering and sharing vector data, including flood modelling results, was created using PostgreSQL. For the internal structure of the feature classes of spatial objects in the database, the CityGML standard was used. For the hydrodynamic modelling, a two-dimensional solution of the Navier-Stokes equations was implemented. Visualization of geospatial data and flow model results was transferred to the client-side application, which made it independent of the server hardware platform. A real-world case in Poland, part of the Widawa River valley near the city of Wroclaw, was selected to demonstrate the applicability of the proposed system.

  11. Application of the transtheoretical model of behaviour change for identifying older clients' readiness for hearing rehabilitation during history-taking in audiology appointments.

    PubMed

    Ekberg, Katie; Grenness, Caitlin; Hickson, Louise

    2016-07-01

    The transtheoretical model (TTM) of behaviour change focuses on clients' readiness for adopting new health behaviours. This study explores how clients' readiness for change can be identified through their interactions with audiologists during history-taking in initial appointments; and whether clients' readiness has consequences for the rehabilitation decisions they make within the initial appointment. Conversation analysis (CA) was used to examine video-recorded initial audiology appointments with older adults with hearing impairment. The data corpus involved 62 recorded appointments with 26 audiologists and their older adult clients (aged 55+ years). Companions were present in 17 appointments. Clients' readiness for change could be observed through their interaction with the audiologist. Analysis demonstrated that the way clients described their hearing in the history-taking phase had systematic consequences for how they responded to rehabilitation recommendations (in particular, hearing aids) in the management phase of the appointment. In particular, clients identified as being in a pre-contemplation stage-of-change were more likely to display resistance to a recommendation of hearing aids (80% declined). The transtheoretical model of behaviour change can be useful for helping audiologists individualize management planning to be congruent with individual clients' needs, attitudes, desires, and psychological readiness for action in order to optimize clients' hearing outcomes.

  12. Operational Experience with the Frontier System in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumenfeld, Barry; Dykstra, Dave; Kreuzer, Peter

    2012-06-20

    The Frontier framework is used in the CMS experiment at the LHC to deliver conditions data to processing clients worldwide, including calibration, alignment, and configuration information. Each central server at CERN, called a Frontier Launchpad, uses Tomcat as a servlet container to establish the communication between clients and the central Oracle database. HTTP-proxy Squid servers, located close to clients, cache the responses to queries in order to provide high-performance data access and to reduce the load on the central Oracle database. Each Frontier Launchpad also has its own reverse-proxy Squid for caching. The three central servers have been delivering about 5 million responses every day since the LHC startup, containing about 40 GB of data in total, to more than one hundred Squid servers located worldwide, with an average response time on the order of 10 milliseconds. The Squid caches deployed worldwide process many more requests per day, over 700 million, and deliver over 40 TB of data. Several monitoring tools for the Tomcat log files, the accesses of the Squids on the central Launchpad servers, and the availability of remote Squids have been developed to guarantee the performance of the service and make the system easily maintainable. Following a brief introduction of the Frontier framework, we describe the performance of this highly reliable and stable system, detail monitoring concerns and their deployment, and discuss the overall operational experience from the first two years of LHC data-taking.
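The load reduction described here comes from HTTP caching: many clients issue identical conditions queries, so a proxy near them can answer almost all requests without touching the central database. A toy Python model of that behavior (query names and payloads are invented, and real Squids add expiry and revalidation that this sketch omits):

```python
# Simplified model of the caching layer a Squid proxy provides: repeated
# identical queries are served from cache instead of reaching the origin.

class CachingProxy:
    def __init__(self, origin):
        self._origin = origin      # callable: query -> response payload
        self._cache = {}
        self.origin_hits = 0       # how often we had to ask the origin

    def get(self, query):
        if query not in self._cache:
            self.origin_hits += 1
            self._cache[query] = self._origin(query)
        return self._cache[query]

def central_db(query):
    """Stand-in for the central conditions database."""
    return f"payload for {query}"

proxy = CachingProxy(central_db)
for _ in range(1000):              # many clients, same conditions query
    proxy.get("alignment/run_12345")
```

After the loop, the origin has been consulted exactly once; the other 999 requests were absorbed by the cache, which is the effect that lets a handful of central servers feed worldwide processing.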

  13. Operational Experience with the Frontier System in CMS

    NASA Astrophysics Data System (ADS)

    Blumenfeld, Barry; Dykstra, Dave; Kreuzer, Peter; Du, Ran; Wang, Weizhen

    2012-12-01

    The Frontier framework is used in the CMS experiment at the LHC to deliver conditions data to processing clients worldwide, including calibration, alignment, and configuration information. Each central server at CERN, called a Frontier Launchpad, uses Tomcat as a servlet container to establish the communication between clients and the central Oracle database. HTTP-proxy Squid servers, located close to clients, cache the responses to queries in order to provide high-performance data access and to reduce the load on the central Oracle database. Each Frontier Launchpad also has its own reverse-proxy Squid for caching. The three central servers have been delivering about 5 million responses every day since the LHC startup, containing about 40 GB of data in total, to more than one hundred Squid servers located worldwide, with an average response time on the order of 10 milliseconds. The Squid caches deployed worldwide process many more requests per day, over 700 million, and deliver over 40 TB of data. Several monitoring tools for the Tomcat log files, the accesses of the Squids on the central Launchpad servers, and the availability of remote Squids have been developed to guarantee the performance of the service and make the system easily maintainable. Following a brief introduction of the Frontier framework, we describe the performance of this highly reliable and stable system, detail monitoring concerns and their deployment, and discuss the overall operational experience from the first two years of LHC data-taking.

  14. Development of an Information System for Diploma Works Management

    ERIC Educational Resources Information Center

    Georgieva-Trifonova, Tsvetanka

    2011-01-01

    In this paper, a client/server information system for the management of data and its extraction from a database containing information for diploma works of students is proposed. The developed system provides users the possibility of accessing information about different characteristics of the diploma works, according to their specific interests.…

  15. Airborne Remote Sensing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA imaging technology has provided the basis for a commercial agricultural reconnaissance service. AG-RECON furnishes information from airborne sensors, aerial photographs and satellite and ground databases to farmers, foresters, geologists, etc. This service produces color "maps" of Earth conditions, which enable clients to detect crop color changes or temperature changes that may indicate fire damage or pest stress problems.

  16. Architecture and prototypical implementation of a semantic querying system for big Earth observation image bases

    PubMed Central

    Tiede, Dirk; Baraldi, Andrea; Sudmanns, Martin; Belgiu, Mariana; Lang, Stefan

    2017-01-01

    Spatiotemporal analytics of multi-source Earth observation (EO) big data is a pre-condition for semantic content-based image retrieval (SCBIR). As a proof of concept, an innovative EO semantic querying (EO-SQ) subsystem was designed and prototypically implemented in series with an EO image understanding (EO-IU) subsystem. The EO-IU subsystem automatically generates ESA Level 2 products (scene classification map, up to basic land cover units) from optical satellite data. The EO-SQ subsystem comprises a graphical user interface (GUI) and an array database embedded in a client-server model. In the array database, all EO images are stored as a space-time data cube together with their Level 2 products generated by the EO-IU subsystem. The GUI allows users to (a) develop a conceptual world model based on a graphically supported query pipeline as a combination of spatial and temporal operators and/or standard algorithms and (b) create, save, and share within the client-server architecture complex semantic queries/decision rules, suitable for SCBIR and/or spatiotemporal EO image analytics, consistent with the conceptual world model. PMID:29098143
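The space-time data cube with per-pixel class labels can be pictured as a small worked example. The sketch below is illustrative only: the labels, the tiny 3-epoch 2x2 cube, and the `always` temporal operator are invented to show the shape of a semantic query, not the system's actual API.

```python
# Illustrative space-time data cube: per-pixel time series of Level-2-style
# class labels, reduced by a temporal operator (all data invented).
WATER, VEGETATION, BARE = "water", "vegetation", "bare"

# cube[t][y][x]: class label of pixel (x, y) at epoch t (3 epochs, 2x2)
cube = [
    [[WATER, BARE], [VEGETATION, WATER]],
    [[WATER, BARE], [BARE, WATER]],
    [[WATER, VEGETATION], [BARE, WATER]],
]

def always(cube, label):
    """Temporal operator: mask of pixels holding `label` in every epoch."""
    ny, nx = len(cube[0]), len(cube[0][0])
    return [[all(epoch[y][x] == label for epoch in cube)
             for x in range(nx)] for y in range(ny)]

# Semantic query: "permanent water" = water at every observed epoch
permanent_water = always(cube, WATER)
```

A real array database evaluates the same kind of reduction over full image stacks, but the query is still a composition of such spatial and temporal operators over class labels rather than raw pixels.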

  17. CARDIO-i2b2: integrating arrhythmogenic disease data in i2b2.

    PubMed

    Segagni, Daniele; Tibollo, Valentina; Dagliati, Arianna; Napolitano, Carlo; G Priori, Silvia; Bellazzi, Riccardo

    2012-01-01

    The CARDIO-i2b2 project is an initiative to customize the i2b2 bioinformatics tool with the aim of integrating clinical and research data in order to support translational research in cardiology. In this work we describe the implementation and customization of i2b2 to manage the data of patients with arrhythmogenic diseases collected at the Fondazione Salvatore Maugeri of Pavia in a joint project with the NYU Langone Medical Center (New York, USA). The i2b2 clinical research chart data warehouse is populated with data obtained from the research database called TRIAD. The research infrastructure is extended by the development of new plug-ins for the i2b2 web client application, able to properly select and export phenotypic data and to perform data analysis.

  18. Framework for ReSTful Web Services in OSGi

    NASA Technical Reports Server (NTRS)

    Shams, Khawaja S.; Norris, Jeffrey S.; Powell, Mark W.; Crockett, Thomas M.; Mittman, David S.; Fox, Jason M.; Joswig, Joseph C.; Wallick, Michael N.; Torres, Recaredo J.; Rabe, Kenneth

    2009-01-01

    Ensemble ReST is a software system that eases the development, deployment, and maintenance of server-side application programs to perform functions that would otherwise be performed by client software. Ensemble ReST takes advantage of the proven disciplines of ReST (Representational State Transfer). ReST leverages the standardized HTTP protocol to enable developers to offer services to a diverse variety of clients: from shell scripts to sophisticated Java application suites.

  19. Proposed color workflow solution from mobile and website to printing

    NASA Astrophysics Data System (ADS)

    Qiao, Mu; Wyse, Terry

    2015-03-01

    With the recent introduction of mobile devices and developments in client-side application technologies, there is an explosion of the parameter matrix for color management: hardware platform (computer vs. mobile), operating system (Windows, Mac OS, Android, iOS), client application (Flash, IE, Firefox, Safari, Chrome), and file format (JPEG, TIFF, PDF of various versions). In a modern digital print shop, multiple print solutions are used: digital presses, wide-format inkjet, and dye-sublimation inkjet produce a wide variety of customizable products, from photo books, personalized greeting cards, and canvases to mobile phone cases and more. In this paper, we outline a strategy spanning client-side applications, print file construction, and color setup on the printer to manage consistency and to achieve what-you-see-is-what-you-get for customers who use a wide variety of technologies in viewing and ordering products.

  20. Meeting the mental health needs of today's college student: Reinventing services through Stepped Care 2.0.

    PubMed

    Cornish, Peter A; Berry, Gillian; Benton, Sherry; Barros-Gomes, Patricia; Johnson, Dawn; Ginsburg, Rebecca; Whelan, Beth; Fawcett, Emily; Romano, Vera

    2017-11-01

    A new stepped care model developed in North America reimagines the original United Kingdom model for the modern university campus environment. It integrates a range of established and emerging online mental health programs systematically along dimensions of treatment intensity and associated student autonomy. Program intensity can be either stepped up or down depending on level of client need. Because monitoring is configured to give both provider and client feedback on progress, the model empowers clients to participate actively in care options, decisions, and delivery. Not only is stepped care designed to be more efficient than traditional counseling services, early observations suggest it improves outcomes and access, including the elimination of service waitlists. This paper describes the new model in detail and outlines implementation experiences at 3 North American universities. While the experiences implementing the model have been positive, there is a need for development of technology that would facilitate more thorough evaluation. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Correction to hill (2005).

    PubMed

    Hill, Clara E

    2006-01-01

    Reports an error in "Therapist Techniques, Client Involvement, and the Therapeutic Relationship: Inextricably Intertwined in the Therapy Process" by Clara E. Hill (Psychotherapy: Theory, Research, Practice, Training, 2005 Win, Vol 42(4), 431-442). An author's name was incorrectly spelled in a reference. The correct reference is presented. (The following abstract of the original article appeared in record 2006-03309-003.) I propose that therapist techniques, client involvement, and the therapeutic relationship are inextricably intertwined and need to be considered together in any discussion of the therapy process. Furthermore, I present a pantheoretical model of how these three variables evolve over four stages of successful therapy: initial impression formation, beginning the therapy (involves the components of facilitating client exploration and developing case conceptualization and treatment strategies), the core work of therapy (involves the components of theory-relevant tasks and overcoming obstacles), and termination. Theoretical propositions as well as implications for training and research are presented. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  2. Reassessing Rogers' necessary and sufficient conditions of change.

    PubMed

    Watson, Jeanne C

    2007-09-01

    This article reviews the impact of Carl Rogers' postulate about the necessary and sufficient conditions of therapeutic change on the field of psychotherapy. It is proposed that his article (see record 2007-14630-002) made an impact in two ways; first, by acting as a spur to researchers to identify the active ingredients of therapeutic change; and, second, by providing guidelines for therapeutic practice. The role of the necessary and sufficient conditions in process-experiential therapy, an emotion-focused therapy for individuals, and their limitations in terms of research and practice are discussed. It is proposed that although the conditions are necessary and important in promoting clients' affect regulation, they do not take sufficient account of other moderating variables that affect clients' response to treatment and may need to be balanced with more structured interventions. Notwithstanding, Rogers highlighted a way of interacting with clients that is generally acknowledged as essential to effective psychotherapy practice. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  3. Promoting behavior change: making healthy choices in wellness and healing choices in illness - use of self-determination theory in nursing practice.

    PubMed

    Johnson, Vicki D

    2007-06-01

    This article explores more efficacious strategies for holistic nurses to promote healthy behavior choices in their clients. It presents an overview of self-determination theory (SDT) and describes research evidence that supports the application of SDT to promoting healthy behavior change in clients. When nurses act in ways that support clients' innate needs for autonomy, competence, and relatedness, clients may be more successful at internalizing self-regulation and more inclined to adopt and maintain lifelong behavioral changes. Some examples of nursing interventions to motivate behavior change are outlined in this article.

  4. Costs and benefits to industry of online literature searches

    NASA Technical Reports Server (NTRS)

    Jensen, R. J.; Asbury, H. O.; King, R. G.

    1980-01-01

    A description is given of a client survey conducted by the NASA Industrial Application Center, U.S.C., examining user-identified dollar costs and benefits of an online computerized literature search. Telephone interviews were conducted on a random sample of clients using a Denver Research Institute questionnaire. Of the total 159 clients surveyed, over 53% identified dollar benefits. A direct relationship between client dollars invested and benefits derived from the search was shown. The ratio of dollar benefit to investment dollar averaged 2.9 to 1. Precise data on the end user's evaluation of the dollar value of an information search are presented.

  5. Using statistical process control to make data-based clinical decisions.

    PubMed

    Pfadt, A; Wheeler, D J

    1995-01-01

    Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior as well as for the course of treatment to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help to achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.
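The control-chart idea at the heart of SPC can be shown with a short calculation. The sketch below computes natural process limits for an individuals (XmR) chart, the chart type associated with Wheeler's writing, from the average moving range; the behavioral data values are invented for illustration.

```python
# Individuals (XmR) control chart limits from the average moving range.
# Observations outside the limits signal a real change in the process
# rather than routine variation. Data values below are illustrative.

def xmr_limits(data):
    """Return (center line, lower limit, upper limit) for an XmR chart."""
    center = sum(data) / len(data)
    moving_ranges = [abs(a - b) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR constant (3 / d2, with d2 = 1.128 for n=2)
    spread = 2.66 * mr_bar
    return center, center - spread, center + spread

# Daily counts of a target behavior during a baseline phase:
baseline = [12, 15, 11, 14, 13, 16, 12, 14]
center, lcl, ucl = xmr_limits(baseline)
```

A clinician charting ongoing data against `lcl` and `ucl` can distinguish a genuine treatment effect (points beyond the limits, or systematic runs) from ordinary day-to-day variability, which is exactly the data-based decision process the paper describes.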

  6. A Primer on Functional Analysis

    ERIC Educational Resources Information Center

    Yoman, Jerome

    2008-01-01

    This article presents principles and basic steps for practitioners to complete a functional analysis of client behavior. The emphasis is on application of functional analysis to adult mental health clients. The article includes a detailed flow chart containing all major functional diagnoses and behavioral interventions, with functional assessment…

  7. ADAP: A Divorce Assessment Proposal.

    ERIC Educational Resources Information Center

    Ferreiro, Beverly Webster; And Others

    1986-01-01

    Proposed guidelines for mental health clinicians in assessing clients' divorce-related concerns. Current empirical information on factors affecting individual and family adjustment after divorce is organized into a practical format for clinical application. Details a comprehensive assessment that will help the clinician to understand the client's…

  8. Plugin free remote visualization in the browser

    NASA Astrophysics Data System (ADS)

    Tamm, Georg; Slusallek, Philipp

    2015-01-01

    Today, users access information and rich media from anywhere using the web browser on their desktop computers, tablets, or smartphones. But the web evolves beyond media delivery. Interactive graphics applications like visualization or gaming become feasible as browsers advance in the functionality they provide. However, to deliver large-scale visualization to thin clients like mobile devices, a dedicated server component is necessary. Ideally, the client runs directly within the browser the user is accustomed to, requiring no installation of a plugin or native application. In this paper, we present the state of the art of technologies which enable plugin-free remote rendering in the browser. Further, we describe a remote visualization system unifying these technologies. The system transfers rendering results to the client as images or as a video stream. We utilize the upcoming World Wide Web Consortium (W3C) Web Real-Time Communication (WebRTC) standard, and the Native Client (NaCl) technology built into Chrome, to deliver video with low latency.

  9. A study on spatial decision support systems for HIV/AIDS prevention based on COM GIS technology

    NASA Astrophysics Data System (ADS)

    Yang, Kun; Luo, Huasong; Peng, Shungyun; Xu, Quanli

    2007-06-01

    Based on an in-depth analysis of the current status and existing problems of GIS technology applications in epidemiology, this paper proposes a method and process for establishing a spatial decision support system for AIDS epidemic prevention by integrating COM GIS, spatial database, GPS, remote sensing, and communication technologies, as well as ASP and ActiveX software development technologies. One of the most important issues in constructing such a system is how to integrate AIDS spreading models with GIS. The capabilities of GIS applications in AIDS epidemic prevention are described first. Then some mature epidemic spreading models are discussed with a view to extracting their computation parameters. Furthermore, a technical schema is proposed for integrating AIDS spreading models with GIS and relevant geospatial technologies, in which the GIS and model-running platforms share a common spatial database and the computing results can be spatially visualized on desktop or Web GIS clients. Finally, a complete solution for establishing a decision support system for AIDS epidemic prevention is offered, based on the model integration methods and ESRI COM GIS software packages. The overall decision support system is composed of sub-systems for data acquisition, network communication, model integration, the AIDS epidemic information spatial database, epidemic information querying and statistical analysis, dynamic epidemic surveillance, spatial analysis and decision support, and Web GIS-based publishing of epidemic information.

  10. TypeLoader: A fast and efficient automated workflow for the annotation and submission of novel full-length HLA alleles.

    PubMed

    Surendranath, V; Albrecht, V; Hayhurst, J D; Schöne, B; Robinson, J; Marsh, S G E; Schmidt, A H; Lange, V

    2017-07-01

    Recent years have seen a rapid increase in the discovery of novel allelic variants of the human leukocyte antigen (HLA) genes. Commonly, only the exons encoding the peptide binding domains of novel HLA alleles are submitted. As a result, the IPD-IMGT/HLA Database lacks sequence information outside those regions for the majority of known alleles. This has implications for the application of the new sequencing technologies, which deliver sequence data often covering the complete gene. As these technologies simplify the characterization of the complete gene regions, it is desirable for novel alleles to be submitted as full-length sequences to the database. However, the manual annotation of full-length alleles and the generation of the specific formats required by the sequence repositories is error-prone and time-consuming. We have developed TypeLoader to address both these facets. With only the full-length sequence as a starting point, TypeLoader performs automatic sequence annotation and subsequently handles all steps involved in preparing the specific formats for submission, with very little manual intervention. TypeLoader is routinely used at the DKMS Life Science Lab and has aided in the successful submission of more than 900 novel HLA alleles as full-length sequences to the European Nucleotide Archive repository and the IPD-IMGT/HLA Database, with a 95% reduction in the time spent on annotation and submission compared with handling these processes manually. TypeLoader is implemented as a web application and can be easily installed and used on a standalone Linux desktop system or within a Linux client/server architecture. TypeLoader is downloadable from http://www.github.com/DKMS-LSL/typeloader. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Performance evaluation of a distance learning program.

    PubMed

    Dailey, D J; Eno, K R; Brinkley, J F

    1994-01-01

    This paper presents a performance metric which uses a single number to characterize the response time for a non-deterministic client-server application operating over the Internet. When applied to a Macintosh-based distance learning application called the Digital Anatomist Browser, the metric allowed us to observe that "A typical student doing a typical mix of Browser commands on a typical data set will experience the same delay if they use a slow Macintosh on a local network or a fast Macintosh on the other side of the country accessing the data over the Internet." The methodology presented is applicable to other client-server applications that are rapidly appearing on the Internet.
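    One plausible reading of such a single-number metric is the expected delay of a "typical" session: per-command mean delays weighted by how often a typical user issues each command. The sketch below illustrates that reading with invented numbers; it is not necessarily the exact formula used in the paper.

    ```python
    # Sketch of a single-number response-time metric: per-command mean
    # delays weighted by a typical user's command mix. Delay values and
    # command names are invented for illustration.

    def typical_response_time(mean_delay, command_mix):
        """mean_delay: command -> mean observed delay (seconds);
        command_mix: command -> fraction of commands a typical user issues."""
        assert abs(sum(command_mix.values()) - 1.0) < 1e-9
        return sum(command_mix[c] * mean_delay[c] for c in command_mix)

    delays = {"load_image": 4.0, "rotate": 0.5, "query": 1.5}
    mix = {"load_image": 0.2, "rotate": 0.5, "query": 0.3}
    print(typical_response_time(delays, mix))  # 0.2*4.0 + 0.5*0.5 + 0.3*1.5 = 1.5
    ```

    With such a weighting, two very different configurations (slow machine on a local network, fast machine across the Internet) can indeed collapse to the same single number, as the quoted observation describes.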

  12. mORCA: ubiquitous access to life science web services.

    PubMed

    Diaz-Del-Pino, Sergio; Trelles, Oswaldo; Falgueras, Juan

    2018-01-16

    Technical advances in mobile devices such as smartphones and tablets have produced an extraordinary increase in their use around the world, and they have become part of our daily lives. The possibility of carrying these devices in a pocket, particularly mobile phones, has enabled ubiquitous access to Internet resources. Furthermore, the life sciences world has seen a vast proliferation of data types and services that are ultimately exposed as Web Services. This suggests the need for research into mobile clients that deal with life sciences applications for effective usage and exploitation. Analysing the current features in existing bioinformatics applications managing Web Services, we have devised, implemented, and deployed an easy-to-use, web-based, lightweight mobile client. This client is able to browse, select, compose parameters for, invoke, and monitor the execution of Web Services stored in catalogues or central repositories. The client is also able to deal with huge amounts of data between external storage mounts. In addition, we present a validation use case, which illustrates the usage of the application while executing, monitoring, and exploring the results of a registered workflow. The software is available in the Apple Store and the Android Market, and the source code is publicly available on GitHub. Mobile devices are becoming increasingly important in the scientific world due to their strong potential impact on scientific applications. Bioinformatics should not fall behind this trend. We present an original software client that deals with the intrinsic limitations of such devices and propose different guidelines to provide location-independent access to computational resources in bioinformatics and biomedicine. Its modular design makes it easily expandable with the inclusion of new repositories, tools, types of visualization, etc.

  13. Handheld Devices with Wide-Area Wireless Connectivity: Applications in Astronomy Educational Technology and Remote Computational Control

    NASA Astrophysics Data System (ADS)

    Budiardja, R. D.; Lingerfelt, E. J.; Guidry, M. W.

    2003-05-01

    Wireless technology implemented with handheld devices has attractive features because of the potential to access large amounts of data and the prospect of on-the-fly computational analysis from a device that can be carried in a shirt pocket. We shall describe applications of such technology to the general paradigm of making digital wireless connections from the field to upload information and queries to network servers, executing (potentially complex) programs and controlling data analysis and/or database operations on fast network computers, and returning real-time information from this analysis to the handheld device in the field. As illustration, we shall describe several client/server programs that we have written for applications in teaching introductory astronomy. For example, one program allows static and dynamic properties of astronomical objects to be accessed in a remote observation laboratory setting using a digital cell phone or PDA. Another implements interactive quizzing over a cell phone or PDA using a 700-question introductory astronomy quiz database, thus permitting students to study for astronomy quizzes in any environment in which they have a few free minutes and a digital cell phone or wireless PDA. Another allows one to control and monitor a computation done on a Beowulf cluster by changing the parameters of the computation remotely and retrieving the result when the computation is done. The presentation will include hands-on demonstrations with real devices. *Managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  14. Graphics interfaces and numerical simulations: Mexican Virtual Solar Observatory

    NASA Astrophysics Data System (ADS)

    Hernández, L.; González, A.; Salas, G.; Santillán, A.

    2007-08-01

    Preliminary results associated with the computational development and creation of the Mexican Virtual Solar Observatory (MVSO) are presented. Basically, the MVSO prototype consists of two parts: the first is related to observations that have been made during the past ten years at the Solar Observation Station (EOS) and at the Carl Sagan Observatory (OCS) of the Universidad de Sonora in Mexico. The second part is associated with the creation and manipulation of a database produced by numerical simulations of solar phenomena, for which we are using the MHD code ZEUS-3D. The development of this prototype was made using MySQL, Apache, Java, and VSO 1.2, following the GNU and `open source' philosophy. A graphical user interface (GUI) was created in order to run web-based, remote numerical simulations. For this purpose, Mono was used, because it provides the necessary software to develop and run .NET client and server applications on Linux. Although this project is still under development, we hope to have access, by means of this portal, to other virtual solar observatories and to be able to count on a database created through numerical simulations or, given the case, perform simulations associated with solar phenomena.

  15. Music Therapy in the Rehabilitation of Head-Injured Clients.

    ERIC Educational Resources Information Center

    Lee, Lissa

    This paper summarizes research on clinical applications of music therapy with closed head injury clients. It offers a rationale for including music therapy in interdisciplinary rehabilitation. The Rancho Los Amigos Levels of Cognitive Functioning are outlined, and therapeutic assessment and treatment procedures are discussed. Rehabilitation…

  16. A Cognitive Perspective in the Treatment of Incarcerated Clients.

    ERIC Educational Resources Information Center

    Walsh, Thomas C.

    1990-01-01

    Proposes a cognitive therapy model as a workable approach in treating incarcerated clients. Reviews principal components and techniques of cognitive theory. Uses case vignettes to illustrate application of this approach. Delineates key features of cognitive model which relate to treatment of incarcerated population. (Author/ABL)

  17. Suggested Perspectives in Counseling the American Indian Client.

    ERIC Educational Resources Information Center

    Paisano-Suazo, Aleta

    The standard western theoretical approach to mental health counseling is not applicable to the views held by Native American clients. Consideration must be given to their unique differences, if the therapist is to provide maximum effectiveness. Several perspectives offer alternative counseling procedures. For instance, Indians place great…

  18. 45 CFR 1611.7 - Manner of determining financial eligibility.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... forms and procedures to obtain information from applicants and groups to determine financial eligibility in a manner that promotes the development of trust between attorney and client. The forms shall be... verify the information, in a manner consistent with the attorney-client relationship. (d) When one...

  19. 45 CFR 1611.7 - Manner of determining financial eligibility.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... forms and procedures to obtain information from applicants and groups to determine financial eligibility in a manner that promotes the development of trust between attorney and client. The forms shall be... verify the information, in a manner consistent with the attorney-client relationship. (d) When one...

  20. 45 CFR 1611.7 - Manner of determining financial eligibility.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... forms and procedures to obtain information from applicants and groups to determine financial eligibility in a manner that promotes the development of trust between attorney and client. The forms shall be... verify the information, in a manner consistent with the attorney-client relationship. (d) When one...

  1. Using Family Photographs to Explore Life Cycle Changes.

    ERIC Educational Resources Information Center

    Gerace, Laina M.

    1989-01-01

    The author introduced discussions about family photographs as a clinical technique with depressed clients. During therapy, clients were encouraged to discuss the photos in an open-ended manner. Methods and themes elicited by photo-interview are presented. Comments on the clinical application of phototherapy are included. (CH)

  2. A Services-Oriented Architecture for Water Observations Data

    NASA Astrophysics Data System (ADS)

    Maidment, D. R.; Zaslavsky, I.; Valentine, D.; Tarboton, D. G.; Whitenack, T.; Whiteaker, T.; Hooper, R.; Kirschtel, D.

    2009-04-01

    Water observations data are time series of measurements made at point locations of water level, flow, and quality, and corresponding climatic observations at point locations such as gaged precipitation and weather variables. A services-oriented architecture has been built for such information for the United States that has three components: hydrologic information servers, hydrologic information clients, and a centralized metadata cataloging system. These are connected using web services for observations data and metadata defined by an XML-based language called WaterML. A Hydrologic Information Server can be built by storing observations data in a relational database schema in the CUAHSI Observations Data Model, in which case web services access to the data and metadata is automatically provided by query functions for WaterML that are wrapped around the relational database within a web server. A Hydrologic Information Server can also be constructed by custom-programming an interface to an existing water agency web site so that it responds to the same queries by producing data in WaterML, as the CUAHSI Observations Data Model based servers do. A Hydrologic Information Client is one that can interpret and ingest WaterML metadata and data. We have two client applications, for Excel and ArcGIS, and have shown how WaterML web services can be ingested into programming environments such as Matlab and Visual Basic. HIS Central, maintained at the San Diego Supercomputer Center, is a repository of observational metadata for WaterML web services which presently indexes 342 million data values measured at 1.75 million locations. This is the largest catalog of water observational data for the United States presently in existence.
As more observation networks join what we term "CUAHSI Water Data Federation", and the system accommodates a growing number of sites, measured parameters, applications, and users, rapid and reliable access to large heterogeneous hydrologic data repositories becomes critical. The CUAHSI HIS solution to the scalability and heterogeneity challenges has several components. Structural differences across the data repositories are addressed by building a standard services foundation for the exchange of hydrologic data, as derived from a common information model for observational data measured at stationary points and its implementation as a relational schema (ODM) and an XML schema (WaterML). Semantic heterogeneity is managed by mapping water quantity, water quality, and other parameters collected by government agencies and academic projects to a common ontology. The WaterML-compliant web services are indexed in a community services registry called HIS Central (hiscentral.cuahsi.org). Once a web service is registered in HIS Central, its metadata (site and variable characteristics, period of record for each variable at each site, etc.) is harvested and appended to the central catalog. The catalog is further updated as the service publisher associates the variables in the published service with ontology concepts. After this, the newly published service becomes available for spatial and semantics-based queries from online and desktop client applications developed by the project. Hydrologic system server software is now deployed at more than a dozen locations in the United States and Australia. To provide rapid access to data summaries, in particular for several nation-wide data repositories including EPA STORET, USGS NWIS, and USDA SNOTEL, we convert the observation data catalogs and databases with harvested data values into special representations that support high-performance analysis and visualization. 
The construction of OLAP (Online Analytical Processing) cubes, often called data cubes, is an approach to organizing and querying large multi-dimensional data collections. We have applied the OLAP techniques, as implemented in Microsoft SQL Server 2005/2008, to the analysis of the catalogs from several agencies. OLAP analysis results reflect geography and history of observation data availability from USGS NWIS, EPA STORET, and USDA SNOTEL repositories, and spatial and temporal dynamics of the available measurements for several key nutrient-related parameters. Our experience developing the CUAHSI HIS cyberinfrastructure demonstrated that efficient integration of hydrologic observations from multiple government and academic sources requires a range of technical approaches focused on managing different components of data heterogeneity and system scalability. While this submission addresses technical aspects of developing a national-scale information system for hydrologic observations, the challenges of explicating shared semantics of hydrologic observations and building a community of HIS users and developers remain critical in constructing a nation-wide federation of water data services.
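    The core task of a WaterML-ingesting client described above can be sketched as parsing an XML time series of observations into (timestamp, value) pairs. The fragment below is a deliberately simplified, hypothetical stand-in for a WaterML response, not the full WaterML schema.

    ```python
    # Sketch of what a WaterML-aware client must do: parse an XML time
    # series of observations into (timestamp, value) pairs. The XML is a
    # simplified, hypothetical fragment, not the actual WaterML schema.
    import xml.etree.ElementTree as ET

    WATERML_FRAGMENT = """
    <timeSeriesResponse>
      <timeSeries>
        <variable><variableName>Discharge</variableName></variable>
        <values>
          <value dateTime="2009-04-01T00:00:00">12.3</value>
          <value dateTime="2009-04-01T01:00:00">11.8</value>
        </values>
      </timeSeries>
    </timeSeriesResponse>
    """

    def parse_series(xml_text):
        root = ET.fromstring(xml_text)
        name = root.findtext(".//variableName")
        points = [(v.get("dateTime"), float(v.text))
                  for v in root.iter("value")]
        return name, points

    name, points = parse_series(WATERML_FRAGMENT)
    print(name, points)  # Discharge [('2009-04-01T00:00:00', 12.3), ('2009-04-01T01:00:00', 11.8)]
    ```

    Once the series is in this neutral form, a client such as the Excel or ArcGIS application can chart it or join it with spatial data without caring which server (ODM-backed or agency-wrapped) produced the WaterML.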

  3. Demonstration of measurement-only blind quantum computing

    NASA Astrophysics Data System (ADS)

    Greganti, Chiara; Roehsner, Marie-Christine; Barz, Stefanie; Morimae, Tomoyuki; Walther, Philip

    2016-01-01

    Blind quantum computing allows for secure cloud networks of quasi-classical clients and a fully fledged quantum server. Recently, a new protocol has been proposed, which requires a client to perform only measurements. We demonstrate a proof-of-principle implementation of this measurement-only blind quantum computing, exploiting a photonic setup to generate four-qubit cluster states for computation and verification. Feasible technological requirements for the client and the device-independent blindness make this scheme very applicable for future secure quantum networks.

  4. A regional technology transfer program. [North Carolina Industrial Applications Center for the Southeast

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The proliferation of online searching capabilities among its industrial clients, changes in marketing staff and direction, use of Dun and Bradstreet marketing service files, growth of the Annual Service Package program, and services delivered to clients at the NASA funded North Carolina Science and Technology Research Center are described. The library search service was reactivated and enlarged, and a survey was conducted on the NC/STRC Technical Bulletin's effectiveness. Several quotations from clients assess the overall value of the Center's services.

  5. Evaluation of service quality in family planning clinics in Lusaka, Zambia.

    PubMed

    Hancock, Nancy L; Vwalika, Bellington; Sitali, Elizabeth Siyama; Mbwili-Muleya, Clara; Chi, Benjamin H; Stuart, Gretchen S

    2015-10-01

    To determine the quality of contraceptive services in family planning clinics in Lusaka, Zambia, using a standardized approach. We utilized the Quick Investigation of Quality, a cross-sectional survey tool consisting of a facility assessment, client-provider observation and client exit interview, in public-sector family planning clinics. Data were collected on availability of seven contraceptive methods, information given to clients, interpersonal relations between providers and clients, providers' technical competence and mechanisms for continuity and follow-up. Data were collected from five client-provider observations and client exit interviews in each of six public-sector family planning clinics. All clinics had at least two contraceptive methods continuously available for the preceding 6 months. Most providers asked clients about concerns with their contraceptive method (80%) and told clients when to return to the clinic (87%). Most clients reported that the provider advised what to do if a problem develops (93%), described possible side effects (89%), explained how to use the method effectively (85%) and told them when to come for follow-up (83%). Clients were satisfied with services received (93%). This application of the Quick Investigation of Quality showed that the participating family planning clinics in Lusaka, Zambia, were prepared to offer high-quality services with the available commodities and that clients were satisfied with the received services. Despite the subjective client satisfaction, quality improvement efforts are needed to increase contraceptive availability. Although clients perceived the quality of care received to be high, family planning service quality could be improved to continuously offer the full spectrum of contraceptive options. The Quick Investigation of Quality was easily implemented in Lusaka, Zambia, and this simple approach could be utilized in a variety of settings as a modality for quality improvement. 
Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Client attachment in a randomized clinical trial of psychoanalytic and cognitive-behavioral psychotherapy for bulimia nervosa: Outcome moderation and change.

    PubMed

    Daniel, Sarah Ingrid Franksdatter; Poulsen, Stig; Lunn, Susanne

    2016-06-01

    In the context of a randomized clinical trial of psychoanalytic psychotherapy (PPT) versus cognitive behavior therapy (CBT) for bulimia nervosa (BN), this study performed secondary analyses of (a) the relation between attachment and pretreatment symptom levels, (b) whether client pretreatment attachment moderated treatment outcome, (c) whether change in client attachment was associated with symptomatic change, and (d) whether client attachment changed differently in the 2 treatments. Sixty-nine women and 1 man of a mean age of 25.8 years diagnosed with BN were randomly assigned to either 2 years of weekly PPT or 5 months of CBT. Assessments at intake, after 5 months, and after 2 years included the Eating Disorder Examination to assess eating disorder symptoms, the Adult Attachment Interview to assess client attachment, and the Symptom Checklist 90-R to assess general psychiatric distress. Repeated measures were analyzed using multilevel analysis. Higher scores on attachment insecurity and attachment preoccupation were associated with more frequent binging pretreatment. Pretreatment attachment did not predict treatment outcome. In PPT, but not in CBT, reduction of binging was associated with an increase in attachment security. The 2 treatment types were not associated with significantly different patterns of attachment-related change. Degree and type of attachment insecurity is related to the frequency of binging in BN. Increase in attachment security may be a treatment-specific mechanism of change in PPT for BN. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. Realist Evaluation in Wraparound: A New Approach in Social Work Evidence-Based Practice

    ERIC Educational Resources Information Center

    Kazi, Mansoor A. F.; Pagkos, Brian; Milch, Heidi A.

    2011-01-01

    Objectives: The purpose of this study was to develop a realist evaluation paradigm in social work evidence-based practice. Method: Wraparound (at Gateway-Longview Inc., New York) used a reliable outcome measure and an electronic database to systematically collect and analyze data on the interventions, the client demographics and circumstances, and…

  8. Development and deployment of a Desktop and Mobile application on grid for GPS studie

    NASA Astrophysics Data System (ADS)

    Ntumba, Patient; Lotoy, Vianney; Djungu, Saint Jean; Fleury, Rolland; Petitdidier, Monique; Gemünd, André; Schwichtenberg, Horst

    2013-04-01

    GPS networks for scientific studies are being developed all over the world, and large, regularly updated databases such as IGS are also available. Many GPS receivers have been installed in West and Central Africa during AMMA (African Monsoon Multidisciplinary Analysis), IHY (International Heliophysical Year), and many other projects since 2005. African scientists have been trained to use those data, especially for meteorological and ionospheric studies. Computing the annual variations of ionospheric parameters for a given station, or maps of a given region, is very computationally intensive, so grid or cloud computing may be a solution to obtain results in a relatively short time. At the University of Kinshasa the chosen solution is a grid of several PCs. It has been deployed by using the Globus Toolkit on a Condor pool in order to support the processing of GPS data for ionospheric studies. To be user-friendly, graphical user interfaces (GUIs) have been developed to help the user prepare and submit jobs: one is a Java GUI for the desktop client, the other an Android GUI for the mobile client. The interest of a grid is the possibility of submitting a bunch of jobs with adequate agent control, in order to survey job execution and result storage. After the feasibility study the grid will be extended to a larger number of PCs. Other solutions will be explored in parallel.
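    The "bunch of jobs with agent control" pattern in this record can be sketched generically: submit one job per GPS station, then let an agent loop collect results as they finish. In the sketch below a local thread pool stands in for the Globus/Condor grid, and the station names and job body are invented.

    ```python
    # Generic sketch of the batch-submission-plus-agent pattern described
    # above: one job per GPS station, submitted asynchronously, with an
    # agent loop collecting results as jobs finish. A local thread pool
    # stands in for the Condor pool; stations and the job body are invented.
    from concurrent.futures import ThreadPoolExecutor, as_completed

    def process_station(station):
        # Placeholder for the intensive ionospheric computation on one
        # station's GPS data; here it just returns a dummy summary.
        return station, f"TEC series for {station} computed"

    stations = ["DAKR", "NKLG", "KINS", "OUAG"]

    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = {pool.submit(process_station, s): s for s in stations}
        results = {}
        for fut in as_completed(futures):    # the "agent" surveys the jobs
            station, summary = fut.result()  # ...and stores each result
            results[station] = summary

    print(sorted(results))  # ['DAKR', 'KINS', 'NKLG', 'OUAG']
    ```

    On a real grid the `submit` call becomes a Condor/Globus job submission and the agent polls job state instead of futures, but the control flow is the same.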

  9. Working with Specify in a Paleo-Geological Context

    NASA Astrophysics Data System (ADS)

    Molineux, A.; Thompson, A. C.; Appleton, L.

    2014-12-01

    For geological collections with limited funding, an open source relational database provides an opportunity to digitize specimens and related data. At the Non-vertebrate Paleontology Lab, a large mixed paleontological and geological repository on a restricted budget, we opted for one such database, Specify. Initially created at the University of Kansas for neontological collections and based on a single computer, Specify has moved into the networked scene and will soon be web-based as Specify 7. We currently use the server version of Specify 6, networked to all computers in the lab, each running a desktop client, often with six users at any one time. Along with improved access there have been great efforts to broaden the applicability of this database to other disciplines. Current developments are of great importance to us because they focus on the geological aspects of lithostratigraphy and chronostratigraphy and their relationship to other variables. Adoption of this software has required constant change as we move to take advantage of the great improvements. We enjoy the interaction with the developers and their willingness to listen and consider our issues. Here we discuss some of the ways in which we have fashioned Specify into a database that provides us with the flexibility that we need without removing the ability to share our data with other aggregators through accepted protocols. We discuss the customization of forms, the attachment of media and tracking of original media files, our efforts to incorporate geological specimens, and our plans to link individual specimen record GUIDs to IGSN numbers and thence to future connections to data derived from our specimens.

  10. A retrospective, descriptive study of shoulder outcomes in outpatient physical therapy.

    PubMed

    Millar, A Lynn; Lasheway, Philip A; Eaton, Wendy; Christensen, Frances

    2006-06-01

    A retrospective, descriptive study of clients with shoulder dysfunction referred to physical therapy. To (1) describe the clinical and functional outcomes of clients with shoulder dysfunction following outpatient physical therapy, and (2) compare the outcomes by type of shoulder dysfunction. Although individuals with shoulder dysfunction are commonly referred to physical therapy, few large descriptive studies regarding outcomes following physical therapy are available. Data for 878 clients (468 female, 410 male) were retrieved and analyzed. This database was developed between 1997 and 2000 and included 4 outpatient facilities from 1 healthcare system in the southwest corner of Michigan. Clients were classified by type of shoulder dysfunction, and standardized tests were performed upon admittance to and discharge from physical therapy. Descriptive and inferential statistics were calculated for all data. Of all clients, 55.1% had shoulder impingement, while 18.3% had a postoperative repair, 8.9% had a frozen shoulder, 7.6% had a rotator cuff tear, 3.0% had shoulder instability, 2.1% were post fracture, and the remaining 4.9% had miscellaneous diagnoses. The average (+/-SD) age of the patients was 53.6 +/- 16.4 years, with an average (+/-SD) number of treatment sessions of 13.7 +/- 11.0. All groups showed significant changes following physical therapy intervention. Clients with diverse types of shoulder dysfunction demonstrated improvement in both clinical and functional measures at the conclusion of physical therapy, although it is not possible to determine whether these changes were due to the interventions or due to time. The type of shoulder dysfunction appears to affect the prognosis; thus expected outcomes should be based upon initial diagnosis and specific measures.

  11. TogoDoc server/client system: smart recommendation and efficient management of life science literature.

    PubMed

    Iwasaki, Wataru; Yamamoto, Yasunori; Takagi, Toshihisa

    2010-12-13

    In this paper, we describe a server/client literature management system specialized for the life science domain, the TogoDoc system (Togo, pronounced Toe-Go, is a romanization of a Japanese word for integration). The server and the client program cooperate closely over the Internet to provide life scientists with an effective literature recommendation service and efficient literature management. The content-based and personalized literature recommendation helps researchers to isolate interesting papers from the "tsunami" of literature, in which, on average, more than one biomedical paper is added to MEDLINE every minute. Because researchers these days need to cover updates of much wider topics to generate hypotheses using massive datasets obtained from public databases or omics experiments, the importance of having an effective literature recommendation service is rising. The automatic recommendation is based on the content of personal literature libraries of electronic PDF papers. The client program automatically analyzes these files, which are sometimes deeply buried in storage disks of researchers' personal computers. Just saving PDF papers to the designated folders makes the client program automatically analyze and retrieve metadata, rename file names, synchronize the data to the server, and receive the recommendation lists of newly published papers, thus accomplishing effortless literature management. In addition, the tag suggestion and associative search functions are provided for easy classification of and access to past papers (researchers who read many papers sometimes only vaguely remember or completely forget what they read in the past). The TogoDoc system is available for both Windows and Mac OS X and is free. The TogoDoc Client software is available at http://tdc.cb.k.u-tokyo.ac.jp/, and the TogoDoc server is available at https://docman.dbcls.jp/pubmed_recom.
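    The content-based recommendation this record describes can be sketched minimally: rank candidate papers by their textual similarity to the user's existing library. The bag-of-words cosine similarity below is a generic stand-in for TogoDoc's actual model, and all texts and names are invented examples.

    ```python
    # Minimal sketch of content-based literature recommendation: rank new
    # papers by bag-of-words cosine similarity to the user's library. A
    # generic stand-in for TogoDoc's actual model; all texts are invented.
    import math
    from collections import Counter

    def cosine(a, b):
        ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
        dot = sum(ca[w] * cb[w] for w in ca)
        norm = math.sqrt(sum(v * v for v in ca.values())) * \
               math.sqrt(sum(v * v for v in cb.values()))
        return dot / norm if norm else 0.0

    library = ["hla allele sequencing and annotation",
               "full length gene characterization workflow"]
    candidates = {
        "paper_A": "novel hla allele annotation pipeline",
        "paper_B": "soil moisture sensing in agriculture",
    }
    # Score each candidate by its best similarity to any library entry.
    scores = {p: max(cosine(text, doc) for doc in library)
              for p, text in candidates.items()}
    best = max(scores, key=scores.get)
    print(best)  # paper_A
    ```

    A production recommender would add TF-IDF weighting, stemming, and incremental indexing of newly published MEDLINE records, but the ranking principle is the same: the personal PDF library defines the query profile.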

  12. TogoDoc Server/Client System: Smart Recommendation and Efficient Management of Life Science Literature

    PubMed Central

    Takagi, Toshihisa

    2010-01-01

    In this paper, we describe a server/client literature management system specialized for the life science domain, the TogoDoc system (Togo, pronounced Toe-Go, is a romanization of a Japanese word for integration). The server and the client program cooperate closely over the Internet to provide life scientists with an effective literature recommendation service and efficient literature management. The content-based and personalized literature recommendation helps researchers to isolate interesting papers from the “tsunami” of literature, in which, on average, more than one biomedical paper is added to MEDLINE every minute. Because researchers these days need to cover updates of much wider topics to generate hypotheses using massive datasets obtained from public databases or omics experiments, the importance of having an effective literature recommendation service is rising. The automatic recommendation is based on the content of personal literature libraries of electronic PDF papers. The client program automatically analyzes these files, which are sometimes deeply buried in storage disks of researchers' personal computers. Just saving PDF papers to the designated folders makes the client program automatically analyze and retrieve metadata, rename file names, synchronize the data to the server, and receive the recommendation lists of newly published papers, thus accomplishing effortless literature management. In addition, the tag suggestion and associative search functions are provided for easy classification of and access to past papers (researchers who read many papers sometimes only vaguely remember or completely forget what they read in the past). The TogoDoc system is available for both Windows and Mac OS X and is free. The TogoDoc Client software is available at http://tdc.cb.k.u-tokyo.ac.jp/, and the TogoDoc server is available at https://docman.dbcls.jp/pubmed_recom. PMID:21179453

  13. Diffusing Supply Chain Innovations at Hewlett-Packard Company: Applications of Performance Technology.

    ERIC Educational Resources Information Center

    Cargille, Brian; Branvold, Dwight

    2000-01-01

    Explains how Hewlett-Packard creates supply chain management innovations and effectively diffuses new technologies. Outlines how performance technologists help accelerate the diffusion and adoption of innovations by modifying innovations, define the client adoption path, create resources to lead clients through adoption, and improve the diffusion…

  14. Collaborating with Your Clients Using Social Media & Mobile Communications

    ERIC Educational Resources Information Center

    Typhina, Eli; Bardon, Robert E.; Gharis, Laurie W.

    2015-01-01

    Many Extension educators are still learning how to effectively integrate social media into their programs. By using the right social media platforms and mobile applications to create engaged, online communities, Extension educators can collaborate with clients to produce and to share information expanding and enhancing their social media and…

  15. Creating Experiential Learning in the Graduate Classroom through Community Engagement

    ERIC Educational Resources Information Center

    Johnson, Katryna

    2013-01-01

    Educators can provide opportunities for active learning for the students by engaging them in client-based projects with the community, which enhances application of theory and provides students with the relevance demanded from the business community. Experiential learning opportunities through client-based projects provide for such an experience.…

  16. Accessibility and preferred use of online Web applications among WIC participants with Internet access.

    PubMed

    Bensley, Robert J; Hovis, Amanda; Horton, Karissa D; Loyo, Jennifer J; Bensley, Kara M; Phillips, Diane; Desmangles, Claudia

    2014-01-01

This study examined the current technology use of clients in the western Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) region and the preferences these clients have for using new technologies to interact with WIC. A cross-sectional convenience sample of WIC clients completed an online survey over 2 months in 2011. A weighted sample of 8,144 participants showed that the majority of WIC clients have access to the Internet using a computer or mobile phone. E-mail, texting, and Facebook were the technologies most often used for communication. Significant differences (P < .05) existed between age groups in Facebook use, between education levels in technology use for education delivery, and between education levels in use of video chat. Technologies should be considered for addressing WIC clients' needs, including use of text messaging and smartphone apps for appointments, education, and other WIC services; online scheduling and nutrition education; and a stronger Facebook presence for connecting with WIC clients and breastfeeding support. Published by Elsevier Inc.

  17. Design and development of a mobile exercise application for home care aides and older adult medicaid home and community-based clients.

    PubMed

    Danilovich, Margaret K; Diaz, Laura; Saberbein, Gustavo; Healey, William E; Huber, Gail; Corcos, Daniel M

    2017-01-01

    We describe a community-engaged approach with Medicaid home and community-based services (HCBS), home care aide (HCA), client, and physical therapist stakeholders to develop a mobile application (app) exercise intervention through focus groups and interviews. Participants desired a short exercise program with modification capabilities, goal setting, and mechanisms to track progress. Concerns regarding participation were training needs and feasibility within usual care services. Technological preferences were for simple, easy-to-use, and engaging content. The app was piloted with HCA-client dyads (n = 5) to refine the intervention and evaluate content. Engaging stakeholders in intervention development provides valuable user-feedback on both desired exercise program contents and mobile technology preferences for HCBS recipients.

  18. CommServer: A Communications Manager For Remote Data Sites

    NASA Astrophysics Data System (ADS)

    Irving, K.; Kane, D. L.

    2012-12-01

CommServer is a software system that manages connections to remote data-gathering stations, providing a simple network interface to client applications. The client requests a connection to a site by name, and the server establishes the connection, providing a bidirectional channel between the client and the target site if successful. CommServer was developed to manage networks of FreeWave serial data radios with multiple data sites, repeaters, and network-accessed base stations, and it has been in continuous operational use for several years. Support for Iridium modems using RUDICS will be added soon, and no changes to the application interface are anticipated. CommServer is implemented on Linux using programs written in bash, Python, Perl, and AWK, under a set of conventions we refer to as ThinObject.
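The brokered-connection pattern the abstract describes, in which a client names a site and the server supplies a bidirectional channel to it, can be sketched in a few dozen lines. This is not CommServer itself (its bash/Python/Perl/AWK sources and ThinObject conventions are not shown here); the site table, port, and one-line naming protocol below are invented for illustration:

```python
# Hypothetical sketch of a CommServer-style connection broker. The SITES
# table, the port, and the "site name on the first line" protocol are all
# assumptions, not the real system's interface.
import socket
import threading

SITES = {"upper-kuparuk": ("10.0.0.12", 4001)}  # site name -> (host, port)

def relay(src, dst):
    """Copy bytes one way until either side closes."""
    try:
        while (chunk := src.recv(4096)):
            dst.sendall(chunk)
    except OSError:
        pass
    finally:
        src.close()
        dst.close()

def handle(client):
    # The first line from the client names the remote data site.
    name = client.makefile().readline().strip()
    addr = SITES.get(name)
    if addr is None:
        client.sendall(b"ERR unknown site\n")
        client.close()
        return
    remote = socket.create_connection(addr, timeout=30)
    client.sendall(b"OK\n")
    # Bidirectional channel: one relay thread per direction.
    threading.Thread(target=relay, args=(client, remote), daemon=True).start()
    threading.Thread(target=relay, args=(remote, client), daemon=True).start()

def serve(port=5000):
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", port))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()
```

A client would connect to the broker, send a site name terminated by a newline, wait for `OK`, and then talk to the remote station as if directly connected.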

  19. Web-Based Software for Managing Research

    NASA Technical Reports Server (NTRS)

    Hoadley, Sherwood T.; Ingraldi, Anthony M.; Gough, Kerry M.; Fox, Charles; Cronin, Catherine K.; Hagemann, Andrew G.; Kemmerly, Guy T.; Goodman, Wesley L.

    2007-01-01

aeroCOMPASS is a software system, originally designed to aid in the management of wind tunnels at Langley Research Center, that could be adapted to provide similar aid to other enterprises in which research is performed in common laboratory facilities by users who may be geographically dispersed. Included in aeroCOMPASS is Web-interface software that provides a single, convenient portal to a set of project- and test-related software tools and other application programs. The heart of aeroCOMPASS is a user-oriented document-management software subsystem that enables geographically dispersed users to easily share and manage a variety of documents. A principle of "write once, read many" is implemented throughout aeroCOMPASS to eliminate the need for multiple entry of the same information. The Web framework of aeroCOMPASS provides links to client-side application programs that are fully integrated with databases and server-side application programs. Other subsystems of aeroCOMPASS include ones for reserving hardware, tracking requests and feedback from users, generating interactive notes, administering a customer-satisfaction questionnaire, managing the execution of tests, managing archives of metadata about tests, planning tests, and providing online help and instruction for users.

  20. Improving client-centred care and services: the role of front/back-office configurations.

    PubMed

    Broekhuis, Manda; de Blok, Carolien; Meijboom, Bert

    2009-05-01

This paper reports a study conducted to explore how front- and back-office work can be designed to produce efficient client-centred care in healthcare organizations that supply home care, welfare and domestic services. Front/back-office configurations reflect a neglected domain of design decisions in the development of more client-centred processes and structures without incurring major cost increases. Based on a literature search, a framework of four front/back-office configurations was constructed. To illustrate the usefulness of this framework, a single, longitudinal case study was performed over a sustained period (2005-2006) in a large organization that provides home care, welfare and domestic services. The case study illustrates how front/back-office design decisions are related to the complexity of the clients' demands and the strategic objectives of an organization. The constructed framework guides the practical development of front/back-office designs, and shows how each design contributes differently to such performance objectives as quality, speed and efficiency. The front/back-office configurations presented comprise an important first step in elaborating client-centred care and service provision to the operational level. The framework helps healthcare organizations to become more responsive and to provide efficient client-centred care and services by approaching demand in a well-tuned manner. In addition to its applicability in home care, we believe that a deliberate front/back-office configuration also has potential in other fields of health care.

  1. Benefit–Cost in the California Treatment Outcome Project: Does Substance Abuse Treatment “Pay for Itself”?

    PubMed Central

    Ettner, Susan L; Huang, David; Evans, Elizabeth; Rose Ash, Danielle; Hardy, Mary; Jourabchi, Mickel; Hser, Yih-Ing

    2006-01-01

    Objective To examine costs and monetary benefits associated with substance abuse treatment. Data Sources Primary and administrative data on client outcomes and agency costs from 43 substance abuse treatment providers in 13 counties in California during 2000–2001. Study Design Using a social planner perspective, the estimated direct cost of treatment was compared with the associated monetary benefits, including the client's costs of medical care, mental health services, criminal activity, earnings, and (from the government's perspective) transfer program payments. The cost of the client's substance abuse treatment episode was estimated by multiplying the number of days that the client spent in each treatment modality by the estimated average per diem cost of that modality. Monetary benefits associated with treatment were estimated using a pre–posttreatment admission study design, i.e., each client served as his or her own control. Data Collection Treatment cost data were collected from providers using the Drug Abuse Treatment Cost Analysis Program instrument. For the main sample of 2,567 clients, information on medical hospitalizations, emergency room visits, earnings, and transfer payments was obtained from baseline and 9-month follow-up interviews, and linked to information on inpatient and outpatient mental health services use and criminal activity from administrative databases. Sensitivity analyses examined administrative data outcomes for a larger cohort (N=6,545) and longer time period (1 year). Principal Findings On average, substance abuse treatment costs $1,583 and is associated with a monetary benefit to society of $11,487, representing a greater than 7:1 ratio of benefits to costs. These benefits were primarily because of reduced costs of crime and increased employment earnings. 
Conclusions Even without considering the direct value to clients of improved health and quality of life, allocating taxpayer dollars to substance abuse treatment may be a wise investment. PMID:16430607
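The costing arithmetic in the study design above is simple enough to show concretely. In this toy sketch, only the days-times-per-diem formula, the $1,583 average cost, and the $11,487 average benefit come from the abstract; the modality names, day counts, and per-diem rates are invented for illustration:

```python
# Episode cost = sum over modalities of (days in modality * per-diem cost).
# The day counts and per-diem rates below are invented, not study data.
episode = {"residential": (10, 90.0), "outpatient": (45, 15.0)}  # days, $/day

cost = sum(days * per_diem for days, per_diem in episode.values())
print(cost)  # 10*90.0 + 45*15.0 = 1575.0

# The study-level averages give the headline benefit-cost ratio:
ratio = 11487 / 1583
print(round(ratio, 1))  # about 7.3 -- the "greater than 7:1" finding
```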

  2. dLocAuth: a dynamic multifactor authentication scheme for mCommerce applications using independent location-based obfuscation

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan A.

    2012-06-01

This paper proposes a new technique to obfuscate an authentication-challenge program (named LocProg) using randomly generated data together with a client's current location in real time. LocProg can be used to enable any handset application on mobile devices (e.g. mCommerce on smartphones) that requires authentication with a remote authenticator (e.g. a bank). The motivation for this novel technique is to a) enhance security against replay attacks, which currently relies on real-time nonce(s), and b) add a new security factor, location verified by two independent sources, to challenge/response methods for authentication. To assure a secure live transaction, thus reducing the possibility of replay and other remote attacks, the authors have devised a novel technique to obtain the client's location from two independent sources: GPS on the client's side and the cellular network on the authenticator's side. The algorithm of LocProg is based on obfuscating "random elements plus a client's data" with a location-based key generated on the bank's side. LocProg is then sent to the client and is designed so it will automatically integrate into the target application on the client's handset. The client can then de-obfuscate LocProg if s/he is within a certain range around the location calculated by the bank and if the correct personal data is supplied. LocProg also has features to protect against trial-and-error attacks. Analysis of LocAuth's security (trust, threat and system models) and trials based on a prototype implementation (on the Android platform) prove the viability and novelty of LocAuth.
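The core idea, a key derived from the client's location guarding "random elements plus a client's data", can be illustrated with a toy sketch. Everything here (the grid size, SHA-256 key derivation, XOR obfuscation, and all function names) is an assumption for illustration only; the paper's LocProg is an obfuscated program with independent GPS and cellular-network location checks, not a simple payload cipher:

```python
# Toy illustration of location-keyed obfuscation. GRID, derive_key, and the
# salt/nonce layout are invented; they are not the dLocAuth algorithm.
import hashlib
import os

GRID = 0.01  # quantization cell (~1 km) so nearby fixes yield the same key

def derive_key(lat, lon, salt):
    """Quantize the location to a grid cell, then hash it with a salt."""
    cell = f"{round(lat / GRID)}:{round(lon / GRID)}".encode()
    return hashlib.sha256(salt + cell).digest()

def xor(data, key):
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def obfuscate(payload, lat, lon):
    """Bank side: blend random elements with the data under a location key."""
    salt = os.urandom(16)
    nonce = os.urandom(8)                  # the "random elements"
    return salt, xor(nonce + payload, derive_key(lat, lon, salt))

def deobfuscate(salt, blob, lat, lon):
    """Client side: recovers the payload only near the expected location."""
    plain = xor(blob, derive_key(lat, lon, salt))
    return plain[8:]                       # strip the nonce
```

A client whose GPS fix falls in the same grid cell derives the same key and recovers the data; a distant (or spoofed) location yields garbage, which is the property the scheme's location factor relies on.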

  3. Prototype of Multifunctional Full-text Library in the Architecture Web-browser / Web-server / SQL-server

    NASA Astrophysics Data System (ADS)

    Lyapin, Sergey; Kukovyakin, Alexey

Within the framework of the research program "Textaurus", an operational prototype of the multifunctional library T-Libra v.4.1 has been created that supports flexible, parametrizable search within a full-text database. The information system is realized in the architecture Web-browser / Web-server / SQL-server. This achieves an optimal combination of universality and efficiency of text processing on the one hand, and convenience and minimal cost for the end user (since a standard Web browser serves as the client application) on the other. The following principles underlie the information system: a) multifunctionality, b) intelligence, c) multilingual primary texts and full-text searching, d) development of the digital library (DL) by a user ("administrative client"), e) multi-platform operation. A "library of concepts", i.e., a block of functional models of semantic (concept-oriented) searching, together with a closely connected subsystem of parametrizable queries to the full-text database, serves as the conceptual basis of the multifunctionality and "intelligence" of the DL T-Libra v.4.1. The author's paragraph is the unit of full-text searching in this technology. Moreover, the "logic" of an educational or scientific topic can be built into a multilevel, flexible query structure and into the "library of concepts", which developers and experts can replenish. About 10 queries of various levels of complexity and conceptuality are realized in the current version of the information system: from simple terminological searching (taking into account the lexical and grammatical paradigms of Russian) to several kinds of explication of terminological fields and an adjustable two-parameter thematic search, whose parameters are a [set of terms] and a [distance between terms] within the limits of an author's paragraph, respectively.
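The two-parameter thematic search described above, a [set of terms] plus a [distance between terms] evaluated within an author's paragraph, can be sketched as follows. The function name, the word-level distance measure, and the window semantics are assumptions for illustration; T-Libra itself evaluates such queries as parametrizable SQL on the server:

```python
# Hedged sketch: does this paragraph contain every term of `terms`, with each
# term within `max_distance` words of the first term? Real full-text engines
# would also normalize lexical/grammatical paradigms, which we skip here.
def paragraph_matches(paragraph, terms, max_distance):
    words = paragraph.lower().split()
    positions = {t: [i for i, w in enumerate(words) if w == t] for t in terms}
    if any(not p for p in positions.values()):
        return False  # some term is missing from the paragraph entirely
    # For each occurrence of the first term, check whether every other term
    # occurs within the distance window around it.
    for i in positions[terms[0]]:
        if all(any(abs(i - j) <= max_distance for j in positions[t])
               for t in terms[1:]):
            return True
    return False
```

The same predicate, applied per author's paragraph, gives the [set of terms] / [distance between terms] behavior; shrinking `max_distance` tightens the thematic coupling the query demands.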

  4. Recovery and money management.

    PubMed

    Rowe, Michael; Serowik, Kristin L; Ablondi, Karen; Wilber, Charles; Rosen, Marc I

    2013-06-01

Social recovery and external money management are important approaches in contemporary mental health care, but little research has been done on the relationship between the two or on the application of recovery principles to money management for people at risk of being assigned a representative payee or conservator. Out of 49 total qualitative interviews, 25 transcripts with persons receiving Social Security insurance or Social Security disability insurance who were at risk of being assigned a money manager were analyzed to assess the presence of recognized recovery themes. The recovery principles of self-direction and responsibility were strong themes in participant comments related to money management. Money management interventions should incorporate people's recovery-related motivations to acquire financial management skills as a means to direct and assume responsibility for their own finances. Staff involved in money management should receive training to support clients' recovery-related goals. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  5. Criminal history systems: new technology and new directions

    NASA Astrophysics Data System (ADS)

    Threatte, James

    1997-02-01

Many forces are driving states to improve their current Criminal History and On-Line Criminal Justice Information Systems. The predominant factors compelling this movement are (1) the deterioration and cost of supporting older legacy systems, (2) the current generation of high-performance, low-cost hardware and system software, and (3) funding programs, such as the National Criminal History Improvement Program, which are targeted specifically at improving these important systems. In early 1996, SAIC established an Internal Research and Development project devoted to Computerized Criminal History Systems (CCH). This project began with an assessment of current hardware, operating system, and relational database technology. Application software design and development approaches were then reviewed, with a focus on object-oriented approaches, three-tier client-server architectures, and tools that enable the `right sizing' of systems. An operational prototype of a state CCH system was established based on the results of these investigations.

  6. Application of adult attachment theory to group member transference and the group therapy process.

    PubMed

    Markin, Rayna D; Marmarosh, Cheri

    2010-03-01

    Although clinical researchers have applied attachment theory to client conceptualization and treatment in individual therapy, few researchers have applied this theory to group therapy. The purpose of this article is to begin to apply theory and research on adult dyadic and group attachment styles to our understanding of group dynamics and processes in adult therapy groups. In particular, we set forth theoretical propositions on how group members' attachment styles affect relationships within the group. Specifically, this article offers some predictions on how identifying group member dyadic and group attachment styles could help leaders predict member transference within the therapy group. Implications of group member attachment for the selection and composition of a group and the different group stages are discussed. Recommendations for group clinicians and researchers are offered. PsycINFO Database Record (c) 2010 APA, all rights reserved

  7. OCIS: 15 years' experience with patient-centered computing.

    PubMed

    Enterline, J P; Lenhard, R E; Blum, B I; Majidi, F M; Stuart, G J

    1994-01-01

    In the mid-1970s, the medical and administrative staff of the Oncology Center at Johns Hopkins Hospital recognized a need for a computer-based clinical decision-support system that organized patients' information according to the care continuum, rather than as a series of event-specific data. This is especially important in cancer patients, because of the long periods in which they receive complex medical treatment and the enormous amounts of data generated by extremely ill patients with multiple interrelated diseases. During development of the Oncology Clinical Information System (OCIS), it became apparent that administrative services, research systems, ancillary functions (such as drug and blood product ordering), and financial processes should be integrated with the basic patient-oriented database. With the structured approach used in applications development, new modules were added as the need for additional functions arose. The system has since been moved to a modern network environment with the capacity for client-server processing.

  8. Initial evaluation of psychometric properties of a structured work task application for the Assessment of Work Performance in a constructed environment.

    PubMed

    Karlsson, Elin A; Liedberg, Gunilla M; Sandqvist, Jan L

    2017-06-22

The Swedish Social Insurance Administration has developed a new assessment tool for sickness insurance. This study is part of the initial evaluation of the application, called the Assessment of Work Performance, Structured Activities, and focuses on evaluation of the psychometric properties of social validity, content validity, and utility. This was a qualitative study using semi-structured telephone interviews with occupational therapists. A convenience sample was used, and participants who fulfilled the inclusion criteria (n = 15) were interviewed. Data were analyzed using content analysis with a directed approach. The results indicate that the application provides valuable information and that it is socially valid. Assessors found the work tasks suitable for a diverse group of clients and reported that clients accepted the assessments. Improvements were suggested, for example, expanding the application with more work tasks. The instrument has benefits; however, further development is desired. The use of a constructed environment in assessments may be a necessary option to supplement a real environment, but depending on organizational factors such as time and other resources, the participants had different opportunities to do so. Further evaluations regarding ecological validity are essential to ensure that assessments are fair and realistic when using constructed environments. Implications for rehabilitation This study indicates that assessment in a constructed environment can provide a secure and protected context for clients being assessed. Psychometric evaluation is a never-ending process, and this assessment instrument needs further development; however, this initial evaluation provides guidance on the development of the instrument and on which studies to prioritize. It is important to evaluate social validity in order to ensure that clients and assessors perceive assessment methods as fair and meaningful. 
In this study, participants found the work tasks appropriate and usable when assessing their clients, but clients' perspectives must also be included in subsequent studies. This assessment instrument is the only activity-based assessment instrument within the Swedish Social Security Insurance. Psychometric evaluations are important since the instrument affects so many individuals in Sweden.

  9. A Responsive Client for Distributed Visualization

    NASA Astrophysics Data System (ADS)

    Bollig, E. F.; Jensen, P. A.; Erlebacher, G.; Yuen, D. A.; Momsen, A. R.

    2006-12-01

As grids, web services and distributed computing continue to gain popularity in the scientific community, demand for virtual laboratories likewise increases. Today organizations such as the Virtual Laboratory for Earth and Planetary Sciences (VLab) are dedicated to developing web-based portals to perform various simulations remotely while abstracting away details of the underlying computation. Two of the biggest challenges in portal-based computing are fast visualization and smooth interrogation without overtaxing client resources. In response to this challenge, we have expanded on our previous data storage strategy and thick-client visualization scheme [1] to develop a client-centric distributed application that utilizes remote visualization of large datasets and makes use of the local graphics processor for improved interactivity. Rather than waste precious client resources on visualization, a combination of 3D graphics and 2D server bitmaps is used to simulate the look and feel of local rendering. Java Web Start and Java Bindings for OpenGL enable install-on-demand functionality as well as low-level access to client graphics on all platforms. Powerful visualization services based on VTK and auto-generated by the WATT compiler [2] are accessible through a standard web API. Data is permanently stored on compute nodes while separate visualization nodes fetch data requested by clients, caching it locally to prevent unnecessary transfers. We will demonstrate application capabilities in the context of simulated charge density visualization within the VLab portal. In addition, we will address generalizations of our application to interact with a wider number of WATT services, as well as performance bottlenecks.
[1] Ananthuni, R., Karki, B.B., Bollig, E.F., da Silva, C.R.S., Erlebacher, G., "A Web-Based Visualization and Reposition Scheme for Scientific Data," In Press, Proceedings of the 2006 International Conference on Modeling Simulation and Visualization Methods (MSV'06) (2006). [2] Jensen, P.A., Yuen, D.A., Erlebacher, G., Bollig, E.F., Kigelman, D.G., Shukh, E.A., Automated Generation of Web Services for Visualization Toolkits, Eos Trans. AGU, 86(52), Fall Meet. Suppl., Abstract IN42A-06, 2005.
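The local-caching strategy the abstract mentions, where visualization nodes fetch client-requested data and keep a copy to avoid repeated transfers, reduces to a simple fetch-through cache. The class and names below are invented for illustration (the real system serves VTK-based services generated by the WATT compiler), but the caching logic is the same idea:

```python
# Illustrative fetch-through cache: the first request for a dataset triggers
# a remote transfer; later requests are served from the local copy.
class CachingFetcher:
    def __init__(self, fetch):
        self.fetch = fetch      # function: dataset id -> bytes (e.g. HTTP GET)
        self.cache = {}         # dataset id -> locally cached bytes
        self.transfers = 0      # number of actual remote transfers performed

    def get(self, dataset_id):
        if dataset_id not in self.cache:
            self.cache[dataset_id] = self.fetch(dataset_id)
            self.transfers += 1
        return self.cache[dataset_id]
```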

  10. Audio-based queries for video retrieval over Java enabled mobile devices

    NASA Astrophysics Data System (ADS)

    Ahmad, Iftikhar; Cheikh, Faouzi Alaya; Kiranyaz, Serkan; Gabbouj, Moncef

    2006-02-01

In this paper we propose a generic framework for efficient retrieval of audiovisual media based on its audio content. The framework is implemented in a client-server architecture where the client application is developed in Java to be platform independent, whereas the server application is implemented for the PC platform. The client application adapts to the characteristics of the mobile device where it runs, such as screen size and available commands. The entire framework is designed to take advantage of high-level segmentation and classification of audio content to improve the speed and accuracy of audio-based media retrieval. Therefore, the primary objective of this framework is to provide an adaptive basis for performing efficient video retrieval operations based on audio content and types (i.e. speech, music, fuzzy and silence). Experimental results confirm that such an audio-based video retrieval scheme can be used from mobile devices to search and retrieve video clips efficiently over wireless networks.

  11. Qualitative Assessment of the Feasibility, Usability, and Acceptability of a Mobile Client Data App for Community-Based Maternal, Neonatal, and Child Care in Rural Ghana.

    PubMed

    Rothstein, Jessica D; Jennings, Larissa; Moorthy, Anitha; Yang, Fan; Gee, Lisa; Romano, Karen; Hutchful, David; Labrique, Alain B; LeFevre, Amnesty E

    2016-01-01

    Mobile phone applications may enhance the delivery of critical health services and the accuracy of health service data. Yet, the opinions and experiences of frontline health workers on using mobile apps to track pregnant and recently delivered women are underreported. This evaluation qualitatively assessed the feasibility, usability, and acceptability of a mobile Client Data App for maternal, neonatal, and child client data management by community health nurses (CHNs) in rural Ghana. The mobile app enabled CHNs to enter, summarize, and query client data. It also sent visit reminders for clients and provided a mechanism to report level of care to district officers. Fourteen interviews and two focus groups with CHNs, midwives, and district health officers were conducted, coded, and thematically analyzed. Results indicated that the app was easily integrated into care, improved CHN productivity, and was acceptable due to its capacity to facilitate client follow-up, data reporting, and decision-making. However, the feasibility and usability of the app were hindered by high client volumes, staff shortages, and software and device challenges. Successful integration of mobile client data apps for frontline health workers in rural and resource-poor settings requires real-time monitoring, program investments, and targeted changes in human resources.

  12. Qualitative Assessment of the Feasibility, Usability, and Acceptability of a Mobile Client Data App for Community-Based Maternal, Neonatal, and Child Care in Rural Ghana

    PubMed Central

    Jennings, Larissa; Moorthy, Anitha; Yang, Fan; Gee, Lisa; Romano, Karen; Hutchful, David; Labrique, Alain B.; LeFevre, Amnesty E.

    2016-01-01

    Mobile phone applications may enhance the delivery of critical health services and the accuracy of health service data. Yet, the opinions and experiences of frontline health workers on using mobile apps to track pregnant and recently delivered women are underreported. This evaluation qualitatively assessed the feasibility, usability, and acceptability of a mobile Client Data App for maternal, neonatal, and child client data management by community health nurses (CHNs) in rural Ghana. The mobile app enabled CHNs to enter, summarize, and query client data. It also sent visit reminders for clients and provided a mechanism to report level of care to district officers. Fourteen interviews and two focus groups with CHNs, midwives, and district health officers were conducted, coded, and thematically analyzed. Results indicated that the app was easily integrated into care, improved CHN productivity, and was acceptable due to its capacity to facilitate client follow-up, data reporting, and decision-making. However, the feasibility and usability of the app were hindered by high client volumes, staff shortages, and software and device challenges. Successful integration of mobile client data apps for frontline health workers in rural and resource-poor settings requires real-time monitoring, program investments, and targeted changes in human resources. PMID:28070186

  13. The D3 Middleware Architecture

    NASA Technical Reports Server (NTRS)

    Walton, Joan; Filman, Robert E.; Korsmeyer, David J.; Lee, Diana D.; Mak, Ron; Patel, Tarang

    2002-01-01

DARWIN is a NASA-developed, Internet-based system for enabling aerospace researchers to securely and remotely access and collaborate on the analysis of aerospace vehicle design data, primarily the results of wind-tunnel testing and numeric (e.g., computational fluid-dynamics) model executions. DARWIN captures, stores and indexes data; manages derived knowledge (such as visualizations across multiple datasets); and provides an environment for designers to collaborate in the analysis of test results. DARWIN is an interesting application because it supports high volumes of data, integrates multiple modalities of data display (e.g., images and data visualizations), and provides non-trivial access control mechanisms. DARWIN enables collaboration by allowing users to share not only visualizations of data but also commentary about, and views of, the data. Here we provide an overview of the architecture of D3, the third generation of DARWIN. Earlier versions of DARWIN were characterized by browser-based interfaces and a hodge-podge of server technologies: CGI scripts, applets, PERL, and so forth. But browsers proved difficult to control, and a proliferation of computational mechanisms proved inefficient and difficult to maintain. D3 substitutes a pure-Java approach for that medley: a Java client communicates (through RMI over HTTPS) with a Java-based application server. Code on the server accesses information from JDBC databases, distributed LDAP security services, and a collaborative information system. D3 is a three-tier architecture, but unlike "E-commerce" applications, the data usage pattern suggests different strategies than traditional Enterprise Java Beans: we need to move volumes of related data together, considerable processing happens on the client, and the "business logic" on the server side is primarily data integration and collaboration.
With D3, we are extending DARWIN to handle other data domains and to be a distributed system, where a single login allows a user transparent access to test results from multiple servers and authority domains.

  14. Web-based metabolic network visualization with a zooming user interface

    PubMed Central

    2011-01-01

Background Displaying complex metabolic-map diagrams in Web browsers, and allowing users to interact with them to query them and overlay expression data, is challenging. Description We present a Web-based metabolic-map diagram, which can be interactively explored by the user, called the Cellular Overview. The main characteristic of this application is its zooming user interface, which enables the user to focus on appropriate granularities of the network at will. Various search commands are available to visually highlight sets of reactions, pathways, enzymes, metabolites, and so on. Expression data from single or multiple experiments can be overlaid on the diagram, which we call the Omics Viewer capability. The application provides Web services to highlight the diagram and to invoke the Omics Viewer. The application is written entirely in JavaScript for the client browsers and connects to a Pathway Tools Web server to retrieve data and diagrams. It uses the OpenLayers library to display tiled diagrams. Conclusions This new online tool is capable of displaying large and complex metabolic-map diagrams in a very interactive manner. The application is available as part of the Pathway Tools software that powers multiple metabolic databases, including Biocyc.org, where the Cellular Overview is accessible under the Tools menu. PMID:21595965

  15. 78 FR 45246 - Office of Clinical and Preventive Services National HIV Program: Enhanced HIV/AIDS Screening and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-26

    ... Prevention (CDC) guidelines, provide pre- and post-test counseling (when indicated), and developing or... by applicable law. Test at least one previously untested (not tested in the prior five years) patient... ensure that clients receive their test results, particularly clients who test positive. ii. Describe how...

  16. CADB: Conformation Angles DataBase of proteins

    PubMed Central

    Sheik, S. S.; Ananthalakshmi, P.; Bhargavi, G. Ramya; Sekar, K.

    2003-01-01

    Conformation Angles DataBase (CADB) provides an online resource for data on the conformation angles (both main-chain and side-chain) of protein structures, in two data sets corresponding to 25% and 90% sequence identity between any two proteins available in the Protein Data Bank. In addition, the database contains the necessary crystallographic parameters. The package has several flexible options and display facilities to visualize the main-chain and side-chain conformation angles for a particular amino acid residue. The package can also be used to study the interrelationship between the main-chain and side-chain conformation angles. A Web-based Java graphics interface displays the requested information on the client machine. The database is updated at regular intervals and can be accessed over the World Wide Web at the following URL: http://144.16.71.148/cadb/. PMID:12520049

  17. DNAAlignEditor: DNA alignment editor tool

    PubMed Central

    Sanchez-Villeda, Hector; Schroeder, Steven; Flint-Garcia, Sherry; Guill, Katherine E; Yamasaki, Masanori; McMullen, Michael D

    2008-01-01

    Background With advances in DNA re-sequencing methods and next-generation parallel sequencing approaches, there has been a large increase in genomic efforts to define and analyze the sequence variability present among individuals within a species. For highly polymorphic species such as maize, this has led to a need for intuitive, user-friendly software that aids the biologist, often with little programming experience, in tracking, editing, displaying, and exporting multiple individual sequence alignments. To fill this need we have developed a novel DNA alignment editor. Results We have generated a nucleotide sequence alignment editor (DNAAlignEditor) that provides an intuitive, user-friendly interface for manual editing of multiple sequence alignments, with functions for input, editing, and output of sequence alignments. The color-coding of nucleotide identity and the display of associated quality scores aid in the manual alignment editing process. DNAAlignEditor works as a client/server tool with two main components: a relational database that collects the processed alignments and a user interface connected to the database through universal data access connectivity drivers. DNAAlignEditor can be used either as a stand-alone application or as a network application with multiple users concurrently connected. Conclusion We anticipate that this software will be of general interest to biologists and population geneticists editing DNA sequence alignments and analyzing natural sequence variation regardless of species, and will be particularly useful for manual alignment editing of sequences in species with high levels of polymorphism. PMID:18366684

  18. Parallel image registration with a thin client interface

    NASA Astrophysics Data System (ADS)

    Saiprasad, Ganesh; Lo, Yi-Jung; Plishker, William; Lei, Peng; Ahmad, Tabassum; Shekhar, Raj

    2010-03-01

    Despite its high significance, the clinical utilization of image registration remains limited because of its lengthy execution time and a lack of easy access. The focus of this work was twofold. First, we accelerated our coarse-to-fine, volume subdivision-based image registration algorithm by a novel parallel implementation that maintains the accuracy of our uniprocessor implementation. Second, we developed a thin-client computing model with a user-friendly interface to perform rigid and nonrigid image registration. Our parallel implementation uses the message passing interface (MPI) model on a 32-core cluster. The results show that, compared with the uniprocessor implementation, the parallel implementation of our image registration algorithm is approximately 5 times faster for rigid registration and approximately 9 times faster for nonrigid registration for the images used. To test the viability of such systems for clinical use, we developed a thin client in the form of a plug-in for OsiriX, a well-known open source PACS workstation and DICOM viewer, and used it for two applications. The first application registered baseline and follow-up MR brain images, whose subtraction was used to track progression of multiple sclerosis. The second application registered pretreatment PET and intratreatment CT of radiofrequency ablation patients to demonstrate a new capability of multimodality imaging guidance. The registration acceleration coupled with the remote implementation using a thin client should ultimately increase the accuracy, speed, and accessibility of image registration-based interpretations in a number of diagnostic and interventional applications.

  19. Shared patients: multiple health and social care contact.

    PubMed

    Keene, J; Swift, L; Bailey, S; Janacek, G

    2001-07-01

    The paper describes results from the 'Tracking Project', a new method for examining agency overlap, repeat service use and shared clients/patients among social and health care agencies in the community. This is the first project in this country to combine total population databases from a range of social, health care and criminal justice agencies into a multidisciplinary database for one county (n = 97,162 cases), through standardised anonymisation of the agency databases using SOUNDEX software. A range of 20 community social and health care agencies were shown to have a large overlap with each other over a two-year period, indicating high proportions of shared patients/clients. Accident and Emergency is used as an example of major overlap: 16.2% (n = 39,992) of persons who attended a community agency had attended Accident and Emergency, as compared to 8.2% (n = 775,000) of the total population of the county. Of these, 96% who had attended seven or more different community agencies had also attended Accident and Emergency. Further statistical analysis of Accident and Emergency attendance as a characteristic of community agency populations (n = 39,992) revealed that increasing frequency of attendance at Accident and Emergency was very strongly associated with increasing use of other services. That is, patients who repeatedly attend Accident and Emergency are much more likely to use a greater number of other agencies, suggesting that these agencies share the more problematic or difficult patients. Research questions arising from these data are discussed, and future research methods are suggested for deriving predictors from the database and developing screening instruments to identify multiple agency attenders for targeting or multidisciplinary working. It is suggested that Accident and Emergency attendance might serve as an important predictor of multiple agency attendance.
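    The standardised anonymisation step above rests on Soundex, a phonetic code that maps similar-sounding names to the same short key so records can be matched without storing the names themselves. The record does not describe the project's actual matching software, so as an illustration only, here is a minimal sketch of the standard Soundex algorithm in Python:

```python
def soundex(name: str) -> str:
    """Standard four-character Soundex code (first letter + three digits).

    Consonants map to digit classes; vowels separate runs; H and W are
    transparent. This is the textbook algorithm, not the project's code.
    """
    codes = {c: d for d, letters in enumerate(
        ["BFPV", "CGJKQSXZ", "DT", "L", "MN", "R"], start=1) for c in letters}
    name = "".join(c for c in name.upper() if c.isalpha())
    if not name:
        return ""
    first, tail = name[0], name[1:]
    digits = []
    prev = codes.get(first)          # the first letter's class also suppresses repeats
    for c in tail:
        d = codes.get(c)
        if d is None:
            if c not in "HW":        # vowels (and Y) break a run; H/W do not
                prev = None
            continue
        if d != prev:                # skip adjacent letters of the same class
            digits.append(str(d))
        prev = d
    return (first + "".join(digits) + "000")[:4]
```

Because "Robert" and "Rupert" both encode to R163, two agency databases can be linked on the codes alone, which is the anonymisation property the project exploits.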

  20. Identifying and integrating helpful and harmful religious beliefs into psychotherapy.

    PubMed

    Rosenfield, George W

    2010-12-01

    The 2 main roles of the psychotherapist involve identifying and understanding the client's problems/strengths and treating problems. Suggestions are offered to guide addressing or avoiding religious beliefs in both roles. Types of religious beliefs that contribute to distress, particularly for youth, are identified and treatment options are offered. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  1. Standards for Data Exchange and Case Management Information Systems in Support of Comprehensive Integrated School-Linked Services. Version 2.0.

    ERIC Educational Resources Information Center

    Far West Lab. for Educational Research and Development, San Francisco, CA.

    This report is intended as a guide for local comprehensive integrated school-linked services sites and software vendors in developing and implementing case management information systems for the exchange and management of client data. The report is also intended to influence new development and future revisions of data systems, databases, and…

  2. Predicting Host Level Reachability via Static Analysis of Routing Protocol Configuration

    DTIC Science & Technology

    2007-09-01

    check_function_bodies = false; SET client_min_messages = warning; -- -- Name: SCHEMA public; Type: COMMENT; Schema: -; Owner: postgres -- COMMENT...public; Owner: mcmanst -- -- -- Name: public; Type: ACL; Schema: -; Owner: postgres -- REVOKE ALL ON SCHEMA public FROM PUBLIC; REVOKE...ALL ON SCHEMA public FROM postgres ; GRANT ALL ON SCHEMA public TO postgres ; GRANT ALL ON SCHEMA public TO PUBLIC; -- -- PostgreSQL database

  3. OPserver: opacities and radiative accelerations on demand

    NASA Astrophysics Data System (ADS)

    Mendoza, C.; González, J.; Seaton, M. J.; Buerger, P.; Bellorín, A.; Meléndez, M.; Rodríguez, L. S.; Delahaye, F.; Zeippen, C. J.; Palacios, E.; Pradhan, A. K.

    2009-05-01

    We report on developments carried out within the Opacity Project (OP) to upgrade atomic database services to comply with e-infrastructure requirements. We give a detailed description of an interactive, online server for astrophysical opacities, referred to as OPserver, to be used in sophisticated stellar modelling where Rosseland mean opacities and radiative accelerations are computed at every depth point and each evolution cycle. This is crucial, for instance, in chemically peculiar stars and in the exploitation of the new asteroseismological data. OPserver, downloadable with the new OPCD_3.0 release from the Centre de Données Astronomiques de Strasbourg, France, computes mean opacities and radiative data for arbitrary chemical mixtures from the OP monochromatic opacities. It is essentially a client-server network restructuring and optimization of the suite of codes included in the earlier OPCD_2.0 release. The server can be installed locally or, alternatively, accessed remotely from the Ohio Supercomputer Center, Columbus, Ohio, USA. The client is an interactive web page or a subroutine library that can be linked to the user code. The suitability of this scheme in grid computing environments is emphasized, and its extension to other atomic database services for astrophysical purposes is discussed.

  4. Footprint Database and web services for the Herschel space observatory

    NASA Astrophysics Data System (ADS)

    Verebélyi, Erika; Dobos, László; Kiss, Csaba

    2015-08-01

    Using all telemetry and observational meta-data, we created a searchable database of Herschel observation footprints. Data from the Herschel space observatory is freely available for everyone but no uniformly processed catalog of all observations has been published yet. As a first step, we unified the data model for all three Herschel instruments in all observation modes and compiled a database of sky coverage information. As opposed to methods using a pixellation of the sphere, in our database, sky coverage is stored in exact geometric form allowing for precise area calculations. Indexing of the footprints allows for very fast search among observations based on pointing, time, sky coverage overlap and meta-data. This enables us, for example, to find moving objects easily in Herschel fields. The database is accessible via a web site and also as a set of REST web service functions which makes it usable from program clients like Python or IDL scripts. Data is available in various formats including Virtual Observatory standards.
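    The advantage of storing footprints in exact geometric form, as described above, is that areas and containment can be computed precisely rather than approximated on a pixel grid. As an illustration only (the Herschel database uses spherical polygons and spatial indexing not shown here), a footprint held as a planar vertex list supports both operations in a few lines:

```python
def polygon_area(vertices):
    """Exact area of a simple planar polygon via the shoelace formula.
    Small footprints can be treated as planar to good accuracy."""
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def contains(vertices, point):
    """Ray-casting point-in-polygon test: does a footprint cover a pointing?"""
    x, y = point
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # Count edge crossings of a horizontal ray cast from the point
        if (y1 > y) != (y2 > y) and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside
```

A pixellation-based scheme could only answer these queries to the resolution of its grid; the exact representation answers them to floating-point precision.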

  5. Therapeutic Alliance: A Concept for the Childbearing Season

    PubMed Central

    Doherty, Mary Ellen

    2009-01-01

    This analysis was conducted to describe the concept of therapeutic alliance and its appropriateness for health-care provider-client interactions during the childbearing season. The concept has been defined in other disciplines. A universal definition suggested a merging of efforts directed toward health. A simple and concise definition evolved, which is applicable to the childbearing season as well as to health-care encounters across the life span. This definition states: Therapeutic alliance is a process within a health-care provider-client interaction that is initiated by an identified need for positive client health-care behaviors, whereby both parties work together toward this goal with consideration of the client's current health status and developmental stage within the life span. PMID:20514120

  6. Web-client based distributed generalization and geoprocessing

    USGS Publications Warehouse

    Wolf, E.B.; Howe, K.

    2009-01-01

    Generalization and geoprocessing operations on geospatial information were once the domain of complex software running on high-performance workstations. Currently, these computationally intensive processes are the domain of desktop applications. Recent efforts have been made to move geoprocessing operations server-side in a distributed, web-accessible environment. This paper initiates research into portable client-side generalization and geoprocessing operations as part of a larger effort in user-centered design for the US Geological Survey's The National Map. An implementation of the Ramer-Douglas-Peucker (RDP) line simplification algorithm was created in the open source OpenLayers geoweb client. This algorithm implementation was benchmarked using differing data structures and browser platforms. The implementation and results of the benchmarks are discussed in the general context of client-side geoprocessing.
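    For readers unfamiliar with it, the RDP algorithm benchmarked above keeps a polyline's endpoints and recursively retains the interior point farthest from the chord whenever that distance exceeds a tolerance epsilon. The paper's implementation was JavaScript inside OpenLayers; this is an independent minimal sketch of the same algorithm in Python:

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker line simplification (recursive form).

    points: list of (x, y) tuples; epsilon: maximum allowed perpendicular
    deviation of a dropped point from the simplified line.
    """
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy)
    # Find the interior point farthest from the chord between the endpoints
    best_i, best_d = 0, 0.0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        if norm == 0.0:  # degenerate chord: endpoints coincide
            d = math.hypot(px - x1, py - y1)
        else:
            d = abs(dy * (px - x1) - dx * (py - y1)) / norm
        if d > best_d:
            best_i, best_d = i, d
    if best_d <= epsilon:
        return [points[0], points[-1]]      # everything within tolerance: drop interior
    left = rdp(points[:best_i + 1], epsilon)
    right = rdp(points[best_i:], epsilon)
    return left[:-1] + right                # splice, avoiding the duplicated split point
```

The benchmarking question in the paper (differing data structures and browsers) matters because this recursion touches every vertex at every level, so the cost of point access dominates in a client-side setting.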

  7. Evaluation of primary care midwifery in the Netherlands: design and rationale of a dynamic cohort study (DELIVER)

    PubMed Central

    2012-01-01

    Background In the Netherlands, midwives are autonomous medical practitioners, and 78% of pregnant women start their maternity care with a primary care midwife. Scientific research to support evidence-based practice in primary care midwifery in the Netherlands has been sparse. This paper describes the research design and methodology of the multicenter, multidisciplinary, prospective DELIVER study, the first large-scale study evaluating the quality and provision of primary midwifery care. Methods/Design Between September 2009 and April 2011, data were collected from clients and their partners, midwives and other healthcare professionals across the Netherlands. Clients from twenty midwifery practices received up to three questionnaires to assess their expectations and experiences (e.g. quality of care, prenatal screening, emotions, health, and lifestyle). These client data were linked to data from the Netherlands Perinatal Register and electronic client records kept by midwives. Midwives and practice assistants from the twenty participating practices recorded work-related activities in a diary for one week, to assess workload. In addition, the midwives were asked to complete a questionnaire, to gain insight into their collaboration with other care providers, their tasks and attitude towards their job, and the quality of the care they provide. Another questionnaire was sent to all Dutch midwifery practices to gather information regarding the organisation of midwifery practices, provision of preconception care, collaboration with other care providers, and provision of care to ethnic minorities. Data at client, midwife and practice level can be linked. Additionally, partners of pregnant women and other care providers were asked about their expectations and experiences regarding the care delivered by midwives, and in six practices client consults were videotaped to objectively assess daily practice. 
Discussion In total, 7685 clients completed at least one questionnaire, 136 midwives and assistants completed a diary with work-related activities (response 100%), 99 midwives completed a questionnaire (92%), and 319 practices across the country completed a questionnaire (61%), 30 partners of clients participated in focus groups, 21 other care providers were interviewed and 305 consults at six midwifery practices were videotaped. The multicenter DELIVER study provides an extensive database with national representative data on the quality of primary care midwifery in the Netherlands. This study will support evidence-based practice in primary care midwifery in the Netherlands and contribute to a better understanding of the maternity care system. PMID:22433820

  8. Relational-database model for improving quality assurance and process control in a composite manufacturing environment

    NASA Astrophysics Data System (ADS)

    Gentry, Jeffery D.

    2000-05-01

    A relational database is a powerful tool for collecting and analyzing the vast amounts of interrelated data associated with the manufacture of composite materials. A relational database contains many individual tables that store data related in some fashion. Manufacturing process variables as well as quality assurance measurements can be collected and stored in database tables indexed by lot number, part type or individual serial number. Relationships between manufacturing processes and product quality can then be correlated over a wide range of product types and process variations. This paper presents details on how relational databases are used to collect, store, and analyze process variables and quality assurance data associated with the manufacture of advanced composite materials. Important considerations are covered, including how the various types of data are organized and how relationships between the data are defined. Employing relational database techniques to establish correlative relationships between process variables and quality assurance measurements is then explored. Finally, the benefits of database techniques such as data warehousing, data mining and web-based client/server architectures are discussed in the context of composite material manufacturing.
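    The lot-number indexing scheme described above can be sketched in a few lines of SQL. The table and column names below are hypothetical illustrations, not taken from the paper; the point is that a join on the shared lot number lets a single query correlate a process variable with a quality measurement across many lots:

```python
import sqlite3

# In-memory sketch: process variables and QA measurements live in
# separate tables, related through the lot number.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE process (lot_no TEXT PRIMARY KEY, cure_temp_c REAL);
CREATE TABLE qa (lot_no TEXT REFERENCES process(lot_no), tensile_mpa REAL);
""")
conn.executemany("INSERT INTO process VALUES (?, ?)",
                 [("L001", 177.0), ("L002", 182.5)])
conn.executemany("INSERT INTO qa VALUES (?, ?)",
                 [("L001", 610.0), ("L001", 602.0), ("L002", 588.5)])

# Correlate a process variable with average quality outcome, per lot
rows = conn.execute("""
    SELECT p.lot_no, p.cure_temp_c, AVG(q.tensile_mpa) AS avg_tensile
    FROM process p JOIN qa q ON q.lot_no = p.lot_no
    GROUP BY p.lot_no ORDER BY p.lot_no
""").fetchall()
```

Because each measurement row carries only a lot number rather than a copy of the process data, the same schema scales from one part type to many without duplication, which is the normalization benefit the paper exploits.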

  9. Key competencies of the psychodynamic psychotherapist and how to teach them in supervision.

    PubMed

    Sarnat, Joan

    2010-03-01

    Four of Rodolfa et al.'s (2005) competencies in professional psychology (relationship, self-reflection, assessment-case conceptualization, and intervention) are key for the psychodynamic psychotherapist. Relationship lies at the heart of what is understood to be curative about psychodynamic psychotherapy. Self-reflection implies a complex and highly developed process that includes but goes beyond Rodolfa et al.'s and Kaslow, Dunn, and Smith's (2008) definitions. Competent assessment, diagnosis, and case conceptualization entail making inferences about unconscious processes by observing the client and also one's own experience, and integrating these inferences with theory. Effective psychodynamic intervention is derived from what the psychotherapist has experienced, processed, and conceptualized about the relationship with the client and about the client's internal object world. An extended vignette shows these competencies emerging in a psychotherapist-in-training, facilitated by an intense interaction with a supervisor. Although the supervisory and clinical tasks are different, the supervisor provides a relationship experience that models these same competencies for the supervisee and catalyzes their development in the supervisee. PsycINFO Database Record (c) 2010 APA, all rights reserved

  10. Computerized commodity management system in Thailand and Brazil.

    PubMed

    1984-01-01

    Thailand's National Family Planning Program is testing a computerized contraceptive commodity reporting and management system in 4 provinces with 104 National Family Planning Program (NFPP) reporting entities. Staff in the Brazilian Association of Family Planning Entities (ABEPF) and CPAIMC, a major family planning service agency, have been trained in the use of a computerized commodity distribution management system and are ready to initiate test use. The systems were designed in response to the specific commodity management needs of the organizations concerned. Neither distribution program functions as a contraceptive social marketing (CSM) program, but each system reviewed has aspects that are relevant to CSM commodity management needs. Both the Thai and Brazilian systems were designed to be as automatic and user-friendly as possible. Both have 3 main databases and perform similar management and reporting functions. Differing program configurations and basic data forms reflect the specific purposes of each system. The databases for the logistics monitoring system in Thailand are: the reporting entity (or ID) file; the current month's data file; and the master balance file. The data source is the basic reporting form that also serves as a Request and Issue Voucher for commodities. Editing functions in the program check that the current "beginning balance" equals the previous month's ending balance. Indexing functions in the system allow direct access to the records of any reporting entity via the ID number, as well as sequential processing of records by ID number. 6 reports can be generated: status report by issuing entity; status report by dispensing entity; aggregate status report; out-of-compliance products report; out-of-compliance outlets report; and suggested shipment to regional warehouse report. The databases for the distribution management system in Brazil are: the name-ID (client institution) file; the product file; and the data file. 
The data source is an order form that contains a client code similar to the code used in Thailand. An interrogative data entry program enhances the management function of the system. 8 reports can be individually issued: a status report on back orders by product; a status report on back orders by institution and product; a historical report of year to date shipments and value by product; a historical report of year to date shipments by client and product; year to date payment reports from each client; outstanding invoices by month for the previous 12 months; a product report showing the amount of each product or order with outstanding invoices; and a stock position report.

  11. Multi-board kernel communication using socket programming for embedded applications

    NASA Astrophysics Data System (ADS)

    Mishra, Ashish; Girdhar, Neha; Krishnia, Nikita

    2016-03-01

    In large application projects there is often a need to communicate between two different processors or two different kernels. The aim of this paper is to communicate between two different kernels using an efficient method. The TCP/IP protocol is implemented to communicate between two boards via the Ethernet port, using the lwIP (lightweight IP) stack, a smaller independent implementation of the TCP/IP stack suitable for use in embedded systems. While retaining TCP/IP functionality, the lwIP stack reduces both memory use and code size. In this communication setup we use a Raspberry Pi as an active client and a field-programmable gate array (FPGA) board as a passive server, and they communicate via Ethernet. Three applications based on TCP/IP client-server network communication have been implemented. The Echo server application is used to communicate between the kernels of the two boards. Socket programming is used as it is independent of the platform and programming language. TCP transmit and receive throughput test applications measure the maximum throughput of data transmission. These applications are based on communication with an open source tool called iperf, which measures the throughput transmission rate by sending or receiving a constant piece of data to the client or server according to the test application.
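    The echo exchange described above follows the ordinary TCP socket pattern regardless of stack. As an illustration only, using Python over the loopback interface rather than the paper's lwIP-on-FPGA and Raspberry Pi setup, a passive server accepts a connection and returns the client's payload unchanged:

```python
import socket
import threading

def echo_server(sock):
    """Passive side: accept one connection, echo one payload back."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)  # return the payload unchanged

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=echo_server, args=(server,))
t.start()

# Active side: connect, send a probe, read the echo
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"ping")
    reply = client.recv(1024)
t.join()
server.close()
```

An embedded lwIP server plays the same passive role (bind, listen, accept, echo); only the API surface differs, which is why an echo application is a convenient first test of kernel-to-kernel communication.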

  12. Perceived sources of change in trainees' self-efficacy beliefs.

    PubMed

    Lent, Robert W; Cinamon, Rachel Gali; Bryan, Nicole A; Jezzi, Matthew M; Martin, Helena M; Lim, Robert

    2009-09-01

    Thought-listing procedures were used to examine the perceived incidence, size, direction, and bases of change in the session-level self-efficacy of therapists in training. Ninety-eight Master's-level trainees completed a cognitive assessment task immediately after each session with a client in their first practicum. Participants typically reported modest-sized, positive changes in their therapeutic self-efficacy at each session. Seven perceived sources of change in self-efficacy were identified. Some of these sources (e.g., trainees' performance evaluations, affective reactions) were consistent with general self-efficacy theory; others reflected the interpersonal performance context of therapy (e.g., perceptions of the therapeutic relationship and client behavior). Implications of the findings for training and future research on therapist development are considered. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  13. The Relationship of Community-based Nurse Care Coordination to Costs in the Medicare and Medicaid Programs

    PubMed Central

    Marek, Karen Dorman; Adams, Scott J.; Stetzer, Frank; Popejoy, Lori; Rantz, Marilyn

    2011-01-01

    The purpose of this evaluation was to study the relationship of nurse care coordination (NCC) to the costs of Medicare and Medicaid in a community-based care program called Missouri Care Options (MCO). A retrospective cohort design was used comparing 57 MCO clients with NCC to 80 MCO clients without NCC. Total cost was measured using Medicare and Medicaid claims databases. Fixed effects analysis was used to estimate the relationship of the NCC intervention to costs. Controlling for high resource use on admission, monthly Medicare costs were lower ($686) in the 12 months of NCC intervention (p =.04), while Medicaid costs were higher ($203; p=.03) for the NCC group when compared to the costs of the MCO group without NCC. PMID:20499393

  14. Is low therapist empathy toxic?

    PubMed

    Moyers, Theresa B; Miller, William R

    2013-09-01

    One of the largest determinants of client outcomes is the counselor who provides treatment. Therapists often vary widely in effectiveness, even when delivering standardized manual-guided treatment. In particular, the therapeutic skill of accurate empathy originally described by Carl Rogers has been found to account for a meaningful proportion of variance in therapeutic alliance and in addiction treatment outcomes. High-empathy counselors appear to have higher success rates regardless of theoretical orientation. Low-empathy and confrontational counseling, in contrast, has been associated with higher drop-out and relapse rates, weaker therapeutic alliance, and less client change. The authors propose emphasis on empathic listening skills as an evidence-based practice in the hiring and training of counselors to improve outcomes and prevent harm in addiction treatment. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  15. System and Method for Providing a Climate Data Analytic Services Application Programming Interface Distribution Package

    NASA Technical Reports Server (NTRS)

    Tamkin, Glenn S. (Inventor); Duffy, Daniel Q. (Inventor); Schnase, John L. (Inventor)

    2016-01-01

    A system, method and computer-readable storage devices for providing a climate data analytic services application programming interface distribution package. The example system can provide various components. The system provides a climate data analytic services application programming interface library that enables software applications running on a client device to invoke the capabilities of a climate data analytic service. The system provides a command-line interface that provides a means of interacting with a climate data analytic service by issuing commands directly to the system's server interface. The system provides sample programs that call on the capabilities of the application programming interface library and can be used as templates for the construction of new client applications. The system can also provide test utilities, build utilities, service integration utilities, and documentation.

  16. Facilitating neurorehabilitation through principles of engagement.

    PubMed

    Danzl, Megan M; Etter, Nicole M; Andreatta, Richard D; Kitzman, Patrick H

    2012-01-01

    A primary goal of neurorehabilitation is to guide recovery of functional skills after injury through evidence-based interventions that operate to manipulate the sensorimotor environment of the client. While choice of intervention is an important decision for clinicians, we contend it is only one part of producing optimal activity-dependent neuroplastic changes. A key variable in the rehabilitation equation is engagement. Applying principles of engagement may yield greater neuroplastic changes and functional outcomes for clients. We review the principles of neuroplasticity and engagement and their potential linkage through concepts of attention and motivation and strategies such as mental practice and enriched environments. Clinical applications and challenges for enhancing engagement during rehabilitation are presented. Engagement strategies, such as building trust and rapport, motivational interviewing, enhancing the client education process, and interventions that empower clients, are reviewed. Well-controlled research is needed to test our theoretical framework and suggested outcomes. Clinicians may enhance engagement by investing time and energy in the growth and development of the therapeutic relationship with clients, as this is paramount to maintaining clients' investment in continuing therapy and also may act as a driver of neuroplastic changes.

  17. Performance assessments of Android-powered military applications operating on tactical handheld devices

    NASA Astrophysics Data System (ADS)

    Weiss, Brian A.; Fronczek, Lisa; Morse, Emile; Kootbally, Zeid; Schlenoff, Craig

    2013-05-01

    Transformative Apps (TransApps) is a Defense Advanced Research Projects Agency (DARPA) funded program whose goal is to develop a range of militarily-relevant software applications ("apps") to enhance the operational-effectiveness of military personnel on (and off) the battlefield. TransApps is also developing a military apps marketplace to facilitate rapid development and dissemination of applications to address user needs by connecting engaged communities of end-users with development groups. The National Institute of Standards and Technology's (NIST) role in the TransApps program is to design and implement evaluation procedures to assess the performance of: 1) the various software applications, 2) software-hardware interactions, and 3) the supporting online application marketplace. Specifically, NIST is responsible for evaluating 50+ tactically-relevant applications operating on numerous Android™-powered platforms. NIST efforts include functional regression testing and quantitative performance testing. This paper discusses the evaluation methodologies employed to assess the performance of three key program elements: 1) handheld-based applications and their integration with various hardware platforms, 2) client-based applications and 3) network technologies operating on both the handheld and client systems along with their integration into the application marketplace. Handheld-based applications are assessed using a combination of utility and usability-based checklists and quantitative performance tests. Client-based applications are assessed to replicate current overseas disconnected (i.e. no network connectivity between handhelds) operations and to assess connected operations envisioned for later use. Finally, networked applications are assessed on handhelds to establish baselines of performance for when connectivity will be common usage.

  18. Earthdata Search: Scaling, Assessing and Improving Relevancy

    NASA Technical Reports Server (NTRS)

    Reese, Mark

    2016-01-01

    NASA's Earthdata Search (https://search.earthdata.nasa.gov) application allows users to search, discover, visualize, and access NASA and international interagency data about the Earth. As a client to NASA's Common Metadata Repository (CMR), its catalog of data collections grew 700% in late 2015. This massive expansion brought improved search and discovery to the forefront of the client's usability needs. During this talk, we will give a brief overview of the application, the challenges that arose during this period of growth, the metrics-driven way we addressed them, and the latest outcomes.

  19. Nursing record systems: effects on nursing practice and health care outcomes.

    PubMed

    Currell, R; Wainwright, P; Urquhart, C

    2000-01-01

    A nursing record system is the record of care planned and/or given to individual patients/clients by qualified nurses or other caregivers under the direction of a qualified nurse. Nursing record systems may be an effective way of influencing nursing practice. To assess the effects of nursing record systems on nursing practice and patient outcomes. We searched The Cochrane Library, MEDLINE, CINAHL, SIGLE, and databases of the Royal College of Nursing, King's Fund, the NHS Centre for Reviews and Dissemination, and the Institute of Electrical Engineers up to August 1999; and OCLC First Search, Department of Health database, NHS Register of Computer Applications and the Health Visitors' Association database up to the end of 1995. We hand searched the Journal of Nursing Administration (1971-1999), Computers in Nursing (1984-1999), Information Technology in Nursing (1989-1999) and reference lists of articles. We also hand searched the major health informatics conference proceedings. We contacted experts in the field of nursing informatics, suppliers of nursing computer systems, and relevant Internet groups. Randomised trials, controlled before-and-after studies and interrupted time series comparing one kind of nursing record system with another, in hospital, community or primary care settings. The participants were qualified nurses, students or health care assistants working under the direction of a qualified nurse and patients receiving care recorded and/or planned using nursing record systems. Two reviewers independently assessed trial quality and extracted data. Six trials involving 1407 people were included. In three studies of client-held records, there were no overall positive or negative effects, although some administrative benefits through fewer missing notes were suggested. A paediatric pain management sheet study showed a positive effect on the children's pain intensity. A computerised nursing care planning study showed a negative effect on documented nursing care planning. A controlled before-and-after study of two paper nursing record systems showed improvement in meeting documentation standards. No evidence was found of effects on practice attributable to changes in record systems. Although there is a paucity of studies of sufficient methodological rigour to yield reliable results in this area, it is clear from the literature that it is possible to set up the randomised trials or other quasi-experimental designs needed to produce evidence for practice. The research undertaken so far may have suffered from both methodological problems and faulty hypotheses.

  20. The Timing and Accumulation of Judicial Sanctions among Drug Court Clients

    ERIC Educational Resources Information Center

    McRee, Nick; Drapela, Laurie A.

    2012-01-01

    Judicial sanctions are used by drug courts to encourage clients to comply with program requirements. However, few studies have explored the application of sanctions in drug courts or the relationship between sanctions and drug court graduation. This article reports the results of a study of sanctions as applied in a drug court in southwest…

  1. Comparing maternal child health problems and outcomes across public health nursing agencies.

    PubMed

    Monsen, Karen A; Fulkerson, Jayne A; Lytton, Amy B; Taft, Lila L; Schwichtenberg, Linda D; Martin, Karen S

    2010-05-01

    To use aggregated data from health informatics systems to identify needs of maternal and child health (MCH) clients served by county public health agencies and to demonstrate outcomes of services provided. Participating agencies developed and implemented a formal standardized classification data comparison process using structured Omaha System data. An exploratory descriptive analysis of the data was performed. Summary reports of aggregated and analyzed data from records of clients served and discharged in 2005 were compared. Client problems and outcomes were found to be similar across agencies, with behavioral, psychosocial, environmental and physiological problems identified and addressed. Differential improvement was noted by problem, outcome measure, and agency; and areas for enhancing intervention strategies were prioritized. The problems showing the greatest improvement across agencies were Antepartum/postpartum and Family planning; those showing the least improvement were Neglect and Substance use. Findings demonstrated that public health nurses address many serious health-related problems with low-income, high-risk MCH clients. MCH client needs were found to be similar across agencies. Public health nurse home visiting services addressed important health issues with MCH clients, and statistically significant improvement in client health problems occurred consistently across agencies. The data comparison processes developed in this project were useful for MCH programs, and may be applicable to other program areas using structured client data for evaluation purposes. Using informatics tools and data facilitated needs assessment, program evaluation, and outcomes management processes for the agencies, and will continue to play an integral role in directing practice and improving client outcomes.

  2. ERDDAP - An Easier Way for Diverse Clients to Access Scientific Data From Diverse Sources

    NASA Astrophysics Data System (ADS)

    Mendelssohn, R.; Simons, R. A.

    2008-12-01

    ERDDAP is a new open-source, web-based service that aggregates data from other web services: OPeNDAP grid servers (THREDDS), OPeNDAP sequence servers (Dapper), NOS SOAP service, SOS (IOOS, OOStethys), microWFS, DiGIR (OBIS, BMDE). Regardless of the data source, ERDDAP makes all datasets available to clients via standard (and enhanced) DAP requests and makes some datasets accessible via WMS. A client's request also specifies the desired format for the results, e.g., .asc, .csv, .das, .dds, .dods, htmlTable, XHTML, .mat, netCDF, .kml, .png, or .pdf (formats more directly useful to clients). ERDDAP interprets a client request, requests the data from the data source (in the appropriate way), reformats the data source's response, and sends the result to the client. Thus ERDDAP makes data from diverse sources available to diverse clients via standardized interfaces. Clients don't have to install libraries to get data from ERDDAP because ERDDAP is RESTful and resource-oriented: a URL completely defines a data request and the URL can be used in any application that can send a URL and receive a file. This also makes it easy to use ERDDAP in mashups with other web services. ERDDAP could be extended to support other protocols. ERDDAP's hub and spoke architecture simplifies adding support for new types of data sources and new types of clients. ERDDAP includes metadata management support, catalog services, and services to make graphs and maps.
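    The record above emphasizes that ERDDAP is RESTful: a URL alone completely defines a data request. The sketch below illustrates that idea by composing a griddap-style request URL; the server host, dataset ID, and variable/constraint values are hypothetical, chosen only to show the URL's shape.

```python
# A minimal sketch of composing an ERDDAP-style RESTful request URL.
# The host, dataset ID, and variable/constraint values are hypothetical.

def erddap_url(server, dataset_id, file_type, variables, constraints):
    """Build a DAP-style query URL: the URL alone defines the request,
    including the desired result format (csv, nc, png, ...)."""
    query = ",".join(variables) + "".join(constraints)
    return f"{server}/griddap/{dataset_id}.{file_type}?{query}"

url = erddap_url(
    "https://example.org/erddap",   # hypothetical server
    "sst_analysis",                 # hypothetical dataset ID
    "csv",                          # desired result format
    ["sst"],                        # variable to fetch
    ["[(2008-01-01)][(30.0):(40.0)][(-80.0):(-70.0)]"],  # time/lat/lon slices
)
print(url)
```

Because the whole request lives in the URL, it can be pasted into a browser, a script, or a mashup with no client library installed, which is the point the abstract makes.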

  3. A prototype molecular interactive collaborative environment (MICE).

    PubMed

    Bourne, P; Gribskov, M; Johnson, G; Moreland, J; Wavra, S; Weissig, H

    1998-01-01

    Illustrations of macromolecular structure in the scientific literature contain a high level of semantic content through which the authors convey, among other features, the biological function of that macromolecule. We refer to these illustrations as molecular scenes. Such scenes, if available electronically, are not readily accessible for further interactive interrogation. The basic PDB format does not retain features of the scene; formats like PostScript retain the scene but are not interactive; and the many formats used by individual graphics programs, while capable of reproducing the scene, are neither interchangeable nor can they be stored in a database and queried for features of the scene. MICE defines a Molecular Scene Description Language (MSDL) which allows scenes to be stored in a relational database (a molecular scene gallery) and queried. Scenes retrieved from the gallery are rendered in Virtual Reality Modeling Language (VRML) and currently displayed in WebView, a VRML browser modified to support the Virtual Reality Behavior System (VRBS) protocol. VRBS provides communication between multiple client browsers, each capable of manipulating the scene. This level of collaboration works well over standard Internet connections and holds promise for collaborative research at a distance and distance learning. Further, via VRBS, the VRML world can be used as a visual cue to trigger an application such as a remote MEME search. MICE is very much work in progress. Current work seeks to replace WebView with Netscape, Cosmo Player (a standard VRML plug-in), and a Java-based console. The console consists of a generic kernel suitable for multiple collaborative applications and additional application-specific controls. Further details of the MICE project are available at http://mice.sdsc.edu.

  4. LSIViewer 2.0 - A Client-Oriented Online Visualization Tool for Geospatial Vector Data

    NASA Astrophysics Data System (ADS)

    Manikanta, K.; Rajan, K. S.

    2017-09-01

    Geospatial data visualization systems have been predominantly applications that are installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model for data handling, rendering and visualization has been the most prevalent approach in Web-GIS. While client devices have become functionally more powerful over recent years, the above model has largely ignored this and still follows a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer - a simple, easy-to-use and robust online geospatial data visualisation system for the user's own data that harnesses the client's capabilities for data rendering and user-interactive styling, with a reduced load on the server. The developed system can support multiple geospatial vector formats and can be integrated with other web-based systems like WMS, WFS, etc. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Various tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render the vector data using LSIViewer is comparable to a desktop GIS application, QGIS, on an identical system.

  5. Security in the CernVM File System and the Frontier Distributed Database Caching System

    NASA Astrophysics Data System (ADS)

    Dykstra, D.; Blomer, J.

    2014-06-01

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.
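    The security model described above rests on content integrity rather than privacy: clients verify secure hashes (and signatures) on data fetched through untrusted http proxy caches. A minimal sketch of the hash-verification half of that idea, using content addressing in the spirit of CVMFS (the actual CVMFS catalog and signature format is far richer than this):

```python
# Content-addressed integrity checking: the object's name is its secure
# hash, so an untrusted cache cannot alter data without detection.
# (Illustrative sketch only; CVMFS's real on-wire format is richer.)
import hashlib

def publish(content: bytes) -> tuple[str, bytes]:
    """Server side: derive the object's name from its secure hash."""
    return hashlib.sha1(content).hexdigest(), content

def verify(expected_hash: str, content: bytes) -> bool:
    """Client side: recompute the hash of whatever the proxy delivered."""
    return hashlib.sha1(content).hexdigest() == expected_hash

name, blob = publish(b"payload from the stratum server")
assert verify(name, blob)             # authentic object passes
assert not verify(name, blob + b"!")  # tampered object fails
```

Authenticity (knowing the hash list itself came from the experiment) is the part X.509 signatures add in Frontier and CVMFS; integrity alone is what the hash check above provides.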

  6. An optimized web-based approach for collaborative stereoscopic medical visualization

    PubMed Central

    Kaspar, Mathias; Parsad, Nigel M; Silverstein, Jonathan C

    2013-01-01

    Objective Medical visualization tools have traditionally been constrained to tethered imaging workstations or proprietary client viewers, typically part of hospital radiology systems. To improve accessibility to real-time, remote, interactive, stereoscopic visualization and to enable collaboration among multiple viewing locations, we developed an open source approach requiring only a standard web browser with no added client-side software. Materials and Methods Our collaborative, web-based, stereoscopic, visualization system, CoWebViz, has been used successfully for the past 2 years at the University of Chicago to teach immersive virtual anatomy classes. It is a server application that streams server-side visualization applications to client front-ends, comprised solely of a standard web browser with no added software. Results We describe optimization considerations, usability, and performance results, which make CoWebViz practical for broad clinical use. We clarify technical advances including: enhanced threaded architecture, optimized visualization distribution algorithms, a wide range of supported stereoscopic presentation technologies, and the salient theoretical and empirical network parameters that affect our web-based visualization approach. Discussion The implementations demonstrate usability and performance benefits of a simple web-based approach for complex clinical visualization scenarios. Using this approach overcomes technical challenges that require third-party web browser plug-ins, resulting in the most lightweight client. Conclusions Compared to special software and hardware deployments, unmodified web browsers enhance remote user accessibility to interactive medical visualization. 
Whereas local hardware and software deployments may provide better interactivity than remote applications, our implementation demonstrates that a simplified, stable, client approach using standard web browsers is sufficient for high quality three-dimensional, stereoscopic, collaborative and interactive visualization. PMID:23048008

  7. The clients' readiness to use mental health care services: Experiences and perceptions from Iranian context.

    PubMed

    Alavi, Mousa; Irajpour, Alireza

    2013-11-01

    Underutilization of mental health care services has been a challenge for health care providers for many years. This challenge could be met in part by improving clients' readiness to use such services. This study aimed to introduce the important aspects of clients' readiness to use mental health services in the Iranian context. A thematic analysis of in-depth interviews was undertaken using a constant comparative approach. Participants (11 health professionals: 3 physicians, 7 nurses, and 1 psychologist; plus 5 patients and family members) were recruited from educational hospitals affiliated with Isfahan University of Medical Sciences, Iran. Credibility and trustworthiness were grounded in four aspects: factual value, applicability, consistency, and neutrality. The study findings uncovered two important aspects of clients' readiness to utilize mental health care services. These are described through two themes and related sub-themes: "the clients' awareness" implies the cognitive aspect of readiness and "the clients' attitudes" implies the psychological aspect of readiness, both of which were perceived to cultivate a context in which clients could access and use mental health services more easily. For the health care system in Isfahan, Iran to be successful in delivering mental health services, training programs directed at preparing service users should be considered, along with efforts to improve clients' awareness and favorable attitudes.

  8. D-Light on promoters: a client-server system for the analysis and visualization of cis-regulatory elements

    PubMed Central

    2013-01-01

    Background The binding of transcription factors to DNA plays an essential role in the regulation of gene expression. Numerous experiments elucidated binding sequences which subsequently have been used to derive statistical models for predicting potential transcription factor binding sites (TFBS). The rapidly increasing number of genome sequence data requires sophisticated computational approaches to manage and query experimental and predicted TFBS data in the context of other epigenetic factors and across different organisms. Results We have developed D-Light, a novel client-server software package to store and query large amounts of TFBS data for any number of genomes. Users can add small-scale data to the server database and query them in a large scale, genome-wide promoter context. The client is implemented in Java and provides simple graphical user interfaces and data visualization. Here we also performed a statistical analysis showing what a user can expect for certain parameter settings and we illustrate the usage of D-Light with the help of a microarray data set. Conclusions D-Light is an easy to use software tool to integrate, store and query annotation data for promoters. A public D-Light server, the client and server software for local installation and the source code under GNU GPL license are available at http://biwww.che.sbg.ac.at/dlight. PMID:23617301
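    The statistical models for TFBS prediction mentioned in this record are commonly position weight matrices scanned along a promoter sequence. A generic sketch of that technique follows; the 3-position weight matrix and the score threshold are invented for illustration and are not taken from D-Light.

```python
# Scanning a promoter sequence with a position weight matrix (PWM), a
# common statistical model behind TFBS prediction. The weights below are
# invented log-odds-like scores, purely illustrative.
PWM = [  # per-position weights for each base
    {"A": 1.0, "C": -1.0, "G": -1.0, "T": 0.5},
    {"A": -1.0, "C": 1.2, "G": 0.2, "T": -1.0},
    {"A": 0.8, "C": -0.5, "G": -0.5, "T": 1.0},
]

def scan(sequence: str, pwm, threshold: float):
    """Return (offset, score) for every window scoring >= threshold."""
    hits = []
    for i in range(len(sequence) - len(pwm) + 1):
        window = sequence[i:i + len(pwm)]
        score = sum(col[base] for col, base in zip(pwm, window))
        if score >= threshold:
            hits.append((i, score))
    return hits

hits = scan("ACATCT", PWM, threshold=2.5)  # two windows exceed the cutoff
```

A real predictor would derive the weights from experimentally determined binding sequences and correct for background base composition; the sliding-window scoring loop, however, is the core of the method.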

  9. Comparative study of group treatments for posttraumatic stress disorder.

    PubMed

    Maxwell, Kendal; Callahan, Jennifer L; Holtz, Pamela; Janis, Beth M; Gerber, Monica M; Connor, Dana R

    2016-12-01

    Presented herein is a comparative study of group treatments for posttraumatic stress disorder (PTSD). In this study, an emerging intervention, memory specificity training (MeST), was compared with cognitive processing therapy (CPT) using standardized outcome measures of target symptoms (i.e., anxiety and depression from client perspective; memory specificity from independent rater perspective) and global functioning (independent rater perspective), as well as a process measure of expectancy (client perspective). Clients were assessed on 3 separate occasions: at baseline, posttreatment, and 3 months posttreatment. Adherence and treatment fidelity (independent rater perspective) were monitored throughout the course of both treatment conditions. Improvement in PTSD symptoms, depressive symptoms, and global functioning were similar between MeST and CPT; an increase in ability to specify memories upon retrieval was also similar between MeST and CPT. Positive reliable change was observed in both groups on all outcome measures. With respect to the primary target of PTSD symptoms, 88% of participants in both treatment groups moved into the functional distribution by posttreatment and maintained these gains at follow-up. Notably, compared with CPT, MeST required only half the dosage (i.e., number of sessions) to accomplish these gains. Illustrative vignettes from client-therapist exchanges are provided, and results are discussed in terms of the potential mechanisms of action. Implications for both clinical practice and clinical research are also included. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. Health-Related Resource-Use Measurement Instruments for Intersectoral Costs and Benefits in the Education and Criminal Justice Sectors.

    PubMed

    Mayer, Susanne; Paulus, Aggie T G; Łaszewska, Agata; Simon, Judit; Drost, Ruben M W A; Ruwaard, Dirk; Evers, Silvia M A A

    2017-09-01

    Intersectoral costs and benefits (ICBs), i.e. costs and benefits of healthcare interventions outside the healthcare sector, can be a crucial component in economic evaluations from the societal perspective. Pivotal to their estimation is the existence of sound resource-use measurement (RUM) instruments; however, RUM instruments for ICBs in the education or criminal justice sectors have not yet been systematically collated or their psychometric quality assessed. This review aims to fill this gap. To identify relevant instruments, the Database of Instruments for Resource Use Measurement (DIRUM) was searched. Additionally, a systematic literature review was conducted in seven electronic databases to detect instruments containing ICB items used in economic evaluations. Finally, studies evaluating the psychometric quality of these instruments were searched. Twenty-six unique instruments were included. Most frequently, ICB items measured school absenteeism, tutoring, classroom assistance or contacts with legal representatives, police custody/prison detainment and court appearances, with the highest number of items listed in the Client Service Receipt Inventory/Client Sociodemographic and Service Receipt Inventory/Client Service Receipt Inventory-Children's Version (CSRI/CSSRI/CSRI-C), Studying the Scope of Parental Expenditures (SCOPE) and Self-Harm Intervention, Family Therapy (SHIFT) instruments. ICBs in the education sector were especially relevant for age-related developmental disorders and chronic diseases, while criminal justice resource use seems more important in mental health, including alcohol-related disorders or substance abuse. Evidence on the validity or reliability of ICB items was published for two instruments only. 
With a heterogeneous variety of ICBs found to be relevant for several disease areas, but many ICB instruments applied in only one study (21/26 instruments), setting up an international task force, for example to develop an internationally adaptable instrument, is recommended.

  11. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    PubMed

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools, using an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS intermediates the access to registered tools by providing front-end and back-end web services. Programmers can install applications in HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on simple computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. Applications registered with BOWS can be accessed from virtually any programming language through web services, or using standard Java clients. The back-end can run in HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
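    The front-end/back-end split described in this record is essentially a job broker: clients submit jobs through one service, cluster workers poll another for work and post results. A minimal in-memory sketch of that pattern follows; the class, method names, and dict-based store are invented for illustration (BOWS itself exposes these operations as web services, not Python calls).

```python
# A toy broker illustrating the BOWS-style front-end/back-end split.
# All names here are hypothetical; real BOWS operations are web services.
import itertools

class Broker:
    def __init__(self):
        self._ids = itertools.count(1)
        self.jobs, self.results = {}, {}

    # --- front-end service: consumed by client programs ---
    def submit(self, tool, params):
        """Client submits a new process; gets a job id to poll later."""
        job_id = next(self._ids)
        self.jobs[job_id] = (tool, params)
        return job_id

    def read_result(self, job_id):
        return self.results.get(job_id)  # None until a worker reports

    # --- back-end service: consumed by HPC workers ---
    def poll(self):
        """Worker checks for a new job and its parameters."""
        return self.jobs.popitem() if self.jobs else None

    def report(self, job_id, result):
        self.results[job_id] = result

broker = Broker()
jid = broker.submit("blast", {"query": "ACGT"})
job_id, (tool, params) = broker.poll()     # worker picks up the job
broker.report(job_id, f"{tool} finished")  # ...and posts its result
print(broker.read_result(jid))
```

The decoupling matters: the worker side can be written in any language and live behind a cluster firewall, because it only ever initiates outbound poll/report calls.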

  12. Study of Temporal Effects on Subjective Video Quality of Experience.

    PubMed

    Bampis, Christos George; Zhi Li; Moorthy, Anush Krishna; Katsavounidis, Ioannis; Aaron, Anne; Bovik, Alan Conrad

    2017-11-01

    HTTP adaptive streaming is being increasingly deployed by network content providers, such as Netflix and YouTube. By dividing video content into data chunks encoded at different bitrates, a client is able to request the appropriate bitrate for the segment to be played next based on the estimated network conditions. However, this can introduce a number of impairments, including compression artifacts and rebuffering events, which can severely impact an end-user's quality of experience (QoE). We have recently created a new video quality database, which simulates a typical video streaming application, using long video sequences and interesting Netflix content. Going beyond previous efforts, the new database contains highly diverse and contemporary content, and it includes the subjective opinions of a sizable number of human subjects regarding the effects on QoE of both rebuffering and compression distortions. We observed that rebuffering is always obvious and unpleasant to subjects, while bitrate changes may be less obvious due to content-related dependencies. Transient bitrate drops were preferable over rebuffering only on low complexity video content, while consistently low bitrates were poorly tolerated. We evaluated different objective video quality assessment algorithms on our database and found that objective video quality models are unreliable for QoE prediction on videos suffering from both rebuffering events and bitrate changes. This implies the need for more general QoE models that take into account objective quality models, rebuffering-aware information, and memory. The publicly available video content as well as metadata for all of the videos in the new database can be found at http://live.ece.utexas.edu/research/LIVE_NFLXStudy/nflx_index.html.
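    The client behavior this record studies, requesting the next chunk's bitrate from an encoding ladder based on estimated network conditions, can be sketched in a few lines. The rate ladder and the safety margin below are invented for illustration; real adaptation logic also weighs buffer occupancy and the quality/rebuffering trade-offs the study measures.

```python
# A toy sketch of client-side bitrate selection in HTTP adaptive streaming:
# pick the highest ladder rung the estimated throughput can sustain, with
# a safety margin, falling back to the lowest rung rather than stalling.
# (Ladder values and margin are hypothetical.)
LADDER_KBPS = [235, 750, 1750, 3000, 5800]

def choose_bitrate(estimated_kbps: float, margin: float = 0.8) -> int:
    """Request the largest chunk bitrate <= margin * estimated throughput."""
    budget = estimated_kbps * margin
    candidates = [r for r in LADDER_KBPS if r <= budget]
    return max(candidates) if candidates else LADDER_KBPS[0]

assert choose_bitrate(4000) == 3000  # 80% of 4000 kbps sustains the 3000 rung
assert choose_bitrate(100) == 235    # below the ladder: lowest rung, may rebuffer
```

The study's finding that transient bitrate drops beat rebuffering (on low-complexity content) is exactly an argument about where this kind of policy should err.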

  13. Server-based Approach to Web Visualization of Integrated Three-dimensional Brain Imaging Data

    PubMed Central

    Poliakov, Andrew V.; Albright, Evan; Hinshaw, Kevin P.; Corina, David P.; Ojemann, George; Martin, Richard F.; Brinkley, James F.

    2005-01-01

    The authors describe a client-server approach to three-dimensional (3-D) visualization of neuroimaging data, which enables researchers to visualize, manipulate, and analyze large brain imaging datasets over the Internet. All computationally intensive tasks are done by a graphics server that loads and processes image volumes and 3-D models, renders 3-D scenes, and sends the renderings back to the client. The authors discuss the system architecture and implementation and give several examples of client applications that allow visualization and analysis of integrated language map data from single and multiple patients. PMID:15561787

  14. Mesh Networking in the Tactical Environment Using White Space Technology

    DTIC Science & Technology

    2015-12-01

    Connect network with multiple clients... Table 4.5: Results of White Space simulation... functionality to devices seeking to allocate unutilized spectrum space. The devices are able to poll the database via a connection to a web-based... and 28 schools, all of whom were provided Internet connectivity by Adaptrum white space devices [16]. The use of white space devices made this...

  15. The Protein Disease Database of human body fluids: II. Computer methods and data issues.

    PubMed

    Lemkin, P F; Orr, G A; Goldstein, M P; Creed, G J; Myrick, J E; Merril, C R

    1995-01-01

    The Protein Disease Database (PDD) is a relational database of proteins and diseases. With this database it is possible to screen for quantitative protein abnormalities associated with disease states. These quantitative relationships use data drawn from the peer-reviewed biomedical literature. Assays may also include those observed in high-resolution electrophoretic gels that offer the potential to quantitate many proteins in a single test as well as data gathered by enzymatic or immunologic assays. We are using the Internet World Wide Web (WWW) and the Web browser paradigm as an access method for wide distribution and querying of the Protein Disease Database. The WWW hypertext transfer protocol and its Common Gateway Interface make it possible to build powerful graphical user interfaces that can support easy-to-use data retrieval using query specification forms or images. The details of these interactions are totally transparent to the users of these forms. Using a client-server SQL relational database, user query access, initial data entry and database maintenance are all performed over the Internet with a Web browser. We discuss the underlying design issues, mapping mechanisms and assumptions that we used in constructing the system, data entry, access to the database server, security, and synthesis of derived two-dimensional gel image maps and hypertext documents resulting from SQL database searches.
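    The PDD's core idea, screening for quantitative protein abnormalities via SQL over a relational schema, can be illustrated with a tiny query. The table layout and values below are invented for illustration, and SQLite stands in for the client-server SQL database the record describes.

```python
# A minimal relational sketch in the spirit of the PDD: proteins linked to
# diseases with a quantitative measure, screened via a SQL join.
# Schema and data are hypothetical; SQLite stands in for the real server.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE protein (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE finding (protein_id INTEGER REFERENCES protein(id),
                      disease TEXT, fold_change REAL);
INSERT INTO protein VALUES (1, 'albumin'), (2, 'transferrin');
INSERT INTO finding VALUES (1, 'nephrotic syndrome', 0.4),
                           (2, 'iron deficiency', 1.8);
""")

# Screen for proteins reported as decreased (fold change < 1) in disease.
rows = db.execute("""
    SELECT p.name, f.disease, f.fold_change
    FROM finding f JOIN protein p ON p.id = f.protein_id
    WHERE f.fold_change < 1.0
""").fetchall()
print(rows)
```

In the PDD itself such a query would be generated from a Web form and run through the CGI gateway against the central server, but the relational screen is the same.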

  16. Levelling and merging of two discrete national-scale geochemical databases: A case study showing the surficial expression of metalliferous black shales

    USGS Publications Warehouse

    Smith, Steven M.; Neilson, Ryan T.; Giles, Stuart A.

    2015-01-01

    Government-sponsored, national-scale, soil and sediment geochemical databases are used to estimate regional and local background concentrations for environmental issues, identify possible anthropogenic contamination, estimate mineral endowment, explore for new mineral deposits, evaluate nutrient levels for agriculture, and establish concentration relationships with human or animal health. Because of these different uses, it is difficult for any single database to accommodate all the needs of each client. Smith et al. (2013, p. 168) reviewed six national-scale soil and sediment geochemical databases for the United States (U.S.) and, for each, evaluated “its appropriateness as a national-scale geochemical database and its usefulness for national-scale geochemical mapping.” Each of the evaluated databases has strengths and weaknesses that were listed in that review. Two of these U.S. national-scale geochemical databases are similar in their sample media and collection protocols but have different strengths—primarily sampling density and analytical consistency. This project was implemented to determine whether those databases could be merged to produce a combined dataset that could be used for mineral resource assessments. The utility of the merged database was tested to see whether mapped distributions could identify metalliferous black shales at a national scale.

  17. Development of a Mobile User Interface for Image-based Dietary Assessment.

    PubMed

    Kim, Sungye; Schap, Tusarebecca; Bosch, Marc; Maciejewski, Ross; Delp, Edward J; Ebert, David S; Boushey, Carol J

    2010-12-31

    In this paper, we present a mobile user interface for image-based dietary assessment. The mobile user interface provides a front end to client-server image recognition and portion estimation software. In the client-server configuration, the user interactively records a series of food images using a built-in camera on the mobile device. Images are sent from the mobile device to the server, and the calorie content of the meal is estimated. In this paper, we describe and discuss the design and development of our mobile user interface features. We discuss the design concepts through initial ideas and implementations. For each concept, we discuss qualitative user feedback from participants using the mobile client application. We then discuss future designs, including work on design considerations for the mobile application to allow the user to interactively correct errors in the automatic processing while reducing the user burden associated with classical pen-and-paper dietary records.

  18. Security Risks of Cloud Computing and Its Emergence as 5th Utility Service

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    Cloud Computing is being projected by the major cloud service provider IT companies such as IBM, Google, Yahoo, Amazon and others as the fifth utility, where clients will have access to processing for applications and software projects that require very high processing speed and huge data capacity, such as compute-intensive scientific and engineering research problems, e-business, and data content network applications. These services for different types of clients are provided under DASM-Direct Access Service Management, based on virtualization of hardware, software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments in Cloud Computing and the hardware/software configuration of the cloud paradigm. The paper also examines the vital aspects of security risks projected by IT industry experts and cloud clients, and highlights the cloud providers' response to cloud security risks.

  19. K-means cluster analysis of rehabilitation service users in the Home Health Care System of Ontario: examining the heterogeneity of a complex geriatric population.

    PubMed

    Armstrong, Joshua J; Zhu, Mu; Hirdes, John P; Stolee, Paul

    2012-12-01

    To examine the heterogeneity of home care clients who use rehabilitation services by using the K-means algorithm to identify previously unknown patterns of clinical characteristics. Observational study of secondary data. Home care system. Assessment information was collected on 150,253 home care clients using the provincially mandated Resident Assessment Instrument-Home Care (RAI-HC) data system. Not applicable. Assessment information from every long-stay (>60 d) home care client who entered the home care system between 2005 and 2008 and used rehabilitation services within 3 months of their initial assessment was analyzed. The K-means clustering algorithm was applied using 37 variables from the RAI-HC assessment. The K-means cluster analysis resulted in the identification of 7 relatively homogeneous subgroups that differed on characteristics such as age, sex, cognition, and functional impairment. Client profiles were created to illustrate the diversity of this geriatric population. The K-means algorithm provided a useful way to segment a heterogeneous rehabilitation client population into more homogeneous subgroups. This analysis provides an enhanced understanding of client characteristics and needs, and could enable more appropriate targeting of rehabilitation services for home care clients. Copyright © 2012 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
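
    The clustering step can be sketched with a minimal from-scratch K-means. This toy uses 2-D points and k = 2 purely for illustration; the study clustered 37 RAI-HC variables into k = 7 subgroups:

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two coordinate tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=100, seed=0):
    """Minimal K-means: assign each point to its nearest centroid,
    recompute centroids as cluster means, and repeat until stable."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: index of the nearest centroid for each point.
        labels = [min(range(k), key=lambda j: dist2(p, centroids[j]))
                  for p in points]
        # Update step: move each centroid to the mean of its members.
        new_centroids = []
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                new_centroids.append(tuple(sum(c) / len(members)
                                           for c in zip(*members)))
            else:
                new_centroids.append(centroids[j])  # keep empty cluster's centroid
        if new_centroids == centroids:
            break
        centroids = new_centroids
    return labels, centroids

# Two well-separated toy "client profiles" in a 2-D feature space.
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
labels, centroids = kmeans(data, k=2)
```

    On well-separated data like this, the algorithm recovers the two groups regardless of which points are drawn as initial centroids.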

  20. Combining data from multiple sources using the CUAHSI Hydrologic Information System

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Ames, D. P.; Horsburgh, J. S.; Goodall, J. L.

    2012-12-01

    The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) has developed a Hydrologic Information System (HIS) to provide better access to data by enabling the publication, cataloging, discovery, retrieval, and analysis of hydrologic data using web services. The CUAHSI HIS is an Internet-based system composed of hydrologic databases and servers connected through web services, as well as software for data publication, discovery, and access. The HIS metadata catalog lists close to 100 web services registered to provide data through this system, ranging from large federal agency data sets to experimental watersheds managed by university investigators. The system's flexibility in storing and enabling public access to similarly formatted data and metadata has created a community data resource from governmental and academic data that might otherwise remain private or be analyzed only in isolation. Comprehensive understanding of hydrology requires integration of this information from multiple sources. HydroDesktop is the client application developed as part of HIS to support data discovery and access through this system. HydroDesktop is founded on an open source GIS client and has a plug-in architecture that has enabled the integration of modeling and analysis capability with the functionality for data discovery and access. Model integration is possible through a plug-in built on the OpenMI standard, and data visualization and analysis are supported by an R plug-in. This presentation will demonstrate HydroDesktop, showing how it provides an analysis environment within which data from multiple sources can be discovered, accessed and integrated.

  1. Motivational interviewing and the clinical science of Carl Rogers.

    PubMed

    Miller, William R; Moyers, Theresa B

    2017-08-01

    The clinical method of motivational interviewing (MI) evolved from the person-centered approach of Carl Rogers, maintaining his pioneering commitment to the scientific study of therapeutic processes and outcomes. The development of MI pertains to all 3 of the 125th anniversary themes explored in this special issue. Applications of MI have spread far beyond clinical psychology into fields including health care, rehabilitation, public health, social work, dentistry, corrections, coaching, and education, directly impacting the lives of many people. The public relevance and impact of clinical psychology are illustrated in the similarity of MI processes and outcomes across such diverse fields and the inseparability of human services from the person who provides them, in that both relational and technical elements of MI predict client outcomes. Within the history of clinical psychology MI is a clear product of clinical science, arising from the seminal work of Carl Rogers whose own research grounded clinical practice in empirical science. As with Rogers' work 70 years ago, MI began as an inductive empirical approach, observing clinical practice to develop and test hypotheses about what actually promotes change. Research on MI bridges the current divide between evidence-based practice and the well-established importance of therapeutic relationship. Research on training and learning of MI further questions the current model of continuing professional education through self-study and workshops as a way of improving practice behavior and client outcomes. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Using Case-Mix Adjustment Methods To Measure the Effectiveness of Substance Abuse Treatment: Three Examples Using Client Employment Outcomes.

    ERIC Educational Resources Information Center

    Koenig, Lane; Fields, Errol L.; Dall, Timothy M.; Ameen, Ansari Z.; Harwood, Henrick J.

    This report demonstrates three applications of case-mix methods using regression analysis. The results are used to assess the relative effectiveness of substance abuse treatment providers. The report also examines the ability of providers to improve client employment outcomes, an outcome domain relatively unexamined in the assessment of provider…

  3. A case study on the application of International Classification of Functioning, Disability and Health (ICF)-based tools for vocational rehabilitation in spinal cord injury.

    PubMed

    Glässel, Andrea; Rauch, Alexandra; Selb, Melissa; Emmenegger, Karl; Lückenkemper, Miriam; Escorpizo, Reuben

    2012-01-01

    Vocational rehabilitation (VR) plays a key role in bringing persons with acquired disabilities back to work, while encouraging employment participation. The purpose of this case study is to illustrate the systematic application of International Classification of Functioning, Disability, and Health (ICF)-based documentation tools by using ICF Core Sets in VR shown with a case example of a client with traumatic spinal cord injury (SCI). The client was a 26-year-old male with paraplegia (7th thoracic level), working in the past as a mover. This case study describes the integration of the ICF Core Sets for VR into an interdisciplinary rehabilitation program by using ICF-based documentation tools. Improvements in the client's impairments, activity limitations, and participation restrictions were observed following rehabilitation. Goals in different areas of functioning were achieved. The use of the ICF Core Sets in VR allows a comprehensive assessment of the client's level of functioning and intervention planning. Specifically, the Brief ICF Core Set in VR can provide domains for intervention relevant to each member of an interdisciplinary team and hence, can facilitate the VR management process in a SCI center in Switzerland.

  4. Software Applications to Access Earth Science Data: Building an ECHO Client

    NASA Astrophysics Data System (ADS)

    Cohen, A.; Cechini, M.; Pilone, D.

    2010-12-01

    Historically, developing an ECHO (NASA’s Earth Observing System (EOS) ClearingHOuse) client required interaction with its SOAP API. SOAP, as a framework for web service communication, has numerous advantages for enterprise applications and Java/C# type programming languages. However, as interest has grown in quick development cycles and more intriguing “mashups,” ECHO has seen the SOAP API lose its appeal. To address these changing needs, ECHO has introduced two new interfaces facilitating simple access to its metadata holdings. The first interface is built upon the OpenSearch format and ESIP Federated Search framework; the second is built upon the Representational State Transfer (REST) architecture. Using the REST and OpenSearch APIs to access ECHO makes development with modern languages simpler and more feasible. Client developers can leverage the simple interaction with ECHO to focus more of their time on the advanced functionality they are presenting to users. To demonstrate the simplicity of developing with the REST API, participants will be led through a hands-on exercise in which they develop an ECHO client that performs the following actions: login; provider discovery; provider-based dataset discovery; dataset, temporal, and spatial constraint-based granule discovery; and online data access.
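
    Query construction against a REST/OpenSearch interface of this general kind can be sketched as follows. The host, path, and parameter names below are illustrative placeholders, not ECHO's actual API; a real client would read the service's OpenSearch description document for the true URL template:

```python
from urllib.parse import urlencode, urlsplit, parse_qs

def build_granule_search_url(base, dataset, temporal, bbox, page_size=10):
    """Compose an OpenSearch-style granule query URL.
    All parameter names here are assumptions for illustration."""
    params = {
        "datasetId": dataset,
        "temporal": ",".join(temporal),           # start,end in ISO 8601
        "boundingBox": ",".join(map(str, bbox)),  # west,south,east,north
        "pageSize": page_size,
    }
    return base + "?" + urlencode(params)

url = build_granule_search_url(
    "https://example.gov/opensearch/granules",    # placeholder endpoint
    dataset="MOD021KM",
    temporal=("2010-01-01T00:00:00Z", "2010-01-31T23:59:59Z"),
    bbox=(-180, -90, 180, 90),
)
```

    The point of the OpenSearch/REST style is exactly this: a dataset or granule search reduces to assembling one URL, with no SOAP envelope or generated client stubs.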

  5. Partitioned key-value store with atomic memory operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    A partitioned key-value store is provided that supports atomic memory operations. A server performs a memory operation in a partitioned key-value store by receiving a request from an application for at least one atomic memory operation, the atomic memory operation comprising a memory address identifier; and, in response to the atomic memory operation, performing one or more of (i) reading a client-side memory location identified by the memory address identifier and storing one or more key-value pairs from the client-side memory location in a local key-value store of the server; and (ii) obtaining one or more key-value pairs from the local key-value store of the server and writing the obtained one or more key-value pairs into the client-side memory location identified by the memory address identifier. The server can perform functions obtained from a client-side memory location and return a result to the client using one or more of the atomic memory operations.
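
    The two atomic operations described above, pulling key-value pairs from a client-side memory region into the server's local store and pushing pairs back out, can be sketched in-process. Everything here (class and method names, the dict standing in for client memory) is an illustrative assumption; a real system would operate on remote memory via RDMA or a similar mechanism:

```python
import threading

class KVServer:
    """Toy sketch of a server exposing two atomic memory operations:
    (i) absorb: read a client-side region into the local key-value store;
    (ii) emit: write selected pairs from the local store to a client region.
    "Client memory" is simulated as a dict of address -> {key: value}."""

    def __init__(self):
        self._store = {}
        self._lock = threading.Lock()

    def absorb(self, client_memory, addr):
        # Atomically copy all pairs from the client region into the local store.
        with self._lock:
            self._store.update(client_memory[addr])

    def emit(self, client_memory, addr, keys):
        # Atomically copy the selected pairs from the local store to the client region.
        with self._lock:
            client_memory[addr] = {k: self._store[k] for k in keys}

client_memory = {0x10: {"temp": 21.5, "unit": "C"}}
server = KVServer()
server.absorb(client_memory, 0x10)   # pairs at address 0x10 enter the store
server.emit(client_memory, 0x20, ["temp"])  # one pair is written back to 0x20
```

    The lock makes each operation atomic with respect to the other, mirroring the abstract's requirement that reads and writes of a memory region are indivisible.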

  6. Documenting progress: hand therapy treatment shift from biomechanical to occupational adaptation.

    PubMed

    Jack, Jada; Estes, Rebecca I

    2010-01-01

    The investment of time and self to develop therapeutic relationships with clients appears incongruent with today's time-constrained health care system, yet bridging the gap of these incongruencies is the challenge therapists face to provide high-quality, client-centered, occupation-based treatment. This case report illustrates a shift in approach from biomechanical to occupational adaptation (OA) in an orthopedic outpatient clinic. The progress of a client with lupus-related arthritis who was 6 days postsurgery is documented. The intervention initially used a biomechanical frame of reference, but when little progress had been made at 10 weeks after surgery, a shift was made to the more client-centered OA approach. The Canadian Occupational Performance Measure was administered, and an OA approach was initiated. On reassessment, clinically important improvements were documented in all functional tasks addressed. An OA approach provides the bridge between the application of clinical expertise, client-centered, occupation-based therapy and the time constraints placed by payer sources.

  7. Improving general practice based epidemiologic surveillance using desktop clients: the French Sentinel Network experience.

    PubMed

    Turbelin, Clément; Boëlle, Pierre-Yves

    2010-01-01

    Web-based applications are a tool of choice for general practice based epidemiological surveillance; however, their use may disrupt the general practitioners' (GPs) work process. In this article, we propose an alternative approach based on a desktop client application, developed for use in the French General Practitioners Sentinel Network. We developed a Java application running as a client on the local GP computer. It allows reporting cases to a central server and provides feedback to the participating GPs. XML was used to describe surveillance protocols and questionnaires as well as instances of case descriptions. An evaluation of the users' feelings was carried out, and the impact on the timeliness and completeness of surveillance data was measured. Better integration in the work process was reported, especially when the software was used at the time of consultation. Reports were received more frequently with less missing data. This study highlights the potential of allowing multiple modes of interaction with the surveillance system to increase both the participation of GPs and the quality of surveillance.
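
    The abstract notes that XML described both the questionnaires and individual case reports. A minimal sketch of serializing and re-parsing one case as XML follows; the element and attribute names are invented for illustration and are not the Sentinel Network's actual schema:

```python
import xml.etree.ElementTree as ET

def case_report_xml(disease, week, answers):
    """Serialize one case description as an XML string.
    All element/attribute names here are hypothetical."""
    case = ET.Element("case", disease=disease, week=week)
    for question, value in answers.items():
        item = ET.SubElement(case, "item", question=question)
        item.text = str(value)  # questionnaire answer as element text
    return ET.tostring(case, encoding="unicode")

# One case reported by a GP's desktop client, then parsed server-side.
xml_doc = case_report_xml("influenza-like-illness", "2009-W45",
                          {"age": 34, "fever": "yes"})
parsed = ET.fromstring(xml_doc)
```

    Defining both the questionnaire and the case instances in XML is what lets the desktop client download new surveillance protocols without a software update.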

  8. A cloud-based forensics tracking scheme for online social network clients.

    PubMed

    Lin, Feng-Yu; Huang, Chien-Cheng; Chang, Pei-Ying

    2015-10-01

    In recent years, with significant changes in communication modes, most users have shifted to cloud-based applications, especially online social networks (OSNs). These applications are mostly hosted externally and are equally available to criminals, enabling them to impede criminal investigations and intelligence gathering. In the virtual world, how a Law Enforcement Agency (LEA) identifies the "actual" identity of criminal suspects, and their geolocation in social networks, is a major challenge for current digital investigation. In view of this, this paper proposes a scheme, based on the concepts of IP location and network forensics, which aims to develop forensic tracking on OSNs. According to our empirical analysis, the proposed mechanism can instantly trace the "physical location" of a targeted service resource identifier (SRI) when the target client is using online social network applications (Facebook, Twitter, etc.), and can associatively analyze the probable target client "identity". To the best of our knowledge, this is the first individualized location method and architecture developed and evaluated in OSNs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. COLLABORATIONS AND SPECIALIZED CLIENT INTERACTIONS

    EPA Science Inventory

    The goal of this task is to improve our understanding of atmospheric modeling research applications through collaborations with the international air pollution community and to demonstrate the applicability of our AQ models for their utility through technical applications by clie...

  10. Family group conferences in public mental health care: an exploration of opportunities.

    PubMed

    de Jong, Gideon; Schout, Gert

    2011-02-01

    Family group conferences are usually organized in youth care settings, especially in cases of (sexual) abuse of children and domestic violence. Studies on the application of family group conferences in mental health practices are scarce, let alone in a setting even more specific, such as public mental health care. The present study reports on an exploratory study on the applicability of family group conferencing in public mental health care. Findings suggest that there are six reasons to start family group conference pilots in public mental health care. First, care providers who work in public mental health care often need to deal with clients who are not motivated in seeking help. Family group conferences could yield support or provide a plan, even without the presence of the client. Second, conferences might complement the repertoire of treatment options between voluntary help and coercive treatment. Third, clients in public mental health care often have a limited network. Conferences promote involvement, as they expand and restore relationships, and generate support. Fourth, conferences could succeed both in a crisis and in other non-critical situations. Sometimes pressure is needed for clients to accept help from their network (such as in the case of an imminent eviction), while in other situations, it is required that clients are stabilized before a conference can be organized (such as in the case of a psychotic episode). Fifth, clients who have negative experiences with care agencies and their representatives might be inclined to accept a conference because these agencies act in another (modest) role. Finally, the social network could elevate the work of professionals. © 2011 The Authors. International Journal of Mental Health Nursing © 2011 Australian College of Mental Health Nurses Inc.

  11. BEAUTY-X: enhanced BLAST searches for DNA queries.

    PubMed

    Worley, K C; Culpepper, P; Wiese, B A; Smith, R F

    1998-01-01

    BEAUTY (BLAST Enhanced Alignment Utility) is an enhanced version of the BLAST database search tool that facilitates identification of the functions of matched sequences. Three recent improvements to the BEAUTY program described here make the enhanced output (1) available for DNA queries, (2) available for searches of any protein database, and (3) more up-to-date, with periodic updates of the domain information. BEAUTY searches of the NCBI and EMBL non-redundant protein sequence databases are available from the BCM Search Launcher Web pages (http://gc.bcm.tmc.edu:8088/search-launcher/launcher.html). BEAUTY Post-Processing of submitted search results is available using the BCM Search Launcher Batch Client (version 2.6) (ftp://gc.bcm.tmc.edu/pub/software/search-launcher/). Example figures are available at http://dot.bcm.tmc.edu:9331/papers/beautypp.html (kworley,culpep)@bcm.tmc.edu

  12. Apollo2Go: a web service adapter for the Apollo genome viewer to enable distributed genome annotation.

    PubMed

    Klee, Kathrin; Ernst, Rebecca; Spannagl, Manuel; Mayer, Klaus F X

    2007-08-30

    Apollo, a genome annotation viewer and editor, has become a widely used genome annotation and visualization tool for distributed genome annotation projects. When using Apollo for annotation, database updates are carried out by uploading intermediate annotation files into the respective database. This non-direct database upload is laborious and evokes problems of data synchronicity. To overcome these limitations we extended the Apollo data adapter with a generic, configurable web service client that is able to retrieve annotation data in a GAME-XML-formatted string and pass it on to Apollo's internal input routine. This Apollo web service adapter, Apollo2Go, simplifies the data exchange in distributed projects and aims to render the annotation process more comfortable. The Apollo2Go software is freely available from ftp://ftpmips.gsf.de/plants/apollo_webservice.

  13. Apollo2Go: a web service adapter for the Apollo genome viewer to enable distributed genome annotation

    PubMed Central

    Klee, Kathrin; Ernst, Rebecca; Spannagl, Manuel; Mayer, Klaus FX

    2007-01-01

    Background Apollo, a genome annotation viewer and editor, has become a widely used genome annotation and visualization tool for distributed genome annotation projects. When using Apollo for annotation, database updates are carried out by uploading intermediate annotation files into the respective database. This non-direct database upload is laborious and evokes problems of data synchronicity. Results To overcome these limitations we extended the Apollo data adapter with a generic, configurable web service client that is able to retrieve annotation data in a GAME-XML-formatted string and pass it on to Apollo's internal input routine. Conclusion This Apollo web service adapter, Apollo2Go, simplifies the data exchange in distributed projects and aims to render the annotation process more comfortable. The Apollo2Go software is freely available from . PMID:17760972

  14. Information-Flow-Based Access Control for Web Browsers

    NASA Astrophysics Data System (ADS)

    Yoshihama, Sachiko; Tateishi, Takaaki; Tabuchi, Naoshi; Matsumoto, Tsutomu

    The emergence of Web 2.0 technologies such as Ajax and Mashup has revealed the weakness of the same-origin policy[1], the current de facto standard for the Web browser security model. We propose a new browser security model, based on information-flow-based access control (IBAC), that allows fine-grained access control in client-side Web applications for secure mashups and user-generated content. The model is designed to cope with the dynamic nature of client-side Web applications and to accurately determine the privilege of scripts in the event-driven programming model.
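
    The core IBAC idea, that values carry labels recording which origins influenced them, labels join when values combine, and a sink check inspects the joined label, can be sketched as follows. This is an illustrative toy of the general technique, not the browser mechanism the paper proposes:

```python
class Labeled:
    """A value tagged with the set of origins that influenced it."""

    def __init__(self, value, origins):
        self.value = value
        self.origins = frozenset(origins)

    def combine(self, other, op):
        # Any operation on two labeled values joins (unions) their labels,
        # so influence propagates through computation.
        return Labeled(op(self.value, other.value),
                       self.origins | other.origins)

def permitted(labeled, allowed_origins):
    """A sink may receive a value only if every influencing origin is allowed."""
    return labeled.origins <= frozenset(allowed_origins)

secret = Labeled("token123", {"https://bank.example"})
public = Labeled("hello", {"https://mashup.example"})
mixed = public.combine(secret, lambda a, b: a + b)  # tainted by both origins
```

    Because `mixed` carries the bank origin, a sink restricted to the mashup origin rejects it, even though the script that built it was loaded from the mashup: access depends on information flow, not on which origin's code ran.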

  15. DNASynth: a software application to optimization of artificial gene synthesis

    NASA Astrophysics Data System (ADS)

    Muczyński, Jan; Nowak, Robert M.

    2017-08-01

    DNASynth is a client-server software application in which the client runs in a web browser. The aim of this program is to support and optimize the process of synthesizing artificial genes using the Ligase Chain Reaction (LCR). Thanks to LCR it is possible to obtain a DNA strand coding a user-defined peptide. The DNA sequence is calculated by an optimization algorithm that considers optimal codon usage, the minimal energy of secondary structures, and the minimal number of required LCRs. Additionally, the absence of sequences characteristic of a user-defined set of restriction enzymes is guaranteed. The presented software was tested on synthetic and real data.
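
    One of the three optimization criteria, optimal codon usage, can be illustrated in isolation by back-translating a peptide with the most frequent codon per residue. The usage table below is made up for illustration (real tables are organism-specific), and DNASynth additionally optimizes secondary-structure energy and LCR count rather than applying this criterion alone:

```python
# Illustrative codon-usage table: relative frequencies for a few amino
# acids (invented numbers; not any real organism's table).
CODON_USAGE = {
    "M": {"ATG": 1.00},
    "K": {"AAA": 0.74, "AAG": 0.26},
    "F": {"TTT": 0.58, "TTC": 0.42},
}

def back_translate(peptide):
    """Pick the most frequent codon for each residue of the peptide."""
    return "".join(max(CODON_USAGE[aa], key=CODON_USAGE[aa].get)
                   for aa in peptide)

dna = back_translate("MKF")  # -> "ATGAAATTT"
```

    A full optimizer would treat codon choice as a search variable, trading frequency against hairpin energy and the number of LCR oligonucleotides.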

  16. A meta-analysis of motivational interviewing process: Technical, relational, and conditional process models of change.

    PubMed

    Magill, Molly; Apodaca, Timothy R; Borsari, Brian; Gaume, Jacques; Hoadley, Ariel; Gordon, Rebecca E F; Tonigan, J Scott; Moyers, Theresa

    2018-02-01

    In the present meta-analysis, we test the technical and relational hypotheses of Motivational Interviewing (MI) efficacy. We also propose an a priori conditional process model where heterogeneity of technical path effect sizes should be explained by interpersonal/relational (i.e., empathy, MI Spirit) and intrapersonal (i.e., client treatment seeking status) moderators. A systematic review identified k = 58 reports, describing 36 primary studies and 40 effect sizes (N = 3,025 participants). Statistical methods calculated the inverse variance-weighted pooled correlation coefficient for the therapist to client and the client to outcome paths across multiple target behaviors (i.e., alcohol use, other drug use, other behavior change). Therapist MI-consistent skills were correlated with more client change talk (r = .55, p < .001) as well as more sustain talk (r = .40, p < .001). MI-inconsistent skills were correlated with more sustain talk (r = .16, p < .001), but not change talk. When these indicators were combined into proportions, as recommended in the Motivational Interviewing Skill Code, the overall technical hypothesis was supported. Specifically, proportion MI consistency was related to higher proportion change talk (r = .11, p = .004) and higher proportion change talk was related to reductions in risk behavior at follow up (r = -.16, p < .001). When tested as two independent effects, client change talk was not significant, but sustain talk was positively associated with worse outcome (r = .19, p < .001). Finally, the relational hypothesis was not supported, but heterogeneity in technical hypothesis path effect sizes was partially explained by inter- and intrapersonal moderators. This meta-analysis provides additional support for the technical hypothesis of MI efficacy; future research on the relational hypothesis should occur in the field rather than in the context of clinical trials. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
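
    The pooled effect sizes above are inverse-variance-weighted correlations. A standard fixed-effect version of that pooling, via Fisher's z transform with var(z) ≈ 1/(n − 3), can be sketched as follows; this is the textbook approach, and the meta-analysis's actual model may differ in detail. The r and n values in the usage line are invented, not values from the study:

```python
import math

def pooled_correlation(rs, ns):
    """Fixed-effect inverse-variance pooling of correlation coefficients:
    transform each r to Fisher's z, weight by n - 3 (the inverse of z's
    approximate variance), average, and back-transform to r."""
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(z_bar)

# Two hypothetical study-level correlations with their sample sizes.
r = pooled_correlation([0.50, 0.60], [103, 53])
```

    The pooled value lands between the two inputs, pulled toward the larger study, which is exactly the behavior inverse-variance weighting is meant to produce.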

  17. Recommendations for a service framework to access astronomical archives

    NASA Technical Reports Server (NTRS)

    Travisano, J. J.; Pollizzi, J.

    1992-01-01

    There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc.), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs. proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system, which would benefit archive user and supplier alike, is proposed.

  18. High levels of opioid analgesic co-prescription among methadone maintenance treatment clients in British Columbia, Canada: Results from a population-level retrospective cohort study

    PubMed Central

    Nosyk, Bohdan; Fischer, Benedikt; Sun, Huiying; Marsh, David C.; Kerr, Thomas; Rehm, Juergen T.; Anis, Aslam H.

    2014-01-01

    Background and Objectives The nonmedical use of prescription opioids (PO) has increased dramatically in North America. Special consideration for PO prescription is required for individuals in methadone maintenance treatment (MMT). Our objective is to describe the prevalence and correlates of PO use among British Columbia (BC) MMT clients from 1996-2007. Methods This study was based on a linked, population-level medication dispensation database. All individuals receiving 30 days of continuous MMT for opioid dependence were included in the study. Key measurements included the proportion of clients receiving >7 days of a PO other than methadone during MMT from 1996 to 2007. Factors independently associated with PO co-prescription during MMT were assessed using generalized linear mixed effects regression. Results 16,248 individuals with 27,919 MMT episodes at least 30 days in duration were identified for the study period. Among them, 5,552 individuals (34.2%) received a total of 290,543 PO co-prescriptions during MMT. The majority (74.3%) of all PO dispensations >7 days originated from non-MMT physicians. The number of PO prescriptions per person-year nearly doubled between 1996 and 2006, driven by increases in morphine, hydromorphone and oxycodone dispensations. PO co-prescription was positively associated with female gender, older age, higher levels of medical co-morbidity as well as higher MMT dosage, adherence, and retention. Conclusion and Scientific Significance A large proportion of MMT clients in BC received co-occurring PO prescriptions, often from physicians and pharmacies not delivering MMT. Experimental evidence for the treatment of pain in MMT clients is required to guide clinical practice. PMID:24724883

  19. Clients' experiences of treatment and recovery in borderline personality disorder: A meta-synthesis of qualitative studies.

    PubMed

    Katsakou, Christina; Pistrang, Nancy

    2017-01-31

    This review synthesized findings from qualitative studies exploring clients' experiences of their treatment for borderline personality disorder (BPD) and their perceptions of recovery. Fourteen studies were identified through searches in three electronic databases. The Critical Appraisal Skills Programme was used to appraise the methodological quality of the studies. Thematic analysis was used to synthesize the findings. The meta-synthesis identified 10 themes, grouped into 3 domains. The first domain, "Areas of change," suggests that clients make changes in four main areas: developing self-acceptance and self-confidence; controlling difficult thoughts and emotions; practising new ways of relating to others; and implementing practical changes and developing hope. The second domain, "Helpful and unhelpful treatment characteristics," highlights treatment elements that either supported or hindered recovery: safety and containment; being cared for and respected; not being an equal partner in treatment; and focusing on change. The third domain, "The nature of change," refers to clients' experience of change as an open-ended journey and a series of achievements and setbacks. The meta-synthesis highlights areas of change experienced by individuals receiving treatment for BPD, and treatment characteristics that they value. However, further research is needed to better understand how these changes are achieved. Clinical or methodological significance summary: The present qualitative meta-synthesis brings together findings from 14 qualitative studies. The emerging themes point to areas of improvement in psychological functioning that people struggling with BPD issues have identified as both important and achievable. They also highlight treatment characteristics that might facilitate change in these areas. Treatments emphasizing these characteristics, namely striking a balance between creating a safe, caring space and actively promoting change, may increase clients' motivation and engagement with services and facilitate recovery.

  20. Development of a 3D WebGIS System for Retrieving and Visualizing CityGML Data Based on their Geometric and Semantic Characteristics by Using Free and Open Source Technology

    NASA Astrophysics Data System (ADS)

    Pispidikis, I.; Dimopoulou, E.

    2016-10-01

    CityGML is considered an optimal standard for representing 3D city models. However, international experience has shown that visualization of such models is quite difficult to implement on the web, due to the large size of the data and the complexity of CityGML. In the context of this paper, a 3D WebGIS application is therefore developed in order to successfully retrieve and visualize CityGML data in accordance with their respective geometric and semantic characteristics. Furthermore, the available web technologies and the architecture of WebGIS systems are investigated, as documented in international experience, in order to be utilized in the most appropriate way for the purposes of this paper. Specifically, a PostgreSQL/PostGIS database is used, in compliance with the 3DCityDB schema. At the server tier, Apache HTTP Server and GeoServer are utilized, with PHP as the server-side programming language. At the client tier, which implements the interface of the application, the following technologies are used: jQuery, AJAX, JavaScript, HTML5, WebGL and Ol3-Cesium. Finally, it is worth mentioning that the application's primary objectives are a user-friendly interface and a fully open-source development.

Top