Sample records for microsoft standard server

  1. Software Re-Engineering of the Human Factors Analysis and Classification System - (Maintenance Extension) Using Object Oriented Methods in a Microsoft Environment

    DTIC Science & Technology

    2001-09-01

    replication) -- all from Visual Basic and VBA . In fact, we found that the SQL Server engine actually had a plethora of options, most formidable of...2002, the new SQL Server 2000 database engine, and Microsoft Visual Basic.NET. This thesis describes our use of the Spiral Development Model to...versions of Microsoft products? Specifically, the pending release of Microsoft Office 2002, the new SQL Server 2000 database engine, and Microsoft

  2. A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows

    NASA Astrophysics Data System (ADS)

    Babin, B. L.; Hu, L.

    2008-12-01

    Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) Environmental Monitoring Systems provide a unified coastal ocean observing system. These two systems are mirrored to maintain autonomy while offering an integrated data sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL Servers, and disseminate the data in real time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The utilization of Microsoft Windows technologies has presented many challenges to these observing systems as open source tools for interoperability grow. The current open source tools often require the installation of additional software. In order to make data available in common standard formats, "home grown" software has been developed. One example of this is the development of software to generate XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook to implement OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via Sensor Observation Services (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).
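
    The "home grown" XML-generation step described above maps naturally onto SQL Server's built-in XML support. Below is a minimal sketch of that idea in Transact-SQL; the Observations/Stations tables and their columns are hypothetical stand-ins, since the record does not describe the LUMCON/DISL schema or the exact NDBC message format.

        -- Emit the last hour of observations as an XML document for
        -- transmission to an external consumer such as NDBC.
        -- Table and column names are hypothetical; FOR XML PATH is the
        -- built-in serialization mechanism.
        SELECT
            s.station_id   AS "@station",
            o.obs_time     AS "time",
            o.water_temp_c AS "waterTemperature",
            o.salinity_psu AS "salinity"
        FROM dbo.Observations AS o
        JOIN dbo.Stations     AS s ON s.station_id = o.station_id
        WHERE o.obs_time >= DATEADD(hour, -1, GETUTCDATE())
        FOR XML PATH('observation'), ROOT('observations');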

  3. Risk Assessment of the Naval Postgraduate School Gigabit Network

    DTIC Science & Technology

    2004-09-01

    Management Server (1) • Ras Server (1) • Remedy Server (1) • Samba Server (2) • SQL Servers (3) • Web Servers (3) • WINS Server (1) • Library...Server Bob Sharp INCA Windows 2000 Advanced Server NPGS Landesk SQL 2000 Alan Pires eagle Microsoft Windows 2000 Advanced Server EWS NPGS Landesk...Advanced Server Special Projects NPGS SQL Alan Pires MC01BDB Microsoft Windows 2000 Advanced Server Special Projects NPGS SQL 2000 Alan Pires

  4. Electronic Mail (E-Mail) Management and Use

    DTIC Science & Technology

    1999-03-01

    age or date of birth; present or future assignments for overseas, or for routinely deployable or sensitive units; and office...Server Naming Convention. The DMS-AF primary (e.g., ESL® Primary, Lotus® Hub, and Microsoft® Bridgehead) server and its backup must conform to the 8...determine if there is some benefit to establishing standard lower-level folder names which bases should adhere to. Standard public folders offer a

  5. Research of GIS-services applicability for solution of spatial analysis tasks.

    NASA Astrophysics Data System (ADS)

    Terekhin, D. A.; Botygin, I. A.; Sherstneva, A. I.; Sherstnev, V. S.

    2017-01-01

    Experiments to determine the areas of applicability of various GIS services to spatial analysis tasks are discussed in this paper. Google Maps, Yandex Maps and Microsoft SQL Server are used as spatial analysis services. All services showed comparable speed in analyzing spatial data when carrying out elementary spatial requests (building a buffer zone around a point object), while Microsoft SQL Server proved preferable for more complicated spatial requests. For elementary spatial requests, the internet services show higher efficiency owing to client-side data handling by JavaScript subprograms. A weak point of the public internet services is the impossibility of handling data on the server side and their limited set of spatial analysis functions. Microsoft SQL Server offers a large variety of functions needed for spatial analysis on the server side. The authors conclude that, when solving practical problems, the route-building and other capabilities of the internet services should be combined with the spatial analysis capabilities of Microsoft SQL Server.
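
    The "elementary spatial request" benchmarked above (building a buffer zone around a point object) corresponds directly to SQL Server's built-in spatial methods. A minimal Transact-SQL sketch follows; the PointsOfInterest table and its columns are hypothetical, while geography::Point, STBuffer and STIntersects are standard SQL Server spatial functions.

        -- Build a 500 m buffer zone around a point object and list the
        -- stored features that fall inside it. Table and column names
        -- are hypothetical.
        DECLARE @site geography = geography::Point(47.6, -122.3, 4326);
        DECLARE @zone geography = @site.STBuffer(500);  -- metres

        SELECT p.name
        FROM dbo.PointsOfInterest AS p
        WHERE p.location.STIntersects(@zone) = 1;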

  6. Microsoft Repository Version 2 and the Open Information Model.

    ERIC Educational Resources Information Center

    Bernstein, Philip A.; Bergstraesser, Thomas; Carlson, Jason; Pal, Shankar; Sanders, Paul; Shutt, David

    1999-01-01

    Describes the programming interface and implementation of the repository engine and the Open Information Model for Microsoft Repository, an object-oriented meta-data management facility that ships in Microsoft Visual Studio and Microsoft SQL Server. Discusses Microsoft's component object model, object manipulation, queries, and information…

  7. Implementation of an Enterprise Information Portal (EIP) in the Loyola University Health System

    PubMed Central

    Price, Ronald N.; Hernandez, Kim

    2001-01-01

    Loyola University Chicago Stritch School of Medicine and Loyola University Medical Center have long histories in the development of applications to support the institutions' missions of education, research and clinical care. In late 1998, the institutions' application development group undertook an ambitious program to re-architect more than 10 years of legacy application development (30+ core applications) into a unified World Wide Web (WWW) environment. The primary project objectives were to construct an environment that would support the rapid development of n-tier, web-based applications while providing standard methods for user authentication/validation, security/access control and definition of a user's organizational context. The project's efforts resulted in Loyola's Enterprise Information Portal (EIP), which meets the aforementioned objectives. This environment: 1) allows access to other vertical Intranet portals (e.g., electronic medical record, patient satisfaction information and faculty effort); 2) supports end-user desktop customization; and 3) provides a means for standardized application "look and feel." The portal was constructed utilizing readily available hardware and software. Server hardware consists of multiprocessor (Intel Pentium, 500 MHz) Compaq 6500 servers with one gigabyte of random access memory and 75 gigabytes of hard disk storage. Microsoft SQL Server was selected to house the portal's internal (security) data structures. Netscape Enterprise Server was selected for the web server component of the environment, and Allaire's ColdFusion was chosen for the access and application tiers. Total costs for the portal environment were less than $40,000. User data storage is accomplished through two Microsoft SQL Servers and an existing Sun Microsystems enterprise server with eight processors and 750 gigabytes of disk storage running the Sybase relational database manager. Total storage capacity for all systems exceeds one terabyte. In the past 12 months, the EIP has supported the development of more than 88 applications and is utilized by more than 2,200 users.

  8. GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application

    NASA Technical Reports Server (NTRS)

    McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.

    2010-01-01

    The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real-time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well-defined, HyperText Transfer Protocol (HTTP)-based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.

  9. Karst database development in Minnesota: Design and data assembly

    USGS Publications Warehouse

    Gao, Y.; Alexander, E.C.; Tipping, R.G.

    2005-01-01

    The Karst Feature Database (KFD) of Minnesota is a relational GIS-based Database Management System (DBMS). Previous karst feature datasets used inconsistent attributes to describe karst features in different areas of Minnesota. Existing metadata were modified and standardized to provide comprehensive metadata for all the karst features in Minnesota. Microsoft Access 2000 and ArcView 3.2 were used to develop this working database. Existing county and sub-county karst feature datasets have been assembled into the KFD, which is capable of visualizing and analyzing the entire data set. By November 17, 2002, 11,682 karst features were stored in the KFD of Minnesota. Data tables are stored in a Microsoft Access 2000 DBMS and linked to corresponding ArcView applications. The current KFD of Minnesota has been moved from a Windows NT server to a Windows 2000 Citrix server accessible to researchers and planners through networked interfaces. © Springer-Verlag 2005.

  10. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699

  11. Classification of galaxy type from images using Microsoft R Server

    NASA Astrophysics Data System (ADS)

    de Vries, Andrie

    2017-06-01

    Many astronomers working in the field of AstroInformatics write code as part of their work. Although the programming language of choice is Python, a small number (8%) use R. R has specific strengths in the domain of statistics, but is often viewed as limited in the size of data it can handle. However, Microsoft R Server is a product that removes these limitations by being able to process much larger amounts of data. I present some highlights of R Server by illustrating how to fit a convolutional neural network using R. The specific task is to classify galaxies, using only images extracted from the Sloan Digital Sky Survey SkyServer.

  12. P43-S Computational Biology Applications Suite for High-Performance Computing (BioHPC.net)

    PubMed Central

    Pillardy, J.

    2007-01-01

    One of the challenges of high-performance computing (HPC) is user accessibility. At the Cornell University Computational Biology Service Unit, which is also a Microsoft HPC institute, we have developed a computational biology application suite that allows researchers from biological laboratories to submit their jobs to the parallel cluster through an easy-to-use Web interface. Through this system, we are providing users with popular bioinformatics tools including BLAST, HMMER, InterproScan, and MrBayes. The system is flexible and can be easily customized to include other software. It is also scalable; the installation on our servers currently processes approximately 8500 job submissions per year, many of them requiring massively parallel computations. It also has a built-in user management system, which can limit software and/or database access to specified users. TAIR, the major database of the plant model organism Arabidopsis, and SGN, the international tomato genome database, are both using our system for storage and data analysis. The system consists of a Web server running the interface (ASP.NET C#), a Microsoft SQL Server (ADO.NET), a compute cluster running Microsoft Windows, an FTP server, and a file server. Users can interact with their jobs and data via a Web browser, FTP, or e-mail. The interface is accessible at http://cbsuapps.tc.cornell.edu/.

  13. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    DTIC Science & Technology

    2017-02-01

    Image Processing Web Server Administration ... Fig. 18 Microsoft ASP.NET MVC 4 installation...algorithms are made into client applications that can be accessed from an image processing web service developed following Representational State...Transfer (REST) standards by a mobile app, laptop PC, and other devices. Similarly, weather tweets can be accessed via the Weather Digest Web Service

  14. Data Driven Device Failure Prediction

    DTIC Science & Technology

    2016-09-15

    Microsoft enterprise authentication service and Apache web server in an effort to increase up-time and improve mission effectiveness. These new fault loads...4.2.2 Web Server...predictor. Finally, the implementation is validated by running the same experiment on a web server. 1.1 Problem Statement According to the operational

  15. GLobal Integrated Design Environment

    NASA Technical Reports Server (NTRS)

    Kunkel, Matthew; McGuire, Melissa; Smith, David A.; Gefert, Leon P.

    2011-01-01

    The GLobal Integrated Design Environment (GLIDE) is a collaborative engineering application built to resolve the design session issues of real-time passing of data between multiple discipline experts in a collaborative environment. Utilizing Web protocols and multiple programming languages, GLIDE allows engineers to use the applications to which they are accustomed (in this case, Excel) to send and receive datasets via the Internet to a database-driven Web server. Traditionally, a collaborative design session consists of one or more engineers representing each discipline meeting together in a single location. The discipline leads exchange parameters and iterate through their respective processes to converge on an acceptable dataset. In cases in which the engineers are unable to meet, their parameters are passed via e-mail, telephone, facsimile, or even postal mail. This slow process of data exchange could stretch a design session to weeks or even months. While the iterative process remains in place, software can now exchange parameters securely and efficiently, while at the same time allowing much more information about a design session to be made available. GLIDE is written in a compilation of several programming languages, including REALbasic, PHP, and Microsoft Visual Basic. GLIDE client installers are available to download for both Microsoft Windows and Macintosh systems. The GLIDE client software is compatible with Microsoft Excel 2000 or later on Windows systems, and with Microsoft Excel X or later on Macintosh systems. GLIDE follows the Client-Server paradigm, transferring encrypted and compressed data via standard Web protocols. Currently, the engineers use Excel as a front end to the GLIDE Client, as many of their custom tools run in Excel.

  16. An Array Library for Microsoft SQL Server with Astrophysical Applications

    NASA Astrophysics Data System (ADS)

    Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.

    2012-09-01

    Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques that are beyond what is used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out-of-the-box commercial database systems perform very well in scientific applications from the perspective of data storage optimization, data retrieval, and memory management but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fixed-size arrays of all basic numeric data types can be created and manipulated efficiently. Also, the library is designed to integrate seamlessly with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transformations, can be done on-the-fly, from SQL code, inside the database server process. We are currently testing the prototype with two different scientific data sets: the Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory project will use it to store galaxy simulation data.
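
    The extension mechanism the authors build on is SQL Server's CLR integration, which lets custom-written .NET code run inside the server process. The sketch below shows only the generic registration steps; the assembly name, DLL path and dbo.FloatArray type are hypothetical, since the record does not give the Array Library's actual API.

        -- Allow CLR code to run inside the server process, register a
        -- custom-written assembly, and expose a user-defined type from it.
        -- All names and the DLL path are hypothetical.
        EXEC sp_configure 'clr enabled', 1;
        RECONFIGURE;

        CREATE ASSEMBLY ArrayLib
            FROM 'C:\lib\ArrayLib.dll'
            WITH PERMISSION_SET = UNSAFE;  -- required when calling native math libraries

        CREATE TYPE dbo.FloatArray
            EXTERNAL NAME ArrayLib.[ArrayLib.FloatArray];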

  17. Reactive Aggregate Model Protecting Against Real-Time Threats

    DTIC Science & Technology

    2014-09-01

    on the underlying functionality of three core components. • MS SQL server 2008 backend database. • Microsoft IIS running on Windows server 2008...services. The capstone tested a Linux-based Apache web server with the following software implementations: • MySQL as a Linux-based backend server for...malicious compromise. 1. Assumptions • GINA could connect to a backend MS SQL database through proper configuration of DotNetNuke. • GINA had access

  18. Application of Microsoft's ActiveX and DirectX technologies to the visualization of physical system dynamics

    NASA Astrophysics Data System (ADS)

    Mann, Christopher; Narasimhamurthi, Natarajan

    1998-08-01

    This paper discusses a specific implementation of a web- and component-based simulation system. The overall simulation container is implemented within a web page viewed with Microsoft's Internet Explorer 4.0 web browser. Microsoft's ActiveX/Distributed Component Object Model object interfaces are used in conjunction with the Microsoft DirectX graphics APIs to provide visualization functionality for the simulation. The MathWorks' Matlab computer-aided control system design program is used as an ActiveX automation server to provide the compute engine for the simulations.

  19. First-year dental students' motivation and attitudes for choosing the dental profession.

    PubMed

    Avramova, Nadya; Yaneva, Krassimira; Bonev, Boyko

    2014-01-01

    To determine first-year dental students' current motivation and attitudes for choosing the dental profession at the Faculty of Dental Medicine, Medical University - Sofia, Bulgaria. An anonymous questionnaire, consisting of 12 questions about students' socio-demographic profile and their motivation for choosing dentistry, was administered to 119 first-year dental students at the Faculty of Dental Medicine of the Medical University of Sofia. The study was conducted at the beginning of the 2012-2013 academic year. The data were processed and analyzed with the following software: Microsoft Windows Server 2008 R2; Microsoft SQL Server 2008; Internet Information Server 7.5; Microsoft SharePoint Server 2010. The majority of the students (73%) were self-motivated in choosing dentistry as a career; 61% of them did not have relatives in the medical profession; 43% chose dental medicine because it is a prestigious, humane and noble profession; 50% for financial security; 59% because of the independence that it provides. There were no significant differences in motivation between males and females. Independence, financial security and 'prestige' were the predominant motivating factors in this group of first-year dental students. Determining the reasons for choosing dentistry has important implications for the selection and training of students as well as for their future job satisfaction. Copyright © 2014 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  20. Open Scenario Study, Phase II Report: Assessment and Development of Approaches for Satisfying Unclassified Scenario Needs

    DTIC Science & Technology

    2010-01-01

    interface, another providing the application logic (a program used to manipulate the data), and a server running Microsoft SQL Server or Oracle RDBMS... Oracle ) • Mysql (Open Source) • Other What application server software will be needed? • Application Server • CGI PHP/Perl (Open Source...are used throughout DoD and serve a variety of functions. While DoD has a codified and institutionalized process for the development of a common set

  21. A Tools-Based Approach to Teaching Data Mining Methods

    ERIC Educational Resources Information Center

    Jafar, Musa J.

    2010-01-01

    Data mining is an emerging field of study in Information Systems programs. Although the course content has been streamlined, the underlying technology is still in a state of flux. The purpose of this paper is to describe how we utilized Microsoft Excel's data mining add-ins as a front-end to Microsoft's Cloud Computing and SQL Server 2008 Business…

  22. Assessment & Commitment Tracking System (ACTS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryant, Robert A.; Childs, Teresa A.; Miller, Michael A.

    2004-12-20

    The ACTS computer code provides a centralized tool for planning and scheduling assessments, tracking and managing actions associated with assessments or that result from an event or condition, and "mining" data for reporting and analyzing information for improving performance. The ACTS application is designed to work with the MS SQL database management system. All database interfaces are written in SQL. The following software is used to develop and support the ACTS application: Cold Fusion, HTML, JavaScript, Quest TOAD, Microsoft Visual Source Safe (VSS), HTML Mailer (for sending email), Microsoft SQL, and Microsoft Internet Information Server.

  23. A mobile information management system used in textile enterprises

    NASA Astrophysics Data System (ADS)

    Huang, C.-R.; Yu, W.-D.

    2008-02-01

    The mobile information management system (MIMS) for textile enterprises is based on Microsoft Visual Studio .NET 2003, Microsoft SQL Server 2000, the C++ language, and wireless application protocol (WAP) and wireless markup language (WML) technology. The portable MIMS is composed of a three-layer structure, i.e. a presentation layer, an operation layer and a data access layer, corresponding to the port-link module, the processing module and the database module. By using the MIMS, not only does information exchange become more convenient and easier, but the large information capacity of the system is also reconciled with the small size of a cell phone, with functional expansion in operation and design realized by means of built-in units. The developed MIMS is suitable for use in textile enterprises.

  24. Managing Attribute-Value Clinical Trials Data Using the ACT/DB Client-Server Database System

    PubMed Central

    Nadkarni, Prakash M.; Brandt, Cynthia; Frawley, Sandra; Sayward, Frederick G.; Einbinder, Robin; Zelterman, Daniel; Schacter, Lee; Miller, Perry L.

    1998-01-01

    ACT/DB is a client-server database application for storing clinical trials and outcomes data, which is currently undergoing initial pilot use. It stores most of its data in entity-attribute-value form. Such data are segregated according to data type to allow indexing by value when possible, and binary large object data are managed in the same way as other data. ACT/DB lets an investigator design a study rapidly by defining the parameters (or attributes) that are to be gathered, as well as their logical grouping for purposes of display and data entry. ACT/DB generates customizable data entry. The data can be viewed through several standard reports as well as exported as text to external analysis programs. ACT/DB is designed to encourage reuse of parameters across multiple studies and has facilities for dictionary search and maintenance. It uses a Microsoft Access client running on Windows 95 machines, which communicates with an Oracle server running on a UNIX platform. ACT/DB is being used to manage the data for seven studies in its initial deployment. PMID:9524347
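
    The entity-attribute-value layout described above, with values segregated by data type so that each value column can be indexed, can be sketched in a few tables. The schema below is a minimal illustration with hypothetical names, not ACT/DB's actual design.

        -- One metadata row per study parameter (attribute), and one
        -- value table per data type so the value column is indexable.
        -- All names are hypothetical.
        CREATE TABLE Attribute (
            attr_id   INT PRIMARY KEY,
            name      VARCHAR(64) NOT NULL,
            data_type CHAR(1) NOT NULL   -- 'N' numeric, 'S' string, 'B' binary, ...
        );

        CREATE TABLE ValueNumeric (
            entity_id INT NOT NULL,      -- e.g. one patient event
            attr_id   INT NOT NULL REFERENCES Attribute(attr_id),
            val       FLOAT NOT NULL,
            PRIMARY KEY (entity_id, attr_id)
        );

        -- Index by value, which the type-segregated layout makes possible.
        CREATE INDEX ix_valnum_value ON ValueNumeric (attr_id, val);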

  25. Windows Terminal Servers Orchestration

    NASA Astrophysics Data System (ADS)

    Bukowiec, Sebastian; Gaspar, Ricardo; Smith, Tim

    2017-10-01

    Windows Terminal Servers provide application gateways for various parts of the CERN accelerator complex, used by hundreds of CERN users every day. The combination of new tools such as Puppet, HAProxy and the Microsoft System Center suite enables automation of provisioning workflows to provide a terminal server infrastructure that can scale up and down in an automated manner. The orchestration not only reduces the time and effort necessary to deploy new instances, but also facilitates operations such as patching, analysis and recreation of compromised nodes, as well as catering for workload peaks.

  26. An electronic thesaurus of Evidence Based Laboratory Medicine hematological and biochemical diagnostic tests.

    PubMed

    Dorizzi, R M; Maconi, M; Giavarina, D; Loza, G; Aman, M; Moreira, J; Bisoffi, Z; Gennuso, C

    2009-10-01

    The adoption of Evidence Based Laboratory Medicine (EBLM) has until today been hampered by the lack of effective tools. The SIMeL EBLM e-Thesaurus (an on-line repertoire of the diagnostic effectiveness of laboratory, radiology and cardiology tests) provides useful support to clinical laboratory professionals and to clinicians in the interpretation of diagnostic tests. The e-Thesaurus is an application developed using Microsoft Active Server Pages technology, deployed on the Microsoft Internet Information Server web server, and is available at the SIMeL website using a browser running JavaScript scripts (Internet Explorer is recommended). It contains a database (in Italian, English and Spanish) of the sensitivity and specificity (including the 95% confidence intervals), the positive and negative likelihood ratios, the Diagnostic Odds Ratio and the Number Needed to Diagnose of more than 2000 diagnostic tests (mostly laboratory, but also cardiology and radiology). The e-Thesaurus improves on the previous SIMeL paper and CD Thesaurus; its main features are search in three languages and a capability for continuous, easy updating.
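
    For reference, the stored measures are related by the standard definitions (a restatement of textbook formulas, not content from the record itself):

        LR+ = sensitivity / (1 - specificity)
        LR- = (1 - sensitivity) / specificity
        DOR = LR+ / LR-                            (Diagnostic Odds Ratio)
        NND = 1 / (sensitivity + specificity - 1)  (Number Needed to Diagnose)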

  27. Mobile Monitoring Stations and Web Visualization of Biotelemetric System - Guardian II

    NASA Astrophysics Data System (ADS)

    Krejcar, Ondrej; Janckulik, Dalibor; Motalova, Leona; Kufel, Jan

    The main area of interest of our project is to provide a solution which can be used in different areas of health care and which will be available through PDAs (Personal Digital Assistants), web browsers or desktop clients. The realized system deals with an ECG sensor connected to mobile equipment, such as a PDA/Embedded device, based on the Microsoft Windows Mobile operating system. The whole system is based on the architecture of the .NET Compact Framework and Microsoft SQL Server. Visualization possibilities for the web interface and ECG data are also discussed, and a Microsoft Silverlight solution is finally suggested, along with current screenshots of the implemented solution. The project was successfully tested in a real environment in a cryogenic room (−136 °C).

  28. Do-It-Yourself: A Special Library's Approach to Creating Dynamic Web Pages Using Commercial Off-The-Shelf Applications

    NASA Technical Reports Server (NTRS)

    Steeman, Gerald; Connell, Christopher

    2000-01-01

    Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from managing time to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals and lessons learned in the Web-to-database process (including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding the Access 97 query language vs. Structured Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.
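
    The Access-vs-SQL point above is concrete: Access 97's native query language uses different LIKE wildcards than standard SQL, so queries often needed rewriting for the Web-to-database layer. A brief illustration, with a hypothetical table:

        -- Access 97 query language: * and ? are the LIKE wildcards.
        SELECT Title FROM Reports WHERE Author LIKE 'Sm*';

        -- Standard SQL, as expected through ODBC/ASP: % and _ instead.
        SELECT Title FROM Reports WHERE Author LIKE 'Sm%';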

  29. Development of a mobile emergency patient information and imaging communication system based on CDMA-1X EVDO

    NASA Astrophysics Data System (ADS)

    Yang, Keon Ho; Jung, Haijo; Kang, Won-Suk; Jang, Bong Mun; Kim, Joong Il; Han, Dong Hoon; Yoo, Sun-Kook; Yoo, Hyung-Sik; Kim, Hee-Joung

    2006-03-01

    The wireless mobile service with a high bit rate using CDMA-1X EVDO is now widely used in Korea. Mobile devices are also increasingly being used as a conventional communication mechanism. We have developed a web-based mobile system that communicates patient information and images, using CDMA-1X EVDO for emergency diagnosis. It is composed of a mobile web application system using Microsoft Windows 2003 Server and Internet Information Services. A mobile web PACS, used as a database managing patient information and images, was developed using Microsoft Access 2003. The wireless mobile emergency patient information and imaging communication system was developed using Microsoft Visual Studio .NET, and a JPEG 2000 ActiveX control for the PDA phone was developed using Microsoft Embedded Visual C++. The CDMA-1X EVDO is used for connections between the mobile web servers and the PDA phone. This system allows fast access to the patient information database, storing both medical images and patient information, anytime and anywhere. In particular, images were compressed into the JPEG 2000 format and transmitted from a mobile web PACS inside the hospital to a radiologist using a PDA phone located outside the hospital. The system also shows radiological images as well as physiological signal data, including blood pressure, vital signs and so on, in the web browser of the PDA phone so radiologists can diagnose more effectively. Good results were obtained with an RW-6100 PDA phone in the university hospital system of the Sinchon Severance Hospital in Korea.

  30. Electronic Attack Platform Placement Optimization

    DTIC Science & Technology

    2014-09-01

    Processing in VBA ... 2. Client-Server Using Two Different Excel Application...Figure 3. Screenshot of the VBA IDE contained within all Microsoft Office products...application using MS Excel's Application.OnTime method. ... Figure 20. WINSOCK API Functions needed to use TCP via VBA

  31. HydroDesktop: An Open Source GIS-Based Platform for Hydrologic Data Discovery, Visualization, and Analysis

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.

    2010-12-01

    A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web services based queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party collaborators include functions for data search and discovery, extensive graphing, data editing and export, HydroServer exploration, integration with the OpenMI workflow and modeling system, and an interface for data analysis through the R statistical package.

  32. [Automated anesthesia record system].

    PubMed

    Zhu, Tao; Liu, Jin

    2005-12-01

    Based on a Client/Server architecture, automated anesthesia record software running under the Windows operating system on a network has been developed and programmed with Microsoft Visual C++ 6.0, Visual Basic 6.0 and SQL Server. The system can deal with the patient's information throughout the anesthesia. It can collect and integrate data from several kinds of medical equipment, such as monitors, infusion pumps and anesthesia machines, automatically and in real time. The system then presents the anesthesia sheets automatically. The record system makes the anesthesia record more accurate and complete and can raise the anesthesiologist's working efficiency.

  33. Social Impacts Module (SIM) Transition

    DTIC Science & Technology

    2012-09-28

    user String The authorized user's name to access the PAVE database. Applies only to Microsoft SQL Server; leave blank, otherwise. passwd String The password if an authorized user's name is required; otherwise, leave blank. driver String The class name for the driver to

  34. HPC: Rent or Buy

    ERIC Educational Resources Information Center

    Fredette, Michelle

    2012-01-01

    "Rent or buy?" is a question people ask about everything from housing to textbooks. It is also a question universities must consider when it comes to high-performance computing (HPC). With the advent of Amazon's Elastic Compute Cloud (EC2), Microsoft Windows HPC Server, Rackspace's OpenStack, and other cloud-based services, researchers now have…

  35. Aviation Environmental Design Tool (AEDT): Version 2c Service Pack 1: installation guide.

    DOT National Transportation Integrated Search

    2016-12-01

    This document provides detailed instructions on how to install and run AEDT 2c Service Pack 1 (SP1). It is important to follow the installation instructions in the order listed below, as Microsoft SQL Server 2008 R2 is a prerequisite for AEDT. Instal...

  36. Investigating Uses and Perceptions of an Online Collaborative Workspace for the Dissertation Process

    ERIC Educational Resources Information Center

    Rockinson-Szapkiw, Amanda J.

    2012-01-01

    The intent of this study was to investigate 93 doctoral candidates' perceptions and use of an online collaboration workspace and content management server, Microsoft Office SharePoint, for dissertation process. All candidates were enrolled in an Ed.D. programme in the United States. Descriptive statistics demonstrate that candidates frequently use…

  37. Multimedia data repository for the World Wide Web

    NASA Astrophysics Data System (ADS)

    Chen, Ken; Lu, Dajin; Xu, Duanyi

    1998-08-01

    This paper introduces the design and implementation of a Multimedia Data Repository serving as a multimedia information system, which provides users a Web-accessible, platform-independent interface to query, browse, and retrieve multimedia data such as images, graphics, audio, and video from a large multimedia data repository. By integrating the multimedia DBMS, in which the textual information and samples of the multimedia data are organized and stored, and the Web server together into the Microsoft ActiveX Server Framework, users can access the DBMS and query the information by simply using a Web browser at the client side. The original multimedia data can then be located and transmitted through the Internet from the tertiary storage device, a 400-disc CD-ROM optical jukebox at the server side, to the client side for further use.

  38. A mobile field-work data collection system for the wireless era of health surveillance.

    PubMed

    Forsell, Marianne; Sjögren, Petteri; Renard, Matthew; Johansson, Olle

    2011-03-01

    In many countries or regions the capacity of health care resources is below the needs of the population and new approaches for health surveillance are needed. Innovative projects, utilizing wireless communication technology, contribute to reliable methods for field-work data collection and reporting to databases. The objective was to describe a new version of a wireless IT-support system for field-work data collection and administration. The system requirements were drawn from the design objective and translated to system functions. The system architecture was based on fieldwork experiences and administrative requirements. The Smartphone devices were HTC Touch Diamond2s, while the system was based on a platform with Microsoft .NET components, and a SQL Server 2005 with Microsoft Windows Server 2003 operating system. The user interfaces were based on .NET programming, and Microsoft Windows Mobile operating system. A synchronization module enabled download of field data to the database, via a General Packet Radio Services (GPRS) to a Local Area Network (LAN) interface. The field-workers considered the here-described applications user-friendly and almost self-instructing. The office administrators considered that the back-office interface facilitated retrieval of health reports and invoice distribution. The current IT-support system facilitates short lead times from fieldwork data registration to analysis, and is suitable for various applications. The advantages of wireless technology, and paper-free data administration need to be increasingly emphasized in development programs, in order to facilitate reliable and transparent use of limited resources.

  19. "Just Another Tool for Online Studies” (JATOS): An Easy Solution for Setup and Management of Web Servers Supporting Online Studies

    PubMed Central

    Lange, Kristian; Kühn, Simone; Filevich, Elisa

    2015-01-01

    We present here “Just Another Tool for Online Studies” (JATOS): an open source, cross-platform web application with a graphical user interface (GUI) that greatly simplifies setting up and communicating with a web server to host online studies that are written in JavaScript. JATOS is easy to install in all three major platforms (Microsoft Windows, Mac OS X, and Linux), and seamlessly pairs with a database for secure data storage. It can be installed on a server or locally, allowing researchers to try the application and feasibility of their studies within a browser environment, before engaging in setting up a server. All communication with the JATOS server takes place via a GUI (with no need to use a command line interface), making JATOS an especially accessible tool for researchers without a strong IT background. We describe JATOS’ main features and implementation and provide a detailed tutorial along with example studies to help interested researchers to set up their online studies. JATOS can be found under the Internet address: www.jatos.org. PMID:26114751

  40. Information Management System Development for the Investigation, Reporting, and Analysis of Human Error in Naval Aviation Maintenance

    DTIC Science & Technology

    2001-09-01

    of MEIMS was programmed in Microsoft Access 97 using Visual Basic for Applications (VBA). This prototype had very little documentation. The FAA...using Access 2000 as an interface and SQL Server as the database engine. Question 1: Did you have any problems accessing the program? Y / N

  41. South Carolina's SC LENDS: Optimizing Libraries, Transforming Lending

    ERIC Educational Resources Information Center

    Hamby, Rogan; McBride, Ray; Lundberg, Maria

    2011-01-01

    Since SC LENDS started operating in June 2009, more public libraries have come on board. All of this on the back end connects to a Mozilla-based staff client that has distributions for Mac OS X and Microsoft Windows, using SSL encryption to keep communications secure and private between remote libraries and the servers hosted at a high-end…

  42. Disaster recovery plan for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, D.E.

    The BMS production implementation will be complete by October 1, 1998, and the server environment will be comprised of two types of platforms. The PassPort Supply and the PeopleSoft Financials will reside on UNIX servers, and the PeopleSoft Human Resources and Payroll will reside on Microsoft NT servers. Because of the wide scope and the requirements of the COTS products to run in various environments, backup and recovery responsibilities are divided between two groups in Technical Operations. The Central Computer Systems Management group provides support for the UNIX/NT Backup Data Center, and the Network Infrastructure Systems group provides support for the NT Application Server Backup outside the Data Center. The disaster recovery process is dependent on a good backup and recovery process. Information and integrated system data for determining the disaster recovery process are identified from the Fluor Daniel Hanford (FDH) Risk Assessment Plan, Contingency Plan, Backup and Recovery Plan, and Backup Form for HANDI 2000 BMS.

  43. The personal receiving document management and the realization of email function in OAS

    NASA Astrophysics Data System (ADS)

    Li, Biqing; Li, Zhao

    2017-05-01

    This software is an independent software system suitable for small and medium enterprises; it contains personal office, scientific research project management and system management functions, runs independently in the relevant environment, and meets practical needs. It is developed with the currently popular B/S (browser/server) structure and ASP.NET technology, using the Windows 7 operating system, with Microsoft Visual Studio 2008 and a Microsoft SQL Server 2005 database as the development platform.

  44. CIS4/403: Design and Implementation of an Intranet-based system for Real-Time Tele-Consultation in Oncology

    PubMed Central

    Eccher, C; Berloffa, F; Demichelis, F; Larcher, B; Galvagni, M; Sboner, A; Graiff, A; Forti, S

    1999-01-01

    Introduction This study describes a tele-consultation system (TCS) developed to provide a computing environment over a Wide Area Network (WAN) in North Italy (Province of Trento) that can be used by two or more physicians to share medical data and to work co-operatively on medical records. A pilot study has been carried out in oncology to assess the effectiveness of the system. The aim of this project is to facilitate the management of oncology patients by improving communication among the specialists of central and district hospitals. Methods and Results The TCS is an Intranet-based solution. The Intranet is based on a PC WAN with Windows NT Server, Microsoft SQL Server, and Internet Information Server. TCS is composed of native and custom applications developed in the Microsoft Windows (9x and NT) environment. The basic component of the system is the multimedia digital medical record, structured as a collection of HTML and ASP pages. A distributed relational database will allow users to store and retrieve medical records, accessed by a dedicated Web browser via the Web Server. The medical data to be stored and the presentation architecture of the clinical record were determined in close collaboration with the clinicians involved in the project. TCS will allow a multi-point tele-consultation (TC) among two or more participants on remote computers, providing synchronized surfing through the clinical report. A set of collaborative and personal tools, a whiteboard with drawing tools, point-to-point digital audio-conference, chat, local notepad, and e-mail service are integrated in the system to provide a user-friendly environment. TCS has been developed as a client-server architecture. The client part of the system is based on the Microsoft Web Browser control and provides the user interface and the tools described above. The server part, running all the time on a dedicated computer, accepts connection requests and manages the connections among the participants in a TC, allowing multiple TCs to run simultaneously. TCS has been developed in the Visual C++ environment using the MFC library and COM technology; ActiveX controls have been written in Visual Basic to perform dedicated tasks from inside the HTML clinical report. Before deploying the system in the hospital departments involved in the project, TCS was tested in our laboratory by clinicians involved in the project to evaluate the usability of the system. Discussion TCS has the potential to support a "multi-disciplinary distributed virtual oncological meeting". The specialists of different departments and of different hospitals can attend "virtual meetings" and interactively discuss medical data. An expected benefit of the "virtual meeting" is the possibility of providing expert remote advice from oncologists to peripheral cancer units in formulating treatment plans, conducting follow-up sessions and supporting clinical research.

  45. Land Use and Land Cover Maps of Europe: A WebGIS Platform

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Fahl, F. C.; Minghini, M.; Molinari, M. E.

    2016-06-01

    This paper presents the methods and implementation processes of a WebGIS platform designed to publish the available land use and land cover maps of Europe at continental scale. The system is built completely on open source infrastructure and open standards. The proposed architecture is based on a server-client model having GeoServer as the map server, Leaflet as the client-side mapping library and the Bootstrap framework at the core of the front-end user interface. The web user interface is designed to have typical features of a desktop GIS (e.g. activate/deactivate layers and order layers by drag and drop actions) and to show specific information on the activated layers (e.g. legend and simplified metadata). Users have the possibility to change the base map from a given list of map providers (e.g. OpenStreetMap and Microsoft Bing) and to control the opacity of each layer to facilitate the comparison with both other land cover layers and the underlying base map. In addition, users can add to the platform any custom layer available through a Web Map Service (WMS) and activate the visualization of photos from popular photo sharing services. This last functionality is provided in order to have a visual assessment of the available land coverages based on other user-generated contents available on the Internet. It is supposed to be a first step towards a calibration/validation service that will be made available in the future.

  46. Evaluation of Sub Query Performance in SQL Server

    NASA Astrophysics Data System (ADS)

    Oktavia, Tanty; Sujarwo, Surya

    2014-03-01

    The paper explores several sub query methods used in a query and their impact on query performance. The study uses an experimental approach to evaluate the performance of each sub query method combined with an indexing strategy. The sub query methods consist of in, exists, relational operator, and relational operator combined with top operator. The experiments show that using a relational operator combined with an indexing strategy in a sub query gives greater performance than the same method without an indexing strategy, as well as the other methods. In summary, for applications that emphasize the performance of retrieving data from a database, it is better to use a relational operator combined with an indexing strategy. This study was done on Microsoft SQL Server 2012.
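
    The compared methods translate into Transact-SQL as follows. This is a minimal sketch against a hypothetical Orders/Customers schema, not the study's actual test database; the index illustrates the indexing strategy the abstract refers to.

        -- Supporting index for the indexing strategy (hypothetical schema).
        CREATE INDEX ix_customers_city ON Customers (city, cust_id);

        -- (1) in
        SELECT * FROM Orders
        WHERE cust_id IN (SELECT cust_id FROM Customers WHERE city = 'Jakarta');

        -- (2) exists
        SELECT * FROM Orders AS o
        WHERE EXISTS (SELECT 1 FROM Customers AS c
                      WHERE c.cust_id = o.cust_id AND c.city = 'Jakarta');

        -- (3) relational operator combined with the top operator
        SELECT * FROM Orders AS o
        WHERE o.cust_id = (SELECT TOP 1 c.cust_id FROM Customers AS c
                           WHERE c.cust_id = o.cust_id AND c.city = 'Jakarta');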

  47. [Development of expert diagnostic system for common respiratory diseases].

    PubMed

    Xu, Wei-hua; Chen, You-ling; Yan, Zheng

    2014-03-01

    To develop an internet-based expert diagnostic system for common respiratory diseases. A SaaS architecture was used to build the system; a forward-reasoning pattern was applied for the inference engine design; and ASP.NET with C#, from the tool pack of Microsoft Visual Studio 2005, was used for the web-based medical expert system. The database of the system was constructed with Microsoft SQL Server 2005. The developed expert system provides large data storage and highly efficient data access and analysis functions for the diagnosis of various diseases. Users can operate the system via the internet to obtain diagnoses of common respiratory diseases. The developed expert system may be used for internet-based diagnosis of various respiratory diseases, particularly in a telemedicine setting.

  48. Distributed On-line Monitoring System Based on Modem and Public Phone Net

    NASA Astrophysics Data System (ADS)

    Chen, Dandan; Zhang, Qiushi; Li, Guiru

    In order to solve the monitoring problem of urban sewage disposal, a distributed on-line monitoring system is proposed. By introducing modem-based dial-up communication technology, the serial communication program rationally solves the information transmission problem between the master station and the slave stations. The serial communication program is realized with the MSComm control of C++ Builder 6.0. The software includes a real-time data operation part and a history data handling part, using Microsoft SQL Server 2000 for the database and C++ Builder 6.0 for the user interface. The monitoring center displays a user interface with alarm information for out-of-limit data and real-time curves. Practical application shows that the system successfully accomplishes real-time data acquisition from the data gathering stations and stores the data in the terminal database.

  49. A generic minimization random allocation and blinding system on web.

    PubMed

    Cai, Hongwei; Xia, Jielai; Xu, Dezhong; Gao, Donghuai; Yan, Yongping

    2006-12-01

    Minimization is a dynamic randomization method for clinical trials. Although recommended by many researchers, the utilization of minimization has seldom been reported in randomized trials, mainly because of the controversy surrounding the validity of conventional analyses and its complexity in implementation. However, both the statistical and clinical validity of minimization were demonstrated in recent studies. A minimization random allocation system with an integrated blinding function that could facilitate the implementation of this method in general clinical trials has not been reported. SYSTEM OVERVIEW: The system is a web-based random allocation system using the Pocock and Simon minimization method. It also supports multiple treatment arms within a trial, multiple simultaneous trials, and blinding without further programming. This system was constructed with a generic database schema design method, the Pocock and Simon minimization method and a blinding method. It was coded in the Microsoft Visual Basic and Active Server Pages (ASP) programming languages, and all datasets were managed with a Microsoft SQL Server database. Some critical programming code is also provided. SIMULATIONS AND RESULTS: Two clinical trials were simulated simultaneously to test the system's applicability. Not only balanced groups but also blinded allocation results were achieved in both trials. Practical considerations for the minimization method, and the benefits, general applicability and drawbacks of the technique implemented in this system, are discussed. Promising features of the proposed system are also summarized.
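
    As background, the Pocock and Simon method chooses each allocation by comparing hypothetical imbalances (a standard restatement, not the paper's own notation): for each candidate treatment t, compute G(t) = sum over prognostic factors f of w_f * d_f(t), where d_f(t) is the imbalance of the marginal treatment totals at the new patient's level of factor f if that patient were assigned to t, and w_f is the weight of factor f. The patient is then assigned, with high probability, to the treatment with the smallest G(t).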

  50. A personal digital assistant application (MobilDent) for dental fieldwork data collection, information management and database handling.

    PubMed

    Forsell, M; Häggström, M; Johansson, O; Sjögren, P

    2008-11-08

    To develop a personal digital assistant (PDA) application for oral health assessment fieldwork, including back-office and database systems (MobilDent). System design, construction and implementation of PDA, back-office and database systems. System requirements for MobilDent were collected, analysed and translated into system functions. User interfaces were implemented and the system architecture was outlined. MobilDent was based on a platform with .NET (Microsoft) components, using SQL Server 2005 (Microsoft) for data storage with the Windows Mobile (Microsoft) operating system. The PDA devices were Dell Axim. System functions and user interfaces were specified for MobilDent. User interfaces for the PDA, back-office and database systems were based on .NET programming. The PDA user interface was Windows-based and suited to a PDA display, whereas the back-office interface was designed for a normal-sized computer screen. A synchronisation module (MS ActiveSync, Microsoft) was used to enable download of field data from the PDA to the database. MobilDent is a feasible application for oral health assessment fieldwork, and the oral health assessment database may prove a valuable source for care planning, educational and research purposes. Further development of the MobilDent system will include wireless connectivity with download-on-demand technology.

  11. Exploring the Cost and Functionality of MEDCOM Web Services

    DTIC Science & Technology

    2005-10-24

    Software Name 24. What backend database software supports your intranet/Internet content? (check all that apply): o Oracle o Microsoft SQL Server ...Department of Defense (DoD) service branches, which funded and deployed an Internet portal, TRICARE Online, to serve as an information conduit between the...public website, the information contained on the intranet is traditionally limited to the members of the hosting command. The local information serves as

  12. An Automated Solution to the Multiuser Carved Data Ascription Problem

    DTIC Science & Technology

    2010-12-01

    computer might have several authorized users. It is also common in many families, as well as in libraries, hospitals, and Internet cafes. Another way for...starting disk sector number were used in preference to features such as the Microsoft Office embedded "Creator" attribute. We believe that this is...with exemplars in a reference collection. 5) Validation Server: Although the technique presented in this paper is effective, it is time consuming to

  13. CIS3/398: Implementation of a Web-Based Electronic Patient Record for Transplant Recipients

    PubMed Central

    Fritsche, L; Lindemann, G; Schroeter, K; Schlaefer, A; Neumayer, H-H

    1999-01-01

    Introduction: While the "Electronic Patient Record" (EPR) is a frequently quoted term in many areas of healthcare, only few working EPR systems are available so far. To justify their use, EPRs must be able to store and display all kinds of medical information in a reliable, secure, time-saving, user-friendly way at an affordable price. Fields in which patients are attended to by a large number of medical specialists over a prolonged period of time are best suited to demonstrate the potential benefits of an EPR. The aim of our project was to investigate the feasibility of an EPR based solely on "off-the-shelf" software and Internet technology in the field of organ transplantation. Methods: The EPR system consists of three main elements: data-storage facilities, a web server and a user interface. Data are stored either in a relational database (Sybase Adaptive 11.5, Sybase Inc., CA) or, in the case of pictures (JPEG) and files in application formats (e.g. Word documents), on a Windows NT 4.0 Server (Microsoft Corp., WA). The entire communication of all data is handled by a web server (IIS 4.0, Microsoft) with an Active Server Pages extension. The database is accessed by ActiveX Data Objects via the ODBC interface. The only software required on the user's computer is Internet Explorer 4.01 (Microsoft); during the first use of the EPR, the ActiveX HTML Layout Control is automatically added. The user can access the EPR via Local or Wide Area Network or by dial-up connection. If the EPR is accessed from outside the firewall, all communication is encrypted (SSL 3.0, Netscape Comm. Corp., CA). The speed of the EPR system was tested with 50 repeated measurements of the duration of two key functions: 1) display of all lab results for a given day and patient, and 2) automatic composition of a letter containing diagnoses, medication, notes and lab results. For the test, a 233 MHz Pentium II processor with a 10 Mbit/s Ethernet connection (ping time below 10 ms) over 2 hubs to the server (400 MHz Pentium II, 256 MB RAM) was used. Results: So far the EPR system has been running for eight consecutive months and contains complete records of 673 transplant recipients with an average follow-up of 9.9 (SD: 4.9) years and a total of 1.1 million lab values. Instructing new users to perform basic operations took less than two hours in all cases. The average duration of laboratory access was 0.9 (SD: 0.5) seconds; the automatic composition of a letter took 6.1 (SD: 2.4) seconds. Apart from the database and Windows NT, all other components are available for free. The development of the EPR system required less than two person-years. Conclusion: Implementation of an electronic patient record that meets the requirements of comprehensiveness, reliability, security, speed, user-friendliness and affordability using a combination of "off-the-shelf" software products can be feasible if the current state-of-the-art Internet technology is applied.

  14. Database Reports Over the Internet

    NASA Technical Reports Server (NTRS)

    Smith, Dean Lance

    2002-01-01

    Most of the summer was spent developing software that permits existing test report forms to be printed over the web on a printer supported by Adobe Acrobat Reader. The data is stored in a DBMS (Database Management System). The client asks for the information from the database using an HTML (Hypertext Markup Language) form in a web browser. JavaScript is used with the forms to assist the user and verify the integrity of the entered data. Queries to a database are made in SQL (Structured Query Language), a widely supported standard for making queries to databases. Java servlets, programs written in the Java programming language running under the control of network server software, interrogate the database and complete a PDF form template kept in a file. The completed report is sent to the browser requesting the report. Some errors are sent to the browser in an HTML web page; others are reported to the server. Access to the databases was restricted since the data are being transported to new DBMS software that will run on new hardware. However, the SQL queries were made to Microsoft Access, a DBMS that is available on most PCs (personal computers). Access does support the SQL commands that were used, and a database was created with Access that contained typical data for the report forms. Some of the problems and features are discussed below.
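
    The flow described here (HTML form to server, SQL query, filled report template) can be sketched compactly. The Python fragment below mimics that flow under stated assumptions: an HTTP handler stands in for the Java servlet, sqlite3 for Access, and an HTML template in place of the PDF form; the table and field names are hypothetical.

      # Minimal sketch of the report flow: browser form -> server -> SQL
      # query -> filled report template. Table and fields are illustrative.
      import sqlite3
      from http.server import BaseHTTPRequestHandler, HTTPServer
      from urllib.parse import urlparse, parse_qs

      TEMPLATE = "<html><body><h1>Test Report {test_id}</h1><p>Result: {result}</p></body></html>"

      class ReportHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              qs = parse_qs(urlparse(self.path).query)
              test_id = qs.get("test_id", ["?"])[0]
              db = sqlite3.connect("reports.db")
              row = db.execute("SELECT result FROM tests WHERE id = ?",
                               (test_id,)).fetchone()
              db.close()
              if row is None:   # report errors back as an HTML page
                  body, status = "<html><body>No such test.</body></html>", 404
              else:
                  body, status = TEMPLATE.format(test_id=test_id, result=row[0]), 200
              self.send_response(status)
              self.send_header("Content-Type", "text/html")
              self.end_headers()
              self.wfile.write(body.encode())

      # HTTPServer(("", 8080), ReportHandler).serve_forever()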

  15. Automatic management system for dose parameters in interventional radiology and cardiology.

    PubMed

    Ten, J I; Fernandez, J M; Vaño, E

    2011-09-01

    The purpose of this work was to develop an automatic management system to archive and analyse the major study parameters and patient doses for fluoroscopy-guided procedures performed on cardiology and interventional radiology systems. The X-ray systems used for this trial can export, at the end of each procedure and via e-mail, the technical parameters of the study and the patient dose values. An application was developed to query a mail server, retrieve all study reports sent by the imaging modalities, and store them in a Microsoft SQL Server database. The results from 3538 interventional study reports generated by 7 interventional systems were processed. Alarms were added for certain technical parameters and patient dose values so as to receive malfunction alerts and immediately take appropriate corrective actions.
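
    The collection loop described here is easy to sketch. The Python fragment below illustrates the same fetch-and-archive pattern under stated assumptions: the host, credentials, folder and report format are hypothetical, and sqlite3 stands in for the SQL Server database used by the actual system.

      # Sketch of the report-collection loop: fetch study reports mailed by
      # the X-ray systems and archive them for analysis. Host, credentials,
      # and report format are illustrative assumptions.
      import email
      import imaplib
      import sqlite3

      def collect(host="imap.example.org", user="dose", password="secret"):
          db = sqlite3.connect("dose.db")
          db.execute("CREATE TABLE IF NOT EXISTS reports (subject TEXT, body TEXT)")
          box = imaplib.IMAP4_SSL(host)
          box.login(user, password)
          box.select("INBOX")
          _, data = box.search(None, "UNSEEN")   # only reports not yet archived
          for num in data[0].split():
              _, parts = box.fetch(num, "(RFC822)")
              msg = email.message_from_bytes(parts[0][1])
              body = msg.get_payload(decode=True) or b""
              db.execute("INSERT INTO reports VALUES (?, ?)",
                         (msg["Subject"], body.decode(errors="replace")))
          db.commit()
          box.logout()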

  16. Reducing the Cost of System Administration of a Disk Storage System Built from Commodity Components

    DTIC Science & Technology

    2000-05-01

    quickly by using checkpointing and roll-forward logs. Microsoft Tiger is a video server built from commodity PCs which they call "cubs" [BBD+96, BFD97]...20 cents per megabyte using street prices of components. 3.2.2 Redundancy In designing the TD prototype, we have taken care to ensure it does not have...Td/GridPix/, 1999. [ATP99] Satoshi Asami, Nisha Talagala, and David Patterson. Designing a self-maintaining storage system. In Proceedings of the

  17. Information Mining Technologies to Enable Discovery of Actionable Intelligence to Facilitate Maritime Situational Awareness: I-MINE

    DTIC Science & Technology

    2013-01-01

    website). Data mining tools are in-house code developed in Python, C++ and Java. • NGA The National Geospatial-Intelligence Agency (NGA) performs data...as PostgreSQL (with PostGIS), MySQL, Microsoft SQL Server, SQLite, etc. using the appropriate JDBC driver. The documentation and ease to learn are...written in Java that is able to perform various types of regressions, classifications, and other data mining tasks. There is also a commercial version

  18. DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...

  19. Covariant Evolutionary Event Analysis for Base Interaction Prediction Using a Relational Database Management System for RNA.

    PubMed

    Xu, Weijia; Ozer, Stuart; Gutell, Robin R

    2009-01-01

    With an increasingly large number of properly aligned sequences, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and are less effective with sequences from diversified phylogenetic classifications. We propose a new approach that utilizes coevolution rates among pairs of nucleotide positions using the phylogenetic and evolutionary relationships of the organisms of the aligned sequences. With a novel data schema to manage the relevant information within a relational database, our method, implemented on Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger, and with 50% better sensitivity, than a previous study. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure.
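
    The covariation idea can be illustrated with a much simpler stand-in for the paper's phylogeny-weighted event counting: plain mutual information between two alignment columns. The toy alignment below is invented for illustration; a compensating base swap keeps the pair score high even though neither column is conserved.

      # Simplified covariation score between two alignment columns: plain
      # mutual information, a stand-in for phylogeny-weighted event counting.
      from collections import Counter
      from math import log2

      def mutual_information(col_i, col_j):
          n = len(col_i)
          pi, pj = Counter(col_i), Counter(col_j)
          pij = Counter(zip(col_i, col_j))
          return sum((c / n) * log2((c / n) / ((pi[x] / n) * (pj[y] / n)))
                     for (x, y), c in pij.items())

      # Two columns from a toy alignment of five sequences: the G-C/A-U swap
      # covaries, so the pair scores highly.
      col_15 = ["G", "G", "A", "A", "G"]
      col_48 = ["C", "C", "U", "U", "C"]
      print(mutual_information(col_15, col_48))   # ~0.97 bits for this toy case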

  20. Covariant Evolutionary Event Analysis for Base Interaction Prediction Using a Relational Database Management System for RNA

    PubMed Central

    Xu, Weijia; Ozer, Stuart; Gutell, Robin R.

    2010-01-01

    With an increasingly large number of properly aligned sequences, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and are less effective with sequences from diversified phylogenetic classifications. We propose a new approach that utilizes coevolution rates among pairs of nucleotide positions using the phylogenetic and evolutionary relationships of the organisms of the aligned sequences. With a novel data schema to manage the relevant information within a relational database, our method, implemented on Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger, and with 50% better sensitivity, than a previous study. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure. PMID:20502534

  1. Development of yarn breakage detection software system based on machine vision

    NASA Astrophysics Data System (ADS)

    Wang, Wenyuan; Zhou, Ping; Lin, Xiangyu

    2017-10-01

    To address the problem that yarn breakage in spinning mills cannot be detected in a timely manner, and to reduce costs for textile enterprises, this paper presents a software system based on computer vision for real-time detection of yarn breakage. The system uses a Windows 8.1 tablet PC and a cloud server to perform yarn breakage detection and management. The software running on the tablet PC collects yarn and location information for analysis and processing, and sends the processed information via Wi-Fi and the HTTP protocol to the cloud server, where it is stored in a Microsoft SQL Server 2008 database for subsequent query and management of yarn-break information. Results are also shown on the local display in real time to remind the operator to deal with broken yarn. The experimental results show that the system's missed-detection rate is not more than 5‰, with no false detections.
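
    The tablet-to-cloud upload step can be sketched in a few lines. The Python fragment below illustrates posting a detected yarn-break event over HTTP for server-side storage; the endpoint URL and payload fields are illustrative assumptions, not the system's actual interface.

      # Sketch of the tablet-side upload: send a detected yarn-break event to
      # the cloud server over HTTP. URL and payload fields are hypothetical.
      import json
      from urllib import request

      def report_break(spindle_id, position, server="http://cloud.example.org/breaks"):
          payload = json.dumps({"spindle": spindle_id, "position": position}).encode()
          req = request.Request(server, data=payload,
                                headers={"Content-Type": "application/json"})
          with request.urlopen(req, timeout=5) as resp:   # POST because data is set
              return resp.status == 200

      # report_break(spindle_id=17, position="row 3, frame 2")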

  2. Measurement of Energy Performances for General-Structured Servers

    NASA Astrophysics Data System (ADS)

    Liu, Ren; Chen, Lili; Li, Pengcheng; Liu, Meng; Chen, Haihong

    2017-11-01

    Energy consumption of servers in data centers is increasing rapidly with the wide application of the Internet and connected devices. To improve the energy efficiency of servers, voluntary or mandatory energy efficiency programs for servers, such as voluntary labelling programs or mandatory energy performance standards, have been adopted or are being prepared in the US, the EU and China. However, the energy performance of servers and the corresponding testing methods are not well defined. This paper presents metrics to measure the energy performance of general-structured servers. The impacts of various server components on energy performance are also analyzed. Based on a set of normalized workloads, the authors propose a standard method for testing the energy efficiency of servers. Pilot tests are conducted to assess the proposed testing method. The findings of the tests are discussed in the paper.

  3. An integrated data-analysis and database system for AMS 14C

    NASA Astrophysics Data System (ADS)

    Kjeldsen, Henrik; Olsen, Jesper; Heinemeier, Jan

    2010-04-01

    AMSdata is the name of a combined database and data-analysis system for AMS 14C and stable-isotope work that has been developed at Aarhus University. The system (1) contains routines for data analysis of AMS and MS data, (2) allows a flexible and accurate description of sample extraction and pretreatment, also when samples are split into several fractions, and (3) keeps track of all measured, calculated and attributed data. The structure of the database is flexible and allows an unlimited number of measurement and pretreatment procedures. The AMS 14C data-analysis routine is fairly advanced and flexible, and it can easily be optimized for different kinds of measuring processes. Technically, the system is based on a Microsoft SQL Server and includes stored SQL procedures for the data analysis. Microsoft Office Access is used for the (graphical) user interface, and in addition Excel, Word and Origin are exploited for input and output of data, e.g. for plotting data during data analysis.
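
    Keeping the analysis in stored procedures next to the data is the system's key design choice. A minimal sketch of invoking such a server-side routine through ODBC follows; the DSN, procedure name and parameter are hypothetical, not AMSdata's actual schema.

      # Sketch of driving a server-side analysis routine kept as a stored
      # procedure, in the spirit of AMSdata's design. DSN, procedure name,
      # and parameter are illustrative assumptions.
      import pyodbc

      def run_analysis(batch_id):
          conn = pyodbc.connect("DSN=amsdata;UID=ams;PWD=secret")
          cursor = conn.cursor()
          # ODBC call escape syntax executes a stored procedure on the
          # server, keeping the data-analysis logic next to the data.
          cursor.execute("{CALL dbo.AnalyzeBatch (?)}", batch_id)
          rows = cursor.fetchall()
          conn.commit()
          conn.close()
          return rows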

  4. [Study of the efficiency of teleconsultation: the Telepathology Consultation Service of the Professional Association of German Pathologists for the breast carcinoma screening program].

    PubMed

    Schrader, T; Hufnagl, P; Schlake, W; Dietel, M

    2005-01-01

    In the autumn, a German screening program was started for detecting breast cancer in women aged fifty and above. For the first time in this program, quality assurance rules were established: all statements of the radiologists and pathologists have to be confirmed by a second opinion. This improvement in quality comes with a delay in time and additional expense. A new Telepathology Consultation Service was developed, based on the experience of the Telepathology Consultation Center of the UICC, to speed up the second-opinion process. The completely web-based service runs on MS Windows 2003 Server, with the Internet Information Server as web server and the SQL Server (both Microsoft) as the database. The websites, forms and control mechanisms are coded in ASP scripts and JavaScript. A study was carried out to evaluate the effectiveness of telepathological consultation in comparison to conventional consultation. Pathologists of the Professional Association of German Pathologists took part both as requesting pathologists and as consultants for other participants. The quality of the telepathological diagnoses was comparable to conventional diagnoses. Telepathology allows a faster response of 1 to 2 days (compared with the conventional postal delay). The time to prepare a telepathology request is about twice that of a conventional request; this ratio may be inverted by an interface between the pathology information system and the telepathology server and the use of virtual microscopy. The Telepathology Consultation Service of the Professional Association of German Pathologists is a fast and effective German-language, internet-based service for obtaining a second opinion.

  5. Automated Computer Access Request System

    NASA Technical Reports Server (NTRS)

    Snook, Bryan E.

    2010-01-01

    The Automated Computer Access Request (AutoCAR) system is a web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a workflow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where the user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic Active Server Pages (ASP) application hosted on a Microsoft Internet Information Server (IIS).

  6. Creating a Parallel Version of VisIt for Microsoft Windows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitlock, B J; Biagas, K S; Rawson, P L

    2011-12-07

    VisIt is a popular, free interactive parallel visualization and analysis tool for scientific data. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images or movies for presentations. VisIt was designed from the ground up to work on many scales of computers, from modest desktops up to massively parallel clusters. VisIt is comprised of a set of cooperating programs. All programs can be run locally or in client/server mode, in which some run locally and some run remotely on compute clusters. The VisIt program most able to harness today's computing power is the VisIt compute engine. The compute engine is responsible for reading simulation data from disk, processing it, and sending results or images back to the VisIt viewer program. In a parallel environment, the compute engine runs several processes, coordinating using the Message Passing Interface (MPI) library. Each MPI process reads some subset of the scientific data and filters the data in various ways to create useful visualizations. By using MPI, VisIt has been able to scale well into the thousands of processors on large computers such as dawn and graph at LLNL. The advent of multicore CPUs has made parallelism the 'new' way to achieve increasing performance. With today's computers having at least 2 cores and in many cases up to 8 and beyond, it is more important than ever to deploy parallel software that can use that computing power not only on clusters but also on the desktop. We have created a parallel version of VisIt for Windows that uses Microsoft's MPI implementation (MSMPI) to process data in parallel on the Windows desktop as well as on a Windows HPC cluster running Microsoft Windows Server 2008. Initial desktop parallel support for Windows was deployed in VisIt 2.4.0. Windows HPC cluster support has been completed and will appear in the VisIt 2.5.0 release. We plan to continue supporting parallel VisIt on Windows so our users will be able to take full advantage of their multicore resources.
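
    The engine's data-parallel pattern, in which each MPI rank reads and filters its own subset of the data and the results are reduced back to one process, can be illustrated generically. The sketch below uses the mpi4py bindings and is not VisIt code; the dataset and the "filter" are stand-ins.

      # Illustration of the data-parallel pattern the compute engine uses:
      # each MPI process handles its own subset, then results are reduced.
      # Generic mpi4py sketch, not VisIt code. Run: mpiexec -n 4 python this.py
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      data = list(range(1_000_000))          # stand-in for a simulation dataset
      chunk = data[rank::size]               # each rank reads only its subset

      local_max = max(chunk)                 # a simple per-rank "filter"
      global_max = comm.reduce(local_max, op=MPI.MAX, root=0)

      if rank == 0:
          print("global maximum:", global_max)   # rank 0 plays the viewer's role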

  7. National Medical Terminology Server in Korea

    NASA Astrophysics Data System (ADS)

    Lee, Sungin; Song, Seung-Jae; Koh, Soonjeong; Lee, Soo Kyoung; Kim, Hong-Gee

    Interoperable EHRs (Electronic Health Records) necessitate at least the use of standardized medical terminologies. This paper describes a medical terminology server, LexCare Suite, which houses terminology management applications, such as a terminology editor, and a terminology repository populated with international standard terminology systems such as the Systematized Nomenclature of Medicine (SNOMED). The server is intended to satisfy the need for quality terminology systems in hospitals ranging from local primary to tertiary care. Our partner general hospitals have used the server to test its applicability. This paper describes the server and the results of the applicability test.

  8. The frequency of company-sponsored alcohol brand-related sites on Facebook™-2012.

    PubMed

    Nhean, Siphannay; Nyborn, Justin; Hinchey, Danielle; Valerio, Heather; Kinzel, Kathryn; Siegel, Michael; Jernigan, David H

    2014-06-01

    This research provides an estimate of the frequency of company-sponsored alcohol brand-related sites on Facebook™ in 2012. We conducted a systematic Facebook™ search for sites specifically related to 898 alcohol brands across 16 different alcoholic beverage types. Descriptive statistics were produced using Microsoft SQL Server. We identified 1,017 company-sponsored alcohol brand-related sites on Facebook™. Our study advances the previous literature by providing a systematic overview of the extent of alcohol brand sites on Facebook™.

  9. Integrated Distributed Directory Service for KSC

    NASA Technical Reports Server (NTRS)

    Ghansah, Isaac

    1997-01-01

    This paper describes an integrated distributed directory services (DDS) architecture as a fundamental component of KSC distributed computing systems. Specifically, an architecture for an integrated directory service based on DNS and X.500/LDAP is suggested. The architecture supports using DNS in its traditional role as a name service and X.500 for other services. Specific designs were made for the integration of X.500 DDS for Public Key Certificates, Kerberos Security Services, Network-wide Login, Electronic Mail, WWW URLs, Servers, and other diverse network objects. Issues involved in incorporating the emerging Microsoft Active Directory Service (MADS) into KSC's X.500 are discussed.

  10. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an add-in for Microsoft Excel with R, R(D)COM Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
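
    The core CPA computation is small enough to show directly: the probability of a degraded response given that the stressor exceeds a threshold, swept over candidate thresholds. The data values below are invented for illustration; CProb itself wraps this kind of calculation behind Excel, R and VBA.

      # Core of a conditional probability analysis (CPA): P(poor response |
      # stressor > threshold), swept across thresholds. Data are made up.
      def conditional_probability(stressor, poor, threshold):
          hits = [p for s, p in zip(stressor, poor) if s > threshold]
          return sum(hits) / len(hits) if hits else float("nan")

      stressor = [0.1, 0.4, 0.5, 0.9, 1.2, 1.5, 2.0, 2.4]   # e.g. contaminant level
      poor     = [0,   0,   1,   0,   1,   1,   1,   1  ]   # 1 = degraded condition

      for t in [0.0, 0.5, 1.0, 1.5]:
          print(f"P(poor | stressor > {t}) = "
                f"{conditional_probability(stressor, poor, t):.2f}")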

  11. [Construction and application of the tissue bank and database of oral mucosa precancerous lesions in the Yangtze delta].

    PubMed

    Huang, Ji-yan; Zhao, Hou-ming; Zhou, Hai-wen

    2014-04-01

    To construct a database and a tissue bank of oral mucosa precancerous lesions and to estimate their application value. Patients in the Yangtze delta suffering from oral mucosa precancerous lesions were enrolled in this study. The patients' clinical data and samples of oral precancerous mucosa, saliva and blood were collected to create a tissue bank, based on which a database was constructed using Microsoft Access software, a Browser/Server structure and the ASP language. The tissue bank and database of oral mucosa precancerous lesions were successfully built. The procedure to harvest, store and transport the samples was standardized. The database offers a good interactive interface, convenient for data collection, query and sharing on the Internet. We constructed the tissue bank and database of oral mucosa precancerous lesions for the first time, which not only helps preserve the biological resource of oral mucosa precancerous lesions, but also provides enormous convenience in clinical work, research and teaching. Supported by the Research Fund of the Science and Technology Committee of Shanghai Municipality (08ZR1416700).

  12. A spatial-temporal system for dynamic cadastral management.

    PubMed

    Nan, Liu; Renyi, Liu; Guangliang, Zhu; Jiong, Xie

    2006-03-01

    A practical spatio-temporal database (STDB) technique for dynamic urban land management is presented. One of the STDB models, the expanded model of Base State with Amendments (BSA), is selected as the basis for developing the dynamic cadastral management technique. Two approaches, Section Fast Indexing (SFI) and Storage Factors of Variable Granularity (SFVG), are used to improve the efficiency of the BSA model. Both spatial graphic data and attribute data are stored, through a succinct engine, in a standard relational database management system (RDBMS) for the actual implementation of the BSA model. The spatio-temporal database is divided into three interdependent sub-databases: the present DB, the history DB and the procedures-tracing DB. The efficiency of database operation is improved by making the database connection in the bottom layer of Microsoft SQL Server. The spatio-temporal system can be provided at low cost while satisfying the basic needs of urban land management in China. The approaches presented in this paper may also be of significance to countries where land patterns change frequently or to agencies where financial resources are limited.
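
    The BSA model's reconstruction step can be sketched in a few lines: keep full base-state snapshots at intervals plus time-stamped amendments, and rebuild the cadastral state at any time t from the nearest earlier snapshot. The parcel/owner layout below is an illustrative assumption, and the sketch omits the SFI and SFVG optimizations.

      # Sketch of the Base State with Amendments (BSA) idea: periodic full
      # snapshots plus per-change amendments; rebuild the state at time t by
      # rolling deltas forward from the nearest earlier snapshot.
      base_states = {                       # time -> full state snapshot
          0:  {"parcel_1": "Li", "parcel_2": "Zhu"},
          10: {"parcel_1": "Xie", "parcel_2": "Zhu"},
      }
      amendments = [                        # (time, parcel, new_value), time-ordered
          (3, "parcel_1", "Liu"),
          (7, "parcel_1", "Xie"),
          (12, "parcel_2", "Nan"),
      ]

      def state_at(t):
          base_t = max(bt for bt in base_states if bt <= t)   # nearest earlier base
          state = dict(base_states[base_t])
          for when, parcel, value in amendments:
              if base_t < when <= t:
                  state[parcel] = value                       # roll forward deltas
          return state

      print(state_at(8))    # {'parcel_1': 'Xie', 'parcel_2': 'Zhu'}
      print(state_at(12))   # {'parcel_1': 'Xie', 'parcel_2': 'Nan'}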

  13. The designing and implementation of PE teaching information resource database based on broadband network

    NASA Astrophysics Data System (ADS)

    Wang, Jian

    2017-01-01

    In order to change the traditional PE teaching mode and realize the interconnection, interworking and sharing of PE teaching resources, a distance PE teaching platform based on broadband network is designed and a PE teaching information resource database is set up. The PE teaching information resource database takes Windows NT 4/2000 Server as the operating system platform and Microsoft SQL Server 7.0 as the RDBMS, and adopts NAS technology for data storage and streaming technology for video service. The analysis of the system design and implementation shows that a dynamic PE teaching information resource sharing platform based on Web Services can realize loosely coupled collaboration as well as dynamic and active integration, and has good integration, openness and encapsulation. The distance PE teaching platform based on Web Services and the design scheme of the PE teaching information resource database can effectively realize the interconnection, interworking and sharing of PE teaching resources and adapt to the demands of the informatization of PE teaching.

  14. New Web Server - the Java Version of Tempest - Produced

    NASA Technical Reports Server (NTRS)

    York, David W.; Ponyik, Joseph G.

    2000-01-01

    A new software design and development effort has produced a Java (Sun Microsystems, Inc.) version of the award-winning Tempest software (refs. 1 and 2). In 1999, the Embedded Web Technology (EWT) team received a prestigious R&D 100 Award for Tempest, Java Version. In this article, "Tempest" will refer to the Java version of Tempest, a World Wide Web server for desktop or embedded systems. Tempest was designed at the NASA Glenn Research Center at Lewis Field to run on any platform for which a Java Virtual Machine (JVM, Sun Microsystems, Inc.) exists. The JVM acts as a translator between the native code of the platform and the byte code of Tempest, which is compiled in Java. These byte code files are Java executables with a ".class" extension. Multiple byte code files can be zipped together as a "*.jar" file for more efficient transmission over the Internet. Today's popular browsers, such as Netscape (Netscape Communications Corporation) and Internet Explorer (Microsoft Corporation) have built-in Virtual Machines to display Java applets.

  15. SSL - THE SIMPLE SOCKETS LIBRARY

    NASA Technical Reports Server (NTRS)

    Campbell, C. E.

    1994-01-01

    The Simple Sockets Library (SSL) allows C programmers to develop systems of cooperating programs using Berkeley streaming Sockets running under the TCP/IP protocol over Ethernet. The SSL provides a simple way to move information between programs running on the same or different machines and does so with little overhead. The SSL can create three types of Sockets: namely a server, a client, and an accept Socket. The SSL's Sockets are designed to be used in a fashion reminiscent of the use of FILE pointers so that a C programmer who is familiar with reading and writing files will immediately feel comfortable with reading and writing with Sockets. The SSL consists of three parts: the library, PortMaster, and utilities. The user of the SSL accesses it by linking programs to the SSL library. The PortMaster initializes connections between clients and servers. The PortMaster also supports a "firewall" facility to keep out socket requests from unapproved machines. The "firewall" is a file which contains Internet addresses for all approved machines. There are three utilities provided with the SSL. SKTDBG can be used to debug programs that make use of the SSL. SPMTABLE lists the servers and port numbers on requested machine(s). SRMSRVR tells the PortMaster to forcibly remove a server name from its list. The package also includes two example programs: multiskt.c, which makes multiple accepts on one server, and sktpoll.c, which repeatedly attempts to connect a client to some server at one second intervals. SSL is a machine independent library written in the C-language for computers connected via Ethernet using the TCP/IP protocol. It has been successfully compiled and implemented on a variety of platforms, including Sun series computers running SunOS, DEC VAX series computers running VMS, SGI computers running IRIX, DECstations running ULTRIX, DEC alpha AXPs running OSF/1, IBM RS/6000 computers running AIX, IBM PC and compatibles running BSD/386 UNIX and HP Apollo 3000/4000/9000/400T computers running HP-UX. SSL requires 45K of RAM to run under SunOS and 80K of RAM to run under VMS. For use on IBM PC series computers and compatibles running DOS, SSL requires Microsoft C 6.0 and the Wollongong TCP/IP package. Source code for sample programs and debugging tools are provided. The documentation is available on the distribution medium in TeX and PostScript formats. The standard distribution medium for SSL is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format and a 5.25 inch 360K MS-DOS format diskette. The SSL was developed in 1992 and was updated in 1993.
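
    The SSL's FILE-pointer style of reading and writing has a close analogue in Python's socket.makefile(), which makes for a compact way to illustrate the server, client, and accept roles. This is a generic sketch, not a translation of the library, and the PortMaster's rendezvous role is omitted.

      # Minimal echo pair in the SSL's three roles: a server Socket, the
      # accept Socket it yields, and a client Socket, all read and written
      # like FILE pointers via makefile().
      import socket

      def serve(port=5050):                      # "server Socket" role
          with socket.create_server(("", port)) as srv:
              conn, _ = srv.accept()             # "accept Socket" role
              with conn, conn.makefile("rw") as f:
                  line = f.readline()            # read like a FILE pointer
                  f.write(f"echo: {line}")
                  f.flush()

      def ask(port=5050):                        # "client Socket" role
          with socket.create_connection(("localhost", port)) as c:
              with c.makefile("rw") as f:
                  f.write("hello\n")
                  f.flush()
                  return f.readline()            # -> "echo: hello\n"

    Run serve() in one process and ask() in another; the blocking accept in serve() plays the part of the SSL's accept Socket.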

  16. Infectious Disease Information Collection System at the Scene of Disaster Relief Based on a Personal Digital Assistant.

    PubMed

    Li, Ya-Pin; Gao, Hong-Wei; Fan, Hao-Jun; Wei, Wei; Xu, Bo; Dong, Wen-Long; Li, Qing-Feng; Song, Wen-Jing; Hou, Shi-Ke

    2017-12-01

    The objective of this study was to build a database for collecting infectious disease information at the scene of a disaster, covering 128 epidemiological questionnaires and 47 types of options, enabling rapid acquisition of infectious disease information and rapid questionnaire customization at the scene of disaster relief by use of a personal digital assistant (PDA). SQL Server 2005 (Microsoft Corp, Redmond, WA) was used to create the option database for the infectious disease investigation, to develop a client application for the PDA, and to deploy the application on the server side. Users accessed the server for data collection and questionnaire customization with the PDA. A database with a set of comprehensive options was created and an application system was developed for the Android operating system (Google Inc, Mountain View, CA). On this basis, an infectious disease information collection system was built for use at the scene of disaster relief, achieving rapid questionnaire customization through the use of a PDA. The system integrated computer technology and mobile communication technology to collect infectious disease information and to allow for rapid questionnaire customization at the scene of disaster relief. (Disaster Med Public Health Preparedness. 2017;11:668-673).

  17. Linking Management Actions to Interactive Ecosystem Report Cards via an Ontology

    NASA Astrophysics Data System (ADS)

    Alabri, A.; Newman, A.; Abal, E.; van Ingen, C.; Hunter, J.

    2008-12-01

    INTRODUCTION: The Health-e-Waterways Project is a three-way collaboration between the University of Queensland, Microsoft Research and the Healthy Waterways Partnership (SEQ-HWP) (over 60 local government, state agency, university, community and environmental organizations). The project is developing a highly innovative framework and set of services to enable streamlined access to an integrated collection of real-time, near-real-time and static datasets acquired through ecosystem monitoring programs in South East Queensland. Using a novel combination of semantic web technologies, scientific data servers, web services, GIS visualization interfaces and scientific workflows, we are enabling the sharing and integration of high-quality data and models through a combined integrated water information management system and web portal. DYNAMIC GENERATION OF ECOSYSTEM HEALTH REPORT CARDS: SEQ-HWP is responsible for the Ecosystem Health Monitoring Program (EHMP) in South East Queensland. This currently involves sampling 30 freshwater indicators at 100 sites twice a year and 250 estuarine/marine sites every month. The EHMP data sets are statistically aggregated and standardized to produce ecosystem health grades that are published annually in hard-copy EHMP Report Cards. Politicians and planners use the report cards to make decisions with respect to land use, water quality, allocations and investments in water recycling plants etc. To date, these report cards have been produced largely manually, by calculating standardized scores (0-1) across 5 indicators and 16 indices (physical, chemical, nutrients, ecosystem processes, aquatic macroinvertebrates and fish) and grades from A-F for each catchment and season (spring and autumn). Currently this process takes about 5 months. For the past 6 months, we have been working with the SEQ-HWP staff, developing software services that enable the report cards to be generated dynamically via a web-based map interface to an underlying database that contains the EHMP water quality and quantity monitoring data. The GUI enables users to specify and query: spatial regions of interest through a Google Earth or Microsoft Virtual Earth interface; concepts or indicators of interest through the EHMP ontology; and seasons or years of interest through a timeline. A report card grade is generated for the specified catchment and period. Users can retrieve raw data by clicking on a grade; this displays the corresponding EcoH plot, dynamically generated from the 5 indicators in the underlying SQL Server database. Clicking on an EcoH plot displays the actual raw data (16 indices) used to generate the indicators and plots. CONCLUSIONS: Numerous state, national and international agencies are advocating the need for standardized frameworks and procedures for environmental accounting. The Health-e-Waterways project provides an ideal model for delivering a standardized approach to the aggregation of ecosystem health monitoring data and the generation of dynamic, interactive report cards that incorporate links back to the raw data sets. The system described here will not only save agencies significant time and money, but can also be used to guide regional, state and national environmental policy development, based on accurate and timely evidential data.
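
    The aggregation behind a report card grade reduces to averaging standardized indicator scores and mapping the mean to a letter. A minimal sketch follows; the cutoffs are illustrative assumptions, not the EHMP's published banding.

      # Sketch of the report-card aggregation: standardized indicator scores
      # in [0, 1] are averaged and mapped to an A-F grade. Cutoffs are
      # illustrative, not the EHMP's actual banding.
      def report_card_grade(indicator_scores):
          mean = sum(indicator_scores) / len(indicator_scores)
          for cutoff, grade in [(0.85, "A"), (0.7, "B"), (0.55, "C"),
                                (0.4, "D"), (0.25, "E")]:
              if mean >= cutoff:
                  return grade
          return "F"

      # Five indicator scores for one catchment and season:
      print(report_card_grade([0.9, 0.75, 0.6, 0.8, 0.7]))   # -> "B" (mean 0.75)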

  18. Education and Outreach with the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Lawton, Brandon L.; Eisenhamer, B.; Raddick, M. J.; Mattson, B. J.; Harris, J.

    2012-01-01

    The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. Many future missions will also be incorporated into the VAO tools when they launch. The Education and Public Outreach (E/PO) program for the VAO is led by the Space Telescope Science Institute in collaboration with the HEASARC E/PO program and Johns Hopkins University. VAO E/PO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public, formal education, and informal education communities. Our E/PO efforts will be structured to provide uniform access to VAO information, enabling educational opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that many VO programs have built powerful tools for E/PO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. We are building partnerships with Microsoft, Zooniverse, and NASA's Night Sky Network to leverage the communities and tools that already exist to meet the needs of our audiences. Our formal education program is standards-based and aims to give teachers the tools to use real astronomical data to teach the STEM subjects. To determine which tools the VAO will incorporate into the formal education program, needs assessments will be conducted with educators across the U.S.

  19. The effectiveness of Microsoft Project in assessing extension of time under PAM 2006 standard form of contract

    NASA Astrophysics Data System (ADS)

    Suhaida, S. K.; Wong, Z. D.

    2017-11-01

    Time is money, and this applies in the construction industry, where time is critical. Most standard forms of contract provide contractual clauses addressing time and money in such scenarios, and Extension of Time (EOT) is one of them. Under qualifying circumstances and delays, the contractor is allowed to apply for an EOT in order to complete the works at a later completion date without Liquidated Damages (LD) being imposed on the claimant. However, both claimants and assessors encounter problems in assessing EOT claims. The aim of this research is to recommend the use of Microsoft Project as a tool for assessing EOT under the standard form of contract PAM 2006. A quantitative method was applied to respondents consisting of architects and quantity surveyors (QS) in order to collect data on the challenges in assessing EOT claims and the effectiveness of Microsoft Project as a tool. The findings of this research highlight that Microsoft Project can serve as a basis for performing EOT tasks, as the software can be used as a data bank to store information crucial for preparing and evaluating EOT claims.

  20. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    NASA Astrophysics Data System (ADS)

    Ma, Xiuzeng; Li, Yingkui; Bourgeois, Mike; Caffee, Marc; Elmore, David; Granger, Darryl; Muzikar, Paul; Smith, Preston

    2007-06-01

    Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this it is critical to establish an effective, easily accessible and well-defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on the nuclide concentrations measured by accelerator mass spectrometry. WebCN for 10Be and 26Al has been finished and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for 36Cl is under construction. WebCN is designed on a three-tier client/server model and uses the open source PostgreSQL for database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both the spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.

  1. Analysis of practical backoff protocols for contention resolution with multiple servers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, L.A.; MacKenzie, P.D.

    Backoff protocols are probably the most widely used protocols for contention resolution in multiple access channels. In this paper, we analyze the stochastic behavior of backoff protocols for contention resolution among a set of clients and servers, each server being a multiple access channel that deals with contention like an Ethernet channel. We use the standard model in which each client generates requests for a given server according to a Bernoulli distribution with a specified mean. The client-server request rate of a system is the maximum over all client-server pairs (i, j) of the sum of all request rates associated with either client i or server j. Our main result is that any superlinear polynomial backoff protocol is stable for any multiple-server system with a sub-unit client-server request rate. We confirm the practical relevance of our result by demonstrating experimentally that the average waiting time of requests is very small when such a system is run with reasonably few clients and reasonably small request rates such as those that occur in actual Ethernets. Our result is the first proof of stability for any backoff protocol for contention resolution with multiple servers. Our result is also the first proof that any weakly acknowledgment-based protocol is stable for contention resolution with multiple servers and such high request rates. Two special cases of our result are of interest. Hastad, Leighton and Rogoff have shown that for a single-server system with a sub-unit client-server request rate any modified superlinear polynomial backoff protocol is stable. These modified backoff protocols are similar to standard backoff protocols but require more random bits to implement. The special case of our result in which there is only one server extends the result of Hastad, Leighton and Rogoff to standard (practical) backoff protocols. Finally, our result applies to dynamic routing in optical networks.
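
    The stability claim is easy to probe empirically with a toy single-channel simulation of superlinear polynomial backoff. The rates, exponent and horizon below are illustrative, and the sketch demonstrates the protocol's qualitative behavior rather than the paper's proof.

      # Toy simulation of superlinear polynomial backoff on one shared
      # channel: after its b-th collision, a request retries at a slot drawn
      # uniformly from the next (b + 1)**p slots, with p > 1.
      import random

      def simulate(request_rate=0.05, p=1.5, slots=20_000):
          pending = []                      # [next_attempt_slot, collisions]
          served = 0
          for t in range(slots):
              if random.random() < request_rate:        # new request this slot
                  pending.append([t, 0])
              ready = [r for r in pending if r[0] <= t]
              if len(ready) == 1:                       # lone sender succeeds
                  pending.remove(ready[0])
                  served += 1
              elif len(ready) > 1:                      # collision: all back off
                  for r in ready:
                      r[1] += 1
                      r[0] = t + random.randint(1, int((r[1] + 1) ** p))
          return served, len(pending)

      print(simulate())   # stable run: the queue stays small relative to served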

  2. Implementation of Headtracking and 3D Stereo with Unity and VRPN for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Noyes, Matthew A.

    2013-01-01

    This paper explores low-cost hardware and software methods to provide depth cues traditionally absent in monocular displays. The use of a VRPN server in conjunction with a Microsoft Kinect and/or Nintendo Wiimote to provide head tracking information to a Unity application, and NVIDIA 3D Vision for retinal disparity support, is discussed. Methods are suggested to implement this technology with NASA's EDGE simulation graphics package, along with potential caveats. Finally, future applications of this technology to astronaut crew training, particularly when combined with an omnidirectional treadmill for virtual locomotion and NASA's ARGOS system for reduced gravity simulation, are discussed.

  3. Architecture of the software for LAMOST fiber positioning subsystem

    NASA Astrophysics Data System (ADS)

    Peng, Xiaobo; Xing, Xiaozheng; Hu, Hongzhuan; Zhai, Chao; Li, Weimin

    2004-09-01

    The architecture of the software which controls the LAMOST fiber positioning sub-system is described. The software is composed of two parts: a main control program running on a computer, and a unit controller program stored in the ROM of an MCS51 single-chip microcomputer. The functions of the software include client/server model establishment, observation planning, collision handling, data transmission, pulse generation, CCD control, image capture and processing, and data analysis. Particular attention is paid to the ways in which different parts of the software communicate. Software techniques for multithreading, socket programming, Microsoft Windows message handling, and serial communications are also discussed.

  4. Implementation experience of a patient monitoring solution based on end-to-end standards.

    PubMed

    Martinez, I; Fernandez, J; Galarraga, M; Serrano, L; de Toledo, P; Escayola, J; Jimenez-Fernandez, S; Led, S; Martinez-Espronceda, M; Garcia, J

    2007-01-01

    This paper presents a proof-of-concept design of a patient monitoring solution for Intensive Care Unit (ICU). It is end-to-end standards-based, using ISO/IEEE 11073 (X73) in the bedside environment and EN13606 to communicate the information to an Electronic Healthcare Record (EHR) server. At the bedside end a plug-and-play sensor network is implemented, which communicates with a gateway that collects the medical information and sends it to a monitoring server. At this point the server transforms the data frame into an EN13606 extract, to be stored on the EHR server. The presented system has been tested in a laboratory environment to demonstrate the feasibility of this end-to-end standards-based solution.

  5. KISS--a new approach to self-controlled e-learning of selected chapters in Medical Engineering and other fields at bachelor and master course level.

    PubMed

    Hutten, Helmut; Stiegmaier, Wolfgang; Rauchegger, Günter

    2005-09-01

    Modern lifestyles require new methods for individual lifelong learning, based on access at any time and from any place. This fundamental requirement is met by the Internet, whose technology promises increasing potential for e-learning or tele-learning. Some special requirements are password-controlled access, applicability of most commercially available PCs and laptops equipped with standard software (Microsoft Internet Explorer 6.0), central evaluation of the students' performance, inclusion of an examination part, and provision of a picture gallery and a comprehensive glossary accessible in the learning mode. The KISS shell (Kontrolliertes Intelligentes Selbstgesteuertes Studium, KISS) has been developed based on the Oracle 10g application server in combination with a relational database (Oracle 8i) on the server side, and a web-browser-based interface using JavaScript for user control of data input on the client side. The first tutorial application has been realized with a chapter about cardiac pacemakers. The weight of that chapter (or module) is about 2 ECTS (i.e. the equivalent of 30 working hours; European Credit Transfer System, ECTS). The internal structure of the chapter is organized in sequential mode. It consists of five main sections, each subdivided into five subsections of comparable length. Progression from one subsection to the next is possible only after successfully passing the respective examination. The whole learning programme, including the pacemaker chapter, has been evaluated by 10 students. The system is presented together with first experiences, including the evaluation results. Until now the program has not been used for training purposes.

  6. The HydroServer Platform for Sharing Hydrologic Data

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Horsburgh, J. S.; Schreuders, K.; Maidment, D. R.; Zaslavsky, I.; Valentine, D. W.

    2010-12-01

    The CUAHSI Hydrologic Information System (HIS) is an internet-based system that supports the sharing of hydrologic data. HIS consists of databases connected using the Internet through web services, as well as software for data discovery, access, and publication. The HIS system architecture comprises servers for publishing and sharing data, a centralized catalog to support cross-server data discovery, and a desktop client to access and analyze data. This paper focuses on HydroServer, the component developed for sharing and publishing space-time hydrologic datasets. A HydroServer is a computer server that contains a collection of databases, web services, tools, and software applications that allow data producers to store, publish, and manage the data from an experimental watershed or project site. HydroServer is designed to permit publication of data as part of a distributed national/international system, while still locally managing access to the data. We describe the HydroServer architecture and software stack, including tools for managing and publishing time series data for fixed-point monitoring sites as well as spatially distributed GIS datasets that describe a particular study area, watershed, or region. HydroServer adopts a standards-based approach to data publication, relying on accepted and emerging standards for data storage and transfer. CUAHSI-developed HydroServer code is free, with community code development managed through the CodePlex open source code repository and development system. There is some reliance on widely used commercial software for general-purpose and standard data publication capability. The sharing of data in a common format is one way to stimulate interdisciplinary research and collaboration. It is anticipated that the growing, distributed network of HydroServers will facilitate cross-site comparisons and large-scale studies that synthesize information from diverse settings, making the network as a whole greater than the sum of its parts in advancing hydrologic research. Details of the CUAHSI HIS can be found at http://his.cuahsi.org, and the HydroServer CodePlex site at http://hydroserver.codeplex.com.

  7. The Microsoft Biology Foundation Applications for High-Throughput Sequencing

    PubMed Central

    Mercer, S.

    2010-01-01

    The need for reusable libraries of bioinformatics functions has been recognized for many years, and a number of language-specific toolkits have been constructed. Such toolkits have served as valuable nucleation points for the community, promoting the sharing of code and establishing standards. The majority of DNA sequencing machines and many other standard pieces of lab equipment are controlled by PCs running Windows, and a Microsoft genomics toolkit would enable initial processing and quality control to happen closer to the instrumentation and provide opportunities for added-value services within core facilities. The Microsoft Biology Foundation (MBF) is an open source software library, freely available for both commercial and academic use as an early-stage beta from mbf.codeplex.com. This presentation will describe the structure and goals of MBF and demonstrate some of its uses.

  8. Empowering radiologic education on the Internet: a new virtual website technology for hosting interactive educational content on the World Wide Web.

    PubMed

    Frank, M S; Dreyer, K

    2001-06-01

    We describe a virtual web site hosting technology that enables educators in radiology to emblazon and make available for delivery on the world wide web their own interactive educational content, free from dependencies on in-house resources and policies. This suite of technologies includes a graphically oriented software application, designed for the computer novice, to facilitate the input, storage, and management of domain expertise within a database system. The database stores this expertise as choreographed and interlinked multimedia entities including text, imagery, interactive questions, and audio. Case-based presentations or thematic lectures can be authored locally, previewed locally within a web browser, then uploaded at will as packaged knowledge objects to an educator's (or department's) personal web site housed within a virtual server architecture. This architecture can host an unlimited number of unique educational web sites for individuals or departments in need of such service. Each virtual site's content is stored within that site's protected back-end database connected to Internet Information Server (Microsoft Corp, Redmond WA) using a suite of Active Server Page (ASP) modules that incorporate Microsoft's Active Data Objects (ADO) technology. Each person's or department's electronic teaching material appears as an independent web site with different levels of access--controlled by a username-password strategy--for teachers and students. There is essentially no static hypertext markup language (HTML). Rather, all pages displayed for a given site are rendered dynamically from case-based or thematic content that is fetched from that virtual site's database. The dynamically rendered HTML is displayed within a web browser in a Socratic fashion that can assess the recipient's current fund of knowledge while providing instantaneous user-specific feedback. Each site is emblazoned with the logo and identification of the participating institution. Individuals with teacher-level access can use a web browser to upload new content as well as manage content already stored on their virtual site. Each virtual site stores, collates, and scores participants' responses to the interactive questions posed on line. This virtual web site strategy empowers the educator with an end-to-end solution for creating interactive educational content and hosting that content within the educator's personalized and protected educational site on the world wide web, thus providing a valuable outlet that can magnify the impact of his or her talents and contributions.

  9. Rapid application design of an electronic clinical skills portfolio for undergraduate medical students.

    PubMed

    Dornan, Tim; Lee, Catherine; Stopford, Adam; Hosie, Liam; Maredia, Neil; Rector, Alan

    2005-04-01

    The aim was to find how to use information and communication technology to present the clinical skills content of an undergraduate medical curriculum. Rapid application design was used to develop the product, and technical action research was used to evaluate the development process. A clinician-educator, two medical students, two computing science masters students, two other project workers, and a hospital education informatics lead, formed a design team. A sample of stakeholders took part in requirements planning workshops and continued to advise the team throughout the project. A university hospital had many features that favoured fast, inexpensive, and successful system development: a clearly defined and readily accessible user group; location of the development process close to end-users; fast, informal communication; leadership by highly motivated and senior end-users; devolved authority and lack of any rigidly imposed management structure; cooperation of clinicians because the project drew on their clinical expertise to achieve scholastic goals; a culture of learning and involvement of highly motivated students. A detailed specification was developed through storyboarding, use case diagramming, and evolutionary prototyping. A very usable working product was developed within weeks. "SkillsBase" is a database web application using Microsoft Active Server Pages, served from a Microsoft Windows 2000 Server operating system running Internet Information Server 5.0. Graphing functionality is provided by the KavaChart applet. It presents the skills curriculum, provides a password-protected portfolio function, and offers training materials. The curriculum can be presented in several different ways to help students reflect on their objectives and progress towards achieving them. The reflective portfolio function is entirely private to each student user and allows them to document their progress in attaining skills, as judged by self, peer and tutor assessment, and examinations. Training materials include web links and materials developed locally using pedagogic principles developed by the SkillsBase team. Although the usability of SkillsBase has been proven, uptake of software that has arisen 'bottom-up' from within the curriculum has proved slow. We plan to incorporate the SkillsBase services into a more comprehensive virtual managed learning environment, anticipating that presenting the functionality in an environment that is routinely used by students and teachers will increase uptake and use.

  10. The development and implementation of MOSAIQ Integration Platform (MIP) based on the radiotherapy workflow

    NASA Astrophysics Data System (ADS)

    Yang, Xin; He, Zhen-yu; Jiang, Xiao-bo; Lin, Mao-sheng; Zhong, Ning-shan; Hu, Jiang; Qi, Zhen-yu; Bao, Yong; Li, Qiao-qiao; Li, Bao-yue; Hu, Lian-ying; Lin, Cheng-guang; Gao, Yuan-hong; Liu, Hui; Huang, Xiao-yan; Deng, Xiao-wu; Xia, Yun-fei; Liu, Meng-zhong; Sun, Ying

    2017-03-01

    To meet the special demands in China and the particular needs of the radiotherapy department, a MOSAIQ Integration Platform CHN (MIP) based on the radiation therapy (RT) workflow has been developed as a supplementary system to Elekta MOSAIQ. The MIP adopts a client-server (C/S) architecture, and its database is based on the Treatment Planning System (TPS) and MOSAIQ SQL Server 2008, running on the hospital local network. Five network servers form the core hardware, supplying data storage and network services based on cloud services. The core software, written in C#, was developed on the Microsoft Visual Studio platform. The MIP server can provide network services, including entry, query, statistics, and printing of information, for about 200 workstations at the same time. The MIP has been in use for the past one and a half years, during which several practical patient-oriented functions were developed, and it now covers almost the whole radiation therapy workflow. There are 15 function modules, such as Notice, Appointment, Billing, Document Management (application/execution), and System Management. By June 2016, the data recorded in the MIP included 13,546 patients, 13,533 plan applications, 15,475 RT records, 14,656 RT summaries, 567,048 billing records, and 506,612 workload records. The MIP, based on the RT workflow, has been successfully developed and clinically implemented with real-time performance, data security, and stable operation; it has proven user-friendly and has significantly improved the efficiency of the department. It is key to facilitating information sharing and department management. More functions can be added or modified to further enhance its potential in research and clinical practice.
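
    As a concrete illustration of the C/S pattern above, the sketch below shows how a client-side module might query a MOSAIQ-style SQL Server database over ODBC. This is our own minimal example, not MIP code: the driver string is a common choice for SQL Server, and the server, credentials, table, and column names are assumptions.

    ```python
    # Illustrative client-side query against a MOSAIQ-style SQL Server
    # database, as a C/S client like the MIP might issue. pyodbc is a common
    # Python ODBC binding; all names and credentials here are invented.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=mosaiq-db.hospital.local;DATABASE=MOSAIQ;"
        "UID=mip_reader;PWD=secret"
    )
    cursor = conn.cursor()
    # The kind of monthly workload statistic a statistics module might report.
    cursor.execute(
        "SELECT MONTH(treatment_date) AS m, COUNT(*) AS n "
        "FROM rt_records GROUP BY MONTH(treatment_date) ORDER BY m"
    )
    for month, n in cursor.fetchall():
        print(f"month {month:2d}: {n} treatment records")
    conn.close()
    ```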

  11. Decision support system for health care resources allocation

    PubMed Central

    Sebaa, Abderrazak; Nouicer, Amina; Tari, AbdelKamel; Tarik, Ramtani; Abdellah, Ouhab

    2017-01-01

    Background A study about healthcare resources can improve decisions regarding the allotment and mobilization of medical resources and better guide future investment in the health sector. Aim The aim of this work was to design and implement a decision support system to improve medical resource allocation in the Bejaia region. Methods To carry out the retrospective cohort study, we integrated existing clinical databases from different health sector institutions of the Bejaia department (an Algerian department) to collect information about patients from January 2015 through December 2015. Data integration was performed in a data warehouse using the multi-dimensional model and OLAP cube. During implementation, we used Microsoft SQL Server 2012 and Microsoft Excel 2010. Results A medical decision support platform was introduced and implemented during the planning stages, allowing the management of different medical orientations; it provides better apportionment and allotment of medical resources and ensures that the allocation of health care resources has optimal effects on improving health. Conclusion In this study, we designed and implemented a decision support system to improve health care in the Bejaia department, especially by assisting in the selection of the optimal location of health centers and hospitals, the specialty of each health center, and the medical equipment and medical staff. PMID:28848645
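
    To make the multi-dimensional model concrete, the toy example below aggregates an invented fact table along two dimensions, which is the essence of an OLAP cube slice. pandas stands in for the SQL Server 2012 warehouse and cube; all institutions, specialties, and figures are illustrative.

    ```python
    # Toy OLAP-cube slice: aggregate a fact table along two dimensions.
    # All data below is invented for illustration.
    import pandas as pd

    facts = pd.DataFrame({
        "institution": ["Hospital A", "Hospital A", "Clinic B", "Clinic B"],
        "specialty":   ["cardiology", "pediatrics", "cardiology", "pediatrics"],
        "month":       [1, 1, 2, 2],
        "admissions":  [120, 80, 45, 60],
    })

    # "Slice and dice": total admissions by institution and specialty.
    cube = facts.pivot_table(
        index="institution", columns="specialty",
        values="admissions", aggfunc="sum", fill_value=0,
    )
    print(cube)
    ```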

  12. Decision support system for health care resources allocation.

    PubMed

    Sebaa, Abderrazak; Nouicer, Amina; Tari, AbdelKamel; Tarik, Ramtani; Abdellah, Ouhab

    2017-06-01

    A study about healthcare resources can improve decisions regarding the allotment and mobilization of medical resources and better guide future investment in the health sector. The aim of this work was to design and implement a decision support system to improve medical resource allocation in the Bejaia region. To carry out the retrospective cohort study, we integrated existing clinical databases from different health sector institutions of the Bejaia department (an Algerian department) to collect information about patients from January 2015 through December 2015. Data integration was performed in a data warehouse using the multi-dimensional model and OLAP cube. During implementation, we used Microsoft SQL Server 2012 and Microsoft Excel 2010. A medical decision support platform was introduced and implemented during the planning stages, allowing the management of different medical orientations; it provides better apportionment and allotment of medical resources and ensures that the allocation of health care resources has optimal effects on improving health. In this study, we designed and implemented a decision support system to improve health care in the Bejaia department, especially by assisting in the selection of the optimal location of health centers and hospitals, the specialty of each health center, and the medical equipment and medical staff.

  13. [Design and establishment of modern literature database about acupuncture Deqi].

    PubMed

    Guo, Zheng-rong; Qian, Gui-feng; Pan, Qiu-yin; Wang, Yang; Xin, Si-yuan; Li, Jing; Hao, Jie; Hu, Ni-juan; Zhu, Jiang; Ma, Liang-xiao

    2015-02-01

    A search on acupuncture Deqi was conducted in four Chinese-language biomedical databases (CNKI, Wan-Fang, VIP, and CBM) and in PubMed, using keywords such as "Deqi", "needle sensation", "needling feeling", "needle feel", and "obtaining qi". A "Modern Literature Database for Acupuncture Deqi" was then established using Microsoft SQL Server 2005 Express Edition, and the contents, data types, information structure, and logic constraints of the system table fields are introduced. From this database, detailed inquiries can be made about general information on clinical trials, acupuncturists' experience, ancient medical works, comprehensive literature, and more. The present databank lays a foundation for subsequent evaluation of the quality of the Deqi literature and for data mining of as yet undetected Deqi knowledge.

  14. Cloud-based robot remote control system for smart factory

    NASA Astrophysics Data System (ADS)

    Wu, Zhiming; Li, Lianzhong; Xu, Yang; Zhai, Jingmei

    2015-12-01

    With the development of internet technologies and the wide application of robots, there is a clear trend toward the integration of networks and robots. A cloud-based robot remote control system for the smart factory is proposed, which enables remote users to control robots over networks and thereby realize intelligent production. To achieve this, a three-layer system architecture is designed, comprising a user layer, a service layer, and a physical layer. The remote control applications running on the cloud server are developed on Microsoft Azure. Moreover, DIV+CSS technologies are used to design the human-machine interface, lowering maintenance cost and improving development efficiency. Finally, an experiment is implemented to verify the feasibility of the approach.
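
    The three-layer split can be sketched as a tiny relay service: remote users (user layer) post commands to a cloud endpoint (service layer), which the factory-side controller (physical layer) polls. This is an invented minimal illustration, with Flask standing in for the Azure-hosted service; all routes and payloads are assumptions.

    ```python
    # Invented minimal relay between the three layers described above.
    from flask import Flask, request

    app = Flask(__name__)
    COMMAND_QUEUE = []  # commands awaiting pickup by the robot controller

    @app.route("/command", methods=["POST"])
    def submit_command():
        # User layer: a remote user submits a command, e.g. {"axis": 1, "deg": 90}.
        COMMAND_QUEUE.append(request.get_json())
        return {"queued": len(COMMAND_QUEUE)}

    @app.route("/next", methods=["GET"])
    def next_command():
        # Physical layer: the robot controller polls for the next command;
        # an empty dict means there is nothing to do.
        return COMMAND_QUEUE.pop(0) if COMMAND_QUEUE else {}

    if __name__ == "__main__":
        app.run()
    ```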

  15. Web Service Distributed Management Framework for Autonomic Server Virtualization

    NASA Astrophysics Data System (ADS)

    Solomon, Bogdan; Ionescu, Dan; Litoiu, Marin; Mihaescu, Mircea

    Virtualization for the x86 platform has recently imposed itself as a new technology that can improve the usage of machines in data centers and decrease the cost and energy of running a high number of servers. Similar to virtualization, autonomic computing, and more specifically self-optimization, aims to improve server farm usage through provisioning and deprovisioning of instances as needed by the system. Autonomic systems are able to determine the optimal number of server machines - real or virtual - to use at a given time, and to add or remove servers from a cluster in order to achieve optimal usage. While provisioning and deprovisioning of servers is very important, the way the autonomic system is built matters as well, since a robust and open framework is needed. One such management framework is Web Service Distributed Management (WSDM), an open standard of the Organization for the Advancement of Structured Information Standards (OASIS). This paper presents an open framework built on top of the WSDM specification, which aims to provide self-optimization for application servers residing on virtual machines.
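
    At its simplest, the self-optimization loop reduces to sizing the cluster against observed load and then provisioning or deprovisioning toward that target. The toy sketch below shows that decision step only; the capacity figure is invented, and a WSDM-based system would wrap such logic behind standard web service interfaces.

    ```python
    # Toy version of the self-optimization decision described above: choose
    # how many (real or virtual) instances the observed load requires.
    import math

    CAPACITY_PER_SERVER = 250.0   # requests/s one instance handles (invented)

    def target_instances(load_rps: float) -> int:
        """Smallest cluster size keeping per-server load within capacity."""
        return max(1, math.ceil(load_rps / CAPACITY_PER_SERVER))

    current = 4
    for load in (180.0, 900.0, 2600.0):
        target = target_instances(load)
        if target > current:
            action = "provision"
        elif target < current:
            action = "deprovision"
        else:
            action = "hold"
        print(f"load={load:7.1f} rps -> {target:2d} instances ({action})")
        current = target
    ```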

  16. MO/DSD online information server and global information repository access

    NASA Technical Reports Server (NTRS)

    Nguyen, Diem; Ghaffarian, Kam; Hogie, Keith; Mackey, William

    1994-01-01

    Often in the past, standards and new technology information have been available only in hardcopy form, with reproduction and mailing costs proving rather significant. In light of NASA's current budget constraints and in the interest of efficient communications, the Mission Operations and Data Systems Directorate (MO&DSD) New Technology and Data Standards Office recognizes the need for an online information server (OLIS). This server would allow: (1) dissemination of standards and new technology information throughout the Directorate more quickly and economically; (2) online browsing and retrieval of documents that have been published for and by MO&DSD; and (3) searching for current and past study activities on related topics within NASA before issuing a task. This paper explores a variety of available information servers and searching tools, their current capabilities and limitations, and the application of these tools to MO&DSD. Most importantly, the discussion focuses on the way this concept could be easily applied toward improving dissemination of standards and new technologies and improving documentation processes.

  17. Escape Excel: A tool for preventing gene symbol and accession conversion errors.

    PubMed

    Welsh, Eric A; Stewart, Paul A; Kuenzi, Brent M; Eschrich, James A

    2017-01-01

    Microsoft Excel automatically converts certain gene symbols, database accessions, and other alphanumeric text into dates, scientific notation, and other numerical representations. These conversions lead to subsequent, irreversible, corruption of the imported text. A recent survey of popular genomic literature estimates that one-fifth of all papers with supplementary gene lists suffer from this issue. Here, we present an open-source tool, Escape Excel, which prevents these erroneous conversions by generating an escaped text file that can be safely imported into Excel. Escape Excel is implemented in a variety of formats (http://www.github.com/pstew/escape_excel), including a command line based Perl script, a Windows-only Excel Add-In, an OS X drag-and-drop application, a simple web-server, and as a Galaxy web environment interface. Test server implementations are accessible as a Galaxy interface (http://apostl.moffitt.org) and simple non-Galaxy web server (http://apostl.moffitt.org:8000/). Escape Excel detects and escapes a wide variety of problematic text strings so that they are not erroneously converted into other representations upon importation into Excel. Examples of problematic strings include date-like strings, time-like strings, leading zeroes in front of numbers, and long numeric and alphanumeric identifiers that should not be automatically converted into scientific notation. It is hoped that greater awareness of these potential data corruption issues, together with diligent escaping of text files prior to importation into Excel, will help to reduce the amount of Excel-corrupted data in scientific analyses and publications.
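
    The escaping idea is simple enough to sketch. The following is our own minimal Python rendering of it, not the project's Perl script, and it covers only a subset of the problematic patterns (gene symbols such as SEPT1, which Excel reads as dates, would additionally require a symbol list).

    ```python
    # Python re-implementation of the core escaping idea (the real tool is a
    # Perl script). Prefixing a risky value with ="..." makes Excel keep it
    # as literal text instead of converting it.
    import csv
    import re
    import sys

    RISKY = re.compile(
        r"^(\d+[-/]\d+([-/]\d+)?"     # date-like, e.g. 3-1 or 2016/03/01
        r"|\d+:\d+(:\d+)?"            # time-like, e.g. 1:23:45
        r"|0\d+"                      # leading zero, e.g. 007
        r"|\d+(\.\d+)?[eE]\d+"        # would become scientific notation
        r")$"
    )

    def escape(value: str) -> str:
        return f'="{value}"' if RISKY.match(value) else value

    # Escape every field of a tab-delimited stream: stdin -> stdout.
    writer = csv.writer(sys.stdout, delimiter="\t")
    for row in csv.reader(sys.stdin, delimiter="\t"):
        writer.writerow([escape(field) for field in row])
    ```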

  18. Accountable Information Flow for Java-Based Web Applications

    DTIC Science & Technology

    2010-01-01

    Figure 2 shows the Swift architecture: a runtime library, the Swift server runtime, the Java servlet framework, and an HTTP web server on the server side, communicating with a web browser on the client side. On the server, the Java application code links against Swift's server-side run-time library, which in turn sits on top of the standard Java servlet framework. AFRL-RI-RS-TR-2010-9, Final Technical Report, January 2010: Accountable Information Flow for Java-Based Web Applications.

  19. General Framework for Animal Food Safety Traceability Using GS1 and RFID

    NASA Astrophysics Data System (ADS)

    Cao, Weizhu; Zheng, Limin; Zhu, Hong; Wu, Ping

    GS1 is a global traceability standard composed of an encoding system (EAN/UCC, EPC), automatic identification data carriers (bar codes, RFID), and electronic data interchange standards (EDI, XML). RFID is a non-contact, multi-object automatic identification technique. Tracing food to its source, standardizing RFID tags, and sharing dynamic data are urgent problems for current traceability systems. This paper presents a general framework for animal food safety traceability using GS1 and RFID. The framework uses RFID tags encoded according to the EPCglobal tag data standards. Each information server has an access tier, a business tier, and a resource tier. These servers are heterogeneous and distributed, providing user access interfaces via the SOAP or HTTP protocols. To share dynamic data, a discovery service and an object name service are used to locate the dynamic, distributed information servers.
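
    The locate-then-query chain in the last sentence can be illustrated with a toy object name service that maps an EPC's company-prefix/item-reference portion to the information server holding its dynamic data. All identifiers and URLs below are invented, and a real object name service is DNS-based rather than an in-memory table.

    ```python
    # Toy object name service: map an EPC (minus its serial number) to the
    # distributed information server that holds its dynamic data.
    ONS = {
        "urn:epc:id:sgtin:0614141.107346":
            "https://farm-a.example.org/epcis",
        "urn:epc:id:sgtin:0614141.107347":
            "https://abattoir-b.example.org/epcis",
    }

    def locate_information_server(epc: str) -> str:
        prefix = epc.rsplit(".", 1)[0]  # strip the item serial number
        return ONS[prefix]

    epc = "urn:epc:id:sgtin:0614141.107346.2017"
    print(locate_information_server(epc))
    # -> https://farm-a.example.org/epcis
    ```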

  20. Development of a Web-based Glaucoma Registry at King Khaled Eye Specialist Hospital, Saudi Arabia: A Cost-Effective Methodology

    PubMed Central

    Zaman, Babar; Khandekar, Rajiv; Al Shahwan, Sami; Song, Jonathan; Al Jadaan, Ibrahim; Al Jiasim, Leyla; Owaydha, Ohood; Asghar, Nasira; Hijazi, Amar; Edward, Deepak P.

    2014-01-01

    In this brief communication, we present the steps used to establish a web-based congenital glaucoma registry at our institution. The contents of a case report form (CRF) were developed by a group of glaucoma subspecialists. Information technology (IT) specialists used Lime Survey™ software to create an electronic CRF. A MySQL (My Structured Query Language) server running on a virtual machine was used as the database. Two ophthalmologists and 2 IT specialists worked for 7 hours, and a biostatistician and a data registrar worked for 24 hours each, to establish the electronic CRF. Using the CRF, which was transferred to the Lime Survey tool, and the MySQL server application, data could be stored directly in formats readable by programs including Microsoft Excel, SPSS, and R, and queried in real time. In a pilot test, clinical data from 80 patients with congenital glaucoma were entered into the registry, and descriptive analysis and data entry validation were performed successfully. A web-based disease registry was thus established in a short period of time, in a cost-efficient manner, using available resources and a team-based approach. PMID:24791112

  1. Development of a web-based glaucoma registry at King Khaled Eye Specialist Hospital, Saudi Arabia: a cost-effective methodology.

    PubMed

    Zaman, Babar; Khandekar, Rajiv; Al Shahwan, Sami; Song, Jonathan; Al Jadaan, Ibrahim; Al Jiasim, Leyla; Owaydha, Ohood; Asghar, Nasira; Hijazi, Amar; Edward, Deepak P

    2014-01-01

    In this brief communication, we present the steps used to establish a web-based congenital glaucoma registry at our institution. The contents of a case report form (CRF) were developed by a group of glaucoma subspecialists. Information technology (IT) specialists used Lime Survey™ software to create an electronic CRF. A MySQL (My Structured Query Language) server running on a virtual machine was used as the database. Two ophthalmologists and 2 IT specialists worked for 7 hours, and a biostatistician and a data registrar worked for 24 hours each, to establish the electronic CRF. Using the CRF, which was transferred to the Lime Survey tool, and the MySQL server application, data could be stored directly in formats readable by programs including Microsoft Excel, SPSS, and R, and queried in real time. In a pilot test, clinical data from 80 patients with congenital glaucoma were entered into the registry, and descriptive analysis and data entry validation were performed successfully. A web-based disease registry was thus established in a short period of time, in a cost-efficient manner, using available resources and a team-based approach.

  2. A web-based quantitative signal detection system on adverse drug reaction in China.

    PubMed

    Li, Chanjuan; Xia, Jielai; Deng, Jianxiong; Chen, Wenge; Wang, Suzhen; Jiang, Jing; Chen, Guanquan

    2009-07-01

    To establish a web-based quantitative signal detection system for adverse drug reactions (ADRs) based on spontaneous reporting to the Guangdong province drug-monitoring database in China. Using the Microsoft Visual Basic and Active Server Pages programming languages and SQL Server 2000, a web-based system with three software modules was programmed to perform data preparation and association detection and to generate reports. The information component (IC), an internationally recognized measure of disproportionality for quantitative signal detection, was integrated into the system, and its capacity for signal detection was tested with ADR reports collected from 1 January 2002 to 30 June 2007 in Guangdong. A total of 2,496 associations, including known signals, were mined from the test database. Signals (e.g., cefradine-induced hematuria) were found early using the IC analysis. In addition, 291 drug-ADR associations were flagged for the first time in the second quarter of 2007. The system can be used to detect significant associations in the Guangdong drug-monitoring database and could be an extremely useful adjunct to expert assessment of very large numbers of spontaneously reported ADRs, the first such system in China.
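
    In its simplest observed-count form, the IC compares how often a drug-reaction pair is reported with how often it would be reported if drug and reaction were independent: IC = log2(P(x,y) / (P(x)P(y))). The sketch below computes that point estimate with invented counts; production systems such as the one described typically use a Bayesian (BCPNN-style) estimate with shrinkage and credibility intervals.

    ```python
    # Simplest observed-count information component (IC); counts are invented.
    from math import log2

    def information_component(n_xy: int, n_x: int, n_y: int, n: int) -> float:
        """n_xy: reports with drug x and reaction y; n_x: reports with drug x;
        n_y: reports with reaction y; n: all reports in the database."""
        p_xy = n_xy / n
        p_x, p_y = n_x / n, n_y / n
        return log2(p_xy / (p_x * p_y))

    # Example: 12 cefradine-hematuria reports out of 100,000 total, with 400
    # cefradine reports and 900 hematuria reports overall.
    ic = information_component(n_xy=12, n_x=400, n_y=900, n=100_000)
    print(f"IC = {ic:.2f}")  # IC > 0: the pair is reported more often than
                             # expected under independence
    ```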

  3. HPC in a HEP lab: lessons learned from setting up cost-effective HPC clusters

    NASA Astrophysics Data System (ADS)

    Husejko, Michal; Agtzidis, Ioannis; Baehler, Pierre; Dul, Tadeusz; Evans, John; Himyr, Nils; Meinhard, Helge

    2015-12-01

    In this paper we present our findings gathered during the evaluation and testing of Windows Server High-Performance Computing (Windows HPC) in view of potentially using it as a production HPC system for engineering applications. The Windows HPC package, an extension of Microsoft's Windows Server product, provides all essential interfaces, utilities, and management functionality for creating, operating, and monitoring a Windows-based HPC cluster infrastructure. The evaluation and test phase focused on verifying the functionality of Windows HPC, its performance, its support of commercial tools, and its integration with the users' work environment. We describe the constraints imposed by the way the CERN Data Centre is operated, the licensing of engineering tools, and the scalability and behaviour of the HPC engineering applications used at CERN. We present an initial set of requirements, created on the basis of the above constraints and of requests from the CERN engineering user community. We explain how we configured Windows HPC clusters to provide the job scheduling functionality required to support the CERN engineering user community, quality of service, user- and project-based priorities, and fair access to limited resources. Finally, we present several performance tests carried out to verify Windows HPC performance and scalability.

  4. Design and development of an IoT-based web application for an intelligent remote SCADA system

    NASA Astrophysics Data System (ADS)

    Kao, Kuang-Chi; Chieng, Wei-Hua; Jeng, Shyr-Long

    2018-03-01

    This paper presents the design of an intelligent remote electrical power supervisory control and data acquisition (SCADA) system based on the Internet of Things (IoT), with Internet Information Services (IIS) for setting up web servers, an ASP.NET model-view-controller (MVC) application using responsive web design (RWD) for remote electrical power monitoring and control, and a Microsoft SQL Server as the database. With the web browser connected to the Internet, the sensing data are sent to the client using the TCP/IP protocol, which supports mobile devices with different screen sizes. Users can issue instructions immediately without being present to check conditions, which considerably reduces labor and time costs. The developed system incorporates a remote measuring function using a wireless sensor network and utilizes a visual interface to make the human-machine interface (HMI) more intuitive. Moreover, it contains analog and basic digital input/output that can be applied to a motor driver and an inverter for integration with the remote IoT-based SCADA system, thus achieving efficient power management.
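
    The sensing-data path above can be reduced to a few lines. The sketch below shows a field device pushing one reading to the SCADA server over a raw TCP socket; the host, port, and newline-delimited JSON framing are invented stand-ins for whatever protocol the actual system uses.

    ```python
    # Invented sketch of one sensing-data push over TCP/IP.
    import json
    import socket
    import time

    reading = {"sensor": "pump-01", "kW": 3.7, "ts": time.time()}

    with socket.create_connection(("scada.example.org", 5000), timeout=5) as s:
        s.sendall((json.dumps(reading) + "\n").encode("utf-8"))
    ```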

  5. World Wide Web Server Standards and Guidelines.

    ERIC Educational Resources Information Center

    Stubbs, Keith M.

    This document defines the specific standards and general guidelines which the U.S. Department of Education (ED) will use to make information available on the World Wide Web (WWW). The purpose of providing such guidance is to ensure high quality and consistent content, organization, and presentation of information on ED WWW servers, in order to…

  6. Building and evaluating an informatics tool to facilitate analysis of a biomedical literature search service in an academic medical center library.

    PubMed

    Hinton, Elizabeth G; Oelschlegel, Sandra; Vaughn, Cynthia J; Lindsay, J Michael; Hurst, Sachiko M; Earl, Martha

    2013-01-01

    This study utilizes an informatics tool to analyze a robust literature search service in an academic medical center library. Structured interviews with librarians were conducted focusing on the benefits of such a tool, expectations for performance, and visual layout preferences. The resulting application utilizes Microsoft SQL Server and .Net Framework 3.5 technologies, allowing for the use of a web interface. Customer tables and MeSH terms are included. The National Library of Medicine MeSH database and entry terms for each heading are incorporated, resulting in functionality similar to searching the MeSH database through PubMed. Data reports will facilitate analysis of the search service.
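
    The MeSH functionality described above amounts to mapping entry terms to their preferred heading before searching, much as the PubMed MeSH database does. A toy rendering with an invented three-entry sample follows.

    ```python
    # Toy entry-term lookup: map free-text synonyms to the preferred MeSH
    # heading. The mappings below are a tiny invented sample of the real
    # National Library of Medicine vocabulary.
    MESH_ENTRY_TERMS = {
        "heart attack": "Myocardial Infarction",
        "high blood pressure": "Hypertension",
        "cancer": "Neoplasms",
    }

    def preferred_heading(term: str) -> str:
        return MESH_ENTRY_TERMS.get(term.lower(), term)

    print(preferred_heading("Heart Attack"))  # -> Myocardial Infarction
    ```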

  7. Color management in the real world: sRGB, ICM2, ICC, ColorSync, and other attempts to make color management transparent

    NASA Astrophysics Data System (ADS)

    Stokes, Michael

    1998-07-01

    A uniformly adopted color standards infrastructure has a dramatic impact on any color imaging industry and technology. This presentation begins by framing the current color standards situation in a historical context. A series of similar appearing infrastructure adoptions in color publishing during the last fifty years are reviewed and compared to the current events. This historical review is followed by brief technical, business and marketing reviews of two of the more popular recent color standards proposals, sRGB and ICC, along with their operating system implementations in the Microsoft and Apple operating systems. The paper concludes with a summary of Hewlett-Packard Company's and Microsoft's proposed future direction.

  8. Active X based standards for healthcare integration.

    PubMed

    Greenberg, D S; Welcker, B

    1998-02-01

    With cost pressures brought to the forefront by the growth of managed care, the integration of healthcare information systems is more important than ever. Providers of healthcare information are under increasing pressure to deliver timely information to end users in a cost-effective manner. Organizations have had to choose between the strong functionality of a multi-vendor 'best of breed' architecture and the strong integration of a single-vendor solution. Early system-to-system interfaces were point to point; as connectivity between systems increased, these interfaces were migrated to work across serial and, eventually, network connections. In addition, the content of the information became standardized through efforts such as HL7, ANSI X12, and Edifact. Although content-based standards go a long way towards facilitating interoperability, considerable work is still required to connect two systems even when both adhere to the standard. A key to accomplishing this goal is increasing the connectivity between disparate systems in the healthcare environment. Microsoft is working with healthcare organizations and independent software vendors to bring Microsoft's enterprise object technology, ActiveX, to the healthcare industry. While object orientation has been heralded as the 'next big thing' in computer application development, Microsoft believes that component software is the technology that will provide the greatest benefit to end users.

  9. Installation of the National Transport Code Collaboration Data Server at the ITPA International Multi-tokamak Confinement Profile Database

    NASA Astrophysics Data System (ADS)

    Roach, Colin; Carlsson, Johan; Cary, John R.; Alexander, David A.

    2002-11-01

    The National Transport Code Collaboration (NTCC) has developed an array of software, including a data client/server. The data server, written in C++, serves local data (in the ITER Profile Database format) as well as remote data (by accessing one or several MDS+ servers). The client, a web-invocable Java applet, provides a uniform, intuitive, user-friendly graphical interface to the data server. The uniformity of the interface relieves users of the trouble of mastering the differences between data formats and lets them focus on the essentials: plotting and viewing the data. The user runs the client by visiting a web page with any Java-capable web browser. The client is automatically downloaded and run by the browser. A reference to the data server is then retrieved via the standard web protocol (HTTP). Communication between the client and the server is handled by the mature, industry-standard CORBA middleware. CORBA has bindings for all common languages, and many high-quality implementations are available (both open source and commercial). The NTCC data server has been installed at the ITPA International Multi-tokamak Confinement Profile Database, which is hosted by the UKAEA at Culham Science Centre. The installation of the data server is protected by an Internet firewall, and some modifications of the server were required to make it accessible to clients outside the firewall. The working version of the ITPA confinement profile database is not open to the public; authentication of legitimate users is performed using built-in Java security features to demand a password before the client is downloaded. We present an overview of the NTCC data client/server and some details of how the CORBA firewall-traversal issues were resolved and how user authentication is implemented.

  10. [Design and implementation of medical instrument standard information retrieval system based on ASP.NET].

    PubMed

    Yu, Kaijun

    2010-07-01

    This paper analyzes the design goals of a medical instrumentation standard information retrieval system. Based on the B/S (browser/server) structure, we established a medical instrumentation standard retrieval system in the .NET environment, using the ASP.NET C# programming language, the IIS web server, and a SQL Server 2000 database. The paper also introduces the system structure, the retrieval system modules, the system development environment, and the detailed design of the system.

  11. Thermal feature extraction of servers in a datacenter using thermal image registration

    NASA Astrophysics Data System (ADS)

    Liu, Hang; Ran, Jian; Xie, Ting; Gao, Shan

    2017-09-01

    Thermal cameras provide fine-grained thermal information that enhances monitoring and enables automatic thermal management in large datacenters. Recent approaches employing mobile robots or thermal camera networks can already identify the physical locations of hot spots. Other distribution information used to optimize datacenter management can also be obtained automatically using pattern recognition technology. However, most of the features extracted from thermal images, such as shape and gradient, may be affected by changes in the position and direction of the thermal camera. This paper presents a method for extracting the thermal features of a hot spot or a server in a container datacenter. First, thermal and visual images are registered based on textural characteristics extracted from images acquired in datacenters. Then, the thermal distribution of each server is standardized. The features of a hot spot or server extracted from the standard distribution can reduce the impact of camera position and direction. The results of experiments show that image registration is efficient for aligning the corresponding visual and thermal images in the datacenter, and the standardization procedure reduces the impacts of camera position and direction on hot spot or server features.

  12. Design of SIP transformation server for efficient media negotiation

    NASA Astrophysics Data System (ADS)

    Pack, Sangheon; Paik, Eun Kyoung; Choi, Yanghee

    2001-07-01

    Voice over IP (VoIP) is one of the advanced services supported by next-generation mobile communication. VoIP should support various media formats and terminals existing together. This heterogeneous environment may prevent diverse users from establishing VoIP sessions among themselves; to solve the problem, an efficient media negotiation mechanism is required. In this paper, we propose an efficient media negotiation architecture using a transformation server and an Intelligent Location Server (ILS). The transformation server is an extended Session Initiation Protocol (SIP) proxy server: it can modify an unacceptable session INVITE message into an acceptable one using the ILS. The ILS is a directory server based on the Lightweight Directory Access Protocol (LDAP) that keeps users' location information and available media information. The proposed architecture can eliminate the unnecessary response and re-INVITE messages of the standard SIP architecture. It takes only 1.5 round trip times to negotiate two different media types, while the standard media negotiation mechanism takes 2.5 round trip times; the extra processing time in message handling is negligible in comparison to the reduced round trip time. The experimental results show that the session setup time in the proposed architecture is less than the setup time in standard SIP. These results verify that the proposed media negotiation mechanism is more efficient in solving diversity problems.

  13. Implementation of Medical Information Exchange System Based on EHR Standard

    PubMed Central

    Han, Soon Hwa; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong

    2010-01-01

    Objectives To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. Methods To assure that patients' medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, content security, and network security. Results The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. Conclusions This research chose transfer items and defined document standards that follow the CDA standard, making the exchange of CDA documents between different systems possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information. PMID:21818447

  14. Implementation of Medical Information Exchange System Based on EHR Standard.

    PubMed

    Han, Soon Hwa; Lee, Min Ho; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong

    2010-12-01

    To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. To assure that patients' medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, content security, and network security. The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. This research chose transfer items and defined document standards that follow the CDA standard, making the exchange of CDA documents between different systems possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information.

  15. Comprehensive innovative solution for resident education using the Intranet Journal of Chest Radiology.

    PubMed

    Nishino, Mizuki; Wolfe, Donna; Yam, Chun-Shan; Larson, Michael; Boiselle, Phillip M; Hatabu, Hiroto

    2004-10-01

    Because of the rapid increase in clinical workload in academic radiology departments, time for teaching rotating residents is increasingly limited. As a solution to this problem, we introduced the Intranet Journal of Chest Radiology as a comprehensive, innovative tool for assisting resident education. The Intranet Journal of Chest Radiology is constructed using Microsoft FrontPage version 2002 (Microsoft Corp, Redmond, WA) and is hosted on our departmental web server (Beth Israel Deaconess Medical Center, Boston, MA). The home page of the intranet journal provides access to the main features: "Cases of the Month," "Teaching File," "Selected Articles for Residents," "Lecture Series," and "Current Publications." These features provide quick access to selected radiology articles, interesting chest cases, and the lecture series and current publications from the chest section. Our intranet journal has been well utilized in the 6 months since its introduction. It enhances residents' interest and motivation to work on case collections, to search and read articles, and to pursue research. Frequent updating is necessary for the journal to remain current, relevant, and well utilized. The intranet journal thus serves as a comprehensive, innovative solution for resident education, providing basic educational resources and opportunities for interactive participation by residents.

  16. 78 FR 49586 - Self-Regulatory Organizations; Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... Market Maker Standard quote server as a gateway for communicating eQuotes to MIAX. Because of the... connect the Limited Service Ports to independent servers that host their eQuote and purge functionality... same server for all of their Market Maker quoting activity. Currently, Market Makers in the MIAX System...

  17. 78 FR 70615 - Self-Regulatory Organizations; Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ... rather than forcing them to use their Market Maker Standard quote server as a gateway for communicating e... technical flexibility to connect additional Limited Service Ports to independent servers that host their e... mitigate the risk of using the same server for all of their Market Maker quoting activity. By using the...

  18. MetNetAPI: A flexible method to access and manipulate biological network data from MetNet

    PubMed Central

    2010-01-01

    Background Convenient programmatic access to different biological databases allows automated integration of scientific knowledge. Many databases support a function to download files or data snapshots, or a webservice that offers "live" data. However, the functionality that a database offers cannot be represented in a static data download file, and webservices may consume considerable computational resources from the host server. Results MetNetAPI is a versatile Application Programming Interface (API) to the MetNetDB database. It abstracts, captures and retains operations away from a biological network repository and website. A range of database functions, previously only available online, can be immediately (and independently from the website) applied to a dataset of interest. Data is available in four layers: molecular entities, localized entities (linked to a specific organelle), interactions, and pathways. Navigation between these layers is intuitive (e.g. one can request the molecular entities in a pathway, as well as request in what pathways a specific entity participates). Data retrieval can be customized: Network objects allow the construction of new and integration of existing pathways and interactions, which can be uploaded back to our server. In contrast to webservices, the computational demand on the host server is limited to processing data-related queries only. Conclusions An API provides several advantages to a systems biology software platform. MetNetAPI illustrates an interface with a central repository of data that represents the complex interrelationships of a metabolic and regulatory network. As an alternative to data-dumps and webservices, it allows access to a current and "live" database and exposes analytical functions to application developers. Yet it only requires limited resources on the server side (thin server/fat client setup). The API is available for Java, Microsoft.NET and R programming environments and offers flexible query and broad data-retrieval methods. Data retrieval can be customized to client needs and the API offers a framework to construct and manipulate user-defined networks. The design principles can be used as a template to build programmable interfaces for other biological databases. The API software and tutorials are available at http://www.metnetonline.org/api. PMID:21083943

  19. EarthServer: a Summary of Achievements in Technology, Services, and Standards

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2015-04-01

    Big Data in the Earth sciences, the Tera- to Exabyte archives, mostly consist of coverage data, defined by ISO and OGC as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor time series, 2-D remote sensing imagery, 3-D x/y/t image time series and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as computing the Fourier transform of satellite images. As network bandwidth limits prohibit the transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The transatlantic EarthServer initiative, running from 2011 through 2014, united 11 partners to establish Big Earth Data Analytics. A key ingredient has been flexibility for users to ask whatever they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level, standards-based query languages which unify data and metadata search in a simple yet powerful way. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time, usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments, based on network optimization and further criteria. The EarthServer platform comprises rasdaman, the pioneering Array DBMS built for any-size multi-dimensional raster data, extended with support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data import and, hence, duplication); and the aforementioned distributed query processing. Additionally, web clients for multi-dimensional data visualization are being established. Client/server interfaces are strictly based on OGC and W3C standards, in particular the Web Coverage Processing Service (WCPS), which defines a high-level coverage query language. Reviewers have attested that "with no doubt the project has been shaping the Big Earth Data landscape through the standardization activities within OGC, ISO and beyond". We present the project approach, its outcomes and impact on standardization and Big Data technology, and vistas for the future.
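
    To give a flavor of the WCPS query language mentioned above, the sketch below submits a small coverage query over HTTP. The endpoint URL and the coverage and axis names are illustrative assumptions; the query follows the OGC WCPS "for ... return encode(...)" style.

    ```python
    # Hedged sketch of a WCPS request of the kind EarthServer standardized:
    # ask the server to subset a coverage and encode the result as CSV.
    import requests

    wcps_query = (
        'for c in (AvgLandTemp) '
        'return encode(c[Lat(53.08), Long(8.80), '
        'ansi("2014-01":"2014-12")], "csv")'
    )

    resp = requests.post(
        "https://earthserver.example.org/rasdaman/ows",
        data={"service": "WCS", "version": "2.0.1",
              "request": "ProcessCoverages", "query": wcps_query},
        timeout=30,
    )
    print(resp.text)  # monthly mean temperature values as CSV
    ```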

  20. .NET INTEROPERABILITY GUIDELINES

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modelling components (PMCs) developed by third parties to be used in any process modelling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compo...

  1. The EarthServer project: Exploiting Identity Federations, Science Gateways and Social and Mobile Clients for Big Earth Data Analysis

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Messina, Antonio; Pappalardo, Marco; Passaro, Gianluca

    2013-04-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as the client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short, "Big Earth Data Analytics" -- based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is being extended with additional space-time coverage data types. On the server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. Six Lighthouse Applications are being established in EarthServer, each of which poses distinct challenges for Earth Data Analytics: Cryospheric Science, Airborne Science, Atmospheric Science, Geology, Oceanography, and Planetary Science. Altogether, they cover all Earth Science domains; the Planetary Science use case has been added to challenge concepts and standards in non-standard environments. In addition, EarthLook (maintained by Jacobs University) showcases the use of OGC standards in 1D through 5D use cases. In this contribution we will report on the first applications integrated in the EarthServer Science Gateway and on the clients for mobile appliances developed to access them. We will also show how federated and social identity services can allow Big Earth Data Providers to expose their data in a distributed environment while keeping strict and fine-grained control over user authentication and authorisation. The degree to which the EarthServer implementation fulfils the recommendations made in the recent TERENA Study on AAA Platforms For Scientific Resources in Europe (https://confluence.terena.org/display/aaastudy/AAA+Study+Home+Page) will also be assessed.

  2. Escape Excel: A tool for preventing gene symbol and accession conversion errors

    PubMed Central

    Stewart, Paul A.; Kuenzi, Brent M.; Eschrich, James A.

    2017-01-01

    Background Microsoft Excel automatically converts certain gene symbols, database accessions, and other alphanumeric text into dates, scientific notation, and other numerical representations. These conversions lead to subsequent, irreversible, corruption of the imported text. A recent survey of popular genomic literature estimates that one-fifth of all papers with supplementary gene lists suffer from this issue. Results Here, we present an open-source tool, Escape Excel, which prevents these erroneous conversions by generating an escaped text file that can be safely imported into Excel. Escape Excel is implemented in a variety of formats (http://www.github.com/pstew/escape_excel), including a command line based Perl script, a Windows-only Excel Add-In, an OS X drag-and-drop application, a simple web-server, and as a Galaxy web environment interface. Test server implementations are accessible as a Galaxy interface (http://apostl.moffitt.org) and simple non-Galaxy web server (http://apostl.moffitt.org:8000/). Conclusions Escape Excel detects and escapes a wide variety of problematic text strings so that they are not erroneously converted into other representations upon importation into Excel. Examples of problematic strings include date-like strings, time-like strings, leading zeroes in front of numbers, and long numeric and alphanumeric identifiers that should not be automatically converted into scientific notation. It is hoped that greater awareness of these potential data corruption issues, together with diligent escaping of text files prior to importation into Excel, will help to reduce the amount of Excel-corrupted data in scientific analyses and publications. PMID:28953918

  3. Consumer server: A UNIX based event distributor in new CDF data acquisition system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abe, F.; Morita, Y.; Nomachi, M.

    1994-12-31

    Consumer Server is a program that handles event data and consumer trigger request I/O among the Level 3 farm and consumer processes in the new CDF data acquisition system. The program uses standard UNIX libraries and commercial network technologies to obtain higher portability. The authors describe the concept and configuration of the Consumer Server and report its performance.

  4. Migration of legacy mumps applications to relational database servers.

    PubMed

    O'Kane, K C

    2001-07-01

    An extended implementation of the Mumps language is described that facilitates vendor-neutral migration of legacy Mumps applications to SQL-based relational database servers. Implemented as a compiler, this system translates Mumps programs to operating-system-independent, standard C code for subsequent compilation to fully stand-alone binary executables. Added built-in functions and support modules extend the native hierarchical Mumps database with access to industry-standard, networked, relational database management servers (RDBMS), thus freeing Mumps applications from dependence upon vendor-specific, proprietary, unstandardized database models. Unlike Mumps systems that have added captive, proprietary RDBMS access, the programs generated by this development environment can be used with any RDBMS that supports common network access protocols. Additional features include a built-in web server interface and the ability to interoperate directly with programs and functions written in other languages.

  5. The DICOM-based radiation therapy information system

    NASA Astrophysics Data System (ADS)

    Law, Maria Y. Y.; Chan, Lawrence W. C.; Zhang, Xiaoyan; Zhang, Jianguo

    2004-04-01

    Similar to DICOM for PACS (Picture Archiving and Communication System), standards for radiotherapy (RT) information have been ratified, with seven DICOM-RT objects and their IODs (Information Object Definitions), which cover more than just images. This presentation describes how a DICOM-based RT information system server can be built on PACS technology and its data model for web-based distribution. Methods: The RT information system consists of a modality simulator, a data format translator, an RT gateway, the DICOM RT Server, and the web-based application server. The DICOM RT Server was designed based on a PACS data model and was connected to a web application server for distribution of RT information including therapeutic plans, structures, dose distributions, images, and records. The various DICOM RT objects of a patient transmitted to the RT Server were routed to the web application server, where the contents of the DICOM RT objects were decoded and mapped to the corresponding locations of the RT data model for display in a specially designed graphical user interface. Non-DICOM objects were first rendered into DICOM RT objects in the translator before they were sent to the RT Server. Results: Ten clinical cases were collected from different hospitals for evaluation of the DICOM-based RT information system. They were successfully routed through the data flow and displayed on the client workstation of the RT information system. Conclusion: Using the DICOM-RT standards, integration of RT data from different vendors is possible.
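
    As a small illustration of handling DICOM-RT objects, the sketch below reads one object from disk and branches on its type, much as a gateway might before routing it. pydicom is a widely used open-source DICOM library (not part of the system described); the file name is an assumption, and a real server would receive objects over DICOM networking rather than read them from files.

    ```python
    # Inspect a DICOM-RT object and branch on its type before routing.
    import pydicom

    ds = pydicom.dcmread("rtplan.dcm")   # assumed example file
    print("Modality:  ", ds.Modality)    # e.g. RTPLAN, RTDOSE, RTSTRUCT
    print("SOP Class: ", ds.SOPClassUID)
    print("Patient ID:", ds.PatientID)

    # Route by RT object type, mirroring the data-model mapping above.
    if ds.Modality == "RTPLAN":
        print("Plan label:", ds.get("RTPlanLabel", "<none>"))
    ```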

  6. Educational Utilization of Microsoft Powerpoint for Oral and Maxillofacial Cancer Presentations.

    PubMed

    Carvalho, Francisco Samuel Rodrigues; Chaves, Filipe Nobre; Soares, Eduardo Costa Studart; Pereira, Karuza Maria Alves; Ribeiro, Thyciana Rodrigues; Fonteles, Cristiane Sa Roriz; Costa, Fabio Wildson Gurgel

    2016-01-01

    Electronic presentations have become useful tools for surgeons, other clinicians, and patients, facilitating medical and legal support and scientific research. Microsoft® PowerPoint is by far the most commonly used computer-based presentation package. Setting up surgical clinical cases in PowerPoint makes it easy to register and follow patients for the purposes of treatment-plan discussion or scientific presentation. It facilitates communication between professionals, the supervision of clinical cases, and teaching. It is often useful to create a template to standardize presentations, which the software supports through the slide master. The purpose of this paper was to show a simple and practical method for creating a Microsoft® PowerPoint template for use in presentations concerning oral and maxillofacial cancer.

  7. EarthServer: Cross-Disciplinary Earth Science Through Data Cube Analytics

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Rossi, A. P.

    2016-12-01

    The unprecedented increase of imagery, in-situ measurements, and simulation data produced by Earth (and Planetary) Science observation missions bears a rich, yet largely unleveraged, potential for gaining insight by integrating such diverse datasets and transforming scientific questions into actual queries to data, formulated in a standardized way. The intercontinental EarthServer [1] initiative is demonstrating new directions for flexible, scalable Earth Science services based on innovative NoSQL technology. Researchers from Europe, the US, and Australia have teamed up to rigorously implement the concept of the datacube. Such a datacube may have spatial and temporal dimensions (such as a satellite image time series) and may unite an unlimited number of scenes. Independently of whatever efficient data structuring a server network may perform internally, users (scientists, planners, decision makers) will always see just a few datacubes they can slice and dice. EarthServer has established client [2] and server technology for such spatio-temporal datacubes. The underlying scalable array engine, rasdaman [3,4], enables direct interaction, including 3-D visualization, common EO data processing, and general analytics. Services exclusively rely on the open OGC "Big Geo Data" standards suite, the Web Coverage Service (WCS). Conversely, EarthServer has shaped and advanced WCS based on the experience gained. The first phase of EarthServer advanced scalable array database technology into 150+ TB services. Currently, Petabyte datacubes are being built for ad-hoc and cross-disciplinary querying, e.g., using climate, Earth observation, and ocean data. We will present the EarthServer approach, its impact on OGC / ISO / INSPIRE standardization, and its platform technology, rasdaman. References: [1] Baumann, et al. (2015) DOI: 10.1080/17538947.2014.1003106. [2] Hogan, P. (2011) NASA World Wind, Proceedings of the 2nd International Conference on Computing for Geospatial Research & Applications, ACM. [3] Baumann, P., et al. (2014) In Proc. 10th ICDM, 194-201. [4] Dumitru, A., et al. (2014) In Proc. ACM SIGMOD Workshop on Data Analytics in the Cloud (DanaC'2014), 1-4.

  8. BioBarcode: a general DNA barcoding database and server platform for Asian biodiversity resources.

    PubMed

    Lim, Jeongheui; Kim, Sang-Yoon; Kim, Sungmin; Eo, Hae-Seok; Kim, Chang-Bae; Paek, Woon Kee; Kim, Won; Bhak, Jong

    2009-12-03

    DNA barcoding provides a rapid, accurate, and standardized method for species-level identification using short DNA sequences. Such a standardized identification method is useful for mapping all the species on Earth, particularly now that DNA sequencing technology is cheaply available. Many nations in Asia hold rich biodiversity resources that need to be mapped and registered in databases. We have built a general DNA barcode data processing system, BioBarcode, from open-source software; it is a general-purpose database and server that uses the mySQL RDBMS 5.0, BLAST2, and the Apache httpd server. An exemplary BioBarcode database holds around 11,300 specimen entries (including GenBank data) and registers biological species to map their genetic relationships. The BioBarcode database contains a chromatogram viewer, which improves performance in DNA sequence analyses. Asia has a very high degree of biodiversity, and the BioBarcode database server system aims to provide an efficient bioinformatics protocol that can be freely used by Asian researchers and research organizations interested in DNA barcoding. BioBarcode promotes the rapid acquisition of biological species DNA sequence data that meet global standards by providing specialized services, and it provides useful tools that will make barcoding cheaper and faster in the biodiversity community, such as standardization, deposition, management, and analysis of DNA barcode data. The system can be downloaded upon request, and an exemplary server has been constructed with which to build an Asian biodiversity system http://www.asianbarcode.org.
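
    The identification step itself reduces to matching a query sequence against references. The toy below substitutes a simple mismatch count for the BLAST2 search the real system uses; the species names and sequences are invented.

    ```python
    # Toy identification: match a query barcode fragment against reference
    # barcodes by mismatch count (the real system runs BLAST2 against a
    # mySQL-backed reference set).
    REFERENCES = {
        "Panthera tigris": "ACCTGGCATTAGCCCTAAGCCTACTAATTCGAGC",
        "Panthera leo":    "ACCTGGCATTAGCTCTAAGCCTGCTAATTCGAGC",
    }

    def identify(query: str) -> str:
        def mismatches(ref: str) -> int:
            return sum(a != b for a, b in zip(query, ref))
        return min(REFERENCES, key=lambda sp: mismatches(REFERENCES[sp]))

    print(identify("ACCTGGCATTAGCCCTAAGCCTACTAATTCGAGT"))
    # -> Panthera tigris (one mismatch vs. three)
    ```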

  9. Dynamic alarm response procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, J.; Gordon, P.; Fitch, K.

    2006-07-01

    The Dynamic Alarm Response Procedure (DARP) system provides a robust, web-based alternative to existing hard-copy alarm response procedures. This paperless system improves performance by eliminating time wasted looking up paper procedures by number, looking up plant process values and equipment and component status at graphical displays or panels, and maintaining the procedures. Because it is a web-based system, it is platform independent. DARPs can be served from any web server that supports CGI scripting, such as Apache®, IIS®, TclHTTPD, and others. DARP pages can be viewed in any web browser that supports Javascript and Scalable Vector Graphics (SVG), such as Netscape®, Microsoft Internet Explorer®, Mozilla Firefox®, Opera®, and others. (authors)

  10. Server-Based and Server-Less Byod Solutions to Support Electronic Learning

    DTIC Science & Technology

    2016-06-01

    Knowledge Online NSD National Security Directive OS operating system OWA Outlook Web Access PC personal computer PED personal electronic device PDA...mobile devices, institute mobile device policies and standards, and promote the development and use of DOD mobile and web-enabled applications" (DOD...with an isolated BYOD web server, properly educated system administrators must carry out and execute the necessary, pre-defined network security

  11. Hematology and Serum Biochemistry Reference Intervals for Six-Week-Old, Farm-Reared Chinese Ring-Necked Pheasants ( Phasianus colchicus ) from Minnesota.

    PubMed

    Dzikamunhenga, R S; Griffith, R W; Hostetter, S; Fisher, P; Larson, W

    2017-06-01

    Chinese ring-necked pheasants ( Phasianus colchicus ) are commonly farmed in intensive operations for purposes such as meat production, hunting preserves, or research. Under these conditions, pheasants frequently suffer medical ailments such as bacterial, viral, and parasitic infections or nutritional or metabolic disorders. Relatively little scientific information exists regarding clinical pathology reference intervals (RIs) for farm-reared pheasants. The objective of this study was to determine RIs for hematologic and serum biochemical variables for Chinese ring-necked pheasants from Minnesota at 6 wk of age. Blood samples from 119 clinically healthy Chinese ring-necked pheasants were analyzed using standard techniques. Reference intervals were generated in Microsoft® Excel® 2013 (Microsoft, Redmond, WA) using Reference Value Advisor freeware version 2.1 (Microsoft). Ninety-five percent RIs were determined using nonparametric methods that followed Clinical and Laboratory Standards Institute guidelines. These RIs will be useful for the monitoring of health and diagnosis of disease in confined Chinese ring-necked pheasant populations that are approximately 6 wk old.

  12. CheD: chemical database compilation tool, Internet server, and client for SQL servers.

    PubMed

    Trepalin, S V; Yarkov, A V

    2001-01-01

    An efficient program, which runs on a personal computer, for the storage, retrieval, and processing of chemical information is presented. The program can work either as a stand-alone application or in conjunction with a specifically written Web server application or with some standard SQL servers, e.g., Oracle, Interbase, and MS SQL. New types of data fields are introduced, e.g., arrays for spectral information storage, HTML and database links, and user-defined functions. CheD has an open architecture; thus, custom data types, controls, and services may be added. A WWW server application for chemical data retrieval features easy and user-friendly installation on Windows NT or 95 platforms.

  13. Using Microsoft Excel to Assess Standards: A "Techtorial". Article #2 in a 6-Part Series

    ERIC Educational Resources Information Center

    Mears, Derrick

    2009-01-01

    Standards-based assessment is a term currently being used quite often in educational reform discussions. The philosophy behind this initiative is to utilize "standards" or "benchmarks" to focus instruction and assessments of student learning. The National Standards for Physical Education (NASPE, 2004) provide a framework to guide this process for…

  14. An immersive surgery training system with live streaming capability.

    PubMed

    Yang, Yang; Guo, Xinqing; Yu, Zhan; Steiner, Karl V; Barner, Kenneth E; Bauer, Thomas L; Yu, Jingyi

    2014-01-01

    Providing real-time, interactive immersive surgical training has been a key research area in telemedicine. Earlier approaches have mainly adopted videotaped training that can only show imagery from a fixed viewpoint. Recent advances in commodity 3D imaging have enabled a new paradigm for immersive surgical training by acquiring nearly complete 3D reconstructions of actual surgical procedures. However, unlike 2D videotaping, which can easily stream data in real time, 3D-imaging-based solutions have so far required pre-capturing and processing the data; surgical training using the data has to be conducted offline after the acquisition. In this paper, we present a new real-time immersive 3D surgical training system. Our solution builds upon the recent multi-Kinect based surgical training system [1] that can acquire and display high fidelity 3D surgical procedures using only a small number of Microsoft Kinect sensors. On top of this system we build a client-server model for real-time streaming. On the server front, we efficiently fuse the multiple Kinect data streams acquired from different viewpoints, then compress and stream the data to the client. On the client front, we build an interactive space-time navigator to allow remote users (e.g., trainees) to witness the surgical procedure in real time as if they were present in the room.

  15. Defense in Depth Added to Malicious Activities Simulation Tools (MAST)

    DTIC Science & Technology

    2015-09-01

    cipher suites. The TLS Handshake is a combination of three components: handshake, change cipher spec, and alert. (1) The Handshake (Hello) The...TLS Handshake, specifically the "Hello" portion, is designed to negotiate session parameters (cipher suite). The client informs the server of the...protocols and standards that it supports and then the server selects the highest common protocols and standards. Specifically, the Client Hello message
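
    The negotiation described here is directly observable from client code. The sketch below is a minimal Python illustration (host chosen arbitrarily): it completes a TLS handshake and prints the protocol version and cipher suite the server selected from the client's offer.

        import socket
        import ssl

        # The ClientHello advertises supported versions and cipher suites;
        # the server picks the highest ones both sides have in common.
        ctx = ssl.create_default_context()

        with socket.create_connection(("www.python.org", 443)) as sock:
            with ctx.wrap_socket(sock, server_hostname="www.python.org") as tls:
                print("negotiated protocol:", tls.version())  # e.g. 'TLSv1.3'
                print("negotiated cipher:  ", tls.cipher())   # (name, proto, bits)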

  16. Judo strategy. The competitive dynamics of Internet time.

    PubMed

    Yoffie, D B; Cusumano, M A

    1999-01-01

    Competition on the Internet is creating fierce battles between industry giants and small-scale start-ups. Smart start-ups can avoid those conflicts by moving quickly to uncontested ground and, when that's no longer possible, turning dominant players' strengths against them. The authors call this competitive approach judo strategy. They use the Netscape-Microsoft battles to illustrate the three main principles of judo strategy: rapid movement, flexibility, and leverage. In the early part of the browser wars, for instance, Netscape applied the principle of rapid movement by being the first company to offer a free stand-alone browser. This allowed Netscape to build market share fast and to set the market standard. Flexibility became a critical factor later in the browser wars. In December 1995, when Microsoft announced that it would "embrace and extend" competitors' Internet successes, Netscape failed to give way in the face of superior strength. Instead it squared off against Microsoft and even turned down numerous opportunities to craft deep partnerships with other companies. The result was that Netscape lost deal after deal when competing with Microsoft for common distribution channels. Netscape applied the principle of leverage by using Microsoft's strengths against it. Taking advantage of Microsoft's determination to convert the world to Windows or Windows NT, Netscape made its software compatible with existing UNIX systems. While it is true that these principles can't replace basic execution, say the authors, without speed, flexibility, and leverage, very few companies can compete successfully on Internet time.

  17. BioBarcode: a general DNA barcoding database and server platform for Asian biodiversity resources

    PubMed Central

    2009-01-01

    Background DNA barcoding provides a rapid, accurate, and standardized method for species-level identification using short DNA sequences. Such a standardized identification method is useful for mapping all the species on Earth, particularly when DNA sequencing technology is cheaply available. There are many nations in Asia with many biodiversity resources that need to be mapped and registered in databases. Results We have built a general DNA barcode data processing system, BioBarcode, with open source software; it is a general-purpose database and server built on MySQL RDBMS 5.0, BLAST2, and the Apache httpd server. An exemplary BioBarcode database has around 11,300 specimen entries (including GenBank data) and registers biological species to map their genetic relationships. The BioBarcode database also contains a chromatogram viewer which improves performance in DNA sequence analyses. Conclusion Asia has a very high degree of biodiversity, and the BioBarcode database server system aims to provide an efficient bioinformatics protocol that can be freely used by Asian researchers and research organizations interested in DNA barcoding. BioBarcode promotes the rapid acquisition of biological species DNA sequence data that meet global standards by providing specialized services, and provides useful tools that will make barcoding cheaper and faster in the biodiversity community, such as standardization, deposition, management, and analysis of DNA barcode data. The system can be downloaded upon request, and an exemplary server has been constructed with which to build an Asian biodiversity system: http://www.asianbarcode.org. PMID:19958506

  18. Validation of the Microsoft Kinect® camera system for measurement of lower extremity jump landing and squatting kinematics.

    PubMed

    Eltoukhy, Moataz; Kelly, Adam; Kim, Chang-Young; Jun, Hyung-Pil; Campbell, Richard; Kuenze, Christopher

    2016-01-01

    Cost-effective, quantifiable assessment of lower extremity movement represents a potential improvement over standard tools for evaluation of injury risk. Ten healthy participants completed three trials of a drop jump, overhead squat, and single leg squat task. Peak hip and knee kinematics were assessed using an 8-camera BTS Smart 7000DX motion analysis system and the Microsoft Kinect® camera system. The agreement and consistency between both uncorrected and corrected Kinect kinematic variables and the BTS camera system were assessed using intraclass correlation coefficients. Peak sagittal plane kinematics measured using the Microsoft Kinect® camera system explained a significant amount of variance [Range(hip) = 43.5-62.8%; Range(knee) = 67.5-89.6%] in peak kinematics measured using the BTS camera system. Across tasks, peak knee flexion angle and peak hip flexion were found to be consistent and in agreement when the Microsoft Kinect® camera system was directly compared to the BTS camera system, and these values improved following application of a corrective factor. The Microsoft Kinect® may not be an appropriate surrogate for traditional motion analysis technology, but it may have potential applications as a real-time feedback tool in pathological or high injury risk populations.
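
    The abstract does not spell out its corrective factor; a common choice, assumed in the Python sketch below, is a least-squares linear mapping from raw Kinect angles onto the gold-standard values. The numbers are made up for illustration.

        import numpy as np

        # Raw Kinect peak angles vs. gold-standard (BTS) values, in degrees.
        kinect = np.array([95.0, 102.0, 88.0, 110.0, 97.0])
        bts    = np.array([101.0, 109.0, 93.0, 118.0, 104.0])

        # Fit angle_bts ~= slope * angle_kinect + intercept, then correct.
        slope, intercept = np.polyfit(kinect, bts, 1)
        corrected = slope * kinect + intercept

        print(f"correction: angle' = {slope:.3f} * angle + {intercept:.2f}")
        print("residual SD after correction:",
              np.std(bts - corrected, ddof=1))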

  19. Thin client (web browser)-based collaboration for medical imaging and web-enabled data.

    PubMed

    Le, Tuong Huu; Malhi, Nadeem

    2002-01-01

    Utilizing thin client software and open source server technology, a collaborative architecture was implemented allowing for sharing of Digital Imaging and Communications in Medicine (DICOM) and non-DICOM images with real-time markup. Using the Web browser as a thin client integrated with standards-based components, such as DHTML (dynamic hypertext markup language), JavaScript, and Java, collaboration was achieved through a Web server/proxy server combination utilizing Java Servlets and Java Server Pages. A typical collaborative session involved the driver, who directed the navigation of the other collaborators, the passengers, and provided collaborative markups of medical and nonmedical images. The majority of processing was performed on the server side, allowing for the client to remain thin and more accessible.

  20. Worldwide telemedicine services based on distributed multimedia electronic patient records by using the second generation Web server hyperwave.

    PubMed

    Quade, G; Novotny, J; Burde, B; May, F; Beck, L E; Goldschmidt, A

    1999-01-01

    A distributed multimedia electronic patient record (EPR) is a central component of a medicine-telematics application that supports physicians working in rural areas of South America, and offers medical services to scientists in Antarctica. A Hyperwave server is used to maintain the patient record. As opposed to common web servers--and as a second-generation web server--Hyperwave provides the capability of holding documents in a distributed web space without the problem of broken links. This enables physicians to browse through a patient's record using a standard browser, even if the patient's record is distributed over several servers. The patient record implementation is based on the "Good European Health Record" (GEHR) architecture.

  1. An object-oriented, technology-adaptive information model

    NASA Technical Reports Server (NTRS)

    Anyiwo, Joshua C.

    1995-01-01

    The primary objective was to develop a computer information system for effectively presenting NASA's technologies to American industries, for appropriate commercialization. To this end a comprehensive information management model, applicable to a wide variety of situations, and immune to computer software/hardware technological gyrations, was developed. The model consists of four main elements: a DATA_STORE, a data PRODUCER/UPDATER_CLIENT and a data PRESENTATION_CLIENT, anchored to a central object-oriented SERVER engine. This server engine facilitates exchanges among the other model elements and safeguards the integrity of the DATA_STORE element. It is designed to support new technologies, as they become available, such as Object Linking and Embedding (OLE), on-demand audio-video data streaming with compression (such as is required for video conferencing), Worldwide Web (WWW) and other information services and browsing, fax-back data requests, presentation of information on CD-ROM, and regular in-house database management, regardless of the data model in place. The four components of this information model interact through a system of intelligent message agents which are customized to specific information exchange needs. This model is at the leading edge of modern information management models. It is independent of technological changes and can be implemented in a variety of ways to meet the specific needs of any communications situation. This summer a partial implementation of the model has been achieved. The structure of the DATA_STORE has been fully specified and successfully tested using Microsoft's FoxPro 2.6 database management system. Data PRODUCER/UPDATER and PRESENTATION architectures have been developed and also successfully implemented in FoxPro; and work has started on a full implementation of the SERVER engine. The model has also been successfully applied to a CD-ROM presentation of NASA's technologies in support of Langley Research Center's TAG efforts.

  2. Controlling EPICS from a web browser.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, K., Jr.

    1999-04-13

    An alternative to using a large graphical display manager like MEDM [1,2] to interface to a control system is to use individual control objects, such as text boxes, meters, etc., running in a browser. This paper presents three implementations of this concept: one using ActiveX controls, one with Java applets, and another with Microsoft Agent. The ActiveX controls have performance nearing that of MEDM, but they only work on Windows platforms. The Java applets require a server to get around Web security restrictions and are not as fast, but they have the advantage of working on most platforms and with both of the leading Web browsers. The agent works on Windows platforms with and without a browser and allows voice recognition and speech synthesis, making it somewhat more innovative than MEDM.

  3. Request redirection paradigm in medical image archive implementation.

    PubMed

    Dragan, Dinu; Ivetić, Dragan

    2012-08-01

    It is widely recognized that JPEG2000 facilitates issues in medical imaging: storage, communication, sharing, remote access, interoperability, and presentation scalability. Therefore, JPEG2000 support was added to the DICOM standard in Supplement 61. Two approaches to supporting JPEG2000 medical images are explicitly defined by the DICOM standard: replacing the DICOM image format with the corresponding JPEG2000 codestream, or using the Pixel Data Provider service, DICOM Supplement 106. The latter assumes a two-step retrieval of the medical image: a DICOM request and response from a DICOM server, and then a JPIP request and response from a JPEG2000 server. We propose a novel strategy for transmission of scalable JPEG2000 images extracted from a single codestream over a DICOM network using the DICOM Private Data Element, without sacrificing system interoperability. It employs the request redirection paradigm: DICOM request and response from the JPEG2000 server through the DICOM server. The paper presents a programming solution for implementation of the request redirection paradigm in a DICOM-transparent manner. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  4. Corporate Schooling Meets Corporate Media: Standards, Testing, and Technophilia

    ERIC Educational Resources Information Center

    Saltman, Kenneth J.

    2016-01-01

    Educational publishing corporations and media corporations in the United States have been converging, especially through the promotion of standardization, testing, and for-profit educational technologies. Media and technology companies--including News Corp, Apple, and Microsoft--have significantly expanded their presence in public schools to sell…

  5. Recommendations for a service framework to access astronomical archives

    NASA Technical Reports Server (NTRS)

    Travisano, J. J.; Pollizzi, J.

    1992-01-01

    There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.

  6. An NTP Stratum-One Server Farm Fed By IEEE-1588

    DTIC Science & Technology

    2010-01-01

    Serial Time Code Formats," U.S. Army White Sands Missile Range, N.M. [11] J. Eidson, 2005, "IEEE-1588 Standard for a Precision Clock Synchronization ... synchronized to its Master Clocks via IRIG-B time code on a low-frequency RF distribution system. The availability of Precise Time Protocol (PTP, IEEE...forwarding back to the requestor. The farm NTP servers are synchronized to the USNO Master Clocks using IRIG-B time code. The current standard NTP

  7. The CUAHSI Water Data Center: Enabling Data Publication, Discovery and Re-use

    NASA Astrophysics Data System (ADS)

    Seul, M.; Pollak, J.

    2014-12-01

    The CUAHSI Water Data Center (WDC) supports a standards-based, services-oriented architecture for time-series data and provides a separate service to publish spatial data layers as shape files. Two new services that the WDC offers are a cloud-based server (Cloud HydroServer) for publishing data and a web-based client for data discovery. The Cloud HydroServer greatly simplifies data publication by eliminating the need for scientists to set up an SQL Server database, a requirement that has proven to be a significant barrier, and ensures greater reliability and continuity of service. Uploaders have been developed to simplify the metadata documentation process. The web-based data client eliminates the need to install a program to be used as a client and works across all computer operating systems. The services provided by the WDC are a foundation for big-data use, re-use, and meta-analyses. Using data transmission standards enables far more effective data sharing and discovery; the standards used by the WDC are part of a global set of standards that should enable scientists to access unprecedented amounts of data and to address larger-scale research questions than was previously possible. A central mission of the WDC is to ensure these services meet the needs of the water science community and are effective at advancing water science.

  8. Web Design for Space Operations: An Overview of the Challenges and New Technologies Used in Developing and Operating Web-Based Applications in Real-Time Operational Support Onboard the International Space Station, in Astronaut Mission Planning and Mission Control Operations

    NASA Technical Reports Server (NTRS)

    Khan, Ahmed

    2010-01-01

    The International Space Station (ISS) Operations Planning Team, Mission Control Centre and Mission Automation Support Network (MAS) have all evolved over the years to use commercial web-based technologies to create a configurable electronic infrastructure to manage the complex network of real-time planning, crew scheduling, resource and activity management, as well as onboard document and procedure management, required to co-ordinate ISS assembly, daily operations and mission support. While these Web technologies are classified as non-critical in nature, their use is part of an essential backbone of daily operations on the ISS and allows the crew to operate the ISS as a functioning science laboratory. The rapid evolution of the internet from 1998 (when ISS assembly began) to today, along with the nature of continuous manned operations in space, has presented a unique challenge in terms of software engineering and system development. In addition, the use of a wide array of competing internet technologies (including commercial technologies such as .NET and Java) and the special requirements of having to support this network, both nationally among various control centres for International Partners (IPs) as well as onboard the station itself, have created special challenges for the MCC Web Tools Development Team, software engineers and flight controllers who implement and maintain this system. This paper presents an overview of some of these operational challenges and the evolving nature of the solutions, including the future use of COTS-based rich internet technologies in manned space flight operations. In particular, this paper will focus on the use of Microsoft's .NET API to develop Web-based operational tools, the use of XML-based service-oriented architectures (SOA) that needed to be customized to support mission operations, the maintenance of a Microsoft IIS web server onboard the ISS, the OpsLAN, and functional-oriented Web design with AJAX.

  9. Computer assisted data analysis in intensive care: the ICDEV project--development of a scientific database system for intensive care (Intensive Care Data Evaluation Project).

    PubMed

    Metnitz, P G; Laback, P; Popow, C; Laback, O; Lenz, K; Hiesmayr, M

    1995-01-01

    Patient Data Management Systems (PDMS) for ICUs collect, present and store clinical data. Various intentions make analysis of those digitally stored data desirable, such as quality control or scientific purposes. The aim of the Intensive Care Data Evaluation project (ICDEV) was to provide a database tool for the analysis of data recorded at various ICUs at the University Clinics of Vienna General Hospital, where two different PDMSs are used: CareVue 9000 (Hewlett Packard, Andover, USA) at two ICUs (one medical ICU and one neonatal ICU) and PICIS Chart+ (PICIS, Paris, France) at one cardiothoracic ICU. CONCEPT AND METHODS: Clinically oriented analysis of the data collected in a PDMS at an ICU was the starting point of the development. After defining the database structure we established a client-server based database system under Microsoft Windows NT and developed a user-friendly data querying application using Microsoft Visual C++ and Visual Basic. ICDEV was successfully installed at three different ICUs; adjustments to the different PDMS configurations were done within a few days. The database structure developed by us enables a powerful query concept representing an 'EXPERT QUESTION COMPILER' which may help to answer almost any clinical question. Several program modules facilitate queries at the patient, group and unit level. Results from ICDEV queries are automatically transferred to Microsoft Excel for display (in the form of configurable tables and graphs) and further processing. The ICDEV concept is configurable for adjustment to different intensive care information systems and can be used to support computerized quality control. However, as long as there exists no sufficient artifact recognition or data validation software for automatically recorded patient data, the reliability of these data and their usage for computer-assisted quality control remain unclear and should be further studied.

  10. The USGODAE Monterey Data Server

    NASA Astrophysics Data System (ADS)

    Sharfstein, P.; Dimitriou, D.; Hankin, S.

    2005-12-01

    The USGODAE Monterey Data Server (http://www.usgodae.org/) has been established at the Fleet Numerical Meteorology and Oceanography Center (FNMOC) as an explicit U.S. contribution to GODAE. The server is operated with oversight and funding from the Office of Naval Research (ONR). Support of the GODAE Monterey Data Server is accomplished by a cooperative effort between FNMOC and NOAA's Pacific Marine Environmental Laboratory (PMEL) in the on-going development of the GODAE server and the support of a collaborative network of GODAE assimilation groups. This server hosts near real-time in-situ oceanographic data available from the Global Telecommunications System (GTS) and other FTP sites, atmospheric forcing fields suitable for driving ocean models, and unique GODAE data sets, including demonstration ocean model products. It supports GODAE participants, as well as the broader oceanographic research community, and is becoming a significant node in the international GODAE program. GODAE is envisioned as a global system of observations, communications, modeling and assimilation, which will deliver regular, comprehensive information on the state of the oceans in a way that will promote and engender wide utility and availability of this resource for maximum benefit to society. It aims to make ocean monitoring and prediction a routine activity in a manner similar to weather forecasting. GODAE will contribute to an information system for the global ocean that will serve interests from climate and climate change to ship routing and fisheries. The USGODAE Server is developed and operated as a prototypical node for this global information system. Presenting data with a consistent interface and ensuring its availability in the maximum number of standard formats is one of the primary challenges in hosting the many diverse formats and broad range of data used by the GODAE community. To this end, all USGODAE data sets are available in their original format via HTTP and FTP. In addition, USGODAE data are served using Local Data Manager (LDM), THREDDS cataloging, OPeNDAP, and GODAE Live Access Server (LAS) from PMEL. Every effort is made to serve USGODAE data through the standards specified by the National Virtual Ocean Data System (NVODS) and the Integrated Ocean Observing System Data Management and Communications (IOOS/DMAC) specifications. USGODAE serves FNMOC GRIB files from the Navy Operational Global Atmospheric Prediction System (NOGAPS) and the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) as OPeNDAP data sets using the GrADS Data Server (GDS). The server also provides several FNMOC custom IEEE binary format high resolution ocean analysis products and model outputs through GDS. These data sets are also made available through LAS. The Server functions as one of two Argo Global Data Assembly Centers (GDACs), hosting the complete collection of quality-controlled Argo temperature/salinity profiling float data. The Argo collection includes all available Delayed-Mode (scientific quality controlled and corrected) data. USGODAE Argo data are served through OPeNDAP and LAS, which provide complete integration of the Argo data set into NVODS and the IOOS/DMAC. By providing researchers flexible, easy access to data through standard Internet and oceanographic interfaces, the USGODAE Monterey Data Server has become an invaluable resource for oceanographic research. 
Also, by promoting community data-serving projects, USGODAE strengthens the community and helps to advance data-serving standards.

  11. WASP (Write a Scientific Paper) using Excel - 6: Standard error and confidence interval.

    PubMed

    Grech, Victor

    2018-03-01

    The calculation of descriptive statistics includes the calculation of standard error and confidence interval, an inevitable component of data analysis in inferential statistics. This paper provides pointers as to how to do this in Microsoft Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
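
    For reference, the quantities discussed are computed as below (a standard formulation, not taken from the paper); in Excel the corresponding built-ins include STDEV.S, SQRT, COUNT, and CONFIDENCE.T.

        \mathrm{SE} = \frac{s}{\sqrt{n}}, \qquad
        \mathrm{CI}_{95\%} = \bar{x} \pm t_{0.975,\; n-1} \cdot \mathrm{SE}

    where s is the sample standard deviation, n the sample size, and x̄ the sample mean; for large n the critical value t approaches 1.96.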

  12. EarthServer - 3D Visualization on the Web

    NASA Astrophysics Data System (ADS)

    Wagner, Sebastian; Herzig, Pasquale; Bockholt, Ulrich; Jung, Yvonne; Behr, Johannes

    2013-04-01

    EarthServer (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, is a project to enable the management, access and exploration of massive, multi-dimensional datasets using Open Geospatial Consortium (OGC) query and processing language standards like WCS 2.0 and WCPS. To this end, a server/client architecture designed to handle Petabyte/Exabyte volumes of multi-dimensional data is being developed and deployed. As an important part of the EarthServer project, six Lighthouse Applications, major scientific data exploitation initiatives, are being established to make cross-domain, Earth Sciences related data repositories available in an open and unified manner, as service endpoints based on solutions and infrastructure developed within the project. Client technology developed and deployed in EarthServer ranges from mobile and web clients to immersive virtual reality systems, all designed to interact with a physically and logically distributed server infrastructure using exclusively OGC standards. In this contribution, we would like to present our work on a web-based 3D visualization and interaction client for Earth Sciences data using only technology found in standard web browsers, without requiring the user to install plugins or addons. Additionally, we are able to run the Earth data visualization client on a wide range of different platforms with very different software and hardware capabilities, such as smartphones (e.g. iOS, Android) and various desktop systems. High-quality, hardware-accelerated visualization of 3D and 4D content in standard web browsers can be realized now, and we believe it will become more and more common to use this fast, lightweight and ubiquitous platform to provide insights into big datasets without requiring the user to set up a specialized client first. With that in mind, we will also point out some of the limitations we encountered using current web technologies. Underlying the EarthServer web client, and on top of HTML5, WebGL and JavaScript, we have developed the X3DOM framework (www.x3dom.org), which makes it possible to embed declarative X3D scenegraphs, an ISO-standard XML-based format for representing 3D computer graphics, directly within HTML, thus enabling developers to rapidly design 3D content that blends seamlessly into HTML interfaces using JavaScript. This approach (commonly referred to as a polyfill layer) is used to mimic native web browser support for declarative 3D content and is an important component in our web client architecture.

  13. The military health system's personal health record pilot with Microsoft HealthVault and Google Health.

    PubMed

    Do, Nhan V; Barnhill, Rick; Heermann-Do, Kimberly A; Salzman, Keith L; Gimbel, Ronald W

    2011-01-01

    To design, build, implement, and evaluate a personal health record (PHR), tethered to the Military Health System, that leverages Microsoft® HealthVault and Google® Health infrastructure based on user preference. A pilot project was conducted in 2008-2009 at Madigan Army Medical Center in Tacoma, Washington. Our PHR was architected as a flexible platform that incorporated the standards-based Continuity of Care Document and Continuity of Care Record models to map Department of Defense-sourced health data, via a secure Veterans Administration data broker, to Microsoft® HealthVault and Google® Health based on user preference. The project design and implementation were guided by provider and patient advisory panels with formal user evaluation. The pilot project included 250 beneficiary users. Approximately 73.2% of users were < 65 years of age, and 38.4% were female. Of the users, 169 (67.6%) selected Microsoft® HealthVault, and 81 (32.4%) selected Google® Health as their PHR of preference. Sample evaluation of users reflected 100% (n = 60) satisfied with convenience of record access and 91.7% (n = 55) satisfied with overall functionality of the PHR. Key lessons learned related to data-transfer decisions (push vs pull), purposeful delays in reporting sensitive information, understanding and mapping PHR use and clinical workflow, and decisions on information patients may choose to share with their provider. Currently PHRs are being viewed as empowering tools for patient activation. Design and implementation issues (e.g., technical, organizational, information security) are substantial and must be thoughtfully approached. Adopting standards into design can enhance the national goal of portability and interoperability.

  14. Autoplot and the HAPI Server

    NASA Astrophysics Data System (ADS)

    Faden, J.; Vandegriff, J. D.; Weigel, R. S.

    2016-12-01

    Autoplot was introduced in 2008 as an easy-to-use plotting tool for the space physics community. It reads data from a variety of file resources, such as CDF and HDF files, and from a number of specialized data servers, such as the PDS/PPI's DIT-DOS, CDAWeb, and the University of Iowa's RPWG Das2Server. Each of these servers has optimized methods for transmitting data to display in Autoplot, but they require coordination and specialized software to work, limiting Autoplot's ability to access new servers and datasets. Likewise, groups who would like software to access their APIs must either write their own clients or publish a specification document in hopes that people will write clients. The HAPI specification was written so that a simple, standard API could be used by both Autoplot and server implementations, removing these barriers to the free flow of time series data. Autoplot's software for communicating with HAPI servers is presented, showing the user interface scientists will use and how data servers might implement the HAPI specification to provide access to their data. This will also include instructions on how Autoplot is installed and used on desktop computers, and used to view data from the RBSP, Juno, and other missions.
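
    The request pattern a HAPI client follows is simple enough to sketch directly; the Python fragment below is illustrative only (the server URL and dataset id are hypothetical, and the parameter names follow the early drafts of the spec, which used time.min/time.max).

        import json
        import urllib.parse
        import urllib.request

        SERVER = "http://hapi.example.org/hapi"  # hypothetical endpoint

        def hapi_get(endpoint, params=None):
            """Issue one HAPI request and return the raw response text."""
            url = f"{SERVER}/{endpoint}"
            if params:
                url += "?" + urllib.parse.urlencode(params)
            with urllib.request.urlopen(url) as resp:
                return resp.read().decode()

        # 1. discover datasets, 2. read parameter metadata, 3. fetch CSV data
        catalog = json.loads(hapi_get("catalog"))
        info = json.loads(hapi_get("info", {"id": "dataset1"}))
        data = hapi_get("data", {"id": "dataset1",
                                 "time.min": "2016-01-01T00:00:00Z",
                                 "time.max": "2016-01-02T00:00:00Z"})
        print(catalog["catalog"][:3], data.splitlines()[:2])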

  15. Asynchronous data change notification between database server and accelerator controls system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, W.; Morris, J.; Nemesure, S.

    2011-10-10

    Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMS's which support DCN (such as Oracle and MS SQL server), some server side and/or client side programming may be required to make the DCN system work. This makes the setup of DCN between database server and interested clients tedious and time consuming. In accelerator control systems, there are many well established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. Asynchronous data change notification (ADCN) between database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMS systems, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
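
    The paper's implementations sit on Oracle/MS SQL plus CDEV/EPICS/ADO reflection servers; as a self-contained analogue of the same trigger-plus-notification pattern, the Python sketch below uses PostgreSQL's built-in LISTEN/NOTIFY via psycopg2 (table and channel names invented; assumes a measurements table with an id column already exists).

        import select
        import psycopg2  # third-party; PostgreSQL stands in for the DBMS
        import psycopg2.extensions

        conn = psycopg2.connect("dbname=test")
        conn.set_isolation_level(
            psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
        cur = conn.cursor()

        # Trigger: on every INSERT, raise an asynchronous notification that
        # a reflection server would relay to control-system clients.
        cur.execute("""
            CREATE OR REPLACE FUNCTION notify_change() RETURNS trigger AS $$
            BEGIN
                PERFORM pg_notify('dcn', NEW.id::text);
                RETURN NEW;
            END; $$ LANGUAGE plpgsql;
        """)
        cur.execute("""
            CREATE TRIGGER t_dcn AFTER INSERT ON measurements
            FOR EACH ROW EXECUTE FUNCTION notify_change();
        """)

        cur.execute("LISTEN dcn;")
        while True:
            if select.select([conn], [], [], 5.0)[0]:
                conn.poll()
                while conn.notifies:
                    note = conn.notifies.pop(0)
                    print("data change:", note.payload)  # push to clients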

  16. Parallelization of a Monte Carlo particle transport simulation code

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P.; Bousis, C.; Emfietzoglou, D.

    2010-05-01

    We have developed a high performance version of the Monte Carlo particle transport simulation code MC4. The original application code, developed in Visual Basic for Applications (VBA) for Microsoft Excel, was first rewritten in the C programming language to improve code portability. Several pseudo-random number generators have also been integrated and studied. The new MC4 version was then parallelized for shared and distributed-memory multiprocessor systems using the Message Passing Interface. Two parallel pseudo-random number generator libraries (SPRNG and DCMT) have been seamlessly integrated. The performance speedup of parallel MC4 has been studied on a variety of parallel computing architectures, including an Intel Xeon server with 4 dual-core processors, a Sun cluster consisting of 16 nodes of 2 dual-core AMD Opteron processors, and a 200 dual-processor HP cluster. For large problem size, which is limited only by the physical memory of the multiprocessor server, the speedup results are almost linear on all systems. We have validated the parallel implementation against the serial VBA and C implementations using the same random number generator. Our experimental results on the transport and energy loss of electrons in a water medium show that the serial and parallel codes are equivalent in accuracy. The present improvements allow for studying of higher particle energies with the use of more accurate physical models, and improve statistics as more particle tracks can be simulated in low response time.
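
    The parallelization pattern is straightforward to sketch; the Python fragment below uses mpi4py as an illustrative analogue of the paper's C/MPI code (the paper uses SPRNG/DCMT for rigorous parallel random streams, simplified here to per-rank seeding). Run it with, e.g., mpiexec -n 4 python mc_sketch.py.

        import numpy as np
        from mpi4py import MPI  # third-party: pip install mpi4py

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        N = 1_000_000                                   # histories per rank
        rng = np.random.default_rng(seed=12345 + rank)  # per-rank stream

        # Toy "transport" kernel: total path length of exponential steps.
        depth = rng.exponential(scale=1.0, size=N).sum()

        # Combine partial tallies with a reduction on rank 0.
        total = comm.reduce(depth, op=MPI.SUM, root=0)
        if rank == 0:
            print("mean step length:", total / (N * size))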

  17. Ecoupling server: A tool to compute and analyze electronic couplings.

    PubMed

    Cabeza de Vaca, Israel; Acebes, Sandra; Guallar, Victor

    2016-07-05

    Electron transfer processes are often studied through the evaluation and analysis of the electronic coupling (EC). Since most standard QM codes do not readily provide such a measure, additional, user-friendly tools to compute and analyze electronic couplings from external wave functions are of high value. The first server to provide a friendly interface for the evaluation and analysis of electronic couplings under two different approximations (FDC and GMH) is presented in this communication. The Ecoupling server accepts inputs from common QM and QM/MM software and provides useful plots to understand and analyze the results easily. The web server has been implemented in CGI-python using Apache and is accessible at http://ecouplingserver.bsc.es. The Ecoupling server is free and open to all users without login. © 2016 Wiley Periodicals, Inc.

  18. Can shoulder range of movement be measured accurately using the Microsoft Kinect sensor plus Medical Interactive Recovery Assistant (MIRA) software?

    PubMed

    Wilson, James D; Khan-Perez, Jennifer; Marley, Dominic; Buttress, Susan; Walton, Michael; Li, Baihua; Roy, Bibhas

    2017-12-01

    This study compared the accuracy of measuring shoulder range of movement (ROM) with a simple laptop-sensor combination vs. trained observers (shoulder physiotherapists and shoulder surgeons), using motion capture (MoCap) laboratory equipment as the gold standard. The Microsoft Kinect sensor (Microsoft Corp., Redmond, WA, USA) tracks 3-dimensional human motion. Ordinarily used with an Xbox (Microsoft Corp.) video game console, Medical Interactive Recovery Assistant (MIRA) software (MIRA Rehab Ltd., London, UK) allows this small sensor to measure shoulder movement with a standard computer. Shoulder movements of 49 healthy volunteers were simultaneously measured by trained observers, MoCap, and the MIRA device. Internal rotation was assessed with the shoulder abducted 90° and external rotation with the shoulder adducted. Visual estimation and MIRA measurements were compared with gold-standard MoCap measurements for agreement using Bland-Altman methods. There were 1670 measurements analyzed. The MIRA evaluations of all 4 cardinal shoulder movements were significantly more precise, with narrower limits of agreement, than the measurements of trained observers. MIRA achieved ±11° (95% confidence interval [CI], 8.7°-12.6°) for forward flexion vs. ±16° (95% CI, 14.6°-17.6°) by trained observers. For abduction, MIRA showed ±11° (95% CI, 8.7°-12.8°) against ±15° (95% CI, 13.4°-16.2°) for trained observers. MIRA attained ±10° (95% CI, 8.1°-11.9°) during external rotation measurement, whereas trained observers only reached ±21° (95% CI, 18.7°-22.6°). For internal rotation, MIRA achieved ±9° (95% CI, 7.2°-10.4°), again better than trained observers at ±18° (95% CI, 16.0°-19.3°). A laptop combined with a Microsoft Kinect sensor and the MIRA software can measure shoulder movements with acceptable levels of accuracy. This technology, which can be easily set up, may also allow precise shoulder ROM measurement outside the clinic setting. Copyright © 2017 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
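
    As a reminder of the agreement statistic used here, the Python sketch below computes a Bland-Altman bias and 95% limits of agreement (the data are made up; limits = bias ± 1.96 × SD of the paired differences).

        import numpy as np

        # Paired measurements in degrees: device vs. gold standard (invented).
        mira  = np.array([150.0, 162.0, 141.0, 155.0, 168.0])
        mocap = np.array([153.0, 159.0, 144.0, 158.0, 165.0])

        diff = mira - mocap
        bias = diff.mean()
        sd = diff.std(ddof=1)
        loa = (bias - 1.96 * sd, bias + 1.96 * sd)

        print(f"bias = {bias:.1f} deg, 95% limits of agreement = "
              f"({loa[0]:.1f}, {loa[1]:.1f}) deg")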

  19. Network issues for large mass storage requirements

    NASA Technical Reports Server (NTRS)

    Perdue, James

    1992-01-01

    File servers and supercomputing environments need high-performance networks to balance the I/O requirements seen in today's demanding computing scenarios. UltraNet is one solution, permitting both high aggregate transfer rates and high task-to-task transfer rates, as demonstrated in actual tests. UltraNet provides this capability as both a server-to-server and server-to-client access network, giving the supercomputing center the following advantages: highest-performance Transport Level connections (up to 40 MBytes/sec effective rates); matches the throughput of the emerging high-performance disk technologies, such as RAID, parallel head transfer devices and software striping; supports standard network and file system applications using a SOCKETS-based application program interface, such as FTP, rcp, rdump, etc.; supports access to the Network File System (NFS) and LARGE aggregate bandwidth for large NFS usage; provides access to a distributed, hierarchical data server capability using the DISCOS UniTree product; supports file server solutions available from multiple vendors, including Cray, Convex, Alliant, FPS, IBM, and others.

  20. Improvements to Autoplot's HAPI Support

    NASA Astrophysics Data System (ADS)

    Faden, J.; Vandegriff, J. D.; Weigel, R. S.

    2017-12-01

    Autoplot handles data from a variety of data servers. These servers communicate data in different forms, each somewhat different in capabilities and each needing new software to interface. The Heliophysics Application Programmer's Interface (HAPI) attempts to ease this by providing a standard target for clients and servers to meet. Autoplot fully supports reading data from HAPI servers, and support continues to improve as the HAPI server spec matures. This collaboration has already produced robust clients and documentation that would be expensive for groups creating their own protocol. For example, client-side data caching has been introduced, whereby Autoplot maintains a cache of data for performance and off-line use. This is a feature we considered for previous data systems, but we could never afford the time to study and implement it carefully. Also, Autoplot itself can be used as a server, making the data it can read and the results of its processing available to other data systems. Autoplot use with other data transmission systems is reviewed as well, outlining the features of each system.

  1. Creating FGDC and NBII metadata with Metavist 2005.

    Treesearch

    David J. Rugg

    2004-01-01

    This report documents a computer program for creating metadata compliant with the Federal Geographic Data Committee (FGDC) 1998 metadata standard or the National Biological Information Infrastructure (NBII) 1999 Biological Data Profile for the FGDC standard. The software runs under the Microsoft Windows 2000 and XP operating systems, and requires the presence of...

  2. BioWord: A sequence manipulation suite for Microsoft Word

    PubMed Central

    2012-01-01

    Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326

  3. BioWord: a sequence manipulation suite for Microsoft Word.

    PubMed

    Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan

    2012-06-07

    The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.

  4. SurfaceSlide: a multitouch digital pathology platform.

    PubMed

    Wang, Yinhai; Williamson, Kate E; Kelly, Paul J; James, Jacqueline A; Hamilton, Peter W

    2012-01-01

    Digital pathology provides a digital environment for the management and interpretation of pathological images and associated data. It is becoming increasingly popular to use modern computer-based tools and applications in pathological education, tissue-based research and clinical diagnosis. Uptake of this new technology is stymied by its single-user orientation and its prerequisite and cumbersome combination of mouse and keyboard for navigation and annotation. In this study we developed SurfaceSlide, a dedicated viewing platform which enables the navigation and annotation of gigapixel digitised pathological images using fingertip touch. SurfaceSlide was developed using the Microsoft Surface, a 30-inch multitouch tabletop computing platform. SurfaceSlide users can perform direct panning and zooming operations on digitised slide images. These images are downloaded onto the Microsoft Surface platform from a remote server on demand. Users can also draw annotations and key in text using an on-screen virtual keyboard. We also developed a smart caching protocol which caches the surrounding regions of a field of view at multiple resolutions, thus providing a smooth and vivid user experience and reducing the delay for image downloading from the internet. We compared the usability of SurfaceSlide against Aperio ImageScope and the PathXL online viewer. SurfaceSlide is intuitive, fast and easy to use. SurfaceSlide represents the most direct, effective and intimate human-digital slide interaction experience. It is expected that SurfaceSlide will significantly enhance digital pathology tools and applications in education and clinical practice.
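
    The caching idea lends itself to a short sketch: keep recently fetched tiles, and prefetch the neighbourhood of the current field of view at more than one resolution. The Python fragment below illustrates that pattern only; all names are invented and the fetch is a placeholder for the HTTP download.

        from functools import lru_cache

        @lru_cache(maxsize=512)
        def fetch_tile(level, row, col):
            # Placeholder for an HTTP fetch from the remote slide server.
            return f"tile(level={level}, row={row}, col={col})"

        def prefetch_neighbourhood(level, row, col, radius=1):
            """Warm the cache for tiles surrounding the field of view,
            at the current resolution and one coarser level."""
            for lv in (level, max(level - 1, 0)):
                for dr in range(-radius, radius + 1):
                    for dc in range(-radius, radius + 1):
                        fetch_tile(lv, row + dr, col + dc)

        prefetch_neighbourhood(level=5, row=10, col=7)
        print(fetch_tile.cache_info())  # hits rise as the user pans nearby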

  5. SurfaceSlide: A Multitouch Digital Pathology Platform

    PubMed Central

    Wang, Yinhai; Williamson, Kate E.; Kelly, Paul J.; James, Jacqueline A.; Hamilton, Peter W.

    2012-01-01

    Background Digital pathology provides a digital environment for the management and interpretation of pathological images and associated data. It is becoming increasingly popular to use modern computer-based tools and applications in pathological education, tissue-based research and clinical diagnosis. Uptake of this new technology is stymied by its single-user orientation and its prerequisite and cumbersome combination of mouse and keyboard for navigation and annotation. Methodology In this study we developed SurfaceSlide, a dedicated viewing platform which enables the navigation and annotation of gigapixel digitised pathological images using fingertip touch. SurfaceSlide was developed using the Microsoft Surface, a 30-inch multitouch tabletop computing platform. SurfaceSlide users can perform direct panning and zooming operations on digitised slide images. These images are downloaded onto the Microsoft Surface platform from a remote server on demand. Users can also draw annotations and key in text using an on-screen virtual keyboard. We also developed a smart caching protocol which caches the surrounding regions of a field of view at multiple resolutions, thus providing a smooth and vivid user experience and reducing the delay for image downloading from the internet. We compared the usability of SurfaceSlide against Aperio ImageScope and the PathXL online viewer. Conclusion SurfaceSlide is intuitive, fast and easy to use. SurfaceSlide represents the most direct, effective and intimate human-digital slide interaction experience. It is expected that SurfaceSlide will significantly enhance digital pathology tools and applications in education and clinical practice. PMID:22292040

  6. A Comparative Analysis of Extract, Transformation and Loading (ETL) Process

    NASA Astrophysics Data System (ADS)

    Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.

    2018-02-01

    Data and information currently grow rapidly, in varying amounts and media. This development eventually produces very large collections of data, better known as Big Data. Business Intelligence (BI) utilizes large volumes of data and information for analysis so that important information can be obtained, which can then be used to support the decision-making process. In practice, a process that integrates existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting which application is the more efficient in time, cost and effort may become a challenge. Therefore, the objective of this study was to provide a comparative analysis of the ETL process using Microsoft SQL Server Integration Services (SSIS) and Pentaho Data Integration (PDI).
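
    Independent of the tooling compared in the paper, the ETL pattern itself is easy to sketch; the Python fragment below shows the three stages using only the standard library (file, table, and column names invented).

        import csv
        import sqlite3

        # Extract: read raw rows from a source CSV with columns date,amount.
        with open("sales_raw.csv", newline="") as f:
            rows = list(csv.DictReader(f))

        # Transform: normalize types and derive a year column.
        for r in rows:
            r["amount"] = float(r["amount"])
            r["year"] = r["date"][:4]

        # Load: insert the cleaned rows into a warehouse table.
        db = sqlite3.connect("warehouse.db")
        db.execute("CREATE TABLE IF NOT EXISTS sales "
                   "(date TEXT, year TEXT, amount REAL)")
        db.executemany("INSERT INTO sales VALUES (:date, :year, :amount)", rows)
        db.commit()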

  7. Simple Web-based interactive key development software (WEBiKEY) and an example key for Kuruna (Poaceae: Bambusoideae).

    PubMed

    Attigala, Lakshmi; De Silva, Nuwan I; Clark, Lynn G

    2016-04-01

    Programs that are user-friendly and freely available for developing Web-based interactive keys are scarce and most of the well-structured applications are relatively expensive. WEBiKEY was developed to enable researchers to easily develop their own Web-based interactive keys with fewer resources. A Web-based multiaccess identification tool (WEBiKEY) was developed that uses freely available Microsoft ASP.NET technologies and an SQL Server database for Windows-based hosting environments. WEBiKEY was tested for its usability with a sample data set, the temperate woody bamboo genus Kuruna (Poaceae). WEBiKEY is freely available to the public and can be used to develop Web-based interactive keys for any group of species. The interactive key we developed for Kuruna using WEBiKEY enables users to visually inspect characteristics of Kuruna and identify an unknown specimen as one of seven possible species in the genus.

  8. Moles: Tool-Assisted Environment Isolation with Closures

    NASA Astrophysics Data System (ADS)

    de Halleux, Jonathan; Tillmann, Nikolai

    Isolating test cases from environment dependencies is often desirable, as it increases test reliability and reduces test execution time. However, code that calls non-virtual methods or consumes sealed classes is often impossible to test in isolation. Moles is a new lightweight framework which addresses this problem. For any .NET method, Moles allows test-code to provide alternative implementations, given as .NET delegates, for which C# provides very concise syntax while capturing local variables in a closure object. Using code instrumentation, the Moles framework will redirect calls to provided delegates instead of the original methods. The Moles framework is designed to work together with the dynamic symbolic execution tool Pex to enable automated test generation. In a case study, testing code programmed against the Microsoft SharePoint Foundation API, we achieved full code coverage while running tests in isolation without an actual SharePoint server. The Moles framework integrates with .NET and Visual Studio.
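
    Moles itself is .NET-only, but the detour idea, substituting a delegate that captures local state in a closure, has a close Python analogue in unittest.mock; the sketch below is that analogue, with all names invented.

        from unittest.mock import patch

        def get_config_from_server(key):
            # Environment-dependent call we want to isolate in tests.
            raise RuntimeError("network unavailable in tests")

        def feature_enabled():
            return get_config_from_server("feature_x") == "on"

        def test_feature_enabled():
            canned = {"feature_x": "on"}  # captured by the closure below
            with patch(__name__ + ".get_config_from_server",
                       side_effect=lambda key: canned[key]):
                assert feature_enabled()

        test_feature_enabled()
        print("test passed with isolated environment")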

  9. Alview: Portable Software for Viewing Sequence Reads in BAM Formatted Files.

    PubMed

    Finney, Richard P; Chen, Qing-Rong; Nguyen, Cu V; Hsu, Chih Hao; Yan, Chunhua; Hu, Ying; Abawi, Massih; Bian, Xiaopeng; Meerzaman, Daoud M

    2015-01-01

    The name Alview is a contraction of the term Alignment Viewer. Alview is a software tool, compiled to native architecture, for visualizing the alignment of sequencing data. Inputs are files of short-read sequences aligned to a reference genome in the SAM/BAM format, together with files containing reference genome data. Outputs are visualizations of these aligned short reads. Alview is written in portable C, with optional graphical user interface (GUI) code written in C, C++, and Objective-C. The application can run in three different ways: as a web server, as a command-line tool, or as a native GUI program. Alview is compatible with Microsoft Windows, Linux, and Apple OS X. It is available as a web demo at https://cgwb.nci.nih.gov/cgi-bin/alview. The source code and Windows/Mac/Linux executables are available via https://github.com/NCIP/alview.

  10. PsychVACS: a system for asynchronous telepsychiatry.

    PubMed

    Odor, Alberto; Yellowlees, Peter; Hilty, Donald; Parish, Michelle Burke; Nafiz, Najia; Iosif, Ana-Maria

    2011-05-01

    To describe the technical development of an asynchronous telepsychiatry application, the Psychiatric Video Archiving and Communication System. A client-server application was developed in Visual Basic.Net with a Microsoft® SQL database as the backend. It includes the capability of storing video-recorded psychiatric interviews and manages the workflow of the system with automated messaging. The Psychiatric Video Archiving and Communication System has been used to conduct the first ever series of asynchronous telepsychiatry consultations worldwide. A review of the software application and the process as part of this project has led to a number of improvements that are being implemented in the next version, which is being written in Java. This is the first description of the use of video-recorded data in an asynchronous telemedicine application. Primary care providers and consulting psychiatrists have found it easy to work with and a valuable resource for increasing the availability of psychiatric consultation in remote rural locations.

  11. Narrowing the scope of failure prediction using targeted fault load injection

    NASA Astrophysics Data System (ADS)

    Jordan, Paul L.; Peterson, Gilbert L.; Lin, Alan C.; Mendenhall, Michael J.; Sellers, Andrew J.

    2018-05-01

    As society becomes more dependent upon computer systems to perform increasingly critical tasks, ensuring that those systems do not fail becomes increasingly important. Many organizations depend heavily on desktop computers for day-to-day operations. Unfortunately, the software that runs on these computers is written by humans and, as such, is still subject to human error and consequent failure. A natural solution is to use statistical machine learning to predict failure. However, since failure is still a relatively rare event, obtaining labelled training data to train these models is not a trivial task. This work presents new simulated fault-inducing loads that extend the focus of traditional fault injection techniques to predict failure in the Microsoft enterprise authentication service and Apache web server. These new fault loads were successful in creating failure conditions that were identifiable using statistical learning methods, with fewer irrelevant faults being created.

  12. Development of a forestry government agency enterprise GIS system: a disconnected editing approach

    NASA Astrophysics Data System (ADS)

    Zhu, Jin; Barber, Brad L.

    2008-10-01

    The Texas Forest Service (TFS) has developed a geographic information system (GIS) for use by agency personnel in central Texas for managing oak wilt suppression and other landowner assistance programs. This Enterprise GIS system was designed to support multiple concurrent users accessing shared information resources. A disconnected editing approach was adopted to avoid the overhead of maintaining an active connection between TFS central Texas field offices and headquarters, since most field offices operate with commercially provided Internet service. The system maintains a personal geodatabase on each local field office computer. Spatial data from the field are periodically uploaded into a central master geodatabase stored in Microsoft SQL Server at the TFS headquarters in College Station through the ESRI Spatial Database Engine (SDE). This GIS allows users to work offline when editing data and requires connecting to the central geodatabase only when needed.
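
    The disconnected-editing pattern itself is simple: edits accumulate in a local store and are pushed to the central database only when a connection is available. A generic Python sketch (SQLite stands in for both the personal geodatabase and the central SQL Server; the schema is an illustrative assumption):

      # Queue edits locally; flush them to the central store when online.
      import sqlite3

      SCHEMA = "(feature_id TEXT, geometry TEXT, status TEXT)"

      def queue_edit(local_db, feature_id, geometry_wkt, status):
          con = sqlite3.connect(local_db)
          con.execute("CREATE TABLE IF NOT EXISTS pending_edits " + SCHEMA)
          con.execute("INSERT INTO pending_edits VALUES (?, ?, ?)",
                      (feature_id, geometry_wkt, status))
          con.commit()
          con.close()

      def synchronize(local_db, central_db):
          """Upload queued edits to the central database, then clear the queue."""
          local = sqlite3.connect(local_db)
          central = sqlite3.connect(central_db)
          local.execute("CREATE TABLE IF NOT EXISTS pending_edits " + SCHEMA)
          central.execute("CREATE TABLE IF NOT EXISTS suppression_sites " + SCHEMA)
          edits = local.execute("SELECT * FROM pending_edits").fetchall()
          central.executemany("INSERT INTO suppression_sites VALUES (?, ?, ?)", edits)
          central.commit()
          local.execute("DELETE FROM pending_edits")
          local.commit()
          local.close()
          central.close()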

  13. The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath

    PubMed Central

    Ellisman, M.; Hutton, T.; Kirkland, A.; Lin, A.; Lin, C.; Molina, T.; Peltier, S.; Singh, R.; Tang, K.; Trefethen, A.E.; Wallom, D.C.H.; Xiong, X.

    2009-01-01

    The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients. PMID:19487201

  14. The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath.

    PubMed

    Ellisman, M; Hutton, T; Kirkland, A; Lin, A; Lin, C; Molina, T; Peltier, S; Singh, R; Tang, K; Trefethen, A E; Wallom, D C H; Xiong, X

    2009-07-13

    The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients.

  15. System and Method for Providing a Climate Data Persistence Service

    NASA Technical Reports Server (NTRS)

    Schnase, John L. (Inventor); Ripley, III, William David (Inventor); Duffy, Daniel Q. (Inventor); Thompson, John H. (Inventor); Strong, Savannah L. (Inventor); McInerney, Mark (Inventor); Sinno, Scott (Inventor); Tamkin, Glenn S. (Inventor); Nadeau, Denis (Inventor)

    2018-01-01

    A system, method and computer-readable storage devices for providing a climate data persistence service. A system configured to provide the service can include a climate data server that performs data and metadata storage and management functions for climate data objects, a compute-storage platform that provides the resources needed to support a climate data server, provisioning software that allows climate data server instances to be deployed as virtual climate data servers in a cloud computing environment, and a service interface, wherein persistence service capabilities are invoked by software applications running on a client device. The climate data objects can be in various formats, such as International Organization for Standards (ISO) Open Archival Information System (OAIS) Reference Model Submission Information Packages, Archive Information Packages, and Dissemination Information Packages. The climate data server can enable scalable, federated storage, management, discovery, and access, and can be tailored for particular use cases.

  16. A distributed, graphical user interface based, computer control system for atomic physics experiments

    NASA Astrophysics Data System (ADS)

    Keshet, Aviv; Ketterle, Wolfgang

    2013-01-01

    Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.

  17. A distributed, graphical user interface based, computer control system for atomic physics experiments.

    PubMed

    Keshet, Aviv; Ketterle, Wolfgang

    2013-01-01

    Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.

  18. Filmless PACS in a multiple facility environment

    NASA Astrophysics Data System (ADS)

    Wilson, Dennis L.; Glicksman, Robert A.; Prior, Fred W.; Siu, Kai-Yeung; Goldburgh, Mitchell M.

    1996-05-01

    A Picture Archiving and Communication System centered on a shared image file server can support a filmless hospital. Systems based on this architecture have proven themselves in over four years of clinical operation. Changes in healthcare delivery are causing radiology groups to support multiple facilities for remote clinic support and consolidation of services. There will be a corresponding need for communicating over a standardized wide area network (WAN). Interactive workflow, a natural extension to the single facility case, requires a means to work effectively and seamlessly across moderate to low speed communication networks. Several schemes for supporting a consortium of medical treatment facilities over a WAN are explored. Both centralized and distributed database approaches are evaluated against several WAN scenarios. Likewise, several architectures for distributing image file servers or buffers over a WAN are explored, along with the caching and distribution strategies that support them. An open system implementation is critical to the success of a wide area system. The role of the Digital Imaging and Communications in Medicine (DICOM) standard in supporting multi-facility and multi-vendor open systems is also addressed. An open system can be achieved by using a DICOM server to provide a view of the system-wide distributed database. The DICOM server interface to a local version of the global database lets a local workstation treat the multiple, distributed data servers as though they were one local server for purposes of examination queries. The query will recover information about the examination that will permit retrieval over the network from the server on which the examination resides. For efficiency reasons, the ability to build cross-facility radiologist worklists and clinician-oriented patient folders is essential. The technologies of the World Wide Web can be used to generate worklists and patient folders across facilities. A reliable broadcast protocol may be a convenient way to notify many different users and many image servers about new activities in the network of image servers. In addition to ensuring reliability of message delivery and global serialization of each broadcast message in the network, the broadcast protocol should not introduce significant communication overhead.

  19. The military health system's personal health record pilot with Microsoft HealthVault and Google Health

    PubMed Central

    Barnhill, Rick; Heermann-Do, Kimberly A; Salzman, Keith L; Gimbel, Ronald W

    2011-01-01

    Objective To design, build, implement, and evaluate a personal health record (PHR), tethered to the Military Health System, that leverages Microsoft® HealthVault and Google® Health infrastructure based on user preference. Materials and methods A pilot project was conducted in 2008–2009 at Madigan Army Medical Center in Tacoma, Washington. Our PHR was architected on a flexible platform that incorporated the standards-based Continuity of Care Document and Continuity of Care Record models to map Department of Defense-sourced health data, via a secure Veterans Administration data broker, to Microsoft® HealthVault and Google® Health based on user preference. The project design and implementation were guided by provider and patient advisory panels with formal user evaluation. Results The pilot project included 250 beneficiary users. Approximately 73.2% of users were <65 years of age, and 38.4% were female. Of the users, 169 (67.6%) selected Microsoft® HealthVault, and 81 (32.4%) selected Google® Health as their PHR of preference. A sample evaluation of users found 100% (n=60) satisfied with convenience of record access and 91.7% (n=55) satisfied with overall PHR functionality. Discussion Key lessons learned related to data-transfer decisions (push vs pull), purposeful delays in reporting sensitive information, understanding and mapping PHR use and clinical workflow, and decisions on the information patients may choose to share with their provider. Conclusion PHRs are currently viewed as empowering tools for patient activation. Design and implementation issues (eg, technical, organizational, information security) are substantial and must be thoughtfully approached. Adopting standards into design can enhance the national goal of portability and interoperability. PMID:21292705

  20. Earth science big data at users' fingertips: the EarthServer Science Gateway Mobile

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Fargetta, Marco; Pappalardo, Marco; Rundo, Francesco

    2014-05-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Programme, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge array database technology. The core idea is to use database query languages as the client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data (in short, "Big Earth Data Analytics"), based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both standards, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On the server side, highly effective optimizations, such as parallel and distributed query processing, ensure scalability to Exabyte volumes. In this contribution we report on the EarthServer Science Gateway Mobile, an app for both iOS- and Android-based devices that allows users to seamlessly access some of the EarthServer applications using SAML-based federated authentication and fine-grained authorisation mechanisms.

  1. 21 CFR 1300.03 - Definitions relating to electronic orders for controlled substances and electronic prescriptions...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... standard for biometric data specifications for personal identity verification. Operating point means a... records on its servers. Audit trail means a record showing who has accessed an information technology... information on a local server or hard drive. Certificate policy means a named set of rules that sets forth the...

  2. 21 CFR 1300.03 - Definitions relating to electronic orders for controlled substances and electronic prescriptions...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... standard for biometric data specifications for personal identity verification. Operating point means a... records on its servers. Audit trail means a record showing who has accessed an information technology... information on a local server or hard drive. Certificate policy means a named set of rules that sets forth the...

  3. 21 CFR 1300.03 - Definitions relating to electronic orders for controlled substances and electronic prescriptions...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... standard for biometric data specifications for personal identity verification. Operating point means a... records on its servers. Audit trail means a record showing who has accessed an information technology... information on a local server or hard drive. Certificate policy means a named set of rules that sets forth the...

  4. Developing Weight Training Programs with Microsoft Excel: Techtorial #2. Article #3 in a 6-Part Series

    ERIC Educational Resources Information Center

    Mears, Derrick

    2009-01-01

    Developing personalized fitness programs to meet individual needs of students can be an overwhelming task for the physical educator. The National Standards for Physical Education (NASPE) indicate that students should achieve and maintain appropriate levels of physical fitness. To effectively meet standards with the growing diversity of curricular…

  5. EMR-based TeleGeriatric system.

    PubMed

    Pallawala, P M; Lun, K C

    2001-01-01

    As medical services improve due to new technologies and breakthroughs, they have led to an increasingly aging population. There has been much discussion and debate on how to solve the psychological, socio-economic and medical problems related to aging. Our effort is to implement a feasible telegeriatric medical service that uses state-of-the-art technology to deliver medical services efficiently to the remote sites where elderly homes are based. The TeleGeriatric system will support rapid decision-making in the presence of acute or subacute emergencies, and this triage will also reduce unnecessary admissions. It will enable the doctors who visit these elderly homes on a once-a-week basis to improve their geriatric management skills through communication with a geriatric specialist. Nursing skills in geriatric care will also benefit from the system. An integrated electronic medical record (EMR) system will be indispensable in the face of emergency admissions to hospitals, and the evolution of the EMR database will support future research in telegeriatrics and help to identify the areas where telegeriatrics can be optimally used. The system is based on current web browsing technology and broadband communication. The TeleGeriatric web-based server was developed using Java technology, and the TeleGeriatric database server was developed using Microsoft SQL Server. Both are based at the Medical Informatics Programme, National University of Singapore. Two elderly homes situated in the periphery of Singapore and a leading government hospital in geriatric care were chosen for the project. These three institutions and the National University of Singapore are connected via ADSL, which supports the high bandwidth necessary for high-quality videoconferencing. Each time a patient needs a teleconsultation, a nurse or doctor at the remote site sends the patient's record to the TeleGeriatric server, which forwards the request to the Alexandra Hospital for consultation. Geriatrics specialists at the Alexandra Hospital carry out teleward rounds twice weekly and on demand. Following the implementation of the system, a trial run was conducted; it demonstrated a high degree of coordination and cooperation between the remote sites and the Alexandra Hospital. Patient compliance was very high, and patients preferred teleconsultation. Initial results show that the TeleGeriatric system has definite advantages in managing geriatric patients at a remote site. As the system evolves, further research will show the areas where telegeriatrics can be used optimally.

  6. Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture

    NASA Astrophysics Data System (ADS)

    Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel

    2003-11-01

    Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.

  7. Biomechanics Analysis of Combat Sport (Silat) By Using Motion Capture System

    NASA Astrophysics Data System (ADS)

    Zulhilmi Kaharuddin, Muhammad; Badriah Khairu Razak, Siti; Ikram Kushairi, Muhammad; Syawal Abd. Rahman, Mohamed; An, Wee Chang; Ngali, Z.; Siswanto, W. A.; Salleh, S. M.; Yusup, E. M.

    2017-01-01

    ‘Silat’ is a Malay traditional martial art that is practised at both amateur and professional levels. The intensity of its motion spurs scientific research in biomechanics. The main purpose of this abstract is to present the biomechanics method used in the study of ‘silat’. Using a 3D depth-camera motion capture system, two subjects each performed ‘Jurus Satu’ in three repetitions, with one subject set as the benchmark for the research. The videos are captured and the data processed using the 3D depth-camera server system, producing 16 3D body-joint coordinates which are then transformed into displacement, velocity and acceleration components, using Microsoft Excel for data calculation and Matlab software for simulation of the body. The translated data serve as an input to differentiate the two subjects' execution of ‘Jurus Satu’. Nine primary movements, with the addition of five secondary movements, are observed visually frame by frame from the simulation to identify the exact frame in which each movement takes place. Further analysis differentiates the two subjects' execution by referring to the mean and standard deviation of the joints for each parameter stated. The findings provide useful data on joint kinematic parameters, help improve the execution of ‘Jurus Satu’, and exhibit the process of learning a relatively unknown movement through the use of a motion capture system.
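
    The displacement-to-velocity-to-acceleration step is plain numerical differentiation. A small numpy sketch with synthetic joint positions (frame rate and data are assumptions; the study used Excel and Matlab):

      # Velocity and acceleration from sampled 3D joint positions.
      import numpy as np

      fps = 30.0
      t = np.arange(0, 2, 1 / fps)  # two seconds of capture
      # One joint's (x, y, z) track; synthetic stand-in for captured data.
      positions = np.stack([np.sin(t), np.cos(t), 0.1 * t], axis=1)

      velocity = np.gradient(positions, 1 / fps, axis=0)      # units/s
      acceleration = np.gradient(velocity, 1 / fps, axis=0)   # units/s^2

      speed = np.linalg.norm(velocity, axis=1)
      print(f"mean speed {speed.mean():.3f}, std {speed.std():.3f}")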

  8. New implementation of OGC Web Processing Service in Python programming language. PyWPS-4 and issues we are facing with processing of large raster data using OGC WPS

    NASA Astrophysics Data System (ADS)

    Čepický, Jáchym; Moreira de Sousa, Luís

    2016-06-01

    The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes, client discovery of processes, and binding to those processes in workflows. Data required by a WPS can be delivered across a network or can already be available on the server. PyWPS was one of the first server-side implementations of OGC WPS. It is written in the Python programming language and aims to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues for implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible, so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
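
    For orientation, a WPS process in PyWPS 4 is a Python class wiring declared inputs and outputs to a handler. A minimal sketch following the documented PyWPS-4 API (treat the exact signatures as assumptions to verify against the installed version):

      # Minimal PyWPS-4-style process: echo a string input back as output.
      from pywps import Process, LiteralInput, LiteralOutput

      class Echo(Process):
          def __init__(self):
              super().__init__(
                  self._handler,
                  identifier="echo",
                  title="Echo a message",
                  inputs=[LiteralInput("message", "Message", data_type="string")],
                  outputs=[LiteralOutput("response", "Response", data_type="string")],
              )

          def _handler(self, request, response):
              # WPS inputs may repeat, so each input arrives as a list.
              response.outputs["response"].data = request.inputs["message"][0].data
              return response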

  9. LiveBench-1: continuous benchmarking of protein structure prediction servers.

    PubMed

    Bujnicki, J M; Elofsson, A; Fischer, D; Rychlewski, L

    2001-02-01

    We present a novel, continuous approach aimed at the large-scale assessment of the performance of available fold-recognition servers. Six popular servers were investigated: PDB-Blast, FFAS, T98-lib, GenTHREADER, 3D-PSSM, and INBGU. The assessment was conducted using as prediction targets a large number of selected protein structures released from October 1999 to April 2000. A target was selected if its sequence showed no significant similarity to any of the proteins previously available in the structural database. Overall, the servers were able to produce structurally similar models for one-half of the targets, but significantly accurate sequence-structure alignments were produced for only one-third of the targets. We further classified the targets into two sets: easy and hard. We found that all servers were able to find the correct answer for the vast majority of the easy targets if a structurally similar fold was present in the server's fold libraries. However, among the hard targets, where standard methods such as PSI-BLAST fail, the most sensitive fold-recognition servers were able to produce similar models for only 40% of the cases, half of which had a significantly accurate sequence-structure alignment. Among the hard targets, the presence of updated libraries appeared to be less critical for the ranking. An "ideally combined consensus" prediction, where the results of all servers are considered, would increase the percentage of correct assignments by 50%. Each server had a number of cases with a correct assignment, where the assignments of all the other servers were wrong. This emphasizes the benefits of considering more than one server in difficult prediction tasks. The LiveBench program (http://BioInfo.PL/LiveBench) is being continued, and all interested developers are cordially invited to join.

  10. Virtual reality for spherical images

    NASA Astrophysics Data System (ADS)

    Pilarczyk, Rafal; Skarbek, Władysław

    2017-08-01

    This paper presents a virtual reality application framework and an application concept for mobile devices. The framework uses the Google Cardboard library for the Android operating system and allows the creation of a virtual reality 360 video player using standard OpenGL ES rendering methods. It provides network methods to connect to a web server acting as the application's resource provider; resources are delivered as JSON responses to HTTP requests. The web server also uses the Socket.IO library for synchronous communication between the application and the server. The framework implements methods to create an event-driven process for rendering additional content based on the video timestamp and the virtual reality head point of view.

  11. MEGANTE: A Web-Based System for Integrated Plant Genome Annotation

    PubMed Central

    Numa, Hisataka; Itoh, Takeshi

    2014-01-01

    The recent advancement of high-throughput genome sequencing technologies has resulted in a considerable increase in demands for large-scale genome annotation. While annotation is a crucial step for downstream data analyses and experimental studies, this process requires substantial expertise and knowledge of bioinformatics. Here we present MEGANTE, a web-based annotation system that makes plant genome annotation easy for researchers unfamiliar with bioinformatics. Without any complicated configuration, users can perform genomic sequence annotations simply by uploading a sequence and selecting the species to query. MEGANTE automatically runs several analysis programs and integrates the results to select the appropriate consensus exon–intron structures and to predict open reading frames (ORFs) at each locus. Functional annotation, including a similarity search against known proteins and a functional domain search, are also performed for the predicted ORFs. The resultant annotation information is visualized with a widely used genome browser, GBrowse. For ease of analysis, the results can be downloaded in Microsoft Excel format. All of the query sequences and annotation results are stored on the server side so that users can access their own data from virtually anywhere on the web. The current release of MEGANTE targets 24 plant species from the Brassicaceae, Fabaceae, Musaceae, Poaceae, Salicaceae, Solanaceae, Rosaceae and Vitaceae families, and it allows users to submit a sequence up to 10 Mb in length and to save up to 100 sequences with the annotation information on the server. The MEGANTE web service is available at https://megante.dna.affrc.go.jp/. PMID:24253915

  12. Development of real-time voltage stability monitoring tool for power system transmission network using Synchrophasor data

    NASA Astrophysics Data System (ADS)

    Pulok, Md Kamrul Hasan

    Intelligent and effective monitoring of power system stability in control centers is one of the key issues in smart grid technology for preventing unwanted power system blackouts, and voltage stability analysis is one of the most important requirements for control center operation in the smart grid era. With the advent of Phasor Measurement Unit (PMU), or synchrophasor, technology, real-time monitoring of power system voltage stability is now a reality. This work utilizes real-time PMU data to derive a voltage stability index that monitors voltage-stability-related contingencies in power systems. The developed tool uses PMU data to calculate a voltage stability index that indicates the relative closeness of instability by producing numerical indices. The IEEE 39-bus New England power system was modeled and run on a Real-Time Digital Simulator that streams PMU data over the Internet using the IEEE C37.118 protocol. A phasor data concentrator (PDC) was set up to receive the streaming PMU data and store it in a Microsoft SQL database server. The developed voltage stability monitoring (VSM) tool then retrieves the phasor measurements from the SQL server, performs real-time state estimation of the whole network, calculates the voltage stability index, ranks the most vulnerable transmission lines in real time, and finally shows all the results in a graphical user interface. All these actions are performed in near real time, so control centers can easily monitor system conditions with this tool and take precautionary actions if needed.
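
    The work's own index is not reproduced here, but the retrieve-compute-rank loop can be illustrated with a deliberately simple placeholder: rank buses by per-unit deviation of voltage magnitude from nominal (this is not the index used in the study above):

      # Placeholder stability screen: rank buses by |V| deviation from 1.0 pu.
      def rank_buses(voltages_pu, threshold=0.05):
          """voltages_pu: dict of bus_id -> voltage magnitude in per unit."""
          deviation = {bus: abs(v - 1.0) for bus, v in voltages_pu.items()}
          ranked = sorted(deviation.items(), key=lambda kv: kv[1], reverse=True)
          return [(bus, dev, dev > threshold) for bus, dev in ranked]

      sample = {3: 0.97, 12: 1.02, 21: 0.91, 39: 1.00}  # invented snapshot
      for bus, dev, alarm in rank_buses(sample):
          print(f"bus {bus:>2}: deviation {dev:.3f} pu{'  ALARM' if alarm else ''}")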

  13. The IntFOLD server: an integrated web resource for protein fold recognition, 3D model quality assessment, intrinsic disorder prediction, domain prediction and ligand binding site prediction.

    PubMed

    Roche, Daniel B; Buenavista, Maria T; Tetchner, Stuart J; McGuffin, Liam J

    2011-07-01

    The IntFOLD server is a novel independent server that integrates several cutting-edge methods for the prediction of structure and function from sequence. Our guiding principles behind the server development were as follows: (i) to provide a simple unified resource that makes our prediction software accessible to all and (ii) to produce integrated output for predictions that can be easily interpreted. The output for predictions is presented as a simple table that summarizes all results graphically via plots and annotated 3D models. The raw machine readable data files for each set of predictions are also provided for developers, which comply with the Critical Assessment of Methods for Protein Structure Prediction (CASP) data standards. The server comprises an integrated suite of five novel methods: nFOLD4, for tertiary structure prediction; ModFOLD 3.0, for model quality assessment; DISOclust 2.0, for disorder prediction; DomFOLD 2.0, for domain prediction; and FunFOLD 1.0, for ligand binding site prediction. Predictions from the IntFOLD server were found to be competitive in several categories in the recent CASP9 experiment. The IntFOLD server is available at the following web site: http://www.reading.ac.uk/bioinf/IntFOLD/.

  14. A Standardized Interface for Obtaining Digital Planetary and Heliophysics Time Series Data

    NASA Astrophysics Data System (ADS)

    Vandegriff, Jon; Weigel, Robert; Faden, Jeremy; King, Todd; Candey, Robert

    2016-10-01

    We describe a low-level interface for accessing digital Planetary and Heliophysics data, focusing primarily on time-series data from in-situ instruments. As the volume and variety of planetary data have increased, it has become harder to merge diverse datasets into a common analysis environment. Thus we are building low-level computer-to-computer infrastructure to enable data from different missions or archives to interoperate. The key to enabling interoperability is a simple access interface that standardizes the common capabilities available from any data server: 1. identify the data resources that can be accessed; 2. describe each resource; and 3. get the data from a resource. We have created a standardized way for data servers to perform each of these three activities. We are also developing a standard streaming data format for the actual data content to be returned (i.e., the result of item 3). Our proposed standard access interface is simple enough that it could be implemented on top of or beside existing data services, or it could even be fully implemented by a small data provider as a way to ensure that the provider's holdings can participate in larger data systems or in joint analysis with other datasets. We present details of the interface and of the streaming format, including a sample server designed to illustrate the data request and streaming capabilities.
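
    In client terms the three operations reduce to three HTTP requests. A hypothetical Python client (the endpoint paths, parameters and server URL below are illustrative assumptions, not the published interface):

      # Hypothetical client for the three standard server operations.
      import json
      from urllib.request import urlopen

      SERVER = "https://example.org/timeseries"  # placeholder server

      def catalog():
          """1. Identify the data resources that can be accessed."""
          return json.load(urlopen(f"{SERVER}/catalog"))

      def describe(dataset_id):
          """2. Describe one resource (parameters, units, time span)."""
          return json.load(urlopen(f"{SERVER}/info?id={dataset_id}"))

      def fetch(dataset_id, start, stop):
          """3. Get the data from a resource as a time-ordered stream."""
          url = f"{SERVER}/data?id={dataset_id}&time.min={start}&time.max={stop}"
          return urlopen(url).read()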

  15. MELTS_Excel: A Microsoft Excel-based MELTS interface for research and teaching of magma properties and evolution

    NASA Astrophysics Data System (ADS)

    Gualda, Guilherme A. R.; Ghiorso, Mark S.

    2015-01-01

    The thermodynamic modeling software MELTS is a powerful tool for investigating crystallization and melting in natural magmatic systems. Rhyolite-MELTS is a recalibration of MELTS that better captures the evolution of silicic magmas in the upper crust. The current interface of rhyolite-MELTS, while flexible, can be somewhat cumbersome for the novice. We present a new interface that uses web services consumed by a VBA backend in Microsoft Excel©. The interface is contained within a macro-enabled workbook, where the user can insert the model input information and initiate computations that are executed on a central server at OFM Research. Results of simple calculations are shown immediately within the interface itself. It is also possible to combine a sequence of calculations into an evolutionary path; the user can input starting and ending temperatures and pressures, temperature and pressure steps, and the prevailing oxidation conditions. The program shows partial updates at every step of the computations; at the conclusion of the calculations, a series of data sheets and diagrams are created in a separate workbook, which can be saved independently of the interface. Additionally, the user can specify a grid of temperatures and pressures and calculate a phase diagram showing the conditions at which different phases are present. The interface can also be used to apply the rhyolite-MELTS geobarometer. We demonstrate applications of the interface using an example early-erupted Bishop Tuff composition. The interface is simple to use and flexible, but it requires an internet connection. The interface is distributed for free from http://melts.ofm-research.org.

  16. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  17. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  18. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  19. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  20. The Czech National Grid Infrastructure

    NASA Astrophysics Data System (ADS)

    Chudoba, J.; Křenková, I.; Mulač, M.; Ruda, M.; Sitera, J.

    2017-10-01

    The Czech National Grid Infrastructure is operated by MetaCentrum, a CESNET department responsible for coordinating and managing activities related to distributed computing. CESNET, as the Czech National Research and Education Network (NREN), provides many e-infrastructure services, which are used by 94% of the scientific and research community in the Czech Republic. Computing and storage resources owned by different organizations are connected by a sufficiently fast network to provide transparent access to all resources. We describe in more detail the computing infrastructure, which is based on several different technologies and covers grid, cloud and map-reduce environments. While the largest share of CPUs is still accessible via distributed TORQUE servers providing an environment for long batch jobs, part of the infrastructure is available via standard EGI tools, a subset of NGI resources is provided into the EGI FedCloud environment with a cloud interface, and there is also a Hadoop cluster provided by the same e-infrastructure. A broad spectrum of computing servers is offered; users can choose from standard two-CPU servers to large SMP machines with up to 6 TB of RAM or servers with GPU cards. Different groups have different priorities on various resources, and resource owners can even have exclusive access. The software is distributed via AFS. Storage servers offering up to tens of terabytes of disk space to individual users are connected via NFS4 on top of GPFS, and access to long-term HSM storage with petabyte capacity is also provided. An overview of available resources and recent usage statistics is given.

  1. A new reference implementation of the PSICQUIC web service.

    PubMed

    del-Toro, Noemi; Dumousseau, Marine; Orchard, Sandra; Jimenez, Rafael C; Galeota, Eugenia; Launay, Guillaume; Goll, Johannes; Breuer, Karin; Ono, Keiichiro; Salwinski, Lukasz; Hermjakob, Henning

    2013-07-01

    The Proteomics Standard Initiative Common QUery InterfaCe (PSICQUIC) specification was created by the Human Proteome Organization Proteomics Standards Initiative (HUPO-PSI) to enable computational access to molecular-interaction data resources by means of a standard Web Service and query language. Currently providing >150 million binary interaction evidences from 28 servers globally, the PSICQUIC interface allows the concurrent search of multiple molecular-interaction information resources using a single query. Here, we present an extension of the PSICQUIC specification (version 1.3), which has been released to be compliant with the enhanced standards in molecular interactions. The new release also includes a new reference implementation of the PSICQUIC server available to data providers. It offers augmented web service capabilities and improves the user experience. PSICQUIC has been running for almost 5 years, with a user base growing from only 4 data providers to 28 (April 2013), allowing access to 151 310 109 binary interactions. The power of this web service is shown in the PSICQUIC View web application, an example of how to simultaneously query, browse and download results from the different PSICQUIC servers. This application is free and open to all users with no login requirement (http://www.ebi.ac.uk/Tools/webservices/psicquic/view/main.xhtml).
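
    A PSICQUIC query is a single REST call returning PSI-MI TAB rows. A sketch against the IntAct PSICQUIC endpoint (the URL pattern and paging parameters follow the PSICQUIC REST convention as we understand it; verify against the live registry before relying on them):

      # Query one PSICQUIC server with MIQL and read PSI-MI TAB lines.
      from urllib.parse import quote
      from urllib.request import urlopen

      BASE = ("http://www.ebi.ac.uk/Tools/webservices/psicquic/intact/"
              "webservices/current/search/query/")

      def query_interactions(miql, max_rows=50):
          url = BASE + quote(miql) + f"?firstResult=0&maxResults={max_rows}"
          with urlopen(url) as resp:
              return resp.read().decode().splitlines()

      for line in query_interactions("brca2")[:5]:
          print(line.split("\t")[:2])  # interactor A and B identifiers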

  2. MODster: Namespaces and Redirection for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Frew, J.; Metzger, D.; Slaughter, P.

    2005-12-01

    MODster is a distributed, decentralized inventory server for Earth science data granules (standard units of data content and structure). MODster connects data granule users (people who know which specific granule they want, but not who has it or how to get it) with data granule providers (people or institutions that keep granules accessible online). * If you're a provider, you can tell MODster which granules you have and where they live (i.e., their URLs). * If you're a user, you can ask MODster for a granule, and it will transparently redirect your request to whoever has it. The key to making this work is a standard granule namespace. A granule namespace is a naming convention that associates particular names with particular granules, regardless of where those granules live. Different Earth science data products have their own granule namespaces. For example, in the MODIS granule namespace, the granule name "MOD43A2.A1998365.h5.v8.001.1999001090020.hdf" always refers to version 1 of the 5th horizontal and 8th vertical tile of the Level 3 16-day Bi-directional Reflectance Distribution Function product, acquired by the MODIS Terra sensor on 31 December 1998 and generated on 01 January 1999 at 9:00:20 AM. A MODster URL is simply a standard way of referring to a data product namespace and one of its granules. MODster URLs have the general form "http://server/namespace/granule", where "granule" is a granule name that conforms to a granule namespace, "namespace" is a MODster namespace (the name of a granule namespace whose conventions are known to MODster), and "server" is a MODster server, an HTTP server that can redirect namespace/granule requests to granule providers. A MODster URL with no granule component returns a description of the MODster namespace, its authority (the persons or institutions responsible for documenting and maintaining the naming convention), and any services for that MODster namespace that the MODster server supports. Our current MODster implementation allows granule providers to explicitly register their granules, and can also crawl provider sites looking for granules whose names match specific rules or regular expressions.
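
    The redirection itself needs nothing more than an HTTP 302. A minimal namespace-redirect server in the MODster style (registry contents and port are invented for illustration):

      # Redirect /namespace/granule requests to the registered provider URL.
      from http.server import BaseHTTPRequestHandler, HTTPServer

      REGISTRY = {  # (namespace, granule) -> provider URL (illustrative)
          ("MODIS", "MOD43A2.A1998365.h5.v8.001.1999001090020.hdf"):
              "https://provider.example.org/archive/"
              "MOD43A2.A1998365.h5.v8.001.1999001090020.hdf",
      }

      class Redirector(BaseHTTPRequestHandler):
          def do_GET(self):
              try:
                  _, namespace, granule = self.path.split("/", 2)
                  target = REGISTRY[(namespace, granule)]
              except (ValueError, KeyError):
                  self.send_error(404, "unknown namespace or granule")
                  return
              self.send_response(302)   # transparent redirect to the provider
              self.send_header("Location", target)
              self.end_headers()

      if __name__ == "__main__":
          HTTPServer(("", 8080), Redirector).serve_forever()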

  3. A concept to standardize raw biosignal transmission for brain-computer interfaces.

    PubMed

    Breitwieser, Christian; Neuper, Christa; Müller-Putz, Gernot R

    2011-01-01

    With this concept we introduce an attempt at a standardized interface, called TiA, for transmitting raw biosignals. TiA is able to deal with multirate and block-oriented data transmission, and distinguishes data by signal type (e.g., EEG, EOG, NIRS, …), whereby signals can be acquired at the same time from different acquisition devices. TiA is built on a client-server model in which multiple clients can connect to one server. Information is exchanged via a control connection and a separate data connection: control commands and meta information are transmitted over the control connection, while raw biosignal data are delivered over the data connection in a unidirectional way. For this purpose a standardized handshaking protocol and a raw data packet format have been developed. Thus, an abstraction layer between hardware devices and data processing has evolved, facilitating standardization.
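
    To make the split between meta information and raw data concrete, here is an invented packet layout in the spirit of TiA's data connection (this is not the published TiA packet format): a fixed header followed by little-endian float32 samples:

      # Invented raw-data packet: header (version, signal type, channel
      # count, block size) followed by float32 samples, channel by channel.
      import struct

      HEADER = struct.Struct("<BBHI")

      def pack_block(version, signal_type, channels_samples):
          channels = len(channels_samples)
          block = len(channels_samples[0])
          payload = b"".join(struct.pack(f"<{block}f", *ch)
                             for ch in channels_samples)
          return HEADER.pack(version, signal_type, channels, block) + payload

      def unpack_block(buf):
          version, signal_type, channels, block = HEADER.unpack_from(buf)
          flat = struct.unpack_from(f"<{channels * block}f", buf, HEADER.size)
          data = [list(flat[i * block:(i + 1) * block]) for i in range(channels)]
          return version, signal_type, data

      pkt = pack_block(1, 0, [[0.1, 0.2], [0.3, 0.4]])  # e.g. type 0 = EEG
      print(unpack_block(pkt))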

  4. A Services-Oriented Architecture for Water Observations Data

    NASA Astrophysics Data System (ADS)

    Maidment, D. R.; Zaslavsky, I.; Valentine, D.; Tarboton, D. G.; Whitenack, T.; Whiteaker, T.; Hooper, R.; Kirschtel, D.

    2009-04-01

    Water observations data are time series of measurements made at point locations of water level, flow, and quality, and corresponding data for climatic observations at point locations such as gaged precipitation and weather variables. A services-oriented architecture has been built for such information for the United States that has three components: hydrologic information servers, hydrologic information clients, and a centralized metadata cataloging system. These are connected using web services for observations data and metadata defined by an XML-based language called WaterML. A Hydrologic Information Server can be built by storing observations data in a relational database schema in the CUAHSI Observations Data Model, in which case web services access to the data and metadata is automatically provided by query functions for WaterML that are wrapped around the relational database within a web server. A Hydrologic Information Server can also be constructed by custom-programming an interface to an existing water agency web site so that it responds to the same queries by producing data in WaterML, as do the CUAHSI Observations Data Model based servers. A Hydrologic Information Client is one which can interpret and ingest WaterML metadata and data. We have two client applications, for Excel and ArcGIS, and have shown how WaterML web services can be ingested into programming environments such as Matlab and Visual Basic. HIS Central, maintained at the San Diego Supercomputer Center, is a repository of observational metadata for WaterML web services which presently indexes 342 million data values measured at 1.75 million locations. This is the largest catalog of water observations data for the United States presently in existence. As more observation networks join what we term the "CUAHSI Water Data Federation", and the system accommodates a growing number of sites, measured parameters, applications, and users, rapid and reliable access to large heterogeneous hydrologic data repositories becomes critical. The CUAHSI HIS solution to the scalability and heterogeneity challenges has several components. Structural differences across the data repositories are addressed by building a standard services foundation for the exchange of hydrologic data, derived from a common information model for observational data measured at stationary points and its implementation as a relational schema (ODM) and an XML schema (WaterML). Semantic heterogeneity is managed by mapping water quantity, water quality, and other parameters collected by government agencies and academic projects to a common ontology. The WaterML-compliant web services are indexed in a community services registry called HIS Central (hiscentral.cuahsi.org). Once a web service is registered in HIS Central, its metadata (site and variable characteristics, period of record for each variable at each site, etc.) is harvested and appended to the central catalog. The catalog is further updated as the service publisher associates the variables in the published service with ontology concepts. After this, the newly published service becomes available for spatial and semantics-based queries from online and desktop client applications developed by the project. Hydrologic information system server software is now deployed at more than a dozen locations in the United States and Australia.
To provide rapid access to data summaries, in particular for several nation-wide data repositories including EPA STORET, USGS NWIS, and USDA SNOTEL, we convert the observation data catalogs and databases with harvested data values into special representations that support high-performance analysis and visualization. The construction of OLAP (Online Analytical Processing) cubes, often called data cubes, is an approach to organizing and querying large multi-dimensional data collections. We have applied the OLAP techniques, as implemented in Microsoft SQL Server 2005/2008, to the analysis of the catalogs from several agencies. OLAP analysis results reflect geography and history of observation data availability from USGS NWIS, EPA STORET, and USDA SNOTEL repositories, and spatial and temporal dynamics of the available measurements for several key nutrient-related parameters. Our experience developing the CUAHSI HIS cyberinfrastructure demonstrated that efficient integration of hydrologic observations from multiple government and academic sources requires a range of technical approaches focused on managing different components of data heterogeneity and system scalability. While this submission addresses technical aspects of developing a national-scale information system for hydrologic observations, the challenges of explicating shared semantics of hydrologic observations and building a community of HIS users and developers remain critical in constructing a nation-wide federation of water data services.
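
    The OLAP cube idea, rolling measurement counts up along site, variable and time dimensions, can be mimicked in a few lines of pandas (the project used SQL Server's OLAP facilities; this stand-in only shows the dimensional roll-up, with invented data):

      # Pandas stand-in for an OLAP roll-up of an observations catalog.
      import pandas as pd

      catalog = pd.DataFrame({
          "agency":   ["NWIS", "NWIS", "STORET", "SNOTEL", "STORET"],
          "variable": ["nitrate", "flow", "nitrate", "snow depth", "phosphorus"],
          "year":     [2006, 2006, 2007, 2007, 2007],
          "n_values": [1200, 56000, 800, 4300, 650],
      })

      cube = pd.pivot_table(catalog, values="n_values",
                            index=["agency", "variable"], columns="year",
                            aggfunc="sum", fill_value=0)
      print(cube)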

  5. EMR based telegeriatric system.

    PubMed

    Pallawala, P M; Lun, K C

    2001-05-01

    As medical services improve due to new technologies and breakthroughs, they have led to an increasingly aging population. There has been much discussion and debate on how to solve the psychological, socioeconomic and medical problems related to aging. Our effort is to implement a feasible telegeriatric medical service that uses state-of-the-art technology to deliver medical services efficiently to the remote sites where elderly homes are based. The telegeriatric system will support rapid decision-making in the presence of acute or subacute emergencies, and this triage will also reduce unnecessary admissions. It will enable the doctors who visit these elderly homes on a once-a-week basis to improve their geriatric management skills by communicating with a geriatric specialist. Nursing skills in geriatric care will also benefit from the system. An integrated EMR service will be indispensable in the face of emergency admissions to hospitals, and the evolution of the EMR database will support future research in telegeriatrics and help to identify the areas where telegeriatrics can be optimally used. The system is based on current web browsing technology and broadband communication. The EMR web-based server was developed using Java technology, and the EMR database was developed using Microsoft SQL Server. Both are based at the Medical Informatics Programme, National University of Singapore. Two elderly homes situated in the periphery of Singapore and a leading government hospital in geriatric care were chosen for the project. These three institutions and the National University of Singapore are connected via ADSL, which supports the high bandwidth necessary for high-quality videoconferencing. Each time a patient needs a teleconsultation, a nurse or doctor at the remote site sends the history to the EMR server, which forwards the request to the Alexandra Hospital for consultation. Geriatrics specialists at the Alexandra Hospital carry out teleward rounds twice weekly and on demand. Following the implementation of the system, a trial run was conducted; it showed a high degree of coordination and cooperation between the remote sites and the Alexandra Hospital. Patient compliance was very high, and patients preferred teleconsultation. Initial results show that the telegeriatric system has definite advantages in managing geriatric patients at a remote site. As the system evolves, further research will show the areas where telegeriatrics can be used optimally.

  6. WASP (Write a Scientific Paper) using Excel -5: Quartiles and standard deviation.

    PubMed

    Grech, Victor

    2018-03-01

    The almost inevitable descriptive statistics exercise that is undergone once data collection is complete, prior to inferential statistics, requires the acquisition of basic descriptors which may include standard deviation and quartiles. This paper provides pointers as to how to do this in Microsoft Excel™ and explains the relationship between the two. Copyright © 2018 Elsevier B.V. All rights reserved.
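
    The same two descriptors, and the link between them, are easy to check outside Excel: for normally distributed data the interquartile range is about 1.35 standard deviations. A short numpy check on synthetic data:

      # Quartiles, standard deviation, and their normal-theory relationship.
      import numpy as np

      rng = np.random.default_rng(0)
      data = rng.normal(loc=100, scale=15, size=10_000)

      q1, median, q3 = np.percentile(data, [25, 50, 75])
      sd = data.std(ddof=1)  # sample standard deviation

      print(f"Q1={q1:.1f}  median={median:.1f}  Q3={q3:.1f}  SD={sd:.1f}")
      print(f"IQR/SD = {(q3 - q1) / sd:.2f}  (about 1.35 for normal data)")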

  7. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...-compatible format. All databases must be supported with adequate documentation on data attributes, SQL...

  8. Voxel modelling of sands and gravels of Pleistocene Rhine and Meuse deposits in Flanders (Belgium)

    NASA Astrophysics Data System (ADS)

    van Haren, Tom; Dirix, Katrijn; De Koninck, Roel

    2017-04-01

    Voxel modelling, or 3D volume modelling, of Quaternary raw materials is VITO's next step in the geological layer modelling of the Flanders and Brussels Capital Region in Belgium (G3D - Matthijs et al., 2013). The aim is to schematise deposits as voxels ('volumetric pixels') that represent lithological information on a grid in three-dimensional space (25 x 25 x 0.5 m). A new voxel model of Pleistocene Meuse and Rhine sands and gravels is illustrated, following a voxel model of loess resources (van Haren et al., 2016). The model methodology is based on a geological 'skeleton' extracted from the regional geological layer model of Flanders. This framework holds the 3D-interpolated lithological information of 5,000 boreholes. First, a check on quality and spatial location filtered the lithological information down to what was significant and usable. Subsequently, a manual geological interpretation was performed to analyse the stratigraphical arrangement and identify the raw materials of interest. Finally, a workflow was developed that automatically encodes and classifies the borehole descriptions in a standardized manner. This workflow was implemented by combining Microsoft Access® and ArcMap® and is able to convert borehole descriptions into specific geological parameters. Analysing the converted lithological data prior to interpolation improves understanding of their spatial distribution, helps fine-tune the modelling process, and exposes the limitations of the data. The converted lithological data were 3D-interpolated in Voxler using inverse distance weighting (IDW), resulting in a model containing 52 million voxels. It gives an overview of the regional distribution and thickness variation of economically interesting Pleistocene aggregates of the Meuse and Rhine. Much effort has been put into setting up a database structure in Microsoft Access® and Microsoft SQL Server® in order to arrange and analyse the lithological information, link the voxel model with the geological layer model, and handle and analyse the resulting voxel model data. The database structure makes it possible to impose preconditions (minimal thickness or maximum depth of aggregates, maximum thickness of intercalating clays) on the model in order to calculate and view the distributions of deposits which meet them. These results are useful for pre-prospective purposes, illustrating the distribution of lithological information and making the end user more aware of the potential economic value of the subsurface. References van Haren T. et al (2016) - An interactive voxel model for mineral resources: loess deposits in Flanders (Belgium). Zeitschrift der Deutschen Gesellschaft für Geowissenschaften, Volume 167, Number 4, pp. 363-376(14). Matthijs J. et al. (2013) - Geological 3D layer model of the Flanders Region and Brussels-Capital Region - 2nd version. Study performed in order of the Ministry of the Flemish Community. VITO report 2013/R/ETE/43, 24p. (in Dutch)
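
    Inverse distance weighting, the interpolation named above, is compact enough to state directly. A generic 3D sketch in numpy (the power parameter and sample data are assumptions, not the study's settings):

      # Minimal 3D inverse distance weighting over scattered samples.
      import numpy as np

      def idw(sample_xyz, sample_values, query_xyz, power=2.0, eps=1e-12):
          d = np.linalg.norm(query_xyz[:, None, :] - sample_xyz[None, :, :],
                             axis=2)
          w = 1.0 / np.maximum(d, eps) ** power  # closer samples weigh more
          return (w * sample_values).sum(axis=1) / w.sum(axis=1)

      # Borehole-like samples: x, y, depth, plus a lithology score each.
      samples = np.array([[0.0, 0.0, 1.0], [25.0, 0.0, 2.0],
                          [0.0, 25.0, 3.0], [25.0, 25.0, 1.5]])
      values = np.array([0.8, 0.2, 0.5, 0.9])
      grid = np.array([[12.5, 12.5, 2.0], [5.0, 20.0, 1.0]])
      print(idw(samples, values, grid))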

  9. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    NASA Technical Reports Server (NTRS)

    Baxes, Gregory; Mixon, Brian; Linger, TIm

    2013-01-01

    Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets (in particular, massively sized datasets) has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level-of-detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics, and it yields significant improvements in user-interactive geospatial client and data server interaction and associated network bandwidth requirements. The innovation uses a C- or PHP-like code grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data is structured in a quadtree format. As the user zooms into the dataset, geographic regions are subdivided into four child regions. Conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region.
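
    A minimal sketch of the cascading-KML idea, assuming a hypothetical tile server at BASE_URL; the server would return, for each region, imagery plus a Region-gated NetworkLink so the client requests the next level of detail only when the region fills enough screen pixels:

        # Hypothetical data server that generates per-region KML.
        BASE_URL = "http://example.org/tiles"

        def child_link_kml(north, south, east, west, level, min_pixels=128):
            """Return a Region-gated NetworkLink: the client fetches the
            next LOD only once this region spans min_pixels on screen."""
            href = (f"{BASE_URL}/kml?n={north}&amp;s={south}"
                    f"&amp;e={east}&amp;w={west}&amp;l={level + 1}")
            return (f"<NetworkLink><Region>"
                    f"<LatLonAltBox><north>{north}</north><south>{south}</south>"
                    f"<east>{east}</east><west>{west}</west></LatLonAltBox>"
                    f"<Lod><minLodPixels>{min_pixels}</minLodPixels></Lod>"
                    f"</Region><Link><href>{href}</href>"
                    f"<viewRefreshMode>onRegion</viewRefreshMode></Link></NetworkLink>")

        def children(n, s, e, w):
            """Quadtree step: a parent region subdivides into four children."""
            mid_lat, mid_lon = (n + s) / 2, (e + w) / 2
            return [(n, mid_lat, e, mid_lon), (n, mid_lat, mid_lon, w),
                    (mid_lat, s, e, mid_lon), (mid_lat, s, mid_lon, w)]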

  10. Flexible software architecture for user-interface and machine control in laboratory automation.

    PubMed

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  11. System Design for Navy Occupational Standards Development

    DTIC Science & Technology

    2014-07-01

    including Mr. Thomas Crain, Deputy Director, Workforce Classifications Department, LCDR Juan Carrasco, Michele Jackson, and Johnny Powell. David...and Carrasco, Juan; Navy Job Analysis Management Project Description, NAVMAC, January 2010. Lists of validated tasks, sorted by Functional...runat="server"> <div> <rsweb:ReportViewer ID="ReportViewerSample" runat="server" Font-Names="Verdana" Font-Size="8pt"

  12. An integrated WebGIS framework for volunteered geographic information and social media in soil and water conservation.

    PubMed

    Werts, Joshua D; Mikhailova, Elena A; Post, Christopher J; Sharp, Julia L

    2012-04-01

    Volunteered geographic information and social networking in a WebGIS have the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may otherwise be unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) to develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, the Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and to explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.

  13. Synoptic reporting in tumor pathology: advantages of a web-based system.

    PubMed

    Qu, Zhenhong; Ninan, Shibu; Almosa, Ahmed; Chang, K G; Kuruvilla, Supriya; Nguyen, Nghia

    2007-06-01

    The American College of Surgeons Commission on Cancer (ACS-CoC) mandates that pathology reports at ACS-CoC-approved cancer programs include all scientifically validated data elements for each site and tumor specimen. The College of American Pathologists (CAP) has produced cancer checklists in static text formats to assist reporting. To be inclusive, the CAP checklists are pages long, requiring extensive text editing and multiple intermediate steps. We created a set of dynamic tumor-reporting templates, using Microsoft Active Server Page (ASP.NET), with drop-down list and data-compile features, and added a reminder function to indicate missing information. Users can access this system on the Internet, prepare the tumor report by selecting relevant data from drop-down lists with an embedded tumor staging scheme, and directly transfer the final report into a laboratory information system by using the copy-and-paste function. By minimizing extensive text editing and eliminating intermediate steps, this system can reduce reporting errors, improve work efficiency, and increase compliance.
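
    A minimal sketch of the reminder function described above, in Python rather than the authors' ASP.NET, with illustrative element names standing in for the actual CAP checklist fields:

        REQUIRED = ["histologic_type", "grade", "margins", "pT", "pN"]

        def missing_elements(report: dict) -> list:
            """Return required checklist items absent from a draft report."""
            return [e for e in REQUIRED if not report.get(e)]

        draft = {"histologic_type": "adenocarcinoma", "grade": "G2", "pT": "pT2"}
        print(missing_elements(draft))  # ['margins', 'pN']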

  14. Simple Web-based interactive key development software (WEBiKEY) and an example key for Kuruna (Poaceae: Bambusoideae)1

    PubMed Central

    Attigala, Lakshmi; De Silva, Nuwan I.; Clark, Lynn G.

    2016-01-01

    Premise of the study: Programs that are user-friendly and freely available for developing Web-based interactive keys are scarce and most of the well-structured applications are relatively expensive. WEBiKEY was developed to enable researchers to easily develop their own Web-based interactive keys with fewer resources. Methods and Results: A Web-based multiaccess identification tool (WEBiKEY) was developed that uses freely available Microsoft ASP.NET technologies and an SQL Server database for Windows-based hosting environments. WEBiKEY was tested for its usability with a sample data set, the temperate woody bamboo genus Kuruna (Poaceae). Conclusions: WEBiKEY is freely available to the public and can be used to develop Web-based interactive keys for any group of species. The interactive key we developed for Kuruna using WEBiKEY enables users to visually inspect characteristics of Kuruna and identify an unknown specimen as one of seven possible species in the genus. PMID:27144109

  15. An Integrated WebGIS Framework for Volunteered Geographic Information and Social Media in Soil and Water Conservation

    NASA Astrophysics Data System (ADS)

    Werts, Joshua D.; Mikhailova, Elena A.; Post, Christopher J.; Sharp, Julia L.

    2012-04-01

    Volunteered geographic information and social networking in a WebGIS have the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may otherwise be unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) to develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, the Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and to explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.

  16. Chemical-text hybrid search engines.

    PubMed

    Zhou, Yingyao; Zhou, Bin; Jiang, Shumei; King, Frederick J

    2010-01-01

    As the amount of chemical literature increases, it is critical that researchers be enabled to accurately locate documents related to a particular aspect of a given compound. Existing solutions, based on text and chemical search engines alone, suffer from the inclusion of "false negative" and "false positive" results, and cannot accommodate the diverse repertoire of formats currently available for chemical documents. To address these concerns, we developed an approach called Entity-Canonical Keyword Indexing (ECKI), which converts a chemical entity embedded in a data source into its canonical keyword representation prior to being indexed by text search engines. We implemented ECKI using Microsoft Office SharePoint Server Search, and the resultant hybrid search engine not only supported complex mixed chemical and keyword queries but also was applied to both intranet and Internet environments. We envision that the adoption of ECKI will empower researchers to pose more complex search questions that were not readily attainable previously and to obtain answers with much improved speed and accuracy.
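
    A minimal sketch of the ECKI idea under stated assumptions: the synonym-to-canonical map below is a toy dictionary, whereas a production system would resolve entities with chemistry-aware tooling before documents reach the text indexer:

        # Hypothetical synonym table; every alias maps to one canonical keyword.
        CANONICAL = {
            "acetylsalicylic acid": "CHEM_ASPIRIN",
            "2-acetoxybenzoic acid": "CHEM_ASPIRIN",
        }

        def canonicalize(text: str) -> str:
            """Rewrite known chemical synonyms to their canonical keyword
            before the document is handed to the text search engine."""
            out = text.lower()
            for synonym, keyword in CANONICAL.items():
                out = out.replace(synonym, keyword)
            return out

        # Both documents now index, and therefore match, on the same token.
        print(canonicalize("Stability of acetylsalicylic acid in tablets"))
        print(canonicalize("Assay of 2-acetoxybenzoic acid purity"))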

  17. Design considerations of CareWindows, a Windows 3.0-based graphical front end to a Medical Information Management System using a pass-through-requester architecture.

    PubMed Central

    Ward, R. E.; Purves, T.; Feldman, M.; Schiffman, R. M.; Barry, S.; Christner, M.; Kipa, G.; McCarthy, B. D.; Stiphout, R.

    1991-01-01

    The CareWindows development project demonstrated the feasibility of an approach designed to add the benefits of an event-driven, graphically oriented user interface to an existing Medical Information Management System (MIMS) without overstepping economic and logistic constraints. The design solution selected for the CareWindows project incorporates four important design features: (1) the effective de-coupling of servers from requesters, permitting the use of an extensive pre-existing library of MIMS servers, (2) the off-loading of program control functions of the requesters to the workstation processor, reducing the load per transaction on central resources and permitting the use of object-oriented development environments available for microcomputers, (3) the selection of a low-end, GUI-capable workstation consisting of a PC-compatible personal computer running Microsoft Windows 3.0, and (4) the development of a highly layered, modular workstation application, permitting the development of interchangeable modules to ensure portability and adaptability. PMID:1807665

  18. Development and operation of a quality assurance system for deviations from standard operating procedures in a clinical cell therapy laboratory.

    PubMed

    McKenna, D; Kadidlo, D; Sumstad, D; McCullough, J

    2003-01-01

    Errors and accidents, or deviations from standard operating procedures, other policy, or regulations, must be documented and reviewed, with corrective actions taken to assure quality performance in a cellular therapy laboratory. Though expectations and guidance for deviation management exist, a description of the framework for the development of such a program is lacking in the literature. Here we describe our deviation management program, which uses a Microsoft Access database and Microsoft Excel to analyze deviations and notable events, facilitating quality assurance (QA) functions and ongoing process improvement. Data is stored in a Microsoft Access database with an assignment to one of six deviation type categories. Deviation events are evaluated for potential impact on patient and product, and impact scores for each are determined using a 0-4 grading scale. An immediate investigation occurs, and corrective actions are taken to prevent future similar events from taking place. Additionally, deviation data is collectively analyzed on a quarterly basis using Microsoft Excel, to identify recurring events or developing trends. Between January 1, 2001, and December 31, 2001, over 2500 products were processed at our laboratory. During this time period, 335 deviations and notable events occurred, affecting 385 products and/or patients. Deviations within the 'technical error' category were most common (37%). Thirteen percent of deviations had a patient and/or a product impact score ≥ 2, a score indicating, at a minimum, potentially affected patient outcome or moderate effect upon product quality. Real-time analysis and quarterly review of deviations using our deviation management program allows for identification and correction of deviations. Monitoring of deviation trends allows for process improvement and overall successful functioning of the QA program in the cell therapy laboratory. Our deviation management program could serve as a model for other laboratories in need of such a program.
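
    A minimal sketch of the impact grading described above; the score threshold of 2 follows the abstract, while the record layout is an assumption rather than the laboratory's actual Access schema:

        from dataclasses import dataclass

        @dataclass
        class Deviation:
            category: str        # one of the six deviation type categories
            patient_impact: int  # 0-4 grading scale
            product_impact: int  # 0-4 grading scale

        def needs_escalation(d: Deviation) -> bool:
            """Scores of 2 or more indicate potentially affected patient
            outcome or a moderate effect on product quality."""
            return max(d.patient_impact, d.product_impact) >= 2

        print(needs_escalation(Deviation("technical error", 0, 2)))  # True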

  19. 76 FR 12106 - Lead-Based Paint Renovation, Repair and Painting Activities in Target Housing and Child Occupied...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ... will also be accepted on standard disks in Microsoft Word or ASCII file format. D. How should I handle... hazards of lead-based paint and where to receive more information about health protection. The poster also...

  20. Reference-frame-independent quantum-key-distribution server with a telecom tether for an on-chip client.

    PubMed

    Zhang, P; Aungskunsiri, K; Martín-López, E; Wabnig, J; Lobino, M; Nock, R W; Munns, J; Bonneau, D; Jiang, P; Li, H W; Laing, A; Rarity, J G; Niskanen, A O; Thompson, M G; O'Brien, J L

    2014-04-04

    We demonstrate a client-server quantum key distribution (QKD) scheme. Large resources such as lasers and detectors are situated at the server side, which is accessible via telecom fiber to a client requiring only an on-chip polarization rotator, which may be integrated into a handheld device. The detrimental effects of unstable fiber birefringence are overcome by employing the reference-frame-independent QKD protocol for polarization qubits in polarization maintaining fiber, where standard QKD protocols fail, as we show for comparison. This opens the way for quantum enhanced secure communications between companies and members of the general public equipped with handheld mobile devices, via telecom-fiber tethering.

  1. Standard Port-Visit Cost Forecasting Model for U.S. Navy Husbanding Contracts

    DTIC Science & Technology

    2009-12-01

    Protocol (HTTP) server. 2. MySQL. An open-source database. 3. PHP. A common scripting language used for Web development. E. IMPLEMENTATION OF...Inc. (2009). MySQL Community Server (Version 5.1) [Software]. Available from http://dev.mysql.com/downloads/ The PHP Group (2009). PHP (Version...Logistics Services MySQL My Structured Query Language NAVSUP Navy Supply Systems Command NC Non-Contract Items NPS Naval Postgraduate

  2. EarthServer: Use of Rasdaman as a data store for use in visualisation of complex EO data

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter; Grant, Mike

    2013-04-01

    The European Commission FP7 project EarthServer is establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending cutting-edge Array Database technology. EarthServer is built around the Rasdaman Raster Data Manager, which extends standard relational database systems with the ability to store and retrieve multi-dimensional raster data of unlimited size through an SQL-style query language. Rasdaman facilitates visualisation of data by providing several Open Geospatial Consortium (OGC) standard interfaces through its web services wrapper, Petascope. These include the well-established standards Web Coverage Service (WCS) and Web Map Service (WMS), as well as the emerging standard Web Coverage Processing Service (WCPS). The WCPS standard allows the running of ad-hoc queries on the data stored within Rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets. Here we will show that the use of EarthServer technologies and infrastructure allows access and visualisation of massive-scale data through a web client with only marginal bandwidth use, as opposed to the current mechanism of copying huge amounts of data to create visualisations locally. For example, if a user wanted to generate a plot of global average chlorophyll for a complete decade time series, they would only have to download the result instead of terabytes of data. Firstly, we will present a brief overview of the capabilities of Rasdaman and the WCPS query language to introduce the ways in which it is used in a visualisation tool chain. We will show that there are several ways in which WCPS can be utilised to create both standard and novel web-based visualisations. An example of a standard visualisation is the production of traditional 2D plots, allowing users to plot data products easily. However, the query language also allows the creation of novel/custom products, which can then immediately be plotted with the same system. For more complex multi-spectral data, WCPS allows the user to explore novel combinations of bands in standard band-ratio algorithms through a web browser, with dynamic updating of the resultant image. To visualise very large datasets, Rasdaman has the capability to dynamically scale a dataset or query result so that it can be appraised quickly for use in later unscaled queries. All of these techniques are accessible through a web-based GIS interface, increasing the number of potential users of the system. Lastly, we will show the advances in dynamic web-based 3D visualisations being explored within the EarthServer project. By utilising the emerging declarative 3D web standard X3DOM as a tool to visualise the results of WCPS queries, we introduce several possible benefits, including quick appraisal of data for outliers or anomalous data points, and visualisation of the uncertainty of data alongside the actual data values.
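
    A minimal sketch of the decade-average example as a client might send it, assuming a hypothetical Petascope endpoint and coverage name; the query string follows the OGC WCPS syntax:

        import requests

        ENDPOINT = "http://example.org/rasdaman/ows"  # hypothetical Petascope URL
        # Server-side aggregation: only the scalar result crosses the network.
        query = 'for c in (CHL_DECADE) return encode(avg(c), "csv")'
        resp = requests.post(ENDPOINT, data={"query": query})
        print(resp.text)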

  3. Bringing the Virtual Astronomical Observatory to the Education Community

    NASA Astrophysics Data System (ADS)

    Lawton, B.; Eisenhamer, B.; Mattson, B. J.; Raddick, M. J.

    2012-08-01

    The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. The Education and Public Outreach (EPO) program for the VAO will be led by the Space Telescope Science Institute in collaboration with the High Energy Astrophysics Science Archive Research Center (HEASARC) EPO program and Johns Hopkins University. VAO EPO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public and education community. Our EPO efforts will be structured to provide uniform access to VAO information, enabling educational and research opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that the VO has already built many tools for EPO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. However, it is not enough to simply provide tools. Tools must meet the needs of the education community and address national education standards in order to be broadly utilized. To determine which tools the VAO will incorporate into the EPO program, needs assessments will be conducted with educators across the U.S.

  4. "Mr. Database" : Jim Gray and the History of Database Technologies.

    PubMed

    Hanwahr, Nils C

    2017-12-01

    Although the widespread use of the term "Big Data" is comparatively recent, it invokes a phenomenon in the developments of database technology with distinct historical contexts. The database engineer Jim Gray, known as "Mr. Database" in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of Transaction Processing that enable the parallel and highly distributed performance of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e.g., leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush's idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed "Fourth Paradigm Science". This article gives an overview of Gray's contributions to the development of database technology as well as his research agendas and shows that central notions of Big Data have been occupying database engineers for much longer than the actual term has been in use.

  5. The TOPCONS web server for consensus prediction of membrane protein topology and signal peptides

    PubMed Central

    Tsirigos, Konstantinos D.; Peters, Christoph; Shu, Nanjiang; Käll, Lukas; Elofsson, Arne

    2015-01-01

    TOPCONS (http://topcons.net/) is a widely used web server for consensus prediction of membrane protein topology. We hereby present a major update to the server, with some substantial improvements, including the following: (i) TOPCONS can now efficiently separate signal peptides from transmembrane regions. (ii) The server can now differentiate more successfully between globular and membrane proteins. (iii) The server is now even slightly faster, although a much larger database is used to generate the multiple sequence alignments. For most proteins, the final prediction is produced in a matter of seconds. (iv) The user-friendly interface is retained, with the additional feature of submitting batch files and accessing the server programmatically using standard interfaces, thus making it ideal for proteome-wide analyses. Indicatively, the user can now scan the entire human proteome in a few days. (v) For proteins with homology to a known 3D structure, the homology-inferred topology is also displayed. (vi) Finally, the combination of methods currently implemented achieves an overall increase in performance by 4% as compared to the currently available best-scoring methods and TOPCONS is the only method that can identify signal peptides and still maintain a state-of-the-art performance in topology predictions. PMID:25969446

  6. Unobtrusive Social Network Data From Email

    DTIC Science & Technology

    2008-12-01

    ... Outlook archived files and stores that data into an SQL database. Communication...Applications (VBA) program was installed on the personal computers (PC) of all participants, in the session window of their Microsoft Outlook. Details of

  7. Wealth and Power.

    ERIC Educational Resources Information Center

    Martz, Carlton

    2000-01-01

    This theme issue examines three historical and current problems surrounding wealth and power. The first article looks at King Leopold of Belgium and his exploitation of the Congo. The second article explores John D. Rockefeller and the Standard Oil monopoly. The final article examines the antitrust case against the Microsoft Corporation. Each…

  8. Estimation of pharmacokinetic parameters from non-compartmental variables using Microsoft Excel.

    PubMed

    Dansirikul, Chantaratsamon; Choi, Malcolm; Duffull, Stephen B

    2005-06-01

    This study was conducted to develop a method, termed 'back analysis (BA)', for converting non-compartmental variables to compartment model dependent pharmacokinetic parameters for both one- and two-compartment models. A Microsoft Excel spreadsheet was implemented with the use of Solver and visual basic functions. The performance of the BA method in estimating pharmacokinetic parameter values was evaluated by comparing the parameter values obtained to a standard modelling software program, NONMEM, using simulated data. The results show that the BA method was reasonably precise and provided low bias in estimating fixed and random effect parameters for both one- and two-compartment models. The pharmacokinetic parameters estimated from the BA method were similar to those of NONMEM estimation.
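
    The one-compartment conversion that back analysis automates can be sketched from standard pharmacokinetic identities; the inputs below are illustrative, not data from the study:

        import math

        dose = 100.0   # mg, IV bolus (hypothetical)
        auc = 50.0     # mg*h/L, non-compartmental area under the curve
        t_half = 4.0   # h, terminal half-life

        k = math.log(2) / t_half  # elimination rate constant (1/h)
        cl = dose / auc           # clearance (L/h)
        v = cl / k                # volume of distribution (L)
        print(f"k={k:.3f} 1/h, CL={cl:.1f} L/h, V={v:.1f} L")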

  9. Development of Geospatial Map Based Election Portal

    NASA Astrophysics Data System (ADS)

    Gupta, A. Kumar Chandra; Kumar, P.; Vasanth Kumar, N.

    2014-11-01

    Geospatial Delhi Limited (GSDL) is a Government of NCT of Delhi company formed to provide geospatial information about the National Capital Territory of Delhi (NCTD) to the Government of the National Capital Territory of Delhi (GNCTD) and its organs, such as the DDA, MCD, DJB, State Election Department, and DMRC, for the benefit of all citizens. This paper describes the development of the Geospatial Map based Election portal (GMEP) of NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for planning and management by the Department of the Chief Electoral Officer, and as an election-related information search tool (polling station, assembly and parliamentary constituency, etc.) for the citizens of NCTD. The GMEP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with a J2EE front-end in a Microsoft Windows environment. The GMEP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMEP include delimited precinct area boundaries of the voter areas of polling stations, assembly constituencies, parliamentary constituencies, election districts, landmark locations of polling stations, and basic amenities (police stations, hospitals, schools, fire stations, etc.). The GMEP could help achieve not only the desired transparency and ease in the planning process but also provide efficient and effective tools for the management of elections. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  10. NASA World Wind: A New Mission

    NASA Astrophysics Data System (ADS)

    Hogan, P.; Gaskins, T.; Bailey, J. E.

    2008-12-01

    Virtual Globes are well into their first generation, providing increasingly rich and beautiful visualization of more types and quantities of information. However, they are still mostly single and proprietary programs, akin to a web browser whose content and functionality are controlled and constrained largely by the browser's manufacturer. Today Google and Microsoft determine what we can and cannot see and do in these programs. NASA World Wind started out in nearly the same mode, a single program with limited functionality and information content. But as the possibilities of virtual globes became more apparent, we found that while enabling a new class of information visualization, we were also getting in the way. Many users want to provide World Wind functionality and information in their programs, not ours. They want it in their web pages. They want to include their own features. They told us that only with this kind of flexibility, could their objectives and the potential of the technology be truly realized. World Wind therefore changed its mission: from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating one program, we create components to be used in any number of programs. World Wind is NASA open source software. With the source code being fully visible, anyone can readily use it and freely extend it to serve any use. Imagery and other information provided by the World Wind servers is also free and unencumbered, including the server technology to deliver geospatial data. World Wind developers can therefore provide exclusive and custom solutions based on user needs.

  11. To the Geoportal and Beyond! Preparing the Earth Observing Laboratory's Datasets for Inter-Repository Discovery

    NASA Astrophysics Data System (ADS)

    Gordon, S.; Dattore, E.; Williams, S.

    2014-12-01

    Even when a data center makes its datasets accessible, they can still be hard to discover if the user is unaware of the laboratory or organization the data center supports. NCAR's Earth Observing Laboratory (EOL) is no exception. In response to this problem, and as an inquiry into the feasibility of inter-connecting all of NCAR's repositories at a discovery layer, ESRI's Geoportal was researched. It was determined that an implementation of Geoportal would be a good choice around which to build a proof-of-concept model of inter-repository discovery. This collaborative project between the University of Illinois and NCAR is coordinated through the Data Curation Education in Research Centers program, which is funded by the Institute of Museum and Library Services. Geoportal is open source software. It serves as an aggregation point for metadata catalogs of earth science datasets, with a focus on geospatial information. EOL's metadata is in static THREDDS catalogs, but Geoportal can only create records from a THREDDS Data Server. The first step was therefore to make EOL metadata more accessible by utilizing the ISO 19115-2 standard. It was also decided to create DIF records so EOL datasets could be ingested in NASA's Global Change Master Directory (GCMD). To offer records for harvest, it was decided to develop an OAI-PMH server; to make a compliant server, the OAI_DC standard was also implemented. A server was written in Perl to serve a set of static records. We created a sample set of records in ISO 19115-2, FGDC, DIF, and OAI_DC, and utilized GCMD shared vocabularies to enhance discoverability and precision. The proof of concept was tested and verified by having another NCAR laboratory's Geoportal harvest our sample set. To prepare for production, templates for each standard were developed and mapped to the database. These templates will help automate the creation of records. Once the OAI-PMH server is re-written in a Grails framework, a dynamic representation of EOL's metadata will be available for harvest. EOL will need to develop an implementation of a Geoportal and point GCMD to the OAI-PMH server. We will also seek out partnerships with other earth science and related-discipline repositories that can communicate by OAI-PMH or Geoportal, so that the scientific community will benefit from more discoverable data.
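
    A minimal sketch of harvesting such a server, assuming a hypothetical EOL base URL; the verb and metadataPrefix parameters are defined by the OAI-PMH standard:

        import requests

        BASE = "http://example.ucar.edu/oai"  # hypothetical endpoint
        resp = requests.get(BASE, params={
            "verb": "ListRecords",
            "metadataPrefix": "oai_dc",  # the OAI_DC format mentioned above
        })
        print(resp.text[:500])  # XML envelope of the harvested records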

  12. Interactive web-based portals to improve patient navigation and connect patients with primary care and specialty services in underserved communities.

    PubMed

    Highfield, Linda; Ottenweller, Cecelia; Pfanz, Andre; Hanks, Jeanne

    2014-01-01

    This article presents a case study in the redesign, development, and implementation of a web-based healthcare clinic search tool for virtual patient navigation in underserved populations in Texas. It describes the workflow, assessment of system requirements, and design and implementation of two online portals: Project Safety Net and the Breast Health Portal. The primary focus of the study was to demonstrate the use of health information technology for the purpose of bridging the gap between underserved populations and access to healthcare. A combination of interviews and focus groups was used to guide the development process. Interviewees were asked a series of questions about usage, usability, and desired features of the new system. The redeveloped system offers a multitier architecture consisting of data, business, and presentation layers. The technology used in the new portals include Microsoft .NET Framework 3.5, Microsoft SQL Server 2008, Google Maps JavaScript API v3, jQuery, Telerik RadControls (ASP.NET AJAX), and HTML. The redesigned portals have 548 registered clinics, and they have averaged 355 visits per month since their launch in late 2011, with the average user visiting five pages per visit. Usage has remained relatively constant over time, with an average of 142 new users (40 percent) each month. This study demonstrates the successful application of health information technology to improve access to healthcare and the successful adoption of the technology by targeted end users. The portals described in this study could be replicated by health information specialists in other areas of the United States to address disparities in healthcare access.

  13. Interactive Web-based Portals to Improve Patient Navigation and Connect Patients with Primary Care and Specialty Services in Underserved Communities

    PubMed Central

    Highfield, Linda; Ottenweller, Cecelia; Pfanz, Andre; Hanks, Jeanne

    2014-01-01

    This article presents a case study in the redesign, development, and implementation of a web-based healthcare clinic search tool for virtual patient navigation in underserved populations in Texas. It describes the workflow, assessment of system requirements, and design and implementation of two online portals: Project Safety Net and the Breast Health Portal. The primary focus of the study was to demonstrate the use of health information technology for the purpose of bridging the gap between underserved populations and access to healthcare. A combination of interviews and focus groups was used to guide the development process. Interviewees were asked a series of questions about usage, usability, and desired features of the new system. The redeveloped system offers a multitier architecture consisting of data, business, and presentation layers. The technology used in the new portals include Microsoft .NET Framework 3.5, Microsoft SQL Server 2008, Google Maps JavaScript API v3, jQuery, Telerik RadControls (ASP.NET AJAX), and HTML. The redesigned portals have 548 registered clinics, and they have averaged 355 visits per month since their launch in late 2011, with the average user visiting five pages per visit. Usage has remained relatively constant over time, with an average of 142 new users (40 percent) each month. This study demonstrates the successful application of health information technology to improve access to healthcare and the successful adoption of the technology by targeted end users. The portals described in this study could be replicated by health information specialists in other areas of the United States to address disparities in healthcare access. PMID:24808806

  14. Design of a Horizontal Penetrometer for Measuring On-the-Go Soil Resistance

    PubMed Central

    Topakci, Mehmet; Unal, Ilker; Canakci, Murad; Celik, Huseyin Kursat; Karayel, Davut

    2010-01-01

    Soil compaction is one of the main negative factors that limit plant growth and crop yield. Therefore, it is important to determine the soil resistance level and map it across the field to find solutions for the negative effects of compaction. Nowadays, high-powered communication technology and computers help us on this issue within the approach of precision agriculture applications. This study is focused on the design of a penetrometer that can make instantaneous horizontal soil resistance measurements in the soil, and of data acquisition software based on GPS (Global Positioning System). The penetrometer was designed using commercial 3D parametric solid modelling design software. The data acquisition software was developed in the Microsoft Visual Basic.NET programming language. After the design of the system, manufacturing and assembly were completed and a field experiment was carried out. Using the GPS data and penetration resistance values collected in a Microsoft SQL Server database, a Kriging method in ArcGIS was applied and soil resistance was mapped across the field for a soil depth of 40 cm. During operation, no faults were seen in either the mechanical or software parts. As a result, soil resistance values of 0.2 MPa and 3 MPa were obtained as minimum and maximum values, respectively. In conclusion, the experimental results showed that the designed system works quite well in the field and that the horizontal penetrometer is a practical tool for providing on-line soil resistance measurements. This study contributes to further research on the development of on-line soil resistance measurement and mapping within precision agriculture applications. PMID:22163410
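
    A minimal sketch of logging one GPS-tagged reading to SQL Server, in Python rather than the authors' Visual Basic.NET; the DSN, table, and values are hypothetical:

        import pyodbc

        conn = pyodbc.connect("DSN=penetrometer")  # hypothetical SQL Server DSN
        conn.cursor().execute(
            "INSERT INTO readings (lat, lon, depth_cm, resistance_mpa) "
            "VALUES (?, ?, ?, ?)",
            (36.89, 30.71, 40.0, 1.85),  # illustrative values only
        )
        conn.commit()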

  15. Design of a horizontal penetrometer for measuring on-the-go soil resistance.

    PubMed

    Topakci, Mehmet; Unal, Ilker; Canakci, Murad; Celik, Huseyin Kursat; Karayel, Davut

    2010-01-01

    Soil compaction is one of the main negative factors that limit plant growth and crop yield. Therefore, it is important to determine the soil resistance level and map it across the field to find solutions for the negative effects of compaction. Nowadays, high-powered communication technology and computers help us on this issue within the approach of precision agriculture applications. This study is focused on the design of a penetrometer that can make instantaneous horizontal soil resistance measurements in the soil, and of data acquisition software based on GPS (Global Positioning System). The penetrometer was designed using commercial 3D parametric solid modelling design software. The data acquisition software was developed in the Microsoft Visual Basic.NET programming language. After the design of the system, manufacturing and assembly were completed and a field experiment was carried out. Using the GPS data and penetration resistance values collected in a Microsoft SQL Server database, a Kriging method in ArcGIS was applied and soil resistance was mapped across the field for a soil depth of 40 cm. During operation, no faults were seen in either the mechanical or software parts. As a result, soil resistance values of 0.2 MPa and 3 MPa were obtained as minimum and maximum values, respectively. In conclusion, the experimental results showed that the designed system works quite well in the field and that the horizontal penetrometer is a practical tool for providing on-line soil resistance measurements. This study contributes to further research on the development of on-line soil resistance measurement and mapping within precision agriculture applications.

  16. Colorado Late Cenozoic Fault and Fold Database and Internet Map Server: User-friendly technology for complex information

    USGS Publications Warehouse

    Morgan, K.S.; Pattyn, G.J.; Morgan, M.L.

    2005-01-01

    Internet mapping applications for geologic data allow simultaneous data delivery and collection, enabling quick data modification while efficiently supplying the end user with information. Utilizing Web-based technologies, the Colorado Geological Survey's Colorado Late Cenozoic Fault and Fold Database was transformed from a monothematic, nonspatial Microsoft Access database into a complex information set incorporating multiple data sources. The resulting user-friendly format supports easy analysis and browsing. The core of the application is the Microsoft Access database, which contains information compiled from available literature about faults and folds that are known or suspected to have moved during the late Cenozoic. The database contains nonspatial fields such as structure type, age, and rate of movement. Geographic locations of the fault and fold traces were compiled from previous studies at 1:250,000 scale to form a spatial database containing information such as length and strike. Integration of the two databases allowed both spatial and nonspatial information to be presented on the Internet as a single dataset (http://geosurvey.state.co.us/pubs/ceno/). The user-friendly interface enables users to view and query the data in an integrated manner, thus providing multiple ways to locate desired information. Retaining the digital data format also allows continuous data updating and quick delivery of newly acquired information. This dataset is a valuable resource to anyone interested in earthquake hazards and the activity of faults and folds in Colorado. Additional geologic hazard layers and imagery may aid in decision support and hazard evaluation. The up-to-date and customizable maps are invaluable tools for researchers or the public.

  17. Toward high-throughput genotyping: dynamic and automatic software for manipulating large-scale genotype data using fluorescently labeled dinucleotide markers.

    PubMed

    Li, J L; Deng, H; Lai, D B; Xu, F; Chen, J; Gao, G; Recker, R R; Deng, H W

    2001-07-01

    To efficiently manipulate large amounts of genotype data generated with fluorescently labeled dinucleotide markers, we developed a Microsoft-based database management system. The system offers several advantages. First, it accommodates the dynamic nature of the accumulation of genotype data during the genotyping process; some data need to be confirmed or replaced by repeat lab procedures. Using the system, the raw genotype data can be imported easily and continuously and incorporated into the database during the genotyping process, which may continue over an extended period of time in large projects. Second, almost all of the procedures are automatic, including autocomparison of the raw data read by different technicians from the same gel, autoadjustment among the allele fragment-size data from cross-runs or cross-platforms, autobinning of alleles, and autocompilation of genotype data for suitable programs to perform inheritance checks in pedigrees. Third, the system provides functions to track electrophoresis gel files to locate gel or sample sources for any resultant genotype data, which is extremely helpful for double-checking consistency of raw and final data and for directing repeat experiments. In addition, the user-friendly graphic interface renders processing of large amounts of data much less labor-intensive. Furthermore, the system has built-in mechanisms to detect some genotyping errors and to assess the quality of genotype data, which are then summarized in automatically generated statistical reports. The system can easily handle >500,000 genotype data entries, a number more than sufficient for typical whole-genome linkage studies. The modules and programs we developed for the system can be extended to other database platforms, such as Microsoft SQL Server, if the capability to handle still greater quantities of genotype data simultaneously is desired.

  18. Optimizing Secure Communication Standards for Disadvantaged Networks

    DTIC Science & Technology

    2009-09-01

    created using LaTeX with the editor Kile [5]. 5.1.2 Libraries and APIs. Several libraries were important to the successful completion of this study. The...Explorer. http://www.microsoft.com/windows/internet-explorer/default.aspx. [5] Kile - An Integrated LaTeX Environment. http://kile.sourceforge.net/. [6

  19. Standard Oil and Big Business. Lesson Plan One.

    ERIC Educational Resources Information Center

    Dicke, Thomas S.

    1996-01-01

    Presents a lesson plan that aims to show students why individuals admire and distrust the great size and power of the firms that dominate the economy. Activities include student research and presentations on conflicts and competition within the oil industry, as well as within Microsoft, Wal-Mart, and other companies. (MJP)

  20. Analysis of Variance with Summary Statistics in Microsoft® Excel®

    ERIC Educational Resources Information Center

    Larson, David A.; Hsu, Ko-Cheng

    2010-01-01

    Students regularly are asked to solve Single Factor Analysis of Variance problems given only the sample summary statistics (number of observations per category, category means, and corresponding category standard deviations). Most undergraduate students today use Excel for data analysis of this type. However, Excel, like all other statistical…

  1. The Impact of the AACTE-Microsoft Grant on Elementary Reading & Writing

    ERIC Educational Resources Information Center

    Borgia, Laurel; Cheek, Earl H., Jr.

    2005-01-01

    Accountability for student learning and support of evidence-based instructional approaches are critical responsibilities for teachers. Both are particularly significant with the current reliance on state standards, assessment tests and the No Child Left Behind Act (Shanahan 2002). Every elementary teacher must have research-based resources to help…

  2. MetaBar - a tool for consistent contextual data acquisition and standards compliant submission.

    PubMed

    Hankeln, Wolfgang; Buttigieg, Pier Luigi; Fink, Dennis; Kottmann, Renzo; Yilmaz, Pelin; Glöckner, Frank Oliver

    2010-06-30

    Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time, and depth/altitude, generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. MetaBar is a spreadsheet- and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated with their samples. A preconfigured Microsoft Excel spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier, and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA Data Bank of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL), and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual-data-enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable the exchange of data across disciplines and safeguard contextual data.

  3. MetaBar - a tool for consistent contextual data acquisition and standards compliant submission

    PubMed Central

    2010-01-01

    Background Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time, and depth/altitude, generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. Results MetaBar is a spreadsheet- and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated with their samples. A preconfigured Microsoft® Excel® spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier, and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA Data Bank of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL), and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). Conclusion The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual-data-enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable the exchange of data across disciplines and safeguard contextual data. PMID:20591175

  4. Cross-standard user description in mobile, medical oriented virtual collaborative environments

    NASA Astrophysics Data System (ADS)

    Ganji, Rama Rao; Mitrea, Mihai; Joveski, Bojan; Chammem, Afef

    2015-03-01

    By combining four different open standards belonging to ISO/IEC JTC1/SC29 WG11 (a.k.a. MPEG) and the W3C, this paper advances an architecture for mobile, medical-oriented virtual collaborative environments. The various users are represented according to MPEG-UD (MPEG User Description), while the security issues are dealt with by deploying the WebID principles. On the server side, irrespective of their elementary types (text, image, video, 3D, …), the medical data are aggregated into hierarchical, interactive multimedia scenes which are alternatively represented in the MPEG-4 BiFS or HTML5 standards. This way, each type of content can be optimally encoded according to its particular constraints (semantics, medical practice, network conditions, etc.). The mobile device needs only to display the content (inside an MPEG player or an HTML5 browser) and to capture the user interaction. The overall architecture is implemented and tested under the framework of the MEDUSA European project, in partnership with medical institutions. The testbed considers a server emulated by a PC and heterogeneous user devices (tablets, smartphones, laptops) running the iOS, Android, and Windows operating systems. The connection between the users and the server is alternatively ensured by WiFi and 3G/4G networks.

  5. Bartenders' and Rum Shopkeepers' Knowledge of and Attitudes Toward "Problem Drinking" in Saint Vincent and the Grenadines.

    PubMed

    Zafer, Maryam; Liu, Shiyuan; Katz, Craig L

    2018-04-28

    Harmful alcohol use encompasses a spectrum of habits, including heavy episodic drinking (HED), which increases the risk of acute alcohol-related harms. The prevalence of HED in Saint Vincent and the Grenadines (SVG) is 5.7% among the overall population aged 15 years and older and 10.2% among drinkers. Responsible Beverage Service interventions train alcohol servers to limit levels of intoxication attained by customers and decrease acute alcohol-related harms. The objectives of this study were to determine bartenders' and rum shopkeepers' knowledge of and attitudes toward problem drinking and their willingness to participate in server training. Researchers used convenience and purposive sampling to recruit 30 participants from Barraouile, Kingstown, and Calliaqua to participate in semi-structured interviews designed to explore the study objectives. Results and conclusions were derived from grounded theory analysis. Heavy episodic drinking is common but not stigmatized. Heavy drinking is considered a "problem" if the customer attains a level of disinhibition causing drunken and disruptive or injurious behavior. Bartenders and rum shopkeepers reported intervening with visibly intoxicated patrons and encouraging cessation of continued alcohol consumption. Participants cited economic incentives, prevention of alcohol-related harms, and personal morals as motivators to prevent drunkenness. Respondents acknowledged that encouraging responsible drinking was a legitimate part of their role and were favorable to server training. However, there were mixed opinions about the intervention's perceived efficacy given absent community-wide standards on preventing intoxication and limitations of existing alcohol policy. Given respondents' motivation and the lack of standardized alcohol server training in SVG, mandated server training can be an effective strategy when promoted as one piece of a multi-component alcohol policy.

  6. deepTools: a flexible platform for exploring deep-sequencing data.

    PubMed

    Ramírez, Fidel; Dündar, Friederike; Diehl, Sarah; Grüning, Björn A; Manke, Thomas

    2014-07-01

    We present a Galaxy-based web server for processing and visualizing deeply sequenced data. The web server's core functionality consists of a suite of newly developed tools, called deepTools, that enable users with little bioinformatic background to explore the results of their sequencing experiments in a standardized setting. Users can upload pre-processed files with continuous data in standard formats and generate heatmaps and summary plots in a straightforward, yet highly customizable manner. In addition, we offer several tools for the analysis of files containing aligned reads and enable efficient and reproducible generation of normalized coverage files. As a modular and open-source platform, deepTools can easily be expanded and customized to future demands and developments. The deepTools web server is freely available at http://deeptools.ie-freiburg.mpg.de and is accompanied by extensive documentation and tutorials aimed at conveying the principles of deep-sequencing data analysis. The web server can be used without registration. deepTools can be installed locally either stand-alone or as part of Galaxy. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. Lyme Disease Data

    MedlinePlus

    County-level Lyme disease data from 2000-2016, Microsoft Excel file [Excel CSV – 209KB].

  8. PharmMapper 2017 update: a web server for potential drug target identification with a comprehensive target pharmacophore database.

    PubMed

    Wang, Xia; Shen, Yihang; Wang, Shiwei; Li, Shiliang; Zhang, Weilin; Liu, Xiaofeng; Lai, Luhua; Pei, Jianfeng; Li, Honglin

    2017-07-03

    The PharmMapper online tool is a web server for potential drug target identification that works by reverse pharmacophore matching of the query compound against an in-house pharmacophore model database. The original version of PharmMapper includes more than 7000 target pharmacophores derived from complex crystal structures with corresponding protein target annotations. In this article, we present a new version of the PharmMapper web server, whose backend pharmacophore database is six times larger than the earlier one, with a total of 23 236 proteins covering 16 159 druggable pharmacophore models and 51 431 ligandable pharmacophore models. The expanded target data cover 450 indications and 4800 molecular functions, compared to 110 indications and 349 molecular functions in our last update. In addition, the new web server provides a statistically meaningful ranking of the identified drug targets, achieved through the use of standard scores. It also features an improved user interface. The proposed web server is freely available at http://lilab.ecust.edu.cn/pharmmapper/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
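
    The "standard scores" used for ranking are ordinary z-scores. The following minimal sketch shows the idea with invented numbers: each raw fit score is normalized by the mean and standard deviation of its pharmacophore model's score distribution, making targets comparable across models. All names and values are hypothetical.

```python
# Illustrative sketch of the standard-score (z-score) ranking idea: raw fit
# scores are normalized per pharmacophore model so that targets scored by
# different models become directly comparable. All numbers are made up.

def standard_score(raw: float, mean: float, std: float) -> float:
    """z = (x - mu) / sigma; a higher z means a better-than-average fit."""
    return (raw - mean) / std

candidates = [
    # (target name, raw fit score, model mean, model std) -- all hypothetical
    ("Target A", 4.1, 3.0, 0.5),
    ("Target B", 5.0, 4.8, 0.9),
    ("Target C", 2.9, 2.0, 0.4),
]

ranked = sorted(
    ((name, standard_score(raw, mu, sd)) for name, raw, mu, sd in candidates),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, z in ranked:
    print(f"{name}: z = {z:+.2f}")
```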

  9. Comparing Server Energy Use and Efficiency Using Small Sample Sizes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, Henry C.; Qin, Yong; Price, Phillip N.

    This report documents a demonstration that compared the energy consumption and efficiency of a limited sample of server-type IT equipment from different manufacturers by measuring power at the server power supply power cords. The results are specific to the equipment and methods used; however, it is hoped that those responsible for IT equipment selection can use the methods described to choose models that optimize energy use efficiency. The demonstration was conducted in a data center at Lawrence Berkeley National Laboratory in Berkeley, California. It was performed with five servers of similar mechanical and electronic specifications: three from Intel and one each from Dell and Supermicro. Server IT equipment is constructed using commodity components, server manufacturer-designed assemblies, and control systems. Server compute efficiency is constrained by the commodity component specifications and integration requirements. The design freedom, outside of the commodity component constraints, provides room for the manufacturer to offer a product with competitive efficiency that meets market needs at a compelling price. A goal of the demonstration was to compare and quantify server efficiency for three different brands. Efficiency is defined as the average compute rate (computations per unit of time) divided by the average energy consumption rate. The research team used an industry-standard benchmark software package to provide a repeatable software load to obtain the compute rate and provide a variety of power consumption levels. Energy use when the servers were in an idle state (not providing computing work) was also measured. At high server compute loads, all brands, using the same key components (processors and memory), had similar results; therefore, from these results, it could not be concluded that one brand is more efficient than the others. The test results show that the power consumption variability caused by the key components as a group is similar to that of all other components as a group. However, some differences were observed. The Supermicro server used 27 percent more power at idle compared to the other brands. The Intel server had a power supply control feature called cold redundancy, and the data suggest that cold redundancy can provide energy savings at low power levels. Test and evaluation methods that might be used by others having limited resources for IT equipment evaluation are explained in the report.
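
    The efficiency metric is simple enough to state as code: average compute rate divided by average power draw, i.e. computations per joule. The sketch below uses invented benchmark numbers purely to illustrate the arithmetic.

```python
# Sketch of the efficiency metric used in the report: average compute rate
# divided by average power draw. Since ops/s divided by J/s gives ops/J,
# the result is computations per joule. All numbers are hypothetical
# stand-ins for benchmark and power-cord measurements.
measurements = {
    # brand: (computations per second, watts) -- invented values
    "Server 1": (1.20e9, 350.0),
    "Server 2": (1.18e9, 345.0),
    "Server 3": (1.15e9, 430.0),
}

for brand, (ops_per_s, watts) in measurements.items():
    efficiency = ops_per_s / watts  # computations per joule
    print(f"{brand}: {efficiency:.3e} computations/joule")
```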

  10. An Efficient Algorithm for Server Thermal Fault Diagnosis Based on Infrared Image

    NASA Astrophysics Data System (ADS)

    Liu, Hang; Xie, Ting; Ran, Jian; Gao, Shan

    2017-10-01

    It is essential for a data center to maintain server security and stability. Long-time overload operation or high room temperature may cause service disruption or even a server crash, which would result in great economic loss for the business. Currently, the main methods for avoiding server outages are monitoring and forecasting. A thermal camera can provide fine texture information for monitoring and intelligent thermal management in a large data center. This paper presents an efficient method for server thermal fault monitoring and diagnosis based on infrared images. Initially, the thermal distribution of the server is standardized and the regions of interest in the image are segmented manually. Then texture features, Hu moment features, and a modified entropy feature are extracted from the segmented regions. These characteristics are applied to analyze and classify thermal faults, and then to make efficient, energy-saving thermal management decisions such as job migration. For the larger feature space, principal component analysis is employed to reduce the feature dimensions and guarantee high processing speed without losing fault feature information. Finally, different feature vectors are taken as input for SVM training, and thermal fault diagnosis is performed with the optimized SVM classifier. This method supports suggestions for optimizing data center management, improving air conditioning efficiency, and reducing the energy consumption of the data center. The experimental results show that the maximum detection accuracy is 81.5%.
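
    A minimal sketch of the classification stage, assuming scikit-learn is available: PCA reduces the feature vectors and an RBF-kernel SVM classifies them. The random matrix below merely stands in for the texture, Hu moment, and entropy features extracted from segmented infrared regions.

```python
# Sketch of the PCA + SVM pipeline described above. Feature extraction
# (texture, Hu moments, modified entropy) is omitted; a random matrix
# stands in for those feature vectors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))            # 200 regions x 64 features (stand-in)
y = rng.integers(0, 2, size=200)          # 0 = normal, 1 = thermal fault

model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
model.fit(X[:150], y[:150])               # train on 150 samples
accuracy = model.score(X[150:], y[150:])  # evaluate on held-out samples
print(f"held-out accuracy: {accuracy:.3f}")
```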

  11. The TOPCONS web server for consensus prediction of membrane protein topology and signal peptides.

    PubMed

    Tsirigos, Konstantinos D; Peters, Christoph; Shu, Nanjiang; Käll, Lukas; Elofsson, Arne

    2015-07-01

    TOPCONS (http://topcons.net/) is a widely used web server for consensus prediction of membrane protein topology. We hereby present a major update to the server, with some substantial improvements, including the following: (i) TOPCONS can now efficiently separate signal peptides from transmembrane regions. (ii) The server can now differentiate more successfully between globular and membrane proteins. (iii) The server is now even slightly faster, although a much larger database is used to generate the multiple sequence alignments. For most proteins, the final prediction is produced in a matter of seconds. (iv) The user-friendly interface is retained, with the additional features of submitting batch files and accessing the server programmatically using standard interfaces, thus making it ideal for proteome-wide analyses. Indicatively, the user can now scan the entire human proteome in a few days. (v) For proteins with homology to a known 3D structure, the homology-inferred topology is also displayed. (vi) Finally, the combination of methods currently implemented achieves an overall increase in performance of 4% over the currently available best-scoring methods, and TOPCONS is the only method that can identify signal peptides and still maintain state-of-the-art performance in topology prediction. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  12. Datacube Services in Action, Using Open Source and Open Standards

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Misev, D.

    2016-12-01

    Array Databases comprise novel, promising technology for massive spatio-temporal datacubes, extending the SQL paradigm of "any query, anytime" to n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. The rasdaman ("raster data manager") system, which has pioneered Array Databases, is available in open source at www.rasdaman.org. Its declarative query language extends SQL with array operators which are optimized and parallelized on the server side. The rasdaman engine, which is part of OSGeo Live, is mature and in operational use on databases individually holding dozens of terabytes. Further, the rasdaman concepts have strongly impacted international Big Data standards in the field, including the forthcoming MDA ("Multi-Dimensional Array") extension to ISO SQL, the OGC Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS) standards, and the forthcoming INSPIRE WCS/WCPS; in both OGC and INSPIRE, rasdaman serves as the WCS Core Reference Implementation. In our talk we present the concepts, architecture, operational services, and standardization impact of open-source rasdaman, as well as lessons learned.
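
    As a hedged sketch of the "any query, anytime" idea, the snippet below posts a WCPS-style query to a rasdaman endpoint over HTTP. The endpoint URL and coverage name are hypothetical, and the exact request parameters vary by deployment.

```python
# Hedged sketch: send a WCPS-style array query to a rasdaman server over
# HTTP. The endpoint and coverage name below are hypothetical; the query
# follows the OGC WCPS style of server-side array subsetting and encoding.
import requests

WCPS_ENDPOINT = "https://example.org/rasdaman/ows"  # hypothetical server

# Subset one year of a (hypothetical) temperature datacube and return CSV;
# the subsetting and encoding happen on the server, not the client.
query = 'for c in (AvgTemperature) return encode(c[ansi("2015-01":"2015-12")], "csv")'

resp = requests.post(WCPS_ENDPOINT, data={"query": query}, timeout=60)
resp.raise_for_status()
print(resp.text[:200])  # first part of the CSV result
```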

  13. Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems

    NASA Technical Reports Server (NTRS)

    Ponyik, Joseph G.; York, David W.

    2002-01-01

    Embedded Systems have traditionally been developed in a highly customized manner. The user interface hardware and software, along with the interface to the embedded system, are typically unique to the system for which they are built, resulting in extra cost to the system in terms of development time and maintenance effort. World Wide Web standards have been developed over the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms, but the World Wide Web standards allow them to interface without knowing the details of the system at the other end of the interface. Embedded Web Technology is the merging of Embedded Systems with the World Wide Web. Embedded Web Technology decreases the cost of developing and maintaining the user interface by allowing the user to interface to the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an Embedded System's internal network.
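
    A minimal sketch of the idea, using Python's standard library as a stand-in for embedded firmware: the embedded system runs a small HTTP server, and any standard browser becomes its user interface. The telemetry value is invented.

```python
# Minimal sketch of the Embedded Web Technology idea: the embedded system
# serves plain HTML over HTTP, so any browser on any platform is the user
# interface. Python's standard library stands in for embedded firmware;
# the temperature reading is an invented placeholder.
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve a status page that a browser on any platform can render.
        body = b"<html><body><h1>Device status</h1><p>Temp: 21.5 C</p></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
```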

  14. Government Information Locator Service (GILS). Draft report to the Information Infrastructure Task Force

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This is a draft report on the Government Information Locator Service (GILS) to the National Information Infrastructure (NII) task force. GILS is designed to take advantage of the internetworking technology known as client-server architecture, which allows information to be distributed among multiple independent information servers. Two appendices are provided: (1) a glossary of related terminology and (2) extracts from a draft GILS profile for the use of the American National Standard Information Retrieval Application Service Definition and Protocol Specification for Library Applications.

  15. Use of a Microsoft Excel based add-in program to calculate plasma sinistrin clearance by a two-compartment model analysis in dogs.

    PubMed

    Steinbach, Sarah M L; Sturgess, Christopher P; Dunning, Mark D; Neiger, Reto

    2015-06-01

    Assessment of renal function by means of the plasma clearance of a suitable marker has become standard procedure for estimating glomerular filtration rate (GFR). Sinistrin, a polyfructan solely cleared by the kidney, is often used for this purpose. Pharmacokinetic modeling using adequate software is necessary to calculate the disappearance rate and half-life of sinistrin. The purpose of this study was to describe the use of a Microsoft Excel-based add-in program to calculate plasma sinistrin clearance, as well as additional pharmacokinetic parameters such as transfer rates (k), half-life (t1/2), and volume of distribution (Vss), for sinistrin in dogs with varying degrees of renal function. Copyright © 2015 Elsevier Ltd. All rights reserved.
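
    The underlying analysis can be sketched compactly: fit a biexponential C(t) = A·e^(-αt) + B·e^(-βt) to the concentration-time data, then derive clearance and terminal half-life from the standard two-compartment bolus formulas (AUC = A/α + B/β, Cl = Dose/AUC, t½ = ln 2/β). The data and dose below are invented, and this is a generic illustration rather than the add-in program itself.

```python
# Generic two-compartment bolus analysis: fit a biexponential curve to
# plasma concentrations, then derive clearance and terminal half-life.
# Sample times, concentrations, and dose are all invented; the formulas
# (AUC = A/alpha + B/beta, Cl = Dose/AUC, t1/2 = ln2/beta) are standard.
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, A, alpha, B, beta):
    return A * np.exp(-alpha * t) + B * np.exp(-beta * t)

t = np.array([5, 10, 20, 40, 60, 90, 120, 180.0])   # minutes (hypothetical)
c = np.array([95, 70, 42, 22, 14, 8.5, 5.6, 2.6])   # mg/L (hypothetical)

(A, alpha, B, beta), _ = curve_fit(biexp, t, c, p0=(80, 0.1, 20, 0.01))
dose = 2500.0                                        # mg (hypothetical)
auc = A / alpha + B / beta                           # mg*min/L
clearance = dose / auc                               # L/min
t_half = np.log(2) / beta                            # terminal half-life, min
print(f"Cl = {clearance:.3f} L/min, t1/2 = {t_half:.1f} min")
```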

  16. Data-Acquisition Software for PSP/TSP Wind-Tunnel Cameras

    NASA Technical Reports Server (NTRS)

    Amer, Tahani R.; Goad, William K.

    2005-01-01

    Wing-Viewer is a computer program for the acquisition and reduction of image data acquired by any of five different scientific-grade commercial electronic cameras used at Langley Research Center to observe wind-tunnel models coated with pressure- or temperature-sensitive paints (PSP/TSP). Wing-Viewer provides full automation of camera operation and acquisition of image data, and has limited data-preprocessing capability for quick viewing of the results of PSP/TSP test images. Wing-Viewer satisfies a requirement for a standard interface between all the cameras and a single personal computer. Written using Microsoft Visual C++ and the Microsoft Foundation Class Library as a framework, Wing-Viewer has the ability to communicate with the C/C++ software libraries that run on the controller circuit cards of all five cameras.

  17. The Standard Autonomous File Server, A Customized, Off-the-Shelf Success Story

    NASA Technical Reports Server (NTRS)

    Semancik, Susan K.; Conger, Annette M.; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    The Standard Autonomous File Server (SAFS), which includes both off-the-shelf hardware and software, uses an improved automated file transfer process to provide quicker, more reliable, prioritized file distribution for customers of near-real-time data without interfering with the assets involved in the acquisition and processing of the data. It operates as a stand-alone solution, monitoring itself and providing an automated fail-over process to enhance reliability. This paper describes the unique problems and lessons learned during the COTS selection and integration into SAFS, and during the system's first year of operation in support of NASA's satellite ground network. COTS was the key factor in allowing the two-person development team to deploy systems in less than a year, meeting the required launch schedule. The SAFS system has been so successful that it is becoming a NASA standard resource, leading to its nomination for NASA's Software of the Year Award in 1999.

  18. Implementation of a cloud-based electronic medical record exchange system in compliance with the integrating healthcare enterprise's cross-enterprise document sharing integration profile.

    PubMed

    Wu, Chien Hua; Chiu, Ruey Kei; Yeh, Hong Mo; Wang, Da Wei

    2017-11-01

    In 2011, the Ministry of Health and Welfare of Taiwan established the National Electronic Medical Record Exchange Center (EEC) to permit the sharing of medical resources among hospitals. This system can presently exchange electronic medical records (EMRs) among hospitals in the form of medical imaging reports, laboratory test reports, discharge summaries, outpatient records, and outpatient medication records. Hospitals can send or retrieve EMRs over the virtual private network by connecting to the EEC through a gateway. International standards should be adopted in the EEC to allow users of those standards to take advantage of this exchange service. In this study, a cloud-based EMR-exchange prototyping system was implemented on the basis of the Integrating the Healthcare Enterprise's Cross-Enterprise Document Sharing integration profile and the existing EMR exchange system. RESTful services were used to implement the proposed prototyping system on the Microsoft Azure cloud-computing platform. Four scenarios were created in Microsoft Azure to determine the feasibility and effectiveness of the proposed system. The experimental results demonstrated that the proposed system successfully completed EMR exchange under the four scenarios created in Microsoft Azure. Additional experiments were conducted to compare the efficiency of the EMR-exchanging mechanisms of the proposed system with those of the existing EEC system. The experimental results suggest that the proposed RESTful service approach is superior to the Simple Object Access Protocol method currently implemented in the EEC system, according to their respective response times under the four experimental scenarios. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Web Proxy Auto Discovery for the WLCG

    NASA Astrophysics Data System (ADS)

    Dykstra, D.; Blomer, J.; Blumenfeld, B.; De Salvo, A.; Dewhurst, A.; Verguilov, V.

    2017-10-01

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home) which they direct to the nearest publicly accessible web proxy servers. The responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.
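
    A simplified sketch of the client side of this system: fetch the PAC file from the conventional WPAD location and pull out the proxy candidates. Real Frontier and CVMFS clients evaluate the PAC JavaScript properly; the regex here is only illustrative, and the wpad hostname resolves only inside a suitably configured site.

```python
# Simplified WPAD client sketch: fetch the PAC file from the conventional
# well-known URL and extract the proxy candidates. Real clients evaluate
# the PAC JavaScript; the regex below is only an illustration, and the
# "wpad" hostname resolves only inside a configured site.
import re
import requests

PAC_URL = "http://wpad/wpad.dat"  # conventional WPAD location within a site

pac_source = requests.get(PAC_URL, timeout=10).text
# PAC files return strings like "PROXY squid1.example.org:3128; DIRECT"
proxies = re.findall(r"PROXY\s+([\w.\-]+:\d+)", pac_source)
print("web proxies offered by this site:", proxies)
```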

  20. Web Proxy Auto Discovery for the WLCG

    DOE PAGES

    Dykstra, D.; Blomer, J.; Blumenfeld, B.; ...

    2017-11-23

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home), which they direct to the nearest publicly accessible web proxy servers. Furthermore, the responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.

  1. Web Proxy Auto Discovery for the WLCG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykstra, D.; Blomer, J.; Blumenfeld, B.

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home), which they direct to the nearest publicly accessible web proxy servers. Furthermore, the responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.

  2. Towards Big Earth Data Analytics: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2013-04-01

    Big Data in the Earth sciences, the tera- to exabyte archives, mostly consist of coverage data, whereby the term "coverage", according to ISO and OGC, is defined as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit the transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The EarthServer initiative, funded by EU FP7 eInfrastructures, unites 11 partners from the computer and earth sciences to establish Big Earth Data Analytics. One key ingredient is flexibility for users to ask for what they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level query languages; these have proven tremendously successful on tabular and XML data, and we extend them with a central geo data structure, multi-dimensional arrays. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform is built around rasdaman, an Array DBMS enabling efficient storage and retrieval of any-size, any-type multi-dimensional raster data. In the project, rasdaman is being extended with several functionality and scalability features, including: support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data import and, hence, duplication); and the aforementioned distributed query processing. Additionally, Web clients for multi-dimensional data visualization are being established. Client/server interfaces are strictly based on OGC and W3C standards, in particular the Web Coverage Processing Service (WCPS), which defines a high-level raster query language. We present the EarthServer project with its vision and approaches, relate it to the current state of standardization, and demonstrate it by way of large-scale data centers and their services using rasdaman.

  3. A Microsoft-Excel-based tool for running and critically appraising network meta-analyses--an overview and application of NetMetaXL.

    PubMed

    Brown, Stephen; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Grima, Daniel; Wells, George; Cameron, Chris

    2014-09-29

    The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package for conducting network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are of publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. We demonstrate the application of NetMetaXL using data from a previously published network meta-analysis comparing combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations, which are frequently Excel-based.

  4. A Microsoft-Excel-based tool for running and critically appraising network meta-analyses—an overview and application of NetMetaXL

    PubMed Central

    2014-01-01

    Background: The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package for conducting network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. Methods: We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL’s interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are of publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. Results: We demonstrate the application of NetMetaXL using data from a previously published network meta-analysis comparing combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Conclusions: Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations, which are frequently Excel-based. PMID:25267416

  5. The Orthanc Ecosystem for Medical Imaging.

    PubMed

    Jodogne, Sébastien

    2018-05-03

    This paper reviews the components of Orthanc, a free and open-source, highly versatile ecosystem for medical imaging. At the core of the Orthanc ecosystem, the Orthanc server is a lightweight vendor neutral archive that provides PACS managers with a powerful environment to automate and optimize the imaging flows that are very specific to each hospital. The Orthanc server can be extended with plugins that provide solutions for teleradiology, digital pathology, or enterprise-ready databases. It is shown how software developers and research engineers can easily develop external software or Web portals dealing with medical images, with minimal knowledge of the DICOM standard, thanks to the advanced programming interface of the Orthanc server. The paper concludes by introducing the Stone of Orthanc, an innovative toolkit for the cross-platform rendering of medical images.
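
    A small sketch against Orthanc's REST interface, assuming a local server on the default port with default credentials (deployment details that vary): list stored patients, then ingest a DICOM file by POSTing its bytes. The file name is a placeholder.

```python
# Sketch against the Orthanc REST API, assuming a local Orthanc instance on
# its default port (8042) with default credentials; both are deployment
# details that vary. image.dcm is a placeholder file name.
import requests

BASE = "http://localhost:8042"      # default Orthanc port (assumption)
AUTH = ("orthanc", "orthanc")       # default credentials (change in production)

# Every stored patient is exposed as an identifier under /patients.
patient_ids = requests.get(f"{BASE}/patients", auth=AUTH, timeout=10).json()
print(f"{len(patient_ids)} patients stored")

# A raw DICOM file can be ingested by POSTing its bytes to /instances.
with open("image.dcm", "rb") as f:
    r = requests.post(f"{BASE}/instances", data=f.read(), auth=AUTH, timeout=30)
r.raise_for_status()
print("stored instance:", r.json()["ID"])
```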

  6. The Time Series Data Server (TSDS) for Standards-Compliant, Convenient, and Efficient Access to Time Series Data

    NASA Astrophysics Data System (ADS)

    Lindholm, D. M.; Weigel, R. S.; Wilson, A.; Ware Dewolfe, A.

    2009-12-01

    Data analysis in the physical sciences is often plagued by the difficulty of acquiring the desired data. A great deal of work has been done in the area of metadata and data discovery; however, many such discoveries simply provide links that lead directly to a data file. Often these files are impractically large, containing more time samples or variables than desired, and are slow to access. Once these files are downloaded, format issues further complicate using the data. Some data servers have begun to address these problems by improving data virtualization and ease of use. However, these services often don't scale to large datasets. Also, the generic nature of the data models used by these servers, while providing greater flexibility, may complicate setting up such a service for data providers and limit the semantics that would otherwise simplify use for clients, machine or human. The Time Series Data Server (TSDS) aims to address these problems within the limited, yet common, domain of time series data. With the simplifying assumption that all data products served are a function of time, the server can optimize for data access based on time subsets, a common use case. The server also supports requests for specific variables, which can be of type scalar, structure, or sequence. It also supports data types with higher-level semantics, such as "spectrum." The TSDS is implemented using Java Servlet technology and can be dropped into any servlet container and customized for a data provider's needs. The interface is based on OPeNDAP (http://opendap.org) and conforms to the Data Access Protocol (DAP) 2.0, a NASA standard (ESDS-RFC-004), which defines a simple HTTP request and response paradigm. Thus a TSDS instance is a compliant OPeNDAP server that can be accessed by any OPeNDAP client or directly via RESTful web service requests. The TSDS reads the data that it serves into a common data model via the NetCDF Markup Language (NcML, http://www.unidata.ucar.edu/software/netcdf/ncml/), which enables dataset virtualization. An NcML file can expose a single file, a subset, or an aggregation of files as a single, logical dataset. With the appropriate NcML adapter, the TSDS can read data from its native format, eliminating the need for data providers to reformat their data and lowering the barrier for integration. Data can even be read via remote services, which is important for enabling VxOs to be truly virtual. The TSDS provides reading, writing, and filtering capabilities through a modular framework. A collection of standard modules is available, and customized modules are easy to create and integrate. This way the TSDS can read and write data in a variety of formats and apply filters to them in a manner customizable to meet the needs of both data providers and consumers. The TSDS is currently in use serving solar irradiance data from the LASP Interactive Solar Irradiance Datacenter (LISIRD, http://lasp.colorado.edu/lisird/), and is being introduced into the space physics virtual observatory community. The TSDS software is Open Source and available at SourceForge.
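
    A hedged sketch of the DAP-style request pattern: dataset, projected variables, and a time constraint are all expressed in the URL, and the server returns only that subset. The host, dataset, and variable names below are hypothetical.

```python
# Hedged sketch of a DAP-style subset request of the kind the TSDS supports:
# the URL names the dataset and output form, projects two variables, and
# constrains time so only that subset crosses the network. Host, dataset,
# and variable names are hypothetical.
import requests

url = (
    "http://example.org/tsds/irradiance.csv"  # hypothetical TSDS instance
    "?time,flux"                              # project two variables
    "&time>=2009-01-01&time<2009-02-01"       # subset one month
)
resp = requests.get(url, timeout=30)
resp.raise_for_status()
for line in resp.text.splitlines()[:5]:
    print(line)  # first few CSV records of the subset
```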

  7. Security Implications of OPC, OLE, DCOM, and RPC in Control Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2006-01-01

    OPC is a collection of software programming standards and interfaces used in the process control industry. It is intended to provide open connectivity and vendor equipment interoperability. The use of OPC technology simplifies the development of control systems that integrate components from multiple vendors and support multiple control protocols. OPC-compliant products are available from most control system vendors, and are widely used in the process control industry. OPC was originally known as OLE for Process Control; the first standards for OPC were based on underlying services in the Microsoft Windows computing environment. These underlying services (OLE [Object Linking and Embedding], DCOM [Distributed Component Object Model], and RPC [Remote Procedure Call]) have been the source of many severe security vulnerabilities. It is not feasible to automatically apply vendor patches and service packs to mitigate these vulnerabilities in a control systems environment. Control systems using the original OPC data access technology can thus inherit the vulnerabilities associated with these services. Current OPC standardization efforts are moving away from the original focus on Microsoft protocols, with a distinct trend toward web-based protocols that are independent of any particular operating system. However, the installed base of OPC equipment consists mainly of legacy implementations of the OLE for Process Control protocols.

  8. Performance comparison of leading image codecs: H.264/AVC Intra, JPEG2000, and Microsoft HD Photo

    NASA Astrophysics Data System (ADS)

    Tran, Trac D.; Liu, Lijie; Topiwala, Pankaj

    2007-09-01

    This paper provides a detailed rate-distortion performance comparison between JPEG2000, Microsoft HD Photo, and H.264/AVC High Profile 4:4:4 I-frame coding for high-resolution still images and high-definition (HD) 1080p video sequences. This work is an extension of our previous comparative studies published in previous SPIE conferences [1, 2]. Here we further optimize all three codecs for compression performance. Coding simulations are performed on a set of large-format color images captured from mainstream digital cameras and 1080p HD video sequences commonly used for H.264/AVC standardization work. Overall, our experimental results show that all three codecs offer very similar coding performance at the high-quality, high-resolution setting. Differences tend to be data-dependent: JPEG2000 with its wavelet technology tends to be the best performer on smooth spatial data; H.264/AVC High Profile with advanced spatial prediction modes tends to cope best with more complex visual content; Microsoft HD Photo tends to be the most consistent across the board. For the still-image data sets, JPEG2000 offers the best R-D performance gains (around 0.2 to 1 dB in peak signal-to-noise ratio) over H.264/AVC High Profile intra coding and Microsoft HD Photo. For the 1080p video data set, all three codecs offer very similar coding performance. As in [1, 2], we consider neither scalability nor complexity in this study (JPEG2000 is operating in non-scalable, but optimal performance mode).
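
    The comparison is reported in peak signal-to-noise ratio, PSNR = 10·log10(MAX²/MSE) with MAX = 255 for 8-bit samples. The worked example below computes it for a random stand-in image pair; at this scale, the 0.2 to 1 dB gaps cited above are meaningful.

```python
# Worked example of the PSNR metric used in the comparison:
# PSNR = 10 * log10(MAX^2 / MSE), with MAX = 255 for 8-bit samples.
# The arrays are random stand-ins for an original and a compressed frame.
import numpy as np

def psnr(original: np.ndarray, compressed: np.ndarray, peak: float = 255.0) -> float:
    mse = np.mean((original.astype(np.float64) - compressed.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(1080, 1920), dtype=np.uint8)   # "original"
noisy = np.clip(img.astype(int) + rng.integers(-3, 4, size=img.shape), 0, 255)
print(f"PSNR: {psnr(img, noisy):.2f} dB")
```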

  9. What's New with MS Office Suites

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2012-01-01

    If one buys a new PC, laptop, or netbook computer today, it probably comes preloaded with Microsoft Office 2010 Starter Edition. This is a significantly limited, advertising-laden version of Microsoft's suite of productivity programs, Microsoft Office. This continues the trend of PC makers providing ever more crippled versions of Microsoft's…

  10. Utilizing Microsoft Mathematics in Teaching and Learning Calculus

    ERIC Educational Resources Information Center

    Oktaviyanthi, Rina; Supriani, Yani

    2015-01-01

    The experimental design was conducted to investigate the use of Microsoft Mathematics, free software made by Microsoft Corporation, in teaching and learning Calculus. This paper reports results from experimental study details on implementation of Microsoft Mathematics in Calculus, students' achievement and the effects of the use of Microsoft…

  11. Experimental Design: Utilizing Microsoft Mathematics in Teaching and Learning Calculus

    ERIC Educational Resources Information Center

    Oktaviyanthi, Rina; Supriani, Yani

    2015-01-01

    The experimental design was conducted to investigate the use of Microsoft Mathematics, free software made by Microsoft Corporation, in teaching and learning Calculus. This paper reports results from experimental study details on implementation of Microsoft Mathematics in Calculus, students' achievement and the effects of the use of Microsoft…

  12. Design and development of a tele-healthcare information system based on web services and HL7 standards.

    PubMed

    Huang, Ean-Wen; Hung, Rui-Suan; Chiou, Shwu-Fen; Liu, Fei-Ying; Liou, Der-Ming

    2011-01-01

    Information and communication technologies are progressing rapidly, and many novel applications have been developed across many domains of human life. In recent years, the demand for healthcare services has been growing because of the increase in the elderly population. Consequently, a number of healthcare institutions have focused on creating technologies to reduce extraneous work and improve the quality of service. In this study, an information platform for tele-healthcare services was implemented. The architecture of the platform included a web-based application server and a client system. The client system was able to retrieve the blood pressure and glucose levels of a patient stored in measurement instruments through Bluetooth wireless transmission. The web application server assisted staff and clients in analyzing the health conditions of patients. In addition, the server provided face-to-face communication and instructions through remote video devices. The platform deployed a service-oriented architecture consisting of HL7 standard messages and web service components. The platform could transform health records into HL7 standard Clinical Document Architecture documents for data exchange with other organizations. The prototype system was pretested and evaluated in the homecare department of a hospital and a community management center for chronic disease monitoring. Based on the results of this study, this system is expected to improve the quality of healthcare services.
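
    For flavor, here is a schematic HL7 v2 observation message of the kind such a platform might exchange for a blood-pressure reading. The sender, patient, and values are invented, and the platform described actually transforms records into HL7 CDA documents rather than this exact form.

```python
# Schematic HL7 v2 observation message for a blood-pressure reading. All
# facility names, IDs, and values are invented; this illustrates the
# pipe-and-hat encoding only (the platform above ultimately uses HL7 CDA).
segments = [
    # MSH: message header (sender, receiver, timestamp, message type)
    "MSH|^~\\&|HomeGW|TeleCare|HIS|Hospital|20110315083000||ORU^R01|MSG0001|P|2.5",
    # PID: patient identification (ID and name are made up)
    "PID|1||123456^^^TeleCare||Doe^Jane",
    # OBX: one observation per segment; 8480-6 is the LOINC code for
    # systolic blood pressure
    "OBX|1|NM|8480-6^Systolic blood pressure^LN||128|mm[Hg]|||||F",
]

message = "\r".join(segments)  # HL7 v2 segments are separated by <CR>
print(message.replace("\r", "\n"))
```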

  13. Z39.50 and the Scholar's Workstation Concept.

    ERIC Educational Resources Information Center

    Phillips, Gary Lee

    1992-01-01

    Examines the potential application of the American National Standards Institute (ANSI)/National Information Standards Organization (NISO) Z39.50 library networking protocol as a client/server environment for a scholar's workstation. Computer networking models are described, and linking the workstation to an online public access catalog (OPAC) is…

  14. ScreenRecorder: A Utility for Creating Screenshot Video Using Only Original Equipment Manufacturer (OEM) Software on Microsoft Windows Systems

    DTIC Science & Technology

    2015-01-01

    class within Microsoft Visual Studio. It has been tested on and is compatible with Microsoft Vista, 7, and 8 and Visual Studio Express 2008...the ScreenRecorder utility assumes a basic understanding of compiling and running C++ code within Microsoft Visual Studio. This report does not...of Microsoft Visual Studio, the ScreenRecorder utility was developed as a C++ class that can be compiled as a library (static or dynamic) to be

  15. Lowering the Barrier for Standards-Compliant and Discoverable Hydrological Data Publication

    NASA Astrophysics Data System (ADS)

    Kadlec, J.

    2013-12-01

    The growing need for sharing and integration of hydrological and climate data across multiple organizations has resulted in the development of distributed, services-based, standards-compliant hydrological data management and data hosting systems. The problem with these systems is their complicated setup and deployment. Many existing systems assume that the data publisher has remote-desktop access to a locally managed server and experience with computer network setup. For corporate websites, shared web hosting services with limited root access provide an inexpensive, dynamic web presence solution using the Linux, Apache, MySQL and PHP (LAMP) software stack. In this paper, we hypothesize that a webhosting service provides an optimal, low-cost solution for hydrological data hosting. We propose a software architecture for a standards-compliant, lightweight, and easy-to-deploy hydrological data management system that can be deployed on the majority of existing shared internet webhosting services. The architecture and design are validated by developing Hydroserver Lite: a PHP- and MySQL-based hydrological data hosting package that is fully standards-compliant and compatible with the Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI) hydrologic information system. It is already being used for managing field data collection by students of the McCall Outdoor Science School in Idaho. For testing, the Hydroserver Lite software has been installed on multiple free and low-cost webhosting sites, including GoDaddy, Bluehost, and 000webhost. The number of steps required to set up the server is compared with the number required to set up other standards-compliant hydrologic data hosting systems, including THREDDS, IstSOS, and MapServer SOS.

  16. Creation of a Web-Based GIS Server and Custom Geoprocessing Tools for Enhanced Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Welton, B.; Chouinard, K.; Sultan, M.; Becker, D.; Milewski, A.; Becker, R.

    2010-12-01

    Rising populations in the arid and semi-arid parts of the world are increasing the demand for fresh water supplies worldwide. Many data sets needed for the assessment of hydrologic applications across vast regions of the world are expensive, unpublished, difficult to obtain, or at varying scales, which complicates their use. Fortunately, this situation is changing with the development of global remote sensing datasets and web-based platforms such as GIS Server. GIS provides a cost-effective vehicle for comparing, analyzing, and querying a variety of spatial datasets as geographically referenced layers. We have recently constructed a web-based GIS that incorporates all relevant geological, geochemical, geophysical, and remote sensing data sets, which were readily used to identify reservoir types and potential well locations on local and regional scales in various tectonic settings, including: (1) an extensional environment (Red Sea rift), (2) a transcurrent fault system (Najd Fault in the Arabian-Nubian Shield), and (3) compressional environments (Himalayas). The web-based GIS can also be used to detect spatial and temporal trends in precipitation, recharge, and runoff in large watersheds on local, regional, and continental scales. These applications were enabled through the construction of a web-based ArcGIS Server with a Google Maps interface and the development of customized geoprocessing tools. ArcGIS Server provides out-of-the-box setups that are generic in nature. This platform includes all of the standard web-based GIS tools (e.g. pan, zoom, identify, search, data querying, and measurement). In addition to the standard suite of tools provided by ArcGIS Server, an additional set of advanced data manipulation and display tools was developed to allow a more complete and customizable view of the area of interest. The most notable addition to the standard GIS Server tools is the set of custom on-demand geoprocessing tools (e.g., graph, statistical functions, custom raster creation, profile, TRMM). The generation of a wide range of derivative maps (e.g., buffer zone, contour map, graphs, temporal rainfall distribution maps) from various map layers (e.g., geologic maps, geophysics, satellite images) allows for more user flexibility. The use of these tools, along with the Google Maps API, which enables the website user to view high-quality GeoEye 2 imagery provided by Google in conjunction with our data, creates a more complete picture of the area being observed and allows custom derivative maps to be created in the field and viewed immediately on the web, processes that were previously restricted to offline databases.

  17. VIDANN: a video annotation system.

    PubMed

    De Clercq, A; Buysse, A; Roeyers, H; Ickes, W; Ponnet, K; Verhofstadt, L

    2001-05-01

    VIDANN is a computer program that allows participants to watch a video on a standard TV and to write their annotations (thought/feeling entries) on paper attached to a writing tablet. The system is designed as a Microsoft ActiveX module. It can be further adapted by the individual researcher through the use of a VBScript. All data, including the participant's handwriting, are stored in an XML database. An accompanying Wizard has been designed that enables researchers to generate VBScripts for standard configurations.

  18. Microsoft Biology Initiative: .NET Bioinformatics Platform and Tools

    PubMed Central

    Diaz Acosta, B.

    2011-01-01

    The Microsoft Biology Initiative (MBI) is an effort in Microsoft Research to bring new technology and tools to the area of bioinformatics and biology. This initiative is comprised of two primary components, the Microsoft Biology Foundation (MBF) and the Microsoft Biology Tools (MBT). MBF is a language-neutral bioinformatics toolkit built as an extension to the Microsoft .NET Framework—initially aimed at the area of Genomics research. Currently, it implements a range of parsers for common bioinformatics file formats; a range of algorithms for manipulating DNA, RNA, and protein sequences; and a set of connectors to biological web services such as NCBI BLAST. MBF is available under an open source license, and executables, source code, demo applications, documentation and training materials are freely downloadable from http://research.microsoft.com/bio. MBT is a collection of tools that enable biology and bioinformatics researchers to be more productive in making scientific discoveries.

  19. Towards an integration of affiliated companies energy audit process system at P.T Astra International

    NASA Astrophysics Data System (ADS)

    Telaga, Abdi Suryadinata; Hartanto, Indra Dwi; Audina, Debby Rizky; Prabowo, Fransiscus Dimas

    2017-06-01

    Environmental awareness, stringent regulation, and soaring energy costs together make energy efficiency an important pillar for every company. In particular, the Ministry of Energy and Mineral Resources of Indonesia has set a target of reducing carbon emissions by 26% by 2020, and companies in Indonesia have to comply with that target. However, there is a trade-off between a company's productivity and its carbon emissions, so productivity must be weighed against environmental effects such as carbon emission. Identifying excessive energy use in a company is still challenging: companies rarely have personnel skilled in auditing the energy they consume, and auditing energy consumption is a lengthy, time-consuming process. Because PT Astra International (AI) has 220 affiliated companies (AFFCOs), on-site visits to audit energy consumption are sometimes inevitable, but the capability to conduct on-site energy audits was limited by the availability of PT AI energy auditors. For that reason, PT AI developed a set of energy audit tools, the Astra green energy (AGEn) tools, to enable AFFCO auditors to audit energy use in their own companies. A fishbone chart was developed as an analysis tool to identify the root causes of the energy audit problem. Following the analysis, PT AI developed the AGEn web-based system, which helps AFFCOs conduct on-site energy audits. The system was developed using a prototyping methodology, object-oriented system analysis and design (OOSAD), and a three-tier architecture, and was implemented with ASP.NET, a Microsoft SQL Server 2012 database, and the IIS 8 web server.

  20. AWIPS II in the University Community: Unidata's efforts and capabilities of the software

    NASA Astrophysics Data System (ADS)

    Ramamurthy, Mohan; James, Michael

    2015-04-01

    The Advanced Weather Interactive Processing System, version II (AWIPS II) is a weather forecasting, display, and analysis tool used by the National Oceanic and Atmospheric Administration/National Weather Service (NOAA/NWS) and the National Centers for Environmental Prediction (NCEP) to ingest, analyze, and disseminate operational weather data. The AWIPS II software is built on a Service Oriented Architecture, takes advantage of open source software, and its design affords expandability, flexibility, and portability. Since many university meteorology programs are eager to use the same tools used by NWS forecasters, Unidata community interest in AWIPS II is high. The Unidata Program Center (UPC) has worked closely with NCEP staff during AWIPS II development in order to devise a way to make it available to the university community. The Unidata AWIPS II software was released in beta form in 2014, and it incorporates a number of key changes to the baseline U.S. National Weather Service release to process and display additional data formats and run all components in a single-server standalone configuration. In addition to making available open-source instances of the software libraries that can be downloaded and run at any university, Unidata has also deployed the data-server side of AWIPS II, known as EDEX, in the Amazon Web Services and Microsoft Azure cloud environments. In this setup, universities receive all of the data from remote cloud instances, while they only have to run the AWIPS II client, known as CAVE, to analyze and visualize the data. In this presentation, we will describe Unidata's AWIPS II efforts, including the capabilities of the software in visualizing many different types of real-time meteorological data and its myriad uses in university and other settings.

  1. LASP Time Series Server (LaTiS): Overcoming Data Access Barriers via a Common Data Model in the Middle Tier (Invited)

    NASA Astrophysics Data System (ADS)

    Lindholm, D. M.; Wilson, A.

    2010-12-01

    The Laboratory for Atmospheric and Space Physics at the University of Colorado has developed an open-source, OPeNDAP-compliant, Java Servlet based, RESTful web service to serve time series data. In addition to handling OPeNDAP-style requests and returning standard responses, existing modules for alternate output formats can be reused or customized. It is also simple to reuse or customize modules to directly read various native data sources and even to perform some processing on the server. The server is built around a common data model based on the Unidata Common Data Model (CDM), which merges the NetCDF, HDF, and OPeNDAP data models. The server framework features a modular architecture that supports pluggable Readers, Writers, and Filters via the common interface to the data, enabling a workflow that reads data from its native form, performs some processing on the server, and presents the results to the client in its preferred form. The service is currently being used operationally to serve time series data for the LASP Interactive Solar Irradiance Data Center (LISIRD, http://lasp.colorado.edu/lisird/) and as part of the Time Series Data Server (TSDS, http://tsds.net/). I will present the data model and how it enables reading, writing, and processing concerns to be separated into loosely coupled components. I will also share thoughts on evolving beyond the time series abstraction and providing a general-purpose data service that can be orchestrated into larger workflows.
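
    The reader/filter/writer separation can be sketched in a few lines (plain Python stand-ins, not the actual LaTiS interfaces, which are Java): every stage sees the same simple time-series data model, so stages can be swapped independently.

```python
# Language-neutral sketch of the Reader -> Filter -> Writer pipeline idea:
# each stage works against one simple time-series data model, so readers,
# filters, and writers can be swapped independently. These are plain Python
# stand-ins, not the actual (Java) LaTiS interfaces.
from typing import Iterable, Iterator, Tuple

Sample = Tuple[float, float]  # (time, value) -- the common data model here

def read_native() -> Iterator[Sample]:
    """Reader: yield samples from some native source (hard-coded stand-in)."""
    yield from [(0.0, 1.0), (1.0, 4.0), (2.0, 9.0), (3.0, 16.0)]

def scale(samples: Iterable[Sample], factor: float) -> Iterator[Sample]:
    """Filter: server-side processing applied mid-pipeline."""
    for t, v in samples:
        yield (t, v * factor)

def write_csv(samples: Iterable[Sample]) -> str:
    """Writer: render the stream in the client's preferred form."""
    return "\n".join(f"{t},{v}" for t, v in samples)

print(write_csv(scale(read_native(), 0.5)))
```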

  2. Development of a high-performance image server using ATM technology

    NASA Astrophysics Data System (ADS)

    Do Van, Minh; Humphrey, Louis M.; Ravin, Carl E.

    1996-05-01

    The ability to display digital radiographs to a radiologist in a reasonable time has long been a goal of many PACS. Intelligent routing, or pre-fetching of images, has become one solution, whereby a system uses a set of rules to route images to a pre-determined destination. Images are then stored locally on a workstation for faster display times. Some PACS use a large, centralized storage approach in which workstations retrieve images over high-bandwidth connections. Another approach to image management is to provide a high-performance, clustered storage system. This has the advantage of eliminating the complexity of pre-fetching and allows for rapid image display from anywhere within the hospital. We discuss the development of such a storage device, which provides extremely fast access to images across a local area network. Among the requirements for development of the image server were high performance, DICOM 3.0 compliance, and the use of industry-standard components. The completed image server provides performance more than sufficient for use in clinical practice. Setting up modalities to send images to the image server is simple due to the adherence to the DICOM 3.0 specification. Using only off-the-shelf components keeps the cost of the server relatively low and allows for easy upgrades as technology advances. These factors make the image server ideal for use as a clustered storage system in a radiology department.

  3. iScreen: world's first cloud-computing web server for virtual screening and de novo drug design based on TCM database@Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Tsung-Ying; Chang, Kai-Wei; Chen, Calvin Yu-Chian

    2011-06-01

    The rapidly advancing research on traditional Chinese medicine (TCM) has greatly intrigued pharmaceutical industries worldwide. To take the initiative in the next generation of drug development, we constructed a cloud-computing system for TCM intelligent screening (iScreen) based on TCM Database@Taiwan. iScreen is a compact web server for TCM docking followed by customized de novo drug design. We further implemented a protein preparation tool that both extracts the protein of interest from a raw input file and estimates the size of the ligand binding site. In addition, iScreen is designed with a user-friendly graphical interface for users who have less experience with command-line systems. For customized docking, multiple docking services, including standard, in-water, pH-environment, and flexible docking modes, are implemented. Users can download the first 200 TCM compounds of the best docking results. For TCM de novo drug design, iScreen provides multiple molecular descriptors for a user's interest. iScreen is the world's first web server that employs the world's largest TCM database for virtual screening and de novo drug design. We believe our web server can lead TCM research into a new era of drug development. The TCM docking and screening server is available at http://iScreen.cmu.edu.tw/.

  5. The evolution of internet-based map server applications in the United States Department of Agriculture, Veterinary Services.

    PubMed

    Maroney, Susan A; McCool, Mary Jane; Geter, Kenneth D; James, Angela M

    2007-01-01

    The internet is used increasingly as an effective means of disseminating information. For the past five years, the United States Department of Agriculture (USDA) Veterinary Services (VS) has published animal health information in internet-based map server applications, each oriented to a specific surveillance or outbreak response need. Using internet-based technology allows users to create dynamic, customised maps and perform basic spatial analysis without the need to buy or learn desktop geographic information systems (GIS) software. At the same time, access can be restricted to authorised users. The VS internet mapping applications to date are as follows: Equine Infectious Anemia Testing 1972-2005, National Tick Survey tick distribution maps, the Emergency Management Response System-Mapping Module for disease investigations and emergency outbreaks, and the Scrapie mapping module to assist with the control and eradication of this disease. These services were created using Environmental Systems Research Institute (ESRI)'s internet map server technology (ArcIMS). Other leading technologies for spatial data dissemination are ArcGIS Server, ArcEngine, and ArcWeb Services. VS is prototyping applications using these technologies, including the VS Atlas of Animal Health Information using ArcGIS Server technology and the Map Kiosk using ArcEngine for automating standard map production in the case of an emergency.

  6. LocExpress: a web server for efficiently estimating expression of novel transcripts.

    PubMed

    Hou, Mei; Tian, Feng; Jiang, Shuai; Kong, Lei; Yang, Dechang; Gao, Ge

    2016-12-22

    The temporally and spatially specific expression pattern of a transcript across multiple tissues and cell types can provide key clues about its function. While several gene atlases are available online as pre-computed databases for known gene models, it is still challenging to obtain expression profiles for previously uncharacterized (i.e., novel) transcripts efficiently. Here we developed LocExpress, a web server for efficiently estimating expression of novel transcripts across multiple tissues and cell types in human (20 normal tissues/cell types and 14 cell lines) as well as in mouse (24 normal tissues/cell types and nine cell lines). As a wrapper around an RNA-Seq quantification algorithm, LocExpress reduces the time cost by confining abundance estimation calls to the minimum spanning bundle region of the input transcripts. For a given novel gene model, such a local, context-oriented strategy allows LocExpress to estimate its FPKMs in hundreds of samples within minutes on a standard Linux box, making an online web server possible. To the best of our knowledge, LocExpress is the only web server to provide nearly real-time expression estimation for novel transcripts in common tissues and cell types. The server is publicly available at http://loc-express.cbi.pku.edu.cn .

  7. Microsoft in Southeast Europe: A Conversation with Goran Radman

    ERIC Educational Resources Information Center

    Pendergast, William; Frayne, Colette; Kelley, Patricia

    2009-01-01

    Goran Radman (GR) joined Microsoft in 1996 and served until Fall 2008 as Microsoft Chairman, Southeast Europe (SEE) and Chairman, East and Central Europe (ECEE). Based in Croatia, where he enjoys sailing the Adriatic coast and islands, he spoke with the authors during 2008 and 2009 about his experience launching Microsoft's commercial presence in…

  8. Microsoft's Tom Corddry on Multimedia, the Information Superhighway and the Future of Online.

    ERIC Educational Resources Information Center

    Herther, Nancy K.

    1994-01-01

    Tom Corddry, Microsoft Corporation's Creative Director for the Consumer Division, is interviewed about the Microsoft Home line of products and the development of related CD-ROM and multimedia products. Reasons for Microsoft's entry into the content market and its challenges, the market's future, and the company's interest in developing online…

  9. Are pitch and roll compensations required in all pathologies? A data analysis of 2945 fractions.

    PubMed

    Mancosu, Pietro; Reggiori, Giacomo; Gaudino, Anna; Lobefalo, Francesca; Paganini, Lucia; Palumbo, Valentina; Stravato, Antonella; Tomatis, Stefano; Scorsetti, Marta

    2015-01-01

    New linear accelerators can be equipped with a 6D robotic couch, providing two additional rotational motion axes: pitch and roll. These shifts in kilovoltage cone-beam CT (kV-CBCT) image-guided radiotherapy (IGRT) were evaluated over the first 6 months of usage of a 6D robotic couch-top, ranking the treatment sites for which the two compensations contribute most to patient set-up. The couch compensations of 2945 fractions for 376 consecutive patients treated on the PerfectPitch™ 6D couch (Varian® Medical Systems, Palo Alto, CA) were analysed. Among these patients, 169 were treated for brain, 111 for lung, 54 for liver, 26 for pancreas and 16 for prostate tumours. During set-up, patient anatomy from the planning CT was aligned to the kV-CBCT, and 6D movements were executed. Information related to pitch and roll was extracted by querying the Microsoft® SQL Server (Microsoft Corporation, Redmond, WA) ARIA database (Varian Medical Systems). Mean values and standard deviations were calculated for all sites. The Kolmogorov-Smirnov (KS) test was performed. Considering all the data, mean pitch and roll adjustments were -0.10° ± 0.92° and 0.12° ± 0.96°, respectively; mean absolute values for the two adjustments were 0.58° ± 0.69° and 0.69° ± 0.72°, respectively. Brain treatments showed the highest mean absolute values for pitch and roll rotations (0.73° ± 0.69° and 0.80° ± 0.78°, respectively); the lowest values of 0.36° ± 0.47° and 0.49° ± 0.58° were found for pancreas. The KS test was significant for brain vs liver, pancreas and prostate. Collective corrections (pitch + roll) >0.5°, >1.0° and >2.0° were observed in, respectively, 79.8%, 61.0% and 29.1% of fractions for brain and 56.7%, 39.4% and 6.7% for pancreas. Adjustments in all six dimensions, including unconventional pitch and roll rotations, improve patient set-up in all treatment sites. The greatest improvement was observed for patients with brain tumours. To our knowledge, this is the first systematic evaluation of the clinical efficacy of a 6D robotic couch-top in CBCT IGRT over different tumour regions.
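
    The extraction step described above amounts to an aggregate query against the record-and-verify database. A hypothetical sketch in Python with pyodbc; the table and column names are invented, since the actual ARIA schema differs:

        # Hypothetical sketch of the aggregate query; the ARIA schema is
        # proprietary, so the table and column names below are invented.
        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=aria-db;DATABASE=variansystem;Trusted_Connection=yes"
        )
        cur = conn.cursor()
        cur.execute(
            "SELECT AVG(PitchDeg), STDEV(PitchDeg), AVG(RollDeg), STDEV(RollDeg) "
            "FROM CouchCorrection "       # invented table of 6D couch corrections
            "WHERE TreatmentSite = ?",
            "BRAIN",
        )
        print(cur.fetchone())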

  10. CrossQuery: a web tool for easy associative querying of transcriptome data.

    PubMed

    Wagner, Toni U; Fischer, Andreas; Thoma, Eva C; Schartl, Manfred

    2011-01-01

    Enormous amounts of data are being generated by modern methods such as transcriptome or exome sequencing and microarray profiling. Primary analyses such as quality control, normalization, statistics and mapping are highly complex and need to be performed by specialists. Thereafter, results are handed back to biomedical researchers, who are then confronted with complicated data lists. For rather simple tasks like data filtering, sorting and cross-association, there is a need for new tools that can be used by non-specialists. Here, we describe CrossQuery, a web tool that enables straightforward, simple-syntax queries to be executed on transcriptome sequencing and microarray datasets. We provide deep-sequencing data sets of stem cell lines derived from the model fish Medaka and microarray data of human endothelial cells. In the example datasets provided, mRNA expression levels, gene, transcript and sample identification numbers, GO terms and gene descriptions can be freely correlated, filtered and sorted. Queries can be saved for later reuse, and results can be exported to standard formats that allow copy-and-paste into widespread data visualization tools such as Microsoft Excel. CrossQuery enables researchers to quickly and freely work with transcriptome and microarray data sets, requiring only minimal computer skills. Furthermore, CrossQuery allows growing association of multiple datasets as long as at least one common point of correlated information, such as transcript identification numbers or GO terms, is shared between samples. For advanced users, the object-oriented plug-in and event-driven code design of both server-side and client-side scripts allows easy addition of new features, data sources and data types.
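
    The kind of associative query CrossQuery exposes can be sketched with pandas as an illustration of the concept (not CrossQuery's own code); file and column names are hypothetical:

        # Illustration of the associative-query concept (not CrossQuery's
        # code); file and column names are hypothetical.
        import pandas as pd

        rnaseq = pd.read_csv("medaka_stemcell_counts.csv")   # transcript_id, fpkm
        arrays = pd.read_csv("hu_endothelial_array.csv")     # transcript_id, go_term

        merged = rnaseq.merge(arrays, on="transcript_id")    # cross-association
        hits = (merged[merged["fpkm"] > 10]                  # filtering
                .sort_values("fpkm", ascending=False))       # sorting
        hits.to_csv("hits.csv", index=False)                 # copy-and-paste export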

  11. Experience with an online prospective database on adolescent idiopathic scoliosis: development and implementation.

    PubMed

    Arlet, Vincent; Shilt, Jeffrey; Bersusky, Ernesto; Abel, Mark; Ouellet, Jean Albert; Evans, Davis; Menon, K V; Kandziora, Frank; Shen, Frank; Lamartina, Claudio; Adams, Marc; Reddi, Vasantha

    2008-11-01

    Considerable variability exists in the surgical treatment and outcomes of adolescent idiopathic scoliosis (AIS), due to the lack of evidence-based treatment guidelines and outcome measures. Although clinical trials have been extolled as the highest form of evidence for evaluating treatment efficacy, the disadvantages of cost, time, lack of feasibility, and ethical considerations indicate a need for a new paradigm for evidence-based research in this spinal deformity. High-quality clinical databases offer an alternative approach for evidence-based research in medicine. We therefore developed and established Scolisoft, an international, multidimensional and relational database designed to be a repository of surgical cases for AIS and an active vehicle for standardized surgical information in a format that permits qualitative and quantitative research and analysis. Here, we describe and discuss the utility of Scolisoft as a new paradigm for evidence-based research on AIS. Scolisoft was developed using the Microsoft .NET platform and SQL Server. All data are de-identified to protect patient privacy. Scolisoft can be accessed at www.scolisoft.org. Collection of high-quality data on surgical cases of AIS is a priority, and processes continue to improve the database quality. The database currently has 67 registered users from 21 countries. To date, Scolisoft has 200 detailed surgical cases with pre-operative, post-operative, and follow-up data. Scolisoft provides a structured process and practical information for surgeons to benchmark their treatment methods against other like treatments. Scolisoft is multifaceted, and its uses extend to educating healthcare providers in training and patients, mining important data to stimulate research, and supporting quality improvement initiatives of healthcare organizations.

  12. SU-E-T-255: Development of a Michigan Quality Assurance (MQA) Database for Clinical Machine Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, D

    Purpose: A unified database system was developed to allow accumulation, review and analysis of quality assurance (QA) data for measurement, treatment, imaging and simulation equipment in our department. Recording these data in a database allows a unified and structured approach to review and analysis of data gathered using commercial database tools. Methods: A clinical database was developed to track records of quality assurance operations on linear accelerators, a computed tomography (CT) scanner, a high dose rate (HDR) afterloader and imaging systems such as on-board imaging (OBI) and Calypso in our department. The database was developed using the Microsoft Access database and the Visual Basic for Applications (VBA) programming interface. Separate modules were written for accumulation, review and analysis of daily, monthly and annual QA data. All modules were designed to use structured query language (SQL) as the basis of data accumulation and review; the SQL strings are dynamically re-written at run time. The database also features embedded documentation, storage of documents produced during QA activities and the ability to annotate all data within the database. Tests are defined in a set of tables that specify test type, specific value, and schedule. Results: Daily, monthly and annual QA data have been taken in parallel with established procedures to test MQA. The database has been used to aggregate data across machines to examine the consistency of machine parameters and operations within the clinic for several months. Conclusion: The MQA application has been developed as an interface to a commercially available SQL engine (JET 5.0) and a standard database back-end. The MQA system has been used for several months for routine data collection. The system is robust, relatively simple to extend and can be migrated to a commercial SQL server.
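
    The dynamic SQL rewriting described above can be sketched outside VBA as well. A minimal Python/sqlite3 illustration, with invented table and column names, that rebuilds the WHERE clause at run time from the user's selections:

        # A minimal sketch, assuming an invented qa_results table; it rebuilds
        # the WHERE clause at run time, mirroring the MQA modules that rewrite
        # their SQL strings on the fly.
        import sqlite3

        ALLOWED = {"machine", "schedule", "test_name"}   # fixed set of filter columns

        def build_query(filters):
            clauses, params = [], []
            for column, value in filters.items():
                if column not in ALLOWED:                # never splice raw user text
                    raise ValueError(column)
                clauses.append(column + " = ?")
                params.append(value)
            where = (" WHERE " + " AND ".join(clauses)) if clauses else ""
            sql = ("SELECT test_name, measured_value, tolerance, run_date "
                   "FROM qa_results" + where + " ORDER BY run_date")
            return sql, params

        conn = sqlite3.connect("mqa.db")
        sql, params = build_query({"machine": "Linac1", "schedule": "monthly"})
        rows = conn.execute(sql, params).fetchall()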

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCord, Jason

    WLS gathers all known relevant contextual data along with standard event-log information, processes it into an easily consumable format for analysis by third-party tools, and forwards the logs to any compatible log server.

  14. Method of recommending items to a user based on user interest

    DOEpatents

    Bollen, John; Van De Sompel, Herbert

    2013-11-05

    Although recording of usage data is common in scholarly information services, its exploitation for the creation of value-added services remains limited due to concerns regarding, among others, user privacy, data validity, and the lack of accepted standards for the representation, sharing and aggregation of usage data. A technical, standards-based architecture for sharing usage information is presented. In this architecture, OpenURL-compliant linking servers aggregate usage information of a specific user community as it navigates the distributed information environment that it has access to. This usage information is made OAI-PMH harvestable so that usage information exposed by many linking servers can be aggregated to facilitate the creation of value-added services with a reach beyond that of a single community or a single information service.

  15. Usage based indicators to assess the impact of scholarly works: architecture and method

    DOEpatents

    Bollen, Johan [Santa Fe, NM]; Van De Sompel, Herbert [Santa Fe, NM]

    2012-03-13

    Although recording of usage data is common in scholarly information services, its exploitation for the creation of value-added services remains limited due to concerns regarding, among others, user privacy, data validity, and the lack of accepted standards for the representation, sharing and aggregation of usage data. A technical, standards-based architecture for sharing usage information is presented. In this architecture, OpenURL-compliant linking servers aggregate usage information of a specific user community as it navigates the distributed information environment that it has access to. This usage information is made OAI-PMH harvestable so that usage information exposed by many linking servers can be aggregated to facilitate the creation of value-added services with a reach beyond that of a single community or a single information service.
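
    The harvesting step in this architecture uses the standard OAI-PMH protocol, so a client needs nothing more than an HTTP request with the ListRecords verb. A sketch against a hypothetical linking-server endpoint:

        # Sketch of harvesting usage records exposed via OAI-PMH. The base URL
        # is hypothetical; the verb and parameters are standard OAI-PMH.
        import requests
        import xml.etree.ElementTree as ET

        BASE = "https://resolver.example.edu/oai"        # hypothetical endpoint
        ns = {"oai": "http://www.openarchives.org/OAI/2.0/"}

        resp = requests.get(BASE, params={
            "verb": "ListRecords",
            "metadataPrefix": "oai_dc",                  # Dublin Core, mandatory in OAI-PMH
        })
        root = ET.fromstring(resp.content)
        for rec in root.findall(".//oai:record", ns):
            print(rec.find("oai:header/oai:identifier", ns).text)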

  16. Enhanced, Partially Redundant Emergency Notification System

    NASA Technical Reports Server (NTRS)

    Pounds, Clark D.

    2005-01-01

    The Johnson Space Center Emergency Notification System (JENS) software utilizes pre-existing computation and communication infrastructure to augment a prior variable-tone, siren-based, outdoor alarm system, in order to enhance the ability to give notice of emergencies to employees working in multiple buildings. The JENS software includes a component that implements an administrative Web site. Administrators can grant and deny access to the administrative site and to an originator Web site that enables authorized individuals to quickly compose and issue alarms. The originator site also facilitates maintenance and review of alarms already issued. A custom client/server application program enables an originator to notify every user who is logged in on a Microsoft Windows-based desktop computer by means of a pop-up message that interrupts, but does not disrupt, the user's work. Alternatively or in addition, the originator can send an alarm message to recipients on an e-mail distribution list and/or can post the notice on an internal Web site. An alarm message can consist of (1) text describing the emergency and suggesting a course of action and (2) a replica of the corresponding audible outdoor alarm.

  17. A practical approach for inexpensive searches of radiology report databases.

    PubMed

    Desjardins, Benoit; Hamilton, R Curtis

    2007-06-01

    We present a method to perform full-text searches of radiology reports for the large number of departments that do not have this ability as part of their radiology or hospital information system. A tool written in Microsoft Access (front end) has been designed to search a server (back end) containing the indexed weekly backup copy of the full relational database extracted from a radiology information system (RIS). This front-end/back-end approach has been implemented in a large academic radiology department, and is used for teaching, research and administrative purposes. The weekly second backup of the 80 GB, 4-million-record RIS database takes 2 hours. Further indexing of the exported radiology reports takes 6 hours. Individual searches typically take less than 1 minute on the indexed database and 30-60 minutes on the non-indexed database. Guidelines to properly address privacy and institutional review board issues are closely followed by all users. This method has the potential to improve teaching, research, and administrative programs within radiology departments that cannot afford more expensive technology.
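
    As an analogous sketch of the indexed full-text idea (not the paper's Access implementation), SQLite's FTS5 extension shows why the indexed copy answers in seconds while the raw database takes minutes; the field names are invented:

        # Sketch using SQLite's FTS5 full-text index; field names invented.
        import sqlite3

        conn = sqlite3.connect("reports.db")
        conn.execute("CREATE VIRTUAL TABLE IF NOT EXISTS reports "
                     "USING fts5(accession, report_text)")
        conn.execute("INSERT INTO reports VALUES (?, ?)",
                     ("RAD-0001", "No evidence of pneumothorax. Mild cardiomegaly."))
        conn.commit()

        # The inverted index answers this in milliseconds; scanning the raw,
        # non-indexed table would read every report instead.
        for (accession,) in conn.execute(
                "SELECT accession FROM reports WHERE reports MATCH ?",
                ("pneumothorax",)):
            print(accession)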

  18. Schedule-Aware Workflow Management Systems

    NASA Astrophysics Data System (ADS)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this, a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.

  19. Home media server content management

    NASA Astrophysics Data System (ADS)

    Tokmakoff, Andrew A.; van Vliet, Harry

    2001-07-01

    With the advent of set-top boxes, the convergence of TV (broadcasting) and PC (Internet) is set to enter the home environment. Currently, a great deal of activity is occurring in developing standards (TV-Anytime Forum) and devices (TiVo) for local storage on Home Media Servers (HMS). These devices lie at the heart of convergence of the triad: communications/networks - content/media - computing/software. Besides massive storage capacity and being a communications 'gateway', the home media server is characterised by the ability to handle metadata and software that provides an easy-to-use on-screen interface and intelligent search/content-handling facilities. In this paper, we describe a research prototype HMS that is being developed within the GigaCE project at the Telematica Instituut. Our prototype demonstrates advanced search and retrieval (video browsing), adaptive user profiling and an innovative 3D component of the Electronic Program Guide (EPG) which represents online presence. We discuss the use of MPEG-7 for representing metadata, the use of MPEG-21 working draft standards for content identification, description and rights expression, and the use of HMS peer-to-peer content distribution approaches. Finally, we outline explorative user behaviour experiments that aim to investigate the effectiveness of the prototype HMS during development.

  20. TreeVector: scalable, interactive, phylogenetic trees for the web.

    PubMed

    Pethica, Ralph; Barker, Gary; Kovacs, Tim; Gough, Julian

    2010-01-28

    Phylogenetic trees are complex data forms that need to be graphically displayed to be human-readable. Traditional techniques of plotting phylogenetic trees focus on rendering a single static image, but increases in the production of biological data and large-scale analyses demand scalable, browsable, and interactive trees. We introduce TreeVector, a Scalable Vector Graphics- and Java-based method that allows trees to be integrated and viewed seamlessly in standard web browsers with no extra software required; trees can be modified and linked using standard web technologies. There are now many bioinformatics servers and databases with a range of dynamic processes and updates to cope with the increasing volume of data. TreeVector is designed as a framework to integrate with these processes and produce user-customized phylogenies automatically. We also address the strengths of phylogenetic trees as part of a linked-in browsing process rather than an end graphic for print. TreeVector is fast and easy to use and is available to download precompiled, but is also open source. It can also be run from the web server listed below or the user's own web server. It has already been deployed on two recognized and widely used database websites.

  1. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  2. The USGODAE Monterey Data Server

    NASA Astrophysics Data System (ADS)

    Sharfstein, P. J.; Dimitriou, D.; Hankin, S. C.

    2004-12-01

    With oversight from the U.S. Global Ocean Data Assimilation Experiment (GODAE) Steering Committee and funding from the Office of Naval Research, the USGODAE Monterey Data Server has been established at the Fleet Numerical Meteorology and Oceanography Center (FNMOC) as an explicit U.S. contribution to GODAE. Support of the Monterey Data Server is accomplished by a cooperative effort between FNMOC and NOAA's Pacific Marine Environmental Laboratory (PMEL) in the on-going development of the server and the support of a collaborative network of GODAE assimilation groups. This server hosts near real-time in-situ oceanographic data, atmospheric forcing fields suitable for driving ocean models, and unique GODAE data sets, including demonstration ocean model products. GODAE is envisioned as a global system of observations, communications, modeling and assimilation, which will deliver regular, comprehensive information on the state of the oceans in a way that will promote and engender wide utility and availability of this resource for maximum benefit to society. It aims to make ocean monitoring and prediction a routine activity in a manner similar to weather forecasting. GODAE will contribute to an information system for the global ocean that will serve interests from climate and climate change to ship routing and fisheries. The USGODAE Server is developed and operated as a prototypical node for this global information system. Because of the broad range and diverse formats of data used by the GODAE community, presenting data with a consistent interface and ensuring its availability in standard formats is a primary challenge faced by the USGODAE Server project. To this end, all USGODAE data sets are available via HTTP and FTP. In addition, USGODAE data are served using Local Data Manager (LDM), THREDDS cataloging, OPeNDAP, and Live Access Server (LAS) from PMEL. Every effort is made to serve USGODAE data through the standards specified by the National Virtual Ocean Data System (NVODS) and the Integrated Ocean Observing System Data Management and Communications (IOOS/DMAC). To provide surface forcing, fluxes, and boundary conditions for ocean model research, USGODAE serves global data from the Navy Operational Global Atmospheric Prediction System (NOGAPS) and regional data from the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS). Global meteorological data and observational data from the FNMOC Ocean QC process are posted in near real-time to USGODAE. These include T/S profiles, in-situ and satellite sea surface temperature (SST), satellite altimetry, and SSM/I sea ice. They contain all of the unclassified in-situ and satellite observations used to initialize the FNMOC NOGAPS model. Also, the Naval Oceanographic Office provides daily satellite SST and SSH retrievals to USGODAE. The USGODAE Server functions as one of two Argo Global Data Assembly Centers (GDACs), hosting the complete collection of quality-controlled Argo T/S profiling float data. USGODAE Argo data are served through OPeNDAP and LAS, providing complete integration into NVODS and the IOOS/DMAC. Due to its high reliability, ease of data access, and increasing breadth of data, the USGODAE Server is becoming an invaluable resource for both the GODAE community and the general oceanographic community. Continued integration of model, forcing, and in-situ data sets from providers throughout the world is making the USGODAE Monterey Data Server a key part of the international GODAE project.
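
    As a concrete illustration of the OPeNDAP access path mentioned above, the netCDF4-python library opens a served dataset lazily, so only the requested subset crosses the network; the URL and variable name below are hypothetical:

        # Sketch of lazy OPeNDAP access; URL and variable name are hypothetical.
        from netCDF4 import Dataset

        url = "https://usgodae.example.org/dods/nogaps/surface_winds"
        ds = Dataset(url)                    # opens remotely, reads only metadata
        tau = ds.variables["wind_stress"]
        print(tau[0, 100:110, 200:210])      # only this slab crosses the network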

  3. EarthServer - an FP7 project to enable the web delivery and analysis of 3D/4D models

    NASA Astrophysics Data System (ADS)

    Laxton, John; Sen, Marcus; Passmore, James

    2013-04-01

    EarthServer aims at open access and ad-hoc analytics on big Earth Science data, based on the OGC geoservice standards Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS). The WCS model defines "coverages" as a unifying paradigm for multi-dimensional raster data, point clouds, meshes, etc., thereby addressing a wide range of Earth Science data including 3D/4D models. WCPS allows declarative SQL-style queries on coverages. The project is developing a pilot implementing these standards, and will also investigate the use of GeoSciML to describe coverages. Integration of WCPS with XQuery will in turn allow coverages to be queried in combination with their metadata and GeoSciML descriptions. The unified service will support navigation, extraction, aggregation, and ad-hoc analysis of coverage data via SQL-style queries. Clients will range from mobile devices to high-end immersive virtual reality, and will enable 3D model visualisation using web browser technology coupled with developing web standards. EarthServer is establishing open-source client and server technology intended to be scalable to Petabyte/Exabyte volumes, based on distributed processing, supercomputing, and cloud virtualization. Implementation will be based on the existing rasdaman server technology. Services using rasdaman technology are being installed serving the atmospheric, oceanographic, geological, cryospheric, planetary and general earth observation communities. The geology service (http://earthserver.bgs.ac.uk/) is being provided by BGS and at present includes satellite imagery, superficial thickness data, onshore DTMs and 3D models for the Glasgow area. It is intended to extend the available data sets to include 3D voxel models. Use of the WCPS standard allows queries to be constructed against single or multiple coverages. For example, on a single coverage, data can be selected for a particular area or for a particular range of pixel values. Queries on multiple surfaces can be constructed to calculate, for example, the thickness between two surfaces in a 3D model or the depth from the ground surface to the top of a particular geologic unit. In the first version of the service a simple interface showing some example queries has been implemented in order to show the potential of the technologies. The project aims to develop the services in light of user feedback, both in terms of the data available, the functionality and the interface. User feedback on the services guides the software and standards development aspects of the project, leading to enhanced versions of the software which will be implemented in upgraded versions of the services during the lifetime of the project.
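
    As an illustration of the kind of multi-coverage WCPS query described above, the sketch below computes the thickness between two model surfaces. The query follows WCPS 1.0 syntax, but the endpoint path, coverage names and output format are hypothetical:

        # Illustrative WCPS query: thickness between two model surfaces.
        # Endpoint path and coverage names are hypothetical assumptions.
        import requests

        query = ('for top in (glasgow_rockhead), base in (glasgow_base_till) '
                 'return encode(top - base, "tiff")')
        resp = requests.get("http://earthserver.bgs.ac.uk/rasdaman/ows",
                            params={"service": "WCS", "version": "2.0.1",
                                    "request": "ProcessCoverages", "query": query})
        with open("thickness.tif", "wb") as f:
            f.write(resp.content)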

  4. 75 FR 14401 - Amendment of Certain of the Commission's Rules of Practice and Procedure and Rules of Commission...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-25

    ... were created, such as Microsoft Excel, Microsoft Word, or Microsoft PowerPoint (``native format'')? We... (condensed) or expanded (detailed) format Export search results to Excel or PDF As noted above, system is...., Microsoft Word ``.doc'' format or non-copy protected text- searchable ``.pdf'' format)? Should submissions...

  5. Experience of public procurement of Open Compute servers

    NASA Astrophysics Data System (ADS)

    Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony

    2015-12-01

    The Open Compute Project (OCP, http://www.opencompute.org/) was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal to develop servers and data centres following the model traditionally associated with open source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large scale installation. One objective is to evaluate if the OCP market is sufficiently mature and broad enough to meet the constraints of a public procurement. This paper summarizes this procurement, which started in September 2014 and involved a Request for Information (RFI) to qualify bidders and a Request for Tender (RFT).

  6. Image-based electronic patient records for secured collaborative medical applications.

    PubMed

    Zhang, Jianguo; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Yao, Yihong; Cai, Weihua; Jin, Jin; Zhang, Guozhen; Sun, Kun

    2005-01-01

    We developed a Web-based system to interactively display image-based electronic patient records (EPR) for secured intranet and Internet collaborative medical applications. The system consists of four major components: the EPR DICOM gateway (EPR-GW), the image-based EPR repository server (EPR-Server), the Web server and the EPR DICOM viewer (EPR-Viewer). In the EPR-GW and EPR-Viewer, security modules for digital signature and authentication are integrated to perform security processing on the EPR data with integrity and authenticity. The privacy of EPR data in communication and exchange is provided by SSL/TLS-based secure communication. This presentation gives a new approach to creating and managing image-based EPR from actual patient records, and also presents a way to use Web technology and the DICOM standard to build an open architecture for collaborative medical applications.

  7. DICOM-compliant PACS with CD-based image archival

    NASA Astrophysics Data System (ADS)

    Cox, Robert D.; Henri, Christopher J.; Rubin, Richard K.; Bret, Patrice M.

    1998-07-01

    This paper describes the design and implementation of a low-cost PACS conforming to the DICOM 3.0 standard. The goal was to provide an efficient image archival and management solution on a heterogeneous hospital network as a basis for filmless radiology. The system follows a distributed, client/server model and was implemented at a fraction of the cost of a commercial PACS. It provides reliable archiving on recordable CD and allows access to digital images throughout the hospital and on the Internet. Dedicated servers have been designed for short-term storage, CD-based archival, data retrieval and remote data access or teleradiology. The short-term storage devices provide DICOM storage and query/retrieve services to scanners and workstations and approximately twelve weeks of 'on-line' image data. The CD-based archival and data retrieval processes are fully automated with the exception of CD loading and unloading. The system employs lossless compression on both short- and long-term storage devices. All servers communicate via the DICOM protocol in conjunction with both local and 'master' SQL patient databases. Records are transferred from the local to the master database independently, ensuring that storage devices will still function if the master database server cannot be reached. The system features rules-based workflow management and WWW servers to provide multi-platform remote data access. The WWW server system is distributed on the storage, retrieval and teleradiology servers, allowing viewing of locally stored image data directly in a WWW browser without the need for data transfer to a central WWW server. An independent system monitors disk usage, processes, network and CPU load on each server and reports errors to the image management team via email. The PACS was implemented using a combination of off-the-shelf hardware, freely available software and applications developed in-house. The system has enabled filmless operation in CT, MR and ultrasound within the radiology department and throughout the hospital. The use of WWW technology has enabled the development of an intuitive web-based teleradiology and image management solution that provides complete access to image data.
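
    For a modern illustration of the DICOM storage service these servers implement, the sketch below sends one image to a storage server with the pynetdicom library (not the authors' in-house code); the host, port and AE titles are placeholders:

        # Sketch of a DICOM C-STORE to a short-term storage server using
        # pynetdicom; host, port and AE titles are placeholders.
        from pydicom import dcmread
        from pynetdicom import AE
        from pynetdicom.sop_class import CTImageStorage

        ae = AE(ae_title="MODALITY")
        ae.add_requested_context(CTImageStorage)
        assoc = ae.associate("pacs.example.org", 11112, ae_title="STORE_SCP")
        if assoc.is_established:
            status = assoc.send_c_store(dcmread("slice001.dcm"))
            print("C-STORE status: 0x%04x" % status.Status)
            assoc.release()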

  8. Workflow opportunities using JPEG 2000

    NASA Astrophysics Data System (ADS)

    Foshee, Scott

    2002-11-01

    JPEG 2000 is a new image compression standard from ISO/IEC JTC1 SC29 WG1, the Joint Photographic Experts Group (JPEG) committee. Better thought of as a sibling to JPEG rather than a descendant, the JPEG 2000 standard offers wavelet-based compression as well as companion file formats and related standardized technology. This paper examines the JPEG 2000 standard for features in four specific areas that enable image workflows: compression, file formats, client-server interaction, and conformance/compliance.

  9. Availability of the OGC geoprocessing standard: March 2011 reality check

    NASA Astrophysics Data System (ADS)

    Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier

    2012-10-01

    This paper presents an investigation of the servers available in March 2011 conforming to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification provides support for standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check of the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test if the advances in the use of search engines and focused crawlers for finding Web services can be applied to finding geoscience processing systems. The research results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. These results also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
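
    The probe such a crawler performs on each candidate URL is essentially a standard OGC GetCapabilities request. A minimal sketch, with a hypothetical server URL:

        # Sketch of probing a candidate URL with a WPS GetCapabilities request;
        # the server URL is hypothetical, the KVP parameters are standard OGC.
        import requests
        import xml.etree.ElementTree as ET

        url = "http://gis.example.org/wps"
        resp = requests.get(url, params={"service": "WPS",
                                         "request": "GetCapabilities",
                                         "AcceptVersions": "1.0.0"},
                            timeout=10)
        root = ET.fromstring(resp.content)
        # A conforming WPS 1.0.0 server answers with a wps:Capabilities document.
        print(root.tag.endswith("Capabilities"))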

  10. Telemedicine with integrated data security in ATM-based networks

    NASA Astrophysics Data System (ADS)

    Thiel, Andreas; Bernarding, Johannes; Kurth, Ralf; Wenzel, Rudiger; Villringer, Arno; Tolxdorff, Thomas

    1997-05-01

    Telemedical services rely on the digital transfer of large amounts of data in a short time. The acceptance of these services therefore requires new hardware and software concepts. The fast exchange of data is well performed within a high-speed ATM-based network. Fast access to the data from different platforms poses more difficult problems, which may be divided into those relating to standardized data formats and those relating to different levels of data security across nations. For standardized access to the image data, a DICOM 3.0 server was implemented. Images were converted into the DICOM 3.0 standard if necessary. Access to the server is provided by an implementation of DICOM in JAVA, allowing access to the data from different platforms. Data protection measures to ensure the secure transfer of sensitive patient data are not yet solved within the DICOM concept. We investigated different schemes to protect data using the DICOM/JAVA modality with as little impact on data transfer speed as possible.

  11. Using Microsoft Access: A How-To-Do-It Manual for Librarians. How-To-Do-It Manuals for Librarians, Number 76.

    ERIC Educational Resources Information Center

    Butler, E. Sonny

    Much of what librarians do today requires adeptness in creating and manipulating databases. Many new computers bought by libraries every year come packaged with Microsoft Office and include Microsoft Access. This database program features a seamless interface between Microsoft Office's other programs like Word, Excel, and PowerPoint. This book…

  12. Improving the Plasticity of LIMS Implementation: LIMS Extension through Microsoft Excel

    NASA Technical Reports Server (NTRS)

    Culver, Mark

    2017-01-01

    A Laboratory Information Management System (LIMS) is database software with many built-in tools ideal for handling and documenting most laboratory processes in an accurate and consistent manner, making it an indispensable tool for the modern laboratory. However, many LIMS end users will find that for analyses with unique considerations, such as standard curves, multi-stage incubations, or logical considerations, a base LIMS distribution may not ideally suit their needs. These considerations bring about the need for extension languages, which can extend the functionality of a LIMS. While these languages do provide the implementation team the functionality required to accommodate such special laboratory analyses, they are usually too complex for the end user to modify to compensate for natural changes in laboratory operations. The LIMS utilized by our laboratory offers a unique and easy-to-use choice for an extension language, one that is already heavily relied upon not only in science but also in most academic and business pursuits: Microsoft Excel. The validity of Microsoft Excel as a pseudo programming language and its usability and versatility as a LIMS extension language will be discussed. The NELAC implications and overall drawbacks of this LIMS configuration will also be discussed.

  13. 3D Capturing Performances of Low-Cost Range Sensors for Mass-Market Applications

    NASA Astrophysics Data System (ADS)

    Guidi, G.; Gonizzi, S.; Micoli, L.

    2016-06-01

    Since the advent of the first Kinect as a motion controller device for the Microsoft XBOX platform (November 2010), several similar active and low-cost range sensing devices have been introduced on the mass market for various purposes, including gesture-based interfaces, 3D multimedia interaction, robot navigation, finger tracking, 3D body scanning for garment design and proximity sensors for automotive applications. However, given their capability to generate a real-time stream of range images, these devices have also been used in some projects as general-purpose range devices, with performances that for some applications might be satisfying. This paper shows the working principle of the various devices, analyzing them in terms of systematic errors and random errors in order to explore their applicability to standard 3D capturing problems. Five actual devices have been tested, featuring three different technologies: (i) the Kinect V1 by Microsoft, the Structure Sensor by Occipital, and the Xtion PRO by ASUS, all based on different implementations of the PrimeSense sensor; (ii) the F200 by Intel/Creative, implementing the RealSense pattern projection technology; and (iii) the Kinect V2 by Microsoft, equipped with the Canesta TOF camera. A critical analysis of the results first compares the devices with one another, and then focuses on the range of applications for which such devices could actually work as a viable solution.

  14. The valuable use of Microsoft Kinect™ sensor 3D kinematic in the rehabilitation process in basketball

    NASA Astrophysics Data System (ADS)

    Braidot, Ariel; Favaretto, Guillermo; Frisoli, Melisa; Gemignani, Diego; Gumpel, Gustavo; Massuh, Roberto; Rayan, Josefina; Turin, Matías

    2016-04-01

    Subjects who practice sports, either as professionals or amateurs, have a high incidence of knee injuries. There are few publications that study, from a kinematic point of view, lateral-structure knee injuries, including meniscal tears or chondral injury, without anterior cruciate ligament rupture. The use of standard motion capture systems for measuring outdoor sports is hard to implement due to many operational reasons. The recently released Microsoft Kinect™ is a sensor that was developed to track movements for gaming purposes and has seen increasing use in clinical applications. The fact that this device is a simple and portable tool allows the acquisition of data on common sport movements in the field. The development and testing of a set of protocols for 3D kinematic measurement using the Microsoft Kinect™ system is presented in this paper. The 3D kinematic evaluation algorithms were developed from the information available with the use of Microsoft's Software Development Kit (SDK) 1.8. Along with this, an algorithm for calculating the lower-limb joint angles was implemented. Thirty healthy adult volunteers were measured, using five different recording protocols for sport-characteristic gestures which involve high knee injury risk in athletes.
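
    A minimal sketch of the kind of joint-angle calculation such an algorithm performs, given three 3D joint positions (hip, knee, ankle) as returned by the Kinect SDK; the coordinates below are placeholders, and the real protocols involve calibration and filtering:

        # Sketch of a joint-angle calculation from three 3D joint positions;
        # coordinates are placeholders for SDK skeleton output.
        import numpy as np

        def joint_angle(a, b, c):
            """Angle at b (degrees) between segments b->a and b->c."""
            u = np.asarray(a) - np.asarray(b)
            v = np.asarray(c) - np.asarray(b)
            cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

        hip, knee, ankle = (0.10, 0.90, 2.40), (0.12, 0.50, 2.40), (0.15, 0.10, 2.45)
        print(joint_angle(hip, knee, ankle))   # close to 180 deg for a straight leg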

  15. LabKey Server NAb: A tool for analyzing, visualizing and sharing results from neutralizing antibody assays

    PubMed Central

    2011-01-01

    Background: Multiple types of assays allow sensitive detection of virus-specific neutralizing antibodies. For example, the extent of antibody neutralization of HIV-1, SIV and SHIV can be measured in the TZM-bl cell line through the degree of luciferase reporter gene expression after infection. In the past, neutralization curves and titers for this standard assay have been calculated using an Excel macro. Updating all instances of such a macro with new techniques can be unwieldy and introduce non-uniformity across multi-lab teams. Using Excel also poses challenges in centrally storing, sharing and associating raw data files and results. Results: We present LabKey Server's NAb tool for organizing, analyzing and securely sharing data, files and results for neutralizing antibody (NAb) assays, including the luciferase-based TZM-bl NAb assay. The customizable tool supports high-throughput experiments and includes a graphical plate template designer, allowing researchers to quickly adapt calculations to new plate layouts. The tool calculates the percent neutralization for each serum dilution based on luminescence measurements, fits a range of neutralization curves to titration results and uses these curves to estimate the neutralizing antibody titers for benchmark dilutions. Results, curve visualizations and raw data files are stored in a database and shared through a secure, web-based interface. NAb results can be integrated with other data sources based on sample identifiers. It is simple to make results public after publication by updating folder security settings. Conclusions: Standardized tools for analyzing, archiving and sharing assay results can improve the reproducibility, comparability and reliability of results obtained across many labs. LabKey Server and its NAb tool are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. Many members of the HIV research community can also access the LabKey Server NAb tool without installing the software by using the Atlas Science Portal (https://atlas.scharp.org). Atlas is an installation of LabKey Server. PMID:21619655
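
    The core calculation the tool automates can be sketched as follows: percent neutralization from raw luminescence, then a logistic fit to estimate the 50% inhibitory titer. The control-well convention and four-parameter curve below are a common choice for TZM-bl data, assumed here rather than taken from LabKey's code, and the values are made up:

        # Sketch: percent neutralization from luminescence, then a logistic
        # fit for the 50% titer. Conventions assumed, values invented.
        import numpy as np
        from scipy.optimize import curve_fit

        cell_ctrl, virus_ctrl = 1200.0, 150000.0           # RLU of control wells
        dilutions = np.array([20, 60, 180, 540, 1620, 4860], dtype=float)
        rlu = np.array([9000, 21000, 52000, 98000, 131000, 146000], dtype=float)

        neut = 100 * (1 - (rlu - cell_ctrl) / (virus_ctrl - cell_ctrl))

        def logistic(x, top, bottom, id50, slope):
            return bottom + (top - bottom) / (1 + (x / id50) ** slope)

        popt, _ = curve_fit(logistic, dilutions, neut, p0=[100, 0, 500, 1])
        print("estimated ID50 dilution: 1:%.0f" % popt[2])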

  16. SURVEY AND SUMMARY: The morph server: a standardized system for analyzing and visualizing macromolecular motions in a database framework

    PubMed Central

    Krebs, Werner G.; Gerstein, Mark

    2000-01-01

    The number of solved structures of macromolecules that have the same fold and thus exhibit some degree of conformational variability is rapidly increasing. It is consequently advantageous to develop a standardized terminology for describing this variability and automated systems for processing protein structures in different conformations. We have developed such a system as a ‘front-end’ server to our database of macromolecular motions. Our system attempts to describe a protein motion as a rigid-body rotation of a small ‘core’ relative to a larger one, using a set of hinges. The motion is placed in a standardized coordinate system so that all statistics between any two motions are directly comparable. We find that while this model can accommodate most protein motions, it cannot accommodate all; the degree to which a motion can be accommodated provides an aid in classifying it. Furthermore, we perform an adiabatic mapping (a restrained interpolation) between every two conformations. This gives some indication of the extent of the energetic barriers that need to be surmounted in the motion, and as a by-product results in a ‘morph movie’. We make these movies available over the Web to aid in visualization. Many instances of conformational variability occur between proteins with somewhat different sequences. We can accommodate these differences in a rough fashion, generating an ‘evolutionary morph’. Users have already submitted hundreds of examples of protein motions to our server, producing a comprehensive set of statistics. So far the statistics show that the median submitted motion has a rotation of ~10° and a maximum Cα displacement of 17 Å. Almost all involve at least one large torsion angle change of >140°. The server is accessible at http://bioinfo.mbb.yale.edu/MolMovDB PMID:10734184
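
    The interpolation step behind a morph can be sketched in a few lines; the real server's adiabatic mapping additionally applies energetic restraints to each intermediate frame, which this simplified linear version omits:

        # Simplified linear interpolation between two aligned conformations;
        # coordinates here are random placeholders for real C-alpha traces.
        import numpy as np

        def morph_frames(coords_a, coords_b, n_frames=10):
            """Yield intermediate coordinate sets between two conformers."""
            for t in np.linspace(0.0, 1.0, n_frames):
                yield (1 - t) * coords_a + t * coords_b

        open_form = np.random.rand(200, 3)
        closed_form = open_form + 0.5
        frames = list(morph_frames(open_form, closed_form))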

  17. The Standard Autonomous File Server, a Customized, Off-the-Shelf Success Story

    NASA Technical Reports Server (NTRS)

    Semancik, Susan K.; Conger, Annette M.; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    The Standard Autonomous File Server (SAFS), which includes both off-the-shelf hardware and software, uses an improved automated file transfer process to provide quicker, more reliable, prioritized file distribution for customers of near real-time data without interfering with the assets involved in the acquisition and processing of the data. It operates as a stand-alone solution, monitoring itself, and providing an automated fail-over process to enhance reliability. This paper will describe the unique problems and lessons learned both during the COTS selection and integration into SAFS, and during the system's first year of operation in support of NASA's satellite ground network. COTS was the key factor in allowing the two-person development team to deploy systems in less than a year, meeting the required launch schedule. The SAFS system has been so successful that it is becoming a NASA standard resource, leading to its nomination for NASA's Software of the Year Award in 1999.

  18. Development of EPA Protocol Information Enquiry Service System Based on Embedded ARM Linux

    NASA Astrophysics Data System (ADS)

    Peng, Daogang; Zhang, Hao; Weng, Jiannian; Li, Hui; Xia, Fei

    Industrial Ethernet is a new technology for industrial network communications developed in recent years. In the field of industrial automation in China, EPA is the first standard accepted and published by ISO, and has been included in the fourth edition of the IEC 61158 Fieldbus standard as type 14. According to the EPA standard, field devices such as industrial field controllers, actuators and other instruments are all able to communicate based on the Ethernet standard. In this paper, the Atmel AT91RM9200 embedded development board and open-source embedded Linux are used to develop an EPA protocol information inquiry service system based on embedded ARM Linux. The system provides an EPA server program for EPA data acquisition; the EPA information inquiry service is available to programs on local or remote hosts through a socket interface. An EPA client can access data and information from other EPA devices on the EPA network once it establishes a connection with the monitoring port of the server.
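
    A minimal sketch of such a socket-based inquiry service, written here in Python for brevity rather than the embedded C a real EPA gateway would use; the one-tag-per-line protocol and tag names are invented for illustration:

        # Minimal TCP inquiry server; the one-tag-per-line protocol and the
        # tag values are invented (a real gateway would speak EPA itself).
        import socketserver

        TAGS = {"FT101": "12.7", "TT202": "88.4"}    # placeholder device data

        class InquiryHandler(socketserver.StreamRequestHandler):
            def handle(self):
                tag = self.rfile.readline().decode().strip()
                self.wfile.write((TAGS.get(tag, "UNKNOWN") + "\n").encode())

        with socketserver.TCPServer(("0.0.0.0", 5000), InquiryHandler) as srv:
            srv.serve_forever()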

  19. EuCliD (European Clinical Database): a database comparing different realities.

    PubMed

    Marcelli, D; Kirchgessner, J; Amato, C; Steil, H; Mitteregger, A; Moscardò, V; Carioni, C; Orlandini, G; Gatti, E

    2001-01-01

    Quality and variability of dialysis practice are gaining more and more importance. Fresenius Medical Care (FMC), as a provider of dialysis, has the duty to continuously monitor and guarantee the quality of care delivered to patients treated in its European dialysis units. Accordingly, a new clinical database called EuCliD has been developed. It is a multilingual and fully codified database, using international standard coding tables as far as possible. EuCliD collects and handles sensitive medical patient data, fully assuring confidentiality. The infrastructure: a Domino server is installed in each country connected to EuCliD. All the centres belonging to a country are connected via modem to the country server. All the Domino servers are connected via a wide area network to the headquarters server in Bad Homburg (Germany). Inside each country server only anonymous data related to that particular country are available. The only place where all the anonymous data are available is the headquarters server. The data collection is strongly supported in each country by "key persons" with solid relationships to their respective national dialysis units. The quality of the data in EuCliD is ensured at different levels. At the end of January 2001, more than 11,000 patients treated in 135 centres located in 7 countries were already included in the system. FMC has put patient care at the centre of its activities for many years and is now able to provide transparency to the community (authorities, nephrologists, patients, ...), thus demonstrating the quality of the service.

  20. Implementation of the Clinical Encounters Tracking system at the Indiana University School of Medicine.

    PubMed

    Hatfield, Amy J; Bangert, Michael P

    2005-01-01

    The Indiana University School of Medicine (IUSM) Office of Medical Education & Student Services directed the IUSM Educational Technology Unit to develop a Clinical Encounters Tracking system in response to the Liaison Committee on Medical Education's (LCME) updated accreditation standards. A personal digital assistant (PDA) and centralized database server solution was implemented. Third-year medical students are required to carry a PDA on which they record clinical encounter experiences during all clerkship clinical rotations. Clinical encounter data collected on the PDAs are routinely uploaded to the central server via the PDA HotSync process. Real-time clinical encounter summary reports are accessed in the school's online curriculum management system, ANGEL. The resulting IUSM Clinical Encounters Tracking program addresses the LCME accreditation standard that mandates the tracking of medical students' required clinical curriculum experiences.

  1. Dynamic mapping of EDDL device descriptions to OPC UA

    NASA Astrophysics Data System (ADS)

    Atta Nsiah, Kofi; Schappacher, Manuel; Sikora, Axel

    2017-07-01

    OPC UA (Open Platform Communications Unified Architecture) is already a well-known concept used widely in the automation industry. In the area of factory automation, OPC UA models the underlying field devices, such as sensors and actuators, in an OPC UA server, allowing connected OPC UA clients to access device-specific information via a standardized information model. One of the requirements for the OPC UA server to represent field device data in its information model is advance knowledge of the properties of the field devices, in the form of device descriptions. The international standard IEC 61804 specifies EDDL (Electronic Device Description Language) as a generic language for describing the properties of field devices. In this paper, the authors describe a possibility to dynamically map and integrate field device descriptions based on EDDL into OPC UA.
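
    The target of such a mapping can be sketched with the python-opcua library: each EDDL variable becomes a node in the server's address space. The device and namespace names below are illustrative assumptions, not the authors' implementation:

        # Sketch with the python-opcua library: each EDDL variable becomes a
        # node in the server address space. Device and namespace names are
        # illustrative, not the authors' implementation.
        import time
        from opcua import Server

        server = Server()
        server.set_endpoint("opc.tcp://0.0.0.0:4840/eddl-gateway/")
        idx = server.register_namespace("http://example.org/eddl-devices")

        device = server.get_objects_node().add_object(idx, "TemperatureTransmitter")
        pv = device.add_variable(idx, "PrimaryValue", 0.0)  # from an EDDL VARIABLE

        server.start()       # clients can now browse and read the device node
        try:
            while True:
                pv.set_value(20.0)   # a real gateway would poll the fieldbus here
                time.sleep(1.0)
        finally:
            server.stop()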

  2. Solid waste information and tracking system client-server conversion project management plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, D.L.

    1998-04-15

    This Project Management Plan is the lead planning document governing the proposed conversion of the Solid Waste Information and Tracking System (SWITS) to a client-server architecture. This plan presents the content specified by American National Standards Institute (ANSI)/Institute of Electrical and Electronics Engineers (IEEE) standards for software development, with additional information categories deemed to be necessary to describe the conversion fully. This plan is a living document that will be reviewed on a periodic basis and revised when necessary to reflect changes in baseline design concepts and schedules. This PMP describes the background, planning and management of the SWITS conversion. It does not constitute a statement of product requirements. Requirements and specification documentation needed for the SWITS conversion will be released as supporting documents.

  3. CDC Vital Signs: Adult Smoking among People with Mental Illness

    MedlinePlus


  4. Photos of MRSA Infections

    MedlinePlus


  5. 77 FR 7526 - Interpretation of Protection System Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-13

    ... reh'g & compliance, 117 FERC ¶ 61,126 (2006), aff'd sub nom. Alcoa, Inc. v. FERC, 564 F.3d 1342 (D.C... opportunity to view and/or print the contents of this document via the Internet through FERC's Home Page... available on eLibrary in PDF and Microsoft Word format for viewing, printing, and/or downloading. To access...

  6. 76 FR 58101 - Electric Reliability Organization Interpretation of Transmission Operations Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-20

    ... on reh'g & compliance, 117 FERC ¶ 61,126 (2006), aff'd sub nom. Alcoa, Inc. v. FERC, 564 F.3d 1342 (D... persons an opportunity to view and/or print the contents of this document via the Internet through FERC's... document is available on eLibrary in PDF and Microsoft Word format for viewing, printing, and/or...

  7. Volume serving and media management in a networked, distributed client/server environment

    NASA Technical Reports Server (NTRS)

    Herring, Ralph H.; Tefend, Linda L.

    1993-01-01

    The E-Systems Modular Automated Storage System (EMASS) is a family of hierarchical mass storage systems providing complete storage/'file space' management. The EMASS volume server provides the flexibility to work with different clients (file servers), different platforms, and different archives with a 'mix and match' capability. The EMASS design considers all file management programs as clients of the volume server system. System storage capacities are tailored to customer needs ranging from small data centers to large central libraries serving multiple users simultaneously. All EMASS hardware is commercial off the shelf (COTS), selected to provide the performance and reliability needed in current and future mass storage solutions. All interfaces use standard commercial protocols and networks suitable to service multiple hosts. EMASS is designed to efficiently store and retrieve in excess of 10,000 terabytes of data. Current clients include CRAY's YMP Model E based Data Migration Facility (DMF), IBM's RS/6000 based Unitree, and CONVEX based EMASS File Server software. The VolSer software provides the capability to accept client or graphical user interface (GUI) commands from the operator's console and translate them to the commands needed to control any configured archive. The VolSer system offers advanced features to enhance media handling and particularly media mounting such as: automated media migration, preferred media placement, drive load leveling, registered MediaClass groupings, and drive pooling.

  8. How reliable is computerized assessment of readability?

    PubMed

    Mailloux, S L; Johnson, M E; Fisher, D G; Pettibone, T J

    1995-01-01

    To assess the consistency and comparability of readability software programs, four software programs (Corporate Voice, Grammatix IV, Microsoft Word for Windows, and RightWriter) were compared. Standard materials included 28 pieces of printed educational materials on human immunodeficiency virus/acquired immunodeficiency syndrome distributed nationally and the Gettysburg Address. Statistical analyses for the educational materials revealed that each of the three formulas assessed (Flesch-Kincaid, Flesch Reading Ease, and Gunning Fog Index) provided significantly different grade equivalent scores and that the Microsoft Word program provided significantly lower grade levels and was more inconsistent in the scores provided. For the Gettysburg Address, considerable variation was revealed among formulas, with the discrepancy being up to two grade levels. When averaging across formulas, there was a variation of 1.3 grade levels between the four software programs. Given the variation between formulas and programs, implications for decisions based on results of these software programs are provided.
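
    The formulas being compared are standard and easy to reproduce; for instance, Flesch Reading Ease is 206.835 - 1.015(words/sentence) - 84.6(syllables/word), and the Flesch-Kincaid grade level is 0.39(words/sentence) + 11.8(syllables/word) - 15.59. A plausible source of the inconsistencies reported above is that programs count sentences and syllables differently, as this illustrative Python sketch (with a deliberately crude vowel-group syllable counter) makes clear:

      import re

      def count_syllables(word):
          # Crude heuristic: count groups of consecutive vowels.
          return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

      def flesch_scores(text):
          sentences = max(1, len(re.findall(r"[.!?]+", text)))
          words = re.findall(r"[A-Za-z']+", text)
          syllables = sum(count_syllables(w) for w in words)
          wps = len(words) / sentences          # words per sentence
          spw = syllables / len(words)          # syllables per word
          reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
          grade_level = 0.39 * wps + 11.8 * spw - 15.59
          return reading_ease, grade_level

      print(flesch_scores("Four score and seven years ago our fathers "
                          "brought forth on this continent a new nation."))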

  9. Prokaryotic Contig Annotation Pipeline Server: Web Application for a Prokaryotic Genome Annotation Pipeline Based on the Shiny App Package.

    PubMed

    Park, Byeonghyeok; Baek, Min-Jeong; Min, Byoungnam; Choi, In-Geol

    2017-09-01

    Genome annotation is a primary step in genomic research. To establish a light and portable prokaryotic genome annotation pipeline for use in individual laboratories, we developed a Shiny app package designated "P-CAPS" (Prokaryotic Contig Annotation Pipeline Server). The package is composed of R and Python scripts that integrate publicly available annotation programs into a server application. P-CAPS is not only a browser-based interactive application but also a distributable Shiny app package that can be installed on any personal computer. The final annotation is provided in various standard formats and is summarized in an R markdown document. Annotation can be visualized and examined with a public genome browser. A benchmark test showed that the annotation quality and completeness of P-CAPS were reliable and comparable to those of currently available public pipelines.

  10. Conducting Automated Test Assembly Using the Premium Solver Platform Version 7.0 with Microsoft Excel and the Large-Scale LP/QP Solver Engine Add-In

    ERIC Educational Resources Information Center

    Cor, Ken; Alves, Cecilia; Gierl, Mark J.

    2008-01-01

    This review describes and evaluates a software add-in created by Frontline Systems, Inc., that can be used with Microsoft Excel 2007 to solve large, complex test assembly problems. The combination of Microsoft Excel 2007 with the Frontline Systems Premium Solver Platform is significant because Microsoft Excel is the most commonly used spreadsheet…

  11. Map-IT! A Web-Based GIS Tool for Watershed Science Education.

    ERIC Educational Resources Information Center

    Curtis, David H.; Hewes, Christopher M.; Lossau, Matthew J.

    This paper describes the development of a prototypic, Web-accessible GIS solution for K-12 science education and citizen-based watershed monitoring. The server side consists of ArcView IMS running on an NT workstation. The client is built around MapCafe. The client interface, which runs through a standard Web browser, supports standard MapCafe…

  12. 78 FR 65431 - Notice of Funds Availability (NOFA) Inviting Applications for the Community Development Financial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-31

    ... form; (2) a Microsoft Excel Workbook; (3) a Microsoft Word Narrative template; and (4) other mandatory attachments. (Applicants must use the Microsoft Word Narrative template the CDFI Fund provides; alternative...

  13. Extending the Virtual Solar Observatory (VSO) to Incorporate Data Analysis Capabilities (III)

    NASA Astrophysics Data System (ADS)

    Csillaghy, A.; Etesi, L.; Dennis, B.; Zarro, D.; Schwartz, R.; Tolbert, K.

    2008-12-01

    We will present a progress report on our activities to extend the data analysis capabilities of the VSO. Our efforts to date have focused on three areas: 1. Extending the data retrieval capabilities by developing a centralized data processing server. The server is built with Java, IDL (Interactive Data Language), and the SSW (Solar SoftWare) package with all SSW-related instrument libraries and required calibration data. When a user requests VSO data that require preprocessing, the data are transparently sent to the server, processed, and returned to the user's IDL session for viewing and analysis. It is possible to have any Java or IDL client connect to the server. An IDL prototype for preparing and calibrating SOHO/EIT data will be demonstrated. 2. Improving the solar data search in SHOW SYNOP, a graphical user tool connected to the VSO in IDL. We introduce the Java-IDL interface that allows a flexible, dynamic, and extendable way of searching the VSO, where all communication with the VSO is managed dynamically by standard Java tools. 3. Improving image overlay capability to support coregistration of solar disk observations obtained from different orbital view angles, position angles, and distances - such as from the twin STEREO spacecraft.

  14. Open-Source GIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vatsavai, Raju; Burk, Thomas E; Lime, Steve

    2012-01-01

    The components making up an open-source GIS are explained in this chapter. A map server (Sect. 30.1) can broadly be defined as a software platform for dynamically generating spatially referenced digital map products. The University of Minnesota MapServer (UMN MapServer) is one such system. Its basic features are visualization, overlay, and query. Section 30.2 names and explains many of the geospatial open-source libraries, such as GDAL and OGR. The other libraries are FDO, JTS, GEOS, JCS, MetaCRS, and GPSBabel. The application examples include derived GIS software and data format conversions. Quantum GIS, its origin and its applications are explained in detail in Sect. 30.3. The features include a rich GUI, attribute tables, vector symbols, labeling, editing functions, projections, georeferencing, GPS support, analysis, and Web Map Server functionality. Future developments will address mobile applications, 3-D, and multithreading. The origins of PostgreSQL are outlined and PostGIS is discussed in detail in Sect. 30.4. It extends PostgreSQL by implementing the Simple Feature standard. Section 30.5 details the most important open source licenses, such as the GPL, the LGPL, the MIT License, and the BSD License, as well as the role of the Creative Commons.

  15. The catalogCleaner: Separating the Sheep from the Goats

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Hankin, S. C.; Schweitzer, R.; Koyuk, H.

    2012-12-01

    The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well-known and widely used standards and focusing initially on well-understood data types, such as gridded data from climate models. This phased approach serves to engage data providers and users and also has a high probability of demonstrable successes. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous, or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. For example, serving five hundred individual files from a single climate model might be compliant, but enhancing the service so that those files are all aggregated together into one virtual dataset available through a single access URL provides a much more useful service. The UAF project began showcasing the advantages of providing compliant data by manually building a master catalog generated from hand-picked THREDDS servers. With an understanding that educating data managers to provide standards-compliant data and metadata can take years, the UAF project wanted to continue increasing the volume of data served through the master catalog as much as possible. However, it quickly became obvious, given the sheer volume of data servers available, that the manual process of building a master catalog was not scalable. Thus, the idea for the catalogCleaner tool was born. The goal of this tool is to automatically crawl a remote OPeNDAP or THREDDS server and, from the information in the server, build a "clean" catalog of data that will: a) be served through uniform access services; b) have CF-compliant metadata; c) directly link the data to common visualization tools, thereby allowing users to immediately begin exploring actual data. In addition, the UAF-generated clean catalog can then be used to drive data discovery tools such as Geoportal, GI-CAT, etc. This presentation will further explore the motivation for creating this tool and its implementation, as well as the myriad challenges and difficulties that were encountered along the way.
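
    The crawling step can be sketched in a few lines of Python with the siphon library (not named in the abstract; the catalog URL is just a public example). The real catalogCleaner layers CF-compliance checks and aggregation on top of a walk like this:

      from siphon.catalog import TDSCatalog

      def crawl(catalog_url, depth=0, max_depth=2):
          """Walk a THREDDS catalog tree, printing OPeNDAP access URLs."""
          cat = TDSCatalog(catalog_url)
          for name, ds in cat.datasets.items():
              url = ds.access_urls.get("OPENDAP")  # key depends on server setup
              if url:
                  print("  " * depth + url)
          if depth < max_depth:                    # recurse into sub-catalogs
              for ref in cat.catalog_refs.values():
                  crawl(ref.href, depth + 1, max_depth)

      crawl("https://thredds.ucar.edu/thredds/catalog/catalog.xml")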

  16. An expert system for headache diagnosis: the Computerized Headache Assessment tool (CHAT).

    PubMed

    Maizels, Morris; Wolfe, William J

    2008-01-01

    Migraine is a highly prevalent chronic disorder associated with significant morbidity. Chronic daily headache syndromes, while less common, are less likely to be recognized, and impair quality of life to an even greater extent than episodic migraine. A variety of screening and diagnostic tools for migraine have been proposed and studied. Few investigators have developed and evaluated computerized programs to diagnose headache. Our objective was to develop and determine the accuracy and utility of the Computerized Headache Assessment Tool (CHAT). CHAT was designed to identify all of the major primary headache disorders, distinguish daily from episodic types, and recognize medication overuse. CHAT was developed using an expert-systems approach to headache diagnosis, with initial branch points determined by headache frequency and duration. Appropriate clinical criteria are presented relevant to brief and longer-lasting headaches. CHAT was posted on a web site using Microsoft Active Server Pages and a SQL Server database. A convenience sample of patients who presented to the adult urgent care department with headache, and patients in a family practice waiting room, was solicited to participate. Those who completed the on-line questionnaire were contacted for a diagnostic interview. One hundred thirty-five patients completed CHAT and 117 completed a diagnostic interview. CHAT correctly identified 35/35 (100%) patients with episodic migraine and 42/49 (85.7%) of patients with transformed migraine. CHAT also correctly identified 11/11 patients with chronic tension-type headache, 2/2 with episodic tension-type headache, and 1/1 with episodic cluster headache. Medication overuse was correctly recognized in 43/52 (82.7%). The most common misdiagnoses by CHAT were seen in patients with transformed migraine or new daily persistent headache. Fifty patients were referred to their primary care physician and 62 to the headache clinic. Of 29 patients referred to the PCP with a confirmed diagnosis of migraine, 25 made a follow-up appointment, the PCP diagnosed migraine in 19, and initiated migraine-specific therapy or prophylaxis in 17. The described expert system displays high diagnostic accuracy for migraine and other primary headache disorders, including daily headache syndromes and medication overuse. As part of a disease management program, CHAT led to patients receiving appropriate diagnoses and therapy. Limitations of the system include patient willingness to utilize the program, introducing such a process into the culture of medical care, and the difficult distinction of transformed migraine.
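
    The abstract describes the rule base only at a high level (initial branch points on headache frequency and duration), so the following toy Python sketch is illustrative rather than CHAT's actual logic, and certainly not clinical guidance; the thresholds are rough simplifications of standard criteria (roughly 15 or more headache days per month marks the chronic daily headache branch, and migraine attacks last about 4-72 hours):

      def triage(days_per_month, duration_hours):
          """Toy first-level branching, loosely mirroring the described design."""
          if days_per_month >= 15:
              return "chronic daily headache branch (e.g., transformed migraine)"
          if 4 <= duration_hours <= 72:
              return "episodic long-duration branch (e.g., migraine, tension-type)"
          return "episodic short-duration branch (e.g., cluster headache)"

      print(triage(days_per_month=20, duration_hours=8))   # chronic daily branch
      print(triage(days_per_month=4, duration_hours=24))   # episodic branch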

  17. Interactive real-time media streaming with reliable communication

    NASA Astrophysics Data System (ADS)

    Pan, Xunyu; Free, Kevin M.

    2014-02-01

    Streaming media is a recent technique for delivering multimedia information from a source provider to an end-user over the Internet. The major advantage of this technique is that the media player can start playing a multimedia file even before the entire file is transmitted. Most streaming media applications are currently implemented based on the client-server architecture, where a server system hosts the media file and a client system connects to this server system to download the file. Although the client-server architecture is successful in many situations, it may not be ideal to rely on such a system to provide the streaming service, as users may be required to register an account using personal information in order to use the service. This is troublesome if a user wishes to watch a movie simultaneously while interacting with a friend in another part of the world over the Internet. In this paper, we describe a new real-time media streaming application implemented on a peer-to-peer (P2P) architecture in order to overcome these challenges within a mobile environment. When using the peer-to-peer architecture, streaming media is shared directly between end-users, called peers, with minimal or no reliance on a dedicated server. Based on the proposed software pɛvμa (pronounced [revma]), named for the Greek word meaning stream, we can host a media file on any computer and directly stream it to a connected partner. To accomplish this, pɛvμa utilizes the Microsoft .NET Framework and Windows Presentation Foundation, which are widely available on various types of Windows-compatible personal computers and mobile devices. With specially designed multi-threaded algorithms, the application can stream HD video at speeds upwards of 20 Mbps using the User Datagram Protocol (UDP). Streaming and playback are handled using synchronized threads that communicate with one another once a connection is established. Alteration of playback, such as pausing playback or tracking to a different spot in the media file, will be reflected in all media streams. These techniques are designed to allow users at different locations to simultaneously view a full-length HD video and interactively control the media streaming session. To create a sustainable media stream with high quality, our system supports UDP packet loss recovery at high transmission speed using custom FileBuffers. Traditional real-time streaming protocols such as Real-time Transport Protocol/RTP Control Protocol (RTP/RTCP) provide no such error recovery mechanism. Finally, the system also features an Instant Messenger that allows users to perform social interactions with one another while they enjoy a media file. The ultimate goal of the application is to offer users a hassle-free way to watch a media file over long distances without having to upload any personal information into a third-party database. Moreover, the users can communicate with each other and stream media directly from one mobile device to another while maintaining independence from the traditional sign-up required by most streaming services.
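
    The application itself is built on .NET, but the UDP mechanics it relies on are language-neutral; this Python sketch shows the core idea of numbering datagrams so the receiver can detect gaps and request retransmission (ports, chunk size, and the in-process loopback setup are arbitrary, for illustration only):

      import socket, struct

      CHUNK = 1024
      sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      receiver.bind(("127.0.0.1", 9999))

      payload = b"x" * CHUNK
      for seq in range(3):                  # send numbered media chunks
          sender.sendto(struct.pack("!I", seq) + payload, ("127.0.0.1", 9999))

      expected = 0
      for _ in range(3):
          data, _addr = receiver.recvfrom(4 + CHUNK)
          (seq,) = struct.unpack("!I", data[:4])
          if seq != expected:               # gap: lost packet(s) to re-request
              print(f"lost packets {expected}..{seq - 1}")
          expected = seq + 1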

  18. WMT: The CSDMS Web Modeling Tool

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and backed by SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server where it is stored and from which a user can download it as a single compressed archive file.
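
    From a client's point of view, a layered REST API like the one described is just a set of HTTP endpoints returning JSON. The host and routes in this Python sketch are hypothetical (consult the CSDMS WMT documentation for the real ones); it only illustrates the db/exe layering:

      import requests

      BASE = "https://csdms.example.edu/wmt"        # hypothetical deployment

      # Query the database layer for component metadata (route and keys assumed).
      components = requests.get(f"{BASE}/wmt-db/components").json()
      for comp in components:
          print(comp["id"], comp.get("provides", []))

      # Ask the execution layer to launch a saved model on an HPC host.
      run = requests.post(f"{BASE}/wmt-exe/runs",
                          json={"model_id": 42, "host": "hpc.example.edu"})
      print(run.status_code, run.json())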

  19. GrayStarServer: Server-side Spectrum Synthesis with a Browser-based Client-side User Interface

    NASA Astrophysics Data System (ADS)

    Short, C. Ian

    2016-10-01

    We present GrayStarServer (GSS), a stellar atmospheric modeling and spectrum synthesis code of pedagogical accuracy that is accessible in any web browser on commonplace computational devices and that runs on a timescale of a few seconds. The addition of spectrum synthesis annotated with line identifications extends the functionality and pedagogical applicability of GSS beyond that of its predecessor, GrayStar3 (GS3). The spectrum synthesis is based on a line list acquired from the NIST atomic spectra database, and the GSS post-processing and user interface client allows the user to inspect the plain text ASCII version of the line list, as well as to apply macroscopic broadening. Unlike GS3, GSS carries out the physical modeling on the server side in Java, and communicates with the JavaScript and HTML client via an asynchronous HTTP request. We also describe other improvements beyond GS3 such as a more physical treatment of background opacity and atmospheric physics, the comparison of key results with those of the Phoenix code, and the use of the HTML <canvas> element for higher quality plotting and rendering of results. We also present LineListServer, a Java code for converting custom ASCII line lists in NIST format to the byte data type file format required by GSS so that users can prepare their own custom line lists. We propose a standard for marking up and packaging model atmosphere and spectrum synthesis output for data transmission and storage that will facilitate a web-based approach to stellar atmospheric modeling and spectrum synthesis. We describe some pedagogical demonstrations and exercises enabled by easily accessible, on-demand, responsive spectrum synthesis. GSS may serve as a research support tool by providing quick spectroscopic reconnaissance. GSS may be found at www.ap.smu.ca/~ishort/OpenStars/GrayStarServer/grayStarServer.html, and source tarballs for local installations of both GSS and LineListServer may be found at www.ap.smu.ca/~ishort/OpenStars/.

  20. A hypermedia reference system to the Forest Ecosystem Management Assessment team report and some related publications.

    Treesearch

    K.M. Reynolds; H.M. Rauscher; C.V. Worth

    1995-01-01

    The hypermedia system, ForestEM, was developed in HyperWriter for use in Microsoft Windows. ForestEM version 1.0 includes text and figures from the FEMAT report and the Record of Decision and Standards and Guidelines. Hypermedia introduces two fundamental changes to knowledge management. The first is the capability to interactively store and retrieve large amounts of...

  1. The Role of Standards in Cloud-Computing Interoperability

    DTIC Science & Technology

    2012-10-01

    services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building... center requirements • Developing usage models for cloud vendors • Independent IT consortium OpenStack http://www.openstack.org • Open-source... software for running private clouds • Currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift

  2. Standards-Based Open-Source Planetary Map Server: Lunaserv

    NASA Astrophysics Data System (ADS)

    Estes, N. M.; Silva, V. H.; Bowley, K. S.; Lanjewar, K. K.; Robinson, M. S.

    2018-04-01

    Lunaserv is a planetary-capable Web Map Service developed by the LROC SOC. It enables researchers to serve their own planetary data to a wide variety of GIS clients without any additional processing or download steps.
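
    Because Lunaserv speaks standard WMS, any client can request a rendered map with a plain HTTP GetMap query; this Python sketch builds one (the hostname and layer name are placeholders, not Lunaserv's actual deployment):

      import urllib.parse

      params = {
          "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
          "LAYERS": "luna_wac_global",   # placeholder layer name
          "SRS": "EPSG:4326",            # WMS 1.3.0 uses CRS instead of SRS
          "BBOX": "-180,-90,180,90",
          "WIDTH": "1024", "HEIGHT": "512",
          "FORMAT": "image/png",
      }
      print("https://lunaserv.example.edu/wms?" + urllib.parse.urlencode(params))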

  3. Software Update.

    ERIC Educational Resources Information Center

    Currents, 2000

    2000-01-01

    A chart of 40 alumni-development database systems provides information on vendor/Web site, address, contact/phone, software name, price range, minimum suggested workstation/suggested server, standard reports/reporting tools, minimum/maximum record capacity, and number of installed sites/client type. (DB)

  4. SAME4HPC: A Promising Approach in Building a Scalable and Mobile Environment for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karthik, Rajasekar

    2014-01-01

    In this paper, an architecture for building a Scalable And Mobile Environment for High-Performance Computing with spatial capabilities, called SAME4HPC, is described using cutting-edge technologies and standards such as Node.js, HTML5, ECMAScript 6, and PostgreSQL 9.4. Mobile devices are increasingly becoming powerful enough to run high-performance apps. At the same time, there exist a significant number of low-end and older devices that rely heavily on the server or the cloud infrastructure to do the heavy lifting. Our architecture aims to support both of these types of devices to provide high performance and a rich user experience. A cloud infrastructure consisting of OpenStack with Ubuntu, GeoServer, and high-performance JavaScript frameworks are some of the key open-source and industry-standard practices that have been adopted in this architecture.

  5. Lamprey: tracking users on the World Wide Web.

    PubMed

    Felciano, R M; Altman, R B

    1996-01-01

    Tracking individual web sessions provides valuable information about user behavior. This information can be used for general purpose evaluation of web-based user interfaces to biomedical information systems. To this end, we have developed Lamprey, a tool for doing quantitative and qualitative analysis of Web-based user interfaces. Lamprey can be used from any conforming browser, and does not require modification of server or client software. By rerouting WWW navigation through a centralized filter, Lamprey collects the sequence and timing of hyperlinks used by individual users to move through the web. Instead of providing marginal statistics, it retains the full information required to recreate a user session. We have built Lamprey as a standard Common Gateway Interface (CGI) that works with all standard WWW browsers and servers. In this paper, we describe Lamprey and provide a short demonstration of this approach for evaluating web usage patterns.

  6. Novel Approach to Analyzing MFE of Noncoding RNA Sequences

    PubMed Central

    George, Tina P.; Thomas, Tessamma

    2016-01-01

    Genomic studies have become noncoding RNA (ncRNA) centric after the study of different genomes provided enormous information on ncRNA over the past decades. The function of ncRNA is decided by its secondary structure, and across organisms, the secondary structure is more conserved than the sequence itself. In this study, the optimal secondary structure or the minimum free energy (MFE) structure of ncRNA was found based on the thermodynamic nearest neighbor model. MFE of over 2600 ncRNA sequences was analyzed in view of its signal properties. Mathematical models linking MFE to the signal properties were found for each of the four classes of ncRNA analyzed. MFE values computed with the proposed models were in concordance with those obtained with the standard web servers. A total of 95% of the sequences analyzed had deviation of MFE values within ±15% relative to those obtained from standard web servers. PMID:27695341

  8. Integration of digital gross pathology images for enterprise-wide access.

    PubMed

    Amin, Milon; Sharma, Gaurav; Parwani, Anil V; Anderson, Ralph; Kolowitz, Brian J; Piccoli, Anthony; Shrestha, Rasu B; Lauro, Gonzalo Romero; Pantanowitz, Liron

    2012-01-01

    Sharing digital pathology images for enterprise-wide use in a picture archiving and communication system (PACS) is not yet widely adopted. We share our solution and 3-year experience of transmitting such images to an enterprise image server (EIS). Gross pathology images acquired by prosectors were integrated with clinical cases into the laboratory information system's image management module, and stored in JPEG2000 format on a networked image server. Automated daily searches for cases with gross images were used to compile an ASCII text file that was forwarded to a separate institutional Enterprise Digital Imaging and Communications in Medicine (DICOM) Wrapper (EDW) server. Concurrently, an HL7-based image order for these cases was generated, containing the locations of images and patient data, and forwarded to the EDW, which combined data in these locations to generate images with patient data, as required by DICOM standards. The image and data were then "wrapped" according to DICOM standards, transferred to the PACS servers, and made accessible on an institution-wide basis. In total, 26,966 gross images from 9,733 cases were transmitted over the 3-year period from the laboratory information system to the EIS. The average process time for cases with successful automatic uploads (n=9,688) to the EIS was 98 seconds. Only 45 cases (0.5%) failed, requiring manual intervention. Uploaded images were immediately available to institution-wide PACS users. Since inception, user feedback has been positive. Enterprise-wide, PACS-based sharing of pathology images is feasible, provides useful services to clinical staff, and utilizes existing information system and telecommunications infrastructure. PACS-shared pathology images, however, require a "DICOM wrapper" for multisystem compatibility.

  10. An Open Source Tool to Test Interoperability

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering of the raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while reducing the time needed to develop new software. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open-source facility, available at SourceForge, that can be run via the command line, deployed in a web servlet container or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses; conformance of GML-encoded data; proper values for elements and attributes in the XML; and correct error responses. This presentation will provide an overview of TEAM Engine, an introduction to testing via the OGC testing web site, and a description of performing local tests. It will also provide information about how to participate in the open-source code development of TEAM Engine.

  11. 106-17 Telemetry Management Resources Chapter 25

    DTIC Science & Technology

    2017-07-01

    aspects of the TmNS system. There are two primary protocols for accessing the management resources: Simple Network Management Protocol (SNMP) and... management resources as well as basic HTTP clients and servers for a more RESTful approach to system management. Both tools are available from the... Telemetry Standards, RCC Standard 106-17 Chapter 25, July 2017.

  12. Using OPC and HL7 Standards to Incorporate an Industrial Big Data Historian in a Health IT Environment.

    PubMed

    Cruz, Márcio Freire; Cavalcante, Carlos Arthur Mattos Teixeira; Sá Barretto, Sérgio Torres

    2018-05-30

    Health Level Seven (HL7) is one of the standards most used to centralize data from different vital sign monitoring systems. This solution significantly limits the data available for historical analysis, because it typically uses databases that are not effective in storing large volumes of data. In industry, a specific Big Data historian, known as a Process Information Management System (PIMS), solves this problem. This work proposes the same solution to overcome the restriction on storing vital sign data. The PIMS needs a compatible communication standard to allow storage, and the one most commonly used is OLE for Process Control (OPC). This paper presents an HL7-OPC server that permits communication between vital sign monitoring systems and a PIMS, thus allowing the storage of long historical series of vital signs. In addition, it carries out a review of local and cloud-based Big Medical Data research, followed by an analysis of the PIMS in a health IT environment. It then shows the architecture of the HL7 and OPC standards. Finally, it presents the HL7-OPC server and a sequence of tests that proved its full operation and performance.
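
    The translation at the heart of such a server can be sketched as follows: parse an HL7 v2 ORU^R01 observation message from a monitor and pull out the OBX values that would then be written as OPC items. The message is fabricated, and the python-hl7 library stands in for whatever HL7 stack an implementation would actually use:

      import hl7

      raw = "\r".join([
          "MSH|^~\\&|MONITOR|ICU|PIMS|HOSP|20180530120000||ORU^R01|0001|P|2.5",
          "PID|1||123456^^^HOSP||DOE^JOHN",
          "OBX|1|NM|8867-4^Heart Rate^LN||72|/min|||||F",
          "OBX|2|NM|2708-6^SpO2^LN||97|%|||||F",
      ])

      message = hl7.parse(raw)
      for obx in message.segments("OBX"):
          # OBX-3 = observation identifier, OBX-5 = value, OBX-6 = units
          name, value, unit = str(obx[3]), str(obx[5]), str(obx[6])
          print(f"{name}: {value} {unit}")   # would become an OPC tag update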

  13. FastStats: Kidney Disease

    MedlinePlus


  14. Holographic Rovers: Augmented Reality and the Microsoft HoloLens

    NASA Technical Reports Server (NTRS)

    Toler, Laura

    2017-01-01

    Augmented Reality is an emerging field in technology, and encompasses Head Mounted Displays, smartphone apps, and even projected images. HMDs include the Meta 2, Magic Leap, Avegant Light Field, and the Microsoft HoloLens, which is evaluated specifically. The Microsoft HoloLens is designed to be used as an AR personal computer, and is being optimized with that goal in mind. Microsoft allied with the Unity3D game engine to create an SDK for interested application developers that can be used in the Unity environment.

  15. Human-Robot Interface Controller Usability for Mission Planning on the Move

    DTIC Science & Technology

    2012-11-01

    Figure 3. Microsoft Xbox 360 controller for Windows... Figure 5. Microsoft Trackball Explorer... Figure 6... Xbox 360 Controller is a registered trademark of Microsoft Corporation. 3.2.1 HMMWV: The HMMWV was equipped with a diesel engine

  16. Scabies: Workplace Frequently Asked Questions (FAQs)

    MedlinePlus


  17. An Efficient and Practical Smart Card Based Anonymity Preserving User Authentication Scheme for TMIS using Elliptic Curve Cryptography.

    PubMed

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Kumar, Neeraj

    2015-11-01

    In the last few years, numerous remote user authentication and session key agreement schemes have been put forward for the Telecare Medical Information System, where the patient and medical server exchange medical information over the Internet. We have found that most of the schemes are not usable for practical applications due to known security weaknesses. It is also worth noting that an unrestricted number of patients log in to the single medical server across the globe. Therefore, the computation and maintenance overhead would be high and the server may fail to provide services. In this article, we have designed a medical system architecture and a standard mutual authentication scheme for a single medical server, where the patient can securely exchange medical data with the doctor(s) via a trusted central medical server over any insecure network. We then explored the security of the scheme and its resilience to attacks. Moreover, we formally validated the proposed scheme through simulation using the Automated Validation of Internet Security Protocols and Applications software, whose outcomes confirm that the scheme is protected against active and passive attacks. The performance comparison demonstrated that the proposed scheme has a lower communication cost than the existing schemes in the literature. In addition, the computation cost of the proposed scheme is nearly equal to that of the existing schemes. The proposed scheme is not only efficient against different security attacks, but also provides efficient login, mutual authentication, session key agreement, verification and password update phases, along with password recovery.
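
    The paper's protocol adds smart cards, passwords, and mutual authentication on top of elliptic curve operations; this Python sketch shows only the underlying primitive such schemes build on, an ECDH exchange in which patient and server derive the same session key (it is not the authors' scheme; the pyca/cryptography library is used for illustration):

      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import ec
      from cryptography.hazmat.primitives.kdf.hkdf import HKDF

      patient_priv = ec.generate_private_key(ec.SECP256R1())
      server_priv = ec.generate_private_key(ec.SECP256R1())

      # Each side combines its own private key with the peer's public key.
      shared_p = patient_priv.exchange(ec.ECDH(), server_priv.public_key())
      shared_s = server_priv.exchange(ec.ECDH(), patient_priv.public_key())
      assert shared_p == shared_s

      session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                         info=b"tmis session").derive(shared_p)
      print(session_key.hex())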

  18. Cyber-T web server: differential analysis of high-throughput data.

    PubMed

    Kayala, Matthew A; Baldi, Pierre

    2012-07-01

    The Bayesian regularization method for high-throughput differential analysis, described in Baldi and Long (A Bayesian framework for the analysis of microarray expression data: regularized t-test and statistical inferences of gene changes. Bioinformatics 2001: 17: 509-519) and implemented in the Cyber-T web server, is one of the most widely validated. Cyber-T implements a t-test using a Bayesian framework to compute a regularized variance of the measurements associated with each probe under each condition. This regularized estimate is derived by flexibly combining the empirical measurements with a prior, or background, derived from pooling measurements associated with probes in the same neighborhood. This approach flexibly addresses problems associated with low replication levels and technology biases, not only for DNA microarrays, but also for other technologies, such as protein arrays, quantitative mass spectrometry and next-generation sequencing (RNA-seq). Here we present an update to the Cyber-T web server, incorporating several useful new additions and improvements. Several preprocessing data normalization options, including logarithmic and Variance Stabilizing Normalization (VSN) transforms, are included. To augment two-sample t-tests, a one-way analysis of variance is implemented. Several methods for multiple-test correction, including standard frequentist methods and a probabilistic mixture model treatment, are available. Diagnostic plots allow visual assessment of the results. The web server provides comprehensive documentation and example data sets. The Cyber-T web server, with R source code and data sets, is publicly available at http://cybert.ics.uci.edu/.
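
    The regularization idea can be shown schematically in Python (this is not Cyber-T's exact estimator: the pseudo-count convention and degrees of freedom follow Baldi and Long only approximately, and the background variance would in practice be pooled from probes of similar intensity):

      import numpy as np
      from scipy import stats

      def regularized_ttest(x, y, s2_background, nu0=10):
          """Two-sample t-test with variances shrunk toward a background value."""
          def shrink(sample):
              n, s2 = len(sample), np.var(sample, ddof=1)
              # Treat the prior as nu0 pseudo-observations of s2_background.
              return (nu0 * s2_background + (n - 1) * s2) / (nu0 + n - 1)
          vx, vy = shrink(x), shrink(y)
          t = (np.mean(x) - np.mean(y)) / np.sqrt(vx / len(x) + vy / len(y))
          df = len(x) + len(y) - 2 + 2 * nu0   # prior adds effective df
          return t, 2 * stats.t.sf(abs(t), df)

      x = np.array([5.1, 5.3, 4.9])            # log-expression, condition A
      y = np.array([6.0, 6.4, 6.1])            # condition B
      print(regularized_ttest(x, y, s2_background=0.05))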

  19. Automated realtime data import for the i2b2 clinical data warehouse: introducing the HL7 ETL cell.

    PubMed

    Majeed, Raphael W; Röhrig, Rainer

    2012-01-01

    Clinical data warehouses are used to consolidate all available clinical data from one or multiple organizations. They represent an important source for clinical research, quality management and controlling. Since its introduction, the data warehouse i2b2 has gathered a large user base in the research community. Yet, little work has been done on the process of importing clinical data into data warehouses using existing standards. In this article, we present a novel approach of utilizing the clinical integration server, commonly available in most hospitals, as the data source. As information is transmitted through the integration server, the standardized HL7 message is immediately parsed and inserted into the data warehouse. Evaluation of import speeds suggests the feasibility of the provided solution for real-time processing of HL7 messages. By using the presented approach of standardized data import, i2b2 can be used as a plug-and-play data warehouse, without the hurdle of customized import for every clinical information system or electronic medical record. The provided solution is available for download at http://sourceforge.net/projects/histream/.

  20. Inclusion in the Microsoft Workforce

    ERIC Educational Resources Information Center

    Exceptional Parent, 2008

    2008-01-01

    Since 1975, Microsoft has been a worldwide leader in software, services, and solutions that help people and businesses realize their full potential. Loren Mikola, the Disability Inclusion Program Manager at Microsoft, ensures that this technology also reaches and includes the special needs population and, through the hiring of individuals with…

  1. FastStats: Chronic Liver Disease and Cirrhosis

    MedlinePlus


  2. Time Synchronization Prototype, Server Upgrade Procedure Support and Remote Software Development

    NASA Technical Reports Server (NTRS)

    Sanders, Shania R.

    2014-01-01

    Networks are roadways of communication that connect devices. Like all roadways, they have rules and regulations that govern what travels along them (in this case, information). One type of rule that is commonly used is called a protocol. More specifically, a protocol is a standard that specifies how data should be transmitted over a network. The project outlined in this document seeks to implement one protocol in particular, Precision Time Protocol, within the Kennedy Ground Control Subsystem network at Kennedy Space Center. This document also summarizes work completed for server upgrades and remote software developer training, and how all three assignments demonstrated the importance of accountability and security.
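
    The arithmetic at the heart of PTP's Sync/Delay_Req exchange is worth spelling out: with t1 the master's Sync send time, t2 the slave's receive time, t3 the slave's Delay_Req send time, and t4 the master's receive time, and assuming a symmetric path, the slave's clock offset and the path delay fall out of the four timestamps:

      def ptp_offset_and_delay(t1, t2, t3, t4):
          offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master
          delay = ((t2 - t1) + (t4 - t3)) / 2.0    # mean one-way path delay
          return offset, delay

      # Example: slave clock 1.5 ms ahead, true one-way delay 0.5 ms.
      print(ptp_offset_and_delay(t1=0.0, t2=0.0020, t3=0.0030, t4=0.0020))
      # -> (0.0015, 0.0005)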

  3. Fulfillment of HTTP Authentication Based on Alcatel OmniSwitch 9700

    NASA Astrophysics Data System (ADS)

    Liu, Hefu

    This paper presents an implementation of HTTP authentication on the Alcatel OmniSwitch 9700. Authenticated VLANs control user access to network resources based on VLAN assignment and user authentication. The user can be authenticated through the switch via any standard Web browser software. The Web browser client displays the username and password prompts, and HTML forms pass the HTTP authentication data when submitted. A RADIUS server provides a database of user information that the switch checks whenever a user tries to authenticate through the switch. Before or after authentication, the client can obtain an address from a DHCP server.

  4. A Standard for Sharing and Accessing Time Series Data: The Heliophysics Application Programmers Interface (HAPI) Specification

    NASA Astrophysics Data System (ADS)

    Vandegriff, J. D.; King, T. A.; Weigel, R. S.; Faden, J.; Roberts, D. A.; Harris, B. T.; Lal, N.; Boardsen, S. A.; Candey, R. M.; Lindholm, D. M.

    2017-12-01

    We present the Heliophysics Application Programmers Interface (HAPI), a new interface specification that both large and small data centers can use to expose time series data holdings in a standard way. HAPI was inspired by the similarity of existing services at many Heliophysics data centers, and these data centers have collaborated to define a single interface that captures best practices and represents what everyone considers the essential, lowest common denominator for basic data access. This low level access can serve as infrastructure to support greatly enhanced interoperability among analysis tools, with the goal being simplified analysis and comparison of data from any instrument, model, mission or data center. The three main services a HAPI server must perform are 1. list a catalog of datasets (one unique ID per dataset), 2. describe the content of one dataset (JSON metadata), and 3. retrieve numerical content for one dataset (stream the actual data). HAPI defines both the format of the query to the server, and the response from the server. The metadata is lightweight, focusing on use rather than discovery, and the data format is a streaming one, with Comma Separated Values (CSV) being required and binary or JSON streaming being optional. The HAPI specification is available at GitHub, where projects are also underway to develop reference implementation servers that data providers can adapt and use at their own sites. Also in the works are data analysis clients in multiple languages (IDL, Python, Matlab, and Java). Institutions which have agreed to adopt HAPI include Goddard (CDAWeb for data and CCMC for models), LASP at the University of Colorado Boulder, the Particles and Plasma Interactions node of the Planetary Data System (PPI/PDS) at UCLA, the Plasma Wave Group at the University of Iowa, the Space Sector at the Johns Hopkins Applied Physics Lab (APL), and the tsds.org site maintained at George Mason University. Over the next year, the adoption of a uniform way to access time series data is expected to significantly enhance interoperability within the Heliophysics data environment. https://github.com/hapi-server/data-specification
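
    The three required services are simple enough to exercise from any HTTP client. This Python sketch uses HAPI 2.x parameter names against CDAWeb, one of the adopters named above (the exact base URL is an assumption; adjust to the server at hand):

      import requests

      BASE = "https://cdaweb.gsfc.nasa.gov/hapi"    # assumed endpoint

      catalog = requests.get(f"{BASE}/catalog").json()        # 1. list datasets
      dataset_id = catalog["catalog"][0]["id"]

      info = requests.get(f"{BASE}/info",                     # 2. describe one
                          params={"id": dataset_id}).json()
      print([p["name"] for p in info["parameters"]])

      data = requests.get(f"{BASE}/data", params={            # 3. stream CSV
          "id": dataset_id,
          "time.min": "2017-01-01T00:00:00Z",
          "time.max": "2017-01-02T00:00:00Z",
      })
      print(data.text.splitlines()[:3])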

  5. Fish Karyome: A karyological information network database of Indian Fishes.

    PubMed

    Nagpure, Naresh Sahebrao; Pathak, Ajey Kumar; Pati, Rameshwar; Singh, Shri Prakash; Singh, Mahender; Sarkar, Uttam Kumar; Kushwaha, Basdeo; Kumar, Ravindra

    2012-01-01

    'Fish Karyome', a database of karyological information on Indian fishes, has been developed that serves as a central source for karyotype data about Indian fishes compiled from the published literature. Fish Karyome is intended to serve as a liaison tool for researchers and contains karyological information about 171 of the 2,438 finfish species reported in India; it is publicly available via the World Wide Web. The database provides information on chromosome number, morphology, sex chromosomes, karyotype formula and cytogenetic markers, etc. Additionally, it provides phenotypic information that includes the species name, its classification, locality of sample collection, common name, local name, sex, geographical distribution, and IUCN Red List status. Besides, fish and karyotype images and references for the 171 finfish species have been included in the database. Fish Karyome was developed using SQL Server 2008, a relational database management system, Microsoft's ASP.NET 2008 and Macromedia's Flash technology under the Windows 7 operating environment. The system also enables users to input new information and images into the database, and to search and view the information and images of interest using various search options. Fish Karyome has a wide range of applications in species characterization and identification, sex determination, chromosomal mapping, karyo-evolution and the systematics of fishes.

  6. Scaleable wireless web-enabled sensor networks

    NASA Astrophysics Data System (ADS)

    Townsend, Christopher P.; Hamel, Michael J.; Sonntag, Peter A.; Trutor, B.; Arms, Steven W.

    2002-06-01

    Our goal was to develop a long-life, low-cost, scalable wireless sensing network that collects and distributes data from a wide variety of sensors over the internet. Time division multiple access was employed with RF transmitter nodes (each with a unique 16-bit address) to communicate digital data to a single receiver (range 1/3 mile). One thousand five-channel nodes can communicate to one receiver (30-minute update). Current draw (sleep) is 20 microamps, allowing a 5-year battery life with one 3.6-volt Li-Ion AA-size battery. The network nodes include sensor excitation (AC or DC), a multiplexer, an instrumentation amplifier, a 16-bit A/D converter, a microprocessor, and an RF link. They are compatible with thermocouples, strain gauges, load/torque transducers, and inductive/capacitive sensors. The receiver (418 MHz) includes a single board computer (SBC) with Ethernet capability, internet file transfer protocols (XML/HTML), and data storage. The receiver detects data from specific nodes, performs error checking, and records the data. The web server on the SBC can be interrogated (from Microsoft's Internet Explorer or Netscape's Navigator) to distribute data. This system can collect data from thousands of remote sensors on a smart structure and be shared by an unlimited number of users.
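
    The claimed battery life is easy to sanity-check with back-of-the-envelope arithmetic; in this Python sketch only the 20-microamp sleep current comes from the abstract, while the cell capacity, active current, and duty cycle are assumed round numbers:

      SLEEP_A = 20e-6            # sleep current, from the abstract
      ACTIVE_A = 15e-3           # assumed average draw while sampling/transmitting
      ACTIVE_S_PER_DAY = 60      # assumed one minute of activity per day
      CAPACITY_AH = 2.2          # typical 3.6 V Li AA cell (assumed)

      active_fraction = ACTIVE_S_PER_DAY / 86400.0
      avg_a = ACTIVE_A * active_fraction + SLEEP_A * (1 - active_fraction)
      hours = CAPACITY_AH / avg_a
      print(f"average draw {avg_a * 1e6:.1f} uA -> ~{hours / 8760:.1f} years")
      # ~8 years before self-discharge and margin; a 5-year rating is plausible.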

  7. 78 FR 65440 - Notice of Funds Availability (NOFA) Inviting Applications for the Native American CDFI Assistance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-31

    ... of multiple mandatory documents including: (1) a PDF fillable Applicant intake form; (2) a Microsoft Excel Workbook; (3) a Microsoft Word Narrative template; and (4) other mandatory attachments. (Applicants must use the Microsoft Word Narrative template the CDFI Fund provides; alternative templates...

  8. Progress Report--Microsoft Office 2003 Lynchburg College Tutorials

    ERIC Educational Resources Information Center

    Murray, Tom

    2004-01-01

    For the past several years Lynchburg College has developed Microsoft tutorials for use with academic classes and faculty, student and staff training. The tutorials are now used internationally. Last year Microsoft and Verizon sponsored a tutorial web site at http://www.officetutorials.com. This website recognizes ASCUE members for their wonderful…

  9. Web servicing the biological office.

    PubMed

    Szugat, Martin; Güttler, Daniel; Fundel, Katrin; Sohler, Florian; Zimmer, Ralf

    2005-09-01

    Biologists routinely use Microsoft Office applications for standard analysis tasks. Despite ubiquitous internet resources, information needed for everyday work is often not directly and seamlessly available. Here we describe a very simple and easily extendable mechanism using Web Services to enrich standard MS Office applications with internet resources. We demonstrate its capabilities by providing a Web-based thesaurus for biological objects, which maps names to database identifiers and vice versa via an appropriate synonym list. The client application ProTag makes these features available in MS Office applications using Smart Tags and Add-Ins. http://services.bio.ifi.lmu.de/prothesaurus/

  10. The quest to make accessibility a corporate article of faith at Microsoft: case study of corporate culture and human resource dimensions.

    PubMed

    Sandler, Leonard A; Blanck, Peter

    2005-01-01

    This case study examines efforts by Microsoft Corporation to enhance the diversity of its workforce and improve the accessibility and usability of its products and services for persons with disabilities. The research explores the relation among the Americans with Disabilities Act of 1990, corporate leadership, attitudes and behaviors towards individuals with disabilities, and dynamics that shape organizational culture at Microsoft. Implications for Microsoft, other employers, researchers, and the disability community are discussed. 2005 John Wiley & Sons, Ltd.

  11. A Windows application for computing standardized mortality ratios and standardized incidence ratios in cohort studies based on calculation of exact person-years at risk.

    PubMed

    Geiss, Karla; Meyer, Martin

    2013-09-01

    Standardized mortality ratios and standardized incidence ratios are widely used in cohort studies to compare mortality or incidence in a study population with that in the general population on an age-time-specific basis, but their computation is not included in standard statistical software packages. Here we present a user-friendly Microsoft Windows program for computing standardized mortality ratios and standardized incidence ratios based on calculation of exact person-years at risk stratified by sex, age and calendar time. The program offers flexible import of different file formats for input data and easy handling of general-population reference rate tables, such as mortality or incidence tables exported from cancer registry databases. The application of the program is illustrated with two examples using empirical data from the Bavarian Cancer Registry. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. A Fast Healthcare Interoperability Resources (FHIR) layer implemented over i2b2.

    PubMed

    Boussadi, Abdelali; Zapletal, Eric

    2017-08-14

    Standards and technical specifications have been developed to define how the information contained in Electronic Health Records (EHRs) should be structured, semantically described, and communicated. Current trends rely on differentiating the representation of data instances from the definition of clinical information models. The dual model approach, which combines a reference model (RM) and a clinical information model (CIM), puts this software design pattern into practice. The most recent initiative, proposed by HL7, is called Fast Healthcare Interoperability Resources (FHIR). The aim of our study was to investigate the feasibility of applying the FHIR standard to model and expose the EHR data of the Georges Pompidou European Hospital (HEGP) Informatics for Integrating Biology and the Bedside (i2b2) clinical data warehouse (CDW). We implemented a FHIR server over i2b2 to expose EHR data through five FHIR resources: DiagnosticReport, MedicationOrder, Patient, Encounter, and Medication. The architecture of the server combines a Data Access Object design pattern with FHIR resource providers implemented using the Java HAPI FHIR API. Two types of queries were tested: query type #1 asks the server to display DiagnosticReport resources for which the diagnosis code is equal to a given ICD-10 code; a total of 80 DiagnosticReport resources, corresponding to 36 patients, were displayed. Query type #2 asks the server to display MedicationOrder resources for which the FHIR Medication identification code is equal to a given code expressed in a French coding system; a total of 503 MedicationOrder resources, corresponding to 290 patients, were displayed. Results were validated by manually comparing the output of each request to the results returned by an ad hoc SQL query. We showed the feasibility of implementing a Java layer over the i2b2 database model to expose the data of the CDW as a set of FHIR resources. An important part of this work was the structural and semantic mapping between the i2b2 model and the FHIR RM; to accomplish this, developers must manually browse the specifications of the FHIR standard. Our source code is freely available and can be adapted for use at other i2b2 sites.
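
    The two query types correspond to standard FHIR REST search interactions, which can be sketched in Python with the requests library. The base URL and codes below are placeholders for illustration, not the HEGP endpoint:

        import requests

        BASE = "http://localhost:8080/fhir"   # placeholder FHIR endpoint

        # Query type #1: DiagnosticReport resources for a given ICD-10 code.
        r1 = requests.get(
            f"{BASE}/DiagnosticReport",
            params={"code": "http://hl7.org/fhir/sid/icd-10|I50.9"},
        )

        # Query type #2: MedicationOrder resources whose medication carries a
        # given code (a French coding-system URI would replace the placeholder).
        r2 = requests.get(
            f"{BASE}/MedicationOrder",
            params={"medication.code": "http://example.org/fr-codes|12345"},
        )

        for bundle in (r1.json(), r2.json()):
            print(bundle.get("resourceType"), bundle.get("total"))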

  13. ATM LAN Emulation: Getting from Here to There.

    ERIC Educational Resources Information Center

    Learn, Larry L., Ed.

    1995-01-01

    Discusses current LAN (local area network) configuration and explains ATM (asynchronous transfer mode) as the future telecommunications transport. Highlights include LAN emulation, which enables the interconnection of legacy LANs and the new ATM environment; virtual LANs; broadcast servers; and standards. (LRW)

  14. An open source Java web application to build self-contained Web GIS sites

    NASA Astrophysics Data System (ADS)

    Zavala Romero, O.; Ahmed, A.; Chassignet, E.; Zavala-Hidalgo, J.

    2014-12-01

    This work describes OWGIS, an open source Java web application that creates Web GIS sites by automatically writing HTML and JavaScript code. OWGIS is configured by XML files that define which layers (geographic datasets) will be displayed on the websites. This project uses several Open Geospatial Consortium standards to request data from typical map servers, such as GeoServer, and is also able to request data from ncWMS servers. The latter allows the display of 4D data stored in the NetCDF file format (widely used for storing environmental model datasets). Features available on sites built with OWGIS include multiple languages, animations, vertical profiles and vertical transects, color palettes, color ranges, and the ability to download data. OWGIS's main users are scientists, such as oceanographers or climate scientists, who store their data in NetCDF files and want to analyze, visualize, share, or compare their data using a website.

  15. Web-based system for surgical planning and simulation

    NASA Astrophysics Data System (ADS)

    Eldeib, Ayman M.; Ahmed, Mohamed N.; Farag, Aly A.; Sites, C. B.

    1998-10-01

    The growing scientific knowledge and rapid progress in medical imaging techniques have led to an increasing demand for better and more efficient methods of remote access to high-performance computer facilities. This paper introduces a web-based telemedicine project that provides interactive tools for surgical simulation and planning. The presented approach makes use of a client-server architecture based on new internet technology, where clients use an ordinary web browser to view, send, receive and manipulate patients' medical records, while the server uses the supercomputer facility to generate online semi-automatic segmentation, 3D visualization, surgical simulation/planning, and navigation of neuroendoscopic procedures. The supercomputer (SGI ONYX 1000) is located at the Computer Vision and Image Processing Lab, University of Louisville, Kentucky. This system is under development in cooperation with the Department of Neurological Surgery, Alliant Health Systems, Louisville, Kentucky. The server is connected via a network to the Picture Archiving and Communication System at Alliant Health Systems through a DICOM standard interface that enables authorized clients to access patients' images from different medical modalities.

  16. Performance evaluation of continuity of care records (CCRs): parsing models in a mobile health management system.

    PubMed

    Chen, Hung-Ming; Liou, Yong-Zan

    2014-10-01

    In a mobile health management system, mobile devices act as the hosting devices for personal health record (PHR) applications, and healthcare servers are built to exchange and analyze PHRs. One of the most popular PHR standards is the continuity of care record (CCR), which is expressed in XML format. However, parsing is an expensive operation that can degrade XML processing performance. Hence, the objective of this study was to identify the different operational and performance characteristics of CCR parsing models, including the XML DOM parser, the SAX parser, the PULL parser, and the JSON parser applied to JSON data converted from XML-based CCRs. Developers can thus make sensible choices about how their target PHR applications parse CCRs when using mobile devices or servers with different system resources. Simulation experiments across four case studies were conducted to compare parsing performance on Android mobile devices and on a server with large quantities of CCR data.
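
    The comparison the study performs can be illustrated with Python's built-in parsers, timing DOM, SAX and JSON handling of the same payload. The document below is a toy stand-in for a CCR, not a real record:

        import json
        import time
        import xml.dom.minidom
        import xml.sax

        xml_doc = "<CCR>" + "<Result><Value>1</Value></Result>" * 10000 + "</CCR>"
        json_doc = json.dumps({"CCR": {"Result": [{"Value": "1"}] * 10000}})

        class CountHandler(xml.sax.ContentHandler):
            """SAX handler that only counts elements (event-driven, low memory)."""
            def __init__(self):
                super().__init__()
                self.count = 0
            def startElement(self, name, attrs):
                self.count += 1

        def timed(label, fn):
            t0 = time.perf_counter()
            fn()
            print(f"{label}: {time.perf_counter() - t0:.3f} s")

        timed("DOM ", lambda: xml.dom.minidom.parseString(xml_doc))
        timed("SAX ", lambda: xml.sax.parseString(xml_doc.encode(), CountHandler()))
        timed("JSON", lambda: json.loads(json_doc))

    On most machines the event-driven SAX pass and the JSON pass complete well ahead of the tree-building DOM parse, which is the kind of trade-off the paper quantifies on Android hardware.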

  17. Informatics in radiology (infoRAD): Vendor-neutral case input into a server-based digital teaching file system.

    PubMed

    Kamauu, Aaron W C; DuVall, Scott L; Robison, Reid J; Liimatta, Andrew P; Wiggins, Richard H; Avrin, David E

    2006-01-01

    Although digital teaching files are important to radiology education, there are no current satisfactory solutions for export of Digital Imaging and Communications in Medicine (DICOM) images from picture archiving and communication systems (PACS) in desktop publishing format. A vendor-neutral digital teaching file, the Radiology Interesting Case Server (RadICS), offers an efficient tool for harvesting interesting cases from PACS without requiring modifications of the PACS configurations. Radiologists push imaging studies from PACS to RadICS via the standard DICOM Send process, and the RadICS server automatically converts the DICOM images into the Joint Photographic Experts Group format, a common desktop publishing format. They can then select key images and create an interesting case series at the PACS workstation. RadICS was tested successfully against multiple unmodified commercial PACS. Using RadICS, radiologists are able to harvest and author interesting cases at the point of clinical interpretation with minimal disruption in clinical work flow. RSNA, 2006
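
    The core transformation RadICS automates (DICOM in, desktop-publishing JPEG out) can be sketched in a few lines of Python, assuming the pydicom and Pillow libraries. This is an illustration of the conversion step, not the RadICS source:

        import numpy as np
        import pydicom
        from PIL import Image

        ds = pydicom.dcmread("case001.dcm")        # input path is a placeholder
        pixels = ds.pixel_array.astype(np.float32)

        # Window the raw detector values into the 8-bit range JPEG expects.
        lo, hi = float(pixels.min()), float(pixels.max())
        eight_bit = ((pixels - lo) / max(hi - lo, 1.0) * 255).astype(np.uint8)

        Image.fromarray(eight_bit).save("case001.jpg", quality=90)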

  18. Towards Direct Manipulation and Remixing of Massive Data: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2012-04-01

    Complex analytics on "big data" is one of the core challenges of current Earth science, generating strong requirements for on-demand processing and filtering of massive data sets. Issues under discussion include flexibility, performance, scalability, and the heterogeneity of the information types involved. In other domains, high-level query languages (such as those offered by database systems) have proven successful in the quest for flexible, scalable data access interfaces to massive amounts of data. However, due to the lack of support for many Earth science data structures, database systems are only used for registries and catalogs, not for the bulk of spatio-temporal data. One core information category in this field is coverage data. ISO 19123 defines coverages, simplifying somewhat, as representations of a "space-time varying phenomenon". This model can express a large class of Earth science data structures, including rectified and non-rectified rasters, curvilinear grids, point clouds, TINs, general meshes, trajectories, surfaces, and solids. This abstract definition, which is too high-level to establish interoperability, is concretized by the OGC GML 3.2.1 Application Schema for Coverages standard into an interoperable representation. The OGC Web Coverage Processing Service (WCPS) standard defines a declarative query language on multi-dimensional raster-type coverages, such as 1D in-situ sensor time series, 2D EO imagery, 3D x/y/t image time series and x/y/z geophysical data, and 4D x/y/z/t climate and ocean data. Hence, important ingredients for versatile coverage retrieval are in place; however, this potential has not been fully unleashed by service architectures up to now. The EU FP7-INFRA project EarthServer, launched in September 2011, aims at enabling standards-based on-demand analytics over the Web for Earth science data, based on an integration of W3C XQuery for alphanumeric data and OGC WCPS for raster data. Ultimately, EarthServer will support all OGC coverage types. The platform used by EarthServer is the rasdaman raster database system. To exploit heterogeneous multi-parallel platforms, automatic request distribution and orchestration is being established. Client toolkits are under development that will allow quick composition of bespoke interactive clients, ranging from mobile devices over Web clients to high-end immersive virtual reality. The EarthServer platform has been deployed in six large-scale data centres with the aim of setting up Lighthouse Applications addressing all the Earth sciences, including satellite and airborne earth observation as well as use cases from atmosphere, ocean, snow, and ice monitoring, and geology on Earth and Mars. These services, each of which will ultimately host at least 100 TB, will form a peer cloud with distributed query processing for arbitrarily mixing database and in-situ access. With its ability to directly manipulate, analyze and remix massive data, the goal of EarthServer is to lift data providers' semantic level from data stewardship to service stewardship.
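
    To make the retrieval model concrete, here is a sketch of a WCPS request issued from Python. The query text follows OGC WCPS syntax; the endpoint, coverage name and time-axis label are assumptions for illustration (rasdaman's petascope frontend, for instance, accepts WCPS through a key-value "query" parameter):

        import requests

        ENDPOINT = "http://example.org/rasdaman/ows"   # placeholder service URL

        # WCPS: slice a 3D x/y/t coverage at one time step, encode as PNG.
        wcps = (
            'for c in (SeaSurfaceTemperature) '
            'return encode(c[ansi("2011-09-01")], "image/png")'
        )

        resp = requests.get(
            ENDPOINT,
            params={"service": "WCS", "version": "2.0.1",
                    "request": "ProcessCoverages", "query": wcps},
        )
        with open("slice.png", "wb") as f:
            f.write(resp.content)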

  19. CMD: a Cotton Microsatellite Database resource for Gossypium genomics

    PubMed Central

    Blenda, Anna; Scheffler, Jodi; Scheffler, Brian; Palmer, Michael; Lacape, Jean-Marc; Yu, John Z; Jesudurai, Christopher; Jung, Sook; Muthukumar, Sriram; Yellambalase, Preetham; Ficklin, Stephen; Staton, Margaret; Eshelman, Robert; Ulloa, Mauricio; Saha, Sukumar; Burr, Ben; Liu, Shaolin; Zhang, Tianzhen; Fang, Deqiu; Pepper, Alan; Kumpatla, Siva; Jacobs, John; Tomkins, Jeff; Cantrell, Roy; Main, Dorrie

    2006-01-01

    Background The Cotton Microsatellite Database (CMD) is a curated and integrated web-based relational database providing centralized access to publicly available cotton microsatellites, an invaluable resource for basic and applied research in cotton breeding. Description At present CMD contains publication, sequence, primer, mapping and homology data for nine major cotton microsatellite projects, collectively representing 5,484 microsatellites. In addition, CMD displays data for three of the microsatellite projects that have been screened against a panel of core germplasm. The standardized panel consists of 12 diverse genotypes including genetic standards, mapping parents, BAC donors, subgenome representatives, unique breeding lines, exotic introgression sources, and contemporary Upland cottons with significant acreage. A suite of online microsatellite data mining tools is accessible at CMD. These include an SSR server, which identifies microsatellites, primers, open reading frames, and GC content of uploaded sequences; BLAST and FASTA servers, providing sequence similarity searches against the existing cotton SSR sequences and primers; a CAP3 server to assemble EST sequences into longer transcripts prior to mining for SSRs; and CMap, a viewer for comparing cotton SSR maps. Conclusion The collection of publicly available cotton SSR markers in a centralized, readily accessible and curated web-enabled database provides more efficient utilization of microsatellite resources and will help accelerate basic and applied research in molecular breeding and genetic mapping in Gossypium spp. PMID:16737546
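
    The SSR-mining step performed by CMD's server-side tools can be illustrated with a short, self-contained Python sketch that scans a sequence for di- and trinucleotide repeats by regular expression. This is a simplification of what a full SSR server does, shown only to make the operation concrete:

        import re

        def find_ssrs(seq, min_repeats=5):
            """Yield (motif, copies, start) for di-/trinucleotide repeats."""
            seq = seq.upper()
            for unit in (2, 3):
                pattern = r"([ACGT]{%d})\1{%d,}" % (unit, min_repeats - 1)
                for m in re.finditer(pattern, seq):
                    motif = m.group(1)
                    if len(set(motif)) > 1:        # skip homopolymer runs
                        yield motif, len(m.group(0)) // unit, m.start()

        demo = "TTGACA" + "AT" * 8 + "GGC" + "CAG" * 6 + "TTAG"
        for motif, copies, start in find_ssrs(demo):
            print(f"({motif}){copies} at position {start}")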

  20. A Three-fold Outlook of the Ultra-Efficient Engine Technology Program Office (UEET)

    NASA Technical Reports Server (NTRS)

    Graham, La Quilia E.

    2004-01-01

    The Ultra-Efficient Engine Technology (UEET) Office at NASA Glenn Research Center is a part of the Aeronautics Directorate. Its vision is to develop and hand off revolutionary turbine engine propulsion technologies that will enable future-generation vehicles over a wide range of flight speeds. There are seven different technology area projects within UEET. During my tenure at NASA Glenn Research Center, my assignment was to assist three different areas of UEET simultaneously. I worked with Kathy Zona in Education Outreach, Lynn Boukalik in Knowledge Management, and Denise Busch in Financial Management. All of my tasks were related to the business side of UEET. As an intern with Education Outreach, I created a word search to partner with an exhibit of a turbine engine developed out of the UEET office. This exhibit is a portable model that is presented to students of varying ages. The word search complies with the National Standards for Education, which are part of every science, engineering, and technology teacher's curriculum. I also updated a conference planning/workshop Excel spreadsheet for the UEET Office. I collected and input facility overviews from various venues, both on and off site, to determine where to hold upcoming conferences. I then documented which facilities were compliant with the Federal Emergency Management Agency's (FEMA) Hotel and Motel Fire Safety Act of 1990. The second area in which I worked was Knowledge Management. UEET maintains a large knowledge management system online, with extensive documentation that continually needs reviewing, updating, and archiving. Knowledge management is the ability to bring individual or team knowledge to an organizational level so that the information can be stored, shared, reviewed, and archived. Livelink and a secure server are the knowledge management systems that UEET utilizes. Through these systems, I was able to obtain the documents needed for archiving. My assignment was to obtain intellectual property, including reports, presentations, and any other documents related to the project. My next task was to document the author, date of creation, and all other properties of each document. To archive these documents I worked extensively with Microsoft Excel. The third area in which I worked was Financial Management, where I learned different financial accounting systems such as the SAP business accounting system. I also learned the best ways to present financial data and shadowed my mentor as she presented financial data to both UEET's project management and the Resources Analysis and Management Office (RAMO). I analyzed the June 2004 financial data of UEET and used Microsoft Excel to input the results. This process made it easier to present the full cost of the project for the month of June. In addition, I assisted in the end-of-year 2003 reconciliation of UEET purchases.

  1. Development of a 3D WebGIS System for Retrieving and Visualizing CityGML Data Based on their Geometric and Semantic Characteristics by Using Free and Open Source Technology

    NASA Astrophysics Data System (ADS)

    Pispidikis, I.; Dimopoulou, E.

    2016-10-01

    CityGML is considered an optimal standard for representing 3D city models. However, international experience has shown that visualizing such models on the web is quite difficult, due to the large size of the data and the complexity of CityGML. In the context of this paper, a 3D WebGIS application is therefore developed to successfully retrieve and visualize CityGML data in accordance with their respective geometric and semantic characteristics. Furthermore, the available web technologies and the architecture of WebGIS systems, as reported in international experience, are investigated so that they can be utilized in the most appropriate way for the purposes of this paper. Specifically, a PostgreSQL/PostGIS database is used, in compliance with the 3DCityDB schema. At the server tier, Apache HTTP Server and GeoServer are employed, together with the server-side programming language PHP. At the client tier, which implements the interface of the application, the following technologies are used: jQuery, AJAX, JavaScript, HTML5, WebGL and Ol3-Cesium. Finally, it is worth mentioning that the application's primary objectives are a user-friendly interface and a fully open source development.
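
    As a hedged sketch of the data-access layer in such a stack, the following Python snippet queries a 3DCityDB-style PostgreSQL/PostGIS schema. The table and column names follow the published 3DCityDB convention but should be treated as assumptions here; the paper's own server-side access layer is written in PHP:

        import psycopg2

        conn = psycopg2.connect("dbname=citydb user=postgres")  # placeholder
        cur = conn.cursor()

        # Count stored city objects per object class (3DCityDB-style tables).
        cur.execute("""
            SELECT objectclass_id, count(*)
            FROM cityobject
            GROUP BY objectclass_id
            ORDER BY count(*) DESC
        """)
        for objectclass_id, n in cur.fetchall():
            print(objectclass_id, n)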

  2. Integrated multimodal human-computer interface and augmented reality for interactive display applications

    NASA Astrophysics Data System (ADS)

    Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.

    2000-08-01

    We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eye tracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.

  3. Web-based access to near real-time and archived high-density time-series data: cyber infrastructure challenges & developments in the open-source Waveform Server

    NASA Astrophysics Data System (ADS)

    Reyes, J. C.; Vernon, F. L.; Newman, R. L.; Steidl, J. H.

    2010-12-01

    The Waveform Server is an interactive web-based interface to multi-station, multi-sensor and multi-channel high-density time-series data stored in Center for Seismic Studies (CSS) 3.0 schema relational databases (Newman et al., 2009). In the last twelve months, based on expanded specifications and current user feedback, both the server-side infrastructure and the client-side interface have been extensively rewritten. The Python Twisted server-side code base has been fundamentally modified and now presents waveform data stored in cluster-based databases using a multi-threaded architecture, in addition to supporting the pre-existing single-database model. This allows interactive web-based access to high-density (broadband @ 40 Hz to strong motion @ 200 Hz) waveform data that can span multiple years, the common lifetime of broadband seismic networks. The client-side interface expands on its use of simple JSON-based AJAX queries to incorporate a variety of user interface (UI) improvements, including standardized calendars for defining time ranges, on-the-fly data calibration to display SI-unit data, and increased rendering speed. This presentation will outline the various cyber infrastructure challenges we have faced while developing this application, the use cases currently in existence, and the limitations of web-based application development.

  4. Alignment-Annotator web server: rendering and annotating sequence alignments.

    PubMed

    Gille, Christoph; Fähling, Michael; Weyand, Birgit; Wieland, Thomas; Gille, Andreas

    2014-07-01

    Alignment-Annotator is a novel web service designed to generate interactive views of annotated nucleotide and amino acid sequence alignments (i) de novo and (ii) embedded in other software. All computations are performed server side. Interactivity is implemented in HTML5, a language native to web browsers. The alignment is initially displayed using default settings and can be modified with the graphical user interfaces. For example, individual sequences can be reordered or deleted using drag and drop, amino acid color code schemes can be applied and annotations can be added. Annotations can be made manually or imported (from BioDAS servers, UniProt, the Catalytic Site Atlas and the PDB). Some edits take immediate effect while others require server interaction and may take a few seconds to execute. The final alignment document can be downloaded as a zip archive containing the HTML files. Because of the use of HTML, the resulting interactive alignment can be viewed on any platform, including Windows, Mac OS X, Linux, Android and iOS, in any standard web browser. Importantly, neither plug-ins nor Java are required, and Alignment-Annotator therefore represents the first interactive browser-based alignment visualization. http://www.bioinformatics.org/strap/aa/ and http://strap.charite.de/aa/. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. Alignment-Annotator web server: rendering and annotating sequence alignments

    PubMed Central

    Gille, Christoph; Fähling, Michael; Weyand, Birgit; Wieland, Thomas; Gille, Andreas

    2014-01-01

    Alignment-Annotator is a novel web service designed to generate interactive views of annotated nucleotide and amino acid sequence alignments (i) de novo and (ii) embedded in other software. All computations are performed server side. Interactivity is implemented in HTML5, a language native to web browsers. The alignment is initially displayed using default settings and can be modified with the graphical user interfaces. For example, individual sequences can be reordered or deleted using drag and drop, amino acid color code schemes can be applied and annotations can be added. Annotations can be made manually or imported (from BioDAS servers, UniProt, the Catalytic Site Atlas and the PDB). Some edits take immediate effect while others require server interaction and may take a few seconds to execute. The final alignment document can be downloaded as a zip archive containing the HTML files. Because of the use of HTML, the resulting interactive alignment can be viewed on any platform, including Windows, Mac OS X, Linux, Android and iOS, in any standard web browser. Importantly, neither plug-ins nor Java are required, and Alignment-Annotator therefore represents the first interactive browser-based alignment visualization. Availability: http://www.bioinformatics.org/strap/aa/ and http://strap.charite.de/aa/. PMID:24813445

  6. Dynamic Interactive Educational Diabetes Simulations Using the World Wide Web: An Experience of More Than 15 Years with AIDA Online

    PubMed Central

    Lehmann, Eldon D.; DeWolf, Dennis K.; Novotny, Christopher A.; Reed, Karen; Gotwals, Robert R.

    2014-01-01

    Background. AIDA is a widely available downloadable educational simulator of glucose-insulin interaction in diabetes. Methods. A web-based version of AIDA was developed that utilises a server-based architecture with HTML FORM commands to submit numerical data from a web-browser client to a remote web server. AIDA online, located on a remote server, passes the received data through Perl scripts which interactively produce 24 hr insulin and glucose simulations. Results. AIDA online allows users to modify the insulin regimen and diet of 40 different prestored “virtual diabetic patients” on the internet or create new “patients” with user-generated regimens. Multiple simulations can be run, with graphical results viewed via a standard web-browser window. To date, over 637,500 diabetes simulations have been run at AIDA online, from all over the world. Conclusions. AIDA online's functionality is similar to the downloadable AIDA program, but the mode of implementation and usage is different. An advantage to utilising a server-based application is the flexibility that can be offered. New modules can be added quickly to the online simulator. This has facilitated the development of refinements to AIDA online, which have instantaneously become available around the world, with no further local downloads or installations being required. PMID:24511312

  7. GeneSilico protein structure prediction meta-server.

    PubMed

    Kurowski, Michal A; Bujnicki, Janusz M

    2003-07-01

    Rigorous assessments of protein structure prediction have demonstrated that fold recognition methods can identify remote similarities between proteins when standard sequence search methods fail. It has been shown that the accuracy of predictions is improved when refined multiple sequence alignments are used instead of single sequences and if different methods are combined to generate a consensus model. There are several meta-servers available that integrate protein structure predictions performed by various methods, but they do not allow for submission of user-defined multiple sequence alignments and they seldom offer confidentiality of the results. We developed a novel WWW gateway for protein structure prediction, which combines the useful features of other meta-servers available, but with much greater flexibility of the input. The user may submit an amino acid sequence or a multiple sequence alignment to a set of methods for primary, secondary and tertiary structure prediction. Fold-recognition results (target-template alignments) are converted into full-atom 3D models and the quality of these models is uniformly assessed. A consensus between different FR methods is also inferred. The results are conveniently presented on-line on a single web page over a secure, password-protected connection. The GeneSilico protein structure prediction meta-server is freely available for academic users at http://genesilico.pl/meta.

  8. GeneSilico protein structure prediction meta-server

    PubMed Central

    Kurowski, Michal A.; Bujnicki, Janusz M.

    2003-01-01

    Rigorous assessments of protein structure prediction have demonstrated that fold recognition methods can identify remote similarities between proteins when standard sequence search methods fail. It has been shown that the accuracy of predictions is improved when refined multiple sequence alignments are used instead of single sequences and if different methods are combined to generate a consensus model. There are several meta-servers available that integrate protein structure predictions performed by various methods, but they do not allow for submission of user-defined multiple sequence alignments and they seldom offer confidentiality of the results. We developed a novel WWW gateway for protein structure prediction, which combines the useful features of other meta-servers available, but with much greater flexibility of the input. The user may submit an amino acid sequence or a multiple sequence alignment to a set of methods for primary, secondary and tertiary structure prediction. Fold-recognition results (target-template alignments) are converted into full-atom 3D models and the quality of these models is uniformly assessed. A consensus between different FR methods is also inferred. The results are conveniently presented on-line on a single web page over a secure, password-protected connection. The GeneSilico protein structure prediction meta-server is freely available for academic users at http://genesilico.pl/meta. PMID:12824313

  9. Dynamic Interactive Educational Diabetes Simulations Using the World Wide Web: An Experience of More Than 15 Years with AIDA Online.

    PubMed

    Lehmann, Eldon D; Dewolf, Dennis K; Novotny, Christopher A; Reed, Karen; Gotwals, Robert R

    2014-01-01

    Background. AIDA is a widely available downloadable educational simulator of glucose-insulin interaction in diabetes. Methods. A web-based version of AIDA was developed that utilises a server-based architecture with HTML FORM commands to submit numerical data from a web-browser client to a remote web server. AIDA online, located on a remote server, passes the received data through Perl scripts which interactively produce 24 hr insulin and glucose simulations. Results. AIDA online allows users to modify the insulin regimen and diet of 40 different prestored "virtual diabetic patients" on the internet or create new "patients" with user-generated regimens. Multiple simulations can be run, with graphical results viewed via a standard web-browser window. To date, over 637,500 diabetes simulations have been run at AIDA online, from all over the world. Conclusions. AIDA online's functionality is similar to the downloadable AIDA program, but the mode of implementation and usage is different. An advantage to utilising a server-based application is the flexibility that can be offered. New modules can be added quickly to the online simulator. This has facilitated the development of refinements to AIDA online, which have instantaneously become available around the world, with no further local downloads or installations being required.

  10. Development of a novel SCADA system for laboratory testing.

    PubMed

    Patel, M; Cole, G R; Pryor, T L; Wilmot, N A

    2004-07-01

    This document summarizes a supervisory control and data acquisition (SCADA) system that allows communication with, and control of the output of, various I/O devices in the renewable energy systems and components test facility RESLab. This SCADA system differs from traditional SCADA systems in that it supports a continuously changing operating environment, depending on the test to be performed. The SCADA system is based on the concept of having one master I/O server and multiple client computer systems. This paper describes the main features and advantages of this dynamic SCADA system, the connections of various field devices to the master I/O server, the device servers, and numerous software features used in the system. The system is based on the graphical programming language "LabVIEW" and its "Datalogging and Supervisory Control" (DSC) module. The DSC module supports a real-time database called the "tag engine," which performs the I/O operations with all field devices attached to the master I/O server and communicates with the other tag engines running on the client computers connected via a local area network. Generic and detailed communication block diagrams illustrating the hierarchical structure of this SCADA system are presented. The flow diagram outlining a complete test performed using this system in one of its standard configurations is described.

  11. Intellectual Production Supervision Perform based on RFID Smart Electricity Meter

    NASA Astrophysics Data System (ADS)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    This project develops an RFID smart electricity meter production supervision and project management system. The system is designed to meet the project schedule, quality, and cost information management requirements of energy meter production supervision based on RFID smart meters, to provide supervision engineers and project managers with more comprehensive, timely, and accurate quantitative information for management decisions, and to provide technical documentation for the product manufacturing stage. The development of the production supervision project management system for the RFID smart meter project is discussed from the perspectives of scheme analysis, design, implementation, and testing. Building on the main business applications and management modes at the current stage, the system focuses on managing progress, quality, and cost information for RFID smart electricity meters. The paper introduces the design scheme of the system: an overall client/server architecture, a generic graphical client user interface for supervision project management and interactive display of transaction information, and a server implementing the main program. The system is programmed in the C# language on the .NET runtime; the client and server platforms use the Windows operating system, and the database server software is Oracle. The overall platform supports mainstream information standards and has good scalability.

  12. Microsoft's Vista: Guarantees People with Special Needs Access to Computers

    ERIC Educational Resources Information Center

    Williams, John M.

    2006-01-01

    In this article, the author discusses the accessibility features of Microsoft's Windows Vista. One of the most innovative aspects of Windows Vista is a new accessibility and automated testing model called Microsoft UI Automation, which reduces development costs not only for accessible and assistive technology (AT) developers, but also for…

  13. Microsoft Excel Software Usage for Teaching Science and Engineering Curriculum

    ERIC Educational Resources Information Center

    Singh, Gurmukh; Siddiqui, Khalid

    2009-01-01

    In this article, our main objective is to present the use of Microsoft Software Excel 2007/2003 for teaching college and university level curriculum in science and engineering. In particular, we discuss two interesting and fascinating examples of interactive applications of Microsoft Excel targeted for undergraduate students in: 1) computational…

  14. Challenging Google, Microsoft Unveils a Search Tool for Scholarly Articles

    ERIC Educational Resources Information Center

    Carlson, Scott

    2006-01-01

    Microsoft has introduced a new search tool to help people find scholarly articles online. The service, which includes journal articles from prominent academic societies and publishers, puts Microsoft in direct competition with Google Scholar. The new free search tool, which should work on most Web browsers, is called Windows Live Academic Search…

  15. Microsoft's Book-Search Project Has a Surprise Ending

    ERIC Educational Resources Information Center

    Foster, Andrea L.

    2008-01-01

    It is hard to imagine a Microsoft venture falling under the weight of a competitor. That's the post-mortem offered by many academic librarians as they ponder the software giant's recent and sudden announcement that it is shutting down its book-digitization project. The librarians' conclusion: Google did it. Microsoft quietly revealed in May that…

  16. QNAP 1263U Network Attached Storage (NAS)/ Storage Area Network (SAN) Device Users Guide

    DTIC Science & Technology

    2016-11-01

    standard Ethernet network. Operating either a NAS or SAN is vital for the integrity of the data stored on the drives found in the device. Redundant...speed of the network itself. Many standards are in place for transferring data, including more standard ones such as File Transfer Protocol and Server ...following are the procedures for connecting to the NAS administrative web page: 1) Open a web browser and browse to 192.168.40.8:8080. 2) Enter the

  17. Some Programs Should Not Run on Laptops - Providing Programmatic Access to Applications Via Web Services

    NASA Astrophysics Data System (ADS)

    Gupta, V.; Gupta, N.; Gupta, S.; Field, E.; Maechling, P.

    2003-12-01

    Modern laptop computers, and personal computers, can provide capabilities that are, in many ways, comparable to workstations or departmental servers. However, this doesn't mean we should run all computations on our local computers. We have identified several situations in which it is preferable to implement our seismological application programs in a distributed, server-based, computing model. In this model, application programs on the user's laptop, or local computer, invoke programs that run on an organizational server, and the results are returned to the invoking system. Situations in which a server-based architecture may be preferred include: (a) a program is written in a language, or written for an operating environment, that is unsupported on the local computer, (b) software libraries or utilities required to execute a program are not available on the user's computer, (c) a computational program is physically too large, or computationally too expensive, to run on a user's computer, (d) a user community wants to enforce a consistent method of performing a computation by standardizing on a single implementation of a program, and (e) the computational program may require current information that is not available to all client computers. Until recently, distributed, server-based, computational capabilities were implemented using client/server architectures. In these architectures, client programs were often written in the same language, and they executed in the same computing environment, as the servers. Recently, a new distributed computational model, called Web Services, has been developed. Web Services are based on Internet standards such as XML, SOAP, WSDL, and UDDI. Web Services offer the promise of platform- and language-independent distributed computing. To investigate this new computational model, and to provide useful services to the SCEC Community, we have implemented several computational and utility programs using a Web Service architecture. We have hosted these Web Services as a part of the SCEC Community Modeling Environment (SCEC/CME) ITR Project (http://www.scec.org/cme). We have implemented Web Services for several of the reasons cited previously. For example, we implemented a FORTRAN-based Earthquake Rupture Forecast (ERF) as a Web Service for use by client computers that don't support a FORTRAN runtime environment. We implemented a Generic Mapping Tool (GMT) Web Service for use by systems that don't have local access to GMT. We implemented a Hazard Map Calculator Web Service to execute hazard calculations that are too computationally intensive to run on a local system. We implemented a Coordinate Conversion Web Service to enforce a standard and consistent method for converting between UTM and Lat/Lon. Our experience developing these services indicates both strengths and weaknesses in current Web Service technology. Client programs that utilize Web Services typically need network access, a significant disadvantage at times. Programs with simple input and output parameters were the easiest to implement as Web Services, while programs with complex parameter types required a significant amount of additional development. We also noted that Web Services are very data-oriented, and adapting object-oriented software into the Web Service model proved problematic. Also, the Web Service approach of converting data types into XML format for network transmission has significant inefficiencies for some data sets.
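
    As an illustration of the client side of such an architecture, here is a sketch using the Python zeep SOAP library against a hypothetical WSDL for the coordinate-conversion service. The URL, operation name and parameters are invented for illustration and do not reproduce the actual SCEC/CME interfaces:

        from zeep import Client

        # Hypothetical WSDL location; real SCEC/CME endpoints differ.
        WSDL = "http://example.org/scec/CoordConversion?wsdl"
        client = Client(WSDL)

        # Convert a UTM coordinate to latitude/longitude on the server, so the
        # conversion method stays standardized in one implementation.
        result = client.service.UTMToLatLon(easting=355000.0,
                                            northing=3777000.0,
                                            zone=11)
        print(result)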

  18. Environmental Monitoring Using Sensor Networks

    NASA Astrophysics Data System (ADS)

    Yang, J.; Zhang, C.; Li, X.; Huang, Y.; Fu, S.; Acevedo, M. F.

    2008-12-01

    Environmental observatories, consisting of a variety of sensor systems, computational resources and informatics, are important for us to observe, model, predict, and ultimately help preserve the health of nature. The commoditization and proliferation of coin- to palm-sized wireless sensors will allow environmental monitoring with unprecedented fine spatial and temporal resolution. Once scattered around, these sensors can identify themselves, locate their positions, describe their functions, and self-organize into a network. They communicate through wireless channels with nearby sensors and transmit data through multi-hop protocols to a gateway, which can forward information to a remote data server. In this project, we describe an environmental observatory called the Texas Environmental Observatory (TEO) that incorporates a sensor network system with intertwined wired and wireless sensors. We are enhancing and expanding the existing wired weather stations to include wireless sensor networks (WSNs) and telemetry using solar-powered cellular modems. The new WSNs will monitor soil moisture and support long-term hydrologic modeling. Hydrologic models are helpful in predicting how changes in land cover translate into changes in the stream flow regime. These models require inputs that are difficult to measure over large areas, especially variables related to storm events, such as antecedent soil moisture conditions and rainfall amount and intensity. This will also contribute to improving rainfall estimation from meteorological radar data and enhancing hydrological forecasts. Sensor data are transmitted from the monitoring site to a Central Data Collection (CDC) Server. We incorporate a GPRS modem for wireless telemetry, a single-board computer (SBC) as a Remote Field Gateway (RFG) server, and a WSN for distributed soil moisture monitoring. The RFG provides effective control, management, and coordination of two independent sensor systems, i.e., a traditional datalogger-based wired sensor system and the WSN-based wireless sensor system. The RFG also supports remote manipulation of the devices in the field, such as the SBC, datalogger, and WSN. Sensor data collected from the distributed monitoring stations are stored in a database (DB) Server. The CDC Server acts as an intermediate component to hide the heterogeneity of different devices and support the data validation required by the DB Server. Daemon programs running on the CDC Server pre-process the data before insertion into the database and periodically perform synchronization tasks. An SWE-compliant data repository is installed to enable data exchange, accepting data from both the internal DB Server and external sources through OGC web services. The web portal, TEO Online, serves as a user-friendly interface for data visualization, analysis, synthesis, modeling, and K-12 educational outreach activities. It also provides useful capabilities for system developers and operators to remotely monitor system status and update software and system configuration, which greatly simplifies system debugging and maintenance. We also implement Sensor Observation Services (SOS) at this layer, conforming to the SWE standard to facilitate data exchange. The standard SensorML/O&M data representation makes it easy to integrate our sensor data into existing Geographic Information Systems (GIS) web services and to exchange the data with other organizations.
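
    A hedged sketch of the kind of request an SOS client would issue against such a repository, using OGC SOS key-value-pair parameters over HTTP. The endpoint, offering and property names are placeholders, not the TEO service:

        import requests

        SOS_URL = "http://example.org/sos"   # placeholder endpoint

        params = {
            "service": "SOS",
            "version": "1.0.0",
            "request": "GetObservation",
            "offering": "SOIL_MOISTURE",                       # assumed id
            "observedProperty": "urn:ogc:def:phenomenon:soilMoisture",
            "responseFormat": 'text/xml;subtype="om/1.0.0"',
        }
        resp = requests.get(SOS_URL, params=params)
        print(resp.status_code, resp.headers.get("Content-Type"))
        # resp.text would contain an O&M observation document to parse.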

  19. Transnational Private Authority in Education Policy in Jordan and South Africa: The Case of Microsoft Corporation

    ERIC Educational Resources Information Center

    Bhanji, Zahra

    2012-01-01

    The purpose of this article is to explore Microsoft Corporation as a new international actor shaping educational reforms and practices. This study examines how the implementation of Microsoft's global Partners in Learning (PiL) program varied and was mediated by national politics and national institutional practices in two different contexts,…

  20. Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Fisher, W.; Yoksas, T.

    2014-12-01

    Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high student expectations. These changes are upending traditional approaches to accessing and using data and software. It is clear that Unidata's products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms, and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks providing simple capabilities. Chunks can be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our initial efforts to deploy a subset of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of TDS, McIDAS ADDE and AWIPS II data servers and the Integrated Data Viewer (IDV) visualization tool.

  1. Spatial Information Processing: Standards-Based Open Source Visualization Technology

    NASA Astrophysics Data System (ADS)

    Hogan, P.

    2009-12-01

    Spatial information intelligence is a global issue that will increasingly affect our ability to survive as a species. Collectively we must better appreciate the complex relationships that make life on Earth possible. Providing spatial information in its native context can accelerate our ability to process that information. To maximize this ability to process information, three basic elements are required: data delivery (server technology), data access (client technology), and data processing (information intelligence). NASA World Wind provides open source client and server technologies based on open standards. The possibilities for data processing and data sharing are enhanced by this inclusive infrastructure for geographic information. It is interesting that this open source and open standards approach, unfettered by proprietary constraints, simultaneously provides for entirely proprietary use of this same technology. 1. WHY WORLD WIND? NASA World Wind began as a single program with specific functionality: to deliver NASA content. But as the possibilities for virtual globe technology became more apparent, we found that while enabling a new class of information technology, we were also getting in the way. Researchers, developers and even users expressed their desire for World Wind functionality in ways that would serve their specific needs. They want it in their web pages. They want to add their own features. They want to manage their own data. They told us that only with this kind of flexibility could their objectives and the potential for this technology be truly realized. World Wind client technology is a set of development tools, a software development kit (SDK) that allows a software engineer to create applications requiring geographic visualization technology. 2. MODULAR COMPONENTRY Accelerated evolution of a technology requires that the essential elements of that technology be modular components, such that each can advance independently of the others. World Wind therefore changed its mission from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating a single program, World Wind is a suite of components that can be selectively used in any number of programs. World Wind technology can be a part of any application, or it can be a window in a web page. Or it can be extended with additional functionality by application and web developers. World Wind makes it possible to include virtual globe visualization and server technology in support of any objective. The world community can continually benefit from advances made in the technology by NASA in concert with the world community. 3. OPEN SOURCE AND OPEN STANDARDS NASA World Wind is NASA Open Source software. This means that the source code is fully accessible for anyone to freely use, even in association with proprietary technology. Imagery and other data provided by the World Wind servers reside in the public domain, including the data server technology itself. This allows others to deliver their own geospatial data and to provide custom solutions based on users' specific needs.

  2. Knowledge Glyphs: Visualization Theory Development to Support C2 Practice

    DTIC Science & Technology

    2006-03-01

    interface's graphic structure (Calder and Linton, 2003). • 'Glyphs' as components of a typographical set (Microsoft Typography Standards). • 'DataGlyphs...MOOTW) factors MIL STD 2525's symbology set was designed for application in the context of geospatial representations - i.e., geographical maps. It is

  3. CANEapp: a user-friendly application for automated next generation transcriptomic data analysis.

    PubMed

    Velmeshev, Dmitry; Lally, Patrick; Magistri, Marco; Faghihi, Mohammad Ali

    2016-01-13

    Next generation sequencing (NGS) technologies are indispensable for molecular biology research, but data analysis represents the bottleneck in their application. Users need to be familiar with computer terminal commands, the Linux environment, and various software tools and scripts. Analysis workflows have to be optimized and experimentally validated to extract biologically meaningful data. Moreover, as larger datasets are being generated, their analysis requires the use of high-performance servers. To address these needs, we developed CANEapp (application for Comprehensive automated Analysis of Next-generation sequencing Experiments), a unique suite that combines a Graphical User Interface (GUI) and an automated server-side analysis pipeline that is platform-independent, making it suitable for any server architecture. The GUI runs on a PC or Mac and seamlessly connects to the server to provide full GUI control of RNA-sequencing (RNA-seq) project analysis. The server-side analysis pipeline contains a framework that is implemented on a Linux server through completely automated installation of software components and reference files. Analysis with CANEapp is also fully automated and performs differential gene expression analysis and novel noncoding RNA discovery through alternative workflows (Cuffdiff and the R packages edgeR and DESeq2). We compared CANEapp to other similar tools, and it significantly improves on previous developments. We experimentally validated CANEapp's performance by applying it to data derived from different experimental paradigms and confirming the results with quantitative real-time PCR (qRT-PCR). CANEapp adapts to any server architecture by effectively using available resources and thus handles large amounts of data efficiently, and its performance has been experimentally validated on various biological datasets. CANEapp is available free of charge at http://psychiatry.med.miami.edu/research/laboratory-of-translational-rna-genomics/CANE-app . We believe that CANEapp will serve both biologists with no computational experience and bioinformaticians as a simple, timesaving yet accurate and powerful tool to analyze large RNA-seq datasets, and that it will provide foundations for future development of integrated and automated high-throughput genomics data analysis tools. Due to its inherently standardized pipeline and its combination of automated analysis and platform independence, CANEapp is ideal for large-scale collaborative RNA-seq projects between different institutions and research groups.

  4. PropeR revisited.

    PubMed

    van der Linden, Helma; Talmon, Jan; Tange, Huibert; Grimson, Jane; Hasman, Arie

    2005-03-01

    The PropeR EHR system (PropeRWeb) is an electronic health record (EHR) system for multidisciplinary use in extramural care of stroke patients. The system is built using existing open source components and is based on open standards. It is implemented as a web application using servlets and Java Server Pages (JSPs), with a CORBA connection to the database servers, which are based on the OMG HDTF specifications. PropeRWeb is a generic system that can be readily customized for use in a variety of clinical domains. The system proved to be stable and flexible, although some aspects (among them user-friendliness) could be improved. These improvements are currently under development in a second version.

  5. Web-based segmentation and display of three-dimensional radiologic image data.

    PubMed

    Silverstein, J; Rubenstein, J; Millman, A; Panko, W

    1998-01-01

    In many clinical circumstances, viewing sequential radiological image data as three-dimensional models is proving beneficial. However, designing customized computer-generated radiological models is beyond the scope of most physicians, due to specialized hardware and software requirements. We have created a simple method for Internet users to remotely construct and locally display three-dimensional radiological models using only a standard web browser. Rapid model construction is achieved by distributing the hardware intensive steps to a remote server. Once created, the model is automatically displayed on the requesting browser and is accessible to multiple geographically distributed users. Implementation of our server software on large scale systems could be of great service to the worldwide medical community.

  6. "WWW.MDTF.ORG": a World Wide Web forum for developing open-architecture, freely distributed, digital teaching file software by participant consensus.

    PubMed

    Katzman, G L; Morris, D; Lauman, J; Cochella, C; Goede, P; Harnsberger, H R

    2001-06-01

    To foster a community-supported evaluation process for open-source digital teaching file (DTF) development and maintenance. The mechanisms used to support this process include standard web browsers, web servers, forum software, and custom additions to the forum software that potentially enable a mediated voting protocol. The web server will also serve as a focal point for beta and release software distribution, which is the desired end-goal of this process. We foresee that www.mdtf.org will provide for widespread distribution of open-source DTF software that will incorporate function and interface design decisions from community participation in the website forums.

  7. Prototyping a 10 Gigabit-Ethernet Event-Builder for the CTA Camera Server

    NASA Astrophysics Data System (ADS)

    Hoffmann, Dirk; Houles, Julien

    2012-12-01

    While the Cherenkov Telescope Array will end its Preparatory Phase in 2012 or 2013 with the publication of a Technical Design Report, our lab has undertaken, within the French CTA community, the design and prototyping of a Camera Server: a PC-architecture computer, assigned as a switchboard to each of roughly a hundred telescopes, to handle the maximum amount of scientific data recorded by each telescope. Our work aims at a data acquisition hardware and software system that handles the scientific raw data at optimal speed. We have evaluated the maximum performance that can be obtained by choosing standard (COTS) hardware and software (Linux) in conjunction with a 10 Gb/s switch.
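
    A crude way to probe the same question, namely the maximum data rate a COTS Linux server can absorb, is a plain TCP sink that reports its achieved receive rate. The sketch below is only such a probe, not the authors' event-builder code; the port and buffer size are arbitrary choices.

        # Rough TCP throughput probe: receive as fast as possible, report the
        # achieved rate. Port and buffer size are arbitrary choices.
        import socket, time

        BUFSIZE = 1 << 20
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.bind(("", 9000))
        srv.listen(1)
        conn, addr = srv.accept()

        total, start = 0, time.time()
        while True:
            chunk = conn.recv(BUFSIZE)
            if not chunk:
                break
            total += len(chunk)
        print("%.2f Gb/s" % (total * 8 / (time.time() - start) / 1e9))
        conn.close()
        srv.close()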

  8. Topographic mapping data semantics through data conversion and enhancement: Chapter 7

    USGS Publications Warehouse

    Varanka, Dalia; Carter, Jonathan; Usery, E. Lynn; Shoberg, Thomas; Edited by Ashish, Naveen; Sheth, Amit P.

    2011-01-01

    This paper presents research on representing the semantics of topographic data as triples and ontologies, to blend the capabilities of the Semantic Web and The National Map of the U.S. Geological Survey. Automated conversion of relational topographic data for several geographic sample areas to the triple data model standard resulted in relatively poor semantic associations. Further research employed vocabularies of feature-type and spatial-relation terms. A user interface was designed to model the capture of non-standard terms relevant to public users and to map those terms to existing data models of The National Map through the use of ontology. Server access to the study-area triple stores was made publicly available, illustrating how the development of linked data may transform institutional policies to open government data resources to the public. This paper presents these data conversion and research techniques, which were tested as open linked data concepts leveraged through a user-centered interface and open USGS server access to the public.
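
    The relational-to-triple conversion the paper describes can be illustrated with the rdflib package; the namespace, attribute names, and feature record below are invented for the example and are not The National Map's actual data model.

        # Map one row of a hypothetical feature table to RDF triples.
        from rdflib import Graph, Literal, Namespace, URIRef

        TOPO = Namespace("http://example.org/topo#")
        g = Graph()
        g.bind("topo", TOPO)

        row = {"id": "gnis1629798", "ftype": "Stream", "name": "Antietam Creek"}

        feature = URIRef("http://example.org/feature/" + row["id"])
        g.add((feature, TOPO.featureType, TOPO[row["ftype"]]))
        g.add((feature, TOPO.name, Literal(row["name"])))

        print(g.serialize(format="turtle"))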

  9. Performance Evaluation of a M/Geo[xy]/1 Queue with varying probabilities of success which Treats Two Like Jobs As a Single Entity

    NASA Astrophysics Data System (ADS)

    Gowrishankar, Lavanya; Bhaskar, Vidhyacharan; Sundarammal, K.

    2018-04-01

    The developed model comprises a single server capable of handling two different job types, X and Y, where a type Y job takes more time to execute than a type X job. The objective is to construct a single-server system that would replace the standard M/M/2 queueing model. The method used to find the relative measures involves the cost equation. The properties of the service distribution are discussed in detail, and the maximum likelihood estimates for the parameters are obtained. The results are analytically derived for the M/Geo[xy]/1 model, and a comparison is made between the proposed model and the standard M/M/2 queue. From the numerical results, it is observed that the waiting time in queue increases as the number of cycles is increased, but the proposed model is nevertheless more economical than the M/M/2 model with a restriction on the number of time slices.
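
    The M/Geo[xy]/1 results are derived analytically in the paper; the M/M/2 baseline it compares against is standard queueing theory. As a sketch, the baseline mean waiting time in queue follows from the Erlang C formula (the arrival and service rates below are example values only):

        # Mean queueing delay Wq for an M/M/c queue via the Erlang C formula;
        # the paper's comparison baseline uses c = 2. lam and mu are example
        # values only.
        from math import factorial

        def erlang_c(c, a):
            """Probability an arrival must wait; a = lam/mu is the offered load."""
            rho = a / c
            top = a**c / factorial(c) / (1 - rho)
            return top / (sum(a**k / factorial(k) for k in range(c)) + top)

        def mmc_wq(lam, mu, c):
            """Mean waiting time in queue for M/M/c (requires lam < c*mu)."""
            return erlang_c(c, lam / mu) / (c * mu - lam)

        print(mmc_wq(lam=1.5, mu=1.0, c=2))  # baseline M/M/2 mean wait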

  10. Human Factors Feedback: Brain Acoustic Monitor

    DTIC Science & Technology

    2012-02-01

    From the report: the BAM system ran on a Panasonic Toughbook preloaded with the Microsoft Windows XP Service Pack 2 operating system, an OS widely used on IBM-style personal computers. (Toughbook is a registered trademark of Panasonic Corporation; Windows is a registered trademark of Microsoft Corporation.)

  11. snpTree--a web-server to identify and construct SNP trees from whole genome sequence data.

    PubMed

    Leekitcharoenphon, Pimlapas; Kaas, Rolf S; Thomsen, Martin Christen Frølund; Friis, Carsten; Rasmussen, Simon; Aarestrup, Frank M

    2012-01-01

    The advances and decreasing cost of whole genome sequencing (WGS) will soon make this technology available for routine infectious disease epidemiology. In epidemiological studies, outbreak isolates have very little diversity and require extensive genomic analysis to differentiate and classify. One of the most successful and broadly used methods is analysis of single nucleotide polymorphisms (SNPs). Currently, there are different tools and methods to identify SNPs, with various options and cut-off values, and all of them require bioinformatic skills. Thus, we lack a standard and simple automatic tool to determine SNPs and construct phylogenetic trees from WGS data. Here we introduce snpTree, a server for online automatic SNP analysis. This tool is composed of different SNP analysis suites and Perl and Python scripts. snpTree can identify SNPs and construct phylogenetic trees from WGS data as well as from assembled genomes or contigs. WGS data in fastq format are aligned to reference genomes by BWA, while contigs in fasta format are processed by Nucmer. SNPs are concatenated based on their position on the reference genome, and a tree is constructed from the concatenated SNPs using FastTree and a Perl script. The online server was implemented in HTML, Java and Python. The server was evaluated using four published bacterial WGS data sets (V. cholerae, S. aureus CC398, S. Typhimurium and M. tuberculosis). The evaluation results for the first three cases were consistent and concordant for both raw reads and assembled genomes; in the latter case the original publication involved extensive filtering of SNPs, which could not be repeated using snpTree. The snpTree server is an easy-to-use option for rapid, standardised and automatic SNP analysis in epidemiological studies, also for users with limited bioinformatic experience. The web server is freely accessible at http://www.cbs.dtu.dk/services/snpTree-1.0/.
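
    The final tree-building step, concatenating SNP calls by reference position and handing the pseudo-alignment to FastTree, can be sketched as follows. The per-isolate SNP calls are invented, the FastTree binary is assumed to be on the PATH, and snpTree's own glue code is a Perl script rather than this Python sketch.

        # Concatenate per-isolate SNP calls (keyed by reference position) into
        # a pseudo-alignment, then build a tree with FastTree.
        import subprocess

        snp_calls = {                       # isolate -> {position: base}
            "isolateA": {1042: "A", 5310: "G", 9977: "T"},
            "isolateB": {1042: "C", 5310: "G", 9977: "T"},
            "isolateC": {1042: "A", 5310: "T", 9977: "C"},
        }

        positions = sorted({p for calls in snp_calls.values() for p in calls})

        with open("snps.fasta", "w") as fasta:
            for isolate, calls in snp_calls.items():
                seq = "".join(calls.get(p, "N") for p in positions)
                fasta.write(">%s\n%s\n" % (isolate, seq))

        # FastTree reads the FASTA alignment and writes a Newick tree.
        with open("snps.nwk", "w") as tree:
            subprocess.run(["FastTree", "-nt", "snps.fasta"],
                           stdout=tree, check=True)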

  12. Web tools for large-scale 3D biological images and atlases

    PubMed Central

    2012-01-01

    Background Large-scale volumetric biomedical image data of three or more dimensions are a significant challenge for distributed browsing and visualisation. Many images now exceed 10GB, which for most users is too large to handle in terms of computer RAM and network bandwidth. This is aggravated when users need to access tens or hundreds of such images from an archive. Here we solve the problem for 2D section views through archive data by delivering compressed tiled images, enabling users to browse through very large volume data in the context of a standard web browser. The system provides interactive visualisation of grey-level and colour 3D images, including multiple image layers and spatial-data overlay. Results The standard Internet Imaging Protocol (IIP) has been extended to enable arbitrary 2D sectioning of 3D data as well as multi-layered images and indexed overlays. The extended protocol is termed IIP3D, and we have implemented a matching server to deliver the protocol and a series of Ajax/JavaScript client codes that will run in an Internet browser. We have tested the server software on a low-cost Linux-based server for image volumes up to 135GB and 64 simultaneous users. The section views are delivered with response times independent of scale and orientation. The exemplar client provided multi-layer image views with user-controlled colour-filtering and overlays. Conclusions Interactive browsing of arbitrary sections through large biomedical image volumes is made possible by use of an extended internet protocol and efficient server-based image tiling. The tools open the possibility of enabling fast access to large image archives without requiring whole-image download or client computers with very large memory configurations. The system was demonstrated using a range of medical and biomedical image data extending up to 135GB for a single image volume. PMID:22676296

  13. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, S; Dolly, S; Cai, B

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed in C# on the Microsoft .NET Framework, with Windows Presentation Foundation (WPF) used for elegant GUI design and smooth transition animations through the integration of graphics engines and high dots-per-inch (DPI) settings on modern high-resolution monitors. The industry-standard Model-View-ViewModel (MVVM) design pattern was chosen as the major architecture of ACE for its neat coding structure, deep modularization, easy maintainability and seamless communication with other clinical software. ACE consists of 1) a patient data importing module integrated with the clinical patient database server, 2) a module that simultaneously displays 2D DICOM images and RT structures, 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library, and 4) a contour evaluation module that uses supervised pattern recognition algorithms to detect contouring errors and display detection results. ACE relies on supervised learning algorithms for all image and data processing; implementations of the related algorithms are powered by the Accord.NET scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient's CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification. Structured with the MVVM pattern, it is highly maintainable and extensible, and supports smooth connections with other clinical software tools.

  14. Particle Identification on an FPGA Accelerated Compute Platform for the LHCb Upgrade

    NASA Astrophysics Data System (ADS)

    Färber, Christian; Schwemmer, Rainer; Machen, Jonathan; Neufeld, Niko

    2017-07-01

    The current LHCb readout system will be upgraded in 2018 to a “triggerless” readout of the entire detector at the Large Hadron Collider collision rate of 40 MHz. The corresponding bandwidth from the detector down to the foreseen dedicated computing farm (event filter farm), which acts as the trigger, has to be increased by a factor of almost 100, from currently 500 Gb/s up to 40 Tb/s. The event filter farm will preanalyze the data and select events on an event-by-event basis, reducing the bandwidth to a manageable size for writing the interesting physics data to tape. The design of such a system is a challenging task, which is why different new technologies have to be considered and investigated for the different parts of the system. For use in the event-building farm or in the event filter farm (trigger), an experimental field programmable gate array (FPGA) accelerated computing platform is considered and tested. FPGA compute accelerators are used more and more in standard servers, for example for Microsoft Bing search or Baidu search. The platform we use hosts a general Intel CPU and a high-performance FPGA linked via the high-speed Intel QuickPath Interconnect, with an accelerator implemented on the FPGA. It is very likely that these platforms, built in general for high-performance computing, are also very interesting for the high-energy physics community. First, the performance results of smaller test cases performed at the beginning are presented. Afterward, a part of the existing LHCb RICH particle identification is ported to the experimental FPGA-accelerated platform and tested. We have compared the performance of the LHCb RICH particle identification running on a normal CPU with the performance of the same algorithm running on the Xeon-FPGA compute accelerator platform.

  15. BingEO: Enable Distributed Earth Observation Data for Environmental Research

    NASA Astrophysics Data System (ADS)

    Wu, H.; Yang, C.; Xu, Y.

    2010-12-01

    Our planet is facing great environmental challenges, including global climate change, environmental vulnerability, extreme poverty, and a shortage of clean, cheap energy. To address these problems, scientists are developing various models to analyze, forecast, and simulate geospatial phenomena in support of critical decision making. These models not only challenge our computing technology, but also challenge us to meet huge demands for earth observation data. Through various policies and programs, open and free sharing of earth observation data is advocated in earth science. Currently, thousands of data sources are freely available online through open standards such as Web Map Service (WMS), Web Feature Service (WFS) and Web Coverage Service (WCS). Seamless sharing of and access to these resources call for a spatial Cyberinfrastructure (CI) to enable the use of spatial data for the advancement of related applied sciences, including environmental research. Based on the Microsoft Bing Search Engine and Bing Maps, a seamlessly integrated and visual tool is under development to bridge the gap between researchers/educators and earth observation data providers. With this tool, earth science researchers/educators can easily and visually find the best data sets for their research and education. The tool includes a registry and its related supporting module at the server side and an integrated portal as its client. The proposed portal, Bing Earth Observation (BingEO), is based on Bing Search and Bing Maps to: 1) use Bing Search to discover Web Map Service (WMS) resources available over the internet; 2) develop and maintain a registry to manage all the available WMS resources and constantly monitor their service quality; 3) allow users to manually register data services; and 4) provide a Bing Maps-based Web application to visualize the data on a high-quality, easy-to-manipulate map platform and enable users to select the best data layers online. Given the amount of observation data already accumulated and still growing, BingEO will allow these resources to be utilized more widely, intensively, efficiently and economically in earth science applications.
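
    The WMS discovery and monitoring described above rests on the standard GetCapabilities exchange. A minimal sketch of probing a discovered endpoint with the OWSLib package follows; the service URL is a placeholder.

        # Probe a discovered WMS endpoint: fetch its capabilities and list the
        # layers it offers. The service URL is a placeholder.
        from owslib.wms import WebMapService

        wms = WebMapService("http://example.org/geoserver/wms", version="1.1.1")

        print(wms.identification.title)
        for name, layer in wms.contents.items():
            print(name, "-", layer.title)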

  16. Design and Development of a Network-Based Electronic Library.

    ERIC Educational Resources Information Center

    Larson, Ray R.

    1994-01-01

    Describes collaboration between the University of California at Berkeley and four other universities to develop interoperable servers containing each participant's Computer Science Technical Reports and to make them available over the Internet using standard protocols. The proposed library architecture, approaches to indexing and retrieval, and…

  17. Electronic Mail for Personal Computers: Development Issues.

    ERIC Educational Resources Information Center

    Tomer, Christinger

    1994-01-01

    Examines competing, commercially developed electronic mail programs and how these technologies will affect the functionality and quality of electronic mail. How new standards for client-server mail systems are likely to enhance messaging capabilities and the use of electronic mail for information retrieval are considered. (Contains eight…

  18. MATREX Leads the Way in Implementing New DOD VV&A Documentation Standards

    DTIC Science & Technology

    2007-05-24

    Excerpt from the report's acquisition life-cycle chart and acronym glossary: Pre-Systems Acquisition; Concept...; Critical Design Review; LRIP/IOT&E; FRP Decision Review; FOC; Operations & Support; Sustainment. Glossary: C3GRID - Command & Control, Computer GRID; CES - Communications Effects Server; CMS2 - Comprehensive...

  19. UAF: a generic OPC unified architecture framework

    NASA Astrophysics Data System (ADS)

    Pessemier, Wim; Deconinck, Geert; Raskin, Gert; Saey, Philippe; Van Winckel, Hans

    2012-09-01

    As an emerging Service Oriented Architecture (SOA) specifically designed for industrial automation and process control, the OPC Unified Architecture specification should be regarded as an attractive candidate for controlling scientific instrumentation. Even though an industry-backed standard such as OPC UA can offer substantial added value to these projects, its inherent complexity poses an important obstacle to adopting the technology. Building OPC UA applications requires considerable effort, even when taking advantage of a COTS Software Development Kit (SDK). The OPC Unified Architecture Framework (UAF) attempts to reduce this burden by introducing an abstraction layer between the SDK and the application code in order to achieve a better separation of the technical and the functional concerns. True to its industrial origin, the primary requirement of the framework is to maintain interoperability by staying close to the standard specifications, and by expecting the minimum compliance from other OPC UA servers and clients. UAF can therefore be regarded as a software framework to quickly and comfortably develop and deploy OPC UA-based applications, while remaining compatible with third-party OPC UA-compliant toolkits, servers (such as PLCs) and clients (such as SCADA software). In the first phase, as covered by this paper, only the client side of UAF has been tackled, in order to transparently handle discovery, session management, subscriptions, monitored items, etc. We describe the design principles and internal architecture of our open-source software project, the first results of the framework running at the Mercator Telescope, and we give a preview of the planned server-side implementation.
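
    UAF itself is not reproduced here, but the client-side boilerplate it is designed to absorb (connecting, addressing a node, reading a value) looks roughly like the following with the python-opcua package; the endpoint URL and node identifier are placeholders.

        # Bare OPC UA client session: connect, read one value, disconnect.
        # Real code would also handle subscriptions and reconnection, the
        # technical concerns a framework like UAF is meant to hide.
        from opcua import Client

        client = Client("opc.tcp://plc.example.org:4840")
        client.connect()
        try:
            node = client.get_node("ns=2;s=Telescope.Dome.Azimuth")
            print("azimuth =", node.get_value())
        finally:
            client.disconnect()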

  20. UAV field demonstration of social media enabled tactical data link

    NASA Astrophysics Data System (ADS)

    Olson, Christopher C.; Xu, Da; Martin, Sean R.; Castelli, Jonathan C.; Newman, Andrew J.

    2015-05-01

    This paper addresses the problem of enabling Command and Control (C2) and data exfiltration functions for missions using small, unmanned, airborne surveillance and reconnaissance platforms. The authors demonstrated the feasibility of using existing commercial wireless networks as the data transmission infrastructure to support Unmanned Aerial Vehicle (UAV) autonomy functions such as transmission of commands, imagery, metadata, and multi-vehicle coordination messages. The authors developed and integrated a C2 Android application for ground users with a common smart phone, a C2 and data exfiltration Android application deployed on-board the UAVs, and a web server with database to disseminate the collected data to distributed users using standard web browsers. The authors performed a mission-relevant field test and demonstration in which operators commanded a UAV from an Android device to search and loiter, and remote users viewed imagery, video, and metadata via the web server to identify and track a vehicle on the ground. Social media served as the tactical data link for all command messages, images, videos, and metadata during the field demonstration. Imagery, video, and metadata were transmitted from the UAV to the web server via multiple Twitter, Flickr, Facebook, YouTube, and similar media accounts. The web server reassembled images and video with corresponding metadata for distributed users. The UAV autopilot communicated with the on-board Android device via an on-board Bluetooth network.

  1. DNA barcode goes two-dimensions: DNA QR code web server.

    PubMed

    Liu, Chang; Shi, Linchun; Xu, Xiaolan; Li, Huan; Xing, Hang; Liang, Dong; Jiang, Kun; Pang, Xiaohui; Song, Jingyuan; Chen, Shilin

    2012-01-01

    The DNA barcoding technology uses a standard region of DNA sequence for species identification and discovery. At present, "DNA barcode" actually refers to DNA sequences, which are not amenable to information storage, recognition, and retrieval. Our aim is to identify the best symbology that can represent DNA barcode sequences in practical applications. A comprehensive set of sequences for five DNA barcode markers (ITS2, rbcL, matK, psbA-trnH, and CO1) was used as the test data. Fifty-three types of one-dimensional and ten two-dimensional barcode symbologies were compared on criteria such as coding capacity, compression efficiency, and error detection ability. The quick response (QR) code was found to have the largest coding capacity and a relatively high compression ratio. To facilitate the further usage of QR code-based DNA barcodes, a web server was developed and is accessible at http://qrfordna.dnsalias.org. The web server allows users to retrieve the QR code for a species of interest, convert a DNA sequence to and from a QR code, and perform species identification based on local and global sequence similarities. In summary, the first comprehensive evaluation of various barcode symbologies has been carried out; the QR code was found to be the most appropriate symbology for DNA barcode sequences, and a web server has been constructed to allow biologists to utilize QR codes in practical DNA barcoding applications.
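
    Producing a QR code from a barcode sequence is straightforward with off-the-shelf libraries; a minimal sketch with the Python qrcode package follows (the sequence fragment is an arbitrary illustration, not a reference barcode).

        # Encode a DNA barcode sequence as a QR code image. The sequence is
        # an arbitrary illustrative fragment.
        import qrcode

        seq = ("ATGTCACCACAAACAGAGACTAAAGCAAGTGTTGGATTCAAAGCTGGTGTTAAAGAT"
               "TACAAATTGACTTATTATACTCCTGACTATGAAACCAAAGATACTGATATCTTGGCA")

        img = qrcode.make(seq, error_correction=qrcode.constants.ERROR_CORRECT_M)
        img.save("barcode.png")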

  2. Simplifying the Analysis of Data from Multiple Heliophysics Instruments and Missions

    NASA Astrophysics Data System (ADS)

    Bazell, D.; Vandegriff, J. D.

    2014-12-01

    Understanding the intertwined plasma, particles and fields connecting the Sun and the Earth requires combining data from many diverse sources, but there are still many technological barriers that complicate the merging of data from different instruments and missions. We present an emerging data serving capability that provides a uniform way to access heterogeneous and distributed data. The goal of our data server is to provide a standardized data access mechanism that is identical for data of any format and layout (CDF, custom binary, FITS, netCDF, CSV and other flavors of ASCII, etc.). Data remain in their original format and location (i.e., at instrument team sites or existing data centers), and our data server delivers a dynamically reformatted view of the data. Scientists can then use tools (clients that talk to the server) that offer a single interface for browsing, analyzing or downloading many different contemporary and legacy heliophysics data sets. Our current server accesses many CDF data resources at CDAWeb, as well as multiple other instrument team sites. Our web service will be deployed on the Amazon Cloud at http://datashop.elasticbeanstalk.com/. Two basic clients will also be demonstrated: one in Java and one in IDL. Python, Perl, and Matlab clients are also planned. Complex missions such as Solar Orbiter and Solar Probe Plus will benefit greatly from tools that enable multi-instrument and multi-mission data comparison.

  3. Modernization of the USGS Hawaiian Volcano Observatory Seismic Processing Infrastructure

    NASA Astrophysics Data System (ADS)

    Antolik, L.; Shiro, B.; Friberg, P. A.

    2016-12-01

    The USGS Hawaiian Volcano Observatory (HVO) operates a Tier 1 Advanced National Seismic System (ANSS) seismic network to monitor, characterize, and report on volcanic and earthquake activity in the State of Hawaii. Upgrades at the observatory since 2009 have improved the digital telemetry network, computing resources, and seismic data processing with the adoption of the ANSS Quake Management System (AQMS). HVO aims to build on these efforts by further modernizing its seismic processing infrastructure and strengthening its ability to meet ANSS performance standards. Most notably, this will allow HVO to support redundant systems, both onsite and offsite, in order to provide better continuity of operation during intermittent power and network outages. We are implementing a number of upgrades and improvements to HVO's seismic processing infrastructure, including: 1) virtualization of AQMS physical servers; 2) migration of server operating systems from Solaris to Linux; 3) consolidation of AQMS real-time and post-processing services onto a single server; 4) upgrading the database from Oracle 10 to Oracle 12; and 5) upgrading to the latest Earthworm and AQMS software. These improvements will make server administration more efficient, minimize the hardware resources required by AQMS, simplify the Oracle replication setup, and provide better integration with HVO's existing state-of-health monitoring tools and backup system. Ultimately, this will provide HVO with the latest and most secure software available while making the software easier to deploy and support.

  4. Enriching the Web Processing Service

    NASA Astrophysics Data System (ADS)

    Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer

    2014-05-01

    The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it defines the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A large range of ready-to-use, fine-grained, fundamental geospatial processes have been developed by the GIS community in the past. However, modern use cases and whole workflow processes demand specifications for lifecycle management and service orchestration. Orchestrating smaller sub-processes is a task towards interoperability; comprehensive documentation using appropriate metadata is also required. Though different approaches were tested in the past, developing complex WPS applications still requires programming skills, knowledge about the software libraries in use, and a lot of integration effort. Our toolset RichWPS aims at providing a better overall experience by setting up two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested by the RichWPS Server, a composition can be deployed for production use on the RichWPS Server. The ModelBuilder obtains the necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging information. One aim is to generate reproducible results; the workflow should be documented by metadata that can be integrated in Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among other things, testing composed workflow sequences, estimating their performance, and publishing them as common processes. The server is therefore oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes making use of a WPS-T interface. In order to deal with the results of these processing workflows, a server-side extension enables the RichWPS Server and its clients to use WPS presentation directives (WPS-PD), a content-related enhancement of the standardized WPS schema. We identified the essential requirements of the components of our toolset by applying two use cases. The first enables the simplified comparison of modeled and measured data, a common task in hydro-engineering to validate the accuracy of a model. An implementation of the workflow includes reading, harmonizing and comparing two datasets in NetCDF format. 2D water-level data from the German Bight can be chosen, presented and evaluated in a web client with interactive plots. The second use case is motivated by the Marine Strategy Directive (MSD) of the EU, which demands monitoring, action plans, and an evaluation of the ecological situation in the marine environment; information techniques adapted to those of INSPIRE should be used. One of the parameters monitored and evaluated for the MSD is the expansion and quality of seagrass fields. With a view towards other evaluation parameters, we decompose the complex process of seagrass evaluation into reusable process steps and implement those packages as configurable WPS processes.
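
    For orientation, the three WPS 1.0.0 operations named at the start of this abstract map directly onto client calls; a sketch with the OWSLib package against a placeholder endpoint and a hypothetical process identifier:

        # Exercise the core WPS 1.0.0 operations against a server. The
        # endpoint URL and the process identifier are placeholders.
        from owslib.wps import WebProcessingService

        wps = WebProcessingService("http://example.org/wps")  # GetCapabilities
        for p in wps.processes:
            print(p.identifier, "-", p.title)

        proc = wps.describeprocess("compareDatasets")          # DescribeProcess
        for inp in proc.dataInputs:
            print("input:", inp.identifier)

        # The Execute operation would then be wps.execute("compareDatasets",
        # inputs=[...]), with inputs given as (identifier, value) pairs.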

  5. DIABCARE Quality Network in Europe--a model for quality management in chronic diseases.

    PubMed

    Piwernetz, K

    2001-04-01

    The DIABCARE Q-Net project developed a complete and integrated information technology system to monitor diabetes care according to the gold standards of the St Vincent Declaration Action Program. This is the first telematic platform for standardized documentation of medical quality and evaluation across Europe, and it will serve as a model for other chronic diseases. Quality development starts from the comparison of diabetes services, based on the key data on diabetes care in the basic information sheet, a 141-field form that is completed once a year for each patient under the care of the diabetes team. The system performs an analysis of the local data and compares the data with peer teams by means of telecommunication of anonymous data. These data are collected regionally. At the next level, these regional data are compared on a national basis across Europe using dedicated communication lines; national data can be compared transnationally using the Internet and the DIABCARE benchmarking servers. These different lines are used according to the necessary security standards: medical data are transferred via dedicated lines, aggregated data via the Internet. The architecture follows the open-platform concept in order to allow for heterogeneous technical environments. Already at the start of the project, the necessity of expanding the quality approach to telemedicine methodology was identified and addressed. For each level, specific programs are available to improve the performance of diabetes care delivery: DIABCARE Data as the client, the DIABCARE server at the regional level, and the DIABCARE 'international server' at the transnational level. Functioning pilots were established across all levels, and the clients have been linked to the servers on a routine basis. In line with the open architecture design, the various countries decided on different systems at the entry point: full system, Portugal; fax systems, Italy and Bavaria; implementation into doctor's office systems, Norway; paper forms and chip cards, France. This system can improve local, regional and national diabetes care, and initiatives in several countries proved its feasibility. The most extensive use, from Portugal, is reported later in this paper. The exploitation of the DIABCARE Q-Net system will be performed with the DIABCARE International European Economic Interest Grouping as co-ordinator and several commercial companies as contractors to market the products inside the system. The key project participants are: DIABCARE Office EURO, DIABCARE Portugal, DIABCARE France, DIABCARE Bavaria, DIABCARE UK, DIABCARE Netherlands, DIABCARE Norway, DIABCARE Italy, DIABCARE Sweden, DIABCARE Austria, DIABCARE Spain, GSF Research Centre for Health and Environment, FAST Research Institute for Applied Software Technology, Tromsø University Hospital, Stavanger Technical College, Technical University of Ilmenau, and the World Health Organisation (WHO) Regional Office for Europe.

  6. OLIVER: an online library of images for veterinary education and research.

    PubMed

    McGreevy, Paul; Shaw, Tim; Burn, Daniel; Miller, Nick

    2007-01-01

    As part of a strategic move by the University of Sydney toward increased flexibility in learning, the Faculty of Veterinary Science undertook a number of developments involving Web-based teaching and assessment. OLIVER underpins them by providing a rich, durable repository for learning objects. To integrate Web-based learning, case studies, and didactic presentations for veterinary and animal science students, we established an online library of images and other learning objects for use by academics in the Faculties of Veterinary Science and Agriculture. The objectives of OLIVER were to maximize the use of the faculty's teaching resources by providing a stable archiving facility for graphic images and other multimedia learning objects that allows flexible and precise searching, integrating indexing standards, thesauri, pull-down lists of preferred terms, and linking of objects within cases. OLIVER offers a portable and expandable Web-based shell that facilitates ongoing storage of learning objects in a range of media. Learning objects can be downloaded in common, standardized formats so that they can be easily imported for use in a range of applications, including Microsoft PowerPoint, WebCT, and Microsoft Word. OLIVER now contains more than 9,000 images relating to many facets of veterinary science; these are annotated and supported by search engines that allow rapid access to both images and relevant information. The Web site is easily updated and adapted as required.

  7. The Primary Care Electronic Library: RSS feeds using SNOMED-CT indexing for dynamic content delivery.

    PubMed

    Robinson, Judas; de Lusignan, Simon; Kostkova, Patty; Madge, Bruce; Marsh, A; Biniaris, C

    2006-01-01

    Rich Site Summary (RSS) feeds are a method for disseminating and syndicating the contents of a website using extensible mark-up language (XML). The Primary Care Electronic Library (PCEL) distributes recent additions to the site in the form of an RSS feed. When new resources are added to PCEL, they are manually assigned medical subject headings (MeSH terms), which are then automatically mapped to SNOMED-CT terms using the Unified Medical Language System (UMLS) Metathesaurus; the library is thus searchable using MeSH or SNOMED-CT. Our syndicate partner wished to have remote access to PCEL coronary heart disease (CHD) information resources based on SNOMED-CT search terms. The objective was to pilot the supply of relevant information resources in response to clinically coded requests, using RSS syndication for transmission between web servers. Our syndicate partner provided its end-users with a list of CHD SNOMED-CT terms coded according to UMLS specifications. When an end-user requested relevant information resources, the request was relayed from our syndicate partner's web server to the PCEL web server, and the relevant resources were retrieved from the PCEL MySQL database. This database is accessed using a server-side scripting language (PHP), which enables the production of dynamic RSS feeds on the basis of Source Asserted Identifiers (CODEs) contained in UMLS. Retrieving resources by SNOMED-CT terms through syndication can be used to build a functioning application; the process from request to display of syndicated resources took less than one second. The results of the pilot illustrate that it is possible to exchange data between servers using RSS syndication. This method could be utilised dynamically to supply digital library resources to a clinical system, with SNOMED-CT data used as the standard of reference.
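
    An RSS feed of the kind described is a small XML document generated from database rows; a dependency-free sketch follows, with the resource record and URLs invented for the example (PCEL's own implementation is in PHP).

        # Build a minimal RSS 2.0 feed from query results. The resource
        # record and site URLs are invented for the example.
        import xml.etree.ElementTree as ET

        resources = [
            {"title": "CHD risk assessment guideline",
             "link": "http://pcel.example.org/resource/1234",
             "description": "Indexed under a SNOMED-CT CHD concept."},
        ]

        rss = ET.Element("rss", version="2.0")
        channel = ET.SubElement(rss, "channel")
        ET.SubElement(channel, "title").text = "PCEL: coronary heart disease"
        ET.SubElement(channel, "link").text = "http://pcel.example.org/"
        ET.SubElement(channel, "description").text = "Recent CHD resources"

        for r in resources:
            item = ET.SubElement(channel, "item")
            for tag in ("title", "link", "description"):
                ET.SubElement(item, tag).text = r[tag]

        print(ET.tostring(rss, encoding="unicode"))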

  8. BlueSky Cloud - rapid infrastructure capacity using Amazon's Cloud for wildfire emergency response

    NASA Astrophysics Data System (ADS)

    Haderman, M.; Larkin, N. K.; Beach, M.; Cavallaro, A. M.; Stilley, J. C.; DeWinter, J. L.; Craig, K. J.; Raffuse, S. M.

    2013-12-01

    During peak fire season in the United States, many large wildfires burn simultaneously across the country, and smoke from these fires can produce air quality emergencies. It is vital that incident commanders, air quality agencies, and public health officials have smoke impact information at their fingertips for evaluating where fires and smoke are and where the smoke will go next. To address the need for this kind of information, the U.S. Forest Service AirFire Team created the BlueSky Framework, a modeling system that predicts concentrations of particle pollution from wildfires. During emergency response, decision makers use BlueSky predictions to make public outreach and evacuation decisions. The models used in BlueSky predictions are computationally intensive, and peak fire season requires significantly more computing resources than off-peak times. Purchasing enough hardware to handle the number of BlueSky Framework runs needed during fire season is expensive and leaves servers idle for the majority of the year. The AirFire Team and STI therefore developed BlueSky Cloud to take advantage of Amazon's virtual servers hosted in the cloud. With BlueSky Cloud, servers can easily be spun up and down at minimal cost as demand rises and falls. Moving standard BlueSky Framework runs into the Amazon Cloud made it possible for the AirFire Team to rapidly increase the number of BlueSky Framework instances that can run simultaneously, without the costs associated with purchasing and managing servers. In this presentation, we provide an overview of the features of BlueSky Cloud, describe how the system uses the Amazon Cloud, and discuss the costs and benefits of moving from privately hosted servers to a cloud-based infrastructure.
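
    The elastic pattern described, spinning servers up for fire season and down afterward, reduces to a pair of EC2 API calls. A sketch with boto3 follows; the region, AMI id, instance type, and counts are placeholders, not the AirFire Team's actual configuration.

        # Spin worker servers up and down on demand. All identifiers are
        # placeholders.
        import boto3

        ec2 = boto3.client("ec2", region_name="us-west-2")

        # Peak fire season: launch extra workers.
        resp = ec2.run_instances(ImageId="ami-0123456789abcdef0",
                                 InstanceType="c5.2xlarge",
                                 MinCount=1, MaxCount=8)
        ids = [i["InstanceId"] for i in resp["Instances"]]

        # Off season: terminate them to stop paying for idle capacity.
        ec2.terminate_instances(InstanceIds=ids)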

  9. Owgis 2.0: Open Source Java Application that Builds Web GIS Interfaces for Desktop Andmobile Devices

    NASA Astrophysics Data System (ADS)

    Zavala Romero, O.; Chassignet, E.; Zavala-Hidalgo, J.; Pandav, H.; Velissariou, P.; Meyer-Baese, A.

    2016-12-01

    OWGIS is an open source Java and JavaScript application that builds easily configurable Web GIS sites for desktop and mobile devices. The current version of OWGIS generates mobile interfaces based on HTML5 technology and can be used to create mobile applications. The style of the generated websites can be modified using COMPASS, a well-known CSS authoring framework. In addition, OWGIS uses several Open Geospatial Consortium standards to request data from the most common map servers, such as GeoServer. It is also able to request data from ncWMS servers, allowing the websites to display 4D data from NetCDF files. The application is configured by XML files that define which layers (geographic datasets) are displayed on the Web GIS sites. Among other features, OWGIS allows for animations; streamlines from vector data; virtual globe display; vertical profiles and vertical transects; different color palettes; the ability to download data; and display of text in multiple languages. OWGIS users are mainly scientists in the oceanography, meteorology and climate fields.

  10. BiodMHC: an online server for the prediction of MHC class II-peptide binding affinity.

    PubMed

    Wang, Lian; Pan, Danling; Hu, Xihao; Xiao, Jinyu; Gao, Yangyang; Zhang, Huifang; Zhang, Yan; Liu, Juan; Zhu, Shanfeng

    2009-05-01

    Effective identification of major histocompatibility complex (MHC)-restricted peptides is a critical step in discovering immune epitopes. Although many online servers have been built to predict class II MHC-peptide binding affinity, they have been trained on different datasets and thus fail to provide a unified comparison of the various methods. In this paper, we present our implementation of seven popular predictive methods, namely SMM-align, ARB, SVR-pairwise, Gibbs sampler, ProPred, LP-top2, and MHCPred, on a single web server named BiodMHC (http://biod.whu.edu.cn/BiodMHC/index.html; the software is available upon request). Using the standard measure of AUC (Area Under the receiver operating characteristic Curve), we compare these methods by means of not only cross-validation but also prediction on independent test datasets. We find that SMM-align, ProPred, SVR-pairwise, ARB, and Gibbs sampler are the five best-performing methods. For class II MHC-peptide binding affinity prediction, BiodMHC provides a convenient online platform for researchers to obtain binding information simultaneously from various methods.
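
    The AUC comparison underlying the evaluation can be reproduced for any predictor whose scores and binder/non-binder labels are available; a minimal sketch with scikit-learn, where the labels and scores are toy values for illustration only:

        # Compare predictors by AUC on the same labelled peptides. Labels and
        # scores are toy values, not real predictor output.
        from sklearn.metrics import roc_auc_score

        labels = [1, 1, 0, 1, 0, 0, 1, 0]   # 1 = binder, 0 = non-binder
        predictions = {
            "SMM-align": [0.9, 0.7, 0.4, 0.8, 0.3, 0.5, 0.6, 0.2],
            "ProPred":   [0.8, 0.6, 0.5, 0.7, 0.4, 0.3, 0.9, 0.1],
        }

        for method, scores in predictions.items():
            print(method, "AUC = %.3f" % roc_auc_score(labels, scores))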

  11. Viewing ISS Data in Real Time via the Internet

    NASA Technical Reports Server (NTRS)

    Myers, Gerry; Chamberlain, Jim

    2004-01-01

    EZStream is a computer program that enables authorized users at diverse terrestrial locations to view, in real time, data generated by scientific payloads aboard the International Space Station (ISS). The only computation/communication resource needed for use of EZStream is a computer equipped with standard Web-browser software and a connection to the Internet. EZStream runs in conjunction with the TReK software, described in a prior NASA Tech Briefs article, that coordinates multiple streams of data for the ground communication system of the ISS. EZStream includes server components that interact with TReK within the ISS ground communication system and client components that reside in the users' remote computers. Once an authorized client has logged in, a server component of EZStream pulls the requested data from a TReK application-program interface and sends the data to the client. Future EZStream enhancements will include (1) extensions that enable the server to receive and process arbitrary data streams on its own and (2) a Web-based graphical-user-interface-building subprogram that enables a client who lacks programming expertise to create customized display Web pages.

  12. The concurrent validity and intrarater reliability of the Microsoft Kinect to measure thoracic kyphosis.

    PubMed

    Quek, June; Brauer, Sandra G; Treleaven, Julia; Clark, Ross A

    2017-09-01

    This study aims to investigate the concurrent validity and intrarater reliability of the Microsoft Kinect to measure thoracic kyphosis against the Flexicurve. Thirty-three healthy individuals (age: 31±11.0 years, men: 17, height: 170.2±8.2 cm, weight: 64.2±12.0 kg) participated, with 29 re-examined for intrarater reliability 1-7 days later. Thoracic kyphosis was measured using the Flexicurve and the Microsoft Kinect consecutively in both standing and sitting positions. Both the kyphosis index and angle were calculated. The Microsoft Kinect showed excellent concurrent validity (intraclass correlation coefficient=0.76-0.82) and reliability (intraclass correlation coefficient=0.81-0.98) for measuring thoracic kyphosis (angle and index) in both standing and sitting postures. This study is the first to show that the Microsoft Kinect has excellent validity and intrarater reliability to measure thoracic kyphosis, which is promising for its use in the clinical setting.

  13. Microsoft health patient journey demonstrator.

    PubMed

    Disse, Kirsten

    2008-01-01

    As health care becomes more reliant on electronic systems, there is a need to standardise display elements to promote patient safety and clinical efficiency. The Microsoft Health Common User Interface (MSCUI) programme, developed by Microsoft and the National Health Service (NHS), was born out of this need and creates guidance and controls designed to increase patient safety and clinical effectiveness through consistent interface treatments. The Microsoft Health Patient Journey Demonstrator is a prototype tool designed to provide exemplar implementations of MSCUI guidance on a Microsoft platform. It is a targeted glimpse at a visual interface for the integration of health-relevant information, including electronic medical records. We built the demonstrator in Microsoft Silverlight 2, our application technology which brings desktop functionality and enriched levels of user experience to health settings worldwide via the internet. We based the demonstrator on an easily recognisable clinical scenario which offered us the most scope for demonstrating MSCUI guidance and innovation. The demonstrator is structured in three sections (administration, primary care and secondary care), each of which illustrates the activities associated with the setting relevant to our scenario. The demonstrator is published on the MSCUI website, www.mscui.net. The MSCUI Patient Journey Demonstrator has been successful in raising awareness and increasing interest in the CUI programme.

  14. Hardware Assisted Stealthy Diversity (CHECKMATE)

    DTIC Science & Technology

    2013-09-01

    applicable across multiple architectures. Figure 29 shows an example of an attack against an interpreted environment with a Java executable (a Java VM running on ARM, PPC, and x86 architectures). CHECKMATE can... a user executes "/usr/bin/wget... The evaluation testbed comprised: Server 1 - Administration; Server 2 - Database (MySQL); Server 3 - Web server (Mongoose); Server 4 - File server (SSH); Server 5 - Email server.

  15. Why American business demands twenty-first century learning: A company perspective.

    PubMed

    Knox, Allyson

    2006-01-01

    Microsoft is an innovative corporation demonstrating the kind and caliber of job skills needed in the twenty-first century. It demonstrates its commitment to twenty-first century skills by holding its employees accountable to a set of core competencies, enabling the company to run effectively. The author explores how Microsoft's core competencies parallel the Partnership for 21st Century Skills learning frameworks. Both require advanced problem-solving skills and a passion for technology, both expect individuals to be able to work in teams, both look for a love of learning, and both call for the self-confidence to honestly self-evaluate. Microsoft also works to cultivate twenty-first century skills among future workers, investing in education to help prepare young people for competitive futures. As the need for digital literacy has become imperative, technology companies have taken the lead in facilitating technology training by partnering with schools and communities. Microsoft is playing a direct role in preparing students for what lies ahead in their careers. To further twenty-first century skills, or core competencies, among the nation's youth, Microsoft has established Partners in Learning, a program that helps education organizations build partnerships that leverage technology to improve teaching and learning. One Partners in Learning grantee is Global Kids, a nonprofit organization that trains students to design online games focused on global social issues resonating with civic and global competencies. As Microsoft believes the challenges of competing in today's economy and teaching today's students are substantial but not insurmountable, such partnerships and investments demonstrate Microsoft's belief in and commitment to twenty-first century skills.

  16. The Red Atrapa Sismos (Quake Catcher Network in Mexico): assessing performance during large and damaging earthquakes.

    USGS Publications Warehouse

    Dominguez, Luis A.; Yildirim, Battalgazi; Husker, Allen L.; Cochran, Elizabeth S.; Christensen, Carl; Cruz-Atienza, Victor M.

    2015-01-01

    Each volunteer computer monitors ground motion and communicates using the Berkeley Open Infrastructure for Network Computing (BOINC; Anderson, 2004). Using a standard short-term average/long-term average (STA/LTA) algorithm (Earle and Shearer, 1994; Cochran, Lawrence, Christensen, Chung, 2009; Cochran, Lawrence, Christensen, and Jakka, 2009), the volunteer computer and sensor systems detect abrupt changes in the acceleration recordings. Each time a possible trigger signal is declared, a small package of information containing sensor and ground-motion information is streamed to one of the QCN servers (Chung et al., 2011). Trigger signals, correlated in space and time, are then processed by the QCN server to look for potential earthquakes.
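
    A classical STA/LTA trigger compares a short-window average of signal energy with a long-window average and declares a trigger when their ratio exceeds a threshold. The numpy sketch below illustrates the idea on a synthetic trace; the window lengths and threshold are arbitrary choices, not QCN's operational settings.

        # STA/LTA trigger on an acceleration trace, using cumulative sums for
        # fast moving averages of the squared signal.
        import numpy as np

        def sta_lta(trace, nsta, nlta):
            """Ratio of short- to long-term averages of the squared signal."""
            sq = trace.astype(float) ** 2
            csum = np.cumsum(np.concatenate(([0.0], sq)))
            sta = (csum[nsta:] - csum[:-nsta]) / nsta
            lta = (csum[nlta:] - csum[:-nlta]) / nlta
            n = min(len(sta), len(lta))          # align windows at trace end
            return sta[-n:] / (lta[-n:] + 1e-12)

        rng = np.random.default_rng(0)
        trace = rng.normal(0, 1, 5000)
        trace[3000:3200] += rng.normal(0, 8, 200)  # synthetic "event"

        ratio = sta_lta(trace, nsta=50, nlta=1000)
        print("triggered:", bool(np.any(ratio > 4.0)))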

  17. Modeling And Simulation Of Multimedia Communication Networks

    NASA Astrophysics Data System (ADS)

    Vallee, Richard; Orozco-Barbosa, Luis; Georganas, Nicolas D.

    1989-05-01

    In this paper, we present a simulation study of a browsing system involving radiological image servers. The proposed IEEE 802.6 DQDB MAN standard is designated as the computer network to transfer radiological images from file servers to medical workstations and to simultaneously support real-time voice communications. Storage and transmission of original raster-scanned images and of images compressed according to pyramid data structures are considered. Different types of browsing, as well as various image sizes and bit rates in the DQDB MAN, are also compared. The elapsed time, measured from the moment an image request is issued until the image is displayed on the monitor, is the parameter used to evaluate system performance. Simulation results show that image browsing can be supported by the DQDB MAN.

  18. Migration of the Japanese healthcare enterprise from a financial to integrated management: strategy and architecture.

    PubMed

    Akiyama, M

    2001-01-01

    The Hospital Information System (HIS) has been positioned as the hub of the healthcare information management architecture. In Japan, the billing system assigns "insurance disease names" to performed exams based on the diagnosis type. Departmental systems provide localized, departmental services, such as order receipt and diagnostic reporting, but do not provide patient demographic information. This arrangement has many problems. The departmental systems' terminals and the HIS's terminals are not integrated, and duplicate data entry introduces errors and increases workloads. Order and exam data managed by the HIS can be sent to the billing system, but departmental data usually cannot be entered; additionally, billing systems usually keep departmental data for only a short time before deleting it. The billing system provides payment based on what is entered and is oriented towards diagnoses. Most importantly, the system is geared towards generating billing reports rather than providing high-quality patient care. The role of the application server is that of a mediator between system components. Data and events generated by system components are sent to the application server, which routes them to appropriate destinations. It also records all system events, including state changes to clinical data, access of clinical data and so on. Finally, the Resource Management System identifies all system resources available to the enterprise. The departmental systems are responsible for managing data and clinical processes at a departmental level. The client interacts with the system via the application server, which provides a general set of system-level functions. The system is implemented using the current technologies CORBA and HTTP. System data are collected by the application server and assembled into XML documents for delivery to clients. Clients can access these URLs using standard HTTP clients, since each department provides an HTTP-compliant web server. We have implemented an integrated system communicating via CORBA middleware, consisting of an application server, an endoscopy departmental server, a pathology departmental server and a wrapped legacy HIS. We have found that this new approach solves the problems outlined earlier. It provides the services needed to ensure that data are never lost and are always available, that events that occur in the hospital are always captured, and that resources are managed and tracked effectively. Finally, it reduces costs, raises efficiency, increases the quality of patient care, and ultimately saves lives. We are now going to integrate all remaining hospital departments and, ultimately, all hospital functions.

  19. When are Overcomplete Representations Identifiable? Uniqueness of Tensor Decompositions Under Expansion Constraints

    DTIC Science & Technology

    2013-06-16


  20. A Simple Method for Reproducing Orbital Plots for Illustration Using Microsoft Paint and Microsoft Excel

    NASA Astrophysics Data System (ADS)

    Niebuhr, Cole

    2018-04-01

    Papers published in the astronomical community, particularly in the field of double star research, often contain plots that display the positions of the component stars relative to each other on a Cartesian coordinate plane. Owing to the complexities of projecting a three-dimensional orbit onto a two-dimensional image, it is often difficult to include an accurate reproduction of the orbit for comparison purposes. Methods to circumvent this obstacle do exist; however, many of these protocols result in low-quality, blurred images or require specific and often expensive software. Here, a method is reported that uses Microsoft Paint and Microsoft Excel to produce high-quality images with an accurate reproduction of a partial orbit.

  1. Using OpenOffice as a Portable Interface to JAVA-Based Applications

    NASA Astrophysics Data System (ADS)

    Comeau, T.; Garrett, B.; Richon, J.; Romelfanger, F.

    2004-07-01

    STScI previously used Microsoft Word and Microsoft Access, a Sybase ODBC driver, and the Adobe Acrobat PDF writer, along with a substantial amount of Visual Basic, to generate a variety of documents for the internal Space Telescope Grants Administration System (STGMS). While investigating an upgrade to Microsoft Office XP, we began considering alternatives, ultimately selecting an open source product, OpenOffice.org. This reduces the total number of products required to operate the internal STGMS system, simplifies the build system, and opens the possibility of moving to a non-Windows platform. We describe the experience of moving from Microsoft Office to OpenOffice.org, and our other internal uses of OpenOffice.org in our development environment.

  2. Software for Allocating Resources in the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Borden, Chester; Zendejas, Silvino; Baldwin, John

    2003-01-01

    TIGRAS 2.0 is a computer program designed to provide improved means of analyzing the tracking demands that interplanetary space-flight missions place upon the ground antenna resources of the Deep Space Network (DSN), and of allocating those resources. Written in Microsoft Visual C++, TIGRAS 2.0 provides a single rich graphical analysis environment for use by diverse DSN personnel by connecting to various data sources (relational databases or files) depending on the stage of the analysis being performed. Notable among the algorithms implemented by TIGRAS 2.0 are a DSN antenna-load-forecasting algorithm and a conflict-aware DSN schedule-generating algorithm. Computers running TIGRAS 2.0 can also be connected, using SOAP/XML, to a Web services server that provides analysis services via the World Wide Web. TIGRAS 2.0 supports multiple windows, with multiple panes in each window, for users to view and use information all in the same environment, eliminating repeated switching among various application programs and Web pages. TIGRAS 2.0 enables the use of multiple windows for various requirements, trajectory-based time intervals during which spacecraft are viewable, ground resources, forecasts, and schedules. Each window includes a time navigation pane, a selection pane, a graphical display pane, a list pane, and a statistics pane.

  3. Entamoeba histolytica: construction and applications of subgenomic databases.

    PubMed

    Hofer, Margit; Duchêne, Michael

    2005-07-01

    Knowledge about the influence of environmental stress such as the action of chemotherapeutic agents on gene expression in Entamoeba histolytica is limited. We plan to use oligonucleotide microarray hybridization to approach these questions. As the basis for our array, sequence data from the genome project carried out by the Institute for Genomic Research (TIGR) and the Sanger Institute were used to annotate parts of the parasite genome. Three subgenomic databases containing enzymes, cytoskeleton genes, and stress genes were compiled with the help of the ExPASy proteomics website and the BLAST servers at the two genome project sites. The known sequences from reference species, mostly human and Escherichia coli, were searched against TIGR and Sanger E. histolytica sequence contigs and the homologs were copied into a Microsoft Access database. In a similar way, two additional databases of cytoskeletal genes and stress genes were generated. Metabolic pathways could be assembled from our enzyme database, but sometimes they were incomplete as is the case for the sterol biosynthesis pathway. The raw databases contained a significant number of duplicate entries which were merged to obtain curated non-redundant databases. This procedure revealed that some E. histolytica genes may have several putative functions. Representative examples such as the case of the delta-aminolevulinate synthase/serine palmitoyltransferase are discussed.

  4. Automated DICOM metadata and volumetric anatomical information extraction for radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Papamichail, D.; Ploussi, A.; Kordolaimi, S.; Karavasilis, E.; Papadimitroulas, P.; Syrgiamiotis, V.; Efstathopoulos, E.

    2015-09-01

    Patient-specific dosimetry calculations based on simulation techniques have as a prerequisite the modeling of the modality system and the creation of voxelized phantoms. This procedure requires the knowledge of scanning parameters and patients’ information included in a DICOM file as well as image segmentation. However, the extraction of this information is complicated and time-consuming. The objective of this study was to develop a simple graphical user interface (GUI) to (i) automatically extract metadata from every slice image of a DICOM file in a single query and (ii) interactively specify the regions of interest (ROI) without explicit access to the radiology information system. The user-friendly application was developed in the Matlab environment. The user can select a series of DICOM files and manage their text and graphical data. The metadata are automatically formatted and presented to the user as a Microsoft Excel file. The volumetric maps are formed by interactively specifying the ROIs and by assigning a specific value in every ROI. The result is stored in DICOM format, for data and trend analysis. The developed GUI is easy to use, fast, and constitutes a very useful tool for individualized dosimetry. One of the future goals is to incorporate remote access to a PACS server functionality.
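
    As an illustration of the extraction step, the following minimal Python sketch pulls a handful of per-slice tags with the pydicom library and writes them to a CSV table; the tag list and folder path are hypothetical, and the authors' GUI exports to Microsoft Excel rather than CSV.

        import csv
        import glob
        import pydicom

        # Tags to pull from every slice; an illustrative subset only.
        TAGS = ["PatientID", "Modality", "SliceThickness", "KVP", "PixelSpacing"]

        rows = []
        for path in sorted(glob.glob("series/*.dcm")):  # hypothetical folder
            # Skip pixel data: only the header metadata is needed here.
            ds = pydicom.dcmread(path, stop_before_pixels=True)
            rows.append([path] + [str(getattr(ds, tag, "")) for tag in TAGS])

        with open("metadata.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["file"] + TAGS)
            writer.writerows(rows)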

  5. OLAP Cube Visualization of Hydrologic Data Catalogs

    NASA Astrophysics Data System (ADS)

    Zaslavsky, I.; Rodriguez, M.; Beran, B.; Valentine, D.; van Ingen, C.; Wallis, J. C.

    2007-12-01

    As part of the CUAHSI Hydrologic Information System project, we assemble comprehensive observations data catalogs that support CUAHSI data discovery services (WaterOneFlow services) and online mapping interfaces (e.g. the Data Access System for Hydrology, DASH). These catalogs describe several nation-wide data repositories that are important for hydrologists, including USGS NWIS and EPA STORET data collections. The catalogs contain a wealth of information reflecting the entire history and geography of hydrologic observations in the US. Managing such catalogs requires high performance analysis and visualization technologies. OLAP (Online Analytical Processing) cubes, often called data cubes, are an approach to organizing and querying large multi-dimensional data collections. We have applied OLAP techniques, as implemented in Microsoft SQL Server 2005, to the analysis of the catalogs from several agencies. In this initial report, we focus on the OLAP technology as applied to catalogs, and preliminary results of the analysis. Specifically, we describe the challenges of generating OLAP cube dimensions, and defining aggregations and views for data catalogs as opposed to observations data themselves. The initial results are related to hydrologic data availability from the observations data catalogs. The results reflect the geography and history of available data totals from the USGS NWIS and EPA STORET repositories, and the spatial and temporal dynamics of available measurements for several key nutrient-related parameters.
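
    The flavor of such cube aggregations can be shown without SQL Server; the pandas sketch below rolls up a toy catalog (all column names and values are invented) along the kind of dimensions, parameter and year, that an OLAP cube of a data catalog exposes.

        import pandas as pd

        # Toy observations catalog; column names and values are hypothetical.
        catalog = pd.DataFrame({
            "site":      ["NWIS-01", "NWIS-01", "STORET-07", "STORET-07"],
            "year":      [2001, 2002, 2001, 2002],
            "parameter": ["nitrate", "nitrate", "phosphorus", "nitrate"],
            "n_values":  [1200, 950, 400, 310],
        })

        # Roll up the 'cube': total available values per parameter and year.
        cube = catalog.pivot_table(index="parameter", columns="year",
                                   values="n_values", aggfunc="sum",
                                   fill_value=0)
        print(cube)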

  6. An academic radiology information system (RIS): a review of the commercial RIS systems, and how an individualized academic RIS can be created and utilized.

    PubMed

    Tamm, E P; Kawashima, A; Silverman, P

    2001-06-01

    Current commercial radiology information systems (RIS) are designed for scheduling, billing, charge collection, and report dissemination. Academic institutions have additional requirements for their missions for teaching, research and clinical care. The newest versions of commercial RIS offer greater flexibility than prior systems. We sent questionnaires to Cerner Corporation, ADAC Health Care Information Systems, IDX Systems, Per-Se' Technologies, and Siemens Health Services regarding features of their products. All of the products we surveyed offer user customizable fields. However, most products did not allow the user to expand their product's data table. The search capabilities of the products varied. All of the products supported the Health Level 7 (HL-7) interface and the use of structured query language (SQL). All of the products were offered with an SQL editor for creating customized queries and custom reports. All products included capabilities for collecting data for quality assurance and included capabilities for tracking "interesting cases," though they varied in the functionality offered. No product offered dedicated functions for research. Alternatively, radiology departments can create their own client-server Windows-based database systems to supplement the capabilities of commercial systems. Such systems can be developed with "web-enabled" database products like Microsoft Access or Apple Filemaker Pro.

  7. MAT - MULTI-ATTRIBUTE TASK BATTERY FOR HUMAN OPERATOR WORKLOAD AND STRATEGIC BEHAVIOR RESEARCH

    NASA Technical Reports Server (NTRS)

    Comstock, J. R.

    1994-01-01

    MAT, a Multi-Attribute Task battery, gives the researcher the capability of performing multi-task workload and performance experiments. The battery provides a benchmark set of tasks for use in a wide range of laboratory studies of operator performance and workload. MAT incorporates tasks analogous to activities that aircraft crew members perform in flight, while providing a high degree of experiment control, performance data on each subtask, and freedom to use non-pilot test subjects. The MAT battery primary display is composed of four separate task windows which are as follows: a monitoring task window which includes gauges and warning lights, a tracking task window for the demands of manual control, a communication task window to simulate air traffic control communications, and a resource management task window which permits maintaining target levels on a fuel management task. In addition, a scheduling task window gives the researcher information about future task demands. The battery also provides the option of manual or automated control of tasks. The task generates performance data for each subtask. The task battery may be paused and onscreen workload rating scales presented to the subject. The MAT battery was designed to use a serially linked second computer to generate the voice messages for the Communications task. The MATREMX program and support files, which are included in the MAT package, were designed to work with the Heath Voice Card (Model HV-2000, available through the Heath Company, Benton Harbor, Michigan 49022); however, the MATREMX program and support files may easily be modified to work with other voice synthesizer or digitizer cards. The MAT battery task computer may also be used independent of the voice computer if no computer synthesized voice messages are desired or if some other method of presenting auditory messages is devised. MAT is written in QuickBasic and assembly language for IBM PC series and compatible computers running MS-DOS. The code in MAT is written for Microsoft QuickBasic 4.5 and Microsoft Macro Assembler 5.1. This package requires a joystick and EGA or VGA color graphics. An 80286, 386, or 486 processor machine is highly recommended. The standard distribution medium for MAT is a 5.25 inch 360K MS-DOS format diskette. The files are compressed using the PKZIP file compression utility. PKUNZIP is included on the distribution diskette. MAT was developed in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS, Microsoft QuickBasic, and Microsoft Macro Assembler are registered trademarks of Microsoft Corporation. PKZIP and PKUNZIP are registered trademarks of PKWare, Inc.

  8. A simple method to compare firing pin marks using a stereomicroscope and Microsoft Office (Windows 8) tools.

    PubMed

    Suresh, R

    2017-08-01

    Pertinent marks on fired cartridge cases, such as those from the firing pin, breech face, extractor, and ejector, are used for firearm identification. A non-standard semiautomatic pistol and four .22 rimfire cartridges (head stamp KF) were used for a known-source comparison study. Two test-fired cartridge cases were examined under a stereomicroscope. The characteristic marks were captured by digital camera, and comparative analysis of the striation marks was done using different tools available in Microsoft Word (Windows 8) on a computer system. The similarities of the striation marks thus obtained are sufficiently convincing to identify the firearm. In this paper, an effort has been made to study and compare the striation marks of two fired cartridge cases using a stereomicroscope, a digital camera, and a computer system. A comparison microscope was not used in this study. The method described in this study is simple, cost effective, portable for field work, and can be carried in a crime scene vehicle to facilitate immediate on-the-spot examination. The findings may be highly helpful to the forensic community, law enforcement agencies, and students. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Custom controls

    NASA Astrophysics Data System (ADS)

    Butell, Bart

    1996-02-01

    Microsoft's Visual Basic (VB) and Borland's Delphi provide an extremely robust programming environment for delivering multimedia solutions for interactive kiosks, games and titles. Their object oriented use of standard and custom controls enable a user to build extremely powerful applications. A multipurpose, database enabled programming environment that can provide an event driven interface functions as a multimedia kernel. This kernel can provide a variety of authoring solutions (e.g. a timeline based model similar to Macromedia Director or a node authoring model similar to Icon Author). At the heart of the kernel is a set of low level multimedia components providing object oriented interfaces for graphics, audio, video and imaging. Data preparation tools (e.g., layout, palette and Sprite Editors) could be built to manage the media database. The flexible interface for VB allows the construction of an infinite number of user models. The proliferation of these models within a popular, easy to use environment will allow the vast developer segment of 'producer' types to bring their ideas to the market. This is the key to building exciting, content rich multimedia solutions. Microsoft's VB and Borland's Delphi environments combined with multimedia components enable these possibilities.

  10. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    PubMed

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) submitting specimen requests across collaborating organizations; (ii) graphically defining new experimental data types, metadata and wizards for data collection; (iii) transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database; (iv) securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays; (v) interacting dynamically with external data sources; (vi) tracking study participants and cohorts over time; (vii) developing custom interfaces using client libraries; and (viii) authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  12. Preliminary analysis techniques for ring and stringer stiffened cylindrical shells

    NASA Technical Reports Server (NTRS)

    Graham, J.

    1993-01-01

    This report outlines methods of analysis for the buckling of thin-walled circumferentially and longitudinally stiffened cylindrical shells. Methods of analysis for the various failure modes are presented in one cohesive package. Where applicable, more than one method of analysis for a failure mode is presented along with standard practices. The results of this report are primarily intended for use in launch vehicle design in the elastic range. A Microsoft Excel worksheet with accompanying macros has been developed to automate the analysis procedures.

  13. Network characteristics for server selection in online games

    NASA Astrophysics Data System (ADS)

    Claypool, Mark

    2008-01-01

    Online gameplay is impacted by the network characteristics of players connected to the same server. Unfortunately, the network characteristics of online game servers are not well-understood, particularly for groups that wish to play together on the same server. As a step towards a remedy, this paper presents analysis of an extensive set of measurements of game servers on the Internet. Over the course of many months, actual Internet game servers were queried simultaneously by twenty-five emulated game clients, with both servers and clients spread out on the Internet. The data provides statistics on the uptime and populations of game servers over a month-long period and an in-depth look at the suitability of game servers for multi-player server selection, concentrating on characteristics critical to playability--latency and fairness. Analysis finds most game servers have latencies suitable for third-person and omnipresent games, such as real-time strategy, sports and role-playing games, providing numerous server choices for game players. However, far fewer game servers have the low latencies required for first-person games, such as shooters or race games. In all cases, groups that wish to play together have a greatly reduced set of servers from which to choose because of inherent unfairness in server latencies, and server selection is particularly limited as the group size increases. These results hold across different game types and even across different generations of games. The data should be useful for game developers and network researchers who seek to improve game server selection, whether for single or multiple players.
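
    A simplified form of the group-selection criterion reads directly as code; in the sketch below the latency measurements are invented, and a server is kept only if its worst-case latency across the group and its latency spread both stay under playability thresholds.

        # Hypothetical round-trip latencies (ms) from each player to each server.
        latencies = {
            "server_a": {"p1": 40, "p2": 55, "p3": 140},
            "server_b": {"p1": 80, "p2": 85, "p3": 90},
            "server_c": {"p1": 30, "p2": 35, "p3": 45},
        }

        MAX_LATENCY = 100  # playability threshold (e.g., first-person games)
        MAX_SPREAD = 40    # fairness: max allowed gap between group members

        def suitable(server):
            ms = latencies[server].values()
            return max(ms) <= MAX_LATENCY and max(ms) - min(ms) <= MAX_SPREAD

        candidates = [s for s in latencies if suitable(s)]
        print(candidates)  # ['server_b', 'server_c']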

  14. NASA SensorWeb and OGC Standards for Disaster Management

    NASA Technical Reports Server (NTRS)

    Mandl, Dan

    2010-01-01

    I. Goal: Enable user to cost-effectively find and create customized data products to help manage disasters; a) On-demand; b) Low cost and non-specialized tools such as Google Earth and browsers; c) Access via open network but with sufficient security. II. Use standards to interface various sensors and resultant data: a) Wrap sensors in Open Geospatial Consortium (OGC) standards; b) Wrap data processing algorithms and servers with OGC standards c) Use standardized workflows to orchestrate and script the creation of these data; products. III. Target Web 2.0 mass market: a) Make it simple and easy to use; b) Leverage new capabilities and tools that are emerging; c) Improve speed and responsiveness.

  15. Automation of workplace lifting hazard assessment for musculoskeletal injury prevention.

    PubMed

    Spector, June T; Lieblich, Max; Bao, Stephen; McQuade, Kevin; Hughes, Margaret

    2014-01-01

    Existing methods for practically evaluating musculoskeletal exposures such as posture and repetition in workplace settings have limitations. We aimed to automate the estimation of parameters in the revised United States National Institute for Occupational Safety and Health (NIOSH) lifting equation, a standard manual observational tool used to evaluate back injury risk related to lifting in workplace settings, using depth camera (Microsoft Kinect) and skeleton algorithm technology. A large dataset (approximately 22,000 frames, derived from six subjects) of simultaneous lifting and other motions recorded in a laboratory setting using the Kinect (Microsoft Corporation, Redmond, Washington, United States) and a standard optical motion capture system (Qualysis, Qualysis Motion Capture Systems, Qualysis AB, Sweden) was assembled. Error-correction regression models were developed to improve the accuracy of NIOSH lifting equation parameters estimated from the Kinect skeleton. Kinect-Qualysis errors were modelled using gradient boosted regression trees with a Huber loss function. Models were trained on data from all but one subject and tested on the excluded subject. Finally, models were tested on three lifting trials performed by subjects not involved in the generation of the model-building dataset. Error-correction appears to produce estimates for NIOSH lifting equation parameters that are more accurate than those derived from the Microsoft Kinect algorithm alone. Our error-correction models substantially decreased the variance of parameter errors. In general, the Kinect underestimated parameters, and modelling reduced this bias, particularly for more biased estimates. Use of the raw Kinect skeleton model tended to result in falsely high safe recommended weight limits of loads, whereas error-corrected models gave more conservative, protective estimates. Our results suggest that it may be possible to produce reasonable estimates of posture and temporal elements of tasks such as task frequency in an automated fashion, although these findings should be confirmed in a larger study. Further work is needed to incorporate force assessments and address workplace feasibility challenges. We anticipate that this approach could ultimately be used to perform large-scale musculoskeletal exposure assessment not only for research but also to provide real-time feedback to workers and employers during work method improvement activities and employee training.
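
    A minimal sketch of the error-correction idea follows, using scikit-learn's gradient boosted trees with the Huber loss named in the study; the synthetic data, variable names, and single train/test split are illustrative stand-ins for the Kinect/Qualysis dataset and the leave-one-subject-out validation actually used.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(0)

        # Stand-ins for real data: Kinect estimates of one NIOSH parameter
        # (e.g., horizontal distance) and the optical motion-capture truth.
        kinect = rng.uniform(20, 60, size=(2000, 1))
        truth = (kinect[:, 0] * 1.1 - 3) + rng.normal(0, 2, size=2000)

        # Model the Kinect-vs-truth error with a Huber loss.
        model = GradientBoostingRegressor(loss="huber")
        model.fit(kinect[:1500], truth[:1500] - kinect[:1500, 0])

        # Corrected estimate = raw Kinect value + predicted error.
        corrected = kinect[1500:, 0] + model.predict(kinect[1500:])
        print(np.mean(np.abs(corrected - truth[1500:])))  # mean abs. error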

  16. PR-EDB: Power Reactor Embrittlement Database - Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jy-An John; Subramani, Ranjit

    2008-03-01

    The aging and degradation of light-water reactor pressure vessels is of particular concern because of their relevance to plant integrity and the magnitude of the expected irradiation embrittlement. The radiation embrittlement of reactor pressure vessel materials depends on many factors, such as neutron fluence, flux, and energy spectrum, irradiation temperature, and preirradiation material history and chemical compositions. These factors must be considered to reliably predict pressure vessel embrittlement and to ensure the safe operation of the reactor. Large amounts of data from surveillance capsules are needed to develop a generally applicable damage prediction model that can be used for industry standards and regulatory guides. Furthermore, the investigations of regulatory issues such as vessel integrity over plant life, vessel failure, and sufficiency of current codes, Standard Review Plans (SRPs), and Guides for license renewal can be greatly expedited by the use of a well-designed computerized database. The Power Reactor Embrittlement Database (PR-EDB) is such a comprehensive collection of data for U.S.-designed commercial nuclear reactors. The current version of the PR-EDB lists the test results of 104 heat-affected-zone (HAZ) materials, 115 weld materials, and 141 base materials, including 103 plates, 35 forgings, and 3 correlation monitor materials that were irradiated in 321 capsules from 106 commercial power reactors. The data files are given in dBASE format and can be accessed with any personal computer using the Windows operating system. "User-friendly" utility programs have been written to investigate radiation embrittlement using this database. Utility programs allow the user to retrieve, select and manipulate specific data, display data to the screen or printer, and fit and plot Charpy impact data. The PR-EDB Version 3.0 upgrades Version 2.0. The package was developed based on the Microsoft .NET framework technology and uses Microsoft Access for backend data storage and Microsoft Excel for plotting graphs. This software package is compatible with Windows (98 or higher) and has been built with a highly versatile user interface. PR-EDB Version 3.0 also contains an "Evaluated Residual File" utility for generating the evaluated processed files used for radiation embrittlement study.

  17. sTeam--Providing Primary Media Functions for Web-Based Computer-Supported Cooperative Learning.

    ERIC Educational Resources Information Center

    Hampel, Thorsten

    The World Wide Web has developed as the de facto standard for computer based learning. However, as a server-centered approach, it confines readers and learners to passive nonsequential reading. Authoring and Web-publishing systems aim at supporting the authors' design process. Consequently, learners' activities are confined to selecting and…

  18. The Status of African Studies Digitized Content: Three Metadata Schemes.

    ERIC Educational Resources Information Center

    Kuntz, Patricia S.

    The proliferation of Web pages and digitized material mounted on Internet servers has become unmanageable. Librarians and users are concerned that documents and information are being lost in cyberspace as a result of few bibliographic controls and common standards. Librarians in cooperation with software creators and Web page designers are…

  19. Baobab Laboratory Information Management System: Development of an Open-Source Laboratory Information Management System for Biobanking.

    PubMed

    Bendou, Hocine; Sizani, Lunga; Reid, Tim; Swanepoel, Carmen; Ademuyiwa, Toluwaleke; Merino-Martinez, Roxana; Meuller, Heimo; Abayomi, Akin; Christoffels, Alan

    2017-04-01

    A laboratory information management system (LIMS) is central to the informatics infrastructure that underlies biobanking activities. To date, a wide range of commercial and open-source LIMSs are available and the decision to opt for one LIMS over another is often influenced by the needs of the biobank clients and researchers, as well as available financial resources. The Baobab LIMS was developed by customizing the Bika LIMS software (www.bikalims.org) to meet the requirements of biobanking best practices. The need to implement biobank standard operation procedures as well as stimulate the use of standards for biobank data representation motivated the implementation of Baobab LIMS, an open-source LIMS for biobanking. Baobab LIMS comprises modules for biospecimen kit assembly, shipping of biospecimen kits, storage management, analysis requests, reporting, and invoicing. The Baobab LIMS is based on the Plone web-content management framework. All the system requirements for Plone are applicable to Baobab LIMS, including the need for a server with at least 8 GB RAM and 120 GB hard disk space. Baobab LIMS is a server-client-based system, whereby the end user is able to access the system securely through the internet on a standard web browser, thereby eliminating the need for standalone installations on all machines.

  20. DNA Barcode Goes Two-Dimensions: DNA QR Code Web Server

    PubMed Central

    Li, Huan; Xing, Hang; Liang, Dong; Jiang, Kun; Pang, Xiaohui; Song, Jingyuan; Chen, Shilin

    2012-01-01

    The DNA barcoding technology uses a standard region of DNA sequence for species identification and discovery. At present, “DNA barcode” actually refers to DNA sequences, which are not amenable to information storage, recognition, and retrieval. Our aim is to identify the best symbology that can represent DNA barcode sequences in practical applications. A comprehensive set of sequences for five DNA barcode markers ITS2, rbcL, matK, psbA-trnH, and CO1 was used as the test data. Fifty-three different types of one-dimensional and ten two-dimensional barcode symbologies were compared based on different criteria, such as coding capacity, compression efficiency, and error detection ability. The quick response (QR) code was found to have the largest coding capacity and relatively high compression ratio. To facilitate the further usage of QR code-based DNA barcodes, a web server was developed and is accessible at http://qrfordna.dnsalias.org. The web server allows users to retrieve the QR code for a species of interests, convert a DNA sequence to and from a QR code, and perform species identification based on local and global sequence similarities. In summary, the first comprehensive evaluation of various barcode symbologies has been carried out. The QR code has been found to be the most appropriate symbology for DNA barcode sequences. A web server has also been constructed to allow biologists to utilize QR codes in practical DNA barcoding applications. PMID:22574113
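
    Generating such a code locally takes only a few lines; the sketch below uses the Python qrcode package, and the sequence shown is a made-up placeholder rather than a record from the web server.

        import qrcode

        # Placeholder DNA barcode sequence; real ITS2 barcodes are longer.
        sequence = "ATGCGTACGTTAGCCGGATCCAAGTTCGA"

        # Encode the sequence; error correction lets damaged or partly
        # obscured prints still decode correctly.
        img = qrcode.make(sequence,
                          error_correction=qrcode.constants.ERROR_CORRECT_M)
        img.save("dna_barcode_qr.png")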

  1. Cardio-PACs: a new opportunity

    NASA Astrophysics Data System (ADS)

    Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary

    2000-05-01

    It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications include: acquire 30 frames/sec; replay 15 frames/sec; access to file server 5 seconds, and to archive 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for catheterization laboratory, file server, long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.

  2. Improving Website Hyperlink Structure Using Server Logs

    PubMed Central

    Paranjape, Ashwin; West, Robert; Zia, Leila; Leskovec, Jure

    2016-01-01

    Good websites should be easy to navigate via hyperlinks, yet maintaining a high-quality link structure is difficult. Identifying pairs of pages that should be linked may be hard for human editors, especially if the site is large and changes frequently. Further, given a set of useful link candidates, the task of incorporating them into the site can be expensive, since it typically involves humans editing pages. In the light of these challenges, it is desirable to develop data-driven methods for automating the link placement task. Here we develop an approach for automatically finding useful hyperlinks to add to a website. We show that passively collected server logs, beyond telling us which existing links are useful, also contain implicit signals indicating which nonexistent links would be useful if they were to be introduced. We leverage these signals to model the future usefulness of yet nonexistent links. Based on our model, we define the problem of link placement under budget constraints and propose an efficient algorithm for solving it. We demonstrate the effectiveness of our approach by evaluating it on Wikipedia, a large website for which we have access to both server logs (used for finding useful new links) and the complete revision history (containing a ground truth of new links). As our method is based exclusively on standard server logs, it may also be applied to any other website, as we show with the example of the biomedical research site Simtk. PMID:28345077
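
    The budgeted placement step can be sketched separately from the usefulness model; below, each candidate link carries a hypothetical predicted value and a greedy pass enforces a per-page budget. This is a simplified stand-in for the paper's algorithm, which also accounts for interactions among links on the same page.

        # Candidate links with hypothetical predicted usefulness
        # (expected clicks if the link were added).
        candidates = [
            ("PageA", "PageX", 120.0),
            ("PageA", "PageY", 80.0),
            ("PageA", "PageZ", 60.0),
            ("PageB", "PageX", 95.0),
        ]

        BUDGET_PER_PAGE = 2  # at most this many new links per source page

        added, per_page = [], {}
        for src, dst, value in sorted(candidates, key=lambda c: -c[2]):
            if per_page.get(src, 0) < BUDGET_PER_PAGE:
                added.append((src, dst, value))
                per_page[src] = per_page.get(src, 0) + 1

        print(added)  # highest-value links per page, within budget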

  3. NEOview: Near Earth Object Data Discovery and Query

    NASA Astrophysics Data System (ADS)

    Tibbetts, M.; Elvis, M.; Galache, J. L.; Harbo, P.; McDowell, J. C.; Rudenko, M.; Van Stone, D.; Zografou, P.

    2013-10-01

    Missions to Near Earth Objects (NEOs) figure prominently in NASA's Flexible Path approach to human space exploration. NEOs offer insight into both the origins of the Solar System and of life, as well as a source of materials for future missions. With NEOview scientists can locate NEO datasets, explore metadata provided by the archives, and query or combine disparate NEO datasets in the search for NEO candidates for exploration. NEOview is a software system that illustrates how standards-based interfaces facilitate NEO data discovery and research. NEOview software follows a client-server architecture. The server is a configurable implementation of the International Virtual Observatory Alliance (IVOA) Table Access Protocol (TAP), a general interface for tabular data access, that can be deployed as a front end to existing NEO datasets. The TAP client, seleste, is a graphical interface that provides intuitive means of discovering NEO providers, exploring dataset metadata to identify fields of interest, and constructing queries to retrieve or combine data. It features a powerful, graphical query builder capable of easing the user's introduction to table searches. Through science use cases, NEOview demonstrates how potential targets for NEO rendezvous could be identified by combining data from complementary sources. Through deployment and operations, it has been shown that the software components are data independent and configurable to many different data servers. As such, NEOview's TAP server and seleste TAP client can be used to create a seamless environment for data discovery and exploration for tabular data in any astronomical archive.
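
    Since TAP is plain HTTP, the protocol can be exercised without any dedicated client; the sketch below issues a synchronous ADQL query with the standard TAP parameters, where the service URL, table, and column names are placeholders rather than actual NEOview endpoints.

        import requests

        TAP_SYNC = "https://example.org/tap/sync"  # placeholder service URL

        params = {
            "REQUEST": "doQuery",
            "LANG": "ADQL",
            "FORMAT": "votable",
            # Table and column names below are hypothetical.
            "QUERY": "SELECT TOP 10 designation, a, e, i FROM neo.orbits "
                     "WHERE e < 0.3 ORDER BY a",
        }

        resp = requests.get(TAP_SYNC, params=params, timeout=30)
        resp.raise_for_status()
        print(resp.text[:500])  # beginning of the VOTable XML result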

  4. Analytical modeling and feasibility study of a multi-GPU cloud-based server (MGCS) framework for non-voxel-based dose calculations.

    PubMed

    Neylon, J; Min, Y; Kupelian, P; Low, D A; Santhanam, A

    2017-04-01

    In this paper, a multi-GPU cloud-based server (MGCS) framework is presented for dose calculations, exploring the feasibility of remote computing power for parallelization and acceleration of computationally and time intensive radiotherapy tasks in moving toward online adaptive therapies. An analytical model was developed to estimate theoretical MGCS performance acceleration and intelligently determine workload distribution. Numerical studies were performed with a computing setup of 14 GPUs distributed over 4 servers interconnected by a 1 Gigabit per second (Gbps) network. Inter-process communication methods were optimized to facilitate resource distribution and minimize data transfers over the server interconnect. The analytically predicted computation times matched experimental observations within 1-5%. MGCS performance approached a theoretical limit of acceleration proportional to the number of GPUs utilized when computational tasks far outweighed memory operations. The MGCS implementation reproduced ground-truth dose computations with negligible differences by distributing the work among several processes and implementing optimization strategies. The results showed that a cloud-based computation engine was a feasible solution for enabling clinics to make use of fast dose calculations for advanced treatment planning and adaptive radiotherapy. The cloud-based system was able to exceed the performance of a local machine even for optimized calculations, and provided significant acceleration for computationally intensive tasks. Such a framework can provide access to advanced technology and computational methods to many clinics, providing an avenue for standardization across institutions without the requirements of purchasing, maintaining, and continually updating hardware.
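
    The shape of such an analytical model can be illustrated with a toy formula; the sketch below assumes, purely for illustration, that compute time divides across GPUs while data transfer over the shared link does not. This is an invented stand-in, not the paper's actual model.

        def predicted_time(compute_s, transfer_gb, n_gpus, link_gbps=1.0):
            """Toy estimate: parallelizable compute plus serial data
            transfer over a shared link (illustrative assumptions only)."""
            transfer_s = transfer_gb * 8.0 / link_gbps
            return compute_s / n_gpus + transfer_s

        base = predicted_time(compute_s=120.0, transfer_gb=2.0, n_gpus=1)
        for n in (1, 4, 14):
            t = predicted_time(compute_s=120.0, transfer_gb=2.0, n_gpus=n)
            print(f"{n:2d} GPUs -> {t:6.1f} s, speedup {base / t:.2f}x")

    Even this toy version reproduces the qualitative finding: speedup approaches proportionality to GPU count only while compute dominates, and saturates once transfer costs take over.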

  5. Survey Software Evaluation

    DTIC Science & Technology

    2009-01-01

    Oracle 9i, 10g  MySQL  MS SQL Server MS SQL Server Operating System Supported Windows 2003 Server  Windows 2000 Server (32 bit...WebStar (Mac OS X)  SunOne Internet Information Services (IIS) Database Server Supported MS SQL Server  MS SQL Server  Oracle 9i, 10g...challenges of Web-based surveys are: 1) identifying the best Commercial Off the Shelf (COTS) Web-based survey packages to serve the particular

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The open source Project Haystack initiative defines metadata and communication standards for data from buildings and intelligent devices. The Project Haystack REST API defines standard formats and operations for exchanging Haystack-tagged data over HTTP. The HaystackRuby gem wraps calls to this REST API to enable Ruby applications to easily integrate data hosted on a Project Haystack compliant server. The HaystackRuby gem was developed at the National Renewable Energy Lab to support applications related to campus energy. We hope that this tool may be useful to others.

  7. Voluntary Aviation Safety Information-Sharing Process: Preliminary Audit of Distributed FOQA and ASAP Archives Against Industry Statement of Requirements

    DTIC Science & Technology

    2007-04-01

    the underlying parameters are available. Standard data format. Battelle, SAGEM Avionics, and Austin Digital, Inc. agreed upon a standard data format...data was initiated at four airlines by SAGEM Avionics beginning January 1, 2006. Transfer was initiated at one airline by Austin Digital, Inc... internal issues have been resolved. As of April 2006, more than 124,000 flights have been transferred to local archive servers by SAGEM and over

  8. A VBA Desktop Database for Proposal Processing at National Optical Astronomy Observatories

    NASA Astrophysics Data System (ADS)

    Brown, Christa L.

    National Optical Astronomy Observatories (NOAO) has developed a relational Microsoft Windows desktop database using Microsoft Access and the Microsoft Office programming language, Visual Basic for Applications (VBA). The database is used to track data relating to observing proposals from original receipt through the review process, scheduling, observing, and final statistical reporting. The database has automated proposal processing and distribution of information. It allows NOAO to collect and archive data so as to query and analyze information about our science programs in new ways.

  9. The Effects of Using Microsoft Power Point on EFL Learners' Attitude and Anxiety: Case Study of Two Master Students of Didactics of English as a Foreign Language, Djillali Liabes University, Sidi Bel Abbes, Algeria

    ERIC Educational Resources Information Center

    Benghalem, Boualem

    2015-01-01

    This study aims to investigate the effects of using ICT tools such as Microsoft PowerPoint on EFL students' attitude and anxiety. The participants in this study were 40 Master 2 students of Didactics of English as a Foreign Language, Djillali Liabes University, Sidi Bel Abbes Algeria. In order to find out the effects of Microsoft PowerPoint on EFL…

  10. Investigating the Feasibility of Conducting Human Tracking and Following in an Indoor Environment Using a Microsoft Kinect and the Robot Operating System

    DTIC Science & Technology

    2017-06-01

    implement human following on a mobile robot in an indoor environment . B. FUTURE WORK Future work that could be conducted in the realm of this thesis...FEASIBILITY OF CONDUCTING HUMAN TRACKING AND FOLLOWING IN AN INDOOR ENVIRONMENT USING A MICROSOFT KINECT AND THE ROBOT OPERATING SYSTEM by...FEASIBILITY OF CONDUCTING HUMAN TRACKING AND FOLLOWING IN AN INDOOR ENVIRONMENT USING A MICROSOFT KINECT AND THE ROBOT OPERATING SYSTEM 5. FUNDING NUMBERS

  11. THttpServer class in ROOT

    NASA Astrophysics Data System (ADS)

    Adamczewski-Musch, Joern; Linev, Sergey

    2015-12-01

    The new THttpServer class in ROOT implements an HTTP server for arbitrary ROOT applications. It is based on the Civetweb embeddable HTTP server and provides direct access to all objects registered with the server. Object data can be provided in different formats: binary, XML, GIF/PNG, and JSON. A generic user interface for THttpServer has been implemented in HTML/JavaScript, based on the JavaScript ROOT development. With any modern web browser one can list, display, and monitor objects available on the server. THttpServer is used in the Go4 framework to provide an HTTP interface to the online analysis.
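
    A minimal PyROOT sketch of the class is shown below, assuming a ROOT build with HTTP support; the port and histogram are arbitrary choices for illustration.

        import time
        import ROOT

        # Start the embedded Civetweb engine on port 8080.
        serv = ROOT.THttpServer("http:8080")

        # Any registered object becomes browsable at http://localhost:8080/.
        h = ROOT.TH1F("h_demo", "demo histogram", 100, -3, 3)
        serv.Register("/", h)

        while True:                       # keep filling; the browser view updates live
            h.FillRandom("gaus", 1000)
            ROOT.gSystem.ProcessEvents()  # let the server handle HTTP requests
            time.sleep(0.5)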

  12. Integrating Wireless Networking for Radiation Detection

    NASA Astrophysics Data System (ADS)

    Board, Jeremy; Barzilov, Alexander; Womble, Phillip; Paschal, Jon

    2006-10-01

    As wireless networking becomes more available, new applications are being developed for this technology. Our group has been studying the advantages of wireless networks of radiation detectors. With the prevalence of the IEEE 802.11 standard (``WiFi''), we have developed a wireless detector unit which is comprised of a 5 cm x 5 cm NaI(Tl) detector, amplifier and data acquisition electronics, and a WiFi transceiver. A server may communicate with the detector unit using a TCP/IP network connected to a WiFi access point. Special software on the server will perform radioactive isotope determination and estimate dose-rates. We are developing an enhanced version of the software which utilizes the receiver signal strength index (RSSI) to estimate source strengths and to create maps of radiation intensity.

  13. [Relevance of the hemovigilance regional database for the shared medical file identity server].

    PubMed

    Doly, A; Fressy, P; Garraud, O

    2008-11-01

    The French Health Products Safety Agency coordinates the national initiative for computerization of blood product traceability within regional blood banks and public and private hospitals. The Auvergne-Loire Regional French Blood Service, based in Saint-Etienne, together with a number of public hospitals, set up a transfusion data network named EDITAL. After four years of progressive implementation and experimentation, software enabling standardized data exchange has built up a regional nominative database, endorsed by the Traceability Computerization National Committee in 2004. This database now provides secured web access to a regional transfusion history, enabling biologists and all hospital and family practitioners to manage patient follow-up. By running independently from the software of its partners, the EDITAL database provides a reference for the regional identity server.

  14. Development of a graphical user interface for the global land information system (GLIS)

    USGS Publications Warehouse

    Alstad, Susan R.; Jackson, David A.

    1993-01-01

    The process of developing a Motif graphical user interface for the Global Land Information System (GLIS) involved incorporating user requirements, in-house visual and functional design requirements, and Open Software Foundation (OSF) Motif style guide standards. The Motif user interface windows were developed, and the software supporting the Motif window functions was written in the C programming language. The GLIS architecture was modified to support multiple servers and remote handlers running the X Window System by forming a network of servers and handlers connected by TCP/IP communications. In April 1993, prior to release, the GLIS graphical user interface and system architecture modifications were tested by developers and users located at the EROS Data Center and 11 beta test sites across the country.

  15. Characteristics and Energy Use of Volume Servers in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuchs, H.; Shehabi, A.; Ganeshalingam, M.

    Servers’ field energy use remains poorly understood, given heterogeneous computing loads, configurable hardware and software, and operation over a wide range of management practices. This paper explores various characteristics of 1- and 2-socket volume servers that affect energy consumption, and quantifies the difference in power demand between higher-performing SPEC and ENERGY STAR servers and our best understanding of a typical server operating today. We first establish general characteristics of the U.S. installed base of volume servers from existing IDC data and the literature, before presenting information on server hardware configurations from data collection events at a major online retail website. We then compare cumulative distribution functions of server idle power across three separate datasets and explain the differences between them via examination of the hardware characteristics to which power draw is most sensitive. We find that idle server power demand is significantly higher than ENERGY STAR benchmarks and the industry-released energy use documented in SPEC, and that SPEC server configurations (and likely the associated power-scaling trends) are atypical of volume servers. Next, we examine recent trends in server power draw among high-performing servers across their full load range to consider how representative these trends are of all volume servers before inputting weighted average idle power load values into a recently published model of national server energy use. Finally, we present results from two surveys of IT managers (n=216) and IT vendors (n=178) that illustrate the prevalence of more-efficient equipment and operational practices in server rooms and closets; these findings highlight opportunities to improve the energy efficiency of the U.S. server stock.

  16. Mobile Assisted Security in Wireless Sensor Networks

    DTIC Science & Technology

    2015-08-03

    server from Google’s DNS, Chromecast and the content server does the 3-way TCP Handshake which is followed by Client Hello and Server Hello TLS messages...utilized TLS v1.2, except NTP servers and google’s DNS server. In the TLS v1.2, after handshake, client and server sends Client Hello and Server Hello ...Messages in order. In Client Hello messages, client offers a list of Cipher Suites that it supports. Each Cipher Suite defines the key exchange algorithm

  17. Analyzing CRISM hyperspectral imagery using PlanetServer.

    NASA Astrophysics Data System (ADS)

    Figuera, Ramiro Marco; Pham Huu, Bang; Minin, Mikhail; Flahaut, Jessica; Halder, Anik; Rossi, Angelo Pio

    2017-04-01

    Mineral characterization of planetary surfaces bears great importance for space exploration. In order to perform it, orbital hyperspectral imagery is widely used. In our research we use Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) [1] TRDR L observations with a spectral range of 1 to 4 µm. PlanetServer comprises a server, a web client and a Python client/API. The server side uses the Array DataBase Management System (DBMS) Raster Data Manager (Rasdaman) Community Edition [2]. OGC standards such as the Web Coverage Processing Service (WCPS) [3], an SQL-like language capable of querying information along the image cube, are implemented in the PetaScope component [4]. The client side uses NASA's Web World Wind [5], allowing the user to access the data in an intuitive way. The client consists of a globe where all cubes are deployed, a main menu where projections, base maps and RGB combinations are provided, and a plot dock where the spectral information is shown. The RGB combinator tool allows band combinations such as the CRISM products [6] to be created using WCPS. The spectral information is retrieved using WCPS and shown in the plot dock/widget. The USGS splib06a library [7] is available to compare CRISM versus laboratory spectra. The Python API provides an environment to create RGB combinations that can be embedded into existing pipelines. All employed libraries and tools are open source and can be easily adapted to other datasets. PlanetServer stands as a promising tool for spectral analysis on planetary bodies. M3/Moon and OMEGA datasets will soon be available. [1] S. Murchie et al., "Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) on Mars Reconnaissance Orbiter (MRO)," J. Geophys. Res. E Planets, 2007. [2] P. Baumann, A. Dehmel, P. Furtado, R. Ritsch, and N. Widmann, "The multidimensional database system RasDaMan," ACM SIGMOD Rec., vol. 27, no. 2, pp. 575-577, Jun. 1998. [3] P. Baumann, "The OGC web coverage processing service (WCPS) standard," Geoinformatica, vol. 14, no. 4, Jul. 2010. [4] A. Aiordǎchioaie and P. Baumann, "PetaScope: An open-source implementation of the OGC WCS Geo service standards suite," Lect. Notes Comput. Sci., vol. 6187 LNCS, pp. 160-168, Jun. 2010. [5] P. Hogan, C. Maxwell, R. Kim, and T. Gaskins, "World Wind 3D Earth Viewing," Apr. 2007. [6] C. E. Viviano-Beck et al., "Revised CRISM spectral parameters and summary products based on the currently detected mineral diversity on Mars," J. Geophys. Res. E Planets, vol. 119, no. 6, pp. 1403-1431, Jun. 2014. [7] R. N. Clark et al., "USGS digital spectral library splib06a: U.S. Geological Survey, Digital Data Series 231," 2007. [Online]. Available: http://speclab.cr.usgs.gov/spectral.lib06.
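
    As a concrete illustration of a WCPS call, the sketch below sends a query to a rasdaman/PetaScope endpoint over HTTP; the service URL and coverage name are placeholders, not the actual PlanetServer deployment.

        import requests

        ENDPOINT = "https://example.org/rasdaman/ows"  # placeholder server

        # WCPS: average value over a hypothetical CRISM coverage.
        wcps = "for c in (crism_cube) return avg(c)"

        resp = requests.get(ENDPOINT, params={
            "service": "WCS",
            "version": "2.0.1",
            "request": "ProcessCoverages",
            "query": wcps,
        }, timeout=60)
        resp.raise_for_status()
        print(resp.text)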

  18. Earth Observing System Data Gateway

    NASA Technical Reports Server (NTRS)

    Pfister, Robin; McMahon, Joe; Amrhein, James; Sefert, Ed; Marsans, Lorena; Solomon, Mark; Nestler, Mark

    2006-01-01

    The Earth Observing System Data Gateway (EDG) software provides a "one-stop-shopping" standard interface for exploring and ordering Earth-science data stored at geographically distributed sites. EDG enables a user to do the following: 1) Search for data according to high-level criteria (e.g., geographic location, time, or satellite that acquired the data); 2) Browse the results of a search, viewing thumbnail sketches of data that satisfy the user's criteria; and 3) Order selected data for delivery to a specified address on a chosen medium (e.g., compact disk or magnetic tape). EDG consists of (1) a component that implements a high-level client/server protocol, and (2) a collection of C-language libraries that implement the passing of protocol messages between an EDG client and one or more EDG servers. EDG servers are located at sites usually called "Distributed Active Archive Centers" (DAACs). Each DAAC may allow access to many individual data items, called "granules" (e.g., single Landsat images). Related granules are grouped into collections called "data sets." EDG enables a user to send a search query to multiple DAACs simultaneously, inspect the resulting information, select browseable granules, and then order selected data from the different sites in a seamless fashion.

  19. COMAN: a web server for comprehensive metatranscriptomics analysis.

    PubMed

    Ni, Yueqiong; Li, Jun; Panagiotou, Gianni

    2016-08-11

    Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads and removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. Conclusions: COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to facilitate researchers with less expertise in bioinformatics in answering microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.

  20. SU-D-BRD-02: A Web-Based Image Processing and Plan Evaluation Platform (WIPPEP) for Future Cloud-Based Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, X; Liu, L; Xing, L

    Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software updates. We present a web-based image processing and plan evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server and computation server. Each independent server communicates with the others through HTTP requests. The web server is the key component that provides visualizations and a user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, Javascript, and jQuery. The image server is based on the open source DCM4CHEE PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests. Our computation server was implemented in Delphi, Python and PHP, which can process data directly or via a C++ program DLL. Results: This software platform is running on a 32-core CPU server virtually hosting the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize images and RT structures belonging to this patient, and perform image segmentation running the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation server. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and exhibited potential for future cloud-based radiotherapy.

  1. 75 FR 7648 - Agency Information Collection Activities: Emergency Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-22

    ..., recipients, and representative payees: Braille and Microsoft Word files (on data compact discs). Current...) Braille, or (5) Microsoft Word. This call did not require OMB clearance. However, there may be respondents...

  2. [MapDraw: a microsoft excel macro for drawing genetic linkage maps based on given genetic linkage data].

    PubMed

    Liu, Ren-Hu; Meng, Jin-Ling

    2003-05-01

    MAPMAKER is one of the most widely used software packages for constructing genetic linkage maps. However, the PC version, MAPMAKER 3.0 for PC, cannot draw the genetic linkage maps that its Macintosh version, MAPMAKER 3.0 for Macintosh, was able to draw. This matters because, especially in recent years, the Macintosh has become much less popular than the PC, and most geneticists analyze their genetic linkage data on PCs; software that draws on the PC the same genetic linkage maps that MAPMAKER for Macintosh draws on the Macintosh has therefore been in great demand. Microsoft Excel, one component of the Microsoft Office package, is among the most popular software for laboratory data processing, and Microsoft Visual Basic for Applications (VBA) is one of its most powerful features: with this programming language one can take creative control of Excel, including genetic linkage map construction, automatic data processing and more. In this paper, a Microsoft Excel macro called MapDraw is constructed to draw genetic linkage maps on PC computers from given genetic linkage data. With this software you can freely construct genetic linkage maps in Excel and freely edit and copy them to Word or other applications. The software is simply an Excel-format file. It can be copied freely from ftp://211.69.140.177 or ftp://brassica.hzau.edu.cn, and the source code can be found in Excel's Visual Basic Editor.
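
    MapDraw itself is an Excel/VBA macro; for readers working outside Excel, a rough Python analogue of the drawing step is sketched below. It renders one linkage group from (marker, position in cM) pairs; the marker data are invented.

```python
import matplotlib.pyplot as plt

# One linkage group as (marker name, map position in centimorgans).
markers = [("RM1", 0.0), ("RM23", 12.4), ("RM54", 31.7), ("RM88", 55.2)]

fig, ax = plt.subplots(figsize=(2, 6))
top = max(pos for _, pos in markers)
ax.plot([0, 0], [0, top], lw=6, color="grey", solid_capstyle="round")  # bar
for name, pos in markers:
    ax.plot([-0.05, 0.05], [pos, pos], color="black")           # tick mark
    ax.text(-0.10, pos, f"{pos:.1f}", ha="right", va="center")  # position
    ax.text(0.10, pos, name, ha="left", va="center")            # marker name
ax.invert_yaxis()  # linkage maps read top-down from 0 cM
ax.axis("off")
plt.savefig("linkage_group.png", dpi=150, bbox_inches="tight")
```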

  3. How to make your own response boxes: A step-by-step guide for the construction of reliable and inexpensive parallel-port response pads from computer mice.

    PubMed

    Voss, Andreas; Leonhart, Rainer; Stahl, Christoph

    2007-11-01

    Psychological research is based in large part on response latencies, which are often registered by keypresses on a standard computer keyboard. Recording response latencies with a standard keyboard is problematic because keypresses are buffered within the keyboard hardware before they are signaled to the computer, adding error variance to the recorded latencies. This can be circumvented by using external response pads connected to the computer's parallel port. In this article, we describe how to build inexpensive, reliable, and easy-to-use response pads with six keys from two standard computer mice that can be connected to the PC's parallel port. We also address the problem of recording data from the parallel port with different software packages under Microsoft Windows XP.
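
    Reading the key states amounts to polling the parallel port's status register. One common route on Windows is the third-party inpout32.dll; the sketch below assumes that DLL is installed and that the port sits at the conventional LPT1 base address 0x378, so treat both as assumptions rather than the article's method.

```python
import ctypes

inpout = ctypes.WinDLL("inpout32.dll")  # Windows-only; DLL must be installed
STATUS = 0x379  # LPT1 status register: five input lines on bits 3..7

def read_keys() -> list[int]:
    value = inpout.Inp32(STATUS)
    value ^= 0x80  # bit 7 (BUSY) is hardware-inverted on standard ports
    return [(value >> bit) & 1 for bit in range(3, 8)]

if __name__ == "__main__":
    print("status lines:", read_keys())
```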

  4. OPeNDAP Server4: Building a High-Performance Server for the DAP by Leveraging Existing Software

    NASA Astrophysics Data System (ADS)

    Potter, N.; West, P.; Gallagher, J.; Garcia, J.; Fox, P.

    2006-12-01

    OPeNDAP has been working in conjunction with NCAR/ESSL/HAO to develop a modular, high-performance data server that will be the successor to the current OPeNDAP data server. The new server, called Server4, is really two servers: a 'Back-End' data server, which reads information from various types of data sources and packages the results in DAP objects, and a 'Front-End', which receives client DAP requests and then decides how to use features of the Back-End data server to build the correct responses. This architecture can be configured in several interesting ways: the Front- and Back-End components can be run on either the same or different machines, depending on security and performance needs; new Front-End software can be written to support other network data access protocols; and local applications can interact directly with the Back-End data server. The new server's Back-End component uses the server infrastructure developed by HAO for the Earth System Grid II project, and the extensions needed to use it as part of the new OPeNDAP server were minimal. The HAO server was modified so that it loads 'data handlers' at run time; each data handler module only needs to satisfy a simple interface, which both enables the existing data handlers written for the old OPeNDAP server to be used directly and simplifies writing new handlers from scratch. The Back-End server leverages high-performance features developed for the ESG II project, so applications that interact with it directly can read large volumes of data efficiently. The Front-End module of Server4 uses the Java Servlet system in place of the Common Gateway Interface (CGI) used in the past. New Front-End modules can be written to support different network data access protocols, so the same server will ultimately be able to support more than the DAP/2.0 protocol; as an example, we will discuss a SOAP interface that is currently in development. In addition to support for DAP/2.0 and prototypical support for a SOAP interface, the new server includes support for the THREDDS cataloging protocol. THREDDS is tightly integrated into the Front-End of Server4, which can make full use of advanced THREDDS features such as attribute specification and inheritance and custom catalogs that segue into automatically generated catalogs, while providing a default behavior that requires almost no catalog configuration.
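
    Because Server4's Front-End speaks DAP/2.0 over HTTP, any HTTP client can exercise it. In DAP2 the dataset structure, attributes, and data are separate responses selected by URL suffix; the sketch below uses a placeholder host and an assumed variable name "sst".

```python
import requests

base = "http://example.org/opendap/data/sample.nc"  # placeholder dataset URL

print(requests.get(base + ".dds").text)  # dataset structure (types, shapes)
print(requests.get(base + ".das").text)  # dataset attributes

# A DAP2 constraint expression subsets on the server before anything is
# sent: here, the first ten values of a variable assumed to be named "sst".
print(requests.get(base + ".asc?sst[0:1:9]").text)
```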

  5. An extensible and lightweight architecture for adaptive server applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorton, Ian; Liu, Yan; Trivedi, Nihar

    2008-07-10

    Server applications augmented with behavioral adaptation logic can react to environmental changes, creating self-managing server applications with improved quality of service at runtime. However, developing adaptive server applications is challenging due to the complexity of the underlying server technologies and highly dynamic application environments. This paper presents an architecture framework, the Adaptive Server Framework (ASF), to facilitate the development of adaptive behavior for legacy server applications. ASF provides a clear separation between the implementation of adaptive behavior and the business logic of the server application. This means a server application can be extended with programmable adaptive features through the definition and implementation of control components defined in ASF. Furthermore, ASF is a lightweight architecture in that it incurs low CPU overhead and memory usage. We demonstrate the effectiveness of ASF through a case study in which a server application dynamically determines the resolution and quality at which to scale an image, based on the load of the server and the network connection speed. The experimental evaluation demonstrates the performance gains possible through adaptive behavior and the low overhead introduced by ASF.
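
    The core idea, keeping adaptation logic out of the business logic, can be shown in a few lines. In this hypothetical sketch (not ASF's actual API), the image-serving function takes a scale factor but never inspects load; a separate controller maps observed load to a scale, mirroring ASF's control components.

```python
import os

def serve_image(path: str, scale: float) -> bytes:
    """Business logic: return the image scaled by `scale` (stubbed here)."""
    data = open(path, "rb").read()
    return data[: max(1, int(len(data) * scale))]  # stand-in for real scaling

class ResolutionController:
    """Control component: maps observed server load to an image scale."""
    def choose_scale(self) -> float:
        load = os.getloadavg()[0]   # 1-minute load average (Unix-only call)
        if load < 1.0:
            return 1.0              # lightly loaded: full resolution
        if load < 4.0:
            return 0.5
        return 0.25                 # heavily loaded: degrade quality

controller = ResolutionController()

def adaptive_serve(path: str) -> bytes:
    # The wrapper composes control and business logic without entangling them.
    return serve_image(path, controller.choose_scale())
```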

  6. TIPMaP: a web server to establish transcript isoform profiles from reliable microarray probes.

    PubMed

    Chitturi, Neelima; Balagannavar, Govindkumar; Chandrashekar, Darshan S; Abinaya, Sadashivam; Srini, Vasan S; Acharya, Kshitish K

    2013-12-27

    Standard 3' Affymetrix gene expression arrays have contributed a significantly higher volume of existing gene expression data than other microarray platforms. These arrays were designed to identify differentially expressed genes, but not their alternatively spliced transcript forms. Although it is possible to identify the expression patterns of specific mRNA forms from these microarray data, no existing resource does so. We report a web server for expression profiling of alternatively spliced transcripts using microarray data sets from 31 standard 3' Affymetrix arrays for human, mouse and rat. The tool has been experimentally validated for mRNAs transcribed or not detected in a human disease condition (non-obstructive azoospermia, a male infertility condition). About 4000 gene expression datasets were downloaded from a public repository. 'Good probes' with complete coverage of, and identity to, the latest reference transcript sequences were first identified. From these, 'transcript-specific probe clusters' were derived for each platform and used to determine the expression status of individual transcripts. The web server can lead the user to datasets corresponding to specific tissues or conditions via identifiers of the microarray studies or hybridizations, keywords, official gene symbols or reference transcript identifiers. In the tissues and conditions of interest, it can identify about 40% of known transcripts as 'transcribed', 'not-detected' or 'differentially regulated', and corresponding additional information for probes, genes, transcripts and proteins can be viewed as well. We identified the expression of transcripts in a specific clinical condition and validated a few of these transcripts experimentally (using reverse transcription followed by polymerase chain reaction). The experimental observations agreed with the web server results more often than they contradicted them. The tool is accessible at http://resource.ibab.ac.in/TIPMaP. This newly developed online tool forms a reliable means for identifying alternatively spliced transcript isoforms that may be differentially expressed in various tissues, cell types or physiological conditions. By making better use of existing data, TIPMaP thus avoids dependence on precious tissue samples in experiments aimed at establishing expression profiles of alternative splice forms, at least in some cases.
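
    The probe-cluster logic described above can be caricatured in a few lines: keep only probes that map cleanly to one transcript, group them per transcript, and call the transcript transcribed when its probes agree. The data and detection threshold below are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# (probe id, transcript it maps to, signal); only "good" probes included.
probes = [
    ("p1", "NM_0001", 820.0), ("p2", "NM_0001", 640.0),
    ("p3", "NM_0002", 35.0),  ("p4", "NM_0002", 52.0),
]
DETECTION = 200.0  # hypothetical detection threshold

clusters = defaultdict(list)
for _, transcript, signal in probes:
    clusters[transcript].append(signal)

for transcript, signals in sorted(clusters.items()):
    status = "transcribed" if mean(signals) >= DETECTION else "not-detected"
    print(transcript, status)
```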

  7. An Interactive Microsoft(registered tm) Excel Program for Tracking a Single Evaporating Droplet in Crossflow

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Marek, C. J.

    2004-01-01

    Droplet interaction with a high-temperature gaseous crossflow is important because of its wide application in systems involving two-phase mixing, such as combustion, which requires quick mixing of fuel and air with reduced pollutant formation, and jet mixing in the dilution zone of combustors. The focus of this work is therefore to investigate the dispersion of a two-dimensional atomized and evaporating spray jet into a two-dimensional crossflow. An interactive Microsoft Excel program for tracking a single droplet in crossflow, developed previously, has been modified to include droplet evaporation computation. In addition to the high-velocity airflow, the injected droplets are subjected to the combustor temperature and pressure, which affect their motion in the flow field. Six ordinary differential equations are solved by the 4th-order Runge-Kutta method in Microsoft Excel. Microsoft Visual Basic programming and Microsoft Excel macro code are used to produce the data and plot graphs describing the droplet's motion in the flow field. The program computes and plots the data sequentially without forcing the user to open other plotting programs. A user's manual on how to use the program is included.
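
    The numerical core is the classical 4th-order Runge-Kutta scheme. The sketch below applies the same scheme to a reduced two-dimensional droplet model (position and velocity relaxing toward the gas velocity over a response time tau); the constants are illustrative, not the report's values.

```python
import numpy as np

def rhs(state, u_gas=np.array([20.0, 0.0]), tau=5e-3):
    """Reduced droplet model: dx/dt = v, dv/dt = (u_gas - v) / tau."""
    v = state[2:]
    return np.concatenate([v, (u_gas - v) / tau])

def rk4_step(f, y, h):
    """One classical 4th-order Runge-Kutta step of size h."""
    k1 = f(y)
    k2 = f(y + 0.5 * h * k1)
    k3 = f(y + 0.5 * h * k2)
    k4 = f(y + h * k3)
    return y + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([0.0, 0.0, 0.0, 30.0])  # injected crosswise at 30 m/s
h = 1e-4
for _ in range(500):
    state = rk4_step(rhs, state, h)
print("position:", state[:2], "velocity:", state[2:])
```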

  8. Thirty Meter Telescope (TMT) Narrow Field Infrared Adaptive Optics System (NFIRAOS) real-time controller preliminary architecture

    NASA Astrophysics Data System (ADS)

    Kerley, Dan; Smith, Malcolm; Dunn, Jennifer; Herriot, Glen; Véran, Jean-Pierre; Boyer, Corinne; Ellerbroek, Brent; Gilles, Luc; Wang, Lianqi

    2016-08-01

    The Narrow Field Infrared Adaptive Optics System (NFIRAOS) is the first light Adaptive Optics (AO) system for the Thirty Meter Telescope (TMT). A critical component of NFIRAOS is the Real-Time Controller (RTC) subsystem which provides real-time wavefront correction by processing wavefront information to compute Deformable Mirror (DM) and Tip/Tilt Stage (TTS) commands. The National Research Council of Canada - Herzberg (NRC-H), in conjunction with TMT, has developed a preliminary design for the NFIRAOS RTC. The preliminary architecture for the RTC is comprised of several Linux-based servers. These servers are assigned various roles including: the High-Order Processing (HOP) servers, the Wavefront Corrector Controller (WCC) server, the Telemetry Engineering Display (TED) server, the Persistent Telemetry Storage (PTS) server, and additional testing and spare servers. There are up to six HOP servers that accept high-order wavefront pixels, and perform parallelized pixel processing and wavefront reconstruction to produce wavefront corrector error vectors. The WCC server performs low-order mode processing, and synchronizes and aggregates the high-order wavefront corrector error vectors from the HOP servers to generate wavefront corrector commands. The Telemetry Engineering Display (TED) server is the RTC interface to TMT and other subsystems. The TED server receives all external commands and dispatches them to the rest of the RTC servers and is responsible for aggregating several offloading and telemetry values that are reported to other subsystems within NFIRAOS and TMT. The TED server also provides the engineering GUIs and real-time displays. The Persistent Telemetry Storage (PTS) server contains fault tolerant data storage that receives and stores telemetry data, including data for Point-Spread Function Reconstruction (PSFR).

  9. Interoperability In The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.

    2015-12-01

    As the world becomes increasingly interconnected, there is a greater need for interoperability with software and applications in common use globally. For this purpose, the development of the new Planetary Science Archive (PSA) by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC) is focused on building a modern science archive that follows internationally recognised standards, so that the archive can be accessed through tools from third parties, for example the NASA Planetary Data System (PDS), the VESPA project from the Virtual Observatory of Paris, and other international institutions. The protocols and standards currently supported by the new Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA includes a GeoServer (an open-source map server), whose goal is to support use cases such as the distribution of search results and the sharing and processing of data through an OGC Web Feature Service (WFS) and Web Map Service (WMS). This server also allows retrieval of the requested information in several standard output formats, including Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV). The provision of these various output formats enables end users to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
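
    Since WFS is plain HTTP, pulling PSA-style search results as GeoJSON is a single GetFeature request. In the sketch below the server URL and layer name are placeholders, not the PSA's actual endpoint; outputFormat=application/json is the usual way to ask GeoServer for GeoJSON.

```python
import requests

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "psa:footprints",       # hypothetical layer name
    "outputFormat": "application/json",  # GeoServer's GeoJSON output
    "count": 10,
}
r = requests.get("http://example.org/geoserver/wfs", params=params)
for feature in r.json()["features"]:
    print(feature["id"], feature["geometry"]["type"])
```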

  10. 78 FR 45542 - Request for Information: The National Institute of Environmental Health Sciences/National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-29

    ... submitted electronically in Microsoft Excel or Word formats to [email protected] . FOR FURTHER... recommendations should be submitted electronically in Microsoft Excel or Word format. Respondents to this request...

  11. Selecting a Z39.50 Client or Web Gateway.

    ERIC Educational Resources Information Center

    Turner, Fay

    1998-01-01

    Provides a brief description of the Z39.50 information retrieval standard and reviews evaluation criteria and questions that should be asked when selecting a Z39.50 client. Areas for consideration include whether to buy or build a Z39.50 client, the end-user's requirements, connecting to a remote server, searching, managing the search response,…

  12. Energy Efficiency in Small Server Rooms: Field Surveys and Findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheung, Iris; Greenberg, Steve; Mahdavi, Roozbeh

    Fifty-seven percent of US servers are housed in server closets, server rooms, and localized data centers, commonly referred to collectively as small server rooms, which comprise 99 percent of all server spaces in the US. While many mid-tier and enterprise-class data centers are owned by large corporations that treat energy efficiency as a way to minimize business operating costs, small server rooms typically are not similarly motivated. They are characterized by decentralized ownership and management and come in many configurations, which creates a unique set of efficiency challenges. To develop energy efficiency strategies for these spaces, we surveyed 30 small server rooms across eight institutions and selected four of them for detailed assessments. The four rooms had Power Usage Effectiveness (PUE) values ranging from 1.5 to 2.1. Energy-saving opportunities ranged from no- and low-cost measures, such as raising cooling set points and better airflow management, to more involved but cost-effective measures, including server consolidation and virtualization and dedicated cooling with economizers. We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers, rather than server rooms. Backup power requirements and IT and cooling efficiency should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge about server operation and the implementation of energy efficiency measures in small server rooms.
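
    For reference, Power Usage Effectiveness is total facility energy divided by IT equipment energy, so the 2.1 reported above means more than half the energy never reached the IT load. A trivial sketch with invented figures:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: 1.0 is ideal; 2.0 means half the energy
    goes to cooling, power distribution, and other overhead."""
    return total_facility_kwh / it_equipment_kwh

print(pue(42_000, 20_000))  # 2.1, matching the worst room surveyed
```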

  13. Pyrolaser Operating System

    NASA Technical Reports Server (NTRS)

    Roberts, Floyd E., III

    1994-01-01

    This software provides for control of, and acquisition of data from, an optical pyrometer. There are six individual programs in the PYROLASER package, which together provide a quick and easy way to set up, control, and program a standard Pyrolaser. Temperature and emissivity measurements are either collected as if the Pyrolaser were in manual operating mode, or displayed on real-time strip charts and stored in a standard spreadsheet format for post-test analysis. A shell is supplied to allow test-specific macros to be added to the system easily. The package is written in LabVIEW for use on Macintosh-series computers running System 6.0.3 or later, Sun SPARC-series computers running OpenWindows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatible computers running Microsoft Windows 3.1 or later.

  14. Cybersecurity, massive data processing, community interaction, and other developments at WWW-based computational X-ray Server

    NASA Astrophysics Data System (ADS)

    Stepanov, Sergey

    2013-03-01

    X-Ray Server (x-server.gmca.aps.anl.gov) is a WWW-based computational server for modeling X-ray diffraction, reflection and scattering data. The modeling software operates directly on the server and can be accessed remotely either from web browsers or from user software. In the latter case the server can be deployed as a software library or a data-fitting engine. As the server recently passed the milestones of 15 years online and 1.5 million calculations, it has accumulated a number of technical solutions that are discussed in this paper. The approaches developed for detecting physical model limits and user calculation failures, the solutions to spam and firewall problems, the ways to involve the community in replenishing databases, and the methods for teaching users automated access to the server programs may be helpful for X-ray researchers interested in using the server or sharing their own software online.
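
    The paper's point about automated access amounts to scripting HTTP requests against the server's programs. The sketch below shows the generic pattern only; the script path and parameter names are hypothetical, so the server's own documentation is the authority on the real interface.

```python
import requests

# Hypothetical program path and parameter names, for illustration only.
params = {"xway": 2, "wave": 1.5405, "code": "Si"}
r = requests.get("https://x-server.gmca.aps.anl.gov/cgi/some_program.exe",
                 params=params, timeout=60)
print(r.status_code, len(r.text))
```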

  15. Effect of video server topology on contingency capacity requirements

    NASA Astrophysics Data System (ADS)

    Kienzle, Martin G.; Dan, Asit; Sitaram, Dinkar; Tetzlaff, William H.

    1996-03-01

    Video servers need to assign a fixed set of resources to each video stream in order to guarantee on-time delivery of the video data. If a server has insufficient resources to guarantee delivery, it must reject the stream request rather than slow down all existing streams. Large-scale video servers are being built as clusters of smaller components so as to be economical, scalable, and highly available. This paper uses a blocking model developed for telephone systems to evaluate video server cluster topologies. The goal is to achieve high utilization of the components and low per-stream cost, combined with low blocking probability and high user satisfaction. The analysis shows substantial economies of scale achieved by larger server images. Simple distributed server architectures can result in partitioning of resources with low achievable resource utilization. By comparing the achievable resource utilization of partitioned and monolithic servers, we quantify the cost of partitioning. Next, we present an architecture for a distributed server system that avoids resource partitioning and results in highly efficient server clusters. Finally, we show how, in these server clusters, further optimizations can be achieved through caching and batching of video streams.
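
    The classic telephone-system blocking model is the Erlang-B formula (the abstract does not name the exact model used). The sketch below uses the standard numerically stable recurrence to show the economies of scale the paper reports: at equal offered load per port, the larger pool blocks noticeably less.

```python
def erlang_b(servers: int, offered_load: float) -> float:
    """Blocking probability for `servers` circuits carrying `offered_load`
    Erlangs, via the standard stable recurrence B(n) = aB/(n + aB)."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# Partitioned cluster: two independent halves, each 100 streams at 90 Erlangs.
print("partitioned:", erlang_b(100, 90.0))
# Monolithic server: one pool of 200 streams at the same total load.
print("monolithic: ", erlang_b(200, 180.0))
```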

  16. Evolution of the Data Access Protocol in Response to Community Needs

    NASA Astrophysics Data System (ADS)

    Gallagher, J.; Caron, J. L.; Davis, E.; Fulker, D.; Heimbigner, D.; Holloway, D.; Howe, B.; Moe, S.; Potter, N.

    2012-12-01

    Under the aegis of the OPULS (OPeNDAP-Unidata Linked Servers) project, funded by NOAA, version 2 of OPeNDAP's Data Access Protocol (DAP2) is being updated to version 4. DAP4 is the first major upgrade in almost two decades and embodies three main areas of advancement. First, the data-model extensions developed by the OPULS team focus on three areas: better support for coverages, access to HDF5 files, and access to relational databases. DAP2 support for coverages (defined as sampled functions) was limited to simple rectangular coverages that work well for (some) model outputs and processed satellite data but cannot represent, for example, trajectories or satellite swath data. We have extended the coverage concept in DAP4 to remove these limitations; these changes are informed by work at Unidata on the Common Data Model and by the OGC's abstract coverage specification. In a similar vein, we have extended DAP2's support for relations by including the concept of foreign keys, so that tables can be explicitly related to one another. Second, the web interfaces (web services) that provide access to data via DAP will be more clearly defined and will use other, orthogonal standards where appropriate. An important case is the XML interface, which provides a cleaner way to build other response media types such as JSON and RDF (for metadata) and to build support for Atom, thus simplifying the integration of DAP servers with tools that support OpenSearch. Input from the ESIP federation and work performed with IOOS have informed our choices here. Last, DAP4-compliant servers will support richer data-processing capabilities than DAP2, enabling a wider array of server functions that manipulate data before returning values. Two projects are currently exploring just what can be done even with DAP2's server-function model: the MIIC project at LaRC and OPULS itself (with work performed at the University of Washington). Both projects have demonstrated that server functions can be used to perform operations on large volumes of data and return results that are far smaller than would be required to achieve the same outcomes via client-side processing. We are using information from these efforts to inform the design of server functions in DAP4. Each of the three areas of DAP4 advancement is being guided by input from community members, including an OPULS Advisory Committee.

  17. WASP (Write a Scientific Paper) using Excel - 7: The t-distribution.

    PubMed

    Grech, Victor

    2018-03-01

    The calculation of descriptive statistics after data collection provides researchers with an overview of the shape and nature of their datasets, along with basic descriptors, and may help identify genuine or erroneous outlier values. This exercise should precede inferential statistics whenever possible. This paper provides some pointers for doing so in Microsoft Excel, both statically and dynamically, with Excel's functions, including the calculation of standard deviation and variance and the relevance of the t-distribution. Copyright © 2018 Elsevier B.V. All rights reserved.
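
    The same quantities can be checked outside Excel. The sketch below mirrors the relevant worksheet functions in Python: statistics.stdev and statistics.variance correspond to Excel's STDEV.S and VAR.S, and a two-tailed t probability corresponds to T.DIST.2T; the data are invented.

```python
import statistics
from scipy import stats

data = [4.1, 5.0, 3.8, 4.6, 5.2, 4.4, 4.9]  # invented sample

print(statistics.stdev(data))     # sample standard deviation, like STDEV.S
print(statistics.variance(data))  # sample variance, like VAR.S

# Two-tailed t probability, like Excel's T.DIST.2T(x, deg_freedom):
x, df = 2.3, len(data) - 1
print(2 * stats.t.sf(x, df))
```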

  18. Handling of computational in vitro/in vivo correlation problems by Microsoft Excel: IV. Generalized matrix analysis of linear compartment systems.

    PubMed

    Langenbucher, Frieder

    2005-01-01

    A linear system comprising n compartments is completely defined by the rate constants between any of the compartments and by the initial condition specifying in which compartment(s) the drug is present at the beginning. The general solution consists of the time profiles of the drug amount in each compartment, described by polyexponential equations. Based on standard matrix operations, an Excel worksheet computes the rate constants and the coefficients, and finally the full time profiles, for a specified range of time values.
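
    The matrix approach translates directly into code: for a linear system dx/dt = Kx, eigendecomposition of the rate-constant matrix K yields the exponents and coefficients of the polyexponential profiles. The two-compartment K below is an invented example, not taken from the paper.

```python
import numpy as np

# Rate-constant matrix: column j holds the flows out of compartment j.
# Here: absorption 1.2/h from compartment 1 into 2, elimination 0.3/h from 2.
K = np.array([[-1.2,  0.0],
              [ 1.2, -0.3]])
x0 = np.array([100.0, 0.0])   # all drug in compartment 1 at t = 0

lam, V = np.linalg.eig(K)     # exponents lam_i and their eigenvectors
c = np.linalg.solve(V, x0)    # x(t) = sum_i c_i * exp(lam_i * t) * V[:, i]

for t in (0.0, 1.0, 4.0, 8.0):
    x = (V * c) @ np.exp(lam * t)
    print(f"t = {t:4.1f} h  amounts = {np.real_if_close(x)}")
```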

  19. Whole-rock and glass major-element geochemistry of Kilauea Volcano, Hawaii, near-vent eruptive products: September 1994 through September 2001

    USGS Publications Warehouse

    Thornber, Carl R.; Sherrod, David R.; Siems, David F.; Heliker, Christina C.; Meeker, Gregory P.; Oscarson, Robert L.; Kauahikaua, James P.

    2002-01-01

    This report presents major-element geochemical data for glasses and whole-rock aliquots among 523 lava samples collected near the vent on Kilauea's east rift zone between September 1994 and October 2001. Information on sample collection, analysis techniques and analytical standard reproducibility is presented as a PDF file, which also includes a detailed explanation of the categories of sample information presented in the database spreadsheet. The sample database is downloadable as a separate Microsoft Excel file.

  20. FY96 Support to the Defense Information Systems Agency (DISA), Center for Standards (CFS) for continuing improvement of the DoD HCI Style Guide. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avery, L.W.; Donohoo, D.T.; Sanchez, J.A.

    1996-09-30

    PNNL successfully completed the three tasks: Task 1 - This task provided DISA with an updated set of design checklists that can be used to measure compliance with the Style Guide. These checklists are in Microsoft® Word 6.0 format. Task 2 - This task provided a discussion of two basic models for using the Style Guide and the Design Checklist, as a compliance tool and as a design tool.
