GLobal Integrated Design Environment
NASA Technical Reports Server (NTRS)
Kunkel, Matthew; McGuire, Melissa; Smith, David A.; Gefert, Leon P.
2011-01-01
The GLobal Integrated Design Environment (GLIDE) is a collaborative engineering application built to resolve the design session issue of passing data in real time between multiple discipline experts working in a collaborative environment. Utilizing Web protocols and multiple programming languages, GLIDE allows engineers to use the applications to which they are accustomed (in this case, Excel) to send and receive datasets via the Internet to a database-driven Web server. Traditionally, a collaborative design session consists of one or more engineers representing each discipline meeting together in a single location. The discipline leads exchange parameters and iterate through their respective processes to converge on an acceptable dataset. In cases in which the engineers are unable to meet, their parameters are passed via e-mail, telephone, facsimile, or even postal mail. This slow process of data exchange could elongate a design session to weeks or even months. While the iterative process remains in place, software can now exchange parameters securely and efficiently, while at the same time allowing much more information about a design session to be made available. GLIDE is written in a combination of several programming languages, including REALbasic, PHP, and Microsoft Visual Basic. GLIDE client installers are available to download for both Microsoft Windows and Macintosh systems. The GLIDE client software is compatible with Microsoft Excel 2000 or later on Windows systems, and with Microsoft Excel X or later on Macintosh systems. GLIDE follows the client-server paradigm, transferring encrypted and compressed data via standard Web protocols. Currently, the engineers use Excel as a front end to the GLIDE client, as many of their custom tools run in Excel.
A mobile information management system used in textile enterprises
NASA Astrophysics Data System (ADS)
Huang, C.-R.; Yu, W.-D.
2008-02-01
The mobile information management system (MIMS) for textile enterprises is based on Microsoft Visual Studio .NET 2003, Microsoft SQL Server 2000, the C++ language, and wireless application protocol (WAP) and wireless markup language (WML) technology. The portable MIMS is composed of a three-layer structure, i.e. a presentation (showing) layer, an operating layer, and a data access (visiting) layer, corresponding to the port-link module, the processing module, and the database module. By using the MIMS, not only do information exchanges become more convenient and easier, but compatibility between a large information capacity and a micro cell phone, as well as functional expansion in operation and design, can also be realized by means of built-in units. The developed MIMS is suitable for use in textile enterprises.
2001-09-01
replication) -- all from Visual Basic and VBA. In fact, we found that the SQL Server engine actually had a plethora of options, most formidable of...2002, the new SQL Server 2000 database engine, and Microsoft Visual Basic .NET. This thesis describes our use of the Spiral Development Model to...versions of Microsoft products? Specifically, the pending release of Microsoft Office 2002, the new SQL Server 2000 database engine, and Microsoft
Risk Assessment of the Naval Postgraduate School Gigabit Network
2004-09-01
Management Server (1) • RAS Server (1) • Remedy Server (1) • Samba Server (2) • SQL Servers (3) • Web Servers (3) • WINS Server (1) • Library...Server Bob Sharp INCA Windows 2000 Advanced Server NPGS LANDesk SQL 2000 Alan Pires eagle Microsoft Windows 2000 Advanced Server EWS NPGS LANDesk...Advanced Server Special Projects NPGS SQL Alan Pires MC01BDB Microsoft Windows 2000 Advanced Server Special Projects NPGS SQL 2000 Alan Pires
Research of GIS-services applicability for solution of spatial analysis tasks.
NASA Astrophysics Data System (ADS)
Terekhin, D. A.; Botygin, I. A.; Sherstneva, A. I.; Sherstnev, V. S.
2017-01-01
Experiments working out the areas of applicability of various GIS services in spatial analysis tasks are discussed in this paper. Google Maps, Yandex Maps and Microsoft SQL Server are used as spatial analysis services. All services showed a comparable speed of analyzing spatial data when carrying out elementary spatial requests (building up the buffer zone of a point object), as well as the advantages of Microsoft SQL Server in handling more complicated spatial requests. When building up elementary spatial requests, Internet services show higher efficiency due to client-side data handling with JavaScript subprograms. A weak point of public Internet services is the impossibility of handling data on the server side and the limited variety of spatial analysis functions. Microsoft SQL Server offers a large variety of functions needed for spatial analysis on the server side. The authors conclude that, when solving practical problems, the route-building and other capabilities of the Internet services should be combined with the spatial analysis capabilities of Microsoft SQL Server.
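The elementary spatial request mentioned above (building a buffer zone around a point object) can be expressed directly on the SQL Server side. The following is only a minimal sketch of that kind of query: the table name, column names, buffer radius, and connection string are hypothetical, not taken from the paper.

```python
# Minimal sketch: server-side buffer zone around a point in Microsoft SQL Server.
# Table/column names and the connection string are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=SpatialDemo;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Build a 500 m buffer around a point (lat 56.49, lon 84.95, SRID 4326) and
# return every site whose location falls inside that buffer zone.
query = """
DECLARE @center geography = geography::Point(56.49, 84.95, 4326);
DECLARE @zone geography = @center.STBuffer(500);  -- metres for the geography type
SELECT site_id, name
FROM sites
WHERE location.STIntersects(@zone) = 1;
"""
for site_id, name in cursor.execute(query):
    print(site_id, name)

conn.close()
```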
Microsoft Repository Version 2 and the Open Information Model.
ERIC Educational Resources Information Center
Bernstein, Philip A.; Bergstraesser, Thomas; Carlson, Jason; Pal, Shankar; Sanders, Paul; Shutt, David
1999-01-01
Describes the programming interface and implementation of the repository engine and the Open Information Model for Microsoft Repository, an object-oriented meta-data management facility that ships in Microsoft Visual Studio and Microsoft SQL Server. Discusses Microsoft's component object model, object manipulation, queries, and information…
A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows
NASA Astrophysics Data System (ADS)
Babin, B. L.; Hu, L.
2008-12-01
Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) environmental monitoring systems provide a unified coastal ocean observing system. These two systems are mirrored to maintain autonomy while offering an integrated data sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL Servers, and disseminate the data in real time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The utilization of Microsoft Windows technologies has presented many challenges to these observing systems as open source tools for interoperability grow. The current open source tools often require the installation of additional software. In order to make data available through common standard formats, "home grown" software has been developed. One example of this is software that generates XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook for implementing OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via Sensor Observation Services (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).
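The "home grown" export step described above amounts to serializing an observation record into an XML document for upload. A minimal sketch follows; the station identifier, element names, and file layout are illustrative placeholders, not the actual NDBC schema.

```python
# Minimal sketch: turning one observation record into an XML file for upload.
# Element names are illustrative placeholders, not the actual NDBC schema.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

observation = {
    "station_id": "LUMCON-01",   # placeholder station
    "water_temp_c": 24.7,
    "salinity_psu": 18.2,
}

root = ET.Element("observation")
ET.SubElement(root, "station").text = observation["station_id"]
ET.SubElement(root, "timestamp").text = datetime.now(timezone.utc).isoformat()
ET.SubElement(root, "waterTemperature", units="degC").text = str(observation["water_temp_c"])
ET.SubElement(root, "salinity", units="psu").text = str(observation["salinity_psu"])

# Write the file that would then be transmitted to the receiving data center.
ET.ElementTree(root).write("observation.xml", encoding="utf-8", xml_declaration=True)
```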
Classification of galaxy type from images using Microsoft R Server
NASA Astrophysics Data System (ADS)
de Vries, Andrie
2017-06-01
Many astronomers working in the field of AstroInformatics write code as part of their work. Although the programming language of choice is Python, a small number (8%) use R. R has specific strengths in the domain of statistics, and is often viewed as limited in the size of data it can handle. However, Microsoft R Server is a product that removes these limitations by being able to process much larger amounts of data. I present some highlights of R Server, illustrating how to fit a convolutional neural network using R. The specific task is to classify galaxies, using only images extracted from the Sloan Digital Sky Survey SkyServer.
P43-S Computational Biology Applications Suite for High-Performance Computing (BioHPC.net)
Pillardy, J.
2007-01-01
One of the challenges of high-performance computing (HPC) is user accessibility. At the Cornell University Computational Biology Service Unit, which is also a Microsoft HPC institute, we have developed a computational biology application suite that allows researchers from biological laboratories to submit their jobs to the parallel cluster through an easy-to-use Web interface. Through this system, we are providing users with popular bioinformatics tools including BLAST, HMMER, InterproScan, and MrBayes. The system is flexible and can be easily customized to include other software. It is also scalable; the installation on our servers currently processes approximately 8500 job submissions per year, many of them requiring massively parallel computations. It also has a built-in user management system, which can limit software and/or database access to specified users. TAIR, the major database of the plant model organism Arabidopsis, and SGN, the international tomato genome database, are both using our system for storage and data analysis. The system consists of a Web server running the interface (ASP.NET C#), Microsoft SQL server (ADO.NET), compute cluster running Microsoft Windows, ftp server, and file server. Users can interact with their jobs and data via a Web browser, ftp, or e-mail. The interface is accessible at http://cbsuapps.tc.cornell.edu/.
Data Driven Device Failure Prediction
2016-09-15
Microsoft enterprise authentication service and Apache web server in an effort to increase up-time and improve mission effectiveness. These new fault loads...predictor. Finally, the implementation is validated by running the same experiment on a web server. 1.1 Problem Statement According to the operational
An Array Library for Microsoft SQL Server with Astrophysical Applications
NASA Astrophysics Data System (ADS)
Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.
2012-09-01
Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques beyond what is used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out-of-the-box commercial database systems perform very well in scientific applications from the perspective of data storage optimization, data retrieval, and memory management, but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fixed-size arrays of all basic numeric data types can be created and manipulated efficiently. The library is also designed to integrate seamlessly with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transforms, can be done on the fly, from SQL code, inside the database server process. We are currently testing the prototype with two different scientific data sets: the Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory project will use it to store galaxy simulation data.
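The abstract does not document the Array Library's own SQL-side functions, so the sketch below only illustrates the general idea from the client side: a fixed-size numeric array serialized into a binary column of SQL Server and restored to its original shape on read-back. The table name, column names, and connection string are hypothetical.

```python
# Hedged sketch (client side only): storing a fixed-size float array as a
# varbinary(max) value in SQL Server and reading it back. Table/column names
# are hypothetical; the Array Library's SQL-side API is not reproduced here.
import numpy as np
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=Simulations;Trusted_Connection=yes;"
)
cursor = conn.cursor()

density = np.random.rand(64, 64, 64).astype(np.float32)   # one simulation cell
payload = density.tobytes()                                # row-major byte blob

cursor.execute(
    "INSERT INTO density_fields (snapshot_id, shape, data) VALUES (?, ?, ?)",
    42, "64x64x64", pyodbc.Binary(payload),
)
conn.commit()

# Read the blob back and restore the original array shape.
row = cursor.execute(
    "SELECT data FROM density_fields WHERE snapshot_id = ?", 42
).fetchone()
restored = np.frombuffer(row[0], dtype=np.float32).reshape(64, 64, 64)
print(restored.shape)
```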
Reactive Aggregate Model Protecting Against Real-Time Threats
2014-09-01
on the underlying functionality of three core components. • MS SQL Server 2008 backend database. • Microsoft IIS running on Windows Server 2008...services. The capstone tested a Linux-based Apache web server with the following software implementations: • MySQL as a Linux-based backend server for...malicious compromise. 1. Assumptions • GINA could connect to a backend MS SQL database through proper configuration of DotNetNuke. • GINA had access
NASA Astrophysics Data System (ADS)
Mann, Christopher; Narasimhamurthi, Natarajan
1998-08-01
This paper discusses a specific implementation of a Web- and component-based simulation system. The overall simulation container is implemented within a web page viewed with Microsoft's Internet Explorer 4.0 web browser. Microsoft's ActiveX/Distributed Component Object Model object interfaces are used in conjunction with the Microsoft DirectX graphics APIs to provide visualization functionality for the simulation. The MathWorks' Matlab computer-aided control system design program is used as an ActiveX automation server to provide the compute engine for the simulations.
First-year dental students' motivation and attitudes for choosing the dental profession.
Avramova, Nadya; Yaneva, Krassimira; Bonev, Boyko
2014-01-01
To determine first-year dental students' current motivation and attitudes for choosing the dental profession at the Faculty of Dental Medicine, Medical University - Sofia, Bulgaria. An anonymous questionnaire, consisting of 12 questions about students' socio-demographic profile and their motivation for choosing dentistry, was administered to 119 first-year dental students at the Faculty of Dental Medicine of the Medical University of Sofia. The study was conducted at the beginning of the 2012-2013 academic year. The data was processed and analyzed with the following software: Microsoft Windows Server 2008 R2; Microsoft SQL Server 2008; Internet Information Server 7.5.; Microsoft SharePoint Server 2010. The majority of the students (73%) were self-motivated for choosing dentistry as a career; 61% of them did not have relatives in the medical profession; 43% chose dental medicine because it is a prestigious, humane and noble profession; 50% - for financial security; 59% - because of the independence that it provides. There were no significant differences in the motivation between males and females. Independence, financial security and 'prestige' were the predominant motivating factors in this group of first-year dental students. Determining the reasons for choosing dentistry has important implications for the selection and training of students as well as for their future job satisfaction. Copyright © 2014 by Academy of Sciences and Arts of Bosnia and Herzegovina.
2010-01-01
interface, another providing the application logic (a program used to manipulate the data), and a server running Microsoft SQL Server or Oracle RDBMS...Oracle) • MySQL (Open Source) • Other What application server software will be needed? • Application Server • CGI PHP/Perl (Open Source...are used throughout DoD and serve a variety of functions. While DoD has a codified and institutionalized process for the development of a common set
A Tools-Based Approach to Teaching Data Mining Methods
ERIC Educational Resources Information Center
Jafar, Musa J.
2010-01-01
Data mining is an emerging field of study in Information Systems programs. Although the course content has been streamlined, the underlying technology is still in a state of flux. The purpose of this paper is to describe how we utilized Microsoft Excel's data mining add-ins as a front-end to Microsoft's Cloud Computing and SQL Server 2008 Business…
Assessment & Commitment Tracking System (ACTS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryant, Robert A.; Childs, Teresa A.; Miller, Michael A.
2004-12-20
The ACTS computer code provides a centralized tool for planning and scheduling assessments, tracking and managing actions associated with assessments or that result from an event or condition, and "mining" data for reporting and analyzing information for improving performance. The ACTS application is designed to work with the MS SQL database management system. All database interfaces are written in SQL. The following software is used to develop and support the ACTS application: ColdFusion, HTML, JavaScript, Quest TOAD, Microsoft Visual SourceSafe (VSS), an HTML mailer for sending e-mail, Microsoft SQL, and Microsoft Internet Information Server.
An object-oriented, technology-adaptive information model
NASA Technical Reports Server (NTRS)
Anyiwo, Joshua C.
1995-01-01
The primary objective was to develop a computer information system for effectively presenting NASA's technologies to American industries, for appropriate commercialization. To this end a comprehensive information management model, applicable to a wide variety of situations, and immune to computer software/hardware technological gyrations, was developed. The model consists of four main elements: a DATA_STORE, a data PRODUCER/UPDATER_CLIENT and a data PRESENTATION_CLIENT, anchored to a central object-oriented SERVER engine. This server engine facilitates exchanges among the other model elements and safeguards the integrity of the DATA_STORE element. It is designed to support new technologies, as they become available, such as Object Linking and Embedding (OLE), on-demand audio-video data streaming with compression (such as is required for video conferencing), Worldwide Web (WWW) and other information services and browsing, fax-back data requests, presentation of information on CD-ROM, and regular in-house database management, regardless of the data model in place. The four components of this information model interact through a system of intelligent message agents which are customized to specific information exchange needs. This model is at the leading edge of modern information management models. It is independent of technological changes and can be implemented in a variety of ways to meet the specific needs of any communications situation. This summer a partial implementation of the model has been achieved. The structure of the DATA_STORE has been fully specified and successfully tested using Microsoft's FoxPro 2.6 database management system. Data PRODUCER/UPDATER and PRESENTATION architectures have been developed and also successfully implemented in FoxPro; and work has started on a full implementation of the SERVER engine. The model has also been successfully applied to a CD-ROM presentation of NASA's technologies in support of Langley Research Center's TAG efforts.
Windows Terminal Servers Orchestration
NASA Astrophysics Data System (ADS)
Bukowiec, Sebastian; Gaspar, Ricardo; Smith, Tim
2017-10-01
Windows Terminal Servers provide application gateways for various parts of the CERN accelerator complex, used by hundreds of CERN users every day. The combination of new tools such as Puppet, HAProxy and the Microsoft System Center suite enables automation of provisioning workflows to provide a terminal server infrastructure that can scale up and down in an automated manner. The orchestration not only reduces the time and effort necessary to deploy new instances, but also facilitates operations such as patching, analysis and recreation of compromised nodes, as well as catering for workload peaks.
Dorizzi, R M; Maconi, M; Giavarina, D; Loza, G; Aman, M; Moreira, J; Bisoffi, Z; Gennuso, C
2009-10-01
The adoption of Evidence Based Laboratory Medicine (EBLM) has been hampered until today by the lack of effective tools. The SIMeL EBLM e-Thesaurus (an on-line repertoire of the diagnostic effectiveness of laboratory, radiology and cardiology tests) provides useful support to clinical laboratory professionals and to clinicians for the interpretation of diagnostic tests. The e-Thesaurus is an application developed using Microsoft Active Server Pages technology, served by the Microsoft Internet Information Server Web server, and is available at the SIMeL website using a browser running JavaScript scripts (Internet Explorer is recommended). It contains a database (in Italian, English and Spanish) of the sensitivity and specificity (including the 95% confidence interval), the positive and negative likelihood ratios, the Diagnostic Odds Ratio and the Number Needed to Diagnose of more than 2000 diagnostic tests (mostly laboratory, but also cardiology and radiology). The e-Thesaurus improves on the previous SIMeL paper and CD Thesaurus; its main features are a three-language search and a continuous, easy updating capability.
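For reference, the derived indices stored in the e-Thesaurus are related to sensitivity and specificity by the standard textbook definitions (these are the usual formulas, not reproduced from the paper itself):

```latex
\begin{align*}
LR^{+} &= \frac{\text{sensitivity}}{1 - \text{specificity}}, &
LR^{-} &= \frac{1 - \text{sensitivity}}{\text{specificity}},\\[4pt]
DOR &= \frac{LR^{+}}{LR^{-}}, &
NND &= \frac{1}{\text{sensitivity} - (1 - \text{specificity})}.
\end{align*}
```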
GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application
NASA Technical Reports Server (NTRS)
McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.
2010-01-01
The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real-time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well-defined, HyperText Transfer Protocol (HTTP) based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.
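The abstract describes a well-defined HTTP-based API through which clients publish and retrieve parameter data, but does not publish the API itself. The following is therefore only a hedged sketch of that style of exchange; the host, endpoint paths, session identifier, and JSON payload are hypothetical.

```python
# Hedged sketch of an HTTP-based parameter exchange in the style described above.
# The endpoint, session identifier, and payload are hypothetical; the real GLIDE
# API is not reproduced here.
import requests

BASE_URL = "https://glide.example.nasa.gov/api"   # placeholder host
SESSION_ID = "lander-study-07"                    # placeholder design session

# A discipline engineer publishes an updated parameter set...
payload = {"discipline": "propulsion",
           "parameters": {"isp_s": 452.0, "thrust_N": 24500.0}}
resp = requests.post(f"{BASE_URL}/sessions/{SESSION_ID}/parameters",
                     json=payload, timeout=30)
resp.raise_for_status()

# ...and another discipline pulls the latest values for its own iteration.
latest = requests.get(f"{BASE_URL}/sessions/{SESSION_ID}/parameters",
                      params={"discipline": "propulsion"}, timeout=30).json()
print(latest)
```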
Karst database development in Minnesota: Design and data assembly
Gao, Y.; Alexander, E.C.; Tipping, R.G.
2005-01-01
The Karst Feature Database (KFD) of Minnesota is a relational GIS-based Database Management System (DBMS). Previous karst feature datasets used inconsistent attributes to describe karst features in different areas of Minnesota. Existing metadata were modified and standardized to represent comprehensive metadata for all the karst features in Minnesota. Microsoft Access 2000 and ArcView 3.2 were used to develop this working database. Existing county and sub-county karst feature datasets have been assembled into the KFD, which is capable of visualizing and analyzing the entire data set. By November 17, 2002, 11,682 karst features were stored in the KFD of Minnesota. Data tables are stored in a Microsoft Access 2000 DBMS and linked to corresponding ArcView applications. The current KFD of Minnesota has been moved from a Windows NT server to a Windows 2000 Citrix server accessible to researchers and planners through networked interfaces. © Springer-Verlag 2005.
Electronic Mail (E-Mail) Management and Use
1999-03-01
age or date of birth; present or future assignments for overseas, or for routinely deployable or sensitive units; and office...Server Naming Convention. The DMS-AF primary (e.g., ESL® Primary, Lotus® Hub, and Microsoft® Bridgehead) server and its backup must conform to the 8...determine if there is some benefit to establishing standard lower-level folder names which bases should adhere to. Standard public folders offer a
Implementation of an Enterprise Information Portal (EIP) in the Loyola University Health System
Price, Ronald N.; Hernandez, Kim
2001-01-01
Loyola University Chicago Stritch School of Medicine and Loyola University Medical Center have long histories in the development of applications to support the institutions' missions of education, research and clinical care. In late 1998, the institutions' application development group undertook an ambitious program to re-architect more than 10 years of legacy application development (30+ core applications) into a unified World Wide Web (WWW) environment. The primary project objectives were to construct an environment that would support the rapid development of n-tier, web-based applications while providing standard methods for user authentication/validation, security/access control and definition of a user's organizational context. The project's efforts resulted in Loyola's Enterprise Information Portal (EIP), which meets the aforementioned objectives. This environment: 1) allows access to other vertical Intranet portals (e.g., electronic medical record, patient satisfaction information and faculty effort); 2) supports end-user desktop customization; and 3) provides a means for standardized application “look and feel.” The portal was constructed utilizing readily available hardware and software. Server hardware consists of multiprocessor (Intel Pentium 500 MHz) Compaq 6500 servers with one gigabyte of random access memory and 75 gigabytes of hard disk storage. Microsoft SQL Server was selected to house the portal's internal security data structures. Netscape Enterprise Server was selected for the web server component of the environment, and Allaire's ColdFusion was chosen for the access and application tiers. Total costs for the portal environment were less than $40,000. User data storage is accomplished through two Microsoft SQL Servers and an existing Sun Microsystems enterprise server with eight processors and 750 gigabytes of disk storage running the Sybase relational database manager. Total storage capacity for all systems exceeds one terabyte. In the past 12 months, the EIP has supported the development of more than 88 applications and is utilized by more than 2,200 users.
Mobile Monitoring Stations and Web Visualization of Biotelemetric System - Guardian II
NASA Astrophysics Data System (ADS)
Krejcar, Ondrej; Janckulik, Dalibor; Motalova, Leona; Kufel, Jan
The main area of interest of our project is to provide a solution which can be used in different areas of health care and which will be available through PDAs (Personal Digital Assistants), web browsers or desktop clients. The realized system deals with an ECG sensor connected to mobile equipment, such as a PDA/Embedded device, based on the Microsoft Windows Mobile operating system. The whole system is based on the architecture of the .NET Compact Framework and Microsoft SQL Server. Visualization possibilities of the web interface and ECG data are also discussed, and a final suggestion is made for a Microsoft Silverlight solution along with current screenshots of the implemented solution. The project was successfully tested in a real environment in a cryogenic room (-136 °C).
NASA Astrophysics Data System (ADS)
Yang, Keon Ho; Jung, Haijo; Kang, Won-Suk; Jang, Bong Mun; Kim, Joong Il; Han, Dong Hoon; Yoo, Sun-Kook; Yoo, Hyung-Sik; Kim, Hee-Joung
2006-03-01
The wireless mobile service with a high bit rate using CDMA-1X EVDO is now widely used in Korea. Mobile devices are also increasingly being used as a conventional communication mechanism. We have developed a web-based mobile system that communicates patient information and images, using CDMA-1X EVDO for emergency diagnosis. It is composed of a mobile web application system using the Microsoft Windows 2003 server and Internet Information Services. Also, a mobile web PACS used for a database managing patient information and images was developed using Microsoft Access 2003. A wireless mobile emergency patient information and imaging communication system was developed using Microsoft Visual Studio .NET, and a JPEG 2000 ActiveX control for the PDA phone was developed using Microsoft Embedded Visual C++. Also, CDMA-1X EVDO is used for connections between the mobile web servers and the PDA phone. This system allows fast access to the patient information database, storing both medical images and patient information anytime and anywhere. In particular, images were compressed into the JPEG2000 format and transmitted from a mobile web PACS inside the hospital to the radiologist using a PDA phone located outside the hospital. Also, this system shows radiological images as well as physiological signal data, including blood pressure, vital signs and so on, in the web browser of the PDA phone so radiologists can diagnose more effectively. Good results were obtained using an RW-6100 PDA phone in the university hospital system of the Sinchon Severance Hospital in Korea.
Boulos, Maged N Kamel; Honda, Kiyoshi
2006-01-01
Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
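Map layers of the kind described above are retrieved from a MapServer instance through a standard OGC WMS GetMap request. The sketch below uses standard WMS 1.1.1 parameters, but the host, map file path, layer name, and bounding box are hypothetical.

```python
# Minimal sketch of an OGC WMS 1.1.1 GetMap request against a UMN MapServer
# instance. The host, map file path, and layer name are hypothetical.
import requests

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "health_facilities",          # placeholder layer
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "95.0,-10.0,105.0,10.0",        # lon/lat bounding box
    "WIDTH": 800,
    "HEIGHT": 600,
    "FORMAT": "image/png",
}
resp = requests.get("https://gis.example.org/cgi-bin/mapserv?map=/maps/health.map",
                    params=params, timeout=60)
resp.raise_for_status()

# Save the rendered map tile; remote WMS layers are fetched the same way and
# overlaid on the client side.
with open("health_map.png", "wb") as f:
    f.write(resp.content)
```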
Electronic Attack Platform Placement Optimization
2014-09-01
Processing in VBA; 2. Client-Server Using Two Different Excel Applications...; Figure 3. Screenshot of the VBA IDE contained within all Microsoft Office products...; application using MS Excel's Application.OnTime method; Figure 20. WINSOCK API functions needed to use TCP via VBA
[Automated anesthesia record system].
Zhu, Tao; Liu, Jin
2005-12-01
Based on a client/server architecture, an automated anesthesia record system running under the Windows operating system on a network has been developed and programmed with Microsoft Visual C++ 6.0, Visual Basic 6.0 and SQL Server. The system can deal with the patient's information throughout the anesthesia. It can collect and integrate data from several kinds of medical equipment, such as monitors, infusion pumps and anesthesia machines, automatically and in real time. After that, the system generates the anesthesia sheets automatically. The record system makes the anesthesia record more accurate and complete and can improve the anesthesiologist's working efficiency.
Social Impacts Module (SIM) Transition
2012-09-28
user (String): The authorized user's name to access the PAVE database. Applies only to Microsoft SQL Server; leave blank otherwise. passwd (String): The password if an authorized user's name is required; otherwise, leave blank. driver (String): The class name for the driver to
ERIC Educational Resources Information Center
Fredette, Michelle
2012-01-01
"Rent or buy?" is a question people ask about everything from housing to textbooks. It is also a question universities must consider when it comes to high-performance computing (HPC). With the advent of Amazon's Elastic Compute Cloud (EC2), Microsoft Windows HPC Server, Rackspace's OpenStack, and other cloud-based services, researchers now have…
Aviation Environmental Design Tool (AEDT) : Version 2c service Pack 1 : installation guide.
DOT National Transportation Integrated Search
2016-12-01
This document provides detailed instructions on how to install and run AEDT 2c Service Pack 1 (SP1). It is important to follow the installation instructions in the order listed below, as Microsoft SQL Server 2008 R2 is a prerequisite for AEDT. Instal...
Investigating Uses and Perceptions of an Online Collaborative Workspace for the Dissertation Process
ERIC Educational Resources Information Center
Rockinson-Szapkiw, Amanda J.
2012-01-01
The intent of this study was to investigate 93 doctoral candidates' perceptions and use of an online collaboration workspace and content management server, Microsoft Office SharePoint, for dissertation process. All candidates were enrolled in an Ed.D. programme in the United States. Descriptive statistics demonstrate that candidates frequently use…
Multimedia data repository for the World Wide Web
NASA Astrophysics Data System (ADS)
Chen, Ken; Lu, Dajin; Xu, Duanyi
1998-08-01
This paper introduces the design and implementation of a Multimedia Data Repository serving as a multimedia information system, which provides users a Web-accessible, platform-independent interface to query, browse, and retrieve multimedia data such as images, graphics, audio, and video from a large multimedia data repository. By integrating the multimedia DBMS, in which the textual information and samples of the multimedia data are organized and stored, and the Web server together into the Microsoft ActiveX Server Framework, users can access the DBMS and query the information by simply using a Web browser at the client side. The original multimedia data can then be located and transmitted through the Internet from the tertiary storage device, a 400-CD-ROM optical jukebox at the server side, to the client side for further use.
Server-Controlled Identity-Based Authenticated Key Exchange
NASA Astrophysics Data System (ADS)
Guo, Hua; Mu, Yi; Zhang, Xiyong; Li, Zhoujun
We present a threshold identity-based authenticated key exchange protocol that can be applied to an authenticated server-controlled gateway-user key exchange. The objective is to allow a user and a gateway to establish a shared session key with the permission of the back-end servers, while the back-end servers cannot obtain any information about the established session key. Our protocol has potential applications in strong access control of confidential resources. In particular, our protocol possesses the semantic security and demonstrates several highly-desirable security properties such as key privacy and transparency. We prove the security of the protocol based on the Bilinear Diffie-Hellman assumption in the random oracle model.
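For readers unfamiliar with the hardness assumption referenced above, the Bilinear Diffie-Hellman (BDH) problem can be stated in its standard textbook form (this formulation is not quoted from the paper):

```latex
% Standard Bilinear Diffie-Hellman (BDH) assumption.
% e : G_1 \times G_1 \to G_T is an admissible bilinear pairing, P generates G_1.
\text{Given } \bigl(P,\ aP,\ bP,\ cP\bigr) \text{ with } a, b, c \xleftarrow{\$} \mathbb{Z}_q^{*},
\quad \text{it is computationally infeasible to compute } e(P, P)^{abc} \in \mathbb{G}_T .
```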
A mobile field-work data collection system for the wireless era of health surveillance.
Forsell, Marianne; Sjögren, Petteri; Renard, Matthew; Johansson, Olle
2011-03-01
In many countries or regions the capacity of health care resources is below the needs of the population, and new approaches for health surveillance are needed. Innovative projects, utilizing wireless communication technology, contribute to reliable methods for field-work data collection and reporting to databases. The objective was to describe a new version of a wireless IT-support system for field-work data collection and administration. The system requirements were drawn from the design objective and translated to system functions. The system architecture was based on fieldwork experiences and administrative requirements. The Smartphone devices were HTC Touch Diamond2s, while the system was based on a platform with Microsoft .NET components and an SQL Server 2005 database running on the Microsoft Windows Server 2003 operating system. The user interfaces were based on .NET programming and the Microsoft Windows Mobile operating system. A synchronization module enabled download of field data to the database via a General Packet Radio Service (GPRS) to Local Area Network (LAN) interface. The field-workers considered the here-described applications user-friendly and almost self-instructing. The office administrators considered that the back-office interface facilitated retrieval of health reports and invoice distribution. The current IT-support system facilitates short lead times from fieldwork data registration to analysis, and is suitable for various applications. The advantages of wireless technology and paper-free data administration need to be increasingly emphasized in development programs, in order to facilitate reliable and transparent use of limited resources.
Lange, Kristian; Kühn, Simone; Filevich, Elisa
2015-01-01
We present here “Just Another Tool for Online Studies” (JATOS): an open source, cross-platform web application with a graphical user interface (GUI) that greatly simplifies setting up and communicating with a web server to host online studies that are written in JavaScript. JATOS is easy to install in all three major platforms (Microsoft Windows, Mac OS X, and Linux), and seamlessly pairs with a database for secure data storage. It can be installed on a server or locally, allowing researchers to try the application and feasibility of their studies within a browser environment, before engaging in setting up a server. All communication with the JATOS server takes place via a GUI (with no need to use a command line interface), making JATOS an especially accessible tool for researchers without a strong IT background. We describe JATOS’ main features and implementation and provide a detailed tutorial along with example studies to help interested researchers to set up their online studies. JATOS can be found under the Internet address: www.jatos.org. PMID:26114751
2001-09-01
of MEIMS was programmed in Microsoft Access 97 using Visual Basic for Applications (VBA). This prototype had very little documentation. The FAA...using Access 2000 as an interface and SQL Server as the database engine. Question 1: Did you have any problems accessing the program? Y / N
South Carolina's SC LENDS: Optimizing Libraries, Transforming Lending
ERIC Educational Resources Information Center
Hamby, Rogan; McBride, Ray; Lundberg, Maria
2011-01-01
Since SC LENDS started operating in June 2009, more public libraries have come on board. All of this on the back end connects to a Mozilla-based staff client that has distributions for Mac OS X and Microsoft Windows, using SSL encryption to keep communications secure and private between remote libraries and the servers hosted at a high-end…
Disaster recovery plan for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, D.E.
The BMS production implementation will be complete by October 1, 1998 and the server environment will be comprised of two types of platforms. The PassPort Supply and the PeopleSoft Financials will reside on UNIX servers and the PeopleSoft Human Resources and Payroll will reside on Microsoft NT servers. Because of the wide scope and the requirements of the COTS products to run in various environments, backup and recovery responsibilities are divided between two groups in Technical Operations. The Central Computer Systems Management group provides support for the UNIX/NT Backup Data Center, and the Network Infrastructure Systems group provides support for the NT Application Server Backup outside the Data Center. The disaster recovery process is dependent on a good backup and recovery process. Information and integrated system data for determining the disaster recovery process is identified from the Fluor Daniel Hanford (FDH) Risk Assessment Plan, Contingency Plan, Backup and Recovery Plan, and Backup Form for HANDI 2000 BMS.
The personal receiving document management and the realization of email function in OAS
NASA Astrophysics Data System (ADS)
Li, Biqing; Li, Zhao
2017-05-01
This software is an independent software system developed using the currently popular B/S (browser/server) structure and ASP.NET technology, with the Windows 7 operating system, Microsoft Visual Studio 2008 and a Microsoft SQL Server 2005 database as the development platform. It is suitable for small and medium enterprises, contains personal office, scientific research project management and system management functions, runs independently in the relevant environment, and solves practical needs.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-24
... electronic servers in close physical proximity to the Exchange's trading and execution system. See id. at 59299. Partial Cabinets A User is able to request a physical cabinet to house its servers and other... Exchange enter the Exchange's trading and execution systems through the same order gateway, regardless of...
Implementation of Medical Information Exchange System Based on EHR Standard
Han, Soon Hwa; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong
2010-01-01
Objectives To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. Methods To assure that patient's medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, contents, and network security. Results The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. Conclusions This research chooses transfer items and defines document standards that follow CDA standards, such that exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information. PMID:21818447
Implementation of Medical Information Exchange System Based on EHR Standard.
Han, Soon Hwa; Lee, Min Ho; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong
2010-12-01
To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. To assure that patient's medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, contents, and network security. The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. This research chooses transfer items and defines document standards that follow CDA standards, such that exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information.
Eccher, C; Berloffa, F; Demichelis, F; Larcher, B; Galvagni, M; Sboner, A; Graiff, A; Forti, S
1999-01-01
Introduction This study describes a tele-consultation system (TCS) developed to provide a computing environment over a Wide Area Network (WAN) in North Italy (Province of Trento) that can be used by two or more physicians to share medical data and to work co-operatively on medical records. A pilot study has been carried out in oncology to assess the effectiveness of the system. The aim of this project is to facilitate the management of oncology patients by improving communication among the specialists of central and district hospitals. Methods and Results The TCS is an Intranet-based solution. The Intranet is based on a PC WAN with Windows NT Server, Microsoft SQL Server, and Internet Information Server. TCS is composed of native and custom applications developed in the Microsoft Windows (9x and NT) environment. The basic component of the system is the multimedia digital medical record, structured as a collection of HTML and ASP pages. A distributed relational database will allow users to store and retrieve medical records, accessed by a dedicated Web browser via the Web server. The medical data to be stored and the presentation architecture of the clinical record were determined in close collaboration with the clinicians involved in the project. TCS will allow a multi-point tele-consultation (TC) among two or more participants on remote computers, providing synchronized surfing through the clinical report. A set of collaborative and personal tools (whiteboard with drawing tools, point-to-point digital audio-conference, chat, local notepad, e-mail service) is integrated in the system to provide a user-friendly environment. TCS has been developed as a client-server architecture. The client part of the system is based on the Microsoft Web Browser control and provides the user interface and the tools described above. The server part, running all the time on a dedicated computer, accepts connection requests and manages the connections among the participants in a TC, allowing multiple TCs to run simultaneously. TCS has been developed in the Visual C++ environment using the MFC library and COM technology; ActiveX controls have been written in Visual Basic to perform dedicated tasks from inside the HTML clinical report. Before deploying the system in the hospital departments involved in the project, TCS was tested in our laboratory by clinicians involved in the project to evaluate the usability of the system. Discussion TCS has the potential to support a "multi-disciplinary distributed virtual oncological meeting". The specialists of different departments and of different hospitals can attend "virtual meetings" and interactively discuss medical data. An expected benefit of the "virtual meeting" should be the possibility of providing remote expert advice from oncologists to peripheral cancer units in formulating treatment plans, conducting follow-up sessions and supporting clinical research.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-20
... Change To Eliminate the 100MB Connectivity Option and Fee March 14, 2012. Pursuant to Section 19(b)(1) of... Exchange proposes to eliminate 100MB connectivity between the Exchange and co-located servers, as well as..., Section X(b) to eliminate 100MB connectivity between the Exchange and co-located servers, as well as...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-24
... services allow Users to rent space in the data center so they may locate their electronic servers in close... User is able to request a physical cabinet to house its servers and other equipment in the data center... Exchange enter the Exchange's trading and execution systems through the same order gateway, regardless of...
Evaluation of Sub Query Performance in SQL Server
NASA Astrophysics Data System (ADS)
Oktavia, Tanty; Sujarwo, Surya
2014-03-01
The paper explores several sub query methods used in a query and their impact on query performance. The study uses an experimental approach to evaluate the performance of each sub query method combined with an indexing strategy. The sub query methods consist of in, exists, relational operator, and relational operator combined with the top operator. The experiments show that using a relational operator combined with an indexing strategy in a sub query gives greater performance than using the same method without an indexing strategy, and also than the other methods. In summary, for applications that emphasize the performance of retrieving data from a database, it is better to use a relational operator combined with an indexing strategy. This study was done on Microsoft SQL Server 2012.
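The sub query variants compared in the study can be illustrated side by side. The abstract does not give the paper's actual test schema, so the Customers/Orders tables, index name, and connection string below are hypothetical.

```python
# Hedged sketch of the sub query variants compared in the study, run against a
# hypothetical Customers/Orders schema in SQL Server 2012 or later.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=SalesDemo;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# The "indexing strategy" combined with each variant: a supporting index on the
# column referenced by the sub query.
cursor.execute("CREATE INDEX IX_Orders_Customer ON Orders(customer_id);")
conn.commit()

queries = {
    # IN sub query
    "in": """SELECT c.customer_id, c.name
             FROM Customers AS c
             WHERE c.customer_id IN (SELECT o.customer_id FROM Orders AS o);""",
    # EXISTS sub query (correlated)
    "exists": """SELECT c.customer_id, c.name
                 FROM Customers AS c
                 WHERE EXISTS (SELECT 1 FROM Orders AS o
                               WHERE o.customer_id = c.customer_id);""",
    # relational operator combined with the TOP operator
    "top": """SELECT c.customer_id, c.name
              FROM Customers AS c
              WHERE c.customer_id = (SELECT TOP 1 o.customer_id FROM Orders AS o
                                     WHERE o.customer_id = c.customer_id);""",
}

for name, sql in queries.items():
    rows = cursor.execute(sql).fetchall()
    print(name, len(rows))
```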
[Development of expert diagnostic system for common respiratory diseases].
Xu, Wei-hua; Chen, You-ling; Yan, Zheng
2014-03-01
To develop an internet-based expert diagnostic system for common respiratory diseases. A SaaS architecture was used to build the system; a forward-reasoning pattern was applied for the inference engine design; and ASP.NET with C# from the Microsoft Visual Studio 2005 tool pack was used for the Web-based medical expert system. The database of the system was constructed with Microsoft SQL Server 2005. The developed expert system provides large data storage and efficient data access and analysis functions for the diagnosis of various diseases. Users are able to use this system to obtain diagnoses for common respiratory diseases via the internet. The developed expert system may be used for internet-based diagnosis of various respiratory diseases, particularly in a telemedicine setting.
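Forward reasoning of the kind used in the inference engine amounts to repeatedly firing any rule whose conditions are all present in the current fact set until no new conclusion can be derived. The rules and findings below are illustrative placeholders, not the system's actual knowledge base.

```python
# Minimal forward-chaining sketch. Rules and findings are illustrative
# placeholders, not the actual knowledge base of the described system.
rules = [
    ({"cough", "fever", "purulent sputum"}, "suspected pneumonia"),
    ({"suspected pneumonia", "consolidation on chest X-ray"}, "pneumonia"),
    ({"wheezing", "reversible airflow obstruction"}, "asthma"),
]

def forward_chain(findings):
    """Apply every rule whose conditions are met until no new fact is derived."""
    facts = set(findings)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"cough", "fever", "purulent sputum",
                         "consolidation on chest X-ray"})
print(derived)   # includes "suspected pneumonia" and "pneumonia"
```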
Wu, Chien Hua; Chiu, Ruey Kei; Yeh, Hong Mo; Wang, Da Wei
2017-11-01
In 2011, the Ministry of Health and Welfare of Taiwan established the National Electronic Medical Record Exchange Center (EEC) to permit the sharing of medical resources among hospitals. This system can presently exchange electronic medical records (EMRs) among hospitals, in the form of medical imaging reports, laboratory test reports, discharge summaries, outpatient records, and outpatient medication records. Hospitals can send or retrieve EMRs over the virtual private network by connecting to the EEC through a gateway. International standards should be adopted in the EEC to allow users with those standards to take advantage of this exchange service. In this study, a cloud-based EMR-exchange prototyping system was implemented on the basis of the Integrating the Healthcare Enterprise's Cross-Enterprise Document Sharing integration profile and the existing EMR exchange system. RESTful services were used to implement the proposed prototyping system on the Microsoft Azure cloud-computing platform. Four scenarios were created in Microsoft Azure to determine the feasibility and effectiveness of the proposed system. The experimental results demonstrated that the proposed system successfully completed EMR exchange under the four scenarios created in Microsoft Azure. Additional experiments were conducted to compare the efficiency of the EMR-exchanging mechanisms of the proposed system with those of the existing EEC system. The experimental results suggest that the proposed RESTful service approach is superior to the Simple Object Access Protocol method currently implemented in the EEC system, according to their respective response times under the four experimental scenarios. Copyright © 2017 Elsevier B.V. All rights reserved.
2017-02-01
Image Processing Web Server Administration ...........................17 Fig. 18 Microsoft ASP.NET MVC 4 installation...algorithms are made into client applications that can be accessed from an image processing web service2 developed following Representational State...Transfer (REST) standards by a mobile app, laptop PC, and other devices. Similarly, weather tweets can be accessed via the Weather Digest Web Service
Mobile Assisted Security in Wireless Sensor Networks
2015-08-03
server from Google’s DNS, Chromecast and the content server does the 3-way TCP Handshake which is followed by Client Hello and Server Hello TLS messages...utilized TLS v1.2, except NTP servers and google’s DNS server. In the TLS v1.2, after handshake, client and server sends Client Hello and Server Hello ...Messages in order. In Client Hello messages, client offers a list of Cipher Suites that it supports. Each Cipher Suite defines the key exchange algorithm
Schedule-Aware Workflow Management Systems
NASA Astrophysics Data System (ADS)
Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.
Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.
A generic minimization random allocation and blinding system on web.
Cai, Hongwei; Xia, Jielai; Xu, Dezhong; Gao, Donghuai; Yan, Yongping
2006-12-01
Minimization is a dynamic randomization method for clinical trials. Although recommended by many researchers, the utilization of minimization has seldom been reported in randomized trials, mainly because of the controversy surrounding the validity of conventional analyses and its complexity in implementation. However, both the statistical and clinical validity of minimization were demonstrated in recent studies. A minimization random allocation system integrated with a blinding function that could facilitate the implementation of this method in general clinical trials has not been reported. SYSTEM OVERVIEW: The system is a web-based random allocation system using the Pocock and Simon minimization method. It also supports multiple treatment arms within a trial, multiple simultaneous trials, and blinding without further programming. This system was constructed with a generic database schema design method, the Pocock and Simon minimization method and a blinding method. It was coded with the Microsoft Visual Basic and Active Server Pages (ASP) programming languages, and all datasets were managed with a Microsoft SQL Server database. Some critical programming code is also provided. SIMULATIONS AND RESULTS: Two clinical trials were simulated simultaneously to test the system's applicability. Not only balanced groups but also blinded allocation results were achieved in both trials. Practical considerations for the minimization method, and the benefits, general applicability and drawbacks of the technique implemented in this system, are discussed. Promising features of the proposed system are also summarized.
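The core of Pocock and Simon's method is to compute, for each candidate arm, the imbalance that would result across the stratification factors if the new subject were assigned there, and then to favour (with high probability) the arm giving the smallest total imbalance. The sketch below is a simplified illustration only, assuming two arms, equal factor weights, the range as the imbalance measure, and a fixed 0.8 biased-coin probability; it is not the system's actual implementation.

```python
# Simplified Pocock-Simon minimization sketch: equal factor weights, range as the
# imbalance measure, and a fixed 0.8 probability of choosing the best-balanced arm.
import random
from collections import defaultdict

ARMS = ["A", "B"]
FACTORS = ["sex", "age_group", "centre"]

# counts[factor][level][arm] -> number of subjects already allocated
counts = defaultdict(lambda: defaultdict(lambda: dict.fromkeys(ARMS, 0)))

def allocate(subject):
    """Return the arm for a subject described by its factor levels."""
    imbalance = {}
    for arm in ARMS:
        total = 0
        for factor in FACTORS:
            level_counts = dict(counts[factor][subject[factor]])
            level_counts[arm] += 1              # pretend we assign to this arm
            total += max(level_counts.values()) - min(level_counts.values())
        imbalance[arm] = total
    best = min(imbalance, key=imbalance.get)
    # biased coin: pick the best-balanced arm with probability 0.8
    arm = best if random.random() < 0.8 else random.choice(
        [a for a in ARMS if a != best])
    for factor in FACTORS:                      # record the real assignment
        counts[factor][subject[factor]][arm] += 1
    return arm

print(allocate({"sex": "F", "age_group": "18-40", "centre": "site-1"}))
```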
Forsell, M; Häggström, M; Johansson, O; Sjögren, P
2008-11-08
To develop a personal digital assistant (PDA) application for oral health assessment fieldwork, including back-office and database systems (MobilDent). System design, construction and implementation of PDA, back-office and database systems. System requirements for MobilDent were collected, analysed and translated into system functions. User interfaces were implemented and the system architecture was outlined. MobilDent was based on a platform with .NET (Microsoft) components, using an SQL Server 2005 (Microsoft) database for data storage and the Windows Mobile (Microsoft) operating system. The PDA devices were Dell Axims. System functions and user interfaces were specified for MobilDent. User interfaces for the PDA, back-office and database systems were based on .NET programming. The PDA user interface was based on Windows, suitable to a PDA display, whereas the back-office interface was designed for a normal-sized computer screen. A synchronisation module (MS ActiveSync, Microsoft) was used to enable download of field data from the PDA to the database. MobilDent is a feasible application for oral health assessment fieldwork, and the oral health assessment database may prove a valuable source for care planning, educational and research purposes. Further development of the MobilDent system will include wireless connectivity with download-on-demand technology.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-15
... premises controlled by the Exchange in order that they may locate their electronic servers in close... the Exchange's trading and execution systems through the same order gateway regardless of whether the... weekends if NOT scheduled at least 1 day in advance. Rack and Stack Installation of one $200 per server...
Exploring the Cost and Functionality of MEDCOM Web Services
2005-10-24
Software Name 24. What backend database software supports your intranet/Internet content? (check all that apply)-. o Oracle o Microsoft SQL Server E0...Department of Defense (DoD) service branches, which funded and deployed an Internet portal, TRICARE Online, to serve as an information conduit between the...public website, the information contained on the intranet is traditionally limited to the members of the hosting command. The local information serves as
An Automated Solution to the Multiuser Carved Data Ascription Problem
2010-12-01
computer might have several authorized users. It is also common in many families, as well as in libraries, hospitals, and Internet cafes . Another way for...starting disk sector number were used in preference to features such as the Microsoft Office em- bedded “Creator” attribute. We believe that this is...with exemplars in a reference collection. 5) Validation Server: Although the technique presented in this paper is effective, it is time consuming to
CIS3/398: Implementation of a Web-Based Electronic Patient Record for Transplant Recipients
Fritsche, L; Lindemann, G; Schroeter, K; Schlaefer, A; Neumayer, H-H
1999-01-01
Introduction While the "Electronic patient record" (EPR) is a frequently quoted term in many areas of healthcare, only a few working EPR systems are available so far. To justify their use, EPRs must be able to store and display all kinds of medical information in a reliable, secure, time-saving, user-friendly way at an affordable price. Fields with patients who are attended to by a large number of medical specialists over a prolonged period of time are best suited to demonstrate the potential benefits of an EPR. The aim of our project was to investigate the feasibility of an EPR based solely on "off-the-shelf" software and Internet technology in the field of organ transplantation. Methods The EPR system consists of three main elements: data-storage facilities, a Web server and a user interface. Data are stored either in a relational database (Sybase Adaptive 11.5, Sybase Inc., CA) or, in the case of pictures (JPEG) and files in application formats (e.g. Word documents), on a Windows NT 4.0 Server (Microsoft Corp., WA). The entire communication of all data is handled by a Web server (IIS 4.0, Microsoft) with an Active Server Pages extension. The database is accessed by ActiveX Data Objects via the ODBC interface. The only software required on the user's computer is Internet Explorer 4.01 (Microsoft); during the first use of the EPR, the ActiveX HTML Layout Control is automatically added. The user can access the EPR via Local or Wide Area Network or by dial-up connection. If the EPR is accessed from outside the firewall, all communication is encrypted (SSL 3.0, Netscape Comm. Corp., CA). The speed of the EPR system was tested with 50 repeated measurements of the duration of two key functions: 1) display of all lab results for a given day and patient, and 2) automatic composition of a letter containing diagnoses, medication, notes and lab results. For the test, a 233 MHz Pentium II processor with a 10 Mbit/s Ethernet connection (ping time below 10 ms) over 2 hubs to the server (400 MHz Pentium II, 256 MB RAM) was used. Results So far the EPR system has been running for eight consecutive months and contains complete records of 673 transplant recipients with an average follow-up of 9.9 (SD: 4.9) years and a total of 1.1 million lab values. Instruction to enable new users to perform basic operations took less than two hours in all cases. The average duration of laboratory access was 0.9 (SD: 0.5) seconds; the automatic composition of a letter took 6.1 (SD: 2.4) seconds. Apart from the database and Windows NT, all other components are available for free. The development of the EPR system required less than two person-years. Conclusion Implementation of an electronic patient record that meets the requirements of comprehensiveness, reliability, security, speed, user-friendliness and affordability using a combination of "off-the-shelf" software products can be feasible if the current state-of-the-art Internet technology is applied.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-14
... Market Maker Standard quote server as a gateway for communicating eQuotes to MIAX. Because of the... connect the Limited Service Ports to independent servers that host their eQuote and purge functionality... same server for all of their Market Maker quoting activity. Currently, Market Makers in the MIAX System...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
... rather than forcing them to use their Market Maker Standard quote server as a gateway for communicating e... technical flexibility to connect additional Limited Service Ports to independent servers that host their e... mitigate the risk of using the same server for all of their Market Maker quoting activity. By using the...
Automatic management system for dose parameters in interventional radiology and cardiology.
Ten, J I; Fernandez, J M; Vaño, E
2011-09-01
The purpose of this work was to develop an automatic management system to archive and analyse the major study parameters and patient doses for fluoroscopy-guided procedures performed on cardiology and interventional radiology systems. The X-ray systems used for this trial can export, at the end of each procedure and via e-mail, the technical parameters of the study and the patient dose values. An application was developed to query and retrieve from a mail server all study reports sent by the imaging modality and to store them in a Microsoft SQL Server database. The results from 3538 interventional study reports generated by 7 interventional systems were processed. For some technical parameters and patient doses, alarms were added to receive malfunction alerts so that appropriate corrective actions can be taken immediately.
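A minimal sketch of the collection step is shown below, assuming an IMAP mailbox and using SQLite in place of the Microsoft SQL Server back end; the host, account and report format are hypothetical.

```python
# Hedged sketch: poll a mailbox for dose reports and archive them locally.
# Host, credentials, folder name and the report format are assumptions; the
# published system used a Microsoft SQL Server back end rather than SQLite.
import imaplib, email, sqlite3

conn = sqlite3.connect("dose_reports.db")
conn.execute("""CREATE TABLE IF NOT EXISTS report
                (msg_uid TEXT PRIMARY KEY, subject TEXT, body TEXT)""")

imap = imaplib.IMAP4_SSL("mail.example-hospital.org")   # hypothetical server
imap.login("dose-archive", "secret")                    # hypothetical account
imap.select("INBOX")
_, data = imap.search(None, "UNSEEN")
for uid in data[0].split():
    _, msg_data = imap.fetch(uid, "(RFC822)")
    msg = email.message_from_bytes(msg_data[0][1])
    if msg.is_multipart():
        part = next((p for p in msg.walk() if p.get_content_type() == "text/plain"), None)
        body = part.get_payload(decode=True) if part else b""
    else:
        body = msg.get_payload(decode=True) or b""
    conn.execute("INSERT OR IGNORE INTO report VALUES (?, ?, ?)",
                 (uid.decode(), msg.get("Subject", ""), body.decode(errors="replace")))
conn.commit()
imap.logout()
```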
Reducing the Cost of System Administration of a Disk Storage System Built from Commodity Components
2000-05-01
quickly by using checkpointing and roll-forward logs. Microsoft Tiger is a video server built from commodity PCs which they call “cubs” [ BBD +96, BFD97...20 cents per megabyte using street prices of components. 3.2.2 Redundancy In designing the TD prototype, we have taken care to ensure it does not have... Td /GridPix/, 1999. [ATP99] Satoshi Asami, Nisha Talagala, and David Patterson. Designing a self-maintaining storage system. In Proceedings of the
2013-01-01
website). Data mining tools are in-house code developed in Python, C++ and Java . • NGA The National Geospatial-Intelligence Agency (NGA) performs data...as PostgreSQL (with PostGIS), MySQL , Microsoft SQL Server, SQLite, etc. using the appropriate JDBC driver. 14 The documentation and ease to learn are...written in Java that is able to perform various types of regressions, classi- fications, and other data mining tasks. There is also a commercial version
Managing Attribute—Value Clinical Trials Data Using the ACT/DB Client—Server Database System
Nadkarni, Prakash M.; Brandt, Cynthia; Frawley, Sandra; Sayward, Frederick G.; Einbinder, Robin; Zelterman, Daniel; Schacter, Lee; Miller, Perry L.
1998-01-01
ACT/DB is a client—server database application for storing clinical trials and outcomes data, which is currently undergoing initial pilot use. It stores most of its data in entity—attribute—value form. Such data are segregated according to data type to allow indexing by value when possible, and binary large object data are managed in the same way as other data. ACT/DB lets an investigator design a study rapidly by defining the parameters (or attributes) that are to be gathered, as well as their logical grouping for purposes of display and data entry. ACT/DB generates customizable data entry. The data can be viewed through several standard reports as well as exported as text to external analysis programs. ACT/DB is designed to encourage reuse of parameters across multiple studies and has facilities for dictionary search and maintenance. It uses a Microsoft Access client running on Windows 95 machines, which communicates with an Oracle server running on a UNIX platform. ACT/DB is being used to manage the data for seven studies in its initial deployment. PMID:9524347
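The entity-attribute-value layout with type-segregated value tables can be sketched as follows; the schema below is illustrative only (using SQLite as a stand-in), not the actual ACT/DB Access/Oracle schema.

```python
# Minimal entity-attribute-value (EAV) sketch in the spirit of ACT/DB, with
# values segregated by data type so numeric values can be indexed and queried
# directly. Table and column names are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE attribute   (attr_id INTEGER PRIMARY KEY, name TEXT, datatype TEXT);
CREATE TABLE value_num   (patient_id INT, attr_id INT, visit TEXT, value REAL);
CREATE TABLE value_text  (patient_id INT, attr_id INT, visit TEXT, value TEXT);
CREATE INDEX idx_num ON value_num(attr_id, value);
""")

db.execute("INSERT INTO attribute VALUES (1, 'systolic_bp', 'numeric')")
db.execute("INSERT INTO attribute VALUES (2, 'adverse_event', 'text')")
db.execute("INSERT INTO value_num  VALUES (1001, 1, 'baseline', 138.0)")
db.execute("INSERT INTO value_text VALUES (1001, 2, 'baseline', 'headache')")

# Query by value, which the per-type tables make straightforward to index:
rows = db.execute("""SELECT v.patient_id, v.value
                     FROM value_num v JOIN attribute a ON a.attr_id = v.attr_id
                     WHERE a.name = 'systolic_bp' AND v.value > 130""").fetchall()
print(rows)
```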
Data exchange technology based on handshake protocol for industrial automation system
NASA Astrophysics Data System (ADS)
Astafiev, A. V.; Shardin, T. O.
2018-05-01
In this article, data exchange technology based on a handshake protocol for industrial automation systems is considered. Methods of implementing the technology in client-server applications are analysed. The main threats to client-server applications that arise during information exchange between users are identified. A comparative analysis of analogous systems was also carried out, as a result of which the most suitable option was chosen for further use. The basic schemes for the operation of the handshake protocol are shown, as well as the general scheme of the implemented application, which describes the entire process of interaction between the client and the server.
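A toy version of such a handshake over TCP sockets is sketched below; the three-step message format, port and payload are assumptions made for illustration rather than the protocol specified in the article.

```python
# Illustrative three-step handshake (hello / challenge / confirm) over TCP
# before payload exchange; all message strings and the port are invented.
import socket, threading, time

HOST, PORT = "127.0.0.1", 9009

def server():
    with socket.socket() as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            assert conn.recv(1024) == b"HELLO"           # step 1: client hello
            conn.sendall(b"CHALLENGE:42")                # step 2: server nonce
            assert conn.recv(1024) == b"CONFIRM:42"      # step 3: client echoes nonce
            conn.sendall(b"READY")
            print("server got payload:", conn.recv(1024))

threading.Thread(target=server, daemon=True).start()
time.sleep(0.2)                                          # let the server start listening

with socket.create_connection((HOST, PORT)) as cli:
    cli.sendall(b"HELLO")
    nonce = cli.recv(1024).split(b":")[1]
    cli.sendall(b"CONFIRM:" + nonce)
    if cli.recv(1024) == b"READY":
        cli.sendall(b"sensor=12.7;unit=bar")             # example payload
time.sleep(0.2)                                          # let the server print before exit
```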
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-16
... the Exchange in order that they may locate their electronic servers in close physical proximity to the... execution systems through the same order gateway regardless of whether the sender is co-located in the... scheduled at least 1 day in advance. Rack and Stack Installation of one $200 per server. server in User's...
Czaplewski, Cezary; Karczynska, Agnieszka; Sieradzan, Adam K; Liwo, Adam
2018-04-30
A server implementation of the UNRES package (http://www.unres.pl) for coarse-grained simulations of protein structures with the physics-based UNRES model, named the UNRES server, is presented. In contrast to most protein coarse-grained models, owing to its physics-based origin, the UNRES force field can be used in simulations, including those aimed at protein-structure prediction, without ancillary information from structural databases; however, the implementation includes the possibility of using restraints. Local energy minimization, canonical molecular dynamics simulations, replica exchange and multiplexed replica exchange molecular dynamics simulations can be run with the current UNRES server; the latter are suitable for protein-structure prediction. The user-supplied input includes the protein sequence and, optionally, restraints from secondary-structure prediction or small-angle X-ray scattering data, as well as the simulation type and parameters, which are selected or typed in. Oligomeric proteins, as well as those containing D-amino-acid residues and disulfide links, can be treated. The output is displayed graphically (minimized structures, trajectories, final models, analysis of trajectories/ensembles); all output files can also be downloaded by the user. The UNRES server can be freely accessed at http://unres-server.chem.ug.edu.pl.
Leaders Are the Network: Applying the Kotter Model in Shaping Future Information Systems
2010-01-01
common operational picture (COP) ( Hinson , 2009). Figure 3 demonstrates how CID combines Link 16 and FBCB2 feeds. The CID server polls different...Link 16 Info Exchange A B C S A D S Figure 3 FBCB2-Link 16 Information Exchange. Source: Created by author based on information derived from Hinson ...31552-new-army-leader-development-strategy- released/ (accessed July 30, 2010). Hinson , Jason and Summit, Bob, “Combat Identification Server: Blue
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perry, Marcia
The IRCD is an IRC server that was originally distributed by the IRCD Hybrid developer team for use as a server for IRC messaging over the public Internet. By supporting the IRC protocol defined in the IRC RFC, IRCD allows users to create and join channels for group or one-to-one text-based instant messaging. It stores information about channels (e.g., whether a channel is public, secret, or invite-only, the topic set, and membership) and users (who is online and what channels they are members of). It receives messages for a specific user or channel and forwards these messages to the targeted destination. Since server-to-server communication is also supported, these targeted destinations may be connected to different IRC servers. Messages are exchanged over TCP connections that remain open between the client and the server. The IRCD is being used within the Pervasive Computing Collaboration Environment (PCCE) as the 'chat server' for message exchange over public and private channels. After an LBNLSecureMessaging (PCCE chat) client has been authenticated, the client connects to IRCD with its assigned nickname or 'nick.' The client can then create or join channels for group discussions or one-to-one conversations. These channels can have an initial mode of public or invite-only, and the mode may be changed after creation. If a channel is public, anyone online can join the discussion; if a channel is invite-only, users can only join if existing members of the channel explicitly invite them. Users can be invited to any type of channel, and users may be members of multiple channels simultaneously. For use with the PCCE environment, the IRCD application (which was written in C) was ported to Linux and has been tested and installed under Linux Red Hat 7.2. The source code was also modified with SSL so that all messages exchanged over the network are encrypted. This modified IRC server also verifies with an authentication server that the client is who he or she claims to be and that this user is authorized to gain access to the IRCD.
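A minimal client-side sketch of the RFC-defined commands (NICK, USER, JOIN, PRIVMSG, PONG) is shown below; the server name and channel are placeholders, and the SSL wrapping and authentication check used in the PCCE deployment are omitted.

```python
# Minimal IRC client sketch using standard IRC RFC commands; server, nick and
# channel names are placeholders, not the PCCE deployment values.
import socket

def irc_session(server="irc.example.org", port=6667,
                nick="pcce_user", channel="#pcce"):
    sock = socket.create_connection((server, port))
    send = lambda line: sock.sendall((line + "\r\n").encode())
    send(f"NICK {nick}")
    send(f"USER {nick} 0 * :PCCE chat user")
    send(f"JOIN {channel}")
    send(f"PRIVMSG {channel} :hello from the sketch client")
    try:
        while True:
            data = sock.recv(4096).decode(errors="replace")
            if not data:
                break
            for line in data.splitlines():
                if line.startswith("PING"):          # keep the connection alive
                    send("PONG " + line.split(" ", 1)[1])
                else:
                    print(line)
    finally:
        sock.close()
```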
Development of yarn breakage detection software system based on machine vision
NASA Astrophysics Data System (ADS)
Wang, Wenyuan; Zhou, Ping; Lin, Xiangyu
2017-10-01
In spinning mills, yarn breakage often cannot be detected in a timely manner, which raises costs for textile enterprises. This paper presents a software system based on computer vision for real-time detection of yarn breakage. The system uses a Windows 8.1 tablet PC and a cloud server to carry out yarn breakage detection and management. The software running on the tablet PC collects yarn and location information for analysis and processing, and the processed information is sent over Wi-Fi using the HTTP protocol to the cloud server, where it is stored in a Microsoft SQL Server 2008 database for subsequent query and management of yarn-break information. Finally, the results are shown on a local display in real time to remind the operator to deal with the broken yarn. The experimental results show that the missed-detection rate of the system is no more than 5‰, with no false detections.
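The tablet-to-cloud upload step could look roughly like the following sketch; the endpoint URL, field names and use of the requests library are assumptions, and the published system stores the records in SQL Server 2008 rather than behind the generic endpoint shown here.

```python
# Hedged sketch of posting a yarn-break record over HTTP to a cloud server.
# URL, field names and identifiers are invented for illustration only.
import requests
from datetime import datetime

record = {
    "machine_id": "ring-frame-07",        # hypothetical identifiers
    "spindle": 153,
    "event": "yarn_break",
    "timestamp": datetime.utcnow().isoformat() + "Z",
}
resp = requests.post("https://cloud.example.com/api/yarn-breaks",
                     json=record, timeout=5)
resp.raise_for_status()
print("stored, server returned:", resp.status_code)
```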
An integrated data-analysis and database system for AMS 14C
NASA Astrophysics Data System (ADS)
Kjeldsen, Henrik; Olsen, Jesper; Heinemeier, Jan
2010-04-01
AMSdata is the name of a combined database and data-analysis system for AMS 14C and stable-isotope work that has been developed at Aarhus University. The system (1) contains routines for data analysis of AMS and MS data, (2) allows a flexible and accurate description of sample extraction and pretreatment, also when samples are split into several fractions, and (3) keeps track of all measured, calculated and attributed data. The structure of the database is flexible and allows an unlimited number of measurement and pretreatment procedures. The AMS 14C data analysis routine is fairly advanced and flexible, and it can be easily optimized for different kinds of measuring processes. Technically, the system is based on a Microsoft SQL server and includes stored SQL procedures for the data analysis. Microsoft Office Access is used for the (graphical) user interface, and in addition Excel, Word and Origin are exploited for input and output of data, e.g. for plotting data during data analysis.
Schrader, T; Hufnagl, P; Schlake, W; Dietel, M
2005-01-01
In the autumn, a German screening program was started for detecting breast cancer in the population of women aged fifty and above. For the first time in this program, quality assurance rules were established: all statements of the radiologists and pathologists have to be confirmed by a second opinion. This improvement in quality comes with a delay in time and additional expense. A new Telepathology Consultation Service was developed, based on the experiences of the Telepathology Consultation Center of the UICC, to speed up the second-opinion process. The completely web-based service is operated under MS Windows 2003 Server, with Internet Information Server as the web server and SQL Server (both Microsoft) as the database. The websites, forms and control mechanisms were coded with ASP scripts and JavaScript. A study to evaluate the effectiveness of telepathological consultation in comparison with conventional consultation has been carried out. Pathologists of the Professional Association of German Pathologists took part, both as requesting pathologists and as consultants for other participants. The quality of telepathological diagnosis was comparable to conventional diagnosis. Telepathology allows a faster response of 1 to 2 days (compared with the conventional postal delay). The time needed to prepare a telepathology request is about twice that of a conventional request. This ratio may be inverted by an interface between the Pathology Information System and the Telepathology Server and by the use of virtual microscopy. The Telepathology Consultation Service of the Professional Association of German Pathologists is a fast and effective German-language, internet-based service for obtaining a second opinion.
Automated Computer Access Request System
NASA Technical Reports Server (NTRS)
Snook, Bryan E.
2010-01-01
The Automated Computer Access Request (AutoCAR) system is a Web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a work-flow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic active server page (ASP) application hosted on a Microsoft Internet Information Server (IIS).
Creating a Parallel Version of VisIt for Microsoft Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlock, B J; Biagas, K S; Rawson, P L
2011-12-07
VisIt is a popular, free interactive parallel visualization and analysis tool for scientific data. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images or movies for presentations. VisIt was designed from the ground up to work on many scales of computers, from modest desktops up to massively parallel clusters. VisIt is comprised of a set of cooperating programs. All programs can be run locally or in client/server mode, in which some run locally and some run remotely on compute clusters. The VisIt program most able to harness today's computing power is the VisIt compute engine. The compute engine is responsible for reading simulation data from disk, processing it, and sending results or images back to the VisIt viewer program. In a parallel environment, the compute engine runs several processes, coordinating using the Message Passing Interface (MPI) library. Each MPI process reads some subset of the scientific data and filters the data in various ways to create useful visualizations. By using MPI, VisIt has been able to scale well into the thousands of processors on large computers such as dawn and graph at LLNL. The advent of multicore CPUs has made parallelism the 'new' way to achieve increasing performance. With today's computers having at least 2 cores and in many cases up to 8 and beyond, it is more important than ever to deploy parallel software that can use that computing power not only on clusters but also on the desktop. We have created a parallel version of VisIt for Windows that uses Microsoft's MPI implementation (MSMPI) to process data in parallel on the Windows desktop as well as on a Windows HPC cluster running Microsoft Windows Server 2008. Initial desktop parallel support for Windows was deployed in VisIt 2.4.0. Windows HPC cluster support has been completed and will appear in the VisIt 2.5.0 release. We plan to continue supporting parallel VisIt on Windows so our users will be able to take full advantage of their multicore resources.
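The data-parallel pattern described above, where each MPI rank reads and filters its own subset of the data, can be illustrated with a short mpi4py sketch; this is a Python stand-in, not VisIt's C++ compute engine, and the data are synthetic.

```python
# Sketch of the per-rank read-and-filter pattern used by parallel compute
# engines; mpi4py is assumed. Run with e.g.: mpiexec -n 4 python filter_demo.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Pretend the full dataset is 1e6 samples split into contiguous chunks per rank.
n_total = 1_000_000
chunk = n_total // size
local = np.random.default_rng(rank).normal(size=chunk)   # "read" this rank's subset

# Each rank applies a filter (here: an isovalue-style threshold) to its chunk.
local_count = int((local > 2.0).sum())

counts = comm.gather(local_count, root=0)
if rank == 0:
    print("samples above threshold per rank:", counts, "total:", sum(counts))
```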
Japan Data Exchange Network JDXnet and Cloud-type Data Relay Server for Earthquake Observation Data
NASA Astrophysics Data System (ADS)
Takano, K.; Urabe, T.; Tsuruoka, H.; Nakagawa, S.
2015-12-01
In Japan, high-sensitivity seismic observation and broad-band seismic observation are carried out by several organizations such as the Japan Meteorological Agency (JMA), the National Research Institute for Earth Science and Disaster Prevention (NIED), nine national universities, and the Japan Agency for Marine-Earth Science and Technology (JAMSTEC). The total number of observation stations is about 1400. The total volume of seismic waveform data collected from all these observation stations is about 1 MByte per second (about 8 to 10 Mbps) using the WIN system (Urabe 1991). JDXnet is the Japan Data eXchange network for earthquake observation data. JDXnet was started in 2007 through the cooperation of researchers at each organization. All the seismic waveform data are available to all organizations in real time. The core of JDXnet is broadcast-type real-time data exchange using the nationwide L2-VPN services offered by JGN-X of NICT and SINET4 of NII. Before the Tohoku earthquake, the nine national universities collected seismic data at their own data centers and then exchanged them with other universities and institutions by JDXnet. However, in this arrangement, if a university's data center stopped, none of that university's data could be used even though some of its observation stations were still alive. Because of this problem, we have prepared a data relay server in the data center of SINET4, i.e. the cloud center. This data relay server collects data directly from the observation stations of the universities and delivers the data to all universities and institutions by JDXnet. By using the relay server at the cloud center, even if some universities are affected by a large disaster, the data from stations that are still operating are not lost. If researchers set up seismometers and send data to the relay server, the data become available to all researchers. This mechanism promotes the joint use of seismometers and joint research activities among researchers nationwide.
Conversation Threads Hidden within Email Server Logs
NASA Astrophysics Data System (ADS)
Palus, Sebastian; Kazienko, Przemysław
Email server logs contain records of all email exchanged through the server. Often we would like to analyze those emails not separately but in conversation threads, especially when we need to analyze a social network extracted from the email logs. Unfortunately, each email is in a different record, and those records are not tied to each other in any obvious way. In this paper, a method for discussion thread extraction is proposed, together with experiments on two different data sets: Enron and WrUT.
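One common way to tie such records together is to group messages by their Message-ID, In-Reply-To and References headers; the sketch below does this with a small union-find and is an illustration of the general idea, not the specific method evaluated in the paper.

```python
# Group raw RFC 822 messages into conversation threads via their
# Message-ID / In-Reply-To / References headers, using a union-find.
from email import message_from_string

def thread_groups(raw_messages):
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x
    def union(a, b):
        parent[find(a)] = find(b)

    for raw in raw_messages:
        msg = message_from_string(raw)
        mid = msg.get("Message-ID", "").strip()
        refs = (msg.get("In-Reply-To", "") + " " + msg.get("References", "")).split()
        for ref in refs:
            union(mid, ref)
        find(mid)                            # register singletons too

    groups = {}
    for mid in list(parent):
        groups.setdefault(find(mid), set()).add(mid)
    return list(groups.values())

example = [
    "Message-ID: <1@a>\nSubject: kickoff\n\nhi",
    "Message-ID: <2@a>\nIn-Reply-To: <1@a>\nSubject: Re: kickoff\n\nreply",
    "Message-ID: <3@b>\nSubject: unrelated\n\nother",
]
print(thread_groups(example))
```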
www.p2p.edu: Rip, Mix & Burn Your Education.
ERIC Educational Resources Information Center
Gillespie, Thom
2001-01-01
Discusses peer to peer technology which allows uploading files from one hard drive to another. Topics include the client/server model for education; the Napster client/server model; Gnutella; Freenet and other projects to allow the free exchange of information without censorship; bandwidth problems; copyright issues; metadata; and the United…
Proposal for a new CAPE-OPEN Object Model
Process simulation applications require the exchange of significant amounts of data between the flowsheet environment, unit operation model, and thermodynamic server. Packing and unpacking various data types and exchanging data using structured text-based architectures, including...
The frequency of company-sponsored alcohol brand-related sites on Facebook™-2012.
Nhean, Siphannay; Nyborn, Justin; Hinchey, Danielle; Valerio, Heather; Kinzel, Kathryn; Siegel, Michael; Jernigan, David H
2014-06-01
This research provides an estimate of the frequency of company-sponsored alcohol brand-related sites on Facebook™. We conducted a systematic overview of the extent of alcohol brand-related sites on Facebook™ in 2012. We conducted a 2012 Facebook™ search for sites specifically related to 898 alcohol brands across 16 different alcoholic beverage types. Descriptive statistics were produced using Microsoft SQL Server. We identified 1,017 company-sponsored alcohol-brand related sites on Facebook™. Our study advances previous literature by providing a systematic overview of the extent of alcohol brand sites on Facebook™.
Integrated Distributed Directory Service for KSC
NASA Technical Reports Server (NTRS)
Ghansah, Isaac
1997-01-01
This paper describes an integrated distributed directory services (DDS) architecture as a fundamental component of KSC distributed computing systems. Specifically, an architecture for an integrated directory service based on DNS and X.500/LDAP has been suggested. The architecture supports using DNS in its traditional role as a name service and X.500 for other services. Specific designs were made in the integration of X.500 DDS for Public Key Certificates, Kerberos Security Services, Network-wide Login, Electronic Mail, WWW URLS, Servers, and other diverse network objects. Issues involved in incorporating the emerging Microsoft Active Directory Service MADS in KSC's X.500 were discussed.
CProb: a computational tool for conducting conditional probability analysis.
Hollister, Jeffrey W; Walker, Henry A; Paul, John F
2008-01-01
Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
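The core CPA quantity can be reproduced in a few lines; the sketch below uses synthetic stressor/response data and an assumed impairment cutoff, in Python rather than the R/Excel implementation of CProb.

```python
# Hedged re-implementation of the basic conditional probability analysis:
# P(poor ecological condition | stressor exceeds a threshold), evaluated over
# several candidate thresholds. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
stressor = rng.lognormal(mean=0.0, sigma=0.7, size=500)        # e.g. contaminant level
response = 50 - 8 * np.log(stressor) + rng.normal(0, 6, 500)   # e.g. condition index
poor = response < 40                                           # assumed impairment cutoff

def conditional_probability(threshold):
    exceed = stressor > threshold
    return np.nan if exceed.sum() == 0 else poor[exceed].mean()

for t in np.quantile(stressor, [0.1, 0.25, 0.5, 0.75, 0.9]):
    print(f"P(poor | stressor > {t:.2f}) = {conditional_probability(t):.2f}")
```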
NASA Astrophysics Data System (ADS)
Wang, Jian
2017-01-01
In order to change traditional PE teaching mode and realize the interconnection, interworking and sharing of PE teaching resources, a distance PE teaching platform based on broadband network is designed and PE teaching information resource database is set up. The designing of PE teaching information resource database takes Windows NT 4/2000Server as operating system platform, Microsoft SQL Server 7.0 as RDBMS, and takes NAS technology for data storage and flow technology for video service. The analysis of system designing and implementation shows that the dynamic PE teaching information resource sharing platform based on Web Service can realize loose coupling collaboration, realize dynamic integration and active integration and has good integration, openness and encapsulation. The distance PE teaching platform based on Web Service and the design scheme of PE teaching information resource database can effectively solve and realize the interconnection, interworking and sharing of PE teaching resources and adapt to the informatization development demands of PE teaching.
New Web Server - the Java Version of Tempest - Produced
NASA Technical Reports Server (NTRS)
York, David W.; Ponyik, Joseph G.
2000-01-01
A new software design and development effort has produced a Java (Sun Microsystems, Inc.) version of the award-winning Tempest software (refs. 1 and 2). In 1999, the Embedded Web Technology (EWT) team received a prestigious R&D 100 Award for Tempest, Java Version. In this article, "Tempest" will refer to the Java version of Tempest, a World Wide Web server for desktop or embedded systems. Tempest was designed at the NASA Glenn Research Center at Lewis Field to run on any platform for which a Java Virtual Machine (JVM, Sun Microsystems, Inc.) exists. The JVM acts as a translator between the native code of the platform and the byte code of Tempest, which is compiled in Java. These byte code files are Java executables with a ".class" extension. Multiple byte code files can be zipped together as a "*.jar" file for more efficient transmission over the Internet. Today's popular browsers, such as Netscape (Netscape Communications Corporation) and Internet Explorer (Microsoft Corporation) have built-in Virtual Machines to display Java applets.
NASA Technical Reports Server (NTRS)
Sundermier, Amy (Inventor)
2002-01-01
A method for acquiring and assembling software components at execution time into a client program, where the components may be acquired from remote networked servers is disclosed. The acquired components are assembled according to knowledge represented within one or more acquired mediating components. A mediating component implements knowledge of an object model. A mediating component uses its implemented object model knowledge, acquired component class information and polymorphism to assemble components into an interacting program at execution time. The interactions or abstract relationships between components in the object model may be implemented by the mediating component as direct invocations or indirect events or software bus exchanges. The acquired components may establish communications with remote servers. The acquired components may also present a user interface representing data to be exchanged with the remote servers. The mediating components may be assembled into layers, allowing arbitrarily complex programs to be constructed at execution time.
Li, Ya-Pin; Gao, Hong-Wei; Fan, Hao-Jun; Wei, Wei; Xu, Bo; Dong, Wen-Long; Li, Qing-Feng; Song, Wen-Jing; Hou, Shi-Ke
2017-12-01
The objective of this study was to build a database to collect infectious disease information at the scene of a disaster through the use of 128 epidemiological questionnaires and 47 types of options, with rapid acquisition of information regarding infectious disease and rapid questionnaire customization at the scene of disaster relief by use of a personal digital assistant (PDA). SQL Server 2005 (Microsoft Corp, Redmond, WA) was used to create the option database for the infectious disease investigation, to develop a client application for the PDA, and to deploy the application on the server side. The users accessed the server for data collection and questionnaire customization with the PDA. A database with a set of comprehensive options was created and an application system was developed for the Android operating system (Google Inc, Mountain View, CA). On this basis, an infectious disease information collection system was built for use at the scene of disaster relief. The creation of an infectious disease information collection system and rapid questionnaire customization through the use of a PDA was achieved. This system integrated computer technology and mobile communication technology to develop an infectious disease information collection system and to allow for rapid questionnaire customization at the scene of disaster relief. (Disaster Med Public Health Preparedness. 2017;11:668-673).
WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides
NASA Astrophysics Data System (ADS)
Ma, Xiuzeng; Li, Yingkui; Bourgeois, Mike; Caffee, Marc; Elmore, David; Granger, Darryl; Muzikar, Paul; Smith, Preston
2007-06-01
Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this it is critical to establish an effective, easily accessible and well defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on the nuclide concentrations measured by the accelerator mass spectrometry. WebCN for 10Be and 26Al has been finished and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for 36Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for the database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
NASA Technical Reports Server (NTRS)
Steeman, Gerald; Connell, Christopher
2000-01-01
Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from managing time to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals and lessons learned in the Web-to-database process (including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Structured Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.
Implementation of Headtracking and 3D Stereo with Unity and VRPN for Computer Simulations
NASA Technical Reports Server (NTRS)
Noyes, Matthew A.
2013-01-01
This paper explores low-cost hardware and software methods to provide depth cues traditionally absent in monocular displays. The use of a VRPN server in conjunction with a Microsoft Kinect and/or Nintendo Wiimote to provide head tracking information to a Unity application, and NVIDIA 3D Vision for retinal disparity support, is discussed. Methods are suggested to implement this technology with NASA's EDGE simulation graphics package, along with potential caveats. Finally, future applications of this technology to astronaut crew training, particularly when combined with an omnidirectional treadmill for virtual locomotion and NASA's ARGOS system for reduced gravity simulation, are discussed.
Architecture of the software for LAMOST fiber positioning subsystem
NASA Astrophysics Data System (ADS)
Peng, Xiaobo; Xing, Xiaozheng; Hu, Hongzhuan; Zhai, Chao; Li, Weimin
2004-09-01
The architecture of the software that controls the LAMOST fiber positioning subsystem is described. The software is composed of two parts: a main control program running on a computer and a unit controller program in the ROM of an MCS51 single-chip microcomputer. The functions of the software include client/server model establishment, observation planning, collision handling, data transmission, pulse generation, CCD control, image capture and processing, and data analysis. Particular attention is paid to the ways in which different parts of the software communicate. Software techniques for multithreading, socket programming, Microsoft Windows message handling, and serial communications are also discussed.
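The client/server and multithreading techniques mentioned above can be illustrated with a small threaded command server; the sketch uses Python's socketserver module and invented command names, not the original Windows/MCS51 code.

```python
# Threaded command server sketch: each connected client (e.g. a unit
# controller link) is handled in its own thread. Commands are invented.
import socketserver

class UnitCommandHandler(socketserver.StreamRequestHandler):
    def handle(self):
        for raw in self.rfile:                       # one text command per line
            cmd = raw.decode().strip()
            if cmd.startswith("MOVE"):               # e.g. "MOVE 1024 37.5"
                _, fiber_id, angle = cmd.split()
                reply = f"OK fiber {fiber_id} -> {angle} deg"
            elif cmd == "STATUS":
                reply = "OK idle"
            else:
                reply = "ERR unknown command"
            self.wfile.write((reply + "\n").encode())

if __name__ == "__main__":
    with socketserver.ThreadingTCPServer(("0.0.0.0", 5050), UnitCommandHandler) as srv:
        srv.serve_forever()
```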
CDC WONDER: a cooperative processing architecture for public health.
Friede, A; Rosen, D H; Reid, J A
1994-01-01
CDC WONDER is an information management architecture designed for public health. It provides access to information and communications without the user's needing to know the location of data or communication pathways and mechanisms. CDC WONDER users have access to extractions from some 40 databases; electronic mail (e-mail); and surveillance data processing. System components include the Remote Client, the Communications Server, the Queue Managers, and Data Servers and Process Servers. The Remote Client software resides in the user's machine; other components are at the Centers for Disease Control and Prevention (CDC). The Remote Client, the Communications Server, and the Applications Server provide access to the information and functions in the Data Servers and Process Servers. The system architecture is based on cooperative processing, and components are coupled via pure message passing, using several protocols. This architecture allows flexibility in the choice of hardware and software. One system limitation is that final results from some subsystems are obtained slowly. Although designed for public health, CDC WONDER could be useful for other disciplines that need flexible, integrated information exchange. PMID:7719813
NASA Astrophysics Data System (ADS)
Reddy, K. Rasool; Rao, Ch. Madhava
2018-04-01
Security is currently one of the primary concerns in the transmission of images, owing to their increasing use in industrial applications, so it is necessary to protect image data from unauthorized individuals. Various strategies have been investigated to secure such data, of which encryption is one of the most prominent. This paper applies the Rijndael (AES) algorithm to shield the data from unauthorized parties. An Exponential Key Exchange (EKE) concept is also introduced to exchange the key between client and server. Data are exchanged over the network between client and server through a simple protocol known as the Trivial File Transfer Protocol (TFTP). This protocol is used mainly in embedded servers to transfer data, and it can also provide protection for the data if protection capabilities are integrated. In this paper, a GUI environment for image encryption and decryption is implemented. All experiments were carried out in a Linux environment using OpenCV-Python scripts.
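The encryption step alone might be sketched as follows, assuming the pycryptodome package for AES; the key exchange (EKE) and TFTP transfer described in the paper are not reproduced here.

```python
# Hedged sketch: AES (Rijndael) applied to an image file's bytes using the
# pycryptodome package (assumed). In the described system the key would come
# from the exponential key exchange rather than being generated locally.
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(16)                 # stand-in for the exchanged key

def encrypt_image(path_in, path_out):
    data = open(path_in, "rb").read()
    cipher = AES.new(key, AES.MODE_EAX)
    ciphertext, tag = cipher.encrypt_and_digest(data)
    with open(path_out, "wb") as f:
        for chunk in (cipher.nonce, tag, ciphertext):
            f.write(chunk)

def decrypt_image(path_in, path_out):
    with open(path_in, "rb") as f:
        nonce, tag = f.read(16), f.read(16)
        ciphertext = f.read()
    cipher = AES.new(key, AES.MODE_EAX, nonce=nonce)
    open(path_out, "wb").write(cipher.decrypt_and_verify(ciphertext, tag))
```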
Deterministic entanglement distillation for secure double-server blind quantum computation.
Sheng, Yu-Bo; Zhou, Lan
2015-01-15
Blind quantum computation (BQC) provides an efficient method for a client who does not have enough sophisticated technology and knowledge to perform universal quantum computation. The single-server BQC protocol requires the client to have some minimum quantum ability, while the double-server BQC protocol makes the client's device completely classical, resorting to the pure and clean Bell states shared by two servers. Here, we provide a deterministic entanglement distillation protocol in a practical noisy environment for the double-server BQC protocol. This protocol can obtain pure maximally entangled Bell states, and the success probability can reach 100% in principle. The distilled maximally entangled states can be retained to perform the BQC protocol subsequently. The parties who perform the distillation protocol do not need to exchange classical information, and they learn nothing from the client. This makes the protocol unconditionally secure and suitable for the future BQC protocol.
Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Kumar, Neeraj
2015-11-01
In the last few years, numerous remote user authentication and session key agreement schemes have been put forward for the Telecare Medical Information System, in which the patient and medical server exchange medical information over the Internet. We have found that most of these schemes are not usable for practical applications due to known security weaknesses. It is also worth noting that an unrestricted number of patients log in to the single medical server across the globe; therefore, the computation and maintenance overhead would be high, and the server may fail to provide services. In this article, we have designed a medical system architecture and a standard mutual authentication scheme for a single medical server, where the patient can securely exchange medical data with the doctor(s) via a trusted central medical server over any insecure network. We then explored the security of the scheme and its resilience to attacks. Moreover, we formally validated the proposed scheme through simulation using the Automated Validation of Internet Security Protocols and Applications software, whose outcomes confirm that the scheme is protected against active and passive attacks. The performance comparison demonstrated that the proposed scheme has lower communication cost than the existing schemes in the literature. In addition, the computation cost of the proposed scheme is nearly equal to that of the existing schemes. The proposed scheme is not only efficient against different security attacks, but it also provides an efficient login, mutual authentication, session key agreement and verification, and password update phase, along with password recovery.
Patients’ Data Management System Protected by Identity-Based Authentication and Key Exchange
Rivero-García, Alexandra; Santos-González, Iván; Hernández-Goya, Candelaria; Caballero-Gil, Pino; Yung, Moti
2017-01-01
A secure and distributed framework for the management of patients’ information in emergency and hospitalization services is proposed here in order to seek improvements in efficiency and security in this important area. In particular, confidentiality protection, mutual authentication, and automatic identification of patients are provided. The proposed system is based on two types of devices: Near Field Communication (NFC) wristbands assigned to patients, and mobile devices assigned to medical staff. Two other main elements of the system are an intermediate server to manage the involved data, and a second server with a private key generator to define the information required to protect communications. An identity-based authentication and key exchange scheme is essential to provide confidential communication and mutual authentication between the medical staff and the private key generator through an intermediate server. The identification of patients is carried out through a keyed-hash message authentication code. Thanks to the combination of the aforementioned tools, a secure alternative mobile health (mHealth) scheme for managing patients’ data is defined for emergency and hospitalization services. Different parts of the proposed system have been implemented, including mobile application, intermediate server, private key generator and communication channels. Apart from that, several simulations have been performed, and, compared with the current system, significant improvements in efficiency have been observed. PMID:28362328
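The keyed-hash identification step mentioned above can be sketched with Python's standard hmac module; the key handling and wristband identifier format below are assumptions for illustration.

```python
# Sketch of keyed-hash (HMAC-SHA256) patient identification: the wristband
# identifier is combined with a shared secret so the server can verify a tag
# without exposing the raw identifier. Key handling is simplified here.
import hmac, hashlib, os

SERVER_KEY = os.urandom(32)          # in practice issued by the key generator

def tag_for_patient(patient_uid: str) -> str:
    return hmac.new(SERVER_KEY, patient_uid.encode(), hashlib.sha256).hexdigest()

def verify_tag(patient_uid: str, presented_tag: str) -> bool:
    expected = tag_for_patient(patient_uid)
    return hmac.compare_digest(expected, presented_tag)

wristband_uid = "NFC-04:A2:19:7C"                 # hypothetical wristband ID
tag = tag_for_patient(wristband_uid)              # written to the wristband
print(verify_tag(wristband_uid, tag))             # True at the point of care
```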
Frank, M S; Dreyer, K
2001-06-01
We describe a virtual web site hosting technology that enables educators in radiology to emblazon and make available for delivery on the world wide web their own interactive educational content, free from dependencies on in-house resources and policies. This suite of technologies includes a graphically oriented software application, designed for the computer novice, to facilitate the input, storage, and management of domain expertise within a database system. The database stores this expertise as choreographed and interlinked multimedia entities including text, imagery, interactive questions, and audio. Case-based presentations or thematic lectures can be authored locally, previewed locally within a web browser, then uploaded at will as packaged knowledge objects to an educator's (or department's) personal web site housed within a virtual server architecture. This architecture can host an unlimited number of unique educational web sites for individuals or departments in need of such service. Each virtual site's content is stored within that site's protected back-end database connected to Internet Information Server (Microsoft Corp, Redmond WA) using a suite of Active Server Page (ASP) modules that incorporate Microsoft's Active Data Objects (ADO) technology. Each person's or department's electronic teaching material appears as an independent web site with different levels of access--controlled by a username-password strategy--for teachers and students. There is essentially no static hypertext markup language (HTML). Rather, all pages displayed for a given site are rendered dynamically from case-based or thematic content that is fetched from that virtual site's database. The dynamically rendered HTML is displayed within a web browser in a Socratic fashion that can assess the recipient's current fund of knowledge while providing instantaneous user-specific feedback. Each site is emblazoned with the logo and identification of the participating institution. Individuals with teacher-level access can use a web browser to upload new content as well as manage content already stored on their virtual site. Each virtual site stores, collates, and scores participants' responses to the interactive questions posed on line. This virtual web site strategy empowers the educator with an end-to-end solution for creating interactive educational content and hosting that content within the educator's personalized and protected educational site on the world wide web, thus providing a valuable outlet that can magnify the impact of his or her talents and contributions.
Dornan, Tim; Lee, Catherine; Stopford, Adam; Hosie, Liam; Maredia, Neil; Rector, Alan
2005-04-01
The aim was to find how to use information and communication technology to present the clinical skills content of an undergraduate medical curriculum. Rapid application design was used to develop the product, and technical action research was used to evaluate the development process. A clinician-educator, two medical students, two computing science masters students, two other project workers, and a hospital education informatics lead, formed a design team. A sample of stakeholders took part in requirements planning workshops and continued to advise the team throughout the project. A university hospital had many features that favoured fast, inexpensive, and successful system development: a clearly defined and readily accessible user group; location of the development process close to end-users; fast, informal communication; leadership by highly motivated and senior end-users; devolved authority and lack of any rigidly imposed management structure; cooperation of clinicians because the project drew on their clinical expertise to achieve scholastic goals; a culture of learning and involvement of highly motivated students. A detailed specification was developed through storyboarding, use case diagramming, and evolutionary prototyping. A very usable working product was developed within weeks. "SkillsBase" is a database web application using Microsoft Active Server Pages, served from a Microsoft Windows 2000 Server operating system running Internet Information Server 5.0. Graphing functionality is provided by the KavaChart applet. It presents the skills curriculum, provides a password-protected portfolio function, and offers training materials. The curriculum can be presented in several different ways to help students reflect on their objectives and progress towards achieving them. The reflective portfolio function is entirely private to each student user and allows them to document their progress in attaining skills, as judged by self, peer and tutor assessment, and examinations. Training materials include web links and materials developed locally using pedagogic principles developed by the SkillsBase team. Although the usability of SkillsBase has been proven, uptake of software that has arisen 'bottom-up' from within the curriculum has proved slow. We plan to incorporate the SkillsBase services into a more comprehensive virtual managed learning environment, anticipating that presenting the functionality in an environment that is routinely used by students and teachers will increase uptake and use.
NASA Technical Reports Server (NTRS)
Baumbach, J. I.; Vonirmer, A.
1995-01-01
To assist current discussion in the field of ion mobility spectrometry, the Institut fur Spectrochemie und angewandte Spektroskopie, Dortmund, started operating an FTP server on 4 December 1994, available to all research groups at universities and institutes and to researchers in industry. We support the exchange, interpretation, and database searching of ion mobility spectra through the JCAMP-DS data format (Joint Committee on Atomic and Molecular Physical Data), as well as literature retrieval, pre-prints, notices, and a discussion board. We describe in general terms the access conditions, local addresses, and main code words. For further details, a monthly news report will be prepared for all users; the Internet e-mail address for subscribing is included in the document.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
..., Incorporated; Order Approving a Proposed Rule Change Relating to Co- Location Service Fees I. Introduction On... to co-location services and related fees. The proposed rule change was published for comment in the... of the equipment to the Exchange's servers, at no additional charge. This ``co-location service...
NASA Astrophysics Data System (ADS)
Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.
2010-12-01
A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web services based queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party collaborators include functions for data search and discovery, extensive graphing, data editing and export, HydroServer exploration, integration with the OpenMI workflow and modeling system, and an interface for data analysis through the R statistical package.
NASA Astrophysics Data System (ADS)
Yang, Xin; He, Zhen-yu; Jiang, Xiao-bo; Lin, Mao-sheng; Zhong, Ning-shan; Hu, Jiang; Qi, Zhen-yu; Bao, Yong; Li, Qiao-qiao; Li, Bao-yue; Hu, Lian-ying; Lin, Cheng-guang; Gao, Yuan-hong; Liu, Hui; Huang, Xiao-yan; Deng, Xiao-wu; Xia, Yun-fei; Liu, Meng-zhong; Sun, Ying
2017-03-01
To meet specific demands in China and the particular needs of the radiotherapy department, a MOSAIQ Integration Platform CHN (MIP) based on the radiation therapy (RT) workflow has been developed as a supplementary system to Elekta MOSAIQ. The MIP adopts a client-server (C/S) architecture, and its database is built on the Treatment Planning System (TPS) and MOSAIQ SQL Server 2008 databases, running on the hospital's local network. Five network servers, as the core hardware, supply data storage and network services based on cloud services. The core software, written in C#, is developed on the Microsoft Visual Studio platform. The MIP server can offer network services, including data entry, query, statistics, and printing, to about 200 workstations simultaneously. The MIP has been implemented over the past one and a half years, and several practical patient-oriented functions have been developed; the MIP now covers almost the entire radiation therapy workflow. It comprises 15 function modules, including Notice, Appointment, Billing, Document Management (application/execution), and System Management. By June 2016, the data recorded in the MIP included 13,546 patients, 13,533 plan applications, 15,475 RT records, 14,656 RT summaries, 567,048 billing records, and 506,612 workload records. The MIP, based on the RT workflow, has been successfully developed and clinically implemented with real-time performance, data security, and stable operation. It has been demonstrated to be user-friendly and has significantly improved the efficiency of the department, and it is key to facilitating information sharing and department management. More functions can be added or modified to further enhance its potential in research and clinical practice.
Decision support system for health care resources allocation
Sebaa, Abderrazak; Nouicer, Amina; Tari, AbdelKamel; Tarik, Ramtani; Abdellah, Ouhab
2017-01-01
Background A study of healthcare resources can improve decisions regarding the allotment and mobilization of medical resources and better guide future investment in the health sector. Aim The aim of this work was to design and implement a decision support system to improve medical resource allocation in the Bejaia region. Methods To achieve the retrospective cohort study, we integrated existing clinical databases from different health sector institutions of the Bejaia department (an Algerian department) to collect information about patients from January 2015 through December 2015. Data integration was performed in a data warehouse using the multi-dimensional model and an OLAP cube. During implementation, we used Microsoft SQL Server 2012 and Microsoft Excel 2010. Results A medical decision support platform was introduced and implemented during the planning stages, allowing the management of different medical orientations; it provides better apportionment and allotment of medical resources and helps ensure that the allocation of health care resources has optimal effects on improving health. Conclusion In this study, we designed and implemented a decision support system to improve health care in the Bejaia department, in particular to assist in selecting the optimal location of health centers and hospitals, the specialty of each health center, the medical equipment, and the medical staff. PMID:28848645
Decision support system for health care resources allocation.
Sebaa, Abderrazak; Nouicer, Amina; Tari, AbdelKamel; Tarik, Ramtani; Abdellah, Ouhab
2017-06-01
A study of healthcare resources can improve decisions regarding the allotment and mobilization of medical resources and better guide future investment in the health sector. The aim of this work was to design and implement a decision support system to improve medical resource allocation in the Bejaia region. To achieve the retrospective cohort study, we integrated existing clinical databases from different health sector institutions of the Bejaia department (an Algerian department) to collect information about patients from January 2015 through December 2015. Data integration was performed in a data warehouse using the multi-dimensional model and an OLAP cube. During implementation, we used Microsoft SQL Server 2012 and Microsoft Excel 2010. A medical decision support platform was introduced and implemented during the planning stages, allowing the management of different medical orientations; it provides better apportionment and allotment of medical resources and helps ensure that the allocation of health care resources has optimal effects on improving health. In this study, we designed and implemented a decision support system to improve health care in the Bejaia department, in particular to assist in selecting the optimal location of health centers and hospitals, the specialty of each health center, the medical equipment, and the medical staff.
Navy Network Dependability: Models, Metrics, and Tools
2010-01-01
different COP servers. The COP Synchronization Tool (CST) is the preferred method of exchanging data between COP servers: A critical component of COP...ASW mission’s equipment strings. A major difference in results between the new model and the old one is that the new one is far less optimistic about...understand why perceptions about the dependability (e.g., availability) of networks from users’ (e.g., sailors) per- spectives sometimes differ from the
2009-09-01
Table-of-contents excerpt: Diffie-Hellman Key Exchange; III. GhostNet Setup; A. Installation of OpenVPN for...; 3. Verifying the Secure Connection; B. Running OpenVPN as a Server on Windows; 1. Creating...; Generating Server and Client Keys; 5. Keys to Transfer to the Client; 6. Configuring OpenVPN to Use Certificates.
2012 ARPA-E Energy Innovation Summit: Fireside Chat with Steven Chu and Bill Gates
Chu, Steven; Gates, Bill; Podesta, John
2018-05-14
The third annual ARPA-E Energy Innovation Summit was held in Washington D.C. in February, 2012. The event brought together key players from across the energy ecosystem - researchers, entrepreneurs, investors, corporate executives, and government officials - to share ideas for developing and deploying the next generation of energy technologies. This video captures a session called Fireside Chat that featured Steven Chu, the Secretary of Energy, and Bill Gates, Chairman of Microsoft Corporation. The session is moderated by John Podesta, Chair of the Center for American Progress. Energy Secretary Steven Chu and Microsoft Founder and Chairman Bill Gates exchanged ideas about how small businesses and innovators can overcome the challenges that face many startups.
[Design and establishment of modern literature database about acupuncture Deqi].
Guo, Zheng-rong; Qian, Gui-feng; Pan, Qiu-yin; Wang, Yang; Xin, Si-yuan; Li, Jing; Hao, Jie; Hu, Ni-juan; Zhu, Jiang; Ma, Liang-xiao
2015-02-01
A search on acupuncture Deqi was conducted in four Chinese-language biomedical databases (CNKI, Wan-Fang, VIP, and CBM) and the PubMed database using the keywords "Deqi", "needle sensation", "needling feeling", "needle feel", "obtaining qi", etc. A "Modern Literature Database for Acupuncture Deqi" was then established using Microsoft SQL Server 2005 Express Edition, and the contents, data types, information structure, and logical constraints of the system table fields are introduced. From this database, detailed queries can be made about the general information of clinical trials, acupuncturists' experience, ancient medical works, comprehensive literature, etc. The present database lays a foundation for subsequent evaluation of the quality of the Deqi literature and for data mining of as yet undetected Deqi knowledge.
Cloud-based robot remote control system for smart factory
NASA Astrophysics Data System (ADS)
Wu, Zhiming; Li, Lianzhong; Xu, Yang; Zhai, Jingmei
2015-12-01
With the development of Internet technologies and the wide application of robots, there is a clear trend toward the integration of networks and robots. A cloud-based robot remote control system over networks for the smart factory is proposed, which enables remote users to control robots and thereby realize intelligent production. To achieve this, a three-layer system architecture is designed, comprising a user layer, a service layer, and a physical layer. The remote control applications running on the cloud server are developed on Microsoft Azure. Moreover, DIV+CSS technologies are used to design the human-machine interface to lower maintenance cost and improve development efficiency. Finally, an experiment is implemented to verify the feasibility of the approach.
UAV Data Exchange Test Bed for At-Sea and Ashore Information Systems
2014-12-02
Table-of-contents excerpt: 3.2 Visualization using NASA World Wind; 3.3 Visualization using Quantum GIS; ...Data Server and the Global Positioning Warehouse; 4.1 Naval Position Repository Installation; 4.2...; 4.4 Data Exchange between CSD and NPR; 5 Maritime Tactical Command and Control; 5.1 Global Command...
Escape Excel: A tool for preventing gene symbol and accession conversion errors.
Welsh, Eric A; Stewart, Paul A; Kuenzi, Brent M; Eschrich, James A
2017-01-01
Microsoft Excel automatically converts certain gene symbols, database accessions, and other alphanumeric text into dates, scientific notation, and other numerical representations. These conversions lead to subsequent, irreversible, corruption of the imported text. A recent survey of popular genomic literature estimates that one-fifth of all papers with supplementary gene lists suffer from this issue. Here, we present an open-source tool, Escape Excel, which prevents these erroneous conversions by generating an escaped text file that can be safely imported into Excel. Escape Excel is implemented in a variety of formats (http://www.github.com/pstew/escape_excel), including a command line based Perl script, a Windows-only Excel Add-In, an OS X drag-and-drop application, a simple web-server, and as a Galaxy web environment interface. Test server implementations are accessible as a Galaxy interface (http://apostl.moffitt.org) and simple non-Galaxy web server (http://apostl.moffitt.org:8000/). Escape Excel detects and escapes a wide variety of problematic text strings so that they are not erroneously converted into other representations upon importation into Excel. Examples of problematic strings include date-like strings, time-like strings, leading zeroes in front of numbers, and long numeric and alphanumeric identifiers that should not be automatically converted into scientific notation. It is hoped that greater awareness of these potential data corruption issues, together with diligent escaping of text files prior to importation into Excel, will help to reduce the amount of Excel-corrupted data in scientific analyses and publications.
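As an illustration of the escaping idea described above, the following is a minimal Python sketch that wraps risky fields in ="..." so Excel imports them as literal text. It is not the Escape Excel implementation itself; the detection patterns and the tab-delimited input format are simplifying assumptions.

import csv
import re
import sys

# Illustrative patterns for text that Excel tends to auto-convert (not the tool's exact rules).
DATE_LIKE = re.compile(r"^\d{1,2}[-/][A-Za-z]{3,}$|^[A-Za-z]{3,}[-/]\d{1,2}$|^\d{1,2}[-/]\d{1,2}$")
SCI_NOTATION = re.compile(r"^\d+[eE]\d+$")
LEADING_ZERO = re.compile(r"^0\d+$")
LONG_NUMBER = re.compile(r"^\d{12,}$")

def needs_escape(value):
    """Return True if Excel would likely auto-convert this text."""
    v = value.strip()
    return any(p.match(v) for p in (DATE_LIKE, SCI_NOTATION, LEADING_ZERO, LONG_NUMBER))

def escape_field(value):
    """Wrap risky fields as ="value" so Excel imports them as literal text."""
    return '="{}"'.format(value) if needs_escape(value) else value

def escape_file(in_path, out_path):
    """Read a tab-delimited file and write an escaped copy that is safe to open in Excel."""
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.reader(fin, delimiter="\t")
        writer = csv.writer(fout, delimiter="\t")
        for row in reader:
            writer.writerow([escape_field(cell) for cell in row])

if __name__ == "__main__":
    escape_file(sys.argv[1], sys.argv[2])   # usage: python escape_sketch.py input.tsv output.tsv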
Land Use and Land Cover Maps of Europe: a Webgis Platform
NASA Astrophysics Data System (ADS)
Brovelli, M. A.; Fahl, F. C.; Minghini, M.; Molinari, M. E.
2016-06-01
This paper presents the methods and implementation processes of a WebGIS platform designed to publish the available land use and land cover maps of Europe at continental scale. The system is built completely on open source infrastructure and open standards. The proposed architecture is based on a server-client model having GeoServer as the map server, Leaflet as the client-side mapping library and the Bootstrap framework at the core of the front-end user interface. The web user interface is designed to have typical features of a desktop GIS (e.g. activate/deactivate layers and order layers by drag and drop actions) and to show specific information on the activated layers (e.g. legend and simplified metadata). Users have the possibility to change the base map from a given list of map providers (e.g. OpenStreetMap and Microsoft Bing) and to control the opacity of each layer to facilitate the comparison with both other land cover layers and the underlying base map. In addition, users can add to the platform any custom layer available through a Web Map Service (WMS) and activate the visualization of photos from popular photo sharing services. This last functionality is provided in order to have a visual assessment of the available land coverages based on other user-generated contents available on the Internet. It is supposed to be a first step towards a calibration/validation service that will be made available in the future.
2012 ARPA-E Energy Innovation Summit: Fireside Chat with Steven Chu and Bill Gates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, Steven; Gates, Bill; Podesta, John
2012-02-28
The third annual ARPA-E Energy Innovation Summit was held in Washington D.C. in February, 2012. The event brought together key players from across the energy ecosystem - researchers, entrepreneurs, investors, corporate executives, and government officials - to share ideas for developing and deploying the next generation of energy technologies. This video captures a session called Fireside Chat that featured Steven Chu, the Secretary of Energy, and Bill Gates, Chairman of Microsoft Corporation. The session is moderated by John Podesta, Chair of the Center for American Progress. Energy Secretary Steven Chu and Microsoft Founder and Chairman Bill Gates exchanged ideas about how small businesses and innovators can overcome the challenges that face many startups.
Web-Based Distributed Simulation of Aeronautical Propulsion System
NASA Technical Reports Server (NTRS)
Zheng, Desheng; Follen, Gregory J.; Pavlik, William R.; Kim, Chan M.; Liu, Xianyou; Blaser, Tammy M.; Lopez, Isaac
2001-01-01
An application was developed to allow users to run and view the Numerical Propulsion System Simulation (NPSS) engine simulations from web browsers. Simulations were performed on multiple INFORMATION POWER GRID (IPG) test beds. The Common Object Request Broker Architecture (CORBA) was used for brokering data exchange among machines and IPG/Globus for job scheduling and remote process invocation. Web server scripting was performed by JavaServer Pages (JSP). This application has proven to be an effective and efficient way to couple heterogeneous distributed components.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-20
... Eliminate the 100MB Connectivity Option and Fee March 14, 2012. Pursuant to Section 19(b)(1) of the... proposes to eliminate 100MB connectivity between the Exchange and co-located servers, as well as associated... Proposed Rule Change 1. Purpose The Exchange proposes to modify Rule 7034(b) to eliminate 100MB...
Zaman, Babar; Khandekar, Rajiv; Al Shahwan, Sami; Song, Jonathan; Al Jadaan, Ibrahim; Al Jiasim, Leyla; Owaydha, Ohood; Asghar, Nasira; Hijazi, Amar; Edward, Deepak P.
2014-01-01
In this brief communication, we present the steps used to establish a web-based congenital glaucoma registry at our institution. The contents of a case report form (CRF) were developed by a group of glaucoma subspecialists. Information Technology (IT) specialists used Lime Survey software to create an electronic CRF. A MySQL (My Structured Query Language) server was used as the database, with a virtual machine operating system. Two ophthalmologists and 2 IT specialists worked for 7 hours, and a biostatistician and a data registrar worked for 24 hours each to establish the electronic CRF. Using the CRF, which was transferred to the Lime Survey tool, and the MySQL server application, data could be stored directly in spreadsheet and statistical programs including Microsoft Excel, SPSS, and R, and queried in real time. In a pilot test, clinical data from 80 patients with congenital glaucoma were entered into the registry, and successful descriptive analysis and data entry validation were performed. A web-based disease registry was established in a short period of time in a cost-efficient manner using available resources and a team-based approach. PMID:24791112
Zaman, Babar; Khandekar, Rajiv; Al Shahwan, Sami; Song, Jonathan; Al Jadaan, Ibrahim; Al Jiasim, Leyla; Owaydha, Ohood; Asghar, Nasira; Hijazi, Amar; Edward, Deepak P
2014-01-01
In this brief communication, we present the steps used to establish a web-based congenital glaucoma registry at our institution. The contents of a case report form (CRF) were developed by a group of glaucoma subspecialists. Information Technology (IT) specialists used Lime Survey software to create an electronic CRF. A MySQL (My Structured Query Language) server was used as the database, with a virtual machine operating system. Two ophthalmologists and 2 IT specialists worked for 7 hours, and a biostatistician and a data registrar worked for 24 hours each to establish the electronic CRF. Using the CRF, which was transferred to the Lime Survey tool, and the MySQL server application, data could be stored directly in spreadsheet and statistical programs including Microsoft Excel, SPSS, and R, and queried in real time. In a pilot test, clinical data from 80 patients with congenital glaucoma were entered into the registry, and successful descriptive analysis and data entry validation were performed. A web-based disease registry was established in a short period of time in a cost-efficient manner using available resources and a team-based approach.
A web-based quantitative signal detection system on adverse drug reaction in China.
Li, Chanjuan; Xia, Jielai; Deng, Jianxiong; Chen, Wenge; Wang, Suzhen; Jiang, Jing; Chen, Guanquan
2009-07-01
To establish a web-based quantitative signal detection system for adverse drug reactions (ADRs) based on spontaneous reporting to the Guangdong province drug-monitoring database in China. Using Microsoft Visual Basic and Active Server Pages programming languages and SQL Server 2000, a web-based system with three software modules was programmed to perform data preparation and association detection, and to generate reports. Information component (IC), the internationally recognized measure of disproportionality for quantitative signal detection, was integrated into the system, and its capacity for signal detection was tested with ADR reports collected from 1 January 2002 to 30 June 2007 in Guangdong. A total of 2,496 associations including known signals were mined from the test database. Signals (e.g., cefradine-induced hematuria) were found early by using the IC analysis. In addition, 291 drug-ADR associations were alerted for the first time in the second quarter of 2007. The system can be used for the detection of significant associations from the Guangdong drug-monitoring database and could be an extremely useful adjunct to the expert assessment of very large numbers of spontaneously reported ADRs for the first time in China.
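For context, the Information Component (IC) compares how often a drug-ADR pair is reported with how often it would be expected if drug and reaction were independent. The following Python sketch shows the simplified observed-versus-expected form of the IC with a continuity correction; the production system uses the full BCPNN formulation, and the counts below are hypothetical.

import math

def information_component(n_xy, n_x, n_y, n_total):
    """Simplified Information Component for a drug-ADR pair:
    IC = log2(P(drug, ADR) / (P(drug) * P(ADR))), estimated from report counts
    with a 0.5 continuity correction. The production system uses the full
    BCPNN formulation; this is only an illustration."""
    observed = (n_xy + 0.5) / (n_total + 0.5)
    expected = ((n_x + 0.5) / (n_total + 0.5)) * ((n_y + 0.5) / (n_total + 0.5))
    return math.log2(observed / expected)

# Hypothetical counts: reports mentioning both cefradine and hematuria (n_xy),
# cefradine overall (n_x), hematuria overall (n_y), and all reports (n_total).
ic = information_component(n_xy=28, n_x=1500, n_y=320, n_total=250000)
print("IC = %.2f (IC > 0 means the pair is reported more often than expected)" % ic)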
HPC in a HEP lab: lessons learned from setting up cost-effective HPC clusters
NASA Astrophysics Data System (ADS)
Husejko, Michal; Agtzidis, Ioannis; Baehler, Pierre; Dul, Tadeusz; Evans, John; Himyr, Nils; Meinhard, Helge
2015-12-01
In this paper we present our findings gathered during the evaluation and testing of Windows Server High-Performance Computing (Windows HPC) in view of potentially using it as a production HPC system for engineering applications. The Windows HPC package, an extension of Microsoft's Windows Server product, provides all essential interfaces, utilities and management functionality for creating, operating and monitoring a Windows-based HPC cluster infrastructure. The evaluation and test phase focused on verifying the functionality of Windows HPC, its performance, its support of commercial tools, and its integration with the users' work environment. We describe the constraints imposed by the way the CERN Data Centre is operated, by licensing for engineering tools, and by the scalability and behaviour of the HPC engineering applications used at CERN. We will present an initial set of requirements, which were created based on the above constraints and on requests from the CERN engineering user community. We will explain how we have configured Windows HPC clusters to provide the job scheduling functionality required to support the CERN engineering user community, quality of service, user- and project-based priorities, and fair access to limited resources. Finally, we will present several performance tests we carried out to verify Windows HPC performance and scalability.
Design and development of an IoT-based web application for an intelligent remote SCADA system
NASA Astrophysics Data System (ADS)
Kao, Kuang-Chi; Chieng, Wei-Hua; Jeng, Shyr-Long
2018-03-01
This paper presents the design of an intelligent remote electrical power supervisory control and data acquisition (SCADA) system based on the Internet of Things (IoT), with Internet Information Services (IIS) for setting up the web servers, an ASP.NET model-view-controller (MVC) application for establishing the remote electrical power monitoring and control system using responsive web design (RWD), and a Microsoft SQL Server as the database. With a web browser connected to the Internet, the sensing data are sent to the client using the TCP/IP protocol, which supports mobile devices with different screen sizes. Users can issue instructions immediately without being present to check conditions, which considerably reduces labor and time costs. The developed system incorporates a remote measuring function using a wireless sensor network and utilizes a visual interface to make the human-machine interface (HMI) more intuitive. Moreover, it contains analog inputs/outputs and basic digital inputs/outputs that can be applied to a motor driver and an inverter for integration with the remote IoT-based SCADA system, thus achieving efficient power management.
Hinton, Elizabeth G; Oelschlegel, Sandra; Vaughn, Cynthia J; Lindsay, J Michael; Hurst, Sachiko M; Earl, Martha
2013-01-01
This study utilizes an informatics tool to analyze a robust literature search service in an academic medical center library. Structured interviews with librarians were conducted focusing on the benefits of such a tool, expectations for performance, and visual layout preferences. The resulting application utilizes Microsoft SQL Server and .Net Framework 3.5 technologies, allowing for the use of a web interface. Customer tables and MeSH terms are included. The National Library of Medicine MeSH database and entry terms for each heading are incorporated, resulting in functionality similar to searching the MeSH database through PubMed. Data reports will facilitate analysis of the search service.
Electronic Transfer of School Records.
ERIC Educational Resources Information Center
Yeagley, Raymond
2001-01-01
Describes the electronic transfer of student records, notably the use of a Web-server named CHARLOTTE sponsored by the National Forum on Education Statistics and an Electronic Data Exchange system named SPEEDE/ExPRESS. (PKP)
PEM public key certificate cache server
NASA Astrophysics Data System (ADS)
Cheung, T.
1993-12-01
Privacy Enhanced Mail (PEM) provides privacy enhancement services to users of Internet electronic mail. Confidentiality, authentication, message integrity, and non-repudiation of origin are provided by applying cryptographic measures to messages transferred between end systems by the Message Transfer System. PEM supports both symmetric and asymmetric key distribution. However, the prevalent implementation uses a public key certificate-based strategy, modeled after the X.509 directory authentication framework. This scheme provides an infrastructure compatible with X.509. According to RFC 1422, public key certificates can be stored in directory servers, transmitted via non-secure message exchanges, or distributed via other means. Directory services provide a specialized distributed database for OSI applications. The directory contains information about objects and provides structured mechanisms for accessing that information. Since directory services are not widely available now, a good approach is to manage certificates in a centralized certificate server. This document describes the detailed design of a centralized certificate cache server. This server manages a cache of certificates and a cache of Certificate Revocation Lists (CRLs) for PEM applications. PEM applications contact the server to obtain or store certificates and CRLs. The server software is programmed in C and ELROS. To use this server, ISODE has to be configured and installed properly. The ISODE library 'libisode.a' has to be linked together with this library because ELROS uses the transport layer functions provided by 'libisode.a'. The X.500 DAP library that is included with the ELROS distribution also has to be linked in, since the server uses the DAP library functions to communicate with directory servers.
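The original cache server is written in C and ELROS over ISODE; purely as a language-neutral illustration of the caching idea (not of the PEM or X.509 protocol handling), the following Python sketch keeps certificates in memory behind a toy line-oriented store/fetch protocol. The port number and message format are assumptions.

import socketserver
import threading

# In-memory cache keyed by subject name; a real server would also hold CRLs per
# issuer and expire stale entries.
_cert_cache = {}
_lock = threading.Lock()

class CacheHandler(socketserver.StreamRequestHandler):
    """Toy line protocol: 'PUT <subject> <base64-cert>' or 'GET <subject>'."""
    def handle(self):
        line = self.rfile.readline().decode().strip()
        verb, _, rest = line.partition(" ")
        if verb == "PUT":
            subject, _, cert = rest.partition(" ")
            with _lock:
                _cert_cache[subject] = cert
            self.wfile.write(b"OK\n")
        elif verb == "GET":
            with _lock:
                cert = _cert_cache.get(rest)
            self.wfile.write(((cert or "NOT-FOUND") + "\n").encode())

if __name__ == "__main__":
    with socketserver.ThreadingTCPServer(("127.0.0.1", 9099), CacheHandler) as srv:
        srv.serve_forever()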
Nishino, Mizuki; Wolfe, Donna; Yam, Chun-Shan; Larson, Michael; Boiselle, Phillip M; Hatabu, Hiroto
2004-10-01
Because of the rapid increase in clinical workload in academic radiology departments, time for teaching rotating residents is getting more and more limited. As a solution to this problem, we introduced the Intranet Journal of Chest Radiology as a comprehensive innovative tool for assisting resident education. The Intranet Journal of Chest Radiology is constructed using Microsoft FrontPage version 2002 (Microsoft Corp, Redmond, WA) and is hosted in our departmental web server (Beth Israel Deaconess Medical Center, Boston, MA). The home page of the intranet journal provides access to the main features, "Cases of the Month," "Teaching File," "Selected Articles for Residents," "Lecture Series," and "Current Publications." These features provide quick access to the selected radiology articles, the interesting chest cases, and the lecture series and current publication from the chest section. Our intranet journal has been well utilized for 6 months after its introduction. It enhances residents' interest and motivation to work on case collections, to search and read articles, and to generate interest in research. Frequent updating is necessary for the journal to be kept current, relevant, and well-utilized. The intranet journal serves as a comprehensive innovative solution for resident education, providing basic educational resources and opportunities of interactive participation by residents.
MetNetAPI: A flexible method to access and manipulate biological network data from MetNet
2010-01-01
Background Convenient programmatic access to different biological databases allows automated integration of scientific knowledge. Many databases support a function to download files or data snapshots, or a webservice that offers "live" data. However, the functionality that a database offers cannot be represented in a static data download file, and webservices may consume considerable computational resources from the host server. Results MetNetAPI is a versatile Application Programming Interface (API) to the MetNetDB database. It abstracts, captures and retains operations away from a biological network repository and website. A range of database functions, previously only available online, can be immediately (and independently from the website) applied to a dataset of interest. Data is available in four layers: molecular entities, localized entities (linked to a specific organelle), interactions, and pathways. Navigation between these layers is intuitive (e.g. one can request the molecular entities in a pathway, as well as request in what pathways a specific entity participates). Data retrieval can be customized: Network objects allow the construction of new and integration of existing pathways and interactions, which can be uploaded back to our server. In contrast to webservices, the computational demand on the host server is limited to processing data-related queries only. Conclusions An API provides several advantages to a systems biology software platform. MetNetAPI illustrates an interface with a central repository of data that represents the complex interrelationships of a metabolic and regulatory network. As an alternative to data-dumps and webservices, it allows access to a current and "live" database and exposes analytical functions to application developers. Yet it only requires limited resources on the server-side (thin server/fat client setup). The API is available for Java, Microsoft.NET and R programming environments and offers flexible query and broad data- retrieval methods. Data retrieval can be customized to client needs and the API offers a framework to construct and manipulate user-defined networks. The design principles can be used as a template to build programmable interfaces for other biological databases. The API software and tutorials are available at http://www.metnetonline.org/api. PMID:21083943
Password-only authenticated three-party key exchange with provable security in the standard model.
Nam, Junghyun; Choo, Kim-Kwang Raymond; Kim, Junghwan; Kang, Hyun-Kyu; Kim, Jinsoo; Paik, Juryon; Won, Dongho
2014-01-01
Protocols for password-only authenticated key exchange (PAKE) in the three-party setting allow two clients registered with the same authentication server to derive a common secret key from their individual password shared with the server. Existing three-party PAKE protocols were proven secure under the assumption of the existence of random oracles or in a model that does not consider insider attacks. Therefore, these protocols may turn out to be insecure when the random oracle is instantiated with a particular hash function or an insider attack is mounted against the partner client. The contribution of this paper is to present the first three-party PAKE protocol whose security is proven without any idealized assumptions in a model that captures insider attacks. The proof model we use is a variant of the indistinguishability-based model of Bellare, Pointcheval, and Rogaway (2000), which is one of the most widely accepted models for security analysis of password-based key exchange protocols. We demonstrated that our protocol achieves not only the typical indistinguishability-based security of session keys but also the password security against undetectable online dictionary attacks.
Escape Excel: A tool for preventing gene symbol and accession conversion errors
Stewart, Paul A.; Kuenzi, Brent M.; Eschrich, James A.
2017-01-01
Background Microsoft Excel automatically converts certain gene symbols, database accessions, and other alphanumeric text into dates, scientific notation, and other numerical representations. These conversions lead to subsequent, irreversible, corruption of the imported text. A recent survey of popular genomic literature estimates that one-fifth of all papers with supplementary gene lists suffer from this issue. Results Here, we present an open-source tool, Escape Excel, which prevents these erroneous conversions by generating an escaped text file that can be safely imported into Excel. Escape Excel is implemented in a variety of formats (http://www.github.com/pstew/escape_excel), including a command line based Perl script, a Windows-only Excel Add-In, an OS X drag-and-drop application, a simple web-server, and as a Galaxy web environment interface. Test server implementations are accessible as a Galaxy interface (http://apostl.moffitt.org) and simple non-Galaxy web server (http://apostl.moffitt.org:8000/). Conclusions Escape Excel detects and escapes a wide variety of problematic text strings so that they are not erroneously converted into other representations upon importation into Excel. Examples of problematic strings include date-like strings, time-like strings, leading zeroes in front of numbers, and long numeric and alphanumeric identifiers that should not be automatically converted into scientific notation. It is hoped that greater awareness of these potential data corruption issues, together with diligent escaping of text files prior to importation into Excel, will help to reduce the amount of Excel-corrupted data in scientific analyses and publications. PMID:28953918
Black Sea GIS developed in MHI
NASA Astrophysics Data System (ADS)
Zhuk, E.; Khaliulin, A.; Zodiatis, G.; Nikolaidis, A.; Isaeva, E.
2016-08-01
This work aims at creating a Black Sea geoinformation system (GIS) and complementing it with a model bank. The software for data access and visualization was developed using a client-server architecture. A map service based on MapServer and the MySQL data management system were chosen for the Black Sea GIS. PHP modules and Python scripts are used to provide data access, processing, and exchange between the client application and the server. According to the basic data types, the module structure of the GIS was developed: each type of data is matched to a module that allows selection and visualization of the data. At present, the GIS is being complemented with a model bank (models built into the GIS) and users' models (programs launched on users' PCs that receive and display data via the GIS).
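As an example of the kind of server-side Python script mentioned above, the following sketch selects stations inside a bounding box from a MySQL table and returns them as JSON for the map client. The table layout, column names, and credentials are hypothetical, and the PyMySQL driver is assumed to be available.

import json
import sys

import pymysql  # assumes the PyMySQL driver is installed

def stations_in_bbox(lat_min, lat_max, lon_min, lon_max):
    """Select stations inside a bounding box; table and column names are hypothetical."""
    conn = pymysql.connect(host="localhost", user="gis", password="...",
                           database="blacksea", cursorclass=pymysql.cursors.DictCursor)
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT id, name, lat, lon FROM stations "
                "WHERE lat BETWEEN %s AND %s AND lon BETWEEN %s AND %s",
                (lat_min, lat_max, lon_min, lon_max))
            return cur.fetchall()
    finally:
        conn.close()

if __name__ == "__main__":
    # e.g. python stations.py 41.0 47.0 27.0 42.0  (rough Black Sea bounding box)
    print(json.dumps(stations_in_bbox(*map(float, sys.argv[1:5]))))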
Environmental Monitoring Using Sensor Networks
NASA Astrophysics Data System (ADS)
Yang, J.; Zhang, C.; Li, X.; Huang, Y.; Fu, S.; Acevedo, M. F.
2008-12-01
Environmental observatories, consisting of a variety of sensor systems, computational resources and informatics, are important for us to observe, model, predict, and ultimately help preserve the health of the natural environment. The commoditization and proliferation of coin-to-palm sized wireless sensors will allow environmental monitoring with unprecedentedly fine spatial and temporal resolution. Once scattered around, these sensors can identify themselves, locate their positions, describe their functions, and self-organize into a network. They communicate through wireless channels with nearby sensors and transmit data through multi-hop protocols to a gateway, which can forward information to a remote data server. In this project, we describe an environmental observatory called the Texas Environmental Observatory (TEO) that incorporates a sensor network system with intertwined wired and wireless sensors. We are enhancing and expanding the existing wired weather stations to include wireless sensor networks (WSNs) and telemetry using solar-powered cellular modems. The new WSNs will monitor soil moisture and support long-term hydrologic modeling. Hydrologic models are helpful in predicting how changes in land cover translate into changes in the stream flow regime. These models require inputs that are difficult to measure over large areas, especially variables related to storm events, such as soil moisture antecedent conditions and rainfall amount and intensity. This will also contribute to improving rainfall estimation from meteorological radar data and to enhancing hydrological forecasts. Sensor data are transmitted from the monitoring site to a Central Data Collection (CDC) Server. We incorporate a GPRS modem for wireless telemetry, a single-board computer (SBC) as the Remote Field Gateway (RFG) Server, and a WSN for distributed soil moisture monitoring. The RFG provides effective control, management, and coordination of two independent sensor systems, i.e., a traditional datalogger-based wired sensor system and the WSN-based wireless sensor system. The RFG also supports remote manipulation of the devices in the field, such as the SBC, datalogger, and WSN. Sensor data collected from the distributed monitoring stations are stored in a database (DB) Server. The CDC Server acts as an intermediate component to hide the heterogeneity of different devices and support the data validation required by the DB Server. Daemon programs running on the CDC Server pre-process the data before they are inserted into the database, and periodically perform synchronization tasks. A SWE-compliant data repository is installed to enable data exchange, accepting data from both the internal DB Server and external sources through OGC web services. The web portal, i.e., TEO Online, serves as a user-friendly interface for data visualization, analysis, synthesis, modeling, and K-12 educational outreach activities. It also provides useful capabilities for system developers and operators to remotely monitor system status and remotely update software and system configuration, which greatly simplifies system debugging and maintenance tasks. We also implement Sensor Observation Services (SOS) at this layer, conforming to the SWE standard to facilitate data exchange. The standard SensorML/O&M data representation makes it easy to integrate our sensor data into existing Geographic Information Systems (GIS) web services and to exchange the data with other organizations.
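To make the pre-processing role of the CDC Server concrete, the following Python sketch shows a daemon loop that polls the field gateway, rejects readings outside plausible physical bounds, and inserts the rest into a database. The table layout, polling interval, and the poll_field_gateway stub are assumptions; SQLite stands in for the actual DB Server.

import sqlite3
import time

VALID_RANGE = (0.0, 0.6)   # plausible bounds for volumetric soil moisture (fraction)

def poll_field_gateway():
    """Stand-in for pulling queued readings from the Remote Field Gateway."""
    return [{"site": "GBC01", "ts": time.strftime("%Y-%m-%dT%H:%M:%S"), "value": 0.21}]

def validate(reading):
    """Reject readings with missing fields or values outside physical bounds."""
    value = reading.get("value")
    return (reading.get("site") is not None and value is not None
            and VALID_RANGE[0] <= value <= VALID_RANGE[1])

def ingest(db, readings):
    """Insert validated readings; the table layout is hypothetical."""
    db.executemany("INSERT INTO soil_moisture (site, ts, value) VALUES (?, ?, ?)",
                   [(r["site"], r["ts"], r["value"]) for r in readings if validate(r)])
    db.commit()

if __name__ == "__main__":
    db = sqlite3.connect("teo_central.db")
    db.execute("CREATE TABLE IF NOT EXISTS soil_moisture (site TEXT, ts TEXT, value REAL)")
    while True:
        ingest(db, poll_field_gateway())
        time.sleep(300)   # synchronize every five minutes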
Dynamic alarm response procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, J.; Gordon, P.; Fitch, K.
2006-07-01
The Dynamic Alarm Response Procedure (DARP) system provides a robust, Web-based alternative to existing hard-copy alarm response procedures. This paperless system improves performance by eliminating the time wasted looking up paper procedures by number, looking up plant process values and equipment and component status at graphical displays or panels, and maintaining the procedures. Because it is a Web-based system, it is platform independent. DARPs can be served from any Web server that supports CGI scripting, such as Apache, IIS, TclHTTPD, and others. DARP pages can be viewed in any Web browser that supports Javascript and Scalable Vector Graphics (SVG), such as Netscape, Microsoft Internet Explorer, Mozilla Firefox, Opera, and others. (authors)
AutoCAD-To-NASTRAN Translator Program
NASA Technical Reports Server (NTRS)
Jones, A.
1989-01-01
Program facilitates creation of finite-element mathematical models from geometric entities. AutoCAD to NASTRAN translator (ACTON) computer program developed to facilitate quick generation of small finite-element mathematical models for use with NASTRAN finite-element modeling program. Reads geometric data of drawing from Data Exchange File (DXF) used in AutoCAD and other PC-based drafting programs. Written in Microsoft Quick-Basic (Version 2.0).
An immersive surgery training system with live streaming capability.
Yang, Yang; Guo, Xinqing; Yu, Zhan; Steiner, Karl V; Barner, Kenneth E; Bauer, Thomas L; Yu, Jingyi
2014-01-01
Providing real-time, interactive immersive surgical training has been a key research area in telemedicine. Earlier approaches have mainly adopted videotaped training that can only show imagery from a fixed viewpoint. Recent advances in commodity 3D imaging have enabled a new paradigm for immersive surgical training by acquiring nearly complete 3D reconstructions of actual surgical procedures. However, unlike 2D videotaping, which can easily stream data in real time, 3D-imaging-based solutions have so far required pre-capturing and processing the data; surgical training using the data has to be conducted offline after the acquisition. In this paper, we present a new real-time immersive 3D surgical training system. Our solution builds upon the recent multi-Kinect surgical training system [1] that can acquire and display high-fidelity 3D surgical procedures using only a small number of Microsoft Kinect sensors. On top of that system we build a client-server model for real-time streaming. On the server front, we efficiently fuse the multiple Kinect streams acquired from different viewpoints, then compress and stream the data to the client. On the client front, we build an interactive space-time navigator to allow remote users (e.g., trainees) to witness the surgical procedure in real time as if they were present in the room.
Image-based electronic patient records for secured collaborative medical applications.
Zhang, Jianguo; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Yao, Yihong; Cai, Weihua; Jin, Jin; Zhang, Guozhen; Sun, Kun
2005-01-01
We developed a Web-based system to interactively display image-based electronic patient records (EPR) for secure intranet and Internet collaborative medical applications. The system consists of four major components: an EPR DICOM gateway (EPR-GW), an image-based EPR repository server (EPR-Server), a Web server, and an EPR DICOM viewer (EPR-Viewer). In the EPR-GW and EPR-Viewer, security modules for digital signature and authentication are integrated to perform security processing on the EPR data with integrity and authenticity. The privacy of EPR data in communication and exchange is provided by SSL/TLS-based secure communication. This work presents a new approach to creating and managing image-based EPRs from actual patient records, and also presents a way to use Web technology and the DICOM standard to build an open architecture for collaborative medical applications.
Establishing a Federal and State Data Exchange Pilot for Public Health Situational Awareness
Passman, Dina B.; Kite-Powell, Aaron; Spector, Dara; Loschen, Wayne; Harp, Barry; Chern, Aaron; Hamilton, Janet; Eggers, Cary; Lombardo, Joseph
2013-01-01
Objective U.S. Department of Health and Human Services (HHS) Office of the Assistant Secretary for Preparedness and Response (ASPR) partnered with the Florida Department of Health (FDOH), Bureau of Epidemiology, to implement a new process for the unidirectional exchange of electronic medical record (EMR) data when ASPR clinical assets are operational in the state following a disaster or other response event. The purpose of the current work was to automate the exchange of data from the ASPR electronic medical record system EMR-S into the FDOH Electronic Surveillance System for the Early Notification of Community-based Epidemics (ESSENCE-FL) system during the 2012 Republican National Convention (RNC). Introduction ASPR deploys clinical assets, including an EMR system, to the ground per state requests during planned and no-notice events. The analysis of patient data collected by deployed federal personnel is an integral part of ASPR and FDOH's surveillance efforts. However, this surveillance can be hampered by the logistical issues of field work in a post-disaster environment, leading to delayed analysis and interpretation of these data to inform decision makers at the federal, state, and local levels. FDOH operates ESSENCE-FL, a multi-tiered, automated, and secure web-based application for analysis and visualization of clinical data. The system is accessible statewide by FDOH staff as well as by hospitals that participate in the system. To improve surveillance, ASPR and FDOH engaged in a pilot project whereby EMR data from ASPR would be sent to FDOH in near real-time during the 2012 hurricane season and the 2012 RNC. This project is in direct support of Healthcare Preparedness Capability 6, Information Sharing, and Public Health Preparedness Capability 13, Public Health Surveillance and Epidemiological Investigation. Methods In 2011, FDOH approached ASPR about securely transmitting raw EMR data that could be ingested by ESSENCE-FL during ASPR deployments in the state. Upon conclusion of an agreement for a data exchange pilot, data elements of interest from the ASPR EMR were identified. Due to the modular design, the ESSENCE-FL Microsoft SQL databases were easily adapted by the Johns Hopkins University Applied Physics Laboratory (JHU/APL) to add a new module to handle receipt of ASPR EMR data, including code to process the files, remove duplicates, and create associations with existing reference information, such as system-defined geographic regions and age groups. Scripts were developed to run on the ASPR server to create and send updated files via secure file transfer protocol (SFTP) every 15 minutes to ESSENCE-FL. Prior ASPR event deployment data were scrubbed and sent to ESSENCE-FL as a test dataset to ensure appropriate receipt and ingestion of the new data source. Results EMR data were transmitted through a central server at ASPR to ESSENCE-FL every 15 minutes during each day of the 2012 RNC (August 26–31). In ESSENCE-FL, configuration allowed the data to be queried, analyzed, and visualized similarly to existing ESSENCE-FL data sources. In all, data from 11 patient encounters were successfully exchanged between the partners. The data were used by ASPR and FDOH to simultaneously monitor onsite medical response activities in near real-time during the convention. Conclusions Timely access to patient data can enhance situational awareness and disease surveillance efforts and provide decision makers with key information in an expedient manner during disaster response and mass gatherings such as the RNC.
However, data are siloed within organizations. The collaboration between FDOH, ASPR and JHU/APL made EMR data sharing and analysis more expeditious and efficient and increased timely access to these data by local, state, and federal epidemiologists. The integration of these data into the ESSENCE-FL system created one location where users could go to access data and create epidemiologic reports for a given region in Florida, including the RNC. To achieve these successes with partners in the future, it will be necessary to develop partnerships well in advance of intended data exchange. Future recommendations include robust pre-event testing of the data exchange process and planning for a greater amount of lead-time between enacting the official agreement and beginning data exchange.
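To illustrate the recurring export-and-push step described in this pilot (updated EMR extracts sent by SFTP every 15 minutes), the following Python sketch serializes de-identified rows to CSV and uploads them with the paramiko library. The host name, credentials, field names, and file naming are hypothetical.

import csv
import io
import time

import paramiko  # assumes the paramiko SSH/SFTP library is installed

def export_updates(records):
    """Serialize updated, de-identified EMR rows to CSV in memory (field names hypothetical)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["visit_id", "onset", "chief_complaint", "disposition"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue().encode()

def push(data, remote_path):
    """Upload one export to the receiving SFTP drop box (host and credentials hypothetical)."""
    transport = paramiko.Transport(("essence-fl.example.org", 22))
    transport.connect(username="aspr_feed", password="...")
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        with sftp.open(remote_path, "wb") as f:
            f.write(data)
    finally:
        sftp.close()
        transport.close()

while True:
    updates = []   # stand-in: query the EMR database for records changed since the last push
    if updates:
        push(export_updates(updates), "/incoming/aspr_%d.csv" % int(time.time()))
    time.sleep(15 * 60)   # the pilot sent updated files every 15 minutes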
Password-Only Authenticated Three-Party Key Exchange with Provable Security in the Standard Model
Nam, Junghyun; Kim, Junghwan; Kang, Hyun-Kyu; Kim, Jinsoo; Paik, Juryon
2014-01-01
Protocols for password-only authenticated key exchange (PAKE) in the three-party setting allow two clients registered with the same authentication server to derive a common secret key from their individual password shared with the server. Existing three-party PAKE protocols were proven secure under the assumption of the existence of random oracles or in a model that does not consider insider attacks. Therefore, these protocols may turn out to be insecure when the random oracle is instantiated with a particular hash function or an insider attack is mounted against the partner client. The contribution of this paper is to present the first three-party PAKE protocol whose security is proven without any idealized assumptions in a model that captures insider attacks. The proof model we use is a variant of the indistinguishability-based model of Bellare, Pointcheval, and Rogaway (2000), which is one of the most widely accepted models for security analysis of password-based key exchange protocols. We demonstrated that our protocol achieves not only the typical indistinguishability-based security of session keys but also the password security against undetectable online dictionary attacks. PMID:24977229
Patient Data Synchronization Process in a Continuity of Care Environment
Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice
2005-01-01
In a distributed patient record environment, we analyze the processes needed to ensure exchange of and access to EHR data. We propose an adapted method and tools for data synchronization. Our study takes into account the issues of managing user rights for data access and of decreasing the amount of data exchanged over the network. We describe an XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, local network clients, workstations running the users' interfaces, and data exchange and synchronization tools. PMID:16779049
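As a toy illustration of the change detection such a synchronization model relies on, the following Python sketch compares a local and a remote set of XML record elements by a last-modified attribute and returns only the records that need to travel over the network. The record schema is hypothetical and far simpler than a real EHR document.

import xml.etree.ElementTree as ET

def index_records(xml_text):
    """Map record id -> (last-modified timestamp, element) for a document of
    <record id="..." modified="ISO-8601"/> elements (hypothetical schema)."""
    root = ET.fromstring(xml_text)
    return {r.get("id"): (r.get("modified"), r) for r in root.iter("record")}

def records_to_send(local_xml, remote_xml):
    """Return the local records that are missing or stale on the remote side,
    so that only the delta travels over the network."""
    local, remote = index_records(local_xml), index_records(remote_xml)
    delta = []
    for rid, (ts, element) in local.items():
        if rid not in remote or remote[rid][0] < ts:   # ISO-8601 strings compare lexically
            delta.append(element)
    return delta

local = '<records><record id="1" modified="2005-03-02T10:00:00"/><record id="2" modified="2005-03-04T09:30:00"/></records>'
remote = '<records><record id="1" modified="2005-03-02T10:00:00"/></records>'
print([r.get("id") for r in records_to_send(local, remote)])   # prints ['2']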
Deneyer, M; Hachimi-Idrissi, S; Michel, L; Nyssen, M; De Moor, G; Vandenplas, Y
2012-01-01
The authors propose the introduction of a pilot project, "paediatric core file exchange in emergencies" (PCF-EXEM), which enables the exchange of medical data between the attending paediatrician (AP), holder of the medical record, and on-duty medical units (i.e. general practitioners, paediatricians, surgeons, emergency physicians, ...). This project is based on two pillars: a protected server (PCF server) containing paediatric core files (PCF) with important clinical data that should be available to the physician in order to quickly get a clear insight into the relevant clinical history of the child, and, secondly, the possibility to provide feedback to the attending physician about the findings recorded during the on-call duty. The permanent availability of health data on the PCF server and the possibility to provide feedback together constitute the PCF-EXEM project. This project meets the demand of care providers to have relevant medical information permanently available in order to guarantee high-quality care in emergency situations. The delicate balance between the right to informational privacy and professional confidentiality on the one hand and the right to quality health care on the other hand has been taken into account. The technical and practical feasibility of this project is described. The objectives and vision of the PCF-EXEM project conform to Belgian legislation concerning the processing of medical data and are in line with the European projects still under consideration that focus on interoperability and the development of common access control to databanks containing health data for care providers. PCF-EXEM could therefore be a model for other EU countries as well.
Advanced Pulse Oximetry System for Remote Monitoring and Management
Pak, Ju Geon; Park, Kee Hyun
2012-01-01
Pulse oximetry data such as saturation of peripheral oxygen (SpO2) and pulse rate are vital signals for early diagnosis of heart disease. Therefore, various pulse oximeters have been developed continuously. However, some of the existing pulse oximeters are not equipped with communication capabilities, and consequently, the continuous monitoring of patient health is restricted. Moreover, even though certain oximeters have been built as network models, they focus on exchanging only pulse oximetry data, and they do not provide sufficient device management functions. In this paper, we propose an advanced pulse oximetry system for remote monitoring and management. The system consists of a networked pulse oximeter and a personal monitoring server. The proposed pulse oximeter measures a patient's pulse oximetry data and transmits the data to the personal monitoring server. The personal monitoring server then analyzes the received data and displays the results to the patient. Furthermore, for device management purposes, operational errors that occur in the pulse oximeter are reported to the personal monitoring server, and the system configurations of the pulse oximeter, such as thresholds and measurement targets, are modified by the server. We verify that the proposed pulse oximetry system operates efficiently and that it is appropriate for monitoring and managing a pulse oximeter in real time. PMID:22933841
Advanced pulse oximetry system for remote monitoring and management.
Pak, Ju Geon; Park, Kee Hyun
2012-01-01
Pulse oximetry data such as saturation of peripheral oxygen (SpO(2)) and pulse rate are vital signals for early diagnosis of heart disease. Therefore, various pulse oximeters have been developed continuously. However, some of the existing pulse oximeters are not equipped with communication capabilities, and consequently, the continuous monitoring of patient health is restricted. Moreover, even though certain oximeters have been built as network models, they focus on exchanging only pulse oximetry data, and they do not provide sufficient device management functions. In this paper, we propose an advanced pulse oximetry system for remote monitoring and management. The system consists of a networked pulse oximeter and a personal monitoring server. The proposed pulse oximeter measures a patient's pulse oximetry data and transmits the data to the personal monitoring server. The personal monitoring server then analyzes the received data and displays the results to the patient. Furthermore, for device management purposes, operational errors that occur in the pulse oximeter are reported to the personal monitoring server, and the system configurations of the pulse oximeter, such as thresholds and measurement targets, are modified by the server. We verify that the proposed pulse oximetry system operates efficiently and that it is appropriate for monitoring and managing a pulse oximeter in real time.
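The abstracts above do not specify the device protocol; as a minimal sketch of the reporting side of such a networked oximeter, the following Python snippet packages SpO2/pulse samples, and operational error events for the device-management path, as newline-delimited JSON sent to the personal monitoring server. The server address and message fields are assumptions.

import json
import socket
import time

SERVER = ("monitoring.local", 5151)   # hypothetical personal monitoring server address

def send_message(payload):
    """Send one newline-delimited JSON message to the monitoring server."""
    with socket.create_connection(SERVER, timeout=5) as sock:
        sock.sendall((json.dumps(payload) + "\n").encode())

def report_sample(spo2, pulse):
    """Measurement path: one SpO2/pulse sample with its timestamp."""
    send_message({"type": "sample", "ts": time.time(), "spo2": spo2, "pulse": pulse})

def report_error(code, detail):
    """Device-management path: operational errors are reported to the server as well."""
    send_message({"type": "error", "ts": time.time(), "code": code, "detail": detail})

if __name__ == "__main__":
    report_sample(spo2=97.5, pulse=72)
    report_error("SENSOR_DETACHED", "finger probe removed")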
A Services-Oriented Architecture for Water Observations Data
NASA Astrophysics Data System (ADS)
Maidment, D. R.; Zaslavsky, I.; Valentine, D.; Tarboton, D. G.; Whitenack, T.; Whiteaker, T.; Hooper, R.; Kirschtel, D.
2009-04-01
Water observations data are time series of measurements made at point locations of water level, flow, and quality, together with corresponding climatic observations at point locations such as gaged precipitation and weather variables. A services-oriented architecture has been built for such information for the United States that has three components: hydrologic information servers, hydrologic information clients, and a centralized metadata cataloging system. These are connected using web services for observations data and metadata defined by an XML-based language called WaterML. A Hydrologic Information Server can be built by storing observations data in a relational database schema in the CUAHSI Observations Data Model, in which case web services access to the data and metadata is automatically provided by query functions for WaterML that are wrapped around the relational database within a web server. A Hydrologic Information Server can also be constructed by custom-programming an interface to an existing water agency web site so that it responds to the same queries by producing data in WaterML, as the CUAHSI Observations Data Model based servers do. A Hydrologic Information Client is one that can interpret and ingest WaterML metadata and data. We have two client applications, for Excel and ArcGIS, and have shown how WaterML web services can be ingested into programming environments such as Matlab and Visual Basic. HIS Central, maintained at the San Diego Supercomputer Center, is a repository of observational metadata for WaterML web services which presently indexes 342 million data values measured at 1.75 million locations. This is the largest catalog of water observational data for the United States presently in existence. As more observation networks join what we term the "CUAHSI Water Data Federation", and the system accommodates a growing number of sites, measured parameters, applications, and users, rapid and reliable access to large heterogeneous hydrologic data repositories becomes critical. The CUAHSI HIS solution to the scalability and heterogeneity challenges has several components. Structural differences across the data repositories are addressed by building a standard services foundation for the exchange of hydrologic data, derived from a common information model for observational data measured at stationary points and its implementation as a relational schema (ODM) and an XML schema (WaterML). Semantic heterogeneity is managed by mapping water quantity, water quality, and other parameters collected by government agencies and academic projects to a common ontology. The WaterML-compliant web services are indexed in a community services registry called HIS Central (hiscentral.cuahsi.org). Once a web service is registered in HIS Central, its metadata (site and variable characteristics, period of record for each variable at each site, etc.) is harvested and appended to the central catalog. The catalog is further updated as the service publisher associates the variables in the published service with ontology concepts. After this, the newly published service becomes available for spatial and semantics-based queries from online and desktop client applications developed by the project. Hydrologic information server software is now deployed at more than a dozen locations in the United States and Australia.
To provide rapid access to data summaries, in particular for several nation-wide data repositories including EPA STORET, USGS NWIS, and USDA SNOTEL, we convert the observation data catalogs and databases with harvested data values into special representations that support high-performance analysis and visualization. The construction of OLAP (Online Analytical Processing) cubes, often called data cubes, is an approach to organizing and querying large multi-dimensional data collections. We have applied the OLAP techniques, as implemented in Microsoft SQL Server 2005/2008, to the analysis of the catalogs from several agencies. OLAP analysis results reflect geography and history of observation data availability from USGS NWIS, EPA STORET, and USDA SNOTEL repositories, and spatial and temporal dynamics of the available measurements for several key nutrient-related parameters. Our experience developing the CUAHSI HIS cyberinfrastructure demonstrated that efficient integration of hydrologic observations from multiple government and academic sources requires a range of technical approaches focused on managing different components of data heterogeneity and system scalability. While this submission addresses technical aspects of developing a national-scale information system for hydrologic observations, the challenges of explicating shared semantics of hydrologic observations and building a community of HIS users and developers remain critical in constructing a nation-wide federation of water data services.
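The catalogs described above are cubed in Microsoft SQL Server Analysis Services; as a small, library-agnostic illustration of the same aggregation idea, the following Python/pandas sketch pivots a toy harvested catalog along agency, parameter, and year dimensions. The rows and numbers are invented for illustration.

import pandas as pd

# A tiny stand-in for the harvested metadata catalog (rows and numbers are invented).
catalog = pd.DataFrame([
    {"agency": "USGS NWIS",   "parameter": "nitrate",   "year": 2007, "value_count": 125000},
    {"agency": "USGS NWIS",   "parameter": "discharge", "year": 2007, "value_count": 4800000},
    {"agency": "EPA STORET",  "parameter": "nitrate",   "year": 2007, "value_count": 310000},
    {"agency": "USDA SNOTEL", "parameter": "swe",       "year": 2007, "value_count": 950000},
])

# The "cube": counts of available values along agency x parameter x year dimensions,
# analogous to the OLAP aggregations built over the observation catalogs.
cube = catalog.pivot_table(index=["agency", "parameter"], columns="year",
                           values="value_count", aggfunc="sum", fill_value=0)
print(cube)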
Controlling EPICS from a web browser.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, K., Jr.
1999-04-13
An alternative to using a large graphical display manager like MEDM [1,2] to interface to a control system is to use individual control objects, such as text boxes, meters, etc., running in a browser. This paper presents three implementations of this concept, one using ActiveX controls, one with Java applets, and another with Microsoft Agent. The ActiveX controls have performance nearing that of MEDM, but they only work on Windows platforms. The Java applets require a server to get around Web security restrictions and are not as fast, but they have the advantage of working on most platforms and with both of the leading Web browsers. The agent works on Windows platforms with and without a browser and allows voice recognition and speech synthesis, making it somewhat more innovative than MEDM.
Distributed On-line Monitoring System Based on Modem and Public Phone Net
NASA Astrophysics Data System (ADS)
Chen, Dandan; Zhang, Qiushi; Li, Guiru
In order to solve the monitoring problem of urban sewage disposal, a distributed on-line monitoring system is proposed. By introducing dial-up communication technology based on modems, the serial communication program rationally solves the information transmission problem between the master station and the slave stations. The serial communication program is realized with the MSComm control of C++ Builder 6.0. The software includes a real-time data operation part and a history data handling part, using Microsoft SQL Server 2000 for the database and C++ Builder 6.0 for the user interface. The monitoring center displays a user interface with alarm information for out-of-limit data and real-time curves. Practical application shows that the system successfully accomplishes real-time data acquisition from the data-gathering stations and stores the data in the terminal database.
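The acquire-and-store loop described above is implemented by the authors with C++ Builder's MSComm control and SQL Server 2000; the following Python sketch shows the same pattern with pyserial and SQLite purely for illustration, with the port name, baud rate, frame format, and table layout all assumed.

```python
# Illustrative acquire-and-store loop: read framed readings from a serial link
# and insert them into a local database. Port, baud rate, and the "station,value"
# frame are assumptions for the sketch.
import sqlite3
import serial  # pyserial

def run(port="COM1", baud=9600, db_path="monitoring.db"):
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS readings "
                 "(ts TEXT DEFAULT CURRENT_TIMESTAMP, station TEXT, value REAL)")
    with serial.Serial(port, baud, timeout=5) as link:
        while True:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue  # timeout with no data from the slave station
            station, value = line.split(",")  # assumed frame format
            conn.execute("INSERT INTO readings (station, value) VALUES (?, ?)",
                         (station, float(value)))
            conn.commit()

if __name__ == "__main__":
    run()
```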
NASA Technical Reports Server (NTRS)
Vrnak, Daniel R.; Stueber, Thomas J.; Le, Dzu K.
2012-01-01
This report presents a method for running a dynamic legacy inlet simulation in concert with another dynamic simulation that uses a graphical interface. The legacy code, NASA's LArge Perturbation INlet (LAPIN) model, was coded in the FORTRAN 77 (The Portland Group, Lake Oswego, OR) programming language to run in a command shell, similar to other applications that used the Microsoft Disk Operating System (MS-DOS) (Microsoft Corporation, Redmond, WA). Simulink (MathWorks, Natick, MA) is a dynamic simulation environment that runs on a modern graphical operating system. The product of this work has both simulations, LAPIN and Simulink, running synchronously on the same computer with periodic data exchanges. Implementing the method described in this paper avoided extensive changes to the legacy code and preserved its basic operating procedure. This paper presents a novel method that promotes inter-task data communication between the synchronously running processes.
Physical explosion analysis in heat exchanger network design
NASA Astrophysics Data System (ADS)
Pasha, M.; Zaini, D.; Shariff, A. M.
2016-06-01
The failure of shell-and-tube heat exchangers is extensively experienced by the chemical process industries. Such a failure can cause a loss of production for a long duration. Moreover, loss of containment from a heat exchanger could potentially lead to a credible event such as a fire, explosion, or toxic release. There is a need to analyse the possible worst-case effects originating from loss of containment of a heat exchanger at the early design stage. Physical explosion analysis during heat exchanger network design is presented in this work. The Baker and Prugh explosion models are deployed for assessing the explosion effect. Microsoft Excel was integrated with a process design simulator through Object Linking and Embedding (OLE) automation for this analysis, and Aspen HYSYS V8.0 was used as the simulation platform. A typical heat exchanger network of a steam reforming and shift conversion process is presented as a case study. The analysis shows that the overpressure generated from the physical explosion of each heat exchanger can be estimated in a more precise manner by using the Prugh model. The present work could potentially assist the design engineer in identifying the critical heat exchangers in the network at the preliminary design stage.
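A minimal sketch of the Excel OLE automation step mentioned above is given below, using the pywin32 COM bindings; the workbook name, worksheet, and cell addresses are assumptions, and the HYSYS side of the link and the Baker/Prugh formulas themselves are not shown.

```python
# Driving Excel through OLE automation: write simulator-derived conditions into
# a workbook and read back a computed overpressure. File and cell references are
# hypothetical.
import win32com.client  # pywin32

excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = False
wb = excel.Workbooks.Open(r"C:\hen\explosion_analysis.xlsx")  # hypothetical file
sheet = wb.Worksheets("Exchangers")

sheet.Range("B2").Value = 2500.0   # operating pressure, kPa (assumed cell)
sheet.Range("B3").Value = 0.85     # vessel volume, m^3 (assumed cell)
excel.Calculate()                  # let the spreadsheet's formulas update
print("Estimated overpressure:", sheet.Range("B10").Value)

wb.Close(SaveChanges=False)
excel.Quit()
```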
NASA Technical Reports Server (NTRS)
Ullman, Richard; Bane, Bob; Yang, Jingli
2008-01-01
A shell script has been written as a means of automatically making HDF-EOS-formatted data sets available via the World Wide Web. ("HDF-EOS" and variants thereof are defined in the first of the two immediately preceding articles.) The shell script chains together software tools developed by the Data Usability Group at Goddard Space Flight Center to perform the following actions: extract metadata in Object Definition Language (ODL) from an HDF-EOS file, convert the metadata from ODL to Extensible Markup Language (XML), reformat the XML metadata into human-readable Hypertext Markup Language (HTML), publish the HTML metadata and the original HDF-EOS file to a Web server and an Open-source Project for a Network Data Access Protocol (OPeNDAP) server computer, and reformat the XML metadata and submit the resulting file to the EOS Clearinghouse, which is a Web-based metadata clearinghouse that facilitates searching for, and exchange of, Earth-science data.
Provably Secure Password-based Authentication in TLS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdalla, Michel; Emmanuel, Bresson; Chevassut, Olivier
2005-12-20
In this paper, we show how to design an efficient, provably secure password-based authenticated key exchange mechanism specifically for the TLS (Transport Layer Security) protocol. The goal is to provide a technique that allows users to employ (short) passwords to securely identify themselves to servers. As our main contribution, we describe a new password-based technique for user authentication in TLS, called Simple Open Key Exchange (SOKE). Loosely speaking, the SOKE ciphersuites are unauthenticated Diffie-Hellman ciphersuites in which the client's Diffie-Hellman ephemeral public value is encrypted using a simple mask generation function. The mask is simply a constant value raised to the power of (a hash of) the password. The SOKE ciphersuites, in advantage over previous password-based authentication ciphersuites for TLS, combine the following features. First, SOKE has formal security arguments; the proof of security based on the computational Diffie-Hellman assumption is in the random oracle model, and holds for concurrent executions and for arbitrarily large password dictionaries. Second, SOKE is computationally efficient; in particular, it only needs operations in a sufficiently large prime-order subgroup for its Diffie-Hellman computations (no safe primes). Third, SOKE provides good protocol flexibility because the user identity and password are only required once a SOKE ciphersuite has actually been negotiated, and after the server has sent a server identity.
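The masking idea can be sketched in a few lines: the client's ephemeral public value g^x is multiplied by a public constant raised to a hash of the password. The toy Python below uses a small 32-bit prime and an arbitrary constant purely to show the arithmetic; a real SOKE ciphersuite would operate in the large prime-order subgroup negotiated in the handshake.

```python
# Toy illustration of SOKE-style masking of a Diffie-Hellman ephemeral value.
# Group parameters are deliberately tiny and NOT cryptographically meaningful.
import hashlib
import secrets

p = 4294967291  # toy 32-bit prime, for illustration only
g = 2           # working generator (illustrative)
U = 5           # public constant used for masking (assumed value)

def mask_ephemeral(password: str):
    x = secrets.randbelow(p - 2) + 1           # client's ephemeral secret
    X = pow(g, x, p)                           # g^x
    h = int.from_bytes(hashlib.sha256(password.encode()).digest(), "big")
    X_star = (X * pow(U, h, p)) % p            # masked value sent to the server
    return x, X_star

if __name__ == "__main__":
    print(mask_ephemeral("correct horse")[1])
```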
A Secure Authenticated Key Exchange Protocol for Credential Services
NASA Astrophysics Data System (ADS)
Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki
In this paper, we propose a leakage-resilient and proactive authenticated key exchange protocol (called LRP-AKE) for credential services which provides not only a higher level of security against leakage of stored secrets but also secrecy of the private key with respect to the involved server. We show that the LRP-AKE protocol is provably secure in the random oracle model with a reduction to the computational Diffie-Hellman problem. In addition, we discuss some possible applications of the LRP-AKE protocol.
NASA Technical Reports Server (NTRS)
Khan, Ahmed
2010-01-01
The International Space Station (ISS) Operations Planning Team, Mission Control Centre and Mission Automation Support Network (MAS) have all evolved over the years to use commercial web-based technologies to create a configurable electronic infrastructure to manage the complex network of real-time planning, crew scheduling, resource and activity management, as well as onboard document and procedure management, required to co-ordinate ISS assembly, daily operations and mission support. While these Web technologies are classified as non-critical in nature, their use is part of an essential backbone of daily operations on the ISS and allows the crew to operate the ISS as a functioning science laboratory. The rapid evolution of the internet from 1998 (when ISS assembly began) to today, along with the nature of continuous manned operations in space, has presented a unique challenge in terms of software engineering and system development. In addition, the use of a wide array of competing internet technologies (including commercial technologies such as .NET and Java), and the special requirements of having to support this network both nationally among various control centres for International Partners (IPs) and onboard the station itself, have created special challenges for the MCC Web Tools Development Team, software engineers and flight controllers who implement and maintain this system. This paper presents an overview of some of these operational challenges, the evolving nature of the solutions, and the future use of COTS-based rich internet technologies in manned space flight operations. In particular, this paper focuses on the use of Microsoft's .NET API to develop Web-based operational tools, the use of XML-based service-oriented architectures (SOA) that needed to be customized to support mission operations, the maintenance of a Microsoft IIS web server onboard the ISS, the OpsLAN, and functional-oriented Web design with AJAX
Metnitz, P G; Laback, P; Popow, C; Laback, O; Lenz, K; Hiesmayr, M
1995-01-01
Patient Data Management Systems (PDMS) for ICUs collect, present and store clinical data. Various intentions, such as quality control or scientific purposes, make analysis of these digitally stored data desirable. The aim of the Intensive Care Data Evaluation project (ICDEV) was to provide a database tool for the analysis of data recorded at various ICUs of the University Clinics, General Hospital of Vienna, where two different PDMSs are used: CareVue 9000 (Hewlett Packard, Andover, USA) at two ICUs (one medical ICU and one neonatal ICU) and PICIS Chart+ (PICIS, Paris, France) at one cardiothoracic ICU. CONCEPT AND METHODS: Clinically oriented analysis of the data collected in a PDMS at an ICU was the starting point of the development. After defining the database structure, we established a client-server based database system under Microsoft Windows NT and developed a user-friendly data querying application using Microsoft Visual C++ and Visual Basic. ICDEV was successfully installed at three different ICUs; adjustments to the different PDMS configurations were done within a few days. The database structure developed by us enables a powerful query concept representing an 'EXPERT QUESTION COMPILER' which may help to answer almost any clinical question. Several program modules facilitate queries at the patient, group and unit level. Results from ICDEV queries are automatically transferred to Microsoft Excel for display (in the form of configurable tables and graphs) and further processing. The ICDEV concept is configurable for adjustment to different intensive care information systems and can be used to support computerized quality control. However, as long as no sufficient artifact recognition or data validation software exists for automatically recorded patient data, the reliability of these data and their usage for computer-assisted quality control remain unclear and should be further studied.
Modules based on the geochemical model PHREEQC for use in scripting and programming languages
Charlton, S.R.; Parkhurst, D.L.
2011-01-01
The geochemical model PHREEQC is capable of simulating a wide range of equilibrium reactions between water and minerals, ion exchangers, surface complexes, solid solutions, and gases. It also has a general kinetic formulation that allows modeling of nonequilibrium mineral dissolution and precipitation, microbial reactions, decomposition of organic compounds, and other kinetic reactions. To facilitate use of these reaction capabilities in scripting languages and other models, PHREEQC has been implemented in modules that easily interface with other software. A Microsoft COM (component object model) has been implemented, which allows PHREEQC to be used by any software that can interface with a COM server, for example, Excel®, Visual Basic®, Python, or MATLAB®. PHREEQC has been converted to a C++ class, which can be included in programs written in C++. The class also has been compiled in libraries for Linux and Windows that allow PHREEQC to be called from C++, C, and Fortran. A limited set of methods implements the full reaction capabilities of PHREEQC for each module. Input methods use strings or files to define reaction calculations in exactly the same formats used by PHREEQC. Output methods provide a table of user-selected model results, such as concentrations, activities, saturation indices, and densities. The PHREEQC module can add geochemical reaction capabilities to surface-water, groundwater, and watershed transport models. It is possible to store and manipulate solution compositions and reaction information for many cells within the module. In addition, the object-oriented nature of the PHREEQC modules simplifies implementation of parallel processing for reactive-transport models. The PHREEQC COM module may be used in scripting languages to fit parameters; to plot PHREEQC results for field, laboratory, or theoretical investigations; or to develop new models that include simple or complex geochemical calculations. © 2011.
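A minimal sketch of driving the COM module from Python is shown below; the ProgID, database path, and input block are assumptions about a typical IPhreeqc installation rather than code from the paper.

```python
# Sketch of calling the PHREEQC COM module from Python via pywin32; ProgID and
# paths are assumed, and the input block is a simple illustrative simulation.
import win32com.client  # pywin32

phreeqc = win32com.client.Dispatch("IPhreeqcCOM.Object")  # assumed ProgID
phreeqc.LoadDatabase(r"C:\phreeqc\database\phreeqc.dat")  # assumed path

input_text = """
SOLUTION 1
    temp  25
    pH    7.0
    Ca    1.0
    C(4)  2.0 as HCO3
SELECTED_OUTPUT
    -pH true
    -saturation_indices Calcite
END
"""
phreeqc.RunString(input_text)
# Selected-output rows come back as a table of user-chosen results.
for row in phreeqc.GetSelectedOutputArray():
    print(row)
```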
NASA Astrophysics Data System (ADS)
Xie, Qi; Hu, Bin; Chen, Ke-Fei; Liu, Wen-Hao; Tan, Xiao
2015-11-01
In a three-party password authenticated key exchange (AKE) protocol, two users use their passwords to establish a secure session key over an insecure communication channel with the help of a trusted server; such a protocol may therefore suffer from password guessing attacks, and the server has to maintain a password table. To eliminate these shortcomings of password-based AKE protocols, very recently, based on chaotic maps, Lee et al. [2015 Nonlinear Dyn. 79 2485] proposed the first three-party authenticated key exchange scheme without using passwords, and claimed its security by providing a well-organized BAN logic test. Unfortunately, their protocol cannot resist an impersonation attack, which is demonstrated in the present paper. To overcome this security weakness, using chaotic maps, we propose a biometrics-based anonymous three-party AKE protocol with the same advantages. Further, we use the pi-calculus-based formal verification tool ProVerif to show that our AKE protocol achieves authentication, security and anonymity, with acceptable efficiency. Project supported by the Natural Science Foundation of Zhejiang Province, China (Grant No. LZ12F02005), the Major State Basic Research Development Program of China (Grant No. 2013CB834205), and the National Natural Science Foundation of China (Grant No. 61070153).
Parallelization of a Monte Carlo particle transport simulation code
NASA Astrophysics Data System (ADS)
Hadjidoukas, P.; Bousis, C.; Emfietzoglou, D.
2010-05-01
We have developed a high-performance version of the Monte Carlo particle transport simulation code MC4. The original application code, developed in Visual Basic for Applications (VBA) for Microsoft Excel, was first rewritten in the C programming language to improve code portability. Several pseudo-random number generators have also been integrated and studied. The new MC4 version was then parallelized for shared- and distributed-memory multiprocessor systems using the Message Passing Interface. Two parallel pseudo-random number generator libraries (SPRNG and DCMT) have been seamlessly integrated. The performance speedup of parallel MC4 has been studied on a variety of parallel computing architectures, including an Intel Xeon server with 4 dual-core processors, a Sun cluster consisting of 16 nodes of 2 dual-core AMD Opteron processors, and a 200 dual-processor HP cluster. For large problem sizes, which are limited only by the physical memory of the multiprocessor server, the speedup results are almost linear on all systems. We have validated the parallel implementation against the serial VBA and C implementations using the same random number generator. Our experimental results on the transport and energy loss of electrons in a water medium show that the serial and parallel codes are equivalent in accuracy. The present improvements allow for the study of higher particle energies with the use of more accurate physical models, and improve statistics, as more particle tracks can be simulated in a short response time.
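The parallelization pattern (independent particle histories split across MPI ranks, each with its own random stream, results reduced to rank 0) can be sketched with mpi4py as below; this is a generic illustration, not the MC4 physics or its SPRNG/DCMT generators.

```python
# Generic Monte Carlo parallelization sketch: distribute histories over MPI
# ranks, give each rank an independent seed, and reduce the tallies to rank 0.
from mpi4py import MPI
import random

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_TOTAL = 1_000_000                  # particle histories overall
n_local = N_TOTAL // size
rng = random.Random(12345 + rank)    # per-rank seed (stand-in for SPRNG/DCMT)

# Toy "transport": count sampled path lengths exceeding a slab thickness.
local_transmitted = sum(1 for _ in range(n_local)
                        if rng.expovariate(1.0) > 2.0)

total = comm.reduce(local_transmitted, op=MPI.SUM, root=0)
if rank == 0:
    print("transmission fraction:", total / (n_local * size))
```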
Database Reports Over the Internet
NASA Technical Reports Server (NTRS)
Smith, Dean Lance
2002-01-01
Most of the summer was spent developing software that would permit existing test report forms to be printed over the web on a printer that is supported by Adobe Acrobat Reader. The data are stored in a DBMS (Data Base Management System). The client asks for the information from the database using an HTML (Hyper Text Markup Language) form in a web browser. JavaScript is used with the forms to assist the user and verify the integrity of the entered data. Queries to a database are made in SQL (Structured Query Language), a widely supported standard for making queries to databases. Java servlets, programs written in the Java programming language running under the control of network server software, interrogate the database and complete a PDF form template kept in a file. The completed report is sent to the browser requesting the report. Some errors are sent to the browser in an HTML web page; others are reported to the server. Access to the databases was restricted since the data are being transported to new DBMS software that will run on new hardware. However, the SQL queries were made to Microsoft Access, a DBMS that is available on most PCs (Personal Computers). Access does support the SQL commands that were used, and a database was created with Access that contained typical data for the report forms. Some of the problems and features are discussed below.
BioWord: A sequence manipulation suite for Microsoft Word
2012-01-01
Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326
BioWord: a sequence manipulation suite for Microsoft Word.
Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan
2012-06-07
The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.
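For illustration, two of the everyday manipulations a suite like BioWord bundles (reverse complement and a majority-rule consensus) are shown below in plain Python; BioWord itself implements such routines in VBA inside Word.

```python
# Two common sequence manipulations, written in Python purely as illustration.
COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

def reverse_complement(seq: str) -> str:
    """Complement each base, then reverse the sequence."""
    return seq.translate(COMPLEMENT)[::-1]

def consensus(aligned_seqs):
    """Majority-rule consensus of equal-length aligned sequences."""
    cols = zip(*aligned_seqs)
    return "".join(max(set(col), key=col.count) for col in cols)

if __name__ == "__main__":
    print(reverse_complement("ATGCGT"))          # ACGCAT
    print(consensus(["ATGC", "ATGA", "ATGC"]))   # ATGC
```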
SurfaceSlide: a multitouch digital pathology platform.
Wang, Yinhai; Williamson, Kate E; Kelly, Paul J; James, Jacqueline A; Hamilton, Peter W
2012-01-01
Digital pathology provides a digital environment for the management and interpretation of pathological images and associated data. It is becoming increasingly popular to use modern computer-based tools and applications in pathological education, tissue-based research and clinical diagnosis. Uptake of this new technology is stymied by its single-user orientation and its prerequisite and cumbersome combination of mouse and keyboard for navigation and annotation. In this study we developed SurfaceSlide, a dedicated viewing platform which enables the navigation and annotation of gigapixel digitised pathological images using fingertip touch. SurfaceSlide was developed using the Microsoft Surface, a 30-inch multitouch tabletop computing platform. SurfaceSlide users can perform direct panning and zooming operations on digitised slide images. These images are downloaded onto the Microsoft Surface platform from a remote server on demand. Users can also draw annotations and key in text using an on-screen virtual keyboard. We also developed a smart caching protocol which caches the surrounding regions of a field of view at multiple resolutions, thus providing a smooth and vivid user experience and reducing the delay for image downloading from the internet. We compared the usability of SurfaceSlide against Aperio ImageScope and the PathXL online viewer. SurfaceSlide is intuitive, fast and easy to use. SurfaceSlide represents the most direct, effective and intimate human-digital slide interaction experience. It is expected that SurfaceSlide will significantly enhance digital pathology tools and applications in education and clinical practice.
SurfaceSlide: A Multitouch Digital Pathology Platform
Wang, Yinhai; Williamson, Kate E.; Kelly, Paul J.; James, Jacqueline A.; Hamilton, Peter W.
2012-01-01
Background Digital pathology provides a digital environment for the management and interpretation of pathological images and associated data. It is becoming increasingly popular to use modern computer-based tools and applications in pathological education, tissue-based research and clinical diagnosis. Uptake of this new technology is stymied by its single-user orientation and its prerequisite and cumbersome combination of mouse and keyboard for navigation and annotation. Methodology In this study we developed SurfaceSlide, a dedicated viewing platform which enables the navigation and annotation of gigapixel digitised pathological images using fingertip touch. SurfaceSlide was developed using the Microsoft Surface, a 30-inch multitouch tabletop computing platform. SurfaceSlide users can perform direct panning and zooming operations on digitised slide images. These images are downloaded onto the Microsoft Surface platform from a remote server on demand. Users can also draw annotations and key in text using an on-screen virtual keyboard. We also developed a smart caching protocol which caches the surrounding regions of a field of view at multiple resolutions, thus providing a smooth and vivid user experience and reducing the delay for image downloading from the internet. We compared the usability of SurfaceSlide against Aperio ImageScope and the PathXL online viewer. Conclusion SurfaceSlide is intuitive, fast and easy to use. SurfaceSlide represents the most direct, effective and intimate human–digital slide interaction experience. It is expected that SurfaceSlide will significantly enhance digital pathology tools and applications in education and clinical practice. PMID:22292040
A Comparative Analysis of Extract, Transformation and Loading (ETL) Process
NASA Astrophysics Data System (ADS)
Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.
2018-02-01
Data and information are currently growing rapidly in both volume and variety of media. This growth eventually produces very large collections of data, better known as Big Data. Business Intelligence (BI) utilizes these large volumes of data and information for analysis so that important information can be obtained, and this information can be used to support decision-making processes. In practice, a process integrating existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting which application is most effective and efficient in terms of time, cost and effort can be a challenge. Therefore, the objective of the study was to provide a comparative analysis of the ETL process using Microsoft SQL Server Integration Services (SSIS) and Pentaho Data Integration (PDI).
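A bare-bones illustration of the three ETL stages being compared is sketched below in Python; the input file, column names, and target table are assumptions, and neither SSIS nor PDI is involved.

```python
# Minimal extract-transform-load pipeline: read a CSV, clean the rows, and load
# them into a local SQLite "warehouse". File and column names are placeholders.
import csv
import sqlite3

def extract(path):
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    for row in rows:
        yield {"customer": row["customer"].strip().title(),
               "amount": round(float(row["amount"]), 2)}

def load(rows, db="warehouse.db"):
    conn = sqlite3.connect(db)
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:customer, :amount)", list(rows))
    conn.commit()

if __name__ == "__main__":
    load(transform(extract("sales.csv")))
```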
Attigala, Lakshmi; De Silva, Nuwan I; Clark, Lynn G
2016-04-01
Programs that are user-friendly and freely available for developing Web-based interactive keys are scarce and most of the well-structured applications are relatively expensive. WEBiKEY was developed to enable researchers to easily develop their own Web-based interactive keys with fewer resources. A Web-based multiaccess identification tool (WEBiKEY) was developed that uses freely available Microsoft ASP.NET technologies and an SQL Server database for Windows-based hosting environments. WEBiKEY was tested for its usability with a sample data set, the temperate woody bamboo genus Kuruna (Poaceae). WEBiKEY is freely available to the public and can be used to develop Web-based interactive keys for any group of species. The interactive key we developed for Kuruna using WEBiKEY enables users to visually inspect characteristics of Kuruna and identify an unknown specimen as one of seven possible species in the genus.
Moles: Tool-Assisted Environment Isolation with Closures
NASA Astrophysics Data System (ADS)
de Halleux, Jonathan; Tillmann, Nikolai
Isolating test cases from environment dependencies is often desirable, as it increases test reliability and reduces test execution time. However, code that calls non-virtual methods or consumes sealed classes is often impossible to test in isolation. Moles is a new lightweight framework which addresses this problem. For any .NET method, Moles allows test-code to provide alternative implementations, given as .NET delegates, for which C# provides very concise syntax while capturing local variables in a closure object. Using code instrumentation, the Moles framework will redirect calls to provided delegates instead of the original methods. The Moles framework is designed to work together with the dynamic symbolic execution tool Pex to enable automated test generation. In a case study, testing code programmed against the Microsoft SharePoint Foundation API, we achieved full code coverage while running tests in isolation without an actual SharePoint server. The Moles framework integrates with .NET and Visual Studio.
Alview: Portable Software for Viewing Sequence Reads in BAM Formatted Files.
Finney, Richard P; Chen, Qing-Rong; Nguyen, Cu V; Hsu, Chih Hao; Yan, Chunhua; Hu, Ying; Abawi, Massih; Bian, Xiaopeng; Meerzaman, Daoud M
2015-01-01
The name Alview is a contraction of the term Alignment Viewer. Alview is a software tool, compiled to the native architecture, for visualizing the alignment of sequencing data. Inputs are files of short-read sequences aligned to a reference genome in the SAM/BAM format and files containing reference genome data. Outputs are visualizations of these aligned short reads. Alview is written in portable C with optional graphical user interface (GUI) code written in C, C++, and Objective-C. The application can run in three different ways: as a web server, as a command line tool, or as a native GUI program. Alview is compatible with Microsoft Windows, Linux, and Apple OS X. It is available as a web demo at https://cgwb.nci.nih.gov/cgi-bin/alview. The source code and Windows/Mac/Linux executables are available via https://github.com/NCIP/alview.
PsychVACS: a system for asynchronous telepsychiatry.
Odor, Alberto; Yellowlees, Peter; Hilty, Donald; Parish, Michelle Burke; Nafiz, Najia; Iosif, Ana-Maria
2011-05-01
To describe the technical development of an asynchronous telepsychiatry application, the Psychiatric Video Archiving and Communication System (PsychVACS). A client-server application was developed in Visual Basic.NET with a Microsoft SQL database as the backend. It includes the capability of storing video-recorded psychiatric interviews and manages the workflow of the system with automated messaging. PsychVACS has been used to conduct the first-ever series of asynchronous telepsychiatry consultations worldwide. A review of the software application and the process as part of this project has led to a number of improvements that are being implemented in the next version, which is being written in Java. This is the first description of the use of video-recorded data in an asynchronous telemedicine application. Primary care providers and consulting psychiatrists have found it easy to work with and a valuable resource to increase the availability of psychiatric consultation in remote rural locations.
Narrowing the scope of failure prediction using targeted fault load injection
NASA Astrophysics Data System (ADS)
Jordan, Paul L.; Peterson, Gilbert L.; Lin, Alan C.; Mendenhall, Michael J.; Sellers, Andrew J.
2018-05-01
As society becomes more dependent upon computer systems to perform increasingly critical tasks, ensuring that those systems do not fail becomes increasingly important. Many organizations depend heavily on desktop computers for day-to-day operations. Unfortunately, the software that runs on these computers is written by humans and, as such, is still subject to human error and consequent failure. A natural solution is to use statistical machine learning to predict failure. However, since failure is still a relatively rare event, obtaining labelled training data to train these models is not a trivial task. This work presents new simulated fault-inducing loads that extend the focus of traditional fault injection techniques to predict failure in the Microsoft enterprise authentication service and Apache web server. These new fault loads were successful in creating failure conditions that were identifiable using statistical learning methods, with fewer irrelevant faults being created.
Development of a forestry government agency enterprise GIS system: a disconnected editing approach
NASA Astrophysics Data System (ADS)
Zhu, Jin; Barber, Brad L.
2008-10-01
The Texas Forest Service (TFS) has developed a geographic information system (GIS) for use by agency personnel in central Texas for managing oak wilt suppression and other landowner assistance programs. This enterprise GIS system was designed to support multiple concurrent users accessing shared information resources. The disconnected editing approach was adopted in this system to avoid the overhead of maintaining an active connection between TFS central Texas field offices and headquarters, since most field offices operate with commercially provided Internet service. The GIS system entails maintaining a personal geodatabase on each local field office computer. Spatial data from the field is periodically uploaded into a central master geodatabase stored in Microsoft SQL Server at the TFS headquarters in College Station through the ESRI Spatial Database Engine (SDE). This GIS allows users to work off-line when editing data and requires connecting to the central geodatabase only when needed.
The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath
Ellisman, M.; Hutton, T.; Kirkland, A.; Lin, A.; Lin, C.; Molina, T.; Peltier, S.; Singh, R.; Tang, K.; Trefethen, A.E.; Wallom, D.C.H.; Xiong, X.
2009-01-01
The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients. PMID:19487201
The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath.
Ellisman, M; Hutton, T; Kirkland, A; Lin, A; Lin, C; Molina, T; Peltier, S; Singh, R; Tang, K; Trefethen, A E; Wallom, D C H; Xiong, X
2009-07-13
The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chainer, Timothy J.; Parida, Pritish R.
Systems and methods for cooling include one or more computing structure, an inter-structure liquid cooling system that includes valves configured to selectively provide liquid coolant to the one or more computing structures; a heat rejection system that includes one or more heat rejection units configured to cool liquid coolant; and one or more liquid-to-liquid heat exchangers that include valves configured to selectively transfer heat from liquid coolant in the inter-structure liquid cooling system to liquid coolant in the heat rejection system. Each computing structure further includes one or more liquid-cooled servers; and an intra-structure liquid cooling system that has valves configured to selectively provide liquid coolant to the one or more liquid-cooled servers.
Provisioning cooling elements for chillerless data centers
Chainer, Timothy J.; Parida, Pritish R.
2016-12-13
Systems and methods for cooling include one or more computing structure, an inter-structure liquid cooling system that includes valves configured to selectively provide liquid coolant to the one or more computing structures; a heat rejection system that includes one or more heat rejection units configured to cool liquid coolant; and one or more liquid-to-liquid heat exchangers that include valves configured to selectively transfer heat from liquid coolant in the inter-structure liquid cooling system to liquid coolant in the heat rejection system. Each computing structure further includes one or more liquid-cooled servers; and an intra-structure liquid cooling system that has valves configured to selectively provide liquid coolant to the one or more liquid-cooled servers.
SSL - THE SIMPLE SOCKETS LIBRARY
NASA Technical Reports Server (NTRS)
Campbell, C. E.
1994-01-01
The Simple Sockets Library (SSL) allows C programmers to develop systems of cooperating programs using Berkeley streaming Sockets running under the TCP/IP protocol over Ethernet. The SSL provides a simple way to move information between programs running on the same or different machines and does so with little overhead. The SSL can create three types of Sockets: namely a server, a client, and an accept Socket. The SSL's Sockets are designed to be used in a fashion reminiscent of the use of FILE pointers so that a C programmer who is familiar with reading and writing files will immediately feel comfortable with reading and writing with Sockets. The SSL consists of three parts: the library, PortMaster, and utilities. The user of the SSL accesses it by linking programs to the SSL library. The PortMaster initializes connections between clients and servers. The PortMaster also supports a "firewall" facility to keep out socket requests from unapproved machines. The "firewall" is a file which contains Internet addresses for all approved machines. There are three utilities provided with the SSL. SKTDBG can be used to debug programs that make use of the SSL. SPMTABLE lists the servers and port numbers on requested machine(s). SRMSRVR tells the PortMaster to forcibly remove a server name from its list. The package also includes two example programs: multiskt.c, which makes multiple accepts on one server, and sktpoll.c, which repeatedly attempts to connect a client to some server at one second intervals. SSL is a machine independent library written in the C-language for computers connected via Ethernet using the TCP/IP protocol. It has been successfully compiled and implemented on a variety of platforms, including Sun series computers running SunOS, DEC VAX series computers running VMS, SGI computers running IRIX, DECstations running ULTRIX, DEC alpha AXPs running OSF/1, IBM RS/6000 computers running AIX, IBM PC and compatibles running BSD/386 UNIX and HP Apollo 3000/4000/9000/400T computers running HP-UX. SSL requires 45K of RAM to run under SunOS and 80K of RAM to run under VMS. For use on IBM PC series computers and compatibles running DOS, SSL requires Microsoft C 6.0 and the Wollongong TCP/IP package. Source code for sample programs and debugging tools are provided. The documentation is available on the distribution medium in TeX and PostScript formats. The standard distribution medium for SSL is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format and a 5.25 inch 360K MS-DOS format diskette. The SSL was developed in 1992 and was updated in 1993.
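The server/client/accept pattern that the SSL wraps can be illustrated with Python's standard socket module, as below; this is an analogue of the concept, not the SSL C API, and there is no PortMaster or firewall file in the sketch: host and port are fixed for the example.

```python
# Minimal client-server echo illustrating the three socket roles the SSL exposes.
import socket

def server(port=5050):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("", port))
        srv.listen(1)                    # "server" socket awaiting connections
        conn, _ = srv.accept()           # "accept" socket for one client
        with conn:
            data = conn.recv(1024)
            conn.sendall(b"echo: " + data)

def client(host="localhost", port=5050):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((host, port))        # "client" socket
        cli.sendall(b"hello")
        print(cli.recv(1024).decode())
```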
NDEx - The Network Data Exchange | Informatics Technology for Cancer Research (ITCR)
NDEx is an online commons where scientists can upload, share, and publicly distribute biological networks and pathway models. The NDEx Project maintains a web-accessible public server and a documentation website, and provides seamless connectivity to Cytoscape as well as programmatic access using a variety of languages, including Python and Java.
Chen, Hung-Ming; Liou, Yong-Zan
2014-10-01
In a mobile health management system, mobile devices act as the application hosting devices for personal health records (PHRs), and healthcare servers are constructed to exchange and analyze PHRs. One of the most popular PHR standards is the Continuity of Care Record (CCR). The CCR is expressed in XML format. However, parsing is an expensive operation that can degrade XML processing performance. Hence, the objective of this study was to identify the different operational and performance characteristics of CCR parsing models, including the XML DOM parser, the SAX parser, the PULL parser, and a JSON parser applied to JSON data converted from the XML-based CCR. Thus, developers can make sensible choices for their target PHR applications when parsing CCRs on mobile devices or servers with different system resources. Furthermore, simulation experiments for four case studies were conducted to compare parsing performance on Android mobile devices and on a server with large quantities of CCR data.
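A rough sketch of how such a comparison can be timed is shown below, contrasting a tree-building DOM parse, a streaming pull-style parse, and a JSON parse of a pre-converted copy of the same record; the file names are placeholders and the SAX case is omitted for brevity.

```python
# Time three parsing styles on the same (hypothetical) CCR document.
import json
import time
import xml.dom.minidom
import xml.etree.ElementTree as ET

def time_it(label, fn):
    start = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - start:.4f} s")

time_it("DOM ", lambda: xml.dom.minidom.parse("record.ccr.xml"))
time_it("PULL", lambda: [None for _ in ET.iterparse("record.ccr.xml")])

with open("record.ccr.json") as f:          # assumed pre-converted JSON copy
    time_it("JSON", lambda: json.load(f))
```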
Application of wireless networks-peer-to-peer information sharing
NASA Astrophysics Data System (ADS)
Ellappan, Vijayan; Chaki, Suchismita; Kumar, AVN
2017-11-01
Peer-to-peer communication and its applications have become a common architecture in the wired network environment. However, they have not been successfully adapted to the wireless environment. Unlike the traditional client-server framework, in a P2P framework each node can play the role of client as well as server simultaneously and exchange data or information with others. We aim to design an application which can adapt to wireless ad hoc networks. Peer-to-peer communication can help people share their files (information, images, audio, video and so on) and communicate with each other without relying on a particular network infrastructure or limited data usage. A central server provides peers with information about the other peers in the network. Indeed, even without the Internet, devices have the potential to allow users to connect and communicate through short-range wireless protocols such as Wi-Fi.
Securing a web-based teleradiology platform according to German law and "best practices".
Spitzer, Michael; Ullrich, Tobias; Ueckert, Frank
2009-01-01
The Medical Data and Picture Exchange platform (MDPE), as a teleradiology system, facilitates the exchange of digital medical imaging data among authorized users. It features extensive support of the DICOM standard including networking functions. Since MDPE is designed as a web service, security and confidentiality of data and communication pose an outstanding challenge. To comply with demands of German laws and authorities, a generic data security concept considered as "best practice" in German health telematics was adapted to the specific demands of MDPE. The concept features strict logical and physical separation of diagnostic and identity data and thus an all-encompassing pseudonymization throughout the system. Hence, data may only be merged at authorized clients. MDPE's solution of merging data from separate sources within a web browser avoids technically questionable techniques such as deliberate cross-site scripting. Instead, data is merged dynamically by JavaScriptlets running in the user's browser. These scriptlets are provided by one server, while content and method calls are generated by another server. Additionally, MDPE uses encrypted temporary IDs for communication and merging of data.
EMR-based TeleGeriatric system.
Pallawala, P M; Lun, K C
2001-01-01
As medical services improve due to new technologies and breakthroughs, the result has been an increasingly aging population. There has been much discussion and debate on how to solve various aspects of aging, such as the psychological, socio-economic and medical problems related to it. Our effort is to implement a feasible telegeriatric medical service, using state-of-the-art technology, to deliver medical services efficiently to remote sites where elderly homes are based. The TeleGeriatric system will lead to rapid decision-making in the presence of acute or subacute emergencies. This triage will also lead to a reduction in unnecessary admissions. It will enable the doctors who visit these elderly homes on a weekly basis to improve their geriatric management skills through communication with geriatric specialists. Nursing skills in geriatric care will also benefit from this system. An integrated electronic medical record (EMR) system will be indispensable in the face of emergency admissions to hospitals. Evolution of the EMR database will support future research in telegeriatrics and will help to identify the areas where telegeriatrics can be optimally used. This system is based on current web browsing technology and broadband communication. The TeleGeriatric web-based server was developed using Java technology. The TeleGeriatric database server was developed using Microsoft SQL Server. Both are based at the Medical Informatics Programme, National University of Singapore. Two elderly homes situated in the periphery of Singapore and a leading government hospital in geriatric care have been chosen for the project. These three institutions and the National University of Singapore are connected via ADSL, which supports the high bandwidth necessary for high-quality videoconferencing. Each time a patient needs a teleconsultation, a nurse or a doctor at the remote site sends the patient's record to the TeleGeriatric server. The TeleGeriatric server forwards the request to the Alexandra Hospital for consultation. Geriatric specialists at the Alexandra Hospital carry out teleward rounds twice weekly and on demand. Following the implementation of the system, a trial run has been done. Overall, the results have demonstrated a high degree of coordination and cooperation between the remote sites and the Alexandra Hospital. Patient compliance is very high, and patients prefer teleconsultation. Initial results show that the TeleGeriatric system has definite advantages in managing geriatric patients at a remote site. As the system evolves, further research will show the areas where telegeriatrics can be used optimally.
NASA Technical Reports Server (NTRS)
Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.
2002-01-01
Interoperability and data exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public, and more information is to be made available through web-enabled technologies. Programs such as NASA's Scientific and Technical Information (STI) Program Office are tasked to find more effective ways to disseminate information to the public. The NASA STI Program is an agency-wide program charged with gathering, organizing, storing, and disseminating NASA-produced information for research and public use. The program is investigating the use of a new protocol called the Open Archives Initiative (OAI) as a means to improve data interoperability and data collection. OAI promotes the use of the OAI harvesting protocol as a simple way to share data among repositories. The STI Program is implementing OAI in two separate initiatives. In collaboration with the Air Force, the Department of Energy, and Old Dominion University, the NASA STI Program has funded research on implementing OAI to exchange data between the three organizations. The second initiative is the deployment of OAI for the NASA technical report server (TRS) environment. The NASA TRS environment is comprised of distributed technical report servers with a centralized search interface. This paper focuses on the implementation of OAI to promote interoperability among diverse data repositories.
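A minimal OAI-PMH harvesting request of the kind described above can be issued with a few lines of Python, as sketched below; the base URL is a placeholder for a repository endpoint and only Dublin Core titles are extracted.

```python
# Harvest one page of records from a hypothetical OAI-PMH repository endpoint
# and print the Dublin Core titles.
import requests
import xml.etree.ElementTree as ET

BASE = "https://example.nasa.gov/oai"     # hypothetical repository endpoint
DC = "{http://purl.org/dc/elements/1.1/}"

resp = requests.get(BASE, params={"verb": "ListRecords",
                                  "metadataPrefix": "oai_dc"}, timeout=60)
resp.raise_for_status()
root = ET.fromstring(resp.content)
for title in root.iter(DC + "title"):
    print(title.text)
```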
NASA Technical Reports Server (NTRS)
Boulanger, Richard P., Jr.; Kwauk, Xian-Min; Stagnaro, Mike; Kliss, Mark (Technical Monitor)
1998-01-01
The BIO-Plex control system requires real-time, flexible, and reliable data delivery. There is no simple 'off-the-shelf' solution. However, several commercial packages will be evaluated using a testbed at ARC for publish-and-subscribe and client-server communication architectures. A point-to-point communication architecture is not suitable for the real-time BIO-Plex control system. A client-server architecture provides more flexible data delivery, but it does not provide direct communication among nodes on the network. A publish-and-subscribe implementation allows direct information exchange among nodes on the net, providing the best time-critical communication. In this work, Network Data Delivery Service (NDDS) from Real-Time Innovations, Inc. will be used to implement the publish-and-subscribe architecture; it offers update guarantees and deadlines for real-time data delivery. BridgeVIEW, a data acquisition and control software package from National Instruments, will be tested for the client-server arrangement. A microwave incinerator located at ARC will be instrumented with a fieldbus network of control devices. BridgeVIEW will be used to implement an enterprise server. An enterprise network consisting of several nodes at ARC and a WAN connecting ARC and RISC will then be set up to evaluate proposed control system architectures. Several network configurations will be evaluated for fault tolerance, quality of service, reliability and efficiency. Data acquired from these network evaluation tests will then be used to determine preliminary design criteria for the BIO-Plex distributed control system.
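The publish-and-subscribe pattern under evaluation can be reduced to a small conceptual skeleton, shown below; this in-process Python version only illustrates nodes exchanging data by topic and has none of NDDS's update guarantees, deadlines, or networking.

```python
# Conceptual publish-and-subscribe skeleton: subscribers register handlers for a
# topic, and a publish call fans the value out to every handler.
from collections import defaultdict
from typing import Callable, Dict, List

class Bus:
    def __init__(self):
        self._subs: Dict[str, List[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, value) -> None:
        for handler in self._subs[topic]:
            handler(value)

bus = Bus()
bus.subscribe("incinerator/temperature", lambda v: print("controller sees", v))
bus.subscribe("incinerator/temperature", lambda v: print("logger stores", v))
bus.publish("incinerator/temperature", 412.7)
```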
Multi-resolution extension for transmission of geodata in a mobile context
NASA Astrophysics Data System (ADS)
Follin, Jean-Michel; Bouju, Alain; Bertrand, Frédéric; Boursier, Patrice
2005-03-01
A solution is proposed for the management of multi-resolution vector data in a mobile spatial information visualization system. The client-server architecture and the system's data and transfer models are presented first. The aim of this system is to reduce the data exchanged between client and server by reusing data already present on the client side. Then, an extension of this system to multi-resolution data is proposed. Our solution is based on the use of increments in a multi-scale database. A database architecture is adopted in which data sets for different predefined scales are precomputed and stored on the server side. In this model, each object representing the same real-world entities at different levels of detail has to be linked beforehand. Increments correspond to the difference between two datasets with different levels of detail. They are transmitted in order to increase (or decrease) the detail on the client upon request. They include generalization and refinement operators allowing transitions between the different levels. Finally, a framework suited to the transfer of multi-resolution data in a mobile context is presented. This allows reuse of data locally available at different levels of detail and, in this way, reduces the amount of data transferred between client and server.
eCX: A Secure Infrastructure for E-Course Delivery.
ERIC Educational Resources Information Center
Yau, Joe C. K; Hui, Lucas C. K.; Cheung, Bruce; Yiu, S. M.
2003-01-01
Presents a mechanism, the Secure e-Course eXchange (eCX), designed to protect learning material from unauthorized dissemination, and shows how this mechanism can be integrated in the operation model of online learning course providers. The design of eCX is flexible to fit two operating models, the Institutional Server Model and the Corporate Server…
WMT: The CSDMS Web Modeling Tool
NASA Astrophysics Data System (ADS)
Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.
2015-12-01
The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can design a model from a set of components, edit component parameters, save models to a web-accessible server, share saved models with the community, submit runs to an HPC system, and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server where it is stored and from which a user can download it as a single compressed archive file.
NASA Astrophysics Data System (ADS)
Kehlenbeck, Matthias; Breitner, Michael H.
Business users define calculated facts based on the dimensions and facts contained in a data warehouse. These business calculation definitions contain knowledge about quantitative relations that is necessary for deep analyses and for the production of meaningful reports. The business calculation definitions are largely independent of implementation and organization, but no automated procedures facilitating their exchange across organization and implementation boundaries exist. Each organization currently has to map its own business calculations to analysis and reporting tools separately. This paper presents an innovative approach based on standard Semantic Web technologies. This approach facilitates the exchange of business calculation definitions and allows for their automatic linking to specific data warehouses through semantic reasoning. A novel standard proxy server which enables the immediate application of exchanged definitions is introduced. Benefits of the approach are shown in a comprehensive case study.
On the security of a simple three-party key exchange protocol without server's public keys.
Nam, Junghyun; Choo, Kim-Kwang Raymond; Park, Minkyu; Paik, Juryon; Won, Dongho
2014-01-01
Authenticated key exchange protocols are of fundamental importance in securing communications and are now extensively deployed for use in various real-world network applications. In this work, we reveal major previously unpublished security vulnerabilities in the password-based authenticated three-party key exchange protocol according to Lee and Hwang (2010): (1) the Lee-Hwang protocol is susceptible to a man-in-the-middle attack and thus fails to achieve implicit key authentication; (2) the protocol cannot protect clients' passwords against an offline dictionary attack; and (3) the indistinguishability-based security of the protocol can be easily broken even in the presence of a passive adversary. We also propose an improved password-based authenticated three-party key exchange protocol that addresses the security vulnerabilities identified in the Lee-Hwang protocol.
On the Security of a Simple Three-Party Key Exchange Protocol without Server's Public Keys
Nam, Junghyun; Choo, Kim-Kwang Raymond; Park, Minkyu; Paik, Juryon; Won, Dongho
2014-01-01
Authenticated key exchange protocols are of fundamental importance in securing communications and are now extensively deployed for use in various real-world network applications. In this work, we reveal major previously unpublished security vulnerabilities in the password-based authenticated three-party key exchange protocol proposed by Lee and Hwang (2010): (1) the Lee-Hwang protocol is susceptible to a man-in-the-middle attack and thus fails to achieve implicit key authentication; (2) the protocol cannot protect clients' passwords against an offline dictionary attack; and (3) the indistinguishability-based security of the protocol can be easily broken even in the presence of a passive adversary. We also propose an improved password-based authenticated three-party key exchange protocol that addresses the security vulnerabilities identified in the Lee-Hwang protocol. PMID:25258723
Software Modules for the Proximity-1 Space Link Interleaved Time Synchronization (PITS) Protocol
NASA Technical Reports Server (NTRS)
Woo, Simon S.; Veregge, John R.; Gao, Jay L.; Clare, Loren P.; Mills, David
2012-01-01
The Proximity-1 Space Link Interleaved Time Synchronization (PITS) protocol provides time distribution and synchronization services for space systems. A software prototype implementation of the PITS algorithm has been developed that also provides the test harness to evaluate the key functionalities of PITS with a simulated data source and sink. PITS integrates time synchronization functionality into the link layer of the CCSDS Proximity-1 Space Link Protocol. The software prototype implements the network packet format, data structures, and transmit- and receive-timestamp functions for a time server and a client. The software also simulates the exchange of transmit and receive timestamps via a UDP (User Datagram Protocol) socket between a time server and a time client, and produces relative time offsets and delay estimates.
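The offset-and-delay estimation mentioned at the end of the abstract can be sketched with a plain UDP timestamp exchange. In this Python sketch the packet layout (little-endian doubles) and the server address are assumptions made for illustration; the real PITS prototype defines its own packet format within the Proximity-1 link layer.

```python
# Sketch of a time client exchanging timestamps with a time server over UDP
# and estimating relative offset and round-trip delay. Packet layout and port
# are assumptions, not the PITS packet format.
import socket
import struct
import time

SERVER = ("127.0.0.1", 9123)  # hypothetical time server address

def query_offset_and_delay():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    t1 = time.time()                              # client transmit time
    sock.sendto(struct.pack("<d", t1), SERVER)
    data, _ = sock.recvfrom(64)
    t4 = time.time()                              # client receive time
    _, t2, t3 = struct.unpack("<3d", data)        # echoed t1, server rx/tx times
    offset = ((t2 - t1) + (t3 - t4)) / 2.0        # relative time offset
    delay = (t4 - t1) - (t3 - t2)                 # round-trip delay estimate
    return offset, delay
```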
Development of a system for transferring images via a network: supporting a regional liaison.
Mihara, Naoki; Manabe, Shiro; Takeda, Toshihiro; Shinichirou, Kitamura; Junichi, Murakami; Kouji, Kiso; Matsumura, Yasushi
2013-01-01
We developed a system that transfers images via a network and began using it in our hospital's PACS (Picture Archiving and Communication Systems) in 2006. The system has since been re-developed and is now running so that it can support a regional liaison in the future. It has become possible to automatically transfer images simply by selecting a destination hospital that has been registered in advance at the relay server. The gateway of this system can send images to a multi-center relay management server, which receives the images and resends them. This system has the potential to be useful for image exchange, and to serve as part of a regional medical liaison.
Provisioning cooling elements for chillerless data centers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chainer, Timothy J.; Parida, Pritish R.
Systems and methods for cooling include one or more computing structure, an inter-structure liquid cooling system that includes valves configured to selectively provide liquid coolant to the one or more computing structures; a heat rejection system that includes one or more heat rejection units configured to cool liquid coolant; and one or more liquid-to-liquid heat exchangers that include valves configured to selectively transfer heat from liquid coolant in the inter-structure liquid cooling system to liquid coolant in the heat rejection system. Each computing structure further includes one or more liquid-cooled servers; and an intra-structure liquid cooling system that has valves configured to selectively provide liquid coolant to the one or more liquid-cooled servers.
MEGANTE: A Web-Based System for Integrated Plant Genome Annotation
Numa, Hisataka; Itoh, Takeshi
2014-01-01
The recent advancement of high-throughput genome sequencing technologies has resulted in a considerable increase in demands for large-scale genome annotation. While annotation is a crucial step for downstream data analyses and experimental studies, this process requires substantial expertise and knowledge of bioinformatics. Here we present MEGANTE, a web-based annotation system that makes plant genome annotation easy for researchers unfamiliar with bioinformatics. Without any complicated configuration, users can perform genomic sequence annotations simply by uploading a sequence and selecting the species to query. MEGANTE automatically runs several analysis programs and integrates the results to select the appropriate consensus exon–intron structures and to predict open reading frames (ORFs) at each locus. Functional annotation, including a similarity search against known proteins and a functional domain search, are also performed for the predicted ORFs. The resultant annotation information is visualized with a widely used genome browser, GBrowse. For ease of analysis, the results can be downloaded in Microsoft Excel format. All of the query sequences and annotation results are stored on the server side so that users can access their own data from virtually anywhere on the web. The current release of MEGANTE targets 24 plant species from the Brassicaceae, Fabaceae, Musaceae, Poaceae, Salicaceae, Solanaceae, Rosaceae and Vitaceae families, and it allows users to submit a sequence up to 10 Mb in length and to save up to 100 sequences with the annotation information on the server. The MEGANTE web service is available at https://megante.dna.affrc.go.jp/. PMID:24253915
NASA Astrophysics Data System (ADS)
Pulok, Md Kamrul Hasan
Intelligent and effective monitoring of power system stability in control centers is one of the key issues in smart grid technology to prevent unwanted power system blackouts. Voltage stability analysis is one of the most important requirements for control center operation in the smart grid era. With the advent of Phasor Measurement Unit (PMU), or synchrophasor, technology, real-time monitoring of power system voltage stability is now a reality. This work utilizes real-time PMU data to derive a voltage stability index to monitor voltage-stability-related contingency situations in power systems. The developed tool uses PMU data to calculate a voltage stability index that indicates the relative closeness to instability by producing numerical indices. The IEEE 39-bus New England power system was modeled and run on a Real-Time Digital Simulator that streams PMU data over the Internet using the IEEE C37.118 protocol. A phasor data concentrator (PDC) was set up that receives the streaming PMU data and stores them in a Microsoft SQL database server. The developed voltage stability monitoring (VSM) tool then retrieves phasor measurement data from the SQL server, performs real-time state estimation of the whole network, calculates the voltage stability index, ranks the most vulnerable transmission lines in real time, and finally shows all the results in a graphical user interface. All of these actions are done in near real time. Control centers can easily monitor the system's condition by using this tool and can take precautionary actions if needed.
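A minimal sketch of the retrieval step described above is given below: pull the latest PMU voltage magnitudes from a SQL Server table (pyodbc is assumed as the driver) and flag buses whose voltage deviates from nominal. The connection string, table, and column names are hypothetical, and the simple deviation "margin" is only a stand-in for the voltage stability index developed in this work.

```python
# Sketch of pulling the latest PMU voltage phasors from SQL Server and flagging
# buses approaching a voltage limit. Connection string, table, and column names
# are hypothetical; the deviation check is a placeholder for the actual index.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=pmu_archive;Trusted_Connection=yes;"
)

rows = conn.execute(
    "SELECT bus_id, v_magnitude_pu FROM latest_phasors"
).fetchall()

for bus_id, v_pu in rows:
    margin = abs(1.0 - v_pu)      # deviation from nominal 1.0 per unit
    if margin > 0.08:             # illustrative alarm threshold
        print(f"bus {bus_id}: voltage {v_pu:.3f} pu, margin {margin:.3f}")
```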
Sharing digital micrographs and other data files between computers.
Entwistle, A
2004-01-01
It ought to be easy to exchange digital micrographs and other computer data files with a colleague, even on another continent. In practice, this often is not the case. The advantages and disadvantages of various methods that are available for exchanging data files between computers are discussed. When possible, data should be transferred through computer networking. When data are to be exchanged locally between computers with similar operating systems, the use of a local area network is recommended. For computers in commercial or academic environments that have dissimilar operating systems or are more widely spaced, the use of FTP is recommended. Failing this, posting the data on a website and transferring by hypertext transfer protocol is suggested. If peer-to-peer exchange between computers in domestic environments is needed, the use of messenger services such as Microsoft Messenger or Yahoo Messenger is the method of choice. When it is not possible to transfer the data files over the internet, single-use writable CD-ROMs are the best media for transferring data. If for some reason this is not possible, DVD-R/RW, DVD+R/RW, 100 MB ZIP disks and USB flash media are potentially useful media for exchanging data files.
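For the two network routes recommended above, a minimal Python sketch is shown below: uploading a file to an FTP server with ftplib and fetching one over HTTP with urllib. The host names, credentials, and paths are placeholders only.

```python
# Minimal sketch of the recommended network routes: upload a micrograph to an
# FTP server, or fetch one over HTTP. Hosts, credentials, and paths are
# placeholders.
from ftplib import FTP
import urllib.request

def ftp_upload(path):
    with FTP("ftp.example.edu") as ftp:        # hypothetical FTP host
        ftp.login("user", "password")
        with open(path, "rb") as fh:
            ftp.storbinary(f"STOR {path}", fh)

def http_download(url, dest):
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        out.write(resp.read())
```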
NASA Astrophysics Data System (ADS)
Gualda, Guilherme A. R.; Ghiorso, Mark S.
2015-01-01
The thermodynamic modeling software MELTS is a powerful tool for investigating crystallization and melting in natural magmatic systems. Rhyolite-MELTS is a recalibration of MELTS that better captures the evolution of silicic magmas in the upper crust. The current interface of rhyolite-MELTS, while flexible, can be somewhat cumbersome for the novice. We present a new interface that uses web services consumed by a VBA backend in Microsoft Excel©. The interface is contained within a macro-enabled workbook, where the user can insert the model input information and initiate computations that are executed on a central server at OFM Research. Results of simple calculations are shown immediately within the interface itself. It is also possible to combine a sequence of calculations into an evolutionary path; the user can input starting and ending temperatures and pressures, temperature and pressure steps, and the prevailing oxidation conditions. The program shows partial updates at every step of the computations; at the conclusion of the calculations, a series of data sheets and diagrams are created in a separate workbook, which can be saved independently of the interface. Additionally, the user can specify a grid of temperatures and pressures and calculate a phase diagram showing the conditions at which different phases are present. The interface can be used to apply the rhyolite-MELTS geobarometer. We demonstrate applications of the interface using an example early-erupted Bishop Tuff composition. The interface is simple to use and flexible, but it requires an internet connection. The interface is distributed for free from http://melts.ofm-research.org.
Performance of the High Sensitivity Open Source Multi-GNSS Assisted GNSS Reference Server.
NASA Astrophysics Data System (ADS)
Sarwar, Ali; Rizos, Chris; Glennon, Eamonn
2015-06-01
The Open Source GNSS Reference Server (OSGRS) exploits the GNSS Reference Interface Protocol (GRIP) to provide assistance data to GPS receivers. Assistance can be in terms of signal acquisition and in the processing of the measurement data. The data transfer protocol is based on Extensible Mark-up Language (XML) schema. The first version of the OSGRS required a direct hardware connection to a GPS device to acquire the data necessary to generate the appropriate assistance. Scenarios of interest for the OSGRS users are weak signal strength indoors, obstructed outdoors or heavy multipath environments. This paper describes an improved version of OSGRS that provides alternative assistance support from a number of Global Navigation Satellite Systems (GNSS). The underlying protocol to transfer GNSS assistance data from global casters is the Networked Transport of RTCM (Radio Technical Commission for Maritime Services) over Internet Protocol (NTRIP), and/or the RINEX (Receiver Independent Exchange) format. This expands the assistance and support model of the OSGRS to globally available GNSS data servers connected via internet casters. A variety of formats and versions of RINEX and RTCM streams become available, which strengthens the assistance provisioning capability of the OSGRS platform. The prime motivation for this work was to enhance the system architecture of the OSGRS to take advantage of globally available GNSS data sources. Open source software architectures and assistance models provide acquisition and data processing assistance for GNSS receivers operating in weak signal environments. This paper describes test scenarios to benchmark the OSGRSv2 performance against other Assisted-GNSS solutions. Benchmarking devices include the SPOT satellite messenger, MS-Based & MS-Assisted GNSS, HSGNSS (SiRFstar-III) and Wireless Sensor Networks Assisted-GNSS. Benchmarked parameters include the number of tracked satellites, the Time To First Fix (TTFF), navigation availability and accuracy. Three different configurations of Multi-GNSS assistance servers were used, namely Cloud-Client-Server, Demilitarized Zone (DMZ) Client-Server and PC-Client-Server, with respect to the connectivity location of client and server. The impact on performance of server and/or client initiation, hardware capability, network latency, processing delay and computation times, together with their storage, scalability, processing and load sharing capabilities, was analysed. The performance of the OSGRS is compared against commercial GNSS, Assisted-GNSS and WSN-enabled GNSS devices. The OSGRS system demonstrated lower TTFF and higher availability.
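One of the global caster connections mentioned above (NTRIP) can be sketched in a few lines. The example below performs the HTTP-like NTRIP v1 handshake over a raw socket and returns the socket from which RTCM frames can then be read; the caster host, port, mountpoint, and credentials are placeholders.

```python
# Sketch of pulling an RTCM correction stream from an NTRIP caster, the kind of
# globally available GNSS data source OSGRSv2 can draw assistance from. The
# request follows the HTTP-like NTRIP v1 handshake; all connection details are
# placeholders.
import base64
import socket

def open_ntrip_stream(host, port, mountpoint, user, password):
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    request = (
        f"GET /{mountpoint} HTTP/1.0\r\n"
        f"User-Agent: NTRIP example-client/1.0\r\n"
        f"Authorization: Basic {auth}\r\n\r\n"
    )
    sock = socket.create_connection((host, port), timeout=10)
    sock.sendall(request.encode())
    reply = sock.recv(128)
    if not reply.startswith(b"ICY 200 OK"):
        raise RuntimeError(f"caster refused request: {reply!r}")
    return sock   # subsequent recv() calls yield raw RTCM frames
```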
EMR based telegeriatric system.
Pallawala, P M; Lun, K C
2001-05-01
As medical services improve due to new technologies and breakthroughs, populations are increasingly aging. There has been much discussion and debate on how to solve various aspects such as the psychological, socioeconomic and medical problems related to aging. Our effort is to implement a feasible telegeriatric medical service that uses state-of-the-art technology to deliver medical services efficiently to remote sites where elderly homes are based. The telegeriatric system will lead to rapid decision-making in the presence of acute or subacute emergencies. This triage will also lead to a reduction of unnecessary admissions. It will enable the doctors who visit these elderly homes on a once-a-week basis to improve their geriatric management skills through communication with a geriatrics specialist. Nursing skills in geriatric care will also benefit from this system. An integrated EMR service will be indispensable in the face of emergency admissions to hospitals. Evolution of the EMR database would lead to future research in telegeriatrics and will help to identify the areas where telegeriatrics can be optimally used. This system is based on current web browsing technology and broadband communication. The web-based EMR server was developed using Java technology, and the EMR database was developed using Microsoft SQL Server. Both are based at the Medical Informatics Programme, National University of Singapore. Two elderly homes situated in the periphery of Singapore and a leading government hospital in geriatric care have been chosen for the project. These three institutions and the National University of Singapore are connected via ADSL, which supports the high bandwidth necessary for high-quality videoconferencing. Each time a patient needs a teleconsultation, a nurse or doctor at the remote site sends the history to the EMR server. The EMR server forwards the request to Alexandra Hospital for consultation. Geriatrics specialists at Alexandra Hospital carry out teleward rounds twice weekly and on an on-demand basis. Following the implementation of the system, a trial run was conducted. It showed a high degree of coordination and cooperation between the remote sites and Alexandra Hospital. Patient compliance is also very high, and patients prefer teleconsultation. Initial results show that the telegeriatric system has definite advantages in managing geriatric patients at a remote site. As the system evolves, further research will show the areas where telegeriatrics can be used optimally.
Serving by local consensus in the public service location game.
Sun, Yi-Fan; Zhou, Hai-Jun
2016-09-02
We discuss the issue of distributed and cooperative decision-making in a network game of public service location. Each node of the network can decide to host a certain public service, incurring a construction cost and serving all the neighboring nodes and itself. A pure consumer node has to pay a tax, and the collected tax is evenly distributed to all the hosting nodes to remedy their construction costs. If all nodes make individual best-response decisions, the system gets trapped in an inefficient situation of high tax level. Here we introduce a decentralized local-consensus selection mechanism which requires nodes to recommend their neighbors of highest local impact as candidate servers, and a node may become a server only if all its non-server neighbors give their assent. We demonstrate that although this mechanism involves only information exchange among neighboring nodes, it leads to socially efficient solutions with tax level approaching the lowest possible value. Our results may help in understanding and improving collective problem-solving in various networked social and robotic systems.
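A toy reading of the local-consensus rule is sketched below for a five-node graph: each unserved node nominates its highest-impact local candidate, and the candidate is accepted only if none of its neighbors sees a strictly better alternative. Node degree is used as a stand-in for the "local impact" measure, which the abstract does not define, so this illustrates the flavor of the mechanism rather than the paper's exact algorithm.

```python
# Toy sketch of local-consensus server selection on a small graph. Degree is a
# stand-in for the local-impact score; the real mechanism is defined in the paper.
graph = {                      # adjacency list of an example network
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"a", "c", "e"},
    "e": {"d"},
}

def impact(node):
    return len(graph[node])    # stand-in local-impact score

servers = set()
for node in graph:
    if node in servers or any(n in servers for n in graph[node]):
        continue               # already served by itself or a neighbor
    # the node and its neighborhood nominate their highest-impact candidate
    candidate = max(graph[node] | {node}, key=impact)
    # candidate is accepted only if every neighbor assents, i.e. none of them
    # sees a strictly higher-impact alternative
    if all(impact(candidate) >= impact(m) for m in graph[candidate]):
        servers.add(candidate)

print("servers chosen by local consensus:", servers)
```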
HARMONY: a server for the assessment of protein structures
Pugalenthi, G.; Shameer, K.; Srinivasan, N.; Sowdhamini, R.
2006-01-01
Protein structure validation is an important step in computational modeling and structure determination. Stereochemical assessment of protein structures examines internal parameters such as bond lengths and Ramachandran (φ,ψ) angles. Gross structure prediction methods, such as the inverse folding procedure, and structure determination, especially at low resolution, can sometimes give rise to models that are incorrect due to assignment of misfolds or mistracing of electron density maps. Such errors are not reflected as strain in internal parameters. HARMONY is a procedure that examines the compatibility between the sequence and the structure of a protein by assigning scores to individual residues and their amino acid exchange patterns after considering their local environments. Local environments are described by the backbone conformation, solvent accessibility and hydrogen bonding patterns. We are now providing HARMONY through a web server such that users can submit their protein structure files and, if required, the alignment of homologous sequences. Scores are mapped onto the structure for subsequent examination, which is also useful for recognizing regions of possible local errors in protein structures. The HARMONY server is located at PMID:16844999
Serving by local consensus in the public service location game
Sun, Yi-Fan; Zhou, Hai-Jun
2016-01-01
We discuss the issue of distributed and cooperative decision-making in a network game of public service location. Each node of the network can decide to host a certain public service, incurring a construction cost and serving all the neighboring nodes and itself. A pure consumer node has to pay a tax, and the collected tax is evenly distributed to all the hosting nodes to remedy their construction costs. If all nodes make individual best-response decisions, the system gets trapped in an inefficient situation of high tax level. Here we introduce a decentralized local-consensus selection mechanism which requires nodes to recommend their neighbors of highest local impact as candidate servers, and a node may become a server only if all its non-server neighbors give their assent. We demonstrate that although this mechanism involves only information exchange among neighboring nodes, it leads to socially efficient solutions with tax level approaching the lowest possible value. Our results may help in understanding and improving collective problem-solving in various networked social and robotic systems. PMID:27586793
Serving by local consensus in the public service location game
NASA Astrophysics Data System (ADS)
Sun, Yi-Fan; Zhou, Hai-Jun
2016-09-01
We discuss the issue of distributed and cooperative decision-making in a network game of public service location. Each node of the network can decide to host a certain public service, incurring a construction cost and serving all the neighboring nodes and itself. A pure consumer node has to pay a tax, and the collected tax is evenly distributed to all the hosting nodes to remedy their construction costs. If all nodes make individual best-response decisions, the system gets trapped in an inefficient situation of high tax level. Here we introduce a decentralized local-consensus selection mechanism which requires nodes to recommend their neighbors of highest local impact as candidate servers, and a node may become a server only if all its non-server neighbors give their assent. We demonstrate that although this mechanism involves only information exchange among neighboring nodes, it leads to socially efficient solutions with tax level approaching the lowest possible value. Our results may help in understanding and improving collective problem-solving in various networked social and robotic systems.
2012-09-01
Services; FSD, Federated Services Daemon; I&A, Identification and Authentication; IKE, Internet Key Exchange; KPI, Key Performance Indicator; LAN, Local Area...spection takes place in different processes in the server architecture. Key Performance Indicator (KPI)s associated with the system need to be...application and risk analysis of security controls. Thus, measurement of the KPIs is needed before an informed tradeoff between the performance penalties
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-20
... be charged on a per-Login ID basis. Firms may access C2 via either a CMI Client Application ..., using different Login IDs, accessing the same CMI Client Application Server or FIX Port, allowing the firm to only pay the monthly fee once. Alternatively, a firm may use the same Login ID to access...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-14
... charges to assess a fee for each CMI Login ID. Firms may access CBOEdirect via either a CMI Client... Login IDs, accessing the same CMI Client Application Server, allowing the firm to only pay the monthly fee once. Alternatively, a firm may use the same Login ID to access different CMI Client Application...
[Relevance of the hemovigilance regional database for the shared medical file identity server].
Doly, A; Fressy, P; Garraud, O
2008-11-01
The French Health Products Safety Agency coordinates the national initiative of computerization of blood products traceability within regional blood banks and public and private hospitals. The Auvergne-Loire Regional French Blood Service, based in Saint-Etienne, together with a number of public hospitals, set up a transfusion data network named EDITAL. After four years of progressive implementation and experimentation, a software package enabling standardized data exchange has built up a regional nominative database, endorsed by the Traceability Computerization National Committee in 2004. This database now provides secured web access to a regional transfusion history, enabling biologists and all hospital and family practitioners to take charge of patient follow-up. By running independently of its partners' software, the EDITAL database provides a reference for the regional identity server.
Werts, Joshua D; Mikhailova, Elena A; Post, Christopher J; Sharp, Julia L
2012-04-01
Volunteered geographic information and social networking in a WebGIS has the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.
Synoptic reporting in tumor pathology: advantages of a web-based system.
Qu, Zhenhong; Ninan, Shibu; Almosa, Ahmed; Chang, K G; Kuruvilla, Supriya; Nguyen, Nghia
2007-06-01
The American College of Surgeons Commission on Cancer (ACS-CoC) mandates that pathology reports at ACS-CoC-approved cancer programs include all scientifically validated data elements for each site and tumor specimen. The College of American Pathologists (CAP) has produced cancer checklists in static text formats to assist reporting. To be inclusive, the CAP checklists are pages long, requiring extensive text editing and multiple intermediate steps. We created a set of dynamic tumor-reporting templates, using Microsoft Active Server Pages (ASP.NET), with drop-down lists and data-compilation features, and added a reminder function to indicate missing information. Users can access this system on the Internet, prepare the tumor report by selecting relevant data from drop-down lists with an embedded tumor staging scheme, and directly transfer the final report into a laboratory information system by using the copy-and-paste function. By minimizing extensive text editing and eliminating intermediate steps, this system can reduce reporting errors, improve work efficiency, and increase compliance.
Attigala, Lakshmi; De Silva, Nuwan I.; Clark, Lynn G.
2016-01-01
Premise of the study: Programs that are user-friendly and freely available for developing Web-based interactive keys are scarce and most of the well-structured applications are relatively expensive. WEBiKEY was developed to enable researchers to easily develop their own Web-based interactive keys with fewer resources. Methods and Results: A Web-based multiaccess identification tool (WEBiKEY) was developed that uses freely available Microsoft ASP.NET technologies and an SQL Server database for Windows-based hosting environments. WEBiKEY was tested for its usability with a sample data set, the temperate woody bamboo genus Kuruna (Poaceae). Conclusions: WEBiKEY is freely available to the public and can be used to develop Web-based interactive keys for any group of species. The interactive key we developed for Kuruna using WEBiKEY enables users to visually inspect characteristics of Kuruna and identify an unknown specimen as one of seven possible species in the genus. PMID:27144109
NASA Astrophysics Data System (ADS)
Werts, Joshua D.; Mikhailova, Elena A.; Post, Christopher J.; Sharp, Julia L.
2012-04-01
Volunteered geographic information and social networking in a WebGIS has the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.
Chemical-text hybrid search engines.
Zhou, Yingyao; Zhou, Bin; Jiang, Shumei; King, Frederick J
2010-01-01
As the amount of chemical literature increases, it is critical that researchers be enabled to accurately locate documents related to a particular aspect of a given compound. Existing solutions, based on text and chemical search engines alone, suffer from the inclusion of "false negative" and "false positive" results, and cannot accommodate the diverse repertoire of formats currently available for chemical documents. To address these concerns, we developed an approach called Entity-Canonical Keyword Indexing (ECKI), which converts a chemical entity embedded in a data source into its canonical keyword representation prior to being indexed by text search engines. We implemented ECKI using Microsoft Office SharePoint Server Search, and the resultant hybrid search engine not only supported complex mixed chemical and keyword queries but also was applied to both intranet and Internet environments. We envision that the adoption of ECKI will empower researchers to pose more complex search questions that were not readily attainable previously and to obtain answers at much improved speed and accuracy.
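The core ECKI step, replacing every synonym of a compound with one canonical keyword before ordinary text indexing, can be illustrated with a toy Python sketch. The synonym table and the canonical key below are invented; the actual system derives canonical representations from the chemical entities themselves and is built on SharePoint Server Search rather than this simple regex pass.

```python
# Toy sketch of the ECKI idea: map chemical synonyms to one canonical keyword
# before handing the text to a text indexer. The synonym table is invented.
import re

CANONICAL = {
    "acetylsalicylic acid": "CHEM_ASPIRIN",
    "2-acetoxybenzoic acid": "CHEM_ASPIRIN",
    "aspirin": "CHEM_ASPIRIN",
}

# match longer synonyms first so they are not split by shorter ones
pattern = re.compile(
    "|".join(re.escape(s) for s in sorted(CANONICAL, key=len, reverse=True)),
    re.IGNORECASE,
)

def canonicalize(text: str) -> str:
    """Replace every recognized chemical synonym with its canonical keyword."""
    return pattern.sub(lambda m: CANONICAL[m.group(0).lower()], text)

doc = "Acetylsalicylic acid (aspirin) inhibits platelet aggregation."
print(canonicalize(doc))  # both mentions now index under CHEM_ASPIRIN
```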
Xu, Weijia; Ozer, Stuart; Gutell, Robin R
2009-01-01
With an increasingly large amount of sequences properly aligned, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and less effective with sequences from diversified phylogenetic classifications. We propose a new approach that utilizes coevolutional rates among pairs of nucleotide positions using phylogenetic and evolutionary relationships of the organisms of aligned sequences. With a novel data schema to manage relevant information within a relational database, our method, implemented with Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger than a previous study and with 50% better sensitivity. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure.
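The covariation signal this approach quantifies can be sketched with the classical mutual-information measure between two alignment columns. The short Python example below uses a tiny invented alignment; the actual study computes its coevolution rates inside SQL Server over phylogenetically weighted sequences, which this sketch does not attempt.

```python
# Sketch of the covariation signal behind base-pair prediction: mutual
# information between two alignment columns. The toy alignment is invented.
import math
from collections import Counter

alignment = ["GCAUGC", "GCGUGC", "ACAUGU", "GCAUGC", "ACGUGU"]

def mutual_information(col_i, col_j):
    pairs = [(seq[col_i], seq[col_j]) for seq in alignment]
    n = len(pairs)
    p_xy = Counter(pairs)
    p_x = Counter(x for x, _ in pairs)
    p_y = Counter(y for _, y in pairs)
    mi = 0.0
    for (x, y), c in p_xy.items():
        pxy = c / n
        mi += pxy * math.log2(pxy / ((p_x[x] / n) * (p_y[y] / n)))
    return mi

# columns 0 and 5 covary (G:C versus A:U) in this toy alignment
print(mutual_information(0, 5))
```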
Xu, Weijia; Ozer, Stuart; Gutell, Robin R.
2010-01-01
With an increasingly large amount of sequences properly aligned, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and less effective with sequences from diversified phylogenetic classifications. We propose a new approach that utilizes coevolutional rates among pairs of nucleotide positions using phylogenetic and evolutionary relationships of the organisms of aligned sequences. With a novel data schema to manage relevant information within a relational database, our method, implemented with Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger than a previous study and with 50% better sensitivity. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure. PMID:20502534
Ward, R. E.; Purves, T.; Feldman, M.; Schiffman, R. M.; Barry, S.; Christner, M.; Kipa, G.; McCarthy, B. D.; Stiphout, R.
1991-01-01
The Care Windows development project demonstrated the feasibility of an approach designed to add the benefits of an event-driven, graphically-oriented user interface to an existing Medical Information Management System (MIMS) without overstepping economic and logistic constraints. The design solution selected for the Care Windows project incorporates four important design features: (1) the effective de-coupling of servers from requesters, permitting the use of an extensive pre-existing library of MIMS servers, (2) the off-loading of program control functions of the requesters to the workstation processor, reducing the load per transaction on central resources and permitting the use of object-oriented development environments available for microcomputers, (3) the selection of a low-end, GUI-capable workstation consisting of a PC-compatible personal computer running Microsoft Windows 3.0, and (4) the development of a highly layered, modular workstation application, permitting the development of interchangeable modules to ensure portability and adaptability. PMID:1807665
Dynamic XML-based exchange of relational data: application to the Human Brain Project.
Tang, Zhengming; Kadiyska, Yana; Li, Hao; Suciu, Dan; Brinkley, James F
2003-01-01
This paper discusses an approach to exporting relational data in XML format for data exchange over the web. We describe the first real-world application of SilkRoute, a middleware program that dynamically converts existing relational data to a user-defined XML DTD. The application, called XBrain, wraps SilkRoute in a Java Server Pages framework, thus permitting a web-based XQuery interface to a legacy relational database. The application is demonstrated as a query interface to the University of Washington Brain Project's Language Map Experiment Management System, which is used to manage data about language organization in the brain.
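The relational-to-XML mapping idea can be illustrated with a minimal Python sketch using sqlite3 and ElementTree: rows are published under a user-defined document shape. SilkRoute itself performs this mapping declaratively and on the fly against a live XQuery, so the table, tag names, and example rows below are invented stand-ins.

```python
# Minimal sketch of publishing relational rows under a user-defined XML
# structure, in the spirit of SilkRoute/XBrain. Table and tag names are invented.
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE experiment (id INTEGER, subject TEXT, site TEXT)")
conn.executemany(
    "INSERT INTO experiment VALUES (?, ?, ?)",
    [(1, "P07", "superior temporal gyrus"), (2, "P12", "inferior frontal gyrus")],
)

root = ET.Element("experiments")                 # user-defined document shape
for row_id, subject, site in conn.execute("SELECT id, subject, site FROM experiment"):
    exp = ET.SubElement(root, "experiment", id=str(row_id))
    ET.SubElement(exp, "subject").text = subject
    ET.SubElement(exp, "stimulationSite").text = site

print(ET.tostring(root, encoding="unicode"))
```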
2001-01-01
System (GCCS) Track Database Management System (TDBM) (3) GCCS Integrated Imagery and Intelligence (3) Intelligence Shared Data Server (ISDS) General ... The CTH is a powerful model that will allow more than just message systems to exchange information. It could be used for object-oriented databases, as ... of the Naval Integrated Tactical Environmental System I (NITES I) is used as a case study to demonstrate the utility of this distributed component
Robinson, Judas; de Lusignan, Simon; Kostkova, Patty; Madge, Bruce; Marsh, A; Biniaris, C
2006-01-01
Rich Site Summary (RSS) feeds are a method for disseminating and syndicating the contents of a website using extensible mark-up language (XML). The Primary Care Electronic Library (PCEL) distributes recent additions to the site in the form of an RSS feed. When new resources are added to PCEL, they are manually assigned medical subject headings (MeSH terms), which are then automatically mapped to SNOMED-CT terms using the Unified Medical Language System (UMLS) Metathesaurus. The library is thus searchable using MeSH or SNOMED-CT. Our syndicate partner wished to have remote access to PCEL coronary heart disease (CHD) information resources based on SNOMED-CT search terms. Our aim was to pilot the supply of relevant information resources in response to clinically coded requests, using RSS syndication for transmission between web servers. Our syndicate partner provided a list of CHD SNOMED-CT terms to its end-users, a list which was coded according to UMLS specifications. When the end-user requested relevant information resources, this request was relayed from our syndicate partner's web server to the PCEL web server. The relevant resources were retrieved from the PCEL MySQL database. This database is accessed using a server-side scripting language (PHP), which enables the production of dynamic RSS feeds on the basis of Source Asserted Identifiers (CODEs) contained in UMLS. Retrieving resources by SNOMED-CT terms via syndication can be used to build a functioning application. The process from request to display of syndicated resources took less than one second. The results of the pilot illustrate that it is possible to exchange data between servers using RSS syndication. This method could be utilised dynamically to supply digital library resources to a clinical system with SNOMED-CT data used as the standard of reference.
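A compressed Python sketch of the server-side pattern described above, generating an RSS document for resources matching a clinical code, is shown below. The original implementation uses PHP against a MySQL database; here sqlite3 and an invented table layout stand in, so the schema, database file, and field names are assumptions.

```python
# Sketch of serving a dynamic RSS feed of resources matched to a clinical code,
# mirroring the PCEL pattern. Database file, table, and columns are placeholders.
import sqlite3
import xml.etree.ElementTree as ET

def chd_feed(snomed_code: str) -> str:
    conn = sqlite3.connect("pcel.db")            # hypothetical resource database
    rows = conn.execute(
        "SELECT title, url, summary FROM resources WHERE snomed_code = ?",
        (snomed_code,),
    ).fetchall()

    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"PCEL resources for {snomed_code}"
    for title, url, summary in rows:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
        ET.SubElement(item, "description").text = summary
    return ET.tostring(rss, encoding="unicode")
```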
Development of Geospatial Map Based Election Portal
NASA Astrophysics Data System (ADS)
Gupta, A. Kumar Chandra; Kumar, P.; Vasanth Kumar, N.
2014-11-01
The Geospatial Delhi Limited (GSDL), a Govt. of NCT of Delhi company, was formed in order to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs such as DDA, MCD, DJB, State Election Department, DMRC etc., for the benefit of all citizens of the GNCTD. This paper describes the development of the Geospatial Map based Election Portal (GMEP) of NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for planning and management by the Department of the Chief Electoral Officer, and as an election-related information search tool (polling station, assembly and parliamentary constituency, etc.) for the citizens of NCTD. The GMEP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with a J2EE front-end on a Microsoft Windows environment. The GMEP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMEP include delimited precinct area boundaries of voter areas of polling stations, assembly constituencies, parliamentary constituencies, election districts, landmark locations of polling stations, and basic amenities (police stations, hospitals, schools, fire stations, etc.). The GMEP not only helps achieve the desired transparency and ease in the planning process but also provides efficient and effective tools for the management of elections. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.
NASA World Wind: A New Mission
NASA Astrophysics Data System (ADS)
Hogan, P.; Gaskins, T.; Bailey, J. E.
2008-12-01
Virtual Globes are well into their first generation, providing increasingly rich and beautiful visualization of more types and quantities of information. However, they are still mostly single and proprietary programs, akin to a web browser whose content and functionality are controlled and constrained largely by the browser's manufacturer. Today Google and Microsoft determine what we can and cannot see and do in these programs. NASA World Wind started out in nearly the same mode, a single program with limited functionality and information content. But as the possibilities of virtual globes became more apparent, we found that while enabling a new class of information visualization, we were also getting in the way. Many users want to provide World Wind functionality and information in their programs, not ours. They want it in their web pages. They want to include their own features. They told us that only with this kind of flexibility, could their objectives and the potential of the technology be truly realized. World Wind therefore changed its mission: from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating one program, we create components to be used in any number of programs. World Wind is NASA open source software. With the source code being fully visible, anyone can readily use it and freely extend it to serve any use. Imagery and other information provided by the World Wind servers is also free and unencumbered, including the server technology to deliver geospatial data. World Wind developers can therefore provide exclusive and custom solutions based on user needs.
MetaBar - a tool for consistent contextual data acquisition and standards compliant submission.
Hankeln, Wolfgang; Buttigieg, Pier Luigi; Fink, Dennis; Kottmann, Renzo; Yilmaz, Pelin; Glöckner, Frank Oliver
2010-06-30
Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time and depth/altitude: generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. MetaBar is a spreadsheet and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated to their samples. A preconfigured Microsoft Excel spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA Data Bank of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL) and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual data enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable exchange of data across disciplines and safeguarding of contextual data.
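A stripped-down Python sketch of the acquisition step, checking that each sample row carries the minimal contextual fields and assigning it a unique identifier that could be printed as a barcode, is given below. The CSV layout and required-field list are assumptions standing in for MetaBar's preconfigured Excel sheet.

```python
# Sketch of field-to-submission contextual data capture: validate required
# metadata per sample and assign a unique identifier. Column names are invented.
import csv
import uuid

REQUIRED = ("latitude", "longitude", "collection_date", "depth")

def load_samples(path):
    samples, problems = [], []
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            missing = [f for f in REQUIRED if not row.get(f)]
            if missing:
                problems.append((row.get("sample_name", "?"), missing))
                continue
            row["sample_id"] = uuid.uuid4().hex   # unique identifier / barcode text
            samples.append(row)
    return samples, problems
```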
MetaBar - a tool for consistent contextual data acquisition and standards compliant submission
2010-01-01
Background Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time and depth/altitude: generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. Results MetaBar is a spreadsheet and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated to their samples. A preconfigured Microsoft® Excel® spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA Data Bank of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL) and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). Conclusion The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual data enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable exchange of data across disciplines and safeguarding of contextual data. PMID:20591175
A concept to standardize raw biosignal transmission for brain-computer interfaces.
Breitwieser, Christian; Neuper, Christa; Müller-Putz, Gernot R
2011-01-01
With this concept we introduce an attempt at a standardized interface, called TiA, to transmit raw biosignals. TiA is able to deal with multirate and block-oriented data transmission. Data is distinguished by different signal types (e.g., EEG, EOG, NIRS, …), whereby those signals can be acquired at the same time from different acquisition devices. TiA follows a client-server model. Multiple clients can connect to one server. Information is exchanged via a control connection and a separate data connection. Control commands and meta information are transmitted over the control connection. Raw biosignal data is delivered using the data connection in a unidirectional way. For this purpose a standardized handshaking protocol and raw data packet have been developed. Thus, an abstraction layer between hardware devices and data processing was evolved, facilitating standardization.
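The split between the control connection and the unidirectional data connection can be sketched from the client side as below. The port numbers, the plain-text/JSON handshake, and the fixed packet header are assumptions made for the example; TiA defines its own handshaking protocol and raw data packet layout.

```python
# Client-side sketch of a TiA-like split: a control connection for meta
# information and a unidirectional data connection for raw signal packets.
# Ports and message formats are assumptions, not the TiA specification.
import json
import socket
import struct

SERVER_HOST = "127.0.0.1"
CONTROL_PORT, DATA_PORT = 9000, 9001   # hypothetical ports

def recv_exact(sock, n):
    """Read exactly n bytes from a stream socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed early")
        buf += chunk
    return buf

def connect_client():
    # control connection: bidirectional, carries commands and meta information
    ctrl = socket.create_connection((SERVER_HOST, CONTROL_PORT))
    ctrl.sendall(b"GET_METAINFO\n")
    meta = json.loads(ctrl.makefile().readline())  # e.g. {"signals": ["EEG", "EOG"]}

    # data connection: unidirectional, delivers raw biosignal packets only
    data = socket.create_connection((SERVER_HOST, DATA_PORT))
    n_samples, n_channels = struct.unpack("<II", recv_exact(data, 8))
    payload = recv_exact(data, 4 * n_samples * n_channels)
    block = struct.unpack(f"<{n_samples * n_channels}f", payload)
    return meta, block
```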
Highfield, Linda; Ottenweller, Cecelia; Pfanz, Andre; Hanks, Jeanne
2014-01-01
This article presents a case study in the redesign, development, and implementation of a web-based healthcare clinic search tool for virtual patient navigation in underserved populations in Texas. It describes the workflow, assessment of system requirements, and design and implementation of two online portals: Project Safety Net and the Breast Health Portal. The primary focus of the study was to demonstrate the use of health information technology for the purpose of bridging the gap between underserved populations and access to healthcare. A combination of interviews and focus groups was used to guide the development process. Interviewees were asked a series of questions about usage, usability, and desired features of the new system. The redeveloped system offers a multitier architecture consisting of data, business, and presentation layers. The technology used in the new portals include Microsoft .NET Framework 3.5, Microsoft SQL Server 2008, Google Maps JavaScript API v3, jQuery, Telerik RadControls (ASP.NET AJAX), and HTML. The redesigned portals have 548 registered clinics, and they have averaged 355 visits per month since their launch in late 2011, with the average user visiting five pages per visit. Usage has remained relatively constant over time, with an average of 142 new users (40 percent) each month. This study demonstrates the successful application of health information technology to improve access to healthcare and the successful adoption of the technology by targeted end users. The portals described in this study could be replicated by health information specialists in other areas of the United States to address disparities in healthcare access.
Highfield, Linda; Ottenweller, Cecelia; Pfanz, Andre; Hanks, Jeanne
2014-01-01
This article presents a case study in the redesign, development, and implementation of a web-based healthcare clinic search tool for virtual patient navigation in underserved populations in Texas. It describes the workflow, assessment of system requirements, and design and implementation of two online portals: Project Safety Net and the Breast Health Portal. The primary focus of the study was to demonstrate the use of health information technology for the purpose of bridging the gap between underserved populations and access to healthcare. A combination of interviews and focus groups was used to guide the development process. Interviewees were asked a series of questions about usage, usability, and desired features of the new system. The redeveloped system offers a multitier architecture consisting of data, business, and presentation layers. The technology used in the new portals include Microsoft .NET Framework 3.5, Microsoft SQL Server 2008, Google Maps JavaScript API v3, jQuery, Telerik RadControls (ASP.NET AJAX), and HTML. The redesigned portals have 548 registered clinics, and they have averaged 355 visits per month since their launch in late 2011, with the average user visiting five pages per visit. Usage has remained relatively constant over time, with an average of 142 new users (40 percent) each month. This study demonstrates the successful application of health information technology to improve access to healthcare and the successful adoption of the technology by targeted end users. The portals described in this study could be replicated by health information specialists in other areas of the United States to address disparities in healthcare access. PMID:24808806
Design of a Horizontal Penetrometer for Measuring On-the-Go Soil Resistance
Topakci, Mehmet; Unal, Ilker; Canakci, Murad; Celik, Huseyin Kursat; Karayel, Davut
2010-01-01
Soil compaction is one of the main negative factors that limits plant growth and crop yield. Therefore, it is important to determine the soil resistance level and map it for the field to find solutions for the negative effects of the compaction. Nowadays, high powered communication technology and computers help us on this issue within the approach of precision agriculture applications. This study is focused on the design of a penetrometer that can make instantaneous soil resistance measurements horizontally in the soil, and on data acquisition software based on GPS (Global Positioning System). The penetrometer was designed using commercial 3D parametric solid modelling design software. The data acquisition software was developed in the Microsoft Visual Basic.NET programming language. After the design of the system, manufacturing and assembly of the system were completed and then a field experiment was carried out. Using the GPS data and penetration resistance values collected in a Microsoft SQL Server database, a kriging method in ArcGIS was applied and soil resistance was mapped in the field for a soil depth of 40 cm. During operation, no faults, either in the mechanical or software parts, were seen. As a result, soil resistance values of 0.2 MPa and 3 MPa were obtained as minimum and maximum values, respectively. In conclusion, the experimental results showed that the designed system works quite well in the field and the horizontal penetrometer is a practical tool for providing on-line soil resistance measurements. This study contributes to further research for the development of on-line soil resistance measurements and mapping within precision agriculture applications. PMID:22163410
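The data-acquisition side can be reduced to a few lines: log each GPS-tagged resistance reading to a local database so it can later be interpolated (for example by kriging) into a field map. In this sketch SQLite stands in for the Microsoft SQL Server database used in the study, and the table layout and example values are invented.

```python
# Sketch of storing GPS-tagged, on-the-go soil resistance readings for later
# interpolation into a field map. SQLite stands in for SQL Server here.
import sqlite3
import time

conn = sqlite3.connect("penetrometer.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS readings ("
    " ts REAL, latitude REAL, longitude REAL, depth_cm REAL, resistance_mpa REAL)"
)

def log_reading(lat, lon, depth_cm, resistance_mpa):
    conn.execute(
        "INSERT INTO readings VALUES (?, ?, ?, ?, ?)",
        (time.time(), lat, lon, depth_cm, resistance_mpa),
    )
    conn.commit()

log_reading(36.884, 30.705, 40.0, 1.75)   # example reading
```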
Education and Outreach with the Virtual Astronomical Observatory
NASA Astrophysics Data System (ADS)
Lawton, Brandon L.; Eisenhamer, B.; Raddick, M. J.; Mattson, B. J.; Harris, J.
2012-01-01
The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. Many future missions will also be incorporated into the VAO tools when they launch. The Education and Public Outreach (E/PO) program for the VAO is led by the Space Telescope Science Institute in collaboration with the HEASARC E/PO program and Johns Hopkins University. VAO E/PO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public, formal education, and informal education communities. Our E/PO efforts will be structured to provide uniform access to VAO information, enabling educational opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that many VO programs have built powerful tools for E/PO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. We are building partnerships with Microsoft, Zooniverse, and NASA's Night Sky Network to leverage the communities and tools that already exist to meet the needs of our audiences. Our formal education program is standards-based and aims to give teachers the tools to use real astronomical data to teach the STEM subjects. To determine which tools the VAO will incorporate into the formal education program, needs assessments will be conducted with educators across the U.S.
Design of a horizontal penetrometer for measuring on-the-go soil resistance.
Topakci, Mehmet; Unal, Ilker; Canakci, Murad; Celik, Huseyin Kursat; Karayel, Davut
2010-01-01
Soil compaction is one of the main negative factors that limits plant growth and crop yield. Therefore, it is important to determine the soil resistance level and map it for the field to find solutions for the negative effects of the compaction. Nowadays, high powered communication technology and computers help us on this issue within the approach of precision agriculture applications. This study is focused on the design of a penetrometer that can make instantaneous soil resistance measurements horizontally in the soil, and on data acquisition software based on GPS (Global Positioning System). The penetrometer was designed using commercial 3D parametric solid modelling design software. The data acquisition software was developed in the Microsoft Visual Basic.NET programming language. After the design of the system, manufacturing and assembly of the system were completed and then a field experiment was carried out. Using the GPS data and penetration resistance values collected in a Microsoft SQL Server database, a kriging method in ArcGIS was applied and soil resistance was mapped in the field for a soil depth of 40 cm. During operation, no faults, either in the mechanical or software parts, were seen. As a result, soil resistance values of 0.2 MPa and 3 MPa were obtained as minimum and maximum values, respectively. In conclusion, the experimental results showed that the designed system works quite well in the field and the horizontal penetrometer is a practical tool for providing on-line soil resistance measurements. This study contributes to further research for the development of on-line soil resistance measurements and mapping within precision agriculture applications.
Morgan, K.S.; Pattyn, G.J.; Morgan, M.L.
2005-01-01
Internet mapping applications for geologic data allow simultaneous data delivery and collection, enabling quick data modification while efficiently supplying the end user with information. Utilizing Web-based technologies, the Colorado Geological Survey's Colorado Late Cenozoic Fault and Fold Database was transformed from a monothematic, nonspatial Microsoft Access database into a complex information set incorporating multiple data sources. The resulting user-friendly format supports easy analysis and browsing. The core of the application is the Microsoft Access database, which contains information compiled from available literature about faults and folds that are known or suspected to have moved during the late Cenozoic. The database contains nonspatial fields such as structure type, age, and rate of movement. Geographic locations of the fault and fold traces were compiled from previous studies at 1:250,000 scale to form a spatial database containing information such as length and strike. Integration of the two databases allowed both spatial and nonspatial information to be presented on the Internet as a single dataset (http://geosurvey.state.co.us/pubs/ceno/). The user-friendly interface enables users to view and query the data in an integrated manner, thus providing multiple ways to locate desired information. Retaining the digital data format also allows continuous data updating and quick delivery of newly acquired information. This dataset is a valuable resource to anyone interested in earthquake hazards and the activity of faults and folds in Colorado. Additional geologic hazard layers and imagery may aid in decision support and hazard evaluation. The up-to-date and customizable maps are invaluable tools for researchers or the public.
Li, J L; Deng, H; Lai, D B; Xu, F; Chen, J; Gao, G; Recker, R R; Deng, H W
2001-07-01
To efficiently manipulate large amounts of genotype data generated with fluorescently labeled dinucleotide markers, we developed a Microsoft database management system. It offers several advantages. First, it accommodates the dynamic nature of the accumulation of genotype data during the genotyping process; some data need to be confirmed or replaced by repeat lab procedures. With this system, the raw genotype data can be imported easily and continuously and incorporated into the database during the genotyping process, which may continue over an extended period of time in large projects. Second, almost all of the procedures are automatic, including auto-comparison of the raw data read by different technicians from the same gel, auto-adjustment among the allele fragment-size data from cross-runs or cross-platforms, auto-binning of alleles, and auto-compilation of genotype data for suitable programs to perform inheritance checks in pedigrees. Third, the system provides functions to track electrophoresis gel files to locate gel or sample sources for any resultant genotype data, which is extremely helpful for double-checking the consistency of raw and final data and for directing repeat experiments. In addition, the user-friendly graphic interface renders processing of large amounts of data much less labor-intensive. Furthermore, the system has built-in mechanisms to detect some genotyping errors and to assess the quality of genotype data, which are then summarized in automatically generated statistical reports. The system can easily handle >500,000 genotype data entries, a number more than sufficient for typical whole-genome linkage studies. The modules and programs we developed can be extended to other database platforms, such as Microsoft SQL Server, if the capability to handle still greater quantities of genotype data simultaneously is desired.
Ajax Architecture Implementation Techniques
NASA Astrophysics Data System (ADS)
Hussaini, Syed Asadullah; Tabassum, S. Nasira; Baig, Tabassum, M. Khader
2012-03-01
Today's rich Web applications use a mix of JavaScript and asynchronous communication with the application server. This mechanism is also known as Ajax: Asynchronous JavaScript and XML. The intent of Ajax is to exchange small pieces of data between the browser and the application server, and in doing so, use partial page refresh instead of reloading the entire Web page. AJAX (Asynchronous JavaScript and XML) is a powerful Web development model for browser-based Web applications. Technologies that form the AJAX model, such as XML, JavaScript, HTTP, and XHTML, are individually widely used and well known. However, AJAX combines these technologies to let Web pages retrieve small amounts of data from the server without having to reload the entire page. This capability makes Web pages more interactive and lets them behave like local applications. Web 2.0, enabled by the Ajax architecture, has given rise to a new level of user interactivity through web browsers. Many new and extremely popular Web applications have been introduced, such as Google Maps, Google Docs, Flickr, and so on. Ajax toolkits such as Dojo allow web developers to build Web 2.0 applications quickly and with little effort.
2008-07-01
also a large Internet service provider and an operator of two of the 13 root zone servers that provide the basic information for locating Internet ...routing and address information to assure continued connectivity and speed? In addition, exchange point technology needs to be improved and there are...alternative technology will come along that will make the Internet outmoded in the same way the Internet has begun to make the Public Switched Telephone
Wang, Chunliang; Ritter, Felix; Smedby, Orjan
2010-07-01
To enhance the functional expandability of a picture archiving and communication system (PACS) workstation and to facilitate the integration of third-party image-processing modules, we propose a browser-server style method. In the proposed solution, the PACS workstation shows the front-end user interface defined in an XML file while the image processing software is running in the background as a server. Inter-process communication (IPC) techniques allow an efficient exchange of image data, parameters, and user input between the PACS workstation and stand-alone image-processing software. Using a predefined communication protocol, the PACS workstation developer or image processing software developer does not need detailed information about the other system, but will still be able to achieve seamless integration between the two systems, and the IPC procedure is totally transparent to the final user. A browser-server style solution was built between OsiriX (PACS workstation software) and MeVisLab (image-processing software). Ten example image-processing modules were easily added to OsiriX by converting existing MeVisLab image processing networks. Image data transfer using shared memory added <10 ms of processing time while the other IPC methods cost 1-5 s in our experiments. The browser-server style communication based on IPC techniques is an appealing method that allows PACS workstation developers and image processing software developers to cooperate while focusing on different interests.
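The timing result above hinges on passing image buffers through shared memory rather than copying them over a socket. The following sketch uses Python's multiprocessing.shared_memory purely to illustrate that idea; it is not the OsiriX/MeVisLab protocol, and the block name and image shape are hypothetical.

```python
# Minimal sketch of shared-memory image exchange between two cooperating processes.
from multiprocessing import shared_memory
import numpy as np

# "Server" side: publish a 512x512 16-bit image slice into a named block.
image = np.random.randint(0, 4096, size=(512, 512), dtype=np.uint16)
shm = shared_memory.SharedMemory(create=True, size=image.nbytes, name="slice0")
np.ndarray(image.shape, dtype=image.dtype, buffer=shm.buf)[:] = image

# "Client" side (normally another process): attach by name and read the same bytes
# without any serialization or network copy.
view = shared_memory.SharedMemory(name="slice0")
received = np.ndarray((512, 512), dtype=np.uint16, buffer=view.buf).copy()
view.close()

shm.close()
shm.unlink()            # release the block once both sides are finished
assert np.array_equal(image, received)
```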
Video streaming technologies using ActiveX and LabVIEW
NASA Astrophysics Data System (ADS)
Panoiu, M.; Rat, C. L.; Panoiu, C.
2015-06-01
The goal of this paper is to present the possibilities of remote image processing through data exchange between two programming technologies: LabVIEW and ActiveX. ActiveX refers to the process of controlling one program from another via an ActiveX component, where one program acts as the client and the other as the server. LabVIEW can be either client or server. Both programs (client and server) exist independently of each other but are able to share information. The client communicates with the ActiveX objects that the server exposes to allow the sharing of information [7]. In the case of video streaming [1] [2], most ActiveX controls can only display the data, being incapable of transforming it into a data type that LabVIEW can process. This becomes problematic when the system is used for remote image processing. The LabVIEW environment itself provides few, if any, capabilities for video streaming, and the methods it does offer are usually not high performance, but it possesses high-performance toolkits and modules specialized in image processing, making it ideal for processing the captured data. Therefore, we chose to use existing software specialized in video streaming along with LabVIEW and to capture the data it provides for further use within LabVIEW. The software we studied (the ActiveX controls of a series of media players that utilize streaming technology) provides high-quality data and a very small transmission delay, ensuring the reliability of the image-processing results.
Wrapping SRS with CORBA: from textual data to distributed objects.
Coupaye, T
1999-04-01
Biological data come in very different shapes. Databanks are maintained and used by distinct organizations. Text is the de facto Standard exchange format. The SRS system can integrate heterogeneous textual databanks but it was lacking a way to structure the extracted data. This paper presents a CORBA interface to the SRS system which manages databanks in a flat file format. SRS Object Servers are CORBA wrappers for SRS. They allow client applications (visualisation tools, data mining tools, etc.) to access and query SRS servers remotely through an Object Request Broker (ORB). They provide loader objects that contain the information extracted from the databanks by SRS. Loader objects are not hard-coded but generated in a flexible way by using loader specifications which allow SRS administrators to package data coming from distinct databanks. The prototype may be available for beta-testing. Please contact the SRS group (http://srs.ebi.ac.uk).
What's New with MS Office Suites
ERIC Educational Resources Information Center
Goldsborough, Reid
2012-01-01
If one buys a new PC, laptop, or netbook computer today, it probably comes preloaded with Microsoft Office 2010 Starter Edition. This is a significantly limited, advertising-laden version of Microsoft's suite of productivity programs, Microsoft Office. This continues the trend of PC makers providing ever more crippled versions of Microsoft's…
Utilizing Microsoft Mathematics in Teaching and Learning Calculus
ERIC Educational Resources Information Center
Oktaviyanthi, Rina; Supriani, Yani
2015-01-01
The experimental design was conducted to investigate the use of Microsoft Mathematics, free software made by Microsoft Corporation, in teaching and learning Calculus. This paper reports results from experimental study details on implementation of Microsoft Mathematics in Calculus, students' achievement and the effects of the use of Microsoft…
Experimental Design: Utilizing Microsoft Mathematics in Teaching and Learning Calculus
ERIC Educational Resources Information Center
Oktaviyanthi, Rina; Supriani, Yani
2015-01-01
The experimental design was conducted to investigate the use of Microsoft Mathematics, free software made by Microsoft Corporation, in teaching and learning Calculus. This paper reports results from experimental study details on implementation of Microsoft Mathematics in Calculus, students' achievement and the effects of the use of Microsoft…
2015-01-01
class within Microsoft Visual Studio. It has been tested on and is compatible with Microsoft Vista, 7, and 8 and Visual Studio Express 2008...the ScreenRecorder utility assumes a basic understanding of compiling and running C++ code within Microsoft Visual Studio. This report does not...of Microsoft Visual Studio, the ScreenRecorder utility was developed as a C++ class that can be compiled as a library (static or dynamic) to be
Microsoft Biology Initiative: .NET Bioinformatics Platform and Tools
Diaz Acosta, B.
2011-01-01
The Microsoft Biology Initiative (MBI) is an effort in Microsoft Research to bring new technology and tools to the area of bioinformatics and biology. This initiative is comprised of two primary components, the Microsoft Biology Foundation (MBF) and the Microsoft Biology Tools (MBT). MBF is a language-neutral bioinformatics toolkit built as an extension to the Microsoft .NET Framework—initially aimed at the area of Genomics research. Currently, it implements a range of parsers for common bioinformatics file formats; a range of algorithms for manipulating DNA, RNA, and protein sequences; and a set of connectors to biological web services such as NCBI BLAST. MBF is available under an open source license, and executables, source code, demo applications, documentation and training materials are freely downloadable from http://research.microsoft.com/bio. MBT is a collection of tools that enable biology and bioinformatics researchers to be more productive in making scientific discoveries.
NASA Astrophysics Data System (ADS)
Telaga, Abdi Suryadinata; Hartanto, Indra Dwi; Audina, Debby Rizky; Prabowo, Fransiscus Dimas
2017-06-01
Environmental awareness, stringent regulation, and soaring energy costs together make energy efficiency an important pillar for every company. In particular, the Ministry of Energy and Mineral Resources of Indonesia has set a target of reducing carbon emissions by 26% by 2020. For that reason, companies in Indonesia have to comply with the emission target. However, there is a trade-off between a company's productivity and its carbon emissions, so productivity must be weighed against environmental effects such as carbon emissions. Identifying excessive energy use in a company is still challenging: companies rarely have staff capable of auditing the energy they consume, and auditing energy consumption is a lengthy and time-consuming process. PT Astra International (AI) has 220 affiliated companies (AFFCOs), so direct visits to audit energy consumption in the AFFCOs are occasionally inevitable; however, the capacity to conduct on-site energy audits was limited by the availability of PT AI energy auditors. For that reason, PT AI developed a set of energy audit tools, the Astra Green Energy (AGEn) tools, to help each AFFCO's own auditors audit energy use in their company. A fishbone chart was developed as an analysis tool to identify the root causes of the energy audit problem. Following the analysis results, PT AI made an improvement by developing an AGEn web-based system that helps AFFCOs conduct on-site energy audits. The system was developed using a prototyping methodology, object-oriented system analysis and design (OOSAD), and a three-tier architecture, and was implemented with ASP.NET, a Microsoft SQL Server 2012 database, and the IIS 8 web server.
Hutten, Helmut; Stiegmaier, Wolfgang; Rauchegger, Günter
2005-09-01
Modern lifestyles require new methods for individual lifelong learning, based on access at any time and from any place. This fundamental requirement is met by the Internet, whose technology promises increasing potential for e-learning or tele-learning. Special requirements include password-controlled access, applicability to most commercially available PCs and laptops equipped with standard software (Microsoft Internet Explorer 6.0), central evaluation of the students' performance, inclusion of an examination part, and provision of a picture gallery and a comprehensive glossary accessible in the learning mode. The KISS shell (Kontrolliertes Intelligentes Selbstgesteuertes Studium, KISS) has been developed based on the Oracle 10g application server in combination with a relational database (Oracle 8i) on the server side and a web-browser-based interface using JavaScript for user control of data input on the client side. The first tutorial application has been realized with a chapter about cardiac pacemakers. The weight of that chapter (or module) is about 2 ECTS credits (i.e., the equivalent of 30 working hours; European Credit Transfer System, ECTS). The internal structure of the chapter is organized in sequential mode. It consists of five main sections, each subdivided into five subsections of comparable length. Progression from one subsection to the next is possible only after successfully passing the respective examination. The whole learning programme with the pacemaker chapter has been evaluated by 10 students. The system will be presented together with first experiences, including the evaluation results. Until now the programme has not been used for training purposes.
AWIPS II in the University Community: Unidata's efforts and capabilities of the software
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan; James, Michael
2015-04-01
The Advanced Weather Interactive Processing System, version II (AWIPS II) is a weather forecasting, display, and analysis tool that is used by the National Oceanic and Atmospheric Administration/National Weather Service (NOAA/NWS) and the National Centers for Environmental Prediction (NCEP) to ingest, analyze, and disseminate operational weather data. The AWIPS II software is built on a Service Oriented Architecture, takes advantage of open source software, and its design affords expandability, flexibility, and portability. Since many university meteorology programs are eager to use the same tools used by NWS forecasters, Unidata community interest in AWIPS II is high. The Unidata Program Center (UPC) has worked closely with NCEP staff during AWIPS II development in order to devise a way to make it available to the university community. The Unidata AWIPS II software was released in beta form in 2014, and it incorporates a number of key changes to the baseline U.S. National Weather Service release to process and display additional data formats and run all components in a single-server standalone configuration. In addition to making available open-source instances of the software libraries that can be downloaded and run at any university, Unidata has also deployed the data-server side of AWIPS II, known as EDEX, in the Amazon Web Services and Microsoft Azure cloud environments. In this setup, universities receive all of the data from remote cloud instances, while they only have to run the AWIPS II client, known as CAVE, to analyze and visualize the data. In this presentation, we will describe Unidata's AWIPS II efforts, including the capabilities of the software in visualizing many different types of real-time meteorological data and its myriad uses in university and other settings.
Microsoft in Southeast Europe: A Conversation with Goran Radman
ERIC Educational Resources Information Center
Pendergast, William; Frayne, Colette; Kelley, Patricia
2009-01-01
Goran Radman (GR) joined Microsoft in 1996 and served until Fall 2008 as Microsoft Chairman, Southeast Europe (SEE) and Chairman, East and Central Europe (ECEE). Based in Croatia, where he enjoys sailing the Adriatic coast and islands, he spoke with the authors during 2008 and 2009 about his experience launching Microsoft's commercial presence in…
Microsoft's Tom Corddry on Multimedia, the Information Superhighway and the Future of Online.
ERIC Educational Resources Information Center
Herther, Nancy K.
1994-01-01
Tom Corddry, Microsoft Corporation's Creative Director for the Consumer Division, is interviewed about the Microsoft Home line of products and the development of related CD-ROM and multimedia products. Reasons for Microsoft's entry into the content market and its challenges, the market's future, and the company's interest in developing online…
Huang, Ji-yan; Zhao, Hou-ming; Zhou, Hai-wen
2014-04-01
To construct a database and a tissue bank of oral mucosa precancerous lesions and to estimate their application value. Patients in the Yangtze delta suffering from oral mucosa precancerous lesions were enrolled in this study. The patients' clinical data and samples of oral precancerous mucosa, saliva, and blood were collected to create a tissue bank, based on which a database was constructed using Microsoft Access software, a Browser/Server structure, and the ASP language. The tissue bank and database of oral mucosa precancerous lesions were successfully built. The procedures to harvest, store, and transport the samples were standardized. The database has a good interactive interface and is convenient for data collection, querying, and sharing over the Internet. We constructed the tissue bank and database of oral mucosa precancerous lesions for the first time, which not only helps preserve the biological resource of oral mucosa precancerous lesions, but also provides enormous convenience in clinical work, research, and teaching. Supported by the Research Fund of the Science and Technology Committee of Shanghai Municipality (08ZR1416700).
Enhanced, Partially Redundant Emergency Notification System
NASA Technical Reports Server (NTRS)
Pounds, Clark D.
2005-01-01
The Johnson Space Center Emergency Notification System (JENS) software utilizes pre-existing computation and communication infrastructure to augment a prior variable-tone, siren-based, outdoor alarm system, in order to enhance the ability to give notice of emergencies to employees working in multiple buildings. The JENS software includes a component that implements an administrative Web site. Administrators can grant and deny access to the administrative site and to an originator Web site that enables authorized individuals to quickly compose and issue alarms. The originator site also facilitates maintenance and review of alarms already issued. A custom client/server application program enables an originator to notify every user who is logged in on a Microsoft Windows-based desktop computer by means of a pop-up message that interrupts, but does not disrupt, the user's work. Alternatively or in addition, the originator can send an alarm message to recipients on an e-mail distribution list and/or can post the notice on an internal Web site. An alarm message can consist of (1) text describing the emergency and suggesting a course of action and (2) a replica of the corresponding audible outdoor alarm.
A spatial-temporal system for dynamic cadastral management.
Nan, Liu; Renyi, Liu; Guangliang, Zhu; Jiong, Xie
2006-03-01
A practical spatio-temporal database (STDB) technique for dynamic urban land management is presented. One of the STDB models, the expanded model of Base State with Amendments (BSA), is selected as the basis for developing the dynamic cadastral management technique. Two approaches, Section Fast Indexing (SFI) and Storage Factors of Variable Granularity (SFVG), are used to improve the efficiency of the BSA model. Both spatial graphic data and attribute data, through a succinct engine, are stored in standard relational database management systems (RDBMS) for the actual implementation of the BSA model. The spatio-temporal database is divided into three interdependent sub-databases: the present DB, the history DB, and the procedures-tracing DB. The efficiency of database operation is improved by the database connection in the bottom layer of Microsoft SQL Server. The spatio-temporal system can be provided at low cost while satisfying the basic needs of urban land management in China. The approaches presented in this paper may also be of significance to countries where land patterns change frequently or to agencies where financial resources are limited.
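The Base State with Amendments idea can be summarized with a short sketch: the database holds one full "base state" snapshot plus a time-ordered list of amendments, and any historical state is reconstructed by replaying the amendments recorded up to the query date. This is illustrative only (the field names and dates are hypothetical), not the paper's actual schema.

```python
# Minimal sketch of Base State with Amendments (BSA) state reconstruction.
from datetime import date

base_state = {  # parcel_id -> attributes at the base date
    "P-001": {"owner": "Chen", "area_m2": 420.0},
    "P-002": {"owner": "Liu", "area_m2": 615.5},
}
amendments = [  # (effective_date, parcel_id, changed_fields), ordered by date
    (date(2004, 3, 1), "P-001", {"owner": "Wang"}),
    (date(2005, 7, 9), "P-002", {"area_m2": 598.0}),
]

def state_at(query_date):
    """Reconstruct the cadastre as it stood on query_date."""
    state = {pid: attrs.copy() for pid, attrs in base_state.items()}
    for effective, pid, changes in amendments:
        if effective <= query_date:
            state[pid].update(changes)   # replay each amendment in order
    return state

print(state_at(date(2004, 12, 31)))  # P-001 already transferred to Wang, P-002 unchanged
```

Indexing schemes such as SFI then exist precisely to avoid replaying the full amendment list for every query.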
A practical approach for inexpensive searches of radiology report databases.
Desjardins, Benoit; Hamilton, R Curtis
2007-06-01
We present a method to perform full-text searches of radiology reports for the large number of departments that do not have this ability as part of their radiology or hospital information system. A tool written in Microsoft Access (front-end) has been designed to search a server (back-end) containing the indexed weekly backup copy of the full relational database extracted from a radiology information system (RIS). This front-end/back-end approach has been implemented in a large academic radiology department, and is used for teaching, research, and administrative purposes. The weekly second backup of the 80 GB, 4 million record RIS database takes 2 hours. Further indexing of the exported radiology reports takes 6 hours. Individual searches typically take less than 1 minute on the indexed database and 30-60 minutes on the nonindexed database. Guidelines to properly address privacy and institutional review board issues are closely followed by all users. This method has potential to improve teaching, research, and administrative programs within radiology departments that cannot afford more expensive technology.
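The minute-versus-hour difference reported above comes from querying a pre-built full-text index rather than scanning raw report text. The sketch below shows the same idea with SQLite's FTS5 extension (assuming the bundled SQLite was compiled with FTS5); it is a generic illustration, not the Access/RIS setup described in the abstract, and the sample reports are invented.

```python
# Generic sketch of indexed full-text search over exported report text.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE reports USING fts5(accession, report_text)")
conn.executemany(
    "INSERT INTO reports VALUES (?, ?)",
    [
        ("A001", "CT abdomen: no evidence of pulmonary embolism."),
        ("A002", "MR angiography demonstrates a 4 mm aneurysm of the basilar tip."),
    ],
)

# An indexed MATCH query stays fast even on millions of rows, whereas a LIKE scan
# of an unindexed copy grows linearly with the table size.
for (accession,) in conn.execute(
    "SELECT accession FROM reports WHERE reports MATCH 'aneurysm'"
):
    print(accession)   # -> A002
```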
Biographer: web-based editing and rendering of SBGN compliant biochemical networks.
Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas
2013-06-01
The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows and Linux based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL
Web catalog of oceanographic data using GeoNetwork
NASA Astrophysics Data System (ADS)
Marinova, Veselka; Stefanov, Asen
2017-04-01
Most of the data collected, analyzed, and used by the Bulgarian Oceanographic Data Center (BgODC) from scientific cruises, Argo floats, ferry boxes, and real-time operating systems are spatially oriented and need to be displayed on a map. The challenge is to make spatial information more accessible to users, decision makers, and scientists. In order to meet this challenge, BgODC concentrates its efforts on improving dynamic and standardized access to its geospatial data as well as that of various related organizations and institutions. BgODC is currently implementing a project to create a geospatial portal for distributing metadata and for searching, exchanging, and harvesting spatial data. Many open-source software solutions are able to create such a spatial data infrastructure (SDI). Finally, GeoNetwork opensource was chosen, as it is already widespread; it is a free and effective solution for implementing an SDI at the organization level, is platform independent, and runs under many operating systems. Filling the catalog goes through these practical steps: (1) managing and storing data reliably within an MS SQL spatial database; (2) registering maps and data of various formats and sources in GeoServer (the most popular open-source geospatial server, bundled with GeoNetwork); and (3) adding metadata and publishing geospatial data from the GeoNetwork desktop. GeoServer and GeoNetwork are based on Java, so they require a servlet engine such as Tomcat. The experience gained from the use of GeoNetwork opensource confirms that the catalog meets the requirements for data management and is flexible enough to customize. Building the catalog facilitates sustainable data exchange between end users and is a big step towards implementation of the INSPIRE directive, owing to the availability of many features necessary for producing INSPIRE-compliant metadata records. The catalog now contains all available GIS data provided by BgODC for Internet access. Searching data within the catalog is based upon geographic extent, theme type, and free-text search.
Method of Performance-Aware Security of Unicast Communication in Hybrid Satellite Networks
NASA Technical Reports Server (NTRS)
Baras, John S. (Inventor); Roy-Chowdhury, Ayan (Inventor)
2014-01-01
A method and apparatus utilizes Layered IPSEC (LES) protocol as an alternative to IPSEC for network-layer security including a modification to the Internet Key Exchange protocol. For application-level security of web browsing with acceptable end-to-end delay, the Dual-mode SSL protocol (DSSL) is used instead of SSL. The LES and DSSL protocols achieve desired end-to-end communication security while allowing the TCP and HTTP proxy servers to function correctly.
2013-06-01
Communication Applet) UNIGE – D.I.M.E. Using a free application as “MIT APP Inventor” Android Software Development Kit DEGRADED C2 ICCRTS 2013...operate on an Android operating system up-gradable on which will be developed a simplified ACA ( Android Communication Applet) that will call C24U...Server) IP number . . . Portable COTS Devices ACA - C24U ( Android Communication Applet) Sending/receiving SEFL (Simple Exchange
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-25
... were created, such as Microsoft Excel, Microsoft Word, or Microsoft PowerPoint (``native format'')? We... (condensed) or expanded (detailed) format Export search results to Excel or PDF As noted above, system is...., Microsoft Word ``.doc'' format or non-copy protected text- searchable ``.pdf'' format)? Should submissions...
Layered virus protection for the operations and administrative messaging system
NASA Technical Reports Server (NTRS)
Cortez, R. H.
2002-01-01
NASA's Deep Space Network (DSN) is critical in supporting the wide variety of operating and planned unmanned flight projects. For day-to-day operations it relies on email communication between the three Deep Space Communication Complexes (Canberra, Goldstone, Madrid) and NASA's Jet Propulsion Laboratory. The Operations & Administrative Messaging system, based on the Microsoft Windows NT and Exchange platform, provides the infrastructure that is required for reliable, mission-critical messaging. The reliability of this system, however, is threatened by the proliferation of email viruses that continue to spread at alarming rates. A layered approach to email security has been implemented across the DSN to protect against this threat.
AutoCAD-To-GIFTS Translator Program
NASA Technical Reports Server (NTRS)
Jones, Andrew
1989-01-01
AutoCAD-to-GIFTS translator program, ACTOG, developed to facilitate quick generation of small finite-element models using CASA/GIFTS finite-element modeling program. Reads geometric data of drawing from Data Exchange File (DXF) used in AutoCAD and other PC-based drafting programs. Geometric entities recognized by ACTOG include points, lines, arcs, solids, three-dimensional lines, and three-dimensional faces. From this information, ACTOG creates GIFTS SRC file, which is then read into GIFTS preprocessor BULKM, or modified and read into EDITM, to create finite-element model. SRC file used as is or edited for any number of uses. Written in Microsoft QuickBASIC (Version 2.0).
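ACTOG itself is a QuickBASIC program; the sketch below only illustrates its first step, extracting DXF geometric entities, using the third-party Python package ezdxf. The input filename is hypothetical, and mapping the entities onto GIFTS SRC keypoints and edges is left as a comment.

```python
# Illustrative sketch: read POINT, LINE, and ARC entities from a DXF drawing.
import ezdxf

doc = ezdxf.readfile("model.dxf")     # hypothetical drawing exported from AutoCAD
msp = doc.modelspace()

for pt in msp.query("POINT"):
    print("point", tuple(pt.dxf.location))

for line in msp.query("LINE"):
    print("line", tuple(line.dxf.start), "->", tuple(line.dxf.end))

for arc in msp.query("ARC"):
    print("arc centred at", tuple(arc.dxf.center),
          "r =", arc.dxf.radius,
          "from", arc.dxf.start_angle, "to", arc.dxf.end_angle, "degrees")

# A translator in the spirit of ACTOG would next map these entities onto
# finite-element keypoints and edges and write them out in the target SRC format.
```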
ERIC Educational Resources Information Center
Butler, E. Sonny
Much of what librarians do today requires adeptness in creating and manipulating databases. Many new computers bought by libraries every year come packaged with Microsoft Office and include Microsoft Access. This database program features a seamless interface between Microsoft Office's other programs like Word, Excel, and PowerPoint. This book…
Lee, Tian-Fu
2014-12-01
Telecare medicine information systems provide a communicating platform for accessing remote medical resources through public networks, and help health care workers and medical personnel rapidly make correct clinical decisions and treatments. An authentication scheme for data exchange in telecare medicine information systems enables legal users in hospitals and medical institutes to establish a secure channel and exchange electronic medical records or electronic health records securely and efficiently. This investigation develops an efficient and secure verifier-based three-party authentication scheme using extended chaotic maps for data exchange in telecare medicine information systems. The proposed scheme does not require servers' public keys and avoids the time-consuming modular exponentiation and elliptic-curve scalar multiplications used in previous related approaches. Additionally, the proposed scheme is proven secure in the random oracle model and realizes the lower bounds on messages and rounds in communications. Compared with related verifier-based approaches, the proposed scheme not only possesses higher security, but also has lower computational cost and fewer transmissions. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
CDC Vital Signs: Adult Smoking among People with Mental Illness
Efficiently Distributing Component-based Applications Across Wide-Area Environments
2002-01-01
a variety of sophisticated network-accessible services such as e-mail, banking, on-line shopping, entertainment, and serving as a data exchange...product database; Customer, serves as a façade to Order and Account; Stateful Session Beans: ShoppingCart, maintains list of items to be bought by customer...Pet Store tests; and JBoss 3.0.3 with Jetty 4.1.0, for the RUBiS tests) and a single database server (Oracle 8.1.7 Enterprise Edition), each running
CSlib, a library to couple codes via Client/Server messaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plimpton, Steve
The CSlib is a small, portable library which enables two (or more) independent simulation codes to be coupled by exchanging messages with each other. Both codes link to the library when they are built, and can then communicate with each other as they run. The messages contain data or instructions that the two codes send back and forth to each other. The messaging can take place via files, sockets, or MPI. The latter is a standard distributed-memory message-passing library.
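The coupling pattern, one code acting as server and the other as client while tagged messages flow back and forth, can be sketched with plain TCP sockets. This is not the CSlib API (whose exact calls are not shown in this record), just a generic illustration; the port number, message tags, and payload are hypothetical.

```python
# Generic client/server message exchange between two cooperating codes.
import json
import socket
import threading

srv = socket.create_server(("127.0.0.1", 5555))   # bind/listen before the client connects

def serve_once():
    conn, _ = srv.accept()
    with conn:
        request = json.loads(conn.recv(4096).decode())          # e.g. {"id": "coords", ...}
        reply = {"id": "energy", "value": sum(request["coords"])}  # stand-in computation
        conn.sendall(json.dumps(reply).encode())

threading.Thread(target=serve_once, daemon=True).start()

# The "client" code sends a tagged message and blocks until the reply arrives.
with socket.create_connection(("127.0.0.1", 5555)) as cli:
    cli.sendall(json.dumps({"id": "coords", "coords": [0.1, 0.2, 0.3]}).encode())
    print(json.loads(cli.recv(4096).decode()))   # -> {'id': 'energy', 'value': 0.6...}
srv.close()
```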
DOE Office of Scientific and Technical Information (OSTI.GOV)
The open source Project Haystack initiative defines metadata and communication standards related to data from buildings and intelligent devices. The Project Haystack REST API defines standard formats and operations for exchanging Haystack-tagged data over HTTP. The HaystackRuby gem wraps calls to this REST API to enable Ruby applications to easily integrate data hosted on a Project Haystack-compliant server. The HaystackRuby gem was developed at the National Renewable Energy Lab to support applications related to campus energy. We hope that this tool may be useful to others.
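For readers unfamiliar with the Haystack REST API, a client call looks roughly like the sketch below (in Python with the requests package rather than Ruby). The server URL and filter expression are hypothetical, and the Accept header asks for the JSON grid encoding instead of the default Zinc format.

```python
# Sketch of a Haystack REST "read" operation against a hypothetical server.
import requests

BASE = "https://haystack.example.org/api"           # hypothetical Haystack endpoint

resp = requests.get(
    f"{BASE}/read",
    params={"filter": 'point and kind=="Number"'},  # Haystack filter expression
    headers={"Accept": "application/json"},         # ask for a JSON grid
    timeout=10,
)
resp.raise_for_status()
grid = resp.json()                                  # {"meta": ..., "cols": ..., "rows": ...}
for row in grid.get("rows", []):
    print(row.get("id"), row.get("dis"))
```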
2013-09-01
Malicious Activity Simulation Tool MMORPG Massively Multiplayer Online Role-Playing Game MMS Mission Management Server MOA Memorandum of Agreement MS...conferencing, and massively multiplayer online role- playing games (MMORPG). During all of these Internet-based exchanges and transactions, the Internet user...In its 2011 Internet Crime Report, the Internet Crime Complaint Center (IC3) stated there were more than 300,000 complaints of online criminal
ERIC Educational Resources Information Center
Cor, Ken; Alves, Cecilia; Gierl, Mark J.
2008-01-01
This review describes and evaluates a software add-in created by Frontline Systems, Inc., that can be used with Microsoft Excel 2007 to solve large, complex test assembly problems. The combination of Microsoft Excel 2007 with the Frontline Systems Premium Solver Platform is significant because Microsoft Excel is the most commonly used spreadsheet…
MyDas, an Extensible Java DAS Server
Jimenez, Rafael C.; Quinn, Antony F.; Jenkinson, Andrew M.; Mulder, Nicola; Martin, Maria; Hunter, Sarah; Hermjakob, Henning
2012-01-01
A large number of diverse, complex, and distributed data resources are currently available in the Bioinformatics domain. The pace of discovery and the diversity of information means that centralised reference databases like UniProt and Ensembl cannot integrate all potentially relevant information sources. From a user perspective however, centralised access to all relevant information concerning a specific query is essential. The Distributed Annotation System (DAS) defines a communication protocol to exchange annotations on genomic and protein sequences; this standardisation enables clients to retrieve data from a myriad of sources, thus offering centralised access to end-users. We introduce MyDas, a web server that facilitates the publishing of biological annotations according to the DAS specification. It deals with the common functionality requirements of making data available, while also providing an extension mechanism in order to implement the specifics of data store interaction. MyDas allows the user to define where the required information is located along with its structure, and is then responsible for the communication protocol details. PMID:23028496
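To make the DAS protocol concrete, the sketch below issues a "features" request of the kind a DAS client would send to a server such as MyDas and parses the XML reply. The server URL, data-source name, and segment are hypothetical; the element and attribute names follow the DAS 1.53 feature document format.

```python
# Sketch of a DAS client fetching annotations for a protein segment.
import requests
import xml.etree.ElementTree as ET

SERVER = "http://das.example.org/das"          # hypothetical DAS server base URL
SOURCE = "uniprot-features"                    # hypothetical data-source name

resp = requests.get(f"{SERVER}/{SOURCE}/features",
                    params={"segment": "P05067:1,200"}, timeout=10)
resp.raise_for_status()

root = ET.fromstring(resp.text)
for feature in root.iter("FEATURE"):
    label = feature.get("label") or feature.get("id")
    start = feature.findtext("START")
    end = feature.findtext("END")
    print(label, start, end)
```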
MyDas, an extensible Java DAS server.
Salazar, Gustavo A; García, Leyla J; Jones, Philip; Jimenez, Rafael C; Quinn, Antony F; Jenkinson, Andrew M; Mulder, Nicola; Martin, Maria; Hunter, Sarah; Hermjakob, Henning
2012-01-01
A large number of diverse, complex, and distributed data resources are currently available in the Bioinformatics domain. The pace of discovery and the diversity of information means that centralised reference databases like UniProt and Ensembl cannot integrate all potentially relevant information sources. From a user perspective however, centralised access to all relevant information concerning a specific query is essential. The Distributed Annotation System (DAS) defines a communication protocol to exchange annotations on genomic and protein sequences; this standardisation enables clients to retrieve data from a myriad of sources, thus offering centralised access to end-users. We introduce MyDas, a web server that facilitates the publishing of biological annotations according to the DAS specification. It deals with the common functionality requirements of making data available, while also providing an extension mechanism in order to implement the specifics of data store interaction. MyDas allows the user to define where the required information is located along with its structure, and is then responsible for the communication protocol details.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... form; (2) a Microsoft Excel Workbook; (3) a Microsoft Word Narrative template; and (4) other mandatory attachments. (Applicants must use the Microsoft Word Narrative template the CDFI Fund provides; alternative...
DASMI: exchanging, annotating and assessing molecular interaction data.
Blankenburg, Hagen; Finn, Robert D; Prlić, Andreas; Jenkinson, Andrew M; Ramírez, Fidel; Emig, Dorothea; Schelhorn, Sven-Eric; Büch, Joachim; Lengauer, Thomas; Albrecht, Mario
2009-05-15
Ever-increasing amounts of biological interaction data are being accumulated worldwide, but they are currently not readily accessible to the biologist at a single site. New techniques are required for retrieving, sharing and presenting data spread over the Internet. We introduce the DASMI system for the dynamic exchange, annotation and assessment of molecular interaction data. DASMI is based on the widely used Distributed Annotation System (DAS) and consists of a data exchange specification, web servers for providing the interaction data and clients for data integration and visualization. The decentralized architecture of DASMI affords the online retrieval of the most recent data from distributed sources and databases. DASMI can also be extended easily by adding new data sources and clients. We describe all DASMI components and demonstrate their use for protein and domain interactions. The DASMI tools are available at http://www.dasmi.de/ and http://ipfam.sanger.ac.uk/graph. The DAS registry and the DAS 1.53E specification are found at http://www.dasregistry.org/.
Development of mobile platform integrated with existing electronic medical records.
Kim, YoungAh; Kim, Sung Soo; Kang, Simon; Kim, Kyungduk; Kim, Jun
2014-07-01
This paper describes a mobile Electronic Medical Record (EMR) platform designed to manage and utilize the existing EMR and mobile application with optimized resources. We structured the mEMR to reuse services of retrieval and storage in mobile app environments that have already proven to have no problem working with EMRs. A new mobile architecture-based mobile solution was developed in four steps: the construction of a server and its architecture; screen layout and storyboard making; screen user interface design and development; and a pilot test and step-by-step deployment. This mobile architecture consists of two parts, the server-side area and the client-side area. In the server-side area, it performs the roles of service management for EMR and documents and for information exchange. Furthermore, it performs menu allocation depending on user permission and automatic clinical document architecture document conversion. Currently, Severance Hospital operates an iOS-compatible mobile solution based on this mobile architecture and provides stable service without additional resources, dealing with dynamic changes of EMR templates. The proposed mobile solution should go hand in hand with the existing EMR system, and it can be a cost-effective solution if a quality EMR system is operated steadily with this solution. Thus, we expect this example to be shared with hospitals that currently plan to deploy mobile solutions.
Development of Mobile Platform Integrated with Existing Electronic Medical Records
Kim, YoungAh; Kang, Simon; Kim, Kyungduk; Kim, Jun
2014-01-01
Objectives This paper describes a mobile Electronic Medical Record (EMR) platform designed to manage and utilize the existing EMR and mobile application with optimized resources. Methods We structured the mEMR to reuse services of retrieval and storage in mobile app environments that have already proven to have no problem working with EMRs. A new mobile architecture-based mobile solution was developed in four steps: the construction of a server and its architecture; screen layout and storyboard making; screen user interface design and development; and a pilot test and step-by-step deployment. This mobile architecture consists of two parts, the server-side area and the client-side area. In the server-side area, it performs the roles of service management for EMR and documents and for information exchange. Furthermore, it performs menu allocation depending on user permission and automatic clinical document architecture document conversion. Results Currently, Severance Hospital operates an iOS-compatible mobile solution based on this mobile architecture and provides stable service without additional resources, dealing with dynamic changes of EMR templates. Conclusions The proposed mobile solution should go hand in hand with the existing EMR system, and it can be a cost-effective solution if a quality EMR system is operated steadily with this solution. Thus, we expect this example to be shared with hospitals that currently plan to deploy mobile solutions. PMID:25152837
Huang, Ean-Wen; Hung, Rui-Suan; Chiou, Shwu-Fen; Liu, Fei-Ying; Liou, Der-Ming
2011-01-01
Information and communication technologies progress rapidly, and many novel applications have been developed in many domains of human life. In recent years, the demand for healthcare services has been growing because of the increase in the elderly population. Consequently, a number of healthcare institutions have focused on creating technologies to reduce extraneous work and improve the quality of service. In this study, an information platform for tele-healthcare services was implemented. The architecture of the platform included a web-based application server and a client system. The client system was able to retrieve the blood pressure and glucose levels of a patient stored in measurement instruments through Bluetooth wireless transmission. The web application server assisted staff and clients in analyzing the health conditions of patients. In addition, the server provided face-to-face communications and instructions through remote video devices. The platform deployed a service-oriented architecture, which consisted of HL7-standard messages and web service components. The platform could transfer health records into the HL7-standard clinical document architecture for data exchange with other organizations. The prototype system was pretested and evaluated in the homecare department of a hospital and a community management center for chronic disease monitoring. Based on the results of this study, this system is expected to improve the quality of healthcare services.
Biographer: web-based editing and rendering of SBGN compliant biochemical networks
Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas
2013-01-01
Motivation: The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. Results: We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. Availability: The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows and Linux based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL. Contact: edda.klipp@biologie.hu-berlin.de or handorf@physik.hu-berlin.de PMID:23574737
Network oriented radiological and medical archive
NASA Astrophysics Data System (ADS)
Ferraris, M.; Frixione, P.; Squarcia, S.
2001-10-01
In this paper the basic ideas of NORMA (Network Oriented Radiological and Medical Archive) are discussed. NORMA is an original project built by a team of physicists in collaboration with radiologists in order to select the best treatment planning in radiotherapy. It allows physicians and health physicists, working in different places, to discuss interesting clinical cases while visualizing the same diagnostic images at the same time and highlighting zones of interest (tumors and organs at risk). NORMA has a client/server architecture in order to be platform independent. Applying World Wide Web technologies, it can easily be used by people with no specific computer knowledge, providing verbose help to guide the user through the right steps of execution. The client side is an applet while the server side is a Java application. In order to optimize execution, the project also includes a proprietary protocol, lying over the TCP/IP suite, that organizes data exchanges and control messages. Diagnostic images are retrieved from a relational database or from a standard DICOM (Digital Imaging and Communications in Medicine) PACS through the DICOM-WWW gateway, allowing connection of the usual Web browsers used by the NORMA system to DICOM applications via the HTTP protocol. Browser requests are sent to the gateway from the Web server through CGI (Common Gateway Interface). The DICOM software translates the requests into DICOM messages and organizes the communication with the remote DICOM application.
An expert system for headache diagnosis: the Computerized Headache Assessment tool (CHAT).
Maizels, Morris; Wolfe, William J
2008-01-01
Migraine is a highly prevalent chronic disorder associated with significant morbidity. Chronic daily headache syndromes, while less common, are less likely to be recognized, and impair quality of life to an even greater extent than episodic migraine. A variety of screening and diagnostic tools for migraine have been proposed and studied, but few investigators have developed and evaluated computerized programs to diagnose headache. The objective was to develop and determine the accuracy and utility of the Computerized Headache Assessment Tool (CHAT). CHAT was designed to identify all of the major primary headache disorders, distinguish daily from episodic types, and recognize medication overuse. CHAT was developed using an expert-systems approach to headache diagnosis, with initial branch points determined by headache frequency and duration. Appropriate clinical criteria are presented relevant to brief and longer-lasting headaches. CHAT was posted on a web site using Microsoft Active Server Pages and a Microsoft SQL Server database. A convenience sample of patients who presented to the adult urgent care department with headache, and patients in a family practice waiting room, were solicited to participate. Those who completed the on-line questionnaire were contacted for a diagnostic interview. One hundred thirty-five patients completed CHAT and 117 completed a diagnostic interview. CHAT correctly identified 35/35 (100%) patients with episodic migraine and 42/49 (85.7%) of patients with transformed migraine. CHAT also correctly identified 11/11 patients with chronic tension-type headache, 2/2 with episodic tension-type headache, and 1/1 with episodic cluster headache. Medication overuse was correctly recognized in 43/52 (82.7%). The most common misdiagnoses by CHAT were seen in patients with transformed migraine or new daily persistent headache. Fifty patients were referred to their primary care physician and 62 to the headache clinic. Of 29 patients referred to the PCP with a confirmed diagnosis of migraine, 25 made a follow-up appointment, the PCP diagnosed migraine in 19, and migraine-specific therapy or prophylaxis was initiated in 17. The described expert system displays high diagnostic accuracy for migraine and other primary headache disorders, including daily headache syndromes and medication overuse. As part of a disease management program, CHAT led to patients receiving appropriate diagnoses and therapy. Limitations of the system include patient willingness to utilize the program, introducing such a process into the culture of medical care, and the difficult distinction of transformed migraine.
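The abstract describes initial branch points based on headache frequency and duration. The sketch below is a highly simplified, hypothetical illustration of that style of rule base using common textbook cut-offs (15 or more headache days per month for chronic daily headache, a 10-day-per-month medication-overuse flag); it is not the actual CHAT logic.

```python
# Illustrative, simplified branch points of the CHAT style (not the real rule base).
def classify_headache(days_per_month, typical_hours, meds_days_per_month=0):
    overuse = meds_days_per_month >= 10              # crude acute-medication overuse flag
    if days_per_month >= 15:                         # daily / near-daily headache
        if typical_hours >= 4:
            label = "chronic daily headache of long duration (e.g. transformed migraine)"
        else:
            label = "short-duration daily headache (e.g. chronic cluster)"
    else:                                            # episodic headache
        label = "episodic headache (e.g. migraine or tension-type)"
    return label + (" with medication overuse" if overuse else "")

print(classify_headache(days_per_month=20, typical_hours=8, meds_days_per_month=15))
```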
Interactive real-time media streaming with reliable communication
NASA Astrophysics Data System (ADS)
Pan, Xunyu; Free, Kevin M.
2014-02-01
Streaming media is a recent technique for delivering multimedia information from a source provider to an end-user over the Internet. The major advantage of this technique is that the media player can start playing a multimedia file even before the entire file is transmitted. Most streaming media applications are currently implemented based on the client-server architecture, where a server system hosts the media file and a client system connects to this server system to download the file. Although the client-server architecture is successful in many situations, it may not be ideal to rely on such a system to provide the streaming service, as users may be required to register an account using personal information in order to use the service. This is troublesome if a user wishes to watch a movie simultaneously while interacting with a friend in another part of the world over the Internet. In this paper, we describe a new real-time media streaming application implemented on a peer-to-peer (P2P) architecture in order to overcome these challenges within a mobile environment. When using the peer-to-peer architecture, streaming media is shared directly between end-users, called peers, with minimal or no reliance on a dedicated server. Based on the proposed software pɛvμa (pronounced [revma]), named for the Greek word meaning stream, we can host a media file on any computer and directly stream it to a connected partner. To accomplish this, pɛvμa utilizes the Microsoft .NET Framework and Windows Presentation Foundation, which are widely available on various types of Windows-compatible personal computers and mobile devices. With specially designed multi-threaded algorithms, the application can stream HD video at speeds upwards of 20 Mbps using the User Datagram Protocol (UDP). Streaming and playback are handled using synchronized threads that communicate with one another once a connection is established. Alteration of playback, such as pausing playback or tracking to a different spot in the media file, will be reflected in all media streams. These techniques are designed to allow users at different locations to simultaneously view a full-length HD video and interactively control the media streaming session. To create a sustainable media stream with high quality, our system supports UDP packet loss recovery at high transmission speed using custom FileBuffers. Traditional real-time streaming protocols such as Real-time Transport Protocol/RTP Control Protocol (RTP/RTCP) provide no such error recovery mechanism. Finally, the system also features an Instant Messenger that allows users to perform social interactions with one another while they enjoy a media file. The ultimate goal of the application is to offer users a hassle-free way to watch a media file over long distances without having to upload any personal information into a third-party database. Moreover, the users can communicate with each other and stream media directly from one mobile device to another while maintaining independence from the traditional sign-up required by most streaming services.
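The packet-loss recovery idea mentioned above, sequence-numbered datagrams plus retransmission of whatever the receiver reports missing, can be sketched in a few lines. This is a generic Python illustration (not the .NET/WPF implementation described in the abstract); the chunk contents and sequence range are hypothetical, and the network send itself is omitted.

```python
# Generic sketch of UDP-style loss recovery: sequence-numbered packets and NACKs.
import struct

HEADER = struct.Struct("!I")                       # 4-byte big-endian sequence number

def make_packet(seq, payload: bytes) -> bytes:
    return HEADER.pack(seq) + payload

def parse_packet(data: bytes):
    (seq,) = HEADER.unpack_from(data)
    return seq, data[HEADER.size:]

# Sender side keeps recently sent chunks so a NACK can be answered immediately.
send_buffer = {seq: make_packet(seq, f"chunk-{seq}".encode()) for seq in range(5)}

# Receiver side: suppose the datagram with sequence number 2 never arrived.
received = {0, 1, 3, 4}
missing = sorted(set(range(5)) - received)
print("NACK for sequence numbers:", missing)       # -> [2]

# On receiving the NACK, the sender simply resends the buffered packets.
for seq in missing:
    print("retransmitting", parse_packet(send_buffer[seq]))
```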
Holographic Rovers: Augmented Reality and the Microsoft HoloLens
NASA Technical Reports Server (NTRS)
Toler, Laura
2017-01-01
Augmented Reality is an emerging field in technology, and encompasses Head Mounted Displays, smartphone apps, and even projected images. HMDs include the Meta 2, Magic Leap, Avegant Light Field, and the Microsoft HoloLens, which is evaluated specifically. The Microsoft HoloLens is designed to be used as an AR personal computer, and is being optimized with that goal in mind. Microsoft allied with the Unity3D game engine to create an SDK for interested application developers that can be used in the Unity environment.
OGS improvements in the year 2011 in running the Northeastern Italy Seismic Network
NASA Astrophysics Data System (ADS)
Bragato, P. L.; Pesaresi, D.; Saraò, A.; Di Bartolomeo, P.; Durı, G.
2012-04-01
The Centro di Ricerche Sismologiche (CRS, Seismological Research Center) of the Istituto Nazionale di Oceanografia e di Geofisica Sperimentale (OGS, Italian National Institute for Oceanography and Experimental Geophysics) in Udine (Italy) after the strong earthquake of magnitude M=6.4 occurred in 1976 in the Italian Friuli-Venezia Giulia region, started to operate the Northeastern Italy Seismic Network: it currently consists of 15 very sensitive broad band and 21 simpler short period seismic stations, all telemetered to and acquired in real time at the OGS-CRS data center in Udine. Real time data exchange agreements in place with other Italian, Slovenian, Austrian and Swiss seismological institutes lead to a total number of about 100 seismic stations acquired in real time, which makes the OGS the reference institute for seismic monitoring of Northeastern Italy. Since 2002 OGS-CRS is using the Antelope software suite on several workstations plus a SUN Cluster as the main tool for collecting, analyzing, archiving and exchanging seismic data, initially in the framework of the EU Interreg IIIA project "Trans-national seismological networks in the South-Eastern Alps". SeisComP is also used as a real time data exchange server tool. In order to improve the seismological monitoring of the Northeastern Italy area, at OGS-CRS we tuned existing programs and created ad hoc ones like: a customized web server named PickServer to manually relocate earthquakes, a script for automatic moment tensor determination, scripts for web publishing of earthquake parametric data, waveforms, state of health parameters and shaking maps, noise characterization by means of automatic spectra analysis, and last but not least scripts for email/SMS/fax alerting. The OGS-CRS Real Time Seismological website (RTS, http://rts.crs.inogs.it/) operative since several years was initially developed in the framework of the Italian DPC-INGV S3 Project: the RTS website shows classic earthquake locations parametric data plus ShakeMap and moment tensor information. At OGS-CRS we also spent a considerable amount of efforts in improving the long-period performances of broadband seismic stations, either by carrying out full re-installations and/or applying thermal insulations to the seismometers: more examples of PSD plots of the PRED broad band seismic station installation in the cave tunnel of Cave del Predil using a Quanterra Q330HR high resolution digitizer and a Sterckeisen STS-2 broadband seismometer will be illustrated. Efforts in strengthening the reliability of data links, exploring the use of redundant satellite/radio/GPRS links will also be shown.
Human-Robot Interface Controller Usability for Mission Planning on the Move
2012-11-01
Figure captions listed in the report include the Microsoft Xbox 360 controller for Windows and the Microsoft Trackball Explorer; Xbox 360 Controller is a registered trademark of Microsoft Corporation. The HMMWV was equipped with a diesel engine
Scabies: Workplace Frequently Asked Questions (FAQs)
Inclusion in the Microsoft Workforce
ERIC Educational Resources Information Center
Exceptional Parent, 2008
2008-01-01
Since 1975, Microsoft has been a worldwide leader in software, services, and solutions that help people and businesses realize their full potential. Loren Mikola, the Disability Inclusion Program Manager at Microsoft, ensures that this technology also reaches and includes the special needs population and, through the hiring of individuals with…
FastStats: Chronic Liver Disease and Cirrhosis
Implementation of a World Wide Web server for the oil and gas industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, R.E.; Martin, F.D.; Emery, R.
1995-12-31
The Gas and Oil Technology Exchange and Communication Highway (GO-TECH) provides an electronic information system for the petroleum community for the purpose of exchanging ideas, data, and technology. The personal computer-based system fosters communication and discussion by linking oil and gas producers with resource centers, government agencies, consulting firms, service companies, national laboratories, academic research groups, and universities throughout the world. The oil and gas producers are provided access to the GO-TECH World Wide Web home page via modem links as well as the Internet. Future GO-TECH applications will include the establishment of "virtual corporations" consisting of consortiums of small companies, consultants, and service companies linked by electronic information systems. These virtual corporations will have the resources and expertise previously found only in major corporations.
Implementation of a World Wide Web server for the oil and gas industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, R.E.; Martin, F.D.; Emery, R.
1996-10-01
The Gas and Oil Technology Exchange and Communication Highway (GO-TECH) provides an electronic information system for the petroleum community for exchanging ideas, data, and technology. The PC-based system fosters communication and discussion by linking the oil and gas producers with resource centers, government agencies, consulting firms, service companies, national laboratories, academic research groups, and universities throughout the world. The oil and gas producers can access the GO-TECH World Wide Web (WWW) home page through modem links, as well as through the Internet. Future GO-TECH applications will include the establishment of virtual corporations consisting of consortia of small companies, consultants, and service companies linked by electronic information systems. These virtual corporations will have the resources and expertise previously found only in major corporations.
Fish Karyome: A karyological information network database of Indian Fishes.
Nagpure, Naresh Sahebrao; Pathak, Ajey Kumar; Pati, Rameshwar; Singh, Shri Prakash; Singh, Mahender; Sarkar, Uttam Kumar; Kushwaha, Basdeo; Kumar, Ravindra
2012-01-01
'Fish Karyome', a database of karyological information on Indian fishes, has been developed that serves as a central source for karyotype data about Indian fishes compiled from the published literature. Fish Karyome is intended to serve as a liaison tool for researchers and contains karyological information about 171 of the 2438 finfish species reported in India; it is publicly available via the World Wide Web. The database provides information on chromosome number, morphology, sex chromosomes, karyotype formula, cytogenetic markers, etc. Additionally, it provides phenotypic information that includes species name, classification, locality of sample collection, common name, local name, sex, geographical distribution, and IUCN Red List status. Fish and karyotype images and references for the 171 finfish species have also been included in the database. Fish Karyome has been developed using SQL Server 2008, a relational database management system, Microsoft's ASP.NET 2008 and Macromedia's Flash technology under the Windows 7 operating environment. The system also enables users to input new information and images into the database, and to search and view the information and images of interest using various search options. Fish Karyome has a wide range of applications in species characterization and identification, sex determination, chromosomal mapping, karyo-evolution and systematics of fishes.
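As an illustration of the search options the abstract describes, the sketch below runs a parameterized karyotype query against a toy table. The table layout and field names are hypothetical; the production system uses SQL Server 2008 behind an ASP.NET front end, not SQLite.

```python
# Illustrative sketch of a parameterized karyotype search; the schema is a
# hypothetical stand-in for the Fish Karyome SQL Server 2008 backend.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE karyotype (
        species TEXT, common_name TEXT, chromosome_number INTEGER,
        karyotype_formula TEXT, sex_chromosomes TEXT, iucn_status TEXT
    )""")
conn.execute(
    "INSERT INTO karyotype VALUES (?, ?, ?, ?, ?, ?)",
    ("Labeo rohita", "Rohu", 50, "2n=50", "none identified", "Least Concern"),
)

# search by chromosome number range, one of the 'various search options'
rows = conn.execute(
    "SELECT species, chromosome_number, karyotype_formula "
    "FROM karyotype WHERE chromosome_number BETWEEN ? AND ?",
    (48, 52),
).fetchall()
for row in rows:
    print(row)
```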
Scaleable wireless web-enabled sensor networks
NASA Astrophysics Data System (ADS)
Townsend, Christopher P.; Hamel, Michael J.; Sonntag, Peter A.; Trutor, B.; Arms, Steven W.
2002-06-01
Our goal was to develop a long-life, low-cost, scalable wireless sensing network, which collects and distributes data from a wide variety of sensors over the internet. Time division multiple access was employed with RF transmitter nodes (each with a unique 16-bit address) to communicate digital data to a single receiver (range 1/3 mile). One thousand five-channel nodes can communicate to one receiver (30-minute update). Current draw (sleep) is 20 microamps, allowing a 5-year battery life with one 3.6-volt Li-Ion AA-size battery. The network nodes include sensor excitation (AC or DC), a multiplexer, an instrumentation amplifier, a 16-bit A/D converter, a microprocessor, and an RF link. They are compatible with thermocouples, strain gauges, load/torque transducers, and inductive/capacitive sensors. The receiver (418 MHz) includes a single board computer (SBC) with Ethernet capability, internet file transfer protocols (XML/HTML), and data storage. The receiver detects data from specific nodes, performs error checking, and records the data. The SBC's web server can be interrogated (from Microsoft's Internet Explorer or Netscape's Navigator) to distribute data. This system can collect data from thousands of remote sensors on a smart structure, and be shared by an unlimited number of users.
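A rough sketch of the time-division scheme implied by the figures above (1000 nodes, one receiver, 30-minute update) is shown below. The address-to-slot mapping and packet layout are assumptions made for illustration, not the vendor's actual protocol.

```python
# Sketch of a TDMA schedule consistent with the quoted figures: 1000 nodes
# sharing one receiver with a 30-minute update cycle gives each node a 1.8-s
# slot. The slot derivation and packet layout are assumed for illustration.
import struct

FRAME_SECONDS = 30 * 60                    # 30-minute update cycle
MAX_NODES = 1000
SLOT_SECONDS = FRAME_SECONDS / MAX_NODES   # 1.8 s per node

def transmit_offset(node_address: int) -> float:
    """Seconds into the frame at which a node with this 16-bit address transmits."""
    if not 0 <= node_address <= 0xFFFF:
        raise ValueError("address must fit in 16 bits")
    slot = node_address % MAX_NODES        # assumed address-to-slot mapping
    return slot * SLOT_SECONDS

def decode_packet(payload: bytes) -> dict:
    """Unpack an assumed packet layout: 16-bit address plus five 16-bit A/D readings."""
    addr, *channels = struct.unpack(">H5H", payload)
    return {"address": addr, "channels": channels}

pkt = bytes.fromhex("002a0400040103ff040003fe")
print(transmit_offset(0x002A), decode_packet(pkt))
```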
The covert channel over HTTP protocol
NASA Astrophysics Data System (ADS)
Graniszewski, Waldemar; Krupski, Jacek; Szczypiorski, Krzysztof
2016-09-01
The paper presents a new steganographic method: the covert channel is created in an HTTP protocol header, namely the trailer field. HTTP is one of the most frequently used protocols on the Internet. The popularity of Web servers, and the volume of network traffic to and from them, satisfies one of the requirements for undetectable message exchange. To study this kind of information hiding technique, an application was written in JavaScript using the Node.js framework. The results of the experiment performed to send a message over the covert channel are also presented.
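The sketch below illustrates the trailer-based covert channel idea in Python rather than the paper's Node.js implementation: a chunked HTTP/1.1 request whose trailer field carries the hidden message. The trailer field name (X-Covert) and target host are illustrative assumptions.

```python
# Sketch of a covert channel in an HTTP trailer field: the request body is sent
# with chunked transfer encoding, and the hidden message rides in a trailer
# header after the final zero-length chunk. Field name and host are assumptions.
import socket

def chunked_request_with_trailer(host: str, path: str, secret: str) -> bytes:
    visible_body = b"innocuous payload"
    head = (
        f"POST {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Transfer-Encoding: chunked\r\n"
        "Trailer: X-Covert\r\n"
        "Connection: close\r\n\r\n"
    ).encode()
    chunk = f"{len(visible_body):x}\r\n".encode() + visible_body + b"\r\n"
    # last chunk of size 0, followed by the trailer carrying the hidden message
    trailer = b"0\r\n" + f"X-Covert: {secret}\r\n".encode() + b"\r\n"
    return head + chunk + trailer

def send(host: str, message: str) -> str:
    with socket.create_connection((host, 80), timeout=10) as s:
        s.sendall(chunked_request_with_trailer(host, "/", message))
        return s.recv(4096).decode(errors="replace")

if __name__ == "__main__":
    # example.org is used purely as a placeholder endpoint
    print(send("example.org", "meet at 09:00"))
```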
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... of multiple mandatory documents including: (1) a PDF fillable Applicant intake form; (2) a Microsoft Excel Workbook; (3) a Microsoft Word Narrative template; and (4) other mandatory attachments. (Applicants must use the Microsoft Word Narrative template the CDFI Fund provides; alternative templates...
Progress Report--Microsoft Office 2003 Lynchburg College Tutorials
ERIC Educational Resources Information Center
Murray, Tom
2004-01-01
For the past several years Lynchburg College has developed Microsoft tutorials for use with academic classes and faculty, student and staff training. The tutorials are now used internationally. Last year Microsoft and Verizon sponsored a tutorial web site at http://www.officetutorials.com. This website recognizes ASCUE members for their wonderful…
Sandler, Leonard A; Blanck, Peter
2005-01-01
This case study examines efforts by Microsoft Corporation to enhance the diversity of its workforce and improve the accessibility and usability of its products and services for persons with disabilities. The research explores the relation among the Americans with Disabilities Act of 1990, corporate leadership, attitudes and behaviors towards individuals with disabilities, and dynamics that shape organizational culture at Microsoft. Implications for Microsoft, other employers, researchers, and the disability community are discussed. 2005 John Wiley & Sons, Ltd.
Microsoft's Vista: Guarantees People with Special Needs Access to Computers
ERIC Educational Resources Information Center
Williams, John M.
2006-01-01
In this article, the author discusses the accessibility features of Microsoft's Windows Vista. One of the most innovative aspects of Windows Vista is a new accessibility and automated testing model called Microsoft UI Automation, which reduces development costs not only for accessible and assistive technology (AT) developers, but also for…
Microsoft Excel Software Usage for Teaching Science and Engineering Curriculum
ERIC Educational Resources Information Center
Singh, Gurmukh; Siddiqui, Khalid
2009-01-01
In this article, our main objective is to present the use of Microsoft Excel 2007/2003 software for teaching college and university level curriculum in science and engineering. In particular, we discuss two interesting and fascinating examples of interactive applications of Microsoft Excel targeted for undergraduate students in: 1) computational…
Challenging Google, Microsoft Unveils a Search Tool for Scholarly Articles
ERIC Educational Resources Information Center
Carlson, Scott
2006-01-01
Microsoft has introduced a new search tool to help people find scholarly articles online. The service, which includes journal articles from prominent academic societies and publishers, puts Microsoft in direct competition with Google Scholar. The new free search tool, which should work on most Web browsers, is called Windows Live Academic Search…
Microsoft's Book-Search Project Has a Surprise Ending
ERIC Educational Resources Information Center
Foster, Andrea L.
2008-01-01
It is hard to imagine a Microsoft venture falling under the weight of a competitor. That's the post-mortem offered by many academic librarians as they ponder the software giant's recent and sudden announcement that it is shutting down its book-digitization project. The librarians' conclusion: Google did it. Microsoft quietly revealed in May that…
ERIC Educational Resources Information Center
Bhanji, Zahra
2012-01-01
The purpose of this article is to explore Microsoft Corporation as a new international actor shaping educational reforms and practices. This study examines how the implementation of Microsoft's global Partners in Learning (PiL) program varied and was mediated by national politics and national institutional practices in two different contexts,…
Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Fisher, W.; Yoksas, T.
2014-12-01
Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high student expectations. These changes are upending traditional approaches to accessing and using data and software. It is clear that Unidata's products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our initial efforts to deploy a subset of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of TDS, McIDAS ADDE and AWIPS II data servers and the Integrated Data Viewer (IDV) visualization tool.
Löck, Steffen; Roth, Klaus; Skripcak, Tomas; Worbs, Mario; Helmbrecht, Stephan; Jakobi, Annika; Just, Uwe; Krause, Mechthild; Baumann, Michael; Enghardt, Wolfgang; Lühr, Armin
2015-09-01
To guarantee equal access to optimal radiotherapy, a concept of patient assignment to photon or particle radiotherapy using remote treatment plan exchange and comparison - ReCompare - was proposed. We demonstrate the implementation of this concept and present its clinical applicability. The ReCompare concept was implemented using a client-server based software solution. A clinical workflow for the remote treatment plan exchange and comparison was defined. The steps required by the user and performed by the software for a complete plan transfer were described, and an additional module for dose-response modeling was added. The ReCompare software was successfully tested in cooperation with three external partner clinics and met all required specifications. It was compatible with several standard treatment planning systems, ensured patient data protection, and integrated into the clinical workflow. The ReCompare software can be applied to support non-particle radiotherapy institutions with the patient-specific treatment decision on the optimal irradiation modality by remote treatment plan exchange and comparison. Copyright © 2015. Published by Elsevier GmbH.
X-Windows Information Sharing Protocol Widget Class
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Information Sharing Protocol (ISP) Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing ISP graphical-user-interface (GUI) computer programs. ISP programming tasks require many method calls to identify, query, and interpret the connections and messages exchanged between a client and an ISP server. Most X-Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows ISP Widget Class encapsulates the client side of the ISP programming libraries within the framework of an X-Windows widget. Using the widget framework, X-Windows GUI programs can interact with ISP services in an abstract way and in the same manner as that of other graphical widgets, making it easier to write ISP GUI client programs. Wrapping ISP client services inside a widget framework enables a programmer to treat an ISP server interface as though it were a GUI. Moreover, an alternate subclass could implement another communication protocol in the same sort of widget.
Telemedicine with integrated data security in ATM-based networks
NASA Astrophysics Data System (ADS)
Thiel, Andreas; Bernarding, Johannes; Kurth, Ralf; Wenzel, Rudiger; Villringer, Arno; Tolxdorff, Thomas
1997-05-01
Telemedical services rely on the digital transfer of large amounts of data in a short time. The acceptance of these services therefore requires new hardware and software concepts. The fast exchange of data is well supported by a high-speed ATM-based network. Fast access to the data from different platforms poses more difficult problems, which may be divided into those relating to standardized data formats and those relating to different levels of data security across nations. For standardized access to the image data, a DICOM 3.0 server was implemented. Images were converted into the DICOM 3.0 standard if necessary. Access to the server is provided by an implementation of DICOM in Java, allowing access to the data from different platforms. Data protection measures to ensure the secure transfer of sensitive patient data are not yet solved within the DICOM concept. We investigated different schemes to protect data using the DICOM/Java modality with as little impact on data transfer speed as possible.
A Proposal of TLS Implementation for Cross Certification Model
NASA Astrophysics Data System (ADS)
Kaji, Tadashi; Fujishiro, Takahiro; Tezuka, Satoru
Today, TLS is widely used for achieving secure communication systems, and TLS uses PKI for server authentication and/or client authentication. However, this PKI environment, called the “multiple trust anchors environment,” causes the problem that the verifier has to maintain a huge number of CA certificates in the ubiquitous network, because the increase of terminals connected to the network brings an increase of CAs. Most terminals in the ubiquitous network, however, will not have enough memory to hold such a huge number of CA certificates. Therefore, another PKI environment, the “cross certification environment,” is useful for the ubiquitous network. But because current TLS is designed for the multiple trust anchors model, TLS cannot work efficiently on the cross-certification model. This paper proposes a TLS implementation method to support the cross certification model efficiently. Our proposal reduces the size of the messages exchanged between the TLS client and the TLS server during the handshake process. Therefore, our proposal is suitable for implementing TLS in terminals that do not have enough computing power and memory in the ubiquitous network.
Feasibility of interactive biking exercise system for telemanagement in elderly.
Finkelstein, Joseph; Jeong, In Cheol
2013-01-01
Inexpensive cycling equipment is widely available for home exercise; however, its use is hampered by a lack of tools supporting real-time monitoring of cycling exercise in the elderly and coordination with a clinical care team. To address these barriers, we developed a low-cost mobile system aimed at facilitating safe and effective home-based cycling exercise. The system used a miniature wireless 3-axis accelerometer that transmitted the cycling acceleration data to a tablet PC that was integrated with a multi-component disease management system. An exercise dashboard was presented to the patient, allowing real-time graphical visualization of exercise progress. The system was programmed to alert patients when exercise intensity exceeded the levels recommended by the patient's care providers and to exchange information with a central server. The feasibility of the system was assessed by testing the accuracy of cycling speed monitoring and the reliability of alerts generated by the system. Our results demonstrated high validity of the system for both upper and lower extremity exercise monitoring as well as reliable data transmission between the home unit and the central server.
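A minimal sketch of the alerting logic described above follows: cadence is estimated from accelerometer samples and flagged when it exceeds the prescribed level. The peak-counting method and the 60-rpm threshold are illustrative assumptions, not the published algorithm.

```python
# Sketch of threshold-based exercise intensity alerting: estimate pedalling
# cadence from accelerometer samples and flag exceedances of the level
# recommended by the care team. Method and thresholds are assumptions.
import math
from typing import Sequence

def cadence_rpm(samples: Sequence[float], fs_hz: float, threshold: float = 1.2) -> float:
    """Count upward threshold crossings (one per pedal revolution) over the window."""
    crossings = sum(
        1 for prev, cur in zip(samples, samples[1:]) if prev < threshold <= cur
    )
    window_minutes = len(samples) / fs_hz / 60.0
    return crossings / window_minutes if window_minutes else 0.0

def check_intensity(samples: Sequence[float], fs_hz: float, max_rpm: float = 60.0) -> str:
    rpm = cadence_rpm(samples, fs_hz)
    if rpm > max_rpm:
        return f"ALERT: cadence {rpm:.0f} rpm exceeds prescribed {max_rpm:.0f} rpm"
    return f"OK: cadence {rpm:.0f} rpm"

# 10 s of synthetic data at 50 Hz with roughly 70 revolutions per minute
fs = 50.0
signal = [1.0 + 0.5 * math.sin(2 * math.pi * (70 / 60.0) * (i / fs)) for i in range(500)]
print(check_intensity(signal, fs))
```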
NASA's EOSDIS: options for data providers
NASA Astrophysics Data System (ADS)
Khalsa, Siri J.; Ujhazy, John E.
1995-12-01
EOSDIS, the data and information system being developed by NASA to support interdisciplinary earth science research into the 21st century, will do more than manage and distribute data from EOS-era satellites. It will also promote the exchange of data, tools, and research results across disciplinary, agency, and national boundaries. This paper describes the options that data providers will have for interacting with the EOSDIS Core System (ECS), the infrastructure of EOSDIS. The options include: using the ECS advertising service to announce the availability of data at the provider's site; submitting a candidate data set to one of the Distributed Active Archive Centers (DAACs); establishing a data server that will make the data accessible via ECS; and establishing a Local Information Manager (LIM), which would make the data available for multi-site searches. One additional option is custom gateway interfaces, which would provide access to existing data archives. The gateway, data server, and LIM options require the implementation of ECS code at the provider site to ensure proper protocols. The advertisement and ingest options require no part of the ECS design to reside at the provider site.
Integrating sequence and structural biology with DAS
Prlić, Andreas; Down, Thomas A; Kulesha, Eugene; Finn, Robert D; Kähäri, Andreas; Hubbard, Tim JP
2007-01-01
Background The Distributed Annotation System (DAS) is a network protocol for exchanging biological data. It is frequently used to share annotations of genomes and protein sequences. Results Here we present several extensions to the current DAS 1.5 protocol. These provide new commands to share alignments and three-dimensional molecular structure data, add the possibility of registration and discovery of DAS servers, and provide a convention for serving different types of data plots. We present examples of web sites and applications that use the new extensions. We operate a public registry of DAS sources, which now includes entries for more than 250 distinct sources. Conclusion Our DAS extensions are essential for the management of the growing number of services and the exchange of diverse biological data sets. In addition, the extensions allow new types of applications to be developed and scientific questions to be addressed. The registry of DAS sources is available online. PMID:17850653
NASA Astrophysics Data System (ADS)
Marshall, Stuart; Thaler, Jon; Schalk, Terry; Huffer, Michael
2006-06-01
The LSST Camera Control System (CCS) will manage the activities of the various camera subsystems and coordinate those activities with the LSST Observatory Control System (OCS). The CCS comprises a set of modules (nominally implemented in software) which are each responsible for managing one camera subsystem. Generally, a control module will be a long lived "server" process running on an embedded computer in the subsystem. Multiple control modules may run on a single computer or a module may be implemented in "firmware" on a subsystem. In any case control modules must exchange messages and status data with a master control module (MCM). The main features of this approach are: (1) control is distributed to the local subsystem level; (2) the systems follow a "Master/Slave" strategy; (3) coordination will be achieved by the exchange of messages through the interfaces between the CCS and its subsystems. The interface between the camera data acquisition system and its downstream clients is also presented.
Three-party authenticated key agreements for optimal communication
Lee, Tian-Fu; Hwang, Tzonelih
2017-01-01
Authenticated key agreements enable users to determine session keys, and to securely communicate with others over an insecure channel via the session keys. This study investigates the lower bounds on communications for three-party authenticated key agreements and considers whether or not the sub-keys for generating a session key can be revealed in the channel. Since two clients do not share any common secret key, they require the help of the server to authenticate their identities and exchange confidential and authenticated information over insecure networks. However, if the session key security is based on asymmetric cryptosystems, then revealing the sub-keys cannot compromise the session key. The clients can directly exchange the sub-keys and reduce the transmissions. In addition, authenticated key agreements were developed by using the derived results of the lower bounds on communications. Compared with related approaches, the proposed protocols had fewer transmissions and realized the lower bounds on communications. PMID:28355253
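The point that publicly revealed sub-keys need not compromise the session key can be illustrated with a toy Diffie-Hellman exchange, sketched below with deliberately small, insecure parameters. This is a generic illustration of the principle, not the three-party protocols proposed in the paper.

```python
# Toy Diffie-Hellman exchange: the sub-keys (g^a, g^b) may be revealed on the
# channel, yet the session key g^(ab) stays secret. Parameters are tiny and
# insecure, purely for readability; this is not the paper's protocol.
import secrets

p = 0xFFFFFFFB   # small prime modulus (insecure, demo only)
g = 5            # generator

a = secrets.randbelow(p - 2) + 1      # client A's secret
b = secrets.randbelow(p - 2) + 1      # client B's secret

sub_key_a = pow(g, a, p)              # sent in the clear
sub_key_b = pow(g, b, p)              # sent in the clear

session_a = pow(sub_key_b, a, p)      # computed by A
session_b = pow(sub_key_a, b, p)      # computed by B
assert session_a == session_b         # both sides share the session key

print(hex(sub_key_a), hex(sub_key_b), hex(session_a))
```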
Human Factors Feedback: Brain Acoustic Monitor
2012-02-01
[Extraction residue from the report's table of contents and trademark footnotes; recoverable content: the Brain Acoustic Monitor (BAM) system ran on a Panasonic Toughbook preloaded with the Microsoft Windows XP Service Pack 2 operating system, an OS widely used on IBM-style personal computers.]
Workflow based framework for life science informatics.
Tiwari, Abhishek; Sekhar, Arvind K T
2007-10-01
Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services), facilitating knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we discuss the existing workflow systems and the trends in applications of workflow-based systems.
Quek, June; Brauer, Sandra G; Treleaven, Julia; Clark, Ross A
2017-09-01
This study aims to investigate the concurrent validity and intrarater reliability of the Microsoft Kinect to measure thoracic kyphosis against the Flexicurve. Thirty-three healthy individuals (age: 31±11.0 years, men: 17, height: 170.2±8.2 cm, weight: 64.2±12.0 kg) participated, with 29 re-examined for intrarater reliability 1-7 days later. Thoracic kyphosis was measured using the Flexicurve and the Microsoft Kinect consecutively in both standing and sitting positions. Both the kyphosis index and angle were calculated. The Microsoft Kinect showed excellent concurrent validity (intraclass correlation coefficient=0.76-0.82) and reliability (intraclass correlation coefficient=0.81-0.98) for measuring thoracic kyphosis (angle and index) in both standing and sitting postures. This study is the first to show that the Microsoft Kinect has excellent validity and intrarater reliability to measure thoracic kyphosis, which is promising for its use in the clinical setting.
Microsoft health patient journey demonstrator.
Disse, Kirsten
2008-01-01
As health care becomes more reliant on electronic systems, there is a need to standardise display elements to promote patient safety and clinical efficiency. The Microsoft Health Common User Interface (MSCUI) programme, developed by Microsoft and the National Health Service (NHS), was born out of this need and creates guidance and controls designed to increase patient safety and clinical effectiveness through consistent interface treatments. The Microsoft Health Patient Journey Demonstrator is a prototype tool designed to provide exemplar implementations of MSCUI guidance on a Microsoft platform. It is a targeted glimpse at a visual interface for the integration of health-relevant information, including electronic medical records. We built the demonstrator in Microsoft Silverlight 2, our application technology which brings desktop functionality and enriched levels of user experience to health settings worldwide via the internet. We based the demonstrator on an easily recognisable clinical scenario which offered us the most scope for demonstrating MSCUI guidance and innovation. The demonstrator is structured in three sections (administration, primary care and secondary care), each of which illustrates the activities associated with the setting relevant to our scenario. The demonstrator is published on the MSCUI website, www.mscui.net. The MSCUI patient journey demonstrator has been successful in raising awareness and increasing interest in the CUI programme.
Hardware Assisted Stealthy Diversity (CHECKMATE)
2013-09-01
[Extraction residue from the report's figures and testbed listing; recoverable content: Figure 29 shows an example attack against an interpreted environment (a Java executable), illustrating that CHECKMATE is applicable across multiple architectures (ARM, PPC, x86); the testbed comprised Server 1 (administration), Server 2 (MySQL database), Server 3 (Mongoose web server), Server 4 (SSH file server), and Server 5 (email server).]
Why American business demands twenty-first century learning: A company perspective.
Knox, Allyson
2006-01-01
Microsoft is an innovative corporation demonstrating the kind and caliber of job skills needed in the twenty-first century. It demonstrates its commitment to twenty-first century skills by holding its employees accountable to a set of core competencies, enabling the company to run effectively. The author explores how Microsoft's core competencies parallel the Partnership for 21st Century Skills learning frameworks. Both require advanced problem-solving skills and a passion for technology, both expect individuals to be able to work in teams, both look for a love of learning, and both call for the self-confidence to honestly self-evaluate. Microsoft also works to cultivate twenty-first century skills among future workers, investing in education to help prepare young people for competitive futures. As the need for digital literacy has become imperative, technology companies have taken the lead in facilitating technology training by partnering with schools and communities. Microsoft is playing a direct role in preparing students for what lies ahead in their careers. To further twenty-first century skills, or core competencies, among the nation's youth, Microsoft has established Partners in Learning, a program that helps education organizations build partnerships that leverage technology to improve teaching and learning. One Partners in Learning grantee is Global Kids, a nonprofit organization that trains students to design online games focused on global social issues resonating with civic and global competencies. As Microsoft believes the challenges of competing in today's economy and teaching today's students are substantial but not insurmountable, such partnerships and investments demonstrate Microsoft's belief in and commitment to twenty-first century skills.
2013-01-01
Background Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists’ capacity to use these immunoassays to evaluate human clinical trials. Results The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose–response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Conclusions Unlike other tools tailored for Luminex immunoassays, LabKey Server allows labs to customize their Luminex analyses using scripting while still presenting users with a single, graphical interface for processing and analyzing data. The LabKey Server system also stands out among Luminex tools for enabling smooth, secure transfer of data, quality control information, and analyses between collaborators. LabKey Server and its Luminex features are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. PMID:23631706
Eckels, Josh; Nathe, Cory; Nelson, Elizabeth K; Shoemaker, Sara G; Nostrand, Elizabeth Van; Yates, Nicole L; Ashley, Vicki C; Harris, Linda J; Bollenbeck, Mark; Fong, Youyi; Tomaras, Georgia D; Piehler, Britt
2013-04-30
Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists' capacity to use these immunoassays to evaluate human clinical trials. The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose-response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Unlike other tools tailored for Luminex immunoassays, LabKey Server allows labs to customize their Luminex analyses using scripting while still presenting users with a single, graphical interface for processing and analyzing data. The LabKey Server system also stands out among Luminex tools for enabling smooth, secure transfer of data, quality control information, and analyses between collaborators. LabKey Server and its Luminex features are freely available as open source software at http://www.labkey.com under the Apache 2.0 license.
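Items (v) and (vi) above, fitting dose-response curves for titrated standards and interpolating unknown concentrations, can be sketched with a four-parameter logistic (4PL) model. LabKey's own transform scripts are typically written in R; the Python/SciPy version and the synthetic standard curve below are illustrative only.

```python
# Minimal sketch of dose-response fitting and interpolation with a
# four-parameter logistic (4PL) model; the standards below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    """Four-parameter logistic: MFI as an increasing function of concentration."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

# titrated standard: concentration (pg/mL) vs median fluorescence intensity
conc = np.array([2.4, 9.8, 39.1, 156.3, 625.0, 2500.0, 10000.0])
mfi = np.array([55.0, 130.0, 410.0, 1500.0, 5200.0, 12800.0, 18500.0])

params, _ = curve_fit(four_pl, conc, mfi, p0=[50, 20000, 500, 1.0], maxfev=10000)

def interpolate_concentration(observed_mfi, bottom, top, ec50, hill):
    """Invert the 4PL to estimate the concentration of an unknown sample."""
    return ec50 * ((top - bottom) / (observed_mfi - bottom) - 1.0) ** (-1.0 / hill)

print("fit (bottom, top, EC50, hill):", np.round(params, 2))
print("unknown at MFI=3000 ->", round(interpolate_concentration(3000.0, *params), 1), "pg/mL")
```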
A Web-Based Database for Nurse Led Outreach Teams (NLOT) in Toronto.
Li, Shirley; Kuo, Mu-Hsing; Ryan, David
2016-01-01
A web-based system can provide access to real-time data and information. Healthcare is moving towards digitizing patients' medical information and securely exchanging it through web-based systems. In one of Ontario's health regions, Nurse Led Outreach Teams (NLOT) provide emergency mobile nursing services to help reduce unnecessary transfers from long-term care homes to emergency departments. Currently the NLOT team uses a Microsoft Access database to keep track of the health information on the residents that they serve. The Access database lacks scalability, portability, and interoperability. The objective of this study is the development of a web-based database using Oracle Application Express that is easily accessible from mobile devices. The web-based database will allow NLOT nurses to enter and access resident information anytime and from anywhere.
2013-06-16
[Extraction residue from the paper's author affiliations and reference list; recoverable content: the authors are affiliated with the Computer Science Dept., University of California, Irvine (a.anandkumar@uci.edu, mjanzami@uci.edu) and with Microsoft Research New England, 1 Memorial Drive, Cambridge, MA 02142 (dahsu@microsoft.com, skakade@microsoft.com).]
NASA Astrophysics Data System (ADS)
Niebuhr, Cole
2018-04-01
Papers published in the astronomical community, particularly in the field of double star research, often contain plots that display the positions of the component stars relative to each other on a Cartesian coordinate plane. Due to the complexities of projecting a three-dimensional orbit onto a two-dimensional image, it is often difficult to include an accurate reproduction of the orbit for comparison purposes. Methods to circumvent this obstacle do exist; however, many of these protocols result in low-quality blurred images or require specific and often expensive software. Here, a method is reported using Microsoft Paint and Microsoft Excel to produce high-quality images with an accurate reproduction of a partial orbit.
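A rough Python/matplotlib analogue of the kind of figure described above is sketched below: measured secondary positions plotted on a Cartesian plane with a projected orbit overlaid. The observations and ellipse parameters are invented for illustration; the reported method itself uses Microsoft Excel and Microsoft Paint.

```python
# Sketch of a double-star plot: observed secondary positions with a projected
# (apparent) orbit overlaid. All values are invented for illustration.
import numpy as np
import matplotlib.pyplot as plt

# apparent-orbit ellipse (already projected onto the plane of the sky)
theta = np.linspace(0, 2 * np.pi, 400)
a, b, tilt = 2.0, 1.2, np.radians(35)      # semi-axes in arcsec, orientation
x_orb = a * np.cos(theta) * np.cos(tilt) - b * np.sin(theta) * np.sin(tilt)
y_orb = a * np.cos(theta) * np.sin(tilt) + b * np.sin(theta) * np.cos(tilt)

# example measurements of the secondary relative to the primary (arcsec)
obs_x = np.array([1.6, 1.1, 0.2, -0.9])
obs_y = np.array([0.9, 1.3, 1.4, 1.0])

plt.plot(x_orb, y_orb, label="projected orbit")
plt.scatter(obs_x, obs_y, marker="+", s=80, label="measurements")
plt.scatter([0], [0], marker="*", s=120, label="primary")
plt.gca().set_aspect("equal")
plt.xlabel("ΔRA (arcsec)")
plt.ylabel("ΔDec (arcsec)")
plt.legend()
plt.savefig("apparent_orbit.png", dpi=150)
```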
Using OpenOffice as a Portable Interface to JAVA-Based Applications
NASA Astrophysics Data System (ADS)
Comeau, T.; Garrett, B.; Richon, J.; Romelfanger, F.
2004-07-01
STScI previously used Microsoft Word and Microsoft Access, a Sybase ODBC driver, and the Adobe Acrobat PDF writer, along with a substantial amount of Visual Basic, to generate a variety of documents for the internal Space Telescope Grants Administration System (STGMS). While investigating an upgrade to Microsoft Office XP, we began considering alternatives, ultimately selecting an open source product, OpenOffice.org. This reduces the total number of products required to operate the internal STGMS system, simplifies the build system, and opens the possibility of moving to a non-Windows platform. We describe the experience of moving from Microsoft Office to OpenOffice.org, and our other internal uses of OpenOffice.org in our development environment.
Demonstrating NaradaBrokering as a Middleware Fabric for Grid-based Remote Visualization Services
NASA Astrophysics Data System (ADS)
Pallickara, S.; Erlebacher, G.; Yuen, D.; Fox, G.; Pierce, M.
2003-12-01
Remote Visualization Services (RVS) have tended to rely on approaches based on the client server paradigm. Here we demonstrate our approach - based on a distributed brokering infrastructure, NaradaBrokering [1] - that relies on distributed, asynchronous and loosely coupled interactions to meet the requirements and constraints of RVS. In our approach to RVS, services advertise their capabilities to the broker network that manages these service advertisements. Among the services considered within our system are those that perform graphic transformations, mediate access to specialized datasets and finally those that manage the execution of specified tasks. There could be multiple instances of each of these services and the system ensures that load for a given service is distributed efficiently over these service instances. We will demonstrate implementation of concepts that we outlined in the oral presentation. This would involve two or more visualization servers interacting asynchronously with multiple clients through NaradaBrokering. The communicating entities may exchange SOAP [2] (Simple Object Access Protocol) messages. SOAP is a lightweight protocol for exchange of information in a decentralized, distributed environment. It is an XML based protocol that consists of three parts: an envelope that describes what is in a message and how to process it, rules for expressing instances of application-defined data types, and a convention for representing remote invocation related operations. Furthermore, we will also demonstrate how clients can retrieve their results after prolonged disconnects or after any failures that might have taken place. The entities, services and clients alike, are not limited by the geographical distances that separate them. We are planning to test this system in the context of trans-Atlantic links separating interacting entities. {[1]} The NaradaBrokering Project: http://www.naradabrokering.org {[2]} Newcomer, E., 2002, Understanding web services: XML, WSDL, SOAP, and UDDI, Addison Wesley Professional.
Software for Allocating Resources in the Deep Space Network
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Borden, Chester; Zendejas, Silvino; Baldwin, John
2003-01-01
TIGRAS 2.0 is a computer program designed to satisfy a need for improved means for analyzing the tracking demands of interplanetary space-flight missions upon the set of ground antenna resources of the Deep Space Network (DSN) and for allocating those resources. Written in Microsoft Visual C++, TIGRAS 2.0 provides a single rich graphical analysis environment for use by diverse DSN personnel, by connecting to various data sources (relational databases or files) based on the stages of the analyses being performed. Notable among the algorithms implemented by TIGRAS 2.0 are a DSN antenna-load-forecasting algorithm and a conflict-aware DSN schedule-generating algorithm. Computers running TIGRAS 2.0 can also be connected using SOAP/XML to a Web services server that provides analysis services via the World Wide Web. TIGRAS 2.0 supports multiple windows and multiple panes in each window for users to view and use information, all in the same environment, to eliminate repeated switching among various application programs and Web pages. TIGRAS 2.0 enables the use of multiple windows for various requirements, trajectory-based time intervals during which spacecraft are viewable, ground resources, forecasts, and schedules. Each window includes a time navigation pane, a selection pane, a graphical display pane, a list pane, and a statistics pane.
Entamoeba histolytica: construction and applications of subgenomic databases.
Hofer, Margit; Duchêne, Michael
2005-07-01
Knowledge about the influence of environmental stress such as the action of chemotherapeutic agents on gene expression in Entamoeba histolytica is limited. We plan to use oligonucleotide microarray hybridization to approach these questions. As the basis for our array, sequence data from the genome project carried out by the Institute for Genomic Research (TIGR) and the Sanger Institute were used to annotate parts of the parasite genome. Three subgenomic databases containing enzymes, cytoskeleton genes, and stress genes were compiled with the help of the ExPASy proteomics website and the BLAST servers at the two genome project sites. The known sequences from reference species, mostly human and Escherichia coli, were searched against TIGR and Sanger E. histolytica sequence contigs and the homologs were copied into a Microsoft Access database. In a similar way, two additional databases of cytoskeletal genes and stress genes were generated. Metabolic pathways could be assembled from our enzyme database, but sometimes they were incomplete as is the case for the sterol biosynthesis pathway. The raw databases contained a significant number of duplicate entries which were merged to obtain curated non-redundant databases. This procedure revealed that some E. histolytica genes may have several putative functions. Representative examples such as the case of the delta-aminolevulinate synthase/serine palmitoyltransferase are discussed.
Automated DICOM metadata and volumetric anatomical information extraction for radiation dosimetry
NASA Astrophysics Data System (ADS)
Papamichail, D.; Ploussi, A.; Kordolaimi, S.; Karavasilis, E.; Papadimitroulas, P.; Syrgiamiotis, V.; Efstathopoulos, E.
2015-09-01
Patient-specific dosimetry calculations based on simulation techniques have as a prerequisite the modeling of the modality system and the creation of voxelized phantoms. This procedure requires knowledge of the scanning parameters and patient information included in a DICOM file, as well as image segmentation. However, the extraction of this information is complicated and time-consuming. The objective of this study was to develop a simple graphical user interface (GUI) to (i) automatically extract metadata from every slice image of a DICOM file in a single query and (ii) interactively specify the regions of interest (ROI) without explicit access to the radiology information system. The user-friendly application was developed in the Matlab environment. The user can select a series of DICOM files and manage their text and graphical data. The metadata are automatically formatted and presented to the user as a Microsoft Excel file. The volumetric maps are formed by interactively specifying the ROIs and by assigning a specific value to every ROI. The result is stored in DICOM format for data and trend analysis. The developed GUI is easy to use, fast, and constitutes a very useful tool for individualized dosimetry. One of the future goals is to incorporate remote access to a PACS server functionality.
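Step (i) above, extracting metadata from every slice in a single pass, can be sketched in Python with pydicom and a CSV export (the original GUI is a Matlab application that exports to Microsoft Excel). The selected tags and directory layout are illustrative assumptions.

```python
# Minimal analogue of the metadata-extraction step: pull a few acquisition
# parameters from every slice of a DICOM series and write them to a CSV table.
# The tag subset and the "ct_series" directory are illustrative assumptions.
import csv
from pathlib import Path

import pydicom

TAGS = ["PatientID", "Modality", "KVP", "XRayTubeCurrent", "SliceThickness", "SliceLocation"]

def export_metadata(dicom_dir: str, out_csv: str) -> None:
    rows = []
    for path in sorted(Path(dicom_dir).glob("*.dcm")):
        ds = pydicom.dcmread(path, stop_before_pixels=True)  # headers only, faster
        rows.append({"file": path.name, **{t: getattr(ds, t, "") for t in TAGS}})
    with open(out_csv, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["file"] + TAGS)
        writer.writeheader()
        writer.writerows(rows)

export_metadata("ct_series", "ct_series_metadata.csv")
```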
OLAP Cube Visualization of Hydrologic Data Catalogs
NASA Astrophysics Data System (ADS)
Zaslavsky, I.; Rodriguez, M.; Beran, B.; Valentine, D.; van Ingen, C.; Wallis, J. C.
2007-12-01
As part of the CUAHSI Hydrologic Information System project, we assemble comprehensive observations data catalogs that support CUAHSI data discovery services (WaterOneFlow services) and online mapping interfaces (e.g. the Data Access System for Hydrology, DASH). These catalogs describe several nation-wide data repositories that are important for hydrologists, including the USGS NWIS and EPA STORET data collections. The catalogs contain a wealth of information reflecting the entire history and geography of hydrologic observations in the US. Managing such catalogs requires high performance analysis and visualization technologies. The OLAP (Online Analytical Processing) cube, often called a data cube, is an approach to organizing and querying large multi-dimensional data collections. We have applied OLAP techniques, as implemented in Microsoft SQL Server 2005, to the analysis of the catalogs from several agencies. In this initial report, we focus on the OLAP technology as applied to catalogs, and preliminary results of the analysis. Specifically, we describe the challenges of generating OLAP cube dimensions, and defining aggregations and views for data catalogs as opposed to the observations data themselves. The initial results relate to hydrologic data availability from the observations data catalogs. The results reflect the geography and history of available data totals from the USGS NWIS and EPA STORET repositories, and the spatial and temporal dynamics of available measurements for several key nutrient-related parameters.
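A small pandas analogue of the catalog data cube described above is sketched below: observation availability aggregated by variable, year, and site. The production cubes are built with Microsoft SQL Server 2005 Analysis Services; the column names and toy rows are illustrative.

```python
# Toy "data cube" over an observations catalog: aggregate available value
# counts by variable, year, and site. Column names and rows are illustrative.
import pandas as pd

catalog = pd.DataFrame(
    [
        ("USGS-01463500", "Nitrate", 1998, 52),
        ("USGS-01463500", "Nitrate", 1999, 48),
        ("USGS-01463500", "Phosphorus", 1998, 12),
        ("EPA-NJ0001", "Nitrate", 1998, 24),
        ("EPA-NJ0001", "Phosphorus", 1999, 30),
    ],
    columns=["site", "variable", "year", "value_count"],
)

# one cube view: total available values per variable and year
cube = catalog.pivot_table(
    index="variable", columns="year", values="value_count", aggfunc="sum", fill_value=0
)
print(cube)

# drill down to a single site, mirroring an OLAP slice operation
print(catalog[catalog.site == "USGS-01463500"].groupby("variable")["value_count"].sum())
```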
Tamm, E P; Kawashima, A; Silverman, P
2001-06-01
Current commercial radiology information systems (RIS) are designed for scheduling, billing, charge collection, and report dissemination. Academic institutions have additional requirements for their teaching, research, and clinical care missions. The newest versions of commercial RIS offer greater flexibility than prior systems. We sent questionnaires to Cerner Corporation, ADAC Health Care Information Systems, IDX Systems, Per-Se' Technologies, and Siemens Health Services regarding features of their products. All of the products we surveyed offer user-customizable fields. However, most products did not allow the user to expand the product's data tables. The search capabilities of the products varied. All of the products supported the Health Level 7 (HL-7) interface and the use of structured query language (SQL). All of the products were offered with an SQL editor for creating customized queries and custom reports. All products included capabilities for collecting data for quality assurance and for tracking "interesting cases," though they varied in the functionality offered. No product offered dedicated functions for research. Alternatively, radiology departments can create their own client-server Windows-based database systems to supplement the capabilities of commercial systems. Such systems can be developed with "web-enabled" database products like Microsoft Access or Apple FileMaker Pro.
Network characteristics for server selection in online games
NASA Astrophysics Data System (ADS)
Claypool, Mark
2008-01-01
Online gameplay is impacted by the network characteristics of players connected to the same server. Unfortunately, the network characteristics of online game servers are not well understood, particularly for groups that wish to play together on the same server. As a step towards a remedy, this paper presents an analysis of an extensive set of measurements of game servers on the Internet. Over the course of many months, actual Internet game servers were queried simultaneously by twenty-five emulated game clients, with both servers and clients spread out on the Internet. The data provides statistics on the uptime and populations of game servers over a month-long period and an in-depth look at the suitability of game servers for multi-player server selection, concentrating on characteristics critical to playability--latency and fairness. Analysis finds most game servers have latencies suitable for third-person and omnipresent games, such as real-time strategy, sports and role-playing games, providing numerous server choices for game players. However, far fewer game servers have the low latencies required for first-person games, such as shooters or race games. In all cases, groups that wish to play together have a greatly reduced set of servers from which to choose because of inherent unfairness in server latencies, and server selection is particularly limited as the group size increases. These results hold across different game types and even across different generations of games. The data should be useful for game developers and network researchers that seek to improve game server selection, whether for single or multiple players.
2009-01-01
[Extraction residue from a table comparing COTS Web-based survey packages; recoverable details: supported databases (Oracle 9i/10g, MySQL, MS SQL Server), supported operating systems (Windows 2003 Server, Windows 2000 Server 32-bit), and supported web servers (WebStar on Mac OS X, SunOne, Internet Information Services (IIS)). The surviving abstract fragment notes that the challenges of Web-based surveys include identifying the best Commercial Off the Shelf (COTS) Web-based survey packages to serve the particular…]
Linking Management Actions to Interactive Ecosystem Report Cards via an Ontology
NASA Astrophysics Data System (ADS)
Alabri, A.; Newman, A.; Abal, E.; van Ingen, C.; Hunter, J.
2008-12-01
INTRODUCTION The Health-e-Waterways Project is a three-way collaboration between the University of Queensland, Microsoft Research and the Healthy Waterways Partnership (SEQ-HWP) (over 60 local government, state agency, university, community and environmental organizations). The project is developing a highly innovative framework and set of services to enable streamlined access to an integrated collection of real-time, near-real-time and static datasets acquired through ecosystem monitoring programs in South East Queensland. Using a novel combination of semantic web technologies, scientific data servers, web services, GIS visualization interfaces and scientific workflows, we are enabling the sharing and integration of high quality data and models, through a combined integrated water information management system and Web portal. DYNAMIC GENERATION OF ECOSYSTEM HEALTH REPORT CARDS SEQ-HWP is responsible for the Ecosystem Health Monitoring Program (EHMP) in South East Queensland. This currently involves sampling 30 freshwater indicators at 100 sites twice a year and 250 estuarine/marine sites every month. The EHMP data sets are statistically aggregated and standardized to produce ecosystem health grades that are published annually in hard-copy EHMP Report Cards. Politicians and planners use the report cards to make decisions with respect to land use, water quality, allocations and investments in water recycling plants, etc. To date, these report cards have been produced largely manually, by calculating standardized scores (0-1) across 5 indicators and 16 indices (physical, chemical, nutrients, ecosystem processes, aquatic macroinvertebrates and fish) and grades from A-F for each catchment and season (spring and autumn). Currently this process takes about 5 months. For the past 6 months, we have been working with SEQ-HWP staff, developing software services that will enable the report cards to be generated dynamically via a Web-based map interface to an underlying database that contains the EHMP water quality and quantity monitoring data. The GUI enables users to specify and query: spatial regions of interest through a Google Earth or Microsoft Virtual Earth interface; concepts or indicators of interest through the EHMP Ontology; and seasons or years of interest through a timeline. A Report Card Grade is generated for the specified catchment and period. Users can retrieve raw data by clicking on a grade: this displays the corresponding EcoH plot, dynamically generated from the 5 indicators in the underlying SQL Server database. Clicking on an EcoH plot displays the actual raw data (16 indices) used to generate the indicators and plots. CONCLUSIONS Numerous state, national and international agencies are advocating the need for standardized frameworks and procedures for environmental accounting. The Health-e-Waterways project provides an ideal model for delivering a standardized approach to the aggregation of ecosystem health monitoring data and the generation of dynamic, interactive Report Cards (that incorporate links back to the raw data sets). The system we have described here will not only save agencies significant time and money, but it can also be used to guide regional, state and national environmental policy development, based on accurate and timely evidential data.
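A minimal sketch of the report-card aggregation follows: standardized indicator scores (0-1) for a catchment are averaged and mapped to an A-F grade. The grade band cut-offs are assumptions; the abstract specifies the score range and grade scale but not the exact banding used by the EHMP.

```python
# Sketch of report-card grading: average standardized indicator scores (0-1)
# for a catchment and map the result to an A-F grade. Band cut-offs and the
# example scores are assumptions made for illustration.
from statistics import mean

GRADE_BANDS = [(0.85, "A"), (0.70, "B"), (0.55, "C"), (0.40, "D"), (0.25, "E")]

def report_card_grade(indicator_scores: dict[str, float]) -> tuple[float, str]:
    overall = mean(indicator_scores.values())
    for cutoff, grade in GRADE_BANDS:
        if overall >= cutoff:
            return overall, grade
    return overall, "F"

scores = {
    "physical_chemical": 0.78,
    "nutrients": 0.64,
    "ecosystem_processes": 0.71,
    "macroinvertebrates": 0.58,
    "fish": 0.66,
}
overall, grade = report_card_grade(scores)
print(f"catchment score {overall:.2f} -> grade {grade}")
```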
Measurement of Energy Performances for General-Structured Servers
NASA Astrophysics Data System (ADS)
Liu, Ren; Chen, Lili; Li, Pengcheng; Liu, Meng; Chen, Haihong
2017-11-01
Energy consumption of servers in data centers is increasing rapidly with the wide application of the Internet and connected devices. To improve the energy efficiency of servers, voluntary or mandatory energy efficiency programs for servers, including voluntary labelling programs and mandatory energy performance standards, have been adopted or are being prepared in the US, the EU and China. However, the energy performance of servers and the corresponding testing methods are not well defined. This paper presents metrics to measure the energy performance of general-structured servers. The impacts of various server components on energy performance are also analyzed. Based on a set of normalized workloads, the authors propose a standard method for testing the energy efficiency of servers. Pilot tests are conducted to assess the proposed energy performance testing method, and the findings of the tests are discussed in the paper.
A VBA Desktop Database for Proposal Processing at National Optical Astronomy Observatories
NASA Astrophysics Data System (ADS)
Brown, Christa L.
National Optical Astronomy Observatories (NOAO) has developed a relational Microsoft Windows desktop database using Microsoft Access and the Microsoft Office programming language, Visual Basic for Applications (VBA). The database is used to track data relating to observing proposals from original receipt through the review process, scheduling, observing, and final statistical reporting. The database has automated proposal processing and distribution of information. It allows NOAO to collect and archive data so as to query and analyze information about our science programs in new ways.
ERIC Educational Resources Information Center
Benghalem, Boualem
2015-01-01
This study aims to investigate the effects of using ICT tools such as Microsoft PowerPoint on EFL students' attitude and anxiety. The participants in this study were 40 Master 2 students of Didactics of English as a Foreign Language, Djillali Liabes University, Sidi Bel Abbes Algeria. In order to find out the effects of Microsoft PowerPoint on EFL…
2017-06-01
[Extraction residue from the thesis front matter and report documentation page; recoverable content: the thesis title, "Feasibility of Conducting Human Tracking and Following in an Indoor Environment Using a Microsoft Kinect and the Robot Operating System," and a future-work note on extending the implementation of human following on a mobile robot in an indoor environment.]
NASA Astrophysics Data System (ADS)
Adamczewski-Musch, Joern; Linev, Sergey
2015-12-01
The new THttpServer class in ROOT implements an HTTP server for arbitrary ROOT applications. It is based on the embeddable Civetweb HTTP server and provides direct access to all objects registered with the server. Object data can be provided in different formats: binary, XML, GIF/PNG, and JSON. A generic user interface for THttpServer has been implemented in HTML/JavaScript, based on the JavaScript ROOT development. With any modern web browser one can list, display, and monitor objects available on the server. THttpServer is used in the Go4 framework to provide an HTTP interface to the online analysis.
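A short PyROOT sketch of the registration pattern described above, assuming a local ROOT installation; the served URL layout and the simple polling loop used to keep the process responsive are assumptions rather than the paper's exact setup.

```python
# Sketch of exposing a ROOT object through THttpServer from PyROOT, assuming a
# local ROOT installation; the exact URL layout of the served pages is an assumption.
import time
import ROOT

serv = ROOT.THttpServer("http:8080")          # embedded Civetweb engine on port 8080
hist = ROOT.TH1F("hpx", "example histogram", 100, -4, 4)
hist.FillRandom("gaus", 10000)
serv.Register("/demo", hist)                  # browse at http://localhost:8080/

while True:                                   # keep serving; monitor with any web browser
    ROOT.gSystem.ProcessEvents()
    time.sleep(0.01)
```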
A Privacy-Preserving Platform for User-Centric Quantitative Benchmarking
NASA Astrophysics Data System (ADS)
Herrmann, Dominik; Scheuer, Florian; Feustel, Philipp; Nowey, Thomas; Federrath, Hannes
We propose a centralised platform for quantitative benchmarking of key performance indicators (KPI) among mutually distrustful organisations. Our platform offers users the opportunity to request an ad-hoc benchmarking for a specific KPI within a peer group of their choice. Architecture and protocol are designed to provide anonymity to its users and to hide the sensitive KPI values from other clients and the central server. To this end, we integrate user-centric peer group formation, exchangeable secure multi-party computation protocols, short-lived ephemeral key pairs as pseudonyms, and attribute certificates. We show by empirical evaluation of a prototype that the performance is acceptable for reasonably sized peer groups.
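The following is a generic additive-secret-sharing sketch of a secure sum, the basic building block behind many multi-party KPI benchmarks; it illustrates how a peer-group mean can be computed without any party or the central server seeing individual values, and it is not the specific protocol proposed in the paper.

```python
# Generic additive-secret-sharing illustration of a secure sum: each participant
# splits its KPI into random shares so that neither peers nor the central server
# see individual values; only the aggregate is revealed.
import random

PRIME = 2**61 - 1   # arithmetic is done modulo a large prime

def make_shares(value, n_parties):
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def secure_mean(kpis):
    n = len(kpis)
    all_shares = [make_shares(int(v), n) for v in kpis]       # party i splits its KPI
    # share j of every party is delivered to party j, which sums what it received
    partial_sums = [sum(shares[j] for shares in all_shares) % PRIME for j in range(n)]
    total = sum(partial_sums) % PRIME                         # only the aggregate is revealed
    return total / n

print(secure_mean([120, 95, 143, 110]))        # mean KPI of the peer group
```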
WebGLORE: a web service for Grid LOgistic REgression.
Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian
2013-12-15
WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation.
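A hedged sketch of the style of aggregation a grid logistic regression performs: each site computes local gradient and Hessian contributions and only these aggregated statistics (never the raw records) are combined by the server in a Newton-Raphson step. WebGLORE's actual protocol, transport and security layers are not reproduced here.

```python
# Distributed logistic regression by aggregated statistics: sites share only
# their local gradient and Hessian contributions with the central server.
import numpy as np

def local_statistics(X, y, beta):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    gradient = X.T @ (y - p)
    hessian = X.T @ (X * (p * (1 - p))[:, None])
    return gradient, hessian

def server_newton_step(site_stats, beta):
    grad = sum(g for g, _ in site_stats)
    hess = sum(h for _, h in site_stats)
    return beta + np.linalg.solve(hess, grad)

rng = np.random.default_rng(0)
sites = [(rng.normal(size=(50, 3)), rng.integers(0, 2, 50)) for _ in range(3)]
beta = np.zeros(3)
for _ in range(10):                                # global Newton-Raphson iterations
    beta = server_newton_step([local_statistics(X, y, beta) for X, y in sites], beta)
print(beta)
```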
Characteristics and Energy Use of Volume Servers in the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuchs, H.; Shehabi, A.; Ganeshalingam, M.
Servers’ field energy use remains poorly understood, given heterogeneous computing loads, configurable hardware and software, and operation over a wide range of management practices. This paper explores various characteristics of 1- and 2-socket volume servers that affect energy consumption, and quantifies the difference in power demand between higher-performing SPEC and ENERGY STAR servers and our best understanding of a typical server operating today. We first establish general characteristics of the U.S. installed base of volume servers from existing IDC data and the literature, before presenting information on server hardware configurations from data collection events at a major online retail website. We then compare cumulative distribution functions of server idle power across three separate datasets and explain the differences between them via examination of the hardware characteristics to which power draw is most sensitive. We find that idle server power demand is significantly higher than ENERGY STAR benchmarks and the industry-released energy use documented in SPEC, and that SPEC server configurations—and likely the associated power-scaling trends—are atypical of volume servers. Next, we examine recent trends in server power draw among high-performing servers across their full load range to consider how representative these trends are of all volume servers before inputting weighted average idle power load values into a recently published model of national server energy use. Finally, we present results from two surveys of IT managers (n=216) and IT vendors (n=178) that illustrate the prevalence of more-efficient equipment and operational practices in server rooms and closets; these findings highlight opportunities to improve the energy efficiency of the U.S. server stock.
New method for assessing risks of email
NASA Astrophysics Data System (ADS)
Raja, Seyyed H.; Afrooz, Farzad
2013-03-01
E-mail technology has become one of the essentials of modern life for correspondence between individuals. Given this, it is important that e-mail messages, servers and clients, and the correspondence exchanged between people, have acceptable security, so that users can rely on the technology. In the information age many financial and non-financial transactions are carried out electronically and data are exchanged via the Internet, so theft and manipulation of data can impose exorbitant costs in terms of integrity as well as financial, political, economic and cultural consequences. The same holds for e-mail correspondence, which makes its security very important. Our review found that no method focused on risk assessment for e-mail systems has yet been provided. We examine assessment methods for other systems, along with their strengths and weaknesses, and then apply Convery's method, originally developed for assessing network risks, to the assessment of e-mail risks. At the end of the paper we offer a dedicated table for e-mail risk assessment.
eHealth Networking Information Systems - The New Quality of Information Exchange.
Messer-Misak, Karin; Reiter, Christoph
2017-01-01
The development and introduction of platforms that enable interdisciplinary exchange on current developments and projects in the area of eHealth have been stimulated by different authorities. The aim of this project was to develop a repository of eHealth projects that will make the wealth of eHealth projects visible and enable mutual learning through the sharing of experiences and good practice. The content of the database and search criteria as well as their categories were determined in close co-ordination and cooperation with stakeholders from the specialist areas. Technically, we used JavaServer Faces (JSF) for the implementation of the frontend of the web application. Access to structured information on projects can support stakeholders in combining skills and knowledge residing in different places to create new solutions and approaches within a network of evolving competencies and opportunities. A regional database is the beginning of a structured collection and presentation of projects, which can then be incorporated into a broader context. The next step will be to unify this information transparently.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai, X; Liu, L; Xing, L
Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software updates. We present a web-based image processing and planning evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server and computation server. The independent servers communicate with each other through HTTP requests. The web server is the key component that provides visualizations and the user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open source DCM4CHEE PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests. Our computation server was implemented in Delphi, Python and PHP, which can process data directly or via a C++ program DLL. Results: This software platform is running on a 32-core CPU server virtually hosting the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize images and RT structures belonging to this patient, and perform image segmentation on the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation server. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and exhibited potential for future cloud-based radiotherapy.
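A minimal sketch of the server-to-server pattern described above, with Python's standard library standing in for the Delphi/PHP computation servers; the endpoint, port and JSON payload are hypothetical.

```python
# Minimal sketch of the platform's server-to-server pattern: the web server
# backend forwards a user request to a computation server over plain HTTP.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ComputeHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        result = {"patient": body["patient"], "status": "segmentation queued"}  # placeholder work
        payload = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8081), ComputeHandler).serve_forever()
```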
75 FR 7648 - Agency Information Collection Activities: Emergency Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-22
..., recipients, and representative payees: Braille and Microsoft Word files (on data compact discs). Current...) Braille, or (5) Microsoft Word. This call did not require OMB clearance. However, there may be respondents...
Liu, Ren-Hu; Meng, Jin-Ling
2003-05-01
MAPMAKER is one of the most widely used software packages for constructing genetic linkage maps. However, the PC version, MAPMAKER 3.0 for PC, cannot draw the genetic linkage maps that its Macintosh version, MAPMAKER 3.0 for Macintosh, is able to draw. Moreover, in recent years the Macintosh has become much less popular than the PC, and most geneticists use a PC to analyze their genetic linkage data, so software that can draw on the PC the same genetic linkage maps that MAPMAKER for Macintosh draws has long been needed. Microsoft Excel, one component of the Microsoft Office package, is among the most popular software for laboratory data processing, and Microsoft Visual Basic for Applications (VBA) is one of the most powerful features of Microsoft Excel. Using this programming language, we can take creative control of Excel, including genetic linkage map construction, automatic data processing and more. In this paper, a Microsoft Excel macro called MapDraw is constructed to draw genetic linkage maps on a PC from given genetic linkage data. Using this software, you can construct a genetic linkage map in Excel and freely edit it or copy it to Word or other applications. The software is simply an Excel-format file. You can freely copy it from ftp://211.69.140.177 or ftp://brassica.hzau.edu.cn, and the source code can be found in Excel's Visual Basic Editor.
Judo strategy. The competitive dynamics of Internet time.
Yoffie, D B; Cusumano, M A
1999-01-01
Competition on the Internet is creating fierce battles between industry giants and small-scale start-ups. Smart start-ups can avoid those conflicts by moving quickly to uncontested ground and, when that's no longer possible, turning dominant players' strengths against them. The authors call this competitive approach judo strategy. They use the Netscape-Microsoft battles to illustrate the three main principles of judo strategy: rapid movement, flexibility, and leverage. In the early part of the browser wars, for instance, Netscape applied the principle of rapid movement by being the first company to offer a free stand-alone browser. This allowed Netscape to build market share fast and to set the market standard. Flexibility became a critical factor later in the browser wars. In December 1995, when Microsoft announced that it would "embrace and extend" competitors' Internet successes, Netscape failed to give way in the face of superior strength. Instead it squared off against Microsoft and even turned down numerous opportunities to craft deep partnerships with other companies. The result was that Netscape lost deal after deal when competing with Microsoft for common distribution channels. Netscape applied the principle of leverage by using Microsoft's strengths against it. Taking advantage of Microsoft's determination to convert the world to Windows or Windows NT, Netscape made its software compatible with existing UNIX systems. While it is true that these principles can't replace basic execution, say the authors, without speed, flexibility, and leverage, very few companies can compete successfully on Internet time.
OPeNDAP Server4: Building a High-Performance Server for the DAP by Leveraging Existing Software
NASA Astrophysics Data System (ADS)
Potter, N.; West, P.; Gallagher, J.; Garcia, J.; Fox, P.
2006-12-01
OPeNDAP has been working in conjunction with NCAR/ESSL/HAO to develop a modular, high performance data server that will be the successor to the current OPeNDAP data server. The new server, called Server4, is really two servers: a 'Back-End' data server, which reads information from various types of data sources and packages the results in DAP objects; and a 'Front-End', which receives client DAP requests and then decides how to use features of the Back-End data server to build the correct responses. This architecture can be configured in several interesting ways: the Front- and Back-End components can be run on either the same or different machines, depending on security and performance needs; new Front-End software can be written to support other network data access protocols; and local applications can interact directly with the Back-End data server. This new server's Back-End component will use the server infrastructure developed by HAO for use with the Earth System Grid II project. Extensions needed to use it as part of the new OPeNDAP server were minimal. The HAO server was modified so that it loads 'data handlers' at run-time. Each data handler module only needs to satisfy a simple interface, which both enables the existing data handlers written for the old OPeNDAP server to be used directly and simplifies writing new handlers from scratch. The Back-End server leverages high-performance features developed for the ESG II project, so applications that can interact with it directly can read large volumes of data efficiently. The Front-End module of Server4 uses the Java Servlet system in place of the Common Gateway Interface (CGI) used in the past. New front-end modules can be written to support different network data access protocols, so that the same server will ultimately be able to support more than the DAP/2.0 protocol. As an example, we will discuss a SOAP interface that is currently in development. In addition to support for DAP/2.0 and prototypical support for a SOAP interface, the new server includes support for the THREDDS cataloging protocol. THREDDS is tightly integrated into the Front-End of Server4. The Server4 Front-End can make full use of advanced THREDDS features such as attribute specification and inheritance, and custom catalogs which segue into automatically generated catalogs, as well as providing a default behavior which requires almost no catalog configuration.
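The run-time 'data handler' idea can be illustrated with a tiny plug-in interface; the class and method names below are hypothetical and only mirror the spirit of the Back-End's handler contract.

```python
# Illustration of run-time "data handler" loading: handler modules all satisfy
# one small interface, so new data formats can be added without touching the
# server core. Interface and class names are hypothetical.
import importlib

class DataHandler:
    """Minimal handler contract: say which extensions you serve, build a DAP-like response."""
    extensions = ()
    def build_response(self, path, constraint=None):
        raise NotImplementedError

class NetCDFHandler(DataHandler):
    extensions = (".nc",)
    def build_response(self, path, constraint=None):
        return {"dataset": path, "constraint": constraint, "format": "netcdf"}

def load_handler(module_name, class_name):
    """Load a handler at run time, as the Back-End server does for its plug-ins."""
    cls = getattr(importlib.import_module(module_name), class_name)
    return cls()

handler = load_handler("__main__", "NetCDFHandler")
print(handler.build_response("/data/sst.nc", "sst[0:10][0:10]"))
```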
Array Processing in the Cloud: the rasdaman Approach
NASA Astrophysics Data System (ADS)
Merticariu, Vlad; Dumitru, Alex
2015-04-01
The multi-dimensional array data model is gaining more and more attention when dealing with Big Data challenges in a variety of domains such as climate simulations, geographic information systems, medical imaging or astronomical observations. Solutions provided by classical Big Data tools such as Key-Value Stores and MapReduce, as well as traditional relational databases, proved to be limited in domains associated with multi-dimensional data. This problem has been addressed by the field of array databases, in which systems provide database services for raster data, without imposing limitations on the number of dimensions that a dataset can have. Examples of datasets commonly handled by array databases include 1-dimensional sensor data, 2-D satellite imagery, 3-D x/y/t image time series as well as x/y/z geophysical voxel data, and 4-D x/y/z/t weather data. And this can grow as large as simulations of the whole universe when it comes to astrophysics. rasdaman is a well-established array database, which implements many optimizations for dealing with large data volumes and operation complexity. Among those, the latest one is intra-query parallelization support: a network of machines collaborate for answering a single array database query, by dividing it into independent sub-queries sent to different servers. This enables massive processing speed-ups, which promise solutions to research challenges on multi-Petabyte data cubes. There are several correlated factors which influence the speedup that intra-query parallelization brings: the number of servers, the capabilities of each server, the quality of the network, the availability of the data to the server that needs it in order to compute the result and many more. In the effort of adapting the engine to cloud processing patterns, two main components have been identified: one that handles communication and gathers information about the arrays sitting on every server, and a processing unit responsible for dividing work among available nodes and executing operations on local data. The federation daemon collects and stores statistics from the other network nodes and provides real time updates about local changes. Information exchanged includes available datasets, CPU load and memory usage per host. The processing component is represented by the rasdaman server. Using information from the federation daemon it breaks queries into subqueries to be executed on peer nodes, ships them, and assembles the intermediate results. Thus, we define a rasdaman network node as a pair of a federation daemon and a rasdaman server. Any node can receive a query and will subsequently act as this query's dispatcher, so all peers are at the same level and there is no single point of failure. Should a node become inaccessible, the peers will recognize this and will no longer consider it for distribution. Conversely, a peer can join the network at any time. To assess the feasibility of our approach, we deployed a rasdaman network in the Amazon Elastic Cloud environment on 1001 nodes, and observed that this feature can greatly increase the performance and scalability of the system, offering a large throughput of processed data.
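A toy illustration of intra-query parallelization over an array: a dispatching node splits the query's spatial domain among peers, each evaluates its part on local data, and the dispatcher assembles the partial results. rasdaman's real query language, tiling and scheduling are far more elaborate than this sketch.

```python
# Toy intra-query parallelisation: split a query's spatial domain into tiles,
# evaluate each tile "on a peer" (here, a thread), and assemble the results.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

array = np.random.rand(4000, 4000)          # stand-in for a distributed raster

def split_domain(shape, n_parts):
    rows = np.array_split(np.arange(shape[0]), n_parts)
    return [(r[0], r[-1] + 1) for r in rows]

def evaluate_subquery(bounds):              # e.g. an aggregation pushed to one peer
    r0, r1 = bounds
    return array[r0:r1].sum()

with ThreadPoolExecutor(max_workers=4) as peers:
    partials = list(peers.map(evaluate_subquery, split_domain(array.shape, 4)))
print(sum(partials), array.sum())           # assembled result matches the direct evaluation
```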
An extensible and lightweight architecture for adaptive server applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorton, Ian; Liu, Yan; Trivedi, Nihar
2008-07-10
Server applications augmented with behavioral adaptation logic can react to environmental changes, creating self-managing server applications with improved quality of service at runtime. However, developing adaptive server applications is challenging due to the complexity of the underlying server technologies and highly dynamic application environments. This paper presents an architecture framework, the Adaptive Server Framework (ASF), to facilitate the development of adaptive behavior for legacy server applications. ASF provides a clear separation between the implementation of adaptive behavior and the business logic of the server application. This means a server application can be extended with programmable adaptive features through the definition and implementation of control components defined in ASF. Furthermore, ASF is a lightweight architecture in that it incurs low CPU overhead and memory usage. We demonstrate the effectiveness of ASF through a case study, in which a server application dynamically determines the resolution and quality to scale an image based on the load of the server and network connection speed. The experimental evaluation demonstrates the performance gains made possible by adaptive behavior and the low overhead introduced by ASF.
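A minimal sketch of the case-study behaviour, choosing an image resolution and JPEG quality from server load and connection speed; the thresholds and the way ASF's control components are wired into the server are assumptions.

```python
# Sketch of the case-study behaviour: pick an image scale factor and quality
# from current server load and client connection speed. Thresholds are assumed.
def choose_rendering(cpu_load, connection_kbps):
    """cpu_load in [0, 1]; returns (scale_factor, jpeg_quality)."""
    if cpu_load > 0.8 or connection_kbps < 256:
        return 0.25, 40          # heavily loaded or slow link: small, low-quality image
    if cpu_load > 0.5 or connection_kbps < 1024:
        return 0.5, 70
    return 1.0, 90               # lightly loaded, fast link: full resolution

print(choose_rendering(cpu_load=0.9, connection_kbps=2000))   # (0.25, 40)
print(choose_rendering(cpu_load=0.2, connection_kbps=5000))   # (1.0, 90)
```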
NASA Technical Reports Server (NTRS)
Liew, K. H.; Urip, E.; Yang, S. L.; Marek, C. J.
2004-01-01
Droplet interaction with a high temperature gaseous crossflow is important because of its wide application in systems involving two phase mixing such as in combustion requiring quick mixing of fuel and air with the reduction of pollutants and for jet mixing in the dilution zone of combustors. Therefore, the focus of this work is to investigate dispersion of a two-dimensional atomized and evaporating spray jet into a two-dimensional crossflow. An interactive Microsoft Excel program for tracking a single droplet in crossflow that has previously been developed will be modified to include droplet evaporation computation. In addition to the high velocity airflow, the injected droplets are also subjected to combustor temperature and pressure that affect their motion in the flow field. Six ordinary differential equations are then solved by 4th-order Runge-Kutta method using Microsoft Excel software. Microsoft Visual Basic programming and Microsoft Excel macrocode are used to produce the data and plot graphs describing the droplet's motion in the flow field. This program computes and plots the data sequentially without forcing the user to open other types of plotting programs. A user's manual on how to use the program is included.
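For reference, a generic 4th-order Runge-Kutta step of the kind the Excel/VBA program applies to its six coupled equations; the droplet drag and evaporation right-hand side is not reproduced, so a simple two-equation system stands in for illustration.

```python
# Generic 4th-order Runge-Kutta step; the placeholder right-hand side models a
# position/velocity pair with quadratic drag, not the paper's droplet equations.
import numpy as np

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def rhs(t, y):
    x, v = y
    return np.array([v, -0.5 * v * abs(v)])

y = np.array([0.0, 10.0])
for step in range(100):
    y = rk4_step(rhs, step * 0.01, y, 0.01)
print(y)
```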
NASA Astrophysics Data System (ADS)
Kerley, Dan; Smith, Malcolm; Dunn, Jennifer; Herriot, Glen; Véran, Jean-Pierre; Boyer, Corinne; Ellerbroek, Brent; Gilles, Luc; Wang, Lianqi
2016-08-01
The Narrow Field Infrared Adaptive Optics System (NFIRAOS) is the first light Adaptive Optics (AO) system for the Thirty Meter Telescope (TMT). A critical component of NFIRAOS is the Real-Time Controller (RTC) subsystem which provides real-time wavefront correction by processing wavefront information to compute Deformable Mirror (DM) and Tip/Tilt Stage (TTS) commands. The National Research Council of Canada - Herzberg (NRC-H), in conjunction with TMT, has developed a preliminary design for the NFIRAOS RTC. The preliminary architecture for the RTC is comprised of several Linux-based servers. These servers are assigned various roles including: the High-Order Processing (HOP) servers, the Wavefront Corrector Controller (WCC) server, the Telemetry Engineering Display (TED) server, the Persistent Telemetry Storage (PTS) server, and additional testing and spare servers. There are up to six HOP servers that accept high-order wavefront pixels, and perform parallelized pixel processing and wavefront reconstruction to produce wavefront corrector error vectors. The WCC server performs low-order mode processing, and synchronizes and aggregates the high-order wavefront corrector error vectors from the HOP servers to generate wavefront corrector commands. The Telemetry Engineering Display (TED) server is the RTC interface to TMT and other subsystems. The TED server receives all external commands and dispatches them to the rest of the RTC servers and is responsible for aggregating several offloading and telemetry values that are reported to other subsystems within NFIRAOS and TMT. The TED server also provides the engineering GUIs and real-time displays. The Persistent Telemetry Storage (PTS) server contains fault tolerant data storage that receives and stores telemetry data, including data for Point-Spread Function Reconstruction (PSFR).
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-29
... submitted electronically in Microsoft Excel or Word formats to [email protected] . FOR FURTHER... recommendations should be submitted electronically in Microsoft Excel or Word format. Respondents to this request...
A survey of tools for the analysis of quantitative PCR (qPCR) data.
Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas
2014-09-01
Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
Energy Efficiency in Small Server Rooms: Field Surveys and Findings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheung, Iris; Greenberg, Steve; Mahdavi, Roozbeh
Fifty-seven percent of US servers are housed in server closets, server rooms, and localized data centers, in what are commonly referred to as small server rooms, which comprise 99 percent of all server spaces in the US. While many mid-tier and enterprise-class data centers are owned by large corporations that consider energy efficiency a goal to minimize business operating costs, small server rooms typically are not similarly motivated. They are characterized by decentralized ownership and management and come in many configurations, which creates a unique set of efficiency challenges. To develop energy efficiency strategies for these spaces, we surveyed 30 small server rooms across eight institutions, and selected four of them for detailed assessments. The four rooms had Power Usage Effectiveness (PUE) values ranging from 1.5 to 2.1. Energy saving opportunities ranged from no- to low-cost measures such as raising cooling set points and better airflow management, to more involved but cost-effective measures including server consolidation and virtualization, and dedicated cooling with economizers. We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers, rather than server rooms. Backup power requirements and IT and cooling efficiency should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge on server operation, and the implementation of energy efficiency measures in small server rooms.
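The Power Usage Effectiveness figures quoted above follow from a simple ratio, sketched below with illustrative numbers.

```python
# Power Usage Effectiveness (PUE): total facility power divided by the power
# delivered to IT equipment. The inputs below are illustrative only.
def pue(it_power_kw, cooling_kw, power_distribution_loss_kw, lighting_kw=0.0):
    total = it_power_kw + cooling_kw + power_distribution_loss_kw + lighting_kw
    return total / it_power_kw

print(round(pue(it_power_kw=10.0, cooling_kw=8.0, power_distribution_loss_kw=2.5, lighting_kw=0.5), 2))  # 2.1
```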
NASA Astrophysics Data System (ADS)
Stepanov, Sergey
2013-03-01
X-Ray Server (x-server.gmca.aps.anl.gov) is a WWW-based computational server for modeling of X-ray diffraction, reflection and scattering data. The modeling software operates directly on the server and can be accessed remotely either from web browsers or from user software. In the latter case the server can be deployed as a software library or a data fitting engine. As the server recently surpassed the milestones of 15 years online and 1.5 million calculations, it has accumulated a number of technical solutions that are discussed in this paper. The developed approaches to detecting physical model limits and user calculation failures, solutions to spam and firewall problems, ways to involve the community in replenishing databases and methods to teach users automated access to the server programs may be helpful for X-ray researchers interested in using the server or sharing their own software online.
Effect of video server topology on contingency capacity requirements
NASA Astrophysics Data System (ADS)
Kienzle, Martin G.; Dan, Asit; Sitaram, Dinkar; Tetzlaff, William H.
1996-03-01
Video servers need to assign a fixed set of resources to each video stream in order to guarantee on-time delivery of the video data. If a server has insufficient resources to guarantee the delivery, it must reject the stream request rather than slowing down all existing streams. Large scale video servers are being built as clusters of smaller components, so as to be economical, scalable, and highly available. This paper uses a blocking model developed for telephone systems to evaluate video server cluster topologies. The goal is to achieve high utilization of the components and low per-stream cost combined with low blocking probability and high user satisfaction. The analysis shows substantial economies of scale achieved by larger server images. Simple distributed server architectures can result in partitioning of resources with low achievable resource utilization. By comparing achievable resource utilization of partitioned and monolithic servers, we quantify the cost of partitioning. Next, we present an architecture for a distributed server system that avoids resource partitioning and results in highly efficient server clusters. Finally, we show how, in these server clusters, further optimizations can be achieved through caching and batching of video streams.
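The blocking model referred to above comes from telephone engineering; the classic Erlang-B formula, computed with its standard recurrence, is sketched below. The paper's exact model may differ in detail, but the example illustrates the economy of scale: a larger pool blocks far less at the same per-slot load.

```python
# Erlang-B blocking probability: the chance that a stream request is rejected
# when a component with `servers` stream slots carries `offered_load` Erlangs.
def erlang_b(offered_load, servers):
    b = 1.0
    for n in range(1, servers + 1):
        b = (offered_load * b) / (n + offered_load * b)
    return b

# A larger monolithic server achieves lower blocking at the same 80% per-slot load.
print(erlang_b(offered_load=80, servers=100))    # well under 1% blocking
print(erlang_b(offered_load=8, servers=10))      # roughly 12% blocking
```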
A Community - Centered Astronomy Research Program
NASA Astrophysics Data System (ADS)
Boyce, Pat; Boyce, Grady
2017-06-01
The Boyce Research Initiatives and Education Foundation (BRIEF) is providing semester-long, hands-on, astronomy research experiences for students of all ages that result in their publishing peer-reviewed papers. The course in astronomy and double star research has evolved from a face-to-face learning experience with two instructors to an online-hybrid course that simultaneously supports classroom instruction at a variety of schools in the San Diego area. Currently, there are over 65 students enrolled in three community colleges, seven high schools, and one university as well as individual adult learners. Instructional experience, courseware, and supporting systems were developed and refined through experience gained in classroom settings from 2014 through 2016. Topics of instruction include Kepler's Laws, basic astrometry, properties of light, CCD imaging, use of filters for varying stellar spectral types, and how to perform research, scientific writing, and proposal preparation. Volunteer instructors were trained by taking the course and producing their own research papers. An expanded program was launched in the fall semester of 2016. Twelve papers from seven schools were produced; eight have been accepted for publication by the Journal of Double Star Observations (JDSO) and the remainder are in peer review. Three additional papers have been accepted by the JDSO and two more are in process. Three college professors and five advanced amateur astronomers are now qualified volunteer instructors. Supporting tools are provided by a BRIEF server and other online services. The server-based tools range from Microsoft Office and planetarium software to top-notch imaging programs and computational software for data reduction for each student team. Observations are performed by robotic telescopes worldwide supported by BRIEF. With this success, student demand has increased significantly. Many of the graduates of the first semester course wanted to expand their astronomy knowledge and experience. To answer this demand, BRIEF is developing additional astronomy research courses with partners in advanced astrometry, photometry, and exoplanets. The program provides a significant opportunity for schools, teachers, and advanced amateur astronomers to introduce high school and college students to astronomy, science, and STEM careers.
A Community-Centered Astronomy Research Program (Abstract)
NASA Astrophysics Data System (ADS)
Boyce, P.; Boyce, G.
2017-12-01
(Abstract only) The Boyce Research Initiatives and Education Foundation (BRIEF) is providing semester-long, hands-on, astronomy research experiences for students of all ages that result in their publishing peer-reviewed papers. The course in astronomy and double star research has evolved from a face-to-face learning experience with two instructors to an online-hybrid course that simultaneously supports classroom instruction at a variety of schools in the San Diego area. Currently, there are over 65 students enrolled in three community colleges, seven high schools, and one university as well as individual adult learners. Instructional experience, courseware, and supporting systems were developed and refined through experience gained in classroom settings from 2014 through 2016. Topics of instruction include Kepler's Laws, basic astrometry, properties of light, CCD imaging, use of filters for varying stellar spectral types, and how to perform research, scientific writing, and proposal preparation. Volunteer instructors were trained by taking the course and producing their own research papers. An expanded program was launched in the fall semester of 2016. Twelve papers from seven schools were produced; eight have been accepted for publication by the Journal of Double Star Observations (JDSO) and the remainder are in peer review. Three additional papers have been accepted by the JDSO and two more are in process. Three college professors and five advanced amateur astronomers are now qualified volunteer instructors. Supporting tools are provided by a BRIEF server and other online services. The server-based tools range from Microsoft Office and planetarium software to top-notch imaging programs and computational software for data reduction for each student team. Observations are performed by robotic telescopes worldwide supported by BRIEF. With this success, student demand has increased significantly. Many of the graduates of the first semester course wanted to expand their astronomy knowledge and experience. To answer this demand, BRIEF is developing additional astronomy research courses with partners in advanced astrometry, photometry, and exoplanets. The program provides a significant opportunity for schools, teachers, and advanced amateur astronomers to introduce high school and college students to astronomy, science, and STEM careers.
Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan; Fisher, Ward; Yoksas, Tom
2015-04-01
Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high expectations from students who have grown up with smartphones and tablets. These changes are upending traditional approaches to accessing and using data and software. Unidata recognizes that its products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable in the form of downloadable Unidata-in-a-box virtual images, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our ongoing efforts to deploy a suite of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of TDS, McIDAS ADDE and AWIPS II data servers and the Integrated Data Viewer (IDV) visualization tool.
Voxel modelling of sands and gravels of Pleistocene Rhine and Meuse deposits in Flanders (Belgium)
NASA Astrophysics Data System (ADS)
van Haren, Tom; Dirix, Katrijn; De Koninck, Roel
2017-04-01
Voxel modelling, or 3D volume modelling, of Quaternary raw materials is VITO's next step in the geological layer modelling of the Flanders and Brussels Capital Region in Belgium (G3D - Matthijs et al., 2013). The aim is to schematise deposits as voxels ('volumetric pixels') that represent lithological information on a grid in three-dimensional space (25 x 25 x 0.5 m). A new voxel model of Pleistocene Meuse and Rhine sands and gravels is illustrated, succeeding a voxel model of loess resources (van Haren et al., 2016). The modelling methodology is based on a geological 'skeleton' extracted from the regional geological layer model of Flanders. This framework holds the 3D interpolated lithological information of 5,000 boreholes. First, a check on quality and spatial location was used to filter the boreholes down to significant and usable lithological information. Subsequently, a manual geological interpretation was performed to analyse the stratigraphical arrangement and identify the raw materials of interest. Finally, a workflow was developed that automatically encodes and classifies the borehole descriptions in a standardized manner. This workflow was implemented by combining Microsoft Access® and ArcMap® and is able to convert borehole descriptions into specific geological parameters. An analysis of the converted lithological data prior to interpolation improves the understanding of the spatial distribution, helps fine-tune the modelling process and reveals the limitations of the data. The converted lithological data were 3D interpolated in Voxler using IDW and resulted in a model containing 52 million voxels. It gives an overview of the regional distribution and thickness variation of the Pleistocene Meuse and Rhine aggregates of interest. Much effort has been put into setting up a database structure in Microsoft Access® and Microsoft SQL Server® in order to arrange and analyse the lithological information, link the voxel model with the geological layer model, and handle and analyse the resulting voxel model data. The database structure allows the user to analyse and set certain preconditions (minimal thickness or maximum depth of aggregates, maximum thickness of intercalating clays) on the model in order to calculate and view distributions of deposits which meet these preconditions. These results are interesting for pre-prospective purposes, illustrating the distribution of lithological information and making the end user more aware of the potential economic value of the subsurface. References van Haren T. et al. (2016) - An interactive voxel model for mineral resources: loess deposits in Flanders (Belgium). Zeitschrift der Deutschen Gesellschaft für Geowissenschaften, Volume 167, Number 4, pp. 363-376(14). Matthijs J. et al. (2013) - Geological 3D layer model of the Flanders Region and Brussels-Capital Region - 2nd version. Study performed in order of the Ministry of the Flemish Community. VITO report 2013/R/ETE/43, 24p. (in Dutch)
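A minimal inverse-distance-weighting (IDW) sketch of the interpolation step, with made-up sample points; the production model uses Voxler on the 25 x 25 x 0.5 m grid with far more data.

```python
# Minimal IDW interpolation of a lithological attribute at one voxel centre
# from nearby encoded borehole samples; coordinates and values are made up.
import numpy as np

def idw(sample_xyz, sample_values, query_xyz, power=2.0, eps=1e-9):
    d = np.linalg.norm(sample_xyz - query_xyz, axis=1)
    if np.any(d < eps):                          # query coincides with a sample
        return float(sample_values[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * sample_values) / np.sum(w))

samples = np.array([[0, 0, 0], [25, 0, -0.5], [0, 25, -1.0], [25, 25, -1.5]], dtype=float)
gravel_fraction = np.array([0.8, 0.6, 0.3, 0.5])
print(idw(samples, gravel_fraction, np.array([12.0, 12.0, -0.75])))
```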
76 FR 42164 - Announcement of Competition Under the America COMPETES Reauthorization Act of 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-18
... listing will be in a format completely compatible with Microsoft Excel 2007 and contain the information... by VA. The narrative will be in a format completely compatible with Microsoft Word 2007, not...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-04
... Microsoft Excel version of the Form 561. The Microsoft Excel version of the Form 561 has been available... of the Excel software to make filling the form out easier and compiling the filed information more...
Design and implementation of streaming media server cluster based on FFMpeg.
Zhao, Hong; Zhou, Chun-long; Jin, Bao-zhao
2015-01-01
Poor performance and network congestion are commonly observed in the streaming media single server system. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations and the balance among servers is maintained by the dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data. The experiment results show that the server cluster system has significantly alleviated the network congestion and improved the performance in comparison with the single server system.
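A sketch of the load-balancing idea: servers report their load as active feedback, and a new user in a region is directed to the least-loaded server for that region. The feedback metric and weighting used by the FFMpeg-based cluster are not reproduced; the names and numbers below are illustrative.

```python
# Dynamic load balancing with active feedback: pick the least-loaded server
# that serves the user's region. Server names, regions and loads are illustrative.
servers = {
    "cn-north-1": {"region": "north", "load": 0.35},
    "cn-north-2": {"region": "north", "load": 0.80},
    "cn-south-1": {"region": "south", "load": 0.55},
}

def report_load(name, load):                 # active feedback from a server agent
    servers[name]["load"] = load

def assign_server(user_region):
    candidates = [(s["load"], name) for name, s in servers.items() if s["region"] == user_region]
    if not candidates:                       # fall back to the globally least-loaded server
        candidates = [(s["load"], name) for name, s in servers.items()]
    return min(candidates)[1]

report_load("cn-north-1", 0.90)
print(assign_server("north"))                # now redirected to cn-north-2
```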
Design and Implementation of Streaming Media Server Cluster Based on FFMpeg
Zhao, Hong; Zhou, Chun-long; Jin, Bao-zhao
2015-01-01
Poor performance and network congestion are commonly observed in the streaming media single server system. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations and the balance among servers is maintained by the dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data. The experiment results show that the server cluster system has significantly alleviated the network congestion and improved the performance in comparison with the single server system. PMID:25734187
FastLane: An Agile Congestion Signaling Mechanism for Improving Datacenter Performance
2013-05-20
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
... a separate document, our preferred file format is Microsoft Word. If you attach multiple comments (such as form letters), our preferred format is a Microsoft Excel spreadsheet. (2) By Hard Copy: Submit...
Windows Program For Driving The TDU-850 Printer
NASA Technical Reports Server (NTRS)
Parrish, Brett T.
1995-01-01
Program provides WYSIWYG compatibility between video display and printout. PDW is Microsoft Windows printer-driver computer program for use with Raytheon TDU-850 printer. Provides previously unavailable linkage between printer and IBM PC-compatible computers running Microsoft Windows. Enhances capabilities of Raytheon TDU-850 hardcopier by emulating all textual and graphical features normally supported by laser/ink-jet printers and makes printer compatible with any Microsoft Windows application. Also provides capabilities not found in laser/ink-jet printer drivers by providing certain Windows applications with ability to render high quality, true gray-scale photographic hardcopy on TDU-850. Written in C language.
Software For Design And Analysis Of Tanks And Cylindrical Shells
NASA Technical Reports Server (NTRS)
Luz, Paul L.; Graham, Jerry B.
1995-01-01
Skin-stringer Tank Analysis Spreadsheet System (STASS) computer program developed for use as preliminary design software tool that enables quick-turnaround design and analysis of structural domes and cylindrical barrel sections in propellant tanks or other cylindrical shells. Determines minimum required skin thicknesses for domes and cylindrical shells to withstand material failure due to applied pressures (ullage and/or hydrostatic) and runs buckling analyses on cylindrical shells and skin-stringers. Implemented as workbook program, using Microsoft Excel v4.0 on Macintosh II. Also implemented using Microsoft Excel v4.0 for Microsoft Windows v3.1 IBM PC.
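As an illustration of the kind of sizing relation such a tool evaluates, here is the classic thin-walled pressure-vessel estimate of minimum skin thickness; STASS applies its own failure criteria and buckling checks, so this shows only the illustrative core, with assumed inputs.

```python
# Classic thin-wall hoop-stress relation for a first-cut minimum skin thickness:
# t = FS * p * r / sigma_allowable. Inputs and the factor of safety are assumed.
def min_skin_thickness(pressure_pa, radius_m, allowable_stress_pa, factor_of_safety=1.4):
    return factor_of_safety * pressure_pa * radius_m / allowable_stress_pa

# Example: 0.35 MPa ullage pressure, 2.0 m radius barrel, 280 MPa allowable stress.
print(f"{min_skin_thickness(0.35e6, 2.0, 280e6) * 1000:.2f} mm")   # about 3.5 mm
```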
An add-in implementation of the RESAMPLING syntax under Microsoft EXCEL.
Meineke, I
2000-10-01
The RESAMPLING syntax defines a set of powerful commands, which allow the programming of probabilistic statistical models with few, easily memorized statements. This paper presents an implementation of the RESAMPLING syntax using Microsoft EXCEL with Microsoft WINDOWS(R) as a platform. Two examples are given to demonstrate typical applications of RESAMPLING in biomedicine. Details of the implementation with special emphasis on the programming environment are discussed at length. The add-in is available electronically to interested readers upon request. The use of the add-in facilitates numerical statistical analyses of data from within EXCEL in a comfortable way.
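A small bootstrap in plain Python illustrates the style of probabilistic model the RESAMPLING commands express; this is not the add-in's own syntax, and the data are made up.

```python
# Bootstrap confidence interval for a mean, a typical resampling-style analysis.
import random

data = [4.1, 5.0, 3.8, 4.6, 5.2, 4.9, 4.4, 5.5]     # e.g. a measured biomedical variable

def bootstrap_mean_ci(sample, n_resamples=10_000, alpha=0.05):
    means = sorted(
        sum(random.choices(sample, k=len(sample))) / len(sample)
        for _ in range(n_resamples)
    )
    lo = means[int(alpha / 2 * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples)]
    return lo, hi

print(bootstrap_mean_ci(data))    # 95% bootstrap confidence interval for the mean
```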
Eltoukhy, Moataz; Kelly, Adam; Kim, Chang-Young; Jun, Hyung-Pil; Campbell, Richard; Kuenze, Christopher
2016-01-01
Cost-effective, quantifiable assessment of lower extremity movement represents a potential improvement over standard tools for evaluation of injury risk. Ten healthy participants completed three trials of a drop jump, overhead squat, and single-leg squat task. Peak hip and knee kinematics were assessed using an 8-camera BTS Smart 7000DX motion analysis system and the Microsoft Kinect® camera system. The agreement and consistency between both uncorrected and corrected Kinect kinematic variables and the BTS camera system were assessed using intraclass correlation coefficients. Peak sagittal plane kinematics measured using the Microsoft Kinect® camera system explained a significant amount of variance [Range(hip) = 43.5-62.8%; Range(knee) = 67.5-89.6%] in peak kinematics measured using the BTS camera system. Across tasks, peak knee flexion angle and peak hip flexion were found to be consistent and in agreement when the Microsoft Kinect® camera system was directly compared to the BTS camera system, but these values were improved following application of a corrective factor. The Microsoft Kinect® may not be an appropriate surrogate for traditional motion analysis technology, but it may have potential applications as a real-time feedback tool in pathological or high injury risk populations.
Analysis of data throughput in communication between PLCs and HMI/SCADA systems
NASA Astrophysics Data System (ADS)
Mikolajek, Martin; Koziorek, Jiri
2016-09-01
This paper focuses on the analysis of data throughput in communication between PLCs and HMI/SCADA systems. The first part of the paper discusses basic problems of communication between PLC and HMI systems. The next part covers specific types of PLC-HMI communication requests; for those cases the paper discusses response times and data throughput [1-3]. The subsequent section contains practical experiments with various data exchanges between a Siemens PLC and an HMI. The communication options described in this article focus on using an OPC server for visualization software, a custom HMI system, and a separate application created using .NET technology. The last part of the article presents some communication solutions.
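A generic sketch of how response time and throughput of repeated read requests can be measured; read_tags is a placeholder for whichever channel is actually used (OPC server, custom HMI protocol, or a .NET application), and the tag names are hypothetical.

```python
# Measure round-trip rate and data throughput of repeated PLC read requests.
# read_tags() stands in for the real communication call; timings are simulated.
import time

def read_tags(tag_names):                    # placeholder for the real communication call
    time.sleep(0.002)                        # simulate a 2 ms round trip
    return {t: 0 for t in tag_names}

def measure(tag_names, repetitions=500, bytes_per_tag=4):
    start = time.perf_counter()
    for _ in range(repetitions):
        read_tags(tag_names)
    elapsed = time.perf_counter() - start
    return repetitions / elapsed, repetitions * len(tag_names) * bytes_per_tag / elapsed

rps, bps = measure(["DB1.DBW0", "DB1.DBW2", "M0.0"])
print(f"{rps:.0f} requests/s, {bps:.0f} bytes/s")
```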
WebGLORE: a Web service for Grid LOgistic REgression
Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian
2013-01-01
WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. Availability and implementation: http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation. Contact: x1jiang@ucsd.edu PMID:24072732
Regional early flood warning system: design and implementation
NASA Astrophysics Data System (ADS)
Chang, L. C.; Yang, S. N.; Kuo, C. L.; Wang, Y. F.
2017-12-01
This study proposes a prototype of a regional early flood inundation warning system for Tainan City, Taiwan. AI technology is used to forecast multi-step-ahead regional flood inundation maps during storm events. The computing time is only a few seconds, which enables real-time regional flood inundation forecasting. A database is built to organize the data and information needed for building real-time forecasting models, maintaining the relations of forecasted points, and displaying forecasted results, while real-time data acquisition is another key task, since the model must access rain gauge information immediately to provide forecast services. All database-related programs are built on Microsoft SQL Server using Visual C# to extract real-time hydrological data, manage data, store the forecasted data, and provide the information to the visual map-based display. The regional early flood inundation warning system uses up-to-date Web technologies, driven by the database and real-time data acquisition, to display the on-line forecast flood inundation depths in the study area. The user-friendly interface sequentially shows the inundated area on Google Maps together with the maximum inundation depth and its location, and provides a KMZ file download of the results that can be viewed in Google Earth. The developed system provides all the relevant information and on-line forecast results, which helps city authorities make decisions during typhoon events and take actions to mitigate losses.
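A hedged sketch of pulling recent rain-gauge readings from the SQL Server database that feeds the forecasting models, using pyodbc; the connection string, table and column names are hypothetical.

```python
# Hypothetical query for the latest hour of rain-gauge readings from SQL Server.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=flood-db;DATABASE=ehfw;UID=reader;PWD=secret"
)
cursor = conn.cursor()
cursor.execute(
    "SELECT station_id, obs_time, rainfall_mm FROM rain_gauge "
    "WHERE obs_time >= DATEADD(hour, -1, GETDATE()) ORDER BY obs_time"
)
for station_id, obs_time, rainfall_mm in cursor.fetchall():
    print(station_id, obs_time, rainfall_mm)
```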
A DICOM based radiotherapy plan database for research collaboration and reporting
NASA Astrophysics Data System (ADS)
Westberg, J.; Krogh, S.; Brink, C.; Vogelius, I. R.
2014-03-01
Purpose: To create a central radiotherapy (RT) plan database for dose analysis and reporting, capable of calculating and presenting statistics on user defined patient groups. The goal is to facilitate multi-center research studies with easy and secure access to RT plans and statistics on protocol compliance. Methods: RT institutions are able to send data to the central database using DICOM communications on a secure computer network. The central system is composed of a number of DICOM servers, an SQL database and in-house developed software services to process the incoming data. A web site within the secure network allows the user to manage their submitted data. Results: The RT plan database has been developed in Microsoft .NET and users are able to send DICOM data between RT centers in Denmark. Dose-volume histogram (DVH) calculations performed by the system are comparable to those of conventional RT software. A permission system was implemented to ensure access control and easy, yet secure, data sharing across centers. The reports contain DVH statistics for structures in user defined patient groups. The system currently contains over 2200 patients in 14 collaborations. Conclusions: A central RT plan repository for use in multi-center trials and quality assurance was created. The system provides an attractive alternative to dummy runs by enabling continuous monitoring of protocol conformity and plan metrics in a trial.
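A sketch of a cumulative dose-volume histogram computed from the dose values of the voxels inside one structure, the quantity the central system recomputes for reporting; the binning and the synthetic dose values are illustrative only.

```python
# Cumulative DVH: for each dose level, the fraction of the structure's voxels
# receiving at least that dose. Dose values here are synthetic.
import numpy as np

def cumulative_dvh(structure_doses_gy, bin_width_gy=0.1):
    bins = np.arange(0.0, structure_doses_gy.max() + bin_width_gy, bin_width_gy)
    volume_percent = np.array([(structure_doses_gy >= d).mean() * 100.0 for d in bins])
    return bins, volume_percent

doses = np.random.default_rng(1).normal(loc=50.0, scale=3.0, size=10_000).clip(min=0)
dose_axis, vol_axis = cumulative_dvh(doses)
print(f"V50Gy = {vol_axis[np.searchsorted(dose_axis, 50.0)]:.1f}% of the structure volume")
```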
Wang, Fang; Dong, Jian-Cheng; Chen, Jian-Rong; Wu, Hui-Qun; Liu, Man-Hua; Xue, Li-Ly; Zhu, Xiang-Hua; Wang, Jian
2015-01-01
To independently research and develop an electronic information system for the safety administration of newborns in rooming-in care, and to investigate the effects of its clinical application. Using Visual Studio 2010 with a SQL Server 2005 database and Microsoft visual programming tools, an interactive mobile information system was established, integrating data, information and knowledge by means of information structures, information processes and information technology. From July 2011 to July 2012, a total of 210 newborns from the rooming-in care of the Obstetrics Department of the Second Affiliated Hospital of Nantong University were chosen and randomly divided into two groups: the information system monitoring group (110 cases) and the regular monitoring group (100 cases). The incidence of abnormal events and the degree of satisfaction were recorded and calculated. ① The wireless electronic information system has four main functions: risk scaling display, identity recognition display, nursing round notes board and health education board; ② statistically significant differences were found between the two groups both in the active or passive discovery rate of abnormal events occurring in the newborns (P<0.05) and in the satisfaction of the mothers and their families (P<0.05); ③ the system was sensitive and reliable, and the wireless transmission of information was correct and safe. The system is highly practical in the clinic and can ensure the safety of newborns while improving satisfaction.
Online database for documenting clinical pathology resident education.
Hoofnagle, Andrew N; Chou, David; Astion, Michael L
2007-01-01
Training of clinical pathologists is evolving and must now address the 6 core competencies described by the Accreditation Council for Graduate Medical Education (ACGME), which include patient care. A substantial portion of the patient care performed by the clinical pathology resident takes place while the resident is on call for the laboratory, a practice that provides the resident with clinical experience and assists the laboratory in providing quality service to clinicians in the hospital and surrounding community. Documenting the educational value of these on-call experiences and providing evidence of competence is difficult for residency directors. An online database of these calls, entered by residents and reviewed by faculty, would provide a mechanism for documenting and improving the education of clinical pathology residents. With Microsoft Access we developed an online database that uses active server pages and secure sockets layer encryption to document calls to the clinical pathology resident. Using the data collected, we evaluated the efficacy of 3 interventions aimed at improving resident education. The database facilitated the documentation of more than 4 700 calls in the first 21 months it was online, provided archived resident-generated data to assist in serving clients, and demonstrated that 2 interventions aimed at improving resident education were successful. We have developed a secure online database, accessible from any computer with Internet access, that can be used to easily document clinical pathology resident education and competency.
ERIC Educational Resources Information Center
de Miranda, John
The field of alcohol server awareness and training has grown dramatically in the past several years and the idea of training servers to reduce alcohol problems has become a central fixture in the current alcohol policy debate. The San Mateo County, California Server Information Program (SIP) is a community-based prevention strategy designed to…
The use of geospatial web services for exchanging utilities data
NASA Astrophysics Data System (ADS)
Kuczyńska, Joanna
2013-04-01
Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains information about technical infrastructure that is important for many institutions. This requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions that administer transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data exchange methodology that can be implemented on a variety of hardware and software platforms. The methodology uses the Unified Modeling Language (UML), eXtensible Markup Language (XML), and Geography Markup Language (GML). It is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases and written in the universal modeling language UML. A combined model that defines a common data structure was also built and transformed into GML, the standard developed for the exchange of geographic information. The structure of the document describing the data that may be exchanged is defined in an .xsd file. Network services were selected and implemented in the data exchange system based on open source tools, and the methodology was implemented and tested. Data in the agreed data structure, together with metadata, were set up on the server. Data access is provided by geospatial network services: data searching via the Catalog Service for the Web (CSW) and data collection via the Web Feature Service (WFS). WFS also provides operations for modifying data, for example so that a utility administrator can update them. The proposed solution significantly increases the efficiency of data exchange and facilitates maintenance of the National Geodetic and Cartographic Resource.
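An example of the kind of OGC Web Feature Service request used to collect utility features; the service URL and feature type name are hypothetical.

```python
# WFS GetFeature request for a (hypothetical) utility feature type, returning GML.
import requests

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "gesut:power_line",        # hypothetical utility feature type
    "outputFormat": "text/xml; subtype=gml/3.1.1",
    "maxFeatures": "50",
}
response = requests.get("https://example.gov.pl/geoserver/wfs", params=params)
print(response.status_code, len(response.content), "bytes of GML")
```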
A web access script language to support clinical application development.
O'Kane, K C; McColligan, E E
1998-02-01
This paper describes the development of a script language to support the implementation of decentralized, clinical information applications on the World Wide Web (Web). The goal of this work is to facilitate construction of low overhead, fully functional clinical information systems that can be accessed anywhere by low cost Web browsers to search, retrieve and analyze stored patient data. The Web provides a model of network access to data bases on a global scale. Although it was originally conceived as a means to exchange scientific documents, Web browsers and servers currently support access to a wide variety of audio, video, graphical and text based data to a rapidly growing community. Access to these services is via inexpensive client software browsers that connect to servers by means of the open architecture of the Internet. In this paper, the design and implementation of a script language that supports the development of low cost, Web-based, distributed clinical information systems for both Inter- and Intra-Net use is presented. The language is based on the Mumps language and, consequently, supports many legacy applications with few modifications. Several enhancements, however, have been made to support modern programming practices and the Web interface. The interpreter for the language also supports standalone program execution on Unix, MS-Windows, OS/2 and other operating systems.
Change in Microsoft's Licensing Prices Attracts Some Colleges and Worries Others.
ERIC Educational Resources Information Center
Olsen, Florence
2002-01-01
Discusses the difficult choices facing campus officials as Microsoft pressures colleges to sign lease agreements for desktop software rather than continue to buy licenses; the new leasing option saves money in the short term but might limit choices later. (EV)
SiRen: Leveraging Similar Regions for Efficient and Accurate Variant Calling
2015-05-30
Cloudera, EMC2, Ericsson, Facebook, Guavus, HP, Huawei, Informatica, Intel, Microsoft, NetApp, Pivotal, Samsung, Schlumberger, Splunk, Virdata and VMware.
DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET
The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...
Analysis of practical backoff protocols for contention resolution with multiple servers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, L.A.; MacKenzie, P.D.
Backoff protocols are probably the most widely used protocols for contention resolution in multiple access channels. In this paper, we analyze the stochastic behavior of backoff protocols for contention resolution among a set of clients and servers, each server being a multiple access channel that deals with contention like an Ethernet channel. We use the standard model in which each client generates requests for a given server according to a Bernoulli distribution with a specified mean. The client-server request rate of a system is the maximum over all client-server pairs (i, j) of the sum of all request rates associated with either client i or server j. Our main result is that any superlinear polynomial backoff protocol is stable for any multiple-server system with a sub-unit client-server request rate. We confirm the practical relevance of our result by demonstrating experimentally that the average waiting time of requests is very small when such a system is run with reasonably few clients and reasonably small request rates such as those that occur in actual Ethernets. Our result is the first proof of stability for any backoff protocol for contention resolution with multiple servers. Our result is also the first proof that any weakly acknowledgment-based protocol is stable for contention resolution with multiple servers and such high request rates. Two special cases of our result are of interest. Hastad, Leighton and Rogoff have shown that for a single-server system with a sub-unit client-server request rate any modified superlinear polynomial backoff protocol is stable. These modified backoff protocols are similar to standard backoff protocols but require more random bits to implement. The special case of our result in which there is only one server extends the result of Hastad, Leighton and Rogoff to standard (practical) backoff protocols. Finally, our result applies to dynamic routing in optical networks.
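To make the setting concrete, the following toy Python simulation sketches a superlinear (quadratic) polynomial backoff protocol with Bernoulli request arrivals and several Ethernet-like servers; the parameters and data structures are illustrative and are not taken from the paper's experiments.

```python
import random

def simulate(num_clients=8, num_servers=2, steps=20000, p=0.05, alpha=2.0):
    """Toy polynomial backoff: after i failed attempts a request waits a
    uniformly random number of steps in [1, (i+1)**alpha] before retrying."""
    random.seed(0)
    # per client, per server: FIFO of pending requests
    pending = [[[] for _ in range(num_servers)] for _ in range(num_clients)]
    waits = []
    for t in range(steps):
        # Bernoulli arrivals: client c generates a request for server s with prob p
        for c in range(num_clients):
            for s in range(num_servers):
                if random.random() < p:
                    pending[c][s].append({"born": t, "fails": 0, "next": t})
        # each server is an Ethernet-like channel: success iff exactly one sender
        for s in range(num_servers):
            senders = [c for c in range(num_clients)
                       if pending[c][s] and pending[c][s][0]["next"] <= t]
            if len(senders) == 1:
                req = pending[senders[0]][s].pop(0)
                waits.append(t - req["born"])
            else:
                for c in senders:  # collision: every sender backs off
                    req = pending[c][s][0]
                    req["fails"] += 1
                    req["next"] = t + random.randint(1, int((req["fails"] + 1) ** alpha))
    return sum(waits) / len(waits) if waits else float("nan")

print("average waiting time:", simulate())
```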
Shani, Guy; Shapiro, Amir; Oded, Goldstein; Dima, Kagan; Melzer, Itshak
2017-01-01
Rapid compensatory stepping plays an important role in preventing falls when balance is lost; however, these responses cannot be accurately quantified in the clinic. The Microsoft Kinect™ system provides real-time anatomical landmark position data in three dimensions (3D), which may bridge this gap. Compensatory stepping reactions were evoked in 8 young adults by a sudden horizontal motion of the platform on which the subject stood or walked on a treadmill. The movements were recorded with both a 3D-APAS motion capture system and the Microsoft Kinect™ system. The outcome measures consisted of compensatory step times (milliseconds) and step length (centimeters). The average values of two standing and walking trials for the Microsoft Kinect™ and 3D-APAS systems were compared using t-tests, Pearson's correlations, Bland-Altman plots, and the average difference in root mean square error (RMSE) of joint position. The Microsoft Kinect™ had high correlations for the compensatory step times (r = 0.75-0.78, p = 0.04) during standing and moderate correlations for walking (r = 0.53-0.63, p = 0.05). Step length, however, had very high correlations for both standing and walking (r > 0.97, p = 0.01). The RMSE showed acceptable differences during the perturbation trials, with the smallest relative error in the anterior-posterior direction (2-3%) and the highest in the vertical direction (11-13%). No systematic bias was evident in the Bland-Altman plots. The Microsoft Kinect™ system provides data comparable to a video-based 3D motion analysis system when assessing step length, and less accurate but still clinically acceptable data for step times during balance recovery when balance is lost and a fall is initiated.
Naver: a PC-cluster-based VR system
NASA Astrophysics Data System (ADS)
Park, ChangHoon; Ko, HeeDong; Kim, TaiYun
2003-04-01
In this paper, we present NAVER, a new framework for virtual reality applications. NAVER is based on a cluster of low-cost personal computers. The goal of NAVER is to provide a flexible, extensible, scalable and re-configurable framework for virtual environments, defined as the integration of a 3D virtual space with external modules. External modules are various input or output devices and applications on remote hosts. From the system's point of view, the personal computers are divided into three servers according to their specific functions: Render Server, Device Server and Control Server. While the Device Server contains external modules requiring event-based communication for the integration, the Control Server contains external modules requiring synchronous communication every frame. The Render Server consists of 5 managers: Scenario Manager, Event Manager, Command Manager, Interaction Manager and Sync Manager. These managers support the declaration and operation of the virtual environment and the integration with external modules on remote servers.
NASA Astrophysics Data System (ADS)
Antony, Joby; Mathuria, D. S.; Chaudhary, Anup; Datta, T. S.; Maity, T.
2017-02-01
Cryogenic networks for linear accelerator operations demand a large number of cryogenic sensors, associated instruments and other control instrumentation to measure, monitor and control different cryogenic parameters remotely. Here we describe an alternative approach: six types of newly designed integrated intelligent cryogenic instruments called device servers, which have the complete circuitry for the various sensor-specific front-end analog instrumentation and a common digital back-end HTTP server built together, forming a crateless, PLC-free model of controls and data acquisition. These instruments, each sensor-specific (viz. LHe server, LN2 server, control output server, pressure server, vacuum server and temperature server), are fully deployed over LAN for the cryogenic operations of the IUAC linac (Inter University Accelerator Centre linear accelerator), New Delhi. This indigenous design offers salient features such as global connectivity, low cost due to the crateless model, easy signal processing due to the integrated design, less cabling, and device interconnectivity.
Twin-tailed fail-over for fileservers maintaining full performance in the presence of a failure
Coteus, Paul W.; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Steinmacher-Burow, Burkhard D.
2008-02-12
A method for maintaining full performance of a file system in the presence of a failure is provided. The file system has N storage devices, where N is an integer greater than zero, and N primary file servers, each file server operatively connected to a corresponding storage device for accessing files therein. The file system further has a secondary file server operatively connected to at least one of the N storage devices. The method includes: switching the connection of one of the N storage devices to the secondary file server upon a failure of one of the N primary file servers; and switching the connections of one or more of the remaining storage devices to a primary file server other than the failed file server as necessary, so as to prevent a loss in performance and to provide each storage device with an operating file server.
Experimental parametric study of servers cooling management in data centers buildings
NASA Astrophysics Data System (ADS)
Nada, S. A.; Elfeky, K. E.; Attia, Ali M. A.; Alshaer, W. G.
2017-06-01
A parametric study of the air flow and cooling management of data center servers is experimentally conducted for different design conditions. A physical scale model of a data center accommodating one rack of four servers was designed and constructed for testing purposes. Front and rear rack and server temperature distributions and the supply/return heat indices (SHI/RHI) are used to evaluate data center thermal performance. Experiments were conducted to parametrically study the effects of perforated-tile opening ratio, server power load variation and rack power density. The results showed that (1) a perforated tile of 25% opening ratio provides the best results among the opening ratios considered, (2) the optimum benefit of cold air in server cooling is obtained with uniform power loading of the servers, and (3) increasing power density decreases air recirculation but increases air bypass and server temperatures. The present results are compared with previous experimental and CFD results and fair agreement was found.
Experience with Adaptive Security Policies.
1998-03-01
3.2 Logical groupings of audited permission checks; 3.3 Auditing of system servers via microkernel snooping. ...performed by servers other than the microkernel. Since altering each server to audit events would complicate the integration of new servers, a modification to the microkernel was implemented to allow the microkernel to audit the requests made of other servers. Both methods for enhancing audit
Opportunities for the Mashup of Heterogenous Data Server via Semantic Web Technology
NASA Astrophysics Data System (ADS)
Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; King, Todd; Hughes, John; Fung, Shing; Galkin, Ivan; Hapgood, Michael; Belehaki, Anna
2015-04-01
European Union ESPAS, Japanese IUGONET and GFZ ISDC data servers have been developed for the ingestion, archiving and distribution of geo- and space-science domain data. Main parts of the data managed by these data servers are related to near-Earth space and geomagnetic field data. A smart mashup of the data servers would allow seamless browsing of and access to data and related context information. However, achieving a high level of interoperability is a challenge because the data servers are based on different data models and software frameworks. This paper is focused on the latest experiments and results for the mashup of the data servers using the semantic Web approach. Besides the mashup of domain and terminological ontologies, the options for connecting data managed by relational databases using D2R server and SPARQL technology will be addressed in particular. A successful realization of the data server mashup will not only have a positive impact on the data users of the specific scientific domain but also on related projects, such as the development of a new interoperable version of NASA's Planetary Data System (PDS) or ICSU's World Data System alliance. ESPAS data server: https://www.espas-fp7.eu/portal/ IUGONET data server: http://search.iugonet.org/iugonet/ GFZ ISDC data server (semantic Web based prototype): http://rz-vm30.gfz-potsdam.de/drupal-7.9/ NASA PDS: http://pds.nasa.gov ICSU-WDS: https://www.icsu-wds.org
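As a rough illustration of the D2R/SPARQL option mentioned above, the Python sketch below queries a SPARQL endpoint for datasets tagged with a geomagnetic keyword; the endpoint path, vocabulary prefixes and property names are assumptions, not the actual ESPAS/IUGONET/ISDC schemas.

```python
# Hedged sketch: querying a D2R-style SPARQL endpoint published in front of a
# relational database. Endpoint path and properties are illustrative only.
import requests

ENDPOINT = "http://rz-vm30.gfz-potsdam.de/sparql"  # hypothetical endpoint path

query = """
PREFIX dcterms: <http://purl.org/dc/terms/>
SELECT ?dataset ?title WHERE {
  ?dataset dcterms:title ?title ;
           dcterms:subject ?subject .
  FILTER(CONTAINS(LCASE(STR(?subject)), "geomagnetic"))
}
LIMIT 10
"""

resp = requests.get(ENDPOINT,
                    params={"query": query,
                            "format": "application/sparql-results+json"},
                    timeout=30)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["dataset"]["value"], "-", row["title"]["value"])
```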
Triple-server blind quantum computation using entanglement swapping
NASA Astrophysics Data System (ADS)
Li, Qin; Chan, Wai Hong; Wu, Chunhui; Wen, Zhonghua
2014-04-01
Blind quantum computation allows a client who does not have enough quantum resources or technologies to achieve quantum computation on a remote quantum server such that the client's input, output, and algorithm remain unknown to the server. Up to now, single- and double-server blind quantum computation have been considered. In this work, we propose a triple-server blind computation protocol in which the client can delegate quantum computation to three quantum servers by the use of entanglement swapping. Furthermore, the three quantum servers can communicate with each other, and the client is almost classical, since it does not require any quantum computational power or quantum memory, nor the ability to prepare any quantum states; the client only needs to be capable of accessing quantum channels.
Are Microsoft's Animated Interface Agents Helpful?
ERIC Educational Resources Information Center
Head, Allison J.
1998-01-01
Discusses interface agents and online help systems, focusing on Microsoft's animated office assistants. Highlights include intermediaries such as librarians in off-line reference problems; user complaints about online help systems; navigation problems; evaluation of the online office assistants; and categories of user queries to online help…
Code of Federal Regulations, 2011 CFR
2011-10-01
... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...
Code of Federal Regulations, 2012 CFR
2012-10-01
... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...
Code of Federal Regulations, 2010 CFR
2010-10-01
... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...
User's Manual: Routines for Radiative Heat Transfer and Thermometry
NASA Technical Reports Server (NTRS)
Risch, Timothy K.
2016-01-01
Determining the intensity and spectral distribution of radiation emanating from a heated surface has applications in many areas of science and engineering. Areas of research in which the quantification of spectral radiation is used routinely include thermal radiation heat transfer, infrared signature analysis, and radiation thermometry. In the analysis of radiation, it is helpful to be able to predict the radiative intensity and the spectral distribution of the emitted energy. Presented in this report is a set of routines written in Microsoft Visual Basic for Applications (VBA) (Microsoft Corporation, Redmond, Washington) and incorporating functions specific to Microsoft Excel (Microsoft Corporation, Redmond, Washington) that are useful for predicting the radiative behavior of heated surfaces. These routines include functions for calculating quantities of primary importance to engineers and scientists. In addition, the routines also provide the capability to use such information to determine surface temperatures from spectral intensities and for calculating the sensitivity of the surface temperature measurements to unknowns in the input parameters.
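A minimal sketch of two such routines, written here in Python rather than the report's VBA, is given below: Planck's law for blackbody spectral radiance, and its inversion to recover a brightness temperature from a measured spectral intensity. Constants and units are standard SI; the function names are illustrative, not those of the report.

```python
# Sketch (assumed Python stand-in for the report's VBA routines).
# Units: wavelength in metres, spectral radiance in W·m^-2·sr^-1·m^-1.
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temperature_k):
    """Spectral radiance of a blackbody (Planck's law)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temperature_k)
    return a / math.expm1(b)

def brightness_temperature(wavelength_m, radiance):
    """Temperature of a blackbody emitting the given spectral radiance."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return H * C / (wavelength_m * KB * math.log1p(a / radiance))

# Example: radiance at 2 um from a 1500 K surface, and the round-trip temperature.
L = planck_radiance(2.0e-6, 1500.0)
print(L, brightness_temperature(2.0e-6, L))
```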
Abe, Eiji; Abe, Mari
2011-08-01
With the spread of total intravenous anesthesia, clinical pharmacology has become more important. We report a Microsoft Excel file applying a three-compartment model and a response surface model to clinical anesthesia. On the Microsoft Excel sheet, propofol, remifentanil and fentanyl effect-site concentrations are predicted (three-compartment model), and probabilities of no response to prodding, shaking, surrogates of painful stimuli and laryngoscopy are calculated using the predicted effect-site drug concentrations. Time-dependent changes in these calculated values are shown graphically. Recent developments in anesthetic drug interaction studies are remarkable, and their application to clinical anesthesia with this Excel file is simple and helpful.
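The sketch below (in Python rather than Excel) shows the kind of calculation such a sheet performs: forward Euler integration of a three-compartment model with an effect-site compartment. The rate constants are placeholders, not published pharmacokinetic parameters for any of the drugs mentioned.

```python
# Illustrative three-compartment + effect-site sketch; all parameters are toy values.
def effect_site_profile(infusion_rate_mg_min, minutes, dt=0.01,
                        v1=10.0, k10=0.1, k12=0.05, k21=0.03,
                        k13=0.01, k31=0.005, ke0=0.5):
    a1 = a2 = a3 = ce = 0.0          # compartment amounts (mg), effect-site conc.
    out = []
    for i in range(int(minutes / dt)):
        c1 = a1 / v1                 # central (plasma) concentration
        da1 = infusion_rate_mg_min - (k10 + k12 + k13) * a1 + k21 * a2 + k31 * a3
        da2 = k12 * a1 - k21 * a2
        da3 = k13 * a1 - k31 * a3
        dce = ke0 * (c1 - ce)        # effect-site equilibration
        a1 += da1 * dt; a2 += da2 * dt; a3 += da3 * dt; ce += dce * dt
        out.append((round((i + 1) * dt, 2), c1, ce))
    return out

# Print plasma and effect-site concentrations once per simulated minute.
for t, cp, ce in effect_site_profile(10.0, 10.0)[::100]:
    print(f"t={t:5.2f} min  Cp={cp:6.3f}  Ce={ce:6.3f}")
```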
Maitra, Tanmoy; Giri, Debasis
2014-12-01
Medical organizations have introduced the Telecare Medical Information System (TMIS) to provide a reliable facility by which a patient who is unable to visit a doctor during a critical or urgent period can communicate with a doctor through a medical server via the Internet from home. An authentication mechanism is needed in TMIS to hide the secret information of both parties, namely a server and a patient. Recent research includes a patient's biometric information as well as a password to design remote user authentication schemes that enhance the security level. In a single-server environment, one server is responsible for providing services to all the authorized remote patients. However, a problem arises if a patient wishes to access several branch servers: he/she needs to register with each branch server individually. In 2014, Chuang and Chen proposed a remote user authentication scheme for the multi-server environment. In this paper, we show that in their scheme an unregistered adversary can successfully log in to the system as a valid patient. To resist these weaknesses, we propose an authentication scheme for TMIS in a multi-server environment in which patients register only once with a root telecare server, called the registration center (RC), to get services from all the telecare branch servers through their registered smart card. Security analysis and comparison show that our proposed scheme provides better security with low computational and communication cost.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-24
... Microsoft Excel. By Hard Copy: U.S. mail or hand-delivery: Public Comments Processing, Attn: FWS-HQ-ES-2013... procedures. If you attach your comments as a separate document, our preferred file format is Microsoft Word...
Top-K Interesting Subgraph Discovery in Information Networks
2014-03-03
Visual Communication: Integrating Visual Instruction into Business Communication Courses
ERIC Educational Resources Information Center
Baker, William H.
2006-01-01
Business communication courses are ideal for teaching visual communication principles and techniques. Many assignments lend themselves to graphic enrichment, such as flyers, handouts, slide shows, Web sites, and newsletters. Microsoft Publisher and Microsoft PowerPoint are excellent tools for these assignments, with Publisher being best for…
Search Engines for Tomorrow's Scholars
ERIC Educational Resources Information Center
Fagan, Jody Condit
2011-01-01
Today's scholars face an outstanding array of choices when choosing search tools: Google Scholar, discipline-specific abstracts and index databases, library discovery tools, and more recently, Microsoft's re-launch of their academic search tool, now dubbed Microsoft Academic Search. What are these tools' strengths for the emerging needs of…
Secure Dynamic access control scheme of PHR in cloud computing.
Chen, Tzer-Shyong; Liu, Chia-Hui; Chen, Tzer-Long; Chen, Chin-Sheng; Bau, Jian-Guo; Lin, Tzu-Ching
2012-12-01
With the development of information technology and medical technology, medical information has evolved from traditional paper records into electronic medical records, which are now widely applied. A new style of medical information exchange system, the personal health record (PHR), is gradually being developed. A PHR is a health record maintained and recorded by an individual. An ideal personal health record integrates personal medical information from different sources and provides a complete and correct personal health and medical summary through the Internet or portable media, under the requirements of security and privacy. Many personal health records are now in use. The patient-centered PHR information exchange system allows the public to autonomously maintain and manage personal health records, which is convenient for storing, accessing, and sharing personal medical records. With the emergence of Cloud computing, PHR services have moved toward storing data on Cloud servers, so that resources can be flexibly utilized and operating costs reduced. Nevertheless, patients face privacy problems when storing PHR data in the Cloud, and a secure protection scheme is required to encrypt each patient's medical records before storing the PHR on a Cloud server. In the encryption process, it is a challenge to achieve accurate access to medical records while preserving flexibility and efficiency. A new PHR access control scheme for Cloud computing environments is proposed in this study. Using Lagrange interpolation polynomials to establish a secure and effective PHR information access scheme, it allows accurate and secure access to PHRs and is suitable for very large numbers of users. Moreover, the scheme dynamically supports multiple users in Cloud computing environments with personal privacy and grants legal authorities access to PHRs. Security and effectiveness analyses show that the proposed PHR access scheme for Cloud computing environments is flexible and secure and can effectively support real-time appending and deletion of user access authorizations and appending and revision of PHR records.
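As a hedged illustration of the mathematical tool named above, the Python sketch below implements a Shamir-style threshold scheme based on Lagrange interpolation over a prime field; it is not the paper's access control scheme, and all parameters are toy values.

```python
# Illustrative (t, n) threshold sharing via Lagrange interpolation over GF(P).
import random

P = 2**127 - 1  # a large prime modulus (toy choice)

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, t=3, n=5)
print(reconstruct(shares[:3]) == 123456789)  # any 3 shares suffice
```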
RSA-Based Password-Authenticated Key Exchange, Revisited
NASA Astrophysics Data System (ADS)
Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki
RSA-based Password-Authenticated Key Exchange (PAKE) protocols have been proposed to realize both mutual authentication and generation of secure session keys, where a client shares his/her password only with a server and the latter must generate its RSA public/private key pair (e, n), (d, n) every time due to the lack of a PKI (Public-Key Infrastructure). One way to avoid a special kind of off-line (so-called e-residue) attack in RSA-based PAKE protocols is to deploy a challenge/response method by which a client verifies the relative primality of e and φ(n) interactively with the server. However, this kind of RSA-based PAKE protocol did not give any proof of the underlying challenge/response method and therefore could not specify the exact complexity of the protocol, since there exists another security parameter needed in the challenge/response method. In this paper, we first present an RSA-based PAKE (RSA-PAKE) protocol that can deploy two different challenge/response methods (denoted Challenge/Response Method1 and Challenge/Response Method2). The main contributions of this work are: (1) based on number theory, we prove that Challenge/Response Method1 and Challenge/Response Method2 are secure against e-residue attacks for any odd prime e; (2) with the security parameter for on-line attacks, we show that the RSA-PAKE protocol is provably secure in the random oracle model, where none of the off-line attacks is more efficient than on-line dictionary attacks; and (3) by considering the Hamming weight of e and its complexity in the RSA-PAKE protocol, we search for primes to be recommended for practical use. We also compare the RSA-PAKE protocol with previous ones, mainly in terms of computation and communication complexities.
An Optimization of the Basic School Military Occupational Skill Assignment Process
2003-06-01
Corps Intranet (NMCI) supports it. We evaluated the use of Microsoft's SQL Server, but dismissed this after learning that TBS did not possess a SQL Server license or a qualified SQL Server administrator. SQL Server would have provided additional security measures not available in MS Access. Although not as powerful as SQL Server, MS Access can handle the multi-user environment necessary for this system.
NASA Astrophysics Data System (ADS)
Sasikala, S.; Indhira, K.; Chandrasekaran, V. M.
2017-11-01
In this paper, we consider an M^X/(a,b)/1 queueing system with server breakdown without interruption, multiple vacations, setup times and an N-policy. After a batch of service, if the size of the queue is ξ (< a), then the server immediately takes a vacation. Upon returning from a vacation, if the queue length is less than N, the server takes another vacation. This process continues until the server finds at least N customers in the queue. After a vacation, if the server finds at least N customers waiting for service, the server needs a setup time to start the service. After a batch of service, if the number of waiting customers in the queue is ξ (≥ a), then the server serves a batch of min(ξ, b) customers, where b ≥ a. We derive the probability generating function of the queue length at an arbitrary time epoch. Further, we obtain some important performance measures.
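A toy Python simulation of the service discipline described above (vacations while the queue is short, the N-policy, a setup time, and bulk service of min(ξ, b) customers) is sketched below; arrival and service parameters are illustrative, server breakdowns are omitted, and this is not the paper's analytical model.

```python
import random

def simulate(a=2, b=5, N=4, lam=1.0, mu=1.5, vac_mean=1.0, setup=0.5,
             horizon=10_000.0):
    """Toy bulk-service rule: vacations while queue < a, repeated until at
    least N customers are waiting, then a setup, then serve min(queue, b)."""
    random.seed(1)

    def batch_arrivals(duration):
        # compound Poisson: Poisson(lam) arrival epochs, each bringing 1-3 customers
        n, t = 0, random.expovariate(lam)
        while t < duration:
            n, t = n + random.randint(1, 3), t + random.expovariate(lam)
        return n

    t, queue, served = 0.0, 0, 0
    while t < horizon:
        if queue < a:
            while queue < N:                      # multiple vacations (N-policy)
                vac = random.expovariate(1.0 / vac_mean)
                t += vac
                queue += batch_arrivals(vac)
            t += setup                            # setup before resuming service
            queue += batch_arrivals(setup)
        batch = min(queue, b)                     # general bulk service, b >= a
        svc = random.expovariate(mu)
        t += svc
        queue += batch_arrivals(svc)
        queue -= batch
        served += batch
    return served / t

print("long-run service rate:", simulate())
```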
Secure entanglement distillation for double-server blind quantum computation.
Morimae, Tomoyuki; Fujii, Keisuke
2013-07-12
Blind quantum computation is a new secure quantum computing protocol where a client, who does not have enough quantum technologies at her disposal, can delegate her quantum computation to a server, who has a fully fledged quantum computer, in such a way that the server cannot learn anything about the client's input, output, and program. If the client interacts with only a single server, the client has to have some minimum quantum power, such as the ability of emitting randomly rotated single-qubit states or the ability of measuring states. If the client interacts with two servers who share Bell pairs but cannot communicate with each other, the client can be completely classical. For such a double-server scheme, two servers have to share clean Bell pairs, and therefore the entanglement distillation is necessary in a realistic noisy environment. In this Letter, we show that it is possible to perform entanglement distillation in the double-server scheme without degrading the security of blind quantum computing.
SciServer Compute brings Analysis to Big Data in the Cloud
NASA Astrophysics Data System (ADS)
Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara
2016-06-01
SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment. SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines. Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally - but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data. SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer's new Single Sign-on Portal. We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts accessing a range of datasets and showing the data flow between storage and compute components. Demos, documentation, and more information can be found at www.sciserver.org. SciServer is funded by the National Science Foundation Award ACI-1261715.
Libraries Online!: Microsoft Partnering with American Library Association (ALA).
ERIC Educational Resources Information Center
Machovec, George S., Ed.
1995-01-01
Describes Libraries Online, a pilot project created by Microsoft and the American Library Association to develop ways to provide access to information technologies to underserved populations. Presents the nine public libraries that will receive cash grants, staff training, computer hardware and software, and technical support to help support local…
Customising Microsoft Office to Develop a Tutorial Learning Environment
ERIC Educational Resources Information Center
Deacon, Andrew; Jaftha, Jacob; Horwitz, David
2004-01-01
Powerful applications such as Microsoft Office's Excel and Word are widely used to perform common tasks in the workplace and in education. Scripting within these applications allows unanticipated user requirements to be addressed. We show that such extensibility, intended to support office automation-type applications, is well suited to the…
imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel
USDA-ARS?s Scientific Manuscript database
Interactive modules for data exploration and visualization (imDEV) is a Microsoft Excel spreadsheet embedded application providing an integrated environment for the analysis of omics data sets with a user-friendly interface. Individual modules were designed to provide toolsets to enable interactive ...
Illustrating the Central Limit Theorem through Microsoft Excel Simulations
ERIC Educational Resources Information Center
Moen, David H.; Powell, John E.
2005-01-01
Using Microsoft Excel, several interactive, computerized learning modules are developed to demonstrate the Central Limit Theorem. These modules are used in the classroom to enhance the comprehension of this theorem. The Central Limit Theorem is a very important theorem in statistics, and yet because it is not intuitively obvious, statistics…
Cryptanalysis on classical cipher based on Indonesian language
NASA Astrophysics Data System (ADS)
Marwati, R.; Yulianti, K.
2018-05-01
Cryptanalysis is the process of breaking a cipher through illegal decryption. This paper discusses the encryption of some classic cryptography, the breaking of substitution and stream ciphers, and ways to increase their security. Encryption and ciphering are based on Indonesian-language text. Microsoft Word and Microsoft Excel were chosen as the ciphering and breaking tools.
Cellular Consequences of Telomere Shortening in Histologically Normal Breast Tissues
2013-09-01
using the open source, Java-based image analysis software package ImageJ (http://rsb.info.nih.gov/ij/) and a custom designed plugin ("Telometer" ...). Tabulated data were stored in a MySQL (http://www.mysql.com) database and viewed through Microsoft Access (Microsoft Corp.).
The Effect of Anisotropic Scatter on Atmospheric Neutron Transport
2015-03-26
Challenges in Database Design with Microsoft Access
ERIC Educational Resources Information Center
Letkowski, Jerzy
2014-01-01
Design, development and explorations of databases are popular topics covered in introductory courses taught at business schools. Microsoft Access is the most popular software used in those courses. Despite quite high complexity of Access, it is considered to be one of the most friendly database programs for beginners. A typical Access textbook…
A Multiple-Representation Paradigm for Document Development
1988-07-05
Write [10], Microsoft Word [99], PageMaker [4], Ventura Publisher [135], Interleaf Publishing System [78], FrameMaker [52] and more have already... processing in FrameMaker, Microsoft Word, and Ventura Publisher are all handled by a noninteractive off-line program. Direct manipulation, from the
75 FR 77934 - Small Business Information Security Task Force
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-14
... on them. The Task Force has until the end of 2013 to complete the report but it is hoped that the... computing technology industry itself. Mr. Aaron Berstein then volunteered to contact Microsoft to inquire into the possibility of Microsoft providing an online collaborative space software tool for use...
Landscape habitat suitability index software
William D. Dijak; Chadwick D. Rittenhouse; Michael A. Larson; Frank R. III Thompson; Joshua J. Millspaugh
2007-01-01
Habitat suitability index (HSI) models are traditionally used to evaluate habitat quality for wildlife at a local scale. Rarely have such models incorporated spatial relationships of habitat components. We introduce Landscape HSImodels, a new Microsoft Windows (Microsoft, Redmond, WA)-based program that incorporates local habitat as well as landscape-scale attributes...
ERIC Educational Resources Information Center
Allen, Denise
1995-01-01
Reviews five compact disc-read only memory (CD-ROM) products and one video series that focus on science projects: (1) "Body Park" (Virtual Entertainment); (2) "The Magic School Bus Explores the Solar System" (Microsoft); (3) "The Magic School Bus Explores the Human Body" (Microsoft); (4) "Science Curriculum Assistance Program" (Demco); and (5)…
Providing Internet Access to High-Resolution Lunar Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMoon server is a computer program that provides Internet access to high-resolution Lunar images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of the Moon. The OnMoon server implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) server protocol and supports Moon-specific extensions. Unlike other Internet map servers that provide Lunar data using an Earth coordinate system, the OnMoon server supports encoding of data in Moon-specific coordinate systems. The OnMoon server offers access to most of the available high-resolution Lunar image and elevation data. This server can generate image and map files in the tagged image file format (TIFF) or the Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. Full-precision spectral arithmetic processing is also available, by use of a custom SLD extension. This server can dynamically add shaded relief based on the Lunar elevation to any image layer. This server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
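For illustration, a minimal Python sketch of a WMS GetMap request of the kind the OnMoon server answers is shown below; the host name, layer name and coordinate-system code are hypothetical placeholders, not the server's actual published values.

```python
# Hedged sketch of an OGC WMS 1.1.1 GetMap request; endpoint and layer are illustrative.
import requests

WMS_URL = "https://onmoon.example.nasa.gov/wms"   # hypothetical endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "clementine_base",        # hypothetical Lunar image layer
    "SRS": "IAU2000:30100",             # a Moon-specific coordinate reference system
    "BBOX": "-180,-90,180,90",
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/jpeg",
}

resp = requests.get(WMS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("moon.jpg", "wb") as fh:
    fh.write(resp.content)              # a global Lunar mosaic rendered by the server
```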
Wide Area Information Servers: An Executive Information System for Unstructured Files.
ERIC Educational Resources Information Center
Kahle, Brewster; And Others
1992-01-01
Describes the Wide Area Information Servers (WAIS) system, an integrated information retrieval system for corporate end users. Discussion covers general characteristics of the system, search techniques, protocol development, user interfaces, servers, selective dissemination of information, nontextual data, access to other servers, and description…
Parallel Computing Using Web Servers and "Servlets".
ERIC Educational Resources Information Center
Lo, Alfred; Bloor, Chris; Choi, Y. K.
2000-01-01
Describes parallel computing and presents inexpensive ways to implement a virtual parallel computer with multiple Web servers. Highlights include performance measurement of parallel systems; models for using Java and intranet technology including single server, multiple clients and multiple servers, single client; and a comparison of CGI (common…
Asynchronous data change notification between database server and accelerator controls system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, W.; Morris, J.; Nemesure, S.
2011-10-10
Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMSs which support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work. This makes the setup of DCN between a database server and interested clients tedious and time consuming. In accelerator control systems, there are many well established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. Asynchronous data change notification (ADCN) between database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMS systems, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
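A minimal sketch of the trigger-plus-reflection-server idea is given below, using PostgreSQL's LISTEN/NOTIFY as a stand-in DBMS trigger mechanism and Python in place of a CDEV/EPICS/ADO server; the table, notification channel and connection details are illustrative assumptions, not the systems described in the paper.

```python
# Sketch: a database trigger publishes change notifications, and a small
# "reflection server" loop forwards them asynchronously to interested clients.
# Assumes PostgreSQL 11+ and a pre-existing (hypothetical) table `setpoints`.
import select
import psycopg2
import psycopg2.extensions

DDL = """
CREATE OR REPLACE FUNCTION notify_change() RETURNS trigger AS $$
BEGIN
  PERFORM pg_notify('setpoint_changes', NEW.name || '=' || NEW.value::text);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

DROP TRIGGER IF EXISTS setpoint_dcn ON setpoints;
CREATE TRIGGER setpoint_dcn AFTER INSERT OR UPDATE ON setpoints
  FOR EACH ROW EXECUTE FUNCTION notify_change();
"""

conn = psycopg2.connect("dbname=controls")      # hypothetical database
conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
cur = conn.cursor()
cur.execute(DDL)
cur.execute("LISTEN setpoint_changes;")

# Reflection-server loop: forward each change notification to clients.
while True:
    if select.select([conn], [], [], 5.0) == ([], [], []):
        continue                                 # timeout, nothing to forward
    conn.poll()
    while conn.notifies:
        note = conn.notifies.pop(0)
        print("async DCN:", note.payload)        # here one would publish via a SET/GET API
```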
GRAMM-X public web server for protein–protein docking
Tovchigrechko, Andrey; Vakser, Ilya A.
2006-01-01
Protein docking software GRAMM-X and its web interface extend the original GRAMM Fast Fourier Transformation methodology by employing smoothed potentials, a refinement stage, and knowledge-based scoring. The web server frees users from the complex installation of database-dependent parallel software and from maintaining the large hardware resources needed for protein docking simulations. Docking problems submitted to the GRAMM-X server are processed by a 320-processor Linux cluster. The server was extensively tested by benchmarking, several months of public use, and participation in the CAPRI server track. PMID:16845016
2016-06-08
server environment. While the college's two Cisco blade servers are located in separate buildings, these units now work as one unit. Critical databases and software packages are...
Scaling NS-3 DCE Experiments on Multi-Core Servers
2016-06-15
that work well together. 3.2 Simulation Server Details: We ran the simulations on a Dell® PowerEdge M520 blade server [8] running Ubuntu Linux 14.04. ...To minimize the amount of time needed to complete all of the simulations, we planned to run multiple simulations at the same time on a blade server. ...MacBook was running the simulation inside a virtual machine (Ubuntu 14.04), while the blade server was running the same operating system directly on
Reliability Information Analysis Center 1st Quarter 2007, Technical Area Task (TAT) Report
2007-02-05
Created new SQL Server database for "PC Configuration" web application. Added roles for security, closed 4235, and posted application to production. Wrote and ran SQL Server scripts to migrate production databases to new server. Created backup jobs for new SQL Server databases. Continued... second phase of the TENA demo. Extensive tasking was established and assigned. A TENA interface to EW Server was reaffirmed after some uncertainty about
Lawrence, Daphne
2009-03-01
Blade servers and virtualization can reduce infrastructure, maintenance, heating, electric, cooling and equipment costs. Blade server technology is evolving and some elements may become obsolete. There is very little interoperability between blades. Hospitals can virtualize 40 to 60 percent of their servers, and old servers can be reused for testing. Not all applications lend themselves to virtualization--especially those with high memory requirements. CIOs should engage their vendors in virtualization discussions.
A cloud based brokering framework to support hydrology at global scale
NASA Astrophysics Data System (ADS)
Boldrini, E.; Pecora, S.; Bordini, F.; Nativi, S.
2016-12-01
This work presents the hydrology broker designed and deployed in the context of a collaboration between the Regional Agency for Environmental Protection in the Italian region Emilia-Romagna (ARPA-ER) and CNR-IIA (National Research Council of Italy). The hydrology brokering platform eases the task of discovering and accessing hydrological observation data, usually acquired and made available by national agencies by means of a set of heterogeneous services (e.g. CUAHSI HIS servers, OGC services, FTP servers) and formats (e.g. WaterML, O&M, ...). The hydrology broker makes all the already published data available according to one or more of the desired and well-known discovery protocols, access protocols, and formats. As a result, the user is able to search and access the available hydrological data through his preferred client (e.g. CUAHSI HydroDesktop, 52North SWE client). It is also easy to build a hydrological web portal on top of the broker, using the user-friendly JavaScript API. The hydrology broker has been deployed on the Amazon cloud to ensure scalability and tested in the context of the work of the Commission for Hydrology of WMO on three different scenarios: the La Plata river basin, the Sava river basin and the Arctic-HYCOS project. In each scenario the hydrology broker discovered and accessed heterogeneous data formats (e.g. WaterML 1.0/2.0, proprietary CSV documents) from the heterogeneous services (e.g. CUAHSI HIS servers, an FTP service and agency proprietary services) managed by several national agencies and international commissions. The hydrology broker made it possible to present all the available data uniformly through the user's desired service type and format (e.g. an HIS server publishing WaterML 2.0), producing a great improvement in both system interoperability and data exchange. Interoperability tests were also successfully conducted with WMO Information System (WIS) nodes, making it possible for a specific Global Information System Centre (GISC) to gather the available hydrological records as ISO 19115:2007 metadata documents through the OAI-PMH interface exposed by the broker. The framework's flexibility also makes it easy to add other sources, as well as additional published interfaces, in order to cope with future standard requirements of the hydrological community.
A Scalability Model for ECS's Data Server
NASA Technical Reports Server (NTRS)
Menasce, Daniel A.; Singhal, Mukesh
1998-01-01
This report presents in four chapters a model for the scalability analysis of the Data Server subsystem of the Earth Observing System Data and Information System (EOSDIS) Core System (ECS). The model analyzes if the planned architecture of the Data Server will support an increase in the workload with the possible upgrade and/or addition of processors, storage subsystems, and networks. The approaches in the report include a summary of the architecture of ECS's Data server as well as a high level description of the Ingest and Retrieval operations as they relate to ECS's Data Server. This description forms the basis for the development of the scalability model of the data server and the methodology used to solve it.
NASA Astrophysics Data System (ADS)
Anderson, J.; Bauer, K.; Borga, A.; Boterenbrood, H.; Chen, H.; Chen, K.; Drake, G.; Dönszelmann, M.; Francis, D.; Guest, D.; Gorini, B.; Joos, M.; Lanni, F.; Lehmann Miotto, G.; Levinson, L.; Narevicius, J.; Panduro Vazquez, W.; Roich, A.; Ryu, S.; Schreuder, F.; Schumacher, J.; Vandelli, W.; Vermeulen, J.; Whiteson, D.; Wu, W.; Zhang, J.
2016-12-01
The ATLAS Phase-I upgrade (2019) requires a Trigger and Data Acquisition (TDAQ) system able to trigger and record data from up to three times the nominal LHC instantaneous luminosity. The Front-End LInk eXchange (FELIX) system provides an infrastructure to achieve this in a scalable, detector agnostic and easily upgradeable way. It is a PC-based gateway, interfacing custom radiation tolerant optical links from front-end electronics, via PCIe Gen3 cards, to a commodity switched Ethernet or InfiniBand network. FELIX enables reducing custom electronics in favour of software running on commercial servers. The FELIX system, the design of the PCIe prototype card and the integration test results are presented in this paper.
Load Balancing in Distributed Web Caching: A Novel Clustering Approach
NASA Astrophysics Data System (ADS)
Tiwari, R.; Kumar, K.; Khan, G.
2010-11-01
The World Wide Web suffers from scaling and reliability problems due to overloaded and congested proxy servers. Caching at local proxy servers helps, but cannot satisfy more than a third to half of requests; the remaining requests are still sent to the original remote origin servers. In this paper we develop an algorithm for a Distributed Web Cache that incorporates cooperation among the proxy servers of one cluster. The algorithm combines Distributed Web Cache concepts with a static hierarchy of geographically based clusters of level-one proxy servers and a dynamic mechanism for enlisting proxy servers when one cluster becomes congested. Congestion and scalability problems are dealt with by the clustering concept used in our approach. This results in a higher cache hit ratio and lower latency for requested pages. The algorithm also guarantees data consistency between the original server objects and the proxy cache objects.
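The Python sketch below illustrates one way such cluster-based cooperation could look: requests hash to a proxy inside the client's geographical cluster and spill over to the least-loaded neighbouring cluster under congestion. This is an illustrative sketch with made-up cluster names and capacities, not the paper's algorithm.

```python
# Toy cluster-aware proxy selection; all names and capacities are illustrative.
import hashlib

CLUSTERS = {
    "eu": ["eu-proxy-1", "eu-proxy-2", "eu-proxy-3"],
    "us": ["us-proxy-1", "us-proxy-2"],
}
LOAD = {name: 0 for proxies in CLUSTERS.values() for name in proxies}
CAPACITY = 100  # requests a proxy handles before its cluster counts as congested

def pick_proxy(url, client_cluster):
    proxies = CLUSTERS[client_cluster]
    idx = int(hashlib.sha1(url.encode()).hexdigest(), 16) % len(proxies)
    chosen = proxies[idx]
    if LOAD[chosen] >= CAPACITY:                      # congestion: spill to another cluster
        neighbour = min((c for c in CLUSTERS if c != client_cluster),
                        key=lambda c: sum(LOAD[p] for p in CLUSTERS[c]))
        proxies = CLUSTERS[neighbour]
        chosen = proxies[int(hashlib.sha1(url.encode()).hexdigest(), 16) % len(proxies)]
    LOAD[chosen] += 1
    return chosen

print(pick_proxy("http://example.com/page.html", "eu"))
```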
On the optimal use of a slow server in two-stage queueing systems
NASA Astrophysics Data System (ADS)
Papachristos, Ioannis; Pandelis, Dimitrios G.
2017-07-01
We consider two-stage tandem queueing systems with a dedicated server in each queue and a slower flexible server that can attend both queues. We assume Poisson arrivals and exponential service times, and linear holding costs for jobs present in the system. We study the optimal dynamic assignment of servers to jobs, assuming that two servers cannot collaborate on the same job and that preemptions are not allowed. We formulate the problem as a Markov decision process and derive properties of the optimal allocation of the dedicated (fast) servers. Specifically, we show that the downstream dedicated server should not idle, and the same is true for the upstream one when holding costs are larger there. The optimal allocation of the slow server is investigated through extensive numerical experiments that lead to conjectures on the structure of the optimal policy.
Process evaluation distributed system
NASA Technical Reports Server (NTRS)
Moffatt, Christopher L. (Inventor)
2006-01-01
The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on that information; the process evaluation module utilizes a personal digital assistant (PDA). The data display module, in communication with the database server, includes a website for viewing collected process data in a desired metrics form and also provides editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module minimizes the requirement for manual input of the collected process data.
Nakrani, Sunil; Tovey, Craig
2007-12-01
An Internet hosting center hosts services on its server ensemble. The center must allocate servers dynamically amongst services to maximize revenue earned from hosting fees. The finite server ensemble, unpredictable request arrival behavior and server reallocation cost make server allocation optimization difficult. Server allocation closely resembles honeybee forager allocation amongst flower patches to optimize nectar influx. The resemblance inspires a honeybee biomimetic algorithm. This paper describes details of the honeybee self-organizing model in terms of information flow and feedback, analyzes the homology between the two problems and derives the resulting biomimetic algorithm for hosting centers. The algorithm is assessed for effectiveness and adaptiveness by comparative testing against benchmark and conventional algorithms. Computational results indicate that the new algorithm is highly adaptive to widely varying external environments and quite competitive against benchmark assessment algorithms. Other swarm intelligence applications are briefly surveyed, and some general speculations are offered regarding their various degrees of success.
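A simplified Python sketch of the honeybee-inspired idea is shown below: servers act as foragers that probabilistically switch to more profitable services. The profitability measure and switching rule here are illustrative assumptions, not the paper's exact model or parameters.

```python
# Toy forager-style reallocation of servers among hosted services.
import random

def reallocate(assignment, revenue_per_request, demand, switch_scale=0.5):
    """assignment: server -> service; returns an updated assignment."""
    servers_per_service = {s: 0 for s in demand}
    for svc in assignment.values():
        servers_per_service[svc] += 1
    # profitability of each service ~ revenue-weighted demand per assigned server
    profit = {s: revenue_per_request[s] * demand[s] / (1 + servers_per_service[s])
              for s in demand}
    total = sum(profit.values())
    new_assignment = {}
    for server, current in assignment.items():
        # waggle-dance analogue: switch away from low-profit "patches" probabilistically
        p_switch = switch_scale * (1 - profit[current] / total)
        if random.random() < p_switch:
            new_assignment[server] = random.choices(list(profit), weights=profit.values())[0]
        else:
            new_assignment[server] = current
    return new_assignment

assignment = {f"srv{i}": "search" for i in range(6)}
demand = {"search": 30, "video": 120}
revenue = {"search": 1.0, "video": 0.8}
for _ in range(10):
    assignment = reallocate(assignment, revenue, demand)
print(assignment)
```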
Smith, Nicholas; Witham, Shawn; Sarkar, Subhra; Zhang, Jie; Li, Lin; Li, Chuan; Alexov, Emil
2012-06-15
A new edition of the DelPhi web server, DelPhi web server v2, has been released to include atomic presentation of geometrical figures. These geometrical objects can be used to model nano-sized objects together with real biological macromolecules. The position and size of an object can be manipulated by the user in real time until the desired results are achieved. The server fixes structural defects, adds hydrogen atoms and calculates electrostatic energies and the corresponding electrostatic potential and ionic distributions. The web server follows a client-server architecture built on PHP and HTML and utilizes the DelPhi software. The computation is carried out on a supercomputer cluster and results are returned to the user via the HTTP protocol, including the ability to visualize the structure and corresponding electrostatic potential via a Jmol implementation. The DelPhi web server is available from http://compbio.clemson.edu/delphi_webserver.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-12
... Commercial and Industrial Equipment: Proposed Determination of Computer Servers as a Covered Consumer Product... comments on the proposed determination that computer servers (servers) qualify as a covered product. DATES: The comment period for the proposed determination relating to servers published on July 12, 2013 (78...
ASPEN--A Web-Based Application for Managing Student Server Accounts
ERIC Educational Resources Information Center
Sandvig, J. Christopher
2004-01-01
The growth of the Internet has greatly increased the demand for server-side programming courses at colleges and universities. Students enrolled in such courses must be provided with server-based accounts that support the technologies that they are learning. The process of creating, managing and removing large numbers of student server accounts is…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-31
... Microsoft Word, Microsoft Excel, WordPerfect, or Adobe PDF file formats only. FOR FURTHER INFORMATION...). The closure was implemented based on advice from the U.S. Food and Drug Administration (FDA) after... Management Plan (FMP). Since the implementation of the closure, NOAA's National Ocean Service has provided...
Integrated Approach to User Account Management
NASA Technical Reports Server (NTRS)
Kesselman, Glenn; Smith, William
2007-01-01
IT environments consist of both Windows and other platforms. Providing user account management for this model has become increasingly difficult. If Microsoft's Active Directory could be enhanced to extend a Windows identity for authentication services for Unix, Linux, Java and Macintosh systems, then an integrated approach to user account management could be realized.
Microsoft Producer: A Software Tool for Creating Multimedia PowerPoint[R] Presentations
ERIC Educational Resources Information Center
Leffingwell, Thad R.; Thomas, David G.; Elliott, William H.
2007-01-01
Microsoft[R] Producer[R] is a powerful yet user-friendly PowerPoint companion tool for creating on-demand multimedia presentations. Instructors can easily distribute these presentations via compact disc or streaming media over the Internet. We describe the features of the software, system requirements, and other required hardware. We also describe…
Project-Based Learning around the World
ERIC Educational Resources Information Center
Weatherby, Kristen
2007-01-01
This paper, the first of a two-part article, addresses ways that project-based learning is being used in countries around the world. It introduces Microsoft's worldwide K-12 education initiative, Partners in Learning, and provides some background as to why Microsoft is interested in developing project-based learning curricula for teachers to help…
Overcoming Microsoft Excel's Weaknesses for Crop Model Building and Simulations
ERIC Educational Resources Information Center
Sung, Christopher Teh Boon
2011-01-01
Using spreadsheets such as Microsoft Excel for building crop models and running simulations can be beneficial. Excel is easy to use, powerful, and versatile, and it requires the least proficiency in computer programming compared to other programming platforms. Excel, however, has several weaknesses: it does not directly support loops for iterative…
Using Microsoft Excel to Generate Usage Statistics
ERIC Educational Resources Information Center
Spellman, Rosemary
2011-01-01
At the Libraries Service Center, statistics are generated on a monthly, quarterly, and yearly basis by using four Microsoft Excel workbooks. These statistics provide information about what materials are being requested and by whom. They also give details about why certain requests may not have been filled. Utilizing Excel allows for a shallower…
EVALIDatorReports: Reporting beyond the FIADB
Patrick D. Miles
2009-01-01
Tools for analyzing data collected by the U.S. Forest Service's Forest Inventory and Analysis (FIA) program are available in Microsoft Access© format. Databases have been created for every state, except Hawaii, and are available for downloading. EVALIDatorReports is a Visual Basic Application that is stored within each Microsoft Access© database...
Windows 8: What Educators Need to Know
ERIC Educational Resources Information Center
Vedder, Richard G.
2012-01-01
In October 2012, Microsoft will release the commercial version of its next operating system, presently called "Windows 8." This version represents a significant departure from the past. Microsoft wants this operating system to meet user needs regardless of physical platform (e.g., desktop, notebook, tablet, mobile phone). As part of this mission,…
PC vs. Mac--Which Way Should You Go?
ERIC Educational Resources Information Center
Wodarz, Nan
1997-01-01
Outlines the factors in hardware, software, and administration to consider in developing specifications for choosing a computer operating system. Compares Microsoft Windows 95/NT that runs on PC/Intel-based systems and System 7.5 that runs on the Apple-based systems. Lists reasons why the Microsoft platform clearly stands above the Apple platform.…
The Web-Database Connection Tools for Sharing Information on the Campus Intranet.
ERIC Educational Resources Information Center
Thibeault, Nancy E.
This paper evaluates four tools for creating World Wide Web pages that interface with Microsoft Access databases: DB Gateway, Internet Database Assistant (IDBA), Microsoft Internet Database Connector (IDC), and Cold Fusion. The system requirements and features of each tool are discussed. A sample application, "The Virtual Help Desk"…
Tools for Requirements Management: A Comparison of Telelogic DOORS and the HiVe
2006-07-01
types DOORS deals with are text files, spreadsheets, FrameMaker, rich text, Microsoft Word and Microsoft Project. 2.5.1 Predefined file formats DOORS...during the export. DOORS exports FrameMaker files in an incomplete format, meaning DOORS exported files will have to be opened in FrameMaker and saved
47 CFR 61.22 - Composition of tariffs.
Code of Federal Regulations, 2010 CFR
2010-10-01
...Perfect 5.1, Microsoft Word 6, or Microsoft Word 97 software. No diskettes shall contain more than one... clearly labelled with the carrier's name, Tariff Number, software used, and the date of submission. When... defined in § 1.4(e)(2) of this chapter. (d) Domestic and international nondominant carriers subject to the...
Where Big-City Schools Meet "Microsoft Smarts"
ERIC Educational Resources Information Center
Borja, Rhea R.
2006-01-01
This article discusses a newly built school, the "School of the Future," born of a partnership between the Philadelphia public schools and the world's leading software-maker, Microsoft Corp. A gleaming white building on the edge of a blighted West Philadelphia neighborhood, the $62 million school garnered wide attention when…
Search Engines for Tomorrow's Scholars, Part Two
ERIC Educational Resources Information Center
Fagan, Jody Condit
2012-01-01
This two-part article considers how well some of today's search tools support scholars' work. The first part of the article reviewed Google Scholar and Microsoft Academic Search using a modified version of Carole L. Palmer, Lauren C. Teffeau, and Carrie M. Pirmann's framework (2009). Microsoft Academic Search is a strong contender when…
14 CFR 302.603 - Contents of complaint or request for determination.
Code of Federal Regulations, 2010 CFR
2010-01-01
... determination. 302.603 Section 302.603 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF...: Microsoft Word (or RTF), Word Perfect, Ami Pro, Microsoft Excel, Lotus 123, Quattro Pro, or ASCII tab...: one copy for the docket, one copy for the Office of Hearings, and one copy for the Office of Aviation...
Vickers, Linda D
2010-05-01
This paper describes the method of using Microsoft Excel (Microsoft Corporation, One Microsoft Way, Redmond, WA 98052-6399) to compute the 5% overall site X/Q value and the 95th percentile of the distribution of doses to the nearest maximally exposed offsite individual (MEOI) in accordance with guidance from DOE-STD-3009-1994 and U.S. NRC Regulatory Guide 1.145-1982. The accurate determination of the 5% overall site X/Q value is the most important factor in the computation of the 95th percentile of the distribution of doses to the nearest MEOI. This method should be used to validate software codes that compute the X/Q. The 95th percentile of the distribution of doses to the nearest MEOI must be compared to the U.S. DOE Evaluation Guide of 25 rem to determine the relative severity of hazard to the public from a postulated, unmitigated design basis accident that involves an offsite release of radioactive material.
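As an illustration of the percentile step in the calculation above, the short Python sketch below computes a 95th percentile from a hypothetical dose distribution; it is not the paper's Excel workbook, and the dose values and the comparison against the 25 rem Evaluation Guide are placeholders.

    import numpy as np

    # Hypothetical doses (rem) to the nearest MEOI computed from sampled X/Q values.
    doses_rem = np.array([0.8, 1.2, 3.4, 0.5, 2.1, 7.9, 0.3, 4.6, 1.7, 2.9])

    dose_95 = np.percentile(doses_rem, 95)   # 95th percentile of the dose distribution
    print(f"95th percentile dose: {dose_95:.2f} rem")
    print("exceeds" if dose_95 > 25.0 else "does not exceed", "the 25 rem Evaluation Guide")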
Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B
2012-01-20
Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no previous guide was available. We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More important, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
2012-01-01
Background Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no previous guide was available. Findings We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions It is possible to conduct a meta-analysis using only Microsoft Excel. More important, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software. PMID:22264277
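A minimal Python sketch of the common inverse-variance (fixed-effect) pooling that such a spreadsheet guide performs, with made-up study estimates and standard errors used purely for illustration:

    import math

    estimates  = [0.12, 0.25, 0.08, 0.31]   # hypothetical study effect sizes
    std_errors = [0.05, 0.10, 0.04, 0.12]   # their standard errors

    weights = [1.0 / se ** 2 for se in std_errors]   # inverse-variance weights
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))

    low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"pooled effect = {pooled:.3f}, 95% CI = ({low:.3f}, {high:.3f})")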
A broadcast-based key agreement scheme using set reconciliation for wireless body area networks.
Ali, Aftab; Khan, Farrukh Aslam
2014-05-01
Information and communication technologies have thrived over the last few years. Healthcare systems have also benefited from this progression. A wireless body area network (WBAN) consists of small, low-power sensors used to monitor human physiological values remotely, which enables physicians to remotely monitor the health of patients. Communication security in WBANs is essential because it involves human physiological data. Key agreement and authentication are the primary issues in the security of WBANs. To agree upon a common key, the nodes exchange information with each other using wireless communication. This information exchange process must either be secure or be minimized to a level at which an information leak does not affect the overall system. Most of the existing solutions for this problem exchange too much information for the sake of key agreement; getting this information is sufficient for an attacker to reproduce the key. Set reconciliation is a technique used to reconcile two similar sets held by two different hosts with minimal communication complexity. This paper presents a broadcast-based key agreement scheme using set reconciliation for secure communication in WBANs. The proposed scheme allows the neighboring nodes to agree upon a common key with the personal server (PS), generated from the electrocardiogram (EKG) feature set of the host body. Minimal information is exchanged in a broadcast manner, and even if every node is missing a different subset, by reconciling these feature sets, the whole network will still agree upon a single common key. Because of the limited information exchange, if an attacker gets the information in any way, he/she will not be able to reproduce the key. The proposed scheme mitigates replay, selective forwarding, and denial of service attacks using a challenge-response authentication mechanism. The simulation results show that the proposed scheme has a great deal of adoptability in terms of security, communication overhead, and running time complexity, as compared to the existing EKG-based key agreement scheme.
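The toy Python sketch below illustrates only the basic intuition of deriving a shared key from largely overlapping physiological feature sets; it is not the paper's set-reconciliation protocol, and the feature values are invented.

    import hashlib

    node_a_features = {101, 205, 317, 428, 539, 640}   # hypothetical EKG feature set
    node_b_features = {101, 205, 317, 428, 539, 777}   # differs in a single element

    # Both nodes can derive the same key from the features they share, so only the
    # small set difference would need to be reconciled over the air.
    shared = sorted(node_a_features & node_b_features)
    key = hashlib.sha256(",".join(map(str, shared)).encode()).hexdigest()
    print("derived key:", key[:16], "...")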
How to securely replicate services
NASA Technical Reports Server (NTRS)
Reiter, Michael; Birman, Kenneth
1992-01-01
A method is presented for constructing replicated services that retain their availability and integrity despite several servers and clients being corrupted by an intruder, in addition to others failing benignly. More precisely, a service is replicated by n servers in such a way that a correct client will accept a correct server's response if, for some prespecified parameter k, at least k servers are correct and fewer than k servers are corrupt. The issue of maintaining causality among client requests is also addressed. A security breach resulting from an intruder's ability to effect a violation of causality in the sequence of requests processed by the service is illustrated. An approach to counter this problem is proposed that requires fewer than k servers to be corrupt and that is live if at least k+b servers are correct, where b is the assumed maximum total number of corrupt servers in any system run. An important and novel feature of these schemes is that the client need not be able to identify or authenticate even a single server. Instead, the client is required only to possess at most two public keys for the service. The practicality of these schemes is illustrated through a discussion of several issues pertinent to their implementation and use, and their intended role in a secure version of the Isis system is also described.
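A toy Python sketch of the client-side acceptance rule described above (accept a response only when at least k replicas returned the same value); the parameter values and replies are illustrative and not part of the paper's protocol:

    from collections import Counter

    def accept_response(responses, k):
        """Return the response backed by at least k servers, otherwise None."""
        value, count = Counter(responses).most_common(1)[0]
        return value if count >= k else None

    replies = ["42", "42", "41", "42", "17"]   # replies from n = 5 replicated servers
    print(accept_response(replies, k=3))       # prints "42"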
Optimal Self-Tuning PID Controller Based on Low Power Consumption for a Server Fan Cooling System.
Lee, Chengming; Chen, Rongshun
2015-05-20
Recently, saving the cooling power in servers by controlling the fan speed has attracted considerable attention because of the increasing demand for high-density servers. This paper presents an optimal self-tuning proportional-integral-derivative (PID) controller, combining a PID neural network (PIDNN) with fan-power-based optimization in the transient-state temperature response in the time domain, for a server fan cooling system. Because the thermal model of the cooling system is nonlinear and complex, a server mockup system simulating a 1U rack server was constructed and a fan power model was created using a third-order nonlinear curve fit to determine the cooling power consumption by the fan speed control. The PIDNN with a time-domain criterion is used to tune and optimize all PID gains online. The proposed controller was validated through step-response experiments when the server operated from the low to high power state. The results show that up to 14% of a server's fan cooling power can be saved if the fan control permits a slight temperature response overshoot in the electronic components, which may provide a time-saving strategy for tuning the PID controller to control the server fan speed during low fan power consumption.
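For readers unfamiliar with the underlying control law, the following Python sketch shows a plain discrete PID loop of the kind being tuned; the gains, setpoint, and toy thermal response are invented, and the paper's contribution (online tuning with a PID neural network) is not reproduced here.

    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measured):
            error = setpoint - measured
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=1.0)
    temperature = 70.0                            # hypothetical component temperature (C)
    for _ in range(5):
        fan_cmd = -pid.update(setpoint=60.0, measured=temperature)  # hotter -> faster fan
        temperature += 0.5 - 0.02 * fan_cmd                         # toy thermal response
        print(f"fan command {fan_cmd:6.2f}, temperature {temperature:6.2f} C")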
Informatics in radiology (infoRAD): A complete continuous-availability PACS archive server.
Liu, Brent J; Huang, H K; Cao, Fei; Zhou, Michael Z; Zhang, Jianguo; Mogel, Greg
2004-01-01
The operational reliability of the picture archiving and communication system (PACS) server in a filmless hospital environment is always a major concern because server failure could cripple the entire PACS operation. A simple, low-cost, continuous-availability (CA) PACS archive server was designed and developed. The server makes use of a triple modular redundancy (TMR) system with a simple majority voting logic that automatically identifies a faulty module and removes it from service. The remaining two modules continue normal operation with no adverse effects on data flow or system performance. In addition, the server is integrated with two external mass storage devices for short- and long-term storage. Evaluation and testing of the server were conducted with laboratory experiments in which hardware failures were simulated to observe recovery time and the resumption of normal data flow. The server provides maximum uptime (99.999%) for end users while ensuring the transactional integrity of all clinical PACS data. Hardware failure has only minimal impact on performance, with no interruption of clinical data flow or loss of data. As hospital PACS become more widespread, the need for CA PACS solutions will increase. A TMR CA PACS archive server can reliably help achieve CA in this setting. Copyright RSNA, 2004
Performance of a distributed superscalar storage server
NASA Technical Reports Server (NTRS)
Finestead, Arlan; Yeager, Nancy
1993-01-01
The RS/6000 performed well in our test environment. The potential exists for the RS/6000 to act as a departmental server for a small number of users, rather than as a high-speed archival server. Multiple UniTree Disk Servers utilizing one UniTree Name Server could be developed, which would allow for a cost-effective archival system. Our performance tests were clearly limited by the network bandwidth. The performance gathered by the LibUnix testing shows that UniTree is capable of exceeding ethernet speeds on an RS/6000 Model 550. The performance of FTP might be significantly faster across a higher-bandwidth network. The UniTree Name Server also showed signs of being a potential bottleneck. UniTree sites that would require a high ratio of file creations and deletions to reads and writes would run into this bottleneck. It is possible to improve the UniTree Name Server performance by bypassing the UniTree LibUnix Library altogether, communicating directly with the UniTree Name Server, and optimizing creations. Although testing was performed in a less than ideal environment, hopefully the performance statistics stated in this paper will give end-users a realistic idea as to what performance they can expect in this type of setup.
NASA Astrophysics Data System (ADS)
Roach, Colin; Carlsson, Johan; Cary, John R.; Alexander, David A.
2002-11-01
The National Transport Code Collaboration (NTCC) has developed an array of software, including a data client/server. The data server, which is written in C++, serves local data (in the ITER Profile Database format) as well as remote data (by accessing one or several MDS+ servers). The client, a web-invocable Java applet, provides a uniform, intuitive, user-friendly, graphical interface to the data server. The uniformity of the interface relieves the user from the trouble of mastering the differences between different data formats and lets him/her focus on the essentials: plotting and viewing the data. The user runs the client by visiting a web page using any Java-capable Web browser. The client is automatically downloaded and run by the browser. A reference to the data server is then retrieved via the standard Web protocol (HTTP). The communication between the client and the server is then handled by the mature, industry-standard CORBA middleware. CORBA has bindings for all common languages and many high-quality implementations are available (both Open Source and commercial). The NTCC data server has been installed at the ITPA International Multi-tokamak Confinement Profile Database, which is hosted by the UKAEA at Culham Science Centre. The installation of the data server is protected by an Internet firewall. To make it accessible to clients outside the firewall, some modifications of the server were required. The working version of the ITPA confinement profile database is not open to the public. Authentication of legitimate users is done utilizing built-in Java security features to demand a password to download the client. We present an overview of the NTCC data client/server and some details of how the CORBA firewall-traversal issues were resolved and how the user authentication is implemented.
LiveBench-1: continuous benchmarking of protein structure prediction servers.
Bujnicki, J M; Elofsson, A; Fischer, D; Rychlewski, L
2001-02-01
We present a novel, continuous approach aimed at the large-scale assessment of the performance of available fold-recognition servers. Six popular servers were investigated: PDB-Blast, FFAS, T98-lib, GenTHREADER, 3D-PSSM, and INBGU. The assessment was conducted using as prediction targets a large number of selected protein structures released from October 1999 to April 2000. A target was selected if its sequence showed no significant similarity to any of the proteins previously available in the structural database. Overall, the servers were able to produce structurally similar models for one-half of the targets, but significantly accurate sequence-structure alignments were produced for only one-third of the targets. We further classified the targets into two sets: easy and hard. We found that all servers were able to find the correct answer for the vast majority of the easy targets if a structurally similar fold was present in the server's fold libraries. However, among the hard targets--where standard methods such as PSI-BLAST fail--the most sensitive fold-recognition servers were able to produce similar models for only 40% of the cases, half of which had a significantly accurate sequence-structure alignment. Among the hard targets, the presence of updated libraries appeared to be less critical for the ranking. An "ideally combined consensus" prediction, where the results of all servers are considered, would increase the percentage of correct assignments by 50%. Each server had a number of cases with a correct assignment, where the assignments of all the other servers were wrong. This emphasizes the benefits of considering more than one server in difficult prediction tasks. The LiveBench program (http://BioInfo.PL/LiveBench) is being continued, and all interested developers are cordially invited to join.
The HydroServer Platform for Sharing Hydrologic Data
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Horsburgh, J. S.; Schreuders, K.; Maidment, D. R.; Zaslavsky, I.; Valentine, D. W.
2010-12-01
The CUAHSI Hydrologic Information System (HIS) is an internet based system that supports sharing of hydrologic data. HIS consists of databases connected using the Internet through Web services, as well as software for data discovery, access, and publication. The HIS system architecture is comprised of servers for publishing and sharing data, a centralized catalog to support cross server data discovery and a desktop client to access and analyze data. This paper focuses on HydroServer, the component developed for sharing and publishing space-time hydrologic datasets. A HydroServer is a computer server that contains a collection of databases, web services, tools, and software applications that allow data producers to store, publish, and manage the data from an experimental watershed or project site. HydroServer is designed to permit publication of data as part of a distributed national/international system, while still locally managing access to the data. We describe the HydroServer architecture and software stack, including tools for managing and publishing time series data for fixed point monitoring sites as well as spatially distributed, GIS datasets that describe a particular study area, watershed, or region. HydroServer adopts a standards based approach to data publication, relying on accepted and emerging standards for data storage and transfer. CUAHSI developed HydroServer code is free with community code development managed through the codeplex open source code repository and development system. There is some reliance on widely used commercial software for general purpose and standard data publication capability. The sharing of data in a common format is one way to stimulate interdisciplinary research and collaboration. It is anticipated that the growing, distributed network of HydroServers will facilitate cross-site comparisons and large scale studies that synthesize information from diverse settings, making the network as a whole greater than the sum of its parts in advancing hydrologic research. Details of the CUAHSI HIS can be found at http://his.cuahsi.org, and HydroServer codeplex site http://hydroserver.codeplex.com.
Group-oriented coordination models for distributed client-server computing
NASA Technical Reports Server (NTRS)
Adler, Richard M.; Hughes, Craig S.
1994-01-01
This paper describes group-oriented control models for distributed client-server interactions. These models transparently coordinate requests for services that involve multiple servers, such as queries across distributed databases. Specific capabilities include: decomposing and replicating client requests; dispatching request subtasks or copies to independent, networked servers; and combining server results into a single response for the client. The control models were implemented by combining request broker and process group technologies with an object-oriented communication middleware tool. The models are illustrated in the context of a distributed operations support application for space-based systems.
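An illustrative Python scatter-gather sketch of the coordination pattern described above: split a client query into per-server subtasks, run them concurrently, and merge the partial results into one response. The server names and the query function are hypothetical, not the paper's middleware.

    from concurrent.futures import ThreadPoolExecutor

    def query_server(server, subquery):
        # Placeholder for a real remote call to one networked server.
        return f"{server}:{subquery}-rows"

    def scatter_gather(servers, subqueries):
        # Dispatch the subtasks to the servers concurrently and gather the results.
        with ThreadPoolExecutor(max_workers=len(servers)) as pool:
            partials = pool.map(query_server, servers, subqueries)
        return list(partials)   # combined into a single response for the client

    print(scatter_gather(["db-east", "db-west"], ["orders<2020", "orders>=2020"]))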
National Medical Terminology Server in Korea
NASA Astrophysics Data System (ADS)
Lee, Sungin; Song, Seung-Jae; Koh, Soonjeong; Lee, Soo Kyoung; Kim, Hong-Gee
An interoperable EHR (Electronic Health Record) necessitates at least the use of standardized medical terminologies. This paper describes a medical terminology server, LexCare Suite, which houses terminology management applications, such as a terminology editor, and a terminology repository populated with international standard terminology systems such as Systematized Nomenclature of Medicine (SNOMED). The server is intended to satisfy the need for quality terminology systems in local primary- to tertiary-level hospitals. Our partner general hospitals have used the server to test its applicability. This paper describes the server and the results of the applicability test.
CIVET: Continuous Integration, Verification, Enhancement, and Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alger, Brian; Gaston, Derek R.; Permann, Cody J
A Git server (GitHub, GitLab, BitBucket) sends event notifications to the Civet server. These are either a "Pull Request" or a "Push" notification. Civet then checks the database to determine what tests need to be run and marks them as ready to run. Civet clients, running on dedicated machines, query the server for available jobs that are ready to run. When a client gets a job, it executes the scripts attached to the job and reports back to the server the output and exit status. When the client updates the server, the server will also update the Git server with the result of the job, as well as updating the main web page.
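A hedged Python sketch of the client-side loop described above: poll the CI server for a ready job, run its script, and report the output and exit status back. The endpoint paths and JSON fields are invented for illustration and are not Civet's actual API.

    import json
    import subprocess
    import time
    import urllib.request

    SERVER = "https://civet.example.org"   # hypothetical Civet server URL

    def poll_once():
        # Ask the server for jobs marked ready to run (endpoint name is invented).
        with urllib.request.urlopen(f"{SERVER}/client/ready_jobs") as resp:
            jobs = json.load(resp)
        for job in jobs:
            proc = subprocess.run(job["script"], shell=True,
                                  capture_output=True, text=True)
            report = {"job_id": job["id"], "exit_status": proc.returncode,
                      "output": proc.stdout + proc.stderr}
            req = urllib.request.Request(f"{SERVER}/client/job_result",
                                         data=json.dumps(report).encode(),
                                         headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req)   # report output and exit status back

    while True:
        poll_once()
        time.sleep(30)   # clients poll the server periodically for available jobs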
NASA Astrophysics Data System (ADS)
He, Huimin; Liu, Fengman; Li, Baoxia; Xue, Haiyun; Wang, Haidong; Qiu, Delong; Zhou, Yunyan; Cao, Liqiang
2016-11-01
With the development of the multicore processor, the bandwidth and capacity of the memory, rather than the memory area, are the key factors in server performance. At present, however, the new architectures, such as fully buffered DIMM (FBDIMM), hybrid memory cube (HMC), and high bandwidth memory (HBM), cannot be commercially applied in the server. Therefore, a new architecture for the server is proposed. CPU and memory are separated onto different boards, and optical interconnection is used for the communication between them. Each optical module corresponds to each dual inline memory module (DIMM) with 64 channels. Compared to the previous technology, not only can the architecture realize high-capacity and wide-bandwidth memory, it can also reduce power consumption and cost, and be compatible with the existing dynamic random access memory (DRAM). In this article, the proposed module with system-in-package (SiP) integration is demonstrated. In the optical module, the silicon photonic chip is included, which is a promising technology to be applied in next-generation data exchange centers. Due to the bandwidth-distance performance of the optical interconnection, SerDes chips are introduced to convert the 64-bit data at 800 Mbps from/to 4-channel data at 12.8 Gbps after/before they are transmitted through optical fiber. All the devices are packaged on cheap organic substrates. To ensure the performance of the whole system, several optimization efforts have been performed on the two modules. High-speed interconnection traces have been designed and simulated with electromagnetic simulation software. Steady-state thermal characteristics of the transceiver module have been evaluated by ANSYS APDL based on finite-element methodology (FEM). Heat sinks are placed at the hotspot area to ensure the reliability of all working chips. Finally, this transceiver system based on silicon photonics is measured, and the eye diagrams of data and clock signals are verified.
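As a quick consistency check on the data rates quoted above (assuming each of the 64 channels runs at the full 800 Mbps):

    64 channels x 800 Mbps = 51.2 Gbps = 4 channels x 12.8 Gbps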
NASA Technical Reports Server (NTRS)
Anyiwo, Joshua C.
2000-01-01
Vixen is a collection of enabling technologies for uninhibited distributed object computing. In the Spring of 1995 when Vixen was proposed, it was an innovative idea very much ahead of its time. But today the technologies proposed in Vixen have become standard technologies for Enterprise Computing. Sun Microsystems J2EE/EJB specifications, among others, are independently proposed technologies of the Vixen type. I have brought Vixen completely under the J2EE standard in order to maximize interoperability and compatibility with other computing industry efforts. Vixen and the Enterprise JavaBean (EJB) Server technologies are now practically identical; OIL, another Vixen technology, and the Java Messaging System (JMS) are practically identical; and so on. There is no longer anything novel or patentable in the Vixen work performed under this grant. The above discussion, notwithstanding, my independent development of Vixen has significantly helped me, my university, my students and the local community. The undergraduate students who worked with me in developing Vixen have enhanced their expertise in what has become the cutting edge technology of their industry and are therefore well positioned for lucrative employment opportunities in the industry. My academic department has gained a new course: "Multi-media System Development", which provides a highly desirable expertise to our students for employment in any enterprise today. The many Outreach Programs that I conducted during this grant period have exposed local Middle School students to the contributions that NASA is making in our society as well as awakened desires in many such students for careers in Science and Technology. I have applied Vixen to the development of two software packages: (a) JAS: Joshua Application Server - which allows a user to configure an EJB Server to serve a J2EE compliant application over the world wide web; (b) PCM: Professor Course Manager: a J2EE compliant application for configuring a course for distance learning. These types of applications are, however, generally available in the industry today.
ICM: a web server for integrated clustering of multi-dimensional biomedical data.
He, Song; He, Haochen; Xu, Wenjian; Huang, Xin; Jiang, Shuai; Li, Fei; He, Fuchu; Bo, Xiaochen
2016-07-08
Large-scale efforts for parallel acquisition of multi-omics profiling continue to generate extensive amounts of multi-dimensional biomedical data. Thus, integrated clustering of multiple types of omics data is essential for developing individual-based treatments and precision medicine. However, while rapid progress has been made, methods for integrated clustering lack an intuitive web interface that supports biomedical researchers without sufficient programming skills. Here, we present a web tool, named Integrated Clustering of Multi-dimensional biomedical data (ICM), that provides an interface from which to fuse, cluster and visualize multi-dimensional biomedical data and knowledge. With ICM, users can explore the heterogeneity of a disease or a biological process by identifying subgroups of patients. The results obtained can then be interactively modified by using an intuitive user interface. Researchers can also exchange the results from ICM with collaborators via a web link containing a Project ID number that will directly pull up the analysis results being shared. ICM also supports incremental clustering, which allows users to add new sample data into the data of a previous study to obtain a clustering result. Currently, the ICM web server is available with no login requirement and at no cost at http://biotech.bmi.ac.cn/icm/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
A Simple Microsoft Excel Method to Predict Antibiotic Outbreaks and Underutilization.
Miglis, Cristina; Rhodes, Nathaniel J; Avedissian, Sean N; Zembower, Teresa R; Postelnick, Michael; Wunderink, Richard G; Sutton, Sarah H; Scheetz, Marc H
2017-07-01
Benchmarking strategies are needed to promote the appropriate use of antibiotics. We have adapted a simple regressive method in Microsoft Excel that is easily implementable and creates predictive indices. This method trends consumption over time and can identify periods of over- and underuse at the hospital level. Infect Control Hosp Epidemiol 2017;38:860-862.
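An illustrative Python sketch of the kind of simple trend regression described above: fit monthly antibiotic consumption against time and flag months that fall outside a band around the predicted value. The data and the 20% band are invented for illustration and are not the paper's thresholds.

    import numpy as np

    months = np.arange(1, 13)
    ddd_per_1000 = np.array([50, 52, 55, 53, 58, 90, 61, 60, 63, 40, 66, 68])  # hypothetical use

    slope, intercept = np.polyfit(months, ddd_per_1000, deg=1)
    predicted = slope * months + intercept

    for month, observed, expected in zip(months, ddd_per_1000, predicted):
        if observed > 1.2 * expected:
            print(f"month {month:2d}: possible overuse ({observed} vs predicted {expected:.0f})")
        elif observed < 0.8 * expected:
            print(f"month {month:2d}: possible underuse ({observed} vs predicted {expected:.0f})")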
ERIC Educational Resources Information Center
Lo, Ya-yu; Starling, A. Leyf Peirce
2009-01-01
This study examined the effects of a graphing task analysis using the Microsoft[R] Office Excel 2007 program on the single-subject multiple baseline graphing skills of three university graduate students. Using a multiple probe across participants design, the study demonstrated a functional relationship between the number of correct graphing…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-27
... methods for submitting multiple FFATA subaward reports: A batch upload template using Microsoft Excel, an... three methods for submitting multiple FFATA subaward reports: A batch upload template using Microsoft Excel, an XML report submission template and an XML web service. These methods do take advantage of the...
The Effect of Using Microsoft Excel in a High School Algebra Class
ERIC Educational Resources Information Center
Neurath, Rachel A.; Stephens, Larry J.
2006-01-01
The purpose of this study was to investigate the effect of integrating Microsoft Excel into a high school algebra class. The results indicate a slight increase in student achievement when Excel was used. A teacher-created final exam and two Criterion Referenced Tests measured success. One of the Criterion Referenced Tests indicated that the…
Creating Single-Subject Design Graphs in Microsoft Excel[TM] 2007
ERIC Educational Resources Information Center
Dixon, Mark R.; Jackson, James W.; Small, Stacey L.; Horner-King, Mollie J.; Mui Ker Lik, Nicholas; Garcia, Yors; Rosales, Rocio
2009-01-01
Over 10 years have passed since the publication of Carr and Burkholder's (1998) technical article on how to construct single-subject graphs using Microsoft Excel. Over the course of the past decade, the Excel program has undergone a series of revisions that make the Carr and Burkholder paper somewhat difficult to follow with newer versions. The…
Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis
ERIC Educational Resources Information Center
Rubin, Samuel J.; Abrams, Binyomin
2015-01-01
Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…
ERIC Educational Resources Information Center
King, Michael A.
2009-01-01
Business intelligence derived from data warehousing and data mining has become one of the most strategic management tools today, providing organizations with long-term competitive advantages. Business school curriculums and popular database textbooks cover data warehousing, but the examples and problem sets typically are small and unrealistic. The…
ERIC Educational Resources Information Center
Diffin, Jennifer; Chirombo, Fanuel; Nangle, Dennis; de Jong, Mark
2010-01-01
This article explains how the document management team (circulation and interlibrary loan) at the University of Maryland University College implemented Microsoft's SharePoint product to create a central hub for online collaboration, communication, and storage. Enhancing the team's efficiency, organization, and cooperation was the primary goal.…
A Longitudinal Study Assessing the Microsoft Office Skills Course
ERIC Educational Resources Information Center
Carpenter, Donald A.; McGinnis, Denise; Slauson, Gayla Jo; Snyder, Johnny
2013-01-01
This paper explains a four-year longitudinal study of the assessment process for a Microsoft Office skills course. It examines whether there is an increase in students' knowledge based on responses to pre- and post-surveys that asked students to evaluate how well they can do particular tasks. Classical classroom teaching methods were used in the…
ERIC Educational Resources Information Center
Fasoula, S.; Nikitas, P.; Pappa-Louisi, A.
2017-01-01
A series of Microsoft Excel spreadsheets were developed to simulate the process of separation optimization under isocratic and simple gradient conditions. The optimization procedure is performed in a stepwise fashion using simple macros for an automatic application of this approach. The proposed optimization approach involves modeling of the peak…
Teaching Tip: Active Learning via a Sample Database: The Case of Microsoft's Adventure Works
ERIC Educational Resources Information Center
Mitri, Michel
2015-01-01
This paper describes the use and benefits of Microsoft's Adventure Works (AW) database to teach advanced database skills in a hands-on, realistic environment. Database management and querying skills are a key element of a robust information systems curriculum, and active learning is an important way to develop these skills. To facilitate active…
Back to the Future: The Practicality of Using Microsoft NetMeeting for Effective Distance Tutoring
ERIC Educational Resources Information Center
Legutko, Robert S.
2007-01-01
Background: The idea for attempting a distance tutoring project between university tutors and elementary school students using Microsoft NetMeeting was conceived: (a) to provide a new experience mentoring children for university students pursuing a teaching certificate, (b) for university students to utilize technology in pedagogy, (c) as an…
Concept, Content, Construction, and Contingencies: Getting the Horse before the PowerPoint Cart
ERIC Educational Resources Information Center
DuFrene, Debbie D.; Lehman, Carol M.
2004-01-01
The phrase "death by PowerPoint" was not born in the offices of Microsoft's competitors; it came straight from the hearts of victimized meeting attendees. Microsoft estimates that at least 30 million PowerPoint presentations are made daily, with many rightfully warranting death verdict assessment. Death sentences often result from a "construction…
A Guide to Fast and Simple Web Site Development. Using Microsoft FrontPage.
ERIC Educational Resources Information Center
La, Minh; Beachler, Judith
Designed by California's Los Rios Community College District for use in instructional workshops, this guide is intended to help institutional researchers create World Wide Web sites using Microsoft FrontPage (MF) software. The first part of the guide presents practical suggestions for working with the software to create a site, covering the…
PETRO.CALC.PLOT, Microsoft Excel macros to aid petrologic interpretation
Sidder, G.B.
1994-01-01
PETRO.CALC.PLOT is a package of macros which normalizes whole-rock oxide data to 100%, calculates the cation percentages and molecular proportions used for normative mineral calculations, computes the apices for ternary diagrams, determines sums and ratios of specific elements of petrologic interest, and plots 33 X-Y graphs and five ternary diagrams. PETRO.CALC.PLOT also may be used to create other diagrams as desired by the user. The macros run in Microsoft Excel 3.0 and 4.0 for Macintosh computers and in Microsoft Excel 3.0 and 4.0 for Windows. Macros provided in PETRO.CALC.PLOT minimize repetition and time required to recalculate and plot whole-rock oxide data for petrologic analysis. © 1994.
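A small Python sketch of the first step the macros perform, renormalizing whole-rock oxide data to a 100% total; the analysis values below are hypothetical.

    oxides_wt_pct = {"SiO2": 58.4, "Al2O3": 16.2, "Fe2O3": 7.1, "MgO": 4.3,
                     "CaO": 6.8, "Na2O": 3.1, "K2O": 2.2}

    total = sum(oxides_wt_pct.values())
    normalized = {oxide: 100.0 * value / total for oxide, value in oxides_wt_pct.items()}

    print(f"original total: {total:.2f} wt%")
    for oxide, value in normalized.items():
        print(f"{oxide:>6}: {value:5.2f} wt%")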
Educational Utilization of Microsoft Powerpoint for Oral and Maxillofacial Cancer Presentations.
Carvalho, Francisco Samuel Rodrigues; Chaves, Filipe Nobre; Soares, Eduardo Costa Studart; Pereira, Karuza Maria Alves; Ribeiro, Thyciana Rodrigues; Fonteles, Cristiane Sa Roriz; Costa, Fabio Wildson Gurgel
2016-01-01
Electronic presentations have become useful tools for surgeons, other clinicians and patients, facilitating medical and legal support and scientific research. Microsoft® PowerPoint is by far the most commonly used computer-based presentation package. Setting up surgical clinical cases with PowerPoint makes it easy to register and follow patients for the purpose of discussing treatment plans or preparing scientific presentations. It facilitates communication between professionals, supervising clinical cases and teaching. It is often useful to create a template to standardize presentations, a capability the software offers through the slide master. The purpose of this paper was to show a simple and practical method for creating a Microsoft® PowerPoint template for use in presentations concerning oral and maxillofacial cancer.
NASA Technical Reports Server (NTRS)
Plesea, Lucian; Wood, James F.
2012-01-01
This software is a simple, yet flexible server of raster map products, compliant with the Open Geospatial Consortium (OGC) Web Map Service (WMS) 1.1.1 protocol. The server is a full implementation of the OGC WMS 1.1.1 as a fastCGI client and using Geospatial Data Abstraction Library (GDAL) for data access. The server can operate in a proxy mode, where all or part of the WMS requests are done on a back server. The server has explicit support for a colocated tiled WMS, including rapid response of black (no-data) requests. It generates JPEG and PNG images, including 16-bit PNG. The GDAL back-end support allows great flexibility on the data access. The server is a port to a Linux/GDAL platform from the original IRIX/IL platform. It is simpler to configure and use, and depending on the storage format used, it has better performance than other available implementations. The WMS server 2.0 is a high-performance WMS implementation due to the fastCGI architecture. The use of GDAL data back end allows for great flexibility. The configuration is relatively simple, based on a single XML file. It provides scaling and cropping, as well as blending of multiple layers based on layer transparency.
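For reference, a WMS 1.1.1 GetMap request to such a server takes the following general form; the host name and layer name are placeholders, not the actual endpoint of this software.

    http://example.org/wms?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap&LAYERS=global_mosaic&STYLES=&SRS=EPSG:4326&BBOX=-180,-90,180,90&WIDTH=1024&HEIGHT=512&FORMAT=image/jpeg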
Virtual network computing: cross-platform remote display and collaboration software.
Konerding, D E
1999-04-01
VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits them back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, and they are unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.
How to securely replicate services (preliminary version)
NASA Technical Reports Server (NTRS)
Reiter, Michael; Birman, Kenneth
1992-01-01
A method is presented for constructing replicated services that retain their availability and integrity despite several servers and clients being corrupted by an intruder, in addition to others failing benignly. More precisely, a service is replicated by 'n' servers in such a way that a correct client will accept a correct server's response if, for some prespecified parameter, k, at least k servers are correct and fewer than k servers are corrupt. The issue of maintaining causality among client requests is also addressed. A security breach resulting from an intruder's ability to effect a violation of causality in the sequence of requests processed by the service is illustrated. An approach to counter this problem is proposed that requires that fewer than k servers are corrupt and, to ensure liveness, that k is less than or equal to n - 2t, where t is the assumed maximum total number of both corruptions and benign failures suffered by servers in any system run. An important and novel feature of these schemes is that the client need not be able to identify or authenticate even a single server. Instead, the client is required only to possess at most two public keys for the service.
NASA Astrophysics Data System (ADS)
Faden, J.; Vandegriff, J. D.; Weigel, R. S.
2016-12-01
Autoplot was introduced in 2008 as an easy-to-use plotting tool for the space physics community. It reads data from a variety of file resources, such as CDF and HDF files, and a number of specialized data servers, such as the PDS/PPI's DIT-DOS, CDAWeb, and the University of Iowa's RPWG Das2Server. Each of these servers has optimized methods for transmitting data to display in Autoplot, but requires coordination and specialized software to work, limiting Autoplot's ability to access new servers and datasets. Likewise, groups who would like software to access their APIs must either write their own clients, or publish a specification document in hopes that people will write clients. The HAPI specification was written so that a simple, standard API could be used by both Autoplot and server implementations, to remove these barriers to the free flow of time series data. Autoplot's software for communicating with HAPI servers is presented, showing the user interface scientists will use, and how data servers might implement the HAPI specification to provide access to their data. This will also include instructions on how Autoplot is used and installed on desktop computers, and used to view data from the RBSP, Juno, and other missions.
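For orientation, a HAPI server exposes a small set of request endpoints of roughly the following form; the host name and dataset id are placeholders, and the parameter names follow early revisions of the specification and may differ in later versions.

    https://example.org/hapi/catalog
    https://example.org/hapi/info?id=dataset1
    https://example.org/hapi/data?id=dataset1&time.min=2016-01-01T00:00:00Z&time.max=2016-01-02T00:00:00Z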
Providing Internet Access to High-Resolution Mars Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMars server is a computer program that provides Internet access to high-resolution Mars images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of Mars. The OnMars server is an implementation of the Open Geospatial Consortium (OGC) Web Map Service (WMS) server. Unlike other Mars Internet map servers that provide Martian data using an Earth coordinate system, the OnMars WMS server supports encoding of data in Mars-specific coordinate systems. The OnMars server offers access to most of the available high-resolution Martian image and elevation data, including an 8-meter-per-pixel uncontrolled mosaic of most of the Mars Global Surveyor (MGS) Mars Observer Camera Narrow Angle (MOCNA) image collection, which is not available elsewhere. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. The OnMars server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bent, John M.; Faibish, Sorin; Pedone, Jr., James M.
A cluster file system is provided having a plurality of distributed metadata servers with shared access to one or more shared low latency persistent key-value metadata stores. A metadata server comprises an abstract storage interface comprising a software interface module that communicates with at least one shared persistent key-value metadata store providing a key-value interface for persistent storage of key-value metadata. The software interface module provides the key-value metadata to the at least one shared persistent key-value metadata store in a key-value format. The shared persistent key-value metadata store is accessed by a plurality of metadata servers. A metadata request can be processed by a given metadata server independently of other metadata servers in the cluster file system. A distributed metadata storage environment is also disclosed that comprises a plurality of metadata servers having an abstract storage interface to at least one shared persistent key-value metadata store.
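A hedged Python sketch of the abstract storage interface idea described above, in which a metadata server translates file-metadata operations into key-value puts and gets against a shared store; the class and key layout are illustrative and are not the patent's implementation.

    class KeyValueMetadataInterface:
        """Hypothetical abstract storage interface for a metadata server."""

        def __init__(self, kv_store):
            self.kv = kv_store   # shared persistent key-value metadata store

        def set_attr(self, path, attr, value):
            self.kv[f"{path}#{attr}"] = value   # metadata written in key-value format

        def get_attr(self, path, attr):
            return self.kv.get(f"{path}#{attr}")

    store = {}   # in-memory stand-in for the shared persistent store
    meta = KeyValueMetadataInterface(store)
    meta.set_attr("/data/run42/output.h5", "size_bytes", 1048576)
    print(meta.get_attr("/data/run42/output.h5", "size_bytes"))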
An assessment of burn prevention knowledge in a high burn-risk environment: restaurants.
Piazza-Waggoner, Carrie; Adams, C D; Goldfarb, I W; Slater, H
2002-01-01
Our facility has seen an increase in the number of cases of children burned in restaurants. Fieldwork has revealed many unsafe serving practices in restaurants in our tristate area. The current research targets what appears to be an underexamined burn-risk environment, restaurants, to examine server knowledge about burn prevention and burn care with customers. Participants included 71 local restaurant servers and 53 servers from various restaurants who were recruited from undergraduate courses. All participants completed a brief demographic form as well as a Burn Knowledge Questionnaire. It was found that server knowledge was low (ie, less than 50% accuracy). Yet, most servers reported that they felt customer burn safety was important enough to change the way that they serve. Additionally, it was found that length of time employed as a server was a significant predictor of servers' burn knowledge (ie, more years serving associated with higher knowledge). Finally, individual items were examined to identify potential targets for developing prevention programs.
Horton, John J.
2006-04-11
A system and method of maintaining communication between a computer and a server, the server being in communication with the computer via xDSL service or dial-up modem service, with xDSL service being the default mode of communication, the method including sending a request to the server via xDSL service to which the server should respond and determining if a response has been received. If no response has been received, displaying on the computer a message (i) indicating that xDSL service has failed and (ii) offering to establish communication between the computer and the server via the dial-up modem, and thereafter changing the default mode of communication between the computer and the server to dial-up modem service. In a preferred embodiment, an xDSL service provider monitors dial-up modem communications and determines if the computer dialing in normally establishes communication with the server via xDSL service. The xDSL service provider can thus quickly and easily detect xDSL failures.
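A toy Python sketch of the failover logic described in the patent abstract: try the default xDSL path, and if no response arrives, fall back to dial-up and make dial-up the new default. The connection functions are placeholders, not the patented implementation.

    def request_via_xdsl():
        # Placeholder: send a request over xDSL and return True if a response arrives.
        return False

    def connect_via_dialup():
        print("xDSL service has failed; establishing dial-up connection...")
        return True

    default_mode = "xdsl"
    if default_mode == "xdsl" and not request_via_xdsl():
        if connect_via_dialup():
            default_mode = "dialup"   # dial-up becomes the new default mode
    print("default communication mode:", default_mode)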
Rclick: a web server for comparison of RNA 3D structures.
Nguyen, Minh N; Verma, Chandra
2015-03-15
RNA molecules play important roles in key biological processes in the cell and are becoming attractive for developing therapeutic applications. Since the function of RNA depends on its structure and dynamics, comparing and classifying the RNA 3D structures is of crucial importance to molecular biology. In this study, we have developed Rclick, a web server that is capable of superimposing RNA 3D structures by using clique matching and 3D least-squares fitting. Our server Rclick has been benchmarked and compared with other popular servers and methods for RNA structural alignments. In most cases, Rclick alignments were better in terms of structure overlap. Our server also recognizes conformational changes between structures. For this purpose, the server produces complementary alignments to maximize the extent of detectable similarity. Various examples showcase the utility of our web server for comparison of RNA, RNA-protein complexes and RNA-ligand structures. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
A General Purpose Connections type CTI Server Based on SIP Protocol and Its Implementation
NASA Astrophysics Data System (ADS)
Watanabe, Toru; Koizumi, Hisao
In this paper, we propose a general-purpose connections-type CTI (Computer Telephony Integration) server that provides various CTI services such as voice logging. The CTI server communicates with an IP-PBX using SIP (Session Initiation Protocol) and accumulates the voice packets of external-line telephone calls flowing between an extension IP telephone and a VoIP gateway connected to outside line networks. The CTI server realizes CTI services such as voice logging, telephone conferencing, or IVR (interactive voice response) by accumulating and processing the sampled voice packets. Furthermore, the CTI server incorporates a web server function that can provide various CTI services, such as a Web telephone directory, via a Web browser to PCs, cellular telephones, or smart-phones in mobile environments.
Implementing TCP/IP and a socket interface as a server in a message-passing operating system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hipp, E.; Wiltzius, D.
1990-03-01
The UNICOS 4.3BSD network code and socket transport interface are the basis of an explicit network server for NLTSS, a message passing operating system on the Cray YMP. A BSD socket user library provides access to the network server using an RPC mechanism. The advantages of this server methodology are its modularity and extensibility to migrate to future protocol suites (e.g. OSI) and transport interfaces. In addition, the network server is implemented in an explicit multi-tasking environment to take advantage of the Cray YMP multi-processor platform. 19 refs., 5 figs.
Single-server blind quantum computation with quantum circuit model
NASA Astrophysics Data System (ADS)
Zhang, Xiaoqian; Weng, Jian; Li, Xiaochun; Luo, Weiqi; Tan, Xiaoqing; Song, Tingting
2018-06-01
Blind quantum computation (BQC) enables the client, who has few quantum technologies, to delegate her quantum computation to a server, who has strong quantum computational capabilities and learns nothing about the client's quantum inputs, outputs and algorithms. In this article, we propose a single-server BQC protocol with the quantum circuit model by replacing any quantum gate with a combination of rotation operators. Trap quantum circuits are introduced, together with the combination of rotation operators, such that the server learns nothing about the quantum algorithms. The client only needs to perform operations X and Z, while the server honestly performs rotation operators.
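One standard identity behind such a replacement (a textbook fact, not the paper's specific construction) is that any single-qubit unitary U can be written, up to a global phase, as a combination of rotation operators:

    U = e^(i*alpha) R_z(beta) R_y(gamma) R_z(delta)

for suitable real angles alpha, beta, gamma, and delta.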
An Evaluation of Alternative Designs for a Grid Information Service
NASA Technical Reports Server (NTRS)
Smith, Warren; Waheed, Abdul; Meyers, David; Yan, Jerry; Kwak, Dochan (Technical Monitor)
2001-01-01
The Globus information service was not working well. Many data updates from Globus daemons saturated the single server, and users could not retrieve information. We created a second server for NASA and Alliance. Performance was good on that server but somewhat slow on the other. We needed to know exactly how the information service was being used and which servers and configurations performed best. This viewgraph presentation gives an overview of the evaluation of alternative designs for a Grid Information Service. Details are given on the workload characterization, the methodology used, and the performance evaluation.
Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture
NASA Astrophysics Data System (ADS)
Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel
2003-11-01
Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.
NASA Astrophysics Data System (ADS)
Chao, Woodrew; Ho, Bruce K. T.; Chao, John T.; Sadri, Reza M.; Huang, Lu J.; Taira, Ricky K.
1995-05-01
Our tele-medicine/PACS archive system is based on a three-tier distributed hierarchical architecture, including magnetic disk farms, optical jukebox, and tape jukebox sub-systems. The hierarchical storage management (HSM) architecture, built around a low-cost, high-performance platform [personal computers (PC) and Microsoft Windows NT], presents a very scalable and distributed solution ideal for meeting the needs of client/server environments such as tele-medicine, tele-radiology, and PACS. These image-based systems typically require storage capacities mirroring those of film-based technology (multi-terabyte with 10+ years storage) and patient data retrieval times at near on-line performance as demanded by radiologists. With the scalable architecture, storage requirements can be easily configured to meet the needs of the small clinic (multi-gigabyte) to those of a major hospital (multi-terabyte). The patient data retrieval performance requirement was achieved by employing system intelligence to manage migration and caching of archived data. Relevant information from HIS/RIS triggers prefetching of data whenever possible based on simple rules. System intelligence embedded in the migration manager allows the clustering of patient data onto a single tape during data migration from optical to tape medium. Clustering of patient data on the same tape eliminates multiple tape loading and associated seek time during patient data retrieval. Optimal tape performance can then be achieved by utilizing the tape drive's high-performance data-streaming capabilities, thereby reducing typical data retrieval delays associated with streaming tape devices.
Control of surface thermal scratch of strip in tandem cold rolling
NASA Astrophysics Data System (ADS)
Chen, Jinshan; Li, Changsheng
2014-07-01
Thermal scratch seriously affects the surface quality of cold-rolled stainless steel strip. Some researchers have carried out qualitative and theoretical studies in this field, but there is currently a lack of research on the effective prediction and control of thermal scratch defects in practical production, especially in tandem cold rolling. In order to establish a precise mathematical model of oil film thickness in the deformation zone, lubrication in the cold rolling of SUS410L stainless steel strip is studied and the major factors affecting oil film thickness are analyzed. On a statistical basis, a mathematical model of the critical oil film thickness for thermal scratch in the deformation zone is built by fitting and regression analysis, and a criterion for identifying thermal scratch defects, based on a temperature comparison method, is then put forward. Using SQL Server 2010 to store and retrieve data, software for thermal scratch defect control in tandem cold rolling of stainless steel was developed in Microsoft Visual Studio 2008 with MFC and put into practical production. Statistics indicate that the prediction hit rate for thermal scratch is as high as 92.38%, while the occurrence rate of thermal scratch is reduced by 89.13%. Owing to the application of the software, the rolling speed is increased by approximately 9.3%. The software provides an effective solution to the problem of thermal scratch defects in tandem cold rolling and helps to improve the surface quality of stainless steel strip in practical production.
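The abstract gives neither the regression coefficients nor the temperature model, so the following is only a hedged sketch of how such a scratch criterion could be coded: a regression-style critical oil film thickness combined with a temperature-comparison check. Every functional form and number below is a placeholder, not a value from the paper.

    # Hedged illustration only; forms and numbers are assumed, not the authors' model.
    def critical_film_thickness(speed, reduction, coeffs):
        """Regression-style model h_crit = a0 + a1*speed + a2*reduction (assumed form)."""
        a0, a1, a2 = coeffs
        return a0 + a1 * speed + a2 * reduction

    def scratch_risk(film_thickness, strip_temperature, h_crit, t_crit):
        """Flag a stand when the oil film is thinner than the critical value or the
        computed surface temperature exceeds the critical scratch temperature."""
        return film_thickness < h_crit or strip_temperature > t_crit

    # Example call with made-up numbers, purely to show the interface:
    risky = scratch_risk(film_thickness=0.8, strip_temperature=195.0,
                         h_crit=critical_film_thickness(18.0, 0.32, (0.2, 0.03, 0.5)),
                         t_crit=180.0)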
Virtualization for the LHCb Online system
NASA Astrophysics Data System (ADS)
Bonaccorsi, Enrico; Brarda, Loic; Moine, Gary; Neufeld, Niko
2011-12-01
Virtualization has long been advertised by the IT industry as a way to cut costs, optimise resource usage and manage the complexity of large data centres. The great number and huge heterogeneity of hardware, both industrial and custom-made, has up to now led to reluctance to adopt virtualization in the IT infrastructure of large experiment installations. Our experience in the LHCb experiment has shown that virtualization improves the availability and manageability of the whole system. We evaluated the available hypervisors and virtualization solutions and found that the Microsoft HV technology provides a high level of maturity and flexibility for our purpose. We present the results of these comparison tests and describe in detail the architecture of our virtualization infrastructure, with special emphasis on security for services visible to the outside world. Security is achieved by a sophisticated combination of VLANs, firewalls and virtual routing; the costs and benefits of this solution are analysed. We have adapted our cluster management tools, notably Quattor, to the needs of virtual machines, which allows us to migrate services smoothly from physical machines to the virtualized infrastructure. The migration procedures are also described. In the final part of the document we describe our recent R&D activities aimed at replacing the SAN back-end of the virtualization infrastructure with a cheaper iSCSI solution; this will allow all servers and related services to be moved to the virtualized infrastructure, except those performing hardware control via non-commodity PCI plug-in cards.
Biomechanics Analysis of Combat Sport (Silat) By Using Motion Capture System
NASA Astrophysics Data System (ADS)
Zulhilmi Kaharuddin, Muhammad; Badriah Khairu Razak, Siti; Ikram Kushairi, Muhammad; Syawal Abd. Rahman, Mohamed; An, Wee Chang; Ngali, Z.; Siswanto, W. A.; Salleh, S. M.; Yusup, E. M.
2017-01-01
‘Silat’ is a Malay traditional martial art practised at both amateur and professional levels. The intensity of its motion spurs scientific research in biomechanics. The main purpose of this study is to present the biomechanics method used in the study of ‘silat’. Using a 3D Depth Camera motion capture system, two subjects each performed ‘Jurus Satu’ in three repetitions, with one subject set as the benchmark for the research. The videos are captured and processed by the 3D Depth Camera server system into 16 3D body-joint coordinates, which are then transformed into displacement, velocity and acceleration components using Microsoft Excel for the calculations and MATLAB for simulation of the body. The resulting data serve as an input to differentiate the two subjects’ execution of the ‘Jurus Satu’. Nine primary movements, plus five secondary movements, are observed visually frame by frame in the simulation to identify the exact frame at which each movement takes place. Further analysis differentiates the two subjects’ execution by referring to the mean and standard deviation of the joints for each parameter. The findings provide useful data on joint kinematic parameters, help to improve the execution of ‘Jurus Satu’, and demonstrate how a relatively unknown movement can be learned with the aid of a motion capture system.
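The kinematic post-processing step described, turning 16 joint trajectories into displacement, velocity and acceleration, amounts to finite differencing of the coordinate time series. A small sketch follows, in Python rather than the Excel/MATLAB workflow the authors used; the frame rate and array layout are assumptions, not values from the paper.

    # Sketch of the kinematic post-processing step (assumed 30 fps and metre units).
    import numpy as np

    def joint_kinematics(positions, fps=30.0):
        """positions: array of shape (frames, 16, 3) -- 16 joints, xyz per frame."""
        dt = 1.0 / fps
        displacement = np.diff(positions, axis=0)        # frame-to-frame displacement
        velocity = displacement / dt                     # m/s if positions are in metres
        acceleration = np.diff(velocity, axis=0) / dt    # m/s^2
        return displacement, velocity, acceleration

    # Usage: compare two performers via per-joint mean and standard deviation of speed,
    # mirroring the averaging described in the abstract.
    # disp, vel, acc = joint_kinematics(subject_positions)
    # speeds = np.linalg.norm(vel, axis=2)
    # per_joint_mean, per_joint_std = speeds.mean(axis=0), speeds.std(axis=0)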
An urban energy performance evaluation system and its computer implementation.
Wang, Lei; Yuan, Guan; Long, Ruyin; Chen, Hong
2017-12-15
To improve the urban environment and to effectively reflect and promote urban energy performance, an urban energy performance evaluation system was constructed, thereby strengthening urban environmental management capabilities. From the perspectives of internalization and externalization, a framework of evaluation indicators and key factors was proposed, based on established theory and previous studies, to determine urban energy performance and explore the reasons for differences in performance. Using an improved stochastic frontier analysis method, an urban energy performance evaluation and factor analysis model was built that brings performance evaluation and factor analysis into the same stage of the study. From data on the Chinese provincial capitals from 2004 to 2013, the coefficients of the evaluation indicators and key factors were estimated with this model and then used to compile the program file. The urban energy performance evaluation system developed in this study was designed in three parts: a database, a distributed component server, and a human-machine interface. Its functions comprise login, addition, editing, input, calculation, analysis, comparison, inquiry, and export. On this basis, the urban energy performance evaluation system was implemented using Microsoft Visual Studio .NET 2015. The system can effectively reflect the status of, and changes in, urban energy performance. Beijing was taken as an example for an empirical study, which further verified the applicability and convenience of the evaluation system. Copyright © 2017 Elsevier Ltd. All rights reserved.
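Once the indicator coefficients have been estimated, a stochastic-frontier-style performance score compares observed output with the estimated frontier. The sketch below shows one way such a calculation could be wired up; the indicator names, functional form, and all numbers are invented for illustration and are not drawn from the paper.

    # Hedged sketch of a frontier-based performance score (all names and numbers assumed).
    import math

    def frontier_output(indicators, coeffs):
        """ln(y*) = b0 + sum_i b_i * ln(x_i): a Cobb-Douglas-style frontier."""
        b0, betas = coeffs
        return b0 + sum(b * math.log(indicators[name]) for name, b in betas.items())

    def energy_performance(observed_output, indicators, coeffs):
        """Efficiency score in (0, 1]: observed output relative to the frontier."""
        ln_frontier = frontier_output(indicators, coeffs)
        return min(1.0, math.exp(math.log(observed_output) - ln_frontier))

    # Illustrative call (all numbers invented):
    score = energy_performance(
        observed_output=1.9e4,
        indicators={"energy_consumption": 3.2e4, "capital": 1.1e5, "labour": 4.0e3},
        coeffs=(0.8, {"energy_consumption": 0.45, "capital": 0.35, "labour": 0.20}),
    )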
Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server
2016-09-01
ARL-TR-7798, September 2016. US Army Research Laboratory, Computational and Information Sciences Directorate. Author: Christian D Schlesiger.
PREDICT: Privacy and Security Enhancing Dynamic Information Monitoring
2015-08-03
…[12], consisting of global server-side probabilistic assignment by an untrusted server using cloaked locations, followed by feedback-loop guided local … These methods achieve high sensing coverage with low cost using cloaked locations [3]. In follow-on work, the issue of mobility is addressed.
Performance Modeling of the ADA Rendezvous
1991-10-01
In the queueing network of Figure 2, SERVERTASK can complete only one rendezvous at a time. Thus, the rate at which rendezvous requests are processed at the … In Network 1, SERVERTASK competes with the traffic tasks of the Server Processor. Each time SERVERTASK gains access to the processor, SERVERTASK completes … [Figure 10: a conceptualization of the algorithm, showing the Client Processor, Server Processor, and Software Server across Networks 1 and 2.] The SERVERTASK software server of Network 2 …