Sample records for active server page

  1. Automated grading of homework assignments and tests in introductory and intermediate statistics courses using active server pages.

    PubMed

    Stockburger, D W

    1999-05-01

    Active server pages permit a software developer to customize the Web experience for users by inserting server-side script and database access into Web pages. This paper describes applications of these techniques and provides a primer on the use of these methods. Applications include a system that generates and grades individualized homework assignments and tests for statistics students. The student accesses the system as a Web page, prints out the assignment, does the assignment, and enters the answers on the Web page. The server, running on NT Server 4.0, grades the assignment, updates the grade book (on a database), and returns the answer key to the student.
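
    The workflow above (submit answers over the Web, grade on the server, update a grade book, return the key) can be pictured with a few lines of server code. The TypeScript/Node sketch below only illustrates that flow; it is not the authors' ASP implementation, and the /grade route, form fields, and in-memory grade book are assumptions.

```typescript
// Minimal sketch (not the authors' ASP code): grade submitted answers,
// update a grade book, and return the answer key to the student.
// The route name, parameter format, and scoring rule are assumptions.
import * as http from "http";
import { URLSearchParams } from "url";

const answerKey: Record<string, string> = { q1: "7.5", q2: "0.31", q3: "reject H0" };
const gradeBook = new Map<string, number>(); // studentId -> score

const server = http.createServer((req, res) => {
  if (req.method === "POST" && req.url === "/grade") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const form = new URLSearchParams(body); // e.g. "student=s123&q1=7.5&q2=0.3&q3=reject H0"
      const student = form.get("student") ?? "unknown";
      let score = 0;
      for (const [q, correct] of Object.entries(answerKey)) {
        if ((form.get(q) ?? "").trim() === correct) score++;
      }
      gradeBook.set(student, score); // "update the grade book"
      res.writeHead(200, { "Content-Type": "application/json" });
      res.end(JSON.stringify({ student, score, answerKey })); // "return the answer key"
    });
  } else {
    res.writeHead(404);
    res.end();
  }
});

server.listen(8080);
```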

  2. Dynamic Web Pages: Performance Impact on Web Servers.

    ERIC Educational Resources Information Center

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)
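
    The regression step described above amounts to fitting observed response time against request characteristics. The TypeScript sketch below fits a small multivariate linear model by solving the normal equations; the predictor variables and numbers are invented for illustration and are not the paper's data.

```typescript
// Illustrative sketch of the paper's approach: fit a multivariate linear
// regression and use it to predict server response time for a dynamic request.
function solve(A: number[][], b: number[]): number[] {
  // Gaussian elimination with partial pivoting for a small dense system.
  const n = b.length;
  const M = A.map((row, i) => [...row, b[i]]);
  for (let col = 0; col < n; col++) {
    let pivot = col;
    for (let r = col + 1; r < n; r++) if (Math.abs(M[r][col]) > Math.abs(M[pivot][col])) pivot = r;
    [M[col], M[pivot]] = [M[pivot], M[col]];
    for (let r = col + 1; r < n; r++) {
      const f = M[r][col] / M[col][col];
      for (let c = col; c <= n; c++) M[r][c] -= f * M[col][c];
    }
  }
  const x = new Array(n).fill(0);
  for (let r = n - 1; r >= 0; r--) {
    let s = M[r][n];
    for (let c = r + 1; c < n; c++) s -= M[r][c] * x[c];
    x[r] = s / M[r][r];
  }
  return x;
}

function fitLinearRegression(X: number[][], y: number[]): number[] {
  // Solve the normal equations (X^T X) beta = X^T y; X already has a column of 1s.
  const p = X[0].length;
  const XtX = Array.from({ length: p }, (_, i) =>
    Array.from({ length: p }, (_, j) => X.reduce((s, row) => s + row[i] * row[j], 0)));
  const Xty = Array.from({ length: p }, (_, i) => X.reduce((s, row, k) => s + row[i] * y[k], 0));
  return solve(XtX, Xty);
}

// Rows: [1, requestsPerSecond, scriptTimeMs]; y: observed response time (ms). Made-up data.
const X = [[1, 10, 5], [1, 20, 5], [1, 20, 15], [1, 40, 15], [1, 40, 30]];
const y = [12, 21, 32, 55, 78];
const beta = fitLinearRegression(X, y);
const predict = (rps: number, scriptMs: number) => beta[0] + beta[1] * rps + beta[2] * scriptMs;
console.log(beta, predict(30, 20));
```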

  3. A radiology department intranet: development and applications.

    PubMed

    Willing, S J; Berland, L L

    1999-01-01

    An intranet is a "private Internet" that uses the protocols of the World Wide Web to share information resources within a company or with the company's business partners and clients. The hardware requirements for an intranet begin with a dedicated Web server permanently connected to the departmental network. The heart of a Web server is the hypertext transfer protocol (HTTP) service, which receives a page request from a client's browser and transmits the page back to the client. Although knowledge of hypertext markup language (HTML) is not essential for authoring a Web page, a working familiarity with HTML is useful, as is knowledge of programming and database management. Security can be ensured by using scripts to write information in hidden fields or by means of "cookies." Interfacing databases and database management systems with the Web server and conforming the user interface to HTML syntax can be achieved by means of the common gateway interface (CGI), Active Server Pages (ASP), or other methods. An intranet in a radiology department could include the following types of content: on-call schedules, work schedules and a calendar, a personnel directory, resident resources, memorandums and discussion groups, software for a radiology information system, and databases.

  4. Educational use of World Wide Web pages on CD-ROM.

    PubMed

    Engel, Thomas P; Smith, Michael

    2002-01-01

    The World Wide Web is increasingly important for medical education. Internet served pages may also be used on a local hard disk or CD-ROM without a network or server. This allows authors to reuse existing content and provide access to users without a network connection. CD-ROM offers several advantages over network delivery of Web pages for several applications. However, creating Web pages for CD-ROM requires careful planning. Issues include file names, relative links, directory names, default pages, server created content, image maps, other file types and embedded programming. With care, it is possible to create server based pages that can be copied directly to CD-ROM. In addition, Web pages on CD-ROM may reference Internet served pages to provide the best features of both methods.
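
    Some of the planning issues mentioned (relative links, server-created content) can be checked mechanically before mastering a disc. The TypeScript sketch below is a hypothetical pre-mastering check, not the authors' procedure; the heuristics and directory name are assumptions.

```typescript
// Sketch of one pre-mastering check: flag links in local HTML files that are
// likely to break when the pages are copied to CD-ROM (root-relative paths,
// server-generated URLs). Heuristics and directory layout are assumptions.
import { readFileSync, readdirSync } from "fs";
import { join, extname } from "path";

function findSuspectLinks(dir: string): string[] {
  const problems: string[] = [];
  for (const name of readdirSync(dir)) {
    if (extname(name) !== ".html" && extname(name) !== ".htm") continue;
    const html = readFileSync(join(dir, name), "utf8");
    for (const m of html.matchAll(/(?:href|src)\s*=\s*["']([^"']+)["']/gi)) {
      const url = m[1];
      if (url.startsWith("/") ||           // root-relative: resolves against the CD root, often wrong
          url.includes("cgi-bin") ||       // server-created content will not run from CD
          /\.(asp|cgi|php)\b/i.test(url))  // server-side scripts need a live server
        problems.push(`${name}: ${url}`);
    }
  }
  return problems;
}

console.log(findSuspectLinks("./site")); // hypothetical directory of exported pages
```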

  5. A web-based approach for electrocardiogram monitoring in the home.

    PubMed

    Magrabi, F; Lovell, N H; Celler, B G

    1999-05-01

    A Web-based electrocardiogram (ECG) monitoring service in which a longitudinal clinical record is used for management of patients is described. The Web application is used to collect clinical data from the patient's home. A database on the server acts as a central repository where this clinical information is stored. A Web browser provides access to the patient's records and ECG data. We discuss the technologies used to automate the retrieval and storage of clinical data from a patient database, and the recording and reviewing of clinical measurement data. On the client's Web browser, ActiveX controls embedded in the Web pages provide a link between the various components including the Web server, Web page, the specialised client-side ECG review and acquisition software, and the local file system. The ActiveX controls also implement FTP functions to retrieve and submit clinical data to and from the server. An intelligent software agent on the server is activated whenever new ECG data is sent from the home. The agent compares historical data with newly acquired data. Using this method, an optimum patient care strategy can be evaluated, and a summarised report along with reminders and suggestions for action is sent to the doctor and patient by email.
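
    The "intelligent software agent" step can be pictured as a simple comparison of new summary data against the historical record. The TypeScript sketch below is a rough illustration only; the field names and thresholds are invented, since the paper does not specify them.

```typescript
// Rough sketch of the agent step: compare newly uploaded ECG summary data with
// the patient's history and draft notes for the report e-mailed to doctor and
// patient. All fields and thresholds here are hypothetical.
interface EcgSummary { date: string; meanHeartRate: number; ectopicBeats: number; }

function reviewNewRecording(history: EcgSummary[], latest: EcgSummary): string[] {
  const notes: string[] = [];
  if (history.length > 0) {
    const baseline = history.reduce((s, r) => s + r.meanHeartRate, 0) / history.length;
    if (Math.abs(latest.meanHeartRate - baseline) > 15)
      notes.push(`Mean heart rate ${latest.meanHeartRate} bpm deviates from baseline ${baseline.toFixed(0)} bpm.`);
  }
  if (latest.ectopicBeats > 10)
    notes.push(`Ectopic beat count (${latest.ectopicBeats}) above reminder threshold; suggest clinical review.`);
  if (notes.length === 0) notes.push("No significant change from historical data.");
  return notes; // in the described system this summary would be e-mailed to doctor and patient
}
```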

  6. 36 CFR 1194.22 - Web-based intranet and internet information and applications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... active region of a server-side image map. (f) Client-side image maps shall be provided instead of server-side image maps except where the regions cannot be defined with an available geometric shape. (g) Row...) Frames shall be titled with text that facilitates frame identification and navigation. (j) Pages shall be...

  7. 36 CFR 1194.22 - Web-based intranet and internet information and applications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... active region of a server-side image map. (f) Client-side image maps shall be provided instead of server-side image maps except where the regions cannot be defined with an available geometric shape. (g) Row...) Frames shall be titled with text that facilitates frame identification and navigation. (j) Pages shall be...

  8. 36 CFR § 1194.22 - Web-based intranet and internet information and applications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... active region of a server-side image map. (f) Client-side image maps shall be provided instead of server-side image maps except where the regions cannot be defined with an available geometric shape. (g) Row...) Frames shall be titled with text that facilitates frame identification and navigation. (j) Pages shall be...

  9. Designing a Relational Database for the Basic School; Schools Command Web Enabled Officer and Enlisted Database (Sword)

    DTIC Science & Technology

    2002-06-01

    Student memo for personnel MCLLS ... Migrate data to SQL Server...The Web Server is on the same server as the SWORD database in the current version. 4: results set 5: dynamic HTML page 6: dynamic HTML page 3: SQL ...still be supported by Access. SQL Server would be a more viable tool for a fully developed application based on the number of potential users and

  10. PlantCAZyme: a database for plant carbohydrate-active enzymes

    PubMed Central

    Ekstrom, Alexander; Taujale, Rahil; McGinn, Nathan; Yin, Yanbin

    2014-01-01

    PlantCAZyme is a database built upon dbCAN (database for automated carbohydrate active enzyme annotation), aiming to provide pre-computed sequence and annotation data of carbohydrate active enzymes (CAZymes) to plant carbohydrate and bioenergy research communities. The current version contains data of 43 790 CAZymes of 159 protein families from 35 plants (including angiosperms, gymnosperms, lycophyte and bryophyte mosses) and chlorophyte algae with fully sequenced genomes. Useful features of the database include: (i) a BLAST server and a HMMER server that allow users to search against our pre-computed sequence data for annotation purpose, (ii) a download page to allow batch downloading data of a specific CAZyme family or species and (iii) protein browse pages to provide an easy access to the most comprehensive sequence and annotation data. Database URL: http://cys.bios.niu.edu/plantcazyme/ PMID:25125445

  11. Business Process Reengineering With Knowledge Value Added in Support of the Department of the Navy Chief Information Officer

    DTIC Science & Technology

    2003-09-01

    BLANK xv LIST OF ACRONYMS ABC Activity Based Costing ADO ActiveX Data Object ASP Application Server Page BPR Business Process Re...processes uses people and systems (hardware, software, machinery, etc.) and that these people and systems contain the “corporate” knowledge of the...server architecture was also a high maintenance item. Data was no longer contained on one mainframe but was distributed throughout the enterprise

  12. Defeating Adversary Network Intelligence Efforts with Active Cyber Defense Techniques

    DTIC Science & Technology

    2008-06-01

    Hide Things from Hackers: Processes, Principles, and Techniques,” Journal of Information Warfare , 5 (3): 26-40 (2006). 20. Rosenau, William ...54 Additional Sources Apel , Thomas. Generating Fingerprints of Network Servers and their Use in Honeypots. Thesis. Aachen University, Aachen...Paul Williams , PhD (ENG) REPORT U ABSTRACT U c. THIS PAGE U 17. LIMITATION OF ABSTRACT UU 18. NUMBER OF PAGES 55

  13. Aviation System Analysis Capability Quick Response System Report Server User’s Guide.

    DTIC Science & Technology

    1996-10-01

    primary data sources for the QRS Report Server are the following: ♦ United States Department of Transportation airline service quality per- formance...and to cross-reference sections of this document. is used to indicate quoted text messages from WWW pages. is used for WWW page and section titles...would link the user to another document or another section of the same document. ALL CAPS is used to indicate Report Server variables for which the

  14. Design and evaluation of web-based image transmission and display with different protocols

    NASA Astrophysics Data System (ADS)

    Tan, Bin; Chen, Kuangyi; Zheng, Xichuan; Zhang, Jianguo

    2011-03-01

    There are many Web-based image-access technologies used in the medical imaging area, such as component-based (ActiveX control) thick-client Web display, zero-footprint thin-client Web viewers (also called server-side processing Web viewers), Flash Rich Internet Application (RIA), or HTML5-based Web display. Different Web display methods have different performance in different network environments. In this presentation, we give an evaluation of two developed Web-based image display systems. The first one is used for thin-client Web display. It works between a PACS Web server with a WADO interface and a thin client. The PACS Web server provides JPEG format images to HTML pages. The second one is for thick-client Web display. It works between a PACS Web server with a WADO interface and a thick client running in browsers containing an ActiveX control, a Flash RIA program or HTML5 scripts. The PACS Web server provides native DICOM format images or a JPIP stream for these clients.

  15. Web server for priority ordered multimedia services

    NASA Astrophysics Data System (ADS)

    Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund

    2001-10-01

    In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions of the CM services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of the distributed network with load balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for improved disk access and higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority ordered buffering of the retrieved Web pages and CM data streams that are fed into an auto-regressive moving average (ARMA) based traffic shaping circuitry before being transmitted through the network.
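
    The priority order quoted above (admin read/write first, Web write last) can be illustrated with a small scheduler. The TypeScript sketch below mirrors that order; the request shape and queue implementation are assumptions made for illustration, not the paper's design.

```typescript
// Toy sketch of the stated priority order for the multimedia server's request
// queue; the enum values mirror the order given in the abstract.
enum Priority {
  AdminReadWrite = 0,
  HotPageMulticast = 1,
  CMRead = 2,
  WebRead = 3,
  CMWrite = 4,
  WebWrite = 5,
}

interface Request { id: string; priority: Priority; }

class PriorityScheduler {
  private queue: Request[] = [];
  submit(r: Request): void {
    this.queue.push(r);
    this.queue.sort((a, b) => a.priority - b.priority); // lower value is served first
  }
  next(): Request | undefined {
    return this.queue.shift();
  }
}

const s = new PriorityScheduler();
s.submit({ id: "web-read-1", priority: Priority.WebRead });
s.submit({ id: "hot-movie-multicast", priority: Priority.HotPageMulticast });
console.log(s.next()?.id); // the hot-page multicast is served before the ordinary Web read
```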

  16. Ajax Architecture Implementation Techniques

    NASA Astrophysics Data System (ADS)

    Hussaini, Syed Asadullah; Tabassum, S. Nasira; Baig, Tabassum, M. Khader

    2012-03-01

    Today's rich Web applications use a mix of JavaScript and asynchronous communication with the application server. This mechanism is also known as Ajax: Asynchronous JavaScript and XML. The intent of Ajax is to exchange small pieces of data between the browser and the application server, and in doing so, use partial page refresh instead of reloading the entire Web page. AJAX (Asynchronous JavaScript and XML) is a powerful Web development model for browser-based Web applications. Technologies that form the AJAX model, such as XML, JavaScript, HTTP, and XHTML, are individually widely used and well known. However, AJAX combines these technologies to let Web pages retrieve small amounts of data from the server without having to reload the entire page. This capability makes Web pages more interactive and lets them behave like local applications. Web 2.0 enabled by the Ajax architecture has given rise to a new level of user interactivity through web browsers. Many new and extremely popular Web applications have been introduced such as Google Maps, Google Docs, Flickr, and so on. Ajax Toolkits such as Dojo allow web developers to build Web 2.0 applications quickly and with little effort.
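
    The core Ajax pattern described here, fetching a small piece of data and refreshing only part of the page, looks roughly like the following browser-side TypeScript sketch. The endpoint /api/headlines and the element id are placeholders, not part of any cited toolkit.

```typescript
// Minimal sketch of the Ajax pattern: fetch a small payload asynchronously and
// refresh only one part of the page instead of reloading it.
async function refreshHeadlines(): Promise<void> {
  const response = await fetch("/api/headlines");     // asynchronous request to the application server
  const headlines: string[] = await response.json();  // small payload (JSON here rather than XML)
  const list = document.getElementById("headlines");
  if (!list) return;
  list.innerHTML = headlines.map((h) => `<li>${h}</li>`).join(""); // partial page refresh
}

setInterval(refreshHeadlines, 30_000); // update the panel every 30 s without reloading the page
```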

  17. Using Web Server Logs to Track Users through the Electronic Forest

    ERIC Educational Resources Information Center

    Coombs, Karen A.

    2005-01-01

    This article analyzes server logs, providing helpful information for making decisions about Web-based services. The author indicates that, as a result of analyzing server logs, several interesting things about users' behavior were learned. The resulting findings are discussed in this article. Certain pages of the author's Web site, for instance, are…

  18. Server-Side Includes Made Simple.

    ERIC Educational Resources Information Center

    Fagan, Jody Condit

    2002-01-01

    Describes server-side include (SSI) codes which allow Webmasters to insert content into Web pages without programming knowledge. Explains how to enable the codes on a Web server, provides a step-by-step process for implementing them, discusses tags and syntax errors, and includes examples of their use on the Web site for Southern Illinois…
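
    An SSI directive looks like <!--#include virtual="/footer.html" -->, and the server expands it before the page is sent to the browser. The TypeScript sketch below only emulates that expansion to show the mechanics; in practice the web server itself (e.g., Apache with SSI enabled) performs it.

```typescript
// Emulation of server-side include expansion, for illustration only: replace
// each <!--#include virtual="..." --> (or file="...") with the file's contents.
import { readFileSync } from "fs";
import { join } from "path";

function expandIncludes(html: string, docRoot: string): string {
  return html.replace(/<!--#include\s+(?:virtual|file)="([^"]+)"\s*-->/g,
    (_match, path: string) => readFileSync(join(docRoot, path), "utf8"));
}

const page = '<html><body>Hello<!--#include virtual="/footer.html" --></body></html>';
console.log(expandIncludes(page, "./htdocs")); // hypothetical document root containing footer.html
```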

  19. Novel Advancements in Internet-Based Real Time Data Technologies

    NASA Technical Reports Server (NTRS)

    Myers, Gerry; Welch, Clara L. (Technical Monitor)

    2002-01-01

    AZ Technology has been working with the MSFC Ground Systems Department to find ways to make it easier for remote experimenters (RPIs) to monitor their International Space Station (ISS) payloads in real time from anywhere using standard/familiar devices. AZ Technology was awarded an SBIR Phase I grant to research the technologies behind and advancements of distributing live ISS data across the Internet. That research resulted in a product called "EZStream" which is in use on several ISS-related projects. Although the initial implementation is geared toward ISS, the architecture and lessons learned are applicable to other space-related programs. This paper presents the high-level architecture and components that make up EZStream. A combination of commercial-off-the-shelf (COTS) and custom components was used, and their interaction will be discussed. The server is powered by Apache's Jakarta-Tomcat web server/servlet engine. User accounts are maintained in a MySQL database. Both Tomcat and MySQL are Open Source products. When used for ISS, EZStream pulls the live data directly from NASA's Telescience Resource Kit (TReK) API. TReK parses the ISS data stream into individual measurement parameters and performs on-the-fly engineering unit conversion and range checking before passing the data to EZStream for distribution. TReK is provided by NASA at no charge to ISS experimenters. By using a combination of well-established Open Source, NASA-supplied, and AZ Technology-developed components, operations using EZStream are robust and economical. Security over the Internet is a major concern on most space programs. This paper describes how EZStream provides for secure connection to and transmission of space-related data over the public Internet. Display pages that show sensitive data can be placed under access control by EZStream. Users are required to log in before being allowed to pull up those web pages. To enhance security, the EZStream client/server data transmissions can be encrypted to preclude interception. EZStream was developed to make use of a host of standard platforms and protocols. Each is discussed in detail in this paper. The EZStream server is written as Java Servlets. This allows different platforms (i.e., Windows, Unix, Linux, Mac) to host the server portion. The EZStream client component is written in two different flavors: JavaBean and ActiveX. The JavaBean component is used to develop Java Applet displays. The ActiveX component is used for developing ActiveX-based displays. Remote user devices will be covered, including web browsers on PCs and scaled-down displays for PDAs and smart cell phones. As mentioned, the interaction between EZStream (web/data server) and TReK (data source) will be covered as related to ISS. EZStream is being enhanced to receive and parse binary data streams directly. This makes EZStream beneficial to both the ISS International Partners and non-NASA applications (i.e., factory floor monitoring). The options for developing client-side display web pages will be addressed along with the development of tools to allow creation of display web pages by non-programmers.

  20. NEA - NEA Home

    Science.gov Websites

    : Page not found 500: Internal Server Error BloggerSkin C40 Training - Stand Alone C4O Training - April 2017 - Stand Alone - Article Page 66739 C4O Training - Aug 2017 - Stand Alone - Article Page 66743 C4O Training - Feb 2017 - Stand Alone - Article Page 66737 C4O Training - Jan 2017 - Stand Alone - Article Page

  1. Establishment of Textbook Information Management System Based on Active Server Page

    ERIC Educational Resources Information Center

    Geng, Lihua

    2011-01-01

    In the process of textbook management of universities, the flow of storage, collection and check of textbook is quite complicated and daily management flow and system also seriously constrains the efficiency of the management process. Thus, in order to combine the information management model and the traditional management model, it is necessary…

  2. An Innovative Improvement of Engineering Learning System Using Computational Fluid Dynamics Concept

    ERIC Educational Resources Information Center

    Hung, T. C.; Wang, S. K.; Tai, S. W.; Hung, C. T.

    2007-01-01

    An innovative concept of an electronic learning system has been established in an attempt to achieve a technology that provides engineering students with an instructive and affordable framework for learning engineering-related courses. This system utilizes an existing Computational Fluid Dynamics (CFD) package, Active Server Pages programming,…

  3. Accelerating Demand Paging for Local and Remote Out-of-Core Visualization

    NASA Technical Reports Server (NTRS)

    Ellsworth, David

    2001-01-01

    This paper describes a new algorithm that improves the performance of application-controlled demand paging for the out-of-core visualization of data sets that are on either local disks or disks on remote servers. The performance improvements come from better overlapping the computation with the page reading process, and by performing multiple page reads in parallel. The new algorithm can be applied to many different visualization algorithms since application-controlled demand paging is not specific to any visualization algorithm. The paper includes measurements that show that the new multi-threaded paging algorithm decreases the time needed to compute visualizations by one third when using one processor and reading data from local disk. The time needed when using one processor and reading data from remote disk decreased by up to 60%. Visualization runs using data from remote disk ran about as fast as ones using data from local disk because the remote runs were able to make use of the remote server's high performance disk array.
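
    The two optimizations named above, issuing several page reads in parallel and overlapping computation with reading, can be sketched as a simple prefetching loop. The TypeScript sketch below is only conceptual; the page size, batching, and the readPage/visualize functions are placeholders, not the paper's code.

```typescript
// Conceptual sketch: read pages in parallel batches and start the next batch
// of reads before processing the current one, so computation overlaps I/O.
async function readPage(pageId: number): Promise<ArrayBuffer> {
  // stand-in for a local-disk or remote-server page read
  return new Promise((resolve) => setTimeout(() => resolve(new ArrayBuffer(65536)), 10));
}

function visualize(pages: ArrayBuffer[]): void {
  // stand-in for the visualization computation on the fetched pages
}

async function processDataset(pageIds: number[], batchSize = 4): Promise<void> {
  let pending = Promise.all(pageIds.slice(0, batchSize).map(readPage)); // first batch, reads in parallel
  for (let i = 0; i < pageIds.length; i += batchSize) {
    const current = await pending;
    const nextIds = pageIds.slice(i + batchSize, i + 2 * batchSize);
    pending = Promise.all(nextIds.map(readPage)); // kick off the next reads before computing
    visualize(current);                            // computation overlaps the outstanding reads
  }
}

processDataset(Array.from({ length: 32 }, (_, i) => i));
```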

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The system is developed to collect, process, store and present the information provided by radio frequency identification (RFID) devices. The system contains three parts: the application software, the database and the web page. The application software manages multiple RFID devices, such as readers and portals, simultaneously. It communicates with the devices through an application programming interface (API) provided by the device vendor. The application software converts data collected by the RFID readers and portals to readable information. It is capable of encrypting data using the 256-bit advanced encryption standard (AES). The application software has a graphical user interface (GUI). The GUI mimics the configurations of the nuclear material storage sites or transport vehicles. The GUI gives the user and system administrator an intuitive way to read the information and/or configure the devices. The application software is capable of sending the information to a remote, dedicated and secured web and database server. Two captured screen samples, one for storage and one for transport, are attached. The database is constructed to handle a large number of RFID tag readers and portals. A SQL server is employed for this purpose. An XML script is used to update the database once the information is sent from the application software. The design of the web page imitates the design of the application software. The web page retrieves data from the database and presents it in different panels. The user needs a user name combined with a password to access the web page. The web page is capable of sending e-mail and text messages based on preset criteria, such as when alarm thresholds are exceeded. A captured screen sample is attached. The application software is designed to be installed on a local computer. The local computer is directly connected to the RFID devices and can be controlled locally or remotely. There are multiple local computers managing different sites or transport vehicles. Control from remote sites and transmission of information to the central database server take place over the secured Internet. The information stored in the central database server is shown on the web page. The users can view the web page on the Internet. A dedicated and secured web and database server (https) is used to provide information security.
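
    The 256-bit AES encryption mentioned above can be illustrated with Node's built-in crypto module. The sketch below uses AES-256-CBC; the key handling, payload format and mode of operation are assumptions, since the description does not give them.

```typescript
// Hedged sketch of 256-bit AES encryption of an RFID tag record using Node's
// crypto module. The record format, key provisioning, and CBC mode are assumed.
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

function encryptTagRecord(plaintext: string, key: Buffer): { iv: Buffer; data: Buffer } {
  const iv = randomBytes(16);
  const cipher = createCipheriv("aes-256-cbc", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, data };
}

function decryptTagRecord(payload: { iv: Buffer; data: Buffer }, key: Buffer): string {
  const decipher = createDecipheriv("aes-256-cbc", key, payload.iv);
  return Buffer.concat([decipher.update(payload.data), decipher.final()]).toString("utf8");
}

const key = randomBytes(32); // 256-bit key; in practice it would be provisioned, not generated per run
const record = encryptTagRecord('{"tag":"A1B2C3","portal":"storage-door-2","seen":"2010-06-01T12:00:00Z"}', key);
console.log(decryptTagRecord(record, key));
```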

  5. Hiding the Disk and Network Latency of Out-of-Core Visualization

    NASA Technical Reports Server (NTRS)

    Ellsworth, David

    2001-01-01

    This paper describes an algorithm that improves the performance of application-controlled demand paging for out-of-core visualization by hiding the latency of reading data from both local disks or disks on remote servers. The performance improvements come from better overlapping the computation with the page reading process, and by performing multiple page reads in parallel. The paper includes measurements that show that the new multithreaded paging algorithm decreases the time needed to compute visualizations by one third when using one processor and reading data from local disk. The time needed when using one processor and reading data from remote disk decreased by two thirds. Visualization runs using data from remote disk actually ran faster than ones using data from local disk because the remote runs were able to make use of the remote server's high performance disk array.

  6. On-demand server-side image processing for web-based DICOM image display

    NASA Astrophysics Data System (ADS)

    Sakusabe, Takaya; Kimura, Michio; Onogi, Yuzo

    2000-04-01

    Low-cost image delivery is needed in modern networked hospitals. If a hospital has hundreds of clients, the cost of client systems is a big problem. Naturally, a Web-based system is the most effective solution. But a Web browser could not display medical images with certain image processing such as a lookup table transformation. We developed a Web-based medical image display system using a Web browser and on-demand server-side image processing. All images displayed on a Web page are generated from DICOM files on a server, delivered on demand. User interaction on the Web page is handled by a client-side scripting technology such as JavaScript. This combination gives the look and feel of an imaging workstation, not only in functionality but also in speed. Real-time update of images while tracing mouse motion is achieved in the Web browser without any client-side image processing, which might otherwise be done by client-side plug-in technology such as Java Applets or ActiveX. We tested the performance of the system in three cases: a single client, a small number of clients on a fast network, and a large number of clients on a normal-speed network. The results show that there is very little communication overhead and that the system is very scalable in the number of clients.
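
    The interaction described, where mouse motion drives window/level changes while all pixel processing stays on the server, can be sketched in a few lines of browser-side TypeScript. The /render URL and its parameters are invented for illustration; the paper does not publish its API.

```typescript
// Browser-side sketch: dragging the mouse adjusts window/level, and the image
// element is simply re-pointed at a URL that asks the server to regenerate the
// view. No pixel processing happens in the browser. URL and params are hypothetical.
const img = document.getElementById("dicomView") as HTMLImageElement;
let windowCenter = 40;
let windowWidth = 400;

img.addEventListener("mousemove", (e: MouseEvent) => {
  if (e.buttons !== 1) return;                            // only while dragging
  windowCenter += e.movementY;                            // vertical drag changes the level
  windowWidth = Math.max(1, windowWidth + e.movementX);   // horizontal drag changes the window
  img.src = `/render?object=demo-image&wc=${windowCenter}&ww=${windowWidth}`;
});
```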

  7. The development of a tele-monitoring system for physiological parameters based on the B/S model.

    PubMed

    Shuicai, Wu; Peijie, Jiang; Chunlan, Yang; Haomin, Li; Yanping, Bai

    2010-01-01

    The development of a new physiological multi-parameter remote monitoring system is based on the B/S model. The system consists of a server monitoring center, Internet network and PC-based multi-parameter monitors. Using the B/S model, the clients can browse web pages via the server monitoring center and download and install ActiveX controls. The physiological multi-parameters are collected, displayed and remotely transmitted. The experimental results show that the system is stable, reliable and operates in real time. The system is suitable for use in physiological multi-parameter remote monitoring for family and community healthcare. Copyright © 2010 Elsevier Ltd. All rights reserved.

  8. CIVET: Continuous Integration, Verification, Enhancement, and Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alger, Brian; Gaston, Derek R.; Permann, Cody J

    A Git server (GitHub, GitLab, BitBucket) sends event notifications to the Civet server. These are either a "Pull Request" or a "Push" notification. Civet then checks the database to determine what tests need to be run and marks them as ready to run. Civet clients, running on dedicated machines, query the server for available jobs that are ready to run. When a client gets a job it executes the scripts attached to the job and reports the output and exit status back to the server. When the client updates the server, the server will also update the Git server with the result of the job, as well as updating the main web page.

  9. Network and User-Perceived Performance of Web Page Retrievals

    NASA Technical Reports Server (NTRS)

    Kruse, Hans; Allman, Mark; Mallasch, Paul

    1998-01-01

    The development of the HTTP protocol has been driven by the need to improve the network performance of the protocol by allowing the efficient retrieval of multiple parts of a web page without the need for multiple simultaneous TCP connections between a client and a server. We suggest that the retrieval of multiple page elements sequentially over a single TCP connection may result in a degradation of the perceived performance experienced by the user. We attempt to quantify this perceived degradation through the use of a model which combines a web retrieval simulation and an analytical model of TCP operation. Starting with the current HTTP/1.1 specification, we first suggest a client-side heuristic to improve the perceived transfer performance. We show that the perceived speed of the page retrieval can be increased without sacrificing data transfer efficiency. We then propose a new client/server extension to the HTTP/1.1 protocol to allow for the interleaving of page element retrievals. We finally address the issue of the display of advertisements on web pages, and in particular suggest a number of mechanisms which can make efficient use of IP multicast to send advertisements to a number of clients within the same network.

  10. Application of Microsoft's ActiveX and DirectX technologies to the visualization of physical system dynamics

    NASA Astrophysics Data System (ADS)

    Mann, Christopher; Narasimhamurthi, Natarajan

    1998-08-01

    This paper discusses a specific implementation of a web- and component-based simulation system. The overall simulation container is implemented within a web page viewed with Microsoft's Internet Explorer 4.0 web browser. Microsoft's ActiveX/Distributed Component Object Model object interfaces are used in conjunction with the Microsoft DirectX graphics APIs to provide visualization functionality for the simulation. The MathWorks' Matlab computer-aided control system design program is used as an ActiveX automation server to provide the compute engine for the simulations.

  11. The Four Levels of Web Site Development Expertise.

    ERIC Educational Resources Information Center

    Ingram, Albert L.

    2000-01-01

    Discusses the design of Web pages and sites and proposes a four-level model of Web development expertise that can serve as a curriculum overview or as a plan for an individual's professional development. Highlights include page design, media use, client-side processing, server-side processing, and site structure. (LRW)

  12. Load Balancing in Distributed Web Caching: A Novel Clustering Approach

    NASA Astrophysics Data System (ADS)

    Tiwari, R.; Kumar, K.; Khan, G.

    2010-11-01

    The World Wide Web suffers from scaling and reliability problems due to overloaded and congested proxy servers. Caching at local proxy servers helps, but cannot satisfy more than a third to half of requests; the remaining requests are still sent to the original remote origin servers. In this paper we have developed an algorithm for a Distributed Web Cache, which incorporates cooperation among the proxy servers of one cluster. This algorithm uses Distributed Web Cache concepts along with static hierarchies of geographically based clusters of level-one proxy servers, plus a dynamic mechanism for redirecting requests to another proxy server when one cluster becomes congested. Congestion and scalability problems are dealt with by the clustering concept used in our approach. This results in a higher cache hit ratio, with lower latency for requested pages. This algorithm also guarantees data consistency between the original server objects and the proxy cache objects.
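
    The cooperation idea can be illustrated by hashing a URL to one proxy within a cluster and spilling over to a peer when that proxy is congested. The TypeScript sketch below is a toy version of such routing; the proxy names and the congestion test are placeholders, not the paper's algorithm.

```typescript
// Toy sketch: map a URL to a cooperating proxy in the cluster by hashing, and
// skip to the next peer when the hashed owner looks congested.
import { createHash } from "crypto";

interface Proxy { name: string; load: number; capacity: number; }
const cluster: Proxy[] = [
  { name: "proxy-a", load: 10, capacity: 100 },
  { name: "proxy-b", load: 95, capacity: 100 },
  { name: "proxy-c", load: 40, capacity: 100 },
];

function pickProxy(url: string, proxies: Proxy[]): Proxy {
  const digest = createHash("md5").update(url).digest();
  const owner = digest.readUInt32BE(0) % proxies.length; // consistent mapping of URL -> cache
  for (let tried = 0; tried < proxies.length; tried++) {
    const candidate = proxies[(owner + tried) % proxies.length];
    if (candidate.load < candidate.capacity * 0.9) return candidate; // skip congested peers
  }
  return proxies[owner]; // every proxy congested: fall back to the hashed owner
}

console.log(pickProxy("http://example.com/page.html", cluster).name);
```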

  13. Thin client (web browser)-based collaboration for medical imaging and web-enabled data.

    PubMed

    Le, Tuong Huu; Malhi, Nadeem

    2002-01-01

    Utilizing thin client software and open source server technology, a collaborative architecture was implemented allowing for sharing of Digital Imaging and Communications in Medicine (DICOM) and non-DICOM images with real-time markup. Using the Web browser as a thin client integrated with standards-based components, such as DHTML (dynamic hypertext markup language), JavaScript, and Java, collaboration was achieved through a Web server/proxy server combination utilizing Java Servlets and Java Server Pages. A typical collaborative session involved the driver, who directed the navigation of the other collaborators, the passengers, and provided collaborative markups of medical and nonmedical images. The majority of processing was performed on the server side, allowing for the client to remain thin and more accessible.

  14. Do-It-Yourself: A Special Library's Approach to Creating Dynamic Web Pages Using Commercial Off-The-Shelf Applications

    NASA Technical Reports Server (NTRS)

    Steeman, Gerald; Connell, Christopher

    2000-01-01

    Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from managing the time to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals and lessons learned in the Web-to-database process (including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Structured Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.

  15. Neutron Scattering Template

    Science.gov Websites

    Scattering Banner Acknowledgements The graphics used on the Neutron Scattering Web Pages were designed by reused on these web pages by kind permission of Jack Carpenter, and with the assistance of Mary Koelbl (IPD). Rick Goyette (IPNS) set up and maintains the Linux web server as well as helping to automate the

  16. ITMS: Individualized Teaching Material System: Adaptive Integration of Web Pages Distributed in Some Servers.

    ERIC Educational Resources Information Center

    Mitsuhara, Hiroyuki; Kurose, Yoshinobu; Ochi, Youji; Yano, Yoneo

    The authors developed a Web-based Adaptive Educational System (Web-based AES) named ITMS (Individualized Teaching Material System). ITMS adaptively integrates knowledge on the distributed Web pages and generates individualized teaching material that has various contents. ITMS also presumes the learners' knowledge levels from the states of their…

  17. The Status of African Studies Digitized Content: Three Metadata Schemes.

    ERIC Educational Resources Information Center

    Kuntz, Patricia S.

    The proliferation of Web pages and digitized material mounted on Internet servers has become unmanageable. Librarians and users are concerned that documents and information are being lost in cyberspace as a result of few bibliographic controls and common standards. Librarians in cooperation with software creators and Web page designers are…

  18. Default Parallels Plesk Panel Page

    Science.gov Websites

    services that small businesses want and need. Our software includes key building blocks of cloud service virtualized servers Service Provider Products Parallels® Automation Hosting, SaaS, and cloud computing , the leading hosting automation software. You see this page because there is no Web site at this

  19. CH5M3D: an HTML5 program for creating 3D molecular structures.

    PubMed

    Earley, Clarke W

    2013-11-18

    While a number of programs and web-based applications are available for the interactive display of 3-dimensional molecular structures, few of these provide the ability to edit these structures. For this reason, we have developed a library written in JavaScript to allow for the simple creation of web-based applications that should run on any browser capable of rendering HTML5 web pages. While our primary interest in developing this application was for educational use, it may also prove useful to researchers who want a light-weight application for viewing and editing small molecular structures. Molecular compounds are drawn on the HTML5 Canvas element, with the JavaScript code making use of standard techniques to allow display of three-dimensional structures on a two-dimensional canvas. Information about the structure (bond lengths, bond angles, and dihedral angles) can be obtained using a mouse or other pointing device. Both atoms and bonds can be added or deleted, and rotation about bonds is allowed. Routines are provided to read structures either from the web server or from the user's computer, and creation of galleries of structures can be accomplished with only a few lines of code. Documentation and examples are provided to demonstrate how users can access all of the molecular information for creation of web pages with more advanced features. A light-weight (≈ 75 kb) JavaScript library has been made available that allows for the simple creation of web pages containing interactive 3-dimensional molecular structures. Although this library is designed to create web pages, a web server is not required. Installation on a web server is straightforward and does not require any server-side modules or special permissions. The ch5m3d.js library has been released under the GNU GPL version 3 open-source license and is available from http://sourceforge.net/projects/ch5m3d/.
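
    The "standard techniques" the abstract refers to are essentially rotation and projection of 3-D coordinates onto the 2-D canvas. The TypeScript sketch below shows that generic technique; it is not code from the ch5m3d.js library, and the atom coordinates are only a rough water-like arrangement used for illustration.

```typescript
// Generic sketch: rotate 3-D atom coordinates about the y axis, project them
// onto a 2-D canvas, and paint far atoms first so nearer ones overlap them.
interface Atom { element: string; x: number; y: number; z: number; }

function project(atoms: Atom[], angleY: number, scale = 40, cx = 150, cy = 150) {
  return atoms.map((a) => {
    const x = a.x * Math.cos(angleY) + a.z * Math.sin(angleY);
    const z = -a.x * Math.sin(angleY) + a.z * Math.cos(angleY);
    return { element: a.element, sx: cx + x * scale, sy: cy - a.y * scale, depth: z };
  });
}

function draw(ctx: CanvasRenderingContext2D, atoms: Atom[], angleY: number): void {
  ctx.clearRect(0, 0, 300, 300);
  for (const p of project(atoms, angleY).sort((a, b) => a.depth - b.depth)) {
    ctx.beginPath();
    ctx.arc(p.sx, p.sy, 8, 0, 2 * Math.PI);
    ctx.fillStyle = p.element === "O" ? "red" : "gray";
    ctx.fill();
  }
}

const canvas = document.getElementById("viewer") as HTMLCanvasElement;
draw(canvas.getContext("2d")!, [
  { element: "O", x: 0, y: 0, z: 0 },
  { element: "H", x: 0.96, y: 0.26, z: 0 },
  { element: "H", x: -0.29, y: 0.92, z: 0 },
], 0.5);
```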

  20. CH5M3D: an HTML5 program for creating 3D molecular structures

    PubMed Central

    2013-01-01

    Background While a number of programs and web-based applications are available for the interactive display of 3-dimensional molecular structures, few of these provide the ability to edit these structures. For this reason, we have developed a library written in JavaScript to allow for the simple creation of web-based applications that should run on any browser capable of rendering HTML5 web pages. While our primary interest in developing this application was for educational use, it may also prove useful to researchers who want a light-weight application for viewing and editing small molecular structures. Results Molecular compounds are drawn on the HTML5 Canvas element, with the JavaScript code making use of standard techniques to allow display of three-dimensional structures on a two-dimensional canvas. Information about the structure (bond lengths, bond angles, and dihedral angles) can be obtained using a mouse or other pointing device. Both atoms and bonds can be added or deleted, and rotation about bonds is allowed. Routines are provided to read structures either from the web server or from the user’s computer, and creation of galleries of structures can be accomplished with only a few lines of code. Documentation and examples are provided to demonstrate how users can access all of the molecular information for creation of web pages with more advanced features. Conclusions A light-weight (≈ 75 kb) JavaScript library has been made available that allows for the simple creation of web pages containing interactive 3-dimensional molecular structures. Although this library is designed to create web pages, a web server is not required. Installation on a web server is straightforward and does not require any server-side modules or special permissions. The ch5m3d.js library has been released under the GNU GPL version 3 open-source license and is available from http://sourceforge.net/projects/ch5m3d/. PMID:24246004

  1. ORBIT: an integrated environment for user-customized bioinformatics tools.

    PubMed

    Bellgard, M I; Hiew, H L; Hunter, A; Wiebrands, M

    1999-10-01

    There are a large number of computational programs freely available to bioinformaticians via a client/server, web-based environment. However, the client interface to these tools (typically an html form page) cannot be customized from the client side as it is created by the service provider. The form page is usually generic enough to cater for a wide range of users. However, this implies that a user cannot set as 'default' advanced program parameters on the form or even customize the interface to his/her specific requirements or preferences. Currently, there is a lack of end-user interface environments that can be modified by the user when accessing computer programs available on a remote server running on an intranet or over the Internet. We have implemented a client/server system called ORBIT (Online Researcher's Bioinformatics Interface Tools) where individual clients can have interfaces created and customized to command-line-driven, server-side programs. Thus, Internet-based interfaces can be tailored to a user's specific bioinformatic needs. As interfaces are created on the client machine independent of the server, there can be different interfaces to the same server-side program to cater for different parameter settings. The interface customization is relatively quick (between 10 and 60 min) and all client interfaces are integrated into a single modular environment which will run on any computer platform supporting Java. The system has been developed to allow for a number of future enhancements and features. ORBIT represents an important advance in the way researchers gain access to bioinformatics tools on the Internet.

  2. Web Site On a Budget: How to Find an Affordable Home for Your Pages.

    ERIC Educational Resources Information Center

    Callihan, Steven E.

    1996-01-01

    Offers advice for choosing an Internet provider: consider the amount of time, effort, and expertise one has, coupled with the complexity of the Web page, which impact price and choice of provider; and question providers about server speed, ports, architecture, traffic levels, fee structures, and registration of domain names. Lists 33 Web presence…

  3. A Web Server for MACCS Magnetometer Data

    NASA Technical Reports Server (NTRS)

    Engebretson, Mark J.

    1998-01-01

    NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.

  4. Design and Analysis of a Model Reconfigurable Cyber-Exercise Laboratory (RCEL) for Information Assurance Education

    DTIC Science & Technology

    2004-03-01

    with MySQL. This choice was made because MySQL is open source. Any significant database engine such as Oracle or MS-SQL or even MS Access can be used...10 Figure 6. The DoD vs. Commercial Life Cycle...necessarily be interested in SCADA network security 13. MySQL (Database server) – This station represents a typical data server for a web page

  5. SurveyWiz and factorWiz: JavaScript Web pages that make HTML forms for research on the Internet.

    PubMed

    Birnbaum, M H

    2000-05-01

    SurveyWiz and factorWiz are Web pages that act as wizards to create HTML forms that enable one to collect data via the Web. SurveyWiz allows the user to enter survey questions or personality test items with a mixture of text boxes and scales of radio buttons. One can add demographic questions of age, sex, education, and nationality with the push of a button. FactorWiz creates the HTML for within-subjects, two-factor designs as large as 9 x 9, or higher order factorial designs up to 81 cells. The user enters levels of the row and column factors, which can be text, images, or other multimedia. FactorWiz generates the stimulus combinations, randomizes their order, and creates the page. In both programs HTML is displayed in a window, and the user copies it to a text editor to save it. When uploaded to a Web server and supported by a CGI script, the created Web pages allow data to be collected, coded, and saved on the server. These programs are intended to assist researchers and students in quickly creating studies that can be administered via the Web.
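
    The HTML that such wizards emit is straightforward: each item becomes a scale of radio buttons inside a form posted to a CGI script. The TypeScript sketch below generates a form of that shape; the action URL and field names are placeholders, not SurveyWiz's actual output.

```typescript
// Sketch of wizard-style HTML form generation: one radio-button scale per
// question, inside a form posted to a (hypothetical) CGI script that stores
// the coded responses on the server.
function radioScaleQuestion(name: string, text: string, points = 7): string {
  const buttons = Array.from({ length: points }, (_, i) =>
    `<label><input type="radio" name="${name}" value="${i + 1}">${i + 1}</label>`).join(" ");
  return `<p>${text}<br>${buttons}</p>`;
}

function buildSurvey(questions: { name: string; text: string }[]): string {
  return [
    '<form method="post" action="/cgi-bin/save-survey">',
    ...questions.map((q) => radioScaleQuestion(q.name, q.text)),
    '<input type="submit" value="Submit">',
    "</form>",
  ].join("\n");
}

console.log(buildSurvey([
  { name: "q1", text: "How easy was the task? (1 = very hard, 7 = very easy)" },
  { name: "q2", text: "How confident are you in your answer?" },
]));
```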

  6. Using Firefly Tools to Enhance Archive Web Pages

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Ly, L.; Goldina, T.

    2013-10-01

    Astronomy web developers are looking for fast and powerful HTML 5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table display, and spectrum plots on their web pages with minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects including Spitzer, Planck, WISE, PTF, LSST and others.
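
    The cross-site JSONP mechanism mentioned above works by injecting a script tag whose URL names a callback function, so the archive's server answers directly and no local server is needed. The sketch below shows that generic mechanism in TypeScript; the service URL and parameters are placeholders, not Firefly's API.

```typescript
// Generic JSONP loader: inject a <script> tag whose URL carries a callback
// name; the remote service replies with `callbackName({...})`, which resolves
// the promise. Endpoint and query parameters below are hypothetical.
function jsonp<T>(url: string): Promise<T> {
  return new Promise((resolve, reject) => {
    const cbName = "jsonp_cb_" + Math.random().toString(36).slice(2);
    const script = document.createElement("script");
    (window as any)[cbName] = (data: T) => {
      resolve(data);
      delete (window as any)[cbName];
      script.remove();
    };
    script.src = `${url}${url.includes("?") ? "&" : "?"}callback=${cbName}`;
    script.onerror = () => reject(new Error("JSONP request failed"));
    document.head.appendChild(script);
  });
}

// Hypothetical usage: fetch a table of sources from an archive service.
jsonp<{ rows: unknown[] }>("https://archive.example.edu/catalog?region=m31")
  .then((table) => console.log(table.rows.length));
```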

  7. Brisement

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  8. An electronic thesaurus of Evidence Based Laboratory Medicine hematological and biochemical diagnostic tests.

    PubMed

    Dorizzi, R M; Maconi, M; Giavarina, D; Loza, G; Aman, M; Moreira, J; Bisoffi, Z; Gennuso, C

    2009-10-01

    The adoption of Evidence Based Laboratory Medicine (EBLM) has been hampered until today by the lack of effective tools. The SIMeL EBLM e-Thesaurus (an on-line repertoire of the diagnostic effectiveness of laboratory, radiology and cardiology tests) provides useful support to clinical laboratory professionals and to clinicians for the interpretation of diagnostic tests. The e-Thesaurus is an application developed using Microsoft Active Server Pages technology, running on the Microsoft Internet Information Server Web server, and is available at the SIMeL website using a browser running JavaScript scripts (Internet Explorer is recommended). It contains a database (in Italian, English and Spanish) of the sensitivity and specificity (including the 95% confidence interval), the positive and negative likelihood ratios, the Diagnostic Odds Ratio and the Number Needed to Diagnose of more than 2000 diagnostic tests (mostly laboratory, but also cardiology and radiology). The e-Thesaurus improves on the previous SIMeL paper and CD Thesaurus; its main features are search in three languages and a capability for continuous and easy updating.
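
    The derived quantities listed (likelihood ratios, Diagnostic Odds Ratio, Number Needed to Diagnose) follow from sensitivity and specificity by standard formulas. The TypeScript sketch below computes them for one hypothetical test; the numbers are not taken from the SIMeL database.

```typescript
// Standard formulas relating a test's sensitivity/specificity to the derived
// indices stored in the e-Thesaurus. Example values are hypothetical.
interface TestPerformance {
  lrPositive: number;              // sensitivity / (1 - specificity)
  lrNegative: number;              // (1 - sensitivity) / specificity
  diagnosticOddsRatio: number;     // LR+ / LR-
  numberNeededToDiagnose: number;  // 1 / (sensitivity + specificity - 1), i.e. 1 / Youden's index
}

function summarize(sensitivity: number, specificity: number): TestPerformance {
  const lrPositive = sensitivity / (1 - specificity);
  const lrNegative = (1 - sensitivity) / specificity;
  return {
    lrPositive,
    lrNegative,
    diagnosticOddsRatio: lrPositive / lrNegative,
    numberNeededToDiagnose: 1 / (sensitivity + specificity - 1),
  };
}

console.log(summarize(0.90, 0.85)); // LR+ = 6.0, LR- ≈ 0.12, DOR ≈ 51, NND ≈ 1.33
```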

  9. Subtalar Arthroscopy

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  10. Peroneal Tendinosis

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  11. Ankle Cheilectomy

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  12. Disobedient Child

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Turn ... some disobedient children, you may need to obtain professional mental health treatment. Here are some situations where ...

  13. Distraction Arthroplasty

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  14. Pediatric Specialists

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Turn ... for Referral to Pediatric Surgical Specialists Sports Medicine Professionals What is a Child Abuse Pediatrician? What is ...

  15. Securing the anonymity of content providers in the World Wide Web

    NASA Astrophysics Data System (ADS)

    Demuth, Thomas; Rieke, Andreas

    1999-04-01

    Nowadays the World Wide Web (WWW) is an established service used by people all over the world. Most of them do not recognize the fact that they reveal plenty of information about themselves, their affiliation and their computer equipment to the providers of the web pages they connect to. As a result, a number of services offer users the ability to access web pages unrecognized, or without risk of being backtracked. This kind of anonymity is called user or client anonymity. On the other hand, an equivalent protection for content providers does not exist, although this feature is desirable for many situations in which the identity of a publisher or content provider shall be hidden. We call this property server anonymity. We will introduce the first system with the primary target of offering anonymity for providers of information in the WWW. Besides this property, it also provides client anonymity. Based on David Chaum's idea of mixes and in relation to the context of the WWW, we explain the term 'server anonymity', motivating the system JANUS, which offers both client and server anonymity.

  16. Cavus Foot Surgery

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  17. Bunionette Deformity Correction

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  18. Shyness in Children

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Turn ... then an evaluation by a child mental-health professional would be helpful. Time to adjust Most shy ...

  19. An Application Server for Scientific Collaboration

    NASA Astrophysics Data System (ADS)

    Cary, John R.; Luetkemeyer, Kelly G.

    1998-11-01

    Tech-X Corporation has developed SciChat, an application server for scientific collaboration. Connections are made to the server through a Java client, that can either be an application or an applet served in a web page. Once connected, the client may choose to start or join a session. A session includes not only other clients, but also an application. Any client can send a command to the application. This command is executed on the server and echoed to all clients. The results of the command, whether numerical or graphical, are then distributed to all of the clients; thus, multiple clients can interact collaboratively with a single application. The client is developed in Java, the server in C++, and the middleware is the Common Object Request Broker Architecture. In this system, the Graphical User Interface processing is on the client machine, so one does not have the disadvantages of insufficient bandwidth as occurs when running X over the internet. Because the server, client, and middleware are object oriented, new types of servers and clients specialized to particular scientific applications are more easily developed.

  20. Percutaneous Achilles Tendon Lengthening

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  1. Where to Donate Blood

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Find ... Correspondence Regulatory and Public Meetings Stop the Bleed Professional Development Education Annual Meeting International Cord Blood Symposium ...

  2. Shoes and Orthotics for Diabetics

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... By working with a physician and a footwear professional, such as a certified pedorthist, many patients can ...

  3. How to Tape an Ankle

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  4. Emotional Development: 2 Year Olds

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Turn ... probably refer your child to a mental health professional for a consultation. Last Updated 8/1/2009 ...

  5. How to "Read" Your Footprint

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  6. Plantar Fibroma and Plantar Fibromatosis

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  7. Muscle Cramp - A Common Pain

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Physicians ... Media Center The DO JAOA AOA Health Watch Professional Development AOA Board Certification Continuing Medical Education Research ...

  8. Optimizing the Replication of Multi-Quality Web Applications Using ACO and WoLF

    DTIC Science & Technology

    2006-09-14

    bipartite graph in both directions as they construct solutions, pheromone is used for traversing from one side of the bipartite graph to the other and back...27 3.1.3 Transitioning From 〈d, q〉 pairs to Servers. . . . . 29 3.1.4 Pheromone Update Rule . . . . . . . . . . . . . . 30 vi Page 3.2 WoLFAntDA: A...35 3.2.6 Pheromone Update Rule . . . . . . . . . . . . . . 36 3.2.7 Policy Updates . . . . . . . . . . . . . . . . . . . 36 3.3 The Server-Filling

  9. How to Keep Your Feet Flexible

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  10. How to Assess Your Shoe IQ

    MedlinePlus

    ... this site from a secured browser on the server. Please enable scripts and reload this page. Home ... The content is not intended to substitute for professional medical advice, diagnoses or treatments. If you need ...

  11. Osteopathic Medicine: What is a DO?

    MedlinePlus

  12. Summer Safety Tips - Staying Safe Outdoors

    MedlinePlus

  13. Disclaimer - NOAA's National Weather Service

    Science.gov Websites

    Excerpt: delivery of data from this server through the Internet is not guaranteed; official NWS dissemination systems are available. National Weather Service, 1325 East West Highway, Silver Spring, MD 20910. Page Author: NWS Internet Services Team.

  14. How to Strengthen Your Ankle After a Sprain

    MedlinePlus

  15. Sore Throat? Know When To Call the Doctor

    MedlinePlus

  16. How to Stretch Your Ankle After a Sprain

    MedlinePlus

  17. Understanding Motherhood and Mood - Baby Blues and Beyond

    MedlinePlus

  18. How to Assess Changes in Feet: Normal or Abnormal

    MedlinePlus

  19. Web-Based Distributed Simulation of Aeronautical Propulsion System

    NASA Technical Reports Server (NTRS)

    Zheng, Desheng; Follen, Gregory J.; Pavlik, William R.; Kim, Chan M.; Liu, Xianyou; Blaser, Tammy M.; Lopez, Isaac

    2001-01-01

    An application was developed to allow users to run and view the Numerical Propulsion System Simulation (NPSS) engine simulations from web browsers. Simulations were performed on multiple INFORMATION POWER GRID (IPG) test beds. The Common Object Request Broker Architecture (CORBA) was used for brokering data exchange among machines and IPG/Globus for job scheduling and remote process invocation. Web server scripting was performed by JavaServer Pages (JSP). This application has proven to be an effective and efficient way to couple heterogeneous distributed components.

  20. A collaborative platform for consensus sessions in pathology over Internet.

    PubMed

    Zapletal, Eric; Le Bozec, Christel; Degoulet, Patrice; Jaulent, Marie-Christine

    2003-01-01

    The design of valid databases in pathology faces the problem of diagnostic disagreement between pathologists, and organizing consensus sessions between experts to reduce this variability is a difficult task. The TRIDEM platform addresses the issue of organizing consensus sessions in pathology over the Internet. In this paper, we present the basis for such a collaborative platform. On the one hand, the platform integrates the functionality of the IDEM consensus module, which alleviates the consensus task by presenting pathologists with a preliminary computed consensus through ergonomic interfaces (automatic step). On the other hand, a set of lightweight interaction tools, such as vocal annotations, is implemented to ease communication between experts as they discuss a case (interactive step). The architecture of the TRIDEM platform is based on a JavaServer Pages web server that communicates with the ObjectStore PSE/PRO database used for object storage. The HTML pages generated by the web server run Java applets to perform the different steps (automatic and interactive) of the consensus. The main current limitation of the platform is that it handles only a synchronous process. Improvements such as re-writing the consensus workflow with a protocol like BPML are already planned.

  1. [The therapeutic drug monitoring network server of tacrolimus for Chinese renal transplant patients].

    PubMed

    Deng, Chen-Hui; Zhang, Guan-Min; Bi, Shan-Shan; Zhou, Tian-Yan; Lu, Wei

    2011-07-01

    The aim of this study was to develop a therapeutic drug monitoring (TDM) network server of tacrolimus for Chinese renal transplant patients, which helps the doctor manage patients' information and provides three levels of prediction. The database management system MySQL was employed to build and manage the database of patient and doctor information, and hypertext markup language (HTML) and JavaServer Pages (JSP) technology were employed to construct the network server for database management. Based on the population pharmacokinetic model of tacrolimus for Chinese renal transplant patients, the above programming languages were used to construct the population prediction and subpopulation prediction modules. Based on the Bayesian principle and maximization of the posterior probability function, an objective function was established and minimized by an optimization algorithm to estimate a patient's individual pharmacokinetic parameters. The network server is shown to have the basic functions for database management and the three levels of prediction needed to help the doctor optimize the tacrolimus regimen for Chinese renal transplant patients.
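
    In standard population-pharmacokinetic practice, the Bayesian individualization step described above amounts to a maximum a posteriori estimate. A minimal sketch of such an objective function, assuming residual error variances sigma_i^2 and a population parameter vector theta_pop with between-subject covariance Omega (none of which are given in the record), is

        O(\theta) = \sum_{i=1}^{n} \frac{\left( C_i^{\mathrm{obs}} - C_i^{\mathrm{pred}}(\theta) \right)^2}{\sigma_i^{2}}
                    + \left( \theta - \theta_{\mathrm{pop}} \right)^{\mathsf{T}} \Omega^{-1} \left( \theta - \theta_{\mathrm{pop}} \right)

    where C_i^obs are the patient's measured tacrolimus concentrations and C_i^pred(theta) the model predictions; minimizing O(theta) balances fit to the individual's data against deviation from the population parameters.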

  2. Cyber Intelligence Analysis Platform

    DTIC Science & Technology

    2014-04-01

    Excerpt: ... adding one or two 10-Gigabit port(s) and/or fiber-channel ports ... Java SDKs for the development of custom management tools; all these tools and SDKs would work with the vCenter Server. References include the VMware vSphere SDK for Java and the xCAT main documentation page.

  3. Core Technical Capability Laboratory Management System

    NASA Technical Reports Server (NTRS)

    Shaykhian, Linda; Dugger, Curtis; Griffin, Laurie

    2008-01-01

    The Core Technical Capability Laboratory Management System (CTCLMS) consists of dynamically generated Web pages used to access a database containing detailed CTC lab data, with the software hosted on a server that allows users remote access.
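
    As a rough illustration of the dynamically generated pages the record describes, the sketch below shows a minimal Java servlet that queries a lab table and writes an HTML listing. The JDBC URL, table, and column names are hypothetical placeholders, not details of CTCLMS itself.

        import java.io.IOException;
        import java.io.PrintWriter;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import javax.servlet.ServletException;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        public class LabListServlet extends HttpServlet {
            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                    throws ServletException, IOException {
                resp.setContentType("text/html");
                PrintWriter out = resp.getWriter();
                out.println("<html><body><h1>CTC Laboratories</h1><ul>");
                // Hypothetical connection string and schema, for illustration only.
                try (Connection con = DriverManager.getConnection(
                             "jdbc:postgresql://dbhost/ctclms", "reader", "secret");
                     PreparedStatement ps = con.prepareStatement(
                             "SELECT name, capability FROM lab ORDER BY name");
                     ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        out.printf("<li>%s - %s</li>%n",
                                   rs.getString("name"), rs.getString("capability"));
                    }
                } catch (SQLException e) {
                    throw new ServletException(e);
                }
                out.println("</ul></body></html>");
            }
        }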

  4. Initial Evaluation: What Kind of Shape Are Your Feet In?

    MedlinePlus

  5. Improving Website Hyperlink Structure Using Server Logs

    PubMed Central

    Paranjape, Ashwin; West, Robert; Zia, Leila; Leskovec, Jure

    2016-01-01

    Good websites should be easy to navigate via hyperlinks, yet maintaining a high-quality link structure is difficult. Identifying pairs of pages that should be linked may be hard for human editors, especially if the site is large and changes frequently. Further, given a set of useful link candidates, the task of incorporating them into the site can be expensive, since it typically involves humans editing pages. In the light of these challenges, it is desirable to develop data-driven methods for automating the link placement task. Here we develop an approach for automatically finding useful hyperlinks to add to a website. We show that passively collected server logs, beyond telling us which existing links are useful, also contain implicit signals indicating which nonexistent links would be useful if they were to be introduced. We leverage these signals to model the future usefulness of yet nonexistent links. Based on our model, we define the problem of link placement under budget constraints and propose an efficient algorithm for solving it. We demonstrate the effectiveness of our approach by evaluating it on Wikipedia, a large website for which we have access to both server logs (used for finding useful new links) and the complete revision history (containing a ground truth of new links). As our method is based exclusively on standard server logs, it may also be applied to any other website, as we show with the example of the biomedical research site Simtk. PMID:28345077
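
    A minimal sketch of the budgeted link-placement idea, not the authors' algorithm: each candidate (source, target) pair carries an estimated usefulness derived from the logs, and candidates are chosen greedily under a global budget and a per-page limit. The scoring itself is assumed to have been done elsewhere.

        import java.util.ArrayList;
        import java.util.Comparator;
        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;

        public class LinkPlacementSketch {
            record Candidate(String source, String target, double estimatedClicks) {}

            /** Greedily keep at most 'budget' links overall and 'perPage' links per source page. */
            static List<Candidate> place(List<Candidate> candidates, int budget, int perPage) {
                candidates.sort(Comparator.comparingDouble(Candidate::estimatedClicks).reversed());
                Map<String, Integer> perSource = new HashMap<>();
                List<Candidate> chosen = new ArrayList<>();
                for (Candidate c : candidates) {
                    if (chosen.size() >= budget) break;
                    int used = perSource.getOrDefault(c.source(), 0);
                    if (used < perPage) {          // respect the per-page budget
                        chosen.add(c);
                        perSource.put(c.source(), used + 1);
                    }
                }
                return chosen;
            }
        }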

  6. A strategy for providing electronic library services to members of the AGATE Consortium

    NASA Technical Reports Server (NTRS)

    Thompson, J. Garth

    1995-01-01

    In November 1992, NASA Administrator Daniel Goldin established a Task Force to evaluate conditions which have led to the precipitous decline of the US General Aviation System and to recommend actions needed to re-establish US leadership in General Aviation. The Task Force Report and a report by Dr. Bruce J. Holmes, Manager of the General Aviation/Commuter Office at NASA Langley Research Center, provided the direction for the formation of the Advanced General Aviation Transport Experiments (AGATE), a consortium of government, industry, and universities committed to the revitalization of the US General Aviation industry. One of the recommendations of the Task Force Report was that 'a central repository of information should be created to disseminate NASA research as well as other domestic and foreign aeronautical research that has been accomplished, is ongoing or is planned... A user friendly environment should be created.' This paper describes technical and logistic issues and recommends a plan for providing technical information to members of the AGATE Consortium. It is recommended that the General Aviation office establish and maintain an electronic literature page on the AGATE server. This page should provide a user-friendly interface to the existing technical report and index servers identified in the report and listed in the Recommendations section. A page should also be provided that gives links to Web resources; a list of specific resources is provided in the Recommendations section. Links should also be provided to a page with tips on searching and to a form for feedback and user suggestions of other resources. Finally, a page should be maintained which provides pointers to other resources, like the LaRCsim workstation simulation software, which is available from LaRC at no cost. Development of the Web is very dynamic; these developments should be monitored regularly by the GA staff, and links to additional resources should be provided on the server as they become available. A recommendation to NASA Headquarters should be made to establish logically central access to all of the NASA Technical Libraries, to make these resources available both to all NASA employees and to the AGATE Consortium.

  7. Aviation Research and the Internet

    NASA Technical Reports Server (NTRS)

    Scott, Antoinette M.

    1995-01-01

    The Internet is a network of networks. It was originally funded by the Defense Advanced Research Projects Agency (DOD/DARPA) and evolved in part from the connection of supercomputer sites across the United States. The National Science Foundation (NSF) made the most of its supercomputers by connecting the sites to each other. This made the supercomputers more efficient and now allows scientists, engineers, and researchers to access the supercomputers from their own labs and offices. The high-speed networks that connect the NSF supercomputers form the backbone of the Internet. The World Wide Web (WWW) is a menu system that gathers Internet resources from all over the world into a series of screens that appear on your computer. The WWW is also a distributed system: it stores data and information on many computers (servers), and these servers can go out and get data when you ask for it. Hypermedia is the basis of the WWW; one can 'click' on a section and visit other hypermedia (pages). Our approach to demonstrating the importance of aviation research through the Internet began with learning how to put pages on the Internet (on-line) ourselves. We were assigned two aviation companies: Vision Micro Systems Inc. and Innovative Aerodynamic Technologies (IAT). We developed home pages for these SBIR companies. The equipment used to create the pages consisted of UNIX and Macintosh machines; HTML Supertext software was used to write the pages and a Sharp JX600S scanner to scan the images. As a result, with the use of the UNIX, Macintosh, Sun, PC, and AXIL machines, we were able to present our home pages to over 800,000 visitors.

  8. Radiology teaching file cases on the World Wide Web.

    PubMed

    Scalzetti, E M

    1997-08-01

    The presentation of a radiographic teaching file on the World Wide Web can be enhanced by attending to principles of web design. Chief among these are appropriate control of page layout, minimization of the time required to download a page from the remote server, and provision for navigation within and among the web pages that constitute the site. Page layout is easily accomplished by the use of tables; column widths can be fixed to maintain an acceptable line length for text. Downloading time is minimized by rigorous editing and by optimal compression of image files; beyond this, techniques like preloading of images and specification of image width and height are also helpful. Navigation controls should be clear, consistent, and readily available.

  9. Automated Cryocooler Monitor and Control System Software

    NASA Technical Reports Server (NTRS)

    Britchcliffe, Michael J.; Conroy, Bruce L.; Anderson, Paul E.; Wilson, Ahmad

    2011-01-01

    This software is used in an automated cryogenic control system developed to monitor and control the operation of small-scale cryocoolers. The system was designed to automate the cryogenically cooled low-noise amplifier system described in "Automated Cryocooler Monitor and Control System" (NPO-47246), NASA Tech Briefs, Vol. 35, No. 5 (May 2011), page 7a. The software contains algorithms necessary to convert non-linear output voltages from the cryogenic diode-type thermometers and vacuum pressure and helium pressure sensors, to temperature and pressure units. The control function algorithms use the monitor data to control the cooler power, vacuum solenoid, vacuum pump, and electrical warm-up heaters. The control algorithms are based on a rule-based system that activates the required device based on the operating mode. The external interface is Web-based. It acts as a Web server, providing pages for monitor, control, and configuration. No client software from the external user is required.
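
    The voltage-to-temperature conversion mentioned above is commonly done by interpolating a sensor calibration table. The sketch below shows that generic approach with a made-up excerpt of a diode calibration curve; the numbers are illustrative, not the calibration data of the system described.

        public class DiodeThermometer {
            // Made-up calibration points: diode voltage (V) falls as temperature (K) rises.
            private static final double[] VOLTS  = {1.62, 1.40, 1.10, 0.95, 0.55};
            private static final double[] KELVIN = {4.2, 20.0, 77.0, 150.0, 300.0};

            /** Piecewise-linear interpolation of the table; clamps outside the tabulated range. */
            static double toKelvin(double v) {
                if (v >= VOLTS[0]) return KELVIN[0];
                if (v <= VOLTS[VOLTS.length - 1]) return KELVIN[KELVIN.length - 1];
                for (int i = 1; i < VOLTS.length; i++) {
                    if (v >= VOLTS[i]) {
                        double frac = (v - VOLTS[i]) / (VOLTS[i - 1] - VOLTS[i]);
                        return KELVIN[i] + frac * (KELVIN[i - 1] - KELVIN[i]);
                    }
                }
                return Double.NaN; // not reached
            }

            public static void main(String[] args) {
                System.out.printf("1.25 V -> %.1f K%n", toKelvin(1.25));
            }
        }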

  10. Mobile cloud-computing-based healthcare service by noncontact ECG monitoring.

    PubMed

    Fong, Ee-May; Chung, Wan-Young

    2013-12-02

    The noncontact electrocardiogram (ECG) measurement technique has gained popularity owing to its noninvasive nature and its convenience in daily life. This paper presents mobile cloud computing for a healthcare system in which a noncontact ECG measurement method is employed to capture biomedical signals from users. A healthcare service is provided to continuously collect biomedical signals from multiple locations. To observe and analyze the ECG signals in real time, a mobile device is used as a mobile monitoring terminal. In addition, a personalized healthcare assistant is installed on the mobile device; several healthcare features such as health status summaries, medication QR code scanning, and reminders are integrated into the mobile application. Health data are synchronized into the healthcare cloud computing service (Web server system and Web server dataset) to ensure a seamless healthcare monitoring system whenever and wherever a network connection is available. Together with a Web page application, medical data are easily accessed by medical professionals or family members. A Web page performance evaluation was conducted to ensure minimal Web server latency. The system demonstrates better availability of off-site and up-to-the-minute patient data, which can help detect health problems early and keep elderly patients out of the emergency room, thus providing a better and more comprehensive healthcare cloud computing service.

  11. Mobile Cloud-Computing-Based Healthcare Service by Noncontact ECG Monitoring

    PubMed Central

    Fong, Ee-May; Chung, Wan-Young

    2013-01-01

    The noncontact electrocardiogram (ECG) measurement technique has gained popularity owing to its noninvasive nature and its convenience in daily life. This paper presents mobile cloud computing for a healthcare system in which a noncontact ECG measurement method is employed to capture biomedical signals from users. A healthcare service is provided to continuously collect biomedical signals from multiple locations. To observe and analyze the ECG signals in real time, a mobile device is used as a mobile monitoring terminal. In addition, a personalized healthcare assistant is installed on the mobile device; several healthcare features such as health status summaries, medication QR code scanning, and reminders are integrated into the mobile application. Health data are synchronized into the healthcare cloud computing service (Web server system and Web server dataset) to ensure a seamless healthcare monitoring system whenever and wherever a network connection is available. Together with a Web page application, medical data are easily accessed by medical professionals or family members. A Web page performance evaluation was conducted to ensure minimal Web server latency. The system demonstrates better availability of off-site and up-to-the-minute patient data, which can help detect health problems early and keep elderly patients out of the emergency room, thus providing a better and more comprehensive healthcare cloud computing service. PMID:24316562

  12. The Department of Defense and the Power of Cloud Computing: Weighing Acceptable Cost Versus Acceptable Risk

    DTIC Science & Technology

    2016-04-01

    the DOD will put DOD systems and data at a risk level comparable to that of their neighbors in the cloud. Just as a user browses a Web page on the...proxy servers for controlling user access to Web pages, and large-scale storage for data management. Each of these devices allows access to the...user to develop applications. Acunetics.com describes Web applications as “computer programs allowing Website visitors to submit and retrieve data

  13. CCTOP: a Consensus Constrained TOPology prediction web server.

    PubMed

    Dobson, László; Reményi, István; Tusnády, Gábor E

    2015-07-01

    The Consensus Constrained TOPology prediction (CCTOP; http://cctop.enzim.ttk.mta.hu) server is a web-based application providing transmembrane topology prediction. In addition to utilizing 10 different state-of-the-art topology prediction methods, the CCTOP server incorporates topology information from existing experimental and computational sources available in the PDBTM, TOPDB and TOPDOM databases using the probabilistic framework of hidden Markov model. The server provides the option to precede the topology prediction with signal peptide prediction and transmembrane-globular protein discrimination. The initial result can be recalculated by (de)selecting any of the prediction methods or mapped experiments or by adding user specified constraints. CCTOP showed superior performance to existing approaches. The reliability of each prediction is also calculated, which correlates with the accuracy of the per protein topology prediction. The prediction results and the collected experimental information are visualized on the CCTOP home page and can be downloaded in XML format. Programmable access of the CCTOP server is also available, and an example of client-side script is provided. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. 2MASS Catalog Server Kit Version 2.1

    NASA Astrophysics Data System (ADS)

    Yamauchi, C.

    2013-10-01

    The 2MASS Catalog Server Kit is open source software for easily constructing a high-performance search server for important astronomical catalogs. The software utilizes the open source RDBMS PostgreSQL; therefore, any user can set up the database on a local computer by following the step-by-step installation guide. The kit provides highly optimized stored functions for positional searches similar to those of SDSS SkyServer. Together with these, the powerful SQL environment of PostgreSQL meets a variety of user demands. We released 2MASS Catalog Server Kit version 2.1 in May 2012, which supports the latest WISE All-Sky catalog (563,921,584 rows) and 9 major all-sky catalogs. Local databases are often indispensable for observatories with unstable or narrow-band networks or heavy use, such as retrieving large numbers of records within a short period of time. This software is well suited to such purposes, and the additional supported catalogs and improvements of version 2.1 cover a wider range of applications, including advanced calibration systems and scientific studies using complicated SQL queries. Official page: http://www.ir.isas.jaxa.jp/~cyamauch/2masskit/
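
    A minimal sketch of querying such a PostgreSQL-backed catalog server from Java via JDBC. The stored-function name fGetNearbyObj (borrowed from the SDSS SkyServer style the record mentions) and the column names are placeholders, not necessarily the functions the kit actually installs.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;

        public class ConeSearchSketch {
            public static void main(String[] args) throws SQLException {
                // Hypothetical database, function, and column names, for illustration only.
                String url = "jdbc:postgresql://localhost/twomass";
                String sql = "SELECT objid, ra, dec FROM fGetNearbyObj(?, ?, ?)"; // ra, dec, radius (deg)
                try (Connection con = DriverManager.getConnection(url, "guest", "guest");
                     PreparedStatement ps = con.prepareStatement(sql)) {
                    ps.setDouble(1, 266.417);  // right ascension, degrees
                    ps.setDouble(2, -29.008);  // declination, degrees
                    ps.setDouble(3, 0.1);      // search radius, degrees
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            System.out.printf("%s  %.5f  %.5f%n",
                                    rs.getString(1), rs.getDouble(2), rs.getDouble(3));
                        }
                    }
                }
            }
        }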

  15. I/O performance evaluation of a Linux-based network-attached storage device

    NASA Astrophysics Data System (ADS)

    Sun, Zhaoyan; Dong, Yonggui; Wu, Jinglian; Jia, Huibo; Feng, Guanping

    2002-09-01

    In a Local Area Network (LAN), clients are permitted to access files on high-density optical disks via a network server. However, the quality of the read service offered by a conventional server is not satisfactory, because the server performs multiple functions and serves too many callers. This paper develops a Linux-based Network-Attached Storage (NAS) server. The operating system (OS), composed of an optimized kernel and a miniaturized file system, is stored in flash memory. After initialization, the NAS device is connected to the LAN. The administrator and users can respectively configure and access the server through web pages. In order to enhance the quality of access, the management of the buffer cache in the file system is optimized. Benchmark programs are run to evaluate the I/O performance of the NAS device. Since data recorded on optical disks are usually for read access, our attention is focused on the read throughput of the device. The experimental results indicate that the I/O performance of our NAS device is excellent.

  16. Design and development of a web-based application for diabetes patient data management.

    PubMed

    Deo, S S; Deobagkar, D N; Deobagkar, Deepti D

    2005-01-01

    A web-based database management system developed for collecting, managing and analysing information of diabetes patients is described here. It is a searchable, client-server, relational database application, developed on the Windows platform using Oracle, Active Server Pages (ASP), Visual Basic Script (VB Script) and Java Script. The software is menu-driven and allows authorized healthcare providers to access, enter, update and analyse patient information. Graphical representation of data can be generated by the system using bar charts and pie charts. An interactive web interface allows users to query the database and generate reports. Alpha- and beta-testing of the system was carried out and the system at present holds records of 500 diabetes patients and is found useful in diagnosis and treatment. In addition to providing patient data on a continuous basis in a simple format, the system is used in population and comparative analysis. It has proved to be of significant advantage to the healthcare provider as compared to the paper-based system.

  17. Audience response made easy: using personal digital assistants as a classroom polling tool.

    PubMed

    Menon, Anil S; Moffett, Shannon; Enriquez, Melissa; Martinez, Miriam M; Dev, Parvati; Grappone, Todd

    2004-01-01

    Both teachers and students benefit from an interactive classroom. The teacher receives valuable input about effectiveness, student interest, and comprehension, whereas student participation, active learning, and enjoyment of the class are enhanced. Cost and deployment have limited the use of existing audience response systems, allowing anonymous linking of teachers and students in the classroom. These limitations can be circumvented, however, by use of personal digital assistants (PDAs), which are cheaper and widely used by students. In this study, the authors equipped a summer histology class of 12 students with PDAs and wireless Bluetooth cards to allow access to a central server. Teachers displayed questions in multiple-choice format as a Web page on the server and students responded with their PDAs, a process referred to as polling. Responses were immediately compiled, analyzed, and displayed. End-of-class survey results indicated that students were enthusiastic about the polling tool. The surveys also provided technical feedback that will be valuable in streamlining future trials.
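
    A rough sketch of the polling pattern the study describes: PDA browsers post a multiple-choice answer to a central servlet, which compiles the tallies for the instructor. The parameter name and output format are assumptions, not the authors' implementation.

        import java.io.IOException;
        import java.io.PrintWriter;
        import java.util.Map;
        import java.util.concurrent.ConcurrentHashMap;
        import java.util.concurrent.atomic.AtomicInteger;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        public class PollServlet extends HttpServlet {
            // Tally of answer choice -> number of student responses.
            private final Map<String, AtomicInteger> tally = new ConcurrentHashMap<>();

            @Override
            protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
                String choice = req.getParameter("choice");   // e.g. "A".."E" (assumed parameter name)
                if (choice != null) {
                    tally.computeIfAbsent(choice, k -> new AtomicInteger()).incrementAndGet();
                }
                resp.setContentType("text/plain");
                resp.getWriter().println("Response recorded");
            }

            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
                // Instructor view: current distribution of responses.
                resp.setContentType("text/plain");
                PrintWriter out = resp.getWriter();
                tally.forEach((answer, count) -> out.printf("%s: %d%n", answer, count.get()));
            }
        }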

  18. Audience Response Made Easy: Using Personal Digital Assistants as a Classroom Polling Tool

    PubMed Central

    Menon, Anil S.; Moffett, Shannon; Enriquez, Melissa; Martinez, Miriam M.; Dev, Parvati; Grappone, Todd

    2004-01-01

    Both teachers and students benefit from an interactive classroom. The teacher receives valuable input about effectiveness, student interest, and comprehension, whereas student participation, active learning, and enjoyment of the class are enhanced. Cost and deployment have limited the use of existing audience response systems, allowing anonymous linking of teachers and students in the classroom. These limitations can be circumvented, however, by use of personal digital assistants (PDAs), which are cheaper and widely used by students. In this study, the authors equipped a summer histology class of 12 students with PDAs and wireless Bluetooth cards to allow access to a central server. Teachers displayed questions in multiple-choice format as a Web page on the server and students responded with their PDAs, a process referred to as polling. Responses were immediately compiled, analyzed, and displayed. End-of-class survey results indicated that students were enthusiastic about the polling tool. The surveys also provided technical feedback that will be valuable in streamlining future trials. PMID:14764615

  19. How To Get Your Web Page Noticed.

    ERIC Educational Resources Information Center

    Schrock, Kathleen

    1997-01-01

    Presents guidelines for making a Web site noticeable. Discusses submitting the URL to directories, links, and announcement lists, and sending the site over the server via FTP to search engines. Describes how to index the site with "Title,""Heading," and "Meta" tags. (AEF)

  20. The World Wide Web and Technology Transfer at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bianco, David J.

    1994-01-01

    NASA Langley Research Center (LaRC) began using the World Wide Web (WWW) in the summer of 1993, becoming the first NASA installation to provide a Center-wide home page. This coincided with a reorganization of LaRC to provide a more concentrated focus on technology transfer to both aerospace and non-aerospace industry. Use of the WWW and NCSA Mosaic not only provides automated information dissemination, but also allows for the implementation, evolution and integration of many technology transfer applications. This paper describes several of these innovative applications, including the on-line presentation of the entire Technology Opportunities Showcase (TOPS), an industrial partnering showcase that exists on the Web long after the actual 3-day event ended. During its first year on the Web, LaRC also developed several WWW-based information repositories. The Langley Technical Report Server (LTRS), a technical paper delivery system with integrated searching and retrieval, has proved to be quite popular. The NASA Technical Report Server (NTRS), an outgrowth of LTRS, provides uniform access to many logically similar, yet physically distributed NASA report servers. WWW is also the foundation of the Langley Software Server (LSS), an experimental software distribution system which will distribute LaRC-developed software with the possible phase-out of NASA's COSMIC program. In addition to the more formal technology distribution projects, WWW has been successful in connecting people with technologies and people with other people. With the completion of the LaRC reorganization, the Technology Applications Group, charged with interfacing with non-aerospace companies, opened for business with a popular home page.

  1. Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology.

    PubMed

    Markiewicz, Tomasz

    2011-03-30

    Matlab is one of the most advanced development tools for applications in engineering practice. From our point of view the most important part is the image processing toolbox, which offers many built-in functions, including mathematical morphology, and implementations of many artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, also in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology via Internet communication. The Internet platform can be realized with JavaServer Pages and a Tomcat server as the servlet container. In the presented software implementation, we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled into an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with the set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JavaServer Pages (JSP). The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab, with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides the image and case information (diagnosis, staining, image parameters, etc.). When analysis is initialized, the input data with the image are sent to a servlet on Tomcat. When analysis is done, the client obtains the graphical results as an image with the recognized cells marked, together with the quantitative output. Additionally, the results are stored in a server database. The Internet platform was tested on a PC Intel Core2 Duo T9600 2.8 GHz, 4 GB RAM server with 768x576 pixel, 1.28 MB tiff format images referring to meningioma tumour (x400, Ki-67/MIB-1). The time consumption was as follows: analysis by CAMI locally on the server, 3.5 seconds; remote analysis, 26 seconds, of which 22 seconds were used for data transfer via the Internet connection. For a jpg format image (102 KB), the time was reduced to 14 seconds. The results confirm that the designed remote platform can be useful for pathology image analysis. The time consumption depends mainly on the image size and the speed of the Internet connection. The presented implementation can be used for many types of analysis with different staining, tissue, and morphometry approaches, etc. A significant remaining problem is implementing the JSP page in multithreaded form so that it can be used in parallel by many users. The presented platform for image analysis in pathology can be especially useful for a small laboratory without its own image analysis system.
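
    As a sketch of the deployment pattern described (a Matlab routine compiled to a jar and invoked from a servlet container), the fragment below shows only the Java side; CamiAnalyzer and its Result type are stub stand-ins for whatever classes the Matlab Builder Java toolbox would actually generate.

        import java.io.IOException;
        import javax.servlet.ServletException;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        public class AnalysisServlet extends HttpServlet {
            /** Stub standing in for the class generated from the compiled Matlab code. */
            static class CamiAnalyzer {
                Result analyze(String imagePath) {
                    // The real implementation would call the exported Matlab function here.
                    return new Result(0, 0.0);
                }
            }
            record Result(int cellCount, double labelingIndex) {}

            @Override
            protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                    throws ServletException, IOException {
                String imagePath = req.getParameter("image");  // location of the uploaded image (assumed)
                Result r = new CamiAnalyzer().analyze(imagePath);
                resp.setContentType("text/html");
                resp.getWriter().printf("<p>Cells counted: %d, labeling index: %.1f%%</p>%n",
                                        r.cellCount(), r.labelingIndex());
            }
        }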

  2. Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology

    PubMed Central

    2011-01-01

    Background Matlab is one of the most advanced development tools for applications in engineering practice. From our point of view the most important part is the image processing toolbox, which offers many built-in functions, including mathematical morphology, and implementations of many artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, also in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology via Internet communication. The Internet platform can be realized with JavaServer Pages and a Tomcat server as the servlet container. Methods In the presented software implementation, we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled into an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with the set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JavaServer Pages (JSP). The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab, with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. Results The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides the image and case information (diagnosis, staining, image parameters, etc.). When analysis is initialized, the input data with the image are sent to a servlet on Tomcat. When analysis is done, the client obtains the graphical results as an image with the recognized cells marked, together with the quantitative output. Additionally, the results are stored in a server database. The Internet platform was tested on a PC Intel Core2 Duo T9600 2.8 GHz, 4 GB RAM server with 768x576 pixel, 1.28 MB tiff format images referring to meningioma tumour (x400, Ki-67/MIB-1). The time consumption was as follows: analysis by CAMI locally on the server, 3.5 seconds; remote analysis, 26 seconds, of which 22 seconds were used for data transfer via the Internet connection. For a jpg format image (102 KB), the time was reduced to 14 seconds. Conclusions The results confirm that the designed remote platform can be useful for pathology image analysis. The time consumption depends mainly on the image size and the speed of the Internet connection. The presented implementation can be used for many types of analysis with different staining, tissue, and morphometry approaches, etc. A significant remaining problem is implementing the JSP page in multithreaded form so that it can be used in parallel by many users. The presented platform for image analysis in pathology can be especially useful for a small laboratory without its own image analysis system. PMID:21489188

  3. Installation of the National Transport Code Collaboration Data Server at the ITPA International Multi-tokamak Confinement Profile Database

    NASA Astrophysics Data System (ADS)

    Roach, Colin; Carlsson, Johan; Cary, John R.; Alexander, David A.

    2002-11-01

    The National Transport Code Collaboration (NTCC) has developed an array of software, including a data client/server. The data server, which is written in C++, serves local data (in the ITER Profile Database format) as well as remote data (by accessing one or several MDS+ servers). The client, a web-invocable Java applet, provides a uniform, intuitive, user-friendly, graphical interface to the data server. The uniformity of the interface relieves the user from the trouble of mastering the differences between different data formats and lets him/her focus on the essentials: plotting and viewing the data. The user runs the client by visiting a web page using any Java-capable Web browser. The client is automatically downloaded and run by the browser. A reference to the data server is then retrieved via the standard Web protocol (HTTP). The communication between the client and the server is then handled by the mature, industry-standard CORBA middleware. CORBA has bindings for all common languages and many high-quality implementations are available (both Open Source and commercial). The NTCC data server has been installed at the ITPA International Multi-tokamak Confinement Profile Database, which is hosted by the UKAEA at Culham Science Centre. The installation of the data server is protected by an Internet firewall. To make it accessible to clients outside the firewall, some modifications of the server were required. The working version of the ITPA confinement profile database is not open to the public. Authentication of legitimate users is done utilizing built-in Java security features to demand a password to download the client. We present an overview of the NTCC data client/server and some details of how the CORBA firewall-traversal issues were resolved and how the user authentication is implemented.

  4. BCM Search Launcher--an integrated interface to molecular biology data base search and analysis services available on the World Wide Web.

    PubMed

    Smith, R F; Wiese, B A; Wojzynski, M K; Davison, D B; Worley, K C

    1996-05-01

    The BCM Search Launcher is an integrated set of World Wide Web (WWW) pages that organize molecular biology-related search and analysis services available on the WWW by function, and provide a single point of entry for related searches. The Protein Sequence Search Page, for example, provides a single sequence entry form for submitting sequences to WWW servers that offer remote access to a variety of different protein sequence search tools, including BLAST, FASTA, Smith-Waterman, BEAUTY, PROSITE, and BLOCKS searches. Other Launch pages provide access to (1) nucleic acid sequence searches, (2) multiple and pair-wise sequence alignments, (3) gene feature searches, (4) protein secondary structure prediction, and (5) miscellaneous sequence utilities (e.g., six-frame translation). The BCM Search Launcher also provides a mechanism to extend the utility of other WWW services by adding supplementary hypertext links to results returned by remote servers. For example, links to the NCBI's Entrez data base and to the Sequence Retrieval System (SRS) are added to search results returned by the NCBI's WWW BLAST server. These links provide easy access to auxiliary information, such as Medline abstracts, that can be extremely helpful when analyzing BLAST data base hits. For new or infrequent users of sequence data base search tools, we have preset the default search parameters to provide the most informative first-pass sequence analysis possible. We have also developed a batch client interface for Unix and Macintosh computers that allows multiple input sequences to be searched automatically as a background task, with the results returned as individual HTML documents directly to the user's system. The BCM Search Launcher and batch client are available on the WWW at URL http:@gc.bcm.tmc.edu:8088/search-launcher.html.
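
    The supplementary-link mechanism described above amounts to post-processing the HTML returned by a remote server and wrapping recognized identifiers in anchors. A minimal, generic sketch follows; the accession-number pattern and the link target are illustrative assumptions, not the Launcher's actual rewriting rules.

        import java.util.regex.Pattern;

        public class LinkAugmenter {
            // Toy pattern for GenBank-style accession numbers such as "U49845" (assumed format).
            private static final Pattern ACCESSION = Pattern.compile("\\b([A-Z]{1,2}\\d{5,6})\\b");

            /** Wrap each recognized accession number in a hyperlink to a lookup page. */
            static String addLinks(String html) {
                return ACCESSION.matcher(html)
                        .replaceAll("<a href=\"https://www.ncbi.nlm.nih.gov/nuccore/$1\">$1</a>");
            }

            public static void main(String[] args) {
                System.out.println(addLinks("Best hit: U49845 (score 512)"));
            }
        }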

  5. The ClusPro web server for protein-protein docking

    PubMed Central

    Kozakov, Dima; Hall, David R.; Xia, Bing; Porter, Kathryn A.; Padhorny, Dzmitry; Yueh, Christine; Beglov, Dmitri; Vajda, Sandor

    2017-01-01

    The ClusPro server (https://cluspro.org) is a widely used tool for protein-protein docking. The server provides a simple home page for basic use, requiring only two files in Protein Data Bank format. However, ClusPro also offers a number of advanced options to modify the search that include the removal of unstructured protein regions, applying attraction or repulsion, accounting for pairwise distance restraints, constructing homo-multimers, considering small angle X-ray scattering (SAXS) data, and finding heparin binding sites. Six different energy functions can be used depending on the type of proteins. Docking with each energy parameter set results in ten models defined by centers of highly populated clusters of low energy docked structures. This protocol describes the use of the various options, the construction of auxiliary restraints files, the selection of the energy parameters, and the analysis of the results. Although the server is heavily used, runs are generally completed in < 4 hours. PMID:28079879

  6. Enhancement web proxy cache performance using Wrapper Feature Selection methods with NB and J48

    NASA Astrophysics Data System (ADS)

    Mahmoud Al-Qudah, Dua'a.; Funke Olanrewaju, Rashidah; Wong Azman, Amelia

    2017-11-01

    The web proxy cache technique reduces response time by storing copies of pages between the client and server sides. If requested pages are cached in the proxy, there is no need to access the server. Because of the limited size and excessive cost of the cache compared to other storage, a cache replacement algorithm is used to determine which page to evict when the cache is full. On the other hand, conventional replacement algorithms such as Least Recently Used (LRU), First In First Out (FIFO), Least Frequently Used (LFU), Randomized Policy, etc., may discard important pages just before they are used. Furthermore, conventional algorithms cannot be well optimized, since an intelligent decision about which page to evict is required before replacement. Hence, most researchers propose integrating intelligent classifiers with the replacement algorithm to improve its performance. This research proposes using automated wrapper feature selection methods to choose the best subset of features that are relevant and influence the classifier's prediction accuracy. The results show that the wrapper feature selection methods, namely Best First (BFS), Incremental Wrapper Subset Selection (IWSS) embedded NB, and Particle Swarm Optimization (PSO), reduce the number of features and have a good impact on reducing computation time. Using PSO enhances NB classifier accuracy by 1.1%, 0.43% and 0.22% over using NB with all features, using BFS, and using IWSS embedded NB, respectively. PSO raises J48 accuracy by 0.03%, 1.91% and 0.04% over using the J48 classifier with all features, using IWSS-embedded NB, and using BFS, respectively. Using IWSS embedded NB, however, speeds up the NB and J48 classifiers much more than BFS and PSO, reducing the computation time of NB by 0.1383 and of J48 by 2.998.
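
    For reference, the Least Recently Used policy mentioned above can be expressed very compactly in Java with an access-ordered LinkedHashMap. This is a generic illustration of the baseline replacement policy only, not the classifier-augmented scheme the paper proposes.

        import java.util.LinkedHashMap;
        import java.util.Map;

        /** Tiny LRU page cache: the least recently accessed entry is evicted once capacity is exceeded. */
        public class LruPageCache<K, V> extends LinkedHashMap<K, V> {
            private final int capacity;

            public LruPageCache(int capacity) {
                super(16, 0.75f, true);   // accessOrder = true yields LRU iteration order
                this.capacity = capacity;
            }

            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity; // evict when the cache is full
            }
        }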

  7. Climate Prediction Center

    Science.gov Websites

    Excerpt: a non-operational server hosts the redesigned web pages developed, thus far, as part of the Climate ... (the remainder of the page consists of navigation links to NWS/NCEP centers and related climate programs).

  8. Automatic Web-based Calibration of Network-Capable Shipboard Sensors

    DTIC Science & Technology

    2007-09-01

    Excerpt (report documentation page and table of contents): keywords include Server, Java, Applet, and Servlet; listed contents include a Sensor Applet and a Java Servlet; tables cover the required system environment variables for Java servlet development and the payload data format of the POST requests from ...

  9. Synoptic reporting in tumor pathology: advantages of a web-based system.

    PubMed

    Qu, Zhenhong; Ninan, Shibu; Almosa, Ahmed; Chang, K G; Kuruvilla, Supriya; Nguyen, Nghia

    2007-06-01

    The American College of Surgeons Commission on Cancer (ACS-CoC) mandates that pathology reports at ACS-CoC-approved cancer programs include all scientifically validated data elements for each site and tumor specimen. The College of American Pathologists (CAP) has produced cancer checklists in static text formats to assist reporting. To be inclusive, the CAP checklists are pages long, requiring extensive text editing and multiple intermediate steps. We created a set of dynamic tumor-reporting templates, using Microsoft Active Server Page (ASP.NET), with drop-down list and data-compile features, and added a reminder function to indicate missing information. Users can access this system on the Internet, prepare the tumor report by selecting relevant data from drop-down lists with an embedded tumor staging scheme, and directly transfer the final report into a laboratory information system by using the copy-and-paste function. By minimizing extensive text editing and eliminating intermediate steps, this system can reduce reporting errors, improve work efficiency, and increase compliance.
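
    The "reminder function to indicate missing information" described above is essentially a completeness check over the required checklist elements before a report is released. A minimal, generic sketch, with field names invented for illustration:

        import java.util.ArrayList;
        import java.util.LinkedHashMap;
        import java.util.List;
        import java.util.Map;

        public class ChecklistValidator {
            /** Return the names of required data elements that are still blank. */
            static List<String> missingElements(Map<String, String> report, List<String> required) {
                List<String> missing = new ArrayList<>();
                for (String field : required) {
                    String value = report.get(field);
                    if (value == null || value.isBlank()) {
                        missing.add(field);
                    }
                }
                return missing;
            }

            public static void main(String[] args) {
                Map<String, String> report = new LinkedHashMap<>();
                report.put("Histologic type", "Invasive ductal carcinoma");
                report.put("Tumor size", "");   // deliberately left blank
                List<String> required = List.of("Histologic type", "Tumor size", "Margins", "pT stage");
                System.out.println("Missing: " + missingElements(report, required));
            }
        }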

  10. Graphic Server: A real time system for displaying and monitoring telemetry data of several satellites

    NASA Technical Reports Server (NTRS)

    Douard, Stephane

    1994-01-01

    Known as a Graphic Server, the system presented was designed for the control ground segment of the Telecom 2 satellites. It is a tool used to dynamically display telemetry data within graphic pages, also known as views. The views are created off-line through various utilities and then, on the operator's request, displayed and animated in real time as data is received. The system was designed as an independent component, and is installed in different Telecom 2 operational control centers. It enables operators to monitor changes in the platform and satellite payloads in real time. It has been in operation since December 1991.

  11. A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows

    NASA Astrophysics Data System (ADS)

    Babin, B. L.; Hu, L.

    2008-12-01

    The Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) Environmental Monitoring Systems provide a unified coastal ocean observing system. These two systems are mirrored to maintain autonomy while offering an integrated data sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL servers, and disseminate the data in real time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The use of Microsoft Windows technologies has presented many challenges to these observing systems as open source tools for interoperability grow. The current open source tools often require the installation of additional software. In order to make data available through common standard formats, "home grown" software has been developed. One example of this is software to generate XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook to implement OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via Sensor Observation Services (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).
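
    A rough sketch of generating a simple observation XML file of the kind described for transmission to NDBC, using only the JDK's built-in DOM and transformer APIs. The element names and station identifier are invented placeholders, not NDBC's actual schema.

        import java.io.StringWriter;
        import javax.xml.parsers.DocumentBuilderFactory;
        import javax.xml.transform.TransformerFactory;
        import javax.xml.transform.dom.DOMSource;
        import javax.xml.transform.stream.StreamResult;
        import org.w3c.dom.Document;
        import org.w3c.dom.Element;

        public class ObservationXml {
            public static void main(String[] args) throws Exception {
                Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();
                Element obs = doc.createElement("observation");      // placeholder element names
                doc.appendChild(obs);
                obs.setAttribute("station", "LUMCON-01");            // hypothetical station id
                obs.setAttribute("time", "2008-07-01T12:00:00Z");
                Element temp = doc.createElement("waterTemperature");
                temp.setAttribute("units", "degC");
                temp.setTextContent("29.4");
                obs.appendChild(temp);

                StringWriter out = new StringWriter();
                TransformerFactory.newInstance().newTransformer()
                        .transform(new DOMSource(doc), new StreamResult(out));
                System.out.println(out);
            }
        }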

  12. MTSAT: Full Disk - NOAA GOES Geostationary Satellite Server

    Science.gov Websites

    Navigation page linking hemispheric, continental, and regional satellite imagery sectors (GOES, Himawari-8, Meteosat, and Indian Ocean; e.g., GOES-West, Pacific Northwest, Northern Rockies), each with Source and Local views.

  13. Serving Grades Over the Internet.

    ERIC Educational Resources Information Center

    Harris, James K.

    This paper demonstrates a grade server that allows college students to access their grades over the Internet from the instructor's home page. Using a CGI (common gateway interface) program written in Visual Basic, the grades are read directly from an Excel spreadsheet and presented to the requester after he/she enters a password. The grade for…

  14. Staleness Among Web Search Engines.

    ERIC Educational Resources Information Center

    Koehler, Wallace

    1998-01-01

    Describes a study of four major Web search engines that tested for staleness, a condition when a significant number of the hits it returns point to Web pages or server-level domains (SLD) that are no longer viable. Results of tests of URLs with AltaVista, HotBot, InfoSeek, and Open Text are discussed. (Author/LRW)

  15. Automatic page layout using genetic algorithms for electronic albuming

    NASA Astrophysics Data System (ADS)

    Geigel, Joe; Loui, Alexander C. P.

    2000-12-01

    In this paper, we describe a flexible system for automatic page layout that makes use of genetic algorithms for albuming applications. The system is divided into two modules: a page creator module, which is responsible for distributing images amongst various album pages, and an image placement module, which positions images on individual pages. Final page layouts are specified in a textual form using XML for printing or viewing over the Internet. The system makes use of genetic algorithms, a class of search and optimization algorithms that are based on the concepts of biological evolution, for generating solutions with fitness based on graphic design preferences supplied by the user. The genetic page layout algorithm has been incorporated into a web-based prototype system for interactive page layout over the Internet. The prototype system is built using a client-server architecture and is implemented in Java. The system described in this paper has demonstrated the feasibility of using genetic algorithms for automated page layout in albuming and web-based imaging applications. We believe that the system adequately proves the validity of the concept, providing creative layouts in a reasonable number of iterations. By optimizing the layout parameters of the fitness function, we hope to further improve the quality of the final layout in terms of user preference and computation speed.
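
    A highly simplified sketch of the evolutionary loop behind such a layout engine: candidate layouts are scored by a fitness function encoding design preferences, and fitter candidates are preferentially kept and mutated. Everything here, including the one-dimensional "layout" encoding and a fitness that prefers evenly spaced images, is an invented toy rather than the authors' system.

        import java.util.ArrayList;
        import java.util.Arrays;
        import java.util.Comparator;
        import java.util.List;
        import java.util.Random;

        public class LayoutGaSketch {
            static final Random RNG = new Random(42);

            /** Toy fitness: prefer image y-positions spread evenly down the page (higher is better). */
            static double fitness(double[] layout) {
                double penalty = 0;
                for (int i = 1; i < layout.length; i++) {
                    penalty += Math.abs((layout[i] - layout[i - 1]) - 1.0 / layout.length);
                }
                return -penalty;
            }

            static double[] mutate(double[] parent) {
                double[] child = parent.clone();
                int i = RNG.nextInt(child.length);
                child[i] = Math.min(1.0, Math.max(0.0, child[i] + RNG.nextGaussian() * 0.05));
                Arrays.sort(child);  // keep positions ordered top-to-bottom
                return child;
            }

            public static void main(String[] args) {
                List<double[]> population = new ArrayList<>();
                for (int i = 0; i < 30; i++) population.add(mutate(new double[]{0.1, 0.3, 0.5, 0.7, 0.9}));
                for (int gen = 0; gen < 200; gen++) {
                    population.sort(Comparator.<double[]>comparingDouble(LayoutGaSketch::fitness).reversed());
                    List<double[]> next = new ArrayList<>(population.subList(0, 10));  // keep the elite
                    while (next.size() < 30) next.add(mutate(population.get(RNG.nextInt(10))));
                    population = next;
                }
                System.out.println(Arrays.toString(population.get(0)));
            }
        }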

  16. Empowering radiologic education on the Internet: a new virtual website technology for hosting interactive educational content on the World Wide Web.

    PubMed

    Frank, M S; Dreyer, K

    2001-06-01

    We describe a virtual web site hosting technology that enables educators in radiology to emblazon and make available for delivery on the world wide web their own interactive educational content, free from dependencies on in-house resources and policies. This suite of technologies includes a graphically oriented software application, designed for the computer novice, to facilitate the input, storage, and management of domain expertise within a database system. The database stores this expertise as choreographed and interlinked multimedia entities including text, imagery, interactive questions, and audio. Case-based presentations or thematic lectures can be authored locally, previewed locally within a web browser, then uploaded at will as packaged knowledge objects to an educator's (or department's) personal web site housed within a virtual server architecture. This architecture can host an unlimited number of unique educational web sites for individuals or departments in need of such service. Each virtual site's content is stored within that site's protected back-end database connected to Internet Information Server (Microsoft Corp, Redmond WA) using a suite of Active Server Page (ASP) modules that incorporate Microsoft's Active Data Objects (ADO) technology. Each person's or department's electronic teaching material appears as an independent web site with different levels of access--controlled by a username-password strategy--for teachers and students. There is essentially no static hypertext markup language (HTML). Rather, all pages displayed for a given site are rendered dynamically from case-based or thematic content that is fetched from that virtual site's database. The dynamically rendered HTML is displayed within a web browser in a Socratic fashion that can assess the recipient's current fund of knowledge while providing instantaneous user-specific feedback. Each site is emblazoned with the logo and identification of the participating institution. Individuals with teacher-level access can use a web browser to upload new content as well as manage content already stored on their virtual site. Each virtual site stores, collates, and scores participants' responses to the interactive questions posed on line. This virtual web site strategy empowers the educator with an end-to-end solution for creating interactive educational content and hosting that content within the educator's personalized and protected educational site on the world wide web, thus providing a valuable outlet that can magnify the impact of his or her talents and contributions.

  17. The Alaska Volcano Observatory Website a Tool for Information Management and Dissemination

    NASA Astrophysics Data System (ADS)

    Snedigar, S. F.; Cameron, C. E.; Nye, C. J.

    2006-12-01

    The Alaska Volcano Observatory's (AVO's) website served as a primary information management tool during the 2006 eruption of Augustine Volcano. The AVO website is dynamically generated from a database back- end. This system enabled AVO to quickly and easily update the website, and provide content based on user- queries to the database. During the Augustine eruption, the new AVO website was heavily used by members of the public (up to 19 million hits per day), and this was largely because the AVO public pages were an excellent source of up-to-date information. There are two different, yet fully integrated parts of the website. An external, public site (www.avo.alaska.edu) allows the general public to track eruptive activity by viewing the latest photographs, webcam images, webicorder graphs, and official information releases about activity at the volcano, as well as maps, previous eruption information, bibliographies, and rich information about other Alaska volcanoes. The internal half of the website hosts diverse geophysical and geological data (as browse images) in a format equally accessible by AVO staff in different locations. In addition, an observation log allows users to enter information about anything from satellite passes to seismic activity to ash fall reports into a searchable database. The individual(s) on duty at the watch office use forms on the internal website to post a summary of the latest activity directly to the public website, ensuring that the public website is always up to date. The internal website also serves as a starting point for monitoring Alaska's volcanoes. AVO's extensive image database allows AVO personnel to upload many photos, diagrams, and videos which are then available to be browsed by anyone in the AVO community. Selected images are viewable from the public page. The primary webserver is housed at the University of Alaska Fairbanks, and holds a MySQL database with over 200 tables and several thousand lines of php code gluing the database and website together. The database currently holds 95 GB of data. Webcam images and webicorder graphs are pulled from servers in Anchorage every few minutes. Other servers in Fairbanks generate earthquake location plots and spectrograms.

  18. PropeR revisited.

    PubMed

    van der Linden, Helma; Talmon, Jan; Tange, Huibert; Grimson, Jane; Hasman, Arie

    2005-03-01

    The PropeR EHR system (PropeRWeb) is an electronic health record (EHR) system for multidisciplinary use in extramural care of stroke patients. The system is built using existing open source components and is based on open standards. It is implemented as a web application using servlets and JavaServer Pages (JSPs), with a CORBA connection to the database servers, which are based on the OMG HDTF specifications. PropeRWeb is a generic system that can readily be customized for use in a variety of clinical domains. The system proved to be stable and flexible, although some aspects (among others, user friendliness) could be improved. These improvements are currently under development in a second version.

  19. Web-based network analysis and visualization using CellMaps

    PubMed Central

    Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín

    2016-01-01

    Summary: CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. Availability and Implementation: The application is available at: http://cellmaps.babelomics.org/ and the code can be found in: https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27296979

  20. Web-based network analysis and visualization using CellMaps.

    PubMed

    Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín

    2016-10-01

    CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. The application is available at http://cellmaps.babelomics.org/ and the code can be found at https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  1. Secure web-based access to radiology: forms and databases for fast queries

    NASA Astrophysics Data System (ADS)

    McColl, Roderick W.; Lane, Thomas J.

    2002-05-01

    Currently, Web-based access to mini-PACS or similar databases commonly utilizes either JavaScript, Java applets or ActiveX controls. Many sites do not permit applets or controls or other binary objects for fear of viruses or worms sent by malicious users. In addition, the typical CGI query mechanism requires several parameters to be sent with the HTTP GET/POST request, which may identify the patient in some way; this is unacceptable for privacy protection. Also unacceptable are pages produced by server-side scripts which can be cached by the browser, since these may also contain sensitive information. We propose a simple mechanism for access to patient information, including images, which guarantees security of information and makes it impossible to bookmark the page or to return to the page after some defined length of time. In addition, this mechanism is simple, therefore permitting rapid access without the need to initially download an interface such as an applet or control. In addition to image display, the design of the site allows the user to view and save movies of multi-phasic data, or to construct multi-frame datasets from entire series. These capabilities make the site attractive for research purposes such as teaching file preparation.
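
    A minimal sketch of the kind of access mechanism described (an opaque, short-lived, single-use token in the URL instead of patient-identifying parameters, with responses marked non-cacheable so pages cannot be bookmarked or recovered later). The token lifetime, names and WSGI framing are assumptions, not the authors' implementation.

        # Sketch: URLs carry only an opaque token; tokens expire and are single use;
        # responses are marked non-cacheable. Names and lifetime are illustrative.
        import secrets, time
        from wsgiref.simple_server import make_server

        TOKEN_LIFETIME_S = 300
        tokens = {}  # token -> (expiry time, internal study key)

        def issue_token(study_key):
            token = secrets.token_urlsafe(16)
            tokens[token] = (time.time() + TOKEN_LIFETIME_S, study_key)
            return token

        def app(environ, start_response):
            token = (environ.get("QUERY_STRING") or "").replace("t=", "", 1)
            entry = tokens.pop(token, None)          # single use: token is consumed
            headers = [("Content-Type", "text/plain"),
                       ("Cache-Control", "no-store, no-cache, must-revalidate"),
                       ("Pragma", "no-cache")]
            if entry is None or entry[0] < time.time():
                start_response("403 Forbidden", headers)
                return [b"Link expired or invalid."]
            start_response("200 OK", headers)
            return [f"Report for study {entry[1]}".encode()]

        if __name__ == "__main__":
            print("Example URL: /?t=" + issue_token("STUDY-0001"))
            make_server("", 8000, app).serve_forever()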

  2. Casting the Net: The Development of a Resource Collection for an Internet Database.

    ERIC Educational Resources Information Center

    McKiernan, Gerry

    CyberStacks(sm), a demonstration prototype World Wide Web information service, was established on the home page server at Iowa State University with the intent of facilitating identification and use of significant Internet resources in science and technology. CyberStacks(sm) was created in response to perceived deficiencies in early efforts to…

  3. 77 FR 6601 - Facility Operating License Amendment From Nine Mile Point Nuclear Station, LLC.; Nine Mile Point...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-08

    ... (or its counsel or representative) to digitally sign documents and access the E-Submittal server for... Information (SGI) is necessary to respond to this notice must request document access by February 21, 2012... instructions on submitting comments and instructions on accessing documents related to this action, see [[Page...

  4. Using a Java Web-based Graphical User Interface to access the SOHO Data Archive

    NASA Astrophysics Data System (ADS)

    Scholl, I.; Girard, Y.; Bykowski, A.

    This paper presents the architecture of a Java web-based graphical interface dedicated to access to the SOHO Data Archive. This application allows local and remote users to search in the SOHO data catalog and retrieve the SOHO data files from the archive. It has been developed at MEDOC (Multi-Experiment Data and Operations Centre), located at the Institut d'Astrophysique Spatiale (Orsay, France), which is one of the European Archives for the SOHO data. This development is part of a joint effort between ESA, NASA and IAS in order to implement long term archive systems for the SOHO data. The software architecture is built as a client-server application using the Java language and SQL above a set of components such as an HTTP server, a JDBC gateway, an RDBMS server, a data server and a Web browser. Since HTML pages and CGI scripts are not powerful enough to allow user interaction during a multi-instrument catalog search, this requirement drove the choice of Java as the main language. We also discuss performance issues, security problems and portability on different Web browsers and operating systems.

  5. The SubCons webserver: A user friendly web interface for state-of-the-art subcellular localization prediction.

    PubMed

    Salvatore, M; Shu, N; Elofsson, A

    2018-01-01

    SubCons is a recently developed method that predicts the subcellular localization of a protein. It combines predictions from four predictors using a Random Forest classifier. Here, we present the user-friendly web-interface implementation of SubCons. Starting from a protein sequence, the server rapidly predicts the subcellular localization of an individual protein. In addition, the server accepts the submission of sets of proteins either by uploading the files or programmatically by using command-line WSDL API scripts. This makes SubCons ideal for proteome-wide analyses, allowing the user to scan a whole proteome in a few days. From the web page, it is also possible to download precalculated predictions for several eukaryotic organisms. To evaluate the performance of SubCons we present a benchmark of LocTree3 and SubCons using two recent mass-spectrometry-based datasets of mouse and Drosophila proteins. The server is available at http://subcons.bioinfo.se/. © 2017 The Protein Society.

  6. SeWeR: a customizable and integrated dynamic HTML interface to bioinformatics services.

    PubMed

    Basu, M K

    2001-06-01

    Sequence analysis using Web Resources (SeWeR) is an integrated, Dynamic HTML (DHTML) interface to commonly used bioinformatics services available on the World Wide Web. It is highly customizable, extendable, platform neutral, completely server-independent and can be hosted as a web page as well as being used as stand-alone software running within a web browser.

  8. 77 FR 3017 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-20

    ... be charged on a per-Login ID basis. Firms may access C2 via either a CMI Client Application [[Page..., using different Login IDs, accessing the same CMI Client Application Server or FIX Port, allowing the firm to only pay the monthly fee once. Alternatively, a firm may use the same Login ID to access...

  9. A web-based quantitative signal detection system on adverse drug reaction in China.

    PubMed

    Li, Chanjuan; Xia, Jielai; Deng, Jianxiong; Chen, Wenge; Wang, Suzhen; Jiang, Jing; Chen, Guanquan

    2009-07-01

    To establish a web-based quantitative signal detection system for adverse drug reactions (ADRs) based on spontaneous reporting to the Guangdong province drug-monitoring database in China. Using the Microsoft Visual Basic and Active Server Pages programming languages and SQL Server 2000, a web-based system with three software modules was programmed to perform data preparation and association detection, and to generate reports. The information component (IC), an internationally recognized measure of disproportionality for quantitative signal detection, was integrated into the system, and its capacity for signal detection was tested with ADR reports collected from 1 January 2002 to 30 June 2007 in Guangdong. A total of 2,496 associations including known signals were mined from the test database. Signals (e.g., cefradine-induced hematuria) were found early by using the IC analysis. In addition, 291 drug-ADR associations were flagged for the first time in the second quarter of 2007. The system can be used for the detection of significant associations from the Guangdong drug-monitoring database and, for the first time in China, could be an extremely useful adjunct to the expert assessment of very large numbers of spontaneously reported ADRs.
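
    For readers unfamiliar with the information component, the following Python snippet illustrates one common IC formulation (the base-2 logarithm of a shrunk observed-to-expected reporting ratio). The counts are hypothetical, and this is not necessarily the exact estimator used in the Guangdong system.

        # Illustrative information component (IC) calculation from report counts;
        # a common shrunk observed/expected formulation, not the system's exact estimator.
        import math

        def information_component(n_drug_adr, n_drug, n_adr, n_total):
            expected = n_drug * n_adr / n_total          # expected co-reporting count under independence
            return math.log2((n_drug_adr + 0.5) / (expected + 0.5))

        # Hypothetical counts: 12 reports of the drug-ADR pair, 300 of the drug,
        # 150 of the ADR, 100,000 reports overall. IC > 0 suggests the pair is
        # reported more often than expected under independence.
        print(round(information_component(12, 300, 150, 100_000), 2))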

  10. MyFreePACS: a free web-based radiology image storage and viewing tool.

    PubMed

    de Regt, David; Weinberger, Ed

    2004-08-01

    We developed an easy-to-use method for central storage and subsequent viewing of radiology images for use on any PC equipped with Internet Explorer. We developed MyFreePACS, a program that uses a DICOM server to receive and store images and transmit them over the Web to the MyFreePACS Web client. The MyFreePACS Web client is a Web page that uses an ActiveX control for viewing and manipulating images. The client contains many of the tools found in modern image viewing stations including 3D localization and multiplanar reformation. The system is built entirely with free components and is freely available for download and installation from the Web at www.myfreepacs.com.

  11. Workload Characterization and Performance Implications of Large-Scale Blog Servers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Myeongjae; Kim, Youngjae; Hwang, Jeaho

    With the ever-increasing popularity of social network services (SNSs), an understanding of the characteristics of these services and their effects on the behavior of their host servers is critical. However, there has been a lack of research on the workload characterization of servers running SNS applications such as blog services. To fill this void, we empirically characterized real-world web server logs collected from one of the largest South Korean blog hosting sites for 12 consecutive days. The logs consist of more than 96 million HTTP requests and 4.7 TB of network traffic. Our analysis reveals the following: (i) The transfer size of non-multimedia files and blog articles can be modeled using a truncated Pareto distribution and a log-normal distribution, respectively; (ii) User access for blog articles does not show temporal locality, but is strongly biased towards those posted with image or audio files. We additionally discuss the potential performance improvement through clustering of small files on a blog page into contiguous disk blocks, which benefits from the observed file access patterns. Trace-driven simulations show that, on average, the suggested approach achieves 60.6% better system throughput and reduces the processing time for file access by 30.8% compared to the best performance of the Ext4 file system.
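
    As an illustration of the distribution modeling mentioned above, the following Python/SciPy sketch fits a log-normal distribution to synthetic blog-article transfer sizes; it reproduces the type of analysis, not the paper's actual data or results.

        # Sketch of the kind of distribution fitting used in the workload study:
        # fit a log-normal to blog-article transfer sizes. The data here are synthetic.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        sizes = rng.lognormal(mean=9.0, sigma=1.2, size=10_000)   # stand-in for observed article sizes (bytes)

        shape, loc, scale = stats.lognorm.fit(sizes, floc=0)      # fix location at 0
        print(f"fitted sigma={shape:.2f}, median={scale:.0f} bytes")

        # Quick goodness-of-fit check against the fitted distribution.
        ks = stats.kstest(sizes, "lognorm", args=(shape, loc, scale))
        print(f"KS statistic={ks.statistic:.3f}")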

  12. GeneSilico protein structure prediction meta-server.

    PubMed

    Kurowski, Michal A; Bujnicki, Janusz M

    2003-07-01

    Rigorous assessments of protein structure prediction have demonstrated that fold recognition methods can identify remote similarities between proteins when standard sequence search methods fail. It has been shown that the accuracy of predictions is improved when refined multiple sequence alignments are used instead of single sequences and if different methods are combined to generate a consensus model. There are several meta-servers available that integrate protein structure predictions performed by various methods, but they do not allow for submission of user-defined multiple sequence alignments and they seldom offer confidentiality of the results. We developed a novel WWW gateway for protein structure prediction, which combines the useful features of other meta-servers available, but with much greater flexibility of the input. The user may submit an amino acid sequence or a multiple sequence alignment to a set of methods for primary, secondary and tertiary structure prediction. Fold-recognition results (target-template alignments) are converted into full-atom 3D models and the quality of these models is uniformly assessed. A consensus between different FR methods is also inferred. The results are conveniently presented on-line on a single web page over a secure, password-protected connection. The GeneSilico protein structure prediction meta-server is freely available for academic users at http://genesilico.pl/meta.

  13. GeneSilico protein structure prediction meta-server

    PubMed Central

    Kurowski, Michal A.; Bujnicki, Janusz M.

    2003-01-01

    Rigorous assessments of protein structure prediction have demonstrated that fold recognition methods can identify remote similarities between proteins when standard sequence search methods fail. It has been shown that the accuracy of predictions is improved when refined multiple sequence alignments are used instead of single sequences and if different methods are combined to generate a consensus model. There are several meta-servers available that integrate protein structure predictions performed by various methods, but they do not allow for submission of user-defined multiple sequence alignments and they seldom offer confidentiality of the results. We developed a novel WWW gateway for protein structure prediction, which combines the useful features of other meta-servers available, but with much greater flexibility of the input. The user may submit an amino acid sequence or a multiple sequence alignment to a set of methods for primary, secondary and tertiary structure prediction. Fold-recognition results (target-template alignments) are converted into full-atom 3D models and the quality of these models is uniformly assessed. A consensus between different FR methods is also inferred. The results are conveniently presented on-line on a single web page over a secure, password-protected connection. The GeneSilico protein structure prediction meta-server is freely available for academic users at http://genesilico.pl/meta. PMID:12824313

  14. SCOPE: a web server for practical de novo motif discovery.

    PubMed

    Carlson, Jonathan M; Chakravarty, Arijit; DeZiel, Charles E; Gross, Robert H

    2007-07-01

    SCOPE is a novel parameter-free method for the de novo identification of potential regulatory motifs in sets of coordinately regulated genes. The SCOPE algorithm combines the output of three component algorithms, each designed to identify a particular class of motifs. Using an ensemble learning approach, SCOPE identifies the best candidate motifs from its component algorithms. In tests on experimentally determined datasets, SCOPE identified motifs with a significantly higher level of accuracy than a number of other web-based motif finders run with their default parameters. Because SCOPE has no adjustable parameters, the web server has an intuitive interface, requiring only a set of gene names or FASTA sequences and a choice of species. The most significant motifs found by SCOPE are displayed graphically on the main results page with a table containing summary statistics for each motif. Detailed motif information, including the sequence logo, PWM, consensus sequence and specific matching sites can be viewed through a single click on a motif. SCOPE's efficient, parameter-free search strategy has enabled the development of a web server that is readily accessible to the practising biologist while providing results that compare favorably with those of other motif finders. The SCOPE web server is at .

  15. Development of new on-line statistical program for the Korean Society for Radiation Oncology

    PubMed Central

    Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Choi, Eun Kyung; Cho, Kwan Ho

    2015-01-01

    Purpose To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. Materials and Methods The statistical program is web based. The directory was placed in a sub-folder of the KOSRO homepage and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is used as the database (DB) server and PHP as the scripting language. Each ID and password is controlled independently, and all screen pages for data input or analysis are designed to be user friendly. Scroll-down menus are used extensively for user convenience and consistency of data analysis. Results Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input for each hospital. Backups of the data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. Conclusion The new on-line statistical program was developed to collect data from departments of radiation oncology nationwide. Its intuitive screens and consistent input structure are expected to encourage member hospitals to enter data, and the annual statistics should be a cornerstone of advances in radiation oncology. PMID:26157684

  16. Development of new on-line statistical program for the Korean Society for Radiation Oncology.

    PubMed

    Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Shin, Kyung Hwan; Choi, Eun Kyung; Cho, Kwan Ho

    2015-06-01

    To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. The statistical program is web based. The directory was placed in a sub-folder of the KOSRO homepage and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is used as the database (DB) server and PHP as the scripting language. Each ID and password is controlled independently, and all screen pages for data input or analysis are designed to be user friendly. Scroll-down menus are used extensively for user convenience and consistency of data analysis. Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input for each hospital. Backups of the data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. The new on-line statistical program was developed to collect data from departments of radiation oncology nationwide. Its intuitive screens and consistent input structure are expected to encourage member hospitals to enter data, and the annual statistics should be a cornerstone of advances in radiation oncology.

  17. BEAM web server: a tool for structural RNA motif discovery.

    PubMed

    Pietrosanto, Marco; Adinolfi, Marta; Casula, Riccardo; Ausiello, Gabriele; Ferrè, Fabrizio; Helmer-Citterich, Manuela

    2018-03-15

    RNA structural motif finding is a relevant problem that becomes computationally hard when working on high-throughput data (e.g. eCLIP, PAR-CLIP), often represented by thousands of RNA molecules. Currently, the BEAM server is the only web tool capable of handling tens of thousands of RNAs as input with a motif discovery procedure that is only limited by the current secondary structure prediction accuracies. The recently developed method BEAM (BEAr Motifs finder) can analyze tens of thousands of RNA molecules and identify RNA secondary structure motifs associated with a measure of their statistical significance. BEAM is extremely fast thanks to the BEAR encoding, which transforms each RNA secondary structure into a string of characters. BEAM also exploits the evolutionary knowledge contained in a substitution matrix of secondary structure elements, extracted from the RFAM database of families of homologous RNAs. The BEAM web server has been designed to streamline data pre-processing by automatically handling folding and encoding of RNA sequences, giving users a choice of the preferred folding program. The server provides an intuitive and informative results page with the list of secondary structure motifs identified, the logo of each motif, its significance, a graphic representation and information about its position in the RNA molecules sharing it. The web server is freely available at http://beam.uniroma2.it/ and it is implemented in NodeJS and Python with all major browsers supported. marco.pietrosanto@uniroma2.it. Supplementary data are available at Bioinformatics online.

  18. TMFoldWeb: a web server for predicting transmembrane protein fold class.

    PubMed

    Kozma, Dániel; Tusnády, Gábor E

    2015-09-17

    Here we present TMFoldWeb, the web server implementation of TMFoldRec, a transmembrane protein fold recognition algorithm. TMFoldRec uses statistical potentials and utilizes topology filtering and a gapless threading algorithm. It ranks template structures, selects the most likely candidates, and estimates the reliability of the obtained lowest-energy model. The statistical potential was developed in a maximum likelihood framework on a representative set of the PDBTM database. According to the benchmark test, TMFoldRec correctly predicts the fold class for a given transmembrane protein sequence about 77% of the time. An intuitive web interface has been developed for the recently published TMFoldRec algorithm. The query sequence goes through a pipeline of topology prediction and systematic sequence-to-structure alignment (threading). Resulting templates are ordered by energy and reliability values and are colored according to their significance level. Besides the graphical interface, programmatic access is available as well, via a direct interface for developers or for submitting genome-wide data sets. The TMFoldWeb web server is unique and currently the only web server that is able to predict the fold class of transmembrane proteins while assigning reliability scores to the prediction. This method is prepared for genome-wide analysis with its easy-to-use interface, informative result page and programmatic access. The web server, as well as the molecule viewer, is responsive and fully compatible with current tablets and mobile devices.

  19. SITEHOUND-web: a server for ligand binding site identification in protein structures.

    PubMed

    Hernandez, Marylens; Ghersi, Dario; Sanchez, Roberto

    2009-07-01

    SITEHOUND-web (http://sitehound.sanchezlab.org) is a binding-site identification server powered by the SITEHOUND program. Given a protein structure in PDB format, SITEHOUND-web will identify regions of the protein characterized by favorable interactions with a probe molecule. These regions correspond to putative ligand binding sites. Depending on the probe used in the calculation, sites with preference for different ligands will be identified. Currently, a carbon probe for identification of binding sites for drug-like molecules, and a phosphate probe for phosphorylated ligands (ATP, phosphopeptides, etc.) have been implemented. SITEHOUND-web will display the results in HTML pages including an interactive 3D representation of the protein structure and the putative sites using the Jmol Java applet. Various downloadable data files are also provided for offline data analysis.

  20. A SPDS Node to Support the Systematic Interpretation of Cosmic Ray Data

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The purpose of this project was to establish and maintain a Space Physics Data System (SPDS) node that supports the analysis and interpretation of current and future galactic cosmic ray (GCR) measurements by (1) providing on-line databases relevant to GCR propagation studies; (2) providing other on-line services, such as anonymous FTP access, mail list service and pointers to e-mail address books, to support the cosmic ray community; (3) providing a mechanism for those in the community who might wish to submit similar contributions for public access; (4) maintaining the node to assure that the databases remain current; and (5) investigating other possibilities, such as CD-ROM, for public dissemination of the data products. Shortly after the original grant to support these activities was established at Louisiana State University, a detailed study of alternate choices for the node hardware was initiated. The chosen hardware was an Apple Workgroup Server 9150/120 consisting of a 120 MHz PowerPC 601 processor, 32 MB of memory, two 1 GB disks and one 2 GB disk. This hardware was ordered and installed and has been operating reliably ever since. A preliminary version of the database server was available during the first-year effort and was used as part of the very successful SPDS demonstration during the International Cosmic Ray Conference in Rome, Italy. For this server version we were able to establish the HTML and anonymous FTP server software, develop a Web page structure which can be easily modified to include new items, provide an on-line database of charge-changing total cross sections, include the cross section prediction software of Silberberg & Tsao as well as Webber, Kish and Schrier for download access, and provide an on-line bibliography of the cross section measurement references by the Transport Collaboration. The preliminary version of this SPDS Cosmic Ray node was examined by members of the C&H SPDS committee and returned comments were used to refine the implementation.

  1. Server Level Analysis of Network Operation Utilizing System Call Data

    DTIC Science & Technology

    2010-09-25

    Fragment of a numbered list of attack payloads: ... Server DLL Inject; 6. Executable Download and Execute; 7. Execute Command; 8. Execute net user /ADD; 9. PassiveX ActiveX Inject Meterpreter Payload; 10. PassiveX ActiveX Inject VNC Server Payload; 11. PassiveX ActiveX Injection Payload; 12. Recv Tag Findsock Meterpreter; 13. Recv Tag Findsock ...

  2. How Does One Manage Information? Making Sense of the Information Being Received

    DTIC Science & Technology

    2012-12-01

    to manage. (Photo by PFC Franklin E. Mercado.) ...in choosing the right application. Application software is written to perform a specific task or function, and it becomes increasingly difficult...common data, virtualizing machines for all software (using one computer/server, but dividing it into logical segments), and standardizing

  3. WebScope: A New Tool for Fusion Data Analysis and Visualization

    NASA Astrophysics Data System (ADS)

    Yang, Fei; Dang, Ningning; Xiao, Bingjia

    2010-04-01

    A visualization tool was developed for the web browser, based on Java applets embedded into HTML pages, in order to provide worldwide access to the EAST experimental data. It can display data from various trees on different servers in a single panel. With WebScope, it is easier to compare different data sources and to perform simple calculations across them.

  4. Real-Time Speaker Detection for User-Device Binding

    DTIC Science & Technology

    2010-12-01

    The roll-out of commercial wireless networks continues to rise worldwide...in a secured facility. It could also be connected to the call server via a Virtual Private Network (VPN) or public lines if security is not a top...communications network [25]. Yet, James Arden Barnett, Jr., Chief of the Public Safety and Homeland Security Bureau, argues that emergency communications

  5. Web-based Tool Suite for Plasmasphere Information Discovery

    NASA Astrophysics Data System (ADS)

    Newman, T. S.; Wang, C.; Gallagher, D. L.

    2005-12-01

    A suite of tools that enable discovery of terrestrial plasmasphere characteristics from NASA IMAGE Extreme Ultraviolet (EUV) images is described. The tool suite is web-accessible, allowing easy remote access without the need for any software installation on the user's computer. The features supported by the tool include reconstruction of the plasmasphere plasma density distribution from a short sequence of EUV images, semi-automated selection of the plasmapause boundary in an EUV image, and mapping of the selected boundary to the geomagnetic equatorial plane. EUV image upload and result download is also supported. The tool suite's plasmapause mapping feature is achieved via the Roelof and Skinner (2000) Edge Algorithm. The plasma density reconstruction is achieved through a tomographic technique that exploits physical constraints to allow for a moderate resolution result. The tool suite's software architecture uses Java Server Pages (JSP) and Java Applets on the front side for user-software interaction and Java Servlets on the server side for task execution. The compute-intensive components of the tool suite are implemented in C++ and invoked by the server via Java Native Interface (JNI).

  6. Accessing NASA Technology with the World Wide Web

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bianco, David J.

    1995-01-01

    NASA Langley Research Center (LaRC) began using the World Wide Web (WWW) in the summer of 1993, becoming the first NASA installation to provide a Center-wide home page. This coincided with a reorganization of LaRC to provide a more concentrated focus on technology transfer to both aerospace and non-aerospace industry. Use of WWW and NCSA Mosaic not only provides automated information dissemination, but also allows for the implementation, evolution and integration of many technology transfer and technology awareness applications. This paper describes several of these innovative applications, including the on-line presentation of the entire Technology OPportunities Showcase (TOPS), an industrial partnering showcase that exists on the Web long after the actual 3-day event ended. The NASA Technical Report Server (NTRS) provides uniform access to many logically similar, yet physically distributed NASA report servers. WWW is also the foundation of the Langley Software Server (LSS), an experimental software distribution system which will distribute LaRC-developed software. In addition to the more formal technology distribution projects, WWW has been successful in connecting people with technologies and people with other people.

  7. A FPGA embedded web server for remote monitoring and control of smart sensors networks.

    PubMed

    Magdaleno, Eduardo; Rodríguez, Manuel; Pérez, Fernando; Hernández, David; García, Enrique

    2013-12-27

    This article describes the implementation of a web server using an embedded Altera NIOS II IP core, a general-purpose, configurable RISC processor embedded in a Cyclone FPGA. The processor runs the μCLinux operating system to support a Boa web server serving dynamic pages via the Common Gateway Interface (CGI). The FPGA is configured to act as the master node of a network and to control and monitor a network of smart sensors or instruments. To make the system fully functional, the FPGA also includes an implementation of the time-triggered protocol (TTP/A). The implemented master node thus has two interfaces: the web server, which acts as the Internet interface, and the TTP/A interface, which controls the network. This protocol is widely used to connect smart sensors, actuators, and microsystems in embedded real-time systems in different application domains (e.g., industrial, automotive, domotic), although it can easily be replaced by any other protocol because of the inherent characteristics of the FPGA-based technology.
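
    The dynamic pages in this system are produced through CGI under the Boa web server. As a small illustration of that mechanism only (not the authors' code, and with a stubbed sensor read in place of the TTP/A network query), a CGI script might look like the following Python sketch.

        #!/usr/bin/env python3
        # Minimal CGI sketch of the dynamic-page mechanism described; read_sensor()
        # is a hypothetical stand-in for querying a smart-sensor node over TTP/A.
        import os, random

        def read_sensor(node_id):
            # Placeholder: in the real system this would query the sensor network.
            return round(20 + 10 * random.random(), 2)

        node = os.environ.get("QUERY_STRING", "node=1").split("=", 1)[-1]
        value = read_sensor(node)

        print("Content-Type: text/html")
        print()
        print(f"<html><body><h1>Node {node}</h1><p>Reading: {value}</p></body></html>")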

  8. A FPGA Embedded Web Server for Remote Monitoring and Control of Smart Sensors Networks

    PubMed Central

    Magdaleno, Eduardo; Rodríguez, Manuel; Pérez, Fernando; Hernández, David; García, Enrique

    2014-01-01

    This article describes the implementation of a web server using an embedded Altera NIOS II IP core, a general-purpose, configurable RISC processor embedded in a Cyclone FPGA. The processor runs the μCLinux operating system to support a Boa web server serving dynamic pages via the Common Gateway Interface (CGI). The FPGA is configured to act as the master node of a network and to control and monitor a network of smart sensors or instruments. To make the system fully functional, the FPGA also includes an implementation of the time-triggered protocol (TTP/A). The implemented master node thus has two interfaces: the web server, which acts as the Internet interface, and the TTP/A interface, which controls the network. This protocol is widely used to connect smart sensors, actuators, and microsystems in embedded real-time systems in different application domains (e.g., industrial, automotive, domotic), although it can easily be replaced by any other protocol because of the inherent characteristics of the FPGA-based technology. PMID:24379047

  9. Viewing ISS Data in Real Time via the Internet

    NASA Technical Reports Server (NTRS)

    Myers, Gerry; Chamberlain, Jim

    2004-01-01

    EZStream is a computer program that enables authorized users at diverse terrestrial locations to view, in real time, data generated by scientific payloads aboard the International Space Station (ISS). The only computation/communication resource needed for use of EZStream is a computer equipped with standard Web-browser software and a connection to the Internet. EZStream runs in conjunction with the TReK software, described in a prior NASA Tech Briefs article, that coordinates multiple streams of data for the ground communication system of the ISS. EZStream includes server components that interact with TReK within the ISS ground communication system and client components that reside in the users' remote computers. Once an authorized client has logged in, a server component of EZStream pulls the requested data from a TReK application-program interface and sends the data to the client. Future EZStream enhancements will include (1) extensions that enable the server to receive and process arbitrary data streams on its own and (2) a Web-based graphical-user-interface-building subprogram that enables a client who lacks programming expertise to create customized display Web pages.

  10. A generic minimization random allocation and blinding system on web.

    PubMed

    Cai, Hongwei; Xia, Jielai; Xu, Dezhong; Gao, Donghuai; Yan, Yongping

    2006-12-01

    Minimization is a dynamic randomization method for clinical trials. Although recommended by many researchers, the use of minimization has seldom been reported in randomized trials, mainly because of the controversy surrounding the validity of conventional analyses and the method's complexity of implementation. However, both the statistical and clinical validity of minimization were demonstrated in recent studies. A minimization random allocation system integrated with a blinding function, which could facilitate the implementation of this method in general clinical trials, has not previously been reported. SYSTEM OVERVIEW: The system is a web-based random allocation system using the Pocock and Simon minimization method. It supports multiple treatment arms within a trial, multiple simultaneous trials, and blinding without further programming. The system was constructed with a generic database schema design method, the Pocock and Simon minimization method, and a blinding method. It was coded in the Microsoft Visual Basic and Active Server Pages (ASP) programming languages, and all datasets were managed in a Microsoft SQL Server database. Some critical programming code is also provided. SIMULATIONS AND RESULTS: Two clinical trials were simulated simultaneously to test the system's applicability. Balanced groups as well as blinded allocation results were achieved in both trials. Practical considerations for the minimization method, and the benefits, general applicability and drawbacks of the technique implemented in this system, are discussed. Promising features of the proposed system are also summarized.
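
    To make the allocation logic concrete, the following Python sketch implements a plain Pocock and Simon minimization step (marginal imbalance measured as the range across arms, with a biased coin). The factors, arms and probability are hypothetical, and this is an illustration of the method rather than the authors' ASP/SQL Server implementation.

        # Illustrative Pocock-Simon minimization: assign each new subject to the arm
        # that minimizes summed marginal imbalance across prognostic factors.
        import random
        from collections import defaultdict

        ARMS = ["A", "B"]
        FACTORS = ["sex", "age_group", "centre"]          # hypothetical factors
        counts = defaultdict(int)                          # (arm, factor, level) -> n

        def imbalance_if_assigned(arm, subject):
            total = 0
            for f in FACTORS:
                level = subject[f]
                hypothetical = {a: counts[(a, f, level)] + (1 if a == arm else 0) for a in ARMS}
                total += max(hypothetical.values()) - min(hypothetical.values())  # range as imbalance
            return total

        def assign(subject, p_best=0.8):
            scores = {arm: imbalance_if_assigned(arm, subject) for arm in ARMS}
            best = min(scores, key=scores.get)
            arm = best if random.random() < p_best else random.choice([a for a in ARMS if a != best])
            for f in FACTORS:
                counts[(arm, f, subject[f])] += 1          # update marginal counts
            return arm

        print(assign({"sex": "F", "age_group": "60+", "centre": "01"}))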

  11. Early Student Support for Application of Advanced Multi-Core Processor Technologies to Oceanographic Research

    DTIC Science & Technology

    2016-05-07

    ...Early Student Support for Application of Advanced Multi-Core Processor Technologies to Oceanographic Research (grant N00014-12-1-0298)...communications protocols (i.e., UART, I2C, and SPI), through the handing off of the data to the server APIs. By providing a common set of tools

  12. Delayed Instantiation Bulk Operations for Management of Distributed, Object-Based Storage Systems

    DTIC Science & Technology

    2009-08-01

    source and destination object sets, while they have attribute pages to indicate that history. Fourth, we allow for operations to occur on any objects...client dialogue to the PostgreSQL database where server-side functions implement the service logic for the requests. The translation is done...to satisfy client requests, and performs delayed instantiation bulk operations. It is built around a PostgreSQL database with tables for storing

  13. Efficient Server-Aided Secure Two-Party Function Evaluation with Applications to Genomic Computation

    DTIC Science & Technology

    2016-07-14

    of the important properties of secure computation. In particular, it is known that full fairness cannot be achieved in the case of two-party com...Jakobsen, J. Nielsen, and C. Orlandi. A framework for outsourcing of secure computation. In ACM Workshop on Cloud Computing Security (CCSW), pages...Function Evaluation with Applications to Genomic Computation Abstract: Computation based on genomic data is becoming increasingly popular today, be it

  14. The EarthServer project: Exploiting Identity Federations, Science Gateways and Social and Mobile Clients for Big Earth Data Analysis

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Messina, Antonio; Pappalardo, Marco; Passaro, Gianluca

    2013-04-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" - based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. Six Lighthouse Applications are being established in EarthServer, each of which poses distinct challenges on Earth Data Analytics: Cryospheric Science, Airborne Science, Atmospheric Science, Geology, Oceanography, and Planetary Science. Altogether, they cover all Earth Science domains; the Planetary Science use case has been added to challenge concepts and standards in non-standard environments. In addition, EarthLook (maintained by Jacobs University) showcases use of OGC standards in 1D through 5D use cases. In this contribution we will report on the first applications integrated in the EarthServer Science Gateway and on the clients for mobile appliances developed to access them. We will also show how federated and social identity services can allow Big Earth Data Providers to expose their data in a distributed environment keeping a strict and fine-grained control on user authentication and authorisation. The degree of fulfilment of the EarthServer implementation with the recommendations made in the recent TERENA Study on AAA Platforms For Scientific Resources in Europe (https://confluence.terena.org/display/aaastudy/AAA+Study+Home+Page) will also be assessed.

  15. TOPS On-Line: Automating the Construction and Maintenance of HTML Pages

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.

    1994-01-01

    After the Technology Opportunities Showcase (TOPS), in October, 1993, Langley Research Center's (LaRC) Information Systems Division (ISD) accepted the challenge to preserve the investment in information assembled in the TOPS exhibits by establishing a data base. Following the lead of several people at LaRC and others around the world, the HyperText Transport Protocol (HTTP) server and Mosaic were the obvious tools of choice for implementation. Initially, some TOPS exhibitors began the conventional approach of constructing HyperText Markup Language (HTML) pages of their exhibits as input to Mosaic. Considering the number of pages to construct, a better approach was conceived that would automate the construction of pages. This approach allowed completion of the data base construction in a shorter period of time using fewer resources than would have been possible with the conventional approach. It also provided flexibility for the maintenance and enhancement of the data base. Since that time, this approach has been used to automate construction of other HTML data bases. Through these experiences, it is concluded that the most effective use of the HTTP/Mosaic technology will require better tools and techniques for creating, maintaining and managing the HTML pages. The development and use of these tools and techniques are the subject of this document.

  16. The efficacy of a Web-based counterargument tutor.

    PubMed

    Wolfe, Christopher R; Britt, M Anne; Petrovic, Melina; Albrecht, Michael; Kopp, Kristopher

    2009-08-01

    In two experiments, we developed and tested an interactive Web-based tutor to help students identify and evaluate counterarguments. In Experiment 1, we determined the extent to which high- and low-argumentation-ability participants were able to identify counterarguments. We tested the effectiveness of having participants read didactic text regarding counterarguments and highlight claims. Both preparations had some positive effects that were often limited to high-ability participants. The Web-based intervention included interactive exercises on identifying and using counterarguments. Web-based presentation was state-driven, using a Java Server Pages page. As participants progressively identified argument elements, the page changed display state and presented feedback by checking what the user clicked against elements that we had coded in XML beforehand. Instructions and feedback strings were indexed by state, so that changing state selected new text to display. In Experiment 2, the tutor was effective in teaching participants to identify counterarguments, recognize responses, and determine whether counterarguments were rebutted, dismissed, or conceded.
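
    A small sketch of the state-driven checking described above: the user's selection is compared against argument elements coded in XML, and the feedback string is selected by the outcome. The XML schema, element types and feedback text here are invented for illustration; the original tutor used Java Server Pages.

        # Sketch: compare a clicked span against XML-coded argument elements and
        # return feedback keyed by the result. Schema and strings are hypothetical.
        import xml.etree.ElementTree as ET

        CODED = ET.fromstring("""
        <passage>
          <element type="claim">School uniforms improve focus.</element>
          <element type="counterargument">However, uniforms may stifle self-expression.</element>
        </passage>""")

        FEEDBACK = {
            ("counterargument", True):  "Correct: that sentence argues against the main claim.",
            ("counterargument", False): "Not quite: look for a sentence that opposes the claim.",
        }

        def check_selection(expected_type, clicked_text):
            # Find the element coded with the expected type and compare texts.
            target = next(e.text for e in CODED if e.get("type") == expected_type)
            correct = clicked_text.strip() == target.strip()
            return correct, FEEDBACK[(expected_type, correct)]

        print(check_selection("counterargument", "However, uniforms may stifle self-expression."))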

  17. PACS quality control and automatic problem notifier

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. The nature of system failures ranges from slow deterioration of function, as seen in the loss of monitor luminance, through sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self-checks and cross-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment is correct. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established as are expected for other equipment used in the diagnostic process.
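
    As an illustration of the automatic problem notifier idea (periodic self-checks that send a page carrying a diagnostic code when a check fails), the following Python sketch uses made-up check names, thresholds, a hypothetical log path, and a print stub in place of the pager gateway.

        # Sketch of the notifier pattern: named checks run on a schedule; any failure
        # triggers a page with its diagnostic code. All specifics are assumptions.
        import shutil, time

        CHECKS = {
            "D01": lambda: shutil.disk_usage("/").free > 5 * 1024**3,                   # archive disk space
            "L02": lambda: "ERROR" not in open("/var/log/pacs.log").read()[-10_000:],   # recent log errors
        }

        def send_page(code):
            # Stand-in for the pager gateway; the real system sends a numeric code.
            print(f"PAGE -> on-call PACS engineer: diagnostic code {code}")

        def run_checks_forever(interval_s=300):
            while True:
                for code, check in CHECKS.items():
                    try:
                        ok = check()
                    except OSError:
                        ok = False            # an unreadable resource counts as a failure
                    if not ok:
                        send_page(code)
                time.sleep(interval_s)

        # run_checks_forever()  # would loop indefinitely in a real deployment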

  18. Wireless Acoustic Measurement System

    NASA Technical Reports Server (NTRS)

    Anderson, Paul D.; Dorland, Wade D.; Jolly, Ronald L.

    2007-01-01

    A prototype wireless acoustic measurement system (WAMS) is one of two main subsystems of the Acoustic Prediction/Measurement Tool, which comprises software, acoustic instrumentation, and electronic hardware combined to afford integrated capabilities for predicting and measuring noise emitted by rocket and jet engines. The other main subsystem is described in the article on page 8. The WAMS includes analog acoustic measurement instrumentation and analog and digital electronic circuitry combined with computer wireless local-area networking to enable (1) measurement of sound-pressure levels at multiple locations in the sound field of an engine under test and (2) recording and processing of the measurement data. At each field location, the measurements are taken by a portable unit, denoted a field station. There are ten field stations, each of which can take two channels of measurements. Each field station is equipped with two instrumentation microphones, a micro-ATX computer, a wireless network adapter, an environmental enclosure, a directional radio antenna, and a battery power supply. The environmental enclosure shields the computer from weather and from extreme acoustically induced vibrations. The power supply is based on a marine-service lead-acid storage battery that has enough capacity to support operation for as long as 10 hours. A desktop computer serves as a control server for the WAMS. The server is connected to a wireless router for communication with the field stations via a wireless local-area network that complies with wireless-network standard 802.11b of the Institute of Electrical and Electronics Engineers. The router and the wireless network adapters are controlled by use of Linux-compatible driver software. The server runs custom Linux software for synchronizing the recording of measurement data in the field stations. The software includes a module that provides an intuitive graphical user interface through which an operator at the control server can control the operations of the field stations for calibration and for recording of measurement data. A test engineer positions and activates the WAMS. The WAMS automatically establishes the wireless network. Next, the engineer performs pretest calibrations. Then the engineer executes the test and measurement procedures. After the test, the raw measurement files are copied and transferred, through the wireless network, to a hard disk in the control server. Subsequently, the data are processed into 1/3-octave spectrograms.

  19. miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.

    PubMed

    Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M

    2009-07-01

    Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each deep-sequencing experiment. In this context, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point, as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.

  20. YODA++: A proposal for a semi-automatic space mission control

    NASA Astrophysics Data System (ADS)

    Casolino, M.; de Pascale, M. P.; Nagni, M.; Picozza, P.

    YODA++ is a proposal for a semi-automated data handling and analysis system for the PAMELA space experiment. The core of the routines has been developed to process a stream of raw data downlinked from the Resurs DK1 satellite (housing PAMELA) to the ground station in Moscow. Raw data consist of scientific data and are complemented by housekeeping information. Housekeeping information will be analyzed within a short time from download (1 h) in order to monitor the status of the experiment and to inform mission acquisition planning. A prototype for the data visualization will run on an Apache Tomcat web application server, providing an off-line analysis tool accessible through a browser and part of the code for system maintenance. Development of the data retrieval is in the production phase, while a GUI for human-friendly monitoring and a JavaServer Pages/JavaServer Faces (JSP/JSF) web application facility are in a preliminary phase. On a longer timescale (1-3 h from download) scientific data are analyzed. The data storage core will be a mix of CERN's ROOT file structure and MySQL as a relational database. YODA++ is currently being used in the on-ground integration and testing of PAMELA data.

  1. OPserver: opacities and radiative accelerations on demand

    NASA Astrophysics Data System (ADS)

    Mendoza, C.; González, J.; Seaton, M. J.; Buerger, P.; Bellorín, A.; Meléndez, M.; Rodríguez, L. S.; Delahaye, F.; Zeippen, C. J.; Palacios, E.; Pradhan, A. K.

    2009-05-01

    We report on developments carried out within the Opacity Project (OP) to upgrade atomic database services to comply with e-infrastructure requirements. We give a detailed description of an interactive, online server for astrophysical opacities, referred to as OPserver, to be used in sophisticated stellar modelling where Rosseland mean opacities and radiative accelerations are computed at every depth point and each evolution cycle. This is crucial, for instance, in chemically peculiar stars and in the exploitation of the new asteroseismological data. OPserver, downloadable with the new OPCD_3.0 release from the Centre de Données Astronomiques de Strasbourg, France, computes mean opacities and radiative data for arbitrary chemical mixtures from the OP monochromatic opacities. It is essentially a client-server network restructuring and optimization of the suite of codes included in the earlier OPCD_2.0 release. The server can be installed locally or, alternatively, accessed remotely from the Ohio Supercomputer Center, Columbus, Ohio, USA. The client is an interactive web page or a subroutine library that can be linked to the user code. The suitability of this scheme in grid computing environments is emphasized, and its extension to other atomic database services for astrophysical purposes is discussed.

  2. A Configurable Internet Telemetry Server / Remote Client System

    NASA Astrophysics Data System (ADS)

    Boyd, W. T.; Hopkins, A.; Abbott, M. J.; Girouard, F. R.

    2000-05-01

    We have created a general, object-oriented software framework in Java for remote viewing of telemetry over the Internet. The general system consists of a data server and a remote client that can be extended by any project that uses telemetry to implement a remote telemetry viewer. We have implemented a system that serves live telemetry from NASA's Extreme Ultraviolet Explorer satellite and a client that can display the telemetry at a remote location. An authenticated user may run a standalone graphical or text-based client, or an applet on a web page, to view EUVE telemetry. In the case of the GUI client, a user can build displays to his/her own specifications using a GUI view-building tool. This work was supported by grants NCC2-947 and NCC2-966 from NASA Ames Research Center and grant JPL-960684 from NASA Jet Propulsion Laboratory.

  3. Development of Web-Based Menu Planning Support System and its Solution Using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Kashima, Tomoko; Matsumoto, Shimpei; Ishii, Hiroaki

    2009-10-01

    Lifestyle-related diseases have recently become an object of public concern, while at the same time people are becoming more health conscious. We assume that insufficient circulation of knowledge about dietary habits is an essential factor contributing to lifestyle-related diseases. This paper focuses on everyday meals and proposes a well-balanced menu planning system as a preventive measure against lifestyle-related diseases. The system is developed with a Web-based front end and provides multi-user services and menu information sharing capabilities similar to social networking services (SNS). The system is implemented on a Web server running Apache (HTTP server software), MySQL (database management system), and PHP (scripting language for dynamic Web pages). For the menu planning, a genetic algorithm is applied by formulating the problem as multidimensional 0-1 integer programming.
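
    As a toy illustration of the formulation described (a menu as a 0-1 vector over candidate dishes, optimized by a genetic algorithm against daily nutrient targets), the following Python sketch uses invented dishes, targets and GA settings; it is not the authors' system.

        # Toy GA over 0-1 menu vectors; fitness penalizes deviation from nutrient targets.
        import random

        DISHES = [  # (name, kcal, protein g, salt g) -- illustrative values
            ("rice", 250, 4, 0.0), ("grilled fish", 200, 22, 1.2), ("miso soup", 60, 4, 1.8),
            ("salad", 80, 2, 0.3), ("tofu", 120, 10, 0.1), ("tempura", 450, 12, 1.0),
        ]
        TARGET = {"kcal": 650, "protein": 30, "salt": 2.5}

        def fitness(menu):  # menu is a tuple of 0/1 choices, one per dish
            kcal = sum(d[1] * x for d, x in zip(DISHES, menu))
            prot = sum(d[2] * x for d, x in zip(DISHES, menu))
            salt = sum(d[3] * x for d, x in zip(DISHES, menu))
            return -(abs(kcal - TARGET["kcal"]) + 10 * abs(prot - TARGET["protein"])
                     + 100 * max(0.0, salt - TARGET["salt"]))

        def evolve(pop_size=30, generations=100, p_mut=0.1):
            pop = [tuple(random.randint(0, 1) for _ in DISHES) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[: pop_size // 2]                         # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, len(DISHES))             # one-point crossover
                    child = list(a[:cut] + b[cut:])
                    for i in range(len(child)):                        # bit-flip mutation
                        if random.random() < p_mut:
                            child[i] ^= 1
                    children.append(tuple(child))
                pop = parents + children
            return max(pop, key=fitness)

        best = evolve()
        print([d[0] for d, x in zip(DISHES, best) if x])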

  4. XMM-Newton Remote Interface to Science Analysis Software: First Public Version

    NASA Astrophysics Data System (ADS)

    Ibarra, A.; Gabriel, C.

    2011-07-01

    We present the first public beta release of the XMM-Newton Remote Interface to Science Analysis (RISA) software, available through the official XMM-Newton web pages. In a nutshell, RISA is a web-based application that encapsulates the XMM-Newton data analysis software. The client identifies observations and creates XMM-Newton workflows. The server processes the client request, creates job templates and sends the jobs to a computer. RISA has been designed to serve non-expert and professional XMM-Newton users at the same time. Thanks to the predefined threads, non-expert users can easily produce light curves and spectra. Expert users, on the other hand, can use the full parameter interface to tune their own analysis. In both cases, the VO-compliant client/server design frees users from having to install any specific software to analyze XMM-Newton data.

  5. High-speed network for delivery of education-on-demand

    NASA Astrophysics Data System (ADS)

    Cordero, Carlos; Harris, Dale; Hsieh, Jeff

    1996-03-01

    A project to investigate the feasibility of delivering on-demand distance education to the desktop, known as the Asynchronous Distance Education ProjecT (ADEPT), is presently being carried out. A set of Stanford engineering classes is digitized on PC, Macintosh, and UNIX platforms, and is made available on servers. Students on campus and in industry may then access class material on these servers via local and metropolitan area networks. Students can download class video and audio, encoded in QuickTime™ and Show-Me TV™ formats, via file-transfer protocol or the World Wide Web. Alternatively, they may stream a vector-quantized version of the class directly from a server for real-time playback. Students may also download PostScript™ and Adobe Acrobat™ versions of class notes. Off-campus students may connect to ADEPT servers via the Internet, the Silicon Valley Test Track (SVTT), or the Bay-Area Gigabit Network (BAGNet). The SVTT and BAGNet are high-speed metropolitan-area networks, spanning the Bay Area, which provide IP access over asynchronous transfer mode (ATM). Student interaction is encouraged through news groups, electronic mailing lists, and an ADEPT home page. Issues related to having multiple platforms and interoperability are examined in this paper. The ramifications of providing a reliable service are discussed. System performance and the parameters that affect it are then described. Finally, future work on expanding ATM access, real-time delivery of classes, and enhanced student interaction is described.

  6. Launch Support Video Site

    NASA Technical Reports Server (NTRS)

    OFarrell, Zachary L.

    2013-01-01

    The goal of this project is to create a website that displays video, a countdown clock, and event times to customers during launches, without needing to be connected to the internal operations network. The project also requires that the delay in the clock and event data be less than two seconds. The system has two parts: the webpage, which displays the data and videos to the user, and a server that sends clock and event data to the webpage. The webpage is written in HTML with CSS and JavaScript. The JavaScript is responsible for connecting to the server, receiving new clock data, and updating the webpage. JavaScript is used because it can send custom HTTP requests from the webpage and can update parts of the webpage without refreshing the entire page. The server application acts as a relay between the operations network and the open Internet. On the operations network side, the application receives multicast packets that contain countdown clock and event data. It then parses the data into current countdown times and events and creates a packet with that information that can be sent to webpages. On the other side, it accepts HTTP requests from the webpage and responds to them with current data. The server is written in C# with some C++ files used to define the structure of the data packets. The videos for the webpage are shown in an embedded player from UStream.
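
    As a rough illustration of the relay architecture described above (the actual server is written in C# and C++), the Python sketch below joins a multicast group to receive clock/event packets on one thread and answers HTTP polls from the webpage on another. The multicast group, port and packet layout are assumptions for illustration only.

        # Sketch of the relay pattern: one thread listens for multicast clock/event
        # packets on the operations side, another answers HTTP requests from the
        # public webpage with the latest values.
        import json, socket, struct, threading
        from http.server import BaseHTTPRequestHandler, HTTPServer

        latest = {"countdown": "T-00:00:00", "event": "none"}
        GROUP, PORT = "239.1.2.3", 5000

        def listen_multicast():
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            sock.bind(("", PORT))
            mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
            sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
            while True:
                data, _ = sock.recvfrom(1024)
                clock, event = data.decode().split("|")     # assumed "clock|event" packets
                latest.update(countdown=clock, event=event)

        class ClockHandler(BaseHTTPRequestHandler):
            def do_GET(self):                               # polled by the page's JavaScript
                body = json.dumps(latest).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)

        threading.Thread(target=listen_multicast, daemon=True).start()
        HTTPServer(("0.0.0.0", 8080), ClockHandler).serve_forever()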

  7. QNAP 1263U Network Attached Storage (NAS)/ Storage Area Network (SAN) Device Users Guide

    DTIC Science & Technology

    2016-11-01

    standard Ethernet network. Operating either a NAS or SAN is vital for the integrity of the data stored on the drives found in the device. Redundant...speed of the network itself. Many standards are in place for transferring data, including more standard ones such as File Transfer Protocol and Server ...following are the procedures for connecting to the NAS administrative web page: 1) Open a web browser and browse to 192.168.40.8:8080. 2) Enter the

  8. Design and implementation of streaming media server cluster based on FFMpeg.

    PubMed

    Zhao, Hong; Zhou, Chun-long; Jin, Bao-zhao

    2015-01-01

    Poor performance and network congestion are commonly observed in the streaming media single server system. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations and the balance among servers is maintained by the dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data. The experiment results show that the server cluster system has significantly alleviated the network congestion and improved the performance in comparison with the single server system.
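
    A minimal sketch of the two mechanisms named in the abstract, assuming a simple load metric and region tags for each server; the real system's feedback algorithm and redirection protocol are more elaborate.

        # Sketch of feedback-based load balancing plus service redirection, with
        # made-up server names, regions and load reports.
        servers = {"edge-cn": {"region": "cn", "load": 0.0},
                   "edge-eu": {"region": "eu", "load": 0.0}}

        def report_load(name, cpu, bandwidth):
            """Active feedback: each streaming server periodically reports its load."""
            servers[name]["load"] = 0.5 * cpu + 0.5 * bandwidth

        def pick_server(client_region):
            """Prefer servers in the client's region, then the least-loaded one;
            the redirect URL is returned to the client instead of proxying the stream."""
            local = [s for s in servers.items() if s[1]["region"] == client_region]
            pool = local or list(servers.items())
            name, _ = min(pool, key=lambda item: item[1]["load"])
            return f"rtmp://{name}/live/stream"

        report_load("edge-cn", cpu=0.8, bandwidth=0.7)
        report_load("edge-eu", cpu=0.2, bandwidth=0.1)
        print(pick_server("cn"))   # stays regional unless that pool is empty
        print(pick_server("us"))   # falls back to the globally least-loaded server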

  9. Design and Implementation of Streaming Media Server Cluster Based on FFMpeg

    PubMed Central

    Zhao, Hong; Zhou, Chun-long; Jin, Bao-zhao

    2015-01-01

    Poor performance and network congestion are commonly observed in the streaming media single server system. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations and the balance among servers is maintained by the dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data. The experiment results show that the server cluster system has significantly alleviated the network congestion and improved the performance in comparison with the single server system. PMID:25734187

  10. Collaborative GIS for flood susceptibility mapping: An example from Mekong river basin of Viet Nam

    NASA Astrophysics Data System (ADS)

    Thanh, B.

    2016-12-01

    Flooding is one of the most dangerous natural disasters in Vietnam. Floods have caused serious damage to people and have had an adverse impact on socio-economic development across the country, especially in lower river basins, where the risk of flooding is high as a consequence of climate change and human activities. This paper presents a collaborative platform that combines an interactive web-GIS framework with a multi-criteria evaluation (MCE) tool. MCE is carried out on the server side through a web interface, in which the parameters used for evaluation are grouped into three major categories: (1) climatic factors: precipitation, typhoon frequency, temperature, humidity; (2) physiographic data: DEM, topographic wetness index, NDVI, stream power index, soil texture, distance to river; (3) social factors: NDBI, land use pattern. The web-based GIS is built on open-source technology and includes an information page, a page for the MCE tool through which users can interactively alter parameters in flood susceptibility mapping, and a discussion page. The system is designed for local participation in predicting flood risk magnitude under the impacts of natural processes and human intervention. The proposed flood susceptibility assessment prototype was implemented in the Mekong river basin, Viet Nam. Index images were calculated using Landsat data, and other layers were collected from authorized agencies. This study shows the potential of combining web-GIS and spatial analysis tools for flood hazard risk assessment. The combination can be a supportive solution that assists interaction between stakeholders in information exchange and disaster management, thus providing for better analysis, control and decision-making.
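
    To make the MCE step concrete, the following Python sketch normalizes a few toy factor grids and combines them with weights into a susceptibility score, the classic weighted-overlay pattern; the factors, weights and grid values are placeholders rather than those used in the study.

        # Minimal sketch of a raster-style multi-criteria evaluation: normalize each
        # factor grid and combine with weights into a susceptibility score.
        import numpy as np

        def normalize(grid):
            g = np.asarray(grid, dtype=float)
            return (g - g.min()) / (g.max() - g.min())

        factors = {                      # toy 2x2 grids for three criteria
            "precipitation": [[1200, 1800], [1500, 2100]],
            "elevation":     [[5, 40], [12, 80]],
            "ndbi":          [[0.1, 0.4], [0.2, 0.6]],
        }
        weights = {"precipitation": 0.5, "elevation": 0.3, "ndbi": 0.2}
        # low elevation increases flood susceptibility, so invert that factor
        susceptibility = sum(
            weights[name] * (1 - normalize(grid) if name == "elevation" else normalize(grid))
            for name, grid in factors.items()
        )
        print(np.round(susceptibility, 2))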

  11. Experience with a Spanish-language laparoscopy website.

    PubMed

    Moreno-Sanz, Carlos; Seoane-González, Jose B

    2006-02-01

    Although there are no clearly defined electronic tools for continuing medical education (CME), new information technologies offer a basic platform for presenting training content on the internet. Due to the shortage of websites about minimally invasive surgery in the Spanish language, we set up a topical website in Spanish. This study considers the experience with the website between April 2001 and January 2005. To study the activity of the website, the registry information was analyzed descriptively using the log files of the server. To study the characteristics of the users, we searched the database of registered users. We found a total of 107,941 visits to our website and a total of 624,895 page downloads. Most visits to the site were made from Spanish-speaking countries. The most frequent professional profile of the registered users was that of general surgeon. The development, implementation, and evaluation of Spanish-language CME initiatives over the internet is promising but presents challenges.

  12. BioPepDB: an integrated data platform for food-derived bioactive peptides.

    PubMed

    Li, Qilin; Zhang, Chao; Chen, Hongjun; Xue, Jitong; Guo, Xiaolei; Liang, Ming; Chen, Ming

    2018-03-12

    Food-derived bioactive peptides play critical roles in regulating most biological processes and have considerable biological, medical and industrial importance. However, a large amount of active peptide data, including sequences, functions, sources, commercial product information, references and other information, is poorly integrated. BioPepDB is a searchable database of food-derived bioactive peptides and their related articles, including more than four thousand bioactive peptide entries. Moreover, BioPepDB provides prediction and hydrolysis-simulation modules for discovering novel peptides. It can serve as a reference database to investigate the function of different bioactive peptides. BioPepDB is available at http://bis.zju.edu.cn/biopepdbr/ . The web page utilises Apache, PHP5 and MySQL to provide the user interface for accessing the database and predicting novel peptides. The database itself is operated on a specialised server.

  13. Web-based UMLS concept retrieval by automatic text scanning: a comparison of two methods.

    PubMed

    Brandt, C; Nadkarni, P

    2001-01-01

    The Web is increasingly the medium of choice for multi-user application program delivery. Yet the selection of an appropriate programming environment for rapid prototyping, code portability, and maintainability remains an issue. We summarize our experience with the conversion of a LISP Web application, Search/SR, to a new, functionally identical application, Search/SR-ASP, using a relational database and active server pages (ASP) technology. Our results indicate that easy access to database engines and external objects is almost essential for a development environment to be considered viable for rapid and robust application delivery. While LISP itself is a robust language, its use in Web applications may be hard to justify given that current vendor implementations do not provide such functionality. Alternative, currently available scripting environments for Web development appear to have most of LISP's advantages and few of its disadvantages.

  14. Calypso: a user-friendly web-server for mining and visualizing microbiome-environment interactions.

    PubMed

    Zakrzewski, Martha; Proietti, Carla; Ellis, Jonathan J; Hasan, Shihab; Brion, Marie-Jo; Berger, Bernard; Krause, Lutz

    2017-03-01

    Calypso is an easy-to-use online software suite that allows non-expert users to mine, interpret and compare taxonomic information from metagenomic or 16S rDNA datasets. Calypso has a focus on multivariate statistical approaches that can identify complex environment-microbiome associations. The software enables quantitative visualizations, statistical testing, multivariate analysis, supervised learning, factor analysis, multivariable regression, network analysis and diversity estimates. Comprehensive help pages, tutorials and videos are provided via a wiki page. The web-interface is accessible via http://cgenome.net/calypso/ . The software is programmed in Java, PERL and R and the source code is available from Zenodo ( https://zenodo.org/record/50931 ). The software is freely available for non-commercial users. Contact: l.krause@uq.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  15. Biographer: web-based editing and rendering of SBGN compliant biochemical networks.

    PubMed

    Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas

    2013-06-01

    The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows and Linux based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL

  16. FAF-Drugs3: a web server for compound property calculation and chemical library design

    PubMed Central

    Lagorce, David; Sperandio, Olivier; Baell, Jonathan B.; Miteva, Maria A.; Villoutreix, Bruno O.

    2015-01-01

    Drug attrition late in preclinical or clinical development is a serious economic problem in the field of drug discovery. These problems can be linked, in part, to the quality of the compound collections used during the hit generation stage and to the selection of compounds undergoing optimization. Here, we present FAF-Drugs3, a web server that can be used for drug discovery and chemical biology projects to help in preparing compound libraries and to assist decision-making during the hit selection/lead optimization phase. Since it was first described in 2006, FAF-Drugs has been significantly modified. The tool now applies an enhanced structure curation procedure, can filter or analyze molecules with user-defined or eight predefined physicochemical filters as well as with several simple ADMET (absorption, distribution, metabolism, excretion and toxicity) rules. In addition, compounds can be filtered using an updated list of 154 hand-curated structural alerts while Pan Assay Interference compounds (PAINS) and other, generally unwanted groups are also investigated. FAF-Drugs3 offers access to user-friendly html result pages and the possibility to download all computed data. The server requires as input an SDF file of the compounds; it is open to all users and can be accessed without registration at http://fafdrugs3.mti.univ-paris-diderot.fr. PMID:25883137

  17. Accounting Data to Web Interface Using PERL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hargeaves, C

    2001-08-13

    This document will explain the process to create a web interface for the accounting information generated by the High Performance Storage Systems (HPSS) accounting report feature. The accounting report contains useful data but it is not easily accessed in a meaningful way. The accounting report is the only way to see summarized storage usage information. The first step is to take the accounting data, make it meaningful and store the modified data in persistent databases. The second step is to generate the various user interfaces, HTML pages, that will be used to access the data. The third step is to transfer all required files to the web server. The web pages pass parameters to Common Gateway Interface (CGI) scripts that generate dynamic web pages and graphs. The end result is a web page with specific information presented in text with or without graphs. The accounting report has a specific format that allows the use of regular expressions to verify if a line is storage data. Each storage data line is stored in a detailed database file with a name that includes the run date. The detailed database is used to create a summarized database file that also uses run date in its name. The summarized database is used to create the group.html web page that includes a list of all storage users. Scripts that query the database folder to build a list of available databases generate two additional web pages. A master script that is run monthly as part of a cron job, after the accounting report has completed, manages all of these individual scripts. All scripts are written in the PERL programming language. Whenever possible data manipulation scripts are written as filters. All scripts are written to be single source, which means they will function properly on both the open and closed networks at LLNL. The master script handles the command line inputs for all scripts, file transfers to the web server and records run information in a log file. The rest of the scripts manipulate the accounting data or use the files created to generate HTML pages. Each script will be described in detail herein. The following is a brief description of HPSS taken directly from an HPSS web site. ''HPSS is a major development project, which began in 1993 as a Cooperative Research and Development Agreement (CRADA) between government and industry. The primary objective of HPSS is to move very large data objects between high performance computers, workstation clusters, and storage libraries at speeds many times faster than is possible with today's software systems. For example, HPSS can manage parallel data transfers from multiple network-connected disk arrays at rates greater than 1 Gbyte per second, making it possible to access high definition digitized video in real time.'' The HPSS accounting report is a canned report whose format is controlled by the HPSS developers.
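
    The report-parsing step lends itself to a short illustration. The Python sketch below (the production scripts are written in PERL) filters lines that match an assumed storage-data pattern into a run-dated detailed file and rolls them up into a per-account summary; the regular expression and file naming are guesses at the described behaviour, not the real report format.

        # Sketch of the report-filtering step: keep only lines that look like storage
        # data, write them to a run-dated detailed file, and roll them up per account.
        import re
        from collections import defaultdict
        from datetime import date

        STORAGE_LINE = re.compile(r"^\s*(\w+)\s+(\d+)\s+(\d+)\s*$")  # account, files, bytes

        def filter_report(report_lines, run_date=None):
            run_date = run_date or date.today().isoformat()
            detailed_name = f"detailed-{run_date}.db"
            summary = defaultdict(lambda: [0, 0])
            with open(detailed_name, "w") as detailed:
                for line in report_lines:
                    match = STORAGE_LINE.match(line)
                    if not match:                      # skip headers and other report text
                        continue
                    account, files, nbytes = match.groups()
                    detailed.write(line)
                    summary[account][0] += int(files)
                    summary[account][1] += int(nbytes)
            return detailed_name, dict(summary)

        report = ["HPSS Accounting Report\n", "alice 10 2048\n", "bob 3 512\n"]
        print(filter_report(report, run_date="2001-08-13"))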

  18. Webmail: an Automated Web Publishing System

    NASA Astrophysics Data System (ADS)

    Bell, David

    A system for publishing frequently updated information to the World Wide Web will be described. Many documents now hosted by the NOAO Web server require timely posting and frequent updates, but need only minor changes in markup or are in a standard format requiring only conversion to HTML. These include information from outside the organization, such as electronic bulletins, and a number of internal reports, both human and machine generated. Webmail uses procmail and Perl scripts to process incoming email messages in a variety of ways. This processing may include wrapping or conversion to HTML, posting to the Web or internal newsgroups, updating search indices or links on related pages, and sending email notification of the new pages to interested parties. The Webmail system has been in use at NOAO since early 1997 and has steadily grown to include fourteen recipes that together handle about fifty messages per week.
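
    One Webmail-style recipe might look like the following Python sketch: read a raw message, wrap its body in HTML and write it into the web tree. The paths, file name and message are invented, and the real NOAO system is built on procmail and Perl rather than Python.

        # Rough sketch of one "recipe": take an incoming message, wrap the body in
        # HTML, and drop it into the web tree. Paths and content are made up.
        import email, html, pathlib

        WEB_DIR = pathlib.Path("/tmp/webmail-demo")     # stand-in for the server's docroot

        def publish(raw_message):
            msg = email.message_from_string(raw_message)
            body = msg.get_payload()
            page = (f"<html><head><title>{html.escape(msg['Subject'] or '')}</title></head>"
                    f"<body><pre>{html.escape(body)}</pre></body></html>")
            WEB_DIR.mkdir(parents=True, exist_ok=True)
            out = WEB_DIR / "latest-bulletin.html"
            out.write_text(page)
            return out                                   # a real recipe would also update
                                                         # indices and notify interested parties

        raw = "Subject: Nightly observing report\n\nSeeing was 0.8 arcsec all night."
        print(publish(raw))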

  19. Markovian Queues with Arrival Dependence

    DTIC Science & Technology

    1976-03-01

    This scanned report excerpt adds together the three balance equations for p20, p21 and p22 of the Markovian queue; the equations themselves are not legible in the source. Recoverable section headings include "Additional Facts Concerning the Transient Distribution of Waiting Times for Arriving Customers" and "The Two Channel Server Queue with Single ..."

  20. MiniWall Tool for Analyzing CFD and Wind Tunnel Large Data Sets

    NASA Technical Reports Server (NTRS)

    Schuh, Michael J.; Melton, John E.; Stremel, Paul M.

    2017-01-01

    It is challenging to review and assimilate large data sets created by Computational Fluid Dynamics (CFD) simulations and wind tunnel tests. Over the past 10 years, NASA Ames Research Center has developed and refined a software tool dubbed the MiniWall to increase productivity in reviewing and understanding large CFD-generated data sets. Under the recent NASA ERA project, the application of the tool expanded to enable rapid comparison of experimental and computational data. The MiniWall software is browser based so that it runs on any computer or device that can display a web page. It can also be used remotely and securely by using web server software such as the Apache HTTP server. The MiniWall software has recently been rewritten and enhanced to make it even easier for analysts to review large data sets and extract knowledge and understanding from these data sets. This paper describes the MiniWall software and demonstrates how the different features are used to review and assimilate large data sets.

  1. WEBSLIDE: A "Virtual" Slide Projector Based on World Wide Web

    NASA Astrophysics Data System (ADS)

    Barra, Maria; Ferrandino, Salvatore; Scarano, Vittorio

    1999-03-01

    We present here the key design concepts of WEBSLIDE, a software project whose objective is to provide a simple, cheap and efficient solution for showing slides during lessons in computer labs. In fact, WEBSLIDE allows the video monitors of several client machines (the "STUDENTS") to be synchronously updated by the actions of a particular client machine, called the "INSTRUCTOR." The system is based on the World Wide Web, and the software components of WEBSLIDE consist mainly of a WWW server, browsers and small CGI-Bin scripts. What makes WEBSLIDE particularly appealing for small educational institutions is that it is built with "off the shelf" products: it does not require a specifically designed program; any Netscape browser, one of the most popular browsers available on the market, is sufficient. The system can also be used to implement "guided automatic tours" through several pages or an Intranet's internal news bulletins: the company Web server can broadcast relevant information to all employees on their browsers.

  2. MiniWall Tool for Analyzing CFD and Wind Tunnel Large Data Sets

    NASA Technical Reports Server (NTRS)

    Schuh, Michael J.; Melton, John E.; Stremel, Paul M.

    2017-01-01

    It is challenging to review and assimilate large data sets created by Computational Fluid Dynamics (CFD) simulations and wind tunnel tests. Over the past 10 years, NASA Ames Research Center has developed and refined a software tool dubbed the "MiniWall" to increase productivity in reviewing and understanding large CFD-generated data sets. Under the recent NASA ERA project, the application of the tool expanded to enable rapid comparison of experimental and computational data. The MiniWall software is browser based so that it runs on any computer or device that can display a web page. It can also be used remotely and securely by using web server software such as the Apache HTTP Server. The MiniWall software has recently been rewritten and enhanced to make it even easier for analysts to review large data sets and extract knowledge and understanding from these data sets. This paper describes the MiniWall software and demonstrates how the different features are used to review and assimilate large data sets.

  3. CIS3/398: Implementation of a Web-Based Electronic Patient Record for Transplant Recipients

    PubMed Central

    Fritsche, L; Lindemann, G; Schroeter, K; Schlaefer, A; Neumayer, H-H

    1999-01-01

    Introduction While the "Electronic patient record" (EPR) is a frequently quoted term in many areas of healthcare, only a few working EPR-systems are available so far. To justify their use, EPRs must be able to store and display all kinds of medical information in a reliable, secure, time-saving, user-friendly way at an affordable price. Fields with patients who are attended to by a large number of medical specialists over a prolonged period of time are best suited to demonstrate the potential benefits of an EPR. The aim of our project was to investigate the feasibility of an EPR based solely on "off-the-shelf" software and Internet technology in the field of organ transplantation. Methods The EPR-system consists of three main elements: data-storage facilities, a Web-server and a user-interface. Data are stored either in a relational database (Sybase Adaptive 11.5, Sybase Inc., CA) or, in the case of pictures (JPEG) and files in application formats (e.g. Word documents), on a Windows NT 4.0 Server (Microsoft Corp., WA). The entire communication of all data is handled by a Web-server (IIS 4.0, Microsoft) with an Active Server Pages extension. The database is accessed by ActiveX Data Objects via the ODBC-interface. The only software required on the user's computer is Internet Explorer 4.01 (Microsoft); during the first use of the EPR, the ActiveX HTML Layout Control is automatically added. The user can access the EPR via Local or Wide Area Network or by dial-up connection. If the EPR is accessed from outside the firewall, all communication is encrypted (SSL 3.0, Netscape Comm. Corp., CA). The speed of the EPR-system was tested with 50 repeated measurements of the duration of two key functions: 1) display of all lab results for a given day and patient and 2) automatic composition of a letter containing diagnoses, medication, notes and lab results. For the test a 233 MHz Pentium II processor with a 10 Mbit/s Ethernet connection (ping-time below 10 ms) over 2 hubs to the server (400 MHz Pentium II, 256 MB RAM) was used. Results So far the EPR-system has been running for eight consecutive months and contains complete records of 673 transplant recipients with an average follow-up of 9.9 (SD: 4.9) years and a total of 1.1 million lab values. Instruction to enable new users to perform basic operations took less than two hours in all cases. The average duration of laboratory access was 0.9 (SD: 0.5) seconds; the automatic composition of a letter took 6.1 (SD: 2.4) seconds. Apart from the database and Windows NT, all other components are available for free. The development of the EPR-system required less than two person-years. Conclusion Implementation of an Electronic patient record that meets the requirements of comprehensiveness, reliability, security, speed, user-friendliness and affordability using a combination of "off-the-shelf" software products can be feasible if current state-of-the-art Internet technology is applied.

  4. Ligand.Info small-molecule Meta-Database.

    PubMed

    von Grotthuss, Marcin; Koczyk, Grzegorz; Pas, Jakub; Wyrwicz, Lucjan S; Rychlewski, Leszek

    2004-12-01

    Ligand.Info is a compilation of various publicly available databases of small molecules. The total size of the Meta-Database is over 1 million entries. The compound records contain calculated three-dimensional coordinates and sometimes information about biological activity. Some molecules have information about FDA drug approval status or about anti-HIV activity. The Meta-Database can be downloaded from the http://Ligand.Info web page. The database can also be screened using a Java-based tool. The tool can interactively cluster sets of molecules on the user side and automatically download similar molecules from the server. The application requires the Java Runtime Environment 1.4 or higher, which can be automatically downloaded from Sun Microsystems or Apple Computer and installed during the first use of Ligand.Info on desktop systems that support Java (MS Windows, Mac OS, Solaris, and Linux). The Ligand.Info Meta-Database can be used for virtual high-throughput screening of new potential drugs. The presented examples showed that, using a known antiviral drug as the query, the system was able to find other antiviral drugs and inhibitors.

  5. Automatic analysis of attack data from distributed honeypot network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, together with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.
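
    The server-side classification step could resemble the Python sketch below, which merges per-honeypot log entries and applies simple rules to label common VoIP attack patterns. The log fields, rule set and node names are illustrative assumptions, not the deployed algorithm.

        # Sketch of the centralized-analysis idea: merge per-honeypot logs and classify
        # SIP/VoIP attack attempts with simple rules.
        from collections import Counter

        def classify(entry):
            method, user_agent = entry["method"], entry["user_agent"].lower()
            if method == "REGISTER" and entry["auth_failures"] > 3:
                return "registration brute force"
            if method == "OPTIONS" and "sipvicious" in user_agent:
                return "scanning (SIPVicious)"
            if method == "INVITE":
                return "toll fraud attempt"
            return "other"

        def summarize(honeypot_logs):
            counts = Counter()
            for node, entries in honeypot_logs.items():
                for entry in entries:
                    counts[(node, classify(entry))] += 1
            return counts

        logs = {"honeypot-prague": [
            {"method": "OPTIONS", "user_agent": "friendly-scanner (SIPVicious)", "auth_failures": 0},
            {"method": "REGISTER", "user_agent": "eyeBeam", "auth_failures": 7}]}
        print(summarize(logs))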

  6. STELAR: An experiment in the electronic distribution of astronomical literature

    NASA Technical Reports Server (NTRS)

    Warnock, A.; Vansteenburg, M. E.; Brotzman, L. E.; Gass, J.; Kovalsky, D.

    1992-01-01

    STELAR (Study of Electronic Literature for Astronomical Research) is a Goddard-based project designed to test methods of delivering technical literature in machine-readable form. To that end, we have scanned a five-year span of the ApJ, ApJ Supp, AJ and PASP, and have obtained abstracts for eight leading academic journals from NASA/STI CASI, which also makes these abstracts available through the NASA RECON system. We have also obtained machine-readable versions of some journal volumes from the publishers, although in many instances, the final typeset versions are no longer available. The fundamental data object for the STELAR database is the article, a collection of items associated with a scientific paper - abstract, scanned pages (in a variety of formats), figures, OCR extractions, forward and backward references, errata and versions of the paper in various formats (e.g., TEX, SGML, PostScript, DVI). Articles are uniquely referenced in the database by journal name, volume number and page number. The selection and delivery of articles is accomplished through the WAIS (Wide Area Information Server) client/server model, requiring only an Internet connection. Modest modifications to the server code have made it capable of delivering the multiple data types required by STELAR. WAIS is a platform-independent and fully open multi-disciplinary delivery system, originally developed by Thinking Machines Corp. and made available free of charge. It is based on the ISO Z39.50 standard communications protocol. WAIS servers run under both UNIX and VMS. WAIS clients run on a wide variety of machines, from UNIX-based X Windows systems to MS-DOS and Macintosh microcomputers. The WAIS system includes full-text indexing and searching of documents, a network interface and easy access to a variety of document viewers. ASCII versions of the CASI abstracts have been formatted for display and the full text of the abstracts has been indexed. The entire WAIS database of abstracts is now available for use by the astronomical community. Enhancements of the search and retrieval system are under investigation, including specialized searches (by reference, author or keyword, as opposed to full-text searches), improved handling of word stems, improvements in relevancy criteria and other retrieval techniques, such as factor spaces. The STELAR project has been assisted by the full cooperation of the AAS, the ASP, the publishers of the academic journals, librarians from GSFC, NRAO and STScI, the Library of Congress, and the University of North Carolina at Chapel Hill.

  7. Pulse oximeter based mobile biotelemetry application.

    PubMed

    Işik, Ali Hakan; Güler, Inan

    2012-01-01

    Quality and features of tele-homecare are improved by information and communication technologies. In this context, a pulse oximeter-based mobile biotelemetry application was developed. With this application, patients can measure their own oxygen saturation and heart rate at home through a Bluetooth pulse oximeter. The Bluetooth virtual serial port protocol is used to send the test results from the pulse oximeter to the smart phone. These data are converted into XML and transmitted to a remote web server database via the smart phone. GPRS, WLAN or 3G can be used for the transmission. A rule-based algorithm is used in the decision-making process. By default, the threshold value for oxygen saturation is 80, and the heart rate thresholds are 40 and 150, respectively. If the patient's heart rate is outside the threshold values or the oxygen saturation is below its threshold, an emergency SMS is sent to the doctor, who can then direct an ambulance to the patient. These threshold values can be changed by the doctor for individual patients. The conversion of the evaluated data into the SMS XML template is done on the web server. Another important component of the application is web-based monitoring of the pulse oximeter data. The web page provides access to all patient data, so doctors can follow their patients and send e-mail related to the evaluation of the disease. In addition, patients can follow their own data on this page. Eight patients have taken part in the procedure. It is believed that the developed application will facilitate pulse oximeter-based measurement from anywhere and at any time.
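
    The rule-based check is simple enough to show directly. The Python sketch below uses the thresholds quoted in the abstract (oxygen saturation below 80, heart rate outside 40-150) and calls a caller-supplied SMS hook when a rule fires; the function names and hook are illustrative, not part of the original system.

        # Minimal sketch of the rule-based alerting described above.
        DEFAULT_LIMITS = {"spo2_min": 80, "hr_min": 40, "hr_max": 150}

        def evaluate(spo2, heart_rate, limits=DEFAULT_LIMITS):
            alerts = []
            if spo2 < limits["spo2_min"]:
                alerts.append(f"oxygen saturation {spo2}% below {limits['spo2_min']}%")
            if not limits["hr_min"] <= heart_rate <= limits["hr_max"]:
                alerts.append(f"heart rate {heart_rate} bpm outside "
                              f"{limits['hr_min']}-{limits['hr_max']} bpm")
            return alerts

        def check_and_notify(spo2, heart_rate, send_sms):
            alerts = evaluate(spo2, heart_rate)
            if alerts:
                send_sms("Emergency: " + "; ".join(alerts))   # doctor can dispatch an ambulance
            return alerts

        print(check_and_notify(76, 160, send_sms=print))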

  8. cisPath: an R/Bioconductor package for cloud users for visualization and management of functional protein interaction networks.

    PubMed

    Wang, Likun; Yang, Luhe; Peng, Zuohan; Lu, Dan; Jin, Yan; McNutt, Michael; Yin, Yuxin

    2015-01-01

    With the burgeoning development of cloud technology and services, there are an increasing number of users who prefer cloud to run their applications. All software and associated data are hosted on the cloud, allowing users to access them via a web browser from any computer, anywhere. This paper presents cisPath, an R/Bioconductor package deployed on cloud servers for client users to visualize, manage, and share functional protein interaction networks. With this R package, users can easily integrate downloaded protein-protein interaction information from different online databases with private data to construct new and personalized interaction networks. Additional functions allow users to generate specific networks based on private databases. Since the results produced with the use of this package are in the form of web pages, cloud users can easily view and edit the network graphs via the browser, using a mouse or touch screen, without the need to download them to a local computer. This package can also be installed and run on a local desktop computer. Depending on user preference, results can be publicized or shared by uploading to a web server or cloud driver, allowing other users to directly access results via a web browser. This package can be installed and run on a variety of platforms. Since all network views are shown in web pages, such package is particularly useful for cloud users. The easy installation and operation is an attractive quality for R beginners and users with no previous experience with cloud services.

  9. cisPath: an R/Bioconductor package for cloud users for visualization and management of functional protein interaction networks

    PubMed Central

    2015-01-01

    Background With the burgeoning development of cloud technology and services, there are an increasing number of users who prefer cloud to run their applications. All software and associated data are hosted on the cloud, allowing users to access them via a web browser from any computer, anywhere. This paper presents cisPath, an R/Bioconductor package deployed on cloud servers for client users to visualize, manage, and share functional protein interaction networks. Results With this R package, users can easily integrate downloaded protein-protein interaction information from different online databases with private data to construct new and personalized interaction networks. Additional functions allow users to generate specific networks based on private databases. Since the results produced with the use of this package are in the form of web pages, cloud users can easily view and edit the network graphs via the browser, using a mouse or touch screen, without the need to download them to a local computer. This package can also be installed and run on a local desktop computer. Depending on user preference, results can be publicized or shared by uploading to a web server or cloud driver, allowing other users to directly access results via a web browser. Conclusions This package can be installed and run on a variety of platforms. Since all network views are shown in web pages, such package is particularly useful for cloud users. The easy installation and operation is an attractive quality for R beginners and users with no previous experience with cloud services. PMID:25708840

  10. Static Memory Deduplication for Performance Optimization in Cloud Computing.

    PubMed

    Jia, Gangyong; Han, Guangjie; Wang, Hao; Yang, Xuan

    2017-04-27

    In a cloud computing environment, the number of virtual machines (VMs) on a single physical server and the number of applications running on each VM are continuously growing. This has led to an enormous increase in the demand of memory capacity and subsequent increase in the energy consumption in the cloud. Lack of enough memory has become a major bottleneck for scalability and performance of virtualization interfaces in cloud computing. To address this problem, memory deduplication techniques which reduce memory demand through page sharing are being adopted. However, such techniques suffer from overheads in terms of number of online comparisons required for the memory deduplication. In this paper, we propose a static memory deduplication (SMD) technique which can reduce memory capacity requirement and provide performance optimization in cloud computing. The main innovation of SMD is that the process of page detection is performed offline, thus potentially reducing the performance cost, especially in terms of response time. In SMD, page comparisons are restricted to the code segment, which has the highest shared content. Our experimental results show that SMD efficiently reduces memory capacity requirement and improves performance. We demonstrate that, compared to other approaches, the cost in terms of the response time is negligible.
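
    As a toy illustration of the offline, code-segment-restricted page comparison, the Python sketch below hashes fixed-size pages of a byte buffer and reports groups of identical pages that could be mapped to a single physical copy; the page size and buffer contents are placeholders, and the real SMD mechanism operates on guest memory inside the virtualization layer.

        # Toy illustration of offline page comparison: hash fixed-size "pages" of a
        # code segment and record which pages are identical (sharing candidates).
        import hashlib
        from collections import defaultdict

        PAGE_SIZE = 4096

        def find_duplicate_pages(code_segment: bytes):
            pages = [code_segment[i:i + PAGE_SIZE]
                     for i in range(0, len(code_segment), PAGE_SIZE)]
            by_hash = defaultdict(list)
            for index, page in enumerate(pages):
                by_hash[hashlib.sha256(page).hexdigest()].append(index)
            # every group with more than one page number is a sharing opportunity
            return [group for group in by_hash.values() if len(group) > 1]

        segment = b"\x90" * PAGE_SIZE * 3 + b"\xcc" * PAGE_SIZE   # three identical pages, one distinct
        print(find_duplicate_pages(segment))   # -> [[0, 1, 2]]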

  11. Static Memory Deduplication for Performance Optimization in Cloud Computing

    PubMed Central

    Jia, Gangyong; Han, Guangjie; Wang, Hao; Yang, Xuan

    2017-01-01

    In a cloud computing environment, the number of virtual machines (VMs) on a single physical server and the number of applications running on each VM are continuously growing. This has led to an enormous increase in the demand of memory capacity and subsequent increase in the energy consumption in the cloud. Lack of enough memory has become a major bottleneck for scalability and performance of virtualization interfaces in cloud computing. To address this problem, memory deduplication techniques which reduce memory demand through page sharing are being adopted. However, such techniques suffer from overheads in terms of number of online comparisons required for the memory deduplication. In this paper, we propose a static memory deduplication (SMD) technique which can reduce memory capacity requirement and provide performance optimization in cloud computing. The main innovation of SMD is that the process of page detection is performed offline, thus potentially reducing the performance cost, especially in terms of response time. In SMD, page comparisons are restricted to the code segment, which has the highest shared content. Our experimental results show that SMD efficiently reduces memory capacity requirement and improves performance. We demonstrate that, compared to other approaches, the cost in terms of the response time is negligible. PMID:28448434

  12. Spatial Information Processing: Standards-Based Open Source Visualization Technology

    NASA Astrophysics Data System (ADS)

    Hogan, P.

    2009-12-01

    Spatial information intelligence is a global issue that will increasingly affect our ability to survive as a species. Collectively we must better appreciate the complex relationships that make life on Earth possible. Providing spatial information in its native context can accelerate our ability to process that information. To maximize this ability to process information, three basic elements are required: data delivery (server technology), data access (client technology), and data processing (information intelligence). NASA World Wind provides open source client and server technologies based on open standards. The possibilities for data processing and data sharing are enhanced by this inclusive infrastructure for geographic information. It is interesting that this open source and open standards approach, unfettered by proprietary constraints, simultaneously provides for entirely proprietary use of this same technology. 1. WHY WORLD WIND? NASA World Wind began as a single program with specific functionality, to deliver NASA content. But as the possibilities for virtual globe technology became more apparent, we found that while enabling a new class of information technology, we were also getting in the way. Researchers, developers and even users expressed their desire for World Wind functionality in ways that would serve their specific needs. They want it in their web pages. They want to add their own features. They want to manage their own data. They told us that only with this kind of flexibility could their objectives and the potential for this technology be truly realized. World Wind client technology is a set of development tools, a software development kit (SDK) that allows a software engineer to create applications requiring geographic visualization technology. 2. MODULAR COMPONENTRY Accelerated evolution of a technology requires that the essential elements of that technology be modular components, such that each can advance independently of the other elements. World Wind therefore changed its mission from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating a single program, World Wind is a suite of components that can be selectively used in any number of programs. World Wind technology can be a part of any application, or it can be a window in a web page. Or it can be extended with additional functionalities by application and web developers. World Wind makes it possible to include virtual globe visualization and server technology in support of any objective. The world community can continually benefit from advances made in the technology by NASA in concert with the world community. 3. OPEN SOURCE AND OPEN STANDARDS NASA World Wind is NASA Open Source software. This means that the source code is fully accessible for anyone to freely use, even in association with proprietary technology. Imagery and other data provided by the World Wind servers reside in the public domain, including the data server technology itself. This allows others to deliver their own geospatial data and to provide custom solutions based on users' specific needs.

  13. A daily living activity remote monitoring system for solitary elderly people.

    PubMed

    Maki, Hiromichi; Ogawa, Hidekuni; Matsuoka, Shingo; Yonezawa, Yoshiharu; Caldwell, W Morton

    2011-01-01

    A daily living activity remote monitoring system has been developed for supporting solitary elderly people. The monitoring system consists of a tri-axis accelerometer, six low-power active filters, a low-power 8-bit microcontroller (MC), a 1 GB SD memory card (SDMC) and a 2.4 GHz low-transmitting-power mobile phone (PHS). The tri-axis accelerometer attached to the subject's chest can simultaneously measure the dynamic and static acceleration forces produced by heart sound, respiration, posture and behavior. The heart rate, respiration rate, activity, posture and behavior are detected from these dynamic and static acceleration forces. These data are stored on the SDMC, and the MC sends them to the server computer every hour. The server computer stores the data and generates a graphical chart from them. When the caregiver calls the server computer from his/her mobile phone, the server computer sends the graphical chart via the PHS, and the caregiver's mobile phone displays the chart graphically.

  14. COM1/348: Design and Implementation of a Portal for the Market of the Medical Equipment (MEDICOM)

    PubMed Central

    Palamas, S; Vlachos, I; Panou-Diamandi, O; Marinos, G; Kalivas, D; Zeelenberg, C; Nimwegen, C; Koutsouris, D

    1999-01-01

    Introduction The MEDICOM system provides the electronic means for medical equipment manufacturers to communicate online with their customers supporting the Purchasing Process and the Post Market Surveillance. The MEDICOM service will be provided over the Internet by the MEDICOM Portal, and by a set of distributed subsystems dedicated to handle structured information related to medical devices. There are three kinds of these subsystems, the Hypermedia Medical Catalogue (HMC), Virtual Medical Exhibition (VME), which contains information in a form of Virtual Models, and the Post Market Surveillance system (PMS). The Universal Medical Devices Nomenclature System (UMDNS) is used to register all products. This work was partially funded by the ESPRIT Project 25289 (MEDICOM). Methods The Portal provides the end user interface operating as the MEDICOM Portal, acts as the yellow pages for finding both products and providers, providing links to the providers servers, implements the system management and supports the subsystem database compatibility. The Portal hosts a database system composed of two parts: (a) the Common Database, which describes a set of encoded parameters (like Supported Languages, Geographic Regions, UMDNS Codes, etc) common to all subsystems and (b) the Short Description Database, which contains summarised descriptions of medical devices, including a text description, the codes of the manufacturer, UMDNS code, attribute values and links to the corresponding HTML pages of the HMC, VME and PMS servers. The Portal provides the MEDICOM user interface including services like end user profiling and registration, end user query forms, creation and hosting of newsgroups, links to online libraries, end user subscription to manufacturers' mailing lists, online information for the MEDICOM system and special messages or advertisements from manufacturers. Results Platform independence and interoperability characterise the system design. A general purpose RDBMS is used for the implementation of the databases. The end user interface is implemented using HTML and Java applets, while the subsystem administration applications are developed using Java. The JDBC interface is used in order to provide database access to these applications. The communication between subsystems is implemented using CORBA objects and Java servlets are used in subsystem servers for the activation of remote operations. Discussion In the second half of 1999, the MEDICOM Project will enter the phase of evaluation and pilot operation. The benefits of the MEDICOM system are expected to be the establishment of a world wide accessible marketplace between providers and health care professionals. The latter will achieve the provision of up-to-date and high quality products information in an easy and friendly way, and the enhancement of the marketing procedures and after sales support efficiency.

  15. Got a Minute? Tune Your iPad to NASA's Best

    NASA Astrophysics Data System (ADS)

    Leon, N.; Fitzpatrick, A. J.; Fisher, D. K.; Netting, R. A.

    2012-12-01

    Space Place Prime is a content presentation app for the iPad. It gathers some of the best and most recent web offerings from NASA. A spinoff of NASA's popular kids' website The Space Place (spaceplace.nasa.gov or science.nasa.gov/kids), Space Place Prime taps timely educational and easy-to-read articles from the website, as well as daily updates of NASA space and Earth images and the latest informative videos, including Science Casts and the monthly "What's up in the Sky." Space Place Prime targets a multigenerational audience, including anyone with an interest in NASA and science in general. Features are offered for kids, teachers, parents, space enthusiasts, and everyone in between. The app can be the user's own NASA news source. Like a newspaper or magazine app, Space Place Prime downloads new content daily via wireless connection. In addition to the Space Place website, several NASA RSS feeds are tapped to provide new content. Content is retained for the previous several days or some number of editions of each feed. All content is controlled on the server side, so we can push features about the latest news or change any content without updating the app in the Apple Store. The Space Place Prime interface is a virtual endless grid of small images with short titles, each image a link to an image, video, article, or hands-on activity for kids. The grid can be dragged in any direction with no boundaries. (Image links repeat to fill in the grid "infinitely.") For a more focused search, a list mode presents menus of images, videos, and articles (including activity articles) separately. If the user tags a page (image, video, or article) as a Favorite, the content is downloaded and maintained on the device, and remains permanently available regardless of connectivity. (Very large video files are permanently retained on the server side, however, rather than taking up the limited storage on the iPad.) Facebook, twitter, and e-mail connections make any feature easy to share. The format for each type of feature is designed to fit the genre. Image pages show the full-screen image with the complete caption from the feed in a scrollable panel on the right side of the image (which can be dragged to the left, if desired). Tap on the image and the caption goes away. Tap and it returns, along with a close "X" in the upper right. The user can just enjoy the image, or dig into all its significance by reading the entire caption. Videos use the familiar YouTube video player, with a title and scrollable caption field underneath. Article pages from The Space Place website look similar to the web pages. They are often published to correspond to current events or important science or NASA-related anniversaries. Space Place Prime can be a valuable tool for teachers. Curriculum enriching images or short videos can be projected from the iPad to a classroom screen. Busy educators will be able to find appropriate and fresh material from NASA in one place every day. Space Place Prime fills a unique niche for NASA space and Earth science fans, providing a channel for some of its most recent, most compelling material. It's an app that aims to meet the needs of both the casual user who is short on time, as well as young students, their teachers, and their parents.

  16. An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nicholas; Sellis, Timos

    1994-01-01

    We investigated a number of design and performance issues of interoperable database management systems (DBMS's). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMS's, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMS's. Experiments and simulations were then run to compare its performance with the standard client-server architectures. The focus of this research was on adaptive optimization methods of heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect the actual values as opposed to static ones that are computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback both for adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the distributions of the selectivities, we use curve-fitting techniques, such as least squares and splines, for regressing on these values.
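
    The query-feedback idea for selectivity estimation can be sketched briefly: record the selectivity actually observed after each execution and refit a curve to those observations instead of relying on static statistics. The Python sketch below uses a least-squares polynomial fit; the feedback values, predicate and fit degree are made up for illustration and do not reproduce the prototype's regression machinery.

        # Sketch of query feedback: refine a selectivity curve for "attribute <= v"
        # by least-squares fitting the selectivities observed after each query.
        import numpy as np

        feedback = []          # (predicate value, observed selectivity) pairs

        def record_feedback(value, rows_returned, table_rows):
            feedback.append((value, rows_returned / table_rows))

        def estimate_selectivity(value, degree=2):
            if len(feedback) <= degree:            # not enough feedback yet: neutral guess
                return 0.5
            xs, ys = zip(*feedback)
            coeffs = np.polyfit(xs, ys, degree)    # least-squares fit to observed points
            return float(np.clip(np.polyval(coeffs, value), 0.0, 1.0))

        # Feedback from three prior executions of "SELECT ... WHERE salary <= v"
        record_feedback(30_000, rows_returned=1_000, table_rows=10_000)
        record_feedback(50_000, rows_returned=4_500, table_rows=10_000)
        record_feedback(90_000, rows_returned=9_800, table_rows=10_000)
        print(estimate_selectivity(60_000))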

  17. Biographer: web-based editing and rendering of SBGN compliant biochemical networks

    PubMed Central

    Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas

    2013-01-01

    Motivation: The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. Results: We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. Availability: The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows and Linux based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL. Contact: edda.klipp@biologie.hu-berlin.de or handorf@physik.hu-berlin.de PMID:23574737

  18. CIS4/403: Design and Implementation of an Intranet-based system for Real-Time Tele-Consultation in Oncology

    PubMed Central

    Eccher, C; Berloffa, F; Demichelis, F; Larcher, B; Galvagni, M; Sboner, A; Graiff, A; Forti, S

    1999-01-01

    Introduction This study describes a tele-consultation system (TCS) developed to provide a computing environment over a Wide Area Network (WAN) in North Italy (Province of Trento) that can be used by two or more physicians to share medical data and to work co-operatively on medical records. A pilot study has been carried out in oncology to assess the effectiveness of the system. The aim of this project is to facilitate the management of oncology patients by improving communication among the specialists of central and district hospitals. Methods and Results The TCS is an Intranet-based solution. The Intranet is based on a PC WAN with Windows NT Server, Microsoft SQL Server, and Internet Information Server. TCS is composed of native and custom applications developed in the Microsoft Windows (9x and NT) environment. The basic component of the system is the multimedia digital medical record, structured as a collection of HTML and ASP pages. A distributed relational database will allow users to store and retrieve medical records, accessed by a dedicated Web browser via the Web Server. The medical data to be stored and the presentation architecture of the clinical record were determined in close collaboration with the clinicians involved in the project. TCS will allow a multi-point tele-consultation (TC) among two or more participants on remote computers, providing synchronized surfing through the clinical report. A set of collaborative and personal tools (whiteboard with drawing tools, point-to-point digital audio-conference, chat, local notepad, and e-mail service) is integrated in the system to provide a user-friendly environment. TCS has been developed as a client-server architecture. The client part of the system is based on the Microsoft Web Browser control and provides the user interface and the tools described above. The server part, running all the time on a dedicated computer, accepts connection requests and manages the connections among the participants in a TC, allowing multiple TCs to run simultaneously. TCS has been developed in the Visual C++ environment using the MFC library and COM technology; ActiveX controls have been written in Visual Basic to perform dedicated tasks from inside the HTML clinical report. Before deploying the system in the hospital departments involved in the project, TCS was tested in our laboratory by clinicians involved in the project to evaluate its usability. Discussion TCS has the potential to support a "multi-disciplinary distributed virtual oncological meeting". The specialists of different departments and of different hospitals can attend "virtual meetings" and interactively discuss medical data. An expected benefit of the "virtual meeting" is the possibility of providing expert remote advice from oncologists to peripheral cancer units in formulating treatment plans, conducting follow-up sessions and supporting clinical research.
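
    The server part described above, which accepts connections and keeps participants' views of the clinical record synchronized, can be approximated by a small relay: every navigation message received from one participant is re-broadcast to the others. The Python sketch below is only an illustration of that idea; the port and the line-oriented message format are assumptions, not the TCS wire protocol.

        import socket
        import threading

        HOST, PORT = "0.0.0.0", 5050   # illustrative port, not the TCS port
        clients = []
        lock = threading.Lock()

        def handle(conn):
            # Relay each line (e.g. "SHOW section-3") from one participant to the others,
            # so every browser stays on the same part of the clinical record.
            with conn:
                for line in conn.makefile("r", encoding="utf-8"):
                    with lock:
                        for other in clients:
                            if other is not conn:
                                other.sendall(line.encode("utf-8"))
            with lock:
                clients.remove(conn)

        def serve():
            with socket.create_server((HOST, PORT)) as srv:
                while True:
                    conn, _ = srv.accept()
                    with lock:
                        clients.append(conn)
                    threading.Thread(target=handle, args=(conn,), daemon=True).start()

        if __name__ == "__main__":
            serve()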

  19. 78 FR 49586 - Self-Regulatory Organizations; Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... Market Maker Standard quote server as a gateway for communicating eQuotes to MIAX. Because of the... connect the Limited Service Ports to independent servers that host their eQuote and purge functionality... same server for all of their Market Maker quoting activity. Currently, Market Makers in the MIAX System...

  20. 78 FR 70615 - Self-Regulatory Organizations; Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ... rather than forcing them to use their Market Maker Standard quote server as a gateway for communicating e... technical flexibility to connect additional Limited Service Ports to independent servers that host their e... mitigate the risk of using the same server for all of their Market Maker quoting activity. By using the...

  1. Kalium: a database of potassium channel toxins from scorpion venom.

    PubMed

    Kuzmenkov, Alexey I; Krylov, Nikolay A; Chugunov, Anton O; Grishin, Eugene V; Vassilevski, Alexander A

    2016-01-01

    Kalium (http://kaliumdb.org/) is a manually curated database that accumulates data on potassium channel toxins purified from scorpion venom (KTx). This database is an open-access resource, and provides easy access to pages of other databases of interest, such as UniProt, PDB, NCBI Taxonomy Browser, and PubMed. General achievements of Kalium are a strict and easy regulation of KTx classification based on the unified nomenclature supported by researchers in the field, removal of peptides with partial sequence and entries supported by transcriptomic information only, classification of β-family toxins, and addition of a novel λ-family. Molecules presented in the database can be processed by the Clustal Omega server using a one-click option. Molecular masses of mature peptides are calculated and available activity data are compiled for all KTx. We believe that Kalium is not only of high interest to professional toxinologists, but also of general utility to the scientific community. Database URL: http://kaliumdb.org/. © The Author(s) 2016. Published by Oxford University Press.

  2. Integrating GIS, Archeology, and the Internet.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sera White; Brenda Ringe Pace; Randy Lee

    2004-08-01

    At the Idaho National Engineering and Environmental Laboratory's (INEEL) Cultural Resource Management Office, a newly developed Data Management Tool (DMT) is improving management and long-term stewardship of cultural resources. The fully integrated system links an archaeological database, a historical database, and a research database to spatial data through a customized user interface using ArcIMS and Active Server Pages. Components of the new DMT are tailored specifically to the INEEL and include automated data entry forms for historic and prehistoric archaeological sites, specialized queries and reports that address both yearly and project-specific documentation requirements, and unique field recording forms. The predictive modeling component increases the DMT's value for land use planning and long-term stewardship. The DMT enhances the efficiency of archive searches, improving customer service, oversight, and management of the large INEEL cultural resource inventory. In the future, the DMT will facilitate data sharing with regulatory agencies, tribal organizations, and the general public.

  3. PubMed-EX: a web browser extension to enhance PubMed search with text mining features.

    PubMed

    Tsai, Richard Tzong-Han; Dai, Hong-Jie; Lai, Po-Ting; Huang, Chi-Hsin

    2009-11-15

    PubMed-EX is a browser extension that marks up PubMed search results with additional text-mining information. PubMed-EX's page mark-up, which includes section categorization and gene/disease and relation mark-up, can help researchers to quickly focus on key terms and provide additional information on them. All text processing is performed server-side, freeing up user resources. PubMed-EX is freely available at http://bws.iis.sinica.edu.tw/PubMed-EX and http://iisr.cse.yzu.edu.tw:8000/PubMed-EX/.

  4. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.

    Cloud computing is a promising technology to manage and improve utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving there is no need to unnecessarily operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on in heavy load cases to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and non-instantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows a number of performance measures to be estimated.
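
    The hysteresis idea can be made concrete with a small sketch: a server is switched on only when the queue length exceeds an upper threshold and switched off only when it falls below a lower one, so brief fluctuations inside the band trigger no costly setup or shutdown. The thresholds and server limits below are arbitrary illustrative values, and the sketch ignores the non-instantaneous activation delay modeled in the paper.

        class HysteresisScaler:
            """Illustrative threshold-based controller with a hysteresis band."""

            def __init__(self, low, high, min_servers=1, max_servers=10):
                self.low, self.high = low, high
                self.min_servers, self.max_servers = min_servers, max_servers
                self.active = min_servers

            def update(self, queue_length):
                """Return the number of active servers after observing the queue."""
                if queue_length > self.high and self.active < self.max_servers:
                    self.active += 1      # heavy load: activate one more server
                elif queue_length < self.low and self.active > self.min_servers:
                    self.active -= 1      # light load: switch one server off
                # inside the band [low, high] nothing changes (the hysteresis region)
                return self.active

        scaler = HysteresisScaler(low=5, high=20)
        for q in [2, 8, 25, 30, 18, 4, 1]:
            print(q, "->", scaler.update(q))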

  5. Enhanced networked server management with random remote backups

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2003-08-01

    In this paper, the model is focused on available server management in network environments. The (remote) backup servers are hooked up by VPN (Virtual Private Network) and replace broken main servers immediately. A virtual private network (VPN) is a way to use a public network infrastructure and hook up long-distance servers within a single network infrastructure. The servers can be represented as "machines"; the system then deals with unreliable main machines and random auxiliary spare (remote backup) machines. When the system performs mandatory routine maintenance, auxiliary machines are used for backups during idle periods. Unlike other existing models, the availability of auxiliary machines changes for each activation in this enhanced model. Analytically tractable results are obtained by using several mathematical techniques, and the results are demonstrated in the framework of optimized networked server allocation problems.

  6. International distance-learning outreach: the APEC EINet experience.

    PubMed

    Kimball, A M; Shih, L; Brown, J; Harris, T G; Pautler, N; Jamieson, R W; Bolles, J; Horwitch, C

    2003-01-01

    The Emerging Infections Network is a mature electronic network that links public health professionals in the Asia Pacific through regular e-mail bulletins and an extensive Web site (http://www.apec.org/infectious). Emerging infections is a new area of study; learning materials help foster education. Our objective is to quantify the response of the network to the introduction of distance-learning materials on the Web site. Distance-learning materials, developed by the University of Washington School of Public Health, were field tested and launched on the site. Publicity was carried out prior to the launch of the materials. Access was tracked prospectively using server counts of page downloads. Web access increased substantially during the month after the materials were launched, especially among Asia-based computers. The effect was isolated to the distance-learning pages, and not general to the site. This Web site appears to be responsive to the advertisement and to the materials. Prospective Web-site monitoring proved useful. Copyright 2002 Elsevier Science Ireland Ltd.

  7. Mercury Shopping Cart Interface

    NASA Technical Reports Server (NTRS)

    Pfister, Robin; McMahon, Joe

    2006-01-01

    Mercury Shopping Cart Interface (MSCI) is a reusable component of the Power User Interface 5.0 (PUI) program described in another article. MSCI is a means of encapsulating the logic and information needed to describe an orderable item consistent with Mercury Shopping Cart service protocol. Designed to be used with Web-browser software, MSCI generates Hypertext Markup Language (HTML) pages on which ordering information can be entered. MSCI comprises two types of Practical Extraction and Report Language (PERL) modules: template modules and shopping-cart logic modules. Template modules generate HTML pages for entering the required ordering details and enable submission of the order via a Hypertext Transfer Protocol (HTTP) post. Shopping cart modules encapsulate the logic and data needed to describe an individual orderable item to the Mercury Shopping Cart service. These modules evaluate information entered by the user to determine whether it is sufficient for the Shopping Cart service to process the order. Once an order has been passed from MSCI to a deployed Mercury Shopping Cart server, there is no further interaction with the user.

  8. Computational algorithm to evaluate product disassembly cost index

    NASA Astrophysics Data System (ADS)

    Zeid, Ibrahim; Gupta, Surendra M.

    2002-02-01

    Environmentally conscious manufacturing is an important paradigm in today's engineering practice. Disassembly is a crucial factor in implementing this paradigm. Disassembly allows the reuse and recycling of parts and products that have reached the end of their life cycle. There are many questions that must be answered before a disassembly decision can be reached. The most important question is economic: the cost of disassembly versus the cost of scrapping a product must always be considered. This paper develops a computational tool that allows decision-makers to calculate the disassembly cost of a product. The tool makes it simple to perform 'what if' scenarios fairly quickly. The tool is Web based and has two main parts. The front-end part is a Web page and runs on the client side in a Web browser, while the back-end part is a disassembly engine (servlet) that has disassembly knowledge and costing algorithms and runs on the server side. The tool is based on the client/server model that is pervasively utilized throughout the World Wide Web. An example is used to demonstrate the implementation and capabilities of the tool.

  9. Multi-Harmony: detecting functional specificity from sequence alignment

    PubMed Central

    Brandt, Bernd W.; Feenstra, K. Anton; Heringa, Jaap

    2010-01-01

    Many protein families contain sub-families with functional specialization, such as binding different ligands or being involved in different protein–protein interactions. A small number of amino acids generally determine functional specificity. The identification of these residues can aid the understanding of protein function and help find targets for experimental analysis. Here, we present multi-Harmony, an interactive web server for detecting sub-type-specific sites in proteins starting from a multiple sequence alignment. Combining our Sequence Harmony (SH) and multi-Relief (mR) methods in one web server allows simultaneous analysis and comparison of specificity residues; furthermore, both methods have been significantly improved and extended. SH has been extended to cope with more than two sub-groups. mR has been changed from a sampling implementation to a deterministic one, making it more consistent and user friendly. For both methods Z-scores are reported. The multi-Harmony web server produces a dynamic output page, which includes interactive connections to the Jalview and Jmol applets, thereby allowing interactive analysis of the results. Multi-Harmony is available at http://www.ibi.vu.nl/programs/shmrwww. PMID:20525785

  10. EzMol: A Web Server Wizard for the Rapid Visualization and Image Production of Protein and Nucleic Acid Structures.

    PubMed

    Reynolds, Christopher R; Islam, Suhail A; Sternberg, Michael J E

    2018-01-31

    EzMol is a molecular visualization Web server in the form of a software wizard, located at http://www.sbg.bio.ic.ac.uk/ezmol/. It is designed for easy and rapid image manipulation and display of protein molecules, and is intended for users who need to quickly produce high-resolution images of protein molecules but do not have the time or inclination to use a software molecular visualization system. EzMol allows the upload of molecular structure files in PDB format to generate a Web page including a representation of the structure that the user can manipulate. EzMol provides intuitive options for chain display, adjusting the color/transparency of residues, side chains and protein surfaces, and for adding labels to residues. The final adjusted protein image can then be downloaded as a high-resolution image. There are a range of applications for rapid protein display, including the illustration of specific areas of a protein structure and the rapid prototyping of images. Copyright © 2018. Published by Elsevier Ltd.

  11. The PhEDEx next-gen website

    NASA Astrophysics Data System (ADS)

    Egeland, R.; Huang, C.-H.; Rossman, P.; Sundarrajan, P.; Wildish, T.

    2012-12-01

    PhEDEx is the data-transfer management solution written by CMS. It consists of agents running at each site, a website for presentation of information, and a web-based data-service for scripted access to information. The website allows users to monitor the progress of data-transfers, the status of site agents and links between sites, and the overall status and behaviour of everything about PhEDEx. It also allows users to make and approve requests for data-transfers and for deletion of data. It is the main point-of-entry for all users wishing to interact with PhEDEx. For several years, the website has consisted of a single perl program with about 10K SLOC. This program has limited capabilities for exploring the data, with only coarse filtering capabilities and no context-sensitive awareness. Graphical information is presented as static images, generated on the server, with no interactivity. It is also not well connected to the rest of the PhEDEx codebase, since much of it was written before the data-service was developed. All this makes it hard to maintain and extend. We are re-implementing the website to address these issues. The UI is being rewritten in Javascript, replacing most of the server-side code. We are using the YUI toolkit to provide advanced features and context-sensitive interaction, and will adopt a Javascript charting library for generating graphical representations client-side. This relieves the server of much of its load, and automatically improves server-side security. The Javascript components can be re-used in many ways, allowing custom pages to be developed for specific uses. In particular, standalone test-cases using small numbers of components make it easier to debug the Javascript than it is to debug a large server program. Information about PhEDEx is accessed through the PhEDEx data-service, since direct SQL is not available from the clients’ browser. This provides consistent semantics with other, externally written monitoring tools, which already use the data-service. It also reduces redundancy in the code, yielding a simpler, consolidated codebase. In this talk we describe our experience of re-factoring this monolithic server-side program into a lighter client-side framework. We describe some of the techniques that worked well for us, and some of the mistakes we made along the way. We present the current state of the project, and its future direction.

  12. [Examinations and exercises in medical technology utilizing a personal computer and the web].

    PubMed

    Niwa, Toshifumi

    2006-03-01

    The practice of e-learning in our department, utilizing freeware without additional cost, has been introduced: 1) Examinations and exercises are performed on the Web. Using the form-filling format of HTML, multiple-choice questions are asked. When the examinee submits the answers, the server processes the data using active server pages and sends the result to the examinee with the score and explanations. So far, the students have given the Web examination (exercise) system a good evaluation. Concerning the effect of the explanations given in the Web exercises on the written test, some improvements were observed in the enumeration-type questions. On the other hand, no such improvements were clearly observed in the explanation-type questions, which require essential understanding of the principle. This suggests that effective utilization of the materials strongly depends on the students' eagerness to study. 2) To understand the chemical structures of body constituents, an exercise to draw structural formulae on the personal computer (PC) is performed. The students had difficulty with the fine control of arranging the whole shape of the formula, in addition to setting the format of characters such as super- and subscripts. With respect to understanding, they had significant difficulty in finding the structures and in distinguishing stereoisomers; however, the students had fun with the structure-drawing software and found it convenient to draw structures. These findings suggest that PC exercises will be attractive to students of the "PC generation", and thus helpful for the understanding of and training in data analysis in medical technology.
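
    The grading step performed server-side can be illustrated with a minimal sketch: the submitted form body is parsed, each answer is compared against a key, and a score plus explanations for the wrong answers is returned, roughly what the active-server-page handler does when the examinee submits. The question identifiers, answer key and explanation texts below are invented placeholders.

        from urllib.parse import parse_qs

        # Hypothetical answer key and explanations for a three-question exercise.
        ANSWER_KEY = {"q1": "b", "q2": "d", "q3": "a"}
        EXPLANATIONS = {
            "q1": "Haemoglobin carries oxygen, so option b is correct.",
            "q2": "Creatinine clearance estimates glomerular filtration (option d).",
            "q3": "Glucose is measured enzymatically (option a).",
        }

        def grade(form_data):
            """Grade a submitted form body such as 'q1=b&q2=a&q3=a'."""
            answers = {k: v[0] for k, v in parse_qs(form_data).items()}
            score, feedback = 0, []
            for question, correct in ANSWER_KEY.items():
                if answers.get(question) == correct:
                    score += 1
                else:
                    feedback.append(f"{question}: {EXPLANATIONS[question]}")
            return score, feedback

        score, feedback = grade("q1=b&q2=a&q3=a")
        print(f"Score: {score}/{len(ANSWER_KEY)}")
        print("\n".join(feedback))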

  13. Automated Computer Access Request System

    NASA Technical Reports Server (NTRS)

    Snook, Bryan E.

    2010-01-01

    The Automated Computer Access Request (AutoCAR) system is a Web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a work-flow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic active server page (ASP) application hosted on a Microsoft Internet Information Server (IIS).

  14. MODBUS APPLICATION AT JEFFERSON LAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Jianxun; Seaton, Chad; Philip, Sarin

    Modbus is a client/server communication model. In our applications, the embedded Ethernet device XPort is designed as the server and a SoftIOC running EPICS Modbus is the client. The SoftIOC builds a Modbus request from parameters contained in a demand that is sent by the EPICS application to the Modbus Client interface. On reception of the Modbus request, the Modbus server activates a local action to read, write, or perform some other action. The main Modbus server functions are thus to wait for a Modbus request on TCP port 502, handle the request, and then build a Modbus response.
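
    A minimal Modbus/TCP exchange corresponding to this request/response cycle might look like the following sketch: the client packs an MBAP header and a "read holding registers" PDU (function code 3), sends it to the server on TCP port 502 and decodes the reply. The host name, unit identifier and register range are assumptions for the example, not values from the Jefferson Lab setup.

        import socket
        import struct

        def read_holding_registers(host, unit_id=1, start=0, count=2, tx_id=1):
            """Send one Modbus/TCP 'read holding registers' request and return the values."""
            pdu = struct.pack(">BHH", 3, start, count)                     # function 3, address, quantity
            mbap = struct.pack(">HHHB", tx_id, 0, len(pdu) + 1, unit_id)   # transaction, protocol 0, length, unit
            with socket.create_connection((host, 502), timeout=5) as sock:
                sock.sendall(mbap + pdu)
                header = sock.recv(7)                                      # MBAP header of the response
                _, _, length, _ = struct.unpack(">HHHB", header)
                body = sock.recv(length - 1)                               # function code, byte count, register data
                func = body[0]
                if func & 0x80:                                            # exception response from the server
                    raise IOError(f"Modbus exception code {body[1]}")
                byte_count = body[1]
                return struct.unpack(f">{byte_count // 2}H", body[2:2 + byte_count])

        # registers = read_holding_registers("xport.example.org")  # hypothetical device address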

  15. R3D-2-MSA: the RNA 3D structure-to-multiple sequence alignment server

    PubMed Central

    Cannone, Jamie J.; Sweeney, Blake A.; Petrov, Anton I.; Gutell, Robin R.; Zirbel, Craig L.; Leontis, Neocles

    2015-01-01

    The RNA 3D Structure-to-Multiple Sequence Alignment Server (R3D-2-MSA) is a new web service that seamlessly links RNA three-dimensional (3D) structures to high-quality RNA multiple sequence alignments (MSAs) from diverse biological sources. In this first release, R3D-2-MSA provides manual and programmatic access to curated, representative ribosomal RNA sequence alignments from bacterial, archaeal, eukaryal and organellar ribosomes, using nucleotide numbers from representative atomic-resolution 3D structures. A web-based front end is available for manual entry and an Application Program Interface for programmatic access. Users can specify up to five ranges of nucleotides and 50 nucleotide positions per range. The R3D-2-MSA server maps these ranges to the appropriate columns of the corresponding MSA and returns the contents of the columns, either for display in a web browser or in JSON format for subsequent programmatic use. The browser output page provides a 3D interactive display of the query, a full list of sequence variants with taxonomic information and a statistical summary of distinct sequence variants found. The output can be filtered and sorted in the browser. Previous user queries can be viewed at any time by resubmitting the output URL, which encodes the search and re-generates the results. The service is freely available with no login requirement at http://rna.bgsu.edu/r3d-2-msa. PMID:26048960

  16. GPCR-SSFE 2.0-a fragment-based molecular modeling web tool for Class A G-protein coupled receptors.

    PubMed

    Worth, Catherine L; Kreuchwig, Franziska; Tiemann, Johanna K S; Kreuchwig, Annika; Ritschel, Michele; Kleinau, Gunnar; Hildebrand, Peter W; Krause, Gerd

    2017-07-03

    G-protein coupled receptors (GPCRs) are key players in signal transduction and therefore a large proportion of pharmaceutical drugs target these receptors. Structural data of GPCRs are sparse yet important for elucidating the molecular basis of GPCR-related diseases and for performing structure-based drug design. To ameliorate this problem, GPCR-SSFE 2.0 (http://www.ssfa-7tmr.de/ssfe2/), an intuitive web server dedicated to providing three-dimensional Class A GPCR homology models, has been developed. The updated web server includes 27 inactive template structures and incorporates various new functionalities. Uniquely, it uses a fingerprint correlation scoring strategy for identifying the optimal templates, which we demonstrate captures structural features that sequence similarity alone cannot. Template selection is carried out separately for each helix, allowing both single-template models and fragment-based models to be built. Additionally, GPCR-SSFE 2.0 stores a comprehensive set of pre-calculated and downloadable homology models and also incorporates interactive loop modeling using the tool SL2, allowing knowledge-based input by the user to guide the selection process. For visual analysis, the NGL viewer is embedded into the result pages. Finally, blind-testing using two recently published structures shows that GPCR-SSFE 2.0 performs comparably to or better than other state-of-the-art GPCR modeling web servers. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  17. Globe Teachers Guide and Photographic Data on the Web

    NASA Technical Reports Server (NTRS)

    Kowal, Dan

    2004-01-01

    The task of managing the GLOBE Online Teacher's Guide during this time period focused on transforming the technology behind the delivery system of this document. The web application was transformed from a flat-file retrieval system to a dynamic database access approach. The new methodology utilizes Java Server Pages (JSP) on the front end and an Oracle relational database on the back end. This new approach allows users of the web site, mainly teachers, to access content efficiently by grade level and/or by investigation or educational concept area. Moreover, teachers can gain easier access to data sheets and lab and field guides. The new online guide also included updated content for all GLOBE protocols. The GLOBE web management team was given documentation for maintaining the new application. Instructions for modifying the JSP templates and managing database content were included in this document. It was delivered to the team by the end of October, 2003. The National Geophysical Data Center (NGDC) continued to manage the school study site photos on the GLOBE website. 333 study site photo images were added to the GLOBE database and posted on the web during this same time period for 64 schools. Documentation for processing study site photos was also delivered to the new GLOBE web management team. Lastly, assistance was provided in transferring reference applications such as the Cloud and LandSat quizzes and Earth Systems Online Poster from NGDC servers to GLOBE servers, along with documentation for maintaining these applications.

  18. SQLGEN: a framework for rapid client-server database application development.

    PubMed

    Nadkarni, P M; Cheung, K H

    1995-12-01

    SQLGEN is a framework for rapid client-server relational database application development. It relies on an active data dictionary on the client machine that stores metadata on one or more database servers to which the client may be connected. The dictionary generates dynamic Structured Query Language (SQL) to perform common database operations; it also stores information about the access rights of the user at log-in time, which is used to partially self-configure the behavior of the client to disable inappropriate user actions. SQLGEN uses a microcomputer database as the client to store metadata in relational form, to transiently capture server data in tables, and to allow rapid application prototyping followed by porting to client-server mode with modest effort. SQLGEN is currently used in several production biomedical databases.
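
    The "active data dictionary" idea, in which client-side metadata drives the generation of SQL for common operations, can be illustrated with a small sketch. The table description and the use of named bind placeholders below are assumptions for the example and are not taken from SQLGEN itself.

        # Hypothetical dictionary entry describing one server table.
        DICTIONARY = {
            "patient": {
                "columns": ["patient_id", "last_name", "first_name", "birth_date"],
                "primary_key": "patient_id",
            },
        }

        def select_by_key(table):
            """Generate a SELECT statement for fetching one row by primary key."""
            meta = DICTIONARY[table]
            cols = ", ".join(meta["columns"])
            return f"SELECT {cols} FROM {table} WHERE {meta['primary_key']} = :key"

        def insert_row(table):
            """Generate an INSERT statement with one named placeholder per column."""
            meta = DICTIONARY[table]
            cols = ", ".join(meta["columns"])
            params = ", ".join(f":{c}" for c in meta["columns"])
            return f"INSERT INTO {table} ({cols}) VALUES ({params})"

        print(select_by_key("patient"))
        print(insert_row("patient"))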

  19. ProBiS-ligands: a web server for prediction of ligands by examination of protein binding sites.

    PubMed

    Konc, Janez; Janežič, Dušanka

    2014-07-01

    The ProBiS-ligands web server predicts binding of ligands to a protein structure. Starting with a protein structure or binding site, ProBiS-ligands first identifies template proteins in the Protein Data Bank that share similar binding sites. Based on the superimpositions of the query protein and the similar binding sites found, the server then transposes the ligand structures from those sites to the query protein. Such ligand prediction supports many activities, e.g. drug repurposing. The ProBiS-ligands web server, an extension of the ProBiS web server, is open and free to all users at http://probis.cmm.ki.si/ligands. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699

  1. Clinical Digital Libraries Project: design approach and exploratory assessment of timely use in clinical environments*

    PubMed Central

    MacCall, Steven L.

    2006-01-01

    Objective: The paper describes and evaluates the use of Clinical Digital Libraries Project (CDLP) digital library collections in terms of their facilitation of timely clinical information seeking. Design: A convenience sample of CDLP Web server log activity over a twelve-month period (7/2002 to 6/2003) was analyzed for evidence of timely information seeking after users were referred to digital library clinical topic pages from Web search engines. Sample searches were limited to those originating from medical schools (26% North American and 19% non-North American) and from hospitals or clinics (51% North American and 4% non-North American). Measurement: Timeliness was determined based on a calculation of the difference between the timestamps of the first and last Web server log “hit” during each search in the sample. The calculated differences were mapped into one of three ranges: less than one minute, one to three minutes, and three to five minutes. Results: Of the 864 searches analyzed, 48% were less than 1 minute, 41% were 1 to 3 minutes, and 11% were 3 to 5 minutes. These results were further analyzed by environment (medical schools versus hospitals or clinics) and by geographic location (North America versus non-North American). Searches reflected a consistent pattern of less than 1 minute in these environments. Though the results were not consistent on a month-by-month basis over the entire time period, data for 8 of 12 months showed that searches shorter than 1 minute predominated and data for 1 month showed an equal number of less than 1 minute and 1 to 3 minute searches. Conclusions: The CDLP digital library collections provided timely access to high-quality Web clinical resources when used for information seeking in medical education and hospital or clinic environments from North American and non–North American locations and consistently provided access to the sought information within the documented two-minute standard. The limitations of the use of Web server data warrant an exploratory assessment. This research also suggests the need for further investigation in the area of timely digital library collection services to clinical environments. PMID:16636712
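
    The timeliness measure used in this study can be expressed directly in code: for one search session, take the difference between the timestamps of the first and last server log hits and map it into the reported ranges. The sketch below assumes Apache-style log timestamps; the cut-off values follow the ranges named in the abstract.

        from datetime import datetime

        def search_duration_bucket(hit_timestamps):
            """hit_timestamps: strings like '10/Jul/2002:14:03:21' for one search session."""
            fmt = "%d/%b/%Y:%H:%M:%S"
            times = sorted(datetime.strptime(t, fmt) for t in hit_timestamps)
            seconds = (times[-1] - times[0]).total_seconds()
            if seconds < 60:
                return "less than one minute"
            if seconds <= 180:
                return "one to three minutes"
            if seconds <= 300:
                return "three to five minutes"
            return "over five minutes"

        print(search_duration_bucket(["10/Jul/2002:14:03:21", "10/Jul/2002:14:03:52"]))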

  2. Clinical Digital Libraries Project: design approach and exploratory assessment of timely use in clinical environments.

    PubMed

    Maccall, Steven L

    2006-04-01

    The paper describes and evaluates the use of Clinical Digital Libraries Project (CDLP) digital library collections in terms of their facilitation of timely clinical information seeking. A convenience sample of CDLP Web server log activity over a twelve-month period (7/2002 to 6/2003) was analyzed for evidence of timely information seeking after users were referred to digital library clinical topic pages from Web search engines. Sample searches were limited to those originating from medical schools (26% North American and 19% non-North American) and from hospitals or clinics (51% North American and 4% non-North American). Timeliness was determined based on a calculation of the difference between the timestamps of the first and last Web server log "hit" during each search in the sample. The calculated differences were mapped into one of three ranges: less than one minute, one to three minutes, and three to five minutes. Of the 864 searches analyzed, 48% were less than 1 minute, 41% were 1 to 3 minutes, and 11% were 3 to 5 minutes. These results were further analyzed by environment (medical schools versus hospitals or clinics) and by geographic location (North America versus non-North American). Searches reflected a consistent pattern of less than 1 minute in these environments. Though the results were not consistent on a month-by-month basis over the entire time period, data for 8 of 12 months showed that searches shorter than 1 minute predominated and data for 1 month showed an equal number of less than 1 minute and 1 to 3 minute searches. The CDLP digital library collections provided timely access to high-quality Web clinical resources when used for information seeking in medical education and hospital or clinic environments from North American and non-North American locations and consistently provided access to the sought information within the documented two-minute standard. The limitations of the use of Web server data warrant an exploratory assessment. This research also suggests the need for further investigation in the area of timely digital library collection services to clinical environments.

  3. Dynamic XML-based exchange of relational data: application to the Human Brain Project.

    PubMed

    Tang, Zhengming; Kadiyska, Yana; Li, Hao; Suciu, Dan; Brinkley, James F

    2003-01-01

    This paper discusses an approach to exporting relational data in XML format for data exchange over the web. We describe the first real-world application of SilkRoute, a middleware program that dynamically converts existing relational data to a user-defined XML DTD. The application, called XBrain, wraps SilkRoute in a Java Server Pages framework, thus permitting a web-based XQuery interface to a legacy relational database. The application is demonstrated as a query interface to the University of Washington Brain Project's Language Map Experiment Management System, which is used to manage data about language organization in the brain.
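
    The underlying idea, publishing relational rows as XML elements laid out according to a user-defined structure, can be illustrated with a small sketch (the real system derives the mapping from a DTD and answers XQuery requests, which this does not attempt). The table name, columns and element names below are invented for the example.

        import sqlite3
        import xml.etree.ElementTree as ET

        def rows_to_xml(db_path, root_tag="experiments"):
            """Export one relational table as a simple XML document."""
            root = ET.Element(root_tag)
            conn = sqlite3.connect(db_path)
            cursor = conn.execute("SELECT id, subject, region, response FROM stimulation")
            for row_id, subject, region, response in cursor:
                exp = ET.SubElement(root, "experiment", id=str(row_id))
                ET.SubElement(exp, "subject").text = subject
                ET.SubElement(exp, "region").text = region
                ET.SubElement(exp, "response").text = response
            conn.close()
            return ET.tostring(root, encoding="unicode")

        # print(rows_to_xml("language_map.db"))  # hypothetical database file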

  4. Bio-inspired diversity for increasing attacker workload

    NASA Astrophysics Data System (ADS)

    Kuhn, Stephen

    2014-05-01

    Much of the traffic in modern computer networks is conducted between clients and servers, rather than client-to-client. As a result, servers represent a high-value target for collection and analysis of network traffic. Because they reside at a single network location (i.e., IP/MAC address) for long periods of time, servers present a static target for surveillance and a unique opportunity to observe the network traffic. Although servers present a heightened value for attackers, the security community as a whole has shifted more towards protecting clients in recent years, leaving a gap in coverage. In addition, servers typically remain active on networks for years, potentially decades. This paper builds on previous work that demonstrated a proof of concept leveraging existing technology for increasing attacker workload. Here we present our clean-slate approach to increasing attacker workload through a novel hypervisor and micro-kernel, utilizing next-generation virtualization technology to create synthetic diversity of the server's presence, including the hardware components.

  5. Data Access System for Hydrology

    NASA Astrophysics Data System (ADS)

    Whitenack, T.; Zaslavsky, I.; Valentine, D.; Djokic, D.

    2007-12-01

    As part of the CUAHSI HIS (Consortium of Universities for the Advancement of Hydrologic Science, Inc., Hydrologic Information System), the CUAHSI HIS team has developed the Data Access System for Hydrology, or DASH. DASH is based on commercial off-the-shelf technology and has been developed in conjunction with a commercial partner, ESRI. DASH is a web-based user interface, developed in ASP.NET using ESRI ArcGIS Server 9.2, that provides a mapping, querying and data retrieval interface over observation and GIS databases and web services. This is the front-end application for the CUAHSI Hydrologic Information System Server. The HIS Server is a software stack that organizes observation databases, geographic data layers, data importing and management tools, and online user interfaces such as the DASH application into a flexible multi-tier application for serving both national-level and locally maintained observation data. The user interface of the DASH web application allows online users to query observation networks by location and attributes, selecting stations in a user-specified area where a particular variable was measured during a given time interval. Once one or more stations and variables are selected, the user can retrieve and download the observation data for further off-line analysis. The DASH application is highly configurable. The mapping interface can be configured to display map services from multiple sources in multiple formats, including ArcGIS Server, ArcIMS, and WMS. The observation network data is configured in an XML file where the network's web service location and its corresponding map layer are specified. Upon initial deployment, two national-level observation networks (USGS NWIS daily values and USGS NWIS instantaneous values) are already pre-configured. There is also an optional login page which can be used to restrict access as well as to provide an alternative to immediate downloads. For large requests, users are notified via email with a link to their data when it is ready.

  6. Data Publishing and Sharing Via the THREDDS Data Repository

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Caron, J.; Davis, E.; Baltzer, T.

    2007-12-01

    The terms "Team Science" and "Networked Science" have been coined to describe a virtual organization of researchers tied via some intellectual challenge, but often located in different organizations and locations. A critical component to these endeavors is publishing and sharing of content, including scientific data. Imagine pointing your web browser to a web page that interactively lets you upload data and metadata to a repository residing on a remote server, which can then be accessed by others in a secure fasion via the web. While any content can be added to this repository, it is designed particularly for storing and sharing scientific data and metadata. Server support includes uploading of data files that can subsequently be subsetted, aggregrated, and served in NetCDF or other scientific data formats. Metadata can be associated with the data and interactively edited. The THREDDS Data Repository (TDR) is a server that provides client initiated, on demand, location transparent storage for data of any type that can then be served by the THREDDS Data Server (TDS). The TDR provides functionality to: * securely store and "own" data files and associated metadata * upload files via HTTP and gridftp * upload a collection of data as single file * modify and restructure repository contents * incorporate metadata provided by the user * generate additional metadata programmatically * edit individual metadata elements The TDR can exist separately from a TDS, serving content via HTTP. Also, it can work in conjunction with the TDS, which includes functionality to provide: * access to data in a variety of formats via -- OPeNDAP -- OGC Web Coverage Service (for gridded datasets) -- bulk HTTP file transfer * a NetCDF view of datasets in NetCDF, OPeNDAP, HDF-5, GRIB, and NEXRAD formats * serving of very large volume datasets, such as NEXRAD radar * aggregation into virtual datasets * subsetting via OPeNDAP and NetCDF Subsetting services This talk will discuss TDR/TDS capabilities as well as how users can install this software to create their own repositories.

  7. 77 FR 38608 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-28

    ..., professional qualifications and skills, training courses completed, certifications received, level of education... Activity Unit Identification Code (UIC).'' Safeguards: Delete entry and replace with ``The NTMPS servers.... All data transferred is encrypted. The interface server is protected from attempts to penetrate the...

  8. Video streaming technologies using ActiveX and LabVIEW

    NASA Astrophysics Data System (ADS)

    Panoiu, M.; Rat, C. L.; Panoiu, C.

    2015-06-01

    The goal of this paper is to present the possibilities of remote image processing through data exchange between two programming technologies: LabVIEW and ActiveX. ActiveX refers to the process of controlling one program from another via an ActiveX component, where one program acts as the client and the other as the server. LabVIEW can be either client or server. Both programs (client and server) exist independently of each other but are able to share information. The client communicates with the ActiveX objects that the server exposes to allow the sharing of information [7]. In the case of video streaming [1] [2], most ActiveX controls can only display the data, being incapable of transforming it into a data type that LabVIEW can process. This becomes problematic when the system is used for remote image processing. The LabVIEW environment itself provides few, if any, possibilities for video streaming, and the methods it does offer are usually not high performance, but it possesses high-performance toolkits and modules specialized in image processing, making it ideal for processing the captured data. Therefore, we chose to use existing software specialized in video streaming along with LabVIEW and to capture the data it provides for further use within LabVIEW. The software we studied (the ActiveX controls of a series of media players that utilize streaming technology) provides high-quality data and a very small transmission delay, ensuring the reliability of the results of the image processing.

  9. Data and Data Products for Climate Research: Web Services at the Asia-Pacific Data-Research Center (APDRC)

    NASA Astrophysics Data System (ADS)

    DeCarlo, S.; Potemra, J. T.; Wang, K.

    2012-12-01

    The International Pacific Research Center (IPRC) at the University of Hawaii maintains a data center for climate studies called the Asia-Pacific Data-Research Center (APDRC). This data center was designed within a center of excellence in climate research with the intention of serving the needs of the research scientist. The APDRC provides easy access to a wide collection of climate data and data products for a wide variety of users. The data center maintains an archive of approximately 100 data sets including in-situ and remote data, as well as a range of model-based output. All data are available via on-line browsing tools such as a Live Access Server (LAS) and DChart, and direct binary access is available through OPeNDAP services. On-line tutorials on how to use these services are now available. Users can keep up-to-date with new data and product announcements via the APDRC Facebook page. The main focus of the APDRC has been climate scientists, and the services are therefore streamlined to such users, both in the number and types of data served and in the way data are served. In addition, due to the integration of the APDRC within the IPRC, several value-added data products (see the figure for an example using Argo floats) have been developed via a variety of research activities. The APDRC, therefore, has three main foci: 1. acquisition of climate-related data, 2. maintenance of integrated data servers, and 3. development and distribution of data products. The APDRC can be found at http://apdrc.soest.hawaii.edu. The presentation will provide an overview along with specific examples of the data, data products and data services available at the APDRC. [Figure: APDRC product example, a gridded field from Argo profiling floats.]

  10. OPeNDAP servers like Hyrax and TDS can easily support common single-sign-on authentication protocols using the Apache httpd and related software; adding support for these protocols to clients can be more challenging

    NASA Astrophysics Data System (ADS)

    Gallagher, J. H. R.; Potter, N.; Evans, B. J. K.

    2016-12-01

    OPeNDAP, in conjunction with the Australian National University, documented the installation process needed to add authentication to OPeNDAP-enabled data servers (Hyrax, TDS, etc.) and examined 13 OPeNDAP clients to determine how best to add authentication using LDAP, Shibboleth and OAuth2 (we used NASA's URS). We settled on a server configuration (architecture) that uses the Apache web server and a collection of open-source modules to perform the authentication and authorization actions. This is not the only way to accomplish those goals, but using Apache represents a good balance between functionality and leveraging existing work that has been well vetted, and it includes support for a wide variety of web services, including those that depend on a servlet engine such as Tomcat (which both Hyrax and TDS do). Our work shows how LDAP, OAuth2 and Shibboleth can all be accommodated using this readily available software stack. Also important is that the Apache software is very widely used and is fairly robust, which is extremely important for security software components. In order to make use of a server requiring authentication, clients must support the authentication process. Because HTTP has included authentication for well over a decade, and because HTTP/HTTPS can be used by simply linking programs with a library, both the LDAP and OAuth2/URS authentication schemes have almost universal support within the OPeNDAP client base. The clients, i.e. the HTTP client libraries they employ, understand how to submit credentials to the correct server when confronted with an HTTP/S Unauthorized (401) response. Interestingly, OAuth2 can achieve its SSO objectives while relying entirely on normative HTTP transport. All 13 of the clients examined worked. The situation with Shibboleth is different. While Shibboleth does use HTTP, it also requires the client to either scrape a web page or support the SAML2.0 ECP profile, which, for programmatic clients, means using SOAP messages. Since working with SOAP is outside the scope of HTTP, support for Shibboleth must be added explicitly to the client software. Some of the potential burden of enabling OPeNDAP clients to work with Shibboleth may be mitigated by getting both the NetCDF-C and NetCDF-Java libraries to use the Shibboleth ECP profile. If done, this would get 9 of the 13 clients we examined working.
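
    The client-side behaviour described above, submitting credentials only after the server answers with a 401 challenge, is straightforward to express with an ordinary HTTP library, which is why most OPeNDAP clients support LDAP and OAuth2/URS without modification. The sketch below is illustrative; the URL and credentials are placeholders, and a real URS flow also involves redirects to the token service, which the same session object would follow.

        import requests

        def fetch_dds(url, username=None, password=None):
            """Request an OPeNDAP resource, retrying with credentials on a 401 challenge."""
            session = requests.Session()
            response = session.get(url, timeout=30)
            if response.status_code == 401 and username:
                # Resubmit the same request with HTTP credentials after the challenge.
                response = session.get(url, auth=(username, password), timeout=30)
            response.raise_for_status()
            return response.text

        # print(fetch_dds("https://opendap.example.org/data/sst.nc.dds", "user", "pw"))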

  11. The PhEDEx next-gen website

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egeland, R.; Huang, C. H.; Rossman, P.

    PhEDEx is the data-transfer management solution written by CMS. It consists of agents running at each site, a website for presentation of information, and a web-based data-service for scripted access to information. The website allows users to monitor the progress of data-transfers, the status of site agents and links between sites, and the overall status and behaviour of everything about PhEDEx. It also allows users to make and approve requests for data-transfers and for deletion of data. It is the main point-of-entry for all users wishing to interact with PhEDEx. For several years, the website has consisted of a single perl program with about 10K SLOC. This program has limited capabilities for exploring the data, with only coarse filtering capabilities and no context-sensitive awareness. Graphical information is presented as static images, generated on the server, with no interactivity. It is also not well connected to the rest of the PhEDEx codebase, since much of it was written before the data-service was developed. All this makes it hard to maintain and extend. We are re-implementing the website to address these issues. The UI is being rewritten in Javascript, replacing most of the server-side code. We are using the YUI toolkit to provide advanced features and context-sensitive interaction, and will adopt a Javascript charting library for generating graphical representations client-side. This relieves the server of much of its load, and automatically improves server-side security. The Javascript components can be re-used in many ways, allowing custom pages to be developed for specific uses. In particular, standalone test-cases using small numbers of components make it easier to debug the Javascript than it is to debug a large server program. Information about PhEDEx is accessed through the PhEDEx data-service, since direct SQL is not available from the clients' browser. This provides consistent semantics with other, externally written monitoring tools, which already use the data-service. It also reduces redundancy in the code, yielding a simpler, consolidated codebase. In this talk we describe our experience of re-factoring this monolithic server-side program into a lighter client-side framework. We describe some of the techniques that worked well for us, and some of the mistakes we made along the way. We present the current state of the project, and its future direction.

  12. Continuous integration and quality control for scientific software

    NASA Astrophysics Data System (ADS)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, or style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a Web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available for scientific communities. One regular customer is already the developer group of the DiFX software correlator project.

  13. e-Stars Template Builder

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    2003-01-01

    e-Stars Template Builder is a computer program that enables users to rapidly gain access to information on projects of NASA's Jet Propulsion Laboratory. The information about a given project is not stored in a database, but rather in a network that follows the project as it develops. e-Stars Template Builder resides on a server computer, using Practical Extraction and Reporting Language (PERL) scripts to create what are called "e-STARS node templates," which are software constructs that allow for project-specific configurations. The software resides on the server and does not require specific software on the user's machine except for an Internet browser. e-Stars Template Builder is compatible with Windows, Macintosh, and UNIX operating systems. A user invokes e-Stars Template Builder from a browser window. Operations that can be performed by the user include the creation of child processes and the addition of links and descriptions of documentation to existing pages or nodes. Through this addition of "child processes" of nodes, a network that reflects the development of a project is generated.

  14. Practical Issues of Wireless Mobile Devices Usage with Downlink Optimization

    NASA Astrophysics Data System (ADS)

    Krejcar, Ondrej; Janckulik, Dalibor; Motalova, Leona

    Mobile device makers produce tens of new, complex mobile devices per year, aiming to give users a device that lets them do anything, anywhere, anytime. These devices can run full-scale applications with nearly the same comfort as their desktop equivalents, with only a few limitations. One such limitation is insufficient download speed over wireless connections in the case of large multimedia files. The main focus of the paper is describing possible solutions to this problem, together with tests of several new mobile devices, server interface tests, and descriptions of common software. New devices offer a full range of wireless connectivity, which can be used for more than communication with the outside world; several such possibilities are described. Mobile users will also have an online connection to the Internet whenever the device is powered on. The Internet is still mainly web pages, but the use of web services continues to accelerate. The paper also deals with the maximum number of users that can connect simultaneously to a given server type. Finally, a new kind of database access, LINQ, is compared with ADO.NET in terms of response time.

  15. WriteShield: A Pseudo Thin Client for Prevention of Information Leakage

    NASA Astrophysics Data System (ADS)

    Kirihata, Yasuhiro; Sameshima, Yoshiki; Onoyama, Takashi; Komoda, Norihisa

    While thin-client systems are diffusing as an effective security method in enterprises and organizations, a new approach called the pseudo thin-client system has emerged. In this system, the local disks of clients are write-protected and user data must be saved on a central file server, realizing the same security effect as conventional thin-client systems. Since it takes a purely software-based approach, it does not require hardware upgrades of the network and servers, which reduces the installation cost. However, there are several problems, such as no write control for external media, the possibility of memory depletion, and lower security because system processes are granted exceptional write permission. In this paper, we propose WriteShield, a pseudo thin-client system which solves these issues. In this system, the local disks are write-protected with a volume filter driver, and a virtual cache mechanism extends the memory cache size available for the write protection. This paper presents the design and implementation details of WriteShield. In addition, we describe a security analysis and a simulation evaluation of paging algorithms for the virtual cache mechanism, and measure the disk I/O performance to verify its feasibility in an actual environment.
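
    The virtual cache described above is essentially a copy-on-write overlay: writes land in memory (and may be paged out) while the protected disk is never modified. A minimal, purely illustrative sketch of that idea (block layout and names are invented):

      class WriteOverlay:
          """Copy-on-write overlay over a read-only block device (illustration only)."""
          def __init__(self, readonly_blocks):
              self.base = readonly_blocks   # the write-protected local disk
              self.cache = {}               # block number -> modified data (the virtual cache)

          def read(self, block_no):
              # Prefer the cached (modified) copy, fall back to the protected disk.
              return self.cache.get(block_no, self.base[block_no])

          def write(self, block_no, data):
              # Writes never reach the protected disk; they stay in the overlay.
              self.cache[block_no] = data

      disk = ["block0", "block1", "block2"]
      overlay = WriteOverlay(disk)
      overlay.write(1, "modified")
      assert overlay.read(1) == "modified" and disk[1] == "block1"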

  16. Chemozart: a web-based 3D molecular structure editor and visualizer platform.

    PubMed

    Mohebifar, Mohamad; Sajadi, Fatemehsadat

    2015-01-01

    Chemozart is a 3D molecule editor and visualizer built on top of native web components. It offers an easy-to-access service, a user-friendly graphical interface and a modular design. It is a client-centric web application which communicates with the server via a representational state transfer (REST) style web service. Both the client-side and server-side applications are written in JavaScript. A combination of JavaScript and HTML is used to draw three-dimensional structures of molecules. With the help of WebGL, a three-dimensional visualization tool is provided. Using CSS3 and HTML5, a user-friendly interface is composed. More than 30 packages are used to compose this application, which makes it flexible enough to be extended. Molecular structures can be drawn on all types of platforms, and the application is compatible with mobile devices. No installation is required in order to use the application, and it can be accessed through the internet. The application can be extended on both the server side and the client side by implementing modules in JavaScript. Molecular compounds are drawn on the HTML5 Canvas element using a WebGL context. Chemozart is a chemical platform which is powerful, flexible, and easy to access. It provides an online web-based tool for chemical visualization along with result-oriented optimization through a cloud-based API (application programming interface). JavaScript libraries which allow the creation of web pages containing interactive three-dimensional molecular structures have also been made available. The application has been released under the Apache 2 License and is available from the project website https://chemozart.com.

  17. R3D-2-MSA: the RNA 3D structure-to-multiple sequence alignment server.

    PubMed

    Cannone, Jamie J; Sweeney, Blake A; Petrov, Anton I; Gutell, Robin R; Zirbel, Craig L; Leontis, Neocles

    2015-07-01

    The RNA 3D Structure-to-Multiple Sequence Alignment Server (R3D-2-MSA) is a new web service that seamlessly links RNA three-dimensional (3D) structures to high-quality RNA multiple sequence alignments (MSAs) from diverse biological sources. In this first release, R3D-2-MSA provides manual and programmatic access to curated, representative ribosomal RNA sequence alignments from bacterial, archaeal, eukaryal and organellar ribosomes, using nucleotide numbers from representative atomic-resolution 3D structures. A web-based front end is available for manual entry and an Application Program Interface for programmatic access. Users can specify up to five ranges of nucleotides and 50 nucleotide positions per range. The R3D-2-MSA server maps these ranges to the appropriate columns of the corresponding MSA and returns the contents of the columns, either for display in a web browser or in JSON format for subsequent programmatic use. The browser output page provides a 3D interactive display of the query, a full list of sequence variants with taxonomic information and a statistical summary of distinct sequence variants found. The output can be filtered and sorted in the browser. Previous user queries can be viewed at any time by resubmitting the output URL, which encodes the search and re-generates the results. The service is freely available with no login requirement at http://rna.bgsu.edu/r3d-2-msa. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
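
    A programmatic query might look roughly like the sketch below; only the base URL comes from the record, while the endpoint parameters and values are illustrative assumptions, so the server's own documentation should be consulted for the actual API:

      import json
      import urllib.parse
      import urllib.request

      # Hypothetical parameter names and values; only the base URL is taken from the record.
      params = urllib.parse.urlencode({
          "structure": "4V9F",            # representative 3D structure (example value)
          "ranges": "120:130,2643:2648",  # up to five nucleotide ranges may be specified
      })
      url = "http://rna.bgsu.edu/r3d-2-msa?" + params

      with urllib.request.urlopen(url) as resp:   # JSON output for programmatic use
          variants = json.loads(resp.read().decode())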

  18. A decade of Web Server updates at the Bioinformatics Links Directory: 2003-2012.

    PubMed

    Brazas, Michelle D; Yim, David; Yeung, Winston; Ouellette, B F Francis

    2012-07-01

    The 2012 Bioinformatics Links Directory update marks the 10th special Web Server issue from Nucleic Acids Research. Beginning with content from their 2003 publication, the Bioinformatics Links Directory in collaboration with Nucleic Acids Research has compiled and published a comprehensive list of freely accessible, online tools, databases and resource materials for the bioinformatics and life science research communities. The past decade has exhibited significant growth and change in the types of tools, databases and resources being put forth, reflecting both technology changes and the nature of research over that time. With the addition of 90 web server tools and 12 updates from the July 2012 Web Server issue of Nucleic Acids Research, the Bioinformatics Links Directory at http://bioinformatics.ca/links_directory/ now contains an impressive 134 resources, 455 databases and 1205 web server tools, mirroring the continued activity and efforts of our field.

  19. Web Transfer Over Satellites Being Improved

    NASA Technical Reports Server (NTRS)

    Allman, Mark

    1999-01-01

    Extensive research conducted by NASA Lewis Research Center's Satellite Networks and Architectures Branch and the Ohio University has demonstrated performance improvements in World Wide Web transfers over satellite-based networks. The use of a new version of the Hypertext Transfer Protocol (HTTP) reduced the time required to load web pages over a single Transmission Control Protocol (TCP) connection traversing a satellite channel. However, an older technique of simultaneously making multiple requests of a given server has been shown to provide even faster transfer time. Unfortunately, the use of multiple simultaneous requests has been shown to be harmful to the network in general. Therefore, we are developing new mechanisms for the HTTP protocol which may allow a single request at any given time to perform as well as, or better than, multiple simultaneous requests. In the course of study, we also demonstrated that the time for web pages to load is at least as short via a satellite link as it is via a standard 28.8-kbps dialup modem channel. This demonstrates that satellites are a viable means of accessing the Internet.
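
    The trade-off discussed above -- one persistent connection issuing requests in sequence versus several simultaneous connections -- can be sketched as follows (the host and object list are placeholders):

      import http.client
      from concurrent.futures import ThreadPoolExecutor

      HOST = "example.com"
      OBJECTS = ["/", "/style.css", "/logo.png"]   # objects making up a page (placeholders)

      def fetch_sequential():
          # One persistent TCP connection; all requests share its congestion window.
          conn = http.client.HTTPConnection(HOST)
          for path in OBJECTS:
              conn.request("GET", path)
              conn.getresponse().read()
          conn.close()

      def fetch_parallel():
          # Multiple simultaneous connections; often faster for the user but harder on the network.
          def one(path):
              c = http.client.HTTPConnection(HOST)
              c.request("GET", path)
              c.getresponse().read()
              c.close()
          with ThreadPoolExecutor(max_workers=len(OBJECTS)) as pool:
              list(pool.map(one, OBJECTS))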

  20. Ultrabroadband photonic internet: safety aspects

    NASA Astrophysics Data System (ADS)

    Kalicki, Arkadiusz; Romaniuk, Ryszard

    2008-11-01

    Web applications have become the most popular medium on the Internet. Their popularity and the ease of use of web application frameworks, together with careless development, result in a high number of vulnerabilities and attacks. Several types of attacks are possible because of improper input validation. SQL injection is the ability to execute arbitrary SQL queries in a database through an existing application. Cross-site scripting is a vulnerability that allows malicious web users to inject code into the web pages viewed by other users. Cross-Site Request Forgery (CSRF) is an attack that tricks the victim into loading a page that contains a malicious request. Web spam in blogs is a further problem. Several techniques mitigate these attacks. The most important are strong web application design, correct input validation, defined data types for each field, and parameterized statements in SQL queries. Server hardening with a firewall, modern security policy systems, and safe configuration of the web framework interpreter are essential. On the client side, it is advisable to maintain a proper security level, keep software updated, and install personal firewalls or IDS/IPS systems. Good habits include logging out of services immediately after finishing work and even using a separate web browser for the most important sites, such as e-banking.
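
    As a concrete illustration of the parameterized-statement advice, the following sketch (Python with SQLite, purely illustrative) contrasts string concatenation, which is injectable, with a bound parameter:

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
      conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

      user_input = "alice' OR '1'='1"   # hostile input

      # Vulnerable: the input becomes part of the SQL text and changes the query logic.
      injected = conn.execute(
          "SELECT role FROM users WHERE name = '" + user_input + "'").fetchall()
      print(injected)   # [('admin',)] -- the WHERE clause was subverted

      # Safe: the driver binds the value, so it can never alter the statement structure.
      safe = conn.execute(
          "SELECT role FROM users WHERE name = ?", (user_input,)).fetchall()
      print(safe)       # [] -- the hostile string matches no user name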

  1. SSPI - Space Service Provider Infrastructure: Image Information Mining and Management Prototype for a Distributed Environment

    NASA Astrophysics Data System (ADS)

    Candela, L.; Ruggieri, G.; Giancaspro, A.

    2004-09-01

    Within the sphere of the Italian Space Agency's "Multi-Mission Ground Segment" project, several innovative technologies, such as CORBA[1], Z39.50[2], XML[3], Java[4], Java Server Pages[4] and C++, have been tested. The SSPI system (Space Service Provider Infrastructure) is the prototype of a distributed environment aimed at facilitating access to Earth Observation (EO) data. SSPI allows users to ingest, archive, consolidate, visualize and evaluate these data. Hence, SSPI is not just a database or a data repository, but an application that, by means of a set of protocols, standards and specifications, provides unified access to multi-mission EO data.

  2. MED31/437: A Web-based Diabetes Management System: DiabNet

    PubMed Central

    Zhao, N; Roudsari, A; Carson, E

    1999-01-01

    Introduction: A web-based system (DiabNet) was developed to provide instant access to the Electronic Diabetes Records (EDR) for end-users, and real-time information for healthcare professionals to facilitate their decision-making. It integrates a portable glucometer, a handheld computer, a mobile phone and Internet access as a combined telecommunication and mobile computing solution for diabetes management. Methods: Active Server Pages (ASP) embedded with advanced ActiveX controls and VBScript were developed to allow remote data upload, retrieval and interpretation. Some advisory and Internet-based learning features, together with a video teleconferencing component, make the DiabNet web site an informative platform for Web consultation. Results: The evaluation of the system is being carried out among several UK Internet diabetes discussion groups and the Diabetes Day Centre at Guy's & St. Thomas' Hospital. Much positive feedback has been received from the web site, demonstrating that DiabNet is an advanced web-based diabetes management system which can help patients keep closer control of self-monitored blood glucose remotely, and an integrated diabetes information resource that offers telemedicine knowledge in diabetes management. Discussion: In summary, DiabNet introduces innovative online diabetes management concepts, such as online appointments and consultations, to enable users to access diabetes management information without time or location limitations and without security concerns.

  3. Sensor Fusion for Nuclear Proliferation Activity Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adel Ghanem, Ph D

    2007-03-30

    The objective of Phase 1 of this STTR project is to demonstrate a Proof-of-Concept (PoC) of the Geo-Rad system that integrates a location-aware SmartTag (made by ZonTrak) and a radiation detector (developed by LLNL). It also includes the ability to transmit the collected radiation data and location information to the ZonTrak server (ZonService). The collected data is further transmitted to a central server at LLNL (the Fusion Server) to be processed in conjunction with overhead imagery to generate location estimates of nuclear proliferation and radiation sources.

  4. EnviroAtlas - Metrics for Austin, TX

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://enviroatlas.epa.gov/EnviroAtlas). The layers in this web service depict ecosystem services at the census block group level for the community of Austin, Texas. These layers illustrate the ecosystems and natural resources that are associated with clean air (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_CleanAir/MapServer); clean and plentiful water (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_CleanPlentifulWater/MapServer); natural hazard mitigation (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_NaturalHazardMitigation/MapServer); climate stabilization (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_ClimateStabilization/MapServer); food, fuel, and materials (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_FoodFuelMaterials/MapServer); recreation, culture, and aesthetics (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_RecreationCultureAesthetics/MapServer); and biodiversity conservation (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_BiodiversityConservation/MapServer), and factors that place stress on those resources. EnviroAtlas allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the conterminous United States as well as de

  5. CBS Genome Atlas Database: a dynamic storage for bioinformatic results and sequence data.

    PubMed

    Hallin, Peter F; Ussery, David W

    2004-12-12

    Currently, new bacterial genomes are being published on a monthly basis. With the growing amount of genome sequence data, there is a demand for a flexible and easy-to-maintain structure for storing sequence data and results from bioinformatic analysis. More than 150 sequenced bacterial genomes are now available, and comparisons of properties for taxonomically similar organisms are not readily available to many biologists. In addition to the most basic information, such as AT content, chromosome length, tRNA count and rRNA count, a large number of more complex calculations are needed to perform detailed comparative genomics. DNA structural calculations like curvature and stacking energy, and DNA compositions like base skews, oligo skews and repeats at the local and global level, are just a few of the analyses presented on the CBS Genome Atlas Web page. Complex analyses, changing methods and the frequent addition of new models are factors that require a dynamic database layout. Using basic tools like the GNU Make system, csh, Perl and MySQL, we have created a flexible database environment for storing and maintaining such results for a collection of complete microbial genomes. Currently, these results amount to more than 220 pieces of information. The backbone of this solution consists of a program package written in Perl, which enables administrators to synchronize and update the database content. The MySQL database has been connected to the CBS web server via PHP4, to present dynamic web content for users outside the center. This solution is tightly fitted to the existing server infrastructure, and the approach proposed here can perhaps serve as a template for other research groups facing similar database issues. A web-based user interface which is dynamically linked to the Genome Atlas Database can be accessed via www.cbs.dtu.dk/services/GenomeAtlas/. This paper has a supplemental information page which links to the examples presented: www.cbs.dtu.dk/services/GenomeAtlas/suppl/bioinfdatabase.

  6. A Web Terminology Server Using UMLS for the Description of Medical Procedures

    PubMed Central

    Burgun, Anita; Denier, Patrick; Bodenreider, Olivier; Botti, Geneviève; Delamarre, Denis; Pouliquen, Bruno; Oberlin, Philippe; Lévéque, Jean M.; Lukacs, Bertrand; Kohler, François; Fieschi, Marius; Le Beux, Pierre

    1997-01-01

    The Model for Assistance in the Orientation of a User within Coding Systems (MAOUSSC) project has been designed to provide a representation for medical and surgical procedures that allows several applications to be developed from several viewpoints. It is based on a conceptual model, a controlled set of terms, and Web server development. The design includes the UMLS knowledge sources associated with additional knowledge about medico-surgical procedures. The model was implemented using a relational database. The authors developed a complete interface for the Web presentation, with the intermediary layer being written in PERL. The server has been used for the representation of medico-surgical procedures that occur in the discharge summaries of the national survey of hospital activities that is performed by the French Health Statistics Agency in order to produce inpatient profiles. The authors describe the current status of the MAOUSSC server and discuss their interest in using such a server to assist in the coordination of terminology tasks and in the sharing of controlled terminologies. PMID:9292841

  7. IPG Job Manager v2.0 Design Documentation

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2003-01-01

    This viewgraph presentation provides a high-level design of the IPG Job Manager, and satisfies its Master Requirement Specification v2.0 Revision 1.0, 01/29/2003. The presentation includes a Software Architecture/Functional Overview with the following: Job Model; Job Manager Client/Server Architecture; Job Manager Client (Job Manager Client Class Diagram and Job Manager Client Activity Diagram); Job Manager Server (Job Manager Client Class Diagram and Job Manager Client Activity Diagram); Development Environment; Project Plan; Requirement Traceability.

  8. PTB’S Time and Frequency Activities in 2006: New DCF77 Electronics, New NTP Servers, and Calibration Activities

    DTIC Science & Technology

    2007-01-01

    ... (PTTI) Meeting ... (TWSTFT) is being routinely performed with several European and US stations. On the initiative of NICT, a TWSTFT link was ... During the last 2 years, PTB has upgraded its TWSTFT and GPS capabilities in order to achieve better reliability and robustness against system failures ... NTP server, and, briefly, the calibration of the international time links, i.e. the result of the latest calibration of the TWSTFT links to the USNO

  9. ROME (Request Object Management Environment)

    NASA Astrophysics Data System (ADS)

    Kong, M.; Good, J. C.; Berriman, G. B.

    2005-12-01

    Most current astronomical archive services are based on an HTML/CGI architecture where users submit HTML forms via a browser and CGI programs operating under a web server process the requests. Most services return an HTML result page with URL links to the result files or, for longer jobs, return a message indicating that email will be sent when the job is done. This paradigm has a few serious shortcomings. First, it is all too common for something to go wrong and for the user to never hear about the job again. Second, for long and complicated jobs there is often important intermediate information that would allow the user to adjust the processing. Finally, unless some sort of custom queueing mechanism is used, background jobs are started immediately upon receiving the CGI request. When there are many such requests, the server machine can easily be overloaded and either slow to a crawl or crash. The Request Object Management Environment (ROME) is a collection of middleware components being developed under the National Virtual Observatory Project to provide mechanisms for managing long jobs such as computationally intensive statistical analysis requests or the generation of large-scale mosaic images. Written as EJB objects within the open-source JBoss application server, ROME receives processing requests via a servlet interface, stores them in a DBMS using JDBC, distributes the processing (via queuing mechanisms) across multiple machines and environments (including Grid resources), manages real-time messages from the processing modules, and ensures proper user notification. The request processing modules are identical in structure to standard CGI programs -- though they can optionally implement status messaging -- and can be written in any language. ROME will persist these jobs across failures of processing modules, network outages, and even downtime of ROME and the DBMS, restarting them as necessary.
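
    The key idea ROME adds over plain CGI -- requests stored durably and dispatched from a queue instead of being started immediately -- can be sketched in a few lines (SQLite stands in for the DBMS here, and the table and column names are invented):

      import sqlite3

      db = sqlite3.connect("requests.db")
      db.execute("""CREATE TABLE IF NOT EXISTS jobs (
                        id INTEGER PRIMARY KEY, request TEXT, state TEXT)""")

      def submit(request_xml):
          # Persist the request first; it survives server restarts and outages.
          db.execute("INSERT INTO jobs (request, state) VALUES (?, 'queued')", (request_xml,))
          db.commit()

      def next_job():
          # A worker picks up queued jobs, instead of every request spawning a process at once.
          row = db.execute("SELECT id, request FROM jobs WHERE state = 'queued' LIMIT 1").fetchone()
          if row:
              db.execute("UPDATE jobs SET state = 'running' WHERE id = ?", (row[0],))
              db.commit()
          return row

      submit("<mosaic region='M31'/>")
      print(next_job())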

  10. Rapid application design of an electronic clinical skills portfolio for undergraduate medical students.

    PubMed

    Dornan, Tim; Lee, Catherine; Stopford, Adam; Hosie, Liam; Maredia, Neil; Rector, Alan

    2005-04-01

    The aim was to find how to use information and communication technology to present the clinical skills content of an undergraduate medical curriculum. Rapid application design was used to develop the product, and technical action research was used to evaluate the development process. A clinician-educator, two medical students, two computing science master's students, two other project workers, and a hospital education informatics lead formed a design team. A sample of stakeholders took part in requirements planning workshops and continued to advise the team throughout the project. A university hospital had many features that favoured fast, inexpensive, and successful system development: a clearly defined and readily accessible user group; location of the development process close to end-users; fast, informal communication; leadership by highly motivated and senior end-users; devolved authority and lack of any rigidly imposed management structure; cooperation of clinicians because the project drew on their clinical expertise to achieve scholastic goals; a culture of learning and involvement of highly motivated students. A detailed specification was developed through storyboarding, use case diagramming, and evolutionary prototyping. A very usable working product was developed within weeks. "SkillsBase" is a database web application using Microsoft Active Server Pages, served from a Microsoft Windows 2000 Server operating system running Internet Information Server 5.0. Graphing functionality is provided by the KavaChart applet. It presents the skills curriculum, provides a password-protected portfolio function, and offers training materials. The curriculum can be presented in several different ways to help students reflect on their objectives and progress towards achieving them. The reflective portfolio function is entirely private to each student user and allows them to document their progress in attaining skills, as judged by self, peer and tutor assessment, and examinations. Training materials include web links and materials developed locally using pedagogic principles developed by the SkillsBase team. Although the usability of SkillsBase has been proven, uptake of software that has arisen 'bottom-up' from within the curriculum has proved slow. We plan to incorporate the SkillsBase services into a more comprehensive virtual managed learning environment, anticipating that presenting the functionality in an environment that is routinely used by students and teachers will increase uptake and use.

  11. An expert system for headache diagnosis: the Computerized Headache Assessment tool (CHAT).

    PubMed

    Maizels, Morris; Wolfe, William J

    2008-01-01

    Migraine is a highly prevalent chronic disorder associated with significant morbidity. Chronic daily headache syndromes, while less common, are less likely to be recognized, and impair quality of life to an even greater extent than episodic migraine. A variety of screening and diagnostic tools for migraine have been proposed and studied. Few investigators have developed and evaluated computerized programs to diagnose headache. The objective was to develop and determine the accuracy and utility of a computerized headache assessment tool (CHAT). CHAT was designed to identify all of the major primary headache disorders, distinguish daily from episodic types, and recognize medication overuse. CHAT was developed using an expert systems approach to headache diagnosis, with initial branch points determined by headache frequency and duration. Appropriate clinical criteria are presented relevant to brief and longer-lasting headaches. CHAT was posted on a web site using Microsoft Active Server Pages and a SQL Server database server. A convenience sample of patients who presented to the adult urgent care department with headache, and patients in a family practice waiting room, were solicited to participate. Those who completed the on-line questionnaire were contacted for a diagnostic interview. One hundred thirty-five patients completed CHAT and 117 completed a diagnostic interview. CHAT correctly identified 35/35 (100%) patients with episodic migraine and 42/49 (85.7%) of patients with transformed migraine. CHAT also correctly identified 11/11 patients with chronic tension-type headache, 2/2 with episodic tension-type headache, and 1/1 with episodic cluster headache. Medication overuse was correctly recognized in 43/52 (82.7%). The most common misdiagnoses by CHAT were seen in patients with transformed migraine or new daily persistent headache. Fifty patients were referred to their primary care physician and 62 to the headache clinic. Of 29 patients referred to the PCP with a confirmed diagnosis of migraine, 25 made a follow-up appointment, the PCP diagnosed migraine in 19, and initiated migraine-specific therapy or prophylaxis in 17. The described expert system displays high diagnostic accuracy for migraine and other primary headache disorders, including daily headache syndromes and medication overuse. As part of a disease management program, CHAT led to patients receiving appropriate diagnoses and therapy. Limitations of the system include patient willingness to utilize the program, introducing such a process into the culture of medical care, and the difficult distinction of transformed migraine.
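
    The expert-system structure described above -- branch first on headache frequency and duration, then apply clinical criteria -- can be pictured with a toy sketch; the thresholds and labels below are illustrative only and are not the published CHAT rules or any diagnostic criteria:

      def classify(days_per_month, duration_hours, medication_days_per_month):
          """Toy branching logic for illustration; NOT clinical criteria."""
          daily = days_per_month >= 15                 # daily vs episodic branch (illustrative cutoff)
          overuse = medication_days_per_month >= 10    # flag possible medication overuse (illustrative)
          if daily:
              headache_type = "chronic/daily headache syndrome"
          elif duration_hours >= 4:
              headache_type = "longer-lasting episodic headache (e.g. migraine-type)"
          else:
              headache_type = "brief episodic headache (e.g. tension- or cluster-type)"
          return headache_type, overuse

      print(classify(days_per_month=20, duration_hours=6, medication_days_per_month=12))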

  12. A Wireless Physiological Signal Monitoring System with Integrated Bluetooth and WiFi Technologies.

    PubMed

    Yu, Sung-Nien; Cheng, Jen-Chieh

    2005-01-01

    This paper proposes a wireless patient monitoring system which integrates Bluetooth and WiFi wireless technologies. A wireless portable multi-parameter device was designed to acquire physiological signals and transmit them to a local server via Bluetooth wireless technology. Four kinds of monitor units were designed to communicate via WiFi wireless technology: a local monitor unit, a control center, mobile devices (personal digital assistants; PDAs), and a web page. The use of various monitor units is intended to meet the different medical requirements of different medical personnel. The system was demonstrated to improve mobility and flexibility for both patients and medical personnel, which further improves the quality of health care.

  13. A teledentistry system for the second opinion.

    PubMed

    Gambino, Orazio; Lima, Fausto; Pirrone, Roberto; Ardizzone, Edoardo; Campisi, Giuseppina; di Fede, Olga

    2014-01-01

    In this paper we present a teledentistry system aimed at the second-opinion task. It makes use of a particular camera, called an intra-oral or dental camera, to take photographs and real-time video of the inner part of the mouth. The pictures acquired by the Operator with such a device are sent to the Oral Medicine Expert (OME) by means of a standard File Transfer Protocol (FTP) service, and the real-time video is channeled into a video stream using the VideoLAN client/server (VLC) application. The system is composed of HTML5 web pages generated by PHP and allows the second opinion to be performed both when the Operator and the OME are logged in and when one of them is offline.

  14. Implementation of remote monitoring and managing switches

    NASA Astrophysics Data System (ADS)

    Leng, Junmin; Fu, Guo

    2010-12-01

    In order to strengthen the security of the network and provide greater convenience and efficiency for operators and managers, a system for remote monitoring and management of switches has been designed and implemented using advanced network technology and existing network resources. A fast Internet Protocol camera (FS IP Camera) was selected, which has a 32-bit RISC embedded processor and supports a number of protocols. The Motion-JPEG image compression algorithm is adopted so that high-resolution images can be transmitted over limited network bandwidth. The architecture of the whole monitoring and management system is designed and implemented according to the current network and switch infrastructure, and the control and administration software is designed accordingly. The system uses the Java Server Pages (JSP) dynamic web page development platform. A SQL (Structured Query Language) Server database is used to store and access image information, network messages and user data. The reliability and security of the system are further strengthened by access control. The system software is cross-platform, so that multiple operating systems (UNIX, Linux and Windows) are supported. Applying the system can greatly reduce manpower costs and enables problems to be found and solved quickly.

  15. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back-end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine-grained" data access is performed by back-end services that may utilize JDBC for database access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google Earth layers using KML; generation of maps via WMS or ArcIMS protocols; and data manipulation with Unix utilities.
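
    The first hop in that pipeline -- an XML request embedded in an HTTP "get" URL -- might be generated along the following lines; the parameter name, XML elements and host are placeholders, not the actual LAS request schema:

      import urllib.parse

      # Placeholder XML request; the real LAS request schema is not reproduced here.
      request_xml = (
          "<lasRequest>"
          "<dataset>sst_monthly</dataset>"
          "<operation>plot</operation>"
          "<region lon='120:180' lat='-20:20'/>"
          "</lasRequest>"
      )

      # Embed the XML in a GET URL, as the UI libraries described above would do.
      url = "http://las.example.org/ProductServer.do?" + urllib.parse.urlencode(
          {"xml": request_xml})
      print(url)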

  16. The Pfam protein families database.

    PubMed

    Punta, Marco; Coggill, Penny C; Eberhardt, Ruth Y; Mistry, Jaina; Tate, John; Boursnell, Chris; Pang, Ningze; Forslund, Kristoffer; Ceric, Goran; Clements, Jody; Heger, Andreas; Holm, Liisa; Sonnhammer, Erik L L; Eddy, Sean R; Bateman, Alex; Finn, Robert D

    2012-01-01

    Pfam is a widely used database of protein families, currently containing more than 13,000 manually curated protein families as of release 26.0. Pfam is available via servers in the UK (http://pfam.sanger.ac.uk/), the USA (http://pfam.janelia.org/) and Sweden (http://pfam.sbc.su.se/). Here, we report on changes that have occurred since our 2010 NAR paper (release 24.0). Over the last 2 years, we have generated 1840 new families and increased coverage of the UniProt Knowledgebase (UniProtKB) to nearly 80%. Notably, we have taken the step of opening up the annotation of our families to the Wikipedia community, by linking Pfam families to relevant Wikipedia pages and encouraging the Pfam and Wikipedia communities to improve and expand those pages. We continue to improve the Pfam website and add new visualizations, such as the 'sunburst' representation of taxonomic distribution of families. In this work we additionally address two topics that will be of particular interest to the Pfam community. First, we explain the definition and use of family-specific, manually curated gathering thresholds. Second, we discuss some of the features of domains of unknown function (also known as DUFs), which constitute a rapidly growing class of families within Pfam.

  17. Identification of metal ion binding sites based on amino acid sequences

    PubMed Central

    Cao, Xiaoyong; Zhang, Xiaojin; Gao, Sujuan; Ding, Changjiang; Feng, Yonge; Bao, Weihua

    2017-01-01

    The identification of metal ion binding sites is important for protein function annotation and the design of new drug molecules. This study presents an effective method of analyzing and identifying the binding residues of metal ions based solely on sequence information. Ten metal ions were extracted from the BioLip database: Zn2+, Cu2+, Fe2+, Fe3+, Ca2+, Mg2+, Mn2+, Na+, K+ and Co2+. The analysis showed that Zn2+, Cu2+, Fe2+, Fe3+, and Co2+ were sensitive to the conservation of amino acids at binding sites, and promising results can be achieved using the Position Weight Scoring Matrix algorithm, with an accuracy of over 79.9% and a Matthews correlation coefficient of over 0.6. The binding sites of other metals can also be accurately identified using the Support Vector Machine algorithm with multifeature parameters as input. In addition, we found that Ca2+ was insensitive to hydrophobicity and hydrophilicity information and Mn2+ was insensitive to polarization charge information. An online server was constructed based on the framework of the proposed method and is freely available at http://60.31.198.140:8081/metal/HomePage/HomePage.html. PMID:28854211

  18. Identification of metal ion binding sites based on amino acid sequences.

    PubMed

    Cao, Xiaoyong; Hu, Xiuzhen; Zhang, Xiaojin; Gao, Sujuan; Ding, Changjiang; Feng, Yonge; Bao, Weihua

    2017-01-01

    The identification of metal ion binding sites is important for protein function annotation and the design of new drug molecules. This study presents an effective method of analyzing and identifying the binding residues of metal ions based solely on sequence information. Ten metal ions were extracted from the BioLip database: Zn2+, Cu2+, Fe2+, Fe3+, Ca2+, Mg2+, Mn2+, Na+, K+ and Co2+. The analysis showed that Zn2+, Cu2+, Fe2+, Fe3+, and Co2+ were sensitive to the conservation of amino acids at binding sites, and promising results can be achieved using the Position Weight Scoring Matrix algorithm, with an accuracy of over 79.9% and a Matthews correlation coefficient of over 0.6. The binding sites of other metals can also be accurately identified using the Support Vector Machine algorithm with multifeature parameters as input. In addition, we found that Ca2+ was insensitive to hydrophobicity and hydrophilicity information and Mn2+ was insensitive to polarization charge information. An online server was constructed based on the framework of the proposed method and is freely available at http://60.31.198.140:8081/metal/HomePage/HomePage.html.
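
    To make the Position Weight Scoring Matrix idea used in the two records above concrete, here is a minimal sketch of scoring candidate windows of a sequence against a position weight matrix; the matrix values, residues and window length are invented for illustration:

      # Toy position weight matrix for a 3-residue window: weight[position][residue].
      # Values are invented; a real matrix is estimated from aligned binding-site sequences.
      pwm = [
          {"C": 1.2, "H": 0.8, "D": 0.1},
          {"C": 0.9, "H": 1.1, "E": 0.2},
          {"H": 1.3, "C": 0.7, "D": 0.3},
      ]

      def window_score(window):
          """Sum position-specific weights; unseen residues get a small default weight."""
          return sum(pos.get(res, 0.01) for pos, res in zip(pwm, window))

      sequence = "MKCHHEDC"
      scores = {i: window_score(sequence[i:i + 3]) for i in range(len(sequence) - 2)}
      best = max(scores, key=scores.get)
      print(f"best window starts at {best}: {sequence[best:best + 3]} (score {scores[best]:.2f})")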

  19. Java RMI Software Technology for the Payload Planning System of the International Space Station

    NASA Technical Reports Server (NTRS)

    Bryant, Barrett R.

    1999-01-01

    The Payload Planning System is for experiment planning on the International Space Station. The planning process has a number of different aspects which need to be stored in a database which is then used to generate reports on the planning process in a variety of formats. This process is currently structured as a 3-tier client/server software architecture comprised of a Java applet at the front end, a Java server in the middle, and an Oracle database in the third tier. This system presently uses CGI, the Common Gateway Interface, to communicate between the user-interface and server tiers and Active Data Objects (ADO) to communicate between the server and database tiers. This project investigated other methods and tools for performing the communications between the three tiers of the current system so that both the system performance and software development time could be improved. We specifically found that for the hardware and software platforms that PPS is required to run on, the best solution is to use Java Remote Method Invocation (RMI) for communication between the client and server and SQLJ (Structured Query Language for Java) for server interaction with the database. Prototype implementations showed that RMI combined with SQLJ significantly improved performance and also greatly facilitated construction of the communication software.

  20. Basics. [A Compilation of Learning Activities Pages from Seven Issues of Instructor Magazine, September 1982 through March 1983 and May 1983.

    ERIC Educational Resources Information Center

    Instructor, 1983

    1983-01-01

    This collection of 18 learning activities pages focuses on the subject areas of science, language arts, mathematics, and social studies. The science activities pages concern the study of earthquakes, sound, environmental changes, snails and slugs, and friction. Many of the activities are in the form of experiments for the students to perform.…

  1. Online social networking and US poison control centers: Facebook as a means of information distribution.

    PubMed

    Vo, Kathy; Smollin, Craig

    2015-06-01

    Online social networking services such as Facebook provide a novel medium for the dissemination of public health information by poison control centers in the United States. We performed a cross-sectional study of poison control center Facebook pages to describe and assess the use of this medium. Facebook pages associated with poison control centers were identified during a continuous two-week period from December 24, 2012 to January 7, 2013. Data were extracted from each page, including affiliated poison control center; page duration, measured in years since registration; number of subscribers; number of postings by general toxicological category; and measures of user-generated activity including "likes", "shares", and comments per posting. Among the 56 US poison control centers, 39 Facebook pages were identified, of which 29 were currently active. The total number of active pages has increased by 140% from 2009 to 2013 (average of 25% per year). The total number of all subscribers to active pages was 11,211, ranging from 40 to 2,456 (mean 387, SD 523), equal to 0.006% of all Facebook users in the United States. The number of subscribers per page was associated with page duration, number of postings, and type of postings. The types of toxicological postings were public education (45%), self-promotion (28%), childhood safety (12%), drugs of abuse (8%), environmental poisonings (6%), and general overdoses (1%). Slightly over half of all poison control centers in the United States are supplementing their outreach and education efforts through Facebook. In general, the more active the poison control center on Facebook, the more page followers and follower engagement gained.

  2. Work activity in food service: The significance of customer relations, tipping practices and gender for preventing musculoskeletal disorders.

    PubMed

    Laperrière, Ève; Messing, Karen; Bourbonnais, Renée

    2017-01-01

    Some evidence shows that food servers are exposed to an elevated risk of musculoskeletal disorders and injuries, and that their work activity varies by gender. Interviews of servers and observations of food service in Québec, Canada, were carried out in three restaurants and a questionnaire was administered to 64 workers from 44 other restaurants. The relationship with the customer has specific effects on work activity and transforms the physical, emotional and cognitive work. Strategies intended to speed service or otherwise related to the customer relationship can involve health risks. Women reported more direct food service (p < 0.01), a tendency to do more "housekeeping" tasks (p < 0.07) and fewer hours of work per week (p < 0.01). Women workers reported experiencing more sites of pain (p < 0.003). This exploratory study suggests that managing the server-customer relationship could be important in preventing musculoskeletal disorders in this population and that women are at particular risk. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Risk Assessment of the Naval Postgraduate School Gigabit Network

    DTIC Science & Technology

    2004-09-01

    Management Server (1) • RAS Server (1) • Remedy Server (1) • Samba Server (2) • SQL Servers (3) • Web Servers (3) • WINS Server (1) • Library ... [Fragment of a host inventory table listing individual servers (e.g., INCA, eagle, MC01BDB) with their operating systems (Microsoft Windows 2000 Advanced Server), assigned projects (EWS, Special Projects), database software (SQL 2000) and administrators.]

  4. Use of Deception to Improve Client Honeypot Detection of Drive-by-Download Attacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Popovsky, Barbara; Narvaez Suarez, Julia F.; Seifert, Christian

    2009-07-24

    This paper presents the application of deception theory to improve the success of client honeypots at detecting malicious web page attacks from infected servers programmed by online criminals to launch drive-by-download attacks. The design of honeypots faces three main challenges: deception, how to design honeypots that seem to be real systems; counter-deception, techniques used to identify honeypots and hence defeat their deceiving nature; and counter-counter-deception, how to design honeypots that deceive attackers. The authors propose the application of a deception model known as the deception planning loop to identify the current status of honeypot research, development and deployment. The analysis leads to a proposal to formulate a landscape of honeypot research and to plan the steps ahead.

  5. Dynamic alarm response procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, J.; Gordon, P.; Fitch, K.

    2006-07-01

    The Dynamic Alarm Response Procedure (DARP) system provides a robust, Web-based alternative to existing hard-copy alarm response procedures. This paperless system improves performance by eliminating the time wasted looking up paper procedures by number, looking up plant process values and equipment and component status on graphical displays or panels, and maintaining the procedures. Because it is a Web-based system, it is platform independent. DARPs can be served from any Web server that supports CGI scripting, such as Apache, IIS, TclHTTPD, and others. DARP pages can be viewed in any Web browser that supports JavaScript and Scalable Vector Graphics (SVG), such as Netscape, Microsoft Internet Explorer, Mozilla Firefox, Opera, and others. (authors)
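
    Because a DARP is ultimately a dynamic page produced by a CGI script, a minimal server-side sketch could look like the following (the alarm tag, data source and page content are invented; any CGI-capable web server could run an equivalent script):

      #!/usr/bin/env python3
      # Minimal CGI-style response: look up the alarm tag passed in the query string
      # and emit an HTML alarm response page with (invented) live process values.
      import os
      import urllib.parse

      query = urllib.parse.parse_qs(os.environ.get("QUERY_STRING", ""))
      alarm = query.get("alarm", ["UNKNOWN"])[0]

      process_values = {"RCS-PRESS-HI": ("RCS pressure", "15.4 MPa")}   # placeholder data source
      name, value = process_values.get(alarm, ("unknown point", "n/a"))

      print("Content-Type: text/html")
      print()
      print(f"<html><body><h1>Alarm {alarm}</h1>"
            f"<p>{name}: {value}</p>"
            "<p>Follow the response steps listed below.</p></body></html>")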

  6. Web-based encyclopedia on physical effects

    NASA Astrophysics Data System (ADS)

    Papliatseyeu, Andrey; Repich, Maryna; Ilyushonak, Boris; Hurbo, Aliaksandr; Makarava, Katerina; Lutkovski, Vladimir M.

    2004-07-01

    Web-based learning applications open new horizons for educators. In this work we present a computer encyclopedia designed to overcome the drawbacks of traditional paper information sources, such as awkward searching, a low update rate, a limited number of copies and high cost. Moreover, we intended to improve the access and search functions in comparison with some Internet sources in order to make it more convenient. The system is developed using modern Java technologies (Java Servlets, Java Server Pages) and contains systematized information about the most important and well-studied physical effects. It may also be used in other fields of science. The system is accessible via Intranet/Internet networks by means of any up-to-date Internet browser. It may be used for general learning purposes and as a study guide or tutorial for performing laboratory work.

  7. Vacation model for Markov machine repair problem with two heterogeneous unreliable servers and threshold recovery

    NASA Astrophysics Data System (ADS)

    Jain, Madhu; Meena, Rakesh Kumar

    2018-03-01

    A Markov model of a multi-component machining system comprising two unreliable heterogeneous servers and mixed standby support has been studied. The repair of broken-down machines is carried out on the basis of a bi-level threshold policy for the activation of the servers. A server returns to render repair service when a pre-specified workload of failed machines has built up. The first (second) repairman turns on only when a workload of N1 (N2) failed machines has accumulated in the system. Both servers may go on vacation when all the machines are in good condition and there are no pending repair jobs for the repairmen. The Runge-Kutta method is implemented to solve the set of governing equations used to formulate the Markov model. Various system metrics, including the mean queue length, machine availability and throughput, are derived to determine the performance of the machining system. To demonstrate the computational tractability of the present investigation, a numerical illustration is provided. A cost function is also constructed to determine the optimal repair rate of the server by minimizing the expected cost incurred on the system. A hybrid soft-computing method is used to develop an adaptive neuro-fuzzy inference system (ANFIS). The validation of the numerical results obtained by the Runge-Kutta approach is also supported by computational results generated by ANFIS.
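
    The governing equations of such a model are the Chapman-Kolmogorov differential equations dp/dt = p*Q for the state probability vector p and generator matrix Q. A fourth-order Runge-Kutta integration of a small illustrative generator (not the two-server vacation model itself) looks like this:

      # RK4 integration of dp/dt = p * Q for a tiny 3-state Markov chain.
      # The generator Q below is invented for illustration only (rows sum to zero).
      Q = [
          [-0.5,  0.5,  0.0],
          [ 0.3, -0.8,  0.5],
          [ 0.0,  0.6, -0.6],
      ]

      def deriv(p):
          return [sum(p[i] * Q[i][j] for i in range(len(p))) for j in range(len(p))]

      def rk4_step(p, h):
          k1 = deriv(p)
          k2 = deriv([p[i] + 0.5 * h * k1[i] for i in range(len(p))])
          k3 = deriv([p[i] + 0.5 * h * k2[i] for i in range(len(p))])
          k4 = deriv([p[i] + h * k3[i] for i in range(len(p))])
          return [p[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i]) for i in range(len(p))]

      p = [1.0, 0.0, 0.0]       # system starts in state 0
      for _ in range(1000):     # integrate to t = 10 with step h = 0.01
          p = rk4_step(p, 0.01)
      print(p)                  # transient state probabilities at t = 10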

  8. Systematic plan of building Web geographic information system based on ActiveX control

    NASA Astrophysics Data System (ADS)

    Zhang, Xia; Li, Deren; Zhu, Xinyan; Chen, Nengcheng

    2003-03-01

    A systematic plan for building a Web Geographic Information System (WebGIS) using ActiveX technology is proposed in this paper. In the proposed plan, ActiveX control technology is adopted for building the client-side application, and two different schemas are introduced to implement communication between the controls in the user's browser and the middle application server. One is based on the Distributed Component Object Model (DCOM), the other on sockets. In the former schema, the middle service application is developed as a DCOM object that communicates with the ActiveX control through Object Remote Procedure Call (ORPC) and accesses data in the GIS Data Server through Open Database Connectivity (ODBC). In the latter, the middle service application is developed in Java. It communicates with the ActiveX control through a TCP/IP socket and accesses data in the GIS Data Server through Java Database Connectivity (JDBC). The first schema is usually developed using C/C++, and it is difficult to develop and deploy. The second is relatively easy to develop, but its data transfer performance relies on Web bandwidth. A sample application was developed using the latter schema. Its performance proved to be somewhat better than that of some other WebGIS applications.

  9. The NASA Technical Report Server

    NASA Astrophysics Data System (ADS)

    Nelson, M. L.; Gottlich, G. L.; Bianco, D. J.; Paulson, S. S.; Binkley, R. L.; Kellogg, Y. D.; Beaumont, C. J.; Schmunk, R. B.; Kurtz, M. J.; Accomazzi, A.; Syed, O.

    The National Aeronautics and Space Act of 1958 established the National Aeronautics and Space Administration (NASA) and charged it to "provide for the widest practicable and appropriate dissemination of information concerning...its activities and the results thereof". The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems.

  10. Defense in Depth Added to Malicious Activities Simulation Tools (MAST)

    DTIC Science & Technology

    2015-09-01

    ... cipher suites. The TLS Handshake is a combination of three components: handshake, change cipher spec, and alert. (1) The Handshake ("Hello"): The ... TLS Handshake, specifically the "Hello" portion, is designed to negotiate session parameters (cipher suite). The client informs the server of the ... protocols and standards that it supports, and then the server selects the highest common protocols and standards. Specifically, the Client Hello message ...
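
    The negotiation described in that excerpt can be observed directly from a client: the sketch below opens a TLS connection and reports the protocol version and cipher suite the server selected from the client's offer (the host name is a placeholder):

      import socket
      import ssl

      HOST = "example.com"   # placeholder; any TLS-enabled host works

      context = ssl.create_default_context()   # client-side defaults and trust store
      with socket.create_connection((HOST, 443)) as sock:
          with context.wrap_socket(sock, server_hostname=HOST) as tls:
              # The client offered its supported parameters in the Client Hello;
              # these calls report what the server selected from that offer.
              print("negotiated protocol:", tls.version())   # e.g. 'TLSv1.3'
              print("negotiated cipher:  ", tls.cipher())    # (name, protocol, secret bits)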

  11. Network Consumption and Storage Needs when Working in a Full-Time Routine Digital Environment in a Large Nonacademic Training Hospital.

    PubMed

    Nap, Marius

    2016-01-01

    Digital pathology is indisputably connected with high demands on data traffic and storage. As a consequence, control of the logistic process and insight into the management of both traffic and storage are essential. We monitored data traffic from scanners to the server and from the server to workstations, and registered storage needs for diagnostic images and additional projects. The results showed that data traffic inside the hospital network (1 Gbps) never exceeded 80 Mbps for scanner-to-server activity, and activity from the server to the workstation took at most 5 Mbps. Data storage per image increased from 300 MB to an average of 600 MB as a result of camera and software updates, and, due to the increased scanning speed, the scanning time was reduced by almost 8 h/day. Introduction of a storage policy of only 12 months for diagnostic images, with rescanning if needed, resulted in a manageable storage window of 45 TB for a period of 1 year. Using simple registration tools allowed the transition to digital pathology to be managed as a concise package that allows planning and control. Incorporating retrieval of such information from scanning and storage devices will reduce management's fear of losing control when introducing digital pathology into the daily routine. © 2016 S. Karger AG, Basel.

  12. Associations between eating disorder related symptoms and participants' utilization of an individualized Internet-based prevention and early intervention program.

    PubMed

    Kindermann, Sally; Moessner, Markus; Ozer, Fikret; Bauer, Stephanie

    2017-10-01

    Flexible, individualized interventions allow participants to adjust the intensity of support to their current needs. Between persons, participants with greater needs can receive more intense support; within persons, participants can adjust utilization to their current level of symptoms. The purpose of the present study was to analyze associations between ED-related symptoms and utilization of the individualized program ProYouth both between and within persons, in order to investigate whether participants adapt utilization intensity to their current needs. Generalized estimating equations (GEEs) were used to analyze log data on program utilization (monthly page visits, monthly use of chats and forum) assessed via server logs, together with self-reported data on ED-related symptoms, from N = 394 ProYouth participants who provided longitudinal data for at least two months. Between persons, page visits per month were significantly associated with compensatory behavior, body dissatisfaction, and binge eating. Monthly use of the more intense modules with personal support (chat and forum) was associated with the frequency of compensatory behavior. Within persons, unbalanced nutrition and dieting showed the strongest associations with monthly page visits. Monthly use of chats and forum was significantly associated with compensatory behavior and with unbalanced nutrition and dieting. The results indicate that program utilization is associated with ED-related symptoms both between and within persons. The individualized, flexible approach of ProYouth thus seems to be a promising way for Internet-based provision of combined prevention and early intervention programs addressing ED. © 2017 Wiley Periodicals, Inc.
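
    Generalized estimating equations of the kind used in this study can be fitted with standard statistical libraries; the sketch below (statsmodels, with invented toy data) shows the general pattern of regressing monthly page visits on a symptom score with repeated measures grouped by participant -- it does not reproduce the study's actual variables or results:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Toy longitudinal data: monthly page visits and a symptom score per participant.
      rng = np.random.default_rng(0)
      data = pd.DataFrame({
          "participant": np.repeat(np.arange(50), 4),   # 50 participants, 4 months each
          "symptom_score": rng.normal(size=200),
      })
      data["page_visits"] = rng.poisson(np.exp(0.5 + 0.3 * data["symptom_score"]))

      model = sm.GEE.from_formula(
          "page_visits ~ symptom_score",
          groups="participant",
          data=data,
          family=sm.families.Poisson(),
          cov_struct=sm.cov_struct.Exchangeable(),
      )
      print(model.fit().summary())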

  13. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography.

    PubMed

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation in this paper shows that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architectures based on elliptic curve cryptography using smartcards. We prove that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.
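
    Elliptic-curve key agreement, one building block behind such protocols, can be illustrated with a short sketch (using the third-party "cryptography" package; this is generic ECDH plus key derivation, not the authors' full authentication protocol):

      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import ec
      from cryptography.hazmat.primitives.kdf.hkdf import HKDF

      # Each side generates an ephemeral EC key pair on the same curve.
      server_key = ec.generate_private_key(ec.SECP256R1())
      client_key = ec.generate_private_key(ec.SECP256R1())

      # Exchanging public keys and combining them yields the same shared secret on both sides.
      client_shared = client_key.exchange(ec.ECDH(), server_key.public_key())
      server_shared = server_key.exchange(ec.ECDH(), client_key.public_key())
      assert client_shared == server_shared

      # Derive a session key from the shared secret (a common post-processing step).
      session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                         info=b"multi-server session").derive(client_shared)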

  14. Characteristics of emergency pages using a computer-based anesthesiology paging system in children and adults undergoing procedures at a tertiary care medical center.

    PubMed

    Weingarten, Toby N; Abenstein, John P; Dutton, Claire H; Kohn, Melinda A; Lee, Elizabeth A; Mullenbach, Tami E; Narr, Bradly J; Schroeder, Darrell R; Sprung, Juraj

    2013-04-01

    In our large academic supervisory practice, attending anesthesiologists concomitantly care for multiple patients. To manage communications within the procedural environment, we use a proprietary electronic computer-based anesthesiology visual paging system. This system can send an emergency page that instantly alerts the attending anesthesiologist and other available personnel that immediate help is needed. We analyzed the characteristics of intraoperative emergency pages in children and adults. We identified all emergency page activations between January 1, 2005 and July 31, 2010 in our main operating rooms. Electronic medical records were reviewed for rates and characteristics of pages such as primary etiology, performed interventions, and outcomes. During the study period, 258,135 anesthetics were performed (n = 32,103 children, younger than 18 years) and 370 emergency pages (n = 309 adults, n = 61 children) were recorded (1.4 per 1000 cases; 95% confidence interval, 1.3-1.6). Infants had the highest rates (9.4 per 1000; 95% confidence interval, 5.7-14.4) of emergency page activations (P < 0.001 compared with each other age group). In adults, the most frequent causes were hemodynamic (55%), and in children respiratory and airway (60.7%) events. Emergency pages were rare in patients older than 2 years. Infants were more likely than children 1 to 2 years of age to have emergency page activation, despite both groups being cared for by pediatric fellowship trained anesthesiologists.

  15. The Gypsy Database (GyDB) of mobile genetic elements: release 2.0

    PubMed Central

    Llorens, Carlos; Futami, Ricardo; Covelli, Laura; Domínguez-Escribá, Laura; Viu, Jose M.; Tamarit, Daniel; Aguilar-Rodríguez, Jose; Vicente-Ripolles, Miguel; Fuster, Gonzalo; Bernet, Guillermo P.; Maumus, Florian; Munoz-Pomer, Alfonso; Sempere, Jose M.; Latorre, Amparo; Moya, Andres

    2011-01-01

    This article introduces the second release of the Gypsy Database of Mobile Genetic Elements (GyDB 2.0): a research project devoted to the evolutionary dynamics of viruses and transposable elements based on their phylogenetic classification (per lineage and protein domain). The Gypsy Database (GyDB) is a long-term project that is continuously progressing and that, owing to the high molecular diversity of mobile elements, needs to be completed in several stages. GyDB 2.0 has been powered with a wiki to allow other researchers to participate in the project. The current database stage and scope are long terminal repeat (LTR) retroelements and relatives. GyDB 2.0 is an update based on the analysis of Ty3/Gypsy, Retroviridae, Ty1/Copia and Bel/Pao LTR retroelements and the Caulimoviridae pararetroviruses of plants. Among other features relating to the aforementioned topics, this update adds: (i) a variety of descriptions and reviews distributed in multiple web pages; (ii) protein-based phylogenies, where phylogenetic levels are assigned to distinct classified elements; (iii) a collection of multiple alignments, lineage-specific hidden Markov models and consensus sequences, called the GyDB collection; (iv) updated RefSeq databases and BLAST and HMM servers to facilitate sequence characterization of new LTR retroelement and caulimovirus queries; and (v) a bibliographic server. GyDB 2.0 is available at http://gydb.org. PMID:21036865

  16. The Gypsy Database (GyDB) of mobile genetic elements: release 2.0.

    PubMed

    Llorens, Carlos; Futami, Ricardo; Covelli, Laura; Domínguez-Escribá, Laura; Viu, Jose M; Tamarit, Daniel; Aguilar-Rodríguez, Jose; Vicente-Ripolles, Miguel; Fuster, Gonzalo; Bernet, Guillermo P; Maumus, Florian; Munoz-Pomer, Alfonso; Sempere, Jose M; Latorre, Amparo; Moya, Andres

    2011-01-01

    This article introduces the second release of the Gypsy Database of Mobile Genetic Elements (GyDB 2.0): a research project devoted to the evolutionary dynamics of viruses and transposable elements based on their phylogenetic classification (per lineage and protein domain). The Gypsy Database (GyDB) is a long-term project that is continuously progressing and that, owing to the high molecular diversity of mobile elements, needs to be completed in several stages. GyDB 2.0 has been powered with a wiki to allow other researchers to participate in the project. The current database stage and scope are long terminal repeat (LTR) retroelements and relatives. GyDB 2.0 is an update based on the analysis of Ty3/Gypsy, Retroviridae, Ty1/Copia and Bel/Pao LTR retroelements and the Caulimoviridae pararetroviruses of plants. Among other features relating to the aforementioned topics, this update adds: (i) a variety of descriptions and reviews distributed in multiple web pages; (ii) protein-based phylogenies, where phylogenetic levels are assigned to distinct classified elements; (iii) a collection of multiple alignments, lineage-specific hidden Markov models and consensus sequences, called the GyDB collection; (iv) updated RefSeq databases and BLAST and HMM servers to facilitate sequence characterization of new LTR retroelement and caulimovirus queries; and (v) a bibliographic server. GyDB 2.0 is available at http://gydb.org.

  17. Database Reports Over the Internet

    NASA Technical Reports Server (NTRS)

    Smith, Dean Lance

    2002-01-01

    Most of the summer was spent developing software that would permit existing test report forms to be printed over the web on a printer that is supported by Adobe Acrobat Reader. The data are stored in a DBMS (Database Management System). The client asks for the information from the database using an HTML (Hypertext Markup Language) form in a web browser. JavaScript is used with the forms to assist the user and verify the integrity of the entered data. Queries to a database are made in SQL (Structured Query Language), a widely supported standard for making queries to databases. Java servlets, programs written in the Java programming language running under the control of network server software, interrogate the database and complete a PDF form template kept in a file. The completed report is sent to the browser requesting the report. Some errors are sent to the browser in an HTML web page; others are reported to the server. Access to the databases was restricted since the data are being transported to new DBMS software that will run on new hardware. However, the SQL queries were made to Microsoft Access, a DBMS that is available on most PCs (personal computers). Access does support the SQL commands that were used, and a database was created with Access that contained typical data for the report forms. Some of the problems and features are discussed below.
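
    The pipeline above (browser form, servlet, SQL query, populated PDF template) was written with Java servlets; the hedged Python sketch below mimics only the query step, a parameterized SQL statement against a small in-memory test database, with the table, columns, and sample rows invented for illustration.

        # Hedged sketch: the form-to-query step of the report pipeline, in Python
        # rather than the Java servlets the summer project used. The table name,
        # columns, and sample data are illustrative assumptions.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE test_report (report_id TEXT, item TEXT, result REAL)")
        conn.executemany("INSERT INTO test_report VALUES (?, ?, ?)",
                         [("TR-001", "vibration", 3.2), ("TR-001", "thermal", 41.7)])

        def fetch_report(report_id):
            """Run the SQL query a browser form submission would trigger."""
            rows = conn.execute(
                "SELECT item, result FROM test_report WHERE report_id = ?",
                (report_id,)).fetchall()
            return rows  # a servlet would merge these rows into a PDF form template

        print(fetch_report("TR-001"))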

  18. LoColms: an innovative approach of enhancing traditional classroom form of education by promoting web-based distance learning in the poorer countries.

    PubMed

    Ngarambe, Donart; Pan, Yun-he; Chen, De-ren

    2003-01-01

    There have been numerous attempts recently to promote technology-based education (Shrestha, 1997) in the poorer third world countries, but so far none has provided a sustainable solution: they are either centered and controlled from abroad, relying solely on foreign donors for their sustenance, or they are not web-based, which makes distribution problematic, and some are not affordable for most of the local population in these places. In this paper we discuss an application we are developing, the Local College Learning Management System (LoColms), that is both sustainable and economical enough to suit the situation in these countries. The application is a web-based system, and aims at improving the traditional form of education by empowering the local universities. Its economy comes from the fact that it is supported by traditional communication technology, the public switched telephone network (PSTN), which eliminates the need for the packet-switched or dedicated private virtual networks (PVN) usually required in similar situations. At a later stage, we shall incorporate ontology and paging tools to improve resource sharing and storage optimization in the Proxy Caches (ProCa) and LoColms servers. The system is based on the client/server paradigm and its infrastructure consists of the PSTN and ProCa, with the learning centers accessing the universities by means of the point-to-point protocol (PPP).

  19. Userscripts for the life sciences.

    PubMed

    Willighagen, Egon L; O'Boyle, Noel M; Gopalakrishnan, Harini; Jiao, Dazhi; Guha, Rajarshi; Steinbeck, Christoph; Wild, David J

    2007-12-21

    The web has seen an explosion of chemistry and biology related resources in the last 15 years: thousands of scientific journals, databases, wikis, blogs and resources are available with a wide variety of types of information. There is a huge need to aggregate and organise this information. However, the sheer number of resources makes it unrealistic to link them all in a centralised manner. Instead, search engines to find information in those resources flourish, and formal languages like Resource Description Framework and Web Ontology Language are increasingly used to allow linking of resources. A recent development is the use of userscripts to change the appearance of web pages, by on-the-fly modification of the web content. This opens possibilities to aggregate information and computational results from different web resources into the web page of one of those resources. Several userscripts are presented that enrich biology and chemistry related web resources by incorporating or linking to other computational or data sources on the web. The scripts make use of Greasemonkey-like plugins for web browsers and are written in JavaScript. Information from third-party resources is extracted using open Application Programming Interfaces, while common Universal Resource Locator schemes are used to make deep links to related information in that external resource. The userscripts presented here use a variety of techniques and resources, and show the potential of such scripts. This paper discusses a number of userscripts that aggregate information from two or more web resources. Examples are shown that enrich web pages with information from other resources, and show how information from web pages can be used to link to, search, and process information in other resources. Due to the nature of userscripts, scientists are able to select those scripts they find useful on a daily basis, as the scripts run directly in their own web browser rather than on the web server. This flexibility allows the scientists to tune the features of web resources to optimise their productivity.

  20. Web-based pathology practice examination usage.

    PubMed

    Klatt, Edward C

    2014-01-01

    General and subject-specific practice examinations for students in health sciences studying pathology were placed onto a free public internet web site entitled WebPath and were accessed four clicks from the home web site menu. Multiple choice questions were coded into .html files with JavaScript functions for web browser viewing in a timed format. A Perl programming language script with common gateway interface for web page forms scored examinations and placed results into a log file on an internet computer server. The four general review examinations of 30 questions each could be completed in up to 30 min. The 17 subject-specific examinations of 10 questions each with accompanying images could be completed in up to 15 min each. The results of scores and user educational field of study from log files were compiled from June 2006 to January 2014. The four general review examinations had 31,639 accesses with completion of all questions, for a completion rate of 54% and an average score of 75%. A score of 100% was achieved by 7% of users, ≥90% by 21%, and ≥50% by 95% of users. In top-to-bottom web page menu order, review examination usage was 44%, 24%, 17%, and 15% of all accessions. The 17 subject-specific examinations had 103,028 completions, with a completion rate of 73% and an average score of 74%. A score of 100% was achieved by 20% of users overall, ≥90% by 37%, and ≥50% by 90% of users. The first three menu items on the web page accounted for 12.6%, 10.0%, and 8.2% of all completions, and the bottom three accounted for no more than 2.2% each. Completion rates were higher for the shorter 10-question subject examinations. Users identifying themselves as MD/DO scored higher than other users, averaging 75%. Usage was higher for examinations at the top of the web page menu. Scores achieved suggest that a cohort of serious users fully completing the examinations had sufficient preparation to use them to support their pathology education.
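
    The usage figures above were compiled from log files written by the Perl/CGI scorer; the hedged Python sketch below shows one way such a log could be summarized. The comma-separated log layout is an assumption, not the actual WebPath log format.

        # Hedged sketch: summarizing completion counts and average scores from an
        # examination log. The comma-separated log format is an assumed layout,
        # not the actual WebPath server log.
        from statistics import mean

        log_lines = [
            "2013-11-02,general-1,MD/DO,86",
            "2013-11-02,general-1,nursing,71",
            "2013-11-03,subject-renal,MD/DO,90",
        ]

        scores_by_exam = {}
        for line in log_lines:
            _, exam, field, score = line.strip().split(",")
            scores_by_exam.setdefault(exam, []).append(int(score))

        for exam, scores in scores_by_exam.items():
            print(f"{exam}: {len(scores)} completions, average {mean(scores):.0f}%")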

  1. Userscripts for the Life Sciences

    PubMed Central

    Willighagen, Egon L; O'Boyle, Noel M; Gopalakrishnan, Harini; Jiao, Dazhi; Guha, Rajarshi; Steinbeck, Christoph; Wild, David J

    2007-01-01

    Background The web has seen an explosion of chemistry and biology related resources in the last 15 years: thousands of scientific journals, databases, wikis, blogs and resources are available with a wide variety of types of information. There is a huge need to aggregate and organise this information. However, the sheer number of resources makes it unrealistic to link them all in a centralised manner. Instead, search engines to find information in those resources flourish, and formal languages like Resource Description Framework and Web Ontology Language are increasingly used to allow linking of resources. A recent development is the use of userscripts to change the appearance of web pages, by on-the-fly modification of the web content. This opens possibilities to aggregate information and computational results from different web resources into the web page of one of those resources. Results Several userscripts are presented that enrich biology and chemistry related web resources by incorporating or linking to other computational or data sources on the web. The scripts make use of Greasemonkey-like plugins for web browsers and are written in JavaScript. Information from third-party resources is extracted using open Application Programming Interfaces, while common Universal Resource Locator schemes are used to make deep links to related information in that external resource. The userscripts presented here use a variety of techniques and resources, and show the potential of such scripts. Conclusion This paper discusses a number of userscripts that aggregate information from two or more web resources. Examples are shown that enrich web pages with information from other resources, and show how information from web pages can be used to link to, search, and process information in other resources. Due to the nature of userscripts, scientists are able to select those scripts they find useful on a daily basis, as the scripts run directly in their own web browser rather than on the web server. This flexibility allows the scientists to tune the features of web resources to optimise their productivity. PMID:18154664

  2. MO/DSD online information server and global information repository access

    NASA Technical Reports Server (NTRS)

    Nguyen, Diem; Ghaffarian, Kam; Hogie, Keith; Mackey, William

    1994-01-01

    Often in the past, standards and new technology information have been available only in hardcopy form, with reproduction and mailing costs proving rather significant. In light of NASA's current budget constraints and in the interest of efficient communications, the Mission Operations and Data Systems Directorate (MO&DSD) New Technology and Data Standards Office recognizes the need for an online information server (OLIS). This server would allow: (1) dissemination of standards and new technology information throughout the Directorate more quickly and economically; (2) online browsing and retrieval of documents that have been published for and by MO&DSD; and (3) searching for current and past study activities on related topics within NASA before issuing a task. This paper explores a variety of available information servers and searching tools, their current capabilities and limitations, and the application of these tools to MO&DSD. Most importantly, the discussion focuses on the way this concept could be easily applied toward improving dissemination of standards and new technologies and improving documentation processes.

  3. Network information security in a phase III Integrated Academic Information Management System (IAIMS).

    PubMed

    Shea, S; Sengupta, S; Crosswell, A; Clayton, P D

    1992-01-01

    The developing Integrated Academic Information Management System (IAIMS) at Columbia-Presbyterian Medical Center provides data sharing links between two separate corporate entities, namely Columbia University Medical School and The Presbyterian Hospital, using a network-based architecture. Multiple database servers with heterogeneous user authentication protocols are linked to this network. "One-stop information shopping" implies one log-on procedure per session, not separate log-on and log-off procedures for each server or application used during a session. These circumstances present challenges, at both the policy and technical levels, to data security at the network level and to ensuring smooth information access for end users of these network-based services. Five activities being conducted as part of our security project are described: (1) policy development; (2) an authentication server for the network; (3) Kerberos as a tool for providing mutual authentication, encryption, and time stamping of authentication messages; (4) a prototype interface using Kerberos services to authenticate users accessing a network database server; and (5) a Kerberized electronic signature.
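
    The single log-on goal above rests on a central authentication server issuing time-stamped, tamper-evident credentials that other servers can verify; the hedged Python sketch below shows a greatly simplified HMAC-signed, expiring token. Kerberos itself uses encrypted tickets and mutual authentication, so this is an illustration of the idea, not the Kerberos wire protocol.

        # Hedged sketch: a time-stamped, HMAC-signed token issued by an
        # authentication server and checked by a database server. A greatly
        # simplified stand-in for Kerberos tickets, not the real protocol.
        import hashlib, hmac, json, time

        SHARED_KEY = b"key shared by auth server and database server"  # assumption

        def issue_token(user, lifetime=300):
            payload = json.dumps({"user": user, "exp": time.time() + lifetime})
            mac = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
            return payload, mac

        def verify_token(payload, mac):
            expected = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
            if not hmac.compare_digest(mac, expected):
                return False                               # tampered or wrong key
            return json.loads(payload)["exp"] > time.time()   # not yet expired

        payload, mac = issue_token("clinician01")
        print(verify_token(payload, mac))   # True within the 5-minute lifetime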

  4. Integrating RFID technique to design mobile handheld inventory management system

    NASA Astrophysics Data System (ADS)

    Huang, Yo-Ping; Yen, Wei; Chen, Shih-Chung

    2008-04-01

    An RFID-based mobile handheld inventory management system is proposed in this paper. Differing from the manual inventory management method, the proposed system works on the personal digital assistant (PDA) with an RFID reader. The system identifies electronic tags on the properties and checks the property information in the back-end database server through a ubiquitous wireless network. The system also provides a set of functions to manage the back-end inventory database and assigns different levels of access privilege according to various user categories. In the back-end database server, to prevent improper or illegal accesses, the server not only stores the inventory database and user privilege information, but also keeps track of the user activities in the server including the login and logout time and location, the records of database accessing, and every modification of the tables. Some experimental results are presented to verify the applicability of the integrated RFID-based mobile handheld inventory management system.
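
    The back-end behaviour described above (tag lookup, privilege check, and an audit trail of every access) can be sketched as follows; the schema, privilege levels, and sample tag ID below are illustrative assumptions rather than details taken from the paper.

        # Hedged sketch: tag lookup with a privilege check and an audit-log insert,
        # loosely following the back-end behaviour described in the abstract.
        # Schema, privilege levels, and sample data are assumptions.
        import sqlite3, time

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE property (tag_id TEXT PRIMARY KEY, name TEXT, location TEXT);
        CREATE TABLE audit_log (ts REAL, user TEXT, action TEXT, tag_id TEXT);
        INSERT INTO property VALUES ('E200-1234', 'Projector', 'Room 301');
        """)

        PRIVILEGES = {"auditor": {"read"}, "manager": {"read", "write"}}

        def lookup(user, role, tag_id):
            if "read" not in PRIVILEGES.get(role, set()):
                raise PermissionError("insufficient privilege")
            row = db.execute("SELECT name, location FROM property WHERE tag_id = ?",
                             (tag_id,)).fetchone()
            db.execute("INSERT INTO audit_log VALUES (?, ?, ?, ?)",
                       (time.time(), user, "lookup", tag_id))
            return row

        print(lookup("alice", "auditor", "E200-1234"))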

  5. 11 Years of Geoscience Outreach through the Windows to the Universe Project: Lessons Learned for On-Line Education in Formal and Informal Settings

    NASA Astrophysics Data System (ADS)

    Johnson, R.; Bergman, J.; Gardiner, L.; Genyuk, J.; Lagrave, M.; Mastie, D.; Metcalfe, T.; Russell, R.

    2005-12-01

    The Windows to the Universe project was initiated in February 1995 with support from NASA to bring the geosciences and integrated interdisciplinary content to the public through a web site (http://www.windows.ucar.edu) designed to be engaging and appealing to the general public in informal settings. Since that time, the project has continued to develop content and associated educational resources, including interactives and games, and standards-based classroom activities and demonstrations. Most resources on the website are provided at three levels of sophistication (beginner, intermediate, and advanced), approximating upper elementary, middle, and high school students. In 1996, we initiated an annual professional development program that includes training opportunities on these materials through workshops offered at local, state, national, and international meetings; we now regularly reach over 800 teachers per year through our professional development efforts. Most recently, in 2004 we initiated an effort to translate the entire website into Spanish and keep the translation up to date as content is added and updated. The website is now composed of over 7000 pages, with a comparable number of images, animations, and interactives. The website now hosts over 9.6 million visits per year (corresponding to nearly 80 million page views) from around the world, including over 2.2 million on the Spanish version of the website; our servers host ~35,000 visits on a typical day during the academic year. User surveys, comments from users, and interactions with educators document that the website is extremely popular with educators and their students, that it successfully reaches students across the age groups indicated above, and that it is used by educators to provide background content information for themselves as well as their students, in addition to their use of our educational activities (the 'Teacher Resources' section of the website hosts ~3,500 visits per day during the academic year). Content on the website is developed by a team of scientist/educators (all with degrees in the Earth or space sciences), and is reviewed for accuracy within the team. If needed, because the topic of a page falls outside the expertise of the development team, additional review is requested from scientific colleagues. Content development is guided by the needs of sponsors as well as requests from our users. Our experience overall demonstrates that websites designed to be engaging and interactive are effective not only for informal education, but also for formal education, and that careful consideration of design and dissemination encourages this crossover use.

  6. Common Ground: An Interactive Visual Exploration and Discovery for Complex Health Data

    DTIC Science & Technology

    2014-04-01

    annotate other ontologies for the visual interface client. Finally, we are actively working on software development of both a backend server and the...the following infrastructure and resources. For the development and management of the ontologies, we installed a framework consisting of a server...that is being developed by Google. Using these 9 technologies, we developed an HTML5 client that runs on Windows, Mac OSX, Linux and mobile systems

  7. Security Proof for Password Authentication in TLS-Verifier-based Three-Party Group Diffie-Hellman

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chevassut, Olivier; Milner, Joseph; Pointcheval, David

    2008-04-21

    The internet has grown greatly in the past decade, by some counts exceeding 47 million active web sites and a total aggregate exceeding 100 million web sites. Common practice on the Internet today is that servers have public keys, but clients are largely authenticated via short passwords. Protecting these passwords by not storing them in the clear on institutions' servers has become a priority. This paper develops password-based ciphersuites for the Transport Layer Security (TLS) protocol that are: (1) resistant to server compromise; (2) provably secure; and (3) believed to be free from patent and licensing restrictions based on an analysis of relevant patents in the area.
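
    A central point above is that servers should store password verifiers rather than cleartext passwords; the hedged sketch below shows a generic salted PBKDF2 verifier. It is not the verifier-based three-party group Diffie-Hellman construction of the paper, only an illustration of avoiding cleartext storage.

        # Hedged sketch: storing and checking a salted password verifier so the
        # server never holds the cleartext password. This is a generic PBKDF2
        # construction, not the verifier-based group Diffie-Hellman of the paper.
        import hashlib, hmac, os

        def make_verifier(password: str) -> tuple[bytes, bytes]:
            salt = os.urandom(16)
            verifier = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
            return salt, verifier            # store these; discard the password

        def check(password: str, salt: bytes, verifier: bytes) -> bool:
            candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
            return hmac.compare_digest(candidate, verifier)

        salt, verifier = make_verifier("correct horse battery staple")
        print(check("correct horse battery staple", salt, verifier))  # True
        print(check("wrong password", salt, verifier))                # False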

  8. CIP Training Manual: Collaborative Information Portal Advance Training Information for Field Test Participants

    NASA Technical Reports Server (NTRS)

    Schreiner, John; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Collaborative Information Portal (CIP) is a web-based information management and retrieval system. Its purpose is to provide users at MER (Mars Exploration Rover) mission operations with easy access to a broad range of mission data and products and contextual information such as the current operations schedule. The CIP web server provides this content in a user-customizable web-portal environment. Since CIP is still under development, only a subset of the full feature set will be available for the EDO field test. The CIP web portal will be accessed through a standard web browser. CIP is intended to be intuitive and simple to use; however, at the training session, users will receive a one- to two-page reference guide, which should aid them in using CIP. Users must provide their own computers for accessing CIP during the field test. These computers should be configured with Java 1.3 and a Java 2 enabled browser. Macintosh computers should be running OS 10.1.3 or later. Classic Mac OS (OS 9) is not supported. For more information please read section 7.3 in the FIASCO Rover Science Operations Test Mission Plan. Several screen shots of the Beta Release of CIP are shown on the following pages.

  9. The Pfam protein families database

    PubMed Central

    Punta, Marco; Coggill, Penny C.; Eberhardt, Ruth Y.; Mistry, Jaina; Tate, John; Boursnell, Chris; Pang, Ningze; Forslund, Kristoffer; Ceric, Goran; Clements, Jody; Heger, Andreas; Holm, Liisa; Sonnhammer, Erik L. L.; Eddy, Sean R.; Bateman, Alex; Finn, Robert D.

    2012-01-01

    Pfam is a widely used database of protein families, currently containing more than 13 000 manually curated protein families as of release 26.0. Pfam is available via servers in the UK (http://pfam.sanger.ac.uk/), the USA (http://pfam.janelia.org/) and Sweden (http://pfam.sbc.su.se/). Here, we report on changes that have occurred since our 2010 NAR paper (release 24.0). Over the last 2 years, we have generated 1840 new families and increased coverage of the UniProt Knowledgebase (UniProtKB) to nearly 80%. Notably, we have taken the step of opening up the annotation of our families to the Wikipedia community, by linking Pfam families to relevant Wikipedia pages and encouraging the Pfam and Wikipedia communities to improve and expand those pages. We continue to improve the Pfam website and add new visualizations, such as the ‘sunburst’ representation of taxonomic distribution of families. In this work we additionally address two topics that will be of particular interest to the Pfam community. First, we explain the definition and use of family-specific, manually curated gathering thresholds. Second, we discuss some of the features of domains of unknown function (also known as DUFs), which constitute a rapidly growing class of families within Pfam. PMID:22127870
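
    The family-specific gathering thresholds mentioned above can be applied directly when annotating sequences against a local copy of Pfam with HMMER; the hedged sketch below wraps the standard hmmscan command from Python. The file paths are assumptions, and HMMER plus a pressed Pfam-A.hmm must already be installed.

        # Hedged sketch: searching a protein sequence against local Pfam HMMs using
        # the family-specific gathering thresholds (--cut_ga). Paths are assumptions;
        # HMMER must be installed and Pfam-A.hmm prepared with hmmpress beforehand.
        import subprocess

        cmd = [
            "hmmscan",
            "--cut_ga",                 # use Pfam's curated gathering thresholds
            "--domtblout", "domains.tbl",
            "Pfam-A.hmm",               # pressed Pfam profile database (assumed path)
            "query.fasta",              # protein sequence(s) to annotate
        ]
        subprocess.run(cmd, check=True)

        # Parse the per-domain table, skipping comment lines.
        with open("domains.tbl") as fh:
            hits = [line.split()[0] for line in fh if not line.startswith("#")]
        print("Pfam families matched:", sorted(set(hits)))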

  10. Comprehensive innovative solution for resident education using the Intranet Journal of Chest Radiology.

    PubMed

    Nishino, Mizuki; Wolfe, Donna; Yam, Chun-Shan; Larson, Michael; Boiselle, Phillip M; Hatabu, Hiroto

    2004-10-01

    Because of the rapid increase in clinical workload in academic radiology departments, time for teaching rotating residents is becoming more and more limited. As a solution to this problem, we introduced the Intranet Journal of Chest Radiology as a comprehensive, innovative tool for assisting resident education. The Intranet Journal of Chest Radiology is constructed using Microsoft FrontPage version 2002 (Microsoft Corp, Redmond, WA) and is hosted on our departmental web server (Beth Israel Deaconess Medical Center, Boston, MA). The home page of the intranet journal provides access to the main features, "Cases of the Month," "Teaching File," "Selected Articles for Residents," "Lecture Series," and "Current Publications." These features provide quick access to selected radiology articles, interesting chest cases, and the lecture series and current publications from the chest section. Our intranet journal has been well utilized in the 6 months since its introduction. It enhances residents' interest and motivation to work on case collections, to search and read articles, and to pursue research. Frequent updating is necessary for the journal to be kept current, relevant, and well utilized. The intranet journal serves as a comprehensive, innovative solution for resident education, providing basic educational resources and opportunities for interactive participation by residents.

  11. Filmless PACS in a multiple facility environment

    NASA Astrophysics Data System (ADS)

    Wilson, Dennis L.; Glicksman, Robert A.; Prior, Fred W.; Siu, Kai-Yeung; Goldburgh, Mitchell M.

    1996-05-01

    A Picture Archiving and Communication System centered on a shared image file server can support a filmless hospital. Systems based on this architecture have proven themselves in over four years of clinical operation. Changes in healthcare delivery are causing radiology groups to support multiple facilities for remote clinic support and consolidation of services. There will be a corresponding need for communicating over a standardized wide area network (WAN). Interactive workflow, a natural extension to the single facility case, requires a means to work effectively and seamlessly across moderate to low speed communication networks. Several schemes for supporting a consortium of medical treatment facilities over a WAN are explored. Both centralized and distributed database approaches are evaluated against several WAN scenarios. Likewise, several architectures for distributing image file servers or buffers over a WAN are explored, along with the caching and distribution strategies that support them. An open system implementation is critical to the success of a wide area system. The role of the Digital Imaging and Communications in Medicine (DICOM) standard in supporting multi- facility and multi-vendor open systems is also addressed. An open system can be achieved by using a DICOM server to provide a view of the system-wide distributed database. The DICOM server interface to a local version of the global database lets a local workstation treat the multiple, distributed data servers as though they were one local server for purposes of examination queries. The query will recover information about the examination that will permit retrieval over the network from the server on which the examination resides. For efficiency reasons, the ability to build cross-facility radiologist worklists and clinician-oriented patient folders is essential. The technologies of the World-Wide-Web can be used to generate worklists and patient folders across facilities. A reliable broadcast protocol may be a convenient way to notify many different users and many image servers about new activities in the network of image servers. In addition to ensuring reliability of message delivery and global serialization of each broadcast message in the network, the broadcast protocol should not introduce significant communication overhead.

  12. SUPAR: Smartphone as a ubiquitous physical activity recognizer for u-healthcare services.

    PubMed

    Fahim, Muhammad; Lee, Sungyoung; Yoon, Yongik

    2014-01-01

    The current generation of smartphones can be seen as one of the most ubiquitous devices for physical activity recognition. In this paper we propose a physical activity recognizer that provides u-healthcare services in a cost-effective manner by utilizing cloud computing infrastructure. Our model comprises the smartphone's embedded triaxial accelerometer, which senses body movements, and a cloud server that stores and processes the sensory data for numerous kinds of services. We compute time- and frequency-domain features over the raw signals and evaluate different machine learning algorithms to identify an accurate activity recognition model for four kinds of physical activities (i.e., walking, running, cycling and hopping). During our experiments we found that the Support Vector Machine (SVM) algorithm outperforms its counterparts for the aforementioned physical activities. Furthermore, we also explain how the smartphone application and the cloud server communicate with each other.
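
    The classification step can be illustrated with scikit-learn; the hedged sketch below trains an SVM on simple per-axis features of synthetic accelerometer windows. The feature set, window length, and data are stand-in assumptions for the paper's time- and frequency-domain features over real triaxial signals.

        # Hedged sketch: SVM classification of accelerometer windows into activities.
        # Synthetic data and a reduced feature set stand in for the paper's
        # time- and frequency-domain features over real triaxial signals.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        ACTIVITIES = ["walking", "running", "cycling", "hopping"]

        def features(window):
            """Per-axis mean and standard deviation of one 3-axis window."""
            return np.concatenate([window.mean(axis=0), window.std(axis=0)])

        X, y = [], []
        for label, scale in zip(ACTIVITIES, [0.3, 1.2, 0.6, 1.8]):
            for _ in range(50):
                window = rng.normal(0.0, scale, size=(128, 3))   # 128 samples, 3 axes
                X.append(features(window))
                y.append(label)

        clf = SVC(kernel="rbf", C=10.0)
        print(cross_val_score(clf, np.array(X), np.array(y), cv=5).mean())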

  13. STRUCTURAL BIOLOGY AND MOLECULAR MEDICINE RESEARCH PROGRAM (LSBMM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eisenberg, David S.

    2008-07-15

    The UCLA-DOE Institute of Genomics and Proteomics is an organized research unit of the University of California, sponsored by the Department of Energy through the mechanism of a Cooperative Agreement. Today the Institute consists of 10 Principal Investigators and 7 Associate Members, developing and applying technologies to promote the biological and environmental missions of the Department of Energy, and 5 Core Technology Centers to sustain this work. The focus is on understanding genomes, pathways and molecular machines in organisms of interest to DOE, with special emphasis on developing enabling technologies. Since it was founded in 1947, the UCLA-DOE Institute has adapted its mission to the research needs of DOE and its progenitor agencies as these research needs have changed. The Institute started as the AEC Laboratory of Nuclear Medicine, directed by Stafford Warren, who later became the founding Dean of the UCLA School of Medicine. In this sense, the entire UCLA medical center grew out of the precursor of our Institute. In 1963, the mission of the Institute was expanded into environmental studies by Director Ray Lunt. I became the third director in 1993, and in close consultation with David Galas and John Wooley of DOE, shifted the mission of the Institute towards genomics and proteomics. Since 1993, the Principal Investigators and Core Technology Centers are entirely new, and the Institute has separated from its former division concerned with PET imaging. The UCLA-DOE Institute shares the space of Boyer Hall with the Molecular Biology Institute, and assumes responsibility for the operation of the main core facilities. Fig. 1 gives the organizational chart of the Institute. Some of the benefits to the public of research carried out at the UCLA-DOE Institute include the following: the development of publicly accessible, web-based databases, including the Database of Protein Interactions and the ProLinks database of genomically inferred protein function linkages; the development of publicly accessible, web-based servers, including the HOTPATCH server, the ProKnow server and the SAVES server, all of which are accessible from the home page of the Institute; and advancing the science of bioenergy in the laboratories of the Principal Investigators of the Institute, including the laboratories of Shimon Weiss, James Liao, James Bowie, Todd Yeates, and Rob Gunsalus.

  14. MapApp: A Java(TM) Applet for Accessing Geographic Databases

    NASA Astrophysics Data System (ADS)

    Haxby, W.; Carbotte, S.; Ryan, W. B.; OHara, S.

    2001-12-01

    MapApp (http://coast.ldeo.columbia.edu/help/MapApp.html) is a prototype Java(TM) applet that is intended to give easy and versatile access to geographic data sets through a web browser. It was developed initially to interface with the RIDGE Multibeam Synthesis. Subsequently, interfaces with other geophysical databases were added. At present, multibeam bathymetry grids, underway geophysics along ship tracks, and the LDEO Borehole Research Group's ODP well logging database are accessible through MapApp. We plan to add an interface with the Ridge Petrology Database in the near future. The central component of MapApp is a world physiographic map. Users may navigate around the map (zoom/pan) without waiting for HTTP requests to a remote server to be processed. A focus request loads image tiles from the server to compose a new map at the current viewing resolution. Areas in which multibeam grids are available may be focused to a pixel resolution of about 200 m. These areas may be identified by toggling a mask. Databases may be accessed through menus, and selected data objects may be loaded into MapApp by selecting items from tables. Once loaded, a bathymetry grid may be contoured or used to create bathymetric profiles; ship tracks and ODP sites may be overlain on the map and their geophysical data plotted in X-Y graphs. The advantage of applets over traditional web pages is that they permit dynamic interaction with data sets, while limiting time consuming interaction with a remote server. Users may customize the graphics display by modifying the scale, or the symbol or line characteristics of rendered data, contour interval, etc. The ease with which users can select areas, view the physiography of areas, and preview data sets and evaluate them for quality and applicability, makes MapApp a valuable tool for education and research.

  15. Cannabis and Kratom online information in Thailand: Facebook trends 2015-2016.

    PubMed

    Thaikla, Kanittha; Pinyopornpanish, Kanokporn; Jiraporncharoen, Wichuda; Angkurawaranon, Chaisiri

    2018-05-09

    Our study aims to evaluate the trends in online information about cannabis and kratom on Facebook in Thailand, where there is current discussion regarding legalizing these drugs. Between April and November 2015, reviewers searched for cannabis and kratom Facebook pages in the Thai language via the common search engines. Content analysis was performed and the contents of each page were categorized by the tone of the post (positive, negative or neutral). Then, a one-year follow-up search was conducted to compare the contents. Twelve Facebook pages each were initially identified for cannabis and for kratom. Follower numbers were higher for cannabis pages. Kratom pages were less active but were open for a longer time. Posts with positive tones and neutral tones were found for both drugs, but none had negative tones. Other drugs were mentioned on the cannabis pages, but they were different from those mentioned on the kratom pages. Issues regarding drug legalization were found on the cannabis pages but not on the kratom pages during the searching period. One year later, the tone of the posts was in the same direction, but the page activity had increased. The information currently available on the sampled Facebook pages was positive towards the use of cannabis and kratom. No information about harm from these drugs was found through our search.

  16. DiRE: identifying distant regulatory elements of co-expressed genes

    PubMed Central

    Gotea, Valer; Ovcharenko, Ivan

    2008-01-01

    Regulation of gene expression in eukaryotic genomes is established through a complex cooperative activity of proximal promoters and distant regulatory elements (REs) such as enhancers, repressors and silencers. We have developed a web server named DiRE, based on the Enhancer Identification (EI) method, for predicting distant regulatory elements in higher eukaryotic genomes, namely for determining their chromosomal location and functional characteristics. The server uses gene co-expression data, comparative genomics and profiles of transcription factor binding sites (TFBSs) to determine TFBS-association signatures that can be used for discriminating specific regulatory functions. DiRE's unique feature is its ability to detect REs outside of proximal promoter regions, as it takes advantage of the full gene locus to conduct the search. DiRE can predict common REs for any set of input genes for which the user has prior knowledge of co-expression, co-function or other biologically meaningful grouping. The server predicts function-specific REs consisting of clusters of specifically-associated TFBSs and it also scores the association of individual transcription factors (TFs) with the biological function shared by the group of input genes. Its integration with the Array2BIO server allows users to start their analysis with raw microarray expression data. The DiRE web server is freely available at http://dire.dcode.org. PMID:18487623

  17. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography

    PubMed Central

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric-based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric-based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. A careful investigation in this paper shows that Lu et al.’s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks, and clock synchronization problems. In addition, this paper proposes an enhanced biometric-based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.’s protocol and existing similar protocols. PMID:27163786

  18. Implementation of a World Wide Web server for the oil and gas industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaylock, R.E.; Martin, F.D.; Emery, R.

    1995-12-31

    The Gas and Oil Technology Exchange and Communication Highway (GO-TECH) provides an electronic information system for the petroleum community for the purpose of exchanging ideas, data, and technology. The personal computer-based system fosters communication and discussion by linking oil and gas producers with resource centers, government agencies, consulting firms, service companies, national laboratories, academic research groups, and universities throughout the world. The oil and gas producers are provided access to the GO-TECH World Wide Web home page via modem links, as well as Internet. The future GO-TECH applications will include the establishment of "virtual corporations" consisting of consortiums of small companies, consultants, and service companies linked by electronic information systems. These virtual corporations will have the resources and expertise previously found only in major corporations.

  19. Implementation of a World Wide Web server for the oil and gas industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaylock, R.E.; Martin, F.D.; Emery, R.

    1996-10-01

    The Gas and Oil Technology Exchange and Communication Highway (GO-TECH) provides an electronic information system for the petroleum community for exchanging ideas, data, and technology. The PC-based system fosters communication and discussion by linking the oil and gas producers with resource centers, government agencies, consulting firms, service companies, national laboratories, academic research groups, and universities throughout the world. The oil and gas producers can access the GO-TECH World Wide Web (WWW) home page through modem links, as well as through the Internet. Future GO-TECH applications will include the establishment of virtual corporations consisting of consortia of small companies, consultants, and service companies linked by electronic information systems. These virtual corporations will have the resources and expertise previously found only in major corporations.

  20. Sequence History Update Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previous, tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable saving of time and effort. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.

  1. High Performance Analytics with the R3-Cache

    NASA Astrophysics Data System (ADS)

    Eavis, Todd; Sayeed, Ruhan

    Contemporary data warehouses now represent some of the world’s largest databases. As these systems grow in size and complexity, however, it becomes increasingly difficult for brute force query processing approaches to meet the performance demands of end users. Certainly, improved indexing and more selective view materialization are helpful in this regard. Nevertheless, with warehouses moving into the multi-terabyte range, it is clear that the minimization of external memory accesses must be a primary performance objective. In this paper, we describe the R3-cache, a natively multi-dimensional caching framework designed specifically to support sophisticated warehouse/OLAP environments. R3-cache is based upon an in-memory version of the R-tree that has been extended to support buffer pages rather than disk blocks. A key strength of the R3-cache is that it is able to utilize multi-dimensional fragments of previous query results so as to significantly minimize the frequency and scale of disk accesses. Moreover, the new caching model directly accommodates the standard relational storage model and provides mechanisms for pro-active updates that exploit the existence of query “hot spots”. The current prototype has been evaluated as a component of the Sidera DBMS, a “shared nothing” parallel OLAP server designed for multi-terabyte analytics. Experimental results demonstrate significant performance improvements relative to simpler alternatives.
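
    The central idea above, reusing multidimensional fragments of earlier query results, can be shown in a few lines; the hedged sketch below caches results keyed by axis-aligned boxes and answers a new range query from any cached box that contains it. It is a toy stand-in for the R-tree-backed R3-cache, not its implementation.

        # Hedged sketch of fragment reuse: cache OLAP range-query results keyed by
        # their multidimensional bounding boxes and answer a new query from any
        # cached box that contains it. A toy stand-in for the R-tree-based R3-cache.
        class FragmentCache:
            def __init__(self):
                self._fragments = []        # list of (box, result) pairs

            @staticmethod
            def _contains(outer, inner):
                return all(o_lo <= i_lo and i_hi <= o_hi
                           for (o_lo, o_hi), (i_lo, i_hi) in zip(outer, inner))

            def lookup(self, box):
                for cached_box, result in self._fragments:
                    if self._contains(cached_box, box):
                        return result       # hit: served without touching disk
                return None                 # miss: caller must go to the warehouse

            def insert(self, box, result):
                self._fragments.append((box, result))

        cache = FragmentCache()
        cache.insert(box=[(2000, 2010), (0, 50)], result="precomputed aggregate")
        print(cache.lookup([(2003, 2005), (10, 20)]))   # contained -> cache hit
        print(cache.lookup([(1995, 1999), (10, 20)]))   # not contained -> None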

  2. Hardware Assisted Stealthy Diversity (CHECKMATE)

    DTIC Science & Technology

    2013-09-01

    applicable across multiple architectures. Figure 29 shows an example of an attack against an interpreted environment with a Java executable. CHECKMATE can... [Figure: a Java executable attack against a Java VM on ARM, PPC, and x86 architectures.] ...a user executes "/usr/bin/wget... [Servers: Server 1 - Administration; Server 2 - Database (MySQL); Server 3 - Web server (Mongoose); Server 4 - File server (SSH); Server 5 - Email server.]

  3. Multimedia data repository for the World Wide Web

    NASA Astrophysics Data System (ADS)

    Chen, Ken; Lu, Dajin; Xu, Duanyi

    1998-08-01

    This paper introduces the design and implementation of a Multimedia Data Repository serving as a multimedia information system, which provides users with a Web-accessible, platform-independent interface to query, browse, and retrieve multimedia data such as images, graphics, audio, and video from a large multimedia data repository. By integrating the multimedia DBMS, in which the textual information and samples of the multimedia data are organized and stored, and the Web server into the Microsoft ActiveX Server Framework, users can access the DBMS and query the information simply by using a Web browser on the client side. The original multimedia data can then be located and transmitted through the Internet from the tertiary storage device, a 400-CD-ROM optical jukebox on the server side, to the client side for further use.

  4. Add Java extensions to your wiki: Java applets can bring dynamic functionality to your wiki pages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scarberry, Randall E.

    Virtually everyone familiar with today’s world wide web has encountered the free online encyclopedia Wikipedia many times. What you may not know is that Wikipedia is driven by an excellent open-source product called MediaWiki, which is available to anyone for free. This has led to a proliferation of wiki sites devoted to just about any topic one can imagine. Users of a wiki can add content -- all that is required of them is that they type their additions into their web browsers using the simple markup language called wikitext. Even better, the developers of wikitext made it extensible. With a little server-side development of your own, you can add your own custom syntax. Users aware of your extensions can then utilize them on their wiki pages with a few simple keystrokes. These extensions can be custom decorations, formatting, web applications, and even instances of the venerable old Java applet. One example of a Java applet extension is the Jmol extension (REF), used to embed a 3-D molecular viewer. This article will walk you through the deployment of a fairly elaborate applet via a MediaWiki extension. By no means exhaustive -- an entire book would be required for that -- it will demonstrate how to give the applet resize handles using a little Javascript and CSS coding and some popular Javascript libraries. It even describes how a user may customize the extension somewhat using a wiki template. Finally, it explains a rudimentary persistence mechanism which allows applets to save data directly to the wiki pages on which they reside.

  5. An eight-year study of online lecture use in a medical gross anatomy and embryology course.

    PubMed

    Nieder, Gary L; Borges, Nicole J

    2012-01-01

    Online lectures have been used in lieu of live lectures in our gross anatomy and embryology course for the past eight years. We examined patterns of online lecture use by our students and related that use to academic entry measures, gender, and examination performance. Detailed access records identified by student were available from server logs. Total views per page of lecture material increased over the first six years, then decreased markedly between years seven and eight, possibly due to the recent availability of alternate forms of lecture audio. Lecture use peaked in midafternoon and again in the evening, although some use was seen at all hours. Usage was highest at midweek and lowest on Fridays, as might be expected. Individual students' use varied widely, from less than one viewing per page to more than three viewings per page. Overall use by male students was greater than that of females, and gender-specific differences in the daily pattern were seen. Lecture use was correlated with the Medical College Admission Test (MCAT) Verbal Reasoning and Physical Sciences scores but not with composite MCAT scores or undergraduate grade point average. Overall use appeared to be driven by scheduled team-based learning (TBL) sessions and major examinations. Specific subsets of lecture material were most often viewed before related TBL sessions and again during review for examinations. A small but significant correlation between lecture use and examination and course performance was seen, specifically in the male student population. These findings, along with earlier observations, suggest that varied use of online lectures is attributable to multiple factors. Copyright © 2012 American Association of Anatomists.
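
    The small but significant correlation mentioned above is an ordinary bivariate correlation between per-student viewing counts and scores; the hedged sketch below shows that computation on invented numbers (the study itself derived viewing counts from server logs identified by student).

        # Hedged sketch: correlating per-student lecture viewings with exam scores.
        # The numbers are invented; the study derived viewing counts from server logs.
        import numpy as np

        views_per_page = np.array([0.6, 1.1, 1.4, 2.0, 2.7, 3.1])   # viewings/page
        exam_scores    = np.array([78,  81,  84,  83,  90,  92])    # percent

        r = np.corrcoef(views_per_page, exam_scores)[0, 1]
        print(f"Pearson r = {r:.2f}")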

  6. Model of load balancing using reliable algorithm with multi-agent system

    NASA Astrophysics Data System (ADS)

    Afriansyah, M. F.; Somantri, M.; Riyadi, M. A.

    2017-04-01

    Massive technology development has gone hand in hand with growth in the number of internet users, which increases network traffic and, with it, the load on the system. Using a reliable algorithm together with mobile agents for distributed load balancing is a viable solution to the load issue in a large-scale system. A mobile agent collects resource information and can migrate according to its given task. We propose a reliable load balancing algorithm using least time first byte (LFB) combined with information from the mobile agents. The methodology consisted of defining the system identification, specification requirements, network topology, and system infrastructure design. The simulation sent 1800 requests over 10 s from users to the servers and collected the data for analysis. The software simulation was based on Apache JMeter, observing the response time and reliability of each server, and the results were compared with an existing method. The simulation results show that the LFB method with mobile agents can balance load efficiently across all backend servers without bottlenecks, with a low risk of server overload, and reliably.
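
    The LFB rule amounts to routing each request to the backend with the smallest recently observed time to first byte; the hedged sketch below measures time to first byte with plain HTTP HEAD requests and picks the fastest server. The backend URLs and the choice to probe on every request are assumptions; the paper combines LFB with information collected by mobile agents.

        # Hedged sketch: least-time-first-byte (LFB) selection. Each backend's time
        # to first byte is measured with a HEAD request and the fastest one wins.
        # Backend URLs are placeholders; a real balancer would reuse measurements
        # reported by mobile agents instead of probing on every request.
        import time
        import urllib.request

        BACKENDS = ["http://backend1.example/", "http://backend2.example/"]

        def time_to_first_byte(url, timeout=2.0):
            req = urllib.request.Request(url, method="HEAD")
            start = time.monotonic()
            try:
                with urllib.request.urlopen(req, timeout=timeout):
                    return time.monotonic() - start
            except OSError:
                return float("inf")         # unreachable servers are never chosen

        def pick_backend():
            return min(BACKENDS, key=time_to_first_byte)

        print("route request to:", pick_backend())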

  7. EuCliD (European Clinical Database): a database comparing different realities.

    PubMed

    Marcelli, D; Kirchgessner, J; Amato, C; Steil, H; Mitteregger, A; Moscardò, V; Carioni, C; Orlandini, G; Gatti, E

    2001-01-01

    Quality and variability of dialysis practice are gaining more and more importance. Fresenius Medical Care (FMC), as a provider of dialysis, has the duty to continuously monitor and guarantee the quality of care delivered to patients treated in its European dialysis units. Accordingly, a new clinical database called EuCliD has been developed. It is a multilingual and fully codified database, using international standard coding tables as far as possible. EuCliD collects and handles sensitive medical patient data, fully assuring confidentiality. The infrastructure: a Domino server is installed in each country connected to EuCliD. All the centres belonging to a country are connected via modem to the country server. All the Domino servers are connected via a wide area network to the headquarters server in Bad Homburg (Germany). Inside each country server, only anonymous data related to that particular country are available. The only place where all the anonymous data are available is the headquarters server. The data collection is strongly supported in each country by "key persons" with solid relationships with their respective national dialysis units. The quality of the data in EuCliD is ensured at different levels. At the end of January 2001, more than 11,000 patients treated in 135 centres located in 7 countries were already included in the system. FMC has put patient care at the centre of its activities for many years and is now able to provide transparency to the community (Authorities, Nephrologists, Patients, ...), thus demonstrating the quality of the service.

  8. Building a Smartphone Seismic Network

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.

    2013-12-01

    We are exploring building a new type of seismic network using smartphones. The accelerometers in smartphones can be used to record earthquakes, the GPS unit can give an accurate location, and the built-in communication unit makes communication easier for this network. In the future, these smartphones may work as a supplementary network to the current traditional networks for scientific research and real-time applications. In order to build this network, we developed an application for Android phones and a server to record acceleration in real time. These records can be sent back to a server in real time and analyzed at the server. We evaluated the performance of the smartphones as seismic recording instruments by comparing them with a high-quality accelerometer on controlled shake tables in a variety of tests, including a noise floor test. Based on the daily human activity data recorded by volunteers and the shake table test data, we also developed an algorithm for the smartphones to distinguish earthquakes from daily human activities. These all form the basis for setting up a prototype smartphone seismic network in the near future.
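
    One standard way to separate earthquake shaking from everyday handling in an acceleration stream is a short-term/long-term average (STA/LTA) trigger; the hedged sketch below implements that classic trigger. The paper's actual detector, developed from volunteers' daily-activity data and shake-table records, may differ.

        # Hedged sketch: a classic STA/LTA trigger on an acceleration stream.
        # The smartphone study's actual detector (trained on daily-activity and
        # shake-table data) may use a different algorithm; this is illustrative.
        import numpy as np

        def sta_lta_trigger(signal, fs, sta_win=1.0, lta_win=20.0, threshold=4.0):
            """Return sample indices where STA/LTA exceeds the trigger threshold."""
            sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
            energy = signal ** 2
            sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
            lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same") + 1e-12
            return np.where(sta / lta > threshold)[0]

        fs = 50                                    # assumed phone sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        accel = 0.02 * np.random.default_rng(1).standard_normal(t.size)
        accel[30 * fs:32 * fs] += 0.5 * np.sin(2 * np.pi * 5 * t[:2 * fs])  # "quake"
        print("triggered samples:", sta_lta_trigger(accel, fs).size)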

  9. Using ant colony optimization on the quadratic assignment problem to achieve low energy cost in geo-distributed data centers

    NASA Astrophysics Data System (ADS)

    Osei, Richard

    There are many problems associated with operating a data center. Some of these problems include data security, system performance, increasing infrastructure complexity, increasing storage utilization, keeping up with data growth, and increasing energy costs. Energy cost differs by location, and at most locations fluctuates over time. The rising cost of energy makes it harder for data centers to function properly and provide a good quality of service. With reduced energy cost, data centers will have longer-lasting servers and equipment, higher availability of resources, better quality of service, a greener environment, and reduced service and software costs for consumers. Some of the ways that data centers have tried to reduce energy costs include dynamically switching servers on and off based on the number of users and predefined conditions, the use of environmental monitoring sensors, and the use of dynamic voltage and frequency scaling (DVFS), which enables processors to run at different combinations of frequency and voltage to reduce energy cost. This thesis presents another method by which energy cost at data centers could be reduced. This method involves the use of Ant Colony Optimization (ACO) on a Quadratic Assignment Problem (QAP) in assigning user requests to servers in geo-distributed data centers. In this work, front portals, which handle users' requests, were used as ants to find cost-effective ways to assign user requests to servers in heterogeneous geo-distributed data centers. The simulation results indicate that the ACO for Optimal Server Activation and Task Placement algorithm reduces energy cost for both small and large numbers of user requests in a geo-distributed data center, and its performance increases as the input data grow. In a simulation with 3 geo-distributed data centers and user resource requests ranging from 25,000 to 25,000,000, the ACO algorithm was able to reduce energy cost by an average of $0.70 per second. The ACO for Optimal Server Activation and Task Placement algorithm has proven to work as an alternative or improvement for reducing energy cost in geo-distributed data centers.
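
    A minimal version of the pheromone-guided assignment described above is sketched below: requests are assigned to data centers with probabilities shaped by pheromone and by energy price, and the cheapest assignment found is reinforced. The prices, problem size, and ACO parameters are invented, and the thesis's QAP formulation is considerably richer.

        # Hedged sketch: ant-colony assignment of requests to geo-distributed data
        # centers, biased by per-center energy price. Prices, sizes, and ACO
        # parameters are invented; a real QAP formulation adds constraints beyond price.
        import random

        PRICES = [0.11, 0.07, 0.09]      # $/unit of work at each data center (assumed)
        N_REQUESTS, ANTS, ITERS, RHO = 30, 10, 50, 0.1
        pheromone = [[1.0] * len(PRICES) for _ in range(N_REQUESTS)]

        def build_assignment():
            plan = []
            for r in range(N_REQUESTS):
                weights = [pheromone[r][c] / PRICES[c] for c in range(len(PRICES))]
                plan.append(random.choices(range(len(PRICES)), weights)[0])
            return plan

        def cost(plan):
            return sum(PRICES[c] for c in plan)

        best_plan, best_cost = None, float("inf")
        for _ in range(ITERS):
            plans = [build_assignment() for _ in range(ANTS)]
            it_best = min(plans, key=cost)
            if cost(it_best) < best_cost:
                best_plan, best_cost = it_best, cost(it_best)
            for r in range(N_REQUESTS):               # evaporate, then reinforce
                for c in range(len(PRICES)):
                    pheromone[r][c] *= (1 - RHO)
                pheromone[r][it_best[r]] += 1.0 / cost(it_best)

        print(f"best energy cost: ${best_cost:.2f}")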

  10. Development of a Real-Time GPS/Seismic Displacement Meter: GPS Component

    NASA Astrophysics Data System (ADS)

    Bock, Y.; Canas, J.; Andrew, A.; Vernon, F.

    2002-12-01

    We report on the status of the Orange County Real-Time GPS Network (OCRTN), an upgrade of the SCIGN sites in Orange County and Catalina Island to low-latency (1 sec), high-rate (1 Hz) data streaming, analysis, and dissemination. The project is a collaborative effort of the California Spatial Reference Center (CSRC) and the Orange County Dept. of Geomatics, with partners from the geophysical community (SCIGN), local and state government, and the private sector. As part of Phase 1 of the project, nine sites are streaming data by dedicated, point-to-point radio modems to a central data server located in Santa Ana. Instantaneous positions are computed for each site. Data are converted from 1 Hz Ashtech binary MBEN format to (1) 1 Hz RTCM format, and (2) decimated (15 sec) RINEX format. A second computer, outside a firewall and located in another building at Orange County's Computer Center, is a TCP-based client of RTCM data (messages 18, 19, 3, and 22) from the data server, as well as a TCP-based server of RTCM data to the outside world. An external computer can access the RTCM data from all active sites through an IP socket connection. Data latency, in the best case, is less than 1 sec from real-time. Once a day, the decimated RINEX data are transferred by ftp from the data server to the SOPAC-CSRC archive at Scripps. Data recovery is typically 99-100%. As part of the second phase of the project, the RTCM server provides data to field receivers to perform RTK surveying. On connection to the RTCM server, the user gets a list of active stations and can then choose from which site to retrieve RTCM data. This site then plays the role of the RTK base station, and a CDPD-based wireless Internet device plays the role of the normal RTK radio link. If an Internet connection is available, we will demonstrate how the system operates. This system will serve as a prototype for the GPS component of the GPS/seismic displacement meter.
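
    As a rough illustration of the dissemination side described above, the sketch below opens an IP socket to an RTCM server and reads the raw byte stream; the host name and port are hypothetical, and real RTCM message parsing is not shown.

```python
import socket

RTCM_HOST = "rtcm.example.org"   # hypothetical server outside the firewall
RTCM_PORT = 2101                 # hypothetical TCP port

def stream_rtcm(host: str, port: int, max_bytes: int = 1_000_000) -> None:
    """Connect to the RTCM server and read the raw byte stream."""
    received = 0
    with socket.create_connection((host, port), timeout=10) as sock:
        while received < max_bytes:
            chunk = sock.recv(4096)
            if not chunk:
                break                      # server closed the connection
            received += len(chunk)
            # A real client would parse RTCM messages (e.g. types 18, 19, 3, 22) here.
            print(f"received {len(chunk)} bytes, {received} total")

if __name__ == "__main__":
    stream_rtcm(RTCM_HOST, RTCM_PORT)
```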

  11. Network characteristics for server selection in online games

    NASA Astrophysics Data System (ADS)

    Claypool, Mark

    2008-01-01

    Online gameplay is impacted by the network characteristics of players connected to the same server. Unfortunately, the network characteristics of online game servers are not well understood, particularly for groups that wish to play together on the same server. As a step towards a remedy, this paper presents analysis of an extensive set of measurements of game servers on the Internet. Over the course of many months, actual Internet game servers were queried simultaneously by twenty-five emulated game clients, with both servers and clients spread out on the Internet. The data provides statistics on the uptime and populations of game servers over a month-long period and an in-depth look at the suitability of game servers for multi-player server selection, concentrating on characteristics critical to playability--latency and fairness. Analysis finds most game servers have latencies suitable for third-person and omnipresent games, such as real-time strategy, sports and role-playing games, providing numerous server choices for game players. However, far fewer game servers have the low latencies required for first-person games, such as shooters or racing games. In all cases, groups that wish to play together have a greatly reduced set of servers from which to choose because of inherent unfairness in server latencies, and server selection is particularly limited as the group size increases. These results hold across different game types and even across different generations of games. The data should be useful for game developers and network researchers who seek to improve game server selection, whether for single or multiple players.
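
    A small sketch, with made-up latency numbers, of the kind of group-oriented selection these measurements support: keep only servers whose worst-case latency for the group is playable and whose latencies are reasonably fair across members.

```python
from statistics import pstdev

# Hypothetical round-trip latencies (ms) from each group member to each candidate server.
latencies = {
    "server_a": [45, 60, 55],
    "server_b": [30, 140, 35],
    "server_c": [80, 85, 90],
}

MAX_LATENCY_MS = 100     # playability threshold (lower for first-person games)
MAX_SPREAD_MS = 25       # fairness threshold: allowed standard deviation across the group

def suitable_servers(lat_map, max_latency, max_spread):
    """Return servers that are playable for every member and fair across the group."""
    chosen = []
    for server, lats in lat_map.items():
        if max(lats) <= max_latency and pstdev(lats) <= max_spread:
            chosen.append((server, max(lats), round(pstdev(lats), 1)))
    return sorted(chosen, key=lambda t: t[1])   # best worst-case latency first

print(suitable_servers(latencies, MAX_LATENCY_MS, MAX_SPREAD_MS))
```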

  12. Effects of High Pressure on Membrane Ion Binding and Transport.

    DTIC Science & Technology

    1980-12-31

    Snippet from the report: diffusion in red cell membranes has apparent activation volumes of 40 ml/mol, in agreement with data on liposomes, along with perturbations in osmotic... Recovered section and figure titles include: "...Extrapolated to the Red Cell?"; "B. Pressure Dependence of Butanol Diffusion"; "C. Development of a High Pressure Stop-Flow"; "Figure 3 -- Pressure effect on the diffusion coefficient of n-butanol in packed human red cells".

  13. CBD: a biomarker database for colorectal cancer.

    PubMed

    Zhang, Xueli; Sun, Xiao-Feng; Cao, Yang; Ye, Benchen; Peng, Qiliang; Liu, Xingyun; Shen, Bairong; Zhang, Hong

    2018-01-01

    Colorectal cancer (CRC) biomarker database (CBD) was established based on 870 identified CRC biomarkers and their relevant information from 1115 original articles in PubMed published from 1986 to 2017. In this version of the CBD, CRC biomarker data were collected, sorted, displayed and analysed. The CBD, with its credible contents, is a powerful and time-saving tool that provides more comprehensive and accurate information for further CRC biomarker research. The CBD was constructed on a MySQL server; HTML, PHP and JavaScript were used to implement the web interface, and Apache was selected as the HTTP server. All of these web operations were implemented under the Windows system. The CBD provides users with information on individual biomarkers, categorized by the biological category, source and application of the biomarkers; the experimental methods, results, authors and publication sources; and the research region, average age of the cohort, gender, race, number of tumours, tumour location and stage. We only collect data from articles with clear and credible results proving that the biomarkers are useful in the diagnosis, treatment or prognosis of CRC. The CBD can also provide a professional platform on which researchers interested in CRC can communicate, exchange their research ideas and design further high-quality research in CRC. They can submit their new findings to our database via the submission page and communicate with us in the CBD. Database URL: http://sysbio.suda.edu.cn/CBD/.
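
    For illustration only, a sketch of how such a biomarker table and a category query might look; the column names and rows are hypothetical, and SQLite stands in here for the MySQL server described in the abstract.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE biomarker (
        name        TEXT,   -- gene or protein symbol
        category    TEXT,   -- biological category (DNA, RNA, protein, ...)
        source      TEXT,   -- tissue, blood, stool, ...
        application TEXT,   -- diagnosis, treatment or prognosis
        pmid        TEXT    -- source article in PubMed
    )
""")
conn.executemany(
    "INSERT INTO biomarker VALUES (?, ?, ?, ?, ?)",
    [
        ("KRAS", "DNA", "tissue", "treatment", "00000001"),   # placeholder rows
        ("CEA", "protein", "blood", "prognosis", "00000002"),
    ],
)

# Example query: all blood-based prognosis biomarkers.
for row in conn.execute(
    "SELECT name, category FROM biomarker WHERE source = ? AND application = ?",
    ("blood", "prognosis"),
):
    print(row)
```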

  14. Sealife: a semantic grid browser for the life sciences applied to the study of infectious diseases.

    PubMed

    Schroeder, Michael; Burger, Albert; Kostkova, Patty; Stevens, Robert; Habermann, Bianca; Dieng-Kuntz, Rose

    2006-01-01

    The objective of Sealife is the conception and realisation of a semantic Grid browser for the life sciences, which will link the existing Web to the currently emerging eScience infrastructure. The SeaLife Browser will allow users to automatically link a host of Web servers and Web/Grid services to the Web content they are visiting. This will be accomplished using eScience's growing number of Web/Grid Services and its XML-based standards and ontologies. The browser will identify terms in the pages being browsed through the background knowledge held in ontologies. Through the use of Semantic Hyperlinks, which link identified ontology terms to servers and services, the SeaLife Browser will offer a new dimension of context-based information integration. In this paper, we give an overview of the different components of the browser and their interplay. The SeaLife Browser will be demonstrated within three application scenarios in evidence-based medicine, literature & patent mining, and molecular biology, all relating to the study of infectious diseases. The three applications vertically integrate the molecule/cell, the tissue/organ and the patient/population level by covering the analysis of high-throughput screening data for endocytosis (the molecular entry pathway into the cell), the expression of proteins in the spatial context of tissue and organs, and a high-level library on infectious diseases designed for clinicians and their patients. For more information see http://www.biote.ctu-dresden.de/sealife.

  15. CBD: a biomarker database for colorectal cancer

    PubMed Central

    Zhang, Xueli; Sun, Xiao-Feng; Ye, Benchen; Peng, Qiliang; Liu, Xingyun; Shen, Bairong; Zhang, Hong

    2018-01-01

    Abstract Colorectal cancer (CRC) biomarker database (CBD) was established based on 870 identified CRC biomarkers and their relevant information from 1115 original articles in PubMed published from 1986 to 2017. In this version of the CBD, CRC biomarker data were collected, sorted, displayed and analysed. The CBD, with its credible contents, is a powerful and time-saving tool that provides more comprehensive and accurate information for further CRC biomarker research. The CBD was constructed on a MySQL server; HTML, PHP and JavaScript were used to implement the web interface, and Apache was selected as the HTTP server. All of these web operations were implemented under the Windows system. The CBD provides users with information on individual biomarkers, categorized by the biological category, source and application of the biomarkers; the experimental methods, results, authors and publication sources; and the research region, average age of the cohort, gender, race, number of tumours, tumour location and stage. We only collect data from articles with clear and credible results proving that the biomarkers are useful in the diagnosis, treatment or prognosis of CRC. The CBD can also provide a professional platform on which researchers interested in CRC can communicate, exchange their research ideas and design further high-quality research in CRC. They can submit their new findings to our database via the submission page and communicate with us in the CBD. Database URL: http://sysbio.suda.edu.cn/CBD/ PMID:29846545

  16. A service protocol for post-processing of medical images on the mobile device

    NASA Astrophysics Data System (ADS)

    He, Longjun; Ming, Xing; Xu, Lang; Liu, Qian

    2014-03-01

    With computing capability and display size growing, the mobile device has been used as a tool to help clinicians view patient information and medical images anywhere and anytime. It is difficult and time-consuming to transfer medical images with large data sizes from a picture archiving and communication system to a mobile client, since the wireless network is unstable and limited in bandwidth. Besides, limited by computing capability, memory and power endurance, the mobile device can hardly provide a satisfactory quality of experience for radiologists performing complex post-processing of medical images, such as real-time, directly interactive three-dimensional visualization. In this work, remote rendering technology is employed to implement the post-processing of medical images instead of local rendering, and a service protocol is developed to standardize the communication between the render server and the mobile client. So that mobile devices with different platforms can access post-processing of medical images, the protocol is described in the Extensible Markup Language and contains four main parts: user authentication, medical image query/retrieval, 2D post-processing (e.g. window leveling, obtaining pixel values) and 3D post-processing (e.g. maximum intensity projection, multi-planar reconstruction, curved planar reformation and direct volume rendering). An instance was then implemented to verify the protocol; it allows the mobile device to access post-processing services on the render server via a client application or a web page.
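
    The element and attribute names below are invented for illustration; the sketch simply shows how a 2D post-processing request (window leveling) in such an XML-described protocol might be composed and parsed on the client side.

```python
import xml.etree.ElementTree as ET

def build_window_level_request(session_token: str, series_uid: str,
                               center: int, width: int) -> bytes:
    """Compose a hypothetical 2D post-processing request (window leveling)."""
    req = ET.Element("Request", attrib={"type": "2DPostProcessing"})
    ET.SubElement(req, "Authentication", attrib={"token": session_token})
    ET.SubElement(req, "Series", attrib={"uid": series_uid})
    op = ET.SubElement(req, "Operation", attrib={"name": "WindowLevel"})
    ET.SubElement(op, "Center").text = str(center)
    ET.SubElement(op, "Width").text = str(width)
    return ET.tostring(req, encoding="utf-8")

payload = build_window_level_request("abc123", "1.2.840.99999.1", center=40, width=400)
print(payload.decode())

# The render server would answer with an XML document wrapping the rendered image;
# the client would parse it the same way with ET.fromstring(response_bytes).
```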

  17. TFmiR: a web server for constructing and analyzing disease-specific transcription factor and miRNA co-regulatory networks.

    PubMed

    Hamed, Mohamed; Spaniol, Christian; Nazarieh, Maryam; Helms, Volkhard

    2015-07-01

    TFmiR is a freely available web server for deep and integrative analysis of combinatorial regulatory interactions between transcription factors, microRNAs and target genes that are involved in disease pathogenesis. Since the inner workings of cells rely on the correct functioning of an enormously complex system of activating and repressing interactions that can be perturbed in many ways, TFmiR helps to better elucidate cellular mechanisms at the molecular level from a network perspective. The provided topological and functional analyses promote TFmiR as a reliable systems biology tool for researchers across the life science communities. TFmiR web server is accessible through the following URL: http://service.bioinformatik.uni-saarland.de/tfmir. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  18. AtlasCBS: a web server to map and explore chemico-biological space

    NASA Astrophysics Data System (ADS)

    Cortés-Cabrera, Álvaro; Morreale, Antonio; Gago, Federico; Abad-Zapatero, Celerino

    2012-09-01

    New approaches are needed that can help decrease the unsustainable failure in small-molecule drug discovery. Ligand Efficiency Indices (LEI) are making a great impact on early-stage compound selection and prioritization. Given a target-ligand database with chemical structures and associated biological affinities/activities for a target, the AtlasCBS server generates two-dimensional, dynamical representations of its contents in terms of LEI. These variables allow an effective decoupling of the chemical (angular) and biological (radial) components. BindingDB, PDBBind and ChEMBL databases are currently implemented. Proprietary datasets can also be uploaded and compared. The utility of this atlas-like representation in the future of drug design is highlighted with some examples. The web server can be accessed at http://ub.cbm.uam.es/atlascbs and https://www.ebi.ac.uk/chembl/atlascbs.

  19. AtlasCBS: a web server to map and explore chemico-biological space.

    PubMed

    Cortés-Cabrera, Alvaro; Morreale, Antonio; Gago, Federico; Abad-Zapatero, Celerino

    2012-09-01

    New approaches are needed that can help decrease the unsustainable failure in small-molecule drug discovery. Ligand Efficiency Indices (LEI) are making a great impact on early-stage compound selection and prioritization. Given a target-ligand database with chemical structures and associated biological affinities/activities for a target, the AtlasCBS server generates two-dimensional, dynamical representations of its contents in terms of LEI. These variables allow an effective decoupling of the chemical (angular) and biological (radial) components. BindingDB, PDBBind and ChEMBL databases are currently implemented. Proprietary datasets can also be uploaded and compared. The utility of this atlas-like representation in the future of drug design is highlighted with some examples. The web server can be accessed at http://ub.cbm.uam.es/atlascbs and https://www.ebi.ac.uk/chembl/atlascbs.

  20. Survey Software Evaluation

    DTIC Science & Technology

    2009-01-01

    Excerpt from the product comparison: databases supported include Oracle 9i/10g, MySQL and MS SQL Server; operating systems supported include Windows 2003 Server and Windows 2000 Server (32 bit); web servers supported include WebStar (Mac OS X), SunOne and Internet Information Services (IIS); database servers supported include MS SQL Server and Oracle 9i/10g. Among the challenges of Web-based surveys are: 1) identifying the best Commercial Off the Shelf (COTS) Web-based survey packages to serve the particular

  1. Measurement of Energy Performances for General-Structured Servers

    NASA Astrophysics Data System (ADS)

    Liu, Ren; Chen, Lili; Li, Pengcheng; Liu, Meng; Chen, Haihong

    2017-11-01

    Energy consumption of servers in data centers is increasing rapidly along with the wide application of the Internet and connected devices. To improve the energy efficiency of servers, voluntary or mandatory energy efficiency programs, including voluntary labelling programs and mandatory energy performance standards, have been adopted or are being prepared in the US, the EU and China. However, the energy performance of servers and the corresponding testing methods are not well defined. This paper presents metrics to measure the energy performance of general-structured servers. The impacts of various server components on energy performance are also analyzed. Based on a set of normalized workloads, a standard method for testing the energy efficiency of servers is proposed. Pilot tests are conducted to assess the energy performance testing methods. The findings of the tests are discussed in the paper.

  2. THttpServer class in ROOT

    NASA Astrophysics Data System (ADS)

    Adamczewski-Musch, Joern; Linev, Sergey

    2015-12-01

    The new THttpServer class in ROOT implements an HTTP server for arbitrary ROOT applications. It is based on the embeddable Civetweb HTTP server and provides direct access to all objects registered with the server. Object data can be provided in different formats: binary, XML, GIF/PNG, and JSON. A generic user interface for THttpServer has been implemented in HTML/JavaScript, based on the JavaScript ROOT development. With any modern web browser one can list, display, and monitor the objects available on the server. THttpServer is used in the Go4 framework to provide an HTTP interface to the online analysis.
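
    A minimal PyROOT sketch, assuming a ROOT build with HTTP support, of exposing a histogram through THttpServer as described above; the port and folder name are arbitrary choices.

```python
import time
import ROOT

# Start the embedded civetweb engine on port 8080 (arbitrary choice).
server = ROOT.THttpServer("http:8080")

# Create and register an object; it becomes browsable at http://localhost:8080/.
hist = ROOT.TH1F("h_example", "Example histogram", 100, -5, 5)
server.Register("/demo", hist)

# Keep filling the histogram so a browser can monitor it live.
while True:
    hist.Fill(ROOT.gRandom.Gaus(0, 1))
    ROOT.gSystem.ProcessEvents()   # let ROOT serve pending HTTP requests
    time.sleep(0.01)
```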

  3. Characteristics and Energy Use of Volume Servers in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuchs, H.; Shehabi, A.; Ganeshalingam, M.

    Servers’ field energy use remains poorly understood, given heterogeneous computing loads, configurable hardware and software, and operation over a wide range of management practices. This paper explores various characteristics of 1- and 2-socket volume servers that affect energy consumption, and quantifies the difference in power demand between higher-performing SPEC and ENERGY STAR servers and our best understanding of a typical server operating today. We first establish general characteristics of the U.S. installed base of volume servers from existing IDC data and the literature, before presenting information on server hardware configurations from data collection events at a major online retail website. We then compare cumulative distribution functions of server idle power across three separate datasets and explain the differences between them via examination of the hardware characteristics to which power draw is most sensitive. We find that idle server power demand is significantly higher than ENERGY STAR benchmarks and the industry-released energy use documented in SPEC, and that SPEC server configurations—and likely the associated power-scaling trends—are atypical of volume servers. Next, we examine recent trends in server power draw among high-performing servers across their full load range to consider how representative these trends are of all volume servers before inputting weighted average idle power load values into a recently published model of national server energy use. Finally, we present results from two surveys of IT managers (n=216) and IT vendors (n=178) that illustrate the prevalence of more-efficient equipment and operational practices in server rooms and closets; these findings highlight opportunities to improve the energy efficiency of the U.S. server stock.

  4. Web-based healthcare hand drawing management system.

    PubMed

    Hsieh, Sheau-Ling; Weng, Yung-Ching; Chen, Chi-Huang; Hsu, Kai-Ping; Lin, Jeng-Wei; Lai, Feipei

    2010-01-01

    The paper addresses the architecture and implementation of a Medical Hand Drawing Management System. In the system, we developed four modules: a hand drawing management module; a patient medical records query module; a hand drawing editing and upload module; and a hand drawing query module. The system adopts Windows-based applications and encompasses web pages through the ASP.NET hosting mechanism on web services platforms. The hand drawings, implemented as files, are stored on an FTP server. The file names, with associated data such as patient identification, drawing physician and access rights, are kept in a database. The modules can be conveniently embedded into and integrated with any system. The system therefore provides hand drawing features that support daily medical operations and effectively improve healthcare quality. Moreover, the system includes printing capability to achieve a complete, computerized medical document process. In summary, the system allows web-based applications to facilitate graphic processes for healthcare operations.

  5. Development of the Subaru-Mitaka-Okayama-Kiso Archive System

    NASA Astrophysics Data System (ADS)

    Baba, Hajime; Yasuda, Naoki; Ichikawa, Shin-Ichi; Yagi, Masafumi; Iwamoto, Nobuyuki; Takata, Tadafumi; Horaguchi, Toshihiro; Taga, Masatoshi; Watanabe, Masaru; Ozawa, Tomohiko; Hamabe, Masaru

    We have developed the Subaru-Mitaka-Okayama-Kiso-Archive (SMOKA) public science archive system, which provides access to the data of the Subaru Telescope, the 188 cm telescope at Okayama Astrophysical Observatory, and the 105 cm Schmidt telescope at Kiso Observatory/University of Tokyo. SMOKA is the successor of the MOKA3 system. The user can browse the Quick-Look Images, Header Information (HDI) and the ASCII Table Extension (ATE) of each frame from the search result table. A request for data can be submitted in a simple manner. The system is developed with Java Servlets for the back-end, and Java Server Pages (JSP) for content display. The advantage of JSPs is the separation of the front-end presentation from the middle- and back-end tiers, which led to efficient development of the system. The SMOKA homepage is available at SMOKA

  6. Extending CATH: increasing coverage of the protein structure universe and linking structure with function

    PubMed Central

    Cuff, Alison L.; Sillitoe, Ian; Lewis, Tony; Clegg, Andrew B.; Rentzsch, Robert; Furnham, Nicholas; Pellegrini-Calace, Marialuisa; Jones, David; Thornton, Janet; Orengo, Christine A.

    2011-01-01

    CATH version 3.3 (class, architecture, topology, homology) contains 128 688 domains, 2386 homologous superfamilies and 1233 fold groups, and reflects a major focus on classifying structural genomics (SG) structures and transmembrane proteins, both of which are likely to add structural novelty to the database and therefore increase the coverage of protein fold space within CATH. For CATH version 3.4 we have significantly improved the presentation of sequence information and associated functional information for CATH superfamilies. The CATH superfamily pages now reflect both the functional and structural diversity within the superfamily and include structural alignments of close and distant relatives within the superfamily, annotated with functional information and details of conserved residues. A significantly more efficient search function for CATH has been established by implementing the search server Solr (http://lucene.apache.org/solr/). The CATH v3.4 webpages have been built using the Catalyst web framework. PMID:21097779

  7. Mobile Assisted Security in Wireless Sensor Networks

    DTIC Science & Technology

    2015-08-03

    ...the content server address is obtained from Google's DNS; the Chromecast and the content server then perform the 3-way TCP handshake, which is followed by the TLS Client Hello and Server Hello messages... All connections utilized TLS v1.2, except those to the NTP servers and Google's DNS server. In TLS v1.2, the client and server exchange the Client Hello and Server Hello messages in order during the handshake. In the Client Hello message, the client offers a list of Cipher Suites that it supports; each Cipher Suite defines the key exchange algorithm

  8. The ISO Data Archive and Interoperability with Other Archives

    NASA Astrophysics Data System (ADS)

    Salama, Alberto; Arviset, Christophe; Hernández, José; Dowson, John; Osuna, Pedro

    ESA's Infrared Space Observatory (ISO), an unprecedented observatory for infrared astronomy launched in November 1995, successfully made nearly 30,000 scientific observations in its 2.5-year mission. The ISO data can be retrieved from the ISO Data Archive, which comprises about 150,000 observations, including parallel and serendipity mode observations. A user-friendly Java interface permits queries to the database and data retrieval. The interface currently offers a wide variety of links to other archives, such as name resolution with NED and SIMBAD, access to electronic articles from ADS and CDS/VizieR, and access to IRAS data. In the past year development has been focused on improving the IDA interoperability with other astronomical archives, either by accessing other relevant archives or by providing direct access to the ISO data for external services. A mechanism of information transfer has been developed, allowing direct query to the IDA via a Java Server Page, returning quick look ISO images and relevant, observation-specific information embedded in an HTML page. This method has been used to link from the CDS/Vizier Data Centre and ADS, and work with IPAC to allow access to the ISO Archive from IRSA, including display capabilities of the observed sky regions onto other mission images, is in progress. Prospects for further links to and from other archives and databases are also addressed.

  9. DB-PABP: a database of polyanion-binding proteins

    PubMed Central

    Fang, Jianwen; Dong, Yinghua; Salamat-Miller, Nazila; Russell Middaugh, C.

    2008-01-01

    The interactions between polyanions (PAs) and polyanion-binding proteins (PABPs) have been found to play significant roles in many essential biological processes including intracellular organization, transport and protein folding. Furthermore, many neurodegenerative disease-related proteins are PABPs. Thus, a better understanding of PA/PABP interactions may not only enhance our understanding of biological systems but also provide new clues to these deadly diseases. The literature in this field is widely scattered, suggesting the need for a comprehensive and searchable database of PABPs. The DB-PABP is a comprehensive, manually curated and searchable database of experimentally characterized PABPs. It is freely available and can be accessed online at http://pabp.bcf.ku.edu/DB_PABP/. The DB-PABP was implemented as a MySQL relational database. An interactive web interface was created using Java Server Pages (JSP). The search page of the database is organized into a main search form and a section for utilities. The main search form enables custom searches via four menus: protein names, polyanion names, the source species of the proteins and the methods used to discover the interactions. Available utilities include a commonality matrix, a function of listing PABPs by the number of interacting polyanions and a string search for author surnames. The DB-PABP is maintained at the University of Kansas. We encourage users to provide feedback and submit new data and references. PMID:17916573

  10. DB-PABP: a database of polyanion-binding proteins.

    PubMed

    Fang, Jianwen; Dong, Yinghua; Salamat-Miller, Nazila; Middaugh, C Russell

    2008-01-01

    The interactions between polyanions (PAs) and polyanion-binding proteins (PABPs) have been found to play significant roles in many essential biological processes including intracellular organization, transport and protein folding. Furthermore, many neurodegenerative disease-related proteins are PABPs. Thus, a better understanding of PA/PABP interactions may not only enhance our understanding of biological systems but also provide new clues to these deadly diseases. The literature in this field is widely scattered, suggesting the need for a comprehensive and searchable database of PABPs. The DB-PABP is a comprehensive, manually curated and searchable database of experimentally characterized PABPs. It is freely available and can be accessed online at http://pabp.bcf.ku.edu/DB_PABP/. The DB-PABP was implemented as a MySQL relational database. An interactive web interface was created using Java Server Pages (JSP). The search page of the database is organized into a main search form and a section for utilities. The main search form enables custom searches via four menus: protein names, polyanion names, the source species of the proteins and the methods used to discover the interactions. Available utilities include a commonality matrix, a function of listing PABPs by the number of interacting polyanions and a string search for author surnames. The DB-PABP is maintained at the University of Kansas. We encourage users to provide feedback and submit new data and references.

  11. Skylign: a tool for creating informative, interactive logos representing sequence alignments and profile hidden Markov models

    PubMed Central

    2014-01-01

    Background Logos are commonly used in molecular biology to provide a compact graphical representation of the conservation pattern of a set of sequences. They render the information contained in sequence alignments or profile hidden Markov models by drawing a stack of letters for each position, where the height of the stack corresponds to the conservation at that position, and the height of each letter within a stack depends on the frequency of that letter at that position. Results We present a new tool and web server, called Skylign, which provides a unified framework for creating logos for both sequence alignments and profile hidden Markov models. In addition to static image files, Skylign creates a novel interactive logo plot for inclusion in web pages. These interactive logos enable scrolling, zooming, and inspection of underlying values. Skylign can avoid sampling bias in sequence alignments by down-weighting redundant sequences and by combining observed counts with informed priors. It also simplifies the representation of gap parameters, and can optionally scale letter heights based on alternate calculations of the conservation of a position. Conclusion Skylign is available as a website, a scriptable web service with a RESTful interface, and as a software package for download. Skylign’s interactive logos are easily incorporated into a web page with just a few lines of HTML markup. Skylign may be found at http://skylign.org. PMID:24410852

  12. SU-D-BRD-02: A Web-Based Image Processing and Plan Evaluation Platform (WIPPEP) for Future Cloud-Based Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, X; Liu, L; Xing, L

    Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software updates. We present a web-based image processing and plan evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server and computation server. The independent servers communicate with each other through HTTP requests. The web server is the key component; it provides visualizations and the user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open-source DCME4CHEE PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests. Our computation server was implemented in Delphi, Python and PHP, which can process data directly or via a C++ program DLL. Results: This software platform runs on a 32-core CPU server virtually hosting the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize images and RT structures belonging to this patient, and perform image segmentation running on the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation server. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and exhibited potential for future cloud-based radiotherapy.

  13. CyBy(2): a structure-based data management tool for chemical and biological data.

    PubMed

    Höck, Stefan; Riedl, Rainer

    2012-01-01

    We report the development of a powerful data management tool for chemical and biological data: CyBy(2). CyBy(2) is a structure-based information management tool used to store and visualize structural data alongside additional information such as project assignment, physical information, spectroscopic data, biological activity, functional data and synthetic procedures. The application consists of a database, an application server, used to query and update the database, and a client application with a rich graphical user interface (GUI) used to interact with the server.

  14. OPeNDAP Server4: Building a High-Performance Server for the DAP by Leveraging Existing Software

    NASA Astrophysics Data System (ADS)

    Potter, N.; West, P.; Gallagher, J.; Garcia, J.; Fox, P.

    2006-12-01

    OPeNDAP has been working in conjunction with NCAR/ESSL/HAO to develop a modular, high-performance data server that will be the successor to the current OPeNDAP data server. The new server, called Server4, is really two servers: a 'Back-End' data server, which reads information from various types of data sources and packages the results in DAP objects; and a 'Front-End', which receives client DAP requests and then decides how to use features of the Back-End data server to build the correct responses. This architecture can be configured in several interesting ways: the Front- and Back-End components can be run on the same or different machines, depending on security and performance needs; new Front-End software can be written to support other network data access protocols; and local applications can interact directly with the Back-End data server. The new server's Back-End component uses the server infrastructure developed by HAO for the Earth System Grid II project, and the extensions needed to use it as part of the new OPeNDAP server were minimal. The HAO server was modified so that it loads 'data handlers' at run-time. Each data handler module only needs to satisfy a simple interface, which both enabled the existing data handlers written for the old OPeNDAP server to be used directly and simplifies writing new handlers from scratch. The Back-End server leverages high-performance features developed for the ESG II project, so applications that interact with it directly can read large volumes of data efficiently. The Front-End module of Server4 uses the Java Servlet system in place of the Common Gateway Interface (CGI) used in the past. New Front-End modules can be written to support different network data access protocols, so that the same server will ultimately be able to support more than the DAP/2.0 protocol; as an example, we will discuss a SOAP interface that is currently in development. In addition to support for DAP/2.0 and prototypical support for a SOAP interface, the new server includes support for the THREDDS cataloging protocol. THREDDS is tightly integrated into the Front-End of Server4, which can make full use of advanced THREDDS features such as attribute specification and inheritance, and custom catalogs that segue into automatically generated catalogs, as well as providing a default behavior that requires almost no catalog configuration.

  15. An extensible and lightweight architecture for adaptive server applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorton, Ian; Liu, Yan; Trivedi, Nihar

    2008-07-10

    Server applications augmented with behavioral adaptation logic can react to environmental changes, creating self-managing server applications with improved quality of service at runtime. However, developing adaptive server applications is challenging due to the complexity of the underlying server technologies and highly dynamic application environments. This paper presents an architecture framework, the Adaptive Server Framework (ASF), to facilitate the development of adaptive behavior for legacy server applications. ASF provides a clear separation between the implementation of adaptive behavior and the business logic of the server application. This means a server application can be extended with programmable adaptive features through the definition and implementation of control components defined in ASF. Furthermore, ASF is a lightweight architecture in that it incurs low CPU overhead and memory usage. We demonstrate the effectiveness of ASF through a case study, in which a server application dynamically determines the resolution and quality to scale an image based on the load of the server and network connection speed. The experimental evaluation demonstrates the performance gains possible by adaptive behavior and the low overhead introduced by ASF.

  16. Extending the Virtual Solar Observatory (VSO) to Incorporate Data Analysis Capabilities (III)

    NASA Astrophysics Data System (ADS)

    Csillaghy, A.; Etesi, L.; Dennis, B.; Zarro, D.; Schwartz, R.; Tolbert, K.

    2008-12-01

    We will present a progress report on our activities to extend the data analysis capabilities of the VSO. Our efforts to date have focused on three areas: 1. Extending the data retrieval capabilities by developing a centralized data processing server. The server is built with Java, IDL (Interactive Data Language), and the SSW (Solar SoftWare) package with all SSW-related instrument libraries and required calibration data. When a user requests VSO data that require preprocessing, the data are transparently sent to the server, processed, and returned to the user's IDL session for viewing and analysis. It is possible to have any Java or IDL client connect to the server. An IDL prototype for preparing and calibrating SOHO/EIT data will be demonstrated. 2. Improving the solar data search in SHOW SYNOP, a graphical user tool connected to VSO in IDL. We introduce the Java-IDL interface that allows a flexible, dynamic, and extendable way of searching the VSO, where all communication with the VSO is managed dynamically by standard Java tools. 3. Improving image overlay capability to support coregistration of solar disk observations obtained from different orbital view angles, position angles, and distances - such as from the twin STEREO spacecraft.

  17. Occupational Exploration at Ontario Junior High School: 9th Grade.

    ERIC Educational Resources Information Center

    Bates, Gene; And Others

    The document contains 56 activities for Grade 9. The contents include the following areas: questions about the future; job seeking activities and guidelines; career games; a personal interest check list; unit guides for courses in World of Work (55 pages), and Career Educational Planning (40 pages) which include objectives, activities, evaluation,…

  18. Thirty Meter Telescope (TMT) Narrow Field Infrared Adaptive Optics System (NFIRAOS) real-time controller preliminary architecture

    NASA Astrophysics Data System (ADS)

    Kerley, Dan; Smith, Malcolm; Dunn, Jennifer; Herriot, Glen; Véran, Jean-Pierre; Boyer, Corinne; Ellerbroek, Brent; Gilles, Luc; Wang, Lianqi

    2016-08-01

    The Narrow Field Infrared Adaptive Optics System (NFIRAOS) is the first light Adaptive Optics (AO) system for the Thirty Meter Telescope (TMT). A critical component of NFIRAOS is the Real-Time Controller (RTC) subsystem which provides real-time wavefront correction by processing wavefront information to compute Deformable Mirror (DM) and Tip/Tilt Stage (TTS) commands. The National Research Council of Canada - Herzberg (NRC-H), in conjunction with TMT, has developed a preliminary design for the NFIRAOS RTC. The preliminary architecture for the RTC is comprised of several Linux-based servers. These servers are assigned various roles including: the High-Order Processing (HOP) servers, the Wavefront Corrector Controller (WCC) server, the Telemetry Engineering Display (TED) server, the Persistent Telemetry Storage (PTS) server, and additional testing and spare servers. There are up to six HOP servers that accept high-order wavefront pixels, and perform parallelized pixel processing and wavefront reconstruction to produce wavefront corrector error vectors. The WCC server performs low-order mode processing, and synchronizes and aggregates the high-order wavefront corrector error vectors from the HOP servers to generate wavefront corrector commands. The Telemetry Engineering Display (TED) server is the RTC interface to TMT and other subsystems. The TED server receives all external commands and dispatches them to the rest of the RTC servers and is responsible for aggregating several offloading and telemetry values that are reported to other subsystems within NFIRAOS and TMT. The TED server also provides the engineering GUIs and real-time displays. The Persistent Telemetry Storage (PTS) server contains fault tolerant data storage that receives and stores telemetry data, including data for Point-Spread Function Reconstruction (PSFR).

  19. Interactive remote data processing using Pixelize Wavelet Filtration (PWF-method) and PeriodMap analysis

    NASA Astrophysics Data System (ADS)

    Sych, Robert; Nakariakov, Valery; Anfinogentov, Sergey

    Wavelet analysis is suitable for investigating waves and oscillations in the solar atmosphere, which are limited in both time and frequency. We have developed an algorithm to detect these waves using Pixelize Wavelet Filtration (the PWF method). This method provides information about the presence of propagating and non-propagating waves in the observed data (cube images) and localizes them precisely in time as well as in space. We tested the algorithm and found that the results of coronal wave detection are consistent with those obtained by visual inspection. For fast exploration of the data cube, we additionally applied the earlier-developed PeriodMap analysis. This method is based on the Fast Fourier Transform and, at an initial stage, allows a quick search for "hot" regions with peak harmonic oscillations and a determination of the spatial distribution at the significant harmonics. We propose splitting the detection procedure for coronal waves into two parts: first, the PeriodMap analysis (fast preparation), and then, using the information about the spatial distribution of oscillation sources, the PWF method (slow preparation). There are two possible modes of working with the data: automatic and hands-on operation. In the first, we use multiple PWF analysis to prepare narrowband maps in frequency sub-bands (in factors of two) and/or harmonic PWF analysis to separate harmonics in a spectrum. In the second, we manually select the necessary spectral sub-band and temporal interval and then construct narrowband maps. For practical implementation of the proposed methods, we have developed a remote data processing system at the Institute of Solar-Terrestrial Physics, Irkutsk. The system is based on the data processing server http://pwf.iszf.irk.ru. The main aim of this resource is the calculation, via remote access through a local and/or global network (Internet), of narrowband maps of wave sources both in the whole spectral band and at significant harmonics. In addition, we can obtain the temporal dynamics (mpeg files) of the main oscillation characteristics (amplitude, power and phase) as functions of the spatial-temporal coordinates. For periodogram mapping of data cubes as a pre-analysis method, we prepare colour maps in which each pixel's colour corresponds to the frequency of the power spectrum maximum. The computer system is based on ION-script applications, the algorithmic languages IDL and PHP, and the Apache web server. The IDL ION-scripts are used to prepare and configure network requests at the central data server, with subsequent connection to the IDL run-time software and graphic output to the FTP server and screen. The web page is constructed using the PHP language.
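
    A compact sketch of the PeriodMap-style pre-analysis described above: for every pixel of a synthetic data cube, take the FFT along the time axis and record the frequency of the spectral peak; the array shapes and cadence are arbitrary.

```python
import numpy as np

def period_map(cube: np.ndarray, cadence: float) -> np.ndarray:
    """Return, per pixel, the frequency (Hz) of the power-spectrum maximum.

    cube    -- data cube with shape (time, y, x)
    cadence -- sampling interval in seconds
    """
    nt = cube.shape[0]
    detrended = cube - cube.mean(axis=0)              # remove the constant background
    spectrum = np.abs(np.fft.rfft(detrended, axis=0)) ** 2
    freqs = np.fft.rfftfreq(nt, d=cadence)
    peak_index = spectrum[1:].argmax(axis=0) + 1      # skip the zero-frequency bin
    return freqs[peak_index]

# Synthetic cube: a 3-minute (~5.6 mHz) oscillation in a small patch plus noise.
t = np.arange(512) * 10.0                             # 10 s cadence
cube = np.random.normal(0, 0.1, (512, 32, 32))
cube[:, 10:20, 10:20] += np.sin(2 * np.pi * t / 180.0)[:, None, None]

pmap = period_map(cube, cadence=10.0)
print(pmap[15, 15], "Hz in the oscillating patch")    # ~0.0056 Hz
```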

  20. Energy Efficiency in Small Server Rooms: Field Surveys and Findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheung, Iris; Greenberg, Steve; Mahdavi, Roozbeh

    Fifty-seven percent of US servers are housed in server closets, server rooms, and localized data centers, in what are commonly referred to as small server rooms, which comprise 99 percent of all server spaces in the US. While many mid-tier and enterprise-class data centers are owned by large corporations that consider energy efficiency a goal to minimize business operating costs, small server rooms typically are not similarly motivated. They are characterized by decentralized ownership and management and come in many configurations, which creates a unique set of efficiency challenges. To develop energy efficiency strategies for these spaces, we surveyed 30 small server rooms across eight institutions, and selected four of them for detailed assessments. The four rooms had Power Usage Effectiveness (PUE) values ranging from 1.5 to 2.1. Energy saving opportunities ranged from no- to low-cost measures such as raising cooling set points and better airflow management, to more involved but cost-effective measures including server consolidation and virtualization, and dedicated cooling with economizers. We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers, rather than server rooms. Backup power requirements, and IT and cooling efficiency, should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge on server operation, and the implementation of energy efficiency measures in small server rooms.
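
    Power Usage Effectiveness is simply the ratio of total facility power to IT equipment power, so the 1.5 to 2.1 range reported above means roughly 0.5 to 1.1 W of cooling and distribution overhead for every watt of IT load. A trivial sketch with made-up meter readings:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Made-up readings for a small server room: 10 kW of IT load, 21 kW at the facility meter.
print(round(pue(total_facility_kw=21.0, it_equipment_kw=10.0), 2))  # 2.1
```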

  1. Brandenburg 3D - a comprehensive 3D Subsurface Model, Conception of an Infrastructure Node and a Web Application

    NASA Astrophysics Data System (ADS)

    Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim

    2014-05-01

    The Energiewende and the increasing scarcity of raw materials will lead to an intensified utilization of the subsurface in Germany. Within this context, geological 3D modeling is a fundamental approach for integrated decision and planning processes. Initiated by the development of the European Geospatial Infrastructure INSPIRE, the German State Geological Offices started digitizing their predominantly analog archive inventory. Until now, a comprehensive 3D subsurface model of Brandenburg did not exist. Therefore the project B3D strove to develop a new 3D model as well as a subsequent infrastructure node to integrate all geological and spatial data within the Geodaten-Infrastruktur Brandenburg (Geospatial Infrastructure, GDI-BB) and provide it to the public through an interactive 2D/3D web application. The functionality of the web application is based on a client-server architecture. On the server side, all available spatial data are published through GeoServer. GeoServer is designed for interoperability and acts as the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) standard, which provides the interface that allows requests for geographical features. In addition, GeoServer implements, among others, the high-performance, certified-compliant Web Map Service (WMS) that serves geo-referenced map images. For publishing 3D data, the OGC Web 3D Service (W3DS), a portrayal service for three-dimensional geo-data, is used. The W3DS displays elements representing the geometry, appearance, and behavior of geographic objects. On the client side, the web application is solely based on Free and Open Source Software and leans on the JavaScript API WebGL, which allows the interactive rendering of 2D and 3D graphics by means of GPU-accelerated physics and image processing as part of the web page canvas without the use of plug-ins. WebGL is supported by most web browsers (e.g., Google Chrome, Mozilla Firefox, Safari, and Opera). The web application enables intuitive navigation through all available information and allows the visualization of geological maps (2D), seismic transects (2D/3D), wells (2D/3D), and the 3D model. These achievements will ease spatial and geological data management within the German State Geological Offices and foster the interoperability of heterogeneous systems. They will provide guidance for systematic subsurface management across system, domain and administrative boundaries on the basis of a federated spatial data infrastructure, and include the public in the decision processes (e-Governance). Yet, the interoperability of the systems has to be strongly propelled forward through agreements on standards that need to be decided upon in responsible committees. The project B3D is funded with resources from the European Fund for Regional Development (EFRE).
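
    Purely as an illustration of the client side of such an OGC-based stack, the sketch below issues a standard WFS 2.0 GetFeature request with the requests library; the endpoint URL and layer name are hypothetical and depend entirely on the deployment.

```python
import requests

# Hypothetical GeoServer endpoint and layer name.
WFS_URL = "https://geo.example.org/geoserver/ows"
LAYER = "b3d:boreholes"

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": LAYER,
    "outputFormat": "application/json",   # GeoJSON output as supported by GeoServer
    "count": 10,                          # limit the number of returned features
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()
for feature in response.json().get("features", []):
    print(feature.get("id"), feature.get("properties", {}))
```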

  2. An Efficient and Practical Smart Card Based Anonymity Preserving User Authentication Scheme for TMIS using Elliptic Curve Cryptography.

    PubMed

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Kumar, Neeraj

    2015-11-01

    In the last few years, numerous remote user authentication and session key agreement schemes have been put forward for the Telecare Medical Information System, in which the patient and the medical server exchange medical information over the Internet. We have found that most of these schemes are not usable in practice due to known security weaknesses. It is also worth noting that an unrestricted number of patients across the globe log in to the single medical server; the computation and maintenance overhead would therefore be high, and the server may fail to provide services. In this article, we have designed a medical system architecture and a standard mutual authentication scheme for a single medical server, in which the patient can securely exchange medical data with the doctor(s) via a trusted central medical server over any insecure network. We then analyzed the security of the scheme and its resilience to attacks. Moreover, we formally validated the proposed scheme through simulation using the Automated Validation of Internet Security Protocols and Applications software, whose outcomes confirm that the scheme is protected against active and passive attacks. The performance comparison demonstrates that the proposed scheme has lower communication cost than the existing schemes in the literature. In addition, the computation cost of the proposed scheme is nearly equal to that of the existing schemes. The proposed scheme is not only resistant to different security attacks, but also provides efficient login, mutual authentication, session key agreement and verification, and password update phases, along with password recovery.
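
    The full smart-card protocol is beyond a short sketch, but the session-key step in such schemes rests on standard elliptic-curve key agreement. Below is a generic ECDH-plus-HKDF illustration using the Python cryptography package; it is not the authors' scheme, and the curve and info label are arbitrary choices.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates an ephemeral EC key pair (curve choice is illustrative).
patient_key = ec.generate_private_key(ec.SECP256R1())
server_key = ec.generate_private_key(ec.SECP256R1())

def derive_session_key(own_private, peer_public) -> bytes:
    """After exchanging public keys, both sides derive the same 256-bit session key."""
    shared = own_private.exchange(ec.ECDH(), peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"tmis-session").derive(shared)

patient_session = derive_session_key(patient_key, server_key.public_key())
server_session = derive_session_key(server_key, patient_key.public_key())
assert patient_session == server_session   # both ends now hold the same session key
```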

  3. Design and Analysis of an Enhanced Patient-Server Mutual Authentication Protocol for Telecare Medical Information System.

    PubMed

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Obaidat, Mohammad S

    2015-11-01

    In order to access a remote medical server, patients generally use a smart card to log in to the server. It has been observed that most user (patient) authentication protocols suffer from smart card stolen attack, meaning that an attacker can mount several common attacks after extracting the smart card information. Recently, Lu et al. proposed a session key agreement protocol between the patient and the remote medical server and claimed that the protocol is secure against relevant security attacks. However, this paper presents several security attacks on Lu et al.'s protocol, namely identity trace attack, new smart card issue attack, patient impersonation attack and medical server impersonation attack. In order to fix the mentioned security pitfalls, including the smart card stolen attack, this paper proposes an efficient remote mutual authentication protocol using a smart card. We have then simulated the proposed protocol using the widely accepted AVISPA simulation tool, whose results confirm that the protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. Moreover, the rigorous security analysis proves that the proposed protocol provides strong protection against the relevant security attacks, including the smart card stolen attack. We compare the proposed scheme with several related schemes in terms of computation cost and communication cost as well as security functionalities. It has been observed that the proposed scheme is comparatively better than the related existing schemes.

  4. The Czech National Grid Infrastructure

    NASA Astrophysics Data System (ADS)

    Chudoba, J.; Křenková, I.; Mulač, M.; Ruda, M.; Sitera, J.

    2017-10-01

    The Czech National Grid Infrastructure is operated by MetaCentrum, a CESNET department responsible for coordinating and managing activities related to distributed computing. CESNET, as the Czech National Research and Education Network (NREN), provides many e-infrastructure services, which are used by 94% of the scientific and research community in the Czech Republic. Computing and storage resources owned by different organizations are connected by a network fast enough to provide transparent access to all resources. We describe in more detail the computing infrastructure, which is based on several different technologies and covers grid, cloud and map-reduce environments. While the largest part of the CPUs is still accessible via distributed torque servers, providing an environment for long batch jobs, part of the infrastructure is available via standard EGI tools, a subset of NGI resources is provided to the EGI FedCloud environment with a cloud interface, and there is also a Hadoop cluster provided by the same e-infrastructure. A broad spectrum of computing servers is offered; users can choose from standard 2-CPU servers to large SMP machines with up to 6 TB of RAM or servers with GPU cards. Different groups have different priorities on various resources, and resource owners can even have exclusive access. The software is distributed via AFS. Storage servers offering up to tens of terabytes of disk space to individual users are connected via NFS4 on top of GPFS, and access to long-term HSM storage with petabyte capacity is also provided. An overview of available resources and recent usage statistics will be given.

  5. SFESA: a web server for pairwise alignment refinement by secondary structure shifts.

    PubMed

    Tong, Jing; Pei, Jimin; Grishin, Nick V

    2015-09-03

    Protein sequence alignment is essential for a variety of tasks such as homology modeling and active site prediction. Alignment errors remain the main cause of low-quality structure models. A bioinformatics tool to refine alignments is needed to make protein alignments more accurate. We developed the SFESA web server to refine pairwise protein sequence alignments. Compared to the previous version of SFESA, which required a set of 3D coordinates for a protein, the new server will search a sequence database for the closest homolog with an available 3D structure to be used as a template. For each alignment block defined by secondary structure elements in the template, SFESA evaluates alignment variants generated by local shifts and selects the best-scoring alignment variant. A scoring function that combines the sequence score of profile-profile comparison and the structure score of template-derived contact energy is used for evaluation of alignments. PROMALS pairwise alignments refined by SFESA are more accurate than those produced by current advanced alignment methods such as HHpred and CNFpred. In addition, SFESA also improves alignments generated by other software. SFESA is a web-based tool for alignment refinement, designed for researchers to compute, refine, and evaluate pairwise alignments with a combined sequence and structure scoring of alignment blocks. To our knowledge, the SFESA web server is the only tool that refines alignments by evaluating local shifts of secondary structure elements. The SFESA web server is available at http://prodata.swmed.edu/sfesa.

  6. Implementation of 3D spatial indexing and compression in a large-scale molecular dynamics simulation database for rapid atomic contact detection.

    PubMed

    Toofanny, Rudesh D; Simms, Andrew M; Beck, David A C; Daggett, Valerie

    2011-08-10

    Molecular dynamics (MD) simulations offer the ability to observe the dynamics and interactions of both whole macromolecules and individual atoms as a function of time. Taken in context with experimental data, atomic interactions from simulation provide insight into the mechanics of protein folding, dynamics, and function. The calculation of atomic interactions or contacts from an MD trajectory is computationally demanding and the work required grows exponentially with the size of the simulation system. We describe the implementation of a spatial indexing algorithm in our multi-terabyte MD simulation database that significantly reduces the run-time required for discovery of contacts. The approach is applied to the Dynameomics project data. Spatial indexing, also known as spatial hashing, is a method that divides the simulation space into regular sized bins and attributes an index to each bin. Since, the calculation of contacts is widely employed in the simulation field, we also use this as the basis for testing compression of data tables. We investigate the effects of compression of the trajectory coordinate tables with different options of data and index compression within MS SQL SERVER 2008. Our implementation of spatial indexing speeds up the calculation of contacts over a 1 nanosecond (ns) simulation window by between 14% and 90% (i.e., 1.2 and 10.3 times faster). For a 'full' simulation trajectory (51 ns) spatial indexing reduces the calculation run-time between 31 and 81% (between 1.4 and 5.3 times faster). Compression resulted in reduced table sizes but resulted in no significant difference in the total execution time for neighbour discovery. The greatest compression (~36%) was achieved using page level compression on both the data and indexes. The spatial indexing scheme significantly decreases the time taken to calculate atomic contacts and could be applied to other multidimensional neighbor discovery problems. The speed up enables on-the-fly calculation and visualization of contacts and rapid cross simulation analysis for knowledge discovery. Using page compression for the atomic coordinate tables and indexes saves ~36% of disk space without any significant decrease in calculation time and should be considered for other non-transactional databases in MS SQL SERVER 2008.
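
    The binning idea behind spatial hashing can be sketched in a few lines of Python. The paper's implementation lives inside MS SQL Server 2008; the in-memory version below only illustrates the algorithm (bin atoms into regular cells, then compare each atom against atoms in the 27 surrounding cells), with an assumed 4.5 Å contact cutoff and invented coordinates.

```python
# Minimal in-memory sketch of spatial hashing for atomic contact detection.
# The paper implements this inside MS SQL Server 2008; here the same idea
# (bin coordinates into regular cells, then only compare atoms in adjacent
# cells) is shown with plain Python dictionaries.
from collections import defaultdict
from itertools import product
import math

def find_contacts(coords, cutoff=4.5):
    """Return index pairs of atoms closer than `cutoff` (same units as coords)."""
    bins = defaultdict(list)
    for i, (x, y, z) in enumerate(coords):
        bins[(int(x // cutoff), int(y // cutoff), int(z // cutoff))].append(i)

    contacts = []
    for (bx, by, bz), members in bins.items():
        # Gather atoms in this bin and all 26 neighbouring bins.
        nearby = []
        for dx, dy, dz in product((-1, 0, 1), repeat=3):
            nearby.extend(bins.get((bx + dx, by + dy, bz + dz), []))
        for i in members:
            for j in nearby:
                if j > i and math.dist(coords[i], coords[j]) < cutoff:
                    contacts.append((i, j))
    return sorted(set(contacts))

print(find_contacts([(0.0, 0.0, 0.0), (1.0, 1.0, 1.0), (10.0, 10.0, 10.0)]))
```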

  7. Implementation of 3D spatial indexing and compression in a large-scale molecular dynamics simulation database for rapid atomic contact detection

    PubMed Central

    2011-01-01

    Background Molecular dynamics (MD) simulations offer the ability to observe the dynamics and interactions of both whole macromolecules and individual atoms as a function of time. Taken in context with experimental data, atomic interactions from simulation provide insight into the mechanics of protein folding, dynamics, and function. The calculation of atomic interactions or contacts from an MD trajectory is computationally demanding and the work required grows exponentially with the size of the simulation system. We describe the implementation of a spatial indexing algorithm in our multi-terabyte MD simulation database that significantly reduces the run-time required for discovery of contacts. The approach is applied to the Dynameomics project data. Spatial indexing, also known as spatial hashing, is a method that divides the simulation space into regular sized bins and attributes an index to each bin. Since, the calculation of contacts is widely employed in the simulation field, we also use this as the basis for testing compression of data tables. We investigate the effects of compression of the trajectory coordinate tables with different options of data and index compression within MS SQL SERVER 2008. Results Our implementation of spatial indexing speeds up the calculation of contacts over a 1 nanosecond (ns) simulation window by between 14% and 90% (i.e., 1.2 and 10.3 times faster). For a 'full' simulation trajectory (51 ns) spatial indexing reduces the calculation run-time between 31 and 81% (between 1.4 and 5.3 times faster). Compression resulted in reduced table sizes but resulted in no significant difference in the total execution time for neighbour discovery. The greatest compression (~36%) was achieved using page level compression on both the data and indexes. Conclusions The spatial indexing scheme significantly decreases the time taken to calculate atomic contacts and could be applied to other multidimensional neighbor discovery problems. The speed up enables on-the-fly calculation and visualization of contacts and rapid cross simulation analysis for knowledge discovery. Using page compression for the atomic coordinate tables and indexes saves ~36% of disk space without any significant decrease in calculation time and should be considered for other non-transactional databases in MS SQL SERVER 2008. PMID:21831299

  8. BioSig-Air-Force

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-07-15

    1) Configured servers: In coordination with the INSIGHT team, a hardware configuration was selected. Two nodes were purchased, configured, and shipped with a compatible OS and database installation. The servers have been stress tested for reliability, as they use leading-edge technologies. Each node has two CPUs with 12 cores per CPU and maximum onboard memory for high performance. 2) LIM and Experimental module: The original BioSig system was developed for cancer research. Accordingly, the LIM system and its corresponding web pages are being modified to handle (i) pathogen-donor interactions, (ii) media composition, and (iii) chemical and siRNA plate configurations. The LIM system has been redesigned. The revised system allows the design of new media and tracking from lot to lot, so that variations in phenotypic responses can be traced to a specific medium and lot number. Similar associations are also possible with other experimental factors (e.g., donor-pathogen, siRNA, and chemical). Furthermore, the design of the experimental variables has also been revised to (i) interact with the newly developed LIM system, (ii) simplify experimental specifications, and (iii) test for potential operator errors during data entry. Part of the complication has been the handshake between the multiple teams that provide the small-molecule plates and the team that creates the assay plates. Our efforts have focused on harmonizing these interactions (e.g., various data formats) so that each assay plate can be mapped to its source and a correct set of experimental variables can be associated with each image. For example, depending on the source of the chemical plates, they may have different formats. We have developed a canonical representation that registers the SMILES code for each chemical compound along with its physicochemical properties. The LIM schema works in conjunction with customized Web pages. 3) Import of Images and computed descriptors module: In coordination with the INSIGHT team, policies were designed to route images and computed representations into BioSig. This module checks for completion of image analysis and imports images, computed masks, and descriptors into BioSig. A database API for efficient retrieval of a selection of descriptors (among thousands) was designed and implemented. 4) Computed segmentation masks from external software were imported, boundaries computed, and overlaid on images for quality control.

  9. Cybersecurity, massive data processing, community interaction, and other developments at WWW-based computational X-ray Server

    NASA Astrophysics Data System (ADS)

    Stepanov, Sergey

    2013-03-01

    X-Ray Server (x-server.gmca.aps.anl.gov) is a WWW-based computational server for modeling X-ray diffraction, reflection and scattering data. The modeling software operates directly on the server and can be accessed remotely either from web browsers or from user software. In the latter case the server can be deployed as a software library or a data-fitting engine. As the server recently surpassed the milestones of 15 years online and 1.5 million calculations, it has accumulated a number of technical solutions that are discussed in this paper. The developed approaches to detecting physical model limits and user calculation failures, solutions to spam and firewall problems, ways to involve the community in replenishing databases, and methods for teaching users automated access to the server programs may be helpful for X-ray researchers interested in using the server or sharing their own software online.
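
    Automated access from user software, as mentioned above, generally amounts to posting a parameter set to the server over HTTP and parsing the reply. The Python sketch below illustrates that pattern with the requests library; the endpoint path and form-field names are hypothetical placeholders, not the X-ray Server's actual interface.

```python
# Generic sketch of scripted access to a web-based computational server such
# as the one described above. The endpoint path and form-field names below
# are hypothetical placeholders, not the X-ray Server's documented interface.
import requests

def run_remote_calculation(base_url: str, parameters: dict) -> str:
    """POST a parameter set to the server and return the text of the reply."""
    response = requests.post(f"{base_url}/cgi/run_calculation",
                             data=parameters, timeout=60)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    result = run_remote_calculation(
        "https://x-server.gmca.aps.anl.gov",                       # server from the abstract
        {"crystal": "Si", "reflection": "111", "energy_keV": 12.0},  # hypothetical fields
    )
    print(result[:200])
```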

  10. Effect of video server topology on contingency capacity requirements

    NASA Astrophysics Data System (ADS)

    Kienzle, Martin G.; Dan, Asit; Sitaram, Dinkar; Tetzlaff, William H.

    1996-03-01

    Video servers need to assign a fixed set of resources to each video stream in order to guarantee on-time delivery of the video data. If a server has insufficient resources to guarantee the delivery, it must reject the stream request rather than slowing down all existing streams. Large scale video servers are being built as clusters of smaller components, so as to be economical, scalable, and highly available. This paper uses a blocking model developed for telephone systems to evaluate video server cluster topologies. The goal is to achieve high utilization of the components and low per-stream cost combined with low blocking probability and high user satisfaction. The analysis shows substantial economies of scale achieved by larger server images. Simple distributed server architectures can result in partitioning of resources with low achievable resource utilization. By comparing achievable resource utilization of partitioned and monolithic servers, we quantify the cost of partitioning. Next, we present an architecture for a distributed server system that avoids resource partitioning and results in highly efficient server clusters. Finally, we show how, in these server clusters, further optimizations can be achieved through caching and batching of video streams.
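
    The classical telephone blocking model is the Erlang-B formula; the abstract does not name the exact model used, but Erlang B illustrates the economy-of-scale argument: at a fixed offered load, one large pool blocks far fewer requests than the same capacity split into partitions. A minimal Python sketch follows.

```python
# Erlang-B blocking probability, used here only to illustrate the
# economy-of-scale effect the abstract describes; the paper's own blocking
# model is not reproduced.
def erlang_b(offered_load: float, servers: int) -> float:
    """Blocking probability for `offered_load` Erlangs on `servers` channels,
    computed with the standard recurrence to avoid large factorials."""
    b = 1.0
    for n in range(1, servers + 1):
        b = (offered_load * b) / (n + offered_load * b)
    return b

# One pool of 200 streams versus two independent pools of 100 streams each,
# carrying the same total offered load (numbers are illustrative).
load = 180.0
print("monolithic :", erlang_b(load, 200))
print("partitioned:", erlang_b(load / 2, 100))
```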

  11. When I Grow Up... Career Activities for Kindergarten through Sixth Grade.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Education, Oklahoma City. Curriculum Div.

    This resource unit provides activities and resources for career awareness at the elementary school level. Student pages which can be used as a basis for activities are included for both primary and intermediate levels. The student pages are related to the following job areas in which growth has been predicted: (1) manufacturing; (2) foods; (3)…

  12. Data Access Tools And Services At The Goddard Distributed Active Archive Center (GDAAC)

    NASA Technical Reports Server (NTRS)

    Pham, Long; Eng, Eunice; Sweatman, Paul

    2003-01-01

    As one of the largest providers of Earth Science data from the Earth Observing System, the GDAAC provides the latest data from the Moderate Resolution Imaging Spectroradiometer (MODIS), Atmospheric Infrared Sounder (AIRS), and Solar Radiation and Climate Experiment (SORCE) data products via the GDAAC's data pool (50 TB of disk cache). In order to make this huge volume of data more accessible to the public and science communities, the GDAAC offers multiple data access tools and services: the Open Source Project for Network Data Access Protocol (OPeNDAP), the Grid Analysis and Display System (GrADS/DODS) server (GDS), the Live Access Server (LAS), the OpenGIS Web Map Server (WMS) and Near Archive Data Mining (NADM). The objective is to help users retrieve electronically a smaller, usable portion of data for further analysis. The OPeNDAP server, formerly known as the Distributed Oceanographic Data System (DODS), allows the user to retrieve data without worrying about the data format. OPeNDAP is capable of server-side subsetting of HDF, HDF-EOS, netCDF, JGOFS, ASCII, DSP, FITS and binary data formats. The GrADS/DODS server is capable of serving the same data formats as OPeNDAP. GDS has the additional feature of server-side analysis: users can analyze the data on the server, thereby decreasing the computational load on their client systems. The LAS is a flexible server that allows users to graphically visualize data on the fly, to request different file formats, and to compare variables from distributed locations. Users of LAS can also use other available graphics viewers such as IDL, Matlab or GrADS. The WMS is based on OPeNDAP for serving geospatial information; it supports the OpenGIS protocol to provide data in GIS-friendly formats for analysis and visualization. NADM is another access route to the GDAAC's data pool. NADM gives users the capability to use a browser to upload their C, FORTRAN or IDL algorithms, test the algorithms, and mine data in the data pool. With NADM, the GDAAC provides an environment physically close to the data source. NADM benefits users with data mining or data reduction algorithms by reducing large volumes of data before transmission over the network to the user.
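
    As a concrete illustration of the server-side subsetting these services provide, the Python sketch below opens an OPeNDAP dataset and transfers only a small slice. The dataset URL and variable name are hypothetical placeholders; any OPeNDAP endpoint exposed by such a data pool could be substituted.

```python
# Generic sketch of server-side subsetting through OPeNDAP, the first of the
# access services listed above. The dataset URL and variable name are
# hypothetical placeholders, not an actual GDAAC endpoint.
from netCDF4 import Dataset   # requires netCDF4 built with OPeNDAP (DAP) support

url = "https://example.gsfc.nasa.gov/opendap/hypothetical/airs_granule.nc"
with Dataset(url) as ds:                     # opens remotely, no full download
    temperature = ds.variables["air_temperature"]
    # Only the requested slice is transferred from the server.
    subset = temperature[0, 10:20, 10:20]
    print(subset.shape, float(subset.mean()))
```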

  13. EarthServer: a Summary of Achievements in Technology, Services, and Standards

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2015-04-01

    Big Data in the Earth sciences, the tera- to exabyte archives, mostly consist of coverage data, defined by ISO and OGC as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor time series, 2-D remote sensing imagery, 3-D x/y/t image time series and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as computing the Fourier transform of satellite images. As network bandwidth limits prohibit the transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The transatlantic EarthServer initiative, running from 2011 through 2014, united 11 partners to establish Big Earth Data Analytics. A key ingredient has been flexibility for users to ask whatever they want, not impeded and complicated by system internals. The EarthServer answer is to use high-level, standards-based query languages which unify data and metadata search in a simple yet powerful way. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform comprises rasdaman, the pioneering and leading Array DBMS built for any-size multi-dimensional raster data, extended with support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data import and, hence, duplication); and the aforementioned distributed query processing. Additionally, Web clients for multi-dimensional data visualization are being established. Client/server interfaces are strictly based on OGC and W3C standards, in particular the Web Coverage Processing Service (WCPS), which defines a high-level coverage query language. Reviewers have attested that "With no doubt the project has been shaping the Big Earth Data landscape through the standardization activities within OGC, ISO and beyond". We present the project approach, its outcomes and impact on standardization and Big Data technology, and vistas for the future.

  14. Development and Evaluation of Internet-Based Hypermedia Chemistry Tutorials

    NASA Astrophysics Data System (ADS)

    Tissue, Brian M.; Earp, Ronald L.; Yip, Ching-Wan; Anderson, Mark R.

    1996-05-01

    This progress report describes the development and student use of World-Wide-Web-based prelaboratory exercises in senior-level Instrumental Analysis during the 1995 Fall semester. The laboratory preparation exercises contained hypermedia tutorials and multiple-choice questions that were intended to familiarize the students with the experiments and instrumentation before their laboratory session. The overall goal of our work is to explore ways in which computer and network technology can be applied in education to improve the cost-effectiveness and efficacy of teaching. The course material can be accessed at http://www.chem.vt.edu/chem-ed/4114/Fall1995.html. The students were instructed to read their experimental procedure and to do the relevant laboratory preparation exercise. The individual tutorial documents were primarily text that provided basic theoretical and experimental descriptions of analytical and instrumental methods. The documents included hyperlinks to basic concepts, simple schematics, and color graphics of experimental set-ups or instrumentation. We chose the World-Wide Web (WWW) as the delivery platform for this project because of the ease of developing, distributing, and modifying hypermedia material in a client-server system. The disadvantage of the WWW is that network bandwidth limits the size and sophistication of the hypermedia material. To minimize internet transfer time, the individual documents were kept short and usually contained no more than 3 or 4 inline images. After reading the tutorial the students answered several multiple-choice questions. The figure shows one example of a multiple-choice question and the response page. Clicking on the "Submit answer" button calls a *.cgi file, which contains instructions in the PERL interpretive language, that generates the response page and saves the date, time, and student's answer to a file on the server. Usage and student perception of the on-line material was evaluated from server logs and student surveys. On-time completion of the assignments was 75%, but use of other on-line resources such as a question-and-answer page was minimal. Responses from student surveys indicated that the students had sufficient access to the internet. Approximately half of the students completed the prelaboratory exercises from one of several computers in the laboratory, and half worked from a workplace, university library, or home. Greater than 85% of all student usage from the laboratory computers occurred between 11 am and 4 pm. A mid-semester student survey indicated that the spectroscopy prelabs with three multiple-choice questions were better for increasing conceptual understanding rather than preparing the students for the actual lab work. An end-of-the-semester survey based on the electrochemistry assignments, which consisted of two multiple-choice questions and one clickable-map graphical exercise, produced a slightly higher rating for preparing students for the laboratory work. The differences between the spectroscopy and electrochemistry exercises prevent drawing any real conclusions from these two surveys, however, they do help guide the preparation of the content of future exercises. Next year's materials will contain three multiple-choice questions and one graphics-based exercise. The clickable-map graphics and at least one of the multiple-choice questions will be designed to test an understanding of the experimental procedure and instrument use to better prepare students for the actual laboratory work. Acknowledgment. 
We would like to thank Professor Gary Long for his assistance with the course, and the NSF for financial support through the Division of Undergraduate Education (DUE-9455382) and a CAREER award (CHE-9502460). Literature Cited. Laurillard, D. Rethinking Teaching, a Framework for the Effective Use of Educational Technology; Routledge: London, 1993. Tissue, B. M.; Earp, R. L.; Yip, C.-W. Chem. Educator 1996, 1(1), S1430-4171(96)01010-2. Only available at http://journals.springer-ny.com/chedr.

  15. Multimedia and physiology: a new way to ensure the quality of medical education and medical knowledge.

    PubMed

    Lessard, Yvon; Siregar, Pridi; Julen, Nathalie; Sinteff, Jean-Paul; Le Beux, Pierre

    2006-01-01

    Since the eighties and the advent of virtual campuses, the value of computers in distance education has been acknowledged. The development of information and communication technologies is increasingly distinguishing distance education from on-line education. The aim of the "Campus Numérique de Physiologie" is not to reproduce an on-line copy of classical textbooks but to put at students' and physicians' disposal the huge possibilities of multimedia resources for an active and easier understanding of complex physiopathological phenomena. The on-line course materials were created using both original IBC-made and registered trademark software tools. Multiscale modelling and corresponding knowledge bases were implemented by mathematicians, biologists and software engineers from Rennes. The website, which is accessible through a server of the French Virtual Medical University, was developed in HTML/PHP connected to a MySQL database. The content management system provides classical home-page facilities and a multicriteria browser. Interactive resources are freely available to the site's users. Two- and three-dimensional simulations born out of qualitative and quantitative mathematical models at the molecular, cellular or organ level keep students engaged with fundamental mechanisms by letting them interactively manipulate the simulation environment. The authors comment on the already available course materials, which should stimulate the creation of new documents following validation by a qualified commission of the "Société de Physiologie". By providing evaluation tests, the teachers anticipate that the increasing content of this virtual campus will allow users to gain a complete understanding and an integrative view of many physiopathological mechanisms.

  16. Online database for documenting clinical pathology resident education.

    PubMed

    Hoofnagle, Andrew N; Chou, David; Astion, Michael L

    2007-01-01

    Training of clinical pathologists is evolving and must now address the 6 core competencies described by the Accreditation Council for Graduate Medical Education (ACGME), which include patient care. A substantial portion of the patient care performed by the clinical pathology resident takes place while the resident is on call for the laboratory, a practice that provides the resident with clinical experience and assists the laboratory in providing quality service to clinicians in the hospital and surrounding community. Documenting the educational value of these on-call experiences and providing evidence of competence is difficult for residency directors. An online database of these calls, entered by residents and reviewed by faculty, would provide a mechanism for documenting and improving the education of clinical pathology residents. With Microsoft Access we developed an online database that uses active server pages and secure sockets layer encryption to document calls to the clinical pathology resident. Using the data collected, we evaluated the efficacy of 3 interventions aimed at improving resident education. The database facilitated the documentation of more than 4 700 calls in the first 21 months it was online, provided archived resident-generated data to assist in serving clients, and demonstrated that 2 interventions aimed at improving resident education were successful. We have developed a secure online database, accessible from any computer with Internet access, that can be used to easily document clinical pathology resident education and competency.

  17. Influence of Social Media on the Dissemination of a Traditional Surgical Research Article.

    PubMed

    Buckarma, EeeLN H; Thiels, Cornelius A; Gas, Becca L; Cabrera, Daniel; Bingener-Casey, Juliane; Farley, David R

    Many institutions use social media to share research with the general public. However, the influence of social media on the dissemination of a surgical research article itself is unknown. Our objective was to determine whether a blog post highlighting the findings of a surgical research article would lead to increased dissemination of the article itself. We prospectively followed the online page views of an article that was published online in Surgery in May 2015 and published in print in August 2015. The authors subsequently released a blog post in October 2015 to promote the research. The number of article page views from the journal's website was obtained before and after the blog post, along with the page views from the blog post itself. Social media influence data were collected, including social activity in the form of mentions on social media sites, scholarly activity in online libraries, and scholarly commentary. The article's online activity peaked in the first month after online publication (475 page views). Online activity plateaued by 4 months after publication, with 118 monthly page views, and a blog post was subsequently published. The blog post was viewed by 1566 readers, and readers spent a mean of 2.5 minutes on the page. When compared to the projected trend, the page views increased by 33% in the month after the blog post. The blog post resulted in a 9% increase in the social media influence score and a 5% absolute increase in total article page views. Social media is an important tool for sharing surgical research. Our data suggest that social media can increase distribution of an article's message and also potentially increase dissemination of the article itself. We believe that authors should consider using social media to increase the dissemination of traditionally published articles. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  18. Mixed Methods for Mixed Reality: Understanding Users' Avatar Activities in Virtual Worlds

    ERIC Educational Resources Information Center

    Feldon, David F.; Kafai, Yasmin B.

    2008-01-01

    This paper examines the use of mixed methods for analyzing users' avatar-related activities in a virtual world. Server logs recorded keystroke-level activity for 595 participants over a six-month period in Whyville.net, an informal science website. Participants also completed surveys and participated in interviews regarding their experiences.…

  19. Mild and Wild Weather.

    ERIC Educational Resources Information Center

    NatureScope, 1985

    1985-01-01

    Presents background information and six activities that focus on clouds, precipitation, and stormy weather. Each activity includes an objective, recommended age level(s), subject area(s), and instructional strategies. Also provided are two ready-to-copy pages (a coloring page on lightning and a list of weather riddles to solve). (JN)

  20. The USGODAE Monterey Data Server

    NASA Astrophysics Data System (ADS)

    Sharfstein, P.; Dimitriou, D.; Hankin, S.

    2005-12-01

    The USGODAE Monterey Data Server (http://www.usgodae.org/) has been established at the Fleet Numerical Meteorology and Oceanography Center (FNMOC) as an explicit U.S. contribution to GODAE. The server is operated with oversight and funding from the Office of Naval Research (ONR). Support of the GODAE Monterey Data Server is accomplished by a cooperative effort between FNMOC and NOAA's Pacific Marine Environmental Laboratory (PMEL) in the on-going development of the GODAE server and the support of a collaborative network of GODAE assimilation groups. This server hosts near real-time in-situ oceanographic data available from the Global Telecommunications System (GTS) and other FTP sites, atmospheric forcing fields suitable for driving ocean models, and unique GODAE data sets, including demonstration ocean model products. It supports GODAE participants, as well as the broader oceanographic research community, and is becoming a significant node in the international GODAE program. GODAE is envisioned as a global system of observations, communications, modeling and assimilation, which will deliver regular, comprehensive information on the state of the oceans in a way that will promote and engender wide utility and availability of this resource for maximum benefit to society. It aims to make ocean monitoring and prediction a routine activity in a manner similar to weather forecasting. GODAE will contribute to an information system for the global ocean that will serve interests from climate and climate change to ship routing and fisheries. The USGODAE Server is developed and operated as a prototypical node for this global information system. Presenting data with a consistent interface and ensuring its availability in the maximum number of standard formats is one of the primary challenges in hosting the many diverse formats and broad range of data used by the GODAE community. To this end, all USGODAE data sets are available in their original format via HTTP and FTP. In addition, USGODAE data are served using Local Data Manager (LDM), THREDDS cataloging, OPeNDAP, and GODAE Live Access Server (LAS) from PMEL. Every effort is made to serve USGODAE data through the standards specified by the National Virtual Ocean Data System (NVODS) and the Integrated Ocean Observing System Data Management and Communications (IOOS/DMAC) specifications. USGODAE serves FNMOC GRIB files from the Navy Operational Global Atmospheric Prediction System (NOGAPS) and the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) as OPeNDAP data sets using the GrADS Data Server (GDS). The server also provides several FNMOC custom IEEE binary format high resolution ocean analysis products and model outputs through GDS. These data sets are also made available through LAS. The Server functions as one of two Argo Global Data Assembly Centers (GDACs), hosting the complete collection of quality-controlled Argo temperature/salinity profiling float data. The Argo collection includes all available Delayed-Mode (scientific quality controlled and corrected) data. USGODAE Argo data are served through OPeNDAP and LAS, which provide complete integration of the Argo data set into NVODS and the IOOS/DMAC. By providing researchers flexible, easy access to data through standard Internet and oceanographic interfaces, the USGODAE Monterey Data Server has become an invaluable resource for oceanographic research. 
Also, by promoting the community data serving projects, USGODAE strengthens the community and helps to advance the data serving standards.

  1. eShopper modeling and simulation

    NASA Astrophysics Data System (ADS)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce provides an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as the Blue Martini server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross-selling are arriving. The key problem is what kind of information we need to achieve these goals, or in other words, how we model the customer. The paper is devoted to customer modeling and simulation, with a focus on modeling the individual customer. The model is based on the customer's transaction data, click-stream data, and demographics. It includes a hierarchical profile of the customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, and price sensitivity. This type of model is used for predicting the date of the next visit, overall spending, and spending on different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for on-line direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating customer features.
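
    One small piece of such a model, predicting the date of the next visit from past inter-purchase intervals, can be sketched as below. This is only an illustration under a simple constant-rate assumption, not the model described in the paper; the sample dates are invented.

```python
# Toy sketch of one element of such a customer model: predicting the date of
# the next visit from past inter-purchase intervals. The paper's model is far
# richer (preference hierarchies, consumption and affinity models); this only
# illustrates the forecasting idea under an assumed constant purchase rate.
from datetime import date, timedelta
from statistics import mean

purchase_dates = [date(2000, 11, 3), date(2000, 11, 17),
                  date(2000, 12, 1), date(2000, 12, 16)]   # invented history

intervals = [(b - a).days for a, b in zip(purchase_dates, purchase_dates[1:])]
mean_interval = mean(intervals)   # average days between purchases

expected_next_visit = purchase_dates[-1] + timedelta(days=round(mean_interval))
print("mean interval:", mean_interval, "days")
print("expected next visit:", expected_next_visit)
```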

  2. Sequence alignment visualization in HTML5 without Java.

    PubMed

    Gille, Christoph; Birgit, Weyand; Gille, Andreas

    2014-01-01

    Java has been used extensively for the visualization of biological data on the web. However, the Java runtime environment is an additional layer of software with its own set of technical problems and security risks. HTML in its new version 5 provides features that for some tasks may render Java unnecessary. Alignment-To-HTML is the first HTML-based interactive visualization for annotated multiple sequence alignments. The server-side script interpreter can perform all tasks, such as (i) sequence retrieval, (ii) alignment computation, (iii) rendering, (iv) identification of homologous structural models and (v) communication with BioDAS servers. The rendered alignment can be included in web pages and is displayed in all browsers on all platforms, including touch-screen tablets. The functionality of the user interface is similar to legacy Java applets and includes color schemes, highlighting of conserved and variable alignment positions, row reordering by drag and drop, interlinked 3D visualization and sequence groups. Novel features are (i) support for multiple overlapping residue annotations, such as chemical modifications, single nucleotide polymorphisms and mutations, (ii) mechanisms to quickly hide residue annotations, (iii) export to MS Word and (iv) sequence icons. Alignment-To-HTML, the first interactive alignment visualization that runs in web browsers without additional software, confirms that to some extent HTML5 is already sufficient to display complex biological data. The low speed at which programs are executed in browsers is still the main obstacle. Nevertheless, we envision an increased use of HTML and JavaScript for interactive biological software. Available under the GPL at: http://www.bioinformatics.org/strap/toHTML/.

  3. A low-cost wireless system for autonomous generation of road safety alerts

    NASA Astrophysics Data System (ADS)

    Banks, B.; Harms, T.; Sedigh Sarvestani, S.; Bastianini, F.

    2009-03-01

    This paper describes an autonomous wireless system that generates road safety alerts, in the form of SMS and email messages, and sends them to motorists subscribed to the service. Drivers who regularly traverse a particular route are the main beneficiaries of the proposed system, which is intended for sparsely populated rural areas, where information available to drivers about road safety, especially bridge conditions, is very limited. At the heart of this system is the SmartBrick, a wireless system for remote structural health monitoring that has been presented in our previous work. Sensors on the SmartBrick network regularly collect data on water level, temperature, strain, and other parameters important to safety of a bridge. This information is stored on the device, and reported to a remote server over the GSM cellular infrastructure. The system generates alerts indicating hazardous road conditions when the data exceeds thresholds that can be remotely changed. The remote server and any number of designated authorities can be notified by email, FTP, and SMS. Drivers can view road conditions and subscribe to SMS and/or email alerts through a web page. The subscription-only form of alert generation has been deliberately selected to mitigate privacy concerns. The proposed system can significantly increase the safety of travel through rural areas. Real-time availability of information to transportation authorities and law enforcement officials facilitates early or proactive reaction to road hazards. Direct notification of drivers further increases the utility of the system in increasing the safety of the traveling public.
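
    The alert-generation step described above is essentially threshold checking over incoming sensor readings. The Python sketch below illustrates that logic; the sensor names, threshold values and data structures are placeholders, and the actual SMS/email/FTP notification and remote threshold updates are omitted.

```python
# Minimal sketch of the threshold-based alert logic described above. Sensor
# names and limits are invented placeholders; the real system pushes readings
# over GSM and notifies subscribers by SMS, email and FTP.
from dataclasses import dataclass

@dataclass
class Threshold:
    high: float       # alert when a reading exceeds this value
    message: str

# Thresholds are remotely changeable in the real system; constants here.
thresholds = {
    "water_level_m": Threshold(3.5, "Water level above bridge safety limit"),
    "strain_microstrain": Threshold(800.0, "Structural strain above safety limit"),
}

def check_readings(readings: dict) -> list:
    """Return the alert messages triggered by a set of sensor readings."""
    alerts = []
    for sensor, value in readings.items():
        limit = thresholds.get(sensor)
        if limit and value > limit.high:
            alerts.append(f"{limit.message}: {sensor}={value}")
    return alerts

# A subscriber notification step (SMS/email) would consume this list.
print(check_readings({"water_level_m": 3.9, "strain_microstrain": 420.0}))
```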

  4. NASA World Wind: A New Mission

    NASA Astrophysics Data System (ADS)

    Hogan, P.; Gaskins, T.; Bailey, J. E.

    2008-12-01

    Virtual Globes are well into their first generation, providing increasingly rich and beautiful visualization of more types and quantities of information. However, they are still mostly single and proprietary programs, akin to a web browser whose content and functionality are controlled and constrained largely by the browser's manufacturer. Today Google and Microsoft determine what we can and cannot see and do in these programs. NASA World Wind started out in nearly the same mode, a single program with limited functionality and information content. But as the possibilities of virtual globes became more apparent, we found that while enabling a new class of information visualization, we were also getting in the way. Many users want to provide World Wind functionality and information in their programs, not ours. They want it in their web pages. They want to include their own features. They told us that only with this kind of flexibility, could their objectives and the potential of the technology be truly realized. World Wind therefore changed its mission: from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating one program, we create components to be used in any number of programs. World Wind is NASA open source software. With the source code being fully visible, anyone can readily use it and freely extend it to serve any use. Imagery and other information provided by the World Wind servers is also free and unencumbered, including the server technology to deliver geospatial data. World Wind developers can therefore provide exclusive and custom solutions based on user needs.

  5. Developing a Web-based system by integrating VGI and SDI for real estate management and marketing

    NASA Astrophysics Data System (ADS)

    Salajegheh, J.; Hakimpour, F.; Esmaeily, A.

    2014-10-01

    The importance of property in various respects, especially its impact on different sectors of the economy and on a country's macroeconomy, is clear. Because of the real, multi-dimensional and heterogeneous nature of housing as a commodity, the lack of an integrated system containing comprehensive property information, the limited awareness of some actors in this field of such information, and the absence of clear and comprehensive rules and regulations for trading and pricing, several problems arise for the people involved in this field. This research implements a crowd-sourced Web-based real estate support system. Creating a Spatial Data Infrastructure (SDI) within this system for collecting, updating and integrating all official data about property is also a goal of the study. The system uses a Web 2.0 broker and technologies such as Web services and service composition. This work aims to provide comprehensive and diverse information about property from different sources. For this purpose, a five-level real estate support system architecture is used. The PostgreSQL DBMS is used to implement the system. GeoServer is used as the map server and as a reference implementation of OGC (Open Geospatial Consortium) standards, and the Apache server is used to serve web pages and user interfaces. Integrating the introduced methods and technologies provides a suitable environment for various users to use the system and share their information. This goal is achieved only through cooperation among all organizations involved in real estate, with implementation of their required infrastructures as interoperable Web services.

  6. The Air Force Nuclear Engineering Center Structural Activation and Integrity Evaluation

    DTIC Science & Technology

    1990-03-01

    Front-matter excerpt (list of figures and tables): Figure 1, Inside Piqua Nuclear Power Facility containment building on top of the entombed reactor core; Figure 5, Predicted activity percentage of individual materials in the AFNEC; Figure 6, Predicted radioisotope activity percentage of total radioisotopic inventory within entombment at 20 years after shutdown; Table 1, ORIGEN2 …

  7. Teaching with Student Math Notes.

    ERIC Educational Resources Information Center

    National Council of Teachers of Mathematics, Inc., Reston, VA.

    Since 1982, the National Council of Teachers of Mathematics has published a student periodical five times a year. Each four-page issue focuses on a single theme, developing it from a simple opening-page activity through more challenging extensions at a higher level of understanding. In this document each four-page issue published from September,…

  8. [The future of telepathology. An Internet "distributed system" with "open standards"].

    PubMed

    Brauchli, K; Helfrich, M; Christen, H; Jundt, G; Haroske, G; Mihatsch, M; Oberli, H; Oberholzer, M

    2002-05-01

    With the availability of the Internet, interest in the possibilities of telepathology has increased considerably. In the foreground is the need of the non-expert to obtain expert opinions on morphological findings by means of a fast and simple procedure. The new telepathology system iPath meets these needs. The system is based on small modules that, where possible, work independently. This concept allows simple adaptation of the system to the individual environment of the user (e.g. different cameras, frame grabbers, motorized microscope stages, etc.) and to individual needs. iPath has been in use for 6 months by various working groups. In telepathology a distinction is made between "passive" and "active" consultations, but in both forms a non-expert obtains the opinion of an expert. In an active consultation both are in direct contact with each other (orally or via a chat function); this is not the case in a passive consultation. An active consultation can include an interactive discussion between the expert and the non-expert about images in an image database, or the direct interpretation of images from a microscope by the expert. Four software modules are available for free and rapid deployment: (1) the "Microscope control" module, (2) the "Connector" module (insertion of images directly from the microscope without a motorized microscope), (3) the "Client application" module accessed via the web browser and (4) the "Server" module with a database. The server is placed on the Internet and not behind a firewall. It permanently receives information from the periphery and returns that information to the periphery on request. The only thing the expert, the non-expert and the microscope have to know is how contact can be made with the server.

  9. "I didn't know her, but…": parasocial mourning of mediated deaths on Facebook RIP pages

    NASA Astrophysics Data System (ADS)

    Klastrup, Lisbeth

    2015-04-01

    This article examines the use of six Danish "Rest in Peace" (RIP) memorial pages. It focuses on the relation between news media and RIP page use, in relation to general communicative practices on these pages. Based on an analysis of press coverage of the deaths of six young people and a close analysis of 1,015 comments extracted from the RIP pages created to memorialize them, it is shown that their deaths attracted considerable media attention, as did the RIP pages themselves. Comment activity seems to reflect the news stories in the way the commenters refer to the context of death and the emotional distress they experience, but most comments on the RIP pages are conventional expressions of sympathy and "RIP" wishes. The article concludes that public RIP pages might be understood as virtual spontaneous shrines, affording an emerging practice of "RIP-ing."

  10. San Mateo County's Server Information Program (S.I.P.): A Community-Based Alcohol Server Training Program.

    ERIC Educational Resources Information Center

    de Miranda, John

    The field of alcohol server awareness and training has grown dramatically in the past several years and the idea of training servers to reduce alcohol problems has become a central fixture in the current alcohol policy debate. The San Mateo County, California Server Information Program (SIP) is a community-based prevention strategy designed to…

  11. Analysis of practical backoff protocols for contention resolution with multiple servers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, L.A.; MacKenzie, P.D.

    Backoff protocols are probably the most widely used protocols for contention resolution in multiple access channels. In this paper, we analyze the stochastic behavior of backoff protocols for contention resolution among a set of clients and servers, each server being a multiple access channel that handles contention like an Ethernet channel. We use the standard model in which each client generates requests for a given server according to a Bernoulli distribution with a specified mean. The client-server request rate of a system is the maximum over all client-server pairs (i, j) of the sum of all request rates associated with either client i or server j. Our main result is that any superlinear polynomial backoff protocol is stable for any multiple-server system with a sub-unit client-server request rate. We confirm the practical relevance of our result by demonstrating experimentally that the average waiting time of requests is very small when such a system is run with reasonably few clients and reasonably small request rates such as those that occur in actual Ethernets. Our result is the first proof of stability for any backoff protocol for contention resolution with multiple servers. It is also the first proof that any weakly acknowledgment-based protocol is stable for contention resolution with multiple servers and such high request rates. Two special cases of our result are of interest. Hastad, Leighton and Rogoff have shown that for a single-server system with a sub-unit client-server request rate, any modified superlinear polynomial backoff protocol is stable. These modified backoff protocols are similar to standard backoff protocols but require more random bits to implement. The special case of our result in which there is only one server extends the result of Hastad, Leighton and Rogoff to standard (practical) backoff protocols. Finally, our result applies to dynamic routing in optical networks.
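
    To make the protocol concrete, the toy Python simulation below runs a superlinear polynomial backoff on a single shared channel: after its i-th consecutive collision a client waits a uniformly random number of slots in [0, (i+1)^p) with p > 1. This is only an illustrative single-channel simulation under assumed parameters, not the paper's multiple-server analysis.

```python
# Toy simulation of polynomial backoff on one shared channel. After its i-th
# consecutive collision a client waits a random number of slots drawn from
# [0, (i+1)**power); power > 1 gives the superlinear polynomial backoff the
# abstract concerns. Parameters below are illustrative assumptions.
import random

def simulate(num_clients=8, request_prob=0.05, power=2.0, slots=20000, seed=1):
    random.seed(seed)
    backoff = [0] * num_clients      # consecutive collisions per client
    wait = [0] * num_clients         # slots left before the next attempt
    pending = [False] * num_clients  # does the client hold an unsent request?
    served = 0
    for _ in range(slots):
        # New requests arrive according to a Bernoulli process per client.
        for c in range(num_clients):
            if not pending[c] and random.random() < request_prob:
                pending[c] = True
        senders = [c for c in range(num_clients) if pending[c] and wait[c] == 0]
        for c in range(num_clients):
            if wait[c] > 0:
                wait[c] -= 1
        if len(senders) == 1:                      # lone sender: success
            pending[senders[0]] = False
            backoff[senders[0]] = 0
            served += 1
        elif len(senders) > 1:                     # collision: back off
            for c in senders:
                backoff[c] += 1
                wait[c] = random.randrange(int((backoff[c] + 1) ** power))
    return served / slots

print("throughput (requests served per slot):", simulate())
```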

  12. Pathogenicity in POLG syndromes: DNA polymerase gamma pathogenicity prediction server and database.

    PubMed

    Nurminen, Anssi; Farnum, Gregory A; Kaguni, Laurie S

    2017-06-01

    DNA polymerase gamma (POLG) is the replicative polymerase responsible for maintaining mitochondrial DNA (mtDNA). Disorders related to its function are a major cause of mitochondrial disease. The clinical spectrum of POLG syndromes includes Alpers-Huttenlocher syndrome (AHS), childhood myocerebrohepatopathy spectrum (MCHS), myoclonic epilepsy myopathy sensory ataxia (MEMSA), the ataxia neuropathy spectrum (ANS) and progressive external ophthalmoplegia (PEO). We have collected all publicly available POLG-related patient data and analyzed it using our pathogenic clustering model to provide a new research and clinical tool in the form of an online server. The server evaluates the pathogenicity of both previously reported and novel mutations. There are currently 176 unique point mutations reported in mitochondrial patients in POLG, the gene encoding the catalytic subunit of the enzyme. The mutations are distributed nearly uniformly along the length of the primary amino acid sequence of the gene. Our analysis shows that most of the mutations are recessive, and that the reported dominant mutations cluster within the polymerase active site in the tertiary structure of the POLG enzyme. The POLG Pathogenicity Prediction Server (http://polg.bmb.msu.edu) is targeted at clinicians and scientists studying POLG disorders, and aims to provide the most current available information regarding the pathogenicity of POLG mutations.

  13. ACFIS: a web server for fragment-based drug discovery

    PubMed Central

    Hao, Ge-Fei; Jiang, Wen; Ye, Yuan-Nong; Wu, Feng-Xu; Zhu, Xiao-Lei; Guo, Feng-Biao; Yang, Guang-Fu

    2016-01-01

    In order to foster innovation and improve the effectiveness of drug discovery, there is a considerable interest in exploring unknown ‘chemical space’ to identify new bioactive compounds with novel and diverse scaffolds. Hence, fragment-based drug discovery (FBDD) was developed rapidly due to its advanced expansive search for ‘chemical space’, which can lead to a higher hit rate and ligand efficiency (LE). However, computational screening of fragments is always hampered by the promiscuous binding model. In this study, we developed a new web server Auto Core Fragment in silico Screening (ACFIS). It includes three computational modules, PARA_GEN, CORE_GEN and CAND_GEN. ACFIS can generate core fragment structure from the active molecule using fragment deconstruction analysis and perform in silico screening by growing fragments to the junction of core fragment structure. An integrated energy calculation rapidly identifies which fragments fit the binding site of a protein. We constructed a simple interface to enable users to view top-ranking molecules in 2D and the binding mode in 3D for further experimental exploration. This makes the ACFIS a highly valuable tool for drug discovery. The ACFIS web server is free and open to all users at http://chemyang.ccnu.edu.cn/ccb/server/ACFIS/. PMID:27150808

  14. ACFIS: a web server for fragment-based drug discovery.

    PubMed

    Hao, Ge-Fei; Jiang, Wen; Ye, Yuan-Nong; Wu, Feng-Xu; Zhu, Xiao-Lei; Guo, Feng-Biao; Yang, Guang-Fu

    2016-07-08

    In order to foster innovation and improve the effectiveness of drug discovery, there is a considerable interest in exploring unknown 'chemical space' to identify new bioactive compounds with novel and diverse scaffolds. Hence, fragment-based drug discovery (FBDD) was developed rapidly due to its advanced expansive search for 'chemical space', which can lead to a higher hit rate and ligand efficiency (LE). However, computational screening of fragments is always hampered by the promiscuous binding model. In this study, we developed a new web server Auto Core Fragment in silico Screening (ACFIS). It includes three computational modules, PARA_GEN, CORE_GEN and CAND_GEN. ACFIS can generate core fragment structure from the active molecule using fragment deconstruction analysis and perform in silico screening by growing fragments to the junction of core fragment structure. An integrated energy calculation rapidly identifies which fragments fit the binding site of a protein. We constructed a simple interface to enable users to view top-ranking molecules in 2D and the binding mode in 3D for further experimental exploration. This makes the ACFIS a highly valuable tool for drug discovery. The ACFIS web server is free and open to all users at http://chemyang.ccnu.edu.cn/ccb/server/ACFIS/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. Naver: a PC-cluster-based VR system

    NASA Astrophysics Data System (ADS)

    Park, ChangHoon; Ko, HeeDong; Kim, TaiYun

    2003-04-01

    In this paper, we present NAVER, a new framework for virtual reality applications. NAVER is based on a cluster of low-cost personal computers. Its goal is to provide a flexible, extensible, scalable and re-configurable framework for virtual environments, defined as the integration of 3D virtual space with external modules. External modules are various input or output devices and applications on remote hosts. From the system's point of view, the personal computers are divided into three servers according to their specific functions: the Render Server, the Device Server and the Control Server. While the Device Server contains external modules requiring event-based communication for the integration, the Control Server contains external modules requiring synchronous communication every frame. The Render Server consists of five managers: the Scenario Manager, Event Manager, Command Manager, Interaction Manager and Sync Manager. These managers support the declaration and operation of the virtual environment and the integration with external modules on remote servers.

  16. Design details of Intelligent Instruments for PLC-free Cryogenic measurements, control and data acquisition

    NASA Astrophysics Data System (ADS)

    Antony, Joby; Mathuria, D. S.; Chaudhary, Anup; Datta, T. S.; Maity, T.

    2017-02-01

    Cryogenic networks for linear accelerator operations demand a large number of cryogenic sensors, associated instruments and other control instrumentation to measure, monitor and control different cryogenic parameters remotely. Here we describe an alternative approach: six types of newly designed, integrated, intelligent cryogenic instruments called device servers, each combining the complete circuitry for sensor-specific front-end analog instrumentation with a common digital back-end HTTP server, to form a crateless, PLC-free model of controls and data acquisition. These sensor-specific instruments, viz. the LHe server, LN2 server, control output server, pressure server, vacuum server and temperature server, are deployed entirely over the LAN for the cryogenic operations of the IUAC linac (Inter University Accelerator Centre linear accelerator), New Delhi. This indigenous design offers salient features such as global connectivity, low cost due to the crateless model, easy signal processing due to the integrated design, less cabling and device interconnectivity.

  17. Twin-tailed fail-over for fileservers maintaining full performance in the presence of a failure

    DOEpatents

    Coteus, Paul W.; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Steinmacher-Burow, Burkhard D.

    2008-02-12

    A method for maintaining full performance of a file system in the presence of a failure is provided. The file system has N storage devices, where N is an integer greater than zero, and N primary file servers, each operatively connected to a corresponding storage device for accessing files therein. The file system further has a secondary file server operatively connected to at least one of the N storage devices. The method includes: switching the connection of one of the N storage devices to the secondary file server upon a failure of one of the N primary file servers; and switching the connections of one or more of the remaining storage devices to a primary file server other than the failed file server as necessary, so as to prevent a loss in performance and to provide each storage device with an operating file server.
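
    The first claimed step, re-pointing the orphaned storage device to the secondary file server, can be sketched as a simple reassignment of a device-to-server mapping. The Python sketch below is a deliberately simplified illustration; the device and server names are invented, and the further rebalancing of the remaining devices described in the claim is not modeled.

```python
# Illustrative sketch of the first fail-over step described above: when one of
# the N primary file servers fails, its storage device is switched to the
# secondary server. Names and the mapping policy are invented; the claim's
# subsequent rebalancing of remaining devices is omitted.
def fail_over(assignment: dict, failed_server: str, secondary: str) -> dict:
    """assignment maps storage device -> serving file server."""
    new_assignment = dict(assignment)
    for device, server in assignment.items():
        if server == failed_server:
            new_assignment[device] = secondary   # take over the orphaned device
    return new_assignment

before = {"disk0": "fs0", "disk1": "fs1", "disk2": "fs2"}
print(fail_over(before, failed_server="fs1", secondary="fs_spare"))
```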

  18. Experimental parametric study of servers cooling management in data centers buildings

    NASA Astrophysics Data System (ADS)

    Nada, S. A.; Elfeky, K. E.; Attia, Ali M. A.; Alshaer, W. G.

    2017-06-01

    A parametric study of the air flow and cooling management of data center servers is conducted experimentally for different design conditions. A physical scale model of a data center accommodating one rack of four servers was designed and constructed for testing purposes. Front and rear rack and server temperature distributions and the supply/return heat indices (SHI/RHI) are used to evaluate data center thermal performance. Experiments were conducted to study parametrically the effects of the perforated-tile opening ratio, server power load variation and rack power density. The results showed that (1) a perforated tile with a 25% opening ratio provides the best results among the opening ratios tested, (2) the optimum benefit of cold air for server cooling is obtained with uniform power loading of the servers, and (3) increasing power density decreases air recirculation but increases air bypass and server temperatures. The present results are compared with previous experimental and CFD results, and fair agreement is found.

  19. Experience with Adaptive Security Policies.

    DTIC Science & Technology

    1998-03-01

    Table-of-contents excerpt: 3.1 Introduction; 3.2 Logical groupings of audited permission checks; 3.3 Auditing of system servers via microkernel snooping; 3.4 … Text excerpt: … performed by servers other than the microkernel. Since altering each server to audit events would complicate the integration of new servers, a modification to the microkernel was implemented to allow the microkernel to audit the requests made of other servers. Both methods for enhancing audit …

  20. Opportunities for the Mashup of Heterogeneous Data Server via Semantic Web Technology

    NASA Astrophysics Data System (ADS)

    Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; King, Todd; Hughes, John; Fung, Shing; Galkin, Ivan; Hapgood, Michael; Belehaki, Anna

    2015-04-01

    European Union ESPAS, Japanese IUGONET and GFZ ISDC data servers have been developed for the ingestion, archiving and distribution of geoscience and space science domain data. The main parts of the data managed by these servers are related to near-Earth space and geomagnetic field data. A smart mashup of the data servers would allow seamless browsing of and access to data and related context information. However, achieving a high level of interoperability is a challenge because the data servers are based on different data models and software frameworks. This paper focuses on the latest experiments and results for the mashup of the data servers using the semantic Web approach. Besides the mashup of domain and terminological ontologies, the options for connecting data managed by relational databases using D2R Server and SPARQL technology are addressed. A successful realization of the data server mashup will have a positive impact not only on the data users of the specific scientific domain but also on related projects, such as the development of a new interoperable version of NASA's Planetary Data System (PDS) or ICSU's World Data System alliance. ESPAS data server: https://www.espas-fp7.eu/portal/ IUGONET data server: http://search.iugonet.org/iugonet/ GFZ ISDC data server (semantic Web based prototype): http://rz-vm30.gfz-potsdam.de/drupal-7.9/ NASA PDS: http://pds.nasa.gov ICSU-WDS: https://www.icsu-wds.org
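
    Where one of the mashed-up servers is exposed through a D2R/SPARQL endpoint, a client can federate simple queries over HTTP. The sketch below only illustrates that pattern; the endpoint URL, vocabulary and property names are hypothetical, not those of ESPAS, IUGONET or the GFZ ISDC.

      import requests  # a SPARQL endpoint accepts the query as a standard 'query' parameter

      ENDPOINT = "http://example.org/d2r/sparql"  # hypothetical D2R Server endpoint

      QUERY = """
      PREFIX dct: <http://purl.org/dc/terms/>
      SELECT ?dataset ?title WHERE {
        ?dataset dct:title ?title .
      } LIMIT 10
      """

      def run_query(endpoint, query):
          # Ask for SPARQL JSON results; this media type is standard across endpoints.
          resp = requests.get(endpoint, params={"query": query},
                              headers={"Accept": "application/sparql-results+json"}, timeout=10)
          resp.raise_for_status()
          return resp.json()["results"]["bindings"]

      if __name__ == "__main__":
          for row in run_query(ENDPOINT, QUERY):
              print(row["dataset"]["value"], "-", row["title"]["value"])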

  1. Triple-server blind quantum computation using entanglement swapping

    NASA Astrophysics Data System (ADS)

    Li, Qin; Chan, Wai Hong; Wu, Chunhui; Wen, Zhonghua

    2014-04-01

    Blind quantum computation allows a client who does not have enough quantum resources or technologies to achieve quantum computation on a remote quantum server such that the client's input, output, and algorithm remain unknown to the server. Up to now, single- and double-server blind quantum computation have been considered. In this work, we propose a triple-server blind quantum computation protocol in which the client can delegate quantum computation to three quantum servers by the use of entanglement swapping. Furthermore, the three quantum servers can communicate with each other, and the client is almost classical, since it does not require any quantum computational power or quantum memory, nor the ability to prepare quantum states, and only needs access to quantum channels.

  2. Antihemorrhagin in the blood serum of king cobra (Ophiophagus hannah): purification and characterization.

    PubMed

    Chanhome, Lawan; Khow, Orawan; Omori-Satoh, Tamotsu; Sitprija, Visith

    2003-06-01

    King cobra (Ophiophagus hannah) serum was found to possess antihemorrhagic activity against king cobra hemorrhagin. The activity was stronger than that in commercial king cobra antivenom. An antihemorrhagin has been purified by ion exchange chromatography, affinity chromatography and gel filtration with a 22-fold purification and an overall yield of 12% of the total antihemorrhagic activity contained in crude serum. The purified antihemorrhagin was homogeneous in disc-PAGE and SDS-PAGE. Its apparent molecular weight determined by SDS-PAGE was 120 kDa. The antihemorrhagin was also active against other hemorrhagic snake venoms obtained in Thailand and Japan such as Calloselasma rhodostoma, Trimeresurus albolabris, Trimeresurus macrops and Trimeresurus flavoviridis (Japanese Habu). It inhibited the proteolytic activity of king cobra venom. It is an acid- and thermolabile protein and does not form precipitin lines against king cobra venom.

  3. 48 CFR 1523.7002 - Waivers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...

  4. 48 CFR 1523.7002 - Waivers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...

  5. 48 CFR 1523.7002 - Waivers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...

  6. Role of communication systems in coordinating supervising anesthesiologists' activities outside of operating rooms.

    PubMed

    Smallman, Bettina; Dexter, Franklin; Masursky, Danielle; Li, Fenghua; Gorji, Reza; George, Dave; Epstein, Richard H

    2013-04-01

    Theoretically, communication systems have the potential to increase the productivity of anesthesiologists supervising anesthesia providers. We evaluated the maximal potential of communication systems to increase the productivity of anesthesia care by enhancing anesthesiologists' coordination of care (activities) among operating rooms (ORs). At hospital A, data for 13,368 pages were obtained from files recorded in the internal alphanumeric text paging system. Pages from the postanesthesia care unit were processed through a numeric paging system and thus not included. At hospital B, in a different US state, 3 of the authors categorized each of 898 calls received using the internal wireless audio system (Vocera(®)). Lower and upper 95% confidence limits for percentages are the values reported. At least 45% of pages originated from outside the ORs (e.g., 20% from holding area) at hospital A and at least 56% of calls (e.g., 30% administrative) at hospital B. In contrast, requests from ORs for urgent presence of the anesthesiologist were at most 0.2% of pages at hospital A and 1.8% of calls at hospital B. Approximately half of messages to supervising anesthesiologists are for activity originating outside the ORs being supervised. To use communication tools to increase anesthesia productivity on the day of surgery, their use should include a focus on care coordination outside ORs (e.g., holding area) and among ORs (e.g., at the control desk).

  7. Expansion and Enhancement of the Wyoming Coalbed Methane Clearinghouse Website to the Wyoming Energy Resources Information Clearinghouse.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hulme, Diana; Hamerlinck, Jeffrey; Bergman, Harold

    Energy development is expanding across the United States, particularly in western states like Wyoming. Federal and state land management agencies, local governments, industry and non-governmental organizations have realized the need to access spatially-referenced data and other non-spatial information to determine the geographical extent and cumulative impacts of expanding energy development. The Wyoming Energy Resources Information Clearinghouse (WERIC) is a web-based portal which centralizes access to news, data, maps, reports and other information related to the development, management and conservation of Wyoming's diverse energy resources. WERIC was established in 2006 by the University of Wyoming's Ruckelshaus Institute of Environment and Natural Resources (ENR) and the Wyoming Geographic Information Science Center (WyGISC) with funding from the US Department of Energy (DOE) and the US Bureau of Land Management (BLM). The WERIC web portal originated in concept from a more specifically focused website, the Coalbed Methane (CBM) Clearinghouse. The CBM Clearinghouse effort focused only on coalbed methane production within the Powder River Basin of northeast Wyoming. The CBM Clearinghouse demonstrated a need to expand the effort statewide with a comprehensive energy focus, including fossil fuels and renewable and alternative energy resources produced and/or developed in Wyoming. WERIC serves spatial data to the greater Wyoming geospatial community through the Wyoming GeoLibrary, the WyGISC Data Server and the Wyoming Energy Map. These applications are critical components that support the Wyoming Energy Resources Information Clearinghouse (WERIC). The Wyoming GeoLibrary is a tool for searching and browsing a central repository for metadata. It provides the ability to publish and maintain metadata and geospatial data in a distributed environment. The WyGISC Data Server is an internet mapping application that provides traditional GIS mapping and analysis functionality via the web. It is linked into various state and federal agency spatial data servers allowing users to visualize multiple themes, such as well locations and core sage grouse areas, in one domain. Additionally, this application gives users the ability to download any of the data being displayed within the web map. The Wyoming Energy Map is the newest mapping application developed directly from this effort. With over 100 different layers accessible via this mapping application, it is the most comprehensive Wyoming energy mapping application available. This application also provides the public with the ability to create cultural and wildlife reports based on any location throughout Wyoming and at multiple scales. The WERIC website also allows users to access links to federal, state, and local natural resource agency websites and map servers; research documents about energy; and educational information, including information on upcoming energy-related conferences. The WERIC website has seen significant use by energy industry consultants, land management agencies, state and local decision-makers, non-governmental organizations and the public. Continued service to these sectors is desirable but some challenges remain in keeping the WERIC site viable. The most pressing issue is finding the human and financial resources to keep the site continually updated.
Initially, the concept included offering users the ability to maintain the site themselves; however, this has proven not to be a viable option since very few people contributed. Without user contributions, the web page relied on already committed university staff to publish and link to the appropriate documents and web pages. An option that is currently being explored to address this issue is development of a partnership with the University of Wyoming School of Energy Resources (SER). As part of their outreach program, SER may be able to contribute funding for a full-time position dedicated to maintenance of WERIC.

  8. GeoSciML and EarthResourceML Update, 2012

    NASA Astrophysics Data System (ADS)

    Richard, S. M.; Commission for the Management and Application of Geoscience Information (CGI) Interoperability Working Group

    2012-12-01

    CGI Interoperability Working Group activities during 2012 include deployment of services using the GeoSciML-Portrayal schema, addition of new vocabularies to support properties added in version 3.0, improvements to server software for deploying services, introduction of EarthResourceML v.2 for mineral resources, and collaboration with the IUSS on a markup language for soils information. GeoSciML and EarthResourceML have been used as the basis for the INSPIRE Geology and Mineral Resources specifications respectively. GeoSciML-Portrayal is an OGC GML simple-feature application schema for presentation of geologic map unit, contact, and shear displacement structure (fault and ductile shear zone) descriptions in web map services. Use of standard vocabularies for geologic age and lithology enables map services using shared legends to achieve visual harmonization of maps provided by different services. New vocabularies have been added to the collection of CGI vocabularies provided to support interoperable GeoSciML services, and can be accessed through http://resource.geosciml.org. Concept URIs can be dereferenced to obtain SKOS rdf or html representations using the SISSVoc vocabulary service. New releases of the FOSS GeoServer application greatly improve support for complex XML feature schemas like GeoSciML, and the ArcGIS for INSPIRE extension implements similar complex feature support for ArcGIS Server. These improved server implementations greatly facilitate deploying GeoSciML services. EarthResourceML v2 adds features for information related to mining activities. SoilML provides an interchange format for soil material, soil profile, and terrain information. Work is underway to add GeoSciML to the portfolio of Open Geospatial Consortium (OGC) specifications.

  9. minepath.org: a free interactive pathway analysis web server.

    PubMed

    Koumakis, Lefteris; Roussos, Panos; Potamias, George

    2017-07-03

    MinePath ( www.minepath.org ) is a web-based platform that elaborates on, and radically extends, the identification of differentially expressed sub-paths in molecular pathways. Besides the network topology, the underlying MinePath algorithmic processes exploit exact gene-gene molecular relationships (e.g. activation, inhibition) and are able to identify differentially expressed pathway parts. Each pathway is decomposed into all its constituent sub-paths, which in turn are matched with corresponding gene expression profiles. The highly ranked, phenotype-inclined sub-paths are kept. Apart from the pathway analysis algorithm, the fundamental innovation of the MinePath web server concerns its advanced visualization and interactive capabilities. To our knowledge, this is the first pathway analysis server that introduces and offers visualization of the underlying and active pathway regulatory mechanisms instead of genes. Other features include live interaction, immediate visualization of functional sub-paths per phenotype and dynamic linked annotations for the engaged genes and molecular relations. The user can download not only the results but also the corresponding web viewer framework of the performed analysis. This feature provides the flexibility to immediately publish results without publishing source/expression data, and to get all the functionality of a web-based pathway analysis viewer. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  10. SmartSearch steganalysis

    NASA Astrophysics Data System (ADS)

    Bloom, Jeffrey A.; Alonso, Rafael

    2003-06-01

    There are two primary challenges to monitoring the Web for steganographic media: finding suspect media and examining those found. The challenge that has received a great deal of attention is the second of these, the steganalysis problem. The other challenge, and one that has received much less attention, is the search problem. How does the steganalyzer get the suspect media in the first place? This paper describes an innovative method and architecture to address this search problem. The typical approaches to searching the web for covert communications are often based on the concept of "crawling" the Web via a smart "spider." Such spiders find new pages by following ever-expanding chains of links from one page to many next pages. Rather than seek pages by chasing links from other pages, we find candidate pages by identifying requests to access pages. To do this we monitor traffic on Internet backbones, identify and log HTTP requests, and use this information to guide our process. Our approach has the advantages that we examine pages to which no links exist, we examine pages as soon as they are requested, and we concentrate resources only on active pages, rather than examining pages that are never viewed.
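
    The approach above discovers candidate pages from observed requests rather than from link chains. As a rough, purely illustrative analogue of that idea, the sketch below extracts requested URLs from an HTTP request log and ranks the most actively requested pages for later steganalysis; the log format and file name are assumptions, not the paper's architecture.

      from collections import Counter
      import re

      # Hypothetical log of observed HTTP GET requests, one per line, e.g.:
      # 2003-05-01T12:00:03 GET http://example.org/images/photo1.jpg
      REQUEST_RE = re.compile(r"GET\s+(\S+)")

      def active_pages(log_path, top_n=20):
          """Rank URLs by how often they are requested: most actively viewed pages first."""
          counts = Counter()
          with open(log_path) as log:
              for line in log:
                  match = REQUEST_RE.search(line)
                  if match:
                      counts[match.group(1)] += 1
          return counts.most_common(top_n)

      if __name__ == "__main__":
          for url, hits in active_pages("http_requests.log"):
              print(hits, url)  # candidates handed to the steganalysis stage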

  11. 78 FR 22021 - Agency Information Collection Activities: Requests for Comments; Clearance of Renewed Approval of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-12

    ..., page 71473. FAA Form 7480-1 (Notice of Landing Area Proposal) is used to collect information about any... Activities: Requests for Comments; Clearance of Renewed Approval of Information Collection: Notice of Landing... information collection. The Federal Register Notice with a 60-day comment period soliciting [[Page 22022...

  12. 78 FR 69875 - Agency Information Collection Activities; Revision of a Previously Approved Collection, With...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-21

    ... DEPARTMENT OF JUSTICE Agency Information Collection Activities; Revision of a Previously Approved Collection, With Change; Comments Requested: COPS Progress Report Correction In notice document 2013-25701, appearing on page 64979 in the issue of Wednesday, October 30, 2013, make the following correction: On page 64979, in the second column,...

  13. A Powerful, Cost Effective, Web Based Engineering Solution Supporting Conjunction Detection and Visual Analysis

    NASA Astrophysics Data System (ADS)

    Novak, Daniel M.; Biamonti, Davide; Gross, Jeremy; Milnes, Martin

    2013-08-01

    An innovative and visually appealing tool is presented for efficient all-vs-all conjunction analysis on a large catalogue of objects. The conjunction detection uses a nearest-neighbour search algorithm, based on spatial binning and identification of pairs of objects in adjacent bins. This results in the fastest all-vs-all filtering the authors are aware of. The tool is constructed on a server-client architecture, where the server broadcasts the conjunction data and ephemerides to the client, while the client supports the user interface through a modern browser, without plug-ins. In order to make the tool flexible and maintainable, Java software technologies were used on the server side, including Spring, Camel, ActiveMQ and CometD. The user interface and visualisation are based on the latest web technologies: HTML5, WebGL and THREE.js. Importance has been given to the ergonomics and visual appeal of the software; in fact, certain design concepts have been borrowed from the gaming industry.
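
    The filter described above bins objects spatially and only compares pairs in the same or adjacent bins, avoiding the O(N^2) all-vs-all distance test. The sketch below shows that binning idea in a simplified form; the cell size, threshold and data layout are illustrative assumptions, not the tool's actual parameters.

      from collections import defaultdict
      from itertools import product
      import math

      def close_pairs(positions, threshold):
          """Return index pairs closer than `threshold`, using a uniform grid with cell
          size `threshold` so only objects in the same or neighbouring cells are compared."""
          cell = threshold
          grid = defaultdict(list)
          for idx, (x, y, z) in enumerate(positions):
              grid[(int(x // cell), int(y // cell), int(z // cell))].append(idx)

          pairs = set()
          for (cx, cy, cz), members in grid.items():
              # Gather candidates from this cell and its 26 neighbours.
              candidates = []
              for dx, dy, dz in product((-1, 0, 1), repeat=3):
                  candidates.extend(grid.get((cx + dx, cy + dy, cz + dz), []))
              for i in members:
                  for j in candidates:
                      if i < j and math.dist(positions[i], positions[j]) < threshold:
                          pairs.add((i, j))
          return pairs

      if __name__ == "__main__":
          sample = [(0.0, 0.0, 0.0), (0.5, 0.2, 0.1), (10.0, 10.0, 10.0)]
          print(close_pairs(sample, threshold=1.0))  # -> {(0, 1)}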

  14. Using the Textpresso Site-Specific Recombinases Web server to identify Cre expressing mouse strains and floxed alleles.

    PubMed

    Condie, Brian G; Urbanski, William M

    2014-01-01

    Effective tools for searching the biomedical literature are essential for identifying reagents or mouse strains as well as for effective experimental design and informed interpretation of experimental results. We have built the Textpresso Site Specific Recombinases (Textpresso SSR) Web server to enable researchers who use mice to perform in-depth searches of a rapidly growing and complex part of the mouse literature. Our Textpresso Web server provides an interface for searching the full text of most of the peer-reviewed publications that report the characterization or use of mouse strains that express Cre or Flp recombinase. The database also contains most of the publications that describe the characterization or analysis of strains carrying conditional alleles or transgenes that can be inactivated or activated by site-specific recombinases such as Cre or Flp. Textpresso SSR complements the existing online databases that catalog Cre and Flp expression patterns by providing a unique online interface for the in-depth text mining of the site specific recombinase literature.

  15. The web server of IBM's Bioinformatics and Pattern Discovery group: 2004 update

    PubMed Central

    Huynh, Tien; Rigoutsos, Isidore

    2004-01-01

    In this report, we provide an update on the services and content which are available on the web server of IBM's Bioinformatics and Pattern Discovery group. The server, which is operational around the clock, provides access to a large number of methods that have been developed and published by the group's members. There is an increasing number of problems that these tools can help tackle; these problems range from the discovery of patterns in streams of events and the computation of multiple sequence alignments, to the discovery of genes in nucleic acid sequences, the identification—directly from sequence—of structural deviations from α-helicity and the annotation of amino acid sequences for antimicrobial activity. Additionally, annotations for more than 130 archaeal, bacterial, eukaryotic and viral genomes are now available on-line and can be searched interactively. The tools and code bundles continue to be accessible from http://cbcsrv.watson.ibm.com/Tspd.html whereas the genomics annotations are available at http://cbcsrv.watson.ibm.com/Annotations/. PMID:15215340

  16. The web server of IBM's Bioinformatics and Pattern Discovery group: 2004 update.

    PubMed

    Huynh, Tien; Rigoutsos, Isidore

    2004-07-01

    In this report, we provide an update on the services and content which are available on the web server of IBM's Bioinformatics and Pattern Discovery group. The server, which is operational around the clock, provides access to a large number of methods that have been developed and published by the group's members. There is an increasing number of problems that these tools can help tackle; these problems range from the discovery of patterns in streams of events and the computation of multiple sequence alignments, to the discovery of genes in nucleic acid sequences, the identification--directly from sequence--of structural deviations from alpha-helicity and the annotation of amino acid sequences for antimicrobial activity. Additionally, annotations for more than 130 archaeal, bacterial, eukaryotic and viral genomes are now available on-line and can be searched interactively. The tools and code bundles continue to be accessible from http://cbcsrv.watson.ibm.com/Tspd.html whereas the genomics annotations are available at http://cbcsrv.watson.ibm.com/Annotations/.

  17. GSCALite: A Web Server for Gene Set Cancer Analysis.

    PubMed

    Liu, Chun-Jie; Hu, Fei-Fei; Xia, Mengxuan; Han, Leng; Zhang, Qiong; Guo, An-Yuan

    2018-05-22

    The availability of cancer genomic data makes it possible to analyze genes related to cancer. Cancer is usually the result of a set of genes, and the signal of a single gene can be covered by background noise. Here, we present a web server named Gene Set Cancer Analysis (GSCALite) to analyze a set of genes in cancers with the following functional modules: (i) differential expression in tumor vs normal, and the survival analysis; (ii) genomic variations and their survival analysis; (iii) gene expression associated cancer pathway activity; (iv) miRNA regulatory network for genes; (v) drug sensitivity for genes; (vi) normal tissue expression and eQTL for genes. GSCALite is a user-friendly web server for dynamic analysis and visualization of gene sets in cancer and drug sensitivity correlation, which will be of broad utility to cancer researchers. GSCALite is available at http://bioinfo.life.hust.edu.cn/web/GSCALite/. Contact: guoay@hust.edu.cn or zhangqiong@hust.edu.cn. Supplementary data are available at Bioinformatics online.

  18. Distributed Operations Planning

    NASA Technical Reports Server (NTRS)

    Fox, Jason; Norris, Jeffrey; Powell, Mark; Rabe, Kenneth; Shams, Khawaja

    2007-01-01

    Maestro software provides a secure and distributed mission planning system for long-term missions in general, and the Mars Exploration Rover Mission (MER) specifically. Maestro, the successor to the Science Activity Planner, has a heavy emphasis on portability and distributed operations, and requires no data replication or expensive hardware, instead relying on a set of services functioning on JPL institutional servers. Maestro works on most current computers with network connections, including laptops. When browsing down-link data from a spacecraft, Maestro functions similarly to being on a Web browser. After authenticating the user, it connects to a database server to query an index of data products. It then contacts a Web server to download and display the actual data products. The software also includes collaboration support based upon a highly reliable messaging system. Modifications made to targets in one instance are quickly and securely transmitted to other instances of Maestro. The back end that has been developed for Maestro could benefit many future missions by reducing the cost of centralized operations system architecture.

  19. Spectral atlases of the Sun from 3980 to 7100 Å at the center and at the limb

    NASA Astrophysics Data System (ADS)

    Fathivavsari, H.; Ajabshirizadeh, A.; Koutchmy, S.

    2014-10-01

    In this work, we present digital and graphical atlases of spectra of both the solar disk center and the limb near the solar poles, using data taken at the UTS-IAP & RIAAM (the University of Tabriz siderostat, telescope and spectrograph jointly developed with the Institut d'Astrophysique de Paris and the Research Institute for Astronomy and Astrophysics of Maragha). High-resolution, high signal-to-noise ratio (SNR) CCD slit spectra of the Sun for two different parts of the disk, namely μ=1.0 (solar center) and μ=0.3 (solar limb), are provided and discussed. While there are several spectral atlases of the solar disk center, this is the first spectral atlas produced for the solar limb over this spectral range. The resolution of the spectra is about R ~ 70 000 (Δλ ~ 0.09 Å) with an SNR of 400-600. The full atlas covers the 3980 to 7100 Å spectral region and contains 44 pages, with three partial spectra of the solar spectrum placed on each page to make it compact. The difference spectrum of the normalized solar disk center and solar limb is also included in the graphic presentation of the atlas to show the difference in line profiles, including the far wings. The identification of the most significant solar lines is included in the graphic presentation of the atlas. Telluric lines produce a definite signature on the difference spectra that is easy to notice. At the end of this paper we present only two sample pages of the whole atlas, while the graphic presentation of the whole atlas along with its ASCII file can be accessed via the ftp server of the CDS in Strasbourg by anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via this link: http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/other/ApSS.

  20. An automatically updateable web publishing solution: taking document sharing and conversion to enterprise level

    NASA Astrophysics Data System (ADS)

    Rahman, Fuad; Tarnikova, Yuliya; Hartono, Rachmat; Alam, Hassan

    2006-01-01

    This paper presents a novel automatic web publishing solution, PageView®. PageView® is a complete working solution for document processing and management. The principal aim of this tool is to allow workgroups to share, access and publish documents on-line on a regular basis. For example, assume that a person is working on some documents. The user will, in some fashion, organize the work either in a local directory or on a shared network drive. Now extend that concept to a workgroup. Within a workgroup, several users are working together on some documents and saving them in a directory structure somewhere on a document repository. The next stage of this reasoning is that the workgroup wants to publish those documents routinely on-line. The members may be using different editing tools, different software, and different graphics tools, so the resultant documents may be in PDF, Microsoft Office®, HTML, or WordPerfect format, to name a few. In general, this process requires the documents to be converted to HTML, after which a web designer must work on that collection to make it available on-line. PageView® takes care of this whole process automatically, making the document workflow clean and easy to follow. PageView® Server publishes documents, complete with the directory structure, for online use. The documents are automatically converted to HTML and PDF so that users can view the content without downloading the original files or browser plug-ins. Once published, other users can access the documents as if they were accessing them from their local folders. The paper describes the complete working system and discusses possible applications within document management research.

  1. An efficient biometric and password-based remote user authentication using smart card for Telecare Medical Information Systems in multi-server environment.

    PubMed

    Maitra, Tanmoy; Giri, Debasis

    2014-12-01

    The medical organizations have introduced the Telecare Medical Information System (TMIS) to provide a reliable facility by which a patient who is unable to visit a doctor during a critical or urgent period can communicate with a doctor through a medical server via the internet from home. An authentication mechanism is needed in TMIS to protect the secret information of both parties, namely the server and the patient. Recent research includes the patient's biometric information as well as a password in the design of remote user authentication schemes to enhance the security level. In a single-server environment, one server is responsible for providing services to all authorized remote patients. However, a problem arises if a patient wishes to access several branch servers: he/she needs to register with each branch server individually. In 2014, Chuang and Chen proposed a remote user authentication scheme for the multi-server environment. In this paper, we show that in their scheme a non-registered adversary can successfully log in to the system as a valid patient. To resist these weaknesses, we propose an authentication scheme for TMIS in the multi-server environment in which patients register only once with a root telecare server, called the registration center (RC), to obtain services from all telecare branch servers through their registered smart card. Security analysis and comparison show that our proposed scheme provides better security with low computational and communication cost.

  2. Land Use and Land Cover Maps of Europe: A WebGIS Platform

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Fahl, F. C.; Minghini, M.; Molinari, M. E.

    2016-06-01

    This paper presents the methods and implementation processes of a WebGIS platform designed to publish the available land use and land cover maps of Europe at continental scale. The system is built completely on open source infrastructure and open standards. The proposed architecture is based on a server-client model having GeoServer as the map server, Leaflet as the client-side mapping library and the Bootstrap framework at the core of the front-end user interface. The web user interface is designed to have typical features of a desktop GIS (e.g. activate/deactivate layers and order layers by drag and drop actions) and to show specific information on the activated layers (e.g. legend and simplified metadata). Users have the possibility to change the base map from a given list of map providers (e.g. OpenStreetMap and Microsoft Bing) and to control the opacity of each layer to facilitate the comparison with both other land cover layers and the underlying base map. In addition, users can add to the platform any custom layer available through a Web Map Service (WMS) and activate the visualization of photos from popular photo sharing services. This last functionality is provided in order to have a visual assessment of the available land coverages based on other user-generated contents available on the Internet. It is supposed to be a first step towards a calibration/validation service that will be made available in the future.
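
    Because the platform publishes its layers through GeoServer and standard OGC services, any WMS client can retrieve a rendered map image directly. The request below is only a generic WMS 1.3.0 GetMap example; the server URL, layer name and bounding box are placeholders, not the project's actual endpoint or data.

      import requests  # a WMS GetMap call is just an HTTP GET with standard query parameters

      WMS_URL = "https://example.org/geoserver/wms"  # placeholder endpoint

      params = {
          "service": "WMS", "version": "1.3.0", "request": "GetMap",
          "layers": "landcover:corine_2012",        # hypothetical layer name
          "crs": "EPSG:4326",
          "bbox": "35,-10,70,40",                   # minLat,minLon,maxLat,maxLon (EPSG:4326 axis order in WMS 1.3.0)
          "width": 1024, "height": 768,
          "format": "image/png", "transparent": "true",
      }

      response = requests.get(WMS_URL, params=params, timeout=30)
      response.raise_for_status()
      with open("europe_landcover.png", "wb") as out:
          out.write(response.content)  # the image can then be overlaid on a base map in the browser client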

  3. Breastfeeding: Planning Ahead

    MedlinePlus Videos and Cool Tools


  4. Child Sexual Abuse

    MedlinePlus


  5. An Optimization of the Basic School Military Occupational Skill Assignment Process

    DTIC Science & Technology

    2003-06-01

    Corps Intranet (NMCI)23 supports it. We evaluated the use of Microsoft’s SQL Server, but dismissed this after learning that TBS did not possess a SQL ...Server license or a qualified SQL Server administrator.24 SQL Server would have provided for additional security measures not available in MS...administrator. Although not as powerful as SQL Server, MS Access can handle the multi-user environment necessary for this system.25 The training

  6. General bulk service queueing system with N-policy, multiplevacations, setup time and server breakdown without interruption

    NASA Astrophysics Data System (ADS)

    Sasikala, S.; Indhira, K.; Chandrasekaran, V. M.

    2017-11-01

    In this paper, we consider an M^X/(a,b)/1 queueing system with server breakdown without interruption, multiple vacations, setup times and N-policy. After serving a batch, if the size of the queue is ξ (< a), then the server immediately takes a vacation. Upon returning from a vacation, if the queue length is less than N, the server takes another vacation. This process continues until the server finds at least N customers in the queue. After a vacation, if the server finds at least N customers waiting for service, then the server needs a setup time to start the service. After serving a batch, if the number of waiting customers in the queue is ξ (≥ a), then the server serves a batch of min(ξ, b) customers, where b ≥ a. We derive the probability generating function of the queue length at an arbitrary time epoch. Further, we obtain some important performance measures.
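
    The service discipline can be summarized as a small decision rule: vacations continue until at least N customers have accumulated, a setup then precedes service, and each batch takes min(ξ, b) customers provided ξ ≥ a. The function below restates only that rule in code; it is an illustrative paraphrase, not a simulation of the paper's model.

      def next_server_action(queue_length, a, b, N):
          """Decide the server's next step from the queue length (illustrative paraphrase only).

          a, b : minimum and maximum batch sizes (a <= b)
          N    : threshold of the N-policy (here assumed N >= a)
          """
          if queue_length < N:
              return ("vacation", 0)              # keep taking vacations until N customers accumulate
          if queue_length < a:
              return ("vacation", 0)              # cannot form a batch yet (only reachable if N < a)
          batch = min(queue_length, b)            # serve min(queue, b) customers, with batch >= a
          return ("setup_then_serve", batch)

      if __name__ == "__main__":
          for q in (2, 5, 9, 14):
              print(q, next_server_action(q, a=3, b=8, N=6))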

  7. Secure entanglement distillation for double-server blind quantum computation.

    PubMed

    Morimae, Tomoyuki; Fujii, Keisuke

    2013-07-12

    Blind quantum computation is a new secure quantum computing protocol where a client, who does not have enough quantum technologies at her disposal, can delegate her quantum computation to a server, who has a fully fledged quantum computer, in such a way that the server cannot learn anything about the client's input, output, and program. If the client interacts with only a single server, the client has to have some minimum quantum power, such as the ability of emitting randomly rotated single-qubit states or the ability of measuring states. If the client interacts with two servers who share Bell pairs but cannot communicate with each other, the client can be completely classical. For such a double-server scheme, two servers have to share clean Bell pairs, and therefore the entanglement distillation is necessary in a realistic noisy environment. In this Letter, we show that it is possible to perform entanglement distillation in the double-server scheme without degrading the security of blind quantum computing.

  8. Space Weather Activities at SERC for IHY: (1) Local Education, (2) Global Outreach and (3) Data Base Service (P61)

    NASA Astrophysics Data System (ADS)

    Yumoto, K.; Magdas/Cpmn Group

    2006-11-01

    The Space Environment Research Center (SERC), Kyushu University (KU), conducts daily space weather "nowcasting". There are two main goals in this effort: (1) to train and educate KU students about the complexities of the Sun-Earth system so that they can become space weather forecasters in the future, and (2) to globally disseminate space weather information from SERC as a service to the scientific community and the general public. In order to understand the complexities of the Sun-Earth system, KU students analyze the data of four regions: (1) the solar surface, (2) the solar wind, (3) geospace, and (4) the Earth's surface. Using real-time public data from SOHO Real Time Movies, Solar Monitor, NASA/GSFC/SDAC, and SEC's Anonymous FTP Server, they check each day the sunspot number and the locations of active regions and coronal holes, and identify solar flare events (GOES X-Ray Flux), CMEs (SOHO/LASCO C2, C3), and proton events (GOES Proton Flux). By analyzing ACE Real Time Data, KU students examine the solar wind (speed, density, temperature) and the interplanetary magnetic field (IMF: Bt, Bz, Phi), and identify sector boundary, CIR, CME, and shock/discontinuity events. To understand magnetic conditions in geospace and on the Earth's surface, KU students analyze storms and substorms using the Dst index (Kyoto Univ.), the Kp index (NOAA), and the Magnetic Pulsation Index (Pc 3, 4, and 5: SERC). Every morning KU students create a space weather report and then discuss it with the staff at SERC for local training and education. The report and its details are disseminated on the SERC Home Page (http://www.serc.kyushu-u.ac.jp) to provide "global outreach" for space weather information. MAGDAS (Magnetic Data Acquisition System) data are obtained from the Circum-pan Pacific Magnetometer Network (CPMN) locations during the IHY period (2007-2008). MAGDAS magnetometers are installed at 50 stations along the 210° magnetic meridian and the magnetic dip equator, which includes East Asia, the Pacific Islands, South America and Africa. MAGDAS data are made up of three components of the magnetic variations (H, D, Z) with 1-second and 1-minute sampling rates. After data correction at SERC, authorized MAGDAS collaborators can access the SERC server, in which the corrected data are stored, and get 1-min and 1-sec digital data. The MAGDAS data can be provided for all scientific purposes through the Internet. SERC will offer the MAGDAS database to the scientific community for collaborative work.

  9. SciServer Compute brings Analysis to Big Data in the Cloud

    NASA Astrophysics Data System (ADS)

    Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara

    2016-06-01

    SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment. SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines. Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally - but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data. SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer's new Single Sign-on Portal. We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts accessing a range of datasets and showing the data flow between storage and compute components. Demos, documentation, and more information can be found at www.sciserver.org. SciServer is funded by the National Science Foundation Award ACI-1261715.
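
    The "bring the analysis to the data" pattern amounts to running the query where the archive lives and returning only the (small) result to the notebook. The sketch below uses an in-memory SQLite table purely as a stand-in for the co-located database; it is not the SciServer client API, just an illustration of the workflow inside a server-side notebook.

      import sqlite3
      import pandas as pd

      # In-memory SQLite plays the role of the big archive sitting next to the notebook.
      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          CREATE TABLE photoobj (objid INTEGER, ra REAL, dec REAL, r_mag REAL);
          INSERT INTO photoobj VALUES (1, 150.1, 2.2, 17.3), (2, 150.4, 2.1, 21.8), (3, 151.0, 2.5, 19.0);
      """)

      # The query executes where the data lives; only the filtered result reaches the notebook.
      bright = pd.read_sql_query("SELECT objid, ra, dec, r_mag FROM photoobj WHERE r_mag < 20", conn)
      print(bright.describe())   # downstream analysis, plotting, ML, etc. happens in the notebook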

  10. Providing Internet Access to High-Resolution Lunar Images

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2008-01-01

    The OnMoon server is a computer program that provides Internet access to high-resolution Lunar images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of the Moon. The OnMoon server implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) server protocol and supports Moon-specific extensions. Unlike other Internet map servers that provide Lunar data using an Earth coordinate system, the OnMoon server supports encoding of data in Moon-specific coordinate systems. The OnMoon server offers access to most of the available high-resolution Lunar image and elevation data. This server can generate image and map files in the tagged image file format (TIFF) or the Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. Full-precision spectral arithmetic processing is also available, by use of a custom SLD extension. This server can dynamically add shaded relief based on the Lunar elevation to any image layer. This server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.

  11. Update on Rover Sequencing and Visualization Program

    NASA Technical Reports Server (NTRS)

    Cooper, Brian; Hartman, Frank; Maxwell, Scott; Yen, Jeng; Wright, John; Balacuit, Carlos

    2005-01-01

    The Rover Sequencing and Visualization Program (RSVP) has been updated. RSVP was reported in Rover Sequencing and Visualization Program (NPO-30845), NASA Tech Briefs, Vol. 29, No. 4 (April 2005), page 38. To recapitulate: The Rover Sequencing and Visualization Program (RSVP) is the software tool to be used in the Mars Exploration Rover (MER) mission for planning rover operations and generating command sequences for accomplishing those operations. RSVP combines three-dimensional (3D) visualization for immersive exploration of the operations area, stereoscopic image display for high-resolution examination of the downlinked imagery, and a sophisticated command-sequence editing tool for analysis and completion of the sequences. RSVP is linked with actual flight code modules for operations rehearsal to provide feedback on the expected behavior of the rover prior to committing to a particular sequence. Playback tools allow for review of both rehearsed rover behavior and downlinked results of actual rover operations. These can be displayed simultaneously for comparison of rehearsed and actual activities for verification. The primary inputs to RSVP are downlink data products from the Operations Storage Server (OSS) and activity plans generated by the science team. The activity plans are high-level goals for the next day s activities. The downlink data products include imagery, terrain models, and telemetered engineering data on rover activities and state. The Rover Sequence Editor (RoSE) component of RSVP performs activity expansion to command sequences, command creation and editing with setting of command parameters, and viewing and management of rover resources. The HyperDrive component of RSVP performs 2D and 3D visualization of the rover s environment, graphical and animated review of rover predicted and telemetered state, and creation and editing of command sequences related to mobility and Instrument Deployment Device (robotic arm) operations. Additionally, RoSE and HyperDrive together evaluate command sequences for potential violations of flight and safety rules. The products of RSVP include command sequences for uplink that are stored in the Distributed Object Manager (DOM) and predicted rover state histories stored in the OSS for comparison and validation of downlinked telemetry. The majority of components comprising RSVP utilize the MER command and activity dictionaries to automatically customize the system for MER activities.

  12. Wide Area Information Servers: An Executive Information System for Unstructured Files.

    ERIC Educational Resources Information Center

    Kahle, Brewster; And Others

    1992-01-01

    Describes the Wide Area Information Servers (WAIS) system, an integrated information retrieval system for corporate end users. Discussion covers general characteristics of the system, search techniques, protocol development, user interfaces, servers, selective dissemination of information, nontextual data, access to other servers, and description…

  13. Parallel Computing Using Web Servers and "Servlets".

    ERIC Educational Resources Information Center

    Lo, Alfred; Bloor, Chris; Choi, Y. K.

    2000-01-01

    Describes parallel computing and presents inexpensive ways to implement a virtual parallel computer with multiple Web servers. Highlights include performance measurement of parallel systems; models for using Java and intranet technology including single server, multiple clients and multiple servers, single client; and a comparison of CGI (common…

  14. Narcissism and social networking Web sites.

    PubMed

    Buffardi, Laura E; Campbell, W Keith

    2008-10-01

    The present research examined how narcissism is manifested on a social networking Web site (i.e., Facebook.com). Narcissistic personality self-reports were collected from social networking Web page owners. Then their Web pages were coded for both objective and subjective content features. Finally, strangers viewed the Web pages and rated their impression of the owner on agentic traits, communal traits, and narcissism. Narcissism predicted (a) higher levels of social activity in the online community and (b) more self-promoting content in several aspects of the social networking Web pages. Strangers who viewed the Web pages judged more narcissistic Web page owners to be more narcissistic. Finally, mediational analyses revealed several Web page content features that were influential in raters' narcissistic impressions of the owners, including quantity of social interaction, main photo self-promotion, and main photo attractiveness. Implications of the expression of narcissism in social networking communities are discussed.

  15. Asynchronous data change notification between database server and accelerator controls system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, W.; Morris, J.; Nemesure, S.

    2011-10-10

    Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMSs which support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work. This makes the setup of DCN between a database server and interested clients tedious and time consuming. In accelerator control systems, there are many well established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. Asynchronous data change notification (ADCN) between a database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMS systems, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
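
    The mechanism combines a database trigger with a reflection process that pushes changes to control-system clients. The sketch below captures only that shape in a generic form: a trigger writes each change into a notification table, and a small polling process forwards new rows to subscribers. The table names, SQL dialect (SQLite here) and the publish call are illustrative assumptions, not the RHIC-AGS implementation.

      import sqlite3, time

      # Generic shape of the ADCN idea: a trigger records every change in a notification
      # table; a reflection process polls that table and pushes new rows to its clients
      # (in a real controls system this would be an EPICS/CDEV/ADO server doing the SETs).
      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          CREATE TABLE settings (name TEXT PRIMARY KEY, value REAL);
          CREATE TABLE change_log (id INTEGER PRIMARY KEY AUTOINCREMENT,
                                   name TEXT, value REAL);
          CREATE TRIGGER settings_dcn AFTER UPDATE ON settings
          BEGIN
              INSERT INTO change_log (name, value) VALUES (NEW.name, NEW.value);
          END;
      """)
      conn.execute("INSERT INTO settings VALUES ('magnet_current', 100.0)")

      def publish(name, value):
          print(f"notify clients: {name} -> {value}")  # placeholder for the SET/GET push

      last_seen = 0
      conn.execute("UPDATE settings SET value = 101.5 WHERE name = 'magnet_current'")
      conn.commit()
      for _ in range(3):  # reflection loop (would normally run forever)
          rows = conn.execute("SELECT id, name, value FROM change_log WHERE id > ?", (last_seen,)).fetchall()
          for row_id, name, value in rows:
              publish(name, value)
              last_seen = row_id
          time.sleep(0.1)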

  16. The USGODAE Monterey Data Server

    NASA Astrophysics Data System (ADS)

    Sharfstein, P. J.; Dimitriou, D.; Hankin, S. C.

    2004-12-01

    With oversight from the U.S. Global Ocean Data Assimilation Experiment (GODAE) Steering Committee and funding from the Office of Naval Research, the USGODAE Monterey Data Server has been established at the Fleet Numerical Meteorology and Oceanography Center (FNMOC) as an explicit U.S. contribution to GODAE. Support of the Monterey Data Server is accomplished by a cooperative effort between FNMOC and NOAA's Pacific Marine Environmental Laboratory (PMEL) in the on-going development of the server and the support of a collaborative network of GODAE assimilation groups. This server hosts near real-time in-situ oceanographic data, atmospheric forcing fields suitable for driving ocean models, and unique GODAE data sets, including demonstration ocean model products. GODAE is envisioned as a global system of observations, communications, modeling and assimilation, which will deliver regular, comprehensive information on the state of the oceans in a way that will promote and engender wide utility and availability of this resource for maximum benefit to society. It aims to make ocean monitoring and prediction a routine activity in a manner similar to weather forecasting. GODAE will contribute to an information system for the global ocean that will serve interests from climate and climate change to ship routing and fisheries. The USGODAE Server is developed and operated as a prototypical node for this global information system. Because of the broad range and diverse formats of data used by the GODAE community, presenting data with a consistent interface and ensuring its availability in standard formats is a primary challenge faced by the USGODAE Server project. To this end, all USGODAE data sets are available via HTTP and FTP. In addition, USGODAE data are served using Local Data Manager (LDM), THREDDS cataloging, OPeNDAP, and Live Access Server (LAS) from PMEL. Every effort is made to serve USGODAE data through the standards specified by the National Virtual Ocean Data System (NVODS) and the Integrated Ocean Observing System Data Management and Communications (IOOS/DMAC). To provide surface forcing, fluxes, and boundary conditions for ocean model research, USGODAE serves global data from the Navy Operational Global Atmospheric Prediction System (NOGAPS) and regional data from the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS). Global meteorological data and observational data from the FNMOC Ocean QC process are posted in near real-time to USGODAE. These include T/S profiles, in-situ and satellite sea surface temperature (SST), satellite altimetry, and SSM/I sea ice. They contain all of the unclassified in-situ and satellite observations used to initialize the FNMOC NOGAPS model. Also, the Naval Oceanographic Office provides daily satellite SST and SSH retrievals to USGODAE. The USGODAE Server functions as one of two Argo Global Data Assembly Centers (GDACs), hosting the complete collection of quality-controlled Argo T/S profiling float data. USGODAE Argo data are served through OPeNDAP and LAS, providing complete integration into NVODS and the IOOS/DMAC. Due to its high reliability, ease of data access, and increasing breadth of data, the USGODAE Server is becoming an invaluable resource for both the GODAE community and the general oceanographic community. Continued integration of model, forcing, and in-situ data sets from providers throughout the world is making the USGODAE Monterey Data Server a key part of the international GODAE project.
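
    Because the served data sets are exposed through OPeNDAP, remote users can subset them without downloading whole files. The snippet below shows the generic OPeNDAP access pattern with xarray; the dataset URL, variable name and coordinate names are placeholders, not actual USGODAE paths.

      import xarray as xr  # xarray can open an OPeNDAP URL directly (netCDF4 or pydap backend required)

      # Placeholder OPeNDAP endpoint; substitute a real catalog URL from the data server.
      URL = "https://example.org/opendap/nogaps/surface_winds.nc"

      ds = xr.open_dataset(URL)                       # only metadata is read at this point
      subset = ds["wind_speed"].sel(lat=slice(0, 30), lon=slice(120, 160)).isel(time=-1)
      print(subset.mean().values)                     # the server sends just the requested slice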

  17. In silico Prediction, in vitro Antibacterial Spectrum, and Physicochemical Properties of a Putative Bacteriocin Produced by Lactobacillus rhamnosus Strain L156.4

    PubMed Central

    Oliveira, Letícia de C.; Silveira, Aline M. M.; Monteiro, Andréa de S.; dos Santos, Vera L.; Nicoli, Jacques R.; Azevedo, Vasco A. de C.; Soares, Siomar de C.; Dias-Souza, Marcus V.; Nardi, Regina M. D.

    2017-01-01

    A bacteriocinogenic Lactobacillus rhamnosus L156.4 strain isolated from the feces of NIH mice was identified by 16S rRNA gene sequencing and MALDI-TOF mass spectrometry. The entire genome was sequenced using Illumina, annotated in the PGAAP, and RAST servers, and deposited. Conserved genes associated with bacteriocin synthesis were predicted using BAGEL3, leading to the identification of an open reading frame (ORF) that shows homology with the L. rhamnosus GG (ATCC 53103) prebacteriocin gene. The encoded protein contains a conserved protein motif associated a structural gene of the Enterocin A superfamily. We found ORFs related to the prebacteriocin, immunity protein, ABC transporter proteins, and regulatory genes with 100% identity to those of L. rhamnosus HN001. In this study, we provide evidence of a putative bacteriocin produced by L. rhamnosus L156.4 that was further confirmed by in vitro assays. The antibacterial activity of the substances produced by this strain was evaluated using the deferred agar-spot and spot-on-the lawn assays, and a wide antimicrobial activity spectrum against human and foodborne pathogens was observed. The physicochemical characterization of the putative bacteriocin indicated that it was sensitive to proteolytic enzymes, heat stable and maintained its antibacterial activity in a pH ranging from 3 to 9. The activity against Lactobacillus fermentum, which was used as an indicator strain, was detected during bacterial logarithmic growth phase, and a positive correlation was confirmed between bacterial growth and production of the putative bacteriocin. After a partial purification from cell-free supernatant by salt precipitation, the putative bacteriocin migrated as a diffuse band of approximately 1.0–3.0 kDa by SDS-PAGE. Additional studies are being conducted to explore its use in the food industry for controlling bacterial growth and for probiotic applications. PMID:28579977

  18. Effects of Different Multimedia Presentations on Viewers' Information-Processing Activities Measured by Eye-Tracking Technology

    ERIC Educational Resources Information Center

    Chuang, Hsueh-Hua; Liu, Han-Chin

    2012-01-01

    This study implemented eye-tracking technology to understand the impact of different multimedia instructional materials, i.e., five successive pages versus a single page with the same amount of information, on information-processing activities in 21 non-science-major college students. The findings showed that students demonstrated the same number…

  19. The Development of the Puerto Rico Lightning Detection Network for Meteorological Research

    NASA Technical Reports Server (NTRS)

    Legault, Marc D.; Miranda, Carmelo; Medin, J.; Ojeda, L. J.; Blakeslee, Richard J.

    2011-01-01

    A land-based Puerto Rico Lightning Detection Network (PR-LDN) dedicated to the academic research of meteorological phenomena has been developed. Five Boltek StormTracker PCI receivers with LTS-2 GPS timestamp cards and lightning detectors were integrated into Pentium III PC workstations running the CentOS Linux operating system. The Boltek detector Linux driver was compiled under CentOS, modified, and thoroughly tested. These PC workstations with integrated lightning detectors were installed at five of the University of Puerto Rico (UPR) campuses distributed around the island of PR. The PC workstations are left on permanently in order to monitor lightning activity at all times. Each is networked to its campus network backbone, permitting quasi-instantaneous data transfer to a central server at the UPR-Bayamón campus. Information generated by each lightning detector is managed by a C program developed by us called the LDN-client. The LDN-client maintains an open connection to the central server running the LDN-server program, where data are sent in real time for analysis and archival. The LDN-client also manages the storing of data on the PC workstation hard disk. The LDN-server software (also an in-house effort) analyzes the data from each client and performs event triangulations. Time-of-arrival (TOA) and related hybrid algorithms, as well as lightning-type and event-discriminating routines, are also implemented in the LDN-server software. We have also developed software to visually monitor, in real time, lightning events from all clients and the triangulated events. We are currently monitoring and studying the spatial, temporal, and type distribution of lightning strikes associated with electrical storms and tropical cyclones in the vicinity of Puerto Rico.
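
    Event triangulation from several stations is typically posed as a time-of-arrival least-squares problem: find the strike location and emission time whose predicted arrival times best match the measurements. The sketch below illustrates that formulation on a flat-earth toy geometry; it is not the PR-LDN's actual algorithm, and the station coordinates and timings are made up.

      import numpy as np
      from scipy.optimize import least_squares

      C = 299792.458  # km/s, propagation speed of the sferic (speed of light)

      # Hypothetical station coordinates in km on a local flat-earth plane.
      stations = np.array([[0.0, 0.0], [60.0, 5.0], [30.0, 55.0], [-20.0, 40.0]])

      def residuals(params, stations, t_obs):
          x, y, t0 = params
          dist = np.linalg.norm(stations - np.array([x, y]), axis=1)
          return (t0 + dist / C) - t_obs      # predicted minus measured arrival times

      # Synthesize observations for a strike at (25, 20) km emitted at t0 = 0 s.
      true_xy, true_t0 = np.array([25.0, 20.0]), 0.0
      t_obs = true_t0 + np.linalg.norm(stations - true_xy, axis=1) / C

      fit = least_squares(residuals, x0=[0.0, 0.0, 0.0], args=(stations, t_obs))
      print("estimated strike location (km):", fit.x[:2])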

  20. psRNATarget: a plant small RNA target analysis server

    PubMed Central

    Dai, Xinbin; Zhao, Patrick Xuechun

    2011-01-01

    Plant endogenous non-coding short small RNAs (20–24 nt), including microRNAs (miRNAs) and a subset of small interfering RNAs (ta-siRNAs), play important roles in gene expression regulatory networks (GRNs). For example, many transcription factors and development-related genes have been reported as targets of these regulatory small RNAs. Although a number of miRNA target prediction algorithms and programs have been developed, most of them were designed for animal miRNAs, which are significantly different from plant miRNAs in the target recognition process. These differences demand the development of separate plant miRNA (and ta-siRNA) target analysis tool(s). We present psRNATarget, a plant small RNA target analysis server, which features two important analysis functions: (i) reverse complementary matching between small RNA and target transcript using a proven scoring schema, and (ii) target-site accessibility evaluation by calculating unpaired energy (UPE) required to ‘open’ secondary structure around the small RNA’s target site on the mRNA. psRNATarget incorporates recent discoveries in plant miRNA target recognition, e.g. it distinguishes translational and post-transcriptional inhibition, and it reports the number of small RNA/target site pairs that may affect small RNA binding activity to the target transcript. The psRNATarget server is designed for high-throughput analysis of next-generation data with an efficient distributed computing back-end pipeline that runs on a Linux cluster. The server front-end integrates three simplified user-friendly interfaces to accept user-submitted or preloaded small RNAs and transcript sequences, and outputs a comprehensive list of small RNA/target pairs along with online tools for batch downloading, keyword searching and results sorting. The psRNATarget server is freely available at http://plantgrn.noble.org/psRNATarget/. PMID:21622958
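
    psRNATarget's published scoring schema and unpaired-energy calculation are not reproduced here; the sketch below only illustrates the general idea of scoring a reverse-complementary small RNA/target duplex with position-dependent penalties. The penalty values, seed-region weighting, and example sequence are placeholder assumptions.

```python
# Toy reverse-complement matching score between a small RNA and a candidate
# target site. Penalty values and the seed-region weighting are placeholder
# assumptions; psRNATarget's actual schema differs in detail.
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def pairing_penalty(srna_base: str, target_base: str) -> float:
    """0 for a Watson-Crick pair, 0.5 for a G:U wobble, 1.0 for a mismatch."""
    if COMPLEMENT.get(srna_base) == target_base:
        return 0.0
    if {srna_base, target_base} == {"G", "U"}:
        return 0.5
    return 1.0

def target_score(srna: str, target_site: str, seed=(2, 13)) -> float:
    """Sum penalties along the duplex; penalties in the 'seed' region count double."""
    assert len(srna) == len(target_site)
    score = 0.0
    # The small RNA pairs antiparallel to the target, so walk the target reversed.
    for pos, (s, t) in enumerate(zip(srna, reversed(target_site)), start=1):
        penalty = pairing_penalty(s, t)
        score += 2 * penalty if seed[0] <= pos <= seed[1] else penalty
    return score  # lower score = better predicted target

# Hypothetical 21-nt small RNA and a perfectly complementary target site.
mirna = "UGGAGUGUGACAAUGGUGUUU"
site = "".join(COMPLEMENT[b] for b in reversed(mirna))
print(target_score(mirna, site))  # 0.0 for a perfect duplex
```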

  1. Hypertext-based computer vision teaching packages

    NASA Astrophysics Data System (ADS)

    Marshall, A. David

    1994-10-01

    The World Wide Web Initiative has provided a means of delivering hypertext and multimedia based information across the whole INTERNET. Many applications have been developed on such http servers. At Cardiff we have developed an http hypertext based multimedia server, the Cardiff Information Server, using the widely available Mosaic system. The server provides a variety of information, ranging from teaching modules, on-line documentation and timetables for departmental activities to more light hearted hobby interests. One important and novel development of the server has been the provision of courseware facilities. These range from on-line lecture notes, exercises and their solutions to more interactive teaching packages. A variety of disciplines have benefitted, notably Computer Vision and Image Processing, but also C programming, X Windows, Computer Graphics and Parallel Computing. This paper addresses the implementation of the Computer Vision and Image Processing packages, the advantages gained from using a hypertext based system, and practical experiences of using the packages in a class environment. The paper also addresses how best to provide information in such a hypertext based system and how interactive image processing packages can be developed and integrated into courseware. The suite of tools developed facilitates a flexible and powerful courseware package that has proved popular in the classroom and over the Internet. The paper also details many future developments we see as possible. One of the key points raised in the paper is that Mosaic's hypertext language (HTML) is extremely powerful and yet relatively straightforward to use. It is also possible to link in Unix calls so that programs and shells can be executed. This provides a powerful suite of utilities that can be exploited to develop many packages.
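
    The closing observation, that Unix calls can be linked into pages so that programs are executed server-side, is essentially the CGI pattern. Below is a minimal, hypothetical Python sketch of that pattern, not the Cardiff Information Server's actual code: the web server runs the script once per request, the script invokes a command-line tool, and the tool's output is wrapped in HTML.

```python
#!/usr/bin/env python3
# Minimal CGI-style sketch: a script the web server executes per request,
# which in turn runs a command-line tool (a placeholder command here) and
# embeds its output in the returned HTML. Purely illustrative of the
# "link in Unix calls" idea described above.
import html
import subprocess

def main() -> None:
    # A placeholder command; any installed image-processing CLI could stand in here.
    result = subprocess.run(["echo", "edge-detection complete"],
                            capture_output=True, text=True, check=True)
    body = html.escape(result.stdout.strip())
    # A CGI program writes an HTTP header, a blank line, then the document.
    print("Content-Type: text/html")
    print()
    print(f"<html><body><h1>Image processing result</h1><p>{body}</p></body></html>")

if __name__ == "__main__":
    main()
```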

  2. GRAMM-X public web server for protein–protein docking

    PubMed Central

    Tovchigrechko, Andrey; Vakser, Ilya A.

    2006-01-01

    The protein docking software GRAMM-X and its web interface extend the original GRAMM Fast Fourier Transformation methodology by employing smoothed potentials, a refinement stage, and knowledge-based scoring. The web server frees users from the complex installation of database-dependent parallel software and from maintaining the large hardware resources needed for protein docking simulations. Docking problems submitted to the GRAMM-X server are processed by a 320-processor Linux cluster. The server was extensively tested by benchmarking, several months of public use, and participation in the CAPRI server track. PMID:16845016
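
    The abstract names the core technique, FFT-based correlation of receptor and ligand grids, without giving details. The toy sketch below shows only that basic correlation step on made-up occupancy grids; GRAMM-X's smoothed potentials, refinement stage, and knowledge-based scoring are not reproduced.

```python
# Toy FFT correlation step used in grid-based rigid docking: the correlation
# of a receptor grid with a ligand grid over all translations is computed in
# one pass via FFTs. The grids are placeholders, not GRAMM-X's potentials.
import numpy as np

def fft_correlation(receptor: np.ndarray, ligand: np.ndarray) -> np.ndarray:
    """Correlation score for every (periodic) translation of the ligand grid."""
    return np.real(np.fft.ifftn(np.fft.fftn(receptor) * np.conj(np.fft.fftn(ligand))))

rng = np.random.default_rng(0)
n = 32
receptor = (rng.random((n, n, n)) > 0.9).astype(float)  # toy occupancy grid
ligand = np.zeros((n, n, n))
ligand[:6, :6, :6] = 1.0                                 # toy ligand shape

scores = fft_correlation(receptor, ligand)
best = np.unravel_index(np.argmax(scores), scores.shape)
print("best translation (grid units):", best, "score:", scores[best])
```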

  3. Improving STEM Education and Workforce Development by the Inclusion of Research Experiences in the Curriculum at SWC

    DTIC Science & Technology

    2016-06-08

    server environment. While the college’s two Cisco blade servers are located in separate buildings, these units now work as one unit. Critical databases and software packages are...

  4. Scaling NS-3 DCE Experiments on Multi-Core Servers

    DTIC Science & Technology

    2016-06-15

    that work well together. 3.2 Simulation Server Details We ran the simulations on a Dell® PowerEdge M520 blade server[8] running Ubuntu Linux 14.04...To minimize the amount of time needed to complete all of the simulations, we planned to run multiple simulations at the same time on a blade server...MacBook was running the simulation inside a virtual machine (Ubuntu 14.04), while the blade server was running the same operating system directly on

  5. Reliability Information Analysis Center 1st Quarter 2007, Technical Area Task (TAT) Report

    DTIC Science & Technology

    2007-02-05

    Created new SQL Server database for "PC Configuration" web application. Added roles for security and posted application to production. Wrote...and ran SQL Server scripts to migrate production databases to new server. Created backup jobs for new SQL Server databases. Continued...second phase of the TENA demo. Extensive tasking was established and assigned. A TENA interface to EW Server was reaffirmed after some uncertainty about

  6. Blade runner. Blade server and virtualization technology can help hospitals save money--but they are far from silver bullets.

    PubMed

    Lawrence, Daphne

    2009-03-01

    Blade servers and virtualization can reduce infrastructure, maintenance, heating, electric, cooling and equipment costs. Blade server technology is evolving and some elements may become obsolete. There is very little interoperability between blades. Hospitals can virtualize 40 to 60 percent of their servers, and old servers can be reused for testing. Not all applications lend themselves to virtualization--especially those with high memory requirements. CIOs should engage their vendors in virtualization discussions.

  7. Multimedia explorer: image database, image proxy-server and search-engine.

    PubMed Central

    Frankewitsch, T.; Prokosch, U.

    1999-01-01

    Multimedia plays a major role in medicine. Databases containing images, movies or other types of multimedia objects are increasing in number, especially on the WWW. However, no good retrieval mechanism or search engine currently exists to efficiently track down such multimedia sources in the vast amount of information provided by the WWW. Secondly, the tools for searching databases are usually not adapted to the properties of images. HTML pages do not allow complex searches. Therefore, establishing more comfortable retrieval involves the use of a higher programming level like JAVA. With this platform-independent language it is possible to create extensions to commonly used web browsers. These applets offer a graphical user interface for high level navigation. We implemented a database using JAVA objects as the primary storage containers, which are then stored in a JAVA-controlled ORACLE8 database. Navigation depends on a structured vocabulary enhanced by a semantic network. With this approach multimedia objects can be encapsulated within a logical module for quick data retrieval. PMID:10566463

  8. SPR online: creating, maintaining, and distributing a virtual professional society on the Internet.

    PubMed

    D'Alessandro, M P; Galvin, J R

    1998-01-01

    SPR Online (http://www.pedrad.org) is a recently developed digital representation of the Society for Pediatric Radiology (SPR) that enables physicians to access pertinent information and services on the Internet. SPR Online was organized on the basis of the five main services of the SPR, which include Administration, Patient Care, Education, Research, and Meetings. For each service, related content from the SPR was digitized and placed onto SPR Online. Usage over a 12-month period was evaluated with server log file analysis. A total of 3,209 users accessed SPR Online, viewing 11,246 pages of information. A wide variety of information was accessed, with that from the Education, Administration, and Meetings services being the most popular. Fifteen percent of users came from foreign countries. As a virtual professional society, SPR Online greatly enhances the power and scope of the SPR and has proved to be a popular resource, meeting the diverse information needs of an international community of pediatric radiologists.

  9. Multimedia explorer: image database, image proxy-server and search-engine.

    PubMed

    Frankewitsch, T; Prokosch, U

    1999-01-01

    Multimedia plays a major role in medicine. Databases containing images, movies or other types of multimedia objects are increasing in number, especially on the WWW. However, no good retrieval mechanism or search engine currently exists to efficiently track down such multimedia sources in the vast amount of information provided by the WWW. Secondly, the tools for searching databases are usually not adapted to the properties of images. HTML pages do not allow complex searches. Therefore, establishing more comfortable retrieval involves the use of a higher programming level like JAVA. With this platform-independent language it is possible to create extensions to commonly used web browsers. These applets offer a graphical user interface for high level navigation. We implemented a database using JAVA objects as the primary storage containers, which are then stored in a JAVA-controlled ORACLE8 database. Navigation depends on a structured vocabulary enhanced by a semantic network. With this approach multimedia objects can be encapsulated within a logical module for quick data retrieval.

  10. Specification and Verification of Web Applications in Rewriting Logic

    NASA Astrophysics Data System (ADS)

    Alpuente, María; Ballis, Demis; Romero, Daniel

    This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.

  11. Fast access to the CMS detector condition data employing HTML5 technologies

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    This paper focuses on using HTML version 5 (HTML5) for accessing condition data for the CMS experiment, evaluating the benefits and risks posed by the use of this technology. According to the authors of HTML5, this technology attempts to solve issues found in previous iterations of HTML and addresses the needs of web applications, an area previously not adequately covered by HTML. We demonstrate that employing HTML5 brings important benefits in terms of access performance to the CMS condition data. The combined use of web storage and web sockets allows increasing performance and reducing costs in terms of computation power, memory usage and network bandwidth for client and server. Above all, web workers allow creating scripts that can be executed in multi-threaded mode, exploiting multi-core microprocessors. Web workers have been employed in order to substantially decrease the web page rendering time needed to display the condition data stored in the CMS condition database.

  12. On-the-fly Data Reprocessing and Analysis Capabilities from the XMM-Newton Archive

    NASA Astrophysics Data System (ADS)

    Ibarra, A.; Sarmiento, M.; Colomo, E.; Loiseau, N.; Salgado, J.; Gabriel, C.

    2017-10-01

    Since its last release, the XMM-Newton Science Archive (XSA) has included the possibility of performing on-the-fly data processing with SAS through the Remote Interface for Science Analysis (RISA) server. It enables scientists to analyse data without downloading or installing any data or software. The analysis options presently available include extraction of spectra and light curves for user-defined EPIC source regions and full reprocessing of data whose currently archived pipeline products were processed with older SAS versions or calibration files. The current pipeline is fully aligned with the most recent SAS and calibration, while the last full reprocessing of the archive was performed in 2013. The on-the-fly data processing functionality in this release is experimental, and we invite the community to test it and let us know their results. Known issues and workarounds are described in the 'Watchouts' section of the XSA web page. Feedback on how this functionality should evolve will be highly appreciated.

  13. Efficient discovery of overlapping communities in massive networks

    PubMed Central

    Gopalan, Prem K.; Blei, David M.

    2013-01-01

    Detecting overlapping communities is essential to analyzing and exploring natural networks such as social networks, biological networks, and citation networks. However, most existing approaches do not scale to the size of networks that we regularly observe in the real world. In this paper, we develop a scalable approach to community detection that discovers overlapping communities in massive real-world networks. Our approach is based on a Bayesian model of networks that allows nodes to participate in multiple communities, and a corresponding algorithm that naturally interleaves subsampling from the network and updating an estimate of its communities. We demonstrate how we can discover the hidden community structure of several real-world networks, including 3.7 million US patents, 575,000 physics articles from the arXiv preprint server, and 875,000 connected Web pages from the Internet. Furthermore, we demonstrate on large simulated networks that our algorithm accurately discovers the true community structure. This paper opens the door to using sophisticated statistical models to analyze massive networks. PMID:23950224

  14. Wi-Fi/MARG Integration for Indoor Pedestrian Localization.

    PubMed

    Tian, Zengshan; Jin, Yue; Zhou, Mu; Wu, Zipeng; Li, Ze

    2016-12-10

    With the wide deployment of Wi-Fi networks, Wi-Fi based indoor localization systems that require no special hardware have attracted significant attention and have become a practical technology. At the same time, the Magnetic, Angular Rate, and Gravity (MARG) sensors installed in commercial mobile devices can achieve highly accurate localization over short time spans. Based on this, we design a novel indoor localization system using built-in MARG sensors and a Wi-Fi module. The innovative contributions of this paper include the enhanced Pedestrian Dead Reckoning (PDR) and Wi-Fi localization approaches, and an Extended Kalman Particle Filter (EKPF) based fusion algorithm. A new Wi-Fi/MARG indoor localization system, including an Android based mobile client, a Web page for remote control, and a location server, is developed for real-time indoor pedestrian localization. The extensive experimental results show that the proposed system achieves better localization performance, with an average error of 0.85 m, than using the Wi-Fi module or MARG sensors alone.
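
    The EKPF fusion itself is not described in the abstract beyond its name, so the sketch below shows a plain particle filter fusing PDR step predictions with Wi-Fi position fixes, which captures the general prediction/update structure. All noise levels, step parameters, and measurements are invented for illustration and are not the paper's algorithm or data.

```python
# Minimal particle-filter sketch fusing Pedestrian Dead Reckoning (PDR)
# steps with Wi-Fi position fixes. Step lengths, headings, noise levels,
# and the Wi-Fi fixes are made-up illustrations, not the paper's EKPF.
import numpy as np

rng = np.random.default_rng(1)
N = 500
particles = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(N, 2))  # initial (x, y) in metres

def pdr_predict(particles, step_len, heading, step_sigma=0.1, heading_sigma=0.05):
    """Propagate each particle by one noisy PDR step."""
    lengths = step_len + step_sigma * rng.standard_normal(len(particles))
    headings = heading + heading_sigma * rng.standard_normal(len(particles))
    particles[:, 0] += lengths * np.cos(headings)
    particles[:, 1] += lengths * np.sin(headings)
    return particles

def wifi_update(particles, wifi_xy, wifi_sigma=2.0):
    """Weight particles by a Gaussian Wi-Fi likelihood and resample."""
    d2 = np.sum((particles - wifi_xy) ** 2, axis=1)
    weights = np.exp(-0.5 * d2 / wifi_sigma**2)
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# One illustrative cycle: a 0.7 m step heading roughly east, then a Wi-Fi fix.
particles = pdr_predict(particles, step_len=0.7, heading=0.05)
particles = wifi_update(particles, wifi_xy=np.array([0.8, 0.1]))
print("fused position estimate:", particles.mean(axis=0))
```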

  15. A simple and low-cost Internet-based teleconsultation system that could effectively solve the health care access problems in underserved areas of developing countries.

    PubMed

    Kuntalp, Mehmet; Akar, Orkun

    2004-08-01

    In many developing countries, including Turkey, telemedicine systems are not in wide use due to the high cost and complexity of the required technology. The lack of such systems, however, has serious implications for patients who live in rural areas. The objective of this paper is to present a simple and economically affordable alternative to the current systems that would allow experts to easily access the medical data of their remote patients over the Internet. The system is developed in a client-server architecture with a user-friendly graphical interface, and various services are implemented as dynamic web pages based on PHP. Other key features of the system are its strong security and platform independence. An academic prototype was implemented and presented for evaluation by a group of physicians. The results reveal that the system could find acceptance from the medical community and could be an effective means of providing quality health care in developing countries.

  16. A Scalability Model for ECS's Data Server

    NASA Technical Reports Server (NTRS)

    Menasce, Daniel A.; Singhal, Mukesh

    1998-01-01

    This report presents in four chapters a model for the scalability analysis of the Data Server subsystem of the Earth Observing System Data and Information System (EOSDIS) Core System (ECS). The model analyzes if the planned architecture of the Data Server will support an increase in the workload with the possible upgrade and/or addition of processors, storage subsystems, and networks. The approaches in the report include a summary of the architecture of ECS's Data server as well as a high level description of the Ingest and Retrieval operations as they relate to ECS's Data Server. This description forms the basis for the development of the scalability model of the data server and the methodology used to solve it.

  17. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    PubMed

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) Submitting specimens requests across collaborating organizations (ii) Graphically defining new experimental data types, metadata and wizards for data collection (iii) Transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database (iv) Securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays (v) Interacting dynamically with external data sources (vi) Tracking study participants and cohorts over time (vii) Developing custom interfaces using client libraries (viii) Authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  18. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    PubMed Central

    2011-01-01

    Background Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. Results To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) Submitting specimens requests across collaborating organizations (ii) Graphically defining new experimental data types, metadata and wizards for data collection (iii) Transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database (iv) Securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays (v) Interacting dynamically with external data sources (vi) Tracking study participants and cohorts over time (vii) Developing custom interfaces using client libraries (viii) Authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461

  19. On the optimal use of a slow server in two-stage queueing systems

    NASA Astrophysics Data System (ADS)

    Papachristos, Ioannis; Pandelis, Dimitrios G.

    2017-07-01

    We consider two-stage tandem queueing systems with a dedicated server in each queue and a slower flexible server that can attend both queues. We assume Poisson arrivals and exponential service times, and linear holding costs for jobs present in the system. We study the optimal dynamic assignment of servers to jobs assuming that two servers cannot collaborate to work on the same job and preemptions are not allowed. We formulate the problem as a Markov decision process and derive properties of the optimal allocation for the dedicated (fast) servers. Specifically, we show that the downstream dedicated server should not idle, and the same is true for the upstream one when holding costs are larger there. The optimal allocation of the slow server is investigated through extensive numerical experiments that lead to conjectures on the structure of the optimal policy.

  20. Process evaluation distributed system

    NASA Technical Reports Server (NTRS)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.

  1. From honeybees to Internet servers: biomimicry for distributed management of Internet hosting centers.

    PubMed

    Nakrani, Sunil; Tovey, Craig

    2007-12-01

    An Internet hosting center hosts services on its server ensemble. The center must allocate servers dynamically amongst services to maximize revenue earned from hosting fees. The finite server ensemble, unpredictable request arrival behavior and server reallocation cost make server allocation optimization difficult. Server allocation closely resembles honeybee forager allocation amongst flower patches to optimize nectar influx. The resemblance inspires a honeybee biomimetic algorithm. This paper describes details of the honeybee self-organizing model in terms of information flow and feedback, analyzes the homology between the two problems and derives the resulting biomimetic algorithm for hosting centers. The algorithm is assessed for effectiveness and adaptiveness by comparative testing against benchmark and conventional algorithms. Computational results indicate that the new algorithm is highly adaptive to widely varying external environments and quite competitive against benchmark assessment algorithms. Other swarm intelligence applications are briefly surveyed, and some general speculations are offered regarding their various degrees of success.
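
    As a rough illustration of the forager analogy, the toy sketch below reallocates servers toward services with higher recent revenue per server, discounted by a reallocation cost, using probabilistic switching. The switching rule, parameters, and revenue figures are placeholders, not the paper's actual biomimetic algorithm.

```python
# Toy honeybee-inspired reallocation: each "forager" (server) periodically
# decides whether to switch service, with switching weighted by recent
# revenue per server ("nectar influx") and discounted by a reallocation
# cost. Parameters and revenue figures are placeholders, not the paper's.
import random

def reallocate(allocation, revenue, switch_prob=0.2, realloc_cost=0.5):
    """allocation: service -> server count; revenue: recent revenue per service."""
    profitability = {s: revenue[s] / max(allocation[s], 1) for s in allocation}
    services = list(profitability)
    weights = [profitability[s] for s in services]
    for src in list(allocation):
        # Each server on a low-profit service may "follow a waggle dance"
        # advertising a more profitable service.
        movers = sum(1 for _ in range(allocation[src]) if random.random() < switch_prob)
        for _ in range(movers):
            dst = random.choices(services, weights=weights)[0]
            gain = profitability[dst] - profitability[src]
            if dst != src and gain > realloc_cost and allocation[src] > 1:
                allocation[src] -= 1
                allocation[dst] += 1
    return allocation

alloc = {"web": 10, "video": 10, "mail": 10}
rev = {"web": 120.0, "video": 300.0, "mail": 60.0}
print(reallocate(alloc, rev))
```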

  2. DelPhi web server v2: incorporating atomic-style geometrical figures into the computational protocol.

    PubMed

    Smith, Nicholas; Witham, Shawn; Sarkar, Subhra; Zhang, Jie; Li, Lin; Li, Chuan; Alexov, Emil

    2012-06-15

    A new edition of the DelPhi web server, DelPhi web server v2, is released to include atomic-style presentation of geometrical figures. These geometrical objects can be used to model nano-size objects together with real biological macromolecules. The position and size of the object can be manipulated by the user in real time until the desired results are achieved. The server fixes structural defects, adds hydrogen atoms and calculates electrostatic energies and the corresponding electrostatic potential and ionic distributions. The web server follows a client-server architecture built on PHP and HTML and utilizes the DelPhi software. The computation is carried out on a supercomputer cluster and results are returned to the user via the HTTP protocol, including the ability to visualize the structure and corresponding electrostatic potential via a Jmol implementation. The DelPhi web server is available from http://compbio.clemson.edu/delphi_webserver.

  3. 78 FR 48821 - Energy Conservation Program for Consumer Products and Certain Commercial and Industrial Equipment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-12

    ... Commercial and Industrial Equipment: Proposed Determination of Computer Servers as a Covered Consumer Product... comments on the proposed determination that computer servers (servers) qualify as a covered product. DATES: The comment period for the proposed determination relating to servers published on July 12, 2013 (78...

  4. ASPEN--A Web-Based Application for Managing Student Server Accounts

    ERIC Educational Resources Information Center

    Sandvig, J. Christopher

    2004-01-01

    The growth of the Internet has greatly increased the demand for server-side programming courses at colleges and universities. Students enrolled in such courses must be provided with server-based accounts that support the technologies that they are learning. The process of creating, managing and removing large numbers of student server accounts is…

  5. Multi-board kernel communication using socket programming for embedded applications

    NASA Astrophysics Data System (ADS)

    Mishra, Ashish; Girdhar, Neha; Krishnia, Nikita

    2016-03-01

    In large application projects, there is often a need to communicate between two different processors or two different kernels. The aim of this paper is to communicate between two different kernels and to use an efficient method to do so. The TCP/IP protocol is implemented to communicate between two boards via the Ethernet port, using the lwIP (lightweight IP) stack, a smaller independent implementation of the TCP/IP stack suitable for use in embedded systems. While retaining TCP/IP functionality, the lwIP stack reduces memory use and even code size. In this communication process, a Raspberry Pi acts as an active client and a field-programmable gate array (FPGA) board acts as a passive server, and they communicate via Ethernet. Three applications based on TCP/IP client-server network communication have been implemented. The Echo server application is used to communicate between two different kernels on two different boards. Socket programming is used because it is independent of the platform and programming language. TCP transmit and receive throughput test applications are used to measure the maximum throughput of data transmission. These applications are based on communication with an open source tool called iperf, which measures the throughput rate by sending or receiving a constant piece of data to the client or server according to the test application.
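
    The echo application described above runs on lwIP and an FPGA/Raspberry Pi pair; as a language-agnostic illustration of the same client-server echo pattern over TCP sockets, here is a minimal Python sketch. The host, port, and payload are placeholders.

```python
# Minimal TCP echo server and client illustrating the echo pattern described
# above. Host, port, and payload are placeholders; the paper's implementation
# runs on lwIP (FPGA board) and a Raspberry Pi rather than Python.
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007

def echo_server() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            while data := conn.recv(1024):   # echo until the client disconnects
                conn.sendall(data)

threading.Thread(target=echo_server, daemon=True).start()
time.sleep(0.2)                              # give the server a moment to start listening

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello from the client")
    print(cli.recv(1024).decode())           # prints: hello from the client
```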

  6. Modernization of the USGS Hawaiian Volcano Observatory Seismic Processing Infrastructure

    NASA Astrophysics Data System (ADS)

    Antolik, L.; Shiro, B.; Friberg, P. A.

    2016-12-01

    The USGS Hawaiian Volcano Observatory (HVO) operates a Tier 1 Advanced National Seismic System (ANSS) seismic network to monitor, characterize, and report on volcanic and earthquake activity in the State of Hawaii. Upgrades at the observatory since 2009 have improved the digital telemetry network, computing resources, and seismic data processing with the adoption of the ANSS Quake Management System (AQMS). HVO aims to build on these efforts by further modernizing its seismic processing infrastructure and strengthening its ability to meet ANSS performance standards. Most notably, this will also allow HVO to support redundant systems, both onsite and offsite, in order to provide better continuity of operation during intermittent power and network outages. We are in the process of implementing a number of upgrades and improvements to HVO's seismic processing infrastructure, including: 1) Virtualization of AQMS physical servers; 2) Migration of server operating systems from Solaris to Linux; 3) Consolidation of AQMS real-time and post-processing services to a single server; 4) Upgrading the database from Oracle 10 to Oracle 12; and 5) Upgrading to the latest Earthworm and AQMS software. These improvements will make server administration more efficient, minimize hardware resources required by AQMS, simplify the Oracle replication setup, and provide better integration with HVO's existing state of health monitoring tools and backup system. Ultimately, it will provide HVO with the latest and most secure software available while making the software easier to deploy and support.

  7. How to securely replicate services

    NASA Technical Reports Server (NTRS)

    Reiter, Michael; Birman, Kenneth

    1992-01-01

    A method is presented for constructing replicated services that retain their availability and integrity despite several servers and clients corrupted by an intruder, in addition to others failing benignly. More precisely, a service is replicated by n servers in such a way that a correct client will accept a correct server's response if, for some prespecified parameter k, at least k servers are correct and fewer than k servers are corrupt. The issue of maintaining causality among client requests is also addressed. A security breach resulting from an intruder's ability to effect a violation of causality in the sequence of requests processed by the service is illustrated. An approach to counter this problem is proposed that requires fewer than k servers to be corrupt and that is live if at least k+b servers are correct, where b is the assumed maximum total number of corrupt servers in any system run. An important and novel feature of these schemes is that the client need not be able to identify or authenticate even a single server. Instead, the client is required only to possess at most two public keys for the service. The practicality of these schemes is illustrated through a discussion of several issues pertinent to their implementation and use, and their intended role in a secure version of the Isis system is also described.
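
    A minimal sketch of the client-side voting idea, accepting a response only when at least k replicas agree, is given below. The replica replies are simulated, and the scheme's treatment of authentication, causality, and service keys is deliberately omitted.

```python
# Minimal client-side voting sketch: accept a replicated service's response
# only if at least k replicas return the same value. The replica responses
# here are simulated; the paper's scheme additionally covers authentication,
# causality, and service public keys, which this sketch omits.
from collections import Counter
from typing import Optional

def vote(responses: list[str], k: int) -> Optional[str]:
    """Return the response reported by at least k replicas, or None if no quorum."""
    value, count = Counter(responses).most_common(1)[0]
    return value if count >= k else None

# Simulated run: 5 replicas, 2 of them corrupted by an intruder, k = 3.
replies = ["balance=42", "balance=42", "balance=42", "balance=999", "balance=0"]
print(vote(replies, k=3))   # -> "balance=42"
print(vote(replies, k=4))   # -> None (no quorum of 4 identical replies)
```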

  8. Optimal Self-Tuning PID Controller Based on Low Power Consumption for a Server Fan Cooling System.

    PubMed

    Lee, Chengming; Chen, Rongshun

    2015-05-20

    Recently, saving cooling power in servers by controlling fan speed has attracted considerable attention because of the increasing demand for high-density servers. This paper presents an optimal self-tuning proportional-integral-derivative (PID) controller, combining a PID neural network (PIDNN) with fan-power-based optimization of the transient-state temperature response in the time domain, for a server fan cooling system. Because the thermal model of the cooling system is nonlinear and complex, a server mockup system simulating a 1U rack server was constructed and a fan power model was created using a third-order nonlinear curve fit to determine the cooling power consumed by the fan speed control. The PIDNN, with a time-domain criterion, is used to tune all PID gains online. The proposed controller was validated through step-response experiments in which the server moved from the low to the high power state. The results show that up to 14% of a server's fan cooling power can be saved if the fan control permits a slight overshoot in the temperature response of the electronic components, which may provide a time-saving strategy for tuning the PID controller to control the server fan speed at low fan power consumption.
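
    The paper tunes the PID gains online with a PID neural network, which is not reproduced here. The sketch below shows only a plain discrete PID update driving a fan duty cycle, plus a toy cubic fan-power estimate standing in for the third-order fitted power model; all gains, setpoints, and coefficients are invented.

```python
# Minimal discrete PID update for fan-speed control, with a toy cubic
# fan-power estimate standing in for the paper's third-order fitted power
# model. Gains, setpoint, and coefficients are placeholders; the paper's
# online PIDNN gain tuning is not reproduced.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = measurement - setpoint          # positive when the component is too hot
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def fan_power(duty):
    """Toy cubic fan-power model (watts) as a function of duty cycle 0..1."""
    return 0.2 + 1.5 * duty + 0.8 * duty**2 + 6.0 * duty**3

pid = PID(kp=0.05, ki=0.01, kd=0.02, dt=1.0)
temperature = 78.0                              # degrees C, hypothetical reading
duty = min(max(pid.update(setpoint=70.0, measurement=temperature), 0.0), 1.0)
print(f"fan duty: {duty:.2f}, estimated fan power: {fan_power(duty):.1f} W")
```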

  9. Informatics in radiology (infoRAD): A complete continuous-availability PACS archive server.

    PubMed

    Liu, Brent J; Huang, H K; Cao, Fei; Zhou, Michael Z; Zhang, Jianguo; Mogel, Greg

    2004-01-01

    The operational reliability of the picture archiving and communication system (PACS) server in a filmless hospital environment is always a major concern because server failure could cripple the entire PACS operation. A simple, low-cost, continuous-availability (CA) PACS archive server was designed and developed. The server makes use of a triple modular redundancy (TMR) system with a simple majority voting logic that automatically identifies a faulty module and removes it from service. The remaining two modules continue normal operation with no adverse effects on data flow or system performance. In addition, the server is integrated with two external mass storage devices for short- and long-term storage. Evaluation and testing of the server were conducted with laboratory experiments in which hardware failures were simulated to observe recovery time and the resumption of normal data flow. The server provides maximum uptime (99.999%) for end users while ensuring the transactional integrity of all clinical PACS data. Hardware failure has only minimal impact on performance, with no interruption of clinical data flow or loss of data. As hospital PACS become more widespread, the need for CA PACS solutions will increase. A TMR CA PACS archive server can reliably help achieve CA in this setting. Copyright RSNA, 2004

  10. Performance of a distributed superscalar storage server

    NASA Technical Reports Server (NTRS)

    Finestead, Arlan; Yeager, Nancy

    1993-01-01

    The RS/6000 performed well in our test environment. The potential exists for the RS/6000 to act as a departmental server for a small number of users, rather than as a high speed archival server. Multiple UniTree Disk Servers utilizing one UniTree Name Server could be developed, which would allow for a cost-effective archival system. Our performance tests were clearly limited by the network bandwidth. The performance gathered by the LibUnix testing shows that UniTree is capable of exceeding Ethernet speeds on an RS/6000 Model 550. The performance of FTP might be significantly faster across a higher bandwidth network. The UniTree Name Server also showed signs of being a potential bottleneck. UniTree sites that require a high ratio of file creations and deletions to reads and writes would run into this bottleneck. It is possible to improve UniTree Name Server performance by bypassing the UniTree LibUnix library altogether, communicating directly with the UniTree Name Server, and optimizing creations. Although testing was performed in a less than ideal environment, the performance statistics stated in this paper should give end users a realistic idea of what performance they can expect in this type of setup.

  11. LiveBench-1: continuous benchmarking of protein structure prediction servers.

    PubMed

    Bujnicki, J M; Elofsson, A; Fischer, D; Rychlewski, L

    2001-02-01

    We present a novel, continuous approach aimed at the large-scale assessment of the performance of available fold-recognition servers. Six popular servers were investigated: PDB-Blast, FFAS, T98-lib, GenTHREADER, 3D-PSSM, and INBGU. The assessment was conducted using as prediction targets a large number of selected protein structures released from October 1999 to April 2000. A target was selected if its sequence showed no significant similarity to any of the proteins previously available in the structural database. Overall, the servers were able to produce structurally similar models for one-half of the targets, but significantly accurate sequence-structure alignments were produced for only one-third of the targets. We further classified the targets into two sets: easy and hard. We found that all servers were able to find the correct answer for the vast majority of the easy targets if a structurally similar fold was present in the server's fold libraries. However, among the hard targets--where standard methods such as PSI-BLAST fail--the most sensitive fold-recognition servers were able to produce similar models for only 40% of the cases, half of which had a significantly accurate sequence-structure alignment. Among the hard targets, the presence of updated libraries appeared to be less critical for the ranking. An "ideally combined consensus" prediction, where the results of all servers are considered, would increase the percentage of correct assignments by 50%. Each server had a number of cases with a correct assignment, where the assignments of all the other servers were wrong. This emphasizes the benefits of considering more than one server in difficult prediction tasks. The LiveBench program (http://BioInfo.PL/LiveBench) is being continued, and all interested developers are cordially invited to join.

  12. The HydroServer Platform for Sharing Hydrologic Data

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Horsburgh, J. S.; Schreuders, K.; Maidment, D. R.; Zaslavsky, I.; Valentine, D. W.

    2010-12-01

    The CUAHSI Hydrologic Information System (HIS) is an internet based system that supports sharing of hydrologic data. HIS consists of databases connected using the Internet through Web services, as well as software for data discovery, access, and publication. The HIS system architecture is comprised of servers for publishing and sharing data, a centralized catalog to support cross server data discovery and a desktop client to access and analyze data. This paper focuses on HydroServer, the component developed for sharing and publishing space-time hydrologic datasets. A HydroServer is a computer server that contains a collection of databases, web services, tools, and software applications that allow data producers to store, publish, and manage the data from an experimental watershed or project site. HydroServer is designed to permit publication of data as part of a distributed national/international system, while still locally managing access to the data. We describe the HydroServer architecture and software stack, including tools for managing and publishing time series data for fixed point monitoring sites as well as spatially distributed, GIS datasets that describe a particular study area, watershed, or region. HydroServer adopts a standards based approach to data publication, relying on accepted and emerging standards for data storage and transfer. CUAHSI developed HydroServer code is free with community code development managed through the codeplex open source code repository and development system. There is some reliance on widely used commercial software for general purpose and standard data publication capability. The sharing of data in a common format is one way to stimulate interdisciplinary research and collaboration. It is anticipated that the growing, distributed network of HydroServers will facilitate cross-site comparisons and large scale studies that synthesize information from diverse settings, making the network as a whole greater than the sum of its parts in advancing hydrologic research. Details of the CUAHSI HIS can be found at http://his.cuahsi.org, and HydroServer codeplex site http://hydroserver.codeplex.com.

  13. 77 FR 26520 - Procurement List Additions and Deletions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-04

    ... Book, 6'' x 9'', Green NSN: 7530-00-NIB-1012--60 Pages NSN: 7530-00-NIB-1013--80 Pages NPA: Alabama...: VIP Services, Inc., Elkhorn, WI. Contracting Activity: Defense Logistics Agency Land and Maritime...

  14. 47 CFR 54.639 - Ineligible expenses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., including the following: i. Computers, including servers, and related hardware (e.g., printers, scanners, laptops), unless used exclusively for network management, maintenance, or other network operations; ii... installation/construction; marketing studies, marketing activities, or outreach to potential network members...

  15. 47 CFR 54.639 - Ineligible expenses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., including the following: i. Computers, including servers, and related hardware (e.g., printers, scanners, laptops), unless used exclusively for network management, maintenance, or other network operations; ii... installation/construction; marketing studies, marketing activities, or outreach to potential network members...

  16. A Large-scale Distributed Indexed Learning Framework for Data that Cannot Fit into Memory

    DTIC Science & Technology

    2015-03-27

    learn a classifier. Integrating three learning techniques (online, semi-supervised, and active learning) together with selective sampling, with minimal communication between the server and the clients, solved this problem.

  17. Dynamics of list-server discussion on genetically modified foods.

    PubMed

    Triunfol, Marcia L; Hines, Pamela J

    2004-04-01

    Computer-mediated discussion lists, or list-servers, are popular tools in settings ranging from professional to personal to educational. A discussion list on genetically modified food (GMF) was created in September 2000 as part of the Forum on Genetically Modified Food developed by Science Controversies: Online Partnerships in Education (SCOPE), an educational project that uses computer resources to aid research and learning around unresolved scientific questions. The discussion list "GMF-Science" was actively supported from January 2001 to May 2002. The GMF-Science list welcomed anyone interested in discussing the controversies surrounding GMF. Here, we analyze the dynamics of the discussions and how the GMF-Science list may contribute to learning. Activity on the GMF-Science discussion list reflected some but not all the controversies that were appearing in more traditional publication formats, broached other topics not well represented in the published literature, and tended to leave undiscussed the more technical research developments.

  18. A user-friendly, dynamic web environment for remote data browsing and analysis of multiparametric geophysical data within the MULTIMO project

    NASA Astrophysics Data System (ADS)

    Carniel, Roberto; Di Cecca, Mauro; Jaquet, Olivier

    2006-05-01

    In the framework of the EU-funded project "Multi-disciplinary monitoring, modelling and forecasting of volcanic hazard" (MULTIMO), multiparametric data have been recorded at the MULTIMO station in Montserrat. Moreover, several other long time series, recorded at Montserrat and at other volcanoes, have been acquired in order to test stochastic and deterministic methodologies under development. Creating a general framework to handle data efficiently is a considerable task even for homogeneous data. In the case of heterogeneous data, this becomes a major issue. A need for a consistent way of browsing such a heterogeneous dataset in a user-friendly way therefore arose. Additionally, a framework for applying the calculation of the developed dynamical parameters to the data series was also needed in order to easily keep these parameters under control, e.g. for monitoring, research or forecasting purposes. The solution we present is completely based on Open Source software, including the Linux operating system, the MySQL database management system, the Apache web server, the Zope application server, the Scilab math engine, the Plone content management framework, and the Unified Modelling Language. From the user's point of view, the main advantage is the possibility of browsing through datasets recorded on different volcanoes, with different instruments, with different sampling frequencies, stored in different formats, all via a consistent, user-friendly interface that transparently runs queries to the database, gets the data from the main storage units, generates the graphs and produces dynamically generated web pages to interact with the user. The involvement of third parties for continuing the development in the Open Source philosophy and/or extending the application fields is now sought.

  19. A low-cost mobile adaptive tracking system for chronic pulmonary patients in home environment.

    PubMed

    Işik, Ali Hakan; Güler, Inan; Sener, Melahat Uzel

    2013-01-01

    The main objective of this study is presenting a real-time mobile adaptive tracking system for patients diagnosed with diseases such as asthma or chronic obstructive pulmonary disease and application results at home. The main role of the system is to support and track chronic pulmonary patients in real time who are comfortable in their home environment. It is not intended to replace the doctor, regular treatment, and diagnosis. In this study, the Java 2 micro edition-based system is integrated with portable spirometry, smartphone, extensible markup language-based Web services, Web server, and Web pages for visualizing pulmonary function test results. The Bluetooth(®) (Bluetooth SIG, Kirkland, WA) virtual serial port protocol is used to obtain the test results from spirometry. General packet radio service, wireless local area network, or third-generation-based wireless networks are used to send the test results from a smartphone to the remote database. The system provides real-time classification of test results with the back propagation artificial neural network algorithm on a mobile smartphone. It also provides the generation of appropriate short message service-based notification and sending of all data to the Web server. In this study, the test results of 486 patients, obtained from Atatürk Chest Diseases and Thoracic Surgery Training and Research Hospital in Ankara, Turkey, are used as the training and test set in the algorithm. The algorithm has 98.7% accuracy, 97.83% specificity, 97.63% sensitivity, and 0.946 correlation values. The results show that the system is cheap (900 Euros) and reliable. The developed real-time system provides improvement in classification accuracy and facilitates tracking of chronic pulmonary patients.

  20. Group-oriented coordination models for distributed client-server computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Hughes, Craig S.

    1994-01-01

    This paper describes group-oriented control models for distributed client-server interactions. These models transparently coordinate requests for services that involve multiple servers, such as queries across distributed databases. Specific capabilities include: decomposing and replicating client requests; dispatching request subtasks or copies to independent, networked servers; and combining server results into a single response for the client. The control models were implemented by combining request broker and process group technologies with an object-oriented communication middleware tool. The models are illustrated in the context of a distributed operations support application for space-based systems.
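
    The decompose/dispatch/combine behavior described above is a scatter-gather pattern. The minimal sketch below illustrates it with a thread pool and a simulated per-server query function; the server names and query are hypothetical, and the original work used request broker and process group middleware rather than Python.

```python
# Minimal scatter-gather sketch of the decompose/dispatch/combine pattern
# described above: a client query is split per server, sub-queries run
# concurrently, and partial results are merged into one response. The
# per-server query function is simulated, not a real distributed database.
from concurrent.futures import ThreadPoolExecutor

SERVERS = ["db-east", "db-west", "db-central"]   # hypothetical server names

def query_server(server: str, query: str) -> list[str]:
    """Stand-in for a remote call; returns that server's partial result."""
    return [f"{server}: row matching '{query}'"]

def distributed_query(query: str) -> list[str]:
    with ThreadPoolExecutor(max_workers=len(SERVERS)) as pool:
        partials = list(pool.map(lambda s: query_server(s, query), SERVERS))
    combined = []
    for rows in partials:                         # combine partial results
        combined.extend(rows)
    return combined

print(distributed_query("status = 'active'"))
```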

  1. National Medical Terminology Server in Korea

    NASA Astrophysics Data System (ADS)

    Lee, Sungin; Song, Seung-Jae; Koh, Soonjeong; Lee, Soo Kyoung; Kim, Hong-Gee

    Interoperable EHR (Electronic Health Record) necessitates at least the use of standardized medical terminologies. This paper describes a medical terminology server, LexCare Suite, which houses terminology management applications, such as a terminology editor, and a terminology repository populated with international standard terminology systems such as the Systematized Nomenclature of Medicine (SNOMED). The server is intended to satisfy the need for quality terminology systems in local primary to tertiary hospitals. Our partner general hospitals have used the server to test its applicability. This paper describes the server and the results of the applicability test.

  2. Fuzzy expert system for diagnosing diabetic neuropathy.

    PubMed

    Rahmani Katigari, Meysam; Ayatollahi, Haleh; Malek, Mojtaba; Kamkar Haghighi, Mehran

    2017-02-15

    To design a fuzzy expert system to help detect and diagnose the severity of diabetic neuropathy. The research was completed in 2014 and consisted of two main phases. In the first phase, the diagnostic parameters were determined based on a literature review and by investigating specialists' perspectives (n = 8). In the second phase, 244 medical records related to patients who were seen in an endocrinology and metabolism research centre during the first six months of 2014 and were primarily diagnosed with diabetic neuropathy were used to test the sensitivity, specificity, and accuracy of the fuzzy expert system. The final diagnostic parameters included the duration of diabetes, the score of a symptom examination based on the Michigan questionnaire, the score of a sign examination based on the Michigan questionnaire, the glycosylated haemoglobin level, fasting blood sugar, blood creatinine, and albuminuria. The output variable was the severity of diabetic neuropathy, shown as a number between zero and 10 and divided into four categories: absence of the disease and (by degree of severity) mild, moderate, and severe. The interface of the system was designed with ASP.Net (Active Server Pages Network Enabled Technology), and the system was tested in terms of sensitivity (true positive rate, 89%), specificity (true negative rate, 98%), and accuracy (the proportion of true results, both positive and negative, 93%). The system designed in this study can help specialists and general practitioners diagnose the disease more quickly and improve the quality of care for patients.

  3. Fuzzy expert system for diagnosing diabetic neuropathy

    PubMed Central

    Rahmani Katigari, Meysam; Ayatollahi, Haleh; Malek, Mojtaba; Kamkar Haghighi, Mehran

    2017-01-01

    AIM To design a fuzzy expert system to help detect and diagnose the severity of diabetic neuropathy. METHODS The research was completed in 2014 and consisted of two main phases. In the first phase, the diagnostic parameters were determined based on a literature review and by investigating specialists’ perspectives (n = 8). In the second phase, 244 medical records related to patients who were seen in an endocrinology and metabolism research centre during the first six months of 2014 and were primarily diagnosed with diabetic neuropathy were used to test the sensitivity, specificity, and accuracy of the fuzzy expert system. RESULTS The final diagnostic parameters included the duration of diabetes, the score of a symptom examination based on the Michigan questionnaire, the score of a sign examination based on the Michigan questionnaire, the glycosylated haemoglobin level, fasting blood sugar, blood creatinine, and albuminuria. The output variable was the severity of diabetic neuropathy, shown as a number between zero and 10 and divided into four categories: absence of the disease and (by degree of severity) mild, moderate, and severe. The interface of the system was designed with ASP.Net (Active Server Pages Network Enabled Technology), and the system was tested in terms of sensitivity (true positive rate, 89%), specificity (true negative rate, 98%), and accuracy (the proportion of true results, both positive and negative, 93%). CONCLUSION The system designed in this study can help specialists and general practitioners diagnose the disease more quickly and improve the quality of care for patients. PMID:28265346
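
    The abstract reports that the crisp 0-10 severity output is interpreted as four categories but does not publish the membership functions. The sketch below shows one plausible way to do such a mapping with triangular memberships; the breakpoints are invented assumptions, not the system's actual fuzzy sets.

```python
# Toy fuzzy-output interpretation: map the 0-10 severity score onto the four
# reported categories using triangular membership functions. The breakpoints
# and shapes are illustrative assumptions; the paper does not publish its
# actual membership functions or rule base.
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Membership of x in a triangle rising from a to peak b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def severity_category(score: float) -> str:
    memberships = {
        "absent":   triangular(score, -0.1, 0.0, 2.5),
        "mild":     triangular(score, 1.5, 3.5, 5.5),
        "moderate": triangular(score, 4.5, 6.5, 8.5),
        "severe":   triangular(score, 7.5, 10.0, 10.1),
    }
    return max(memberships, key=memberships.get)

for s in (0.5, 3.0, 6.0, 9.2):
    print(s, "->", severity_category(s))
```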

  4. Modeling Large-Scale Networks Using Virtual Machines and Physical Appliances

    DTIC Science & Technology

    2014-01-27

    downloaded and run locally. The lab solution couldn't be based on ActiveX because the military disallowed ActiveX support on its systems, which made running an RDP client over ActiveX not possible. The challenges the SEI encountered in delivering the instruction were ...

  5. Digital junk: food and beverage marketing on Facebook.

    PubMed

    Freeman, Becky; Kelly, Bridget; Baur, Louise; Chapman, Kathy; Chapman, Simon; Gill, Tim; King, Lesley

    2014-12-01

    We assessed the amount, reach, and nature of energy-dense, nutrient-poor (EDNP) food and beverage marketing on Facebook. We conducted a content analysis of the marketing techniques used by the 27 most popular food and beverage brand Facebook pages in Australia. We coded content across 19 marketing categories; data were collected from the day each page launched (mean = 3.65 years of activity per page). We analyzed 13 international pages and 14 Australian-based brand pages; 4 brands (Subway, Coca-Cola, Slurpee, Maltesers) had both national and international pages. Pages widely used marketing features unique to social media that increase consumer interaction and engagement. Common techniques were competitions based on user-generated content, interactive games, and apps. Four pages included apps that allowed followers to place an order directly through Facebook. Adolescent and young adult Facebook users appeared most receptive to engaging with this content. By using the interactive and social aspects of Facebook to market products, EDNP food brands capitalize on users' social networks and magnify the reach and personal relevance of their marketing messages.

  6. Digital Junk: Food and Beverage Marketing on Facebook

    PubMed Central

    Freeman, Becky; Kelly, Bridget; Baur, Louise; Chapman, Kathy; Chapman, Simon; Gill, Tim; King, Lesley

    2014-01-01

    Objectives. We assessed the amount, reach, and nature of energy-dense, nutrient-poor (EDNP) food and beverage marketing on Facebook. Methods. We conducted a content analysis of the marketing techniques used by the 27 most popular food and beverage brand Facebook pages in Australia. We coded content across 19 marketing categories; data were collected from the day each page launched (mean = 3.65 years of activity per page). Results. We analyzed 13 international pages and 14 Australian-based brand pages; 4 brands (Subway, Coca-Cola, Slurpee, Maltesers) had both national and international pages. Pages widely used marketing features unique to social media that increase consumer interaction and engagement. Common techniques were competitions based on user-generated content, interactive games, and apps. Four pages included apps that allowed followers to place an order directly through Facebook. Adolescent and young adult Facebook users appeared most receptive to engaging with this content. Conclusions. By using the interactive and social aspects of Facebook to market products, EDNP food brands capitalize on users’ social networks and magnify the reach and personal relevance of their marketing messages. PMID:25322294

  7. An Innovative Approach to Improve Communication and Reduce Physician Stress and Burnout in a University Affiliated Residency Program.

    PubMed

    Lapointe, Ryan; Bhesania, Siddharth; Tanner, Tristan; Peruri, Adithya; Mehta, Parag

    2018-05-28

    Ineffective communication between nursing staff and residents leads to numerous educational and patient-care interruptions, increasing resident stress and overall workload. We developed an innovative and simple, secure electronic health record (EHR)-based text paging system to communicate with internal medicine residents. The goal is to avoid unnecessary interruption during patient care or educational activities and to reduce stress. A traditional paging system can only send a phone number to call back. We developed and implemented a HIPAA-compliant, EHR-integrated text paging system at a busy 591-bed urban hospital. Access was granted to unit clerks, nursing staff, case managers, and physicians. Senders could either send a traditional telephone number page or a text page through our EHR. The recipient could then either acknowledge receipt of the page or take appropriate action. Afterward, internal medicine residents were polled on the overall difference in satisfaction between basic phone-based numeric paging and the enhanced EHR text paging system. Educational interruptions (averaging over 7 pages) decreased from 64% to 16%. Patient care interruptions fell from 68% to 12%. 88% of residents felt that 50% or less of the pages were non-emergent and did not require immediate action. 92% of 25 surveyed internal medicine residents preferred text paging over numeric paging and responded through the EHR 60% of the time by placing direct orders. Time savings using the new system over a 3-month span amounted to 72.5 h in transmission time alone. Text paging among medical caregivers and internal medicine residents through EHR-associated communication reduced patient care and educational interruptions. It saved time spent sending pages and answering unnecessary pages, and it improved residents' subjective stress and satisfaction levels.

  8. Ad Hoc Selection of Voice over Internet Streams

    NASA Technical Reports Server (NTRS)

    Macha, Mitchell G. (Inventor); Bullock, John T. (Inventor)

    2014-01-01

    A method and apparatus for a communication system technique involving ad hoc selection of at least two audio streams is provided. Each of the at least two audio streams is a packetized version of an audio source. A data connection exists between a server and a client where a transport protocol actively propagates the at least two audio streams from the server to the client. Furthermore, software instructions executable on the client indicate a presence of the at least two audio streams, allow selection of at least one of the at least two audio streams, and direct the selected at least one of the at least two audio streams for audio playback.
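
    The patent describes the client's selection step only abstractly; the short sketch below illustrates a client that lists the advertised streams and directs a selected one to playback. The stream names, URLs and functions are hypothetical placeholders, not part of the patented apparatus.

      # Illustrative sketch only: a client that lists advertised audio streams
      # and selects one for playback. Stream names/URLs are hypothetical.
      ADVERTISED_STREAMS = {                      # would normally come from the server
          "mission-loop-1": "rtp://server.example/loop1",
          "mission-loop-2": "rtp://server.example/loop2",
      }

      def choose_stream(streams, wanted):
          """Indicate the available streams and return the one selected."""
          print("Available streams:", ", ".join(sorted(streams)))
          if wanted not in streams:
              raise KeyError(f"{wanted!r} is not an advertised stream")
          return streams[wanted]

      def play(url):
          # Placeholder for handing the packetized stream to an audio decoder.
          print("Directing", url, "to audio playback")

      play(choose_stream(ADVERTISED_STREAMS, "mission-loop-2"))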

  9. Ad Hoc Selection of Voice over Internet Streams

    NASA Technical Reports Server (NTRS)

    Macha, Mitchell G. (Inventor); Bullock, John T. (Inventor)

    2008-01-01

    A method and apparatus for a communication system technique involving ad hoc selection of at least two audio streams is provided. Each of the at least two audio streams is a packetized version of an audio source. A data connection exists between a server and a client where a transport protocol actively propagates the at least two audio streams from the server to the client. Furthermore, software instructions executable on the client indicate a presence of the at least two audio streams, allow selection of at least one of the at least two audio streams, and direct the selected at least one of the at least two audio streams for audio playback.

  10. WMS Server 2.0

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian; Wood, James F.

    2012-01-01

    This software is a simple, yet flexible server of raster map products, compliant with the Open Geospatial Consortium (OGC) Web Map Service (WMS) 1.1.1 protocol. The server is a full implementation of the OGC WMS 1.1.1 as a fastCGI client, using the Geospatial Data Abstraction Library (GDAL) for data access. The server can operate in a proxy mode, where all or part of the WMS requests are carried out on a back-end server. The server has explicit support for a colocated tiled WMS, including rapid response to black (no-data) requests. It generates JPEG and PNG images, including 16-bit PNG. The GDAL back-end support allows great flexibility in data access. The server is a port to a Linux/GDAL platform from the original IRIX/IL platform. It is simpler to configure and use, and depending on the storage format used, it has better performance than other available implementations. The WMS server 2.0 is a high-performance WMS implementation due to the fastCGI architecture. The use of a GDAL data back end allows for great flexibility. The configuration is relatively simple, based on a single XML file. It provides scaling and cropping, as well as blending of multiple layers based on layer transparency.
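
    For readers unfamiliar with the WMS 1.1.1 protocol this server implements, the snippet below builds a typical GetMap request in Python; the host and layer names are placeholders and the parameter set is the generic OGC one, not a configuration documented for this particular server.

      # Sketch of an OGC WMS 1.1.1 GetMap request (host/layer names are placeholders).
      import urllib.parse
      import urllib.request

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.1.1",
          "REQUEST": "GetMap",
          "LAYERS": "global_mosaic",          # hypothetical layer name
          "STYLES": "",
          "SRS": "EPSG:4326",                 # WMS 1.1.1 uses SRS (1.3.0 uses CRS)
          "BBOX": "-180,-90,180,90",          # minx,miny,maxx,maxy
          "WIDTH": "1024",
          "HEIGHT": "512",
          "FORMAT": "image/jpeg",
      }
      url = "http://wms.example.org/wms?" + urllib.parse.urlencode(params)
      with urllib.request.urlopen(url) as resp:    # returns the rendered map image
          open("map.jpg", "wb").write(resp.read())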

  11. Virtual network computing: cross-platform remote display and collaboration software.

    PubMed

    Konerding, D E

    1999-04-01

    VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits them back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, and they are unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.

  12. How to securely replicate services (preliminary version)

    NASA Technical Reports Server (NTRS)

    Reiter, Michael; Birman, Kenneth

    1992-01-01

    A method is presented for constructing replicated services that retain their availability and integrity despite several servers and clients being corrupted by an intruder, in addition to others failing benignly. More precisely, a service is replicated by 'n' servers in such a way that a correct client will accept a correct server's response if, for some prespecified parameter k, at least k servers are correct and fewer than k servers are corrupt. The issue of maintaining causality among client requests is also addressed. A security breach resulting from an intruder's ability to effect a violation of causality in the sequence of requests processed by the service is illustrated. An approach to counter this problem is proposed that requires that fewer than k servers are corrupt and, to ensure liveness, that k is less than or equal to n - 2t, where t is the assumed maximum total number of both corruptions and benign failures suffered by servers in any system run. An important and novel feature of these schemes is that the client need not be able to identify or authenticate even a single server. Instead, the client is required only to possess at most two public keys for the service.
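
    The acceptance rule in the abstract (accept a response only when enough servers vouch for it) can be illustrated in a few lines; the sketch below is a generic k-of-n voting client written under that reading and omits the paper's cryptographic and causality machinery.

      # Generic k-of-n voting sketch: accept a reply only if at least k of the n
      # replicated servers returned it. Purely illustrative; no authentication.
      from collections import Counter

      def accept_response(replies, k):
          """replies: list of responses, one per reachable server."""
          votes = Counter(replies)
          response, count = votes.most_common(1)[0]
          if count >= k:
              return response          # vouched for by at least k servers
          raise RuntimeError("no response reached the quorum of k servers")

      # Example: n = 5 servers, quorum k = 3.
      print(accept_response(["ok", "ok", "tampered", "ok", "ok"], k=3))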

  13. Autoplot and the HAPI Server

    NASA Astrophysics Data System (ADS)

    Faden, J.; Vandegriff, J. D.; Weigel, R. S.

    2016-12-01

    Autoplot was introduced in 2008 as an easy-to-use plotting tool for the space physics community. It reads data from a variety of file resources, such as CDF and HDF files, and from a number of specialized data servers, such as the PDS/PPI's DIT-DOS, CDAWeb, and the University of Iowa's RPWG Das2Server. Each of these servers has optimized methods for transmitting data to display in Autoplot, but requires coordination and specialized software to work, limiting Autoplot's ability to access new servers and datasets. Likewise, groups who would like software to access their APIs must either write their own clients, or publish a specification document in hopes that people will write clients. The HAPI specification was written so that a simple, standard API could be used by both Autoplot and server implementations, to remove these barriers to the free flow of time series data. Autoplot's software for communicating with HAPI servers is presented, showing the user interface scientists will use, and how data servers might implement the HAPI specification to provide access to their data. This will also include instructions on how Autoplot is installed on desktop computers and used to view data from the RBSP, Juno, and other missions.
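
    To give a concrete sense of what the HAPI API looks like on the wire, the sketch below issues a catalogue request and a data request against a hypothetical HAPI endpoint; the host, dataset id and time range are placeholders, and the parameter names follow the HAPI 2.x convention (id, time.min, time.max), which is an assumption rather than something stated in the abstract.

      # Sketch of HAPI 2.x-style requests against a hypothetical server.
      # Host, dataset id and time range are placeholders.
      import json
      import urllib.parse
      import urllib.request

      BASE = "https://hapi.example.org/hapi"

      def hapi_get(endpoint, params=None):
          url = f"{BASE}/{endpoint}"
          if params:
              url += "?" + urllib.parse.urlencode(params)
          with urllib.request.urlopen(url) as resp:
              return resp.read().decode()

      catalog = json.loads(hapi_get("catalog"))            # list of dataset ids
      print([entry["id"] for entry in catalog["catalog"]][:5])

      csv_data = hapi_get("data", {                         # time series as CSV
          "id": "my_dataset",                               # placeholder dataset id
          "time.min": "2016-01-01T00:00:00Z",
          "time.max": "2016-01-02T00:00:00Z",
      })
      print(csv_data.splitlines()[:3])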

  14. Providing Internet Access to High-Resolution Mars Images

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2008-01-01

    The OnMars server is a computer program that provides Internet access to high-resolution Mars images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of Mars. The OnMars server is an implementation of the Open Geospatial Consortium (OGC) Web Map Service (WMS) server. Unlike other Mars Internet map servers that provide Martian data using an Earth coordinate system, the OnMars WMS server supports encoding of data in Mars-specific coordinate systems. The OnMars server offers access to most of the available high-resolution Martian image and elevation data, including an 8-meter-per-pixel uncontrolled mosaic of most of the Mars Global Surveyor (MGS) Mars Observer Camera Narrow Angle (MOCNA) image collection, which is not available elsewhere. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. The OnMars server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.

  15. 75 FR 11890 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ... housed on a secure server and database. The results of the survey shall be used for inpatient quality... of records are necessary to ensure the well-being and safety of patients and that professional...

  16. Distributed metadata servers for cluster file systems using shared low latency persistent key-value metadata store

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Pedone, Jr., James M.

    A cluster file system is provided having a plurality of distributed metadata servers with shared access to one or more shared low latency persistent key-value metadata stores. A metadata server comprises an abstract storage interface comprising a software interface module that communicates with at least one shared persistent key-value metadata store providing a key-value interface for persistent storage of key-value metadata. The software interface module provides the key-value metadata to the at least one shared persistent key-value metadata store in a key-value format. The shared persistent key-value metadata store is accessed by a plurality of metadata servers. A metadata request can be processed by a given metadata server independently of other metadata servers in the cluster file system. A distributed metadata storage environment is also disclosed that comprises a plurality of metadata servers having an abstract storage interface to at least one shared persistent key-value metadata store.
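
    The "abstract storage interface" described here is essentially a thin key-value facade in front of whatever store backs the metadata; the sketch below shows one possible shape for such an interface. The class and method names are invented for illustration and are not taken from the patent.

      # Illustrative shape of an abstract key-value metadata interface; names are
      # invented, and the in-memory backend merely stands in for a shared,
      # low-latency persistent store.
      from abc import ABC, abstractmethod

      class MetadataStore(ABC):
          """Key-value interface a metadata server would program against."""

          @abstractmethod
          def put(self, key: str, value: bytes) -> None: ...

          @abstractmethod
          def get(self, key: str) -> bytes: ...

      class InMemoryStore(MetadataStore):
          def __init__(self):
              self._data = {}

          def put(self, key, value):
              self._data[key] = value        # a real backend would persist this

          def get(self, key):
              return self._data[key]

      class MetadataServer:
          """Any number of these can share one store and serve requests independently."""
          def __init__(self, store: MetadataStore):
              self.store = store

          def handle_create(self, path, inode_bytes):
              self.store.put(f"inode:{path}", inode_bytes)

          def handle_lookup(self, path):
              return self.store.get(f"inode:{path}")

      shared = InMemoryStore()
      a, b = MetadataServer(shared), MetadataServer(shared)
      a.handle_create("/projects/run1", b"inode-metadata")
      print(b.handle_lookup("/projects/run1"))     # visible via the shared store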

  17. An assessment of burn prevention knowledge in a high burn-risk environment: restaurants.

    PubMed

    Piazza-Waggoner, Carrie; Adams, C D; Goldfarb, I W; Slater, H

    2002-01-01

    Our facility has seen an increase in the number of cases of children burned in restaurants. Fieldwork has revealed many unsafe serving practices in restaurants in our tristate area. The current research targets what appears to be an underexamined burn-risk environment, restaurants, to examine server knowledge about burn prevention and burn care with customers. Participants included 71 local restaurant servers and 53 servers from various restaurants who were recruited from undergraduate courses. All participants completed a brief demographic form as well as a Burn Knowledge Questionnaire. It was found that server knowledge was low (ie, less than 50% accuracy). Yet, most servers reported that they felt customer burn safety was important enough to change the way that they serve. Additionally, it was found that length of time employed as a server was a significant predictor of servers' burn knowledge (ie, more years serving associated with higher knowledge). Finally, individual items were examined to identify potential targets for developing prevention programs.

  18. xDSL connection monitor

    DOEpatents

    Horton, John J.

    2006-04-11

    A system and method of maintaining communication between a computer and a server, the server being in communication with the computer via xDSL service or dial-up modem service, with xDSL service being the default mode of communication, the method including sending a request to the server via xDSL service to which the server should respond and determining if a response has been received. If no response has been received, displaying on the computer a message (i) indicating that xDSL service has failed and (ii) offering to establish communication between the computer and the server via the dial-up modem, and thereafter changing the default mode of communication between the computer and the server to dial-up modem service. In a preferred embodiment, an xDSL service provider monitors dial-up modem communications and determines if the computer dialing in normally establishes communication with the server via xDSL service. The xDSL service provider can thus quickly and easily detect xDSL failures.
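
    The failover logic in this claim is simple enough to sketch; the snippet below probes a hypothetical status URL over the xDSL link and, when it does not respond, switches the default mode of communication to dial-up. The URL, timeout and dial-up hook are assumptions made for illustration.

      # Sketch of the described failover: probe the server over xDSL and, if it
      # does not respond, switch the default mode to dial-up. URL, timeout and
      # the dial-up hook are illustrative assumptions.
      import urllib.request

      PROBE_URL = "http://provider.example/status"     # hypothetical probe endpoint
      default_mode = "xdsl"

      def xdsl_responding(timeout=5.0):
          try:
              with urllib.request.urlopen(PROBE_URL, timeout=timeout):
                  return True
          except OSError:
              return False

      def ensure_connection():
          global default_mode
          if default_mode == "xdsl" and not xdsl_responding():
              print("xDSL service has failed; establishing dial-up connection instead")
              default_mode = "dialup"                  # dial-up becomes the new default
          return default_mode

      print("communicating via", ensure_connection())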

  19. Rclick: a web server for comparison of RNA 3D structures.

    PubMed

    Nguyen, Minh N; Verma, Chandra

    2015-03-15

    RNA molecules play important roles in key biological processes in the cell and are becoming attractive for developing therapeutic applications. Since the function of RNA depends on its structure and dynamics, comparing and classifying the RNA 3D structures is of crucial importance to molecular biology. In this study, we have developed Rclick, a web server that is capable of superimposing RNA 3D structures by using clique matching and 3D least-squares fitting. Our server Rclick has been benchmarked and compared with other popular servers and methods for RNA structural alignments. In most cases, Rclick alignments were better in terms of structure overlap. Our server also recognizes conformational changes between structures. For this purpose, the server produces complementary alignments to maximize the extent of detectable similarity. Various examples showcase the utility of our web server for comparison of RNA, RNA-protein complexes and RNA-ligand structures. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
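
    The "3D least-squares fitting" step that Rclick relies on is the standard rigid-body superposition problem; a minimal numpy version (the Kabsch algorithm) is sketched below on toy coordinates. It illustrates only the fitting step, not Rclick's clique matching or scoring.

      # Minimal Kabsch superposition sketch (numpy): least-squares fit of one set
      # of 3D coordinates onto another. Toy coordinates only; this is the generic
      # algorithm, not Rclick's clique-matching pipeline.
      import numpy as np

      def kabsch_rmsd(P, Q):
          """Optimally superpose P onto Q (both N x 3) and return the RMSD."""
          P = P - P.mean(axis=0)                 # remove translation
          Q = Q - Q.mean(axis=0)
          H = P.T @ Q                            # covariance matrix
          U, S, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T)) # avoid an improper rotation
          D = np.diag([1.0, 1.0, d])
          R = Vt.T @ D @ U.T                     # optimal rotation
          P_rot = P @ R.T
          return float(np.sqrt(((P_rot - Q) ** 2).sum() / len(P)))

      rng = np.random.default_rng(0)
      Q = rng.normal(size=(10, 3))
      theta = 0.7
      Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
                     [np.sin(theta),  np.cos(theta), 0],
                     [0, 0, 1]])
      P = Q @ Rz.T + np.array([1.0, -2.0, 0.5])  # rotated and translated copy
      print(round(kabsch_rmsd(P, Q), 6))         # ~0.0 for a pure rigid transform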

  20. Fibrinogenolytic and anticoagulant activities in the tissue covering the stingers of marine stingrays Dasyatis sephen and Aetobatis narinari.

    PubMed

    Kumar, Kalainesan Rajesh; Vennila, Rathinam; Kanchana, Shankar; Arumugam, Muthuvel; Balasubramaniam, Thangavel

    2011-05-01

    Stingray envenomation is one of the major problems in marine and freshwater ecosystems. Accidents in humans cause immediate, local and intense pain, erythema, edema, hemorrhage and tissue necrosis; secondary bacterial infection is also common. The aim was to determine the effect of venom extracts from two marine stingray species, Dasyatis sephen and Aetobatis narinari, on coagulation and on fibrin(ogen)olytic and proteolytic activities. Plasma coagulation, thrombin-catalysed fibrinocoagulation, fibrin plate assay, sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE), substrate SDS-PAGE and a thrombin-like activity assay using a chromogenic substrate were used to determine the effect of the venom on plasma coagulation and its fibrin(ogen)olytic and proteolytic activities. The results show the presence of fibrin(ogen)olytic, anticoagulant and gelatinolytic activity in both stingray venom extracts. D. sephen venom delays coagulation of citrated plasma more markedly than A. narinari venom as the venom concentration increases. The same results were obtained in the fibrinocoagulation assays. SDS-PAGE analysis of fibrinogen and fibrin after incubation with D. sephen and A. narinari venom shows fibrin(ogen)olytic activity. SDS-PAGE analysis confirms that the delay in the coagulation process caused by stingray venom is due to its fibrin(ogen)olytic activity, and the fibrinolytic activity was also confirmed through the fibrin plate assay. Zymogram analysis shows the presence of an array of gelatinolytic and fibrinogenolytic enzymes of 43-276 kDa in the D. sephen and A. narinari venoms, respectively. Protease inhibitor studies show that serine and metalloproteases are responsible for these activities. The fibrinogenolytic and proteolytic activities of the stingray venoms are thus confirmed, although they have no thrombin-like activity; these activities may contribute to hemorrhage, tissue necrosis and secondary bacterial infection at the site of envenomation.

  1. The importance of social media for patients and families affected by congenital anomalies: A Facebook cross-sectional analysis and user survey.

    PubMed

    Jacobs, Robyn; Boyd, Leanne; Brennan, Kirsty; Sinha, C K; Giuliani, Stefano

    2016-11-01

    We aimed to define characteristics and needs of Facebook users in relation to congenital anomalies. Cross-sectional analysis of Facebook related to four congenital anomalies: anorectal malformation (ARM), congenital diaphragmatic hernia (CDH), congenital heart disease (CHD) and hypospadias/epispadias (HS/ES). A keyword search was performed to identify relevant Groups/Pages. An anonymous survey was posted to obtain quantitative/qualitative data on users and their healthcare needs. 54 Groups and 24 Pages were identified (ARM: 10 Groups; CDH: 9 Groups, 7 Pages; CHD: 32 Groups, 17 Pages; HS/ES: 3 Groups), with 16,191 Group members and 48,766 Page likes. 868/1103 (79%) of respondents were parents. Male:female ratio was 1:10.9. 65% of the users were 26-40years old. Common reasons for joining these Groups/Pages included: seeking support, education, making friends, and providing support to others. 932/1103 (84%) would like healthcare professionals (HCPs) to actively participate in their Group. 31% of the respondents felt that they did not receive enough support from their healthcare system. 97% of the respondents would like to join a Group linked to their primary hospital. Facebook Groups/Pages related to congenital anomalies are highly populated and active. There is a need for HCPs and policy makers to better understand and participate in social media to support families and improve patient care. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. A General Purpose Connections type CTI Server Based on SIP Protocol and Its Implementation

    NASA Astrophysics Data System (ADS)

    Watanabe, Toru; Koizumi, Hisao

    In this paper, we propose a general-purpose connections-type CTI (Computer Telephony Integration) server that provides various CTI services, such as voice logging, in which the CTI server communicates with an IP-PBX using SIP (the Session Initiation Protocol) and accumulates the voice packets of external-line telephone calls flowing between an extension IP telephone and a VoIP gateway connected to outside line networks. The CTI server realizes CTI services such as voice logging, telephone conferencing, or IVR (interactive voice response) by accumulating and processing the sampled voice packets. Furthermore, the CTI server incorporates a web server function which can provide various CTI services, such as a Web telephone directory, via a Web browser to PCs, cellular telephones or smart-phones in mobile environments.

  3. User-Friendly Data Servers for Climate Studies at the Asia-Pacific Data-Research Center (APDRC)

    NASA Astrophysics Data System (ADS)

    Yuan, G.; Shen, Y.; Zhang, Y.; Merrill, R.; Waseda, T.; Mitsudera, H.; Hacker, P.

    2002-12-01

    The APDRC was recently established within the International Pacific Research Center (IPRC) at the University of Hawaii. The APDRC mission is to increase understanding of climate variability in the Asia-Pacific region by developing the computational, data-management, and networking infrastructure necessary to make data resources readily accessible and usable by researchers, and by undertaking data-intensive research activities that will both advance knowledge and lead to improvements in data preparation and data products. A focus of recent activity is the implementation of user-friendly data servers. The APDRC is currently running a Live Access Server (LAS) developed at NOAA/PMEL to provide access to and visualization of gridded climate products via the web. The LAS also allows users to download the selected data subsets in various formats (such as binary, netCDF and ASCII). Most of the datasets served by the LAS are also served through our OPeNDAP server (formerly DODS), which allows users to directly access the data using their desktop client tools (e.g. GrADS, Matlab and Ferret). In addition, the APDRC is running an OPeNDAP Catalog/Aggregation Server (CAS) developed by Unidata at UCAR to serve climate data and products such as model output and satellite-derived products. These products are often large (> 2 GB) and are therefore stored as multiple files (stored separately in time or in parameters). The CAS remedies the inconvenience of multiple files and allows access to the whole dataset (or any subset that cuts across the multiple files) via a single request command from any DODS enabled client software. Once the aggregation of files is configured at the server (CAS), the process of aggregation is transparent to the user. The user only needs to know a single URL for the entire dataset, which is, in fact, stored as multiple files. CAS even allows aggregation of files on different systems and at different locations. Currently, the APDRC is serving NCEP, ECMWF, SODA, WOCE-Satellite, TMI, GPI and GSSTF products through the CAS. The APDRC is also running an EPIC server developed by PMEL/NOAA. EPIC is a web-based, data search and display system suited for in situ (station versus gridded) data. The process of locating and selecting individual station data from large collections (millions of profiles or time series, etc.) of in situ data is a major challenge. Serving in situ data on the Internet faces two problems: the irregularity of data formats; and the large quantity of data files. To solve the first problem, we have converted the in situ data into netCDF data format. The second problem was solved by using the EPIC server, which allows users to easily subset the files using a friendly graphical interface. Furthermore, we enhanced the capability of EPIC and configured OPeNDAP into EPIC to serve the numerous in situ data files and to export them to users through two different options: 1) an OPeNDAP pointer file of user-selected data files; and 2) a data package that includes meta-information (e.g., location, time, cruise no, etc.), a local pointer file, and the data files that the user selected. Option 1) is for those who do not want to download the selected data but want to use their own application software (such as GrADS, Matlab and Ferret) for access and analysis; option 2) is for users who want to store the data on their own system (e.g. laptops before going for a cruise) for subsequent analysis. 
    Currently, WOCE CTD and bottle data, the WOCE current meter data, and some Argo float data are being served on the EPIC server.
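
    Because the APDRC datasets are exposed through OPeNDAP, a desktop client can open them directly by URL; the sketch below does so with the netCDF4 Python library. The URL and variable name are placeholders, since the actual catalogue paths are not given in the abstract.

      # Sketch of direct OPeNDAP access from a desktop client; the dataset URL and
      # variable name are hypothetical placeholders for an APDRC catalogue entry.
      from netCDF4 import Dataset          # netCDF4 is typically built with OPeNDAP support

      URL = "http://apdrc.example.edu/dods/some_aggregated_dataset"   # placeholder

      ds = Dataset(URL)                    # no download of the whole file is needed
      print(ds.variables.keys())           # discover what the aggregation contains

      sst = ds.variables["sst"]            # hypothetical variable name
      subset = sst[0, 10:20, 30:40]        # only this subset crosses the network
      print(subset.shape)
      ds.close()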

  4. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information and warnings. The app can distinguish earthquake shaking from daily human activities based on the different patterns behind the movements. It can also be triggered by a traditional earthquake early warning (EEW) system to record for a certain amount of time to collect earthquake data. When the app is triggered by earthquake-like movement, it sends the trigger information, which contains the time and location of the trigger, back to our server; at the same time, it stores the waveform data on the local phone first and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find the coherent signal that confirms an earthquake. The app therefore provides the basis for a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings for large earthquakes in many regions around the world.
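
    The abstract sketches a two-step upload (a small trigger message immediately, the waveform later); the snippet below shows what that first step could look like as an HTTP POST of a JSON payload. The endpoint, field names and even the use of HTTP are assumptions for illustration, not the app's actual wire format.

      # Illustrative trigger upload: a small JSON message sent immediately, with
      # the heavier waveform data deferred. Endpoint and field names are hypothetical.
      import json
      import time
      import urllib.request

      TRIGGER_URL = "https://myshake.example.org/trigger"   # placeholder endpoint

      def send_trigger(lat, lon):
          payload = {
              "trigger_time": time.time(),     # epoch seconds at detection
              "latitude": lat,
              "longitude": lon,
              "device_id": "phone-1234",       # hypothetical identifier
          }
          req = urllib.request.Request(
              TRIGGER_URL,
              data=json.dumps(payload).encode(),
              headers={"Content-Type": "application/json"},
          )
          with urllib.request.urlopen(req) as resp:
              return resp.status               # waveform upload would follow later

      # send_trigger(37.87, -122.26)           # example call (Berkeley, CA)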

  5. SETTER: web server for RNA structure comparison

    PubMed Central

    Čech, Petr; Svozil, Daniel; Hoksza, David

    2012-01-01

    The recent discoveries of regulatory non-coding RNAs changed our view of RNA as a simple information transfer molecule. Understanding the architecture and function of active RNA molecules requires methods for comparing and analyzing their 3D structures. While structural alignment of short RNAs is achievable in a reasonable amount of time, large structures represent a much bigger challenge. Here, we present the SETTER web server for pairwise RNA structure comparison utilizing the SETTER (SEcondary sTructure-based TERtiary Structure Similarity Algorithm) algorithm. The SETTER method divides an RNA structure into a set of non-overlapping structural elements called generalized secondary structure units (GSSUs). The SETTER algorithm scales as O(n²) with the size of a GSSU and as O(n) with the number of GSSUs in the structure. This scaling gives SETTER its high speed, as the average size of the GSSU remains constant irrespective of the size of the structure. However, the favorable speed of the algorithm does not compromise its accuracy. The SETTER web server, together with the stand-alone implementation of the SETTER algorithm, is freely accessible at http://siret.cz/setter. PMID:22693209

  6. A Web Server and Mobile App for Computing Hemolytic Potency of Peptides.

    PubMed

    Chaudhary, Kumardeep; Kumar, Ritesh; Singh, Sandeep; Tuknait, Abhishek; Gautam, Ankur; Mathur, Deepika; Anand, Priya; Varshney, Grish C; Raghava, Gajendra P S

    2016-03-08

    Numerous therapeutic peptides do not enter clinical trials simply because of their high hemolytic activity. Recently, we developed a database, Hemolytik, for maintaining experimentally validated hemolytic and non-hemolytic peptides. The present study describes a web server and mobile app developed for predicting and screening peptides with hemolytic potency. Firstly, we generated a dataset, HemoPI-1, that contains 552 hemolytic peptides extracted from the Hemolytik database and 552 random non-hemolytic peptides (from Swiss-Prot). The sequence analysis of these peptides revealed that certain residues (e.g., L, K, F, W) and motifs (e.g., "FKK", "LKL", "KKLL", "KWK", "VLK", "CYCR", "CRR", "RFC", "RRR", "LKKL") are more abundant in hemolytic peptides. Therefore, we developed models for discriminating hemolytic and non-hemolytic peptides using various machine learning techniques and achieved more than 95% accuracy. We also developed models for discriminating peptides having high and low hemolytic potential on different datasets called HemoPI-2 and HemoPI-3. In order to serve the scientific community, we developed a web server, mobile app and JAVA-based standalone software (http://crdd.osdd.net/raghava/hemopi/).
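
    The abstract does not specify the feature encoding or the learning algorithm beyond "various machine learning techniques"; as an illustration of the general approach (composition-style sequence features plus a standard classifier), the sketch below trains a logistic regression on toy peptide data. The sequences, labels and model choice are assumptions, not the published HemoPI pipeline.

      # Toy sketch of sequence-composition features + a standard classifier; the
      # peptides, labels and model choice are illustrative, not the HemoPI pipeline.
      from sklearn.linear_model import LogisticRegression

      AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

      def composition(seq):
          """Fraction of each amino acid in the peptide (20-dimensional feature)."""
          return [seq.count(aa) / len(seq) for aa in AMINO_ACIDS]

      # Tiny made-up training set: 1 = hemolytic, 0 = non-hemolytic.
      peptides = ["FKKLKKLF", "KWKLFKKI", "LKLLKKLL", "GSGSGTDE", "AEDTNSSG", "QQNNEDTS"]
      labels   = [1, 1, 1, 0, 0, 0]

      X = [composition(p) for p in peptides]
      model = LogisticRegression(max_iter=1000).fit(X, labels)

      print(model.predict_proba([composition("KKLLKWLK")])[0][1])   # P(hemolytic)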

  7. The NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Paulson, Sharon S.; Binkley, Robert L.; Kellogg, Yvonne D.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael J.; Accomazzi, Alberto

    1995-01-01

    The National Aeronautics and Space Act of 1958 established NASA and charged it to "provide for the widest practicable and appropriate dissemination of information concerning its activities and the results thereof." The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. The NTRS comprises several units, some constructed especially for inclusion in NTRS, and others that are existing NASA publication services that NTRS reuses. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the service. The NTRS is largely constructed with freely available software running on existing hardware. NTRS builds upon existing hardware and software, and the resulting additional exposure for the body of literature it contains ensures that NASA's institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.

  8. Definitions of Digestive Terms

    MedlinePlus

    ... usual, and loose stools. May be chronic or acute (Ogilvie's syndrome). Quality of life: perception of ability to meet daily needs, physical activities, well-being. Radiation proctitis: bleeding, mucous and bloody discharge, spasm of ...

  9. Implementing TCP/IP and a socket interface as a server in a message-passing operating system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hipp, E.; Wiltzius, D.

    1990-03-01

    The UNICOS 4.3BSD network code and socket transport interface are the basis of an explicit network server for NLTSS, a message passing operating system on the Cray YMP. A BSD socket user library provides access to the network server using an RPC mechanism. The advantages of this server methodology are its modularity and extensibility to migrate to future protocol suites (e.g. OSI) and transport interfaces. In addition, the network server is implemented in an explicit multi-tasking environment to take advantage of the Cray YMP multi-processor platform. 19 refs., 5 figs.

  10. Single-server blind quantum computation with quantum circuit model

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoqian; Weng, Jian; Li, Xiaochun; Luo, Weiqi; Tan, Xiaoqing; Song, Tingting

    2018-06-01

    Blind quantum computation (BQC) enables a client who has few quantum technologies to delegate her quantum computation to a server that has strong quantum computational abilities yet learns nothing about the client's quantum inputs, outputs and algorithms. In this article, we propose a single-server BQC protocol in the quantum circuit model that replaces any quantum gate with a combination of rotation operators. Trap quantum circuits are introduced, together with the combination of rotation operators, so that the server learns nothing about the quantum algorithms. The client only needs to perform the operations X and Z, while the server honestly performs rotation operators.
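
    The claim that an arbitrary gate can be rewritten as rotations can be checked numerically; the sketch below verifies one standard identity, H = exp(i*pi/2) Ry(pi/2) Rz(pi), with numpy. It is only a worked example of gate-to-rotation decomposition, not the protocol's trap-circuit construction.

      # Numerical check that a fixed gate decomposes into rotation operators:
      # H = exp(i*pi/2) * Ry(pi/2) * Rz(pi). Illustrates gate-to-rotation
      # rewriting only; it is not the paper's trap-circuit construction.
      import numpy as np

      def Rz(theta):
          return np.array([[np.exp(-1j * theta / 2), 0],
                           [0, np.exp(1j * theta / 2)]])

      def Ry(theta):
          c, s = np.cos(theta / 2), np.sin(theta / 2)
          return np.array([[c, -s], [s, c]])

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
      decomposed = np.exp(1j * np.pi / 2) * Ry(np.pi / 2) @ Rz(np.pi)

      print(np.allclose(decomposed, H))   # True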

  11. An Evaluation of Alternative Designs for a Grid Information Service

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Waheed, Abdul; Meyers, David; Yan, Jerry; Kwak, Dochan (Technical Monitor)

    2001-01-01

    The Globus information service wasn't working well. There were many updates of data from Globus daemons which saturated the single server and users couldn't retrieve information. We created a second server for NASA and Alliance. Things were great on that server, but a bit slow on the other server. We needed to know exactly how the information service was being used. What were the best servers and configurations? This viewgraph presentation gives an overview of the evaluation of alternative designs for a Grid Information Service. Details are given on the workload characterization, methodology used, and the performance evaluation.

  12. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7798, September 2016, US Army Research Laboratory. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server, by Christian D Schlesiger, Computational and Information Sciences Directorate, ARL.

  13. PREDICT: Privacy and Security Enhancing Dynamic Information Monitoring

    DTIC Science & Technology

    2015-08-03

    consisting of global server-side probabilistic assignment by an untrusted server using cloaked locations, followed by feedback-loop guided local ... these methods achieve high sensing coverage with low cost using cloaked locations [3]. In follow-on work, the issue of mobility is addressed. Task ...

  14. Performance Modeling of the ADA Rendezvous

    DTIC Science & Technology

    1991-10-01

    queueing network of figure 2, SERVERTASK can complete only one rendezvous at a time. Thus, the rate that the rendezvous requests are processed at the... Network 1, SERVERTASK competes with the traffic tasks of Server Processor. Each time SERVERTASK gains access to the processor, SERVERTASK completes... [Figure 10, "A conceptualization of the algorithm", shows the Client Processor, Server Processor and Software Server of Network 2.] The SERVERTASK software server of Network 2

  15. Remote Adaptive Communication System

    DTIC Science & Technology

    2001-10-25

    manage several different devices using the software tool ... A. Client/Server Architecture: The architecture we are proposing is based on the Client/Server model (see figure 3). We want both client and server to be accessible from anywhere via the Internet. The computer, acting as a server, is in... the other hand, each of the client applications will act as sender or receiver, depending on the associated interface: user interface or device

  16. GPS Ocean Reflection Experiment (GORE) Wind Explorer (WindEx) Instrument Design and Development

    NASA Astrophysics Data System (ADS)

    Ganoe, G.

    2004-12-01

    This paper describes the design and development of the WindEx instrument, and the technology implemented by it. The important design trades will be covered along with the justification for the options selected. An evaluation of the operation of the instrument, and plans for continued development and enhancements will also be given. The WindEx instrument consists of a processor that receives data from an included GPS Surface reflection receiver, and computes ocean surface wind speeds in real time utilizing an algorithm developed at LaRC by Dr. Stephen J. Katzberg. The WindEx performs a windspeed server function as well as acting as a repository for the client moving map applications, and providing a web page with instructions on the installation and use of the WindEx system. The server receives the GPS reflection data produced by the receiver, performs wind speed processing, then makes the wind speed data available as a moving map display to requesting client processors on the aircraft network. The client processors are existing systems used by the research personnel onboard. They can be configured to be WINDEX clients by downloading the Java client application from the WINDEX server. The client application provides a graphical display of a moving map that shows the aircraft position along with the position of the reflection point from the surface of the ocean where the wind speed is being estimated, and any coastlines within the field of view. Information associated with the reflection point includes the estimated wind speed, and a confidence factor that gives the researcher an idea about the reliability of the wind speed measurement. The instrument has been installed on one of NOAA's Hurricane Hunters, a Gulfstream IV, whose nickname is "Gonzo". Based at MacDill AFB, Florida, "Gonzo" flies around the periphery of the storm deploying GPS-based dropsondes which measure local winds. The dropsondes are the "gold-standard" for determining surface winds, but can only be deployed sparingly. The GPS WindEx system allows for a continuous map between dropsonde releases as well as monitoring the ocean surface for suspicious areas. The GPS technique is insensitive to clouds or rain and can give information concerning surface conditions not available to the flight crew.

  17. Database architectures for Space Telescope Science Institute

    NASA Astrophysics Data System (ADS)

    Lubow, Stephen

    1993-08-01

    At STScI nearly all large applications require database support. A general purpose architecture has been developed and is in use that relies upon an extended client-server paradigm. Processing is in general distributed across three processes, each of which generally resides on its own processor. Database queries are evaluated on one such process, called the DBMS server. The DBMS server software is provided by a database vendor. The application issues database queries and is called the application client. This client uses a set of generic DBMS application programming calls through our STDB/NET programming interface. Intermediate between the application client and the DBMS server is the STDB/NET server. This server accepts generic query requests from the application and converts them into the specific requirements of the DBMS server. In addition, it accepts query results from the DBMS server and passes them back to the application. Typically the STDB/NET server is local to the DBMS server, while the application client may be remote. The STDB/NET server provides additional capabilities such as database deadlock restart and performance monitoring. This architecture is currently in use for some major STScI applications, including the ground support system. We are currently investigating means of providing ad hoc query support to users through the above architecture. Such support is critical for providing flexible user interface capabilities. The Universal Relation advocated by Ullman, Kernighan, and others appears to be promising. In this approach, the user sees the entire database as a single table, thereby freeing the user from needing to understand the detailed schema. A software layer provides the translation between the user and detailed schema views of the database. However, many subtle issues arise in making this transformation. We are currently exploring this scheme for use in the Hubble Space Telescope user interface to the data archive system (DADS).

  18. Technical development of PubMed interact: an improved interface for MEDLINE/PubMed searches.

    PubMed

    Muin, Michael; Fontelo, Paul

    2006-11-03

    The project aims to create an alternative search interface for MEDLINE/PubMed that may provide assistance to the novice user and added convenience to the advanced user. An earlier version of the project was the 'Slider Interface for MEDLINE/PubMed searches' (SLIM) which provided JavaScript slider bars to control search parameters. In this new version, recent developments in Web-based technologies were implemented. These changes may prove to be even more valuable in enhancing user interactivity through client-side manipulation and management of results. PubMed Interact is a Web-based MEDLINE/PubMed search application built with HTML, JavaScript and PHP. It is implemented on a Windows Server 2003 with Apache 2.0.52, PHP 4.4.1 and MySQL 4.1.18. PHP scripts provide the backend engine that connects with E-Utilities and parses XML files. JavaScript manages client-side functionalities and converts Web pages into interactive platforms using dynamic HTML (DHTML), Document Object Model (DOM) tree manipulation and Ajax methods. With PubMed Interact, users can limit searches with JavaScript slider bars, preview result counts, delete citations from the list, display and add related articles and create relevance lists. Many interactive features occur at client-side, which allow instant feedback without reloading or refreshing the page resulting in a more efficient user experience. PubMed Interact is a highly interactive Web-based search application for MEDLINE/PubMed that explores recent trends in Web technologies like DOM tree manipulation and Ajax. It may become a valuable technical development for online medical search applications.
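
    PubMed Interact's backend talks to NCBI E-Utilities and parses the returned XML; the sketch below makes the same kind of ESearch call from Python. The query term is a placeholder, and the endpoint and parameters are the standard public E-utilities ones rather than anything taken from the project's PHP code.

      # Sketch of an NCBI E-utilities ESearch call of the kind the PHP backend
      # makes; the query term is a placeholder and no API key handling is shown.
      import urllib.parse
      import urllib.request
      import xml.etree.ElementTree as ET

      params = {
          "db": "pubmed",
          "term": "diabetic neuropathy fuzzy expert system",   # example query
          "retmax": "5",
      }
      url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
             + urllib.parse.urlencode(params))

      with urllib.request.urlopen(url) as resp:
          tree = ET.fromstring(resp.read())

      print("count:", tree.findtext("Count"))
      print("pmids:", [e.text for e in tree.findall(".//Id")])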

  19. GASS-WEB: a web server for identifying enzyme active sites based on genetic algorithms.

    PubMed

    Moraes, João P A; Pappa, Gisele L; Pires, Douglas E V; Izidoro, Sandro C

    2017-07-03

    Enzyme active sites are important and conserved functional regions of proteins whose identification can be an invaluable step toward protein function prediction. Most of the existing methods for this task are based on active site similarity and present limitations, including performing only exact matches on template residues and imposing template size restraints, as well as not being capable of finding inter-domain active sites. To fill this gap, we proposed GASS-WEB, a user-friendly web server that uses GASS (Genetic Active Site Search), a method based on an evolutionary algorithm to search for similar active sites in proteins. GASS-WEB can be used under two different scenarios: (i) given a protein of interest, to match a set of specific active site templates; or (ii) given an active site template, to look for it in a database of protein structures. The method has been shown to be very effective on a range of experiments and was able to correctly identify >90% of the catalogued active sites from the Catalytic Site Atlas. It also managed to achieve a Matthews correlation coefficient of 0.63 using the Critical Assessment of protein Structure Prediction (CASP 10) dataset. In our analysis, GASS ranked fourth among 18 methods. GASS-WEB is freely available at http://gass.unifei.edu.br/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Modis, SeaWIFS, and Pathfinder funded activities

    NASA Technical Reports Server (NTRS)

    Evans, Robert H.

    1995-01-01

    MODIS (Moderate Resolution Imaging Spectrometer), SeaWIFS (Sea-viewing Wide Field Sensor), Pathfinder, and DSP (Digital Signal Processor) objectives are summarized. An overview of current progress is given for the automatic processing database, client/server status, matchup database, and DSP support.
