Science.gov

Sample records for active server pages

  1. ITMS: Individualized Teaching Material System: Adaptive Integration of Web Pages Distributed in Some Servers.

    ERIC Educational Resources Information Center

    Mitsuhara, Hiroyuki; Kurose, Yoshinobu; Ochi, Youji; Yano, Yoneo

    The authors developed a Web-based Adaptive Educational System (Web-based AES) named ITMS (Individualized Teaching Material System). ITMS adaptively integrates knowledge on the distributed Web pages and generates individualized teaching material that has various contents. ITMS also presumes the learners' knowledge levels from the states of their…

  2. Comments on Hudesman and Page's reply to Fudin's comments on Hudesman, Page and Rautianen's subliminal psychodynamic activation experiment.

    PubMed

    Fudin, R

    1993-06-01

    Hudesman and Page's contention that Gustafson and Källmén's 1991 results indicate that subsequent subliminal psychodynamic activation experiments do not require the controls suggested by Fudin in 1986 is questioned. The rationale for Fudin's 1993 comment concerning the limited generalizability of Hudesman, et al.'s (1992) results, a comment Hudesman and Page contended is unfounded, is discussed. PMID:8321599

  3. 76 FR 2754 - Agency Information Collection (Pay Now Enter Info Page) Activity Under OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-14

    ... AFFAIRS Agency Information Collection (Pay Now Enter Info Page) Activity Under OMB Review AGENCY: Office... Info Page. OMB Control Number: 2900-0663. Type of Review: Extension of a currently approved collection... payments through VA's Pay Now Enter Info Page website. Data enter on the Pay Now Enter Info Page...

  4. Shakespeare Page to Stage: An Active Approach to "Othello."

    ERIC Educational Resources Information Center

    Thomas, Peter

    1994-01-01

    Presents an account of how one English teacher taught William Shakespeare's "Othello" through dramatics in a challenging way. Considers how teachers of drama might discuss props, stage directions, and the proper handling of Desdemona's handkerchief. Explains how teachers should try to take the plays from "page to stage." (HB)

  5. Establishment of Textbook Information Management System Based on Active Server Page

    ERIC Educational Resources Information Center

    Geng, Lihua

    2011-01-01

    In university textbook management, the workflow of textbook storage, collection, and checking is quite complicated, and the daily management workflow and system also seriously constrain the efficiency of the management process. Thus, in order to combine the information management model with the traditional management model, it is necessary…

  6. 75 FR 76080 - Agency Information Collection (VetBiz Vendor Information Pages Verification Program) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF VETERANS AFFAIRS Agency Information Collection (VetBiz Vendor Information Pages Verification Program) Activity... . Please refer to ``OMB Control No. 2900- 0675.'' SUPPLEMENTAL INFORMATION: Title: VetBiz...

  7. The Green Pages: Environmental Education Activities K-12.

    ERIC Educational Resources Information Center

    Clearing, 1990

    1990-01-01

    Presented are 20 science activities for students K-12. Topics include role playing, similarities between urban and forest communities, ecosystems, garbage, recycling, food production, habitats, insects, tidal zone, animals, diversity, interest groups, rivers, spaceship earth, ecological interactions, and the cost of recreation. (KR)

  8. The Green Pages: Environmental Education Activities K-12.

    ERIC Educational Resources Information Center

    Clearing, 1991

    1991-01-01

    Presented are 38 environmental education activities for grades K-12. Topics include seed dispersal, food chains, plant identification, sizes and shapes, trees, common names, air pollution, recycling, temperature, litter, water conservation, photography, insects, urban areas, diversity, natural cycles, rain, erosion, phosphates, human population,…

  9. The Green Pages: Environmental Education Activities K-12.

    ERIC Educational Resources Information Center

    Clearing, 1990

    1990-01-01

    Presented are 37 environmental science activities for students in grades K-12. Topics include water pollution, glaciers, protective coloration, shapes in nature, environmental impacts, recycling, creative writing, litter, shapes found in nature, color, rain cycle, waste management, plastics, energy, pH, landfills, runoff, watersheds,…

  10. CDRUG: a web server for predicting anticancer activity of chemical compounds.

    PubMed

    Li, Gong-Hua; Huang, Jing-Fei

    2012-12-15

    Cancer is the leading cause of death worldwide. Screening anticancer candidates from tens of millions of chemical compounds is expensive and time-consuming. A rapid and user-friendly web server, known as CDRUG, is described here to predict the anticancer activity of chemical compounds. In CDRUG, a hybrid score was developed to measure the similarity of different compounds. The performance analysis shows that CDRUG has an area under the curve (AUC) of 0.878, indicating that CDRUG is effective at distinguishing active and inactive compounds.

  11. Final comments on Hudesman, Page, and Rautiainen's (1992) subliminal psychodynamic activation experiment.

    PubMed

    Fudin, R

    1993-10-01

    Hudesman, et al.'s (1992) contention that their finding shows that subliminal psychodynamic activation (SPA) improved academic performance is questioned. That experiment lacked controls outlined by Fudin in 1986 which are needed to support the assumption that a positive SPA outcome is effected because the meaning of an entire experimental message is encoded. In 1993 Hudesman and Page argued that Gustafson and Källmén's 1991 results, obtained with such controls, indicated that the controls do not have to be used in subsequent SPA experiments. The 1990 results of Greenberg and of Kothera, Fudin, and Nicastro, however, do not support those of Gustafson and Källmén. From a different perspective, it is argued that good experimental controls are needed in all SPA experiments because they increase internal validity. Given that Hudesman, et al.'s subjects scored in a limited range on the mathematics portion of the 1978 CUNY Skills Assessment Test, the implication that their result can be generalized to all subjects is questioned. PMID:8247655

  12. Land Use and Climate Impacts on Fluvial Systems (LUCIFS): A PAGES - Focus 4 (PHAROS) research activity

    NASA Astrophysics Data System (ADS)

    Dearing, John; Hoffmann, Thomas

    2010-05-01

    LUCIFS is a global research program concerned with understanding past interactions between climate, human activity and fluvial systems. Its focus is on evaluating the geomorphic impact of humans on landscapes, with a strong emphasis on geomorphological and sedimentological perspectives on mid- to long-term man-landscape interactions. Of particular relevance are aspects of sediment redistribution systems such as non-linear behaviour, the role of system configuration, scale effects, and emergent properties. Over the last decade the LUCIFS program has been investigating both contemporary and long-term river response to global change, with the principal aims of (i) quantifying land use and climate change impacts on river-borne fluxes of water, sediment, C, N and P; (ii) identifying the key controls on these fluxes at the catchment scale; and (iii) identifying the feedback of long-term changes in the fluxes of these materials on both human society and biogeochemical cycles. The major scientific tasks of the LUCIFS program are: synthesising the results of regional case studies; identifying regional gaps and encouraging new case studies; addressing research gaps and formulating new research questions; and organising workshops and conferences. In this paper we present the LUCIFS program within the new PAGES structure. LUCIFS is located in Focus 4 (PHAROS), which deals with how knowledge of human-climate-ecosystem interactions in the past can help inform understanding and management today. In conjunction with the other working groups HITE (Human Impacts on Terrestrial Ecosystems), LIMPACS (Human Impacts on Lake Ecosystems) and IHOPE (Integrated History of People on Earth), PHAROS aims to compare regional-scale reconstructions of environmental and climatic processes, derived from natural archives, documentary and instrumental data, with evidence of past human activity obtained from historical, paleoecological and archaeological records.

  13. Basics. [A Compilation of Learning Activities Pages from Seven Issues of Instructor Magazine, September 1982 through March 1983 and May 1983.

    ERIC Educational Resources Information Center

    Instructor, 1983

    1983-01-01

    This collection of 18 learning activities pages focuses on the subject areas of science, language arts, mathematics, and social studies. The science activities pages concern the study of earthquakes, sound, environmental changes, snails and slugs, and friction. Many of the activities are in the form of experiments for the students to perform.…

  14. The Green Pages Environmental Education Activities K-12: Gardens for Young Growing Lives.

    ERIC Educational Resources Information Center

    Larson, Jan

    1997-01-01

    Describes several gardening activities that can be kept simple or used as a foundation for more in-depth projects. Activities include setting up an indoor garden spot, making compost (which helps students understand the terms "decompose" and "compost"), watching plants drink (in which students measure water movement in plants), making herb gardens,…

  15. Comments on Hudesman, Page, and Rautiainen's (1992) subliminal psychodynamic activation experiment.

    PubMed

    Fudin, R

    1993-02-01

    Hudesman, et al.'s (1992) contention that their finding and those of Ariam (1979), Parker (1982), and Cook (1985) show that subliminal psychodynamic activation (SPA) can improve academic performance is questioned. Results obtained from experiments using methodological innovations (Fudin, 1986) would allow a clearer interpretation of positive SPA outcomes. PMID:8451149

  16. WSKE: Web Server Key Enabled Cookies

    NASA Astrophysics Data System (ADS)

    Masone, Chris; Baek, Kwang-Hyun; Smith, Sean

    In this paper, we present the design and prototype of a new approach to cookie management: if a server deposits a cookie only after authenticating itself via the SSL handshake, the browser will return the cookie only to a server that can authenticate itself, via SSL, with the same keypair. This approach can enable usable but secure client authentication and can improve the usability of server authentication by clients. It is superior to the prior work on Active Cookies in that it defends against both DNS spoofing and IP spoofing, and does not require binding a user's interaction with a server to individual IP addresses.
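
    To make the mechanism concrete, here is a minimal sketch (ours, not the authors' prototype; the class and method names are hypothetical) of a cookie jar keyed by the fingerprint of the server's public key, so a cookie is only released to a server that re-authenticates with the same keypair:

```python
import hashlib

class KeyedCookieJar:
    """Cookies stored per (host, server-key fingerprint), illustrating the WSKE idea."""

    def __init__(self):
        self._store = {}  # (host, fingerprint) -> {cookie_name: value}

    @staticmethod
    def _fingerprint(public_key_der: bytes) -> str:
        # Identify the server by a hash of the public key seen in the SSL handshake.
        return hashlib.sha256(public_key_der).hexdigest()

    def deposit(self, host, public_key_der, name, value):
        # Called only after the server has authenticated itself via SSL.
        self._store.setdefault((host, self._fingerprint(public_key_der)), {})[name] = value

    def cookies_for(self, host, public_key_der):
        # A spoofed server presenting a different keypair gets nothing back,
        # even if it controls DNS or IP routing for the host.
        return dict(self._store.get((host, self._fingerprint(public_key_der)), {}))

jar = KeyedCookieJar()
jar.deposit("bank.example", b"key-A", "session", "abc123")
print(jar.cookies_for("bank.example", b"key-A"))  # {'session': 'abc123'}
print(jar.cookies_for("bank.example", b"key-B"))  # {}
```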

  17. The NEOS server.

    SciTech Connect

    Czyzyk, J.; Mesnier, M. P.; More, J. J.; Mathematics and Computer Science

    1998-07-01

    The Network-Enabled Optimization System (NEOS) is an Internet based optimization service. The NEOS Server introduces a novel approach for solving optimization problems. Users of the NEOS Server submit a problem and their choice of optimization solver over the Internet. The NEOS Server computes all information (for example, derivatives and sparsity patterns) required by the solver, links the optimization problem with the solver, and returns a solution.

  18. Promoting metacognition in first year anatomy laboratories using plasticine modeling and drawing activities: a pilot study of the "blank page" technique.

    PubMed

    Naug, Helen L; Colson, Natalie J; Donner, Daniel G

    2011-01-01

    Many first year students of anatomy and physiology courses demonstrate an inability to self-regulate their learning. To help students increase their awareness of their own learning in a first year undergraduate anatomy course, we piloted an exercise that incorporated the processes of (1) active learning: drawing and plasticine modeling and (2) metacognition: planning, monitoring, reaction, and reflection. The activity was termed "blank page" because all learning cues were removed and students had to create models and diagrams from reflection and recall. Two hundred and eighty-two students responded to a questionnaire reporting qualitative feedback on the exercise. Based on student responses, the "blank page" activity was a positive learning experience and confirmed a need to teach metacognitive skills. From this pilot study, we established that drawing or plasticine modeling is an excellent vehicle for demonstration of the metacognitive processes that enable self-regulation: a known predictor of academic success. PMID:21618445

  19. Servers Made to Order

    SciTech Connect

    Anderson, Daryl L.

    2007-11-01

    Virtualization is a hot buzzword right now, and it’s no wonder federal agencies are coming around to the idea of consolidating their servers and storage. Traditional servers do nothing for about 80% of their lifecycle, yet use nearly half their peak energy consumption, which wastes capacity and power. Server virtualization creates logical "machines" on a single physical server. At the Pacific Northwest National Laboratory in Richland, Washington, using virtualization technology is proving to be a cost-effective way to make better use of current server hardware resources while reducing hardware lifecycle costs and cooling demands, and saving precious data center space. And as an added bonus, virtualization also ties in with the Laboratory’s mission to be responsible stewards of the environment as well as of the Department of Energy’s assets. This article explains why even the smallest IT shops can benefit from the Laboratory’s best practices.

  1. Using Web Server Logs to Track Users through the Electronic Forest

    ERIC Educational Resources Information Center

    Coombs, Karen A.

    2005-01-01

    This article analyzes server logs, providing helpful information for making decisions about Web-based services. The author indicates that, as a result of analyzing server logs, several interesting things about users' behavior were learned. The resulting findings are discussed in this article. Certain pages of the author's Web site, for instance, are…
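
    A minimal sketch of the kind of log analysis the article describes (not the author's code; the log lines are illustrative): parse Apache common-log-format entries and count which pages are requested most often.

```python
import re
from collections import Counter

# Apache common log format: host ident user [time] "METHOD path PROTO" status bytes
LOG_RE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+')

def page_counts(lines):
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group("status").startswith("2"):   # count successful requests only
            counts[m.group("path")] += 1
    return counts

sample = [
    '10.0.0.1 - - [10/Oct/2005:13:55:36 -0700] "GET /index.html HTTP/1.1" 200 2326',
    '10.0.0.2 - - [10/Oct/2005:13:56:01 -0700] "GET /databases.html HTTP/1.1" 200 512',
    '10.0.0.1 - - [10/Oct/2005:13:57:12 -0700] "GET /index.html HTTP/1.1" 200 2326',
]
print(page_counts(sample).most_common(2))  # [('/index.html', 2), ('/databases.html', 1)]
```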

  2. The NASA Technical Report Server

    NASA Astrophysics Data System (ADS)

    Nelson, M. L.; Gottlich, G. L.; Bianco, D. J.; Paulson, S. S.; Binkley, R. L.; Kellogg, Y. D.; Beaumont, C. J.; Schmunk, R. B.; Kurtz, M. J.; Accomazzi, A.; Syed, O.

    The National Aeronautics and Space Act of 1958 established the National Aeronautics and Space Administration (NASA) and charged it to "provide for the widest practicable and appropriate dissemination of information concerning...its activities and the results thereof". The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems.

  3. Remote diagnosis server

    NASA Technical Reports Server (NTRS)

    Deb, Somnath (Inventor); Ghoshal, Sudipto (Inventor); Malepati, Venkata N. (Inventor); Kleinman, David L. (Inventor); Cavanaugh, Kevin F. (Inventor)

    2004-01-01

    A network-based diagnosis server for monitoring and diagnosing a system, the server being remote from the system it is observing, comprises a sensor for generating signals indicative of a characteristic of a component of the system, a network-interfaced sensor agent coupled to the sensor for receiving signals therefrom, a broker module coupled to the network for sending signals to and receiving signals from the sensor agent, a handler application connected to the broker module for transmitting signals to and receiving signals therefrom, and a reasoner application in communication with the handler application for processing and responding to signals received from the handler application, wherein the sensor agent, broker module, handler application, and reasoner application operate simultaneously relative to each other, such that the diagnosis server performs continuous monitoring and diagnosing of the components of the system in real time. The diagnosis server is readily adaptable to various different systems.

  4. Using servers to enhance control system capability

    SciTech Connect

    M. Bickley; B.A. Bowling; D.A. Bryan; J. van Zeijts; K.S. White; S. Witherspoon

    1999-03-01

    Many traditional control systems include a distributed collection of front end machines to control hardware. Back end tools are used to view, modify and record the signals generated by these front end machines. Software servers, which are a middleware layer between the front and back ends, can improve a control system in several ways. Servers can enable on-line processing of raw data, and consolidation of functionality. In many cases, data retrieved from the front end must be processed in order to convert the raw data into useful information. These calculations are often redundantly performed by different programs, frequently offline. Servers can monitor the raw data and rapidly perform calculations, producing new signals which can be treated like any other control system signal, and can be used by any back end application. Algorithms can be incorporated to actively modify signal values in the control system based upon changes of other signals, essentially producing feedback in a control system. Servers thus increase the flexibility of a control system. Lastly, servers running on inexpensive UNIX workstations can relay or cache frequently needed information, reducing the load on front end hardware by functioning as concentrators. Rather than many back end tools connecting directly to the front end machines, increasing the work load of these machines, they instead connect to the server. Servers like those discussed above have been used successfully at the Thomas Jefferson National Accelerator Facility to provide functionality such as beam steering, fault monitoring, storage of machine parameters, and on-line data processing. The authors discuss the potential uses of such servers, and share the results of work performed to date.
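
    As an illustration of the on-line processing idea (a sketch under assumed signal names, not the Jefferson Lab implementation), a small middleware server can monitor raw front-end signals and republish a derived signal that back-end tools treat like any other control system signal:

```python
class CalculationServer:
    """Middleware server that turns raw front-end values into derived signals."""

    def __init__(self):
        self._values = {}       # latest raw signal values, e.g. {"BPM1:X": 0.12}
        self._subscribers = []  # callbacks interested in derived signals

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def on_raw_update(self, name, value):
        # Called whenever a front-end machine publishes a new raw value.
        self._values[name] = value
        self._recompute()

    def _recompute(self):
        # Example derived signal: beam position averaged over two monitors.
        if "BPM1:X" in self._values and "BPM2:X" in self._values:
            derived = 0.5 * (self._values["BPM1:X"] + self._values["BPM2:X"])
            for cb in self._subscribers:
                cb("BPM:X:MEAN", derived)   # published like an ordinary signal

server = CalculationServer()
server.subscribe(lambda name, v: print(f"{name} = {v:.3f}"))
server.on_raw_update("BPM1:X", 0.10)
server.on_raw_update("BPM2:X", 0.30)   # prints: BPM:X:MEAN = 0.200
```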

  5. USING SERVERS TO ENHANCE CONTROL SYSTEM CAPABILITY.

    SciTech Connect

    BICKLEY,M.; BOWLING,B.A.; BRYAN,D.A.; ZEIJTS,J.; WHITE,K.S.; WITHERSPOON,S.

    1999-03-29

    Many traditional control systems include a distributed collection of front end machines to control hardware. Back end tools are used to view, modify, and record the signals generated by these front end machines. Software servers, which are a middleware layer between the front and back ends, can improve a control system in several ways. Servers can enable on-line processing of raw data, and consolidation of functionality. In many cases, data retrieved from the front end must be processed in order to convert the raw data into useful information. These calculations are often redundantly performed by different programs, frequently offline. Servers can monitor the raw data and rapidly perform calculations, producing new signals which can be treated like any other control system signal, and can be used by any back end application. Algorithms can be incorporated to actively modify signal values in the control system based upon changes of other signals, essentially producing feedback in a control system. Servers thus increase the flexibility of a control system. Lastly, servers running on inexpensive UNIX workstations can relay or cache frequently needed information, reducing the load on front end hardware by functioning as concentrators. Rather than many back end tools connecting directly to the front end machines, increasing the work load of these machines, they instead connect to the server. Servers like those discussed above have been used successfully at the Thomas Jefferson National Accelerator Facility to provide functionality such as beam steering, fault monitoring, storage of machine parameters, and on-line data processing. The authors discuss the potential uses of such servers, and share the results of work performed to date.

  6. Secure IRC Server

    2003-08-25

    The IRCD is an IRC server that was originally distributed by the IRCD Hybrid developer team for use as a server for IRC messaging over the public Internet. By supporting the IRC protocol defined in the IRC RFC, IRCD allows users to create and join channels for group or one-to-one text-based instant messaging. It stores information about channels (e.g., whether a channel is public, secret, or invite-only, the topic set, and the membership) and users (who is online and what channels they are members of). It receives messages for a specific user or channel and forwards these messages to the targeted destination. Since server-to-server communication is also supported, these targeted destinations may be connected to different IRC servers. Messages are exchanged over TCP connections that remain open between the client and the server. The IRCD is being used within the Pervasive Computing Collaboration Environment (PCCE) as the 'chat server' for message exchange over public and private channels. After an LBNLSecureMessaging (PCCE chat) client has been authenticated, the client connects to IRCD with its assigned nickname or 'nick.' The client can then create or join channels for group discussions or one-to-one conversations. These channels can have an initial mode of public or invite-only, and the mode may be changed after creation. If a channel is public, anyone online can join the discussion; if a channel is invite-only, users can only join if existing members of the channel explicitly invite them. Users can be invited to any type of channel, and users may be members of multiple channels simultaneously. For use with the PCCE environment, the IRCD application (which was written in C) was ported to Linux and has been tested and installed under Linux Redhat 7.2. The source code was also modified with SSL so that all messages exchanged over the network are encrypted. This modified IRC server also verifies with an authentication server that the client is who he or she claims to be and…

  7. MAVID multiple alignment server.

    PubMed

    Bray, Nicolas; Pachter, Lior

    2003-07-01

    MAVID is a multiple alignment program suitable for many large genomic regions. The MAVID web server allows biomedical researchers to quickly obtain multiple alignments for genomic sequences and to subsequently analyse the alignments for conserved regions. MAVID has been successfully used for the alignment of closely related species such as primates and also for the alignment of more distant organisms such as human and fugu. The server is fast, capable of aligning hundreds of kilobases in less than a minute. The multiple alignment is used to build a phylogenetic tree for the sequences, which is subsequently used as a basis for identifying conserved regions in the alignment. The server can be accessed at http://baboon.math.berkeley.edu/mavid/.

  8. A Web Server for MACCS Magnetometer Data

    NASA Technical Reports Server (NTRS)

    Engebretson, Mark J.

    1998-01-01

    NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.

  9. Wetlands and Web Pages.

    ERIC Educational Resources Information Center

    Tisone-Bartels, Dede

    1998-01-01

    Argues that the preservation of areas like the Shoreline Park (California) wetlands depends on educating students about the value of natural resources. Describes the creation of a Web page on the wetlands for third-grade students by seventh-grade art and ecology students. Outlines the technical process of developing a Web page. (DSK)

  10. NL MIND-BEST: a web server for ligands and proteins discovery--theoretic-experimental study of proteins of Giardia lamblia and new compounds active against Plasmodium falciparum.

    PubMed

    González-Díaz, Humberto; Prado-Prado, Francisco; Sobarzo-Sánchez, Eduardo; Haddad, Mohamed; Maurel Chevalley, Séverine; Valentin, Alexis; Quetin-Leclercq, Joëlle; Dea-Ayuela, María A; Teresa Gomez-Muños, María; Munteanu, Cristian R; José Torres-Labandeira, Juan; García-Mera, Xerardo; Tapia, Ricardo A; Ubeira, Florencio M

    2011-05-01

    There are many protein ligands and/or drugs described with very different affinity to a large number of target proteins or receptors. In this work, we selected Ligands or Drug-target pairs (DTPs/nDTPs) of drugs with high affinity/non-affinity for different targets. Quantitative Structure-Activity Relationships (QSAR) models become a very useful tool in this context to substantially reduce time and resources consuming experiments. Unfortunately most QSAR models predict activity against only one protein target and/or have not been implemented in the form of public web server freely accessible online to the scientific community. To solve this problem, we developed here a multi-target QSAR (mt-QSAR) classifier using the MARCH-INSIDE technique to calculate structural parameters of drug and target plus one Artificial Neuronal Network (ANN) to seek the model. The best ANN model found is a Multi-Layer Perceptron (MLP) with profile MLP 20:20-15-1:1. This MLP classifies correctly 611 out of 678 DTPs (sensitivity=90.12%) and 3083 out of 3408 nDTPs (specificity=90.46%), corresponding to training accuracy=90.41%. The validation of the model was carried out by means of external predicting series. The model classifies correctly 310 out of 338 DTPs (sensitivity=91.72%) and 1527 out of 1674 nDTP (specificity=91.22%) in validation series, corresponding to total accuracy=91.30% for validation series (predictability). This model favorably compares with other ANN models developed in this work and Machine Learning classifiers published before to address the same problem in different aspects. We implemented the present model at web portal Bio-AIMS in the form of an online server called: Non-Linear MARCH-INSIDE Nested Drug-Bank Exploration & Screening Tool (NL MIND-BEST), which is located at URL: http://miaja.tic.udc.es/Bio-AIMS/NL-MIND-BEST.php. This online tool is based on PHP/HTML/Python and MARCH-INSIDE routines. Finally we illustrated two practical uses of this server with two

  11. Dali server update

    PubMed Central

    Holm, Liisa; Laakso, Laura M.

    2016-01-01

    The Dali server (http://ekhidna2.biocenter.helsinki.fi/dali) is a network service for comparing protein structures in 3D. In favourable cases, comparing 3D structures may reveal biologically interesting similarities that are not detectable by comparing sequences. The Dali server has been running in various places for over 20 years and is used routinely by crystallographers on newly solved structures. The latest update of the server provides enhanced analytics for the study of sequence and structure conservation. The server performs three types of structure comparisons: (i) Protein Data Bank (PDB) search compares one query structure against those in the PDB and returns a list of similar structures; (ii) pairwise comparison compares one query structure against a list of structures specified by the user; and (iii) all against all structure comparison returns a structural similarity matrix, a dendrogram and a multidimensional scaling projection of a set of structures specified by the user. Structural superimpositions are visualized using the Java-free WebGL viewer PV. The structural alignment view is enhanced by sequence similarity searches against Uniprot. The combined structure-sequence alignment information is compressed to a stack of aligned sequence logos. In the stack, each structure is structurally aligned to the query protein and represented by a sequence logo. PMID:27131377

  12. Dali server update.

    PubMed

    Holm, Liisa; Laakso, Laura M

    2016-07-01

    The Dali server (http://ekhidna2.biocenter.helsinki.fi/dali) is a network service for comparing protein structures in 3D. In favourable cases, comparing 3D structures may reveal biologically interesting similarities that are not detectable by comparing sequences. The Dali server has been running in various places for over 20 years and is used routinely by crystallographers on newly solved structures. The latest update of the server provides enhanced analytics for the study of sequence and structure conservation. The server performs three types of structure comparisons: (i) Protein Data Bank (PDB) search compares one query structure against those in the PDB and returns a list of similar structures; (ii) pairwise comparison compares one query structure against a list of structures specified by the user; and (iii) all against all structure comparison returns a structural similarity matrix, a dendrogram and a multidimensional scaling projection of a set of structures specified by the user. Structural superimpositions are visualized using the Java-free WebGL viewer PV. The structural alignment view is enhanced by sequence similarity searches against Uniprot. The combined structure-sequence alignment information is compressed to a stack of aligned sequence logos. In the stack, each structure is structurally aligned to the query protein and represented by a sequence logo.

  13. On-demand server-side image processing for web-based DICOM image display

    NASA Astrophysics Data System (ADS)

    Sakusabe, Takaya; Kimura, Michio; Onogi, Yuzo

    2000-04-01

    Low cost image delivery is needed in modern networked hospitals. If a hospital has hundreds of clients, the cost of the client systems is a big problem. Naturally, a Web-based system is the most effective solution. But a Web browser alone cannot display medical images with certain image processing applied, such as a lookup table transformation. We developed a Web-based medical image display system using a Web browser and on-demand server-side image processing. All images displayed on a Web page are generated from DICOM files on a server and delivered on demand. User interaction on the Web page is handled by a client-side scripting technology such as JavaScript. This combination provides the look-and-feel of an imaging workstation, not only in functionality but also in speed. Real-time update of images while tracing mouse motion is achieved in the Web browser without any client-side image processing, which would otherwise require client-side plug-in technology such as Java Applets or ActiveX. We tested the performance of the system in three cases: a single client, a small number of clients on a fast network, and a large number of clients on a normal-speed network. The results show that the communication overhead is very slight and that the system scales very well in the number of clients.
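
    A hedged sketch of the server-side step described above, assuming numpy and Pillow are available (the paper does not specify its implementation libraries): apply a window/level lookup table to 16-bit pixel data and return an 8-bit PNG that any Web browser can display.

```python
import io
import numpy as np
from PIL import Image

def window_level_to_png(pixels16: np.ndarray, center: float, width: float) -> bytes:
    """Map a 16-bit image to 8 bits using a linear window/level LUT, encode as PNG."""
    lo, hi = center - width / 2.0, center + width / 2.0
    scaled = np.clip((pixels16.astype(np.float32) - lo) / (hi - lo), 0.0, 1.0)
    img8 = (scaled * 255).astype(np.uint8)
    buf = io.BytesIO()
    Image.fromarray(img8, mode="L").save(buf, format="PNG")
    return buf.getvalue()  # bytes to send as the HTTP response body

# The page's JavaScript would simply re-request this URL with new center/width
# values as the user drags the mouse, so no client-side image processing is needed.
demo = np.random.default_rng(0).integers(0, 4096, size=(64, 64)).astype(np.uint16)
png_bytes = window_level_to_png(demo, center=1024, width=2048)
print(len(png_bytes), "bytes of PNG")
```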

  14. Design and implementation of streaming media server cluster based on FFMpeg.

    PubMed

    Zhao, Hong; Zhou, Chun-long; Jin, Bao-zhao

    2015-01-01

    Poor performance and network congestion are commonly observed in the streaming media single server system. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations and the balance among servers is maintained by the dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data. The experiment results show that the server cluster system has significantly alleviated the network congestion and improved the performance in comparison with the single server system.
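
    A simplified sketch of the dispatch step (the class and address values are ours, not the paper's code): each media server periodically reports its load, and new clients are directed to the least-loaded server in their region.

```python
class Dispatcher:
    """Region-aware dispatcher driven by active load feedback from servers."""

    def __init__(self):
        # region -> {server_address: last reported load, 0.0 (idle) .. 1.0 (full)}
        self.servers = {}

    def report_load(self, region, server, load):
        # Called periodically by the streaming servers themselves ("active feedback").
        self.servers.setdefault(region, {})[server] = load

    def assign(self, region):
        # Redirect a new client to the least-loaded server covering its region.
        candidates = self.servers.get(region, {})
        if not candidates:
            raise RuntimeError(f"no streaming server registered for {region!r}")
        return min(candidates, key=candidates.get)

d = Dispatcher()
d.report_load("east", "10.0.0.11:554", 0.72)
d.report_load("east", "10.0.0.12:554", 0.35)
print(d.assign("east"))   # 10.0.0.12:554
```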

  15. Design and Implementation of Streaming Media Server Cluster Based on FFMpeg

    PubMed Central

    Zhao, Hong; Zhou, Chun-long; Jin, Bao-zhao

    2015-01-01

    Poor performance and network congestion are commonly observed in the streaming media single server system. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations and the balance among servers is maintained by the dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data. The experiment results show that the server cluster system has significantly alleviated the network congestion and improved the performance in comparison with the single server system. PMID:25734187

  16. Making Pages That Move.

    ERIC Educational Resources Information Center

    Gepner, Ivan

    2001-01-01

    Explains the mechanism of producing dynamic computer pages which is based on three technologies: (1) the document object model; (2) cascading stylesheets; and (3) javascript. Discusses the applications of these techniques in genetics and developmental biology. (YDS)

  17. Page turning system

    NASA Technical Reports Server (NTRS)

    Kerley, James J. (Inventor); Eklund, Wayne D. (Inventor)

    1992-01-01

    A device for holding reading materials for use by readers without arm mobility is presented. The device is adapted to hold the reading materials in position for reading, with the pages displayed so that they can be turned with a rubber-tipped stick held in the mouth, and it has a pair of rectangular frames. The frames hold and position the reading materials opened in reading posture, with the pages displayed at a substantially unobstructed sighting position for reading. The pair of rectangular frames are connected to one another by a hinge so that the angle between the frames may be varied, thereby varying the inclination of the reading material. A pair of bent, spring-mounted wires for holding opposing pages of the reading material open for reading, without substantial visual interference with the pages, is mounted to the base. The wires are also adjustable to the thickness of the reading material and have a variable friction adjustment. This enables the force of the wires against the pages to be varied and permits the reader to manipulate the pages with the stick.

  18. SLITHER: a web server for generating contiguous conformations of substrate molecules entering into deep active sites of proteins or migrating through channels in membrane transporters.

    PubMed

    Lee, Po-Hsien; Kuo, Kuei-Ling; Chu, Pei-Ying; Liu, Eric M; Lin, Jung-Hsin

    2009-07-01

    Many proteins use a long channel to guide the substrate or ligand molecules into well-defined active sites for catalytic reactions or for switching molecular states. In addition, substrates of membrane transporters can migrate to the other side of a cellular compartment by means of certain selective mechanisms. SLITHER (http://bioinfo.mc.ntu.edu.tw/slither/ or http://slither.rcas.sinica.edu.tw/) is a web server that can generate contiguous conformations of a molecule along a curved tunnel inside a protein, together with the binding free energy profile along the predicted channel pathway. SLITHER adopts an iterative docking scheme combined with a puddle-skimming procedure, i.e. repeatedly elevating the potential energies of the identified global minima, and thereby determines the contiguous binding modes of substrates inside the protein. In contrast to some programs that are widely used to determine the geometric dimensions of ion channels, SLITHER can be applied to predict whether a substrate molecule can crawl through an inner channel or a half-channel of a protein across surmountable energy barriers. In addition, SLITHER provides the list of pore-facing residues, which can be directly compared with many genetic diseases. Finally, the adjacent binding poses determined by SLITHER can also be used for fragment-based drug design.
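
    The puddle-skimming idea can be illustrated with a toy one-dimensional energy profile (this sketch is ours, not the SLITHER code): repeatedly locate the current global minimum, record a pose there, then raise the potential around it so the next iteration finds the next pocket along the channel.

```python
import numpy as np

x = np.linspace(0.0, 10.0, 501)
energy = np.cos(1.5 * x) + 0.05 * x             # toy binding-energy profile along a channel
penalty_width, penalty_height = 0.8, 3.0

poses = []
work = energy.copy()
for _ in range(3):                               # find three successive "pockets"
    i = int(np.argmin(work))                     # current global minimum
    poses.append((x[i], energy[i]))              # record the binding pose there
    # Elevate the potential around the found minimum (a Gaussian bump) so the
    # next iteration skims over this puddle to the next one along the channel.
    work = work + penalty_height * np.exp(-((x - x[i]) ** 2) / (2 * penalty_width ** 2))

for pos, e in poses:
    print(f"pose at x = {pos:4.2f}, energy = {e:+.2f}")
```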

  19. Network and User-Perceived Performance of Web Page Retrievals

    NASA Technical Reports Server (NTRS)

    Kruse, Hans; Allman, Mark; Mallasch, Paul

    1998-01-01

    The development of the HTTP protocol has been driven by the need to improve the network performance of the protocol by allowing the efficient retrieval of multiple parts of a web page without the need for multiple simultaneous TCP connections between a client and a server. We suggest that the retrieval of multiple page elements sequentially over a single TCP connection may result in a degradation of the perceived performance experienced by the user. We attempt to quantify this perceived degradation through the use of a model which combines a web retrieval simulation and an analytical model of TCP operation. Starting with the current HTTP/1.1 specification, we first suggest a client-side heuristic to improve the perceived transfer performance. We show that the perceived speed of the page retrieval can be increased without sacrificing data transfer efficiency. We then propose a new client/server extension to the HTTP/1.1 protocol to allow for the interleaving of page element retrievals. We finally address the issue of the display of advertisements on web pages, and in particular suggest a number of mechanisms which can make efficient use of IP multicast to send advertisements to a number of clients within the same network.

  20. Home media server content management

    NASA Astrophysics Data System (ADS)

    Tokmakoff, Andrew A.; van Vliet, Harry

    2001-07-01

    With the advent of set-top boxes, the convergence of TV (broadcasting) and PC (Internet) is set to enter the home environment. Currently, a great deal of activity is occurring in developing standards (TV-Anytime Forum) and devices (TiVo) for local storage on Home Media Servers (HMS). These devices lie at the heart of convergence of the triad: communications/networks - content/media - computing/software. Besides massive storage capacity and being a communications 'gateway', the home media server is characterised by the ability to handle metadata and software that provides an easy-to-use on-screen interface and intelligent search/content handling facilities. In this paper, we describe a research prototype HMS that is being developed within the GigaCE project at the Telematica Instituut. Our prototype demonstrates advanced search and retrieval (video browsing), adaptive user profiling and an innovative 3D component of the Electronic Program Guide (EPG) which represents online presence. We discuss the use of MPEG-7 for representing metadata, the use of MPEG-21 working draft standards for content identification, description and rights expression, and the use of HMS peer-to-peer content distribution approaches. Finally, we outline explorative user behaviour experiments that aim to investigate the effectiveness of the prototype HMS during development.

  1. The NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Paulson, Sharon S.; Binkley, Robert L.; Kellogg, Yvonne D.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael J.; Accomazzi, Alberto

    1995-01-01

    The National Aeronautics and Space Act of 1958 established NASA and charged it to "provide for the widest practicable and appropriate dissemination of information concerning its activities and the results thereof." The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. The NTRS is comprised of several units, some constructed especially for inclusion in NTRS, and others that are existing NASA publication services that NTRS reuses. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the service. The NTRS is largely constructed with freely available software running on existing hardware. NTRS builds upon existing hardware and software, and the resulting additional exposure for the body of literature contained ensures that NASA's institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.

  2. Enhanced networked server management with random remote backups

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2003-08-01

    In this paper, the model is focused on available server management in network environments. The (remote) backup servers are hooked up by VPN (Virtual Private Network) and replace broken main servers immediately. A virtual private network (VPN) is a way to use a public network infrastructure to hook up long-distance servers within a single network infrastructure. The servers can be represented as "machines", and the system then deals with an unreliable main machine and random auxiliary spare (remote backup) machines. When the system performs mandatory routine maintenance, auxiliary machines are used for backups during idle periods. Unlike other existing models, the availability of the auxiliary machines changes at each activation in this enhanced model. Analytically tractable results are obtained by using several mathematical techniques, and the results are demonstrated in the framework of optimized networked server allocation problems.

  3. Handley Page metal construction

    NASA Technical Reports Server (NTRS)

    1929-01-01

    In this report Handley Page construction techniques are shown such as: solid-drawn tubular duralumin spars are used in the stabilizer; plain channel sections are used extensively for minor components; and the manner of assembling them into a stabilizer compression strut is shown.

  4. Client/server study

    NASA Technical Reports Server (NTRS)

    Dezhgosha, Kamyar; Marcus, Robert; Brewster, Stephen

    1995-01-01

    The goal of this project is to find cost-effective and efficient strategies/solutions to integrate existing databases, manage network, and improve productivity of users in a move towards client/server and Integrated Desktop Environment (IDE) at NASA LeRC. The project consisted of two tasks as follows: (1) Data collection, and (2) Database Development/Integration. Under task 1, survey questionnaires and a database were developed. Also, an investigation on commercially available tools for automated data-collection and net-management was performed. As requirements evolved, the main focus has been task 2 which involved the following subtasks: (1) Data gathering/analysis of database user requirements, (2) Database analysis and design, making recommendations for modification of existing data structures into relational database or proposing a common interface to access heterogeneous databases(INFOMAN system, CCNS equipment list, CCNS software list, USERMAN, and other databases), (3) Establishment of a client/server test bed at Central State University (CSU), (4) Investigation of multi-database integration technologies/ products for IDE at NASA LeRC, and (5) Development of prototypes using CASE tools (Object/View) for representative scenarios accessing multi-databases and tables in a client/server environment. Both CSU and NASA LeRC have benefited from this project. CSU team investigated and prototyped cost-effective/practical solutions to facilitate NASA LeRC move to a more productive environment. CSU students utilized new products and gained skills that could be a great resource for future needs of NASA.

  5. Frame architecture for video servers

    NASA Astrophysics Data System (ADS)

    Venkatramani, Chitra; Kienzle, Martin G.

    1999-11-01

    Video is inherently frame-oriented, and most applications, such as commercial video processing, require manipulating video in terms of frames. However, typical video servers treat videos as byte streams and perform random access based on approximate byte offsets supplied by the client. They do not provide a frame- or timecode-oriented API, which is essential for many applications. This paper describes a frame-oriented architecture for video servers. It also describes the implementation in the context of IBM's VideoCharger server. The latter part of the paper describes an application that uses the frame architecture and provides fast and slow-motion scanning capabilities to the server.
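
    A sketch of the core mapping a frame-oriented server needs (the frame rate and byte offsets below are made-up numbers): translate a timecode into a frame number and a frame range into a byte range, instead of having the client guess approximate byte offsets.

```python
FPS = 30  # assumed constant frame rate for this sketch

# frame_index[i] = byte offset where frame i starts, built once at ingest time
# (e.g. from container/GOP metadata); the values here are hypothetical.
frame_index = [0, 14_200, 29_050, 43_800, 58_400]

def timecode_to_frame(hh, mm, ss, ff, fps=FPS):
    """Convert an HH:MM:SS:FF timecode into an absolute frame number."""
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def byte_range_for_frames(first_frame, last_frame, index):
    """Byte range covering frames [first_frame, last_frame]; None end means to EOF."""
    start = index[first_frame]
    end = index[last_frame + 1] if last_frame + 1 < len(index) else None
    return start, end

print(timecode_to_frame(0, 0, 1, 5))             # 35
print(byte_range_for_frames(1, 2, frame_index))  # (14200, 43800)
```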

  6. PACS image security server

    NASA Astrophysics Data System (ADS)

    Cao, Fei; Huang, H. K.

    2004-04-01

    Medical image security in a PACS environment has become a pressing issue as communication of images increasingly extends over open networks, and hospitals are currently hard-pushed by the Health Insurance Portability and Accountability Act (HIPAA) to be HIPAA compliant in ensuring health data security. Other security-related guidelines and technical standards continue to be brought to public attention in healthcare. However, there is not an infrastructure or systematic method to implement and deploy these standards in a PACS. In this paper, we first review the DICOM Part 15 standard for secure communication of medical images and the HIPAA impacts on PACS security, as well as our previous work on image security. Then we outline a security infrastructure in a HIPAA-mandated PACS environment using a dedicated PACS image security server. The server manages its own database of all image security information. It acts as an image Authority for checking and certifying the image origin and integrity upon request by a user, as a secure DICOM gateway to outside connections, and also as a PACS operation monitor for HIPAA-supporting information.
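
    A simplified sketch of the integrity-checking role of such a server (ours, not the paper's implementation): record a keyed digest of each image at archive time and verify it when a user later questions the image origin or integrity.

```python
import hashlib
import hmac

SERVER_SECRET = b"key held only by the security server"   # placeholder secret

def certify(image_bytes: bytes) -> str:
    # Digest stored in the security server's own database at archive time.
    return hmac.new(SERVER_SECRET, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, stored_digest: str) -> bool:
    # Any later modification of the pixel data or header changes the digest.
    return hmac.compare_digest(certify(image_bytes), stored_digest)

original = b"\x00\x01DICOM pixel data..."
digest = certify(original)
print(verify(original, digest))                 # True
print(verify(original + b"tampered", digest))   # False
```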

  7. Do-It-Yourself: A Special Library's Approach to Creating Dynamic Web Pages Using Commercial Off-The-Shelf Applications

    NASA Technical Reports Server (NTRS)

    Steeman, Gerald; Connell, Christopher

    2000-01-01

    Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from managing time to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals and lessons learned in the Web-to-database process (including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Structured Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.
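
    The paper's site used Access and ASP; as an analogous, hedged sketch in Python (the DSN, table, and column names are hypothetical), a dynamic page can be generated by querying the database through a Data Source Name on each request:

```python
import pyodbc
from html import escape

def render_new_acquisitions(dsn="IDALibrary"):
    """Render query results as an HTML table fragment for a dynamic page."""
    conn = pyodbc.connect(f"DSN={dsn}")   # DSN configured in the ODBC manager
    cursor = conn.cursor()
    cursor.execute("SELECT Title, Author, YearAdded FROM Acquisitions ORDER BY YearAdded DESC")
    rows = ["<tr><td>{}</td><td>{}</td><td>{}</td></tr>".format(
                escape(str(title)), escape(str(author)), escape(str(year)))
            for title, author, year in cursor.fetchall()]
    conn.close()
    return "<table>\n" + "\n".join(rows) + "\n</table>"

# A small web framework or CGI script would call render_new_acquisitions() on
# each request, so the page always reflects the current database contents.
```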

  8. NEOS server 4.0 administrative guide.

    SciTech Connect

    Dolan, E. D.

    2001-07-13

    The NEOS Server 4.0 provides a general Internet-based client/server link between users and software applications. The administrative guide covers the fundamental principles behind the operation of the NEOS Server, installation and troubleshooting of the Server software, and implementation details of potential interest to a NEOS Server administrator. The guide also discusses making new software applications available through the Server, including areas of concern to remote solver administrators such as maintaining security, providing usage instructions, and enforcing reasonable restrictions on jobs. The administrative guide is intended both as an introduction to the NEOS Server and as a reference for use when running the Server.

  9. Purge Lock Server

    2012-08-21

    The software provides a simple web API to allow users to request a time window during which a file will not be removed from cache. HPSS provides the concept of a "purge lock". When a purge lock is set on a file, the file will not be removed from disk, entering tape-only state. A lot of network file protocols assume a file is on disk, so it is good to purge lock a file before transferring it using one of those protocols. HPSS's purge lock system is very coarse grained, though. A file is either purge locked or not. Nothing enforces quotas, timely unlocking of purge locks, or management of the races inherent with multiple users wanting to lock/unlock the same file. The Purge Lock Server lets you, through a simple REST API, specify a list of files to purge lock and an expire time, and the system will ensure things happen properly.
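
    A hypothetical client-side sketch of such a request (the endpoint URL and field names are assumptions, not the actual Purge Lock Server API): ask the server to keep a list of HPSS files on disk until a given expiration time.

```python
import time
import requests

def request_purge_lock(files, hours, base_url="https://purgelock.example.org/api"):
    """Ask the (hypothetical) purge-lock endpoint to pin files on disk for `hours`."""
    payload = {
        "files": files,                              # HPSS paths to keep on disk
        "expires": int(time.time()) + hours * 3600,  # unlock automatically after this
    }
    resp = requests.post(f"{base_url}/locks", json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()   # e.g. an identifier that can later be used to release the lock

# Typical use: lock files just before a disk-oriented transfer and let the server
# expire the lock afterwards, so quotas and timely unlocking are enforced.
# request_purge_lock(["/home/user/run42/data.tar"], hours=6)
```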

  10. WMS Server 2.0

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian; Wood, James F.

    2012-01-01

    This software is a simple, yet flexible server of raster map products, compliant with the Open Geospatial Consortium (OGC) Web Map Service (WMS) 1.1.1 protocol. The server is a full implementation of OGC WMS 1.1.1 as a fastCGI client, using the Geospatial Data Abstraction Library (GDAL) for data access. The server can operate in a proxy mode, where all or part of the WMS requests are handled by a back server. The server has explicit support for a colocated tiled WMS, including rapid response to black (no-data) requests. It generates JPEG and PNG images, including 16-bit PNG. The GDAL back end allows great flexibility in data access. The server is a port to a Linux/GDAL platform from the original IRIX/IL platform. It is simpler to configure and use and, depending on the storage format used, it has better performance than other available implementations. The WMS Server 2.0 is a high-performance WMS implementation due to the fastCGI architecture. The use of a GDAL data back end allows for great flexibility. The configuration is relatively simple, based on a single XML file. It provides scaling and cropping, as well as blending of multiple layers based on layer transparency.
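
    As an example of the kind of request this server answers, the following builds a standard OGC WMS 1.1.1 GetMap query (the host and layer names are placeholders; the parameter names come from the WMS 1.1.1 specification):

```python
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "global_mosaic",   # placeholder layer name
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-180,-90,180,90",   # minx,miny,maxx,maxy
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/jpeg",
}
print("http://wms.example.org/wms?" + urlencode(params))
```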

  11. Project EAGLE (Early Academic Gifted Learning Experience): A Program for Gifted and Talented Students (Grades K-3)--Kindergarten Activity Booklets: Xanthus; Zhack; and Activity Pages H-Z.

    ERIC Educational Resources Information Center

    Merkoski, Kay

    Three activity booklets are presented for implementing Project EAGLE, an enrichment program for gifted and talented kindergarten children. The first activity booklet contains a poem by J. D. Evans titled "In Search of the Xanthus," which describes the search for an imaginary beast that leaves an "X" on the spot where it used to be. The second…

  12. CCTOP: a Consensus Constrained TOPology prediction web server.

    PubMed

    Dobson, László; Reményi, István; Tusnády, Gábor E

    2015-07-01

    The Consensus Constrained TOPology prediction (CCTOP; http://cctop.enzim.ttk.mta.hu) server is a web-based application providing transmembrane topology prediction. In addition to utilizing 10 different state-of-the-art topology prediction methods, the CCTOP server incorporates topology information from existing experimental and computational sources available in the PDBTM, TOPDB and TOPDOM databases using the probabilistic framework of hidden Markov model. The server provides the option to precede the topology prediction with signal peptide prediction and transmembrane-globular protein discrimination. The initial result can be recalculated by (de)selecting any of the prediction methods or mapped experiments or by adding user specified constraints. CCTOP showed superior performance to existing approaches. The reliability of each prediction is also calculated, which correlates with the accuracy of the per protein topology prediction. The prediction results and the collected experimental information are visualized on the CCTOP home page and can be downloaded in XML format. Programmable access of the CCTOP server is also available, and an example of client-side script is provided.

  13. CCTOP: a Consensus Constrained TOPology prediction web server

    PubMed Central

    Dobson, László; Reményi, István; Tusnády, Gábor E.

    2015-01-01

    The Consensus Constrained TOPology prediction (CCTOP; http://cctop.enzim.ttk.mta.hu) server is a web-based application providing transmembrane topology prediction. In addition to utilizing 10 different state-of-the-art topology prediction methods, the CCTOP server incorporates topology information from existing experimental and computational sources available in the PDBTM, TOPDB and TOPDOM databases using the probabilistic framework of hidden Markov model. The server provides the option to precede the topology prediction with signal peptide prediction and transmembrane-globular protein discrimination. The initial result can be recalculated by (de)selecting any of the prediction methods or mapped experiments or by adding user specified constraints. CCTOP showed superior performance to existing approaches. The reliability of each prediction is also calculated, which correlates with the accuracy of the per protein topology prediction. The prediction results and the collected experimental information are visualized on the CCTOP home page and can be downloaded in XML format. Programmable access of the CCTOP server is also available, and an example of client-side script is provided. PMID:25943549

  14. Compute Server Performance Results

    NASA Technical Reports Server (NTRS)

    Stockdale, I. E.; Barton, John; Woodrow, Thomas (Technical Monitor)

    1994-01-01

    Parallel-vector supercomputers have been the workhorses of high performance computing. As expectations of future computing needs have risen faster than projected vector supercomputer performance, much work has been done investigating the feasibility of using Massively Parallel Processor systems as supercomputers. An even more recent development is the availability of high performance workstations which have the potential, when clustered together, to replace parallel-vector systems. We present a systematic comparison of floating point performance and price-performance for various compute server systems. A suite of highly vectorized programs was run on systems including traditional vector systems such as the Cray C90, and RISC workstations such as the IBM RS/6000 590 and the SGI R8000. The C90 system delivers 460 million floating point operations per second (FLOPS), the highest single-processor rate of any vendor. However, if the price-performance ratio (PPR) is considered to be most important, then the IBM and SGI processors are superior to the C90 processors. Even without code tuning, the IBM and SGI PPRs of 260 and 220 FLOPS per dollar exceed the C90 PPR of 160 FLOPS per dollar when running our highly vectorized suite.
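
    A small worked example of the price-performance ratio comparison quoted above (the implied processor price is back-calculated from the quoted numbers and is only illustrative):

```python
# PPR = delivered FLOPS per dollar; the figures below are the ones quoted in the abstract.
c90_mflops = 460          # measured MFLOPS, Cray C90 single processor
c90_ppr = 160             # FLOPS per dollar
ibm_ppr, sgi_ppr = 260, 220

implied_c90_price = c90_mflops * 1e6 / c90_ppr   # FLOPS / (FLOPS per dollar) = dollars
print(f"implied C90 processor price: ${implied_c90_price:,.0f}")   # about $2,875,000

# Higher PPR means more delivered FLOPS per dollar, so the IBM and SGI processors
# win on PPR even though the C90 has the fastest single processor.
print(ibm_ppr / c90_ppr, sgi_ppr / c90_ppr)      # 1.625 and 1.375 times the C90's PPR
```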

  15. THttpServer class in ROOT

    NASA Astrophysics Data System (ADS)

    Adamczewski-Musch, Joern; Linev, Sergey

    2015-12-01

    The new THttpServer class in ROOT implements an HTTP server for arbitrary ROOT applications. It is based on the embeddable Civetweb HTTP server and provides direct access to all objects registered with the server. Object data can be provided in different formats: binary, XML, GIF/PNG, and JSON. A generic user interface for THttpServer has been implemented in HTML/JavaScript, based on the JavaScript ROOT development. With any modern web browser one can list, display, and monitor objects available on the server. THttpServer is used in the Go4 framework to provide an HTTP interface to the online analysis.
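
    Because registered objects can be served as JSON, a monitoring client needs nothing but HTTP. A minimal Python sketch of polling one object, where the host, port and object path are placeholders for whatever the ROOT application actually registers:

```python
# Minimal sketch of a THttpServer monitoring client. The host, port and object
# path ("Objects/hist1") are placeholders; the "/root.json" suffix requests the
# JSON representation described above, but verify the path against your
# server's own object tree, and note that field names depend on the object type.
import json
import urllib.request

URL = "http://localhost:8080/Objects/hist1/root.json"  # placeholder path

def fetch_object(url=URL):
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    obj = fetch_object()
    print(obj.get("_typename"), "with", len(obj.get("fArray", [])), "bins")
```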

  16. Using Firefly Tools to Enhance Archive Web Pages

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Ly, L.; Goldina, T.

    2013-10-01

    Astronomy web developers are looking for fast and powerful HTML 5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table display, and spectrum plots on their web pages with minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects including Spitzer, Planck, WISE, PTF, LSST and others.

  17. 8. Photocopy of printed page (original Page 30 of the ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Photocopy of printed page (original Page 30 of the Souvenir Program 1867-1967 Ridgely Centennial) Photographer unknown. Circa 1967. VIEW NORTHEAST, SOUTHWEST FRONT Ridgely's centennial was celebrated in 1967, and page 30 was included in the souvenir brochure. This view shows the subject building with the 1950 modifications to provide for automotive traffic. It was a print of a current photograph. - 510 Central Avenue (Commercial Building), Ridgely, Caroline County, MD

  18. Generic OPC UA Server Framework

    NASA Astrophysics Data System (ADS)

    Nikiel, Piotr P.; Farnham, Benjamin; Filimonov, Viatcheslav; Schlenker, Stefan

    2015-12-01

    This paper describes a new approach for generic design and efficient development of OPC UA servers. Development starts with creation of a design file, in XML format, describing an object-oriented information model of the target system or device. Using this model, the framework generates an executable OPC UA server application, which exposes the per-design OPC UA address space, without the developer writing a single line of code. Furthermore, the framework generates skeleton code into which the developer adds the necessary logic for integration to the target system or device. This approach allows both developers unfamiliar with the OPC UA standard, and advanced OPC UA developers, to create servers for the systems they are experts in while greatly reducing design and development effort as compared to developments based purely on COTS OPC UA toolkits. Higher level software may further benefit from the explicit OPC UA server model by using the XML design description as the basis for generating client connectivity configuration and server data representation. Moreover, having the XML design description at hand facilitates automatic generation of validation tools. In this contribution, the concept and implementation of this framework is detailed along with examples of actual production-level usage in the detector control system of the ATLAS experiment at CERN and beyond.
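
    The central idea, generating server code from an XML information model, can be sketched independently of the actual framework. The toy Python example below parses an invented design file and emits skeleton classes; the XML vocabulary shown is illustrative only and is not the framework's real design schema:

```python
# Illustration of model-driven skeleton generation, loosely in the spirit of the
# framework described above. The XML layout ("class"/"variable" elements) is an
# invented example, not the real design schema used by the CERN framework.
import xml.etree.ElementTree as ET

DESIGN = """
<design>
  <class name="PowerSupply">
    <variable name="voltage" type="double"/>
    <variable name="current" type="double"/>
  </class>
</design>
"""

def generate_skeletons(design_xml):
    root = ET.fromstring(design_xml)
    for cls in root.findall("class"):
        fields = [v.get("name") for v in cls.findall("variable")]
        yield (
            f"class {cls.get('name')}:\n"
            "    # TODO: add device-specific logic here\n"
            + "".join(f"    {f} = None\n" for f in fields)
        )

if __name__ == "__main__":
    for skeleton in generate_skeletons(DESIGN):
        print(skeleton)
```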

  19. ACSM Fit Society Page

    MedlinePlus

  20. Reese Sorenson's Individual Professional Page

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese; Nixon, David (Technical Monitor)

    1998-01-01

    The subject document is a World Wide Web (WWW) page entitled "Reese Sorenson's Individual Professional Page." It can be accessed at "http://george.arc.nasa.gov/sorenson/personal/index.html". The purpose of this page is to make the reader aware of me, who I am, and what I do. It lists my work assignments, my computer experience, my place in the NASA hierarchy, publications by me, awards received by me, my education, and how to contact me. Writing this page was a learning experience, pursuant to an element in my Job Description which calls for me to be able to use the latest computers. This web page contains very little technical information, none of which is classified or sensitive.

  1. Hybrid metrology implementation: server approach

    NASA Astrophysics Data System (ADS)

    Osorio, Carmen; Timoney, Padraig; Vaid, Alok; Elia, Alex; Kang, Charles; Bozdog, Cornel; Yellai, Naren; Grubner, Eyal; Ikegami, Toru; Ikeno, Masahiko

    2015-03-01

    Hybrid metrology (HM) is the practice of combining measurements from multiple toolset types in order to enable or improve metrology for advanced structures. HM is implemented in two phases: Phase-1 includes readiness of the infrastructure to transfer processed data from the first toolset to the second. Phase-2 infrastructure allows simultaneous transfer and optimization of raw data between toolsets such as spectra, images, traces - co-optimization. We discuss the extension of Phase-1 to include direct high-bandwidth communication between toolsets using a hybrid server, enabling seamless fab deployment and further laying the groundwork for Phase-2 high volume manufacturing (HVM) implementation. An example of the communication protocol shows the information that can be used by the hybrid server, differentiating its capabilities from that of a host-based approach. We demonstrate qualification and production implementation of the hybrid server approach using CD-SEM and OCD toolsets for complex 20nm and 14nm applications. Finally we discuss the roadmap for Phase-2 HM implementation through use of the hybrid server.

  2. Internet resources and web pages for pediatric surgeons.

    PubMed

    Lugo-Vicente, H

    2000-02-01

    The Internet, the largest network of connected computers, provides immediate, dynamic, and downloadable information. By re-architecting the workplace and becoming familiar with Internet resources, pediatric surgeons have anticipated the informatics capabilities of this computer-based technology, creating a new vision of work and organization in such areas as patient care, teaching, and research. This review aims to highlight how Internet navigational technology can be a useful educational resource in pediatric surgery, examines web pages of interest, and defines ideas of network communication. Basic Internet resources are electronic mail, discussion groups, file transfer, and the World Wide Web (WWW). Electronic mail is the most useful resource, extending the avenue of learning to an international audience through news or list-server groups. The Pediatric Surgery List Server, the most popular discussion group, is a constant forum for exchange of ideas, difficult cases, consensus on management, and development of our specialty. The WWW provides an all-in-one medium of text, image, sound, and video. Associations, departments, educational sites, organizations, peer-reviewed scientific journals and Medline database web pages of prime interest to pediatric surgeons have been developing at an amazing pace. Future developments of technological advances nurturing our specialty will consist of online journals, telemedicine, international chatting, computer-based training for surgical education, and centralization of cyberspace information into database search sites.

  3. Parallel Computing Using Web Servers and "Servlets".

    ERIC Educational Resources Information Center

    Lo, Alfred; Bloor, Chris; Choi, Y. K.

    2000-01-01

    Describes parallel computing and presents inexpensive ways to implement a virtual parallel computer with multiple Web servers. Highlights include performance measurement of parallel systems; models for using Java and intranet technology including single server, multiple clients and multiple servers, single client; and a comparison of CGI (common…

  4. Nuke@ - a nuclear information internet server

    SciTech Connect

    Slone, B.J. III.; Richardson, C.E.

    1994-12-31

    To facilitate Internet communications between nuclear utilities, vendors, agencies, and other interested parties, an Internet server is being established. This server will provide the nuclear industry with its first file-transfer protocol (ftp) connection point, its second mail server, and a potential telnet connection location.

  5. Code AI Personal Web Pages

    NASA Technical Reports Server (NTRS)

    Garcia, Joseph A.; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The document consists of a publicly available web site (george.arc.nasa.gov) for Joseph A. Garcia's personal web pages in the AI division. Only general information will be posted and no technical material. All the information is unclassified.

  6. Protein knot server: detection of knots in protein structures.

    PubMed

    Kolesov, Grigory; Virnau, Peter; Kardar, Mehran; Mirny, Leonid A

    2007-07-01

    KNOTS (http://knots.mit.edu) is a web server that detects knots in protein structures. Several protein structures have been reported to contain intricate knots. The physiological role of knots and their effect on folding and evolution is an area of active research. The user submits a PDB id or uploads a 3D protein structure in PDB or mmCIF format. The current implementation of the server uses the Alexander polynomial to detect knots. The results of the analysis that are presented to the user are the location of the knot in the structure, the type of the knot and an interactive visualization of the knot. The results can also be downloaded and viewed offline. The server also maintains a regularly updated list of known knots in protein structures.

  7. File servers, networking, and supercomputers

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.

    1991-01-01

    One of the major tasks of a supercomputer center is managing the massive amount of data generated by application codes. A data flow analysis of the San Diego Supercomputer Center is presented that illustrates the hierarchical data buffering/caching capacity requirements and the associated I/O throughput requirements needed to sustain file service and archival storage. Usage paradigms are examined for both tightly-coupled and loosely-coupled file servers linked to the supercomputer by high-speed networks.

  8. File servers, networking, and supercomputers

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.

    1992-01-01

    One of the major tasks of a supercomputer center is managing the massive amount of data generated by application codes. A data flow analysis of the San Diego Supercomputer Center is presented that illustrates the hierarchical data buffering/caching capacity requirements and the associated I/O throughput requirements needed to sustain file service and archival storage. Usage paradigms are examined for both tightly-coupled and loosely-coupled file servers linked to the supercomputer by high-speed networks.

  9. Client-Server Password Recovery

    NASA Astrophysics Data System (ADS)

    Chmielewski, Łukasz; Hoepman, Jaap-Henk; van Rossum, Peter

    Human memory is not perfect - people constantly memorize new facts and forget old ones. One example is forgetting a password, a common problem raised at IT help desks. We present several protocols that allow a user to automatically recover a password from a server using partial knowledge of the password. These protocols can be easily adapted to the personal entropy setting [7], where a user can recover a password only if he can answer a large enough subset of personal questions.

  10. PROMALS3D web server for accurate multiple protein sequence and structure alignments.

    PubMed

    Pei, Jimin; Tang, Ming; Grishin, Nick V

    2008-07-01

    Multiple sequence alignments are essential in computational sequence and structural analysis, with applications in homology detection, structure modeling, function prediction and phylogenetic analysis. We report PROMALS3D web server for constructing alignments for multiple protein sequences and/or structures using information from available 3D structures, database homologs and predicted secondary structures. PROMALS3D shows higher alignment accuracy than a number of other advanced methods. Input of PROMALS3D web server can be FASTA format protein sequences, PDB format protein structures and/or user-defined alignment constraints. The output page provides alignments with several formats, including a colored alignment augmented with useful information about sequence grouping, predicted secondary structures and consensus sequences. Intermediate results of sequence and structural database searches are also available. The PROMALS3D web server is available at: http://prodata.swmed.edu/promals3d/. PMID:18503087

  11. The Faculty Web Page: Contrivance or Continuation?

    ERIC Educational Resources Information Center

    Lennex, Lesia

    2007-01-01

    In an age of Internet education, what does it mean for a tenure/tenure-track faculty to have a web page? How many professors have web pages? If they have a page, what does it look like? Do they really need a web page at all? Many universities have faculty web pages. What do those collective pages look like? In what way do they represent the…

  12. Accelerating Demand Paging for Local and Remote Out-of-Core Visualization

    NASA Technical Reports Server (NTRS)

    Ellsworth, David

    2001-01-01

    This paper describes a new algorithm that improves the performance of application-controlled demand paging for the out-of-core visualization of data sets that are on either local disks or disks on remote servers. The performance improvements come from better overlapping the computation with the page reading process, and by performing multiple page reads in parallel. The new algorithm can be applied to many different visualization algorithms since application-controlled demand paging is not specific to any visualization algorithm. The paper includes measurements that show that the new multi-threaded paging algorithm decreases the time needed to compute visualizations by one third when using one processor and reading data from local disk. The time needed when using one processor and reading data from remote disk decreased by up to 60%. Visualization runs using data from remote disk ran about as fast as ones using data from local disk because the remote runs were able to make use of the remote server's high performance disk array.
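
    The speedup described comes from issuing several page reads concurrently so that computation on already-loaded pages overlaps the I/O for later ones. A rough Python sketch of that pattern, with a placeholder read_page function standing in for the application's out-of-core data access and none of the real algorithm's cache management:

```python
# Rough sketch of overlapping computation with parallel page reads, in the
# spirit of the multi-threaded demand-paging scheme described above.
# `read_page` and the page-id list are placeholders for the application's
# own out-of-core data access.
from concurrent.futures import ThreadPoolExecutor

def read_page(page_id):
    # Placeholder: in a real visualization this would read one page of the
    # data set from local or remote disk.
    return bytes(4096)

def process(page_id, data):
    print(f"processed page {page_id} ({len(data)} bytes)")

def visualize(pages, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Issue all reads up front; iterate results as each page arrives, so
        # computation on early pages overlaps the I/O for later ones.
        for page_id, data in zip(pages, pool.map(read_page, pages)):
            process(page_id, data)

if __name__ == "__main__":
    visualize(range(8))
```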

  13. Minimizing Thermal Stress for Data Center Servers through Thermal-Aware Relocation

    PubMed Central

    Ling, T. C.; Hussain, S. A.

    2014-01-01

    A rise in inlet air temperature may lower the rate of heat dissipation from air-cooled computing servers, introducing thermal stress to these servers. The poorly cooled active servers will then start conducting heat to the neighboring servers, giving rise to hotspot regions of thermal stress inside the data center. As a result, the physical hardware of these servers may fail, causing performance loss, monetary loss, and higher energy consumption by the cooling mechanism. In order to minimize these situations, this paper profiles inlet temperature sensitivity (ITS) and defines the optimum location for each server to minimize the chances of creating a thermal hotspot and thermal stress. Based upon this novel ITS analysis, a thermal state monitoring and server relocation algorithm for data centers is proposed. The contribution of this paper is bringing the peak outlet temperatures of the relocated servers closer to the average outlet temperature by over 5 times, lowering the average peak outlet temperature by 3.5% and minimizing the thermal stress. PMID:24987743
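
    The relocation idea reduces, at its simplest, to placing the most inlet-temperature-sensitive servers into the best-cooled slots. A toy greedy sketch in Python, with invented ITS scores and slot temperatures; the paper's actual profiling and algorithm are more involved:

```python
# Toy greedy relocation: place the most inlet-temperature-sensitive servers
# into the coolest slots. The ITS scores and slot temperatures are invented
# inputs; the paper's actual profiling and relocation algorithm are richer.
def relocate(servers_its, slot_inlet_temps):
    servers = sorted(servers_its, key=servers_its.get, reverse=True)  # most sensitive first
    slots = sorted(slot_inlet_temps, key=slot_inlet_temps.get)        # coolest first
    return dict(zip(servers, slots))

if __name__ == "__main__":
    its = {"srv1": 0.9, "srv2": 0.2, "srv3": 0.6}
    temps = {"rack-top": 27.0, "rack-mid": 24.5, "rack-bottom": 22.0}
    print(relocate(its, temps))
```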

  14. Design of Educational Web Pages

    ERIC Educational Resources Information Center

    Galan, Jose Gomez; Blanco, Soledad Mateos

    2004-01-01

    The methodological characteristics of teaching in primary and secondary education make it necessary to revise the pedagogical and instructive lines with which to introduce the new Information and Communication Technologies into the school context. The construction of Web pages that can be used to improve student learning is, therefore, fundamental…

  15. Learning through Web Page Design.

    ERIC Educational Resources Information Center

    Peel, Deborah

    2001-01-01

    Describes and evaluates the use of Web page design in an undergraduate course in the United Kingdom on town planning. Highlights include incorporating information and communication technologies into higher education; and a theoretical framework for the use of educational technology. (LRW)

  16. National Medical Terminology Server in Korea

    NASA Astrophysics Data System (ADS)

    Lee, Sungin; Song, Seung-Jae; Koh, Soonjeong; Lee, Soo Kyoung; Kim, Hong-Gee

    Interoperable EHR (Electronic Health Record) necessitates at least the use of standardized medical terminologies. This paper describes a medical terminology server, LexCare Suite, which houses terminology management applications, such as a terminology editor, and a terminology repository populated with international standard terminology systems such as the Systematized Nomenclature of Medicine (SNOMED). The server is intended to satisfy the need of local primary to tertiary hospitals for quality terminology systems. Our partner general hospitals have used the server to test its applicability. This paper describes the server and the results of the applicability test.

  17. UniTree Name Server internals

    SciTech Connect

    Mecozzi, D.; Minton, J.

    1996-01-01

    The UniTree Name Server (UNS) is one of several servers which make up the UniTree storage system. The Name Server is responsible for mapping names to capabilities. Names are generally human-readable ASCII strings of any length. Capabilities are unique 256-bit identifiers that point to files, directories, or symbolic links. The Name Server implements a UNIX-style hierarchical directory structure to facilitate name-to-capability mapping. The principal task of the Name Server is to manage the directories which make up the UniTree directory structure. The principal clients of the Name Server are the FTP Daemon, NFS and a few UniTree utility routines. However, the Name Server is a generalized server and will accept messages from any client. The purpose of this paper is to describe the internal workings of the UniTree Name Server. In cases where it seems appropriate, the motivation for a particular choice of algorithm, as well as a description of the algorithm itself, will be given.
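
    As a toy illustration of the data structure described, the following Python sketch resolves UNIX-style paths through a hierarchical directory to 256-bit capabilities; it is only an illustration, not the UniTree implementation or protocol:

```python
# Toy illustration of a hierarchical name-to-capability mapping of the kind
# the UniTree Name Server provides. Capabilities are modelled as random
# 256-bit identifiers; the real server's storage and protocol are not shown.
import secrets

class Directory:
    def __init__(self):
        self.capability = secrets.token_hex(32)  # 256-bit identifier
        self.entries = {}                        # name -> Directory or file capability

    def mkdir(self, name):
        self.entries[name] = Directory()
        return self.entries[name]

    def create_file(self, name):
        self.entries[name] = secrets.token_hex(32)
        return self.entries[name]

def resolve(root, path):
    node = root
    for part in filter(None, path.split("/")):
        if not isinstance(node, Directory) or part not in node.entries:
            raise FileNotFoundError(path)
        node = node.entries[part]
    return node.capability if isinstance(node, Directory) else node

if __name__ == "__main__":
    root = Directory()
    home = root.mkdir("home")
    home.create_file("data.bin")
    print(resolve(root, "/home/data.bin"))
```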

  18. HDF-EOS Web Server

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A shell script has been written as a means of automatically making HDF-EOS-formatted data sets available via the World Wide Web. ("HDF-EOS" and variants thereof are defined in the first of the two immediately preceding articles.) The shell script chains together some software tools developed by the Data Usability Group at Goddard Space Flight Center to perform the following actions: Extract metadata in Object Definition Language (ODL) from an HDF-EOS file, Convert the metadata from ODL to Extensible Markup Language (XML), Reformat the XML metadata into human-readable Hypertext Markup Language (HTML), Publish the HTML metadata and the original HDF-EOS file to a Web server and an Open-source Project for a Network Data Access Protocol (OPeN-DAP) server computer, and Reformat the XML metadata and submit the resulting file to the EOS Clearinghouse, which is a Web-based metadata clearinghouse that facilitates searching for, and exchange of, Earth-Science data.
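
    The chain of steps can be expressed as a short script. The Python sketch below mirrors the sequence described above, but every command name is a placeholder, since the actual Data Usability Group tools are not named in this summary:

```python
# Sketch of the publishing chain described above, expressed in Python instead
# of the original shell script. Every command name below is a placeholder;
# the actual Data Usability Group tools are not named in this summary.
import subprocess
from pathlib import Path

def publish(hdfeos_file, web_root="/var/www/hdfeos"):
    stem = Path(hdfeos_file).stem
    odl, xml, html = f"{stem}.odl", f"{stem}.xml", f"{stem}.html"
    subprocess.run(["extract_odl", hdfeos_file, "-o", odl], check=True)  # placeholder tool
    subprocess.run(["odl2xml", odl, "-o", xml], check=True)              # placeholder tool
    subprocess.run(["xml2html", xml, "-o", html], check=True)            # placeholder tool
    for product in (hdfeos_file, html):
        subprocess.run(["cp", product, web_root], check=True)            # publish to the Web server

if __name__ == "__main__":
    publish("example.hdf")
```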

  19. Promoting Metacognition in First Year Anatomy Laboratories Using Plasticine Modeling and Drawing Activities: A Pilot Study of the "Blank Page" Technique

    ERIC Educational Resources Information Center

    Naug, Helen L.; Colson, Natalie J.; Donner, Daniel G.

    2011-01-01

    Many first year students of anatomy and physiology courses demonstrate an inability to self-regulate their learning. To help students increase their awareness of their own learning in a first year undergraduate anatomy course, we piloted an exercise that incorporated the processes of (1) active learning: drawing and plasticine modeling and (2)…

  20. Market study: Tactile paging system

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A market survey was conducted regarding the commercialization potential and key market factors relevant to a tactile paging system for deaf-blind people. The purpose of the tactile paging system is to communicate to the deaf-blind people in an institutional environment. The system consists of a main console and individual satellite wrist units. The console emits three signals by telemetry to the wrist com (receiving unit) which will measure approximately 2 x 4 x 3/4 inches and will be fastened to the wrist by a strap. The three vibration signals are fire alarm, time period indication, and a third signal which will alert the wearer of the wrist com to the fact that the pin on the top of the wrist is emitting a morse coded message. The Morse code message can be felt and recognized with the finger.

  1. Interfaces for Distributed Systems of Information Servers.

    ERIC Educational Resources Information Center

    Kahle, Brewster; And Others

    1992-01-01

    Describes two systems--Wide Area Information Servers (WAIS) and Rosebud--that provide protocol-based mechanisms for accessing remote full-text information servers. Design constraints, human interface design, and implementation are examined for five interfaces to these systems developed to run on the Macintosh or Unix terminals. Sample screen…

  2. Optimizing the NASA Technical Report Server.

    ERIC Educational Resources Information Center

    Nelson, Michael L.; Maa, Ming-Hokng

    1996-01-01

    Modifying the NASA Technical Report Server (NTRS), a World Wide Web report distribution NASA technical publications service, has enhanced its performance, protocol support, and human interfacing. This article discusses the original and revised NTRS architecture, sequential and parallel query methods, and wide area information server (WAIS) uniform…

  3. Get the Word Out with List Servers

    ERIC Educational Resources Information Center

    Goldberg, Laurence

    2006-01-01

    In this article, the author details the use of electronic mail server in their school. In their school district of about 7,300 students in suburban Philadelphia (Abington SD), electronic mail list servers are now being used, along with other methods of communication, to disseminate information quickly and widely. They began by manually maintaining…

  4. You're a What? Process Server

    ERIC Educational Resources Information Center

    Torpey, Elka

    2012-01-01

    In this article, the author talks about the role and functions of a process server. The job of a process server is to hand deliver legal documents to the people involved in court cases. These legal documents range from a summons to appear in court to a subpoena for producing evidence. Process serving can involve risk, as some people take out their…

  5. Performance of a distributed superscalar storage server

    NASA Technical Reports Server (NTRS)

    Finestead, Arlan; Yeager, Nancy

    1993-01-01

    The RS/6000 performed well in our test environment. The potential exists for the RS/6000 to act as a departmental server for a small number of users, rather than as a high-speed archival server. Multiple UniTree Disk Servers utilizing one UniTree Name Server could be developed, which would allow for a cost-effective archival system. Our performance tests were clearly limited by the network bandwidth. The performance gathered by the LibUnix testing shows that UniTree is capable of exceeding Ethernet speeds on an RS/6000 Model 550. The performance of FTP might be significantly faster if asked to perform across a higher-bandwidth network. The UniTree Name Server also showed signs of being a potential bottleneck. UniTree sites that would require a high ratio of file creations and deletions to reads and writes would run into this bottleneck. It is possible to improve the UniTree Name Server performance by bypassing the UniTree LibUnix Library altogether, communicating directly with the UniTree Name Server, and optimizing creations. Although testing was performed in a less than ideal environment, hopefully the performance statistics stated in this paper will give end users a realistic idea as to what performance they can expect in this type of setup.

  6. JPred4: a protein secondary structure prediction server.

    PubMed

    Drozdetskiy, Alexey; Cole, Christian; Procter, James; Barton, Geoffrey J

    2015-07-01

    JPred4 (http://www.compbio.dundee.ac.uk/jpred4) is the latest version of the popular JPred protein secondary structure prediction server which provides predictions by the JNet algorithm, one of the most accurate methods for secondary structure prediction. In addition to protein secondary structure, JPred also makes predictions of solvent accessibility and coiled-coil regions. The JPred service runs up to 94 000 jobs per month and has carried out over 1.5 million predictions in total for users in 179 countries. The JPred4 web server has been re-implemented in the Bootstrap framework and JavaScript to improve its design, usability and accessibility from mobile devices. JPred4 features higher accuracy, with a blind three-state (α-helix, β-strand and coil) secondary structure prediction accuracy of 82.0% while solvent accessibility prediction accuracy has been raised to 90% for residues <5% accessible. Reporting of results is enhanced both on the website and through the optional email summaries and batch submission results. Predictions are now presented in SVG format with options to view full multiple sequence alignments with and without gaps and insertions. Finally, the help-pages have been updated and tool-tips added as well as step-by-step tutorials. PMID:25883141

  7. JPred4: a protein secondary structure prediction server

    PubMed Central

    Drozdetskiy, Alexey; Cole, Christian; Procter, James; Barton, Geoffrey J.

    2015-01-01

    JPred4 (http://www.compbio.dundee.ac.uk/jpred4) is the latest version of the popular JPred protein secondary structure prediction server which provides predictions by the JNet algorithm, one of the most accurate methods for secondary structure prediction. In addition to protein secondary structure, JPred also makes predictions of solvent accessibility and coiled-coil regions. The JPred service runs up to 94 000 jobs per month and has carried out over 1.5 million predictions in total for users in 179 countries. The JPred4 web server has been re-implemented in the Bootstrap framework and JavaScript to improve its design, usability and accessibility from mobile devices. JPred4 features higher accuracy, with a blind three-state (α-helix, β-strand and coil) secondary structure prediction accuracy of 82.0% while solvent accessibility prediction accuracy has been raised to 90% for residues <5% accessible. Reporting of results is enhanced both on the website and through the optional email summaries and batch submission results. Predictions are now presented in SVG format with options to view full multiple sequence alignments with and without gaps and insertions. Finally, the help-pages have been updated and tool-tips added as well as step-by-step tutorials. PMID:25883141

  8. Photojournal Home Page Graphic 2007

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This image is an unannotated version of the Photojournal Home Page graphic released in October 2007. This digital collage contains a highly stylized rendition of our solar system and points beyond. As this graphic was intended to be used as a navigation aid in searching for data within the Photojournal, certain artistic embellishments have been added (color, location, etc.). Several data sets from various planetary and astronomy missions were combined to create this image.

  9. Planetary Photojournal Home Page Graphic

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This image is an unannotated version of the Planetary Photojournal Home Page graphic. This digital collage contains a highly stylized rendition of our solar system and points beyond. As this graphic was intended to be used as a navigation aid in searching for data within the Photojournal, certain artistic embellishments have been added (color, location, etc.). Several data sets from various planetary and astronomy missions were combined to create this image.

  10. Implementing a Physician's Workstation using client/server technology and the distributed computing environment.

    PubMed Central

    Pham, T. Q.; Young, C. Y.; Tang, P. C.; Suermondt, H. J.; Annevelink, J.

    1994-01-01

    PWS is a physician's workstation research prototype developed to explore the use of information management tools by physicians in the context of patient care. The original prototype was implemented in a client/server architecture using a broadcast message server. As we expanded the scope of the prototyping activities, we identified the limitations of the broadcast message server in the areas of scalability, security, and interoperability. To address these issues, we reimplemented PWS using the Open Software Foundation's Distributed Computing Environment (DCE). We describe the rationale for using DCE, the migration process, and the benefits achieved. Future work and recommendations are discussed. PMID:7950003

  11. The Argonne Voyager multimedia server

    SciTech Connect

    Disz, T.; Judson, I.; Olson, R.; Stevens, R.

    1997-07-01

    With the growing presence of multimedia-enabled systems, one will see an integration of collaborative computing concepts into the everyday environments of future scientific and technical workplaces. Desktop teleconferencing is in common use today, while more complex desktop teleconferencing technology that relies on the availability of multipoint (greater than two nodes) enabled tools is now starting to become available on PCs. A critical problem when using these collaboration tools is the inability to easily archive multistream, multipoint meetings and make the content available to others. Ideally one would like the ability to capture, record, playback, index, annotate and distribute multimedia stream data as easily as one currently handles text or still image data. While the ultimate goal is still some years away, the Argonne Voyager project is aimed at exploring and developing the media server technology needed to provide a flexible virtual multipoint recording/playback capability. In this article the authors describe the motivating requirements, architecture, implementation, operation, performance, and related work.

  12. Heap/stack guard pages using a wakeup unit

    DOEpatents

    Gooding, Thomas M; Satterfield, David L; Steinmacher-Burow, Burkhard

    2014-04-29

    A method and system for providing a memory access check on a processor including the steps of detecting accesses to a memory device including level-1 cache using a wakeup unit. The method includes invalidating level-1 cache ranges corresponding to a guard page, and configuring a plurality of wakeup address compare (WAC) registers to allow access to selected WAC registers. The method selects one of the plurality of WAC registers, and sets up a WAC register related to the guard page. The method configures the wakeup unit to interrupt on access of the selected WAC register. The method detects access of the memory device using the wakeup unit when a guard page is violated. The method generates an interrupt to the core using the wakeup unit, and determines the source of the interrupt. The method detects the activated WAC registers assigned to the violated guard page, and initiates a response.

  13. PiRaNhA: a server for the computational prediction of RNA-binding residues in protein sequences

    PubMed Central

    Murakami, Yoichi; Spriggs, Ruth V.; Nakamura, Haruki; Jones, Susan

    2010-01-01

    The PiRaNhA web server is a publicly available online resource that automatically predicts the location of RNA-binding residues (RBRs) in protein sequences. The goal of functional annotation of sequences in the field of RNA binding is to provide predictions of high accuracy that require only small numbers of targeted mutations for verification. The PiRaNhA server uses a support vector machine (SVM), with position-specific scoring matrices, residue interface propensity, predicted residue accessibility and residue hydrophobicity as features. The server allows the submission of up to 10 protein sequences, and the predictions for each sequence are provided on a web page and via email. The prediction results are provided in sequence format with predicted RBRs highlighted, in text format with the SVM threshold score indicated and as a graph which enables users to quickly identify those residues above any specific SVM threshold. The graph effectively enables the increase or decrease of the false positive rate. When tested on a non-redundant data set of 42 protein sequences not used in training, the PiRaNhA server achieved an accuracy of 85%, specificity of 90% and a Matthews correlation coefficient of 0.41 and outperformed other publicly available servers. The PiRaNhA prediction server is freely available at http://www.bioinformatics.sussex.ac.uk/PIRANHA. PMID:20507911
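
    The prediction machinery, an SVM over per-residue features, can be illustrated with a toy example. In the sketch below the feature vectors and labels are fabricated and the feature set is reduced to three values; the real server uses position-specific scoring matrices, interface propensity, predicted accessibility and hydrophobicity with a trained model:

```python
# Toy illustration of SVM-based per-residue classification in the spirit of
# PiRaNhA. Features (e.g. a conservation score, predicted accessibility,
# hydrophobicity) and labels are fabricated for demonstration only.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 3))                           # fake per-residue feature vectors
y_train = (X_train[:, 0] + X_train[:, 2] > 0.5).astype(int)   # fake RNA-binding labels

model = SVC(kernel="rbf")
model.fit(X_train, y_train)

X_new = rng.normal(size=(5, 3))           # residues of a query sequence
scores = model.decision_function(X_new)   # threshold these to trade off false positives
print(np.round(scores, 2), (scores > 0).astype(int))
```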

  14. Interstellar Initiative Web Page Design

    NASA Technical Reports Server (NTRS)

    Mehta, Alkesh

    1999-01-01

    This summer at NASA/MSFC, I have contributed to two projects: Interstellar Initiative Web Page Design and Lenz's Law Relative Motion Demonstration. In the Web Design Project, I worked on an Outline. The Web Design Outline was developed to provide a foundation for a Hierarchy Tree Structure. The Outline would help design a Website information base for future and near-term missions. The Website would give in-depth information on Propulsion Systems and Interstellar Travel. The Lenz's Law Relative Motion Demonstrator is discussed in this volume by Russell Lee.

  15. Interstellar Initiative Web Page Design

    NASA Astrophysics Data System (ADS)

    Mehta, Alkesh

    1999-10-01

    This summer at NASA/MSFC, I have contributed to two projects: Interstellar Initiative Web Page Design and Lenz's Law Relative Motion Demonstration. In the Web Design Project, I worked on an Outline. The Web Design Outline was developed to provide a foundation for a Hierarchy Tree Structure. The Outline would help design a Website information base for future and near-term missions. The Website would give in-depth information on Propulsion Systems and Interstellar Travel. The Lenz's Law Relative Motion Demonstrator is discussed in this volume by Russell Lee.

  16. Web server with ATMEGA 2560 microcontroller

    NASA Astrophysics Data System (ADS)

    Răduca, E.; Ungureanu-Anghel, D.; Nistor, L.; Haţiegan, C.; Drăghici, S.; Chioncel, C.; Spunei, E.; Lolea, R.

    2016-02-01

    This paper presents the design and building of a Web server to command, control and monitor, at a distance, a variety of industrial or personal equipment and/or sensors. The server runs personal software, which can be written by users and can work with many types of operating systems. The authors realized the Web server based on two platforms, a microcontroller (UC) board and a network board. The source code was written in the "open source" Arduino 1.0.5 language.

  17. The Matpar Server on the HP Exemplar

    NASA Technical Reports Server (NTRS)

    Springer, Paul

    2000-01-01

    This presentation reviews the design of Matlab for parallel processing on a parallel system. Matlab was found to be too slow on many large problems, and with the Next Generation Space Telescope requiring greater capability, work began in early 1996 on parallel extensions to Matlab, called Matpar. This presentation reviews the architecture, the functionality, and the design of Matpar. The design utilizes a client-server strategy, with the client code written in C and the object-oriented server code written in C++. The client/server approach for Matpar provides ease of use and good speed.

  18. The SDSS data archive server

    SciTech Connect

    Neilsen, Eric H., Jr.; /Fermilab

    2007-10-01

    The Sloan Digital Sky Survey (SDSS) Data Archive Server (DAS) provides public access to data files produced by the SDSS data reduction pipeline. This article discusses challenges in public distribution of data of this volume and complexity, and how the project addressed them. The Sloan Digital Sky Survey is an astronomical survey covering roughly one quarter of the night sky. It contains images of this area, a catalog of almost 300 million objects detected in those images, and spectra of more than a million of these objects. The catalog of objects includes a variety of data on each object. These data include not only basic information but also fit parameters for a variety of models, classifications by sophisticated object classification algorithms, statistical parameters, and more. If the survey contains the spectrum of an object, the catalog includes a variety of other parameters derived from its spectrum. Data processing and catalog generation, described more completely in the SDSS Early Data Release paper, consist of several stages: collection of imaging data, processing of imaging data, selection of spectroscopic targets from catalogs generated from the imaging data, collection of spectroscopic data, processing of spectroscopic data, and loading of processed data into a database. Each of these stages is itself a complex process. For example, the software that processes the imaging data determines and removes some instrumental signatures in the raw images to create 'corrected frames', models the point spread function, models and removes the sky background, detects objects, measures object positions, measures the radial profile and other morphological parameters for each object, measures the brightness of each object using a variety of methods, classifies the objects, calibrates the brightness measurements against survey standards, and produces a variety of quality assurance plots and diagnostic tables. The complexity of the spectroscopic data

  19. The network-enabled optimization system server

    SciTech Connect

    Mesnier, M.P.

    1995-08-01

    Mathematical optimization is a technology under constant change and advancement, drawing upon the most efficient and accurate numerical methods to date. Further, these methods can be tailored for a specific application or generalized to accommodate a wider range of problems. This perpetual change creates an ever growing field, one that is often difficult to stay abreast of. Hence, the impetus behind the Network-Enabled Optimization System (NEOS) server, which aims to provide users, both novice and expert, with a guided tour through the expanding world of optimization. The NEOS server is responsible for bridging the gap between users and the optimization software they seek. More specifically, the NEOS server will accept optimization problems over the Internet and return a solution to the user either interactively or by e-mail. This paper discusses the current implementation of the server.

  20. PDS: A Performance Database Server

    DOE PAGES

    Berry, Michael W.; Dongarra, Jack J.; Larose, Brian H.; Letsche, Todd A.

    1994-01-01

    The process of gathering, archiving, and distributing computer benchmark data is a cumbersome task usually performed by computer users and vendors with little coordination. Most important, there is no publicly available central depository of performance data for all ranges of machines from personal computers to supercomputers. We present an Internet-accessible performance database server (PDS) that can be used to extract current benchmark data and literature. As an extension to the X-Windows-based user interface (Xnetlib) to the Netlib archival system, PDS provides an on-line catalog of public domain computer benchmarks such as the LINPACK benchmark, Perfect benchmarks, and the NAS parallel benchmarks. PDS does not reformat or present the benchmark data in any way that conflicts with the original methodology of any particular benchmark; it is thereby devoid of any subjective interpretations of machine performance. We believe that all branches (research laboratories, academia, and industry) of the general computing community can use this facility to archive performance metrics and make them readily available to the public. PDS can provide a more manageable approach to the development and support of a large dynamic database of published performance metrics.

  1. RCD+: Fast loop modeling server.

    PubMed

    López-Blanco, José Ramón; Canosa-Valls, Alejandro Jesús; Li, Yaohang; Chacón, Pablo

    2016-07-01

    Modeling loops is a critical and challenging step in protein modeling and prediction. We have developed a quick online service (http://rcd.chaconlab.org) for ab initio loop modeling combining a coarse-grained conformational search with a full-atom refinement. Our original Random Coordinate Descent (RCD) loop closure algorithm has been greatly improved to enrich the sampling distribution towards near-native conformations. These improvements include a new workflow optimization, MPI-parallelization and fast backbone angle sampling based on neighbor-dependent Ramachandran probability distributions. The server starts by efficiently searching the vast conformational space from only the loop sequence information and the environment atomic coordinates. The generated closed loop models are subsequently ranked using a fast distance-orientation dependent energy filter. Top-ranked loops are refined with the Rosetta energy function to obtain accurate all-atom predictions that can be interactively inspected in a user-friendly web interface. Using standard benchmarks, the average root mean squared deviation (RMSD) is 0.8 and 1.4 Å for 8- and 12-residue loops, respectively, in the challenging modeling scenario where the side chains of the loop environment are fully remodeled. These results are not only very competitive compared to those obtained with public state-of-the-art methods, but they are also obtained ∼10-fold faster. PMID:27151199

  2. RCD+: Fast loop modeling server

    PubMed Central

    López-Blanco, José Ramón; Canosa-Valls, Alejandro Jesús; Li, Yaohang; Chacón, Pablo

    2016-01-01

    Modeling loops is a critical and challenging step in protein modeling and prediction. We have developed a quick online service (http://rcd.chaconlab.org) for ab initio loop modeling combining a coarse-grained conformational search with a full-atom refinement. Our original Random Coordinate Descent (RCD) loop closure algorithm has been greatly improved to enrich the sampling distribution towards near-native conformations. These improvements include a new workflow optimization, MPI-parallelization and fast backbone angle sampling based on neighbor-dependent Ramachandran probability distributions. The server starts by efficiently searching the vast conformational space from only the loop sequence information and the environment atomic coordinates. The generated closed loop models are subsequently ranked using a fast distance-orientation dependent energy filter. Top-ranked loops are refined with the Rosetta energy function to obtain accurate all-atom predictions that can be interactively inspected in a user-friendly web interface. Using standard benchmarks, the average root mean squared deviation (RMSD) is 0.8 and 1.4 Å for 8- and 12-residue loops, respectively, in the challenging modeling scenario where the side chains of the loop environment are fully remodeled. These results are not only very competitive compared to those obtained with public state-of-the-art methods, but they are also obtained ∼10-fold faster. PMID:27151199

  3. PEM public key certificate cache server

    NASA Astrophysics Data System (ADS)

    Cheung, T.

    1993-12-01

    Privacy Enhanced Mail (PEM) provides privacy enhancement services to users of Internet electronic mail. Confidentiality, authentication, message integrity, and non-repudiation of origin are provided by applying cryptographic measures to messages transferred between end systems by the Message Transfer System. PEM supports both symmetric and asymmetric key distribution. However, the prevalent implementation uses a public key certificate-based strategy, modeled after the X.509 directory authentication framework. This scheme provides an infrastructure compatible with X.509. According to RFC 1422, public key certificates can be stored in directory servers, transmitted via non-secure message exchanges, or distributed via other means. Directory services provide a specialized distributed database for OSI applications. The directory contains information about objects and provides structured mechanisms for accessing that information. Since directory services are not widely available now, a good approach is to manage certificates in a centralized certificate server. This document describes the detailed design of a centralized certificate cache server. This server manages a cache of certificates and a cache of Certificate Revocation Lists (CRLs) for PEM applications. PEM applications contact the server to obtain/store certificates and CRLs. The server software is programmed in C and ELROS. To use this server, ISODE has to be configured and installed properly. The ISODE library 'libisode.a' has to be linked together with this library because ELROS uses the transport layer functions provided by 'libisode.a.' The X.500 DAP library that is included with the ELROS distribution also has to be linked in, since the server uses the DAP library functions to communicate with directory servers.

  4. "I didn't know her, but…": parasocial mourning of mediated deaths on Facebook RIP pages

    NASA Astrophysics Data System (ADS)

    Klastrup, Lisbeth

    2015-04-01

    This article examines the use of six Danish "Rest in Peace" (RIP) memorial pages. The article focuses on the relation between news media and RIP page use, in relation to general communicative practices on these pages. Based on an analysis of press coverage of the deaths of six young people and a close analysis of 1,015 comments extracted from the RIP pages created to memorialize them, it is shown that their deaths attracted considerable media attention, as did the RIP pages themselves. Comment activity seems to reflect the news stories in the way the commenters refer to the context of death and the emotional distress they experience, but most comments on the RIP pages are conventional expressions of sympathy and "RIP" wishes. The article concludes that public RIP pages might be understood as virtual spontaneous shrines, affording an emerging practice of "RIP-ing."

  5. DINAMelt web server for nucleic acid melting prediction

    PubMed Central

    Markham, Nicholas R.; Zuker, Michael

    2005-01-01

    The DINAMelt web server simulates the melting of one or two single-stranded nucleic acids in solution. The goal is to predict not just a melting temperature for a hybridized pair of nucleic acids, but entire equilibrium melting profiles as a function of temperature. The two molecules are not required to be complementary, nor must the two strand concentrations be equal. Competition among different molecular species is automatically taken into account. Calculations consider not only the heterodimer, but also the two possible homodimers, as well as the folding of each single-stranded molecule. For each of these five molecular species, free energies are computed by summing Boltzmann factors over every possible hybridized or folded state. For temperatures within a user-specified range, calculations predict species mole fractions together with the free energy, enthalpy, entropy and heat capacity of the ensemble. Ultraviolet (UV) absorbance at 260 nm is simulated using published extinction coefficients and computed base pair probabilities. All results are available as text files and plots are provided for species concentrations, heat capacity and UV absorbance versus temperature. This server is connected to an active research program and should evolve as new theory and software are developed. The server URL is . PMID:15980540
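
    As a much simpler illustration of the underlying thermodynamics, the folded fraction of a single species in a two-state model follows directly from the Boltzmann factor. The ΔH and ΔS values in the sketch below are arbitrary; DINAMelt's actual calculation sums Boltzmann factors over every state of up to five molecular species:

```python
# Two-state melting curve for a single folded species:
#   K(T) = exp(-(dH - T*dS) / (R*T)),  fraction folded f = K / (1 + K).
# The dH/dS values are arbitrary illustrations; DINAMelt itself sums Boltzmann
# factors over every hybridized or folded state of all five species.
import math

R = 1.987e-3            # gas constant, kcal/(mol*K)
dH, dS = -60.0, -0.17   # kcal/mol and kcal/(mol*K), invented example values

def fraction_folded(T_celsius):
    T = T_celsius + 273.15
    K = math.exp(-(dH - T * dS) / (R * T))
    return K / (1.0 + K)

if __name__ == "__main__":
    for t in range(20, 101, 10):
        print(f"{t:3d} C  folded fraction = {fraction_folded(t):.3f}")
```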

  6. ACFIS: a web server for fragment-based drug discovery

    PubMed Central

    Hao, Ge-Fei; Jiang, Wen; Ye, Yuan-Nong; Wu, Feng-Xu; Zhu, Xiao-Lei; Guo, Feng-Biao; Yang, Guang-Fu

    2016-01-01

    In order to foster innovation and improve the effectiveness of drug discovery, there is a considerable interest in exploring unknown ‘chemical space’ to identify new bioactive compounds with novel and diverse scaffolds. Hence, fragment-based drug discovery (FBDD) was developed rapidly due to its advanced expansive search for ‘chemical space’, which can lead to a higher hit rate and ligand efficiency (LE). However, computational screening of fragments is always hampered by the promiscuous binding model. In this study, we developed a new web server Auto Core Fragment in silico Screening (ACFIS). It includes three computational modules, PARA_GEN, CORE_GEN and CAND_GEN. ACFIS can generate core fragment structure from the active molecule using fragment deconstruction analysis and perform in silico screening by growing fragments to the junction of core fragment structure. An integrated energy calculation rapidly identifies which fragments fit the binding site of a protein. We constructed a simple interface to enable users to view top-ranking molecules in 2D and the binding mode in 3D for further experimental exploration. This makes the ACFIS a highly valuable tool for drug discovery. The ACFIS web server is free and open to all users at http://chemyang.ccnu.edu.cn/ccb/server/ACFIS/. PMID:27150808

  7. ACFIS: a web server for fragment-based drug discovery.

    PubMed

    Hao, Ge-Fei; Jiang, Wen; Ye, Yuan-Nong; Wu, Feng-Xu; Zhu, Xiao-Lei; Guo, Feng-Biao; Yang, Guang-Fu

    2016-07-01

    In order to foster innovation and improve the effectiveness of drug discovery, there is a considerable interest in exploring unknown 'chemical space' to identify new bioactive compounds with novel and diverse scaffolds. Hence, fragment-based drug discovery (FBDD) was developed rapidly due to its advanced expansive search for 'chemical space', which can lead to a higher hit rate and ligand efficiency (LE). However, computational screening of fragments is always hampered by the promiscuous binding model. In this study, we developed a new web server Auto Core Fragment in silico Screening (ACFIS). It includes three computational modules, PARA_GEN, CORE_GEN and CAND_GEN. ACFIS can generate core fragment structure from the active molecule using fragment deconstruction analysis and perform in silico screening by growing fragments to the junction of core fragment structure. An integrated energy calculation rapidly identifies which fragments fit the binding site of a protein. We constructed a simple interface to enable users to view top-ranking molecules in 2D and the binding mode in 3D for further experimental exploration. This makes the ACFIS a highly valuable tool for drug discovery. The ACFIS web server is free and open to all users at http://chemyang.ccnu.edu.cn/ccb/server/ACFIS/. PMID:27150808

  8. Finding Specification Pages from the Web

    NASA Astrophysics Data System (ADS)

    Yoshinaga, Naoki; Torisawa, Kentaro

    This paper presents a method of finding a specification page on the Web for a given object (e.g., "Ch. d'Yquem") and its class label (e.g., "wine"). A specification page for an object is a Web page which gives concise attribute-value information about the object (e.g., "county"-"Sauternes") in well-formatted structures. A simple unsupervised method using layout and symbolic decoration cues was applied to a large number of Web pages to acquire candidate attributes for each class (e.g., "county" for the class "wine"). We then filter out irrelevant words from the putative attributes through an author-aware scoring function that we call site frequency. We used the acquired attributes to select a representative specification page for a given object from the Web pages retrieved by a normal search engine. Experimental results revealed that our system greatly outperformed the normal search engine in terms of this specification retrieval.
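
    The filtering step can be illustrated with a toy version of the site-frequency idea: score each candidate attribute by how many distinct sites use it, so that terms repeated only within a single site rank low. The sketch below is an illustration, not the authors' exact scoring function:

```python
# Toy version of an author-aware "site frequency" score: a candidate attribute
# is counted once per distinct site that uses it, so terms inflated by a single
# site rank low. This is an illustration, not the paper's exact function.
from collections import defaultdict
from urllib.parse import urlparse

def site_frequency(candidate_attrs_by_page):
    sites = defaultdict(set)  # attribute -> set of sites using it
    for url, attrs in candidate_attrs_by_page.items():
        site = urlparse(url).netloc
        for attr in attrs:
            sites[attr].add(site)
    return {attr: len(s) for attr, s in sites.items()}

if __name__ == "__main__":
    pages = {
        "http://a.example/wine1": {"county", "vintage", "promo-code"},
        "http://a.example/wine2": {"county", "promo-code"},
        "http://b.example/wine3": {"county", "vintage"},
    }
    print(site_frequency(pages))  # "promo-code" scores 1, real attributes score 2
```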

  9. Realistic page-turning of electronic books

    NASA Astrophysics Data System (ADS)

    Fan, Chaoran; Li, Haisheng; Bai, Yannan

    2014-01-01

    The booming electronic book (e-book), as an extension of the paper book, is popular with readers. Recently, much effort has been put into realistic page-turning simulation for e-books to improve the reading experience. This paper presents a new 3D page-turning simulation approach, which employs piecewise time-dependent cylindrical surfaces to describe the turning page and constructs a smooth transition method between the time-dependent cylinders. The page-turning animation is produced by sequentially mapping the turning page onto cylinders with different radii and positions. Compared to previous approaches, our method is able to imitate various effects efficiently and obtains a more natural animation of the turning page.

  10. Autofluorescence based visualization of proteins from unstained native-PAGE

    NASA Astrophysics Data System (ADS)

    Manjunath, S.; Rao, Bola Sadashiva S.; Satyamoorthy, Kapaettu; Mahato, Krishna Kishore

    2015-03-01

    Proteins are the most diverse and functionally active biomolecules in the living system. In order to understand their diversity and dynamic functionality, visualization in native form, without altering structural and functional properties during separation from complex mixtures, is essential. In the present study, a sensitive methodology for optimal visualization of unstained or untagged proteins in native polyacrylamide gel electrophoresis (N-PAGE) has been developed, in which the concentration of the acrylamide and bis-acrylamide mixture, the percentage of the gel, fixing of the N-PAGE with methanol:acetic acid:water, and washing of the gel in Milli-Q water were optimized for highest sensitivity using laser-induced autofluorescence. The outcome with bovine serum albumin (BSA) in PAGE was found to be best at acrylamide and bis-acrylamide concentrations of 29.2 and 0.8, respectively, in 12% N-PAGE. After the electrophoresis run, washing the N-PAGE immediately with Milli-Q water 12 times and eliminating the methanol:acetic acid:water fixing step yielded better visualization sensitivity. Using this methodology, a 25 ng BSA protein band in PAGE was clearly identified. The currently used staining techniques for the visualization of proteins, Coomassie brilliant blue and silver staining, have sensitivities of 100 ng and 5 ng, respectively. The current methodology was found to be more sensitive than Coomassie staining and less sensitive than silver staining. An added advantage of this methodology is the faster visualization of proteins without altering their structure and functional properties.

  11. Library links on medical school home pages.

    PubMed

    Thomas, Sheila L

    2011-01-01

    The purpose of this study was to assess the websites of American Association of Medical Colleges (AAMC)-member medical schools for the presence of library links. Sixty-one percent (n = 92) of the home pages of the 150 AAMC member schools contain library links. Of the 58 home pages not offering such links, 50 provided a pathway of two or three clicks to a library link. The absence of library links on 39% of AAMC medical school home pages indicates that the designers of those pages did not consider the library to be a primary destination for their visitors.

  12. Graphic Server: A real time system for displaying and monitoring telemetry data of several satellites

    NASA Technical Reports Server (NTRS)

    Douard, Stephane

    1994-01-01

    Known as a Graphic Server, the system presented was designed for the control ground segment of the Telecom 2 satellites. It is a tool used to dynamically display telemetry data within graphic pages, also known as views. The views are created off-line through various utilities and then, on the operator's request, displayed and animated in real time as data is received. The system was designed as an independent component, and is installed in different Telecom 2 operational control centers. It enables operators to monitor changes in the platform and satellite payloads in real time. It has been in operation since December 1991.

  13. MISTIC: Mutual information server to infer coevolution.

    PubMed

    Simonetti, Franco L; Teppa, Elin; Chernomoretz, Ariel; Nielsen, Morten; Marino Buslje, Cristina

    2013-07-01

    MISTIC (mutual information server to infer coevolution) is a web server for graphical representation of the information contained within a MSA (multiple sequence alignment) and a complete analysis tool for Mutual Information networks in protein families. The server outputs a graphical visualization of several information-related quantities using a Circos representation. This provides an integrated view of the MSA in terms of (i) the mutual information (MI) between residue pairs, (ii) sequence conservation and (iii) the residue cumulative and proximity MI scores. Further, an interactive interface to explore and characterize the MI network is provided. Several tools are offered for selecting subsets of nodes from the network for visualization. Node coloring can be set to match different attributes, such as conservation, cumulative MI, proximity MI and secondary structure. Finally, a zip file containing all results can be downloaded. The server is available at http://mistic.leloir.org.ar. In summary, MISTIC allows for a comprehensive, compact, visually rich view of the information contained within an MSA in a manner not offered by any other publicly available web server. In particular, the use of the Circos representation of MI networks and the visualization of the cumulative MI and proximity MI concepts are novel.

  14. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP GET URL. Libraries (APIs) have been developed in Java, JavaScript and Perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back-end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for database access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or Perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google
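
    The pattern of embedding an XML request in an HTTP GET URL can be sketched as below; the request vocabulary and host are hypothetical placeholders, since the abstract does not spell out the actual LAS protocol.

      from urllib.parse import urlencode

      # Hypothetical XML request; the real LAS protocol defines its own vocabulary.
      xml_request = '<lasRequest dataset="sst" operation="plot" view="xy"/>'

      base_url = "http://las.example.org/ProductServer.do"   # placeholder endpoint
      url = base_url + "?" + urlencode({"xml": xml_request})
      print(url)   # the UI would issue this GET; LPS parses the XML and calls SOAP services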

  15. Recurrence of Acute Page Kidney in a Renal Transplant Allograft

    PubMed Central

    Zayas, Carlos; Mulloy, Laura; Jagadeesan, Muralidharan

    2016-01-01

    Acute Page Kidney (APK) phenomenon is a rare cause of secondary hypertension, mediated by activation of the renin-angiotensin-aldosterone system (RAAS). Timely intervention is of great importance to prevent any end-organ damage from hypertension. We present a unique case of three episodes of APK in the same renal transplant allograft. PMID:27725836

  16. Creating a Facebook Page for the Seismological Society of America

    NASA Astrophysics Data System (ADS)

    Newman, S. B.

    2009-12-01

    In August 2009, I created a Facebook “fan” page for the Seismological Society of America. We had been exploring cost-effective options for providing forums for two-way communication for some months. We knew that a number of larger technical societies had invested significant sums of money to create customized social networking sites but that a small society would need to use existing low-cost software options. The first thing I discovered when I began to set up the fan page was that an unofficial SSA Facebook group already existed, established by Steven J. Gibbons, a member in Norway. Steven had done an excellent job of posting material about SSA. Partly because of the existing group, the official SSA fan page gained fans rapidly. We began by posting information about our own activities and then added links to activities in the broader geoscience community. While much of this material also appeared on our website and in our publication, Seismological Research Letters (SRL), the tone on the FB page is different. It is less formal, with more emphasis on photos and links to other sites, including our own. Fans who are active on FB see the posts as part of their social network and do not need to take the initiative to go to the SSA site. Although the goal was to provide a forum for two-way communication, our initial experience was that people were clearly reading the page but not contributing content. This appears to be the case with the fan pages of sister geoscience societies. FB offers some demographic information to fan site administrators. An initial review of the demographics suggested that fans were younger than the Society's membership as a whole and that a few of the fans were not members or even scientists. Open questions are: what content will be most useful to fans? How will the existence of the page benefit the membership as a whole? Will the page ultimately encourage two-way communication as hoped? Web 2.0 is generating a series of new

  17. Automated Title Page Cataloging: A Feasibility Study.

    ERIC Educational Resources Information Center

    Weibel, Stuart; And Others

    1989-01-01

    Describes the design of a prototype rule-based system for the automation of descriptive cataloging from title pages. The discussion covers the results of tests of the prototype, major impediments to automatic cataloging from title pages, and prospects for further progress. The rules implemented in the prototype are appended. (16 references)…

  18. 40 CFR 1502.7 - Page limits.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Page limits. 1502.7 Section 1502.7 Protection of Environment COUNCIL ON ENVIRONMENTAL QUALITY ENVIRONMENTAL IMPACT STATEMENT § 1502.7 Page limits. The text of final environmental impact statements (e.g., paragraphs (d) through (g) of §...

  19. Preventing radio-paging system tieup

    NASA Technical Reports Server (NTRS)

    Jasmin, J. P.

    1978-01-01

    A time-delay relay limits the message time of an emergency radio-paging system, thereby preventing inadvertent tieup. The relay is connected to the telephone circuit and permits an adjustable message time between 30 and 55 seconds. After that time interval, the relay opens, freeing the line for another page regardless of what the previous caller did with his telephone.

  20. 40 CFR 1502.7 - Page limits.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Page limits. 1502.7 Section 1502.7 Protection of Environment COUNCIL ON ENVIRONMENTAL QUALITY ENVIRONMENTAL IMPACT STATEMENT § 1502.7 Page limits. The text of final environmental impact statements (e.g., paragraphs (d) through (g) of §...

  1. Web Page Authoring Tools: Comparison and Trends.

    ERIC Educational Resources Information Center

    Craney, Linda

    Initially available from universities and individual enthusiasts, software tools to author World Wide Web pages are maturing into very feature-rich applications and are now offered by large corporations. These applications are enabling more companies to create and maintain pages themselves on the Web or on corporate Intranets. The market continues…

  2. A Server-Based Mobile Coaching System

    PubMed Central

    Baca, Arnold; Kornfeind, Philipp; Preuschl, Emanuel; Bichler, Sebastian; Tampier, Martin; Novatchkov, Hristo

    2010-01-01

    A prototype system for monitoring, transmitting and processing performance data in sports for the purpose of providing feedback has been developed. During training, athletes are equipped with a mobile device and wireless sensors using the ANT protocol in order to acquire biomechanical, physiological and other sports specific parameters. The measured data is buffered locally and forwarded via the Internet to a server. The server provides experts (coaches, biomechanists, sports medicine specialists etc.) with remote data access, analysis and (partly automated) feedback routines. In this way, experts are able to analyze the athlete’s performance and return individual feedback messages from remote locations. PMID:22163490

  3. Optimizing the NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maa, Ming-Hokng

    1996-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web service for the distribution of NASA technical publications, is modified for performance enhancement, greater protocol support, and human interface optimization. Results include: parallel database queries, significantly decreasing user access times by an average factor of 2.3; access from clients behind firewalls and/or proxies which truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.

  4. How To Get Your Web Page Noticed.

    ERIC Educational Resources Information Center

    Schrock, Kathleen

    1997-01-01

    Presents guidelines for making a Web site noticeable. Discusses submitting the URL to directories, links, and announcement lists, and sending the site over the server via FTP to search engines. Describes how to index the site with "Title," "Heading," and "Meta" tags. (AEF)

  5. TOPS On-Line: Automating the Construction and Maintenance of HTML Pages

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.

    1994-01-01

    After the Technology Opportunities Showcase (TOPS), in October 1993, Langley Research Center's (LaRC) Information Systems Division (ISD) accepted the challenge to preserve the investment in information assembled in the TOPS exhibits by establishing a database. Following the lead of several people at LaRC and others around the world, the HyperText Transfer Protocol (HTTP) server and Mosaic were the obvious tools of choice for implementation. Initially, some TOPS exhibitors began the conventional approach of constructing HyperText Markup Language (HTML) pages of their exhibits as input to Mosaic. Considering the number of pages to construct, a better approach was conceived that would automate the construction of pages. This approach allowed completion of the database construction in a shorter period of time using fewer resources than would have been possible with the conventional approach. It also provided flexibility for the maintenance and enhancement of the database. Since that time, this approach has been used to automate construction of other HTML databases. Through these experiences, it is concluded that the most effective use of the HTTP/Mosaic technology will require better tools and techniques for creating, maintaining and managing the HTML pages. The development and use of these tools and techniques are the subject of this document.
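
    The general idea of generating pages from structured records rather than hand-writing them can be sketched as follows, in Python rather than the tools actually used at LaRC; the record fields, template and file names are hypothetical.

      import html
      from pathlib import Path

      # Hypothetical exhibit records; in practice these would come from the database.
      exhibits = [
          {"id": "tops-001", "title": "Composite Materials", "contact": "J. Doe"},
          {"id": "tops-002", "title": "Wind Tunnel Imaging", "contact": "R. Roe"},
      ]

      TEMPLATE = """<html><head><title>{title}</title></head>
      <body><h1>{title}</h1><p>Contact: {contact}</p></body></html>"""

      out_dir = Path("exhibit_pages")
      out_dir.mkdir(exist_ok=True)
      for rec in exhibits:
          page = TEMPLATE.format(title=html.escape(rec["title"]),
                                 contact=html.escape(rec["contact"]))
          (out_dir / (rec["id"] + ".html")).write_text(page)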

  6. Network characteristics for server selection in online games

    NASA Astrophysics Data System (ADS)

    Claypool, Mark

    2008-01-01

    Online gameplay is impacted by the network characteristics of players connected to the same server. Unfortunately, the network characteristics of online game servers are not well understood, particularly for groups that wish to play together on the same server. As a step towards a remedy, this paper presents analysis of an extensive set of measurements of game servers on the Internet. Over the course of many months, actual Internet game servers were queried simultaneously by twenty-five emulated game clients, with both servers and clients spread out on the Internet. The data provides statistics on the uptime and populations of game servers over a month-long period and an in-depth look at the suitability of game servers for multi-player server selection, concentrating on characteristics critical to playability--latency and fairness. Analysis finds most game servers have latencies suitable for third-person and omnipresent games, such as real-time strategy, sports and role-playing games, providing numerous server choices for game players. However, far fewer game servers have the low latencies required for first-person games, such as shooters or race games. In all cases, groups that wish to play together have a greatly reduced set of servers from which to choose because of inherent unfairness in server latencies, and server selection is particularly limited as the group size increases. These results hold across different game types and even across different generations of games. The data should be useful for game developers and network researchers that seek to improve game server selection, whether for single or multiple players.
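
    A minimal sketch of group-aware selection consistent with the criteria discussed above (latency and fairness) is given below; the latency matrix and thresholds are hypothetical, not taken from the measurement study.

      # Hypothetical round-trip latencies in ms: latencies[server] lists one value per player.
      latencies = {
          "srv-east": [40, 120, 95],
          "srv-west": [110, 45, 150],
          "srv-mid":  [70, 75, 80],
      }

      def pick_server(latencies, max_latency=100, max_spread=30):
          """Pick the server whose worst player latency is lowest, subject to
          playability (every player under max_latency) and fairness (the gap
          between the best- and worst-off player under max_spread)."""
          candidates = []
          for name, lats in latencies.items():
              if max(lats) <= max_latency and (max(lats) - min(lats)) <= max_spread:
                  candidates.append((max(lats), name))
          return min(candidates)[1] if candidates else None

      print(pick_server(latencies))   # 'srv-mid': the only server both fast and fair for the group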

  7. Interfaces for Distributed Systems of Information Servers.

    ERIC Educational Resources Information Center

    Kahle, Brewster M.; And Others

    1993-01-01

    Describes five interfaces to remote, full-text databases accessed through distributed systems of servers. These are WAIStation for the Macintosh, XWAIS for X-Windows, GWAIS for Gnu-Emacs, SWAIS for dumb terminals, and Rosebud for the Macintosh. Sixteen illustrations provide examples of display screens. Problems and needed improvements are…

  8. Implementing bioinformatic workflows within the bioextract server

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  9. Managing heterogeneous wireless environments via Hotspot servers

    NASA Astrophysics Data System (ADS)

    Simunic, Tajana; Qadeer, Wajahat; De Micheli, Giovanni

    2005-01-01

    Wireless communication today supports heterogeneous wireless devices with a number of different wireless network interfaces (WNICs). A large fraction of communication is infrastructure based, so wireless access points and hotspot servers have become more ubiquitous. Battery lifetime is still a critical issue, with WNICs typically consuming a large fraction of the overall power budget in a mobile device. In this work we present a new technique for managing power consumption and QoS in diverse wireless environments using Hotspot servers. We introduce a resource manager module at both the Hotspot server and the client. The resource manager schedules communication bursts between the server and each client. The schedulers decide which WNIC to employ for communication, when to communicate data and how to minimize power dissipation while maintaining an acceptable QoS based on the application needs. We present two new scheduling policies derived from the well-known earliest deadline first (EDF) and rate monotonic (RM) [26] algorithms. The resource manager and the schedulers have been implemented in HP's Hotspot server [14]. Our measurement and simulation results show a significant improvement in power dissipation and QoS of Bluetooth and 802.11b for applications such as MP3, MPEG4, WWW, and email.
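
    As a rough illustration of one of the two policy families mentioned above, the sketch below orders pending communication bursts earliest-deadline-first; the burst fields and values are hypothetical, and the power and QoS bookkeeping of the actual schedulers is omitted.

      import heapq

      # Hypothetical pending bursts: (deadline in ms, client, bytes to send).
      bursts = [(250, "client-a", 4096), (120, "client-b", 1024), (400, "client-c", 8192)]

      def edf_order(bursts):
          """Yield bursts in earliest-deadline-first order."""
          heap = list(bursts)
          heapq.heapify(heap)
          while heap:
              yield heapq.heappop(heap)

      for deadline, client, size in edf_order(bursts):
          print("serve %s: %d bytes before t=%d ms" % (client, size, deadline))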

  10. Managing heterogeneous wireless environments via Hotspot servers

    NASA Astrophysics Data System (ADS)

    Simunic, Tajana; Qadeer, Wajahat; De Micheli, Giovanni

    2004-12-01

    Wireless communication today supports heterogeneous wireless devices with a number of different wireless network interfaces (WNICs). A large fraction of communication is infrastructure based, so wireless access points and hotspot servers have become more ubiquitous. Battery lifetime is still a critical issue, with WNICs typically consuming a large fraction of the overall power budget in a mobile device. In this work we present a new technique for managing power consumption and QoS in diverse wireless environments using Hotspot servers. We introduce a resource manager module at both the Hotspot server and the client. The resource manager schedules communication bursts between the server and each client. The schedulers decide which WNIC to employ for communication, when to communicate data and how to minimize power dissipation while maintaining an acceptable QoS based on the application needs. We present two new scheduling policies derived from the well-known earliest deadline first (EDF) and rate monotonic (RM) [26] algorithms. The resource manager and the schedulers have been implemented in HP's Hotspot server [14]. Our measurement and simulation results show a significant improvement in power dissipation and QoS of Bluetooth and 802.11b for applications such as MP3, MPEG4, WWW, and email.

  11. Client/Server Architecture Promises Radical Changes.

    ERIC Educational Resources Information Center

    Freeman, Grey; York, Jerry

    1991-01-01

    This article discusses the emergence of the client/server paradigm for the delivery of computer applications, its emergence in response to the proliferation of microcomputers and local area networks, the applicability of the model in academic institutions, and its implications for college campus information technology organizations. (Author/DB)

  12. PROCAIN server for remote protein sequence similarity search

    PubMed Central

    Wang, Yong; Sadreyev, Ruslan I.; Grishin, Nick V.

    2009-01-01

    Sensitive and accurate detection of distant protein homology is essential for the studies of protein structure, function and evolution. We recently developed PROCAIN, a method that is based on sequence profile comparison and involves the analysis of four signals—similarities of residue content at the profile positions combined with three types of assisting information: sequence motifs, residue conservation and predicted secondary structure. Here we present the PROCAIN web server that allows the user to submit a query sequence or multiple sequence alignment and perform the search in a profile database of choice. The output is structured similar to that of BLAST, with the list of detected homologs sorted by E-value and followed by profile–profile alignments. The front page allows the user to adjust multiple options of input processing and output formatting, as well as search settings, including the relative weights assigned to the three types of assisting information. Availability: http://prodata.swmed.edu/procain/ Contact: grishin@chop.swmed.edu PMID:19497935

  13. Workload Characterization and Performance Implications of Large-Scale Blog Servers

    SciTech Connect

    Jeon, Myeongjae; Kim, Youngjae; Hwang, Jeaho; Lee, Joonwon; Seo, Euiseong

    2012-11-01

    With the ever-increasing popularity of social network services (SNSs), an understanding of the characteristics of these services and their effects on the behavior of their host servers is critical. However, there has been a lack of research on the workload characterization of servers running SNS applications such as blog services. To fill this void, we empirically characterized real-world web server logs collected from one of the largest South Korean blog hosting sites for 12 consecutive days. The logs consist of more than 96 million HTTP requests and 4.7 TB of network traffic. Our analysis reveals the following: (i) the transfer size of non-multimedia files and blog articles can be modeled using a truncated Pareto distribution and a log-normal distribution, respectively; (ii) user access for blog articles does not show temporal locality, but is strongly biased towards those posted with image or audio files. We additionally discuss the potential performance improvement through clustering of small files on a blog page into contiguous disk blocks, which benefits from the observed file access patterns. Trace-driven simulations show that, on average, the suggested approach achieves 60.6% better system throughput and reduces the processing time for file access by 30.8% compared to the best performance of the Ext4 file system.
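
    The kind of model fitted above can be exercised with a small sketch that draws transfer sizes from a truncated Pareto distribution by inverse-CDF sampling; the parameter values below are hypothetical, not those estimated from the blog traces.

      import random

      def truncated_pareto(alpha, lower, upper, rng=random):
          """Draw one sample from a Pareto(alpha) distribution truncated to
          [lower, upper], using inverse-CDF sampling."""
          u = rng.random()
          c = 1.0 - (lower / upper) ** alpha        # probability mass inside the bounds
          return lower / (1.0 - u * c) ** (1.0 / alpha)

      # Hypothetical parameters for non-multimedia transfer sizes (bytes).
      sizes = [truncated_pareto(alpha=1.2, lower=200, upper=2_000_000) for _ in range(5)]
      print([round(s) for s in sizes])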

  14. San Mateo County's Server Information Program (S.I.P.): A Community-Based Alcohol Server Training Program.

    ERIC Educational Resources Information Center

    de Miranda, John

    The field of alcohol server awareness and training has grown dramatically in the past several years and the idea of training servers to reduce alcohol problems has become a central fixture in the current alcohol policy debate. The San Mateo County, California Server Information Program (SIP) is a community-based prevention strategy designed to…

  15. Page turning solutions for musicians: a survey.

    PubMed

    Wolberg, George; Schipper, Irene

    2012-01-01

    Musicians have long been hampered by the challenge in turning sheet music while their hands are occupied playing an instrument. The sight of a human page turner assisting a pianist during a performance, for instance, is not uncommon. This need for a page turning solution is no less acute during practice sessions, which account for the vast majority of playing time. Despite widespread appreciation of the problem, there have been virtually no robust and affordable products to assist the musician. Recent progress in assistive technology and electronic reading devices offers promising solutions to this long-standing problem. The objective of this paper is to survey the technology landscape and assess the benefits and drawbacks of page turning solutions for musicians. A full range of mechanical and digital page turning products are reviewed. PMID:22246302

  16. AGU acts on NSF Page Charge Policy

    NASA Astrophysics Data System (ADS)

    Fast action by Headquarters alerted AGU members to a proposed change in the National Science Foundation's page charge policy that would weaken the ability of scientific societies to serve the scientific community. If adopted, NSF's new policy, announced in the Federal Register December 18, would remove the prohibition against paying page charges to commercially produced journals. The proposal for the change was supposedly put forth to obtain a reaction from the scientific community.

  17. Description Meta Tags in Public Home and Linked Pages.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2001-01-01

    Random samples of 1,872 Web pages registered with Yahoo! and 1,638 pages reachable from Yahoo!-registered pages were analyzed for use of meta tags, specifically those containing descriptions. Results: 727 (38.8%) of the Yahoo!-registered pages and 442 (27%) of the other pages included descriptions in meta tags. Some descriptions greatly…

  18. 47 CFR 22.503 - Paging geographic area authorizations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 2 2012-10-01 2012-10-01 false Paging geographic area authorizations. 22.503... PUBLIC MOBILE SERVICES Paging and Radiotelephone Service § 22.503 Paging geographic area authorizations. The FCC considers applications for and issues paging geographic area authorizations in the Paging...

  19. 47 CFR 22.503 - Paging geographic area authorizations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 2 2013-10-01 2013-10-01 false Paging geographic area authorizations. 22.503... PUBLIC MOBILE SERVICES Paging and Radiotelephone Service § 22.503 Paging geographic area authorizations. The FCC considers applications for and issues paging geographic area authorizations in the Paging...

  20. 47 CFR 22.503 - Paging geographic area authorizations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Paging geographic area authorizations. 22.503... PUBLIC MOBILE SERVICES Paging and Radiotelephone Service § 22.503 Paging geographic area authorizations. The FCC considers applications for and issues paging geographic area authorizations in the Paging...

  1. A Web Page Summarization for Mobile Phones

    NASA Astrophysics Data System (ADS)

    Hasegawa, Takaaki; Nishikawa, Hitoshi; Imamura, Kenji; Kikui, Gen'ichiro; Okumura, Manabu

    Recently, web pages for mobile devices are widely spread on the Internet and many people can access web pages through search engines by mobile devices as well as personal computers. A summary of a retrieved web page is important because people judge whether or not the page would be relevant to their information need according to the summary. In particular, the summary must be not only compact but also grammatical and meaningful when users retrieve information using a mobile phone with a small screen. Most search engines seem to produce a snippet based on the keyword-in-context (KWIC) method. However, this simple method cannot generate a refined summary suitable for mobile phones because of low grammaticality and content overlap with the page title. We propose a more suitable method to generate a snippet for mobile devices using sentence extraction and sentence compression methods. First, sentences are scored based on whether they include the users' query terms or words relevant to the queries, and on whether they avoid overlap with the page title, following maximal marginal relevance (MMR). Second, the selected sentences are compressed according to their dependency structure, based on their phrase coverage, measured by word scores, and their phrase connection probability, measured by a language model. The experimental results reveal that the proposed method outperformed the KWIC method in terms of relevance judgment, grammaticality, non-redundancy and content coverage.
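
    A minimal sketch of the MMR-style sentence selection described above is given below, using simple word-overlap scores in place of the paper's full scoring and compression models; the example sentences, weights and thresholds are hypothetical.

      import re

      def words(text):
          return set(re.findall(r"[a-z0-9']+", text.lower()))

      def mmr_select(sentences, query, title, k=2, lam=0.7):
          """Greedily pick k sentences that overlap the query terms but not the
          page title or the sentences already selected (a simplified MMR)."""
          q, t = words(query), words(title)
          selected, remaining = [], list(sentences)
          while remaining and len(selected) < k:
              def score(s):
                  sw = words(s)
                  chosen = t.union(*(words(x) for x in selected))
                  return lam * len(sw & q) - (1 - lam) * len(sw & chosen)
              best = max(remaining, key=score)
              selected.append(best)
              remaining.remove(best)
          return selected

      sents = [
          "Kyoto temples are especially beautiful in autumn.",
          "Kyoto travel guide for first-time visitors.",
          "The city bus network connects most temples.",
      ]
      print(mmr_select(sents, query="kyoto autumn temples", title="Kyoto travel guide"))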

  2. The PDB_REDO server for macromolecular structure model optimization.

    PubMed

    Joosten, Robbie P; Long, Fei; Murshudov, Garib N; Perrakis, Anastassis

    2014-07-01

    The refinement and validation of a crystallographic structure model is the last step before the coordinates and the associated data are submitted to the Protein Data Bank (PDB). The success of the refinement procedure is typically assessed by validating the models against geometrical criteria and the diffraction data, and is an important step in ensuring the quality of the PDB public archive [Read et al. (2011), Structure, 19, 1395-1412]. The PDB_REDO procedure aims for 'constructive validation', aspiring to consistent and optimal refinement parameterization and pro-active model rebuilding, not only correcting errors but striving for optimal interpretation of the electron density. A web server for PDB_REDO has been implemented, allowing thorough, consistent and fully automated optimization of the refinement procedure in REFMAC and partial model rebuilding. The goal of the web server is to help practicing crystallographers to improve their model prior to submission to the PDB. For this, additional steps were implemented in the PDB_REDO pipeline, both in the refinement procedure, e.g. testing of resolution limits and k-fold cross-validation for small test sets, and as new validation criteria, e.g. the density-fit metrics implemented in EDSTATS and ligand validation as implemented in YASARA. Innovative ways to present the refinement and validation results to the user are also described, which together with auto-generated Coot scripts can guide users to subsequent model inspection and improvement. It is demonstrated that using the server can lead to substantial improvement of structure models before they are submitted to the PDB. PMID:25075342

  3. The PDB_REDO server for macromolecular structure model optimization

    PubMed Central

    Joosten, Robbie P.; Long, Fei; Murshudov, Garib N.; Perrakis, Anastassis

    2014-01-01

    The refinement and validation of a crystallographic structure model is the last step before the coordinates and the associated data are submitted to the Protein Data Bank (PDB). The success of the refinement procedure is typically assessed by validating the models against geometrical criteria and the diffraction data, and is an important step in ensuring the quality of the PDB public archive [Read et al. (2011), Structure, 19, 1395–1412]. The PDB_REDO procedure aims for ‘constructive validation’, aspiring to consistent and optimal refinement parameterization and pro-active model rebuilding, not only correcting errors but striving for optimal interpretation of the electron density. A web server for PDB_REDO has been implemented, allowing thorough, consistent and fully automated optimization of the refinement procedure in REFMAC and partial model rebuilding. The goal of the web server is to help practicing crystallographers to improve their model prior to submission to the PDB. For this, additional steps were implemented in the PDB_REDO pipeline, both in the refinement procedure, e.g. testing of resolution limits and k-fold cross-validation for small test sets, and as new validation criteria, e.g. the density-fit metrics implemented in EDSTATS and ligand validation as implemented in YASARA. Innovative ways to present the refinement and validation results to the user are also described, which together with auto-generated Coot scripts can guide users to subsequent model inspection and improvement. It is demonstrated that using the server can lead to substantial improvement of structure models before they are submitted to the PDB. PMID:25075342

  4. Distributed analysis with CRAB: The client-server architecture evolution and commissioning

    SciTech Connect

    Codispoti, G.; Cinquilli, M.; Fanfani, A.; Fanzago, F.; Farina, F.; Lacaprara, S.; Miccio, V.; Spiga, D.; Vaandering, E.; /Fermilab

    2008-01-01

    CRAB (CMS Remote Analysis Builder) is the tool used by CMS to enable running physics analysis in a transparent manner over data distributed across many sites. It abstracts out the interaction with the underlying batch farms, grid infrastructure and CMS workload management tools, such that it is easily usable by non-experts. CRAB can be used as a direct interface to the computing system or can delegate the user task to a server. Major efforts have been dedicated to the development of the client-server system, allowing the user to deal only with a simple and intuitive interface and to delegate all the work to a server. The server takes care of handling the user's jobs during the whole lifetime of the user's task. In particular, it takes care of the data and resources discovery, process tracking and output handling. It also provides services such as automatic resubmission in case of failures, notification to the user of the task status, and automatic blacklisting of sites showing evident problems beyond what is provided by existing grid infrastructure. The CRAB Server architecture and its deployment will be presented, as well as the current status and future development. In addition, the experience in using the system for initial detector commissioning activities and data analysis will be summarized.

  5. Client/server approach to image capturing

    NASA Astrophysics Data System (ADS)

    Tuijn, Chris; Stokes, Earle

    1998-01-01

    The diversity of the digital image capturing devices on the market today is quite astonishing and ranges from low-cost CCD scanners to digital cameras (for both action and stand-still scenes), mid-end CCD scanners for desktop publishing and pre-press applications and high-end CCD flatbed scanners and drum-scanners with photo multiplier technology. Each device and market segment has its own specific needs which explains the diversity of the associated scanner applications. What all those applications have in common is the need to communicate with a particular device to import the digital images; after the import, additional image processing might be needed as well as color management operations. Although the specific requirements for all of these applications might differ considerably, a number of image capturing and color management facilities as well as other services are needed which can be shared. In this paper, we propose a client/server architecture for scanning and image editing applications which can be used as a common component for all these applications. One of the principal components of the scan server is the input capturing module. The specification of the input jobs is based on a generic input device model. Through this model we make abstraction of the specific scanner parameters and define the scan job definitions by a number of absolute parameters. As a result, scan job definitions will be less dependent on a particular scanner and have a more universal meaning. In this context, we also elaborate on the interaction of the generic parameters and the color characterization (i.e., the ICC profile). Other topics that are covered are the scheduling and parallel processing capabilities of the server, the image processing facilities, the interaction with the ICC engine, the communication facilities (both in-memory and over the network) and the different client architectures (stand-alone applications, TWAIN servers, plug-ins, OLE or Apple-event driven

  6. Multiple-server Flexible Blind Quantum Computation in Networks

    NASA Astrophysics Data System (ADS)

    Kong, Xiaoqin; Li, Qin; Wu, Chunhui; Yu, Fang; He, Jinjun; Sun, Zhiyuan

    2016-06-01

    Blind quantum computation (BQC) can allow a client with limited quantum power to delegate his quantum computation to a powerful server and still keep his own data private. In this paper, we present a multiple-server flexible BQC protocol, where a client who only needs the ability of accessing quantum channels can delegate the computational task to a number of servers. In particular, the client's quantum computation can still be achieved even when one or more of the delegated quantum servers break down in networks. In other words, when connections to certain quantum servers are lost, clients can adjust flexibly and delegate their quantum computation to other servers. Obviously, the computation will be unsuccessful if all servers are interrupted.

  7. EarthServer: Information Retrieval and Query Language

    NASA Astrophysics Data System (ADS)

    Perperis, Thanassis; Koltsida, Panagiota; Kakaletris, George

    2013-04-01

    Establishing open, unified, seamless access and ad-hoc analytics on cross-disciplinary, multi-source, multi-dimensional, spatiotemporal Earth Science data of extreme size, and on their supporting metadata, are the main challenges of the EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program. One of EarthServer's main objectives is to provide users with higher-level coverage and metadata search, retrieval and processing capabilities over multi-disciplinary Earth Science data. Six Lighthouse Applications are being established, each one providing access to Cryospheric, Airborne, Atmospheric, Geology, Oceanography and Planetary science raster data repositories through strictly WCS 2.0 standard-based service endpoints. EarthServer's information retrieval subsystem aims towards exploiting the WCS endpoints through a physically and logically distributed service-oriented architecture, foreseeing the collaboration of several standard-compliant services, capable of exploiting modern large grid and cloud infrastructures and of dynamically responding to the availability and capabilities of underlying resources. Towards furthering technology for integrated, coherent service provision based on WCS and WCPS, the concept of a query language (QL), unifying coverage and metadata processing and retrieval, is introduced. EarthServer's information retrieval subsystem receives QL requests involving high volumes of all Earth Science data categories, executes them on the services that reside on the infrastructure and sends the results back to the requester through a high-performance pipeline. In this contribution we briefly discuss EarthServer's service-oriented coverage data and metadata search and retrieval architecture and further elaborate on the potentials of EarthServer's Query Language, called xWCPS (XQuery compliant WCPS). xWCPS aims towards merging the path that the two widely adopted standards (W3C XQuery, OGC WCPS) have paved, into a

  8. PSSweb: protein structural statistics web server.

    PubMed

    Gaillard, Thomas; Stote, Roland H; Dejaegere, Annick

    2016-07-01

    With the increasing number of protein structures available, there is a need for tools capable of automating the comparison of ensembles of structures, a common requirement in structural biology and bioinformatics. PSSweb is a web server for protein structural statistics. It takes as input an ensemble of PDB files of protein structures, performs a multiple sequence alignment and computes structural statistics for each position of the alignment. Different optional functionalities are proposed: structure superposition, Cartesian coordinate statistics, dihedral angle calculation and statistics, and a cluster analysis based on dihedral angles. An interactive report is generated, containing a summary of the results, tables, figures and 3D visualization of superposed structures. The server is available at http://pssweb.org.

  9. The Uppsala Electron-Density Server.

    PubMed

    Kleywegt, Gerard J; Harris, Mark R; Zou, Jin Yu; Taylor, Thomas C; Wählby, Anders; Jones, T Alwyn

    2004-12-01

    The Uppsala Electron Density Server (EDS; http://eds.bmc.uu.se/) is a web-based facility that provides access to electron-density maps and statistics concerning the fit of crystal structures and their maps. Maps are available for approximately 87% of the crystallographic Protein Data Bank (PDB) entries for which structure factors have been deposited and for which straightforward map calculations succeed in reproducing the published R value to within five percentage points. Here, an account is provided of the methods that are used to generate the information contained in the server. Some of the problems that are encountered in the map-generation process as well as some spin-offs of the project are also discussed.

  10. PSSweb: protein structural statistics web server

    PubMed Central

    Gaillard, Thomas; Stote, Roland H.; Dejaegere, Annick

    2016-01-01

    With the increasing number of protein structures available, there is a need for tools capable of automating the comparison of ensembles of structures, a common requirement in structural biology and bioinformatics. PSSweb is a web server for protein structural statistics. It takes as input an ensemble of PDB files of protein structures, performs a multiple sequence alignment and computes structural statistics for each position of the alignment. Different optional functionalities are proposed: structure superposition, Cartesian coordinate statistics, dihedral angle calculation and statistics, and a cluster analysis based on dihedral angles. An interactive report is generated, containing a summary of the results, tables, figures and 3D visualization of superposed structures. The server is available at http://pssweb.org. PMID:27174930

  11. Energy Servers Deliver Clean, Affordable Power

    NASA Technical Reports Server (NTRS)

    2010-01-01

    K.R. Sridhar developed a fuel cell device for Ames Research Center that could use solar power to split water into oxygen for breathing and hydrogen for fuel on Mars. Sridhar saw the potential of the technology, when reversed, to create clean energy on Earth. He founded Bloom Energy, of Sunnyvale, California, to advance the technology. Today, the Bloom Energy Server is providing cost-effective, environmentally friendly energy to a host of companies such as eBay, Google, and The Coca-Cola Company. Bloom's NASA-derived Energy Servers generate energy that is about 67-percent cleaner than a typical coal-fired power plant when using fossil fuels and 100-percent cleaner with renewable fuels.

  12. Implementing a secure client/server application

    SciTech Connect

    Kissinger, B.A.

    1994-08-01

    Attacks and security breaches on computer systems are increasingly common. Particularly vulnerable are systems that exchange user names and passwords directly across a network without encryption. These kinds of systems include many commercial-off-the-shelf client/server applications. A secure technique for authenticating computer users and transmitting passwords through the use of a trusted "broker" and public/private keys is described in this paper.

  13. COMPASS server for remote homology inference.

    PubMed

    Sadreyev, Ruslan I; Tang, Ming; Kim, Bong-Hyun; Grishin, Nick V

    2007-07-01

    COMPASS is a method for homology detection and local alignment construction based on the comparison of multiple sequence alignments (MSAs). The method derives numerical profiles from given MSAs, constructs local profile-profile alignments and analytically estimates E-values for the detected similarities. Until now, COMPASS was only available for download and local installation. Here, we present a new web server featuring the latest version of COMPASS, which provides (i) increased sensitivity and selectivity of homology detection; (ii) longer, more complete alignments; and (iii) faster computational speed. After submission of the query MSA or single sequence, the server performs searches versus a user-specified database. The server includes detailed and intuitive control of the search parameters. A flexible output format, structured similarly to BLAST and PSI-BLAST, provides an easy way to read and analyze the detected profile similarities. Brief help sections are available for all input parameters and output options, along with detailed documentation. To illustrate the value of this tool for protein structure-functional prediction, we present two examples of detecting distant homologs for uncharacterized protein families. Available at http://prodata.swmed.edu/compass. PMID:17517780

  14. SPEER-SERVER: a web server for prediction of protein specificity determining sites.

    PubMed

    Chakraborty, Abhijit; Mandloi, Sapan; Lanczycki, Christopher J; Panchenko, Anna R; Chakrabarti, Saikat

    2012-07-01

    Sites that show specific conservation patterns within subsets of proteins in a protein family are likely to be involved in the development of functional specificity. These sites, generally termed specificity determining sites (SDS), might play a crucial role in binding to a specific substrate or proteins. Identification of SDS through experimental techniques is a slow, difficult and tedious job. Hence, it is very important to develop efficient computational methods that can more expediently identify SDS. Herein, we present Specificity prediction using amino acids' Properties, Entropy and Evolution Rate (SPEER)-SERVER, a web server that predicts SDS by analyzing quantitative measures of the conservation patterns of protein sites based on their physico-chemical properties and the heterogeneity of evolutionary changes between and within the protein subfamilies. This web server provides an improved representation of results, adds useful input and output options and integrates a wide range of analysis and data visualization tools when compared with the original standalone version of the SPEER algorithm. Extensive benchmarking finds that SPEER-SERVER exhibits sensitivity and precision performance that, on average, meets or exceeds that of other currently available methods. SPEER-SERVER is available at http://www.hpppi.iicb.res.in/ss/.

  15. ProBiS-ligands: a web server for prediction of ligands by examination of protein binding sites

    PubMed Central

    Konc, Janez; Janežič, Dušanka

    2014-01-01

    The ProBiS-ligands web server predicts binding of ligands to a protein structure. Starting with a protein structure or binding site, ProBiS-ligands first identifies template proteins in the Protein Data Bank that share similar binding sites. Based on the superimpositions of the query protein and the similar binding sites found, the server then transposes the ligand structures from those sites to the query protein. Such ligand prediction supports many activities, e.g. drug repurposing. The ProBiS-ligands web server, an extension of the ProBiS web server, is open and free to all users at http://probis.cmm.ki.si/ligands. PMID:24861616

  16. Engineering Proteins for Thermostability with iRDP Web Server

    PubMed Central

    Ghanate, Avinash; Ramasamy, Sureshkumar; Suresh, C. G.

    2015-01-01

    Engineering protein molecules with desired structure and biological functions has been an elusive goal. Development of industrially viable proteins with improved properties such as stability, catalytic activity and altered specificity by modifying the structure of an existing protein has widely been targeted through rational protein engineering. Although a range of factors contributing to thermal stability have been identified and widely researched, the in silico implementation of these as strategies directed towards enhancement of protein stability has not yet been explored extensively. A wide range of structural analysis tools is currently available for in silico protein engineering. However, these tools concentrate on only a limited number of factors or individual protein structures, resulting in cumbersome and time-consuming analysis. The iRDP web server presented here provides a unified platform comprising the iCAPS, iStability and iMutants modules. Each module addresses different facets of effective rational engineering of proteins aiming towards enhanced stability. While iCAPS aids in selection of target protein based on factors contributing to structural stability, iStability uniquely offers in silico implementation of known thermostabilization strategies in proteins for identification and stability prediction of potential stabilizing mutation sites. iMutants aims to assess mutants based on changes in local interaction network and degree of residue conservation at the mutation sites. Each module was validated using an extensively diverse dataset. The server is freely accessible at http://irdp.ncl.res.in and has no login requirements. PMID:26436543

  17. The Widest Practicable Dissemination: The NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Binkley, Robert L.; Kellogg, Yvonne D.; Paulson, Sharon S.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael J.; Accomazzi, Alberto

    1995-01-01

    The National Aeronautics and Space Act of 1958 established NASA and charged it to "provide for the widest practicable and appropriate dissemination of information concerning [...] its activities and the results thereof." The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. The NTRS is comprised of several units, some constructed especially for inclusion in NTRS, and others that are existing NASA publication services that NTRS reuses. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the services over the initial 6-month period. The NTRS is largely constructed with freely available software running on existing hardware. NTRS builds upon existing hardware and software, and the resulting additional exposure for the body of literature contained will allow NASA to ensure that its institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.

  18. The widest practicable dissemination: The NASA technical report server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Binkley, Robert L.; Kellogg, Yvonne D.; Paulson, Sharon S.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael J.; Accomazzi, Alberto

    1995-01-01

    The National Aeronautics and Space Act of 1958 established NASA and charged it to 'provide for the widest practicable and appropriate dissemination of information concerning...its activities and the results thereof.' The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. The NTRS is comprised of several units, some constructed especially for inclusion in NTRS, and others that are existing NASA publication services that NTRS reuses. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the services over the initial six-month period. The NTRS is largely constructed with freely available software running on existing hardware. NTRS builds upon existing hardware and software, and the resulting additional exposure for the body of literature contained will allow NASA to ensure that its institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.

  19. Engineering Proteins for Thermostability with iRDP Web Server.

    PubMed

    Panigrahi, Priyabrata; Sule, Manas; Ghanate, Avinash; Ramasamy, Sureshkumar; Suresh, C G

    2015-01-01

    Engineering protein molecules with desired structure and biological functions has been an elusive goal. Development of industrially viable proteins with improved properties such as stability, catalytic activity and altered specificity by modifying the structure of an existing protein has widely been targeted through rational protein engineering. Although a range of factors contributing to thermal stability have been identified and widely researched, the in silico implementation of these as strategies directed towards enhancement of protein stability has not yet been explored extensively. A wide range of structural analysis tools is currently available for in silico protein engineering. However, these tools concentrate on only a limited number of factors or individual protein structures, resulting in cumbersome and time-consuming analysis. The iRDP web server presented here provides a unified platform comprising the iCAPS, iStability and iMutants modules. Each module addresses different facets of effective rational engineering of proteins aiming towards enhanced stability. While iCAPS aids in selection of target protein based on factors contributing to structural stability, iStability uniquely offers in silico implementation of known thermostabilization strategies in proteins for identification and stability prediction of potential stabilizing mutation sites. iMutants aims to assess mutants based on changes in local interaction network and degree of residue conservation at the mutation sites. Each module was validated using an extensively diverse dataset. The server is freely accessible at http://irdp.ncl.res.in and has no login requirements.

  20. Efficient server selection system for widely distributed multiserver networks

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-pyo; Park, Sung-sik; Lee, Kyoon-Ha

    2001-07-01

    In order to provide improved Internet service quality, access speeds to subscriber networks and to servers, the Internet access devices, have been rapidly enhanced through traffic distribution and the installation of high-performance servers. However, Internet access quality and content delivery speed have remained unsatisfactory. Simply extending the nodes at the Internet access device has a limited ability to cope with growing network traffic, because the root cause lies in the middle-mile nodes between a CP (Content Provider) server and a user node. To address this problem, this paper proposes a new method for selecting an effective server for a client by minimizing the number of nodes between the server and the client while keeping the load balanced among servers clustered by the client's location in physically distributed multi-site environments. The proposed method uses an NSP (Network Status Prober) and a content server manager to obtain the status of each server and of the distributed network. A new architecture is presented for the server selection algorithm, together with its implementation. The paper also describes the parameters for selecting the best server for a client, and the approach is confirmed by experiments over the proposed architecture.
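
    The selection rule described above can be sketched as follows: among servers whose current load is below a balance threshold, choose the one with the fewest nodes (hops) to the client. The hop counts and loads are hypothetical inputs of the kind an NSP-like prober would supply.

      # Hypothetical per-server measurements reported by a network-status prober.
      servers = {
          "seoul-1": {"hops": 4, "load": 0.82},
          "seoul-2": {"hops": 5, "load": 0.35},
          "busan-1": {"hops": 9, "load": 0.20},
      }

      def select_server(servers, max_load=0.75):
          """Prefer the fewest middle-mile hops among servers that are not
          overloaded; fall back to the least-loaded server otherwise."""
          eligible = {n: s for n, s in servers.items() if s["load"] <= max_load}
          if eligible:
              return min(eligible, key=lambda n: (eligible[n]["hops"], eligible[n]["load"]))
          return min(servers, key=lambda n: servers[n]["load"])

      print(select_server(servers))   # 'seoul-2': fewest hops among the non-overloaded servers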

  1. Energy Efficiency in Small Server Rooms: Field Surveys and Findings

    SciTech Connect

    Cheung, Iris; Greenberg, Steve; Mahdavi, Roozbeh; Brown, Richard; Tschudi, William

    2014-08-11

    Fifty-seven percent of US servers are housed in server closets, server rooms, and localized data centers, in what are commonly referred to as small server rooms, which comprise 99 percent of all server spaces in the US. While many mid-tier and enterprise-class data centers are owned by large corporations that consider energy efficiency a goal to minimize business operating costs, small server rooms typically are not similarly motivated. They are characterized by decentralized ownership and management and come in many configurations, which creates a unique set of efficiency challenges. To develop energy efficiency strategies for these spaces, we surveyed 30 small server rooms across eight institutions, and selected four of them for detailed assessments. The four rooms had Power Usage Effectiveness (PUE) values ranging from 1.5 to 2.1. Energy saving opportunities ranged from no- to low-cost measures such as raising cooling set points and better airflow management, to more involved but cost-effective measures including server consolidation and virtualization, and dedicated cooling with economizers. We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers, rather than server rooms. Backup power requirements and IT and cooling efficiency should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge on server operation, and the implementation of energy efficiency measures in small server rooms.
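
    As a short worked example of the Power Usage Effectiveness figure cited above, the sketch below computes PUE from hypothetical load figures; the numbers are illustrative, not measurements from the surveyed rooms.

      def pue(total_facility_kw, it_equipment_kw):
          """Power Usage Effectiveness: total facility power divided by IT power.
          1.0 would be ideal; the four assessed rooms fell between 1.5 and 2.1."""
          return total_facility_kw / it_equipment_kw

      # Hypothetical room: 10 kW of IT load plus 8 kW of cooling, lighting and UPS losses.
      print(round(pue(total_facility_kw=18.0, it_equipment_kw=10.0), 2))   # 1.8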

  2. CheD: chemical database compilation tool, Internet server, and client for SQL servers.

    PubMed

    Trepalin, S V; Yarkov, A V

    2001-01-01

    An efficient program, which runs on a personal computer, for the storage, retrieval, and processing of chemical information, is presented. The program can work either as a stand-alone application or in conjunction with a specifically written Web server application or with some standard SQL servers, e.g., Oracle, Interbase, and MS SQL. New types of data fields are introduced, e.g., arrays for spectral information storage, HTML and database links, and user-defined functions. CheD has an open architecture; thus, custom data types, controls, and services may be added. A WWW server application for chemical data retrieval features an easy and user-friendly installation on Windows NT or 95 platforms.

  3. The mediating role of facebook fan pages.

    PubMed

    Chih, Wen-Hai; Hsu, Li-Chun; Wang, Kai-Yu; Lin, Kuan-Yu

    2014-01-01

    Using the dual mediation hypothesis, this study investigates the role of the interestingness attitude towards the news (interestingness being the power of attracting or holding one's attention) in the formation of Facebook Fan Page users' electronic word-of-mouth (eWOM) intentions. A total of 599 Facebook fan page users in Taiwan were recruited and structural equation modeling (SEM) was used to test the research hypotheses. The results show that both perceived news entertainment and informativeness positively influence the interestingness attitude towards the news. The interestingness attitude towards the news subsequently influences hedonism and utilitarianism attitudes towards the Fan Page, which then influence eWOM intentions. The interestingness attitude towards the news plays a more important role than the hedonism and utilitarianism attitudes in generating eWOM intentions. Based on the findings, implications and future research suggestions are provided. PMID:24875695

  4. European user trial of paging by satellite

    NASA Technical Reports Server (NTRS)

    Fudge, R. E.; Fenton, C. J.

    1990-01-01

    British Telecom conceived the idea of adapting their existing paging service, together with the use of existing terrestrial pagers, to yield a one way data (i.e., paging) satellite service to mobiles. The user trial of paging by satellites was successful. It demonstrated that services could be provided over a wide geographical area to low priced terminals. Many lessons were learned in unexpected areas. These include the need for extensive liaison with all users involved, especially the drivers, to ensure they understood the potential benefits. There was a significant desire for a return acknowledgement channel or even a return data channel. Above all there is a need to ensure that the equipment can be taken across European borders and legitimately used in all European countries. The next step in a marketing assessment would be to consider the impact of two way data messaging such as INMARSAT-C.

  5. A rendering approach for stereoscopic web pages

    NASA Astrophysics Data System (ADS)

    Zhang, Jianlong; Wang, Wenmin; Wang, Ronggang; Chen, Qinshui

    2014-03-01

    Web technology provides a relatively easy way to generate content with which to understand the world, and with the development of stereoscopic display technology, stereoscopic devices will become much more popular. The combination of web technology and stereoscopic display technology will bring revolutionary visual effects. Stereoscopic 3D (S3D) web pages, in which text, images and video may have different depths, can be displayed on stereoscopic display devices. This paper presents an approach for rendering two-view S3D web pages containing text, images and widgets: first, an algorithm is developed to display stereoscopic elements such as text and widgets using a 2D graphics library; second, a method is presented to render stereoscopic web pages within the current framework of the browser; third, a preliminary solution is given for the problems that arise from this method.
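
    As a rough illustration of how per-element depth can be turned into two views, the sketch below shifts each page element horizontally by a parallax offset derived from its assigned depth. The formula and parameter values are illustrative assumptions, not the rendering algorithm from the paper.

    # Illustrative parallax computation: elements farther "behind" the screen
    # get a larger horizontal offset between the left and right views.
    # eye_separation and screen_distance are hypothetical values in pixels.

    def parallax(depth: float, eye_separation: float = 60.0,
                 screen_distance: float = 600.0) -> float:
        """Horizontal disparity (px) for an element `depth` px behind the screen."""
        return eye_separation * depth / (screen_distance + depth)

    def render_two_views(elements):
        left, right = [], []
        for el in elements:
            d = parallax(el["depth"])
            left.append({**el, "x": el["x"] - d / 2})
            right.append({**el, "x": el["x"] + d / 2})
        return left, right

    views = render_two_views([{"id": "headline", "x": 100.0, "depth": 0.0},
                              {"id": "photo", "x": 100.0, "depth": 300.0}])
    print(views[0][1]["x"], views[1][1]["x"])  # left/right x for the photo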

  6. 60 FR 39766 - Supplement to Renovate the Alcohol Detention Center, Page, Arizona

    Federal Register 2010, 2011, 2012, 2013, 2014

    1995-08-03

    ... and alcohol treatment activities. The facility is currently being used to serve individuals targeted... treatment and counseling interventions. Alcohol related deaths, accidents, injuries, illness, violence, and... HUMAN SERVICES Supplement to Renovate the Alcohol Detention Center, Page, Arizona AGENCY: Center...

  7. Page Recognition: Quantum Leap In Recognition Technology

    NASA Astrophysics Data System (ADS)

    Miller, Larry

    1989-07-01

    No milestone has proven as elusive as the always-approaching "year of the LAN," but the "year of the scanner" might claim the silver medal. Desktop scanners have been around almost as long as personal computers. And everyone thinks they are used for obvious desktop-publishing and business tasks like scanning business documents, magazine articles and other pages, and translating those words into files your computer understands. But, until now, the reality fell far short of the promise. Because it's true that scanners deliver an accurate image of the page to your computer, but the software to recognize this text has been woefully disappointing. Old optical-character recognition (OCR) software recognized such a limited range of pages as to be virtually useless to real users. (For example, one OCR vendor specified 12-point Courier font from an IBM Selectric typewriter: the same font in 10-point, or from a Diablo printer, was unrecognizable!) Computer dealers have told me the chasm between OCR expectations and reality is so broad and deep that nine out of ten prospects leave their stores in disgust when they learn the limitations. And this is a very important, very unfortunate gap. Because the promise of recognition -- what people want it to do -- carries with it tremendous improvements in our productivity and ability to get tons of written documents into our computers where we can do real work with it. The good news is that a revolutionary new development effort has led to the new technology of "page recognition," which actually does deliver the promise we've always wanted from OCR. I'm sure every reader appreciates the breakthrough represented by the laser printer and page-makeup software, a combination so powerful it created new reasons for buying a computer. A similar breakthrough is happening right now in page recognition: the Macintosh (and, I must admit, other personal computers) equipped with a moderately priced scanner and OmniPage software (from Caere

  8. Design of a distributed CORBA based image processing server.

    PubMed

    Giess, C; Evers, H; Heid, V; Meinzer, H P

    2000-01-01

    This paper presents the design and implementation of a distributed image processing server based on CORBA. Existing image processing tools were encapsulated in a common way with this server. Data exchange and conversion is done automatically inside the server, hiding these tasks from the user. The different image processing tools are visible as one large collection of algorithms and due to the use of CORBA are accessible via intra-/internet.

  9. Accounting Programs' Home Pages: What's Happening.

    ERIC Educational Resources Information Center

    Peek, Lucia E.; Roxas, Maria L.

    2002-01-01

    Content analysis of 62 accounting programs' websites indicated the following: 53% include mission statements; 62.9% list accreditation; many faculty biographies and personal pages used inconsistent formats; provision of information on financial aid, student organizations, career services, and certified public accountant requirements varied. Many…

  10. 24 CFR 1710.105 - Cover page.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... DEVELOPMENT (INTERSTATE LAND SALES REGISTRATION PROGRAM) LAND REGISTRATION Reporting Requirements § 1710.105... 24 Housing and Urban Development 5 2012-04-01 2012-04-01 false Cover page. 1710.105 Section 1710.105 Housing and Urban Development Regulations Relating to Housing and Urban Development...

  11. Turning the Page with Preconference Workshops

    ERIC Educational Resources Information Center

    Knowledge Quest, 2011

    2011-01-01

    Those who are experiencing a lack of creative inspiration within their school library program but are ready to "turn a page" in their career or program may head to Minneapolis to attend one of the many great preconference workshops. This article presents and describes preconference workshops designed to rid librarians of the…

  12. Adding Graphics to Your WWW Page.

    ERIC Educational Resources Information Center

    Descy, Don E.

    1995-01-01

    Explains how to retrieve graphics that are available on the World Wide Web and add them to a Web page using a word processor that can save documents in an ASCII (American Standard Code Information Interchange) text format and a new version of Netscape. A list of various, unrelated Internet resources is also included. (LRW)

  13. What's Not Funny about the Funny Pages?

    ERIC Educational Resources Information Center

    Lum, Lydia

    2008-01-01

    As a kid, Darrin Bell devoured newspaper comic strips. So it was disappointing whenever editors refused years later to add his comic strip, "Candorville," to their funny pages as soon as they saw that his lead characters were minorities. The editors would say they already carried a so-called Black strip. It is difficult for cartoonists like Bell…

  14. Accessibility of Special Education Program Home Pages.

    ERIC Educational Resources Information Center

    Flowers, Claudia P.; Bray, Marty; Algozzine, Robert F.

    1999-01-01

    Eighty-nine special education Web sites were evaluated for accessibility errors. Most (73 percent) special education home pages had accessibility problems, and the majority of these errors severely limited access for individuals with disabilities. The majority of the errors can be easily corrected. Recommendations and methods for improving…

  15. Thomas Jefferson, Page Design, and Desktop Publishing.

    ERIC Educational Resources Information Center

    Hartley, James

    1991-01-01

    Discussion of page design for desktop publishing focuses on the importance of functional issues as opposed to aesthetic issues, and criticizes a previous article that stressed aesthetic issues. Topics discussed include balance, consistency in text structure, and how differences in layout affect the clarity of "The Declaration of Independence."…

  16. Efficient Web Change Monitoring with Page Digest

    SciTech Connect

    Buttler, D J; Rocco, D; Liu, L

    2004-02-20

    The Internet and the World Wide Web have enabled a publishing explosion of useful online information, which has produced the unfortunate side effect of information overload: it is increasingly difficult for individuals to keep abreast of fresh information. In this paper we describe an approach for building a system for efficiently monitoring changes to Web documents. This paper has three main contributions. First, we present a coherent framework that captures different characteristics of Web documents. The system uses the Page Digest encoding to provide a comprehensive monitoring system for content, structure, and other interesting properties of Web documents. Second, the Page Digest encoding enables improved performance for individual page monitors through mechanisms such as short-circuit evaluation, linear time algorithms for document and structure similarity, and data size reduction. Finally, we develop a collection of sentinel grouping techniques based on the Page Digest encoding to reduce redundant processing in large-scale monitoring systems by grouping similar monitoring requests together. We examine how effective these techniques are over a wide range of parameters and have seen an order of magnitude speed up over existing Web-based information monitoring systems.
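
    The abstract does not give the Page Digest encoding itself, but the basic idea of cheap change detection with short-circuit evaluation can be sketched as follows: hash a page's text content and tag structure separately, and compare those fingerprints before doing any deeper comparison. The fetching and hashing choices below are assumptions for illustration, not the paper's encoding.

    # Sketch of digest-based change monitoring: compare cheap fingerprints of a
    # page's visible text and tag structure before doing any expensive analysis.
    # The content/structure split loosely mirrors the Page Digest idea.
    import hashlib
    import re

    def digests(html: str) -> tuple[str, str]:
        tags = "".join(re.findall(r"</?\w+", html))    # structural skeleton
        text = re.sub(r"<[^>]+>", " ", html)           # visible text only
        return (hashlib.sha256(text.encode()).hexdigest(),
                hashlib.sha256(tags.encode()).hexdigest())

    def changed(old_html: str, new_html: str) -> bool:
        return digests(old_html) != digests(new_html)

    print(changed("<p>Hello</p>", "<p>Hello!</p>"))  # True: content differs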

  17. Reconfigurable Full-Page Braille Displays

    NASA Technical Reports Server (NTRS)

    Garner, H. Douglas

    1994-01-01

    Electrically actuated braille display cells of proposed type arrayed together to form full-page braille displays. Like other braille display cells, these provide changeable patterns of bumps driven by digitally recorded text stored on magnetic tapes or in solid-state electronic memories. Proposed cells contain electrorheological fluid. Viscosity of such fluid increases in strong electrostatic field.

  18. 47 CFR 22.531 - Channels for paging operation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 22.531 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Paging and Radiotelephone Service Paging Operation § 22.531 Channels for paging operation. The following channels are allocated for assignment to base transmitters that provide paging...

  19. [An internet based medical communication server].

    PubMed

    Hu, B; Bai, J; Ye, D

    1998-04-01

    Telemedicine and medical conferencing usually require multi-point to multi-point communication. Because the communicating users may be patients, specialists or medical centers, they have different communication rates and different physical connections; this kind of communication is therefore complicated and limited by those rates. In this paper, to meet the requirements of medical communication, we present the concept of a medical communication server that receives data packages and delivers them according to the requests of clients, and describe its implementation in the Windows 95 environment using Windows Sockets.
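
    A minimal modern analogue of the described server, which receives data packages from one client and forwards them to the other connected clients, might look like the sketch below. It uses Python's standard socket and threading modules rather than Windows Sockets under Windows 95, and every detail (port number, newline framing) is an assumption for illustration.

    # Toy relay server: every message received from one client is forwarded to
    # all other connected clients. Port number and newline framing are assumed.
    import socket
    import threading

    clients = []
    lock = threading.Lock()

    def handle(conn):
        with lock:
            clients.append(conn)
        try:
            for line in conn.makefile("rb"):
                with lock:
                    for other in clients:
                        if other is not conn:
                            other.sendall(line)
        finally:
            with lock:
                clients.remove(conn)
            conn.close()

    def serve(port=9000):
        srv = socket.create_server(("", port))
        while True:
            conn, _ = srv.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

    if __name__ == "__main__":
        serve()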

  20. HS06 Benchmark for an ARM Server

    NASA Astrophysics Data System (ADS)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  1. Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer

    NASA Astrophysics Data System (ADS)

    Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien

    2016-04-01

    Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014, Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367), tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), WCPS is applied to diverse datasets from the Moon, as well as Mercury. Other Solar System bodies will become progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as well as derived imagery colour combination products, dynamically generated and accessed also through the OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally or globally. The new PlanetServer system uses the open-source NASA WorldWind (e.g. Hogan, 2011) virtual globe as its visualisation engine, and the array database Rasdaman Community Edition as its core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible on http://planetserver.eu. All its code base is going to be available on GitHub, on
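
    A WCPS request of the kind mentioned above is just a query string sent to the service endpoint, here using the requests library. The sketch below computes a simple band ratio over a hypothetical coverage; the endpoint path, coverage name and band names are placeholders, not actual PlanetServer identifiers.

    # Sketch of posing a WCPS query over HTTP. Coverage and band names are
    # hypothetical; a real PlanetServer/rasdaman deployment defines its own.
    import requests

    WCPS_ENDPOINT = "http://example.org/rasdaman/ows"   # placeholder endpoint

    query = """
    for $c in (hypothetical_crism_cube)
    return encode(
      (float) $c.band_233 / $c.band_078,
      "csv")
    """

    resp = requests.post(WCPS_ENDPOINT, data={"query": query}, timeout=60)
    resp.raise_for_status()
    print(resp.text[:200])   # first part of the CSV band-ratio result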

  2. A Web Terminology Server Using UMLS for the Description of Medical Procedures

    PubMed Central

    Burgun, Anita; Denier, Patrick; Bodenreider, Olivier; Botti, Geneviève; Delamarre, Denis; Pouliquen, Bruno; Oberlin, Philippe; Lévéque, Jean M.; Lukacs, Bertrand; Kohler, François; Fieschi, Marius; Le Beux, Pierre

    1997-01-01

    Abstract The Model for Assistance in the Orientation of a User within Coding Systems (MAOUSSC) project has been designed to provide a representation for medical and surgical procedures that allows several applications to be developed from several viewpoints. It is based on a conceptual model, a controlled set of terms, and Web server development. The design includes the UMLS knowledge sources associated with additional knowledge about medico-surgical procedures. The model was implemented using a relational database. The authors developed a complete interface for the Web presentation, with the intermediary layer being written in PERL. The server has been used for the representation of medico-surgical procedures that occur in the discharge summaries of the national survey of hospital activities that is performed by the French Health Statistics Agency in order to produce inpatient profiles. The authors describe the current status of the MAOUSSC server and discuss their interest in using such a server to assist in the coordination of terminology tasks and in the sharing of controlled terminologies. PMID:9292841

  3. The EarthServer project: Exploiting Identity Federations, Science Gateways and Social and Mobile Clients for Big Earth Data Analysis

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Messina, Antonio; Pappalardo, Marco; Passaro, Gianluca

    2013-04-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" - based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. Six Lighthouse Applications are being established in EarthServer, each of which poses distinct challenges on Earth Data Analytics: Cryospheric Science, Airborne Science, Atmospheric Science, Geology, Oceanography, and Planetary Science. Altogether, they cover all Earth Science domains; the Planetary Science use case has been added to challenge concepts and standards in non-standard environments. In addition, EarthLook (maintained by Jacobs University) showcases use of OGC standards in 1D through 5D use cases. In this contribution we will report on the first applications integrated in the EarthServer Science Gateway and on the clients for mobile appliances developed to access them. We will also show how federated and social identity services can allow Big Earth Data Providers to expose their data in a distributed environment keeping a strict and fine-grained control on user authentication and authorisation. The degree of fulfilment of the EarthServer implementation with the recommendations made in the recent TERENA Study on

  4. Developing a web page: bringing clinics online.

    PubMed

    Peterson, Ronnie; Berns, Susan

    2004-01-01

    Introducing clinical staff education, along with new policies and procedures, to over 50 different clinical sites can be a challenge. As any staff educator will confess, getting people to attend an educational inservice session can be difficult. Clinical staff request training, but no one has time to attend training sessions. Putting the training along with the policies and other information into "neat" concise packages via the computer and over the company's intranet was the way to go. However, how do you bring the clinics online when some of the clinical staff may still be reluctant to turn on their computers for anything other than to gather laboratory results? Developing an easy, fun, and accessible Web page was the answer. This article outlines the development of the first training Web page at the University of Wisconsin Medical Foundation, Madison, WI.

  5. VCL: a high performance virtual CD library server

    NASA Astrophysics Data System (ADS)

    Wan, Jiguang; Xie, ChangSheng; Tan, Zhihu

    2005-09-01

    With the increasing amount of CD data on the internet, the CD mirror server has become a new technology. Considering the performance requirements of the traditional CD mirror server, we present a novel high-performance VCL (Virtual CD Library) server. What makes the VCL server superior is two patented technologies: a new caching architecture and an efficient network protocol specifically tailored to VCL applications. The VCL server is built on an innovative caching technology. It employs a two-level cache structure on both the client side and the server side. Instead of using existing network and file protocols such as SMB/CIFS that are generally used by existing CD servers, we have developed a set of new protocols specifically suited to the VCL environment. The new protocol is a native VCL protocol built directly on the TCP/IP protocol. The VCL protocol optimizes data transfer performance for block-level data as opposed to file-system-level data. The advantage of using a block-level native protocol is a reduced network-bandwidth requirement to transfer the same amount of data compared to a file-system-level protocol. Our experiments and independent testing have shown that VCL servers allow many more concurrent users than existing products. For very high resolution DVD videos, a VCL server with a 100 Mbps NIC supports over 10 concurrent users viewing the same or different videos simultaneously. For VCD videos, the same VCL can support over 65 concurrent users viewing videos simultaneously. For data CDs, the VCL can support over 500 concurrent data stream users.

  6. OPC Data Acquisition Server for CPDev Engineering Environment

    NASA Astrophysics Data System (ADS)

    Rzońca, Dariusz; Sadolewski, Jan; Trybus, Bartosz

    An OPC Server has been created for the CPDev engineering environment, which provides classified process data for OPC client applications. Hierarchical Coloured Petri nets are used at the design stage to model communications of the server with CPDev target controllers. The implementation involves a universal interface for data acquisition via different communication protocols such as Modbus or .NET Remoting.

  7. DISULFIND: a disulfide bonding state and cysteine connectivity prediction server

    PubMed Central

    Ceroni, Alessio; Passerini, Andrea; Vullo, Alessandro; Frasconi, Paolo

    2006-01-01

    DISULFIND is a server for predicting the disulfide bonding state of cysteines and their disulfide connectivity starting from sequence alone. Optionally, disulfide connectivity can be predicted from sequence and a bonding state assignment given as input. The output is a simple visualization of the assigned bonding state (with confidence degrees) and the most likely connectivity patterns. The server is available at . PMID:16844986

  8. Client-Server Connection Status Monitoring Using Ajax Push Technology

    NASA Technical Reports Server (NTRS)

    Lamongie, Julien R.

    2008-01-01

    This paper describes how simple client-server connection status monitoring can be implemented using Ajax (Asynchronous JavaScript and XML), JSF (Java Server Faces) and ICEfaces technologies. This functionality is required for NASA LCS (Launch Control System) displays used in the firing room for the Constellation project. Two separate implementations based on two distinct approaches are detailed and analyzed.

  9. A FPGA embedded web server for remote monitoring and control of smart sensors networks.

    PubMed

    Magdaleno, Eduardo; Rodríguez, Manuel; Pérez, Fernando; Hernández, David; García, Enrique

    2013-01-01

    This article describes the implementation of a web server using an embedded Altera NIOS II IP core, a general-purpose and configurable RISC processor which is embedded in a Cyclone FPGA. The processor uses the μCLinux operating system to support a Boa web server of dynamic pages using the Common Gateway Interface (CGI). The FPGA is configured to act as the master node of a network, and also to control and monitor a network of smart sensors or instruments. In order to develop a fully functional system, the FPGA also includes an implementation of the time-triggered protocol (TTP/A). Thus, the implemented master node has two interfaces: the web server, which acts as an Internet interface, and the other to control the network. This protocol is widely used for connecting smart sensors, actuators and microsystems in embedded real-time systems in different application domains, e.g., industrial, automotive, domotic, etc., although this protocol can easily be replaced by any other because of the inherent characteristics of the FPGA-based technology. PMID:24379047
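
    The Boa server mentioned above serves dynamic pages through the Common Gateway Interface, i.e. by executing a program that writes an HTTP response to standard output. A minimal CGI handler of the kind such a node might expose is sketched below; the sensor-reading function is a hypothetical stand-in for the value the master node would obtain over the TTP/A fieldbus.

    #!/usr/bin/env python3
    # Minimal CGI script: the web server runs it per request and relays whatever
    # it writes to stdout. read_sensor() is a hypothetical placeholder for a
    # measurement obtained over the fieldbus.
    import json
    import os
    import urllib.parse

    def read_sensor(node_id: str) -> float:
        return 21.5          # placeholder measurement

    params = urllib.parse.parse_qs(os.environ.get("QUERY_STRING", ""))
    node = params.get("node", ["0"])[0]

    body = json.dumps({"node": node, "value": read_sensor(node)})
    print("Content-Type: application/json")
    print()                   # blank line ends the CGI headers
    print(body)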

  10. A FPGA Embedded Web Server for Remote Monitoring and Control of Smart Sensors Networks

    PubMed Central

    Magdaleno, Eduardo; Rodríguez, Manuel; Pérez, Fernando; Hernández, David; García, Enrique

    2014-01-01

    This article describes the implementation of a web server using an embedded Altera NIOS II IP core, a general-purpose and configurable RISC processor which is embedded in a Cyclone FPGA. The processor uses the μCLinux operating system to support a Boa web server of dynamic pages using the Common Gateway Interface (CGI). The FPGA is configured to act as the master node of a network, and also to control and monitor a network of smart sensors or instruments. In order to develop a fully functional system, the FPGA also includes an implementation of the time-triggered protocol (TTP/A). Thus, the implemented master node has two interfaces: the web server, which acts as an Internet interface, and the other to control the network. This protocol is widely used for connecting smart sensors, actuators and microsystems in embedded real-time systems in different application domains, e.g., industrial, automotive, domotic, etc., although this protocol can easily be replaced by any other because of the inherent characteristics of the FPGA-based technology. PMID:24379047

  11. A FPGA embedded web server for remote monitoring and control of smart sensors networks.

    PubMed

    Magdaleno, Eduardo; Rodríguez, Manuel; Pérez, Fernando; Hernández, David; García, Enrique

    2013-12-27

    This article describes the implementation of a web server using an embedded Altera NIOS II IP core, a general-purpose and configurable RISC processor which is embedded in a Cyclone FPGA. The processor uses the μCLinux operating system to support a Boa web server of dynamic pages using the Common Gateway Interface (CGI). The FPGA is configured to act as the master node of a network, and also to control and monitor a network of smart sensors or instruments. In order to develop a fully functional system, the FPGA also includes an implementation of the time-triggered protocol (TTP/A). Thus, the implemented master node has two interfaces: the web server, which acts as an Internet interface, and the other to control the network. This protocol is widely used for connecting smart sensors, actuators and microsystems in embedded real-time systems in different application domains, e.g., industrial, automotive, domotic, etc., although this protocol can easily be replaced by any other because of the inherent characteristics of the FPGA-based technology.

  12. FAF-Drugs3: a web server for compound property calculation and chemical library design.

    PubMed

    Lagorce, David; Sperandio, Olivier; Baell, Jonathan B; Miteva, Maria A; Villoutreix, Bruno O

    2015-07-01

    Drug attrition late in preclinical or clinical development is a serious economic problem in the field of drug discovery. These problems can be linked, in part, to the quality of the compound collections used during the hit generation stage and to the selection of compounds undergoing optimization. Here, we present FAF-Drugs3, a web server that can be used for drug discovery and chemical biology projects to help in preparing compound libraries and to assist decision-making during the hit selection/lead optimization phase. Since it was first described in 2006, FAF-Drugs has been significantly modified. The tool now applies an enhanced structure curation procedure, can filter or analyze molecules with user-defined or eight predefined physicochemical filters as well as with several simple ADMET (absorption, distribution, metabolism, excretion and toxicity) rules. In addition, compounds can be filtered using an updated list of 154 hand-curated structural alerts while Pan Assay Interference compounds (PAINS) and other, generally unwanted groups are also investigated. FAF-Drugs3 offers access to user-friendly html result pages and the possibility to download all computed data. The server requires as input an SDF file of the compounds; it is open to all users and can be accessed without registration at http://fafdrugs3.mti.univ-paris-diderot.fr.
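
    A drastically simplified, local analogue of this kind of physicochemical filtering can be written with RDKit (assumed to be installed), working from SMILES strings rather than the SDF input the server expects. The thresholds below are a generic Lipinski-style rule set chosen for illustration, not the FAF-Drugs3 predefined filters.

    # Toy physicochemical filter in the spirit of library-preparation tools:
    # keep compounds passing simple Lipinski-style cut-offs. Thresholds are
    # illustrative, not those shipped with FAF-Drugs3.
    from rdkit import Chem
    from rdkit.Chem import Descriptors, Lipinski

    def passes_filter(smiles: str) -> bool:
        mol = Chem.MolFromSmiles(smiles)
        if mol is None:                       # unparsable structure
            return False
        return (Descriptors.MolWt(mol) <= 500
                and Descriptors.MolLogP(mol) <= 5
                and Lipinski.NumHDonors(mol) <= 5
                and Lipinski.NumHAcceptors(mol) <= 10)

    library = ["CCO", "c1ccccc1C(=O)O", "CC(C)Cc1ccc(cc1)C(C)C(=O)O"]
    print([s for s in library if passes_filter(s)])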

  13. NMR Constraints Analyser: a web-server for the graphical analysis of NMR experimental constraints.

    PubMed

    Heller, Davide Martin; Giorgetti, Alejandro

    2010-07-01

    Nuclear magnetic resonance (NMR) spectroscopy, together with X-ray crystallography, is one of the main techniques used for the determination of high-resolution 3D structures of biological molecules. The output of an NMR experiment includes a set of lower and upper limits for the distances (constraints) between pairs of atoms. If the number of constraints is high enough, there will be a finite number of possible conformations (models) of the macromolecule satisfying the data. Thus, the more constraints are measured, the better defined these structures will be. The availability of a user-friendly tool to help analyze and interpret the number of experimental constraints per residue is thus of great value when assessing the level of structural definition of NMR-solved biological macromolecules, in particular when high-quality structures are needed for computational biology approaches, site-directed mutagenesis experiments and/or drug design. Here, we present a free, publicly available web server, the NMR Constraints Analyser, which is aimed at providing an automatic graphical analysis of the NMR experimental constraints atom by atom. The NMR Constraints Analyser server is available from the web page http://molsim.sci.univr.it/constraint.

  14. FAF-Drugs3: a web server for compound property calculation and chemical library design

    PubMed Central

    Lagorce, David; Sperandio, Olivier; Baell, Jonathan B.; Miteva, Maria A.; Villoutreix, Bruno O.

    2015-01-01

    Drug attrition late in preclinical or clinical development is a serious economic problem in the field of drug discovery. These problems can be linked, in part, to the quality of the compound collections used during the hit generation stage and to the selection of compounds undergoing optimization. Here, we present FAF-Drugs3, a web server that can be used for drug discovery and chemical biology projects to help in preparing compound libraries and to assist decision-making during the hit selection/lead optimization phase. Since it was first described in 2006, FAF-Drugs has been significantly modified. The tool now applies an enhanced structure curation procedure, can filter or analyze molecules with user-defined or eight predefined physicochemical filters as well as with several simple ADMET (absorption, distribution, metabolism, excretion and toxicity) rules. In addition, compounds can be filtered using an updated list of 154 hand-curated structural alerts while Pan Assay Interference compounds (PAINS) and other, generally unwanted groups are also investigated. FAF-Drugs3 offers access to user-friendly html result pages and the possibility to download all computed data. The server requires as input an SDF file of the compounds; it is open to all users and can be accessed without registration at http://fafdrugs3.mti.univ-paris-diderot.fr. PMID:25883137

  15. R3D-2-MSA: the RNA 3D structure-to-multiple sequence alignment server

    PubMed Central

    Cannone, Jamie J.; Sweeney, Blake A.; Petrov, Anton I.; Gutell, Robin R.; Zirbel, Craig L.; Leontis, Neocles

    2015-01-01

    The RNA 3D Structure-to-Multiple Sequence Alignment Server (R3D-2-MSA) is a new web service that seamlessly links RNA three-dimensional (3D) structures to high-quality RNA multiple sequence alignments (MSAs) from diverse biological sources. In this first release, R3D-2-MSA provides manual and programmatic access to curated, representative ribosomal RNA sequence alignments from bacterial, archaeal, eukaryal and organellar ribosomes, using nucleotide numbers from representative atomic-resolution 3D structures. A web-based front end is available for manual entry and an Application Program Interface for programmatic access. Users can specify up to five ranges of nucleotides and 50 nucleotide positions per range. The R3D-2-MSA server maps these ranges to the appropriate columns of the corresponding MSA and returns the contents of the columns, either for display in a web browser or in JSON format for subsequent programmatic use. The browser output page provides a 3D interactive display of the query, a full list of sequence variants with taxonomic information and a statistical summary of distinct sequence variants found. The output can be filtered and sorted in the browser. Previous user queries can be viewed at any time by resubmitting the output URL, which encodes the search and re-generates the results. The service is freely available with no login requirement at http://rna.bgsu.edu/r3d-2-msa. PMID:26048960
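
    Programmatic access of the kind described could be scripted as below, using the requests library. Note that the path and query parameter names are invented placeholders for illustration: the abstract only states that a JSON-returning API exists at the server URL, and does not document its interface.

    # Hypothetical sketch of calling a JSON API like the one described for
    # R3D-2-MSA. The parameter names below are placeholders, not the server's
    # documented interface.
    import requests

    BASE = "http://rna.bgsu.edu/r3d-2-msa"       # server named in the abstract
    params = {                                    # hypothetical parameter names
        "structure": "EXAMPLE_PDB_ID",
        "range": "100-110",
        "format": "json",
    }
    resp = requests.get(BASE, params=params, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    print(type(data))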

  16. Insights into Facebook Pages: an early adolescent health research study page targeted at parents.

    PubMed

    Amon, Krestina L; Paxton, Karen; Klineberg, Emily; Riley, Lisa; Hawke, Catherine; Steinbeck, Katharine

    2016-02-01

    Facebook has been used in health research, but there is a lack of literature regarding how Facebook may be used to recruit younger adolescents. A Facebook Page was created for an adolescent cohort study on the effects of puberty hormones on well-being and behaviour in early adolescence. Used as a communication tool with existing participants, it also aimed to alert potential participants to the study. The purpose of this paper is to provide a detailed description of the development of the study Facebook Page and present the fan response to the types of posts made on the Page using the Facebook-generated Insights data. Two types of posts were made on the study Facebook Page. The first type was study-related update posts and events. The second was relevant adolescent and family research and current news posts. Observations on the use of and response to the Page were made over 1 year across three phases (phase 1, very low Facebook use; phase 2, high Facebook use; phase 3, low Facebook use). Most Page fans were female (88.6%), with the largest group of fans aged between 35 and 44 years. Study-related update posts with photographs were the most popular. This paper provides a model on which other researchers could base Facebook communication and potential recruitment in the absence of established guidelines. PMID:25781667

  17. Insights into Facebook Pages: an early adolescent health research study page targeted at parents.

    PubMed

    Amon, Krestina L; Paxton, Karen; Klineberg, Emily; Riley, Lisa; Hawke, Catherine; Steinbeck, Katharine

    2016-02-01

    Facebook has been used in health research, but there is a lack of literature regarding how Facebook may be used to recruit younger adolescents. A Facebook Page was created for an adolescent cohort study on the effects of puberty hormones on well-being and behaviour in early adolescence. Used as a communication tool with existing participants, it also aimed to alert potential participants to the study. The purpose of this paper is to provide a detailed description of the development of the study Facebook Page and present the fan response to the types of posts made on the Page using the Facebook-generated Insights data. Two types of posts were made on the study Facebook Page. The first type was study-related update posts and events. The second was relevant adolescent and family research and current news posts. Observations on the use of and response to the Page were made over 1 year across three phases (phase 1, very low Facebook use; phase 2, high Facebook use; phase 3, low Facebook use). Most Page fans were female (88.6%), with the largest group of fans aged between 35 and 44 years. Study-related update posts with photographs were the most popular. This paper provides a model on which other researchers could base Facebook communication and potential recruitment in the absence of established guidelines.

  18. Perspectives on the consecutive pages problem

    NASA Astrophysics Data System (ADS)

    Srinivasan, V. K.

    2011-04-01

    This article presents different approaches to a problem, dubbed by the author as 'the consecutive pages problem'. The aim of this teaching-oriented article is to promote the teaching of abstract concepts in mathematics, by selecting a challenging amusement problem and then presenting various solutions in such a way that it can engage the attention of a fourth-grade student, a high school senior student, an average college student and scholars.

  19. FarFetch--an Internet-based sequence entry server.

    PubMed

    Gilbert, W A

    1994-04-01

    This communication is to announce the availability of a network server for biological sequence database entries which will allow scientists to fetch entries in a desired format directly into their file store. This server will use TCP/IP protocols allowing any user with an Internet connection to participate. FarFetch will allow users to obtain sequence entries in a directly usable form as opposed to conventional e-mail based sequence retrievers. This server also differs from Gopher and WAIS servers in that the sequence entry is written into the user's file store in a format that is immediately usable. Clients for the OpenVMS, Unix and Macintosh operating systems have been written and are available via anonymous ftp. Development of MS-DOS and Windows clients is planned. There will be no usage fees associated with the server.

  20. Improvements to the NIST network time protocol servers

    NASA Astrophysics Data System (ADS)

    Levine, Judah

    2008-12-01

    The National Institute of Standards and Technology (NIST) operates 22 network time servers at various locations. These servers respond to requests for time in a number of different formats and provide time stamps that are directly traceable to the NIST atomic clock ensemble in Boulder. The link between the servers at locations outside of the NIST Boulder Laboratories and the atomic clock ensemble is provided by the Automated Computer Time Service (ACTS) system, which has a direct connection to the clock ensemble and which transmits time information over dial-up telephone lines with a two-way protocol to measure the transmission delay. I will discuss improvements to the ACTS servers and to the time servers themselves. These improvements have resulted in an improvement of almost an order of magnitude in the performance of the system.
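
    The NTP format these servers answer in can be queried with a minimal SNTP client, as in the sketch below: it sends one client-mode packet and decodes the 64-bit transmit timestamp (seconds since 1900). time.nist.gov is one published NIST server name; any NTP server would do.

    # Minimal SNTP client: one 48-byte request, read the transmit timestamp
    # (bytes 40-47) and convert from the NTP epoch (1900) to the Unix epoch.
    import socket
    import struct
    import time

    NTP_EPOCH_OFFSET = 2208988800          # seconds between 1900 and 1970

    def sntp_time(host: str = "time.nist.gov") -> float:
        packet = b"\x1b" + 47 * b"\0"      # LI=0, VN=3, Mode=3 (client)
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.settimeout(5)
            s.sendto(packet, (host, 123))
            data, _ = s.recvfrom(512)
        secs, frac = struct.unpack("!II", data[40:48])
        return secs - NTP_EPOCH_OFFSET + frac / 2**32

    print(time.ctime(sntp_time()))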

  1. Oceanotron, Scalable Server for Marine Observations

    NASA Astrophysics Data System (ADS)

    Loubrieu, T.; Bregent, S.; Blower, J. D.; Griffiths, G.

    2013-12-01

    Ifremer, the French marine institute, is deeply involved in data management for different ocean in-situ observation programs (ARGO, OceanSites, GOSUD, ...) and other European programs aiming at networking ocean in-situ observation data repositories (myOcean, seaDataNet, Emodnet). To capitalize on the effort of implementing advanced data dissemination services (visualization, download with subsetting) for these programs and, more generally, for water-column observation repositories, Ifremer decided to develop the oceanotron server (2010). Given the diversity of data repository formats (RDBMS, netCDF, ODV, ...) and the temperamental nature of the standard interoperability interface profiles (OGC/WMS, OGC/WFS, OGC/SOS, OPeNDAP, ...), the server is designed to manage plugins: StorageUnits, which read specific data repository formats (netCDF/OceanSites, RDBMS schemas, ODV binary format), and FrontDesks, which receive external requests and return results for interoperable protocols (OGC/WMS, OGC/SOS, OPeNDAP). In between, a third type of plugin may be inserted: TransformationUnits, which perform ocean-business-related transformations of the features (for example, conversion of vertical coordinates from pressure in dB to metres below the sea surface). The server is released under an open-source license so that partners can develop their own plugins. Within the MyOcean project, the University of Reading has plugged in a WMS implementation as an oceanotron FrontDesk. The modules are connected by sharing the same information model for marine observations (or sampling features: vertical profiles, point series and trajectories), dataset metadata and queries. The shared information model is based on the OGC Observations & Measurements and Unidata Common Data Model initiatives. The model is implemented in Java (http://www.ifremer.fr/isi/oceanotron/javadoc/). This inner interoperability level enables ocean business expertise to be capitalized in software development without being indentured to
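
    The plugin split described above (storage back ends, protocol front desks, optional transformations sharing one observation model) can be caricatured with a few abstract interfaces. The class and method names below are invented for illustration and do not come from the oceanotron code base, which is written in Java.

    # Illustrative plugin interfaces mirroring the described architecture:
    # StorageUnits read a repository format, TransformationUnits rework the
    # features, FrontDesks expose a protocol. Names are invented.
    from abc import ABC, abstractmethod
    from typing import Iterable

    class Feature:                 # stand-in for the shared observation model
        def __init__(self, kind: str, values: dict):
            self.kind, self.values = kind, values

    class StorageUnit(ABC):
        @abstractmethod
        def read(self, query: dict) -> Iterable[Feature]: ...

    class TransformationUnit(ABC):
        @abstractmethod
        def apply(self, features: Iterable[Feature]) -> Iterable[Feature]: ...

    class FrontDesk(ABC):
        @abstractmethod
        def handle(self, request: dict, storage: StorageUnit,
                   transforms: list["TransformationUnit"]) -> bytes: ...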

  2. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    SciTech Connect

    Valassi, A.; Bartoldus, R.; Kalkhof, A.; Salnikov, A.; Wache, M.; /Mainz U., Inst. Phys.

    2012-04-19

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier 'CORAL server' deployed close to the database and a tree of 'CORAL server proxies', providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.

  3. EarthServer: a Summary of Achievements in Technology, Services, and Standards

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2015-04-01

    based on OGC and W3C standards, in particular the Web Coverage Processing Service (WCPS) which defines a high-level coverage query language. Reviewers have attested EarthServer that "With no doubt the project has been shaping the Big Earth Data landscape through the standardization activities within OGC, ISO and beyond". We present the project approach, its outcomes and impact on standardization and Big Data technology, and vistas for the future.

  4. Identifying and Analyzing Web Server Attacks

    SciTech Connect

    Seifert, Christian; Endicott-Popovsky, Barbara E.; Frincke, Deborah A.; Komisarczuk, Peter; Muschevici, Radu; Welch, Ian D.

    2008-08-29

    Abstract: Client honeypots can be used to identify malicious web servers that attack web browsers and push malware to client machines. Merely recording network traffic is insufficient to perform comprehensive forensic analyses of such attacks. Custom tools are required to access and analyze network protocol data. Moreover, specialized methods are required to perform a behavioral analysis of an attack, which helps determine exactly what transpired on the attacked system. This paper proposes a record/replay mechanism that enables forensic investigators to extract application data from recorded network streams and allows applications to interact with this data in order to conduct behavioral analyses. Implementations for the HTTP and DNS protocols are presented and their utility in network forensic investigations is demonstrated.

  5. STRAW: Species TRee Analysis Web server.

    PubMed

    Shaw, Timothy I; Ruan, Zheng; Glenn, Travis C; Liu, Liang

    2013-07-01

    The coalescent methods for species tree reconstruction are increasingly popular because they can accommodate coalescence and multilocus data sets. Herein, we present STRAW, a web server that offers workflows for reconstruction of species phylogenies using three species tree methods: MP-EST, STAR and NJst. The input data are a collection of rooted gene trees (for the STAR and MP-EST methods) or unrooted gene trees (for NJst). The output includes the estimated species tree, modified Robinson-Foulds distances between gene trees and the estimated species tree, and visualization of trees to compare gene trees with the estimated species tree. The web server is available at http://bioinformatics.publichealth.uga.edu/SpeciesTreeAnalysis/.
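
    As a rough local analogue of the tree comparison reported by the server, the standard (unmodified) Robinson-Foulds distance between two trees can be computed with a library such as DendroPy (assumed installed); the toy Newick strings below are illustrative only.

    # Robinson-Foulds (symmetric difference) distance between two toy trees,
    # using DendroPy. The Newick strings are made up for illustration.
    import dendropy
    from dendropy.calculate import treecompare

    tns = dendropy.TaxonNamespace()
    gene_tree = dendropy.Tree.get(data="((A,B),(C,D));", schema="newick",
                                  taxon_namespace=tns)
    species_tree = dendropy.Tree.get(data="((A,C),(B,D));", schema="newick",
                                     taxon_namespace=tns)

    gene_tree.encode_bipartitions()
    species_tree.encode_bipartitions()
    print(treecompare.symmetric_difference(gene_tree, species_tree))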

  6. 46. Photograph of a published page. 'OPERATIONS IN INCORPORATION BUILDINGS: ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    46. Photograph of a published page. 'OPERATIONS IN INCORPORATION BUILDINGS: HOLSTON DEFENSE CORPORATION. 'HOLSTON ARMY AMMUNITION PLANT.' Page 17. (no date). - Holston Army Ammunition Plant, RDX-and-Composition-B Manufacturing Line 9, Kingsport, Sullivan County, TN

  7. 45. Photograph of a published page. OPERATIONS IN 'H' OR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    45. Photograph of a published page. OPERATIONS IN 'H' OR DEWATERING BUILDING: HOLSTON DEFENSE CORPORATION. 'HOLSTON ARMY AMMUNITION PLANT.' Page 16. (no date). - Holston Army Ammunition Plant, RDX-and-Composition-B Manufacturing Line 9, Kingsport, Sullivan County, TN

  8. Secure Entanglement Distillation for Double-Server Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki; Fujii, Keisuke

    2013-07-01

    Blind quantum computation is a new secure quantum computing protocol where a client, who does not have enough quantum technologies at her disposal, can delegate her quantum computation to a server, who has a fully fledged quantum computer, in such a way that the server cannot learn anything about the client’s input, output, and program. If the client interacts with only a single server, the client has to have some minimum quantum power, such as the ability of emitting randomly rotated single-qubit states or the ability of measuring states. If the client interacts with two servers who share Bell pairs but cannot communicate with each other, the client can be completely classical. For such a double-server scheme, two servers have to share clean Bell pairs, and therefore the entanglement distillation is necessary in a realistic noisy environment. In this Letter, we show that it is possible to perform entanglement distillation in the double-server scheme without degrading the security of blind quantum computing.

  9. Secure entanglement distillation for double-server blind quantum computation.

    PubMed

    Morimae, Tomoyuki; Fujii, Keisuke

    2013-07-12

    Blind quantum computation is a new secure quantum computing protocol where a client, who does not have enough quantum technologies at her disposal, can delegate her quantum computation to a server, who has a fully fledged quantum computer, in such a way that the server cannot learn anything about the client's input, output, and program. If the client interacts with only a single server, the client has to have some minimum quantum power, such as the ability of emitting randomly rotated single-qubit states or the ability of measuring states. If the client interacts with two servers who share Bell pairs but cannot communicate with each other, the client can be completely classical. For such a double-server scheme, two servers have to share clean Bell pairs, and therefore the entanglement distillation is necessary in a realistic noisy environment. In this Letter, we show that it is possible to perform entanglement distillation in the double-server scheme without degrading the security of blind quantum computing.

  10. Understanding Customer Dissatisfaction with Underutilized Distributed File Servers

    NASA Technical Reports Server (NTRS)

    Riedel, Erik; Gibson, Garth

    1996-01-01

    An important trend in the design of storage subsystems is a move toward direct network attachment. Network-attached storage offers the opportunity to off-load distributed file system functionality from dedicated file server machines and execute many requests directly at the storage devices. For this strategy to lead to better performance, as perceived by users, the response time of distributed operations must improve. In this paper we analyze measurements of an Andrew file system (AFS) server that we recently upgraded in an effort to improve client performance in our laboratory. While the original server's overall utilization was only about 3%, we show how burst loads were sufficiently intense to lead to periods of poor response time significant enough to trigger customer dissatisfaction. In particular, we show how, after adjusting for network load and traffic to non-project servers, 50% of the variation in client response time was explained by variation in server central processing unit (CPU) use. That is, clients saw long response times in large part because the server was often over-utilized when it was used at all. Using these measures, we see that off-loading file server work in a network-attached storage architecture has the potential to benefit user response time. Computational power in such a system scales directly with storage capacity, so the slowdown during burst periods should be reduced.
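
    The "50% of the variation explained" statement corresponds to an R-squared of roughly 0.5 from a regression of client response time on server CPU utilization. The sketch below computes that statistic for made-up measurements, purely to illustrate the metric.

    # Fraction of variance in response time explained by CPU utilization (R^2)
    # via a one-variable least-squares fit. The data points are made up.
    import numpy as np

    cpu = np.array([5, 10, 20, 35, 50, 70, 85, 95], dtype=float)      # percent
    resp_ms = np.array([12, 15, 14, 30, 28, 55, 80, 140], dtype=float)

    slope, intercept = np.polyfit(cpu, resp_ms, 1)
    predicted = slope * cpu + intercept
    ss_res = np.sum((resp_ms - predicted) ** 2)
    ss_tot = np.sum((resp_ms - resp_ms.mean()) ** 2)
    print(f"R^2 = {1 - ss_res / ss_tot:.2f}")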

  11. UNIX based client/server hospital information system.

    PubMed

    Nakamura, S; Sakurai, K; Uchiyama, M; Yoshii, Y; Tachibana, N

    1995-01-01

    SMILE (St. Luke's Medical Center Information Linkage Environment) is a HIS which is a client/server system using UNIX workstations under an open network, LAN (FDDI & 10BASE-T). It provides a multivendor environment, high performance at low cost, and a user-friendly GUI. However, the client/server architecture with a UNIX workstation does not have the same OLTP environment (e.g., a TP monitor) as the mainframe. So, our system problems and the steps used to solve them were reviewed. Several points that are necessary for a client/server system with a UNIX workstation in the future are presented.

  12. PROMALS web server for accurate multiple protein sequence alignments.

    PubMed

    Pei, Jimin; Kim, Bong-Hyun; Tang, Ming; Grishin, Nick V

    2007-07-01

    Multiple sequence alignments are essential in homology inference, structure modeling, functional prediction and phylogenetic analysis. We developed a web server that constructs multiple protein sequence alignments using PROMALS, a progressive method that improves alignment quality by using additional homologs from PSI-BLAST searches and secondary structure predictions from PSIPRED. PROMALS shows higher alignment accuracy than other advanced methods, such as MUMMALS, ProbCons, MAFFT and SPEM. The PROMALS web server takes FASTA format protein sequences as input. The output includes a colored alignment augmented with information about sequence grouping, predicted secondary structures and positional conservation. The PROMALS web server is available at: http://prodata.swmed.edu/promals/ PMID:17452345

  13. Sausalito: An Application Server for RESTful Services in the Cloud

    NASA Astrophysics Data System (ADS)

    Brantner, Matthias

    This talk argues that the Web server, application server, and database system should be bundled into a single system for development and deployment of Web-based applications in the cloud. Furthermore, this talk argues that the whole system should serve REST services and should behave like a REST service itself. The design and implementation of Sausalito, a combined Web, application, and database server that operates on top of Amazon's cloud offerings, are presented. A demo of several example applications illustrates the advantages of the approach taken by Sausalito (see http://sausalito.28msec.com/).

  14. Classroom Web Pages: A "How-To" Guide for Educators.

    ERIC Educational Resources Information Center

    Fehling, Eric E.

    This manual provides teachers, with very little or no technology experience, with a step-by-step guide for developing the necessary skills for creating a class Web Page. The first part of the manual is devoted to the thought processes preceding the actual creation of the Web Page. These include looking at other Web Pages, deciding what should be…

  15. World Wide Web Pages--Tools for Teaching and Learning.

    ERIC Educational Resources Information Center

    Beasley, Sarah; Kent, Jean

    Created to help educators incorporate World Wide Web pages into teaching and learning, this collection of Web pages presents resources, materials, and techniques for using the Web. The first page focuses on tools for teaching and learning via the Web, providing pointers to sites containing the following: (1) course materials for both distance and…

  16. Young Children's Interpretations of Page Breaks in Contemporary Picture Storybooks

    ERIC Educational Resources Information Center

    Sipe, Lawrence R.; Brightman, Anne E.

    2009-01-01

    This article reports on a study of the responses of a second-grade class to the page breaks in contemporary picturebooks. In a picturebook, the text and accompanying illustrations are divided into a series of facing pages called openings, and the divisions between the openings are called page breaks or turns. Unlike a novel, in which the page…

  17. Digital Ethnography: Library Web Page Redesign among Digital Natives

    ERIC Educational Resources Information Center

    Klare, Diane; Hobbs, Kendall

    2011-01-01

    Presented with an opportunity to improve Wesleyan University's dated library home page, a team of librarians employed ethnographic techniques to explore how its users interacted with Wesleyan's current library home page and web pages in general. Based on the data that emerged, a group of library staff and members of the campus' information…

  18. Required Discussion Web Pages in Psychology Courses and Student Outcomes

    ERIC Educational Resources Information Center

    Pettijohn, Terry F., II; Pettijohn, Terry F.

    2007-01-01

    We conducted 2 studies that investigated student outcomes when using discussion Web pages in psychology classes. In Study 1, we assigned 213 students enrolled in Introduction to Psychology courses to either a mandatory or an optional Web page discussion condition. Students used the discussion Web page significantly more often and performed…

  19. Facebook's personal page modelling and simulation

    NASA Astrophysics Data System (ADS)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we try to define the utility of Facebook's Personal Page marketing method. This tool that Facebook provides is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The model has been developed for a social media marketing agent/company, is oriented to the Facebook platform, and was tested in real circumstances. The model was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are confirmed by the management of the company. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which means bringing in new customers, keeping the interest of existing customers and delivering traffic to its website.

  20. Native SDS-PAGE: high resolution electrophoretic separation of proteins with retention of native properties including bound metal ions.

    PubMed

    Nowakowski, Andrew B; Wobig, William J; Petering, David H

    2014-05-01

    Sodium dodecyl-sulfate polyacrylamide gel electrophoresis (SDS-PAGE) is commonly used to obtain high resolution separation of complex mixtures of proteins. The method initially denatures the proteins that will undergo electrophoresis. Although covalent structural features of resolved proteins can be determined with SDS-PAGE, functional properties are destroyed, including the presence of non-covalently bound metal ions. To address this shortcoming, blue-native (BN)-PAGE has been introduced. This method retains functional properties but at the cost of protein resolving power. To address the need for a high resolution PAGE method that results in the separation of native proteins, experiments tested the impact of changing the conditions of SDS-PAGE on the quality of protein separation and retention of functional properties. Removal of SDS and EDTA from the sample buffer together with omission of a heating step had no effect on the results of PAGE. Reduction of SDS in the running buffer from 0.1% to 0.0375% together with deletion of EDTA also made little impact on the quality of the electrophoretograms of fractions of pig kidney (LLC-PK1) cell proteome in comparison with that achieved with the SDS-PAGE method. The modified conditions were called native (N)SDS-PAGE. Retention of Zn(2+) bound in proteomic samples increased from 26 to 98% upon shifting from standard to modified conditions. Moreover, seven of nine model enzymes, including four Zn(2+) proteins that were subjected to NSDS-PAGE retained activity. All nine were active in BN-PAGE, whereas all underwent denaturation during SDS-PAGE. Metal retention after electrophoresis was additionally confirmed using laser ablation-inductively coupled plasma-mass spectrometry and in-gel Zn-protein staining using the fluorophore TSQ.

  1. Native SDS-PAGE: High Resolution Electrophoretic Separation of Proteins With Retention of Native Properties Including Bound Metal Ions

    PubMed Central

    Nowakowski, Andrew B.; Wobig, William J.; Petering, David H.

    2014-01-01

    Sodium dodecyl-sulfate polyacrylamide gel electrophoresis (SDS-PAGE) is commonly used to obtain high resolution separation of complex mixtures of proteins. The method initially denatures the proteins that will undergo electrophoresis. Although covalent structural features of resolved proteins can be determined with SDS-PAGE, functional properties are destroyed, including the presence of non-covalently bound metal ions. To address this shortcoming, blue-native (BN)-PAGE has been introduced. This method retains functional properties but at the cost of protein resolving power. To address the need for a high resolution PAGE method that results in the separation of native proteins, experiments tested the impact of changing the conditions of SDS-PAGE on the quality of protein separation and retention of functional properties. Removal of SDS and EDTA from the sample buffer together with omission of a heating step had no effect on the results of PAGE. Reduction of SDS in the running buffer from 0.1% to 0.0375% together with deletion of EDTA also made little impact on the quality of the electrophoretograms of fractions of pig kidney (LLC-PK1) cell proteome in comparison with that achieved with the SDS-PAGE method. The modified conditions were called native (N)SDS-PAGE. Retention of Zn2+ bound in proteomic samples increased from 26 to 98% upon shifting from standard to modified conditions. Moreover, seven of nine model enzymes, including four Zn2+ proteins that were subjected to NSDS-PAGE retained activity. All nine were active in BN-PAGE, whereas all underwent denaturation during SDS-PAGE. Metal retention after electrophoresis was additionally confirmed using laser ablation-inductively coupled plasma-mass spectrometry and in-gel Zn-protein staining using the fluorophore TSQ. PMID:24686569

  2. Design of a Web Page as a complement of educative innovation through MOODLE

    NASA Astrophysics Data System (ADS)

    Mendiola Ubillos, M. A.; Aguado Cortijo, Pedro L.

    2010-05-01

    In the context of using information technology to impart knowledge, and to establish the MOODLE system as a support and complementary tool to on-site educational methodology (b-learning), a Web page was designed for Agronomic and Food Industry Crops (Plantas de interés Agroalimentario) during the 2006-07 course. This web was inserted in the Technical University of Madrid (Universidad Politécnica de Madrid) computer system to give students a first contact with the contents of this subject. On this page the objectives and methodology, personal work planning, and the subject program plus the activities are shown. At another web site, the evaluation criteria and recommended bibliography are located. The objective of this web page has been to make the information needed in the learning process more transparent and accessible, and to present it in a more attractive frame. The page has been updated and modified in each academic course offered since its first implementation. In some cases we have added new specific links to increase its usefulness. At the end of each course a survey is given to the students taking this subject, asking which elements they would like to modify, delete or add to this web page. In this way the direct users give their point of view and help to improve the web page each course.

  3. How to secure your servers, code and data

    ScienceCinema

    None

    2016-07-12

    Oral presentation in English, slides in English. Advice and best practices regarding the security of your servers, code and data will be presented. We will also describe how the Computer Security Team can help you reduce the risks.

  4. Reviews of computing technology: Client-server technology

    SciTech Connect

    Johnson, S.M.

    1990-09-01

    One of the most frequently heard terms in the computer industry these days is "client-server." There is much misinformation available on the topic, and competitive pressures on software vendors have led to a great deal of hype with little in the way of supporting products. The purpose of this document is to explain what is meant by client-server applications, why the Advanced Technology and Architecture (ATA) section of the Information Resources Management (IRM) Department sees this emerging technology as key for computer applications during the next ten years, and what ATA sees as the existing standards and products available today. Because of the relative immaturity of existing client-server products, IRM is not yet guidelining any specific client-server products, except those that are components of guidelined data communications products or database management systems.

  6. Building a Library Web Server on a Budget.

    ERIC Educational Resources Information Center

    Orr, Giles

    1998-01-01

    Presents a method for libraries with limited budgets to create reliable Web servers with existing hardware and free software available via the Internet. Discusses staff, hardware and software requirements, and security; outlines the assembly process. (PEN)
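
    As a rough illustration of the zero-cost approach the article describes, a static page collection can be served with nothing more than a default scripting runtime. The sketch below uses only Python's standard library; the document root and port are placeholders, not the article's actual setup.

    # Minimal static web server using only the Python standard library,
    # in the spirit of a zero-cost library web server; paths are hypothetical.
    from functools import partial
    from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler

    DOCUMENT_ROOT = "/var/www/library"   # hypothetical directory of HTML pages
    PORT = 8080

    handler = partial(SimpleHTTPRequestHandler, directory=DOCUMENT_ROOT)

    if __name__ == "__main__":
        with ThreadingHTTPServer(("", PORT), handler) as httpd:
            print(f"Serving {DOCUMENT_ROOT} on port {PORT}")
            httpd.serve_forever()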

  7. CommServer: A Communications Manager For Remote Data Sites

    NASA Astrophysics Data System (ADS)

    Irving, K.; Kane, D. L.

    2012-12-01

    CommServer is a software system that manages making connections to remote data-gathering stations, providing a simple network interface to client applications. The client requests a connection to a site by name, and the server establishes the connection, providing a bidirectional channel between the client and the target site if successful. CommServer was developed to manage networks of FreeWave serial data radios with multiple data sites, repeaters, and network-accessed base stations, and has been in continuous operational use for several years. Support for Iridium modems using RUDICS will be added soon, and no changes to the application interface are anticipated. CommServer is implemented on Linux using programs written in bash shell, Python, Perl, AWK, under a set of conventions we refer to as ThinObject.
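
    The request/relay behaviour described above can be sketched in a few dozen lines. The code below is a hypothetical, simplified broker, not the CommServer implementation: the one-line text protocol, the site registry and the addresses are all invented for illustration.

    # Sketch of a CommServer-style connection broker: a client asks for a site
    # by name and, if a link can be opened, bytes are relayed in both directions.
    import socket
    import threading

    SITES = {"station-7": ("192.0.2.15", 4001)}   # hypothetical site registry

    def pump(src, dst):
        # Copy bytes until the source side closes its end of the connection.
        while (data := src.recv(4096)):
            dst.sendall(data)

    def handle(client):
        name = client.recv(256).decode().strip()      # e.g. "station-7"
        addr = SITES.get(name)
        if addr is None:
            client.sendall(b"ERR unknown site\n")
            client.close()
            return
        remote = socket.create_connection(addr, timeout=30)
        client.sendall(b"OK\n")
        threading.Thread(target=pump, args=(remote, client), daemon=True).start()
        pump(client, remote)                          # returns when the client closes
        remote.close()
        client.close()

    if __name__ == "__main__":
        with socket.create_server(("", 9000)) as srv:
            while True:
                conn, _ = srv.accept()
                threading.Thread(target=handle, args=(conn,), daemon=True).start()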

  8. How to secure your servers, code and data

    SciTech Connect

    2010-06-24

    Oral presentation in English, slides in English. Advice and best practices regarding the security of your servers, code and data will be presented. We will also describe how the Computer Security Team can help you reduce the risks.

  9. Video 2 of 4: Navigating the Live Access Server

    NASA Video Gallery

    Learn how to navigate the MY NASA DATA website and server using the NASA Explorer Schools lesson, Analyzing Solar Energy Graphs. The video also shows you how to access, filter and manipulate the da...

  10. The HydroServer Platform for Sharing Hydrologic Data

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Horsburgh, J. S.; Schreuders, K.; Maidment, D. R.; Zaslavsky, I.; Valentine, D. W.

    2010-12-01

    The CUAHSI Hydrologic Information System (HIS) is an internet based system that supports sharing of hydrologic data. HIS consists of databases connected using the Internet through Web services, as well as software for data discovery, access, and publication. The HIS system architecture is comprised of servers for publishing and sharing data, a centralized catalog to support cross server data discovery and a desktop client to access and analyze data. This paper focuses on HydroServer, the component developed for sharing and publishing space-time hydrologic datasets. A HydroServer is a computer server that contains a collection of databases, web services, tools, and software applications that allow data producers to store, publish, and manage the data from an experimental watershed or project site. HydroServer is designed to permit publication of data as part of a distributed national/international system, while still locally managing access to the data. We describe the HydroServer architecture and software stack, including tools for managing and publishing time series data for fixed point monitoring sites as well as spatially distributed, GIS datasets that describe a particular study area, watershed, or region. HydroServer adopts a standards based approach to data publication, relying on accepted and emerging standards for data storage and transfer. CUAHSI developed HydroServer code is free with community code development managed through the codeplex open source code repository and development system. There is some reliance on widely used commercial software for general purpose and standard data publication capability. The sharing of data in a common format is one way to stimulate interdisciplinary research and collaboration. It is anticipated that the growing, distributed network of HydroServers will facilitate cross-site comparisons and large scale studies that synthesize information from diverse settings, making the network as a whole greater than the sum of its

  11. Comparing Server Energy Use and Efficiency Using Small Sample Sizes

    SciTech Connect

    Coles, Henry C.; Qin, Yong; Price, Phillip N.

    2014-11-01

    This report documents a demonstration that compared the energy consumption and efficiency of a limited sample size of server-type IT equipment from different manufacturers by measuring power at the server power supply power cords. The results are specific to the equipment and methods used. However, it is hoped that those responsible for IT equipment selection can use the methods described to choose models that optimize energy use efficiency. The demonstration was conducted in a data center at Lawrence Berkeley National Laboratory in Berkeley, California. It was performed with five servers of similar mechanical and electronic specifications; three from Intel and one each from Dell and Supermicro. Server IT equipment is constructed using commodity components, server manufacturer-designed assemblies, and control systems. Server compute efficiency is constrained by the commodity component specifications and integration requirements. The design freedom, outside of the commodity component constraints, provides room for the manufacturer to offer a product with competitive efficiency that meets market needs at a compelling price. A goal of the demonstration was to compare and quantify the server efficiency for three different brands. The efficiency is defined as the average compute rate (computations per unit of time) divided by the average energy consumption rate. The research team used an industry standard benchmark software package to provide a repeatable software load to obtain the compute rate and provide a variety of power consumption levels. Energy use when the servers were in an idle state (not providing computing work) was also measured. At high server compute loads, all brands, using the same key components (processors and memory), had similar results; therefore, from these results, it could not be concluded that one brand is more efficient than the other brands. The test results show that the power consumption variability caused by the key components as a
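
    The efficiency metric quoted above is simple to compute once the benchmark operation count, elapsed time and measured energy are known. The sketch below shows the arithmetic with made-up numbers; the brand names and figures are illustrative, not measurements from the report.

    # Sketch of the efficiency metric described above: average compute rate
    # (benchmark operations per second) divided by average power draw.
    # The measurement numbers are invented for illustration.

    def efficiency(ops_completed, elapsed_s, energy_joules):
        compute_rate = ops_completed / elapsed_s          # operations per second
        avg_power = energy_joules / elapsed_s             # watts
        return compute_rate / avg_power                   # operations per joule

    servers = {
        "brand-A": efficiency(ops_completed=9.6e9, elapsed_s=600, energy_joules=150_000),
        "brand-B": efficiency(ops_completed=9.4e9, elapsed_s=600, energy_joules=162_000),
    }
    for name, eff in servers.items():
        print(f"{name}: {eff:,.0f} operations per joule")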

  12. Tiled WMS/KML Server V2

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2012-01-01

    This software is a higher-performance implementation of tiled WMS, with integral support for KML and time-varying data. This software is compliant with the Open Geospatial WMS standard, and supports KML natively as a WMS return type, including support for the time attribute. Regionated KML wrappers are generated that match the existing tiled WMS dataset. PNG and JPG formats are supported, and the software is implemented as an Apache 2.0 module that supports a threading execution model that is capable of supporting very high request rates. The module intercepts and responds to WMS requests that match certain patterns and returns the existing tiles. If a KML format that matches an existing pyramid and tile dataset is requested, regionated KML is generated and returned to the requesting application. In addition, KML requests that do not match the existing tile datasets generate a KML response that includes the corresponding JPG WMS request, effectively adding KML support to a backing WMS server.
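
    From a client's point of view, a tiled WMS request is an ordinary WMS GetMap call whose bounding box and size match a pre-generated tile, optionally with a TIME parameter. The sketch below builds such a request; the endpoint and layer name are hypothetical, and the parameter set follows the generic WMS 1.1.1 key-value convention rather than this specific module's configuration.

    # Sketch of a client-side tiled WMS request like those the module answers:
    # a fixed-size tile is fetched by bounding box, with an optional TIME attribute.
    from urllib.parse import urlencode
    from urllib.request import urlretrieve

    BASE = "https://example.org/twms.cgi"   # hypothetical tiled-WMS endpoint

    def tile_url(layer, bbox, time=None, size=512, fmt="image/jpeg"):
        params = {
            "SERVICE": "WMS", "REQUEST": "GetMap", "VERSION": "1.1.1",
            "LAYERS": layer, "SRS": "EPSG:4326",
            "BBOX": ",".join(str(v) for v in bbox),   # minx,miny,maxx,maxy
            "WIDTH": size, "HEIGHT": size, "FORMAT": fmt,
        }
        if time:
            params["TIME"] = time                     # e.g. "2012-01-01"
        return f"{BASE}?{urlencode(params)}"

    if __name__ == "__main__":
        url = tile_url("global_mosaic", bbox=(-90, 0, 0, 90), time="2012-01-01")
        urlretrieve(url, "tile.jpg")                  # fetch one pre-generated tile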

  13. Anonymization server system for DICOM images

    NASA Astrophysics Data System (ADS)

    Suzuki, H.; Amano, M.; Kubo, M.; Kawata, Y.; Niki, N.; Nishitani, H.

    2007-03-01

    We have developed an anonymization system for DICOM images. Consent from the patient is required to use DICOM images for research or education; even so, providing DICOM images to other facilities is not safe because they contain a great deal of personal data. Our system is a server that provides an anonymization service for DICOM images for users in the facility. The distinctive features of the system are its input interface, flexible anonymization policy, and automatic body part identification. With the first feature, the anonymization service can be used from existing DICOM workstations. With the second feature, the policy that best fits the personal data protection rules of each medical facility can be selected. With the third feature, the body parts included in the input image set can be identified even if the set lacks the body part tag in the DICOM header. We installed the system for the first time at a hospital in December 2005. Currently, the system is working in four other facilities. In this paper we describe the system and how it works.
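
    The core tag-scrubbing step of any such anonymization service can be illustrated with the pydicom library. The example below is a simplified sketch, not the policy engine described in the paper: the list of identifying tags and the placeholder value are assumptions, and a real deployment would follow the facility's full anonymization policy.

    # Illustrative tag-scrubbing step of the kind an anonymization service performs;
    # the tag list and placeholder are simplified examples.
    import pydicom

    IDENTIFYING_TAGS = ["PatientName", "PatientID", "PatientBirthDate",
                        "PatientAddress", "ReferringPhysicianName"]

    def anonymize(in_path, out_path, placeholder="ANONYMIZED"):
        ds = pydicom.dcmread(in_path)
        for tag in IDENTIFYING_TAGS:
            if tag in ds:
                ds.data_element(tag).value = placeholder
        ds.remove_private_tags()          # drop vendor-specific private elements
        ds.save_as(out_path)

    if __name__ == "__main__":
        anonymize("input.dcm", "anon.dcm")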

  14. Web Server Security on Open Source Environments

    NASA Astrophysics Data System (ADS)

    Gkoutzelis, Dimitrios X.; Sardis, Manolis S.

    Administering critical resources has never been more difficult than it is today. In a changing world of software innovation where major changes occur on a daily basis, it is crucial for webmasters and server administrators to shield their data against an unknown arsenal of attacks in the hands of their attackers. Until now this kind of defense was a privilege of the few; under-budgeted, low-cost solutions left the defender vulnerable to the rise of innovative attack methods. Luckily, the digital revolution of the past decade left its mark, changing the way we face security forever: open source infrastructure today covers all the prerequisites for a secure web environment in a way we could never have imagined fifteen years ago. Online security of large corporations, military and government bodies is more and more handled by open source applications, thus driving the technological trend of the 21st century in adopting open solutions to E-Commerce and privacy issues. This paper describes substantial security precautions for facing privacy and authentication issues in a totally open source web environment. Our goal is to state and face the best-known problems in data handling and consequently propose the most appealing techniques to face these challenges through an open solution.

  15. EarthServer: an Intercontinental Collaboration on Petascale Datacubes

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Rossi, A. P.

    2015-12-01

    With the unprecedented increase of orbital sensor, in-situ measurement, and simulation data there is a rich, yet not leveraged potential for getting insights from dissecting datasets and rejoining them with other datasets. Obviously, the goal is to allow users to "ask any question, any time", thereby enabling them to "build their own product on the go". One of the most influential initiatives in Big Geo Data is EarthServer, which has demonstrated new directions for flexible, scalable EO services based on innovative NewSQL technology. Researchers from Europe, the US and recently Australia have teamed up to rigorously materialize the concept of the datacube. Such a datacube may have spatial and temporal dimensions (such as a satellite image time series) and may unite an unlimited number of scenes. Independently from whatever efficient data structuring a server network may perform internally, users will always see just a few datacubes they can slice and dice. EarthServer has established client and server technology for such spatio-temporal datacubes. The underlying scalable array engine, rasdaman, enables direct interaction, including 3-D visualization, what-if scenarios, common EO data processing, and general analytics. Services exclusively rely on the open OGC "Big Geo Data" standards suite, the Web Coverage Service (WCS) including the Web Coverage Processing Service (WCPS). Conversely, EarthServer has significantly shaped and advanced the OGC Big Geo Data standards landscape based on the experience gained. Phase 1 of EarthServer has advanced scalable array database technology into 100+ TB services; in phase 2, Petabyte datacubes will be built in Europe and Australia to perform ad-hoc querying and merging. Standing between EarthServer phase 1 (from 2011 through 2014) and phase 2 (from 2015 through 2018), we present the main results and outline the impact on the international standards landscape; effectively, the Big Geo Data standards established through initiative of
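
    A flavour of the datacube interaction is given by the WCPS query language mentioned above: a short expression selects, slices and encodes a coverage on the server. The sketch below sends one such query over HTTP; the endpoint, coverage name and the exact key-value request binding are assumptions for illustration.

    # Hedged sketch of a datacube slice-and-encode request in the OGC WCPS
    # language; endpoint and coverage name are hypothetical.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    ENDPOINT = "https://example.org/rasdaman/ows"   # hypothetical WCS/WCPS endpoint

    # Slice a space-time datacube at one date and return the result as a PNG.
    query = 'for c in (S2_NDVI_CUBE) return encode(c[ansi("2015-06-01")], "png")'

    params = urlencode({"service": "WCS", "version": "2.0.1",
                        "request": "ProcessCoverages", "query": query})

    with urlopen(f"{ENDPOINT}?{params}") as resp, open("slice.png", "wb") as out:
        out.write(resp.read())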

  16. An Array Library for Microsoft SQL Server with Astrophysical Applications

    NASA Astrophysics Data System (ADS)

    Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.

    2012-09-01

    Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques that are beyond what is used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out-of-the-box commercial database systems perform very well in scientific applications from the perspective of data storage optimization, data retrieval, and memory management but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fixed-size arrays of all basic numeric data types can be created and manipulated efficiently. Also, the library is designed to be seamlessly integrated with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transformations, can be done on-the-fly, from SQL code, inside the database server process. We are currently testing the prototype with two different scientific data sets: The Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory

  17. Recent improvements in the NASA technical report server

    NASA Technical Reports Server (NTRS)

    Maa, Ming-Hokng; Nelson, Michael L.

    1995-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web (WWW) report distribution service, has been modified to allow parallel database queries, significantly decreasing user access time by an average factor of 2.3; access from clients behind firewalls and/or proxies which truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases; and compatibility with the Z39.50 protocol.
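
    The parallel-query idea is independent of the underlying databases: a search term is fanned out to several back ends at once and the answers are merged, rather than queried one after another. The sketch below illustrates that pattern generically; the database list and the per-database fetch function are hypothetical placeholders, not the NTRS implementation.

    # Generic illustration of fanning a query out to several report databases
    # in parallel and merging the answers.
    from concurrent.futures import ThreadPoolExecutor

    DATABASES = ["center-a", "center-b", "center-c", "center-d"]   # hypothetical backends

    def search_one(db, term):
        # Placeholder for a per-database query (e.g. an HTTP search request).
        return [f"{db}: report matching '{term}'"]

    def search_all(term):
        with ThreadPoolExecutor(max_workers=len(DATABASES)) as pool:
            result_lists = pool.map(lambda db: search_one(db, term), DATABASES)
        return [hit for hits in result_lists for hit in hits]

    if __name__ == "__main__":
        for hit in search_all("wind tunnel"):
            print(hit)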

  18. Lifting Events in RDF from Interactions with Annotated Web Pages

    NASA Astrophysics Data System (ADS)

    Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad

    In this paper we present a method and an implementation for creating and processing semantic events from interaction with Web pages, which opens possibilities to build event-driven applications for the (Semantic) Web. Events, simple or complex, are models for things that happen, e.g., when a user interacts with a Web page. Events are consumed in some meaningful way, e.g., for monitoring reasons or to trigger actions such as responses. In order for receiving parties to understand events, e.g., to comprehend what has led to an event, we propose a general event schema using RDFS. In this schema we cover the composition of complex events and event-to-event relationships. These events can then be used to route semantic information about an occurrence to different recipients, helping to make the Semantic Web active. Additionally, we present an architecture for detecting and composing events in Web clients. For the contents of events, we show how they are enriched with semantic information about the context in which they occurred. The paper is presented in conjunction with the use case of Semantic Advertising, which extends traditional clickstream analysis by introducing semantic short-term profiling, enabling discovery of the current interest of a Web user and therefore supporting advertisement providers in responding with more relevant advertisements.

  19. HotSpot Wizard: a web server for identification of hot spots in protein engineering.

    PubMed

    Pavelka, Antonin; Chovancova, Eva; Damborsky, Jiri

    2009-07-01

    HotSpot Wizard is a web server for automatic identification of 'hot spots' for engineering of substrate specificity, activity or enantioselectivity of enzymes and for annotation of protein structures. The web server implements the protein engineering protocol, which targets evolutionarily variable amino acid positions located in the active site or lining the access tunnels. The 'hot spots' for mutagenesis are selected through the integration of structural, functional and evolutionary information obtained from: (i) the databases RCSB PDB, UniProt, PDBSWS, Catalytic Site Atlas and nr NCBI and (ii) the tools CASTp, CAVER, BLAST, CD-HIT, MUSCLE and Rate4Site. The protein structure and e-mail address are the only obligatory inputs for the calculation. In the output, HotSpot Wizard lists annotated residues ordered by estimated mutability. The results of the analysis are mapped on the enzyme structure and visualized in the web browser using Jmol. The HotSpot Wizard server should be useful for protein engineers interested in exploring the structure of their favourite protein and for the design of mutations in site-directed mutagenesis and focused directed evolution experiments. HotSpot Wizard is available at http://loschmidt.chemi.muni.cz/hotspotwizard/.

  20. Visible Human Slice Web Server: a first assessment

    NASA Astrophysics Data System (ADS)

    Hersch, Roger D.; Gennart, Benoit A.; Figueiredo, Oscar; Mazzariol, Marc; Tarraga, Joaquin; Vetsch, S.; Messerli, Vincent; Welz, R.; Bidaut, Luc M.

    1999-12-01

    The Visible Human Slice Server started offering its slicing services at the end of June 1998. From that date until the end of May, more than 280,000 slices were extracted from the Visible Man by laymen interested in anatomy, by students and by specialists. The Slice Server is based on one Bi-Pentium PC and 16 disks. It is a scaled-down version of a powerful parallel server comprising 5 Bi-Pentium Pro PCs and 60 disks. The parallel server program was created thanks to a computer-aided parallelization framework, which takes over the task of creating a multi-threaded pipelined parallel program from a high-level parallel program description. On the full-blown architecture, the parallel program enables the extraction and resampling of up to 5 color slices per second. Extracting 5 slices/s requires accessing the disks and extracting subvolumes of the Visible Human at an aggregate throughput of 105 MB/s. The publicly accessible server enables the extraction of slices having any orientation. The slice position and orientation can either be specified for each slice separately or chosen interactively through the interface offered by a Java applet. In the very near future, the Web Slice Server will offer additional services, such as the possibility to extract ruled surfaces and to extract animations incorporating slices perpendicular to a user-defined trajectory.
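
    Extracting a slice of arbitrary orientation amounts to resampling the volume along a user-defined plane. The sketch below shows that resampling step with NumPy and SciPy on a small random volume; it is a single-machine illustration of the geometry only, not the parallel, disk-striped pipeline the server uses.

    # Sketch of extracting an arbitrarily oriented slice from a 3D volume by
    # resampling along a user-defined plane; the volume is a random stand-in.
    import numpy as np
    from scipy.ndimage import map_coordinates

    def oblique_slice(volume, center, u, v, size=256, spacing=1.0):
        """Sample a size x size slice centered at `center`, spanned by vectors u and v."""
        u = np.asarray(u, float) / np.linalg.norm(u)
        v = np.asarray(v, float) / np.linalg.norm(v)
        steps = (np.arange(size) - size / 2) * spacing
        rows, cols = np.meshgrid(steps, steps, indexing="ij")
        # Coordinates of every slice pixel in volume (z, y, x) index space.
        coords = (np.asarray(center, float)[:, None, None]
                  + u[:, None, None] * rows + v[:, None, None] * cols)
        return map_coordinates(volume, coords, order=1, mode="nearest")

    if __name__ == "__main__":
        vol = np.random.rand(128, 128, 128)
        # 45-degree oblique plane through the volume center.
        sl = oblique_slice(vol, center=(64, 64, 64), u=(0, 1, 0), v=(1, 0, 1))
        print(sl.shape)   # (256, 256)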

  1. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    PubMed Central

    2011-01-01

    Background: Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. Results: To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) submitting specimen requests across collaborating organizations; (ii) graphically defining new experimental data types, metadata and wizards for data collection; (iii) transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database; (iv) securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays; (v) interacting dynamically with external data sources; (vi) tracking study participants and cohorts over time; (vii) developing custom interfaces using client libraries; (viii) authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350

  2. SurveyWiz and factorWiz: JavaScript Web pages that make HTML forms for research on the Internet.

    PubMed

    Birnbaum, M H

    2000-05-01

    SurveyWiz and factorWiz are Web pages that act as wizards to create HTML forms that enable one to collect data via the Web. SurveyWiz allows the user to enter survey questions or personality test items with a mixture of text boxes and scales of radio buttons. One can add demographic questions of age, sex, education, and nationality with the push of a button. FactorWiz creates the HTML for within-subjects, two-factor designs as large as 9 x 9, or higher order factorial designs up to 81 cells. The user enters levels of the row and column factors, which can be text, images, or other multimedia. FactorWiz generates the stimulus combinations, randomizes their order, and creates the page. In both programs HTML is displayed in a window, and the user copies it to a text editor to save it. When uploaded to a Web server and supported by a CGI script, the created Web pages allow data to be collected, coded, and saved on the server. These programs are intended to assist researchers and students in quickly creating studies that can be administered via the Web.
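
    The heart of such a wizard is ordinary HTML generation: each survey item becomes a labelled row of radio buttons inside a form that posts to a server-side script. The sketch below shows one way to emit such a form; the item texts and the action URL are placeholders, and the markup is deliberately simpler than what SurveyWiz produces.

    # Sketch of the kind of HTML a survey wizard emits: each item becomes a row
    # of radio buttons posting to a server-side script (action URL is hypothetical).

    def radio_scale(name, points=7):
        return "".join(
            f'<label><input type="radio" name="{name}" value="{i}"> {i}</label> '
            for i in range(1, points + 1))

    def survey_form(items, action="/cgi-bin/save_survey"):
        rows = "\n".join(
            f"<p>{text}<br>{radio_scale(f'q{i}')}</p>"
            for i, text in enumerate(items, start=1))
        return (f'<form method="post" action="{action}">\n{rows}\n'
                '<input type="submit" value="Submit"></form>')

    if __name__ == "__main__":
        print(survey_form(["I enjoy working with computers.",
                           "Web studies are easy to administer."]))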

  3. Miniaturized Airborne Imaging Central Server System

    NASA Technical Reports Server (NTRS)

    Sun, Xiuhong

    2011-01-01

    In recent years, some remote-sensing applications require advanced airborne multi-sensor systems to provide high performance reflective and emissive spectral imaging measurement rapidly over large areas. The key challenge is associated with a black box back-end system that operates a suite of cutting-edge imaging sensors to simultaneously collect high-throughput reflective and emissive spectral imaging data with precision georeference. This back-end system needs to be portable, easy-to-use, and reliable with advanced onboard processing. The innovation of the black box back-end is a miniaturized airborne imaging central server system (MAICSS). MAICSS integrates a complex embedded system of systems with dedicated power and signal electronic circuits inside to serve a suite of configurable cutting-edge electro-optical (EO), long-wave infrared (LWIR), and medium-wave infrared (MWIR) cameras, a hyperspectral imaging scanner, and a GPS and inertial measurement unit (IMU) for atmospheric and surface remote sensing. Its compatible sensor packages include NASA's 1,024 × 1,024 pixel LWIR quantum well infrared photodetector (QWIP) imager; a 60.5 megapixel BuckEye EO camera; and a fast (e.g., 200+ scanlines/s) and wide swath-width (e.g., 1,920+ pixels) CCD/InGaAs imager-based visible/near infrared reflectance (VNIR) and shortwave infrared (SWIR) imaging spectrometer. MAICSS records continuous precision georeferenced and time-tagged multisensor throughputs to mass storage devices at a high aggregate rate, typically 60 MB/s for its LWIR/EO payload. MAICSS is a complete stand-alone imaging server instrument with an easy-to-use software package for either autonomous data collection or interactive airborne operation. Advanced multisensor data acquisition and onboard processing software features have been implemented for MAICSS. With the onboard processing for real time image development, correction, histogram-equalization, compression, georeference, and

  4. ConoServer: updated content, knowledge, and discovery tools in the conopeptide database.

    PubMed

    Kaas, Quentin; Yu, Rilei; Jin, Ai-Hua; Dutertre, Sébastien; Craik, David J

    2012-01-01

    ConoServer (http://www.conoserver.org) is a database specializing in the sequences and structures of conopeptides, which are toxins expressed by marine cone snails. Cone snails are carnivorous gastropods, which hunt their prey using a cocktail of toxins that potently subvert nervous system function. The ability of these toxins to specifically target receptors, channels and transporters of the nervous system has attracted considerable interest for their use in physiological research and as drug leads. Since the founding publication on ConoServer in 2008, the number of entries in the database has nearly doubled, the interface has been redesigned and new annotations have been added, including a more detailed description of cone snail species, biological activity measurements and information regarding the identification of each sequence. Automatically updated statistics on classification schemes, three-dimensional structures, conopeptide-bearing species and endoplasmic reticulum signal sequence conservation trends provide a convenient overview of current knowledge on conopeptides. Transcriptomics and proteomics have begun generating massive numbers of new conopeptide sequences, and two dedicated tools have been recently implemented in ConoServer to standardize the analysis of conopeptide precursor sequences and to help in the identification by mass spectrometry of toxins whose sequences were predicted at the nucleic acid level.

  5. MO/DSD online information server and global information repository access

    NASA Technical Reports Server (NTRS)

    Nguyen, Diem; Ghaffarian, Kam; Hogie, Keith; Mackey, William

    1994-01-01

    Often in the past, standards and new technology information have been available only in hardcopy form, with reproduction and mailing costs proving rather significant. In light of NASA's current budget constraints and in the interest of efficient communications, the Mission Operations and Data Systems Directorate (MO&DSD) New Technology and Data Standards Office recognizes the need for an online information server (OLIS). This server would allow: (1) dissemination of standards and new technology information throughout the Directorate more quickly and economically; (2) online browsing and retrieval of documents that have been published for and by MO&DSD; and (3) searching for current and past study activities on related topics within NASA before issuing a task. This paper explores a variety of available information servers and searching tools, their current capabilities and limitations, and the application of these tools to MO&DSD. Most importantly, the discussion focuses on the way this concept could be easily applied toward improving dissemination of standards and new technologies and improving documentation processes.

  6. ConoServer: updated content, knowledge, and discovery tools in the conopeptide database

    PubMed Central

    Kaas, Quentin; Yu, Rilei; Jin, Ai-Hua; Dutertre, Sébastien; Craik, David J.

    2012-01-01

    ConoServer (http://www.conoserver.org) is a database specializing in the sequences and structures of conopeptides, which are toxins expressed by marine cone snails. Cone snails are carnivorous gastropods, which hunt their prey using a cocktail of toxins that potently subvert nervous system function. The ability of these toxins to specifically target receptors, channels and transporters of the nervous system has attracted considerable interest for their use in physiological research and as drug leads. Since the founding publication on ConoServer in 2008, the number of entries in the database has nearly doubled, the interface has been redesigned and new annotations have been added, including a more detailed description of cone snail species, biological activity measurements and information regarding the identification of each sequence. Automatically updated statistics on classification schemes, three-dimensional structures, conopeptide-bearing species and endoplasmic reticulum signal sequence conservation trends provide a convenient overview of current knowledge on conopeptides. Transcriptomics and proteomics have begun generating massive numbers of new conopeptide sequences, and two dedicated tools have been recently implemented in ConoServer to standardize the analysis of conopeptide precursor sequences and to help in the identification by mass spectrometry of toxins whose sequences were predicted at the nucleic acid level. PMID:22058133

  7. Japan Data Exchange Network JDXnet and Cloud-type Data Relay Server for Earthquake Observation Data

    NASA Astrophysics Data System (ADS)

    Takano, K.; Urabe, T.; Tsuruoka, H.; Nakagawa, S.

    2015-12-01

    In Japan, high-sensitivity seismic observation and broad-band seismic observation are carried out by several organizations such as the Japan Meteorological Agency (JMA), the National Research Institute for Earth Science and Disaster Prevention (NIED), nine national universities, and the Japan Agency for Marine-Earth Science and Technology (JAMSTEC). The total number of observation stations is about 1400. The total volume of seismic waveform data collected from all these observation stations is about 1 MByte per second (about 8 to 10 Mbps) using the WIN system (Urabe 1991). JDXnet is the Japan Data eXchange network for earthquake observation data. JDXnet was started in 2007 through the cooperation of researchers from each organization. All the seismic waveform data are available to all organizations in real time. The core of JDXnet is broadcast-type real-time data exchange using the nationwide L2-VPN service offered by JGN-X of NICT and SINET4 of NII. Before the Tohoku earthquake, the nine national universities collected seismic data at their own data centers and then exchanged them with other universities and institutions through JDXnet. In this arrangement, however, if a university's center stopped, none of that university's data could be used even though some of its observation stations were still alive. Because of this problem, we have prepared a data relay server in the data center of SINET4, i.e., the cloud center. This data relay server collects data directly from the observation stations of the universities and delivers the data to all universities and institutions through JDXnet. By using the relay server in the cloud center, even if some universities are affected by a large disaster, the data from stations that are still operating are not lost. If researchers set up seismometers and send data to the relay server, the data become available to all researchers. This mechanism promotes joint use of the seismometers and joint research activities among researchers nationwide.

  8. Server-side Filtering and Aggregation within a Distributed Environment

    NASA Astrophysics Data System (ADS)

    Currey, J. C.; Bartle, A.

    2015-12-01

    Intercalibration, validation, and data mining use cases require more efficient access to the massive volumes of observation data distributed across multiple agency data centers. The traditional paradigm of downloading large volumes of data to a centralized server or desktop computer for analysis is no longer viable. More analysis should be performed within the host data centers using server-side functions. Many comparative analysis tasks require far less than 1% of the available observation data. The Multi-Instrument Intercalibration (MIIC) Framework provides web services to find, match, filter, and aggregate multi-instrument observation data. Matching measurements from separate spacecraft in time, location, wavelength, and viewing geometry is a difficult task, especially when data are distributed across multiple agency data centers. Event prediction services identify near-coincident measurements with matched viewing geometries near orbit crossings using complex orbit propagation and spherical geometry calculations. The number and duration of event opportunities depend on orbit inclinations, altitude differences, and requested viewing conditions (e.g., day/night). Event observation information is passed to remote server-side functions to retrieve matched data. Data may be gridded, spatially convolved onto instantaneous fields of view, or spectrally resampled or convolved. Narrowband instruments are routinely compared to hyperspectral instruments such as AIRS and CrIS using relative spectral response (RSR) functions. Spectral convolution within server-side functions significantly reduces the amount of hyperspectral data needed by the client. This combination of intelligent selection and server-side processing significantly reduces network traffic and the data to process on local servers. OPeNDAP is a mature networking middleware already deployed at many of the Earth science data centers. Custom OPeNDAP server-side functions that provide filtering, histogram analysis (1D
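
    The spectral convolution mentioned above reduces a hyperspectral spectrum to a single narrowband value by weighting it with the narrowband instrument's relative spectral response. The sketch below shows the integral with synthetic data; the Gaussian response and the spectrum are stand-ins, not AIRS or CrIS values.

    # Sketch of convolving a hyperspectral radiance spectrum with a narrowband
    # instrument's relative spectral response (RSR) to simulate its measurement.
    import numpy as np

    def band_radiance(wavelength_um, radiance, rsr):
        """Band-averaged radiance: integral of L*RSR d(lambda) / integral of RSR d(lambda)."""
        return np.trapz(radiance * rsr, wavelength_um) / np.trapz(rsr, wavelength_um)

    wl = np.linspace(10.0, 13.0, 301)                      # wavelengths in micrometres
    spectrum = 8.0 + 0.5 * np.sin(4.0 * wl)                # synthetic hyperspectral radiance
    rsr = np.exp(-0.5 * ((wl - 11.5) / 0.3) ** 2)          # synthetic Gaussian band response

    print(f"simulated narrowband radiance: {band_radiance(wl, spectrum, rsr):.3f}")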

  9. PlanetServer/EarthServer: Big Data analytics in Planetary Science

    NASA Astrophysics Data System (ADS)

    Pio Rossi, Angelo; Oosthoek, Jelmer; Baumann, Peter; Beccati, Alan; Cantini, Federico; Misev, Dimitar; Orosei, Roberto; Flahaut, Jessica; Campalani, Piero; Unnithan, Vikram

    2014-05-01

    Planetary data are freely available from PDS/PSA archives and the like (e.g., Heather et al., 2013). Their exploitation by the community is somewhat limited by the variable availability of calibrated/higher-level datasets. An additional complexity of these multi-experiment, multi-mission datasets is related to the heterogeneity of the data themselves, rather than their volume. Orbital data, so far, are best suited for inclusion in array databases (Baumann et al., 1994). Most lander- or rover-based remote sensing experiments (and possibly in-situ ones as well) are suitable for similar approaches, although the complexity of coordinate reference systems (CRS) is higher in the latter case. PlanetServer, the Planetary Service of the EC FP7 e-infrastructure project EarthServer (http://earthserver.eu), is a state-of-the-art online data exploration and analysis system based on the Open Geospatial Consortium (OGC) standards for Mars orbital data. It provides access to topographic, panchromatic, multispectral and hyperspectral calibrated data. While its core focus has been on hyperspectral data analysis through the OGC Web Coverage Processing Service (Oosthoek et al., 2013; Rossi et al., 2013), the Service has progressively expanded to host sounding radar data as well (Cantini et al., this volume). Additionally, both single-swath and mosaicked imagery and topographic data are being added to the Service, deriving from the HRSC experiment (e.g., Jaumann et al., 2007; Gwinner et al., 2009). The current Mars-centric focus can be extended to other planetary bodies, and most components are general-purpose ones, making possible its application to the Moon, Mercury or other bodies. The Planetary Service of EarthServer is accessible at http://www.planetserver.eu References: Baumann, P. (1994) VLDB J. 4 (3), 401-444, Special Issue on Spatial Database Systems. Cantini, F. et al. (2014) Geophys. Res. Abs., Vol. 16, #EGU2014-3784, this volume. Heather, D., et al. (2013) EuroPlanet Sci. Congr. #EPSC2013-626. Gwinner, K

  10. Recognition of pornographic web pages by classifying texts and images.

    PubMed

    Hu, Weiming; Wu, Ou; Chen, Zhouyao; Fu, Zhouyu; Maybank, Steve

    2007-06-01

    With the rapid development of the World Wide Web, people benefit more and more from the sharing of information. However, Web pages with obscene, harmful, or illegal content can be easily accessed. It is important to recognize such unsuitable, offensive, or pornographic Web pages. In this paper, a novel framework for recognizing pornographic Web pages is described. A C4.5 decision tree is used to divide Web pages, according to content representations, into continuous text pages, discrete text pages, and image pages. These three categories of Web pages are handled, respectively, by a continuous text classifier, a discrete text classifier, and an algorithm that fuses the results from the image classifier and the discrete text classifier. In the continuous text classifier, statistical and semantic features are used to recognize pornographic texts. In the discrete text classifier, the naive Bayes rule is used to calculate the probability that a discrete text is pornographic. In the image classifier, the object's contour-based features are extracted to recognize pornographic images. In the text and image fusion algorithm, the Bayes theory is used to combine the recognition results from images and texts. Experimental results demonstrate that the continuous text classifier outperforms the traditional keyword-statistics-based classifier, the contour-based image classifier outperforms the traditional skin-region-based image classifier, the results obtained by our fusion algorithm outperform those by either of the individual classifiers, and our framework can be adapted to different categories of Web pages. PMID:17431300
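
    The discrete-text branch described above relies on the naive Bayes rule. The sketch below shows the general technique with a toy training set; the examples and features are invented solely to make the code run and bear no relation to the paper's corpus or feature design.

    # Minimal naive Bayes text-classification sketch of the sort used for the
    # discrete-text branch; the tiny training set is invented for illustration.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    train_texts = ["family recipes and cooking tips",
                   "school sports club schedule",
                   "explicit adult content keywords here",
                   "adult explicit material listing"]
    train_labels = [0, 0, 1, 1]          # 0 = benign, 1 = pornographic

    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(train_texts, train_labels)

    print(model.predict_proba(["club cooking schedule"])[0])   # [P(benign), P(porn)]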

  11. Perceptual metrics and visualization tools for evaluation of page uniformity

    NASA Astrophysics Data System (ADS)

    Nguyen, Minh Q.; Jessome, Renee; Astling, Steve; Maggard, Eric; Nelson, Terry; Shaw, Mark; Allebach, Jan P.

    2014-01-01

    Uniformity is one of the issues of most critical concern for laser electrophotographic (EP) printers. Typically, full coverage constant-tint test pages are printed to assess uniformity. Exemplary nonuniformity defects include mottle, grain, pinholes, and "finger prints". It is a real challenge to make an overall Print Quality (PQ) assessment due to the large coverage of a letter-size, constant-tint printed test page and the variety of possible nonuniformity defects. In this paper, we propose a novel method that uses a block-based technique to analyze the page both visually and metrically. We use a grid of 150 pixels × 150 pixels ( ¼ inch × ¼ inch at 600-dpi resolution) square blocks throughout the scanned page. For each block, we examine two aspects: behavior of its pixels within the block (metrics of graininess) and behavior of the blocks within the printed page (metrics of nonuniformity). Both ΔE (CIE 1976) and the L* lightness channel are employed. For an input scanned page, we create eight visual outputs, each displaying a different aspect of nonuniformity. To apply machine learning, we train scanned pages of different 100% solid colors separately with the support vector machine (SVM) algorithm. We use two metrics as features for the SVM: average dispersion of page lightness and standard deviation in dispersion of page lightness. Our results show that we can predict, with 83% to 90% accuracy, the assignment by a print quality expert of one of two grades of uniformity in the print.
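
    The two SVM features named above (average dispersion of page lightness and the standard deviation of that dispersion) can be computed directly from the scanned L* channel. The sketch below does so on a synthetic "scan"; real use would start from an actual 600-dpi scan converted to L*.

    # Sketch of the block-based uniformity features: the L* channel is cut into
    # 150 x 150 pixel blocks and each block's lightness dispersion is computed.
    import numpy as np

    BLOCK = 150   # pixels per block side (1/4 inch at 600 dpi)

    def uniformity_features(lightness):
        h, w = (d - d % BLOCK for d in lightness.shape)        # crop to whole blocks
        blocks = (lightness[:h, :w]
                  .reshape(h // BLOCK, BLOCK, w // BLOCK, BLOCK)
                  .swapaxes(1, 2))                              # (rows, cols, BLOCK, BLOCK)
        dispersion = blocks.std(axis=(2, 3))                    # per-block lightness spread
        return dispersion.mean(), dispersion.std()              # the two SVM features

    if __name__ == "__main__":
        page = np.random.normal(loc=50.0, scale=1.5, size=(6600, 5100))  # fake L* scan
        avg_disp, std_disp = uniformity_features(page)
        print(f"average dispersion {avg_disp:.2f}, std of dispersion {std_disp:.2f}")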

  12. Filtering False Positives Based on Server-Side Behaviors

    NASA Astrophysics Data System (ADS)

    Shimamura, Makoto; Hanaoka, Miyuki; Kono, Kenji

    Reducing the rate of false positives is of vital importance in enhancing the usefulness of signature-based network intrusion detection systems (NIDSs). To reduce the number of false positives, a network administrator must thoroughly investigate a lengthy list of signatures and carefully disable the ones that detect attacks that are not harmful to the administrator's environment. This is a daunting task; if some signatures are disabled by mistake, the NIDS fails to detect critical remote attacks. We designed a NIDS, TrueAlarm, to reduce the rate of false positives. Conventional NIDSs alert administrators that a malicious message has been detected, regardless of whether the message actually attempts to compromise the protected server. In contrast, TrueAlarm delays the alert until it has confirmed that an attempt has been made. The TrueAlarm NIDS cooperates with a server-side monitor that observes the protected server's behavior. TrueAlarm only alerts administrators when a server-side monitor has detected deviant server behavior that must have been caused by a message detected by a NIDS. Our experimental results revealed that TrueAlarm reduces the rate of false positives. Using actual network traffic collected over 14 days, TrueAlarm produced 46 false positives, while Snort, a conventional NIDS, produced 818.
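
    The alert-delaying idea can be reduced to a time-window correlation between signature alerts and monitor events. The sketch below shows that correlation step only; the records, timestamps and the five-second window are illustrative, not TrueAlarm's actual mechanism.

    # Sketch of delaying alerts until a server-side monitor confirms them:
    # a signature alert is surfaced only if deviant behaviour follows shortly after.
    from datetime import datetime, timedelta

    WINDOW = timedelta(seconds=5)

    nids_alerts = [{"time": datetime(2024, 1, 1, 12, 0, 1), "sig": "sid:1337", "src": "10.0.0.7"}]
    monitor_events = [{"time": datetime(2024, 1, 1, 12, 0, 3), "detail": "unexpected child process"}]

    def confirmed_alerts(alerts, events, window=WINDOW):
        for alert in alerts:
            for event in events:
                if timedelta(0) <= event["time"] - alert["time"] <= window:
                    yield {**alert, "confirmed_by": event["detail"]}

    for alert in confirmed_alerts(nids_alerts, monitor_events):
        print("ALERT", alert)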

  13. Advancing the Power and Utility of Server-Side Aggregation

    NASA Technical Reports Server (NTRS)

    Fulker, Dave; Gallagher, James

    2016-01-01

    During the upcoming Summer 2016 meeting of the ESIP Federation (July 19-22), OPeNDAP will hold a Developers and Users Workshop. While a broad set of topics will be covered, a key focus is capitalizing on recent EOSDIS-sponsored advances in Hyrax, OPeNDAP's own software for server-side realization of the DAP2 and DAP4 protocols. These Hyrax advances are as important to data users as to data providers, and the workshop will include hands-on experiences of value to both. Specifically, a balanced set of presentations and hands-on tutorials will address advances in (1) server installation, (2) server configuration, (3) Hyrax aggregation capabilities, (4) support for data access from clients that are HTTP-based, JSON-based or OGC-compliant (especially WCS and WMS), (5) support for DAP4, (6) use and extension of server-side computational capabilities, and (7) several performance-affecting matters. Topics 2 through 7 will be relevant to data consumers, data providers and, notably, due to the open-source nature of all OPeNDAP software, to developers wishing to extend Hyrax, to build compatible clients and servers, and/or to employ Hyrax as middleware that enables interoperability across a variety of end-user and source-data contexts. A session for contributed talks will elaborate the topics listed above and embrace additional ones.

  14. RosettaAntibody: antibody variable region homology modeling server.

    PubMed

    Sircar, Aroop; Kim, Eric T; Gray, Jeffrey J

    2009-07-01

    The RosettaAntibody server (http://antibody.graylab.jhu.edu) predicts the structure of an antibody variable region given the amino-acid sequences of the respective light and heavy chains. In an initial stage, the server identifies and displays the most sequence homologous template structures for the light and heavy framework regions and each of the complementarity determining region (CDR) loops. Subsequently, the most homologous templates are assembled into a side-chain optimized crude model, and the server returns a picture and coordinate file. For users requesting a high-resolution model, the server executes the full RosettaAntibody protocol which additionally models the hyper-variable CDR H3 loop. The high-resolution protocol also relieves steric clashes by optimizing the CDR backbone torsion angles and by simultaneously perturbing the relative orientation of the light and heavy chains. RosettaAntibody generates 2000 independent structures, and the server returns pictures, coordinate files, and detailed scoring information for the 10 top-scoring models. The 10 models enable users to use rational judgment in choosing the best model or to use the set as an ensemble for further studies such as docking. The high-resolution models generated by RosettaAntibody have been used for the successful prediction of antibody-antigen complex structures.

  15. APPRIS WebServer and WebServices

    PubMed Central

    Rodriguez, Jose Manuel; Carro, Angel; Valencia, Alfonso; Tress, Michael L.

    2015-01-01

    This paper introduces the APPRIS WebServer (http://appris.bioinfo.cnio.es) and WebServices (http://apprisws.bioinfo.cnio.es). Both the web server and the web services are based around the APPRIS Database, a database that presently houses annotations of splice isoforms for five different vertebrate genomes. The APPRIS WebServer and WebServices provide access to the computational methods implemented in the APPRIS Database, while the APPRIS WebServices also allow retrieval of the annotations. The APPRIS WebServer and WebServices annotate splice isoforms with protein structural and functional features, and with data from cross-species alignments. In addition they can use the annotations of structure, function and conservation to select a single reference isoform for each protein-coding gene (the principal protein isoform). APPRIS principal isoforms have been shown to agree overwhelmingly with the main protein isoform detected in proteomics experiments. The APPRIS WebServer allows for the annotation of splice isoforms for individual genes, and provides a range of visual representations and tools to allow researchers to identify the likely effect of splicing events. The APPRIS WebServices permit users to generate annotations automatically in high throughput mode and to interrogate the annotations in the APPRIS Database. The APPRIS WebServices have been implemented using REST architecture to be flexible, modular and automatic. PMID:25990727

  16. Performance measurements of single server fuzzy queues with unreliable server using left and right method

    NASA Astrophysics Data System (ADS)

    Mueen, Zeina; Ramli, Razamin; Zaibidi, Nerda Zura

    2015-12-01

    There are a number of real-life systems that can be described as queuing systems, and this paper presents a queuing system model applied to a manufacturing system example. The queuing model considered is depicted in a fuzzy environment with retrial queues and an unreliable server. The stability condition of this model is investigated, and the performance measures are obtained by adopting the left and right method. The new approach adopted in this study merges the existing α-cut interval and nonlinear programming techniques, and a numerical example is considered to explain the methodology of the technique. The numerical example demonstrates the flexibility of the method graphically, giving the exact mean number of customers in the system and the expected waiting times.

  17. College of DuPage Information Technology Plan, Fiscal Year 1994-95.

    ERIC Educational Resources Information Center

    College of DuPage, Glen Ellyn, IL.

    Building upon four previous planning documents for computing at College of DuPage in Illinois, this plan for fiscal year 1995 (FY95) provides a starting point for future plans to address all activities that relate to the use of information technology on campus. The FY95 "Information Technology Plan" is divided into six sections, each providing an…

  18. Mathematical defense method of networked servers with controlled remote backups

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2006-05-01

    The networked server defense model is focused on reliability and availability in security respects. The (remote) backup servers are hooked up by VPN (Virtual Private Network) with a high-speed optical network and replace broken main servers immediately. The networked servers can be represented as "machines", and the system then deals with a main unreliable machine, spare machines, and auxiliary spare machines. During vacation periods, when the system performs mandatory routine maintenance, auxiliary machines are used for back-ups; the information on the system is naturally delayed. An analog of the N-policy restricts the usage of auxiliary machines to some reasonable quantity. The results are demonstrated in the network architecture by using stochastic optimization techniques.

  19. The State of Energy and Performance Benchmarking for Enterprise Servers

    NASA Astrophysics Data System (ADS)

    Fanara, Andrew; Haines, Evan; Howard, Arthur

    To address the server industry’s marketing focus on performance, benchmarking organizations have played a pivotal role in developing techniques to determine the maximum achievable performance level of a system. Generally missing has been an assessment of energy use to achieve that performance. The connection between performance and energy consumption is becoming necessary information for designers and operators as they grapple with power constraints in the data center. While industry and policy makers continue to strategize about a universal metric to holistically measure IT equipment efficiency, existing server benchmarks for various workloads could provide an interim proxy to assess the relative energy efficiency of general servers. This paper discusses ideal characteristics a future energy-performance benchmark might contain, suggests ways in which current benchmarks might be adapted to provide a transitional step to this end, and notes the need for multiple workloads to provide a holistic proxy for a universal metric.
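    A minimal sketch of how such an energy-performance proxy might be assembled from existing benchmark output: throughput is recorded at several load levels together with average power draw, and an overall work-per-joule figure is reported. The load levels, numbers, and aggregation rule below are illustrative assumptions, not a standardised metric.

      # Toy energy-efficiency summary: throughput and power at several load levels,
      # aggregated into an overall operations-per-joule figure. Example data only.

      measurements = [
          # (target load fraction, transactions per second, average watts)
          (1.0, 9500.0, 310.0),
          (0.7, 6700.0, 255.0),
          (0.4, 3900.0, 205.0),
          (0.1, 1000.0, 165.0),
      ]

      def overall_efficiency(samples):
          """Sum of throughput over sum of power across all load levels (ops per joule)."""
          total_ops = sum(tps for _, tps, _ in samples)
          total_watts = sum(watts for _, _, watts in samples)
          return total_ops / total_watts

      for load, tps, watts in measurements:
          print(f"load {load:>4.0%}: {tps / watts:6.1f} ops/J")
      print(f"overall: {overall_efficiency(measurements):.1f} ops/J")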

  20. LassoProt: server to analyze biopolymers with lassos

    PubMed Central

    Dabrowski-Tumanski, Pawel; Niemyska, Wanda; Pasznik, Pawel; Sulkowska, Joanna I.

    2016-01-01

    The LassoProt server, http://lassoprot.cent.uw.edu.pl/, enables analysis of biopolymers with entangled configurations called lassos. The server offers various ways of visualizing lasso configurations, as well as their time trajectories, with all the results and plots downloadable. Broad spectrum of applications makes LassoProt a useful tool for biologists, biophysicists, chemists, polymer physicists and mathematicians. The server and our methods have been validated on the whole PDB, and the results constitute the database of proteins with complex lassos, supported with basic biological data. This database can serve as a source of information about protein geometry and entanglement-function correlations, as a reference set in protein modeling, and for many other purposes. PMID:27131383

  1. LassoProt: server to analyze biopolymers with lassos.

    PubMed

    Dabrowski-Tumanski, Pawel; Niemyska, Wanda; Pasznik, Pawel; Sulkowska, Joanna I

    2016-07-01

    The LassoProt server, http://lassoprot.cent.uw.edu.pl/, enables analysis of biopolymers with entangled configurations called lassos. The server offers various ways of visualizing lasso configurations, as well as their time trajectories, with all the results and plots downloadable. Broad spectrum of applications makes LassoProt a useful tool for biologists, biophysicists, chemists, polymer physicists and mathematicians. The server and our methods have been validated on the whole PDB, and the results constitute the database of proteins with complex lassos, supported with basic biological data. This database can serve as a source of information about protein geometry and entanglement-function correlations, as a reference set in protein modeling, and for many other purposes.

  2. 48 CFR 1852.215-81 - Proposal page limitations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01: Proposal page limitations. Section 1852.215-81, National Aeronautics and Space... As prescribed in 1815.209-70(d), insert the following...

  3. The Library as Information Provider: The Home Page.

    ERIC Educational Resources Information Center

    Clyde, Laurel A.

    1996-01-01

    Discusses ways in which libraries are using the World Wide Web to provide information via a home page, based on information from a survey in Iceland as well as a larger study that conducted content analyses of home pages of public and school libraries in 13 countries. (Author/LRW)

  4. Toward a User-Centered Academic Library Home Page

    ERIC Educational Resources Information Center

    McHale, Nina

    2008-01-01

    In the past decade, academic libraries have struggled with the design of an effective library home page. Since librarians' mental models of information architecture differ from those of their patrons, usability assessments are necessary in designing a user-centered home page. This study details a usability sequence of card sort and paper and…

  5. Paging and Scrolling: Cognitive Styles in Learning from Hypermedia

    ERIC Educational Resources Information Center

    Eyuboglu, Filiz; Orhan, Feza

    2011-01-01

    This study investigates the navigational patterns and learning achievement of university students with different cognitive styles, on hypermedia learning environments using paging or scrolling. The global-local subscales of Sternberg's Thinking Styles Inventory, two hypermedia, one using paging, the other using scrolling, a multiple choice…

  6. JavaScript: Convenient Interactivity for the Class Web Page.

    ERIC Educational Resources Information Center

    Gray, Patricia

    This paper shows how JavaScript can be used within HTML pages to add interactive review sessions and quizzes incorporating graphics and sound files. JavaScript has the advantage of providing basic interactive functions without the use of separate software applications and players. Because it can be part of a standard HTML page, it is…

  7. An Analysis of Academic Library Web Pages for Faculty

    ERIC Educational Resources Information Center

    Gardner, Susan J.; Juricek, John Eric; Xu, F. Grace

    2008-01-01

    Web sites are increasingly used by academic libraries to promote key services and collections to teaching faculty. This study analyzes the content, location, language, and technological features of fifty-four academic library Web pages designed especially for faculty to expose patterns in the development of these pages.

  8. Evaluating Information Quality: Hidden Biases on the Children's Web Pages

    ERIC Educational Resources Information Center

    Kurubacak, Gulsun

    2006-01-01

    As global digital communication continues to flourish, the Children's Web pages become more critical for children to realize not only the surface but also breadth and deeper meanings in presenting these milieus. These pages not only are very diverse and complex but also enable intense communication across social, cultural and political…

  9. World Wide Web Page Design: A Structured Approach.

    ERIC Educational Resources Information Center

    Gregory, Gwen; Brown, M. Marlo

    1997-01-01

    Describes how to develop a World Wide Web site based on structured programming concepts. Highlights include flowcharting, first page design, evaluation, page titles, documenting source code, text, graphics, and browsers. Includes a template for HTML writers, tips for using graphics, a sample homepage, guidelines for authoring structured HTML, and…

  10. Add Java extensions to your wiki: Java applets can bring dynamic functionality to your wiki pages

    SciTech Connect

    Scarberry, Randall E.

    2008-08-12

    Virtually everyone familiar with today’s world wide web has encountered the free online encyclopedia Wikipedia many times. What you may not know is that Wikipedia is driven by an excellent open-source product called MediaWiki, which is available to anyone for free. This has led to a proliferation of wiki sites devoted to just about any topic one can imagine. Users of a wiki can add content -- all that is required of them is that they type their additions into their web browsers using the simple markup language called wikitext. Even better, the developers of wikitext made it extensible. With a little server-side development of your own, you can add your own custom syntax. Users aware of your extensions can then utilize them on their wiki pages with a few simple keystrokes. These extensions can be custom decorations, formatting, web applications, and even instances of the venerable old Java applet. One example of a Java applet extension is the Jmol extension (REF), used to embed a 3-D molecular viewer. This article will walk you through the deployment of a fairly elaborate applet via a MediaWiki extension. By no means exhaustive -- an entire book would be required for that -- it will demonstrate how to give the applet resize handles using a little JavaScript and CSS coding and some popular JavaScript libraries. It even describes how a user may customize the extension somewhat using a wiki template. Finally, it explains a rudimentary persistence mechanism which allows applets to save data directly to the wiki pages on which they reside.

  11. Client-server technology meets operational-planning challenges

    SciTech Connect

    Cole, L.A.; Stansberry, C.J. Jr.; Le, K.D.; Ma, H.

    1996-07-01

    Utilities are starting to find that it is rather difficult to upgrade their proprietary energy management system, which was designed for real-time operations, fast enough to keep pace with rapidly changing business needs. To solve this problem, many utilities are building a data warehouse to store real-time data and using the data warehouse to launch client-server applications to meet their pressing business requirements. This article describes a client-server implementation launched at Tennessee Valley Authority in 1994 to meet the utility's operational-planning needs. The article summarizes some of the lessons learned and outlines future development plans.

  12. Performance model of the Argonne Voyager multimedia server

    SciTech Connect

    Disz, T.; Olson, R.; Stevens, R.

    1997-07-01

    The Argonne Voyager Multimedia Server is being developed in the Futures Lab of the Mathematics and Computer Science Division at Argonne National Laboratory. As a network-based service for recording and playing multimedia streams, it is important that the Voyager system be capable of sustaining certain minimal levels of performance in order for it to be a viable system. In this article, the authors examine the performance characteristics of the server. As they examine the architecture of the system, they try to determine where bottlenecks lie, show actual vs potential performance, and recommend areas for improvement through custom architectures and system tuning.

  13. The SAPHIRE server: a new algorithm and implementation.

    PubMed Central

    Hersh, W.; Leone, T. J.

    1995-01-01

    SAPHIRE is an experimental information retrieval system implemented to test new approaches to automated indexing and retrieval of medical documents. Due to limitations in its original concept-matching algorithm, a modified algorithm has been implemented which allows greater flexibility in partial matching and different word order within concepts. With the concomitant growth in client-server applications and the Internet in general, the new algorithm has been implemented as a server that can be accessed via other applications on the Internet. PMID:8563413

  14. SAbPred: a structure-based antibody prediction server

    PubMed Central

    Dunbar, James; Krawczyk, Konrad; Leem, Jinwoo; Marks, Claire; Nowak, Jaroslaw; Regep, Cristian; Georges, Guy; Kelm, Sebastian; Popovic, Bojana; Deane, Charlotte M.

    2016-01-01

    SAbPred is a server that makes predictions of the properties of antibodies focusing on their structures. Antibody informatics tools can help improve our understanding of immune responses to disease and aid in the design and engineering of therapeutic molecules. SAbPred is a single platform containing multiple applications which can: number and align sequences; automatically generate antibody variable fragment homology models; annotate such models with estimated accuracy alongside sequence and structural properties including potential developability issues; predict paratope residues; and predict epitope patches on protein antigens. The server is available at http://opig.stats.ox.ac.uk/webapps/sabpred. PMID:27131379

  15. File caching in video-on-demand servers

    NASA Astrophysics Data System (ADS)

    Wang, Fu-Ching; Chang, Shin-Hung; Hung, Chi-Wei; Chang, Jia-Yang; Oyang, Yen-Jen; Lee, Meng-Huang

    1997-12-01

    This paper studies the file caching issue in video-on-demand (VOD) servers. Because the characteristics of video files are very different from those of conventional files, different types of caching algorithms must be developed. For VOD servers, the goal is to optimize the resource allocation and tradeoff between memory and disk bandwidth. This paper first proves that this resource allocation and tradeoff problem is NP-complete. Then, a heuristic algorithm, called the generalized relay mechanism, is introduced and a simulation-based optimization procedure is conducted to evaluate the effects of applying the generalized relay mechanism.
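    The tradeoff the paper formalises can be sketched with a toy decision rule: a new request for the same title is either relayed from the buffered tail of an earlier stream (memory cost grows with the time gap) or given its own disk stream. The costs and the rule below are illustrative assumptions, not the generalized relay mechanism itself.

      # Toy memory-versus-disk-bandwidth decision for relay-style VOD caching.
      # Costs, bitrate, and thresholds are example values, not the paper's algorithm.

      def relay_decision(gap_seconds, bitrate_mbps, mem_cost_per_mb, disk_cost_per_stream):
          """Return ('relay', cost) or ('disk', cost), whichever is cheaper for this request."""
          buffer_mb = gap_seconds * bitrate_mbps / 8.0          # data kept in memory for the relay
          relay_cost = buffer_mb * mem_cost_per_mb
          if relay_cost < disk_cost_per_stream:
              return "relay", relay_cost
          return "disk", disk_cost_per_stream

      for gap in (2, 30, 300):   # seconds between consecutive requests for the same title
          print(gap, relay_decision(gap, bitrate_mbps=4.0,
                                    mem_cost_per_mb=0.05, disk_cost_per_stream=5.0))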

  16. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography

    PubMed Central

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.’s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.’s protocol and existing similar protocols. PMID:27163786
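    The key-agreement primitive underlying such schemes is elliptic-curve Diffie-Hellman. The sketch below shows only that primitive, using the Python cryptography package; it omits the biometrics, smart cards and authentication messages of the actual protocol.

      # Minimal ECDH key-agreement sketch with the "cryptography" package.
      # Illustrates the ECC primitive only, not the multi-server authentication scheme.
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import ec
      from cryptography.hazmat.primitives.kdf.hkdf import HKDF

      # Each side generates an ephemeral key pair on the same curve.
      user_key = ec.generate_private_key(ec.SECP256R1())
      server_key = ec.generate_private_key(ec.SECP256R1())

      # Each side derives the shared secret from its own private key and the peer's public key.
      user_shared = user_key.exchange(ec.ECDH(), server_key.public_key())
      server_shared = server_key.exchange(ec.ECDH(), user_key.public_key())
      assert user_shared == server_shared

      # Stretch the raw shared secret into a fixed-length session key.
      session_key = HKDF(
          algorithm=hashes.SHA256(), length=32, salt=None, info=b"example session"
      ).derive(user_shared)
      print(session_key.hex())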

  17. WebRASP: a server for computing energy scores to assess the accuracy and stability of RNA 3D structures

    PubMed Central

    Norambuena, Tomas; Cares, Jorge F.; Capriotti, Emidio; Melo, Francisco

    2013-01-01

    Summary: The understanding of the biological role of RNA molecules has changed. Although it is widely accepted that RNAs play important regulatory roles without necessarily coding for proteins, the functions of many of these non-coding RNAs are unknown. Thus, determining or modeling the 3D structure of RNA molecules as well as assessing their accuracy and stability has become of great importance for characterizing their functional activity. Here, we introduce a new web application, WebRASP, that uses knowledge-based potentials for scoring RNA structures based on distance-dependent pairwise atomic interactions. This web server allows the users to upload a structure in PDB format, select several options to visualize the structure and calculate the energy profile. The server contains online help, tutorials and links to other related resources. We believe this server will be a useful tool for predicting and assessing the quality of RNA 3D structures. Availability and implementation: The web server is available at http://melolab.org/webrasp. It has been tested on the most popular web browsers and requires Java plugin for Jmol visualization. Contact: fmelo@bio.puc.cl PMID:23929030

  18. Design and Analysis of an Enhanced Patient-Server Mutual Authentication Protocol for Telecare Medical Information System.

    PubMed

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Obaidat, Mohammad S

    2015-11-01

    In order to access a remote medical server, patients generally use a smart card to log in to the server. It has been observed that most user (patient) authentication protocols suffer from stolen smart card attacks, which means the attacker can mount several common attacks after extracting the smart card information. Recently, Lu et al. proposed a session key agreement protocol between the patient and the remote medical server and claimed that the protocol is secure against relevant security attacks. However, this paper presents several security attacks on Lu et al.'s protocol, such as an identity trace attack, a new smart card issue attack, a patient impersonation attack and a medical server impersonation attack. In order to fix the mentioned security pitfalls, including the stolen smart card attack, this paper proposes an efficient remote mutual authentication protocol using a smart card. We have then simulated the proposed protocol using the widely accepted AVISPA simulation tool, whose results make certain that the protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. Moreover, the rigorous security analysis proves that the proposed protocol provides strong protection against the relevant security attacks, including the stolen smart card attack. We compare the proposed scheme with several related schemes in terms of computation cost and communication cost as well as security functionalities. It has been observed that the proposed scheme is comparatively better than related existing schemes.

  19. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography.

    PubMed

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.

  1. Design and Analysis of an Enhanced Patient-Server Mutual Authentication Protocol for Telecare Medical Information System.

    PubMed

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Obaidat, Mohammad S

    2015-11-01

    In order to access a remote medical server, patients generally use a smart card to log in to the server. It has been observed that most user (patient) authentication protocols suffer from stolen smart card attacks, which means the attacker can mount several common attacks after extracting the smart card information. Recently, Lu et al. proposed a session key agreement protocol between the patient and the remote medical server and claimed that the protocol is secure against relevant security attacks. However, this paper presents several security attacks on Lu et al.'s protocol, such as an identity trace attack, a new smart card issue attack, a patient impersonation attack and a medical server impersonation attack. In order to fix the mentioned security pitfalls, including the stolen smart card attack, this paper proposes an efficient remote mutual authentication protocol using a smart card. We have then simulated the proposed protocol using the widely accepted AVISPA simulation tool, whose results make certain that the protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. Moreover, the rigorous security analysis proves that the proposed protocol provides strong protection against the relevant security attacks, including the stolen smart card attack. We compare the proposed scheme with several related schemes in terms of computation cost and communication cost as well as security functionalities. It has been observed that the proposed scheme is comparatively better than related existing schemes. PMID:26324169

  2. The World-2DPAGE Constellation to promote and publish gel-based proteomics data through the ExPASy server.

    PubMed

    Hoogland, Christine; Mostaguir, Khaled; Appel, Ron D; Lisacek, Frédérique

    2008-07-21

    Since it was launched in 1993, the ExPASy server has been and is still a reference in the proteomics world. ExPASy users access various databases, many dedicated tools, and lists of resources, among other services. A significant part of resources available is devoted to two-dimensional electrophoresis data. Our latest contribution to the expansion of the pool of on-line proteomics data is the World-2DPAGE Constellation, accessible at http://world-2dpage.expasy.org/. It is composed of the established WORLD-2DPAGE List of 2-D PAGE database servers, the World-2DPAGE Portal that queries simultaneously world-wide proteomics databases, and the recently created World-2DPAGE Repository. The latter component is a public standards-compliant repository for gel-based proteomics data linked to protein identifications published in the literature. It has been set up using the Make2D-DB package, a software tool that helps building SWISS-2DPAGE-like databases on one's own Web site. The lack of necessary informatics infrastructure to build and run a dedicated website is no longer an obstacle to make proteomics data publicly accessible on the Internet. PMID:18617148

  3. Optimizing TLB entries for mixed page size storage in contiguous memory

    DOEpatents

    Chen, Dong; Gara, Alan; Giampapa, Mark E.; Heidelberger, Philip; Kriegel, Jon K.; Ohmacht, Martin; Steinmacher-Burow, Burkhard

    2013-04-30

    A system and method for accessing memory are provided. The system comprises a lookup buffer for storing one or more page table entries, wherein each of the one or more page table entries comprises at least a virtual page number and a physical page number; a logic circuit for receiving a virtual address from said processor, said logic circuit for matching the virtual address to the virtual page number in one of the page table entries to select the physical page number in the same page table entry, said page table entry having one or more bits set to exclude a memory range from a page.
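    A simplified software illustration of the idea, not the patented design: each lookup-buffer entry records its own page size, so pages of different sizes can coexist and a virtual address is matched by dividing by the entry's page size (the exclusion-bit mechanism is omitted here).

      # Conceptual software TLB supporting mixed page sizes. Illustration only.

      class TLBEntry:
          def __init__(self, virtual_page, physical_page, page_size):
              self.virtual_page = virtual_page      # virtual address // page_size
              self.physical_page = physical_page
              self.page_size = page_size            # bytes, a power of two

      class TLB:
          def __init__(self):
              self.entries = []

          def insert(self, entry):
              self.entries.append(entry)

          def translate(self, virtual_address):
              """Return the physical address, or None on a TLB miss."""
              for e in self.entries:
                  if virtual_address // e.page_size == e.virtual_page:
                      offset = virtual_address % e.page_size
                      return e.physical_page * e.page_size + offset
              return None

      tlb = TLB()
      tlb.insert(TLBEntry(virtual_page=0x12, physical_page=0x7A, page_size=4096))      # 4 KiB page
      tlb.insert(TLBEntry(virtual_page=0x3, physical_page=0x5, page_size=16 * 2**20))  # 16 MiB page
      print(hex(tlb.translate(0x12345)))  # falls inside the 4 KiB page -> 0x7a345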

  4. Final Report for ''Client Server Software for the National Transport Code Collaboration''

    SciTech Connect

    John R Cary; David Alexander; Johan Carlsson; Kelly Luetkemeyer; Nathaniel Sizemore

    2004-04-30

    OAK-B135 Tech-X Corporation designed and developed all the networking code tying together the NTCC data server with the data client and the physics server with the data server and physics client. We were also solely responsible for the data and physics clients and the vast majority of the work on the data server. We also performed a number of other tasks.

  5. Performance analysis of a fault-tolerant distributed multimedia server

    NASA Astrophysics Data System (ADS)

    Derryberry, Barbara

    1998-12-01

    The evolving demands of networks to support Webtone, H.323, AIN and other advanced services require multimedia servers that can deliver a number of value-added capabilities, such as negotiating protocols, delivering network services, and responding to QoS requests. The server is one of the primary limiters on network capacity. The next-generation server must be based upon a flexible, robust, scalable, and reliable platform to keep abreast of the revolutionary pace of service demand and development while continuing to provide the same dependability that voice networks have provided for decades. A new distributed platform, which is based upon the Totem fault-tolerant messaging system, is described. Processor and network resources are modeled and analyzed. Quantitative results are presented that assess this platform in terms of messaging capacity and performance for various architecture and design options, including processing technologies and fault-tolerance modes. The impacts of fault-tolerant messaging are identified based upon analytical modeling of the proposed server architecture.

  6. Economics of Computing: The Case of Centralized Network File Servers.

    ERIC Educational Resources Information Center

    Solomon, Martin B.

    1994-01-01

    Discusses computer networking and the cost effectiveness of decentralization, including local area networks. A planned experiment with a centralized approach to the operation and management of file servers at the University of South Carolina is described that hopes to realize cost savings and the avoidance of staffing problems. (Contains four…

  7. Perspectives of IT Professionals on Employing Server Virtualization Technologies

    ERIC Educational Resources Information Center

    Sligh, Darla

    2010-01-01

    Server virtualization enables a physical computer to support multiple applications logically by decoupling the application from the hardware layer, thereby reducing operational costs and improving competitiveness in delivering IT services to enterprise organizations. IT organizations continually examine the efficiency of their internal IT systems and…

  8. Training to Increase Safe Tray Carrying among Cocktail Servers

    ERIC Educational Resources Information Center

    Scherrer, Megan D.; Wilder, David A.

    2008-01-01

    We evaluated the effects of training on proper carrying techniques among 3 cocktail servers to increase safe tray carrying on the job and reduce participants' risk of developing musculoskeletal disorders. As participants delivered drinks to their tables, their finger, arm, and neck positions were observed and recorded. Each participant received…

  9. Two-cloud-servers-assisted secure outsourcing multiparty computation.

    PubMed

    Sun, Yi; Wen, Qiaoyan; Zhang, Yudong; Zhang, Hua; Jin, Zhengping; Li, Wenmin

    2014-01-01

    We focus on how to securely outsource computation task to the cloud and propose a secure outsourcing multiparty computation protocol on lattice-based encrypted data in two-cloud-servers scenario. Our main idea is to transform the outsourced data respectively encrypted by different users' public keys to the ones that are encrypted by the same two private keys of the two assisted servers so that it is feasible to operate on the transformed ciphertexts to compute an encrypted result following the function to be computed. In order to keep the privacy of the result, the two servers cooperatively produce a custom-made result for each user that is authorized to get the result so that all authorized users can recover the desired result while other unauthorized ones including the two servers cannot. Compared with previous research, our protocol is completely noninteractive between any users, and both of the computation and the communication complexities of each user in our solution are independent of the computing function.

  10. BION web server: predicting non-specifically bound surface ions

    PubMed Central

    Alexov, Emil

    2013-01-01

    Motivation: Ions are essential component of the cell and frequently are found bound to various macromolecules, in particular to proteins. A binding of an ion to a protein greatly affects protein’s biophysical characteristics and needs to be taken into account in any modeling approach. However, ion’s bounded positions cannot be easily revealed experimentally, especially if they are loosely bound to macromolecular surface. Results: Here, we report a web server, the BION web server, which addresses the demand for tools of predicting surface bound ions, for which specific interactions are not crucial; thus, they are difficult to predict. The BION is easy to use web server that requires only coordinate file to be inputted, and the user is provided with various, but easy to navigate, options. The coordinate file with predicted bound ions is displayed on the output and is available for download. Availability: http://compbio.clemson.edu/bion_server/ Supplementary information: Supplementary data are available at Bioinformatics online. Contact: ealexov@clemson.edu PMID:23380591

  11. Microsoft SQL Server 6.0® Workbook

    SciTech Connect

    Augustenborg, E.C.

    1996-09-01

    This workbook was prepared for introductory training in the use of Microsoft SQL Server Version 6.0. The examples are all taken from the PUBS database that Microsoft distributes for training purposes or from the Microsoft Online Documentation. The merits of the relational database are presented.
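    For readers working through such exercises from Python rather than a SQL client, a hedged sketch of querying the PUBS database over ODBC is shown below; the driver name, server address, and credentials are placeholders to replace with your own, and only the table and column names come from PUBS.

      # Sketch of querying the PUBS training database via ODBC from Python.
      # Connection details are placeholders; adjust them to your own SQL Server install.
      import pyodbc

      conn_str = (
          "DRIVER={ODBC Driver 17 for SQL Server};"  # assumed driver name
          "SERVER=localhost;DATABASE=pubs;UID=student;PWD=changeme"
      )

      with pyodbc.connect(conn_str) as conn:
          cursor = conn.cursor()
          # List authors from California, a classic PUBS exercise.
          cursor.execute(
              "SELECT au_lname, au_fname FROM authors WHERE state = ? ORDER BY au_lname", "CA"
          )
          for last_name, first_name in cursor.fetchall():
              print(f"{last_name}, {first_name}")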

  12. Geographic Information Systems-Transportation ISTEA management systems server-net prototype pooled fund study: Phase B summary

    SciTech Connect

    Espinoza, J. Jr.; Dean, C.D.; Armstrong, H.M.

    1997-06-01

    The Geographic Information System-Transportation (GIS-T) ISTEA Management Systems Server Net Prototype Pooled Fund Study represents the first national cooperative effort in the transportation industry to address the management and monitoring systems as well as the statewide and metropolitan transportation planning requirements of the Intermodal Surface Transportation Efficiency Act of 1991 (ISTEA). The Study was initiated in November 1993 through the Alliance for Transportation Research and under the leadership of the New Mexico State Highway and Transportation Department. Sandia National Laboratories, an Alliance partner, and Geographic Paradigm Computing, Inc. provided technical leadership for the project. In 1992, the Alliance for Transportation Research, the New Mexico State Highway and Transportation Department, Sandia National Laboratories, and Geographic Paradigm Computing, Inc., proposed a comprehensive research agenda for GIS-T. That program outlined a national effort to synthesize new transportation policy initiatives (e.g., management systems and Intelligent Transportation Systems) with the GIS-T server net ideas contained in the NCHRP project "Adaptation of GIS to Transportation". After much consultation with state, federal, and private interests, a project proposal based on this agenda was prepared and resulted in this Study. The general objective of the Study was to develop GIS-T server net prototypes supporting the ISTEA requirements for transportation planning and management and monitoring systems. This objective can be further qualified to: (1) Create integrated information system architectures and design requirements encompassing transportation planning activities and data. (2) Encourage the development of functional GIS-T server net prototypes. (3) Demonstrate multiple information systems implemented in a server net environment.

  13. Multimedia medical data archive and retrieval server on the Internet

    NASA Astrophysics Data System (ADS)

    Komo, Darmadi; Levine, Betty A.; Freedman, Matthew T.; Mun, Seong K.; Tang, Y. K.; Chiang, Ted T.

    1997-05-01

    The Multimedia Medical Data Archive and Retrieval Server has been installed at the imaging science and information systems (ISIS) center in Georgetown University Medical Center to provide medical data archive and retrieval support for medical researchers. The medical data includes text, images, sound, and video. All medical data is keyword indexed using a database management system and placed temporarily in a staging area and then transferred to a StorageTek one terabyte tape library system with a robotic arm for permanent archive. There are two methods of interaction with the system. The first method is to use a web browser with HTML functions to perform insert, query, update, and retrieve operations. These generate dynamic SQL calls to the database and produce StorageTek API calls to the tape library. The HTML functions consist of a database, StorageTek interface, HTTP server, common gateway interface, and Java programs. The second method is to issue a DICOM store command, which is translated by the system's DICOM server to SQL calls and then produce StorageTek API calls to the tape library. The system performs as both an Internet and a DICOM server using standard protocols such as HTTP, HTML, Java, and DICOM. Users with proper authentication can log on to the server from anywhere on the Internet using a standard web browser resulting in a user-friendly, open environment, and platform independent solution for archiving multimedia medical data. It represents a complex integration of different components including a robotic tape storage system, database, user-interface, WWW protocols, and TCP/IP networking. The user will only deal with the WWW and DICOM server components of the system, the database and robotic tape library system are transparent and the user will not know that the medical data is stored on magnetic tapes. The server provides the researchers a cost-effective tool for archiving and retrieving medical data across a TCP/IP network environment. It will

  14. Towards Big Earth Data Analytics: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2013-04-01

    Big Data in the Earth sciences, the Tera- to Exabyte archives, are mostly made up of coverage data, whereby the term "coverage", according to ISO and OGC, is defined as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The EarthServer initiative, funded by EU FP7 eInfrastructures, unites 11 partners from computer and earth sciences to establish Big Earth Data Analytics. One key ingredient is flexibility for users to ask what they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level query languages; these have proven tremendously successful on tabular and XML data, and we extend them with a central geo data structure, multi-dimensional arrays. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantic-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform is built around rasdaman, an Array DBMS enabling efficient storage and retrieval of any-size, any-type multi-dimensional raster data. In the project, rasdaman is being extended with several functionality and scalability features, including: support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data
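    To give a flavour of the query-language approach, the sketch below sends a WCPS-style array query over HTTP. The endpoint URL, request parameters and coverage name are assumptions made for illustration; the real interface should be taken from the EarthServer/rasdaman service documentation.

      # Sketch of posting a WCPS-style array query to a coverage service over HTTP.
      # Endpoint, parameter names, and coverage name are placeholders for illustration.
      import requests

      ENDPOINT = "https://example.org/rasdaman/ows"   # placeholder service URL
      query = 'for $c in (MeanTemperature) return encode(avg($c), "text/csv")'

      response = requests.get(
          ENDPOINT,
          params={
              "service": "WCS",
              "version": "2.0.1",
              "request": "ProcessCoverages",
              "query": query,
          },
          timeout=60,
      )
      response.raise_for_status()
      print(response.text)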

  15. ASPEN--A Web-Based Application for Managing Student Server Accounts

    ERIC Educational Resources Information Center

    Sandvig, J. Christopher

    2004-01-01

    The growth of the Internet has greatly increased the demand for server-side programming courses at colleges and universities. Students enrolled in such courses must be provided with server-based accounts that support the technologies that they are learning. The process of creating, managing and removing large numbers of student server accounts is…

  16. primers4clades: a web server that uses phylogenetic trees to design lineage-specific PCR primers for metagenomic and diversity studies

    PubMed Central

    Contreras-Moreira, Bruno; Sachman-Ruiz, Bernardo; Figueroa-Palacios, Iraís; Vinuesa, Pablo

    2009-01-01

    Primers4clades is an easy-to-use web server that implements a fully automatic PCR primer design pipeline for cross-species amplification of novel sequences from metagenomic DNA, or from uncharacterized organisms, belonging to user-specified phylogenetic clades or taxa. The server takes a set of non-aligned protein coding genes, with or without introns, aligns them and computes a neighbor-joining tree, which is displayed on screen for easy selection of species or sequence clusters to design lineage-specific PCR primers. Primers4clades implements an extended CODEHOP primer design strategy based on both DNA and protein multiple sequence alignments. It evaluates several thermodynamic properties of the oligonucleotide pairs, and computes the phylogenetic information content of the predicted amplicon sets from Shimodaira–Hasegawa-like branch support values of maximum likelihood phylogenies. A non-redundant set of primer formulations is returned, ranked according to their thermodynamic properties. An amplicon distribution map provides a convenient overview of the coverage of the target locus. Altogether these features greatly help the user in making an informed choice between alternative primer pair formulations. Primers4clades is available at two mirror sites: http://maya.ccg.unam.mx/primers4clades/ and http://floresta.eead.csic.es/primers4clades/. Three demo data sets and a comprehensive documentation/tutorial page are provided for easy testing of the server's capabilities and interface. PMID:19465390

  17. primers4clades: a web server that uses phylogenetic trees to design lineage-specific PCR primers for metagenomic and diversity studies.

    PubMed

    Contreras-Moreira, Bruno; Sachman-Ruiz, Bernardo; Figueroa-Palacios, Iraís; Vinuesa, Pablo

    2009-07-01

    Primers4clades is an easy-to-use web server that implements a fully automatic PCR primer design pipeline for cross-species amplification of novel sequences from metagenomic DNA, or from uncharacterized organisms, belonging to user-specified phylogenetic clades or taxa. The server takes a set of non-aligned protein coding genes, with or without introns, aligns them and computes a neighbor-joining tree, which is displayed on screen for easy selection of species or sequence clusters to design lineage-specific PCR primers. Primers4clades implements an extended CODEHOP primer design strategy based on both DNA and protein multiple sequence alignments. It evaluates several thermodynamic properties of the oligonucleotide pairs, and computes the phylogenetic information content of the predicted amplicon sets from Shimodaira-Hasegawa-like branch support values of maximum likelihood phylogenies. A non-redundant set of primer formulations is returned, ranked according to their thermodynamic properties. An amplicon distribution map provides a convenient overview of the coverage of the target locus. Altogether these features greatly help the user in making an informed choice between alternative primer pair formulations. Primers4clades is available at two mirror sites: http://maya.ccg.unam.mx/primers4clades/ and http://floresta.eead.csic.es/primers4clades/. Three demo data sets and a comprehensive documentation/tutorial page are provided for easy testing of the server's capabilities and interface.

  18. An Efficient Web Page Ranking for Semantic Web

    NASA Astrophysics Data System (ADS)

    Chahal, P.; Singh, M.; Kumar, S.

    2014-01-01

    With the enormous amount of information presented on the web, the retrieval of relevant information has become a serious problem and has been a topic of research for the last few years. The most common tools to retrieve information from the web are search engines like Google. Search engines are usually based on keyword searching and indexing of web pages. This approach is not very efficient, as the result-set of web pages obtained includes many irrelevant pages. Sometimes even the entire result-set may contain a lot of irrelevant pages for the user. The next generation of search engines must address this problem. Recently, many semantic web search engines have been developed, like Ontolook and Swoogle, which help in searching meaningful documents presented on the semantic web. In this process the ranking of the retrieved web pages is very crucial. Some attempts have been made at ranking semantic web pages, but the ranking of these semantic web documents is neither satisfactory nor up to the user's expectations. In this paper we have proposed a semantic web based document ranking scheme that relies not only on the keywords but also on the conceptual instances present between the keywords. As a result only the relevant pages will be at the top of the result-set of searched web pages. We explore all relevant relations between the keywords, exploring the user's intention, and then calculate the fraction of these relations on each web page to determine their relevance. We have found that this ranking technique gives better results than those of the prevailing methods.
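    The scoring idea can be sketched in a few lines: a page is scored by the keywords it contains plus the fraction of keyword-to-keyword relations it covers, so pages matching the user's intended sense rank higher. The relation set and weighting below are illustrative assumptions, not the paper's exact scheme.

      # Toy relation-aware page scoring: keyword hits plus the fraction of relevant
      # keyword pairs covered by the page. Weights and relations are example values.

      def score_page(text, keywords, relations, relation_weight=2.0):
          """relations is an iterable of (keyword_a, keyword_b) pairs considered relevant."""
          words = set(text.lower().split())
          keyword_hits = sum(1 for k in keywords if k.lower() in words)
          covered = sum(1 for a, b in relations if a.lower() in words and b.lower() in words)
          relation_fraction = covered / len(relations) if relations else 0.0
          return keyword_hits + relation_weight * relation_fraction

      pages = {
          "page1": "jaguar speed on land versus other big cats",
          "page2": "jaguar car dealership offers the best speed and price",
      }
      keywords = ["jaguar", "speed"]
      relations = [("jaguar", "cats")]  # encodes the user's intended sense of "jaguar"

      for name in sorted(pages, key=lambda n: score_page(pages[n], keywords, relations), reverse=True):
          print(name, score_page(pages[name], keywords, relations))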

  20. Exploring the use of a Facebook page in anatomy education.

    PubMed

    Jaffar, Akram Abood

    2014-01-01

    Facebook is the most popular social media site visited by university students on a daily basis. Consequently, Facebook is the logical place to start with for integrating social media technologies into education. This study explores how a faculty-administered Facebook Page can be used to supplement anatomy education beyond the traditional classroom. Observations were made on students' perceptions and effectiveness of using the Page, potential benefits and challenges of such use, and which Insights metrics best reflect user's engagement. The Human Anatomy Education Page was launched on Facebook and incorporated into anatomy resources for 157 medical students during two academic years. Students' use of Facebook and their perceptions of the Page were surveyed. Facebook's "Insights" tool was also used to evaluate Page performance during a period of 600 days. The majority of in-class students had a Facebook account which they adopted in education. Most students perceived Human Anatomy Education Page as effective in contributing to learning and favored "self-assessment" posts. The majority of students agreed that Facebook could be a suitable learning environment. The "Insights" tool revealed globally distributed fans with considerable Page interactions. The use of a faculty-administered Facebook Page provided a venue to enhance classroom teaching without intruding into students' social life. A wider educational use of Facebook should be adopted not only because students are embracing its use, but for its inherent potentials in boosting learning. The "Insights" metrics analyzed in this study might be helpful when establishing and evaluating the performance of education-oriented Facebook Pages. PMID:24022984

  1. Home Page: The Mode of Transport through the Information Superhighway

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1995-01-01

    The purpose of the project with the Aeroacoustics Branch was to create and submit a home page for the internet about branch information. In order to do this, one must also become familiar with the way that the internet operates. Learning HyperText Markup Language (HTML), and the ability to create a document using this language was the final objective in order to place a home page on the internet (World Wide Web). A manual of instructions regarding maintenance of the home page, and how to keep it up to date was also necessary in order to provide branch members with the opportunity to make any pertinent changes.

  2. PockDrug-Server: a new web server for predicting pocket druggability on holo and apo proteins.

    PubMed

    Hussein, Hiba Abi; Borrel, Alexandre; Geneix, Colette; Petitjean, Michel; Regad, Leslie; Camproux, Anne-Claude

    2015-07-01

    Predicting protein pocket's ability to bind drug-like molecules with high affinity, i.e. druggability, is of major interest in the target identification phase of drug discovery. Therefore, pocket druggability investigations represent a key step of compound clinical progression projects. Currently computational druggability prediction models are attached to one unique pocket estimation method despite pocket estimation uncertainties. In this paper, we propose 'PockDrug-Server' to predict pocket druggability, efficient on both (i) estimated pockets guided by the ligand proximity (extracted by proximity to a ligand from a holo protein structure) and (ii) estimated pockets based solely on protein structure information (based on amino atoms that form the surface of potential binding cavities). PockDrug-Server provides consistent druggability results using different pocket estimation methods. It is robust with respect to pocket boundary and estimation uncertainties, thus efficient using apo pockets that are challenging to estimate. It clearly distinguishes druggable from less druggable pockets using different estimation methods and outperformed recent druggability models for apo pockets. It can be carried out from one or a set of apo/holo proteins using different pocket estimation methods proposed by our web server or from any pocket previously estimated by the user. PockDrug-Server is publicly available at: http://pockdrug.rpbs.univ-paris-diderot.fr.

  3. Hardware and Software Interfacing at New Mexico Geochronology Research Laboratory: Distributed Control Using Pychron and RemoteControlServer.cs

    NASA Astrophysics Data System (ADS)

    McIntosh, W. C.; Ross, J. I.

    2012-12-01

    We developed a system for interfacing existing hardware and software to two new Thermo Scientific Argus VI mass spectrometers and three Photon Machines Fusions laser systems at New Mexico Geochronology Research Laboratory. NMGRL's upgrade to the new analytical equipment required the design and implementation of a software ecosystem that allows seamless communication between various software and hardware components. Based on past experience and initial testing we chose to pursue a "Fully Distributed Control" model. In this model, hardware is compartmentalized and controlled by customized software running on individual computers. Each computer is connected to a Local Area Network (LAN), facilitating inter-process communication using TCP or UDP Internet Protocols. Two other options for interfacing are 1) Single Control, in which all hardware is controlled by a single application on a single computer, and 2) Partial Distributed Control, in which the mass spectrometer is controlled directly by Thermo Scientific's Qtegra and all other hardware is controlled by a separate application. The "Fully Distributed Control" model offers the most efficient use of software resources, leveraging our in-house laboratory software with proprietary third-party applications, such as Qtegra and Mass Spec. Two software products resulted from our efforts. 1) Pychron, a configurable and extensible package for hardware control, data acquisition and preprocessing, and 2) RemoteControlServer.cs, a C# script for Thermo's Qtegra software that implements a TCP/UDP command server. Pychron is written in Python and uses standard, well-established libraries such as Numpy, Scipy, and Enthought ETS. Pychron is flexible and extensible, encouraging experimentation and rapid development of new features. A project page for Pychron is located at http://code.google.com/p/arlab, featuring an issue tracker and a Version Control System (Mercurial). RemoteControlServer.cs is a simple socket server that listens
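    A minimal sketch of such a TCP command server, using only the Python standard library; the command names and reply format are invented for illustration and are not the actual Pychron or RemoteControlServer.cs protocol.

      # Minimal line-oriented TCP command server for inter-process control over a LAN.
      # Command names and replies are hypothetical examples.
      import socketserver

      class CommandHandler(socketserver.StreamRequestHandler):
          def handle(self):
              line = self.rfile.readline().decode().strip()   # one command per connection
              if line == "GetStatus":
                  reply = "OK idle"
              elif line.startswith("SetLaserPower "):         # hypothetical command
                  reply = f"OK power={line.split()[1]}"
              else:
                  reply = "ERR unknown command"
              self.wfile.write((reply + "\n").encode())

      if __name__ == "__main__":
          with socketserver.TCPServer(("0.0.0.0", 8000), CommandHandler) as server:
              server.serve_forever()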

  4. 8. DETAIL OF PAGE ONE IN VOLUME ONE OF KREGEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. DETAIL OF PAGE ONE IN VOLUME ONE OF KREGEL WINDMILL COMPANY LEDGERS RECORDING THE FOUNDING OF THE COMPANY ON 19 AUGUST 1879. - Kregel Windmill Company Factory, 1416 Central Avenue, Nebraska City, Otoe County, NE

  5. Book Holder And Page Turner For The Elderly And Handicapped

    NASA Technical Reports Server (NTRS)

    Kerley, James; Eklund, Wayne

    1993-01-01

    Device holds reading matter and facilitates page turning for person not having use of arms and hands. Accommodates variety of publication formats, whether book, magazine, or newspaper. Holder sits on hospital-bed table and adjusted to convenient viewing angle. Includes flat upright back support for reading matter, hinged base, and main bracket with bent-wire page holders. Top support on back extended for such large items as newspapers. Wings on back support extended for oversize materials. Reader turns page by gripping special rod via mouthpiece, applying friction cup at its tip to page, and manipulating rod. Mouthpiece wide and tapered so user grips with teeth and uses jaws to move it, rather than using tongue or lips. Helpful to older people, whose facial and mouth muscles weak.

  6. 1. Historic American Buildings Survey Annals of SF, Page 170 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. Historic American Buildings Survey Annals of SF, Page 170 Jocelyn Annin-Del Photo Taken: 1836 FOURTH OF JULY CELEBRATION YEAR BUILT 1836 - Jacob Leese House, Historic View, Grant Avenue, San Francisco, San Francisco County, CA

  7. Progress In Automatic Reading Of Complex Typeset Pages

    NASA Astrophysics Data System (ADS)

    Vincent, Philippe

    1989-07-01

    For a long time, automatic reading has been limited to optical character recognition. One year ago, except for one high-end product, all industrial software or hardware products were limited to the reading of mono-column texts without images. This does not correspond to real-life needs. In a typical company, pages which need to be transformed into electronic form are not only typewritten pages, but also complex pages from professional magazines, technical manuals, financial reports and tables, administrative documents, various directories, lists of spare parts etc... The real problem of automatic reading is to transform such complex paper pages including columns, images, drawings, titles, footnotes, legends, tables, occasionally in landscape format, into a computer text file without the help of an operator. Moreover, the problem is to perform this operation at an economical cost with limited computer resources in terms of processor and memory.

  8. 47 CFR 22.503 - Paging geographic area authorizations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... interference problems before bringing matters to the FCC. In the event that there is no co-channel paging... on a non-interfering basis. (j) Site location restriction. The transmitting antenna of each...

  9. Visualizing Worlds from Words on a Page

    ERIC Educational Resources Information Center

    Parsons, Linda T.

    2006-01-01

    This study involved fourth grade children as co-researchers of their engaged, aesthetic reading experience. As members of the "Readers as Researchers Club," they documented their engagement with text--how they create, enter, and sustain the story world. The children, who self-identified as avid readers, explored the activities central to their…

  10. ASM Based Synthesis of Handwritten Arabic Text Pages

    PubMed Central

    Dinges, Laslo; Al-Hamadi, Ayoub; Elzobi, Moftah; El-etriby, Sherif; Ghoneim, Ahmed

    2015-01-01

    Document analysis tasks, such as text recognition, word spotting, or segmentation, are highly dependent on comprehensive and suitable databases for training and validation. However, their generation is expensive in terms of labor and time. As a matter of fact, there is a lack of such databases, which complicates research and development. This is especially true for the case of Arabic handwriting recognition, which involves different preprocessing, segmentation, and recognition methods that have individual demands on samples and ground truth. To bypass this problem, we present an efficient system that automatically turns Arabic Unicode text into synthetic images of handwritten documents and detailed ground truth. Active Shape Models (ASMs) based on 28046 online samples were used for character synthesis, and statistical properties were extracted from the IESK-arDB database to simulate baselines and word slant or skew. In the synthesis step, ASM based representations are composed into words and text pages, smoothed by B-Spline interpolation and rendered considering writing speed and pen characteristics. Finally, we use the synthetic data to validate a segmentation method. An experimental comparison with the IESK-arDB database encourages training and testing document analysis related methods on synthetic samples whenever no sufficient natural ground-truthed data is available. PMID:26295059
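    The B-spline smoothing step mentioned above can be sketched with SciPy: a coarse, noisy pen stroke is fitted with a parametric smoothing spline and resampled densely. The sample coordinates and smoothing factor are arbitrary example values.

      # Smooth a jagged 2-D pen stroke with a parametric B-spline and resample it.
      # Points and smoothing factor are example values, not data from the paper.
      import numpy as np
      from scipy.interpolate import splprep, splev

      # A coarse, noisy stroke (x, y coordinates).
      x = np.array([0.0, 1.0, 2.1, 2.9, 4.0, 5.2])
      y = np.array([0.0, 0.8, 0.5, 1.4, 1.1, 2.0])

      tck, _ = splprep([x, y], s=0.2)          # fit a parametric smoothing spline
      u_fine = np.linspace(0.0, 1.0, 100)      # dense parameter values
      x_smooth, y_smooth = splev(u_fine, tck)  # resampled smooth stroke

      print(len(x_smooth), float(x_smooth[0]), float(x_smooth[-1]))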

  11. ASM Based Synthesis of Handwritten Arabic Text Pages.

    PubMed

    Dinges, Laslo; Al-Hamadi, Ayoub; Elzobi, Moftah; El-Etriby, Sherif; Ghoneim, Ahmed

    2015-01-01

    Document analysis tasks, such as text recognition, word spotting, or segmentation, are highly dependent on comprehensive and suitable databases for training and validation. However, their generation is expensive in terms of labor and time. As a matter of fact, there is a lack of such databases, which complicates research and development. This is especially true for the case of Arabic handwriting recognition, which involves different preprocessing, segmentation, and recognition methods, each with individual demands on samples and ground truth. To bypass this problem, we present an efficient system that automatically turns Arabic Unicode text into synthetic images of handwritten documents and detailed ground truth. Active Shape Models (ASMs) based on 28046 online samples were used for character synthesis, and statistical properties were extracted from the IESK-arDB database to simulate baselines and word slant or skew. In the synthesis step, ASM-based representations are composed into words and text pages, smoothed by B-Spline interpolation, and rendered considering writing speed and pen characteristics. Finally, we use the synthetic data to validate a segmentation method. An experimental comparison with the IESK-arDB database encourages training and testing document analysis methods on synthetic samples whenever no sufficient natural ground-truthed data is available. PMID:26295059

  12. Using Shadow Page Cache to Improve Isolated Drivers Performance

    PubMed Central

    Dong, Xiaoshe; Wang, Endong; Chen, Baoke; Zhu, Zhengdong; Liu, Chengzhe

    2015-01-01

    With the advantage of the reusability property of virtualization technology, users can reuse various types and versions of existing operating systems and drivers in a virtual machine, so as to customize their application environment. In order to prevent users' virtualization environments from being impacted by driver faults in the virtual machine, Chariot examines the correctness of a driver's write operations by combining a driver write-operation capture with the driver's private access control table. However, this method needs to keep the write permission of the shadow page table as read-only, so as to capture the isolated driver's write operations through page faults, which adversely affects the performance of the driver. Based on delaying the setting of frequently used shadow pages' write permissions to read-only, this paper proposes an algorithm that uses a shadow page cache to improve the performance of isolated drivers and carefully studies the relationship between driver performance and the size of the shadow page cache. Experimental results show that, through the shadow page cache, the performance of isolated drivers can be greatly improved without impacting Chariot's reliability too much. PMID:25815373
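
    The caching idea described above can be illustrated with a small, self-contained sketch. The following Python model is purely illustrative and is not the Chariot implementation: the page numbers, cache size, and permission-check callback are invented stand-ins for the hypervisor-level mechanisms the paper describes.

from collections import OrderedDict

class ShadowPageCache:
    """Toy model of the idea: frequently written shadow pages stay writable
    (cached) so driver writes avoid page faults; pages evicted from the cache
    are set back to read-only so later writes are captured again."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.writable = OrderedDict()   # page number -> True, kept in LRU order

    def on_write(self, page, check_permission):
        if page in self.writable:
            # Hot page: write proceeds without a page fault.
            self.writable.move_to_end(page)
            return "fast write"
        # Cold page: a (simulated) page fault lets us validate the write.
        if not check_permission(page):
            return "write rejected"
        # Promote the page; evict the least recently used one if needed.
        if len(self.writable) >= self.capacity:
            self.writable.popitem(last=False)
            # In a real hypervisor the evicted page would be re-protected here.
        self.writable[page] = True
        return "slow write (fault handled, page cached)"

# Example: only pages listed in the driver's access table may be written.
allowed = {1, 2, 3, 7}
cache = ShadowPageCache(capacity=2)
for p in [1, 2, 1, 7, 1, 9]:
    print(p, cache.on_write(p, lambda pg: pg in allowed))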

  13. Using shadow page cache to improve isolated drivers performance.

    PubMed

    Zheng, Hao; Dong, Xiaoshe; Wang, Endong; Chen, Baoke; Zhu, Zhengdong; Liu, Chengzhe

    2015-01-01

    With the advantage of the reusability property of virtualization technology, users can reuse various types and versions of existing operating systems and drivers in a virtual machine, so as to customize their application environment. In order to prevent users' virtualization environments from being impacted by driver faults in the virtual machine, Chariot examines the correctness of a driver's write operations by combining a driver write-operation capture with the driver's private access control table. However, this method needs to keep the write permission of the shadow page table as read-only, so as to capture the isolated driver's write operations through page faults, which adversely affects the performance of the driver. Based on delaying the setting of frequently used shadow pages' write permissions to read-only, this paper proposes an algorithm that uses a shadow page cache to improve the performance of isolated drivers and carefully studies the relationship between driver performance and the size of the shadow page cache. Experimental results show that, through the shadow page cache, the performance of isolated drivers can be greatly improved without impacting Chariot's reliability too much. PMID:25815373

  14. Prostate-associated gene 4 (PAGE4), an intrinsically disordered cancer/testis antigen, is a novel therapeutic target for prostate cancer

    PubMed Central

    Kulkarni, Prakash; Dunker, A Keith; Weninger, Keith; Orban, John

    2016-01-01

    Prostate-associated gene 4 (PAGE4) is a remarkably prostate-specific Cancer/Testis Antigen that is highly upregulated in the human fetal prostate and its diseased states but not in the adult normal gland. PAGE4 is an intrinsically disordered protein (IDP) that functions as a stress-response protein to suppress reactive oxygen species as well as prevent DNA damage. In addition, PAGE4 is also a transcriptional regulator that potentiates transactivation by the oncogene c-Jun. c-Jun forms the AP-1 complex by heterodimerizing with members of the Fos family and plays an important role in the development and pathology of the prostate gland, underscoring the importance of the PAGE4/c-Jun interaction. HIPK1, also a component of the stress-response pathway, phosphorylates PAGE4 at T51, which is critical for its transcriptional activity. Phosphorylation induces conformational and dynamic switching in the PAGE4 ensemble, leading to a new cellular function. Finally, bioinformatics evidence suggests that the PAGE4 mRNA could be alternatively spliced, resulting in four potential isoforms of the polypeptide and alluding to the possibility of a range of conformational ensembles with latent functions. Considered together, the data suggest that PAGE4 may represent the first molecular link between stress and prostate cancer (PCa). Thus, pharmacologically targeting PAGE4 may be a novel opportunity for treating and managing patients with PCa, especially patients with low-risk disease. PMID:27270343

  15. MARSIS data and simulation exploited using array databases: PlanetServer/EarthServer for sounding radars

    NASA Astrophysics Data System (ADS)

    Cantini, Federico; Pio Rossi, Angelo; Orosei, Roberto; Baumann, Peter; Misev, Dimitar; Oosthoek, Jelmer; Beccati, Alan; Campalani, Piero; Unnithan, Vikram

    2014-05-01

    MARSIS is an orbital synthetic aperture radar for both ionosphere and subsurface sounding on board ESA's Mars Express (Picardi et al. 2005). It transmits electromagnetic pulses centered at 1.8, 3, 4 or 5 MHz that penetrate below the surface and are reflected by compositional and/or structural discontinuities in the subsurface of Mars. MARSIS data are available as a collection of single-orbit data files. The availability of tools for more effective access to such data would greatly ease data analysis and exploitation by the community of users. For this purpose, we are developing a database built on the raster database management system RasDaMan (e.g. Baumann et al., 1994), to be populated with MARSIS data and integrated in the PlanetServer/EarthServer (e.g. Oosthoek et al., 2013; Rossi et al., this meeting) project. The data (and related metadata) are stored in the database for each frequency used by the MARSIS radar. The capability of retrieving data belonging to a certain orbit, or to multiple orbits on the basis of latitude/longitude boundaries, is a key requirement of the database design, allowing, besides the "classical" radargram representation of the data, a 3D extraction, subsetting and analysis of subsurface structures in areas with sufficiently high orbit density. Moreover, the use of the OGC WCPS (Web Coverage Processing Service) standard can allow calculations on database query results for multiple echoes and/or subsets of a certain data product. Because of the low directivity of its dipole antenna, MARSIS receives echoes from portions of the surface of Mars that are distant from nadir and can be mistakenly interpreted as subsurface echoes. For this reason, methods have been developed to simulate surface echoes (e.g. Nouvel et al., 2004), to reveal the true origin of an echo through comparison with instrument data. These simulations are usually time-consuming, and so far have been performed either on a case-by-case basis or in some simplified form. A code for
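
    As an illustration of the kind of server-side processing the OGC WCPS standard enables, the sketch below posts a WCPS expression to a hypothetical rasdaman/petascope endpoint from Python. The endpoint URL, coverage name, axis labels and request encoding are assumptions made for illustration only and do not describe the actual PlanetServer deployment.

import requests

# Hypothetical petascope/WCPS endpoint and coverage name (illustrative only).
ENDPOINT = "http://example.org/rasdaman/ows"

# WCPS expression: average echo power over a lat/lon window of one
# MARSIS-like coverage, computed on the server instead of downloading
# the whole radargram to the client.
wcps_query = """
for c in (marsis_f3_mhz)
return avg(c[Lat(10:12), Long(40:42)])
"""

response = requests.post(ENDPOINT, data={"query": wcps_query}, timeout=60)
response.raise_for_status()
print("Mean echo power in the selected window:", response.text)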

  16. MARSIS data and simulation exploited using array databases: PlanetServer/EarthServer for sounding radars

    NASA Astrophysics Data System (ADS)

    Cantini, Federico; Pio Rossi, Angelo; Orosei, Roberto; Baumann, Peter; Misev, Dimitar; Oosthoek, Jelmer; Beccati, Alan; Campalani, Piero; Unnithan, Vikram

    2014-05-01

    MARSIS is an orbital synthetic aperture radar for both ionosphere and subsurface sounding on board ESA's Mars Express (Picardi et al. 2005). It transmits electromagnetic pulses centered at 1.8, 3, 4 or 5 MHz that penetrate below the surface and are reflected by compositional and/or structural discontinuities in the subsurface of Mars. MARSIS data are available as a collection of single-orbit data files. The availability of tools for more effective access to such data would greatly ease data analysis and exploitation by the community of users. For this purpose, we are developing a database built on the raster database management system RasDaMan (e.g. Baumann et al., 1994), to be populated with MARSIS data and integrated in the PlanetServer/EarthServer (e.g. Oosthoek et al., 2013; Rossi et al., this meeting) project. The data (and related metadata) are stored in the database for each frequency used by the MARSIS radar. The capability of retrieving data belonging to a certain orbit, or to multiple orbits on the basis of latitude/longitude boundaries, is a key requirement of the database design, allowing, besides the "classical" radargram representation of the data, a 3D extraction, subsetting and analysis of subsurface structures in areas with sufficiently high orbit density. Moreover, the use of the OGC WCPS (Web Coverage Processing Service) standard can allow calculations on database query results for multiple echoes and/or subsets of a certain data product. Because of the low directivity of its dipole antenna, MARSIS receives echoes from portions of the surface of Mars that are distant from nadir and can be mistakenly interpreted as subsurface echoes. For this reason, methods have been developed to simulate surface echoes (e.g. Nouvel et al., 2004), to reveal the true origin of an echo through comparison with instrument data. These simulations are usually time-consuming, and so far have been performed either on a case-by-case basis or in some simplified form. A code for

  17. The Medicago truncatula gene expression atlas web server

    PubMed Central

    2009-01-01

    Background Legumes (Leguminosae or Fabaceae) play a major role in agriculture. Transcriptomics studies in the model legume species, Medicago truncatula, are instrumental in helping to formulate hypotheses about the role of legume genes. With the rapid growth of publicly available Affymetrix GeneChip Medicago Genome Array data from a great range of tissues, cell types, growth conditions, and stress treatments, the legume research community desires an effective bioinformatics system to aid efforts to interpret the Medicago genome through functional genomics. We developed the Medicago truncatula Gene Expression Atlas (MtGEA) web server for this purpose. Description The Medicago truncatula Gene Expression Atlas (MtGEA) web server is a centralized platform for analyzing the Medicago transcriptome. Currently, the web server hosts gene expression data from 156 Affymetrix GeneChip® Medicago genome arrays in 64 different experiments, covering a broad range of developmental and environmental conditions. The server enables flexible, multifaceted analyses of transcript data and provides a range of additional information about genes, including different types of annotation and links to the genome sequence, which help users formulate hypotheses about gene function. Transcript data can be accessed using Affymetrix probe identification number, DNA sequence, gene name, functional description in natural language, GO and KEGG annotation terms, and InterPro domain number. Transcripts can also be discovered through co-expression or differential expression analysis. Flexible tools to select a subset of experiments and to visualize and compare expression profiles of multiple genes have been implemented. Data can be downloaded, in part or full, in a tabular form compatible with common analytical and visualization software. The web server will be updated on a regular basis to incorporate new gene expression data and genome annotation, and is accessible at: http

  18. EarthServer - 3D Visualization on the Web

    NASA Astrophysics Data System (ADS)

    Wagner, Sebastian; Herzig, Pasquale; Bockholt, Ulrich; Jung, Yvonne; Behr, Johannes

    2013-04-01

    EarthServer (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, is a project to enable the management, access and exploration of massive, multi-dimensional datasets using Open GeoSpatial Consortium (OGC) query and processing language standards like WCS 2.0 and WCPS. To this end, a server/client architecture designed to handle Petabyte/Exabyte volumes of multi-dimensional data is being developed and deployed. As an important part of the EarthServer project, six Lighthouse Applications, major scientific data exploitation initiatives, are being established to make cross-domain, Earth Sciences related data repositories available in an open and unified manner, as service endpoints based on solutions and infrastructure developed within the project. Client technology developed and deployed in EarthServer ranges from mobile and web clients to immersive virtual reality systems, all designed to interact with a physically and logically distributed server infrastructure using exclusively OGC standards. In this contribution, we would like to present our work on a web-based 3D visualization and interaction client for Earth Sciences data using only technology found in standard web browsers, without requiring the user to install plugins or addons. Additionally, we are able to run the earth data visualization client on a wide range of different platforms with very different software and hardware requirements, such as smart phones (e.g. iOS, Android), different desktop systems etc. High-quality, hardware-accelerated visualization of 3D and 4D content in standard web browsers can be realized now, and we believe it will become more and more common to use this fast, lightweight and ubiquitous platform to provide insights into big datasets without requiring the user to set up a specialized client first. With that in mind, we will also point out some of the limitations we encountered using current web technologies. Underlying the EarthServer web client

  19. Parmodel: a web server for automated comparative modeling of proteins.

    PubMed

    Uchôa, Hugo Brandão; Jorge, Guilherme Eberhart; Freitas Da Silveira, Nelson José; Camera, João Carlos; Canduri, Fernanda; De Azevedo, Walter Filgueira

    2004-12-24

    Parmodel is a web server for automated comparative modeling and evaluation of protein structures. The aim of this tool is to help inexperienced users to perform modeling, assessment, visualization, and optimization of protein models, as well as crystallographers to evaluate structures solved experimentally. It is subdivided into four modules: Parmodel Modeling, Parmodel Assessment, Parmodel Visualization, and Parmodel Optimization. The main module is Parmodel Modeling, which allows the building of several models for the same protein in a reduced time, through the distribution of modeling processes on a Beowulf cluster. Parmodel automates and integrates the main software used in comparative modeling, such as MODELLER, Whatcheck, Procheck, Raster3D, Molscript, and Gromacs. This web server is freely accessible at .

  20. Introducing djatoka: a reuse friendly, open source JPEG image server

    SciTech Connect

    Chute, Ryan M; Van De Sompel, Herbert

    2008-01-01

    The ISO-standardized JPEG 2000 image format has started to attract significant attention. Support for the format is emerging in major consumer applications, and the cultural heritage community seriously considers it a viable format for digital preservation. So far, only commercial image servers with JPEG 2000 support have been available. They come with significant license fees and typically provide the customers with limited extensibility capabilities. Here, we introduce djatoka, an open source JPEG 2000 image server with an attractive basic feature set, and extensibility under control of the community of implementers. We describe djatoka, and point at demonstrations that feature digitized images of marvelous historical manuscripts from the collections of the British Library and the University of Ghent. We also call upon the community to engage in further development of djatoka.
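
    To give a sense of how such an image server is typically consumed by clients, here is a hedged Python sketch of a region request against a hypothetical djatoka deployment. The host, image identifier and OpenURL service parameters are illustrative assumptions and should be checked against the djatoka documentation rather than taken as its exact API.

import requests

# Hypothetical djatoka deployment; the resolver path and parameter names are
# illustrative (verify the exact OpenURL service keys in the djatoka docs).
BASE = "http://example.org/adore-djatoka/resolver"
params = {
    "url_ver": "Z39.88-2004",
    "rft_id": "info:example-repo/ds/some-image-identifier",  # placeholder id
    "svc_id": "info:lanl-repo/svc/getRegion",
    "svc.format": "image/jpeg",
    "svc.level": "2",             # resolution level of the JPEG 2000 pyramid
    "svc.region": "0,0,512,512",  # requested tile (y, x, height, width)
}

resp = requests.get(BASE, params=params, timeout=30)
resp.raise_for_status()
with open("tile.jpg", "wb") as fh:
    fh.write(resp.content)  # save the decoded region as a JPEG tile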

  1. Peptiderive server: derive peptide inhibitors from protein-protein interactions.

    PubMed

    Sedan, Yuval; Marcu, Orly; Lyskov, Sergey; Schueler-Furman, Ora

    2016-07-01

    The Rosetta Peptiderive protocol identifies, in a given structure of a protein-protein interaction, the linear polypeptide segment suggested to contribute most to binding energy. Interactions that feature a 'hot segment', a linear peptide with significant binding energy compared to that of the complex, may be amenable for inhibition and the peptide sequence and structure derived from the interaction provide a starting point for rational drug design. Here we present a web server for Peptiderive, which is incorporated within the ROSIE web interface for Rosetta protocols. A new feature of the protocol also evaluates whether derived peptides are good candidates for cyclization. Fast computation times and clear visualization allow users to quickly assess the interaction of interest. The Peptiderive server is available for free use at http://rosie.rosettacommons.org/peptiderive. PMID:27141963

  2. The HADDOCK web server for data-driven biomolecular docking.

    PubMed

    de Vries, Sjoerd J; van Dijk, Marc; Bonvin, Alexandre M J J

    2010-05-01

    Computational docking is the prediction or modeling of the three-dimensional structure of a biomolecular complex, starting from the structures of the individual molecules in their free, unbound form. HADDOCK is a popular docking program that takes a data-driven approach to docking, with support for a wide range of experimental data. Here we present the HADDOCK web server protocol, facilitating the modeling of biomolecular complexes for a wide community. The main web interface is user-friendly, requiring only the structures of the individual components and a list of interacting residues as input. Additional web interfaces allow the more advanced user to exploit the full range of experimental data supported by HADDOCK and to customize the docking process. The HADDOCK server has access to the resources of a dedicated cluster and of the e-NMR GRID infrastructure. Therefore, a typical docking run takes only a few minutes to prepare and a few hours to complete.

  3. High performance medical image processing in client/server-environments.

    PubMed

    Mayer, A; Meinzer, H P

    1999-03-01

    As 3D scanning devices like computer tomography (CT) or magnetic resonance imaging (MRI) become more widespread, there is also an increasing need for powerful computers that can handle the enormous amounts of data with acceptable response times. We describe an approach to parallelize some of the more frequently used image processing operators on distributed memory architectures. It is desirable to make such specialized machines accessible on a network, in order to save costs by sharing resources. We present a client/server approach that is specifically tailored to the interactive work with volume data. Our image processing server implements a volume visualization method that allows the user to assess the segmentation of anatomical structures. We can enhance the presentation by combining the volume visualizations on a viewing station with additional graphical elements, which can be manipulated in real-time. The methods presented were verified on two applications for different domains. PMID:10094225

  4. User Evaluation of the NASA Technical Report Server Recommendation Service

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.

    2004-01-01

    We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the most 'quality' recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.
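
    The VSM methodology described above (cosine similarity over metadata, with the most similar documents returned as recommendations) can be sketched in a few lines. The sketch below is a toy illustration with invented documents, not the NTRS implementation; scikit-learn's TF-IDF weighting is assumed here as a stand-in for whatever term weighting was actually used.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy metadata records standing in for report abstracts (illustrative only).
docs = {
    "doc-a": "wind tunnel test of a supercritical airfoil",
    "doc-b": "supercritical airfoil pressure distribution measurements",
    "doc-c": "spacecraft attitude control using reaction wheels",
    "doc-d": "reaction wheel momentum management for spacecraft",
}

ids = list(docs)
tfidf = TfidfVectorizer().fit_transform(docs.values())
sims = cosine_similarity(tfidf)

def recommend(doc_id, top_n=2):
    """Return the top_n most similar documents, excluding the query itself."""
    i = ids.index(doc_id)
    ranked = sorted(range(len(ids)), key=lambda j: sims[i, j], reverse=True)
    return [ids[j] for j in ranked if j != i][:top_n]

print(recommend("doc-a"))  # expected: the other airfoil document ranks first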

  5. Experience of public procurement of Open Compute servers

    NASA Astrophysics Data System (ADS)

    Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony

    2015-12-01

    The Open Compute Project, OCP (http://www.opencompute.org/), was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal to develop servers and data centres following the model traditionally associated with open source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large scale installation. One objective is to evaluate if the OCP market is sufficiently mature and broad enough to meet the constraints of a public procurement. This paper summarizes this procurement, which started in September 2014 and involved a Request for Information (RFI) to qualify bidders and a Request for Tender (RFT).

  6. An empirical performance analysis of commodity memories in commodity servers

    SciTech Connect

    Kerbyson, D. J.; Lang, M. K.; Patino, G.

    2004-01-01

    This work details a performance study of six different commodity memories in two commodity server nodes on a number of microbenchmarks that measure low-level performance characteristics, as well as on two applications representative of the ASCI workload. The memories vary both in terms of performance, including latency and bandwidths, and also in terms of their physical properties and manufacturer. Two server nodes were used: one Itanium-II Madison based system, and one Xeon based system. All the memories examined can be used within both processing nodes. This allows the performance of the memories to be directly examined while keeping all other factors within a processing node the same (processor, motherboard, operating system etc.). The results of this study show that there can be a significant difference in application performance from the different memories - by as much as 20%. Thus, by choosing the most appropriate memory for a processing node at a minimal cost differential, significantly improved performance may be achievable.

  7. The widest practicable dissemination: The NASA technical report server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Binkley, Robert L.; Kellogg, Yvonne D.; Paulson, Sharon S.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael; Accomazzi, Alberto

    1995-01-01

    The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the services over the initial 6-month period. The NTRS is largely constructed with freely available software running on existing hardware. NTRS builds upon existing hardware and software, and the resulting additional exposure for the body of literature contained will allow NASA to ensure that its institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.

  8. Design of Accelerator Online Simulator Server Using Structured Data

    SciTech Connect

    Shen, Guobao; Chu, Chungming; Wu, Juhao; Kraimer, Martin; /Argonne

    2012-07-06

    Model based control plays an important role for a modern accelerator during beam commissioning, beam study, and even daily operation. With a realistic model, beam behaviour can be predicted and therefore effectively controlled. The approach used by most current high level application environments is to use a built-in simulation engine and feed a realistic model into that simulation engine. Instead of this traditional monolithic structure, a new approach using a client-server architecture is under development. An on-line simulator server is accessed via network accessible structured data. With this approach, a user can easily access multiple simulation codes. This paper describes the design, implementation, and current status of PVData, which defines the structured data, and PVAccess, which provides network access to the structured data.

  9. Running the Sloan Digital Sky Survey data archive server

    SciTech Connect

    Neilsen, Eric H., Jr.; Stoughton, Chris; /Fermilab

    2006-11-01

    The Sloan Digital Sky Survey (SDSS) Data Archive Server (DAS) provides public access to over 12 Tb of data in 17 million files produced by the SDSS data reduction pipeline. Many tasks which seem trivial when serving smaller, less complex data sets present challenges when serving data of this volume and technical complexity. The included output files should be chosen to support as much science as possible from publicly released data, and only publicly released data. Users must have the resources needed to read and interpret the data correctly. Server administrators must generate new data releases at regular intervals, monitor usage, quickly recover from hardware failures, and monitor the data served by the DAS both for content and corruption. We discuss these challenges, describe tools we use to administer and support the DAS, and discuss future development plans.

  10. DSP: a protein shape string and its profile prediction server.

    PubMed

    Sun, Jiangming; Tang, Shengnan; Xiong, Wenwei; Cong, Peisheng; Li, Tonghua

    2012-07-01

    Many studies have demonstrated that shape string is an extremely important structure representation, since it is more complete than the classical secondary structure. The shape string provides detailed information also in the regions denoted random coil. But few services are provided for systematic analysis of protein shape string. To fill this gap, we have developed an accurate shape string predictor based on two innovative technologies: a knowledge-driven sequence alignment and a sequence shape string profile method. The performance on blind test data demonstrates that the proposed method can be used for accurate prediction of protein shape string. The DSP server provides both predicted shape string and sequence shape string profile for each query sequence. Using this information, the users can compare protein structure or display protein evolution in shape string space. The DSP server is available at both http://cheminfo.tongji.edu.cn/dsp/ and its main mirror http://chemcenter.tongji.edu.cn/dsp/.

  11. User Evaluation of the NASA Technical Report Server Recommendation Service

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.

    2004-01-01

    We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the most 'quality' recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.

  12. Running the Sloan Digital Sky Survey Data Archive Server

    NASA Astrophysics Data System (ADS)

    Neilsen, E. H., Jr.; Stoughton, C.

    2007-10-01

    The Sloan Digital Sky Survey (SDSS) Data Archive Server (DAS) provides public access to over 12 Tb of data in 17 million files produced by the SDSS data reduction pipeline. Many tasks that seem trivial when serving smaller, less complex data sets present challenges when serving data of this volume and technical complexity. The included output files should be chosen to support as much science as possible from publicly released data, and only publicly released data. Users must have the resources needed to read and interpret the data correctly. Server administrators must generate new data releases at regular intervals, monitor usage, quickly recover from hardware failures, and monitor the data served by the DAS both for content and corruption. We discuss these challenges, describe tools we use to administer and support the DAS, and discuss future development plans.

  13. Peptiderive server: derive peptide inhibitors from protein–protein interactions

    PubMed Central

    Sedan, Yuval; Marcu, Orly; Lyskov, Sergey; Schueler-Furman, Ora

    2016-01-01

    The Rosetta Peptiderive protocol identifies, in a given structure of a protein–protein interaction, the linear polypeptide segment suggested to contribute most to binding energy. Interactions that feature a ‘hot segment’, a linear peptide with significant binding energy compared to that of the complex, may be amenable for inhibition and the peptide sequence and structure derived from the interaction provide a starting point for rational drug design. Here we present a web server for Peptiderive, which is incorporated within the ROSIE web interface for Rosetta protocols. A new feature of the protocol also evaluates whether derived peptides are good candidates for cyclization. Fast computation times and clear visualization allow users to quickly assess the interaction of interest. The Peptiderive server is available for free use at http://rosie.rosettacommons.org/peptiderive. PMID:27141963

  14. Berkeley Phylogenomics Group web servers: resources for structural phylogenomic analysis.

    PubMed

    Glanville, Jake Gunn; Kirshner, Dan; Krishnamurthy, Nandini; Sjölander, Kimmen

    2007-07-01

    Phylogenomic analysis addresses the limitations of function prediction based on annotation transfer, and has been shown to enable the highest accuracy in prediction of protein molecular function. The Berkeley Phylogenomics Group provides a series of web servers for phylogenomic analysis: classification of sequences to pre-computed families and subfamilies using the PhyloFacts Phylogenomic Encyclopedia, FlowerPower clustering of proteins sharing the same domain architecture, MUSCLE multiple sequence alignment, SATCHMO simultaneous alignment and tree construction and SCI-PHY subfamily identification. The PhyloBuilder web server provides an integrated phylogenomic pipeline starting with a user-supplied protein sequence, proceeding to homolog identification, multiple alignment, phylogenetic tree construction, subfamily identification and structure prediction. The Berkeley Phylogenomics Group resources are available at http://phylogenomics.berkeley.edu.

  15. GrayStarServer: Server-side Spectrum Synthesis with a Browser-based Client-side User Interface

    NASA Astrophysics Data System (ADS)

    Short, C. Ian

    2016-10-01

    We present GrayStarServer (GSS), a stellar atmospheric modeling and spectrum synthesis code of pedagogical accuracy that is accessible in any web browser on commonplace computational devices and that runs on a timescale of a few seconds. The addition of spectrum synthesis annotated with line identifications extends the functionality and pedagogical applicability of GSS beyond that of its predecessor, GrayStar3 (GS3). The spectrum synthesis is based on a line list acquired from the NIST atomic spectra database, and the GSS post-processing and user interface client allows the user to inspect the plain text ASCII version of the line list, as well as to apply macroscopic broadening. Unlike GS3, GSS carries out the physical modeling on the server side in Java, and communicates with the JavaScript and HTML client via an asynchronous HTTP request. We also describe other improvements beyond GS3 such as a more physical treatment of background opacity and atmospheric physics, the comparison of key results with those of the Phoenix code, and the use of the HTML <canvas> element for higher quality plotting and rendering of results. We also present LineListServer, a Java code for converting custom ASCII line lists in NIST format to the byte data type file format required by GSS so that users can prepare their own custom line lists. We propose a standard for marking up and packaging model atmosphere and spectrum synthesis output for data transmission and storage that will facilitate a web-based approach to stellar atmospheric modeling and spectrum synthesis. We describe some pedagogical demonstrations and exercises enabled by easily accessible, on-demand, responsive spectrum synthesis. GSS may serve as a research support tool by providing quick spectroscopic reconnaissance. GSS may be found at www.ap.smu.ca/~ishort/OpenStars/GrayStarServer/grayStarServer.html, and source tarballs for local installations of both GSS and LineListServer may be found at www.ap.smu.ca/~ishort/OpenStars/.

  16. Architecture: client/server moves into managed healthcare.

    PubMed

    Worthington, R

    1997-01-01

    The healthcare industry is in transition from indemnity-based products to managed care during a period marked by consolidation, competitiveness and increasingly demanding consumers. This powerful combination of industry change and customer interest requires more efficient operations and flexible information systems. Host-based managed care systems are running into limitations meeting business needs, creating a demand for client/server architectures. PMID:10164671

  17. Sharing limited Ethernet resources with a client-server model

    NASA Astrophysics Data System (ADS)

    Brownless, D. M.; Burton, P. D.

    1994-12-01

    The new control system proposed for the ISIS facility at Rutherford uses an Ethernet spine to provide mutual communications between disparate equipment, including the control computers. This paper describes the limitations imposed on the use of Ethernet in Local/Wide Area Networks and how a client-server based system can be used to circumvent them. The actual system we developed is discussed with particular reference to the problems we have faced, implementing data standards and the performance statistics attained.

  18. 78 FR 51265 - 30-Day Notice of Proposed Information Collection: Application for Additional Visa Pages or...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-20

    ... Notice of Proposed Information Collection: Application for Additional Visa Pages or Miscellaneous...: Title of Information Collection: Application for Additional Visa Pages or Miscellaneous Passport... applies for the addition of visa pages to that passport, the Department must confirm the...

  19. Does the Op-Ed Page Have a Chance to Become a Public Forum?

    ERIC Educational Resources Information Center

    Ciofalo, Andrew; Traverso, Kim

    1994-01-01

    Surveys op-ed page editors, finding that fewer than half of the responding papers have op-ed pages; that professional journalists, public figures, and propagandists dominate the pages; and that editors firmly control the agenda. (SR)

  20. Barcode server: a visualization-based genome analysis system.

    PubMed

    Mao, Fenglou; Olman, Victor; Wang, Yan; Xu, Ying

    2013-01-01

    We have previously developed a computational method for representing a genome as a barcode image, which makes various genomic features visually apparent. We have demonstrated that this visual capability has made some challenging genome analysis problems relatively easy to solve. We have applied this capability to a number of challenging problems, including (a) identification of horizontally transferred genes, (b) identification of genomic islands with special properties and (c) binning of metagenomic sequences, and achieved highly encouraging results. These application results inspired us to develop this barcode-based genome analysis server for public service, which supports the following capabilities: (a) calculation of the k-mer based barcode image for a provided DNA sequence; (b) detection of sequence fragments in a given genome with distinct barcodes from those of the majority of the genome, (c) clustering of provided DNA sequences into groups having similar barcodes; and (d) homology-based search using Blast against a genome database for any selected genomic regions deemed to have interesting barcodes. The barcode server provides a job management capability, allowing processing of a large number of analysis jobs for barcode-based comparative genome analyses. The barcode server is accessible at http://csbl1.bmb.uga.edu/Barcode. PMID:23457606
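
    A minimal sketch of capability (a), computing a k-mer based barcode for a provided DNA sequence, is given below. The windowing scheme, the value of k and the normalization are illustrative assumptions; the server's exact barcode definition may differ.

from itertools import product

def kmer_barcode(sequence, k=2, window=1000):
    """Compute a simple k-mer frequency 'barcode': one row of normalized
    k-mer counts per fixed-size window of the sequence (illustrative sketch,
    not the server's exact definition)."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    rows = []
    for start in range(0, len(sequence) - window + 1, window):
        frag = sequence[start:start + window]
        counts = {km: 0 for km in kmers}
        for i in range(len(frag) - k + 1):
            km = frag[i:i + k]
            if km in counts:
                counts[km] += 1
        total = max(sum(counts.values()), 1)
        rows.append([counts[km] / total for km in kmers])
    return rows  # visualize as a grayscale image: windows x k-mer columns

barcode = kmer_barcode("ACGT" * 1000, k=2, window=800)
print(len(barcode), "windows,", len(barcode[0]), "k-mer columns")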

  1. GPCR & company: databases and servers for GPCRs and interacting partners.

    PubMed

    Kowalsman, Noga; Niv, Masha Y

    2014-01-01

    G-protein-coupled receptors (GPCRs) are a large superfamily of membrane receptors that are involved in a wide range of signaling pathways. To fulfill their tasks, GPCRs interact with a variety of partners, including small molecules, lipids and proteins. They are accompanied by different proteins during all phases of their life cycle. Therefore, GPCR interactions with their partners are of great interest in basic cell-signaling research and in drug discovery. Due to the rapid development of computers and internet communication, knowledge and data can be easily shared within the worldwide research community via freely available databases and servers. These provide an abundance of biological, chemical and pharmacological information. This chapter describes the available web resources for investigating GPCR interactions. We review about 40 freely available databases and servers, and provide a few sentences about the essence and the data they supply. For simplification, the databases and servers were grouped under the following topics: general GPCR-ligand interactions; particular families of GPCRs and their ligands; GPCR oligomerization; GPCR interactions with intracellular partners; and structural information on GPCRs. In conclusion, a multitude of useful tools are currently available. Summary tables are provided to ease navigation between the numerous and partially overlapping resources. Suggestions for future enhancements of the online tools include the addition of links from general to specialized databases and enabling the use of user-supplied templates for GPCR structural modeling. PMID:24158806

  2. SARA: a server for function annotation of RNA structures.

    PubMed

    Capriotti, Emidio; Marti-Renom, Marc A

    2009-07-01

    Recent interest in non-coding RNA transcripts has resulted in a rapid increase of deposited RNA structures in the Protein Data Bank. However, a characterization and functional classification of the RNA structure and function space have only been partially addressed. Here, we introduce the SARA program for pair-wise alignment of RNA structures as a web server for structure-based RNA function assignment. The SARA server relies on the SARA program, which aligns two RNA structures based on a unit-vector root-mean-square approach. The likely accuracy of the SARA alignments is assessed by three different P-values estimating the statistical significance of the sequence, secondary structure and tertiary structure identity scores, respectively. Our benchmarks, which relied on a set of 419 RNA structures with known SCOR structural class, indicate that at a negative logarithm of mean P-value higher or equal than 2.5, SARA can assign the correct or a similar SCOR class to 81.4% and 95.3% of the benchmark set, respectively. The SARA server is freely accessible via the World Wide Web at http://sgu.bioinfo.cipf.es/services/SARA/.

  3. Mobile object retrieval in server-based image databases

    NASA Astrophysics Data System (ADS)

    Manger, D.; Pagel, F.; Widak, H.

    2013-05-01

    The increasing number of mobile phones equipped with powerful cameras leads to huge collections of user-generated images. To utilize the information of the images on site, image retrieval systems are becoming more and more popular to search for similar objects in an own image database. As the computational performance and the memory capacity of mobile devices are constantly increasing, this search can often be performed on the device itself. This is feasible, for example, if the images are represented with global image features or if the search is done using EXIF or textual metadata. However, for larger image databases, if multiple users are meant to contribute to a growing image database or if powerful content-based image retrieval methods with local features are required, a server-based image retrieval backend is needed. In this work, we present a content-based image retrieval system with a client server architecture working with local features. On the server side, the scalability to large image databases is addressed with the popular bag-of-word model with state-of-the-art extensions. The client end of the system focuses on a lightweight user interface presenting the most similar images of the database highlighting the visual information which is common with the query image. Additionally, new images can be added to the database making it a powerful and interactive tool for mobile content-based image retrieval.
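
    The server-side bag-of-words pipeline mentioned above can be sketched as follows. The descriptors here are random stand-ins for real local features (e.g. SIFT/ORB), and the vocabulary size and similarity measure are illustrative choices, not the parameters of the system described in the paper.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stand-ins for local feature descriptors (e.g. 32-D ORB/SIFT-like vectors);
# in a real system these would come from a feature detector, not random data.
db_descriptors = [rng.normal(size=(200, 32)) for _ in range(5)]   # 5 images
query_descriptors = rng.normal(size=(180, 32))

# 1. Build the visual vocabulary by clustering all database descriptors.
vocab = KMeans(n_clusters=50, n_init=4, random_state=0).fit(
    np.vstack(db_descriptors))

def bow_histogram(desc):
    """Quantize descriptors against the vocabulary into a normalized histogram."""
    words = vocab.predict(desc)
    hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
    return hist / np.linalg.norm(hist)

# 2. Rank database images by cosine similarity of their BoW histograms.
db_hists = np.array([bow_histogram(d) for d in db_descriptors])
scores = db_hists @ bow_histogram(query_descriptors)
print("ranking of database images:", np.argsort(scores)[::-1])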

  4. [Communication server in the hospital--advantages, expenses and limitations].

    PubMed

    Jendrysiak, U

    1997-01-01

    The common situation in a hospital with multiple departments is a heterogeneous set of subsystems, one or more for each department. Today, there is a rising number of requests for information interchange between these independent systems. The exchange of patient data has a technical and a conceptual part. Establishing a connection between more than two subsystems requires links from one system to all the others, each of them with its own code translation, interface and message transfer. A communication server is an important tool for significantly reducing the amount of work for the technical realisation. It reduces the number of interfaces, facilitates the definition, maintenance and documentation of the message structure and translation tables, and helps to keep control of the message pipelines. Existing interfaces can be adapted for similar purposes. However, a communication server needs a lot of configuration, and it is necessary to know about low-level internetworking on different hardware and software to take advantage of its features. The code for writing files on a remote system and for process communication via TCP/IP sockets or similar techniques has to be written specifically for each communication task. Initial experience has been gained at the university school of medicine in Mainz in setting up a communication server to connect different departments. We also made a checklist for the selection of such a product. PMID:9381841

  5. Mining the SDSS SkyServer SQL queries log

    NASA Astrophysics Data System (ADS)

    Hirota, Vitor M.; Santos, Rafael; Raddick, Jordan; Thakar, Ani

    2016-05-01

    SkyServer, the Internet portal for the Sloan Digital Sky Survey (SDSS) astronomical catalog, provides a set of tools that allows data access for astronomers and scientific education. One of SkyServer's data access interfaces allows users to enter ad-hoc SQL statements to query the catalog. SkyServer also presents some template queries that can be used as a basis for more complex queries. This interface has logged over 330 million queries submitted since 2001. It is expected that analysis of this data can be used to investigate usage patterns, identify potential new classes of queries, find similar queries, etc., and to shed some light on how users interact with the Sloan Digital Sky Survey data and how scientists have adopted the new paradigm of e-Science, which could in turn lead to enhancements of the user interfaces and experience in general. In this paper we review some approaches to SQL query mining, apply the traditional techniques used in the literature and present lessons learned, namely, that the general text mining approach to feature extraction and clustering does not seem to be adequate for this type of data and, most importantly, that this type of analysis can result in very different queries being clustered together.
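
    A toy version of the feature-extraction-and-clustering approach discussed above is sketched below. The normalization rules, vectorizer settings and number of clusters are illustrative assumptions, and the four queries are invented stand-ins for real SkyServer log entries.

import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Tiny stand-in for SQL query log entries (illustrative only).
queries = [
    "SELECT ra, dec FROM PhotoObj WHERE r < 18",
    "SELECT ra, dec, u, g FROM PhotoObj WHERE g - r > 0.5",
    "SELECT z FROM SpecObj WHERE class = 'QSO'",
    "SELECT z, zErr FROM SpecObj WHERE zWarning = 0",
]

def normalize(sql):
    """Crude normalization: lowercase and replace literals with placeholders."""
    sql = sql.lower()
    sql = re.sub(r"'[^']*'", "str", sql)           # string literals
    sql = re.sub(r"\b\d+(\.\d+)?\b", "num", sql)   # numeric literals
    return sql

vectors = TfidfVectorizer(token_pattern=r"[a-z_]+").fit_transform(
    normalize(q) for q in queries)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
for q, lab in zip(queries, labels):
    print(lab, q)  # queries on the same table should tend to share a cluster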

  6. (PS)2: protein structure prediction server version 3.0.

    PubMed

    Huang, Tsun-Tsao; Hwang, Jenn-Kang; Chen, Chu-Huang; Chu, Chih-Sheng; Lee, Chi-Wen; Chen, Chih-Chieh

    2015-07-01

    Protein complexes are involved in many biological processes. Examining coupling between subunits of a complex would be useful to understand the molecular basis of protein function. Here, our updated (PS)(2) web server predicts the three-dimensional structures of protein complexes based on comparative modeling; furthermore, this server examines the coupling between subunits of the predicted complex by combining structural and evolutionary considerations. The predicted complex structure could be indicated and visualized by Java-based 3D graphics viewers and the structural and evolutionary profiles are shown and compared chain-by-chain. For each subunit, considerations with or without the packing contribution of other subunits cause the differences in similarities between structural and evolutionary profiles, and these differences imply which form, complex or monomeric, is preferred in the biological condition for the subunit. We believe that the (PS)(2) server would be a useful tool for biologists who are interested not only in the structures of protein complexes but also in the coupling between subunits of the complexes. The (PS)(2) is freely available at http://ps2v3.life.nctu.edu.tw/. PMID:25943546

  7. (PS)2: protein structure prediction server version 3.0.

    PubMed

    Huang, Tsun-Tsao; Hwang, Jenn-Kang; Chen, Chu-Huang; Chu, Chih-Sheng; Lee, Chi-Wen; Chen, Chih-Chieh

    2015-07-01

    Protein complexes are involved in many biological processes. Examining coupling between subunits of a complex would be useful to understand the molecular basis of protein function. Here, our updated (PS)(2) web server predicts the three-dimensional structures of protein complexes based on comparative modeling; furthermore, this server examines the coupling between subunits of the predicted complex by combining structural and evolutionary considerations. The predicted complex structure could be indicated and visualized by Java-based 3D graphics viewers and the structural and evolutionary profiles are shown and compared chain-by-chain. For each subunit, considerations with or without the packing contribution of other subunits cause the differences in similarities between structural and evolutionary profiles, and these differences imply which form, complex or monomeric, is preferred in the biological condition for the subunit. We believe that the (PS)(2) server would be a useful tool for biologists who are interested not only in the structures of protein complexes but also in the coupling between subunits of the complexes. The (PS)(2) is freely available at http://ps2v3.life.nctu.edu.tw/.

  8. A distributed clients/distributed servers model for STARCAT

    NASA Technical Reports Server (NTRS)

    Pirenne, B.; Albrecht, M. A.; Durand, D.; Gaudet, S.

    1992-01-01

    STARCAT, the Space Telescope ARchive and CATalogue user interface, has been around for a number of years. During this time it has been enhanced and augmented in a number of different fields. Here, we would like to dwell on a new capability allowing geographically distributed user interfaces to connect to geographically distributed data servers. This new concept permits users anywhere on the Internet running STARCAT on their local hardware to access, e.g., whichever of the 3 existing HST archive sites is available, to get information on the CFHT archive through a transparent connection to the CADC in BC, or to get the La Silla weather by connecting to the ESO database in Munich during the same session. Similarly, PreView (or quick-look) images and spectra will also flow directly to the user from wherever they are available. Moving towards an 'X'-based STARCAT is another goal being pursued: a graphic/image server and a help/doc server are currently being added to it. They should further enhance user independence and access transparency.

  9. GPCR & company: databases and servers for GPCRs and interacting partners.

    PubMed

    Kowalsman, Noga; Niv, Masha Y

    2014-01-01

    G-protein-coupled receptors (GPCRs) are a large superfamily of membrane receptors that are involved in a wide range of signaling pathways. To fulfill their tasks, GPCRs interact with a variety of partners, including small molecules, lipids and proteins. They are accompanied by different proteins during all phases of their life cycle. Therefore, GPCR interactions with their partners are of great interest in basic cell-signaling research and in drug discovery. Due to the rapid development of computers and internet communication, knowledge and data can be easily shared within the worldwide research community via freely available databases and servers. These provide an abundance of biological, chemical and pharmacological information. This chapter describes the available web resources for investigating GPCR interactions. We review about 40 freely available databases and servers, and provide a few sentences about the essence and the data they supply. For simplification, the databases and servers were grouped under the following topics: general GPCR-ligand interactions; particular families of GPCRs and their ligands; GPCR oligomerization; GPCR interactions with intracellular partners; and structural information on GPCRs. In conclusion, a multitude of useful tools are currently available. Summary tables are provided to ease navigation between the numerous and partially overlapping resources. Suggestions for future enhancements of the online tools include the addition of links from general to specialized databases and enabling the use of user-supplied templates for GPCR structural modeling.

  10. The Biosphere 2 Global Change Testbed world wide web server: closed system research and education using the Internet

    PubMed

    Tosteson, J L; Marino, B D

    1996-01-01

    At this time, a fully materially closed system of large scale and complexity has not yet been built. However, Biosphere 2--a unique "living" Earth laboratory--is an example of a large (3.15 acres) and biologically complex (several thousand terrestrial plant species) system that can be operated with minimal exchange of ambient substances (annual exchange of materials is estimated to be approximately 10%). Biosphere 2 provides a multidisciplinary platform for scientific studies related to both Earth system processes and microcosms of the Earth that may be transported into space. The scale and versatility of the facility make Biosphere 2 a unique place to support integrated research and educational activities. The Biosphere 2 Global Change Testbed world wide web server has been developed to facilitate such activities by disseminating information about the facility, as well as current research and education efforts. Currently, these efforts focus on studies of carbon and other elemental cycles, coral reef ecology and physiology, stable isotopic research, studies in biodiversity, and ecophysiological studies of plant responses to elevated CO2. The Biosphere 2 Global Change Testbed web server is briefly described, and goals for use of the server to promote research and education endeavors are outlined.

  11. A Web Server and Mobile App for Computing Hemolytic Potency of Peptides

    NASA Astrophysics Data System (ADS)

    Chaudhary, Kumardeep; Kumar, Ritesh; Singh, Sandeep; Tuknait, Abhishek; Gautam, Ankur; Mathur, Deepika; Anand, Priya; Varshney, Grish C.; Raghava, Gajendra P. S.

    2016-03-01

    Numerous therapeutic peptides do not enter the clinical trials just because of their high hemolytic activity. Recently, we developed a database, Hemolytik, for maintaining experimentally validated hemolytic and non-hemolytic peptides. The present study describes a web server and mobile app developed for predicting, and screening of peptides having hemolytic potency. Firstly, we generated a dataset HemoPI-1 that contains 552 hemolytic peptides extracted from Hemolytik database and 552 random non-hemolytic peptides (from Swiss-Prot). The sequence analysis of these peptides revealed that certain residues (e.g., L, K, F, W) and motifs (e.g., “FKK”, “LKL”, “KKLL”, “KWK”, “VLK”, “CYCR”, “CRR”, “RFC”, “RRR”, “LKKL”) are more abundant in hemolytic peptides. Therefore, we developed models for discriminating hemolytic and non-hemolytic peptides using various machine learning techniques and achieved more than 95% accuracy. We also developed models for discriminating peptides having high and low hemolytic potential on different datasets called HemoPI-2 and HemoPI-3. In order to serve the scientific community, we developed a web server, mobile app and JAVA-based standalone software (http://crdd.osdd.net/raghava/hemopi/).
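
    A minimal sketch of the kind of sequence-based model the abstract describes is shown below. The feature set (simple amino-acid composition), the classifier and the four toy peptides are all invented for illustration; HemoPI's actual datasets, features and models are those described in the paper and on the server.

from collections import Counter
from sklearn.ensemble import RandomForestClassifier

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(peptide):
    """Fraction of each of the 20 amino acids in the peptide (feature vector)."""
    counts = Counter(peptide)
    n = len(peptide)
    return [counts.get(aa, 0) / n for aa in AMINO_ACIDS]

# Toy labelled peptides (invented); 1 = hemolytic, 0 = non-hemolytic.
train = [("FLKKLKKLL", 1), ("KWKLFKKIG", 1), ("GSGSGTDQS", 0), ("APQTDENNA", 0)]
X = [composition(seq) for seq, _ in train]
y = [label for _, label in train]

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([composition("KKLLFKWKK")]))  # rich in K/L/F/W residues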

  12. The RING 2.0 web server for high quality residue interaction networks.

    PubMed

    Piovesan, Damiano; Minervini, Giovanni; Tosatto, Silvio C E

    2016-07-01

    Residue interaction networks (RINs) are an alternative way of representing protein structures where nodes are residues and arcs physico-chemical interactions. RINs have been extensively and successfully used for analysing mutation effects, protein folding, domain-domain communication and catalytic activity. Here we present RING 2.0, a new version of the RING software for the identification of covalent and non-covalent bonds in protein structures, including π-π stacking and π-cation interactions. RING 2.0 is extremely fast and generates both intra and inter-chain interactions including solvent and ligand atoms. The generated networks are very accurate and reliable thanks to a complex empirical re-parameterization of distance thresholds performed on the entire Protein Data Bank. By default, RING output is generated with optimal parameters but the web server provides an exhaustive interface to customize the calculation. The network can be visualized directly in the browser or in Cytoscape. Alternatively, the RING-Viz script for Pymol allows visualizing the interactions at atomic level in the structure. The web server and RING-Viz, together with an extensive help and tutorial, are available from URL: http://protein.bio.unipd.it/ring.
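
    To illustrate what a residue interaction network looks like once exported from a tool such as RING, here is a small sketch using networkx. The residue identifiers, edge list and interaction-type labels are invented for illustration and are not RING's exact output format.

import networkx as nx

# Hypothetical edges as might be exported from a RIN tool:
# (residue_a, residue_b, interaction type). Identifiers are illustrative.
edges = [
    ("A:45:HIS", "A:92:ASP", "HBOND"),
    ("A:45:HIS", "A:120:PHE", "PICATION"),
    ("A:92:ASP", "A:93:GLY", "COVALENT"),
]

rin = nx.Graph()
for a, b, kind in edges:
    rin.add_edge(a, b, interaction=kind)

# Simple analyses one might run on a residue interaction network:
print("degree of A:45:HIS:", rin.degree["A:45:HIS"])
print("non-covalent edges:",
      [(u, v) for u, v, d in rin.edges(data=True) if d["interaction"] != "COVALENT"])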

  13. A Web Server and Mobile App for Computing Hemolytic Potency of Peptides.

    PubMed

    Chaudhary, Kumardeep; Kumar, Ritesh; Singh, Sandeep; Tuknait, Abhishek; Gautam, Ankur; Mathur, Deepika; Anand, Priya; Varshney, Grish C; Raghava, Gajendra P S

    2016-01-01

    Numerous therapeutic peptides fail to enter clinical trials solely because of their high hemolytic activity. Recently, we developed a database, Hemolytik, for maintaining experimentally validated hemolytic and non-hemolytic peptides. The present study describes a web server and mobile app developed for predicting and screening peptides with hemolytic potency. Firstly, we generated a dataset, HemoPI-1, that contains 552 hemolytic peptides extracted from the Hemolytik database and 552 random non-hemolytic peptides (from Swiss-Prot). The sequence analysis of these peptides revealed that certain residues (e.g., L, K, F, W) and motifs (e.g., "FKK", "LKL", "KKLL", "KWK", "VLK", "CYCR", "CRR", "RFC", "RRR", "LKKL") are more abundant in hemolytic peptides. Therefore, we developed models for discriminating hemolytic from non-hemolytic peptides using various machine learning techniques and achieved more than 95% accuracy. We also developed models for discriminating peptides with high and low hemolytic potential on different datasets, called HemoPI-2 and HemoPI-3. In order to serve the scientific community, we developed a web server, mobile app and Java-based standalone software (http://crdd.osdd.net/raghava/hemopi/). PMID:26953092

  14. wwLigCSRre: a 3D ligand-based server for hit identification and optimization

    PubMed Central

    Sperandio, O.; Petitjean, M.; Tuffery, P.

    2009-01-01

    The wwLigCSRre web server performs ligand-based screening using a 3D molecular similarity engine. Its aim is to provide an online versatile facility to assist the exploration of the chemical similarity of families of compounds, or to propose some scaffold hopping from a query compound. The service allows the user to screen several chemically diversified focused banks, such as Kinase-, CNS-, GPCR-, Ion-channel-, Antibacterial-, Anticancer- and Analgesic-focused libraries. The server also provides the possibility to screen the DrugBank and DSSTOX/Carcinogenic compounds databases. User banks can also be downloaded. The 3D similarity search combines both geometrical (3D) and physicochemical information. Starting from one 3D ligand molecule as query, the screening of such databases can unravel new compound scaffolds as hits or help to optimize previously identified hit molecules in a SAR (structure-activity relationship) project. wwLigCSRre can be accessed at http://bioserv.rpbs.univ-paris-diderot.fr/wwLigCSRre.html. PMID:19429687

  15. The RING 2.0 web server for high quality residue interaction networks

    PubMed Central

    Piovesan, Damiano; Minervini, Giovanni; Tosatto, Silvio C.E.

    2016-01-01

    Residue interaction networks (RINs) are an alternative way of representing protein structures where nodes are residues and arcs physico–chemical interactions. RINs have been extensively and successfully used for analysing mutation effects, protein folding, domain–domain communication and catalytic activity. Here we present RING 2.0, a new version of the RING software for the identification of covalent and non-covalent bonds in protein structures, including π–π stacking and π–cation interactions. RING 2.0 is extremely fast and generates both intra and inter-chain interactions including solvent and ligand atoms. The generated networks are very accurate and reliable thanks to a complex empirical re-parameterization of distance thresholds performed on the entire Protein Data Bank. By default, RING output is generated with optimal parameters but the web server provides an exhaustive interface to customize the calculation. The network can be visualized directly in the browser or in Cytoscape. Alternatively, the RING-Viz script for Pymol allows visualizing the interactions at atomic level in the structure. The web server and RING-Viz, together with an extensive help and tutorial, are available from URL: http://protein.bio.unipd.it/ring. PMID:27198219

  16. A Web Server and Mobile App for Computing Hemolytic Potency of Peptides

    PubMed Central

    Chaudhary, Kumardeep; Kumar, Ritesh; Singh, Sandeep; Tuknait, Abhishek; Gautam, Ankur; Mathur, Deepika; Anand, Priya; Varshney, Grish C.; Raghava, Gajendra P. S.

    2016-01-01

    Numerous therapeutic peptides fail to enter clinical trials solely because of their high hemolytic activity. Recently, we developed a database, Hemolytik, for maintaining experimentally validated hemolytic and non-hemolytic peptides. The present study describes a web server and mobile app developed for predicting and screening peptides with hemolytic potency. Firstly, we generated a dataset, HemoPI-1, that contains 552 hemolytic peptides extracted from the Hemolytik database and 552 random non-hemolytic peptides (from Swiss-Prot). The sequence analysis of these peptides revealed that certain residues (e.g., L, K, F, W) and motifs (e.g., “FKK”, “LKL”, “KKLL”, “KWK”, “VLK”, “CYCR”, “CRR”, “RFC”, “RRR”, “LKKL”) are more abundant in hemolytic peptides. Therefore, we developed models for discriminating hemolytic from non-hemolytic peptides using various machine learning techniques and achieved more than 95% accuracy. We also developed models for discriminating peptides with high and low hemolytic potential on different datasets, called HemoPI-2 and HemoPI-3. In order to serve the scientific community, we developed a web server, mobile app and Java-based standalone software (http://crdd.osdd.net/raghava/hemopi/). PMID:26953092

  17. Performance, accuracy, and Web server for evolutionary placement of short sequence reads under maximum likelihood.

    PubMed

    Berger, Simon A; Krompass, Denis; Stamatakis, Alexandros

    2011-05-01

    We present an evolutionary placement algorithm (EPA) and a Web server for the rapid assignment of sequence fragments (short reads) to edges of a given phylogenetic tree under the maximum-likelihood model. The accuracy of the algorithm is evaluated on several real-world data sets and compared with placement by pair-wise sequence comparison, using edit distances and BLAST. We introduce a slow and accurate as well as a fast and less accurate placement algorithm. For the slow algorithm, we develop additional heuristic techniques that yield almost the same run times as the fast version with only a small loss of accuracy. When those additional heuristics are employed, the run time of the more accurate algorithm is comparable with that of a simple BLAST search for data sets with a high number of short query sequences. Moreover, the accuracy of the EPA is significantly higher, in particular when the sample of taxa in the reference topology is sparse or inadequate. Our algorithm, which has been integrated into RAxML, therefore provides an equally fast but more accurate alternative to BLAST for tree-based inference of the evolutionary origin and composition of short sequence reads. We are also actively developing a Web server that offers a freely available service for computing read placements on trees using the EPA.
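
    The abstract contrasts the maximum-likelihood EPA with placement by pairwise sequence comparison using edit distances. The Python sketch below implements only that simple baseline (not the EPA itself): each short read is assigned to the reference sequence, and hence tree edge, with the smallest Levenshtein distance; the reference sequences are placeholders.

        # Edit-distance placement baseline (the comparison method named in the abstract,
        # not the maximum-likelihood EPA): assign each read to the closest reference.
        def levenshtein(a, b):
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, 1):
                cur = [i]
                for j, cb in enumerate(b, 1):
                    cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
                prev = cur
            return prev[-1]

        def place_reads(reads, references):
            """references: mapping of edge/taxon label -> reference sequence."""
            return {r: min(references, key=lambda k, r=r: levenshtein(r, references[k]))
                    for r in reads}

        refs = {"edgeA": "ACGTACGTAGCT", "edgeB": "ACGGTTGTAGCA"}  # placeholder sequences
        print(place_reads(["ACGTACGT", "ACGGTTGT"], refs))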

  18. Automatic comic page image understanding based on edge segment analysis

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.
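
    A rough Python/OpenCV sketch in the same spirit as the pipeline above: Canny edges, external contours, and large bounding boxes treated as storyboards, sorted into a naive reading order. The paper's own edge-point chaining and border-line selection are more robust than this; the file path, thresholds and band-based ordering are illustrative assumptions.

        # Rough storyboard (panel) detection sketch; the paper chains Canny edge points
        # and fits border lines, which is more robust than plain contour bounding boxes.
        import cv2

        def detect_storyboards(image_path, min_area_ratio=0.02):
            img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
            edges = cv2.Canny(img, 50, 150)
            contours = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
            h, w = img.shape
            boxes = [cv2.boundingRect(c) for c in contours]          # (x, y, width, height)
            panels = [b for b in boxes if b[2] * b[3] > min_area_ratio * w * h]
            # Naive reading order: top-to-bottom by row band, then left-to-right.
            panels.sort(key=lambda b: (round(b[1] / (h * 0.2)), b[0]))
            return panels

        # panels = detect_storyboards("comic_page.png")  # placeholder path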

  19. Optimizing Parallel Access to the BaBar Database System Using CORBA Servers

    SciTech Connect

    Becla, Jacek

    2002-05-01

    The BaBar Experiment collected around 20 TB of data during its first 6 months of running. Now, after 18 months, the data size exceeds 300 TB, and according to projections this is only a small fraction of the data expected in the next few months. In order to keep up with the data, significant effort was put into tuning the database system. This led to great performance improvements, as well as to inevitable system expansion: 450 simultaneous processing nodes are used for data reconstruction alone. It is believed that further growth beyond 600 nodes will happen soon. In such an environment, many complex operations are executed simultaneously on hundreds of machines, putting a huge load on data servers and increasing network traffic. Introducing two CORBA servers halved startup time and dramatically offloaded database servers: data servers as well as lock servers. The paper describes the design and implementation of two servers recently introduced in the BaBar system: the Conditions OID Server and the Clustering Server. The first experience of using these servers is discussed, and a discussion of a Collection Server for data analysis, currently being designed, is included.

  20. PockDrug-Server: a new web server for predicting pocket druggability on holo and apo proteins

    PubMed Central

    Hussein, Hiba Abi; Borrel, Alexandre; Geneix, Colette; Petitjean, Michel; Regad, Leslie; Camproux, Anne-Claude

    2015-01-01

    Predicting a protein pocket's ability to bind drug-like molecules with high affinity, i.e. druggability, is of major interest in the target identification phase of drug discovery. Therefore, pocket druggability investigations represent a key step of compound clinical progression projects. Currently, computational druggability prediction models are tied to one unique pocket estimation method despite pocket estimation uncertainties. In this paper, we propose ‘PockDrug-Server’ to predict pocket druggability, efficient on both (i) estimated pockets guided by the ligand proximity (extracted by proximity to a ligand from a holo protein structure) and (ii) estimated pockets based solely on protein structure information (based on amino atoms that form the surface of potential binding cavities). PockDrug-Server provides consistent druggability results using different pocket estimation methods. It is robust with respect to pocket boundary and estimation uncertainties, and thus efficient on apo pockets that are challenging to estimate. It clearly distinguishes druggable from less druggable pockets using different estimation methods and outperforms recent druggability models for apo pockets. It can be carried out from one or a set of apo/holo proteins using different pocket estimation methods proposed by our web server or from any pocket previously estimated by the user. PockDrug-Server is publicly available at: http://pockdrug.rpbs.univ-paris-diderot.fr. PMID:25956651

  1. Experimental Results on Statistical Approaches to Page Replacement Policies

    SciTech Connect

    LEUNG,VITUS J.; IRANI,SANDY

    2000-12-08

    This paper investigates the question of what statistical information about a memory request sequence is useful for making page replacement decisions. Our starting point is the Markov Request Model for page request sequences. Although the utility of modeling page request sequences by the Markov model has recently been put into doubt, we find that two previously suggested algorithms (Maximum Hitting Time and Dominating Distribution), which are based on the Markov model, work well on the trace data used in this study. Interestingly, both of these algorithms perform equally well despite the fact that the theoretical results for the two algorithms differ dramatically. We then develop succinct characteristics of memory access patterns in an attempt to approximate the simpler of the two algorithms. Finally, we investigate how to collect these characteristics in an online manner in order to obtain a purely online algorithm.
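
    A minimal Python sketch of how Markov statistics can inform replacement decisions: transition counts are collected online, and on a fault the resident page judged least likely to be requested next is evicted. This one-step-probability rule is only a crude proxy for the Maximum Hitting Time and Dominating Distribution algorithms studied in the paper; the toy trace is illustrative.

        # Online Markov-informed page replacement sketch (a rough stand-in for the
        # Markov-model-based policies discussed in the paper).
        from collections import defaultdict

        class MarkovCache:
            def __init__(self, capacity):
                self.capacity = capacity
                self.resident = set()
                self.trans = defaultdict(lambda: defaultdict(int))   # trans[p][q] = count of p -> q
                self.prev = None
                self.faults = 0

            def request(self, page):
                if self.prev is not None:
                    self.trans[self.prev][page] += 1                 # learn transitions online
                if page not in self.resident:
                    self.faults += 1
                    if len(self.resident) >= self.capacity:
                        row = self.trans[page]                       # counts of what tends to follow `page`
                        victim = min(self.resident, key=lambda p: row[p])
                        self.resident.remove(victim)
                    self.resident.add(page)
                self.prev = page

        cache = MarkovCache(capacity=3)
        for p in [1, 2, 3, 1, 2, 4, 1, 2, 3, 4]:                     # toy request trace
            cache.request(p)
        print("page faults:", cache.faults)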

  2. User-Friendly Data Servers for Climate Studies at the Asia-Pacific Data-Research Center (APDRC)

    NASA Astrophysics Data System (ADS)

    Yuan, G.; Shen, Y.; Zhang, Y.; Merrill, R.; Waseda, T.; Mitsudera, H.; Hacker, P.

    2002-12-01

    The APDRC was recently established within the International Pacific Research Center (IPRC) at the University of Hawaii. The APDRC mission is to increase understanding of climate variability in the Asia-Pacific region by developing the computational, data-management, and networking infrastructure necessary to make data resources readily accessible and usable by researchers, and by undertaking data-intensive research activities that will both advance knowledge and lead to improvements in data preparation and data products. A focus of recent activity is the implementation of user-friendly data servers. The APDRC is currently running a Live Access Server (LAS) developed at NOAA/PMEL to provide access to and visualization of gridded climate products via the web. The LAS also allows users to download the selected data subsets in various formats (such as binary, netCDF and ASCII). Most of the datasets served by the LAS are also served through our OPeNDAP server (formerly DODS), which allows users to directly access the data using their desktop client tools (e.g. GrADS, Matlab and Ferret). In addition, the APDRC is running an OPeNDAP Catalog/Aggregation Server (CAS) developed by Unidata at UCAR to serve climate data and products such as model output and satellite-derived products. These products are often large (> 2 GB) and are therefore stored as multiple files (stored separately in time or in parameters). The CAS remedies the inconvenience of multiple files and allows access to the whole dataset (or any subset that cuts across the multiple files) via a single request command from any DODS enabled client software. Once the aggregation of files is configured at the server (CAS), the process of aggregation is transparent to the user. The user only needs to know a single URL for the entire dataset, which is, in fact, stored as multiple files. CAS even allows aggregation of files on different systems and at different locations. Currently, the APDRC is serving NCEP, ECMWF
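
    A short Python sketch of the client-side access pattern described above: a single OPeNDAP/aggregation URL exposes the whole (possibly multi-file) dataset, and only the requested subset is transferred. The URL and the variable/coordinate names (sst, lat, lon, time) are placeholders, not actual APDRC endpoints.

        # Client-side OPeNDAP access: one URL for the aggregated dataset, lazy subsetting.
        import xarray as xr

        url = "http://example.org/opendap/aggregated_sst_dataset"   # hypothetical endpoint
        ds = xr.open_dataset(url)                                   # opened remotely via OPeNDAP
        subset = ds["sst"].sel(lat=slice(-10, 10),                  # only this subset is transferred
                               lon=slice(120, 160),
                               time="2002-06")
        print(subset.mean().values)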

  3. A membrane page composer - Further developments. [for holographic memory system

    NASA Technical Reports Server (NTRS)

    Cosentino, L. S.; Stewart, W. C.

    1974-01-01

    Membrane page composers were made and were evaluated in a simulated holographic optical memory system. Calculated and experimentally determined electromechanical and optical characteristics of the circular membrane light valves used on the arrays are shown to be in close agreement. Several operating prototypes of 8 x 8 and 16 x 16 elements were produced. Measurements were made of switching time, optical contrast, and dynamic storage time of many cells on the devices. Digital patterns were stored in the arrays. The performance required of the page composer as a component of an optical memory system is considered. The fabrication techniques used can be easily extended to larger arrays.

  4. Key-phrase based classification of public health web pages.

    PubMed

    Dolamic, Ljiljana; Boyer, Célia

    2013-01-01

    This paper describes and evaluates a public health web page classification model based on key phrase extraction and matching. Easily extensible both to new classes and to new languages, this method proves to be a good solution for text classification in the face of a total lack of training data. To evaluate the proposed solution we used a small collection of public health related web pages created by a double-blind manual classification. Our experiments have shown that, by choosing an adequate threshold value, the desired value for either precision or recall can be achieved.
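
    A small Python sketch of key-phrase matching classification under a threshold, as described above; the class names, phrase lists and threshold are invented for illustration and are not the paper's actual vocabulary.

        # Key-phrase matching classifier sketch: score a page against per-class phrase
        # lists and accept the best class only if it clears a threshold.
        import re

        CLASS_PHRASES = {
            "nutrition": ["balanced diet", "vitamin", "calorie intake"],
            "vaccination": ["vaccine", "immunization schedule", "booster dose"],
        }

        def classify(text, threshold=1):
            text = text.lower()
            scores = {c: sum(len(re.findall(re.escape(p), text)) for p in phrases)
                      for c, phrases in CLASS_PHRASES.items()}
            best = max(scores, key=scores.get)
            return best if scores[best] >= threshold else None       # threshold trades precision vs. recall

        print(classify("This page explains the recommended immunization schedule and booster dose."))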

  5. A tactile paging system for deaf-blind people, phase 1. [human factors engineering of bioinstrumentation

    NASA Technical Reports Server (NTRS)

    Baer, J. A.

    1976-01-01

    A tactile paging system for deaf-blind people has been brought from the concept stage to the development of a first model. The model consists of a central station that transmits coded information via radio link to an on-body (i.e., worn on the wrist) receiving unit, the output from which is a coded vibrotactile signal. The model is a combination of commercially available equipment, customized electronic circuits, and electromechanical transducers. The paging system facilitates communication to deaf-blind clients in an institutional environment as an aid in their training and other activities. Several subunits of the system were individually developed, tested, and integrated into an operating system ready for experimentation and evaluation. The operation and characteristics of the system are described and photographs are shown.

  6. A Predictive Performance Model to Evaluate the Contention Cost in Application Servers

    SciTech Connect

    Chen, Shiping; Gorton, Ian )

    2002-12-04

    In multi-tier enterprise systems, application servers are key components that implement business logic and provide application services. To support a large number of simultaneous accesses from clients over the Internet and intranet, most application servers use replication and multi-threading to handle concurrent requests. While multiple processes and multiple threads enhance the processing bandwidth of servers, they also increase the contention for resources in application servers. This paper investigates this issue empirically based on a middleware benchmark. A cost model is proposed to estimate the overall performance of application servers, including the contention overhead. This model is then used to determine the optimal degree of concurrency of application servers for a specific client load. A case study based on CORBA is presented to validate our model and demonstrate its application.
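
    A toy Python illustration of the idea of picking a concurrency level from a cost model with a contention term; the quadratic penalty and all parameter values are invented for illustration and are not the paper's actual model.

        # Toy model: per-request service time grows with a contention penalty as
        # concurrency rises; the "optimal" degree of concurrency maximizes modeled
        # throughput. The functional form and numbers are purely illustrative.
        def throughput(threads, base_ms=20.0, contention_ms=1.5):
            service_ms = base_ms + contention_ms * (threads - 1) ** 2
            return threads * 1000.0 / service_ms                     # requests per second

        best = max(range(1, 65), key=throughput)
        print("optimal concurrency:", best, "->", round(throughput(best), 1), "req/s")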

  7. From honeybees to Internet servers: biomimicry for distributed management of Internet hosting centers.

    PubMed

    Nakrani, Sunil; Tovey, Craig

    2007-12-01

    An Internet hosting center hosts services on its server ensemble. The center must allocate servers dynamically amongst services to maximize revenue earned from hosting fees. The finite server ensemble, unpredictable request arrival behavior and server reallocation cost make server allocation optimization difficult. Server allocation closely resembles honeybee forager allocation amongst flower patches to optimize nectar influx. The resemblance inspires a honeybee biomimetic algorithm. This paper describes details of the honeybee self-organizing model in terms of information flow and feedback, analyzes the homology between the two problems and derives the resulting biomimetic algorithm for hosting centers. The algorithm is assessed for effectiveness and adaptiveness by comparative testing against benchmark and conventional algorithms. Computational results indicate that the new algorithm is highly adaptive to widely varying external environments and quite competitive against benchmark assessment algorithms. Other swarm intelligence applications are briefly surveyed, and some general speculations are offered regarding their various degrees of success. PMID:18037727
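
    A toy Python sketch of the feedback loop the abstract describes, in which servers migrate toward more profitable services the way foragers shift toward richer flower patches; the migration rule, probabilities and numbers are invented for illustration and do not reproduce the published algorithm's advert/dance mechanics.

        # Revenue-feedback server reallocation sketch: each round, a fraction of servers
        # assigned to below-average services migrates to the most profitable service.
        import random

        def reallocate(allocation, revenue, migrate_prob=0.2):
            per_server = {s: revenue[s] / max(allocation[s], 1) for s in allocation}
            avg = sum(per_server.values()) / len(per_server)
            best = max(per_server, key=per_server.get)
            for s in allocation:
                if s != best and per_server[s] < avg and allocation[s] > 0:
                    if random.random() < migrate_prob:
                        allocation[s] -= 1
                        allocation[best] += 1
            return allocation

        alloc = {"web": 10, "video": 10, "mail": 10}          # servers currently assigned per service
        rev = {"web": 50.0, "video": 180.0, "mail": 20.0}     # revenue observed in the last interval
        print(reallocate(alloc, rev))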

  8. From honeybees to Internet servers: biomimicry for distributed management of Internet hosting centers.

    PubMed

    Nakrani, Sunil; Tovey, Craig

    2007-12-01

    An Internet hosting center hosts services on its server ensemble. The center must allocate servers dynamically amongst services to maximize revenue earned from hosting fees. The finite server ensemble, unpredictable request arrival behavior and server reallocation cost make server allocation optimization difficult. Server allocation closely resembles honeybee forager allocation amongst flower patches to optimize nectar influx. The resemblance inspires a honeybee biomimetic algorithm. This paper describes details of the honeybee self-organizing model in terms of information flow and feedback, analyzes the homology between the two problems and derives the resulting biomimetic algorithm for hosting centers. The algorithm is assessed for effectiveness and adaptiveness by comparative testing against benchmark and conventional algorithms. Computational results indicate that the new algorithm is highly adaptive to widely varying external environments and quite competitive against benchmark assessment algorithms. Other swarm intelligence applications are briefly surveyed, and some general speculations are offered regarding their various degrees of success.

  9. Tcoffee@igs: A web server for computing, evaluating and combining multiple sequence alignments.

    PubMed

    Poirot, Olivier; O'Toole, Eamonn; Notredame, Cedric

    2003-07-01

    This paper presents Tcoffee@igs, a new server provided to the community by Hewlett Packard computers and the Centre National de la Recherche Scientifique. This server is a web-based tool dedicated to the computation, the evaluation and the combination of multiple sequence alignments. It uses the latest version of the T-Coffee package. Given a set of unaligned sequences, the server returns an evaluated multiple sequence alignment and the associated phylogenetic tree. This server also makes it possible to evaluate the local reliability of an existing alignment and to combine several alternative multiple alignments into a single new one. Tcoffee@igs can be used for aligning protein, RNA or DNA sequences. Datasets of up to 100 sequences (2000 residues long) can be processed. The server and its documentation are available from: http://igs-server.cnrs-mrs.fr/Tcoffee/.

  10. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab.

    PubMed

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-18

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described.
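
    A small Python sketch of the WMS mechanism the article relies on: a map layer is fetched from a remote server with a standard GetMap request. The parameter names follow the OGC WMS 1.1.1 specification; the base URL, map file and layer name are placeholders.

        # Building a standard WMS 1.1.1 GetMap request URL; this is the mechanism that
        # lets a Web map pull layers from remote servers. The base URL is a placeholder.
        from urllib.parse import urlencode

        params = {
            "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
            "LAYERS": "health_facilities",          # hypothetical layer name
            "SRS": "EPSG:4326",
            "BBOX": "95.0,-10.0,105.0,5.0",         # minx,miny,maxx,maxy in lon/lat
            "WIDTH": "800", "HEIGHT": "600",
            "FORMAT": "image/png",
        }
        url = "http://example.org/cgi-bin/mapserv?map=/maps/health.map&" + urlencode(params)
        print(url)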

  11. ArchPRED: a template based loop structure prediction server.

    PubMed

    Fernandez-Fuentes, Narcis; Zhai, Jun; Fiser, András

    2006-07-01

    ArchPRED server (http://www.fiserlab.org/servers/archpred) implements a novel fragment-search based method for predicting loop conformations. The inputs to the server are the atomic coordinates of the query protein and the position of the loop. The algorithm selects candidate loop fragments from a regularly updated loop library (Search Space) by matching the length, the types of bracing secondary structures of the query and by satisfying the geometrical restraints imposed by the stem residues. Subsequently, candidate loops are inserted in the query protein framework where their side chains are rebuilt and their fit is assessed by the root mean square deviation (r.m.s.d.) of stem regions and by the number of rigid body clashes with the environment. In the final step remaining candidate loops are ranked by a Z-score that combines information on sequence similarity and fit of predicted and observed φ/ψ main chain dihedral angle propensities. The final loop conformation is built in the protein structure and annealed in the environment using conjugate gradient minimization. The prediction method was benchmarked on artificially prepared search datasets where all trivial sequence similarities on the SCOP superfamily level were removed. Under these conditions it was possible to predict loops of length 4, 8 and 12 with coverage of 98, 78 and 28% at an r.m.s.d. accuracy of at least 0.22, 1.38 and 2.47 Å, respectively. In a head-to-head comparison on loops extracted from freshly deposited new protein folds, the current method outperformed an earlier developed database search method in an approximately 5:1 ratio. PMID:16844985

  12. MESSA: MEta-Server for protein Sequence Analysis

    PubMed Central

    2012-01-01

    Background Computational sequence analysis, that is, prediction of local sequence properties, homologs, spatial structure and function from the sequence of a protein, offers an efficient way to obtain needed information about proteins under study. Since reliable prediction is usually based on the consensus of many computer programs, meta-servers have been developed to fit such needs. Most meta-servers focus on one aspect of sequence analysis, while others incorporate more information, such as PredictProtein for local sequence feature predictions, SMART for domain architecture and sequence motif annotation, and GeneSilico for secondary and spatial structure prediction. However, as predictions of local sequence properties, three-dimensional structure and function are usually intertwined, it is beneficial to address them together. Results We developed a MEta-Server for protein Sequence Analysis (MESSA) to facilitate comprehensive protein sequence analysis and gather structural and functional predictions for a protein of interest. For an input sequence, the server exploits a number of select tools to predict local sequence properties, such as secondary structure, structurally disordered regions, coiled coils, signal peptides and transmembrane helices; detect homologous proteins and assign the query to a protein family; identify three-dimensional structure templates and generate structure models; and provide predictive statements about the protein's function, including functional annotations, Gene Ontology terms, enzyme classification and possible functionally associated proteins. We tested MESSA on the proteome of Candidatus Liberibacter asiaticus. Manual curation shows that three-dimensional structure models generated by MESSA covered around 75% of all the residues in this proteome and the function of 80% of all proteins could be predicted. Availability MESSA is free for non-commercial use at http://prodata.swmed.edu/MESSA/ PMID:23031578

  13. Creating a GIS data server on the World Wide Web: The GISST example

    SciTech Connect

    Pace, P.J.; Evers, T.K.

    1996-01-01

    In an effort to facilitate user access to Geographic Information Systems (GIS) data, the GIS and Computer Modeling Group from the Computational Physics and Engineering Division at the Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee (TN), has developed a World Wide Web server named GISST. The server incorporates a highly interactive and dynamic forms-based interface to browse and download a variety of GIS data types. This paper describes the server's design considerations, development, resulting implementation and future enhancements.

  14. Asynchronous data change notification between database server and accelerator controls system

    SciTech Connect

    Fu, W.; Morris, J.; Nemesure, S.

    2011-10-10

    Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMS's which support DCN (such as Oracle and MS SQL server), some server side and/or client side programming may be required to make the DCN system work. This makes the setup of DCN between database server and interested clients tedious and time consuming. In accelerator control systems, there are many well established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. Asynchronous data change notification (ADCN) between database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMS systems, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
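
    A self-contained Python sketch of the trigger-plus-reflection pattern described above: a database trigger records each change in a notification table, and a server-side poller forwards new rows to clients. SQLite stands in for Oracle/MS SQL here, and a print callback stands in for the EPICS/CDEV/ADO publish step; the table and column names are illustrative.

        # Trigger + poller sketch of asynchronous data change notification (ADCN).
        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
            CREATE TABLE settings(name TEXT PRIMARY KEY, value REAL);
            CREATE TABLE change_log(id INTEGER PRIMARY KEY AUTOINCREMENT,
                                    name TEXT, value REAL);
            CREATE TRIGGER settings_dcn AFTER UPDATE ON settings
            BEGIN
                INSERT INTO change_log(name, value) VALUES (NEW.name, NEW.value);
            END;
        """)
        db.execute("INSERT INTO settings VALUES ('magnet_current', 1.0)")

        def poll_and_publish(last_id, publish):
            rows = db.execute("SELECT id, name, value FROM change_log WHERE id > ?",
                              (last_id,)).fetchall()
            for rid, name, value in rows:
                publish(name, value)          # in practice: push via the SET/GET API to clients
                last_id = rid
            return last_id

        db.execute("UPDATE settings SET value = 2.5 WHERE name = 'magnet_current'")
        poll_and_publish(0, lambda n, v: print(f"DCN: {n} -> {v}"))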

  15. Adventures in the evolution of a high-bandwidth network for central servers

    SciTech Connect

    Swartz, K.L.; Cottrell, L.; Dart, M.

    1994-08-01

    In a small network, clients and servers may all be connected to a single Ethernet without significant performance concerns. As the number of clients on a network grows, the necessity of splitting the network into multiple sub-networks, each with a manageable number of clients, becomes clear. Less obvious is what to do with the servers. Group file servers on subnets and multihomed servers offer only partial solutions -- many other types of servers do not lend themselves to a decentralized model, and tend to collect on another, well-connected but overloaded Ethernet. The higher speed of FDDI seems to offer an easy solution, but in practice both expense and interoperability problems render FDDI a poor choice. Ethernet switches appear to permit cheaper and more reliable networking to the servers while providing an aggregate network bandwidth greater than a simple Ethernet. This paper studies the evolution of the server networks at SLAC. Difficulties encountered in the deployment of FDDI are described, as are the tools and techniques used to characterize the traffic patterns on the server network. Performance of Ethernet, FDDI, and switched Ethernet networks is analyzed, as are reliability and maintainability issues for these alternatives. The motivations for re-designing the SLAC general server network to use a switched Ethernet instead of FDDI are described, as are the reasons for choosing FDDI for the farm and firewall networks at SLAC. Guidelines are developed which may help in making this choice for other networks.

  16. Group-oriented coordination models for distributed client-server computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Hughes, Craig S.

    1994-01-01

    This paper describes group-oriented control models for distributed client-server interactions. These models transparently coordinate requests for services that involve multiple servers, such as queries across distributed databases. Specific capabilities include: decomposing and replicating client requests; dispatching request subtasks or copies to independent, networked servers; and combining server results into a single response for the client. The control models were implemented by combining request broker and process group technologies with an object-oriented communication middleware tool. The models are illustrated in the context of a distributed operations support application for space-based systems.
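
    A compact Python sketch of the scatter-gather coordination described: the request is replicated across independent servers, dispatched in parallel, and the partial results are merged into a single response. The query_server function is a stand-in for the real middleware call, and the server names and query are placeholders.

        # Scatter-gather coordination: decompose/replicate the request, dispatch to
        # independent servers in parallel, combine the partial results for the client.
        from concurrent.futures import ThreadPoolExecutor

        def query_server(server, subquery):
            return [f"{server}: {subquery}"]      # placeholder for a remote call

        def coordinated_request(servers, query):
            with ThreadPoolExecutor(max_workers=len(servers)) as pool:
                partials = list(pool.map(lambda s: query_server(s, query), servers))
            return [row for part in partials for row in part]

        print(coordinated_request(["db1", "db2", "db3"], "passes for satellite X"))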

  17. 16. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  18. 37. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    37. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  19. 34. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    34. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  20. 38. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    38. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  1. 26. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    26. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  2. 31. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    31. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  3. 30. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    30. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  4. 28. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  5. 19. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  6. 27. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    27. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  7. 21. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  8. 4. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  9. 14. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  10. 8. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  11. 9. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  12. 39. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    39. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  13. 7. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  14. 35. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    35. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  15. 5. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  16. 10. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  17. 29. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    29. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  18. 17. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    17. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  19. 22. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    22. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  20. 24. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  1. 12. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  2. 11. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  3. 32. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    32. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  4. 25. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    25. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  5. 3. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  6. 33. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    33. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  7. 1. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  8. 18. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  9. 20. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    20. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  10. 23. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    23. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  11. 13. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  12. 15. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  13. 2. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  14. 36. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    36. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  15. 6. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  16. Designing Web Pages That Are Usable and Accessible to All.

    ERIC Educational Resources Information Center

    Wheaton, Joe E.; Granello, Paul F.

    The Internet is a growing source of information for persons worldwide, but for many people with disabilities the Internet can be a confusing jumble of images, frames, scripts, and colors that make little sense. Although learning how to make Web pages accessible to all takes some effort, it is effort well spent for one very good reason:…

  17. Building interactive simulations in a Web page design program.

    PubMed

    Kootsey, J Mailen; Siriphongs, Daniel; McAuley, Grant

    2004-01-01

    A new Web software architecture, NumberLinX (NLX), has been integrated into a commercial Web design program to produce a drag-and-drop environment for building interactive simulations. NLX is a library of reusable objects written in Java, including input, output, calculation, and control objects. The NLX objects were added to the palette of available objects in the Web design program to be selected and dropped on a page. Inserting an object in a Web page is accomplished by adding a template block of HTML code to the page file. HTML parameters in the block must be set to user-supplied values, so the HTML code is generated dynamically, based on user entries in a popup form. Implementing the object inspector for each object permits the user to edit object attributes in a form window. Except for model definition, the combination of the NLX architecture and the Web design program permits construction of interactive simulation pages without writing or inspecting code. PMID:17271495
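
    A brief Python sketch of the "template block of HTML with user-supplied parameters" idea described above; the element and attribute names are invented for illustration and are not the actual NumberLinX markup.

        # Filling an HTML template block from values collected in an object's popup form.
        from string import Template

        TEMPLATE = Template(
            '<applet code="nlx.SliderInput" width="$width" height="$height">\n'
            '  <param name="variable" value="$variable">\n'
            '  <param name="min" value="$min">\n'
            '  <param name="max" value="$max">\n'
            '</applet>'
        )

        def render_input_block(form_values):
            """Return the HTML block with parameters set to user-supplied values."""
            return TEMPLATE.substitute(form_values)

        print(render_input_block({"width": 200, "height": 40,
                                  "variable": "heart_rate", "min": 30, "max": 220}))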

  18. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...://www.VetBiz.gov, and also must be registered in the Central Contractor Registration (CCR) (see 48 CFR... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP)...

  19. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...://www.VetBiz.gov, and also must be registered in the Central Contractor Registration (CCR) (see 48 CFR... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP)...

  20. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...://www.VetBiz.gov, and also must be registered in the Central Contractor Registration (CCR) (see 48 CFR... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP)...

  1. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...://www.VetBiz.gov, and also must be registered in the Central Contractor Registration (CCR) (see 48 CFR... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP)...

  2. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...://www.VetBiz.gov, and also must be registered in the Central Contractor Registration (CCR) (see 48 CFR... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP)...

  3. 7 CFR 3402.11 - Proposal cover page.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 15 2011-01-01 2011-01-01 false Proposal cover page. 3402.11 Section 3402.11 Agriculture Regulations of the Department of Agriculture (Continued) NATIONAL INSTITUTE OF FOOD AND AGRICULTURE FOOD AND AGRICULTURAL SCIENCES NATIONAL NEEDS GRADUATE AND POSTGRADUATE FELLOWSHIP GRANTS PROGRAM Preparation of an Application § 3402.11...

  4. 7 CFR 3402.11 - Proposal cover page.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 15 2013-01-01 2013-01-01 false Proposal cover page. 3402.11 Section 3402.11 Agriculture Regulations of the Department of Agriculture (Continued) NATIONAL INSTITUTE OF FOOD AND AGRICULTURE FOOD AND AGRICULTURAL SCIENCES NATIONAL NEEDS GRADUATE AND POSTGRADUATE FELLOWSHIP GRANTS PROGRAM Preparation of an Application § 3402.11...

  5. What Should Be On A School Library Web Page?

    ERIC Educational Resources Information Center

    Baumbach, Donna; Brewer, Sally; Renfroe, Matt

    2004-01-01

    As varied as the schools and the communities they serve, so too are the Web pages for the library media programs that serve them. This article provides guidelines for effective web design and the information that might be included, including reference resources, reference assistance, curriculum support, literacy advocacy, and dynamic material. An…

  6. 7 CFR 3402.11 - Proposal cover page.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 15 2012-01-01 2012-01-01 false Proposal cover page. 3402.11 Section 3402.11 Agriculture Regulations of the Department of Agriculture (Continued) NATIONAL INSTITUTE OF FOOD AND AGRICULTURE FOOD AND AGRICULTURAL SCIENCES NATIONAL NEEDS GRADUATE AND POSTGRADUATE FELLOWSHIP GRANTS PROGRAM Preparation of an Application § 3402.11...

  7. On Apples and Onions: A Reply to Page.

    ERIC Educational Resources Information Center

    Phillips, Gerald M.

    1980-01-01

    Answers some of William Page's criticisms (see preceding article, EJ 227 456) regarding the use of rhetoritherapy v behavior therapy to deal with students who exhibit communication apprehension. Argues that rhetoritherapy deals with people who have problems, not with problems. It is concerned with what can be done about the problem, not what the…

  8. 8. Historic American Buildings Survey, From page 5, volume 26 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Historic American Buildings Survey, From page 5, volume 26 of the Castner Collection in the Print Room of the Free Library of Philadelphia, COPY OF c.1870 PHOTOGRAPH SHOWING NORTHEAST CORNER OF SIXTH AND ARCH STREETS-THE LEE HOUSE SIXTH FROM LEFT. - Robert M. Lee House & Law Office, 109-111 North Sixth Street, Philadelphia, Philadelphia County, PA

  9. 8. Historic American Buildings Survey, From page 18, volume 2 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Historic American Buildings Survey, From page 18, volume 2 of the Castner Collection in the Print Room of Free Library of Philadelphia, Logan Square. Negative is the gift of James F. O'Gorman COPY OF AN UNDATED DRAWING OF THE HALL VIEWED FROM NORTHEAST. - National Guard's Hall, 518-20 Race Street, Philadelphia, Philadelphia County, PA

  10. Taking Shakespeare from the Page to the Stage.

    ERIC Educational Resources Information Center

    Breen, Kathleen T.

    1993-01-01

    Describes an approach to teaching William Shakespeare by which one teacher had students take the plays from the page to the stage by becoming actors and directors as well as scholars. Shows ways of relating various plays to more contemporary works. (HB)

  11. College of DuPage Administrative Internship Program.

    ERIC Educational Resources Information Center

    College of DuPage, Glen Ellyn, IL.

    This paper describes the objectives, policies, and procedures of the College of DuPage's Administrative Internship Program, whereby faculty members can gain broad administrative experience in key administrative areas of the college. The program seeks not only to broaden the experience of the individual intern, but to promote empathy among faculty…

  12. 47 CFR 22.503 - Paging geographic area authorizations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false Paging geographic area authorizations. 22.503 Section 22.503 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES.... 42 (Salt Lake City) 148-150, 152. 43 (San Francisco-Oakland-San Jose) 151, 162-165. 44 (Los...

  13. College of DuPage Student Portrait, Fall Quarter 1999.

    ERIC Educational Resources Information Center

    College of DuPage, Glen Ellyn, IL. Office of Research and Planning.

    The report profiles the College of DuPage's (COD) fall quarter 1999 student body. It presents a brief history of the college's enrollment and a comparison of enrollments with other Illinois community colleges. It also provides demographic information on current students. Additionally, enrollment information is included by program, division, and…

  14. 7 CFR 3402.11 - Proposal cover page.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Proposal cover page. 3402.11 Section 3402.11 Agriculture Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION, AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE FOOD AND AGRICULTURAL SCIENCES NATIONAL...

  15. Exploring the Use of a Facebook Page in Anatomy Education

    ERIC Educational Resources Information Center

    Jaffar, Akram Abood

    2014-01-01

    Facebook is the most popular social media site visited by university students on a daily basis. Consequently, Facebook is the logical place to start with for integrating social media technologies into education. This study explores how a faculty-administered Facebook Page can be used to supplement anatomy education beyond the traditional…

  16. 54. PAGE TWO OF PLANS FOR EXISTING GRAND CANAL COURT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    54. PAGE TWO OF PLANS FOR EXISTING GRAND CANAL COURT (VIRGINIA COURT) ARCHED PEDESTRIAN BRIDGE OVER CARROLL CANAL Plan Sheet D-26358, Sheet No. 2 of 6 (delineated by Richard G. Carrizosa, January 1978) - Venice Canals, Community of Venice, Los Angeles, Los Angeles County, CA

  17. 56. PAGE FOUR OF PLANS FOR EXISTING GRAND CANAL COURT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    56. PAGE FOUR OF PLANS FOR EXISTING GRAND CANAL COURT (VIRGINIA COURT) ARCHED PEDESTRIAN BRIDGE OVER CARROLL CANAL Plan Sheet D-26358, Sheet No. 4 of 6 (delineated by William Loo, January 1978) - Venice Canals, Community of Venice, Los Angeles, Los Angeles County, CA

  18. 55. PAGE THREE OF PLANS FOR EXISTING GRAND CANAL COURT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    55. PAGE THREE OF PLANS FOR EXISTING GRAND CANAL COURT (VIRGINIA COURT) ARCHED PEDESTRIAN BRIDGE OVER CARROLL CANAL Plan Sheet D-26358, Sheet No. 3 of 6 (delineated by William Loo, January 1978) - Venice Canals, Community of Venice, Los Angeles, Los Angeles County, CA

  19. 57. PAGE FIVE OF PLANS FOR EXISTING GRAND CANAL COURT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    57. PAGE FIVE OF PLANS FOR EXISTING GRAND CANAL COURT (VIRGINIA COURT) ARCHED PEDESTRIAN BRIDGE OVER CARROLL CANAL Plan Sheet D-26358, Sheet No. 5 of 6 (delineated by Richard G. Carrizosa, January 1978) - Venice Canals, Community of Venice, Los Angeles, Los Angeles County, CA

  20. 46. PAGE TWO OF PLANS FOR THE THREE ORIGINAL ARCHED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    46. PAGE TWO OF PLANS FOR THE THREE ORIGINAL ARCHED PEDESTRIAN BRIDGES OVER GRAND CANAL AT 25TH AVENUE, CARROLL CANAL AT GRAND CANAL COURT, AND EASTERN CANAL AT LINNIE CANAL COURT Plan Sheet B-191 (delineator unknown, April 1924) - Venice Canals, Community of Venice, Los Angeles, Los Angeles County, CA

  1. 45. PAGE ONE OF PLANS FOR THE THREE ORIGINAL ARCHED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    45. PAGE ONE OF PLANS FOR THE THREE ORIGINAL ARCHED PEDESTRIAN BRIDGES OVER GRAND CANAL AT 25TH AVENUE, CARROLL CANAL AT GRAND CANAL COURT, AND EASTERN CANAL AT LINNIE CANAL COURT Plan Sheet B-190 (delineator unknown, April 1924) - Venice Canals, Community of Venice, Los Angeles, Los Angeles County, CA

  2. The Inquiry Page: Bringing Digital Libraries to Learners.

    ERIC Educational Resources Information Center

    Bruce, Bertram C.; Bishop, Ann Peterson; Heidorn, P. Bryan; Lunsford, Karen J.; Poulakos, Steven; Won, Mihye

    2003-01-01

    Discusses digital library development, particularly a national science digital library, and describes the Inquiry Page which focuses on building a constructivist environment using Web resources, collaborative processes, and knowledge that bridges digital libraries with users in K-12 schools, museums, community groups, or other organizations. (LRW)

  3. Ranking nodes in growing networks: When PageRank fails

    PubMed Central

    Mariani, Manuel Sebastian; Medo, Matúš; Zhang, Yi-Cheng

    2015-01-01

    PageRank is arguably the most popular ranking algorithm which is being applied in real systems ranging from information to biological and infrastructure networks. Despite its outstanding popularity and broad use in different areas of science, the relation between the algorithm’s efficacy and properties of the network on which it acts has not yet been fully understood. We study here PageRank’s performance on a network model supported by real data, and show that realistic temporal effects make PageRank fail in individuating the most valuable nodes for a broad range of model parameters. Results on real data are in qualitative agreement with our model-based findings. This failure of PageRank reveals that the static approach to information filtering is inappropriate for a broad class of growing systems, and suggests that time-dependent algorithms that are based on the temporal linking patterns of these systems are needed to better rank the nodes. PMID:26553630
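
    For reference, the static PageRank computation the abstract argues against can be written as a short power iteration; the toy graph and damping factor below are illustrative only, not the paper's model or data.

```python
# Minimal sketch of the standard PageRank power iteration on a toy directed
# graph (adjacency list maps node -> list of nodes it links to).
def pagerank(links, damping=0.85, iterations=100):
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iterations):
        new_rank = {v: (1.0 - damping) / n for v in nodes}
        for v in nodes:
            out = links[v]
            if not out:                      # dangling node: spread evenly
                for u in nodes:
                    new_rank[u] += damping * rank[v] / n
            else:
                for u in out:
                    new_rank[u] += damping * rank[v] / len(out)
        rank = new_rank
    return rank

toy_graph = {"a": ["b"], "b": ["c"], "c": ["a", "b"], "d": ["c"]}
print(pagerank(toy_graph))
```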

  4. 22. PHOTOGRAPHIC ENLARGEMENT OF UPPER PHOTOGRAPH ON PAGE 986 IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    22. PHOTOGRAPHIC ENLARGEMENT OF UPPER PHOTOGRAPH ON PAGE 986 IN Keystone Coal Buyers Catalog, 1922, VIEW SOUTH, COMMUNITY OF ETHEL; ETHEL COAL COMPANY MINE SUPPLY BUILDING IS LOCATED IN MID-GROUND LEFT OF CENTER PARTIALLY OBSCURED BY ROOF OF HOUSE IN FOREGROUND - Ethel Coal Company & Supply Building, Left fork of Dingess Run (Ethel Hollow), Ethel, Logan County, WV

  5. 23. PHOTOGRAPHIC ENLARGEMENT OF UPPER PHOTOGRAPH ON PAGE 986 IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    23. PHOTOGRAPHIC ENLARGEMENT OF UPPER PHOTOGRAPH ON PAGE 986 IN Keystone Coal Buyers Catalog, 1922, VIEW SOUTH, COMMUNITY OF ETHEL; ETHEL COAL COMPANY MINE SUPPLY BUILDING IS LOCATED IN MID-GROUND IN CENTER PARTIALLY OBSCURED BY ROOF OF HOUSE IN FOREGROUND - Ethel Coal Company & Supply Building, Left fork of Dingess Run (Ethel Hollow), Ethel, Logan County, WV

  6. 21. PHOTOGRAPH OF PAGE 986 IN Keystone Coal Buyers Catalog, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. PHOTOGRAPH OF PAGE 986 IN Keystone Coal Buyers Catalog, 1922, UPPER PHOTOGRAPH, VIEW SOUTH, COMMUNITY OF ETHEL; ETHEL COAL COMPANY MINE SUPPLY BUILDING IS LOCATED IN MID-GROUND LEFT OF CENTER PARTIALLY OBSCURED BY ROOF OF HOUSE IN FOREGROUND - Ethel Coal Company & Supply Building, Left fork of Dingess Run (Ethel Hollow), Ethel, Logan County, WV

  7. Automatic Caption Localization for Photographs on World Wide Web Pages.

    ERIC Educational Resources Information Center

    Rowe, Neil C.; Frew, Brian

    1998-01-01

    Explores the indirect method of locating for indexing the likely explicit and implicit captions of photographs, using multimodal clues including the specific words used, syntax, surrounding layout of the Web page, and general appearance of the associated image. The MARIE-3 system thus avoids full image processing and full natural-language…

  8. 48 CFR 1852.215-81 - Proposal page limitations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Proposal page limitations. 1852.215-81 Section 1852.215-81 Federal Acquisition Regulations System NATIONAL AERONAUTICS AND SPACE... metric standard format most closely approximating the described standard 8 1/2″ × 11″ size may also be...

  9. College of DuPage Institutional Portrait, 1999-2000.

    ERIC Educational Resources Information Center

    College of DuPage, Glen Ellyn, IL. Office of Research and Planning.

    This report profiles the College of DuPage (Illinois) and provides a quick reference regarding the institution. It contains information on facilities and land, finance, staff, college/student services, the Library, athletics, academic calendars, maps, and instructional programs. This report reflects the college structure as of September 1999 and…

  10. Building interactive simulations in a Web page design program.

    PubMed

    Kootsey, J Mailen; Siriphongs, Daniel; McAuley, Grant

    2004-01-01

    A new Web software architecture, NumberLinX (NLX), has been integrated into a commercial Web design program to produce a drag-and-drop environment for building interactive simulations. NLX is a library of reusable objects written in Java, including input, output, calculation, and control objects. The NLX objects were added to the palette of available objects in the Web design program to be selected and dropped on a page. Inserting an object in a Web page is accomplished by adding a template block of HTML code to the page file. HTML parameters in the block must be set to user-supplied values, so the HTML code is generated dynamically, based on user entries in a popup form. Implementing the object inspector for each object permits the user to edit object attributes in a form window. Except for model definition, the combination of the NLX architecture and the Web design program permits construction of interactive simulation pages without writing or inspecting code.

  11. Google's Web Page Ranking Applied to Different Topological Web Graph Structures.

    ERIC Educational Resources Information Center

    Meghabghab, George

    2001-01-01

    This research, part of the ongoing study to better understand Web page ranking on the Web, looks at a Web page as a graph structure or Web graph, and classifies different Web graphs in the new coordinate space (out-degree, in-degree). Google's Web ranking algorithm (Brin & Page, 1998) for ranking Web pages is applied in this new coordinate space.…
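
    A minimal sketch of the (out-degree, in-degree) coordinate space mentioned above, computed for an invented set of links; it is not the paper's dataset or classification procedure.

```python
# Minimal sketch: place each page of a toy Web graph in the
# (out-degree, in-degree) coordinate space discussed above.
from collections import defaultdict

links = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "a"), ("d", "c")]

out_deg, in_deg = defaultdict(int), defaultdict(int)
for src, dst in links:
    out_deg[src] += 1
    in_deg[dst] += 1

for page in sorted(set(out_deg) | set(in_deg)):
    print(page, (out_deg[page], in_deg[page]))
```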

  12. Client/Server data serving for high performance computing

    NASA Technical Reports Server (NTRS)

    Wood, Chris

    1994-01-01

    This paper will attempt to examine the industry requirements for shared network data storage and sustained high speed (10's to 100's to thousands of megabytes per second) network data serving via the NFS and FTP protocol suite. It will discuss the current structural and architectural impediments to achieving these sorts of data rates cost effectively today on many general purpose servers and will describe an architecture and resulting product family that addresses these problems. The sustained performance levels that were achieved in the lab will be shown as well as a discussion of early customer experiences utilizing both the HIPPI-IP and ATM OC3-IP network interfaces.

  13. Deploying Server-side File System Monitoring at NERSC

    SciTech Connect

    Uselton, Andrew

    2009-05-01

    The Franklin Cray XT4 at the NERSC center was equipped with the server-side I/O monitoring infrastructure Cerebro/LMT, which is described here in detail. Insights gained from the data produced include a better understanding of instantaneous data rates during file system testing, file system behavior during regular production time, and long-term average behaviors. Information and insights gleaned from this monitoring support efforts to proactively manage the I/O infrastructure on Franklin. A simple model for I/O transactions is introduced and compared with the 250 million observations sent to the LMT database from August 2008 to February 2009.
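
    As a hedged illustration of the kind of reduction such monitoring data supports, the sketch below turns timestamped byte counters into an average data rate; the sample values and field layout are invented and do not reflect the actual Cerebro/LMT schema.

```python
# Minimal sketch: compute an average data rate (MB/s) from timestamped
# cumulative byte counters, the kind of reduction server-side I/O
# monitoring enables. The sample tuples are invented placeholders.
samples = [
    (0.0, 0),               # (seconds, cumulative bytes written)
    (10.0, 1_500_000_000),
    (20.0, 2_800_000_000),
]

def average_rate_mb_s(samples):
    (t0, b0), (t1, b1) = samples[0], samples[-1]
    return (b1 - b0) / (t1 - t0) / 1e6

print(f"{average_rate_mb_s(samples):.1f} MB/s")
```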

  14. CTserver: A Computational Thermodynamics Server for the Geoscience Community

    NASA Astrophysics Data System (ADS)

    Kress, V. C.; Ghiorso, M. S.

    2006-12-01

    The CTserver platform is an Internet-based computational resource that provides on-demand services in Computational Thermodynamics (CT) to a diverse geoscience user base. This NSF-supported resource can be accessed at ctserver.ofm-research.org. The CTserver infrastructure leverages a high-quality and rigorously tested software library of routines for computing equilibrium phase assemblages and for evaluating internally consistent thermodynamic properties of materials, e.g. mineral solid solutions and a variety of geological fluids, including magmas. Thermodynamic models are currently available for 167 phases. Recent additions include Duan, Møller and Weare's model for supercritical C-O-H-S, extended to include SO2 and S2 species, and an entirely new associated solution model for O-S-Fe-Ni sulfide liquids. This software library is accessed via the CORBA Internet protocol for client-server communication. CORBA provides a standardized, object-oriented, language and platform independent, fast, low-bandwidth interface to phase property modules running on the server cluster. Network transport, language translation and resource allocation are handled by the CORBA interface. Users access server functionality in two principal ways. Clients written as browser- based Java applets may be downloaded which provide specific functionality such as retrieval of thermodynamic properties of phases, computation of phase equilibria for systems of specified composition, or modeling the evolution of these systems along some particular reaction path. This level of user interaction requires minimal programming effort and is ideal for classroom use. A more universal and flexible mode of CTserver access involves making remote procedure calls from user programs directly to the server public interface. The CTserver infrastructure relieves the user of the burden of implementing and testing the often complex thermodynamic models of real liquids and solids. A pilot application of this distributed

  15. BPROMPT: A consensus server for membrane protein prediction.

    PubMed

    Taylor, Paul D; Attwood, Teresa K; Flower, Darren R

    2003-07-01

    Protein structure prediction is a cornerstone of bioinformatics research. Membrane proteins require their own prediction methods due to their intrinsically different composition. A variety of tools exist for topology prediction of membrane proteins, many of them available on the Internet. The server described in this paper, BPROMPT (Bayesian PRediction Of Membrane Protein Topology), uses a Bayesian Belief Network to combine the results of other prediction methods, providing a more accurate consensus prediction. Topology predictions with accuracies of 70% for prokaryotes and 53% for eukaryotes were achieved. BPROMPT can be accessed at http://www.jenner.ac.uk/BPROMPT. PMID:12824397

  16. Increased coverage of protein families with the blocks database servers.

    PubMed

    Henikoff, J G; Greene, E A; Pietrokovski, S; Henikoff, S

    2000-01-01

    The Blocks Database WWW (http://blocks.fhcrc.org ) and Email (blocks@blocks.fhcrc.org ) servers provide tools to search DNA and protein queries against the Blocks+ Database of multiple alignments, which represent conserved protein regions. Blocks+ nearly doubles the number of protein families included in the database by adding families from the Pfam-A, ProDom and Domo databases to those from PROSITE and PRINTS. Other new features include improved Block Searcher statistics, searching with NCBI's IMPALA program and 3D display of blocks on PDB structures.

  17. High-Performance Tiled WMS and KML Web Server

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2007-01-01

    This software is an Apache 2.0 module implementing a high-performance map server to support interactive map viewers and virtual planet client software. It can be used in applications that require access to very-high-resolution geolocated images, such as GIS, virtual planet applications, and flight simulators. It serves Web Map Service (WMS) requests that comply with a given request grid from an existing tile dataset. It also generates the KML super-overlay configuration files required to access the WMS image tiles.
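
    As an illustration of the WMS requests such a server answers, the sketch below builds a standard WMS 1.1.1 GetMap URL; the host name, layer, and parameter values are placeholders, not the module's actual configuration.

```python
# Minimal sketch: build a standard WMS 1.1.1 GetMap request URL of the kind
# a tiled WMS server answers. Host and layer names are placeholders.
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width=512, height=512):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/jpeg",
    }
    return base + "?" + urlencode(params)

print(getmap_url("http://example.org/wms", "global_mosaic", (-180, -90, 180, 90)))
```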

  18. SHOT: a web server for the construction of genome phylogenies.

    PubMed

    Korbel, Jan O; Snel, Berend; Huynen, Martijn A; Bork, Peer

    2002-03-01

    With the increasing availability of genome sequences, new methods are being proposed that exploit information from complete genomes to classify species in a phylogeny. Here we present SHOT, a web server for the classification of genomes on the basis of shared gene content or the conservation of gene order that reflects the dominant, phylogenetic signal in these genomic properties. In general, the genome trees are consistent with classical gene-based phylogenies, although some interesting exceptions indicate massive horizontal gene transfer. SHOT is a useful tool for analysing the tree of life from a genomic point of view. It is available at http://www.Bork.EMBL-Heidelberg.de/SHOT.

  19. Blue native-PAGE analysis of Trichoderma harzianum secretome reveals cellulases and hemicellulases working as multienzymatic complexes.

    PubMed

    da Silva, Adelson Joel; Gómez-Mendoza, Diana Paola; Junqueira, Magno; Domont, Gilberto Barbosa; Ximenes Ferreira Filho, Edivaldo; de Sousa, Marcelo Valle; Ricart, Carlos André Ornelas

    2012-08-01

    Plant cell wall-degrading enzymes produced by microorganisms possess important biotechnological applications, including biofuel production. Some anaerobic bacteria are able to produce multienzymatic complexes called cellulosomes while filamentous fungi normally secrete individual hydrolytic enzymes that act synergistically for polysaccharide degradation. Here, we present evidence that the fungus Trichoderma harzianum, cultivated in medium containing the agricultural residue sugarcane bagasse, is able to secrete multienzymatic complexes. The T. harzianum secretome was firstly analyzed by 1D-BN (blue native)-PAGE that revealed several putative complexes. The three most intense 1D-BN-PAGE bands, named complexes [I], [II], and [III], were subsequently subjected to tricine SDS-PAGE that demonstrated that they were composed of smaller subunits. Zymographic assays were performed using 1D-BN-PAGE and 2D-BN/BN-PAGE demonstrating that the complexes bore cellulolytic and xylanolytic activities. The complexes [I], [II], and [III] were then trypsin digested and analyzed separately by LC-MS/MS that revealed their protein composition. Since T. harzianum has an unsequenced genome, a homology-driven proteomics approach provided a higher number of identified proteins than a conventional peptide-spectrum matching strategy. The results indicate that the complexes are formed by cellulolytic and hemicellulolytic enzymes and other proteins such as chitinase, cutinase, and swollenin, which may act synergistically to degrade plant cell wall components.

  20. systemsDock: a web server for network pharmacology-based prediction and analysis

    PubMed Central

    Hsin, Kun-Yi; Matsuoka, Yukiko; Asai, Yoshiyuki; Kamiyoshi, Kyota; Watanabe, Tokiko; Kawaoka, Yoshihiro; Kitano, Hiroaki

    2016-01-01

    We present systemsDock, a web server for network pharmacology-based prediction and analysis, which permits docking simulation and molecular pathway map for comprehensive characterization of ligand selectivity and interpretation of ligand action on a complex molecular network. It incorporates an elaborately designed scoring function for molecular docking to assess protein–ligand binding potential. For large-scale screening and ease of investigation, systemsDock has a user-friendly GUI interface for molecule preparation, parameter specification and result inspection. Ligand binding potentials against individual proteins can be directly displayed on an uploaded molecular interaction map, allowing users to systemically investigate network-dependent effects of a drug or drug candidate. A case study is given to demonstrate how systemsDock can be used to discover a test compound's multi-target activity. systemsDock is freely accessible at http://systemsdock.unit.oist.jp/. PMID:27131384

  1. Opportunities for the Mashup of Heterogenous Data Server via Semantic Web Technology

    NASA Astrophysics Data System (ADS)

    Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; King, Todd; Hughes, John; Fung, Shing; Galkin, Ivan; Hapgood, Michael; Belehaki, Anna

    2015-04-01

    Opportunities for the Mashup of Heterogenous Data Server via Semantic Web Technology. The European Union ESPAS, Japanese IUGONET and GFZ ISDC data servers were developed for the ingestion, archiving and distribution of geo and space science domain data. Main parts of the data managed by these servers are related to near-Earth space and geomagnetic field data. A smart mashup of the data servers would allow seamless browsing of and access to data and related context information. However, achieving a high level of interoperability is a challenge because the data servers are based on different data models and software frameworks. This paper is focused on the latest experiments and results for the mashup of the data servers using the semantic Web approach. Besides the mashup of domain and terminological ontologies, the options to connect data managed by relational databases using D2R Server and SPARQL technology will especially be addressed. A successful realization of the data server mashup will not only have a positive impact on the data users of the specific scientific domain but also on related projects, such as the development of a new interoperable version of NASA's Planetary Data System (PDS) or ICSU's World Data System alliance. ESPAS data server: https://www.espas-fp7.eu/portal/ IUGONET data server: http://search.iugonet.org/iugonet/ GFZ ISDC data server (semantic Web based prototype): http://rz-vm30.gfz-potsdam.de/drupal-7.9/ NASA PDS: http://pds.nasa.gov ICSU-WDS: https://www.icsu-wds.org
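
    A minimal sketch of the SPARQL access pattern mentioned above, using the SPARQLWrapper library against a hypothetical endpoint; the endpoint URL, prefixes, and query are illustrative and do not represent the actual ESPAS, IUGONET, or ISDC interfaces.

```python
# Minimal sketch: query a D2R-style SPARQL endpoint for dataset titles.
# Endpoint URL and vocabulary are placeholders for illustration only.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://example.org/sparql")  # hypothetical endpoint
sparql.setQuery("""
    PREFIX dct: <http://purl.org/dc/terms/>
    SELECT ?dataset ?title WHERE {
        ?dataset dct:title ?title .
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["dataset"]["value"], row["title"]["value"])
```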

  2. WAMI: a web server for the analysis of minisatellite maps

    PubMed Central

    2010-01-01

    Background Minisatellites are genomic loci composed of tandem arrays of short repetitive DNA segments. A minisatellite map is a sequence of symbols that represents the tandem repeat array such that the set of symbols is in one-to-one correspondence with the set of distinct repeats. Due to variations in repeat type and organization as well as copy number, the minisatellite maps have been widely used in forensic and population studies. In either domain, researchers need to compare the set of maps to each other, to build phylogenetic trees, to spot structural variations, and to study duplication dynamics. Efficient algorithms for these tasks are required to carry them out reliably and in reasonable time. Results In this paper we present WAMI, a web-server for the analysis of minisatellite maps. It performs the above mentioned computational tasks using efficient algorithms that take the model of map evolution into account. The WAMI interface is easy to use and the results of each analysis task are visualized. Conclusions To the best of our knowledge, WAMI is the first server providing all these computational facilities to the minisatellite community. The WAMI web-interface and the source code of the underlying programs are available at http://www.nubios.nileu.edu.eg/tools/wami. PMID:20525398

  3. System level traffic shaping in disk servers with heterogeneous protocols

    NASA Astrophysics Data System (ADS)

    Cano, Eric; Kruse, Daniele Francesco

    2014-06-01

    Disk access and tape migrations compete for network bandwidth in CASTOR's disk servers, over various protocols: RFIO, Xroot, root and GridFTP. As there are a limited number of tape drives, it is important to keep them busy all the time, at their nominal speed. With potentially hundreds of user read streams per server, the bandwidth for the tape migrations has to be guaranteed to a controlled level, and not left at the fair share the system gives by default. Xroot provides a prioritization mechanism, but using it implies moving exclusively to the Xroot protocol, which is not possible in the short to mid-term time frame, as users are equally using all protocols. The greatest commonality of all those protocols is no more than the use of TCP/IP. We investigated the Linux kernel traffic shaper to control TCP/IP bandwidth. The performance and limitations of the traffic shaper have been understood in a test environment, and a satisfactory working point has been found for production. Notably, the negative impact of TCP offload engines on traffic shaping, and the limitations on the length of the traffic-shaping rules, were discovered and measured. A suitable working point has been found and the traffic shaping is now successfully deployed in the CASTOR production systems at CERN. This system-level approach could be transposed easily to other environments.
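
    The Linux kernel traffic shaper referred to above is normally driven through the tc command; the sketch below sets up an HTB queueing discipline that reserves bandwidth for one traffic class. The device name, rates, and port are illustrative values, not the CASTOR production settings.

```python
# Minimal sketch: drive the Linux traffic shaper (tc, HTB qdisc) from Python
# to reserve bandwidth for one class of TCP traffic. Device, rates, and port
# are illustrative, not the CASTOR production values. Requires root.
import subprocess

DEV = "eth0"

def sh(cmd):
    subprocess.run(cmd.split(), check=True)

# Root HTB qdisc; unclassified traffic falls into class 1:20.
sh(f"tc qdisc add dev {DEV} root handle 1: htb default 20")
# Guaranteed bandwidth for tape-migration traffic (class 1:10).
sh(f"tc class add dev {DEV} parent 1: classid 1:10 htb rate 400mbit ceil 800mbit")
# Best-effort class for user read streams.
sh(f"tc class add dev {DEV} parent 1: classid 1:20 htb rate 100mbit ceil 800mbit")
# Steer traffic to TCP port 2811 (illustrative) into the guaranteed class.
sh(f"tc filter add dev {DEV} protocol ip parent 1:0 prio 1 "
   f"u32 match ip dport 2811 0xffff flowid 1:10")
```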

  4. Seq2Ref: a web server to facilitate functional interpretation

    PubMed Central

    2013-01-01

    Background The size of the protein sequence database has been exponentially increasing due to advances in genome sequencing. However, experimentally characterized proteins only constitute a small portion of the database, such that the majority of sequences have been annotated by computational approaches. Current automatic annotation pipelines inevitably introduce errors, making the annotations unreliable. Instead of such error-prone automatic annotations, functional interpretation should rely on annotations of ‘reference proteins’ that have been experimentally characterized or manually curated. Results The Seq2Ref server uses BLAST to detect proteins homologous to a query sequence and identifies the reference proteins among them. Seq2Ref then reports publications with experimental characterizations of the identified reference proteins that might be relevant to the query. Furthermore, a plurality-based rating system is developed to evaluate the homologous relationships and rank the reference proteins by their relevance to the query. Conclusions The reference proteins detected by our server will lend insight into proteins of unknown function and provide extensive information to develop in-depth understanding of uncharacterized proteins. Seq2Ref is available at: http://prodata.swmed.edu/seq2ref. PMID:23356573

  5. A web-server of cell type discrimination system.

    PubMed

    Wang, Anyou; Zhong, Yan; Wang, Yanhua; He, Qianchuan

    2014-01-01

    Discriminating cell types is a daily request for stem cell biologists. However, there is not a user-friendly system available to date for public users to discriminate the common cell types, embryonic stem cells (ESCs), induced pluripotent stem cells (iPSCs), and somatic cells (SCs). Here, we develop WCTDS, a web-server of cell type discrimination system, to discriminate the three cell types and their subtypes like fetal versus adult SCs. WCTDS is developed as a top layer application of our recent publication regarding cell type discriminations, which employs DNA-methylation as biomarkers and machine learning models to discriminate cell types. Implemented by Django, Python, R, and Linux shell programming, run under Linux-Apache web server, and communicated through MySQL, WCTDS provides a friendly framework to efficiently receive the user input and to run mathematical models for analyzing data and then to present results to users. This framework is flexible and easy to extend for other applications. Therefore, WCTDS works as a user-friendly framework to discriminate cell types and subtypes and it can also be extended to detect other cell types like cancer cells. PMID:24578634

  6. AISMIG--an interactive server-side molecule image generator.

    PubMed

    Bohne-Lang, Andreas; Groch, Wolf-Dieter; Ranzinger, René

    2005-07-01

    Using a web browser without additional software to generate interactive high-quality, high-resolution images of bio-molecules is no longer a problem. Interactive visualization of 3D molecule structures by Internet browsers is normally not possible without additional software, and the disadvantage of browser-based structure images (e.g. by a Java applet) is their low resolution. Scientists who want to generate 3D molecular images with high quality and high resolution (e.g. for publications or to render a molecule for a poster) therefore require separately installed software that is often not easy to use. The alternative concept is an interactive server-side rendering application that can be interfaced with any web browser. Thus it combines the advantage of the web application with the high-end rendering of a raytracer. This article addresses users who want to generate high quality images from molecular structures and do not have software installed locally for structure visualization. Often people do not have a structure viewer, such as RasMol or Chime (or even Java), installed locally but want to visualize a molecule structure interactively. AISMIG (An Interactive Server-side Molecule Image Generator) is a web service that provides a visualization of molecule structures in such cases. AISMIG-URL: http://www.dkfz-heidelberg.de/spec/aismig/. PMID:15980568

  7. CTLPScanner: a web server for chromothripsis-like pattern detection

    PubMed Central

    Yang, Jian; Liu, Jixiang; Ouyang, Liang; Chen, Yi; Liu, Bo; Cai, Haoyang

    2016-01-01

    Chromothripsis is a recently observed phenomenon in cancer cells in which one or several chromosomes shatter into pieces with subsequent inaccurate reassembly and clonal propagation. This type of event generates a potentially vast number of mutations within a relatively short-time period, and has been considered as a new paradigm in cancer development. Despite recent advances, much work is still required to better understand the molecular mechanisms of this phenomenon, and thus an easy-to-use tool is in urgent need for automatically detecting and annotating chromothripsis. Here we present CTLPScanner, a web server for detection of chromothripsis-like pattern (CTLP) in genomic array data. The output interface presents intuitive graphical representations of detected chromosome pulverization region, as well as detailed results in table format. CTLPScanner also provides additional information for associated genes in chromothripsis region to help identify the potential candidates involved in tumorigenesis. To assist in performing meta-data analysis, we integrated over 50 000 pre-processed genomic arrays from The Cancer Genome Atlas and Gene Expression Omnibus into CTLPScanner. The server allows users to explore the presence of chromothripsis signatures from public data resources, without carrying out any local data processing. CTLPScanner is freely available at http://cgma.scu.edu.cn/CTLPScanner/. PMID:27185889

  8. Berkeley PHOG: PhyloFacts orthology group prediction web server.

    PubMed

    Datta, Ruchira S; Meacham, Christopher; Samad, Bushra; Neyer, Christoph; Sjölander, Kimmen

    2009-07-01

    Ortholog detection is essential in functional annotation of genomes, with applications to phylogenetic tree construction, prediction of protein-protein interaction and other bioinformatics tasks. We present here the PHOG web server employing a novel algorithm to identify orthologs based on phylogenetic analysis. Results on a benchmark dataset from the TreeFam-A manually curated orthology database show that PHOG provides a combination of high recall and precision competitive with both InParanoid and OrthoMCL, and allows users to target different taxonomic distances and precision levels through the use of tree-distance thresholds. For instance, OrthoMCL-DB achieved 76% recall and 66% precision on this dataset; at a slightly higher precision (68%) PHOG achieves 10% higher recall (86%). InParanoid achieved 87% recall at 24% precision on this dataset, while a PHOG variant designed for high recall achieves 88% recall at 61% precision, increasing precision by 37% over InParanoid. PHOG is based on pre-computed trees in the PhyloFacts resource, and contains over 366 K orthology groups with a minimum of three species. Predicted orthologs are linked to GO annotations, pathway information and biological literature. The PHOG web server is available at http://phylofacts.berkeley.edu/orthologs/.

  9. A web-server of cell type discrimination system.

    PubMed

    Wang, Anyou; Zhong, Yan; Wang, Yanhua; He, Qianchuan

    2014-01-01

    Discriminating cell types is a daily request for stem cell biologists. However, there is not a user-friendly system available to date for public users to discriminate the common cell types, embryonic stem cells (ESCs), induced pluripotent stem cells (iPSCs), and somatic cells (SCs). Here, we develop WCTDS, a web-server of cell type discrimination system, to discriminate the three cell types and their subtypes like fetal versus adult SCs. WCTDS is developed as a top layer application of our recent publication regarding cell type discriminations, which employs DNA-methylation as biomarkers and machine learning models to discriminate cell types. Implemented by Django, Python, R, and Linux shell programming, run under Linux-Apache web server, and communicated through MySQL, WCTDS provides a friendly framework to efficiently receive the user input and to run mathematical models for analyzing data and then to present results to users. This framework is flexible and easy to extend for other applications. Therefore, WCTDS works as a user-friendly framework to discriminate cell types and subtypes and it can also be extended to detect other cell types like cancer cells.

  10. 75 FR 8400 - In the Matter of Certain Wireless Communications System Server Software, Wireless Handheld...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... COMMISSION In the Matter of Certain Wireless Communications System Server Software, Wireless Handheld Devices... importation, and the sale within the United States after importation of certain wireless communications system server software, wireless handheld devices and battery packs by reason of infringement of certain...

  11. Russian and CIS Library Internet Service: An Analysis of WWW-Server Development.

    ERIC Educational Resources Information Center

    Shraiberg, Yakov

    This paper traces the expansion of the Internet into Russian and Commonwealth of Independent States (CIS) libraries from basic access to the development of World Wide Web (WWW) servers. An analysis of the most representative groups of library WWW-servers arranged by projects, by corporate library network, or by geographical characteristics is…

  12. Think They're Drunk? Alcohol Servers and the Identification of Intoxication.

    ERIC Educational Resources Information Center

    Burns, Edward D.; Nusbaumer, Michael R.; Reiling, Denise M.

    2003-01-01

    Examines practices used by servers to assess intoxication. The analysis was based upon questionnaires mailed to a random probability sample of licensed servers from one state (N = 822). Indicators found to be most important were examined in relation to a variety of occupational characteristics. Implications for training curricula, policy…

  13. Design and implementation of web server soft load balancing in small and medium-sized enterprise

    NASA Astrophysics Data System (ADS)

    Yan, Liu

    2011-12-01

    With the expansion of business scale, small and medium-sized enterprises have begun to use information platforms to improve their management and competitive ability, and the server becomes the core factor that restricts an enterprise's informatization. This paper puts forward a suitable design scheme for web server soft load balancing in small and medium-sized enterprises and proves it effective through experiment.
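
    As a minimal sketch of the soft (software) load-balancing idea, the snippet below spreads requests over a pool of backend web servers in round-robin order; the backend addresses are placeholders, and the paper's actual scheme may differ.

```python
# Minimal sketch of software ("soft") load balancing: spread incoming
# requests across a pool of backend web servers in round-robin order.
# Backend addresses are placeholders.
from itertools import cycle

BACKENDS = ["10.0.0.11:8080", "10.0.0.12:8080", "10.0.0.13:8080"]
_pool = cycle(BACKENDS)

def pick_backend() -> str:
    """Return the next backend in round-robin order."""
    return next(_pool)

for request_id in range(6):
    print(f"request {request_id} -> {pick_backend()}")
```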

  14. Developing Server-Side Infrastructure for Large-Scale E-Learning of Web Technology

    ERIC Educational Resources Information Center

    Simpkins, Neil

    2010-01-01

    The growth of E-business has made experience in server-side technology an increasingly important area for educators. Server-side skills are in increasing demand and recognised to be of relatively greater value than comparable client-side aspects (Ehie, 2002). In response to this, many educational organisations have developed E-business courses,…

  15. A Disk-Based Storage Architecture for Movie on Demand Servers.

    ERIC Educational Resources Information Center

    Ozden, Banu; And Others

    1995-01-01

    Discusses movie on demand (MOD) servers, which are computer systems that store movies in compressed digital form for broadcast cable television systems. Highlights include network bandwidths, a disk-based storage architecture for a MOD server, implementing VCR (video cassette recorder) functions to movie viewing, and buffers. (LRW)

  16. Usage of Thin-Client/Server Architecture in Computer Aided Education

    ERIC Educational Resources Information Center

    Cimen, Caghan; Kavurucu, Yusuf; Aydin, Halit

    2014-01-01

    With the advances of technology, thin-client/server architecture has become popular in multi-user/single network environments. Thin-client is a user terminal in which the user can login to a domain and run programs by connecting to a remote server. Recent developments in network and hardware technologies (cloud computing, virtualization, etc.)…

  17. 75 FR 43206 - In the Matter of Certain Wireless Communications System Server Software, Wireless Handheld...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-23

    ... COMMISSION In the Matter of Certain Wireless Communications System Server Software, Wireless Handheld Devices... communications system server software, wireless handheld devices and battery packs by reason of infringement of..., 2010, based on a complaint filed by Motorola, Inc. (``Motorola'') of Schaumburg, Illinois. 75 FR...

  18. Remote Patron Validation: Posting a Proxy Server at the Digital Doorway.

    ERIC Educational Resources Information Center

    Webster, Peter

    2002-01-01

    Discussion of remote access to library services focuses on proxy servers as a method for remote access, based on experiences at Saint Mary's University (Halifax). Topics include Internet protocol user validation; browser-directed proxies; server software proxies; vendor alternatives for validating remote users; and Internet security issues. (LRW)

  19. Design and Delivery of Multiple Server-Side Computer Languages Course

    ERIC Educational Resources Information Center

    Wang, Shouhong; Wang, Hai

    2011-01-01

    Given the emergence of service-oriented architecture, IS students need to be knowledgeable of multiple server-side computer programming languages to be able to meet the needs of the job market. This paper outlines the pedagogy of an innovative course of multiple server-side computer languages for the undergraduate IS majors. The paper discusses…

  20. Selection of Server-Side Technologies for an E-Business Curriculum

    ERIC Educational Resources Information Center

    Sandvig, J. Christopher

    2007-01-01

    The rapid growth of e-business and e-commerce has made server-side programming an increasingly important topic in information systems (IS) and computer science (CS) curricula. This article presents an overview of the major features of several popular server-side programming technologies and discusses the factors that influence the selection of…

  1. The data model of a PACS-based DICOM radiation therapy server

    NASA Astrophysics Data System (ADS)

    Law, Maria Y. Y.; Huang, H. K.; Zhang, Xiaoyan; Zhang, Jianguo

    2003-05-01

    Radiotherapy (RT) requires information and images from both diagnostic and treatment equipment. Standards for radiotherapy information have been ratified with seven DICOM-RT objects and their IODs (Information Object Definitions). However, the contents of these objects require the incorporation of the RT workflow in a logical sequence. The first step is to trace the RT workflow. The second step is to direct all images and related information in their corresponding DICOM-RT objects into a DICOM RT server and then ultimately to an RT application server. Methods: In our design, the DICOM RT server was based on a PACS data model. The data model can be translated to a web-based technology server, with an application server for RT built on top of the Web server. In the process, the contents of each of the DICOM-RT objects were customized for the RT display windows. Results: Six display windows were designed and the data model in the RT application server was developed. The images and related information were grouped into the seven DICOM-RT objects in the sequence of their procedures and customized for the seven display windows. This is an important step in organizing the data model in the application server for radiation therapy. Conclusion: Radiation therapy workflow study is a prerequisite for a data model design that can enhance image-based healthcare delivery.

  2. Building Interactive Simulations in Web Pages without Programming.

    PubMed

    Mailen Kootsey, J; McAuley, Grant; Bernal, Julie

    2005-01-01

    A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation so that a library of reusable objects could be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications are possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network. PMID:17282319

  3. How to find mathematics on a scanned page

    NASA Astrophysics Data System (ADS)

    Fateman, Richard J.

    1999-12-01

    We describe the design of document analysis procedures to separate mathematics from ordinary text on a scanned page of mixed material. It is easy to observe that the accuracy of commercial OCR programs is helped by separating mixed material into two (or more) streams, with conventional non-math text handled by the usual OCR text-based-heuristics analysis. The second stream, consisting of material judged to be mathematics, can be fed to a specialized recognizer. If that fails to decode it, it can be passed on to yet a third stream including diagrams, logos, or other miscellaneous material, perhaps including halftones. We explore the extent to which this separation can be automated in the context of scanning archival material for a digital library project including mathematical and scientific journal pages.
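
    A toy sketch of the stream-separation idea, assuming a simple character-level heuristic: lines whose symbols look mathematical are routed to a second stream. The symbol set and threshold are invented for illustration and are not the paper's actual features or procedures.

```python
# Toy sketch of routing scanned-page text lines into a text stream or a math
# stream by the fraction of math-like characters. Threshold and symbol set
# are invented; they are not the paper's actual heuristics.
MATH_CHARS = set("=+-*/^_{}()<>|∑∫√")

def math_score(line: str) -> float:
    chars = [c for c in line if not c.isspace()]
    if not chars:
        return 0.0
    return sum(c in MATH_CHARS or c.isdigit() for c in chars) / len(chars)

def split_streams(lines, threshold=0.3):
    text, math = [], []
    for line in lines:
        (math if math_score(line) >= threshold else text).append(line)
    return text, math

text, math = split_streams(["The integral converges.", "f(x) = x^2 + 3*x - 1"])
print("text:", text)
print("math:", math)
```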

  4. DuPage County chilled water storage project

    SciTech Connect

    Grumman, D.L.

    1998-10-01

    Between 1992 and 1995, the DuPage County Governmental Center in Wheaton, Illinois, commissioned a detailed analysis of its chilled water plant and distribution system, as well as its future needs and options for meeting those needs. The result was a 10,000 ton-hour (35,170 kWh) chilled water storage tank with associated components and controls. This paper describes that process and the system that resulted.

  5. Adaptation of web pages and images for mobile applications

    NASA Astrophysics Data System (ADS)

    Kopf, Stephan; Guthier, Benjamin; Lemelson, Hendrik; Effelsberg, Wolfgang

    2009-02-01

    In this paper, we introduce our new visualization service which presents web pages and images on arbitrary devices with differing display resolutions. We analyze the layout of a web page and simplify its structure and formatting rules. The small screen of a mobile device is used much better this way. Our new image adaptation service combines several techniques. In a first step, border regions which do not contain relevant semantic content are identified. Cropping is used to remove these regions. Attention objects are identified in a second step. We use face detection, text detection and contrast based saliency maps to identify these objects and combine them into a region of interest. Optionally, the seam carving technique can be used to remove inner parts of an image. Additionally, we have developed a software tool to validate, add, delete, or modify all automatically extracted data. This tool also simulates different mobile devices, so that the user gets a feeling of how an adapted web page will look. We have performed user studies to evaluate our web and image adaptation approach. Questions regarding software ergonomics, quality of the adapted content, and perceived benefit of the adaptation were asked.
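
    One of the attention cues described above, face detection, can be sketched with OpenCV's stock Haar cascade; the file names below are placeholders, and this stands in for only one of the cues (faces, text, saliency) that the service combines.

```python
# Sketch of a face-based attention-object step: detect faces with OpenCV's
# stock Haar cascade and crop the image to a box around them. File names are
# placeholders; the full service also combines text and saliency cues.
import cv2

img = cv2.imread("page_photo.jpg")                      # placeholder input
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)

if len(faces) > 0:
    x0 = min(x for x, y, w, h in faces)
    y0 = min(y for x, y, w, h in faces)
    x1 = max(x + w for x, y, w, h in faces)
    y1 = max(y + h for x, y, w, h in faces)
    cv2.imwrite("region_of_interest.jpg", img[y0:y1, x0:x1])
```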

  6. A General Purpose Connections type CTI Server Based on SIP Protocol and Its Implementation

    NASA Astrophysics Data System (ADS)

    Watanabe, Toru; Koizumi, Hisao

    In this paper, we propose a general-purpose connections-type CTI (Computer Telephony Integration) server that provides various CTI services such as voice logging, where the CTI server communicates with an IP-PBX using SIP (Session Initiation Protocol) and accumulates the voice packets of external-line telephone calls flowing between an extension IP telephone and a VoIP gateway connected to outside line networks. The CTI server realizes CTI services such as voice logging, telephone conferencing, or IVR (interactive voice response) by accumulating and processing the sampled voice packets. Furthermore, the CTI server incorporates a web server function which can provide various CTI services, such as a Web telephone directory, via a Web browser to PCs, cellular telephones or smart-phones in mobile environments.

  7. Cybersecurity, massive data processing, community interaction, and other developments at WWW-based computational X-ray Server

    NASA Astrophysics Data System (ADS)

    Stepanov, Sergey

    2013-03-01

    X-Ray Server (x-server.gmca.aps.anl.gov) is a WWW-based computational server for modeling of X-ray diffraction, reflection and scattering data. The modeling software operates directly on the server and can be accessed remotely either from web browsers or from user software. In the latter case the server can be deployed as a software library or a data fitting engine. As the server recently surpassed the milestones of 15 years online and 1.5 million calculations, it has accumulated a number of technical solutions that are discussed in this paper. The developed approaches to detecting physical model limits and user calculations failures, solutions to spam and firewall problems, ways to involve the community in replenishing databases and methods to teach users automated access to the server programs may be helpful for X-ray researchers interested in using the server or sharing their own software online.

  8. Verifying the secure setup of Unix client/servers and detection of network intrusion

    SciTech Connect

    Feingold, R.; Bruestle, H.R.; Bartoletti, T.; Saroyan, A.; Fisher, J.

    1995-07-01

    This paper describes our technical approach to developing and delivering Unix host- and network-based security products to meet the increasing challenges in information security. Today's global "Infosphere" presents us with a networked environment that knows no geographical, national, or temporal boundaries, and no ownership, laws, or identity cards. This seamless aggregation of computers, networks, databases, applications, and the like store, transmit, and process information. This information is now recognized as an asset to governments, corporations, and individuals alike. This information must be protected from misuse. The Security Profile Inspector (SPI) performs static analyses of Unix-based clients and servers to check on their security configuration. SPI's broad range of security tests and flexible usage options support the needs of novice and expert system administrators alike. SPI's use within the Department of Energy and Department of Defense has resulted in more secure systems, less vulnerable to hostile intentions. Host-based information protection techniques and tools must also be supported by network-based capabilities. Our experience shows that a weak link in a network of clients and servers presents itself sooner or later, and can be more readily identified by dynamic intrusion detection techniques and tools. The Network Intrusion Detector (NID) is one such tool. NID is designed to monitor and analyze activity on an Ethernet broadcast Local Area Network segment and produce transcripts of suspicious user connections. NID's retrospective and real-time modes have proven invaluable to security officers faced with ongoing attacks to their systems and networks.

  9. viwish: a visualization server for protein modelling and docking.

    PubMed

    Klein, T; Ackermann, F; Posch, S

    1996-12-12

    A visualization tool viwish for proteins based on the Tcl command language has been developed. The system is completely menu driven and can display arbitrarily many proteins in arbitrarily many windows. It is easy to use, even for non-computer experts, and provides possibilities to modify menus, configurations, and windows. It may be used as a stand-alone molecular graphics package or as a graphics server for external programs. Communication with these client applications is established even across different machines (through the send command of Tk, an extension of Tcl). In addition, a wide range of chemical data like molecular surfaces and 3D gridded samplings of chemical features can be displayed. The system is therefore especially useful for the development of algorithms that need visual feedback; it is distributed freely, including the source code.

  10. Optimal routing of IP packets to multi-homed servers

    SciTech Connect

    Swartz, K.L.

    1992-08-01

    Multi-homing, or direct attachment to multiple networks, offers both performance and availability benefits for important servers on busy networks. Exploiting these benefits to their fullest requires a modicum of routing knowledge in the clients. Careful policy control must also be reflected in the routing used within the network to make best use of specialized and often scarce resources. While relatively straightforward in theory, this problem becomes much more difficult to solve in a real network containing often intractable implementations from a variety of vendors. This paper presents an analysis of the problem and proposes a useful solution for a typical campus network. Application of this solution at the Stanford Linear Accelerator Center is studied and the problems and pitfalls encountered are discussed, as are the workarounds used to make the system work in the real world.
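    On the client side, the "modicum of routing knowledge" can be as simple as knowing which of the server's addresses to dial and which local interface to source the connection from. The Python sketch below illustrates the idea only; all addresses are placeholders and the selection policy is deliberately simplistic.

      # Illustrative client-side policy for reaching a multi-homed server:
      # try the server's addresses in preference order, sourcing traffic from a
      # chosen local interface so it stays on the faster network when possible.
      import socket

      SERVER_ADDRESSES = ["192.0.2.10", "198.51.100.10"]   # placeholder server attachments
      PREFERRED_LOCAL_SOURCE = "192.0.2.55"                 # placeholder client interface

      def connect_preferring_fast_network(port=80):
          for server_ip in SERVER_ADDRESSES:
              sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
              sock.settimeout(5)
              try:
                  sock.bind((PREFERRED_LOCAL_SOURCE, 0))   # pick the source interface
                  sock.connect((server_ip, port))
                  return sock
              except OSError:
                  sock.close()
          raise ConnectionError("no server address reachable from the preferred interface")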

  11. World wide web implementation of the Langley technical report server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.

    1994-01-01

    On January 14, 1993, NASA Langley Research Center (LaRC) made approximately 130 formal, 'unclassified, unlimited' technical reports available via the anonymous FTP Langley Technical Report Server (LTRS). LaRC was the first organization to provide a significant number of aerospace technical reports for open electronic dissemination. LTRS has been successful in its first 18 months of operation, with over 11,000 reports distributed, and has helped lay the foundation for electronic document distribution for NASA. The availability of World Wide Web (WWW) technology has revolutionized the Internet-based information community. This paper describes the transition of LTRS from a centralized FTP site to a distributed data model using the WWW, and suggests how the general model for LTRS can be applied to other similar systems.

  12. Utilization of Virtual Server Technology in Mission Operations

    NASA Technical Reports Server (NTRS)

    Felton, Larry; Lankford, Kimberly; Pitts, R. Lee; Pruitt, Robert W.

    2010-01-01

    Virtualization provides the opportunity to continue to do "more with less": more computing power with fewer physical boxes, thus reducing the overall hardware footprint, power and cooling requirements, software licenses, and their associated costs. This paper explores the advantages and disadvantages of virtualization in all of the environments associated with the software and systems development-to-operations flow. It includes the use and benefits of the Intelligent Platform Management Interface (IPMI) specification, and identifies lessons learned concerning hardware and network configurations. Using the Huntsville Operations Support Center (HOSC) at NASA Marshall Space Flight Center as an example, we demonstrate that deploying virtualized servers as a means of managing computing resources is applicable and beneficial to many areas of application, up to and including flight operations.
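    As a small illustration of the IPMI usage mentioned above, the sketch below polls a server's chassis status through the standard ipmitool command-line client, wrapped in Python for scripting; the BMC host name and credentials are placeholders, and ipmitool is assumed to be installed.

      # Sketch: query remote server health over IPMI via the ipmitool CLI.
      # Host and credentials below are placeholders.
      import subprocess

      def ipmi_chassis_status(host, user, password):
          cmd = ["ipmitool", "-I", "lanplus", "-H", host,
                 "-U", user, "-P", password, "chassis", "status"]
          result = subprocess.run(cmd, capture_output=True, text=True, check=True)
          return result.stdout

      print(ipmi_chassis_status("bmc.example.org", "admin", "secret"))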

  13. GlusterFS One Storage Server to Rule Them All

    SciTech Connect

    Boyer, Eric B.; Broomfield, Matthew C.; Perrotti, Terrell A.

    2012-07-30

    GlusterFS is a Linux-based distributed file system, designed to be highly scalable and to serve many clients. Reasons to use GlusterFS include: no centralized metadata server, scalability, open source, dynamic and live service modifications, operation over InfiniBand or Ethernet, tunability for speed and/or resilience, and flexible administration. It is useful in enterprise environments such as virtualization and high-performance computing (HPC), and it works with Mac, Linux, and Windows clients. Conclusions are: (1) GlusterFS proved to have widespread capabilities as a virtual file system; (2) scalability is very dependent upon the underlying hardware; (3) it lacks a built-in encryption and security paradigm; and (4) it is best suited to a general-purpose computing environment.
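    For orientation, the sketch below shows the handful of gluster CLI commands typically used to bring up a simple two-way replicated volume, wrapped in Python for scripting. Host names and brick paths are placeholders, and glusterd is assumed to be running on both nodes.

      # Sketch: create and start a two-way replicated GlusterFS volume with the
      # standard gluster CLI. Host names and brick paths are placeholders.
      import subprocess

      def sh(cmd):
          print("+", " ".join(cmd))
          subprocess.run(cmd, check=True)

      sh(["gluster", "peer", "probe", "server2"])
      sh(["gluster", "volume", "create", "gv0", "replica", "2",
          "server1:/bricks/brick1/gv0", "server2:/bricks/brick1/gv0"])
      sh(["gluster", "volume", "start", "gv0"])
      sh(["gluster", "volume", "info", "gv0"])
      # Clients then mount the volume, e.g.: mount -t glusterfs server1:/gv0 /mnt/gv0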

  14. Scripps Genome ADVISER: Annotation and Distributed Variant Interpretation SERver

    PubMed Central

    Pham, Phillip H.; Shipman, William J.; Erikson, Galina A.; Schork, Nicholas J.; Torkamani, Ali

    2015-01-01

    Interpretation of human genomes is a major challenge. We present the Scripps Genome ADVISER (SG-ADVISER) suite, which aims to fill the gap between data generation and genome interpretation by performing holistic, in-depth annotations and functional predictions on all variant types and effects. The SG-ADVISER suite includes a de-identification tool, a variant annotation web server, and a user interface for inheritance- and annotation-based filtration. SG-ADVISER allows users with no bioinformatics expertise to manipulate large volumes of variant data with ease, without the need to download large reference databases, install software, or use a command-line interface. SG-ADVISER is freely available at genomics.scripps.edu/ADVISER. PMID:25706643

  15. GWFASTA: server for FASTA search in eukaryotic and microbial genomes.

    PubMed

    Issac, Biju; Raghava, G P S

    2002-09-01

    Similarity searches are a powerful method for solving important biological problems such as database scanning, evolutionary studies, gene prediction, and protein structure prediction. FASTA is a widely used sequence comparison tool for rapid database scanning. Here we describe the GWFASTA server, which was developed to assist the FASTA user in similarity searches against partially and/or completely sequenced genomes. GWFASTA covers more than 60 microbial genomes, eight eukaryote genomes, and the proteomes of annotated genomes; in fact, it provides the largest number of databases for similarity searching from a single platform. GWFASTA allows the submission of more than one sequence as a single query for a FASTA search. It also provides integrated post-processing of FASTA output, including compositional analysis of proteins, multiple sequence alignment, and phylogenetic analysis. Furthermore, it summarizes the search results organism-wise for prokaryotes and chromosome-wise for eukaryotes. Thus, the integration of different tools for sequence analysis makes GWFASTA a powerful tool for biologists. PMID:12238765
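    Programmatic submission to a genome-wide search service of this kind usually amounts to posting FASTA-formatted text to a web form. The sketch below shows the general pattern only; the URL and form-field names are hypothetical placeholders, not the actual GWFASTA interface.

      # Illustrative pattern for submitting FASTA sequences to a similarity-search
      # web service. The URL and field names are hypothetical placeholders.
      import urllib.parse
      import urllib.request

      QUERY = ">query1\nMKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQV\n"

      def submit_search(sequence_text, genome="Escherichia_coli"):
          data = urllib.parse.urlencode({
              "sequence": sequence_text,   # hypothetical field name
              "genome": genome,            # hypothetical field name
          }).encode("ascii")
          url = "https://example.org/gwfasta/submit"  # placeholder URL
          with urllib.request.urlopen(url, data=data, timeout=120) as resp:
              return resp.read().decode("utf-8", errors="replace")

      print(submit_search(QUERY)[:500])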

  16. Optimal Routing in General Finite Multi-Server Queueing Networks

    PubMed Central

    van Woensel, Tom; Cruz, Frederico R. B.

    2014-01-01

    The design of general finite multi-server queueing networks is a challenging problem that arises in many real-life situations, including computer networks, manufacturing systems, and telecommunication networks. In this paper, we examine the optimal routing problem in arbitrarily configured acyclic queueing networks. The performance of the finite queueing network is evaluated with a known approximate performance evaluation method, and the optimization is carried out by means of a heuristic based on the Powell algorithm. The proposed methodology is then applied to determine the optimal routing probability vector that maximizes the throughput of the queueing network. We show numerical results for some networks to quantify the quality of the routing vector approximations obtained. PMID:25010660
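    The paper's own performance-evaluation method is not reproduced here, but the sketch below conveys the flavour of the optimization step: a Powell-type search over a routing probability that splits Poisson arrivals between two finite M/M/1/K queues so as to maximize total throughput. The rates, buffer size, and two-queue topology are illustrative assumptions, not taken from the paper.

      # Simplified illustration (not the paper's method): Powell search over the
      # probability p of routing arrivals to server 1, maximizing the combined
      # throughput of two finite M/M/1/K queues.
      import numpy as np
      from scipy.optimize import minimize

      LAMBDA = 5.0          # total arrival rate (assumed)
      MU = (4.0, 3.0)       # service rates of the two servers (assumed)
      K = 10                # buffer size of each queue (assumed)

      def mm1k_throughput(lam, mu, k):
          rho = lam / mu
          if abs(rho - 1.0) < 1e-9:
              blocking = 1.0 / (k + 1)
          else:
              blocking = (1 - rho) * rho**k / (1 - rho**(k + 1))
          return lam * (1 - blocking)          # accepted (non-blocked) arrival rate

      def negative_total_throughput(x):
          p = float(np.clip(x[0], 0.0, 1.0))   # keep the routing probability feasible
          return -(mm1k_throughput(p * LAMBDA, MU[0], K) +
                   mm1k_throughput((1 - p) * LAMBDA, MU[1], K))

      res = minimize(negative_total_throughput, x0=[0.5], method="Powell")
      p_opt = float(np.clip(res.x[0], 0.0, 1.0))
      print(f"route to server 1 with p = {p_opt:.3f}; throughput = {-res.fun:.3f}")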

  17. Network time synchronization servers at the US Naval Observatory

    NASA Technical Reports Server (NTRS)

    Schmidt, Richard E.

    1995-01-01

    Responding to an increased demand for reliable, accurate time on the Internet and MILNET, the U.S. Naval Observatory Time Service has established the network time servers tick.usno.navy.mil and tock.usno.navy.mil. The system clocks of these HP9000/747i industrial workstations are synchronized to within a few tens of microseconds of USNO Master Clock 2 using VMEbus IRIG-B interfaces. Redundant time code is available from a VMEbus GPS receiver. UTC(USNO) is provided over the network via a number of protocols, including the Network Time Protocol (NTP) (DARPA Network Working Group Report RFC-1305), the Daytime Protocol (RFC-867), and the Time Protocol (RFC-868). Access to USNO network time services is presently open and unrestricted. An overview of USNO time services and results of LAN and WAN time synchronization tests will be presented.
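    For readers unfamiliar with the protocols listed above, the sketch below shows a minimal SNTP-style query in Python: a 48-byte client packet sent to UDP port 123, with the server's transmit timestamp read back and converted to Unix time. The host name is taken from the abstract; whether it still answers anonymous queries today is not guaranteed.

      # Minimal SNTP-style query: send a 48-byte NTP client packet to UDP port 123
      # and read the server transmit timestamp (bytes 40-47 of the reply).
      import socket
      import struct
      import time

      NTP_UNIX_OFFSET = 2208988800  # seconds between the 1900 NTP epoch and the 1970 Unix epoch

      def sntp_time(server="tick.usno.navy.mil", timeout=5.0):
          packet = bytearray(48)
          packet[0] = 0x1B  # LI = 0, version = 3, mode = 3 (client)
          with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
              sock.settimeout(timeout)
              sock.sendto(bytes(packet), (server, 123))
              data, _ = sock.recvfrom(48)
          seconds, fraction = struct.unpack("!II", data[40:48])
          return seconds - NTP_UNIX_OFFSET + fraction / 2**32

      remote = sntp_time()
      print(f"server time: {remote:.6f}  local offset: {remote - time.time():+.6f} s")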

  18. SciServer Compute brings Analysis to Big Data in the Cloud

    NASA Astrophysics Data System (ADS)

    Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara

    2016-06-01

    SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment. SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines. Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally, but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data. SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer's new Single Sign-on Portal. We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts
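    A typical SciServer Compute notebook cell runs a SQL query server-side and gets a table back. The sketch below follows the publicly distributed SciServer Python client; the module name, the executeQuery signature, and the "DR14" context are assumptions to be checked against the current documentation.

      # Sketch of a server-side query from a SciServer Compute notebook.
      # The SciServer.CasJobs module, executeQuery signature, and context name
      # are assumptions based on the public SciServer Python client.
      from SciServer import CasJobs

      sql = ("SELECT TOP 10 objID, ra, dec, r "
             "FROM PhotoObj WHERE r BETWEEN 15 AND 16")

      table = CasJobs.executeQuery(sql=sql, context="DR14")  # returns a tabular result
      print(table)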

  20. Prototype of Multifunctional Full-text Library in the Architecture Web-browser / Web-server / SQL-server

    NASA Astrophysics Data System (ADS)

    Lyapin, Sergey; Kukovyakin, Alexey

    Within the framework of the research program "Textaurus", an operational prototype of the multifunctional library T-Libra v.4.1 has been created, which makes it possible to carry out flexible, parametrizable search within a full-text database. The information system is realized in the architecture Web-browser / Web-server / SQL-server. This makes it possible to combine universality and efficiency of text processing, on the one hand, with convenience and minimal expense for the end user (thanks to a standard Web-browser serving as the client application), on the other. The following principles underlie the information system: a) multifunctionality, b) intelligence, c) multilingual primary texts and full-text searching, d) development of the digital library (DL) by a user (an "administrative client"), and e) multi-platform operation. A "library of concepts", i.e. a block of functional models of semantic (concept-oriented) searching, together with a closely connected subsystem of parametrizable queries to the full-text database, serves as the conceptual basis of the multifunctionality and "intelligence" of the DL T-Libra v.4.1. The unit of full-text search in the proposed technology is the author's paragraph. The "logic" of an educational or scientific topic or problem can be built into the multilevel, flexible structure of a query and into the "library of concepts", which can be extended by developers and experts. About ten queries of varying complexity and conceptual depth are implemented in the current version of the information system: from simple terminological search (taking into account the lexical and grammatical paradigms of Russian) to several kinds of explication of terminological fields and an adjustable two-parameter thematic search (the parameters being a [set of terms] and a [distance between terms] within an author's paragraph).
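    The two-parameter thematic search described above can be prototyped in a few lines: return every author's paragraph that contains all terms from a given set, with matching occurrences falling inside a window of at most the given word distance. The Python sketch below is a language-neutral simplification (no Russian morphology) with illustrative names.

      # Sketch of a two-parameter thematic search: [set of terms] + [distance between terms],
      # evaluated per paragraph. Morphological normalization is omitted.
      import re
      from itertools import product

      def paragraph_matches(paragraph, terms, max_distance):
          words = re.findall(r"\w+", paragraph.lower())
          positions = []
          for term in terms:
              hits = [i for i, w in enumerate(words) if w == term.lower()]
              if not hits:
                  return False                 # a required term is missing
              positions.append(hits)
          # Does one occurrence of each term fit inside a max_distance-word window?
          return any(max(c) - min(c) <= max_distance for c in product(*positions))

      def thematic_search(paragraphs, terms, max_distance):
          return [p for p in paragraphs if paragraph_matches(p, terms, max_distance)]

      docs = ["The digital library stores full-text documents for concept-oriented search.",
              "Full-text search within an author's paragraph uses a distance parameter."]
      print(thematic_search(docs, ["search", "paragraph"], max_distance=8))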