Science.gov

Sample records for active server pages

  1. Dynamic Web Pages: Performance Impact on Web Servers.

    ERIC Educational Resources Information Center

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)

  2. Social Bookmarking Induced Active Page Ranking

    NASA Astrophysics Data System (ADS)

    Takahashi, Tsubasa; Kitagawa, Hiroyuki; Watanabe, Keita

    Social bookmarking services have recently made it possible for us to register and share our own bookmarks on the web and are attracting attention. The services let us get structured data: (URL, Username, Timestamp, Tag Set). And these data represent user interest in web pages. The number of bookmarks is a barometer of web page value. Some web pages have many bookmarks, but most of those bookmarks may have been posted far in the past. Therefore, even if a web page has many bookmarks, their value is not guaranteed. If most of the bookmarks are very old, the page may be obsolete. In this paper, by focusing on the timestamp sequence of social bookmarkings on web pages, we model their activation levels representing current values. Further, we improve our previously proposed ranking method for web search by introducing the activation level concept. Finally, through experiments, we show effectiveness of the proposed ranking method.

  3. Shakespeare Page to Stage: An Active Approach to "Othello."

    ERIC Educational Resources Information Center

    Thomas, Peter

    1994-01-01

    Presents an account of how one English teacher taught William Shakespeare's "Othello" through dramatics in a challenging way. Considers how teachers of drama might discuss props, stage directions, and the proper handling of Desdemona's handkerchief. Explains how teachers should try to take the plays from "page to stage." (HB)

  4. Establishment of Textbook Information Management System Based on Active Server Page

    ERIC Educational Resources Information Center

    Geng, Lihua

    2011-01-01

    In the process of textbook management of universities, the flow of storage, collection and check of textbook is quite complicated and daily management flow and system also seriously constrains the efficiency of the management process. Thus, in order to combine the information management model and the traditional management model, it is necessary…

  5. The Green Pages: Environmental Education Activities K-12.

    ERIC Educational Resources Information Center

    Clearing, 1990

    1990-01-01

    Presented are 37 environmental science activities for students in grades K-12. Topics include water pollution, glaciers, protective coloration, shapes in nature, environmental impacts, recycling, creative writing, litter, shapes found in nature, color, rain cycle, waste management, plastics, energy, pH, landfills, runoff, watersheds,…

  6. The Green Pages: Environmental Education Activities K-12.

    ERIC Educational Resources Information Center

    Clearing, 1990

    1990-01-01

    Presented are 20 science activities for students K-12. Topics include role playing, similarities between urban and forest communities, ecosystems, garbage, recycling, food production, habitats, insects, tidal zone, animals, diversity, interest groups, rivers, spaceship earth, ecological interactions, and the cost of recreation. (KR)

  7. The Green Pages: Environmental Education Activities K-12.

    ERIC Educational Resources Information Center

    Clearing, 1991

    1991-01-01

    Presented are 38 environmental education activities for grades K-12. Topics include seed dispersal, food chains, plant identification, sizes and shapes, trees, common names, air pollution, recycling, temperature, litter, water conservation, photography, insects, urban areas, diversity, natural cycles, rain, erosion, phosphates, human population,…

  8. 76 FR 2754 - Agency Information Collection (Pay Now Enter Info Page) Activity Under OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-14

    ... payments through VA's Pay Now Enter Info Page website. Data enter on the Pay Now Enter Info Page is redirected to the Department of Treasury's Pay.gov website allowing claimants to make payments with credit...

  9. Land Use and Climate Impacts on Fluvial Systems (LUCIFS): A PAGES - Focus 4 (PHAROS) research activity

    NASA Astrophysics Data System (ADS)

    Dearing, John; Hoffmann, Thomas

    2010-05-01

    LUCIFS is a global research program which is concerned with understanding past interactions between climate, human activity and fluvial systems. Its focus is on evaluating the geomorphic impact of humans on landscapes, with a strong emphasis on geomorphological and sedimentological perspectives on mid- to long-term man-landscape interactions. Of particular relevance are aspects of sediment redistribution systems such as non-linear behaviour, the role of system configuration, scale effects, and emergent properties Over the last decade the LUCIFS program has been investigating both contemporary and long-term river response to global change with the principal aims of i)quantifying land use and climate change impacts of river-borne fluxes of water, sediment, C, N and P; ii) identification of key controls on these fluxes at the catchment scale; and iii) identification of the feedback on both human society and biogeochemical cycles of long-term changes in the fluxes of these materials The major scientific tasks of the LUCIFS-program are: • synthesising results of regional case studies • identify regional gaps and encouraging new case studies • addressing research gaps and formulating new research questions • organising workshops and conferences In this paper we present the LUCIFS program within the new PAGES structure. LUCIFS is located in the Focus 4 (PHAROS) dealing with how a knowledge of human-climate-ecosystem interactions in the past can help inform understanding and management today. In conjunction with the other working groups HITE (Human Impacts on Terrestrial Ecosystems), LIMPACS (Human Impacts on Lake Ecosystems) and IHOPE (Integrated History of People on Earth) PHAROS aims to compare regional-scale reconstructions of environmental and climatic processes using natural archives, documentary and instrumental data, with evidence of past human activity obtained from historical, paleoecological and archaeological records.

  10. Basics. [A Compilation of Learning Activities Pages from Seven Issues of Instructor Magazine, September 1982 through March 1983 and May 1983.

    ERIC Educational Resources Information Center

    Instructor, 1983

    1983-01-01

    This collection of 18 learning activities pages focuses on the subject areas of science, language arts, mathematics, and social studies. The science activities pages concern the study of earthquakes, sound, environmental changes, snails and slugs, and friction. Many of the activities are in the form of experiments for the students to perform.…

  11. Server-Side Includes Made Simple.

    ERIC Educational Resources Information Center

    Fagan, Jody Condit

    2002-01-01

    Describes server-side include (SSI) codes which allow Webmasters to insert content into Web pages without programming knowledge. Explains how to enable the codes on a Web server, provides a step-by-step process for implementing them, discusses tags and syntax errors, and includes examples of their use on the Web site for Southern Illinois…

  12. The Green Pages Environmental Education Activities K-12: Gardens for Young Growing Lives.

    ERIC Educational Resources Information Center

    Larson, Jan

    1997-01-01

    Describes several gardening activities that can be kept simple or used as a foundation for more in-depth projects. Activities include setting up an indoor garden spot, making compost which helps students understand the terms "decompose" and "compost", watching plants drink in which students measure water movement in plants, making herb gardens,…

  13. Web Page Design.

    ERIC Educational Resources Information Center

    Lindsay, Lorin

    Designing a web home page involves many decisions that affect how the page will look, the kind of technology required to use the page, the links the page will provide, and kinds of patrons who can use the page. The theme of information literacy needs to be built into every web page; users need to be taught the skills of sorting and applying…

  14. ArachnoServer: a database of protein toxins from spiders

    PubMed Central

    2009-01-01

    Background Venomous animals incapacitate their prey using complex venoms that can contain hundreds of unique protein toxins. The realisation that many of these toxins may have pharmaceutical and insecticidal potential due to their remarkable potency and selectivity against target receptors has led to an explosion in the number of new toxins being discovered and characterised. From an evolutionary perspective, spiders are the most successful venomous animals and they maintain by far the largest pool of toxic peptides. However, at present, there are no databases dedicated to spider toxins and hence it is difficult to realise their full potential as drugs, insecticides, and pharmacological probes. Description We have developed ArachnoServer, a manually curated database that provides detailed information about proteinaceous toxins from spiders. Key features of ArachnoServer include a new molecular target ontology designed especially for venom toxins, the most up-to-date taxonomic information available, and a powerful advanced search interface. Toxin information can be browsed through dynamic trees, and each toxin has a dedicated page summarising all available information about its sequence, structure, and biological activity. ArachnoServer currently manages 567 protein sequences, 334 nucleic acid sequences, and 51 protein structures. Conclusion ArachnoServer provides a single source of high-quality information about proteinaceous spider toxins that will be an invaluable resource for pharmacologists, neuroscientists, toxinologists, medicinal chemists, ion channel scientists, clinicians, and structural biologists. ArachnoServer is available online at http://www.arachnoserver.org. PMID:19674480

  15. Full Page Departmental Advertising.

    ERIC Educational Resources Information Center

    Van Zante, Ben

    1978-01-01

    States that many school newspapers are condensing all advertising into one or two pages. Indicates that advertisers find this to be acceptable, students continue to read the ads, and the content pages look better. (TJ)

  16. FDA Kids' Home Page

    MedlinePlus

    ... kids learn about health and safety. To Help You Stay Healthy! Kids & Teens CVM Kid's Page National Agricultural Library Kids and Teens page Spotlight Pill Bottle Pete More in For Kids Resources for You Consumer Updates Children's Health Page Last Updated: 10/ ...

  17. Ondvrejov solar radio WWW page

    NASA Astrophysics Data System (ADS)

    Jivrivcka, Karel; Meszarosova, Hana

    Since mid 1997 the Solar Radio Astronomy Group of the Astronomical Institute in Ondvrejov has been running a new WWW page. You can find us at the address --- http://sunkl.asu.cas.cz/radio/ --- where information about our instruments, observed frequencies, as well as about our data archive is available. The home page includes four main topics: 1. Observation & Instrumentation --- here you can find information about which instruments are currently in use, observed frequencies, time resolutions etc. Presently we use for solar radio observations three dedicated instruments: RT3 -- single frequency 3 GHz receiver with 10 ms time resolution RT4 -- radio spectrograph 2.0--4.5 GHz with 100 ms time resolution RT5 -- radio spectrograph 0.8--2.0 GHz with 100 ms time resolution The observations are run daily, while the Sun is higher then 5 degrees above the horizon. Because of the tremendous amount of data, only chosen time intervals with radio events are archived. Event Archive Info: --- here you can find information about archived data (date, time interval) as well as overall images of individual events in GIF format (the image names are at the same time hyperlinks for direct loading) sorted by date and time of observation. Gallery: --- here you can find some representative examples of spectra of solar radio events, recorded by our instruments. Anonymous FTP Server: --- enables direct FTP access to our image archive. This WWW page should give you an insight, what data are available and what the events look roughly like. The uncalibrated rough data from our archive can be processed only by special programs and are not generally free. But if you are interested in a particular event from our archive, you can contact us via e-mail address: radio@asu.cas.cz This work has been supported by the Czech Academy of Sciences through grant no.A3003707.

  18. Web server for priority ordered multimedia services

    NASA Astrophysics Data System (ADS)

    Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund

    2001-10-01

    In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions of the CM services. The type of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of the distributed network with load balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for an improved disk access and a higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS ouput is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority ordered buffering of the retrieved Web pages and CM data streams that are fed into an auto regressive moving average (ARMA) based traffic shaping circuitry before being transmitted through the network.

  19. Versatile page numbering analysis

    NASA Astrophysics Data System (ADS)

    Déjean, Hervé; Meunier, Jean-Luc

    2008-01-01

    In this paper, we revisit the problem of detecting the page numbers of a document. This work is motivated by a need for a generic method which applies on a large variety of documents, as well as the need for analyzing the document page numbering scheme rather than spotting one number per page. We propose here a novel method, based on the notion of sequence, which goes beyond any previous described work, and we report on an extensive evaluation of its performance.

  20. Evolving dynamic web pages using web mining

    NASA Astrophysics Data System (ADS)

    Menon, Kartik; Dagli, Cihan H.

    2003-08-01

    The heterogeneity and the lack of structure that permeates much of the ever expanding information sources on the WWW makes it difficult for the user to properly and efficiently access different web pages. Different users have different needs from the same web page. It is necessary to train the system to understand the needs and demands of the users. In other words there is a need for efficient and proper web mining. In this paper issues and possible ways of training the system and providing high level of organization for semi structured data available on the web is discussed. Web pages can be evolved based on history of query searches, browsing, links traversed and observation of the user behavior like book marking and time spent on viewing. Fuzzy clustering techniques help in grouping natural users and groups, neural networks, association rules and web traversals patterns help in efficient sequential anaysis based on previous searches and queries by the user. In this paper we analyze web server logs using above mentioned techniques to know more about user interactions. Analyzing these web server logs help to closely understand the user behavior and his/her web access pattern.

  1. Wetlands and Web Pages.

    ERIC Educational Resources Information Center

    Tisone-Bartels, Dede

    1998-01-01

    Argues that the preservation of areas like the Shoreline Park (California) wetlands depends on educating students about the value of natural resources. Describes the creation of a Web page on the wetlands for third-grade students by seventh-grade art and ecology students. Outlines the technical process of developing a Web page. (DSK)

  2. Configuring Battalion File Servers

    DTIC Science & Technology

    2012-01-01

    AGM Server 2008 to load on a Dell D630 laptop. Though not ideal, it did allow the battalion staff and command group to share...and install the AGM Microsoft Server 2008. The final contract included two Dell R610 1U servers with RAID 5 comprising of three 1 TB hard drives...continuity in data between garrison and deployment environ- ments. With the usage of AGM Server operating systems, the Army is

  3. Sign language Web pages.

    PubMed

    Fels, Deborah I; Richards, Jan; Hardman, Jim; Lee, Daniel G

    2006-01-01

    The WORLD WIDE WEB has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, Web accessing can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The present article describes a system that allows sign language-only Web pages to be created and linked through a video-based technique called sign-linking. In two studies, 14 Deaf participants examined two iterations of signlinked Web pages to gauge the usability and learnability of a signing Web page interface. The first study indicated that signing Web pages were usable by sign language users but that some interface features required improvement. The second study showed increased usability for those features; users consequently couldnavigate sign language information with ease and pleasure.

  4. Making Pages That Move.

    ERIC Educational Resources Information Center

    Gepner, Ivan

    2001-01-01

    Explains the mechanism of producing dynamic computer pages which is based on three technologies: (1) the document object model; (2) cascading stylesheets; and (3) javascript. Discusses the applications of these techniques in genetics and developmental biology. (YDS)

  5. TCRC Fertility Page

    MedlinePlus

    The Testicular Cancer Resource Center The TCRC Fertility Page Testicular Cancer and fertility are interrelated in numerous ways. TC usually ... Orchiectomy: As I mentioned, many men who have testicular cancer also already have fertility problems. In some, the ...

  6. Page turning system

    NASA Technical Reports Server (NTRS)

    Kerley, James J. (Inventor); Eklund, Wayne D. (Inventor)

    1992-01-01

    A device for holding reading materials for use by readers without arm mobility is presented. The device is adapted to hold the reading materials in position for reading with the pages displayed to enable turning by use of a rubber tipped stick that is held in the mouth and has a pair of rectangular frames. The frames are for holding and positioning the reading materials opened in reading posture with the pages displayed at a substantially unobstructed sighting position for reading. The pair of rectangular frames are connected to one another by a hinge so the angle between the frames may be varied thereby varying the inclination of the reading material. A pair of bent spring mounted wires for holding opposing pages of the reading material open for reading without substantial visual interference of the pages is mounted to the base. The wires are also adjustable to the thickness of the reading material and have a variable friction adjustment. This enables the force of the wires against the pages to be varied and permits the reader to manipulate the pages with the stick.

  7. Dali server: conservation mapping in 3D.

    PubMed

    Holm, Liisa; Rosenström, Päivi

    2010-07-01

    Our web site (http://ekhidna.biocenter.helsinki.fi/dali_server) runs the Dali program for protein structure comparison. The web site consists of three parts: (i) the Dali server compares newly solved structures against structures in the Protein Data Bank (PDB), (ii) the Dali database allows browsing precomputed structural neighbourhoods and (iii) the pairwise comparison generates suboptimal alignments for a pair of structures. Each part has its own query form and a common format for the results page. The inputs are either PDB identifiers or novel structures uploaded by the user. The results pages are hyperlinked to aid interactive analysis. The web interface is simple and easy to use. The key purpose of interactive analysis is to check whether conserved residues line up in multiple structural alignments and how conserved residues and ligands cluster together in multiple structure superimpositions. In favourable cases, protein structure comparison can lead to evolutionary discoveries not detected by sequence analysis.

  8. Title and title page.

    PubMed

    Peh, W C G; Ng, K H

    2008-08-01

    The title gives the first impression of a scientific article, and should accurately convey to a reader what the whole article is about. A good title is short, informative and attractive. The title page provides information about the authors, their affiliations and the corresponding author's contact details.

  9. Automatic page layout using genetic algorithms for electronic albuming

    NASA Astrophysics Data System (ADS)

    Geigel, Joe; Loui, Alexander C. P.

    2000-12-01

    In this paper, we describe a flexible system for automatic page layout that makes use of genetic algorithms for albuming applications. The system is divided into two modules, a page creator module which is responsible for distributing images amongst various album pages, and an image placement module which positions images on individual pages. Final page layouts are specified in a textual form using XML for printing or viewing over the Internet. The system makes use of genetic algorithms, a class of search and optimization algorithms that are based on the concepts of biological evolution, for generating solutions with fitness based on graphic design preferences supplied by the user. The genetic page layout algorithm has been incorporated into a web-based prototype system for interactive page layout over the Internet. The prototype system is built using client-server architecture and is implemented in java. The system described in this paper has demonstrated the feasibility of using genetic algorithms for automated page layout in albuming and web-based imaging applications. We believe that the system adequately proves the validity of the concept, providing creative layouts in a reasonable number of iterations. By optimizing the layout parameters of the fitness function, we hope to further improve the quality of the final layout in terms of user preference and computation speed.

  10. A proposed 30-45 minute 4 page standard protocol to evaluate rheumatoid arthritis (SPERA) that includes measures of inflammatory activity, joint damage, and longterm outcomes.

    PubMed

    Pincus, T; Brooks, R H; Callahan, L F

    1999-02-01

    A proposed 4 page, 30-45 minute standard protocol to assess rheumatoid arthritis (SPERA) is described that includes all relevant measures of inflammatory activity such as joint swelling, measures of joint damage such as joint deformity, and outcomes such as joint replacement surgery, to monitor patients in longterm observational studies. Forms are included: (1) a patient self-report modified health assessment questionnaire (MHAQ) to assess function, pain, fatigue, psychological distress, symptoms, and drugs used; (2) assessor-completed forms: "RA clinical features" --criteria for RA, functional class, family history, extraarticular disease, comorbidities, joint surgery, radiographic score, and laboratory findings. (3) A 32 joint count with 5 variables: (a) a "shorthand" normal/abnormal so that normal joints require no further detailed assessment; (b) tenderness or pain on motion; (c) swelling; (d) limited motion or deformity; (e) previous surgeries; physical measures of function, i.e., grip strength, walk time, and button test. (4) Medication review of previous disease modifying antirheumatic drugs (DMARD), work history, and years of education. The forms allow cost effective acquisition of all relevant measures of activity, damage, and outcomes in routine clinical care, and allow recognition that measures of activity may show similar or improved values over 5-10 years, while measures of damage and outcomes indicate severe progression in the same patients. The SPERA is feasible to acquire most known relevant measures of activity, damage, and outcomes in RA in 30-45 min in usual clinical settings, to provide a complete database for analyses of longterm outcomes.

  11. Network and User-Perceived Performance of Web Page Retrievals

    NASA Technical Reports Server (NTRS)

    Kruse, Hans; Allman, Mark; Mallasch, Paul

    1998-01-01

    The development of the HTTP protocol has been driven by the need to improve the network performance of the protocol by allowing the efficient retrieval of multiple parts of a web page without the need for multiple simultaneous TCP connections between a client and a server. We suggest that the retrieval of multiple page elements sequentially over a single TCP connection may result in a degradation of the perceived performance experienced by the user. We attempt to quantify this perceived degradation through the use of a model which combines a web retrieval simulation and an analytical model of TCP operation. Starting with the current HTTP/l.1 specification, we first suggest a client@side heuristic to improve the perceived transfer performance. We show that the perceived speed of the page retrieval can be increased without sacrificing data transfer efficiency. We then propose a new client/server extension to the HTTP/l.1 protocol to allow for the interleaving of page element retrievals. We finally address the issue of the display of advertisements on web pages, and in particular suggest a number of mechanisms which can make efficient use of IP multicast to send advertisements to a number of clients within the same network.

  12. Advantages and limitations of clear-native PAGE.

    PubMed

    Wittig, Ilka; Schägger, Hermann

    2005-11-01

    Clear-native PAGE (CN-PAGE) separates acidic water-soluble and membrane proteins (pI < 7) in an acrylamide gradient gel, and usually has lower resolution than blue-native PAGE (BN-PAGE). The migration distance depends on the protein intrinsic charge, and on the pore size of the gradient gel. This complicates estimation of native masses and oligomerization states when compared to BN-PAGE, which uses negatively charged protein-bound Coomassie-dye to impose a charge shift on the proteins. Therefore, BN-PAGE rather than CN-PAGE is commonly used for standard analyses. However, CN-PAGE offers advantages whenever Coomassie-dye interferes with techniques required to further analyze the native complexes, e.g., determination of catalytic activities, as shown here for mitochondrial ATP synthase, or efficient microscale separation of membrane protein complexes for fluorescence resonance energy transfer (FRET) analyses. CN-PAGE is milder than BN-PAGE. Especially the combination of digitonin and CN-PAGE can retain labile supramolecular assemblies of membrane protein complexes that are dissociated under the conditions of BN-PAGE. Enzymatically active oligomeric states of mitochondrial ATP synthase previously not detected using BN-PAGE were identified by CN-PAGE.

  13. Multiplex PageRank.

    PubMed

    Halu, Arda; Mondragón, Raúl J; Panzarasa, Pietro; Bianconi, Ginestra

    2013-01-01

    Many complex systems can be described as multiplex networks in which the same nodes can interact with one another in different layers, thus forming a set of interacting and co-evolving networks. Examples of such multiplex systems are social networks where people are involved in different types of relationships and interact through various forms of communication media. The ranking of nodes in multiplex networks is one of the most pressing and challenging tasks that research on complex networks is currently facing. When pairs of nodes can be connected through multiple links and in multiple layers, the ranking of nodes should necessarily reflect the importance of nodes in one layer as well as their importance in other interdependent layers. In this paper, we draw on the idea of biased random walks to define the Multiplex PageRank centrality measure in which the effects of the interplay between networks on the centrality of nodes are directly taken into account. In particular, depending on the intensity of the interaction between layers, we define the Additive, Multiplicative, Combined, and Neutral versions of Multiplex PageRank, and show how each version reflects the extent to which the importance of a node in one layer affects the importance the node can gain in another layer. We discuss these measures and apply them to an online multiplex social network. Findings indicate that taking the multiplex nature of the network into account helps uncover the emergence of rankings of nodes that differ from the rankings obtained from one single layer. Results provide support in favor of the salience of multiplex centrality measures, like Multiplex PageRank, for assessing the prominence of nodes embedded in multiple interacting networks, and for shedding a new light on structural properties that would otherwise remain undetected if each of the interacting networks were analyzed in isolation.

  14. The NASA Technical Report Server

    NASA Astrophysics Data System (ADS)

    Nelson, M. L.; Gottlich, G. L.; Bianco, D. J.; Paulson, S. S.; Binkley, R. L.; Kellogg, Y. D.; Beaumont, C. J.; Schmunk, R. B.; Kurtz, M. J.; Accomazzi, A.; Syed, O.

    The National Aeronautics and Space Act of 1958 established the National Aeronautics and Space Administration (NASA) and charged it to "provide for the widest practicable and appropriate dissemination of information concerning...its activities and the results thereof". The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems .

  15. Remote diagnosis server

    NASA Technical Reports Server (NTRS)

    Deb, Somnath (Inventor); Ghoshal, Sudipto (Inventor); Malepati, Venkata N. (Inventor); Kleinman, David L. (Inventor); Cavanaugh, Kevin F. (Inventor)

    2004-01-01

    A network-based diagnosis server for monitoring and diagnosing a system, the server being remote from the system it is observing, comprises a sensor for generating signals indicative of a characteristic of a component of the system, a network-interfaced sensor agent coupled to the sensor for receiving signals therefrom, a broker module coupled to the network for sending signals to and receiving signals from the sensor agent, a handler application connected to the broker module for transmitting signals to and receiving signals therefrom, a reasoner application in communication with the handler application for processing, and responding to signals received from the handler application, wherein the sensor agent, broker module, handler application, and reasoner applications operate simultaneously relative to each other, such that the present invention diagnosis server performs continuous monitoring and diagnosing of said components of the system in real time. The diagnosis server is readily adaptable to various different systems.

  16. USING SERVERS TO ENHANCE CONTROL SYSTEM CAPABILITY.

    SciTech Connect

    BICKLEY,M.; BOWLING,B.A.; BRYAN,D.A.; ZEIJTS,J.; WHITE,K.S.; WITHERSPOON,S.

    1999-03-29

    Many traditional control systems include a distributed collection of front end machines to control hardware. Back end tools are used to view, modify, and record the signals generated by these front end machines. Software servers, which are a middleware layer between the front and back ends, can improve a control system in several ways. Servers can enable on-line processing of raw data, and consolidation of functionality. In many cases data retrieved from the front end must be processed in order to convert the raw data into useful information. These calculations are often redundantly performed by different programs, frequently offline. Servers can monitor the raw data and rapidly perform calculations, producing new signals which can be treated like any other control system signal, and can be used by any back end application. Algorithms can be incorporated to actively modify signal values in the control system based upon changes of other signals, essentially producing feedback in a control system. Servers thus increase the flexibility of a control system. Lastly, servers running on inexpensive UNIX workstations can relay or cache frequently needed information, reducing the load on front end hardware by functioning as concentrators. Rather than many back end tools connecting directly to the front end machines, increasing the work load of these machines, they instead connect to the server. Servers like those discussed above have been used successfully at the Thomas Jefferson National Accelerator Facility to provide functionality such as beam steering, fault monitoring, storage of machine parameters, and on-line data processing. The authors discuss the potential uses of such, servers, and share the results of work performed to date.

  17. 8. Photocopy of printed page (original Page 30 of the ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Photocopy of printed page (original Page 30 of the Souvenir Program 1867-1967 Ridgely Centennial) Photographer unknown. Circa 1967. VIEW NORTHEAST, SOUTHWEST FRONT Ridgely's centennial was celebrated in 1967 and included in the souvenir brochure was page 30. This view shows the subject building with the 1950 modifications to provide for automotive traffic. It was a print of a current photograph. - 510 Central Avenue (Commercial Building), Ridgely, Caroline County, MD

  18. Do-It-Yourself: A Special Library's Approach to Creating Dynamic Web Pages Using Commercial Off-The-Shelf Applications

    NASA Technical Reports Server (NTRS)

    Steeman, Gerald; Connell, Christopher

    2000-01-01

    Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and, committing resources from managing time to populate the database and training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals, lessons learned in the Web-to-database process (including setting up Database Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Standard Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.

  19. A Web Server for MACCS Magnetometer Data

    NASA Technical Reports Server (NTRS)

    Engebretson, Mark J.

    1998-01-01

    NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.

  20. Project EAGLE (Early Academic Gifted Learning Experience): A Program for Gifted and Talented Students (Grades K-3)--Kindergarten Activity Booklets: Xanthus; Zhack; and Activity Pages H-Z.

    ERIC Educational Resources Information Center

    Merkoski, Kay

    Three activity booklets are presented for implementing Project EAGLE, an enrichment program for gifted and talented kindergarten children. The first activity booklet contains a poem by J. D. Evans titled "In Search of the Xanthus," which describes the search for an imaginary beast that leaves an "X" on the spot where it used to be. The second…

  1. Turning the page

    PubMed Central

    2015-01-01

    in hand to allow for a more limited production of a paper version of the Annals for RCS fellows and members who continue to elect to receive their Annals in the traditional format. Medical colleges around the world are currently undergoing similar deliberations and for some a digital version may represent the only opportunity to maintain editorial independence – unhindered by the implications of a commercial publishing partner. It is however hoped that for the vast majority of fellows and members, the new and enhanced digital platform will offer significant advantages such that the digital version becomes the de facto medium of choice. Matt Whitaker and the team at the Annals should be congratulated for their sterling efforts in making this transition. The new site, now live at http://publishing.rcseng.ac.uk, will enhance the experience of finding, accessing, reading, citing, sharing and saving articles from the Annals, Bulletin and FDJ. Sign-on will be much easier; page load times quicker and the search engine more powerful and intuitive. The new platform boasts improved functionality, full in-page article text and multi-media, citation tracking, reference generators and advanced social media integration. We are simultaneously launching a new video library where we will be hosting our technical videos. It will, I am certain, become a huge resource for our surgical fraternity. Our new platform will be followed later this year by the inevitable and ubiquitous app, which will allow readers to download issues of the Annals and read them offline and at leisure on whatever their tablet of choice might be. It is my belief that these and forthcoming changes herald the transformation of the Annals into a truly modern journal with all the digital services that authors and readers now rightly expect from their RCS publication. Tim Lane Editor-in-Chief, rcsannalseditor@gmail.com

  2. Secure IRC Server

    SciTech Connect

    Perry, Marcia

    2003-08-25

    The IRCD is an IRC server that was originally distributed by the IRCD Hybrid developer team for use as a server in IRC message over the public Internet. By supporting the IRC protocol defined in the IRC RFC, IRCD allows the users to create and join channels for group or one-to-one text-based instant messaging. It stores information about channels (e.g., whether it is public, secret, or invite-only, the topic set, membership) and users (who is online and what channels they are members of). It receives messages for a specific user or channel and forwards these messages to the targeted destination. Since server-to-server communication is also supported, these targeted destinations may be connected to different IRC servers. Messages are exchanged over TCP connections that remain open between the client and the server. The IRCD is being used within the Pervasive Computing Collaboration Environment (PCCE) as the 'chat server' for message exchange over public and private channels. After an LBNLSecureMessaging(PCCE chat) client has been authenticated, the client connects to IRCD with its assigned nickname or 'nick.' The client can then create or join channels for group discussions or one-to-one conversations. These channels can have an initial mode of public or invite-only and the mode may be changed after creation. If a channel is public, any one online can join the discussion; if a channel is invite-only, users can only join if existing members of the channel explicity invite them. Users can be invited to any type of channel and users may be members of multiple channels simultaneously. For use with the PCCE environment, the IRCD application (which was written in C) was ported to Linux and has been tested and installed under Linux Redhat 7.2. The source code was also modified with SSL so that all messages exchanged over the network are encrypted. This modified IRC server also verifies with an authentication server that the client is who he or she claims to be and that

  3. Comment: PAGES: Always Bottom Up

    NASA Astrophysics Data System (ADS)

    Bradley, Raymond

    2004-06-01

    In a recent article titled ``Back to the Future'' (Eos, 16 March, p. 107) L. C. Witton lays out the goals of IGBP-PAGES for the next few years, noting that, ``PAGES is aiming to become a truly bottom-up organization that is driven by the insights of individual scientists....'' In fact, PAGES has always been a truly bottom-up organization, and this statement unfortunately fosters the view that it has been otherwise. Those who promote such a view choose to overlook the countless workshops that PAGES has organized, largely at the suggestion of those ``at the bottom,'' and the numerous publications that have resulted from these meetings.

  4. Design and implementation of streaming media server cluster based on FFMpeg.

    PubMed

    Zhao, Hong; Zhou, Chun-long; Jin, Bao-zhao

    2015-01-01

    Poor performance and network congestion are commonly observed in the streaming media single server system. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations and the balance among servers is maintained by the dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data. The experiment results show that the server cluster system has significantly alleviated the network congestion and improved the performance in comparison with the single server system.

  5. Design and Implementation of Streaming Media Server Cluster Based on FFMpeg

    PubMed Central

    Zhao, Hong; Zhou, Chun-long; Jin, Bao-zhao

    2015-01-01

    Poor performance and network congestion are commonly observed in the streaming media single server system. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations and the balance among servers is maintained by the dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data. The experiment results show that the server cluster system has significantly alleviated the network congestion and improved the performance in comparison with the single server system. PMID:25734187

  6. Using Firefly Tools to Enhance Archive Web Pages

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Ly, L.; Goldina, T.

    2013-10-01

    Astronomy web developers are looking for fast and powerful HTML 5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table display, and spectrum plots on their web pages with minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects including Spitzer, Planck, WISE, PTF, LSST and others.

  7. Web Site On a Budget: How to Find an Affordable Home for Your Pages.

    ERIC Educational Resources Information Center

    Callihan, Steven E.

    1996-01-01

    Offers advice for choosing an Internet provider: consider the amount of time, effort, and expertise one has, coupled with the complexity of the Web page, which impact price and choice of provider; and question providers about server speed, ports, architecture, traffic levels, fee structures, and registration of domain names. Lists 33 Web presence…

  8. PubServer: literature searches by homology.

    PubMed

    Jaroszewski, Lukasz; Koska, Laszlo; Sedova, Mayya; Godzik, Adam

    2014-07-01

    PubServer, available at http://pubserver.burnham.org/, is a tool to automatically collect, filter and analyze publications associated with groups of homologous proteins. Protein entries in databases such as Entrez Protein database at NCBI contain information about publications associated with a given protein. The scope of these publications varies a lot: they include studies focused on biochemical functions of individual proteins, but also reports from genome sequencing projects that introduce tens of thousands of proteins. Collecting and analyzing publications related to sets of homologous proteins help in functional annotation of novel protein families and in improving annotations of well-studied protein families or individual genes. However, performing such collection and analysis manually is a tedious and time-consuming process. PubServer automatically collects identifiers of homologous proteins using PSI-Blast, retrieves literature references from corresponding database entries and filters out publications unlikely to contain useful information about individual proteins. It also prepares simple vocabulary statistics from titles, abstracts and MeSH terms to identify the most frequently occurring keywords, which may help to quickly identify common themes in these publications. The filtering criteria applied to collected publications are user-adjustable. The results of the server are presented as an interactive page that allows re-filtering and different presentations of the output.

  9. NL MIND-BEST: a web server for ligands and proteins discovery--theoretic-experimental study of proteins of Giardia lamblia and new compounds active against Plasmodium falciparum.

    PubMed

    González-Díaz, Humberto; Prado-Prado, Francisco; Sobarzo-Sánchez, Eduardo; Haddad, Mohamed; Maurel Chevalley, Séverine; Valentin, Alexis; Quetin-Leclercq, Joëlle; Dea-Ayuela, María A; Teresa Gomez-Muños, María; Munteanu, Cristian R; José Torres-Labandeira, Juan; García-Mera, Xerardo; Tapia, Ricardo A; Ubeira, Florencio M

    2011-05-07

    There are many protein ligands and/or drugs described with very different affinity to a large number of target proteins or receptors. In this work, we selected Ligands or Drug-target pairs (DTPs/nDTPs) of drugs with high affinity/non-affinity for different targets. Quantitative Structure-Activity Relationships (QSAR) models become a very useful tool in this context to substantially reduce time and resources consuming experiments. Unfortunately most QSAR models predict activity against only one protein target and/or have not been implemented in the form of public web server freely accessible online to the scientific community. To solve this problem, we developed here a multi-target QSAR (mt-QSAR) classifier using the MARCH-INSIDE technique to calculate structural parameters of drug and target plus one Artificial Neuronal Network (ANN) to seek the model. The best ANN model found is a Multi-Layer Perceptron (MLP) with profile MLP 20:20-15-1:1. This MLP classifies correctly 611 out of 678 DTPs (sensitivity=90.12%) and 3083 out of 3408 nDTPs (specificity=90.46%), corresponding to training accuracy=90.41%. The validation of the model was carried out by means of external predicting series. The model classifies correctly 310 out of 338 DTPs (sensitivity=91.72%) and 1527 out of 1674 nDTP (specificity=91.22%) in validation series, corresponding to total accuracy=91.30% for validation series (predictability). This model favorably compares with other ANN models developed in this work and Machine Learning classifiers published before to address the same problem in different aspects. We implemented the present model at web portal Bio-AIMS in the form of an online server called: Non-Linear MARCH-INSIDE Nested Drug-Bank Exploration & Screening Tool (NL MIND-BEST), which is located at URL: http://miaja.tic.udc.es/Bio-AIMS/NL-MIND-BEST.php. This online tool is based on PHP/HTML/Python and MARCH-INSIDE routines. Finally we illustrated two practical uses of this server with two

  10. Parent-Friendly Web Pages.

    ERIC Educational Resources Information Center

    Farmer, Lesley S. J.

    2000-01-01

    Discusses how librarians can help parents become more knowledgeable about the Internet so they can guide their children in Internet use and become technologically independent. Recommends that school libraries develop Web pages that parents can access and discusses Web page design, content for children, and content for parents. (LRW)

  11. Web Page Design (Part Three).

    ERIC Educational Resources Information Center

    Descy, Don E.

    1997-01-01

    Discusses fonts as well as design considerations that should be reviewed when designing World Wide Web pages and sites to make them easier for clients to use and easier to maintain. Also discusses the simplicity of names; organization of pages, folders, and files; and sites to help build Web sites. (LRW)

  12. Reese Sorenson's Individual Professional Page

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese; Nixon, David (Technical Monitor)

    1998-01-01

    The subject document is a World Wide Web (WWW) page entitled, "Reese Sorenson's Individual Professional Page." Its can be accessed at "http://george.arc.nasa.gov/sorenson/personal/index.html". The purpose of this page is to make the reader aware of me, who I am, and what I do. It lists my work assignments, my computer experience, my place in the NASA hierarchy, publications by me, awards received by me, my education, and how to contact me. Writing this page was a learning experience, pursuant to an element in my Job Description which calls for me to be able to use the latest computers. This web page contains very little technical information, none of which is classified or sensitive.

  13. Dali server update

    PubMed Central

    Holm, Liisa; Laakso, Laura M.

    2016-01-01

    The Dali server (http://ekhidna2.biocenter.helsinki.fi/dali) is a network service for comparing protein structures in 3D. In favourable cases, comparing 3D structures may reveal biologically interesting similarities that are not detectable by comparing sequences. The Dali server has been running in various places for over 20 years and is used routinely by crystallographers on newly solved structures. The latest update of the server provides enhanced analytics for the study of sequence and structure conservation. The server performs three types of structure comparisons: (i) Protein Data Bank (PDB) search compares one query structure against those in the PDB and returns a list of similar structures; (ii) pairwise comparison compares one query structure against a list of structures specified by the user; and (iii) all against all structure comparison returns a structural similarity matrix, a dendrogram and a multidimensional scaling projection of a set of structures specified by the user. Structural superimpositions are visualized using the Java-free WebGL viewer PV. The structural alignment view is enhanced by sequence similarity searches against Uniprot. The combined structure-sequence alignment information is compressed to a stack of aligned sequence logos. In the stack, each structure is structurally aligned to the query protein and represented by a sequence logo. PMID:27131377

  14. Dali server update.

    PubMed

    Holm, Liisa; Laakso, Laura M

    2016-07-08

    The Dali server (http://ekhidna2.biocenter.helsinki.fi/dali) is a network service for comparing protein structures in 3D. In favourable cases, comparing 3D structures may reveal biologically interesting similarities that are not detectable by comparing sequences. The Dali server has been running in various places for over 20 years and is used routinely by crystallographers on newly solved structures. The latest update of the server provides enhanced analytics for the study of sequence and structure conservation. The server performs three types of structure comparisons: (i) Protein Data Bank (PDB) search compares one query structure against those in the PDB and returns a list of similar structures; (ii) pairwise comparison compares one query structure against a list of structures specified by the user; and (iii) all against all structure comparison returns a structural similarity matrix, a dendrogram and a multidimensional scaling projection of a set of structures specified by the user. Structural superimpositions are visualized using the Java-free WebGL viewer PV. The structural alignment view is enhanced by sequence similarity searches against Uniprot. The combined structure-sequence alignment information is compressed to a stack of aligned sequence logos. In the stack, each structure is structurally aligned to the query protein and represented by a sequence logo.

  15. SLITHER: a web server for generating contiguous conformations of substrate molecules entering into deep active sites of proteins or migrating through channels in membrane transporters.

    PubMed

    Lee, Po-Hsien; Kuo, Kuei-Ling; Chu, Pei-Ying; Liu, Eric M; Lin, Jung-Hsin

    2009-07-01

    Many proteins use a long channel to guide the substrate or ligand molecules into the well-defined active sites for catalytic reactions or for switching molecular states. In addition, substrates of membrane transporters can migrate to another side of cellular compartment by means of certain selective mechanisms. SLITHER (http://bioinfo.mc.ntu.edu.tw/slither/or http://slither.rcas.sinica.edu.tw/) is a web server that can generate contiguous conformations of a molecule along a curved tunnel inside a protein, and the binding free energy profile along the predicted channel pathway. SLITHER adopts an iterative docking scheme, which combines with a puddle-skimming procedure, i.e. repeatedly elevating the potential energies of the identified global minima, thereby determines the contiguous binding modes of substrates inside the protein. In contrast to some programs that are widely used to determine the geometric dimensions in the ion channels, SLITHER can be applied to predict whether a substrate molecule can crawl through an inner channel or a half-channel of proteins across surmountable energy barriers. Besides, SLITHER also provides the list of the pore-facing residues, which can be directly compared with many genetic diseases. Finally, the adjacent binding poses determined by SLITHER can also be used for fragment-based drug design.

  16. Celebrating Dr. King. Poetry Pages.

    ERIC Educational Resources Information Center

    Fina, Allan de

    1992-01-01

    Poetry that relates to the beliefs and actions of Martin Luther King, Jr. can be used to help students appreciate the civil rights leader's contributions, examine their own aspirations, and critically analyze poems. A reproducible poetry page is included. (IAH)

  17. Code AI Personal Web Pages

    NASA Technical Reports Server (NTRS)

    Garcia, Joseph A.; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The document consists of a publicly available web site (george.arc.nasa.gov) for Joseph A. Garcia's personal web pages in the AI division. Only general information will be posted and no technical material. All the information is unclassified.

  18. Enhanced networked server management with random remote backups

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2003-08-01

    In this paper, the model is focused on available server management in network environments. The (remote) backup servers are connected by VPN (Virtual Private Network) and replace broken main servers immediately. A virtual private network (VPN) uses a public network infrastructure to hook up long-distance servers within a single network infrastructure. The servers can be represented as "machines", and the system then deals with unreliable main machines and random auxiliary spare (remote backup) machines. When the system performs mandatory routine maintenance, auxiliary machines are used for backups during idle periods. Unlike other existing models, the availability of auxiliary machines changes at each activation in this enhanced model. Analytically tractable results are obtained by using several mathematical techniques, and the results are demonstrated in the framework of optimized networked server allocation problems.
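
    As a rough illustration of the kind of availability question such a model addresses, the following Monte Carlo sketch estimates uptime when a main server can fail and the remote backup is itself only available with some probability at each activation. All rates and probabilities below are invented for illustration and are not taken from the paper's analytical model.

```python
import random

def simulate_availability(hours=10_000, fail_prob=0.001,
                          repair_hours=8, backup_ready_prob=0.9, seed=1):
    """Hour-by-hour simulation of a main server with a remote backup.
    When the main server fails, the backup takes over only if it happens
    to be available for that activation; otherwise the system is down
    until the repair completes."""
    random.seed(seed)
    up_hours = 0
    repair_left = 0
    backup_active = False
    for _ in range(hours):
        if repair_left > 0:                      # main server under repair
            repair_left -= 1
            if backup_active:
                up_hours += 1                    # backup keeps service up
            continue
        if random.random() < fail_prob:          # main server just failed
            repair_left = repair_hours
            backup_active = random.random() < backup_ready_prob
            if backup_active:
                up_hours += 1
        else:
            up_hours += 1                        # normal operation
    return up_hours / hours

print(f"estimated availability: {simulate_availability():.4f}")
```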

  19. Home media server content management

    NASA Astrophysics Data System (ADS)

    Tokmakoff, Andrew A.; van Vliet, Harry

    2001-07-01

    With the advent of set-top boxes, the convergence of TV (broadcasting) and PC (Internet) is set to enter the home environment. Currently, a great deal of activity is occurring in developing standards (TV-Anytime Forum) and devices (TiVo) for local storage on Home Media Servers (HMS). These devices lie at the heart of convergence of the triad: communications/networks - content/media - computing/software. Besides massive storage capacity and being a communications 'gateway', the home media server is characterised by the ability to handle metadata and software that provides an easy-to-use on-screen interface and intelligent search/content handling facilities. In this paper, we describe a research prototype HMS that is being developed within the GigaCE project at the Telematica Instituut. Our prototype demonstrates advanced search and retrieval (video browsing), adaptive user profiling and an innovative 3D component of the Electronic Program Guide (EPG) which represents online presence. We discuss the use of MPEG-7 for representing metadata, the use of MPEG-21 working draft standards for content identification, description and rights expression, and the use of HMS peer-to-peer content distribution approaches. Finally, we outline explorative user behaviour experiments that aim to investigate the effectiveness of the prototype HMS during development.

  20. The NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Paulson, Sharon S.; Binkley, Robert L.; Kellogg, Yvonne D.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael J.; Accomazzi, Alberto

    1995-01-01

    The National Aeronautics and Space Act of 1958 established NASA and charged it to "provide for the widest practicable and appropriate dissemination of information concerning its activities and the results thereof." The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. The NTRS comprises several units, some constructed especially for inclusion in NTRS, and others that are existing NASA publication services that NTRS reuses. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the service. The NTRS is largely constructed with freely available software running on existing hardware. NTRS builds upon existing hardware and software, and the resulting additional exposure for the body of literature it contains ensures that NASA's institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.

  1. The Faculty Web Page: Contrivance or Continuation?

    ERIC Educational Resources Information Center

    Lennex, Lesia

    2007-01-01

    In an age of Internet education, what does it mean for a tenure/tenure-track faculty to have a web page? How many professors have web pages? If they have a page, what does it look like? Do they really need a web page at all? Many universities have faculty web pages. What do those collective pages look like? In what way do they represent the…

  2. Learning through Web Page Design.

    ERIC Educational Resources Information Center

    Peel, Deborah

    2001-01-01

    Describes and evaluates the use of Web page design in an undergraduate course in the United Kingdom on town planning. Highlights include incorporating information and communication technologies into higher education; and a theoretical framework for the use of educational technology. (LRW)

  3. Web Page Design (Part One).

    ERIC Educational Resources Information Center

    Descy, Don E.

    1997-01-01

    Discusses rules for Web page design: consider audiences' Internet skills and equipment; know your content; outline the material; map or sketch the site; be consistent; regulate size of graphics to control download time; place eye catching material in the first 300 pixels; moderate use of color to control file size and bandwidth; include a…

  4. Design of Educational Web Pages

    ERIC Educational Resources Information Center

    Galan, Jose Gomez; Blanco, Soledad Mateos

    2004-01-01

    The methodological characteristics of teaching in primary and secondary education make it necessary to revise the pedagogical and instructive lines with which to introduce the new Information and Communication Technologies into the school context. The construction of Web pages that can be used to improve student learning is, therefore, fundamental…

  5. NEOS server 4.0 administrative guide.

    SciTech Connect

    Dolan, E. D.

    2001-07-13

    The NEOS Server 4.0 provides a general Internet-based client/server as a link between users and software applications. The administrative guide covers the fundamental principles behind the operation of the NEOS Server, installation and trouble-shooting of the Server software, and implementation details of potential interest to a NEOS Server administrator. The guide also discusses making new software applications available through the Server, including areas of concern to remote solver administrators such as maintaining security, providing usage instructions, and enforcing reasonable restrictions on jobs. The administrative guide is intended both as an introduction to the NEOS Server and as a reference for use when running the Server.

  6. Web Page Retrieval System by Automatic Detection of Topic Words

    NASA Astrophysics Data System (ADS)

    Mochida, Hiroshi; Omachi, Shinichiro; Aso, Hirotomo

    In order to find documents on a specific topic among the huge number of documents on the Internet, we often use a web site called a search engine and retrieve documents by using a keyword. However, since a keyword typically returns a large number of pages, it is difficult to find the page that we really want among them. In this paper, we propose a system that supports users in looking for the web pages they desire to find. The system actively collects web pages on a topic related to a keyword specified by a user and presents them to the user. The experimental results show the effectiveness of the proposed system.
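
    One simple way to surface topic words that co-occur with a user's keyword, loosely in the spirit of the system described (though not its actual algorithm), is to score terms by how much more frequent they are in the keyword-matching pages than in the collection overall. The toy pages and the scoring rule below are purely illustrative.

```python
from collections import Counter
import math

def topic_words(pages, keyword, top_n=5):
    """Score terms by the log-ratio of their frequency in keyword-matching
    pages versus all pages; high-scoring terms act as candidate topic words."""
    all_counts, hit_counts = Counter(), Counter()
    hits = 0
    for text in pages:
        tokens = text.lower().split()
        all_counts.update(tokens)
        if keyword in tokens:
            hit_counts.update(tokens)
            hits += 1
    scores = {}
    hit_total, all_total = sum(hit_counts.values()), sum(all_counts.values())
    for term, c in hit_counts.items():
        p_hit = c / hit_total
        p_all = all_counts[term] / all_total
        scores[term] = math.log(p_hit / p_all)
    return sorted(scores, key=scores.get, reverse=True)[:top_n], hits

pages = [
    "python web server tutorial with sockets",
    "python threading and multiprocessing guide",
    "gardening tips for spring flowers",
    "web server deployment with python and nginx",
]
words, n = topic_words(pages, "python")
print(f"{n} matching pages; candidate topic words: {words}")
```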

  7. Accelerating Demand Paging for Local and Remote Out-of-Core Visualization

    NASA Technical Reports Server (NTRS)

    Ellsworth, David

    2001-01-01

    This paper describes a new algorithm that improves the performance of application-controlled demand paging for the out-of-core visualization of data sets that are on either local disks or disks on remote servers. The performance improvements come from better overlapping the computation with the page reading process, and by performing multiple page reads in parallel. The new algorithm can be applied to many different visualization algorithms since application-controlled demand paging is not specific to any visualization algorithm. The paper includes measurements that show that the new multi-threaded paging algorithm decreases the time needed to compute visualizations by one third when using one processor and reading data from local disk. The time needed when using one processor and reading data from remote disk decreased by up to 60%. Visualization runs using data from remote disk ran about as fast as ones using data from local disk because the remote runs were able to make use of the remote server's high performance disk array.
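
    The core idea, overlapping computation with page reads and issuing several reads in parallel, can be sketched with a small thread pool that prefetches upcoming pages while the current one is processed. The page size, file layout and per-page processing step below are placeholders, not the paper's implementation.

```python
from concurrent.futures import ThreadPoolExecutor
import os

PAGE_SIZE = 1 << 20  # 1 MiB pages (placeholder value)

def read_page(path, index):
    """Read one fixed-size page from a file (simulates a demand-page read)."""
    with open(path, "rb") as f:
        f.seek(index * PAGE_SIZE)
        return f.read(PAGE_SIZE)

def visualize(page_bytes):
    """Stand-in for the per-page visualization work."""
    return sum(page_bytes[:1024])

def process_file(path, lookahead=4):
    n_pages = (os.path.getsize(path) + PAGE_SIZE - 1) // PAGE_SIZE
    results = []
    with ThreadPoolExecutor(max_workers=lookahead) as pool:
        # Issue the first few reads in parallel, then keep the pipeline full.
        futures = {i: pool.submit(read_page, path, i)
                   for i in range(min(lookahead, n_pages))}
        for i in range(n_pages):
            data = futures.pop(i).result()        # wait only for page i
            nxt = i + lookahead
            if nxt < n_pages:
                futures[nxt] = pool.submit(read_page, path, nxt)
            results.append(visualize(data))       # compute overlaps with reads
    return results

if __name__ == "__main__":
    print(process_file(__file__)[:3])
```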

  8. Market study: Tactile paging system

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A market survey was conducted regarding the commercialization potential and key market factors relevant to a tactile paging system for deaf-blind people. The purpose of the tactile paging system is to communicate with deaf-blind people in an institutional environment. The system consists of a main console and individual satellite wrist units. The console emits three signals by telemetry to the wrist com (receiving unit), which will measure approximately 2 x 4 x 3/4 inches and will be fastened to the wrist by a strap. The three vibration signals are a fire alarm, a time period indication, and a third signal that alerts the wearer of the wrist com that the pin on the top of the wrist unit is emitting a Morse-coded message. The Morse code message can be felt and recognized with the finger.

  9. Planetary Photojournal Home Page Graphic

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This image is an unannotated version of the Planetary Photojournal Home Page graphic. This digital collage contains a highly stylized rendition of our solar system and points beyond. As this graphic was intended to be used as a navigation aid in searching for data within the Photojournal, certain artistic embellishments have been added (color, location, etc.). Several data sets from various planetary and astronomy missions were combined to create this image.

  10. Photojournal Home Page Graphic 2007

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This image is an unannotated version of the Photojournal Home Page graphic released in October 2007. This digital collage contains a highly stylized rendition of our solar system and points beyond. As this graphic was intended to be used as a navigation aid in searching for data within the Photojournal, certain artistic embellishments have been added (color, location, etc.). Several data sets from various planetary and astronomy missions were combined to create this image.

  11. Functional Multiplex PageRank

    NASA Astrophysics Data System (ADS)

    Iacovacci, Jacopo; Rahmede, Christoph; Arenas, Alex; Bianconi, Ginestra

    2016-10-01

    Recently it has been recognized that many complex social, technological and biological networks have a multilayer nature and can be described by multiplex networks. Multiplex networks are formed by a set of nodes connected by links with different connotations that form the different layers of the multiplex. Characterizing the centrality of the nodes in a multiplex network is a challenging task since the centrality of a node naturally depends on the importance associated with links of a certain type. Here we propose to assign to each node of a multiplex network a centrality called Functional Multiplex PageRank that is a function of the weights given to every different pattern of connections (multilinks) existing in the multiplex network between any two nodes. Since multilinks distinguish all the possible ways in which the links in different layers can overlap, the Functional Multiplex PageRank can describe important non-linear effects when large or small relevance is assigned to multilinks with overlap. Here we apply the Functional Multiplex PageRank to multiplex airport networks, to the neuronal network of the nematode C. elegans, and to social collaboration and citation networks between scientists. This analysis reveals important differences between the most central nodes of these networks, and correlations with their so-called pattern to success.
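
    A minimal way to see how weights on different multilink patterns can change node rankings is to aggregate a two-layer multiplex into a single weighted network, giving each node pair a weight that depends on whether it is linked in layer 1 only, layer 2 only, or both, and then run ordinary PageRank on the aggregate. This is a simplified stand-in for the Functional Multiplex PageRank, with an invented toy network and invented multilink weights.

```python
import numpy as np

def pagerank(W, alpha=0.85, tol=1e-10):
    """Standard PageRank by power iteration on a weighted adjacency matrix W."""
    n = W.shape[0]
    out = W.sum(axis=1)
    # Row-normalize; dangling rows fall back to a uniform distribution.
    P = np.divide(W, out[:, None], out=np.full_like(W, 1.0 / n),
                  where=out[:, None] > 0)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = alpha * P.T @ r + (1 - alpha) / n
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# Two-layer toy multiplex on 4 nodes (adjacency matrix of each layer).
A1 = np.array([[0, 1, 1, 0], [1, 0, 0, 0], [1, 0, 0, 1], [0, 0, 1, 0]], float)
A2 = np.array([[0, 1, 0, 0], [1, 0, 1, 1], [0, 1, 0, 0], [0, 1, 0, 0]], float)

# Weights assigned to the three multilink patterns (illustrative values):
w_only1, w_only2, w_both = 1.0, 1.0, 3.0
W = w_only1 * (A1 * (1 - A2)) + w_only2 * (A2 * (1 - A1)) + w_both * (A1 * A2)

print(np.round(pagerank(W), 3))
```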

  12. CCTOP: a Consensus Constrained TOPology prediction web server.

    PubMed

    Dobson, László; Reményi, István; Tusnády, Gábor E

    2015-07-01

    The Consensus Constrained TOPology prediction (CCTOP; http://cctop.enzim.ttk.mta.hu) server is a web-based application providing transmembrane topology prediction. In addition to utilizing 10 different state-of-the-art topology prediction methods, the CCTOP server incorporates topology information from existing experimental and computational sources available in the PDBTM, TOPDB and TOPDOM databases using the probabilistic framework of hidden Markov models. The server provides the option to precede the topology prediction with signal peptide prediction and transmembrane-globular protein discrimination. The initial result can be recalculated by (de)selecting any of the prediction methods or mapped experiments or by adding user-specified constraints. CCTOP showed superior performance to existing approaches. The reliability of each prediction is also calculated, which correlates with the accuracy of the per-protein topology prediction. The prediction results and the collected experimental information are visualized on the CCTOP home page and can be downloaded in XML format. Programmatic access to the CCTOP server is also available, and an example client-side script is provided.
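
    The consensus step can be illustrated with a simple per-residue vote over the labels returned by several topology predictors; CCTOP itself combines the methods within a constrained hidden Markov model rather than a plain vote, so the sketch below, including the example method outputs, is illustrative only.

```python
from collections import Counter

def consensus_topology(predictions):
    """Per-residue majority vote over several topology predictions.
    Labels: 'I' = inside, 'M' = membrane, 'O' = outside."""
    length = len(predictions[0])
    assert all(len(p) == length for p in predictions), "predictions must align"
    consensus, support = [], []
    for i in range(length):
        votes = Counter(p[i] for p in predictions)
        label, count = votes.most_common(1)[0]
        consensus.append(label)
        support.append(count / len(predictions))   # crude per-residue reliability
    return "".join(consensus), support

# Hypothetical outputs of three prediction methods for a short sequence.
preds = ["IIIMMMMOOO",
         "IIMMMMMOOO",
         "IIIMMMMOOO"]
topo, rel = consensus_topology(preds)
print(topo, [round(r, 2) for r in rel])
```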

  13. WMS Server 2.0

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian; Wood, James F.

    2012-01-01

    This software is a simple, yet flexible server of raster map products, compliant with the Open Geospatial Consortium (OGC) Web Map Service (WMS) 1.1.1 protocol. The server is a full implementation of the OGC WMS 1.1.1 as a fastCGI client, using the Geospatial Data Abstraction Library (GDAL) for data access. The server can operate in a proxy mode, where all or part of the WMS requests are handled by a back-end server. The server has explicit support for a colocated tiled WMS, including rapid responses to black (no-data) requests. It generates JPEG and PNG images, including 16-bit PNG. The GDAL back end allows great flexibility in data access. The server is a port to a Linux/GDAL platform from the original IRIX/IL platform. It is simpler to configure and use, and depending on the storage format used, it has better performance than other available implementations. The WMS server 2.0 is a high-performance WMS implementation due to the fastCGI architecture. The use of a GDAL data back end allows for great flexibility. The configuration is relatively simple, based on a single XML file. It provides scaling and cropping, as well as blending of multiple layers based on layer transparency.
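
    A client of any WMS 1.1.1 server retrieves a rendered map with a GetMap request whose key-value parameters are defined by the OGC standard. The sketch below simply assembles such a request URL; the endpoint and layer name are placeholders.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width, height,
                   srs="EPSG:4326", fmt="image/jpeg"):
    """Build an OGC WMS 1.1.1 GetMap request URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": srs,                               # WMS 1.1.1 uses SRS (1.3.0 uses CRS)
        "BBOX": ",".join(str(v) for v in bbox),   # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return f"{endpoint}?{urlencode(params)}"

# Placeholder endpoint and layer name:
print(wms_getmap_url("https://example.org/wms", "global_mosaic",
                     bbox=(-180, -90, 180, 90), width=1024, height=512))
```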

  14. THttpServer class in ROOT

    NASA Astrophysics Data System (ADS)

    Adamczewski-Musch, Joern; Linev, Sergey

    2015-12-01

    The new THttpServer class in ROOT implements an HTTP server for arbitrary ROOT applications. It is based on the embeddable Civetweb HTTP server and provides direct access to all objects registered with the server. Object data can be provided in different formats: binary, XML, GIF/PNG, and JSON. A generic user interface for THttpServer has been implemented in HTML/JavaScript based on the JavaScript ROOT development. With any modern web browser one can list, display, and monitor objects available on the server. THttpServer is used in the Go4 framework to provide an HTTP interface to the online analysis.

  15. Promoting Metacognition in First Year Anatomy Laboratories Using Plasticine Modeling and Drawing Activities: A Pilot Study of the "Blank Page" Technique

    ERIC Educational Resources Information Center

    Naug, Helen L.; Colson, Natalie J.; Donner, Daniel G.

    2011-01-01

    Many first year students of anatomy and physiology courses demonstrate an inability to self-regulate their learning. To help students increase their awareness of their own learning in a first year undergraduate anatomy course, we piloted an exercise that incorporated the processes of (1) active learning: drawing and plasticine modeling and (2)…

  16. Interstellar Initiative Web Page Design

    NASA Technical Reports Server (NTRS)

    Mehta, Alkesh

    1999-01-01

    This summer at NASA/MSFC, I have contributed to two projects: Interstellar Initiative Web Page Design and Lenz's Law Relative Motion Demonstration. In the Web Design Project, I worked on an Outline. The Web Design Outline was developed to provide a foundation for a Hierarchy Tree Structure. The Outline would help design a Website information base for future and near-term missions. The Website would give in-depth information on Propulsion Systems and Interstellar Travel. The Lenz's Law Relative Motion Demonstrator is discussed in this volume by Russell Lee.

  17. A proteomics strategy to discover beta-glucosidases from Aspergillus fumigatus with two-dimensional PAGE in-gel activity assay and tandem mass spectrometry.

    PubMed

    Kim, Kee-Hong; Brown, Kimberly M; Harris, Paul V; Langston, James A; Cherry, Joel R

    2007-12-01

    Economically competitive production of ethanol from lignocellulosic biomass by enzymatic hydrolysis and fermentation is currently limited, in part, by the relatively high cost and low efficiency of the enzymes required to hydrolyze cellulose to fermentable sugars. Discovery of novel cellulases with greater activity could be a critical step in overcoming this cost barrier. beta-Glucosidase catalyzes the final step in conversion of glucose polymers to glucose. Despite their importance, only a few beta-glucosidases are commercially available, and more efficient ones are clearly needed. We developed a proteomics strategy aimed at discovering beta-glucosidases present in the secreted proteome of the cellulose-degrading fungus Aspergillus fumigatus. With the use of partial or complete protein denaturing conditions, the secretory proteome was fractionated in a 2DGE format and beta-glucosidase activity was detected in the gel after infusion with a substrate analogue that fluoresces upon hydrolysis. Fluorescing spots were subjected to tryptic digestion, and identification as beta-glucosidases was confirmed by tandem mass spectrometry. Two novel beta-glucosidases of A. fumigatus were identified by this in situ activity staining method, and the gene coding for a novel beta-glucosidase (EAL88289) was cloned and heterologously expressed. The expressed beta-glucosidase showed far superior heat stability to the previously characterized beta-glucosidases of Aspergillus niger and Aspergillus oryzae. Improved heat stability is important for development of the next generation of saccharifying enzymes capable of performing fast cellulose hydrolysis reactions at elevated temperatures, thereby lowering the cost of bioethanol production. The in situ activity staining approach described here would be a useful tool for cataloguing and assessing the efficiency of beta-glucosidases in a high-throughput fashion.

  18. A Link Taxonomy for Web Pages.

    ERIC Educational Resources Information Center

    Haas, Stephanie W.; Grams, Erika S.

    1998-01-01

    Presents the results of an investigation of the use of links on Web pages; a content analysis was performed on 75 Web pages and their links. Results will provide a conceptual framework in which Web page design can be considered, including issues of authoring, retrieval, and built-in and on-the-fly guidance for reading and browsing. (Author/AEF)

  19. Web Page Design and Network Analysis.

    ERIC Educational Resources Information Center

    Wan, Hakman A.; Chung, Chi-wai

    1998-01-01

    Examines problems in Web-site design from the perspective of network analysis. In view of the similarity between the hypertext structure of Web pages and a generic network, network analysis presents concepts and theories that provide insight for Web-site design. Describes the problem of home-page location and control of number of Web pages and…

  20. Heap/stack guard pages using a wakeup unit

    DOEpatents

    Gooding, Thomas M; Satterfield, David L; Steinmacher-Burow, Burkhard

    2014-04-29

    A method and system for providing a memory access check on a processor including the steps of detecting accesses to a memory device including level-1 cache using a wakeup unit. The method includes invalidating level-1 cache ranges corresponding to a guard page, and configuring a plurality of wakeup address compare (WAC) registers to allow access to selected WAC registers. The method selects one of the plurality of WAC registers, and sets up a WAC register related to the guard page. The method configures the wakeup unit to interrupt on access of the selected WAC register. The method detects access of the memory device using the wakeup unit when a guard page is violated. The method generates an interrupt to the core using the wakeup unit, and determines the source of the interrupt. The method detects the activated WAC registers assigned to the violated guard page, and initiates a response.

  1. Accessibility of State Department of Education Home Pages and Special Education Pages.

    ERIC Educational Resources Information Center

    Opitz, Christine; Savenye, Wilhelmina; Rowland, Cyndi

    2003-01-01

    This study evaluated State Department of Education Internet home pages and special education pages for accessibility compliance with standards of the World Wide Web Consortium and Section 508 of the revised Rehabilitation Act. Only 26% of state department home pages and 52% of special education pages achieved W3C compliance and fewer conformed…

  2. Generic OPC UA Server Framework

    NASA Astrophysics Data System (ADS)

    Nikiel, Piotr P.; Farnham, Benjamin; Filimonov, Viatcheslav; Schlenker, Stefan

    2015-12-01

    This paper describes a new approach for generic design and efficient development of OPC UA servers. Development starts with creation of a design file, in XML format, describing an object-oriented information model of the target system or device. Using this model, the framework generates an executable OPC UA server application, which exposes the per-design OPC UA address space, without the developer writing a single line of code. Furthermore, the framework generates skeleton code into which the developer adds the necessary logic for integration to the target system or device. This approach allows both developers unfamiliar with the OPC UA standard, and advanced OPC UA developers, to create servers for the systems they are experts in while greatly reducing design and development effort as compared to developments based purely on COTS OPC UA toolkits. Higher level software may further benefit from the explicit OPC UA server model by using the XML design description as the basis for generating client connectivity configuration and server data representation. Moreover, having the XML design description at hand facilitates automatic generation of validation tools. In this contribution, the concept and implementation of this framework is detailed along with examples of actual production-level usage in the detector control system of the ATLAS experiment at CERN and beyond.

  3. Fault-tolerant PACS server

    NASA Astrophysics Data System (ADS)

    Cao, Fei; Liu, Brent J.; Huang, H. K.; Zhou, Michael Z.; Zhang, Jianguo; Zhang, X. C.; Mogel, Greg T.

    2002-05-01

    Failure of a PACS archive server could cripple an entire PACS operation. Last year we demonstrated that it was possible to design a fault-tolerant (FT) server with 99.999% uptime. The FT design was based on triple modular redundancy with a simple majority vote to automatically detect and mask a faulty module. The purpose of this presentation is to report on its continued development in integrating with external mass storage devices, and to describe laboratory failover experiments. An FT PACS Simulator with generic PACS software has been used in the experiment. To simulate a PACS clinical operation, image examinations are transmitted continuously from the modality simulator to the DICOM gateway and then to the FT PACS server and workstations. Hardware failures in the network, FT server module, disk, RAID, and DLT are manually induced to observe how the FT PACS fails over and resumes its normal data flow. We then test and evaluate the FT PACS server for reliability, functionality, and performance.

  4. Realistic page-turning of electronic books

    NASA Astrophysics Data System (ADS)

    Fan, Chaoran; Li, Haisheng; Bai, Yannan

    2014-01-01

    The booming electronic book (e-book), as an extension of the paper book, is popular with readers. Recently, much effort has been put into realistic page-turning simulation for e-books to improve the reading experience. This paper presents a new 3D page-turning simulation approach, which employs piecewise time-dependent cylindrical surfaces to describe the turning page and constructs a smooth transition method between the time-dependent cylinders. The page-turning animation is produced by sequentially mapping the turning page onto cylinders with different radii and positions. Compared to previous approaches, our method is able to imitate various effects efficiently and obtains a more natural animation of the turning page.
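
    The basic geometric step, wrapping the flat page onto a cylinder whose radius shrinks as the page turns further, follows directly from the parametrization of a cylinder. The sketch below maps flat page coordinates onto a cylinder aligned with the spine; it is a simplified, hypothetical version of the paper's piecewise time-dependent construction.

```python
import numpy as np

def wrap_on_cylinder(x, y, radius):
    """Map flat page coordinates (x along the turn direction, y along the
    spine) onto a cylinder of the given radius whose axis is the spine.
    Arc length along the page is preserved: theta = x / radius."""
    theta = x / radius
    X = radius * np.sin(theta)          # horizontal position
    Y = y                               # unchanged along the spine
    Z = radius * (1.0 - np.cos(theta))  # lift of the curled page
    return X, Y, Z

# A 21 x 11 grid of points on a unit-width page, wrapped at two radii:
x = np.linspace(0.0, 1.0, 21)
y = np.linspace(0.0, 1.5, 11)
Xg, Yg = np.meshgrid(x, y)
for r in (0.8, 0.3):                    # smaller radius = page turned further
    X, Y, Z = wrap_on_cylinder(Xg, Yg, r)
    print(f"radius {r}: free-edge lift = {Z[:, -1].max():.3f}")
```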

  5. Parallel Computing Using Web Servers and "Servlets".

    ERIC Educational Resources Information Center

    Lo, Alfred; Bloor, Chris; Choi, Y. K.

    2000-01-01

    Describes parallel computing and presents inexpensive ways to implement a virtual parallel computer with multiple Web servers. Highlights include performance measurement of parallel systems; models for using Java and intranet technology including single server, multiple clients and multiple servers, single client; and a comparison of CGI (common…

  6. Library links on medical school home pages.

    PubMed

    Thomas, Sheila L

    2011-01-01

    The purpose of this study was to assess the websites of American Association of Medical Colleges (AAMC)-member medical schools for the presence of library links. Sixty-one percent (n = 92) of home pages of the 150 member schools of the AAMC contain library links. For the 58 home pages not offering such links, 50 provided a pathway of two or three clicks to a library link. The absence of library links on 39% of AAMC medical school home pages indicates that the designers of those pages did not consider the library to be a primary destination for their visitors.

  7. Effects of Linear Texts in Page Scrolling and Page-by-Page Reading Forms on Reading Comprehension Introduction

    ERIC Educational Resources Information Center

    Sahin, Ayfer

    2011-01-01

    This research aims to analyse the effect of scrolling and page-by-page presentation of static texts on the screen-reading comprehension of 4th grade students. The sample was composed of 46 fourth-grade students of an elementary school in Kirsehir Central Province. The classrooms of the participants were selected by random sampling method and…

  8. Hybrid metrology implementation: server approach

    NASA Astrophysics Data System (ADS)

    Osorio, Carmen; Timoney, Padraig; Vaid, Alok; Elia, Alex; Kang, Charles; Bozdog, Cornel; Yellai, Naren; Grubner, Eyal; Ikegami, Toru; Ikeno, Masahiko

    2015-03-01

    Hybrid metrology (HM) is the practice of combining measurements from multiple toolset types in order to enable or improve metrology for advanced structures. HM is implemented in two phases: Phase-1 includes readiness of the infrastructure to transfer processed data from the first toolset to the second. Phase-2 infrastructure allows simultaneous transfer and optimization of raw data between toolsets such as spectra, images, traces - co-optimization. We discuss the extension of Phase-1 to include direct high-bandwidth communication between toolsets using a hybrid server, enabling seamless fab deployment and further laying the groundwork for Phase-2 high volume manufacturing (HVM) implementation. An example of the communication protocol shows the information that can be used by the hybrid server, differentiating its capabilities from that of a host-based approach. We demonstrate qualification and production implementation of the hybrid server approach using CD-SEM and OCD toolsets for complex 20nm and 14nm applications. Finally we discuss the roadmap for Phase-2 HM implementation through use of the hybrid server.

  9. Autofluorescence based visualization of proteins from unstained native-PAGE

    NASA Astrophysics Data System (ADS)

    Manjunath, S.; Rao, Bola Sadashiva S.; Satyamoorthy, Kapaettu; Mahato, Krishna Kishore

    2015-03-01

    Proteins are the most diverse and functionally active biomolecules in living systems. In order to understand their diversity and dynamic functionality, it is essential to visualize them in native form, without altering their structural and functional properties during separation from complex mixtures. In the present study, a sensitive methodology for optimal visualization of unstained or untagged proteins in native polyacrylamide gel electrophoresis (N-PAGE) has been developed, in which the concentration of the acrylamide and bis-acrylamide mixture, the percentage of the gel, fixing of the N-PAGE in methanol:acetic acid:water and washing of the gel in Milli-Q water were optimized for highest sensitivity using laser-induced autofluorescence. The outcome with bovine serum albumin (BSA) was found to be best at acrylamide and bis-acrylamide concentrations of 29.2 and 0.8, respectively, in 12% N-PAGE. After the electrophoresis run, washing the N-PAGE immediately with Milli-Q water 12 times and omitting the methanol:acetic acid:water fixing step yielded better sensitivity of visualization. Using this methodology, a 25 ng BSA protein band in PAGE was clearly identified. The currently used staining techniques for the visualization of proteins, Coomassie brilliant blue and silver staining, have sensitivities of 100 ng and 5 ng, respectively. The current methodology was found to be more sensitive than Coomassie staining and less sensitive than silver staining. The added advantage of this methodology is the faster visualization of proteins without altering their structure and functional properties.

  10. "I didn't know her, but…": parasocial mourning of mediated deaths on Facebook RIP pages

    NASA Astrophysics Data System (ADS)

    Klastrup, Lisbeth

    2015-04-01

    This article examines the use of six Danish "Rest in Peace" (RIP) memorial pages. The article focuses on the relation between news media and RIP page use, in relation to general communicative practices on these pages. Based on an analysis of press coverage of the deaths of six young people and a close analysis of 1,015 comments extracted from the RIP pages created to memorialize them, it is shown that their deaths attracted considerable media attention, as did the RIP pages themselves. Comment activity seems to reflect the news stories in the way the commenters refer to the context of death and the emotional distress they experience, but most comments on the RIP pages are conventional expressions of sympathy and "RIP" wishes. The article concludes that public RIP pages might be understood as virtual spontaneous shrines, affording an emerging practice of "RIP-ing."

  11. 40 CFR 1502.7 - Page limits.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Page limits. 1502.7 Section 1502.7 Protection of Environment COUNCIL ON ENVIRONMENTAL QUALITY ENVIRONMENTAL IMPACT STATEMENT § 1502.7 Page limits. The text of final environmental impact statements (e.g., paragraphs (d) through (g) of §...

  12. Teacher Web Pages that Build Parent Partnerships.

    ERIC Educational Resources Information Center

    Johnson, Doug

    2000-01-01

    Discusses the importance of collaboration between teachers and parents to help support students and describes teacher-created Web pages that help simplify communication and planning. Explains Web page design that can include general class descriptions, unit outlines and timetables, unit and project information, and student progress reports. (LRW)

  13. Web Page Authoring Tools: Comparison and Trends.

    ERIC Educational Resources Information Center

    Craney, Linda

    Initially available from universities and individual enthusiasts, software tools to author World Wide Web pages are maturing into very feature-rich applications and are now offered by large corporations. These applications are enabling more companies to create and maintain pages themselves on the Web or on corporate Intranets. The market continues…

  14. Minimal Guidelines for Authors of Web Pages.

    ERIC Educational Resources Information Center

    ADE Bulletin, 2002

    2002-01-01

    Presents guidelines that recommend the minimal reference information that should be provided on Web pages intended for use by students, teachers, and scholars in the modern languages. Suggests the inclusion of information about responsible parties, copyright declaration, privacy statements, and site information. Makes a note on Web page style. (SG)

  15. Unit Pages: Differentiation for 200 Students

    ERIC Educational Resources Information Center

    Carver, Andrea; Bailey, Janelle M.

    2010-01-01

    Based upon the models of differentiated instruction (Tomlinson and Edison 2003) and Layered Curriculum (Nunley 2004), the author created the Unit Pages strategy. Just like Layered Curriculum, the pages can be handed directly to students, allowing them to take charge of their own learning rather than requiring the teacher to individually monitor…

  16. Automated Title Page Cataloging: A Feasibility Study.

    ERIC Educational Resources Information Center

    Weibel, Stuart; And Others

    1989-01-01

    Describes the design of a prototype rule-based system for the automation of descriptive cataloging from title pages. The discussion covers the results of tests of the prototype, major impediments to automatic cataloging from title pages, and prospects for further progress. The rules implemented in the prototype are appended. (16 references)…

  17. Review of Theoretical Prediction Models for Organic Extract Metabolites, Effect of Drying Temperature on Smooth Muscle Relaxing Activity Induced by Organic Extracts Specially Cecropia Obtusifolia Portal and Web Server Predictors of Drug-Protein Interaction.

    PubMed

    Aguirre-Crespo, Francisco; García-Mera, Xerardo; Guillén-Poot, Mónica Anahi; May-Díaz, Héctor Fernado; Tun-Suárez, Adrián; Aguirre-Crespo, A; Hernández-Rodríguez, J; Vergara-Galicia, Jorge; Rodríguez-López, V; Prado-Prado, Francisco J

    2015-02-19

    Cecropia obtusifolia Bertol is a medicinal species used in the treatment of diabetes mellitus and hypertension, and scientific studies support this traditional use. However, the influence of drying temperature on yield and pharmacological activity needs to be understood. Drying rate, extraction efficiency, changes in the UV-Vis spectrum and estimated chlorophyll content were enhanced with increasing temperature. Finally, the relaxant activity on vascular smooth muscle is increased by drying at 70ºC, and the reducing ability measured by the CARF method increases with temperature. Analytical studies are required to identify changes in the metabolite content and to ensure safety and efficacy for human consumption. In this sense, bioinformatic studies may be helpful. Studies such as QSAR can help us to study these metabolites derived from natural products. The MIND-BEST and NL MIND-BEST models for predicting drug-protein interactions (DPIs) were introduced, using MARCH-INSIDE (MI) software to calculate structural parameters for drugs and enzymes, respectively. We first review the state of the art, with a survey of previous theoretical studies of antihypertensive activity. We then present a study evaluating the effect of the drying temperature of C. obtusifolia leaves on vascular smooth muscle relaxation, antioxidant activity and the presence of chlorophylls, with a focus on Cecropia metabolites. Finally, we carried out QSAR studies using the MIND-BEST and NL MIND-BEST web servers in order to understand the essential structural requirements of metabolites for binding with FDA protein receptors.

  18. Recurrence of Acute Page Kidney in a Renal Transplant Allograft

    PubMed Central

    Zayas, Carlos; Mulloy, Laura; Jagadeesan, Muralidharan

    2016-01-01

    Acute Page Kidney (APK) phenomenon is a rare cause of secondary hypertension, mediated by activation of renin-angiotensin-aldosterone system (RAAS). Timely intervention is of great importance to prevent any end organ damage from hypertension. We present a unique case of three episodes of APK in the same renal transplant allograft. PMID:27725836

  19. Recurrence of Acute Page Kidney in a Renal Transplant Allograft.

    PubMed

    Kapoor, Rajan; Zayas, Carlos; Mulloy, Laura; Jagadeesan, Muralidharan

    2016-01-01

    Acute Page Kidney (APK) phenomenon is a rare cause of secondary hypertension, mediated by activation of renin-angiotensin-aldosterone system (RAAS). Timely intervention is of great importance to prevent any end organ damage from hypertension. We present a unique case of three episodes of APK in the same renal transplant allograft.

  20. How To Get Your Web Page Noticed.

    ERIC Educational Resources Information Center

    Schrock, Kathleen

    1997-01-01

    Presents guidelines for making a Web site noticeable. Discusses submitting the URL to directories, links, and announcement lists, and sending the site over the server via FTP to search engines. Describes how to index the site with "Title,""Heading," and "Meta" tags. (AEF)

  1. Creating a Facebook Page for the Seismological Society of America

    NASA Astrophysics Data System (ADS)

    Newman, S. B.

    2009-12-01

    In August 2009, I created a Facebook "fan" page for the Seismological Society of America. We had been exploring cost-effective options for providing forums for two-way communication for some months. We knew that a number of larger technical societies had invested significant sums of money to create customized social networking sites but that a small society would need to use existing low-cost software options. The first thing I discovered when I began to set up the fan page was that an unofficial SSA Facebook group already existed, established by Steven J. Gibbons, a member in Norway. Steven had done an excellent job of posting material about SSA. Partly because of the existing group, the official SSA fan page gained fans rapidly. We began by posting information about our own activities and then added links to activities in the broader geoscience community. While much of this material also appeared on our website and in our publication, Seismological Research Letters (SRL), the tone on the FB page is different. It is less formal with more emphasis on photos and links to other sites, including our own. Fans who are active on FB see the posts as part of their social network and do not need to take the initiative to go to the SSA site. Although the goal was to provide a forum for two-way communication, our initial experience was that people were clearly reading the page but not contributing content. This appears to be the case with fan pages of sister geoscience societies. FB offers some demographic information to fan site administrators. In an initial review of the demographics it appeared that fans were younger than the overall demographics of the Society. It appeared that a few of the fans are not members or even scientists. Open questions are: what content will be most useful to fans? How will the existence of the page benefit the membership as a whole? Will the page ultimately encourage two-way communication as hoped? Web 2.0 is generating a series of new

  2. A batch arrival queue under randomised multi-vacation policy with unreliable server and repair

    NASA Astrophysics Data System (ADS)

    Ke, Jau-Chuan; Huang, Kai-Bin; Pearn, Wen Lea

    2012-03-01

    This article examines an M[x]/G/1 queueing system with an unreliable server and a repair, in which the server operates a randomised vacation policy with multiple available vacations. Upon the system being found to be empty, the server immediately takes a vacation. If there is at least one customer found waiting in the queue upon returning from a vacation, the server will be activated for service. Otherwise, if no customers are waiting for service at the end of a vacation, the server either remains idle with probability p or leaves for another vacation with probability 1 - p. When one or more customers arrive when the server is idle, the server immediately starts providing service for the arrivals. It is possible that an unpredictable breakdown may occur in the server, in which case a repair time is requested. For such a system, we derive the distributions of several important system characteristics, such as the system size distribution at a random epoch and at a departure epoch, the system size distribution at the busy period initiation epoch, and the distribution of the idle and busy periods. We perform a numerical analysis for changes in the system characteristics, along with changes in specific values of the system parameters. A cost effectiveness maximisation model is constructed to show the benefits of such a queueing system.

  3. Minimizing Thermal Stress for Data Center Servers through Thermal-Aware Relocation

    PubMed Central

    Ling, T. C.; Hussain, S. A.

    2014-01-01

    A rise in inlet air temperature may lower the rate of heat dissipation from air-cooled computing servers. This introduces thermal stress to these servers. The poorly cooled active servers then start conducting heat to neighboring servers, giving rise to hotspot regions of thermal stress inside the data center. As a result, the physical hardware of these servers may fail, causing performance loss, monetary loss, and higher energy consumption for the cooling mechanism. In order to minimize these situations, this paper profiles inlet temperature sensitivity (ITS) and defines the optimum location for each server to minimize the chances of creating a thermal hotspot and thermal stress. Based upon this novel ITS analysis, a thermal state monitoring and server relocation algorithm for data centers is proposed. The contribution of this paper is bringing the peak outlet temperatures of the relocated servers more than 5 times closer to the average outlet temperature, lowering the average peak outlet temperature by 3.5% and minimizing the thermal stress. PMID:24987743
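
    The relocation idea can be illustrated with a simple greedy matching: after profiling each server's inlet temperature sensitivity (ITS), place the most temperature-sensitive servers in the coolest rack slots. The sketch below shows only that matching step, with made-up ITS values and slot temperatures; it is not the paper's algorithm.

```python
def relocate(servers_its, slot_inlet_temps):
    """Greedy placement: sort servers by descending inlet temperature
    sensitivity (ITS) and slots by ascending inlet temperature, then pair
    them so the most sensitive server gets the coolest slot."""
    assert len(servers_its) == len(slot_inlet_temps)
    servers = sorted(servers_its.items(), key=lambda kv: kv[1], reverse=True)
    slots = sorted(slot_inlet_temps.items(), key=lambda kv: kv[1])
    return {srv: slot for (srv, _), (slot, _) in zip(servers, slots)}

# Hypothetical profiling results (higher ITS = outlet temperature rises
# faster with inlet temperature) and rack slot inlet temperatures in Celsius.
its = {"web-1": 0.9, "db-1": 0.4, "batch-1": 0.7, "cache-1": 0.2}
slots = {"rack-A1": 18.0, "rack-A2": 21.0, "rack-B1": 24.0, "rack-B2": 27.0}

for server, slot in relocate(its, slots).items():
    print(f"{server} -> {slot}")
```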

  4. Mining, visualizing and comparing multidimensional biomolecular data using the Genomics Data Miner (GMine) Web-Server.

    PubMed

    Proietti, Carla; Zakrzewski, Martha; Watkins, Thomas S; Berger, Bernard; Hasan, Shihab; Ratnatunga, Champa N; Brion, Marie-Jo; Crompton, Peter D; Miles, John J; Doolan, Denise L; Krause, Lutz

    2016-12-06

    Genomics Data Miner (GMine) is a user-friendly online software that allows non-experts to mine, cluster and compare multidimensional biomolecular datasets. Various powerful visualization techniques are provided, generating high quality figures that can be directly incorporated into scientific publications. Robust and comprehensive analyses are provided via a broad range of data-mining techniques, including univariate and multivariate statistical analysis, supervised learning, correlation networks, clustering and multivariable regression. The software has a focus on multivariate techniques, which can attribute variance in the measurements to multiple explanatory variables and confounders. Various normalization methods are provided. Extensive help pages and a tutorial are available via a wiki server. Using GMine we reanalyzed proteome microarray data of host antibody response against Plasmodium falciparum. Our results support the hypothesis that immunity to malaria is a higher-order phenomenon related to a pattern of responses and not attributable to any single antigen. We also analyzed gene expression across resting and activated T cells, identifying many immune-related genes with differential expression. This highlights both the plasticity of T cells and the operation of a hardwired activation program. These application examples demonstrate that GMine facilitates an accurate and in-depth analysis of complex molecular datasets, including genomics, transcriptomics and proteomics data.

  5. Mining, visualizing and comparing multidimensional biomolecular data using the Genomics Data Miner (GMine) Web-Server

    PubMed Central

    Proietti, Carla; Zakrzewski, Martha; Watkins, Thomas S.; Berger, Bernard; Hasan, Shihab; Ratnatunga, Champa N.; Brion, Marie-Jo; Crompton, Peter D.; Miles, John J.; Doolan, Denise L.; Krause, Lutz

    2016-01-01

    Genomics Data Miner (GMine) is a user-friendly online software that allows non-experts to mine, cluster and compare multidimensional biomolecular datasets. Various powerful visualization techniques are provided, generating high quality figures that can be directly incorporated into scientific publications. Robust and comprehensive analyses are provided via a broad range of data-mining techniques, including univariate and multivariate statistical analysis, supervised learning, correlation networks, clustering and multivariable regression. The software has a focus on multivariate techniques, which can attribute variance in the measurements to multiple explanatory variables and confounders. Various normalization methods are provided. Extensive help pages and a tutorial are available via a wiki server. Using GMine we reanalyzed proteome microarray data of host antibody response against Plasmodium falciparum. Our results support the hypothesis that immunity to malaria is a higher-order phenomenon related to a pattern of responses and not attributable to any single antigen. We also analyzed gene expression across resting and activated T cells, identifying many immune-related genes with differential expression. This highlights both the plasticity of T cells and the operation of a hardwired activation program. These application examples demonstrate that GMine facilitates an accurate and in-depth analysis of complex molecular datasets, including genomics, transcriptomics and proteomics data. PMID:27922118

  6. 16 CFR 436.3 - Cover page.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... business address, telephone number, and, if applicable, email address and primary home page address. (c) A... is a complex investment. The information in this disclosure document can help you make up your...

  7. 16 CFR 436.3 - Cover page.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... business address, telephone number, and, if applicable, email address and primary home page address. (c) A... is a complex investment. The information in this disclosure document can help you make up your...

  8. Best Practices for Searchable Collection Pages

    EPA Pesticide Factsheets

    Searchable Collection pages are stand-alone documents that do not have any web area navigation. They should not recreate existing content on other sites and should be tagged with quality metadata and taxonomy terms.

  9. Comparing classical and quantum PageRanks

    NASA Astrophysics Data System (ADS)

    Loke, T.; Tang, J. W.; Rodriguez, J.; Small, M.; Wang, J. B.

    2017-01-01

    Following recent developments in quantum PageRanking, we present a comparative analysis of discrete-time and continuous-time quantum-walk-based PageRank algorithms. Relative to classical PageRank and to different extents, the quantum measures better highlight secondary hubs and resolve ranking degeneracy among peripheral nodes for all networks we studied in this paper. For the discrete-time case, we investigated the periodic nature of the walker's probability distribution for a wide range of networks and found that the dominant period does not grow with the size of these networks. Based on this observation, we introduce a new quantum measure using the maximum probabilities of the associated walker during the first couple of periods. This is particularly important, since it leads to a quantum PageRanking scheme that is scalable with respect to network size.

  10. TEST PAGE Air Data From Richmond, VA

    EPA Pesticide Factsheets

    This page presents graphs showing radiation air monitoring data for Riverside, CA from EPA's RadNet system. RadNet is a nationwide network of monitoring stations that measure radiation in air, drinking water and precipitation.

  11. A Web Page Summarization for Mobile Phones

    NASA Astrophysics Data System (ADS)

    Hasegawa, Takaaki; Nishikawa, Hitoshi; Imamura, Kenji; Kikui, Gen'ichiro; Okumura, Manabu

    Recently, web pages for mobile devices have become widespread on the Internet, and many people can access web pages through search engines from mobile devices as well as personal computers. The summary of a retrieved web page is important because people judge whether or not the page is relevant to their information need according to the summary. In particular, the summary must be not only compact but also grammatical and meaningful when users retrieve information using a mobile phone with a small screen. Most search engines seem to produce a snippet based on the keyword-in-context (KWIC) method. However, this simple method cannot generate a refined summary suitable for mobile phones because of low grammaticality and content overlap with the page title. We propose a more suitable method for generating a snippet for mobile devices using sentence extraction and sentence compression methods. First, sentences are scored based on whether they include the users' query terms or words relevant to the queries, and on whether they avoid overlapping with the page title, based on maximal marginal relevance (MMR). Second, the selected sentences are compressed based on their phrase coverage, measured by word scores, and their phrase connection probability, measured with a language model over the dependency structure converted from the sentence. The experimental results reveal that the proposed method outperformed the KWIC method in terms of relevance judgment, grammaticality, non-redundancy and content coverage.
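
    The maximal marginal relevance (MMR) step, preferring sentences that are relevant to the query while penalizing overlap with the page title and with sentences already selected, can be sketched with simple word-overlap similarities. The scoring details below are illustrative and not the authors' exact formulation.

```python
def sim(a, b):
    """Jaccard word-overlap similarity between two strings."""
    a, b = set(a.lower().split()), set(b.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def mmr_select(sentences, query, title, k=2, lam=0.7):
    """Pick k sentences maximizing lam*relevance - (1-lam)*redundancy,
    where redundancy is similarity to the title or any chosen sentence."""
    chosen = []
    candidates = list(sentences)
    while candidates and len(chosen) < k:
        def score(s):
            redundancy = max([sim(s, title)] + [sim(s, c) for c in chosen])
            return lam * sim(s, query) - (1 - lam) * redundancy
        best = max(candidates, key=score)
        chosen.append(best)
        candidates.remove(best)
    return chosen

title = "Mobile web summarization"
sentences = [
    "We summarize web pages for small mobile phone screens.",
    "Mobile web summarization is studied in this work.",
    "The method compresses sentences using a dependency structure.",
]
print(mmr_select(sentences, query="mobile phone summary", title=title))
```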

  12. National Medical Terminology Server in Korea

    NASA Astrophysics Data System (ADS)

    Lee, Sungin; Song, Seung-Jae; Koh, Soonjeong; Lee, Soo Kyoung; Kim, Hong-Gee

    Interoperable EHRs (Electronic Health Records) require at least the use of standardized medical terminologies. This paper describes a medical terminology server, LexCare Suite, which houses terminology management applications, such as a terminology editor, and a terminology repository populated with international standard terminology systems such as the Systematized Nomenclature of Medicine (SNOMED). The server is intended to satisfy the need for quality terminology systems in local primary to tertiary hospitals. Our partner general hospitals have used the server to test its applicability. This paper describes the server and the results of the applicability test.

  13. UniTree Name Server internals

    SciTech Connect

    Mecozzi, D.; Minton, J.

    1996-01-01

    The UniTree Name Server (UNS) is one of several servers which make up the UniTree storage system. The Name Server is responsible for mapping names to capabilities. Names are generally human-readable ASCII strings of any length. Capabilities are unique 256-bit identifiers that point to files, directories, or symbolic links. The Name Server implements a UNIX-style hierarchical directory structure to facilitate name-to-capability mapping. The principal task of the Name Server is to manage the directories which make up the UniTree directory structure. The principal clients of the Name Server are the FTP Daemon, NFS and a few UniTree utility routines. However, the Name Server is a generalized server and will accept messages from any client. The purpose of this paper is to describe the internal workings of the UniTree Name Server. In cases where it seems appropriate, the motivation for a particular choice of algorithm is given along with a description of the algorithm itself.
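
    The core mapping, a UNIX-style directory hierarchy whose entries resolve human-readable names to opaque 256-bit capabilities, can be sketched with nested dictionaries and random 256-bit identifiers. This is a schematic of the idea only, not the UNS data structures.

```python
import secrets

class NameServer:
    """Toy name-to-capability mapping with a UNIX-style path hierarchy."""

    def __init__(self):
        self.root = {}       # directory: name -> capability (hex string)
        self.objects = {}    # capability -> directory dict or file marker

    def _new_capability(self):
        return secrets.token_bytes(32).hex()    # unique 256-bit identifier

    def create(self, path, is_dir=False):
        parts = [p for p in path.split("/") if p]
        d = self.root
        for name in parts[:-1]:
            d = self.objects[d[name]]           # descend through directories
        cap = self._new_capability()
        d[parts[-1]] = cap
        self.objects[cap] = {} if is_dir else "FILE"
        return cap

    def resolve(self, path):
        """Map a full pathname to its capability."""
        parts = [p for p in path.split("/") if p]
        d, cap = self.root, None
        for name in parts:
            cap = d[name]
            obj = self.objects[cap]
            d = obj if isinstance(obj, dict) else {}
        return cap

ns = NameServer()
ns.create("/home", is_dir=True)
ns.create("/home/data.bin")
print(ns.resolve("/home/data.bin"))
```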

  14. TOPS On-Line: Automating the Construction and Maintenance of HTML Pages

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.

    1994-01-01

    After the Technology Opportunities Showcase (TOPS), in October, 1993, Langley Research Center's (LaRC) Information Systems Division (ISD) accepted the challenge to preserve the investment in information assembled in the TOPS exhibits by establishing a data base. Following the lead of several people at LaRC and others around the world, the HyperText Transfer Protocol (HTTP) server and Mosaic were the obvious tools of choice for implementation. Initially, some TOPS exhibitors began the conventional approach of constructing HyperText Markup Language (HTML) pages of their exhibits as input to Mosaic. Considering the number of pages to construct, a better approach was conceived that would automate the construction of pages. This approach allowed completion of the data base construction in a shorter period of time using fewer resources than would have been possible with the conventional approach. It also provided flexibility for the maintenance and enhancement of the data base. Since that time, this approach has been used to automate construction of other HTML data bases. Through these experiences, it is concluded that the most effective use of the HTTP/Mosaic technology will require better tools and techniques for creating, maintaining and managing the HTML pages. The development and use of these tools and techniques are the subject of this document.

  15. MICAS: a fully automated web server for microsatellite extraction and analysis from prokaryote and viral genomic sequences.

    PubMed

    Sreenu, Vattipally B; Ranjitkumar, Gundu; Swaminathan, Sugavanam; Priya, Sasidharan; Bose, Buddhaditta; Pavan, Mogili N; Thanu, Geeta; Nagaraju, Javaregowda; Nagarajaram, Hampapathalu A

    2003-01-01

    MICAS is a web server for extracting microsatellite information from completely sequenced prokaryote and viral genomes, or user-submitted sequences. This server provides an integrated platform for MICdb (database of prokaryote and viral microsatellites), W-SSRF (simple sequence repeat finding program) and Autoprimer (primer design software). MICAS, through dynamic HTML page generation, helps in the systematic extraction of microsatellite information from selected genomes hosted on MICdb or from user-submitted sequences. Further, it assists in the design of primers with the help of Autoprimer, for sequences containing selected microsatellite tracts.
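
    A minimal microsatellite (simple sequence repeat) scan can be written with a back-referencing regular expression that finds short motifs repeated in tandem. The sketch below is only a rough stand-in for a dedicated tool such as W-SSRF; the motif lengths and minimum repeat count are arbitrary choices.

```python
import re

def find_microsatellites(seq, min_unit=1, max_unit=6, min_repeats=4):
    """Return (start, motif, repeat_count) for tandem repeats of short motifs."""
    seq = seq.upper()
    hits = []
    for unit in range(min_unit, max_unit + 1):
        # A motif of `unit` bases immediately repeated at least min_repeats times.
        pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (unit, min_repeats - 1))
        for m in pattern.finditer(seq):
            motif = m.group(1)
            if len(set(motif)) == 1 and unit > 1:
                continue                      # skip e.g. 'AA' counted as a 2-mer
            hits.append((m.start(), motif, len(m.group(0)) // unit))
    return sorted(hits)

demo = "GGATATATATATCCCAGAGAGAGAGTTTTTT"
for start, motif, n in find_microsatellites(demo):
    print(f"pos {start}: ({motif}) x {n}")
```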

  16. Description Meta Tags in Public Home and Linked Pages.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2001-01-01

    Random samples of 1,872 Web pages registered with Yahoo! and 1,638 pages reachable from Yahoo!-registered pages were analyzed for use of meta tags and specifically those containing descriptions. Results: 727 (38.8%) of the Yahoo!-registered pages and 442 (27%) of the other pages included descriptions in meta tags. Some descriptions greatly…

  17. Implementing a Physician's Workstation using client/server technology and the distributed computing environment.

    PubMed

    Pham, T Q; Young, C Y; Tang, P C; Suermondt, H J; Annevelink, J

    1994-01-01

    PWS is a physician's workstation research prototype developed to explore the use of information management tools by physicians in the context of patient care. The original prototype was implemented in a client/server architecture using a broadcast message server. As we expanded the scope of the prototyping activities, we identified the limitations of the broadcast message server in the areas of scalability, security, and interoperability. To address these issues, we reimplemented PWS using the Open Software Foundation's Distributed Computing Environment (DCE). We describe the rationale for using DCE, the migration process, and the benefits achieved. Future work and recommendations are discussed.

  18. PiRaNhA: a server for the computational prediction of RNA-binding residues in protein sequences

    PubMed Central

    Murakami, Yoichi; Spriggs, Ruth V.; Nakamura, Haruki; Jones, Susan

    2010-01-01

    The PiRaNhA web server is a publicly available online resource that automatically predicts the location of RNA-binding residues (RBRs) in protein sequences. The goal of functional annotation of sequences in the field of RNA binding is to provide predictions of high accuracy that require only small numbers of targeted mutations for verification. The PiRaNhA server uses a support vector machine (SVM), with position-specific scoring matrices, residue interface propensity, predicted residue accessibility and residue hydrophobicity as features. The server allows the submission of up to 10 protein sequences, and the predictions for each sequence are provided on a web page and via email. The prediction results are provided in sequence format with predicted RBRs highlighted, in text format with the SVM threshold score indicated and as a graph which enables users to quickly identify those residues above any specific SVM threshold. The graph effectively enables the increase or decrease of the false positive rate. When tested on a non-redundant data set of 42 protein sequences not used in training, the PiRaNhA server achieved an accuracy of 85%, specificity of 90% and a Matthews correlation coefficient of 0.41 and outperformed other publicly available servers. The PiRaNhA prediction server is freely available at http://www.bioinformatics.sussex.ac.uk/PIRANHA. PMID:20507911

  19. The mediating role of facebook fan pages.

    PubMed

    Chih, Wen-Hai; Hsu, Li-Chun; Wang, Kai-Yu; Lin, Kuan-Yu

    2014-01-01

    Using the dual mediation hypothesis, this study investigates the role of interestingness (the power of attracting or holding one's attention) attitude towards the news, in the formation of Facebook Fan Page users' electronic word-of-mouth intentions. A total of 599 Facebook fan page users in Taiwan were recruited and structural equation modeling (SEM) was used to test the research hypotheses. The results show that both perceived news entertainment and informativeness positively influence interestingness attitude towards the news. Interestingness attitude towards the news subsequently influences hedonism and utilitarianism attitudes towards the Fan Page, which then influence eWOM intentions. Interestingness attitude towards the news plays a more important role than hedonism and utilitarianism attitudes in generating electronic word-of-mouth intentions. Based on the findings, the implications and future research suggestions are provided.

  20. A rendering approach for stereoscopic web pages

    NASA Astrophysics Data System (ADS)

    Zhang, Jianlong; Wang, Wenmin; Wang, Ronggang; Chen, Qinshui

    2014-03-01

    Web technology provides a relatively easy way to generate content through which we perceive the world, and as stereoscopic display technology develops, stereoscopic devices will become much more popular. The combination of web technology and stereoscopic display technology promises a striking visual effect. Stereoscopic 3D (S3D) web pages, in which text, images and video may have different depths, can be displayed on stereoscopic display devices. This paper presents an approach for rendering two-view S3D web pages containing text, images and widgets: first, an algorithm is developed to display stereoscopic elements such as text and widgets using a 2D graphics library; second, a method is presented to render a stereoscopic web page within the current framework of the browser; third, a rough workaround is devised for a problem that arises in this method.

  1. European user trial of paging by satellite

    NASA Technical Reports Server (NTRS)

    Fudge, R. E.; Fenton, C. J.

    1990-01-01

    British Telecom conceived the idea of adapting their existing paging service, together with the use of existing terrestrial pagers, to yield a one way data (i.e., paging) satellite service to mobiles. The user trial of paging by satellites was successful. It demonstrated that services could be provided over a wide geographical area to low priced terminals. Many lessons were learned in unexpected areas. These include the need for extensive liaison with all users involved, especially the drivers, to ensure they understood the potential benefits. There was a significant desire for a return acknowledgement channel or even a return data channel. Above all there is a need to ensure that the equipment can be taken across European borders and legitimately used in all European countries. The next step in a marketing assessment would be to consider the impact of two way data messaging such as INMARSAT-C.

  2. You're a What? Process Server

    ERIC Educational Resources Information Center

    Torpey, Elka

    2012-01-01

    In this article, the author talks about the role and functions of a process server. The job of a process server is to hand deliver legal documents to the people involved in court cases. These legal documents range from a summons to appear in court to a subpoena for producing evidence. Process serving can involve risk, as some people take out their…

  3. Optimizing the NASA Technical Report Server.

    ERIC Educational Resources Information Center

    Nelson, Michael L.; Maa, Ming-Hokng

    1996-01-01

    Modifying the NASA Technical Report Server (NTRS), a World Wide Web report distribution NASA technical publications service, has enhanced its performance, protocol support, and human interfacing. This article discusses the original and revised NTRS architecture, sequential and parallel query methods, and wide area information server (WAIS) uniform…

  4. HDF-EOS Web Server

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A shell script has been written as a means of automatically making HDF-EOS-formatted data sets available via the World Wide Web. ("HDF-EOS" and variants thereof are defined in the first of the two immediately preceding articles.) The shell script chains together some software tools developed by the Data Usability Group at Goddard Space Flight Center to perform the following actions: Extract metadata in Object Definition Language (ODL) from an HDF-EOS file, Convert the metadata from ODL to Extensible Markup Language (XML), Reformat the XML metadata into human-readable Hypertext Markup Language (HTML), Publish the HTML metadata and the original HDF-EOS file to a Web server and an Open-source Project for a Network Data Access Protocol (OPeN-DAP) server computer, and Reformat the XML metadata and submit the resulting file to the EOS Clearinghouse, which is a Web-based metadata clearinghouse that facilitates searching for, and exchange of, Earth-Science data.
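
    The script is essentially a linear chain of conversions followed by a copy into the Web tree. The sketch below expresses that chain in Python; the command names (odl_extract, odl2xml, xml2html) are placeholders, not the actual Data Usability Group tools, and the final submission to the EOS Clearinghouse is omitted.

        import subprocess
        from pathlib import Path

        def publish_hdfeos(granule: Path, webroot: Path) -> None:
            """Sketch of the publishing chain described above (placeholder tool names)."""
            odl = granule.with_suffix(".odl")
            xml = granule.with_suffix(".xml")
            html = granule.with_suffix(".html")
            subprocess.run(["odl_extract", str(granule), "-o", str(odl)], check=True)  # 1. pull ODL metadata
            subprocess.run(["odl2xml", str(odl), "-o", str(xml)], check=True)          # 2. ODL -> XML
            subprocess.run(["xml2html", str(xml), "-o", str(html)], check=True)        # 3. XML -> readable HTML
            webroot.mkdir(parents=True, exist_ok=True)
            for f in (granule, html):                                                  # 4. publish data + metadata
                (webroot / f.name).write_bytes(f.read_bytes())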

  5. Laser Based Information Systems (Selected Pages),

    DTIC Science & Technology

    1986-05-22

    Human translation FTD-ID(RS)T-0563-85 (22 May 1986) of selected pages of L. Z. Kriksunov's work on laser-based information systems, approved for public release with unlimited distribution. The scanned abstract consists only of citation fragments (e.g., references on CO lasers and on laser tracking with automatic reacquisition capability) and no substantive summary is recoverable.

  6. [Native electrophoresis in cell proteomics: BN-PAGE and CN-PAGE].

    PubMed

    Shykoliukov, S A

    2011-01-01

    The presented mini-review aims to attract the attention of domestic researchers to a rapid, cheap and easily reproducible method, native polyacrylamide gel electrophoresis (PAGE), which for some reason has not yet found application in our country. The review collects the most interesting examples of the use of three types of native electrophoresis (BN-PAGE, CN-PAGE and hrCN-PAGE) to study the peculiarities of the proteomes of various animal, plant and bacterial cells. References to fundamental reviews, basic protocols, modifications of the initial methods and examples of combining native electrophoresis with other chemical or physical methods are presented. Particular attention is paid to the principles of BN-, CN- and hrCN-PAGE, as well as to their advantages and disadvantages.

  7. 24 CFR 1710.105 - Cover page.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 5 2014-04-01 2014-04-01 false Cover page. 1710.105 Section 1710.105 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued....C. 1718; sec. 7(d), Dept. of Housing and Urban Development Act, 42 U.S.C. 3535(d))...

  8. Monitoring and Managing Links on Your Pages

    EPA Pesticide Factsheets

    Some of these tools can be used on Drupal pages that are not published yet, or on non-Drupal content. Some, such as the Bookmarklet tools, can help make checking and correcting your links easier when used alongside Drupal's link reports.

  9. Referencing web pages and e-journals.

    PubMed

    Bryson, David

    2013-12-01

    One of the areas that can confuse students and authors alike is how to reference web pages and electronic journals (e-journals). The aim of this professional development article is to go back to first principles for referencing and to show, with examples, how these sources should be referenced.

  10. Thomas Jefferson, Page Design, and Desktop Publishing.

    ERIC Educational Resources Information Center

    Hartley, James

    1991-01-01

    Discussion of page design for desktop publishing focuses on the importance of functional issues as opposed to aesthetic issues, and criticizes a previous article that stressed aesthetic issues. Topics discussed include balance, consistency in text structure, and how differences in layout affect the clarity of "The Declaration of…

  11. Reconfigurable Full-Page Braille Displays

    NASA Technical Reports Server (NTRS)

    Garner, H. Douglas

    1994-01-01

    Electrically actuated braille display cells of proposed type arrayed together to form full-page braille displays. Like other braille display cells, these provide changeable patterns of bumps driven by digitally recorded text stored on magnetic tapes or in solid-state electronic memories. Proposed cells contain electrorheological fluid. Viscosity of such fluid increases in strong electrostatic field.

  12. Efficient Web Change Monitoring with Page Digest

    SciTech Connect

    Buttler, D J; Rocco, D; Liu, L

    2004-02-20

    The Internet and the World Wide Web have enabled a publishing explosion of useful online information, which has produced the unfortunate side effect of information overload: it is increasingly difficult for individuals to keep abreast of fresh information. In this paper we describe an approach for building a system for efficiently monitoring changes to Web documents. This paper has three main contributions. First, we present a coherent framework that captures different characteristics of Web documents. The system uses the Page Digest encoding to provide a comprehensive monitoring system for content, structure, and other interesting properties of Web documents. Second, the Page Digest encoding enables improved performance for individual page monitors through mechanisms such as short-circuit evaluation, linear time algorithms for document and structure similarity, and data size reduction. Finally, we develop a collection of sentinel grouping techniques based on the Page Digest encoding to reduce redundant processing in large-scale monitoring systems by grouping similar monitoring requests together. We examine how effective these techniques are over a wide range of parameters and have seen an order of magnitude speed up over existing Web-based information monitoring systems.
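
    As a simplified illustration of content and structure monitoring (not the authors' Page Digest encoding), the sketch below hashes a page's tag structure and its text content separately, so a monitor can tell which of the two has changed and short-circuit when neither digest differs.

        import hashlib
        from html.parser import HTMLParser

        class StructureHash(HTMLParser):
            """Hashes tag structure and text content separately (a simplification)."""

            def __init__(self):
                super().__init__()
                self.structure = hashlib.sha256()
                self.content = hashlib.sha256()

            def handle_starttag(self, tag, attrs):
                self.structure.update(b"<" + tag.encode())

            def handle_endtag(self, tag):
                self.structure.update(b">" + tag.encode())

            def handle_data(self, data):
                self.content.update(data.strip().encode())

        def digest(html):
            p = StructureHash()
            p.feed(html)
            return p.structure.hexdigest(), p.content.hexdigest()

        old = digest("<html><body><p>price: 10</p></body></html>")
        new = digest("<html><body><p>price: 12</p></body></html>")
        print("structure changed:", old[0] != new[0])   # False
        print("content changed:  ", old[1] != new[1])   # True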

  13. Perspectives on the Consecutive Pages Problem

    ERIC Educational Resources Information Center

    Srinivasan, V. K.

    2011-01-01

    This article presents different approaches to a problem, dubbed by the author as "the consecutive pages problem". The aim of this teaching-oriented article is to promote the teaching of abstract concepts in mathematics, by selecting a challenging amusement problem and then presenting various solutions in such a way that it can engage the attention…

  14. What's Not Funny about the Funny Pages?

    ERIC Educational Resources Information Center

    Lum, Lydia

    2008-01-01

    As a kid, Darrin Bell devoured newspaper comic strips. So it was disappointing whenever editors refused years later to add his comic strip, "Candorville," to their funny pages as soon as they saw that his lead characters were minorities. The editors would say they already carried a so-called Black strip. It is difficult for cartoonists like Bell…

  15. Adding Graphics to Your WWW Page.

    ERIC Educational Resources Information Center

    Descy, Don E.

    1995-01-01

    Explains how to retrieve graphics that are available on the World Wide Web and add them to a Web page using a word processor that can save documents in an ASCII (American Standard Code for Information Interchange) text format and a new version of Netscape. A list of various, unrelated Internet resources is also included. (LRW)

  16. 16 CFR 436.3 - Cover page.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... with a cover page, in the order and form as follows: (a) The title “FRANCHISE DISCLOSURE DOCUMENT” in... begin operation of a franchise is . This includes that must be paid to the franchisor or affiliate. (2) This disclosure document summarizes certain provisions of your franchise agreement and...

  17. 16 CFR 436.3 - Cover page.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... with a cover page, in the order and form as follows: (a) The title “FRANCHISE DISCLOSURE DOCUMENT” in... begin operation of a franchise is . This includes that must be paid to the franchisor or affiliate. (2) This disclosure document summarizes certain provisions of your franchise agreement and...

  18. Accounting Programs' Home Pages: What's Happening.

    ERIC Educational Resources Information Center

    Peek, Lucia E.; Roxas, Maria L.

    2002-01-01

    Content analysis of 62 accounting programs' websites indicated the following: 53% include mission statements; 62.9% list accreditation; many faculty biographies and personal pages used inconsistent formats; provision of information on financial aid, student organizations, career services, and certified public accountant requirements varied. Many…

  19. The Argonne Voyager multimedia server

    SciTech Connect

    Disz, T.; Judson, I.; Olson, R.; Stevens, R.

    1997-07-01

    With the growing presence of multimedia-enabled systems, one will see an integration of collaborative computing concepts into the everyday environments of future scientific and technical workplaces. Desktop teleconferencing is in common use today, while more complex desktop teleconferencing technology that relies on the availability of multipoint (greater than two nodes) enabled tools is now starting to become available on PCs. A critical problem when using these collaboration tools is the inability to easily archive multistream, multipoint meetings and make the content available to others. Ideally one would like the ability to capture, record, playback, index, annotate and distribute multimedia stream data as easily as one currently handles text or still image data. While the ultimate goal is still some years away, the Argonne Voyager project is aimed at exploring and developing media server technology needed to provide a flexible virtual multipoint recording/playback capability. In this article the authors describe the motivating requirements, architecture, implementation, operation, performance, and related work.

  20. Intelligence Data Object Server (IDOS)

    NASA Astrophysics Data System (ADS)

    Barnum, Doug J.; Barth, Stephen W.

    2002-07-01

    The Intelligence Data Object Server (IDOS) has been developed under the Air Force Research Laboratory Global Information Base Branch (AFRL/IFED) Global Awareness Virtual Testbed project to provide automated mechanisms for using Military Intelligence Data in modeling and simulation experiments. The IDOS software allows information from multiple data sources to be published in exercises using the High Level Architecture (HLA) or other object-oriented formats. IDOS uses the AFRL/IFEB Broadsword Gatekeeper for data source access. IDOS has been used in simulation-based acquisition experiments designed and carried out among distributed AFRL sites. This paper describes the IDOS architecture and capabilities including the use of the eXtensible Markup Language (XML) to provide a common representation for data objects, and application of IDOS to visualization of Intelligence Information.

  1. Web server with ATMEGA 2560 microcontroller

    NASA Astrophysics Data System (ADS)

    Răduca, E.; Ungureanu-Anghel, D.; Nistor, L.; Haţiegan, C.; Drăghici, S.; Chioncel, C.; Spunei, E.; Lolea, R.

    2016-02-01

    This paper presents the design and construction of a Web server for commanding, controlling and monitoring, at a distance, a variety of industrial or personal equipment and/or sensors. The server runs custom software, which can be written by users and works with many types of operating system. The authors realized the Web server using two boards, a microcontroller (uC) board and a network board. The source code was written in the open-source Arduino 1.0.5 environment.

  2. The Matpar Server on the HP Exemplar

    NASA Technical Reports Server (NTRS)

    Springer, Paul

    2000-01-01

    This presentation reviews the design of Matlab for parallel processing on a parallel system. Matlab was found to be too slow on many large problems, and with the Next Generation Space Telescope requiring greater capability, work began in early 1996 on parallel extensions to Matlab, called Matpar. This presentation reviews the architecture, the functionality, and the design of Matpar. The design utilizes a client/server strategy, with the client code written in C and the object-oriented server code written in C++. The client/server approach for Matpar provides ease of use and good speed.

  3. Developing a web page: bringing clinics online.

    PubMed

    Peterson, Ronnie; Berns, Susan

    2004-01-01

    Introducing clinical staff education, along with new policies and procedures, to over 50 different clinical sites can be a challenge. As any staff educator will confess, getting people to attend an educational inservice session can be difficult. Clinical staff request training, but no one has time to attend training sessions. Putting the training along with the policies and other information into "neat" concise packages via the computer and over the company's intranet was the way to go. However, how do you bring the clinics online when some of the clinical staff may still be reluctant to turn on their computers for anything other than to gather laboratory results? Developing an easy, fun, and accessible Web page was the answer. This article outlines the development of the first training Web page at the University of Wisconsin Medical Foundation, Madison, WI.

  4. The SDSS data archive server

    SciTech Connect

    Neilsen, Eric H., Jr.; /Fermilab

    2007-10-01

    The Sloan Digital Sky Survey (SDSS) Data Archive Server (DAS) provides public access to data files produced by the SDSS data reduction pipeline. This article discusses challenges in public distribution of data of this volume and complexity, and how the project addressed them. The Sloan Digital Sky Survey (SDSS) is an astronomical survey covering roughly one quarter of the night sky. It contains images of this area, a catalog of almost 300 million objects detected in those images, and spectra of more than a million of these objects. The catalog of objects includes a variety of data on each object. These data include not only basic information but also fit parameters for a variety of models, classifications by sophisticated object classification algorithms, statistical parameters, and more. If the survey contains the spectrum of an object, the catalog includes a variety of other parameters derived from its spectrum. Data processing and catalog generation, described more completely in the SDSS Early Data Release paper, consist of several stages: collection of imaging data, processing of imaging data, selection of spectroscopic targets from catalogs generated from the imaging data, collection of spectroscopic data, processing of spectroscopic data, and loading of processed data into a database. Each of these stages is itself a complex process. For example, the software that processes the imaging data determines and removes some instrumental signatures in the raw images to create 'corrected frames', models the point spread function, models and removes the sky background, detects objects, measures object positions, measures the radial profile and other morphological parameters for each object, measures the brightness of each object using a variety of methods, classifies the objects, calibrates the brightness measurements against survey standards, and produces a variety of quality assurance plots and diagnostic tables. The complexity of the spectroscopic data

  5. Insights into Facebook Pages: an early adolescent health research study page targeted at parents.

    PubMed

    Amon, Krestina L; Paxton, Karen; Klineberg, Emily; Riley, Lisa; Hawke, Catherine; Steinbeck, Katharine

    2016-02-01

    Facebook has been used in health research, but there is a lack of literature regarding how Facebook may be used to recruit younger adolescents. A Facebook Page was created for an adolescent cohort study on the effects of puberty hormones on well-being and behaviour in early adolescence. Used as a communication tool with existing participants, it also aimed to alert potential participants to the study. The purpose of this paper is to provide a detailed description of the development of the study Facebook Page and present the fan response to the types of posts made on the Page using the Facebook-generated Insights data. Two types of posts were made on the study Facebook Page. The first type was study-related update posts and events. The second was relevant adolescent and family research and current news posts. Observations on the use of and response to the Page were made over 1 year across three phases (phase 1, very low Facebook use; phase 2, high Facebook use; phase 3, low Facebook use). Most Page fans were female (88.6%), with the largest group of fans aged between 35 and 44 years. Study-related update posts with photographs were the most popular. This paper provides a model on which other researchers could base Facebook communication and potential recruitment in the absence of established guidelines.

  6. PEM public key certificate cache server

    NASA Astrophysics Data System (ADS)

    Cheung, T.

    1993-12-01

    Privacy Enhanced Mail (PEM) provides privacy enhancement services to users of Internet electronic mail. Confidentiality, authentication, message integrity, and non-repudiation of origin are provided by applying cryptographic measures to messages transferred between end systems by the Message Transfer System. PEM supports both symmetric and asymmetric key distribution. However, the prevalent implementation uses a public key certificate-based strategy, modeled after the X.509 directory authentication framework. This scheme provides an infrastructure compatible with X.509. According to RFC 1422, public key certificates can be stored in directory servers, transmitted via non-secure message exchanges, or distributed via other means. Directory services provide a specialized distributed database for OSI applications. The directory contains information about objects and then provides structured mechanisms for accessing that information. Since directory services are not widely available now, a good approach is to manage certificates in a centralized certificate server. This document describes the detailed design of a centralized certificate cache server. This server manages a cache of certificates and a cache of Certificate Revocation Lists (CRL's) for PEM applications. PEM applications contact the server to obtain/store certificates and CRL's. The server software is programmed in C and ELROS. To use this server, ISODE has to be configured and installed properly. The ISODE library 'libisode.a' has to be linked together with this library because ELROS uses the transport layer functions provided by 'libisode.a.' The X.500 DAP library that is included with the ELROS distribution has to be linked in also, since the server uses the DAP library functions to communicate with directory servers.

  7. Conversation Threads Hidden within Email Server Logs

    NASA Astrophysics Data System (ADS)

    Palus, Sebastian; Kazienko, Przemysław

    Email server logs contain records of all email exchanged through the server. Often we would like to analyze those emails not separately but in conversation threads, especially when we need to analyze a social network extracted from the email logs. Unfortunately, each mail is a separate record, and those records are not tied to each other in any obvious way. In this paper a method for extracting discussion threads is proposed, together with experiments on two different data sets: Enron and WrUT.
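
    One simple way to tie such records together is to group messages by normalized subject line, as in the sketch below. This is a simplification for illustration, not the algorithm proposed in the paper; the sample messages are invented.

        from collections import defaultdict
        from email.parser import Parser

        def normalize_subject(subject):
            """Strip reply/forward prefixes so replies join their original thread."""
            s = (subject or "").strip()
            while s.lower().startswith(("re:", "fw:", "fwd:")):
                s = s.split(":", 1)[1].strip()
            return s.lower()

        def group_threads(raw_messages):
            """Group raw RFC 822 messages into threads keyed by normalized subject."""
            threads = defaultdict(list)
            for raw in raw_messages:
                msg = Parser().parsestr(raw)
                threads[normalize_subject(msg["Subject"])].append(msg)
            return threads

        mails = [
            "Subject: Budget 2004\nFrom: a@example.com\n\nDraft attached.",
            "Subject: Re: Budget 2004\nFrom: b@example.com\n\nLooks fine.",
        ]
        for topic, msgs in group_threads(mails).items():
            print(topic, len(msgs))   # budget 2004 2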

  8. Graphic Server: A real time system for displaying and monitoring telemetry data of several satellites

    NASA Technical Reports Server (NTRS)

    Douard, Stephane

    1994-01-01

    Known as a Graphic Server, the system presented was designed for the control ground segment of the Telecom 2 satellites. It is a tool used to dynamically display telemetry data within graphic pages, also known as views. The views are created off-line through various utilities and then, on the operator's request, displayed and animated in real time as data is received. The system was designed as an independent component, and is installed in different Telecom 2 operational control centers. It enables operators to monitor changes in the platform and satellite payloads in real time. It has been in operation since December 1991.

  9. RCD+: Fast loop modeling server

    PubMed Central

    López-Blanco, José Ramón; Canosa-Valls, Alejandro Jesús; Li, Yaohang; Chacón, Pablo

    2016-01-01

    Modeling loops is a critical and challenging step in protein modeling and prediction. We have developed a quick online service (http://rcd.chaconlab.org) for ab initio loop modeling combining a coarse-grained conformational search with a full-atom refinement. Our original Random Coordinate Descent (RCD) loop closure algorithm has been greatly improved to enrich the sampling distribution towards near-native conformations. These improvements include a new workflow optimization, MPI-parallelization and fast backbone angle sampling based on neighbor-dependent Ramachandran probability distributions. The server starts by efficiently searching the vast conformational space from only the loop sequence information and the environment atomic coordinates. The generated closed loop models are subsequently ranked using a fast distance-orientation dependent energy filter. Top ranked loops are refined with the Rosetta energy function to obtain accurate all-atom predictions that can be interactively inspected in a user-friendly web interface. Using standard benchmarks, the average root mean squared deviation (RMSD) is 0.8 and 1.4 Å for 8- and 12-residue loops, respectively, in the challenging modeling scenario in which the side chains of the loop environment are fully remodeled. These results are not only very competitive compared to those obtained with publicly available state-of-the-art methods, but they are also obtained ∼10-fold faster. PMID:27151199

  10. PDS: A Performance Database Server

    DOE PAGES

    Berry, Michael W.; Dongarra, Jack J.; Larose, Brian H.; ...

    1994-01-01

    The process of gathering, archiving, and distributing computer benchmark data is a cumbersome task usually performed by computer users and vendors with little coordination. Most important, there is no publicly available central depository of performance data for all ranges of machines from personal computers to supercomputers. We present an Internet-accessible performance database server (PDS) that can be used to extract current benchmark data and literature. As an extension to the X-Windows-based user interface (Xnetlib) to the Netlib archival system, PDS provides an on-line catalog of public domain computer benchmarks such as the LINPACK benchmark, Perfect benchmarks, and the NAS parallel benchmarks. PDS does not reformat or present the benchmark data in any way that conflicts with the original methodology of any particular benchmark; it is thereby devoid of any subjective interpretations of machine performance. We believe that all branches (research laboratories, academia, and industry) of the general computing community can use this facility to archive performance metrics and make them readily available to the public. PDS can provide a more manageable approach to the development and support of a large dynamic database of published performance metrics.

  11. Creating a nursing home page on the World Wide Web.

    PubMed

    Shellenbarger, T; Thomas, S

    1996-01-01

    The authors provide a brief overview of the internet and home pages on the World Wide Web. Definitions of Web terminology are provided to help the reader understand home page creation. The authors also describe the steps in electronic publishing and how to create a home page. Supplemental tables provide internet addresses for nursing and non-nursing sites for review of other home pages. The article continues with information about design, formatting, adding text and images, and dissemination suggestions for home pages. Examples of home pages and instruction commands (tags) are provided. The future of Web publishing is discussed, and issues and concerns are raised regarding electronic publishing.

  12. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web-application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back- end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for data base access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google

  13. Facebook's personal page modelling and simulation

    NASA Astrophysics Data System (ADS)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we will try to define the utility of Facebook's Personal Page marketing method. This tool that Facebook provides is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The model has been developed for a social media marketing agent/company, oriented to the Facebook platform and tested in real circumstances, and is finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are authenticated by the management of the company organization. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which is to bring in new customers, keep the interest of existing customers and deliver traffic to its website.

  14. Some Guidelines for Creating World Wide Web Home Page Files.

    ERIC Educational Resources Information Center

    van Brakel, Pieter A.; And Others

    1995-01-01

    Provides guidelines for home page design, and suggests that the physical appearance of a home page is similar to that of a good graphical user interface. In designing a complete home page file, the premise is that basic hypertext design principles could also be applied in the World Wide Web environment. (Author/JKP)

  15. World Wide Web Pages--Tools for Teaching and Learning.

    ERIC Educational Resources Information Center

    Beasley, Sarah; Kent, Jean

    Created to help educators incorporate World Wide Web pages into teaching and learning, this collection of Web pages presents resources, materials, and techniques for using the Web. The first page focuses on tools for teaching and learning via the Web, providing pointers to sites containing the following: (1) course materials for both distance and…

  16. Digital Ethnography: Library Web Page Redesign among Digital Natives

    ERIC Educational Resources Information Center

    Klare, Diane; Hobbs, Kendall

    2011-01-01

    Presented with an opportunity to improve Wesleyan University's dated library home page, a team of librarians employed ethnographic techniques to explore how its users interacted with Wesleyan's current library home page and web pages in general. Based on the data that emerged, a group of library staff and members of the campus' information…

  17. Required Discussion Web Pages in Psychology Courses and Student Outcomes

    ERIC Educational Resources Information Center

    Pettijohn, Terry F., II; Pettijohn, Terry F.

    2007-01-01

    We conducted 2 studies that investigated student outcomes when using discussion Web pages in psychology classes. In Study 1, we assigned 213 students enrolled in Introduction to Psychology courses to either a mandatory or an optional Web page discussion condition. Students used the discussion Web page significantly more often and performed…

  18. Young Children's Interpretations of Page Breaks in Contemporary Picture Storybooks

    ERIC Educational Resources Information Center

    Sipe, Lawrence R.; Brightman, Anne E.

    2009-01-01

    This article reports on a study of the responses of a second-grade class to the page breaks in contemporary picturebooks. In a picturebook, the text and accompanying illustrations are divided into a series of facing pages called openings, and the divisions between the openings are called page breaks or turns. Unlike a novel, in which the page…

  19. The ClusPro web server for protein-protein docking.

    PubMed

    Kozakov, Dima; Hall, David R; Xia, Bing; Porter, Kathryn A; Padhorny, Dzmitry; Yueh, Christine; Beglov, Dmitri; Vajda, Sandor

    2017-02-01

    The ClusPro server (https://cluspro.org) is a widely used tool for protein-protein docking. The server provides a simple home page for basic use, requiring only two files in Protein Data Bank (PDB) format. However, ClusPro also offers a number of advanced options to modify the search; these include the removal of unstructured protein regions, application of attraction or repulsion, accounting for pairwise distance restraints, construction of homo-multimers, consideration of small-angle X-ray scattering (SAXS) data, and location of heparin-binding sites. Six different energy functions can be used, depending on the type of protein. Docking with each energy parameter set results in ten models defined by centers of highly populated clusters of low-energy docked structures. This protocol describes the use of the various options, the construction of auxiliary restraints files, the selection of the energy parameters, and the analysis of the results. Although the server is heavily used, runs are generally completed in <4 h.

  20. Workload Characterization and Performance Implications of Large-Scale Blog Servers

    SciTech Connect

    Jeon, Myeongjae; Kim, Youngjae; Hwang, Jeaho; Lee, Joonwon; Seo, Euiseong

    2012-11-01

    With the ever-increasing popularity of social network services (SNSs), an understanding of the characteristics of these services and their effects on the behavior of their host servers is critical. However, there has been a lack of research on the workload characterization of servers running SNS applications such as blog services. To fill this void, we empirically characterized real-world web server logs collected from one of the largest South Korean blog hosting sites for 12 consecutive days. The logs consist of more than 96 million HTTP requests and 4.7 TB of network traffic. Our analysis reveals the following: (i) The transfer size of non-multimedia files and blog articles can be modeled using a truncated Pareto distribution and a log-normal distribution, respectively; (ii) User access for blog articles does not show temporal locality, but is strongly biased towards those posted with image or audio files. We additionally discuss the potential performance improvement through clustering of small files on a blog page into contiguous disk blocks, which benefits from the observed file access patterns. Trace-driven simulations show that, on average, the suggested approach achieves 60.6% better system throughput and reduces the processing time for file access by 30.8% compared to the best performance of the Ext4 file system.
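
    Fitting a size distribution of this kind is straightforward with SciPy. The sketch below fits a log-normal to synthetic article sizes (standing in for the unavailable trace data) and reports a Kolmogorov-Smirnov goodness-of-fit statistic; the parameter values are illustrative only.

        import numpy as np
        from scipy import stats

        # Synthetic article sizes in bytes; stands in for the real trace.
        rng = np.random.default_rng(0)
        article_sizes = rng.lognormal(mean=9.0, sigma=1.2, size=10_000)

        # Fit a log-normal with the location fixed at zero, as is usual for sizes.
        shape, loc, scale = stats.lognorm.fit(article_sizes, floc=0)
        print(f"sigma={shape:.2f}, median={scale:.0f} bytes")

        # Quick goodness-of-fit check against the fitted distribution.
        ks = stats.kstest(article_sizes, "lognorm", args=(shape, loc, scale))
        print(f"KS statistic: {ks.statistic:.3f}")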

  1. ProFunc: a server for predicting protein function from 3D structure.

    PubMed

    Laskowski, Roman A; Watson, James D; Thornton, Janet M

    2005-07-01

    ProFunc (http://www.ebi.ac.uk/thornton-srv/databases/ProFunc) is a web server for predicting the likely function of proteins whose 3D structure is known but whose function is not. Users submit the coordinates of their structure to the server in PDB format. ProFunc makes use of both existing and novel methods to analyse the protein's sequence and structure, identifying functional motifs or close relationships to functionally characterized proteins. A summary of the analyses provides an at-a-glance view of what each of the different methods has found. More detailed results are available on separate pages. Often where one method has failed to find anything useful another may be more forthcoming. The server is likely to be of most use in structural genomics, where a large proportion of the proteins whose structures are solved are hypothetical proteins of unknown function. However, it may also find use in a comparative analysis of members of large protein families. It provides a convenient compendium of sequence and structural information that often holds vital functional clues to be followed up experimentally.

  2. A Server-Based Mobile Coaching System

    PubMed Central

    Baca, Arnold; Kornfeind, Philipp; Preuschl, Emanuel; Bichler, Sebastian; Tampier, Martin; Novatchkov, Hristo

    2010-01-01

    A prototype system for monitoring, transmitting and processing performance data in sports for the purpose of providing feedback has been developed. During training, athletes are equipped with a mobile device and wireless sensors using the ANT protocol in order to acquire biomechanical, physiological and other sports specific parameters. The measured data is buffered locally and forwarded via the Internet to a server. The server provides experts (coaches, biomechanists, sports medicine specialists etc.) with remote data access, analysis and (partly automated) feedback routines. In this way, experts are able to analyze the athlete’s performance and return individual feedback messages from remote locations. PMID:22163490

  3. Optimizing the NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maa, Ming-Hokng

    1996-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web report distribution NASA technical publications service, is modified for performance enhancement, greater protocol support, and human interface optimization. Results include: parallel database queries, significantly decreasing user access times by an average factor of 2.3; access from clients behind firewalls and/or proxies which truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.

  4. Readers, Authors, and Page Structure: A Discussion of Four Questions Arising from a Content Analysis of Web Pages.

    ERIC Educational Resources Information Center

    Haas, Stephanie W.; Grams, Erika S.

    2000-01-01

    Discusses research describing Web page and link classification systems resulting from a content analysis of over 75 Web pages. Topics include the decision-making processes of Web page authors and readers; syntactic analysis of labeled and isolated anchors; expansion and resource links; and where links lead. (Author/LRW)

  5. A Video Broadcast Architecture with Server Placement Programming

    NASA Astrophysics Data System (ADS)

    He, Lei; Ma, Xiangjie; Zhang, Weili; Guo, Yunfei; Liu, Wenbo

    We propose MTreeTV, a hybrid architecture that supports fast channel switching. MTreeTV combines the use of P2P networks with dedicated streaming servers, building on the advantages of both the P2P and CDN paradigms. We study the placement of the servers with constraints on the client-to-server paths and evaluate the effect of the server parameters. Through analysis and simulation, we show that MTreeTV supports fast channel switching (<4 s).

  6. Network characteristics for server selection in online games

    NASA Astrophysics Data System (ADS)

    Claypool, Mark

    2008-01-01

    Online gameplay is impacted by the network characteristics of players connected to the same server. Unfortunately, the network characteristics of online game servers are not well-understood, particularly for groups that wish to play together on the same server. As a step towards a remedy, this paper presents analysis of an extensive set of measurements of game servers on the Internet. Over the course of many months, actual Internet game servers were queried simultaneously by twenty-five emulated game clients, with both servers and clients spread out on the Internet. The data provides statistics on the uptime and populations of game servers over a month-long period and an in-depth look at the suitability of game servers for multi-player server selection, concentrating on characteristics critical to playability--latency and fairness. Analysis finds most game servers have latencies suitable for third-person and omnipresent games, such as real-time strategy, sports and role-playing games, providing numerous server choices for game players. However, far fewer game servers have the low latencies required for first-person games, such as shooters or race games. In all cases, groups that wish to play together have a greatly reduced set of servers from which to choose because of inherent unfairness in server latencies, and server selection is particularly limited as the group size increases. These results hold across different game types and even across different generations of games. The data should be useful for game developers and network researchers that seek to improve game server selection, whether for single or multiple players.
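
    A group-aware selection rule of the kind evaluated here can be sketched as follows; the thresholds and the min-max fairness criterion are illustrative choices, not the paper's exact metrics, and the server latencies are invented.

        def pick_group_server(latency_ms, max_rtt=100.0, max_spread=50.0):
            """Pick a server for a group of players.

            latency_ms maps server name -> list of per-player RTTs. A server is
            eligible only if every player's RTT is below max_rtt and the spread
            between best- and worst-connected player stays below max_spread.
            Among eligible servers, the one with the lowest worst-case RTT wins.
            """
            best = None
            for server, rtts in latency_ms.items():
                worst, spread = max(rtts), max(rtts) - min(rtts)
                if worst <= max_rtt and spread <= max_spread:
                    if best is None or worst < best[1]:
                        best = (server, worst)
            return best

        servers = {
            "eu-1": [35, 40, 90],    # fast for two players, unfair to the third
            "us-2": [60, 65, 70],    # slower but fair
        }
        print(pick_group_server(servers))   # ('us-2', 70)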

  7. Optimal allocation of file servers in a local network environment

    NASA Technical Reports Server (NTRS)

    Woodside, C. M.; Tripathi, S. K.

    1986-01-01

    Files associated with workstations in a local area network are to be allocated among two or more file servers. Assuming statistically identical workstations and file servers and a performance model which is a closed multiclass separable queueing network, an optimal allocation is found. It is shown that all the files of each workstation should be placed on one file server, with the workstations divided as equally as possible among the file servers.
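
    The resulting allocation rule is easy to state in code: keep each workstation's files together and deal the workstations out to the file servers as evenly as possible. The sketch below is only an illustration of that rule; the workstation and server names are invented.

        def allocate(workstations, servers):
            """Place all files of each workstation on one server, splitting the
            workstations as evenly as possible across servers."""
            assignment = {s: [] for s in servers}
            for i, w in enumerate(workstations):
                assignment[servers[i % len(servers)]].append(w)
            return assignment

        print(allocate([f"ws{i}" for i in range(5)], ["fs1", "fs2"]))
        # {'fs1': ['ws0', 'ws2', 'ws4'], 'fs2': ['ws1', 'ws3']}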

  8. Dynamic Context-Sensitive PageRank for Expertise Mining

    NASA Astrophysics Data System (ADS)

    Schall, Daniel; Dustdar, Schahram

    Online tools for collaboration and social platforms have become omnipresent in Web-based environments. Interests and skills of people evolve over time depending on the activities performed and joint collaborations. We believe that ranking models for recommending experts or collaboration partners should not rely only on profiles or skill information that need to be manually maintained and updated by the user. In this work we address the problem of expertise mining based on the interactions performed between people. We argue that an expertise mining algorithm must consider a person's interest and activity level in a certain collaboration context. Our approach is based on the PageRank algorithm, enhanced by techniques to incorporate contextual link information. An approach comprising two steps is presented: first, offline analysis of human interactions considering tagged interaction links, and second, composition of ranking scores based on preferences. We evaluate our approach using an email interaction network.
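
    A context-weighted variant of PageRank can be sketched with a personalization vector built from each person's activity in the collaboration context of interest. The power-iteration version below is a generic illustration, not the authors' exact formulation; the interaction graph and context weights are invented.

        def personalized_pagerank(links, weights, damping=0.85, iters=50):
            """PageRank with teleportation biased by per-node context weights.

            links:   dict node -> list of nodes it interacts with (outgoing links)
            weights: dict node -> non-negative activity weight in the chosen context
            """
            nodes = list(links)
            total = sum(weights.get(n, 0.0) for n in nodes)
            if total == 0:
                personal = {n: 1.0 / len(nodes) for n in nodes}
            else:
                personal = {n: weights.get(n, 0.0) / total for n in nodes}
            rank = {n: 1.0 / len(nodes) for n in nodes}
            for _ in range(iters):
                new = {n: (1 - damping) * personal[n] for n in nodes}
                for n in nodes:
                    out = links[n]
                    if not out:                       # dangling node: redistribute
                        for m in nodes:
                            new[m] += damping * rank[n] * personal[m]
                    else:
                        share = damping * rank[n] / len(out)
                        for m in out:
                            new[m] += share
                rank = new
            return rank

        interactions = {"alice": ["bob"], "bob": ["alice", "carol"], "carol": ["bob"]}
        context_activity = {"alice": 3.0, "bob": 1.0, "carol": 0.0}
        print(personalized_pagerank(interactions, context_activity))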

  9. Lifting Events in RDF from Interactions with Annotated Web Pages

    NASA Astrophysics Data System (ADS)

    Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad

    In this paper we present a method and an implementation for creating and processing semantic events from interaction with Web pages which opens possibilities to build event-driven applications for the (Semantic) Web. Events, simple or complex, are models for things that happen e.g., when a user interacts with a Web page. Events are consumed in some meaningful way e.g., for monitoring reasons or to trigger actions such as responses. In order for receiving parties to understand events e.g., comprehend what has led to an event, we propose a general event schema using RDFS. In this schema we cover the composition of complex events and event-to-event relationships. These events can then be used to route semantic information about an occurrence to different recipients helping in making the Semantic Web active. Additionally, we present an architecture for detecting and composing events in Web clients. For the contents of events we show a way of how they are enriched with semantic information about the context in which they occurred. The paper is presented in conjunction with the use case of Semantic Advertising, which extends traditional clickstream analysis by introducing semantic short-term profiling, enabling discovery of the current interest of a Web user and therefore supporting advertisement providers in responding with more relevant advertisements.

  10. Calibrating page sized Gafchromic EBT3 films

    SciTech Connect

    Crijns, W.; Maes, F.; Heide, U. A. van der; Van den Heuvel, F.

    2013-01-15

    Purpose: The purpose is the development of a novel calibration method for dosimetry with Gafchromic EBT3 films. The method should be applicable for pretreatment verification of volumetric modulated arc, and intensity modulated radiotherapy. Because the exposed area on film can be large for such treatments, lateral scan errors must be taken into account. The correction for the lateral scan effect is obtained from the calibration data itself. Methods: In this work, the film measurements were modeled using their relative scan values (Transmittance, T). Inside the transmittance domain a linear combination and a parabolic lateral scan correction described the observed transmittance values. The linear combination model combined a monomer transmittance state (T0) and a polymer transmittance state (T∞) of the film. The dose domain was associated with the observed effects in the transmittance domain through a rational calibration function. On the calibration film only simple static fields were applied and page sized films were used for calibration and measurements (treatment verification). Four different calibration setups were considered and compared with respect to dose estimation accuracy. The first (I) used a calibration table from 32 regions of interest (ROIs) spread on 4 calibration films, the second (II) used 16 ROIs spread on 2 calibration films, the third (III) and fourth (IV) used 8 ROIs spread on a single calibration film. The calibration tables of the setups I, II, and IV contained eight dose levels delivered to different positions on the films, while for setup III only four dose levels were applied. Validation was performed by irradiating film strips with known doses at two different time points over the course of a week. Accuracy of the dose response and the lateral effect correction was estimated using the dose difference and the root mean squared error (RMSE), respectively. Results: A calibration based on two films was the optimal
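
    Written out, one plausible reading of such a model is a dose-dependent mixture of the two transmittance states plus a parabolic correction in the lateral scanner coordinate. The functional forms and parameter values below are assumptions for illustration only, not the published parameterization.

        import math

        def transmittance_model(dose, x, T0, Tinf, k, c2, x0):
            """Illustrative transmittance model (assumed functional forms)."""
            alpha = math.exp(-k * dose)             # assumed dose-dependent mixing weight
            lateral = 1.0 + c2 * (x - x0) ** 2      # parabolic lateral-scan correction
            return (alpha * T0 + (1.0 - alpha) * Tinf) * lateral

        # Example with made-up parameter values.
        print(transmittance_model(dose=2.0, x=10.0, T0=0.85, Tinf=0.35, k=0.4, c2=1e-4, x0=0.0))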

  11. HexServer: an FFT-based protein docking server powered by graphics processors.

    PubMed

    Macindoe, Gary; Mavridis, Lazaros; Venkatraman, Vishwesh; Devignes, Marie-Dominique; Ritchie, David W

    2010-07-01

    HexServer (http://hexserver.loria.fr/) is the first Fourier transform (FFT)-based protein docking server to be powered by graphics processors. Using two graphics processors simultaneously, a typical 6D docking run takes approximately 15 s, which is up to two orders of magnitude faster than conventional FFT-based docking approaches using comparable resolution and scoring functions. The server requires two protein structures in PDB format to be uploaded, and it produces a ranked list of up to 1000 docking predictions. Knowledge of one or both protein binding sites may be used to focus and shorten the calculation when such information is available. The first 20 predictions may be accessed individually, and a single file of all predicted orientations may be downloaded as a compressed multi-model PDB file. The server is publicly available and does not require any registration or identification by the user.

  12. Proteomic study of muscle sarcoplasmic proteins using AUT-PAGE/SDS-PAGE as two-dimensional gel electrophoresis.

    PubMed

    Picariello, Gianluca; De Martino, Alessandra; Mamone, Gianfranco; Ferranti, Pasquale; Addeo, Francesco; Faccia, Michele; Spagnamusso, Salvatore; Di Luccia, Aldo

    2006-03-20

    In the present study, an alternative procedure for two-dimensional (2D) electrophoretic analysis in proteomic investigation of the most represented basic muscle water-soluble proteins is suggested. Our method consists of Acetic acid-Urea-Triton polyacrylamide gel (AUT-PAGE) analysis in the first dimension and standard sodium dodecyl sulphate polyacrylamide gel (SDS-PAGE) in the second dimension. Although standard two-dimensional Immobilized pH Gradient-Sodium Dodecyl-Sulphate (2D IPG-SDS) gel electrophoresis has been successfully used to study these proteins, most of the water-soluble proteins are spread on the alkaline part of the 2D map and are poorly focused. Furthermore, the similarity in their molecular weights impairs resolution of the classical approach. The addition of Triton X-100, a non-ionic detergent, into the gel induces a differential electrophoretic mobility of proteins as a result of the formation of mixed micelles between the detergent and the hydrophobic moieties of polypeptides, separating basic proteins with a criterion similar to reversed phase chromatography based on their hydrophobicity. The acid pH induces positive net charges, increasing with the isoelectric point of proteins, thus allowing enhanced resolution in the separation. By using the 2D AUT-PAGE/SDS electrophoresis approach to separate water-soluble proteins from fresh pork and from dry-cured products, we could spread proteins over a greater area, achieving a greater resolution than that obtained by IPG in the pH range 3-10 and 6-11. Sarcoplasmic proteins undergoing proteolysis during the ripening of products were identified by Matrix Assisted Laser Desorption/Ionization-Time of Flight (MALDI-ToF) mass spectrometry peptide mass fingerprinting in an easier and more effective way. Two-dimensional AUT-PAGE/SDS electrophoresis has made it possible to simplify the separation of sarcoplasmic protein mixtures, making this technique suitable for defining the quality of dry-cured pork products by immediate

  13. Interfaces for Distributed Systems of Information Servers.

    ERIC Educational Resources Information Center

    Kahle, Brewster M.; And Others

    1993-01-01

    Describes five interfaces to remote, full-text databases accessed through distributed systems of servers. These are WAIStation for the Macintosh, XWAIS for X-Windows, GWAIS for Gnu-Emacs; SWAIS for dumb terminals, and Rosebud for the Macintosh. Sixteen illustrations provide examples of display screens. Problems and needed improvements are…

  14. World Wide Web Server Standards and Guidelines.

    ERIC Educational Resources Information Center

    Stubbs, Keith M.

    This document defines the specific standards and general guidelines which the U.S. Department of Education (ED) will use to make information available on the World Wide Web (WWW). The purpose of providing such guidance is to ensure high quality and consistent content, organization, and presentation of information on ED WWW servers, in order to…

  15. Implementing bioinformatic workflows within the bioextract server

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  16. Implementing Adaptive Performance Management in Server Applications

    SciTech Connect

    Liu, Yan; Gorton, Ian

    2007-06-11

    Performance and scalability are critical quality attributes for server applications in Internet-facing business systems. These applications operate in dynamic environments with rapidly fluctuating user loads and resource levels, and unpredictable system faults. Adaptive (autonomic) systems research aims to augment such server applications with intelligent control logic that can detect and react to sudden environmental changes. However, developing this adaptive logic is complex in itself. In addition, executing the adaptive logic consumes processing resources, and hence may (paradoxically) adversely affect application performance. In this paper we describe an approach for developing high-performance adaptive server applications and the supporting technology. The Adaptive Server Framework (ASF) is built on standard middleware services, and can be used to augment legacy systems with adaptive behavior without needing to change the application business logic. Crucially, ASF provides built-in control loop components to optimize the overall application performance, which comprises both the business and adaptive logic. The control loop is based on performance models and allows systems designers to tune the performance levels simply by modifying high level declarative policies. We demonstrate the use of ASF in a case study.
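
    The core of such adaptive logic is a monitor/analyze/act loop driven by a declarative policy. The sketch below is a generic illustration of that idea, not the Adaptive Server Framework API; the policy is reduced to a target latency and a step size for resizing a worker pool, and the measurement and actuation hooks are stubs.

        import random
        import time

        def control_loop(measure_latency, apply_pool_size, target_ms=200, step=2,
                         pool=8, min_pool=2, max_pool=64, period_s=5, rounds=6):
            """Generic policy-driven control loop (illustrative, not the ASF API)."""
            for _ in range(rounds):
                latency = measure_latency()
                if latency > target_ms and pool < max_pool:
                    pool = min(max_pool, pool + step)      # react to overload
                elif latency < 0.5 * target_ms and pool > min_pool:
                    pool = max(min_pool, pool - step)      # release spare resources
                apply_pool_size(pool)
                time.sleep(period_s)
            return pool

        # Stubs standing in for real instrumentation and actuation hooks.
        print(control_loop(lambda: random.uniform(50, 400), lambda n: None, period_s=0))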

  17. Distributed analysis with CRAB: The client-server architecture evolution and commissioning

    SciTech Connect

    Codispoti, G.; Cinquilli, M.; Fanfani, A.; Fanzago, F.; Farina, F.; Lacaprara, S.; Miccio, V.; Spiga, D.; Vaandering, E.; /Fermilab

    2008-01-01

    CRAB (CMS Remote Analysis Builder) is the tool used by CMS to enable running physics analysis in a transparent manner over data distributed across many sites. It abstracts out the interaction with the underlying batch farms, grid infrastructure and CMS workload management tools, such that it is easily usable by non-experts. CRAB can be used as a direct interface to the computing system or can delegate the user task to a server. Major efforts have been dedicated to the client-server system development, allowing the user to deal only with a simple and intuitive interface and to delegate all the work to a server. The server takes care of handling the user's jobs during the whole lifetime of the user's task. In particular, it takes care of data and resource discovery, process tracking and output handling. It also provides services such as automatic resubmission in case of failures, notification to the user of the task status, and automatic blacklisting of sites showing evident problems beyond what is provided by existing grid infrastructure. The CRAB Server architecture and its deployment will be presented, as well as the current status and future development. In addition, the experience in using the system for initial detector commissioning activities and data analysis will be summarized.
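
    Purely as an illustration (this is not CRAB code), the Python sketch below shows the generic pattern behind two of the server-side services listed above: automatic resubmission of failed jobs and blacklisting of sites that fail repeatedly. The job layout, thresholds and submit callback are assumptions made for the example.

```python
# Illustrative only (not CRAB code): generic resubmission of failed jobs with
# blacklisting of sites that fail repeatedly. Layout and thresholds are assumed.
from collections import Counter

MAX_RETRIES = 3            # hypothetical per-job retry limit
BLACKLIST_THRESHOLD = 5    # hypothetical failures per site before it is avoided

def resubmit_failed_jobs(jobs, submit, blacklist=None):
    """jobs: list of dicts like {"id": 1, "site": "siteA", "status": "failed", "retries": 0}."""
    blacklist = set(blacklist or [])
    failures_per_site = Counter(j["site"] for j in jobs if j["status"] == "failed")
    # Blacklist sites showing evident, repeated problems.
    blacklist |= {s for s, n in failures_per_site.items() if n >= BLACKLIST_THRESHOLD}
    for job in jobs:
        if job["status"] == "failed" and job["retries"] < MAX_RETRIES:
            job["retries"] += 1
            submit(job, avoid_sites=blacklist)   # resubmit, steering away from bad sites
    return blacklist

jobs = [{"id": i, "site": "siteA" if i < 5 else "siteB", "status": "failed", "retries": 0}
        for i in range(7)]
print(resubmit_failed_jobs(jobs, submit=lambda job, avoid_sites: None))   # {'siteA'}
```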

  18. Design of a Web Page as a complement of educative innovation through MOODLE

    NASA Astrophysics Data System (ADS)

    Mendiola Ubillos, M. A.; Aguado Cortijo, Pedro L.

    2010-05-01

    In the context of using Information Technology to impart knowledge, and to establish the MOODLE system as a support and complementary tool to on-site educational methodology (b-learning), a Web page was designed for the course Agronomic and Food Industry Crops (Plantas de interés Agroalimentario) during the 2006-07 academic year. This web page was placed on the Technical University of Madrid (Universidad Politécnica de Madrid) computer system to give students a first contact with the contents of this subject. The page presents the objectives and methodology, personal work planning, the subject programme and the activities. At another web site, the evaluation criteria and recommended bibliography are located. The objective of this web page has been to make the information needed in the learning process more transparent and accessible, and to present it in a more attractive frame. The page has been updated and modified in each academic course offered since its first implementation, and in some cases new specific links have been added to increase its usefulness. At the end of each course a survey is given to the students taking this subject, asking which elements they would like to modify, delete or add to this web page. In this way the direct users give their point of view and help to improve the web page each course.

  19. EarthServer: Information Retrieval and Query Language

    NASA Astrophysics Data System (ADS)

    Perperis, Thanassis; Koltsida, Panagiota; Kakaletris, George

    2013-04-01

    Establishing open, unified, seamless access and ad-hoc analytics on cross-disciplinary, multi-source, multi-dimensional, spatiotemporal Earth Science data of extreme size and their supporting metadata are the main challenges of the EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program. One of EarthServer's main objectives is to provide users with higher-level coverage and metadata search, retrieval and processing capabilities for multi-disciplinary Earth Science data. Six Lighthouse Applications are being established, each one providing access to Cryospheric, Airborne, Atmospheric, Geology, Oceanography and Planetary science raster data repositories through strictly WCS 2.0 standard based service endpoints. EarthServer's information retrieval subsystem aims towards exploiting the WCS endpoints through a physically and logically distributed service-oriented architecture, foreseeing the collaboration of several standard-compliant services, capable of exploiting modern large grid and cloud infrastructures and of dynamically responding to the availability and capabilities of underlying resources. Towards furthering technology for integrated, coherent service provision based on WCS and WCPS, the concept of a query language (QL) unifying coverage and metadata processing and retrieval is introduced. EarthServer's information retrieval subsystem receives QL requests involving high volumes of all Earth Science data categories, executes them on the services that reside on the infrastructure and sends the results back to the requester through a high-performance pipeline. In this contribution we briefly discuss EarthServer's service-oriented coverage data and metadata search and retrieval architecture and further elaborate on the potentials of EarthServer's Query Language, called xWCPS (XQuery compliant WCPS). xWCPS aims towards merging the path that the two widely adopted standards (W3C XQuery, OGC WCPS) have paved, into a
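
    For readers unfamiliar with WCPS, the following Python sketch shows how a coverage-processing query might be issued to a WCS/WCPS endpoint over HTTP; xWCPS layers XQuery-style metadata search on top of such requests. The endpoint URL, coverage name and key-value parameter names are assumptions for illustration, not details of the EarthServer services.

```python
# Hedged sketch of a WCPS "ProcessCoverages" request over HTTP. The endpoint,
# coverage name and key-value parameter names are illustrative assumptions.
from urllib.parse import urlencode
from urllib.request import urlopen

ENDPOINT = "https://example.org/rasdaman/ows"  # hypothetical WCS/WCPS service endpoint
wcps_query = 'for c in (MeanTemperature) return encode(avg(c), "csv")'  # hypothetical coverage

params = urlencode({
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": wcps_query,
})

# xWCPS, as described above, would additionally allow XQuery-style metadata search.
with urlopen(f"{ENDPOINT}?{params}") as resp:
    print(resp.read().decode())
```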

  20. Survey cover pages: to take or not to take.

    PubMed

    Sansone, Randy A; Lam, Charlene; Wiederman, Michael W

    2010-01-01

    In survey research, the elements of informed consent, including contact information for the researchers and the Institutional Review Board, may be located on a cover page, which participants are advised they may take. To date, we are not aware of any studies examining the percentage of research participants who actually take these cover pages, which was the purpose of this study. Among a consecutive sample of 419 patients in an internal medicine setting, 16% removed the cover page. There were no demographic predictors regarding who took versus did not take the cover page.

  1. Client/server approach to image capturing

    NASA Astrophysics Data System (ADS)

    Tuijn, Chris; Stokes, Earle

    1998-01-01

    The diversity of the digital image capturing devices on the market today is quite astonishing and ranges from low-cost CCD scanners to digital cameras (for both action and stand-still scenes), mid-end CCD scanners for desktop publishing and pre-press applications and high-end CCD flatbed scanners and drum scanners with photo multiplier technology. Each device and market segment has its own specific needs which explains the diversity of the associated scanner applications. What all those applications have in common is the need to communicate with a particular device to import the digital images; after the import, additional image processing might be needed as well as color management operations. Although the specific requirements for all of these applications might differ considerably, a number of image capturing and color management facilities as well as other services are needed which can be shared. In this paper, we propose a client/server architecture for scanning and image editing applications which can be used as a common component for all these applications. One of the principal components of the scan server is the input capturing module. The specification of the input jobs is based on a generic input device model. Through this model we make abstraction of the specific scanner parameters and define the scan job definitions by a number of absolute parameters. As a result, scan job definitions will be less dependent on a particular scanner and have a more universal meaning. In this context, we also elaborate on the interaction of the generic parameters and the color characterization (i.e., the ICC profile). Other topics that are covered are the scheduling and parallel processing capabilities of the server, the image processing facilities, the interaction with the ICC engine, the communication facilities (both in-memory and over the network) and the different client architectures (stand-alone applications, TWAIN servers, plug-ins, OLE or Apple-event driven

  2. Nuclear proteasomes carry a constitutive posttranslational modification which derails SDS-PAGE (but not CTAB-PAGE).

    PubMed

    Pitcher, David S; de Mattos-Shipley, Kate; Wang, Ziming; Tzortzis, Konstantinos; Goudevenou, Katerina; Flynn, Helen; Bohn, Georg; Rahemtulla, Amin; Roberts, Irene; Snijders, Ambrosius P; Karadimitris, Anastasios; Kleijnen, Maurits F

    2014-12-01

    We report that subunits of human nuclear proteasomes carry a previously unrecognised, constitutive posttranslational modification. Subunits with this modification are not visualised by SDS-PAGE, which is used in almost all denaturing protein gel electrophoresis. In contrast, CTAB-PAGE readily visualises such modified subunits. Thus, under most experimental conditions, with identical samples, SDS-PAGE yielded gel electrophoresis patterns for subunits of nuclear proteasomes which were misleading and strikingly different from those obtained with CTAB-PAGE. Initial analysis indicates a novel modification of a high negative charge with some similarity to polyADP-ribose, possibly explaining compatibility with (positively-charged) CTAB-PAGE but not (negatively-charged) SDS-PAGE and providing a mechanism for how nuclear proteasomes may interact with chromatin, DNA and other nuclear components.

  3. SurveyWiz and factorWiz: JavaScript Web pages that make HTML forms for research on the Internet.

    PubMed

    Birnbaum, M H

    2000-05-01

    SurveyWiz and factorWiz are Web pages that act as wizards to create HTML forms that enable one to collect data via the Web. SurveyWiz allows the user to enter survey questions or personality test items with a mixture of text boxes and scales of radio buttons. One can add demographic questions of age, sex, education, and nationality with the push of a button. FactorWiz creates the HTML for within-subjects, two-factor designs as large as 9 x 9, or higher order factorial designs up to 81 cells. The user enters levels of the row and column factors, which can be text, images, or other multimedia. FactorWiz generates the stimulus combinations, randomizes their order, and creates the page. In both programs HTML is displayed in a window, and the user copies it to a text editor to save it. When uploaded to a Web server and supported by a CGI script, the created Web pages allow data to be collected, coded, and saved on the server. These programs are intended to assist researchers and students in quickly creating studies that can be administered via the Web.
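
    SurveyWiz itself is JavaScript running in the browser; purely to illustrate the kind of HTML it emits, the Python sketch below generates a form of radio-button rating scales that posts to a hypothetical CGI script which would store responses on the server.

```python
# Not SurveyWiz itself (which is JavaScript): a Python sketch emitting the same
# kind of HTML, one radio-button scale per item, posting to a hypothetical CGI script.
def rating_item(name, question, scale=5):
    buttons = "".join(
        f'<label><input type="radio" name="{name}" value="{v}"> {v}</label>'
        for v in range(1, scale + 1)
    )
    return f"<p>{question}<br>{buttons}</p>"

items = [("q1", "The web page was easy to use."),
         ("q2", "I would recommend it to others.")]

form = ('<form method="post" action="/cgi-bin/save_survey.cgi">\n'  # hypothetical script
        + "\n".join(rating_item(name, question) for name, question in items)
        + '\n<input type="submit" value="Submit"></form>')
print(form)
```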

  4. PSSweb: protein structural statistics web server.

    PubMed

    Gaillard, Thomas; Stote, Roland H; Dejaegere, Annick

    2016-07-08

    With the increasing number of protein structures available, there is a need for tools capable of automating the comparison of ensembles of structures, a common requirement in structural biology and bioinformatics. PSSweb is a web server for protein structural statistics. It takes as input an ensemble of PDB files of protein structures, performs a multiple sequence alignment and computes structural statistics for each position of the alignment. Different optional functionalities are proposed: structure superposition, Cartesian coordinate statistics, dihedral angle calculation and statistics, and a cluster analysis based on dihedral angles. An interactive report is generated, containing a summary of the results, tables, figures and 3D visualization of superposed structures. The server is available at http://pssweb.org.

  5. PSSweb: protein structural statistics web server

    PubMed Central

    Gaillard, Thomas; Stote, Roland H.; Dejaegere, Annick

    2016-01-01

    With the increasing number of protein structures available, there is a need for tools capable of automating the comparison of ensembles of structures, a common requirement in structural biology and bioinformatics. PSSweb is a web server for protein structural statistics. It takes as input an ensemble of PDB files of protein structures, performs a multiple sequence alignment and computes structural statistics for each position of the alignment. Different optional functionalities are proposed: structure superposition, Cartesian coordinate statistics, dihedral angle calculation and statistics, and a cluster analysis based on dihedral angles. An interactive report is generated, containing a summary of the results, tables, figures and 3D visualization of superposed structures. The server is available at http://pssweb.org. PMID:27174930

  6. Energy Servers Deliver Clean, Affordable Power

    NASA Technical Reports Server (NTRS)

    2010-01-01

    K.R. Sridhar developed a fuel cell device for Ames Research Center that could use solar power to split water into oxygen for breathing and hydrogen for fuel on Mars. Sridhar saw the potential of the technology, when reversed, to create clean energy on Earth. He founded Bloom Energy, of Sunnyvale, California, to advance the technology. Today, the Bloom Energy Server is providing cost-effective, environmentally friendly energy to a host of companies such as eBay, Google, and The Coca-Cola Company. Bloom's NASA-derived Energy Servers generate energy that is about 67-percent cleaner than a typical coal-fired power plant when using fossil fuels and 100-percent cleaner with renewable fuels.

  7. The widest practicable dissemination: The NASA technical report server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Binkley, Robert L.; Kellogg, Yvonne D.; Paulson, Sharon S.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael J.; Accomazzi, Alberto

    1995-01-01

    The National Aeronautics and Space Act of 1958 established NASA and charged it to 'provide for the widest practicable and appropriate dissemination of information concerning...its activities and the results thereof.' The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. The NTRS comprises several units, some constructed especially for inclusion in NTRS, and others that are existing NASA publication services that NTRS reuses. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the services over the initial six-month period. The NTRS is largely constructed with freely available software running on existing hardware. NTRS builds upon existing hardware and software, and the resulting additional exposure for the body of literature contained will allow NASA to ensure that its institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.

  8. The Widest Practicable Dissemination: The NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Binkley, Robert L.; Kellogg, Yvonne D.; Paulson, Sharon S.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael J.; Accomazzi, Alberto

    1995-01-01

    The National Aeronautics and Space Act of 1958 established NASA and charged it to "provide for the widest practicable and appropriate dissemination of information concerning [...] its activities and the results thereof." The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. The NTRS comprises several units, some constructed especially for inclusion in NTRS, and others that are existing NASA publication services that NTRS reuses. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the services over the initial 6-month period. The NTRS is largely constructed with freely available software running on existing hardware. NTRS builds upon existing hardware and software, and the resulting additional exposure for the body of literature contained will allow NASA to ensure that its institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.

  9. Engineering Proteins for Thermostability with iRDP Web Server

    PubMed Central

    Ghanate, Avinash; Ramasamy, Sureshkumar; Suresh, C. G.

    2015-01-01

    Engineering protein molecules with desired structure and biological functions has been an elusive goal. Development of industrially viable proteins with improved properties such as stability, catalytic activity and altered specificity by modifying the structure of an existing protein has widely been targeted through rational protein engineering. Although a range of factors contributing to thermal stability have been identified and widely researched, the in silico implementation of these as strategies directed towards enhancement of protein stability has not yet been explored extensively. A wide range of structural analysis tools is currently available for in silico protein engineering. However these tools concentrate on only a limited number of factors or individual protein structures, resulting in cumbersome and time-consuming analysis. The iRDP web server presented here provides a unified platform comprising of iCAPS, iStability and iMutants modules. Each module addresses different facets of effective rational engineering of proteins aiming towards enhanced stability. While iCAPS aids in selection of target protein based on factors contributing to structural stability, iStability uniquely offers in silico implementation of known thermostabilization strategies in proteins for identification and stability prediction of potential stabilizing mutation sites. iMutants aims to assess mutants based on changes in local interaction network and degree of residue conservation at the mutation sites. Each module was validated using an extensively diverse dataset. The server is freely accessible at http://irdp.ncl.res.in and has no login requirements. PMID:26436543

  10. Engineering Proteins for Thermostability with iRDP Web Server.

    PubMed

    Panigrahi, Priyabrata; Sule, Manas; Ghanate, Avinash; Ramasamy, Sureshkumar; Suresh, C G

    2015-01-01

    Engineering protein molecules with desired structure and biological functions has been an elusive goal. Development of industrially viable proteins with improved properties such as stability, catalytic activity and altered specificity by modifying the structure of an existing protein has widely been targeted through rational protein engineering. Although a range of factors contributing to thermal stability have been identified and widely researched, the in silico implementation of these as strategies directed towards enhancement of protein stability has not yet been explored extensively. A wide range of structural analysis tools is currently available for in silico protein engineering. However these tools concentrate on only a limited number of factors or individual protein structures, resulting in cumbersome and time-consuming analysis. The iRDP web server presented here provides a unified platform comprising of iCAPS, iStability and iMutants modules. Each module addresses different facets of effective rational engineering of proteins aiming towards enhanced stability. While iCAPS aids in selection of target protein based on factors contributing to structural stability, iStability uniquely offers in silico implementation of known thermostabilization strategies in proteins for identification and stability prediction of potential stabilizing mutation sites. iMutants aims to assess mutants based on changes in local interaction network and degree of residue conservation at the mutation sites. Each module was validated using an extensively diverse dataset. The server is freely accessible at http://irdp.ncl.res.in and has no login requirements.

  11. SPEER-SERVER: a web server for prediction of protein specificity determining sites

    PubMed Central

    Chakraborty, Abhijit; Mandloi, Sapan; Lanczycki, Christopher J.; Panchenko, Anna R.; Chakrabarti, Saikat

    2012-01-01

    Sites that show specific conservation patterns within subsets of proteins in a protein family are likely to be involved in the development of functional specificity. These sites, generally termed specificity determining sites (SDS), might play a crucial role in binding to a specific substrate or proteins. Identification of SDS through experimental techniques is a slow, difficult and tedious job. Hence, it is very important to develop efficient computational methods that can more expediently identify SDS. Herein, we present Specificity prediction using amino acids’ Properties, Entropy and Evolution Rate (SPEER)-SERVER, a web server that predicts SDS by analyzing quantitative measures of the conservation patterns of protein sites based on their physico-chemical properties and the heterogeneity of evolutionary changes between and within the protein subfamilies. This web server provides an improved representation of results, adds useful input and output options and integrates a wide range of analysis and data visualization tools when compared with the original standalone version of the SPEER algorithm. Extensive benchmarking finds that SPEER-SERVER exhibits sensitivity and precision performance that, on average, meets or exceeds that of other currently available methods. SPEER-SERVER is available at http://www.hpppi.iicb.res.in/ss/. PMID:22689646
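
    The toy Python example below is not the SPEER algorithm (which combines physico-chemical properties, entropy and evolutionary rates); it only illustrates the underlying intuition that a specificity determining site is conserved within each subfamily yet differs between subfamilies, using column entropies on a made-up alignment.

```python
# Not the SPEER algorithm: a toy illustration of the intuition that a column
# conserved within each subfamily but different between them is a candidate SDS.
import math
from collections import Counter

def column_entropy(column):
    counts = Counter(column)
    total = len(column)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def sds_score(subfamily_a, subfamily_b, pos):
    col_a = [seq[pos] for seq in subfamily_a]
    col_b = [seq[pos] for seq in subfamily_b]
    within = column_entropy(col_a) + column_entropy(col_b)   # low if each subfamily is conserved
    between = column_entropy(col_a + col_b)                  # high if the subfamilies differ
    return between - within

fam_a = ["ACDK", "ACDK", "ACEK"]   # made-up aligned sequences
fam_b = ["ACHK", "ACHK", "ACHK"]
for pos in range(4):
    print(pos, round(sds_score(fam_a, fam_b, pos), 2))   # position 2 scores highest
```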

  12. JCID Compliant Thin Server for Sensors

    DTIC Science & Technology

    2008-10-01

    The JCID SST Insertion Project (JSP) is a team effort between a group at Penn State’s Applied Research Lab and Lattice/RTI of Herndon, VA, to integrate a flexible table-based sensor configuration capability into the JCID/JWARN...

  13. Energy Efficiency in Small Server Rooms: Field Surveys and Findings

    SciTech Connect

    Cheung, Iris; Greenberg, Steve; Mahdavi, Roozbeh; Brown, Richard; Tschudi, William

    2014-08-11

    Fifty-seven percent of US servers are housed in server closets, server rooms, and localized data centers, in what are commonly referred to as small server rooms, which comprise 99 percent of all server spaces in the US. While many mid-tier and enterprise-class data centers are owned by large corporations that consider energy efficiency a goal to minimize business operating costs, the operators of small server rooms typically are not similarly motivated. These rooms are characterized by decentralized ownership and management and come in many configurations, which creates a unique set of efficiency challenges. To develop energy efficiency strategies for these spaces, we surveyed 30 small server rooms across eight institutions, and selected four of them for detailed assessments. The four rooms had Power Usage Effectiveness (PUE) values ranging from 1.5 to 2.1. Energy saving opportunities ranged from no- to low-cost measures such as raising cooling set points and better airflow management, to more involved but cost-effective measures including server consolidation and virtualization, and dedicated cooling with economizers. We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers, rather than server rooms. Backup power requirements and IT and cooling efficiency should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge on server operation, and the implementation of energy efficiency measures in small server rooms.
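
    For reference, Power Usage Effectiveness is the ratio of total facility energy to IT equipment energy, so the PUE values of 1.5 to 2.1 reported above mean that each kilowatt of IT load carried roughly 0.5 to 1.1 kilowatts of cooling and distribution overhead. A small illustration in Python:

```python
# PUE = total facility energy / IT equipment energy.
def pue(total_kw, it_kw):
    return total_kw / it_kw

def overhead_kw(it_kw, pue_value):
    return it_kw * (pue_value - 1)

print(pue(21, 10))           # a room drawing 21 kW in total for 10 kW of IT load -> PUE 2.1
print(overhead_kw(10, 1.5))  # at PUE 1.5, 10 kW of IT load implies 5 kW of overhead
```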

  14. Professional dental services: the yellow pages advertising decision.

    PubMed

    Sanchez, P M

    1998-01-01

    Yellow pages advertising decisions are among the most important marketing decisions made by dental services professionals. Yet, little empirical evidence is available to guide these decisions. Through a literature review, the purpose of this article is to synthesize available knowledge in this area and provide guidelines for more effective yellow pages advertising.

  15. Toward a User-Centered Academic Library Home Page

    ERIC Educational Resources Information Center

    McHale, Nina

    2008-01-01

    In the past decade, academic libraries have struggled with the design of an effective library home page. Since librarians' mental models of information architecture differ from those of their patrons, usability assessments are necessary in designing a user-centered home page. This study details a usability sequence of card sort and paper and…

  16. JavaScript: Convenient Interactivity for the Class Web Page.

    ERIC Educational Resources Information Center

    Gray, Patricia

    This paper shows how JavaScript can be used within HTML pages to add interactive review sessions and quizzes incorporating graphics and sound files. JavaScript has the advantage of providing basic interactive functions without the use of separate software applications and players. Because it can be part of a standard HTML page, it is…

  17. An Analysis of Academic Library Web Pages for Faculty

    ERIC Educational Resources Information Center

    Gardner, Susan J.; Juricek, John Eric; Xu, F. Grace

    2008-01-01

    Web sites are increasingly used by academic libraries to promote key services and collections to teaching faculty. This study analyzes the content, location, language, and technological features of fifty-four academic library Web pages designed especially for faculty to expose patterns in the development of these pages.

  18. Evaluating Information Quality: Hidden Biases on the Children's Web Pages

    ERIC Educational Resources Information Center

    Kurubacak, Gulsun

    2006-01-01

    As global digital communication continues to flourish, the Children's Web pages become more critical for children to realize not only the surface but also breadth and deeper meanings in presenting these milieus. These pages not only are very diverse and complex but also enable intense communication across social, cultural and political…

  19. A one-page orofacial myofunctional assessment form: a proposal.

    PubMed

    Paskay, Licia Coceani

    2012-11-01

    The author presents her own proposal of a one-page orofacial myofunctional assessment and for each item on the list a brief rationale is provided. The protocol is an easy but comprehensive form that can be faxed or emailed to referral sources as needed. As science provides more objective assessment and evaluation tools, this one-page form can be easily modified.

  20. Paging and Scrolling: Cognitive Styles in Learning from Hypermedia

    ERIC Educational Resources Information Center

    Eyuboglu, Filiz; Orhan, Feza

    2011-01-01

    This study investigates the navigational patterns and learning achievement of university students with different cognitive styles, on hypermedia learning environments using paging or scrolling. The global-local subscales of Sternberg's Thinking Styles Inventory, two hypermedia, one using paging, the other using scrolling, a multiple choice…

  1. The Library as Information Provider: The Home Page.

    ERIC Educational Resources Information Center

    Clyde, Laurel A.

    1996-01-01

    Discusses ways in which libraries are using the World Wide Web to provide information via a home page, based on information from a survey in Iceland as well as a larger study that conducted content analyses of home pages of public and school libraries in 13 countries. (Author/LRW)

  2. World Wide Web Page Design: A Structured Approach.

    ERIC Educational Resources Information Center

    Gregory, Gwen; Brown, M. Marlo

    1997-01-01

    Describes how to develop a World Wide Web site based on structured programming concepts. Highlights include flowcharting, first page design, evaluation, page titles, documenting source code, text, graphics, and browsers. Includes a template for HTML writers, tips for using graphics, a sample homepage, guidelines for authoring structured HTML, and…

  3. CrazyEgg Reports for Single Page Analysis

    EPA Pesticide Factsheets

    CrazyEgg provides an in depth look at visitor behavior on one page. While you can use GA to do trend analysis of your web area, CrazyEgg helps diagnose the design of a single Web page by visually displaying all visitor clicks during a specified time.

  4. PageRank model of opinion formation on social networks

    NASA Astrophysics Data System (ADS)

    Kandiah, Vivek; Shepelyansky, Dima L.

    2012-11-01

    We propose the PageRank model of opinion formation and investigate its rich properties on real directed networks of the Universities of Cambridge and Oxford, LiveJournal, and Twitter. In this model, the opinion formation of linked electors is weighted with their PageRank probability. Such a probability is used by the Google search engine for ranking of web pages. We find that the society elite, corresponding to the top PageRank nodes, can impose its opinion on a significant fraction of the society. However, for a homogeneous distribution of two opinions, there exists a bistability range of opinions which depends on a conformist parameter characterizing the opinion formation. We find that the LiveJournal and Twitter networks have a stronger tendency to a totalitarian opinion formation than the university networks. We also analyze the Sznajd model generalized for scale-free networks with the weighted PageRank vote of electors.
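
    The sketch below is a much simplified rendering of the idea rather than the authors' model: PageRank is computed by power iteration on a toy directed graph and each node's opinion is then weighted by its PageRank, so high-rank (elite) nodes dominate the aggregate vote. The conformist parameter and the bistability analysis of the paper are omitted.

```python
# Simplified sketch of PageRank-weighted opinion aggregation (not the authors'
# full model, which includes a conformist parameter and bistability analysis).
def pagerank(links, d=0.85, iters=100):
    nodes = list(links)
    pr = {n: 1 / len(nodes) for n in nodes}
    for _ in range(iters):
        pr = {
            n: (1 - d) / len(nodes)
               + d * sum(pr[m] / len(links[m]) for m in nodes if n in links[m])
            for n in nodes
        }
    return pr

# toy directed network: node -> set of nodes it links to
links = {"a": {"b"}, "b": {"c"}, "c": {"a", "b"}, "d": {"a"}}
opinion = {"a": +1, "b": -1, "c": +1, "d": -1}

pr = pagerank(links)
weighted_vote = sum(pr[n] * opinion[n] for n in links)   # high-PageRank nodes dominate
print(pr, weighted_vote)
```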

  5. College of DuPage Information Technology Plan, Fiscal Year 1994-95.

    ERIC Educational Resources Information Center

    College of DuPage, Glen Ellyn, IL.

    Building upon four previous planning documents for computing at College of DuPage in Illinois, this plan for fiscal year 1995 (FY95) provides a starting point for future plans to address all activities that relate to the use of information technology on campus. The FY95 "Information Technology Plan" is divided into six sections, each…

  6. Triple-server blind quantum computation using entanglement swapping

    NASA Astrophysics Data System (ADS)

    Li, Qin; Chan, Wai Hong; Wu, Chunhui; Wen, Zhonghua

    2014-04-01

    Blind quantum computation allows a client who does not have enough quantum resources or technologies to achieve quantum computation on a remote quantum server such that the client's input, output, and algorithm remain unknown to the server. Up to now, single- and double-server blind quantum computation have been considered. In this work, we propose a triple-server blind computation protocol where the client can delegate quantum computation to three quantum servers by the use of entanglement swapping. Furthermore, the three quantum servers can communicate with each other, and the client is almost classical, since it requires no quantum computational power, no quantum memory, and no ability to prepare quantum states, and only needs to be capable of getting access to quantum channels.

  7. Design of a distributed CORBA based image processing server.

    PubMed

    Giess, C; Evers, H; Heid, V; Meinzer, H P

    2000-01-01

    This paper presents the design and implementation of a distributed image processing server based on CORBA. Existing image processing tools were encapsulated in a common way with this server. Data exchange and conversion is done automatically inside the server, hiding these tasks from the user. The different image processing tools are visible as one large collection of algorithms and due to the use of CORBA are accessible via intra-/internet.

  8. Web Server. Security Technical Implementation Guide. Version 6, Release 1

    DTIC Science & Technology

    2006-12-11

    deployment, and operational maintenance of the web server lifecycle. Specific security configuration guidance for the Netscape/iPlanet/Sun JAVA...required security standards of this document. For example, a default installation setting for Netscape web servers is that automatic directory indexing...users with third-party confirmation of authenticity. Most web browsers perform server authentication automatically; the user is notified only if the

  9. SCRATCH: a protein structure and structural feature prediction server

    PubMed Central

    Cheng, J.; Randall, A. Z.; Sweredoski, M. J.; Baldi, P.

    2005-01-01

    SCRATCH is a server for predicting protein tertiary structure and structural features. The SCRATCH software suite includes predictors for secondary structure, relative solvent accessibility, disordered regions, domains, disulfide bridges, single mutation stability, residue contacts versus average, individual residue contacts and tertiary structure. The user simply provides an amino acid sequence and selects the desired predictions, then submits to the server. Results are emailed to the user. The server is available at . PMID:15980571

  10. Adapting the right web pages to the right users

    NASA Astrophysics Data System (ADS)

    Hui, Xiong; Sung, Sam Y.; Huang, Stephen

    2000-04-01

    With the explosive use of the Internet, there is an ever-increasing volume of Web usage data being generated and warehoused in numerous successful Web sites. Analyzing Web usage data can help Web developers to improve the organization and presentation of their Web sites. Considering the fact that mining for patterns and rules in market basket data is well studied in the data mining field, we provide a mapping approach, which can transform Web usage data into a form like market basket data. Using our model, all the methods developed by data mining research groups can be directly applied on Web usage data without much change. Existing methods for knowledge discovery in Web logs are restricted by the difficulty of getting complete and reliable Web usage data and effectively identifying user sessions using the current Web server log mechanism. The problem is due to Web caching and the existence of proxy servers. As an effort to remedy this problem, we built our own Web server log mechanism that can effectively capture user access behavior and will not be deliberately bypassed by proxy servers and end users.
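
    The mapping described above can be pictured in a few lines of Python: raw (user, timestamp, URL) log records are grouped into per-user sessions, and each session becomes a basket of pages ready for standard market-basket mining. The 30-minute session gap is an assumption for the example, not a value from the paper.

```python
# Sketch of the log-to-basket mapping: (user, timestamp, url) records are grouped
# into per-user sessions; each session becomes a "basket" of pages.
from collections import defaultdict

SESSION_GAP = 30 * 60   # assumption: a 30-minute silence starts a new session

def sessions_from_log(records):
    """records: iterable of (user, unix_timestamp, url) tuples, in any order."""
    per_user = defaultdict(list)
    for user, ts, url in sorted(records, key=lambda r: (r[0], r[1])):
        per_user[user].append((ts, url))
    baskets = []
    for visits in per_user.values():
        basket, last_ts = set(), None
        for ts, url in visits:
            if last_ts is not None and ts - last_ts > SESSION_GAP:
                baskets.append(basket)
                basket = set()
            basket.add(url)
            last_ts = ts
        baskets.append(basket)
    return baskets

log = [("u1", 0, "/home"), ("u1", 60, "/products"), ("u1", 4000, "/home"), ("u2", 10, "/faq")]
print(sessions_from_log(log))   # [{'/home', '/products'}, {'/home'}, {'/faq'}]
```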

  11. Add Java extensions to your wiki: Java applets can bring dynamic functionality to your wiki pages

    SciTech Connect

    Scarberry, Randall E.

    2008-08-12

    Virtually everyone familiar with today’s world wide web has encountered the free online encyclopedia Wikipedia many times. What you may not know is that Wikipedia is driven by an excellent open-source product called MediaWiki, which is available to anyone for free. This has led to a proliferation of wiki sites devoted to just about any topic one can imagine. Users of a wiki can add content -- all that is required of them is to type their additions into their web browsers using the simple markup language called wikitext. Even better, the developers of wikitext made it extensible. With a little server-side development of your own, you can add your own custom syntax. Users aware of your extensions can then utilize them on their wiki pages with a few simple keystrokes. These extensions can be custom decorations, formatting, web applications, and even instances of the venerable old Java applet. One example of a Java applet extension is the Jmol extension (REF), used to embed a 3-D molecular viewer. This article will walk you through the deployment of a fairly elaborate applet via a MediaWiki extension. By no means exhaustive -- an entire book would be required for that -- it will demonstrate how to give the applet resize handles using a little JavaScript and CSS coding and some popular JavaScript libraries. It even describes how a user may customize the extension somewhat using a wiki template. Finally, it explains a rudimentary persistence mechanism which allows applets to save data directly to the wiki pages on which they reside.

  12. The EarthServer project: Exploiting Identity Federations, Science Gateways and Social and Mobile Clients for Big Earth Data Analysis

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Messina, Antonio; Pappalardo, Marco; Passaro, Gianluca

    2013-04-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" - based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. Six Lighthouse Applications are being established in EarthServer, each of which poses distinct challenges on Earth Data Analytics: Cryospheric Science, Airborne Science, Atmospheric Science, Geology, Oceanography, and Planetary Science. Altogether, they cover all Earth Science domains; the Planetary Science use case has been added to challenge concepts and standards in non-standard environments. In addition, EarthLook (maintained by Jacobs University) showcases use of OGC standards in 1D through 5D use cases. In this contribution we will report on the first applications integrated in the EarthServer Science Gateway and on the clients for mobile appliances developed to access them. We will also show how federated and social identity services can allow Big Earth Data Providers to expose their data in a distributed environment keeping a strict and fine-grained control on user authentication and authorisation. The degree of fulfilment of the EarthServer implementation with the recommendations made in the recent TERENA Study on

  13. An Efficient Web Page Ranking for Semantic Web

    NASA Astrophysics Data System (ADS)

    Chahal, P.; Singh, M.; Kumar, S.

    2014-01-01

    With the enormous amount of information presented on the web, the retrieval of relevant information has become a serious problem and has also been a topic of research for the last few years. The most common tools to retrieve information from the web are search engines like Google. Search engines are usually based on keyword searching and indexing of web pages. This approach is not very efficient, as the result-set of web pages obtained includes many irrelevant pages. Sometimes even the entire result-set may contain a lot of irrelevant pages for the user. The next generation of search engines must address this problem. Recently, many semantic web search engines have been developed, like Ontolook and Swoogle, which help in searching meaningful documents presented on the semantic web. In this process the ranking of the retrieved web pages is very crucial. Some attempts have been made at ranking semantic web pages, but the ranking of these semantic web documents is still neither satisfactory nor up to users' expectations. In this paper we have proposed a semantic web based document ranking scheme that relies not only on the keywords but also on the conceptual instances present between the keywords. As a result only the relevant pages will be at the top of the result-set of searched web pages. We explore all relevant relations between the keywords, capturing the user's intention, and then calculate the fraction of these relations on each web page to determine their relevance. We have found that this ranking technique gives better results than those by the prevailing methods.
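
    As a toy rendering of the scoring idea (not the authors' exact formula), the Python snippet below models relations as keyword triples and ranks pages by the fraction of the query's relations they actually contain.

```python
# Toy version of the scoring idea (not the authors' exact formula): relevance is
# the fraction of the query's keyword relations that a page contains.
def relevance(page_relations, query_relations):
    if not query_relations:
        return 0.0
    return len(page_relations & query_relations) / len(query_relations)

# relations modelled as (keyword, predicate, keyword) triples
query = {("jaguar", "isA", "car"), ("jaguar", "madeBy", "manufacturer")}
pages = {
    "page1": {("jaguar", "isA", "car"), ("jaguar", "madeBy", "manufacturer")},
    "page2": {("jaguar", "isA", "animal")},
}
ranked = sorted(pages, key=lambda p: relevance(pages[p], query), reverse=True)
print(ranked)   # ['page1', 'page2']
```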

  14. Exploring the use of a Facebook page in anatomy education.

    PubMed

    Jaffar, Akram Abood

    2014-01-01

    Facebook is the most popular social media site visited by university students on a daily basis. Consequently, Facebook is the logical place to start with for integrating social media technologies into education. This study explores how a faculty-administered Facebook Page can be used to supplement anatomy education beyond the traditional classroom. Observations were made on students' perceptions and effectiveness of using the Page, potential benefits and challenges of such use, and which Insights metrics best reflect user's engagement. The Human Anatomy Education Page was launched on Facebook and incorporated into anatomy resources for 157 medical students during two academic years. Students' use of Facebook and their perceptions of the Page were surveyed. Facebook's "Insights" tool was also used to evaluate Page performance during a period of 600 days. The majority of in-class students had a Facebook account which they adopted in education. Most students perceived Human Anatomy Education Page as effective in contributing to learning and favored "self-assessment" posts. The majority of students agreed that Facebook could be a suitable learning environment. The "Insights" tool revealed globally distributed fans with considerable Page interactions. The use of a faculty-administered Facebook Page provided a venue to enhance classroom teaching without intruding into students' social life. A wider educational use of Facebook should be adopted not only because students are embracing its use, but for its inherent potentials in boosting learning. The "Insights" metrics analyzed in this study might be helpful when establishing and evaluating the performance of education-oriented Facebook Pages.

  15. Visualizing Worlds from Words on a Page

    ERIC Educational Resources Information Center

    Parsons, Linda T.

    2006-01-01

    This study involved fourth grade children as co-researchers of their engaged, aesthetic reading experience. As members of the "Readers as Researchers Club," they documented their engagement with text--how they create, enter, and sustain the story world. The children, who self-identified as avid readers, explored the activities central to their…

  16. DC3 Data and Information Page

    Atmospheric Science Data Center

    2015-03-16

    ... species, and d) chemistry in the anvil. To quantify the changes in chemistry and composition after active convection, focusing on a) ... hours after convection and b) the seasonal transition of the chemical composition of the upper troposphere.   The DC3 aircraft ...

  17. The Page kidney phenomenon secondary to a traumatic fall.

    PubMed

    Babel, Nitin; Sakpal, Sujit Vijay; Chamberlain, Ronald Scott

    2010-02-01

    Page kidney is a rare phenomenon of hyperreninemic hypertension caused by compression of the renal parenchyma. It has been reported in healthy individuals after blunt abdominal or flank trauma, and in patients after invasive nephrological interventions. We present a case of acute-on-chronic renal failure and the Page kidney phenomenon in an elderly male after a traumatic fall; he underwent effective medical management until spontaneous recovery to baseline was observed. A brief discussion of the Page kidney phenomenon is provided, with a suggested algorithmic approach towards the management of this process.

  18. Home Page: The Mode of Transport through the Information Superhighway

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1995-01-01

    The purpose of the project with the Aeroacoustics Branch was to create and submit a home page presenting branch information on the Internet. In order to do this, one must also become familiar with the way that the Internet operates. Learning HyperText Markup Language (HTML) and gaining the ability to create a document using this language were the final objectives required to place a home page on the Internet (World Wide Web). A manual of instructions regarding maintenance of the home page, and how to keep it up to date, was also necessary in order to provide branch members with the opportunity to make any pertinent changes.

  19. Optimizing TLB entries for mixed page size storage in contiguous memory

    DOEpatents

    Chen, Dong; Gara, Alan; Giampapa, Mark E.; Heidelberger, Philip; Kriegel, Jon K.; Ohmacht, Martin; Steinmacher-Burow, Burkhard

    2013-04-30

    A system and method for accessing memory are provided. The system comprises a lookup buffer for storing one or more page table entries, wherein each of the one or more page table entries comprises at least a virtual page number and a physical page number; a logic circuit for receiving a virtual address from said processor, said logic circuit for matching the virtual address to the virtual page number in one of the page table entries to select the physical page number in the same page table entry, said page table entry having one or more bits set to exclude a memory range from a page.
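
    A schematic software model of this lookup is sketched below in Python; it shows only the generic mechanics of matching a virtual page number against buffered entries of mixed page sizes and composing the physical address, and omits the patented hardware details such as the bits that exclude a memory range from a page.

```python
# Schematic software model of the lookup (hardware details of the patent omitted).
class TLBEntry:
    def __init__(self, vpn, ppn, page_size):
        self.vpn, self.ppn, self.page_size = vpn, ppn, page_size   # mixed page sizes allowed

def translate(tlb, vaddr):
    for e in tlb:
        offset_bits = e.page_size.bit_length() - 1      # e.g. a 4 KiB page has a 12-bit offset
        if vaddr >> offset_bits == e.vpn:                # match the virtual page number
            return (e.ppn << offset_bits) | (vaddr & (e.page_size - 1))
    raise LookupError("TLB miss: walk the page table and refill an entry")

tlb = [TLBEntry(vpn=0x2, ppn=0x7, page_size=4096),         # 4 KiB page
       TLBEntry(vpn=0x1, ppn=0x3, page_size=2 * 1024**2)]  # 2 MiB page in the same buffer
print(hex(translate(tlb, 0x2ABC)))   # 0x7abc
```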

  20. A Web Terminology Server Using UMLS for the Description of Medical Procedures

    PubMed Central

    Burgun, Anita; Denier, Patrick; Bodenreider, Olivier; Botti, Geneviève; Delamarre, Denis; Pouliquen, Bruno; Oberlin, Philippe; Lévéque, Jean M.; Lukacs, Bertrand; Kohler, François; Fieschi, Marius; Le Beux, Pierre

    1997-01-01

    The Model for Assistance in the Orientation of a User within Coding Systems (MAOUSSC) project has been designed to provide a representation for medical and surgical procedures that allows several applications to be developed from several viewpoints. It is based on a conceptual model, a controlled set of terms, and Web server development. The design includes the UMLS knowledge sources associated with additional knowledge about medico-surgical procedures. The model was implemented using a relational database. The authors developed a complete interface for the Web presentation, with the intermediary layer being written in Perl. The server has been used for the representation of medico-surgical procedures that occur in the discharge summaries of the national survey of hospital activities that is performed by the French Health Statistics Agency in order to produce inpatient profiles. The authors describe the current status of the MAOUSSC server and discuss their interest in using such a server to assist in the coordination of terminology tasks and in the sharing of controlled terminologies. PMID:9292841

  1. A FPGA Embedded Web Server for Remote Monitoring and Control of Smart Sensors Networks

    PubMed Central

    Magdaleno, Eduardo; Rodríguez, Manuel; Pérez, Fernando; Hernández, David; García, Enrique

    2014-01-01

    This article describes the implementation of a web server using an embedded Altera NIOS II IP core, a general-purpose and configurable RISC processor which is embedded in a Cyclone FPGA. The processor uses the μCLinux operating system to support a Boa web server of dynamic pages using the Common Gateway Interface (CGI). The FPGA is configured to act as the master node of a network, and also to control and monitor a network of smart sensors or instruments. In order to develop a totally functional system, the FPGA also includes an implementation of the time-triggered protocol (TTP/A). Thus, the implemented master node has two interfaces: the web server, which acts as the Internet interface, and another to control the network. This protocol is widely used for connecting smart sensors, actuators and microsystems in embedded real-time systems in different application domains, e.g., industrial, automotive, domotic, etc., although this protocol can be easily replaced by any other because of the inherent characteristics of the FPGA-based technology. PMID:24379047
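
    The article does not state the language of the CGI programs served by Boa (on a μCLinux target they would most likely be written in C); purely as an illustration of a dynamic page produced through CGI, the Python script below reads a stand-in sensor value and returns it as HTML.

```python
#!/usr/bin/env python3
# Illustrative CGI script (implementation language on the embedded target is not
# stated in the article). The web server runs the script per request; it reads a
# stand-in sensor value and returns a dynamically generated HTML page.
import os
from urllib.parse import parse_qs

def read_sensor(node_id):
    # Stand-in for a query to a smart-sensor node over the TTP/A fieldbus.
    return 23.5

params = parse_qs(os.environ.get("QUERY_STRING", ""))
node = params.get("node", ["0"])[0]

print("Content-Type: text/html")
print()
print(f"<html><body><h1>Node {node}</h1>"
      f"<p>Last reading: {read_sensor(node)} &deg;C</p></body></html>")
```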

  2. NMR Constraints Analyser: a web-server for the graphical analysis of NMR experimental constraints

    PubMed Central

    Heller, Davide Martin; Giorgetti, Alejandro

    2010-01-01

    Nuclear magnetic resonance (NMR) spectroscopy and X-ray crystallography are the main techniques used for the determination of high-resolution 3D structures of biological molecules. The output of an NMR experiment includes a set of lower and upper limits for the distances (constraints) between pairs of atoms. If the number of constraints is high enough, there will be a finite number of possible conformations (models) of the macromolecule satisfying the data. Thus, the more constraints are measured, the better defined these structures will be. The availability of a user-friendly tool able to help in the analysis and interpretation of the number of experimental constraints per residue is thus of great value when assessing the levels of structure definition of NMR-solved biological macromolecules, in particular when high-quality structures are needed in techniques such as computational biology approaches, site-directed mutagenesis experiments and/or drug design. Here, we present a free, publicly available web-server, NMR Constraints Analyser, which is aimed at providing an automatic graphical analysis of the NMR experimental constraints atom by atom. The NMR Constraints Analyser server is available from the web-page http://molsim.sci.univr.it/constraint PMID:20513646

  3. NMR Constraints Analyser: a web-server for the graphical analysis of NMR experimental constraints.

    PubMed

    Heller, Davide Martin; Giorgetti, Alejandro

    2010-07-01

    Nuclear magnetic resonance (NMR) spectroscopy and X-ray crystallography are the main techniques used for the determination of high-resolution 3D structures of biological molecules. The output of an NMR experiment includes a set of lower and upper limits for the distances (constraints) between pairs of atoms. If the number of constraints is high enough, there will be a finite number of possible conformations (models) of the macromolecule satisfying the data. Thus, the more constraints are measured, the better defined these structures will be. The availability of a user-friendly tool able to help in the analysis and interpretation of the number of experimental constraints per residue is thus of great value when assessing the levels of structure definition of NMR-solved biological macromolecules, in particular when high-quality structures are needed in techniques such as computational biology approaches, site-directed mutagenesis experiments and/or drug design. Here, we present a free, publicly available web-server, NMR Constraints Analyser, which is aimed at providing an automatic graphical analysis of the NMR experimental constraints atom by atom. The NMR Constraints Analyser server is available from the web-page http://molsim.sci.univr.it/constraint.

  4. R3D-2-MSA: the RNA 3D structure-to-multiple sequence alignment server

    PubMed Central

    Cannone, Jamie J.; Sweeney, Blake A.; Petrov, Anton I.; Gutell, Robin R.; Zirbel, Craig L.; Leontis, Neocles

    2015-01-01

    The RNA 3D Structure-to-Multiple Sequence Alignment Server (R3D-2-MSA) is a new web service that seamlessly links RNA three-dimensional (3D) structures to high-quality RNA multiple sequence alignments (MSAs) from diverse biological sources. In this first release, R3D-2-MSA provides manual and programmatic access to curated, representative ribosomal RNA sequence alignments from bacterial, archaeal, eukaryal and organellar ribosomes, using nucleotide numbers from representative atomic-resolution 3D structures. A web-based front end is available for manual entry and an Application Program Interface for programmatic access. Users can specify up to five ranges of nucleotides and 50 nucleotide positions per range. The R3D-2-MSA server maps these ranges to the appropriate columns of the corresponding MSA and returns the contents of the columns, either for display in a web browser or in JSON format for subsequent programmatic use. The browser output page provides a 3D interactive display of the query, a full list of sequence variants with taxonomic information and a statistical summary of distinct sequence variants found. The output can be filtered and sorted in the browser. Previous user queries can be viewed at any time by resubmitting the output URL, which encodes the search and re-generates the results. The service is freely available with no login requirement at http://rna.bgsu.edu/r3d-2-msa. PMID:26048960

  5. A FPGA embedded web server for remote monitoring and control of smart sensors networks.

    PubMed

    Magdaleno, Eduardo; Rodríguez, Manuel; Pérez, Fernando; Hernández, David; García, Enrique

    2013-12-27

    This article describes the implementation of a web server using an embedded Altera NIOS II IP core, a general-purpose and configurable RISC processor which is embedded in a Cyclone FPGA. The processor uses the μCLinux operating system to support a Boa web server of dynamic pages using the Common Gateway Interface (CGI). The FPGA is configured to act as the master node of a network, and also to control and monitor a network of smart sensors or instruments. In order to develop a totally functional system, the FPGA also includes an implementation of the time-triggered protocol (TTP/A). Thus, the implemented master node has two interfaces: the web server, which acts as the Internet interface, and another to control the network. This protocol is widely used for connecting smart sensors, actuators and microsystems in embedded real-time systems in different application domains, e.g., industrial, automotive, domotic, etc., although this protocol can be easily replaced by any other because of the inherent characteristics of the FPGA-based technology.

  6. FAF-Drugs3: a web server for compound property calculation and chemical library design.

    PubMed

    Lagorce, David; Sperandio, Olivier; Baell, Jonathan B; Miteva, Maria A; Villoutreix, Bruno O

    2015-07-01

    Drug attrition late in preclinical or clinical development is a serious economic problem in the field of drug discovery. These problems can be linked, in part, to the quality of the compound collections used during the hit generation stage and to the selection of compounds undergoing optimization. Here, we present FAF-Drugs3, a web server that can be used for drug discovery and chemical biology projects to help in preparing compound libraries and to assist decision-making during the hit selection/lead optimization phase. Since it was first described in 2006, FAF-Drugs has been significantly modified. The tool now applies an enhanced structure curation procedure, can filter or analyze molecules with user-defined or eight predefined physicochemical filters as well as with several simple ADMET (absorption, distribution, metabolism, excretion and toxicity) rules. In addition, compounds can be filtered using an updated list of 154 hand-curated structural alerts while Pan Assay Interference compounds (PAINS) and other, generally unwanted groups are also investigated. FAF-Drugs3 offers access to user-friendly html result pages and the possibility to download all computed data. The server requires as input an SDF file of the compounds; it is open to all users and can be accessed without registration at http://fafdrugs3.mti.univ-paris-diderot.fr.

  7. R3D-2-MSA: the RNA 3D structure-to-multiple sequence alignment server.

    PubMed

    Cannone, Jamie J; Sweeney, Blake A; Petrov, Anton I; Gutell, Robin R; Zirbel, Craig L; Leontis, Neocles

    2015-07-01

    The RNA 3D Structure-to-Multiple Sequence Alignment Server (R3D-2-MSA) is a new web service that seamlessly links RNA three-dimensional (3D) structures to high-quality RNA multiple sequence alignments (MSAs) from diverse biological sources. In this first release, R3D-2-MSA provides manual and programmatic access to curated, representative ribosomal RNA sequence alignments from bacterial, archaeal, eukaryal and organellar ribosomes, using nucleotide numbers from representative atomic-resolution 3D structures. A web-based front end is available for manual entry and an Application Program Interface for programmatic access. Users can specify up to five ranges of nucleotides and 50 nucleotide positions per range. The R3D-2-MSA server maps these ranges to the appropriate columns of the corresponding MSA and returns the contents of the columns, either for display in a web browser or in JSON format for subsequent programmatic use. The browser output page provides a 3D interactive display of the query, a full list of sequence variants with taxonomic information and a statistical summary of distinct sequence variants found. The output can be filtered and sorted in the browser. Previous user queries can be viewed at any time by resubmitting the output URL, which encodes the search and re-generates the results. The service is freely available with no login requirement at http://rna.bgsu.edu/r3d-2-msa.

  8. FAF-Drugs3: a web server for compound property calculation and chemical library design

    PubMed Central

    Lagorce, David; Sperandio, Olivier; Baell, Jonathan B.; Miteva, Maria A.; Villoutreix, Bruno O.

    2015-01-01

    Drug attrition late in preclinical or clinical development is a serious economic problem in the field of drug discovery. These problems can be linked, in part, to the quality of the compound collections used during the hit generation stage and to the selection of compounds undergoing optimization. Here, we present FAF-Drugs3, a web server that can be used for drug discovery and chemical biology projects to help in preparing compound libraries and to assist decision-making during the hit selection/lead optimization phase. Since it was first described in 2006, FAF-Drugs has been significantly modified. The tool now applies an enhanced structure curation procedure, can filter or analyze molecules with user-defined or eight predefined physicochemical filters as well as with several simple ADMET (absorption, distribution, metabolism, excretion and toxicity) rules. In addition, compounds can be filtered using an updated list of 154 hand-curated structural alerts while Pan Assay Interference compounds (PAINS) and other, generally unwanted groups are also investigated. FAF-Drugs3 offers access to user-friendly html result pages and the possibility to download all computed data. The server requires as input an SDF file of the compounds; it is open to all users and can be accessed without registration at http://fafdrugs3.mti.univ-paris-diderot.fr. PMID:25883137

  9. A client/server approach to telemedicine.

    PubMed Central

    Vaughan, B. J.; Torok, K. E.; Kelly, L. M.; Ewing, D. J.; Andrews, L. T.

    1995-01-01

    This paper describes the Medical College of Ohio's efforts in developing a client/server telemedicine system. Telemedicine vastly improves the ability of a medical center physician or specialist to interactively consult with a physician at a remote health care facility. The patient receives attention more quickly, he and his family do not need to travel long distances to obtain specialists' services, and the primary care physician can be involved in diagnosis and developing a treatment program [1, 2]. Telemedicine consultations are designed to improve access to health services in underserved urban and rural communities and reduce isolation of rural practitioners [3]. PMID:8563396

  10. Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer

    NASA Astrophysics Data System (ADS)

    Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien

    2016-04-01

    Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014, Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367), tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), WCPS is applied to diverse datasets on the Moon, as well as Mercury. Other Solar System bodies are going to be progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as well as derived imagery colour-combination products, dynamically generated and accessed also through the OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally or globally. The new PlanetServer system uses the open-source NASA WorldWind (e.g. Hogan, 2011) virtual globe as its visualisation engine, and the array database Rasdaman Community Edition as the core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible at http://planetserver.eu. All its code base is going to be available on GitHub, on
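
    A WCPS query is plain text posed to the server's OGC endpoint, so band arithmetic such as a spectral index can be requested in one HTTP call. The sketch below shows that general pattern with the requests library; the endpoint path, coverage identifier and band names are assumptions for illustration, not the actual PlanetServer layout.

      import requests

      ENDPOINT = "http://planetserver.eu/rasdaman/ows"   # path assumed for the sketch
      # Illustrative WCPS query: band ratio over a hypothetical hyperspectral coverage.
      wcps = ('for c in (crism_cube) '
              'return encode((c.band_233 / c.band_78), "png")')
      resp = requests.get(ENDPOINT, params={
          "service": "WCS", "version": "2.0.1",
          "request": "ProcessCoverages", "query": wcps}, timeout=120)
      print(resp.status_code, resp.headers.get("Content-Type"))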

  11. Separation of membrane protein complexes by native LDS-PAGE.

    PubMed

    Arnold, Janine; Shapiguzov, Alexey; Fucile, Geoffrey; Rochaix, Jean-David; Goldschmidt-Clermont, Michel; Eichacker, Lutz Andreas

    2014-01-01

    Gel electrophoresis has become one of the most important methods for the analysis of proteins and protein complexes in a molecular weight range of 1-10^7 kDa. The separation of membrane protein complexes remained challenging to standardize until the demonstration of Blue Native PAGE in 1991 [1] and Clear Native PAGE in 1994 [2]. We present a robust protocol for high-resolution separation of photosynthetic complexes from Arabidopsis thaliana using lithium dodecyl sulfate as the anion in a modified Blue Native PAGE (LDS-PAGE). Here, non-covalently bound chlorophyll is used as a sensitive probe to characterize the assembly/biogenesis of the pigment-protein complexes essential for photosynthesis. The high fluorescence yield recorded from chlorophyll-binding protein complexes can also be used to establish the separation of native protein complexes as an electrophoretic standard.

  12. 113. Photocopy of illustration on page 109 in Owen, Hints. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    113. Photocopy of illustration on page 109 in Owen, Hints. SOUTHERN GATEWAY, SMITHSONIAN INSTITUTION - Smithsonian Institution Building, 1000 Jefferson Drive, between Ninth & Twelfth Streets, Southwest, Washington, District of Columbia, DC

  13. 1. Historic American Buildings Survey, Louis C. Page, Jr., Photographer ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. Historic American Buildings Survey, Louis C. Page, Jr., Photographer January 20, 1934 VIEW FROM SOUTH (FRONT). - French Legation to Republic of Texas, Seventh & San Marcos Streets, Austin, Travis County, TX

  14. 2. Historic American Buildings Survey, Louis C. Page, Jr., Photographer ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Historic American Buildings Survey, Louis C. Page, Jr., Photographer January 20, 1934 VIEW FROM WEST (FRONT). - French Legation to Republic of Texas, Seventh & San Marcos Streets, Austin, Travis County, TX

  15. 4. Historic American Buildings Survey, Louis C. Page, Jr., Photographer ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Historic American Buildings Survey, Louis C. Page, Jr., Photographer January 20, 1934 VIEW FROM SOUTHWEST (FRONT). - French Legation to Republic of Texas, Seventh & San Marcos Streets, Austin, Travis County, TX

  16. 3. Historic American Buildings Survey, Louis C. Page, Jr., Photographer ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Historic American Buildings Survey, Louis C. Page, Jr., Photographer February 10, 1934 VIEW FROM SOUTH (FRONT). - French Legation to Republic of Texas, Seventh & San Marcos Streets, Austin, Travis County, TX

  17. Book Holder And Page Turner For The Elderly And Handicapped

    NASA Technical Reports Server (NTRS)

    Kerley, James; Eklund, Wayne

    1993-01-01

    Device holds reading matter and facilitates page turning for person not having use of arms and hands. Accommodates variety of publication formats, whether book, magazine, or newspaper. Holder sits on hospital-bed table and is adjusted to convenient viewing angle. Includes flat upright back support for reading matter, hinged base, and main bracket with bent-wire page holders. Top support on back extended for such large items as newspapers. Wings on back support extended for oversize materials. Reader turns page by gripping special rod via mouthpiece, applying friction cup at its tip to page, and manipulating rod. Mouthpiece wide and tapered so user grips with teeth and uses jaws to move it, rather than using tongue or lips. Helpful to older people, whose facial and mouth muscles are weak.

  18. 107. Photocopy of plate opposite page 104 in Owen, Hints. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    107. Photocopy of plate opposite page 104 in Owen, Hints. SMITHSONIAN INSTITUTION, FROM THE NORTH-EAST - Smithsonian Institution Building, 1000 Jefferson Drive, between Ninth & Twelfth Streets, Southwest, Washington, District of Columbia, DC

  19. 110. Photocopy of plate opposite page 19 in Owen, Hints. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    110. Photocopy of plate opposite page 19 in Owen, Hints. CAMPANILE, SMITHSONIAN INSTITUTION, FROM THE NORTH-EAST - Smithsonian Institution Building, 1000 Jefferson Drive, between Ninth & Twelfth Streets, Southwest, Washington, District of Columbia, DC

  20. 112. Photocopy of plate opposite page 43 in Owen, Hints. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    112. Photocopy of plate opposite page 43 in Owen, Hints. CENTRAL SOUTHERN TOWER, SMITHSONIAN INSTITUTION; FROM THE SOUTH-WEST - Smithsonian Institution Building, 1000 Jefferson Drive, between Ninth & Twelfth Streets, Southwest, Washington, District of Columbia, DC

  1. 109. Photocopy of plate opposite page 75 in Owen, Hints. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    109. Photocopy of plate opposite page 75 in Owen, Hints. WEST WING, SMITHSONIAN INSTITUTION: FROM THE NORTH-EAST - Smithsonian Institution Building, 1000 Jefferson Drive, between Ninth & Twelfth Streets, Southwest, Washington, District of Columbia, DC

  2. 111. Photocopy of plate opposite page 108 in Owen, Hints. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    111. Photocopy of plate opposite page 108 in Owen, Hints. SMITHSONIAN INSTITUTION FROM THE SOUTH WEST - Smithsonian Institution Building, 1000 Jefferson Drive, between Ninth & Twelfth Streets, Southwest, Washington, District of Columbia, DC

  3. Label Review Training: Module 1: Label Basics, Page 24

    EPA Pesticide Factsheets

    This module of the pesticide label review training provides basic information about pesticides, their labeling and regulation, and the core principles of pesticide label review. This page is about which labels require review.

  4. 1. Historic American Buildings Survey Annals of SF, Page 170 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. Historic American Buildings Survey Annals of SF, Page 170 Jocelyn Annin-Del Photo Taken: 1836 FOURTH OF JULY CELEBRATION YEAR BUILT 1836 - Jacob Leese House, Historic View, Grant Avenue, San Francisco, San Francisco County, CA

  5. Label Review Training: Module 1: Label Basics, Page 29

    EPA Pesticide Factsheets

    This module of the pesticide label review training provides basic information about pesticides, their labeling and regulation, and the core principles of pesticide label review. This page is a quiz on Module 1.

  6. Label Review Training: Module 1: Label Basics, Page 6

    EPA Pesticide Factsheets

    Page 6, Pesticide labels translate results of our extensive evaluations of pesticide products into conditions, directions and precautions that define parameters for use of a pesticide with the goal of ensuring protection of human health and the environment

  7. Label Review Training: Module 1: Label Basics, Page 7

    EPA Pesticide Factsheets

    Page 7, Label Training, Pesticide labels translate results of our extensive evaluations of pesticide products into conditions, directions and precautions that define parameters for use of a pesticide with the goal of ensuring protection of human health and the environment

  8. ASM Based Synthesis of Handwritten Arabic Text Pages

    PubMed Central

    Dinges, Laslo; Al-Hamadi, Ayoub; Elzobi, Moftah; El-etriby, Sherif; Ghoneim, Ahmed

    2015-01-01

    Document analysis tasks such as text recognition, word spotting, or segmentation are highly dependent on comprehensive and suitable databases for training and validation. However, their generation is expensive in terms of labor and time. As a matter of fact, there is a lack of such databases, which complicates research and development. This is especially true for the case of Arabic handwriting recognition, which involves different preprocessing, segmentation, and recognition methods that have individual demands on samples and ground truth. To bypass this problem, we present an efficient system that automatically turns Arabic Unicode text into synthetic images of handwritten documents and detailed ground truth. Active Shape Models (ASMs) based on 28046 online samples were used for character synthesis, and statistical properties were extracted from the IESK-arDB database to simulate baselines and word slant or skew. In the synthesis step, ASM-based representations are composed into words and text pages, smoothed by B-Spline interpolation, and rendered considering writing speed and pen characteristics. Finally, we use the synthetic data to validate a segmentation method. An experimental comparison with the IESK-arDB database encourages training and testing document analysis related methods on synthetic samples whenever no sufficient naturally ground-truthed data are available. PMID:26295059
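
    One step named in the abstract is smoothing the composed stroke paths by B-spline interpolation. The snippet below illustrates only that step on a toy polyline, assuming SciPy is available; it is not the authors' ASM pipeline.

      import numpy as np
      from scipy.interpolate import splprep, splev

      # Toy polyline standing in for a composed character contour.
      x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
      y = np.array([0.0, 0.8, 0.3, 1.1, 0.4, 0.9])

      # Fit a smoothing B-spline through the points and resample it densely.
      tck, _ = splprep([x, y], s=0.5)            # s controls the smoothing strength
      xs, ys = splev(np.linspace(0, 1, 200), tck)
      print(len(xs), "smoothed points")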

  9. Using shadow page cache to improve isolated drivers performance.

    PubMed

    Zheng, Hao; Dong, Xiaoshe; Wang, Endong; Chen, Baoke; Zhu, Zhengdong; Liu, Chengzhe

    2015-01-01

    With the advantage of the reusability property of virtualization technology, users can reuse various types and versions of existing operating systems and drivers in a virtual machine, so as to customize their application environment. In order to prevent users' virtualization environments from being impacted by driver faults in the virtual machine, Chariot examines the correctness of a driver's write operations by combining capture of the driver's write operations with a driver's private access control table. However, this method needs to keep the write permission of the shadow page table as read-only, so as to capture the isolated driver's write operations through page faults, which adversely affects the performance of the driver. Based on delaying the setting of frequently used shadow pages' write permissions to read-only, this paper proposes an algorithm that uses a shadow page cache to improve the performance of isolated drivers and carefully studies the relationship between driver performance and the size of the shadow page cache. Experimental results show that, through the shadow page cache, the performance of isolated drivers can be greatly improved without impacting Chariot's reliability too much.
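
    The core idea, keeping frequently written shadow pages writable for a while so they stop taking page faults, can be conveyed with a small cache simulation. This is a conceptual Python sketch under assumed names, not the in-hypervisor implementation.

      from collections import OrderedDict

      class ShadowPageCache:
          """Keep recently written pages writable; evicted pages revert to read-only."""
          def __init__(self, capacity=4):
              self.capacity = capacity
              self.writable = OrderedDict()      # page -> True, in LRU order

          def write(self, page):
              if page in self.writable:          # cache hit: no page fault needed
                  self.writable.move_to_end(page)
                  return "fast write"
              self.writable[page] = True         # cache miss: one fault, then cached
              if len(self.writable) > self.capacity:
                  evicted, _ = self.writable.popitem(last=False)
                  print(f"page {evicted} set back to read-only")
              return "write after fault"

      cache = ShadowPageCache(capacity=2)
      for p in [1, 1, 2, 1, 3, 3, 1]:
          print(p, cache.write(p))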

  10. Client-Server Connection Status Monitoring Using Ajax Push Technology

    NASA Technical Reports Server (NTRS)

    Lamongie, Julien R.

    2008-01-01

    This paper describes how simple client-server connection status monitoring can be implemented using Ajax (Asynchronous JavaScript and XML), JSF (JavaServer Faces) and ICEfaces technologies. This functionality is required for NASA LCS (Launch Control System) displays used in the firing room for the Constellation project. Two separate implementations based on two distinct approaches are detailed and analyzed.

  11. HMMER web server: interactive sequence similarity searching.

    PubMed

    Finn, Robert D; Clements, Jody; Eddy, Sean R

    2011-07-01

    HMMER is a software suite for protein sequence similarity searches using probabilistic methods. Previously, HMMER has mainly been available only as a computationally intensive UNIX command-line tool, restricting its use. Recent advances in the software, HMMER3, have resulted in a 100-fold speed gain relative to previous versions. It is now feasible to make efficient profile hidden Markov model (profile HMM) searches via the web. A HMMER web server (http://hmmer.janelia.org) has been designed and implemented such that most protein database searches return within a few seconds. Methods are available for searching either a single protein sequence, a multiple protein sequence alignment or a profile HMM against a target sequence database, and for searching a protein sequence against Pfam. The web server is designed to cater to a range of user expertise and accepts batch uploading of multiple queries at once. All search methods are also available as RESTful web services, thereby allowing them to be readily integrated as remotely executed tasks in locally scripted workflows. We have focused on minimizing search times and the ability to rapidly display tabular results, regardless of the number of matches found, developing graphical summaries of the search results to provide quick, intuitive appraisal of them.
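
    Since every search method is also exposed as a RESTful service, a search can be driven from a local script. The sketch below shows the general shape of such a call with the requests library; the search path, field names and target database are assumptions for illustration and should be taken from the service documentation, not from this record.

      import requests

      # Hypothetical phmmer-style search endpoint under the server named above.
      URL = "http://hmmer.janelia.org/search/phmmer"
      payload = {"seqdb": "pdb",
                 "seq": ">query\nMKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"}
      resp = requests.post(URL, data=payload,
                           headers={"Accept": "application/json"}, timeout=60)
      resp.raise_for_status()
      print(resp.status_code, resp.headers.get("Content-Type"))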

  12. Software design of the ASTRI camera server proposed for the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Conforti, Vito; Trifoglio, Massimo; Gianotti, Fulvio; Malaguti, Giuseppe; Bulgarelli, Andrea; Fioretti, Valentina; Zoli, Andrea; Catalano, Osvaldo; Capalbi, Milvia; Sangiorgi, Pierluca

    2016-07-01

    perform this processing on all the incoming event packets. The camera server provides interfaces to the array control software to allow for monitoring and control during array operations. In this paper we present the design of the camera server software with particular emphasis on the external interfaces. In addition, we report the results of the first integration activities and performance tests.

  13. Improvements to the NIST network time protocol servers

    NASA Astrophysics Data System (ADS)

    Levine, Judah

    2008-12-01

    The National Institute of Standards and Technology (NIST) operates 22 network time servers at various locations. These servers respond to requests for time in a number of different formats and provide time stamps that are directly traceable to the NIST atomic clock ensemble in Boulder. The link between the servers at locations outside of the NIST Boulder Laboratories and the atomic clock ensemble is provided by the Automated Computer Time Service (ACTS) system, which has a direct connection to the clock ensemble and which transmits time information over dial-up telephone lines with a two-way protocol to measure the transmission delay. I will discuss improvements to the ACTS servers and to the time servers themselves. These improvements have resulted in an improvement of almost an order of magnitude in the performance of the system.
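
    A client obtains time from such a server with a single 48-byte SNTP request over UDP port 123 and reads the transmit timestamp from the reply. The sketch below uses only the Python standard library; the host name is one commonly used public NIST name and stands in for any of the servers described above.

      import socket, struct, time

      NTP_SERVER = "time.nist.gov"    # one public NIST host name, for illustration
      NTP_EPOCH_OFFSET = 2208988800   # seconds between 1900-01-01 and 1970-01-01

      def sntp_time(host=NTP_SERVER, port=123, timeout=5.0):
          # Minimal 48-byte client request: LI=0, VN=3, Mode=3 (client).
          packet = b"\x1b" + 47 * b"\0"
          with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
              s.settimeout(timeout)
              s.sendto(packet, (host, port))
              data, _ = s.recvfrom(512)
          # Transmit timestamp seconds field is at bytes 40-43 (big-endian, since 1900).
          secs = struct.unpack("!I", data[40:44])[0]
          return secs - NTP_EPOCH_OFFSET

      print(time.ctime(sntp_time()))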

  14. Oceanotron, Scalable Server for Marine Observations

    NASA Astrophysics Data System (ADS)

    Loubrieu, T.; Bregent, S.; Blower, J. D.; Griffiths, G.

    2013-12-01

    Ifremer, the French marine institute, is deeply involved in data management for different ocean in-situ observation programs (ARGO, OceanSites, GOSUD, ...) and other European programs aiming at networking ocean in-situ observation data repositories (myOcean, seaDataNet, Emodnet). To capitalize on the effort of implementing advanced data dissemination services (visualization, download with subsetting) for these programs and, more generally, for water-column observation repositories, Ifremer decided in 2010 to develop the oceanotron server. Given the diversity of data repository formats (RDBMS, netCDF, ODV, ...) and the temperamental nature of the standard interoperability interface profiles (OGC/WMS, OGC/WFS, OGC/SOS, OpeNDAP, ...), the server is designed to manage plugins: StorageUnits, which read specific data repository formats (netCDF/OceanSites, RDBMS schemas, ODV binary format), and FrontDesks, which receive external requests and return results over interoperable protocols (OGC/WMS, OGC/SOS, OpenDAP). In between, a third type of plugin may be inserted: TransformationUnits, which perform ocean-business-related transformations of the features (for example, conversion of vertical coordinates from pressure in decibars to meters below the sea surface). The server is released under an open-source license so that partners can develop their own plugins. Within the MyOcean project, the University of Reading has plugged in a WMS implementation as an oceanotron frontdesk. The modules are connected together by sharing the same information model for marine observations (or sampling features: vertical profiles, point series and trajectories), dataset metadata and queries. The shared information model is based on the OGC/Observation & Measurement and Unidata/Common Data Model initiatives. The model is implemented in Java (http://www.ifremer.fr/isi/oceanotron/javadoc/). This internal interoperability level makes it possible to capitalize on ocean business expertise in software development without being indentured to
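
    The three plugin types named above exchange features through a shared information model. A minimal sketch of that pattern is given below in Python rather than the server's Java, and every class and method name beyond the three plugin types is an assumption made for illustration.

      from abc import ABC, abstractmethod

      class Feature(dict):
          """Stand-in for the shared observation model (profile, point series, ...)."""

      class StorageUnit(ABC):
          @abstractmethod
          def read(self, query): ...            # returns a list of Feature objects

      class TransformationUnit(ABC):
          @abstractmethod
          def transform(self, feature): ...     # returns a modified Feature

      class NetcdfStorage(StorageUnit):
          def read(self, query):
              return [Feature(depth_dbar=10.0, temperature=14.2)]

      class PressureToDepth(TransformationUnit):
          def transform(self, feature):
              # Roughly 1 decibar per metre of depth; simplified on purpose.
              feature["depth_m"] = feature.pop("depth_dbar")
              return feature

      storage, transform = NetcdfStorage(), PressureToDepth()
      print([transform.transform(f) for f in storage.read("temperature near 45N 5W")])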

  15. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    SciTech Connect

    Valassi, A.; Bartoldus, R.; Kalkhof, A.; Salnikov, A.; Wache, M.; /Mainz U., Inst. Phys.

    2012-04-19

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier 'CORAL server' deployed close to the database and a tree of 'CORAL server proxies', providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer of 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.
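
    The proxies' contribution is read caching and multiplexing between many clients and one database-side server. The caching half of that idea can be conveyed with a tiny read-through cache; this is a conceptual Python sketch with invented names, not CORAL's C++ implementation.

      class ReadThroughCache:
          """Serve repeated identical queries from memory; forward misses upstream."""
          def __init__(self, upstream):
              self.upstream = upstream            # callable: query -> result
              self.store = {}

          def query(self, q):
              if q not in self.store:
                  self.store[q] = self.upstream(q)  # one round trip per distinct query
              return self.store[q]

      def fake_database(q):
          print("hit database for:", q)
          return f"rows for {q}"

      proxy = ReadThroughCache(fake_database)
      for q in ["SELECT * FROM conditions", "SELECT * FROM conditions"]:
          print(proxy.query(q))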

  16. Automatic comic page image understanding based on edge segment analysis

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique for producing digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is evaluated on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.
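
    The first two stages of such a pipeline, Canny edge extraction and line-segment detection, can be roughed out with OpenCV. The snippet below is a generic sketch assuming the opencv-python package and an input image on disk; it uses a standard Hough transform for line detection and is not the authors' edge-segment chaining method.

      import cv2
      import numpy as np

      img = cv2.imread("comic_page.png", cv2.IMREAD_GRAYSCALE)   # assumed input page
      edges = cv2.Canny(img, 50, 150)                            # Canny edge map

      # Detect candidate border lines among the edges (parameters are illustrative).
      lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                              minLineLength=100, maxLineGap=5)
      print(0 if lines is None else len(lines), "line segments found")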

  17. Identifying and Analyzing Web Server Attacks

    SciTech Connect

    Seifert, Christian; Endicott-Popovsky, Barbara E.; Frincke, Deborah A.; Komisarczuk, Peter; Muschevici, Radu; Welch, Ian D.

    2008-08-29

    Client honeypots can be used to identify malicious web servers that attack web browsers and push malware to client machines. Merely recording network traffic is insufficient to perform comprehensive forensic analyses of such attacks. Custom tools are required to access and analyze network protocol data. Moreover, specialized methods are required to perform a behavioral analysis of an attack, which helps determine exactly what transpired on the attacked system. This paper proposes a record/replay mechanism that enables forensic investigators to extract application data from recorded network streams and allows applications to interact with this data in order to conduct behavioral analyses. Implementations for the HTTP and DNS protocols are presented and their utility in network forensic investigations is demonstrated.

  18. STRAW: Species TRee Analysis Web server

    PubMed Central

    Shaw, Timothy I.; Ruan, Zheng; Glenn, Travis C.; Liu, Liang

    2013-01-01

    The coalescent methods for species tree reconstruction are increasingly popular because they can accommodate coalescence and multilocus data sets. Herein, we present STRAW, a web server that offers workflows for reconstruction of phylogenies of species using three species tree methods—MP-EST, STAR and NJst. The input data are a collection of rooted gene trees (for STAR and MP-EST methods) or unrooted gene trees (for NJst). The output includes the estimated species tree, modified Robinson-Foulds distances between gene trees and the estimated species tree, and visualization of trees to compare gene trees with the estimated species tree. The web server is available at http://bioinformatics.publichealth.uga.edu/SpeciesTreeAnalysis/. PMID:23661681
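
    The Robinson-Foulds distance reported between each gene tree and the species tree can be reproduced locally for small examples. The sketch below uses the DendroPy library as an assumption; STRAW itself may compute its modified distance differently.

      import dendropy
      from dendropy.calculate import treecompare

      # Two four-taxon gene trees that disagree on one grouping.
      tns = dendropy.TaxonNamespace()
      t1 = dendropy.Tree.get(data="((A,B),(C,D));", schema="newick", taxon_namespace=tns)
      t2 = dendropy.Tree.get(data="((A,C),(B,D));", schema="newick", taxon_namespace=tns)

      # Symmetric-difference (Robinson-Foulds) distance between the two topologies.
      print(treecompare.symmetric_difference(t1, t2))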

  19. CENTROIDFOLD: a web server for RNA secondary structure prediction.

    PubMed

    Sato, Kengo; Hamada, Michiaki; Asai, Kiyoshi; Mituyama, Toutai

    2009-07-01

    The CENTROIDFOLD web server (http://www.ncrna.org/centroidfold/) is a web application for RNA secondary structure prediction powered by one of the most accurate prediction engine. The server accepts two kinds of sequence data: a single RNA sequence and a multiple alignment of RNA sequences. It responses with a prediction result shown as a popular base-pair notation and a graph representation. PDF version of the graph representation is also available. For a multiple alignment sequence, the server predicts a common secondary structure. Usage of the server is quite simple. You can paste a single RNA sequence (FASTA or plain sequence text) or a multiple alignment (CLUSTAL-W format) into the textarea then click on the 'execute CentroidFold' button. The server quickly responses with a prediction result. The major advantage of this server is that it employs our original CentroidFold software as its prediction engine which scores the best accuracy in our benchmark results. Our web server is freely available with no login requirement.

  20. Understanding Customer Dissatisfaction with Underutilized Distributed File Servers

    NASA Technical Reports Server (NTRS)

    Riedel, Erik; Gibson, Garth

    1996-01-01

    An important trend in the design of storage subsystems is a move toward direct network attachment. Network-attached storage offers the opportunity to off-load distributed file system functionality from dedicated file server machines and execute many requests directly at the storage devices. For this strategy to lead to better performance, as perceived by users, the response time of distributed operations must improve. In this paper we analyze measurements of an Andrew file system (AFS) server that we recently upgraded in an effort to improve client performance in our laboratory. While the original server's overall utilization was only about 3%, we show how burst loads were sufficiently intense to lead to periods of poor response time significant enough to trigger customer dissatisfaction. In particular, we show how, after adjusting for network load and traffic to non-project servers, 50% of the variation in client response time was explained by variation in server central processing unit (CPU) use. That is, clients saw long response times in large part because the server was often over-utilized when it was used at all. Using these measures, we see that off-loading file server work in a network-attached storage architecture has the potential to benefit user response time. Computational power in such a system scales directly with storage capacity, so the slowdown during burst periods should be reduced.
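
    The statement that 50% of the variation in client response time was explained by server CPU use is an R-squared from a simple regression. A toy recomputation with NumPy is shown below; the numbers are synthetic and purely illustrative, not the paper's measurements.

      import numpy as np

      # Synthetic measurements: server CPU utilisation (%) and client response time (ms).
      cpu  = np.array([2, 5, 9, 15, 22, 30, 41, 55, 63, 72], dtype=float)
      resp = np.array([12, 13, 15, 14, 20, 24, 31, 30, 44, 52], dtype=float)

      # For simple linear regression, R^2 equals the squared Pearson correlation.
      r = np.corrcoef(cpu, resp)[0, 1]
      print(f"R^2 = {r**2:.2f}  (fraction of response-time variance explained by CPU)")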

  1. Secure entanglement distillation for double-server blind quantum computation.

    PubMed

    Morimae, Tomoyuki; Fujii, Keisuke

    2013-07-12

    Blind quantum computation is a new secure quantum computing protocol where a client, who does not have enough quantum technologies at her disposal, can delegate her quantum computation to a server, who has a fully fledged quantum computer, in such a way that the server cannot learn anything about the client's input, output, and program. If the client interacts with only a single server, the client has to have some minimum quantum power, such as the ability of emitting randomly rotated single-qubit states or the ability of measuring states. If the client interacts with two servers who share Bell pairs but cannot communicate with each other, the client can be completely classical. For such a double-server scheme, two servers have to share clean Bell pairs, and therefore the entanglement distillation is necessary in a realistic noisy environment. In this Letter, we show that it is possible to perform entanglement distillation in the double-server scheme without degrading the security of blind quantum computing.

  2. Key-phrase based classification of public health web pages.

    PubMed

    Dolamic, Ljiljana; Boyer, Célia

    2013-01-01

    This paper describes and evaluates a public health web page classification model based on key phrase extraction and matching. Easily extendable both to new classes and to new languages, this method proves to be a good solution for text classification in the face of a total lack of training data. To evaluate the proposed solution we used a small collection of public-health-related web pages created by a double-blind manual classification. Our experiments have shown that by choosing an adequate threshold value, the desired value for either precision or recall can be achieved.
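
    A key-phrase matching classifier with a tunable threshold, as described above, can be sketched in a few lines. The class names and phrase lists below are invented examples, not the paper's vocabulary.

      # Toy key-phrase classifier: score = fraction of a class's phrases found in the page.
      CLASS_PHRASES = {
          "nutrition":   ["balanced diet", "vitamin", "calorie intake"],
          "vaccination": ["vaccine", "immunisation schedule", "booster dose"],
      }

      def classify(text, threshold=0.34):
          text = text.lower()
          scores = {c: sum(p in text for p in ps) / len(ps)
                    for c, ps in CLASS_PHRASES.items()}
          # Raising the threshold trades recall for precision, as in the evaluation.
          return [c for c, s in scores.items() if s >= threshold], scores

      labels, scores = classify("A balanced diet and adequate vitamin levels ...")
      print(labels, scores)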

  3. Machine Learning Feature Selection for Tuning Memory Page Swapping

    DTIC Science & Technology

    2013-09-01

    be found in Figure 3.3. The features we added are marked with "(MLVM)". Features 14 and 16 may not be immediately obvious. See Figure 3.4 for a...see Table 3.1). All faults below the 50 percent mark, that is, pages recalled from the backing store in less than approximately 127 seconds, were...labeled as bad decisions, while faults above the 50 percent mark, that is, pages that lived in the backing store for more than approximately 127 seconds

  4. Enriching the trustworthiness of health-related web pages.

    PubMed

    Gaudinat, Arnaud; Cruchet, Sarah; Boyer, Celia; Chrawdhry, Pravir

    2011-06-01

    We present an experimental mechanism for enriching web content with quality metadata. This mechanism is based on a simple and well-known initiative in the field of the health-related web, the HONcode. The Resource Description Framework (RDF) format and the Dublin Core Metadata Element Set were used to formalize these metadata. The model of trust proposed is based on a quality model for health-related web pages that has been tested in practice over a period of thirteen years. Our model has been explored in the context of a project to develop a research tool that automatically detects the occurrence of quality criteria in health-related web pages.
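
    Quality metadata expressed with RDF and the Dublin Core element set, as described, can be produced with the rdflib library. The HONcode-specific property and vocabulary URI below are made-up placeholders, since the paper's exact vocabulary is not given in this record.

      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import DC

      HON = Namespace("http://example.org/honcode#")        # placeholder vocabulary URI
      page = URIRef("http://example.org/health-article")

      g = Graph()
      g.add((page, DC.title, Literal("Managing seasonal allergies")))
      g.add((page, DC.date, Literal("2011-06-01")))
      g.add((page, HON.criterionDetected, Literal("authority")))  # illustrative property
      print(g.serialize(format="turtle"))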

  5. Tracking the Inside Intruder Using Net Logon Debug Logging in Microsoft Windows Server Operating Systems

    SciTech Connect

    Davis, CS

    2004-01-20

    In today's well-connected environments of the Internet, intranets, and extranets, protecting the Microsoft Windows network can be a daunting task for the security engineer. Intrusion Detection Systems are a must-have for most companies, but few have either the financial resources or the people resources to implement and maintain full-scale intrusion detection systems for their networks and hosts. Many will at least invest in intrusion detection for their Internet presence, but others have not yet stepped up to the plate with regard to internal intrusion detection. Unfortunately, most attacks will come from within. Microsoft Windows server operating systems are widely used across both large and small enterprises. Unfortunately, there is no intrusion detection built-in to the Windows server operating system. The security logs are valuable but can be difficult to manage even in a small to medium sized environment. So the question arises, can one effectively detect and identify an inside intruder using the native tools that come with Microsoft Windows Server operating systems? One such method is to use Net Logon Service debug logging to identify and track malicious user activity. This paper discusses how to use Net Logon debug logging to identify and track malicious user activity both in real-time and for forensic analysis.

  6. SDS-PAGE of recombinant and endogenous erythropoietins: benefits and limitations of the method for application in doping control.

    PubMed

    Reichel, Christian; Kulovics, Ronald; Jordan, Veronika; Watzinger, Martina; Geisendorfer, Thomas

    2009-01-01

    Doping of athletes with recombinant and genetically modified erythropoietins (EPO) is currently detected by isoelectric focusing (IEF). The application of these drugs leads to a significant change in the isoform profile of endogenous urinary erythropoietin (uhEPO). Dynepo, MIRCERA, biosimilars with variable IEF profiles, as well as active urines and effort urines, have made additional testing strategies necessary. The new generation of small-molecule EPO-receptor stimulating agents like Hematide will also challenge the analytical concept of detecting the abuse of erythropoiesis stimulating agents (ESA). By determining their apparent molecular masses with SDS-PAGE, a clear differentiation between endogenous and exogenous substances, including new EPO modifications, is possible. Due to the orthogonal character of IEF- and SDS-PAGE, both methods complement each other. The additional benefits of SDS-PAGE, especially in relation to active and effort urines as well as the detection of Dynepo, were investigated. Due to significant differences between the apparent molecular masses of uhEPO/serum EPO (shEPO) and recombinant, genetically or chemically modified erythropoietins, the presence of active or effort urines was easily revealed. The characteristic band shape and apparent molecular mass of Dynepo on SDS-PAGE additionally evidenced the presence of this substance in urine. A protocol for the detection of EPO doping in serum and plasma by SDS-PAGE was developed. Blood appears to be the ideal matrix for detecting all forms of ESA doping in the future.

  7. 7 CFR 3402.11 - Proposal cover page.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Proposal cover page. 3402.11 Section 3402.11 Agriculture Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION, AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE FOOD AND AGRICULTURAL SCIENCES NATIONAL...

  8. College of DuPage Student Portrait, Fall Quarter 1999.

    ERIC Educational Resources Information Center

    College of DuPage, Glen Ellyn, IL. Office of Research and Planning.

    The report profiles the College of DuPage's (COD) fall quarter 1999 student body. It presents a brief history of the college's enrollment and a comparison of enrollments with other Illinois community colleges. It also provides demographic information on current students. Additionally, enrollment information is included by program, division, and…

  9. Taking Shakespeare from the Page to the Stage.

    ERIC Educational Resources Information Center

    Breen, Kathleen T.

    1993-01-01

    Describes an approach to teaching William Shakespeare by which one teacher had students take the plays from the page to the stage by becoming actors and directors as well as scholars. Shows ways of relating various plays to more contemporary works. (HB)

  10. 7 CFR 3402.11 - Proposal cover page.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 15 2011-01-01 2011-01-01 false Proposal cover page. 3402.11 Section 3402.11 Agriculture Regulations of the Department of Agriculture (Continued) NATIONAL INSTITUTE OF FOOD AND AGRICULTURE FOOD AND AGRICULTURAL SCIENCES NATIONAL NEEDS GRADUATE AND POSTGRADUATE FELLOWSHIP GRANTS...

  11. 105. Photocopy of plate opposite page 105 in Robert Dale ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    105. Photocopy of plate opposite page 105 in Robert Dale Owen, Hints on Public Architecture (New York, G. P. Putnam, 1849). GROUND-PLANS, SMITHSONIAN INSTITUTION - Smithsonian Institution Building, 1000 Jefferson Drive, between Ninth & Twelfth Streets, Southwest, Washington, District of Columbia, DC

  12. The 'Don'ts' of Web Page Design.

    ERIC Educational Resources Information Center

    Balas, Janet L.

    1999-01-01

    Discusses online resources that focus on what not to do in Web page design. "Don'ts" include: making any of the top 10 mistakes identified by Nielsen, qualifying for a "muddie" award for bad Web sites, forgetting to listen to users, and forgetting accessibility. A sidebar lists the Web site addresses for the nine resources…

  13. A Tour of Information Science through the Pages of JASIS.

    ERIC Educational Resources Information Center

    Bates, Marcia J.

    1999-01-01

    Provides selected article titles and descriptive material drawn from the pages of the "Journal of the American Society for Information Science (JASIS)" and its precursor title, "American Documentation," dating from the beginning of the journal in January 1950 until the spring of 1999. Descriptions are arranged by subject and…

  14. On Apples and Onions: A Reply to Page.

    ERIC Educational Resources Information Center

    Phillips, Gerald M.

    1980-01-01

    Answers some of William Page's criticisms (see preceding article, EJ 227 456) regarding the use of rhetoritherapy v behavior therapy to deal with students who exhibit communication apprehension. Argues that rhetoritherapy deals with people who have problems, not with problems. It is concerned with what can be done about the problem, not what the…

  15. Building interactive simulations in a Web page design program.

    PubMed

    Kootsey, J Mailen; Siriphongs, Daniel; McAuley, Grant

    2004-01-01

    A new Web software architecture, NumberLinX (NLX), has been integrated into a commercial Web design program to produce a drag-and-drop environment for building interactive simulations. NLX is a library of reusable objects written in Java, including input, output, calculation, and control objects. The NLX objects were added to the palette of available objects in the Web design program to be selected and dropped on a page. Inserting an object in a Web page is accomplished by adding a template block of HTML code to the page file. HTML parameters in the block must be set to user-supplied values, so the HTML code is generated dynamically, based on user entries in a popup form. Implementing the object inspector for each object permits the user to edit object attributes in a form window. Except for model definition, the combination of the NLX architecture and the Web design program permits construction of interactive simulation pages without writing or inspecting code.
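
    The insertion step described above, adding a template block of HTML whose parameters come from a popup form, amounts to simple template substitution. A sketch using Python's standard string.Template follows; the tag structure and attribute names are invented for illustration, not the actual NLX markup.

      from string import Template

      # Hypothetical NLX object block; $-placeholders are filled from the form values.
      BLOCK = Template('<applet code="nlx.$object_class" width="$width" height="$height">\n'
                       '  <param name="variable" value="$variable">\n'
                       '</applet>')

      form_values = {"object_class": "Slider", "width": 200, "height": 40,
                     "variable": "heart_rate"}
      print(BLOCK.substitute(form_values))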

  16. Turning a New Page to Life and Literacy.

    ERIC Educational Resources Information Center

    Taylor, Rosemarye T.; McAtee, Richard

    2003-01-01

    Discusses how a literacy intervention program found success among struggling readers in prison. Describes "Turning a New Page," an unconditional literacy project that develops vocabulary, comprehension, fluency, and self-esteem in the older, reluctant reader. Concludes that older, reluctant readers need motivation and respect in learning…

  17. 7 CFR 3402.11 - Proposal cover page.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 15 2013-01-01 2013-01-01 false Proposal cover page. 3402.11 Section 3402.11 Agriculture Regulations of the Department of Agriculture (Continued) NATIONAL INSTITUTE OF FOOD AND AGRICULTURE FOOD AND AGRICULTURAL SCIENCES NATIONAL NEEDS GRADUATE AND POSTGRADUATE FELLOWSHIP GRANTS...

  18. 7 CFR 3402.11 - Proposal cover page.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 15 2012-01-01 2012-01-01 false Proposal cover page. 3402.11 Section 3402.11 Agriculture Regulations of the Department of Agriculture (Continued) NATIONAL INSTITUTE OF FOOD AND AGRICULTURE FOOD AND AGRICULTURAL SCIENCES NATIONAL NEEDS GRADUATE AND POSTGRADUATE FELLOWSHIP GRANTS...

  19. 7 CFR 3402.11 - Proposal cover page.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 15 2014-01-01 2014-01-01 false Proposal cover page. 3402.11 Section 3402.11 Agriculture Regulations of the Department of Agriculture (Continued) NATIONAL INSTITUTE OF FOOD AND AGRICULTURE FOOD AND AGRICULTURAL SCIENCES NATIONAL NEEDS GRADUATE AND POSTGRADUATE FELLOWSHIP GRANTS...

  20. Exploring the Use of a Facebook Page in Anatomy Education

    ERIC Educational Resources Information Center

    Jaffar, Akram Abood

    2014-01-01

    Facebook is the most popular social media site visited by university students on a daily basis. Consequently, Facebook is the logical place to start with for integrating social media technologies into education. This study explores how a faculty-administered Facebook Page can be used to supplement anatomy education beyond the traditional…

  1. Student-Constructed Web Pages for Intercultural Understanding.

    ERIC Educational Resources Information Center

    Kitao, Kenji; Kitao, S. Kathleen

    The Internet is a resource that allows English as a Second Language (ESL) students to communicate meaningfully in English. One way to combine the Internet with teaching English is to give students a group or individual assignment to make their own Web pages. As they complete these assignments, they can develop skills in searching out resources on…

  2. 25. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    25. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  3. 1. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  4. 38. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    38. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  5. 21. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  6. 35. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    35. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  7. 8. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  8. 37. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    37. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  9. 3. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  10. 2. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  11. 14. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  12. 29. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    29. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  13. 36. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    36. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  14. 7. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  15. 27. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    27. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  16. 5. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  17. 20. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    20. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  18. 4. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  19. 10. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  20. 17. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    17. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  1. 30. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    30. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  2. 19. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  3. 26. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    26. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  4. 34. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    34. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  5. 23. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    23. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  6. 39. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    39. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  7. 15. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  8. 24. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  9. 12. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  10. 33. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    33. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  11. 18. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  12. 6. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  13. 22. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    22. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  14. 32. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    32. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  15. 13. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  16. 9. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  17. 16. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  18. 31. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    31. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  19. 28. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  20. 11. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. PHOTOCOPIES OF TITLE PAGE, BOOKPLATE, MEMBERSHIP CERTIFICATE AND PLATES ILLUSTRATING EXAMPLES OF CARPENTERS' WORK PUBLISHED IN ARTICLES OF THE CARPENTERS' COMPANY OF PHILADELPHIA AND THEIR RULES FOR MEASURING AND VALUING HOUSE CARPENTERS' WORK (Philadelphia: Hall and Sellars, 1786). - Carpenters' Company, Rule Book (carpentry manual), Philadelphia, Philadelphia County, PA

  1. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP) Database. Prior to January 1, 2012, all VOSBs and SDVOSBs must be listed in the VIP database, available at...

  2. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP) Database. Prior to January 1, 2012, all VOSBs and SDVOSBs must be listed in the VIP database, available at...

  3. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP) Database. Prior to January 1, 2012, all VOSBs and SDVOSBs must be listed in the VIP database, available at...

  4. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP) Database. Prior to January 1, 2012, all VOSBs and SDVOSBs must be listed in the VIP database, available at...

  5. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP) Database. Prior to January 1, 2012, all VOSBs and SDVOSBs must be listed in the VIP database, available at...

  6. What Should Be On A School Library Web Page?

    ERIC Educational Resources Information Center

    Baumbach, Donna; Brewer, Sally; Renfroe, Matt

    2004-01-01

    As varied as the schools and the communities they serve, so too are the Web pages for the library media programs that serve them. This article provides guidelines for effective web design and the information that might be included, including reference resources, reference assistance, curriculum support, literacy advocacy, and dynamic material. An…

  7. Ranking nodes in growing networks: When PageRank fails

    PubMed Central

    Mariani, Manuel Sebastian; Medo, Matúš; Zhang, Yi-Cheng

    2015-01-01

    PageRank is arguably the most popular ranking algorithm which is being applied in real systems ranging from information to biological and infrastructure networks. Despite its outstanding popularity and broad use in different areas of science, the relation between the algorithm’s efficacy and properties of the network on which it acts has not yet been fully understood. We study here PageRank’s performance on a network model supported by real data, and show that realistic temporal effects make PageRank fail in individuating the most valuable nodes for a broad range of model parameters. Results on real data are in qualitative agreement with our model-based findings. This failure of PageRank reveals that the static approach to information filtering is inappropriate for a broad class of growing systems, and suggests that time-dependent algorithms based on the temporal linking patterns of these systems are needed to better rank the nodes. PMID:26553630
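
    For reference, the core computation behind PageRank is a simple power iteration over the link graph. The sketch below is a generic Python illustration on a toy graph; the node names and damping factor are assumptions, not data or parameters from the study above.

      # Minimal PageRank power iteration on a toy link graph (illustrative only).
      def pagerank(links, damping=0.85, iters=50):
          nodes = list(links)
          rank = {n: 1.0 / len(nodes) for n in nodes}
          for _ in range(iters):
              new = {n: (1.0 - damping) / len(nodes) for n in nodes}
              for src, outs in links.items():
                  targets = outs if outs else nodes   # dangling nodes spread evenly
                  for dst in targets:
                      new[dst] += damping * rank[src] / len(targets)
              rank = new
          return rank

      web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
      print(pagerank(web))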

  8. 61. PAGE THREE OF PLANS FOR GRAND CANAL AT WASHINGTON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    61. PAGE THREE OF PLANS FOR GRAND CANAL AT WASHINGTON STREET TIDEGATE SYSTEM REHABILITATION Plan Sheet D-28451, Sheet No. 3 of 3 (delineated by H. V. Nguyen, November 1985) - Venice Canals, Community of Venice, Los Angeles, Los Angeles County, CA

  9. The Inquiry Page: Bringing Digital Libraries to Learners.

    ERIC Educational Resources Information Center

    Bruce, Bertram C.; Bishop, Ann Peterson; Heidorn, P. Bryan; Lunsford, Karen J.; Poulakos, Steven; Won, Mihye

    2003-01-01

    Discusses digital library development, particularly a national science digital library, and describes the Inquiry Page which focuses on building a constructivist environment using Web resources, collaborative processes, and knowledge that bridges digital libraries with users in K-12 schools, museums, community groups, or other organizations. (LRW)

  10. 22. PHOTOGRAPHIC ENLARGEMENT OF UPPER PHOTOGRAPH ON PAGE 986 IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    22. PHOTOGRAPHIC ENLARGEMENT OF UPPER PHOTOGRAPH ON PAGE 986 IN Keystone Coal Buyers Catalog, 1922, VIEW SOUTH, COMMUNITY OF ETHEL; ETHEL COAL COMPANY MINE SUPPLY BUILDING IS LOCATED IN MID-GROUND LEFT OF CENTER PARTIALLY OBSCURED BY ROOF OF HOUSE IN FOREGROUND - Ethel Coal Company & Supply Building, Left fork of Dingess Run (Ethel Hollow), Ethel, Logan County, WV

  11. 21. PHOTOGRAPH OF PAGE 986 IN Keystone Coal Buyers Catalog, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. PHOTOGRAPH OF PAGE 986 IN Keystone Coal Buyers Catalog, 1922, UPPER PHOTOGRAPH, VIEW SOUTH, COMMUNITY OF ETHEL; ETHEL COAL COMPANY MINE SUPPLY BUILDING IS LOCATED IN MID-GROUND LEFT OF CENTER PARTIALLY OBSCURED BY ROOF OF HOUSE IN FOREGROUND - Ethel Coal Company & Supply Building, Left fork of Dingess Run (Ethel Hollow), Ethel, Logan County, WV

  12. 23. PHOTOGRAPHIC ENLARGEMENT OF UPPER PHOTOGRAPH ON PAGE 986 IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    23. PHOTOGRAPHIC ENLARGEMENT OF UPPER PHOTOGRAPH ON PAGE 986 IN Keystone Coal Buyers Catalog, 1922, VIEW SOUTH, COMMUNITY OF ETHEL; ETHEL COAL COMPANY MINE SUPPLY BUILDING IS LOCATED IN MID-GROUND IN CENTER PARTIALLY OBSCURED BY ROOF OF HOUSE IN FOREGROUND - Ethel Coal Company & Supply Building, Left fork of Dingess Run (Ethel Hollow), Ethel, Logan County, WV

  13. Video 2 of 4: Navigating the Live Access Server

    NASA Video Gallery

    Learn how to navigate the MY NASA DATA website and server using the NASA Explorer Schools lesson, Analyzing Solar Energy Graphs. The video also shows you how to access, filter and manipulate the da...

  14. How to secure your servers, code and data

    ScienceCinema

    None

    2016-07-12

    Oral presentation in English, slides in English. Advice and best practices regarding the security of your servers, code and data will be presented. We will also describe how the Computer Security Team can help you reduce the risks.

  15. GrayStarServer: Stellar atmospheric modeling and spectrum synthesis

    NASA Astrophysics Data System (ADS)

    Short, C. Ian

    2017-01-01

    GrayStarServer is a stellar atmospheric modeling and spectrum synthesis code of pedagogical accuracy that is accessible in any web browser on commonplace computational devices and that runs on a timescale of a few seconds.

  16. Comparing Server Energy Use and Efficiency Using Small Sample Sizes

    SciTech Connect

    Coles, Henry C.; Qin, Yong; Price, Phillip N.

    2014-11-01

    This report documents a demonstration that compared the energy consumption and efficiency of a limited sample size of server-type IT equipment from different manufacturers by measuring power at the server power supply power cords. The results are specific to the equipment and methods used. However, it is hoped that those responsible for IT equipment selection can use the methods described to choose models that optimize energy use efficiency. The demonstration was conducted in a data center at Lawrence Berkeley National Laboratory in Berkeley, California. It was performed with five servers of similar mechanical and electronic specifications: three from Intel and one each from Dell and Supermicro. Server IT equipment is constructed using commodity components, server manufacturer-designed assemblies, and control systems. Server compute efficiency is constrained by the commodity component specifications and integration requirements. The design freedom, outside of the commodity component constraints, provides room for the manufacturer to offer a product with competitive efficiency that meets market needs at a compelling price. A goal of the demonstration was to compare and quantify the server efficiency for three different brands. The efficiency is defined as the average compute rate (computations per unit of time) divided by the average energy consumption rate. The research team used an industry standard benchmark software package to provide a repeatable software load to obtain the compute rate and provide a variety of power consumption levels. Energy use when the servers were in an idle state (not providing computing work) was also measured. At high server compute loads, all brands, using the same key components (processors and memory), had similar results; therefore, from these results, it could not be concluded that one brand is more efficient than the other brands. The test results show that the power consumption variability caused by the key components as a
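
    As a toy illustration of the efficiency metric defined above (average compute rate divided by average energy consumption rate), the following Python snippet uses assumed numbers, not measurements from the demonstration.

      # Assumed benchmark figures for illustration only.
      benchmark_ops = 1.2e12                 # operations completed during the run
      run_seconds = 3600.0                   # benchmark duration in seconds
      avg_power_watts = 900.0                # average power draw at the supply cords

      compute_rate = benchmark_ops / run_seconds      # operations per second
      energy_rate = avg_power_watts                   # joules per second
      efficiency = compute_rate / energy_rate         # operations per joule
      print(f"{efficiency:.3e} operations per joule")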

  17. The HydroServer Platform for Sharing Hydrologic Data

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Horsburgh, J. S.; Schreuders, K.; Maidment, D. R.; Zaslavsky, I.; Valentine, D. W.

    2010-12-01

    The CUAHSI Hydrologic Information System (HIS) is an Internet-based system that supports sharing of hydrologic data. HIS consists of databases connected using the Internet through Web services, as well as software for data discovery, access, and publication. The HIS system architecture comprises servers for publishing and sharing data, a centralized catalog to support cross-server data discovery and a desktop client to access and analyze data. This paper focuses on HydroServer, the component developed for sharing and publishing space-time hydrologic datasets. A HydroServer is a computer server that contains a collection of databases, web services, tools, and software applications that allow data producers to store, publish, and manage the data from an experimental watershed or project site. HydroServer is designed to permit publication of data as part of a distributed national/international system, while still locally managing access to the data. We describe the HydroServer architecture and software stack, including tools for managing and publishing time series data for fixed point monitoring sites as well as spatially distributed, GIS datasets that describe a particular study area, watershed, or region. HydroServer adopts a standards-based approach to data publication, relying on accepted and emerging standards for data storage and transfer. CUAHSI-developed HydroServer code is free, with community code development managed through the CodePlex open-source code repository and development system. There is some reliance on widely used commercial software for general purpose and standard data publication capability. The sharing of data in a common format is one way to stimulate interdisciplinary research and collaboration. It is anticipated that the growing, distributed network of HydroServers will facilitate cross-site comparisons and large scale studies that synthesize information from diverse settings, making the network as a whole greater than the sum of its

  18. EarthServer: an Intercontinental Collaboration on Petascale Datacubes

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Rossi, A. P.

    2015-12-01

    With the unprecedented increase of orbital sensor, in-situ measurement, and simulation data there is a rich, yet not leveraged potential for getting insights from dissecting datasets and rejoining them with other datasets. Obviously, the goal is to allow users to "ask any question, any time" thereby enabling them to "build their own product on the go". One of the most influential initiatives in Big Geo Data is EarthServer which has demonstrated new directions for flexible, scalable EO services based on innovative NewSQL technology. Researchers from Europe, the US and recently Australia have teamed up to rigorously materialize the concept of the datacube. Such a datacube may have spatial and temporal dimensions (such as a satellite image time series) and may unite an unlimited number of scenes. Independently from whatever efficient data structuring a server network may perform internally, users will always see just a few datacubes they can slice and dice. EarthServer has established client and server technology for such spatio-temporal datacubes. The underlying scalable array engine, rasdaman, enables direct interaction, including 3-D visualization, what-if scenarios, common EO data processing, and general analytics. Services exclusively rely on the open OGC "Big Geo Data" standards suite, the Web Coverage Service (WCS) including the Web Coverage Processing Service (WCPS). Conversely, EarthServer has significantly shaped and advanced the OGC Big Geo Data standards landscape based on the experience gained. Phase 1 of EarthServer has advanced scalable array database technology into 100+ TB services; in phase 2, Petabyte datacubes will be built in Europe and Australia to perform ad-hoc querying and merging. Standing between EarthServer phase 1 (from 2011 through 2014) and phase 2 (from 2015 through 2018) we present the main results and outline the impact on the international standards landscape; effectively, the Big Geo Data standards established through initiative of

  19. Recent improvements in the NASA technical report server

    NASA Technical Reports Server (NTRS)

    Maa, Ming-Hokng; Nelson, Michael L.

    1995-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web (WWW) report distribution service, has been modified to allow parallel database queries, significantly decreasing user access time by an average factor of 2.3, access from clients behind firewalls and/or proxies which truncate excessively long Uniform Resource Locators (URL's), access to non-Wide Area Information Server (WAIS) databases, and compatibility with the Z39.50 protocol.
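
    The parallel-query change can be pictured with a small Python sketch; the database names and query function below are hypothetical stand-ins, not the actual NTRS back ends.

      from concurrent.futures import ThreadPoolExecutor
      import time

      DATABASES = ["center_a", "center_b", "center_c", "center_d"]   # hypothetical

      def query(db, term):
          time.sleep(0.5)            # stand-in for a remote report-database query
          return f"{db}: results for '{term}'"

      def parallel_search(term):
          # Querying all databases concurrently instead of serially is what
          # reduces the end-to-end access time.
          with ThreadPoolExecutor(max_workers=len(DATABASES)) as pool:
              return list(pool.map(lambda db: query(db, term), DATABASES))

      print(parallel_search("wind tunnel"))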

  20. Reading with optical magnifiers: page navigation strategies and difficulties

    PubMed Central

    Bowers, Alex; Cheong, Allen MY; Lovie-Kitchin, Jan E.

    2006-01-01

    Purpose To read efficiently with a simple hand or stand magnifier, people with visual impairment have to move (navigate) the device along each line (forward phase) and back to the correct position at the start of the next line (retrace phase). Page navigation difficulties have been implicated as limiting factors when reading with hand and stand magnifiers, but have not been objectively measured. Methods Magnifier movements were recorded using a 3SPACE Isotrak system for 43 participants with age-related macular degeneration (AMD) who read two short stories using their habitual hand or stand magnifier. Page navigation was quantified in terms of magnifier movements and navigation errors for the forward and retrace phases. Visual acuities and visual fields were measured, and magnifier usage and page navigation difficulties were surveyed. Results During the forward phase participants primarily used either a straight (47%) or diagonal downward (46%) movement, whereas during the retrace phase the majority (56%) used a downward movement. On average, forward navigation time was four times longer than retrace navigation time (p < 0.001). The most common navigation error was incorrect positioning of the magnifier at the end of the retrace movement. Near word acuity correlated strongly with forward time (r = 0.78), and moderately with retrace time (r = 0.53) and forward errors (r = 0.50). Vertical field of view correlated with retrace errors (r = −0.53). Participants’ estimates of page navigation difficulties were not predictive of objective measures of performance. Conclusions We quantified page navigation strategies and difficulties of people with AMD reading with magnifiers. Retrace, which presents the most common difficulty, is not well predicted by vision measures or magnifier characteristics; future studies should investigate the relationship between motor skills and navigation performance, and the impact of training or devices on reducing retrace navigation

  1. A tactile paging system for deaf-blind people, phase 1. [human factors engineering of bioinstrumentation

    NASA Technical Reports Server (NTRS)

    Baer, J. A.

    1976-01-01

    A tactile paging system for deaf-blind people has been brought from the concept stage to the development of a first model. The model consists of a central station that transmits coded information via radio link to an on-body (i.e., worn on the wrist) receiving unit, the output from which is a coded vibrotactile signal. The model is a combination of commercially available equipment, customized electronic circuits, and electromechanical transducers. The paging system facilitates communication to deaf-blind clients in an institutional environment as an aid in their training and other activities. Several subunits of the system were individually developed, tested, and integrated into an operating system ready for experimentation and evaluation. The operation and characteristics of the system are described and photographs are shown.

  2. The Most Popular Astronomical Web Server in China

    NASA Astrophysics Data System (ADS)

    Cui, Chenzhou; Zhao, Yongheng

    Affected by the persistent downturn in the IT economy, free homepage space is becoming scarcer, and it is increasingly difficult for amateur astronomers who cannot pay for commercial hosting to build websites. Last May, with the support of the Chinese National Astronomical Observatory and the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) project, we set up a dedicated web server (amateur.lamost.org) to provide free, ample, stable, advertisement-free homepage space to Chinese amateur astronomers and non-professional organizations. After only one year, more than 80 websites are hosted on the server. More than 10,000 visitors from nearly 40 countries visit the server, and the data they download exceed 4 gigabytes per day. The server has become the most popular amateur astronomical web server in China and holds the richest collection of Chinese amateur astronomical resources. Because of this success, the service has drawn considerable attention from related institutions, and the Chinese National Natural Science Foundation has recently shown strong interest in supporting it. This paper introduces the motivation for and construction of the server, its present utilization, and our future plans.

  3. Library Home Page Design: A Comparison of Page Layout for Front-Ends to ARL Library Web Sites.

    ERIC Educational Resources Information Center

    King, David L.

    1998-01-01

    Describes a study that examined the home pages of all 120 libraries in the Association of Research Libraries (ARL) to compare design similarities and differences. Highlights include background; document headers and footers; graphics; hypertext links; and unlinked text. (Author/LRW)

  4. Web Server Security on Open Source Environments

    NASA Astrophysics Data System (ADS)

    Gkoutzelis, Dimitrios X.; Sardis, Manolis S.

    Administering critical resources has never been more difficult than it is today. In a changing world of software innovation where major changes occur on a daily basis, it is crucial for webmasters and server administrators to shield their data against an unknown arsenal of attacks. Until now this kind of defense was a privilege of the few; under-budgeted, low-cost solutions left defenders vulnerable to the rise of ever more innovative attack methods. Luckily, the digital revolution of the past decade left its mark, changing the way we approach security: open source infrastructure today covers all the prerequisites for a secure web environment in a way we could not have imagined fifteen years ago. The online security of large corporations, military and government bodies is increasingly handled by open source applications, driving the technological trend of the 21st century toward adopting open solutions to e-commerce and privacy issues. This paper describes substantial security precautions for facing privacy and authentication issues in a fully open source web environment. Our goal is to state the best-known problems in data handling and to propose the most appealing techniques for facing these challenges through an open solution.

  5. CRONOS: the cross-reference navigation server

    PubMed Central

    Waegele, Brigitte; Dunger-Kaltenbach, Irmtraud; Fobo, Gisela; Montrone, Corinna; Mewes, H.-Werner; Ruepp, Andreas

    2009-01-01

    Summary: Cross-mapping of gene and protein identifiers between different databases is a tedious and time-consuming task. To overcome this, we developed CRONOS, a cross-reference server that contains entries from five mammalian organisms presented by major gene and protein information resources. Sequence similarity analysis of the mapped entries shows that the cross-references are highly accurate. In total, up to 18 different identifier types can be used for identification of cross-references. The quality of the mapping could be improved substantially by exclusion of ambiguous gene and protein names which were manually validated. Organism-specific lists of ambiguous terms, which are valuable for a variety of bioinformatics applications like text mining are available for download. Availability: CRONOS is freely available to non-commercial users at http://mips.gsf.de/genre/proj/cronos/index.html, web services are available at http://mips.gsf.de/CronosWSService/CronosWS?wsdl. Contact: brigitte.waegele@helmholtz-muenchen.de Supplementary information: Supplementary data are available at Bioinformatics online. The online Supplementary Material contains all figures and tables referenced by this article. PMID:19010804

  6. Anonymization server system for DICOM images

    NASA Astrophysics Data System (ADS)

    Suzuki, H.; Amano, M.; Kubo, M.; Kawata, Y.; Niki, N.; Nishitani, H.

    2007-03-01

    We have developed an anonymization system for DICOM images. Patient consent is required to use DICOM images for research or education; however, providing DICOM images to other facilities is not safe because they contain a great deal of personal data. Our system is a server that provides an anonymization service for DICOM images to users within a facility. The distinctive features of the system are its input interface, flexible anonymization policy, and automatic body-part identification. With the first feature, the anonymization service can be used from existing DICOM workstations. With the second, the policy that best fits the personal-data protection rules of each medical facility can be selected. With the third, the body parts included in the input image set can be identified even if the set lacks the body-part tag in the DICOM header. We first installed the system in a hospital in December 2005, and it is currently working in four other facilities. In this paper we describe the system and how it works.
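
    A minimal sketch of this kind of tag-level de-identification, written with the pydicom library; the tag list and file names are simplified assumptions, not the facility policies described above.

      import pydicom

      REMOVE = ["PatientName", "PatientID", "PatientBirthDate", "PatientAddress"]

      def anonymize(path_in, path_out):
          ds = pydicom.dcmread(path_in)
          for keyword in REMOVE:
              if keyword in ds:          # blank the element if it is present
                  setattr(ds, keyword, "")
          ds.remove_private_tags()       # drop vendor-specific private elements
          ds.save_as(path_out)

      # Example usage: anonymize("input.dcm", "anon.dcm")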

  7. ConoServer: updated content, knowledge, and discovery tools in the conopeptide database

    PubMed Central

    Kaas, Quentin; Yu, Rilei; Jin, Ai-Hua; Dutertre, Sébastien; Craik, David J.

    2012-01-01

    ConoServer (http://www.conoserver.org) is a database specializing in the sequences and structures of conopeptides, which are toxins expressed by marine cone snails. Cone snails are carnivorous gastropods, which hunt their prey using a cocktail of toxins that potently subvert nervous system function. The ability of these toxins to specifically target receptors, channels and transporters of the nervous system has attracted considerable interest for their use in physiological research and as drug leads. Since the founding publication on ConoServer in 2008, the number of entries in the database has nearly doubled, the interface has been redesigned and new annotations have been added, including a more detailed description of cone snail species, biological activity measurements and information regarding the identification of each sequence. Automatically updated statistics on classification schemes, three-dimensional structures, conopeptide-bearing species and endoplasmic reticulum signal sequence conservation trends provide a convenient overview of current knowledge on conopeptides. Transcriptomics and proteomics have begun generating massive numbers of new conopeptide sequences, and two dedicated tools have been recently implemented in ConoServer to standardize the analysis of conopeptide precursor sequences and to help in the identification by mass spectrometry of toxins whose sequences were predicted at the nucleic acid level. PMID:22058133

  8. MO/DSD online information server and global information repository access

    NASA Technical Reports Server (NTRS)

    Nguyen, Diem; Ghaffarian, Kam; Hogie, Keith; Mackey, William

    1994-01-01

    Often in the past, standards and new technology information have been available only in hardcopy form, with reproduction and mailing costs proving rather significant. In light of NASA's current budget constraints and in the interest of efficient communications, the Mission Operations and Data Systems Directorate (MO&DSD) New Technology and Data Standards Office recognizes the need for an online information server (OLIS). This server would allow: (1) dissemination of standards and new technology information throughout the Directorate more quickly and economically; (2) online browsing and retrieval of documents that have been published for and by MO&DSD; and (3) searching for current and past study activities on related topics within NASA before issuing a task. This paper explores a variety of available information servers and searching tools, their current capabilities and limitations, and the application of these tools to MO&DSD. Most importantly, the discussion focuses on the way this concept could be easily applied toward improving dissemination of standards and new technologies and improving documentation processes.

  9. AVCpred: an integrated web server for prediction and design of antiviral compounds.

    PubMed

    Qureshi, Abid; Kaur, Gazaldeep; Kumar, Manoj

    2017-01-01

    Viral infections constantly jeopardize the global public health due to lack of effective antiviral therapeutics. Therefore, there is an imperative need to speed up the drug discovery process to identify novel and efficient drug candidates. In this study, we have developed quantitative structure-activity relationship (QSAR)-based models for predicting antiviral compounds (AVCs) against deadly viruses like human immunodeficiency virus (HIV), hepatitis C virus (HCV), hepatitis B virus (HBV), human herpesvirus (HHV) and 26 others using publicly available experimental data from the ChEMBL bioactivity database. Support vector machine (SVM) models achieved maximum Pearson correlation coefficients of 0.72, 0.74, 0.66, 0.68, and 0.71 in regression mode and maximum Matthews correlation coefficients of 0.91, 0.93, 0.70, 0.89, and 0.71, respectively, in classification mode during 10-fold cross-validation. Furthermore, similar performance was observed on the independent validation sets. We have integrated these models in the AVCpred web server, freely available at http://crdd.osdd.net/servers/avcpred. In addition, the datasets are provided in a searchable format. We hope this web server will assist researchers in the identification of potential antiviral agents. It would also save time and cost by prioritizing new drugs against viruses before their synthesis and experimental testing.
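
    A generic sketch of the modelling setup described above (SVM with 10-fold cross-validation) using scikit-learn; the descriptor matrix and activity values below are random placeholders, not ChEMBL data.

      import numpy as np
      from sklearn.svm import SVR, SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 50))        # placeholder molecular descriptors
      y_reg = rng.normal(size=200)          # placeholder activity values
      y_cls = (y_reg > 0).astype(int)       # placeholder active/inactive labels

      r2 = cross_val_score(SVR(kernel="rbf"), X, y_reg, cv=10, scoring="r2")
      mcc = cross_val_score(SVC(kernel="rbf"), X, y_cls, cv=10,
                            scoring="matthews_corrcoef")
      print(f"mean R^2 = {r2.mean():.2f}, mean MCC = {mcc.mean():.2f}")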

  10. Japan Data Exchange Network JDXnet and Cloud-type Data Relay Server for Earthquake Observation Data

    NASA Astrophysics Data System (ADS)

    Takano, K.; Urabe, T.; Tsuruoka, H.; Nakagawa, S.

    2015-12-01

    In Japan, high-sensitivity and broadband seismic observation are carried out by several organizations, such as the Japan Meteorological Agency (JMA), the National Research Institute for Earth Science and Disaster Prevention (NIED), nine national universities, and the Japan Agency for Marine-Earth Science and Technology (JAMSTEC), among others. The total number of observation stations is about 1,400. The total volume of seismic waveform data collected from all these stations is about 1 MByte per second (about 8 to 10 Mbps) using the WIN system (Urabe, 1991). JDXnet is the Japan Data eXchange network for earthquake observation data. It was started in 2007 through the cooperation of researchers at each organization, and all seismic waveform data are available to all organizations in real time. The core of JDXnet is broadcast-type real-time data exchange over the nationwide L2-VPN service offered by JGN-X of NICT and SINET4 of NII. Before the Tohoku earthquake, the nine national universities collected seismic data at their own data centers and then exchanged them with other universities and institutions through JDXnet. In this arrangement, if a university's data center stopped, none of that university's data could be used even though some of its observation stations remained alive. To address this problem, we have prepared a data relay server in the SINET4 data center, i.e., the cloud center. This relay server collects data directly from the universities' observation stations and delivers them to all universities and institutions over JDXnet. With the relay server in the cloud center, even if some universities are hit by a large disaster, data from the surviving stations are no longer lost. If researchers set up seismometers and send data to the relay server, the data become available to all researchers. This mechanism promotes joint use of seismometers and joint research activities among researchers nationwide.

  11. Google's Web Page Ranking Applied to Different Topological Web Graph Structures.

    ERIC Educational Resources Information Center

    Meghabghab, George

    2001-01-01

    This research, part of the ongoing study to better understand Web page ranking on the Web, looks at a Web page as a graph structure or Web graph, and classifies different Web graphs in the new coordinate space (out-degree, in-degree). Google's Web ranking algorithm (Brin & Page, 1998) for ranking Web pages is applied in this new coordinate…
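
    The (out-degree, in-degree) coordinate space mentioned above can be computed directly from a link structure; a small Python sketch with an invented graph follows.

      # Invented link structure: page -> list of pages it links to.
      links = {"p1": ["p2", "p3"], "p2": ["p3"], "p3": ["p1"], "p4": ["p1", "p3"]}

      in_degree = {n: 0 for n in links}
      for outs in links.values():
          for dst in outs:
              in_degree[dst] = in_degree.get(dst, 0) + 1

      coords = {n: (len(outs), in_degree.get(n, 0)) for n, outs in links.items()}
      print(coords)   # page -> (out-degree, in-degree)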

  12. Building Interactive Simulations in Web Pages without Programming.

    PubMed

    Mailen Kootsey, J; McAuley, Grant; Bernal, Julie

    2005-01-01

    A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation so that a library of reusable objects could be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications are possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network.

  13. Evaluation of literacy level of patient education pages in health-related journals.

    PubMed

    Cotugna, Nancy; Vickery, Connie E; Carpenter-Haefele, Kara M

    2005-06-01

    The purpose of this study was to evaluate the reading level of patient education material from selected current health care journals. Ten patient education pages from a variety of health care journals were entered into a Microsoft Word program. Applying the Flesch-Kincaid readability formula available from Microsoft Word, a reading level for each page was established and compared to recommended standards. Only 2 of 10 patient education pages fell within the recommended reading levels for health-related materials, and 5 of 10 were above the estimated mean U.S. reading level of 8th grade. A 5th to 6th grade level is recommended for patient education materials. This study suggests that although it is known that low health literacy is a widespread problem, it is not always considered when patient-targeted materials are developed. Health care professionals need to become more active in addressing the literacy needs of the intended receiver of written health-related information.
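
    For context, the Flesch-Kincaid grade level used in the study is a published formula over word, sentence, and syllable counts; a rough Python sketch follows (the syllable counter here is a crude heuristic, unlike Word's implementation).

      import re

      def count_syllables(word):
          # crude heuristic: count runs of vowels
          return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

      def fk_grade(text):
          sentences = max(1, len(re.findall(r"[.!?]+", text)))
          words = re.findall(r"[A-Za-z']+", text)
          syllables = sum(count_syllables(w) for w in words)
          return (0.39 * len(words) / sentences
                  + 11.8 * syllables / len(words) - 15.59)

      sample = "Take your medicine with food. Call your doctor if the pain persists."
      print(f"Estimated grade level: {fk_grade(sample):.1f}")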

  14. Desialylation improves the detection of recombinant erythropoietins in urine samples analyzed by SDS-PAGE.

    PubMed

    Desharnais, Philippe; Naud, Jean-François; Ayotte, Christiane

    2013-01-01

    Recombinant erythropoietin (rhEPO) has been misused for over two decades by athletes, mainly but not only in endurance sports. A direct rhEPO detection method in urine by isoelectric focusing (IEF) was introduced in 2000, but the emergence of third-generation erythropoiesis-stimulating agents and so-called biosimilar rhEPOs, together with the sensitivity of human endogenous EPO (huEPO) pattern to enzymatic activities and its modification following short strenuous exercise, prompted the development of a complementary test based on SDS-PAGE analysis. While Mircera and NESP are easily detected with the existing IEF and SDS-PAGE methods, some samples containing both epoetin-α/β and huEPO present profiles that are still difficult to interpret. As doping practices have moved to micro-dosing, these mixed patterns are more frequently observed. We investigated the impact of enzymatic desialylation on the urinary and serum EPO profiles obtained by SDS-PAGE with the aim of improving the separation of the bands in these mixed EPO populations. We observed that the removal with neuraminidase of the sialic acid moieties from the different EPOs studied reduced their apparent molecular weight (MW) and increased the migration distance between huEPO and rhEPO centroids, therefore eliminating the size overlaps between them and improving the detection of rhEPO.

  15. Hawking-Page phase transition on the brane

    SciTech Connect

    Chamblin, A.; Karch, A.

    2005-09-15

    We show that the Hawking-Page phase transition of a conformal field theory on AdS_{d-1} weakly coupled to gravity has a dual bulk description in terms of a phase transition between a black string and a thermal gas on AdS_d. At even lower temperatures the black string develops a Gregory-Laflamme instability, which is dual to black hole evaporation in the boundary theory.

  16. Server-side Filtering and Aggregation within a Distributed Environment

    NASA Astrophysics Data System (ADS)

    Currey, J. C.; Bartle, A.

    2015-12-01

    Intercalibration, validation, and data mining use cases require more efficient access to the massive volumes of observation data distributed across multiple agency data centers. The traditional paradigm of downloading large volumes of data to a centralized server or desktop computer for analysis is no longer viable. More analysis should be performed within the host data centers using server-side functions. Many comparative analysis tasks require far less than 1% of the available observation data. The Multi-Instrument Intercalibration (MIIC) Framework provides web services to find, match, filter, and aggregate multi-instrument observation data. Matching measurements from separate spacecraft in time, location, wavelength, and viewing geometry is a difficult task especially when data are distributed across multiple agency data centers. Event prediction services identify near coincident measurements with matched viewing geometries near orbit crossings using complex orbit propagation and spherical geometry calculations. The number and duration of event opportunities depend on orbit inclinations, altitude differences, and requested viewing conditions (e.g., day/night). Event observation information is passed to remote server-side functions to retrieve matched data. Data may be gridded, spatially convolved onto instantaneous field-of-views, or spectrally resampled or convolved. Narrowband instruments are routinely compared to hyperspectral instruments such as AIRS and CrIS using relative spectral response (RSR) functions. Spectral convolution within server-side functions significantly reduces the amount of hyperspectral data needed by the client. This combination of intelligent selection and server-side processing significantly reduces network traffic and data to process on local servers. OPeNDAP is a mature networking middleware already deployed at many of the Earth science data centers. Custom OPeNDAP server-side functions that provide filtering, histogram analysis (1D
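
    The spectral-convolution step described above (collapsing a hyperspectral spectrum onto a narrowband channel with a relative spectral response) can be sketched in a few lines of Python; all wavelengths and response shapes below are synthetic placeholders, not MIIC data.

      import numpy as np

      wavelength = np.linspace(0.4, 2.5, 500)                        # micrometres
      radiance = 100.0 * np.exp(-((wavelength - 1.6) / 0.5) ** 2)    # fake spectrum

      # Gaussian RSR for a hypothetical narrowband channel centred at 1.61 um
      rsr = np.exp(-((wavelength - 1.61) / 0.02) ** 2)

      band_radiance = np.trapz(radiance * rsr, wavelength) / np.trapz(rsr, wavelength)
      print(f"band-equivalent radiance: {band_radiance:.2f}")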

  17. Adaptation of web pages and images for mobile applications

    NASA Astrophysics Data System (ADS)

    Kopf, Stephan; Guthier, Benjamin; Lemelson, Hendrik; Effelsberg, Wolfgang

    2009-02-01

    In this paper, we introduce our new visualization service which presents web pages and images on arbitrary devices with differing display resolutions. We analyze the layout of a web page and simplify its structure and formatting rules. The small screen of a mobile device is used much better this way. Our new image adaptation service combines several techniques. In a first step, border regions which do not contain relevant semantic content are identified. Cropping is used to remove these regions. Attention objects are identified in a second step. We use face detection, text detection and contrast based saliency maps to identify these objects and combine them into a region of interest. Optionally, the seam carving technique can be used to remove inner parts of an image. Additionally, we have developed a software tool to validate, add, delete, or modify all automatically extracted data. This tool also simulates different mobile devices, so that the user gets a feeling of what an adapted web page will look like. We have performed user studies to evaluate our web and image adaptation approach. Questions regarding software ergonomics, quality of the adapted content, and perceived benefit of the adaptation were asked.
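
    One step of the pipeline above, merging detected attention objects into a single region of interest before cropping, can be sketched as follows; the bounding boxes are hypothetical detections, not output of the face or text detectors described in the paper.

      # Boxes are (x0, y0, x1, y1) in pixel coordinates.
      def merge_boxes(boxes):
          return (min(b[0] for b in boxes), min(b[1] for b in boxes),
                  max(b[2] for b in boxes), max(b[3] for b in boxes))

      attention_objects = [(40, 60, 180, 200),    # e.g. a detected face
                           (150, 220, 400, 260)]  # e.g. a detected text line
      roi = merge_boxes(attention_objects)
      print("crop image to region of interest:", roi)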

  18. The Leggett-Garg inequality and Page-Wootters mechanism

    NASA Astrophysics Data System (ADS)

    Gangopadhyay, D.; Sinha Roy, A.

    2016-11-01

    Violation of the Leggett-Garg inequality (LGI) implies quantum phenomena. In this light we establish that Moreva et al.'s (Phys. Rev. A, 89 (2014) 052122) experiment demonstrating the Page-Wootters mechanism (Page D. N. and Wootters W. K., Phys. Rev. D, 27 (1983) 2885; Wootters W. K., Int. J. Theor. Phys., 23 (1984) 701) falls in the quantum domain. An observer outside a two-photon world does not detect any change in the two-photon state, i.e., there is no time parameter for the outside observer. But an observer attached to one of the photons sees the other photon evolving, which means that there is an “internal” time. The LGI is violated for the clock photon, whose state evolves with the internal time as measured by the system photon. Conditional probabilities in this two-photon system are computed for both sharp and unsharp measurements. The conditional probability increases for entangled states, as obtained by Page and Wootters, for both ideal and unsharp measurements.
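
    For reference, the simplest three-time Leggett-Garg inequality (assumed here to be the form at issue; the paper's exact correlators are not reproduced) reads, in LaTeX:

      % K is built from two-time correlators of a dichotomic observable Q(t).
      \[
        K \;=\; C_{12} + C_{23} - C_{13} \;\le\; 1,
        \qquad
        C_{ij} \;=\; \langle Q(t_i)\, Q(t_j) \rangle ,
      \]
      % and a measured value K > 1 signals quantum (non-macrorealist) behaviour.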

  19. Miniaturized Airborne Imaging Central Server System

    NASA Technical Reports Server (NTRS)

    Sun, Xiuhong

    2011-01-01

    In recent years, some remote-sensing applications have required advanced airborne multi-sensor systems to provide high performance reflective and emissive spectral imaging measurement rapidly over large areas. The key or unique problem is associated with a black box back-end system that operates a suite of cutting-edge imaging sensors to collect simultaneously the high throughput reflective and emissive spectral imaging data with precision georeference. This back-end system needs to be portable, easy-to-use, and reliable with advanced onboard processing. The innovation of the black box backend is a miniaturized airborne imaging central server system (MAICSS). MAICSS integrates a complex embedded system of systems with dedicated power and signal electronic circuits inside to serve a suite of configurable cutting-edge electro-optical (EO), long-wave infrared (LWIR), and medium-wave infrared (MWIR) cameras, a hyperspectral imaging scanner, and a GPS and inertial measurement unit (IMU) for atmospheric and surface remote sensing. Its compatible sensor packages include NASA's 1,024 x 1,024 pixel LWIR quantum well infrared photodetector (QWIP) imager; a 60.5 megapixel BuckEye EO camera; and a fast (e.g. 200+ scanlines/s) and wide swath-width (e.g., 1,920+ pixels) CCD/InGaAs imager-based visible/near infrared reflectance (VNIR) and shortwave infrared (SWIR) imaging spectrometer. MAICSS records continuous precision georeferenced and time-tagged multisensor throughputs to mass storage devices at a high aggregate rate, typically 60 MB/s for its LWIR/EO payload. MAICSS is a complete stand-alone imaging server instrument with an easy-to-use software package for either autonomous data collection or interactive airborne operation. Advanced multisensor data acquisition and onboard processing software features have been implemented for MAICSS. With the onboard processing for real time image development, correction, histogram-equalization, compression, georeference, and

  20. Blue native-PAGE analysis of Trichoderma harzianum secretome reveals cellulases and hemicellulases working as multienzymatic complexes.

    PubMed

    da Silva, Adelson Joel; Gómez-Mendoza, Diana Paola; Junqueira, Magno; Domont, Gilberto Barbosa; Ximenes Ferreira Filho, Edivaldo; de Sousa, Marcelo Valle; Ricart, Carlos André Ornelas

    2012-08-01

    Plant cell wall-degrading enzymes produced by microorganisms possess important biotechnological applications, including biofuel production. Some anaerobic bacteria are able to produce multienzymatic complexes called cellulosomes while filamentous fungi normally secrete individual hydrolytic enzymes that act synergistically for polysaccharide degradation. Here, we present evidence that the fungus Trichoderma harzianum, cultivated in medium containing the agricultural residue sugarcane bagasse, is able to secrete multienzymatic complexes. The T. harzianum secretome was firstly analyzed by 1D-BN (blue native)-PAGE that revealed several putative complexes. The three most intense 1D-BN-PAGE bands, named complexes [I], [II], and [III], were subsequently subjected to tricine SDS-PAGE that demonstrated that they were composed of smaller subunits. Zymographic assays were performed using 1D-BN-PAGE and 2D-BN/BN-PAGE demonstrating that the complexes bore cellulolytic and xylanolytic activities. The complexes [I], [II], and [III] were then trypsin digested and analyzed separately by LC-MS/MS that revealed their protein composition. Since T. harzianum has an unsequenced genome, a homology-driven proteomics approach provided a higher number of identified proteins than a conventional peptide-spectrum matching strategy. The results indicate that the complexes are formed by cellulolytic and hemicellulolytic enzymes and other proteins such as chitinase, cutinase, and swollenin, which may act synergistically to degrade plant cell wall components.

  1. Advancing the Power and Utility of Server-Side Aggregation

    NASA Technical Reports Server (NTRS)

    Fulker, Dave; Gallagher, James

    2016-01-01

    During the upcoming Summer 2016 meeting of the ESIP Federation (July 19-22), OPeNDAP will hold a Developers and Users Workshop. While a broad set of topics will be covered, a key focus is capitalizing on recent EOSDIS-sponsored advances in Hyrax, OPeNDAP's own software for server-side realization of the DAP2 and DAP4 protocols. These Hyrax advances are as important to data users as to data providers, and the workshop will include hands-on experiences of value to both. Specifically, a balanced set of presentations and hands-on tutorials will address advances in (1) server installation, (2) server configuration, (3) Hyrax aggregation capabilities, (4) support for data access from clients that are HTTP-based, JSON-based or OGC-compliant (especially WCS and WMS), (5) support for DAP4, (6) use and extension of server-side computational capabilities, and (7) several performance-affecting matters. Topics 2 through 7 will be relevant to data consumers, data providers and, notably, due to the open-source nature of all OPeNDAP software, to developers wishing to extend Hyrax, to build compatible clients and servers, and/or to employ Hyrax as middleware that enables interoperability across a variety of end-user and source-data contexts. A session for contributed talks will elaborate the topics listed above and embrace additional ones.
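
    For data consumers, the practical effect of Hyrax's aggregation and server-side processing (topics 3 and 6 above) is that subsetting happens on the server before any bytes move. A minimal client-side sketch using the pydap library follows; the dataset URL and variable name are placeholders, not an actual Hyrax endpoint.

      from pydap.client import open_url

      # Hypothetical Hyrax-served dataset; only metadata is fetched by open_url.
      dataset = open_url("http://example.org/opendap/hyrax/sample.nc")
      sst = dataset["sst"]                   # lazy handle, no data transferred yet
      subset = sst[0:10, 100:200, 100:200]   # slice becomes a DAP constraint sent
                                             # to the server; only the subset returns
      print(subset)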

  2. RosettaAntibody: antibody variable region homology modeling server.

    PubMed

    Sircar, Aroop; Kim, Eric T; Gray, Jeffrey J

    2009-07-01

    The RosettaAntibody server (http://antibody.graylab.jhu.edu) predicts the structure of an antibody variable region given the amino-acid sequences of the respective light and heavy chains. In an initial stage, the server identifies and displays the most sequence homologous template structures for the light and heavy framework regions and each of the complementarity determining region (CDR) loops. Subsequently, the most homologous templates are assembled into a side-chain optimized crude model, and the server returns a picture and coordinate file. For users requesting a high-resolution model, the server executes the full RosettaAntibody protocol which additionally models the hyper-variable CDR H3 loop. The high-resolution protocol also relieves steric clashes by optimizing the CDR backbone torsion angles and by simultaneously perturbing the relative orientation of the light and heavy chains. RosettaAntibody generates 2000 independent structures, and the server returns pictures, coordinate files, and detailed scoring information for the 10 top-scoring models. The 10 models enable users to use rational judgment in choosing the best model or to use the set as an ensemble for further studies such as docking. The high-resolution models generated by RosettaAntibody have been used for the successful prediction of antibody-antigen complex structures.

  3. Design of SIP transformation server for efficient media negotiation

    NASA Astrophysics Data System (ADS)

    Pack, Sangheon; Paik, Eun Kyoung; Choi, Yanghee

    2001-07-01

    Voice over IP (VoIP) is one of the advanced services supported by next-generation mobile communication. VoIP should support the various media formats and terminals that coexist in the network. This heterogeneous environment may prevent diverse users from establishing VoIP sessions among themselves, so an efficient media negotiation mechanism is required. In this paper, we propose an efficient media negotiation architecture using a transformation server and an Intelligent Location Server (ILS). The transformation server is an extended Session Initiation Protocol (SIP) proxy server. It can modify an unacceptable session INVITE message into an acceptable one using the ILS. The ILS is a directory server based on the Lightweight Directory Access Protocol (LDAP) that keeps users' location information and available media information. The proposed architecture eliminates the unnecessary response and re-INVITE messages of the standard SIP architecture. It takes only 1.5 round trip times to negotiate two different media types, while the standard media negotiation mechanism takes 2.5 round trip times. The extra processing time in message handling is negligible in comparison to the reduced round trip time. The experimental results show that the session setup time in the proposed architecture is less than the setup time in standard SIP. These results verify that the proposed media negotiation mechanism is more efficient in solving diversity problems.

  4. APPRIS WebServer and WebServices.

    PubMed

    Rodriguez, Jose Manuel; Carro, Angel; Valencia, Alfonso; Tress, Michael L

    2015-07-01

    This paper introduces the APPRIS WebServer (http://appris.bioinfo.cnio.es) and WebServices (http://apprisws.bioinfo.cnio.es). Both the web servers and the web services are based around the APPRIS Database, a database that presently houses annotations of splice isoforms for five different vertebrate genomes. The APPRIS WebServer and WebServices provide access to the computational methods implemented in the APPRIS Database, while the APPRIS WebServices also allows retrieval of the annotations. The APPRIS WebServer and WebServices annotate splice isoforms with protein structural and functional features, and with data from cross-species alignments. In addition they can use the annotations of structure, function and conservation to select a single reference isoform for each protein-coding gene (the principal protein isoform). APPRIS principal isoforms have been shown to agree overwhelmingly with the main protein isoform detected in proteomics experiments. The APPRIS WebServer allows for the annotation of splice isoforms for individual genes, and provides a range of visual representations and tools to allow researchers to identify the likely effect of splicing events. The APPRIS WebServices permit users to generate annotations automatically in high throughput mode and to interrogate the annotations in the APPRIS Database. The APPRIS WebServices have been implemented using REST architecture to be flexible, modular and automatic.
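    As a hedged sketch of REST-style retrieval of the kind the APPRIS WebServices provide, the snippet below issues an HTTP GET with the Python requests library; the route, gene identifier and response handling are illustrative assumptions rather than the documented APPRIS API.

```python
# Hypothetical sketch of querying a REST-style annotation service such as the
# APPRIS WebServices.  The endpoint path, gene identifier and response fields
# below are illustrative assumptions, not the documented APPRIS API.
import requests

BASE = "http://apprisws.bioinfo.cnio.es"     # server named in the abstract
gene = "ENSG00000139618"                     # example Ensembl gene id (BRCA2)

resp = requests.get(f"{BASE}/rest/exporter/id/homo_sapiens/{gene}",  # assumed route
                    params={"format": "json"}, timeout=30)
resp.raise_for_status()
annotations = resp.json()
# Inspect whatever isoform-level annotations the service returns.
print(type(annotations), len(annotations))
```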

  5. SWISS-MODEL: an automated protein homology-modeling server

    PubMed Central

    Schwede, Torsten; Kopp, Jürgen; Guex, Nicolas; Peitsch, Manuel C.

    2003-01-01

    SWISS-MODEL (http://swissmodel.expasy.org) is a server for automated comparative modeling of three-dimensional (3D) protein structures. It pioneered the field of automated modeling starting in 1993 and is the most widely used free web-based automated modeling facility today. In 2002 the server computed 120 000 user requests for 3D protein models. SWISS-MODEL provides several levels of user interaction through its World Wide Web interface: in the ‘first approach mode’ only an amino acid sequence of a protein is submitted to build a 3D model. Template selection, alignment and model building are performed fully automatically by the server. In the ‘alignment mode’, the modeling process is based on a user-defined target-template alignment. Complex modeling tasks can be handled with the ‘project mode’ using DeepView (Swiss-PdbViewer), an integrated sequence-to-structure workbench. All models are sent back via email with a detailed modeling report. WhatCheck analyses and ANOLEA evaluations are provided optionally. The reliability of SWISS-MODEL is continuously evaluated in the EVA-CM project. The SWISS-MODEL server is under constant development to improve the successful implementation of expert knowledge into an easy-to-use server. PMID:12824332

  6. WebRASP: a server for computing energy scores to assess the accuracy and stability of RNA 3D structures

    PubMed Central

    Norambuena, Tomas; Cares, Jorge F.; Capriotti, Emidio; Melo, Francisco

    2013-01-01

    Summary: The understanding of the biological role of RNA molecules has changed. Although it is widely accepted that RNAs play important regulatory roles without necessarily coding for proteins, the functions of many of these non-coding RNAs are unknown. Thus, determining or modeling the 3D structure of RNA molecules as well as assessing their accuracy and stability has become of great importance for characterizing their functional activity. Here, we introduce a new web application, WebRASP, that uses knowledge-based potentials for scoring RNA structures based on distance-dependent pairwise atomic interactions. This web server allows the users to upload a structure in PDB format, select several options to visualize the structure and calculate the energy profile. The server contains online help, tutorials and links to other related resources. We believe this server will be a useful tool for predicting and assessing the quality of RNA 3D structures. Availability and implementation: The web server is available at http://melolab.org/webrasp. It has been tested on the most popular web browsers and requires Java plugin for Jmol visualization. Contact: fmelo@bio.puc.cl PMID:23929030

  7. Design and Analysis of an Enhanced Patient-Server Mutual Authentication Protocol for Telecare Medical Information System.

    PubMed

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Obaidat, Mohammad S

    2015-11-01

    In order to access a remote medical server, patients generally use a smart card to log in to the server. It has been observed that most user (patient) authentication protocols suffer from smart-card-stolen attacks, meaning that an attacker can mount several common attacks after extracting the smart card information. Recently, Lu et al. proposed a session key agreement protocol between the patient and the remote medical server and claimed that the protocol is secure against the relevant security attacks. However, this paper presents several security attacks on Lu et al.'s protocol, such as an identity trace attack, a new smart card issue attack, a patient impersonation attack and a medical server impersonation attack. In order to fix the mentioned security pitfalls, including the smart-card-stolen attack, this paper proposes an efficient remote mutual authentication protocol using a smart card. We then simulated the proposed protocol using the widely accepted AVISPA simulation tool, whose results confirm that the protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. Moreover, a rigorous security analysis proves that the proposed protocol provides strong protection against the relevant security attacks, including the smart-card-stolen attack. We compare the proposed scheme with several related schemes in terms of computation cost and communication cost as well as security functionalities. It has been observed that the proposed scheme compares favorably with related existing schemes.
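    As a generic illustration of challenge-response mutual authentication (not the protocol proposed in the paper, nor Lu et al.'s scheme), the sketch below has a "smart card" and a "server" that share a long-term secret prove knowledge of it to each other and then derive a session key.

```python
# Generic challenge-response mutual authentication between a "smart card" and a
# "server" that share a long-term secret.  This is NOT the protocol proposed in
# the paper above; it only sketches the basic idea of both sides proving
# knowledge of the secret without ever transmitting it.
import hmac, hashlib, secrets

shared_key = secrets.token_bytes(32)     # provisioned on the card and stored by the server

def prove(key: bytes, challenge: bytes) -> bytes:
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Server authenticates the card with a fresh nonce.
server_nonce = secrets.token_bytes(16)
card_response = prove(shared_key, server_nonce)              # computed on the card
assert hmac.compare_digest(card_response, prove(shared_key, server_nonce))

# Card authenticates the server with its own fresh nonce.
card_nonce = secrets.token_bytes(16)
server_response = prove(shared_key, card_nonce)              # computed by the server
assert hmac.compare_digest(server_response, prove(shared_key, card_nonce))

# Both sides can then derive a session key from the secret and both nonces.
session_key = hashlib.sha256(shared_key + server_nonce + card_nonce).digest()
print(session_key.hex())
```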

  8. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography.

    PubMed

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation in this paper shows that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We prove that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.
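    The elliptic-curve key-agreement primitive that such protocols build on can be sketched with the Python cryptography package as below; this shows only raw ECDH plus key derivation, not the proposed scheme with biometrics, smart cards and multi-server registration.

```python
# Sketch of the elliptic-curve Diffie-Hellman primitive that ECC-based
# key-agreement protocols build on (using the `cryptography` package).
# This is only the raw primitive, not the protocol proposed in the paper.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

user_priv = ec.generate_private_key(ec.SECP256R1())
server_priv = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the other side's public key.
shared_user = user_priv.exchange(ec.ECDH(), server_priv.public_key())
shared_server = server_priv.exchange(ec.ECDH(), user_priv.public_key())
assert shared_user == shared_server

# Derive a fixed-length session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"demo session key").derive(shared_user)
print(session_key.hex())
```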

  9. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography

    PubMed Central

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation in this paper shows that Lu et al.’s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We prove that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.’s protocol and existing similar protocols. PMID:27163786

  10. Performance measurements of single server fuzzy queues with unreliable server using left and right method

    NASA Astrophysics Data System (ADS)

    Mueen, Zeina; Ramli, Razamin; Zaibidi, Nerda Zura

    2015-12-01

    There are a number of real-life systems that can be described as queuing systems, and this paper presents a queuing model applied to a manufacturing system example. The queuing model considered is depicted in a fuzzy environment with retrial queues and an unreliable server. The stability condition of this model is investigated and the performance measures are obtained by adopting the left and right method. The new approach adopted in this study merges existing α-cut interval and nonlinear programming techniques, and a numerical example is used to explain the methodology. From the numerical example, the flexibility of the method is shown graphically in terms of the exact mean number of customers in the system and the expected waiting times.
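    As a stripped-down illustration of the left-and-right (α-cut interval) idea, the sketch below evaluates the mean number of customers in a plain M/M/1 queue, L = ρ/(1 - ρ), at the endpoints of α-cut intervals of triangular fuzzy arrival and service rates. The paper's actual model (retrials, unreliable server) is more involved, and the fuzzy numbers here are made-up example data.

```python
# Left-and-right (alpha-cut interval) sketch on a plain M/M/1 queue, where the
# mean number in system is L = rho / (1 - rho).  The triangular fuzzy numbers
# below are made-up example data, not values from the paper.
def alpha_cut(triangular, alpha):
    """Interval [left, right] of a triangular fuzzy number (a, b, c) at level alpha."""
    a, b, c = triangular
    return a + alpha * (b - a), c - alpha * (c - b)

def mm1_mean_in_system(lam, mu):
    rho = lam / mu
    return rho / (1.0 - rho)          # valid only for rho < 1

fuzzy_lambda = (2.0, 3.0, 4.0)        # example fuzzy arrival rate (per hour)
fuzzy_mu = (5.0, 6.0, 7.0)            # example fuzzy service rate (per hour)

for alpha in (0.0, 0.5, 1.0):
    lam_lo, lam_hi = alpha_cut(fuzzy_lambda, alpha)
    mu_lo, mu_hi = alpha_cut(fuzzy_mu, alpha)
    # L increases with lambda and decreases with mu, so the interval endpoints are:
    left = mm1_mean_in_system(lam_lo, mu_hi)
    right = mm1_mean_in_system(lam_hi, mu_lo)
    print(f"alpha={alpha:.1f}: L in [{left:.3f}, {right:.3f}]")
```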

  11. A client/server database system for project evaluation

    SciTech Connect

    Brule, M.R.; Fair, W.B.; Jiang, J.; Sanvido, R.D.

    1994-12-31

    PETS (Project Evaluation Tool Set) is a networked client/server system that provides a full set of decision-support tools for evaluating the business potential of onshore and offshore development projects. This distributed workgroup computing system combines and streamlines preliminary design, routine cost estimation, economic evaluation, and risk analysis for conceptual developments as well as for ongoing projects and operations. A flexible and extendible client/server integration framework links in-house and third-party software applications with a database and an expert-system knowledgebase, and, where appropriate, links the applications among themselves. The capability and richness of inexpensive commercial operating systems and off-the-shelf applications have made building a client/server system like PETS possible in a relatively short time and at low cost. We will discuss the object-oriented design of the PETS system, detail its capabilities, and outline the methods used to integrate applications from other domains.

  12. Mathematical defense method of networked servers with controlled remote backups

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2006-05-01

    The networked server defense model focuses on reliability and availability from a security standpoint. The (remote) backup servers are connected by a VPN (Virtual Private Network) over a high-speed optical network and replace broken main servers immediately. The networked servers can be represented as "machines", and the system then deals with a main unreliable machine, a spare machine, and auxiliary spare machines. During vacation periods, when the system performs mandatory routine maintenance, auxiliary machines are used for backup; information on the system state is naturally delayed. An analog of the N-policy is used to restrict the usage of auxiliary machines to some reasonable quantity. The results are demonstrated for the network architecture by using stochastic optimization techniques.

  13. LassoProt: server to analyze biopolymers with lassos

    PubMed Central

    Dabrowski-Tumanski, Pawel; Niemyska, Wanda; Pasznik, Pawel; Sulkowska, Joanna I.

    2016-01-01

    The LassoProt server, http://lassoprot.cent.uw.edu.pl/, enables analysis of biopolymers with entangled configurations called lassos. The server offers various ways of visualizing lasso configurations, as well as their time trajectories, with all the results and plots downloadable. Broad spectrum of applications makes LassoProt a useful tool for biologists, biophysicists, chemists, polymer physicists and mathematicians. The server and our methods have been validated on the whole PDB, and the results constitute the database of proteins with complex lassos, supported with basic biological data. This database can serve as a source of information about protein geometry and entanglement-function correlations, as a reference set in protein modeling, and for many other purposes. PMID:27131383

  14. LassoProt: server to analyze biopolymers with lassos.

    PubMed

    Dabrowski-Tumanski, Pawel; Niemyska, Wanda; Pasznik, Pawel; Sulkowska, Joanna I

    2016-07-08

    The LassoProt server, http://lassoprot.cent.uw.edu.pl/, enables analysis of biopolymers with entangled configurations called lassos. The server offers various ways of visualizing lasso configurations, as well as their time trajectories, with all the results and plots downloadable. Broad spectrum of applications makes LassoProt a useful tool for biologists, biophysicists, chemists, polymer physicists and mathematicians. The server and our methods have been validated on the whole PDB, and the results constitute the database of proteins with complex lassos, supported with basic biological data. This database can serve as a source of information about protein geometry and entanglement-function correlations, as a reference set in protein modeling, and for many other purposes.

  15. The State of Energy and Performance Benchmarking for Enterprise Servers

    NASA Astrophysics Data System (ADS)

    Fanara, Andrew; Haines, Evan; Howard, Arthur

    To address the server industry’s marketing focus on performance, benchmarking organizations have played a pivotal role in developing techniques to determine the maximum achievable performance level of a system. Generally missing has been an assessment of energy use to achieve that performance. The connection between performance and energy consumption is becoming necessary information for designers and operators as they grapple with power constraints in the data center. While industry and policy makers continue to strategize about a universal metric to holistically measure IT equipment efficiency, existing server benchmarks for various workloads could provide an interim proxy to assess the relative energy efficiency of general servers. This paper discusses ideal characteristics a future energy-performance benchmark might contain, suggests ways in which current benchmarks might be adapted to provide a transitional step to this end, and notes the need for multiple workloads to provide a holistic proxy for a universal metric.

  16. Server-Controlled Identity-Based Authenticated Key Exchange

    NASA Astrophysics Data System (ADS)

    Guo, Hua; Mu, Yi; Zhang, Xiyong; Li, Zhoujun

    We present a threshold identity-based authenticated key exchange protocol that can be applied to an authenticated server-controlled gateway-user key exchange. The objective is to allow a user and a gateway to establish a shared session key with the permission of the back-end servers, while the back-end servers cannot obtain any information about the established session key. Our protocol has potential applications in strong access control of confidential resources. In particular, our protocol possesses the semantic security and demonstrates several highly-desirable security properties such as key privacy and transparency. We prove the security of the protocol based on the Bilinear Diffie-Hellman assumption in the random oracle model.

  17. primers4clades: a web server that uses phylogenetic trees to design lineage-specific PCR primers for metagenomic and diversity studies.

    PubMed

    Contreras-Moreira, Bruno; Sachman-Ruiz, Bernardo; Figueroa-Palacios, Iraís; Vinuesa, Pablo

    2009-07-01

    Primers4clades is an easy-to-use web server that implements a fully automatic PCR primer design pipeline for cross-species amplification of novel sequences from metagenomic DNA, or from uncharacterized organisms, belonging to user-specified phylogenetic clades or taxa. The server takes a set of non-aligned protein coding genes, with or without introns, aligns them and computes a neighbor-joining tree, which is displayed on screen for easy selection of species or sequence clusters to design lineage-specific PCR primers. Primers4clades implements an extended CODEHOP primer design strategy based on both DNA and protein multiple sequence alignments. It evaluates several thermodynamic properties of the oligonucleotide pairs, and computes the phylogenetic information content of the predicted amplicon sets from Shimodaira-Hasegawa-like branch support values of maximum likelihood phylogenies. A non-redundant set of primer formulations is returned, ranked according to their thermodynamic properties. An amplicon distribution map provides a convenient overview of the coverage of the target locus. Altogether these features greatly help the user in making an informed choice between alternative primer pair formulations. Primers4clades is available at two mirror sites: http://maya.ccg.unam.mx/primers4clades/and http://floresta.eead.csic.es/primers4clades/. Three demo data sets and a comprehensive documentation/tutorial page are provided for easy testing of the server's capabilities and interface.
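    The server evaluates several thermodynamic properties of candidate oligonucleotide pairs; as a small, self-contained illustration of one such property, the sketch below computes nearest-neighbour melting temperatures with Biopython for two made-up primer sequences. This is not primers4clades' own code and the sequences are arbitrary.

```python
# One of the thermodynamic properties a primer-design pipeline reports is the
# primer melting temperature.  This sketch computes a nearest-neighbour Tm with
# Biopython for two made-up primer sequences; it is not the server's own code.
from Bio.SeqUtils import MeltingTemp as mt

forward = "ATGGCTAGCTAGGTCGATCC"   # hypothetical forward primer
reverse = "TTGACCGGTAACGTAGCTAG"   # hypothetical reverse primer

for name, seq in (("forward", forward), ("reverse", reverse)):
    tm = mt.Tm_NN(seq)             # nearest-neighbour model, default salt conditions
    gc = 100 * (seq.count("G") + seq.count("C")) / len(seq)
    print(f"{name} primer: Tm = {tm:.1f} C, GC% = {gc:.0f}")
```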

  18. Children's recognition of advertisements on television and on Web pages.

    PubMed

    Blades, Mark; Oates, Caroline; Li, Shiying

    2013-03-01

    In this paper we consider the issue of advertising to children. Advertising to children raises a number of concerns, in particular the effects of food advertising on children's eating habits. We point out that virtually all the research into children's understanding of advertising has focused on traditional television advertisements, but much marketing aimed at children is now via the Internet and little is known about children's awareness of advertising on the Web. One important component of understanding advertisements is the ability to distinguish advertisements from other messages, and we suggest that young children's ability to recognise advertisements on a Web page is far behind their ability to recognise advertisements on television.

  19. Communication Systems through Artificial Earth Satellites (Selected Pages)

    DTIC Science & Technology

    1987-02-05

    [Garbled OCR excerpt. The recoverable fragment describes a second method of radio communication in which the satellite carries no radio equipment on board and signals are sent from point A, noting that with sufficient antenna gain and sensitive receivers this method can be used in a number of cases.]

  20. NUCLEAR STRUCTURE AND DECAY DATA: INTRODUCTION TO RELEVANT WEB PAGES.

    SciTech Connect

    BURROWS, T.W.; MCLAUGHLIN, P.D.; NICHOLS, A.L.

    2005-04-04

    A brief description is given of the nuclear data centers around the world able to provide access to those databases and programs of highest relevance to nuclear structure and decay data specialists. A number of Web-page addresses are also provided for the reader to inspect and investigate these data and codes for study, evaluation and calculation. These instructions are not meant to be comprehensive, but should provide the reader with a reasonable means of electronic access to the most important data sets and programs.

  1. Business Systems Branch Abilities, Capabilities, and Services Web Page

    NASA Technical Reports Server (NTRS)

    Cortes-Pena, Aida Yoguely

    2009-01-01

    During the INSPIRE summer internship I acted as the Business Systems Branch Capability Owner for the Kennedy Web-based Initiative for Communicating Capabilities System (KWICC), with the responsibility of creating a portal that describes the services provided by this Branch. This project will help others achieve a clear view of the services that the Business Systems Branch provides to NASA and the Kennedy Space Center. After collecting the data through interviews with subject matter experts and the literature in Business World and other web sites, I identified discrepancies, made the necessary corrections to the sites and placed the information from the report into the KWICC web page.

  2. Photocopy of copy of blueprint between pages 151 and 152 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy of copy of blueprint between pages 151 and 152 in completion report and history of the U.S.A. General hospital no. 21, compiled and edited under the direction of Major W.J. Cameron, by T.R. Wieger, Chief Engineer. The report was dated April 1, 1919. A copy of the report (4003/14/29) is at the National Archives, military records (floor 13 W), Washington, D.C. - Fitzsimons General Hospital, Bounded by East Colfax to south, Peoria Street to west, Denver City/County & Adams County Line to north, & U.S. Route 255 to east, Aurora, Adams County, CO

  3. Report: Results of Technical Vulnerability Assessment: EPA’s Directory Service System Authentication and Authorization Servers

    EPA Pesticide Factsheets

    Report #11-P-0597, September 9, 2011. Vulnerability testing of EPA’s directory service system authentication and authorization servers conducted in March 2011 identified authentication and authorization servers with numerous vulnerabilities.

  4. Performance model of the Argonne Voyager multimedia server

    SciTech Connect

    Disz, T.; Olson, R.; Stevens, R.

    1997-07-01

    The Argonne Voyager Multimedia Server is being developed in the Futures Lab of the Mathematics and Computer Science Division at Argonne National Laboratory. As a network-based service for recording and playing multimedia streams, it is important that the Voyager system be capable of sustaining certain minimal levels of performance in order for it to be a viable system. In this article, the authors examine the performance characteristics of the server. As they examine the architecture of the system, they try to determine where bottlenecks lie, show actual vs potential performance, and recommend areas for improvement through custom architectures and system tuning.

  5. EPICS Channel Access Server for LabVIEW

    SciTech Connect

    Zhukov, Alexander P.

    2016-10-01

    It can be challenging to interface National Instruments LabVIEW (http://www.ni.com/labview/) with EPICS (http://www.aps.anl.gov/epics/). Such an interface is required when an instrument control program is developed in LabVIEW but also has to be part of a global control system, a situation that arises frequently at large accelerator facilities. The Channel Access Server is written in LabVIEW, so it works on any hardware/software platform where LabVIEW is available. It provides full server functionality, so any EPICS client can communicate with it.
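    Once the LabVIEW-hosted Channel Access server is running, any standard EPICS client can talk to it; a brief sketch using the pyepics client library is shown below, where the process variable names are assumptions standing in for whatever the LabVIEW program publishes.

```python
# Sketch of a standard EPICS client talking to a Channel Access server, using
# the pyepics library.  The PV names are assumptions standing in for whatever
# the LabVIEW program actually publishes; a reachable CA server is required.
from epics import caget, caput, PV

setpoint = caget("LAB:TEMP:SETPOINT")          # hypothetical PV name
print("current setpoint:", setpoint)

caput("LAB:TEMP:SETPOINT", 25.0, wait=True)    # write a new value and wait for completion

# Subscribe to monitor updates pushed by the server.
readback = PV("LAB:TEMP:READBACK",
              callback=lambda pvname=None, value=None, **kw: print(pvname, "->", value))
```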

  6. Rich-club and page-club coefficients for directed graphs

    NASA Astrophysics Data System (ADS)

    Smilkov, Daniel; Kocarev, Ljupco

    2010-06-01

    Rich-club and page-club coefficients and their null models are introduced for directed graphs. Null models allow for a quantitative discussion of the rich-club and page-club phenomena. These coefficients are computed for four directed real-world networks: the Arxiv High Energy Physics paper citation network, a Web network (released by Google), the citation network among US patents, and the email network of a EU research institution. The results show a high correlation between rich-club and page-club ordering. For the journal paper citation network, we identify both rich-club and page-club ordering, showing that “elite” papers are cited by other “elite” papers. The Google web network shows partial rich-club and page-club ordering up to some point and then a narrow decline of the corresponding normalized coefficients, indicating the lack of rich-club ordering and the lack of page-club ordering, i.e. high in-degree (PageRank) pages purposely avoid sharing links with other high in-degree (PageRank) pages. For the US patents citation network, we identify page-club and rich-club ordering, leading to the conclusion that “elite” patents are cited by other “elite” patents. Finally, for the email communication network we show a lack of both rich-club and page-club ordering. We construct an example of a synthetic network showing page-club ordering and the lack of rich-club ordering.
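    A minimal, unnormalised version of the rich-club coefficient for a directed graph can be computed as sketched below with networkx; the choice of total (in plus out) degree as the richness measure and the random test graph are assumptions, and the null-model normalisation discussed in the paper is omitted.

```python
# Unnormalised rich-club coefficient for a directed graph: among nodes with
# total degree > k, what fraction of the possible directed links is present?
# The null-model normalisation discussed in the paper is omitted here.
import networkx as nx

def rich_club_directed(G: nx.DiGraph, k: int) -> float:
    rich = [n for n in G if G.in_degree(n) + G.out_degree(n) > k]
    n = len(rich)
    if n < 2:
        return float("nan")
    links = sum(1 for u in rich for v in rich if u != v and G.has_edge(u, v))
    return links / (n * (n - 1))           # n(n-1) ordered pairs in a directed graph

G = nx.gnp_random_graph(200, 0.05, directed=True, seed=1)   # synthetic test graph
for k in (10, 15, 20):
    print(k, rich_club_directed(G, k))
```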

  7. Page layout analysis and classification for complex scanned documents

    NASA Astrophysics Data System (ADS)

    Erkilinc, M. Sezer; Jaber, Mustafa; Saber, Eli; Bauer, Peter; Depalov, Dejan

    2011-09-01

    A framework for region/zone classification in color and gray-scale scanned documents is proposed in this paper. The algorithm includes modules for extracting text, photo, and strong edge/line regions. First, a text detection module based on wavelet analysis and the Run Length Encoding (RLE) technique is employed. Local and global energy maps in high frequency bands of the wavelet domain are generated and used as initial text maps. Further analysis using RLE yields a final text map. The second module is developed to detect image/photo and pictorial regions in the input document. A block-based classifier using basis vector projections is employed to identify photo candidate regions. Then, a final photo map is obtained by applying a probabilistic model based on Markov random field (MRF) maximum a posteriori (MAP) optimization with iterated conditional modes (ICM). The final module detects lines and strong edges using the Hough transform and edge-linkage analysis, respectively. The text, photo, and strong edge/line maps are combined to generate a page layout classification of the scanned target document. Experimental results and objective evaluation show that the proposed technique performs very effectively on a variety of simple and complex scanned document types from the MediaTeam Oulu document database. The proposed page layout classifier can be used in systems for efficient document storage, content based document retrieval, optical character recognition, mobile phone imagery, and augmented reality.
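    As an illustration of the third module's line detection step only, the sketch below runs Canny edge detection followed by a probabilistic Hough transform with OpenCV; the input path and threshold values are placeholders rather than the parameters tuned in the paper.

```python
# Line/strong-edge detection only: Canny edges followed by a probabilistic
# Hough transform with OpenCV.  The input path is hypothetical and the
# thresholds are generic defaults, not the values tuned in the paper.
import math
import cv2

page = cv2.imread("scanned_page.png", cv2.IMREAD_GRAYSCALE)   # hypothetical scan
edges = cv2.Canny(page, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=math.pi / 180,
                        threshold=100, minLineLength=200, maxLineGap=5)
print(0 if lines is None else len(lines), "line segments detected")
```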

  8. Young children's ability to recognize advertisements in web page designs.

    PubMed

    Ali, Moondore; Blades, Mark; Oates, Caroline; Blumberg, Fran

    2009-03-01

    Identifying what is, and what is not an advertisement is the first step in realizing that an advertisement is a marketing message. Children can distinguish television advertisements from programmes by about 5 years of age. Although previous researchers have investigated television advertising, little attention has been given to advertisements in other media, even though other media, especially the Internet, have become important channels of marketing to children. We showed children printed copies of invented web pages that included advertisements, half of which had price information, and asked the children to point to whatever they thought was an advertisement. In two experiments we tested a total of 401 children, aged 6, 8, 10 and 12 years of age, from the United Kingdom and Indonesia. Six-year-olds recognized a quarter of the advertisements, 8-year-olds recognized half the advertisements, and the 10- and 12-year-olds recognized about three-quarters. Only the 10- and 12-year-olds were more likely to identify an advertisement when it included a price. We contrast our findings with previous results about the identification of television advertising, and discuss why children were poorer at recognizing web page advertisements. The performance of the children has implications for theories about how children develop an understanding of advertising.

  9. 75 FR 8400 - In the Matter of Certain Wireless Communications System Server Software, Wireless Handheld...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... COMMISSION In the Matter of Certain Wireless Communications System Server Software, Wireless Handheld Devices... server software, wireless handheld devices and battery packs by reason of infringement of certain claims... importation of certain wireless communications system server software, wireless handheld devices or...

  10. ASPEN--A Web-Based Application for Managing Student Server Accounts

    ERIC Educational Resources Information Center

    Sandvig, J. Christopher

    2004-01-01

    The growth of the Internet has greatly increased the demand for server-side programming courses at colleges and universities. Students enrolled in such courses must be provided with server-based accounts that support the technologies that they are learning. The process of creating, managing and removing large numbers of student server accounts is…

  11. Hardware and Software Interfacing at New Mexico Geochronology Research Laboratory: Distributed Control Using Pychron and RemoteControlServer.cs

    NASA Astrophysics Data System (ADS)

    McIntosh, W. C.; Ross, J. I.

    2012-12-01

    We developed a system for interfacing existing hardware and software to two new Thermo Scientific Argus VI mass spectrometers and three Photon Machines Fusions laser systems at the New Mexico Geochronology Research Laboratory. NMGRL's upgrade to the new analytical equipment required the design and implementation of a software ecosystem that allows seamless communication between various software and hardware components. Based on past experience and initial testing we chose to pursue a "Fully Distributed Control" model. In this model, hardware is compartmentalized and controlled by customized software running on individual computers. Each computer is connected to a Local Area Network (LAN), facilitating inter-process communication using the TCP or UDP Internet Protocols. Two other options for interfacing are 1) Single Control, in which all hardware is controlled by a single application on a single computer, and 2) Partial Distributed Control, in which the mass spectrometer is controlled directly by Thermo Scientific's Qtegra and all other hardware is controlled by a separate application. The "Fully Distributed Control" model offers the most efficient use of software resources, leveraging our in-house laboratory software with proprietary third-party applications, such as Qtegra and Mass Spec. Two software products resulted from our efforts: 1) Pychron, a configurable and extensible package for hardware control, data acquisition and preprocessing, and 2) RemoteControlServer.cs, a C# script for Thermo's Qtegra software that implements a TCP/UDP command server. Pychron is written in Python and uses standard, well-established libraries such as NumPy, SciPy, and Enthought ETS. Pychron is flexible and extensible, encouraging experimentation and rapid development of new features. A project page for Pychron is located at http://code.google.com/p/arlab, featuring an issue tracker and a Version Control System (Mercurial). RemoteControlServer.cs is a simple socket server that listens
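    The actual RemoteControlServer.cs is a C# script running inside Qtegra; purely as a language-neutral illustration of the listen/parse/dispatch pattern it implements, here is a minimal TCP command server sketched in Python. The command names are hypothetical.

```python
# Language-neutral sketch of the listen/parse/dispatch pattern behind a TCP
# command server (the real RemoteControlServer.cs is C# inside Qtegra).
# The command names below are hypothetical.
import socketserver

def dispatch(command: str) -> str:
    if command == "GetData":
        return "1.23,4.56"                 # placeholder for intensity readings
    if command.startswith("SetIntegrationTime"):
        return "OK"
    return "ERROR: unknown command"

class CommandHandler(socketserver.StreamRequestHandler):
    def handle(self):
        line = self.rfile.readline().decode().strip()
        self.wfile.write((dispatch(line) + "\n").encode())

if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 8000), CommandHandler) as server:
        server.serve_forever()
```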

  12. Computers in Small Libraries: Learning Server-Side Scripting

    ERIC Educational Resources Information Center

    Roberts, Gary

    2005-01-01

    In this column, the author compares and contrasts the most popular scripting languages that are used to create truly dynamic service-oriented Web sites, building a conceptual framework that can be used as a starting point for specific server-side library projects.

  13. Microsoft SQL Server 6.0{reg_sign} Workbook

    SciTech Connect

    Augustenborg, E.C.

    1996-09-01

    This workbook was prepared for introductory training in the use of Microsoft SQL Server Version 6.0. The examples are all taken from the PUBS database that Microsoft distributes for training purposes or from the Microsoft Online Documentation. The merits of the relational database are presented.

  14. Two-Cloud-Servers-Assisted Secure Outsourcing Multiparty Computation

    PubMed Central

    Wen, Qiaoyan; Zhang, Hua; Jin, Zhengping; Li, Wenmin

    2014-01-01

    We focus on how to securely outsource computation task to the cloud and propose a secure outsourcing multiparty computation protocol on lattice-based encrypted data in two-cloud-servers scenario. Our main idea is to transform the outsourced data respectively encrypted by different users' public keys to the ones that are encrypted by the same two private keys of the two assisted servers so that it is feasible to operate on the transformed ciphertexts to compute an encrypted result following the function to be computed. In order to keep the privacy of the result, the two servers cooperatively produce a custom-made result for each user that is authorized to get the result so that all authorized users can recover the desired result while other unauthorized ones including the two servers cannot. Compared with previous research, our protocol is completely noninteractive between any users, and both of the computation and the communication complexities of each user in our solution are independent of the computing function. PMID:24982949

  15. Training to Increase Safe Tray Carrying among Cocktail Servers

    ERIC Educational Resources Information Center

    Scherrer, Megan D.; Wilder, David A.

    2008-01-01

    We evaluated the effects of training on proper carrying techniques among 3 cocktail servers to increase safe tray carrying on the job and reduce participants' risk of developing musculoskeletal disorders. As participants delivered drinks to their tables, their finger, arm, and neck positions were observed and recorded. Each participant received…

  16. Perspectives of IT Professionals on Employing Server Virtualization Technologies

    ERIC Educational Resources Information Center

    Sligh, Darla

    2010-01-01

    Server virtualization enables a physical computer to support multiple applications logically by decoupling the application from the hardware layer, thereby reducing operational costs and helping IT organizations remain competitive in delivering IT services to their enterprises. IT organizations continually examine the efficiency of their internal IT systems and…

  17. Migrating from Mainframes to Client-Server Systems.

    DTIC Science & Technology

    1995-09-01

    The prevailing trend within the computer industry is to downsize information systems. This quite often entails migrating an application from a...centralized mainframe environment to a distributed client-server system. Navy IS managers are often given the mandate to downsize all information systems

  18. Distributed control system for demand response by servers

    NASA Astrophysics Data System (ADS)

    Hall, Joseph Edward

    Within the broad topical designation of smart grid, research in demand response, or demand-side management, focuses on investigating possibilities for electrically powered devices to adapt their power consumption patterns to better match generation and more efficiently integrate intermittent renewable energy sources, especially wind. Devices such as battery chargers, heating and cooling systems, and computers can be controlled to change the time, duration, and magnitude of their power consumption while still meeting workload constraints such as deadlines and rate of throughput. This thesis presents a system by which a computer server, or multiple servers in a data center, can estimate the power imbalance on the electrical grid and use that information to dynamically change the power consumption as a service to the grid. Implementation on a testbed demonstrates the system with a hypothetical but realistic usage case scenario of an online video streaming service in which there are workloads with deadlines (high-priority) and workloads without deadlines (low-priority). The testbed is implemented with real servers, estimates the power imbalance from the grid frequency with real-time measurements of the live outlet, and uses a distributed, real-time algorithm to dynamically adjust the power consumption of the servers based on the frequency estimate and the throughput of video transcoder workloads. Analysis of the system explains and justifies multiple design choices, compares the significance of the system in relation to similar publications in the literature, and explores the potential impact of the system.
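    A minimal sketch of the central control idea described above, assuming a simple proportional rule: the deviation of measured grid frequency from nominal is used as a proxy for the supply/demand imbalance, and the servers' power budget is scaled accordingly. The gain, limits and baseline below are illustrative numbers, not the thesis's tuned values.

```python
# Core idea in miniature: use the deviation of measured grid frequency from
# nominal as a proxy for supply/demand imbalance and scale the servers' power
# budget accordingly.  Gains and limits are illustrative, not the thesis's values.
NOMINAL_HZ = 60.0              # 50.0 in Europe
P_MIN, P_MAX = 200.0, 400.0    # per-server power limits in watts (example)
GAIN = 500.0                   # watts of adjustment per Hz of deviation (example)

def power_setpoint(measured_hz: float, baseline_w: float = 300.0) -> float:
    deviation = measured_hz - NOMINAL_HZ      # >0: surplus generation, <0: deficit
    target = baseline_w + GAIN * deviation    # consume more when frequency runs high
    return max(P_MIN, min(P_MAX, target))

for f in (59.95, 60.00, 60.05):
    print(f"{f:.2f} Hz -> {power_setpoint(f):.0f} W")
```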

  19. Two-cloud-servers-assisted secure outsourcing multiparty computation.

    PubMed

    Sun, Yi; Wen, Qiaoyan; Zhang, Yudong; Zhang, Hua; Jin, Zhengping; Li, Wenmin

    2014-01-01

    We focus on how to securely outsource computation task to the cloud and propose a secure outsourcing multiparty computation protocol on lattice-based encrypted data in two-cloud-servers scenario. Our main idea is to transform the outsourced data respectively encrypted by different users' public keys to the ones that are encrypted by the same two private keys of the two assisted servers so that it is feasible to operate on the transformed ciphertexts to compute an encrypted result following the function to be computed. In order to keep the privacy of the result, the two servers cooperatively produce a custom-made result for each user that is authorized to get the result so that all authorized users can recover the desired result while other unauthorized ones including the two servers cannot. Compared with previous research, our protocol is completely noninteractive between any users, and both of the computation and the communication complexities of each user in our solution are independent of the computing function.

  20. Molecular studies of microbial community structure on stained pages of Leonardo da Vinci's Atlantic Codex.

    PubMed

    Principi, Pamela; Villa, Federica; Sorlini, Claudia; Cappitelli, Francesca

    2011-01-01

    In 2006, after a visual inspection of Leonardo da Vinci's Atlantic Codex by a scholar, active molds were reported to have been present on Codex pages showing areas of staining. In the present paper, molecular methods were used to assess the current microbiological risk to stained pages of the manuscript. Bacterial and fungal communities were sampled by a non-invasive technique employing nitrocellulose membranes. Denaturing gradient gel electrophoresis of the 16S rRNA gene and internal transcribed spacer regions was carried out to study the structure of the bacterial and fungal communities, and band patterns were analyzed by the multivariate technique of principal component analysis. Any relationship between the presence of an active microbial community and staining was excluded. The presence of potential biodeteriogens was evaluated by constructing bacterial and fungal clone libraries and analyzing them by an operational taxonomic unit (OTU) approach. Among the bacteria, some OTUs were associated with species found on floors in clean rooms while others were identified with human skin contamination. Some fungal OTU representatives were potential biodeteriogens that, under proper thermo-hygrometric conditions, could grow. The retrieval of these potential biodeteriogens and microorganisms related to human skin suggests the need for continuous and rigorous monitoring of the environmental conditions, and the need to improve handling procedures.

  1. Some Features of "Alt" Texts Associated with Images in Web Pages

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2006-01-01

    Introduction: This paper extends a series on summaries of Web objects, in this case, the alt attribute of image files. Method: Data were logged from 1894 pages from Yahoo!'s random page service and 4703 pages from the Google directory; an img tag was extracted randomly from each where present; its alt attribute, if any, was recorded; and the…

  2. MedlinePlus FAQ: What's New on Medline Plus Page and Email Updates

    MedlinePlus

    ... faq/whatsnew.html Question: How is the What's New on MedlinePlus page and RSS feed different from ... Answer: The What's New on MedlinePlus page and RSS feed include alerts ...

  3. Using Facebook Page Insights Data to Determine Posting Best Practices in an Academic Health Sciences Library

    ERIC Educational Resources Information Center

    Houk, Kathryn M.; Thornhill, Kate

    2013-01-01

    Tufts University Hirsh Health Sciences Library created a Facebook page and a corresponding managing committee in March 2010. Facebook Page Insights data collected from the library's Facebook page were statistically analyzed to investigate patterns of user engagement. The committee hoped to improve posting practices and increase user engagement…

  4. 77 FR 3324 - Release of Airport Property: Page Field, Fort Myers, FL

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-23

    ... Property: Page Field, Fort Myers, FL AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Request... properties 1.52 acres at Page Field, Fort Myers, FL from the conditions, release certain properties from all... the Lee County Port Authority, owner of Page Field, to dispose of the property for other...

  5. 47 CFR 90.490 - One-way paging operations in the private services.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Paging Operations § 90.490 One-way paging... governing the radio service in which a licensee's radio system is authorized, paging operations are... directly from telephone positions in the public switched telephone network. When land stations are...

  6. 47 CFR 90.490 - One-way paging operations in the private services.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Paging Operations § 90.490 One-way paging... governing the radio service in which a licensee's radio system is authorized, paging operations are... directly from telephone positions in the public switched telephone network. When land stations are...

  7. 47 CFR 90.490 - One-way paging operations in the private services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Paging Operations § 90.490 One-way paging... governing the radio service in which a licensee's radio system is authorized, paging operations are... directly from telephone positions in the public switched telephone network. When land stations are...

  8. Cosmic Ink: Fragments from the Past on Journal Pages

    NASA Astrophysics Data System (ADS)

    Mandrino, A.; Gargano, M.; Gasperini, A.

    2015-04-01

    This contribution describes an editorial project started in 2012 to enhance the cultural heritage of the Italian observatories. It includes a regular column Cieli di inchiostro (Cosmic ink) devoted to promoting the astronomical historical archives and published in the Giornale di astronomia, a journal of the Società Astronomica Italiana. In every issue of the journal, a significant historical artifact is presented and described. This can be a letter, a diary page, a photograph, a map, a drawing, or another type of item pulled out of the archival folders to bring its history to light. The column is intended to invite historians, amateurs, and students to search and use the documents kept in the archives of the observatories.

  9. Collecting responses through Web page drag and drop.

    PubMed

    Britt, M Anne; Gabrys, Gareth

    2004-02-01

    This article describes how to collect responses from experimental participants using drag and drop on a Web page. In particular, we describe how drag and drop can be used in a text search task in which participants read a text and then locate and categorize certain elements of the text (e.g., to identify the main claim of a persuasive paragraph). Using this technique, participants respond by clicking on a text segment and dragging it to a screen field or icon. We have successfully used this technique in both the argument element identification experiment that we describe here and a tutoring system that we created to teach students to identify source characteristics while reading historical texts (Britt, Perfetti, Van Dyke, & Gabrys, 2000). The implementation described here exploits the capability of recent versions of Microsoft's Internet Explorer Web browser to handle embedded XML documents and drag and drop events.

  10. Intelligent Paging Based Mobile User Tracking Using Fuzzy Logic

    NASA Astrophysics Data System (ADS)

    Saha, Sajal; Dutta, Raju; Debnath, Soumen; Mukhopadhyay, Asish K.

    2010-11-01

    In general, a mobile user travels in a predefined path that depends mostly on the user's characteristics. Thus, tracking the locations of a mobile user is one of the challenges for location management. In this paper, we introduce a movement pattern learning strategy system to track the user's movements using adaptive fuzzy logic. Our fuzzy inference system extracts patterns from the historical data record of the cell numbers along with the date and time stamp of the users occupying the cell. Implementation of this strategy has been evaluated with the real time user data which proves the efficiency and accuracy of the model. This mechanism not only reduces user location tracking costs, but also significantly decreases the call-loss rates and average paging delays.

  11. Learning Layouts for Single-Page Graphic Designs.

    PubMed

    O'Donovan, Peter; Agarwala, Aseem; Hertzmann, Aaron

    2014-08-01

    This paper presents an approach for automatically creating graphic design layouts using a new energy-based model derived from design principles. The model includes several new algorithms for analyzing graphic designs, including the prediction of perceived importance, alignment detection, and hierarchical segmentation. Given the model, we use optimization to synthesize new layouts for a variety of single-page graphic designs. Model parameters are learned with Nonlinear Inverse Optimization (NIO) from a small number of example layouts. To demonstrate our approach, we show results for applications including generating design layouts in various styles, retargeting designs to new sizes, and improving existing designs. We also compare our automatic results with designs created using crowdsourcing and show that our approach performs slightly better than novice designers.

  12. Credibility judgments in web page design - a brief review.

    PubMed

    Selejan, O; Muresanu, D F; Popa, L; Muresanu-Oloeriu, I; Iudean, D; Buzoianu, A; Suciu, S

    2016-01-01

    Today, more than ever, it is accepted that the analysis of interface appearance is a crucial point in the field of human-computer interaction. As nowadays virtually anyone can publish information on the web, the role of credibility has grown increasingly important in relation to web-based content. Areas like trust, credibility, and behavior, together with overall impression and user expectation, are today in the spotlight of research, compared to the earlier period when more pragmatic areas such as usability and utility were the focus. Credibility has been discussed as a theoretical construct in the field of communication in past decades, and research has revealed that people tend to evaluate the credibility of communication primarily by the communicator's expertise. Other factors involved in the content communication process are trustworthiness and dynamism, as well as various other criteria, though to a lesser extent. In this brief review, factors like web page aesthetics, browsing experience and user experience are considered.

  13. Credibility judgments in web page design – a brief review

    PubMed Central

    Selejan, O; Muresanu, DF; Popa, L; Muresanu-Oloeriu, I; Iudean, D; Buzoianu, A; Suciu, S

    2016-01-01

    Today, more than ever, it is accepted that the analysis of interface appearance is a crucial point in the field of human-computer interaction. As nowadays virtually anyone can publish information on the web, the role of credibility has grown increasingly important in relation to web-based content. Areas like trust, credibility, and behavior, together with overall impression and user expectation, are today in the spotlight of research, compared to the earlier period when more pragmatic areas such as usability and utility were the focus. Credibility has been discussed as a theoretical construct in the field of communication in past decades, and research has revealed that people tend to evaluate the credibility of communication primarily by the communicator’s expertise. Other factors involved in the content communication process are trustworthiness and dynamism, as well as various other criteria, though to a lesser extent. In this brief review, factors like web page aesthetics, browsing experience and user experience are considered. PMID:27453738

  14. Utilization of the Google Maps API in WebPages

    NASA Astrophysics Data System (ADS)

    Ricket, D.

    2006-12-01

    Google Maps, which offers a powerful, user-friendly mapping technology including business locations, contact information, and driving directions, also provides an easy-to-use platform for representing scientific information in a geographic format. Users can add draggable maps, satellite imagery, and zoom functionality to their own web pages using the Google Maps API. Features such as overlays (including markers and polylines) can be customized to show geologic map features, and shadowed "info windows" can be customized with additional information, images along with the direction they were taken, and access to data. A demonstration will be given of how to import large datasets into Google Maps, along with formatting tips and tricks. Discussion of how the geoscience community would like to use both 2D and 3D mapping technologies is encouraged.

  15. Experimental parametric study of servers cooling management in data centers buildings

    NASA Astrophysics Data System (ADS)

    Nada, S. A.; Elfeky, K. E.; Attia, Ali M. A.; Alshaer, W. G.

    2017-01-01

    A parametric study of the air flow and cooling management of data center servers is experimentally conducted for different design conditions. A physical scale model of a data center accommodating one rack of four servers was designed and constructed for testing purposes. Front and rear rack and server temperature distributions and the supply/return heat indices (SHI/RHI) are used to evaluate data center thermal performance. Experiments were conducted to parametrically study the effects of the perforated tile opening ratio, server power load variation and rack power density. The results showed that (1) a perforated tile with a 25% opening ratio provides the best results among the tested opening ratios, (2) the optimum benefit of cold air in server cooling is obtained with uniform power loading of the servers, and (3) increasing power density decreases air recirculation but increases air bypass and server temperatures. The present results are compared with previous experimental and CFD results and fair agreement was found.
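    The supply and return heat indices mentioned above are commonly defined in the data-centre literature from the supply, rack-inlet and rack-outlet temperatures; the sketch below computes them for made-up readings, assuming SHI = (Tin - Tsup)/(Tout - Tsup) and RHI = 1 - SHI, which may differ slightly from the exact formulation used in these experiments.

```python
# Supply and Return Heat Indices as commonly defined in the data-centre
# literature: SHI = (Tin - Tsup) / (Tout - Tsup), RHI = 1 - SHI.
# The paper may use a slightly different formulation; the temperatures below
# are made-up example readings in degrees Celsius.
def shi(t_supply, t_rack_in, t_rack_out):
    return (t_rack_in - t_supply) / (t_rack_out - t_supply)

readings = [
    # (supply, rack inlet, rack outlet) for four servers in the rack
    (15.0, 18.0, 30.0),
    (15.0, 20.0, 32.0),
    (15.0, 22.0, 33.0),
    (15.0, 24.0, 34.0),
]

for i, (ts, tin, tout) in enumerate(readings, start=1):
    s = shi(ts, tin, tout)
    print(f"server {i}: SHI = {s:.2f}, RHI = {1 - s:.2f}")
```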

  16. Analysis of mitochondrial respiratory chain supercomplexes using blue native polyacrylamide gel electrophoresis (BN-PAGE)

    PubMed Central

    Jha, Pooja; Wang, Xu; Auwerx, Johan

    2016-01-01

    Mitochondria are cellular organelles that produce energy in the form of ATP through a process termed oxidative phosphorylation (OXPHOS), which occurs via the protein complexes of the electron transport chain (ETC). In recent years it has become unequivocally clear that mitochondrial complexes of the ETC are not static entities in the inner mitochondrial membrane. These complexes are dynamic and in mammals they aggregate in different stoichiometric combinations to form supercomplexes (SCs) or respirasomes. It has been proposed that the net respiration is more efficient via SCs than via isolated complexes. However, it still needs to be determined whether the activity of a particular SC is associated with a disease etiology. Here we describe a simplified method to visualize and assess in-gel activity of SCs and the individual complexes with a good resolution on blue native polyacrylamide gel electrophoresis (BN-PAGE). PMID:26928661

  17. EarthServer - 3D Visualization on the Web

    NASA Astrophysics Data System (ADS)

    Wagner, Sebastian; Herzig, Pasquale; Bockholt, Ulrich; Jung, Yvonne; Behr, Johannes

    2013-04-01

    EarthServer (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, is a project to enable the management, access and exploration of massive, multi-dimensional datasets using Open GeoSpatial Consortium (OGC) query and processing language standards like WCS 2.0 and WCPS. To this end, a server/client architecture designed to handle Petabyte/Exabyte volumes of multi-dimensional data is being developed and deployed. As an important part of the EarthServer project, six Lighthouse Applications, major scientific data exploitation initiatives, are being established to make cross-domain, Earth Sciences related data repositories available in an open and unified manner, as service endpoints based on solutions and infrastructure developed within the project. Client technology developed and deployed in EarthServer ranges from mobile and web clients to immersive virtual reality systems, all designed to interact with a physically and logically distributed server infrastructure using exclusively OGC standards. In this contribution, we would like to present our work on a web-based 3D visualization and interaction client for Earth Sciences data using only technology found in standard web browsers without requiring the user to install plugins or addons. Additionally, we are able to run the earth data visualization client on a wide range of different platforms with very different soft- and hardware requirements such as smart phones (e.g. iOS, Android), different desktop systems etc. High-quality, hardware-accelerated visualization of 3D and 4D content in standard web browsers can be realized now and we believe it will become more and more common to use this fast, lightweight and ubiquitous platform to provide insights into big datasets without requiring the user to set up a specialized client first. With that in mind, we will also point out some of the limitations we encountered using current web technologies. Underlying the EarthServer web client
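    As a rough sketch of the kind of OGC request the EarthServer service endpoints accept, the snippet below builds a WCS 2.0 GetCoverage call with the Python requests library; the base URL, coverage id and axis labels are assumptions and should be taken from a real endpoint's GetCapabilities response.

```python
# Sketch of a WCS 2.0 GetCoverage request of the kind an EarthServer endpoint
# accepts.  The base URL, coverage id and axis labels are assumptions; consult
# a real endpoint's GetCapabilities response for actual values.
import requests

endpoint = "http://example.org/rasdaman/ows"      # hypothetical service URL
params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "mean_summer_airtemp",          # assumed coverage name
    "subset": ["Lat(-30,-20)", "Long(120,130)"],  # trim to a spatial window
    "format": "image/tiff",
}
resp = requests.get(endpoint, params=params, timeout=60)
resp.raise_for_status()
with open("subset.tif", "wb") as fh:
    fh.write(resp.content)
```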

  18. Socorro Students Translate NRAO Web Pages Into Spanish

    NASA Astrophysics Data System (ADS)

    2002-07-01

    Six Socorro High School students are spending their summer working at the National Radio Astronomy Observatory (NRAO) on a unique project that gives them experience in language translation, World Wide Web design, and technical communication. Under the project, called "Un puente a los cielos," the students are translating many of NRAO's Web pages on astronomy into Spanish. "These students are using their bilingual skills to help us make basic information about astronomy and radio telescopes available to the Spanish-speaking community," said Kristy Dyer, who works at NRAO as a National Science Foundation postdoctoral fellow and who developed the project and obtained funding for it from the National Aeronautics and Space Administration. The students are: Daniel Acosta, 16; Rossellys Amarante, 15; Sandra Cano, 16; Joel Gonzalez, 16; Angelica Hernandez, 16; and Cecilia Lopez, 16. The translation project, a joint effort of NRAO and the NM Tech physics department, also includes Zammaya Moreno, a teacher from Ecuador, Robyn Harrison, NRAO's education officer, and NRAO computer specialist Allan Poindexter. The students are translating NRAO Web pages aimed at the general public. These pages cover the basics of radio astronomy and frequently-asked questions about NRAO and the scientific research done with NRAO's telescopes. "Writing about science for non-technical audiences has to be done carefully. Scientific concepts must be presented in terms that are understandable to non-scientists but also that remain scientifically accurate," Dyer said. "When translating this type of writing from one language to another, we need to preserve both the understandability and the accuracy," she added. For that reason, Dyer recruited 14 Spanish-speaking astronomers from Argentina, Mexico and the U.S. to help verify the scientific accuracy of the Spanish translations. The astronomers will review the translations. The project is giving the students a broad range of experience. "They are

  19. PAGES-Powell North America 2k database

    NASA Astrophysics Data System (ADS)

    McKay, N.

    2014-12-01

    Syntheses of paleoclimate data in North America are essential for understanding long-term spatiotemporal variability in climate and for properly assessing risk on decadal and longer timescales. Existing reconstructions of the past 2,000 years rely almost exclusively on tree-ring records, which can underestimate low-frequency variability and rarely extend beyond the last millennium. Meanwhile, many records from the full spectrum of paleoclimate archives are available and hold the potential of enhancing our understanding of past climate across North America over the past 2000 years. The second phase of the Past Global Changes (PAGES) North America 2k project began in 2014, with a primary goal of assembling these disparate paleoclimate records into a unified database. This effort is currently supported by the USGS Powell Center together with PAGES. Its success requires grassroots support from the community of researchers developing and interpreting paleoclimatic evidence relevant to the past 2000 years. Most likely, fewer than half of the published records appropriate for this database are publicly archived, and far fewer include the data needed to quantify geochronologic uncertainty, or to concisely describe how best to interpret the data in context of a large-scale paleoclimatic synthesis. The current version of the database includes records that (1) have been published in a peer-reviewed journal (including evidence of the record's relationship to climate), (2) cover a substantial portion of the past 2000 yr (>300 yr for annual records, >500 yr for lower frequency records) at relatively high resolution (<50 yr/observation), and (3) have reasonably small and quantifiable age uncertainty. Presently, the database includes records from boreholes, ice cores, lake and marine sediments, speleothems, and tree rings. This poster presentation will display the site locations and basic metadata of the records currently in the database. We invite anyone with interest in
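
    As an illustration of the screening rules listed above, the following sketch encodes them as a simple filter; the record fields are hypothetical and do not reflect the actual PAGES-Powell database schema.

```python
# Sketch of the stated inclusion criteria as a screening function.
# Field names are illustrative assumptions, not the real database schema.
def meets_criteria(record):
    """Return True if a candidate record satisfies the screening rules above."""
    peer_reviewed = record["peer_reviewed"]             # published, with evidence of climate relationship
    span_yr = record["last_year"] - record["first_year"]
    resolution = record["years_per_observation"]        # mean sample spacing
    annual = resolution <= 1.0

    long_enough = span_yr > (300 if annual else 500)    # >300 yr annual, >500 yr lower frequency
    high_res = resolution < 50                          # <50 yr/observation
    dated = record["age_uncertainty_quantified"]        # reasonably small, quantifiable age error
    return peer_reviewed and long_enough and high_res and dated

example = {"peer_reviewed": True, "first_year": 850, "last_year": 2000,
           "years_per_observation": 5, "age_uncertainty_quantified": True}
print(meets_criteria(example))  # True
```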

  20. MARSIS data and simulation exploited using array databases: PlanetServer/EarthServer for sounding radars

    NASA Astrophysics Data System (ADS)

    Cantini, Federico; Pio Rossi, Angelo; Orosei, Roberto; Baumann, Peter; Misev, Dimitar; Oosthoek, Jelmer; Beccati, Alan; Campalani, Piero; Unnithan, Vikram

    2014-05-01

    MARSIS is an orbital synthetic aperture radar for both ionosphere and subsurface sounding on board ESA's Mars Express (Picardi et al. 2005). It transmits electromagnetic pulses centered at 1.8, 3, 4 or 5 MHz that penetrate below the surface and are reflected by compositional and/or structural discontinuities in the subsurface of Mars. MARSIS data are available as a collection of single-orbit data files. The availability of tools for more effective access to such data would greatly ease data analysis and exploitation by the community of users. For this purpose, we are developing a database built on the raster database management system RasDaMan (e.g. Baumann et al., 1994), to be populated with MARSIS data and integrated in the PlanetServer/EarthServer (e.g. Oosthoek et al., 2013; Rossi et al., this meeting) project. The data (and related metadata) are stored in the database for each frequency used by the MARSIS radar. The capability of retrieving data belonging to a single orbit or to multiple orbits on the basis of latitude/longitude boundaries is a key requirement of the database design, allowing, besides the "classical" radargram representation of the data, a 3D data extraction, subsetting and analysis of subsurface structures in areas with sufficiently high orbit density. Moreover, the use of the OGC WCPS (Web Coverage Processing Service) standard allows calculations on database query results for multiple echoes and/or subsets of a certain data product. Because of the low directivity of its dipole antenna, MARSIS receives echoes from portions of the surface of Mars that are distant from nadir and can be mistakenly interpreted as subsurface echoes. For this reason, methods have been developed to simulate surface echoes (e.g. Nouvel et al., 2004) and to reveal the true origin of an echo through comparison with instrument data. These simulations are usually time-consuming, and so far have been performed either on a case-by-case basis or in some simplified form. A code for
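
    A minimal sketch of the kind of server-side calculation enabled by WCPS, here averaging echo power over a latitude/longitude box. The endpoint, coverage name and axis labels are assumptions for illustration, not the actual PlanetServer/EarthServer setup.

```python
# Sketch: WCPS server-side aggregation over a lat/lon-bounded subset of a coverage.
# Endpoint, coverage name and axis labels are hypothetical.
import requests

ENDPOINT = "https://example.org/rasdaman/ows"   # hypothetical service URL

# WCPS: mean echo value for one MARSIS band inside a latitude/longitude box.
query = """
for r in (MARSIS_4MHz)
return avg(r[Lat(10.0:12.0), Long(30.0:35.0)])
"""

resp = requests.get(ENDPOINT, params={"service": "WCS", "version": "2.0.1",
                                      "request": "ProcessCoverages", "query": query})
resp.raise_for_status()
print(resp.text)   # scalar result returned as plain text
```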

  1. User Evaluation of the NASA Technical Report Server Recommendation Service

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.

    2004-01-01

    We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the most 'quality' recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.
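
    A minimal sketch of the vector space model side of the comparison: TF-IDF vectors built from record metadata, cosine similarity, and the top 10 most similar documents returned as recommendations. The metadata strings below are invented placeholders, not NTRS records.

```python
# Sketch of VSM-based recommendations: TF-IDF over metadata + cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

metadata = [
    "wind tunnel testing of a transonic airfoil",
    "computational fluid dynamics of transonic flow",
    "thermal protection systems for reentry vehicles",
    # ... one entry per report in the collection (placeholders)
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(metadata)
similarity = cosine_similarity(vectors)

def recommend(doc_index, k=10):
    """Return indices of the k most similar documents, excluding the query itself."""
    order = similarity[doc_index].argsort()[::-1]
    return [i for i in order if i != doc_index][:k]

print(recommend(0, k=2))
```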

  2. DSP: a protein shape string and its profile prediction server

    PubMed Central

    Sun, Jiangming; Tang, Shengnan; Xiong, Wenwei; Cong, Peisheng; Li, Tonghua

    2012-01-01

    Many studies have demonstrated that shape string is an extremely important structure representation, since it is more complete than the classical secondary structure. The shape string provides detailed information also in the regions denoted random coil. But few services are provided for systematic analysis of protein shape string. To fill this gap, we have developed an accurate shape string predictor based on two innovative technologies: a knowledge-driven sequence alignment and a sequence shape string profile method. The performance on blind test data demonstrates that the proposed method can be used for accurate prediction of protein shape string. The DSP server provides both predicted shape string and sequence shape string profile for each query sequence. Using this information, the users can compare protein structure or display protein evolution in shape string space. The DSP server is available at both http://cheminfo.tongji.edu.cn/dsp/ and its main mirror http://chemcenter.tongji.edu.cn/dsp/. PMID:22553364

  3. Peptiderive server: derive peptide inhibitors from protein–protein interactions

    PubMed Central

    Sedan, Yuval; Marcu, Orly; Lyskov, Sergey; Schueler-Furman, Ora

    2016-01-01

    The Rosetta Peptiderive protocol identifies, in a given structure of a protein–protein interaction, the linear polypeptide segment suggested to contribute most to binding energy. Interactions that feature a ‘hot segment’, a linear peptide with significant binding energy compared to that of the complex, may be amenable for inhibition and the peptide sequence and structure derived from the interaction provide a starting point for rational drug design. Here we present a web server for Peptiderive, which is incorporated within the ROSIE web interface for Rosetta protocols. A new feature of the protocol also evaluates whether derived peptides are good candidates for cyclization. Fast computation times and clear visualization allow users to quickly assess the interaction of interest. The Peptiderive server is available for free use at http://rosie.rosettacommons.org/peptiderive. PMID:27141963

  4. Fact Sheet: Improving Energy Efficiency for Server Rooms and Closets

    SciTech Connect

    Cheung, Hoi Ying; Mahdavi, Rod; Greenberg, Steve; Brown, Rich; Tschudi, William; Delforge, Pierre; Dickerson, Joyce

    2012-09-01

    Is there a ghost in your IT closet? If your building has one or more IT rooms or closets containing between 5 and 50 servers, chances are that they account for a significant share of the building’s energy use (in some cases, over half!). Servers, data storage arrays, networking equipment, and the cooling and power conditioning that support them tend to draw large amounts of energy 24/7, in many cases using more energy annually than traditional building loads such as HVAC and lighting. The good news is that there are many cost-effective actions, ranging from simple to advanced, that can dramatically reduce that energy use, helping you to save money and reduce pollution.

  5. Experience of public procurement of Open Compute servers

    NASA Astrophysics Data System (ADS)

    Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony

    2015-12-01

    The Open Compute Project (OCP, http://www.opencompute.org/) was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal of developing servers and data centres following the model traditionally associated with open source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large scale installation. One objective is to evaluate if the OCP market is sufficiently mature and broad enough to meet the constraints of a public procurement. This paper summarizes this procurement, which started in September 2014 and involved a Request for Information (RFI) to qualify bidders and a Request for Tender (RFT).

  6. User Evaluation of the NASA Technical Report Server Recommendation Service

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.

    2004-01-01

    We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the most 'quality' recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.

  7. Berkeley Phylogenomics Group web servers: resources for structural phylogenomic analysis.

    PubMed

    Glanville, Jake Gunn; Kirshner, Dan; Krishnamurthy, Nandini; Sjölander, Kimmen

    2007-07-01

    Phylogenomic analysis addresses the limitations of function prediction based on annotation transfer, and has been shown to enable the highest accuracy in prediction of protein molecular function. The Berkeley Phylogenomics Group provides a series of web servers for phylogenomic analysis: classification of sequences to pre-computed families and subfamilies using the PhyloFacts Phylogenomic Encyclopedia, FlowerPower clustering of proteins sharing the same domain architecture, MUSCLE multiple sequence alignment, SATCHMO simultaneous alignment and tree construction and SCI-PHY subfamily identification. The PhyloBuilder web server provides an integrated phylogenomic pipeline starting with a user-supplied protein sequence, proceeding to homolog identification, multiple alignment, phylogenetic tree construction, subfamily identification and structure prediction. The Berkeley Phylogenomics Group resources are available at http://phylogenomics.berkeley.edu.

  8. Design of Accelerator Online Simulator Server Using Structured Data

    SciTech Connect

    Shen, Guobao; Chu, Chungming; Wu, Juhao; Kraimer, Martin; /Argonne

    2012-07-06

    Model based control plays an important role for a modern accelerator during beam commissioning, beam study, and even daily operation. With a realistic model, beam behaviour can be predicted and therefore effectively controlled. The approach used by most current high level application environments is to use a built-in simulation engine and feed a realistic model into that simulation engine. Instead of this traditional monolithic structure, a new approach using a client-server architecture is under development. An on-line simulator server is accessed via network accessible structured data. With this approach, a user can easily access multiple simulation codes. This paper describes the design, implementation, and current status of PVData, which defines the structured data, and PVAccess, which provides network access to the structured data.

  9. DSP: a protein shape string and its profile prediction server.

    PubMed

    Sun, Jiangming; Tang, Shengnan; Xiong, Wenwei; Cong, Peisheng; Li, Tonghua

    2012-07-01

    Many studies have demonstrated that shape string is an extremely important structure representation, since it is more complete than the classical secondary structure. The shape string provides detailed information also in the regions denoted random coil. But few services are provided for systematic analysis of protein shape string. To fill this gap, we have developed an accurate shape string predictor based on two innovative technologies: a knowledge-driven sequence alignment and a sequence shape string profile method. The performance on blind test data demonstrates that the proposed method can be used for accurate prediction of protein shape string. The DSP server provides both predicted shape string and sequence shape string profile for each query sequence. Using this information, the users can compare protein structure or display protein evolution in shape string space. The DSP server is available at both http://cheminfo.tongji.edu.cn/dsp/ and its main mirror http://chemcenter.tongji.edu.cn/dsp/.

  10. An empirical performance analysis of commodity memories in commodity servers

    SciTech Connect

    Kerbyson, D. J.; Lang, M. K.; Patino, G.

    2004-01-01

    This work details a performance study of six different commodity memories in two commodity server nodes on a number of microbenchmarks that measure low-level performance characteristics, as well as on two applications representative of the ASCI workload. The memories vary both in terms of performance, including latency and bandwidth, and in terms of their physical properties and manufacturer. Two server nodes were used: one Itanium-II Madison based system and one Xeon based system. All the memories examined can be used within both processing nodes. This allows the performance of the memories to be directly examined while keeping all other factors within a processing node the same (processor, motherboard, operating system, etc.). The results of this study show that there can be a significant difference in application performance from the different memories - by as much as 20%. Thus, by choosing the most appropriate memory for a processing node at a minimal cost differential, significantly improved performance may be achievable.
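
    As a rough, hedged illustration of the kind of low-level measurement such microbenchmarks make, the following NumPy "triad" estimates sustained memory bandwidth. The study's actual microbenchmarks are compiled, carefully controlled codes, so this is only indicative.

```python
# Rough STREAM-triad-style estimate of sustained memory bandwidth (illustrative only).
import time
import numpy as np

N = 20_000_000                        # ~160 MB per float64 array, large enough to defeat caches
a = np.zeros(N)
b = np.random.rand(N)
c = np.random.rand(N)

start = time.perf_counter()
for _ in range(10):
    a[:] = b + 1.5 * c                # triad: two reads + one write per element
elapsed = time.perf_counter() - start

bytes_moved = 10 * 3 * N * 8          # 8 bytes per float64 element
print(f"approximate bandwidth: {bytes_moved / elapsed / 1e9:.1f} GB/s")
```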

  11. Rankprop: a web server for protein remote homology detection

    PubMed Central

    Melvin, Iain; Weston, Jason; Leslie, Christina; Noble, William Stafford

    2009-01-01

    Summary: We present a large-scale implementation of the Rankprop protein homology ranking algorithm in the form of an openly accessible web server. We use the NRDB40 PSI-BLAST all-versus-all protein similarity network of 1.1 million proteins to construct the graph for the Rankprop algorithm, whereas previously, results were only reported for a database of 108 000 proteins. We also describe two algorithmic improvements to the original algorithm, including propagation from multiple homologs of the query and better normalization of ranking scores, that lead to higher accuracy and to scores with a probabilistic interpretation. Availability: The Rankprop web server and source code are available at http://rankprop.gs.washington.edu Contact: iain@nec-labs.com; noble@gs.washington.edu PMID:18990723

  12. Introducing djatoka: a reuse friendly, open source JPEG image server

    SciTech Connect

    Chute, Ryan M; Van De Sompel, Herbert

    2008-01-01

    The ISO-standardized JPEG 2000 image format has started to attract significant attention. Support for the format is emerging in major consumer applications, and the cultural heritage community seriously considers it a viable format for digital preservation. So far, only commercial image servers with JPEG 2000 support have been available. They come with significant license fees and typically provide the customers with limited extensibility capabilities. Here, we introduce djatoka, an open source JPEG 2000 image server with an attractive basic feature set, and extensibility under control of the community of implementers. We describe djatoka, and point at demonstrations that feature digitized images of marvelous historical manuscripts from the collections of the British Library and the University of Ghent. We also call upon the community to engage in further development of djatoka.

  13. The Photometric Classification Server of PanSTARRS1

    NASA Astrophysics Data System (ADS)

    Saglia, R. P.

    2008-12-01

    The Panoramic Survey Telescope and Rapid Response System 1 (PanSTARRS1) project is on its way to starting science operations in early 2009. In the next 3.5 years it will produce a grizy survey of 3/4 of the sky, 2 mag deeper than Sloan. The Photometric Classification Server is responsible for classifying objects as stars, galaxies or quasars based on multiband photometry. Accordingly, it should also deliver accurate and timely stellar parameters or photometric redshifts. Several science projects rely on the output of the server, from transit planet searches and transient detections to the structure of the Milky Way, high-redshift quasars, galaxy evolution, cosmological shear, baryonic oscillations and galaxy cluster searches.

  14. The web server of IBM's Bioinformatics and Pattern Discovery group.

    PubMed

    Huynh, Tien; Rigoutsos, Isidore; Parida, Laxmi; Platt, Daniel; Shibuya, Tetsuo

    2003-07-01

    We herein present and discuss the services and content which are available on the web server of IBM's Bioinformatics and Pattern Discovery group. The server is operational around the clock and provides access to a variety of methods that have been published by the group's members and collaborators. The available tools correspond to applications ranging from the discovery of patterns in streams of events and the computation of multiple sequence alignments, to the discovery of genes in nucleic acid sequences and the interactive annotation of amino acid sequences. Additionally, annotations for more than 70 archaeal, bacterial, eukaryotic and viral genomes are available on-line and can be searched interactively. The tools and code bundles can be accessed beginning at http://cbcsrv.watson.ibm.com/Tspd.html whereas the genomics annotations are available at http://cbcsrv.watson.ibm.com/Annotations/.

  15. Parmodel: a web server for automated comparative modeling of proteins.

    PubMed

    Uchôa, Hugo Brandão; Jorge, Guilherme Eberhart; Freitas Da Silveira, Nelson José; Camera, João Carlos; Canduri, Fernanda; De Azevedo, Walter Filgueira

    2004-12-24

    Parmodel is a web server for automated comparative modeling and evaluation of protein structures. The aim of this tool is to help inexperienced users to perform modeling, assessment, visualization, and optimization of protein models, as well as crystallographers to evaluate structures solved experimentally. It is subdivided into four modules: Parmodel Modeling, Parmodel Assessment, Parmodel Visualization, and Parmodel Optimization. The main module is Parmodel Modeling, which allows the building of several models for the same protein in a reduced time through the distribution of modeling processes on a Beowulf cluster. Parmodel automates and integrates the main software packages used in comparative modeling, such as MODELLER, Whatcheck, Procheck, Raster3D, Molscript, and Gromacs. This web server is freely accessible at .

  16. GrayStarServer: Server-side Spectrum Synthesis with a Browser-based Client-side User Interface

    NASA Astrophysics Data System (ADS)

    Short, C. Ian

    2016-10-01

    We present GrayStarServer (GSS), a stellar atmospheric modeling and spectrum synthesis code of pedagogical accuracy that is accessible in any web browser on commonplace computational devices and that runs on a timescale of a few seconds. The addition of spectrum synthesis annotated with line identifications extends the functionality and pedagogical applicability of GSS beyond that of its predecessor, GrayStar3 (GS3). The spectrum synthesis is based on a line list acquired from the NIST atomic spectra database, and the GSS post-processing and user interface client allows the user to inspect the plain text ASCII version of the line list, as well as to apply macroscopic broadening. Unlike GS3, GSS carries out the physical modeling on the server side in Java, and communicates with the JavaScript and HTML client via an asynchronous HTTP request. We also describe other improvements beyond GS3 such as a more physical treatment of background opacity and atmospheric physics, the comparison of key results with those of the Phoenix code, and the use of the HTML <canvas> element for higher quality plotting and rendering of results. We also present LineListServer, a Java code for converting custom ASCII line lists in NIST format to the byte data type file format required by GSS so that users can prepare their own custom line lists. We propose a standard for marking up and packaging model atmosphere and spectrum synthesis output for data transmission and storage that will facilitate a web-based approach to stellar atmospheric modeling and spectrum synthesis. We describe some pedagogical demonstrations and exercises enabled by easily accessible, on-demand, responsive spectrum synthesis. GSS may serve as a research support tool by providing quick spectroscopic reconnaissance. GSS may be found at www.ap.smu.ca/~ishort/OpenStars/GrayStarServer/grayStarServer.html, and source tarballs for local installations of both GSS and LineListServer may be found at www.ap.smu.ca/~ishort/OpenStars/.

  17. To Overcome HITS Rank Similarity Confliction of Web Pages using Weight Calculation and Rank Improvement

    NASA Astrophysics Data System (ADS)

    Nath, Rajender; Kumar, Naresh

    2011-12-01

    A search engine gives an ordered list of web search results in response to a user query, wherein the important pages are usually displayed at the top with less important ones afterwards. The user may therefore have to look through many screens of results to find the required documents. In the literature, many page-ranking algorithms have been proposed to compute the rank of a page; PageRank is the one considered in this work. This algorithm treats all links equally when distributing rank scores, so it sometimes assigns equal importance to different pages. In practice this is problematic: if two pages have the same rank, there is no way to judge which page is more important than the other. This paper therefore proposes a method to organize search results and decide which page is more important when PageRank produces a tie, so that the user can reach more relevant and important results easily and in a short span of time.

  18. The impact of visual layout factors on performance in Web pages: a cross-language study.

    PubMed

    Parush, Avi; Shwarts, Yonit; Shtub, Avy; Chandra, M Jeya

    2005-01-01

    Visual layout has a strong impact on performance and is a critical factor in the design of graphical user interfaces (GUIs) and Web pages. Many design guidelines employed in Web page design were inherited from human performance literature and GUI design studies and practices. However, few studies have investigated the more specific patterns of performance with Web pages that may reflect some differences between Web page and GUI design. We investigated interactions among four visual layout factors in Web page design (quantity of links, alignment, grouping indications, and density) in two experiments: one with pages in Hebrew, entailing right-to-left reading, and the other with English pages, entailing left-to-right reading. Some performance patterns (measured by search times and eye movements) were similar between languages. Performance was particularly poor in pages with many links and variable densities, but it improved with the presence of uniform density. Alignment was not shown to be a performance-enhancing factor. The findings are discussed in terms of the similarities and differences in the impact of layout factors between GUIs and Web pages. Actual or potential applications of this research include specific guidelines for Web page design.

  19. Performance, Accuracy, and Web Server for Evolutionary Placement of Short Sequence Reads under Maximum Likelihood

    PubMed Central

    Berger, Simon A.; Krompass, Denis; Stamatakis, Alexandros

    2011-01-01

    We present an evolutionary placement algorithm (EPA) and a Web server for the rapid assignment of sequence fragments (short reads) to edges of a given phylogenetic tree under the maximum-likelihood model. The accuracy of the algorithm is evaluated on several real-world data sets and compared with placement by pair-wise sequence comparison, using edit distances and BLAST. We introduce a slow and accurate as well as a fast and less accurate placement algorithm. For the slow algorithm, we develop additional heuristic techniques that yield almost the same run times as the fast version with only a small loss of accuracy. When those additional heuristics are employed, the run time of the more accurate algorithm is comparable with that of a simple BLAST search for data sets with a high number of short query sequences. Moreover, the accuracy of the EPA is significantly higher, in particular when the sample of taxa in the reference topology is sparse or inadequate. Our algorithm, which has been integrated into RAxML, therefore provides an equally fast but more accurate alternative to BLAST for tree-based inference of the evolutionary origin and composition of short sequence reads. We are also actively developing a Web server that offers a freely available service for computing read placements on trees using the EPA. PMID:21436105

  20. A Web Server and Mobile App for Computing Hemolytic Potency of Peptides

    PubMed Central

    Chaudhary, Kumardeep; Kumar, Ritesh; Singh, Sandeep; Tuknait, Abhishek; Gautam, Ankur; Mathur, Deepika; Anand, Priya; Varshney, Grish C.; Raghava, Gajendra P. S.

    2016-01-01

    Numerous therapeutic peptides do not enter the clinical trials just because of their high hemolytic activity. Recently, we developed a database, Hemolytik, for maintaining experimentally validated hemolytic and non-hemolytic peptides. The present study describes a web server and mobile app developed for predicting, and screening of peptides having hemolytic potency. Firstly, we generated a dataset HemoPI-1 that contains 552 hemolytic peptides extracted from Hemolytik database and 552 random non-hemolytic peptides (from Swiss-Prot). The sequence analysis of these peptides revealed that certain residues (e.g., L, K, F, W) and motifs (e.g., “FKK”, “LKL”, “KKLL”, “KWK”, “VLK”, “CYCR”, “CRR”, “RFC”, “RRR”, “LKKL”) are more abundant in hemolytic peptides. Therefore, we developed models for discriminating hemolytic and non-hemolytic peptides using various machine learning techniques and achieved more than 95% accuracy. We also developed models for discriminating peptides having high and low hemolytic potential on different datasets called HemoPI-2 and HemoPI-3. In order to serve the scientific community, we developed a web server, mobile app and JAVA-based standalone software (http://crdd.osdd.net/raghava/hemopi/). PMID:26953092
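
    A minimal sketch of the kind of sequence-based discrimination described above: amino-acid composition features fed to a generic classifier. The peptides, labels and model below are illustrative stand-ins, not the HemoPI datasets or the authors' trained models.

```python
# Sketch: composition features + a generic classifier for hemolytic vs. non-hemolytic peptides.
# Toy data only; not the HemoPI datasets or the published models.
import numpy as np
from sklearn.linear_model import LogisticRegression

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    """Fraction of each of the 20 standard residues in a peptide sequence."""
    seq = seq.upper()
    return np.array([seq.count(aa) / len(seq) for aa in AMINO_ACIDS])

peptides = ["FLKKLLKWLKKLL", "GSDQSTNNGEGSQ"]   # toy hemolytic-like vs. neutral-like sequences
labels = [1, 0]

X = np.vstack([composition(p) for p in peptides])
model = LogisticRegression().fit(X, labels)
print(model.predict([composition("KWKLFKKIEKVG")]))   # classify a new peptide
```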

  1. A Web Server and Mobile App for Computing Hemolytic Potency of Peptides

    NASA Astrophysics Data System (ADS)

    Chaudhary, Kumardeep; Kumar, Ritesh; Singh, Sandeep; Tuknait, Abhishek; Gautam, Ankur; Mathur, Deepika; Anand, Priya; Varshney, Grish C.; Raghava, Gajendra P. S.

    2016-03-01

    Numerous therapeutic peptides do not enter the clinical trials just because of their high hemolytic activity. Recently, we developed a database, Hemolytik, for maintaining experimentally validated hemolytic and non-hemolytic peptides. The present study describes a web server and mobile app developed for predicting, and screening of peptides having hemolytic potency. Firstly, we generated a dataset HemoPI-1 that contains 552 hemolytic peptides extracted from Hemolytik database and 552 random non-hemolytic peptides (from Swiss-Prot). The sequence analysis of these peptides revealed that certain residues (e.g., L, K, F, W) and motifs (e.g., “FKK”, “LKL”, “KKLL”, “KWK”, “VLK”, “CYCR”, “CRR”, “RFC”, “RRR”, “LKKL”) are more abundant in hemolytic peptides. Therefore, we developed models for discriminating hemolytic and non-hemolytic peptides using various machine learning techniques and achieved more than 95% accuracy. We also developed models for discriminating peptides having high and low hemolytic potential on different datasets called HemoPI-2 and HemoPI-3. In order to serve the scientific community, we developed a web server, mobile app and JAVA-based standalone software (http://crdd.osdd.net/raghava/hemopi/).

  2. The RING 2.0 web server for high quality residue interaction networks.

    PubMed

    Piovesan, Damiano; Minervini, Giovanni; Tosatto, Silvio C E

    2016-07-08

    Residue interaction networks (RINs) are an alternative way of representing protein structures where nodes are residues and arcs physico-chemical interactions. RINs have been extensively and successfully used for analysing mutation effects, protein folding, domain-domain communication and catalytic activity. Here we present RING 2.0, a new version of the RING software for the identification of covalent and non-covalent bonds in protein structures, including π-π stacking and π-cation interactions. RING 2.0 is extremely fast and generates both intra and inter-chain interactions including solvent and ligand atoms. The generated networks are very accurate and reliable thanks to a complex empirical re-parameterization of distance thresholds performed on the entire Protein Data Bank. By default, RING output is generated with optimal parameters but the web server provides an exhaustive interface to customize the calculation. The network can be visualized directly in the browser or in Cytoscape. Alternatively, the RING-Viz script for Pymol allows visualizing the interactions at atomic level in the structure. The web server and RING-Viz, together with an extensive help and tutorial, are available from URL: http://protein.bio.unipd.it/ring.

  3. A Web Server and Mobile App for Computing Hemolytic Potency of Peptides.

    PubMed

    Chaudhary, Kumardeep; Kumar, Ritesh; Singh, Sandeep; Tuknait, Abhishek; Gautam, Ankur; Mathur, Deepika; Anand, Priya; Varshney, Grish C; Raghava, Gajendra P S

    2016-03-08

    Numerous therapeutic peptides do not enter the clinical trials just because of their high hemolytic activity. Recently, we developed a database, Hemolytik, for maintaining experimentally validated hemolytic and non-hemolytic peptides. The present study describes a web server and mobile app developed for predicting, and screening of peptides having hemolytic potency. Firstly, we generated a dataset HemoPI-1 that contains 552 hemolytic peptides extracted from Hemolytik database and 552 random non-hemolytic peptides (from Swiss-Prot). The sequence analysis of these peptides revealed that certain residues (e.g., L, K, F, W) and motifs (e.g., "FKK", "LKL", "KKLL", "KWK", "VLK", "CYCR", "CRR", "RFC", "RRR", "LKKL") are more abundant in hemolytic peptides. Therefore, we developed models for discriminating hemolytic and non-hemolytic peptides using various machine learning techniques and achieved more than 95% accuracy. We also developed models for discriminating peptides having high and low hemolytic potential on different datasets called HemoPI-2 and HemoPI-3. In order to serve the scientific community, we developed a web server, mobile app and JAVA-based standalone software (http://crdd.osdd.net/raghava/hemopi/).

  4. Evolutionary Trace Annotation Server: automated enzyme function prediction in protein structures using 3D templates

    PubMed Central

    Matthew Ward, R.; Venner, Eric; Daines, Bryce; Murray, Stephen; Erdin, Serkan; Kristensen, David M.; Lichtarge, Olivier

    2009-01-01

    Summary: The Evolutionary Trace Annotation (ETA) Server predicts enzymatic activity. ETA starts with a structure of unknown function, such as those from structural genomics, and with no prior knowledge of its mechanism uses the phylogenetic Evolutionary Trace (ET) method to extract key functional residues and propose a function-associated 3D motif, called a 3D template. ETA then searches previously annotated structures for geometric template matches that suggest molecular and thus functional mimicry. In order to maximize the predictive value of these matches, ETA next applies distinctive specificity filters: evolutionary similarity, function plurality and match reciprocity. In large scale controls on enzymes, prediction coverage is 43% but the positive predictive value rises to 92%, thus minimizing false annotations. Users may modify any search parameter, including the template. ETA thus expands the ET suite for protein structure annotation, and can contribute to the annotation efforts of metaservers. Availability: The ETA Server is a web application available at http://mammoth.bcm.tmc.edu/eta/. Contact: lichtarge@bcm.edu PMID:19307237

  5. The web server of IBM's Bioinformatics and Pattern Discovery group: 2004 update.

    PubMed

    Huynh, Tien; Rigoutsos, Isidore

    2004-07-01

    In this report, we provide an update on the services and content which are available on the web server of IBM's Bioinformatics and Pattern Discovery group. The server, which is operational around the clock, provides access to a large number of methods that have been developed and published by the group's members. There is an increasing number of problems that these tools can help tackle; these problems range from the discovery of patterns in streams of events and the computation of multiple sequence alignments, to the discovery of genes in nucleic acid sequences, the identification--directly from sequence--of structural deviations from alpha-helicity and the annotation of amino acid sequences for antimicrobial activity. Additionally, annotations for more than 130 archaeal, bacterial, eukaryotic and viral genomes are now available on-line and can be searched interactively. The tools and code bundles continue to be accessible from http://cbcsrv.watson.ibm.com/Tspd.html whereas the genomics annotations are available at http://cbcsrv.watson.ibm.com/Annotations/.

  6. The RING 2.0 web server for high quality residue interaction networks

    PubMed Central

    Piovesan, Damiano; Minervini, Giovanni; Tosatto, Silvio C.E.

    2016-01-01

    Residue interaction networks (RINs) are an alternative way of representing protein structures where nodes are residues and arcs physico–chemical interactions. RINs have been extensively and successfully used for analysing mutation effects, protein folding, domain–domain communication and catalytic activity. Here we present RING 2.0, a new version of the RING software for the identification of covalent and non-covalent bonds in protein structures, including π–π stacking and π–cation interactions. RING 2.0 is extremely fast and generates both intra and inter-chain interactions including solvent and ligand atoms. The generated networks are very accurate and reliable thanks to a complex empirical re-parameterization of distance thresholds performed on the entire Protein Data Bank. By default, RING output is generated with optimal parameters but the web server provides an exhaustive interface to customize the calculation. The network can be visualized directly in the browser or in Cytoscape. Alternatively, the RING-Viz script for Pymol allows visualizing the interactions at atomic level in the structure. The web server and RING-Viz, together with an extensive help and tutorial, are available from URL: http://protein.bio.unipd.it/ring. PMID:27198219

  7. Analyzing Web pages visual scanpaths: between and within tasks variability.

    PubMed

    Drusch, Gautier; Bastien, J M Christian

    2012-01-01

    In this paper, we propose a new method for comparing scanpaths in a bottom-up approach, and a test of the scanpath theory. To do so, we conducted a laboratory experiment in which 113 participants were invited to accomplish a set of tasks on two different websites. For each site, they had to perform two tasks, each of which was repeated once. The data were analyzed using a procedure similar to the one used by Duchowski et al. [8]. The first step was to automatically identify, then label, AOIs with the mean-shift clustering procedure [19]. Then, scanpaths were compared two by two with a modified version of the string-edit method, which takes into account the order of AOI visualizations [2]. Our results show that scanpath variability between tasks but within participants seems to be lower than the variability within a task for a given participant. In other words, participants seem to be more coherent when they perform different tasks than when they repeat the same tasks. In addition, participants view more of the same AOIs when they perform a different task on the same Web page than when they repeat the same task. These results are quite different from what the scanpath theory predicts.
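
    A minimal sketch of the string-edit comparison referred to above: each scanpath is a sequence of AOI labels, and a Levenshtein distance, which is sensitive to visit order, measures their dissimilarity. The AOI labels are illustrative only, and the published method includes modifications not shown here.

```python
# Sketch: order-sensitive string-edit (Levenshtein) distance between two scanpaths.
def edit_distance(s, t):
    """Classic Levenshtein distance between two sequences of AOI labels."""
    prev = list(range(len(t) + 1))
    for i, a in enumerate(s, 1):
        curr = [i]
        for j, b in enumerate(t, 1):
            curr.append(min(prev[j] + 1,               # deletion
                            curr[j - 1] + 1,           # insertion
                            prev[j - 1] + (a != b)))   # substitution (free if labels match)
        prev = curr
    return prev[-1]

scanpath_task1 = ["header", "menu", "content", "content", "footer"]
scanpath_task2 = ["header", "content", "menu", "footer"]
print(edit_distance(scanpath_task1, scanpath_task2))   # 2
```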

  8. Page segmentation using script identification vectors: A first look

    SciTech Connect

    Hochberg, J.; Cannon, M.; Kelly, P.; White, J.

    1997-07-01

    Document images in which different scripts, such as Chinese and Roman, appear on a single page pose a problem for optical character recognition (OCR) systems. This paper explores the use of script identification vectors in the analysis of multilingual document images. A script identification vector is calculated for each connected component in a document. The vector expresses the closest distance between the component and templates developed for each of thirteen scripts, including Arabic, Chinese, Cyrillic, and Roman. The authors calculate the first three principal components within the resulting thirteen-dimensional space for each image. By mapping these components to red, green, and blue, they can visualize the information contained in the script identification vectors. The visualization of several multilingual images suggests that the script identification vectors can be used to segment images into script-specific regions as large as several paragraphs or as small as a few characters. The visualized vectors also reveal distinctions within scripts, such as font in Roman documents, and kanji vs. kana in Japanese. Results are best for documents containing highly dissimilar scripts such as Roman and Japanese. Documents containing similar scripts, such as Roman and Cyrillic will require further investigation.
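
    A minimal sketch of the visualization step described above: project 13-dimensional script identification vectors onto their first three principal components and map those to red, green and blue. The vectors below are random placeholders standing in for real per-component template distances.

```python
# Sketch: first three principal components of script-identification vectors mapped to RGB.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
vectors = rng.random((500, 13))            # one 13-D template-distance vector per connected component (placeholder)

components = PCA(n_components=3).fit_transform(vectors)

# Rescale each principal component to 0..255 so it can serve as a colour channel.
lo, hi = components.min(axis=0), components.max(axis=0)
rgb = ((components - lo) / (hi - lo) * 255).astype(np.uint8)
print(rgb[:3])                             # colours assigned to the first three components
```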

  9. SARA: a server for function annotation of RNA structures.

    PubMed

    Capriotti, Emidio; Marti-Renom, Marc A

    2009-07-01

    Recent interest in non-coding RNA transcripts has resulted in a rapid increase of deposited RNA structures in the Protein Data Bank. However, a characterization and functional classification of the RNA structure and function space have only been partially addressed. Here, we introduce the SARA program for pair-wise alignment of RNA structures as a web server for structure-based RNA function assignment. The SARA server relies on the SARA program, which aligns two RNA structures based on a unit-vector root-mean-square approach. The likely accuracy of the SARA alignments is assessed by three different P-values estimating the statistical significance of the sequence, secondary structure and tertiary structure identity scores, respectively. Our benchmarks, which relied on a set of 419 RNA structures with known SCOR structural class, indicate that at a negative logarithm of mean P-value higher or equal than 2.5, SARA can assign the correct or a similar SCOR class to 81.4% and 95.3% of the benchmark set, respectively. The SARA server is freely accessible via the World Wide Web at http://sgu.bioinfo.cipf.es/services/SARA/.

  10. A distributed clients/distributed servers model for STARCAT

    NASA Technical Reports Server (NTRS)

    Pirenne, B.; Albrecht, M. A.; Durand, D.; Gaudet, S.

    1992-01-01

    STARCAT, the Space Telescope ARchive and CATalogue user interface has been along for a number of years already. During this time it has been enhanced and augmented in a number of different fields. This time, we would like to dwell on a new capability allowing geographically distributed user interfaces to connect to geographically distributed data servers. This new concept permits users anywhere on the internet running STARCAT on their local hardware to access e.g., whichever of the 3 existing HST archive sites is available, or get information on the CFHT archive through a transparent connection to the CADC in BC or to get the La Silla weather by connecting to the ESO database in Munich during the same session. Similarly PreView (or quick look) images and spectra will also flow directly to the user from wherever it is available. Moving towards an 'X'-based STARCAT is another goal being pursued: a graphic/image server and a help/doc server are currently being added to it. They should further enhance the user independence and access transparency.

  11. Mobile object retrieval in server-based image databases

    NASA Astrophysics Data System (ADS)

    Manger, D.; Pagel, F.; Widak, H.

    2013-05-01

    The increasing number of mobile phones equipped with powerful cameras leads to huge collections of user-generated images. To utilize the information of the images on site, image retrieval systems are becoming more and more popular to search for similar objects in one's own image database. As the computational performance and the memory capacity of mobile devices are constantly increasing, this search can often be performed on the device itself. This is feasible, for example, if the images are represented with global image features or if the search is done using EXIF or textual metadata. However, for larger image databases, if multiple users are meant to contribute to a growing image database or if powerful content-based image retrieval methods with local features are required, a server-based image retrieval backend is needed. In this work, we present a content-based image retrieval system with a client server architecture working with local features. On the server side, the scalability to large image databases is addressed with the popular bag-of-word model with state-of-the-art extensions. The client end of the system focuses on a lightweight user interface presenting the most similar images of the database, highlighting the visual information which is common with the query image. Additionally, new images can be added to the database, making it a powerful and interactive tool for mobile content-based image retrieval.
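
    A minimal sketch of a bag-of-visual-words backend of the kind described: local ORB features, a k-means vocabulary and histogram matching. The image file names are placeholders, and the production system's state-of-the-art extensions (inverted files, geometric verification, etc.) are omitted.

```python
# Sketch: bag-of-visual-words retrieval with ORB features and a k-means vocabulary.
# File names are placeholders; supply real images to run this end to end.
import cv2
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.metrics.pairwise import cosine_similarity

orb = cv2.ORB_create(nfeatures=500)

def descriptors(path):
    """Extract local ORB descriptors from one grayscale image."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = orb.detectAndCompute(img, None)
    return desc.astype(np.float32)

database_paths = ["db_0001.jpg", "db_0002.jpg", "db_0003.jpg"]    # placeholder database images
all_desc = np.vstack([descriptors(p) for p in database_paths])

vocabulary = MiniBatchKMeans(n_clusters=64, random_state=0).fit(all_desc)

def bow_histogram(path):
    """Normalized visual-word histogram for one image."""
    words = vocabulary.predict(descriptors(path))
    hist, _ = np.histogram(words, bins=np.arange(65))
    return hist / max(hist.sum(), 1)

db_hists = np.vstack([bow_histogram(p) for p in database_paths])
query = bow_histogram("query_from_phone.jpg")                      # placeholder query image
ranking = cosine_similarity([query], db_hists)[0].argsort()[::-1]
print("most similar database images:", ranking)
```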

  12. Quartet decomposition server: a platform for analyzing phylogenetic trees

    PubMed Central

    2012-01-01

    Background The frequent exchange of genetic material among prokaryotes means that extracting a majority or plurality phylogenetic signal from many gene families, and the identification of gene families that are in significant conflict with the plurality signal is a frequent task in comparative genomics, and especially in phylogenomic analyses. Decomposition of gene trees into embedded quartets (unrooted trees each with four taxa) is a convenient and statistically powerful technique to address this challenging problem. This approach was shown to be useful in several studies of completely sequenced microbial genomes. Results We present here a web server that takes a collection of gene phylogenies, decomposes them into quartets, generates a Quartet Spectrum, and draws a split network. Users are also provided with various data download options for further analyses. Each gene phylogeny is to be represented by an assessment of phylogenetic information content, such as sets of trees reconstructed from bootstrap replicates or sampled from a posterior distribution. The Quartet Decomposition server is accessible at http://quartets.uga.edu. Conclusions The Quartet Decomposition server presented here provides a convenient means to perform Quartet Decomposition analyses and will empower users to find statistically supported phylogenetic conflicts. PMID:22676320

  13. Mining the SDSS SkyServer SQL queries log

    NASA Astrophysics Data System (ADS)

    Hirota, Vitor M.; Santos, Rafael; Raddick, Jordan; Thakar, Ani

    2016-05-01

    SkyServer, the Internet portal for the Sloan Digital Sky Survey (SDSS) astronomic catalog, provides a set of tools that allows data access for astronomers and scientific education. One of SkyServer's data access interfaces allows users to enter ad-hoc SQL statements to query the catalog. SkyServer also presents some template queries that can be used as a basis for more complex queries. This interface has logged over 330 million queries submitted since 2001. It is expected that analysis of this data can be used to investigate usage patterns, identify potential new classes of queries, find similar queries, etc. and to shed some light on how users interact with the Sloan Digital Sky Survey data and how scientists have adopted the new paradigm of e-Science, which could in turn lead to enhancements on the user interfaces and experience in general. In this paper we review some approaches to SQL query mining, apply the traditional techniques used in the literature and present lessons learned, namely, that the general text mining approach for feature extraction and clustering does not seem to be adequate for this type of data, and, most importantly, we find that this type of analysis can result in very different queries being clustered together.
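
    A minimal sketch of the general text-mining approach discussed above (and found wanting by the authors): normalize the logged SQL strings, vectorize them with TF-IDF and cluster with k-means. The queries below are invented examples, not actual SkyServer log entries.

```python
# Sketch: generic text-mining pipeline over SQL query strings (TF-IDF + k-means).
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

log = [
    "SELECT ra, dec FROM PhotoObj WHERE r < 17.5",
    "SELECT ra, dec FROM PhotoObj WHERE g < 18.0",
    "SELECT z FROM SpecObj WHERE class = 'QSO'",
    "SELECT z FROM SpecObj WHERE zWarning = 0",
]   # invented examples standing in for logged queries

def normalize(sql):
    """Lowercase and replace string/numeric literals with generic tokens."""
    sql = sql.lower()
    sql = re.sub(r"'[^']*'", "strlit", sql)
    return re.sub(r"\b\d+(\.\d+)?\b", "numlit", sql)

X = TfidfVectorizer().fit_transform(normalize(q) for q in log)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # queries grouped by lexical similarity
```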

  14. Federated healthcare record server--the Synapses paradigm.

    PubMed

    Grimson, W; Berry, D; Grimson, J; Stephens, G; Felton, E; Given, P; O'Moore, R

    1998-01-01

    The delivery of healthcare relies on the sharing of patient information between those who are providing for the care of the patient and this information is increasingly being expressed in terms of a 'record'. Further, it is desirable that these records are available in electronic form as Electronic HealthCare Records. As it is likely that patient records or parts of records will be stored in many different information systems and in the form of disparate record architectures, uniform access to patient records would be problematic. This paper presents an overview of the Synapses computing environment in which a Federated Healthcare Record Server provides uniform access to patient information stored in connected heterogeneous autonomous information systems and other Synapses servers. The Synapses record architecture is based on the architecture proposed by the Technical Committee 251 of the European Committee for Standardisation and the interfaces to the Synapses server are specified in the ISO standard Interface Definition Language. Synapses is a pan-European project involving a number of hospitals, software companies, universities and research institutes and is partly funded by the EU Health Telematics Programme. The overview is described in terms of the Open Distributed Processing Reference Model.

  15. Personal Web home pages of adolescents with cancer: self-presentation, information dissemination, and interpersonal connection.

    PubMed

    Suzuki, Lalita K; Beale, Ivan L

    2006-01-01

    The content of personal Web home pages created by adolescents with cancer is a new source of information about this population of potential benefit to oncology nurses and psychologists. Individual Internet elements found on 21 home pages created by youths with cancer (14-22 years old) were rated for cancer-related self-presentation, information dissemination, and interpersonal connection. Examples of adolescents' online narratives were also recorded. Adolescents with cancer used various Internet elements on their home pages for cancer-related self-presentation (eg, welcome messages, essays, personal history and diary pages, news articles, and poetry), information dissemination (e.g., through personal interest pages, multimedia presentations, lists, charts, and hyperlinks), and interpersonal connection (eg, guestbook entries). Results suggest that various elements found on personal home pages are being used by a limited number of young patients with cancer for self-expression, information access, and contact with peers.

  16. Calculating PageRank in a changing network with added or removed edges

    NASA Astrophysics Data System (ADS)

    Engström, Christopher; Silvestrov, Sergei

    2017-01-01

    PageRank was initially developed by S. Brin and L. Page in 1998 to rank homepages on the Internet using the stationary distribution of a Markov chain created from the web graph. Due to the large size of the web graph and many other real-world networks, fast methods to calculate PageRank are needed, and even though the original way of calculating PageRank using power iteration is rather fast, many other approaches have been proposed to improve the speed further. In this paper we consider the problem of recalculating PageRank of a changing network where the PageRank of a previous version of the network is known. In particular we consider the special case of adding or removing edges to a single vertex in the graph or graph component.
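
    A minimal sketch of PageRank by power iteration on a small directed graph, followed by a naive full recomputation after an edge is added; avoiding exactly this full recomputation when only one vertex's edges change is the problem the paper addresses.

```python
# Sketch: PageRank by power iteration, then naive recomputation after an edge change.
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10):
    """adj[i, j] = 1 if page i links to page j; returns the PageRank vector."""
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    out[out == 0] = 1                          # guard against division by zero for dangling nodes
    P = adj / out                              # row-stochastic transition matrix
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * (P.T @ r)    # power iteration step with damping
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

A = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
print(pagerank(A))

A[1, 0] = 1                                    # add an edge from vertex 1 to vertex 0
print(pagerank(A))                             # naive approach: recompute from scratch
```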

  17. A biplex approach to PageRank centrality: From classic to multiplex networks.

    PubMed

    Pedroche, Francisco; Romance, Miguel; Criado, Regino

    2016-06-01

    In this paper, we present a new view of the PageRank algorithm inspired by multiplex networks. This new approach allows us to introduce a new centrality measure for classic complex networks and a new proposal to extend the usual PageRank algorithm to multiplex networks. We give some analytical relations between these new approaches and the classic PageRank centrality measure, and we illustrate the new parameters presented by computing them on real underground networks.

  18. From honeybees to Internet servers: biomimicry for distributed management of Internet hosting centers.

    PubMed

    Nakrani, Sunil; Tovey, Craig

    2007-12-01

    An Internet hosting center hosts services on its server ensemble. The center must allocate servers dynamically amongst services to maximize revenue earned from hosting fees. The finite server ensemble, unpredictable request arrival behavior and server reallocation cost make server allocation optimization difficult. Server allocation closely resembles honeybee forager allocation amongst flower patches to optimize nectar influx. The resemblance inspires a honeybee biomimetic algorithm. This paper describes details of the honeybee self-organizing model in terms of information flow and feedback, analyzes the homology between the two problems and derives the resulting biomimetic algorithm for hosting centers. The algorithm is assessed for effectiveness and adaptiveness by comparative testing against benchmark and conventional algorithms. Computational results indicate that the new algorithm is highly adaptive to widely varying external environments and quite competitive against benchmark assessment algorithms. Other swarm intelligence applications are briefly surveyed, and some general speculations are offered regarding their various degrees of success.

  19. A Predictive Performance Model to Evaluate the Contention Cost in Application Servers

    SciTech Connect

    Chen, Shiping; Gorton, Ian

    2002-12-04

    In multi-tier enterprise systems, application servers are key components that implement business logic and provide application services. To support a large number of simultaneous accesses from clients over the Internet and intranet, most application servers use replication and multi-threading to handle concurrent requests. While multiple processes and multiple threads enhance the processing bandwidth of servers, they also increase the contention for resources in application servers. This paper investigates this issue empirically based on a middleware benchmark. A cost model is proposed to estimate the overall performance of application servers, including the contention overhead. This model is then used to determine the optimal degree of the concurrency of application servers for a specific client load. A case study based on CORBA is presented to validate our model and demonstrate its application.
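
    A hedged illustration of using a contention-aware throughput model to choose a degree of concurrency. This sketch uses the generic Universal Scalability Law rather than the paper's own cost model, and the coefficients are invented for demonstration.

```python
# Sketch: pick an application-server concurrency level from a contention-aware
# throughput model (Universal Scalability Law, not the paper's cost model).
import numpy as np

def throughput(n, base_rate=120.0, sigma=0.05, kappa=0.002):
    """Requests/s with n concurrent threads, penalized for contention (sigma)
    and coherency/crosstalk cost (kappa)."""
    return base_rate * n / (1 + sigma * (n - 1) + kappa * n * (n - 1))

threads = np.arange(1, 101)
rates = throughput(threads)
best = threads[rates.argmax()]
print(f"optimal concurrency under this model: {best} threads ({rates.max():.0f} req/s)")
```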

  20. 47 CFR 22.201 - Paging geographic area authorizations are subject to competitive bidding.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Licensing Requirements and Procedures Competitive.... Mutually exclusive initial applications for paging geographic area licenses are subject to...

  1. 47 CFR 22.201 - Paging geographic area authorizations are subject to competitive bidding.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Licensing Requirements and Procedures Competitive.... Mutually exclusive initial applications for paging geographic area licenses are subject to...

  2. Measuring consistency of web page design and its effects on performance and satisfaction.

    PubMed

    Ozok, A A; Salvendy, G

    2000-04-01

    This study examines the methods for measuring the consistency levels of web pages and the effect of consistency on the performance and satisfaction of the world-wide web (WWW) user. For clarification, a home page is referred to as a single page that is the default page of a web site on the WWW. A web page refers to a single screen that indicates a specific address on the WWW. This study has tested a series of web pages that were mostly hyperlinked. Therefore, the term 'web page' has been adopted for the nomenclature while referring to the objects of which the features were tested. It was hypothesized that participants would perform better and be more satisfied using web pages that have consistent rather than inconsistent interface design; that the overall consistency level of an interface design would significantly correlate with the three elements of consistency, physical, communicational and conceptual consistency; and that physical and communicational consistencies would interact with each other. The hypotheses were tested in a four-group, between-subject design, with 10 participants in each group. The results partially support the hypothesis regarding error rate, but not regarding satisfaction and performance time. The results also support the hypothesis that each of the three elements of consistency significantly contribute to the overall consistency of a web page, and that physical and communicational consistencies interact with each other, while conceptual consistency does not interact with them.

  3. Resources and Fact Sheets on Servicing Motor Vehicle Air Conditioners (Summary Page)

    EPA Pesticide Factsheets

    Page provides links to resources that can assist motor vehicle air-conditioning system technicians in understanding system servicing requirements and best practices, and learn about alternative refrigerants.

  4. Flood-inundation maps for the DuPage River from Plainfield to Shorewood, Illinois, 2013

    USGS Publications Warehouse

    Murphy, Elizabeth A.; Sharpe, Jennifer B.

    2013-01-01

    Digital flood-inundation maps for a 15.5-mi reach of the DuPage River from Plainfield to Shorewood, Illinois, were created by the U.S. Geological Survey (USGS) in cooperation with the Will County Stormwater Management Planning Committee. The inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/ depict estimates of the areal extent of flooding corresponding to selected water levels (gage heights or stages) at the USGS streamgage at DuPage River at Shorewood, Illinois (sta. no. 05540500). Current conditions at the USGS streamgage may be obtained on the Internet at http://waterdata.usgs.gov/usa/nwis/uv?05540500. In addition, the information has been provided to the National Weather Service (NWS) for incorporation into their Advanced Hydrologic Prediction Service (AHPS) flood warning system (http://water.weather.gov/ahps/). The NWS forecasts flood hydrographs at many places that are often colocated with USGS streamgages. The NWS-forecasted peak-stage information, also shown on the DuPage River at Shorewood inundation Web site, may be used in conjunction with the maps developed in this study to show predicted areas of flood inundation. In this study, flood profiles were computed for the stream reach by means of a one-dimensional step-backwater model. The hydraulic model was then used to determine nine water-surface profiles for flood stages at 1-ft intervals referenced to the streamgage datum and ranging from NWS Action stage of 6 ft to the historic crest of 14.0 ft. The simulated water-surface profiles were then combined with a Digital Elevation Model (DEM) (derived from Light Detection And Ranging (LiDAR) data) by using a Geographic Information System (GIS) in order to delineate the area flooded at each water level. These maps, along with information on the Internet regarding current gage height from USGS streamgages and forecasted stream stages from the NWS, provide emergency

  5. Adventures in the evolution of a high-bandwidth network for central servers

    SciTech Connect

    Swartz, K.L.; Cottrell, L.; Dart, M.

    1994-08-01

    In a small network, clients and servers may all be connected to a single Ethernet without significant performance concerns. As the number of clients on a network grows, the necessity of splitting the network into multiple sub-networks, each with a manageable number of clients, becomes clear. Less obvious is what to do with the servers. Group file servers on subnets and multihomed servers offer only partial solutions -- many other types of servers do not lend themselves to a decentralized model, and tend to collect on another, well-connected but overloaded Ethernet. The higher speed of FDDI seems to offer an easy solution, but in practice both expense and interoperability problems render FDDI a poor choice. Ethernet switches appear to permit cheaper and more reliable networking to the servers while providing an aggregate network bandwidth greater than a simple Ethernet. This paper studies the evolution of the server networks at SLAC. Difficulties encountered in the deployment of FDDI are described, as are the tools and techniques used to characterize the traffic patterns on the server network. Performance of Ethernet, FDDI, and switched Ethernet networks is analyzed, as are reliability and maintainability issues for these alternatives. The motivations for re-designing the SLAC general server network to use a switched Ethernet instead of FDDI are described, as are the reasons for choosing FDDI for the farm and firewall networks at SLAC. Guidelines are developed which may help in making this choice for other networks.

  6. Asynchronous data change notification between database server and accelerator controls system

    SciTech Connect

    Fu, W.; Morris, J.; Nemesure, S.

    2011-10-10

    Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMS's which support DCN (such as Oracle and MS SQL server), some server side and/or client side programming may be required to make the DCN system work. This makes the setup of DCN between database server and interested clients tedious and time consuming. In accelerator control systems, there are many well established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. Asynchronous data change notification (ADCN) between database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMS systems, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
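
    A minimal sketch of the trigger-plus-reflection-server pattern described above, assuming SQLite as a stand-in DBMS and a plain Python callback in place of a CDEV/EPICS/ADO reflection server; the table and column names are hypothetical.

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
            CREATE TABLE settings(name TEXT PRIMARY KEY, value REAL);
            CREATE TABLE change_log(id INTEGER PRIMARY KEY AUTOINCREMENT,
                                    name TEXT, value REAL);
            -- The trigger records every change; a reflection server polls this table
            -- and forwards updates to its clients through the usual SET/GET API.
            CREATE TRIGGER settings_changed AFTER UPDATE ON settings
            BEGIN
                INSERT INTO change_log(name, value) VALUES (NEW.name, NEW.value);
            END;
            INSERT INTO settings VALUES ('beam_current', 0.0);
        """)

        def poll_and_publish(last_seen_id, publish):
            """One polling cycle of the hypothetical reflection server."""
            rows = db.execute(
                "SELECT id, name, value FROM change_log WHERE id > ? ORDER BY id",
                (last_seen_id,)).fetchall()
            for row_id, name, value in rows:
                publish(name, value)              # push to subscribed clients
                last_seen_id = row_id
            return last_seen_id

        db.execute("UPDATE settings SET value = 42.0 WHERE name = 'beam_current'")
        db.commit()
        poll_and_publish(0, lambda n, v: print(f"notify clients: {n} = {v}"))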

  7. Group-oriented coordination models for distributed client-server computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Hughes, Craig S.

    1994-01-01

    This paper describes group-oriented control models for distributed client-server interactions. These models transparently coordinate requests for services that involve multiple servers, such as queries across distributed databases. Specific capabilities include: decomposing and replicating client requests; dispatching request subtasks or copies to independent, networked servers; and combining server results into a single response for the client. The control models were implemented by combining request broker and process group technologies with an object-oriented communication middleware tool. The models are illustrated in the context of a distributed operations support application for space-based systems.
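
    A minimal sketch of the group-oriented coordination pattern described above: a client request is replicated to several servers in parallel and the partial results are combined into one response. The server names and the query_server stub are hypothetical stand-ins for real networked services reached through a request broker.

        from concurrent.futures import ThreadPoolExecutor

        SERVERS = ["db-east.example.org", "db-west.example.org", "db-eu.example.org"]

        def query_server(server, query):
            """Placeholder for a remote call (e.g. a request broker invocation)."""
            return [f"{server}: row matching '{query}'"]      # pretend result set

        def coordinated_query(query):
            # Dispatch copies of the request to every member of the server group,
            # then merge the partial result sets into a single client response.
            with ThreadPoolExecutor(max_workers=len(SERVERS)) as pool:
                partials = pool.map(lambda s: query_server(s, query), SERVERS)
            return [row for partial in partials for row in partial]

        print(coordinated_query("SELECT * FROM telemetry WHERE id = 7"))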

  8. Creating a GIS data server on the World Wide Web: The GISST example

    SciTech Connect

    Pace, P.J.; Evers, T.K.

    1996-01-01

    In an effort to facilitate user access to Geographic Information Systems (GIS) data, the GIS and Computer Modeling Group from the Computational Physics and Engineering Division at the Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee (TN), has developed a World Wide Web server named GISST. The server incorporates a highly interactive and dynamic forms-based interface to browse and download a variety of GIS data types. This paper describes the server's design considerations, development, resulting implementation and future enhancements.

  9. 75 FR 43206 - In the Matter of Certain Wireless Communications System Server Software, Wireless Handheld...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-23

    ... and Battery Packs: Notice of Commission Determination Not To Review An Initial Determination... communications system server software, wireless handheld devices and battery packs by reason of infringement...

  10. Environment: General; Grammar & Usage; Money Management; Music History; Web Page Creation & Design.

    ERIC Educational Resources Information Center

    Web Feet, 2001

    2001-01-01

    Describes Web site resources for elementary and secondary education in the topics of: environment, grammar, money management, music history, and Web page creation and design. Each entry includes an illustration of a sample page on the site and an indication of the grade levels for which it is appropriate. (AEF)

  11. 47 CFR 22.509 - Procedures for mutually exclusive applications in the Paging and Radiotelephone Service.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Procedures for mutually exclusive applications... COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Paging and Radiotelephone Service § 22.509 Procedures for mutually exclusive applications in the Paging and Radiotelephone...

  12. 47 CFR 22.509 - Procedures for mutually exclusive applications in the Paging and Radiotelephone Service.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Procedures for mutually exclusive applications... COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Paging and Radiotelephone Service § 22.509 Procedures for mutually exclusive applications in the Paging and Radiotelephone...

  13. Web Page Design and Graphic Use of Three U.S. Newspapers.

    ERIC Educational Resources Information Center

    Li, Xigen

    1998-01-01

    Contributes to scholarship on journalism and new technology by exploring approaches to Web page design and graphic use in three Internet newspapers. Explores how they demonstrate a change from the convention of newspaper publishing to the new media age, and how Web page design and graphic use reflect interconnectedness and a shift of control from…

  14. A New Era of Search Engines: Not Just Web Pages Anymore.

    ERIC Educational Resources Information Center

    Hock, Ran

    2002-01-01

    Discusses various types of information that can be retrieved from the Web via search engines. Highlights include Web pages; time frames, including historical coverage and currentness; text pages in formats other than HTML; directory sites; news articles; discussion groups; images; and audio and video. (LRW)

  15. Future Trends in Children's Web Pages: Probing Hidden Biases for Information Quality

    ERIC Educational Resources Information Center

    Kurubacak, Gulsun

    2007-01-01

    As global digital communication continues to flourish, Children's Web pages become more critical for children to realize not only the surface but also breadth and deeper meanings in presenting these milieus. These pages not only are very diverse and complex but also enable intense communication across social, cultural and political restrictions…

  17. The Situated Aspect of Creativity in Communicative Events: How Do Children Design Web Pages Together?

    ERIC Educational Resources Information Center

    Fernandez-Cardenas, Juan Manuel

    2008-01-01

    This paper looks at the collaborative construction of web pages in History by a Year-4 group of children in a primary school in the UK. The aim of this paper is to find out: (a) How did children interpret their involvement in this literacy practice? (b) How the construction of web pages was interactionally accomplished? and (c) How can creativity…

  18. Exploring Cultural Variation in Eye Movements on a Web Page between Americans and Koreans

    ERIC Educational Resources Information Center

    Yang, Changwoo

    2009-01-01

    This study explored differences in eye movement on a Web page between members of two different cultures to provide insight and guidelines for implementation of global Web site development. More specifically, the research examines whether differences of eye movement exist between the two cultures (American vs. Korean) when viewing a Web page, and…

  19. 47 CFR 90.490 - One-way paging operations in the private services.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... directly from telephone positions in the public switched telephone network. When land stations are multiple... 47 Telecommunication 5 2012-10-01 2012-10-01 false One-way paging operations in the private... SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Paging Operations § 90.490 One-way...

  20. 47 CFR 90.490 - One-way paging operations in the private services.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... directly from telephone positions in the public switched telephone network. When land stations are multiple... 47 Telecommunication 5 2014-10-01 2014-10-01 false One-way paging operations in the private... SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Paging Operations § 90.490 One-way...

  1. Teaching E-Commerce Web Page Evaluation and Design: A Pilot Study Using Tourism Destination Sites

    ERIC Educational Resources Information Center

    Susser, Bernard; Ariga, Taeko

    2006-01-01

    This study explores a teaching method for improving business students' skills in e-commerce page evaluation and making Web design majors aware of business content issues through cooperative learning. Two groups of female students at a Japanese university studying either tourism or Web page design were assigned tasks that required cooperation to…

  2. World of Our Mothers: The Women's Page of the "Jewish Daily Forward" in 1919.

    ERIC Educational Resources Information Center

    Seller, Maxine Schwartz

    In 1919 the "Jewish Daily Forward" published in New York City was the leading Yiddish language newspaper in the world. This analysis explores how the themes of socialism, feminism, and Americanization were defined and developed on the women's pages, and what advice and information the page transmitted to its immigrant readers about each…

  3. Determination of the Subunit Molecular Mass and Composition of Alcohol Dehydrogenase by SDS-PAGE

    ERIC Educational Resources Information Center

    Nash, Barbara T.

    2007-01-01

    SDS-PAGE is a simple, rapid technique that has many uses in biochemistry and is readily adaptable to the undergraduate laboratory. It is, however, a technique prone to several types of procedural pitfalls. This article describes the use of SDS-PAGE to determine the subunit molecular mass and composition of yeast alcohol dehydrogenase employing…
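
    For concreteness, the calibration calculation behind such an exercise fits log10(molecular mass) of the marker proteins against relative mobility (Rf) and interpolates the unknown band; the marker values, the measured Rf and the native mass used below are illustrative numbers, not data from the article.

        import numpy as np

        marker_mass_kda = np.array([97.4, 66.2, 45.0, 31.0, 21.5, 14.4])  # standards
        marker_rf       = np.array([0.18, 0.30, 0.44, 0.59, 0.73, 0.86])  # mobilities

        # Standard SDS-PAGE calibration: log10(mass) is roughly linear in Rf.
        slope, intercept = np.polyfit(marker_rf, np.log10(marker_mass_kda), 1)

        def estimate_mass(rf):
            """Subunit molecular mass (kDa) estimated from a band's relative mobility."""
            return 10 ** (slope * rf + intercept)

        unknown_rf = 0.50                    # hypothetical mobility of the ADH band
        subunit = estimate_mass(unknown_rf)
        native_mass = 147.0                  # kDa, assumed native mass for illustration
        print(f"subunit ~ {subunit:.0f} kDa, ~{round(native_mass / subunit)} subunits")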

  4. Crafting a Web-Based Lesson. Part Two: Organizing the Information and Constructing the Page.

    ERIC Educational Resources Information Center

    Quinlan, Laurie A.

    1997-01-01

    Provides guidelines for the organization of information (tables of contents, headings, grouping, URLs--Uniform Resource Locators--and e-mail tags) and the construction of a Web page (size considerations, concision, spelling, file dates, e-mail links, visual appeal, link relevance, copyrights). Includes source code for a sample lesson page and…

  5. Social Responsibility and Corporate Web Pages: Self-Presentation or Agenda-Setting?

    ERIC Educational Resources Information Center

    Esrock, Stuart L.; Leichty, Greg B.

    1998-01-01

    Examines how corporate entities use the Web to present themselves as socially responsible citizens and to advance policy positions. Samples randomly "Fortune 500" companies, revealing that, although 90% had Web pages and 82% of the sites addressed a corporate social responsibility issue, few corporations used their pages to monitor…

  6. Formal Features of Cyberspace: Relationships between Web Page Complexity and Site Traffic.

    ERIC Educational Resources Information Center

    Bucy, Erik P.; Lang, Annie; Potter, Robert F.; Grabe, Maria Elizabeth

    1999-01-01

    Examines differences between the formal features of commercial versus noncommercial Web sites, and the relationship between Web page complexity and amount of traffic a site receives. Findings indicate that, although most pages in this stage of the Web's development remain technologically simple and noninteractive, there are significant…

  7. CTserver: A Computational Thermodynamics Server for the Geoscience Community

    NASA Astrophysics Data System (ADS)

    Kress, V. C.; Ghiorso, M. S.

    2006-12-01

    The CTserver platform is an Internet-based computational resource that provides on-demand services in Computational Thermodynamics (CT) to a diverse geoscience user base. This NSF-supported resource can be accessed at ctserver.ofm-research.org. The CTserver infrastructure leverages a high-quality and rigorously tested software library of routines for computing equilibrium phase assemblages and for evaluating internally consistent thermodynamic properties of materials, e.g. mineral solid solutions and a variety of geological fluids, including magmas. Thermodynamic models are currently available for 167 phases. Recent additions include Duan, Møller and Weare's model for supercritical C-O-H-S, extended to include SO2 and S2 species, and an entirely new associated solution model for O-S-Fe-Ni sulfide liquids. This software library is accessed via the CORBA Internet protocol for client-server communication. CORBA provides a standardized, object-oriented, language and platform independent, fast, low-bandwidth interface to phase property modules running on the server cluster. Network transport, language translation and resource allocation are handled by the CORBA interface. Users access server functionality in two principal ways. Clients written as browser-based Java applets may be downloaded which provide specific functionality such as retrieval of thermodynamic properties of phases, computation of phase equilibria for systems of specified composition, or modeling the evolution of these systems along some particular reaction path. This level of user interaction requires minimal programming effort and is ideal for classroom use. A more universal and flexible mode of CTserver access involves making remote procedure calls from user programs directly to the server public interface. The CTserver infrastructure relieves the user of the burden of implementing and testing the often complex thermodynamic models of real liquids and solids. A pilot application of this distributed

  8. Client/Server data serving for high performance computing

    NASA Technical Reports Server (NTRS)

    Wood, Chris

    1994-01-01

    This paper will attempt to examine the industry requirements for shared network data storage and sustained high speed (tens to hundreds to thousands of megabytes per second) network data serving via the NFS and FTP protocol suite. It will discuss the current structural and architectural impediments to achieving these sorts of data rates cost effectively today on many general purpose servers, and will describe an architecture and resulting product family that addresses these problems. The sustained performance levels that were achieved in the lab will be shown, as well as a discussion of early customer experiences utilizing both the HIPPI-IP and ATM OC3-IP network interfaces.

  9. High-Performance Tiled WMS and KML Web Server

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2007-01-01

    This software is an Apache 2.0 module implementing a high-performance map server to support interactive map viewers and virtual planet client software. It can be used in applications that require access to very-high-resolution geolocated images, such as GIS, virtual planet applications, and flight simulators. It serves Web Map Service (WMS) requests that comply with a given request grid from an existing tile dataset. It also generates the KML super-overlay configuration files required to access the WMS image tiles.
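
    As an illustration of the kind of request such a server answers, the sketch below builds an ordinary WMS 1.1.1 GetMap URL for one tile; the server URL, layer name and tile bounds are hypothetical.

        from urllib.parse import urlencode

        def getmap_url(base_url, layer, bbox, width=512, height=512):
            """Build a WMS GetMap URL; bbox = (lon_min, lat_min, lon_max, lat_max)."""
            params = {
                "SERVICE": "WMS",
                "VERSION": "1.1.1",
                "REQUEST": "GetMap",
                "LAYERS": layer,
                "STYLES": "",
                "SRS": "EPSG:4326",
                "BBOX": ",".join(str(v) for v in bbox),
                "WIDTH": width,
                "HEIGHT": height,
                "FORMAT": "image/jpeg",
            }
            return f"{base_url}?{urlencode(params)}"

        # Hypothetical 512x512 tile aligned to a 22.5-degree request grid.
        print(getmap_url("https://example.org/wms", "global_mosaic",
                         (-135.0, 22.5, -112.5, 45.0)))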

  10. Deploying Server-side File System Monitoring at NERSC

    SciTech Connect

    Uselton, Andrew

    2009-05-01

    The Franklin Cray XT4 at the NERSC center was equipped with the server-side I/O monitoring infrastructure Cerebro/LMT, which is described here in detail. Insights gained from the data produced include a better understanding of instantaneous data rates during file system testing, file system behavior during regular production time, and long-term average behaviors. Information and insights gleaned from this monitoring support efforts to proactively manage the I/O infrastructure on Franklin. A simple model for I/O transactions is introduced and compared with the 250 million observations sent to the LMT database from August 2008 to February 2009.

  11. BPROMPT: A consensus server for membrane protein prediction.

    PubMed

    Taylor, Paul D; Attwood, Teresa K; Flower, Darren R

    2003-07-01

    Protein structure prediction is a cornerstone of bioinformatics research. Membrane proteins require their own prediction methods due to their intrinsically different composition. A variety of tools exist for topology prediction of membrane proteins, many of them available on the Internet. The server described in this paper, BPROMPT (Bayesian PRediction Of Membrane Protein Topology), uses a Bayesian Belief Network to combine the results of other prediction methods, providing a more accurate consensus prediction. Topology predictions with accuracies of 70% for prokaryotes and 53% for eukaryotes were achieved. BPROMPT can be accessed at http://www.jenner.ac.uk/BPROMPT.
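
    To illustrate the consensus idea only, the sketch below combines per-residue topology calls from several predictors by simple majority vote; BPROMPT itself combines methods with a Bayesian Belief Network rather than a vote, and the predictor outputs shown are invented ('M' = membrane helix, 'i'/'o' = inside/outside).

        from collections import Counter

        predictions = {
            "method_A": "iiiMMMMMMMoooMMMMMMMiii",
            "method_B": "iiiiMMMMMMoooMMMMMMMiii",
            "method_C": "iiMMMMMMMMoooMMMMMMMMii",
        }

        def consensus(preds):
            columns = zip(*preds.values())        # one column of calls per residue
            return "".join(Counter(col).most_common(1)[0][0] for col in columns)

        print(consensus(predictions))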

  12. Evaluating the online activity of users of the e-Bug web site.

    PubMed

    de Quincey, Ed; Kostkova, Patty; Jawaheer, Gawesh; Farrell, David; McNulty, Cliodna A M; Weinberg, Julius

    2011-06-01

    Web server log analysis is being increasingly used to evaluate the user behaviour on healthcare resource web sites due to the detailed record of activity that they contain. This study aimed to use this information to evaluate the e-Bug web site, a healthcare resource that provides a range of educational resources about microbes, hand and respiratory hygiene, and antibiotics. This evaluation was conducted by analysing the web server logs of the e-Bug web site for the period January 2008 to November 2009, using a proprietary application named Sawmill. The e-Bug web site has had >900,000 page views generated from >88,000 users, with an increase in May 2009 during the swine flu epidemic and a further increase in September 2009 following the official launch of e-Bug. The majority of visitors were from the UK, but visits were recorded from 190 different countries. Word(®) document resources were downloaded >169,000 times, with the most popular being a swine flu factsheet. PowerPoint(®) document resources were downloaded >36,000 times, with the most popular relating to the 'chain of infection'. The majority of visitor referrals originated from search engines, with the most popular referral keywords being variations on the e-Bug name. The most common non-search engine referrals were from other healthcare resources and agencies. Use of the site has increased markedly since the official launch of e-Bug, with average page views of >200,000 per month, from a range of countries, illustrating the international demand for a teaching resource for microbes, hygiene and antibiotics.
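
    A small sketch of the kind of log analysis described above (the study itself used the proprietary Sawmill application): parse Apache combined-format access-log lines and tally page views per month together with the most frequently downloaded documents. The log file name and the assumption that document downloads end in .doc or .ppt are illustrative.

        import re
        from collections import Counter

        LINE = re.compile(r'\S+ \S+ \S+ \[(\d+/\w+/\d+):[^\]]+\] "GET (\S+) HTTP[^"]*" (\d{3})')

        views_per_month = Counter()
        downloads = Counter()

        with open("access.log") as log:                     # hypothetical log file
            for line in log:
                m = LINE.match(line)
                if not m or m.group(3) != "200":
                    continue
                date, path = m.group(1), m.group(2)
                month = date.split("/", 1)[1]               # e.g. "May/2009"
                views_per_month[month] += 1
                if path.lower().endswith((".doc", ".ppt")):
                    downloads[path] += 1

        print(views_per_month.most_common(5))
        print(downloads.most_common(5))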

  13. Opportunities for the Mashup of Heterogenous Data Server via Semantic Web Technology

    NASA Astrophysics Data System (ADS)

    Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; King, Todd; Hughes, John; Fung, Shing; Galkin, Ivan; Hapgood, Michael; Belehaki, Anna

    2015-04-01

    The European Union ESPAS, Japanese IUGONET and GFZ ISDC data servers were developed for the ingestion, archiving and distribution of geoscience and space science domain data. The main parts of the data managed by these servers relate to near-Earth space and geomagnetic field data. A smart mashup of the data servers would allow seamless browsing of and access to data and related context information. However, achieving a high level of interoperability is a challenge because the data servers are based on different data models and software frameworks. This paper focuses on the latest experiments and results for the mashup of the data servers using the semantic Web approach. Besides the mashup of domain and terminological ontologies, the options for connecting data managed by relational databases using D2R Server and SPARQL technology are addressed. A successful realization of the data server mashup will have a positive impact not only on the data users of the specific scientific domain but also on related projects, such as the development of a new interoperable version of NASA's Planetary Data System (PDS) or ICSU's World Data System alliance. ESPAS data server: https://www.espas-fp7.eu/portal/ IUGONET data server: http://search.iugonet.org/iugonet/ GFZ ISDC data server (semantic Web based prototype): http://rz-vm30.gfz-potsdam.de/drupal-7.9/ NASA PDS: http://pds.nasa.gov ICSU-WDS: https://www.icsu-wds.org
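
    As a sketch of the D2R/SPARQL access path mentioned above, the snippet below queries a hypothetical D2R-style SPARQL endpoint for dataset titles; the endpoint URL and the use of the Dublin Core title property are assumptions for illustration, not the actual ESPAS, IUGONET or ISDC vocabularies.

        from SPARQLWrapper import SPARQLWrapper, JSON

        endpoint = SPARQLWrapper("http://example.org/d2r/sparql")   # hypothetical endpoint
        endpoint.setQuery("""
            PREFIX dct: <http://purl.org/dc/terms/>
            SELECT ?dataset ?title WHERE {
                ?dataset dct:title ?title .
                FILTER regex(?title, "geomagnetic", "i")
            } LIMIT 10
        """)
        endpoint.setReturnFormat(JSON)

        for row in endpoint.query().convert()["results"]["bindings"]:
            print(row["dataset"]["value"], "-", row["title"]["value"])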

  14. The Technology of Extracting Content Information from Web Page Based on DOM Tree

    NASA Astrophysics Data System (ADS)

    Yuan, Dingrong; Mo, Zhuoying; Xie, Bing; Xie, Yangcai

    Web pages carry huge amounts of information, which includes the content information as well as useless elements such as navigation, advertisements and animated flash. To reduce the toil of Web users, we established a technique to extract the content information from a web page. Firstly, we analyzed the semantics of web documents with Google's V8 engine and parsed each web document into a DOM tree. We then traversed the DOM tree and pruned it in light of the characteristics of the web page's edit language. Finally, we extracted the content information from the web page. Theory and experiments showed that the technique can simplify a web page, present the content information to web users, and supply clean data for application areas such as retrieval, KDD and DM from the web.
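
    A rough sketch of the same idea, using BeautifulSoup as a stand-in for the paper's V8-based DOM construction: parse the page into a tree, prune branches that the markup marks as navigation or advertising, and keep the remaining text. The tag and class names treated as non-content are illustrative choices, not the paper's pruning rules.

        from bs4 import BeautifulSoup

        def extract_content(html):
            soup = BeautifulSoup(html, "html.parser")       # parse into a DOM-like tree
            for tag in soup(["script", "style", "nav", "aside", "footer"]):
                tag.decompose()                             # prune non-content branches
            for tag in soup.find_all(class_=["ad", "banner", "menu"]):
                tag.decompose()
            return " ".join(soup.get_text(separator=" ").split())

        html = """<html><body><nav>Home | About</nav>
                  <div class="ad">Buy now!</div>
                  <p>The actual article text we want to keep.</p></body></html>"""
        print(extract_content(html))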

  15. PageRank, HITS and a unified framework for link analysis

    SciTech Connect

    Ding, Chris; He, Xiaofeng; Husbands, Parry; Zha, Hongyuan; Simon, Horst

    2001-10-01

    Two popular webpage ranking algorithms are HITS and PageRank. HITS emphasizes mutual reinforcement between authority and hub webpages, while PageRank emphasizes hyperlink weight normalization and web surfing based on random walk models. We systematically generalize/combine these concepts into a unified framework. The ranking framework contains a large algorithm space; HITS and PageRank are two extreme ends in this space. We study several normalized ranking algorithms which are intermediate between HITS and PageRank, and obtain closed-form solutions. We show that, to first order approximation, all ranking algorithms in this framework, including PageRank and HITS, lead to the same ranking, which is highly correlated with ranking by indegree. These results support the notion that in web resource ranking indegree and outdegree are of fundamental importance. Rankings of webgraphs of different sizes and queries are presented to illustrate our analysis.
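
    For concreteness, a small power-iteration PageRank on a toy webgraph is sketched below; the toy links and the damping factor of 0.85 are illustrative and are not taken from the paper, which studies a broader family of normalized ranking algorithms spanning PageRank and HITS.

        def pagerank(links, damping=0.85, iterations=50):
            """Power-iteration PageRank over a dict mapping page -> list of outlinks."""
            pages = list(links)
            rank = {p: 1.0 / len(pages) for p in pages}
            for _ in range(iterations):
                new = {p: (1.0 - damping) / len(pages) for p in pages}
                for page, outlinks in links.items():
                    share = rank[page] / len(outlinks) if outlinks else 0.0
                    for target in outlinks:
                        new[target] += damping * share
                rank = new
            return rank

        toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
        for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
            print(page, round(score, 3))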

  16. Intention to continue using Facebook fan pages from the perspective of social capital theory.

    PubMed

    Lin, Kuan-Yu; Lu, Hsi-Peng

    2011-10-01

    Social network sites enable users to express themselves, establish ties, and develop and maintain social relationships. Recently, many companies have begun using social media identity (e.g., Facebook fan pages) to enhance brand attractiveness, and social network sites have evolved into social utility networks, thereby creating a number of promising business opportunities. To this end, the operators of fan pages need to be aware of the factors motivating users to continue their patronization of such pages. This study set out to identify these motivating factors from the point of view of social capital. This study employed structural equation modeling to investigate a research model based on a survey of 327 fan pages users. This study discovered that ties related to social interaction (structural dimension), shared values (cognitive dimension), and trust (relational dimension) play important roles in users' continued intention to use Facebook fan pages. Finally, this study discusses the implications of these findings and offers directions for future research.

  17. Writing your thesis. Paul Oliver. Publisher: Sage. No. of pages: 208. £16.99. ISBN 0761942998.

    PubMed

    2004-10-01

    So you need to write a thesis and want some succinct, practical guidance. You don't want to plough through hundreds of pages or follow up a list of references to other material. Both of these activities use up your precious study time and divert your attention from the subject material of the thesis.

  18. Inexpensive automated paging system for use at remote research sites

    USGS Publications Warehouse

    Sargent, S.L.; Dey, W.S.; Keefer, D.A.

    1998-01-01

    The use of a flow-activated automatic sampler at a remote research site required personnel to periodically visit the site to collect samples and reset the automatic sampler. To reduce site visits, a cellular telephone was modified for activation by a datalogger. The purpose of this study was to demonstrate the use and benefit of the modified telephone. Both the power switch and the speed-dial button on the telephone were bypassed and wired to a relay driver. The datalogger was programmed to compare values of a monitored environmental parameter with a target value. When the target value was reached or exceeded, the datalogger pulsed a relay driver, activating power to the telephone. A separate relay activated the speed dial, dialing the number of a tone-only pager. The use of this system has saved time and reduced travel costs by reducing the number of trips to the site, without the loss of any data.

  19. Cybersecurity, massive data processing, community interaction, and other developments at WWW-based computational X-ray Server

    NASA Astrophysics Data System (ADS)

    Stepanov, Sergey

    2013-03-01

    X-Ray Server (x-server.gmca.aps.anl.gov) is a WWW-based computational server for modeling of X-ray diffraction, reflection and scattering data. The modeling software operates directly on the server and can be accessed remotely either from web browsers or from user software. In the latter case the server can be deployed as a software library or a data-fitting engine. As the server recently surpassed the milestones of 15 years online and 1.5 million calculations, it has accumulated a number of technical solutions that are discussed in this paper. The developed approaches to detecting physical model limits and user calculation failures, solutions to spam and firewall problems, ways to involve the community in replenishing databases, and methods to teach users automated access to the server programs may be helpful for X-ray researchers interested in using the server or sharing their own software online.

  20. Developing Server-Side Infrastructure for Large-Scale E-Learning of Web Technology

    ERIC Educational Resources Information Center

    Simpkins, Neil

    2010-01-01

    The growth of E-business has made experience in server-side technology an increasingly important area for educators. Server-side skills are in increasing demand and recognised to be of relatively greater value than comparable client-side aspects (Ehie, 2002). In response to this, many educational organisations have developed E-business courses,…